Here again we set up all of our cameras next to one another, all facing the same direction. We then let our phones buzz constantly with numerous activity alerts to determine which models could best differentiate between meaningful motion (someone walking by the camera) and nuisance motion (leaves falling outside the window, cars driving by on the street, and the like). We found that all models were nearly identical in this regard, with the exception of the models that let you designate areas where the camera should look for motion and areas where it shouldn't.
After testing activity alerts and determining that no camera would miss meaningful motion, we moved on to assessing the actual services provided by each camera manufacturer, both with and without a subscription. Through the lens of monitoring performance, this was mostly a fact-finding mission, with careful verification of each manufacturer's claims using the devices we had purchased. We assessed how easy these monitoring services are to use in our app ease-of-use testing.
Finally, we tested the audio quality of each camera. This was done by putting a tester in front of our wall of watching eyes, standing equidistant from all of the cameras, and speaking at a normal volume. We recorded this and then compared the audio of all the resulting clips, listening to one after another, all on the same phone. This allowed us to directly compare things like volume and clarity. For the models with two-way audio, we repeated this test in reverse, speaking to a tester standing equidistant from all of the cameras, one camera at a time. We repeated this process with multiple testers, then averaged their scores of how well they were able to understand the audio coming from each camera.
Image quality is a difficult thing to measure, and in our testing of projectors, Chromebooks, and drones we've learned that the most meaningful way to evaluate it is to simply view identical videos side by side. So we set up all of our cameras right next to each other and captured identical video on them. We then viewed that video side by side on identical phones (as a smartphone is the most common medium for viewing security camera footage). We repeated this for both daytime and nighttime scenarios. Our test shots included text placards, to give us a better sense of how crisp each image was, and people walking through the frame, to evaluate how well each camera dealt with motion.
Once we had abundant video of people walking past each camera, we grabbed screenshots of faces. This allowed us to see how adept each camera was at capturing facial details, and which one would be most likely to lead to the identification of an intruder. Having someone walk through the frame also let us judge how much more we could see from the cameras with wide angle lenses, and whether or not the edges of the image were distractingly distorted.
App Ease of Use
Generally, users will both view footage from and adjust the settings of their security camera via the associated app. We tested how easy these apps were to use by putting them into the hands of multiple testers. First, we asked these testers to live with the cameras sending them alerts for a few days, to get a feel for the kinds of motion the cameras were and weren't detecting. Once they had a feel for each camera, we asked the testers to adjust things like scheduling and notification settings. Again, for this test all of the cameras were set up next to each other, looking at the same scene. We then had testers look into the video history of each camera and try to pull the same clip from each.
After this real-world experience, we asked the testers to rate multiple attributes, including how easy the app was to navigate in general, how easy it was to set scheduling, how simple it was to pull specific clips from the video history, and how easy it was to access special features like two-way audio.
We also looked at geofencing in this test, having all of our testers leave and reenter the office multiple times with their phones in their pockets. We found that this feature worked equally well for all models that offer it, and that its quality depended more on the phone than on the camera: it worked great on every camera when paired with a current iPhone, but not nearly as well when paired with an older smartphone.
Real Time Viewing
We used two different tests for real-time viewing. First, the latency test. We had a tester open real-time viewing on their phone, then walk in front of the corresponding camera and start a stopwatch when they reached a line of tape on the floor. They then looked at their phone and stopped the stopwatch when they saw themselves reach the line of tape via the real-time video feed. This told us how much latency there was between the camera seeing something and that footage actually being displayed in the corresponding app. To eliminate any possible advantage of the camera and smartphone being on the same Wi-Fi network, we shut off Wi-Fi on the phone we used for this test, forcing it to run on 4G.
We also conducted a continuous-activity test, where we had someone pace around in front of each camera for 30 seconds while someone else watched the real-time feed on a smartphone (again on 4G). Viewing this continuous activity allowed us to see how many times the video feed hiccuped and refreshed in that time span, or whether it stayed smooth and continuous. It also allowed us to ascertain the video quality of the real-time feed, because while some cameras can capture and save high-resolution clips, not all can transfer that quality to a live feed.