How We Tested Drones

By Max Mutter and Steven Tata




We designed all of our testing metrics to determine which consumer drone can most easily and reliably capture high-quality footage. To do this we purchased nine of the most popular amateur models on the market and flew them side by side through a series of exhaustive tests.

Video Quality


To test video quality we flew all of our testing models along similar paths at similar times of day to ensure that lighting conditions remained fairly consistent. We made sure these flight paths included both long, smooth panning shots and some tight twists and turns. We then closely evaluated the resulting footage on a number of factors: how stable the camera remained through all the maneuvers, the resolution and color quality of the images, and whether the propellers ever intruded into the camera's view. We also paid attention to things like whether pointing the camera at the sun affected color saturation. We made all of these comparisons side by side on large, high-definition monitors that allowed us to open multiple videos in the same viewing environment.

Flight Performance


We evaluated flight performance simply by flying. We took each model on long out-and-back routes, put them through the twists and turns of following mountain bikers and frolicking dogs, and made sure they could stay steady when hovering close to the ground. This gave us a clear idea of the responsiveness of each model. We paid particular attention to how stable each model was during takeoff and landing, as these are some of the most accident-prone points in any flight.

We also tested each model's autonomous flight features. Since most of these features are designed to make capturing certain types of footage easier, we evaluated them both on how smooth the flight appeared when we used them and on how good the resulting footage looked. For example, orbit (point of interest) features are meant to yield a nice circular panning shot centered on a particular point, so we scored them on how reliably and steadily the flight pattern was executed and how smooth and focused the resulting footage looked. Likewise, cable cam features are meant to fly a predictable, straight line between two points while still allowing the pilot to move the camera around, so we scored them on how reliably each model flew that straight line and how good the footage from those flights looked. Finally, we closely evaluated the dependability of each model's automatic return-to-home function, though we would not recommend using it except as a last resort (i.e., you've lost sight of or contact with the drone).

We tested every aspect of performance, from the camera to the rotors.

Ease of Use


In assessing ease of use we considered the amount of effort required between opening the box and getting the drone in the air. This included initial setup of the drone, attaching rotors, downloading any required applications, and pairing the drone, controller, and smart device. We also closely evaluated the user interface on each controller and any menu options that appear on the connected smart device's display. We were looking for joysticks that felt solid and provided good tactile feedback, and for on-screen menus that made it easy to navigate in-flight functions. After many hours of flying each of these models, our testers were able to make meaningful comparisons.

Video Downlink


We tested the quality of each video downlink by completing a 3,000-foot flight with each of the models we tested (an exception was made for the Yuneec and the Parrot, which have much smaller ranges). This distance pushed the drones toward the outer edge of what can be considered reasonable and safe flight. Throughout these flights we closely inspected the video downlinks, noting how clear each one was, whether there were any decreases in quality during the flight, and whether there were any glitches or losses of picture. We also noted whether any on-screen menus obscured part of the video downlink, and how easy it was to view the downlink while using the controls normally.

We spent a lot of time flying drones in beautiful locations.

Customer Service


We evaluated each manufacturer's customer service by calling each customer service line at least three times with both real and fabricated problems. If we were able to quickly reach a live person, rather than an automated phone tree, and that person was knowledgeable and helpful, we awarded high scores. If it was hard to get in touch with anyone, and getting any sort of help required jumping through hoops, we awarded low scores.




