
How We Tested Projectors

Monday September 21, 2020

We spent a lot of time on our projector testing plan. The most obvious course seemed to be to get a lot of sophisticated equipment, measure the two big specifications, brightness and contrast ratio, ourselves, and see how our measurements compared to manufacturer claims. This would give you, our readers, some additional information, as specifications measured with the same equipment by the same people are more comparable than specifications measured by various different manufacturers.

However, we still didn't feel this would get at the essence of the projector experience. As mentioned in our buying advice article, contrast ratio is the go-to spec for representing image quality. Still, many different things can affect image quality, including the type of media being projected, the dominant color of the current frame, and how light from the projector is bouncing around the room at any particular moment. We don't feel that pointing a machine at a black-and-white checkerboard and taking a measurement can really capture all of this. So instead we focused our testing on real-world scenarios, watching movies and viewing presentations on multiple projectors side by side in order to determine which one functionally performed the best.

We broke our testing down into four metrics: image quality, brightness, ease of use, and fan noise. These metrics cover the full spread of factors that affect projector performance and your user experience.

Credit: Jenna Ammerman

Image Quality

Image quality is the most important performance aspect of a projector. If you have a low-quality image, there is no reason to blow it up and project it on the big screen. In keeping with our real-world testing strategy, we split image quality into two categories, ambient light performance (performance in a well-lit room) and dark room performance, and evaluated each using media that would normally be projected in those conditions.


In order to make our comparisons fair, we used four projectors at once, all in the same room. This ensured ambient light conditions were consistent across all four projectors. We also projected the exact same photos and video to each using a high-quality, 4-way HDMI splitter.


Ambient Light Performance

To evaluate ambient light image quality, we used text-heavy PowerPoint slides, graphs, spreadsheets, and high-quality photos. Generally, these are the types of things that pop up during a presentation in a well-lit conference room or small lecture hall. We projected identical slides and images side by side in the same room and compared how easy it was to read the text and graphs, and how vivid or washed out the photos appeared.

Dark Room Performance

Our test for performance in darkened rooms centered around movie watching. We watched a multitude of different movies, movie trailers, and TV shows, but the centerpiece of our movie testing was The Martian. This film provides a unique challenge for projectors: it is difficult to let the red hue of the Martian landscape come through, then quickly transition to the stark white interiors of spaceships, capsules, and stations, without making all the actors' skin tones look overly red and sunburned. And who doesn't want to stare at Matt Damon, am I right?

Comparing projectors side by side in the same room allowed us to make very detailed and fair comparisons.

We supplemented our movie watching with some high-resolution photos as well, for any of you photographers who would probably dim the lights before exhibiting your work. We did the same side-by-side comparisons we did for the ambient light test and evaluated movies and photos for color accuracy, skin tones, and resolution. We watched plenty of outer space videos to evaluate how true the blacks were, and when we were on the fence about which model had better resolution, we used a Siemens star test chart to settle the argument.

For all image quality tests we cycled through the preset color modes of each model to see which looked best in each situation. We based our scores on the best images we could get via these presets. We know it is probably possible to improve these images further by tinkering with the brightness and contrast of individual colors, but that is time consuming and shouldn't be necessary to get a good image. Accordingly, our testing focused on the much more user-friendly presets.


Brightness

Brightness was one area where we felt specific measurements were merited. Brightness is a far more objective metric than contrast ratio: measured brightness relates much more directly to how bright an image will appear than measured contrast ratio relates to how good an image will look.


We used a light meter to measure brightness. When measuring, we projected a white screen with nine circles evenly spaced throughout, measured the brightness inside each of these circles, and then averaged those nine measurements to get an overall brightness figure.
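The arithmetic behind this is simple, and it mirrors the standard nine-zone method used for ANSI-style lumen ratings. As a sketch, the lux readings and screen size below are hypothetical example values, not our actual data:

```python
# Nine-point brightness average, one hypothetical lux reading per circle.
readings_lux = [812, 845, 830, 790, 860, 825, 805, 818, 799]

# Overall brightness figure: the mean of the nine spot measurements.
avg_lux = sum(readings_lux) / len(readings_lux)

# If you also know the projected image area, multiplying average lux by
# area in square meters gives a rough ANSI-style lumen estimate.
screen_area_m2 = 1.87  # roughly a 100-inch 16:9 image (hypothetical)
lumens_estimate = avg_lux * screen_area_m2

print(f"Average brightness: {avg_lux:.1f} lux")
print(f"Estimated output: {lumens_estimate:.0f} lumens")
```

Averaging across nine zones rather than taking one center reading matters because projectors are usually brightest in the middle of the image and dimmer toward the corners.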

Ease of Use

In the course of our testing we were constantly setting up and breaking down each projector, over and over. This entailed adjusting the front legs, dialing in the focus and zoom, using the keystone correction to get a square image, and using the remotes to scroll through menus, adjust the volume, and switch color modes. We kept careful notes on how simple or frustrating each of these tasks was for each model and turned those notes into an ease of use score.


Fan Noise

Fans react to the temperature of the projector, ramping up when it gets hot and shutting off when things cool down, so fan noise can be quite variable. To test fan noise, we put each model on its brightest setting and projected a bright white image. This combination creates the most heat and thus places the greatest demand on the fan. We then let each projector run in this state for half an hour and noted how unbearable the most annoying and distracting noise made by the fan was, and how often that noise level was reached. This data was then used to create a fan noise score.

You may think it would be more logical to measure noise levels with a decibel meter. However, in doing this for other reviews, we've found that in the lower ranges, measured decibel levels have very little correlation with how noticeable and distressing a sound is. So we left the decibel meter on the shelf for this test.