We spent a lot of time on our projector testing plan. The most obvious course was to acquire sophisticated equipment and measure the two big specifications ourselves — brightness and contrast ratio — and see how our measurements compared to manufacturer claims. This would give you, our readers, some additional information, as specifications measured with the same equipment by the same people are more comparable than specifications measured by various manufacturers. However, we still didn't feel this would get at the essence of the projector experience. As mentioned in our buying advice article, contrast ratio is the go-to spec for representing image quality. Still, many different things can affect image quality, including the type of media being projected, the dominant color of the current frame, and how light from the projector is bouncing around the room at any particular moment. We don't feel that pointing a machine at a black-and-white checkerboard and taking a measurement can really capture all of this. So instead, we focused our testing on real-world scenarios, watching movies and viewing presentations on multiple projectors side by side to determine which one functionally performed the best in this impressive projector lineup.
We broke our testing down into four metrics: image quality, ease of use, brightness, and fan noise. These metrics cover the full spread of factors that affect projector performance and your user experience.
Image Quality
After a ridiculous amount of testing, it became obvious that contrast ratio is the largest contributor to image quality. With a narrow ratio, details are lost in both shadows and bright spots. To test contrast ratio, we viewed the same slideshow and movies on each projector, pausing and zooming in to investigate the details. The first slide of our slideshow exhibits dynamic contrast, the difference between the lightest whites and the darkest blacks. We explored the ratio by viewing a white-to-black scale and determining the level of visible difference between each block. Slides two through five introduce colors to help portray a little more versatility, providing more information to the viewer.
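For readers curious about the number itself, contrast ratio is simple arithmetic: the luminance of a projected white divided by the luminance of a projected black. A minimal sketch, with hypothetical meter readings (the function name and figures are ours for illustration, not actual test data):

```python
def contrast_ratio(white_luminance: float, black_luminance: float) -> float:
    """Contrast ratio: white luminance divided by black luminance."""
    if black_luminance <= 0:
        raise ValueError("black luminance must be positive")
    return white_luminance / black_luminance

# Hypothetical meter readings in nits: 200 for white, 0.2 for black
ratio = contrast_ratio(200.0, 0.2)
print(f"{ratio:,.0f}:1")  # prints "1,000:1"
```

A "1,000:1" projector, in other words, claims its white is a thousand times brighter than its black; our complaint above is that this single quotient hides how the image behaves with real content.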
We also watched a video touring Costa Rica, which offers a very high level of detail, helping us discern how well the contrast ratio performs when the image is in motion. We paused and zoomed in at the same two spots to view a highly textured frame and to investigate the level of detail lost, or not lost, in the dark shadows of the lush forest.
We watched a multitude of different movies, movie trailers, and television shows, but the centerpiece of our movie testing was The Martian. This film presents a unique challenge for projectors: letting the red hue of the Martian landscape come through, then quickly transitioning to the stark white interiors of spaceships, capsules, and stations without making all the actors' skin tones look overly red and sunburned. We evaluated movies and photos for color and skin tone accuracy. For all color accuracy tests, we cycled through the preset color modes of each model to see which looked best in each situation, and we based our scores on the best images we could get via these presets. We know it is probably possible to improve these images by tinkering with the brightness and contrast of individual colors, but that is time-consuming and shouldn't be necessary to get a good image. Accordingly, our testing focused on the much more user-friendly presets.
We supplemented our movie watching with some high-resolution photos as well, for any of you photographers who would probably dim the lights before exhibiting your work. We made the same side-by-side comparisons for the ambient light test and evaluated movies and photos for color accuracy, skin tones, and resolution. We watched plenty of outer space videos to evaluate how true the blacks were, and when we were on the fence about which model had better resolution, we used a Siemens star to settle the argument. We also zoomed in up to 400% on small details like eyelashes to see if there were any steps or blurring.
Ease of Use
Over the course of our testing, we were constantly setting up and breaking down each projector over and over and over. This entailed adjusting the front legs, dialing in the focus and zoom, using the keystone correction to get a square image, and using the remotes to scroll through menus, adjust the volume, and switch the color modes. We kept careful notes of how simple or frustrating each one of these tasks was for each model and turned those notes into ease of use scores.
Brightness
Brightness was one area where we felt specific measurements were merited. Unlike contrast ratio, brightness is relatively objective: measured brightness relates much more directly to how bright an image will appear than measured contrast ratio relates to how good an image will look.
To make our comparisons fair, we used four projectors at once, all in the same room. This ensured ambient light conditions were consistent across all four projectors. We also projected the exact same photos and video using a high-quality, 4-way HDMI splitter.
We used a light meter to measure brightness. When measuring, we projected a white screen with nine circles evenly spaced throughout. We measured the brightness inside each one of these circles and then averaged those nine measurements to get an overall brightness figure.
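The averaging step above is straightforward. A minimal sketch, with hypothetical light-meter readings laid out as the 3x3 grid of circles (the numbers are invented for illustration, not our measured data):

```python
def average_brightness(readings: list[float]) -> float:
    """Average the light-meter readings from the nine measurement circles."""
    if len(readings) != 9:
        raise ValueError("expected nine readings, one per circle")
    return sum(readings) / len(readings)

# Hypothetical lux readings: brightest at the center circle,
# falling off toward the corners, as projectors typically do
readings = [210.0, 230.0, 210.0,
            235.0, 260.0, 235.0,
            205.0, 225.0, 205.0]
print(round(average_brightness(readings)))  # prints 224
```

Averaging nine evenly spaced points, rather than reading only the center, keeps a projector with a bright hotspot and dim corners from scoring the same as one that is uniformly bright.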
To evaluate ambient light image quality, we used text-heavy PowerPoint slides, graphs, spreadsheets, and high-quality photos. Generally, these are the types of things that pop up during a presentation in a well-lit conference room or small lecture hall. We projected identical slides and images side by side in the same room and compared how easy it was to read text and graphs and how vivid or washed out the photos appeared.
Fan Noise
Fans react to the projector's temperature, ramping up when it gets hot and shutting off when things cool down, so fan noise can be quite variable. To test fan noise, we put each model on its brightest setting and projected a bright white image. This combination creates the most heat and thus places the greatest demand on the fan. We then let each projector run in this state for half an hour and noted how unbearable the loudest, most distracting noise made by the fan was and how often that noise level was reached. This data was then used to create a fan noise score.
You may think it would be more logical to measure noise levels with a decibel meter. In doing this for other reviews, we've found that measured decibel levels have very little correlation with how noticeable and distressing a sound is within the lower ranges. So we left the decibel meter on the shelf for this test.