Google's Pixel 3 and Pixel 3 XL launch later this week, and our full, lengthy review has already been published. One of the biggest areas where Google has improved on these devices, however, is the OLED displays being used. Now, that new Pixel 3 XL display is being put to the test, and DisplayMate has its in-depth results.

If you'll recall, DisplayMate announced last week that it had already tested the Pixel 3 XL display and given it high marks, even handing out a "Best Smartphone Display" award and an "A+" grade. Now, we're getting a closer look at what makes this new display so great. Before getting into that, however, it's important to note that the smaller Pixel 3 and the larger Pixel 3 XL are on similar wavelengths for their displays. Our own Stephen Hall says that they are "indistinguishable" to the naked eye.

As usual, DisplayMate's Pixel 3 display "shoot-out" is extremely in-depth. The lengthy piece details tons of information about the display, but there are a few key points I found particularly interesting while digging through it. The first of those is the solid viewing angles.

Viewing Angles

DisplayMate explains that, typically speaking, a 30-degree viewing angle will decrease brightness on a smartphone display by up to 55%. On the Pixel 3 XL, however, that's down to just 28%. By comparison, the iPhone XS Max lands at 25% and the Galaxy Note 9 at 27%. By all means, Google is playing with the best of the best in this department. While smartphones are primarily single-viewer devices, the variation in display performance with viewing angle is still very important, because single viewers frequently hold the display at a variety of viewing angles. The angle is often up to 30 degrees, more if the phone is resting on a table or desk.

Comments

Dereken, combining multiple frames doesn't affect Google's ISO number. The EXIF data is related to the exposure time of a single frame, so it would be inconsistent to relate ISO to the total exposure time.

"the exposure time shown in Google Photos (if you press 'i') is per-frame, not total time, which depends on the number of frames captured"

Furthermore, I have tested with the Google Nexus 5X that about the same highlights are blown out in the HDR+ DNG file as when I capture a single(!) frame DNG file with a third-party app using the same EXIF exposure time and ISO. The Nexus 5X doesn't offer HDR+ DNG files, so I had to test this with a modified Google Camera app APK, which produced the same JPGs and EXIF data as the stock camera app.

Furthermore, Google has often said that HDR+ only combines frames with the same(!) exposure, i.e. only underexposed frames.

"If exposure stays the same, then ISO must be the variable that changes slightly with each exposure"

No, as they only combine "underexposed" frames:

"One solution is to capture a sequence of pictures with different exposure times (sometimes called bracketing), then align and blend the images together. Unfortunately, bracketing causes parts of the long-exposure image to blow out and parts of the short-exposure image to be noisy. This makes alignment hard, leading to ghosts, double images, and other artifacts. However, bracketing is not actually necessary; one can use the same exposure time in every shot. By using a short exposure, HDR+ avoids blowing out highlights, and by combining enough shots it reduces noise in the shadows."
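The same-exposure merging described in the quoted HDR+ passage can be illustrated with a small simulation. This is a minimal sketch with made-up numbers, not Google's actual pipeline: it assumes a toy three-pixel scene, a sensor that clips at 1.0, and Gaussian read noise. It shows both effects the quote describes: averaging N identical short exposures cuts shadow noise by roughly sqrt(N), while a highlight that clips in every frame stays clipped in the merge, just as the same highlights were blown in both the HDR+ DNG and the single-frame DNG.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy three-pixel scene in linear light: deep shadow, midtone, and a
# highlight at 1.5x the sensor's saturation level (clipping point = 1.0).
# All values are hypothetical, chosen only to illustrate the effect.
scene = np.array([0.02, 0.5, 1.5])

def capture(scene, read_noise=0.01):
    """One short-exposure frame: clip at saturation, then add read noise."""
    return np.clip(scene, 0.0, 1.0) + rng.normal(0.0, read_noise, scene.shape)

N = 9  # number of same-exposure frames merged
merged = np.mean([capture(scene) for _ in range(N)], axis=0)

# Estimate the shadow-pixel noise for a single frame vs. an N-frame merge.
single_noise = np.std([capture(scene)[0] for _ in range(500)])
merged_noise = np.std(
    [np.mean([capture(scene) for _ in range(N)], axis=0)[0] for _ in range(500)]
)

print(single_noise / merged_noise)  # roughly sqrt(9) = 3
print(merged[2])                    # ~1.0: clipped in every frame, so still clipped
```

Alignment is omitted here (the simulated frames are already registered); in a real burst pipeline, using identical exposures is precisely what keeps that alignment step tractable, since every frame has the same brightness and similar noise.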