Google's issue with the Google Glass camera was its size. While the search giant was hoping to have a camera on its wearable that matched the quality of smartphone cameras, the sensor used in Glass was smaller than those employed in handsets. That meant Google engineers had to figure out how the smaller sensor could capture enough light so that Glass users wouldn't have to wear a miner's helmet. The solution they eventually came up with was called Gcam.
Since the hardware couldn't be changed, Google worked on the software, and more precisely, the image processing. Gcam is similar to HDR in that multiple shots of the same scene are snapped consecutively and then merged; in theory, the final picture offers the best of each frame. The approach is so similar to HDR that on the Pixel it is called HDR+. The technology is beginning to find its way into other Android devices and Google-owned apps like YouTube and Google Photos.
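To see why merging a burst of frames helps a small sensor, consider that random sensor noise tends to average out across shots while the scene itself stays put. The toy sketch below (my own illustration, not Google's pipeline; a real HDR+ implementation also aligns frames and merges in the raw domain) averages several noisy captures of the same 1-D "scene" and shows the merged result landing closer to the true values than any single shot:

```python
import random

def merge_burst(frames):
    """Average a burst of pre-aligned frames pixel-wise.

    Averaging N frames shrinks random noise by roughly sqrt(N),
    which is the core intuition behind burst-merge photography.
    Simplified sketch: frames are 1-D lists of equal length.
    """
    n = len(frames)
    width = len(frames[0])
    return [sum(f[i] for f in frames) / n for i in range(width)]

def rms_error(frame, reference):
    """Root-mean-square difference between a frame and the true scene."""
    return (sum((a - b) ** 2 for a, b in zip(frame, reference))
            / len(reference)) ** 0.5

# Simulate one flat "scene" captured 8 times with additive Gaussian noise,
# standing in for the shot noise a small sensor produces in dim light.
random.seed(0)
scene = [100.0] * 64
burst = [[p + random.gauss(0, 10) for p in scene] for _ in range(8)]

merged = merge_burst(burst)
print(f"single frame error: {rms_error(burst[0], scene):.2f}")
print(f"merged frame error: {rms_error(merged, scene):.2f}")
```

Running this, the merged frame's error comes out noticeably lower than any single frame's, which is the effect that let Glass (and later the Pixel's HDR+) get usable low-light shots out of modest hardware.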
What should be interesting is a comparison of the pictures taken with the Pixel models and those snapped by the upcoming BlackBerry KEYone, which will employ the exact same sensors as the Pixel does on both of its cameras. With such a comparison, we can see how much this technology really adds to the high-quality snapshots that the Pixel and Pixel XL are known for.
source: X via 9to5Google