With other smartphone technology advancements beginning to plateau, such as display quality (resolution, color, black levels, etc.) and CPU capabilities, OEMs are looking to other areas for differentiation.
Alongside things like software features and packing massive batteries, one area where several manufacturers have started focusing their efforts is image sensing, as shown by a string of major achievements in 2012 and 2013.
First came Nokia’s PureView technology with the not-so-mainstream, Symbian-powered 808 PureView, which packed a 41-megapixel sensor capable of taking up to 38-megapixel images. At full resolution, those images were surprisingly impressive, even today. And what exactly can one do with a 38-megapixel image? Start with that oversampled image and downsample it to a smaller resolution, of course, for a much sharper, cleaner result than a standard shot.
Shortly thereafter, Nokia launched yet another PureView-branded smartphone, the Lumia 920. With an 8-megapixel sensor, the 920 was more phone than camera, but it wasn’t just any ol’ 8-megapixel sensor. This camera had optical image stabilization (OIS), giving it the ability to take truly impressive low-light images, as well as smooth video in less-than-desirable, bumpy conditions.
And not quite one year later, Nokia married the two phones with a fair balance of both, the Lumia 1020, a Windows Phone handset with a 41-megapixel sensor, OIS, and a notably smaller chassis than the 808, though it did come with the now-iconic, Oreo-like camera housing around back.
HTC also jumped on the image sensing bandwagon last year with its own technology, UltraPixel. Effectively, HTC fitted its flagship One handset with an optically stabilized 4-megapixel sensor. As underwhelming as that sounds, the physical sensor size was comparable to that of the standard 8- and 13-megapixel sensors in competing flagships. The promise was to capture more light and, thus, more detail.
UltraPixel created a great divide. Some people were impressed with what HTC could do with so little, while others hated the comparatively low-res images that sometimes came filled with noise. But one thing is certain: along with Nokia’s bold advancements, HTC’s focus on image quality and sensor size helped push mobile image sensing forward in a big way, and we have both companies to thank for such massive leaps in just a year’s time.
LG, Motorola, Samsung, and other companies have all contributed, each in their own way.
As of last week, however, an up-and-comer has joined the mobile image sensing race: Oppo. The Chinese handset maker made the Find 7 smartphone official last Wednesday. The Find 7 is a seemingly normal smartphone, if you consider a phone with a 5.5-inch QHD display normal. Its specs are par for the course: 3GB of RAM, a 2.5GHz Snapdragon 801 SoC, 32GB of fixed storage, a 3,000mAh battery, and a 13-megapixel camera.
However, the Find 7 has an unusual trick up its sleeve: the ability to take 50-megapixel images.
When our own Stephen Schenck wrote about the technology last week, there were a handful of naysayers in the comments, predictably comparing the Find 7’s image sensing chops to those of the Lumia 1020.
The verdict? According to you ladies and gents, the Find 7 sucks, man!
Does it? Can we write it off so nonchalantly? Do we actually know what it’s capable of?
In short, no. Not even remotely. Most of the sample images I’ve personally seen have been taken indoors or on an overcast day, in particularly terrible shooting conditions, under which most smartphone cameras would struggle anyway. And comparing it to the Lumia 1020, as logical as it seems, doesn’t actually make a lot of sense.
The 1020 has a 41-megapixel sensor and – by default – downsamples shots to 5-megapixel images. It takes seriously impressive pictures with an unbelievable amount of detail. The Find 7, in a way, does the opposite. It takes images at 13 megapixels and, with the so-called “Super Zoom” mode enabled, blows the resulting picture up to a staggering 50 megapixels.
How does it do so without creating a nasty, pixelated mess? This is what Oppo calls Super Zoom, and it’s a lot simpler than it sounds. This isn’t some astounding hardware feat or black magic. In fact, it uses software techniques not unlike those astronomers have used for decades to take clearer pictures of distant objects in outer space.
In essence, the Find 7’s camera fires 10 shots in rapid succession, thanks to the prowess of its Pure Image 2.0 image processor. It gathers the data from those consecutive shots, intelligently picks the best four, and combines them to create a final image at a resolution of 8,160 by 6,120 pixels. In effect, it uses the additional captures to fill in the blanks, so to speak, and cram extra detail into the image.
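Oppo hasn’t published the details of its algorithm, but the general idea – multi-frame super-resolution – can be sketched in a few lines. The toy NumPy example below is my own illustrative approximation, not Oppo’s actual pipeline: it averages a burst of noisy captures of the same scene (random sensor noise cancels out while real detail is reinforced) and then interpolates the merged frame up to a higher resolution. Frame alignment and the “pick the best four” selection step are omitted, and all array sizes and function names are assumptions for the demo.

```python
import numpy as np

def combine_frames(frames):
    """Average a burst of aligned captures: random noise cancels,
    shared scene detail is reinforced."""
    return np.mean(np.stack(frames), axis=0)

def upscale(image, factor):
    """Naive bilinear upscale of a 2-D grayscale image
    (a stand-in for the blow-up to 8,160 x 6,120)."""
    h, w = image.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = image[y0][:, x0] * (1 - wx) + image[y0][:, x1] * wx
    bot = image[y1][:, x0] * (1 - wx) + image[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Simulate four noisy captures of the same (tiny, grayscale) scene
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, size=(64, 64))
frames = [scene + rng.normal(0, 20, scene.shape) for _ in range(4)]

merged = combine_frames(frames)
big = upscale(merged, 2)  # blow the merged frame up, Super Zoom-style

# Averaging four frames roughly halves the noise (sigma / sqrt(4))
noise_single = np.std(frames[0] - scene)
noise_merged = np.std(merged - scene)
```

Run on this synthetic scene, the merged frame measurably quiets the per-frame noise before the upscale, which is exactly why the combined 50-megapixel output holds more usable detail than any single 13-megapixel capture blown up on its own.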
All things considered, it works quite well – far better than I would have imagined.
Oppo says the resulting images are comparable to those from other 50-megapixel cameras. Based on what we’ve seen thus far, it’s not nearly a match for the 1020’s image sensing prowess. While the 50-megapixel images are notably clearer than the standard 13-megapixel shots, they take on an oil-painting texture at 100 percent crop, and they’re understandably filled with noise and artifacts.
That said, comparing the Find 7 to the 1020 is short-sighted and fruitless.
If, instead, you take a step back and consider what the technology is actually accomplishing – and what it means for the future of mobile image sensing – then what Oppo is doing with a more reasonable-resolution image sensor is actually quite impressive. Engadget’s Richard Lai states,
“While Super Zoom is a software-based feature, the 1/3.06-inch IMX214 sensor also deserves credit for its 480 megapixel-per-second bandwidth, which is 33 percent faster than the 13-megapixel CMOS chip on the Find 5. And of course, the bright f/2.0 aperture helps, too.”
That’s certainly worth mentioning. But the more impressive aspect of the camera’s performance stems from a hypothetical I brought to Stephen just before writing this editorial: “What sort of effect do you get by taking the 50MP image and scaling it down to the original 13MP resolution?”
The 50-megapixel images, admittedly, aren’t terribly impressive on their own, though they are notably sharper than the 13-megapixel samples. (Who needs a 50-megapixel image anyway?) But if you downsample one back to the original 13 megapixels, the resulting image is quite a bit clearer. That’s exactly what you’re seeing to the left. The top half is a 100 percent crop of the original 13-megapixel image; the bottom half is a 100 percent crop of a 50-megapixel image scaled down to 13 megapixels. The results are rather splendid.
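Why does the round trip – upscale to 50 megapixels, then scale back down – clean the image up? Downsampling averages neighboring pixels, and averaging suppresses random noise. The sketch below is a simplified illustration of that effect, assuming a basic box-filter downsample on a synthetic noisy grayscale frame; it isn’t the resampling filter the Find 7 (or your photo editor) actually uses.

```python
import numpy as np

def downsample(image, factor):
    """Box-filter downsample: average each factor x factor block.
    Averaging neighbors smooths random noise, which is why the
    scaled-down shot looks cleaner than the original."""
    h, w = image.shape
    h2, w2 = h // factor, w // factor
    return image[:h2 * factor, :w2 * factor].reshape(
        h2, factor, w2, factor).mean(axis=(1, 3))

# Toy stand-in for a noisy high-resolution capture
rng = np.random.default_rng(1)
scene = rng.uniform(0, 255, size=(128, 128))      # noise-free "truth"
noisy = scene + rng.normal(0, 20, scene.shape)    # what the sensor saw

small = downsample(noisy, 2)       # scaled-down version
ref_small = downsample(scene, 2)   # noise-free reference at same size

# Each output pixel averages 4 inputs, so noise drops by ~sqrt(4)
noise_before = np.std(noisy - scene)
noise_after = np.std(small - ref_small)
```

On this synthetic frame, the downsampled image carries roughly half the noise of the full-resolution one – the same trade Nokia makes by default on the 1020, and the same reason the Find 7’s 50-megapixel shots look their best after being scaled back down.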
It’s not perfect. In fact, it’s far from perfect. But the technology, and what we’ve seen from the Find 7 after just a handful of sample images, bears a message. There’s more that can be done with software to improve image quality than you might think.
So before you throw your hands in the air and write the Find 7 off as a failure, think about what Oppo has accomplished here and what it will mean for forthcoming generations of smartphones. This technology comes from a much smaller smartphone manufacturer that is doing more to push mobile image sensing forward than some of its exponentially larger competitors.
Keep an eye on Oppo. It has a knack for thinking outside the box while its toughest competitors bend over backwards making new cookie-cutter molds.