By Stephen Schenck | August 7, 2013 7:46 PM
The just-launched LG G2 has plenty of compelling features: its crazy-thin bezel, that new Snapdragon 800 chip, and its unconventional hardware layout with rear-mounted buttons. One of the many features to get individual attention during the launch presentation was the phone’s 13-megapixel main camera.
On its own, a 13-megapixel sensor isn’t that remarkable, but LG has paired the G2’s sensor with optical image stabilization hardware. That combination has the potential to be really great, as a crisp, stable image lets each of those millions of pixels capture the data it should, rather than contributing to a blurry mess.
Maybe more than what OIS means for the G2 and LG specifically, its presence here speaks to the growing popularity of the tech. We first started paying close attention to it with Nokia’s efforts, beginning with the Lumia 920, but we’ve seen other OEMs taking a shine to it this year, like HTC and its stabilized lens on the One. With LG on board as well, I think we can safely start calling OIS the new hotness.
The problem is, for as much help as OIS gives us in snapping better-than-ever shots, I’m concerned that its popularity could be coming at the expense of a much more important aspect of smartphone camera development: the need for larger image sensors.
The Balancing Act
Photography, in many ways, is about finding balance. All those daunting settings on a manual camera aren’t just to intentionally confuse novices, but to give photographers the control they need to get the shots they want.
For instance, there’s shutter speed and aperture size. The two are distinct parts of a camera’s hardware, but each plays a key role in governing how much light the camera lets in. While the aperture limits the size of the light beam – and ultimately, the sheer number of photons flying through the lens at any moment – the shutter controls how long that beam is exposed to the camera’s sensor (or film, or even paper, if you’re going all pinhole).
It’s a bit more complicated than this, with considerations like how aperture affects depth of field, but at a very basic level, you can get a similar exposure by either letting in a whole lot of light all at once (large aperture, fast shutter) or much less light, but for a longer period of time (small aperture, slow shutter).
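That reciprocity is easy to put in numbers. Photographers fold aperture and shutter into a single figure called exposure value, EV = log2(N²/t), where N is the f-number and t is the shutter time in seconds; two settings with the same EV admit the same total light. A minimal Python sketch (the particular f-numbers and shutter times here are just illustrative):

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """EV = log2(N^2 / t): settings with equal EV pass the same total light."""
    return math.log2(f_number ** 2 / shutter_s)

# A wide aperture with a fast shutter...
wide_fast = exposure_value(2.8, 1 / 100)
# ...matches a narrower aperture with a slower shutter
# (two stops smaller on the aperture, two stops longer on the shutter).
narrow_slow = exposure_value(5.6, 1 / 25)

print(math.isclose(wide_fast, narrow_slow))  # the two EVs match
```

Stopping the aperture down by one stop halves the light per unit time, so the shutter has to stay open twice as long to compensate – which is exactly where camera shake starts to bite.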
Why this impromptu camera primer? Because OIS and sensor size both fit into this same balancing act in the fight for light.
You can think of OIS as a feature that improves the quality of an exposure. The problem with using longer shutter times is that, in order for that balancing act to make sense, the very same, unmoving image needs to be coming down that beam of light the whole time the shutter’s open. Unless you’re going after blur for a stylistic reason, that means keeping the camera as still as possible; the easiest way to do this is with a tripod. The benefit of OIS is that the camera can still move a little, but OIS keeps the desired light path constant, preserving a sharp exposure.
So Far, So Good, Right?
My problem is that manufacturers seem so focused on OIS right now that they’re ignoring efforts to make smartphone cameras with physically larger sensors. Just as OIS helps out with the shutter side of exposures, a larger sensor helps in much the same way as a larger aperture. That is, while the aperture controls how much light is coming into the camera during any given time period, a larger sensor allows the camera to sample more of it, making better use of that light.
What does this have to do with OIS, though? OIS solves a specific problem, but a move to larger sensors wouldn’t just help image quality in general: it would simultaneously address many of the issues OIS is here to fix. The reason’s simple: with a larger sensor capturing more light, we can dial down shutter times. A fast enough shutter effectively negates the camera-shake issues that give rise to the need for OIS in the first place.
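To put rough numbers on that trade-off: holding megapixel count fixed, the light each pixel gathers scales with sensor area times shutter time, so a proportionally bigger sensor buys a proportionally faster shutter for the same per-pixel signal. A sketch in Python, using approximate dimensions for a typical ~1/3" phone sensor and the 808 PureView’s 1/1.2" sensor (the exact millimeter figures are ballpark values, not official specs):

```python
def shutter_for_equal_signal(base_area_mm2: float,
                             base_shutter_s: float,
                             new_area_mm2: float) -> float:
    """With pixel count held constant, per-pixel photon count scales with
    sensor area x exposure time, so required shutter time scales as 1/area."""
    return base_shutter_s * (base_area_mm2 / new_area_mm2)

small = 4.80 * 3.60   # ~1/3" sensor, in mm^2 (approximate)
large = 10.67 * 8.00  # ~1/1.2" sensor, in mm^2 (approximate)

# A shaky 1/15 s exposure on the small sensor becomes...
t = shutter_for_equal_signal(small, 1 / 15, large)
print(f"1/{round(1 / t)} s")  # → 1/74 s, roughly 5x faster
```

At 1/74 of a second instead of 1/15, handheld shake is far less of a factor – which is the sense in which a bigger sensor addresses the same problem OIS does.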
Reasons Why Not
If a bigger sensor is such a magic bullet for camera performance, why aren’t we seeing any? Well, to an extent we are – some manufacturers have been pushing size boundaries on a handful of models – but it should be very telling that we’re still waiting for a smartphone with a sensor that measures up to the 808 PureView’s, and not even the 1020’s compares.
There are some very good reasons why no one’s really running with this idea. Cost comes foremost to mind, and raw sensor size has a lot more to do with the expense of a component than any megapixel count. There are also space issues, and not just along the plane of the sensor – everything else being equal, a larger sensor needs to sit farther from the lens than a smaller one does, bringing phone thickness into the equation. That doesn’t mean it can’t be made to work, but it still complicates the design process.
Some manufacturers have tried to “cheat” their way around the need for a larger sensor by making smaller sensors capture more light – like Motorola and its new RGBC sensor. As should be painfully clear from our Moto X review, that’s not quite the solution Motorola would like it to be, as color reproduction takes a noticeable hit.
Apologies to the G2 for using it as a whipping boy, as it’s just the latest in a line of cameras placing an emphasis on OIS over sensor size. I’ve got the feeling that this trend will continue for a while, with even more companies jumping on the bandwagon. Honestly, I’m not sure how we might ever see a similar push for larger sensors, given how hesitant manufacturers seem to be about using them in even higher-end phones. That’s a real shame, because there’s absolutely a market for smartphones with a big emphasis on camera performance.