5 Galaxy S9 Camera Improvements Samsung Needs to Win
With the lightning-fast pace of advancement in smartphone imaging tech, it seems that every few months a new smartphone claims the photography crown. Lately the iPhone X and Pixel 2 have run neck-and-neck for the title of top smartphone camera — and that means Samsung has its work cut out for it with its upcoming Galaxy S9 and S9 Plus.
The Galaxy S8 may have represented a design breakthrough for Samsung's mobile division, but that phone's approach to photography was decidedly more conservative. Both the S8 and S8 Plus used f/1.7 12-megapixel cameras that were largely unchanged from those in the Galaxy S7 and S7 Edge. And that put Samsung a generation behind 2017's finest flagship shooters from the likes of Apple, Google, HTC and Huawei.
So when Samsung hinted at a reimagining of the smartphone camera in the company's teasers for the Galaxy S9's launch, we took notice. And Samsung's promise got us thinking: What can the company do to ensure that its next flagship shooter beats the competition? To answer that question, we pored over our past comparisons and spoke to a professional photographer as well as an expert at DxOMark Image Labs, a site that specializes in rating cameras. Here's our conclusion about what Samsung needs to do.
Improve Low-Light Performance
This is the main weakness of every smartphone camera, even the best ones. If you want to see a camera's true colors, try to take pictures indoors, at night or in a dimly lit restaurant. Smartphones are compact little marvels of technology, and the image sensor and camera stack as a whole must be as tiny as possible to fit within a cluster of components all wrestling for space. But this necessity is at odds with the way an image sensor operates.
"Think of the sensor as a net trying to capture as much light as possible," says Robert Lowdon, a commercial photographer. "The bigger the sensor, the more you'll catch in a sense."
Lowdon says a truly high-performing mobile camera — something that would be more akin to a DSLR and that would put the cameras in the iPhone X and Pixel 2 to shame — would have a significantly larger image sensor with a greater number of aperture options and a lens that zooms optically. But because that would be very expensive and hard to physically fit inside the confines of a mobile phone, widening the aperture might be the next best solution.
According to reports, the Galaxy S9's aperture on its Super Speed Dual-Pixel camera will be as wide as f/1.5.
MORE: These Smartphones Can Replace a Compact Camera
"Maximum lens aperture is a good step in the right direction," Lowdon says. "F/1.8 on the iPhone X and f/1.5 on this new Samsung are very large, better than you will see in a lot of professional camera lenses. The Galaxy S8 has a[n] f/1.7 max aperture, which is half a stop smaller. An f/1.5 aperture will give a bit better results, but I doubt the average person would notice [the difference]."
The larger the aperture on your phone, the more light reaches the sensor and the better its low-light performance should be. That's why phone makers have raced to one-up one another by equipping their cameras with progressively wider apertures. The Galaxy S8 led the industry with an f/1.7 lens until the end of 2017, when the LG V30 and Huawei Mate 10 Pro delivered f/1.6 lenses.
Therefore, an f/1.5 aperture on the Galaxy S9 would be the largest in any smartphone to date. However, Samsung may not be finished there. Additional reports claim the S9's aperture will be adjustable, alternating between f/1.5 and f/2.4 modes depending on the requirements of the scene. Adjustable aperture would make for another smartphone first, though it's critical to note that a variety of other factors play into improving low-light performance.
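For a rough sense of what those f-numbers mean, here's a back-of-the-envelope sketch (our own arithmetic, not figures from Samsung or DxOMark): the light a lens passes scales with the inverse square of the f-number, and exposure differences are conventionally measured in stops, where one stop doubles the light.

```python
import math

def light_ratio(f_a, f_b):
    """How much more light an f_a lens passes than an f_b lens (light ~ 1/f^2)."""
    return (f_b / f_a) ** 2

def stops(f_a, f_b):
    """Exposure difference in stops between two f-numbers."""
    return 2 * math.log2(f_b / f_a)

# Rumored Galaxy S9 (f/1.5) vs. Galaxy S8 (f/1.7) and iPhone X (f/1.8)
print(f"f/1.5 vs f/1.7: {light_ratio(1.5, 1.7):.2f}x light, {stops(1.5, 1.7):.2f} stops")
print(f"f/1.5 vs f/1.8: {light_ratio(1.5, 1.8):.2f}x light, {stops(1.5, 1.8):.2f} stops")
```

By this math, the jump from the S8's f/1.7 to a rumored f/1.5 admits roughly 28 percent more light — a modest gain, which squares with Lowdon's point that the average person might not notice the difference.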
Deliver Better Image Processing
Another huge factor in phone camera performance is image processing, which depends on software algorithms. Each manufacturer tunes its algorithms a little differently. This is part of the reason you see wildly different results from phone to phone looking at the same scene, said Nicolas Touchard, vice president of marketing at DxOMark, an industry reference for rating camera quality.
"Tuning [software] is very important in these devices," Touchard told Tom's Guide. "It's not just the lens, the aperture, the sensor and other camera hardware components, but the tuning, the algorithms and how the parameters are set [that] are all very important for getting high quality."
Software is a huge reason why the camera on Google's Pixel 2 impressed us so much. The device uses Google's HDR+ system to maximize dynamic range, or the gradation between the darkest and lightest points of a frame. With the help of the Pixel Visual Core — a custom-designed image-subprocessing chip — the Pixel 2 can deliver fantastically lit scenes with just the right amount of detail and color where other phones might fail. Samsung could learn from that when making the Galaxy S9.
"If you look at the Note 8, the level it reaches is good," Touchard says. "But we think that, given the hardware — the lenses, the sensor size, the chipset — [Samsung has] the potential to do even better."
While the Note 8 added a second rear lens — a feature that will appear on the Galaxy S9+ by all accounts — the phone processed photos similarly to the Galaxy S8 in our experience. It handled most scenarios well, but tended to leave images a little blurry. Samsung may have tuned the software to limit visual noise in its previous flagships, but an aggressive approach to smoothing resulted in lost detail. Striking a better balance could go a long way toward once again rivaling Apple and Google's best efforts.
Work on Live Focus Mode
Speaking of the Note 8's dual cameras, the S9 and S9 Plus are expected to differ from each other in a way Samsung's flagship pairs never have. While the rumored 6.2-inch S9 Plus could sport two lenses around the back, the 5.8-inch standard S9 will reportedly get only a single lens.
The Note 8 was Samsung's first device with two rear shooters, and although its Live Focus mode was good for capturing shallow depth-of-field portraits, it wasn't our favorite compared to similar offerings in Apple's 2017 iPhones, the Pixel 2 and Huawei's Mate 10 Pro.
The Note 8's portraits tended to be hazy, imprecise and lacking in contrast. Two phones we tested it against — the iPhone 8 Plus and the Mate 10 Pro — delivered more-natural results in portraits. Much of this comes back to image processing, and it's a use case Samsung should pay special attention to getting right with the Galaxy S9.
Speed Everything Up
The Galaxy S8 and Note 8 already have some of the fastest cameras in the business, but there's always room for improvement. Based on details we've seen in leaks of the S9's packaging and prematurely published pages on Samsung's website (before they were taken down), it looks like the Galaxy S9 could be even quicker on the draw.
It's all thanks to Samsung's ISOCELL three-stack, fast-readout sensor, which has been tipped to debut in the new flagship phones. While the specifics on the sensor remain unknown, Touchard explained how the stacked design could make capturing photos and video faster and more accurate.
"It allows you to reduce electronic rolling shutter artifacts," Touchard says, "and boost bit rate to reach higher video quality up to 4K [resolution] and 60 frames per second or more."
MORE: Samsung Galaxy S9 Rumors: Specs, Release Date and More
The electronic rolling shutters used in phones are unlike the mechanical shutters typically found in conventional cameras. A rolling shutter doesn't take a snapshot of an entire frame at once, but instead scans it horizontally or vertically. This can result in a warping phenomenon for fast-moving objects, like the blades of a helicopter, that fast-readout sensors can help stamp out.
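A toy model makes the benefit concrete (the readout times and speeds below are illustrative assumptions, not specs for any real sensor): because each row of the frame is exposed slightly later than the one before, a sideways-moving subject shifts between the first and last row scanned, producing the familiar "leaning" skew. A faster readout shrinks that shift.

```python
def rolling_shutter_skew(readout_ms, speed_px_per_ms):
    """Horizontal offset (in pixels) between the first and last row scanned.

    A rolling shutter reads the frame row by row over `readout_ms`, so a
    subject moving sideways drifts by speed * readout time during the scan.
    """
    return readout_ms * speed_px_per_ms

# Hypothetical numbers: a 30 ms readout vs. a 5 ms fast readout,
# filming a subject crossing the frame at 2 pixels per millisecond.
print(rolling_shutter_skew(30, 2))  # 60 px of skew -> visibly "leaning"
print(rolling_shutter_skew(5, 2))   # 10 px of skew -> far less distortion
```

The model ignores exposure time and motion blur, but it captures why a stacked, fast-readout sensor reduces rolling-shutter warping: the whole frame is scanned before the subject has moved very far.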
If a stacked sensor really does make it to the Galaxy S9, it won't be the first time the technology has appeared in a phone. Sony's Xperia XZ1 employs a similar structure that enables the phone to capture hundreds of frames a second while minimizing distortion.
Step Up Slow Motion
Much like in the Xperia XZ1, that fast-readout sensor should definitely come in handy for slow-motion video. According to rumors, the Galaxy S9 should match the XZ1's ability to capture video at 960 frames per second and 720p resolution, or 480 fps and 1080p.
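To put those frame rates in perspective, here's a quick calculation (our own arithmetic, not a claim from the leaks) of how far a slow-motion burst stretches when played back at a normal viewing rate:

```python
def playback_seconds(capture_fps, playback_fps, capture_seconds):
    """How long a slow-motion clip lasts when played back at normal speed."""
    return capture_seconds * capture_fps / playback_fps

# A fifth of a second of action captured at 960 fps, played back at 30 fps:
print(playback_seconds(960, 30, 0.2))  # 6.4 seconds of footage (32x slower)

# The same moment captured at 480 fps:
print(playback_seconds(480, 30, 0.2))  # 3.2 seconds (16x slower)
```

In other words, at 960 fps a blink-and-you-miss-it moment becomes several seconds of video — which is exactly why pressing the shutter at the right instant is so hard.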
But innovative, AI-derived software could make the phone better at anticipating those slow-mo-worthy moments, too. A recent leak claims one of the Galaxy S9's shooting modes will automatically activate slow-motion recording only once it's detected movement in the scene. Not even the iPhone X can manage that.
Such a feature could solve a frustrating problem central to the very nature of slow-motion video: Pressing the shutter button at a fraction of a second's notice is much, much harder than you might think, even with the best hardware.
Bottom Line
Samsung will have to do all of these things and more if it wants to return to the highest echelon of smartphone photography. Based on the leaks mentioned in this article — many of which have been corroborated by multiple sources at this point — it seems the company has rightly identified many of the Galaxy S8 and Note 8's flaws and intends to correct them.
However, while the reported specifications, adjustable-aperture technology and fast-readout sensor seem impressive, whether the S9's camera succeeds will likely be mostly a matter of processing smarts. And that's what we currently know the least about.
Software has become the primary differentiator among the best phone cameras. It's how the Pixel 2 is able to achieve Portrait Mode with only one lens, and how the iPhone X and 8 Plus employ machine learning to let you modify a photo's lighting after you've taken the picture. How will Samsung use software to enhance the S9's camera? The company won't be able to "reimagine" mobile photography without relying on software.
Fortunately, we won't have to wait very long to find out. Samsung will explain everything at its Galaxy Unpacked event at Mobile World Congress 2018 on Feb. 25. Until then, you can track all the latest leaks with our constantly updated rumor roundup.