Google's Camera app isn't exactly feature-rich, at least not compared to the alternatives offered by OEMs and many independent developers. Of course, that may be one of the reasons it's fairly popular – the interface remains simple, and the features that did make it in, like Photo Spheres, are pretty cool. A look through the latest update suggests Google is working toward another major feature addition called Smart Burst, and it might just become the best way to take photos of your friends.
Smart Burst
A traditional burst mode usually amounts to holding down the shutter button while the camera takes as many pictures as it can in rapid succession. The end result is usually a massive heap of photos with very minor differences that a photographer (or lowly-paid assistant) will then sort through for the next few hours (or days) to find the very best. Based on some evidence from the latest Camera update, it looks like there's a new feature that will do most of that heavy lifting. To begin with, it is called "Smart Burst."

Just from the strings above, it's safe to say Smart Burst will not be turned on magically without at least one more app update. There is also a separate launcher icon and interface, which suggests Smart Burst was broken out of the primary camera app so it wouldn't interfere with development.
[Image: sb_ic_launcher]
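For context on what a separate launcher entry implies: an activity that ships with its own launcher icon can be enabled or disabled as an individual component. The sketch below is purely hypothetical – the package and class names are invented, not pulled from the Camera APK – and just shows the standard PackageManager call an app could use to show or hide such an entry.

```java
import android.content.ComponentName;
import android.content.Context;
import android.content.pm.PackageManager;

// Hypothetical illustration only: the activity name below is made up,
// not taken from the Camera APK.
public final class SmartBurstToggle {

    private SmartBurstToggle() {}

    /** Shows or hides a standalone launcher entry without restarting the app. */
    public static void setLauncherVisible(Context context, boolean visible) {
        ComponentName component = new ComponentName(
                context.getPackageName(),
                "com.example.camera.SmartBurstCaptureActivity"); // assumed name for illustration

        context.getPackageManager().setComponentEnabledSetting(
                component,
                visible ? PackageManager.COMPONENT_ENABLED_STATE_ENABLED
                        : PackageManager.COMPONENT_ENABLED_STATE_DISABLED,
                PackageManager.DONT_KILL_APP);
    }
}
```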
Interface
There's not much in the way of an interface, but the same could really be said for the entire Camera app. There are three layouts:
- sb_activity_capture: This shows a screen filled with a view from the camera, overlaid with a capture button and a TextView that shows debug information.
- sb_activity_eval: This screen shows a title (currently, "Postprocess Eval App"), a progress bar named "video_progress" and a surface view named "video_view."
- sb_activity_result_picker: This is just a large StackView control (StackView examples: Developer docs, Stack Overflow); a minimal usage sketch follows this list.
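For anyone unfamiliar with StackView, here's a minimal sketch of how a result picker like sb_activity_result_picker might feed candidate frames into one. Everything below is an assumption for illustration; the class and method names are invented and not taken from the Camera APK.

```java
import android.app.Activity;
import android.content.Context;
import android.graphics.Bitmap;
import android.os.Bundle;
import android.view.View;
import android.view.ViewGroup;
import android.widget.BaseAdapter;
import android.widget.ImageView;
import android.widget.StackView;

import java.util.List;

// Hypothetical sketch: hand a short list of candidate frames to a StackView
// so the user can flip through them and pick a keeper.
public class ResultPickerActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        StackView stack = new StackView(this);
        setContentView(stack);

        // Assumption: frames already chosen by the scoring pass.
        List<Bitmap> bestFrames = loadTopScoredFrames();
        stack.setAdapter(new FrameAdapter(this, bestFrames));
    }

    private List<Bitmap> loadTopScoredFrames() {
        // Placeholder: a real app would pull these from the burst-processing pipeline.
        return java.util.Collections.emptyList();
    }

    // Minimal adapter that hands each candidate frame to the StackView as an ImageView.
    private static class FrameAdapter extends BaseAdapter {
        private final Context context;
        private final List<Bitmap> frames;

        FrameAdapter(Context context, List<Bitmap> frames) {
            this.context = context;
            this.frames = frames;
        }

        @Override public int getCount() { return frames.size(); }
        @Override public Object getItem(int position) { return frames.get(position); }
        @Override public long getItemId(int position) { return position; }

        @Override
        public View getView(int position, View convertView, ViewGroup parent) {
            ImageView view = convertView instanceof ImageView
                    ? (ImageView) convertView : new ImageView(context);
            view.setImageBitmap(frames.get(position));
            return view;
        }
    }
}
```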
From these layouts, the most interesting detail is the use of "video" in some names. It's possible Google is treating the capture process as a video recording instead of rapid-fire still shots. In the past, this would have produced some very below-average results, because limitations in the data pipeline forced videos to capture mediocre-quality frames; but thanks to considerably more advanced hardware and the capabilities opened up by the Camera2 API, it's now realistic to record video without sacrificing a pixel of quality.
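To give a sense of what that looks like in practice, here's a minimal sketch of the general camera2 pattern for streaming full-resolution frames the way a video recording would: an ImageReader fed by a repeating capture request. This is an assumption-heavy illustration, not code from the Camera app, and it skips permission handling and opening the CameraDevice.

```java
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.media.ImageReader;
import android.os.Handler;

import java.util.Arrays;

// Sketch only: stream full-resolution YUV frames at video rates into an ImageReader.
// Sizes and names are assumptions, not values from the Camera app.
public class BurstFrameStreamer {

    private static final int WIDTH = 4032;   // assumed full-sensor size for illustration
    private static final int HEIGHT = 3024;

    private final ImageReader reader;
    private final Handler handler;

    public BurstFrameStreamer(Handler backgroundHandler) {
        this.handler = backgroundHandler;
        // Keep a few frames in flight; each Image must be closed after processing.
        this.reader = ImageReader.newInstance(WIDTH, HEIGHT, ImageFormat.YUV_420_888, 4);
        this.reader.setOnImageAvailableListener(r -> {
            try (android.media.Image frame = r.acquireNextImage()) {
                // Hand the frame off to the post-processing stage here.
            }
        }, handler);
    }

    /** Starts streaming frames from an already-opened CameraDevice. */
    public void start(CameraDevice camera) throws CameraAccessException {
        camera.createCaptureSession(Arrays.asList(reader.getSurface()),
                new CameraCaptureSession.StateCallback() {
                    @Override
                    public void onConfigured(CameraCaptureSession session) {
                        try {
                            // TEMPLATE_RECORD favors a steady frame rate, like video recording.
                            CaptureRequest.Builder builder =
                                    camera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                            builder.addTarget(reader.getSurface());
                            session.setRepeatingRequest(builder.build(), null, handler);
                        } catch (CameraAccessException e) {
                            // Real code would surface this error to the UI.
                        }
                    }

                    @Override
                    public void onConfigureFailed(CameraCaptureSession session) { }
                }, handler);
    }
}
```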
Processing
Regardless of the method used to capture images, the real magic happens in the post-processing stage. The app examines each frame using a series of algorithms to extract and measure different details, sorts everything into groups based on similarity, and then scores photos against an assortment of criteria.
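In rough terms, that flow looks something like the sketch below: measure each frame, group similar frames, score them, and surface the best of each group. The feature signals and weights here are placeholders I've invented for illustration; they are not Google's algorithms.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Simplified skeleton of the described flow. All signals and weights are invented.
public final class BurstSelector {

    /** Minimal per-frame record; a real pipeline would hold image data and many more signals. */
    public static class Frame {
        final long hash;        // 64-bit similarity hash used for grouping
        final double sharpness; // example quality signal
        final int faceCount;    // example quality signal

        Frame(long hash, double sharpness, int faceCount) {
            this.hash = hash;
            this.sharpness = sharpness;
            this.faceCount = faceCount;
        }

        /** Placeholder score: the weights are made up for illustration. */
        double score() {
            return 0.7 * sharpness + 0.3 * faceCount;
        }
    }

    /** Groups frames whose hashes are within a bit-distance threshold, then keeps the best of each group. */
    public static List<Frame> pickBest(List<Frame> frames, int maxDistance) {
        List<List<Frame>> groups = new ArrayList<>();
        for (Frame frame : frames) {
            List<Frame> match = null;
            for (List<Frame> group : groups) {
                if (Long.bitCount(group.get(0).hash ^ frame.hash) <= maxDistance) {
                    match = group;
                    break;
                }
            }
            if (match == null) {
                match = new ArrayList<>();
                groups.add(match);
            }
            match.add(frame);
        }

        List<Frame> best = new ArrayList<>();
        for (List<Frame> group : groups) {
            best.add(group.stream().max(Comparator.comparingDouble(Frame::score)).get());
        }
        best.sort(Comparator.comparingDouble(Frame::score).reversed());
        return best;
    }
}
```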
Starting with the list of filters and converters that are used to extract details from photos:
After basic processing is finished, images are further examined using facial recognition software originally developed by a company named PittPatt, which was acquired by Google in 2011. This is used to find the faces in a photo and determine some specific details about each one. With this step out of the way, each photo is finally scored for quality. These are the criteria I could find:
I'm not sure where it occurs in the process, but the images are also compared using a pHash (perceptual hash). Unlike regular hashing functions, which produce wildly different values from even small differences in a file, pHash attempts to produce values that are "close" to each other for images that look very similar. This is likely used to group frames for comparison purposes, or it may even play a part in automatically eliminating frames that are too similar to others but have lower scores.
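To illustrate the idea, here's a simplified sketch using a difference hash (dHash) rather than the DCT-based pHash proper; the property that matters is the same: visually similar frames yield hashes separated by only a few bits.

```java
import android.graphics.Bitmap;

// Illustrative stand-in for a perceptual hash: a simple 64-bit difference hash.
public final class PerceptualHash {

    private PerceptualHash() {}

    /** Computes a 64-bit difference hash from a frame. */
    public static long dHash(Bitmap source) {
        // Shrink to 9x8 so each row yields 8 left-vs-right brightness comparisons.
        Bitmap tiny = Bitmap.createScaledBitmap(source, 9, 8, true);
        long hash = 0L;
        for (int y = 0; y < 8; y++) {
            for (int x = 0; x < 8; x++) {
                hash <<= 1;
                if (luma(tiny.getPixel(x, y)) > luma(tiny.getPixel(x + 1, y))) {
                    hash |= 1L;
                }
            }
        }
        return hash;
    }

    /** Number of differing bits; small values mean the frames look nearly identical. */
    public static int distance(long a, long b) {
        return Long.bitCount(a ^ b);
    }

    private static int luma(int argb) {
        int r = (argb >> 16) & 0xFF;
        int g = (argb >> 8) & 0xFF;
        int b = argb & 0xFF;
        return (r * 299 + g * 587 + b * 114) / 1000; // standard luminance weighting
    }
}
```

Grouping then comes down to comparing these distances against a threshold, much like the selection sketch shown earlier.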
In the end, users will be presented with a short list of the highest-scoring images (think back to the sb_activity_result_picker layout with the StackView). There will probably be actions to delete, share, or maybe even dig into images that weren't initially included. Since the interface is obviously still in development, we can probably expect this part to change in the future.
Wrap-Up
Google is obviously late to the game in terms of burst mode, but this is an effort to produce something better than just another rapid-fire camera mode. Of course, this isn't the first evolved variation on burst mode, either. We've seen similar features on HTC's camera with Zoe enabled, Samsung's camera in "Shot & more" mode, and a few others. Ultimately, what matters is that this Smart Burst actually delivers great results, and we won't know that until it's ready to roll out to users.
As I suggested before, this isn't something we'll see turned on with a server-side switch. There are still some things missing and a few details that require some polish, but it looks like most of the important pieces have been added. It's likely that the biggest hold-up is related to fine tuning the scoring algorithms.
Given that we're around 3 months from the release of the next Nexus phone, I wouldn't overlook the possibility that Google will hold this back to present it as a headlining feature. If 2015 brings a truly exceptional camera to the Nexus lineage, this Smart Burst feature would be a spectacular way to show it off. Let the anxious waiting begin...