Mobile Photography Workflow: Pushing the Envelope With Lightroom and Pixel

It’s no secret that most of the world’s photos are now shot with, and viewed on, a smartphone. For casual photography, the impressive on-board processing of modern phones is usually plenty: Simply shoot and share. But if you’re a little fussier about your images, or are photographing difficult subjects or in tricky lighting, then you’re likely to want to do some image editing. Even with clever apps like Lightroom CC and Snapseed, that can be painful on a phone’s small screen. Fortunately, there are now some ways to stay light while gaining a better platform for editing.

Perhaps taking pity on me for carrying 20 pounds of photo gear around every tech conference I cover, Google challenged me to see what I could do with a Pixelbook, a Pixel 2, and Lightroom. So I’ve been relying on that combination whenever possible for the last few weeks to see how complete a mobile photography solution I can make it. I’ve supplemented it with either my Canon G9 X point-and-shoot or my Nikon D7500 for capturing images beyond what the phone can do on its own. Here’s how it’s worked out, along with some options for tuning your own mobile photography workflow.

Besides the Pixelbook and Pixel 2, I’ve been carrying either a compact or small DSLR for shots needing more zoom or other additional capabilities. Along with a couple of SD cards and a mouse (yes, I still use one), that’s enough for a surprising number of photo projects.

Making the Most out of Your Smartphone Camera

First, for the photos I care about, I shoot in RAW. In the case of the Pixel 2 or my personal OnePlus 5, that is typically DNG, although with Lightroom I can now also take advantage of Adobe’s Raw HDR workflow. The latter produces a high-dynamic-range, floating point DNG image that has already been given a default tone mapping, but still has much more range to work with than a simple JPEG or traditional DNG. Aside from the need for post-processing, the main reason not to shoot in RAW is that some of the super-clever computational imaging done by high-end phones is only accessible when you shoot JPEG. For example, Google’s HDR+ technology that combines multiple frames to make a single, superior image only works for JPEGs shot with the Camera app.
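To see why a floating point DNG offers so much more editing latitude than a JPEG, consider what happens to highlights in each format. The sketch below is a simplified illustration (not Adobe's actual pipeline): an 8-bit encoding clips everything above its maximum to the same value, while a float representation keeps bright values distinct, so they can be pulled back down in post.

```python
import numpy as np

# Simulated linear scene radiance spanning more range than 8 bits can hold
# (1.0 represents the nominal white point of a single exposure)
radiance = np.array([0.01, 0.5, 1.0, 4.0, 16.0])

# JPEG-style 8-bit encoding: scale to 0-255 and clip
jpeg = np.clip(radiance * 255, 0, 255).astype(np.uint8)

# Floating-point HDR-style storage: values above 1.0 survive
hdr = radiance.astype(np.float32)

# Try to recover highlights by pulling exposure down 2 stops (divide by 4)
recovered_jpeg = jpeg.astype(np.float32) / 255 / 4
recovered_hdr = hdr / 4

print(jpeg)           # the two brightest values both clip to 255
print(recovered_hdr)  # 4.0 and 16.0 remain distinct after the pull-down
```

Once the two brightest samples have been crushed to 255 in the JPEG, no amount of exposure adjustment can separate them again; the float version recovers them trivially.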


The images shot with the Lightroom camera are also stored in Google Photos — which is pretty cool, since all images shot with a Pixel 2 between now and 2020 will be stored in original resolution for free by Google. As I’ve written about previously, though, the proliferation of various backend photo clouds is adding some confusion, as each vendor’s apps currently work best with their own cloud.

As long as both devices are online, images shot on the Pixel 2 sync through Lightroom automatically to the Pixelbook (and to the Lightroom desktop machine in my studio, and my main laptop). To get images off my Nikon D7500 or Canon G9 X, I either need to use Wi-Fi or an SD card. Unfortunately, while the Pixel 2 paired nicely with the D7500, the Pixelbook didn’t. Ultimately, I’m not sure how big a deal that is, as the cameras’ Wi-Fi is too slow to transfer large numbers of images or RAW files. So a USB SD card reader is the small and simple answer. For the Pixelbook, you’ll either need a USB-C version or an inexpensive adapter.

Not surprisingly, the workflow starting with the smartphone camera is a whole lot simpler. But how does it measure up for challenging photography situations? High-dynamic-range and low-light scenes have been some of the toughest to capture with smartphones. A variety of computational imaging technologies under the loose heading “HDR” are now available to address those shortcomings. We’ll take a look at them and how they perform.

HDR on Smartphones Is Improving by Leaps and Bounds

When HDR first appeared on smartphones, it was clever, but fairly clunky. It mimicked the process of bracketing on a standalone camera by (relatively slowly) capturing 2-3 images and then tone mapping them in a straightforward way. Now, the Pixel 2, for example, captures up to 10 images in a fraction of a second, then aligns and assembles them using the full power of the phone’s GPU and image processing hardware. Finally, it does noise reduction and an AI-based tone mapping that takes into account local contrast and the overall scene. Even the initial exposure is calculated based on a machine learning engine that has been trained on thousands of sample scenes. Apple, Samsung, and other high-end phone makers have similar systems, although they vary in how many images they capture, whether the images all have the same exposure, and in the quality of the post-processing and artifact suppression.
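The core of this burst-merge approach is that averaging many aligned frames suppresses sensor noise, which is what lets the pipeline push shadows without the result falling apart. The toy sketch below (my own illustration, not Google's HDR+ code) skips the hard part — alignment — by using pre-aligned synthetic frames, and shows that merging a 10-frame burst cuts noise by roughly the square root of the frame count.

```python
import numpy as np

rng = np.random.default_rng(42)

# A flat gray "scene" and a burst of 10 noisy same-exposure frames of it,
# loosely mimicking an HDR+-style burst capture
scene = np.full((64, 64), 0.5)
frames = [scene + rng.normal(0, 0.05, scene.shape) for _ in range(10)]

# Real pipelines must align frames first; these are already aligned,
# so the merge reduces to a per-pixel average
merged = np.mean(frames, axis=0)

noise_single = np.std(frames[0] - scene)
noise_merged = np.std(merged - scene)
print(f"single-frame noise: {noise_single:.4f}")
print(f"merged noise:       {noise_merged:.4f}")  # roughly sqrt(10)x lower
```

That roughly sqrt(10)x noise reduction is worth about 1.6 stops of extra usable shadow range, which is where much of the "extended dynamic range" in these systems comes from.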

This is one of the scenes that really sold me on Google’s HDR+ as implemented in the Pixel 2. It was captured in full Auto mode, and is straight out of the camera. You can click through to see a 50 percent down-sampled version of the original.

The result of Google’s HDR+ (and similar features in other phones) is an effective extension of the phone camera’s dynamic range well beyond what the 10-bit image sensor can provide natively. Google, as well as Apple, Samsung, and a couple other phone makers have also done an excellent job reducing or eliminating the artifacts that come along with doing all that image fusion. You can still fool them with enough motion in the scene, but it is getting harder. For anyone who wants an instantly usable image, this in-camera HDR produces a standard JPEG you can share right away. But if you want the ultimate in HDR, Adobe has pushed things even further.

While doing an excellent job of rendering the high-dynamic-range scene, this Adobe RAW HDR shows ghosting in the cyclist’s leg despite a nominal shutter speed of 1/1800s.

With the newest version of Lightroom Mobile, if you have one of the supported smartphones, Lightroom Mobile’s camera feature can painlessly capture enough individual images to record both the shadow and highlight areas of a scene. It then automatically merges the individual RAW images into a high-fidelity floating point RAW version for follow-on processing. The results are very impressive, at least for static scenes. The process is slower than the built-in HDR+ feature, so it doesn’t work as well when there is motion in the scene. Also, because this is a twist on the RAW format that is unique to Adobe, images in this format aren’t widely supported, at least not yet. For example, the Adobe HDR images I shot with the Pixel 2 aren’t viewable in Google Photos. However, they fit right into Lightroom, which brings us to the next piece of the puzzle: image processing.
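Unlike a same-exposure burst, this kind of merge works from bracketed exposures: each frame is divided by its exposure time to estimate scene radiance, and clipped samples are discarded so highlights come from the short exposures and shadows from the long ones. The sketch below is a simplified illustration of that general bracketed-merge idea (not Adobe's actual algorithm), using made-up radiance values and exposure times.

```python
import numpy as np

# Hypothetical bracketed capture: the same scene at three shutter times.
# Each frame records radiance * t, clipped at the sensor's saturation of 1.0.
radiance = np.array([0.02, 0.3, 2.0, 8.0])   # true linear scene radiance
times = [1.0, 0.25, 0.0625]                  # exposure times, arbitrary units

frames = [np.clip(radiance * t, 0.0, 1.0) for t in times]

# Merge: divide each frame by its exposure time to estimate radiance,
# then average only the unsaturated samples at each pixel
estimates, weights = [], []
for frame, t in zip(frames, times):
    valid = frame < 0.99                     # reject clipped (saturated) samples
    estimates.append(np.where(valid, frame / t, 0.0))
    weights.append(valid.astype(float))

merged = np.sum(estimates, axis=0) / np.maximum(np.sum(weights, axis=0), 1.0)
print(merged)  # recovers the full radiance range, including clipped highlights
```

The merged result is a floating point radiance estimate spanning the whole bracketed range, which is conceptually what the float DNG carries into Lightroom for tone mapping.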

I was able to make some quick adjustments using Lightroom Mobile (shown by the Adjustment Brush mask), which were synced automatically to my desktop and the Pixelbook, where I could do further editing.

Lightroom Now Spans Just About Every Device from the Largest to the Smallest

Lightroom was once available only on full-fledged computers, thanks in part to a complex interface that begged for a keyboard, mouse, and large display. Now it is easily accessible on phones, tablets, computers, and even the latest Chromebooks that have Android support. While the available feature set varies between devices, as you’d expect, even the Mobile version has become quite powerful. When used on a Pixelbook or large tablet, you can do a great deal of professional-grade image editing with it. If that isn’t enough, all of your images can be automatically synced to your computers for further editing, still in full fidelity.

The Pixelbook Isn’t Your Father’s Chromebook


Overall, Google has put together an effective 1-2 punch for photographers who want to travel light but still have a high-end workflow. That said, if you don’t need the keyboard on the Pixelbook, then an iPad or an Android tablet with an active stylus would be a less expensive and lighter-weight alternative. Similarly, if you want one of the best smartphone cameras on the market, the Pixel 2 is ideal. But if you’re on a budget, you can find less-expensive models that still support some form of automatic HDR and Adobe’s RAW HDR capability. For example, my cheaper OnePlus 5 fit just as nicely into this workflow, although it doesn’t produce the same image quality as the Pixel 2.

[Images by David Cardinal]
