Some believe that smartphones will never replace “real cameras” like DSLRs or mirrorless cameras. This article offers evidence to the contrary, and shows what amazing images can be made with limited equipment. Here’s my account of shooting some of my first images of the Milky Way with nothing but a smartphone and a tripod.
Introduction
The above photo was made with a smartphone. Smartphone sensor technology has actually been quite good for a couple of years, but it’s only now that we’re seeing the right combination of sensors, lenses and software to make photographs that were once only possible on enthusiast-oriented, large sensor cameras.
I anticipate the day that our slim, pocketable smartphones will be more powerful and capable cameras than the top-of-the-line DSLRs and mirrorless cameras available today. Smartphones are already seriously good cameras. Today they dominate photography as the most common camera in the world, and they have made photography more popular and more ubiquitous than it has ever been in history. Photography is thriving, and it’s almost solely because of the smartphone. The smartphone’s progression into the professional and enthusiast realm is just starting. We’re already seeing smartphone manufacturers concentrating specifically on the phone’s camera as its primary selling point. For example, the primary marketing campaign of the iPhone 6 has focused entirely on photographs shot on the iPhone. I predict that smartphones, or some evolution of them, will become truly enthusiast-oriented, professional-level photography tools, equal to or better than the best cameras we use today. The current progression of available technology indicates that it is inevitable.
The OnePlus One
I recently purchased the camera with the most advanced sensor technology I have personally used. That camera is housed in a smartphone called the OnePlus One. The One was released a year ago on a limited invite-only basis and was opened to the public in April of 2015 for unrestricted purchase via OnePlus’s webstore. It houses a 13 megapixel sensor behind a 3.79mm f/2.0 lens (approximately equivalent to a 28mm field of view on a full frame camera). It’s actually not the largest or highest resolution sensor, nor the fastest lens, found in a phone, but its combination of hardware and easily hackable software makes for one of the most capable smartphone cameras available so far. Only a few days after buying the OnePlus One, Diana and I departed on our All-American road trip through the Southwestern United States. With no final destination in mind and no set schedule for the road trip, we’re capturing photos of our beloved home country, doing a little backpacking, and shooting a few astrophotos, too. You can follow the progress of our travels via our travel blog, my Instagram or Diana’s Instagram.
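For anyone curious where that 28mm figure comes from, here’s a rough back-of-the-envelope check in Python. The sensor dimensions are my approximate assumptions for a 1/3.06"-type chip like the one in this phone, not official OnePlus specifications:

```python
import math

# Approximate active-area dimensions (mm) of a 1/3.06"-type sensor -- assumed,
# not official figures for the OnePlus One's sensor.
sensor_w, sensor_h = 4.7, 3.5
full_frame_diag = math.hypot(36.0, 24.0)  # full frame is 36 x 24 mm, diag ~43.3 mm

crop_factor = full_frame_diag / math.hypot(sensor_w, sensor_h)
equivalent_focal = 3.79 * crop_factor  # the phone's actual focal length is 3.79 mm

print(f"crop factor ~{crop_factor:.1f}, equivalent focal length ~{equivalent_focal:.0f} mm")
```

The numbers work out to roughly a 7.4x crop factor and an equivalent focal length of about 28mm, consistent with the field of view described above.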
I packed a bunch of different cameras on this trip, including the Sony a7II with several lenses, the compact Sony RX100 III, and an old Olympus 35RD 35mm film rangefinder. But the camera that I was most excited to try using for astrophotography was my smartphone, the OnePlus One. I suppose the thought of pushing any given type of gear to its limit really interests me. I always want to see what’s possible with the simplest of equipment, and I feel that trying to shoot the Milky Way with a smartphone is sort of the holy grail version of that idea. I think that a smartphone is, quite possibly, the most minimalist-friendly camera ever made, and the thought of being able to capture the Milky Way with one gives me a lot of satisfaction. We’re just getting there.
Photographing the Milky Way with a Smartphone
A few days into our trip, on our first night with clear skies in Valley of Fire, a Nevada State Park, I had my opportunity to use the OnePlus One smartphone to shoot photos of the Milky Way. All of my shots from Valley of Fire were made around the park’s Arch Rock and Atlatl Campgrounds. Las Vegas is less than 100 miles away from Valley of Fire, so we dealt with a fair amount of light pollution from the city, but the conditions were fair enough for shooting the Milky Way. If you want to learn more about shooting the night sky, read my most popular article: How to Photograph the Milky Way, or check out our complete list of tutorials at Astrophotography 101.

Initially, I started shooting with Camera FV-5, a paid app for Android that provides advanced levels of control akin to a DSLR. With the app it’s possible to control the ISO, shutter speed, white balance, and focus method. The aperture on most smartphones is completely fixed, so there’s no control for the f/number when using Camera FV-5 on the OnePlus One; it stays at a fixed f/2, which is rather fast for a smartphone (the iPhone 6 has an f/2.2 lens for comparison). Camera FV-5’s advanced controls and interface design make it the first app I would recommend for performing long exposures on a smartphone. Only a few smartphones are capable of truly long exposures with an app like this, and the OnePlus One is one of the best available.

For support, I used a simple smartphone clamp that has a standard 1/4-20 tripod thread so that it can be mounted to any tripod. I’ve been using two different tripods during my travels: the carbon fiber Sirui T-025X and the Dynamic Perception D-Pod. I used both for making the smartphone images in this post.
First Test Shots
For the first test shot, a few hours before the Milky Way Galactic Center would rise, I tried shooting a photo of our campsite at the maximum exposure possible in the app: 64 seconds, f/2, at ISO 3200. The results immediately looked promising: an ample amount of shadow detail, visible stars and generally acceptable levels of noise. Straight out of the camera, the shot didn’t look that great, but with a little bit of noise reduction in post processing the photo doesn’t look half bad. That’s our little hatchback that we’re traveling in for the road trip:

After a few hours, once the Milky Way’s Galactic Center was high enough in the sky to photograph, I tried my hand at my first ever photograph of the Milky Way with a smartphone. I dialed in the same exposure: 64 seconds, f/2 and ISO 3200, a little bit long for a typical Milky Way exposure, but I didn’t want to compromise the amount of light the camera could gather. The result straight out of the camera looked a little bit like this (pushed 2 stops in Lightroom to give you a better idea of noise):

There it is! The Milky Way galactic center was clearly visible in the shot and I was excited to try some more. The long 64 second exposure certainly helped pull out more brightness from the scene but also caused some visible star trailing. Similar to our first exposure, the same speckled color noise is present, but it’s honestly not overly bad.
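As a side note on why the 64-second frames trail: a widely used rule of thumb (often called the “500 rule”) divides 500 by the full-frame equivalent focal length to estimate the longest shutter speed before stars visibly streak. It’s only an approximation, but a quick sketch of the arithmetic:

```python
# Rough "500 rule" estimate of the longest shutter speed (seconds) before
# stars visibly trail, given a full-frame equivalent focal length in mm.
def max_untrailed_shutter(equiv_focal_mm, rule=500.0):
    return rule / equiv_focal_mm

print(max_untrailed_shutter(28))  # ~18 s for a 28mm-equivalent lens
```

At roughly 18 seconds for a 28mm-equivalent lens, it’s no surprise that a 64-second exposure shows some trailing; it was a trade-off I accepted in exchange for the extra light.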
Making Improvements
After this first test shot, knowing full well the limitations of the camera and observing the level of noise in the image, I figured I would try my hand at shooting a set of separate exposures to be stacked later in post processing for noise reduction. I shot three more photos for a total of four images and later post processed them using my image stacking technique. This technique allows us to gather more total light to combine together into a final image that will have less noise than a single exposure. Combining four exposures with identical framing to the one above, and applying some careful noise reduction filtering, I was able to create this result:

Even better. I think the level of detail is more than acceptable, and the reduction in noise using this stacking technique always produces improved results. The final thing that I started wondering about is what the results would look like when shooting in RAW. One thing I observed while shooting is that jpegs from the phone seemed to lose a lot of color information, especially in the shadows, making the rocks appear green/grey rather than their naturally red hue.
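For anyone who wants to experiment with the same idea outside of a dedicated stacking program, here’s a minimal sketch of mean stacking in Python with NumPy and Pillow. The filenames are hypothetical, and it assumes the four frames are already aligned (real sky frames need to be re-aligned first, since the stars move between exposures):

```python
import numpy as np
from PIL import Image

# Hypothetical filenames for four identically framed exposures.
frame_files = ["IMG_1.jpg", "IMG_2.jpg", "IMG_3.jpg", "IMG_4.jpg"]

# Load each frame as floats so the average doesn't clip or round prematurely.
frames = [np.asarray(Image.open(f), dtype=np.float32) for f in frame_files]

# Averaging keeps the signal while random noise partially cancels out,
# dropping by roughly the square root of the number of frames.
stacked = np.mean(frames, axis=0)

Image.fromarray(stacked.clip(0, 255).astype(np.uint8)).save("stacked.jpg")
```

This is only the averaging step; my stacking tutorial covers the alignment and the separate treatment of the sky and the foreground.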
RAW Long Exposures on a Smartphone
With the most recent update of Google’s Android 5 Lollipop, Android phones have just started getting the new Camera2 API capability for capturing RAW files from their cameras as well as performing more advanced functions like manual focus. The OnePlus One doesn’t actually have the Camera2 API but it does sport most of the necessary features through the use of an app like Camera FV-5. Other phones that support RAW capture with Camera FV-5 are the Nexus 5 and Nexus 6 but they don’t support extra long exposures like on the OnePlus One. With phones just starting to shoot RAW, serious editing capability is becoming more and more possible. RAW files contain more total information and are usually much better for post processing than the typical jpeg format, especially in the case of astrophotography where we usually require big pushes in brightness when editing.
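To illustrate why RAW matters for these big brightness pushes, here’s a rough sketch that develops a DNG from the phone with the open-source rawpy library and pushes it two stops in linear space. The filename is a placeholder and the processing parameters are just reasonable starting points, not a prescribed workflow:

```python
import numpy as np
import rawpy
import imageio

# Placeholder filename for a DNG captured on the phone.
with rawpy.imread("milkyway.dng") as raw:
    # Demosaic to linear 16-bit RGB without auto-brightening, so we control
    # the exposure push ourselves.
    rgb = raw.postprocess(gamma=(1, 1), no_auto_bright=True,
                          use_camera_wb=True, output_bps=16)

# A +2 stop push is a 4x multiplication of the linear values.
pushed = np.clip(rgb.astype(np.float32) * 2 ** 2, 0, 65535).astype(np.uint16)

# Still linear data; a tone curve would normally be applied in later editing.
imageio.imwrite("milkyway_pushed.png", pushed)
```

Because the RAW file keeps more bits per channel and skips the in-camera noise reduction and jpeg compression, this kind of push holds up far better than multiplying an 8-bit jpeg.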
Problems
RAW recording and long exposures are problematic on the OnePlus One. It’s possible, but there are some big issues. In any app that supports it, trying to shoot in RAW on the OnePlus One with exposures longer than 2 seconds causes the camera driver to crash, requiring a restart of the phone. Camera FV-5 avoids this bug by forcing jpeg-only output for exposures longer than 2 seconds. In order to actually shoot in RAW for my next set of photos, I instead used a community-hacked version of Cyanogen Camera called CameraNext that allowed me to force RAW capture at longer exposures, at the expense of crashing the camera driver. To shoot these next images, I had to restart my phone before each subsequent RAW capture.

Here’s what a single RAW image looks like straight out of the phone, this time a 32 second exposure pushed two stops to show the level of noise:

There’s a lot more fine grain and noise present in the RAW files from the OnePlus One than in the jpegs, especially a pinkish glow that seems to dominate the image. That said, the natural red color of the rocks at Valley of Fire is more visible in the RAW files, making the colors a little more representative of the real thing. I imagine the jpeg engine applies a lot of noise reduction straight out of the camera while shooting in RAW does not; even so, the hot pixels present in the jpegs are nowhere near as strong as in the RAW files. Using the same stacking technique, the image here shows the improvements that are possible with a stack of 4 RAW exposures:

The shorter 32 second exposure was much more suitable for the OnePlus One’s 28mm equivalent lens, keeping the stars looking mostly like pinpoints while still being adequate for light gathering. One thing that surprised me was that the in-camera processing of the jpegs seemed to do a great job at reducing noise in the shadows, and I feel like I had a harder time with the RAW files at bringing out the shadow detail without introducing too much noise. I’m sure it has more to do with the variability in my processing technique, but I was actually quite happy with the jpegs, save for a certain lack of color in the deep shadows. Either way, the smartphone did a great job at producing a usable Milky Way shot, and on the first try too.

I think that these photos are a testament to the progress of photographic technology and shed light on the state of cameras in smartphones. They’re a lot better than most of us might think, but almost all of them are severely limited by software. Apps like Camera FV-5 are delivering a more DSLR-like experience and unlocking a lot of unused potential from smartphone cameras. Final usable resolution, even though it comes from a 13 megapixel camera, is not quite as good as what you could get with a larger lens/sensor camera of similar resolution, but the result is still impressive and really speaks to the capability of the Sony Exmor RS sensor. Stacking multiple separate exposures always produces better results no matter which camera you use, and it’s through this technique of combining multiple images that I think smartphones will really start to improve imaging capability. (More on this idea later.)

I still think that the software error on the OnePlus One that crashes the camera driver when attempting to shoot long exposures in RAW is completely unacceptable. For the time being, I’ll likely stick with Camera FV-5, which prevents the problem from happening by disabling RAW capture for exposures longer than 2 seconds.
Hopefully OnePlus can fix these issues with a future release of software for the phone. Overall, my first experience photographing the Milky Way with a smartphone has me itching for the future. I’ve written more about my thoughts on how mobile photography will progress in the “Final Thoughts” section below.
More Samples
Diana and I continued our travels and eventually had another opportunity to shoot the night sky with the OnePlus One, this time from the magical setting of White Sands National Monument in New Mexico. We secured the last permit of the day for the backcountry hiking and camping trail. Only ten permits are issued per day and it’s the only way to be allowed to stay in the park overnight. After a beautiful sunset hike, being pelted with sand by the evening winds, getting rained on, and a short nap, we woke at 2 AM to clear, dark skies. Below are some results from the night:

When I finally mustered the confidence to venture out into the dark, the clouds were just starting to part. Our tent was pitched at our assigned backcountry campsite #3, on the desert floor between a set of dunes. The gypsum earth is hard and crusty in this area, and a lot of grassy patches and wildflowers grow in these dune troughs. Still drowsy from my nap, I stayed near the tent for my initial photos. Once I was a little more awake, I set off up the sand dunes.

A lone Yucca plant grew out of one of the white dunes near our camp. Although we were almost 100 miles from El Paso, Texas, the city produced a lot of light pollution to the south, which is the orange glow you see behind all of the dunes. After spending some time shooting the Yucca plant, I worked my way farther up to the crest of the dune to look out over the rest of the dune field. Each of the dunes runs like a snake from north to south, separated by the troughs that contain all the desert vegetation. Although they are stark white during the day, the strong light pollution from El Paso colored the dunes orange in most of my night photos. The small bright spot behind the dunes to the far left of the image above is likely another photographer’s headlamp.

The above image is a self portrait. I returned back down the dunes towards the Yucca plant and positioned the OnePlus One right next to it, pointing the lens up towards the galactic center. I took note of the position of the brightest part of the light pollution dome from El Paso and trudged directly up towards the glow, positioning myself right at the apex of the dune. In this case I used Camera FV-5’s interval function to take a number of consecutive 30-second exposures. I just made sure to stand extra still for the length of a couple photos and then picked the best frame.

Before retiring the OnePlus One for the night, I made one final image of our tent with the smartphone, edited it with the Snapseed app and posted it directly to Instagram:
For anyone who would like to see what the photos look like straight out of the phone, the image below is the jpeg file directly from Camera FV-5. You can click the image to download the original frame as an example.
Final Thoughts on Hardware and the Future of Photography
The OnePlus One uses a fairly modern Sony IMX214 Exmor RS CMOS sensor, a 13MP stacked, back-illuminated CMOS sensor. It’s considered a sixth generation Exmor sensor, three full generations newer than the best Sony sensors available in DSLRs and mirrorless cameras (Sony, Nikon, and Fujifilm cameras all use Sony sensors in some form or another). If you’re interested in reading about the evolution of Exmor technology, and why Sony has some of the best sensor technology on the market, check out this great article from Darren Bessette of Framos. Essentially, without going too deep into details, the Exmor RS sensor is significantly more efficient and sensitive and has the electronics architecture to produce images with less noise and better detail, per pixel, than the more conventional CMOS sensor designs found in most full size DSLRs and enthusiast cameras. In other words, the OnePlus One and other smartphones that use similar camera sensors are more technologically advanced designs than our large sensor cameras.

The OnePlus One’s sensor is an older version of the Exmor RS. Other phones like the Samsung Galaxy S6, the Galaxy Note 4 and the upcoming LG G4 all have newer, larger (1/2.6” vs. the smaller 1/3.06”) versions of the Exmor RS sensor. Where the OnePlus One stands out is in its price and in the community behind it: it’s less than $350 for the 64GB Sandstone Black model, unlocked and world GSM capable. That’s nearly half the price of the other flagship phones out there. It’s also supported by a huge community of fans and developers who love to tinker with tech, and it’s through this awesome group of people that the One has acquired the capability to utilize its camera to its full potential.

I’m convinced that it’s one of the only phones on the market capable of fully utilizing its sensor, the other being the upcoming LG G4, with its newer, larger Exmor RS sensor, industry-best f/1.8 lens, and enthusiast-oriented camera app that allows manual control and exposures up to 30 seconds long. The G4 is likely to outperform the OnePlus One, but it’s also predicted to be priced around $600, twice as much as the base-level OnePlus One.
What About Physics?
Now, even the latest and greatest smartphones still have physical limitations based on their lens and sensor sizes: it’s just not possible to collect as much total light with a smaller lens and smaller sensor. That’s the primary reason why smartphone cameras are not as good as large sensor DSLRs. Larger sensors and lenses collect more light, and that capability leads to generally higher image quality through cleaner images with less noise. Physics dictates that a camera can only record as much light as it collects, and a smaller lens collects a lot less light. Assuming equal levels of sensor efficiency, the only way for a smartphone to collect more light with such a small lens is to increase the exposure time or increase the number of cameras. And that leads me to the technology that will help close the gap and help smartphones surpass the capability of current day DSLRs: multi-camera arrays.
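To put a rough number on “a lot less light”: for a given field of view, the light a lens collects scales with the area of its entrance pupil, which is the focal length divided by the f-number. A quick comparison of the OnePlus One’s 3.79mm f/2.0 lens against a hypothetical full-frame 28mm f/2.0 lens framing the same scene:

```python
import math

def pupil_area_mm2(focal_mm, f_number):
    """Entrance pupil area, which governs how much light the lens collects."""
    diameter = focal_mm / f_number
    return math.pi * (diameter / 2) ** 2

phone = pupil_area_mm2(3.79, 2.0)        # the OnePlus One's lens
full_frame = pupil_area_mm2(28.0, 2.0)   # full-frame lens with the same field of view

print(f"full frame gathers ~{full_frame / phone:.0f}x more light")  # ~55x
```

That’s roughly a 55x difference in collected light for the same framing and shutter speed, which is exactly the gap that longer exposures or multiple cameras have to make up.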
The Future of Mobile Photography
Multi-camera arrays. In the same manner that I used multiple separate exposures on the OnePlus One to enhance the results through image stacking, collecting more light by increasing the total exposure time, a multi-camera array would allow a smartphone to perform the same operation automatically and instantaneously at the time of capture. Essentially, the more data we can collect at the time of capture, the cleaner and more detailed the photo will be.

This idea isn’t new. The practice of combining multiple exposures to create more detailed and less noisy photos is a technique that has been used for decades by astrophotographers. For practical (and financial) reasons the technique has generally been performed with a single camera: shooting multiple exposures, one after another, and combining them together into a final, more detailed image. The manual process is slow and time consuming but can yield greatly improved results. Certain modern cameras (the OnePlus One included) have image stacking features that can perform this multi-exposure operation by combining several consecutive exposures together in the camera. On Sony cameras like the RX100 III, it’s called “Multi-Frame Noise Reduction” and in the OnePlus One’s default camera app, the feature is called “Clear Image”. These multi-shot exposure modes noticeably improve image quality and reduce noise in many photographic situations, but they’re not practical for situations with moving subjects. Using a similar method of combining multiple exposures to enhance image quality, Olympus has also released the OM-D E-M5 Mark II, which can automatically use multiple consecutive exposures and a special sensor shift mechanism to record 40+ megapixel files from a 16 megapixel sensor. This superresolution feature has the same issue of not being able to properly handle moving subjects. I wrote another article about how to make your own superresolution images on our sister photography site, Photon Collective.

In astrophotography, combining multiple exposures of the stars relies on re-alignment of the images (which I review in my stacking tutorial) or the use of a star tracker, since the stars move across the sky with the rotation of the Earth. The consecutive exposure and alignment/tracking process requires a lot of time, both for capture and for post processing. My photograph of the Yucca plant from White Sands is an example of one of these stacks, a combination of 4 frames averaged together to reduce noise. The process required two passes, one for the foreground and one for the sky. You can move the slider on the image to see what the original single frame looked like before stacking.

Being able to capture and process all of the separate images simultaneously by using a whole bunch of cameras (a multi-camera array) could bring this technique of image enhancement to the snapshot type of shooting that we’re typically accustomed to in most of our everyday photography. As much as I enjoy the process of shooting a multiple long exposure image stack with my smartphone, there’s definitely a better way, and that’s with more cameras, all shooting in parallel. Smartphones are a perfect application for affordable multi-camera arrays. A smartphone camera module costs only tens of dollars. At the current price of about $20 per camera module, an array of 16 cameras in a smartphone would only cost about $320. As miniaturization progresses, it will be easier and easier to fit more and more camera modules into a single device, and at lower cost, too.
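The statistics behind why this works are simple: averaging frames keeps the signal while random noise partially cancels, so the noise level falls roughly with the square root of the number of frames combined, whether those frames come one after another or from an array shooting in parallel. A tiny simulation with made-up numbers shows the trend:

```python
import numpy as np

rng = np.random.default_rng(0)
true_signal = 100.0   # arbitrary "clean" pixel value
noise_sigma = 20.0    # arbitrary per-frame noise level

for n_frames in (1, 4, 16):
    # Simulate n_frames noisy readings of the same pixel, many times over.
    frames = true_signal + rng.normal(0.0, noise_sigma, size=(100_000, n_frames))
    stacked = frames.mean(axis=1)
    print(f"{n_frames:2d} frames -> remaining noise ~{stacked.std():.1f}")
    # Prints roughly 20, 10, 5: sigma divided by sqrt(N).
```

Four frames halve the noise and sixteen frames cut it to a quarter, which is why a 16-camera array is such an appealing target.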
You already saw the improvement possible by combining 4 separate photos above. Now imagine if you had an array of 16 or more cameras that could perform the same operation instantaneously and automatically from a single device. There are already market indications that multi-camera array technology is coming to our smartphones. Apple Inc. recently acquired Linx Imaging, a company developing multi-camera array technology. Similarly, another startup called Light recently announced a deal with Foxconn, the same manufacturer that Apple Inc. uses to build its iPhones and MacBooks. Light is pitching its multi-camera array technology as one that will be capable of matching the capability of a DSLR.

Combining the data from about 16 separate 1/2.6” sensors would allow a smartphone to start to approach the total light sensor area and light gathering capability of a typical DSLR sensor. A smartphone with the same total sensor area as a DSLR should be able to match the light gathering capability and thus come closer to the image quality of the DSLR. It’s with some form of this technology that smartphones will start to close the gap and catch up with larger sensor DSLRs. With all the recent tech-talk about multi-camera arrays, I imagine we’ll start seeing the technology come to smartphones soon.
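As a rough sanity check of that claim, here are approximate sensor areas. The phone sensor dimensions are my own ballpark figures for a 1/2.6"-type chip, not manufacturer specifications:

```python
# Approximate sensor areas in mm^2 (assumed, rounded dimensions).
phone_1_2_6 = 5.5 * 4.1       # ~23 mm^2 for a 1/2.6"-type sensor
aps_c_dslr = 23.5 * 15.6      # ~367 mm^2, a typical DSLR (APS-C) sensor
full_frame = 36.0 * 24.0      # 864 mm^2

array_total = 16 * phone_1_2_6  # ~361 mm^2 of combined light-collecting area

print(f"16-sensor array: ~{array_total:.0f} mm^2")
print(f"APS-C DSLR:      ~{aps_c_dslr:.0f} mm^2")
print(f"Full frame:      ~{full_frame:.0f} mm^2")
```

Sixteen of those small sensors add up to roughly the area of a typical APS-C DSLR sensor, which is why the comparison to DSLR-level light gathering isn’t as far-fetched as it might sound.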
Closing Remarks
I hope this article presented some solid evidence for the rapid increase in the capability of smartphones. My photos of the Milky Way made on my smartphone don’t quite rival the quality of a full frame camera like the Sony a7S, but the results are exciting nonetheless. The camera in the OnePlus One is honestly severely limited by its physical size and yet it’s still possible, with some careful work and post processing, to create some photographs that were previously thought to be impossible on a smartphone. If manufacturers continue developing hardware and software combinations like the OnePlus One and LG G4 that directly target the enthusiast level camera, the smartphones of the future are going to be awesome.
About the Author
Ian Norman is a photographer, blogger and full-time traveler. He is deeply passionate about photography and takes great joy in teaching others. He is the creator of Lonely Speck, a blog about astrophotography, and he recently started The Photon Collective, a new community for photographers. This article originally appeared here. You can keep up to date on Ian’s work by following his YouTube, Instagram, Twitter or Facebook.