THE GOOGLE PIXEL is the best smartphone you can buy, thanks in part to its killer camera. But its photo smarts go beyond the specs. Like many of Google’s greatest hits, the magic’s in the algorithms.

The Pixel offers compelling evidence that phones haven't just replaced point-and-shoot cameras. They've perfected the concept. Smartphone cameras are smarter and faster, and they're always in your pocket. Better still, they can improve your shots even after you've taken them.

Automatic For the People
According to independent analysis from DxO Mark, the Pixel and Pixel XL feature the best smartphone cameras yet, despite specs that, while competitive, don't scream "Best Camera Ever." What makes the Pixel camera so good is that you don't have to think about it. And it's fast. You just point, shoot, and let the Pixel do its thing.

The Pixel's marquee mode, dubbed HDR+, captures several images in rapid sequence, analyzes their attributes, works some tone-mapping magic, and creates a single optimized image. It's the default auto mode, and the one DxO used to evaluate the camera. Another cool feature, Smart Burst, captures up to 30 images per second, analyzes things like color balance, blur, and even facial expressions, and marks the shot in your gallery that it considers best. It's often right.
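Best-shot selection along these lines can be approximated with simple per-frame scoring. Here's a minimal sketch using OpenCV; the variance-of-the-Laplacian sharpness metric, the exposure heuristic, and the weights are illustrative stand-ins, since Google hasn't published Smart Burst's actual scoring pipeline.

```python
# Minimal sketch of burst-style best-shot selection. The metrics and
# weights are illustrative assumptions, not Google's actual pipeline.
import cv2
import numpy as np

def sharpness(frame):
    """Variance of the Laplacian: higher means less motion blur."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def exposure(frame):
    """1.0 at mid-gray mean luminance, falling toward 0.0 at the extremes."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return 1.0 - 2.0 * abs(gray.mean() / 255.0 - 0.5)

def pick_best(frames):
    """Score every frame in the burst and return the index of the winner."""
    sharp = np.array([sharpness(f) for f in frames], dtype=np.float64)
    sharp /= sharp.max() + 1e-9                      # normalize to [0, 1]
    expo = np.array([exposure(f) for f in frames])
    return int(np.argmax(0.7 * sharp + 0.3 * expo))  # weights are arbitrary
```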

The previous-generation Nexus smartphones, the 5X and 6P, offered HDR+ and Smart Burst. The big difference here is that Google could "control the entire stack," according to the Pixel team. "It's something we've been working on for a few years in terms of building up the expertise and intellectual property," says Android VP of Engineering Dave Burke, who adds that the camera team worked closely with the Google Photos team on the capture and post-capture pipeline. It shows.

Sensor Surprise
It helps that Google didn't skimp on components. The 12-megapixel sensor comes from Sony, the global market leader in CMOS image sensors; Sony silicon shows up in everything from iPhones to drones to Nikon DSLRs and, of course, Sony's own excellent cameras. The Pixel features the state-of-the-art IMX378. The only other phone to offer it is Xiaomi's Mi 5S.

The sensor's photosites (the buckets that receive photons through the lens) are big for a smartphone. Aided by the Pixel's wide-aperture f/2.0 lens, they sponge up more light, making the phone more adept in dark conditions. But low-light photography is just one way Google squeezes the most out of the sensor's bigger pixels. The larger light receptacles also help stabilize photos: because they're more efficient at gathering light, the phone can use faster shutter speeds.
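The arithmetic behind that tradeoff is simple: light gathered per pixel scales with photosite area. A back-of-the-envelope worked example, using the IMX378's published 1.55-micron pixel pitch against an assumed typical 1.12-micron smartphone pixel:

```python
# Light gathered per pixel scales with photosite area, so a bigger pixel
# reaches the same signal in proportionally less exposure time.
# 1.55 um is the IMX378's published pixel pitch; the 1.12 um comparison
# is an assumed typical smaller smartphone pixel.
big_um, small_um = 1.55, 1.12
light_ratio = (big_um / small_um) ** 2
print(f"{light_ratio:.2f}x more light per pixel")         # ~1.92x
print(f"shutter needs ~{1 / light_ratio:.2f}x the time")  # ~0.52x: about half
```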

“We’re keeping exposure time short,” says Android camera engineering manager Tim Knight. “The strategy is to take multiple frames at short exposure times and then fuse them together with some clever software. That gives a similar quality to one image with a really long exposure time. It’s not just that you’re eliminating blur, you’re also better capturing moving subjects in the scene.”
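That fusion works, in principle, because random sensor noise averages out across frames while the underlying signal doesn't. A toy sketch of the idea, assuming the frames are already aligned; real pipelines like HDR+ also do robust alignment and merging, which this omits.

```python
# Toy multi-frame fusion: averaging N aligned short exposures cuts random
# noise by roughly sqrt(N), approximating one long exposure without its
# motion blur. Assumes pre-aligned frames of a static scene.
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((64, 64), 120.0)                            # "true" luminance
frames = [scene + rng.normal(0, 20, scene.shape) for _ in range(8)]

fused = np.mean(frames, axis=0)
print(round(np.std(frames[0] - scene), 1))   # per-frame noise: ~20
print(round(np.std(fused - scene), 1))       # fused noise: ~20/sqrt(8) ~ 7
```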

Keeping It Stable
Software also helps overcome a perceived deficiency of the Pixel's camera: it lacks optical image stabilization (OIS), a headline feature for flagships like the Galaxy S7 lineup and the iPhone 7.

“With larger pixels, you don’t need OIS as much,” says Knight. He says several things about the camera help stabilize images without OIS: pixel size, faster shutter speeds, and algorithmic analysis of burst-mode frames based on the “lucky imaging” principles of astrophotography.
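The lucky-imaging idea itself is simple: score every frame in the burst and keep only the sharpest slice before merging. A hedged sketch, reusing the variance-of-the-Laplacian metric from earlier; the keep fraction is arbitrary, and the real selection heuristics aren't public.

```python
# "Lucky imaging" in miniature: astronomers keep only the sharpest few
# frames of a burst before stacking. The keep fraction is an arbitrary
# illustrative value.
import cv2
import numpy as np

def lucky_subset(frames, keep_fraction=0.25):
    """Return the sharpest keep_fraction of frames, ready for fusion."""
    scores = [cv2.Laplacian(cv2.cvtColor(f, cv2.COLOR_BGR2GRAY),
                            cv2.CV_64F).var() for f in frames]
    k = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-k:]               # indices of the sharpest k
    return [frames[i] for i in best]
```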

But in video mode, the Google team thinks it has come up with a better form of stabilization. It’s driven by the phone’s gyroscope. You can jostle the phone while you’re recording video, and the Pixel’s stabilization is evident on the screen. The system samples readings from the gyroscope 200 times per second, then corrects shake as you’re shooting.

“We have a physical wire between the camera module and the hub for the accelerometer, the gyro, and the motion sensors,” Knight explains. “That makes it possible to have very accurate time-sensing and synchronization between the camera and the gyro.”
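Conceptually, gyro-driven stabilization integrates those 200-per-second angular-velocity readings into a per-frame rotation, then shifts or warps each frame by the inverse. A simplified sketch under small-angle assumptions; the focal length and gyro values are invented, and real EIS also smooths the motion path and corrects rolling shutter.

```python
# Simplified gyro-driven EIS. Assumes, per the article, 200 Hz gyro sampling
# with timestamps synchronized to the camera. Integrates angular velocity
# into a per-frame rotation, then converts small pitch/yaw angles into a
# counter-shift in pixels. Focal length and sample values are invented.
import numpy as np

GYRO_RATE_HZ = 200.0
DT = 1.0 / GYRO_RATE_HZ

def frame_rotation(gyro_samples):
    """Integrate (pitch, yaw, roll) rates in rad/s into angles in radians."""
    return np.sum(np.asarray(gyro_samples) * DT, axis=0)

def counter_shift(angles, focal_px):
    """Small-angle model: pitch/yaw become an opposing pixel translation."""
    pitch, yaw, _roll = angles
    return -focal_px * np.tan(yaw), -focal_px * np.tan(pitch)

# At 30 fps, roughly 6-7 gyro samples land within each frame.
shake = [(0.002, 0.010, 0.0)] * 7                 # rad/s of hand wobble
dx, dy = counter_shift(frame_rotation(shake), focal_px=1500)
print(round(dx, 1), round(dy, 1))                 # shift to apply to the crop
```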

Lasers and Beyond
The interplay of sturdy hardware and smart software doesn’t stop there.

“The premise that you should use a burst of shots for everything is a principle that we hold strongly,” says Burke. “There’s a post-processing step where it’s working with the shot, and we’ve sped that up in the Pixel phones with accelerated digital signal processing and post-processing.”

With all that need for speed, perhaps the biggest advantage of the IMX378 sensor is what's fused onto the back of it. The sensor uses Sony's latest "stacked" design, with a DRAM chip attached to its rear that moves image data off the sensor quickly, so Google's post-processing tricks can go to work. The peppy quad-core Qualcomm Snapdragon 821, a powerful GPU, and 4 gigs of RAM help the Pixel work fast.

Each shot in a burst-mode sequence normally looks sharp, thanks to phase-detection autofocus built into the sensor. Compared to the contrast-detection autofocus common in standalone cameras, phase-detection AF is often much faster, and it's more effective at locking focus on objects approaching or retreating from the camera. In the same situations, contrast-detection systems need to "search" in and out with the lens.
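That "search" is easy to picture in code. Below is a toy model of contrast-detection hill-climbing with an invented sharpness curve; phase detection, by comparison, reads the size and direction of defocus in a single measurement and drives straight to focus.

```python
# Toy model of why contrast-detection AF must "search": it only knows
# sharpness at the current lens position, so it hill-climbs toward the
# peak. The sharpness curve and step sizes are invented; direction
# handling and measurement noise are omitted.
def contrast(pos, peak=42):
    """Invented sharpness curve: highest at the in-focus position."""
    return -(pos - peak) ** 2

def contrast_af(start=0, step=8):
    """Step while sharpness improves; on overshoot, halve the step and refine."""
    pos, lens_moves = start, 0
    while step >= 1:
        while contrast(pos + step) > contrast(pos):
            pos += step
            lens_moves += 1
        step //= 2
    return pos, lens_moves

print(contrast_af())   # (42, 6): correct focus, but only after 6 lens moves
```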

But phase-detection relies on having lots of light to work with, making it a bad match for the Pixel’s low-light prowess. So of course, Google augmented it with a secondary system that uses lasers.

"If you look on the back, you'll see two little holes next to the microphone," says Knight. "One is a laser emitter, and one is a laser receiver. It sends out a cone of IR light which reflects off the subject and bounces back. It's a time-of-flight laser, which means it uses the phase of the received light to very accurately figure out how far away the subject is."
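The math behind a phase-based time-of-flight reading is compact: for modulation frequency f and measured phase shift, distance is d = c * dphi / (4 * pi * f), with the 4 * pi accounting for the round trip. A quick sketch; the 20 MHz modulation frequency is an assumed illustrative value, not a published Pixel spec.

```python
# Distance from a continuous-wave time-of-flight reading: the emitted IR
# is modulated at frequency f, and the phase shift dphi of the returned
# light encodes the round-trip delay, so d = c * dphi / (4 * pi * f).
import math

C = 299_792_458.0                       # speed of light, m/s

def tof_distance(phase_shift_rad, mod_freq_hz):
    """4*pi in the denominator because the light travels out and back."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

# A quarter-cycle (pi/2) phase shift at an assumed 20 MHz modulation:
print(round(tof_distance(math.pi / 2, 20e6), 2))   # ~1.87 m to the subject
```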

Unlike Apple with the iPhone 7 Plus, Google built the Pixel camera around rapid-fire photos rather than extra lenses. That may disappoint pro-grade photographers, as might all the processing that makes photos better described as good than accurate. But that's not the target audience. Google designed the Pixel for the everyperson; the Nexus lineup courted a more technical crowd.

The design team even eliminated the "camera bump" found on Nexus phones to make the Pixel more approachable. Instead, the Pixel phones have a wedge-like slope that's thicker near the camera module. "The Nexus 6P, it looks kind of technical," says Burke. "We wanted a broader appeal."

Although the Pixel launched at the top of the phone-photography pack, it’s clear that the team behind the camera has further innovation in mind. Burke and Knight repeatedly referred to the Pixel simply as “this year’s camera,” with sights squarely on what’s coming next. How do you beat the best? With hard work, great components, and lasers, sure. But also with better algorithms.