Ever since the Pixel 5 came out in 2020, Google has included a handy level feature in its camera app that I really enjoy using. It's so good, I'm convinced that just about every other smartphone and camera manufacturer should shamelessly copy it for our collective betterment.
By default, the Pixel camera app automatically summons a virtual horizon when you hold the phone steady to align a shot. You guide two tilting white lines toward a static yellow line, and when your photo is perfectly level from side to side (roll) and front to back (pitch), they all align and turn yellow. You even get a nice haptic buzz the moment your horizon hits zero degrees. The tool's interface changes when you point the phone straight down or up, replacing the lines with two crosshairs. Align the moving white reticle with the static yellow one, and bada bing bada boom, you've got a perfectly flat shot.
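Under the hood, this kind of level is almost certainly derived from the phone's accelerometer: when the device is held roughly still, the sensor reading is dominated by gravity, and the roll and pitch angles fall out with a little trigonometry. Here's a minimal sketch in Python of how the flat-lay crosshair check might work; the function names, the Android-style axis convention, and the half-degree tolerance are my assumptions, not Google's actual implementation:

```python
import math

def roll_pitch_degrees(ax, ay, az):
    """Derive roll and pitch from a raw accelerometer reading (m/s^2).

    Assumes the phone is held roughly still, so (ax, ay, az) is mostly
    gravity. Axis convention here follows Android: with the phone flat
    on a table, screen up, the reading is approximately (0, 0, 9.81).
    """
    roll = math.degrees(math.atan2(ay, az))                    # side-to-side tilt
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))  # front-to-back tilt
    return roll, pitch

def is_flat(ax, ay, az, tolerance_deg=0.5):
    """True when both angles are within tolerance, i.e. time to buzz."""
    roll, pitch = roll_pitch_degrees(ax, ay, az)
    return abs(roll) <= tolerance_deg and abs(pitch) <= tolerance_deg
```

A phone lying perfectly flat would report roughly `(0, 0, 9.81)`, giving zero roll and zero pitch, while one standing on its edge would show a 90-degree roll and fail the check. The real app presumably also smooths the sensor stream before drawing the reticles, which this sketch skips.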
I can’t emphasize enough how useful this is when shooting lay-flat images, such as an aesthetically pleasing arrangement of items on a table or a delicious meal you’ve just had at a fancy restaurant – you know, the stuff that’s perfect for the ’gram. Those pictures look awkward when skewed, because even a slight tilt throws the proportions off.
Google didn’t originate camera leveling with the Pixel. Apple introduced the dual-reticle level in 2017 with iOS 11 on the iPhone X, and Samsung has had one since at least the Galaxy Note 10 generation in 2019. But Apple and Samsung only enable crosshair leveling when their cameras’ grid overlays are turned on, and neither offers horizon lines for standard photos. So Google imitated but also innovated a bit, making for a better experience overall.
As for full-size cameras, they have had virtual horizons for many years, but their implementations have been outclassed by smartphones. My first camera with one was the Nikon D700 in 2009, although it was pretty basic. The sad thing is that bigger cameras haven’t improved much since then. The virtual horizons of expensive flagships like the Sony A1, Nikon Z9, Canon R3, or Leica SL2 work fine for standard photos, but they’re useless once you point the camera down for a simple flat lay. As someone who often uses Sony A9 II and A7 IV cameras for weddings, it drives me crazy knowing that the phone in my back pocket could help me get this shot faster than these state-of-the-art mirrorless cameras.
Why am I so obsessed with this? Maybe I secretly want to shoot something truly level, or maybe I have a bad habit of taking slightly skewed photos when I’m working fast, and I’m a bit of a perfectionist. Either way, I’m convinced it’s not just picky part-time wedding photographers like me who would benefit from this feature. Product and food photography are entire categories that often require straight-down, flat images, and believe me when I say it’s no fun breaking out a physical spirit level to make sure you get it right in camera.
Yes, you can take a bunch of shots and hope one lands, correct skew and off-axis proportions in software afterward, buy one of those bubble levels that takes up your hot shoe, or “git gud” and somehow capture a perfect handheld shot every time. But the technology exists to help everybody get this consistently right on the first try, and it should be implemented in all cameras, from smartphones to professional bodies. Plus, you learn just how many tables and desks in the world are actually a bit crooked.