From The Verge:
One of the most impressive demos at Google I/O started with a photo of a woman in front of a waterfall. A presenter onstage tapped on the woman, picked her up, and moved her to the other side of the image, with the app automatically filling in the space where she once stood. They then tapped on the overcast sky, and it instantly bloomed into a brighter cloudless blue. In just a matter of seconds, the image had been transformed.
The AI-powered tool, dubbed the Magic Editor, certainly lived up to its name during the demo. It’s the kind of tool that Google has been building toward for years. It already has a couple of AI-powered image editing features in its arsenal, including the Magic Eraser, which lets you quickly remove people or objects from the background of an image. But this type of tool takes things up a notch by letting you alter the contents — and potentially, the meaning — of a photo in much more significant ways.
While it’s clear that this tool isn’t flawless — and there remains no firm release date for it — Google’s end goal is clear: to make perfecting photos as easy as just tapping or dragging something on your screen. The company markets the tool as a way to “make complex edits without pro-level editing tools,” allowing you to leverage the power of AI to single out and transform a portion of your photo. That includes the ability to enhance the sky, move and scale subjects, as well as remove parts of an image with just a few taps.
Google’s Magic Editor attempts to package all the steps that it would take to make similar edits in a program like Photoshop into a single tap — or, at least, that’s what it looks like from the demo. In Photoshop, for example, you’d use the Content-Aware Move tool (or another method of your choice) to pick up and move a subject inside of an image. Even then, the photo still might not look quite right, which means you’ll have to pick up other tools, like the Clone Stamp tool or maybe even the Spot Healing Brush, to fix any leftover artifacts or a mismatched background. It’s not the most complicated process ever, but as with most professional creative tools, there’s a definite learning curve for people who are new to the program.
I’m all for Google making photo editing tools free and more accessible, given that Photoshop and some of the other image editing apps out there are expensive and pretty unintuitive. But putting powerful and incredibly easy-to-use image editing tools into the hands of, well, just about everyone who downloads Google Photos could transform the way we edit — and look at — photos. There have long been discussions about how far a photo can be edited before it’s no longer a photo, and Google’s tools push us closer to a world where we tap on every image to perfect it, reality or not.
. . . .
To be fair, there are a ton of similar photography-enhancing features that are built into smartphone cameras. As my colleague Allison Johnson points out, mobile photography already fakes a lot of things, whether it’s by applying filters or unblurring a photo, and doctored images are nothing new. But Google’s Magic Editor could make a more substantial form of fakery easier and more attractive. In its blog post explaining the tool, Google makes it seem like we’re all in search of perfection, noting that the Magic Editor will provide “more control over the final look and feel of your photo” while giving you the chance to fix a missed opportunity and make a photo look its best.
Call me some type of weird photo purist, but I’m not a fan of editing a photo in a way that would alter my memory of an event. If I were taking a picture of a wedding and the sky was cloudy, I wouldn’t think about swapping it for something better. Maybe — just maybe — I might consider moving things around or amping up the sky on a picture I’m posting to social media, but even that seems a little disingenuous. But, again, that’s just me. I could still see plenty of people using the Magic Editor to perfect their photos for social media, which adds to the larger conversation of what exactly we should consider a photo and whether or not that’s something people should be obligated to disclose.
Link to the rest at The Verge
At the risk of seeming pedantic, an unedited photo is most definitely not the same as the object/person/scene photographed. The negative image created by a film-based camera is much different from the electronic file created by a digital camera.
When the first high-end digital cameras appeared (think Nikon, Canon, etc.), many photographers, including some superb artists, thought digital images could never match film’s subtle gradations in tone and smoothness. At first, they were correct. The first digital cameras PG used produced pretty crude images.
However, as with a great many things technical, the engineers got smarter, the digital cameras got much, much, much better, and then Apple put quite a nice digital camera into every iPhone.
The last time PG checked, some time ago, 35 mm film was very hard to find. He expects that, in the hands of serious experts with decades of experience, some large-format film cameras still provide benefits that digital cameras cannot match, but that was and remains the smallest, albeit an important, part of the camera universe.