To my eye, it never really works: a photo of a landmark or landscape under a dull and lifeless sky, transfused with instant awesomeness via “AI Powered” sky replacement software.
You know what I’m talking about: a city skyline in front of a sunset that looks like a scene from the Book of Revelation. Rock and cactus in Arizona silhouetted against the blazing heart of the Milky Way, as if our solar system were actually in galactic downtown and not out on the dusty fringe. An old castle shot on a gray day just as the tour bus was leaving – when the photographer got home, it magically reappeared under blue sky and fluffy clouds.
These images might impress, but the buzz fades quickly because in the visual center of your brain, something is saying wait a minute… this doesn’t add up. I’m not claiming I could reliably distinguish real photos from sky-replacements, but I think over time my score would be a lot better than chance. And if we could rate photos on how people really reacted to them, subconsciously, the truth would be told: something just doesn’t feel right.
There are things about natural outdoor light that our brain processes in sophisticated ways that we’re not directly aware of. One is color, another is direction.
The color of clouds is a function of the angle of the sun and the size of the water droplets they contain; larger droplets scatter bluer light, though a low sun can counteract that. Nitrogen dioxide – smog – makes clouds yellowish over cities. Blue sky comes from molecular scattering of photons, and varies a lot depending on the direction of the sun, the direction you’re looking, the altitude, and the time of day. Overall, the color temperature of outdoor light varies from about 2,000K to 5,500K depending on a whole bunch of factors. The bottom line is that despite the AI hype, the temperature of a replacement sky is very unlikely to match the temperature of the rest of the scene, and that’s not easy to correct, even for a human. We sense the temperature of the sky and it affects our mood.
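If you’re curious how big that mismatch can be, you can put a number on it. Here’s a minimal sketch that estimates the correlated color temperature (CCT) of an average color using McCamy’s well-known approximation; the sample RGB values for a “warm foreground” and a “cool replacement sky” are made up for illustration, not measured from any real photo:

```python
def cct_from_rgb(r, g, b):
    """Rough CCT estimate, in kelvin, from linear sRGB values in [0, 1]."""
    # Linear sRGB -> CIE XYZ (D65 reference white)
    big_x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    big_y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    big_z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    total = big_x + big_y + big_z
    x, y = big_x / total, big_y / total   # chromaticity coordinates
    # McCamy's approximation for correlated color temperature
    n = (x - 0.3320) / (0.1858 - y)
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

warm = cct_from_rgb(1.0, 0.7, 0.4)   # hypothetical sunset-lit foreground
cool = cct_from_rgb(0.5, 0.7, 1.0)   # hypothetical blue overcast sky
```

Averaging pixels in the foreground and in the pasted-in sky of a composite, then comparing the two estimates, will often show a gap of thousands of kelvin – the kind of inconsistency a single global white-balance slider can’t fix.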
Sky replacement software doesn’t even try to determine the direction and diffusion of the original light source and match it. That sort of thing can be done, but it requires a lot more processing power and software sophistication than is found on a photographer’s PC. Pixar Animation understands this, but their animators control the light source and the digital modeling of everything in the scene, so the “ray tracing” is to some extent straightforward; even so, it requires a serious server cluster. (BTW they’ve been doing this since the 90s and don’t even bother claiming it’s “Powered By AI”.) Hollywood CGI can produce some amazing things, but it’s all guided and fine-tuned by highly skilled people, and aimed at a different aesthetic.
Dung beetles roll their collections in a direction determined by stars and the Milky Way. Migratory birds navigate by the direction and polarization of light rays angled low to the horizon. So it’s a good bet our brains do at least some of that too, subconsciously. And we have stereoscopic vision which is constantly building a 3D model of our surroundings based on the angles of light, shadows, and reflections. We sense when that model is “off” or inconsistent, although we may not immediately know why.
So Go Big
I say if you’re going to replace a sky, make a statement: add something to the story, don’t just try to salvage a photo. Maybe aim for something a bit surreal. And here’s a tip: black-and-white means no issues with mismatched color temperature.
I got a good shot of this young Bald Eagle as he swooped overhead, but even with a 600mm lens the frame was 80% empty blue sky. Little to lose, so I added some dramatic clouds I’d photographed elsewhere, lit from roughly the same direction, and rotated the eagle for a better match. The result seems to me to say “up”, like he’s climbing the sky, gaining altitude. I won’t say it looks real, but I like it. Maybe I’ll sell motivational posters.
This is empty, unsold space in a downtown office tower; there was nothing but blank sky outside the windows. I wanted to emphasize the weird sci-fi look of this space – so I used some Hubble images (with attribution) to put the building out there, somewhere in the cosmos…
There’s an obvious ‘ethical’ question here as well. If I’m selling prints of that eagle image, should I make it clear what I’ve done? If I don’t, and a buyer finds out later and is disappointed, I’d feel bad, like I’d cheated him. So that’s my ethical position. But here’s a recent post on FStoppers.com, where the opinions are really flying.