VFX artists debunk UFO sightings sent by fans, exposing AI fakes and perspective tricks

VFX artists analyze fan-submitted UFO clips, revealing common AI tells, perspective illusions, and why human brains see aliens where there are balloons and contrails.
If you have ever watched a grainy video of a glowing orb zipping across the night sky and felt a shiver of possibility, you are not alone. But a team of visual effects artists has a message for you: slow down, look at the shadows, and count the seconds. The same skills that make them experts at making the impossible look real also make them experts at spotting when something is faked. And right now, the UFO community is being flooded with fakes — the majority generated by artificial intelligence.
In a recent episode of a popular VFX-focused series, a group of professional artists dug through clips submitted by viewers to their debunking email address. The results range from obvious AI hallucinations to textbook perspective illusions. Some clips are cleverly crafted visual effects pieces that fooled even the artists for a moment. But none of them are aliens.
The AI flood: short clips and impossible physics
The first clip submitted to the team shows a security cam view of a bright light moving through trees at night. The light illuminates leaves and casts shadows that shift as it passes. At first glance, the video looks convincing — the light source behaves plausibly, the shadows move. But the artists picked it apart in seconds.
“The light looks so good, though. It actually looks like a real light source,” one artist said. The problem is the shadow at the base of a telephone pole: as the light moves, that shadow should rotate around the pole. Instead, it stays pointed in the same direction the entire time. That is a classic AI failure. Neural networks can approximate realistic lighting but often fail to maintain consistent geometry over time.
There is another tell: the camera frame itself begins to shift, revealing that the entire scene is a generated image rather than real security footage. The doorframe wiggles in ways a real fixed camera would not. The artists quickly labeled it AI-generated.
The next clip, sent as “red horizon pulse,” shows what appears to be a massive red energy beam in the sky, captured from two angles. Multiple camera angles often signal credibility because AI video generators have struggled to maintain consistency between different views. But in this case, the two angles show completely different events — one a beam below clouds, the other an explosion above. The artists identified the technique: image-to-video generation. Each clip runs about 15 seconds, the typical maximum length offered by current AI video services. To create longer clips, the creator used a frame at the 15-second mark as the starting image for a new generation, creating an obvious jump cut where the scene abruptly changes direction. The explosion clip, for instance, goes from a steady glow to sudden bursting fire the moment the new generation kicks in.
This duration limit is the single easiest way to spot AI video today. Most consumer AI video tools — including Runway Gen-2, Pika, and others — cap generations at roughly 4 to 16 seconds. Clips that are exactly 15 seconds long with a visible seam at the midpoint are almost certainly AI-generated.
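That duration heuristic is simple enough to sketch in a few lines of Python. The thresholds below are illustrative assumptions drawn from the ranges mentioned above, not published specs from any particular AI video service:

```python
# Illustrative sketch of the clip-length heuristic. The numeric
# thresholds are assumptions, not measurements of any specific tool.

AI_CLIP_MIN = 4.0   # assumed lower bound of consumer AI generations (seconds)
AI_CLIP_MAX = 16.0  # assumed upper bound (seconds)

def duration_is_suspicious(duration_s: float, tolerance: float = 0.5) -> bool:
    """Flag clips whose length sits in the typical AI-generation window,
    or whose total length looks like several generations stitched together."""
    if AI_CLIP_MIN <= duration_s <= AI_CLIP_MAX:
        return True
    # Longer clips built by re-feeding the last frame into a new generation
    # tend to land near exact multiples of a cap (e.g. ~30 s = two 15 s runs),
    # with a visible seam at each join.
    for cap in (10.0, 15.0, 16.0):
        segments = round(duration_s / cap)
        if segments >= 2 and abs(duration_s - segments * cap) <= tolerance:
            return True
    return False

print(duration_is_suspicious(15.0))  # True: right at a common cap
print(duration_is_suspicious(73.4))  # False: no clean relationship to caps
```

A real clip length would come from the file's container metadata; the point of the sketch is only that "15 seconds, with a seam in the middle" is a mechanical check, not a judgment call.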
That is not a UFO, it is an incoming missile
Not every submitted clip is computer-generated. One video purports to show a UFO destroying two Russian missiles. The footage is dramatic: a bright object appears to hover, then two streaks fly toward it, and the object accelerates away. The artists recognized it immediately.
“This is the actual anti-missile defense system in Israel trying to shoot down Iranian missiles,” one explained. The streaks are not Russian warheads but Israeli interceptors launched from the ground. The supposed “UFO” is the incoming Iranian missile, seen from a camera angle that flattens depth. What looks like a stationary alien craft is actually a rocket descending from high altitude. The moment it appears to flee is simply the missile getting closer to the camera and dropping below the clouds before impacting the ground. Perspective, combined with a lack of spatial context, tricks the brain into seeing agency where there is only ballistics.
This kind of misidentification is common. Without reference points — trees, houses, the horizon line — the human visual system defaults to a 2D interpretation. Small objects far away look static; fast-moving objects on a collision course appear to accelerate. The artists noted that the same footage has been circulating online for years, attached to different narratives depending on which conflict is in the news.
Balloons, lanterns, and contrails: the mundane truth
The majority of genuine non-AI clips sent to the team turned out to be ordinary atmospheric objects. A white, stationary dot over New Mexico, filmed by a man named Joshua during a trip with his father, got a careful examination. Joshua argued that the object did not move with the wind. The artists noted that it sits at high altitude, possibly a weather balloon or a NASA research balloon; NASA runs its annual fall balloon campaign at Fort Sumner, New Mexico, during exactly that window. The lack of apparent motion is parallax: a distant object shifts very little against the sky even when the wind blows, while nearer clouds drift visibly faster.
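The parallax argument is just arithmetic: apparent angular speed is ground speed divided by distance. A quick sketch with illustrative numbers (the altitudes and wind speed below are assumptions, not figures from the clip):

```python
import math

def apparent_angular_speed_deg(speed_ms: float, distance_m: float) -> float:
    """Apparent angular speed (degrees per second) of an object moving at
    speed_ms, seen from distance_m away. Small-angle approximation:
    omega = v / d (radians/s), converted to degrees."""
    return math.degrees(speed_ms / distance_m)

# Assumed numbers for illustration: a research balloon near 30 km altitude
# and a low cloud near 1.5 km, both drifting at 10 m/s.
balloon = apparent_angular_speed_deg(10, 30_000)
cloud = apparent_angular_speed_deg(10, 1_500)
print(f"balloon: {balloon:.3f} deg/s, cloud: {cloud:.3f} deg/s")
```

At those distances the balloon crawls across the sky roughly twenty times slower than the cloud, so on camera it reads as a fixed dot even though both are riding the same wind.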
Another clip from Colombia shows a classic flying saucer shape ascending above a city. The artists agreed it did not look AI-generated. But they identified it as a sky lantern — an open flame inside a paper bag that rises on hot air. “It is not technically a balloon,” one artist argued, sparking a debate about the definition of a balloon. (Google’s definition: a flexible bag made of rubber, latex, or other materials filled with gas, sealed at the neck. A sky lantern, being open, fails that test.) The shape and motion match perfectly: a bright light wobbling upward, propelled by heat.
Perhaps the most persistent visual mystery is the “chemtrail” — a long, dark, vertical line in the sky that appears to form and fade mysteriously. One submitter captured a lion-shaped cloud formation alongside a strange vertical stripe. The artists identified it as a contrail (condensation trail) from an airplane flying directly away from the camera. The dark color comes from the sun setting behind it, putting the trail in shadow. The fact that it appears only partway across the sky is due to variable humidity at altitude. Contrails form only when the atmosphere is cold and moist enough for the water vapor in jet exhaust to condense and freeze into ice crystals. If the air is dry, the trail either does not form or evaporates quickly.
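The "trail that stops partway across the sky" follows directly from those two conditions. Here is a toy model of the logic, deliberately simplified (the real criterion, known as the Schmidt-Appleman criterion, depends on pressure, fuel properties, and engine efficiency; the thresholds below are assumptions for illustration only):

```python
def contrail_persists(temp_c: float, rel_humidity_ice_pct: float) -> bool:
    """Toy model, NOT the real Schmidt-Appleman criterion: a trail forms
    when the air is cold enough for exhaust vapor to freeze, and persists
    only when the ambient air is already near ice saturation."""
    forms = temp_c <= -40.0                    # assumed formation threshold
    persists = rel_humidity_ice_pct >= 100.0   # ice-supersaturated air
    return forms and persists

# A plane crossing a dry patch and then a moist patch at the same altitude
# leaves a trail over one stretch of sky and nothing over the other.
print(contrail_persists(-55.0, 60.0))   # dry air: trail evaporates
print(contrail_persists(-55.0, 110.0))  # moist air: trail lingers
```

Because humidity varies patch by patch at cruise altitude, one flight path can produce a trail that appears and disappears in segments, which is exactly the "mysterious partial line" effect.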
When VFX artists build a UFO effect
One of the more sophisticated submissions was a two-minute clip showing a bright dot appearing at the end of a contrail and then gradually erasing the cloud from the sky. The team was impressed. “This feels a lot like visual effects,” one said. They were right.
By stabilizing the footage, they noticed the contrail itself shows no motion: it is a static background element. The dissolving effect was almost certainly created by motion-tracking the real contrail, compositing a clean sky plate behind it, and using an animated fractal noise pattern to eat away at the density. The telltale sign: at the edges of the erasure, the clouds feather into transparency in a way real clouds do not. Real clouds have hard edges; this clip has a soft, painted transition. The bright dot moves along perfect Bézier curves, indicating 2D animation rather than a real object moving through 3D space.
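The noise-driven erasure the artists describe is a standard dissolve technique. A minimal sketch of the idea, using plain pseudo-random noise in place of the fractal noise a compositing package would generate:

```python
import random

def noise_grid(w: int, h: int, seed: int = 7) -> list[list[float]]:
    """Static pseudo-random field standing in for fractal noise.
    (A real VFX dissolve would use coherent fractal noise, which gives
    organic clumpy edges instead of per-pixel speckle.)"""
    rng = random.Random(seed)
    return [[rng.random() for _ in range(w)] for _ in range(h)]

def dissolve_mask(noise: list[list[float]], t: float) -> list[list[float]]:
    """At time t in [0, 1], keep only pixels whose noise value exceeds t.
    Animating t upward 'eats away' the contrail's density over time."""
    return [[1.0 if v > t else 0.0 for v in row] for row in noise]

def coverage(mask: list[list[float]]) -> float:
    flat = [v for row in mask for v in row]
    return sum(flat) / len(flat)

noise = noise_grid(8, 8)
for t in (0.0, 0.5, 1.0):
    print(f"t={t}: {coverage(dissolve_mask(noise, t)):.2f} of the trail remains")
```

Soften the threshold into a ramp instead of a hard cutoff and you get precisely the feathered, painted-looking transparency that gave this clip away.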
“It is a very good UFO effect shot,” one artist conceded. The only natural explanation would be a spacecraft emitting enormous heat to vaporize ice crystals, but that would produce other visible artifacts — heat shimmer, condensation trails, or radiation. None are present.
What this means for the UFO community
The volume of AI-generated fake footage has exploded in the past year. Models like OpenAI’s Sora, Google’s Veo, and open-source alternatives have made it trivially easy to produce convincing video of any imagined phenomenon. The UFO community, already struggling with hoaxes and misinterpretations, now faces an ocean of synthetic content that is harder to spot than ever.
The VFX artists pointed out a grim irony: if real alien evidence ever surfaces, it will be drowned in a sea of fakes. The same technology that lets artists create believable fantasy worlds also lets anyone fabricate UFO footage indistinguishable from real security camera captures — unless you know where to look.
The standard debunking toolkit is straightforward:
- Check the length. If a clip runs roughly 10-16 seconds or has an obvious seam at the midpoint, it is likely AI image-to-video.
- Examine shadows. Do they rotate around objects as the light moves, or stay fixed?
- Look at camera movement. Does the frame distort in unnatural ways? Do building edges wobble?
- Ask about perspective. Is there a sense of depth, or does everything look flat?
- Research the event. The same missile-defense footage has been debunked for years.
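The toolkit above amounts to counting red flags. A hypothetical scoring sketch, purely for illustration (the flag names and two-strike threshold are invented, not part of any real tool):

```python
# Hypothetical checklist scorer. Flag names and the threshold are
# assumptions made for this sketch, not from the artists' workflow.

CHECKS = {
    "ai_length_window": "clip runs ~10-16 s or has a midpoint seam",
    "fixed_shadows": "shadows stay put while the light source moves",
    "frame_wobble": "camera frame or building edges distort unnaturally",
    "no_depth_cues": "no trees, houses, or horizon to anchor distance",
    "recycled_footage": "same clip previously debunked under another story",
}

def suspicion_score(flags: set[str]) -> str:
    """Map a set of observed red flags to a rough verdict."""
    hits = flags & CHECKS.keys()  # ignore flags the checklist doesn't know
    if len(hits) >= 2:
        return "likely fake or misidentified"
    if len(hits) == 1:
        return "worth a closer look"
    return "no obvious red flags"

print(suspicion_score({"ai_length_window", "fixed_shadows"}))
```

No single flag is proof, but in the clips the artists reviewed, the fakes rarely tripped just one.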
For now, the most likely explanation for almost any UFO video remains prosaic: a balloon, a plane, a satellite, a camera artifact, or a deliberate hoax. As AI generation improves, those categories will only expand. The artists’ advice is simple: enjoy the cool imagery, but be skeptical. And if you see a lion in the sky, it is probably just a contrail.
The search for extraterrestrial life continues. But it is going to have to compete with a lot of very good visual effects.
Staff Writer
Tessa writes about music, television, and digital media trends.