Quick Summary
- Viral videos on X and TikTok claim to show the aftermath of a historic snowstorm in Kamchatka, Russia, featuring buildings buried in snow and children sliding down frozen walls.
- Independent analysis using HiveModeration and Google's SynthID Detector confirmed all three videos are AI-generated, with detection probabilities reaching 99.9%.
- Reverse image searches traced the origins of the videos to content creators in Turkey and Instagram accounts specializing in AI simulations, not to the Russian peninsula.
- The videos circulated on the same day Kamchatka experienced its heaviest snowfall in 60 years, creating a perfect storm for misinformation.
Impressive videos circulating on social media claim to show the aftermath of a historic snowstorm in Kamchatka, Russia. The footage features buildings buried in snow and children sliding down frozen walls.
However, analysis reveals the footage is entirely synthetic. Independent verification tools confirm the videos are AI-generated fabrications, not documentation of the region's actual weather events.
The Viral Footage
Three specific videos gained traction on X (formerly Twitter) on Monday, January 19. The timing coincided with the heaviest snowfall to hit the Kamchatka peninsula in 60 years.
The first video shows a building completely covered in snow, with children climbing out a window and sliding down the massive drift. The Russian audio translates to: "Wow, it looks like it fell right off the roof. This is the fourth floor, it's scary. But it's fun."
A second clip shows half of a building buried under snow, while a third compilation shows people walking along streets lined with towering snow walls. Captions in English claim that "winter arrived in Kamchatka" and that "life continues with impressive calm" amid the frozen landscape.
"Nossa, parece que cai direto do telhado. Este é o quarto andar, é assustador. Mas é divertido"— Russian audio translation from video
Technical Verification
Forensic analysis of the videos revealed consistent synthetic markers. HiveModeration, a tool designed to detect AI-manipulated media, analyzed all three clips.
The results were definitive:
- Video 1: 99.9% probability of AI-generated images and audio
- Video 2: 99.9% probability of AI generation
- Video 3: 99.9% probability of AI generation
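Scores like these are the kind of output an automated pre-screening step would threshold before escalating a clip for human review. The sketch below illustrates that workflow only in the abstract: the endpoint URL, request fields, and response schema are hypothetical placeholders, not Hive Moderation's actual API.

```python
# Minimal sketch of thresholding an AI-detection score before human review.
# NOTE: the endpoint, payload, and response shape below are hypothetical
# placeholders for illustration; they are not Hive Moderation's real API.
import requests

DETECTION_ENDPOINT = "https://example.com/v1/ai-detection"  # hypothetical
AI_PROBABILITY_THRESHOLD = 0.99  # the clips above scored 99.9%

def screen_video(video_url: str, api_key: str) -> dict:
    """Submit a video URL to a (hypothetical) detector and flag likely AI content."""
    response = requests.post(
        DETECTION_ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"url": video_url},
        timeout=60,
    )
    response.raise_for_status()
    result = response.json()  # assumed shape: {"ai_probability": 0.999}
    probability = result["ai_probability"]
    return {
        "url": video_url,
        "ai_probability": probability,
        "flag_for_review": probability >= AI_PROBABILITY_THRESHOLD,
    }
```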
A visual anomaly in the first video provided immediate evidence of fabrication: a child sliding down the snowbank leaves no footprints or trail in the snow.
"Feito com IA do Google (vídeo) – Synth ID identificado em todo ou parte do conteúdo carregado"
The third video underwent additional scrutiny using Google's SynthID Detector. This tool identified a digital watermark embedded directly into the video frames—a technique imperceptible to human viewers but traceable by Google's systems. The analysis confirmed the presence of this watermark across nearly the entire video.
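As a purely hypothetical illustration of how a "nearly the entire video" conclusion might be derived, the sketch below aggregates per-frame detection results into a coverage statement. SynthID's video detector is only accessible through Google's portal, so the frame_has_watermark callable is an assumed stand-in, not a real API.

```python
# Hypothetical illustration: aggregate per-frame watermark checks into a
# coverage verdict. `frame_has_watermark` is an assumed stand-in, since
# SynthID video detection is only exposed through Google's detector portal.
from typing import Callable, Sequence

def watermark_coverage_verdict(
    frames: Sequence[bytes],
    frame_has_watermark: Callable[[bytes], bool],
    full_coverage_threshold: float = 0.9,
) -> str:
    """Summarize what share of frames carry a detectable watermark."""
    if not frames:
        return "no frames to analyze"
    hits = sum(1 for frame in frames if frame_has_watermark(frame))
    coverage = hits / len(frames)
    if coverage >= full_coverage_threshold:
        return f"watermark identified in nearly all of the content ({coverage:.0%} of frames)"
    if hits:
        return f"watermark identified in part of the content ({coverage:.0%} of frames)"
    return "no watermark detected"
```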
Origins Revealed
Reverse image searches using Google Lens traced the videos to their actual sources, none of which were in Russia.
The first video, featuring the "snow slide," was originally posted on Instagram on January 18 by a content creator specializing in AI simulations. The hashtags in the caption explicitly indicated the synthetic nature of the content.
The second video appeared on TikTok on January 16, posted by an account dedicated to AI-generated content.
The third video, which SynthID identified as Google AI content, was published by a graphic designer based in Ankara, Turkey, on January 12. The original caption explained the footage was a simulation of how the Turkish city would look covered in snow.
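Google Lens does not expose a public API, so the sketch below uses a different, much simpler technique, perceptual hashing, to show the general idea of matching a suspect frame against candidate earlier uploads. The file paths and distance threshold are illustrative assumptions, not part of the fact-checkers' workflow.

```python
# Illustrative alternative to a reverse image search: compare a suspect frame
# against locally saved candidate images using perceptual hashes. This is not
# what Google Lens does internally; paths and threshold are assumptions.
from PIL import Image   # pip install pillow
import imagehash        # pip install ImageHash

def find_near_duplicates(query_frame: str, candidates: list[str],
                         max_distance: int = 8) -> list[str]:
    """Return candidate images whose perceptual hash is close to the query frame."""
    query_hash = imagehash.phash(Image.open(query_frame))
    matches = []
    for path in candidates:
        distance = query_hash - imagehash.phash(Image.open(path))  # Hamming distance
        if distance <= max_distance:
            matches.append(path)
    return matches
```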
The Real Storm
The misinformation campaign exploited a genuine weather event. On the same day the videos circulated, Kamchatka experienced its most severe snowstorm in six decades.
The actual storm created massive snowdrifts, blocked building entrances, and buried vehicles. This real meteorological event provided the perfect backdrop for the fabricated videos to gain credibility.
By attaching the footage to a real news event occurring at that exact moment, the creators ensured the content would be perceived as authentic documentation of the historic weather.
Key Takeaways
This incident demonstrates the increasing sophistication of AI-generated misinformation and its ability to exploit real-world events. The videos were created and distributed during the actual storm, making them appear timely and relevant.
Verification tools like HiveModeration and SynthID provide essential capabilities for detecting synthetic content, but the speed at which misinformation spreads often outpaces fact-checking efforts.
As AI video generation technology advances, distinguishing between authentic footage and synthetic creations will require both technical tools and increased media literacy among consumers.
"Feito com IA do Google (vídeo) – Synth ID identificado em todo ou parte do conteúdo carregado"— Google SynthID Detector analysis result
Frequently Asked Questions
What do the viral videos claim to show?
The videos claim to show the aftermath of a historic snowstorm in Kamchatka, Russia. They feature buildings completely buried in snow and children sliding down frozen walls as if they were giant slides.
How was the footage identified as AI-generated?
Independent verification tools analyzed the footage and found a 99.9% probability of AI generation. Google's SynthID Detector also identified a digital watermark embedded in the video frames, confirming it was created using Google's AI technology.
Where did the videos actually come from?
Reverse image searches traced the videos to content creators in Turkey and Instagram accounts specializing in AI simulations. The footage was posted days before the Kamchatka storm, with captions explaining it was a simulation or AI creation.
Why did the fake videos spread so widely?
The videos circulated on the same day Kamchatka experienced its heaviest snowfall in 60 years. This timing made the fabricated footage appear to be authentic documentation of a real weather event, increasing its credibility and shareability.