
Fact Check: Is the Viral ‘Condom Nala Scandal’ from Delhi Hostel Real or Fake? Here’s the Evidence

Viral videos claiming to show used condoms in a Delhi women’s hostel drain are fake. One is AI-generated, and the other is from Nigeria, not India.

Published By: Sumit Kumar
Last Updated: October 30, 2025 14:17:48 IST

Two viral videos claiming to show used condoms clogging the drainage system of a women’s hostel in Delhi are not real. The videos, widely shared on Facebook and other social media platforms, have sparked outrage online. However, fact-checking reveals that the clips have no connection to Delhi — one was generated using AI tools, and the other is from Nigeria.

What the Viral Videos Show

In the first video, a news reporter can be seen zooming in on used condoms floating in a drain. The reporter says, “As you can see, this is the sewage chamber of the hostel. Yesterday, when the cleaning staff opened the cover of this sewage drain, a huge number of used condoms were revealed. These were found near the girls’ hostel.”

Another clip shows similar visuals: condoms inside a drain and scattered around it. A Facebook user shared the video with the caption, “Shock in Delhi PG Hostel as thousands of used condoms were found during drainage cleaning. Locals have dubbed it the ‘Condom Nala Scandal’, raising serious hygiene and safety concerns.”

These videos quickly gained traction, with users expressing shock and disgust. But a closer look tells a completely different story.

AI-Generated Clip Using Sora Tool

Fact-checkers observed that the first video carried a “VN Creator” watermark. This clue led to a Facebook page of the same name, which had shared the video on October 27. Investigators discovered that the “VN Creator” watermark had been used to cover the original Sora watermark, the logo of OpenAI’s video-generation tool.

The same Facebook page also uploaded another video claiming to show used condoms found in hostels in Pune. In that clip, the Sora watermark was clearly visible, confirming the video’s AI-generated origin.

It is therefore evident that the first video was not shot in any Delhi hostel; it was generated with AI and falsely linked to India to attract views.

The Second Video Is from Nigeria, Not India

A reverse image search of keyframes from the second viral clip revealed that it had no connection to Delhi either. Instead, the video originally came from Nigeria.

A Nigerian social media influencer had shared the same video on October 13 on Instagram, describing it as a local incident in Nigeria. In the longer version of the video, the word “Nigeria” can be heard in the background voiceover.

Additionally, a Nigerian news outlet, Edo Online Television, posted the same clip on the same day with the caption “#Nigeria”. These findings confirm that the visuals circulating online as being from a Delhi hostel were in fact filmed in Nigeria.

How the Fake News Spread Online

The misleading posts went viral after several users on Facebook and X (formerly Twitter) shared them without verifying the source. Many captions and hashtags mentioned “Delhi PG Hostel” or “Condom Nala Scandal”, giving the impression of a scandal in the national capital.

Because the videos appeared to be from a news report, users found them believable. The presence of reporters and commentary further added to the illusion of authenticity. But both videos turned out to be unrelated to Delhi — one digitally created using AI, the other from another country altogether.

Fact-Check Conclusion

  • The first video is AI-generated using OpenAI’s Sora tool, not real footage.
  • The second video is from Nigeria, not from any Delhi women’s hostel.
  • There is no evidence of such an incident occurring in Delhi.

The claim that “used condoms were found in a Delhi hostel drain” is false. The videos are misleading and unrelated to India.

Why Verifying Videos Matters

This case highlights how easy it is for fake or AI-generated content to spread on social media. Many users shared the clips without checking facts, contributing to misinformation and public panic.

Experts warn that AI tools are now being misused to create realistic-looking fake videos. When such content includes sensational topics or appears to involve women’s hostels or public hygiene, it spreads even faster online.

Before sharing any viral video, users should verify it through credible news sources or reverse image search tools. Social media platforms should also take stronger action to flag AI-generated misinformation.
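Reverse image search tools typically match pictures not pixel by pixel but by a compact perceptual fingerprint that stays stable under small changes such as compression or brightness shifts. As a purely illustrative sketch (not the method used by any particular search engine or by the fact-checkers quoted above), here is a minimal difference hash (“dHash”) over small grayscale grids; the grids and values below are invented for demonstration:

```python
def dhash(pixels):
    """Difference hash: one bit per pair of horizontally adjacent cells.

    `pixels` is a small grid of grayscale values (rows of equal length).
    Similar images yield hashes that differ in only a few bits.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left < right else 0)
    return bits


def hamming(a, b):
    """Number of bits in which two fingerprints differ."""
    return bin(a ^ b).count("1")


# Two near-identical "frames" (e.g., the same keyframe at two brightness levels)
frame1 = [[(x * 7 + y * 13) % 256 for x in range(9)] for y in range(8)]
frame2 = [[min(255, v + 2) for v in row] for row in frame1]  # slight brightening
# An unrelated "frame"
other = [[(x * 31 + y * 5 + 100) % 256 for x in range(9)] for y in range(8)]

print(hamming(dhash(frame1), dhash(frame2)))  # small distance: likely the same image
print(hamming(dhash(frame1), dhash(other)))   # larger distance: a different image
```

A small Hamming distance between fingerprints suggests two images are versions of the same picture, which is how a keyframe from a “Delhi” clip can be matched to an earlier upload from Nigeria.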


The Daily Guardian is India’s fastest-growing news channel and enjoys the highest viewership and time spent among educated urban Indians.


© Copyright ITV Network Ltd 2025. All rights reserved.
