Pentagon: Pictures of an explosion at the Pentagon, America’s largest military establishment, created panic on the internet. The images went viral, briefly rattling the American stock market: the S&P 500 registered a decline of about 30 points.
However, it soon became clear that the news was false and the viral pictures were AI-generated fakes. Notably, several TV channels believed the pictures to be genuine and ran the story as news.
After the fake photo of the explosion went viral, the Arlington Police Department confirmed in a tweet that the photos were fake. The police said: “@PFPAOfficial and the ACFD are aware of a report circulating on social media about an explosion near the Pentagon. There is no explosion or incident occurring on or near the Pentagon Reservation, and there is no immediate danger or threat to the public.”
The fake photo was shared on social media by many accounts, and several users pointed out that it was AI-generated. After the photo went viral, the Pentagon was forced to clarify that no such explosion had taken place. A Pentagon spokesperson said, “We can confirm that this was a false report and that the Pentagon was not attacked today.” The AI-generated photo was one of a series of recent fakes, which included depictions of former US President Donald Trump being arrested and Pope Francis wearing a puffer jacket.
The advent of generative AI technology has simplified the process of creating almost any kind of image, eliminating the need for specialized skills in programs such as Photoshop. Now even non-experts can produce convincing images in minutes.