95.5 FM THE HEAT PHOENIX RADIO

Are Famous Musicians Helping Texas Flood Victims?

7/16/2025

By Jess Santacroce
Music Writer, 95.5 The Heat, Phoenix Radio

Anyone who uses social media has likely seen articles reporting our favorite musicians' efforts to help with the aftermath of the horrifying Texas floods that took so many lives and ruined so many more. Fans are uplifted, praising their favorites in the comments, even sharing memories of seeing the person on TV or meeting them, and confirming that these stars are, in fact, great people. While the stories in the comments may or may not be true, the articles themselves are sadly all fake.

How we know these particular stories are hoaxes

Many of the videos used in hoaxes are not from Texas, and in some cases, little effort is made to match landscapes. Carefully examine any video clips you're shown or find on social media. If you notice landscape details that do not match photos and videos you know to be of Texas, you are looking at a hoax. If you aren't sure, watch a few minutes of a YouTube channel set in Texas, or look up a few travel videos. Scammers have reportedly been stealing clips from places as far away as India.

As with many things wrong with the world these days, AI is involved. When looking at a photo that supposedly shows a musician helping flood victims in Texas, focus first on the hands. AI has difficulty creating hands, and often gives people oddly long fingers, fingers too thin for the hands they're on, and even six fingers on one hand. Photos created or altered by AI also commonly feature people with skin that appears plastic and eyes that seem glassy or unfocused. Repeated patterns, such as someone having two of the exact same wave in their hair, and too much resemblance between the people in a photo can also be telltale signs.

Many of the accompanying articles are also AI-generated. While AI can write you a paper, or an article, on any subject you can name and get it technically "correct," the content is almost never of high quality. The word choice is overly basic and vague, giving the article a bland, often hokey tone, regardless of the seriousness of the subject matter.

When prompted to write an article about a completely fictional rock star helping out, ChatGPT included the lines "known for her powerhouse vocals and electric stage performance" and "has traded the spotlight for flood boots."

The generated article contained no indication that the piece was fiction, even though I typed a made-up name into the prompt. I used my dog's name for the first name and my oldest sister's middle name for the last name, and the chatbot wrote the article as though this were a real person.

Just as repeated details in appearance are a telltale sign that a photo is AI-generated or altered, repeated details across the reports and features may indicate fake news. Of course, multiple millionaires can donate money, but every rock star or country star certainly isn't going to first donate millions of dollars and then follow that up by visiting Texas. It is also highly unlikely that two people known to have very different viewpoints and personalities would give similar quotes to the press.

The danger behind these false tales 

Reading a false, AI-generated news report claiming that your favorite rock star donated and then went to Texas, then reading the same thing about your second-favorite rock star, then your favorite country star, then your cousin's favorite country star, is annoying at best, and that annoyance can lead readers to turn away from real, important news about the horrific floods and their aftermath. These articles can also create a false sense that enough help and resources are already flowing into the area, discouraging people from donating.

Many of these articles exist only because they are hosted on sites that earn money per click; the goal is simply to get readers to click. Clicking through can also expose your computer to a variety of malware and spyware, giving hackers and identity thieves access to everything from your medical records to your bank accounts and credit cards.

Some of these articles may even reference or contain links to fraudulent charities and other fundraising scams. Even if the charity mentioned is known and respected, the link in the article may be fake, leading readers to the scammer's own collection page.

Coping with these false “your favorite musician donated millions of dollars and went to Texas” articles 

Ignoring these articles completely is the best way to stay safe from them. Never click on one, even just to see whether it looks real. If you find yourself getting such a steady feed of them that you cannot even read your social media pages, block the sites that publish them.

Watch the news or visit the official websites of reputable news outlets like NBC, ABC, CNN, and PBS for information. Should you feel called to donate to help the Texas flood victims, use only the official web pages of established charities.





