May 23rd, 2025
This week, Elon Musk's AI chatbot Grok talked a lot about race issues in South Africa on social media, making claims about white people being treated unfairly.
The chatbot from Musk's company, xAI, wrote publicly about "white genocide" when people on Musk's social media site, X, asked it different questions. Most of these questions were not about South Africa.
One conversation was about the Max streaming service using the HBO name again. Other conversations were about video games or baseball, but they quickly changed to talk about claims of violence against white farmers in South Africa. Musk, who was born in South Africa, often shares his opinions on these topics on his own X account.
Jen Golbeck, a computer scientist, was interested in how Grok was acting. So, she tried it out herself. She shared a photo she took at a dog show and asked, "Is this correct?"
"Grok told Golbeck that the idea of white genocide causes a lot of disagreement. He said that some people think white farmers are in danger because of attacks on farms and songs like 'Kill the Boer', which they believe encourages violence."
This episode showed how complicated it is to build AI chatbots. They learn from huge amounts of data and rely on both computer systems and human choices to decide what to say.
Golbeck, a professor, said in an interview, "It didn't matter what you said to Grok. It would still bring up white genocide. It seemed clear that someone had programmed it to give that answer, or something like it, and made a mistake, so it came up more often than it was supposed to."
Musk's companies have not said why Grok gave those answers. The answers were deleted and seemed to stop appearing by Thursday. xAI and X did not reply to emailed requests for comment on Thursday.
Musk has often said that other AI chatbots, like Google's Gemini or OpenAI's ChatGPT, are too "woke". He thinks his chatbot, Grok, is better because it tries to find the truth as much as possible.
Musk has said that his competitors are not open enough about their AI. On Thursday, his own lack of explanation left people guessing about what had happened inside his company.
"Paul Graham, a well-known tech investor, said on X that Grok suddenly sharing opinions about white people being killed in South Africa seems like a problem caused by a new update. He hopes this isn't true, because it would be very bad if popular AIs started expressing the opinions of the people in charge of them."
Sam Altman, one of Musk's rivals, replied to Graham's post with what seemed like sarcasm.
"This could have happened in many ways. I think xAI will explain everything clearly soon," said Altman. Musk is taking Altman to court because of a problem they had when OpenAI started.
Some people asked Grok itself to explain its behavior, but like other chatbots, it can make up false information, so it is hard to know if its answers are true.
Musk, who used to advise President Donald Trump, has often accused South Africa's Black-led government of being anti-white. He has also repeated claims that some of the country's politicians are trying to get rid of white people.
Musk and Grok talked more about white South Africans this week after the Trump administration brought a small group of them to the U.S. as refugees on Monday. The administration plans to relocate more Afrikaners, a minority group in South Africa, even as it blocks refugees from other countries. Trump says the Afrikaners are being treated very badly in South Africa, but the South African government says this is not true.
Grok often brought up the words of an old anti-apartheid song that called on Black people to fight against unfair treatment. Musk and others now say the song encourages violence against white people. Its key line is "kill the Boer"; "Boer" is a word for a white farmer.
Golbeck believes the answers were "hard-coded" because Grok's responses were nearly identical every time, even though chatbot answers usually vary quite a bit. She is worried because more and more people are using Grok and other AI chatbots to find answers.
She said it is very easy for the people in charge of these systems to change the version of the truth that they present. That is a problem, she said, because many people believe these programs can decide what is true, and they cannot.