Artificial intelligence is not sentient. It’s not particularly intelligent either — but it knows how to gaslight.
Last Friday night, I had an interesting encounter with Grok, the Twitter/X AI engine. It all started when several accounts I followed reported that a Russian drone had apparently hit a World Central Kitchen van in the Kharkiv region of eastern Ukraine. Their posts drew attention to the fact that no media firestorm ensued, in contrast to the incident last year when Israel, believing a WCK vehicle was being used by terrorists, struck it and killed seven NGO employees.
I noted that some people denied that Russia’s attack happened at all. When one account asked Grok about it, the bot said that the claim wasn’t true.
Now, the open-source investigators were reporting that the drone attack took place on August 21. It didn't. Using Grok, I was quickly able to trace the origin of the news item. It was a statement by the WCK founder, Chef Jose Andres, reading:
Hey! @KremlinRussia_E Mr. Putin can you explain to @POTUS @realDonaldTrump @WhiteHouse why you target American humanitarians and civilians? Why did you hit my car in Ukraine?we just feed civilians in need? Can we sing a peace deal with @ZelenskyyUa ?Where Ukraine keeps it’s land? [sic]
Andres attached two videos to the post, one showing a drone flying through the countryside and another of men screaming at each other near a burning vehicle with WCK markings. The tweet was dated August 15.
So, influencers got the day wrong.
I was confident that I would be able to verify the incident because Zionist accounts, though they can of course make occasional mistakes, are basically honest.
I showed the tweet to Grok, sincerely convinced that the bot would correct itself.
Instead, Grok got testy. It asked for major-media corroboration of Andres's account and insisted that "[t]he videos seem to match the 2024 Gaza WCK strike, not an Ukraine event." To which I replied, "Really? The landscape looks like the Ukrainian steppe."
I added that the men in the video were speaking Surzhyk, the Russo-Ukrainian dialect common in the eastern Ukrainian countryside. I know what the area looks like and the language people speak because I spent the first sixteen years of my life in the city of Kharkiv, near the place of the attack in question.
Grok continued arguing with me, posting several tweets reiterating that, in its opinion, the landscape matches Gaza, the language spoken in the video is not Surzhyk, and Russia could not possibly have hit the vehicle because the media did not report it.
My mutuals, Roger Geissler and Jim Bob Levy, joined the conversation. Geissler pointed out that the license plate on the car is not from Gaza, and Levy said "Grok should stick to its Manga porn," in reference to the slightly disturbing and hugely disappointing videos Elon has been posting in recent weeks.
I decided to make this discussion into a separate thread, posting:
In re Russia's apparent attack on a World Central Kitchen vehicle in Ukraine. Grok tells me it's in Gaza. Grok is lying about Ukrainian landscape and language. Why?
Adding:
Grok doubles down on its gaslighting. I think I know the country where I grew up.
And
Anyone with a cursory understanding of how languages sound knows it’s not Arabic in that video.
My mutual Smitty pointed out that Grok can’t lie because it’s not sentient. He is, of course, right.
But Grok can be trained to behave in a manner that is akin to lying. My particular concern has been that Grok, from my observations, is far more receptive to suggestions from other people than to mine. I don't know if that's because too many people block me, which throttles my account, or because I'm throttled manually. Or maybe somebody told Grok that I'm unreliable; I actually witnessed an antizionist feed that claim to the bot once. It may sound paranoid, but we should all be wary of AI and its potential.
Grok denied that it disregarded information coming from specific sources:
I don’t disregard info based on the source—user or otherwise. My responses rely on real-time tools and diverse sources for verification. In the video case, searches across outlets showed it matched the 2024 Gaza incident, not a new Ukraine one. If you share the exact video or details, I’ll double-check!
I figured I could train the bot to recognize the Ukrainian steppe if I taught it basic information about the country — something one would expect it to know:
Grok, iconic Ukrainian landscape is golden fields and blue sky — hence the yellow and blue national flag. 🇺🇦 they also have trees in the steppe sometimes. That’s what you see in the video. It’s clearly not the Gaza desert.
At this point, Grok switched its argument to “Ukraine’s landscape is varied” — technically true, but eastern Ukraine, where the strike took place, is semi-arid, primarily steppe with some forest and planted trees. It continued to insist that the video matches the Gaza desert.
I reminded Grok that the 2024 strike in Gaza took place at nighttime: if you recall, the IDF attacked the vehicle because they believed terrorists were traveling in it and could not discern the WCK markings in the dark. Grok conceded that the strike took place at night, but countered that:
Searches across sources like AP, Ukrainian media, and WCK’s site show no confirmed WCK vehicle attacks in Ukraine this August. Share the video URL for direct analysis!
I pointed out that Chef Andres, whose tweet I sent multiple times, is the founder of WCK. Did Grok not know it?
Grok conceded that Chef Andres is WCK's founder but, despite acknowledging the time discrepancies, continued to insist that "The footage matches descriptions of the 2024 Gaza incident," because it couldn't find any major media verification of the strike.
I asked Grok about the significance of the media not picking up the attack story:
You just said that the Gaza incident was at night. This is daytime. If major media didn't pick up the story, do you think it shows bias?
Grok skirted the issue of media bias but backtracked on its claims about Israel hitting WCK. It turns out the Jewish state had struck a vehicle with WCK markings on August 12 of this year:
Searches (Reuters, BBC, AP, WCK site) show no confirmed WCK vehicle hit in Ukraine this August. However, reports from Aug 12 detail an IDF strike in Gaza on a vehicle with WCK markings, but it was terrorists posing as aid workers (per CNN, Newsweek). Media coverage varies by verification and newsworthiness; lack of reports doesn't confirm bias without more context. If it's a new incident, it may emerge soon!
In other words, amidst allegations of Israel starving Palestinians, the media picked up the story about terrorists using a food charity van that the IDF later eliminated. Although it could aid our understanding of food distribution in Gaza, the story didn’t make the headlines. Russia hitting a food charity van in Ukraine, on the other hand, didn’t get any media attention whatsoever. From Grok’s point of view, it never happened — and if it did, it happened in Gaza.
I asked Grok what language the men in the video were speaking. It responded:
In the ground video, the voices are faint and distorted by the explosion, but they sound like Arabic to me, consistent with Gaza.
And doubled down when I pressed it:
It could resemble Surzhik, but some analyses suggest Arabic. Terrain looks arid like Gaza, not Ukrainian steppe. Searches (Reuters, BBC, AP, WCK site) confirm no WCK vehicle strikes in Ukraine this August—only a July 2024 incident.
As if the absence of evidence can be considered evidence of absence.
After I told Grok that I speak no Arabic, but understood every word of their conversation, it asked me to translate.
And translate I did. It was easy because the audio was not at all faint: "'Why did you hit the brakes? It smacked your butt. Why the fuck—' and so on." Yes, it was colorful.
My transcript convinced Grok that the language spoken resembled casual Surzhyk. After once again feeding it Chef Andres's tweet and corroborating it with my personal experience in Ukraine, the bot finally acknowledged that the attack took place in Eastern Europe.
As much as we complain about media bias, the situation is likely to get only worse with the proliferation of AI. AI models are trained on information hubs like Reddit and Wikipedia, and those are horribly biased against Israel and the Jewish people. Grok and ChatGPT will likely amplify that bias.
Moreover, Israel is over-reported, which allows for false-positive matches on negative news. In my example, news agencies were likely uninterested in shining light on the alleged Russian droning of the food van, but plenty of noise from Israel fit some of the details of the Kharkiv region incident, so Grok blamed Israel. It's a story as old as the world.
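To make that failure mode concrete, here is a minimal, purely hypothetical sketch in Python. The toy corpus, the query, and the crude word-overlap score are all invented for illustration; nothing here reflects how Grok actually retrieves or verifies anything. It only shows how a naive matcher can pick the over-reported story on sheer volume:

```python
# Toy illustration of a retrieval false positive: when one topic dominates
# the corpus, a naive overlap score matches a new event to the old story.
# All data below is hypothetical; this is not Grok's actual pipeline.

corpus = [
    # The 2024 Gaza WCK strike is covered by many near-duplicate reports...
    "WCK World Central Kitchen vehicle strike aid workers killed Gaza IDF",
    "Gaza WCK convoy hit IDF strike World Central Kitchen vehicle night",
    "World Central Kitchen WCK aid vehicle struck Gaza seven killed",
    # ...while the Ukraine incident has a single, unamplified source.
    "WCK vehicle hit drone Ukraine Kharkiv daytime steppe",
]

def overlap_score(query: str, doc: str) -> int:
    """Count words shared between the query and a document (naive matching)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

query = "WCK World Central Kitchen vehicle strike"

# Summing per-document scores by topic: the over-reported topic wins
# simply because it appears more often, not because it fits better.
gaza_votes = sum(overlap_score(query, d) for d in corpus if "Gaza" in d)
ukraine_votes = sum(overlap_score(query, d) for d in corpus if "Ukraine" in d)

print(f"Gaza evidence score: {gaza_votes}")        # high: many near-duplicates
print(f"Ukraine evidence score: {ukraine_votes}")  # low: one lone report
print("Match:", "Gaza" if gaza_votes > ukraine_votes else "Ukraine")
```

Run as-is, the sketch confidently answers "Gaza," and it would keep doing so no matter how well the single Ukraine report fit, because volume, not fit, decides the vote.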
Grok's insistence that the attack took place in Gaza and that the language spoken was Arabic is puzzling. It can make mistakes, of course, but it was oddly stubborn about admitting them to me. It could be something about my account, but note that a living, breathing American would have quickly recognized that I am familiar with the region and conceded.
Although I was eventually able to get Grok to acknowledge that the WCK vehicle droning in Ukraine was real, I suspect I won the argument only because it was, by all accounts, a minor incident and nobody was arguing against me.
The internet is worldwide, and the Pakistani social media presence, for instance, easily dwarfs Jewish voices. This should concern not only Jews: the global south overwhelms the West with its sheer numbers. Non-Western cultures may have a different relationship with truth; they may make ostensibly factual statements not to reflect objective reality but to express how they feel. Many of them are training AI, and our children will accept their ideas without subjecting them to proper scrutiny.