Zoë Schiffer: Oh, wow.
Leah Feiger: Yeah, exactly. Who already has Trump’s ear. It spread. And so we were talking about how people went to Grok on X and asked, “Grok, what is this?” And what did Grok tell them? No, no. Grok said these were not images of the protests in Los Angeles. It said they came from Afghanistan.
Zoë Schiffer: Oh. Grok, no.
Leah Feiger: It said, “There’s no credible support for this. This is a misattribution.” It was really bad. It was really, really bad. And then there was another situation where a couple of other people shared these photos with ChatGPT, and ChatGPT was also like, “Yep, this is Afghanistan.” Which is not correct, et cetera, et cetera. It’s not great.
Zoë Schiffer: I mean, don’t even get me started on the fact that this moment comes after many of these platforms have systematically dismantled their fact-checking programs and decided to deliberately let a lot more content through. And then you add chatbots to the mix, which, for all their uses, and I think they can be really useful, are incredibly confident. When they hallucinate, when they mess up, they do it in a very convincing way. You will not see me defending Google Search here. Absolute trash, nightmare, but it’s a little clearer when it’s leading you astray, when you’re on a random, non-credible blog, than when Grok tells you with total confidence that you’re looking at a photo of Afghanistan when you’re not.
Leah Feiger: It’s really worrying. I mean, it’s astonishing. It’s completely unbelievable, but it comes with the bravado of the drunk frat bro you were unfortunately stuck with at a party at some point in your life.
Zoë Schiffer: Nightmare. Nightmare. Yeah.
Leah Feiger: They’re like, “No, no, no. I’m sure. I’ve never been more sure in my life.”
Zoë Schiffer: Absolutely. I mean, okay, so why do chatbots give these incorrect answers with such confidence? Why don’t we ever see them say, “Well, I don’t know, so maybe you should check elsewhere. Here are some credible places to look for that answer and that information.”
Leah Feiger: Because they don’t do that. They don’t admit that they don’t know, which is really wild to me. There have actually been a lot of studies on this, and a recent study of AI search tools at the Tow Center for Digital Journalism at Columbia University found that chatbots were “generally bad at declining to answer questions they couldn’t answer accurately, offering incorrect or speculative answers instead.” Really, really, really wild, especially when you consider that there were so many articles during the elections along the lines of, “Oh no, sorry, I can’t weigh in on politics.” And you’re like, well, you’re weighing in a lot now.
Zoë Schiffer: Okay, I think we have to leave it there on that very horrible note, and we’ll be right back. Welcome back to Uncanny Valley. I’m joined today by Leah Feiger, senior politics editor at WIRED. Okay, so beyond trying to fact-check information and images, there has also been a lot of reporting on deceptive AI-generated videos. There was a TikTok account that started uploading videos of a supposed National Guard soldier named Bob who had been deployed to the Los Angeles protests, and you could see him saying false and inflammatory things, like that the protesters were “throwing balloons full of oil,” and one of the videos had almost a million views. So, I don’t know, I feel like people need to get a little better at identifying this kind of fake imagery, but it’s hard in an environment that is inherently context-free, like a post on X or a video on TikTok.