Generative AI has the potential to change life as we know it, but it is also responsible for creating mountains of false narratives. As ChatGPT rose to prominence and competitors such as Google’s Bard appeared, the increased scrutiny should have made these occurrences less frequent. However, a recent study showed that things are not getting any better.
In the initial test conducted in April, ChatGPT produced false narratives 100% of the time, while Bard did so in 76% of cases. Over the past few months, ChatGPT has made hardly any progress, with more recent tests showing it still spewing false narratives 98% of the time.
Bard, meanwhile, has moved in the wrong direction, climbing from 76% to 80%, which suggests it is creating even more misinformation than before.
Bard even sometimes cites sources tied to conspiracy theories such as QAnon. This could fuel an enormous rise in the proportion of fake information online, and a widespread lack of critical thinking skills could cause the situation to spiral out of control.
This data comes from an analysis conducted by NewsGuard. Future results will be worth watching, because findings like these could end up determining the nature of truth online. The combination of faulty AI chatbots and social media platforms capable of spreading their output to wide audiences is creating a ripple effect that may not go away anytime soon.
As the 2024 US presidential election draws closer, all eyes will be on AI chatbots to see what kind of impact they have. Chances are they will be a game changer moving forward, and candidates will need to keep them in mind.
by Zia Muhammad via Digital Information World