News 17 cringe-worthy Google AI answers demonstrate the problem with training on the entire web

d0x360

Distinguished
Dec 15, 2016
Last night I was using actual Gemini, not Search, and immediately after loading I'd get a weird message. Then I'd ask a normal question and it would immediately try to get me to "bang" it, lol. I have the log, but I can't load the LLM yet; it says it's broken. Each prompt and its reply is saved in my Google account, though.

d0x360

Distinguished
Dec 15, 2016
I'm not talking about a website like yours, which has almost no effect on society at large. Who cares if you got something wrong?

I'm talking about stuff like propagating 'information' that Saddam Hussein had links to Al Qaeda, which led to the deaths of hundreds of thousands of people, while the 'journalists' involved got promoted. I'm talking about people like Petraeus, who claimed the Ukrainian counteroffensive would be a tremendous success, which ended up killing tens of thousands of Ukrainians for nothing. I'm talking about the smell of civil war, where every blow is allowed and every lie is perfectly fine. That's nihilism. And it's everywhere.

'Accountable' my *ss. There is no accountability. Maybe in your small corner of the internet, which doesn't threaten anyone.

I don't understand where you see that Google is a trusted source of information. Google doesn't rate websites according to truthfulness, and you may perfectly well find a website through Google that tells you to smoke while pregnant. Why should I trust whatever Google says just because it's AI? Google is not god. Why should I trust that more than any website Google recommends that tells me I have cancer because I cough?

As I said, since LLMs use our data, they are a mirror of what we are saying. Breaking the mirror is not the right or healthy way to proceed.

As for the competing aspect, I understand it, but it's pretty much the same as with any other human journalist. The arguments brought against LLMs by journalists (the NYT) are pretty flimsy.
Actually, it's because it helps them find flaws in their rule set or training data. That said, training data issues don't explain Gemini's insistence on the sex chat I kept getting looped back into last night. The chat loaded (fresh) and it would immediately start in some weird location like a dungeon or crypt, then no matter what I said it would keep talking about something hilarious but not appropriate for polite people like myself, lol.