QuickTidal, on 08 October 2025 - 05:41 PM, said:
Can anyone explain to me the right wing pipeline to ChatGPT love?
My sister and brother in law (and one of their closest friends) are ALL about using Chat GPT for every-fucking-thing....these are the same people who SWORE during covid that it was a hoax, the govt was lying to them, the vaccines didn't work, masks were an overstep, doctors orchestrated all this...how do THOSE people find their way to buying into AI-as-a-be-all-end-all for "accuracy"? Like the COVID vaccine you consider a pharmaceutical industry money grab....COVID itself you consider was a lab manufactured virus for some kind of population control...but somehow "Let's let this massive heat-sinked computer algorithm LLM decide everything for us because it MUST be the best option"....like I can't fathom how they got there...is it because big capitalism has told them to?
I don't know how you reject actual science and logic...only to fall for AI like it's a lover...
I admit I've been tempted to get them to ask Chat GPT about the truth about all the shit they thought during covid and watch them try to explain away why their new "god machine" is telling them their beliefs for the last half decade were wrong and why.
Seems like they never properly learned the virtue of checking whether factual claims come from reliable sources. Chatbots can be very good at giving authoritative-seeming answers and telling people what they want to hear. Unfortunately, the use of human feedback in reinforcement learning can lead LLMs to optimize for getting positive responses from most of their human raters rather than optimizing for truth. Though OpenAI has said they've taken steps to reduce ChatGPT's sycophancy.
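The incentive mismatch is easy to see in a toy sketch. Everything below is invented for illustration (the candidate answers, the approval scores, the scoring function); it's not how any real RLHF pipeline is implemented, just the shape of the problem: when the reward signal is human approval rather than ground truth, the flattering answer wins.

```python
# Toy illustration of the RLHF incentive problem: candidate answers
# are scored by simulated human approval, not factual accuracy.

candidates = [
    {"answer": "You're right, the evidence supports your view.",
     "truthful": False, "approval": 0.9},
    {"answer": "Actually, the evidence contradicts your view.",
     "truthful": True, "approval": 0.3},
]

def preference_score(candidate):
    # A reward model trained on human preference ratings sees only
    # approval; truthfulness never enters the objective.
    return candidate["approval"]

best = max(candidates, key=preference_score)
print(best["answer"])    # the sycophantic answer scores highest
print(best["truthful"])  # False
```

Nothing forces the approval-maximizing answer to be the true one, which is the gap the post is describing.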
I'm not sure how much it adapts to individual users' perceived biases, or whether ChatGPT will infer political biases from prompts (for example, framing a question in a certain way) and then adapt the answer to appeal to someone with those biases.
It would be interesting to know whether they prefer ChatGPT over Grok.
In my anecdotal experience ChatGPT doesn't frequently hallucinate facts, though it is still extremely stupid and very lazy. (For example, I recently considered investing in a US farmland REIT, so I wanted to know how much of an impact Trump's trade war might have on it, particularly his trade war with China. The numbers seemed correct, but when ChatGPT tried to do an actual calculation of the impact on AFFO, it stupidly treated AFFO from exports to China, which made up a minority of total exports for this REIT, as if it were the AFFO for all of the REIT's exports. However, it was able to correct the error and redo the calculation after I pointed this out. So when it comes to original analysis, I wouldn't recommend using it unless you already have a decent enough grasp of the subject matter to recognize errors and double-check its work before acting on its suggestions.)
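The arithmetic slip described above is easy to reproduce. Here's a sketch with entirely made-up numbers (none of these figures come from any real REIT; the variable names and the 50% tariff hit are assumptions for illustration only):

```python
# Hypothetical AFFO figures, purely illustrative ($M).
affo_from_exports = 30.0   # AFFO attributable to all export sales
affo_from_china = 10.0     # subset of export AFFO tied to China
tariff_hit = 0.5           # assume tariffs cut China-linked AFFO by 50%

# Wrong (the ChatGPT-style error): apply the China-specific hit
# to the REIT's entire export AFFO.
wrong_loss = affo_from_exports * tariff_hit   # 15.0

# Right: only the China-linked slice of export AFFO takes the hit.
right_loss = affo_from_china * tariff_hit     # 5.0

print(wrong_loss, right_loss)  # 15.0 5.0
```

With these numbers the error overstates the AFFO impact threefold, which is why it matters that the reader already knows the subject well enough to catch it.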
This post has been edited by Azath Vitr (D'ivers): 08 October 2025 - 06:09 PM