
It's just technology

Richard Stallman, in "Reasons not to use ChatGPT":

I call it a "bullshit generator" because it generates output "with indifference to the truth".

We are three years into the ChatGPT era, and I feel as confident as I have at any point in those three years that, while Large Language Models (LLMs) are a serious and notable technological advance, they are more of a normal technology than many give them credit for.

On one end, you have people who think LLMs are truly intelligent and just six months away from becoming God. On the other end, you have people who think they're absolute fluff that nobody actually wants. I don't know the author of the post I was referencing, but it certainly sounds like they sit on the latter side of this dichotomy.

I quoted the specific line about "indifference to the truth" because it's a critique I constantly see from the anti-LLM crowd: the idea that these models "don't actually know anything." I cannot express to you enough how little I care about that critique.

Yes, I will agree with you that neither ChatGPT, nor Claude, nor Gemini, nor anything currently available knows anything in a conscious sense, but that doesn't mean I don't find them useful. Does a spreadsheet "know" anything? No. Does HTTP "know" things? Absolutely not. That doesn't mean those technologies are useless either.

As ever, I feel like the people on either side of me are looking at LLMs as if they're magic; some think that magic is good and others think it's bad. It's just normal technology, people. Computers haven't had to understand what they were doing before, and they don't have to now.