If LLMs are juiced-up auto-complete, then humans are juiced-up bacteria. Sure, they both have a simple end goal (guess the next word; survive and reproduce), but the methods they use to accomplish it are vastly more complex.
I made mine act like a '30s-style newspaper editor with a cigar in his mouth. If I were going full bit, I'd have him mention how my papers don't have enough Spider-Man.
“Stop the presses! Send my wife some flowers and bring me an Advil! What do you mean you don’t work for me? You’re hired! Now that you’re hired, you’re fired! Now that you don’t work here, we can be friends! Now that we’re friends, how come you never call? Some friend you are!” hangs up
I could see myself having conversations with an LLM, but I wouldn’t want it to pretend it’s anything other than a program assembling words together.
The way it clicks for me is that it's a juiced-up auto-complete tool.
It’s literally that.
Well that explains why that user thinks it completes them.
“God, I love this business!”
It's not pretending to be anything; that's just the function you described: assembling words together.