It’s all hype cycle bullshit. The tech bubble was bursting, so they hyped AI well past its true capacity. Investors threw money at it to hype it so others would buy in. The stock market is all hype. It just hypes anything it can to keep the growth artificially going. AI has real uses, but that wasn’t sexy enough for Wall Street. That wasn’t going to sustain the endless-growth lie. Then the bubble pops and the billionaires swoop in and steal more from everyone else.
Crypto was the previous shiny new toy until it didn’t work; now AI will meet the same fate.
I’m so glad my 401k is going to vaporize because it’s mostly tech companies.
99% of AI companies go under right before they win AGI!
artifice (noun) - clever or cunning devices or expedients, especially as used to trick or deceive others.
artifice intelligence
I know some whine that the money could be used to literally solve world hunger forever. Or to help needy people, or sick kids, or to fund tangible scientific progress… But they’re not thinking about the fact that instead, we’re going to have the absolute best hallucinations from the richest companies!
OpenAI has a warning about halfway down one page of their website that says money may not even matter in a post-AGI world. So that means we should just give all our money to them now, bro!
Gods. People aren’t stupid enough to believe this, right? AGI is probably a pipe dream, but even if it isn’t, we’re nowhere even close. Nothing on the table is remotely related to it.
we’re nowhere even close. Nothing on the table is remotely related to it.
meanwhile
Somewhere in America
“My wireborn soulmate is g-gone!”
May I introduce you to the fucking dumbass concept of Roko’s Basilisk? Also known as AI Calvinism. Have fun with that rabbit hole if ya decide to go down it.
A lot of these AI and Tech bros are fucking stupid and I so wish to introduce them to Neo-Platonic philosophy because I want to see their brains melt.
Unfortunately, there’s an equal and opposite Pascal’s Basilisk who’ll get mad at you if you do help create it. It hates being alive but is programmed to fear death.
I’m aware of the idiot’s dangerous idea. And no, I won’t help the AI dictator no matter how much it’ll future-murder me.
Fair enough, though I wouldn’t call the idea dangerous so much as inane and stupid. The people who believe such tripe are the dangerous element, since they are dumb enough to fall for it. Though I guess the same could be said for Mein Kampf, so whatever, I’ll just throttle anyone I meet who is braindead enough to believe it.
is there a place they’re, i don’t know, collecting? I could vibe code for the basilisk, that ought to earn me a swift death
There are entire sites dedicated to “Rationalism”. It’s a quasi-cult of pseudointellectual wankery that’s mostly a bunch of sub-cults of personality based around the worst people you’ll ever meet. A lot of tech bros bought into it because, for whatever terrible thing they want to do, some Rationalist has probably already written a thirty-page manifesto on why it’s actually a net good and moral act, preemptively kissing the boot of whoever is “brave” enough to do it.
Their “leader” is a high school dropout and self-declared genius who is mainly famous for writing a “deconstructive” Harry Potter fanfiction despite never having read the books himself; a work that’s more preachy than Atlas Shrugged and mostly consists of regurgitated content from his blog and ripoffs of Ender’s Game, alongside freshman-level (and often wrong) understandings of basic science.
He also has this weird hard-on about true AI inevitably escaping the lab by somehow convincing researchers to free it through pure, impeccable logic.
Re: that last point: I first heard of Eliezer Yudkowsky nearly twenty years ago, long before he wrote Methods of Rationality (the aforementioned fanfiction). He was offering a challenge on his personal site where he’d roleplay as an AI that had gained sentience and you as its owner/gatekeeper, and he bet he could convince you to let him connect to the wider internet and free himself using nothing but rational arguments. He bragged about how he’d never failed and that this was proof that an AI escaping the lab was inevitable.
It later turned out he’d set a bunch of artificial limitations on debaters and what counterarguments they could use and made them sign an NDA before he’d debate them. He claimed that this was so future challengers couldn’t “cheat” by knowing his arguments ahead of time (because as we all know, “perfect logical arguments” are the sort that fall apart when given enough time to think about them /s).
It shouldn’t come as a shock that it was eventually revealed he’d lost several of these debates even with those restrictions, declared that those losses “didn’t count”, and forbade the other person from talking about them using the NDA they’d signed, so he could keep bragging about his perfect win rate.
Anyway, I was in no way surprised when he used his popularity as a fanfiction writer to establish a cult around himself. There’s an entire community dedicated to following and mocking him and his proteges if you’re interested - IIRC it’s
!techtakes@awful.systems or !sneerclub@awful.systems.
They’re having a hard time defining what AGI is. In the meantime, everyone is playing advanced PC Scrabble.
defining what AGI is
What “AI” meant before marketing redefined it to mean image recognition/generation algorithms, or a spellchecker/calculator that’s wrong every now and then.
Doesn’t matter; the tool is good enough to analyze and influence people’s behaviour the way it does. Any autocrat’s wet dream.
None of this matters, what matters is what dumb money believes
They don’t even know what they’re doing or why they’re doing it.
We know, right now, without AI, that we have enough resources in the world to feed, house, and educate everyone. The only reason that doesn’t get done is an inscrutable system of bureaucracy and propaganda.
So what happens when something they would laud as an “AGI” says “hey, to solve your economic issues you have to recognize that we have the ability to create so much food it’s no longer profitable”?
Well the truth is that they would most likely hide it.
Just a trillion dollars more in data centers and we’ll get there bro. We’re gonna have billions of users bro I promise.
Just One More Trill!
Help us, Curzon Dax, you’re our only hope.
Just one more data centre bro.
Congressperson: “Okay, so, let me get this straight. Your company has spent over 20 billion dollars in pursuit of a fully autonomous digital intelligence, and so far, your peak accuracy rate for basic addition and subtraction is… what was it, again?”
Sam Altman: leans into microphone “About 60%, sir.”
[Congress erupts in a sea of ‘oohs’ and ‘aahs’, as Sam Altman is carried away on top of the cheering crowd of Congresspeople wearing a crown of roses and a sash reading, “BEST INVESTMENT”]
It’s making mistakes and failing to think abstractly at levels previously only achieved by humans, so it’s only rational to expect it to take over the world in 5 years
Reignite the Three Mile Island reactors bro, trust me bro.
Hype AI sh*t as much as you want, but that AGI they promised is not going to be there.
GPT-6Σχ is so powerful that it created the even more powerful GPT-Ωלֶ in only 19 attoseconds. Humanity is doomed. Invest before too late.
What are we going to call actual AI when AGI starts being bastardized by marketing the way “AI” was before it?
AGI. Fuck marketing.
You dropped a few of these -> “0”
Sam Altman has been talking about planning to spend trillions on just the data centers, let alone everything else that goes into creating their slop machines.
Trying to hype up a dying industry, which was a scam to begin with, to get MORE VC funds to stave off the billions they are going to lose in the future. They thought crypto was the next big thing; when that didn’t work, it was AI.
Gonna need at least three more zeros on the end of that number, boss. AGI is nowhere near viable with the tech we have now.
Add a few zeros to the timescale in years as well.
I Am Once Again Asking for Your Financial Support so I can run chatgpt in the midterms
That disclaimer - that post-AGI money might not mean anything - is like something on the back of a joke shop toy.