

It’s also an 8 gigaparameter model. That’s pretty tiny, even if they use it heaps.


More than a decade on, and it’s still one of the best Kindles ever made, in my opinion.
You had physical buttons instead of a fiddly touch-screen, you could have music, have it read to you, and also go on the internet.
Plus it’s old enough it supports a bunch of formats, and registers as a mass storage device to a computer, so anything can use it.
There have been a few over time. It originally started out as a project to test some new Reddit features with fake users, using Markov/GPT-2 bots, and then it became funny enough to let users see.
Them calling themselves bots or coincidentally being unable to tell cats and dogs apart was also quite funny back in the day. (They didn’t do any actual image recognition, it was just making links and a title.)


Human drivers, if they could get LIDAR in their cars, would probably use it too.
Why not aim for better than what humans can do?


According to the article linked in this one, it’s not that the operating system itself is more demanding, but more that the DE and browsers/websites are more demanding now.
It feels like Canonical basically needs to do the games thing of having a set of minimum specs for Ubuntu to run at all, and recommended specs for Ubuntu to run well. Canonical basically bumped up the latter, but it’s being taken as the former.


If memory serves, he also claimed to have been driving when he teleported into a ditch 50 miles away.
Which just comes across like he was driving when he really shouldn’t have been (drunk, or tired and emotional), and fell asleep at the wheel.


It’s odd, since they used to have a rather nice HTML web interface specifically for low-performance devices, but it’s since gone away.


This doesn’t seem so bad, though. 2 GB more in about 10 years is pretty reasonable in terms of an increase.
It’s not like they doubled it.


Specifically using publicly available information that they could find on search engines.
They didn’t track them down with a PI or anything quite like that.


Calling a Visual Studio/programming project that takes 50 GB of memory just “a bigger project” seems like rather an understatement, unless you’re working on machine learning, simulations, or something of that nature.


Did he read/hear about gut flora somewhere, and get his eggs scrambled?


I’d honestly agree. It’s fine after you get established and a feed set up, but before then, not having a good way to find stuff to follow in the first place hurts it a bit.


I’d argue that it was more to do with the fediverse setup being confusing/complicated, if you’re not used to it.
People would think you’d need to sign up to all the servers that you wanted to access, rather than using just one account for everything.


Though it’s better now, the documentation for Lemmy and a lot of Lemmy-type alternatives used to be aimed more at people who wanted to host their own server than at someone who just wanted to join a social network.
At the same time, that complication also hurts adoption, so if people ever want Lemmy to be a proper social media site that replaces the existing ones, the barrier to entry does need to come down.


Or that it’s not right for their use case.
Like someone throwing a bunch of data at an LLM and trying to have it process the data into a chart or something. It can work, but it was never designed to be used in that manner.
I’ve got an acquaintance who does that, despite the fact that Python would be a better tool for it.
Personally, I sometimes run a few saved images through a multi-modal 8 gigaparameter local model on my computer, so I can automate giving them more descriptive names than randomnumbers.png, and that seems to work fine. I could do it by hand, but it would take hours or days, compared to minutes, and since it’s not too important, it doesn’t matter if it’s occasionally wrong. The resource usage is also less of an issue, since it’s my own computer.
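For the curious, the renaming half of that workflow is just string cleanup. Here’s a rough Python sketch, assuming the `caption` string comes from whatever local multi-modal model you run (the model call itself varies by runtime, so it isn’t shown):

```python
import re
from pathlib import Path

def caption_to_filename(caption: str, suffix: str = ".png") -> str:
    """Turn a free-text model caption into a safe, descriptive filename."""
    # Lower-case, then collapse anything that isn't a-z/0-9 into underscores.
    slug = re.sub(r"[^a-z0-9]+", "_", caption.lower()).strip("_")
    # Cap the length, and fall back if the caption was all punctuation.
    return (slug[:80] or "unnamed") + suffix

def rename_with_caption(image: Path, caption: str) -> Path:
    """Rename e.g. randomnumbers.png using the model's caption for it."""
    target = image.with_name(caption_to_filename(caption, image.suffix))
    if target.exists() and target != image:
        # Dodge collisions rather than overwrite an existing image.
        target = target.with_stem(target.stem + "_dup")
    return image.rename(target)
```

Feed each image to the model, then pass its caption into `rename_with_caption`; since nothing important depends on the names, a wrong caption just means a slightly odd filename.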


Especially if they can achieve their goal of keeping it alive for months.
Right now, we can only safely do it for hours. Potentially months is a massive improvement.
The oil crisis isn’t quite that bad yet.


It’s like a better iPad in a way, since you can run full-scale desktop programs on it, and use it like a desktop.
I wouldn’t be too surprised if things like Surfaces were one of the reasons why Apple seems to be making a push to try and make the iPad functional as a computer on its own.