

The sub is dead, long live the community!
It’s an untenable situation because it’s so much bigger than the tech world and open source. FOSS fundamentally works on a communal model: everyone needs lots of software, and no one can hope to write it all themselves, so we distribute the labor across the community; everyone works on some things important to them, and the whole community benefits.
Then, capitalist businesses entered the picture and began using more and more open software as the backbone of their enterprises. Government entanglements further complicate the picture, but fundamentally the capitalist mindset is incapable of building or maintaining our current technological base. It isn’t capable of maintaining or building our physical infrastructure either: almost all of that was built on government subsidies — socialism, in other words.
And now that vulture capitalism is the law of the land, everything is falling apart because there’s no more “slack” in the system where people can engage in personal socialism on projects like FLOSS; every bit of our time is being stolen to pad the numbers of capitalists.
This bleeds over into attitude as well. Every entitled user who thinks their personal issue is more important than any other concern is a trump or musk in miniature, believing that the blowhard bravado of our current government is a model for forcing work to get done rather than a death spiral there’s no pulling out of.
You want FLOSS software that’s good? You want less burden on maintainers? You want a safer, saner, more human-centric technology base? You want a better tech world?
Eat. The. Rich.
If by more learning you mean learning
ollama run deepseek-r1:7b
Then yeah, it’s a pretty steep curve!
If you’re a developer then you can also search “$MyFavDevEnv use local ai ollama” to find guides on setting it up. I’m using the Continue extension for VS Codium (or Code), but there are easy-to-use modules for Vim and Emacs, and probably everything else as well.
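For Continue specifically, pointing it at a local ollama instance is basically one config entry. Something like this — an illustrative sketch only, check Continue’s current docs for the exact schema, and note it assumes you’ve already pulled deepseek-r1:7b:

```json
{
  "models": [
    {
      "title": "DeepSeek R1 7b (local)",
      "provider": "ollama",
      "model": "deepseek-r1:7b"
    }
  ]
}
```

By default ollama serves on localhost:11434, and Continue’s ollama provider talks to that out of the box, so there’s usually nothing else to wire up.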
The main problem is leveling your expectations. The full Deepseek is a 671b model (that’s billions of parameters), and the model weights (the thing you download when you pull an AI) are 404GB in size. You’d need roughly that much RAM available to run one of those.
They make distilled models, though, which are much smaller but still useful. The 14b is 9GB and runs fine with only 16GB of RAM. They obviously aren’t as impressive as the cloud-hosted big versions, though.
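If you want a rough rule of thumb for what will fit on your machine, download size is roughly parameter count times bits per parameter. This little sketch (my own back-of-envelope math, not anything official — real GGUF files add overhead for embeddings and mixed-precision layers, which is why the actual downloads run a bit bigger) shows why the 14b lands in the single-digit-GB range while the full 671b doesn’t:

```python
# Back-of-envelope estimate of model weight size from parameter count
# and quantization level. Ollama's default quantization is roughly
# 4-5 bits per parameter.

def weight_size_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate weight size in gigabytes (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

print(f"14b  @ ~5 bits: {weight_size_gb(14, 5):.1f} GB")   # ≈ 8.8 GB
print(f"671b @ ~5 bits: {weight_size_gb(671, 5):.1f} GB")  # ≈ 419.4 GB
```

Add a few GB on top for the KV cache and runtime, and you get the “14b runs fine in 16GB” experience described above.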