And then you have a trained model that requires vast amounts of energy per request, right? It doesn’t stop at training.
You need obscene amounts of GPU power to run the ‘better’ models within reasonable response times.
In comparison, I could game on my modest rig just fine, but I can’t run a 22B model locally in any useful capacity while programming.
Sure, you could argue gaming is a waste of energy too, but that doesn’t mean we can’t also argue that asking AI how long to boil a single egg shouldn’t cost the energy of boiling a shitload of them. Or each time I start typing a line of code, for that matter.

As a double check (or a check before buying), you can search for your new GPU on https://linux-hardware.org/ to see whether other users have it working without any issues. The hardware probe is also a handy tool for sharing your PC’s specs, should you ever need to!