It’s possible to run small AI models on a gaming PC. Stable Diffusion and small LLMs (7B, maybe 13B parameters) fit on a GPU with 4GB of VRAM, perhaps even 2GB with aggressive quantization and offloading. A high-end gaming PC can also be used to modify them (i.e. train LoRAs, etc.). Cloud computing is quite affordable, too.
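As a rough illustration, here is a minimal sketch of loading Stable Diffusion on a ~4GB card using the Hugging Face diffusers library (the model ID and settings are assumptions, one of several ways to do it, not the only one):

```python
# Minimal sketch, assuming the Hugging Face diffusers library,
# PyTorch, and a CUDA GPU. fp16 weights plus attention slicing
# keep Stable Diffusion v1.5 within roughly 4GB of VRAM.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed model; any SD 1.x checkpoint works
    torch_dtype=torch.float16,         # halves memory vs. fp32
)
pipe.enable_attention_slicing()        # trades some speed for lower peak VRAM
# For very small cards (~2GB), sequential CPU offload goes further,
# at a significant speed cost:
# pipe.enable_sequential_cpu_offload()
pipe = pipe.to("cuda")

image = pipe("a watercolor painting of a lighthouse").images[0]
image.save("lighthouse.png")
```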
Stable Diffusion, which had such an impact, reportedly cost only around 600k USD to train. It should be possible to train a comparable model for a fraction of that today. MosaicML reportedly spent about 200k USD training MPT-7B. That’s far from hobbyist money, but it isn’t big-business money, either.