32 GB VRAM for less than $1k sounds like a steal these days, and I’m sure it’s not getting cheaper any time soon.
Does anyone here use this GPU? Or any recent Arc Pros? I basically want someone to talk me out of driving to the nearest place that has it in stock and getting $1k poorer.


I’m going to be brutal with you. I spent a few thousand dollars on 176 GB of AMD VRAM because I was happy to get VRAM cheap and I hate Nvidia. It works, and it’s nice to be able to run bigger models at usable performance, but if you need serious concurrency or good support for diffusion, you NEED Nvidia. AMD (and likewise Intel) just doesn’t have the software ecosystem support for non-server GPUs. Again, coming from someone who’s using this shit daily.
If you understand this limitation, then yes, those B70s are cool, as is the AMD Pro 9700, which might have slightly better support rn. You could also consider Nvidia V100s, which are old and cheap. I always recommend people start with 3090s (as a general powerhouse) or a pair of 5060 Tis (for really good LLM support) though. It will make your life easier if you can live with the VRAM limitation.
Wouldn’t using the Vulkan backend instead of ROCm help a ton with concurrency and diffusion, at a marginal (1-2%) performance loss?
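For anyone who wants to test this themselves: llama.cpp ships a Vulkan backend, so comparing it against ROCm is cheap to try. A minimal build-and-run sketch (the model path is a placeholder, not something from this thread):

```shell
# Build llama.cpp with the Vulkan backend instead of ROCm/HIP.
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j

# Run with all layers offloaded to the GPU (-ngl 99);
# swap in your own .gguf model file here.
./build/bin/llama-cli -m ./models/your-model.gguf -ngl 99 -p "Hello"
```

Whether the Vulkan path actually closes the gap on concurrency or diffusion workloads is hardware- and model-dependent, so treat the 1-2% figure as something to benchmark, not assume.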
Thank you! This is really helpful. A 32 GB V100 or a pair of 5060 Tis looks very interesting, and they’re about the same price. Does running multiple GPUs require any special hardware? I mean, apart from a motherboard with 2+ PCIe x16 slots?
It’s getting better, but yeah, I can’t run a lot of models.