I have a 7900 XTX I would like to use for AI. I tried to get Ollama working on my Windows gaming PC this last weekend through Docker and WSL, but that was a pain.
It seems that PyTorch might have worked (quick sanity check below), but I still need to try out TensorFlow. Both of those ecosystems seem a hair fractured when it comes to AMD.
While I am sure it can be done, it's nowhere near the point-and-click experience I had on my laptop with an Nvidia card.
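For what it's worth, this is the kind of minimal check I'd run to see whether the ROCm build of PyTorch actually sees the card (just a sketch; it assumes you installed the ROCm wheel from pytorch.org rather than the default CUDA one):

    import torch

    # The ROCm build still exposes the GPU through the "cuda" API,
    # so torch.cuda.* is the right namespace even on an AMD card.
    print("HIP/ROCm version:", torch.version.hip)   # None on a CPU/CUDA build
    print("GPU visible:", torch.cuda.is_available())

    if torch.cuda.is_available():
        print("Device:", torch.cuda.get_device_name(0))  # should report the 7900 XTX
        x = torch.randn(1024, 1024, device="cuda")
        print("Matmul OK:", (x @ x).shape)               # quick end-to-end smoke test

If that prints the card name and the matmul runs, the PyTorch side is at least alive; TensorFlow (via tensorflow-rocm) is a separate adventure.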
Mostly because of Nvidia's dominance with CUDA. Don't stop trying to make it work, though; don't reward Nvidia for their BS lol
I’ll keep trying, but I don’t know how far I’ll get.
I am still going to build a dedicated machine for AI work, so I don't want to tear up my gaming rig too much. All of that mess would be so much easier on a Linux rig with none of the Windows fluff getting in the way.
Not nearly as flexible, though.