
Never buy AMD to do local AI. Forget it. STOP


Date: December 17th, 2025 1:49 PM
Author: https://i.imgur.com/chK2k5a.jpeg


Without CUDA support you'll find that nothing works. Sure, AMD has its proprietary ROCm platform, but 95% of models either don't work or run like shit on ROCm. Furthermore, AMD seems to have bailed on ROCm development so hard that, last I checked, they still didn't have support for RDNA4 GPUs. That's right: you can buy an AMD Ryzen AI box with 64 GB of soldered RAM, and it won't run Ollama.
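
If you want to sanity-check whether your stack even sees the card, here's a minimal sketch (assuming a PyTorch install, either the CUDA wheel or the ROCm wheel; the probe calls are standard torch, nothing AMD-specific is invented). On a ROCm build, torch.cuda is backed by HIP, so an unsupported AMD GPU just shows up as no device at all:

import torch

# On ROCm builds of PyTorch, the torch.cuda API is backed by HIP,
# so this one check covers both NVIDIA and (supported) AMD cards.
if torch.cuda.is_available():
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    print(f"GPU backend: {backend}")
    print(f"Device: {torch.cuda.get_device_name(0)}")
else:
    # GPUs the installed ROCm release doesn't cover land here,
    # and everything silently falls back to CPU.
    print("No usable GPU backend; running on CPU.")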

(http://www.autoadmit.com/thread.php?thread_id=5811374&forum_id=2Reputation#49516931)