  The most prestigious law school admissions discussion board in the world.

RTX 3090 really is a shockingly good buy







Date: December 5th, 2025 7:25 PM
Author: https://imgur.com/a/o2g8xYK


I doubted it for a long time because it's a Samsung product. However, if you ONLY want to run local LLMs it's actually 180. It runs Gemma 27b lightning fast. Sure, it will drink 350 W while it's doing it, but it drops right down to 5 W at idle. It's really quite efficient in that sense, because you're hardly ever hitting it. If all you want is something to host ollama, it's perfect. I know people are using AMD but it's a nightmare, believe me. Having Nvidia means everything just works. These are like $800 on eBay last I checked. 24 GB of VRAM. There's no point in getting the Ti if you're just using it for AI.
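For anyone wondering how a 27b model fits on a 24 GB card: ollama ships models quantized (typically 4-bit), so the weights take roughly half a gigabyte per billion parameters. A rough sketch of the math, where the overhead allowance for KV cache and CUDA context is an assumption, not a measured figure:

```python
# Back-of-envelope VRAM estimate for running Gemma 27b on a 24 GB RTX 3090.
# Assumes the ~4-bit quantization ollama uses by default; the overhead
# allowance (KV cache, activations, CUDA context) is a rough guess.

def vram_estimate_gb(params_billions: float, bits_per_weight: int,
                     overhead_gb: float = 3.0) -> float:
    """Weights plus a rough allowance for KV cache and runtime overhead."""
    weights_gb = params_billions * bits_per_weight / 8  # billions of params -> GB
    return weights_gb + overhead_gb

q4 = vram_estimate_gb(27, 4)     # ~16.5 GB: fits in 24 GB with headroom
fp16 = vram_estimate_gb(27, 16)  # ~57 GB: would NOT fit; quantization is the point

print(f"q4: {q4:.1f} GB, fp16: {fp16:.1f} GB")
```

Same arithmetic says a 4-bit 70b model (~38 GB) does not fit on one card, which is why the 24 GB / 27b pairing is the sweet spot here.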

(http://www.autoadmit.com/thread.php?thread_id=5806767&forum_id=2/#49487222)




Date: December 5th, 2025 7:28 PM
Author: https://imgur.com/a/o2g8xYK


Oh, another 180 aspect of this card is that the non-Ti variant uses the good old-fashioned power connectors. If you get the 3090 Ti you're forever stuck with the Nvidia proprietary connector, and it's 1st gen. AVOID

(http://www.autoadmit.com/thread.php?thread_id=5806767&forum_id=2/#49487226)