we can't ban open source AI so we'll remove the ability to run it locally
Date: December 17th, 2025 9:25 AM Author: Cerise Pervert
why don't you "ping" your pencil neck you weird little freak
(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516133)
Date: December 17th, 2025 1:13 PM Author: i gave my cousin head
here are the normie instructions for low iq mos
https://lmstudio.ai/
+
https://huggingface.co/TheDrummer/Big-Tiger-Gemma-27B-v1
also if you can't answer these questions on your own, you probably don't even have a strong enough setup to begin with
(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49516771)
Date: December 17th, 2025 7:49 PM
Author: https://imgur.com/a/o2g8xYK
You need 24 GB of VRAM to do inference; 16 GB isn't enough. However, unless you are coding, there's little reason to go above 24 GB. Coding can use more VRAM because iterating on the code builds up long context windows, and if you run out of context window the AI will forget what it was doing earlier. This is also why you can't run 15 GB models on 16 GB of VRAM: the context window spills into system RAM and slows everything down
48 GB of VRAM lets you do more with image and video generation, but won't give you measurable gains in text inference. You can load bigger models, but they probably won't give you better results.
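The spillover point can be made concrete with a back-of-the-envelope estimate: the weights are a fixed cost, but the KV cache grows linearly with context length, so a card that barely fits the weights runs out of room as the conversation gets long. The architecture numbers below (layer count, KV heads, head dimension) are illustrative, not any specific model's.

```python
# Rough VRAM estimator: fixed weight cost plus a KV cache that grows
# linearly with context length. Architecture numbers are illustrative.

def kv_cache_gb(n_layers: int, n_kv_heads: int, head_dim: int,
                context_len: int, bytes_per_elem: int = 2) -> float:
    """KV cache size in GB: two tensors (K and V) per layer, each of
    shape [context_len, n_kv_heads * head_dim], at fp16 by default."""
    elems = 2 * n_layers * context_len * n_kv_heads * head_dim
    return elems * bytes_per_elem / 1e9

def fits(vram_gb: float, weights_gb: float, cache_gb: float) -> bool:
    """True if weights plus KV cache stay inside VRAM."""
    return weights_gb + cache_gb <= vram_gb

# A ~15 GB quantized model on a 16 GB card: almost no room for context.
weights = 15.0
cache_8k = kv_cache_gb(n_layers=46, n_kv_heads=16, head_dim=128,
                       context_len=8192)
print(f"KV cache at 8k context: {cache_8k:.2f} GB")
print("fits in 16 GB:", fits(16.0, weights, cache_8k))  # spills into RAM
print("fits in 24 GB:", fits(24.0, weights, cache_8k))
```

With these made-up numbers the 8k-token cache is about 3 GB, which is exactly the kind of overflow that pushes a 15 GB model off a 16 GB card while a 24 GB card still has headroom.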
(http://www.autoadmit.com/thread.php?thread_id=5811303&forum_id=2#49517900)