I got 32 additional GB of ram at a low, low cost from someone. What can I actually do with it?
Keep it and wait for the applications to bloat up. You won’t feel like you have an excessive amount of RAM in a few years.
You can run AI models in it. Probably 70B ones, or up to 60B if you want to do other stuff while running them.
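For a rough sense of what fits: a model's weight footprint is roughly parameter count times bytes per weight. A minimal sketch (decimal GB, weights only — real usage is higher once you add the KV cache and runtime overhead):

```python
# Rough sketch: estimate the RAM needed just to hold an LLM's weights
# at a given quantization level. Illustrative only, not exact footprints.
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

print(round(weight_memory_gb(8, 4), 1))   # 4.0  -- an 8B model at 4-bit
print(round(weight_memory_gb(70, 4), 1))  # 35.0 -- a 70B model at 4-bit
```

So a 4-bit 70B model is where an extra 32 GB starts to earn its keep.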
I built my PC recently and splurged on about 100 GB of DDR5, thinking it was going to be a waste of money.
I couldn’t have been more wrong; there are occasionally times when I’m almost running out of memory. How? Multiple desktops, each with tons of programs and stuff open, including probably several hundred Firefox tabs at the worst of times.
Basically, extra RAM has allowed me to kinda postpone the responsibility of having to close programs, maintain cleanliness, etc. I still have to stay organised using desktops so I don’t go crazy with the number of things I have open, but I’m the limiting factor here, not my computer. And that’s a super liberating feeling.
TL;DR: you can NEVER have too much ram.
Does it have RGB? If not just bin it. It is worthless anyway.
Unused RAM is wasted RAM.
More than I could do on my Apple IIe at 64k.
Open 10 extra tabs in chrome
LOL, maybe If I used Chrome.
An extra 20 Firefox tabs
In my case, it’s less about being able to open more Firefox tabs and more about Firefox being able to go longer between crashes due to a memory leak. (I know, I know… Firefox doesn’t have memory leaks anymore. It’s probably due to an extension or some bad JavaScript in one of my perpetually-open sites or something. One of these days I’ll get around to troubleshooting it…)
If it’s any consolation, I also get this from Firefox when I leave tabs open for a long time.
♾️ extra Orion tabs
You can install it in a compatible computer.
Which I did
Excellent!
thanks
I used it for virtual machines and Docker containers.
One docker container per VM just to maximise the ram usage.
I realise that you are making a joke, but here’s what I used it for:
- Debian VM as my main desktop
- Debian VM as my main Docker host
- Windows VM for a historical application
- Debian VM for signal processing
- Debian VM for a CNC
At times only the first two or three were running. I had dozens of purpose-built VM directories for clients, different hardware emulation, version testing, video conferencing, immutable testing, data analysis, etc.
My hardware failed in June last year. I didn’t lose any data, but the hardware has proven hard to replace. Mind you, it worked great for a decade, so, swings and roundabouts.
I’m currently investigating, evaluating and costing running all of this in AWS. Whilst it’s technically feasible, I’m not yet convinced of actual suitability.
> costing running all of this in AWS
The cost will be oh, so much more than you’re expecting. I have not been at a shop where they didn’t later go “oh shit. Repatriate that stuff so it doesn’t cost us a mint.”
Yeah, I’ve been using AWS for many years. I’m familiar :)
Hetzner has better pricing if you don’t need to scale down dynamically.
That just sounds like QubesOS with extra steps
I unironically do this in proxmox. Keeps things nice and separate and i still have plenty ram left.
Any reason for not using LXC, since Proxmox has native support?
In my case, I’m not a fan of running unknown code on the host. Docker and LXC are ways of running a process in a virtual security sandbox. If the process escapes the sandbox, they’re in your host.
If they escape inside a VM, that’s another layer they have to penetrate to get to the host.
It’s not perfect by any stretch of the imagination, but it’s better than a hole in the head.
I do use LXC, but those are still pretty much virtual machines.
Fair point.
Sell it at a medium, medium cost to somebody who needs it
You could run a Java program, but you’d quickly run out of ram.
- Compressed swap (zram)
- Compiling large C++ programs with many threads
- Virtual machines
- Video encoding
- Many Firefox tabs
- Games
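Several of these trade RAM directly for speed. Parallel C++ builds, for instance, are often capped by memory rather than cores; here's a sketch of picking a safe `make -j` value (the ~2 GB per compile job figure is a rule-of-thumb assumption and varies a lot by codebase):

```python
# Sketch: choose a parallel build job count that won't exhaust RAM.
# Assumption: ~2 GB per heavy C++ compile job (varies by codebase).
import os

def safe_jobs(total_ram_gb: float, gb_per_job: float = 2.0) -> int:
    cpu_jobs = os.cpu_count() or 1                      # don't exceed core count
    ram_jobs = max(1, int(total_ram_gb // gb_per_job))  # don't exceed RAM budget
    return min(cpu_jobs, ram_jobs)

# With 32 GB you have headroom for up to 16 jobs, capped by core count:
print(safe_jobs(32))
```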
I have 16 GB of RAM and recently tried running local LLMs. Turns out my RAM is a bigger limiting factor than my GPU.
And, yeah, docker’s always taking up 3-4 GB.
vram would help even more i think
Either you use your CPU and RAM, or your GPU and VRAM
Fair, I didn’t realize that. My GPU is a 1060 6 GB so I won’t be running any significant LLMs on it. This PC is pretty old at this point.
You can run a very decent LLM with that tbh
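Rough arithmetic backs that up: runners like llama.cpp let you offload some layers to the GPU (the `--n-gpu-layers` knob) and serve the rest from system RAM. A sketch of that split — all numbers here are illustrative assumptions, not measurements:

```python
# Sketch: estimate how many transformer layers fit in VRAM, with the
# remainder served from system RAM. Figures are illustrative assumptions.
def gpu_layer_split(n_layers: int, model_gb: float, vram_gb: float,
                    reserve_gb: float = 1.0) -> int:
    gb_per_layer = model_gb / n_layers
    fit = int((vram_gb - reserve_gb) // gb_per_layer)
    return max(0, min(n_layers, fit))

# e.g. an 8B model quantized to ~5 GB, 32 layers, on a 6 GB card:
print(gpu_layer_split(32, 5.0, 6.0))  # 32 -- the whole model fits
```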
You could potentially run some smaller MoE models as they don’t take up too much memory while running. I’d suspect the deepseek r1 8B distill with some quantization would work well.
I tried out the 8B deepseek and found it pretty underwhelming - the responses were borderline unrelated to the prompts at times. The smallest I had any respectable output with was the 12B model - which I was able to run, at a somewhat usable speed even.
Ah, that’s probably fair, i haven’t run many of the smaller models yet.
Run a fairly large LLM on your CPU so you can get the finest of questionable problem solving at a speed fast enough to be workable but slow enough to be highly annoying.
This has the added benefit of filling dozens of gigabytes of storage that you probably didn’t know what to do with anyway.
Keep (checks math) 3 more tabs open in chrome.