
I currently have a 1 TiB NVMe drive that has been hovering around 100 GiB free for the past couple of months. I've kept usage in check by deleting a game every couple of weeks, but I'd like to actually play something now and then, and I'm running out of games to delete when I need more space.

That's why I've been thinking about upgrading to a 2 TiB drive, but I just saw an interesting forum thread about LVM cache. The promise of having the storage capacity of an HDD with (usually) the speed of an SSD seems very appealing, but is it actually as good as it seems to be?

And if it is worth doing, which software should be used? LVM cache seems like a decent option, but I've seen people say it's slow. bcache also gets mentioned, but apparently it can be unreliable at times.

Beyond that, which caching mode should be used? The Arch Wiki page for bcache mentions several options: some only seem to cache writes, while others aim to keep the HDD idle as long as possible.
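
For reference, this is roughly what I understand a basic lvmcache setup to look like from the lvmcache(7) man page. It's only a sketch (in Python, purely to keep the commands in one script), and the device names /dev/sda (the HDD), /dev/nvme0n1p3 (a spare NVMe partition), and the vg_games/games naming are all placeholders:

```python
#!/usr/bin/env python3
"""Sketch of a basic lvmcache setup (dm-cache via a cache volume).

Assumes /dev/sda is the HDD and /dev/nvme0n1p3 is a spare NVMe
partition -- both names are placeholders. Run as root; this modifies
the named devices.
"""
import subprocess


def run(*cmd: str) -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


# Put both devices into one volume group.
run("pvcreate", "/dev/sda", "/dev/nvme0n1p3")
run("vgcreate", "vg_games", "/dev/sda", "/dev/nvme0n1p3")

# Big, slow LV on the HDD and a smaller fast LV on the NVMe partition.
run("lvcreate", "-n", "games", "-l", "100%PVS", "vg_games", "/dev/sda")
run("lvcreate", "-n", "fast", "-l", "100%PVS", "vg_games", "/dev/nvme0n1p3")

# Attach the fast LV as a cache for the slow one. "writethrough" only
# speeds up reads but survives losing the cache device; "writeback"
# caches writes too at the cost of that safety.
run("lvconvert", "--type", "cache", "--cachevol", "fast",
    "--cachemode", "writethrough", "vg_games/games")
```

As I understand it, lvconvert --splitcache vg_games/games would detach the cache again later without losing the data, which makes this fairly low-risk to experiment with.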

Also, does anyone run a setup like this themselves?

[–] schizo@forum.uncomfortable.business

...depends on what your usage pattern is, but I doubt you'd enjoy it.

The problem is that the cached data will be fast, but the uncached data will, well, still be on a hard drive.

If you have enough cache space to keep your OS and your commonly used data on it, it's great; but if you have enough SSD space to keep your OS and that data on it anyway, why are you doing this in the first place?

If the cache drive isn't big enough to hold your commonly used data, then it's absolutely going to perform worse than just buying another SSD.
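
If you do end up trying it, a quick way to sanity-check whether your working set actually fits is to watch dm-cache's hit/miss counters. Rough sketch; vg0/data is a placeholder for whatever your cached LV ends up being called, and the lvs fields should exist on any reasonably recent lvm2 (check lvs -o help if not):

```python
#!/usr/bin/env python3
"""Print dm-cache read/write hit rates for a cached LV via lvs.

"vg0/data" is a placeholder -- point it at your own cached LV.
Needs to run as root (or however you normally run lvs).
"""
import subprocess

FIELDS = "cache_read_hits,cache_read_misses,cache_write_hits,cache_write_misses"

out = subprocess.run(
    ["lvs", "--noheadings", "--separator", ",", "-o", FIELDS, "vg0/data"],
    capture_output=True, text=True, check=True,
).stdout
read_hits, read_misses, write_hits, write_misses = (
    int(x) for x in out.strip().split(",")
)


def rate(hits: int, misses: int) -> str:
    total = hits + misses
    return f"{hits / total:.1%}" if total else "n/a"


print("read hit rate: ", rate(read_hits, read_misses))
print("write hit rate:", rate(write_hits, write_misses))
```

If the read hit rate stays low once the cache has had time to warm up, your working set is bigger than the cache and you're mostly just benchmarking the hard drive.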

So I guess if this is an 'I keep my whole Steam library installed, but only play 3 games at a time' kind of use case, it'll probably work fine.

For everything else, eh, I probably wouldn't.

Edit: a better use case for this is the 'I have 800 TB of data, but 99% of it is historical and the daily working set is just a couple hundred gigs' NAS type of thing.

[–] tiddy@sh.itjust.works

I'm curious what type of workflow has you using mainly the same data consistently. I'm probably biased because I like to try software out, but I can't imagine (outside office use) a loop that would remain this closed.

It's mostly professional/office use where this makes sense. I've implemented this (well, a similar setup that does the same thing) for clients that want versioning and compliance.

I've worked with/for a lot of places that keep everything, because disks are cheap enough that they've decided it's better to have a copy of every Git revision than to not have one and need it some day.

Or places that have compliance obligations to keep copies of every email, document, spreadsheet, picture, and so on. You'll almost never touch the "old" data, but you have to hold on to it for a decade somewhere.

It's basically cold storage that can immediately pull the data into a fast cache if/when someone needs the older data, but otherwise it just sits there forever on a slow drive.