This hasn't happened to me yet, but I was just thinking about it. Let's say you have a server with an iGPU, and you use GPU passthrough to let VMs use it. Then one day the host's SSH server breaks, maybe because you did something stupid or there was a bad update. Are you fucked? How could you possibly recover with no display and no SSH? The only thing I can think of is setting up serial access for emergencies like this, but I rarely hear about serial access nowadays, so I wonder if there's some other solution here.
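For the serial fallback, a minimal sketch of what enabling it could look like on a GRUB + systemd host (ttyS0 and the 115200 baud rate are assumptions; an IPMI serial-over-LAN setup would look different):

```sh
# /etc/default/grub (excerpt) -- assumed serial port ttyS0 at 115200 baud
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet console=tty0 console=ttyS0,115200n8"
#   GRUB_TERMINAL="console serial"
#   GRUB_SERIAL_COMMAND="serial --unit=0 --speed=115200"

sudo update-grub                                        # Debian/Ubuntu (Fedora: grub2-mkconfig -o /boot/grub2/grub.cfg)
sudo systemctl enable --now serial-getty@ttyS0.service  # login prompt on the serial port
```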

[–] InEnduringGrowStrong@sh.itjust.works 3 points 1 week ago (1 children)

I pass through a GPU (no iGPU on this mobo).
It only hijacks the GPU when I start the VM, and I haven't configured that VM to autostart.
Before the VM is started, the host prompt shows on the GPU's output. It doesn't return to the prompt if the VM is shut down or crashes, but a reboot brings it back, hence not autostarting that VM.
If the host got borked too badly, putting in a temporary GPU might be easier.
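For reference, checking that with libvirt looks roughly like this (the domain name and PCI address are placeholders):

```sh
virsh list --all --autostart        # domains configured to start automatically with libvirtd
virsh autostart win10-vm --disable  # hypothetical domain name
lspci -nnk -s 01:00.0               # "Kernel driver in use:" shows amdgpu vs vfio-pci
```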

Also, don't break your SSH.
That's pretty easy to avoid with key-based (PKI) auth.
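A minimal sketch of locking SSH down to key-only auth (the hostname is a placeholder; keep a second session open until you've confirmed key login works):

```sh
ssh-copy-id user@yourserver          # hypothetical host; installs your public key
# then in /etc/ssh/sshd_config:
#   PubkeyAuthentication yes
#   PasswordAuthentication no
sudo sshd -t && sudo systemctl reload sshd   # validate the config, then reload (service may be "ssh" on Debian/Ubuntu)
```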

[–] berylenara@sh.itjust.works 1 points 1 week ago (1 children)

It only hijacks the GPU when I start the VM

How did you do this? All the tutorials I've read hijack the GPU at boot. Do you have to manually detach the GPU from the host before assigning it to the VM?
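(For context, the manual detach I'm imagining looks roughly like this with libvirt; the PCI address and domain name are placeholders, and with managed='yes' in the hostdev XML libvirt is supposed to do this step itself at VM start.)

```sh
virsh nodedev-detach pci_0000_01_00_0    # unbind the GPU from its host driver, bind it for passthrough
virsh start win10-vm                     # hypothetical domain
# ...and after the VM is shut down:
virsh nodedev-reattach pci_0000_01_00_0  # hand the GPU back to the host driver
```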

[–] InEnduringGrowStrong@sh.itjust.works 2 points 1 week ago (1 children)

Interesting.
I'm not doing anything special that wasn't in one of the popular tutorials, and I thought that's how it was supposed to work, although the current behaviour might very well be a "bug".

I don't know enough about this, but the drivers are blacklisted on the host at boot, yet the console is still displayed through the GPU's HDMI at that point. That might depend on the specific GPU (a Vega 64 in my case).

The host doesn't have a graphical desktop environment, just the shell.
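For reference, the early-bind setup those tutorials describe usually boils down to something like this (the PCI IDs below are placeholders). The console likely keeps working because the firmware framebuffer (efifb/simpledrm) is still driving the HDMI output until the VM actually takes the card over.

```sh
# Claim the GPU (and its HDMI audio function) for vfio-pci at boot.
# The PCI IDs are placeholders -- find yours with: lspci -nn | grep -iE 'vga|audio'
cat <<'EOF' | sudo tee /etc/modprobe.d/vfio.conf
options vfio-pci ids=1002:687f,1002:aaf8
softdep amdgpu pre: vfio-pci
EOF
sudo update-initramfs -u    # Debian/Ubuntu (Fedora: sudo dracut -f), then reboot
```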

[–] berylenara@sh.itjust.works 3 points 1 week ago

the drivers are blacklisted on the host at boot

This is the problem I was alluding to, though I'm surprised you can still see the console despite the driver being blacklisted. I've heard of people using scripts to manually detach the GPU from the host and attach it to a VM, but it sounds like you don't need that, which is interesting.
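Those scripts are usually hung off libvirt's qemu hook, which gets called with the domain name and a phase; a rough sketch of the idea (the PCI address and domain name are placeholders, and the GPU's audio function would need the same handling):

```sh
#!/bin/sh
# /etc/libvirt/hooks/qemu -- libvirt calls this as: qemu <domain> <phase> <sub-phase> ...
DOMAIN="$1"; PHASE="$2"
GPU="0000:01:00.0"   # placeholder PCI address

if [ "$DOMAIN" = "win10-vm" ] && [ "$PHASE" = "prepare" ]; then
    modprobe vfio-pci                                           # make sure the driver is loaded
    echo "$GPU" > "/sys/bus/pci/devices/$GPU/driver/unbind"     # detach from the host driver
    echo vfio-pci > "/sys/bus/pci/devices/$GPU/driver_override" # force the next bind to vfio-pci
    echo "$GPU" > /sys/bus/pci/drivers_probe                    # trigger the re-bind
fi
```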