this post was submitted on 01 Sep 2023
128 points (85.2% liked)

Linux


In response to Wayland Breaks Your Bad Software

I say that the technical merits are irrelevant because I don't believe that they're a major factor any more in most people moving or not moving to Wayland.

With only a slight amount of generalization, none of these people will be moved by Wayland's technical merits. The energetic people who could be persuaded by technical merits to go through switching desktop environments or in some cases replacing hardware (or accepting limited features) have mostly moved to Wayland already. The people who remain on X are there either because they don't want to rebuild their desktop environment, they don't want to do without features and performance they currently have, or their Linux distribution doesn't think their desktop should switch to Wayland yet.

[–] 0x0@social.rocketsfall.net 38 points 1 year ago* (last edited 1 year ago) (6 children)

X11 is, to put it simply, not at all fit for any modern system. Full stop. Everything done to make it work on modern systems is just hacks. Don’t even try to get away with “well, it just works for me” or “but Wayland no worky”.

I really don't know if there could be a more obnoxious opening than this. I guess Wayland fanatics have taken a page from the Rust playbook of trying to shame people into using it when technical merits aren't enough ("But your code is UNSAFE!")

[–] skullgiver@popplesburger.hilciferous.nl 45 points 1 year ago* (last edited 1 year ago) (2 children)

[This comment has been deleted by an automated system]

[–] orangeboats@lemmy.world 11 points 1 year ago* (last edited 1 year ago) (1 children)

I feel that the biggest mistake of X11's protocol design is the idea of a "root window" that is supposed to cover the whole screen.

Perhaps that worked well in the 1990s, but it's just completely incompatible with the multi-display setups we commonly see today. Hacks upon hacks were involved to make multiple displays a possibility on X11. The root window no longer corresponded to a single display. In heterogeneous display setups, part of the root window is actually invisible.

Later on we decided to stack compositing on top of the already-hacky mess, and it was so bad that many opted to disable the compositor (no Martha, compositors are more than wobbly windows!).

And then there's the problem of sandboxing programs... Which is completely unmappable to X11 even with hacks.

[–] michaelrose@lemmy.ml -1 points 1 year ago (2 children)

Multiple displays work fine. The only thing that needs to be drawn in the root window is an attractive background sized to your displays. I'm not sure why you think that is hacky or complicated.

[–] siberianlaika@lemm.ee 7 points 1 year ago (2 children)

Multiple displays only work as long as you have identical resolutions and refresh rates. Good luck mixing monitors with different scaling factors and refresh rates on X11.

[–] Hexarei@programming.dev 2 points 1 year ago

I run multiple refresh rates without any trouble, one 165 Hz monitor alongside my other 60 Hz ones. Is that supposed to be broken somehow?

[–] michaelrose@lemmy.ml -1 points 1 year ago

This wasn't true in 2003 when I started using Linux; in fact, the feature is so old I'm not sure exactly when it was implemented. You have always been able to use different resolutions and, in fact, different scaling factors. It works like this:

You scale your lower-DPI display or displays UP to match your highest-DPI one and let X scale down to the physical size. HIGHER / LOWER = SCALE FACTOR. So with two 27" monitors where one is 4K and the other is 1080p, the factor is 2; a 27" 4K next to a 24" 1080p is roughly 1.8.

Configured like so, everything is sharp and UI elements are the same size on every screen. If your monitors are vertically aligned, you could put a window between monitors and see the damn characters lined up correctly.

If you use the soooo unfriendly NVIDIA GPU, you can actually configure this in its monitor-configuration GUI. If not, you can set it with xrandr; the argument is --scale, shockingly enough.
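As a sketch of the setup described above (the output names DP-1 and HDMI-1 are assumptions; check `xrandr -q` for yours): a 4K panel next to a same-size 1080p panel, with the 1080p screen rendered at 2x so UI elements match across both.

```shell
# Place the 4K panel on the left and the 1080p panel on the right.
# --scale 2x2 renders the 1080p output at 3840x2160 and scales it down,
# matching the 4K panel's DPI (HIGHER DPI / LOWER DPI = 2).
# Positions are given in the scaled coordinate space.
xrandr --output DP-1   --mode 3840x2160 --pos 0x0 \
       --output HDMI-1 --mode 1920x1080 --scale 2x2 --pos 3840x0
```

This is a configuration fragment rather than a portable script; the right scale factor depends on your panels' actual DPIs, per the formula above.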

Different refresh rates also work, of course, but you ARE limited to the lowest one. This is about the only meaningful limitation.

[–] orangeboats@lemmy.world 1 points 1 year ago (1 children)

It's the fact that the root window is a lie.

[–] scroll_responsibly@lemmy.sdf.org -1 points 1 year ago (1 children)
[–] orangeboats@lemmy.world 1 points 1 year ago

...What? The root window was supposed to mean "the whole screen". It no longer does - that's the lie. Then people created XRandR to help work around it - that's the hack.

[–] woelkchen@lemmy.world 10 points 1 year ago (1 children)

This is not an insult to the people behind X11.

The people behind X11 agree, and that's why they created Wayland.

[–] skullgiver@popplesburger.hilciferous.nl 1 points 1 year ago* (last edited 1 year ago) (1 children)

[This comment has been deleted by an automated system]

[–] Auli@lemmy.ca 4 points 1 year ago (1 children)

Sure, but the people behind X11 are the same ones behind Wayland. The developers decided it wasn't worth the time to fix X11 and that it would be better to start a new project to fix the issues. How can it make any sense for end users to insist we should just fix X11? I think their biggest mistake is the name: they should have called Wayland X12 or something like that.

[–] skullgiver@popplesburger.hilciferous.nl 1 points 1 year ago* (last edited 1 year ago) (1 children)

[This comment has been deleted by an automated system]

[–] woelkchen@lemmy.world 1 points 1 year ago (1 children)

X11 has decades of tooling that doesn’t work on Wayland anymore.

Wayland 1.0 was released in 2012, though.

[–] skullgiver@popplesburger.hilciferous.nl 1 points 1 year ago* (last edited 1 year ago) (1 children)

[This comment has been deleted by an automated system]

[–] woelkchen@lemmy.world 1 points 1 year ago (1 children)

That’s not necessarily a bad thing, but it does break some workflows.

11 years after Wayland 1.0 and 7 years after Gnome 3.22 were released.

[–] skullgiver@popplesburger.hilciferous.nl 3 points 1 year ago* (last edited 1 year ago) (1 children)

[This comment has been deleted by an automated system]

[–] woelkchen@lemmy.world 1 points 1 year ago

11 years after the majority of Linux users didn’t notice anything and stuck with X11

All major distributions have defaulted to GNOME on Wayland for years now, and since last year's Steam Deck release, even millions of super casual gamers use Wayland without knowing what a display server is.

[–] Static_Rocket@lemmy.world 25 points 1 year ago

No, no, they've got a point. The architecture of Wayland is much more sane. Because of the way refresh events are driven, it's also much more power- and memory-efficient. I'll miss bspwm and picom, but man, there is a lot riding on simplifying the graphics stack under Linux. The X hacks, GLX, and all the other weird interactions X decided to take away from applications made things non-portable to begin with and a nightmare for any embedded devices that thought GLES was good enough.

[–] Sh1nyM3t4l4ss@lemmy.world 21 points 1 year ago (2 children)

There are several remarks in that article that bothered me. I agree with their message overall and am a strong proponent of Wayland but...

Unless your workflow (and hardware) comes from 20+ years ago, you have almost no reason to stick with Xorg

There definitely are valid use cases that aren't 20 years old that will keep you on X11 for a little while longer. And hardware too: NVIDIA dropped driver support for Kepler GPUs and older before they added GBM support, which is effectively a requirement for Wayland, so you can't use those older cards on Wayland with the proprietary drivers.

Of course, NVIDIA likes to do their own thing, as always. Just use Nouveau if you want to do anything with Xwayland, and you don’t have several GPUs.

Uh, no. Nouveau is not a serious option for anyone who likes using their GPU for useful things. And on those older cards it will likely never work well.

The author of that article seems extremely ignorant of other people's needs.

[–] woelkchen@lemmy.world 2 points 1 year ago (1 children)

NVIDIA dropped driver support for Kepler GPUs and older before they added GBM support which is effectively a requirement for Wayland, so you can’t use these older cards on Wayland with the proprietary drivers

That's definitely the fault of people who buy NVIDIA hardware, which only works well on Windows. It's not the fault of Wayland developers that NVIDIA is a shit company that doesn't care to make its hardware run properly on Linux.

[–] Sh1nyM3t4l4ss@lemmy.world 2 points 1 year ago (1 children)

Can we stop shaming people who buy NVIDIA?

For one, people want to keep using what they have and not buy something new just because it may work better on Linux, and they may not even be able to afford an upgrade. They probably didn't even know about Linux compatibility when they got it.

And additionally, some people have to use NVIDIA because, e.g., they rely on CUDA or something (which is unfortunate but not their fault).

And honestly, NVIDIA is fine on Linux nowadays. It sucks that support for older cards will likely stay crappy forever but hopefully with the open kernel drivers and NVK newer cards won't have to suffer that fate.

[–] woelkchen@lemmy.world 3 points 1 year ago

Can we stop shaming people who buy NVIDIA?

Can people who buy NVidia hardware contrary to widespread wisdom just start to own up to their decisions and not complain about Wayland every time it is mentioned?

[–] michaelrose@lemmy.ml -1 points 1 year ago

The author is a Wayland fanboy which almost by definition makes them a moron. We are talking about folks who were singing the same song like 7 years ago when the crack they were promoting was outrageously broken for most use cases.

[–] russjr08@outpost.zeuslink.net 20 points 1 year ago (2 children)

I find that when people write "Full stop", it's usually best to just stop reading there.

It comes off as "I am correct, how dare you think that for a moment I could be wrong".

I'd love to use Wayland, but until it works properly on Nvidia hardware like X11 is, then it's not a viable option for me. Of course, then someone always goes "Well then use an AMD card" but money doesn't grow on trees. The only reason I'm not still using a 970 is because a friend of mine was nice and gave me his 2080 that he was no longer using, along with some other really nice upgrades to my hardware.

Honestly, it's one of the biggest issues I have with the Linux community. I love Linux and FOSS software, but the people who go around yelling at anyone who isn't using Linux, and the people who write articles like this trying to shame you for your choices (when choice is supposed to be a hallmark of using open source software), only make Linux look bad.

There's a difference between someone kindly telling others that X11 is not likely to receive any new major features or bug fixes (which is the right thing to do, in order to inform someone of something they may not know) - and whatever the author of this quote is doing.

[–] happyhippo@feddit.it 11 points 1 year ago* (last edited 1 year ago) (3 children)

It happens all the time in the magical world of closed source, too.

Ever heard about the iOS vs Android fights? How people shame Android users for their green bubbles?

It's just an extension of the my-camp-vs-theirs mentality applied to the tech field, nothing new.

[–] pelotron@midwest.social 11 points 1 year ago* (last edited 1 year ago) (1 children)

I laughed off reports about this kind of thing, thinking "omg who could possibly give a shit about what color their text bubble is in a group chat?" Later my gen Z office mate told me about how he uses an iPhone and cited this exact reason unironically. I was stunned into silence.

[–] zwekihoyy@lemmy.ml 1 points 1 year ago* (last edited 1 year ago)

there's a decent amount of research into the psychology behind it and how reading white text on the light green bubble is more difficult than on the blue one. it's rather interesting.

edit: although I would think dark mode should change that effect a little bit

[–] russjr08@outpost.zeuslink.net 1 points 1 year ago

Oh absolutely, I am sadly all far too well aware of those cases (especially the "green bubbles" thing, I've never rolled my eyes harder at a silly situation).

It's not even strictly a tech thing either; it's a long-standing pattern in human history no matter where you look, and unfortunately I don't see it going away any time soon.

[–] bemenaker@lemmy.world 1 points 1 year ago

It sounds like you need to complain to NVIDIA to do a better job with their drivers. If the drivers suck, it doesn't matter what Wayland does.

[–] Auli@lemmy.ca 9 points 1 year ago (2 children)

OK, but then what about the developers of X11, who decided it wasn't worth fixing the issues and started a new project called Wayland where they could fix them from scratch? Does that change your mind at all?

[–] duncesplayed@lemmy.one 3 points 1 year ago

That would be a "technical merit", which the article author claims is irrelevant to the discussion.

[–] 0x0@social.rocketsfall.net -1 points 1 year ago (1 children)

I have not had a single X11-related issue in the last decade.

[–] siberianlaika@lemm.ee 5 points 1 year ago (1 children)

I don't want to sound rude, but how old is your setup? Are you using a desktop or a laptop computer?

Because I'm daily driving a late-2015 Dell XPS 9350, and X11 just ain't cutting it, even though the laptop is nearly a decade old. On X11, its trackpad would be garbage, GNOME's animations would be stuttery, and fractional scaling would be a mess, because I have a docking station with a 75 Hz ultrawide monitor, meaning that I must utilise both 125% and 100% scaling factors, as well as 60 Hz and 75 Hz refresh rates and different resolutions. Sure, not everyone uses multi-monitor setups, but those who do serious office tasks or content production work often cannot imagine their workflow without multiple monitors. Point is, X11 is too ancient to handle such tasks smoothly, reliably and efficiently.

[–] 0x0@social.rocketsfall.net 4 points 1 year ago

It's not rude - don't worry. My main desktop runs 4 monitors at 1080p. GPU is an RX 580. I have a number of other laptops/tablets/desktops running similar configs, including ones with mixed resolutions and refresh rates. Gaming/video production/programming.

I think people are really discounting the amount of value experience with a certain set of software has to the end-user. Wayland isn't a drop-in replacement. There's a new suite of software and tooling around it that has to be learned, and this is by design. Understandably, many people focus on getting displays working properly on mixed resolutions and refresh rates, but there are concerns for usability/accessibility outside of that.

[–] michaelrose@lemmy.ml 3 points 1 year ago (1 children)

This is literally the exact bad attitude of your average Wayland proponent: the thing that has worked for 20 years "doesn't work", you just hallucinated it, along with all the show-stopper bugs you encountered when you tried to switch to Wayland.

[–] orangeboats@lemmy.world 6 points 1 year ago* (last edited 1 year ago) (14 children)

It's really not "working" per se. VRR was broken on X11, sandboxing was broken on X11, fractional scaling and mixed DPI were broken on X11.

How did we achieve HiDPI on X11? By changing Xft.dpi (breaking old things) or adding random environment variables (terrible UX - do you want to worsen Linux desktop's reputation even more?). Changing XRandR? May your battery life be long lasting.

There's genuinely no good way to mix different DPIs on the same X server, even with only one screen! On Windows and Mac, the old LoDPI applications are scaled up automatically by the compositor, but this just doesn't exist on X11.
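For the record, the "hacks" mentioned above look roughly like this (values are illustrative; note that every one of these knobs is global, not per-monitor):

```shell
# HiDPI on X11 is a pile of global, per-toolkit settings.
# In ~/.Xresources, picked up by Xft-based applications only:
#   Xft.dpi: 192
# Environment variables, one set per toolkit:
export GDK_SCALE=2            # GTK: integer scaling only
export GDK_DPI_SCALE=0.5      # GTK: compensate font size after GDK_SCALE
export QT_SCALE_FACTOR=1.5    # Qt 5/6: supports fractional values
```

None of this scales a legacy application per-output the way Wayland, Windows, or macOS compositors do, which is exactly the complaint.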

I focus on DPI because it is a huge weakness of X11 and there is a foreseeable trend of people using HiDPI monitors more and more. There are tons of other weaknesses, but people tend to sweep them under the rug as being exotic. And please don't call HiDPI setups exotic: for all the jokes we see about the eternal 768p screens that laptop manufacturers like to use, mainstream laptops are moving on to 1080p. On a 13" screen, shit looks tiny if you don't scale it up by 150%.

You can hate on Wayland, you may work on an alternative called Delaware for all I care, but let's admit that X11 doesn't really work anymore and is not the future of Linux desktop.
