Ask HN: Functioning hidpi setup on Linux, how?
How do you manage to have a functioning hidpi setup on Linux?
With X, all monitors must share the same pixel density, and with Wayland, X apps (running via XWayland) are blurry, like VS Code for instance (scaled up on HiDPI displays).

> With X, all monitors must share the same pixel density

To get proper fractional scaling on X11 with monitors of different pixel densities, you'll want to use either:

1. GNOME Shell with a patched version of Mutter, which is included by default in Ubuntu[1] and its derivatives (such as Pop!_OS and elementary OS). Manjaro also offers the patched version of Mutter as the mutter-x11-scaling package.[2] You'll need to enable the x11-randr-fractional-scaling experimental feature for any of these distros.[3]

2. Cinnamon, included by default in Linux Mint

To get proper fractional scaling on Wayland, you also have two options:

1. GNOME Shell (no patch needed), with the scale-monitor-framebuffer experimental feature enabled[3]

2. KDE Plasma, which is still a little bit buggy on Wayland

[1] https://salsa.debian.org/gnome-team/mutter/-/blob/ubuntu/mas...
[2] https://github.com/puxplaying/mutter-x11-scaling
[3] https://www.omgubuntu.co.uk/2019/06/enable-fractional-scalin...
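For reference, the experimental features mentioned above are toggled via gsettings; a minimal sketch, assuming the feature names from [3] still apply to your GNOME version:

    # Wayland session (stock Mutter):
    gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
    # X11 session (patched Mutter from [1]/[2]):
    gsettings set org.gnome.mutter experimental-features "['x11-randr-fractional-scaling']"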
It simply works for me. My desktop is on Debian sid, running GNOME 41.0 on Wayland with a dual-monitor setup (3840×2160 and 1920×1200) on AMD graphics, and everything just works. I deliberately bought AMD graphics because of the driver support.

The laptop is also on sid, with Intel graphics and a 4K display, and there are absolutely no issues there either. I don't see any blurriness in vscode or any other application that I use. But also, I'm not using a lot of graphical apps. Mostly it's just Firefox, vscode, android-studio, tilda, and sublime-merge.

[edit] Looks like this: https://imgur.com/a/Zcc031e

How are you using tilda on Wayland, i.e. without global hotkeys?

Sorry, it's tilix, not tilda. I used tilda for years until I got used to tilix and the splitting. I'm still missing the quake mode, which is unavailable under Wayland.

The compositor (Mutter) provides support for global hotkeys. Do you re-open a running tilda instance, or open a new one every time you press the hotkey?

You should be able to run VSCode without XWayland using `--enable-features=UseOzonePlatform --ozone-platform=wayland`. HiDPI just works for me. Intel graphics, Sway.

Yes! Tried that on Sway with code-insiders, but the experience is unstable (random crashes).
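For reference, the flags from this exchange go straight on the command line; a minimal sketch, assuming the launcher is named `code` (substitute `code-insiders` as appropriate):

    code --enable-features=UseOzonePlatform --ozone-platform=wayland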
What distribution are you running? There's an upstream report at https://github.com/microsoft/vscode/issues/109176; it sounds like your experience will vary wildly depending on the version of Electron you're using and the options it was built with.

I had relatively few issues with this on Debian on my HP Spectre x360 (2019 model, I think). Relevant excerpt from my install log:

> Setting global scale to 200-250% in KDE 5 and upping a few fonts seems to fix virtually all HiDPI issues, except for SDDM and the console framebuffer. You may also want to resize the start bar, but that is personal preference.

> For SDDM:
> Put in /etc/sddm.conf.d/kde_settings.conf
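The excerpt cuts off before the file contents. As a sketch, using SDDM's documented HiDPI keys (the -dpi value is an assumption, chosen to match the 200-250% scaling mentioned above):

    # as root
    cat > /etc/sddm.conf.d/kde_settings.conf <<'EOF'
    [X11]
    ServerArguments=-dpi 192
    EnableHiDPI=true
    EOF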
I have a laptop with HiDPI (14" 2560x1440) and I gave up: I now use a non-native, lower resolution. I tried X11, I tried Wayland, I tried scaling, I tried fractional scaling, I tried changing the default font size. I always ended up with text too small, text too blurry, or inconsistencies between applications.

I have no experience with GNOME and KDE, but certainly both scaling and fractional scaling cannot give anything other than bad results. For correct text rendering, you must set the screen to its native resolution and then set a correct value for the DPI, so that a font of 12 points (or of 10 points, whichever you prefer) has the size that you like in a default font. If you use GNOME or KDE, I do not know where they let you set the exact value for the screen DPI, but they must have such a setting; otherwise it is indeed not possible to use a HiDPI display properly. EDIT: at least KDE has a "Force Font DPI" setting, while the corresponding setting in GNOME seems harder to find.

It appears that the reason why I never had any problems with bad text rendering on HiDPI in Linux is that I never attempted to use the so-called HiDPI support from GNOME and KDE, which is just an extremely ugly and inefficient workaround for GUI applications that have been written incompetently, so that they lack options for changing the sizes of fonts and of other GUI elements. In my experience, most Linux applications have the required options, so they do not need the "HiDPI support" workaround, which is guaranteed to render any text badly.
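As a sketch of the DPI arithmetic above, for the 14" 2560x1440 panel mentioned earlier (assuming a 16:9 aspect ratio):

    # width = 14 * 16 / sqrt(16^2 + 9^2) ≈ 12.2 in, so DPI ≈ 2560 / 12.2 ≈ 210
    echo "Xft.dpi: 210" | xrdb -merge   # or set KDE's "Force Font DPI" to 210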
> I tried X11, I tried Wayland, I tried scaling

It's usually not about X11 or Wayland, it's about the window manager support.

Vanilla Ubuntu 20.04, GNOME, X server, RTX 2070 Super GPU, driver version 495. Laptop screen: 17 inch 3840x2160 at 200% scaling. Dual external screens: 27 inch 3840x2160 at 150% scaling. No issues with the above. I only have 4K screens, so I haven't tested this with different resolutions, but at the very least different scaling does work.

Maybe my lack of knowledge is playing a role here, but this is why I'm not a fan of pushing Wayland everywhere. It seems like the X server works pretty nicely, while I hear everywhere about performance and functionality issues with Wayland. Wayland is supposed to be the better solution, but I don't understand how, since it has so many flaws.

X has what, 40 years of history - it's battle-tested to hell and back, and knows all the corner cases. Wayland is much younger and a full rewrite; inevitably it will have issues for years and years. Part of the push is so that these cases emerge. Wayland is probably easier for developers to work with - the X codebase by now is ancient. But it will lack some functionality that X has, probably forever.

> Wayland is much younger and a full rewrite

Isn't it more like Wayland is just some protocols plus some testing code, and all the DEs use their own implementation? So Wayland is at least 3 different rewrites. Also, I just read someone's argument that the X devs are coding Wayland - how the fuck are they coding a protocol? Or are they working on the GNOME implementation, since Red Hat == GNOME? I would also abandon an old code base, just write some protocols covering 10% of the old code's features, and then let the suckers implement them.

It's not just Wayland vs X. For me, the abandoned Unity worked much better than the actively supported GNOME. It took forever for GNOME to get fractional scaling, which Unity had had for years, and GNOME still can't match Unity's features to this day, even with a bunch of plugins installed.

How exactly do you make this work? Whenever I try to use different scaling on my different monitors, the OS (Ubuntu 20.04) ends up setting them both to the same setting, or something else weird happens.
I have tried a lot, and never managed to get Ubuntu 20.04 to set 200% on one monitor and 150% on the other.

I don't really do anything special, just set everything up in GNOME's "Settings" app. Are all your monitors the same resolution, though? If not, that could be the problem.

I've tried to set up a 4K and a 2K monitor side by side, with one at 200% and the other at 100%.

How did you manage to get different scaling to work, if I may ask? That's the only thing I haven't been able to get properly working myself. I have a similar laptop screen, and 3 external 4K monitors. I would love to be able to scale my laptop screen differently as well :)

No special tweaks, everything out of the box using GNOME's Settings app. Initially I didn't expect that to work, because of earlier reports I kept seeing, but it did. In fact just now I set one of my external monitors to 100% while the other remained at 150%, and it worked. The mouse cursor seemed a bit more stuttery than usual on the 100% one, but maybe I was just imagining things.

> With X, all monitors must share the same pixel density

`xrandr` allows different per-output scaling. For example, if you have a 4K laptop and a 4K monitor that is 50% bigger than the laptop display, X will pretend that the monitor is 5760x3240 and scale down to the actual pixels. Having said all that, I dislike scaled anything, so I choose a pixel density between the highest and lowest densities of my physical devices, and manually pick the font size of my terminals.
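A sketch of the per-output scaling described above; the output names and layout are assumptions:

    # Make the 3840x2160 monitor pretend to be 5760x3240 (1.5x) and let X
    # scale it down to the physical pixels:
    xrandr --output eDP-1 --auto --output HDMI-1 --auto --scale 1.5x1.5 --right-of eDP-1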
Using Pop!_OS 21.10 (GNOME Wayland), mixed DPI (2x and 1x) on Intel works pretty OK for Wayland-native apps. XWayland stuff renders blurry on the HiDPI screen, but in practice I don't hit this often at all. Everything is blurry unless you disable shadows, though, because GNOME went and did their own pretty thing without thinking it through. Also, integer scale is a happy place - Wayland still has no real non-integer scale protocol. (See: https://gitlab.freedesktop.org/wayland/wayland-protocols/-/i...)

> Everything is blurry unless you disable shadows

Where can I disable shadows?

GTK CSS override, will provide when at computer.

If you have one screen at 192 DPI and one at 96 DPI, double the resolution of the one at 96 DPI and scale down:

    xrandr --dpi 192 --output eDP-1 --auto --output HDMI-1 --left-of eDP-1 --auto --scale 2

I don't have my laptop with me, so I am not 100% sure. You can also go the other way, but the result is more blurry. Otherwise, as said in another post, apply the tips from the Arch Linux wiki. I have also put my experience here: https://vincent.bernat.ch/en/blog/2018-4k-hidpi-dual-screen-... (notably if you want something a bit dynamic with a laptop).

I recommend this; it's exactly how I handle it in Ubuntu at least. I also often use ARandR in addition to xrandr; there's one feature (positioning the tiles if you have a very nonstandard setup) which is way easier to use in ARandR. But ARandR also lacks a lot of options that xrandr has. You can do it all with xrandr IIRC, but not with ARandR. ARandR just helps with positioning if you have, say, three monitors of different sizes which don't line up in any typical way.

I'm running Fedora, and use two monitors, one of which is 5K. No blurriness on GNOME + Wayland. I may have set a flag for Electron... I don't recall. But I can say it's not only possible, it's quite nice. I love Fedora.

When I was using a TV as an X11 display, I set a very low DPI value such as 96, which led to large fonts everywhere; that was convenient, and of course they rendered just fine.

I solved this problem in 2016 by buying a 4K laptop (Dell Precision 9550) and a 4K external monitor. Similar pixel density; at 200% scale the pixels on the laptop are a bit small, but most apps work well. Most GNOME apps, VSCode, Firefox, and Chrome all work with HiDPI out of the box and everything looks great, and this has been true for the last 5 years.

For me, I avoid running X apps altogether. Electron can be told to render in alternative, Wayland-native modes; others use GTK or Qt, so those can be told to be Wayland-native too. The only application I use which is not Wayland-native is IntelliJ IDEA.

> How do you manage to have a functioning hidpi setup on Linux?

Pure HiDPI works okay-ish by modifying density settings (blurriness is, AFAIK, solved in current Electron builds?); mixed setups are still hopeless in my experience. My work machine has three monitors with different resolutions, different ideal scaling levels, and different refresh rates. The main one is also HDR-enabled. I completely gave up on getting Linux to work properly on that setup and settled on Windows 11 + WSL2 when needed. On the laptop I am writing from right now I multi-boot a couple of Linux distros and Windows 11. My preferred scaling level is 125% on Windows and it generally works great, but I stick to 100% on Linux because fractional scaling is still imperfect (yes, sadly even on Wayland).
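Regarding the GTK CSS override promised a few comments up (never posted in-thread), a commonly cited snippet for disabling client-side decoration shadows is the following; this is an assumption, not the commenter's actual file:

    mkdir -p ~/.config/gtk-3.0
    cat >> ~/.config/gtk-3.0/gtk.css <<'EOF'
    /* disable client-side decoration shadows (GTK 3) */
    decoration { box-shadow: none; }
    EOF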
Some applications (Chrome or Firefox) have CLI options for improved HiDPI. For most other applications, find a resolution at which you can run HiDPI at 2x (fractional scaling didn't look good). XFCE works quite well; I also found this guide really helpful: https://wiki.archlinux.org/title/HiDPI

So in case you consider XFCE: GTK and the other toolkits need to be configured as well. With the closed-source Nvidia drivers, the configuration options via xrandr are limited though, IIRC.
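As a sketch of the toolkit-level configuration the Arch wiki guide above covers, for a 2x setup (environment variables, e.g. in ~/.profile; the values are the commonly documented ones):

    # GTK 3: render at 2x, then undo the font-size doubling
    export GDK_SCALE=2
    export GDK_DPI_SCALE=0.5
    # Qt 5: derive scaling from the screen DPI automatically
    export QT_AUTO_SCREEN_SCALE_FACTOR=1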
I do not think I understand exactly what your problem is. I use monitors of different sizes, and I am not aware of a way in Linux to specify a scale factor for each monitor. So if I move a window from a larger monitor to a smaller monitor, the window, including its text, will become smaller (or possibly larger, if the second monitor has a much lower resolution), the ratio in sizes depending on both the ratio between the physical sizes and the ratio between the resolutions. If the monitors have very different resolutions and/or sizes, then there is no doubt that the setup would become difficult to use, unless you use some applications that you always open on a certain monitor and never move to other monitors, so that you can select in the application preferences a font size and a window size appropriate for that monitor. What I do not understand is your reference to a blurry rendering, because the application window should be rendered in the same way on any monitor, which would lead to different text sizes, but not to differences in blurriness.

I have been using HiDPI displays in Linux for 6 or 7 years, without any problems, and I also do not understand what is meant when people talk about "scaling"; this might be something specific to GNOME or KDE, which I do not use. In any case, "scaling" (as in Windows) sounds like something that should be strictly avoided. The fonts must be rendered for the right number of dots per font size; they must not be scaled from renderings for other DPIs (which would indeed cause blurriness). In XFCE, you have one global setting for fonts, "Custom DPI setting", which, together with the settings for anti-aliasing and hinting, determines how all the typefaces will be rendered. After setting an appropriately high value for DPI and choosing adequate font sizes in your applications, there are no blurriness problems whatsoever, if you take care to use decent typefaces. The default typefaces in most Linux distributions are seldom good, but there are many excellent typefaces, both free and commercial.

The only programs with which I ever had problems on HiDPI in Linux are those written in Java, which very frequently lack any options for changing the font used by the program, which I consider a capital sin for any GUI program. Moreover, for many Java programs, the installers that use a GUI crash and die on monitors with more than 8 bits per pixel. In my experience, Java is always the farthest from its claim of "write once, run everywhere".

What is there not to understand? This is not hard or complex. I am guessing that if you are not getting what the problem is here, you must be pretty young and have good eyesight. Perhaps you also mainly use single-screen setups. (By comparison, I am 53, myopic, and normally wear Varilux spectacles to give me the different focal lengths which my eyes' lenses can no longer provide.)

As an example: I have 3 screens on my work setup. A 24" screen in portrait (80 DPI), a 22" in landscape (87 DPI), and the laptop's built-in 12½" (110 DPI). The difference between the 2 external screens is not really noticeable, but the internal screen's density is significantly higher. When I put windows on that display, their contents become much smaller, to the extent that they are hard to read. That is the problem here.

On my Macs, when I attach an external screen, the OS adjusts the scaling factor so that windows, text, controls, and so on are the same size on all displays. You can even position a window over the split between monitors and the OS adjusts the contents on the fly. X.org can't do that. You get one global scaling factor for all displays. I would like my internal screen to show its contents at 1¼× the size of the external screens, but X.org can't do that. Wayland can, but I don't like most of the desktops that run on Wayland. I can't stand GNOME and dislike KDE. I prefer Xfce or Ubuntu Unity -- but they don't run on Wayland. (Yet, in the case of Xfce.)

This leaves 2 less-than-ideal choices: run LCDs at non-native resolutions in order to roughly match on-screen sizes, or put up with things changing size and possibly being unreadably small. These options are tolerable on standard-definition screens, but they aren't on HiDPI screens. Worse still, a mixture of HiDPI and SD screens can be totally unusable.

I've tried Ubuntu, Fedora, and Regolith Linux. Regolith was the easiest to set up for HiDPI and the best solution for me. Fedora used to be decent, but the fractional scaling just doesn't work.

Current versions of VSCode support running under Wayland, though they don't do so by default. I would suggest running under Wayland, and making all apps use Wayland natively.

Even when run under Wayland, VSCode looks blurry with a scaling of 150%... Wayland advocates love to mention that the real issue is using fractional scaling at all, but Windows and macOS can deal with it just fine.

VSCode/VSCodium should render sharply with fractional scaling on Wayland, as long as you run the application with these command line options:[1]

    --enable-features=UseOzonePlatform --ozone-platform=wayland

You probably want to update your desktop entries to lock in these options by default.[2]

[1] https://wiki.archlinux.org/title/Visual_Studio_Code#Running_...

> Even when run under Wayland, VSCode looks blurry with a scaling of 150%...

It shouldn't when running natively; it should render everything at the correct size rather than scaling it. X apps get blurry because they're rendered large and downscaled; that shouldn't happen for native Wayland apps.
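A sketch of locking those options into a desktop entry, as suggested above; the paths and entry name are assumptions and vary by distro:

    cp /usr/share/applications/code.desktop ~/.local/share/applications/
    # then edit the copy so that its Exec line reads something like:
    #   Exec=/usr/bin/code --enable-features=UseOzonePlatform --ozone-platform=wayland %F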
    [X11]
    ServerArguments=-dpi 240
    EnableHiDPI=true
is what I've been using in a no-DE setup on a 4K screen.

$ cat .Xresources
    Xft.dpi: 144
    URxvt.font: xft:Anonymice Powerline:style=Regular:size=18
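For completeness: such resources take effect once merged into the X resource database, typically from ~/.xinitrc:

    xrdb -merge ~/.Xresources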