It's all very depressing. Like most of us here, I have become increasingly frustrated with the state of the Mac (the Touch Bar, bad keyboards, odd choices that fit neither creators nor developers well, and so on). No point rehashing it all.
But where does that leave us? I spent a year on Windows 10 not so long ago, on upper-end laptop hardware with a HiDPI screen, and it was less fun and more problem-prone than macOS - especially when it came to the fast day-to-day stuff we never think about, such as PDF printing, quick preview, etc. Plus in Windows you get the mysterious stop-the-world stalls (which I suspect are OS-level misbehaviors when some remote network connection has become laggy). There may be some use cases where Windows is actually faster/nicer/better than macOS, but they don't include non-MS development, Adobe CC work, or regular file management. Maybe games are it.
Desktop Linux (even on standard desktop computers, not just specialized laptops) is far from perfect. Getting everything to work is a lot of effort, and then mysteriously something will stop working. Every window manager has some rough edges or cases where it's unpleasant. Getting and keeping a Linux Desktop in good working order is harder than it should be at this age of Linux.
ChromeOS?... I know some people who use that as their daily driver. Obviously there are desktop apps that they don't need/use (Adobe CC, for example). But what happens when some automated Google thing triggers an account ban? Bricked laptop and no data?
Things really do not seem to be improving on the whole. Old problems are replaced with new problems, and the layers and separated concerns mean more complexity and more difficulty solving problems.
> Getting and keeping a Linux Desktop in good working order is harder than it should be at this age of Linux.
Maybe because all the developers ran away to Macs, instead of sticking around, dogfooding, and fixing bugs? Free software is a collaborative social enterprise - it's not really anyone's job to make free stuff for you to use. If, as a developer, you've taken the easy path of proprietary software for years, and now, after finally getting fed up with being abused by a company that doesn't have your best interests at heart, you return to the world of Free Software only to complain that it's still unpolished - then I'm afraid you are rather entitled, and deserve a serenade from the world's tiniest open source violin.
And make no mistake, the kinds of "rough edges" you refer to are not hard computer science problems. They're broadly just the kinds of everyday maintenance things that inevitably crop up in a huge software project on an ongoing basis and are easily fixed by many eyes and a little graft. The more eyes, and the more graft, the more polished the system.
For what it's worth, Desktop Linux for all its faults is still light years ahead of the competition. Far from perfect, as you say, but it really is the least bad choice. As they say - the best time to rally around Free Software was 10 years ago, and the second best time is now.
I have been dealing with computers since I started CS in school 18 years ago, and my preferences have been changing ever since.
I used to love playing with computers and trying different Linux distributions, even if I had to spend 5 hours to get a usable external monitor (actually, spending that time was one of the best parts), but that changed, mostly because of my needs, but also because of my finances.
When I had to start being more productive and could afford it, I bought my first MacBook, and it lasted ten years! At that point I realized that maybe, for me, it was worth spending that money and using the time I would have spent configuring a keyboard on something else.
I feel I was Apple's target audience at the time with some of their products; maybe now things have changed and I am not anymore.
And I guess yes, marketing is what is driving them, but I think they are abandoning part of what used to be their target audience and, hopefully, someone will fill that void with something that makes me as happy as when I bought my first MacBook.
BTW, now I am using a Pixelbook and a MacBook for my personal stuff and Windows professionally, and although it is getting worse and worse, I will still keep buying them.
While I sympathize with your political position on free software, I have come to the unfortunate conclusion that this point of view is idealistic[1] and short-sighted.
The vast majority of engineers don't commit to switching to free software not because not enough hearts and minds have been convinced yet, but because the entire economy and the basis of everyone's material existence depend on a system that is, at this moment, market-based.
The popularity of proprietary software stacks is ultimately structural, as are the problems and caveats of proprietary software. Engineers are, in the vast majority, dependent on selling their labor to the commercial employment market for their livelihood, in many cases an extreme amount of labor in a highly competitive environment.
It is absolutely great that many engineers dedicate free labor to free software, and it is completely unreasonable to expect that anything at all could make the vast majority do it. Not an unreasonable expectation of any individual, but of the structure.
[1] in the sense of philosophical idealism vs materialism
This! Love your reply. I'm also tired of developers criticizing the Linux desktop for not being friendly enough and running away to proprietary systems instead of trying to help fix what they don't like about the Linux desktop. That's the good thing about open source: don't like it? Then fix it, send me a patch, and stop complaining. It's not like proprietary systems, where all you can do is complain (you can't fix anything).
Maybe they want to get stuff done, other than puttering around in the tooling. It's a nice option to have, but would be ruinous to productivity.
Also, design by committee rarely produces good UI. You can patch little UI bugs, but if you really want to holistically improve the UI it's a huge undertaking.
Just because you're a developer doesn't mean you want to hack on every tool you use. Software has gotten way too big for that.
I switched from macOS to Linux [Fedora now] about a year and a half ago, out of the same frustrations that affect a lotta folks --
It took a little while to adjust, but since there are a lot of similarities between the Linux and Darwin command lines [and Linux package management being better than Homebrew, by my lights], it wasn't very long before I was like "Damn, why didn't I switch 20 years ago!!!" Plus now my hardware works [edit : and is user-serviceable], is reasonably ergonomic, and OS updates don't break my existing programs...
I agree. And I just did the opposite. Been using Fedora for a year or so on an X1 Carbon and switched to having to use a new 2019 MacBook Pro for work. I had fond memories of Macs when I last used them 6 years ago, but today... wow. Not that great of an experience. At this point, for work (eng, devops/SRE, etc.) I'd rather have Linux, and I really wish I could use my X1 Carbon. :(
I see no real advantage to dealing with the Mac platform today if what you want is a "unix" environment. On the Linux of today, the minor annoyances are worth it - and they really are very minor on good hardware and a mainstream distro like Fedora or Ubuntu. Plus you get the tooling on the platform it was developed for - real Docker, real package management, etc.
Don't get me wrong, there are still some nice things in macOS and the hardware (yes, the trackpad is nice), but they are no longer compelling enough arguments - at least for me.
Now if only we could get a few of these minor annoyances fixed in the Linux desktop (e.g. Gnome) then I think it might really be the "year of Linux on the desktop" ;)
What I keep hoping for is for Microsoft to announce a new Windows with a Windows desktop & API on top of Linux--a gradual, parallel transition like the decade-long DOS- to NT-based Windows transition.
Since MS seems to have grown tired of trying to "sell" Windows, not sure whether to fight pirates or support them, etc., and is looking to services, most of which run on Linux, they could give "Lindows" away, like Google with Android but with more thought behind it, and create the kind of focus that could get all the things working reliably in a desktop Linux. Other Linux distros could then start with working versions of everything and offer customization options in only the areas that a specific customer segment wanted to manage for themselves.
Spot on re: tooling -- likewise, I too gave up on Gnome early on, and also dislike KDE [too heavy] -- a friend turned me on to LXDE and it's been smooth sailing since.
If you haven't tried KDE Plasma in the last year or two, it's worth revisiting. I tentatively tried KDE 4.x back when it was the hot thing, and was disgusted by how it seemed to bring even a top-spec gaming desktop to its knees. Now I run Plasma 5 on my laptop (on Void Linux - I feel that package maintainers make a big difference here), and it's far and away the best desktop experience I've ever had.
I've finally recovered that sense I always got from the Windows 95 shell, that's been missing in every desktop environment until now - that sense of "wow, they've really thought this through".
What I want fixed in the Linux desktop is IT support. Everything my company does or makes works with Mac or Windows, but I can't even get config values to set for anything else.
I've used Linux several times over the years and the hardware never totally works. I've always had trackpad issues, issues with suspend and hibernate, wifi issues, screen resolution issues, battery life issues, sound issues, driver issues etc.
It 'works' in terms of you can technically do work on it, but you have to make a huge number of concessions and this is before considering it has less application support.
[Edit]: The replies suggest this is mostly still the case, people say it works flawlessly - except for [insert issue here]
Considering that HN is full of programmers and technical users, and literally every time Linux is mentioned people go "I had a problem getting Z working, but I did that and...", "X and Y did not work", "U is not supported", I'd say hit and miss.
Case in point, just from this thread, and from people who say they are otherwise satisfied:
"The worst of my Linux woes were the usual suspend/hibernate issues running Ubuntu on company-provided Thinkpads."
"Suspend/hibernate works on thinkpads with a moderately recent Linux kernel and a bios toggle."
"The only issues I have had is that the sound output sometimes does not automatically switch to/from HDMI when I unplug a TV, and that gnome gets slightly laggy after a really long uptime (its a really weak laptop performance wise)."
"I did have issues setting up certain drivers (i.e. the bloody wifi which tbh is still a bit iffy on my machine) but otherwise it's been amazing."
"actually, on one of my workstations I have a ten-year-old Quadro 4000 that was being a pain in the neck on Fedora 29 -- but the fellows over at LinuxQuestions sorted me out in two shakes."
"I have the latest X1 and almost nobody has a working mic. Also battery life sucks and Bluetoothd always needs a restart after suspends."
"I have a gen 6 and no issues. Is it a 7th gen? BTW, what distro? I'm on Fedora 30. I would check that you have Linux compat turned on in the firmware and patch to the latest firmware."
"My only Linux laptop was a Fujitsu Lifebook. Turns out laptop manufacturers are garbage at writing ACPI drivers for their own hardware, and so Microsoft has been going around quietly patching them so nobody notices. I ended up having to create my own driver for the lifebook. I cobbled from two other people's failed attempts at making a better one (each fixed different problems) and some fixes from the newest model in that product line. At no point in time did I think it would be fun to write ACPI drivers."
I haven't touched windows or mac os for 15+ years. I definitely recall having to recompile my kernel, tweak default configurations, or hack up broken code to get my desktops working back in the day, but it really hasn't been an issue for at least 7 or 8 years now. One thing I started doing that makes a huge difference is only buying parts and systems that appear on a major distro's compatibility list.
You got me, though, I did have issues with my bluetooth mouse losing connection during suspend on my latest Thinkpad. You could get it working by turning the mouse off then on again, but eventually I fixed it for good with a two line shell script.
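For anyone hitting the same thing, here's a minimal sketch of that kind of fix - a systemd system-sleep hook that bounces the bluetooth service after resume. The path and the service name are assumptions; check what your distro uses:

```shell
#!/bin/sh
# Hypothetical hook, e.g. /lib/systemd/system-sleep/bt-restart.sh (make it executable).
# systemd runs system-sleep hooks with $1 = "pre" before suspend
# and $1 = "post" after resume.
if [ "$1" = "post" ]; then
    systemctl restart bluetooth
fi
```

Whether restarting bluetoothd is enough (versus power-cycling the adapter) will depend on the hardware.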
Counterpoint: we just issued a shitload of the latest-gen X1 Carbons (whatever the first gen to get USB-C is) and it's all flawless out of the box on Ubuntu. I guess they now ship with a firmware update that fixes the S3 sleep issues.
Everything on the ThinkPad X201 works out of the box with Alpine Linux. Considering that's one of the smaller distros, I'm sure the other ones work great (unless of course you run Chrome on GNOME 3, but that's going to suck even on recent computers).
>[Edit]: The replies suggest this is mostly still the case, people say it works flawlessly - except for [insert issue here]
I count 7 or 8 broadly unconditional "it works fine" child comments. 1 "it mostly works", 1 "I had to write my own ACPI driver", 1 "it works fine except HDMI sound switching isn't automatic sometimes", 1 "drivers were hard to set-up, WiFi is iffy", 1 "it's not perfect so try it first".
IMO that's majority positive - more than can be said for e.g. macOS which fails to do basic things like mount and copy files off an Android phone out of the box.
I don't think your conclusion is a fair one (or perhaps it was premature) and is perhaps tainted by your own negative experiences.
At my last job we had people very happy with linux, with the following caveats:
- they didn't care about moving it around for half a day and keeping decent battery life; for most of them it was effectively a desktop.
- they didn't care about any CJK support. None of them would be bitching about Chinese fonts on Japanese pages or, worse, IME issues.
- they didn't need any peripherals besides screens, keyboards and mice. We had to use a USB dongle to project in a videoconference room at a client's; I have no hope it would have worked on Linux. Printers/scanners would be a nightmare too (and it's already not good on other platforms, I concede).
Linux workstations are plenty usable, hence a lot of positive feedback. But we should acknowledge it's not for every use case, and it can need tweaking in parts that differ for each user. macOS, with all its current flaws, still does a lot of things better than most (any?) Linuxes.
Isn't the whole point of OP one giant "it works flawlessly - except for xxx" post? Except in this case, the xxx is a much more seriously negative experience compared to a couple small up-front issues with getting hardware set up the correct way.
It's pretty clear that no OS is going to be perfect; but the message I take away is that Linux is at the point where you have the ability to tweak and adjust your system so that it works flawlessly after some effort (as in my case: switched ~1.5 years ago, zero problems after initially checking the driver situation). On the other hand, macOS, even from a technically savvy point of view, is unfriendly and beyond the user's power to fix.
In my experience most hardware works flawlessly, although admittedly some hardware is better-supported. I've owned two (soon three) Dell XPS 13 laptops since 2013 and have always had a really smooth experience running Ubuntu. The worst of my Linux woes were the usual suspend/hibernate issues running Ubuntu on company-provided Thinkpads.
I've also had to use various Macs for work in the past and have had terrible issues with wireless networking and bluetooth.
"The worst of my Linux woes were the usual suspend/hibernate issues running Ubuntu on company-provided Thinkpads."
I don't doubt that is your experience but I don't know why. Lenovo are pretty Linux friendly and suspend/restore has been a fixed thing in general for years. Maybe you need to update your BIOS.
I can provide you with a very long list of fucked up systems that will, say, not run Windows 10 1903, depending on already installed software and other silly stuff.
I'm typing this on a Dell laptop that runs Arch Linux.
I run Arch Linux on a gaming laptop (a Lenovo Legion) and it's perfect for my needs. And everything works! Linux becomes better and more compatible with each passing year. This laptop is a portable development machine with a nice keyboard and a powerful enough processor (I specifically wanted a 45W CPU).
All the "pro" laptops seem to be focused on smaller sizes, and on premium/luxury features and price. I'm glad cheap gaming laptops running Linux are a perfect fit for my use case.
If you are not a gamer and don't need a beefy GPU for development (most people don't), you can also look at more "workstation" grade laptops from e.g. Lenovo (P-series, that kinda thing) or Dell. You'll get a few enterprise features and spec bumps that might be beneficial versus 'gaming' features (e.g.: Xeon, ECC, more NVMe + free slots, free RAM slot, RAM up to 32 or even 64GB).
But yeah, gaming machines always make for great dev stations, whether desktop or laptop.
I seem to remember borked cifs/smb support at one point being the most painful.
Edit: not sure why the downvote... MacOS, don't remember when, switched to custom cifs network sharing, and it caused a LOT of issues for my workflow at the time.
1) For the record, all my Linux machines came preinstalled with Windows;
2) I don't think it's at all unfair to macOS / Mac hardware. Macs used to be great, hardware- and software-wise, for development and other "making stuff" type work, and they used to be stone reliable. There's been a real qualitative change in Apple's approach to 'pro' users [for lack of a better term] -- OS updates having more unexpected / unwelcome effects, the number and type of physical ports on systems becoming smaller and less practical [not even one single normal USB port on this year's MBP!], user-serviceability + upgradeability of hardware diminishing over time to practically zero -- the situation is well bad.
I was a devoted macOS / OS X user, but as I mentioned in the earlier comment, I got fed up and jumped ship for these reasons, which had a real impact on my productivity and contentedness-as-a-user.
My ASUS Zenbook (like 3 years old) works absolutely flawlessly. I actually had full crashes on windows, but none on linux. I never thought it would be this good.
The only issues I have had is that the sound output sometimes does not automatically switch to/from HDMI when I unplug a TV, and that gnome gets slightly laggy after a really long uptime (its a really weak laptop performance wise). And I'm a person who gets very annoyed with small issues, I'm just super happy. I have a real native terminal with package management, and a window system that I like, and all software except photoshop.
FWIW, I switched from using a Mac for a decade to using Ubuntu on a brand new Dell XPS and it works perfectly. Even though Linux users are a tiny minority of users, I find they are quite helpful at publishing their problems and eventual solutions. I did have issues setting up certain drivers (i.e. the bloody wifi which tbh is still a bit iffy on my machine) but otherwise it's been amazing.
I'd rather have 100 bugs that I could fix in principle than 1 bug that I just have to deal with because some company thinks it's a feature. And I'd rather be part of thousands of engaged tinkerer types interested in improving the support community than be part of millions that would rather convince me that their expensive hardware is proportionally better instead of actually talking about the tech.
I've been using Linux as a daily driver since high school (15 years ago). Lately though, I have noticed that we Linux types are insufferable in our own ways...
“Lately though, I have noticed that we Linux types are insufferable in our own ways...”
Indeed, but this is the (sour) price of 'victory' too: it means Linux is going mainstream now. Hence, it's becoming more 'popular' with all that entails...
Linux is best 'lived' in professional circles nowadays, imho: specific blogs, forums and channels with solution-driven people. I find myself more and more attracted to the RHEL/CentOS/Fedora space for this reason, too.
>I've used Linux several times over the years and the hardware never totally works. I've always had trackpad issues, issues with suspend and hibernate, wifi issues, screen resolution issues, battery life issues, sound issues, driver issues etc.
I have a dell XPS13 that came pre-loaded with Ubuntu. It works perfectly out of the box and the hardware quality is fantastic.
I recently built a home media server/PC and decided to try Manjaro. Amazingly, everything worked out of the box. I actually had a Ubuntu live USB ready to go in case I hit issues, but now I’m in love with Manjaro + KDE :)
As the comments below suggest [and I also run Thinkpads for portables] -- yep, it works real good for me. Fedora 30 especially has been essentially flawless in my case. No issues with suspend, drivers [even for notoriously fickle stuff like printers], sound, etc -- just solid performance.
EDIT: actually, on one of my workstations I have a ten-year-old Quadro 4000 that was being a pain in the neck on Fedora 29 -- but the fellows over at LinuxQuestions sorted me out in two shakes.
My only Linux laptop was a Fujitsu Lifebook. Turns out laptop manufacturers are garbage at writing ACPI drivers for their own hardware, and so Microsoft has been going around quietly patching them so nobody notices. I ended up having to create my own driver for the lifebook. I cobbled from two other people's failed attempts at making a better one (each fixed different problems) and some fixes from the newest model in that product line. At no point in time did I think it would be fun to write ACPI drivers.
I uploaded it somewhere so nobody else would have to deal with this bullshit, but you still had to recompile the kernel to get the new version to override the firmware.
At the time they were still talking about how Linux was going to take over the desktop, while the early numbers were already in that laptops were killing desktops. Linux is taking over the server room, but it's essentially nonexistent anywhere else, and it truly has earned that honor.
I've always specifically selected and researched my hardware before buying. Typically Thinkpads are a good bet - I've used several different generations of The X1 and T series with no problems. I'm currently using an X1 carbon that has great battery life, suspends properly, handles external displays, etc.
I have a gen 6 and no issues. Is it a 7th gen? BTW, what distro? I'm on Fedora 30.
I would check that you have Linux compat turned on in the firmware and patch to the latest firmware. Also, running TLP helps, depending on the distro. I get good battery life, the mic works fine, and no BT issues after sleep/wake.
This is not true on the 6th gen. I don't know about the 7th gen, though... if it is true on the 7th gen, it would be a shame.
Although traditionally Lenovo has done well with Linux, I will say this is where Dell does better - actually certifying and supporting some of its laptops on Linux for devs.
Mainstream distributions like Ubuntu and Fedora tend to have much better hardware support than other distros. I've used Linux since Ubuntu 7.04, and hardware support has made leaps and bounds since then.
That said, it's still not perfect, and warrants a trial on a throwaway hard drive if absolute hardware functionality is a must.
As long as the 38-step install process works. ;-) I tried; it locked up trying to load the GUI, specifically the 5.3-kernel installer for Navi support. Wound up reverting to Ubuntu LTS; will re-evaluate in 6 months when things stabilize for my hardware.
I've done the install manually several times and it is crystal clear what you have to do once you understand how Linux works from boot to the OS. Of course, it's not recommended for beginners or those who have no interest in how the internals work.
Sorry to hear it did not work for Navi. Was the wiki any help in understanding what goes wrong?
Not really... it pretty much started loading, then hard-locked... I couldn't navigate (Ctrl-Alt-F#) to a usable terminal, etc. At that point, I'd been through several failed attempts and just jumped to Ubuntu LTS, since it's supported by AMD, and dealt with the wifi and onboard audio not working.
I started with Pop!_OS with a 5.x kernel (5.2 for the wifi drivers, though I'm not using it). Onboard audio was also weird: the UI would show audio out, but alsamixer was needed to raise "headset" to control the port. When I upgraded to 5.3 and dropped in the lib files for Navi 10, it booted with the card swapped in but stuck at 1080p (on a big 4K screen); after messing with it, I couldn't boot to a GUI but was still able to use the terminal. I decided to give Manjaro Architect a try (since you can choose your kernel), and it gave the same issue Arch itself did... hard-locking during boot.
I would like to know more, but at some point, I just want to get work done...
Yep! I dig FreeBSD also, and strongly considered it especially for that reason -- just ended up going with Linux because I had a little more experience with it [having remotely used / administered Linux servers, etc.]
Just don't buy hardware that is too new. X570 platform and RX 5700 XT, and issues aplenty. Not to mention absolutely abysmal RGB support from hardware vendors.
It's frustrating because it's the "least worst" option.
I use MacOS because it has a thoughtful, consistent, coherent, beautiful user interface and desktop environment. The things I use all day are basically a web browser, mail client, code editor, and terminal. I can do actual work on any BSD really, there is no proprietary app that causes lock-in, I don't use iCloud or any of their services.
All I want is their desktop and core system apps. If Apple stripped down the OS back to Mac OS X Snow Leopard standards, and charged $129, I would pay for it.
If there were an equivalent desktop environment for BSD that replicated the MacOS desktop environment, I would pay money for this. Charge customers to hire full time designers, developers etc.
That is definitely the most frustrating thing about all of this over the past few years. Even after customizing Windows 10 to get rid of the nonsense nobody wants and adding what's missing just to be able to work on the damn thing, it's still not as "least worst" as macOS. It's almost as if all commercial OS development moved to set-top boxes and mobile, and nobody gives a crap anymore.
I'll probably still stick to macOS on my mobile hardware (Linux and BSD on the fixed machines, embedded and servers), as it is still least worst, but I miss the feeling of high stability and productivity you would normally get when you work on your machine and don't have to touch any of the other flavours.
Right now, all any other vendor has to do (besides the create-a-BSD-desktop part) is good hardware integration, because that is more effective than people might think. Even with the whole butterfly-keyboard crap, the whole package deal is unbeatable. The only time I ever had hardware/firmware issues was back in the 90s, when OpenFirmware got sad because one of the data lines of the ROM was corroding and the SMU would reset every boot, making sleep unreliable.
Having a machine with a good hardware-software relationship down to the firmware, with no weird double power-ons or a bunch of stupid splash screens, one where you can just add your tools and work - it's the best thing. And it used to be exactly that when you got a Mac... any Mac, even if you didn't get a powerful one. It always worked the same way, always delivered consistently (unless you broke it yourself), always stayed out of the way so you could do what you actually came to do.
What makes the MacOS UI so much better than say Gnome or Enlightenment? I personally can't stand most of the Mac UI experience, however, I am very curious what you (and others) love about it that cannot be replicated on a different OS?
Font rendering for one. It's really nice. IME, text quality on a Linux desktop varies even between applications, depending on what toolkit they happen to be using.
That might be true, some apps just don't follow standards well (on MacOS too btw, there are old and ugly UIs from third-party vendors all over the place).
But when you're living between a terminal, editor and browser, and fonts are perfect on these 3 (or whatever else you use)... it's hard to justify using a lesser and more expensive platform, just to correct the 1% of the time some app doesn't look great.
In fact, if we're talking "consistency" in a functional way - not disturbing the user's "flow" while working - I'd argue that bad or extremely limited UX like that of Windows, and now sadly macOS too (by comparison, and because it has gotten worse in the last five years), is a much bigger problem than font rendering for most workflows.
Case in point: if you really care, you can fix font rendering for pretty much every app (and select anti-aliasing parameters like strength and RGB ordering to match your external display's panel type and pixel "pitch"), whereas macOS or Windows will be hit-or-miss with some models and you just can't help it. I haven't needed to on Ubuntu or Fedora, but I know that as of 2017 Arch lets you apply general font-rendering settings across GTK and Qt, and you can use terminals like urxvt to fully control such things.
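To make that concrete, the per-user knobs live in fontconfig. A sketch of a ~/.config/fontconfig/fonts.conf - the rgb/hintslight values are just examples for a standard RGB-striped panel, not recommendations:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting"   mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
    <!-- subpixel order: rgb/bgr/vrgb/vbgr, depending on your panel -->
    <edit name="rgba"      mode="assign"><const>rgb</const></edit>
    <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
  </match>
</fontconfig>
```

Both GTK and Qt apps go through fontconfig, which is why one file can cover most of the desktop.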
Yes. Snow Leopard was the greatest OS release of all time!
I actually went to WWDC 2008 when it was announced and they handed out CDs for it -- no new user-level features -- just a hardened OS (Grand Central Dispatch, kernel-level threading, and other under-the-hood changes).
Sigh. Marketing now rules the world. Look, new emojis!
Things are worse than ever before. macOS stability has taken a hit, but it's not like Windows machines have gotten better. Windows 10 is still a total clusterfuck when it comes to switching between a high dpi internal monitor and an external monitor. The other day my X1 Carbon started randomly powering down (not hard-locking - a controlled shutdown, even installing updates). It rebooted four different times in one evening.
Things are just bad now that Steve Jobs is dead. We just have to get used to that.
> Windows 10 is still a total clusterfuck when it comes to switching between a high dpi internal monitor and an external monitor
I really think this must depend on the hardware; Windows 10 switches between monitors fine for me, on multiple laptops. Indeed, I only actually have a problem with this on MacOS, where my MBP even crashes on occasion when switching to external displays.
I have a Windows 10 laptop with 150% scaling on the built in display, and 125% scaling on the monitor I plug into; both high DPI. Whenever I plug in or unplug I get treated to 5-10 seconds of seizure-like moving and resizing of windows, and some blinking of the displays. It usually doesn't get things right, for example windows that were on half the screen will be neither on half nor whole, or they'll be slightly off where they should be but mostly right.
You might want to try the Dell XPS 9380 Developer Edition that comes pre-loaded and configured with Ubuntu 18.04. I switched to it from a 2015 mac laptop I had been using. Set up was pretty easy and everything just worked out of the box. It's also nice that the performance per dollar ratio is better than a new macbook pro.
My only complaint is that the trackpad is a little small for my hands, but the sturdy keyboard with real function keys more than makes up for it.
I've come to accept that most OSes are inevitably broken in different ways. OS X has always broken my dev environment on every update. So over three years ago I switched to a ThinkPad X1 Carbon with Xubuntu, and I have no issues other than having to change my DPI when I connect/disconnect to a non-HiDPI screen. Then I recently built a gaming PC with an AMD CPU and an Nvidia GPU, which did not go as smoothly with Linux: I had to install experimental Nvidia drivers, audio is a little wonky, and sometimes the wifi doesn't work and I have to restart. But overall I'm quite happy on Xubuntu, and it gives me fewer problems or annoying prompts than macOS or Windows ever did. For gaming I use Windows, of course.
It is depressing, but I'll tell you which of those options is actually, actively interested in developers and enthusiasts: Linux. Give it the same effort people spend buying exactly the hardware Apple says to buy and then installing (or writing) apps that fix macOS's clunky window management (and Homebrew, for that matter); consider where it's going as well as where it is; and I think you'll find something for you.
Eh... I've been on Ubuntu 18.04 for about 1.5 years now on my Dell Precision 5510. Zero problems, all hardware works out of the box. Suspend and hibernate still works flawlessly.
Do you have any tips for getting the touchpad to work well? Mine picks up palm touches, and the only solution I've found is to disable the edges of the touchpad.
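For what it's worth, palm rejection can often be tuned at the driver level. A sketch, assuming the laptop uses the synaptics X driver; the file path and threshold values are illustrative, not known-good values for this specific machine (see the synaptics(4) man page for exact semantics):

```
# /etc/X11/xorg.conf.d/70-synaptics.conf  (path varies by distro; a sketch,
# not a known-good config for this particular laptop)
Section "InputClass"
    Identifier "touchpad palm rejection"
    MatchIsTouchpad "on"
    Driver "synaptics"
    Option "PalmDetect" "1"      # ignore contacts that look like a resting palm
    Option "PalmMinWidth" "8"    # contact width above which it's treated as a palm
    Option "PalmMinZ" "100"      # pressure above which it's treated as a palm
EndSection
```

If the machine uses libinput instead, palm detection is built in and mostly automatic, so the fix there is usually updating libinput rather than adding options.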
>Desktop Linux (...) is far from perfect. Getting everything to work is a lot of effort, and then mysteriously something will stop working.
Couldn't disagree more. Gnome took a bit of tweaking but it has been performant, stable, and exactly what I want. I recently changed jobs and am forced to use a mbpro. I hate it. It's slow and painful if not impossible to customize.
That’s your problem, stop looking for a windows/OS X replacement. The ideas are fundamentally broken; a lot of them are built around pushing you toward just consuming things from other people.
You can't just buy any hardware. I'm sure Ubuntu and other distros publish a hardware guide. I have a ThinkPad T490; I tried out Manjaro, SUSE, and Fedora, and ended up settling with Kubuntu | Budgie. No hiccups, everything works out of the box. I have 40 GB of RAM, something I could never get on the current generation of MacBooks. The keyboard is excellent, and so is the TrackPoint; the only bad thing hardware-wise is the speakers. I also have a personal 2013 Retina MacBook Pro, and use a 2015 MacBook Pro for work. At my old job I used the 2018 MacBook Pro, a shitty computer that I would compare to my 2012 Dell Inspiron in terms of shittiness. Forgot to mention: my ThinkPad display is 500 nits, 100% ARGB. KDE has fractional scaling to handle HiDPI displays.
I moved to Arch Linux a few years back. So did wifey but she doesn't care about what OS she uses. We both use KDE <thingie> for a Window Manager, Libre Office and Evolution (Exchange is involved). Printing is of course CUPS, which is lovely and simply works. Lots of other things installed from the one set of repos.
KDE allows you to "lock" the desktop icons/widgets in place. That is a feature sorely missing from all other WMs (Mac/MS/et al.) and for me is a killer feature. Some of my end users have a habit of smearing icons (launchers) all over the place.
Desktop Linux is way better than, say, Desktop Windows (by the OP's metric). Updates take a few minutes rather than, in some cases, hours. Fixing things does not involve sfc /scannow; but then, that doesn't work on any Windows box anyway unless it has spinning rust and regularly loses power without a battery-backed cache.
I update my wife's laptop via SSH when she is using it and I throw a reboot job at it out of hours via cron.
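The out-of-hours reboot can be a single root cron entry; the schedule here is an illustrative placeholder, not the poster's actual setup:

```
# edit with: crontab -e (as root)
# minute  hour  day-of-month  month  day-of-week  command
30 4 * * 1 /sbin/shutdown -r now    # reboot at 04:30 every Monday
```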
My wife is happy with what she needs to get to - Facebook, email, some odd Flash games on FB, and a few other things. I'm sure that Mac and Windows sysadmins can all say the same (I'm one of those as well.)
Interesting you mention Adobe CC as being better on macOS. For the past couple of years I've seen most people mentioning that Windows is better/faster for these apps.
> to work is a lot of effort, and then mysteriously something will stop working.
Oh please. This lie keeps getting repeated every single time this topic comes up. Meanwhile, people who actually use Linux every single day have had basically zero issues keeping things running and working for years. I have 4 machines running different flavors, getting updates all the time, with nothing mysteriously breaking, and this for years. Sure, you will always find the one user saying the exact opposite, but among all the Linux users I interact with this is not the norm.
Sample bias? Linux users are people who aren't sufficiently bothered by Linux's flaws or don't encounter any flaws due to their HW / usage patterns. Same as any other OS.
To say the parent poster is lying is a bit much. You have no idea what kind of stuff they're trying to do.
Idk about your experiences but I've been using Linux Mint for over a year now and it's just been working smoothly out of the box (better than Windows even). The only part where I had some problems initially were graphics drivers but that's only because I messed with them unnecessarily.
it's far from an ideal solution, but the only thing that keeps me sane is to have different computers for their strengths.
For example, one of the author's use cases is "Because I’m an idiot with reasons, I have a python daemon that launches as root via launchd." - i shuddered just reading this, i'd never attempt to do that with anything other than linux. but chromeos is perfect for my little laptop i keep in the living room to check my email on. i was also happy using an iPad for the same thing. My software dev machine is a linux desktop, but if i needed a work laptop there's no way i'd put linux on it.
maybe one day there will be an OS that i can use for all the things without complaint, but right now that doesn't seem to be a reality, and i've stopped pretending it can be.
Things are so broken here at Apple. I joined about 4 years ago.
I am awed by the fact that we manage to release any software at all, let alone functional software.
The biggest problem is communication. No one fucking communicates.
- No communication between orgs. Tons of bureaucratic tape to cut through just to get a hand on someone working on a different product
- Barely any communication between teams. Literally every group of 4 people is in a little silo with no incentive to go outside it
- Broken management structure. I have had many managers (a red flag in itself) but even worse none of the managers take suggestions from engineers. Everything is purely top down. If an engineer realizes there is a problem on a macro scale they cannot fix it. It is literally impossible to unite more than 1.5 teams to get anything done.
- So what happens is that you’re working on a product that’s part of another product but you never talked to any other teams or orgs on how to make your product fit in
- 10 different teams working on the same products and services. Zero unification means you are literally wasting developers and internally fragmenting every tool. Even worse, these teams compete for internal market domination
- Culture of secrecy means nothing gets fucking done. You file a bug report and you can’t even see it any more for some orgs
This is only the tip of the iceberg. There are fundamental and serious problems at Apple that no one in management gives a shit about solving. Any time engineers try to congregate or work on anything constructive with another team, they are shot down.
The only time I have seen cross-team developers working together has been to deal with critical bugs.
Because of the lack of communication, none of management’s goals align. They are all out of sync and poorly thought out. So year after year your manager has something they want you to implement but the feature for the year is bullshit because it makes no sense and is just there to pad the manager’s resume.
And you can’t speak out about this. Apple doesn’t take well to employees complaining. Even then, because of the lack of organization there is no one you could raise these issues with.
It absolutely isn't the individual employee developer's job or responsibility to try to fix corporate culture. Almost anyone here reading this is a line-worker developer, and trying to take on the job responsibilities of C-level staff is setting yourself up for disappointment and failure.
Any company larger than a few dozen people is entrenched - there will be a hierarchy and the top will dictate the order of things. If they are paying you to write pointless software then you are either content with the paycheck and probably a lot of free time at work if nothing really matters or can go somewhere else to find meaningful work.
But seriously, the larger the company, the less you should ever think of yourself as some engineering talent who can change the system. You change the system in those circumstances by realizing the failure, networking with your peers, and starting your own company to do it better. That is, assuming you didn't sign a deal with the devil by noncompeting your way into being stuck. You were hired to write code, not to fix corporate culture, largely because most large corporations have layers of management dedicated to ensuring it is not fixed.
Yes, it isn't, but who else will do that for them? Culture is only possible when people talk to each other and/or exchange thoughts in other ways. If normal people can't do that, there's no single culture, just many little cultures. Remember that organizations are people, not some magical beings from a different dimension.
Culture is one of those things where once it's broken, it's usually easier to let it die than to fix it. Go join some smaller organization that doesn't have a broken culture and help it succeed. Once enough people do that, the small company becomes a big company, the big company rots on the vine and eventually goes bankrupt, and the cycle starts again.
This. Culture is one of the only things that every employee can change by themselves, just by going out of their way to find and talk to people. It takes work, but also takes absolutely zero permission or red tape, and it makes a huge difference. Talk to a new stranger at work every day, and in a month, you'll be solving problems nobody knew about.
> Assuming you didn't sign a deal with the devil by noncompeting your way into being stuck.
Well, in California it's basically impossible to enforce a non-compete, so there's that going for anyone who wants to do this as a current Apple employee.
I think the issue is that culture at Apple is very much not supposed to be this, and this probably isn’t what the C-suite is intending to push. So you’re not really going against them; rather, you’re going with them but against the current status quo.
If upper management wants culture to change they will take the action necessary to change it. The last people to be powerless to inflict change on a corporation are the executives running said business.
You should do some reading about the idea of the "frozen middle", which is pretty much what's being described as the problem. It's as resistant to edicts from the top as ground-up initiatives from the bottom.
In the sense that everyone wants to have their cake and eat it too, sure. But really, everything anyone wants is stack-ranked, and their management prefers the Apple demo-to-the-world format long after the luster has gone, in no small part because of the secrecy, silos, and paranoia it requires. In a financial sense their management is right, and an employee wanting to be part of something less bland and Oracle-like is wrong.
> Any company larger than a few dozen people is entrenched - there will be a hierarchy and the top will dictate the order of things.
To a point, but have you ever tried changing something purely through a dictate from the top? People will just say ‘Yes boss!’ and then keep working exactly as they had been.
disappointing to see this grey, because i think there's a lot of truth in here.
i think a lot of developers are used to wielding great power with technology, getting immediate visceral feedback, shipping, and whatever else. this gives them an impression that fixing people problems is just as easy - the equivalent of opening up the ol' IDE and rocking out for 8 uninterrupted hours, getting an MVP up.
the differences, though, are crucial. the compiler doesn't lie to you - you missed a semicolon; that's not the case with people. the in memory database doesn't have another agenda; again, not the case with people. the UI doesn't aspire to be something greater, or protect itself; managers often do.
i really appreciate that individuals try - i just don't think it's really worth it, if the org is > 15 or so people.
People really hate to acknowledge that working for most corporations, especially big public tech giants, is not them being welcomed in to change the world and blaze trails but to write code for their boss.
I’m on an internal infra team, and even tho we’re supposed to be making tooling for the rest of the company, it’s just as much of a clusterf*.
I’ve been interviewing with other teams, but with this disaster of a release and us demonstrating that our corporate values are just empty words in the face of opposition from China, it’s likely time for me to make my exit.
The overt sexism that I’ve been witness to in iCloud management certainly doesn’t help either.
This sounds largely as I feared. The organization was built to serve the will of an omniscient god-king who knew better than anyone else. And now he's gone, and the organization is still set up to make orders trickle down from on high (despite no one qualified or interested being there to make those calls), and it hasn't learned to make good decisions by communicating internally either. So the acolytes and high priests try to just keep doing what they were doing before god died.
> - So what happens is that you’re working on a product that’s part of another product but you never talked to any other teams or orgs on how to make your product fit in
I'm surprised to read this, because I've always thought tight integration and clever synergies between product lines were precisely one of the things Apple excelled at.
And I was wondering how exactly they managed to make that happen in such a famously secretive organization, where very few people have the 10,000-foot view required to come up with these ideas.
How do you make Mac Catalyst or Sidecar happen with 4-person silos who hardly ever meet and adjust? How do you redesign the iOS Photos app to the capabilities of the new hardware, in a way that will make sense once software + hardware become a product? How do pictures of unreleased AirPods end up in recent iOS beta releases? I mean, at some point you've got to make these things work together, and one decision on one side that's oblivious to the other side's constraints might make things impossible for them, and they'll want to push back. This is how "normal" companies function.
One more example, not something especially clever but more something that would have been a huge bummer if it hadn't happened: it seems like the Pro Display XDR has charging capabilities way beyond what any current Apple device might require, and it's speculated that it's for the upcoming 16" MBP: https://www.macrumors.com/2019/10/04/16-inch-macbook-pro-96w...
Again, how do you even achieve that if people don't communicate? Through extremely well-defined internal interfaces and specs?
The fact that GP is complaining about it may be an indication. Hiring good people who see the structural problems and work around them works: you can brute-force your way around poor management and a bad structure.
Edit: you can see this explained in one of snapple's sibling posts.
I should note that most developers here really do care, and that’s probably why products can be released in the first place. You have to have really dedicated people willing to cut through the organizational bullshit to get things done.
All of the engineers I've met here are smart and innovative. If only we could organize, things would be much better.
When my friends at Apple talk to me, I feel any of them could write your post, except... none of them would say "I am awed by the fact that we manage to release any software at all, let alone functional software" (well, maybe one would).
I'd like to work with these folks again* but the incredible secrecy would bug me. I understand that some things have to be secret, and I don't feel at all that I need to know what that group over there is doing, but I'd like to talk with my (NDA'd, same-company) friends about what I work on! So I don't even apply for a job there. But some people seem to consider it OK.
* The subset of friends there who used to be colleagues of mine, I mean.
He's specifically talking about how the product subsystems are effectively mapped out by the product departments and that they should try to interface with each other with minimal constraints.
But, my take was that there needs to be a LOT of communication between departments and an ongoing debate between them as well.
Edit: The more that I think about it...
This might be a big reason why Musk companies defy the odds, and why it is so difficult for incumbents to catch up.
The over the air updates of Tesla are a good example of hardware & software departments working together to make something very difficult to compete with (if you're a regular old school siloed company).
I'm on an internal tooling team @ PlayStation and man do I appreciate how much the company encourages cross-site and cross-team collaboration. Reading your comment made me cringe :/ so so sorry for you folks working at Apple.
I feel like there are people who get around the secrecy/isolation veil, they just have long-standing friends in other projects and can rely on them to push features and bugs though the bureaucracy. If you don’t have that, well, expect your Radars to sit in “punted from this release” until the end of time.
Seems crazy to hire the best engineers and pay them top dollar, only to put a bunch of organizational barriers in their way that they have to navigate if they want to get anything done.
And what do PMs tell engineers on their team once they're back from their super-secret meeting with the PMs from the other team?
"Do exactly this, I can't give you context but it'll make sense on release day" ? Sounds like extreme Fordism (and basically a hellish way to work for creative minds) and a great way to prevent people from spotting problems early.
I worked at Apple for a few years and he was not running around bringing teams together. At the senior leadership team level sure. But not at the engineering or middle management levels.
It sounds quite similar to what OP was experiencing i.e. lots of siloed, secretive teams across a very large, 100K employee organisation.
Absolutely. Based on various anecdotes it seems Apple succeeded nearly in spite of him, but he surrounded himself with people like Wozniak, John Carmack, Tevanian, etc.: people whose confidence in what they knew could always win out against Steve's bluster. Steve brought people together, but also brought creative tension. He was also pretty good at spotting opportunities and trends. Apple really does feel rudderless without him. There's no coherence of vision between products. Quality doesn't even really seem to matter. Apple still operates in a very top-down manner, but it just doesn't seem as though there's anyone at the top with much in the way of interesting ideas. Watch did okay, but was hardly a game changer. Earbuds are a success, but they can't be making that much on them... Steve's gone, and more's the pity for all concerned.
And yet most of Apple's products are consistently best-in-class, so I guess it's working for them? And a bumper crop of bugs in a point zero macOS release doesn't count as a disaster. They've been shipping super buggy point zeros for two decades.
Over the years I've been surprised a number of times by seeing Apple hire a particular software engineer that I wouldn't hire for my own little company. You're still kinda junior at 4 years there, but.. have you seen any slippage in hiring standards?
Apple hiring is independent per team. You can literally interview forever at Apple, and the way each team interviews varies wildly. I've had some pretty cool interview loops. I got rejected, and then got an offer after interview 4 or 5 with a team on a high-profile app that I really liked.
Apple doesn't give good offers though, so I took another one at a company/team I also really liked, with a better offer.
I have a buddy who's an old Apple QA who tells me similar jaw dropping stories about the silos and secrecy. When I marvel how they can possibly ship all those wonderful products he says it's done through grueling brute force manual QA testing. Which also sounds insane but somehow it has mostly worked through the years. But maybe they're reaching the level of complexity where brute force manual testing is not scaling well enough.
I used to work at Dyson and it was a similar story. I think the secrecy thing is key. They went on and on about "protecting our secrets" (ha) and operating on a need-to-know basis. It just meant communication was really bad, and getting help from anyone who didn't know you was impossible.
> Culture of secrecy means nothing gets fucking done. You file a bug report and you can’t even see it any more for some orgs
> And you can’t speak out about this. Apple doesn’t take well to employees complaining. Even then, because of the lack of organization there is no one you could raise these issues with.
No, it sounds like any large multinational. In the end, all commercial entities turn into the same thing where they only differ in branding, segment and origins.
You are probably segregated away from where the real work is happening. This is by design, because you haven't proven yourself yet or they don't trust you yet. Good work is being done in Apple every day. You're just not a part of it and are not seeing it.
After four years? Seriously? Why would a bad hire not be laid off at that point? What's the difference between "real work" and "not real work"? How many engineers are on-site at HQ who don't do real work?
> The final (well, first) Catalina release along with the outright awful public beta makes me think one thing. And that is Apple’s insistence on their annual, big-splash release cycle is fundamentally breaking engineering. I know I’m not privy to their internal decision making and that software features that depend on hardware releases and vice-versa are planned and timed years (if not half-decades) in advance, but I can think of no other explanation than that Marketing alone is purely in charge of when things ship.
I don't work at Apple, but this part hit home for me. In the past few years my jobs have revolved around shipping features at all costs with zero regard for engineering feasibility.
We all like to criticize CEOs for prioritizing short-term stock prices over long-term company goals, but I'm beginning to think the average employee or middle manager has even more perverse incentives to make poor short-term decisions. I've seen a lot of engineers and managers swing for the fences to deliver headline features that can't possibly be completed on time with any standard of quality, testing, or long-term support. It doesn't matter, though, if your goal is to add that accomplishment to your resume so you can pivot into the next higher-paying job elsewhere. After that, your mess becomes someone else's problem and you're off the hook. Up or out.
> The iPhone could play a section of a song or a video, but it couldn’t play an entire clip reliably without crashing. It worked fine if you sent an e-mail and then surfed the Web. If you did those things in reverse, however, it might not. Hours of trial and error had helped the iPhone team develop what engineers called “the golden path,” a specific set of tasks, performed in a specific way and order, that made the phone look as if it worked.
I guess I post this here as a means to say, while what OP is talking about certainly sucks, Apple seems to have a long history of this. Doesn't make it better, and certainly what he's outlined seems extra bad, but not completely unexpected.
Great read! Thanks for sharing. Highly encourage anyone who is on the fence about reading the whole thing to go for it.
Steve, and others like him, do make me wonder. On the one hand, I work four days a week, never stay late at work, and live a good, steady life. But on the other hand, I see these super-stars, these drive-people-to-the-edge, sleep-on-shop-floor types, and see how much change and drive they create, it makes me start to think that maybe I should work _much_ harder. But then again, I quite like all this time I have to think on things. And of course, we don't get all the details about how this style of work _really_ affects home life; I'm sure we'd have much less respect for these super-stars if we knew they _all_ had screwed up lives away from work.
I know what you mean, the appeal of this total dedication. For myself, I've come to the conclusion that I could not do it for long, not as long as I'd have to. I think for those types like Jobs, Musk, it wasn't even a conscious choice they made to put all they had into their work, they are/were just driven. You'd know it if you were, too. They could never have done a four-day work week with zero overtime, it was never an option for them. So, enjoy your life as it is, this is yours, theirs is different, and don't think you're missing out on anything.
Reminds me of an anecdote I heard about starting your own company. It's great, you're the boss. You can work half days if you want. You even get to choose which 12 hours that will be.
Demos of new products always have golden paths, and even demos of production grade software have golden paths very frequently because maintaining complex software in a demo-able state is hard (i.e. historical reports need to show a plausible history without anyone actually using the system).
Indeed. Apple's MO for a LONG time has been, "The way things have always worked should always be rethought, and we've come up with a better way for you". Many Apple users on forums like this tend to be Apple apologists until they introduce that one change that's too much, and then it's all "Apple's lost their way!" when the reality is it's just Apple being Apple.
I work for a big software corp in SV with a yearly conference where we announce all of our products. Managers give zero shits about product quality as long as we deliver on time for the "big show". Bugs, future maintenance burden, and other shortcomings be damned because they'll probably be long gone after they get their promotion (and probably at another company rinsing and repeating) and by then, these issues become someone else's. This has caused engineers to become incentivized to place priority on delivering over quality. Funny how this short-term growth mindset comes top-down from where it all starts in Wall Street. If we want to fix the system, we need to first fix Wall Street and bring back accountability.
Counterpoint: I've worked at SV unicorns that were not public, and did not intend to go public in the near term. And they had a similar "damn the torpedoes! full speed ahead!" attitude. So I think it is more pervasive than just Wall Street.
Regulation for publicly listed companies means that the bottom is actually higher than for VC-funded companies, where nothing of substance needs to be generally known. (wework?)
Same short term thinking, same self-aggrandizement, same convincing yourself of how rational you are, when most actions are driven by emotions, but with more casual clothing.
"big-splash release cycle is fundamentally breaking engineering"
I strongly prefer Apple ship functionality incrementally. No more big bang.
Especially be more cautious with new kits (core libraries). Just one or two end user facing features on some fraction of hardware. Then expand over time.
With Apple's installed base, it's an engineering marvel there's so little drama. But I want no drama, which means waiting a bit longer. Which is fine.
I was thinking on this the other day, and I think what ultimately boils down to is this: there’s no “why” anymore at Apple.
Go back and watch old keynotes with Steve. Almost always, whenever he’s talking about a new feature or piece of hardware, he starts with the “why.” Not everyone will agree with the “why,” of course, but it’s still given.
Why do we want to get rid of the CD drive on MacBook Airs? (We see OTA software updates and media downloads as the way of the future and it wastes space.)
Why do we want to migrate from PowerPC to Intel? (We need better performance-per-watt so we can build MacBooks that have better battery life and don’t overheat.)
Why do we want to not have a physical keyboard on the iPhone? (Because the buttons and controls can’t self-specialize for each application.)
There are obviously exceptions to this rule, but by and large it’s fairly accurate. Now, watch the most recent keynotes. Has there been even a single second dedicated to WHY we need Apple TV+? Apple News+? Apple Arcade? Thinner keyboard mechanisms?
No, and it’s because we all know what the answer is.
There's still a "why" to those product launches. It's just that the "why" is no longer design-driven and is instead driven by business needs– namely, "we need to diversify our revenue streams away from just hardware sales and into services." This started when Tim Cook (instead of Jony Ive) took over as CEO.
I'm not sure if anyone has seen this yet, but Catalina is letting me login without entering my password!
I have two users on my machine.
1. I "lock screen" from the Apple menu and close the lid.
2. I reopen the lid and it does not ask for a password.
3. I start using the laptop and the lock screen suddenly pops up, but asks for the password for the wrong user.
4. I hit a random key and the screen goes away, and I can continue working.
Also, it looks like a lot of settings don't work on the lock screen / choose user screen. For instance, the pointer speed doesn't match what I have set, font sizes don't match either, and the resolution looks wrong.
Despite having declined in quality over the past few years, Windows has login and locking features that actually work (I don't really remember bugs there, like, ever), so no, from what you describe it does not feel like Windows at all; it feels like some completely broken crap.
While on the subject, on Linux I've also noticed that Xscreensaver (or GNOME screensaver, can't remember which) sometimes goes straight into the desktop after wakeup for a few seconds before the lock screen prompt actually appears. You can even run programs. Really bizarre and feels like this issue has been present for a while now. Has anyone else noticed this or is it just me?
Yeah, saw this the other day on an Ubuntu machine (running whatever the last release was, not LTS) I have hooked up to a TV. Flash of the desktop for a second or so before the login screen. Speaks to something fundamentally wrong with the way the whole thing works, I figure.
I had the same impression. That general feeling that the screensaver is like an app/overlay that is invoked only once the desktop is active. The speed of invocation also seemed related to how fast your machine is at launching general desktop programs...
I run xlock on suspend rather than on resume; it seems more reliable. In any case, locking is trivial to bypass by pressing Control-C a few times on resume to kill Xorg, hence why I also start X with "startx; exit" so that this drops to the login prompt instead of the shell.
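On a systemd-based distro, "lock on suspend" can be wired up with a unit hooked in before sleep.target. This is a sketch of the common pattern, not the poster's actual setup; the unit name, username, and DISPLAY value are placeholder assumptions:

```
# /etc/systemd/system/lock-on-suspend.service  (hypothetical unit name)
[Unit]
Description=Run xlock before the system sleeps
Before=sleep.target

[Service]
User=alice                 # placeholder: the user who owns the X session
Environment=DISPLAY=:0
ExecStart=/usr/bin/xlock

[Install]
WantedBy=sleep.target
```

After "systemctl enable lock-on-suspend.service", the locker is started each time the machine heads into suspend, so the screen is already locked by the time it resumes.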
> Right now with screensavers under X it's basically capturing the input and continually redrawing over the display.
> With Wayland, Kristian plans for the lock-screen to be part of the Wayland compositor. In having the compositor handle the screensaver role, it can ensure that no window can appear atop the screensaver surface, it can properly detect idling and grabs already, and has complete control over the screen. Unlike the X design, there wouldn't even need to be a screensaver "window" that's on top but the compositor could just keep painting a black screen. For those interested in a "fancy screensaver", a plug-in could be used or an out-of-process Wayland client for drawing whatever you desire.
I run the real xscreensaver though (I was building it from the source until around this time last year, now I run the one from alpine but I don’t think they’ve done anything weird (I haven’t checked though))
I’m pretty sure I’ve seen what everyone is talking about and it’s bothered me a little too (granted, my machine is configured in such an undiscoverable way that just opening an xterm window is obscure enough to be nearly equivalent to a very short PIN, and then you need to know bash. Now that I’m not in college and don’t have anything important on it, that’s probably good enough even without xscreensaver.)
For the record, I used to be able to see a bit of the desktop on Windows after waking from sleep, before the lock screen came on. Can't remember if that's still true, though.
Ah yes, I remember that happening on Windows XP a lot when it was waking up, or between the screensaver and the login timeout (when they overlap); you'd even get the transition-to-welcome-screen sound, so you knew it had only just triggered the lock. I suspect if you are fast you can run something before it actually locks.
Recent Windows login issues I've had to deal with:
* Non-consensual insertion of Windows Update latency into my schedule. Often I don't mind. Sometimes, though, I really, really do.
* Said updates failing but giving no indication of failure other than taking infinitely long.
* Keyboard layout sometimes gets swapped back to QWERTY with no visual indication. This interacts especially poorly with stringent Active Directory 3-try-lockout policies.
* Network hiccups + active directory (or something) can cause login to spin indefinitely, requiring a restart.
* Login screen background occasionally changes to a random picture from my computer. Usually a wildly upscaled application resource. I haven't entirely ruled out my own clumsiness as a contributing factor, but I've also seen this in the wild, so it's at least a UX issue somewhere.
None of this is as bad, in a theoretical sense, as Apple's no-password fiasco, but it has resulted in a far larger footprint on my day-to-day activity.
There were some good bugs in the login part of Windows 2000/XP. Most of the ones I know of involve opening a browser while still on the login screen (my favorite was in the help for the screen reader), which runs as the SYSTEM user that has complete control over the system.
They fixed most of them somewhere around XP SP2 or SP3, where they pretty much disabled help functionality on everything on the lock screen.
When Cortana was introduced, there were some issues where you could launch the browser while the machine was still locked (though as the user who had locked it). You couldn't do as much as with the previous bugs, as the lock screen still covered everything.
Wouldn't know about the Windows lock screen in particular, and I haven't been a Windows user since Vista, but such low-quality software and UX is what I remember being accustomed to during my time with Windows.
I've never seen a bug like that on Windows. Honestly, it feels like a lot of Mac users live in a bubble where Windows is a buggy pile of crap. Meanwhile most Windows users around the world are getting on with life on a stable OS with a great choice of hardware. While nothing's perfect, Windows is in a really good place at the moment.
As a Linux user I feel like Windows users live in a bubble where computers are more or less expected to behave like diseased wild animals rather than machines.
Except that I cannot open "Windows Search settings", since it crashes a few seconds after loading. And all the other shit. It might not be as bad as this Mac update; I don't feel confident doing a comparison. But I certainly take issue with "Windows is in a really good place at the moment".
Not OP but there will always be counterexamples for any software with a sufficient number of users. I've never experienced the bug you're referring to.
> While nothing's perfect, Windows is in a really good place at the moment.
Sure, minus the fact that my Windows box would be _spying on me_ if I wasn't a very technically capable person. That's a non-trivial driver for a lot of folks when they decide to pick Apple products. Or, at least, it was until recently with the iCloud / China debacle.
My brand new PC running Windows 10 will simply not wake up from sleep, requiring a hard restart. A Google search turns up thousands of similar complaints (and no solution).
Since Mojave, waking my 2012 rMBP results in a flash of desktop and content before showing the lock screen. This sounds similar, if not a continued degradation.
I have been using Linux on my home workstations for over 10 years. I've used a MacBook Pro one day a week for work for the past 5 years. I occasionally use a Windows 10 machine if a friend or family member needs help.
The Linux desktop is easily my most productive and least-bad environment. It's so smooth to develop on. All the other OSes behave like an annoying authoritarian dictator. And Windows specifically is a kafkaesque dumpster fire of menus and windows, all with seemingly different design concepts. macOS is pretty consistent in design and performance, but using the machine it's obvious that developer productivity is not their top goal, and you can't develop using a mobile phone as your workhorse, so desktops are all we've got.
Can I just say how happy I am with Xfce. It's a Linux desktop environment that looks like it did in the early 2000s. It's fast. There are no unnecessary frills. It just gets the job done. Its release cycles are measured in years, and they keep it minimal. I used to be on Windows, then macOS, then Ubuntu, and now this. As a developer with soon-to-be 20 years of experience, it's the best environment I've had.
So was I, until I switched to a 4K monitor and realized that there's no real, stable, functional fractional scaling on Linux, even less so on XFCE, unfortunately.
I main Linux on virtually every one of my computers, and I love it. I wouldn't switch off Linux if you paid me. But this complaint is spot on. It drives me crazy that we still don't have good support for higher resolutions.
And there's nobody else to blame for that -- we've known for ages that 4K and fractional scaling was going to be a thing, just like we've known for ages that touchscreens were going to be a thing.
But nope, let's just measure everything in pixels. It's like the majority of native developers on Linux all looked at responsive design on the web and thought, "I'm pretty sure that's just a fad." Everybody just dug in their heels almost on principle or something and refused to make it a priority, and now we're behind both Windows and Mac when it comes to high-resolution touch devices.
And I still run into people who argue that we should just scale the physical size of a pixel for the entire desktop by a percentage, just so we can keep building fixed layouts that absolutely position all of their elements. At a certain point, it feels more like a cultural problem than a technical one.
Everybody else is doing responsive design. Qt already supports `em` units (well, sort of[0]). We could be using them on Linux.
Try using Linux on a laptop with a HiDPI internal monitor and a regular old external one. It's a guaranteed way to generate a daily urge to throttle yourself. (Using GNOME in my case rather than XFCE, but it's still an unparalleled shitshow.)
My work laptop is in this situation, and it got to the point where I just ran the eDP display at 1080p instead. Honestly, on a 15-inch screen where I'm only writing code and not editing photos, there's no point in running it at UHD.
Depends on your distro, but on Debian+XFCE I use ARandR (a front-end to the xrandr CLI tool, which in turn drives Xorg's RandR extension). Laptop plus external monitor works great, with auto-config on connect and other stuff.
This is the biggest thing keeping me from leaving the Mac for Linux, and it hasn’t really improved much in the last 4 years. I don’t think most people care about HighDPI (or whatever you want to call it), particularly Linux devs. But I can’t go back to lower DPI screens.
I do care about high DPI, but not in the way most people seem to mean: I want as much readable text on screen as possible. One of the many reasons for not using GNOME everything is that it insists on making everything SO BIG, with enormous amounts of wasted space around it. Those 1600x1200 pixels on my laptop or 1920x1280 on the monitor are there to be used, not wasted. Firefox used to be another offender in this respect, albeit one which is easier to tame: just change layout.css.devPixelsPerPx to something sensible (0.7 works quite well), move all controls to the navigation bar and close all toolbars.
Ubuntu 18.04 using 2x scaling is broken in lots of little ways for me.
From the simple (the boot screen runs at 1x so text is unreadable), to the weird (a VirtualBox tangle of bugs), to the frustrating (the Dolphin file manager, which I use to avoid UI bugs in the GNOME file manager).
I found workarounds for some issues, and maybe they are fixed in the next LTS, but there is no way I could recommend 18.04 with 4k to a non-professional.
Nah, it's too big on a 28" 4K; trust me, I tried everything before going back to Windows, even using GNOME Shell (which I despise) and experimental fractional scaling (which barely works, freezes, and scales back to 100% randomly).
Qt 5.14 is now in beta and has a toggle switch to do per-display scaling based off DPI on all platforms.
I've had a 4K 27" monitor mixed with 1080p and 1440p monitors of varying sizes for years and have managed to get 90% of software working great, and whenever I dip into Wayland and get to use the fractional scaling there, it goes up to 99% of software.
> I don’t think most people care about HighDPI (or whatever you want to call it), particularly Linux devs. But I can’t go back to lower DPI screens.
I find this surprising. I'd say anyone who does any serious amount of multitasking (whether a Linux dev or not) would easily want one. I think people do care but they are just waiting on better pricing/availability.
> I find this surprising. I'd say anyone who does any serious amount of multitasking (whether a Linux dev or not) would easily want one.
I would have thought so too, but this isn't my experience. The smartest developers I know at my current job still develop with old laptops with awful 720p screens (I mean, awful beyond just the low res) and can't be bothered to attach an external Full HD monitor, let alone ask for a 4K one. And still they are brilliant, produce great software, and are key when planning profound software changes in the company. These are people who I deeply admire and from whom I always learn something valuable when they speak. Keep in mind they are also developers, not "whiteboard engineers".
My conclusion is that we nerds tend to overestimate ergonomics, because they are easier to see ("pff, I can only type with a mechanical keyboard!", "how can people code with fewer than three 4K monitors!"), but the actual bottlenecks and difficulties of building complex software lie elsewhere.
My experience with HiDPI on Linux has been pretty varied. On GNOME, everything just works, and I recommend GNOME to most first-time Linux users. I use i3, and the scaling situation there involves setting some environment variables globally, but is otherwise fine (GDK_SCALE or something like that). The only issue I usually run into is when Firefox opens my file browser, which breaks the scaling somehow. Other than that, HiDPI is as good as it is on Windows. macOS definitely leads in that regard, but things have certainly gotten better, to the point of not really worrying about it on Linux these days.
Seconding i3. I bought into vim when the evangelists came for me early in my career and it has paid wonderful dividends. I can't recommend i3wm for everyone, but for those who like keeping their hands on the keyboard and want the WM to just get out of their way, i3 is perfect.
I think things will improve as more Linux devs get HiDPI monitors (ones with good IPS panels are still quite expensive). Ubuntu has some beta-level support for fractional scaling but it looked awful when I tried it. But I live in hope that it'll all be sorted out soon!
This is far from true. GNOME Shell itself is mostly OK (if buggy), but only a minority of real apps work properly. The standard GNOME apps (Nautilus, GNOME Shell, etc.) are fine, but that's where the support pretty much stops.
Step 1: put 'xrandr --dpi <your actual DPI>' in .xinitrc
Step 2: Use QT applications (Plasma is a fantastic QT desktop)
Step 3: Enjoy your reasonably sized everything.
"Scaling" is a broken concept to work around applications assuming 96 DPI (which is considered scale=1). You don't need it if you use programs that actually respect your real DPI. Unfortunately X11 doesn't properly compute DPI settings, even though EDID information generally contains the screen size - I imagine, for fear of breaking stuff.
(You can correct GTK3/GDK applications by setting GDK_DPI_SCALE=<actual dpi / 96>, but in my view it's a sin that you need to do that)
Is it simple? No. You have different env variables for Qt and GTK, and xrandr can set its own DPI as well. It's kind of a mess, but I use i3, so I figure I'm in for whatever pain I put myself into.
I've been running Linux on HiDPI devices for a couple of years now and I've got things sorted to where it's not an issue on any new device. I need to give Wayland/Sway another shot at some point. I hope it has better native zoom support than X11.
KDE has excellent support for fractional scaling, to one decimal place. It is also a very lightweight and fast desktop environment, but has plenty of useful features.
I did try with Plasma on both Kubuntu and Arch at 1.2 and in both cases some controls had very ugly and/or blurry fonts and some were fine. The inconsistencies were too annoying.
Typing this from a Linux desktop on a 4K screen; I don't understand what doesn't work? I can set whatever DPI I like in my .Xresources's Xft.dpi key and it looks fine.
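For anyone curious, that setup is a two-line affair; 192 here is an assumed value (2x the traditional 96 DPI baseline), so substitute your panel's real DPI:

```
! ~/.Xresources (sketch): Xft.dpi controls font scaling in Xft-aware apps.
Xft.dpi: 192
```

Reload with `xrdb -merge ~/.Xresources` and restart the affected apps for it to take effect.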
This, lol, and the tiny title bars that can't be resized in XFCE without editing the xfwm theme images (unlike Mutter, where the title bar size can be changed in a config file).
Sorry but I don't have time for this anymore. I would have on my SUSE box in 2003.
GNOME does, but a WM that works isn't much use without apps that do likewise. Linux desktop software support for HiDPI is an absurd mess. Some apps can switch resolution when you move them between screens. Others can't. When you combine this with GNOME's inability to deterministically start an app on the main monitor, it's sometimes literally impossible to get an app at the resolution and on the screen you want. Some apps start up with an apparently random resolution (Calibre, I'm looking at you!). What a horror show it is.
Wayland sort-of handles this. That is to say, you can set scaling per-screen without xrandr hacks (with GNOME anyway, not sure about Xfce). But there's such variation in support from apps that it's still a mess.
By using a combo of xrandr and Xfce's HiDPI setting, I've managed to make my Linux desktop more stable, functional and consistent across a 4K screen and a 1920x1200 screen next to it than Windows can manage.
Sure, I had to put in a bit of effort, but now that I have, the results are excellent: there's no scale jumping, nor even the weird rendering macOS does when moving windows around.
Honestly I just had to switch window scaling to 2x in the Xfce appearance settings, then I run this to fix my layout on each login, giving me a consistent size and no weird rendering -
AFAICT this effectively renders the smaller screen as if it had a much larger resolution (2880x1800), then downscales, giving a good quality image, and then positions the 4k screen to the right of it. The key is the scale factor compensating for different DPI on the different screens.
I tried scaling the other way first, by giving a sub-1.0 scaling factor to the 4k screen, but that meant rendering a lower res and then upscaling, which looked terrible!
The only annoyance now that I have this set up is that occasionaly driver updates change which output is which and I have to update the script so it works again.
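As a hedged sketch of what a login script like the one described above might look like (not the author's actual script, which isn't shown; the output names DP-1/DP-2 and the modes are assumptions, so check `xrandr --query` for yours):

```shell
#!/bin/sh
# Sketch of the dual-monitor layout described above (assumed outputs).
# With the desktop's window scaling at 2x, render the 1920x1200 panel
# at a 1.5x larger virtual size (2880x1800) and let RandR downscale it,
# then place the 4K screen immediately to its right at x = 2880.
xrandr \
  --output DP-1 --mode 1920x1200 --scale 1.5x1.5 --pos 0x0 \
  --output DP-2 --mode 3840x2160 --scale 1x1 --pos 2880x0
```

Downscaling from a larger render (scale > 1) is the direction that looks good; as noted below, a sub-1.0 factor on the 4K screen would upscale a lower-res render instead.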
Not sure I did anything special, I'm using the clearlooks theme with some customisation of the fonts. See the other answer for how I sorted out the res/DPI stuff.
I'm posting in the Linux thread because it's probably annoying to some to see 'but I moved to Linux' threads throughout the comments.
But the original post is very sobering.
I left OS X a few years back when the new laptop came out. Partly the whole USB-C thing, I suppose, but I felt that there was pretty good hardware around and the new MBP didn't really shine (and was fiercely expensive). So I made the move to Ubuntu on some new Lenovo hardware (an X1 Carbon).
I really look forward to the twice-yearly Ubuntu releases. More of the same, rarely any huge new surprises and just generally more polish. I upgraded to Ubuntu 19.10 immediately once the 19.10beta came out and it seems speedier.
I'm sorry to see Mac users get so badly burnt but I'm very glad I made the decision to leave when I did.
Apple fanboy and software developer here. Also disappointed with the Catalina release. I had to do an NVRAM reset, boot into safe mode, talk to Apple support to solve iCloud issues, and also click away dozens of privacy notifications.
But I still really don’t consider Windows or Linux as legitimate alternatives because of the ecosystem lock-in.
Yes, macOS has its downsides. But what keeps me from even thinking about Linux are all these little niceties:
- Copy something on my iPhone, paste it on the Mac (still regularly leaves people speechless when they watch me do it: “What!? You can do that?”)
- My watch unlocks my Mac
- My desktop is always in my pocket, all files synced by default, zero config
- Start reading something in Safari, hand it over to my iPhone in a second, and walk out the door
- All my browser tabs synced across my Mac, iPhone and iPad
- I can curate my TV watchlist on my Mac and it’s automatically available on my Apple TV when I get home that night
I could go on with dozens more of these little things that I can’t imagine living without anymore. I know that it’s probably possible to achieve most or all of this with a Linux/Android/Chromecast Setup. But every time I watch some of my friends do it, it just looks like so much work to set up and maintain.
The only argument I understand here is the joy of tinkering and that feeling of achievement when you get it to work in the end. Apple is a bit boring in that regard. A lot of the integration stuff just works (yeah, yeah, sure not all, but still more so than on any other platform I know).
I’m not 20 anymore and I just prefer to spend my time with other things these days than tweaking my OS.
So although I wish Apple would invest more into quality on macOS again, for me the walled garden just totally works, and I’ll stick with the lesser evil.
1) After upgrading I had the avalanche of permission requests. I clicked through all of them on day 1. Needless to say it’s running fine if you ignore running any of the new features.
2) For the first time in my 10+ years supporting Macs at an enterprise level, I’m holding off on upgrading my company to 10.15.
I’m genuinely pissed that there’s such an overabundance of annoyances when upgrading; it’s not the Steve Jobs experience. Yes, he’s gone, but his spirit and ideals are what brought Apple back from the brink of bankruptcy 20+ years ago.
I’ll veer off Catalina into a Steve Jobs decision that made sense in 1997. Apple had too many products and it was too confusing. Steve said there were two types of users, consumer and pro, and two types of computers for them, portable and desktop. Now we have five different iPad lines and multiple iPhone lines. What made things simple to focus on at Apple has given way to college grads with no experience.
I think Windows has always hands-down beaten OS X in everything but aesthetics, in terms of functionality and control. Maybe you could say except for proprietary features/apps, but I don't think that really counts because that's comparing ecosystems. Even for development on Linux, running a Vagrant instance and writing code in Windows is just a better experience up front. I think Microsoft understands this, given their Linux subsystem work and upcoming revamped terminal. Developing on a Mac is only better if you're trying to develop for the Apple ecosystem. For non-code use I think it goes either way, with the Mac being the choice for graphics work since it's generally bought as a complete package with guaranteed color-accurate screens, etc. Linux is really powerful and a better OS than Windows for running code, but precisely for running it and nothing else, IMO.
I mean, even the supposed top software product from Apple nowadays, iPadOS, is suffering from annoying bugs.
This will be the third time I've had to reboot my iPad 6th gen today; the first two were because the keyboard would randomly close and not come back (if someone has a workaround for this issue, I would be grateful; it is a really annoying bug, especially since it happened while I was writing a long text) [1].
Now, the app that I use for Hacker News (Octal) is broken on the dock, so the only way to switch to it is using the Expose-like screen. I already tried to force remove it from dock, however opening it again triggered the same bug. Also, force touching the icon shows a “Share (null)” (probably the cause of the bug).
I know, not a big deal (making the user reboot three times in one day, however, should be a big deal). But this is their top product, for god's sake. I don't want to even imagine how Catalina is.
[1]: If this was Android I could at least force close the keyboard app. I didn't find a way to force close the iPadOS keyboard, by the way; I tried adding and removing keyboard languages without success. It was also annoying that I had to use another device to search about this issue since, you know, I had no keyboard at all.
Catalina is definitely the most badly broken release in the last 10 years.
I can't get Chrome desktop notifications to work anymore, but the worst thing is that all my Apple Music playlists have disappeared from my iPhone.
They do exist in the Catalina Music app, but not on the phone where I need them. I suspect it's because my iPhone 6 isn't supported by the latest iOS release and there appears to be some iCloud version incompatibility that was triggered by Catalina. The iPhone Music app has crashed a few times as well.
They do warn me of just such an incompatibility when I open the Reminders app (which now has a far more convoluted user interface for no apparent benefit). I get the feeling that my decision to just keep using my iPhone 6 until it's no longer good enough will be difficult to sustain.
Catalina also randomly stalls for a second or two and some mouse clicks just don't register for a long time until they eventually do. This seems to have gotten a bit better now so I'm hopeful that it was related to uncached data.
I can add something to the list which is Sidecar i.e. using an iPad as secondary screen. It was a real nightmare to set it up as it’d not detect the iPad despite meeting all the requirements. I found no fixes online so I had the idea of logout & login again (iCloud) so that it might detect it.
Well, it detected I had an iPad but the connection would time out every single time. After rebooting both the Mac and the iPad (another wild guess) it kind of worked - very laggy and couldn’t use the Mac properly as clicking in e.g. System Preferences would open Finder to show me the app under Applications instead of the actual system preferences panel because of reasons.
It’s truly broken and felt like an alpha version. But this has been the trend with macOS and guess it’ll stay like this unless someone high enough at Apple decides to wake up.
Sidecar was the reason I was excited for Catalina. But alas, the feature is not supported on 2015 Retina MacBook Pros, only the newer ones with the horrible keyboard, pointless touch bar, only one kind of port, etc.
I'm in the same boat. Installed the betas on both iPad and MBP about a month ago. Turns out my 2015 pre-horrible-keyboard MBP is too old. And none of the published hacks around the restriction work nowadays.
Not helpful but I had the opposite experience. Sidecar Just Worked with my MBP and 2016 iPad Pro, and drawing on it was extremely smooth. A better experience than Astropad so far, personally.