Apple’s declining software quality
sudophilosophical.com

This has to be a management problem. Apple has total control over the hardware, total control over third-party developers, and $203 billion in cash. What are they doing wrong?
Apple has the resources to approach software like aerospace: formal interface specs, heavy QA, tough regression tests, formal methods. It's expensive, but Apple ships so many copies that the cost per unit is insignificant.
Microsoft did that, starting with Windows 7. Two things made Windows 7 stable. The first was the Static Driver Verifier, which examines driver source code to check if there's any way it can crash the rest of the OS. This includes buffer overflows. The driver may not work, but it won't take the rest of the kernel down with it. All signed drivers have passed the Static Driver Verifier, which anyone can run. Driver-caused crashes stopped being a big problem.
With the driver problem out of the way, any remaining kernel crashes were clearly Microsoft's fault. (This had the nice side effect that kernel bugs could no longer be blamed on third-party drivers.) Microsoft also developed a classifier system that tries to group similar crash reports together and send the whole group to the same developer. It's hard to ignore a bug when a thousand crash reports from the same bug have been grouped together.
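As a rough illustration of that kind of crash-report bucketing, here is a minimal Python sketch that groups reports by their top stack frames. The grouping heuristic and the report format are my own assumptions, not Microsoft's actual classifier:

```python
from collections import defaultdict

def bucket_key(stack, depth=3):
    """Signature for a crash: its top few stack frames, offsets stripped.
    (An illustrative heuristic, not Microsoft's actual algorithm.)"""
    return tuple(frame.split("+")[0] for frame in stack[:depth])

def bucket_reports(reports):
    """Group crash reports with matching signatures into one bucket."""
    buckets = defaultdict(list)
    for report in reports:
        buckets[bucket_key(report["stack"])].append(report)
    return buckets

# Hypothetical crash reports: two from the same display-driver bug,
# one unrelated.
reports = [
    {"id": 1, "stack": ["nvdisp.sys+0x1a2", "dxgkrnl.sys+0x44", "ntoskrnl.exe+0x09"]},
    {"id": 2, "stack": ["nvdisp.sys+0x1a2", "dxgkrnl.sys+0x51", "ntoskrnl.exe+0x09"]},
    {"id": 3, "stack": ["fltmgr.sys+0x07", "ntoskrnl.exe+0x12", "hal.dll+0x03"]},
]

buckets = bucket_reports(reports)
# The two nvdisp.sys crashes land in one bucket, ready to be routed
# to a single developer.
```

The point is just that even a crude signature collapses thousands of duplicate reports into a handful of actionable buckets.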
That's part of how Microsoft finally got a handle on their software products. Is Apple doing anything like this?
Nah, I think it's a perception problem.
As someone whose starry-eyed Mac obsession predates Windows 95: Apple's software has always been buggy. It was buggy under Sculley, it was buggy under Amelio, and it was buggy under Jobs. I remember getting plenty of sad Macs under System 6 and 7, and early versions of OS X weren't any better.
We just didn't care because Steve Jobs was really good at distracting us with promises about how great things were going to be, really soon now.
The comparison with Microsoft is instructive. Microsoft software was even buggier than Apple's during their period of greatest dominance. Win95/Win98/WinME crashed all the time and were an open barn door for security. Early versions of IE were pieces of shit. Even later versions of IE (6-9) were pieces of shit. Microsoft finally got a handle on security & software quality just as the world ceased to care about them.
Apple's been driving change in the computer industry since the iPhone was introduced in 2007. New products are always buggy - the amount of work involved in building up a product category from scratch is massive, and you don't know how it'll be received by the market, so there are frantic changes and dirty hacks needed to adapt on the fly, and they often invalidate whole architectural assumptions. It's just that most of the time, this work goes on when nobody's paying attention, so by the time people notice you, you've had a chance to iron out a lot of the kinks. Apple is in the unenviable position of trying to introduce new product categories while the whole world is looking.
The Apple Watch is buggy as hell, but I still find it useful, and pretty cool.
I think you've touched on something key: people's annoyance at bugs is often outweighed by the new features they get. In other words, if the "new functionality" is perceived to outweigh the costs of the lack of reliability, the product will be deemed "less buggy" because people "understand" why there are issues.
This made total sense in some of Apple's biggest products: OS X and the iPhone. When OS X first came out it couldn't even burn CDs, but we all "understood" the magnitude of the project and thus gave it some slack. Similarly, the iPhone lacked a lot and was slow, but it was such a revolution that we let it slide -- in fact we let the rest of the products slide.
The problem today, I think, is that these decisions are being made for reasons that users don't deem "worthy". Introducing some new music service is not a good enough reason to break my iTunes. The fact that a new watch was released is not considered important enough to let other platforms languish. We "get" why less attention is being paid to other products, but unlike with the phone, it's not deemed a good trade-off.
In other words, I don't think Jobs was distracting us with promises, but with actual shiny things that made the bugginess worthwhile.
As a long-time Apple user and former employee, this is exactly how I feel about the current situation. I still think that my Mac/Apple devices are more solid than my Android/PC devices, but this is exactly what I'm noticing more regularly.
I forgive most faults that happen because it's almost as if I can forgive them not working all the time since 99% of the time, everything is awesome. On Windows, that same forgiveness manifests itself as me not using my Windows machines as much as my Macs. I still love to use my PCs, but not for anything that I need to rely on the majority of the time.
Now, though, Apple is making changes to things (iPhoto/Aperture were a really great example) where it seems like the change is just to bring parity of some sort to OS X and iOS rather than introducing new features. iPhoto was buggy as hell when they added Faces and Places to it, but I totally forgave that because 99% of the time it was making my life way easier than it was before by detecting faces properly. If it crashes every now and then, it at least saved the data, so I was still better off than I was before the update. I still like Final Cut X (I know, I know... I'm an outlier), but convincing me that a switch like iPhoto/Aperture -> Photos is worthwhile is much harder since there's nothing to distract me away from those issues and I've somehow managed to actively lose features that they convinced me were necessities in the past.
I hope this is not an indicator of things to come. One thing that gives me some hope is that they've gone back to alternating between feature updates and stability updates. Leopard was cool, but Snow Leopard was incredible to me. If that pace comes back, I'll be happy again. Until then, Apple needs to get their software game back in line with the rest of the company.
If you think that it was bad when they removed features from iPhoto, then I'd hate to think what you thought when they removed features from Numbers!
Oh yes... That was a bad move, I think. Luckily, I rarely have to use Numbers so I didn't really care. It just annoyed me that they removed some of the features that I actually did use when I needed to use Numbers. If they added the features back as quickly as they did with other apps, I wouldn't care, but they didn't. :(
I (Apple) will one-up you with Final Cut Pro X http://arstechnica.com/apple/2012/01/more-fcpx-fallout-top-r...
I love the new FCP. As a long-time user of FC7, I'm OK with losing some of these features as long as they added them back over time, and they've done that, for the most part (at least for my uses). The old FCP really needed a facelift; it was trapped in such an old mindset, from when video was still mainly stored on tape and editing software needed to work like real-life video editing tools. FCP X is so fast for me and such a treat to use for 99% of things that I can deal with having to jump back to FCP 7 every once in a while. As long as Apple doesn't somehow prevent me from using FCP 7, I don't care and love the new direction of FCP X.
Reminds me of a discussion on another HN article a few weeks back where someone proudly stated that if a feature customers used didn't fit in with the company's strategic direction, they'd drop it, and tough luck for the customer.
Apple seems to have the same mentality. They used to get away with it, but mostly because they replaced the dropped feature with something better. Now they just seem to drop features entirely. That's not a good way to go. As much as I despise Steve Jobs, he never let the quality of a product drop to the degree that big customers (or even smaller customers) left Apple without a major fight to keep them.
It's looking like Apple's obsession with making great, quality products is taking a back seat. I think they probably need to worry a little less about their schedule, and more about polish and feature completeness.
Rather remarkable that I'm actually saying this, to be honest! Apple would be the last people I would have guessed needed this advice...
I think that's an awesome sentiment if you're talking about something like an Arduino, where part of the experience is working around its quirks and limitations. If you've bought a device expecting it to basically be a transparent window into the internet (or your documents, etc.), having to deal with its quirks and limitations can put a really bad taste in your mouth. Especially if you paid top dollar for it.
"I think you've touched on something key: people's annoyance at bugs is often outweighed by the new features they get. In other words, if the "new functionality" is perceived to outweigh the costs of the lack of reliability, the product will be deemed "less buggy" because people "understand" why there are issues."
Right - which is why we have all of the snow leopard nostalgia: because none of the newer releases have given us anything substantive that we really needed to justify the hassle and the bugs.
I am trying to think of something - anything - that compels me to upgrade from SL on my Mac Pro, and all I can think of is that nifty take-a-picture-of-your-signature feature in the Preview app that lets you insert it into PDF documents.
Ok, and maybe USB3 ?
That's all I can think of.
AirPlay; much better multi-display support; tags in Finder; Spotlight enhancements. More than anything, the iCloud/iOS continuity features were also big if you had an iPhone or iPad, everything is just much easier to keep in sync.
I'm a Safari user (better battery usage for the # of tabs I have open) and it too has improved with El Capitan though that's irrelevant for Chrome/FF users.
ok, airplay I guess - although I've never used it, I do see people using it to good effect.
Worth mentioning that airplay is just userland software - nothing special, and no reason it couldn't have been added to SL.
I don't know about multi display, though - I've been under the impression that that is broken in new and fascinating ways with every single release...
Yeah absolutely. Snow Leopard's multi-display was great. As was Leopard's, Tiger's, and Panther's.
Then Apple broke it massively in Lion, and only finally resolved most of the (severe, productivity-destroying) issues with Mavericks.
Handoff is a really useful feature (when it works).
Also, SL maimed Exposé (that weird non-proportional grid view), which was reverted to the Leopard style in Mission Control (of which Mavericks/Yosemite had the best implementation, and they've now broken its utility in El Cap by hiding thumbnails by default. FFS.)
But apart from that... I think I preferred the Apple apps back in 2009-or-so.
To be honest, I think the latest Apple release cycles have been more about "remove a feature so that we can add it in again and sell it to our users again". Think multi-monitor support, something that worked perfectly in SL and earlier, and then broke fantastically with the full screen apps in .. Lion? ML? One of the two.
True, Apple software has always been buggy; Apple calls them undocumented features.
Apple always makes up for bugs with newer devices whose faster CPUs and GPUs make the OS code run faster. That means buying a new Apple device to get better performance. Older Apple devices are eventually left out of updates, and if they do update to a newer OS version, it runs slower.
Apple is driven by an upgrade model: buy a new Apple device every three years or so. In the PC world, Windows 7 can still run on old Pentium 4 systems, and if I am not mistaken some of them can even upgrade to the 32-bit version of Windows 10 and still work. For example, I used to have a MacBook that only ran up to 10.7; 10.8 needed a newer Intel CPU to install. Anyone with an iPhone 4 is going to find the latest iOS slow as well.
It is in Apple's business model to sell customers a new device every few years or so and phase out old Apple devices.
Apple doesn't care if their software isn't the best quality as long as it is easy to use and will keep people buying new Apple devices to run things faster.
I myself like GNU/Linux better than OSX, because it can run on older PC systems and it runs quite fast and has a good quality to it. GNU/Linux is virtually unknown to the average consumer and when people get tired of Microsoft they usually just buy an Apple device. Apple devices are easier to maintain and use. You even got toddlers using iPads, that is how easy they are to learn to use.
Apple has saved up billions just in case they have problems. Apple has done well financially in an uncertain economy where other companies are struggling.
Only Alphabet, Google's parent company, seems to be doing better for some reason. Google's Android needs better quality as well, and since Oracle sued them over the Java API they have to change the way the OS works. The web services seem to earn a lot of money, and Google's AI is very advanced.
> Apple is driven by an upgrade model to buy a new Apple device every three years or so.
I disagree.
My wife's iMac is 6 1/2 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iMac soon.
My iPad 2 is almost 5 years old, and still gets the latest OS updates (for free!), though we're upgrading to a new iPad soon.
It is precisely because our older Apple hardware is still working well, and Apple still supports us with the latest updates to that older hardware, that my family is not only sticking with Apple, but we've recently invested in new iPhones.
Apple has earned our trust.
Yeah, OS X and the iPhone were new products, not just upgrades of what came before (as you say, OS X had significant limitations compared to OS 9). You didn't have to upgrade right away, but if you did, you got an entirely new experience.
Compare this to today's Apple, where upgrades add "hundreds of features" but feel mostly the same (except everything runs a bit more slowly). There's no coherent vision of what the future of the software should be like.
wow, I couldn't put my finger on it, but this is why I am starting to hate apple. I spend a lot of time on the computer so I like a lot of things apple does, mostly the interface and the UI. To clarify what UI means to me, it is how simple and intuitive it is to use the device, and give it the instructions to do what I want. I am willing to forgive a lot because this is good.
Apple has always been a fairly closed system, and it didn't bother me more than not having the features I wanted. In El Capitan, it was different. Things didn't work well and Apple took over my whole system. With SIP (System Integrity Protection) I had no control. It would seem to turn protection back on after being disabled, and it takes a nontrivial amount of time to turn it off, because you have to reboot the entire system into recovery mode, wait for it to connect to the internet and download a bunch of Apple shit, then select a language preference, type a command into bash, and reboot.
Deleting apps is difficult, changing settings is difficult, having Siri take up 10% of my iPhone is annoying, removing apps destabilizes the system, and installing my system from Time Machine reinstalls their system and settings and overrides mine.
I disabled most of Apple's applications and processes, and the system is fairly stable, although I think I went too far with disabling Notification Center. But your point is correct.
tl;dr users are willing to accept a lot for revolutionary changes. Evolutionary changes with only marginal improvements are not going to make me forget that they unpredictably disallow me from using sudo and are fucking up all my devices doing things I don't want them doing in the first place.
A lot of people in this thread seem to think this is all about Apple not adding enough revolutionary features or something. But consider this alternate explanation: with years of experience comes a more sophisticated judgement. What used to seem good enough now seems to have obvious flaws, even if it's the same as it was before. Lack of control is an example: beginners often don't notice or care much, especially if it feels simpler, but as your needs deepen it becomes more important. Being able to set things up and then not keep touching them is one of those tastes that develop with experience.
That is a good point as well; I definitely agree with it. The one thing I would add is that I repeatedly get update notifications on my iPhone. Due to the increasing lockdown of all features, I am legitimately afraid to update, as descriptors like:
* provides security update
* increases iTunes performance
do not provide enough information about how they will fundamentally change my system. Most notably, when I updated my iPhone I found I had loaded some horribly inefficient talking pseudo-AI that was not neutral, but a straight-up negative feature consuming system resources.
I think you are really correct though, as you gain more experience and skill with technology you have more needs and better judgement. You can evaluate things better because you are aware of what is possible. The biggest problem isn't that they make changes, it is that those changes are not predictable so they become difficult to mitigate.
I'm also concerned about updates. For instance, I'm currently having to route all my iPad web traffic via Charles Proxy so that any instances of style="overflow:hidden;" in a body tag are cleared out.
Why? Because in iOS 9.2 Apple released it with a bug that causes the viewport to zoom incorrectly on these web pages. This affects LibreOffice's OpenGrok, which I browsed regularly on my iPad.
They still haven't fixed this, and it's a major regression. iOS updates are few and far between. Consequently, I'm seriously questioning what their updates actually do to my iPad and iPhone.
I wouldn't hold my breath. The iOS Mail app cannot negotiate any TLS version above 1.0 (for IMAP, possibly SMTP too) even though it obviously supports TLS 1.2: it sends a TLS version of 1.0 in the ClientHello message even though that same message contains TLS 1.2 ciphers (AES-GCM).
I reported it in October and Apple's security team replied they're aware of it but it's still not fixed 2 releases later even though they probably need to fix like 1 line of code (the advertised version flag).
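For comparison, here is a minimal Python sketch of a client context that refuses to negotiate below TLS 1.2, which is essentially the version floor a fixed ClientHello should advertise. Nothing here is Apple's code; it just illustrates the client-side behavior being described:

```python
import ssl

# Build a client-side TLS context that will never offer anything
# below TLS 1.2 in its ClientHello (requires Python 3.7+).
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# A connection made with ctx.wrap_socket(...) would now fail against
# a server capped at TLS 1.0, instead of silently downgrading.
```

The bug described above is the inverse: the client advertises 1.0 as its version while still listing 1.2-only ciphers, so careful servers refuse the mismatch.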
They have actually fixed it - if you get bit with it then you can reference rdar://problem/22242515
The WebKit bug is here:
https://bugs.webkit.org/show_bug.cgi?format=multiple&id=1528...
The patch to fix it is here:
https://bugs.webkit.org/attachment.cgi?id=268394&action=diff
The workaround, FWIW (thanks Simon!), is to add "shrink-to-fit=no" to the meta viewport tag.
For me, it was too much effort to get OpenGrok fixed, so I just did a rewrite rule in Charles Proxy that gets rid of the style attribute.
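For anyone wanting to apply the same workaround at the page level, the viewport tag would look something like this (a generic example, not OpenGrok's actual markup):

```html
<!-- shrink-to-fit=no is the workaround; the rest is a typical viewport tag -->
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
```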
I agree with your summation. To add to this, there are many things going on under the hood that none of us asked for that are taking up system resources, dialing home and draining battery life.
Some time, try this yourself:
sudo opensnoop
The original iPhone wasn't slow at all. One of its main selling points was the speed of its menus and apps (I forget what they actually called apps before the App Store).
I think you forget how crazy slow feature phones were. Opening a GPS app and finding your location could take 5-10 minutes in 2007 on a feature phone.
They called them apps too. It's easy to remember with the infamous (quoting from memory): "You can write apps in HTML".
Guess I completely forgot that. Thanks.
I think you're right, but people tend to take the worthwhile features for granted and underestimate the difficulty of making anything ever work correctly at all.
>> Apple's software has always been buggy. It was buggy under Sculley, it was buggy under Amelio...
Classic Mac OS was buggy by design - it didn't have multitasking and memory protection, it was single user... Windows had that same problem before 2000 (well, NT 4, but not many people used that).
I use OS X daily and I use Windows 7 daily. I have far fewer issues with OS X for whatever reason that may be. My computers don't magically reboot or bluescreen nearly as much. It might happen every 6 months at the most, where with Windows it probably happens every 2 months.
I agree that OS X is more reliable. I have both a Mac and a PC in my office, and my 5K iMac has never crashed. I had some weird issues with Windows 10 after I upgraded my laptop, where it would hang after an update.
I switched from using Windows for 20 years to OS X (excluding Linux for work related stuff.) This is the first time I've been able to work from a laptop with my productivity level as good or better than a desktop. The design and usability surpassed what I expected. I haven't noticed any bugs.
I can't imagine having a different brand of computer, but there are lots of OS X bugs that make my teeth grind. The Finder doesn't remember that I only ever want one layout, and it resets to a random alternative setting regularly. There are still progress bars that pop over and can't be hidden. The data display that shows used disk space as nearly all "other" is a bug as far as I'm concerned. My Apple TV has stopped incrementing itself, and it's not ideal. And as a genuine question, has anyone not had strange Xcode behaviour or crashes at least once per day? I currently have a slow-motion simulator that changes views over 5-10ish seconds.
In terms of laptop OSes, OS X is by far the best. It's stable, usable, and 100% desktop-OS focused. I can't stand the touchscreen features that Windows 10 still tries to force on you. Somehow my Windows 10 laptop got put into "tablet mode," and it was pretty unusable. I couldn't access the desktop anymore, it was slow, and it took a while to figure out the issue.
I think it was flipped on after an update, but why would I even want to be able to enable that mode on a laptop without a touchscreen?
If Windows 7 is "magically rebooting and bluescreening" often enough to comment on, then you have a hardware problem.
I don't use Windows very much, because I hate using Windows. But what I will give it is that I haven't seen a blue screen in the past decade for any reason short of bad memory, overheating, dead disk. I suspect that a lot of the Windows image problem is that people are free to buy really cheap hardware and fiddle with things they don't understand.
Apple merging the classic Mac OS with NeXT's OS to make a Unix-based OS was the best move they could make at the time. It happened during the era of the $10,000 Unix workstation, and Apple made OS X an easier-to-use Unix. It was cheaper to buy a Macintosh than a SPARCstation or some other Unix workstation.
Because Apple made OS X Unix-based, it cut into the sales of Unix vendors like SGI, and GNU/Linux cut into those sales as well.
But making OS X Unix-based solved a lot of problems in the classic Mac OS that Apple couldn't otherwise solve.
"...magically reboot or bluescreen nearly as much." Mine never magically reboot. Ever. (OS X)
My MacBook Pro did, which reminds me that the extended warranty period is ending.
I regularly get asked to fix someone's Windows PC; no such problem with people who have Macs. With OS X, if I remember correctly, there are waves of releases (major versions): some introduce swaths of new code/features/replacement code, while others are more speedup-and-bugfix releases. Maybe I'm wrong.
I go places on my Windows pc that I wouldn't dare take my Mac. I expect it to need repair.
My pc is my beater car, and it needs repair--regularly.
My Mac is the classic car in the garage, that only gets used for work, or safe places.
Speak for yourself. I used an old G4 PowerBook for grad school, and it travelled over 100,000 air miles, and into the various labs where I had to work, and also to far-off Asian countries for holiday. Plus, I didn't have to buy a developer kit: it came free with the machine.
I assumed that the travel referred to dangerous parts of the web.
Actually NT 3.51, which I used for dev and was great compared to my colleagues on plain windows.
Actually NT 3.1. 3.5 followed, then 3.51, then 4.0, then 5.0 (2000), and then the NT line ended as it was unified with the non-NT line.
Technically 9x line ended since Windows XP was NT-based and not 9x-based.
Yes, I suppose so. I didn't want to say the 9x line ended, because it's really the line of DOS-based OSes, and while the NT line is ongoing, it's no longer called NT. 2000 was the last version to mention NT, and it wasn't part of the name itself, just a tagline.
One of the things about OS X is that most of the time it's put on high-quality but non-exceptional hardware. So things like bad RAM, flaky power supplies, and bad GPU drivers are almost never an issue with a stock machine.
Windows, not so much. The only stability issues I've had with windows have been related to poor drivers, almost exclusively from nVidia or ATI/AMD. The equivalent hardware for Apple machines either didn't exist at the time, or was running much less ambitious drivers.
I probably have more issues with my Macbook Air (relating to sleep, hibernate, and wake-up) than I do with my Windows machines these days.
To give you a counter-anecdote, I use MacBooks at work. Over the last five years, I've had two hardware failures and gray screens maybe once every four months. In addition to that, I have issues maybe once a month where the machine more or less locks up (from the logs it looks like windowserver/loginwindow has crashed and OS X is trying to do a spindump on them).
Compare that with the _desktop_ Windows 7 machine. It first crashed intermittently (memory failures), but after I changed the motherboard, it has not crashed at all. But then again, I am not using, for example, the most cutting-edge graphic drivers.
I remember quite some crashes during the Windows XP times, but I've since taken a more conservative approach to hardware and drivers.
Jeez, you're talking about something designed and built in 1982-1983 (over 30 years ago) and meant to run on something with 128KB of RAM with no hard drive and a 400KB floppy disk.
You try fitting all that plus a GUI into those constraints.
What's amazing is that it had the features it had and that it worked at all.
You might like to check out MenuetOS/KolibriOS and the old QNX demo disk.
Both provide GUIs and rudimentary Web browsers. QNX was full POSIX, too, although the demo disk didn't include a terminal.
... and there goes my macbook (2008?) where windows 7 runs harder better faster than osx, and pretty much more reliably than the monster imac at the office.
Pity that nobody remembers Windows NT 4; it was miles better than OS 7. I stopped using the Mac altogether after I started using it.
> Microsoft finally got a handle on security & software quality just as the world ceased to care about them.
Not quite. They got a handle on security when Linux lit a fire under their ass.
Competition, true honest-to-goodness market competition, spurs improvement.
The thing about Apple is that they may have competition on hardware, but they have no competition on Software.
If you buy a Mac or a iPhone, you have already thrown money at Apple. But you can easily assemble a PC without Windows and then install Linux on it.
Keep in mind that the latest US warship is not running Windows but RHEL. That is a very big wake-up call for Microsoft, whereas before we saw the likes of Win2k (on a US ship) and XP (on a UK submarine) used around the world.
> They got a handle on security when Linux lit a fire under their ass.
I have my doubts that all 0.02% or whatever of PC users who are dedicated to using Linux as their desktop OS influenced Microsoft to do anything but if you have data to show otherwise I'd be interested in it.
I'd assume parent was thinking less of PC users and more of other OS consumers: https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...
And those are the public ones.
BTW, I seem to recall that the London Stock Exchange had a spectacular failure when they went with Windows. The result was that they switched to Linux within a year or two of bringing their brand-new Windows system online.
Ah yes, found it: http://www.computerworld.com/article/2467082/data-center/lon...
Desktop smesktop. For MS the desktop has always been a means to an end. It's always been about "total cost of ownership", where they can claim people need less training before being productive at their new job.
But to manage all those desktops you need servers, and with MS comes the billing per active user, etc.
"if you have data to show otherwise I'd be interested in it." ...
Some data says Linux desktop/laptop share is 1.5% (not counting chromebooks)
https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...
Please note also that Android is Linux and iOS is Darwin, a BSD Unix.
Linux- and Unix-based kernels are more numerous than Windows-based units.
The fact that iOS and Android have a UNIX-like kernel doesn't count for much if the majority of userspace apps use non-UNIX APIs and tooling.
They could replace the kernel with something else and most devs wouldn't even notice.
> 0.02% or whatever of PC users who are dedicated to using Linux as their desktop OS
Wikipedia: 1.5% [1] NetMarketShare: 1.71% [2] W3Schools: 5.6% [3]
[1] https://en.wikipedia.org/wiki/Usage_share_of_operating_syste... [2] https://www.netmarketshare.com/operating-system-market-share... [3] http://www.w3schools.com/browsers/browsers_os.asp
I don't know about Linux influencing Microsoft on the desktop, but it has surely lit a fire under them on the server. You are 100% trolling by claiming 0.02% of PC folks use Linux. It is at least two orders of magnitude higher than that, and then there's Android... :)
The obvious example would be netbooks. MSFT moved very fast to counter Linux on that front.
Not exactly: OEM PCs (mostly) come with an OEM version of Windows, which is pretty cheap (I don't know what other incentives Microsoft offers, though).
But you can install Linux on it for sure :)
I had to pay $100 extra for the Windows license that came with my Lenovo Ideapad.
And Lenovo wouldn't let me buy their laptop without the windows license.
Mostly ;)
http://www.amazon.com/s?ie=UTF8&page=1&rh=n%3A565108%2Cp_n_o...
(amazon shop --Laptop -- PC )
Frankly, I suspect MS would love to ignore the consumer world, except that then they would lose their beloved "total cost of ownership" argument for doing B2B sales.
OS X and iOS are both way more stable than any pre-OS X Apple OS. I can't believe people forget it.
When people talk about OS X having issues, they often mean some new feature is a little flaky. Classic Mac OS lacked basic stability and security features like preemptive multitasking and memory protection. Classic Mac OS was just like the pre-NT Windows: crash prone.
That's not a high bar though. The only other high profile desktop OS around at the time, Windows NT, was more stable than any pre-OS X Apple OS for a long, long time.
I agree with this sentiment. We can complain all day, but the fact that we have these devices and software that have been made accessible to us by Apple is astounding from a historical perspective. My parents are in awe of the calendar app, and apple maps, etc. As they should be!
OS X is also still the best development platform despite its flaws.
Are you missing an /s tag there or am I temporarily dumb?
Neither? Not sure which part you are referring to. I really believe it's the best dev platform; I've tried all of them. Ubuntu can come close, but it's orders of magnitude less user-friendly. In my opinion and the opinions of folks I've discussed this with, anyway.
I would agree. I love my Macs for dev work. Web dev and app dev alike are an absolute treat with a Unix-style backend but a much more polished front-end. Ubuntu is probably the only Linux distro that comes close to giving me that terminal power without rubbing it in my face constantly when I'm just trying to manage my day to day stuff and, even then, it's not even close to OS X. Windows, on the other hand, is only usable for me with third-party software for everything and then I feel like I'm spending just as much time futzing with everything as I am doing anything productive.
Same with me. I basically do three things on my computer: develop code, edit pictures, and write stuff. Almost all my files are in the cloud, available through the browser, and the full-fledged terminal with lots of convenience tools just feels great.
I still can't work out how to get Xcode to load the LibreOffice gbuild projects. When I do, I think I'll probably be a convert. Till then, I guess I remain with vi.
OK, thanks for your polite answers.
Perception problem is the right description. Let us take a devil's advocate view and try to fit the facts into a narrative that inverts the public wisdom.
Microsoft is making over four times what it made in its glory days, growing year by year, across a wide range of products and services. Windows and Office account for only half of that, making Microsoft a diversified company with plenty of potential for revenue growth. Windows 10 is by far the most successful Windows release ever, with more active installs than OS X (any version). Basically, the only place Microsoft is truly failing is phones.
Apple, by contrast, gets two thirds of its revenue from the iPhone. They have nothing else that even comes close, and nothing that could replace it if iPhone sales start dropping. Mac sales are down, iPad sales are down, and the Apple Watch is a dud. Since 1990, Apple has basically had two hits: the iPod and the iPhone. I did not mention the iPad because it is just another iPhone model, which you can tell by its sales slumping as iPhone screen sizes moved up. Success for Apple is rare, and most of what they do isn't all that amazing. The Apple TV isn't going anywhere, even after the refresh. The Apple Watch distinguishes itself from other smartwatches only by its price. Basically, the only place Apple is truly succeeding is phones.
Perception is everything. How you choose to look at the facts determines which facts you see. Apple is perceived as strong and Microsoft as weak, but the facts give you the option of going either way.
Regardless, Apple has few excuses for any quality issues. They have the resources, and they have had enough time (aside from the Watch, everything else is half a decade old or more). Personally, my Mac and iPad anno 2015 have about the same number of glitches as my Mac and iPod anno 2005. To me, Apple doesn't seem to be getting worse, but they don't seem to be getting any better either.
> We just didn't care because Steve Jobs was really good at distracting us with promises about how great things were going to be, really soon now.
I tend to characterize the "reality distortion field" as a magician-like talent for focusing an audience's attention on a particular subject.
Jobs was taught the RDF by his professor. This means it can be learned. The reader is encouraged to learn how to perform the field, so that they can defend themselves, and others, from its effects.
As the villain in the Incredibles said:
"When everyone's a superhero, no-one will be."
> Jobs was taught the RDF by his professor.
I didn't know that story. Who was the professor? What was the technique?
> This means it can be learned.
Oh definitely. Magicians learn all their tricks, and they are very useful for anyone performing in front of a crowd.
> The reader is encouraged to learn how to perform the field, so that they can defend themselves, and others, from its effects.
Which reader is that?
Robert Friedland, apparently. And he wasn't a professor, but rather a classmate (and Reed's student-body president).
We cared when we tried using System 7 Macs to control industrial machines, as I did.
If you treated 16- and 32-bit Windows nice -- typically running one program over long time periods -- they were quite stable on the plant floor.
We have two different theories here. We're smart people, right? We should be able to figure out whether we just perceive software quality to be worse or whether it really is.
So how do we measure this in some valid manner?
> I remember getting plenty of sad Macs under System 6 and 7
When System 7 or 8 crashed, it crashed hard: complete system lockup. And it crashed rather often. There was no recoverable, progressive crash like on Windows.
When you say the world doesn't care, you might be nearly right from a consumer perspective, but that's not really their target market. Outside of cool IT and design companies, almost everyone's business machine is running Windows, a lot of servers are running Windows too, and SQL Server and Visual Studio are at an all-time high in prominence for business software development.
As someone who regularly uses OS X and Windows, I'd say OS X is as reliable as Windows 7, 8.1, or 10, or even more so, particularly with regard to crashes. Apple did have a really annoying Wi-Fi bug; it has been fixed, but it did take a while.
iOS is also in pretty good shape, but almost every time Apple releases a new version, it's buggy. By now, iOS 9 is a very stable and robust OS, but it needed work up front.
The biggest places where Apple is having trouble are with new products. Watch OS was slow, buggy, and limited at release. It's pretty much at a 1.0 state right now. The new Apple TV is by far the best version of the Apple TV, but the OS is buggy and still needs refinement.
My take on this is two-fold:
1) Apple is doing more and more products, causing there to be issues with newer products. They haven't been putting in the QA work on newer software. OS X is old and mature software, so it's pretty stable, but something like Watch OS is very new.
2) Apple's insistence on yearly OS upgrades is causing there to be a lot of 1.0 roughness each year. Just slowing down to a two-year cycle would allow for a lot more time to refine and more time where the OS has been patched and is the latest OS. iOS 10 will be announced in a few months, but iOS 9 still has at least one major point update to go.
This is more or less my take - other than some occasional high-profile bugs, I'm not sure Apple has a declining-quality problem. Their software still seems well above industry norms for quality (and either in line with or better than Microsoft's, largely because of their strategy of abandoning backwards compatibility).
Apple, however, is held to a much higher standard than they've ever met, and much higher again than industry norms - and you raise very valid points that they're shipping shit before it's ready. I don't think yearly upgrades are terribly compelling anymore, or really needed - I want a computer that works really well and does the things I want it to do, with a minimum amount of fuss.
Apple's R&D budget seems to be mostly focused on hardware. It's hard to say definitively, given how secretive they are.
Microsoft invests a massive amount of money into MSR, and creates tools out of the most useful results. The Static Driver Verifier depends on Z3, an SMT solver developed at MSR. Other verification tooling like SAL (C/C++ annotations to assert contracts for functions) has a similar history.
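To make the verifier idea concrete, here is a toy sketch of the kind of property such a tool checks: "no input can drive this routine past the end of its buffer." This is a hedged illustration only - it brute-forces a small bounded input space rather than solving constraints symbolically the way SDV/Z3 do, and `copy_packet` and `verify_no_overflow` are invented names, not part of any Microsoft tooling.

```python
# Toy sketch of a driver-verifier-style check: prove (here, by bounded
# exhaustive testing) that no input overflows a fixed-size buffer.

def copy_packet(src: bytes, dst_size: int) -> list:
    """Hypothetical driver routine: copy src into a buffer of dst_size."""
    dst = [0] * dst_size
    n = min(len(src), dst_size)  # the clamp a verifier would insist on
    for i in range(n):
        dst[i] = src[i]
    return dst

def verify_no_overflow(routine, max_len: int = 16, dst_size: int = 4) -> bool:
    """Try every input length up to a bound; a real tool proves this
    symbolically over all paths in the driver's C source instead."""
    for length in range(max_len + 1):
        try:
            routine(bytes(length), dst_size)
        except IndexError:  # an out-of-bounds write surfaces here
            return False
    return True
```

The contract is the same shape as the real thing: prove the absence of the overflow, or reject the driver - the driver may still not work, but it can't take the kernel down with it.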
They should probably consider using that "static driver verifier" because Surface Pro 4 is crashy as hell.
Yeah, but that's because Skylake is buggy as hell. It's a bit of a bummer. MS cannot publicly blame Intel for the raft of Skylake bugs, but it's first-gen hardware, first-gen software, and first-gen firmware.
For what it's worth, my Surface Book (obviously a product they care a lot about) has basically gotten to a crash rate equal to my Mac's (maybe one actual problem every two weeks). I suspect the Surface Pro 4 will get that love next as the other Skylake power issues get sorted out.
If Microsoft was serious about enhancing the real and perceived quality of the brand they are establishing with the Surface Pro line (which seems very nice, based on my mother's earlier-generation model), they should have tested Skylake in the Surface Pro 4 prototypes, detected the problem, and delayed the launch. Intel's problem became theirs when they shipped it in their hardware.
What's interesting about this is that it appears they did! Very very early SB models exist and tested MUCH better than the first run of production hardware. You can find a lot of early reviews that praise the battery life, etc.
Then the first wave of consumer-facing SBs went out and it was a total disaster. This might be something that Microsoft can fix, because they have dramatically improved the product experience and been very receptive to trading defective hardware. Mine was traded up the instant they looked at it, with apologies and a customer care call.
> they should have tested Skylake in the Surface Pro 4 prototypes, detected the problem, and delayed the launch. Intel's problem became theirs when they shipped it in their hardware.
Did you say the same thing when Apple shipped a massive defect rate on their first-gen Retina MacBooks? Because they DID test those, and they still ended up shipping a truly phenomenal number of lemon screens with huge defect and failure rates.
Oh, and Apple refused to replace all but the most egregious failures. I still have a machine with such significant ghosting that it can be difficult to use. Ironically, DaringFireball is actually unreadable. I keep this machine around because it was part of a very special segment of my life, but also because I like showing people, "Yes even Apple's legendary hardware is rife with first gen bugs, and your iPhones and hypothetical new macbooks are no different."
Yep. That's likely why there are no Skylake MacBooks yet.
The fact that Skylake was totally out of cycle with Apple's usual release efforts might have something to do with it. MS was in an unusual position.
As an engineer, I'd say this is the correct response.
Well as an engineer, I think we all know that manufacturing defects can creep in even after prototypes pass.
It's also the case that it's incredibly hard to test things like battery life, wifi connectivity and the effects of heavy processor workloads in a systematic way. You hope that your vendors do a good job (and I bet Microsoft's contract with Intel involves penalties for these major defects to try and incentivize Intel to handle these).
Look at the first iPhone 4. How did they miss something as simple as skin contact causing significant antenna interference? Most of us hit it immediately. The answer: hardware in the real world is really hard.
Just a little bit of trivia which I found interesting -- Apple actually did not miss the antenna interference problem. They knew that it was an issue, but I guess they figured it was an acceptable tradeoff for the design they wanted.
http://www.bloomberg.com/news/articles/2010-07-16/jobs-says-...
I get the impression that Jobs did not think it would be received as negatively as it was.
I suspect if they knew how badly it hurt my reception (I totally lost signal and it took a long time to get back), they would be less surprised at the reaction.
That's not what I'd call a bug that "crept in". The Surface Book I bought crashed unprovoked _several times a day_. That's deliberately shipping a completely faulty product that any self-respecting customer will take right back to the store. Which I did.
You should take your Mac to the store and have them take a look. Or at least run hardware diagnostics. I've been using Macs for well over a decade and in that time I have maybe seen 4-5 crashes total, across multiple laptops and desktops. A crash every two weeks is not a normal situation in OS X.
Depends on what you are doing.
Just yesterday, OSX was convinced that I had an external monitor. I did, but that was 2 hours back when I was at the office. So I got to the preferences screen and.. the kernel crashed.
Do you have some kind of third-party display or window manager installed? I regularly (as in "every day") have my Macbook hooked up to dual monitors and I've never had the kernel crash due to a disconnect or a change on the preferences screen. Does that happen regularly for you or was that just a one-off occurrence?
I've got an otherwise unexceptional LG monitor that with one specific generation of MacBook causes all sorts of problems. My windows machines and newer macbooks don't have this problem, and connect to the monitor fine.
So it can be hardware issues. Often subtle ones.
Connected via HDMI? And goes into YPbPr mode because OS X thinks its a TV? And has no override.
Some combination of Windows on VMware, startup utilities, a corporate virus/malware protection tool, and external monitors is my bane.
I've given up keeping VMWare open, and I experience very very few issues - even on an older OSX release (again, due to corp IT).
Lucky you, VMWare is getting rid of Fusion anyway.
If you use virtualization, it is.
I don't know if it's fair to compare Microsoft's efforts with Windows and Apple's with OS X. Windows runs on hardware from a variety of different vendors. OS X is pretty much commercially locked down to Apple's own hardware (unless I'm missing something?). It's actually a shame that it isn't damn near flawless.
OS X runs pretty well on non-Apple-certified hardware. I have a Hackintosh I've been running on desktop for a few years. There's a large community around these things and users' experiences are mostly positive. Of course, this is due to the dedicated efforts of a small group of hackers.
My hardware configuration matches no Apple product.
Microsoft indeed improved quality a lot, though Windows 7 still occasionally and inexplicably grinds to a halt, and sometimes outright hangs, on my desktop (one can blame my Wacom tablet, but that contradicts the thesis of driver verification working wonders), and Windows 8 periodically renders my laptop unusable, using near-100% of the disk bandwidth (I tried some five tweaks recommended on the web for this problem; nothing helped).
But that is not nearly as bad compared to having to rely on software developed the way they do in the aerospace business! From http://blogs.law.harvard.edu/philg/2010/02/09/public-tv-figu...:
> Who crashed Colgan 3407? Actually the autopilot did. … The airplane had all of the information necessary to prevent this crash. The airspeed was available in digital form. The power setting was available in digital form. The status of the landing gear was available in digital form. …
> How come the autopilot software on this $27 million airplane wasn’t smart enough to fly basically sensible attitudes and airspeeds? Partly because FAA certification requirements make it prohibitively expensive to develop software or electronics that go into certified aircraft. It can literally cost $1 million to make a minor change. Sometimes the government protecting us from small risks exposes us to much bigger ones.
(I agree that Apple's cash hoard does not make $1M sound like a lot, however, they also have much more software to tend to.) Overall, it seems that today you have to trade correctness for features and development time, and the cost in features and development time cannot be borne by a market participant unless the market is regulated so that all competitors have to do it, in which case the user is going to get way, way less functionality. I believe that the cost of bulletproof correctness might drop significantly enough at some point to change the game - and I really, really hope formal methods will take off big time, without being sure they can - but it doesn't seem like we're there yet. (This is my opinion, not data, of course; the one thing that I think $millions buy that works very well without costing too much time or features is automated testing.)
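To make the automated-testing point concrete, here is a minimal sketch (all names hypothetical - this is not avionics code): once a bug like an out-of-range sensor value has bitten once, a cheap regression test pins the fix down permanently, which buys much of the quality payoff without aerospace-certification cost.

```python
import unittest

def parse_airspeed(raw: str) -> float:
    """Hypothetical parser: reject out-of-range values instead of guessing."""
    value = float(raw)  # raises ValueError on non-numeric input
    if not (0.0 <= value <= 1000.0):
        raise ValueError(f"airspeed out of range: {value}")
    return value

class AirspeedRegression(unittest.TestCase):
    # Each past bug gets a permanent test so it can't silently return.
    def test_nominal_value_parses(self):
        self.assertEqual(parse_airspeed("250.0"), 250.0)

    def test_negative_value_rejected(self):
        with self.assertRaises(ValueError):
            parse_airspeed("-5")

    def test_nan_rejected(self):
        # float("NaN") parses, but fails the range check (NaN compares False)
        with self.assertRaises(ValueError):
            parse_airspeed("NaN")
```

Run with `python -m unittest`; the design choice is that the tests encode past failures rather than an aspirational spec, so each fix is cheap to lock in.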
Tim Cook doesn't seem to care about anything other than the supply chain. He came from being a COO, so that's all he knows. Consider how many different models of iPad there are.[1] It's like he's bragging. "Look how good we are at managing suppliers. Look! LOOK!" Meanwhile, the watch, Apple Music, and everything else that reached a v1 release under his tenure so far has been buggy and broken. But hey, at least they have a "gold" Macbook now!
>>Apple has the resources to approach software like aerospace. Formal interface specs, heavy Q/A, tough regression tests, formal methods. It's expensive, but Apple ships so many copies that the cost per unit is insignificant.
They don't have the time, however.
They have the money to buy more resources and a lot of those tasks can be done in parallel.
There's this really great book you should read about myths and man-months
That's true but only if there's only one team working on software at Apple. There's no reason to assume that the iTunes team needs to be the Photos team; while there might be certain dependencies on something like iCloud or OS X, the areas everyone complains about tend to be clearly contained within a single app and there are well-practiced ways to deal with things like API changes.
9 women combined can't make a baby in 1 month
That just means involving more people _right now_ won't speed things up. But you can "make 9 babies in 9 months" if you plan early and involve enough people. I.e., know how many QA people you need and hire them up front.
Yeah, and have them twiddling their thumbs while there's no product to test?
9 women can make a baby every month for 9 months
with an initial delay of 8 months.
Or rather, the marginal return isn't worth the investment.
>>> Apple has total control over the hardware, total control over third party developers, and $203 billion in cash. What are they doing wrong?
Nothing. And that is the problem. When you are generally speaking doing everything well there is little room for massive improvement. The days of perpetually exponential improvement, and resulting growth, are over. Apple is not a startup. Like Ford, Sony and GE, they now have to settle into the grind of incremental improvements for reasonable returns.
Or they can put their markers down on ever more grandiose schemes. They could branch into transportation by starting an airline, or a robot taxi service, but I doubt shareholders will tolerate such outflows for long. If doing so causes the neglect of the core business (iPhone) shareholders will revolt.
Apple is losing market share to Android. The gravy train may not go on forever. Apple today is in the position of GM in GM's glory days, the wonderful days of powerful V8 engines, HydraMatic transmissions, tailfins, and the "longer, lower, wider" wide-track Pontiac. GM didn't think they needed massive improvement in quality. They were wrong.
Watch this commercial for the 1967 Pontiac GTO.[1] Looks a little like one of Apple's teaser ads from the Jobs era, doesn't it?
I think that the Apple/Google/Microsoft/IBM quadrifecta perfectly illustrates one of the lesser-known points of The Innovator's Dilemma: customers care about different values in different points of the product's lifecycle, and that leads to differing companies becoming dominant.
When a new product category is introduced, customers primarily care about ease of use and relevance to their lives. Radical vertical integration is usually necessary to achieve this, because any friction in the product's interface is on top of the friction of getting consumers to use a product that they're completely unfamiliar with. Hence, the market is totally dominated by one company that makes everything from the chips to the hardware to the OS to the apps. This is Apple. This is the Apple II in 1976, the Mac in 1985, the iPhone in 2007, the iPad in 2010, and now the Apple Watch in 2015. (It's also Netscape in 1993, Yahoo in 1998, Amazon in the early 2000s, and AWS today.)
As the market matures, more competitors enter. An ecosystem of third-party apps develops. Hardware supplier prices drop as more hardware manufacturers develop expertise and enter the market. Customers start to value compatibility, options, customizability, and adherence to standards over raw ease of use. This is Google now and was Microsoft in the 80s & 90s. This is MS-DOS in 1981, and Windows 3.1 in 1991, and IE5 in 1999, Google Search in 2000, Chrome in 2008, and Android in 2011-present.
Eventually the technology moves up-market. Customers start to care more about security, stability, reliability, and performance. That's Microsoft now and IBM in the 70s & 80s. That's mainframes in the 80s, and MS Office and Win7 now. At this point, the technology is already being disrupted, but the disruptive technology isn't reliable enough for a segment of the market.
Finally, you get to the point where customers care about brand and compatibility with existing installations. This is maintenance work, where the company becomes a consulting outfit to keep all the technologies they invented a generation ago running. That's IBM now.
This seems like a good description of product maturity process in B2B markets. But consumers are motivated by different things(once products are good enough) - the chief among them is psychological/social value/perception, mostly created via marketing or by being first - and in general pretty hard to disrupt.
I think that the emotional-value aspect slows down the disruption cycle in consumer markets, but it doesn't stop it.
Emotions, after all, are just the brain's way of processing lots and lots of information that can't be compared on a rational basis. Part of that information is "What do my friends use?", part of it is "How does it fit into my life?", and part of it is "What does it say about me as a person and what I value?"
But all of those factors are still subject to reality: if a new product comes out that fits into your life better, eventually somebody's going to break ranks and adopt it, and they'll be able to explain to their friends, authentically, why they believe it's better. All of the catalysts I mentioned in the original post reflect changes in the ecosystem: the shift from ease-of-use to features & compatibility reflects more things you can do with the product, the shift from features to reliability reflects using the product in more consequential situations, and the shift from reliability to branding & maintenance reflects how you're perceived for choosing the product.
The Tipping Point describes the mechanism for this in consumer markets well. Product adoption starts off with Mavens, people who like trying & evaluating new technology on its own merits. It spreads through Connectors, who have a wide circle of friends and enjoy telling them about interesting new things that might benefit their life. Finally, the holdouts are convinced by Salespersons who explain, point-by-point the benefits and answer objections.
>> I think that the emotional-value aspect slows down the disruption cycle in consumer markets, but it doesn't stop it.
That may be true.
But in the context of iOS vs Android:
1. Most features come from apps - both have strong app ecosystems, and iOS probably has the stronger app ecosystem because it serves wealthier people. To a certain extent that applies to reliability.
2. Some features are native to the OS. So you see a competition, and Android is certainly faster there, via the rooting community, competition between OEMs, etc. But Apple usually responds - at least when things appeal to the mainstream and don't negate their strategy. As for reliability, I'm not sure Android is viewed as more reliable (think of security vulnerabilities like Stagefright). But yes, maybe Google can lead Apple here, because they seem stronger technologically. The only question is whether they can do this permanently, or whether it will just buy them some time - and would that be enough?
Also, let's not forget the network effect embedded in iOS via iMessage (which many users say prevents them from moving to Android).
>> the shift from reliability to branding & maintenance reflects how you're perceived for choosing the product.
I'm not sure that's true. It all depends on how psychologically important that product is to you, versus how important the features/reliability differential is.
Thanks, that was a really interesting read. However, like my sibling states, consumer markets are subject to the whims of marketing, which may distort this somewhat.
> Apple is losing market share to Android
Who cares? They are siphoning the profits from the market and selling more phones than they ever have before. They have peripheral businesses, e.g. Apple Pay and the App Store, that are doing very well, and I am sure more will come in the future. They are never going to be the company that goes for market share above all else.
> Apple today is in the position of GM in GM's glory days
I don't think so. Apple seems quite happy to acquihire their way out of any innovation slump. There are a ridiculous number of companies, especially in the VR space, that they have acquired and that we have seen no evidence of in their products yet.
Pretty exciting times ahead I imagine.
They're losing market share in the phone market as a whole, but in the high-end, premium market they're doing quite well. And that's the market where the profit is, not in the low-end, free-on-contract devices.
Fantastic commercial!
"The Great One"
I can almost see Don Draper standing in the shadows behind the car.
2 possible causes for this:
1) Apple still develops the OS using waterfall over the year. Entire sweeping changes are made only at x.0 releases that trickle down to teams that have to work around the instability all year long and there's no other approved way to get in significant changes.
2) They keep adding more apps to the core OS image that can only be updated with a full software update now. This makes delivery of quick fix updates near impossible since they have to go through the OS release management teams.
It certainly sells better to have a huge list of changes at WWDC that then become reasons to upgrade, but software delivery has moved on from waterfall, so in that respect Apple's OS teams are behind.
>They keep adding more apps to the core OS image that can only be updated with a full software update now.
This is a huge drawback for Safari, both on desktop and mobile.
#2 is just not true. They deliver point releases that add new core functionality all the time. For example, Photos for OS X was delivered in 10.10.3. A point release that came mid-year and delivered a huge amount of new functionality, including photo streams shared between iOS and OS X.
They also deliver many bug fix releases throughout the year, on both platforms. The 9.x releases have seen them add support for WatchOS 2, and many other things.
It's a huge ecosystem, and many teams, that all have to line up their product release schedules, and now 4 operating systems (OS X, iOS, tvOS, WatchOS) that have features that all work together. This is not trivial.
I would be skeptical about agile on something like OS development which is on a different scale from your average software project.
Not saying it won't work, but I would like to see some comparisons of OS level projects that have gone agile and compare it to the waterfall approach.
Well, Linux is run like that, right? Releases come very often.
Not to mention Facebook itself... We all know FB has fallen flat many times, but it's never been busted for weeks at a time, to my knowledge.
One possible answer is that near-perfection (the perception of Jobs' products), including recognition (widespread adoption), is attainable at any given moment in time, but typically unsustainable long-term... given human and technological constraints...
It could be entropy, as some have suggested, or simply the difficulty of maintaining a level of quality one has become associated with producing...
Maintaining the (high) level of quality one has reached is difficult enough...
Incremental gains on a level attained become much more difficult...opportunities become infinitesimally smaller...
>Apple has the resources to approach software like aerospace. Formal interface specs, heavy Q/A, tough regression tests, formal methods.
I would think that while it's true they currently have these resources, using them that way would directly conflict with their product roadmap schedule, the consumption of said devices on that schedule, and thus their bottom line ($203B in cash).
There's just no way in hell Steve Jobs would be putting up with this and I wish he was alive to tear some people a new one. I didn't like Jobs but respect his ability to achieve things.
Perhaps, the years of oversimplifying applications has created an Apple that can't handle complex applications?
Or there's a chicken-and-egg question: Xcode and the surrounding tools are atrociously buggy and hostile to the developer, and it seems to get worse with each release. Is this a symptom of what's going on inside Apple, or a cause? Perhaps Apple's own developers are dealing with the same hellish development experience and are just happy when something compiles without crashing Xcode.
Or, perhaps, at some point, software becomes too complex for humans to deal with.
Windows got so much flak over the years. It wasn't the prettiest, but it worked and did what it said. Sure, it BSODed sometimes and had some memory problems, but it handles infinitely more hardware/software/driver situations than OS X. Visual Studio is a dream, if you're into that ecosystem. MS dev tools are actually very nice.
>There's just no way in hell Steve Jobs would be putting up with this and I wish he was alive to tear some people a new one. I didn't like Jobs but respect his ability to achieve things.
You probably weren't paying attention when Jobs was running things.
OS X 10.1 was a mess -- it took several updates to become somewhat usable. The Finder was half-arsed for a decade. Mail.app had several issues. The Windows version of iTunes was crappy. OS X releases that are now praised as "the best ever" got tons of complaints for introducing bugs and instability. Xcode has historically been crap (and it's much better now). And don't even get me started on the state of their cloud offerings under Jobs.
Hardware-wise, the same. Every new release, from the original iPod to the iPad, was met with complaints and bickering ("No wifi? Less space than a Nomad? Lame") -- even though it actually took wifi and batteries five more years to start making practical sense on such a device for syncing. Aside from the specs people complained about, there were also all kinds of other HW issues, from the overheating G4 Cube, to the logic boards dying on G3 iBooks, to cooling goo spilling from G5 towers, to the crappy "round" mouse, and lots of other stuff besides.
That said, I don't buy the "Apple software went downhill as of late" thing. First, because as said, there were always issues. Second, because in normal use I don't see any particular decline. If anything, things have gotten better, to the point where we complain about trivial stuff. The thing is, the Apple of today puts out a heck of a lot more software and hardware products than the rosy Apple you remember.
I'd take iTunes out back and kill it, though -- the latest redesigns are absolute crap from a UX perspective. Then again, I wouldn't call that a programming quality issue -- more of an "idiotic focus on, and shoving in our faces of, a BS online music platform" issue.
>Or, there's a chicken and egg question: XCode and the surrounding tools are atrociously buggy and hostile to the developer and it seems to increase with each release.
The opposite. Xcode was "atrociously buggy" in the 3/4/5 era and before, and has gotten quite a bit better in the 6/7 series (despite having to support a whole new language).
In fact, a large list of early Xcode 6 crashing bugs -- reportedly around 90% of them -- was squashed months ago.
> That said, I don't buy the "Apple software went downhill as of late" thing. First, because as said there were always issues.
I fully agree. I jumped on the Mac and OS X bandwagon in 2007. The first few releases of Leopard (10.5) were quite hellish, because I was constantly having Wifi problems. My HP Laserjet didn't work until 10.5.2 (in 10.5.0 and 10.5.1 it would just print a few pages and get stuck). Snow Leopard was a smoother ride, but only after a couple of dot releases.
I think there was a low point around Lion and Mountain Lion (lots of new features, but they were not polished enough). But since Mavericks it's been smooth sailing for me.
People also seem to have overly romantic recollections of past Apple software. Sure, Aperture and iPhoto were killed in favor of Photos. But Aperture was crap. Version 2 was still acceptable, but 3.x had constant hangs and slowness. And they were always trailing badly with RAW support. Like many people, I bought Aperture 2 & 3 and switched to Lightroom pretty soon after because of Aperture's lack of polish. A while ago I tried the new Photos. And although it does not have all of Aperture's features, it's a far better application in terms of speed and user experience.
tl;dr: I don't see this drop in quality.
Of course, App Store and iTunes need to be burned down to the ground and rewritten from scratch.
I agree that Photos is a lot better than iPhoto & co. The performance is blazing, and I migrated from Lightroom to it because of that. I can't find a faster and easier-to-use photo library management app. Lightroom is really slow compared to it.
You just convinced me to give Photos another chance. I absolutely hated it when they gave up on iPhoto and Aperture because I felt like I lost a lot of features if I moved to Photos. Ever since then, though, I haven't found a photo solution that works quite as well as any of the Apple products I've used in the past so I'm willing to give it another shot.
Photos.app is atrocious from a UI perspective. It's an iOS app shoehorned onto OS X. How do you back up your library to, oh, I don't know, an external disk? Drink the Kool-Aid or else.
What? The UI is totally fine. Just because it's a grid of photos doesn't mean it's just "an iOS app shoehorned onto OS X." The app is written from the ground up for OS X. It's not just some basic half assed port of the iOS version. The UI is different. The editing tools are more powerful. The performance is blazingly fast. Go give the Photos app on Windows 10 a shot if you want to see what the competition is up to. That is a pathetic Windows (mobile) 10 port.
> The Windows versions of iTunes was crappy.
Oh joy. I remember those days when people were angry because plugging your iPod into a new PC meant wiping your library!
> That said, I don't buy the "Apple software went downhill as of late" thing.
The perception of the Mac immediately after people switch from Windows to Mac OSX is awesome. The perception is different if you use Mac OSX for a long time and suddenly you see the computer freeze every few days and require a hard reboot. Much of that has to do with the software you are running at the time, but why a sudden OS freeze where the mouse can't even move? That can't be the fault of Office / Skype; it has to be either an OS issue or an underlying hardware issue.
Mac OSX is generally really great; it doesn't feel slow compared to a lot of the PCs out there. Polished UI and very responsive. In some releases you do see slowness opening Finder windows or moving from one settings view to another. Those are defects people will cry about and ask why they weren't caught by the performance team.
Maybe the grass is greener on the other side. I was using an iMac for a couple months on a job a while back after using only Ubuntu for 5+ years and I found the whole Mac UI extremely sluggish and confusing.
My main computer is a Mac, both at work and at home. Honestly it isn't very responsive compared to recent Linux desktops or even the latest Windows 10. Or at least it mostly feels that way when using Eclipse, so maybe it's the JVM.
I usually find the UI in most Java-based applications very slow on Mac OSX, but that's just my limited experience with Java-based UIs on this platform.
> The opposite. XCode was "atrociously buggy" in the 3/4/5 era and before, and has gotten quite a bit better in the 6/7 series (despite having to support a whole new language).
Xcode 3/4 were a lot harder to work with (IB and Instruments separate, had to use Clang on command line etc), but I don't recall them being particularly buggy. Xcode 7 with Swift is the buggiest version I have used. It crashes on me 10+ times a day with all of the instant compiler checks it does. The progression from 3 to 7 is amazing, but I feel it has come at the cost of stability.
Xcode 3's code completion made everything run so slowly that I had to turn it off, and for a while I just copied and pasted method names out of the Cocoa header files.
I just wish one could tell Xcode to stop using ALL THE CPU.
> If anything, things have gotten better
Much better in my opinion. For example, they've fixed all the issues with iMessage interoperability on OSX/iOS. There was a time when, if you used Messages on the Mac, you had to suffer through missing messages, duplicate messages, out-of-order messages, etc. Now, as far as I can tell, it's working flawlessly. Even better, it now includes SMS messages.
> If anything, things have gotten better, to the point where we complain about trivial stuff.
This is the key. We're definitely spoiled by how solid a lot of the basics are.
I can fault them for some of the cloud service issues nowadays, but even those work better than they did in years past.
Apple has always had strange hardware and software issues, just like everyone else. Just today I was trying to buy Windows 10 to install it on a PC I'm building, and the MS storefront and my Live account couldn't sync the deletion/addition of a credit card until I logged out of everything, closed the browser, and logged back in. No one is getting this stuff working flawlessly.
Microsoft's OAuth stuff is ridiculously buggy. I've tried half a dozen times to get my Office 365 Microsoft account properly linked with my (identical email address) MSDN Microsoft account, and it's always wrong. Every time I go to login to some Microsoft OAuth site, it picks the wrong cookie, so I have to logout whatever account it thinks I'm using and login repeatedly with the right one.
> OS X releases that are now praised as "the best ever" etc, got tons of complaints for introducing bugs and instability.
I bet it would be easy to find an "OS X is on a slippery slope..." article for each OS X release.
I remember the initial Windows release of Safari. Apple insisted on doing their own font rendering, and the result was blurry headache-inducing text.
Wasn't that a Windows issue, though? I feel like font-rendering on Windows, in general, is far inferior to font-rendering and aliasing on the Mac. Especially with the new HiDPI displays popping up, Apple's stuff looks far better to me than any display I've ever seen on Windows.
font-rendering on Windows, in general, is far inferior to font-rendering and aliasing on the Mac
That's because Windows has to deal with so many more differences in hardware. E.g. look at these pixel-level variations in LCD screens. http://www.digitalversus.com/tv-television/screen-technology... I don't understand how it would be possible to support subpixel rendering on some of those. IIRC when I last used Windows (XP?) it let you choose from 8 different schemes. But that still wouldn't nearly be enough!
OS X and IOS can be tweaked to support a much smaller subset of that mess.
They actually went back on that decision because Windows users complained so much about the text being antialiased "properly."
That's right, the anti-aliasing didn't look good to me either. I'm sure it comes out very nicely in print.
That's interesting, because 'back in the day' the IT department accidentally installed Safari on my Win XP box, and as a long-time Mac user I thought, 'finally, some font rendering done right'. I guess it's down to taste, or perhaps it didn't work well for screens with low pixel density or for people who like really small text and UI settings.
People like what they're used to. I'm not in the least surprised that you happened to be a former Mac user...
A lot of awful software shipped under Steve Jobs. Xcode has always been a buggy mess, iTunes bloated greatly under Steve (remember Ping?), iMovie '08, etc., etc.
Yeah. The revisionist hero worship and sanctification of Steve Jobs has reached absurd levels since his death.
The Cult of Steve? I think it's actually declined a little.
That Isaacson book really leveled him for a lot of people.
OP is right about him not putting up with crap. Sure, not everything that came out of Cupertino was rock solid. But as much of an aesthete as he appears to have been, it's also pretty evident that he was incredibly shrewd and knew he had to ship at some point.
The cruddy software might be coming from Apple repositioning itself from an innovation brand to a legacy brand. Acquiring Beats (can you imagine Steve championing this - lol), less focus on high-end workstations in favor of consumer electronics and watches, removing the grunt from high-end apps like Logic and FCP. Certainly no longer the underdog we root for, as the article points out.
> That Isaacson book really leveled him for a lot of people.
To me it actually had the opposite effect: after reading it I had a newfound respect towards Apple and Jobs. The book was honest and fair, I don't know why people think it does a disservice to him since everyone knew he was an asshole. This is coming from someone that avoids Apple products, FWIW.
That's the book that uncritically repeats Jobs' (completely bogus) claim that Apple invented the switching power supply.
http://www.righto.com/2012/02/apple-didnt-revolutionize-powe...
Totally forgot about Ping. I love being reminded of old "features".
It's funny watching the announcements of these products now.
I just get sad seeing Steve with meat on his bones... We watched him wither away.
iTunes was hideous. I once tried to reverse engineer the XML that stored the library and concluded it must have been designed by a distracted intern.
Why reverse-engineer? The format (known as "Apple XML property list") has always been openly documented:
https://developer.apple.com/library/mac/documentation/Cocoa/...
There are also many libraries available to decode and encode these files in various languages.
The reason why XML property lists look this way is that they were a direct translation of the older NeXT "property list" format, which was sort of like binary JSON. Dumping an alternating list of keys and values isn't pretty XML, but it ensured minimum translation headaches from the old format.
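For what it's worth, that alternating key/value layout is easy to see (and parse) with standard tooling. Here's a minimal sketch in Python using the standard-library `plistlib`; the track fields are made up for illustration, loosely in the style of an iTunes library entry:

```python
import plistlib

# A tiny XML property list. Note how the <dict> is serialized as an
# alternating sequence of <key> and value elements -- the direct
# translation of the old NeXT property-list format described above.
XML = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Track ID</key><integer>1001</integer>
    <key>Name</key><string>Example Song</string>
    <key>Rating</key><integer>80</integer>
</dict>
</plist>
"""

# plistlib flattens the key/value pairs back into an ordinary dict.
track = plistlib.loads(XML)
print(track["Name"])      # Example Song
print(track["Track ID"])  # 1001
```

So while the XML looks odd as XML, decoding it is trivial; the format was never really meant to be hand-read.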
Right, so technical debt then.
It's still hideous. Case in point (though I guess this is a combination of things): when I upgraded to an iPhone 6S+ two months ago, I backed up my 5s in iTunes and then restored on the 6S+. 22 apps could not be restored, among them 6 of Apple's own apps.
Yes, good software applications from Apple have been the historical exception rather than the rule. Great hardware, good OS, mediocre software for the most part. This reflects the company's priorities as a hardware vendor: they make very little money from software, and hence have little direct incentive to put a lot of resources into it.
I don't agree with "great hardware" if we're talking about durability and repairability. At least the MacBook line seems designed to fail after a number of years, or to become too expensive to keep fixing in favor of buying a new one. Apple products, besides the high-end desktops, seem disposable, no matter how well made they look.
Years is still a long time to measure the lifespan of a laptop. Plenty of less-expensive laptops fail at the one-year mark, some fail at two, some laptops equally as expensive as a Macbook will fail before five years are up. Some Macs fail before five years, some last beyond that.
And five years is a long time when it comes to computers.
Jobs didn't care about Xcode. iTunes was decent or even good. He hated social and reluctantly allowed Ping to go forward. iMovie is a tiny niche.
I agree with OP. There is no way Steve would have allowed some of the current stuff to see the light of day.
iLife was very important to Steve, he dedicated a lot of time demoing the apps on stage. He was a huge fan of ripping out all the features of iMovie.
Same with iTunes (which has gotten a ton of criticism over the years), he was the one that kept cramming everything under the sun into it.
I don't know if he was as passionate about Xcode, but he sure liked to brag about having the best development environment. Interface Builder in particular seems right out of Steve Jobs' brain (even if it sucks).
Here he is introducing Xcode: https://youtu.be/Rh5spZrzu6c?t=59m3s
Interface Builder is superb once you get the hang of it.
It does a great job of getting you from zero to an app, but it breaks down terribly once you try to do anything custom with it. The fact that it hides code from the developer drives me nuts.
Well, it depends what you mean by custom. I create custom views all the time, and it's even better now that we can render them 'live' in IB:
https://developer.apple.com/library/ios/recipes/xcode_help-I...
Ooh, I forgot about that feature. It does at least address my big problem with IB: you execute code you didn't write and can't read. I'll have to give live rendering another look.
I won't start a religious war here, but IB is awesome in demos and less awesome in real world. Versioning alone is enough to make you go crazy (especially back in the .nib days!).
I understand both sides of the argument - it took me considerable time to wrap my head around IB, and for the longest time I just assumed I simply wouldn't "get it".
Interested in how you do versioning on UI with or without IB, though. Personally, I just maintain branches until an agreed-upon design is in place.
There are old videos of him speaking about the development environment at NeXT (against Sun). Sir Steve did like enabling 'creation'.
> iTunes was decent or even good
You clearly never used iTunes for Windows.
| You clearly never used iTunes for Windows.
IME, every terrible thing about iTunes applies equally to both platforms.
It's worse on Windows. The Windows version would hash file names or something, probably to get the search features working. So once you imported your files, you didn't know what was what. The Mac version probably just uses Spotlight, and file names were always readable.
My iTunes library has made several round-trips across platforms and I've never encountered anything like that. My music, apps, books, and podcast files all have consistent, readable naming schemes.
iTunes sucks everywhere. And it's horrible UX that it's needed for syncing an iPhone. A better UX would be to just plug it in and move things around.
iTunes was awesome on Windows when it came out (until about 4.0?). Compare its contemporaries: MusicMatch, which was a bloated piece of junk, and Winamp 2.x, which, while awesome, had a steep learning curve to get really useful (J for the win) and an extremely unpleasant UI.
> iTunes was awesome on Windows when it came out
It forced a bundled QuickTime install (still does, I believe), and back then it tried hard to become the default media player on your system (even reverting your "no" choice after updates). This led many people to remove QuickTime, only to discover that their iTunes now refused to work. It would routinely fail halfway through syncs and upgrades of iDevices, and had a generally buggy interface that wasn't very responsive most of the time. Its iDevice backup process was cumbersome for normal users, and often failed without the user knowing (leading to very upset individuals when they needed to restore but couldn't).
Now it seems every new version redesigns the UI in major ways, causing even long-term users to not know what to click, etc...
If you really just listen to music, maybe it's fine. For all other purposes it was/is horrible. However, I can't complain, because it generated quite a lot of work for my side repair/contract business back then.
> iTunes was awesome on Windows when it came out
"It really whips the llama's ass" -- Winamp is still better.
I had whatever version was around when the video iPods came out (and a bodgy bit of hardware that was). I couldn't sort videos the way I wanted to - iTunes says that file extension means "TV episode" instead of "movie"? Sorry, it's a TV episode. Not to mention the terrible UI with tiny targets that doesn't blend in with the user's desktop theming. Maybe that was v4+, but all I remember is hating to use iTunes (including managing updates, as already mentioned).
The software that went with Microsoft's Zune media player was amazingly good. Sadly, even though it was available separate from the Zune hardware, almost nobody downloaded and tried it.
You clearly have never read his biography ;)
via an All Things D event:
> What’s more, thanks to the popularity of iTunes on PCs, Apple has become a major Windows software developer. “We’ve got cards and letters from lots of people who say that iTunes is their favorite app on Windows,” noted Jobs. “It’s like giving a glass of ice water to somebody in Hell.”
That quip apparently caused Bill Gates to become quite angry:
https://books.google.com/books?id=6e4cDvhrKhgC&lpg=PA463&ots...
Steve Jobs publicly mentioning only praise about things Steve Jobs made? Unbelievable!
I can only conclude that the people sending such letters had only previously used GTK+ apps on Windows or something.
That may have been by design.
Don't forget early OS X. So slow it was nearly unusable until performance improvements in 10.3.
Improvements of a new OS are happily received. However, we're talking about the reverse, deterioration of mature products.
What mature products deteriorated (apart from iTunes) and in what concrete ways?
Mail is much better than it was even a year ago, XCode too, the OS is stable...
And FCPX didn't deteriorate compared to 7 -- it was a written-from-scratch reboot of the platform that just happened to cut some features people used (most came back with a vengeance).
There are a number of examples in the surrounding threads. I find most software getting worse these days, on all platforms. See CADT: https://www.jwz.org/doc/cadt.html
And most of that is simply looking at the past with rose tinted glasses.
> the years of oversimplifying applications has created an Apple that can't handle complex applications?
Ah, you've fallen for the illusion. Those "simple" Mac apps you love are fantastically complex to implement. It's the user experience that is simple, and it takes a ton of sophisticated engineering to pull that off.
You can think of it as there being a certain fixed amount of complexity in the user attaining some goal. You can make your software simpler by foisting the complexity on to the user: just make them do all of the nit-picky tasks.
If you want to make it simple for them to achieve their goal, your app is going to have to contain that complexity itself.
>>Or, perhaps, at some point, software becomes too complex for humans to deal with.
I'd say rather that things need to be simplified and features removed in order to improve quality. Every new Mac or iOS release touts XXX number of new features. If you want to offer the best products, having more features isn't necessarily a prerequisite.
I can't upvote this enough. So many products have been ruined by feature creep, it's sad to even think about it...
It's interesting to think about: is feature creep unique to the domain of software? I don't think so. There is plenty of software that has been immune to it (think everyday developer tools, command-line stuff, etc.). Often the apps with the least feature creep are the ones bounded by the APIs they talk to, i.e. if the API (which they don't control) is static, they can't really add any more features.
Outside of software, we see a similar thing: a wrench (or spanner) hasn't gained many new features, because the bolt it turns isn't changing; the goal is roughly the same. The only appreciable changes have been in ergonomics, and even those are effectively static.
On the other hand, we have cars, which are suffering from so much feature creep it is unbelievable. Every year, car engines get a little bit more efficient, but they also get heavier, bloated with more systems: infotainment, seat adjustments, window adjustments, etc. As a result, the efficiency of the improved powertrain seldom makes an appreciable performance difference. (Yes, there are outliers.)
Cars then, might be the equivalent of "the ultimate app" which does everything for you, but loses sight of its purpose. Meanwhile, we have a long history of leaving single tools alone, and they tend to work great.
The trouble is that in the world of physical tools, the workflow changes/context switching between using one tool and then another is easy. Meanwhile, in software, feature creep ends up being the solution for poor context switching between apps. In an ideal world, working on an image in photoshop and then pixelmator, and then illustrator, and then publishing to wordpress would be as seamless as using a wrench, then a screwdriver, and then cleaning things up with a rag. Unfortunately, software interface constraints almost necessitate feature creep as the "simplest" way to add functionality, even when convoluted menus, hotkeys, and naming conventions obscure utility.
Awesome analogy about context switching. It's definitely true. The "unix philosophy" never really worked well for GUIs and that's been a big problem for composability.
Interestingly, the Unix philosophy works great in machine to machine communications. It is pretty excellent at handling the needs of the IoT and distributed systems, networks, sensors, etc. You hit the nail on the head about GUIs.
Ironically, it is "handoff" features that Apple was traditionally best at. BUT, I have this feeling that most of what we hate can be described as "too little, too soon". After all, it could be argued that Apple is aiming for an environment which is "document focused" with their focus on standardized APIs (Adobe is working on this, too), but none of this works yet, because software is still written from an "Apps" POV. In the real world, we change the tool relative to the job, and the best tools do specific single tasks. Given the experience so far, I'm far from convinced that a 2D GUI should ever try to duplicate that.
I think it boils down to asking what the product is actually for, or rather, what fundamental human need does it solve? That's a rather tough answer for computers, but for cars it's easy - getting from point A to point B. Every modification to a car is in pursuit of either making the job more efficient or making the job easier/more comfortable for the customer.
I think the fact that the answer is so tough for computers is why feature creep is so common with software. When your tool can literally do anything you want it to, why say no? The constraints are actually in the mind of the customer.
This is a very good point. The constraints are in the mind of the customer, but also in the UI. I.e., a panorama-viewing app is not great in vertical orientation on an iPhone. There are physical as well as mental constraints in the UI of a computer.
On the topic of cars, it is true that modifications strive toward comfort, but they also strive for marketability. While a leather-wrapped dashboard may be comfortable to look at, it is primarily focused at adding something to market. This same phenomenon exhibits in software, where features are added to increase marketability, "Look! this ERP software has a social network built in!". The utility of a given feature is often perspective-based, especially when the user is not well aware of their actual needs, or the user and purchaser are not the same person.
I agree, feature creep is not unique to software. Look at cars. Creep is just made worse now that they can run software.
Plenty of products are ruined because they don't let you do that one thing you really need to do.
Feature creep is necessary for closed applications.
The ideal is a set of simple tools around a simple and open file format. As soon as you lock folks into your application (as Apple loves to do), you're on the hook for being all things to all people, or for getting panned because your software is only for the lightest casual use.
They (designers and product people) will one day learn that the single greatest feature and design quality -- the most objective one -- is raw speed.
Always go faster, for a long time. Meaning: tackle aged hardware and long uptimes, and just generally maintain the same performance across everything, from the device's own hardware and local speed, to the wifi/4G signal and the ISP, to the backend and its software and hardware. Raw speed is the greatest objective design feature.
Unfortunately, investing that time requires convincing design and product that integrating A/B tests for hearts vs. stars for favoriting isn't worth it in the long run.
> I'd say rather that things need to be simplified and features removed in order to improve quality.
I'd say they need to fix the features they've implemented, and only release new features when they're fully cooked.
Removing features is terrible: it is not an improvement to me when my stuff breaks or when I'm forced into a dumbed-down, lowest-common-denominator use-case.
It's funny -- on the one hand, people SAY they want this (only fully cooked new features). Yet everyone is really quick to jump up and down and complain and bash $company for lacking innovation when their new OS or the next version of their software comes out and the focus has been on stability/bug fixes.
Not just Wall Street types, either; I see this primarily from tech people. Examples: Android M, OS X Snow Leopard (I think; I may have the release wrong, but there was one that was definitely 'not many new features'), basically anything where the next version isn't predominantly new shiny.
Damned if you do, damned if you don't.
> There's just no way in hell Steve Jobs would be putting up with this
Maybe. I felt like the quality started going downhill shortly after iOS came out, starting with low-level APIs, then xcode, then making it into user-level applications.
My theory is that a lot of the really experienced engineers (the ones who started with NeXT and OpenStep) left once they were rich after the iPhone stock jump.
So really there is nothing Steve Jobs could have done unless he had a developer education program or something.
>> "My theory is that a lot of the really experienced engineers (the one who started with NEXT and OpenStep) left when they were rich after the iPhone stock jump."
While possibly true, I think the real issue is resources being spread too thin. Remember they had to delay the release of OS X by a year because all their engineers were working on iOS? Now they're doing yearly releases of both OSes. I'm sure they've hired more engineers, but pre-iOS there were probably lots of engineers who had been working exclusively on OS X for years and were able to maintain quality. New people take time to get up to speed.
I agree. I don't think this is a Steve Jobs thing at all. In fact, I think post Steve Jobs Apple actually pays a lot more lip service to better software than they did before.
I simply think the attempts to build stuff in common with iOS is taking its toll.
Aren't you contradicting yourself? If post-Jobs they only pay lip service to quality but with Jobs they actually took action, doesn't that suggest Jobs may have been a positive influence on quality?
I think the action they take is about the same. They pay more lip service these days. I wouldn't be surprised if they are actually putting more work into better quality these days, but stuff is just more complex (e.g., everything is cloud based. Imagine OSX today, except running MobileMe, .Mac, etc. as the cloud backend instead of iCloud, which while it has its issues, is way better than those disasters).
Ah ok... maybe you have a different usage, but "lip service" is used to indicate "all talk, no action". So paying more lip service to something means they're churning out more empty words without the activity to back it up.
> There's just no way in hell Steve Jobs would be putting up with this and I wish he was alive to tear some people a new one.
Steve Jobs, Steve Jobs ... that's the guy with the skeuomorphic preferences, right? The one who, at the first iPhone release, told developers that they don't need native apps because using HTML + WebView is enough, right?
Just to make sure we're on the same page here.
Not everybody hates skeuomorphism, especially when it's not taken to an extreme (okay, Apple under Jobs often did take it to an extreme... like the leather Calendar app). But skeuomorphism, when used properly, provides affordances and hints to the user.
> told developers that they don't need native apps because using HTML + WebView is enough
That's because the SDK wasn't ready for developers yet. You must remember that, first and foremost, Jobs was a great salesman. If you don't yet sell it, it's a piece of crap and unnecessary, right? And when you _do_ sell it, it's the greatest thing since sliced bread.
No, they weren't going to do a public SDK, until after the huge sustained public outcry.
I understand the plan was to work with select third parties on a case by case basis, sort of akin to standalone game consoles.
I hear that repeated a lot, but I haven't seen any real evidence to back it up.
It seems equally plausible, especially given how quickly they announced the SDK after the iPhone release, that Apple was preparing for it but not ready yet (and/or wanted to give themselves time to work out the early kinks in all of the other parts: working with AT&T, supporting the new device and new OS, etc, etc).
It was reported in the Isaacson biography:
"Apple board member Art Levinson told Isaacson that he phoned Jobs “half a dozen times to lobby for the potential of the apps,” but, according to Isaacson, “Jobs at first quashed the discussion, partly because he felt his team did not have the bandwidth to figure out all the complexities that would be involved in policing third-party app developers.”
Apple had already established a model for selling third party software on the iPod video line through iTunes -- with select partners only.
When WWDC 2007 rolled around, after the release of the iPhone, Jobs presented web apps as the solution for developers, without the need for an SDK. It was only after months of sustained outcry from the development community, and the nascent jailbreaking scene, that Jobs announced that they would prepare a public SDK for the next year, after they decided on a method of signing and sandboxing applications.
None of which contradicts my thesis: that it wasn't rejected, but that Apple wasn't ready at the time of release to undertake the task of opening it to all comers.
The supposition that it was only due to the outcry of the development community is exactly that, a supposition.
According to an Apple board member it was rejected internally by Jobs himself.
What was amazing was that the '07 jailbreak experience was like 1000% better than what I'd experienced with Palm or WinMobile: tap, tap, app appeared on the home screen with a "lickable" loading progress bar.
I seriously doubt that jailbreakers invented the smooth experience - they just likely unlocked the functionality that existed before Apple was ready to push it out (I'm guessing Steve wanted apps in the store at launch).
Yeah, the UI assets had to be there in the first place. Compare that to SBSettings, the jailbreak-era control center, which looked like someone taped the buttons together, and it becomes apparent that Jobs' statement was just salesmanship.
Oh really. That, I wasn't aware of. Well, at least they saw the cash cow that the App Store could become at some point, and acted. It's good to change your mind when new information comes to light.
I suspect people rooting and writing their own apps pushed them as well. If they didn't control the 3rd party apps, other people would.
I'm rather firmly convinced that the "they don't need native apps" was Jobs merely stalling til the SDK was stable enough to release to the public.
"The one who, at the first iPhone release, told developers that they don't need native apps because using HTML + WebView is enough, right?"
Nowadays we have plenty of people complaining that they have to have "apps" for everything when they have a perfectly good web browser on their phone, so he wasn't that far off.
For OS X, the decline began under Jobs with 10.7. I'd always assumed it was simply that Apple no longer cared about computers; in Jobs' own words, “milk the Macintosh for all it's worth and get busy on the next great thing.”
>For OS X, the decline began under Jobs with 10.7
So what issues do you have with 10.11? Because I don't see anything to complain about. Then again, I also didn't see anything troubling in 10.10, 10.9 and 10.8, with the exception of their ill-fated transition to a new DNS backend, which they later reverted.
A few bugs here and there, yes. Nothing I haven't seen since 10.2, or that's not comparable to the kind of issues I have with Windows 10 or the Ubuntu box I use for development (actually that's far worse, but I digress).
I'll take a crack at that:
- UI non-responsiveness: it is extraordinarily frequent that I will chord a tab change in Safari or in iTerm2, and the system will not respond for sometimes multiple seconds. It's the same with creating tabs in Safari. ⌘-t or ⌘-{ do not respond.
- Application switching focus failures: ⌘-tab will raise another window, but window focus will not follow. This has caused me to lose work. ⌘-tab, ⌘-w will sometimes close an iTerm2 tab that's behind the Safari window I'm looking at.
- Mouse pointer lag: probably related to the input lag above, the trackpad will not respond for multiple seconds after I begin touching it. If I "wake" it with a two-finger scroll, it will often lose half the input and register a click instead.
- AirPlay stuttering: even two ethernet-wired systems will still lose data between them. It's a crappy experience.
- discoveryd: My Apple TV's network name is currently "Apple TV (5)". Macs sometimes do this too.
- Slow laptop wakeup: I almost always have to tap a keyboard key to wake the display after opening my laptops (MBr and MBPr). Almost always. But not always.
That's just off the top of my head. Many of these have followed me between OS X revisions and different hardware. It amazes me that such bugs stick around.
As I read your comment I was thinking "Huh, I've never seen those focus failures on Mac, but they happen to me on Windows at work all the time."
Then I clicked Chrome in the dock, hit Cmd-Q, and watched Safari disappear while Chrome opened a new window. Guess it's not just you.
>UI non-responsiveness: it is extraordinarily frequent that I will chord a tab change in Safari or in iTerm2, and the system will not respond for sometimes multiple seconds. It's the same with creating tabs in Safari. ⌘-t or ⌘-{ do not respond.
Ok, for this I can't say much, because ever since 2010 or so I've used Chrome in place of Safari. As for iTerm2, I've tried to switch to it several times over the years (later mostly because of Tmux integration) but always found it to be buggy and reverted to the Terminal.
>Application switching focus failures: ⌘-tab will raise another window, but window focus will not follow. This has caused me to lose work. ⌘-tab, ⌘-w will sometimes close an iTerm2 tab that's behind the Safari window I'm looking at.
Hmm, haven't seen this -- and I use ⌘-tab and the ~ variant heavily.
I have seen lagginess in focus when switching between full-screen apps, and I sometimes start typing before the switch completes. This got a little better in 10.11 though (either a faster focus switch or less transition time).
>- discoveryd: My Apple TV's network name is currently "Apple TV (5)". Macs sometimes do this too.
DNS issues I've had (and mentioned in another comment). They tried a transition to a new DNS backend which was buggy. They reverted to the old one with 10.11 (or sometime in 10.10.x) though, and it has been OK since then.
>Slow laptop wakeup: I almost always have to tap a keyboard key to wake the display after opening my laptops (MBr and MBPr). Almost always. But not always.
Do see this from time to time (though it almost always works in my case).
Could be a sensor issue though, not a software thing (the lid-open sensor not registering, but a key tap working OK).
I see the input focus lag behind the UI after ⌘-tab almost every day, for at least the last two OS releases.
Just a quick nitpick: discoveryd was reverted on the last release... OSX is back using mDNSResponder again. You can't really say "It amazes me that such bugs stick around" if you're not actually on the newest version.
True, I conflated tvOS with OS X here, but it's still software by Apple.
Caveat: I haven't used it on my main machine yet, so a lot of this is impressionistic; feel free to correct.
Still a total disregard of Fitts's law. Horribly inconsistent keyboard support. Behaviour that should be trivially configurable seemingly set in stone. Still, I think, impossible to cut a file in Finder. However many shots they take, Apple can't get WiFi working properly. The transparency is an abomination.
If you want something more 'big picture', I think all the changes introduced over the lifetime of OSX have been a bit piecemeal with no overall, unifying process. For example, full-screen mode gets bolted on, rather than nicely integrated with other window actions. Notifications blossom into a side panel, but there's an overlap with bouncing icons in the dock. Etc. There are some great ideas there, but we could really do with an OS11 that picks the best ones and presents them together, in a clean interface, in which they all belong.
>Still a total disregard of Fitts's law. Horribly inconsistent keyboard support. Behaviour that should be trivially configurable seemingly set in stone. Still, I think, impossible to cut a file in Finder. The transparency is an abomination.
Well, those are not software quality issues. Some of those are design decisions, and have been with us forever, not random accidents: "Cut", for example, has never been on the Mac. Transparency in 10.11 is so lightweight you don't even notice it -- nothing Vista-like about it.
As for "total disregard of Fitt's law" that's not some decline either, as it's not worse or better than it has ever been in OS X.
>However many shots that take, apple can't get WiFi working properly.
Well, that qualifies as buggy software. But I have to wonder.
I've had an iBook, 2 MacBookPros (1 company issued), an iMac, a MacBook Pro Retina (current), 2 iPads and 2 iPhones thus far. And I've travelled all over the US, Europe and in several parts of Asia. I've never had any trouble with wifi, even to non-chain, el-cheapo motels.
The only offender has been my iPhone(s), which indeed I've not been able to connect at 3-4 places (restaurants etc.) while traveling, over many hundreds of locations over 8 years. And I can't even know whether it was the iPhone crapping out or them using some crappy, third-party router.
So I wonder, what are all those wi-fi issues people mention in forums etc.
""Cut", for example, has never been on the Mac. "
Apple Notes would like a word with you. Specifically the word 'Cut' under the Edit menu.
Apple Calendar would also like a word with you.
Terminal. Script Editor.
Not sure how you come to some conclusion that Cut is not "on the Mac".
My Apple devices regularly need me to switch WiFi off and on, in my home which has an Airport Extreme and Express, both with latest firmware.
In context, they mean "cut" as a way to move a file from one place to another, not related to text which seems to be what you are talking about.
>Not sure how you come to some conclusion that Cut is not "on the Mac".
We're talking about Cut for files (in the Finder), not inside apps.
>And I've travelled all over the US, Europe and in several parts of Asia. I've never had any trouble with wifi, even to non-chain, el-cheapo motels. So I wonder, what are all those wi-fi issues people mention in forums etc.
I agree with this 98%, with the exception of early releases of Yosemite, which really did seem to have a WiFi problem on the rMBP (even with Apple Airport Extreme base stations), in that it would disconnect a lot and you'd have to recycle your WiFi off/on. Annoying but not a deal breaker. And fixed within a month or two.
Otherwise I'd suspect there are some hardware + driver variations of Macbooks that may have had issues for others.
If you want "Cut" functionality in Finder, you can Copy (cmd-C) followed by Move Item Here (cmd-opt-V). It's non-obvious, but it's there.
So Cmd-C Copy can retroactively remove the file instead of copying it?
I think you're proving his case rather well.
Compare to Windows where it's called "Cut" but is really "Copy and mark for maybe deleting if you paste later". It doesn't remove files when they go to the clipboard (which is what cut does literally anywhere else).
And then Paste becomes a destructive operation that deletes your original files, or if you prefer, paste turns into "Move Item Here."
They've taken two different approaches to avoiding accidental data loss by overwriting the clipboard, but I wouldn't say one is inherently more right than the other. Windows makes new actions but has the interface pretend that it's doing the same thing as normal cut and paste. OS X makes the UI less standard, but describes what's being done more explicitly.
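The two clipboard models being contrasted can be sketched as a toy example (the class and method names here are purely illustrative, not real Windows or macOS APIs):

```python
# Toy sketch of the two clipboard strategies described above.
# Neither model deletes anything just by putting a file "on the clipboard".

class WindowsStyleClipboard:
    """'Cut' only marks the file; the move happens on paste."""
    def cut(self, path):
        self.marked = path                  # file is untouched so far

    def paste(self, fs, dest):
        fs[dest] = fs.pop(self.marked)      # deferred move happens here


class MacStyleClipboard:
    """'Copy' never deletes; moving requires an explicit 'Move Item Here'."""
    def copy(self, path):
        self.copied = path

    def move_item_here(self, fs, dest):
        fs[dest] = fs.pop(self.copied)      # the move is explicitly named


# In both models, overwriting the clipboard before pasting loses nothing:
fs = {"/a/file.txt": "data"}
win = WindowsStyleClipboard()
win.cut("/a/file.txt")
assert "/a/file.txt" in fs                  # still present until paste
win.paste(fs, "/b/file.txt")
assert fs == {"/b/file.txt": "data"}
```

The difference is purely in naming: Windows keeps the familiar "Cut/Paste" labels while changing the semantics underneath, whereas OS X renames the operation to describe what actually happens.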
10.7 was a pretty low point, but OSX quality has been inconsistent throughout, with a few high points late in release cycles. And it's always panned in .0 releases (same historical issue with initial hardware revisions).
Context: that's a quote from 1996.
Seems like many people are not making enough of the distinction between choices and bugs. Sure there are bugs in lots of stuff, and that's an entirely valid conversation, but there's also a layer of confounding choices. Some make sense from an "corporate" perspective, some...
-iTunes bloat is a choice, and bugs.
-Mail.app is mostly bugs. (non-standard .mbox was a choice)
-Final Cut X was a choice.
-Eschewing strong AppleScript support in native applications is a choice.
-The app store(s) are a choice.
-Allowing core utilities like Contacts and iCal to stagnate and be outshone by 3rd parties (BusyMac) was a choice.
-Aperture+iPhoto=Photos was a choice. (So was selling Aperture after it was EOLed)
> The app store(s) are a choice.
In that case, allow me to summarize some of the choices that went into building the Mac App Store.
- It's a web view. No local caching of layout or anything. If your network connection hiccups, instead of a reasonably rendered error message, you get this: http://i.imgur.com/5xNgwMH.png
- Text in the search box is blurry. I'm not 100% sure why, but I think someone made a choice to render it with its y-position at a half pixel increment. And then nudge it down so it comes into focus when you click on it. Because really, what sort of asshole would still be using a low DPI screen? http://i.imgur.com/gj3mqrz.png
- How many Install buttons does an update need? Eh, let's choose two. Two's a nice number. And have that "Update All" button stick around even though all updates were already installed. https://i.imgur.com/p9QoCqZ.png
- Earlier today, though it didn't lend itself to screenshots, my Xcode "Install" vs "Installing" button that couldn't decide what mode it wanted to be in and just bounced back and forth instead. Why choose one when you can be both!
- The "Check for Unfinished Downloads" menu. Do you know what happens when a download doesn't finish? Or why it's not presented in a sane way in the normal UI? Me neither. Instead of gracefully handling errors, let's choose to bury them in a secret menu that people will happen upon via StackOverflow searches when their installation keeps failing with no indication of why. https://i.imgur.com/xkgP0lc.png
A+ design work right there. Excellent choices.
This is exactly my point. Apple has been making these choices for a while, but they're being swept into a convo about bugs, and thus capacity/engineering. I think atp.fm dips into this for a second in their most recent (yesterday) podcast episode.
> There's just no way in hell Steve Jobs would be putting up with this
Says everyone who disagrees with any decision Apple makes. "Steve would have had the same opinion about this I do!" Statements like this are just you projecting your own opinion onto him.
OSX doesn't need to handle that many hardware and driver issues, so I don't see how that's relevant to a Windows comparison.
I've been using OSX since just after Panther. I generally agree with the idea that some things started getting worse after Snow Leopard, but I still don't think it's come close to a point where I'd actually move back to Windows or try out desktop Linux.
And I'd say Windows had far more issues than BSODs and memory problems. I've used Windows for music production for years (by the time I switched everyday stuff to OSX, I was locked into my music workflow and haven't cared to spend the time learning a new package like Logic, even after all these years). The way I survive Windows problems is pretty simple: never plug in an ethernet cable. I'm sure things are far better now, but for a large part of the past 15 years, doing so opened you up to a lot of problems and required utilizing software you simply should not have to install in order to have a functional system.
Also, a high percentage of OSX users have no idea what Xcode is, let alone care if it's not as nice as Visual Studio.
It's sad that so many people, myself very much included, now stick with OS X merely because "it's not quite shitty enough to switch". When I switched to the Mac initially, I switched because it was a vastly better option than XP; now, I feel that lead has been eroded and that Windows 10, while different, isn't far behind OS X in most respects.
As far as workflow and window management goes, OS X is a good half decade behind Windows, and in a lot of system-level ways, it feels a decade behind. I use a MBP Retina at work, and it feels primitive compared to Windows 10, which I use everywhere else. Everywhere else, I'm doing audio, video, and photo editing though. If your day to day work is more CLI and *nix oriented, I can understand the appeal of OSX... although some of the variations of Vim that I see people running might as well be GUIs.
Windows 10 is -in my limited experience- awful. Admittedly I don't use it for much more than gaming, light browsing, and occasional ssh sessions but it's painful to use, the apps aren't great, and it frequently craps out with weird error messages (e.g. "The required TCP protocols not installed on this machine" actually meant "The NAS isn't responding").
I've found the opposite; it's been rock solid for me.
I switched to OSX originally because the company I was working for was all-Apple. I probably wouldn't have thought XP was shitty enough to change otherwise, even though it obviously was.
Conversely, I'm now working at a company that's all-MS, but even after two years now (albeit, only a month on W10, previously on W7), I'm still not feeling much of a desire to switch to Windows at home. My Macbook is getting pretty old now and I'm going to want to replace it sometime soon and it's going to be another MBP.
The funny thing is I actually switched from OS X to Windows XP, primarily because Apple made me angry when they refused to do anything to continue Classic app support. (If I'm going to lose all of my favorite apps anyway and have to start over, might as well start over on the OS that has 10 times more apps, right?) (Also, I still hold a grudge over the "free forever" .Mac service.)
I wouldn't say Windows 10 is behind OS X at all. In some contexts, like a corporate workplace, it's at least a decade ahead, and always has been. (Then again, Apple and Mac fans generally discount that environment entirely.)
The biggest problems Windows 10 has are: 1. Crummy high-DPI support (and yes, they've been working on this, but the work is WAY too slow -- this should have been solved 5 years ago, guys). 2. Crummy third-party apps, made by developers who have no respect for the OS or its users. 3. The new "constant updates, and occasional ads" philosophy Windows 10 is taking. I wouldn't even mind the ads much if they weren't so stupid. (Stop trying to sell me the copy of Office 365 I ALREADY OWN!)
> The way I survive Windows problems is pretty simple: never plug in an ethernet cable. I'm sure things are far better now, but for a large part of the past 15 years, doing so opened you up to a lot of problems and required utilizing software you simply should not have to install in order to have a functional system.
I don't think that's really true. From 2K onwards windows was pretty solid if you kept it up to date. I never used any firewall/antivirus/etc., just disabled unneeded services (admittedly the defaults in 2K and XP were poor), didn't run executables attached to suspicious emails etc.
>Also, a high percentage of OSX users have no idea what Xcode is, let alone care if it's not as nice as Visual Studio.
I really liked Xcode, a lot more than Visual Studio, until they remade it to look like iTunes (around 2012).
So there is definitely some preference involved for those people who say Visual Studio is better (and of course, vice versa).
Xcode 4 was a definite backward step in UX terms, in my view, from which the product has yet to recover. Xcode 4 has the dubious distinction of being one of only a handful of products that I've used daily for a good period - in Xcode 4's case, 20 months - without ever finding a way that I could be happy using them.
(Sometimes I just throw my hands up and decide that something, whatever it is, is just never going to be my cup of tea, and that's that. I've done that with a few software packages and/or styles of working. But in Xcode's case, I'm pretty sure it's them, not me. Because Xcode 3 was fine...)
Regarding your last point about many people not knowing what Xcode is: I assume GP was theorizing that Apple's declining software quality is related to, or even in part caused by, their bad development environment and tools compared to competitors.
just curious, what software were you using for audio production under windows?
> Windows got so much flack over the years. It wasn't the prettiest but it worked and did what it said.
No. Just No. Windows loves to "forget" things. Things like Bluetooth devices. Or wifi devices. Windows likes to update your laptop for you when you're trying to close it ("Don't turn off your computer..." Wait, what? I have a plane to catch!). Windows scatters files all over the place! And that registry. UGH!
> Windows scatters files all over the place!
Oh look, OSX has touched this USB stick and vomited its dotfiles onto it. Did it ever write to the stick? Nope, it just needs to throw those trash dotfiles onto every. single. thing. it. sees.
Grr. You can always tell when somebody has been on the NAS or done any development work on a branch checked out of source control on a MacBook...
The development work thing isn't the fault of Apple, but rather the stupid developer who didn't pay attention to their commit.
I think there's plenty of blame to go to both of them. There's no defensible reason to put dotfiles in every single folder, and there are many, many reasons why it may not be desired of various levels of vehemence.
When closing your laptop Windows goes into standby. It should only install updates when shutting down and it tells you that beforehand. So it "did what it said" ;)
> Windows scatters files all over the place!
Just have a look into your Library folder on OS X. Also some programs don't use that, but create a dot folder in your home directory. Maybe I'm missing something though? I haven't used OS X much lately.
Library is pretty well organized. Settings go in Library/Preferences, cache files in Library/Caches, persistent files that don't need to be exposed to the user go in Library/Application Support, and they're all organized by app name or bundle ID.
Apps which use dot folders in the home directory need to be smacked until they stop, but the OS can't really control where (non-sandboxed) apps put things, it can only establish conventions and encourage apps to follow them.
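For illustration, the per-user locations described above can be sketched like this (the bundle ID and app name are hypothetical, chosen just to show the convention):

```python
# Conventional macOS per-user storage locations, following the layout
# described above. "com.example.MyApp" and "MyApp" are hypothetical names.
from pathlib import Path

bundle_id = "com.example.MyApp"   # hypothetical bundle ID
lib = Path.home() / "Library"

prefs   = lib / "Preferences" / f"{bundle_id}.plist"   # settings
caches  = lib / "Caches" / bundle_id                   # regenerable data
support = lib / "Application Support" / "MyApp"        # persistent app data

for p in (prefs, caches, support):
    print(p)
```

Windows (`%AppData%`) and the XDG convention on Linux (`~/.config`, `~/.cache`) split things along similar lines; the macOS layout just keeps everything under one visible `Library` folder rather than dotted directories.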
I can't look it up right now, but I think it's called "Library Support" or something? Anyway: When I was writing a program for Mac OS I had to choose the name of the folder myself. Using the app name or bundle ID seems to be just a convention, I don't see the difference to %AppData% on Windows or ~/.config on Linux.
Also I remember a tool I used when I had a Macbook which helped you remove all the files when uninstalling something. I think it was this one: http://www.macupdate.com/app/mac/25276/appcleaner I remember some programs even putting files in other directories than Library Support.
> There's just no way in hell Steve Jobs would be putting up with this and I wish he was alive to tear some people a new one.
Remember Antennagate? The iPhone 4 had a serious hardware fault that significantly degraded its signal. Jobs himself was the one who said to a customer, "You're holding it wrong."
The software that had "the focus of Steve" was generally very high quality. However there were a number of rotting products at Apple while he was still in charge.
The problem now is none of the SVPs seem to have that attention to detail, or they are too stretched doing numerous things.
Give me a break. Steve Jobs put out plenty of crap products and buggy software over the years. Don't pretend like he had a perfect record just because he's dead.
I haven't read much about Jobs but I wonder how he achieved the perfection in his product. Force the teams to work overtime to correct the bugs? Did he double the QA size in order to catch all these bugs? Did he demand that all code checked in must be at incredibly high standards, thus forcing the devs to QA more of their own code?
With Jobs, it wasn't so much what you have right now as it was where he was taking you. To the bright, beautiful future where everything was different and better. One could overlook the imperfections because the vision was so appealing.
Now that he's gone (and without anyone with his charisma to take over), the public is left to contemplate the ignoble reality of the current product sitting in front of them. The sense of wonder and possibility is absent.
They weren't perfect, by far. For years I've seen people struggle to figure out something as basic as creating a playlist or syncing to a new device. It's perhaps always been better software than its Windows counterparts, but it's always fallen short of the "intuitive" goal it's aspired to (and, frankly, bragged about).
This is the best explanation I've ever seen. Just lots of focus, attention to detail, and iteration until they got it right:
http://inventor-labs.com/blog/2011/10/12/what-its-really-lik...
He went on stage and gave a sock presentation that made everyone forget about the bugs. Users judged the book by its well-kerned, prettily animated cover.
This is the same old tripe about the young generation's moral decline, and rose-tinted rearview glasses.
Imagine how many "this never would've happened if Steve were still alive" articles would be written if the iTunes 13 installer were to delete users' entire hard drives like iTunes 2's did (http://www.wired.com/2001/11/glitch-in-itunes-deletes-drives...).
Exactly this. Snow Leopard was a great release in terms of quality, but Leopard wasn't.
Many releases had critical flaws that needed a point release in less than 2 weeks.
The Steve Jobs method: hire really good people and inspire them to work to the best of their ability.
Apple UIs were better because they were doing what ESR suggested at a time when the rest of the world was smashing every new option into the right-click menu. Occasionally I'll still hear someone say, "Oh, just put that in the right-click menu." The right click menu isn't a trashbin to dump things that don't fit elsewhere. In any case, here is ESR's UI advice: http://www.catb.org/esr/writings/cups-horror.html
I agree with a lot of the sentiment in this discussion and from the article. It feels like Apple software is getting worse from my perspective but I wonder if a change in the target market has a part to play. Perhaps the HN crowd is not a significant Apple demographic any more?
When I got my first Mac back in the PowerPC days it was definitely a step up. In the past I feel that Apple was targeting power users. Today I think they are going after casual users. Probably capitalizing on the general popularity of iOS.
I've still got a couple of old Macs but they're all running Windows now. Visual Studio still has the occasional lock-up whatever hardware it's on. At least you don't have to sign in to an app store to update it though.
Windows handles maybe an eighth of the hardware that works in Linux.
Visual Studio is nice, if you're not writing in Rust, Perl 6, Ruby, Objective C, D, Scala, Smalltalk...
And what IDE on Linux handles all those, without installing 30+ plugins?
Certainly not Visual Studio. Maybe it'll be taken seriously again if it were ported to the platforms where development is happening nowadays.
The cult of jobs exemplified. There were absolutely issues under Jobs, but somehow his sheen made people ignore them.
> Visual Studio is a dream, if you're into that ecosystem. MS dev tools are actually very nice
Most likely because of dogfooding.
Huh? Apple devs use Apple dev tools, and they are still terrible.
Like it or not, but MS tends to be very good with productivity software, be it Word, Excel, or Visual Studio.
It feels a little clichéd to say it, but I do feel that Snow Leopard, outdated now as it is, was some kind of fortuitous confluence of factors that resulted in OS X being as close to perfect as it's ever been.
Even putting aside the iOS-style elements being added, my experience has been that each version of OS X has been slightly worse than the last, and tends to introduce strange little anomalies and instabilities on hardware that was otherwise working just fine. Sometimes these issues are fixed in the next major version, but sometimes they aren't; and even when they are fixed, there are an equal number of new issues introduced.
My perspective is perhaps coloured by having switched to the Mac during the Tiger era, which was about equal with Snow Leopard in terms of stability and 'completeness'. Since then, with the exception of Snow Leopard, it's been downhill. (I realise, of course, that Tiger benefitted from 11 point updates and thus more polish than any version before or since, but the point stands.)
One wonders if the yearly release schedule has something to do with this. It's got to be a constant rush to tick off the new feature boxes, and then we don't get the period of bug fixing and stability before the engineers all rush off to start implementing next year's new additions.
For a long time it was effectively on a two year clock; they had releases in 2003, 2005, 2007, 2009, and 2011. Now it's half that.
I also think this is the root problem.
The thing is, we don't need a new OS every year, and most people I know don't want to update their devices either.
Where is the demand coming from?
Accelerated growth and capex.
I once saw a very nice chart of the number of users for a great, cool start-up everyone knows about. The CEO illustrated: "You see that inflexion point on the release day? That's what happens for every release. If we postpone it by 3 months, we not only lose 3 months of new customers" (painting the 3-month area between the lower and higher sales) "we also miss out on the offset," and he painted the unlimited area between the lower and higher sales. And he talked about the recurrence of the inflexion for each release, and about the immobilized capital inventory.
But their growth primarily comes from hardware, how much would OS X updates influence that growth?
Their hardware from 2008 is running El Capitan, so there isn’t much persuasion for existing customers to upgrade. It seems like if someone is going to switch to Mac, they would do it regardless of any features introduced in the last 3-4 OS X releases.
I don’t know, I’m not a sales guy. But it seems like the tradeoff of losing your customer base due to unreliable software isn’t worth it.
OSX updates are driven by iOS updates, which are the main revenue generators. Apple must release iOS every year to follow phone releases; OSX has to follow suit in order to support iOS "integration" features (Handoff, Photos etc etc).
Because release cycles are so short, test cycles are shorter as well; and of course new hardware is tested first (and foremost), so updates to older hardware will see more bugs, and updates are always buggier anyway (because it's harder for developers to predict the state of your system pre-migration).
I certainly see the yearly release cycle for OSX as being a big factor for the perceived fall in quality. In addition, there is probably a glut of "peak Air", people who switched to Mac when the Air was unrivalled and have not bought anything since. Their hardware is less and less tested with each update, so they're feeling the pinch. Apple don't care, because they want new money from them.
Are they really losing a significant percentage of their customer base due to unreliable software though? I doubt it. I expect that most people who use OS X and care about bugs probably have a negative perception of Windows (whether that's still warranted or not). So that leaves Linux. And as good as desktop Linux has become, it's still not the OS for people who don't want to spend time dealing with strange computer issues. So what's left to switch to?
It comes from matching iOS releases.
Can't release new iOS features which involve the Mac without a corresponding OSX release.
Hardware releases
Mac Mini is 1.5 years old, Mac Pro is >2 years old, 13" Macbook Pro is 1 year next month (and conspicuously absent from rumors of Apple's upcoming announcement event, gotta release those watch bands though).
The iPhone is definitely on a 1-year cycle, but it's not really something they put effort into maintaining on their computers. MacBook Air and iMac are probably the closest.
I blame iOS/Mac features tie-in like Continuity. OS X definitely gets the short end of the stick from that.
I think you have rose colored glasses about Snow Leopard. It had plenty of bugs and hiccups.
People deride the "iOS style" features of El Capitan, but first, what specifically are you referring to besides Launchpad? Do you think adding Notifications to OS X is bad? I find them extremely useful, and like the integration better than the spotty support Growl had among third-party software.
I never use Launchpad, but it's not in my way. It's no skin off my nose that it exists for other people. And it's not like Finder or Spotlight have lost functionality; if anything, I feel Spotlight in El Capitan is much better than it was in Snow Leopard or any prior OS X.
Mission Control works great and I think is much more elegant in El Capitan than in previous OS X releases. Granted in older OS X you could arrange your desktops in grids instead of just left to right but I only use about 3-4 desktops anyway, so I don't mind.
OS X has also improved for power users since Snow Leopard. Lion introduced FileVault 2, which I find to be a fantastically easy-to-use encryption system that I have had no problems with. I enjoy my Retina screen and the Retina support that later versions introduced as well. AirDrop is also handy, since my wife and coworkers have Macs as well.
In my mind, each new release of OS X, like each new release of iOS or anything really, has good things and less good things in it. But on the whole I'm glad to be running El Capitan today rather than Snow Leopard. I don't pine for those days.
It's less about Snow Leopard having been supposedly perfect and more about how OS X feels like it's just floundered around randomly since then.
In fact, for me, I can honestly say that OS X has almost strictly regressed in terms of my day-to-day experience. Spotlight now covers up way too much of the screen for almost no gain; if I'd wanted that, I'd have just used Alfred. Mission Control is still not as powerful and flexible as Expose + Spaces, and now it even goes so far as to hide desktop thumbnails even on my dual 27" monitor setup. Seriously?
What's most shocking to me is how much Windows has caught up from a UX perspective. I have Windows 10 on my (primarily gaming-oriented) desktop. Since I don't use the desktop for much programming, I almost never find myself wishing that it was running OS X instead. That's a real blow to the magical grip that OS X once held over me. Cortana is just as good as Spotlight -- better, even -- and it manages to be much more space-efficient as well. Windows Explorer is so much more useful than the bafflingly simplistic Finder. On Windows, if I want to tile windows, I can -- get this -- just drag them to the side, top, or any of the four corners. On OS X, I have to either fiddle with their impressively clunky split-screen full-screen app disaster or use a third-party tool like Spectacle or BetterTouchTool.
I could go on, but I'll leave it at this: there was a day where I never wanted to touch Windows again because I felt OS X was so much better. Now I find myself only really sticking to OS X for two reasons: (1) it still has better touchpad interaction on a laptop; (2) it's still based on Unix so I prefer it for programming.
> It's less about Snow Leopard having been supposedly perfect and more about how OS X feels like it's just floundered around randomly since then.
Agreed. It's not that Snow Leopard was some magic release that can never be bettered, it's that it has never been bettered. I can't think of any new features since then that I really value in OS X.
> there was a day where I never wanted to touch Windows again
I switched from Windows to OS X back in 2006 thinking I would run Boot Camp for Windows 90% of the time. Boy was I wrong. But nowadays, if the Surface Book trackpad was as good as the Macbook's (it might be, I don't know) and cygwin was better integrated into the OS, I would be very tempted to switch back.
Reviewers seem to think the Surface Book trackpad isn't quite as good as the MacBook Pro's, but it's close. (No idea if they're comparing against the new Force Touch trackpads, which I personally think are a regression):
http://www.zdnet.com/article/microsoft-surface-book-solving-...
hmmm, why do you feel that the new Force Touch trackpads are a regression? I've yet to use one, but I have other products that use Force Touch functionality, and I'm a big fan of my older Apple Trackpad.
I kept trying them in the Apple Store every week. At first I thought it was amazing - it feels like an actual click! - but the more I played with it, the more it felt like a mushy, unsatisfying click, and I noticed I was triggering lots of accidental clicks, accidentally moving files around on the desktop, etc. I tried tweaking the settings (for sensitivity and click feel) but couldn't find any that gave me what I wanted. Disabling Force Touch and making the main Force Touch click the standard for all clicks would be very close to what I want, but there's no way to do that.
I ended up purchasing a non-Retina MacBook Pro, in large part because of the trackpad. The click on the old 2012-era trackpad is sharp, solid, satisfying & deliberate, no accidental clicks.
It's just a matter of what you're used to. I'm a recent Mac convert and I can't stand the old clicky trackpads, but I love the haptic click on my MBP's trackpad.
I have a Surface Book from work, and I hate the trackpad. With just the default settings out of the box, no specific tweaking, it feels "mushy": there's more resistance moving my fingers around on it than on my MBP.
>how OS X feels like it's just floundered around randomly since then.
Well said. For nearly a decade, every OSX release was clearly better than the previous one. The GUI kept getting more efficient, and bugs disappeared.
Now when there's a new OSX version, my first thought is, "Oh no, what's going to break?"
OSX is known to have a not-so-good window manager, for example. Well, instead of improving it by implementing other WMs' standards (look at almost all the Linux WMs and Windows: they have maximize buttons, they snap to the borders of the screen, etc.), they chose not to copy what works best. They chose to "force" fullscreen instead of maximizing, not to implement snapping shortcuts, etc. They are going backwards. And sadly you can say that about a lot of things in OSX.
For programming especially, yes. I can see that point.
IMHO, an Android programmer in particular will find OSX more comfortable and supportive than Windows because Android Studio and drivers and stuff mostly _just work_ on a Mac.
Example: get adb to talk to a Kindle Fire from your windows machine and then do that same thing from your Mac
Example: install genymotion on your Windows machine and fiddle around with VirtualBox and what have you, and then install genymotion on your Mac
Example: use something like GitBash (Android devs just have to use Git a lot) on Windows and then compare that experience to using a real Unix command line on OSX
Example: watch Tor Norbye give a talk about Android Studio productivity tips and notice that he can't help but give you Apple keyboard shortcuts.
Macs appear to be what most Android programming expert-types and the Android dev team itself use, day to day.
Android dev just seems to go a bit smoother on Macs than on Windows.
As someone running Arch, I have no idea how devs can put up with OSX or Windows. Both are awful, both you cannot fix yourself, and both get in the way all the time of what you want to do.
For me, if I'm missing something, it's a pacman or AUR search away. If I need the development version of anything, it exists as a -git repo as well, and I can quickly insert my patches and get what I need immediately. No updates or any of this insanity stand in my way, and my system's been stable for almost three years since I built it; I just subscribe to the Arch announcements mailing list for major updates that might cause problems. We just got Linux 4.4 yesterday, and I booted today and kept on rolling as per usual.
"Both are awful"
Personal opinion. Most Linux GUIs are awful to me.
"both you cannot fix yourself"
Moot point, as the vast majority of people, even tech oriented people, wouldn't do that if they could.
"and both get in the way all the time of what you want to do."
Again, completely depends on your personal workflow.
"For me, if I'm missing something, its a pacman or AUR search away."
Mac App Store, Fink, or Homebrew.
"If I need development features of anything it exists as a -git repo as well"
Git works just fine on OS X.
"No updates or any of this insanity stand in my way"
Except for, what you just mentioned, which are updates.
"We just got Linux 4.4 yesterday, and I booted today and kept on rolling as per usual."
Same thing happens with OS X updates.
I develop on Windows and don't see the value in such examples.
Never used Genymotion, rather HAXM or real devices.
There are quite a few nice GUIs for Git, also no one has forced me to use Git so far.
I will become productive with Android Studio the day they are able to match what I already had with Ant, ndk-build and Eclipse ADT/CDT in terms of IDE/build performance and C++ support.
Keyboard shortcuts are the least that I care about in Studio.
"Never used Genymotion, rather HAXM or real devices."
Genymotion is miles and miles ahead of HAXM. Real Devices are best, though. But using real devices is much easier on OS X, due to not having to even think about drivers. I had to use Windows at a big corporate gig for a while, and that was one of the absolute worst parts of it: drivers.
"There are quite a few nice GUIs for Git, also no one has forced me to use Git so far."
It's pretty pervasive. Just about any major library is in git. Most projects are using git, too.
I use git on Windows every day and don't have any problem. I mostly use command line (the vim shell works just fine for entering commit messages), and TortoiseGit for when I want to look at history. For a difftool, I use BeyondCompare (company already had a license), which is really good.
The type of enterprise customers we have just don't go adding projects if they aren't approved by legal and IT departments.
Usually the projects are SDK only.
My current customer is using subversion and the third party projects, if any, use the internal Maven repository or vendoring for the C++ libraries.
Very similar experience here: as a Mac guy, I only have a single gaming/audio PC running Windows 10 in the living room, and I'm very impressed with how snappy, useful and flawless everything is. Frankly, my usage on this machine is very limited - no development environment, editors or other applications; I didn't even configure a mail client. Just Steam, Kodi and Roon.
I don't have any intention to switch platforms, and I still think the third-party applications available on Windows are, while available in huge quantities, mostly subpar and terribly designed. However, the core parts Microsoft delivers with the OS really, really impressed me. Very solid.
I don't have a particular issue with any of the iOS stuff ported over, but it's a lament I hear quite frequently from other Mac users.
You're right that there's a certain amount of rose-tinting with regards to Snow Leopard. It certainly wasn't perfect, and I wouldn't really want to switch back to it; the world has moved on since then, and SL isn't really competitive as a primary OS now.
With that said, though, my use-case in OS X hasn't dramatically changed since Snow Leopard --- mostly web-browsing, some IM, using the Terminal for various tasks --- but OS X's stability has changed in that time, for the worse. Apps crash more often, and the OS itself is generally less stable and prone to random freezes which require a hard restart. Doing a fresh install didn't fix these issues, and my fiancée's MacBook has them too, which makes me think it's not just hardware issues.
I think what I'm really looking for is Snow El Capitan: a year where nothing new is added and Apple throws intense focus on fixing the various weirdnesses that were introduced during their breakneck release cycle.
> People deride the "iOS style" features of El Capitan but first, what specifically are you referring to besides Launcher?
So much hideousness.
Like breaking Save As... with a clunky autosave set up.
Like a badly thought-out full screen mode that couldn't even cope with an external monitor being connected (this being something that worked on the 512k Mac).
Like automatically terminating background apps, and purging resources within apps (so that Safari would reload all your background tabs randomly).
Like App Sandboxing which is so restrictive as to rule out an entire class of applications from the piss-poor Mac App Store, and to continually fill the Console with Apple's own system services getting dinged for sandbox violations.
Like discoveryd.
Like abstracting away the filesystem, so that you hit "Move" on a document and get a crippled filepicker appearing from the title bar.
Have they actually added anything good since 10.6? Some window management has improved a bit. Startup is immensely better. iMessage is getting there (though iChat AV was still better at group video chat than anything since, including Hangouts). You mention AirDrop, but that's a mess (and the awful Chooser was still a better way of finding other Macs to connect to than the Finder is today).
I've noticed the same thing, but with one proviso: only after an upgrade, rather than a clean install. And with quite literally every major OSX release since Snow Leopard.
Just generalized instability and "weirdness" for lack of a better term. The problems always seem to go away after a fresh load.
It sucks that this is required, but between Time Machine bringing my apps and settings back, and launching the upgrade at night before bed, it's not really that much of an annoyance.
I'd still rather use OSX than any other OS at this point. But I think we need a Snow Leopard 2, a release that only tightens up the backend stuff with no new shiny features.
Your post reminded me of something: after years of doing Time Machine enhanced updates, last year I did a fresh install without Time Machine. I think that this made things better, but I can't really be specific why.
I think a lot of people feel this way. I wish I never had to give up 10.6.8, that was the best computer I ever owned.
I blame the stock market's pressure towards growth. The stock market rewards growth at the expense of everything else. It's like a kid pinching his arm to blow up a mosquito that was biting him: it's forced to grow and grow and grow until it pops. It's no longer acceptable to simply run a profitable business with happy employees and customers.
It creates a push towards constant acceleration in all things- shorter release cycles, more product categories, etc. This supplies the continuous growth that shareholders demand right up until the point where it kills the host.
Also, "success hides failure". If you're a titanically successful corporation, any internal argument along the lines of "we shouldn't do X anymore, we should do Y instead" can be shot down with "well, look at how successful we were while we were doing X! X must not be so bad after all." It degrades an organization's ability to be reflective and self-critical.
These are problems for all successful companies, which become bigger and bigger problems with increasing success.
> Also, "success hides failure". If you're a titanically successful corporation, any internal argument along the lines of "we shouldn't do X anymore, we should do Y instead" can be shot down with "well, look at how successful we were while we were doing X! X must not be so bad after all." It degrades an organization's ability to be reflective and self-critical.
See: Kodak.
Innovator's dilemma, I think, is the official term.
See also General Motors. It's difficult to remember now, but GM was once an innovator.
Pretty simple solution...if you don't want to be beholden to external shareholders, don't go public.
If you're pre IPO, fund your company through debt, which is extremely cheap at the moment with interest rates at historical lows, or sane, sustainable equity rounds through investors who you know and trust and are in it for the long haul.
If you're post IPO, pull a Dell and exit the public markets so you can refocus on core business ideas and long-term growth rather than spending time bickering with Carl Icahn and fighting market sentiment.
Unfortunately "pulling a Dell" is only feasible years after a company's market cap has completely tanked, and still requires a ton of capital plus no perceived value to anyone outside the company (otherwise someone else would buy it long before the company could buy itself).
The circumstances were ideal for Dell, and even then look at how incredibly long and tortuous that process was. It almost didn't happen, it's almost miraculous that it finally did. Good for them though, I wish it were much easier for companies to go private again.
Whether you're public or private, you're always going to have to deal with the people who own the company.
> I blame the stock market pressure towards growth
This is because almost everyone's retirement fund/401k/IRA is now directly tied to the stock market in one form or another. There are no more pensions, so people need growth in order to retire and live a reasonable life in their last years.
There are still pensions and they are also tied to the stock market. Pension funds also have greater influence over CEOs and they need growth every quarter because they are so underfunded (i.e. most pension funds are capitalized based on extremely unreasonable expectations of stock market growth).
So, the reality is that pension funds exert a much greater pressure on CEOs to create short term gains than individual investors who -- incongruously -- take a longer view.
Investors simply do not bid up stock prices for short term gains at the expense of long term. Stock prices are always based on long term expected returns.
It happens that CEOs manipulate the business to show a short term profit at the expense of long term, but if investors find out about it the stock will promptly tank to reflect the long view.
A common related idea is that CEOs gut businesses to satisfy Wall Street. This makes no fiscal sense at all. It can very well be true that Wall Street and CEOs often have conflicting and/or mistaken ideas about which way a business should turn, but it is never about gutting a business.
And yes, if a business can make more money by being "parted out" than as a business, then parting it out makes sense.
Pensions weren't/aren't invested in the market similarly to 401k's?
Yes. Always were. It's economics literacy that seems to be absent.
The people that are nostalgic for defined benefit pensions never seem to remember the corrupt Teamsters' pension funds. They only remember how their Fidelity 401k did in 2008.
There's been much talk about the "financialization of everything"[1], but how did it come to this? What change in laws, what shift in culture, produced this growth at all costs mentality? HFT? The Reagan Revolution? Moving manufacturing offshore?
I suspect also the end of Bretton Woods, and western countries moving from dominantly Keynesian economic policy to dominantly monetarism/neoliberalism (which is responsible for your last two points).
I suggest the following article [1] for more on neoliberalism, as well as Philip Pilkington's series of articles on the origins of neoliberalism.
[1] http://www.globalexchange.org/resources/econ101/neoliberalis...
Sometime in the 1970s or 1980s, companies started optimizing for maximum shareholder value; everything else be damned. HFT and offshoring manufacturing were deliberate effects of it.
https://en.wikipedia.org/wiki/The_Mayfair_Set
While done from a UK perspective, I think part 2 and 3 takes a trip over the Atlantic.
HFT has absolutely nothing to do with maximizing shareholder value.
I don't think it's the stock market. If anything, it's the CEO not focusing on making products that make people's lives better and instead focusing on what everyone else in SV (and the tech industry in general) is making.
Since when are the two mutually exclusive ?
And I am confused what Apple could be doing that the tech industry isn't already doing. There aren't many raw innovation areas to get into now. It's all about execution and innovation at the micro level.
For me Apple's foray into health e.g. Apple Watch, HealthKit, ResearchKit as well as their pretty amazing stance on encryption and privacy is absolutely about making people's lives better. It sure as hell isn't particularly profitable.
HoloLens came from Microsoft.
> I blame the stock market pressure towards growth.
Every capitalist mechanism insists on growth. That's literally what capital is.
The decline seems real to me. A shortlist of things I've noticed:
* Spotlight no longer finds things as easily. I used to use it for everything. Since updating to El Capitan, it has missed some exact match folders. Planning to switch to Alfred.
* iWork was gutted in '13. People used to use Pages professionally. I'm now using Pages '09, and planning to transition to Word or LaTeX. I tested Pages '13 intensively, and it fails at even basic publishing.
* Siri can only work with default Apple apps. And those default apps are getting worse. So Siri takes a hit with every app that declines. I used to use Mail, now I don't.
* Constant Wifi issues. I frequently have to turn off wifi, then turn on. On my home network. This never happened pre Mavericks.
* In general, all my Apple default software on my iphone is sitting in a folder titled "apple", which I never use. I don't think I use any Apple default App.
* I avoid iCloud. It sends scary "do you want to delete all these files" messages if you ever unsync a device, and it's not clear which actions produce which effects. iTunes has a history of destroying files on syncs, so I can't trust iCloud. Even now, iTunes will add apps to my device if they're in my library but I deleted them from my phone. It does this without asking! Every other cloud app has figured out how to handle deletions from one device.
Pages 09 hit the hardest. It was wonderful software. I used it for print publishing, and it just worked. Easy to use, incredibly powerful. Have a look at their manual for the level of care they put into their software, as recently as 2009.
Pages 13 can't do half of that. Very basic stuff like "facing pages" for books has been left out.
https://manuals.info.apple.com/MANUALS/0/MA663/en_US/Pages09...
Edit: A comment below pointed out that I do, in fact, use default apps. I had taken them for granted. These ones work well and I use them:
Messages, Phone, Camera, Photos, Clock, Wallet, Calendar, Music (the UI got worse on this one). Reminders I use occasionally because of the Siri integration.
There are some issues with some of them, but mostly they work pretty well.
On the Mac, the only default apps I use frequently are TextEdit and Preview. Preview remains excellent. I use Spotlight, but as noted above, it got worse.
I noticed the trend with Snow Leopard. Also the tendency towards iOS-ification of Mac OS, so I decided to switch back to Linux. I think Apple's best hour came with Tiger and Leopard. Simple applications that did one thing well and were robust. Now there's a lot of feature bloat.
A nice Linux set up with minimal software (e.g. xmonad, mutt, emacs or vim) is a joy to use, but takes time to set up and learn to use. So it's not for everyone.
Interestingly, I've experienced the same issue Apple is suffering with Ubuntu. Edgy Eft (6.10) was incredibly simple and nice. Like OS X Tiger. Now there are dozens of services running and something always gives trouble. I guess the old adage applies, make things as simple as possible but no simpler...
Ubuntu has been doing the same thing as Apple with their new UI, trying to jump in on the tablet bandwagon, with this belief that tablets are the future. This is fairly stupid, considering that Ubuntu's marketshare was pretty much 100% PC, 0% tablets. They gave their main demographic a slap in the face, and they never really did acquire any tablet market. Now, according to DistroWatch, Ubuntu is below Mint and Debian in terms of popularity.
There's something to be said about reinventing the wheel just for the sake of reinventing it, change for the sake of change. Constantly redesigning UIs that were working perfectly well, chasing after fads. In implementing a new UI, Ubuntu acquired a lot of new bugs, broken features and reliability issues. That's normal. Newer code is buggier code. You might think this "old" code is crufty, it might not be designed in the ideal way you like, but by throwing it away, you throw away years of testing and fixes too. That's an argument in favor of incremental evolution and refactorings IMO.
Maybe I'm biased. When I was 16, I used to find colorful desktop backgrounds and fancy UIs cool. Now I'm 30, and I just want the damn thing to f'ing work reliably. I don't need rounded corners, or transparency, or animations or even a desktop background. I'd be OK with a bland UI that looks like Windows 98, so long as the machine can do all I need it to reliably and fast.
> according to DistroWatch, Ubuntu is below Mint and Debian in terms of popularity.
Distrowatch is a shitty indicator of anything other than Distrowatch hits. Plenty of people like myself have been using Ubuntu for years and not gone to Distrowatch for years.
Out 'in the wild', I've never seen a distro that's not Ubuntu or openSUSE (desktop anyway). Most statistics on sites such as Wikipedia or Steam also point to Ubuntu's dominance (according to Steam's hardware/software survey, Ubuntu is about 7x more popular than Mint Rosa, which of course is also Ubuntu-based).
> In implementing a new UI, Ubuntu acquired a lot of new bugs, broken features and reliability issues.
To be honest, most of the issues I've ever had with Ubuntu are upstream issues. The 'Unity' interface these days pretty much IS the equivalent of 'legacy'. It's certainly not as radical as Gnome Shell or even some of the happenings in KDE Plasma-land. No one uses Compiz any more, except apparently Ubuntu (yes, I realize eventually we'll have a non-Compiz Unity interface).
> The 'Unity' interface these days pretty much IS the equivalent of 'legacy'.
Really? In such a short time it's considered 'legacy' now? The Gnome 2.x interface is legacy, but I wouldn't consider Unity to be so.
> Now I'm 30, and I just want the damn thing to f'ing work reliably. I don't need rounded corners, or transparency, or animations or even a desktop background
xfce - that's what I use. And I'm an even grumpier old man at 40 :-)
I use xubuntu ;)
I still don't understand why anyone thinks tablets are "the future" for anything other than passive consumption of content.
Convertible tablets like Microsoft Surface don't count -- those are laptops with detachable keyboards.
I noticed that recently, how tablets have gotten much bigger (12-13") and detachable keyboards are now the norm, I'm seeing them everywhere. It makes me laugh to think that the killer feature a tablet can have... Is a keyboard! I feel like we've sort of come back full circle. Netbooks were getting popular before tablets, because people had a need for a computing device that was more portable than traditional laptops. A Microsoft Surface, or a big iPad with a keyboard, those things are basically, like you said, thin and light laptops.
I've been pretty happy with Xubuntu for the last few years. It gives me a nice, no-frills desktop UI but still has access to the entire Ubuntu package ecosystem.
If you want something reliable you may want to put some time into installing a minimal Linux or BSD and creating a simple setup.
I have been running Arch for 7 years before switching to NixOS. I found using no desktop environment, just a tiling window manager and only text-mode software (except firefox and a document viewer) incredibly robust. So few moving parts I never had a major hiccup.
>Now I'm 30, and I just want the damn thing to f'ing work reliably. I don't need rounded corners, or transparency, or animations or even a desktop background.
Exactly. Sounds like you agree with nextos:
> A nice Linux set up with minimal software (e.g. xmonad, mutt, emacs or vim) is a joy to use
Immaturity: thinking that following market trends, instead of believing in (and even still understanding) your own quality, is the only way to sustain your business. Jitter.
Every business (and I'd argue project) needs to grow and change with how the world is developing. The Palm Pilot software was perfectly fine in 2004(?) but wouldn't stand a chance serving users' needs in 2016.
The Linux distros actually serve multiple 'customer' segments. On the client side there are at least two customers: one is end users, the other is the OEMs who pay for Linux to be preloaded. The traditional OEMs need a solution to the PC market shrinking while the devices market has eaten their lunch.
It's true you have to determine the difference between a short-term 'trend' and a long-term shift. I'm sure you don't think that the mobile market is a short-term 'trend'.
> wouldn't stand a chance serving users' needs in 2016
I'll allow myself a semi-trollish (only in appearance) rewrite:
... wouldn't stand a chance serving users-needs-that-they-think-they-have in 2016.
Trends go both ways: people think the new shiny will make their life gloriouser, so they run after that; then businesses run after that 'need', because that's what the milk machine wants. This leads to a constant spiraling where trends wave in and out, shifting properties by tiny amounts most of the time. Maybe that's the best the universe can provide, and if businesses didn't play that game they'd take blows too deep to sustain.
I don't know how to describe mobile. I believe it will be the tail of the so-called computer era, not a next phase. If I extrapolate, soon we'll have thumbnail-sized computers with single-digit watt consumption and GFLOPS of performance. They won't be a thing anymore. Maybe I'm going too Kurzweil.
> I don't know how to describe mobile. I believe it will be the tail of the so called computer era, not a next phase.
I agree with you: mobile is more of an extrapolation, or an evolution, than an entirely new phase or era. We're at the point where English becomes imprecise for defining whether something is truly different.
And, I completely agree about the loop between being customer-driven and then finding out that it's just a short-term shift. The worst part is that it's fundamentally difficult to tell if something is a short-term trend or a long-term shift.
Where I was going was a much more limited view of 'trends', more like a season in the fashion market. Most businesses have to respond on an annual basis to what their customers want - they can believe that something is a temporary trend, but they can't afford not to respond.
Perhaps we're in violent agreement on the nature of trends, and divided by time-frame and response.
Switch to a distro with rolling releases. I've been a happy Gentoo user for over a decade. It was a scary time when Gnome 3 came out and I had to mask a lot of updates, but eventually Mate became stable enough for me to switch over from Gnome 2. Total control the whole time. If something doesn't work, it's my own fault. Of course XFCE, KDE, etc. have been available the whole time too.
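For anyone curious what "masking" looks like in practice: Portage reads package atoms from /etc/portage/package.mask and refuses to install matching versions. A sketch of holding back the GNOME 3 stack as described above (the exact package names and version bounds are illustrative, not copied from a real mask file):

```
# /etc/portage/package.mask
# Hold back the GNOME 3 stack until MATE is stable enough to switch to.
>=gnome-base/gnome-3.0
>=gnome-base/gnome-shell-3.0
>=gnome-base/gnome-session-3.0
```

Everything else keeps rolling; an `emerge --update @world` simply skips the masked versions until you delete the entries.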
Have you tried MATE? There's a version of Ubuntu with it already bundled in - https://ubuntu-mate.org/
XFCE would be happy to have you :)
Well, it already does, I use xubuntu ;)
> This is fairly stupid, considering that Ubuntu's marketshare was pretty much 100% PC, 0% tablets
Ubuntu is both a commercial AND a community project. The goal has always been to take the power of Linux, make it usable for 'general users' and take it to the market winning new users.
The PC market is shrinking, so much so that all the major manufacturers are struggling (e.g. Dell going private, HP splitting itself, etc.). Meanwhile the growth in the next billion units is a) in China b) on 'mobile' devices.
IF you were in charge of strategy what would you do?
> They gave their main demographic a slap in the face ... Now, according to DistroWatch, Ubuntu is below Mint and Debian in terms of popularity.
The demographic for 'traditional' Linux is something like 2-4% of the PC market; the biggest thing that's happened since the 2000s is that OS X has stolen developer user base from Linux. Those users are well-served (arguably habituated) by the older interfaces, but more general users are not well-served. Even if you put aside the goal of winning new users, you simply cannot build a successful business on 2% of the market (particularly when that 2% of desktop users is not orientated towards buying anything and hates advertising).
It's tough to serve more than one user base, but Linux (Ubuntu in this case) can, as it's very flexible. There's still a massive pot-pourri of software and options in the repositories! I find self-described 'geeks' complaining about Unity really bizarre - if you're a power user it's literally 3 commands (touch .xinitrc; vim .xinitrc; exec <wm-of-your-choice>) to change the interface.
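The three-command claim is roughly right, because from X's point of view the "interface" is just whatever process ~/.xinitrc execs. A minimal sketch, assuming xmonad is installed (substitute any window manager):

```
# ~/.xinitrc -- read by startx/xinit; the exec'd process becomes
# your session, and when it exits the X session ends.
exec xmonad
```

Then log in at a console and run `startx`. (Under a graphical login manager like LightDM you'd typically use ~/.xsession or a .desktop session entry instead; ~/.xinitrc is specifically the startx path.)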
> That's an argument in favor of incremental evolution and refactorings IMO
That works if the old thing can be incrementally improved. The issue for Linux is that it's simply fallen behind the significant changes in the client market. At an infrastructure and applications level the FOSS/Linux environments aren't competitive to the other mobile offerings. And, it's basically impossible to maintain one complete stack for the desktop and a different one for the mobile space at the sizes the Linux companies are.
> Now I'm 30, and I just want the damn thing to f'ing work reliably.
> I don't need rounded corners, or transparency, or animations or even a desktop background.
The thing is, that puts you in the 2%; the things you care about are quite different. General users do care about animations, basically the whole UI "experience": to get them to change you really have to show them something different. Of course, you have to have some level of stability, but you don't win new users by telling them you're so much more stable. Users just restart the app or device; they carry a battery charger everywhere and just shrug and plug in. You only really have to read some of the comments in this thread to see what I mean ;-)
IIRC, Snow Leopard was actually the best release -- essentially Leopard but with a focus on stability and performance, plus the code signing stuff.
It's the last OS X release that felt like an improvement to me, and one of the last OS X releases that I trusted. Every one since has actually had feature regressions (let's make a formerly visible folder invisible!) or stability issues.
I still have a 2008 Macbook Pro running Snow Leopard and it's more stable than a 2012 MBP running Mavericks and performs about as well.
Mavericks was tough to give up: stable, fast, no discoveryd. IMHO, it's the best release of OS X. Maybe everyone has amnesia, but pre-10.3, OS X wasn't really usable full time. Each 10.2.x release was a whirlwind of changes.
Early iOS had wonky reset issues too.
Very true statements. However, those were growing pains. I feel like Apple's problem now is that it has completely given up on its overarching philosophy: to make software simple.
Instead, they're just offering us their own versions of things because they can, not because theirs are any better. Maps being exhibit A, and Apple's mail client not changing in 10 years (aside from some minor adds and removes) are examples of this.
Apple used to mean software for people who were not technology savvy. Now it means phone software for people who are not technologically savvy, and computer software that's just like everyone else's. Frankly, I can never figure things out on an iPhone. They're too confusing in their interface. Especially compared to my beloved Newton 2000.
Regarding Mail, I don't really want it to change. Other than the improvements in stability and speed that I've noticed over the years, Mail.app is pretty much exactly what I want in a mail reader on OSX. Sometimes Apple's changes seem purely for the sake of change and result in reduced usability (e.g. Photos), I don't want that to happen with mail.
> Apple's mail client not changing in 10 years (aside from some minor adds and removes) are examples of this
How is Mail.app supposed to change though? It still completes the task that it was designed for. I'll agree that Mail's stability has been spotty across releases, but how are you expecting them to change such a core app?
Don't worry. I remember. cifs.kext is hella more stable nowadays than back then. I used to have the entire OS come to a grinding halt if a mounted Samba (or NFS) share disappeared (e.g. mount local share at home, put laptop to sleep, wake laptop at school).
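The grinding halt described above is the classic "hard mount" behavior of NFS: by default the client retries a vanished server forever, blocking any process that touches the mount. A soft mount returns errors instead. The hostname and paths below are made up purely for illustration:

```shell
# Hypothetical soft NFS mount: "soft" makes I/O fail with an error once
# retries are exhausted, instead of blocking forever; "timeo" is the
# per-attempt timeout in tenths of a second, "retrans" the retry count.
# Server name and paths are illustrative only.
sudo mount -t nfs -o soft,timeo=30,retrans=3 fileserver:/export/home /mnt/home
```

The trade-off is that soft mounts can surface I/O errors to applications mid-write, which is why hard mounts are the default; for a laptop that roams between networks, though, failing fast is usually the lesser evil.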
I remember that in (I think 10.3) they "fixed" the issue by having a timeout dialog popup and ask if you wanted to disconnect... only it was too sensitive and would popup (and then go away before you could react) if there was any jitter in the network connection.
I'm so glad both my work and personal machines are still on 10.9. I've been contemplating updating, but everything I've read continues to say I shouldn't.
Absolutely. Snow Leopard was the pinnacle. I could run for months without a reboot and did a tremendous amount of heavy work on that old Macbook Pro.
Starting with 10.7 and seeing the decline coming, I was happy that Ubuntu ran very well on that laptop and that held me over until I left Apple hardware entirely.
Interesting. Snow Leopard was OS X's peak for me. It was Leopard with perf and stability improvements. Lion was when the iOS influence came to the mac.
I think we're heading for a synthesis of these opposing trends, the common core being a fatigue with complex bloated user interfaces, and a return to simpler design. Personally I enjoy command line interfaces, especially in full screen with huge fonts, and I've been using Ratpoison (one of the first tiling X window managers) like this since like 2004. But I also enjoy, at least in principle, tablet-style UIs. Neither is based on overlapping windows, for one thing, and they are both helpful for my quasi-ADHD/OCD tendencies.
Mac OS now is in a weird limbo between the Unix heritage, the Mac heritage, and the iOS heritage. I think unifying these strands is an enormous challenge, but also really cool and inspiring. Maybe Apple are going to focus on making cars or whatever, and some weird little upstart is going to come along and make something totally new. You can see people starting to talk about Slack as an operating system, but that's still very primitive.
Chat and command lines are interesting because they represent a huge paradigm that's been overshadowed by the "GUI" paradigm, namely the paradigm of queries and responses: you ask the computer something and get something back, perhaps asynchronously. It's such a useful, coherent model. Easy to program with. Portable across interfaces. Good for exploration. Cognitively appropriate. Etc etc.
I think the complexity and bug-riddenness of most "modern" desktop operating systems come from an over-complicated and incoherent model of operation. Mobile represents a new start. It currently lacks flexibility, but it's probably better to start with rigid simplicity than to try for flexible complexity, and unless you can perfect a genius UI right away I do think those are the options, if you're aiming for the general public who don't want to learn vim.
I also think the divide between application programmers and computer users is politically a Bad Thing and also a result of exaggerated complexity. Unix was always about user scripting, and it managed that because of textual data, worse-is-better, and RTFM. Apple might not be the company to get back to this, because they're making tons of money from making shiny coherent appliances for people who pay to think as little as possible.
So just generally I'm not holding my breath for Apple to come up with a wildly new and interesting paradigm here. They're a dinosaur. A very pretty dinosaur. I'm more interested in startups with weird ideas, like maybe let's use e-ink displays, cheap ultralight computers, cloud servers, natural language processing, and the interactive fiction adventure exploration paradigm to provide a new style of terminal gadget that's even cooler than Linux. (If you steal this idea, please don't fuck it up.)
Agree on all the items.
Not just Apple, I see similar issues with Google.
Google's YouTube iOS app has issues playing video correctly. It can't even buffer the segments correctly.
Google's latest Android Maps crashes all the time 1-2 minutes into navigation, which is extremely dangerous when I have to restart the navigation while driving. I can't depend on it at all.
I had to roll back to the older factory-installed version of Google Maps to make it work. Lately I've seen the older stable version start crashing more often, probably caused by "updates" on the server API side.
It will be a very scary world if this type of SW development process is applied to tomorrow's "self-driving car".
> Google's latest Android Maps crashes all the time 1-2 minutes into navigation
I haven't had these issues, but I have had significant issues with its performance. 15+ seconds to initialize. At least 3+ seconds needed to swap between transit options. Moving to navigation mode feels sluggish. And this is on a Nexus 6, with no issues with other apps.
When I choose "walk" as the navigation mode it puts an Uber route on there, which when selected gives me an ad to try Uber for the first time. I don't have Uber installed on my device. I don't want to take a fucking Uber, I want to see how long it'll take me to walk somewhere!
I get the feeling that something happened in the Google org responsible for maps. It was always a snappy app that was useful to me. Now it's slow and appearing to be an avenue for ads on the device that I bought directly from Google. Unacceptable.
That Uber ad was seriously misleading and annoying when I was traveling last month. I wanted to walk across the city. WALK. I get directions, it gives me a route and a time, it seems quicker than I expected, but not by too much. Start moving and look at it while I'm going and realized they wanted me to get in a car. The time it would've taken a driver to get to me, I'd have been halfway to the destination on foot.
Don't insert ads into an app in a way that misleads users. You'd be pissed if you opened a book, started reading at Chapter 1 and found that the first 3 chapters you read were just a tease of another book, yours starts on Chapter 4.
Yeah, Google web apps must have some scary Javascript bloat going on. I remember when GMail was lightning fast; now it takes 5-10 seconds just to load the first page of my inbox and chat windows.
I've switched to the Basic HTML version full-time. It's way, way faster, even having to reload the entire page for most actions (remember when the whole point of AJAX was faster webpages?). I miss a few features, but it's not even close to being worth using the many-times-slower interface and associated higher system load to get those back. Some of those features (inline "track this package" links in Amazon emails in the mailbox view, for instance) shouldn't require javascript/AJAX at all, but are simply absent, which is frustrating, but again, it's still worth it.
Plus I can, you know, close the tab. One of the worst things about these enormous "javascript applications" is the high startup time (remember how much we hated Flash intros with loading screens?) that leads to leaving the tab open, which means that tab's disgustingly-high memory use is a constant rather than only occasional cost (I'm looking at you, Asana!)
All iOS issues with Google software, I suspect, have more to do with Apple being dicks to Google than with Google fucking up. Remember, Apple is really trying hard to rid iOS of all Google programs. Remember Apple Maps?
I don't think it's just Apple's fault. Every time I try to use the Youtube app on my iPhone, I can never find the things I'm looking for, because they seem to ignore all of Apple's interaction guidelines and implement the Android UI instead.
The other day I was trying to send a link to a friend: instead of clicking the standard share icon, you have to click the arrow (that looks like an email forward?), which pops up a non-standard share UI instead of the standard sheet. Never mind the fact that to dismiss a video you have to first swipe it down (minimizing it into some bizarre picture-in-picture frame), then slide it off from there, instead of using a "Back" button/gesture like every other app in iOS.
Got any actual evidence for this? Yes, I remember Apple Maps; I worked on the team. Apple Maps was the result of Apple being smart and not wanting to be beholden to Google Maps forever; eventually you have to roll your own.
And yet, here we are five years later, and Google Maps is still on the iPhone, shows no evidence of leaving, and Google's iOS apps are ALL still present and better than ever, and Google is still the default search bar in Safari. Meanwhile, Apple Maps is a great product too and has been worked on and polished for several years. "Remember Apple Maps" betrays a mindset where you read one article a few hours after it was released, half a decade ago, and you have no updated impression of the product at all. Apple Maps is, in fact, rock-solid for me right now. It is, in fact, now superior to Google Maps for transit directions, and in some other ways.
So really, I'm not sure what you are talking about, at all.
I wouldn't be so sure. I run Google apps on Android, and although I'm not sure the situation has gotten worse, it definitely hasn't gotten better. I can concur that navigation will crash without warning mid-route, maps will suddenly decide I'm somewhere very, very far away (and change the results list to "match"), and so on. It's not unusable by any means, but it could stand improvement.
Back when iOS 6 was released with Apple Maps and no YouTube, Google could be forgiven for problems with their replacement apps, as Apple had somewhat surprised them with the timing of the removals. That was three years ago, though. In the present tense, it's hard to imagine what you mean; it's not like iOS has some library to inject crashes into Google apps, or Apple is preventing updates to those apps from being published on the store (as ruled out by the frequency of updates).
There's the fact that iOS currently doesn't allow replacing the system hooks for Maps and Safari with Google Maps and Chrome (or any other store apps), but that's not the type of problem the parent was complaining about.
If you look at YouTube on android, I see evidence that they are f'ing up there in how advertisements are presented.
Put yourself on 144p quality. Watch a few videos till you get to an ad. It seems like they disregard your preference and pump up to 4k for your ad, which sucks when it takes 2 minutes to download a 30 second advertisement.
The crash issues I had with Google Maps were on Android. I checked the reviews on the Google Play Store; it wasn't just me or my Android devices, almost everyone had those crash issues at the time.
I can't figure out how Google can release software like that.
Google can't blame iOS/Apple for that.
I suspect some of those crashes are hardware related.
Anecdotal for sure, but Google Maps has been solid for me on Nexus devices.
Anecdotal * 2 = still anecdotal, but my 1st gen moto g has never given me any problems with Maps, other than being a little laggy from time to time.
Samsung?
Rock solid for me on my OnePlus One, too. I've used it for some fairly long driving trips as well, where it's been running for a few hours.
No. Google has the same access to things as any other developer. It's Google, full stop.
> Google's YouTube iOS app has issues playing video correctly. It can't even buffer the segments correctly.
Do you have T-Mobile? T-Mobile started throttling everyone's videos. I had problems similar to the one you describe and had to turn off T-Mobile's throttling 'feature' in order to get YouTube to work correctly.
YouTube for iPad is terrible:
1. It has no back button. Sometimes I watch video A, then from the recommended list there are videos B and C. I select video B, and the recommended list updates; if I then want to see video C, I have to search again.
2. There is no volume control in the app; each time I have to use the hardware buttons on the iPad.
Completely agree with the Wifi issues. My Macbook has become extremely unreliable, it often takes me five minutes connecting/disconnecting to get online.
Recently I've additionally seen random network slowdowns that are impossible to debug, and they happen only on a single Macbook -- all other machines on the same Wifi are fine. Completely unreliable.
Glad someone else noticed that they hobbled Pages '13. You used to be able to link text boxes in Pages '09; they removed that feature, and countless others, in order to "align" the functionality with their inferior iOS product.
Pages is the only piece of software I have on my Mac where I need to keep the previous version safe because it was so much better!
Indeed. I'm in constant dread that they'll make Pages 09 incompatible on a future OS update.
I have a migration plan, but it will be a lot of work. Whereas my Pages 09 workflow is perfect. Pages 09 did exactly what I wanted it to. I've heard no one say similar things about Pages 13.
Clock has bugs. I've had alarms not go off 3 times in the last year. I have pictures to prove it: when an alarm goes off it's no longer "on" (its switch slides to the off position), so the fact that the switches were still on 30 minutes after the alarm time passed means the alarms didn't go off. I nearly missed a flight.
Music, once it went flat UI, is now complete crap. It drives me nuts all the time, and I use it often because there really isn't an alternative.
While I'm ranting: one thing that's been bad from the beginning is the lock-screen music UI. The next-track button is just a few mm from the volume slider, which means about once every 2 weeks I blow my ears out trying to skip to the next track. That "feature" carried over to the toolbar menu (or whatever the slide-up menu is called).
IMO the default music app being not very good is OK. The real sin is not allowing others to compete. Android's default music player (Play Music; even the name is crap) is awful. But it's not a problem, because there are literally hundreds of free and paid music players that will scratch your particular itch so well that you won't be able to use anything else after that.
Have you tried something like Ecoute on iOS? It's a third-party music player, and I've used it full time since Apple Music started trying to show up everywhere in Music.app.
Ecoute uses the same library as the default app, shows a nice simple grid or list of your albums/artists/playlists and is well designed. When you use any controls in either app it will be picked up in both. It doesn't matter in which app you start a playlist/play/pause/skip/whatever. With that in mind, using Ecoute as default app was as easy as replacing the icon in the dock. No need for a default music app setting in my opinion.
More here: http://www.pixiapps.com/ecouteios/
And that tiny slider that indicates where you are in the track. Very, very hard to move it properly. Wasn't so in the non-flat version.
The Android stock alarm clock has the same problem since at least 4.0
Pages is an abomination. My partner was nearly in tears trying to do the most basic of table formatting.
iOS app 'screenshots' (in task selection mode) are often dated, even after multiple openings of the app. I get that they are meant to be just that, snapshots, but if those aren't up to date, you may as well just show the app icon.
Up to once a week, now, iCloud (in OS X settings) asks me to verify my password when "nothing has changed" (no purchases, etc). It then spins for a long time and there's no acknowledgement that it 'succeeded'.
Several more.
> Up to once a week, now, iCloud (in OS X settings) asks me to verify my password when "nothing has changed" (no purchases, etc). It then spins for a long time and there's no acknowledgement that it 'succeeded'.
Only once a week?
Try refusing to log in. You'll get pestered every five minutes, sometimes more often than that.
Try disabling it after not logging in: it asks you to log in to disable it, and then starts pestering you even more frequently. Completely asinine.
I too noticed the wifi issues when I got my new MBP with Mavericks. I had to stop/start the wireless service frequently. I eventually narrowed it down to some incompatibility between my Cisco Surfboard cable modem / WAP and the MBP. I reconfigured it as a bridge and connected a low-end Linksys WAP I had lying around. That solved the problem. I still think this is Apple's problem, since I'd used that WAP for years with many devices and never encountered an issue.
Wi-Fi issues is a big one. I had a MacBook that stopped connecting when on OSX, even over reboots.
I had to reboot the access point in the end so perhaps it wasn't an Apple problem. Yet no other devices were having issues and I've not had a problem with it since it's been running Windows.
This is a bit of a tangent, but is Wi-Fi becoming more finicky in general, perhaps due to spectrum congestion? I find I somewhat frequently have Wi-Fi related problems of the sort you're describing on a variety of devices. It seems Wi-Fi from about 5 years ago was perhaps slower but more reliable. Am I just imagining this?
The gutting of iWork hit my office hard. I'd been using Numbers, Keynote, and Pages for years in a professional environment. The lack of support for some of even the most basic desktop publishing features is mind-boggling. (e.g. There is no longer a way to rotate column header labels in the new Numbers. What? Why not?) There are a hundred trivial features like this that were totally scrapped across all the iWork software. Really basic stuff. It was a disaster for anyone using the software professionally.
Did your office find a Pages alternative, or are you still using Pages '09? (Or Pages '13?)
The apps on Mac have always been sideshows. My understanding is that one of the earlier versions of iMovie was essentially a one-man (or small-team) effort that was shipped and eventually replaced.
The difference is that everything worked.
My personal opinion, based purely on speculation, is that once Jony Ive & company rolled over everyone in the company, as with any corporate struggle, their priorities became the only ones that really matter. The Apple Music nonsense, which somehow managed to break an already broken product, is a similar story, I'm sure.
Lots of attention was paid to the visuals in Yosemite, but "unimportant" things like broken wifi, and gratuitous changes like discoveryd were allowed to see the light of day. I think Jobs was the only person able/willing to tell anyone to go fuck off, and the company is suffering from that loss.
What annoys me the most is that when my MBA wakes from sleep, around 10 processes update their state, pinning every CPU core at 100% for a few minutes. The computer is unusable during that period.
This! I wonder whether this is some kind of planned obsolescence, or whether it also happens on recent models.
My 2013 MBA is ready to go pretty much the instant I open it up.
On the WiFi issues: I have noticed that using a normal 2.4GHz router in combination with any Bluetooth mouse, keyboard, etc. causes lots of issues for me.
Typically I would be getting really slow speeds and intermittent disconnects.
Switching to a 5GHz WiFi router solved a lot of problems in the WiFi department for me on both El Capitan and Yosemite.
This is a very important point.
Bluetooth interferes with 2.4GHz WiFi, so most Macs with Bluetooth keyboards/mice struggle a bit on networks at this frequency.
5GHz is pretty much a requirement these days to get the best from a newish Mac.
You don't use any of the default iPhone apps? I find that hard to believe. What about these:
- calculator
- camera
- clock
- contacts
- messages
- phone
- photos
- safari
If you literally use none of these apps, I would be really surprised.
Since we're sharing anecdotes: I've never had a problem with WiFi connectivity or Spotlight on El Capitan across multiple devices. I don't use iWork or Siri, so I can't speak to that. iCloud seems fine to me, but I don't use it extensively.
Oops. I'll edit my comment. I use all of those. I was taking the ones that worked for granted.
Well, actually, I don't use Safari, but that's a very particular use case. I wanted to block sites, so I disabled Safari, enabled parental controls, and am using the Google app because it's less convenient and less likely to suck me into time wasters.
Though actually, the dictation search in Google now is better than Safari. And I like being able to have my homepage be a google search. Impossible in Safari.
The only issue I have with Spotlight on El Capitan is a strange one where sometimes, after entering the first character, the cursor jumps back to the start and overwrites the first character, turning a search for "text" (which should open TextEdit) into "ext" (which opens Extractor).
Well, it may be a bit of an exaggeration (I'm not the author), but do you think usage of Clock to set alarms (or Calculator) is strategic to Apple's future? I hope not. Whereas use of the browser or maps or email services is. And most people I see using iPhones don't use Apple's offerings in those (not trivially substitutable) areas.
As for my SO, syncing her iPhone 5 to the Mac (on 10.10) just doesn't work anymore. I'm going to have to spend hours debugging why the fuck not soon. And recently her iPhone has stopped connecting to WiFi. To make it connect, you have to delete the WiFi credentials and then reboot the device. She already walked through a reset with Apple over the phone, but it continues to happen. It's infuriating.
I'm always baffled by heavy Calculator users, to be honest. I've never understood the popularity of dedicated calculator apps or widgets, on any platform. 90% of the time you can simply round numbers in your mind, and if precision is necessary for big numbers that's likely a job for Excel.
Same for clock apps. Clock in your taskbar, sure; but who would ever start a dedicated clock app? I barely touch Contacts, and always from other apps. Photos is terrible, but that's been the case for a while now; the obsession with hiding the filesystem has degenerated into an unusable interface (sharing in particular is just mystifying).
Of your list, I do use Messages (you don't really get a choice, if you want to receive SMS texts...) and Safari (again, no real choice), as well as the Camera simply because I use it so little it's not worth my time looking for a better app.
> Same for clock apps.
That's where the alarm is, and personally I use it several times a day, to wake me up and to remember the occasional meeting or random events I would otherwise forget about.
I hate ios Photos, mostly because of the white backgrounds however.
> That's where the alarm is
Good point. I never really used alarms until Siri, because it was too cumbersome. Now setting and deleting alarms via Siri is probably the best voice-recognition experience I've ever had.
Interesting; it's only a few taps to set an alarm, and I'd find "speaking to the cloud" cumbersome.
Calling someone a liar without providing any proof, or even good reasoning on why it's likely beyond the minimum "I find that hard to believe." provided, is an asshole move. Please don't do that.
> Since we're sharing anecdotes
Well, some of his points were anecdotes, some were observations about how he interacts with Apple, and some were specific accounts of problematic functionality.
> iCloud seems fine to me, but I don't use it extensively.
The claims were that it had some poorly worded confirmations for deleting files when you unsync a device (subjective but testable), that iTunes has a history of destroying files on syncs (should be testable with some searching), and that iTunes will add apps to his computer after he deletes them from his phone (probably needs clarification, but definitely testable).
Responding to a comment with an accusation that it's full of anecdotes and then responding to a portion of that comment which has real, testable assertions with a hedged anecdote is not a useful way to further relevant discussion.
They didn't call me a liar. They just implied I was wrong. Big difference. You can be wrong without lying.
I was wrong. I had forgotten that I do use several default apps, taking them for granted, and overstated my case.
Since it was a statement by you about your own actions, saying he thinks you are wrong in that wording is calling you a liar. It could, and I think should, have been worded differently if the intent was to probe whether you were being hyperbolic or mistaken. E.g.
"I'm not sure how someone would go about not using any Apple app, such as calculator, camera, clock, contacts, messages, phone, photos, safari. Do you not require a lot of what these apps provide, or have actually found good alternatives for each of them?"
or
"I really can't imagine not using any Apple apps. How do you get by without any of them?"
In each of these cases, the problem is presented as a failure of the asker's imagination or knowledge, and we ask for clarification on this point. IMO, this is a much more civil way to converse than starting out by questioning the veracity of someone's statements about theirself.
Lying means you believe X but say Y. If you believe Y, say Y, but the fact is X, that's not lying.
I didn't say he lied, I said he was called a liar (but really, it's more that the strong implication was that he lied, that was overly strong wording on my part). Specifically, I think the wording "I find that hard to believe" has different connotations depending on whether you are talking to the root source of some information, or someone relaying that information.
For example, if I state "I had chicken for lunch" and you reply "I find that hard to believe.", I think the implication is clear that you think I'm lying. Alternatively, if a third party says "kbenson had chicken for lunch" and you say "I find that hard to believe", there is not a clear implication that the third party is lying, nor that where they got the information from is lying, as there are multiple locations in the chain of authenticity of such a statement where a mistake or purposeful misrepresentation could have happened, so it's not clear where fault may lie.
So, when someone states "I don't use any of the default iPhone apps" and another person replies directly to them with "I find that hard to believe", I think the implication that they are lying is clear (whether or not they were actually lying).
Now, as for your specific assertion, I think you are obviously correct in the general sense, although I am interested in your opinion on how you would classify someone who is very loose with regard to their statements and their certainty about those statements. If I made a statement asserting something, but thought there might be a 15% chance I was wrong if I really looked into it, would I be lying if I stated it as a fact, given the false certainty implied?
Specifically, in this case, if the author was 85% certain they didn't use any default Apple apps and stated as much without qualifying with "I think", or "probably", they may believe they are correct, but be ultimately wrong. Was it a lie to imply a higher level of certainty than existed? I'm not entirely sure how I would classify that. (Note: I don't mean to imply a specific state of mind for the original commenter, this is purely a thought experiment and that statement was handy).
I simply don't see the implied accusation of lying with "I find that hard to believe." It's merely implying that the statement is wrong. You seem to be interpreting it as an accusation based on the idea that a person making this statement about himself is very unlikely to be incorrect. I don't see that as at all unlikely, and you'll observe here that the statement was in fact quite wrong.
As for levels of uncertainty, I think blanket statements cover a pretty wide range. "I don't use the default apps" could be low or high certainty. If you said something like "I totally definitely absolutely never use the default apps" while you are actually somewhat unsure, then sure, that seems like a lie to me.
> You seem to be interpreting it as an accusation based on the idea that a person making this statement about himself is very unlikely to be incorrect.
I'm interpreting it as an accusation based on the prior explanation, which is less about likelihood of correctness and more about questioning the single authoritative root source of information. In truth, that reasoning is really my explanation of what I see in practice. I cannot recall an instance where someone said "I find that hard to believe" to someone's statement about their own current actions that did not also carry a clear "I call bullshit" connotation. That is, while logically "I find that hard to believe" used in this way can mean that a person thinks you might be wrong, I find that in practice it is not used this way, so it's irrelevant in this context. Specifically, I think the statement as used here carried a clear "I call bullshit" connotation, which is an implication of lying.
That said, I freely admit my experience in the use of this expression in English might be influenced by region, or even my own biased interpretation, and you or others may have experiences where it was used by or to you in reference to an assertive statement about your action in which there was not a clear "I call bullshit" connotation, in which case I would happily hear them and use them as counter evidence to my own experiences.
> "I don't use the default apps" could be low or high certainty.
To me, assertive statements like this do not exhibit low certainty at all, specifically because it's referencing current state. If it's about the past, it's open to recollection issues, if it's about the future, it's about possible future actions, but when you state "This is what I do", to me that is meant as a clearly defined statement of truth.
In my experience, calling bullshit is almost always about saying the person is wrong, not lying. Maybe carried away by braggadocio, but not outright lying.
You say it's questioning the single authoritative root source of information. I say there is no authoritative root source of this information. People's memories are bad, even about their own lives, and when they say something improbable about their own lives, it's not an implicit accusation of lying if you say you think they're wrong.
> In my experience, calling bullshit is almost always about saying the person is wrong, not lying. Maybe carried away by braggadocio, but not outright lying.
I'm not sure that follows most people's interpretations. Wikipedia[1] even asserts its usual use is in response to statements that are "deceiving, misleading, disingenuous, unfair or false". Personally, I see it used, and use it myself (sparingly), when responding to someone I think is knowingly misleading me (possibly in jest). I would be offended if anyone but a close friend called my statements bullshit and I believed them, as I would interpret that as an accusation of intentional misrepresentation.
1: https://en.wikipedia.org/wiki/Bullshit
> I say there is no authoritative root source of this information.
Sure there is, in the instances I am (and have been) referring to. That doesn't mean it's infallible, but in the absence of a third party witness or evidence to the contrary, sometimes all you have is the statement of the person about themselves. The only way to find a fault in those situations is for them to admit it. Related to what we've been discussing, I'm not sure I would distinguish between a lie and a mistake when someone recants immediately after a statement, and after questioning of said statement, in regard to their own actions in the way we've been discussing. I consider both acts of bad faith (if someone puts so little thought into their words that they must recant at the slightest questioning, then I view it as no better than a lie).
"I'm not sure I would distinguish between a lie and a mistake when someone recants immediately after a statement, and after questioning of said statement, in regard to their own actions in the way we've been discussing. I consider both acts of bad faith (if someone puts so little thought into their words that they must recant at the slightest questioning, then I view it as no better than a lie)."
I'd suggest you really ought to lighten up, understand informal discourse for what it is, and stop projecting your own harsh views of other people onto other people.
> understand informal discourse for what it is, and stop projecting your own harsh views of other people onto other people.
I don't think it's overly harsh to expect people, when talking about their own actions (and obviously when understanding the topic and terminology), and when in a discussion which is not a light hearted banter, to expect someone to put enough thought into that statement to make it true, at least to the degree it requires additional information to make them reconsider.
This theoretical exchange in a discussion regarding the merits of Chicken illustrates my point of view: Person 1: I don't eat chicken. Person 2: Are you sure? Person 1: Okay, yes, I eat chicken.
Person 1 has, at this point, proven themselves unreliable in their statements. Whether that is from a lie or a mistake is both unprovable and, to me, irrelevant, because the end result is the same. I may or may not converse with them further, depending on other cues, but functionally, what's the difference to Person 2 or observers? I don't think that's overly harsh, just stating the realities of the situation. Sometimes people tell small lies, sometimes they make mistakes, but every occurrence affects your view of them slightly unless you expected that statement to be untrue.
So, to bring this full circle: the original commenter that asserted they didn't use default Apple apps may have lied, but it's far more likely they were mistaken. Either way, I'll view their statements with a bit more skepticism now, as they've proven themselves capable of making a simple, assertive blanket statement about their behavior that they will recant at the slightest question.
Even so, I still feel the original reply carried an implied accusation of deception, for the reasons we've covered in depth. Regardless of whether the replying commenter was eventually correct, I don't think they had cause to use the wording they did at the time they did. We don't agree, which is fine, but that's why I felt compelled to make the statement I did, and while my wording may have been overly harsh (which I've already admitted), I don't believe my message was.
I assume we're done here, since your last statement was a single sentence whose purpose was to call into question my attitude, understanding, and actions, and offered little additional material to discuss? In truth, I found that somewhat belittling, but am trying to ignore that aspect, as I may be misinterpreting it (and I try to be more forgiving of speech directed towards myself than to others). I've tried to be civil, and while this discussion wasn't light hearted, I did find it informative.
It's funny, I was just regretting the latest OSX update that I made. My MacBook Pro (work laptop) is 2 years old and was great when I first set up my dev environment. Now, I need to restart every couple of days.
I should spend some time partitioning the disk and installing Linux.
> I should spend some time partitioning the disk and installing Linux.
That hour spent might save you twenty. But if you want the best experience you need a laptop that was certified for OEM Linux, like some of the Lenovo boxes. The user experience is better than a macbook. Rock solid OS. Everything just works. Seamless upgrades. A bunch of warm hearted people will carefully apply themselves to making the system work better while you sleep.
I lost several playlists with a recent iTunes update. Newer upgrades of iTunes have not helped.
Given how basic the concept of a playlist is to iTunes, I was surprised to find this was not fixed immediately.
Just a note, I was sick of Spotlight too, and I switched to Alfred, which is actually the "father" of Spotlight. And it is way better than Spotlight now. You should give it a try.
Preview was pretty cool, and for the most part still is, but now it doesn't update any more when viewing a PDF not in page mode but in continuous scroll mode. That basically means I cannot use it for previewing LaTeX PDFs anymore.
Skim.app is very suitable for LaTeX PDF preview (it even supports syncing, given you have an editor that supports it too, e.g. Emacs AUCTeX or TeXShop).
Pages used to be my word processor of choice. I have since subscribed to Office 365.
My understanding is that the desktop version of Pages was brought in line with the iOS version. Simple things like paragraph styles were taken out.
Which mail application or webmail/browser did you switch to?
> Constant Wifi issues. I frequently have to turn off wifi, then turn on. On my home network. This never happened pre Mavericks.
FWIW updating to El Capitan fixed this for me.
iTunes and iCloud deleting all my data is what sent me to Android. Deleting data should be a cardinal sin and require some conscious effort to do.
So we are back to the Apple in the days of Copland?
BTW, fixing these issues through software dev process is not that hard:
* Start by using or creating a fully functional regression system.
* It is not that hard to create system-level functional test coverage for wifi, Siri, search, etc.
* Require ALL SW developers to run full tests every evening and weekend. Code is not allowed to be checked in or merged upstream if any test fails.
* The test system should submit a report on all test script pass/fails and performance characterization to a central server every night.
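The gate-and-report loop described above can be sketched in a few lines. This is purely a hypothetical setup: the test commands, the `run_suite`/`gate_merge` helpers, and the report URL are all made up for illustration.

```python
import json
import subprocess
import urllib.request

# Hypothetical central server that collects nightly pass/fail reports.
REPORT_URL = "http://reports.example.com/nightly"

def run_suite(tests):
    """Run each named test command; record pass/fail per script."""
    results = {}
    for name, cmd in tests.items():
        proc = subprocess.run(cmd, capture_output=True)
        results[name] = (proc.returncode == 0)
    return results

def gate_merge(results):
    """Check-in/merge upstream is allowed only if every test passed."""
    return all(results.values())

def report(results):
    """Submit the pass/fail summary to the central server."""
    body = json.dumps(results).encode()
    req = urllib.request.Request(
        REPORT_URL, data=body,
        headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)
```

In a real setup, `gate_merge` would be wired into the version control server's pre-merge check, so failing code physically cannot land upstream rather than relying on developer discipline.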
> it’s tragic that Aperture and iPhoto were axed in favor of the horrifically bad Photos app (that looks like some Frankenstein “iOS X” app)
I find Photos.app to be the best Apple app I've ever used. It's simple, modern, works amazingly well between iOS and OS X, transparently manages 100GB of data between iCloud and local storage.
This entire piece is surfing on Mossberg's take and goes way too far.
From what we know, Apple does seem to be sorting out big challenges on software side:
- transition to Swift on 2 platforms, which won't happen until they decide to only support 64-bit OS X and iOS at some point in 2017 or 2018 or later
- OS X and iOS foundations are actually super solid. Accidents happen, but both codebases are now much more mature and stable. Probably the best they've had in decades
- manage the largest updates in history, yearly. You hear of bugs because everybody gets to experience them at the same time. Windows never got that many millions of devices updated overnight
On top of that, they do seem to have conflicting marketing priorities. They don't know what to do with iTunes - as a brand, as an app, as an experience. They're obviously conflicted whether a user should depend on the AppStore to do stuff (feature creep in Notes.app).
IMO this "Apple software sucks" narrative is more a consequence of stalled marketing than an engineering problem.
I don't see how anyone could call Photos.app the best app they've ever used. Right off the bat, I can never tell you why I'm in a mode where I can multi-select or not. When I double-click on a photo, all of a sudden an intermediate navigation thumbnail bar thing pops up, but it doesn't let me do the things the main thumbnail view does.
I don't know when I can use the back arrow or not.
Flagging no longer shows up on the pictures, so I have to look up to the menu bar to see if a photo is flagged.
I've got a few shared galleries with family. Does anyone know which photos are going to show up in Photos vs Albums (All Photos) vs Shared (Activity and your named shared galleries) and the My Photo Stream under Albums?
Virtually every time I recreate a photo library or get a new device, I need to turn off all the icloud syncing crap and photostream crap about 3 times before it finally starts syncing photos. And sometimes when it does, I click on a photo thumbnail and the wrong photo pops up.
Photos is effing TERRIBLE from a UI/UX, features, and functionality standpoint.
Eh, I think you could say a lot of similar things about iPhoto. That was one shitty app, certainly much, much, much, much more shitty than Photos …
There are always many things “wrong” (in your case it’s mostly your specific use case and the place you are coming from, basically mostly your own preferences and biases?) with something, always, always, it’s impossible to avoid. Can’t design for everyone, just can’t.
PS: Check the menu. It’s absolutely crucial for OS X software. It’s a design paradigm Apple has been unflinchingly following forever, with iPhoto, with Aperture, with everything. They hide all the custom stuff and little features that depend on subjective preference there. For example, check View, then Metadata to get your hearts on the thumbnails.
(Deactivate My Photo Stream. It’s deprecated and Apple will hopefully mothball it soon. I agree, it’s confusing, but that’s the debt you acquire when unthinkingly doing stuff. I agree that sucks, definitely. The All Photos album is also confusing, I agree. Apple should have abolished it, and would have, but people were whining about it. Basically, the Photos view has all your photos in chronological order, the All Photos album has all of your photos in the order you imported them. Also, don’t you have to explicitly share photos for them to be shared? Which is exactly what you want? I’m not sure what your multi-select point is all about. You obviously cannot multi-select when Photos just shows an overview. I agree that’s a tradeoff, but in my view one that’s very well worth it.)
In summary: Photos is modern, blazingly fast, cruft free and extremely logical. I love it. So much better than iPhoto ever was.
iPhoto was pretty bad, but it seems better than Photos. Perhaps I'm just more used to it. Thanks for the tip about the hearts, but I didn't mean in the title bar of the program, I meant overlaid on the photo itself -- the old flagging method was much easier to see at a glance.
I actually like the photostream. It's an easy way of getting recent phone snapshots onto my computer for exporting or doing whatever I want -- without keeping them around forever.
My multi-select comment refers to this: double-click on a photo in the thumbnails. You're now viewing an enlarged photo. Now there's also a double-row of thumbnails to the left of it. You can use this double-row to navigate, but not to multi-select. You must hit the back arrow to get to the OTHER thumbnail view in order to multi-select. I constantly make this mistake, it's so non-intuitive that you can't multi-select thumbnails to (say) export multiple photos.. not in THIS mode, at least, you now have to hit a back arrow to do the exact same task!
I'm not sure if they've fixed this in Photos, but a big gripe of mine with iPhoto was its "sharing" behavior. If you deleted your photos from your computer, it ALSO deleted them from your 'shared' sites; eg, Facebook or whatever. To me, this was absolutely asinine behavior as many of us use multiple computers, can't store our entire photo libraries on them at all times, and so on. For example, my workflow is to take a lightweight laptop with me when I travel, and the SSD can't hold my whole photo library; just enough to do quick edits, upload, then delete.
I'll have to experiment with Photos' 'sharing' support, but I'm afraid of breaking stuff from past experiences.
You can display hearts on the photos by going to the view menu! (As well as titles if they exist, the file format, …) That info is overlaid!
(I also think it’s entirely reasonable to only have the thumbnail bar for photo navigation. I think the one in iPhoto worked exactly the same? It just maybe wasn’t displayed by default? If you give it all the functionality of all the other views there are consequences to that, knock-on effects. For example, which image should be displayed enlarged if you select more than one in the bar? Keeping it simple keeps it simple.)
Ah, I see, if you turn things on in the view menu, it enables them to be overlaid ONLY in the main thumbnail view mode. When you're viewing a single photo, the heart/title/etc. is not overlaid on the full size photo or the nav bar thumbnail.
There are more PCs running Windows 10 than there are Macs in operation, so Microsoft definitely does updates at the scale of OS X and beyond.
Personally, I don't feel that Apple's software quality is getting worse, but then it has always been a mixed bag for me. I found the Apple quality halo to be a bit of a myth across the half dozen Apple devices I own. Seems comparable to Windows' quality.
It is not fair to single out Apple - all the software has gone downhill in the past couple of years - browsers, websites, apps, appliances, virtually everything.
My suspicion is that it's the proliferation of new ways of doing things - new languages, NoSQL/key-value DBs, new hosting platforms such as AWS, Docker, big data stuff like Hadoop, Storm and Spark, all kinds of embedded software (TVs, cars), etc - we've got a lot of new stuff lately, a lot of us don't understand what we're doing and we've introduced a ton of bugs.
I agree.
Though I cannot justify it with numbers I will add the theory that, with the advent of a weak US economy combined with outsourcing to less expensive, global programming labor pools, domestic software management has become kind of spoiled and lower in quality.
Software managers are now, more than ever before, empowered to say "We need to add XYZ feature" and they quickly hear back from someone somewhere in the US or the whole wide world who says "No problem. I'll get on that right away!"
And then the manager thinks they've done their job. Like that's all there is to software management.
A lot of domestic software managers don't seem to have more sophistication than that, or more insight than that. Because, in this marketplace, they don't have to.
This is my take as well. Complexity and the speed we get there has reached an all time high.
You can't be fast moving and have high quality. Software is like any other trade: polish takes time and you need to stop to polish things. This thread has a lot of people praising Snow Leopard. Frankly, the differences between Leopard and SL aren't that big, but that's two years of development at Apple to polish things up. Apple had a rare slow moving period to clean things up.
I suspect the mobile/cloud/whatever revolution is only recently slowing down and we'll be entering a polish period again soon. Then we'll be entering a complexity building period sometime after. It's a cycle that keeps repeating itself.
Agreed. Everything is more complex. Web app development gets harder with each iteration, and native app development too (for example, .NET XAML is far harder than WinForms, and that in turn harder than VB/Delphi).
And most of this is unnecessary complexity. Why hasn't anybody come up with better tools to attack that?
Adding more tools generally adds more complexity...
Because we don't need better tools. If things are getting worse, that shows we had the tools already. The problem IMO: We keep trying new tools where they aren't needed.
#grumpy #getoffmylawn
OK not new tools. But great tools. Most of the tools the web offers are pieces of shit - many developers will agree to that.
And many complain that the web stack sucks. A simple example: you have to learn CSS, HTML, JavaScript, and some backend language just to build, say, a business site, when you could do so at a higher level, just by pointing and clicking and entering some code, with something like Microsoft LightSwitch (which was discontinued).
Because it is harder to develop for developers.
I dream of reviving the kind of development made famous by FoxPro/dBase (or more exactly, a mix of Access+Excel+databases).
Form building is another area that is far more complex than it needs to be. The problem now is multi-platform: everyone (or most) want it, but no good toolkit exists for it...
FileMaker comes to mind. It actually is a nice development platform for beginners, but has major shortcomings in other areas. Wouldn't recommend it these days.
It's the same problem with Excel/Access and similar.
They have a good tool, but the language they use to glue things together is not a match for the tasks the tool does.
For example, Excel. VBA has an impedance mismatch with the way Excel visually works. If, instead, it had a more array-based and somewhat more functional language, then the way you use Excel would be the way you code for it.
That is in fact the power FoxPro has. You can make forms with tables because forms ARE tables. The syntax and basic operations work fine with the relational storage. You can SELECT INTO ARRAY and stuff like that, or just not use SQL.
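To illustrate the impedance-match point: in an array-friendly language, a query result is just an ordinary value you can bind to a form or feed into the next computation. Here is a hypothetical Python sketch of a SELECT-INTO-ARRAY-style operation; the `select` helper and the sample data are invented for illustration, not any real FoxPro API.

```python
def select(rows, where=lambda r: True, columns=None):
    """SELECT-INTO-ARRAY style query: filter a list of dicts and
    optionally project columns, returning a plain list of records."""
    out = []
    for row in rows:
        if where(row):
            out.append({c: row[c] for c in columns} if columns else dict(row))
    return out

orders = [
    {"id": 1, "item": "chicken", "qty": 2},
    {"id": 2, "item": "beef", "qty": 1},
    {"id": 3, "item": "chicken", "qty": 5},
]

# The result is an ordinary array of records.
chicken = select(orders, where=lambda r: r["item"] == "chicken",
                 columns=["id", "qty"])
```

Because the result is a plain array, the same structure can drive a form, a report, or a further query, with no translation layer between the language and the storage model.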
It's not a question of fair. It's just pointing out that for a company with huge amounts of cash on hand and the kind of hardware integration they benefit from (which normally makes software development MUCH easier), their products are simply not good anymore. Just because they're Apple doesn't mean they get a pass.
Agreed - the barrier to entry has been lowered significantly, and legions of bro-kiddies who now call themselves developers have been pumping out garbage
Here is the author's list of gripes:
On OS X this is especially true: OpenGL implementation has fallen behind the competition, the filesystem desperately needs updating, the SDK has needed modernizing for years, networking and cryptography have seen major gaffes. And that’s with regards to the under-the-hood details, the applications are easier targets: it’s tragic that Aperture and iPhoto were axed in favor of the horrifically bad Photos app (that looks like some Frankenstein “iOS X” app), the entire industry has left Final Cut Pro X, I dare not plug my iPhone into my laptop for fear of what it might do, the Mac App Store is the antithesis of native application development (again being some Frankenstein of a web/native app), and iCloud nee MobileMe nee iTools has been an unreliable and slow mess since day one.
I've found myself thinking along similar lines. The two biggest offenders in my opinion are (1) feature bloat and (2) poor UX/UI decisions. iTunes is a prime example of how the confluence of both can turn a relatively simple and popular app into a quasi-unusable nightmare.
I've also noticed a lot more freezing and kernel panics on my iPhone, to the point where I've stopped updating the OS for fear of what might be introduced in the next version.
Feature bloat is a problem in virtually every piece of software people have ever used.
Time and time again a startup comes along with "-foo- simplified", gets people to switch with its ease, bloats up the product over time, and then loses to the next guy.
Is Apple's software quality truly declining? I've been an OS X user for a couple years now, and I'm perfectly happy with my laptop and its updates.
Whenever I read articles like these, I wonder if the software quality has actually declined, or if it's received too many features to be properly maintained, or if it now has so many more users that the flaws have become more apparent.
Depending on how long ago you started using a Mac, you may have started using them after the slide started.
From my own experience, and the experience of other Mac users that I know, I think that the quality of their software has had a marked decline in the last decade or so. The exact reasons aren't totally apparent, but it seems like there's too much rush to add new 'features' without fixing and refining existing ones, and that annual releases aren't giving them sufficient time to polish what's already there. It's no accident that Tiger was incredibly stable and 'complete' after having nearly three years of work done on it.
I'm not advocating they work for three years on each version of OS X, but even 18 months, and fewer new 'features' per release, would give them more time to improve the quality of what they're releasing.
The main reason could be that they don't have a competitor in UNIX devices at the moment, at least when it comes to hardware quality.
Besides that, if you want to do mobile development, you need OS X at hand; again, not many alternatives.
All companies have a lifetime in the end, and I think Apple is enjoying its own version of Microsoft's 90s era.
I think I'm in the same boat, only a Mac user for less than 5 years. It seems to me that it's never been perfect, but that it remains somewhat above its alternatives, especially with the hardware/software integration they can have... but it can't compete with its own reputation.
I'm on the last El Capitan.
Actually I have to restart the Finder every few days because the network drives either disappear or are no longer reachable. Sometimes the shortcuts disappear as well, especially the ones that are located on a network drive (I tried both SMB and AFP).
I've also noticed a lot of slowness throughout the OS. The spinning colored wheel shows up very often compared to previous OS versions and on all kind of actions.
That spinning colored wheel issue went away for me when I upgraded from 4GB RAM to 20GB. It was constant and terrible in El Capitan before I did that, though.
My 2013 MacBook Pro keeps crashing with Firefox, a few times every day. I cannot use Firefox anymore; I had to switch to Chrome. And as I am writing this, Mail crashed because I wanted to forward an email with a large picture in it. And Messages has crashed numerous times; I don't use that anymore either.
So yes, for me Apple's software quality truly has declined since Snow Leopard.
How is Firefox crashing an Apple issue, assuming you don't hit the same problems with Chrome?
I was probably unclear: The whole computer freezes, and there is nothing I can do except a "hard" reset, i.e. pressing the power button until it shuts off.
But there are more issues still, e.g. that the computer will only connect to my bluetooth radio after a restart.
Firefox is nowhere related to Apple of course, but it is so bad on Mac. I recently went back to Chrome because it just works so much better on my macbook. (El Capitan)
>Is Apple's software quality truly declining?
I'm not really sure; it's hard to point to something that's just plain broken. I've noticed some small UI features I don't like in El Capitan, and it seems slower, but nothing's crashing or preventing me from working.
Part of it might be that Apple is moving in direction that many professionals/developers don't like.
iTunes is a different story though; that piece of junk needs to get replaced. Half the time it wouldn't even find my iPod, and I doubt Apple cares because "get an iPhone".
Indeed, I rarely have any kernel panics, or system crashes on OS X, and especially iOS with the latest iPhone. Other than iTunes, their software (that I use) has been pretty good lately.
I've actually had more problems with the hardware. My late 2011 MacBook Pro was just returned from the offsite Apple Repair with its 3rd logic board (that being said, it's still under an extended warranty from Apple, which is at least a consolation prize).
I'm not sure any OS really suffers from system crashes these days. The expected quality bar is now much higher (Fyi: I use Windows, Mac and Debian every day).
> or if it's received too many features to be properly maintained
That means that it's not properly maintained, which means declined.
> or if now it has so many more users that the flaws have become more apparent
More eyes do find rarer bugs, and hundreds of millions of eyes is a LOT.
But I can tell you from personal experience that all sorts of little simple things that used to work don't and it gets very obnoxious. I've been using Apple stuff since ~2004 (ignoring the iPod I had earlier) and I do think things are worse.
+1
Anecdotally, I've noticed a significant degradation of quality over the last year or so: everything from updating a new iOS device from an iCloud backup, to general iOS bugs, to OS X wifi connectivity issues, etc. It's been noticeable.
Besides little long time bugs (like my device prompting for password because it's been 48 hours since last TouchID when it's actually been like 8) there are bugs in new things.
Since 9.0 (or so), using CarPlay I find that it occasionally messes up the audio engine on the phone, requiring a restart to fix. That means instead of my old weeks and weeks of uptime, it now varies between 24h and 7d or so.
Having your phone's audio not work correctly? Horribly obnoxious. Especially when 70% of what you do is listen to music and podcasts.
I gave up on Podcasts and switched to Overcast because Apple (despite years) was unable to correctly sync which podcasts I'd listened to and which I hadn't. I'm wary of buying things from iTunes because I randomly end up with 2 or 3 copies of some songs in my library.
I STILL randomly get corrupted songs on my phone and occasionally have to reload all 45GB of music (everything I own, just easier since I have the space) to fix the issue. Don't know why.
I've been using OSX since 2003 and I think it's better than ever. My guess is some people are just nostalgic for a time when computers were easier to master/understand.
I've heard this said a lot recently, but I must admit it hasn't been my experience. I've had generally no problems with any version of OS X, having used it since Tiger; upgrades seem to have led to general improvement, with the usual proviso of various broken bits and pieces in the first release, pretty much in line with the advice that's always been in place for Apple services – don't use the first release. The same for iOS.
Regarding the specifics pointed out by the author: OpenGL support does and always had lagged behind (sucks) and there have been security gaffes (sucks). I don't agree that the SDK needs 'modernizing', whatever that means, however.
Apple has problems with software quality, granted. But is it realistically any worse than any other provider, or is it getting worse? That's not my experience, at least.
There's a weird combination of things going on with these articles – a mix of:
1. Forgetting how truly crap things used to be
MacOS is better now than ever. Systems 7-9 fell over about every 10 minutes, 10.3 was the first version of OS X to be good enough for work and 10.5 'Leper'- nuff said.
2. Ostalgie [0]
Maybe Windows wasn't that bad after all...
3. 'Steve Jobs wouldn't have allowed it'
Good grief – Apple have had a long history of stinkers even when Jobs was around. Remember Ping, brushed metal UI, Apple Maps v1, Mobile Me, the iCloud launch...
4. Civilisation is crumbling
Things were always better than they are now. And the sky is always falling.
That's not to say there aren't problems, there are. Point them out for sure, but I don't think that these 'things are getting worse' conversations are very helpful. Just seems like negative opportunism.
Somewhere in the dark recesses of my closet is a "System 7.5 sucks less" shirt. What's old is new again, I suppose.
While I think this is exaggerated, I also think there is truth to it. I would not blame this on an inability to make quality products but on bad high level choices. What is Xcode today used to be multiple separate apps. The sync process for the iPhone used to be separate programs. Now we are getting these huge monolithic programs like Xcode and iTunes. Making big monoliths just isn't good software engineering IMHO. They should stick more to the original sound Unix philosophy: make smaller apps which do a few things well. I think if users find it cumbersome to use multiple programs together, then the solution isn't to combine them all into one but to create an operating system environment which makes it easy to use multiple programs together.
I think Xcode, which I use every day, could easily be made into 3 different programs: 1) project management and configuration, one program where you define which pieces go into your project and how they are compiled, deployed, etc.; 2) an editor component which does syntax highlighting, refactoring, code navigation, etc.; 3) a GUI designer.
Sure, integration has its benefits. But it also has many big disadvantages. Many people like to use different editor components; e.g., AppCode does refactoring and code navigation better in many people's view than Xcode. Yet it remains an incomplete solution, as they do not have as sophisticated GUI design and project management tools. But if these were separate tools which could just as well be used with AppCode, one could inspire much more diversity.
Likewise, the monolithic iTunes locks out any kind of alternative third party solution. There should be one syncing application which can sync content like pictures, books, music, etc. But that is all it should do. Movies, music, apps, etc. should be separate apps, and we should allow third party alternatives to them. People might want other ways to display and organize their iTunes music, for example.
"I think xCode e.g. which I use every day could easily be made into 3 different programs. 1) Project management and configuration. One program where you define which pieces go into your project. How they are compiled, deployed etc. 2) An Editor component which does syntax highlighting, refactoring, code navigation etc. 3) A GUI designer."
We used to have that. The GUI designer being separate was never a good situation, considering how one would connect the IBOutlets and IBActions.
I'm glad to see that other people have noticed this. iTunes has always been a pain to use, but otherwise Apple's software used to be incredibly robust and worked well. Nowadays Apple seem to be pushing out updates without proper testing. It's got to the point where I'm hesitant to actually update my devices for fear they'll become unusable. And that's never good.
At one point both my MacBook and my iPhone were suffering from regular graphical glitches. It's still an occasional problem, but at least Apple have got on top of it for the most part. I just wish they'd do something to stop my iPhone from kernel panicking and automatically restarting itself every so often.
It's got to the point where I'm hesitant to actually update my devices for fear they'll become unusable. And that's never good.
Sadly, this doesn't seem to be specific to Apple either. It has become almost the norm in many parts of the software industry, to the extent that I don't voluntarily update any software that is installed and working any more, other than essential security updates. These days, even that is only done after a search to see which supposedly essential Windows updates are actually important for security, because I no longer trust Microsoft to be honest about what their updates are for either.
My default assumption otherwise is that running any sort of updater on application software, or heaven forbid on drivers or the whole OS, has a better than even chance of breaking something I care about, and that every time a browser auto-updates there is a close to 100% chance it will break something I care about or change something in a way I don't want or need. But since almost everyone is now producing similar levels of unreliable and unstable products, there's little you can do to get away from it.
Combined with the version ratcheting problem we were discussing a week or so ago here on HN, also in connection with Apple but also a much wider problem within the industry, it's becoming almost impossible to simply choose and use software you actually want, and to upgrade if and when a better version for your needs is available. This is not a good thing.
While I agree that Apple's software quality has declined in recent years, I fear that their hardware and other advantages (even with superior hardware and software, I don't trust a Google- or Amazon-based phone for a second to be anything but a data mine for them) will allow them to become (remain?) complacent and let things slide.
I don't even know if Apple owns the hardware quality title anymore. The Surface line from Microsoft is astounding in quality, and it's only a few generations old. The latest line did have some bugs (mostly due to Intel), but overall, when I walk into a Best Buy and compare a Surface Book with a MacBook, one feels like the future while the other feels like it's hugging the past.
The problem with Apple, and I've said this for ages, is that Apple is fashion. It's a status symbol. Sony used to be the same way, and look at it now: it had similar business verticals, pushed the same proprietary nonsense instead of adopting what the rest of the industry was doing, and it drowned as it expanded its portfolio of devices.
Apple makes great quality devices, but Apple also enjoys massive hardware margins. Hardware margins diminish with time, no company has successfully escaped it. So the question is, at what point does Apple go out of style?
> The problem with Apple, and I've said this for ages, is that Apple is fashion. It's a status symbol. Sony used to be the same way, and look at it now: it had similar business verticals, pushed the same proprietary nonsense instead of adopting what the rest of the industry was doing, and it drowned as it expanded its portfolio of devices.
Apple already went through this once, when Microsoft released Windows and NEC made "clones" that worked really well for a lot less money.
I think Apple will remain in style for this broad generation of consumer computing (mobile devices physically separate from our bodies). No other company has quite the holistic philosophy and approach that Apple commands, even if their dominance lacks perfection. Google and Amazon are ok, and Microsoft has a while to go, even if it is clear that they are trying very, very hard.
Once the nanomachines are here, there is no guarantee that Apple will remain relevant.
There's still nothing that truly competes with the MBP. IMO it's the only thing they have left that's clearly better than the competition.
The Pro line is rarely upgraded, you can get a Razer laptop with better specs, for a similar price, but you'll have to trade out the silver for black and a stupid logo. Otherwise...eh, the Pro line is less than impressive these days. I honestly think the iPad Mini and iPhone are their only superior devices now. Of course all they need is the iPhone, that's basically their only real money maker.
I agree with your concerns about a Google or Amazon phone being a datamine in your pocket, but I think that, for most people, it's not an issue: they simply want a phone that works for them and don't give much thought to how much information about them is being squirrelled away.
A huge factor in Apple's sliding software quality is lock-in: once you're on iOS using an iPhone, it's so much easier to simply stay on iOS than it is to move platform. Most people, myself included, will look at the effort required to move and decide that, really, the grass probably isn't greener enough to make climbing the fence worth it. Once you add in some other Apple devices --- say a laptop or an iPad --- staying becomes even easier.
The reverse is also true - once you decide to leave Apple, the lock-in becomes something you don't go back to. I used to be entirely Apple devices, MacBooks & iPods & iPhones & AirPlay everywhere. But I was fed up enough with my iPhone 4 that I got a Nexus 5 instead - which led me to a Chromecast, a Galaxy Tab A with S-Pen, which led me to a Samsung TV. The grass has definitely been greener for me. I still have my new MacBook Pro, but I don't "love" it like I used to love Apple products.
Then you decide to pull the trigger and move. I did and I am happy that I did. I still have a mac for my primary machine, but literally everything else is Android and Windows. I consciously make sure my data is portable. No lock in and it's not that hard.
I'm happy going Apple with my tablet and Android for my phone (actually I changed again and went Windows Phone). So I still can access all those old games I had on my previous iPhone through my iPad, but I can enjoy new things and try out other phones and operating systems, and spend less money doing so.
"A huge factor in Apple's sliding software quality is lock-in"
I don't believe that. You're just as locked into Android after using it for a while as you would be with iOS.
Um, Quicktime? iTunes? Apple's software was always terrible (as a Windows user).
QuickTime was always way ahead of the game in terms of playing videos with a fantastic user interface. They kept innovating the UI so that it would get out of your way while still letting you do the basic things people need to do to watch a video. Even today I feel like QT is one of Apple's non-failures. The only thing I don't like about it is that you can't easily rewind a few seconds backwards. It requires a really steady hand to make sure you don't jump too far back.
> you can't easily rewind a few seconds backwards
There is a technical reason for this, and it's not a QT program problem but rather a QT container problem. The QT container file does a horrendous job of synchronizing audio and visual streams. QT playback software works around this by intentionally lagging the start of a video by a few frames so that the audio and visual streams match up on the timeline. By not having to track the sync after the initial playback lag, the file plays more "reliably" and "quickly", but this also means that the decoder has no reference point from which it can scrub backwards. The "quick" in QuickTime really is a misnomer: this is lazy time, not quick time.
That's not true at all. The QuickTime container can represent each frame timestamp exactly, and it has some advanced features like edit lists. Every frame has a start and stop time, each sync sample is marked as such, and it's even possible to mark which sample each sample depends on. In Matroska, for example, the stop time is so unreliable you often have to analyse the frame content or wait for the next frame to know the duration of a sample, and who knows if the sync flag is true or not.
You are probably referring to the delay introduced by B-frames, but the mov container has an atom ('cslg') to store the max and min offsets and put everything in sync again.
Unfortunately, third-party mov demuxers don't support cslg or edit lists, so they only support the simplest mov files.
No, I am referring to the delay introduced by compressed audio streams within QT container files. The issue I refer to does not seem to occur with lossless audio. In those situations, the cslg atom, among others, allows the QT format to reliably copy edited stream data without rewriting the container file.
AAC, like MP3, introduces a padding of silence at the beginning of the stream. Because modern QT container files do not compensate for this, all audio and video streams within this type of QT file will be off sync by default. QT playback software waits for the audio stream to begin (waits for silence padding to end) before video playback begins, even though the streams themselves line up 1:1 in the container file. This is lazy engineering, not an advanced feature.
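The size of that skew is easy to sketch. Assuming the commonly cited AAC priming of 2112 samples (an assumption here; the exact value varies by encoder), the uncompensated audio/video offset at 44.1 kHz works out to roughly 48 ms:

```python
# Rough sketch of the AAC priming skew described above.
# 2112 priming samples is a commonly cited figure (assumption; encoders vary).
PRIMING_SAMPLES = 2112
SAMPLE_RATE_HZ = 44_100

# Leading silence the decoder emits before real audio starts,
# i.e. the offset a player must hide to keep A/V in sync.
skew_ms = PRIMING_SAMPLES / SAMPLE_RATE_HZ * 1000
print(f"~{skew_ms:.1f} ms of leading silence")  # ~47.9 ms
```

That magnitude (a few video frames at typical frame rates) matches the playback lag described above.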
From what I read on https://developer.apple.com/library/mac/documentation/QuickT... it seems it's possible to explicitly represent the delay.
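For anyone curious what atoms like 'cslg' and 'elst' look like on disk: a mov/QuickTime file is just a sequence of length-prefixed boxes (size, four-character type, payload). A minimal, hypothetical walker, which only handles the top level plus the 64-bit and to-end-of-file size forms, might look like this:

```python
import struct

def walk_atoms(data, offset=0, end=None):
    """Yield (type, start, size) for each top-level atom in a mov/QT buffer.

    Minimal sketch: real files also contain nested container atoms
    ('moov', 'trak', ...) that would need recursive walking.
    """
    if end is None:
        end = len(data)
    while offset + 8 <= end:
        size, kind = struct.unpack(">I4s", data[offset:offset + 8])
        if size == 1:
            # Extended 64-bit size follows the type field.
            size, = struct.unpack(">Q", data[offset + 8:offset + 16])
        elif size == 0:
            # Atom extends to the end of the buffer.
            size = end - offset
        yield kind.decode("latin-1"), offset, size
        offset += size

# Tiny synthetic buffer: an 8-byte 'free' atom followed by a 16-byte 'mdat'.
buf = struct.pack(">I4s", 8, b"free") + struct.pack(">I4s", 16, b"mdat") + b"\x00" * 8
print(list(walk_atoms(buf)))  # [('free', 0, 8), ('mdat', 8, 16)]
```

The sync metadata being argued about above lives inside such atoms: 'elst' entries can tell a player to skip the priming samples, and 'cslg' records the composition-to-decode offsets.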
Compared to VLC, QuickTime is kindergarten material and has been for a decade.
QuickTime on OSX has a nice feature to quickly move forwards and backwards when watching a video--just swipe horizontally with two fingers to the left or right. It's a little flaky at times, but works for the most part.
What? For me it was always super slow, ugly, and nonfunctional. It reminded me of the DivX player.
It definitely is now. Quicktime was always pretty bad. iTunes worked well in comparison to its contemporaries when it came out which is why the iPod did so well initially (before Apple had a music store). Safari on Windows was always awful and was discontinued without notice leaving users insecure without knowing it.
You forgot Xcode. Xcode has always been a mess.
tbf, Xcode has gotten a lot better in the last few years I've been working with it. Stability especially improved between Xcode 6 and 7.
Down currently, Google cache: http://webcache.googleusercontent.com/search?q=cache:Us-NZtX...
The worst thing about Apple is not the drop in quality, but the way in which every fan and user kicking and screaming about it doesn't make them realize how awful their products are.
We could have a rule that says "don't get the first version of an Apple product", but their products continue to be terrible beyond the first version. I recently lost my entire music collection because my Apple Music subscription ended, and now I have to download the tracks one by one via iTunes, even though I only want to stream them to save disk space.
This happened across both Windows and OS X, of course.
I bought a first-gen Apple product with this in mind (the new MacBook), and it's been surprisingly good. My USB adapter sucks, yes, so that's a pain, but since it's not my primary computer it doesn't matter so much. I wouldn't recommend it to anyone as their primary computer. That said, when the next one comes out I'm ditching this one and getting the new one; first-gen Apple products really seem to be mostly proof-of-concept (iPad 1 vs iPad 2, for example).
I love that an article entitled "declining software quality" is not pulling up because of "Error establishing a database connection" :)
I didn't suspect so much traffic, we're back up now :)
apple's stock symbol is AAPL, not APPL.
also:
> consumers know that Apple’s hardware is the very best, but more and more their using apps made by Google and Microsoft and Facebook.
"they're", not "their".
Thank you!
You mean "expect", not "suspect." "Suspect" refers to people or facts, "expect" refers to events. (I suspect that English is not your first language, so I expect that you will occasionally make mistakes like this.)
Go to his genius bar, ask for a refund ;)
Probably hosted on an apple server.
I read this article thinking it would provide some new information in 2016 or even a lot of details about what "declining software quality" the author has been experiencing. Instead, I found that it's a poorly written re-hash and re-linking of what others have written about in the last couple of years.
The author's limited experience is mentioned in a few sentences without a lot of details. The whole article could've been trimmed down to just two paragraphs including the links to the other articles, considering that the linked articles are not "recent news" to have additional commentary and considering that other people have already written about it with commentary.
Apple has been declining since about 2011 not only in software quality, but also in hardware quality, in innovating hardware, and in innovating software.
When I got my first Mac, it had Mac OS X Tiger on it, and everything about it really felt like it had the user in mind. These days everything about using it feels like it has shareholders in mind. The features being added seem like the kind of things you'd have executives brainstorming up in a committee, and then demanding that engineers implement.
It's a shame. It no longer does The Right Thing™ by default, and there are more hardware and software bugs than I can ever remember in an Apple product. I've listed them before but here's a short list again:
- This mid-2013 Mac Pro wakes up every few hours at night, even though I have disabled every "wake from sleep" setting that exists in OS X
- My mid-2013 MBP has, several times this week, turned on while closed and continued as if it were open (playing YouTube videos or whatever else it was doing), only shutting back off after about 120 seconds or so
- Every time I turn this Mac Pro on, it doesn't recognize the wired Apple keyboard that's plugged into the Apple Cinema Display, and says it's looking for a bluetooth keyboard, until I unplug and replug the keyboard in at least 3 or 4 times
- My MBP once emitted a (very) loud buzzing sound from its speakers, for absolutely no reason, that lasted about 3 or 4 seconds, startling everyone nearby, when no sound-based programs were running
- Yesterday I tried syncing my iPhone to remove about 200 songs, and iTunes said it would remove them, and then "did" remove them, but they were still there on the device, and iTunes once again showed them being present; it took a full iPhone reset to clear them off.
A few things here.
My last solid OS X machine was a 17" MBPro with a G4. I forget the OS - Cheetah maybe? That year I also had a developer Intel box. It seems like it was that transition that I could really feel things were going awry.
I'm not talking about the dev box; it was what came after. I had a 15" Intel MBPro that was really buggy hardware-wise. You could no longer apply updates without rebooting (something we all disapprovingly looked down on Windows for). The software just didn't work anymore either, or at least it felt like it didn't.
I think a lot of this was tied to more and more embedding of Apple apps, just like Microsoft did. That, and gaining some sort of complacency because they were no longer the scrappy underdog.
I've been waiting for another Snow Leopard, where Apple just says "no more 'features', just bug fixes and stability" (for iOS as well). But I think that's just a pipe dream at this point, and the more-is-better train has already left the station.
I agree. I won't write a list of the problems in detail, but I've been a Mac user since the 80's because I've always preferred Apple's elegant design and ease of use. But lately, it's hard to figure out how to do things, and it's very buggy. Both I and my son have had to reset our phones to fix problems. It looks like my wife will have to reset hers to deal with a problem with music synchronization.
And the design is way harder to master than it should be. For instance, to search for a track in Apple Music, you have to be in any tab except iTunes Store. OK, there's some logic to it, but it simply is not intuitive. Why can't there be something that specifically says "Apple Music" that makes it obvious that that's where you go to find tracks? I'm an experienced computer user, and in fact a software developer, and when I first purchased Apple Music I was mystified about how to search for a track. I had to Google it. My wife, who isn't used to Googling for these kinds of answers and isn't a technologist, has no choice but to ask me or the kids how to do things like that.
In the Music app on iOS, to make it show only the tracks you've downloaded, you have to tap the pulldown where you select Artists/Albums/Songs/etc.; there's a switch at the bottom of that pulldown. When you're wondering "What tracks do I have downloaded?", this is just not obvious. Once you get used to it, it's fine. But a naive user really needs a friend who's an experienced user just to figure out such basic things. Naive users may not even understand that the Artists/Albums/Songs/etc. pulldown is a menu. And even if they know it is a menu, it's not intuitive that there's an on-off switch at the bottom of it for showing all songs, which relates to a fundamentally different concept.
It all just seems like really poor design. I don't know what their problem is. But my son, who has always been an Apple user because I've been one, and who is applying to colleges like MIT to do engineering, is seriously considering switching to Android so that he doesn't have to deal with so many bugs. (Not that I know that Android is better.)
Good UIs have discoverability, but increasingly the software has - whatever the opposite of discoverability is.
Simple example: Mail tries to guess settings for you. If you don't want it to do this - it regularly gets them wrong - you have to uncheck a box in the Advanced tab for every account you have. (Because obviously, that's where you're going to look.)
Then, instead of making the change, you have to click any other account, just so you can get a save dialog.
And if the account is disabled, it ignores your change until you enable the account. Then you can finally save the change and start modifying the settings.
Elsewhere, the latest version of Logic Pro is so bad it's been causing outrage on user forums all over the Internet.
Product management seems to have become completely clueless about user needs, basic UI design, or QA.
I have no idea who's in charge now, but whoever it is has no idea what they're doing.
Setting up a plain IMAP account in Apple Mail is so much harder than it should be. It used to keep the wrongly guessed settings, even if you overwrote them, so you'd have to delete the account and start over for it to actually work.
Also their usage of words like "account", "mailbox" and "folder" never ceases to confuse me...
"I have no idea who's in charge now, but whoever it is has no idea what they're doing."
Isn't it Jony Ive? He has a very sensitive eye for physical design, but he may know, or care, nothing about good software design.
> ... whatever the opposite of discoverability is
I suggest "obscuratationalism"
I've experienced a similar issue with song removal. I've also had to jump through a bunch of hoops to finally delete a few old apps that kept coming back after being deleted as well.
If you think their software quality is bad, you should look at their software development software quality. I'll just leave this: http://www.textfromxcode.com/
Apple seems much more concerned with adding new features than fixing bugs, even if that means a significant portion of their users have a bad experience.
In my opinion, Apple has always been just as buggy as any other company; it's just easier to see now due to their size.
I use OSX as a glorified terminal and the odd game. Unfortunately it's not a great gaming machine, and it's a horrible terminal. Out of the box, OpenSSL and OpenSSH are shipped broken and insecure, and I'm not sure how anyone could seriously use Safari. Games often come late or don't work as well as their PC counterparts and sometimes require remapping joystick buttons with a third (fourth?) party application.
So by default, to do even the most basic of operations, you have to install Homebrew and Chrome. I'm not sure how things got this bad. When OS X came out it was kind of forgivable for the first few years, but now that I've been using it for 16 years it's unforgivable.
> I'm not sure how anyone could seriously use Safari.
I seriously use Safari everyday. Works fine for me. In my experience it's noticeably faster, more stable, and more battery friendly than Chrome.
I want to use Safari but I just cannot deal with having no favicons.
I fully agree with this article. So much of OS X is neglected. The recent releases only focused on minor updates to the UI, while many components in the lower layers have not seen updates since OS X 10.6 Snow Leopard (2009). And when they do, it gets as bad as the disaster around discoveryd, introduced with OS X 10.10 Yosemite [1].
On a small sidenote, Apple's ticker symbol is actually AAPL, not APPL. I hope you did not invest into the wrong company's stock!
[1] http://arstechnica.com/apple/2015/01/why-dns-in-os-x-10-10-i...
Issues around the sunset of Aperture or iPhoto, bad Photos, Final Cut Pro X, etc are most certainly not software quality problems, but product management problems.
On such large apps that need large teams of engineers, without good product management and also project management, even great engineers can do a terrible job despite all their efforts or heroics "on the ground".
Any sufficiently advanced product management problem is indistinguishable from a software quality problem.
OS X has major Thunderbolt issues. OS X crashes nearly every time I put it to sleep with a Thunderbolt monitor connected (10.9 through 10.11), daisy-chained Apple monitors aren't detected, Thunderbolt monitors aren't always detected when plugged in (as in they don't turn on), and waking up a machine after disconnecting Thunderbolt monitors sometimes results in a black screen where the OS thinks the monitors are connected.
Windows 10 on the same MBP, has none of these issues. OS X crashes significantly more often than any Windows machine I've used in the last 5 years (Windows 7-10). Apple used to brag about how OS X was more stable than Windows, but I think that Microsoft's need to make Windows compatible with so much hardware and software has actually resulted in a more robust and harder to crash OS.
I'm not much of a Mac user. I've never seen an OS crash since Windows 7 arrived, on either Windows (7, 8, 10) or OS X. Are scenarios where the entire operating system crashes really occurring in significant numbers?
As a very casual Apple user (i.e. I don't own any devices, but I sometimes use my girlfriend's iMac/MacBook), I would say this has been my opinion for a couple of years now. Pretty much every time I use OS X or an iPhone I run into bugs. A short list from the top of my memory:
- iPhone screen turning black with Siri enabled, repeating that she couldn't hear what I said, or something like that
- closing the lid on a MacBook, re-opening -> no more WiFi until you reboot
- iTunes download progress disappearing after a few seconds; changing tab and coming back would show it again for a few seconds
- randomly losing WiFi
I might be cursed when it comes to Apple software but that also means I won't go near an Apple car, even though I hope they have very different standards from the OS/phone teams.
It is even worse every time Apple pushes out a major iOS release.
The sentiment in this thread rings true to me. Usually the techies (i.e. HN readers/commenters) are ahead of the curve, so it will only be a matter of time before the cries from the general populace become louder.
Meanwhile, the last time I was on an airplane (December), every single woman over the age of 60 had an iPhone. This means it's not only reached critical mass (the late majority on the technology adoption curve), but that it's no longer hip.
I'm not sure what will be next, but I'm guessing it won't be Apple's.
Were I a gambling man, I'd have shorted Apple's stock right there after that airplane ride.
I wouldn't discount all the people who don't give a shit about being hip, but just want their devices to be safe and kick ass.
Apple haters seem to often have the misconception that people like Apple products for their style appeal. It's easy to think that, and not wonder if there are possibly some more important reasons.
That's all very well, but this entire thread is about how Apple's devices are slowly not kicking ass as much anymore.
So, if we're not buying them for style... they're not working as well... they cost $800... they have questionable tax and manufacturing practices... at what point does the public call it all into question?
This thread, in my opinion, is a sign of bigger and worse things to come for the company. The canary is singing.....
It seems that at Apple, engineering and software quality have taken a further back seat since maybe iOS 7. Success can do that to a company: high-flying bizdev starts to dictate over engineering and solid deliveries, sometimes forgetting what business they are in. Similar to what happened to Microsoft under Ballmer for a decade, with Vista, Windows Phone, etc.
Apple computers in 2005-2006 became awesome when they went Intel and OSX was solid, everyone loves to code on a pretty front end backed by *nix. Then iOS in 2007+ was solid for many versions. Slowly, it has been getting worse.
I tend to agree that the bugs in OSX during the last few iterations are quite unbecoming of a company with Apple's legacy and reputation. I hope it's just a temporary glitch in their process.
It would seem sensible for Apple to fork an OS like Debian and focus on the distinguishing features of Mac OS such as the UI, gestures, power usage optimizations, and Apple-specific applications.
From the user and developer point of view, the only major drawback would be applications: devs would have to port apps over to Linux, and users wouldn't have some of their favourite apps available.
It's becoming clear that Apple is not paying enough attention to the under-the-hood details of the OS and the security situation, and maintaining the OS is a huge investment for Apple. So it only feels natural to base Mac OS on Linux and, as I said earlier, focus on the distinguishing features.
Technically, though, it would still be a huge challenge, as there are lots of factors to consider, but in the long term everyone wins: Apple and Linux, though maybe not Microsoft.
I have been thinking about this idea of Apple using Linux instead of its proprietary Mac OS for a long time, though I have only considered the transition at a very high level. I would like to hear what the HN community thinks of it.
Edit: So instead of forking a version of Linux and developing it independently like Apple did previously, a better model might be one where Apple periodically forks from upstream projects and makes its modifications, something like what Linux Mint does with Ubuntu. That way, they continue to get the latest and greatest developments from upstream without investing additional resources in maintaining a forked OS (which is what is currently happening with Mac OS).
Except Apple already did this once, and honestly the under-the-hood bits are mostly fine. Virtually all the bugs people complain about are in userspace programs like Mail, Safari, or especially Finder. Using a different kernel and switching from BSD to GNU under the hood isn't going to fix any of that.
If Apple made a new OS using a Linux kernel and got rid of the NeXT framework stuff (no more Objective-C bullshit, no more deprecated Carbon, or Cocoa/ObjC/Swift), I think it would work pretty well.
Major issues regarding security and the filesystem keep cropping up (there are a couple of links in the article). The idea is that instead of spending time and effort maintaining the OS, they can use battle-tested Linux and focus exclusively on the distinguishing features.
And this will hopefully enable Apple to focus on the issues in the userspace.
You could run into licensing problems. Apple wants their products to be a walled-garden. Even then, you'll have people complaining that the core is bad software.
It's undeniably true that Apple's software has been lacking in quality for the last couple of years. I say this because my Siri stopped working altogether, the Photos app upgrade destroyed my library, and my iPhone keeps telling me the songs I downloaded aren't there.
For me, this is a great example of what we're talking about:
https://discussions.apple.com/thread/7257238?tstart=0
""Disk Not Ejected Properly" error after EL Capitan goes to sleep and wakes up again."It's because of things like this that I usually wait until the .4 release to upgrade. El Cap is at .3 and I'm feeling stupid for breaking my rule.
Btw, disabling "power nap" made the problem go away.
The UI and UX have also become much less intuitive and more complex. iTunes used to be very easy to use; now, every few updates, they make you relearn where to find things, which is annoying. The same thing is happening to the Music app on iOS. It's not nearly as good as the original iPod software. They've added features that don't improve the experience at all.
The 'maximize' button now making a window take over your whole screen being one of the oddest.
And disappearing scroll bars - why?!
Didn't this company write the HIG book?
You can force the scroll bars to be always visible. Look in your General settings.
Thanks!
System Preferences -> General
I've been complaining about this for a while now, and I'm glad more people are experiencing it and complaining.
I honestly dread Apple software updates because I know they're not going to fix anything I care about, and they're going to break stuff or arbitrarily replace perfectly fine, working features with new half-assed implementations that are buggy and don't work as well.
The latest example came just a couple of days ago, when I finally updated to the latest iOS, hoping maybe they'd made it possible to remove the "radio" feature from Music. They haven't, and as a bonus they replaced the "app close" screen with an ugly, broken one for no good f'ing reason; and the top info display with service/time/battery level now covers the top 20 pixels of several apps, obscuring the view in those programs. And they've added a few new crappy apps I have no interest in using but can't remove.
I still think their hardware is awesome, but their software is going down the drain.
If we believe what SV CEOs are saying about the scarcity of software developers and the difficulty of hiring, that they are limited by supply and not demand, then I'd expect that Apple is dedicating more of the limited resources to iOS, and OS X will continue to suffer. I think iOS is responsible for much more of their business and I don't see that changing.
Yeah. It looks like iPhone/iPad sales make up about 80% of revenue while Mac sales make up about 10%.
http://cdn.macrumors.com/article-new/2015/01/piechart115.jpg
Had been on OSX for desktop pretty much exclusively for the last 15 years up until recently. Have jumped to Windows due to my current focus on VR.
I used OSX for the first time in a few months yesterday, having installed Windows 10 on my Macbook a few months back.
For about 5 minutes I started to think that I missed it, it looks so nice, but within 10 minutes I'd had two fundamental networking issues crop up that slowed and then stopped me from doing my work without a reboot.
I own a fully pimped out top-of-the-line 2014 15" Retina Pro ($4k to buy at the time). For at least a year it had a WiFi bug that required me to enter a command at the terminal to disable some aspect of networking so that I get full WiFi speed. If I don't, my WiFi is limited to 10Mbps; once the command is run I get nearly triple that (i.e. the same as when I plug in ethernet). Why was I having to manually type commands at the command line to get my $4k laptop to connect properly to WiFi? 229 pages of people's issues here in just one thread:
https://discussions.apple.com/message/29742198?tstart=0#2974...
Also google "slow wifi MacBook pro"
It's possible this has been resolved in more recent updates and/or El Capitan.
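For the record, the workaround most widely reported for this generation of rMBP was disabling the AWDL (AirDrop/AirPlay discovery) interface; whether this is the exact command the commenter used is an assumption:

```shell
# Reported workaround for slow WiFi on 2014 Retina MacBook Pros:
# take the AWDL peer-to-peer interface down so it stops channel-hopping.
# Reverts on reboot; AirDrop/AirPlay stop working while it's down.
sudo ifconfig awdl0 down
```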
The final issue yesterday was network access to my NAS: just random connectivity. Sometimes I can get access, sometimes I can't. If I try to connect with an incorrect account password, access hangs and I'm not prompted to log in again, so I have to reboot to get a network login window. It's just so frustrating, because I would love to stick with Apple. I've had none of these issues with Windows 10, or even Manjaro when I tried it again recently. But I do really miss lots of 3rd party software that I can only get on OSX.
On the flagship iPad Pro with Logitech keyboard and IPSEC pre-shared-key VPN:
iOS         Logitech   VPN
9.1         OK         OK
9.2         Fail [1]   OK
9.3 Beta 2  OK         Fail [2]
Meanwhile, iOS 9.3 beta is slower than iOS 9.2, especially on older devices: http://www.cultofmac.com/406805/caveat-emptor-ios-9-3-beta-1... I think we can take that as evidence that regression testing is hard.
Yes, although the VPN regression was reported in iOS 9.3 Beta 1 and remains unfixed in iOS 9.3 Beta 2.
All throughout elementary and middle school, I used Macs. My schools had them, and my family had one that I used for all my work. I went from System 8, to 9, all the way through 10.0 to 10.6.8.
I bought a Mac this summer to do some OS X-specific development, and I was truly shocked by how bad things had become on the desktop. Performance is generally poor, UI has apparently gone out the window, and the UNIX features that Apple once heralded have been hidden below new toolkits and proprietary replacements of old systems (discoveryd anybody?).
In many ways, I was reminded of how things were both on System 9 and during the early releases of OS X. It's not nearly as bad now as it was then, but Apple also doesn't have nearly as many reasonable excuses for such a regression in quality.
Aside from getting to the top of Hacker News, I don't think this article achieves much. Listing things like Photos instead of iPhoto is more of a product development decision than a systemic quality problem.
I run Parallels and Windows 10 on my MacBook Pro, and Windows runs smoother in there than it does on my Windows laptop. Maybe that points to superior hardware and integration, but I think it's also because OS X is more efficient than Windows and the 500 lb sack of goiter it drags around in the form of legacy support for everything from the last 30 years.
I used to be a Windows computer and Android phone user. Now I'm an iPhone and MacBook Pro user. Every ecosystem has issues. I find that iOS and OS X still give you the smoothest ride, though.
I think the decline is overstated. Couple related points-
- They at least offer updates; many other vendors offer few to none. Is that how you achieve quality? My wife's iPhone 4S is still receiving updates and she's been happy with the device from 2011 to today. I've always used competitors' products and never got this. As a result, we will continue, and increase the scope of, our purchases from Apple.
- Software is hard, many pieces of software that I've dug into I'm left amazed it even works.
I'm not outright dismissing this blog, but I wish it included who the leader of software quality is, since the statement is that it's not Apple.
My experience leads me to think that if they're an example of failing software, wow... what about the rest?
That's the vendors' fault really, forcing their crap software onto the OS and not wanting to invest in keeping things up to date to allow updates. Google and Microsoft both do regular updates to their OSes, and if you go with a non-vendor phone (like a Nexus), you'll get all those updates.
But yeah, the software situation from most phone vendors is atrocious.
The biggest annoyances for me are quite minor, but common occurrences.
Multiple password prompts, having to login twice after a restore, terrible multi-monitor support, lack of window management, unclear system prefs (or things have moved again since the last update), hidden options (holding the option key is so unintuitive) and shortcuts, networking tools are poor, weird update process, notifications blocking the UI of apps, strange finder UI.
There are a lot of UX issues that make it a frustrating experience.
I was a fairly late converter to OS X, having used Windows and Linux before. I used to hear people berate Windows for issues similar to those above, and I was surprised on switching to OS X that things aren't all that different/better.
I actually enjoy those little quality fails in Apple and Google apps. It's kind of reassuring that even billion-dollar companies make the same stupid mistakes I make with my code. Quality is hard, especially when you need to ship.
Along with a few other issues with OS X updates over the last month or so, iTunes continually downloads previous purchases that have already been downloaded like 5 times.
This is an incredibly stupid bug and it's eating up my disk space (until I clean it up of course). I am literally talking about 10 entire albums at a time every time I make a purchase. It's been going on for the last year. I'm not sure why I still use iTunes anymore to be quite honest. I buy my songs through iTunes and then they get uploaded to google music, and I think I've just been going about it as a matter of habit all this time.
This is for entertainment purposes only type of question - what if they moved just the desktop to Linux instead of OS X and port Cocoa atop Wayland? Leave the mobile stuff on iOS, which it seems it's good for.
This is the kind of revolutionary thinking Jobs had when he went OS 9 -> OS X... could Linux set them free on the desktop by open sourcing the desktop OS X?
I know, I know, it sounds ridiculous, but there's bits of OS X I like...and bits of Linux I like, as far as technology, architecture, speed, GUI frameworks... like some combo of the two would be a killer OS for the ages and open source too.
If this happened I would be unimaginably happy. Actually, one could argue that the move to open source Swift and invest resources into making it run on Linux could be keeping this option open.
And really, I don't think it's very ridiculous. As you've said, it's more or less exactly what Apple did with BSD/Mach and the OS 9 -> OS X transition.
What would that get them? OS X is already built on top of BSD.
I happen to disagree about software quality getting worse. Most of the people in this thread bashing Xcode don't even know how to spell it, which should tell you something about their level of ignorance on the matter. I use it every day and the Xcode team has been knocking it out of the park.
And OS X -- it's hardly bad by any stretch. Mostly it's fantastic. There are the rare glitches, but as other commenters have said, there are several factors at work, what with increasing complexity, efforts to get platforms to work more closely together, proliferation of multicore CPUs, 64-bit, evolution of security and networking technologies, bringing Swift and modern programming language patterns into the technology stack, dealing with changes in unstable third party platforms, and preparing for whatever unknown things Apple has in the oven.
Apple is trying to bring the future to us. Doing this kind of work, over multiple versions with plans spanning years or even decades, is not simple. This kind of work has been likened to rebuilding a 747 aircraft while it's flying. IMHO it's probably even harder than that, and Apple is doing a damn good job so far.
Admittedly there's a lot of assertion here, from my own experience of things working just fine, and not much (even anecdotal) evidence, but I just wanted to say one needs to view the small bugs in the perspective of all the significant work that is being undertaken.
I don't think releasing stable software is harder than rebuilding a 747 while it's flying. Especially for one of the most valuable companies in the world. If they can't deliver stable, high-quality, easy-to-use software while also meeting their goals, then they need to change their goals.
> multicore CPUs
> 64-bit
> "evolution of security and networking technologies" (?)
> Swift
> "unstable third party platforms" (?)
> "whatever unknown things Apple has in the oven"
None of these are valid reasons to release broken software.
Obviously opinions differ on whether the software is stable or broken. I think it's stable, and not broken, and I'm speaking as someone who actually uses it.
My home systems are Windows and my Work machine is a mac. I find I'm more productive on Windows for Office/Browser like stuff and more productive on my Mac for dev kinds of things.
However, I generally avoid most of the default software that comes with both OS's with one exception: Windows Explorer is lightyears ahead of finder. Finder is really quite terrible and behind in usability and UI from Explorer. So while I avoid Finder, I still use Explorer for many tasks...so many that I don't bother with most "organizer" apps.
There are some things on Windows networks that are really nice, like RDP (which is much nicer than VNC).
Both OS's are pretty rock solid in my experience and across multiple machines. It's actually the Linux machines I come in contact with that are super flaky.
However, I've found that software on my work Mac is pretty crashy/flaky compared to Windows. I also notice that app developers seem to play more monetization games in OS X-land compared to Windows e.g. I just had a free app I've used for a year auto-update and disable itself because the author decided he wanted to turn it into a paid app. To be fair, the crashy flakiness seems to coincide much more often with some combination of opening-closing the lid and losing VPN connection to some servers.
Either way, both OSs seem to be about even to me for 90% of what I do, and the parts where they are better than one another don't really overlap.
I'm glad someone has written about this because it's something I've noticed over the past few OSX major releases. The interesting thing is that speed and stability appear to have been getting better for my use cases. However for the first time in my 20 years as an Apple user, this has been at the expense of visual fidelity. Both iOS and OS X seem to be slightly "glitchier" and applications end up in bad states where the only clear path to a fix is relaunching.
Not to be a stickler, but this particular part of the discussion seems unfair to me. Application bugs should be blamed on the app developer, not on OS X. (Unless, of course, it's one of Apple's built-in apps.)
I use a lot of Apple's software and, for the most part, it works great for me. The only outstanding issue I can think of at the moment that I'd like to see fixed is Safari/OSX is slow opening new tabs because of the new 'favorites' page. Temporarily I've gone back to opening new tabs with 'empty page' to solve the problem. On a daily basis I'm using OSX for 8-10 hours and iOS for 2-4 hours. Everything pretty much just works as expected.
I got the same feeling as the author when I read Mossberg's comments. If someone who is that big of an Apple fan is seeing the cracks in the stucco, it is something Tim Cook really needs to pay attention too. Marco Arment kicked off a similar storm last year at this time[1].
[1] https://marco.org/2015/01/04/apple-lost-functional-high-grou...
> they need to grow the world’s biggest company every quarter to keep Wall Street happy
Keeping customers happy is important. Keeping "Wall Street" happy is the root of most problems.
http://www.forbes.com/sites/bruceupbin/2013/06/30/the-six-ha...
I've been using Apple hardware since cutting my teeth on an Apple //e and then progressing through a continuous chain of Macs now totaling well over a dozen since the late 80s, and my perception is not that things have gotten worse of late—it's been up and down the whole time. My take on this is that Apple has actually improved at their traditional weakness of cloud services to the point where now people use it enough to complain.
Cloud services are the one area where newer companies like Google and Facebook have a distinct advantage over Apple, and really the only threat (Android can never be as stable as iOS because of Apple's vertical integration). Cloud services are amazing because they add so much value in the mobile era, where you can easily operate from multiple devices without syncing nightmares. Google's cloud services are the thing that makes Android better than iOS in the areas where it's superior.
Up until now Apple's offerings were so ridiculously terrible that no one even used them, whether it was iDisk or MobileMe or whatever other branded services they churned through: they were utterly unusable. In the iCloud era they've actually become usable, which exposes warts that would hitherto have been invisible. Whether it's iCloud app developer woes, message syncing issues, or Photos complaints, these services are all used now and are at least arguably competitive with Google's offerings in the same spaces.
Sure there are warts, and sure it's annoying OS X doesn't get the love it deserves as iOS drives the company, and sure they're struggling with the scope of maintenance they're now saddled with, but that's not really the sky-is-falling narrative that tech bloggers have been pining to snap onto the Tim Cook era.
Apple fans blow expectations way out of proportions. Apple products are fairly decent but far from perfect. Being a web dev, the only reason I use a MacBook Pro is for having a performant Photoshop without a VM. Otherwise I prefer Ubuntu due to the hardware choice and for almost every other aspect.
To be honest, with the Retina MacBooks, memory consumption has only increased as 2x graphics and images have become more common on the web. But the RAM on a MacBook Pro is seriously lacking. 16GB is simply not enough, even without running a VM. All it takes is around 100 browser tabs and an IDE for the RAM to be exhausted.
I wish they had a 32GB version that's not a desktop. And no, their hardware isn't necessarily perfect - despite what the fans like to believe. I have seen 4/15 MacBooks with pretty uneven display backlight that normal users don't notice. The best thing about MacBooks in 2016 is still the trackpad and a pretty design - everything else is outdated now. You can find equivalent or much better in all other aspects of hardware.
>16GB is simply not enough even without running a VM. All it takes is around 100 browser tabs and an IDE for the ram to exhaust.
Browsers are notorious memory leakers. I find my rMBP 16 GB is fine even with an IDE and VMs, but I do have to restart Chrome or Safari at least once a day.
> I wish they had a 32GB version that's not a desktop.
There have been limitations with Intel chipsets there until the latter half of 2015. I'd expect to see a 32 GB and maybe even a 64 GB MBP this year.
It's surprising just how bad Apple's software has got in a seemingly small space of time.
Yosemite was the biggest disgrace of all. My mouse and keyboard would not wake the iMac after it went to sleep. And the beach ball was seen so frequently that my 1500 CHF iMac was useless after just 2 years. El Capitan finally fixed the sleep issue and the beach ball is seen much less, but I'd expect much more for the price I paid. Apple's profiteering on getting an SSD in the iMac also severely annoys me.
My iPhone seems to have numerous glitches and software issues too. And for a fine example of just "what the hell are Apple thinking", the original incarnation of the battery monitor in iOS was it. What's the use of seeing the percentage of the battery an app has used? A completely useless statistic unless it includes the time too. I still don't find it a particularly useful feature, as I am sure my battery is draining too fast, but the battery monitor gives me no simple way to prove this.
iOS low battery mode disabling itself once charge goes over 80% is really annoying, but maybe I'm just used to how it behaves on the platform they copied the feature from...
Wasn't there an article some years back that pointed out how whenever someone dares criticize Apple, they're always careful to mention how they use Apple products everywhere and Apple stuff is still very nice, but please Mr. Jobs if you could refrain from deleting everything on my phone when I plug it in I'd be ever so grateful...
This will be controversial, but I believe Linux will become the desktop of choice in the long run. Here's why. Tech like Docker is making it easier to deploy whole systems of software, not just in the cloud. Google has put most of the apps 99% of users use 99% of the time on the Web. Everyday users are moving away from the desktop anyway. And there's the whole litany of other typically quoted reasons to use Linux, like security, package management, lack of viruses, performance, etc. Finally, companies like Microsoft and Apple will find it too expensive to maintain their subpar OSes when they make all their money elsewhere anyway, e.g. iTunes, iPhone, enterprise software[1]. It's probably still a long way off, though.
[1] http://www.zdnet.com/article/apple-google-microsoft-where-do...
Linux on desktop is not subpar? Well that is news.
Silly question, but we read all the time these complaints from tech journalists or 'Apple geeks' about macOS bugs. While I'm not a huge fan of the OS myself, my complaints are usually UI- or performance-related, rarely about actual bugs. Curious: is there an obvious, known list of bugs these guys are talking about?
I might pine for the days of Snow Leopard, but part of that was the hardware - for the first time you got some powerful Intel hardware running a mature version of OS X. It was bliss compared to a G4 running 10.2 or the original MacBook Pro on 10.4. Plus, SSDs started to enter the mainstream in 2010.
That said I'm not too upset with 10.12 or whatever the heck we are running right now. What bugs me is iOS 9. Long pauses, freezing up, graphical glitches, first-party apps crashing, sound lagging...it's just terrible.
I guess I'm happy that I haven't had the old problems of not syncing up, the phone getting hot and eating battery life, apps not refreshing, etc. But the current iPhone experience reminds me of amateur hour shit I'd expect from a low-end, no-name smartphone.
In iTunes 10.7 with iOS 6.1.3, it is possible to sync Safari bookmarks, contacts, calendars, photos, and music over USB. I don't always have Internet access, particularly when on a plane.
Notes sync was broken between 10.7 Lion and 10.8 Mountain Lion. Now it won't sync without iCloud.
To downgrade iTunes 11 to 10 on Mavericks 10.9, I had to replace some system libraries. That broke the Mac App Store. I use a friend's laptop to get binaries of apps that they download that are only on the App Store (e.g. Shazam, LINE).
If Apple fixes USB sync, I might force myself to get over the new GUI (which is awful - no track name & artist on the iTunes MiniPlayer?).
If Linux developers get USB sync to work reliably, and make some UI scripting tool similar to AppleScript, then I would consider switching.
Even the article is buggy!
"The best thing for Apple to do is to re-take their position as a leader of software quality before it's too late: consumers know that Apple's hardware is the very best, but more and more their using apps made by Google and Microsoft and Facebook."
*they're
I think it's a management problem. I agree that software quality has been declining over the past few years. Snow Leopard is probably the most stable OS I've ever used, and since then the OS has become bloated, IMO.
However, the major thing that bugs me is Apple's services. iCloud, iMessage and Apple Music have had some terrible issues. Especially when they launched Music, it was not only buggy, it was SLOOOOW. I can't believe that the giant Apple can't even compete with little Spotify in terms of ease of use, speed and stability.
It shouldn't take 10 seconds to search for a track, nor a few seconds to start playing one. Spotify is, and has been (for the past 4 years I've used it), INSTANT when playing tracks.
Apple's software has never been perfect-- but the competition was often much worse, so it looked really good. (I switched to Mac because I couldn't take WindowsXP crashing all the time.)
However, now Apple is essentially the iPhone company. The iPhone has a new version, with new hardware and new capabilities, every year. This basically means a new iOS, new apps, often new capabilities in OS X (Hand-off, whatever is happening with photo sharing this year, etc), and new features in Xcode. (Plus all the watch stuff last year.)
I'm all for being agile and moving fast, but there's just not enough time to do this right. Sure, it might help to have more focus, and they have plenty of people and money. They just don't have enough time.
Frankly I think a big part of their quality decline is their approach to bug tracking. And maybe their internal bug tracker is just as bad as their external one.
If you want me to bother to help you find bugs in your system, you have to make it easy to file bugs and you have to make me feel like part of the team when I do that. Apple does not.
Before I even get into interface issues, let's just say that Apple's bug reports are like a ginormous black hole. I've filed plenty of bugs that have never been properly closed or even updated, and it makes no sense. In any other system I've seen, you would at least be able to see that your issue had been assigned to someone, and given a priority, etc. Apple has none of that.
The only thing worse than filing a bug that no one responds to is having to spend 20-30 minutes writing it in the first place, and Apple excels there, too.
Their reporting system is too complex (too many fields that should not be necessary, especially the ones that seem to ask the same question twice). It's not automated in any way. It is not integrated with the OS; e.g. why can't I turn on a "developer mode" on my Mac or i-device that lets me instantly do things like "compose new bug report for this OS version" or "compose new bug report for this version of this application", etc.?
Also, despite Apple's attempt to redo the web interface a few years back (because developers had complained endlessly about the even-worse previous bug reporting page), today's web interface is still mostly a repaint to look iPhone-like. It doesn't really address core usability problems. It has created new ones, though; for instance, right at the bottom, right where you'd expect the "submit" button that sends in your last 20 minutes of writing, is the DELETE BUG AND OBLITERATE FOREVER button!! I once destroyed a bug report that I'd spent a long time writing, and I was extremely frustrated. It made me instantly decide that Apple didn't really need to see my bug report after all. And that is a broken system.
"This is a blank problem. No, this is a blank problem."
I don't think there is, in fact, a "problem" that can be "solved" to return Apple to the glorious design thought leader status we all know and love. The "problem" is that Steve Jobs is dead, and his unique talent was coordinating a huge number of talented designers/programmers/businesspeople to passionately care about meticulous levels of detail in product design.
At this point, the board-meeting-and-shareholder conversation will continue listing the problems with Apple's "management" or "perception" or "process" or whatever. Their efforts are entirely futile.
Microsoft started declining first, back with Windows Vista (7 was pretty much a bugfix). Apple lost its way when Forstall was booted (regardless of what you think about skeuomorphism).
While this is all happening, Linux's software quality continues to rise. KDE Plasma 5.5 is beautiful and stable. GNOME 3.18 is stable and very usable. XFCE, MATE, and Enlightenment are rock stable.
This isn't new; during KDE 4.x, Microsoft directly ripped off KDE's first Plasma interface for Windows 7. Before Vista was released, they were showing off wobbly windows in demos, weeks after it was first accomplished in the Compiz window manager for GNOME (now part of Unity).
> The best thing for Apple to do is to re-take their
> position as a leader of software quality before it’s too
> late: consumers know that Apple’s hardware is the very
> best, but more and more they’re using apps made by Google
> and Microsoft and Facebook.
I'm not sure it isn't too late already. I switched last fall to an Android phone, having used only iPhones before that. (I just didn't see the point anymore in paying 2x for an iPhone, which can't do anything better.) I started using Google apps like Inbox, Google Calendar, Google Keep, Google Now and Google Fit, and now, even if I went back to an iPhone today, there is no way I'll use Apple apps again (not as they are today).
While my point of view is strictly limited to security, I really don't think Apple's software quality has declined. I think it's always been really shit, but it has been receiving unprecedented amounts of attention lately.
The narrow scope of accessibility on Apple desktops -- something I'd hope the company would be more concerned with as its most loyal and oldest users age -- is a similar shambles. This isn't something which (yet) affects me directly, though it does those around me. Some thoughts and observations ... for which Apple's own feedback dialog is too short to submit the whole piece:
There's very little real content in this article. The author complains about a couple of things, quotes others that have as well, then goes on to hypothesize about why the quality is declining.
You're right, thanks for the criticism. I'm experimenting with blogging, and I've learned my posts need to be even more substantive.
My iPhone has wiped my contacts out of the blue 3 times last year. I have resorted to manually doing monthly backups. This never happened before. Apple Maps, which was awesome (Bless Forstall), is now cluttered with icons and logos on new queries that obfuscate my previous searches while I'd WANT the new searches. It feels like UX is going downhill, and it pisses me off. More than anything, what was meant as a personal device I can trust, has become just "another personal device". Wake Up Apple.
My theory on the decline of Apple software is that they're starting to have the Windows legacy problem. Years ago Apple was able to wipe the slate clean and build a modern toolset exactly the way they wanted without having to worry about legacy support, but now they don't have that luxury. Refactoring technology is hard, and it's way harder when you have a lot of developers dependent on the things you want to refactor, and it's even harder when you have a huge ecosystem like an OS.
I think the reason for this is that Jobs actually was a software man more than a hardware man. For hardware he had Ive.
Now that this combination is gone their software is slowly but surely going bad.
But another bigger issue is that Apple is still focusing too much on user interaction rather than system interaction.
The real advantage we gain from technology is not a better input format or navigation scheme, but rather the use of weaker or stronger algorithms to remove the need for users to interact with their machines.
I agree a bit. For example, regarding iOS on the iPhone: ever since Forstall left, it's gotten very sluggish and has too many unnecessary apps (Health/Tips/News???). We don't need all of those apps built in. Same goes for the computer. Some of the stuff, such as WiFi calling, could easily be implemented in a simple update and not pushed into a re-release. The design aesthetic they had has also gone astray, which makes some of the icons look ridiculous.
If that brand name is tarnished by regressions and performance problems, what consumer would buy a car from the brand?
This.
I would never buy a car from Apple, EVER. You make computers, not cars. It would be like Ford suddenly making a PC; how stupid does that sound? Ford would be the laughing stock of the tech industry. But for some reason a decent computer hardware company thinks it needs to get into cars, and thinks people are going to buy them?
This is the sort of delusional thinking that makes companies go bankrupt.
I'm getting an error as of 9:43am UTC -6
EDIT: Looks okay now 10 minutes later.
Apple has not gotten substantially more buggy IMO, but they've made some questionable UX choices (and put more MSFT like stuff in their OS). Windows has gotten much less buggy but is still doing stupid nonsense that probably started with management. So their software advantage has eroded some, but that's as much MSFT getting less stupid and Apple getting more like MSFT in their approach to stupid nonsense in their OS.
The past two days there has been a badge on my calendar icon on the dock. I'll go in and see a new event in inbox but the badge stays..
Such a tiny thing, yet so infuriating.
Sorry, you lost me at "rockstar developer" (actually at that brilliant quote about the "inferior OS", c'mon we are engineers right?).
In my time using Apple Mac gear since the Mac 128K days, Apple has put a moratorium on new features and focused exclusively on stability and bug fixes for an OS release only twice in my memory: once under Mac OS, and once under OS X. Software quality at the polish-and-attention-to-detail level is a Sisyphean, entropic struggle for everyone, though. It's just that Apple's enormously successful "It just works" marketing message from 2006-2009 is still doggedly trotted out by its fans today, so Apple is held to the same bar now despite an enormously more complex operating environment.
It doesn't help that all the review channels (magazines, newspapers, blogs, vlogs, videos, etc.) focus their attention exclusively on new features in a new release. You need someone as persuasive as Jobs to convince those channels to review stability refinements as new features. The people Apple has put forward to capture that role haven't resonated with the public yet.
But software QA should never be an episodic, herculean, release-bound effort. There are ways to market it positively, but I'd always rather use the finite PR time surrounding a release marketing and selling the quantum leaps that set me apart from the competition, not the incremental steps.
In the age of Big Data, instrumented apps and online-inline update delivery, a possible initial pass at identifying problem areas that are actually hitting people in their daily workflows is to simply track, on the search engines, the popularity of complaints about major bundled applications, subsystems and components. An OS X release should not, for example, let the first-page results for Contacts be about entries disappearing [1], three updates into its lifecycle.
There is still plenty of space for innovation here, I think. There are auto crash reports, but no systematic, automated means for a publisher like Apple to send targeted offers to users whose crash reports pass a specified threshold (number of crashes, kind of hardware, stack trace pattern, etc.), asking them to update their app to an instrumented build. That would give developers not just the results of instrumenting the one or handful of users who happen to persist through the now-familiar troubleshooting dance, but hundreds or even thousands of users at the same time, opening up opportunities to automatically search for commonalities and assist with the manual troubleshooting.
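To make the idea concrete, here's a minimal sketch of the thresholding step described above. Everything here is hypothetical (the function name, the report fields, the threshold value are all illustrative, not any real Apple API): group crash reports by stack-trace signature and hardware model, then surface the groups with enough distinct users to be worth an instrumented-build offer.

```python
from collections import defaultdict

THRESHOLD = 3  # illustrative: minimum distinct users before offering an instrumented build

def candidates_for_instrumented_build(reports, threshold=THRESHOLD):
    """Group crash reports by (stack signature, hardware model) and
    return the groups whose distinct-user count passes the threshold.
    Each report is a dict with 'user', 'signature', and 'hardware' keys."""
    groups = defaultdict(set)
    for r in reports:
        groups[(r["signature"], r["hardware"])].add(r["user"])
    # Only groups that cross the threshold become candidates for a
    # targeted "run an instrumented build?" offer.
    return {key: users for key, users in groups.items() if len(users) >= threshold}
```

The point of grouping by signature and hardware together is that it lets commonalities (same stack trace on the same machine class) be searched automatically across hundreds of users instead of one persistent complainer at a time.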
[1] https://www.google.com/search?q=mac+contacts+%22el+capitan%2...
I wonder how much this relates to Bertrand Serlet leaving the company. It seems to me that the quality of Mac OS X dropped quickly after he left.
Let's see. They are arrogant fascists who think they know better than their customers. They charge a lot more but do not provide anything more than their competitors, and often less. The company lives and dies on marketing and perception. They are the Microsoft of the Unix world, and I hope they go to 0 because they are bad for progress and bad for technological diversity.
For a desktop OS, what's the alternative? Windows 10? Ubuntu? I've used both recently and neither felt like a step up from OS X.
On the Ubuntu side it really comes down to what you want from your operating system. If your desktop is primarily for development, and you want to be in control of your environment, then it's viable. I say that as someone who's been using Linux since 2000 though ;-)
I consume all the walled garden media stuff through Android and iOS - they're devices like my fridge. My desktop is the centre of my creative world, I don't want someone telling me how to do it there. And yes, I'm willing to trade-off my own time to obtain that control.
Apple has never had quality software aside from the OS. (Ok Safari is pretty good too.) Ultimately it boils down to, too many khaki + button up wearing MBA white dude types, only 1 Steve Jobs. (Sadly seems Jony Ive is not a replacement.) Their culture is antithetical to hiring good people, and without a slave driver to instill fear (Jobs), no one cares about quality.
Apple could do a lot worse than looking at Lloyd Chamber's 'Apple Core Rot' series of posts and start addressing everything on that list...
http://macperformanceguide.com/topics/topic-AppleCoreRot.htm...
I disagree and agree.
I find the core iOS and OS X operating systems to be just fine, thank you: frequent security and other updates that I get on a timely basis (compare to my Android Note 4, which I really like, but getting updates is infrequent).
On the other hand, I think that the web services like iCloud and Siri have a lot of catching up to do.
I used to complain a lot about ios sdk. I found it so miserably out of date and clunky, and full of undocumented weird behaviors that i constantly had to fight to create fancy UIs...
Then I went back to developing on Android for a small project, and found that image file names couldn't have spaces, or uppercase letters.
You guys are scaring me. I have Lion on my Core 2 Duo MBP and Mountain Lion on my i7 MBP. I've been content but Google says that Chrome won't get updates for these after April and I've been thinking about upgrading. Now I'm thinking about just going back to Firefox.
I honestly thought it was just me, but I agree the software quality has been at an all time low...
I hate to break the news but the incentives that are needed to be a top notch Hardware company are orthogonal to the incentives needed to be a top notch software company. Microsoft cannot be a hardware company and Apple cannot be a software company. They can try!
I have tried every iPhone since the iPhone 3. I always sell it on eBay 3 months later, usually because of battery issues. My new 6s is the one exception. Everyone is talking about how much they suck even with more revenue and a better product.
Another thing is the forced updates. Now you constantly get nagged about updates on iOS, almost daily, even if you have learnt your lesson and decided not to upgrade your phone's OS because Apple WILL eventually slow your phone down.
To be fair, Apple seems to be addressing this with the last release. El Capitan had fewer new features and concentrated mainly on performance issues. I hope they do the same in the next release. I don't care for more features at this point.
>El Capitan had fewer new features and concentrated mainly on performance issues.
Well if that was their goal I think they failed. I skipped Yosemite, so I'm comparing to Mavericks. El Capitan feels much slower; I don't think I've ever seen the beach ball so often.
weird, it felt faster than Mavericks to me. I'm pretty sure that Mavericks eats up more RAM than El Capitan but then again I could be wrong.
Still, having Apple acknowledging the problem is a win. I've tried migrating to Ubuntu and Windows 10. Didn't work. If I could justify a new Mac Pro, it would probably mitigate the problem. I just wish they had a Mac Pro lite. I don't want yet another converted laptop e.g. mini & imac
One thing I've noticed is that Apple product users (myself included) always think Mac and iPhone software was so much better 5 years ago. Is it really? I think if you look at it in detail there are features that take turns at being bad.
Calendar invites from exchange in Mail.app don't show the date of the event in 10.11 - I have been patiently waiting for this to get fixed because I figured it was obvious. Still waiting.
There is a lot of truth in this article. I feel like on some occasions I gravitate to other OSs because Apple has kinda put the quality aspect on hold for their software releases.
Apple needs a 'No man'. (as opposed to a 'Yes man').
Steve Jobs was that guy. He did not care about anything but 'is this a great product and would I use it?'. Stock price or someone's feelings be damned.
I've felt this way many times over the last few years, but the one that hit me hardest (even though this is more esoteric than the design or technical faults, of which imo there have been many) was when I walked into the Apple store and saw the centerpiece of the store: Watch Bands, not the Watch. No, man, that ain't right. I like to think Jobs would have fired that person on the spot.
> they need to grow the world’s biggest company every quarter to keep Wall Street happy, ...
And there's the biggest issue right there. Even more so for other companies.
Honestly, I just don't see it. Not because the problems aren't real, but because it's always been like this.
What has changed is the importance of Apple's software. When it was a minority player that offered advantages over the status quo, we overlooked the issues.
Now that it is the status quo, we are no longer comparing it to anything else, but to an ideal - as we should be.
I think the idea that there is a decline is false, but the increasing criticism and demand for quality is absolutely appropriate, since it is absolutely Apple's responsibility to improve our experience of their products.
What are some companies, shipping consumer software of comparable complexity to Apple, who ship exemplary high quality software? How are they doing it?
I don't feel any substantial difference in software quality of Apple products. IMO, it's more or less the same for the last 10 years.
Very few (if any) comments from Apple employees here. Do they not read HN, or do they have a policy against posting on public forums?
I think there are a few concrete things going on:
* There are many more integration points between their products now. Shipping only the Mac, or only the Mac and an iPod, or even a first-gen iPhone that can only get data into system apps via a USB cable, is very simple compared to what they're making today. For Apple's best-case customer, who owns a Mac, an iPhone, an iPad, an Apple TV, an Apple Watch, and who uses iCloud, how many integration points are involved now? Integration points are like the exponent on software complexity. It's where software goes to die.
* They are still essentially a fat client company that's trying to build more cloud-oriented applications. This leads to additional complexity in the product that other companies just don't have to deal with. An obvious example that jumps to mind is iTunes vs. Spotify. If iTunes was just Apple's version of Spotify, how much better would it be?
* Brain drain. Apple's stock made a lot of people a lot of money, and if you work there, you can't participate in the mobile revolution they started. Steve Jobs's passing could also be a natural book end for people in their careers to try something new, or find a job where they're not working 80 hours regularly, or to just take some time off.
I guess the last one isn't really "concrete", and is more just me speculating, but I threw it out there because of a decent amount of anecdotal evidence I've seen. Here are some other things that are also just speculative but interesting to consider:
* Apple is a product company that succeeds or fails on innovation. As capable an executive as Tim Cook clearly is, he's not a product person. How does this trickle down into the product development process?
* Product development was micromanaged by Steve Jobs basically until he died. That leaves a HUGE vacuum in an organization and executive team he built to amplify his personal strengths and weaknesses. Who is filling that vacuum now? Is it Jony Ive? Does his new role of "Chief Design Officer" mean he's kind of the new Steve Jobs, in charge of product design, retail stores, office space, etc.?
* If Jony Ive still has the final say on all software (not clear to me in this new role), how good is he at software? How interested is he in it personally? He clearly loves the physical design of things. Steve clearly loved software. If Jony is in charge, does he have that love as well? Does he devote the same time and attention to the software as he does to the hardware? Or, to take the iTunes example again, is Eddy Cue basically in charge of that product?
* How good are the people there at software design without Steve? There's a great story about Steve Jobs coming into an iDVD design meeting where he ignored what the team came up with and drew a window on a whiteboard with one area to drag files and one button that says "burn".[1] Is that just one story? How important was that to the day-to-day of the products they shipped? Who does that now?
The key point to me is that, according to Steve himself, Apple is a software company.[2] They make hardware so they can make really great software. Software is what's most important, and I hope stories like this are a bit of a wake up call to re-center their focus on what's truly important.
[1] http://dandemeyere.com/blog/5-most-inspiring-steve-jobs-stor...
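The "integration points" bullet earlier in this comment can be made concrete with a toy count (the product list is just illustrative): if every pair of products/services has to stay in sync, the number of pairwise integration points grows as n*(n-1)/2, not linearly with the number of products.

```python
from itertools import combinations

def pairwise_integration_points(products):
    """Every unordered pair of products is a potential integration
    point (sync path, handoff, shared account state) to build and test."""
    return list(combinations(products, 2))

# Hypothetical product lists for the sake of the arithmetic:
old_world = ["Mac", "iPod"]
today = ["Mac", "iPhone", "iPad", "Apple TV", "Apple Watch", "iCloud"]

print(len(pairwise_integration_points(old_world)))  # 1
print(len(pairwise_integration_points(today)))      # 15
```

Going from two products to six takes you from 1 pairwise integration point to 15, which is one way to read the claim that integration points act like an exponent on complexity.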
Apple need to pull a "no new features" release for the next OS X and iOS. They've done it before.
iTunes somehow managed to go from one of the single-most enjoyable software experiences to literally unusable. Repeat: UNUSABLE. Bizarre to think that the application that played arguably the largest role in Apple's trajectory to hyper-success is now dysfunctional beyond words.
Meanwhile the Gnome apps are amazing.
Shotwell, Music, Nautilus, and Gedit are brilliant and simple to use applications.
agreed 100%. from the trash icon that doesn't change label to eject anymore when dragging a volume to the scroll bars that cover text on top of the safari timeline debugger, there are plenty little paper cuts to be found where eyes don't scour often.
I bought an album on iTunes last week and it downloaded everything except for tracks 1 and 5.
The iPhone ecosystem needs an XDA community. And I say this with full responsibility. Cydia is great, but instead of just a store, the fine hw should be brainstormed by a bunch of lunatics and hackers. So we could see what the hw is capable of.
After the death of Jobs and the ouster of Scott Forstall, this is a dog-bites-man story.
Probably anecdotal but the default podcast software has been crashing a ton lately.
i've been using mac os x as my primary os since the first version (still do on my MBA) and i've never gotten the impression that their software quality was any good: finder, itunes, etc.
Some of my most common issues on my mac (these are things which used to work, but don't work as well anymore):
Spotlight: This has become extremely unreliable (and ridiculously slow) for me. More often than not, it won't return results, except for an icon for the topmost result which flashes in the search bar for an instant after I hit a key. If I hit enter while that icon is visible, Spotlight will open the app/document.
Data Detectors: What happened to these? I used to love them, but they never seem to work correctly for me anymore.
Safari Autofill: Why can't Safari simply autofill my email address like it used to? Why does it have to show me the opaque contact box, where I'm not sure which email it will autofill for me.
Time Machine: Hasn't worked for me in years.
Calendar: I was a ridiculously heavy user of iCal in the mid to late 2000s. Can't use it anymore. I'm not sure what specifically went wrong with it (just seems a lot more clunky). It used to be far more keyboard friendly (I liked Apple Mac apps when every app had a default layout with the side bar, that showed my organization hierarchy at a glance.)
Contacts: To be fair, this (and Address Book) has always been an unholy skeuomorphic mess.
Expose: After years of making Expose unusable, starting with Lion, it's actually back to being pretty good now (now that it works closer to what it did in Tiger).
Dashboard: I was one of the few that actually used dashboard. I wish Apple would just kill it instead of what they are doing to it right now.
Dock: I think the Dock has gotten much better over the years, but I think the Windows start bar is superior at the moment.
Force Quit: Why don't my apps force quit like they used to? Force quit used to be extremely reliable, but now it doesn't seem to actually do a force quit at all.
iWork: Haven't dared to go back to iWork since the refresh. Pre-refresh I loved Pages and Keynote. I happened to use Keynote to present a ppt a year ago, and while the conversion was pretty good, the presenter mode was extremely gimped and uncustomizable. I may be misremembering, but I think it used to be far more customizable in terms of being able to pick the info to display on the presenter screen, etc. Numbers was always terrible, so I'm not even considering that.
iTunes: Do I really need to say anything?
QT Player: I think it's improved, but I'd switched to VLC way back.
Quick Look: The switch in the QT engine means the vast majority of my videos don't quick look any more (I think it was Perian that made them work). Maybe I need to search the net again, to see if there are more plugins for this.
Finder: Still as clumsy as ever for file management, but seems to be better at parallelizing tasks. However, file/folder metadata takes way too long to load. It used to be a lot better (I remember I used to have the inspector window in always open mode. I wouldn't dare to do that anymore).
Bah, that was way too long...once I started, couldn't stop, and I can probably go on for much longer. (I'm one of those guys whose 2nd favorite part about an OSX release, after the Siracusa review, would be the Macrumors "Little things" thread).
Make Apple Great Again
FTFY
Cache since server is currently overwhelmed:
----
Software quality is a nebulous and divisive topic. There are many parameters to software quality – reliability, speed, user experience, design, discoverability, and more – and a move towards any of these virtues leads to sacrifices in others, especially on a limited time schedule. Additionally, a number of forces influence software quality over time, like accommodating for different use cases, changes in platform, changes in hardware, changes in design preferences, changes in market, changes in expectations, and more. Finally, software is not like digging a hole, say, where more people really can dig a hole faster than fewer people: in fact, more people can often slow down a software project.
Nobody knows this better than the technology titans of today: Apple, Google, Facebook, Amazon, Microsoft, IBM, and Oracle have all experienced unanticipated software problems and regressions and high-profile bugs. These are organizations with thousands of programmers writing and maintaining millions of lines of code for billions of devices. And these devices are machines which require perfection: one slight ambiguity of intent, any minor breach of contract, any single unexpected 0 where there should be a 1 or vice versa … has the capability to bring down the whole system. In fact, it often does. Countless kernel panics, stack overflow errors, null pointer exceptions, and memory leaks are plaguing poor users and tired system administrators and overworked programmers right now. Machines are fast, but they can be awfully dumb.
And no company is feeling the pain of software's nebulous nature and hardware's mindless computing more than Apple right now. The underdog that many loyal fans rooted for is now the world's (perhaps formerly) most valuable company. With that come insanely high expectations: they need to grow the world's biggest company every quarter to keep Wall Street happy, and even harder, they have to keep those nerds that kept them alive through the hard times happy too. And with release after release of the most revolutionary operating system ever, it's tempting to picture Apple as an actual Titan, in particular Atlas, holding the world upon his shoulders. But it seems more and more every day that another Greek tale is more fitting: it's time to admit that Apple have flown too close to the sun.
Walt Mossberg, technology journalism’s elder statesman, has this to say about Apple’s software quality:
In the last couple of years, however, I’ve noticed a gradual degradation in the quality and reliability of Apple’s core apps, on both the mobile iOS operating system and its Mac OS X platform. It’s almost as if the tech giant has taken its eye off the ball when it comes to these core software products, while it pursues big new dreams, like smartwatches and cars.
On OS X this is especially true: the OpenGL implementation has fallen behind the competition, the filesystem desperately needs updating, the SDK has needed modernizing for years, and networking and cryptography have seen major gaffes. And that's just the under-the-hood details; the applications are easier targets: it's tragic that Aperture and iPhoto were axed in favor of the horrifically bad Photos app (which looks like some Frankenstein "iOS X" app), the entire industry has left Final Cut Pro X, I dare not plug my iPhone in to my laptop for fear of what it might do, the Mac App Store is the antithesis of native application development (again being some Frankenstein of a web/native app), and iCloud née MobileMe née iTools has been an unreliable and slow mess since day one.

This isn't the first time that a prominent member of the Apple community has criticized Apple's software quality. Here's Marco Arment from January of 2015:
Apple’s hardware today is amazing — it has never been better. But the software quality has fallen so much in the last few years that I’m deeply concerned for its future. I’m typing this on a computer whose existence I didn’t even think would be possible yet, but it runs an OS with embarrassing bugs and fundamental regressions. Just a few years ago, we would have relentlessly made fun of Windows users for these same bugs on their inferior OS, but we can’t talk anymore.
This is still as true today as it was last year. Macs and iPhones have gotten thinner, more beautiful, and more powerful; the Apple Watch and the new Apple TV are magnificent additions to the product lineup. But I'd speculate that part of the problem Apple is having is that if it took 1,000 engineers to write software for the Mac when that was the only product, it doesn't necessarily take 4,000 people to write software for four product lines. In fact, 10,000 of the same grade of engineers might not even do it, especially without proper management and unified goals. Apple may not have listened to rockstar developer Marco Arment, but Walt Mossberg will definitely get their attention.

Here's an anecdote about Steve Jobs from the last time that Mossberg complained about Apple's software quality: In Fortune's story, Lashinsky says Steve Jobs summoned the entire MobileMe team for a meeting at the company's on-campus Town Hall, accusing everyone of "tarnishing Apple's reputation." He told the members of the team they "should hate each other for having let each other down", and went on to name new executives on the spot to run the MobileMe team. A few excerpts from the article:
“Can anyone tell me what MobileMe is supposed to do?” Having received a satisfactory answer, he continues, “So why the fuck doesn’t it do that?”
Jobs was also particularly angry about the Wall Street Journal’s Walt Mossberg not liking MobileMe:
“Mossberg, our friend, is no longer writing good things about us.”
It really is time for Tim Cook to take action as drastic as this regarding software quality on Apple's existing platforms. What worries me is that APPL the stock ticker and Apple the company are on a (self-driving) collision course with one another: APPL needs to launch new products to drive growth, and Apple needs to improve the products that have already shipped. The most valuable asset that Apple owns is its brand, and that's the brand that'll drive sales of any car that may or may not be in development. If that brand name is tarnished by regressions and performance problems, what consumer would buy a car from the brand? In fact, anecdotally, talking to my friends, the Apple Car already has an uphill battle with the kerfuffle surrounding the Maps launch.

Jim Dalrymple, in response to Mossberg, writes:
I understand that Apple has a lot of balls in the air, but they have clearly taken their eye off some of them. There is absolutely no doubt that Apple Music is getting better with each update to the app, but what we have now is more of a 1.0 version than what we received last year.
John Gruber, in response to Dalrymple:

Maybe we expect too much from Apple's software. But Apple's hardware doesn't have little problems like this.
The best thing for Apple to do is to re-take its position as a leader in software quality before it's too late: consumers know that Apple's hardware is the very best, but more and more they're using apps made by Google and Microsoft and Facebook. If this trend doesn't turn around, Apple will find its breakout product and all of its growth owned by competitors. And when the time comes to launch their car, they'll find that loyal fans and everyday consumers have lost trust in the brand. Having said that, I'm still a Mac user at home and at work, my iPhone is a wonderful device that enriches my life, and I'm still finding new ways to make use of Apple Watch. And to give credit where credit is due: Logic Pro X has improved a lot recently, and Music Memos is a welcome addition to Apple's music lineup. I even use Apple Maps. Apple can do this. It's not too late. But for the sake of all us poor users, and Apple's tired system administrators and overworked programmers, I hope they started 6 months ago.

Gotta put some newlines in your quotes, so we can read them. :)
Nice job formatting that...
The fundamental decline started when they moved towards flat design.
That's what happens when you put a hardware designer (Ive) in charge of software design.
“The iPhone could not be synced because it could not be found.”
On Windows, Apple software has always been bad.
Apple's stock ticker is AAPL not APPL
It is unfortunately happening.
I jumped ship from Android to Apple about 5 years ago. I always gave Apple the benefit of the doubt each time they released new updates with bugs. But, of late, the focus for Apple has been to release new features without actually caring about stability. I'd much prefer a system that is old and stable to one that is new and unstable. Let me explain.
About 2 days ago, I discovered in my closet my very first, old and abandoned HTC Desire (the OEM version of the very first Nexus One). Powered it on, updated to Android 2.3 (the last official update), and thought, heck, I wonder if this thing is still usable.
So, I carried it along with my iPhone for the next few days. Remember, this phone was purchased sometime in 2010, so I was assuming the battery was probably dead. But, to my pleasant surprise, it wasn't. And even if it was, no biggie: I realized it was replaceable. Then, the phone threw a bunch of insufficient-space errors at me. I was so used to the Apple ecosystem that I just thought I should delete some existing apps to make space on my phone. And in this process, I discovered that I could actually expand my storage by purchasing a new memory card for just ~$5, because the Desire has a provision for adding a memory card and transferring all my apps and their data to the SD card. Cool!
Now, to the most interesting part: the software. When I turned it on, I had to re-auth some of my previous Google accounts, and from there pretty much everything just worked. I didn't notice even a single crash in my entire week of heavy usage with this phone. What a pleasant surprise. What was even more relaxing was that, when I opened the music app and played a song, the music just played! I was so invested in Apple's ecosystem of iTunes Match and Apple Music, and so deluded by them, that I didn't realize I was being served a poor user experience under the guise of subscribing to a premium music service.
For example, on my android, I have a playlist X. I click on track A. In just under a second the song starts playing. I can do this while my phone is still in my pocket just using the remote on the headset. On my iPhone, I click on a track A under playlist X (also using remote) and in my mind, I'm actually quite anxious if this thing will start playing immediately or if it will get stuck waiting to download the track from the cloud forcing me to take my phone out and close the music app, open it, evade the Apple music interface I want to avoid, select my playlist from the playlist tab and find the track I want to play again. What a nightmare. What's worse is my whole phone freezes when using certain apps and sometimes I'm forced to restart it. On the HTC I had to do this absolutely 0 times in the same time period.
When we buy a smartphone, we buy it assuming certain basic use cases: make calls, listen to music, take pictures. If your smartphone can't even fulfil these basic needs, especially when you charge a premium, then something is seriously wrong with you.
Despite being a 6-year-old discarded device, my Android phone is far more stable than my current iPhone. It actually came as quite a shock to me. I have some devices still on iOS 7 and they're definitely not quite stable enough. This made me realize my original point: Apple's focus has been too much on just pushing updates rather than on stability. This gives you the illusion that you're innovating faster than the others when in reality you aren't.
The reason I took time to explain my experience is not because I want to start some kind of iPhone vs Android flame war, nor suggest that Apple is dying, etc. etc. nor that we all should be buying Androids (I still find iPhone has an edge for my use case, infact).
I just simply want to demonstrate how a company whose selling point is "Everything just works" has consistently failed to deliver, and yet how we (at least me) have still been thinking they're the superior ones.
I remember the time OS X didn't even have to restart for an update. Gone are those days.
I don't, and I've been using it since the Public Beta.
Every update needs a restart. How does your Mac update without a restart, when the later updates require an installation where OS X isn't running?
I'm saying updates have always required a restart. You said you remember a time when this was not the case, and I'm pretty sure such times never existed.
No they haven't. On the 'Steve Jobs' versions of OS X, updates never needed a restart: versions such as Leopard, Snow Leopard and Lion. The only exception was major updates, where a DVD was required for the update.
They even used to use this to bash Windows.
The 'Steve Jobs' OS X versions were very refined. Somewhere around Lion everything went downhill. Mountain Lion's flaws became gradually more pronounced.
It also used to be that a new OS X version would be faster despite running on the same hardware. It's hard to say if this has changed.
Here's a screenshot of Leopard requiring a restart for an update:
http://i.kinja-img.com/gawker-media/image/upload/s--L-qfNFHD...
Here's one for Snow Leopard:
http://cdn.osxdaily.com/wp-content/uploads/2010/03/mac-os-x-...
Lion:
http://cdn.cultofmac.com/wp-content/uploads/2012/02/Screen-S...
And just in case you think this was real but you got the era wrong, here's a screenshot of 10.3 requiring a restart for all sorts of updates, including a Safari update and a Daylight Saving Time update:
https://systemfolder.files.wordpress.com/2011/07/softupdpant...
Plus a normal OS update:
I could have sworn the restart requirement was rare.
I stand corrected.
Irony? "Error establishing a database connection"
I have noticed corner situations of the iOS UX degrade over time, from when I started with a 3GS to now. This degradation was most obvious when they moved to the new flat design theme. Here are some examples:
1a) The title bar colour-changing feature occasionally bugs out. It will show white over white, or some other non-contrasting colour. This happens for no particular reason as far as I can tell.
1b) The title bar sometimes stays on top of a 'fullscreen' app, meaning that, for example, you have the top title bar icons and text sitting over the camera display view. This happens for no particular reason as far as I can tell.
2a) I have noticed that in portrait/landscape rotation, sometimes the app view gets stuck one way while the screen display itself rotates. For example, all of the content of the app rotates back around to a portrait orientation but is still trapped in a landscape layout on a portrait display. This is rare but seems to have no particular cause that I can recreate.
2b) The above can sometimes happen in the camera app. When you rotate the camera, the icons (flash, HDR, etc.) rotate for you. Sometimes rotating causes them to 'stack up' on one another, as if they adjust in one axis but not the other. I most noticeably found this when I rotated having had the flash options open (force flash, auto, no flash) and they all collapsed on top of one another rather than closing that selection pane.
3) The 'zoom back to home screen' animation sometimes interacts strangely with folders: the icons of the folder will do the 'fly out' animation, but confined to the final size of the folder GUI box on the screen. It looks buggy and seems to be a corner case. I can't recreate it when I intentionally open a folder > lock my phone > unlock my phone, but I've noticed it many times, so I should try to pay more attention to its causes.
4) Since iOS 9, I can press my 5S's home button to get the screen to light up and recognise my thumb print, but sometimes it will activate Siri even if I press it very rapidly. This is a 5S that I purchased in November, so I don't believe the home button is faulty. My suspicion is the screen initiation routines on the 5S are a little laggy, and sometimes the button registers as a press-and-hold for Siri.
5) Right now I have 2 'unread' emails in one email account, despite there not being any, as verified both by Outlook on my home PC and by the 'unread emails' folder which I added to the mailboxes screen of Mail. I suspect that if I remove and re-add the account it'll clean this up, but Mail is the only place reporting 2 unread emails.
6) Some apps can have their text input boxes bug out, whereby I can't tap the input cursor part-way back in the window, and as I delete text it highlights words and goes weird. I suspect this may be a buggy implementation in the app itself, but I'd have expected that to be a pretty robust UI object, so I figured I'd include it.
Anyway, the above are just examples of the sort of lack of polish I'm noticing build up over time. As said, the biggest increase came when they switched to the flat UX design; I suspect so much of the UI changed that it inevitably introduced fresh bugs that had long since been stamped out of the old design. Nevertheless, a number have stuck around, and it's frustrating because I noticed more polish and fewer of these little rough edges back in the 3GS and even my 4S days on iOS 5.
The biggest bugbear is that I don't really know of an easy way to submit these bug reports. I guess I have to jump on the Apple dev boards or something, but there's no nice way of saying 'here's a bug, please consider looking at it or asking me for more information if needed!'.
I've seen a lot of talk from various places that the 'issue' with the decline in software quality from Apple stems from 'changing too much, too fast'. I think it's actually the opposite: not changing enough, and not changing fast enough.
There are going to be upgrades -- that is GOOD. But we need to -feel- like there's something new and better, worth the inevitable bugs that -change- is guaranteed to introduce. Those who react to this problem of 'change produces bugs' by saying 'nothing should change' or "don't change what isn't broken" are inevitably in the wrong. Change needs to happen, but more importantly, the right changes need to happen.
iCloud is really confusing -- there is just no simple way to understand what it does, what it's for, how to use it, how it can go wrong, what happens when it goes wrong ... and on and on. It needs to be turned into something more modular, with better boundaries between its unrelated aspects.
- iCloud needs to change faster. In my opinion we've been on a slow path towards an iCloud account being _required_ for OS X usage -- and the slowness of this progression has drastically complicated the introduction of iCloud features. In my opinion, doing iCloud right _requires_ iCloud to be present and configured. Just pull off the bandaid: require an iCloud identity for each user account, redesign the rest of the settings system to assume its presence, and re-implement the process by which all the system-provided apps query for settings, using a new model designed to keep up with modern users' expectations.
In my mind iCloud needs a new statement of purpose. I'd propose this: iCloud is about:
- establishing the identity of a person for usage of Apple services
- establishing the ability to mutate configurations associated with that person on any device that person uses.
The other services from Apple should _not_ be called 'iCloud'. I'd like to see iCloud become:
- a screen which is ONLY about configuring 'the iCloud account associated with this user'. This should be about proving who you are to Apple so that you can synchronize your other system settings to/from the cloud and your other devices. Nothing else.
- The rest of iCloud should be folded into a generic interface to all 'system integrated internet services'. I imagine a screen designed to manipulate a data model that looks something like: Service Provider -> User Identities -> Devices -> Services
From one place I can configure my authentication credentials for various service providers, and then decide which services from each provider I want to enable on each of my devices.
The associated settings are synced to iCloud and onward to each device. iCloud is one place to define all the system<->internet integrations for all your devices. The implementation of those services is _not_ part of iCloud. I should be able to sit at my Mac and configure which mail accounts I want my phone to check -- that is iCloud; actually downloading the mail on my phone is not iCloud. I think iCloud should implement a user-facing, synchronized management console for describing the set of network service integrations to enable for each of my devices -- it should also expose an API by which applications can gain access to these service configurations. That's what iCloud should be, in my opinion. It's a lot of _change needed_ to get there, but making no change here would be far worse, I think...
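The Service Provider -> User Identities -> Devices -> Services model proposed above could be sketched roughly like this. To be clear, every class and field name here is hypothetical -- this is just one way to make the idea concrete: identities authenticate against providers, and each device gets a per-service on/off switch that is the thing being synced, while the service itself runs on the target device.

```python
# Hypothetical sketch of the proposed iCloud settings model.
# All names are my own invention, not any real Apple API.
from dataclasses import dataclass, field


@dataclass
class ServiceProvider:
    name: str                                       # e.g. "Google"
    services: list = field(default_factory=list)    # e.g. ["mail", "calendar"]


@dataclass
class Device:
    name: str
    # (provider name, service name) -> enabled flag; this mapping is
    # what would be synced between devices, not the service data itself.
    enabled: dict = field(default_factory=dict)


@dataclass
class UserIdentity:
    provider: ServiceProvider
    credential: str                                 # opaque auth token


def enable(identity: UserIdentity, device: Device, service: str) -> None:
    """From any device, toggle a service for another device.

    Only the configuration change syncs; actually running the service
    (e.g. fetching mail) happens on the target device.
    """
    if service not in identity.provider.services:
        raise ValueError(f"{identity.provider.name} does not offer {service}")
    device.enabled[(identity.provider.name, service)] = True


# Sitting at the Mac, configure which mail accounts the phone checks:
google = ServiceProvider("Google", ["mail", "calendar"])
me = UserIdentity(google, credential="opaque-token")
phone = Device("iPhone")
enable(me, phone, "mail")
print(phone.enabled)   # {('Google', 'mail'): True}
```

The key design point the comment is arguing for: the sync layer only ever touches the `enabled` mapping, so 'configuring a service' and 'implementing a service' stay cleanly separated.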
I'd like to see Apple change more about OS X -- change something low-level and long-lived ... Do something controversial that forces the most ossified of unix curmudgeons to learn new tricks. How about a new default shell (fish! ... or something homegrown)?
My advice for Apple: keep moving the world forward, don't go into maintenance mode, improve things that need improving, and embrace the fact that not all change is churn. Someone at Apple needs to stand up for that mantra in the face of all this criticism and prove to the world that they can still improve things by changing them. That's the only way this perception of 'declining quality' is going to change.
I've used Apple computers since 2003, and I think the decline is unquestionable. As many others note, more and more little bugs and problems seem to crop up than ever before. Everyone in here has an anecdote, so here's mine: on my work laptop, the hot corners just stop working for no discernible reason, and I have to kill the Dock process to get them back. Many small things like that.
And I don't even attempt to trust big things, like backing up my data with iCloud or using Apple music. Hell no.
It seems clear they've stretched their talent too thin over the years, and the yearly release cycle probably also has something to do with it. On the other hand, their hardware comes out roughly yearly and does not exhibit these kinds of problems. Their hardware is still very good.
I wouldn't say it's degraded to the point of being Windows yet, but it is definitely in danger. Personally I noticed the software quality started taking a dive around iOS 7. 7 was practically a beta.
OS X Yosemite was a travesty, made worse by the fact that as an iOS developer I HAD to run it; I couldn't stay on Mavericks like I would've greatly preferred. Yosemite had a lot of good things to add, but at the same time it was plagued with tons of bugs and horrifically unoptimized graphics code (full-screen blur with no caching brought my MacBook Pro Retina to its knees many times). El Capitan has been a huge step in the right direction, but I can't help noticing that so many of the OS's new features weren't so much optimized or fixed as simply regressed.
Additionally, speaking as a developer, many of Apple's APIs and SDKs are outright terrible. iCloud implementation for any iOS app is a nightmare, as is using the shared keychain. Then there's the constant obfuscation of the file system on both iOS and OS X, which just drives me insane, and the tendency of all Apple apps to operate as if they are the only app you'll ever use for any given purpose, which seems obnoxious at its best and outright nefarious at its worst.
I'm glad someone wrote this extremely overdue article, which would have, in general, been just as accurate two years ago as now. When Yosemite shipped with a Safari that worked on only one of my three Macs, I knew there was something seriously wrong. Constant Bluetooth and RF wireless problems make peripherals hard or impossible to use. iTunes is so shitty there is not a single redeeming thing about it. 4K support that works, but only sometimes, and requires you to plug and replug your monitors till things line up magically. Shutdown/restart haven't worked properly for years and will frequently make the machine hang. Sometimes sleep does so as well, usually at the most inopportune times. The list could go on for quite some time.
Is it only their software? I don't really see their hardware as first class anymore either.
Looking at Apple laptops, the newest Intel processor I can get is a 5th gen Broadwell, and it's only in the 13" retina. The 15" retina has 4th gen Haswell. And the 13" pro? Some old processor from 2012. 6th gen Skylake is shipping everywhere from every other laptop manufacturer.
The 13" MacBook Air still has a sub-HD display (1440x900). Yet I can now get a 4K display on a Razer Blade Stealth with 100% Adobe RGB at a competitive price. Even their 'retina' laptops are low-res by comparison.
The only hardware they have that looks reasonably impressive at the moment is the iMac 5K, but that's a desktop.
Add to that: thin power cables that fray easily. A poor keyboard (relative to my ThinkPad). A glued-in internal battery. A heavy (relative to plastic/composites) aluminium case that dents easily and attenuates wifi. A glossy glass screen that cracks easily and is far too reflective. The trackpad is good for scrolling, but right-click is too awkward. The whole design philosophy is form over function.
To be fair, their hardware has never been impressive in modern times, from a technical perspective, except in a very short timeframe (basically from the first Air model to the second MBPR model). Before then, everybody knew Apple laptops and desktops were fairly solid but hugely overpriced, and you paid for stylish design rather than innovative tech.
I suspect this reflects a wider trend, most probably a consequence of how easy it is to ship software nowadays (no need to manufacture physical media anymore; just ship a hotfix), which incites software companies to skip QA and let early adopters find the bugs themselves...
True, though it's also easier to test code too, right?
But why pay for QA testers when users settle for buggy products?
Plus, this trickles down: if I'm a small start-up founder and I see market leaders get away with half-assed releases despite having a lot more resources, why should I strain my limited budget by investing in QA?
"But I’d speculate that part of the problem Apple is having is that if it took 1,000 engineers to write software for Mac when that was the only product, it doesn’t necessarily take 4,000 people to write software for four product lines. In fact, 10,000 of the same grade of engineers might not even do it, especially without proper management and unified goals."
Nope. See Brooks' Law.
Declining? I didn't realize quality could go lower than 0.
I think this goes back to Apple's Other Big Problem.
They're getting better at software, distributed systems, etc. But everyone else is getting better faster, and catching up on design.
And you hear how Apple employees are forced to work -- using codebooks to talk over lunch, without a lot of cross-communication, for fear of project leaks -- and you go, "Oh. That's why they're struggling to keep moving forward."
They're basically buying a lot of their core tech and repackaging it these days to keep velocity. But that too isn't a silver bullet, as it fragments the tech space.
The punchline is that "it just works" has been lost to time. A lot of Apple-designed software is more like talking points for when they release an iPhone or hold a big media event.
Mac OS has a lot of apps which need updates: Photos was more of an attempt to catch up with Picasa, and similarly Notes and iCloud need updates as well. It feels like Apple just wants to throw something out while they are trying to catch up with others in the market, and once that is done, you have to wait another year just to get an update. Software development and updates should be iterative, not based on media events.
Notes just received a pretty significant upgrade. And Photos was upgraded recently as well -- for better or worse. You picked the two most recently upgraded apps as being in need of an upgrade.