Apple already has several ARM powered laptops drifting around internally
hardware.slashdot.org

Fascinating, but take it with a grain of salt. One anonymous poster didn't just violate his NDA but shredded it and used it as hamster mulch, all to report an iPad-with-a-keyboard? It's plausible, but I think most of us here could have written that comment as a speculative exercise.
That AC could be both right and wrong. He could be totally right about the existence of those prototypes, and I bet he is (it'd be more surprising if Apple didn't have prototype ARM laptops floating around). But that doesn't mean he knows anything about actual plans to get them out the door. There are HUGE barriers to that happening at the moment; the cost/benefit ratio is extremely high. It sounds to me like this AC is correctly reporting the facts but incorrectly extrapolating his own story out of them.
> From what I was told, there's a huge push to get this stuff out the door as soon as they think the market will accept it.
Looks like a) he's not extrapolating anything, and b) the AC knew those huuuuge barriers existed when he wrote his post.
Well, the title of the post is “Last sentence is (almost) BS.”, so take the final paragraph with an additional heaping of salt.
It could also be an intentional leak to gauge reaction.
Makes sense. Allegedly they kept pretty much every version of OS X after Rhapsody running on x86 in some capacity until the x86 version of OS X finally came out. Keeping an ARM version around seems like a no-brainer.
I suspect this isn't being held in case the Mac market falls apart, but in case the iPad market starts losing to Surface and friends.
Exactly, and in fact there's even more to it than that. Particularly at Apple's scale, maintaining a codebase across multiple architectures internally offers significant value even if there is absolutely zero foreseeable intention to use them. Strategically, of course, it creates a hedge against over-dependence on any single supplier; it's not just "the Mac market falling apart" so much as Intel/AMD dropping the ball or becoming unable to go in a direction Apple wants (as happened with PowerPC). By the same token it helps maintain some economic negotiating position, even though Apple right now effectively faces a single-key-supplier situation. The mere fact that they could switch if absolutely forced to is of use.
Non-strategic value, though, is probably just as important as any of this: as probably most of HN knows well, keeping a codebase portable can be quite helpful in terms of plain and simple quality. Obscure bugs or bad patterns that are hard to find on one architecture can be a lot easier to identify on another. It can help promote discipline and good practices. Portability, I think, is really a constant process rather than a goal or a single thing: it's a lot easier to have worked on it all along for years before you need it than to try to "port" something later, because without the constant pressure of staying portable it's all too easy to fall into dependence on features (or worse, quirks) of a single arch and build up more and more technical debt. Then when the "bill" (not necessarily just money, but sheer developer hours) finally comes due, it's effectively unpayable.
> as probably most of HN knows well, keeping a codebase portable can be quite helpful in terms of plain and simple quality. Obscure bugs or bad patterns that are hard to find on one architecture can be a lot easier to identify on another.
Right. Even if they had no intent of making ARM laptops (although I suspect they probably do), the exercise of keeping the code portable between Intel and ARM would stand them in good stead even if they later moved to yet a third CPU architecture.
In addition to the pure R & D value (with all its uncertainty, but with probable considerable upside in the longer term), doing this likely more than pays for itself in terms of the leverage it provides with/against existing vendors/partners.
This is just my speculation. But, 1) You learn things; 2) You lessen lock-in / dependence, in turn reducing the markup you have to pay on existing production inputs.
Apple's benefited very considerably from the increasing customization and optimization of its phone CPU and surrounding silicon. What's the argument against carrying that over to their computer lines, if and when they can make the transition relatively smooth?
Should this apply to cloud provider portability?
The problem with portability between "clouds" (IaaS/PaaS providers) is that many of them have features (e.g. object storage; Dynamo-based distributed tables; reliable message-queueing; health checks connected to load-balancing and hypervisor lifecycle control),
• which are "obvious" and perhaps even necessary for productive coding of distributed systems software; and
• which have huge economies of scale (one shared cluster for all customers beats the pants off the performance+availability of your company's puny little three-node private cluster), and yet...
• which other major clouds don't support at all.
Effectively, all the "clouds" currently only offer between 30% and 90% of what you'd want in something that called itself "a cloud." Nobody has a "whole cloud" (AWS is closest, but still not there.)
Designing for portability between these clouds would be like writing assembly intended to be portable between processor architectures, when only one architecture had an ALU, only one had registers, and only one could conditionally branch. It would be madness.
---
Personally, I feel that to sensibly design for portability between cloud providers, they'd need a lot more features in common than they have now.
Maybe we could invent a minimum common standard to hold the cloud providers' stacks to—maybe a small one at first, with a growing list of expectations over time; or maybe a "core" spec, and then a number of "levels" of support atop it. Then you could say you've targeted "IaaS Level 3", and clouds could claim to support that, and cloud-abstraction libraries like Fog could actually do something useful.
Good question. Ideally it should... operating at Apple's scale, it would be a great idea to have their cloud-based solutions running on multiple platforms to reduce dependency/vendor lock-in.
> having Marklar around rewarded them substantially in the past
I always thought this was accurate and it was a little more accidental early on: https://www.quora.com/Apple-company/How-does-Apple-keep-secr...
(That said, even on this account they had laptops running Intel for several years before the transition was decided upon).
Since NeXTStep v3.0 in the mid-90s, the compiler toolchain has had the ability to compile multiple-architecture binaries. Apple of course bought NeXT, and much of OS X is still based on it.
I forget the exact magic method, and don't know if it is still in shipping versions of OS X, but I would be surprised if at least internal-to-Apple versions of their compiler and toolchain didn't still have this available for use.
The worst part of an iPad is carrying it around.
Give me a foldable iPad that still has a screen in both halves but is usable in landscape, and I will buy one.
Appropriate haptics could make it work like a tiny laptop in portrait, but like a regular all-screen iPad in landscape.
They better have ARM laptops done or nearly so. If they didn't, I would conclude that they have no idea what the fuck they're doing anymore.
Although if the software is through the Mac App Store only, then... well, it won't be a very useful device. I hope that they're looking to replace the Macbook with an ARM device, not create a new Chromebook.
I'm sure they will still have a compiler on the system -- it has to be possible to compile code for this machine from source and run it ...
As I think about it -- wouldn't an app store (which, if well run, provides the best possible security for distribution of binaries without source code and the most consumer-friendly commercial licensing model) plus an open-source ecosystem (self-compiled, source-only distribution) be pretty much the best possible world for users? There is (kind of) an open-source argument for making binary distribution without source onerous or impossible...
Perhaps they'll bundle or distribute an official version of something like homebrew to make installing open-source software as easy as possible and maybe even provide a way for binaries generated from these packages to become signed and distributed as binaries as an optimization?
I don't think I would mind the app store being the only way to get pre-compiled binaries if it were augmented with a well-supported open-source ecosystem for utilities that need to venture beyond the capabilities of the app-store sandbox. There is a technical argument for making any software that operates beyond the sandbox subject to source code auditing, simply because of the potential attack surface... I wouldn't necessarily mind if subscription pricing were the only business model for such sandbox-spanning utilities: continued maintenance (and ongoing revenue) for software that works beyond the sandbox is needed from a security perspective...
If the future is supposedly going to be this locked down, I wonder what will happen to the internet when there is no hardware left where you can develop stuff like apache-httpd and php and mysqld.
Mac is dead for developers and pro users if this is the case. My current Mac will be my last, and I'll be happy to give my money to Dell instead and anyone else selling more open hardware.
One of our devs just got a new XPS 15. It's a gorgeous machine. Amazing. First time I've envied a non-Apple machine since the early 2000s. I still marginally prefer the Mac track pad but the rest is on par or better.
Can confirm. I sold my MBP and bought myself one of the new Dell 15" developer editions. Trackpad is about on par with the prior physically-clicking mac trackpads. I run Fedora and the only thing that doesn't work is hot-plugging the Ethernet adapter. Otherwise all the usual Linux pitfalls like sleep/wake, wifi, hidpi, touch screen, etc. work fine out of the box. No plans to switch away.
I did exactly the same with an xps 13. Gave my Macbook Retina to my girlfriend, as the new models were not compelling at all. Fedora 25 is lovely on it and I can run docker and KVM on it natively. I will buy another Mac if they give me a proper value proposition.
Yeah, I think they'll lose 99.5% of developer mindshare as well. Which is the one thing Ballmer got right: don't underestimate the value of developer support for a platform like iOS.
developers go where there is money to be made. only having xcode on osx/macos hasn't hurt apple a bit.
Developers also very often go where the system is convenient and open and developer-friendly, see: the rise of linux, php, rails, node, mysql, wordpress, git; and android vs windows phone/mobile.
Isn't Apple still in the better position here? Since it comes with Unix-friendly tools, and most of the languages and tools you listed.
Do you think you'll be able to use those tools in a gatekeeper-MAS-only world? Or if Apple ships prebuilt versions, how well supported will they be by the authors of those tools if they can't develop and compile new versions on the platform on their own?
For a preview of the future, try this one on macOS 10.12 today:
% lldb --one-line run /usr/bin/python
spoiler:
Current executable set to '/usr/bin/python' (x86_64).
error: process exited with status -1 (cannot attach to process due to System Integrity Protection)

Your example is just showing that Apple picked decent security defaults for binaries which they ship. SIP can be disabled any time you want, and it doesn't apply to things you compile.
There is a genuine argument about control but overselling it just lowers your credibility, especially since it reveals tunnel vision: statistically very few Mac users need to run a debugger but more are at risk for malware which uses sensitive APIs.
I don't feel like I'm overselling this for developers (which this thread is all about). I've hit the SIP block several times trying to genuinely debug my python and ruby scripts. Sure, I can disable SIP on today's Intel MacBooks, but what about the ARM toys the story is all about?
Also, if you have malware getting far enough to try to ptrace binaries running on your uid, I would imagine things are still game over despite being prevented from debugging a new interpreter process. I'm not buying the malware scare when it comes to debugging newly forked processes on a non-root uid.
I think your argument would have been valid if I'd never clicked "enable debugging for this mac" in xcode.
On server I think this is correct. On consumer devices this is really not the case. Has any platform with a huge consumer base ever died by being abandoned by developers first, as opposed to device owners?
Developers will target Apple, but they won't use Apple or like Apple. It'll be like Windows during MS's dark years.
Right, but what independent developer is making money on the App Store these days?
Nothing stopping you from installing Blink Shell or Prompt on your iPad-like device and connecting to a GCE instance or your own linux server to work on those things.
I see we're basically agreeing the future of working on such things in private is not with Apple hardware then. (Unless you are Google)
Nothing stopping you from running your own server... I don't get what you mean about "in private".
You have to live in the future to invent for the future. These are probably being used by product managers to see the deficiencies and make the next feature upgrades perfect.
An interesting product-line split might be putting the MacBook line onto ARM for portability & battery life, and keeping the Air and Pro lines on Intel.
If we assume for a moment that this is true (which is a huge assumption), it doesn't mean they'll necessarily release a product quite like it.
I don't know how CPUs and instructions work but wouldn't this mean that every app has to be recompiled for ARM or run in an emulator?
Yes, it would mean that.
In the NeXT days, a program could be compiled as a "fat" binary that would run on multiple supported CPU architectures.
It made things easier for developers because they only had to ship one set of installation media regardless of the CPU the user had.
This capability still exists, and is in fact often used for libraries. The lipo command-line tool lets you bundle code compiled for multiple architectures into the same physical file (or extract the code for a specific architecture, etc.).
Didn't Apple have the same thing during the PowerPC->Intel migration? "universal binary" or something like that.
Apple's 68K->PowerPC transition did roughly the same thing.
If this is true, I sure hope to god developers will get some performant build-machine edition; otherwise it would make CI even more painful for iOS dev.
I'd love to see a worthy competition to Intel.
The AC points out that the chips were made by Intel.