x86 is dead, long live x86

engineering.mercari.com

125 points by harpratap 2 years ago · 171 comments

slfnflctd 2 years ago

When I think about how long chips like the 6502 have still been in active use (almost 50 years now), it is hard to conceive of a world where there isn't a significant presence of x86 activity for the rest of my life.

The majority of 'the market' may go elsewhere, but for a gazillion reasons, x86 will not be disappearing for quite a while. At this point it would honestly surprise me if we didn't at least have high quality emulation available until the end of the human race as we know it.

Sure, we've probably lost most of the software ever written on it, but a whole lot of interesting artifacts from a key transition point for our species still remain locked up in this architecture.

  • tracker1 2 years ago

    Given the new 128-core AMD server parts are on-par with ARM in terms of power efficiency and capable of more raw compute, it may even grow a bit.

    I think there's lots of room for ARM, RISC-V and x86_64 in the future. There are reasons to support any of them over the others. And given how well developer tools are getting support across them all, it may actually grow a lot. I think the downside is a lot of the secondary compute accelerators, such as what Intel is pushing and what the various ARM and RISC-V implementations include in practice.

    The further from a common core you get, the more complex porting or cross-platform tooling gets, even if it brings big gains in some parts. For example, working on personal/hobby projects on ARM boards that aren't a RPi is sometimes an exercise in frustration, with no mainline support at all.

    • mlindner 2 years ago

      > I think the downside is a lot of the secondary compute accelerators, such as what Intel is pushing and what the various ARM and RISC-V implementations include in practice.

      I’m curious why this is a downside. The current trend in computing is that we’re long past the point of single-threaded compute. The first step of that was multi-processor and multi-core, and that’ll continue with more and more dedicated and specialized computing sub-processors. Energy prices are increasingly becoming a major determining factor, as is the area needed for cooling. By having more separate sub-processors you get both efficiency and parts that are easier to cool.

      • tracker1 2 years ago

        The specialized sub-processors are implemented differently, not always available, and not available 1:1 to compute nodes. If you're offering, for example, cloud compute... you can offer 4 cores pretty easily... but if there are 2 specialized sub-processors, do you offer them? Does access queue across all users/clients on that system, or do you just block them off and pretend they don't exist? With Zen 4c, they're all general compute.

        This means, practically speaking, you're only going to have one host on a server that wants/needs these specialized sub-processors. Which means more space/heat/power for a single user/service. It's probably fine for some things, but far from ideal. This also doesn't get into software optimization and alternative code paths for where they aren't available.

        This gets far worse in the ARM space, as it seems every SoC does something different, which means it's often broken or unusable if you're using a mainline OS/kernel, and even then most software won't be optimized for it. At best, you can maybe play back 4K compressed video. At worst, you can't at all. Just speaking to the most common instance in that space, which is video compression: it's often built around closed drivers that mainline OSes (Ubuntu Server, Debian, etc.) don't have in the box, and the vendor only supports a single version of a distro fork with no upgrade path.

      • Incipient 2 years ago

        I'm not a hardware designer, so I'm quite possibly wrong.

        My understanding is that the push for energy efficiency is not for cost reasons, but for performance and stability. At a certain power level, the chips just can't dissipate enough energy, especially on smaller nodes.

        If AMD could double the performance at double the power, they would.

        Cost is obviously a marketing/client-value thing too.

  • Symmetry 2 years ago

    Plausibly we're headed for a world where feature-size decreases stall out but manufacturing improvements continue to lower the price of transistors over time. In a world like that, throwing in a few x86 cores might be worth it from a backwards-compatibility standpoint even if other ISAs become dominant.

    There's lots of complications to address there (strict x86 memory ordering versus loose ARM ordering, for instance) but I expect they're solvable.
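
    To make the memory-ordering point concrete, here's a minimal Go sketch of the publish pattern that weakly ordered ARM forces you to be explicit about; the variable names and the use of sync/atomic are illustrative assumptions, not anything from the article or this thread. A flag-only version without the atomics is a data race that often appears to work on TSO x86 but can fail on ARM.

        // Publishing data between goroutines: the write to data must become
        // visible before the flag flip, which the atomic store/load guarantees
        // under the Go memory model on both x86 (TSO) and weakly ordered ARM.
        package main

        import (
            "fmt"
            "runtime"
            "sync/atomic"
        )

        var (
            data  int         // plain payload, published via the atomic flag
            ready atomic.Bool // release/acquire point
        )

        func producer() {
            data = 42         // write the payload first
            ready.Store(true) // then publish; the payload write is visible to any goroutine that sees true
        }

        func main() {
            go producer()
            for !ready.Load() { // spin until the flag is observed
                runtime.Gosched()
            }
            fmt.Println(data) // guaranteed to print 42
        }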

    • tracker1 2 years ago

      IIRC, AMD's chiplet model could already combine x86 and ARM, as an example. Not sure if any such beasts are in existence, but it should be possible.

      • Symmetry 2 years ago

        Yeah, I'm actually thinking something like a socket integrator buying an 8 core chiplet from AMD or Intel to connect via CXL or whatever to the ARM/RISC-V/GPU/Tensor cores in the other chiplets from different manufacturers.

  • riffic 2 years ago

    A thing's future longevity can sometimes be predicted by how long it's already been around.

    • _a_a_a_ 2 years ago

      But not humans, regrettably.

      • wongarsu 2 years ago

        It used to somewhat work for humans, until about the mid-19th century (in Europe, other places vary). A newborn was unlikely to reach an age of 5, a 5 year old had better chances but might still die from illness in childhood, but someone who reached an age of 20 could be expected to reach an age of 40, and if you reached 40 you might well make it to 80.

      • User23 2 years ago

        99 year old humans are considerably more likely to live to be 100 than 50 year old humans.

        • Dylan16807 2 years ago

          "future longevity" in this case means how many years are left, not how many years total.

  • babypuncher 2 years ago

    As we learned from Terminator 2, the machines that eventually rise up to eradicate humanity will still be running some kind of 6502 derivative.

  • FirmwareBurner 2 years ago

    >chips like the 6502 have still been in active use (almost 50 years now)

    Also 8051 cores can still be found in modern products

    • pclmulqdq 2 years ago

      8051s are just now getting phased out as control cores of IP blocks, which is pretty wild.

      The 32-bit ARM and RISC-V cores are small enough and easier to program.

      • Kirby64 2 years ago

        ARM vs 8051 is usually a licensing-fee thing. 8051 license terms are extremely generous compared to ARM as far as I know. RISC-V is being explored more readily, but toolchains are not nearly as robust as ARM/8051.

    • RetroTechie 2 years ago

      Just a few years back, I checked out the datasheet for an IC encountered in a (newly bought) USB card reader.

      Turns out an 8051 core was included (IIRC clocked at ~30 MHz) to handle jobs like lighting the busy LED on card read/write ops, some bus arbitration / priority settings, power management, or the like.

      Made total sense to encounter an ancient, 'fast', tiny 8-bit core there, even though it was unexpected.

      There must be (and will be) an endless list of products including tiny CPU cores like that (e.g., RFID tags come to mind).

    • oynqr 2 years ago

      Ah yes, the three essential building blocks of electronics: NAND, NOR, 8051.

  • trashtester 2 years ago

    > ...until the end of the human race as we know it.

    I think this is the critical part. If humanity (as we know it) only lasts 10 more years, then sure x86 will still be around somewhere.

    If we last a million years, it will probably be gone long before that. Even a thousand years from now, it will probably have been gone for a long time.

    • postmodest 2 years ago

      I'm reminded of the Vernor Vinge novel where a character hacking some fleet's automation hundreds of thousands of years in the future casually mentions that the tech stack is so old that the system timestamp is still the Unix epoch:

      “And via a million million circuitous threads of inheritance, many of the oldest programs still ran in the bowels of the Qeng Ho system. Take the Traders’ method of timekeeping. The frame corrections were incredibly complex—and down at the very bottom of it was a little program that ran a counter. Second by second, the Qeng Ho counted from the instant that a human had first set foot on Old Earth’s moon. But if you looked at it still more closely…the starting instant was actually about fifteen million seconds later, the 0-second of one of Humankind’s first computer operating systems.”

      • skywal_l 2 years ago

        Part of the "A Fire Upon the Deep" series of novels (that particular reference is from "A Deepness in the Sky").

    • tetha 2 years ago

      I mean, a thousand years is very hard to imagine, along with how many changes there will be.

      But the cynical operator in my head can only laugh. We as a tech community are still running MS-DOS productively. Just wait: someone will run the door controls of our first spaceships on some x86 chip, or some similar system that you just need but that never gets time to be updated properly. Just wait: the new cruise-liner spaceship of the Milky Way republic is going to run some x86 emulator for its window controls.

  • thatfrenchguy 2 years ago

    Maybe this is because we're mostly an Apple computer household, but a few months ago I realized the only x86 device my household owns is our NAS (and frankly it's the worst device we own). I was pretty weirded out when I figured that one out.

  • sneed_chucker 2 years ago

    Are 6502 chips still used? What's the application?

    • babypuncher 2 years ago

      Skynet is due to start producing their T-800 line of Terminators in 2026, which will use a 6502-derived CPU.

    • krylon 2 years ago

      Tamagotchis run on 6502-compatible chips, which makes me think it's used in other toys, too.

      • tabtab 2 years ago

        What are the top embedded usage chips and what is their typical niche?

        • krylon 2 years ago

          I wouldn't know. I just watched a talk a couple of years back about reverse engineering Tamagotchis, and they turned out to run on some 6502-compatible chip made by a company that - IIRC - sells mostly to toy makers.

    • dean2432 2 years ago

      I was surprised to learn a few months ago that 6502's are still in production, so there must be some use for it. Perhaps replacement parts for industrial equipment from the 70s-90s?

    • duskwuff 2 years ago

      > Are 6502 chips still used?

      Largely, no. I'm sure there's a few out there, but it's unusual.

      Embedded 8051 cores, on the other hand... we're probably never going to fully escape those.

      • RetroTechie 2 years ago

        8051 was once big in industrial applications like PLCs.

        Somehow I doubt much has changed. In such applications, reliability + maturity of hw/sw ecosystem is much more important than raw speed or design innovations a competing architecture might bring to the table.

        So 8051-based parts may see the occasional process shrink, addition of new peripherals, or new IC packages, I/O pin counts, operating voltages, etc.

        But I'd doubt any designer worth their salt would dare touch that core architecture unless their life depended on doing so. :-)

        • duskwuff 2 years ago

          8051 is quite popular in new designs as a low-cost embedded controller. You don't see it as often as a discrete component [1]; it's more frequently embedded in more complex devices, like as controllers for USB peripherals or even for startup sequencing in larger parts.

          [1]: Although that is a thing too; there are a number of manufacturers like Silicon Labs with extensive lines of modern 8051-based microcontrollers.

    • _whiteCaps_ 2 years ago

      I don't know anything about physical 6502's but they've been embedded in FPGAs where you need a small MCU in a larger design. Same with Intel 8051's.

  • thefurdrake 2 years ago

    x86 is now permanently a part of humanity. 1000 years from now, when we've transcended our physical bodies and exist only as streams of sentient data and energy traveling between the stars, I 100% guarantee x86 will be detectable somewhere.

  • Gordonjcp 2 years ago

    Stuff like 6502s and Z80s are a bit like little single-cylinder engines - the world will move onto all sorts of interesting new places, but something somewhere will always be powered by a wee Briggs & Stratton that starts first pull of the string, and we'll be glad of it.

    • Dylan16807 2 years ago

      We'll always need tiny cores, but it's worth noting that RISC-V can squeeze down to pretty small sizes and is so much nicer to use. Notably you can go smaller than a 68k or 8086.

      • hedgehog 2 years ago

        I suspect 32-bit RISC-V cores will become the minimum unit of processor for new designs. Not a meaningful cost increase over say a 6502 (or ARM) to build, but the convenience of having mainstream compiler support and that kind of thing does make a difference in the cost of building a product.

        • mjevans 2 years ago

          Fewer transistors does mean less power too. I could see applications where that matters.

harpratapOP 2 years ago

We did a migration from GCP's Intel-based E2 instances to AMD's T2D instances and saw a huge 30% savings in overall compute! It's a similar amount of savings to what folks got from switching to AWS Graviton instances, so it looks like AMD might keep the x86 ISA alive.

  • jeffbee 2 years ago

    E2 is just extremely old. E2 is a mixed fleet that contains CPUs as old as Haswell, which launched over 10 years ago. It makes sense that you get better bang for your buck from using something that isn't gravely obsolete. You should also keep that in mind when benchmarking E2: since it's a grab-bag of CPUs, you need to control for the actual CPU type, either by specifying the minimum one you require or by assuming the worst.

    • harpratapOP 2 years ago

      That is true. But in the cloud our hands are tied; we cannot really switch from one generation to another so easily. GCP has so far never launched a new chip while keeping the price the same or lower. We did the same cost/perf analysis on newer-generation chips like Ice Lake from Intel and even Milan from AMD in the form of N2D, but both are quite expensive for the performance uplift they provide. The unique thing about T2D is that the price is competitive with 10-year-old E2, which so far has been the case only for ARM-based CPUs like Graviton and Ampere Altra (T2A from GCP).

      • jeffbee 2 years ago

        I get what you mean. For the record, here's a GCP engineer holding forth on price/performance ratios who also concludes that T2D is optimal. https://medium.com/google-cloud/google-cloud-platform-cpu-pe...

        ETA: Since you are using Go and targeting a specific modern CPU, you may also get a measurable benefit from setting GOAMD64=v3, so the Go compiler generates code using AVX2, BMI2, LZCNT, etc.

        • harpratapOP 2 years ago

          Thanks, that's a really insightful article!

          > Since you are using Go and targeting a specific modern CPU, you may also get a measurable benefit from setting GOAMD64=v3

          That's actually a long pending open issue in our backlog :)
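
          On the GOAMD64=v3 suggestion above, a minimal pre-flight sketch, assuming the golang.org/x/sys/cpu package: it just confirms that a host actually exposes the x86-64-v3 features (AVX2, BMI1/2, FMA) a v3-built binary will assume, since such a binary won't run on CPUs that lack them. The particular feature list checked here is an illustrative assumption, not something from the thread.

              // Report whether this host looks safe for a GOAMD64=v3 binary.
              package main

              import (
                  "fmt"

                  "golang.org/x/sys/cpu"
              )

              func main() {
                  // GOAMD64=v3 lets the compiler use AVX2, BMI1/2, FMA, LZCNT, etc.
                  // unconditionally, so check the fleet before rolling out such a build.
                  ok := cpu.X86.HasAVX2 && cpu.X86.HasBMI1 && cpu.X86.HasBMI2 && cpu.X86.HasFMA
                  fmt.Printf("AVX2=%t BMI1=%t BMI2=%t FMA=%t -> GOAMD64=v3 safe: %t\n",
                      cpu.X86.HasAVX2, cpu.X86.HasBMI1, cpu.X86.HasBMI2, cpu.X86.HasFMA, ok)
              }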

amusingimpala75 2 years ago

I would just like to point out that that is not how the saying goes. When Queen Elizabeth died it would have been “the queen is dead, long live the king”, as the dead thing is the predecessor and the living thing is the one that follows it.

  • ihattendorf 2 years ago

    https://en.wikipedia.org/wiki/The_king_is_dead,_long_live_th...!

    In this case, the article is implying that Intel (x86) is dead to them and AMD (x86) is the successor. Whether Intel is dead or not is up for debate (I doubt they are), but the saying is used correctly.

  • harpratapOP 2 years ago

    The dead thing is Intel (x86) and the successor is AMD (x86) as opposed to ARM in our case. Isn't it correct?

    • pjmlp 2 years ago

      If we forget about the Intel patents that AMD also needs in order to produce their own x86 flavour.

    • re-thc 2 years ago

      > Isn't it correct?

      No. Intel isn't dead. They may be behind (for now) but they're definitely catching up, and in a way already have on the desktop.

      It's not certain that Intel will die and AMD will for sure win. Competition is great.

      • saghm 2 years ago

        If we're going to be that pedantic, we might as well just say that because Intel and AMD aren't living entities, they can't be "dead". Or we could just accept that the original meaning of the phrase has evolved and is flexible enough to be used in situations where it's not absolutely literal.

        • Teever 2 years ago

          I just closed another post on HN because the top thread descended into some bullshit semantics pissing contest so I come here and see the same.

          Is it just me or has anyone else noticed an increase in arguing over the meaning of words on HN lately?

          • kergonath 2 years ago

            I’ve noticed that as well. It beggars belief that people are seriously arguing about applying a centuries-old foreign metaphor about kings to modern large corporations. That’s straight over Pedantic territory into the Obnoxious Contrarian wilderness. Sigh…

            • coldtea 2 years ago

              Sorry, but you've just noticed it?

              Wars about semantics have been standard fare on geekdom since the dawn of time, and on Hacker News since the dawn of Arc.

              What appears as "Obnoxious Contrarian wilderness" is good old hacker "well, actually" pedantry (with some sprinkling of on-the-spectrum focus on details and semantics).

              • kergonath 2 years ago

                > Sorry, but you've just noticed it?

                Yes, I have noticed “an increase in arguing over the meaning of words on HN lately”. I also know how to discuss the meaning of words and how they form the context of a conversation, once taken together.

                > Wars about semantics have been standard fare on geekdom since the dawn of time, and on Hacker News since the dawn of Arc.

                Sure, and I very much enjoy some pedantry (and am not above doing my bit every now and then), but that’s not the point. If someone wants to have a heated discussion about how a centuries-old metaphor would apply to the kings of France but not to a multinational company, then sure, have at it, as long as you don’t derail otherwise useful threads. I will just opt out of spending my time that way and instead go find some interesting pedantry.

                I must admit I am not really sure what image you want to imply by “dawn of Arc”, though it does go a bit with the French king vibe. Would you care to elaborate a little?

                > What appears as "Obnoxious Contrarian wilderness" is good-old hacker "well, actually" pendanticness (with some sprinkling of on-the-spectrum focus on details and semantics).

                Pedantry and semantics are fine, and indeed one of the reasons why I am here. But the point beyond that has to be knowledge and enlightenment, otherwise it’s just, yes, obnoxious. There is also a thin line between a good “well actually” and a nuisance (not even talking about bad faith trolling).

                Please note that I did not address any of the people in question, because that would be stupid, rude, and counter-productive. I am happy if they have their arguments on their side. I was just replying to a fellow commenter that they were not alone with that feeling. There’s no need to be rude about it.

                • dragonwriter 2 years ago

                  > I must admit I am not really sure what image you want to imply by “dawn of Arc”

                  Arc is a Lisp dialect invented by PG whose main notable use is that it is the implementation language for HN: the “dawn of Arc” is thus a flowery way of saying something like “the first concrete steps to the creation of HN”.

                • coldtea 2 years ago

                  >There’s no need to be rude about it.

                  I'm sorry, then! (Was I tho? I thought I answered mostly matter-of-factly)

          • arcanemachiner 2 years ago

            Nerds have been arguing online about irrelevant pedantry as long as the Internet has been around. And this place attracts some pretty hardcore nerds.

          • lcnPylGDnU4H9OF 2 years ago

            I'd guess it's a timing thing. This site likes to prioritize more highly the "current" discussions which are generating upvotes as long as they're not also generating a ton of downvotes.

          • coldtea 2 years ago

            >Is it just me or has anyone else noticed an increase in arguing over the meaning of words on HN lately?

            Depends on what you mean by increase, arguing, and words.

          • throwaway742 2 years ago

            It's unironically one of the things I like about this site.

      • Clamchop 2 years ago

        Weird. The topic was the phrase.

        Mercari replaced Intel with AMD. One x86 out, another in. Usage is correct if figurative.

        Article does not claim Intel is dead. States that they are "catching up". But for their nodes, Intel out ("dead"), AMD in.

        It's reasonable.

      • coldtea 2 years ago

        Your point is about the factual statement (whether Intel x86 is actually dead).

        But this subthread (as started by amusingimpala75), and your direct parent's question was about whether the idiomatic usage of "(old) X is dead, long live (new) X" was correct (given that the author believes AMD taking over Intel is the factual case). That is, not about whether the factual statement is correct or not, just whether it's expressed well idiomatically.

      • GauntletWizard 2 years ago

        Perhaps it would be better phrased as "Intel x86" vs "AMD x86-64": specific sets of mandatory instructions and extension parameters. But AMD's set "won" at some point in the past, roughly corresponding to the launch of Intel's "Core" microarchitecture in 2006, when Intel started competing in the x86 space again.

  • coldtea 2 years ago

    >When Queen Elizabeth died it would have been “the queen is dead, long live the king”, as the dead thing is the predecessor and the living thing is the one that follows it

    And if a freshly dead king is replaced by a new king, it's "the king is dead, long live the king".

    The predecessor and the living thing don't have to be of the opposite sex, or use a different term:

    https://en.wikipedia.org/wiki/The_king_is_dead,_long_live_th...!

    It's a very famous idiom.

  • fsckboy 2 years ago

    no, the "saying" is "the king is dead, long live the king", because the "saying" uses the apparent absurdity of the king being dead and alive to illustrate the stability of the royal system: the people don't need to worry, they are never without a king.

    yes, in a particular circumstance, if there happens to be a queen involved (rarer within agnatic primogeniture), then it would be spoken as you say, but that's not "the saying" that people generally quote.

    Rarer still, two queens would have the same form as the saying, "the queen is dead, long live the queen". Which I mention in order to note that when the king is dead, if it's "long live the queen", it's generally not the king's spouse (even if she was styled "queen") but some direct blood relative of the king, such as his daughter. I think his wife would become the Dowager Queen. The Dowager Queen might rule as Regent if her children were not adults yet.

    https://en.wiktionary.org/wiki/agnatic

mobilio 2 years ago

I've recently switched from a VPS with Intel to a physical server with the latest AMD Zen 4.

Single-thread performance blew my mind, with scores around 4000.

Without changing a single line of code, performance was 10x what it was before.

singhrac 2 years ago

As a separate data point, I briefly switched one of our servers from an r6a.4xlarge (AMD Epyc) to an r6i.4xlarge (Intel Xeon) and saw a 30% speedup in our number-heavy compute task. I would love to find out why (MKL or AVX-512? Do I need to recompile numpy?), but for the time being it pays to stay on Xeon.

We eventually switched to m-instances since that fits our compute/memory usage better when we’re at limits.

  • re-thc 2 years ago

    If it's AVX-512, you'll be excited to know that m7a, currently in preview, has great support for it. AMD Epyc 4th gen (i.e. Genoa) is a lot faster.

  • harpratapOP 2 years ago

    Yes, I point this out in the article too. Which CPU will perform better is heavily dependent on your workloads, so I refrained from relying 100% on synthetic benchmarks and ran canaries directly in production instead. It's definitely possible Ice Lake is superior to Milan for your workload.

m4r1k 2 years ago

If you want additional insights, head over to my Medium post with SPEC CPU 2017 results. https://medium.com/google-cloud/google-cloud-platform-cpu-pe...

tester756 2 years ago

x86 would be dead if the ARM ISA offered at least, say, a 20% perf boost or energy savings.

But since an ISA doesn't imply performance characteristics by itself, x86 will stay alive.

The hard part in changing hardware is getting software to adjust.

netr0ute 2 years ago

Why do these kinds of articles essentially refuse to admit RISC-V exists?

  • xcdzvyn 2 years ago

    Because it essentially doesn't? At least when I tried looking a few months ago, it was really hard to find any commercially available RISC-V SoCs, let alone high-performance ones. Sure, there are little hobbyist boards going for around $200, but that's about it.

  • jsheard 2 years ago

    Probably because the article is looking at silicon that's actually available today, and big-iron RISC-V silicon akin to Xeon/EPYC/Graviton/Ampere is more of a hypothetical at this point.

  • harpratapOP 2 years ago

    Cloud world is really slow. Imagine writing about Zen3 Milan in Aug 2023 when AMD has already announced Zen4c Bergamo. Actually we were "ahead of the curve", since we got access to T2D before it was made publicly available in the Tokyo region, and even then it took several months to get enough capacity in Tokyo to fully migrate our production Kubernetes cluster.

    I really wish we could test out RISC-V SoCs from the likes of Tenstorrent, but it's a long journey.

    • re-thc 2 years ago

      > Cloud world is really slow. Imagine writing about Zen3 Milan in Aug 2023 when AMD has already announced Zen4c Bergamo.

      AWS has Zen 4 in preview. Azure and Oracle have Zen 4 available. It's only Google Cloud that's been behind on this release. The cloud world isn't slow.

      • harpratapOP 2 years ago

        It takes a very long time to go from preview to actual production usage for anyone. We had T2D preview access more than a year ago; it took several months to get enough stock in the Tokyo region (the US always gets preference whenever a new machine type comes out). GCP already has Zen 4 in preview for some US customers. Also, us being one of the largest GCP customers in Japan made things even slower.

  • x-complexity 2 years ago

    > Why do these kinds of articles essentially refuse to admit RISC-V exists?

    Because as much as I like RISC-V myself, it hasn't built up the scale needed to supplant x86-64 or ARM. There's still a long way to go before the following are achieved:

    - Performance similar to or better than x86-64 or ARM (at the very least ~80% of theirs)

    - Similar or lower prices compared against x86-64 or ARM

    - A win in either price-to-performance or power-to-performance against x86-64 or ARM at some point.

    • snvzz 2 years ago

      >a long way to go

      Veyron releases later this year. Ascalon next year.

      It's going to get exciting.

  • hedgehog 2 years ago

    RISC-V is important, and I use it every day for work, but there are no mainstream mature server platforms using it and that's what this article is about.

  • snvzz 2 years ago

    The context is AWS, and there are no AWS RISC-V instances yet.

    But worry not, RISC-V is inevitable.

  • pjmlp 2 years ago

    Because in the grand scheme of the universe, it doesn't matter.

    It is an architecture cheered by FOSS folks who ignore that cloud offerings will be just as closed, and no one is selling RISC-V computers at the shopping mall down the street.

retskrad 2 years ago

x86 lives on desktop now. Windows, Intel, AMD and Nvidia have more or less thrown in the white flag and given the laptop market to Apple.

  • o1y32 2 years ago

    A few programmers on HN think only they use laptops (and specifically Macbooks). Not surprising.

    If you just bother to open your eyes a little bit wider, you would notice that there is a huge market for Chromebooks, budget laptops, gaming laptops, mobile workstations, and ultrabooks for students, gamers, business people, bankers, etc. New things are happening every month. We are seeing more efficient laptops from Intel and AMD. Companies like Framework are doing actual innovation in the decades-old laptop area. And there are workflows that can only be done on Windows.

    Your claim is completely unfounded.

  • cardanome 2 years ago

    Apple products are only dominant in the US. For the rest of the world they are more of a luxury item.

    That said, I am not terribly interested in ARM-based laptops for now. Yes, they may be more energy efficient and all, but that hardly matters to me compared to just having the same x86 architecture I run on my desktop and servers. That sweet binary compatibility means fewer headaches.

    People underestimate the advantages that CPU architecture monoculture gave us, though they are admittedly getting less important year by year. Maybe one day I am going to run an ARM laptop or even RISC-V.

    • mpweiher 2 years ago

      > ... just having the same x86 architecture I run on my desktop and servers.

      Yep! For me, it's the same ARM64 architecture I run on my desktop and servers[1]. :-)

      Hetzner's offering is very competitive, cheaper than their already rock-bottom x86 offerings. Then there's AWS Graviton, Oracle with their free tier (not sure how expensive that gets if you actually have to load it) and both Azure and Google also have ARM offerings.

      [1] https://blog.metaobject.com/2023/05/setting-up-hetzner-arm-i...

    • pjmlp 2 years ago

      > People underestimate the advantages that CPU architecture monoculture gave us, though they are admittedly getting less important year by year.

      People also oversell it; I never had any problem developing software for Windows 2000/NT, Solaris, AIX, HP-UX, or Symbian from my x86 desktop.

    • agloe_dreams 2 years ago

      A key element to Apple and ARM, and why you are right outside the US, is this:

      1. Apple's chips are so far beyond everything else that it makes obvious sense for Apple only. Snapdragon is at least 30% slower single-core while having worse performance per watt.

      2. Apple wasn't playing fair with their translation layer. The Rosetta layer cheats a little because Apple also made the chip. The secret sauce is that the M1 has a hardware compatibility mode (That technically breaks ARM spec) for x86 memory ordering that basically gives near 1:1 performance.

      3. Microsoft heavily botched their ARM rollout (again. Hello, Windows RT). The translation layer on W10/11 is just bad, not because of bad coding but just the technical limits of what they were trying to do.

      4. Google is in a great place with ARM compatibility in ChromeOS, as the only consumer-facing apps are either built in or on the Play Store... which was designed for ARM in the first place. The problem is that nobody will give them a good chip for ChromeOS and the focus is on the low end, so MediaTek is just cruising.

      ARM is a great arch... but right now it is Apple vs x86, not ARM.

      • astrange 2 years ago

        > That technically breaks ARM spec

        It does not (or it wouldn't be there) and they aren't the only ARM vendor with TSO (Fujitsu also does it.)

        • monocasa 2 years ago

          Optionally enabling it does break the ARM spec.

          Their cores break the ARM spec in other ways too. Added instructions for AMX, new guard privilege modes, HCR_EL2.E2H can't be disabled, GPRs are clobbered on WFI, etc.

      • yyyk 2 years ago

        >3. Microsoft heavily botched their ARM rollout (again. Hello, Windows RT). The translation layer on W10/11 is just bad, not because of bad coding but just the technical limits of what they were trying to do.

        It's not bad at all; however, Intel patents heavily restricted the initial implementation, to the point that they couldn't ship 64-bit emulation. Apple decided to wait out the patents.

      • ngcc_hk 2 years ago

        Unfair … strange use of the phrase. "Innovative" would be a better word, as nothing stops others from doing the same … it is good to have a vendor that plays differently rather than following the slow, ARM-dominated pack.

        The conclusion is on the ball, but only for now. The M1 is a bit old now, and if you count other ARM vendors and the two major architectures (IBM POWER for mainframes … cannot be counted), it is at least x86-32 plus x86-64 vs ARM vs Apple.

        • agloe_dreams 2 years ago

          The wording of "unfair" was more of a playful use of the word. Because Apple vertically integrated the whole product and software supply chain, they are singularly able to do things others cannot. In that context, if one is playing "fair" on the terms of everyone else when they could do more... they are losing.

  • kens 2 years ago

    ARM has a lot more market share in people's minds than in actual numbers. One research firm says that ARM has 15% of the laptop market share in 2023, expected to increase to 25% by 2027. (Surprisingly, Apple only has 90% of the ARM laptop market.)

    In the server market, just an estimated 8% of CPU shipments in 2023 were ARM.

    https://www.counterpointresearch.com/arm-based-pcs-to-nearly... https://www.digitimes.com/news/a20230217VL209/amd-arm-digiti...

    • Andrex 2 years ago

      I keep meaning to research this, and maybe this isn't the right place, but how are things going for Linux executables? Last I tried Linux on ARM it seemed like application-level support for ARM was very spotty.

      It doesn't seem like a Rosetta 2-like effort is making it into mainline Linux anytime soon, if ever.

      • arp242 2 years ago

        For open source applications it's generally pretty good. Closed source: not so much.

        qemu-x86_64 is probably the closest there is to Rosetta; it works fairly well and I think conceptually it's identical or similar to Rosetta, but I don't really know what the performance is like, and it's of course not as integrated/automatic as Rosetta.

      • nazgulsenpai 2 years ago

        I suppose it depends on the distribution but most I've used (Manjaro, Void, Alpine) are close to 1:1 with x86. Of course third party applications have to compile for ARM if they don't intend to distribute source code.

      • thatfrenchguy 2 years ago

        Rosetta for Linux VMs is a thing now for Macs: https://developer.apple.com/documentation/virtualization/run...

    • agloe_dreams 2 years ago

      That is spot-on. The fact of the matter is that it is Apple VS x86, not ARM. The M1/2 have dominant mindshare due to being actually better.

      On Windows, ARM products just suck. The product is just bad because there is no reason to use it. The chips available are worse than x86, and then the software situation is bad due to a whole set of reasons that Microsoft can't change on their own. Google has that last 10%, probably; ChromeOS moves ARM devices all day, every day. Just cheap ones.

  • jimmaswell 2 years ago

    Based on what? I would never buy an Apple laptop and no one I know has one either. The Windows "gaming laptop" I got a few years ago for game dev is perfectly adequate and has features missing from MacBooks:

    - ethernet port

    - hdmi port

    - multiple usb A and C ports

    - solid keyboard

    Same for my work laptop: again Windows/x86, and no one I know with a work laptop is supplied a Mac either.

    • edgyquant 2 years ago

      We live in different worlds. All engineers I know have Macs, and the ones that don't have Linux. The one guy on my team who uses Windows is a constant problem, as he has to find workarounds for every process we have / service we build.

      • shortrounddev2 2 years ago

        Every company I've worked for provided a MacBook to us, and all my coworkers use a MacBook. When I have the option, I pick PC. My last company gave me a choice between Mac or PC, and I picked the ThinkPad. My current company gave me a MacBook Pro, but I opted to use my personal Windows machines instead.

        I'm the guy who has to come up with a workaround, but it's never a problem for others. Usually I'm translating some bash command into PowerShell, which takes maybe 2 minutes. Everything else runs just fine, and I don't even need to use WSL. Windows is a perfectly viable development platform, if you give a shit about learning the differences between a Unix-like environment and Windows instead of just copping out and using WSL.

      • oblak 2 years ago

        1. Build a pipeline with macOS in mind (because "All engineers I know have Macs")

        2. Complain when the macOS-oriented process doesn't work for everyone.

        • WeylandYutani 2 years ago

          Apple is doing what people hated Microsoft for. It's amazing that a closed ecosystem is so welcomed by Apple fans. But I suppose man has always been tribal.

          • Dylan16807 2 years ago

            Any closed ecosystem is annoying but I'm not going to complain very hard while their desktop market share is below 20-25%.

            • WeylandYutani 2 years ago

              Desktop is becoming increasingly irrelevant. Most people don't even own one anymore. Apple is as close to WeChat as you can get in Western countries.

        • ezfe 2 years ago

          I work on a split team, Windows and macOS. We do straightforward Java work and most of the problems that we have are related to installing command-line software on Windows.

          Installing Postgres, Redis, etc. on Windows is wildly complicated compared to on macOS or Linux.

          • jimmaswell 2 years ago

            Tried WSL?

            • edgyquant 2 years ago

              Yes, it’s a constant source of problems. Trying to Live Share and get any actual work done leads to constant crashes; simply git cloning and trying to docker-compose and yarn install is a convoluted mess, etc. Windows is just not friendly to the kind of development that's common these days.

        • edgyquant 2 years ago

          It was actually built out with Linux in mind, since that’s what everyone deploys to. Mac just happens to work flawlessly with Linux and Windows doesn’t.

      • parineum 2 years ago

        If your dev environment is meant to be multiplatform, it sounds a lot more like the other developers are checking in things that don't work on anything but Macs.

        Your Linux devs are just used to dealing with it.

        • senttoschool 2 years ago

          >If your dev environment is meant to be multiplatform

          Why would this be the case?

          Dev environments should concentrate on a single platform as much as possible.

      • redwall_hp 2 years ago

        Yep. I work in an entire office of engineers with only Macs (mostly ARM ones now) and I only have a Mac at home, besides a very old ThinkPad running Ubuntu. (Aside from FFXIV on MacOS, I've moved any gaming to PS5 and Switch.)

        I haven't actively used Windows since 2008. It's such an obnoxious OS to use, and I have never seen a Windows laptop with quality in the same zip code as a Mac. There's always a terrible touchpad, keyboard flex, plasticky trash fit and finish, a mediocre display, or something that just ruins the whole thing.

        • shortrounddev2 2 years ago

          I like the aluminum bodies of Macs, but I hate the flat keyboards. In my view, no laptop brand has had a good keyboard since the ThinkPad T420.

          In terms of build quality, Apple laptops are pretty good, but I can't stand macOS, and Apple keeps fucking with their ports.

          My Inspiron 16 has a good glass screen and a pretty solid aluminum body. It's got all the ports I want (HDMI, USB-A 3.0, SD card reader, USB-C, and a 3.5mm headphone jack). Personally, the MacBook Pro my company issued me is one of my least favorite laptops I've ever used. I also had compatibility or performance problems with x86-based software when I got my first M1 MacBook; I don't know if they've improved anything since then.

        • selectodude 2 years ago

          I just received a brand new Thinkpad from work. I don't even mind Windows but holy crap is that thing an absolute chore to use. I don't understand how this is still an issue while Apple has been putting out what amounts to the perfect laptop for like 10+ years now.

          • rejectfinite 2 years ago

            >I don't even mind Windows but holy crap is that thing an absolute chore to use.

            What are your issues? I have a T490 that is a few years old and it's great for my use. But I keep it docked to a monitor and external mouse and keyboard 90% of the time.

          • innocenat 2 years ago

            On the other hand, I moved from MacBook to ThinkPad and holy crap it's just so much easier to use.

        • hota_mazi 2 years ago

          > I haven't actively used Windows since 2008. It's such an obnoxious OS to use

          You might want to modernize your perception of it; holding on to 15-year-old views is not a very smart move in the tech industry.

          Windows 10 and 11 are a joy to use and to develop on. I used a Mac for 15 years and switched to Windows for development 5 years ago, and now macOS looks extremely antiquated and creaky for development.

          • Clamchop 2 years ago

            Sure are a lot of extreme opinions on operating systems when they all seem pretty damned similar to me on the fundamentals.

            Development work strikes me as maybe the least distinguishing kind of task to compare them on. Very few hard choices to make unless you're developing for Mac or iOS.

          • redwall_hp 2 years ago

            The operative word is "actively." I've certainly touched it since then and am just as unimpressed. Windows is a chore to use, and the best they've been able to manage to close the gap is shipping a glorified Linux virtual machine. I'll take Homebrew over that any day, and leave the VMs to Docker.

            The Windows UI itself is an unpolished joke that's a hodgepodge of things that haven't been touched since Windows XP and half-baked new things, with sad window management (ironic) that lacks the perfection of Exposé and Spaces. To say nothing of laptop battery life, or plugging in and removing multiple monitors several times a day.

            I would strongly consider not working somewhere if they didn't use Macs.

      • hota_mazi 2 years ago

        You probably live in the US and parent lives... anywhere else in the world really.

    • ezfe 2 years ago

    MacBook Pros have HDMI ports and many USB-C ports. I also don't know of any issue with MacBook keyboards except the 2016-2019 model years.

      I work for a large financial services company and receive a MacBook Pro for my work.

      My work laptop battery can handle all-day IntelliJ & web browsing without charging, I've never heard the fan and the laptop stays cool. It has a Geekbench multicore score of 13,649.

      My personal laptop also lasts that long and stays cool, and doesn't even have a fan.

    • glimshe 2 years ago

      I would never buy an Apple laptop. I have one for work and I hate it. Linux and Windows are by far more productive for me, so all I need to do is keep getting better PC laptops. While Mac hardware does have a slight edge on cost/benefit at that price point, macOS is just too cumbersome for my needs.

      EDIT: It's funny to see this downvoted. I'm literally speaking of my particular use case and needs! Are there people who believe Apples are better in all scenarios?

      • adamc 2 years ago

        The whole "click once on the app for it to wake up, click again for it to know which window you wanted" makes using Mac OS highly unpleasant on a daily basis. Yes, I'm on Mac OS, and no, I don't like it.

        • cthalupa 2 years ago

          As a daily Mac user for about 13 years now I have no idea what phenomenon you're talking about.

          For any program that is open, even if it hasn't been in use for a while, I click it once, and it comes to the forefront immediately.

          For programs that are not open, I click it once, and then if I wait for it to launch, it appears in the forefront. If I immediately click back over to my web browser and begin using it before the program launches, it does not appear in the forefront... because I interacted with another program after starting the launch.

          • Clamchop 2 years ago

            They might be referring to how clicks don't register with an app until it has focus (the first click being to focus it).

            On Windows you can, say, click once to pause on a video playing in a browser window that doesn't have focus but is visible. On macOS, it's two clicks, because you have to focus the app first.

            I don't care for the behavior but it's so minor to me that it carries essentially no weight.

            • cthalupa 2 years ago

              Ah! This does make sense. I mostly run things fullscreen on my laptop and use Mission Control to swap windows so I really only encounter this when my laptop is docked to a larger display.

              It doesn't particularly bother me one way or the other to the point I didn't even think of this, despite also being a daily Windows users on my desktop computer.

        • TillE 2 years ago

          Use Mission Control for everything, it's great.

          • zdragnar 2 years ago

            Except for how there's no stable sort order for windows, even if you turn off the setting that's supposed to stop rearranging them all the time. Mission Control was probably one of the things I liked least about the Mac when I used one for work... to the point that I ended up using Hammerspoon to create shortcuts to switch between my applications directly rather than deal with it.

    • holbrad 2 years ago

      Just chiming in to agree. I don't think we have a single Mac in our entire company (hundreds of people). It's all just Windows/x86, like every other business I've worked at.

    • cthalupa 2 years ago

      MBPs once again have HDMI ports and good keyboards. I wouldn't mind a single USB-A port, but I don't particularly miss it, and 3 USB-Cs is plenty for my uses.

      I haven't needed an ethernet port on a laptop since the mid-2000s, troubleshooting why my newly installed stack of Cisco switches wasn't working, and that sort of use case is rare enough that having a dongle seems fine.

      Conversely, about 80% of the people I know use Apple laptops for work.

    • babypuncher 2 years ago

      Macbooks have HDMI ports and solid keyboards.

      I don't see ethernet ports making a comeback on thin laptops anytime soon though. They are handy for LAN parties, so bulky gaming laptops will keep shipping with them, but for everyone else it makes more sense to leave that functionality to a dock.

    • gumby 2 years ago

      That's what's so great about capitalism: I'm not interested in any of those features on a laptop, so Apple is great for me.* You do, and several other companies compete for your business.

      On a desktop (which these days isn't a tower) I do want some of those features, and I get them.

      * except kbd, and I find the apple kbds just fine -- I even survived a couple of rounds of the infamous "butterfly" kbds. But I know some people were unhappy and eventually Apple got their act together.

    • katbyte 2 years ago

      Post-2019 MacBooks now have a great keyboard, HDMI and multiple USB-C ports, as well as an SD card slot and headphone jack.

      The only things actually missing from your list are Ethernet and USB-A.

    • scarface_74 2 years ago

      > Based on what? I would never buy an Apple laptop and no one I know has one either

      Yes, because no one that you know buys an Apple laptop, that must mean that Apple isn’t selling any…

      > Same for my work laptop, again Windows/x86 and no one I know with a work laptop is supplied a Mac either.

      I find the lack of self awareness…amusing.

      Because no one you know uses a Mac laptop, that must mean no one uses one.

      Well, my anecdote from working at the second-largest employer in the US is that the vast majority of technical people prefer Macs, even though they can choose Macs or Windows.

      • arp242 2 years ago

        "Apple isn't selling any laptops" wasn't what was said. The original claim was x86 has "thrown in the white flag" and that Apple dominates the laptop market, which is a pretty ridiculous claim since x86 is still the vast majority of laptop sales and Windows is still the dominant laptop OS.

        • scarface_74 2 years ago

          > "Apple isn't selling any laptops" wasn't what was said

          The parent poster said

          > would never buy an Apple laptop and no one I know has one either

          What relevance then was there in the statement that the original poster made?

          > Apple dominates the laptop market, which is a pretty ridiculous claim since x86 is still the vast majority of laptop sales and Windows is still the dominant laptop OS.

          Nor did I argue otherwise. I merely called out the silliness of the idea that, because the parent poster “doesn’t know anyone who owns one” or because every developer he knows uses a Windows laptop, his anecdotal experience has any relevance.

        • havblue 2 years ago

          I don't personally see MacBooks as a viable platform either right now, but I would say that this looks like a steady decline for Windows x86 laptops until Chrome laptops take over or an ARM-based Windows replacement takes hold. "Hey, this isn't a surrender, it's a war of attrition for the next five years!!!"

          • arp242 2 years ago

            Given that 15-20 years ago almost all laptops were x86/Windows, "decline" is pretty much the only direction the platform could go in.

            I don't expect either x86 or Windows to go away any time soon – in general I think x86 has been somewhat unfairly maligned, and for most applications (including laptops) it works pretty well and is even the best available choice. I would prefer to say "diversification" or "the existence of actual competition" rather than "decline", as that doesn't imply it's going away.

          • scarface_74 2 years ago

            > I don't personally see MacBooks as a viable platform either right now

            But yet Apple is selling millions. Someone must see it as a viable experience.

      • debugnik 2 years ago

        They're not arguing that Apple doesn't sell laptops, they're refuting that "Windows, Intel, AMD and Nvidia have [...] given the laptop market to Apple". Unless one thinks the laptop market only exists in the US, this idea simply can't match reality, not yet anyway.

        • scarface_74 2 years ago

          If they are arguing that, don’t you think their anecdotal “proof” is silly?

          > I would never buy an Apple laptop and no one I know has one either.

          >no one I know with a work laptop is supplied a Mac either.

          I don’t know anyone that uses WeChat. Wouldn’t that be a silly argument?

          • debugnik 2 years ago

            Not if the anecdote contradicts an equally unsubstantiated generalisation, honestly. If you're going to complain about the quality of the discussion, do it with both sides, not just the argument you side against.

            • scarface_74 2 years ago

              Well, it’s really not that hard to search on Google for “percent of laptops running ARM” and find something slightly less silly than “no one I know owns a MacBook nor does anyone I work with”

              And the first link is

              https://www.patentlyapple.com/2023/02/apple-currently-owns-9...

                It would be much more substantial than anecdotes about what his friends and coworkers do.

                It’s about as bad as the old Slashdot “do people still watch TV? I haven’t owned a TV in 10 years”

              • debugnik 2 years ago

                  But why do you expect people to put any effort into answering someone who didn't put in any either? Subpar comments are begging for subpar replies.

                If you want to read a non-silly thread, I suggest looking for a non-silly opener, this website has more than average of those.

  • oaiey 2 years ago

    Sounds like a bubble thing. Sure, x86 may go (as will ARM one day), but there is a very solid, healthy, and huge laptop market outside of Apple.

  • aidenn0 2 years ago

    Just had a family reunion. The only Apple laptop was my wife's, out of about a dozen.

    Furthermore their market share has never been over 20% of units shipped[1]

    1: https://www.statista.com/statistics/576473/united-states-qua...

  • MisterBastahrd 2 years ago

    Unless Apple can deliver a sub-800 dollar product, they'll never own the laptop market.

    • r00fus 2 years ago

      Correct. And the fact is, Apple would rather not (as it comes with government antitrust scrutiny). Instead they'll focus on profit share.

  • ngcc_hk 2 years ago

    Strange. Whilst I have 8 macOS devices, 3 iPads, plus 5 iOS devices, I find that on the train everyone uses a small Dell notebook.

    I guess it's like how I like the MacBook 12: that's what you use for a non-desktop-like notebook.

    • dc443 2 years ago

      I've been hoarding Apple devices too. I have a 12-inch MacBook and recently brought it up to Ventura. But the blasted thing won't hold a charge. The battery is fine, over 90% health, but something is causing power drain while it sleeps, even when I try to hibernate it.

      It's impossible to find a use case for the thing. I might have to sell it.

WeylandYutani 2 years ago

"The macOS (OS X) version of this game does not work on macOS Catalina (version 10.15) or later due to the removal of support for 32-bit-only apps"

And that's why x86 is good.
