AMD YOLO

geohot.github.io

120 points by trulyrandom 9 months ago · 84 comments

dlcarrier 9 months ago

AMD is probably undervalued, but Nvidia is clearly way overvalued.

I don't think the reckoning will come from AMD stealing Nvidia's market share, it'll come when the hype bubble collapses and businesses start treating neural networks like commodities, running whatever is cheapest instead of the absolute most powerful. AMD is in a great position, because they make both great GPU/NPU hardware and great CPU hardware.

  • Jalad 9 months ago

    > AMD is probably undervalued, but Nvidia is clearly way overvalued.

    AMD trades at a price to earnings ratio of 99, NVDA trades at a PE of ~38. PE isn't everything when looking at companies, but I don't see other reasons to think AMD is undervalued

    • hmm37 9 months ago

      AMD's PE ratio looks high because of their Xilinx acquisition and for tax reasons. You have to look at non-GAAP numbers instead, e.g. AMD's forward PE ratio which is only ~20 or so. Nvidia's forward PE is ~30.

    • deadbabe 9 months ago

      Is PE even anything? If you guided your investing based on sane PE ratios for the past decade you would have been consistently unhappy with missing out on huge gains in tech companies.

      • smallmancontrov 9 months ago

        PE is appropriate for companies going steady. It's not appropriate for companies that are growing or dying.

        Thought experiment: A and B have the same earnings per share, but everyone expects A to double its revenue going forward and B to go steady. Should shares of A go for the same price as shares of B? If you think so, I can front-run you.

        Thought experiment 2: A and B have the same earnings per share, but everyone expects A to halve its revenue going forward and B to go steady. Should they go for the same price? If you think so, I have some bags for you to hold.

        The easy answer is that PEG is more appropriate for growing companies and PB is more appropriate for dying companies (since this is HN, I'll also mention that "Team and TAM" is the metric for seed stage). The hard answer is that there is no substitute for modeling the finances of the company and applying a DCF, but your brokerage app can't do that for you so PE/PEG/PB still have their place.

      • georgeecollins 9 months ago

        In the long run, I believe yes. PE also would not have been helpful in the late 1920s or late 1990s. But things tend to revert.

        One of the reasons different companies sustain different P/Es for a long time is the expectation of growth. META has had a pretty high P/E, but its E has been growing really fast. GM's earnings don't grow so fast, so its P/E is low. Capital efficiency also plays into P/E.

        Disclaimer: I believe stocks represent fractional ownership of an actual company and are ultimately (but not always) valued as such. You can make an argument that financial instruments are just driven by sentiment, supply and demand, and have no correlation to actual reality.

      • _DeadFred_ 9 months ago

        PE is for a stable intelligent market. 'The steady hand' type market. What you are talking about is called gambling. PE doesn't matter for gambling.

      • Jalad 9 months ago

        Agreed, but I don't think that's a good use for PE. PE is useful for comparisons between similar companies, not as an absolute yardstick

    • ein0p 9 months ago

      That's because the market believes AMD will eventually get its head out of its ass on GPUs and start to grow in that segment. Once it starts to grow there, that PE will go up even higher.

    • light_hue_1 9 months ago

      That's too shortsighted.

      Keep in mind that Nvidia gets to charge astronomical prices because AMD's software is crap. Nvidia charges 2-5x as much for equivalent or worse hardware compared to what AMD is selling. That PE ratio will collapse if AMD ever manages to get its act together.

      I'm still astounded AMD has not fired every executive from the CEO down for this obvious multi-year failure.

  • solumunus 9 months ago

    NVDA is valued more appropriately than AMD. NVDA’s valuation based on forward earnings is high, but there are many more extreme examples including AMD.

  • formerly_proven 9 months ago

    Nvidia is, imho, kind of a resurgence of the large-scale 90s Unix systems (a la SGI) [1], not just in pricing and looks but also in the degree of vertical integration. I think this kind of business setup is more vulnerable to competition from below than people like to think, and it really only takes 1-2 product misses for a big shakeup.

    [1] Co-developed proprietary software stacks running on highly proprietary and non-standard hardware targeting very specific workloads.

  • Jlagreen 9 months ago

    The way I see it is that Nvidia might reach $1 trillion in revenue before AMD reaches $100 billion in revenue. So the upside in revenue growth is higher for Nvidia.

    People assume that because a company has grown very large very quickly, it can't grow much more. But there is clear evidence that Nvidia continues to dominate AMD's offerings even now that the latter has a competitive product. So the metric for Nvidia isn't Nvidia vs. AMD but the growth of the AI market overall.

  • ZeroTalent 9 months ago

    NVIDIA is one of the most undervalued companies in the SPX looking at fundamentals, even disregarding what they are planning next (subscription services for CUDA, humanoids, etc.).

    Take a look at their last quarter's income statement graph: https://i.imgur.com/mQwZ5o4.png - only once every few years do I see a Sankey graph that looks like that. And it has only been growing over the last 10 years.

  • xnx 9 months ago

    Is it possible someone will write a CUDA to AMD/Tensor/whatever transpiler (high-level emulator? I'm not sure of the right term)? I thought there were remarkably few ops that GPUs perform. It seems like a very high premium to pay for not wanting to rewrite in JAX or whatever.
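
    For a rough sense of what such a source-level layer does, here is a toy sketch (Python, illustration only): much of the CUDA runtime API maps one-to-one onto AMD's HIP, so a big part of a port is mechanical renaming. AMD's real tools for this are hipify-perl/hipify-clang, which work on the source properly rather than with string replacement; the API name pairs below are real, the rest is just for flavor.

        # Toy sketch of the "hipify" idea -- not the real tool.
        CUDA_TO_HIP = {
            "cuda_runtime.h": "hip/hip_runtime.h",
            "cudaMalloc": "hipMalloc",
            "cudaMemcpy": "hipMemcpy",
            "cudaMemcpyHostToDevice": "hipMemcpyHostToDevice",
            "cudaMemcpyDeviceToHost": "hipMemcpyDeviceToHost",
            "cudaFree": "hipFree",
            "cudaDeviceSynchronize": "hipDeviceSynchronize",
        }

        def toy_hipify(source: str) -> str:
            # Replace longest names first so cudaMemcpyHostToDevice isn't
            # clobbered by the plain cudaMemcpy rename.
            for name in sorted(CUDA_TO_HIP, key=len, reverse=True):
                source = source.replace(name, CUDA_TO_HIP[name])
            return source

        print(toy_hipify("cudaMalloc(&p, n); cudaMemcpy(p, h, n, cudaMemcpyHostToDevice);"))

    As the replies below point out, this renaming step was never the hard part; driver stability and kernel-level performance after the rename are.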

    • smallmancontrov 9 months ago

      Lack of translation layer wasn't the problem, AMD's shit being completely broken (for this purpose) was the problem. Driver-level broken, possibly hardware-level broken. Black screens, restarts, wrong answers, threads full of people shouting into the void year after year about the same behavior before seeing the light and buying nvidia.

      Then NVDA pumped by trillions and AMD did not, and even AMD's crack team of trained denial specialists could no longer stay the course, so they started to turn the ship. But that was only a year or two ago, and even the tiniest changes take years in hardware land, so the biggest sins are still baked into the latest chips - but at least the software has started to suck a bit less.

      I no longer see 100% of the people who try to build on AMD compute run away screaming. Many do, but not all. That's a change, an important and positive one. If AMD keeps it up maybe they can save us from 80% margins on matrix multiplication after all.

    • drexlspivey 9 months ago

      AMD hired 1 guy to do it and then fired him

      • xnx 9 months ago

        I wondered if John Carmack would take his strong knowledge of low-level hardware and his interest in AI and work on this, but he seems to be working on a non-LLM flavor of AI.

  • rapsey 9 months ago

    > but Nvidia is clearly way overvalued.

    No, it is not, actually. They are making insane amounts of money and have very strong forward guidance. With the drop in the last month it is actually cheap (low PEG ratio). When the market turns, Nvidia is likely to soar once again.

  • ein0p 9 months ago

    > Nvidia is clearly way overvalued

    Tell me you haven't looked at Nvidia's financials (especially the margins) without telling me. It basically prints money, now and in the foreseeable future, and all of its products are permanently sold out, even at the insane prices Nvidia is charging.

    • eightysixfour 9 months ago

      > foreseeable future

      I think this is the arguable part. The more valuable AI compute becomes, the more incentive there is to invest in getting around their software moat. Their hardware is good, but not unassailable. I think their modest (by tech hype standards) P/E is recognition of this.

      • ein0p 9 months ago

        While I agree with you in theory, in practice things have been somewhat less encouraging. So far AMD hardware _doesn't even work_. That's why Geohot had to write his own stack. Moreover, it also doesn't sell - there are maybe one or two obscure cloud providers for it. No major cloud provider has it, or is going to have it anytime soon, until the software and driver problems are figured out. Why? Simple - they have no interest whatsoever in disappointing their customers and/or spending all of their support engineering time on obscure GPUs that they'll have to charge less for in order to see any uptake at all. Even Intel (!) has a more robust offering with Gaudi 2/3, and they're having a heck of a time getting large deployments anywhere outside Intel Developer Cloud.

      • baal80spam 9 months ago

        But they are making strides in robotics already. Jensen is super smart, business-savvy AND hard-working. These are some of many reasons I own NVDA.

        • itsoktocry 9 months ago

          >Jensen is super smart, business-savvy AND hard-working

          Those traits are table stakes for running a Fortune 50 company. Before 2019 when the AI boom came out of nowhere, what was going on? Nvidia was an okay company, but not a real over-performer.

          • Jlagreen 9 months ago

            What are you talking about?

            Nvidia has been a high-margin and great-performing business for the last decade. Nvidia had better gross margins than Apple 10 years ago, selling only gaming GPUs, and that's in a market where you can easily swap out the card in a PCIe slot.

            From 2015 till 2022, Nvidia had several years with 50-60% revenue growth. People only look at the recent 2 years and think that Nvidia was "OK" before, but I've been invested since 2016 because Nvidia kicked its growth into gear back in 2014/2015.

            Jensen decided decades ago that Nvidia is a premium brand and has positioned the company accordingly. What many don't get is that Apple has only 25% unit share but 75% profit share, so Apple basically concentrates the profit of the smartphone business. Nvidia will do the same. They had better gross margins a decade ago than AMD has today. AMD might gain unit share but will never reach Nvidia's margins, because AMD is a market follower and, unlike Nvidia with its premium solutions, will never be able to set pricing.

            Jensen also made CUDA possible. Intel, on the other hand, killed such projects, along with their first GPU project, and didn't invest in OpenAI, and so on. Jensen is by far the best CEO, and fortunately he doesn't go crazy like Musk does.

        • eightysixfour 9 months ago

          Honestly, I'm having a hard time with my sarcasm detector here. Plenty of others working on robots. Plenty of smart, business savvy, hard working CEOs that didn't win the next round of the infinite game.

          I like Nvidia, I think they're doing good work, but I don't think their current trajectory is some sort of well-moated flywheel the way some others think it is. Selling shovels during a gold rush can make you rich; it doesn't mean you'll still be selling shovels in 100 years.

          • littlestymaar 9 months ago

            > Selling shovels during a gold rush can make you rich, it doesn't mean you'll still be selling shovels in 100 years.

            This is the best way to put it, it's exactly this.

      • rapsey 9 months ago

        Their modest P/E ratio is a consequence of growing their earnings so much.

        • georgeecollins 9 months ago

          And also there is some recognition that it's going to get really hard to grow them as much proportionately. AMD could increase its earnings a lot pretty quickly, but it's hard for NVIDIA to grow like that for the same reasons it is hard for Apple to grow. You are already getting so many of the $ available to spend in that area.

        • eightysixfour 9 months ago

          And yet, when other companies have grown their earnings like mad, their price has outgrown the earnings, driving their P/E up further, because the market expects higher future earnings. Nvidia is reaping the rewards of what it has built up to now, but isn't necessarily expected to continue to hockey-stick.

          Their P/E is approximately the same as the rest of the S&P 500 technology sector's average.

    • itsoktocry 9 months ago

      >Tell me you haven't looked at Nvidia's financials (especially the margins) without telling me.

      What happens to industries that have huge margins? They get compressed by competition. The biggest companies on the planet, literally, are figuring out how to not pay Nvidia those huge margins.

      • Jlagreen 9 months ago

        Yes, we can see how Big Tech and Apple's huge margins get compressed all the time, they never expand. They are going to zero soon it seems :(

        You can't say that in general because it also depends on the moat.

        Apple has 75% profit share in the smartphone market not because they have the best smartphone but because they sell iOS. This is why Apple can charge much higher margins on iPhones than any other smartphone competitor. Competitors use Android and are basically interchangeable, so their hardware is more or less a commodity and margins are much lower.

        The same will happen with Nvidia. Nvidia will offer complete data center solutions and many other SW/HW solutions for AI and accelerated computing and will charge high margins for that. HW sellers like AMD will sell only chips which will compete with commodity ASICs.

rs186 9 months ago

I have trouble distinguishing this post from those on r/wallstreetbets. To be honest, I have seen quite a few posts on wsb that are much more informative than this

  • qwery 9 months ago

    The author is George Hotz, who most famously developed some iOS and PS3 exploits and got sued by Sony. I have little interest in the content of this particular article, and I think your evaluation of it is fair.

  • y-curious 9 months ago

    Well, I am generally inclined to take DD from geohot over anonymous wsb posts.

    • AnotherGoodName 9 months ago

      It's not interesting though. He was sent an MI300X, released in 2023, to develop his AI stack with.

      The MI300X was lacking in a few areas vs. the H100 at the time; overall perf and power efficiency were two big ones. Power efficiency is critical atm; that's seriously the biggest barrier to scaling datacenter rollout right now.

      AMD's next card might be better on this front, it might not. But this article doesn't say anything about the next card. It's referring to a card from 2023.

Cornbilly 9 months ago

Do all of geohot's posts come off as manic to anyone else?

  • karolist 9 months ago

    Yes, but that's just his personality. His mind seems to be racing at 200mph whilst the output device (hands, keyboard etc) can't keep up, so some context gets dropped here and there. I remember I had a hard time watching his streams because he'd type at 160wpm or so, but half of the keypresses were correcting mistakes...

  • asadm 9 months ago

    I like this actually. There is no fluff or weak, suggestive hinting at claims; it is what it is.

  • stefan_ 9 months ago

    It ain't March 8 either

  • slater 9 months ago

    You should see his livestreams. Let's just say, he computers the same way his writing comes across (all while swearing up, down and sideways that he's not on anything /s ;) )

htrp 9 months ago

> Bought in for a quarter million. Long term. It can always dip short term, but check back in 5 years.

So basically he got 2 MI300s and is currently trying to pump AMD?

parsimo2010 9 months ago

"I’m betting on AMD being undervalued, and that the demand for AI has barely started."

I'd love to see AMD get a multiplatform product so mature that I can pip install PyTorch on Windows or macOS with an AMD card (https://pytorch.org/get-started/locally/). But I don't think their market cap will change quickly even if this happens. Many people have bought AMD cards in the past because they are cheaper and then died waiting for AMD to ship a mature CUDA equivalent. Nobody is going to rush to buy AMD cards as soon as the software is good; they will gradually change over as they replace NVDA hardware, and not everyone is going to make that choice.
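
A rough sketch of where that stands today on Linux (the ROCm version in the wheel index URL changes between releases, so treat the exact URL below as a placeholder and check the selector page for the current one):

    # Install a ROCm build of PyTorch on Linux, e.g.:
    #   pip install torch --index-url https://download.pytorch.org/whl/rocm6.2
    # (the rocmX.Y suffix tracks whatever ROCm release the current wheels
    # target -- see pytorch.org/get-started/locally for the exact command)

    import torch

    # The ROCm build reuses the torch.cuda namespace, so existing "CUDA"
    # Python code runs unchanged on a supported AMD GPU.
    if torch.cuda.is_available():
        print(torch.version.hip)              # HIP/ROCm version string (None on CUDA builds)
        print(torch.cuda.get_device_name(0))  # the AMD card PyTorch sees
        x = torch.randn(1024, 1024, device="cuda")  # "cuda" maps to the ROCm backend here
        y = x @ x

The gap the comment points at is the Windows and macOS rows of that selector page, where (at least as of this thread) a ROCm option isn't offered.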

If I were making a bet (and I'm not), I'd bet that NVDA is overvalued right now and their growth will slow to correct this but it won't crash, and I'd bet that AMD will gradually increase in value to reward them for software investments, but it won't spike as soon as their software looks good. Neither of these things would I want to put a lot of money on, since they are long term bets, and if you're going long then you might as well just invest in the broader market. And even if I thought that NVDA was going to crash and AMD was going to spike, I still wouldn't bet because I have no idea whether it would happen in the next 6 months or 6 years.

wavemode 9 months ago

Unfortunately I think the current AI hype cycle will die down before AMD ever gets a chance to benefit from it in terms of stock price.

AMD is not undervalued, rather it is Nvidia that is overvalued.

blackeyeblitzar 9 months ago

I saw a video the other day that showed a new AMD laptop processor that is comparable to the Apple processors in performance and battery life. That was very surprising and also a great thing for Windows or Linux laptops. But at the same time, the market for these and the potential for profit isn't really that big. Consumers are willing to pay a premium for Apple but not for anyone else.

georgeecollins 9 months ago

I would really love it if people on Hacker News could weigh in on how much of a moat they think CUDA really is. As in: How hard is it to use something else? If you started a project today how much would you want to get paid to not use CUDA?

A lot of readers on this site have good insight into this, and it is a key question that financial people are asking without the knowledge many people here possess.

  • czk 9 months ago

    SemiAnalysis has a nice write-up on MI300X vs H100/H200 and concludes that the CUDA moat is still very real: https://semianalysis.com/2024/12/22/mi300x-vs-h100-vs-h200-b...

    "As fast as AMD tries to fill in the CUDA moat, NVIDIA engineers are working overtime to deepen said moat with new features, libraries, and performance updates."

  • BitwiseFool 9 months ago

    AMD's competitor to CUDA is ROCm. Historically, AMD has been hobbled by the quality of their drivers and because they sold less performant hardware. AMD has traditionally been the budget option for both CPUs and GPUs. Things have changed in the CPU space because of Ryzen, but sadly AMD has not been able to realize an equivalent competitive advantage in the GPU space. Intel has also entered the GPU market, but they are even farther behind than AMD. The same problems I am about to describe apply to them as well, to a higher degree.

    Rewriting CUDA programs to run using ROCm is expensive and time consuming. It is difficult to justify this expense when in all likelihood the ROCm version will be less efficient, less performant, and less stable than the original. In the grand scheme of things, AMD hardware is indeed cheaper but it's not that much cheaper. From a business standpoint, it's just not worth it.

    Knowing what I know about how management thinks, even if AMD managed to make an objectively superior product at a much better price, institutional momentum alone would keep people on CUDA for a long time.

    • JohnBooty 9 months ago

          AMD has been hobbled by the quality of their drivers 
      
      
      I always hear this and I believe it, but I've never been able to find any insight about what exactly is holding them back.

      Given the way nVidia is printing money, surely it absolutely cannot be a lack of motivation on AMD's part?

      This is a very uninformed thought as I have no experience writing drivers, nor am I familiar with the various things supported by CUDA and ROCm. But how is AMD struggling with ROCm compute drivers, when their game drivers have been plenty stable as far as I have experienced? Surely the surface area of functionality needed for the graphics drivers is larger and therefore the compute drivers should be a relatively easier task? Or am I wrong and CUDA has a bunch of higher-level stuff baked into it and this is what AMD struggles to match?

           and because they sold less performant hardware.
      
      Does anybody have any insight into specifically what part of compute performance AMD is struggling to match? Did AMD bet on the wrong architectural horse entirely? Are they unable to implement really basic compute primitives as efficiently as they want because nVidia holds key patents? Did nVidia lock down the entire pool of engineers who can implement this shit in a performant way?

      I mean, aside from GPU compute stuff, it sure looks to me like AMD is executing well. It doesn't seem like they're a bunch of dunces over there. Quite the opposite?

    • czk 9 months ago

      Never underestimate the power of institutional momentum! cough IBM AS400

  • jononor 9 months ago

    One aspect that influences this is how close to the bleeding edge one needs to be, and how niche the model/application is. ROCm lags by some years, and application/model/framework developers test less on it, which can be problematic in niches. For doing something very established, like say image classification, that does not really matter - 3-year-old CNNs will generally do the trick. But if one wants to drop in some model X that was just put on GitHub/HuggingFace in the last year, one would be buying a lot of trouble.

  • ChocolateGod 9 months ago

    > could weigh in on how much of a moat they think CUDA really is.

    There's movement to implement CUDA libraries that work on non-Nvidia cards, but I guess adoption could be hindered by legal fears.

    https://github.com/vosen/ZLUDA

  • r1chardnl 9 months ago

    It shows up whenever a new AI model gets released and made available to the public. The last few I've tried were always NVIDIA-only, because, I assume, that's what the researchers had at their disposal.

  • ljlolel 9 months ago

    So why give the valuable knowledge away for free?

LorenDB 9 months ago

RDNA 4 is proving that AMD can be competitive in GPUs. Is it on par with Blackwell? No. Is it a bigger improvement over the previous gen than Blackwell is? Yes, at least if you consider consumer pricing/marketing.

  • htrp 9 months ago

    It'll be competitive just in time for them to move to UDNA.... or have they walked that back too?

alecco 9 months ago

I like the guy but he is wrong. Besides the CUDA ecosystem, Nvidia hardware has a lot of features and a lot of things yet to optimize - like the optimizations DeepSeek did (non-CUTLASS custom kernels and DualPipe). I think Nvidia's current hardware has plenty of legs, and that's why they are in no rush to release next-gen chips.

The actual challenger is Cerebras: no need to load (VRAM->SRAM) all the parameters for every batch. But they have yet to prove they can scale and support the custom stack. We'll see.

fangpenlin 9 months ago

I heard that Nvidia's graphics cards are best in class in terms of the power consumption vs. TFLOPS ratio. I wonder what the numbers are for AMD vs. Nvidia? I would like to see them, because power consumption is going to be a big part of the cost of AI training; in comparison, the hardware itself might not be that expensive in the long run.
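
The ratio itself is trivial to compute once you decide which throughput figure to trust; the sketch below is just that arithmetic, with made-up placeholder numbers where the spec-sheet values would go (dense vs. sparse and FP16 vs. FP8 figures differ wildly, and real training efficiency also depends on achieved utilization, not peak TFLOPS):

    def tflops_per_watt(peak_tflops: float, board_power_watts: float) -> float:
        """Crude figure of merit: peak throughput per watt of board power."""
        return peak_tflops / board_power_watts

    # Placeholder values only -- substitute the peak dense FP16/BF16 (or FP8)
    # TFLOPS and board power from the vendor spec sheets of the cards you want
    # to compare. Marketing numbers often quote sparse throughput, which
    # roughly doubles the figure.
    cards = {
        "accelerator A": (1000.0, 700.0),
        "accelerator B": (1200.0, 750.0),
    }

    for name, (tflops, watts) in cards.items():
        print(f"{name}: {tflops_per_watt(tflops, watts):.2f} TFLOPS/W")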

mika6996 9 months ago

AMD is finally starting to make progress with their software stack. Who would have thought?

  • fabiensanglard 9 months ago

    Does the post confirm anything about software besides "believe"?

    It seems AMD is just sending more hardware. As far as I know, the drivers are still lacking.
