Show HN: Razer x Lambda Tensorbook

64 points by vimeh 4 years ago · 37 comments


Hi all, long time lurker, first time poster.

I want to share with you all something we've been working on for a while at Lambda: the Razer x Lambda Tensorbook: https://www.youtube.com/watch?v=wMh6Dhq7P_Q

But before I tell you about it, I want to make this all about me, because I built this for me.

See, while I'm genuinely interested in hearing from the community what you think as this is the culmination of a lot of effort from a lot of people across so many different fields (seriously, the number of folks across manufacturing, engineering, design, logistics, and marketing who have had to work together to launch this is nuts), I really just want to tie the larger motivations for Tensorbook as a product back to a personal narrative to explain why I'm so proud.

So, flashback to 2018, and I'm a hardware engineer focusing on the compute system at Lyft's autonomous vehicle (AV) program, Level5 (L5). Here was a project that would save lives, that would improve the human condition, that was all ready to go. I saw my role as coming in to product-ize: to take what was close to the finish line and get it over the line. The disappointment was pretty brutal when I realized just how wrong I was.

It's one thing to nod along when reading Knuth write "premature optimization is the root of all evil"; it's another to experience it firsthand.

At Lyft L5 I thought I would be applying specialized inference accelerators (Habana, Groq, Graphcore, etc.) into the vehicle compute system. Instead, the only requirement that mattered org-wide was: "Don't do anything that slows down the perception team". Forget testing silicon with the potential to reduce power requirements by 10x, I was lucky to get a willing ear to hear my case for changing a flag in the TensorFlow runtime to perform inference at FP16 instead of FP32.
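
For a sense of how small that ask was, here's a rough sketch of the same idea using TensorFlow's Keras mixed-precision API; illustrative only, not the actual runtime flag we were debating at L5:

```python
import tensorflow as tf

# Illustrative only: run compute in float16 while keeping variables in float32.
# This stands in for the kind of runtime-level FP16 switch described above,
# not the actual flag used at Lyft L5.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(224, 224, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    # Keep the final layer in float32 for numerically stable outputs.
    tf.keras.layers.Dense(10, dtype="float32"),
])

# Convolutions and matmuls now execute in FP16 on GPUs that support it.
preds = model(tf.random.uniform([1, 224, 224, 3]))
print(preds.dtype)  # float32 outputs, float16 compute underneath
```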

Don't get me wrong, there were a multitude of other difficult technical challenges to solve outside of the deep learning ones that were gating, but I had underestimated just how not-ready the CNNs for object detection and classification were. Something I thought was a solved problem was very much not, and it ultimately resulted in my team and others building a 5,000-watt monster of a server (+ power distribution, + thermals, + chassis, etc etc) that took up an entire rear row of seating. I'm happy to talk about that experience in the comments because I have a lot of fond memories from my time there.

Anyway, the takeaway I have from Lyft, and my first motivation here is that there is no such thing as over-provisioning or too much compute in a deep learning engineer's mind. Anything less than the most possible is a detriment to their workflow. I still truly believe AVs will save lives; so by extension, enabling deep learning engineers enables AVs enables improvement to the human condition. Transitive property, :thumbsup:

So moving on, my following role in industry was characterized by working closely with the least technical people I have ever had the opportunity to work with in my life. And I mean opportunity genuinely, because doing so gave me so much perspective on the things that you and I here probably take for granted. (How do we know that Ctrl+Alt+T will open a terminal? Why does `touch` make a file? How do I quit vim?)

So, the takeaway from that experience, and motivation #2 for me, is that computers can be inaccessible in surprising ways. I have a deep respect and appreciation for Linux, and I want others to see things the same way, so anything I can do to make the process of "self-serving" or "bootstrapping" to my level of understanding easier is worth doing to me.

So, with those two personal motivations outlined, I present to you, for your consideration, the Razer x Lambda Tensorbook. A laptop with a no-compromise approach to speeds-and-feeds and shipping with OEM support for Ubuntu.

sincerely, Vinay. Product Marketing @ Lambda

pedalpete 4 years ago

I came across this earlier today, and part of me wants one, but I think a bigger part of me doesn't.

I am sure it was a ton of work to get this done, and congrats on that. I'm curious as to why you chose Razer as a partner in this (though I don't know who else I would recommend), and why a laptop?

My experience.

In my last start-up we bought 4 Razer Blades. Well, we actually bought two, but ended up going through 4 machines before we got two working ones! And when I say "two working ones", one of those has a faulty camera, so only 1 actually fully working machine out of 4. So, you can understand my feelings when I say "never again will I buy any Razer product".

But our current consideration, as we're looking at buying a few more machines, is: do we really need laptops as our core development device? We're looking at getting desktops, and cheaper laptops for when we want to do some coding while we're on the train, or travelling, or whatever. We find we are rarely doing heavy-lifting work when we're not at the office.

For what I assume is a similar cost, we can get a nice, small laptop that's underpowered for our needs, and a suitable desktop. Does anybody else see the market going in this direction?

  • alexvoda 4 years ago

    Indeed, that is why my next laptop will preferably only have integrated graphics. For stuff that requires a powerful GPU, a remote desktop or a Thunderbolt eGPU is a more powerful and more flexible option.

    On a laptop I prefer to spend that weight and energy budget elsewhere.

    And I believe the current gen (AMD 6000 & Intel 12th gen) of CPUs (paired with as much RAM as possible) is perfect for this pattern. Both have good iGPUs, and USB4/Thunderbolt.

    Too bad OEMs are sacrificing quality on other parts, like the keyboard.

  • vimehOP 4 years ago

    > why you chose Razer

    Tensorbook is better now.

    > and why a laptop?

    Somebody asked [λ] for one in 2017.

z991 4 years ago

Vinay,

I'm as close to a True Believer for this product as you'll find. I'm typing this from a Blade 15 in Ubuntu I bought 2 weeks ago for deep learning (not knowing about the Lambda collaboration!)

I've been using Razer laptops for years and use Lambda boxes at work. All great computers. As you can imagine, running Linux on these is a chore (my current adventure [1]) so I'm very excited to see some official support.

I have many questions, but the first is something that seems so obvious from the outside: Razer clearly has a battery-failure problem. From my perspective this is because these laptops are plugged in most of the time and have no control to avoid charging to 100% and holding the battery there. Over the lifetime, that's terrible for the battery, and most vendors do something about it: Apple will not always charge to 100% via some mysterious schedule, and an older Dell Latitude I had used a setting in the BIOS to charge to only 90% when plugged in for a long time.
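
On Linux there is at least a standard sysfs knob for this when the firmware cooperates; a minimal sketch, assuming the machine actually exposes charge_control_end_threshold for BAT0 (plenty of firmware, Razer's included as far as I can tell, does not):

```python
from pathlib import Path

# Kernel ABI: /sys/class/power_supply/<battery>/charge_control_end_threshold
# The file only exists if the laptop's ACPI/EC driver exposes it --
# which is exactly the firmware support I'm wishing Razer shipped.
THRESHOLD = Path("/sys/class/power_supply/BAT0/charge_control_end_threshold")

def limit_charge(percent: int = 90) -> None:
    """Stop charging at `percent`% to reduce long-term battery wear."""
    if not THRESHOLD.exists():
        raise RuntimeError("This firmware does not expose a charge threshold")
    THRESHOLD.write_text(str(percent))  # requires root

if __name__ == "__main__":
    limit_charge(90)
```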

Why doesn't Razer get their BIOS vendor to do this? At this point it feels like a reputational problem, and one that is well deserved. I've bought 4 Razer batteries on eBay for 3 different laptops and expect to buy a fifth for this machine in a few years.

Regardless, thanks for making the first laptop I've ever seen that feels like it was made for me. Can't wait to try one out.

[1] https://abarry.org/ubuntu-on-razer-blade-15-2022-advanced/

  • vimehOP 4 years ago

    Andy (Andrew?),

    > Apple will not always charge to 100% via some mysterious schedule

    We prefer to think that "mysterious schedule" and "machine learning" have the same energy. Have you tried `Al Dente` on Mac?

    > an older Dell Latitude... setting in the BIOS

    So we've engaged with American Megatrends to license the BIOS directly... a cross-functional group of business and legal folks at Lambda is currently figuring that out, but I'm excited to report that we know we can version control the BIOS!

    > I've bought 4... 3 different... a fifth...

    Oof; that's a reputation. I do really admire and respect the approach Frame.work has taken towards modular, user-serviceable systems and components. A mindset of sustainability from the outset is something I think we all here strive for.

    From someone here who has a background in chemical engineering, can you ELI5 lithium ion battery degradation?

    Andy, the director of product (Dan) has been/will be reaching out to you.

    Vinay

  • sabalaba 4 years ago

    Nice tutorial! We worked with the Razer BIOS team to find and fix the touchpad problem, and we use the 5.15 kernel to fix a handful of the other hardware issues when running Linux on the Razer hardware.

    We can run the battery issue / suggestion by the Razer BIOS team.

ganoushoreilly 4 years ago

https://lambdalabs.com/blog/lambda-teams-up-with-razer-to-la...

Here's a link with more info for anyone else curious.

ckrailo 4 years ago

Ars Technica coverage: https://arstechnica.com/gadgets/2022/04/razer-designed-linux...

nicomachean 4 years ago

One question: why an 11th gen Intel CPU?

  • rgrmrts 4 years ago

    AFAIK 12th is not yet fully supported on Linux (thread scheduler or something along those lines).

    It should be supported in a coming release last I checked.

  • chrsw 4 years ago

    Someone on Notebookcheck said the CPU choice could be related to the lack of AVX-512 support on the 12th gens. But I have my doubts. Wouldn't the GPU be the most important component for ML workloads in general?

    • nicomachean 4 years ago

      Mostly, but sometimes when I'm prototyping to get quick results I will write CPU-intensive preprocessing steps (i.e. very ugly loops of shame) before converting to matrix algebra for the GPU.
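
      A toy sketch of what I mean; the preprocessing here is made up, purely to illustrate the loop-then-vectorize pattern:

      ```python
      import numpy as np

      # Toy data: a batch of "sensor readings" to normalize per row.
      x = np.random.rand(10_000, 256).astype(np.float32)

      # Prototype version: the "ugly loop of shame", CPU-bound and slow.
      out_loop = np.empty_like(x)
      for i in range(x.shape[0]):
          row = x[i]
          out_loop[i] = (row - row.mean()) / (row.std() + 1e-8)

      # Cleaned-up version: the same math as array algebra, ready to hand
      # off to a GPU framework later.
      out_vec = (x - x.mean(axis=1, keepdims=True)) / (x.std(axis=1, keepdims=True) + 1e-8)

      assert np.allclose(out_loop, out_vec, atol=1e-5)
      ```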

  • supermatt 4 years ago

    because mobile 12th gen not yet available?

  • vimehOP 4 years ago

    Many answers!

daviddever23box 4 years ago

Tell us about the choice of Razer Blade 15 (2021) as the platform (or, as we refer to it, CH570).

  • vimehOP 4 years ago

    "We" is so ominous, so threatening. Why not just say, "we on HackerNews", or "we in the Linux community", or "we at THX"?

    • vimehOP 4 years ago

      We at Lambda would love to talk to you about delivering better driver support for the speakers :)

999900000999 4 years ago

Very very cool product, my one suggestion would be to add an option to dual boot Windows.

Sure, machine learning's fun, but after a long day I just want to play Gears of War. That 3080 looks perfect for gaming.

  • sharms 4 years ago

    There is an option to add Windows when you click through to customize.

    • 999900000999 4 years ago

      Why in God's name is it $500? OEM Windows licenses are basically free.

      Even if you bought it individually it's only $200.

      https://www.microsoft.com/en-us/d/windows-10-pro/df77x4d43rk...

      Definitely leaves a bad taste in my mouth. Looks like you'd save a solid $1,500 by just buying a 3080 Razer laptop and setting it up yourself.

      https://www.bestbuy.com/site/razer-blade-15-advanced-15-6-ga...

      • rfd4sgmk8u 4 years ago

        Negative incentive to discourage Windows use, reducing cost for the vendor, who now does not have to support proprietary drivers. Many vendors have realized that standard components lead to almost complete support in desktop Linuxes today. Upstream chip vendors provide drivers to the open source kernel, bypassing the need to get Microsoft's blessing to sell product to your customers.

        I want a world where you have to pay $500 to keep using that awful proprietary, malware-ridden spyware of an OS. Join us in free and open source Linux land. You have fewer and fewer reasons to pay Microsoft tithes every year.

        • 999900000999 4 years ago

          You're still losing money buying this vs. getting the above laptop I mentioned and installing Ubuntu on it.

          I'm not going to get into an argument about OSs. All 3 major desktop OSes do some things well and some things poorly.

          • sabalaba 4 years ago

            The target market for this is people who don't want to have to manage a Linux machine learning environment. If you want the least expensive access to GPU compute, you should design a workstation, purchase all of the parts yourself on Newegg or Amazon, personally take the risk of a part failing, and fully support yourself. That is probably the least expensive way to get a computer in 2022.

            This is definitely targeted to people who place a premium on their own time and effort.

            • 999900000999 4 years ago

              Can you explain your point?

              I've installed machine learning environments on numerous Linux machines; it's never been too difficult. If you can't figure out how to do that, then you really shouldn't be in this field.

              Here's a guide: https://ubuntu.com/blog/ubuntu-nvidia-rapids

              Even if you get something that's pre-configured, eventually you're going to have to update stuff.

              This feels targeted at people who can expense it to a company account and not worry about the price. There's no logical reason to buy this over the much cheaper, yet equally specced, laptop I posted above. Both are made by Razer; they probably have the exact same warranty if things go wrong.

              • sabalaba 4 years ago

                https://lambdalabs.com/lambda-stack-deep-learning-software is an even better way to install the full stack if you want to do it all yourself. It's free.

                • 999900000999 4 years ago

                  This is why I love HN.

                  I'll definitely use this if I ever return to experimenting with machine learning. This definitely proves my point: there's no need to pay such an insane markup when the exact same manufacturer is selling the same item with slightly different branding.

              • vimehOP 4 years ago

                Sigh, I literally Slack'd Stephen earlier today to not engage on this thread, but now I find myself doing so, too.

                > If you can't figure out how to [install machine learning environments], then you really shouldn't be in this field.

                Fundamentally, I disagree with you here. I read this as a mentality that excludes unnecessarily. I hope we can both come to agree that one should not be excluded from computation and computational tools just because they do not know how to install things.

                > Both are made by Razer, they probably have the same exact warranty if things go wrong.

                That's actually one of the benefits of economies of scale.

                • 999900000999 4 years ago

                  >Fundamentally, I disagree with you here. I read this as a mentality that excludes unnecessarily. I hope we can both come to agree that one should not be excluded from computation and computational tools just because they do not know how to install things.

                  Did you see the downstream comment? There's literally a single command that will set up a machine learning environment on Ubuntu.

                  Regardless, if you can't set it up, how are you going to update it? Do you just buy a new $4,000 computer every time you need to update your machine learning environment?

                  All I see here is Razer fundamentally selling the same laptop, calling one machine learning, and calling the other gaming. I can't understand how installing Ubuntu and some machine learning tools is worth a $1,500 markup.

DarthNebo 4 years ago

A Thunderbolt eGPU device with Mac & PC ML library support would be something of interest to me; upgrading just one component would be better than replacing entire devices every 3-5 years.

vimehOP 4 years ago

http://tensorbook.com

if you're more of a visual person sometimes

terrycody 4 years ago

I would say this is at least one of the best-looking laptops; yeah, Razer really makes good-looking products.

braingenious 4 years ago

This is amazing, how did you fit a TPU into this form factor?

dempseye 4 years ago

I for one do not believe the battery life claims.
