Lenovo’s IdeaPad Slim 7 is a showcase for AMD’s exceptional new processor

theverge.com

210 points by vlangber 5 years ago · 200 comments

notacoward 5 years ago

I have one. The US version "only" has a 4800U, but it was still an absolute steal at US$800. It's a nice size, feels solid, and lasts all day easily. I've used it all day more than once and not seen the battery go below 50% yet. Then again, I haven't really pushed it that hard. I'm sure if I played some games it would "only" last five or six hours. And no, I haven't had any issues with the screen. In fact I turn it down. If you're staring at a 500nit screen at full brightness all day, you're probably not doing your eyes any good.

Unfortunately this model seems to be sold out or perhaps even discontinued (until the European 4900U version becomes available maybe). Meanwhile, the E14 Gen 2 is very similar in both specs and price.

  • formerly_proven 5 years ago

    > If you're staring at a 500nit screen at full brightness all day, you're probably not doing your eyes any good.

    I'm kinda surprised how many people set their screens to 100 % brightness (300-400 nits for most screens) on their desktop. I find that blinding and very uncomfortable. That also seems to be a reason why people complain so much about IPS bleed and glow; using the screen near full brightness in a dark room for gaming or movies. Personally I find the "0 %" setting on some screens where that's around 50 nits too bright for that (ahem LG).

    • chrismorgan 5 years ago

      It’s hard to change the brightness of desktop monitors. Bafflingly, Windows and macOS don’t support adjusting their brightness the way they do for laptops, despite the existence of DDC/CI to do just that, so you’re left using third-party software if you know about it, or interacting with the awful OSD/buttons on the screen, which you’ll hardly want to do all the time.

      https://news.ycombinator.com/item?id=24316728 is a recent comment chain about this stuff.

      • MrGilbert 5 years ago

        For Windows users, I can recommend ClickMonitorDDC[1]. While the UI is a bit cluttered, it has a neat feature:

        You can display the current brightness in the notification area, hover over it with your mouse, and use the scroll-wheel to adjust it. I really like it.

        [1]: https://clickmonitorddc.bplaced.net/

        • GordonS 5 years ago

          I actually think I tried this one yesterday, and from memory it spammed my traybar with about 12 different icons. The UI isn't just cluttered, it's odd!

          Still, it's a nice demonstration of what you can do with DDC, and just about anything is better than the crappy physical controls on monitors.

          • bllguo 5 years ago

            you can disable them in the settings. it gave me icons for brightness, contrast, saturation, and volume. I only care about setting brightness so I disabled the others

        • greggyb 5 years ago

          Thank you! This has been an annoyance for a long time for me. Appreciate the share!

        • Skunkleton 5 years ago

          Yep, that is very helpful. Thanks!

      • pmontra 5 years ago

        I remember when TV sets and monitors had dials to adjust brightness, contrast and volume. They became useless on TVs because remotes are better, but they'd still be very useful for monitors that stay all the time within reach of the user. Much better than menus.

      • skammer 5 years ago

        I recently discovered MonitorControl[0], which is a Mac app that listens to your default brightness and volume keys and pushes updates to all connected monitors over DDC. Extremely happy with it.

        [0]: https://github.com/MonitorControl/MonitorControl

        • maven29 5 years ago

          I'll quote an HN discussion from a few weeks ago on why this isn't more prevalent.

          >There is a problem with "cheap" monitors and DDC/CI: some of them use EEPROMs to store brightness settings, and this limits you to about 100,000 writes. Worrying about this is the main reason we don't ship DDC/CI with f.lux. (I know that some more modern monitors use NAND and don't have limitations like this.)

          https://news.ycombinator.com/item?id=24344696
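
          A minimal sketch of how a brightness tool could limit that EEPROM wear: debounce the writes so a burst of slider movement collapses into a single DDC/CI update. The half-second delay and the ddcutil call are illustrative assumptions, not anything f.lux actually ships.

              import subprocess
              import threading

              DEBOUNCE_SECONDS = 0.5   # assumed settle time before committing a write
              _pending = None

              def _write_brightness(display: int, value: int) -> None:
                  # VCP feature code 0x10 is the standard DDC/CI brightness control
                  subprocess.run(["ddcutil", "--display", str(display),
                                  "setvcp", "10", str(value)], check=False)

              def set_brightness_debounced(display: int, value: int) -> None:
                  """Collapse bursts of slider movement into one write to the monitor."""
                  global _pending
                  if _pending is not None:
                      _pending.cancel()
                  _pending = threading.Timer(DEBOUNCE_SECONDS, _write_brightness,
                                             (display, value))
                  _pending.start()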

      • thesandlord 5 years ago

        Another HN comment a few months ago let me discover Twinkle Tray. It's a Windows app that sits in your tray icons.

        https://twinkletray.com/

        Super cool, you can even link multiple monitors together to control the "global" screen brightness at once.

      • TN1ck 5 years ago

        I didn’t like the available offers, so I built my own shitty macOS app for that once, you can get it here: https://github.com/TN1ck/BrightnessChanger

        Maybe it interests somebody.

        • mjcohen 5 years ago

          Thank you! Works on my 2014 Mac Mini connected to an HP 25f monitor on OS X 10.13.6.

          Did not work through an HDMI switcher but works fine when directly connected.

      • formerly_proven 5 years ago

        I dunno. The OSD on Dells is pretty simple. You press the topmost button twice, up/down to adjust, lowest button to save and exit. Since my office has windows, I've been doing this two to four times a day on most days for years. Doesn't bother me too much (it's pretty quick though because I largely only use the 0-40 range, which is about 20-120 nits). On my LG it's even quicker, push the joystick back, then forward/back to change, push in or timeout to save and exit.

        Some people say they'd like ambient light sensors, but I'm not sure I'd wanna use something like that. Sometimes I do find the changing brightnesses on mobile devices irritating.

        Changing this directly in the OS would be a better UX, though.

        • tjoff 5 years ago

          That might work if you only have one display. But on more than one that gets tiresome real quick.

          On linux I use ddccontrol to control the brightness on all displays at once (using a simple for loop in the terminal).

          I plan to write a script for it that would allow me to bring up a dialog and enter brightness from just a shortcut, but this is good enough til that itch comes.

          I'm sure there are alternatives for all operating systems.
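
          As an illustration of that kind of loop, here's a minimal Python sketch that pushes one brightness value to every display. It uses ddcutil rather than ddccontrol, and the display numbers are an assumption you'd adapt to your own setup (ddcutil detect lists them).

              import subprocess
              import sys

              DISPLAYS = (1, 2)   # ddcutil display numbers on this machine (assumption)

              def set_brightness(value: int) -> None:
                  """Set the same brightness on every DDC/CI-capable display."""
                  for display in DISPLAYS:
                      # VCP feature code 0x10 is the standard brightness control
                      subprocess.run(["ddcutil", "--display", str(display),
                                      "setvcp", "10", str(value)], check=False)

              if __name__ == "__main__":
                  set_brightness(int(sys.argv[1]))   # e.g. `python brightness.py 40`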

        • cillian64 5 years ago

          I have a Dell U2415 which is great for this. You can create custom colour/brightness/contrast profiles and assign them to hotkeys. I have one nice bright one for dark terminals or when the sun's out, and a darker lower contrast one for when I have a bright white website or document to read and the office is a bit dimmer. It's two quick button presses to change between these presets.

          I think I'd quite like something which adapted brightness and contrast to ambient light as well as the brightness and contrast of what's on screen. With mobile devices this can be a pain as you move around and in and out of shadows, but I feel it could work a lot better on a desktop display. The display could even have the light sensor on its back so it can work out what its backdrop looks like.

        • moonbug 5 years ago

          that's a shit-ton of clicks compared to a pair of up/down buttons.

      • bserge 5 years ago

        I haven't had a monitor in ~5 years, but the ones I owned had physical buttons for adjusting brightness/contrast/gamma/etc... Do new monitors come with no in-built controls?

        • SahAssar 5 years ago

          They do, but the point is that they are usually not as convenient to change as the controls you already have in your hands (mouse + kbd)

      • 3np 5 years ago

        I paid a significant markup for an Eizo monitor, not because I care about the color accuracy and all that, but for the OSD, which I don't mind using to adjust things several times a day.

      • banach 5 years ago

        For macOS, Lunar does what you want: https://lunar.fyi

    • jve 5 years ago

      > I'm kinda surprised how many people set their screens to 100 % brightness

      Not everyone knows that this impacts your eyes. Someone called me and asked if I knew anything about glasses that would restrict blue light and make his eyes feel better. After telling him about f.lux, we ended up finding out that his brightness was at 100%. He lowered it to 50% and said: thank you, much better.

      • GordonS 5 years ago

        I've been staring at a screen for decades, but only fairly recently found out about the harm of blue light, and am using f.lux with the "soft light" effect.

        Aside from blue light, is it bad for your eyes just because of the brightness level? Is there any kind of objective measure of how bright is "safe"? And if so, is there any way of knowing how many nits a monitor is emitting?

        • notacoward 5 years ago

          AIUI the problem is mostly the blue light. I can't imagine that becoming habituated to staring at a bright light is all that good either, but I don't know of any specific studies on that. It's also the kind of thing that only shows up statistically after twenty years, as a higher incidence of inability for the eye to accommodate to lower light levels etc. Nobody's going to look at a too-bright screen for a day or two and immediately notice a drastic difference.

        • jve 5 years ago

          Someone advised putting a piece of white paper next to the monitor when adjusting brightness, so the screen ends up roughly as bright as the paper.

    • scns 5 years ago

      Do you have blue eyes? I do, and in summer I can't look at the ground without being blinded. I read somewhere that the lack of pigments lets more light through. It comes with the upside of seeing better when it is dark. I use redshift on all computers plus switch to a different colour profile on my monitor at night. On mobile I use night mode all day.

      • ezconnect 5 years ago

        I don't have blue eyes, but when I was younger I could read in the dark and hated high-brightness monitors. Now that my eyes are old I've lost the ability to focus and need glasses, and sometimes need the monitor to be brighter to help with focusing.

    • bserge 5 years ago

      I don't think I ever used 100% brightness on LED backlit displays. They're just too bright. Mine are currently sitting at 75%, and I had AOC monitors (with LG panels heh) that were too bright at even 50%.

      Sadly, anything below 50-60% and I notice the PWM, sort of flickering, and it's really tiring - I know I'm not buying the most expensive stuff but I'd like them to use better backlight controllers.

    • bl4ckm0r3 5 years ago

      Last time I talked with an eye doctor (some months ago), he mentioned that the ideal screen brightness should match that of the environment around you. If the screen is darker your eyes work harder, and working without ambient light (a dark room) is bad because you blink less!

      • novok 5 years ago

        Screens that can go brighter are probably better for us, because it means we can use it in sunnier / brighter environments properly, with all the health benefits that sunlight can give us.

        • notacoward 5 years ago

          That's more of an argument for reflective/transflective screens. Trying to overcome both sunlight and reflections with a transmissive screen doesn't work very well, and only exacerbates the eye-health problem. Not great for battery life either. That's a lot of problems to be solved before there's even a chance that people would use their computers more outdoors, and even then the odds are slim. Maybe people should get outside away from their screens.

          • novok 5 years ago

            That's like saying that we shouldn't read books on the beach. You have to meet people where they are.

            A secondary reflective screen on the outside of a laptop might be the better solution, but from what I remember reflective screens usually don't have good colors. The last TFT reflective screen I remember has been hard to find an outdoor picture of, though.

            • notacoward 5 years ago

              > That's like saying that we shouldn't read books on the beach.

              It's nothing like that. People do read books outside. They don't use laptops outside. (Statistically speaking.) "Meeting people where they are" means two very different things in those different contexts.

              > reflective screens usually don't have good colors

              Yes, you have to pick your tradeoffs. If you want an outdoor-viewable screen, that will probably mean some sacrifice in refresh time, resolution and/or color gamut. This is why the only place you do find reflective screens is dedicated e-book readers. Brighter transmissive screens are not a solution in this problem space, so this use case does not justify them.

    • kitsunesoba 5 years ago

      I use my desktop display at 100% brightness because turning it down means its great color and contrast capabilities go to waste due to its traditionally backlit nature… it’s like downgrading my monitor to a cheaper model.

      This effect isn’t nearly as strong on my phone’s OLED. With that, I can turn brightness down quite a long way and still have great color and contrast.

      OLED/microLED desktop monitors can’t come soon enough.

    • jackbravo 5 years ago

      I always used to set my monitor to 100% brightness, back when I wore glasses. I had an eye operation 2 years ago, and from the first day I could tell that it was too bright. Normal prescription glasses have light filters that make you not notice this type of thing.

    • LargoLasskhyfv 5 years ago

      Same here. In the evenings and nights with warm-white lighting, 12% brightness and 10% contrast; during the day usually between 25% and 33%, and even during the brightest direct sunshine at noon no more than 50%.

  • foxdev 5 years ago

    I wasn't entirely sure about the mere R5 4500U in the laptop I bought recently, but it tears through everything I throw at it. The only thing that stopped me from getting one of the higher-end R7s was that all the laptops with it and a screen with full sRGB coverage were persistently sold out, but I don't feel like what I got falls short with any of the tasks I put it to.

    Whatever dark forces AMD aligned itself with for the latest chips was worth it.

  • justaguyhere 5 years ago

    I just checked the lenovo website, it says not available. When did you get yours?

    • notacoward 5 years ago

      I've had it a bit over a week. The E14 Gen 2 still seems like the closest alternative - slightly slower processor, more memory, less battery. Unfortunately the Labor Day sale is over so it's around $1200 now.

      • keldaris 5 years ago

        The E14 also has a proper Thinkpad keyboard (or the best you can still get, at least). The difference is profound.

ZuLuuuuuu 5 years ago

For people interested in ultrabooks with Ryzen 4800U processors, there is also another model that for some reason doesn't get mentioned much: the Lenovo S540 13ARE. It has a 13.3", QHD, 16:10 screen.

I bought this laptop after buying a Lenovo Ideapad/Yoga Slim 7 and returning it (because of some QA issues and 14" was a bit too big for me after using XPS 13 for a long time). I made a small review of the laptop here: https://www.reddit.com/r/Lenovo/comments/ilcw5n/lenovo_ideap...

  • ChuckNorris89 5 years ago

    Unfortunately the machine is not available in Europe outside of the Netherlands for some reason.

    I tried calling a Dutch retailer to ask if they ship to Austria but couldn't even get past the "Do you speak English please?" phase :(

    Like WTF, the EU single market has been a thing for how many decades now?! So why the hell do we still have region-specific SKUs of the same product with different parts and availability between EU member states?! Imagine if the laptop were available for sale in California but not in Utah, and in California you could only get it with a 512GB Samsung SSD and a 300-nit display, and in New York only with a slower 1TB SK Hynix SSD and a 400-nit display! /rantover

    • throwaway2048 5 years ago

      Because charging different countries different amounts of money for the same SKU is illegal in the EU, so instead every country gets its own SKU, and they don't ship to other countries.

      https://europa.eu/youreurope/citizens/consumers/shopping/pri...

      • ChuckNorris89 5 years ago

        Huh, interesting, thanks for that. But then why don't we get different SKUs of PS4/Xbox and others between EU member states? Or do we?

        • throwaway2048 5 years ago

          If the pricing is different, it's very likely you do; it's just minor things changed that nobody ever notices.

        • nawgz 5 years ago

          Someone else mentioned keyboard localization - I guess with gamepads instead of keyboards, all localization is software-based

    • jmnicolas 5 years ago

      You found the ONE Dutch person who doesn't speak English? I went there about 15 years ago; everybody I met, including old ladies in the street, was speaking much better English than I can (I'm French).

      • ChuckNorris89 5 years ago

        I don't think the guy didn't speak English, I think he was too rude or indifferent to bother offering customer support in English at that moment. You see this in Austria too if you try to get customer support over the phone in English: lots of employees are stressed from this kind of job, so some will just not bother with you, with a "not my job, I don't get paid to be a translator" attitude, if you ask in English.

        • KSS42 5 years ago

          That reminds me of when I visited Vienna many years ago. I went to the information counter at the train station. I didn't want to be rude and presume the information person spoke English. So, I asked if he spoke English. His bemused reply was "Yes, I speak English and 7 other languages".

        • ableal 5 years ago

          Some people may also be leery of scams in the "hallo this is windows support" vein ...

    • alkonaut 5 years ago

      Many European countries have specific keyboards so for laptops you need a dozen or more models just for Western Europe. I wouldn’t want to accidentally get a German or French SKU when I’m buying one from Sweden so I’m quite happy there are different SKUs.

    • ZuLuuuuuu 5 years ago

      That sucks, one advantage of getting a laptop from Netherlands is that they use ANSI US keyboard, which I prefer over a localized keyboard.

      Usually Dutch people speak English very well, maybe you can call another store.

      • vazamb 5 years ago

        Which also might solve the mystery of why it's only available in the Netherlands? No need to build a different keyboard.

  • dahfizz 5 years ago

    I love my Lenovo X1 Carbon with a 14" QHD display.

    QHD is not a popular resolution but it's absolutely perfect for this size, like you mention. It's better for battery life than 4k and it's much crisper than 1080p.

    With a 13"-14" screen, you would have to scale a 4k display which is a complete waste in my mind. You either lose out on sharpness by getting fuzzy fractional scaling, or you go full 2x scaling and you lose all the screen real estate of a high pixel density display. 1440p is beautifully sharp without requiring scaling on a 14" panel.
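
    To put rough numbers on that, here's a small sketch of the pixel-density arithmetic for a 14" panel (the diagonal and resolutions are just the sizes discussed above):

        from math import hypot

        def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
            """Pixels per inch for a panel of the given resolution and diagonal."""
            return hypot(width_px, height_px) / diagonal_in

        # Approximate densities for a 14" panel:
        for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
            print(f"{name}: {ppi(w, h, 14.0):.0f} PPI")   # ~157, ~210 and ~315 PPI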

    • SahAssar 5 years ago

      > With a 13"-14" screen, you would have to scale a 4k display

      I use a 13" 4k at native scale. I've heard this statement so often and it's always told as a fact when clearly it isn't so for everyone.

      • httpsterio 5 years ago

        I'm pretty confident you're in the minority, as reading text on that thing will be very straining on your eyes in almost no time. I have 20/20 vision and sure, I could use 4k at the same size for a few hours, but it wouldn't be comfortable.

  • sandGorgon 5 years ago

    Oh brilliant. Thanks for this. This is coming soon to India.

    Do you run Linux on this? How's the performance? Would love it if you could test with a live USB of Fedora.

    • httpsterio 5 years ago

      I have the previous model of the S540 with i7. Ubuntu, Pop_OS and Arch all work fine on it. I don't see why the Ryzen model wouldn't either.

      • sandGorgon 5 years ago

        That's basically the heart of it - it's a different motherboard and CPU architecture, and therefore a different BIOS.

        This is where things break.

    • ZuLuuuuuu 5 years ago

      I might try it, I'll let you know if I do under this topic.

  • FullyFunctional 5 years ago

    That was a surprisingly great review, thanks. I do regret that the Apple-style half-sized up and down arrows are so common.

  • bscphil 5 years ago

    Wow. I'd give anything to be able to get this with a decent screen. I edit a lot of photos, and Adobe RGB: 73%, P3: 72% simply isn't acceptable in 2020...

    • ZuLuuuuuu 5 years ago

      Hello I just wanted to notify you about a new development on display colors. Yesterday, I was fiddling with AMD display settings and I disabled "Vari-Bright" feature just to see what happens. I never tried to disable this before, because I read that it helps with saving battery life by lowering the brightness depending on content on the screen. But to my surprise, disabling it not only changed the brightness but affected colors in a good way too. And now I am much more happy with the colors. I don't know why they enable this feature by default when it makes the colors noticeably worse.

    • panpanna 5 years ago

      Lenovo almost always uses crappy displays.

      That said, not everyone needs color accuracy. It probably works just fine for my usage.

jarym 5 years ago

So Intel...

Lost their lead in consumer desktop CPUs and said ‘oh we have mobile and server’

But now AMD has proven that Intel’s lead in mobile has been squandered.

And then the largest buyers of server grade stuff are cloud vendors who are waiting for ARM to come of age.

  • Dig1t 5 years ago

    I don't really think cloud vendors are holding their breath to switch to ARM, they are all heavily invested in X86. All their code has been built on it for forever, and there's a big advantage to having your dev machines running the same architecture as your cloud production machines. I think desktop ARM adoption would have to happen before the server market moves to the same.

    • greggyb 5 years ago

      AWS has multiple generations of its own ARM processor. I can't say anything about it, as I have no experience with it. I know it's at least two generations in and they claim 40% cheaper for similar workloads.

      What I can say is that this level of investment is large.

    • solarkraft 5 years ago

      > their code has been built on it for forever, and there's a big advantage to having your dev machines running the same architecture as your cloud production machines

      Is that really so? Are remote development and emulation not sufficiently advanced yet?

      Super hot paths might be x86 optimized, but how much does that really matter? I'd think at the scales of the big providers nothing matters more than performance/power use and performance/price.

      • kevin_thibedeau 5 years ago

        Tons of x86 code is dereferencing unaligned addresses. Those have to be fixed before porting to ARM.

        • my123 5 years ago

          This isn't applicable to 64-bit Arm at the user application level, don't worry about it.

          Prior to that, unaligned memory accesses were perfectly usable on ARMv7 (and even v6) too, this was fixed a while ago.

      • 8K832d7tNmiQ 5 years ago

        There’s still no real justification for vendors to even offer ARM instances; if anything, the case for sticking with x86 only becomes clearer with the rise of remote development and emulation.

    • jarym 5 years ago

      Desktop and laptop vendors were heavily invested with Intel before now. Apple was heavily invested with Intel before now.

      Don’t underestimate what cloud vendors will do in pursuit of cost or efficiency - they are ruthless at both.

    • OriginalPenguin 5 years ago

      For EC2 instances and the like it will be almost all X86 for a long time.

      But much of the cloud is made up of services like S3, RDS, Redshift, Route 53, etc. ARM can be very competitive for many of these.

      • rumanator 5 years ago

        > For EC2 instances and the like it will be almost all X86 for a long time.

        I wouldn't bet on it being so lopsided. AWS is betting strongly on its ARM processors, which appear to have a better price/performance ratio.

      • bengale 5 years ago

        Graviton is looking really interesting.

    • saghm 5 years ago

      > There's a big advantage to having your dev machines running the same architecture as your cloud production machines. I think desktop ARM adoption would have to happen before the server market moves to the same.

      The new Apple machines are coming out next year, right?

  • jmnicolas 5 years ago

    Don't forget that Apple has pledged to be fully Arm on all its mobile and desktop lines within 2 years.

    I don't think Intel can recover, or at least not before 5 to 10 years from now.

  • fomine3 5 years ago

    Intel's mobile chips (below 25W) are still great for performance and power consumption, because that's their only product line that uses 10nm and Sunny Cove cores, and their low-power optimization is mature.

  • toastal 5 years ago

    The new upgrade in the Xe integrated graphics chip is still compelling to many users

  • imtringued 5 years ago

    That's funny. If Nvidia buys ARM then that will never happen.

  • lazylizard 5 years ago

    epyc rome are cheap n good

mushufasa 5 years ago

How does it work with linux?

My colleague got last year's X395 with the AMD 3000 series. It only lasts 3.5 hours on battery with web browsing and moderate coding, whereas it's supposed to last ~7 hours real-world on Windows.

  • 3np 5 years ago

    I can’t speak for the new series, but I have a couple of Ryzen 3400G and I’m not sure if I got the iGPU working and being utilized 100% properly yet.

    The drivers are a huge mess of confusion (what goes into user space vs kernel? What do you really need? What even goes into host OS vs containers if you run Docker? What if anything can you actually get out of it without installing the proprietary non-free closed-source amdgpupro?), AGESA updates are needed to avoid kernel module crashes, oh and these updates are left up to mobo vendors, some of which are great, some of which will make you feel like you bought a lemon. And then there’s the whole mess with mesa that I think is just now resolved (20.1) and hasn't yet made it to LTS distros.

    I’m def not an Intel fan, but man, 100% working Intel drivers are an apt install away. I had forgotten just what a PITA ATI was with Linux, and couldn't imagine AMD hadn't stepped up its game at all.

    Unless anyone has anecdotal evidence otherwise, make sure you set aside a couple of working days to hunt down and compile the right kernel modules and make sure the vendor provides recent enough firmware, and/or have patience.

    In short, I wouldn’t hesitate having a new Ryzen for a headless server, or a desktop rig with a dGPU. For a smaller desktop, laptop, or anything else requiring use of the iGPU I would wait a year. Unless you’re one of the few people either already up to speed on all this or finding some absurd pleasure in learning about it, in which case I really do hope you post your process in a blog or forum where other users will find it through web searches.

    • BenjiWiebe 5 years ago

      My Ryzen laptop has a 3500U CPU. I did the normal install with Fedora 32 KDE, and everything works, though with one annoyance: occasionally a single-pixel-wide line, maybe 10-50 pixels long, won't update.

    • imtringued 5 years ago

      I'm not sure what you are talking about. I installed Arch Linux last week on a new machine and all I had to do was pacman -Sy mesa and that's it. Actually, that's a lie. I installed a DE which installed mesa as a dependency automatically.

      • 3np 5 years ago

        With Arch obv you don’t have the issue with Mesa and kernel updates as the main repos are recent enough.

        Which AGESA version are you on? Does hardware accelerated decode and encode work as expected?

  • katmannthree 5 years ago

    You'll want to run a recent kernel, >=5.8 for a renoir chipset. I get maybe 75% of advertised battery life on my fedora rawhide install.

    Granted mine is an ideapad 5 with an amd 4500u but it's been terrific. There are a handful of bugs still but nothing that prevents productive work. The worst one is where turning down the backlight too fast kills the backlight entirely, but you can fix it by just bumping the brightness up key and then going back down to your target.
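
    For anyone unsure what they're running, a minimal sketch of checking the kernel against that 5.8 cutoff; it just parses the release string, assuming the usual major.minor format:

        import platform
        import re

        def kernel_at_least(major: int, minor: int) -> bool:
            """True if the running kernel release is at least major.minor."""
            match = re.match(r"(\d+)\.(\d+)", platform.release())  # e.g. "5.8.4-200.fc32.x86_64"
            return match is not None and \
                (int(match.group(1)), int(match.group(2))) >= (major, minor)

        print(kernel_at_least(5, 8))   # Renoir graphics support wants >= 5.8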

  • cookiecaper 5 years ago

    Fedora 32 works reasonably well, just some minor hiccups, such as:

    * There is no driver for the Goodix USB fingerprint reader yet

    * Occasionally thin lines of pixels don't update correctly (hard to describe, might get a photo soon)

    * dmesg logs periodic errors with amdgpu and the new driver for the realtek wireless card, but I haven't noticed any negative functional impact associated with these.

    * The TSC is marked unreliable on each boot.

    That said, it's plenty usable as a daily driver, and the performance is very strong for the price point. GeekBench 5 results at https://browser.geekbench.com/v5/cpu/compare/3495992

  • Apofis 5 years ago

    The latest AMD APUs are not fully supported by official AMD drivers on Linux just yet. I also ran into issues on Lenovo's IdeaPad with Bluetooth (Ubuntu 20.04) and Wi-Fi (Ubuntu 18.04).

  • cttet 5 years ago

    I bought the Chinese version, and PopOS as an Ubuntu derivative is working quite well. You need the mainline kernel for the screen brightness adjustment to work though.

hu3 5 years ago

> Those results were a game-changer. They’re miles better than I got running the same load on the HP Envy x360 (around eight hours), the Dell XPS 13 (seven and a half hours), the Asus Zephyrus G14 (almost nine hours), and even low-power stuff like Lenovo’s Chromebook Duet (11 and a half hours) for which battery life is a major selling point. I’ll be blunt — this is the longest battery life I’ve ever seen from a laptop. It’s astonishing.

  • Cilvic 5 years ago

    For anyone else wondering about the battery life "over 11.5 hours"

    >Running through the multitasking load that I described earlier, in battery saver mode at 200 nits of brightness, the Slim 7 lasted 13 and a half hours. On the Better Battery profile, it lasted over 11 and a half hours. Remember: I was not going easy on this thing — you’ll certainly get even more juice if you’re just clicking around a tab or two.

  • formerly_proven 5 years ago

    In the T440 or so models you could have three batteries (one internal, one regular, one ultrabay) in a Thinkpad for a claimed 30 hours or so battery life with two hot-swappable batteries... :)

  • Tade0 5 years ago

    I have an ASUS G14 with an R7 4800HS and while nine hours are possible (at ~100% it shows exactly 10 hours), you'd have to set the brightness to a pretty low level and forget about doing anything challenging.

    But six hours with your IDE open, brightness at the default level for battery mode and compiling a Node.js + TypeScript project from time to time is something you can reasonably expect to be able to do.

    Currently, even though I have my charging set to max out at 80%, I don't look at the charge indicator too often because I know that I have a good few hours before it's time to plug in.

  • rsynnott 5 years ago

    One potential note of caution is that the screen may be a contributor to that good battery life.

  • tomComb 5 years ago

    > Lenovo’s Chromebook Duet (11 and a half hours) for which battery life is a major selling point.

    No it isn't. For that device, price ($300 with keyboard) is its major selling point.

pachico 5 years ago

Pity the screen is so bad. I still don't understand the point of glossy screens, to be honest.

  • n3k5 5 years ago

    If you're in a dim environment, a glossy screen has slightly better contrast and colours. If you have a light source or bright objects nearby where their light falls on the screen, a glossy screen has significantly better contrast and colours, as long as the screen and you are positioned such that you don't see the reflections. If you can't avoid reflections, e.g. because you're outside in bright sunlight, a glossy screen is pretty much useless. But if you mostly have a notebook so you can take it to different indoor workplaces with suitable lighting, and/or so you can move from your desk to a couch or bed (e.g. to watch a movie), the reduced glare can be very nice to have.

    So the thing is, if you never had one yourself and only observed others using them while away from their preferred work spaces, chances are you've literally seen glossy screens in a bad light :)

    Of course, if your use cases are all text based, the potentially better picture quality of a glossy screen is indeed rather pointless. Either option is a compromise; what's better depends entirely on where and for what you use the device.

  • NikolaNovak 5 years ago

    Glossy screens generally have more vibrant colours; matte is inherently somewhat muted; and in some scenarios have better contrast and brightness.

    (Also: PRETTY! and APPLE! are important psychological factors as well :P. Joking aside, many people will look at vibrant screen and how pretty it looks and make their decision without considering the specs, details and use cases )

    For myself, in most situations glossy screens mostly mean I can see everybody behind me better than I can see my work; but they do have their place, especially for professional design/video/photo work IN (and this cannot be overstated) well controlled environment (basically, make everything BUT the screen non-reflective / control the light:).

  • solarkraft 5 years ago

    Matte screen protectors are cheap and look great.

    • GordonS 5 years ago

      ... and are an absolute bastard to apply. Seriously, it's almost impossible to get all the air bubbles out.

      Also, "look great" is stretching it a bit - they tend to make things look slightly softer and desaturated.

      • solarkraft 5 years ago

        You're right about the application, but when you're done you have a screen that is more robust and doesn't have an inset bezel.

        The desaturation and softness are (AFAIK) inherent to matte screen surfaces and the reason manufacturers nowadays tend to avoid them.

        • GordonS 5 years ago

          I tried to put one on my 13" MBP a while back, because I hate how glossy and reflective the screen is. Was horrible to apply, and I ended up with lots of small scratches in the laptop display simply by gently pushing out the air bubbles with a card... I won't be applying a screen protector again in a hurry!

          > The desaturation and softness are (AFAIK) inherent to matte screen surfaces and the reason manufacturers nowadays tend to avoid them

          If it has to be a choice, then I choose matte regardless.

  • ChuckNorris89 5 years ago

    Aren't MacBooks also glossy?

closeparen 5 years ago

IdeaPads still shipping with a TLS intercept ad injection proxy?

https://nakedsecurity.sophos.com/2015/02/20/the-lenovo-super...

andymoe 5 years ago

I bought two for the kids and while they are fine and I’m happy with the value I still despise windows home edition (had to block one from my router during set up so I did not have to make a windows account for instance).

I really really wish Apple would make a reversible 2 in 1. I can’t tell you how much of a better experience that form factor is for young kids. iPads are not a replacement for this.

  • boogies 5 years ago

    Linux Mint (Windows-like) is free and elementaryOS (macOS-like) is pay-what-you-want, $0 if you so desire (https://elementary.io/). It takes less time to flash a drive and boot from it than to endure one Windows update on my desktop (although you can also get GNU preinstalled on fairly Thinkpad-comparable laptops from eg. “Laptop With Linux” https://elementary.io/store/#devices, and Thinkpads themselves will likely ship with GNU soon — many Lenovo PCs are already certified for compatibility with multiple major distros).

    • andymoe 5 years ago

      Thanks! This just made me realize the other requirement is that they can play Minecraft and that it does indeed run on Linux.

      • boogies 5 years ago

        My pleasure!

        Minetest is a free (libre) similar game, but very mod-centric, with mods loadable from servers without installing them (as easy as Roblox) and (also like Roblox) written in Lua. It may be a fun intro to programming if they’re keen to try it (ofc their friends probably don’t play, but a couple of boys I know, ~7-10, enjoyed it to a kind of harmful level), btw.

    • toastal 5 years ago

      If it wasn't for the GNOME lock-in, Pop_OS would be a great recommendation in the space for this sort of user

  • wongarsu 5 years ago

    Upgrade keys for Windows 10 pro are cheap on eBay and are completely worth it.

    • kristianp 5 years ago

      Aren't they unauthorized volume keys that are likely to fail activation?

      • wongarsu 5 years ago

        Yes, they are volume keys. Check the legal situation in your jurisdiction before buying.

        I've had one out of about ten fail on activation. But they cost 1/40th of what Microsoft is charging, so I don't mind that.

ampdepolymerase 5 years ago

It is a pity that a similarly specced ThinkPad would be at least a thousand dollars more expensive.

  • iakov 5 years ago

    A similarly specced ThinkPad or Latitude is built to last, has user-replaceable components and will not fall apart in 2 years after a warranty expires.

    From my experience you get what you pay for with those professional machines.

    • notatoad 5 years ago

      my thinkpad X1 with soldered-in RAM and dead non-replaceable battery disagrees.

      i only paid $100 for it on ebay, so i can't really complain, but it doesn't seem much different to the average consumer laptop to me.

      • noir_lord 5 years ago

        Not all thinkpads are created equal.

        If you want more serviceability then T series (and P if you have the budget) are much better.

        Though even on the T’s they started soldering the DIMMs, which is annoying.

      • hajile 5 years ago

        Which X1 has a non-replaceable battery?

        I've had a couple different models and batteries in both were easy to replace.

        In contrast, there's NO way to replace the keyboard without basically buying a whole new system which is incredibly stupid.

        • notatoad 5 years ago

          i am apparently mistaken, and the battery is replaceable. so thanks for that. just ordered a new one.

    • greggyb 5 years ago

      And a similarly specced Thinkpad will be perfectly usable if purchased in 2-3 years as they cycle out of enterprise fleets, and for a better price than the notebook in the article. Not a satisfying option for everyone, but a great one in my book.

    • wazoox 5 years ago

      My IdeaPad 540 has two M.2 slots and replaceable RAM (I replaced the 8GB it came with with a 32GB stick). The battery seems very easy to replace, too (only one additional screw IIRC).

  • mizzack 5 years ago

    Hmm, Lenovo routinely has deep discounts on Thinkpads. Either public coupons, EPP, etc.

    I just bought a CTO T14 w/4750u, 16GB+0, 400nit screen for ~$830 + tax, less than the price of the Slim 7.

    • fomine3 5 years ago

      I got a ThinkPad E495 (Ryzen 5 3500U, 128GB SSD / 8GB RAM, IPS FHD) for about $360 with a huge $110 cashback (in Japan); that's a steal. The great thing is that it has 2x SODIMM, 1x SATA and 1x M.2 slots, so I added a cheap 512GB SSD and 16+8GB RAM rather than the expensive BTO options.

    • hcurtiss 5 years ago

      I hate that they make you work so hard for these prices.

niffydroid 5 years ago

I have an HP ENVY x360 15-ee0002na. I like it a lot. Battery is ok, even the screen is ok, and it has 16GB of RAM.

Trying to find a Ryzen 4xxx with 16GB of RAM in the UK is quite hard! Plenty of Intels though, it's almost like Intel flooded the market or no one wants Intel.

  • ChuckNorris89 5 years ago

    No, it's the opposite: it's not a flood of Intel devices, it's a shortage of AMD devices.

    AMD had to prebook fab capacity at TSMC years in advance and didn't expect the shortages caused by the pandemic, made worse by WFH demand, while Intel can make as many chips as it wants to fulfill market demand since it owns its fabs.

    It's a shame because the 4800U laptops are either sold out or going for huge markups right now.

Havoc 5 years ago

Just ordered a Ryzen 4800U minipc for home server use.

Think it might play in that role. Decent number of cores yet modest TDP seems like a good fit

  • megak1d 5 years ago

    I am on the look out for a replacement for my HP ProLiant home server with something that has this chip. What minipc did you go for?

    • Havoc 5 years ago

      Asus PN50. On pre-order (early oct) but I gather the Aussies already got theirs. If you do go that route google RAM carefully...there seem to be compatibility issues (JEDEC vs XMP).

      Also, I think the Ryzen 5 is probably the better value ratio, but it wasn't available to buy so I went for a 7.

      I believe Asrock is also gonna bring out similar stuff but don't know details.

wazoox 5 years ago

I have an IdeaPad S540 API. It's a Ryzen 3500U. It has 2 M.2 slots so you can add a second SSD if needed. I upgraded it to 32 GB and boy, does this beast fly. And it's really cheap.

Under Pop_OS with minor tweaks the battery lasts about 5 hours, which is pretty good for a Linux laptop.

Think of turning off Wifi energy saving: Wifi speed went from 50 Mbps to more than 300.
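
For anyone wanting to try the same tweak, here is a minimal sketch of toggling Wi-Fi power saving with iw. The interface name is an assumption (check yours with iw dev), the command needs root, and the setting doesn't persist across reboots.

    import subprocess

    WIFI_INTERFACE = "wlan0"   # assumption: substitute the name shown by `iw dev`

    def set_wifi_power_save(enabled: bool) -> None:
        """Toggle Wi-Fi power-save mode via iw (requires root privileges)."""
        state = "on" if enabled else "off"
        subprocess.run(["iw", "dev", WIFI_INTERFACE, "set", "power_save", state],
                       check=True)

    set_wifi_power_save(False)   # disabling it fixed the throughput drop described above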

  • ihattendorf 5 years ago

    I've got basically the same machine as you as well (ThinkPad E495, 3700U, upgraded M.2 SSD, upgraded 32 GB ram) and while it's great, the 4000 series is a game changer. I really wish I could have waited but my old laptop had other plans.

xioxox 5 years ago

I wonder if they've improved the screen hinges? I've given up on slim Lenovos as they had these weak metal-bonded-to-plastic hinges which are easy to break through regular use. The screen snaps off the base and the case just gets bent and twisted. They seem very poorly designed and are hard to fix.

toastal 5 years ago

Only 80.4% DCI-P3 coverage and no Thunderbolt.

These processors are great and OEMs could offer features users want, but they've still been offering only mid-range or gamer-oriented builds for everything else.

rowanG077 5 years ago

All of the ryzen laptops that exist have abysmal screens. I would have bought one if it weren't for this fact.

  • q3k 5 years ago

    Very happy with my T14 (AMD) IPS 1920x1080 display. Some might dislike that it's not High DPI, but personally that's not a feature I want.

    • rowanG077 5 years ago

      Full HD for 14 inch display is not great. And unless you get the privacy guard screen it's also not a bright screen.

      Tbh I would jump to any high res oled 13 inch laptop almost instantly. Screen is one of the most important parts for me.

    • formerly_proven 5 years ago

      1080p on 14" is high DPI. It's just not 250+ DPI high.

      • chrismorgan 5 years ago

        Colloquially, “high DPI” has a fairly specific meaning: it means “designed to be used with a scaling factor of at least 2 (and definitely uncomfortable to use below a scaling factor of 1.5)”. 1920×1080 on 14″ does not meet this definition.

        And when it’s capitalised, High DPI, as it was in the parent comment, it’s definitely referring to this definition.

        Back in the days when 1366×768 and 1280×800 were common sorts of resolutions and 1920×1080 was the highest available (that is, before Apple’s Retina displays), perhaps you could have said 1920×1080 was “high DPI”, but people didn’t use the term “high DPI” back then. And certainly not “High DPI”.

    • justaguyhere 5 years ago

      I understand it is good for programming. Can it handle video editing? I am trying to decide between T14 and P1, which is almost twice the price as T14.

      Any review will be helpful.

      • q3k 5 years ago

        Do you need more performance benchmarks? Here's my experiences on Linux (NixOS).

        I've done some light Blender work on it and it handled things just fine. Haven't tried GPU acceleration for Cycles, though.

        If I bump up thermal limits to 95 degrees then it compiles things at similar speeds to my old i7-6700k machine, if not slightly faster. At stock 60 degrees it's still pretty damn fast, but aggressively throttles if you try to use all cores at once for more than a few seconds. At stock thermal limits it also never gets too uncomfortable to handle.

        The GPU is good but not great. Satisfactory via Proton struggled (but got to 50FPS on low settings), older games like Portal 2 run great on ultra settings. amdgpu works pretty well, or at least not worse than amdgpu on my desktop.

        Overall, it's a good replacement for my old workstation, can handle some non-trivial video/GPU workloads, and I recommend it. There's some things that need to be ironed out (ACPI S3 sleep is currently somewhat broken, but Lenovo is currently working on certifying this machine with Ubuntu, so that should fix things), but that's mostly because it's such a new CPU/Laptop.

        • justaguyhere 5 years ago

          Thank you. I really want a AMD laptop, but they don't seem to have good displays. If I am to stare at the screen 10+ hours a day, I'd like a good screen.

          Do you like the display?

          • q3k 5 years ago

            Yes.

            But I also really dislike some displays that some people love, like Macbook displays. So, you know, de gustibus.

    • post_break 5 years ago

      I want one, but lenovo still has a screen lottery with their 400 nit low power screen.

    • Grazester 5 years ago

      250 to 500 nits. That is not very bright and this is the potential problem for me.

      • bluedino 5 years ago

        250 is not bright, but 500 sure is.

        • fomine3 5 years ago

          My ThinkPad E495 has a 500-nit FHD IPS display. It's enough for most places, but it's not bright enough for very bright places. It doesn't look like 500 nits.

        • q3k 5 years ago

          I'm on 300 and it's definitely bright enough for my needs - so your mileage may vary.

        • post_break 5 years ago

          500 is a lie, since that panel has an extra layer for their stupid privacy guard.

  • chrismorgan 5 years ago

    My last couple of laptops have had IPS panels. I intend never to buy a TN panel again.

    My last laptop was a 15″ 1920×1080, and my current is a 13″ 3000×2000 (and I love the aspect ratio). I intend never to buy a laptop with a 1920×1080 screen again.

    At some point I’m afraid I may end up with a ≥120fps screen and rarefy my tastes still further. (I hear good things about them, but have never seen an LCD with such a frame rate.) Fortunately screens with both a high frame rate and a high resolution are still vanishingly rare.

    I just hope someone comes out with a good screen on one of these—I barely even care if it’s super expensive; because it’d be a real shame to have to decide between a good screen and a good CPU.

    • formerly_proven 5 years ago

      For desktop use 120 vs 60 Hz is noticeable on Windows (obviously very noticeable on the mouse cursor, but that doesn't change the UX much), but because DWM has basically optimal latency as far as dragging windows around goes, it's not that big of a difference. On Linux it's a pretty huge difference since Linux compositors aren't as good as DWM. Basically, Linux with 120 Hz feels like Windows on 60 Hz.

      • silon42 5 years ago

        First thing I usually switch off on Linux is the compositor. It's not that bad typically (you notice latency only if you look for it), but why add unnecessary latency...

        • chrismorgan 5 years ago

          I loved wobbly windows in Ubuntu a decade ago, on my first laptop. Switched to i3 in Arch Linux with no compositor on my second laptop. I think wobbly windows was the only thing I missed from Compiz.

        • solarkraft 5 years ago

          On Wayland it's difficult to disable the compositor :-)

          With which setup have you been able to achieve the lowest latency? I'm currently trying to get an overview of the situation.

          • noir_lord 5 years ago

            I went to Gnome over Cinnamon because, with an RTX 2080, Gnome on Wayland is butter smooth where Cinnamon is janky as hell.

            The difference was vast.

            It’s a shame because I much prefer Cinnamon, so much I considered just shoving a cheap RX AMD card in for work stuff/Linux.

        • scns 5 years ago

          If you use Cinnamon, you can switch off compositing when in fullscreen mode.

    • viraptor 5 years ago

      > screens with both a high frame rate and a high resolution are still vanishingly rare

      They're huge power sinks. If you need to play games on the laptop for some reason and can have it plugged in all the time, it works. But 4k often takes 1-2h off the battery life, and the high frame rate will likely have an impact too.

      I don't mind 1080p, or 60hz, but I'd love proper HDR OLED for work, so I can have high contrast together with lower brightness.

    • scohesc 5 years ago

      Remember when 99% of laptops priced around $800 or lower were automatically doomed with a 1024x768 screen - and to get a higher resolution you'd be paying at least 300-400 more?

      Yeeks.

      • chrismorgan 5 years ago

        Things were like that when I wanted to get my previous laptop in 2014 or so; you want 1366×768? Great! We have laptops from AU$400 onwards. You want 1920×1080? Here, enjoy our tiny range of AU$1,500+ laptops, all of which are heavy and power-hungry with dedicated graphics cards because you must want that, right? I mean, why else would you want a decent screen?

        It’s not quite so strong these days with 1920×1080 or even with 4K panels, but the segmentation is still definitely real. The feature segmentation, things like pairing dedicated graphics with better screens (even when the screens could easily be driven by dedicated graphics), is particularly distressing, because they’re making you pay more for things that you either don’t care for or actively don’t want, just to get other things you do want.

    • solarkraft 5 years ago

      > I intend never to buy a laptop with a 1920×1080 screen again

      Same here and WOW, it's frustrating. There are so many laptops that would be amazing if it wasn't for their screens. Paradoxically 17" models are almost all crap.

      • chrismorgan 5 years ago

        Ugh, also those times when >99% of 15″ models were 1366×768 or similar, while 11″ ultrabooks were happily shipping 1920×1080 or even higher. Fortunately that’s almost behind us now.

    • bluedino 5 years ago

      >> My last couple of laptops have had IPS panels. I intend never to buy a TN panel again.

      Agreed, but there are plenty of dim IPS panels with poor contrast and color.

  • Yizahi 5 years ago

    I bought recently HP 455 G7 with Ryzen 4300U and IPS screen. It's not the best screen but miles better than TN-film panels anyway. Good for a budget office laptop.

    • addled 5 years ago

      I ordered the 445 G7 with 4750U over a month ago. Shortly after, I was notified the estimated delivery was pushed out to a specific date a couple weeks later than original.

      A couple weeks later, I got notified that due to supply issues for certain components, the date was pushed back, without a specific date, and I could cancel if I wanted or take a discount when it finally shipped.

      This week HP said they cancelled my order altogether, with a list of cancelled business laptop skus. All of them were because of lack of AMD CPUs.

      Supposedly I can get a discount on something else, but there's just not much to pick from.

  • indymike 5 years ago

    After spending about six years with a Macbook with a retina display, going to even 1080p feels like a step back.

    • mroche 5 years ago

      It's a tad ironic (to me) because unless you max out the scaling of the display, none of the MacBook displays have an effective resolution >= 1920x1080. Apple has been defaulting to a fractional scaling the past few years rather than true Retina which is 200% scaling, but even then it falls short[0].

      However, 200% scaling is crisp and I can appreciate it. I just don't like losing all that real estate. And the fractional options on MacBooks aren't bad, but I can see text fuzzing out when it's not on the real Retina resolution. So when I'm running without an external display I do bite the bullet and deal with the lower resolution because otherwise I can feel my eyes straining.

      [0]

          Model | Physical  | Max       | Default   | Retina
          16"   | 3072x1920 | 2048x1280 | 1792x1120 | 1536x960
          15"   | 2880x1800 | 1920x1200 | 1680x1050 | 1440x900
          13"   | 2560x1600 | 1680x1050 | 1440x900  | 1280x800
      • solarkraft 5 years ago

        I use a '15 MBP with Windows for work and usually use it at 125-150%.

        Almost nothing looks blurry. I can't imagine it being worse on macOS.

        At home I use a 4k 15" XPS with Linux on 1.7x scaling and nothing looks bad either.

        I assume app support is there for the former case and it's not for the latter, so I can pretty well recommend 4k screens at that size.

        • chrismorgan 5 years ago

          I’ve heard that macOS fractional scaling seriously is rendering at one size, and then upscaling or downscaling it to the target size. I’m not certain this is true because I haven’t confirmed it myself and it seems such an obviously stupid idea (and though it’s certainly easier, no one else does it that way because it’s such a terrible idea), but I’ve heard people saying this at least three times (twice on the internet, once in real life), about text not being crisp at fractional scaling. I dunno.

          • vetinari 5 years ago

            It is always rendering at integer scale and always downscaling to target size (upscaling would result in blurriness; upscaled apps are only those that support only @1X scale).

            Just take a screenshot of your desktop and check its resolution, then compare it to the physical display resolution. Apple uses the output scaler on the final, composited image.

            It's not stupid; it is a solution that you can implement without support on the application side, and it is relatively simple. Going the Android way means that all apps have to support arbitrary scales, which means they have to ship with assets for that.

            • chrismorgan 5 years ago

              Both upscaling and downscaling result in blurriness, though upscaling will generally yield slightly worse results. But downscaling is still going to yield a result drastically inferior to rendering at the correct size. It totally butchers pixel-perfect lines, for example. It’s the sort of hack that would be awful on low-resolution displays, and only becomes even vaguely tolerable on high-resolution displays because it’s still somewhat better than a low-resolution display for a lot of what people are using their computers for, even if for others it renders it legitimately unusable.

              If this really is true, I remain utterly baffled, and I maintain my position that it is an obviously stupid idea. Doing it that way just makes no sense to me. The visual result is way worse, it’s more resource-demanding and thus slows things down a little, and it doesn’t really simplify anything for app developers anyway—the only difference is that you have an integer scaling factor rather than a float scaling factor; but all code is still having to perform scaling mappings, and using floats would change roughly nothing (though the changes required in your APIs may need to propagate through a few levels, and GUI libraries will have to decide how to handle subpixel alignment). Windows and Android have both done it properly, so that supporting fractional scaling is no burden whatsoever for developers. You talk of having to ship assets for arbitrary scales, but that’s not a reasonable argument: GUI libraries should always choose the most suitable version of a resource, and scale it to the desired size themselves if it doesn’t match exactly.

              The result of taking the proper approach is that users of fractional scaling may get icons being rendered poorly, but images, text, vector graphics, &c. will be rendered precisely and crisply. Meanwhile, this other behaviour people are saying Apple is doing is just guaranteeing that everything is rendered poorly. Surely they’re not actually doing this? Is it perhaps a case of them having erred in making Retina support integral scaling only, but they’ve since made a better version that supports fractional scaling that each app can opt into, but they just haven’t insisted on everyone fixing their stuff yet? (And remember, Apple’s in an excellent position to do such insisting—they do it regularly.) —But as you say, screenshots are scaled at the next integer, which would suggest that yeah, they’re actually doing this mangling system-wide, and there’s no per-app recourse. Thanks for that explanation.

              I just find it hard to believe that Apple would truly butcher things this badly, even if they’ve been known to do weird things a bit like this before, such as killing off their text subpixel rendering with no stated justification, to the clear detriment of low-DPI users (and it may still be worthwhile even for high-DPI users).

              I can’t check any of this because I don’t use a Mac. There may even not be a macOS device within a kilometre or two of me.

              • vetinari 5 years ago

                I understand your position; however, practice has shown that this approach is good enough quality-wise. Most users don't notice. In some respects it is better than the approach you suggest would be, because it takes the entire framebuffer into account at once. It has fewer problems with pixel-perfect lines, the subpixel mouse cursor, etc., than a purely software solution, which struggles with these more.

                Also, it is not more resource-demanding; the only cost is the bigger framebuffer. The scaling itself is free: it is done by the output encoder (the output hardware that does the encoding for eDP/DP/HDMI; the GPU isn't used for that at all[1]). Apple has one more trick: it doesn't offer the user zoom percentages the way Windows or GNOME on Linux do. You cannot do 125% on Apple hardware (that's a bad corner case: you'd need to display 8 framebuffer pixels using 5 physical pixels AND pay the price of the @2X framebuffer). The 13" default mentioned above (1440x900@2X) means displaying 9 framebuffer pixels using 8 physical pixels. It doesn't compound the error at all.
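
                A quick sanity check with the 13" figures from the table above:

                  # "Default" on the 13": looks like 1440x900, backed by a 2880x1800 buffer,
                  # shown on a 2560x1600 panel -> 9 framebuffer pixels per 8 physical pixels.
                  print(2880 / 2560)   # 1.125  (9/8)
                  # A hypothetical 125% zoom would need 8 framebuffer pixels per 5 physical
                  # pixels, a much coarser resampling ratio.
                  print(2 / 1.25)      # 1.6    (8/5)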

                We may argue about whether Windows did it correctly and the Mac did not, but the fact on the ground is that all Mac apps run correctly on fractionally scaled displays while Windows apps are a mixed bag. Even apps that do support HiDPI on Windows have weird bugs (I'm not going to name & shame).

                Which brings us to another of your points: that apps can scale their assets. Sure, they can. And as we can see, every app will do it incorrectly in its own unique way. So when every app has to do that, why shouldn't the system library do it for them? Any bugs get fixed in one place, and you might find a way to hardware-accelerate it that the individual apps couldn't (see above about the scaler on the output encoder).

                As I wrote, this solution is good enough quality-wise, is simple to implement, and brings results quickly. In fact, it is good enough that Apple uses it on iPhones too (on some models the physical and logical resolutions do not match; the screen is rendered at an integer multiple of the logical resolution and then scaled).

                [1] For Intel hardware, you can find more info in the Programmer's Reference Manual, Volume 12: Display Engine.

    • nobleach 5 years ago

      I have both a 15" MacBook Pro (2018 model) and a Lenovo Thinkpad Carbon X1 open in front of me right now. I have 20/20 vision and it really doesn't strike me as that much better. I spend quite a bit of my day in a terminal and the remainder looking at the web, so font rendering definitely matters. Even with Linux's abysmal font-rendering, the Mac really doesn't feel like it has the edge.

      • ahartmetz 5 years ago

        > Linux's abysmal font-rendering

        Wait, what? It's somewhere in the middle between Windows (too pixel-fitted) and macOS (too blurry). FreeType with slight hinting and LCD filtering on is the best font rendering that I know of.

        • nobleach 5 years ago

          And it jumps around as to which rendering it's going to use, which means that my terminal can suddenly look gorgeous while Slack looks like hot garbage!

      • pedrocr 5 years ago

        20/20 vision is only about average for the population once you reach age 60, so most people here will see significantly better than that. At 20/20, 1080p is probably enough; most people will enjoy a 1440p screen, and 2160p will bring marginal gains, though it may be helpful to be able to use 2x scaling instead of needing fractional steps. But that depends on how you drive it.
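
        As a rough back-of-envelope (my assumptions: a 15.6" 16:9 panel viewed from about 60 cm, and 20/20 taken as resolving roughly one arcminute, i.e. about 60 pixels per degree):

          import math

          def pixels_per_degree(h_res, diag_in, aspect=(16, 9), view_cm=60):
              width_cm = diag_in * 2.54 * aspect[0] / math.hypot(*aspect)
              cm_per_degree = view_cm * math.tan(math.radians(1))
              return h_res / width_cm * cm_per_degree

          for h_res in (1920, 2560, 3840):
              print(h_res, round(pixels_per_degree(h_res, 15.6)))
          # ~58, ~78, ~116 px/degree: 1080p sits right around the 20/20 threshold,
          # 1440p comfortably above it, and 4K mostly benefits sharper-than-20/20 eyes.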

      • fomine3 5 years ago

        Font rendering mattered on low-DPI monitors, but IMO it matters much less on HiDPI monitors, because techniques like anti-aliasing are no longer needed. The fonts themselves are still important.

      • driverdan 5 years ago

        Your vision may be worse than you think. It's easy to distinguish between those resolutions, especially when looking at smaller text.

        • nobleach 5 years ago

          Hmmm, as an engineer, I have my vision checked every year. While a difference is discernible, it's not enough that I'd "miss it" if I didn't have a retina display. (I just ordered a Dell XPS 15 this weekend; I purposely left it at FHD as I didn't see the need for 4K on a laptop.)

    • notacoward 5 years ago

      I'm on a three-year-old MBP right now. Next to me is my new IdeaPad Slim 7. Other than the Slim's screen being a bit smaller, I hardly notice a difference.

  • pedrocr 5 years ago

    I would have ordered an AMD Lenovo T14s if they hadn't artificially segmented it by not offering the 4K screen available on the Intel version. Hopefully AMD can increase the volume for these chips and more configurations show up.

  • Grazester 5 years ago

    Why the hell is this? I bought one myself, and in my search I could not find an AMD laptop with a good screen.

    • stefan_ 5 years ago

      It's a $900 laptop. Just need to wait for them to put this CPU into a model line they care about.

      • Grazester 5 years ago

          That goes up to $1,349.40 for the T14s.

        • easton 5 years ago

          Does the T14 get the same battery life as this system? If so, I might have to look into that (or the inevitable X1 Carbon version).

FullyFunctional 5 years ago

> Our reviews generally leave extensive synthetic benchmarking to others

That left a really bad impression - I get that they are too lazy to actually measure the performance, but the snooty "synthetic" was uncalled for and frankly disrespectful to the people doing the work they are too lazy to do themselves.
