Try charging on the right and not on the left
apple.stackexchange.com: high CPU usage by kernel_task is caused by a high "Thunderbolt Left Proximity" temperature, which is caused by charging and having normal peripherals plugged in at the same time.
Did anyone ever figure out why exactly the temperature sensor was making kernel_task go crazy?
The comment by jksoegaard may explain it: "the purpose of the thread is to force idleness". I guess it uses up all the CPU without actually making the CPU do anything, so other processes can't use the CPU and produce more heat than the system can handle.
Wow! This brings back memories. I don't remember which generation of CPUs this was, maybe Pentium III or Athlon XP. But there was an issue in Windows where the chip would overheat when idle.
There was an app called Rain that you could download and it would just keep the idle process from running away and causing heat. I may need to research this on my lunch break and report back.
EDIT: Here's a related thread: https://arstechnica.com/civis/viewtopic.php?f=8&t=1060257
That wasn't exactly an issue, more like a feature that didn't exist yet.
The Pentium IIIs were the first CPUs that could consume enough power to cause heat problems if not throttled. OSes didn't yet anticipate that; they didn't yet have the concept of issuing halt instructions when there was no application work to do, they just let the CPU spin on a waiting-for-message loop. The OS makers soon recognized the need, and halting the CPU became built into the OS by the time of either Windows 98 or 2000, so yeah, in the interim, tools like Rain were what you used.
Documented here: https://support.apple.com/en-us/HT207359
> One of the functions of kernel_task is to help manage CPU temperature by making the CPU less available to processes that are using it intensely. In other words, kernel_task responds to conditions that cause your CPU to become too hot
If it's about CPU temperature, why does Apple schedule a task instead of just downclocking the CPU, like everyone else? This question predates the release of M1 Macbooks, and everyone else does thermal management of Intel Core processors via clock speed, not artificial idle load.
I'm not an expert, but I know there are two forms of throttling under macOS: the CPU frequency limit, and the "scheduling limit". The CPU reduces its frequency under thermal stress just like other machines, but there's a minimum frequency it can be downclocked to. So if that's not enough to stay within the thermal envelope, the scheduler will additionally start to limit the amount of CPU time that can be used. I notice this in particular when the GPU is at full whack, such as when running multiple external displays at high resolution with 3D graphics.
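On Intel Macs you can see both limits reported by `pmset -g therm`. The field names and layout below are taken from that command's output on recent macOS releases, but treat them as an assumption since the format has varied between versions. A small parsing sketch:

```python
import re

# Sample output of `pmset -g therm` on an Intel Mac under thermal pressure
# (field names assumed from recent macOS releases; format may vary).
SAMPLE = """\
CPU Power notify levels:
    CPU_Scheduler_Limit     = 60
    CPU_Available_CPUs      = 8
    CPU_Speed_Limit         = 50
"""

def parse_therm(text):
    """Return the numeric throttling fields as a dict."""
    return {k: int(v) for k, v in re.findall(r"(\w+)\s*=\s*(\d+)", text)}

limits = parse_therm(SAMPLE)
# CPU_Speed_Limit is the frequency cap (percent of max clock);
# CPU_Scheduler_Limit caps the share of CPU time the scheduler hands out.
print(limits)
```

When both numbers are below 100 at the same time, you're seeing exactly the two-stage throttling described above.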
I've had my Mac mini throttle down to 180 MHz under Linux (due to incorrect ACPI settings), to the point where I could see window elements being drawn :) So at least that generation of Intel silicon was capable of pretty extreme downclocking.
They almost certainly do adjust CPU frequency as well (although I don't have a 2020 Mac to check this). Lowering the frequency would mean fewer instructions can be processed per second which means you are more likely to see a higher CPU usage for all tasks. What you see is probably the combination of both of these effects.
It's annoying that the process monitor does not provide any way to distinguish high CPU load due to demand from high CPU load due to a low clock frequency, but there is software that can display a CPU frequency chart, which can give some insight:
https://pietschsoft.com/post/2020/02/15/macos-monitor-cpu-us...
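Given a frequency chart like the one above, one rough way to disambiguate the two cases is to scale the reported load by the current clock relative to the maximum. This is back-of-the-envelope arithmetic, not something the process monitor computes:

```python
def normalized_load(load_pct, cur_mhz, max_mhz):
    """Approximate 'real' demand: reported load scaled by the clock ratio.

    100% load at 1300 MHz on a 2600 MHz part is roughly the same amount
    of work as 50% load at full clock (ignoring IPC changes, turbo, etc.).
    """
    return load_pct * cur_mhz / max_mhz

# A pegged but heavily downclocked CPU...
print(normalized_load(100, 1300, 2600))  # -> 50.0
# ...versus genuine demand at full clock.
print(normalized_load(95, 2600, 2600))   # -> 95.0
```

So a CPU that looks saturated in Activity Monitor while throttled to half its clock is really only doing about half a CPU's worth of work.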
> does not provide any way to distinguish high CPU load due to demand and high CPU load due to a low clock frequency
If you're on Linux, then the "tiptop" utility does just this.
The high CPU usage is essentially the footprint of certain power-saving measures that "look" like the task is using a lot of CPU time (it is, in a sense: the time is spent putting the CPU to sleep).
There's similar behaviour on linux but it shows as specific kernel threads related to intel power throttling, so it's more obvious what's happening.
Doing it this way would also reduce task latency: whenever a task does get scheduled, the CPU still runs it at full speed.
That's very interesting. I didn't know it was possible for a process to use the CPU without using the CPU, as it were. Rather I thought the CPU usage was more or less a direct measurement.
From [1]
On processors that have a halt instruction that stops the CPU until an interrupt occurs, such as x86's HLT instruction, it may save significant amounts of power and heat if the idle task consists of a loop which repeatedly executes HLT instructions.
--
I don't know how it is implemented in Darwin; HLT is a privileged instruction, but then again, this is kernel_task.
I would expect that the CPU frequency would throttle down, rather than regularly running halt instructions.
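None of us outside Apple can see kernel_task's internals, but the accounting distinction this subthread is circling (a thread can be charged lots of scheduled time while the core does no useful work) is easy to illustrate from userspace. Python here, purely as an analogy; kernel_task itself presumably parks the core with privileged instructions like HLT or MWAIT:

```python
import time

def measure(fn, seconds=0.2):
    """Run fn for ~`seconds` of wall time; return (CPU seconds, wall seconds)."""
    cpu0, wall0 = time.process_time(), time.monotonic()
    fn(seconds)
    return time.process_time() - cpu0, time.monotonic() - wall0

def busy(seconds):
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass  # spin: burns cycles and produces heat

def idle(seconds):
    time.sleep(seconds)  # blocked: the OS can halt or downclock the core

busy_cpu, _ = measure(busy)
idle_cpu, _ = measure(idle)
# Both functions "occupied" the same wall-clock interval, but only the
# spinning one accumulated CPU time in the accounting.
print(f"busy: {busy_cpu:.2f}s CPU, idle: {idle_cpu:.2f}s CPU")
```

kernel_task's trick is the inverse: the scheduler time it hogs shows up as "usage" in Activity Monitor, even if what the thread actually does with its timeslice is keep the core quiet.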
This doesn't make sense, and I'm a bit worried: I cheaped out and have a "left-side-only USB-C" MBP from 2019.
Should I understand that having any kind of load on the measly two ports I have on my 1600€ computer will lead to abysmal performance? Like attaching a USB hub to add basic necessities like HDMI or USB-A?
I have a 2019 MBP that runs hot because it's supporting an external monitor. The year before every laptop became a WFH desktop, Apple decided that running an external monitor could be a massive heat issue and nobody would care. It's literally burning out my work laptop, and soon I'll have to ask IT for a new one. Apple doesn't care, though: they already got paid.
As I understand it, though, most of the thermal issues are to do with the north bridge and the video card. Basically, Intel's power model for USB is built around performance, not thermals. GamersNexus actually has some interesting videos about this, but at the end of the day Intel shit the bed on northbridge performance while developing USB-4 support.
I wildly speculate that this is a massive part of why Apple moved to their own silicon: to get away from Intel, who is clearly losing every competition they're in. (I don't know why Apple didn't go toward AMD, but I expect GPU hang-ups were part of it.)
They finally made the device a little thicker. Apple could have put a better quality heat sync and fan in these models; they chose not to.
As GP pointed out, apple doesn't care, they already got paid.
I'll go further and say I think it's actual dishonesty on their side: they're giving their own CPUs additional thermal headroom, when they kept the Intel chips constrained for years.
It's still pointless to point this out; Apple users are going to buy Apple stuff anyway...
*heat sink
Edit: why is this downvoted? This is a legitimately confusing mistake; people may think it has something to do with synchronization.
It's pedantic and the way you did it was rude.
You ought to have commented: "I believe you meant heat sink, not 'heat sync', am I correct?"
The average HNer is above average in intelligence and does not require a minor typo to be explicitly corrected.
If you're worried about thermal damage to your laptop, maybe you can prevent it? Some ideas:
- Leave the laptop screen open. You can turn off the screen by setting its brightness to the minimum setting.
- Power your USB devices from a powered USB hub.
- Point a small fan at the side of the computer.
- Clean the laptop's cooling system of dust and lint. Keep the laptop on a stack of books so the intake vent is raised above any surface that can collect dust.
The laptop is practically new. I keep it standing vertically on its side, with the lid closed, plugged into an external monitor. It's still uncomfortably warm to the touch.
Power draw from the GPU is a lot lower on Monterey. Still not as good as driving only the internal display, but about 40% lower compared to previous OS releases. Low Power Mode helps by another 15%.
I don't care about power draw, I care about the fact that my laptop gets physically too hot to touch while running an external monitor.
There hasn’t been a north bridge for nearly a decade, it’s all in the CPU. With no corresponding north bridge, Intel renamed the south bridge to platform controller hub (or just PCH).
> literally burning out my work laptop
Is there any actual evidence that running a CPU near its thermal ceiling causes premature failure vs. a CPU running at a 10-20°C lower temp?
When I hear people saying things like “my PC is dying”, this is usually due to thermal throttling from dust buildup or malware running causing excessive CPU usage. It’s not like the CPU is getting crispy around the edges and parts of it stop working so only a smaller portion is still available to do work or something.
There seems to be a lot of conjecture out there about the longevity of a CPU vs heat, but I’m wondering if this has ever been actually studied in a scientific way. I understand that electromigration is real, and it could e.g. cause a trace to eventually blow out.
This is the closest thing to a real scientific explanation that I could find https://www.reddit.com/r/hardware/comments/5l3ufj/is_there_a... but it still doesn’t go into why / how / the physical mechanics of it.
https://forums.tomshardware.com/threads/cpu-electromigration... Also attempts to explore this in a fairly scientific manner, but it’s focused on over-volting and overclocking vs normal (especially mobile, where CPU voltages tend to be less tweakable) operation.
TL;DR In my experience, CPU/GPU temps don't matter as long as they are not causing throttling. Running within a few degrees of the thermal limit (Tjunction max) 24x7 for years won't affect the longevity of a CPU/GPU in any functional way, and won't cause slowdowns, only outright failure, possibly of something like USB or another on-die subsystem. If anyone has anything showing otherwise, please share it!
Yes. But the computer is very, very, very thin. Which sells more computers than actual performance under load.
Benchmarks show the MacBook having decent performance, since they start cold and don't have external devices attached either. Thermal throttling only shows up once you've already bought it.
I have the 2019 MBP 16" with i9 inside and it's terrible for any task heavier than browsing the internet or watching cat videos.
Running CMake + gcc on a bigger project? The fans start running like crazy. The performance of the i9 is awesome, at least for the first 1-2 minutes; after that the CPU throttles you back to the 90's and the fans spin even louder.
But this is not even the funniest issue with it. I have the biggest, most powerful charger (96W, pretty much the maximum, because the USB-C standard only allows up to 100W). You might think that if Apple put that charger in the box, it should be enough to achieve max performance AND still charge the machine. Nope. Okay, but maybe at least enough power that the battery doesn't discharge? Lol, nope: a higher load for an extended time WILL discharge your battery while connected to the power supply.
I've tested it with two separate MBP with exactly the same spec (i9 + 32GB RAM + Radeon 5500M 8GB) with exactly the same behavior.
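The battery-drains-while-plugged-in behavior is just power arithmetic: USB-C PD topped out at 100 W at the time, and an i9 plus a Radeon 5500M under sustained load can draw more than that at the wall. The component figures below are rough illustrative assumptions, not measured values:

```python
def net_battery_watts(charger_w, draws_w):
    """Positive = battery charging; negative = battery draining despite AC."""
    return charger_w - sum(draws_w)

# Rough, assumed figures for a 16" i9 MBP under full CPU+GPU load:
charger = 96          # Apple's 96 W USB-C brick
load = [65, 50, 15]   # CPU package, dGPU, display/board/conversion losses

print(net_battery_watts(charger, load))  # -> -34 (battery drains ~34 W)
```

With numbers like these the battery has to make up the shortfall, which matches the observed discharge under sustained load.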
Pulling 100W on a laptop is pretty impressive though, that’s insane. Can you attach two chargers?
Trying to keep that hardware to 100W is insane given that similarly spec'd Windows laptops come with up to 250W chargers, and sometimes two of them if you have serious GPU power.
You can, but it will only use one, switching to the highest wattage automatically.
> Yes. But the computer is very, very, very thin. Which sells more computers than actual performance under load.
No. Apple sells computers regardless, but now that "form over function" Jony Ive has left Apple, Mac sales have actually shot up. People want their computer to look and feel good, but it still needs to function as a computer. Now that the actual functionality of the Mac is a priority again, they are selling better.
Reminiscent of the Volkswagen dieselgate situation where emissions are fine during the EPA test drive but horrible otherwise. This sort of thing will probably haunt us forever, in various forms.
Attach an LG Ultrafine 5K and you can fry an egg on your €3,300 laptop and use the white noise from the fan to sleep well.
Attach the same monitor to both an M1 and an Intel Laptop, you can fry chicken on the latter.
Can you? My i5-10300H-equipped laptop drives two external monitors daily and the fans don't even spin up in normal use. Maybe specify which laptop?
The M1 can't even support two monitors plugged in directly, which is a joke for a machine this expensive.
> The M1 can't even support two monitors plugged in directly, which is a joke for a machine this expensive.
That is the case for the M1, which is the lower-end of the new processors – a bit disappointing because I'd've otherwise bought one, but a relatively uncommon scenario overall for a lower-end Mac.
It's not the case for the recently-launched chips, which support up to 2x6k external displays (M1 Pro) or 3x6k + 1x4k (M1 Max).
Yes, correct, sorry maybe I should have specified.
For the MacBook Pros, definitely. Just try a non-4K (or FHD) display to trigger the discrete GPU and watch the fans spin up. Any ultra-wide screen will do.
I've had dual-head on 12- or 13-year-old laptops with no problem (not HiDPI, which wasn't a thing then, obviously, but FHD or WQHD). Currently doing the same with an i5-8350U; it runs passively most of the time.
My M1 Mac mini supports two monitors plugged in directly. I know you mentioned “laptop”.
OT: It's nice that DisplayLink connections let an M1 machine have unlimited displays, and that the performance of their driver under Big Sur and above has been fantastic for me for the past half year with 2 extra displays. (Four total; one is vertical.)
The new MacBooks from a few weeks ago can, thanks to Apple putting some effort into the GPU. I think they're talking about the Intel MacBook Pro, which had meh cooling from what I hear. (Although if you want to see bad cooling, try out a Surface. Throttle city after anything strenuous.)
> if you want to see bad cooling, try out a Surface. Throttle city after anything strenuous
Implying Apple's cooling is any better? Both build difficult-to-repair devices tuned only for looks. Whether it's the MacBook that literally doubles its performance if you build a custom cooler that prevents throttling, or the Surface Book that you can't use at full performance for long because the charger can't deliver enough power to sustain it.
Nobody can cheat thermodynamics, and attaching "Pro" to the name doesn't make it go faster either. I just hope I can keep buying "normal" laptops. They're ugly and noisy, but at least I know they can be fixed without melting off the keyboard or the repair guy telling me it's water damage when it isn't.
Do you have a discrete GPU on that laptop?
Intel integrated GPUs are often set up to use main memory instead of dedicated GPU memory (as for the Iris graphics). I wonder if that makes a difference, though it is on the same package.
At least it's white noise. Tonal is far worse :D
Yes, my 13” 2016 Pro has this problem and only has two left side ports. It chews through batteries about once every 18 months. It’s been really annoying.
Maybe they fixed this issue in the time between 2016 and 2019.
I noticed my 2019 MacBook Pro was physically hot to the touch even after being idle for 8 hours at night. Now I unplug my USB-C monitor and USB-C-to-USB-A adapter at the end of the day, and that has solved the problem. When I come to the machine in the morning, it is nice and cool, and I plug in my peripherals again. It seems Apple has noticed this sub-par "pro" performance and moved back to the MagSafe charger in the latest Pro models. Also, for the cynical among us, I would like to point out that Apple has fixed every issue I've had with a MacBook for free. Don't assume they won't fix it until you take it to the Apple Store and ask.
Was there an Apple Store response for the overheating / overnight-charging problem?
I didn't ask them about the heat issue since I attributed the issue to my cheap non-Apple adapters. Issues they've fixed for me for free: broken back-plate on the old 2010 white unibody macbook, as well as a stuck key on keyboard of 2018 Macbook Pro.
> I noticed my 2019 MacBook Pro physically hot to touch even after being idle for 8 hours at night.
My late-2020 MacBook Pro also stays warm (even hot, sometimes) while supposedly sleeping. It can also no longer charge on the right side, that just stopped working. Had to rearrange my desk to satisfy the power bugs on the laptop.
Quality has gone way down, I don't quite see myself ever getting a Mac laptop again, after being on OSX laptops ever since OSX 10.0 came out.
Strange. The proprietary software you run after you perform a repair has a specific port requirement and you can only use the left side ports. I always wondered what was so special about these ports.
How does something like this happen without QA knowing and Apple fixing it?
I mean, it happens just from charging it, and that's something all users will do :S
Sounds a lot like the antennagate
They spotted it and made the trade off that it wasn’t worth fixing
Or alternately, spotted it and decided to make their own chip.
Or no one has ever submitted a bug report. :-)
https://tidbits.com/2020/06/17/how-to-report-bugs-to-apple-s...
Previous discussion: https://news.ycombinator.com/item?id=28733467
Also, why is that submission now marked as a dupe? It doesn't even point to the one it's supposed to be a dupe of.
I have set up several macs to spin up the fans more aggressively. Subjectively this seems to help a lot.
And the noise is not much worse, because eventually they spin at full blast anyway; just a bit too late, when presumably thermal throttling is already hitting.
Yep, started using smcFanControl in 2008 due to Jobs' insistence on silent, horrible stock fan curves. Only stopped because M1 minis don't need my input on fan speed.
Before I found this solution, my MBP would basically hang up with the fan going crazy whenever I'd have an MS Teams call. It still feels ridiculous when it happens and I switch plugs. It feels like such a strange thing to have to do on a pricey device like this.
And here I thought this was about charging electric cars: "Audi thoughtfully put charging points in both fenders, but only the right one is suitable for high-speed use." -- https://www.cnet.com/roadshow/reviews/2022-audi-rs-e-tron-gt... (Picture caption, about pic #4 from the top.)
Earlier this year, my 2017 MacBook Pro suddenly started freaking out and freezing when using the camera or screen sharing: the fans would go nuts and the CPU meters were completely loaded. After reinstalling and upgrading the OS with no luck, it occurred to me to check for dust. Indeed, there was much dust, and after a good proper clean all is normal again. I used compressed air for best results.
Non-obvious debugging involving a mental model of the physical world reminds me somewhat of “the case of the 500 mile email” [0]
This is the reason I have a couple of monitors lying uselessly on my table.
Every time I connect an external monitor, it makes the GPU go crazy and overheats the MacBook enough to summon this kernel_task demon.
I couldn't comprehend why one of the best laptops on the planet can't handle one external monitor.
When you attach an external monitor, that switches from using the Intel integrated GPU to the AMD discrete GPU which uses a lot more power.
On my HP Envy x360 (2020, AMD Ryzen) I get the same problem if power comes through a USB-C hub. My thermals are 10 degrees higher when charging through USB-C than when charging through the barrel connector.
I run a DisplayPort hub connected to an M1 MacBook Pro and I noticed it sometimes crashes when plugged into the back port. So I have to keep it in the front port.
At least Apple realized USB-C only was a stupid idea after a couple of years and reverted...
Apple hardware quality and the design decisions have gone to shit since Jobs died. The folks that were fed up with the hype of his return to a dying company (and with good reason, since the company really improved) are the ones keeping the company alive... In my opinion, unless something changes, it won't take long for it to become the dying company it used to be... They have a great "body" but they lack the "heart" (and apparently the brain as well).
Yeah, Apple hardware may be worse than it used to be. But as far as I can tell, it's still at the top. Can you point out a company that consistently makes better hardware than Apple does?
Literally any random gaming laptop on Amazon probably won't have any of these problems, while also having ethernet ports etc.
They'd also cost half as much and let you upgrade the disk and RAM.