Technical milestone reached: global Earth system simulations with 1.2 km resolution
Topic-relevant short sci-fi on Earth simulation: "I don't know, Timmy, being God is a big responsibility" https://qntm.org/responsibility
Also relevant and mind-expanding essay:
Simulation, Consciousness, Existence
Hans Moravec, 1998.
https://frc.ri.cmu.edu/~hpm/project.archive/general.articles...
Thank you for sharing that essay. I found it fascinating.
Devs is a great mini-series that has similar themes.
Also IIRC, Devs is loosely based on the above linked short story.
So great. “Uh-oh” for me was one of those Lovecraftian moments of unraveling.
Previously here on HN: https://news.ycombinator.com/item?id=7267811
I'm curious about the implementation language. Viewing the code is not easy.
If you guess Fortran, you might be right:
(different ICON Project) "The infrastructure, ICON-Land, for this ICON-A land component has been newly designed in a Fortran2008 object-oriented, modular, and flexible way."
https://mpimet.mpg.de/fileadmin/publikationen/Reports/WEB_Bz...
Fortran alive and kicking:
Most of what they do in the climate simulation arena uses Fortran, and this case is no exception, the reasoning being that other languages would be less efficient, and thus more climate-endangering emissions would be produced. Plotting and data wrangling is another story.
They don't use FORTRAN because of efficiency or emissions reductions lol, they use FORTRAN because they always did and the sort of scientists who do modelling rarely see any reason to upgrade their skills or tools.
Programming is something scientists tend to study only as far as needed to get results that look right. This is how the most influential COVID model ended up being a 15,000-line student-quality C program with hundreds of single-letter global variables.
They definitely stick to it, and it may be that they build convenient narratives around it. Mind you, the energy use of climate simulations is not trivial; do you think such a machine could be fed from a typical electrical installation in a regular office?
Fortran is no longer in ALL CAPS.
When will people learn...
I had hoped for Julia.
If the margins are so tight that 1 day of compute can only produce 2.5 days of simulation, why lose even a small margin with Julia?
How do you know they would lose a significant margin with Julia?
There were talks about it in the climate modelling arena, and a figure of around 10% slower was mentioned in some discussions.
Well, if that's not hard data!
Can you please specify what you mean?
I believe the parent was trying to say that the information provided is anecdotal, and that they would prefer hard data. They were doing this in an exasperated tone--the full version of their expression would be some version of "If that isn't hard data, I don't know what is!", and is meant ironically in this context.
Got it, thanks for the input! To the best of my knowledge, there are no published results showing this comparison in quantitative terms.
For reference: https://clima.caltech.edu/
The article talks about 1.2km horizontal resolution. Is it a 3d grid? If so what is the vertical resolution? Or is the vertical dimension integrated within 2d boxes?
Yes, these are fully 3D simulations. The vertical resolution is slightly harder to describe because models like these typically use more complicated vertical coordinates than just "meters above ground". E.g. a technical article on the ICON-A formulation [1] says that it uses a "hybrid sigma" coordinate - so something that follows a combination of pressure and terrain. For a very high-resolution simulation that resolves clouds, you'd need to crank this up to O(100 m) in the bottom of the model, but it can space out as you get higher in the atmosphere.
[1]: https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2017MS00...
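If it helps to make "hybrid sigma" concrete: pressure at model level k is essentially p_k = a_k + b_k * p_surface, so levels hug the terrain near the ground (b close to 1) and relax toward pure pressure surfaces aloft (b close to 0). Here is a rough Python sketch with invented coefficients, not ICON's actual vertical grid:

    import numpy as np

    # Hypothetical hybrid-sigma coefficients for 10 levels, top to bottom.
    # A real model reads these from its vertical grid definition.
    a = np.linspace(20000.0, 0.0, 10)   # Pa; dominates aloft
    b = np.linspace(0.0, 1.0, 10)       # dimensionless; dominates near the surface

    def pressure_levels(p_surface):
        """Pressure (Pa) at each model level for a given surface pressure."""
        return a + b * p_surface

    print(pressure_levels(101325.0))  # sea-level column
    print(pressure_levels(70000.0))   # high-terrain column: lower levels follow the terrain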
How does such a simulation work?
E.g. how do you predict the temperature at x, y? Is it ground type, water, sand? Altitude? Neighboring values thereof?
What are the inputs? Do you give it a starting point and apply it to a bunch of elements like some giant automata like game of life?
Some kind of finite element analysis thing?
So many questions.
Here is a very brief and basic introduction to numerical simulation:
Simulation of dynamic systems is a big, deep area. In general you use what is called numerical simulation, where you have a model describing your system in the form of partial differential equations.
You start with the chosen initial conditions, choose a delta-t as your time increment, and solve the equation for those inputs. That result is the input to the next iteration.
The most basic algorithm for stepping such an equation forward in time is the forward Euler method, but no one actually uses that in production; they use many more advanced methods. But if you are learning, that is where you start.
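To make that concrete, here is a minimal sketch of forward Euler applied to a toy cooling equation dT/dt = -k(T - T_env). All constants are made up, and real models replace both the equation and the integrator with something far more sophisticated:

    # Forward Euler for dT/dt = -k * (T - T_env): the crudest possible time integrator.
    k, T_env = 0.1, 15.0   # made-up constants for illustration
    T = 30.0               # chosen initial condition
    dt = 0.5               # delta-t, the time increment

    for step in range(100):
        dTdt = -k * (T - T_env)   # evaluate the model equation at the current state
        T = T + dt * dTdt         # the result becomes the input to the next iteration

    print(T)  # approaches T_env as the simulation marches forward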
This approach has advanced greatly over the last 70 years. Doing numerical simulation is why early computing work got funding: to simulate nuclear reactions inside bombs.
Now numerical simulation occupies all the world's top supercomputers. It's used for climate simulation, bridge strength, how skyscrapers flex in the wind, testing car crashes, or even simulating the strength of ceramics. Oh, and it is used a lot in financial simulations to model risk and calculate the price of assets.
Right now I'm playing with simulated weather systems using an automaton for each grid location at an effective resolution of a square km. I'm getting predictive real-world accuracy within around 10 degrees C with a range of 2 days. Very rough. It takes a long time to simulate a globe, which I've found is really important to do; a limited region is usually not as useful.
It's an interesting field, but it seems not so easy to find the real methods used by the bigger models.
(I'm not using a supercomputer...)
Such simulations usually consist of three major systems (atmosphere, land, and ocean) that are coupled together at their geometric boundaries by a coupler that 'communicates' values like temperature from one domain to another. The coupler is needed because of different grid geometries, time step size differences and other aspects.
You initialize the system at some known state (i.e. set the temperature, pressure, etc. at all grid points to real-world measurements) and then integrate a complex differential equation for the next time step, and so forth. So it is not like an automaton. Finite element analysis comes closer, but I think these models use a different scheme, such as finite volume methods.
A lot of insight can be gained from this paper: https://pure.mpg.de/rest/items/item_3379183/component/file_3... The first 10 pages should give you a rough overview.
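As a toy illustration of the coupling idea only (not how ICON's coupler actually works), here is a sketch in which an "atmosphere" and an "ocean" are stepped with different time steps and exchange a boundary temperature at fixed coupling intervals; all numbers are invented:

    # Toy coupled system: atmosphere and ocean exchange a boundary temperature.
    # A real coupler also regrids fields between different meshes.
    T_atm, T_ocn = 10.0, 20.0
    dt_atm, dt_ocn = 1.0, 4.0      # the ocean takes fewer, larger time steps
    coupling_interval = 4.0

    t = 0.0
    while t < 100.0:
        for _ in range(int(coupling_interval / dt_atm)):      # atmosphere substeps
            T_atm += dt_atm * 0.05 * (T_ocn - T_atm)          # relax toward the ocean surface
        for _ in range(int(coupling_interval / dt_ocn)):      # ocean substeps
            T_ocn += dt_ocn * 0.01 * (T_atm - T_ocn)          # relax toward the atmosphere
        t += coupling_interval

    print(T_atm, T_ocn)  # the two components drift toward a common equilibrium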
> Do you give it a starting point and apply it to a bunch of elements like some giant automata like game of life?
Roughly speaking, yes. Divide everything into a grid of cells. Model a cell's state with a bunch of numbers, and apply some rules to update each cell's state from its neighbors. The trick is figuring out the update rules. One needs to write the differential equations first, incorporating all relevant physical processes, and then transform those equations into update rules, which is way more complex than with the Game of Life.
Though it may be even more complex than that: different time steps at different points in time, or a changing grid of cells that adds detail in areas where much is going on, at the cost of slowing down the simulation. Most of the complications are due to the limited abilities of our computers: the idea is to get more precision while calculating less.
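A minimal sketch of "Game of Life, but derived from a PDE": a 2-D diffusion update in which each cell's next temperature depends on its neighbours. The coefficients are arbitrary, and a real atmospheric model adds advection, radiation, moisture, and much more:

    import numpy as np

    # Tiny periodic 2-D grid of temperatures; the update rule is a
    # finite-difference discretisation of the heat equation.
    T = np.full((50, 50), 15.0)
    T[20:30, 20:30] = 30.0        # a warm patch as the initial condition
    alpha, dt = 0.1, 1.0          # made-up diffusivity and time step

    for step in range(200):
        laplacian = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
                     np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T)
        T = T + dt * alpha * laplacian   # each cell updated from its neighbours

    print(T.max(), T.min())  # the warm patch spreads out over time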
giant partial differential equations
They are simpler than you think.
Try proving that their solution is unique.
Simple does not mean trivial.
If only there was a way to find out...
I know, right? But together we can still dream.
This is fantastic - would be really exciting to see this level of resolution making its way to global operational weather forecasts (currently at ~10km).
It's far too expensive with dubious impacts on forecast quality. Adaptive mesh approaches are far more suitable for high res global weather modeling... Why simulate the area under boring, dynamically unimportant areas?
> Why simulate the area under boring, dynamically unimportant areas?
Butterflies... terrifyingly large, kilometer-scale butterflies.
Which is *exactly* why ultra-high resolution global weather simulation has dubious prospects for improving forecasts. When you're at spatial scales where you need to parameterize convection, there's an inherent "smoothness" to model solutions that suppresses noisy errors. If you go to cloud-resolving scales - which is needed for simulations like the ones here - you don't get the benefit of that smoothness anymore, because you need to actually resolve scales of motion that are incredibly fine. It's a losing proposition; you'll never get it "perfect", so you're much more likely to spin up an error cascade with significant impacts on forecast down the line, through things like the structure of organized convection.
But dynamically uninteresting, quasi-balanced setups and modes? There's far less to worry about in terms of the butterfly effect, and any errors you might worry about will be dwarfed by the fact that we don't have good data to assimilate in places like the remote oceans anyways.
It's also worth pointing out that the mathematics of error/perturbation growth in the atmosphere are well understood. In fact, this fundamentally underpins how we've developed data assimilation approaches over the past two or three decades that allow us to effectively leverage new datasets, such as satellite data, to increase forecast quality and reliability at longer lead times. So it's somewhat trivial to actually directly quantify these "butterflies."
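For anyone who wants to see those "butterflies" quantified, here is a minimal sketch using the classic Lorenz-63 toy system (not a weather model, and a crude integrator at that): two runs differing by 1e-9 in one initial value diverge completely after a while, which is exactly the error growth that ensembles and data assimilation are built to manage.

    import numpy as np

    def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        """One forward-Euler step of the Lorenz-63 system (crude but sufficient here)."""
        x, y, z = state
        return state + dt * np.array([sigma * (y - x),
                                      x * (rho - z) - y,
                                      x * y - beta * z])

    a = np.array([1.0, 1.0, 1.0])
    b = a + np.array([1e-9, 0.0, 0.0])   # the "butterfly": a tiny initial perturbation

    for step in range(1, 3001):
        a, b = lorenz_step(a), lorenz_step(b)
        if step % 1000 == 0:
            print(step, np.linalg.norm(a - b))   # the error grows roughly exponentially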
If we're ever going to get to the femtometer resolution required for very precise 100 day weather forecasting, we have to start somewhere, so let them waste their time. It's not as though this is part of a growing trend to abandon conventional weather and climate modeling.
Why do you think that we need "femtometer resolution" for "very precise" 100 day weather forecasting? What even is "very precise" 100 day weather forecasting? I think it's very amusing to do the math on how much memory would be required to run a crude primitive equation dycore over even the tiniest of domains at femtometer resolution :)
> It's not as though this is part of a growing trend to abandon conventional weather and climate modeling.
The thing is, there *absolutely is* a trend towards private investment in weather modeling going towards faux-moonshot ideas like cubesat constellations without demonstrated ROI and that would require evolutionary leaps forward in data assimilation, or for deep learning to replace weather models. A miniature version of this already played out with precipitation nowcasting - probably the easiest weather forecasting problem that you could approach with an AI system, yet the approaches that have been developed so far barely improve over optical flow or other simple approaches, let alone advance our capability to forecast, say, convective initiation.
The future of weather forecasting is larger ensembles (O(100-500) ensemble members, across 2-5 different models) of near-convective-resolving global models at meso-gamma (2-10 km resolution) fed into slightly more sophisticated statistical post-processing systems - almost certainly trained using simple AI/ML techniques on large-scale reforecasts of these parent model systems, or brute-forcing purely Bayesian statistical approaches.
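For a sense of what "simple statistical post-processing trained on reforecasts" might look like in its most stripped-down form, here is a sketch that fits a linear correction from past ensemble-mean forecasts to observations and applies it to a new forecast. The data are invented, and operational systems (MOS, EMOS, and friends) are considerably more elaborate:

    import numpy as np

    # Invented reforecast archive: ensemble-mean forecasts and matching observations.
    rng = np.random.default_rng(0)
    forecasts = rng.normal(20.0, 5.0, 500)
    observations = 0.8 * forecasts + 2.0 + rng.normal(0.0, 1.0, 500)  # a biased model

    # Fit a simple linear bias correction on the reforecast archive.
    slope, intercept = np.polyfit(forecasts, observations, 1)

    new_forecast = 25.0
    print(slope * new_forecast + intercept)   # calibrated value for a new forecast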
> Why do you think that we need "femtometer resolution" for "very precise" 100 day weather forecasting?
Due to sensitive dependence on initial conditions. Even using measurements at meter resolution will cause the accuracy of a forecast to begin to break down after only a few days.
> What even is "very precise" 100 day weather forecasting?
Anywhere from accurate to exact.
> I think it's very amusing to do the math on how much memory would be required to run a crude primitive equation dycore over even the tiniest of domains at femtometer resolution
And Bill Gates thought 640K should be enough for anybody. Do you really think computers will only have a few GB of memory 50 years from now?
> there absolutely is a trend towards private investment in weather modeling going towards faux-moonshot ideas like cubesat constellations without demonstrated ROI and that would require evolutionary leaps forward in data assimilation, or for deep learning to replace weather models
This straw man does not exactly demonstrate that conventional weather and climate modeling is being abandoned anytime soon. If the unconventional private investments aren't profitable, the market will deal with them.
> The future of weather forecasting is
much like the local weather, impossible to predict with any accuracy years into the future, and yet the tools used to measure it are consistently getting more accurate, cheaper and smaller. Maybe, like bottle-openers, weather sensors will start appearing superfluously on everything. The more widespread the measurements, the more data describing initial conditions, the better the forecast will be at any interval.
* You don't have the density of sensors to see even at a meter resolution, to say nothing of femtometers. Actually, in the oceans you don't have many sensors at all, and most data come from satellite observations that give rather indirect information. Even if you had the capability to compute a model at femtometer resolution, I don't see how much of it you would be able to fit to observations.
* It's pretty hard to predict weather for 100 days, because you would also need to predict many other events in the future: forest fires, volcano eruptions, and many kinds of human activity that also affect weather. However great your fluid dynamics models are, and however well they can predict a future state from today's state, they won't help with that.
> You don't have the density of sensors to see even at a meter resolution, to say nothing of femtometer.
At the moment, no. But maybe there will be a way to accurately sense weather data at any location from LEO. IR thermometers are pretty neat; maybe something metaphorically along those lines: a satellite with laser technology that could beam back accurate weather data from any location, and from all atmospheric locations it can see along its orbit, sending the data to ground-based ultra-computer networks running simulations.
In 1933, no one would have believed that GPS was 40 years away. In 1985 most would not have been able to understand how flat and thin color monitors were only a decade away, nor that mRNA vaccines were less than 30 years away. Similarly, we really don't know what the future of weather sensing and prediction will be like in 2070, and if we could know, we wouldn't understand how it would be possible.
> Due to sensitive dependence on initial conditons. Even using measurements at meter resolution will cause the accuracy of a forecast to begin to break down after only a few days.
That's an extremely simplistic take on things. In reality, one of the largest issues with high-resolution weather forecasts (1-3 km scale, convection-permitting simulations) is the fact that small errors in the initialization or model dynamics lead to changes in small-scale storm structure that feed back onto larger scales of motion, disrupting the forecast. Ultra-fine measurements and simulation resolutions only exacerbate this tendency.
> Anywhere from accurate to exact.
You didn't answer the question. Are you trying to predict convective initiation at 100 days lead time? Are you trying to predict a particular synoptic system? Are you trying to predict whether or not it will be warmer than average or not? These are vastly different weather prediction problems which require different approaches.
> And Bill Gates thought 64K should be enough for anybody. Do you really think computers will only have a few GB of memory 50 years from now?
Modern weather and climate modeling is already a tera- or peta-scale endeavor, depending on exactly what one is trying to do. The sorts of simulations alluded to in the OP push into the exascale.
As other commenters have noted, your odd choice of femtometer (10^-15 meters) would lead to memory requirements larger than the number of atoms in the real atmosphere.
> This straw man does not exactly demonstrate that conventional weather and climate modeling is being abandoned anytime soon. If the unconventional private investments aren't profitable, the market will deal with them.
Of course it does. The age of heterogeneous compute for weather/climate models is just beginning, yet you do not see NVIDIA optimizing NWP systems to run on GPUs or Google porting them to run on TPUs, do you? Instead, you see these organizations pursuing AI/DL, while core NWP development is limited to federal research labs and agencies, but they are increasingly struggling to attract developer and research scientist talent to pursue these activities.
This is a very real challenge that is frequently talked about within the weather community in the United States. I'd hazard the guess that you are not a member of this community?
> much like the local weather, impossible to predict with any accuracy years into the future, and yet the tools used to measure it are consistently getting more accurate, cheaper and smaller. Maybe, like bottle-openers, weather sensors will start appearing superfluously on everything. The more widespread the measurements, the more data describing initial conditions, the better the forecast will be at any interval.
There is virtually no data assimilation technology to support the ingestion of the vast majority of these data, and we do not even run weather models with suitable configurations to take advantage of them if we had the DA support in the first place. And, as I've mentioned repeatedly, not every measurement leads to an improvement in forecast quality. This is simply _not_ the low- or even high-hanging fruit regarding improvements to weather forecast quality and impact.
I've worked in this exact domain of developing novel weather sensing and observation systems and leveraging them to try to improve forecast quality - across federally-funded research and more than one private company over the past ten years - and it's mostly a fool's errand. If one wants to develop improved, impactful, useful weather forecasts, this is not the path to pursue.
You want to know the precise shape of the Earth's surface in femtometer precision?
There are some profound problems with that idea once you get below 10 meters or so, but I'll let you think that one through yourself.
No, but I wouldn't mind weather measurements for every cubic femtometer of the lower atmosphere and a fast enough computer with enough memory to crunch the data and accurately report what the weather will be like on 29 February.
Are we thinking about the same femtometer? 10^-15 of a meter?
https://en.wikipedia.org/wiki/Femtometre
I mean you can’t even fit a thermometer into a cubic femtometer..?
I think Maursault has thoroughly demonstrated their lack of serious thought or reading on the subject. But just for giggles and for the casual reader, the lattice spacing of silicon is 200,000 femtometer. So if you encode only one bit per cube of this fm cubic lattice, and you manage to encode this into single atoms of silicon, you need a volume of silicon 8,000,000,000,000,000 times larger than the system you model.
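For what it's worth, the 8,000,000,000,000,000 factor is just the cube of the ratio between that lattice spacing and one femtometer. A quick sanity check, taking the quoted 200,000 fm figure at face value:

    lattice_spacing_fm = 200_000      # quoted silicon lattice spacing, in femtometers
    print(lattice_spacing_fm ** 3)    # 8e15: femtometer cells per one-atom volume of silicon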
> I think Maursault has thoroughly demonstrated their lack of serious thought
When you can't beat the argument, pull out the ad hominem fallacy and attack the man. Fallacy, of course, is faulty reasoning.
> So if you encode only one bit per cube of this fm cubic lattice, and you manage to encode this into single atoms of silicon, you need a volume of silicon 8,000,000,000,000,000 times larger than the system you model.
This explanation is indicative of linear thinking. Apparently Google Earth is not possible, as it would require a computer the size of the planet. Digitizing the Library of Congress apparently requires a memory stick the size of Congress. Seriously? You just can not comprehend how things could ever get better than your current understanding of how things are right now? Consider that if you lived in 1500BC, were an expert at the time in farming, and a plough was described to you, you would mock the person describing it, and insist that tilling soil was impossible.
Pointing out that your thinking exhibits a distinct lack of understanding what you are talking about is not an ad hominem, it's a relevant statement of fact.
And your second paragraph is amply demonstrating this. I pointed out the physical implications of encoding your femtometer cubes at atomic scale. Nothing more. Encoding the Library of Congress has nothing to do with that. You are proposing to simulate at subatomic scale, so obviously encoding it into atoms will make the simulation larger than the object simulated.
To engage with your argument directly: You have none. All you repeat is that the past has seen technological breakthroughs, therefore the specific fantasy you propose makes sense. Non sequitur. That some breakthroughs have happened doesn't mean that any random breakthrough will happen. And your ideas are pushing hard against the limits of physics.
> Pointing out that your thinking exhibits
Incorrect. Any statement concerning the arguer and ignoring their argument is ad hominem and fallacious argument.
> And your second paragraph is amply demonstrating this.
Incorrect and a tu quoque argument. I did not address you, personally, but only your argument, and if my use of "you" is confusing, it is the royal you, "you all," and I may as well have used "we."
> I pointed out the physical implications of encoding your femtometer cubes at atomic scale.
Using current understanding of how it would have to be done and in denial that it might ever possibly be done more efficiently in the future.
> Nothing more.
>>> I think Maursault has thoroughly demonstrated their lack of serious thought
other than this ad hominem.
> You are proposing to simulate at subatomic scale, so obviously encoding it into atoms will make the simulation larger than the object simulated.
and this straw man
> To engage with your argument directly: You have none.
but wait while you prove yourself wrong,
> All you repeat is that the past has seen technological breakthroughs,
Yes, my argument is that technology advances, and since it has always done so, my crazy idea is that it will keep on doing so. Though it is possible technology has stopped advancing, I think it is unlikely.
> therefore the specific fantasy you propose makes sense. Non sequitur.
The specific fantasy is your straw man, and I drew no conclusions; those are yours. Frankly, my first comment was a joke and not meant to be taken literally, and I only intended to argue against someone else's idea that increasing simulation resolution is not the future of weather modeling. You're not the only one who got tripped up by my use of "femtometer."

Regardless, I still see it as possible territory, and nobody being able to conceive of how does not mean it is impossible, only that we can't conceive of it. Just as Bill Gates, not a stupid man, supposedly couldn't conceive of anyone needing more than 640 KB of RAM, you are unable to conceive of something that would require incredible, inconceivable advances in technology to achieve. And yet, within the next 5 years modern medicine will advance further than it has in all the years before; that's just how it is, and students of the history of technology know this the same way you know how many femtometers there are in silicon's lattice spacing.

What you seem unable to do is accept that there are things we don't understand. In physics, a good example is dark matter, which for some reason we haven't figured out a way to detect. It is so similar to the luminiferous ether in this regard that only students of the history of science suspect it might be bullshit, while every scientific mind is convinced it exists, just like the luminiferous ether in 1886.
> That some breakthroughs have happened doesn't mean that any random breakthrough will happen. And your ideas are pushing hard against the limits of physics.
As far as you or anyone else knows, today. But we don't know the future, and we never have. This is not my proof that this will occur, only that we don't know.
What is the smallest possible width of a photon? How do IR thermometers work without fitting anything inside the thing being measured? You just never know how it might be done.
There's a very good physical argument that this is impossible. If you want to store 1 bit per cubic femtometer simulated, at current computer sizes we are talking about a computer billions of times the size of the earth. Even if you use 1 atom per bit, your computer will still be almost as big as the earth. Such a computer would collapse under its own gravity.
> at current computer sizes
This. No, not at all at current computer sizes, but at future computer sizes. This is the same mistake someone in the 1970s might have made about billions of people having a smartphone (a supercomputer by their standards) today. Consider how everything at current computer sizes is effectively two-dimensional; even stacked processors are still fundamentally 2D designs. There is still a lot of computing advancement ahead. 40 years from now they'll look back and think the same things we think when we look back 40 years: that the machines were so primitive hardly anything could be done with them. Some will be nostalgic for them and talk about their strengths, while others will shake their heads and think even messing with the fastest workstation today is a waste of time. Just because we can't conceive of how doesn't mean it's not possible, some day.
do the math at 1 atom per bit.
It's extremely unlikely that we'll ever get anywhere near that. Even meter precision is impractical.
Unless you know it to be physically or logically impossible, you could not really know how likely it is or isn't. Ask anyone in the mid-1970's how likely it is that billions of people would be walking around with a supercomputer in their pockets, and they'd come up with all sorts of reasons why it was extremely unlikely, such as no individual would ever need so much computing power. The practicality of the precision only depends on the ability to measure and the ability to manipulate and simulate large amounts of data, both of which are extremely likely to get better, and better faster and faster, as time and technology progresses.
A femtometer is a few orders of magnitude smaller than an atom. There are about 2^140 atoms in the atmosphere. You can't even count to that number, let alone do any fluid dynamics on that. I'm confident that we won't have femtometer-scale simulations of the atmosphere before the sun becomes a red giant and swallows the earth.
> A femtometer is a few orders of magnitude smaller than an atom.
Thank you for telling me what a femtometer is, as though my using the word wasn't a pretty good indicator I knew what it was. You mean a femtometer is a real thing? And I just made that up out of thin air to mean a meter stick to give to women. What are the odds?
> There are about 2^140 atoms in the atmosphere. You can't even count to that number, let alone do any fluid dynamics on that.
Would you like me to explain how your argument is a straw man, or can I trust you to figure it out?
> I'm confident that we won't have femtometer scale simulations of the atmosphere before the sun becomes a red giant and swallows the earth.
Very colorful, but all you're really saying is that you are pessimistic about technology and about any staggeringly large advancements in computer design or weather sensor tech, while I, otoh, optimistically say I just don't know, but I bet computers will get faster, smaller and cheaper, and that within only a hundred years there will be weather tech that we are incapable of conceiving of today.
Sure, a femtometer is mind-bogglingly small, but it's only 15 orders of magnitude smaller than a meter. It's way bigger than a zeptometer. How is it even possible femtometers can be described so simply? But of course, there could never be any more advancements in mathematics, physics, computer engineering or our current understanding of weather and climate. We basically know all there is to know right now. Huh.
Maybe read https://arxiv.org/pdf/quant-ph/9908043.pdf before you think about computers operating on 2^140 objects, and try to be less aggressive in your comments.
Are numbers not objects? Is scientific notation not a way of expressing numbers that are too large or too small to be conceived or expressed in decimal form? Does a large database keep every object and every bit of data stored in it in RAM? How is it possible a fractal can be rendered in parts of the whole? Even older computers ordinarily can operate on 2^140 objects.
Maybe try to ignore who is saying what, and focus only on what was said.

    user@decadeoldcomputer:~$ echo 2^140 | bc
    1393796574908163946345982392040522594123776

No computer can operate on 2^140 objects in any meaningful way, because no computer can even remember whether it's already done with one of the objects or not. Your example operates on a single object: a number.
Your response is a no true scotsman fallacy while also moving the goalposts, though what you've described is either a single-user single-task operating system, or you are conflating the limitations of current conventional single computer technology, which has already been solved with conventional clusters as well as quantum computing, and probably also with modern GPUs. Parallel computing has existed at least since the mid-1960s.
You apparently didn't even skim the paper I linked.
Your comment here and suggestion before, and that entire comment, is ad hominem: "ur dum! Go read a book! LOL! Stop disagreeing with me, I don't know logic!"
Not everything that disagrees with you is ad hominem and not knowing something is not a shame. But it does not make for good discussion if you're not engaging with the arguments.
Fallacies don't require a response, yet I have entertained them as much as I have ignored them, such as this straw man. The other comments mentioned in my GP were ad hominem, though they didn't have to be phrased in a way that made them so; they could instead have argued against my argument and cited reasons, rather than focusing on what I should do. One must speak to the argument, not to the man: "your facts are in error and your argument flawed because x, y, z," rather than "you are wrong and don't know what you're talking about, and I know this because I am an expert." Fallacy is not too difficult to avoid, but also easy enough to be trapped by.
Uh, that’s 140 objects (the number of bits), not 2^140 objects. Rookie numbers.
No, they are 1393796574908163946345982392040522594123776 numbers.
1 is a number. 2 are two numbers (1 + 1). 3 are three numbers (1 + 1 + 1). ∴ 2^140 are 1393796574908163946345982392040522594123776 numbers. Quod erat demonstrandum.

Oops, you forgot to count 1.5
Relatedly, last night I tried to think about sqrt(2), but I couldn't get up that high! I got stuck trying to remember the 43rd irrational number after zero.
It's an 8. Just remember 718-753-7694 is an actual Staten Island phone number.
And the butterflies are full of hate.
As I have reluctantly learned today.
At least three industries will be grateful for high-res wind predictions: aerospace, maritime and wind-generation.
Is this for short-term weather prediction or long-term climate modelling? The 2.5 simulated days per day points in the short-term direction.
As mentioned in the article, the system is based on the ICON Earth System model[1], which has the following description:
The Earth system model provides a numerical laboratory for research on the climate dynamics on time scales of a season to millennia. Necessarily most processes are parameterized to allow the computationally efficient integration over long periods.
It's also mentioned it will contribute to DestinE[2]:
Destination Earth (DestinE) aims to develop – on a global scale - a highly accurate digital model of the Earth to monitor and predict the interaction between natural phenomena and human activities. [...] The initial focus will be on the effects of climate change and extreme weather events, their socio-economic impact and possible adaptation and mitigation strategies.
[1]: https://mpimet.mpg.de/en/science/models/icon-esm/
[2]: https://digital-strategy.ec.europa.eu/en/policies/destinatio...
I am working on DestinE with workflow managers and posted my first entry in Who's Hiring some days ago. Lots of work and interesting challenges if anyone is interested in earth system models, NWP, workflows, HPC, GPUs, data formats, etc. Not only at my company, the BSC in Spain, but also at other companies in other countries as well.
I'm not sure the added precision is helpful if it's just 2.5 days ahead. An extra week of accurate forecasts is a bit pointless if it takes 8 days.
Simulating 8 days would take 8/2.5 = 3.2 days.
When you’re trying to assess imminent danger from an approaching tropical cyclone, days and hours count. A lot.
I wouldn't think that a temporal resolution measured in days is needed or all that helpful for climate modeling.
The real key of this story is in the "simulating — rather than parameterising".
This sounds like a lot less than Nvidia's FourCastNet?
https://resources.nvidia.com/en-us-fleet-command/watch-27?xs...
How do these two compare?
They're completely different classes of models.
The modeling system in the linked article is a high-fidelity numerical simulation of the coupled Earth system. It's a giant PDE solver for Navier-Stokes applied to the Earth's atmosphere and oceans, coupled together with a great deal of additional physics simulation. The intent is to reproduce, in simulation, the Earth's atmosphere and oceans with the highest fidelity. This set of simulations is the culmination of nearly 70 years of investment, going back to the very first applications of digital computers to solving complex math equations (one of the first simulations run on ENIAC was a crude quasi-geostrophic atmospheric model / weather forecast).
NVIDIA's FourCastNet, while very cool, is quite literally a facsimile of this type of system. It's really not even in the same ballpark.
You realize "facsimile" means "copy", right? You didn't explain how they are different.
It's a deep learning based emulator of a full-complexity NWP system, but it is far from a production-ready or operational technique.
It's an example of a surrogate model. It's an ML model trained on the output of large numerical simulations like the OP, rather than doing the simulation itself.
Surrogate models are nice because they can emulate the output of the full fidelity calculation in a fraction of the runtime, but they typically are trained within a range of validity outside of which they cannot reliably extrapolate.
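A minimal sketch of the surrogate idea, with an invented stand-in for the expensive simulation: fit a cheap regression on a limited set of simulation outputs, then query the regression instead of rerunning the model. Real NWP surrogates like FourCastNet are neural networks trained on huge reanalysis archives, but the range-of-validity caveat is the same:

    import numpy as np

    def expensive_simulation(x):
        """Stand-in for a costly numerical model run (purely illustrative)."""
        return np.sin(3 * x) + 0.5 * x

    # Train a surrogate on a limited set of "simulation runs".
    x_train = np.linspace(0.0, 2.0, 40)
    y_train = expensive_simulation(x_train)
    coeffs = np.polyfit(x_train, y_train, deg=7)   # a simple polynomial surrogate

    # Inside the training range the surrogate is cheap and close to the simulation...
    print(np.polyval(coeffs, 1.0), expensive_simulation(1.0))
    # ...but extrapolation outside the range of validity is unreliable.
    print(np.polyval(coeffs, 5.0), expensive_simulation(5.0))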
I have only one question. Does this mean that we are getting closer to a cloud-hosted ultimate map for games like Call of Duty and Ace Combat?
I want to dogfight over Ohio, land at Offutt to play Warzone in Omaha, then take a MRAP and drive to NY.
1:1 scale multiplayer world maps have existed for decades, e.g. in Microsoft Flight Simulator.
The difficulty of making a "large" map comes from what you want to simulate and in how much detail, not from how big it is per se.
This pretty much.
Elite Dangerous has an approximation of the entire Milky Way in an MMORPG, where you can visit individual planets of around 400 billion star systems. These are obviously "generated", except for maybe a few handcrafted systems like Sol.
The problem really isn't "size".
And sometimes you don't need to simulate. MSFS pulls in live METAR data, though previously they used Meteoblue forecast data.
Since I can't reply to the RoP thread, you said, "Is this about the brown-skinned male Elf and the brown-skinned and beardless female Dwarf? This horse has been beaten to death and back on YouTube and Reddit, and the consensus is that it's fine and faithful to the texts." This is absolutely ASININE in how inaccurate it is. The show is BARELY related to the works it is allegedly based on. And brown people aren't the problem, since Haradrim/Easterlings exist in Tolkien's Legendarium. Where is Celeborn? Why is Isildur around 1500 years before he was born? Why is Durin III Durin IV's father, when the dwarves only allow one person at a time to be named Durin due to their belief that each Durin is a reincarnation of the previous one? Why is Gil-galad able to pardon Galadriel (for killing orcs) when it was the Valar who banned her from Valinor, and she isn't pardoned until three thousand plus years later when she rejects the ring? Why is her motivation to get revenge when in the text her motive is to create her own kingdom to rule (kind of like Satan in Christian theology, which is why she's interesting, since her primary conflict is her own pride versus her own wisdom)? Speaking of wisdom, why is she a petulant, hot-headed teenager when she's thousands of years old? Why is she going around hunting orcs when what she was doing in Middle-earth in the Second Age was ruling various places and being immersed in Elvish politics (and of particular note, her issues with Celebrimbor and Annatar)? Why is she going around swinging her sword like an anime protagonist? She might be tall and athletic, but there is scarcely anything written about her in battle, save for her bringing down the walls of Dol Guldur (which typically is done with magic, or a wrecking crew, not an anime sword). Why is she attempting to swim from Valinor to Middle-earth? Why is she in Númenor when she literally never went there?
There's a LOT more I could ask about her, and that's just one character. This show is fan fiction LOOSELY based on the writings. VERY loosely.
That is all. I'll enjoy my ban now.
> throughput of 2.5 simulated days per day

Will need to be at least 10x faster to be operationally useful.
Is fixed resolution really the way to go? I'd imagine modeling big patches of ocean and desert as single nodes would be way more efficient without compromising fidelity.
Although final output in those regions is perhaps less relevant, their state would still have an impact on other areas as the simulation progresses.
Why did they target 1.2km instead of 1? Or any other number?
I guess it's best to ask the authors.
> Our ICON-ESM configuration is already used in production mode for scientific purpose with horizontal resolutions of 10 km, 5 km and 2.5 km. With the 1.2 km configuration we have now opened the door for a new class of numerical models which will allow us to investigate local impacts of climate change, such as extremes of precipitation, storms and droughts.
There is some evidence of them using 10 km cells and then subdividing by halves, which gets you down to 1.25 km (10 → 5 → 2.5 → 1.25).
The grid is not square, it's hexagonal; surely it relates to that.
How is this going to change seasonal-scale forecasts?
1.25 ≈ 1.2: round half to even resolves the tie without upward bias, which is presumably how a 1.25 km grid ends up reported as 1.2 km.