Julia 1.9

julialang.org

276 points by kristofferc 3 years ago · 215 comments

acalmon 3 years ago

I will go against the trend here and give a big thanks to the whole Julia team for all their wonderful work.

I've been a heavy Julia user for 4+ years and adore this ecosystem. I use Julia for parallel computing, modeling and solving large-scale optimization problems, stochastic simulations, etc. During the last year or so, creating plots and dashboards has become much easier too.

Julia makes it surprisingly easy to go from "idea" to "large-scale simulation". I've used it in production and just for prototyping/research. I can engage with Julia as deeply as I would with C code or as "lightly" as I engage with Matlab/R.

I'm excited to see what comes next.

brancz 3 years ago

My favorite change (even though it's not listed in the changelog) is that just-in-time compiled code now has frame pointers[1], making Julia code much more debuggable. Profilers, debuggers, etc. can all now work out of the box.

Extra excited that the project I happen to work on (the Parca open source project[2]) influenced this change [3][4]. Shout out to Valentin Churavy for driving this on the Julia front!

[1] https://github.com/JuliaLang/julia/commit/06d4cf072db24ca6df...

[2] https://parca.dev/

[3] https://github.com/parca-dev/parca-demo/pull/37

[4] https://github.com/JuliaLang/julia/issues/40655

aborsy 3 years ago

Matlab users should switch to Julia. It’s a real programming language, and better in many ways.

I provide the option of Julia in my tutorials. Students are lazy and don’t want to explore something new. Most of them stick with MATLAB.

What prevents MATLAB users from switching? The syntax is similar.

  • jasode 3 years ago

    >Matlab users should switch to Julia. [...] What prevents matlab users from switching? The syntax is similar.

    Choosing a programming language based on just comparing the language syntax only works for academic settings or toy projects for self-curiosity and learning. Once you consider adopting a language for complicated real-world industry usage, you have to look beyond the syntax and compare ecosystem to ecosystem.

    E.g. Look over the following MATLAB "toolboxes" and add-ons developed over decades: https://www.mathworks.com/products.html

    Julia doesn't have a strong equivalent ecosystem for many of those. In that MATLAB product list is Simulink. Tesla uses that tool to optimize their cars: https://www.mathworks.com/company/newsletters/articles/using...

    You can take a look at some of the 1-minute overview videos to get a sense of MATLAB toolboxes that companies pay extra money for: https://www.youtube.com/results?search_query=discover+matlab...

    It has add-ons such as a medical imaging toolkit, wireless communications (antenna signal modeling), etc. And MATLAB keeps releasing new enhancements that the Julia ecosystem doesn't keep up with.

    If one doesn't need any of the productivity tools that MATLAB provides, Julia becomes a more realistic choice.

    Or to put it another way, companies didn't really "choose the MATLAB programming language". What they really did was choose the MATLAB visual IDE and toolkits -- which incidentally had the MATLAB programming language.

    • tholy 3 years ago

      The real story is a bit more complicated. Yes, MATLAB has decades of extra development time, wonderful documentation, a much better debugger (I helped write Julia's debugger so I'm not being mean when I say that), and other advantages. But Julia has many advantages in some of the areas you cite. JuliaHub has JuliaSim, and while I've never used it, there are use-cases where it leaves Simulink in the dust. On medical imaging, Julia is overall more capable than Matlab in seamlessly flowing between multimodality/2D/3D images of large size: try, for example, doing lazy processing on large datasets as explained and demonstrated at https://youtu.be/x4oi0IKf52w?t=2257 (start watching at 55:20 if you just want to see the demo without the explanations leading up to it). Matlab's polish is not to be underestimated, and you're surely right that there are domains where it can't be beat, but there are also domains where Julia is a much better and more productive ecosystem.

    • stdbrouw 3 years ago

      The flip side to this is that Julia is a great general purpose engineering calculator and simulator. For example, calculating friction in hvac ductwork, voltage drop in long electrical circuits, solar gains for windows or solar panels facing various directions, cost/benefit analyses of thicker or thinner roof insulation and so on... these are all 1 to 10 lines of code so there isn't a big porting cost in moving to Julia and in exchange you get a fast, ergonomic and sane language with universal support for physical units [1] and unicode variable names [2] so things are much less verbose than they would be in, say, Python.

      In the past I used Calca [3] for these kinds of things, and there are many of these "fancy calculator apps" around, but it's just so much nicer to work in a real programming language.

      [1] https://painterqubits.github.io/Unitful.jl/stable/

      [2] https://docs.julialang.org/en/v1/manual/unicode-input/

      [3] http://calca.io/
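
      The kind of back-of-the-envelope calculation described above can be sketched in a few lines; here is a minimal, made-up example using Unitful.jl for a voltage-drop estimate (all numbers are illustrative):

      ```julia
      # Voltage drop over a long copper circuit, with the units carried
      # and checked automatically by Unitful.jl. Values are made up.
      using Unitful

      ρ = 1.68e-8u"Ω*m"   # resistivity of copper
      len = 50u"m"        # one-way cable length
      area = 2.5u"mm^2"   # conductor cross-section
      I = 16u"A"          # load current

      R = ρ * 2len / area           # round-trip resistance
      ΔV = uconvert(u"V", I * R)    # ≈ 10.75 V
      println(ΔV)
      ```

      Mixing units like Ω·m and mm² just works: Unitful does the conversion and raises an error on dimensionally inconsistent expressions.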

      • cookieperson 3 years ago

        Yea, but the language itself and its ecosystem are broken on a regular basis. Again, fine if you're an academic, not so great for industry.

        • freilanzer 3 years ago

          How?

          • cookieperson 3 years ago

            What do you mean, how? Something in Base changes, the author of a package didn't put strict enough compatibility bounds on their package, and now you can't use the package. It's especially great when this sort of thing happens after spending ten minutes waiting for precompilation, an hour for your time-to-first-gradient, and three days for a run to complete, and you trusted the tool to let you write the results to a CSV using the CSV package (one of the most widely used packages). Then you file a bug report and get gaslit, so you just patch your local Julia version and compile it yourself so you don't wait 4 months for an upstream fix. Good times.
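
            For what it's worth, the usual guard against exactly this failure mode is declaring compatibility bounds in your own environment's Project.toml, so Pkg refuses to resolve an upstream release outside them. A hypothetical [compat] section (package names and bounds are purely illustrative):

            ```toml
            # Pkg treats these as semver-style bounds: CSV stays on 0.10.x
            # and DataFrames on 1.x until the bounds are relaxed by hand.
            [compat]
            julia = "1.9"
            CSV = "0.10"
            DataFrames = "1"
            ```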

            • cjalmeida 3 years ago

              Counter-anecdote: I had to go back to a quite complex project that was developed in 1.5, 3 years ago, with hundreds of dependencies.

              Upgrading to 1.9 RC2 required changing a single dep (LightGraphs to the drop-in Graphs) and a single line (a library changed its return type from a String to a StringView).

              • cookieperson 3 years ago

                You're incredibly lucky. Even minor version bumps have left me up shit creek in production with deadlines.

                • cjalmeida 3 years ago

                  Julia core is backwards compatible all the way to 1.0, libs not so much.

                  Now, something as fundamental as numpy dropped support for Python<3.6 in a minor release: https://numpy.org/devdocs/release/1.19.0-notes.html

                  And I've been bitten by relevant python libs breaking stuff even in patch releases.

                  It happens, and it's far from particular to Julia. I just make sure to test stuff properly before going to production.

                  • cookieperson 3 years ago

                    Python's breakages have also been security-centric. I'm not a big fan of it, but yea, Julia isn't 30 yrs old, and it's done things like this every minor release without announcement. To be clear, though, I'm not talking about a conscious, researchable decision to drop support after 5 years. I'm talking about bugs that pop up overnight. Anyway, has Julia ever been audited for security? Most of the net stack was written by a single person... Makes ya wonder...

                • freilanzer 3 years ago

                  Why do you update a programming language version in production with deadlines in the first place?

                  • cookieperson 3 years ago

                    Because it has a new feature that saves the day (if it worked), fixes bugs in a janky patched Julia version, and production in the data science world looks different in research and development phases than it does for software. Production in R&D can simply be: boss wants to see the pros and cons next week, with a successful run. Only with Julia have I run into these kinds of predicaments.

                    • dunefox 3 years ago

                      Other languages and libraries break as well so that's just bad practice.

                      • cookieperson 3 years ago

                        Not as frequently. Having a broken release for a programming language is also bad practice.

                        • cookieperson 3 years ago

                          Look at the backlog of issues in their GitHub for backports. You'll see detailed lists of reported and fixed breakages. The lists aren't small... Now go on to imagine the ones people don't report and instead patch and fix locally. We aren't even talking about Julia code fixes; often these involve the underlying C and are a nightmare to diagnose. I can't reply to your post, unfortunately; the thread is too long.

                          • cjalmeida 3 years ago

                            This is the list of bugs fixed in the upcoming Matlab release:

                            https://www.mathworks.com/support/faq/pr_bugs.html

                            Just look at the first one: "In certain cases, pointAt method of satellite scenario Satellite class interprets Euler angle inputs in radians rather than degrees"

                            Your expectations are completely unrealistic.

                            • cookieperson 3 years ago

                              You're entitled to your own beliefs, but Julia is more of a research project that's crowd-sourcing PhDs to make products for them than it is a programming language. In other programming languages I've used past a 1.0 release (hint: that's many), I've never seen the type of breaking bugs that occur. If Julia is this end-all-be-all scientific computing language, then I expect the CSV package not to be incompatible with Base after a minor version bump. Having to manually compile a patched fix to read CSVs is not normal... Sorry, but that's never happened to me in R, Python, Go, Rust, C++, Java, Scala, JavaScript, etc. If that's an unrealistic expectation, then maybe Julia is just holding an incredibly low bar compared to its competitors.

                          • freilanzer 3 years ago

                            Other languages don't have that? CPython has almost 7k issues open atm. GHC has 5k. Again, FUD.

                            • cookieperson 3 years ago

                              It's not about having issues; it's the type of issues and how they affect end users. There's no FUD associated with that, just facts. Julia has been around for a decade; the FUD-campaign stuff shouldn't even be a concern for the project at this point. Other languages from the same era are either gone or are dealing with very different social deterrents to adoption.

                              I'm not typing these things out as FUD to slow the Julia developers' roll or something. Actually, I hope they listen and actually work to fix things that matter. I invested a lot of time and effort into the language. Currently I'm looking at it as a complete waste, with mostly negative side effects on my career and where I could have invested my time. I'm trying to do one of two things: encourage change, and get people on the fence about trying it to wait until it's actually ready.

                        • dunefox 3 years ago

                          How exactly are Julia releases broken?

            • Sukera 3 years ago

              Can you be more specific? As is, it's hard to follow what the issue was that you encountered.

              • cookieperson 3 years ago

                At this point the only thing I can say is: have you ever been a Julia user? Or are you actively developing Julia? If you are actively developing Julia, yea, sure, pull down the feature branch after someone's reported a bug, hot-fix your TOML to point to some specific version after reading ten diffs to be sure you have the right one, maybe stand up your own package server or pay 50k for one from Julia Computing, recompile Julia itself, and everything is totally fine! If it's not, no big deal, just stomp over the type with your own code, yay. Maybe that's even part of your job. If you're a user expecting to do development and receive updates on packages or the language, all I'll say is best of luck.

                • amval 3 years ago

                  I run in production several Julia projects and I honestly can't say I know what you are referring to.

                  I do use some libraries through PyCall. This is a relatively frequent source of trouble, with libraries breaking because of third-party dependencies, behaviour changing, or installation somehow suddenly stopping working. This all happens on the Python side, to the point that I am currently replacing all those libraries, even if it means porting some of them or even a loss of functionality.

                  • ChrisRackauckas 3 years ago

                    You can use CondaPkg.jl (https://github.com/cjdoris/CondaPkg.jl) to set up Python dependencies with version control. I haven't played with it too much, but it seemed to work for what I tried. Indeed, the reason I haven't had many test cases is that in SciML we removed all Python dependencies, since they were the main source of instability. PyDSTool.jl, FEniCS.jl, and SymEngine.jl (through ParameterizedFunctions.jl and ModelingToolkit.jl) were the biggest development burdens because the Python deps changed function names all the time, until they were deprecated and replaced by pure Julia packages, so I definitely know the pain.
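
                    As a hedged sketch (API details from memory of CondaPkg's README, so treat the exact call signatures as assumptions), pinning a Python dependency looks roughly like:

                    ```julia
                    # Declare and resolve a pinned Python dependency via
                    # CondaPkg.jl, then load it through PythonCall.jl.
                    using CondaPkg
                    CondaPkg.add("numpy"; version="<2")  # pin recorded in CondaPkg.toml
                    CondaPkg.resolve()                   # materialize the conda env

                    using PythonCall
                    np = pyimport("numpy")
                    ```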

                    • cookieperson 3 years ago

                      Is it fair to say that it is your career to rewrite code in Julia in order to write academic papers about Julia?

                      • ChrisRackauckas 3 years ago

                        Most of my time isn't academic or writing code. Most of it is focused on enhancing algorithms to achieve what's required for new methods to work deployed in production environments such as the Pumas clinical pharmacology work, and now working on what's required to get scientific machine learning point-and-click GUI ready. I always leave a little bit of time in mornings and weekends though for some coding, and a little bit a day goes a long way after years.

                        I stay away from code rewrites like the plague, which is why SciML maintains so many wrapper packages. It's less work and if there's nothing new then rewriting is thankless work. Unless we have some kind of angle for something new in the algorithm, we just recommend someone use the same old wrappers (for instance, IDA with fully-implicit DAEs). We won't beat the C/Fortran code without something new. However in most areas there's lots of interesting research to be done. I could "sabbatical as an academic" and spend the next 3 years just improving explicit Runge-Kutta methods: those for example are probably still 4x away or so from what I think is theoretically possible, but that's not going to happen any time soon since we have some real applications to focus on.

                        Those package deprecations mentioned above were deprecated by community efforts (BifurcationKit.jl replaced PyDSTool wrappers and was done by Romain Veltz, FEniCS wrappers were largely replaced by Gridap, and SymEngine was replaced by Symbolics which is largely the work of one of the Julia Lab PhD students Shashi). Even those had a largely new element to them though, with BifurcationKit focusing a lot on recent algorithms that other bifurcation software don't have (like the deflated Newton methods), and Shashi's Symbolics.jl work focusing on generality of term rewriting systems to allow for alternative algebras.

                        • cookieperson 3 years ago

                          I think you've dodged the point entirely by fixating on a word rather than the sentiment. Isn't it in your professional interest to endlessly promote Julia at all costs? To find tools in other languages and make sure they are available in Julia, etc.?

                          That's basically my point. Readers should be aware of who is making suggestions and why they might be making them. You have clear status and financial drivers to say "only use Julia, it's ready for anything, it's fine" while not mentioning any of the flaws other than "it's fixed if you compile the branch of the grad student I asked to fix its PR".

                          Also, the type of work you do is incredibly rare. Most people considering Julia for their day job don't have the resources you have, nor your needs or desires. In most cases, getting one of the foremost scientific-computing-in-Julia experts' take on what tools to use really doesn't map well to practical users.

                  • cookieperson 3 years ago

                    So much for solving the two-language problem. Every time I ended up in a similar situation, the Julia code was rewritten, either by me or someone else, in C++ for practical/performance reasons. Hope you consider the same. The libraries for doing mathematics in C++ are surprisingly good, and you might be surprised to find how easy parallel processing and fine-tuning are in C++. In some ways it's less cognitive load than Julia because it's more explicit...

                    • amval 3 years ago

                      My problem is not with Julia, my problem is with the Python ecosystem and its dependency management. Not sure how you can read what I wrote and reach that conclusion.

                • kukkukb 3 years ago

                  We run Julia code in production. These are compute-heavy services called by other parts of the app. Version upgrades are normally easy. Upgrade, run the unit tests, build the Docker images and deploy to K8s. Maybe we've been lucky, but since 1.3 we've never had big issues with version upgrades.

                  • cookieperson 3 years ago

                    That's wild. I've seen huge performance regressions from changes to Base, commands being dropped, irreconcilable breakages with basics of the ecosystem, spurious bugs in the networking libraries that are nearly impossible to chase down, and a menagerie of compatibility breaks in essential packages. I stopped caring around 1.7 when I realized this was the norm and it wasn't going to change. You must be honed in on some specific packages with a lot of custom code.

    • NicolasL-S 3 years ago

      I would be cautious saying Simulink has no Julia equivalent. This NASA engineer switched from Simulink to SciML and obtained a 15,000x speed up while his code went from 1000 lines to 50 lines long: https://www.youtube.com/watch?v=tQpqsmwlfY0

  • geokon 3 years ago

    I don't know if you genuinely want feedback... but I'll share my very short experience. I tried Julia one time a few years back. I'll be honest, I didn't put a lot of effort into it (but nor will most potential Matlab converts, because people are busy and have stuff to do).

    It's got a frustrating, "not fun" on-boarding, i.e. the number of minutes from downloading Julia to getting cool, satisfying results:

    1. It's not a calculator on steroids like Matlab. It doesn't have one main open-source IDE like Octave/RStudio that you can drop into and play around in (see plots, docs, REPL, workspace).

    2. The default language is more like a "proper programming language". To even make a basic plot you need to import one of a dozen plotting libraries (which requires learning how libraries and importing work - boring...), and how is someone just getting started supposed to decide which one? I don't need that analysis paralysis when I'm just getting started.

    3. Documentation... Well, it's very hard to compete with Matlab here, but the website is not as confidence-inducing. The landing page is a wall of text: https://docs.julialang.org/en/v1/ Tbh, from the subsequent manual listing it's not even clear it's a math-focused programming language. It's talking about constructors, data types, FFI, stack traces, networking, etc.

    • markkitti 3 years ago

      A lot has changed in a few years. This release is a big one.

      1. I run Julia on my smartphone and often use it as calculator.

      2. You typically only need Plots.jl for most needs. See https://docs.juliaplots.org/stable/

      3. See https://juliaacademy.com

      Another alternative environment is Pluto notebooks: reactive like a spreadsheet, and easy to use in your browser.

      https://featured.plutojl.org/

      I have several users without much coding experience using Pluto notebooks just to generate plots from CSV files. They are finding the combination of a web based interface, reactive UI, and fast execution easier to use than a MATLAB Live script.
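
      A minimal sketch of that CSV-to-plot workflow with Plots.jl (the file name, data, and column names are made up):

      ```julia
      # Round trip: write a small CSV, read it back, and plot it.
      using CSV, DataFrames, Plots

      write("data.csv", "x,y\n1,2.1\n2,3.9\n3,6.2\n")   # stand-in for real data
      df = CSV.read("data.csv", DataFrame)
      plot(df.x, df.y; label="measurement", xlabel="x", ylabel="y")
      savefig("measurement.png")
      ```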

      • freilanzer 3 years ago

        > 1. I run Julia on my smartphone and often use it as calculator.

        How?

        • geokon 3 years ago

          You could probably run Termux and launch a REPL from the command line?

          • galangalalgol 3 years ago

            I launch Pluto from Fedora running in Termux, then I have a tab open in Firefox pointed at localhost and have a Pluto notebook to do stuff in. I don't do this often; typing on phones is hard for me.

            Edit: I almost like UnicodePlots.jl better, though. It's much lighter weight and still lets me figure out most stuff I use plotting for, through an ssh session if necessary.
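
            For reference, a terminal-only plot with UnicodePlots.jl looks roughly like this minimal sketch:

            ```julia
            # Render a plot entirely as text; works fine over an ssh session.
            using UnicodePlots

            x = 0:0.05:2π
            plt = lineplot(x, sin.(x); title="sin(x)", width=60)
            println(plt)
            ```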

          • cookieperson 3 years ago

            Great way to eat a quarter gig of RAM before doing anything on a small compute device...

            • jrockway 3 years ago

              Probably less RAM than the average browser tab uses. 250MB of RAM? I've forgotten how to count that low.

              Today's smartphone has higher specs than my $10,000 workstation from 10 years ago.

              • cookieperson 3 years ago

                Compare the Julia REPL's memory consumption and resources used vs. Python for using it as a calculator. Look at the orders-of-magnitude differences in resources consumed and time-to-first-anything, especially on a compute-constrained device... Better yet, do the same against the Unix tool bc. I am very much aware of when I start tossing a quarter-to-half GB of RAM on small devices to do things like 1+1=2... Leave two or three of those open by being lazy/forgetful, and yea, there goes a huge chunk of RAM. To be fair, I also don't spend thousands of dollars on the latest phones...

      • geokon 3 years ago

        Not to detract from everything else... I've gotta say, that's the most confusing library name possible. First I thought it was a website about Julia plotting libraries. Then I thought it was a new Julia default plotting interface (with actual plotting libraries as backends). Then, after I saw the code, I realized that it's a library literally called "Plots"...

        I'm sure it's great SEO though

        • sundarurfriend 3 years ago

          Are you referring to Plots.jl? All Julia libraries end with a .jl in their name; sort of like how many Python libraries start with `py`, and Rust libraries have `rs` as a suffix. Having it be universal makes it more consistent, and leaves the rest of the name to be completely about the content of the package. In my experience/opinion, that has led to better package names (though Julia's package naming policies also help a lot).

          > Then I thought it's a new Julia default plotting interface (with actual plotting libraries as backends). Then after I saw the code I realized that it's a library that literally called "plots"...

          It's both. It's a library called Plots.jl that's a plotting interface to backend plotting libraries. It's not "new", and not "default" in the sense of built-in, but it's usually the recommended first option for most use cases.

          • geokon 3 years ago

            the `.jl` was in effect dropped. Tbh, I landed on the page and started to skim it and it was highly confusing..

            Just try to read the landing page without realizing "Plots" is a proper name :))

            "juliaplots.org" - looks like a website about plotting in Julia

            "Plots - powerful convenience for visualization in Julia" I guess they are...

            "Almost everything in Plots is done by specifying plot attributes." I guess you could abstract all plotting that way.. interesting take

            "Intro to Plots in Julia" an introduction to plotting in Julia!

            "Plots is a visualization interface and toolset" I guess that's a way to think about plotting.. kinda philosophical.. but okay

            Only when I got to this sentence "Plots might be the last plotting package you ever learn." did I realize this was the actual proper name of the library :))

            I can also imagine using the name in conversation would lead to ridiculous things like "You should be making your plots with Plots", etc.

            Anyway, it's just a bit confusing when you skim it. It's gotta be one of the most unusual library names I've seen

            • goerz 3 years ago

              It’s pretty typical for the conventions of Julia package naming… Plots, Statistics, DifferentialEquations, …

        • stellalo 3 years ago

          The library is called Plots because it plots, what is confusing about it?

    • sundarurfriend 3 years ago

      > 1. It doesn't have one main open source IDE like Octave/Rstudio that you can drop in and play around in. (you can see plots docs repl workspace)

      It's looking like VS Code (julia-vscode.org) will be the equivalent, and it's gotten a good chunk of the way towards that - "plots docs repl" are all existing features and pretty easy to use. The docs [1] show a "Workspace" feature too. There's also some integration with tools like JET.jl [2] so that there's in-editor code analysis and diagnostics.

      (And the extension works in VS Codium as well, so you can go completely FOSS if that's your wish.)

      [1] https://www.julia-vscode.org/docs/stable/userguide/grid/ [2] https://aviatesk.github.io/JET.jl/stable/

    • stellalo 3 years ago

      > Tbh, from the subsequent manual listing it's not even clear it's a math-focused programming language

      It’s true that the majority of the Julia community comes from a scientific or engineering background, and this is reflected in the package ecosystem and some of the “embedded” language features for technical computing. However I feel like stressing Julia as a “technical language” any more than it currently is may scare away people that may want to use it for other purposes, such as a general purpose scripting language.

      Now more than ever before, with the improved code caching in 1.9, Julia lends itself perfectly for interactive usage and day-to-day scripting.

      • goerz 3 years ago

        I’m a heavy Julia user, and I love the language, but I really wouldn’t recommend it as a “general purpose scripting language”. Pushing the idea that it is will only lead to disappointment in new users. Julia is very much a language for scientific computing, in the broadest sense.

    • zorked 3 years ago

      VS Code supports Julia well, both plain and in Jupyter notebooks.

    • vfclists 3 years ago

      What is the relevance of a review of the language tried one time a few years back?

      Today is not a few years back.

      No one genuinely wants a review of the product from a few years back if they are seriously considering the language today.

      Julia is not Lisp or Scheme.

  • derstander 3 years ago

    > What prevents matlab users from switching? The syntax is similar.

    I’ll offer my perspective. I’m switching little-by-little. Why don’t I just rip off the bandaid and switch whole-hog immediately? The literally twenty years of tools I’ve developed in MATLAB.

    At work, I’m paid to perform analyses, develop algorithms, and document that work for my colleagues and our sponsors. I’m not paid to learn Julia. If I’m working on something completely novel where none of the MATLAB building blocks I’ve written over the years are useful, or the porting time is a small fraction (for some arbitrary and subjective definition) of the overall task time, then I’ll do it in Julia.

    The toolboxes aren’t a huge sticking point for me: Mathworks has only somewhat recently developed toolboxes geared toward my particular domain, so none of my code relies on them. My ability to share with colleagues is a bit of a sticking point. We’re predominantly a MATLAB shop (and this seems true not just at my company, but in our particular niche in industry). There has been some movement toward Python. But if it’s anything like the transition from Fortran to MATLAB — which was still on-going when I started in the early 2000s — then a full switch to either Python or Julia is still a ways off.

  • qsi 3 years ago

    Syntax is one thing, but as others have mentioned, it's more than that: libraries, tooling, IDE, and perhaps also knowing the idiosyncrasies and pitfalls, and how to recover from them.

    I'm very comfortable in Matlab and often know immediately what's wrong when I hit those oddities. In Python it usually means I spend considerable time googling and tinkering before I even understand what I did wrong because I have far less experience... Same when I tinker with Julia.

    And for some people, it being a Real Programming Language may actually be a disadvantage... It typically means you need a better understanding than just type-and-run.

    • wirrbel 3 years ago

      I can mirror your sentiment from the Python side. I started my thesis in a Matlab-heavy department after years of Python experience. After three months of fighting, I switched to data processing in Python and it was a breeze (even though Pandas was kind of an experimental library back then). The Matlab IDE would often crash when there were problems in the Matlab code that my supervisor had written (probably it wasn't well written, but I was astonished that it would bring down the IDE).

      I have made a career in Python data science since then; overall it was a very good decision for me. I was ahead of the curve when data science popped up.

      I have since joined a company with an engineering dept that relies on Matlab. I have no doubt that they wouldn't get much benefit from switching to Python or Julia, apart from the license costs.

  • kristoffercOP 3 years ago

    > I provide the option of Julia in my tutorials.

    I also did this and only the students that were well ahead of the rest gave it a try. But I think this is natural, many students are already pushed to the limit to manage the current set of studies. Learning a whole new language on top of that introduces a lot of extra risk and has an opportunity cost that might not make it feasible (unless you are already doing very well).

    I don't think it is so much that students are lazy but more that the current way that the study plan is made in universities doesn't allow for much risk taking by the students. So they will just go the "cookie cutter" way.

  • nunuvit 3 years ago

    Intentional or not, I enjoy how humorously provocative this comment is.

    It's not possible to understand why someone would prefer Matlab through a software engineer's lens. Our jobs are different, and a real programming language isn't anywhere on our list of needs or wants for matrix laboratory work. We actually prefer semantics like copy-on-write and features like dynamically adding fields to structs at runtime. It fits our workflow better, and that's ultimately what matters most.

    I'm sure one day I'll add Julia to the list of real programming languages that I've used to write a library, but I'll still wrap it and call it from Matlab just like everything else.

  • MagnumOpus 3 years ago

    What prevents it is the libraries. Just like Python.

    Julia as a compiled language is faster and more distributable than either, but there is a chicken-egg problem about the ecosystem. MathWorks' provided libraries for Matlab are excellent, amazingly documented and massively supported. Python libraries are just hugely numerous in any domain you can imagine...

  • chaxor 3 years ago

    There should be a very simple way to translate Matlab to Julia that anyone can use easily. Yeah, GPT-4 *could* do that - but I don't think very many people would ever do that. Why? Because often Matlab code is private and the owners can't give their code over to some other person.

    So some other method, e.g. a Galpaca-distilled or Llama-based distill-step-by-step Matlab->Julia translation model, should be popularized.

    Especially with the distill step by step you could get something that runs quite efficiently on a laptop.

  • stodor89 3 years ago

    > Students are lazy, and don’t want to explore something new.

    Pretty much everything students do is new to them.

  • eigenspace 3 years ago

    People get settled in their toolsets. Besides, Julia does have a lot of syntax similarities to Matlab, but that doesn't mean that switching is straightforward.

    It’s still a completely different language with its own paradigms, semantics, ecosystem, etc.

  • kxyvr 3 years ago

    At least in the past, there were some issues with the licensing of dependencies. For example, both Julia and MATLAB were dependent on a variety of routines from SuiteSparse such as the Cholesky factorization and the QR factorization. These routines are dual licensed commercial/GPL. The difference is that your MATLAB license gave the ability to use things like the MATLAB Compiler to distribute somewhat closed code to a client because they, ostensibly, have a license for SuiteSparse. Julia did not, so any project compiled into a standalone executable would be subject to the GPL unless an additional license for the dependencies was purchased. Now, if you're only distributing the source code, which most people do, this doesn't matter as much, but we should all be aware of our license responsibilities. MATLAB has more functionality built in, and I trust they've done the legal legwork for their included routines, so I don't have to.

    To be clear, Julia constantly updates and I'm sure many of its original dependencies are being recoded to help address this issue. I still think it's worth a check and audit if someone wants to put together a broader project as to not get burned by this issue later.

    • sundarurfriend 3 years ago

      Yeah, there have been active attempts to move SuiteSparse out of the base sysimage for this reason. It was planned for this 1.9 release, but looks like there are some performance regressions from the current attempts, so it's not yet there.

      Julia code is mostly distributed as source code, and compiling into standalone executables and then distributing them to outsiders is not very common, so it doesn't often come up as an issue. But it's an important detail to keep in mind in case the use case does arise.

      • kxyvr 3 years ago

        I appreciate the work on this. That's not particularly easy code to write, so thanks to whoever is working through this.

  • ptero 3 years ago

    While the Matlab language is horrific, switching requires much more than that. Matlab has a lot of inertia in large scientific projects. It will change, eventually, but this is a slow process. Fortran was the tool for numerical computation for decades, even when languages with better syntax were available, because of its fast, rock-solid and validated libraries.

    On technical merits, Matlab still comes out ahead in fast plotting of large datasets. And Julia still has a reputation for a "go fast and break things" community and the corresponding cavalier response to bugs. Those will slowly change, but as of now, Matlab holds an advantage there. My 2c.

jbieler 3 years ago

This makes a big difference in usability, before loading a big project was almost in the "coffee time" category, now it's more "wait a few seconds". It helps a lot to make the tool feel more responsive.

  • tastyminerals2 3 years ago

    So, you mean that loading a bigger project in Julia was more or less equal to compiling it with some language like C++? And you had to do it every single time in order to work with the project? This doesn't sound too good, tbh.

    • krastanov 3 years ago

      That used to be the case before this version, indeed. It was due to how incredibly difficult it is to cache code efficiently in a language that (1) dynamically compiles things and (2) supports multimethods. Point 1 is crucial for the high performance, and point 2 is crucial for the incredible composability of libraries and reuse of code. To my knowledge julia is the first language to solve both with UX that is getting better and better. JAX/PyTorch/Tensorflow have some similarities (with long compile times for models), but lack the composability.

      • tastyminerals2 3 years ago

        Come on. Julia is >10 years old. This was a language design decision made many years ago, the shortcomings of which were addressed much later, when it became clear that it was a problem. Native code caching is available only since 1.9. It's great that the team worked hard on it, but if this had not been an issue right from the start, Julia could have been in a different place today.

        Hence, my ranting.

        • krastanov 3 years ago

          I think the dynamic recompilation that made code caching difficult is from 0.6, the last release before 0.7 (the first stable release). From 0.1 (2012 or 2013) to 0.7, Julia evolved drastically. They stabilized only in 2018ish. This seems like a pretty reasonable timeline for something no other language has done before (although, as I mentioned, pytorch/jax/tensorflow have some limited similarities, and similar problems).

          • tastyminerals2 3 years ago

            I see. Right, 2017 was the first time I tried Julia, and I never really came back. Then the Torch, TF, Keras, PyTorch era started and kept ML engineers busy, while I think Julia couldn't keep up. Well, there is Flux. Not sure how it compares.

            • jakobnissen 3 years ago

              It can't compare, not even close. There is just too little devpower behind Flux. Also, the AD landscape in Julia has been changing so fast that the ground has been shifting under Flux, so to speak.

              It's a shame, because Julia is so obviously a good language for DL.

    • adgjlsfhk1 3 years ago

      It isn't. That said, it's not as bad as it sounds at first because there are tools like Revise.jl which let you change code without recompiling.

    • gugagore 3 years ago

      You did not have to do it every single time. There are many changes you can make that are handled correctly by Revise.jl

    • GaryNumanVevo 3 years ago

      In practice, it's not a huge issue. Static imports are already precompiled and any code paths within your project are cached. Typically people will use a single session and use hot code reloading, so any changes in the codepath will need a recompile (on the order of milliseconds).
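      For anyone unfamiliar with that workflow, here is a minimal sketch. It assumes Revise.jl is installed (`using Pkg; Pkg.add("Revise")`); the file and function names are hypothetical:

```julia
using Revise

# `includet` ("include and track") loads the file and watches it for
# changes: saving an edit to analysis.jl patches the running session,
# so only the changed methods get recompiled instead of restarting
# Julia and recompiling everything.
includet("analysis.jl")   # hypothetical file defining `process(data)`

process(rand(100))        # first call compiles this method
# ...edit analysis.jl, save, and call `process` again: only the
# edited methods recompile, typically in milliseconds.
```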

jakobnissen 3 years ago

Congratulations on the release! Package extensions and package images are a huge boost to the usability of Julia.

To all Julia users: Go forth, and make use of PrecompileTools.jl in your packages! The latency only drops if you actually make use of precompilation, and it's pretty easy to use. I can't wait for more of the ecosystem to start making use of it.
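For package authors who haven't tried it yet, a minimal sketch of what adopting PrecompileTools looks like (the package and function here are made up):

```julia
module MyAnalysis   # hypothetical package

using PrecompileTools

process(xs) = sum(abs2, xs)

# `@compile_workload` runs this representative workload during
# precompilation, so the native code it compiles is cached in the
# package image (Julia 1.9+). `@setup_workload` lets you prepare
# inputs without caching the compilation of the setup itself.
@setup_workload begin
    data = rand(8)
    @compile_workload begin
        process(data)
    end
end

end # module
```

After that, `using MyAnalysis` loads the cached native code, so the first call to `process` no longer pays the compilation cost.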

npalli 3 years ago

Nice improvements

----------------------------

JULIA 1.8.5

julia> @time using Plots

11.341913 seconds (14.83 M allocations: 948.442 MiB, 6.88% gc time, 12.73% compilation time: 62% of which was recompilation)

julia> @time plot(sin.(0:0.01:π))

3.342452 seconds (8.93 M allocations: 472.925 MiB, 4.44% gc time, 99.78% compilation time: 78% of which was recompilation)

-----------------------------------

JULIA 1.9.0

julia> @time using Plots;

2.907620 seconds (3.43 M allocations: 195.045 MiB, 7.52% gc time, 5.61% compilation time: 93% of which was recompilation)

julia> @time plot(sin.(0:0.01:π))

0.395429 seconds (907.48 k allocations: 59.422 MiB, 98.54% compilation time: 74% of which was recompilation)

majoe 3 years ago

I really like "Julia, the programming language" and had a great experience using it on the few occasions where it made sense. But whenever a colleague asks me if I can recommend it, I have to say "no". The crux is that its "just-ahead-of-time" compiler disqualifies it for a lot of use cases: I actually would prefer it over Python for small scripts, but the compilation overhead is too long. On the other hand, I would use it over C++ for some applications if it could easily produce portable binaries.

With the steady progress in improving precompilation, I'm optimistic to use it more often in the future, though.

  • jakobnissen 3 years ago

    Yeah, I agree. It's good for specific use cases where the JIT latency doesn't matter too much - which means either interactive work, or long-running computations. So, mostly science/engineering work, and perhaps stuff like generative art, building websites and stuff.

    When latency is much better and/or it can compile static binaries, the use cases of Julia will hopefully broaden.

  • stellalo 3 years ago

    > I actually would prefer it over Python for small scripts, but the compilation overhead is too long

    Looks like this release reduces that by a lot, see the first section in the OP on caching native code, modulo adoption of good precompilation habits by the various packages.

  • ziotom78 3 years ago

    I have the same feelings as you: I stick to Python for tasks that aren't worthy of a long compilation because the execution time would be small, but I always use Julia for computation-intensive tasks.

    However, because of this, when somebody asks me if I would recommend Julia to them, instead of answering “no” I just say “it depends”

joelthelion 3 years ago

I feel this release might finally make Julia worth considering again. Previously, the loading time for something as simple as opening a CSV and plotting it was a deal breaker.

huijzer 3 years ago

I used to have a reasonably simple notebook for a paper which took about 35 minutes to compile on an old university-provided CPU, even when opening it for the second time.

Therefore, I‘m really excited for the improvements in code caching! Thanks to Tim Holy, Jameson Nash, Valentin Churavy, and others for your work

  • eigenspace 3 years ago

    > Reasonably simple

    > 35 minutes to compile

    What kind of CPU are we talking about here!?

    • cookieperson 3 years ago

      Probably a decent one. Compilation of even small size projects easily took hours as of a year ago. All while people were screaming about how good it was. The equivalent code in other languages would be statically compiled in milliseconds. The size of the binaries was also thousands of times bigger...

      • Sukera 3 years ago

        It would be great if you could share such an example work load - such big examples are often few and far between, even though they are VERY good for debugging compilation performance.

        • cookieperson 3 years ago

          Grab any project you have with more than 2k lines, multiple dependencies and run package compiler on it. Wait an hour, hopefully it didn't barf irreconcilably and check the file size.

          • hpcjoe 3 years ago

            My packages at work are about 1.5k lines. Precompilation is about 10-15 seconds. After precomp, about 1-2 seconds to load with "import" or "using". The longest I've seen on precomp has been DifferentialEquations.jl. I've not used it in a while, though I have plans to for personal (non-work related) projects.

            My packages have 5-10 dependencies in them, I tend to keep my packages/tooling streamlined, and I performance optimize them (everything from loading time, through data structure/algorithm implementation) quite thoroughly.

            Other users at my firm are adopting Julia as well. It isn't displacing Python, though that is a possible future. It's similar enough that many grasp it right away. It's fast enough that it's a viable alternative to Python + C++.

            This said, Julia is not a silver bullet[1], though it is an excellent programming language. It has been able to do all of the same tasks as my Python code, faster (often by multiple orders of magnitude), working with 10s to 100s of GB of data, in parallel.

            It's a better solution for my workflow, and an increasing number of others. If it's not to your liking, that's fine. No need to try to push it down with what seem, from my vantage point, like specious claims (very long precompilation times close to an hour, or so). If you have such actual examples, please post them. I'd love to see them. Chances are we could optimize this fairly easily.

            [1] https://www.merriam-webster.com/dictionary/silver%20bullet

            • cookieperson 3 years ago

              DiffEq's precompilation is particularly heinous if you aren't on a serious machine, but I think I misspoke. I thought we were discussing package compilation. Unfortunately though, even if we are discussing precompilation, there's a ton of cost there even if you go beyond packages. Time to first anything is often brutal. See the discussions on time to first gradient... This is often up to the developer to work out, but it's a challenge that's pretty unique to Julia itself.

              When I see people describing viable alternatives to Python and/or C, I personally look at C++ and Rust. Julia's GC is good for most academic embarrassingly parallel number-crunching things, but is rough for large-scale applications. I've only been able to use Julia in a vacuum for research. For product development, every effort I've seen has eroded insanely fast due to things other languages control much more easily. Those languages can often also do the math fast enough too, especially when the cost of failed experiments is accounted for. All those "wait three hours to find out your first gradient descent iteration had a type error that propagated to 1000 compute nodes" ($) moments are gone. It just can't happen in other paradigms, and in some paradigms it's far less likely to happen, and when it does the cost is minor because the cost of compilation was already amortized.

      • jrockway 3 years ago

        You've gotten Julia to produce a runnable binary? Impressive.

        • hpcjoe 3 years ago

          It is possible, within specific limits, using StaticTools.jl and StaticCompiler.jl. Sadly for me, my code won't work within the indicated limits.

          This is the biggest issue for me, for deployable code. I'd love to hand my users a single binary (like go/rust), which has all the code/data needed, so no precompilation time, and instant startup. I am hoping the Julia team understand how important this is ... language competitors have runtimes (python, etc.) or binaries (go/rust/c++). We really need the latter to distribute code to production.

          Imagine a post-compilation step, kind of like the code caching, which wraps everything we need into a binary, with compiled cached code, startup code, runtime libs, etc. That would be amazing, and would fit well within the Julia paradigm.

          • eigenspace 3 years ago

            StaticCompiler / StaticTools are a bleeding-edge playground. GP is talking about PackageCompiler.jl which is for serious things that can be trusted to work, but is slow and produces huge binaries.

            • huijzer 3 years ago

              Plus, to make it fully static, you need to ensure that all code paths are being hit.

            • hpcjoe 3 years ago

              I've been looking into that for a while, as a way to create a common environment for users in my team. I did get it to work, though as you mention, build times are long for this.

          • cookieperson 3 years ago

            Yeah, or you can just write C++ or Rust, and honestly I see some advantages in doing so from a maintenance angle. Dynamically typed languages have some pretty serious shortcomings. Scientists are smart people; they just need some training to learn to use those tools, or to work with someone who can help them do it.

      • eigenspace 3 years ago

        You appear to be talking about sysimage compilation which is a whole different beast than package precompilation which is presumably what the OP was doing in a notebook

    • pjmlp 3 years ago

      Most likely a single or dual core CPU.

      I have similar compilation times with Rust on an old Asus 1215B, where 8GB and an SSD hardly help the compile-the-world-from-scratch cargo model when starting a new project.

      • kgwgk 3 years ago

        reasonably simple notebook != compile the world from scratch

        • ChrisRackauckas 3 years ago

          IIRC there was a bug triggering precompilation each time you re-instantiated manifests, and Pluto notebooks keep a manifest of all packages that they used (in order to have full reproducibility, so it doesn't necessarily match your global environment), so Pluto notebooks would effectively precompile, compile, and run each time. I forget the PR that changed this or I would link it, but IIRC somewhere around v1.9-RC2 this was addressed so the v1.9 release should be much nicer to use for Pluto notebooks. I need to double check this myself though since I was last testing Pluto in some of the betas and reported this behavior.

        • pjmlp 3 years ago

          It is, when the libraries aren't shipped as native libraries, and one depends on the slow LLVM to compile them.

        • currymj 3 years ago

          if a small piece of code depends on a few big, complex packages then depending on how things work out, due to the JIT model, you might have needed to essentially recompile all these dependencies at runtime every time. now there are increasingly better precompilation tools to avoid this.

        • eklavya 3 years ago

          But that needs to happen everytime rustc is updated.

mrsofty 3 years ago

We had a big Julia push this month after 2 years of just messing around. It's better than APL to read (so is Sanskrit), but we hit a SCREECHING halt when we realized that it wasn't going to happen that we could handle our streaming data with Pluto notebooks on the web. Pluto notebooks are wonderful and can handle streaming data, just not on a hosted web page with multiple people using it. We tried to use Stipple.jl (part of GENIE.jl) and that kept freezing (we suspect because of pacing issues, so 1 sec plus should be fine). The point of all of this is that we have found Julia to be GREAT for building the back-end stuff, but not for manipulation of streaming data on the web. We can easily fix this with ZMQ and send the data to Python, but Julia was supposed to be a one-language solution. We're trying to dodge the web side of things with Humane, so maybe we'll be happier bunnies in 2024.

  • sundarurfriend 3 years ago

    > when we realized that it wasn't going to happen that we could our streaming data with Pluto notebooks on the web

    It sounds like the issue is probably unrelated to using Pluto, and likely more to do with the streaming libraries used and memory management - but that's just a guess based on the minimal info here. When you say it couldn't handle streaming data, what issues did you have? By "streaming data with Pluto notebooks on the web" do you mean PlutoSliderServer or something else?

    FWIW, Fons and co are very responsive to user issues (for eg. on the Zulip pluto channel [1]), so if you haven't tried that already, I'd recommend that. Similarly with Stipple, I believe they're trying to build a company out of it, so they'll probably be very receptive to business use cases and making them work.

    [1] https://julialang.zulipchat.com/#narrow/stream/243342-pluto....

    • mrsofty 3 years ago

      Fons IS the person that informed me that we couldn't deploy our app using Pluto PlutoHooks PlutoSlider. The issue is that we need to have people able to go to a web page and have their own view of the notebook.

      On the Stipple front it's a pacing issue, we think. The developers have been WONDERFULLY helpful and improved our code tremendously. GENIE is a great solution and I am sure that they will be successful. I believe they WILL produce a MWE of this task and we'll certainly look to see if we can make it work. Right now we can't as some of our lab equipment and financial systems generate sub 0.5 sec data stream.

  • jcheng 3 years ago

    Can I ask what you would "easily" do after you "send the data to Python"? What Python framework would you use to easily build interactive real-time streaming data apps?

    I'm asking because I work on Shiny, a reactive web framework for Python (and R) that aims to solve this problem well, and I'm having trouble figuring out how Python people have been doing this sort of thing. It's straightforward with a lower-level web framework like Flask/FastAPI but then you lose the nice reactive properties of something like Stipple.jl (and Shiny).

    • mrsofty 3 years ago

      Sorry, missed this as we're in meetings this week. So to answer your question.

      We use ZeroMQ to move data around, so getting it into a Python script is very easy for us. We "were" going to create a Dash/Plotly app to consume the data stream and create a trading augmentation tool, as we find Dash a pleasure to work with. We also find that it's very well supported with videos, so not even we (non-web developers) can mess it up. We adhere to the Tufte approach to graphical representation of complex data sets, and Python allows us to take a minimalist approach to doing that.

      We are also discussing the implementation of async components in a python specific hosting company that seems to address the same market as GENIE. That would give us the GUI approach should we chose to use it. We have experienced some problems with streaming data into Stipple but I would expect that they will provide a MWE that we can modify. I have high hopes for the GENIE team, they seem like excellent people.

      All that said, we are progressing our Carbon discussions with various people. In Chicago we are very lucky that we have a deep pool of people skilled at kernel latency avoidance; this helps us consider what we can expect from Carbon and whether there are any advantages for what we want to achieve.

      I hope this answers your question.

sundarurfriend 3 years ago

> Users can also create custom local "Startup" packages that load dependencies and precompile workloads tailored to their daily work.

That's big! Now I can add packages to my startup.jl without having to worry that every single REPL startup will be slowed down by them. This also eases the pain of things being moved away from the standard library, since we can just add them back to the base environment and load them at startup, making it basically the same thing.

  • tholy 3 years ago

    Note there's a distinction between "startup.jl" and "Startup.jl": the latter is a package, not a script. That's necessary to allow precompilation. But you can add `using Startup` to your "startup.jl" so that it gets loaded automatically. Fortunately, it's very easy to create these personal packages, see instructions at https://julialang.github.io/PrecompileTools.jl/stable/#Tutor...
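    To make the two pieces concrete, a sketch of the script side, assuming you've already generated a personal `Startup` package (in an environment on your `LOAD_PATH`):

```julia
# ~/.julia/config/startup.jl -- the *script* run at REPL startup.
# One line pulls in the precompiled *package*; the try/catch keeps
# the REPL usable if the package is missing or broken.
if isinteractive()
    try
        @eval using Startup
    catch err
        @warn "Could not load Startup" err
    end
end
```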

sundarurfriend 3 years ago

Two very nice additions to the REPL that weren't mentioned in the highlights:

* `Alt-e` now opens the current input in an editor. The content (if modified) will be executed upon exiting the editor

* A "numbered prompt" mode which prints numbers for each input and output and stores evaluated results in Out can be activated with REPL.numbered_prompt!() (basically `In[3]` `Out[3]` markers like in Mathematica/Jupyter).

ddragon 3 years ago

I'm quite interested in the interactive thread pool (although I assume it works based on conventions of everyone playing nice). Julia has a powerful parallelism model, but it couldn't previously be applied to responsive GUI and web frameworks that require low latency. So it's nice that you can now, for example, have the tasks handling HTTP requests focus on responding as fast as possible, while the background worker threads dealing with larger computations use all the speed of the Julia language without being constantly interrupted.
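A minimal sketch of that split, assuming Julia is launched with e.g. `julia --threads 3,1` (three default-pool threads plus one interactive-pool thread, new in 1.9); the request handler is hypothetical:

```julia
using Base.Threads

handle_request() = "ok"   # hypothetical latency-sensitive handler

# Latency-sensitive tasks go to the interactive pool, which the
# scheduler tries to keep responsive:
if Threads.threadpoolsize(:interactive) > 0
    t = Threads.@spawn :interactive handle_request()
end

# Heavy computations go to the default pool, where they can run
# flat out without starving the interactive threads:
h = Threads.@spawn sum(abs2, 1:1000)
fetch(h)
```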

sundarurfriend 3 years ago

> Pkg.add can now be told to prefer to add already installed versions of packages (those that have already been downloaded onto your machine)

> set the env var `JULIA_PKG_PRESERVE_TIERED_INSTALLED` to true.

How is this different from setting `Pkg.offline(true)` and then doing the `add`? I don't know the intricacies of how it works, but that's what I've been doing when I just need to try something out in a temp environment.

  • kristoffercOP 3 years ago

    One difference is that Pkg.offline(true) will error if it cannot resolve something with the packages already installed, while with this option it will fall back to downloading new versions.
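    Side by side, the two behaviors look like this (using `Example` as a stand-in package; the env var must be set before Pkg resolves):

```julia
using Pkg

# Tiered-installed preference (new in 1.9): prefer versions already
# on disk, but fall back to downloading if resolution needs more.
ENV["JULIA_PKG_PRESERVE_TIERED_INSTALLED"] = "true"
Pkg.add("Example")

# Offline mode: use only what is already installed; `add` errors
# instead of downloading when resolution fails.
Pkg.offline(true)
Pkg.add("Example")
Pkg.offline(false)
```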

sundarurfriend 3 years ago

> We came to the conclusion that a global fastmath option is impossible to use correctly in Julia.

I'd assumed that global fastmath was a bad idea in general, and that that was the reason for making this a no-op. Is there a reason it's particularly bad in Julia, some assumptions the standard library makes or something?

  • adgjlsfhk1 3 years ago

    It is bad in general, but it ends up being worse in Julia because C and C++ generally aren't compiled with whole-program optimization. Global fastmath is more aggressive the more you inline, and in C/C++ the math library is usually a statically linked library which creates an inlining barrier. Julia has all the code at runtime, and is therefore often able to run faster by inlining more code. The downside of this is that a global fastmath flag will optimize more than you think it should and give even more wrong answers than usual.

  • xmcqdpt2 3 years ago

    From the example in the article (exp), it sounds like they are implementing highly optimized versions of transcendental functions. This is great! One of the reasons gfortran is so much slower than the Intel Fortran compiler is the slow special functions it uses. However, those tricks appear to degrade badly under some LLVM optimizations that are enabled with fastmath.

    It makes sense to optimize for the non-fast math case because that's the recommended setting, and I guess having two implementations of all the (very important, easy to mess up, core) special functions + all the testing infra to check that they work correctly on all platforms was probably deemed too much work for marginal benefits.

  • NeuroCoder 3 years ago

    From a non-technical point of view (since the technical answer was already provided), I think this sort of magic optimization is a double-edged sword in any language. No matter what you do, there will be corner cases that need manual tuning that become inaccessible behind some init option.

    • sundarurfriend 3 years ago

      That's especially true for `fastmath`, which shoves a bunch of optimization tradeoffs together, some of them "handle with care" territory, some of them in the "faulty live grenade that'll probably explode in your face" territory.

      I was just wondering why the post said "impossible to use correctly in Julia" rather than just "impossible to use correctly", but writing it out now, I realize that "impossible" would be hyperbole for the latter, it would be more like "highly likely to go wrong".

  • ChrisRackauckas 3 years ago

    fastmath is bad in general. In Julia it's not as bad as many other languages because it's attempted to be kept local. The @fastmath macro is essentially a find-replace macro, where it finds things like ^ and replaces it with the fast_pow function which drops the 1ulp requirement. However, this global option was really the one piece left in Julia that where it could creep in, hence the reason to drop it.

    There are still some other difficulties of course, since fastmath in the C ABI is quite wild (or I guess, it's really the GCC implementation up to GCC 13 (https://gcc.gnu.org/bugzilla/show_bug.cgi?id=55522#c45)). Simon wrote a nice piece about the difficulties in general: https://simonbyrne.github.io/notes/fastmath/. In a general sense there is still the potential vulnerability that affects the Python ecosystem, which is that if any package has binaries built with fastmath it could cause other calculations to be fastmath as well in a non-local way (https://moyix.blogspot.com/2022/09/someones-been-messing-wit...):

    > It turns out (somewhat insanely) that when -ffast-math is enabled, the compiler will link in a constructor that sets the FTZ/DAZ flags whenever the library is loaded — even on shared libraries, which means that any application that loads that library will have its floating point behavior changed for the whole process. And -Ofast, which sounds appealingly like a "make my program go fast" flag, automatically enables -ffast-math, so some projects may unwittingly turn it on without realizing the implications.

    With Julia, there is the advantage here that (a) most libraries don't have binary artifacts being built in another language, (b) the Julia core math library is written in Julia and is thus not a shared library affected by this, and (c) those that do have their binaries built and hosted in https://github.com/JuliaPackaging/Yggdrasil. So in the binary building and delivery system you can see there are some patches that forcibly remove fastmath from the binaries being built to avoid this problem (https://github.com/search?q=repo%3AJuliaPackaging%2FYggdrasi...). The part (b) of course is the part that is then made globally safe by the removal of the flag in Julia itself, so generally Julia should be well-guarded against this kind of issue with this set of safeguards in place.
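    To illustrate how local Julia keeps it, a small sketch: `@fastmath` rewrites only the operations inside the annotated expression, so the relaxed semantics cannot leak into callers the way a global flag (or the `-ffast-math` link-time constructor above) can:

```julia
# Only the loop body uses the fast (reassociating, reduced-accuracy)
# operator variants; everything outside keeps strict IEEE semantics.
function sumsq_fast(xs)
    s = zero(eltype(xs))
    @fastmath for x in xs
        s += x^2
    end
    return s
end

sumsq_fast([1.0, 2.0, 3.0])  # 14.0 for this benign input; results
                             # may differ from the strict version
                             # for ill-conditioned inputs
```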

kdheepak 3 years ago

I’m very excited for this release! Congrats to everyone that worked so hard on this and the language in general!

singularity2001 3 years ago

"Together with PrecompileTools.jl, Julia 1.9 delivers many of the benefits of PackageCompiler without the need for user-customization."

does it mean I still have to invoke special workflows and commands to get compilation benefits or does it work out of the box for normal julia invocations?

  • jakobnissen 3 years ago

    PrecompileTools works out of the box - in the sense that the package developer needs to add a "@compile_workload" block in their package, but the users don't need to do anything. There is no special workflow or command to use it.

    The tradeoffs are somewhat larger load times (TTL), increased precompilation time (because some of the compilation moves to precompile time), and increased disk usage by the package.

    • sundarurfriend 3 years ago

      > The tradeoffs are somewhat larger load times (TTL)

      The post says "TTL has also been reduced, albeit not as dramatically as TTFX." And the graph seems to indicate the same. Is that not true, or are you comparing it to pre-1.7 TTLs (which are not shown in the post), or is it just context(/project)-dependent?

      • jakobnissen 3 years ago

        Both are true. Package images and the use of PrecompileTools make packages load slightly slower, because there is more data to load, namely all the precompiled machine code. It's still faster to load than to compile, so the gains in TTFX (i.e. compilation) outweigh the losses in TTL (i.e. loading).

        For 1.9, code loading has also been optimised, such that code loading is in many cases faster in 1.9 than in 1.7. However, this is an optimisation that is separate from the improvements to precompilation. The developers are currently working on even more TTL optimisations, so I expect that TTL will be significantly reduced in Julia 1.10, but for now, it's nice to see that the TTL optimisations present in 1.9 have counteracted the increased load times from package images.

        • ChrisRackauckas 3 years ago

          There are already some pretty nice improvements in TTL in v1.10. I plan to move onto the v1.10 betas once those are cut, since it's not an insignificant difference. A lot of the testing is being done on the test package https://github.com/JuliaComputing/OmniPackage.jl, which is a purposefully massive mess of packages. It's a bit difficult to make a direct comparison because there are many different computers benchmarking, and it's quite difficult to get that gigantic chunk of packages to be exactly the same versions across multiple Julia versions (since some dependencies bump the Julia minimum and such), but you can get some relative numbers by looking at some of the PRs that mention OmniPackage. In v1.8.5 it took about 100 seconds to load, and some GC improvements dropped that to about 50 in the v1.9 betas (https://github.com/JuliaLang/julia/pull/49185#issuecomment-1...) a few months back, while 3 weeks ago a change reported improving it from 21s -> 17s (https://github.com/JuliaLang/julia/pull/49404, with of course many other changes in the middle; just search OmniPackage in the PRs to see the list). Again, not quite apples to apples, but in terms of relative magnitude you can see there have been many major improvements. There are also some big pieces that haven't merged yet, like removing some unnecessary invalidation during code loading, which https://github.com/JuliaLang/julia/pull/49525 can handle. There are also some ideas for "clumping" invalidation checks to take it down another notch.

          So with all of these piling up, I think v1.10 will make a great new LTS since v1.9 feels a bit "4/5 of the way there" because there's still the using time to chop down in the next release to get things to sub-second.
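          The TTL vs. TTFX split being benchmarked in these threads can be seen from any fresh REPL session; a rough sketch (Plots is just an example package, and actual numbers depend heavily on machine and package versions):

```julia
# From a fresh `julia` session:
@time using Plots        # time-to-load (TTL): loading the package image
@time plot(rand(10))     # time-to-first-execution (TTFX): includes compilation
@time plot(rand(10))     # steady state: no compilation, typically milliseconds
```

The 1.9 precompilation work shifts cost out of the second line (TTFX) at the price of some extra weight in the first (TTL), which is why both numbers get tracked separately.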

        • sundarurfriend 3 years ago

          Thanks, that was a great explanation!

    • singularity2001 3 years ago

      Fantastic! So far, over 100 packages seem to support it, which is a good start:

      https://github.com/search?q=%40compile_workload&type=code

arijun 3 years ago

Does anyone know what the units are for TTL and TTFX? I can’t tell how significant those results are. Label your graphs, people!

  • datadeft 3 years ago

    I guess it is milliseconds based on the names and the results later in the article.

    - time-to-first-execution (TTFX)

    - time-to-load (TTL)

    • jcheng 3 years ago

      I think it's actually seconds, which is why this improvement is so important.

      • sundarurfriend 3 years ago

        Yep, definitely seconds, although it's weird that the numbers are just presented by themselves, without any information about what system specs they're from. The point here is to highlight the relative difference, of course, but it would still be nice to get an idea of where the absolute numbers stand and what kind of machine you need to get these numbers.

tastyminerals2 3 years ago

I like to explore alternatives to Python, and Julia has been one of the tools I am waiting to become mature enough to actually invest some time in. But every time I start reading threads, I see comments from actual users reporting half-hour, "coffee time" project compilations. Then the dreaded ecosystem problem. Then I think to myself, well, it's not the time yet.

Also, I wish Julia was as popular in Europe as it is overseas.

  • krastanov 3 years ago

    Could you elaborate on the ecosystem problem? For my corner of the world, Julia probably has one of the highest-quality ecosystems: differential equations, physics modeling, autodiff through very complicated code, probabilistic programming, SIMD/multithreading, wonderful plotting libraries (the Makie.jl ecosystem), and good data wrangling capabilities (the DataFrames.jl ecosystem).

    I am curious what are the fields where it is less well developed?

    • affinepplan 3 years ago

      > I am curious what are the fields where it is less well developed?

      Data engineering and cloud integration is a big one. It has very few tools in that domain, and I say this as a heavy Julia user (hobby).

      • sundarurfriend 3 years ago

        I've heard about cloud integration as an issue before, but what is "data engineering"?

        • affinepplan 3 years ago

          working with databases, streaming data around with stuff like kafka or snowflake, integration with orchestrators like prefect or dagster, robust interfaces for spark, being able to read a directory of parquet files without the GC going insane, this kind of thing

          I know there are some existing packages that nominally do some of these things, but generally they are understaffed and not fully mature. I love Julia as a language, so I hope this will improve over time. I think it's one of those problems that just requires more adoption before it can be fixed, and things like precompile TTFX improvements in 1.9 are a good way to get that!

  • DNF2 3 years ago

    And now this comment is one of those, even though it's not actually based on trying out the new release.

  • markkitti 3 years ago

    There seems to be significant activity around ASML in the Netherlands: https://info.juliahub.com/sciml-asml-lounge-eindhoven-meetup

    • adgjlsfhk1 3 years ago

      yeah. they have about 700 Julia users and are in the early phases of trying to switch all their code to Julia.

  • cookieperson 3 years ago

    Honestly, I'd stick with Python or learn a statically compiled language to broaden your world. I spent years in the Julia situation and it's more of a cult than anything else. If you ever end up with a job asking for Julia (not likely), you can pick it up in a week or so of free time once some muscle memory kicks in.

    • freilanzer 3 years ago

      > I spent years in the Julia situation and it's more of a cult than anything else

      What is a "Julia situation" and how is a programming language a cult? It's used by companies, programmers, scientists, etc. to do stuff. This is a strange take.

      • cookieperson 3 years ago

        Hop into any of the Julia communities online. Hang out for a month. It's very culty.

        • freilanzer 3 years ago

          Source: just trust me, bro.

          • cookieperson 3 years ago

            I offered you a simple test you can perform so you can trust yourself after performing it.

            • freilanzer 3 years ago

              You offered nothing besides FUD. I regularly read the forums and I don't know what you're referring to.

              • cookieperson 3 years ago

                Get involved in the community. You'll see. This isn't FUD. It's observations of someone who has been very active in the community since well before the 1.0 release.

    • cjalmeida 3 years ago

      Cult? That's a very bad take. It's a tool, great for some stuff, rough edges here and there.

      People invest huge amount of effort fixing Python's shortcomings (pyspark, tf, jax, mojo) requiring a completely different way of thinking modulo the syntax. And nobody is talking about the "cult of the snake"

      • cookieperson 3 years ago

        It's not culty to make packages for a language. It is culty to work 12 hours a day for free to appease Julia Computing, only to have your work taken, renamed, and offered to the community with different names on the authors list.

        • cjalmeida 3 years ago

          This is a bold statement. Can you provide references?

          • cookieperson 3 years ago

            Sure: Graphs.jl used to be LightGraphs.jl. Rumor has it the author was bullied out of the community for personal beliefs that had nothing to do with graphs or programming. Then the Julia crew took the project and sunset the LightGraphs author's work. There are other cases of stuff like this happening, but I'm too lazy to list them. Just go ahead and spend a few years contributing in the community, good times ahead. Keep in mind: if you aren't paying for the product, you are the product.

NeuroCoder 3 years ago

I didn't even know some of these things were being worked on until recently. I totally understand why devs don't treat development like a Twitter feed, posting every thought that pops into their head instead of working. However, it would be really interesting to follow some of these developments without having to deep lurk all the PRs.

Sorry, pretty shallow complaint. Great work!

  • ChrisRackauckas 3 years ago

    A lot of things are shared on a daily basis. There's a lot of open discussion on the various community channels like Discourse and Slack: the #ttfx channel on Slack, for example, is a great one to follow to keep up with the latency changes and report wins and losses of different changes. There are a lot of random package devs testing each PR to show how the different changes are affecting their package. One group that comes to mind is the Trixi.jl folks, who are sharing the results of almost every update with a bunch of plots to track the latency changes. See https://julialang.org/community/ for a full list of community channels.

    Things of course only show up on the HN front page when they reach a sexy conclusion, which also means that what shows up on HN is a very biased subset of the discussion, one that omits most subtlety and posts the biggest speedup numbers. Most of the day-to-day is more like 10% changes in some case, which only make a story the general HN public cares to hear once compounded 100 times. This also generally means that the long discussions of caveats and edge cases are filtered out of what most of the public tends to read (it's just difficult to capture some things in a blog post in any concise way), so if you care for the nuance I highly recommend joining some of the Slack channels.

  • sundarurfriend 3 years ago

    The NEWS.md is created pretty early on in the process, so you can track that to see "fresh off the oven" changes. For eg. here's the one for Julia 1.10 (goes without saying that it's incomplete, subject to change, etc.): https://github.com/JuliaLang/julia/blob/master/NEWS.md

  • jakobnissen 3 years ago

    For developments in latency specifically, check out my blog post: https://viralinstruction.com/posts/latency/

    I know this doesn't inform you about dev work on Julia in general, but it goes into detail on the recent improvements to latency.

  • krastanov 3 years ago

    I have been a fly on the wall in the ttfx channel on their slack. And there are a few other channels about julia internals. I do not have anything to contribute there, but it is fascinating to learn various julia internal details from listening in on these threads.

  • torrance 3 years ago

    Most of these features were covered at last year’s JuliaCon. Videos are all online - worth checking out!

dekhn 3 years ago

The remaining issues I had are: I heard there are still bugs in the standard library regarding changing index offsets (from 1 to 0, for example), and IIRC the language build also depends on a fork of LLVM (https://github.com/JuliaLang/llvm-project).

Are both of those still true? I'm a zero-index guy, but having index offsets is fine as long as the standard library is high quality. As for LLVM, I'd prefer it not need a fork but that's less important.

  • eigenspace 3 years ago

    I'm not aware of bugs with offset arrays in the standard library. It's happened before and it may happen again, but Base and the standard library are generally very good at avoiding that.

    The main problem is non-standard-library packages that were written back in the early Julia days before OffsetArrays existed (e.g. a big offender IIRC was StatsBase.jl), and so weren't written with any awareness of how to deal with generic indexing.

    OffsetArrays.jl is a neat trick, and sometimes it really is useful, e.g. when mimicking some code that was written in a 0-based language, or just when you're working with array offsets a lot, but I wouldn't really recommend using it everywhere. Other non-array indexable types like Tuple don't have 0-based counterparts (as far as I'm aware), so you'll still be jumping back and forth between 0-based and 1-based, and it's just an extra layer of mental load.

    Honestly though, it's often not very necessary to talk about array indices at all. The preferred pattern is just to use `for i in eachindex(A)`, `A[begin]`, `A[end]` etc.
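    A small sketch of that pattern (the function names `mysum` and `first_and_last` are made up for this example); the same code works unchanged on standard 1-based arrays and on OffsetArrays, because it never hardcodes an index base:

```julia
# Sum using generic iteration over an array's own index range.
function mysum(A::AbstractArray)
    s = zero(eltype(A))
    for i in eachindex(A)
        s += A[i]
    end
    return s
end

# `begin` and `end` resolve to the array's actual first/last index,
# whatever the index base happens to be.
first_and_last(A::AbstractArray) = (A[begin], A[end])
```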

    > and IIRC also the language build depends on a fork of LLVM (https://github.com/JuliaLang/llvm-project)

    Yes, we use a fork of LLVM, but not because we're really changing its functionality, just because we have patches for bugs. The bugs are typically reported upstream and our patches are contributed, but the feedback loop is slow enough that it's easiest to just maintain our own patched fork. We do keep it updated though (this release brings us up to v14), and there shouldn't be any divergences from upstream other than the bugfixes, as far as I'm aware.

    • dekhn 3 years ago

      Thanks, I had misremembered from the last Julia thread I read (thought it was the standard library that still had offset bugs).

g0wda 3 years ago

Incredible release!

sundarurfriend 3 years ago

> To analyze your heap snapshot, open a Chromium browser and follow these steps: right click -> inspect -> memory -> load. Upload your .heapsnapshot file, and a new tab will appear on the left side to display your snapshot's details.

Can the same be done with Firefox's `about:memory`'s `Load...` button, or is it Chromium specific?
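For context, generating the snapshot on the Julia side looks something like this (assuming Julia 1.9's `Profile.take_heap_snapshot`, per the release notes):

```julia
using Profile

# Writes a heap snapshot in the Chromium .heapsnapshot format;
# the returned value is the path of the file to load in the browser.
path = Profile.take_heap_snapshot()
println(path)
```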
