FastAI.jl: FastAI for Julia

forums.fast.ai

196 points by dklend122 4 years ago · 23 comments

oxinabox 4 years ago

I think a particularly nice thing about this is that it is a bundle of nice libraries integrated together well, with nice docs. Those libraries in turn break down into other nice libraries and so forth (though many don't have docs quite this nice), because that is how Julia is.

I can't see myself ever using FastAI.jl (though I am sure many will). But I absolutely can see myself using Flux + FluxTraining.jl, which nicely brings together TensorBoardLogger and EarlyStopping and several other things. (https://github.com/FluxML/FluxTraining.jl) And I can well imagine many will use DataLoaders.jl + Flux.

I feel like this project has nicely rounded out the ecosystem, making standard tools where before there were a bunch of individual solutions per project. (For example, I currently use TensorBoardLogger + Flux directly with my own custom training loop.)
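
For readers unfamiliar with what such a hand-rolled loop involves, here is a minimal, framework-free sketch of the kind of boilerplate (training step, logging hook, early stopping) that libraries like FluxTraining.jl factor out. All names and the toy problem are illustrative, not from any library:

```julia
# Hand-rolled training loop with early stopping, fitting a
# 1-parameter model y = w*x to data by plain gradient descent.

xs = [1.0, 2.0, 3.0, 4.0]
ys = 2.0 .* xs                      # true w = 2.0

loss(w) = sum((w .* xs .- ys) .^ 2) / length(xs)
grad(w) = sum(2 .* (w .* xs .- ys) .* xs) / length(xs)

function train(; lr=0.01, epochs=1000, patience=5, tol=1e-10)
    w, best, stall = 0.0, Inf, 0
    for epoch in 1:epochs
        w -= lr * grad(w)           # training step
        l = loss(w)
        # a logging hook (e.g. TensorBoardLogger) would go here
        if l < best - tol
            best, stall = l, 0
        else
            stall += 1
            stall >= patience && break   # early stopping
        end
    end
    return w
end

w = train()   # converges close to 2.0
```

Every project ends up re-writing some variant of this; standardizing it behind a callback system is exactly the gap FluxTraining.jl fills.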

ellisv 4 years ago

This is interesting to me, but the motivation behind it is unclear. Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.

What does this mean for the development of fastai?

What is the timeline for FastAI.jl to achieve parity?

When should I choose FastAI.jl vs fastai?

  • BadInformatics 4 years ago

    Having tried fastai for a "serious" research project and helped (just a bit) towards FastAI.jl development, here's my take:

    > motivation behind this is unclear.

    Julia currently has two main DL libraries. Flux, which is somewhere between PyTorch and (tf.)Keras abstraction wise, and Knet, which is a little lower level (think just below PyTorch/around where MXNet Gluon sits). Frameworks like fastai, PyTorch Lightning and Keras demonstrate that there's a desire for higher-level, more batteries included libraries. FastAI.jl is looking to fill that gap in Julia.

    > Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.

    This is correct. That said, FastAI.jl is not and does not plan to be a copy of the Python API (hence "inspired by"). One consequence of this is that integration with other libraries is much easier, e.g. https://github.com/chengchingwen/Transformers.jl for NLP tasks.

    > What is the timeline for FastAI.jl to achieve parity?

    > When should I choose FastAI.jl vs fastai?

    This depends on your use cases and how comfortable you are with a) Julia and b) having to roll some of your own code. For the former, I'd recommend poking around with the language beforehand, as well as using the linked dev channel in TFA, to get an informed opinion.

    FastAI.jl itself is composed of multiple constituent packages that can be, and are, used independently, so there's also the option of mixing and matching. For example, https://github.com/lorenzoh/DataLoaders.jl is completely library agnostic.
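
"Library agnostic" here means the loader only needs a data container that supports a couple of generic operations. A conceptual pure-Julia sketch of the idea (the names below are our own, not DataLoaders.jl's API, and this omits its parallel loading):

```julia
# A minimal batching iterator: it only requires `length` and range
# indexing from the data container, and yields batches lazily.

struct SimpleLoader{T}
    data::T
    batchsize::Int
end

function Base.iterate(dl::SimpleLoader, start=1)
    start > length(dl.data) && return nothing
    stop = min(start + dl.batchsize - 1, length(dl.data))
    return dl.data[start:stop], stop + 1
end

Base.length(dl::SimpleLoader) = cld(length(dl.data), dl.batchsize)

loader = SimpleLoader(collect(1:10), 4)
batches = collect(loader)   # [[1,2,3,4], [5,6,7,8], [9,10]]
```

Because nothing above mentions any ML framework, the same loader works with Flux, Knet, or plain Julia code, which is the point being made.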

  • darsnack 4 years ago

    I’m not the main dev on FastAI.jl, but I work on the Julia ML community team that supported this project.

    > Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented.

    We are looking to offer a high level API for ML in Julia similar to fastai for PyTorch. The goal is to enrich the Flux ecosystem, so just calling into Python fastai wouldn’t be appropriate. FastAI.jl is built on top of several lower level packages that can be used separately from FastAI.jl. These packages help build out the ecosystem not just for FastAI.jl, but any ML framework or workflow in Julia.

    > What does this mean for the development of fastai?

    FastAI.jl is “unofficial” in that Jeremy and the fastai team did not develop it. But Jeremy knows about the project, and we have kept in touch with the fastai team for feedback. FastAI.jl doesn’t affect the development of Python fastai in any way.

    > FastAI.jl has vision support but no text support yet.

    > What is the timeline for FastAI.jl to achieve parity?

    We’re working to add more out-of-the-box support for other learning tasks. Currently, we have tabular support on the way, but the timeline for text is not decided.

    Note that the framework itself could already support a text learning method, but you’d have to implement the high level interface functions for it yourself. We just don’t have built-in defaults like we do for vision. You can check out https://fluxml.ai/FastAI.jl/dev/docs/learning_methods.md.htm... for a bit more on what I mean.
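
The gist of that interface is that a task object specifies how raw samples are encoded for the model and how model output is decoded back into the target domain, while the framework stays agnostic about the model itself. A toy pure-Julia illustration of the pattern (all names here are illustrative, not FastAI.jl's actual API):

```julia
# A "learning task" defines encode/decode; any model that maps
# encoded input to output then plugs in unchanged.

abstract type LearningTask end

struct CharCount <: LearningTask end   # toy "text" task: predict string length

encode(::CharCount, sample::AbstractString) = Float64[length(sample)]
decode(::CharCount, output::Vector{Float64}) = round(Int, output[1])

# Stand-in identity "model"; a real Flux model would go here.
model(x) = x

prediction = decode(CharCount(), model(encode(CharCount(), "hello")))
```

Adding text support to the framework amounts to supplying good default `encode`/`decode`-style implementations for text data, which is the part that doesn't exist out of the box yet.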

    > When should I choose FastAI.jl vs fastai?

    It depends on what you need. PyTorch and fastai are more mature, but Julia and Flux tend to be more flexible to non-standard problems in my experience. If you’re interested, then give Julia/Flux/FastAI.jl a try. If we’re missing a mission critical feature for you, then please let us know so we can prioritize it.

    • ellisv 4 years ago

      Thanks. I use both fastai and Julia already (not together) so this is interesting to me.

      I'm unlikely to adopt FastAI.jl at work anytime soon without a clear win over an existing tool.

    • xvilka 4 years ago

      I haven't checked the library yet; asking here in case there's a quick answer: does it support GNNs too, via GeometricFlux.jl[1]?

      [1] https://github.com/FluxML/GeometricFlux.jl

      • darsnack 4 years ago

        You would have to add a learning method to tell it how to encode/decode graph data, but the framework is agnostic to the model choice. So any Flux model is supported.

    • WanderPanda 4 years ago

      It is not surprising to me that there is no text support yet, given the issues with integrating 3D (batch x sequence x features) RNN support (for cuDNN). That issue has persisted for so long that I came to believe it is impossible to integrate with the current Flux.jl stack.

  • oxinabox 4 years ago

    > Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented.

    Yes, and? That is what a port is.

    When FastAI for Swift was a thing (is it still a thing?) it was (is?) using Swift for TensorFlow, not PyTorch. https://www.fast.ai/2019/03/06/fastai-swift/

  • NegatioN 4 years ago

    It does seem to actually be an unofficial implementation in Julia, so it shouldn't have any impact on the actual development?

    I'm kinda wondering if Jeremy Howard has OK'ed using their name on a library that they're not in charge of? I didn't find a clear answer to that. Particularly troubling since it seems like the FluxML organization is behind this, not some random dude.

    • jph00 4 years ago

      Jeremy here. Yup I'm not just OK with the name and project, I'm thrilled with it! :) The Julia community is doing a fantastic job with this and I'm excited to see where it goes.

      Something else to watch: Julia's brand-new compiler-based autodiff library, Diffractor. https://github.com/JuliaDiff/Diffractor.jl

    • BadInformatics 4 years ago

      From TFA:

      > We’ll also be hosting a Q&A session 02.08., 10PM UTC (03.08., 12AM CEST | 8AM AEST). Jeremy will be there, too. Meeting link will follow soon.

      My understanding is that he's been aware of this at least since early in development (just over a year ago), so make of that what you will.

      • NegatioN 4 years ago

        Well, that's great then! :) Guess I should have actually read the post instead of jumping to the GitHub page...

  • jstx1 4 years ago

    > When should I choose FastAI.jl vs fastai?

    Unless you're following the course, you probably shouldn't use either.

    • kyllo 4 years ago

      Why not? Honest question: I haven't used fastai myself, but my understanding is it's just a high-level wrapper API for PyTorch. Any particular reason why you wouldn't want to use it outside of the course?

      • NegatioN 4 years ago

        We use it for training some of the models we use in production environments serving millions of requests per day, so there's no reason to exclude it completely IMO. As you say, it's a wrapper for PyTorch with a few extras tacked on.

        With PyTorch you would either use pure torch, fastai, or something like Lightning most of the time, unless you're using completely mainstream models such as the ones you could get from the Transformers library.

dklend122 (OP) 4 years ago

Note: This is a sanctioned adaptation by members of the Julia community.

losvedir 4 years ago

Wasn't there a Swift version of FastAI, too? Are they trying to have libraries in multiple ecosystems, or did that one peter out?

cbkeller 4 years ago

Looking forward to trying this out!
