PyTorch: An Imperative Style, High-Performance Deep Learning Library [pdf]

arxiv.org

113 points by stablemap 6 years ago · 16 comments

jspisak 6 years ago

Finally the solution to all of your PyTorch citation problems! :)

scythe 6 years ago

Man it’s kinda sad as a Lua fan to see so much interest in a project where the main goal is just to not use Lua.

I guess academics like familiarity and Lua insistently refuses to be like other languages (arrays and maps in one type, 1-based arrays, nonstandard builtin patterns, etc).

  • elcomet 6 years ago

    > the main goal is just to not use Lua.

    I don't think that's accurate. People don't really care about Lua; they don't like or dislike it, they just don't know it. The goal of the project is to use Python, because people care about Python.

    It just happened that porting (lua) torch to Python was chosen, but it could have been another framework.

  • cfusting 6 years ago

    It's just about package support and the community. If researchers and practitioners were choosing a language based on merit alone, it would probably be Julia for its native speed and support for scientific computing. It's nice to have a toy language you appreciate, but recall that the goal is to turn math into algorithms; the language is just a tool.

  • etaioinshrdlu 6 years ago

    As someone who's been a software person for 15 years I am so glad deep learning is centralizing on python. It helps so much to share tools and to use a relatively boring language. Lua offered literally nothing but ecosystem problems...

  • uoaei 6 years ago

    Academics simply don't have the time to learn languages that do not have substantial ecosystems and relative ease-of-use.

    • scythe 6 years ago

      It’s true - there have been attempts to fix it, but nobody has created something other people want to use. The Torch project largely replaced all of the then-popular Lua packages — wxLua was dropped for Torch’s internal qtLua, the Lua concurrency libraries (Lanes and luaproc) were ignored in favor of zeroMQ, LPeg and Lua patterns were generally less popular than PCRE and Re2 bindings, et cetera. Maybe Torch is to blame (NIH syndrome), maybe the Lua packages weren’t up to the task, maybe communication within the community is too hard (Lua lacks centralized discussion channels where experienced users are regularly active), but in the end, Lua didn’t come away looking good here.

      Learning a new language wasn’t too hard when that language was Python, after all.

      • cfusting 6 years ago

        We'd all be happier writing math; writing code is just a nuisance.

        • uoaei 6 years ago

          The Julia programming language's development started explicitly to address this sentiment.

amrrs 6 years ago

If you want to read it online - https://www.arxiv-vanity.com/papers/1912.01703/

  • wjn0 6 years ago

    ah, yes, why squint at a PDF when you can squint at LaTeX compile errors instead

zapnuk 6 years ago

Is there a typo in Listing 1?

The forward function of the conv net should use:

t3 = self.fc(t2)

instead of:

t3 = self.fc(t1)

AFAIK the nn.functional.relu function is NOT inplace by default [1]

[1] https://pytorch.org/docs/stable/nn.functional.html
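For context, a minimal sketch of what the corrected forward pass would look like (the layer names and shapes here are hypothetical, reconstructed from the comment rather than copied from the paper's Listing 1), plus a quick demonstration that `F.relu` really does return a new tensor unless `inplace=True` is passed:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    # Hypothetical reconstruction of a Listing-1-style model
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 8, 3)            # (N, 1, 28, 28) -> (N, 8, 26, 26)
        self.fc = nn.Linear(8 * 26 * 26, 10)

    def forward(self, x):
        t1 = self.conv(x)
        t2 = F.relu(t1)           # out-of-place by default: t1 is left untouched
        t3 = self.fc(t2.flatten(1))  # corrected: feed t2 (the ReLU output), not t1
        return t3

# F.relu only mutates its argument when inplace=True is passed explicitly:
x = torch.tensor([-1.0, 2.0])
y = F.relu(x)                 # new tensor; x is unchanged
print(x)                      # tensor([-1., 2.])
print(y)                      # tensor([0., 2.])
F.relu(x, inplace=True)       # now x itself is modified
print(x)                      # tensor([0., 2.])
```

So unless the paper's listing passed `inplace=True` (or used `nn.ReLU(inplace=True)`), feeding `t1` to `self.fc` would indeed skip the activation, as the comment suggests.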

foxes 6 years ago

It's a bit funny to call it imperative when, at the end of the day, the objective is to get something where you have very little insight into what the neural net is doing to the state.
