PyTorch Internals: Ezyang's Blog

blog.ezyang.com

443 points by Anon84 a month ago


smokel - a month ago

Also interesting in this context is the PyTorch Developer Podcast [1] by the same author. Very comforting to learn about PyTorch internals while doing the dishes.

[1] https://pytorch-dev-podcast.simplecast.com/

alexrigler - a month ago

This is a fun blast from the near past. I helped organize the PyTorch NYC meetup where Ed presented this, and I still think it's one of the best technical presentations I've seen. Hand-drawn slides for the win. Wish I'd recorded it :\

zcbenz - a month ago

For learning internals of ML frameworks I recommend reading the source code of MLX: https://github.com/ml-explore/mlx .

It is a modern and clean codebase without legacy baggage, and I could understand most of it without consulting external articles.

chuckledog - a month ago

Great article, thanks for posting. Here’s a nice summary of automatic differentiation, mentioned in the article and core to how NN’s are implemented: https://medium.com/@rhome/automatic-differentiation-26d5a993...
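Not from the linked article, but the chain-rule bookkeeping behind automatic differentiation is easy to see with forward-mode dual numbers (PyTorch's autograd uses reverse mode, but the flavor is the same). A minimal sketch:

```cpp
#include <iostream>

// A dual number carries a value and its derivative; arithmetic on
// duals propagates derivatives by the chain rule.
struct Dual {
  double val;  // f(x)
  double der;  // f'(x)
};

Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.der + b.der}; }

// Product rule: (fg)' = f'g + fg'
Dual operator*(Dual a, Dual b) {
  return {a.val * b.val, a.der * b.val + a.val * b.der};
}

// Example: differentiate f(x) = x*x + x at x = 3.
// Seed x with derivative dx/dx = 1, then f.der is f'(3) = 2*3 + 1 = 7.
//   Dual x{3.0, 1.0};
//   Dual f = x * x + x;   // f.val == 12.0, f.der == 7.0
```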

hargun2010 - a month ago

I guess it's a longer version of the slides, but not new; I've seen comments on it from as far back as 2023. Nonetheless, good content (worth resharing).

https://web.mit.edu/~ezyang/Public/pytorch-internals.pdf

aduffy - a month ago

Edward taught a Programming Languages class I took nearly a decade ago. Clicking through here, I immediately recognized the illustrated slides; it brought a smile to my face.

vimgrinder - a month ago

In case it helps someone: if you have trouble reading long articles, try text-to-audio with line highlighting. It helps a lot; it cured my lack of attention.

quotemstr - a month ago

Huh. I'd have written TORCH_CHECK like this:

    TORCH_CHECK(self.dim() == 1) 
      << "Expected dim to be a 1-D tensor "
      << "but was " << self.dim() << "-D tensor";

Turns out it's possible to write TORCH_CHECK() so that it evaluates the streaming operators only if the check fails. (Check out how glog works.)
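The glog trick is roughly this: a ternary whose failure branch builds the stream, so the `<<` chain is never evaluated on the success path. A minimal sketch with a hypothetical `MY_CHECK` (not the real TORCH_CHECK or glog implementation):

```cpp
#include <sstream>
#include <stdexcept>

// Accumulates the failure message; its destructor throws at the end of
// the full expression, carrying whatever was streamed in.
struct CheckFailStream {
  std::ostringstream oss;
  ~CheckFailStream() noexcept(false) { throw std::runtime_error(oss.str()); }
  template <typename T>
  CheckFailStream& operator<<(const T& v) {
    oss << v;
    return *this;
  }
};

// operator& binds more loosely than operator<<, so the entire stream
// chain is built first, then discarded as void to match the ternary's
// other branch (the same role LogMessageVoidify plays in glog).
struct CheckVoidify {
  void operator&(CheckFailStream&) {}
};

// Only the taken branch of ?: is evaluated, so on success the stream
// expression after the macro never runs at all.
#define MY_CHECK(cond) \
  (cond) ? (void)0 : CheckVoidify() & CheckFailStream()
```

Usage looks exactly like the snippet above: `MY_CHECK(self_dim == 1) << "Expected a 1-D tensor but was " << self_dim << "-D";`, and any expensive arguments to `<<` are only computed when the check actually fails.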
bilal2vec - a month ago

See also the dev forum roadmaps [1] and design docs (e.g. [2], [3], [4]).

[1]: https://dev-discuss.pytorch.org/t/meta-pytorch-team-2025-h1-...

[2]: https://dev-discuss.pytorch.org/t/pytorch-symmetricmemory-ha...

[3]: https://dev-discuss.pytorch.org/t/where-do-the-2000-pytorch-...

[4]: https://dev-discuss.pytorch.org/t/rethinking-pytorch-fully-s...

nitrogen99 - a month ago

2019. How much of this is still relevant?

pizza - a month ago

Btw, would anyone have any good resources on using PyTorch as a general-purpose graph library? I.e., stuff beyond the assumption that nets are forward-only (acyclic) digraphs.

brutus1979 - a month ago

Is there a video version of this? It seems to be from a talk.
