Self-Classifying MNIST Digits Using Neural Cellular Automata
CAs have a lot of nice properties: translation invariance, locality, arbitrary population sizes, parallel computation, simplicity of implementation, and emergent complexity. Interesting to see where this line of work goes.
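To make those properties concrete, here's a toy sketch of a neural CA update step (not the authors' actual architecture, which uses learned Sobel-style perception filters and trained weights; the weights below are random purely for illustration). Every cell applies the same small network to its 3x3 neighborhood, which is where the locality, translation invariance, and arbitrary-grid-size properties come from:

```python
import numpy as np

def perceive(state):
    # each cell sees its 3x3 neighborhood, gathered via shifts (locality)
    feats = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            feats.append(np.roll(state, (dy, dx), axis=(0, 1)))
    return np.concatenate(feats, axis=-1)  # (h, w, 9 * channels)

def ca_step(state, w1, b1, w2, b2):
    # the SAME small MLP runs at every cell, so the rule is
    # translation-invariant and works on any grid size in parallel
    x = perceive(state)
    hidden = np.maximum(x @ w1 + b1, 0.0)  # shared hidden layer, ReLU
    return state + hidden @ w2 + b2        # residual state update

rng = np.random.default_rng(0)
C, H = 8, 32                               # state channels, hidden width
w1 = rng.normal(0, 0.1, (9 * C, H)); b1 = np.zeros(H)
w2 = rng.normal(0, 0.1, (H, C));     b2 = np.zeros(C)

grid = rng.normal(size=(28, 28, C))        # MNIST-sized grid of cell states
out = ca_step(grid, w1, b1, w2, b2)
print(out.shape)  # (28, 28, 8)
```

Because the perception step here uses periodic shifts, translating the input grid translates the output identically, which is the translation invariance being described above.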
When I draw an 8, it frequently gets classified as a zero. In fact, most of the numbers I draw don't get correctly classified. Only the 0's and 1's are consistent.
I found the same thing. If you draw an 8 with more of an 'x' in the middle, it's fine, but if you draw two o's on top of one another, it can't figure it out. I suspect they need to increase the diversity of their training set.
Authors here. Had great fun making this. We think Distill is an excellent medium.
Happy to respond to any questions (although likely with a few hours delay as we are in CEST and nearing our bedtime).
[Disclosure: I'm involved in Distill, so a little biased.]
Firstly, make sure you click through to the individual articles and try the interactive demos at the top of each one:
* https://distill.pub/2020/growing-ca/
* https://distill.pub/2020/selforg/mnist/
I really enjoy how quirky Alex, Ettore, et al.'s experiments are. They aren't going after benchmarks; they have a very "what would happen if we tried this?" feel. If I had to bet, I don't think this work is super likely to have direct real-world applications, but it's immensely fun to read about and expands the space of ML systems I might imagine.
I've also been enjoying the open slack channel associated with this thread. It's pretty low volume, but there have been a number of fun experiments from people who enjoyed the original article on neural cellular automata. I feel like there are often a lot of really cool experiments and conversations that no one outside a given lab sees, because they don't make it to publication, and I like how the channel allows those to be shared with anyone really excited about this line of work. It was also really heartwarming to see people jump in to give feedback on the draft of the most recent article!
Hi there! Like this line of work. As an enthusiast of Alife-ish things, CAs (and NKS-style computational systems generally), and also an author of an ML library, I was looking forward to jumping on the slack and just having a chat with likeminded folk, but I'm having trouble finding how to actually make an account on https://distillpub.slack.com
Thanks for pointing this out! The slack invite link expires periodically and I need to replace it. It should work as soon as the changes propagate! In the meantime, you can also join here: https://join.slack.com/t/distillpub/shared_invite/zt-grg499d...
It's better to pick the most interesting specific article and link to that, instead of posting lists. Otherwise the topic is a bit of a greased pole - there isn't much to grab onto and the discussion ends up being about the lowest common denominator.
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
Alternatively, if the intended topic is the experimental discussion format, that's fine, but then it should explicitly be about that rather than the content that happens to supply the example.
That makes sense. I think the authors actually submitted the latest article, but the community jumped on this one. Their submission: https://news.ycombinator.com/item?id=24295756
Any chance you could consolidate this one into that thread?
Ok, we've changed the URL to that from https://distill.pub/2020/selforg/.