The AI Report #3: No-gradient optimization, open-source lags behind and more
theaireport.substack.com

On the point of open source lagging behind, I am surprised there has not been crowdsourced funding to train a larger open-source model.
I've been thinking along similar lines. Distributing crypto tokens to contributors who help train models is one idea; later, you could use that coin to run the trained model.
Worldcoin is another one of SamA's projects, and it's recently gotten a good bit of funding (for some strange reason).
Thanks for sharing. Always good to find ML/LLM-oriented newsletters that surface interesting items and help me stay off Twitter :)