Meta won't release its multimodal Llama AI model in the EU
(theverge.com)
> The decision will prevent European companies from using the multimodal model, despite it being released under an open license.
I like that Meta's models are available for general use under a fairly permissive license, but they've really butchered the word "open".
At least in this case, they didn't call it "open source", which it isn't.
I think I'm okay with calling Llama "open". It is relatively open compared to, say, GPT-4, a model released by a company with an interesting name.
> At least in this case, they didn't call it "open source", which it isn't.
There are no open source AI models, strictly speaking, because they are not reproducible even with all the data and training setup. Randomness is embedded in training. It could be done without, but performance suffers, so fully deterministic training is really only used for learning and some academic research.
So a program built with a non-deterministic compiler cannot be open source? Or even just a C program whose binary changes based on the order each object file is built? What about build timestamps?
You can absolutely open source an LLM, it's just that no one has. And you can absolutely bundle the random seed with the training data; it's just that, again, no one has.
And runtime randomness is hardly an issue. We write programs with RNGs in them all the time and no one takes issue with that.
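For what it's worth, pinning the seeds is cheap in most frameworks; it's the fully deterministic kernels that cost performance. A minimal sketch, assuming a PyTorch setup (the seed_everything name is mine, not a library API):

    # Pin the RNG seeds that matter for (mostly) reproducible training.
    # Only the library calls below are real PyTorch/NumPy APIs; the
    # function name and seed value are placeholders.
    import os
    import random

    import numpy as np
    import torch


    def seed_everything(seed: int = 42) -> None:
        """Seed Python, NumPy, and PyTorch RNGs so a training run can be replayed."""
        random.seed(seed)
        np.random.seed(seed)
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
        # Force deterministic kernels where available; this is the part
        # that typically hurts performance, as the parent comment notes.
        torch.use_deterministic_algorithms(True, warn_only=True)
        os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"


    seed_everything(42)
    # The seed (plus data ordering) would then be shipped alongside the
    # training data -- the "bundle the random seed" idea above.

Even with this, GPU scheduling and data-loader ordering can still introduce drift across hardware, which is the kernel of truth in the parent's point.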
Makes sense. They can't easily delete your photos/data from the model if you file a GDPR removal request. Someone should go after them even if they don't publish it in the EU; the fact of publishing shouldn't matter.