Apple AI tool transcribed the word 'racist' as 'Trump' (bbc.co.uk)

I guess that's an inherent weakness of language models: they do recognition by association, so things that are strongly associated can be hallucinated in place of each other.
The problem with AI hallucination is not that it confuses similar things (that’s just the model being wrong), but that when the AI has no “clear winning answer”, it can and will respond with absolutely anything within its search space, with apparent disregard for any rules or reality it appeared to have understood in the common case.
Hallucination is not the thing you were looking for.
> The problem with AI hallucination is not that it confuses similar things (that’s just the model being wrong), but that when the AI has no “clear winning answer”, it can and will respond with absolutely anything within its search space, with apparent disregard for any rules or reality it appeared to have understood in the common case.
This is objectively incorrect. The sampler, not the model, controls that behavior: you can detect high-entropy cases and choose not to respond at all.
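To make that concrete, here is a minimal sketch of entropy-based abstention at the sampler level. The threshold value and the toy logits are made up for illustration; real systems would tune the threshold and work on much larger vocabularies.

```python
import math

def softmax(logits):
    # Numerically stable softmax over raw logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def entropy_bits(probs):
    # Shannon entropy of the distribution, in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def sample_or_abstain(logits, threshold_bits=1.0):
    """Greedy-pick the top token, but abstain when the model's
    next-token distribution is too uncertain (high entropy)."""
    probs = softmax(logits)
    if entropy_bits(probs) > threshold_bits:
        return None  # no clear winner: decline to answer
    return max(range(len(probs)), key=probs.__getitem__)

# Confident case: one logit dominates, entropy is near zero.
print(sample_or_abstain([8.0, 0.5, 0.2, 0.1]))  # → 0
# Uncertain case: two near-tied candidates, entropy > 1 bit.
print(sample_or_abstain([2.0, 1.9, 0.1, 0.0]))  # → None
```

The point is that "respond with anything" is a decoding policy, not an inevitability: the same logits can instead trigger a refusal when no single answer clearly wins.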
You’re right. I suppose it’s more simply that when the AI is wrong, it can and will respond with absolutely anything within its search space.
Summary: Apple is fixing a speech-to-text issue where saying "racist" was sometimes transcribed as "Trump." The company blames a speech recognition error, but experts suspect software tampering. The BBC couldn't replicate the mistake, suggesting a fix is already in place. This follows another AI-related blunder in which Apple suspended faulty news summaries. The company also announced a $500bn U.S. investment, including a Texas data centre for AI.