Ask HN: Does Claude use 'prior' in a Bayesian sense more than English?

5 points by slake 10 days ago · 9 comments · 1 min read


Just an observation. When asked to summarize articles or extract insights, Claude uses the word 'prior' a lot more than typical English writing does (journalistic prose, say). And it's clearly using it in a Bayesian sense, because it keeps saying things like 'updating priors', 'the prior doesn't hold', etc.
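The observation is measurable: compare the rate of 'prior'/'priors' per 1000 tokens in model output versus ordinary prose. A minimal sketch (the sample strings are made up for illustration):

```python
import re

def rate_per_1000(text: str, stem: str) -> float:
    """Occurrences of tokens starting with `stem` per 1000 tokens, case-insensitive."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t.startswith(stem))
    return 1000 * hits / len(tokens)

# Toy stand-ins for a Claude summary vs. journalistic prose.
claude_like = "Updating priors here; the prior doesn't hold once we see the data."
journalistic = "Officials said the decision followed months of internal debate."

print(rate_per_1000(claude_like, "prior"))   # well above zero
print(rate_per_1000(journalistic, "prior"))  # zero
```

Run it over a real corpus of summaries from each source and the gap (or lack of one) answers the question.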

Probably something I noticed after reading the 'goblin' and 'gremlin' article.

bjourne 9 days ago

Probably? Reinforcement learning creates bots with specific styles. For example, ChatGPT is very fond of "typically", "unpack this", and "if you want".

nivertech 9 days ago

AI talk is turning into Silicon Valley pseudo-math slang: priors, exponentials, latent space.

You get lines like "no priors" or "embracing exponentials" that sound smart but mostly signal status.

Same move as N Taleb and "convexity": a real idea turned into a generic intellectual flex.

ex-aws-dude 9 days ago

Once again a post with literally 3 points, only 2 hours old, is at the top of /ask.

Why is the HN ranking algorithm such ass? Can we talk about that?
