Ask HN: Does Claude use 'prior' in a Bayesian sense more than English?
3 points
10 hours ago
| 2 comments
Just an observation. When asked to summarize articles or extract insights, Claude uses the word 'prior' far more often than it appears in typical English writing (journalistic prose). And it's clearly using it in a Bayesian sense, because it keeps saying things like 'updating priors', 'the prior doesn't hold', etc.
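One way to make the observation concrete (a minimal sketch; the two sample strings below are hypothetical stand-ins, not real model output): count whole-word occurrences of 'prior'/'priors' per 1,000 words and compare samples.

```python
import re

def rate_per_1000(text, word="prior"):
    # Count whole-word matches of `word` (and its plural), normalized per 1,000 words.
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t in (word, word + "s"))
    return 1000.0 * hits / len(tokens)

# Hypothetical stand-in samples, for illustration only.
claude_style = "Updating priors here: the prior doesn't hold once we see the data."
news_style = "The committee met on Tuesday and announced new funding for schools."

print(rate_per_1000(claude_style))  # nonzero for the Claude-style sample
print(rate_per_1000(news_style))    # zero for the news-style sample
```

Run over large matched corpora, a rate gap like this would support (or refute) the impression.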

Probably something I noticed after reading the 'goblin' and 'gremlin' article.

bjourne
5 hours ago
Probably? Reinforcement learning creates bots with specific styles. For example, ChatGPT is very fond of "typically", "unpack this", and "if you want".
reply
ex-aws-dude
8 hours ago
Once again, a post with literally 3 points, 2 hours old, is at the top of /ask.

Why is the HN algorithm such ass? Can we talk about that?

reply
pbkompasz
8 hours ago
Well, it did have Claude in both the title and the description...
reply