Rising · 1 source · last seen 2h ago · first seen 2h ago
Fun fact: Anthropic has never open-sourced any LLMs
I’ve been working on a little side project comparing tokenizer efficiency across different companies’ models for multilingual encoding. Then I saw Anthropic’s announcement today and suddenly realized: there’s no way to analyze Claude’s tokenizer lmao! edit: Google once mentioned in a paper that Ge
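The "tokenizer efficiency" comparison the post describes can be sketched without any proprietary tokenizer. A minimal stdlib-only illustration, using UTF-8 byte-level tokenization as a stand-in baseline (not Claude's or any vendor's actual BPE vocabulary, which is the point of the post — Claude's is not available), shows how tokens-per-character varies by language:

```python
# Sketch: compare tokens-per-character across languages under a
# byte-level tokenizer (each UTF-8 byte = one token). Real BPE
# tokenizers merge bytes, but the per-language spread is similar
# in direction: CJK text costs more tokens per character.

def tokens_per_char(text: str) -> float:
    """Tokens per character under a pure byte-level tokenizer."""
    return len(text.encode("utf-8")) / len(text)

samples = {
    "English": "The quick brown fox jumps over the lazy dog.",
    "German": "Der schnelle braune Fuchs springt über den faulen Hund.",
    "Chinese": "敏捷的棕色狐狸跳过懒狗。",
}

for lang, text in samples.items():
    print(f"{lang}: {tokens_per_char(text):.2f} tokens/char")
```

ASCII text comes out at exactly 1.0 token per character, while CJK characters encode to 3 UTF-8 bytes each, so byte-level tokenization is roughly 3x less efficient for Chinese. Vendor BPE vocabularies narrow this gap to different degrees, which is what the post's comparison would measure.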
Lead: r/LocalLLaMA · Bigness: 27
📡 Coverage: 10 (1 news source)
🟠 Hacker News: 0
🔴 Reddit: 65 (139 upvotes across 1 sub)
📈 Google Trends: 0
Full methodology: How scoring works
Receipts (all sources)
Fun fact: Anthropic has never open-sourced any LLMs
REDDIT · r/LocalLLaMA · 2h ago · ⬆ 139 · 💬 21 · score 130