Rising · 1 source · first seen 23h ago · last seen 23h ago

Anthropic admits to having made hosted models dumber, proving the importance of open-weight, local models

TL;DR: >On March 4, we changed Claude Code's default reasoning effort from `high` to `medium` to reduce the very long latency—enough to make the UI appear frozen—some users were seeing in `high` mode. This was the wrong tradeoff. We reverted this change on April 7 after users told us they'd pref

Lead: r/LocalLLaMA · Bigness: 37 · keywords: anthropic, admits, hosted, stupid, proving
📡 Coverage: 10 (1 news source)
🟠 Hacker News: 0
🔴 Reddit: 93 (1157 upvotes across 1 sub)
📈 Google Trends: 0
Full methodology: How scoring works
