Rising · 1 source · last seen 12h ago · first seen 12h ago
64Gb ram mac falls right into the local llm dead zone
So I recently bought a Mac (m2 max) with local llm use in mind and I did my research and everywhere everyone was saying go for the larger ram option or I will regret it later... So I did. Time to choose a model: "Okay, - Nice model, Qwen3.5 35b a3b running 8 bit quant, speedy even with full context…
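The "dead zone" complaint comes down to memory arithmetic: quantized weights take roughly params × bits ÷ 8 bytes, plus KV-cache and runtime overhead. A minimal sketch of that back-of-envelope estimate (the 1.2 overhead factor and the helper name are assumptions, not from the post):

```python
# Back-of-envelope memory estimate for a quantized local LLM.
# Assumption: total footprint ≈ weights (params × bits / 8) times an
# overhead factor (~1.2 here) for KV cache and runtime buffers.
def model_memory_gb(params_billions: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Rough unified-memory footprint in GB for a quantized model."""
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb * overhead

# A ~35B model at 8-bit quant lands around 42 GB, which is why it
# fits (barely) in 64 GB of unified memory, while ~70B-class models
# at the same quant would not.
print(round(model_memory_gb(35, 8), 1))   # ≈ 42 GB
print(round(model_memory_gb(70, 8), 1))   # ≈ 84 GB, over 64 GB
```

This is only a sizing heuristic; real usage depends on context length, the specific quant format, and the runtime.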
Lead: r/LocalLLaMA · Bigness: 27
📡 Coverage: 10 (1 news source)
🟠 Hacker News: 0
🔴 Reddit: 63 (82 upvotes across 1 sub)
📈 Google Trends: 0
Full methodology: How scoring works
Receipts (all sources)
64Gb ram mac falls right into the local llm dead zone
REDDIT · r/LocalLLaMA · 12h ago · ⬆ 82 · 💬 88 · score 112