Rising · 1 source · last seen 23h ago · first seen 23h ago
We are finally there: Qwen3.6-27B + agentic search; 95.7% SimpleQA on a single 3090, fully local
LDR maintainer here. Thanks to the strong support of the r/LocalLLaMA community, LDR has come a long way. I haven't reported in a while because I didn't think I was ready for another prominent post in one of the leading outlets of Local LLM research. But I think the LDR community is finally there again. I think it…
Lead: r/LocalLLaMA · Bigness: 32 · Tags: finally, qwen36-27b, agenticsearch
📡 Coverage: 10 · 1 news source
🟠 Hacker News: 0
🔴 Reddit: 78 · 362 upvotes across 1 sub
📈 Google Trends: 0
Full methodology: How scoring works
Receipts (all sources)
We are finally there: Qwen3.6-27B + agentic search; 95.7% SimpleQA on a single 3090, fully local
REDDIT · r/LocalLLaMA · 23h ago · ⬆ 362 · 💬 72
score 106
LDR maintainer here. Thanks to the strong support of the r/LocalLLaMA community, LDR has come a long way. I haven't reported in a while because I didn't think I was ready for another prominent post in one of the leading outlets of Local LLM research. But I think the LDR community is finally there again. I think it…
Related clusters
Qwen3.6-27B vs Coder-Next
1 source · bigness 31 · 7h ago
Qwen3.6-27B vs 35B, I prefer 35B but more people here post about 27B...
1 source · bigness 28 · 11h ago
Qwen3.6-27B at 72 tok/s on RTX 3090 on Windows using native vLLM (no WSL, no Docker), portable launcher and installer
1 source · bigness 27 · 1d ago