Cluster 1 sources · last seen 2h ago · first seen 2h ago
nanollama — train Llama 3 from scratch and export to GGUF, one command, open source
nanollama — train Llama 3 from scratch. I've been working on a framework for training Llama 3 architecture models from scratch: not fine-tuning, not LoRA, actual from-zero pretraining. The output is a llama.cpp-compatible GGUF file. The whole pipeline is one command: `bash runs/lambda_trai`
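The headline claim is that the pipeline's output is a llama.cpp-compatible GGUF file. One fact about that format worth knowing downstream: every GGUF file starts with the four magic bytes `GGUF`, followed by a little-endian uint32 version field. A minimal sanity checker (illustrative helpers, not part of nanollama):

```python
import struct

GGUF_MAGIC = b"GGUF"  # magic bytes at offset 0 of every GGUF file


def is_gguf(path: str) -> bool:
    """Check whether the file at `path` begins with the GGUF magic."""
    with open(path, "rb") as f:
        return f.read(4) == GGUF_MAGIC


def gguf_version(path: str) -> int:
    """Read the little-endian uint32 version field that follows the magic."""
    with open(path, "rb") as f:
        if f.read(4) != GGUF_MAGIC:
            raise ValueError("not a GGUF file")
        return struct.unpack("<I", f.read(4))[0]
```

A check like `is_gguf("model.gguf")` is a cheap way to catch a truncated or mis-exported file before handing it to llama.cpp.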
Lead: r/LocalLLaMA · Bigness: 20 · Tags: nanollama, train, meta, scratch, export
📡 Coverage: 10 (1 news source)
🟠 Hacker News: 0
🔴 Reddit: 45 (26 upvotes across 1 sub)
📈 Google Trends: 0
Receipts (all sources)
nanollama — train Llama 3 from scratch and export to GGUF, one command, open source
REDDIT · r/LocalLLaMA · 2h ago · ⬆ 26 · 💬 11 · score 119