HACKER Q&A
📣 chrisrodrigue

What tiny LLMs are you getting the best results from?


Curious whether anyone here has had success running smaller LLMs locally on constrained hardware, such as laptops or GPU-less devices. If so, what kind of utility have they brought you?


  👤 Uzmanali Accepted Answer ✓
I’ve had solid luck with TinyLlama and Phi-2 on my MacBook Air (no GPU). They’re great for quick drafts, note summaries, and basic Q&A. No internet needed, so they’re super handy when traveling.
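
For anyone who wants to try a CPU-only setup like this, one low-friction route is Ollama. A minimal sketch, assuming Ollama is installed and that `tinyllama` and `phi` are still the library tags for these models:

```shell
# Pull and run TinyLlama (~1.1B params) entirely on CPU; the model
# downloads once and then works fully offline.
ollama pull tinyllama
ollama run tinyllama "Summarize these notes in three bullet points: ..."

# Phi-2 is published under the "phi" tag in the Ollama library.
ollama pull phi
ollama run phi "Draft a short status update about fixing a flaky test."
```

Expect a few tokens per second on a fanless laptop, which is fine for short drafts and summaries but slow for long generations.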