TinyLLM Trials: Can We Run LLMs Locally?

Author
Shiv Bade
Tags
llm
tinyllm
Published
November 12, 2023
Tweet
Ran TinyLLM on an M1 Mac:
  • Quantized model ≈ 4 GB
  • Great for offline QA tasks
  • Too slow for real chat UX
Still… seeing it run without the cloud is magical.
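
For anyone wanting to reproduce this kind of offline run, here is a minimal sketch using the llama-cpp-python bindings with a GGUF-quantized model. The model path, file name, and parameters are placeholders, not the exact setup from this trial:

```python
import time

from llama_cpp import Llama

# Load a ~4 GB quantized model entirely from local disk -- no network calls.
llm = Llama(
    model_path="./models/tinyllm-q4.gguf",  # hypothetical local path
    n_ctx=2048,        # context window
    n_gpu_layers=-1,   # offload all layers to the M1's Metal backend
    verbose=False,
)

# A one-shot offline QA prompt, timed to measure generation throughput.
start = time.time()
out = llm(
    "Q: What is the capital of France?\nA:",
    max_tokens=64,
    stop=["\n"],
)
elapsed = time.time() - start

answer = out["choices"][0]["text"].strip()
tokens = out["usage"]["completion_tokens"]
print(f"{answer}  ({tokens} tokens in {elapsed:.1f}s, {tokens / elapsed:.1f} tok/s)")
```

The tokens-per-second readout at the end makes the chat-UX complaint concrete: throughput that is perfectly acceptable for a one-off QA answer feels sluggish once you put it behind an interactive, turn-by-turn chat loop.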