Categories: None
local_llm
local_llm is a lightweight Ruby gem that lets you interact with locally installed Ollama LLMs such as LLaMA, Mistral, CodeLLaMA, Qwen, and more. It supports configurable default models, configurable Ollama API endpoints, real-time streaming or non-streaming responses, and both one-shot and multi-turn chat—while keeping all inference fully local, private, and offline.
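Since the gem wraps a locally running Ollama server, its core job can be illustrated against Ollama's documented HTTP API. The sketch below is an assumption about how such a wrapper might look, not the gem's actual interface: the `OllamaClient` class name, its constructor keywords, and the default model are hypothetical; only the `POST /api/generate` endpoint and its `model`/`prompt`/`stream` payload fields come from Ollama's API documentation.

```ruby
require "json"
require "net/http"

# Hypothetical minimal client for a local Ollama server, similar in spirit
# to what local_llm provides (class and method names are illustrative only).
class OllamaClient
  # Configurable endpoint and default model, as the gem description implies.
  def initialize(host: "http://localhost:11434", model: "mistral")
    @host  = host
    @model = model
  end

  # Build the JSON body for a one-shot, non-streaming generation request
  # (fields follow Ollama's documented /api/generate payload).
  def generate_payload(prompt)
    { model: @model, prompt: prompt, stream: false }
  end

  # Send the request; inference stays entirely on the local machine.
  def generate(prompt)
    uri = URI("#{@host}/api/generate")
    res = Net::HTTP.post(uri, generate_payload(prompt).to_json,
                         "Content-Type" => "application/json")
    JSON.parse(res.body)["response"]
  end
end

client = OllamaClient.new(model: "qwen2")
puts client.generate_payload("Explain tail recursion in one sentence.").to_json
```

Setting `stream: true` instead would make Ollama return newline-delimited JSON chunks, which is how a wrapper like this would implement the real-time streaming mode the description mentions.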
| Period | Ranking | Downloads |
|---|---|---|
| Total | 188,091 of 188,339 | 238 |
| Daily | 60,961 of 188,315 | 2 |
Depended by
| Rank | Downloads | Name |
|---|---|---|
Depends on
| Rank | Downloads | Name |
|---|---|---|
Owners
| # | Handle |
|---|---|
| 1 | iqc |