Categories: None

local_llm

https://rubygems.org/gems/local_llm
https://github.com/barek2k2/local_llm
local_llm is a lightweight Ruby gem that lets you interact with locally installed Ollama LLMs such as LLaMA, Mistral, CodeLLaMA, Qwen, and more. It supports configurable default models, configurable Ollama API endpoints, real-time streaming or non-streaming responses, and both one-shot and multi-turn chat—while keeping all inference fully local, private, and offline.
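Under the hood, a gem like this talks to Ollama's local HTTP API (by default on `localhost:11434`). The sketch below is not the gem's own interface; it is a minimal illustration of the kind of call such a gem wraps, using Ollama's documented `/api/generate` endpoint and an assumed model name (`llama3`) and non-streaming mode:

```ruby
require "json"
require "net/http"
require "uri"

# Build the JSON payload Ollama's /api/generate endpoint expects.
# The model name and stream flag defaults here are illustrative, not
# the local_llm gem's actual configuration.
def build_generate_payload(prompt, model: "llama3", stream: false)
  { model: model, prompt: prompt, stream: stream }.to_json
end

# One-shot prompt against a locally running Ollama server.
# Requires Ollama listening on its default port; everything stays offline.
def generate(prompt, model: "llama3", host: "http://localhost:11434")
  uri = URI("#{host}/api/generate")
  res = Net::HTTP.post(uri, build_generate_payload(prompt, model: model),
                       "Content-Type" => "application/json")
  JSON.parse(res.body)["response"]
end
```

With a model already pulled (e.g. `ollama pull llama3`), `generate("Explain Ruby blocks in one sentence.")` returns the model's full text response; streaming variants instead read the response body line by line as JSON chunks.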

Total

Ranking: 188,663 of 188,987
Downloads: 263

Daily

Ranking: 121,333 of 188,969
Downloads: 0

Depended by

Rank | Downloads | Name

Depends on

Rank | Downloads | Name

Owners

# | Handle
1 | iconiqc