# Local OpenAI LLM

by skye-harris

Home Assistant LLM integration for local OpenAI-compatible services (llama.cpp, vLLM, etc.)
Allows use of generic OpenAI-compatible LLM services, such as (but not limited to) llama.cpp and vLLM.
This integration has been forked from Home Assistant's OpenRouter integration, with the following changes:

- Stripping of `<think>` tags from responses
- Adding Tools for Assist

## Installation

Have HACS installed; this will allow you to update easily.

Adding this repository to HACS can be done using this button:
> [!NOTE]
> If the button above doesn't work, add `https://github.com/skye-harris/hass_local_openai_llm` as a custom repository of type Integration in HACS.
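One of the changes listed above is stripping `<think>` tags from responses, which reasoning-style models emit before their final answer. A minimal sketch of that behaviour (illustrative only; the function name and regex are my own, not the integration's actual code):

```python
import re

def strip_think_tags(text: str) -> str:
    """Remove <think>...</think> blocks (and trailing whitespace after them)
    so only the model's final answer reaches the Assist pipeline.
    Illustrative sketch, not the integration's actual implementation."""
    return re.sub(r"<think>.*?</think>\s*", "", text, flags=re.DOTALL)

reply = "<think>Let me check the weather entity...</think>It is 21 degrees."
print(strip_think_tags(reply))  # It is 21 degrees.
```

The `re.DOTALL` flag matters because reasoning blocks usually span multiple lines.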
Then search for and download the Local OpenAI LLM integration.

Alternatively, to install manually, copy the `local_openai` folder from the latest release to the `custom_components` folder in your config directory.

After installation, configure the integration through Home Assistant's UI:
1. Go to Settings → Devices & Services
2. Click Add Integration
3. Search for Local OpenAI LLM
4. Enter your server's base URL; this usually ends with `/v1` but may differ depending on your server configuration.

> [!NOTE]
> If you have the Extended OpenAI Conversation integration installed, be aware that it has a dependency on an older version of the OpenAI client library.
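The base URL is a common stumbling point: most OpenAI-compatible servers expose their API under `/v1`, but not all do. A small sketch (the helper name and defaults are hypothetical, not part of the integration) of normalizing a server address before use:

```python
def normalize_base_url(url: str, api_path: str = "/v1") -> str:
    """Hypothetical helper: strip any trailing slash, then append the API
    path if it is missing. llama.cpp and vLLM typically serve under /v1,
    but check your own server's configuration."""
    url = url.rstrip("/")
    if not url.endswith(api_path):
        url += api_path
    return url

print(normalize_base_url("http://localhost:8000"))      # http://localhost:8000/v1
print(normalize_base_url("http://localhost:8000/v1/"))  # http://localhost:8000/v1
```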