# AnythingLLM Embed Widget
This repo (`https://github.com/Mintplex-Labs/anythingllm-embed`) is a submodule of AnythingLLM - the all-in-one AI application.
Please report any issues or feature requests to the main repo.
> [!WARNING]
> The core AnythingLLM team publishes a pre-built version of the script that is bundled with the main application; you can find it in the main repo. You should only be working in this repo if you want to build your own custom embed widget for AnythingLLM.
This folder of AnythingLLM contains the source code for the embedded version of AnythingLLM, which provides a public-facing chat interface for your workspace.
The AnythingLLM embedded chat widget allows you to expose a workspace and its embedded knowledge base as a chat bubble via a `<script>` or `<iframe>` element that you can embed in any website or HTML page.
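As a rough sketch of the `<iframe>` approach (the page name `widget.html` and host URL are placeholders, not a documented API — the assumption is that you serve a minimal page that loads the widget script and frame that page):

```html
<!-- Hypothetical sketch: frame a page of your own that loads the
     chat widget script. The src URL and sizing are placeholders. -->
<iframe
  src="https://your-host.example.com/widget.html"
  style="border: none; width: 400px; height: 600px;"
  title="AnythingLLM chat"
></iframe>
```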
By using the AnythingLLM embedded chat widget, you are responsible for securing and configuring the embed so that it does not allow excessive abuse of your instance's chat model.
## Developer Setup

- `cd embed` from the root of the repo
- `yarn` to install all dev and script dependencies
- `yarn dev` to boot up an example HTML page to use the chat embed widget

While in development mode (`yarn dev`), the script will rebuild on any changes to files in the `src` directory. Ensure that the required keys for the development embed are accurate and set.
`yarn build` will compile and minify your build of the script. You can then host and link your built script wherever you like.
## `<script>` tag HTML embed

The primary way of embedding a workspace as a chat widget is via a simple `<script>` tag.
```html
<!--
An example of a script tag embed

REQUIRED data attributes:
  data-embed-id     // The unique id of your embed with its default settings
  data-base-api-url // The URL of your AnythingLLM instance backend
-->
<script
  data-embed-id="5fc05aaf-2f2c-4c84-87a3-367a4692c1ee"
  data-base-api-url="http://localhost:3001/api/embed"
  src="http://localhost:3000/embed/anythingllm-chat-widget.min.js"
></script>
```
## `<script>` Customization Options

### LLM Overrides
- `data-prompt` — Override the chat window with a custom system prompt. This is not visible to the user. If undefined, the embed's attached workspace system prompt is used.
- `data-model` — Override the chat model used for responses. This must be a valid model string for your AnythingLLM LLM provider. If unset, it will use the embed's attached workspace model selection or the system set...
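Putting the overrides above together, a customized embed tag might look like the following (the embed id, URLs, prompt text, and model string are all placeholders — `data-model` in particular must be a model string valid for your own LLM provider):

```html
<!-- Sketch only: values below are placeholders, not working credentials. -->
<script
  data-embed-id="5fc05aaf-2f2c-4c84-87a3-367a4692c1ee"
  data-base-api-url="http://localhost:3001/api/embed"
  data-prompt="You are a helpful support assistant for Acme Co."
  data-model="gpt-4o"
  src="http://localhost:3000/embed/anythingllm-chat-widget.min.js"
></script>
```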