LM Studio (Self-Hosted LLM)
Bring Your Self-Hosted LM Studio LLM to Life with Interworky
With Interworky, you can seamlessly integrate your self-hosted LM Studio LLM into any website and bring your AI chatbot to life.
This tutorial will guide you through the setup process, integration steps, and testing of your chatbot.
Step 1: Create an Interworky Account
1. Visit interworky.com/setup-account.
2. Follow the on-screen instructions to create your account.
If you don’t have a website yet, you can use example.com as a placeholder during the setup process.
Step 2: Configure Your LM Studio Details
1. Log in to your Interworky account and navigate to the Dashboard.
2. Go to the Account Settings section.
3. Add the following details:
LM Studio Hosted URL: your LM Studio server’s endpoint. For example, if your server is hosted on a domain: http://yourdomain.com/v1/chat/completions. If running locally, see the “Running Locally with Ngrok” section below.
Model Name: the model name of your LM Studio instance (e.g., llama-3.2-3b-instruct).
System Message: the default system message for your chatbot.
4. Click Save to store your configuration.
Example Data:
LM Studio Hosted URL: https://myllmwebsite.com/v1/chat/completions
Model Name: llama-3.2-3b-instruct
System Message: Respond in a funny quirky way and use metadata attached to understand the user
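LM Studio’s local server exposes an OpenAI-compatible chat completions endpoint, so you can sanity-check these values before saving them in Interworky. Below is a minimal sketch with curl, using the example data above; the user message is just a placeholder:
curl https://myllmwebsite.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.2-3b-instruct",
    "messages": [
      {"role": "system", "content": "Respond in a funny quirky way and use metadata attached to understand the user"},
      {"role": "user", "content": "Hello there!"}
    ]
  }'
If everything is configured correctly, the response is a JSON object whose choices[0].message.content field holds the model’s reply.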
Step 3: Integrate the Chatbot into Your Website
1. Navigate to the Integration Tab in your Interworky Dashboard.
2. Copy the embed script provided for your chatbot.
3. Add the script to the <head> or <body> section of the website where you want your chatbot to appear.
For platform-specific integration tutorials (e.g., WordPress, Squarespace), refer to the Integration Tutorials.
Step 4: Test Your Chatbot
1. Go to the AI ChatBot Customization tab in your Interworky Dashboard.
2. Use the Test Chatbot button to test your configuration before integrating it into your website.
This ensures that your LM Studio URL, model, and system message are working as expected.
Running Locally with Ngrok
If your LM Studio server is running locally, you can expose it to the internet using Ngrok, allowing your production server or other external users to access it.
Steps to Set Up Ngrok
1. Install Ngrok: download and install it from the official website.
2. Authenticate your Ngrok account, then expose your local server:
ngrok config add-authtoken <YOUR_AUTHTOKEN>  # paste the authtoken from your Ngrok dashboard
ngrok http 1237  # if your LM Studio server is running on port 1237 (LM Studio defaults to 1234)
3. Copy the HTTPS forwarding URL that Ngrok prints; this is your public endpoint.
4. Ensure Enable CORS and Serve on Local Network are enabled in your LM Studio server settings.
5. Update your LM Studio Hosted URL in Interworky Account Settings to the Ngrok forwarding URL, keeping the /v1/chat/completions path.
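Before saving the new endpoint, you can confirm that the tunnel actually reaches LM Studio. A quick check with curl; the forwarding URL below is a hypothetical placeholder, so substitute the one Ngrok printed for your tunnel:
curl https://your-subdomain.ngrok-free.app/v1/models  # hypothetical URL; use your own tunnel’s address
A JSON list of your loaded models means both the tunnel and the LM Studio server are reachable.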
And voilà! You can chat with your self-hosted LM Studio model on any website.