
LM Studio (Self-Hosted LLM)

Bring Your Self-Hosted LM Studio LLM to Life with Interworky

With Interworky, you can seamlessly integrate your self-hosted LM Studio LLM into any website and bring your AI Agent to life.

This tutorial will guide you through the setup process, integration steps, and testing of your Agent.

Step 1: Create an Interworky Account

1. Visit interworky.com/setup-account.

2. Follow the on-screen instructions to create your account.

If you don’t have a website yet, you can use example.com as a placeholder during the setup process.

Step 2: Add Your LM Studio Configuration

1. Log in to your Interworky account and navigate to the Dashboard.

2. Go to the Account Settings section.

Account Settings in the Sidebar

3. Add the following details:

  • LM Studio Hosted URL: This is your LM Studio server’s endpoint.

    • For example:

      • If your server is hosted on a domain: http://yourdomain.com/v1/chat/completions.

If running locally, see the “Running Locally” note below.

  • Model Name: Specify the model name of your LM Studio instance (e.g., llama-3.2-3b-instruct).

  • System Message: Enter the default system message for your Agent.

LM Studio Model Name
4. Click Save to store your configuration.

  • Example Data:

    • LM Studio Hosted URL: https://myllmwebsite.com/v1/chat/completions

    • Model Name: llama-3.2-3b-instruct

    • System Message: Respond in a funny quirky way and use metadata attached to understand the user
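Before saving, it helps to confirm that the endpoint actually responds. LM Studio serves an OpenAI-compatible API, so a quick curl request should return a chat completion; the values below are the example data from above, so substitute your own URL, model name, and system message:

curl https://myllmwebsite.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.2-3b-instruct",
    "messages": [
      {"role": "system", "content": "Respond in a funny quirky way and use metadata attached to understand the user"},
      {"role": "user", "content": "Hello!"}
    ]
  }'

A JSON response with a choices array containing the model's reply means the endpoint is ready for Interworky.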

Step 3: Integrate Your Agent on a Website

  1. Navigate to the Integration Tab in your Interworky Dashboard.

  2. Copy the script embed provided for your agent.

  3. Add the script to the <head> or <body> section of the website where you want your agent to appear.

  4. For platform-specific integration tutorials (e.g., WordPress, Squarespace), refer to the Integration Tutorials.
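If you are editing raw HTML directly, the placement looks roughly like this. The script tag below is a hypothetical stand-in; the real embed, including its src URL and any attributes, comes from your Integration Tab:

<!DOCTYPE html>
<html>
  <head>
    <title>My Website</title>
    <!-- Paste the embed copied from the Integration Tab here (placeholder shown) -->
    <script src="https://example.com/interworky-agent.js" async></script>
  </head>
  <body>
    <h1>Welcome to my site</h1>
  </body>
</html>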

Step 4: Test Your AI Agent (Optional)

  1. Go to the AI Agent Customization tab in your Interworky Dashboard.

  2. Use the Test Agent button to test your configuration before integrating it into your website.

Running Locally (Note for Locally Hosted Servers)

Running Locally with Ngrok

If your LM Studio server is running locally, you can expose it to the internet using Ngrok, allowing your production server or other external users to access it.

Steps to Set Up Ngrok

1. Install Ngrok:

  • Download and install Ngrok from the official website.

  • After installation, authenticate your Ngrok account and expose your local server:

ngrok http 1237  # if your LM Studio server is running on port 1237

2. Capture your Ngrok forwarding URL (you can verify the tunnel with the quick check after these steps).

Ngrok Forwarding URL
3. Ensure CORS and Serve on Local Network are enabled in your LM Studio server settings.

4. Update the LM Studio Hosted URL in your Interworky Account Settings to your Ngrok forwarding URL.

Add LM Studio URL to Account Settings
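To confirm the tunnel works end to end, query the forwarding URL from outside your machine. The subdomain below is a hypothetical example; use the one Ngrok printed for you:

curl https://your-subdomain.ngrok-free.app/v1/models

If this lists the models loaded in LM Studio, the tunnel is live, and https://your-subdomain.ngrok-free.app/v1/chat/completions is the value to enter as your LM Studio Hosted URL.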

And voilà 🎉 You can now chat with your self-hosted LM Studio model on any website!

Example of Interworky working with a self-hosted LM Studio model
