LM Studio (Self Hosted LLM)

Bring Your Self-Hosted LM Studio LLM to Life with Interworky


With Interworky, you can seamlessly integrate your self-hosted LM Studio LLM into any website and bring your AI Agent to life.

This tutorial will guide you through the setup process, integration steps, and testing of your Agent.

Step 1: Create an Interworky Account

1. Visit interworky.com/setup-account.

2. Follow the on-screen instructions to create your account.

If you don’t have a website yet, you can use example.com as a placeholder during the setup process.

Step 2: Add Your LM Studio Configuration

1. Log in to your Interworky account and navigate to the Dashboard.

2. Go to the Account Settings section in the sidebar.

3. Add the following details:

  • LM Studio Hosted URL: This is your LM Studio server’s endpoint.

    • For example:

      • If your server is hosted on a domain: http://yourdomain.com/v1/chat/completions.

  • Model Name: Specify the model name of your LM Studio instance (e.g., llama-3.2-3b-instruct).

  • System Message: Enter the default system message for your Agent.

4. Click Save to store your configuration.

  • Example Data:

    • LM Studio Hosted URL: https://myllmwebsite.com/v1/chat/completions

    • Model Name: llama-3.2-3b-instruct

    • System Message: Respond in a funny quirky way and use metadata attached to understand the user
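Before saving, you can sanity-check that your server answers chat-completions requests. The sketch below (Python standard library only) builds an OpenAI-compatible payload like the one Interworky sends, using the example URL and model above; the helper names `build_payload` and `send_chat_request` are illustrative, not part of Interworky or LM Studio.

```python
import json
import urllib.request

def build_payload(model, system_message, user_message):
    """Build an OpenAI-compatible chat-completions payload, as LM Studio expects."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_message},
            {"role": "user", "content": user_message},
        ],
    }

def send_chat_request(url, payload):
    """POST the payload to the LM Studio endpoint and return the parsed JSON reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (uncomment and point at your own LM Studio Hosted URL):
# payload = build_payload(
#     "llama-3.2-3b-instruct",
#     "Respond in a funny quirky way and use metadata attached to understand the user",
#     "Hello!",
# )
# reply = send_chat_request("https://myllmwebsite.com/v1/chat/completions", payload)
# print(reply["choices"][0]["message"]["content"])
```

If the request returns a JSON body with a `choices` array, your URL, model name, and server are configured correctly.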

Step 3: Integrate Your Agent on a Website

  1. Navigate to the Integration Tab in your Interworky Dashboard.

  2. Copy the script embed provided for your agent.

  3. Add the script to the <head> or <body> section of the website where you want your agent to appear.

Step 4: Test Your AI Agent (Optional)

  1. Go to the AI Agent Customization tab in your Interworky Dashboard.

  2. Use the Test Agent button to test your configuration before integrating it into your website.

This ensures that your LM Studio URL, model, and system message are working as expected.

Running Locally (Note for Locally Hosted Servers)

Running Locally with Ngrok

If your LM Studio server is running locally, you can expose it to the internet using Ngrok, allowing your production server or other external users to access it.

Steps to Set Up Ngrok

1. Install Ngrok:

  • Download and install Ngrok from the official website.

  • After installation, authenticate your Ngrok account and expose your local server:

ngrok http 1237 # if your server is running on port 1237

2. Capture the Ngrok forwarding address from the terminal output.

3. Ensure CORS and Serve on Local Network are enabled in your LM Studio server settings.

4. Update your LM Studio endpoint in Interworky Account Settings with the Ngrok forwarding address.

For platform-specific integration tutorials (e.g., WordPress, Squarespace), refer to the Integration Tutorials section.
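Capturing the forwarding address can also be done programmatically: the ngrok agent's local web interface (http://127.0.0.1:4040) exposes active tunnels as JSON. A sketch, assuming ngrok is already running; `compose_endpoint` and `ngrok_public_url` are illustrative helper names, not part of ngrok or Interworky.

```python
import json
import urllib.request

CHAT_COMPLETIONS_PATH = "/v1/chat/completions"

def compose_endpoint(public_url):
    """Append the LM Studio chat-completions path to a forwarding URL."""
    return public_url.rstrip("/") + CHAT_COMPLETIONS_PATH

def ngrok_public_url(api="http://127.0.0.1:4040/api/tunnels"):
    """Read the first https forwarding URL from the ngrok agent's local API."""
    with urllib.request.urlopen(api) as resp:
        tunnels = json.loads(resp.read().decode("utf-8"))["tunnels"]
    for tunnel in tunnels:
        if tunnel["public_url"].startswith("https"):
            return tunnel["public_url"]
    raise RuntimeError("no https tunnel found - is ngrok running?")

# Example (requires a running ngrok tunnel):
# print(compose_endpoint(ngrok_public_url()))
```

Paste the composed URL into the LM Studio Hosted URL field in your Interworky Account Settings. Note that the forwarding address changes each time you restart ngrok (unless you use a reserved domain), so you will need to update the field after each restart.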

And voilà! You can chat with your self-hosted LM Studio model on any website. 🎉