How to Connect Silly Tavern to RunPod


Introduction

In this tutorial, we will cover how to connect Silly Tavern to RunPod using Ooba Booga’s Text Generation Web UI. This guide aims to help you troubleshoot common issues and streamline your setup for a smooth experience.

Important Links

Sign up for RunPod here: https://runpod.io?ref=49tc28ho

Textgen Template: https://runpod.io/console/gpu-cloud?template=u0jgut4q1v&ref=49tc28ho

Prerequisites for Connecting Silly Tavern to RunPod

Let’s assume you’re already familiar with Ooba Booga’s Text Generation Web UI and have Silly Tavern installed on your system. If you’re new to these tools or need help with installation, feel free to leave a comment, and I’ll consider creating a detailed guide. Also, let me know if you’d like in-depth videos on Text Generation Web UI and Silly Tavern.

Step 1: Get Text Gen Running on RunPod

To begin, we need to get the Text Generation Web UI running on RunPod. I’ve prepared templates to simplify this process; you can find them in the Important Links section above. Here’s how to use them:

  1. Open the Template: Copy the provided link and paste it into your browser’s URL bar. A window will appear, displaying the template.
  2. Choose a Setup: For this tutorial, we’ll select a GPU with plenty of VRAM, such as an RTX 4090, and deploy it.

Step 2: Customize Your Deployment

Once you’re in the “Customize Deployment” section:

  1. Open the Environment Variables: Here, you’ll find curated models recommended by various community forums. You won’t need to bring your own model; just choose the one you want by changing its value from 0 to 1.
  2. Select the Model: For this demonstration, we’ll use the model ‘DAREBEAGEL_2x7B_Q5_K_M’. Set its value to 1, apply the overrides, and deploy the model.
  3. Check the Logs: After deployment, go to the logs section. If you see the message “start container,” your container has started successfully. The logs will also show the model downloading, which can take 5 to 10 minutes depending on the model size. (If you’d rather check the pod from a script, see the sketch after this list.)
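
If you prefer to confirm the pod from the command line instead of the web console, here is a minimal sketch using the runpod Python SDK (pip install runpod). The API key is a placeholder you create in the RunPod console, and the exact fields returned can vary between SDK versions.

```python
# Minimal sketch: list your pods and check their status with the runpod SDK.
# Assumes `pip install runpod` and an API key generated in the RunPod console.
import runpod

runpod.api_key = "YOUR_RUNPOD_API_KEY"  # placeholder, use your own key

for pod in runpod.get_pods():
    # Field names may differ slightly between SDK versions.
    print(pod.get("id"), pod.get("name"), pod.get("desiredStatus"))
```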

If you don’t see the model you’re looking for, join our Discord community. We discuss everything from large language models to Silly Tavern and Stable Diffusion. We even hold bi-monthly competitions with prizes!

Step 3: Connect Silly Tavern to the RunPod API

With the model downloading, you’re almost ready to connect Silly Tavern to the RunPod API:

  1. Start Silly Tavern: Once you’ve started Silly Tavern and are on the main screen, go to the API section. Ensure your API is set to text completion and that the API type is set to the default Ooba Booga.
  2. Get the Server URL: Navigate to the “Connect” section in RunPod, where you’ll find HTTP service ports. First, confirm that the web UI is working by clicking on port 7860. Once confirmed, copy the link for HTTP service port 5000.
  3. Paste the URL: Go back to Silly Tavern and paste the URL into the “Server URL” field. (A quick way to verify the endpoint first is sketched after this list.)
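
Before you paste it, you can sanity-check that the API behind port 5000 is reachable. The sketch below assumes the template exposes Text Generation Web UI’s OpenAI-compatible API on port 5000 through RunPod’s HTTP proxy; the pod ID in the URL is a placeholder, so use the exact link you copied from the Connect panel.

```python
# Minimal sketch: confirm the Text Generation Web UI API answers before
# pointing Silly Tavern at it. The URL below is a placeholder; use the link
# you copied from RunPod's "Connect" panel for HTTP service port 5000.
import requests

base_url = "https://YOUR-POD-ID-5000.proxy.runpod.net"  # placeholder

resp = requests.get(f"{base_url}/v1/models", timeout=30)
resp.raise_for_status()
print(resp.json())  # should list the models the backend can see
```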

Step 4: Final Preparations

Before connecting:

  1. Ensure the Model is Active: Head back to the Text Gen Web UI and navigate to the model section. Refresh it to ensure that the model you selected is ready for use.
  2. Configure GPU Layers: Set n-gpu-layers to 256. This setting is crucial for loading the entire model onto the GPU, so generation doesn’t slow down or time out. (A scripted alternative is sketched after this list.)
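
If you would rather script this step than click through the model tab, recent Text Generation Web UI builds also expose an internal model-load endpoint on the same port-5000 API. This is only a sketch under that assumption; the endpoint and argument names may differ in older builds, and the pod URL is again a placeholder.

```python
# Minimal sketch: load the model with n_gpu_layers set through the API
# instead of the web UI. Assumes a recent Text Generation Web UI build that
# exposes /v1/internal/model/load on the port-5000 API.
import requests

base_url = "https://YOUR-POD-ID-5000.proxy.runpod.net"  # placeholder

payload = {
    "model_name": "DAREBEAGEL_2x7B_Q5_K_M",  # use the exact name shown in the model dropdown
    "args": {"n_gpu_layers": 256},           # offload all layers to the GPU
}

resp = requests.post(f"{base_url}/v1/internal/model/load", json=payload, timeout=600)
resp.raise_for_status()
print("Model load request accepted")
```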

Step 5: Connect and Test

Now that everything is set up:

  1. Click Connect: Jump back into Silly Tavern and click the “Connect” button. If everything is configured correctly, you should see a message indicating that the ‘DAREBEAGEL_2x7B_Q5_K_M’ model is connected.
  2. Test the Connection: Try out the connection by selecting a character, like Flux the Cat, and engaging in a conversation. (A direct API test is sketched after this list.)
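
If the chat seems unresponsive and you want to rule out Silly Tavern itself, you can send a test completion straight to the backend. This again assumes the OpenAI-compatible endpoints on port 5000 and uses the same placeholder pod URL.

```python
# Minimal sketch: send a test prompt straight to the backend to confirm
# generation works, independent of Silly Tavern. Placeholder URL as before.
import requests

base_url = "https://YOUR-POD-ID-5000.proxy.runpod.net"  # placeholder

resp = requests.post(
    f"{base_url}/v1/completions",
    json={"prompt": "Say hello in one short sentence.", "max_tokens": 50},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```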

Conclusion

And that’s it! You’ve successfully connected Silly Tavern to Ooba Booga’s Text Generation Web UI on RunPod. I hope you found this guide helpful. If you have any questions or want to discuss large language models and Silly Tavern roleplay, join our Discord community.

