Running Stable Diffusion in Docker

Stable Diffusion is a deep learning, text-to-image model that generates paintings and other images from text descriptions.

Getting Started

Begin by registering for TensorDock Marketplace and selecting a GPU with at least 10 GB of memory to launch an instance.

We will be using an Ubuntu instance for this tutorial. Select an external port that maps to internal port 22; this will allow SSHing into the instance.

Finish up your server by setting a secure username and password.

SSH into the server

On the information page for your instance, you can find the necessary IPv4 address and command to SSH through the command line.

Using that command, you can SSH into your server. At the command line, enter your username and password; once authenticated, you have access to the GPU and can use it for Stable Diffusion!
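
For example, with placeholder values standing in for the IPv4 address, external port, and username shown on your instance page, the command looks like this:

# Replace the port, username, and address with the values from your dashboard
ssh -p <external port> <username>@<IPv4 address>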

Using Docker and Docker Networking

Docker comes preinstalled on all TensorDock instances; however, by adding Docker networking configuration, you can make the external requests needed for Stable Diffusion. First, clone the following Git repo and cd into that directory.

git clone https://github.com/monatis/stable-diffusion-tf-docker.git && cd stable-diffusion-tf-docker

Then, copy the repository's daemon.json into place and restart the Docker service.

sudo cp ./daemon.json /etc/docker/daemon.json

sudo systemctl restart docker.service
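
As an optional sanity check, you can confirm that the daemon came back up with the new configuration and that containers can reach the outside network (the busybox image here is only an example):

# Verify the Docker service restarted cleanly
sudo systemctl status docker --no-pager

# Confirm a container can make external requests
docker run --rm busybox ping -c 1 8.8.8.8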

Setting up Docker Compose

With Docker Compose, the repository's docker-compose.yml reads the port to expose from a PUBLIC_PORT environment variable. Set it to whatever external port you chose to forward to port 22, then bring the service up:

export PUBLIC_PORT=<your port number>

docker compose up -d
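
The first start has to pull the container image, so it can take a while. You can follow progress with:

docker compose logs -f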

Once it’s up and running, go to http://mass-a.tensordockmarketplace.com:<port number>/docs (substituting your instance’s hostname and the port you exported) for the Swagger UI provided by FastAPI. Using the POST /generate endpoint, you can generate your image and get its download id. You can then download the image with GET /download/<download_id>.
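
As an illustrative sketch of that workflow (the exact request fields come from the repo's FastAPI app, so the "prompt" field name and response shape here are assumptions), the same two endpoints can also be called with curl:

# Request an image from a text description (the JSON field name is an assumption)
curl -X POST "http://mass-a.tensordockmarketplace.com:<port number>/generate" -H "Content-Type: application/json" -d '{"prompt": "a watercolor painting of a lighthouse at dusk"}'

# Use the download id returned above to save the result locally
curl -o result.png "http://mass-a.tensordockmarketplace.com:<port number>/download/<download_id>"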