☁️ Running Sogni Fast Worker Remotely
How to run Sogni Fast Worker on a Distributed GPU Cloud platform
Don't have your own local Nvidia box to run Sogni Fast Worker on? The Worker is also designed to deploy to any cloud GPU rental service that supports loading custom Docker images. NEW: Salad.com now hosts Recipes for Sogni Fast Worker for Flux and Sogni Fast Worker for Stable Diffusion! Follow the links to install through Salad Recipes, or use the steps below.
Running on Salad.com
Create a Sogni account if you don't have one already. You may create one at app.sogni.ai. Multiple Sogni Fast Workers can be run from one account or separate accounts.
Log in to nft.sogni.ai and mint a free Sogni NFT for each GPU worker you want to deploy. Record the NFT Token ID and the API Key for each worker. The API Key is unique to the Sogni Account. The NFT Token ID will be unique for each worker.
Sign up for a Salad account using the "Deploy on SaladCloud" option and follow their website instructions to create a new Container Group.
Select "Deploy a Container Group" under the SaladCloud Organization screen.
Select the "Sogni Fast Worker" Recipe under the Recipes section.
Select your Sogni Worker Type, and add your Sogni API Key and NFT Token ID.
Leave "Replicas" as 1.
All other settings have been automatically configured for you. Simply press "Deploy" and wait for your remote Sogni Fast Worker deployment to come online!
Running on other hosted GPU platforms
Verify that your target platform supports running Docker container images with GPU access with a supported Nvidia GPU. It must also support configuring ENV Vars for the deployed instance.
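If you can get a shell on the target platform, or want to test locally first, a quick way to confirm that your environment can run GPU-enabled Docker containers is NVIDIA's standard smoke test (the CUDA image tag here is just an example; any recent base tag works):

```shell
# If this prints your GPU table, Docker has working Nvidia GPU access.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```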
Create a Sogni account if you don't have one already. You may create one at app.sogni.ai. Multiple Sogni Fast Workers can be run from one account or separate accounts.
Log in to nft.sogni.ai and mint a free Sogni NFT for each GPU worker you want to deploy. Record the NFT Token ID and the API Key for each worker. The API Key is unique to the Sogni Account. The NFT Token ID will be unique for each worker.
For "Container Configuration" set the Image Source to sogni/sogni-flux-worker:latest to spawn a Flux worker, or sogni/sogni-stable-diffusion-worker:latest for a Stable Diffusion worker.
Under Replica Count, always leave this at 1. If you are running multiple workers, create a new Container Group for each worker, as Salad.com passes Environment Variables at the Container Group level.
For vCPUs select 4. For Memory select 30 GB. For GPU select "RTX 4090" (RTX 5090 is not yet supported; support is coming soon). For Disk Space select the maximum size available for model caching.
For Health Check Probes: for the Startup Probe, select Protocol "HTTP/1.X", Path "/startup", Port 8000, Initial Delay Seconds 90, Period Seconds 5, Timeout Seconds 5, Success Threshold 2, Failure Threshold 120.
For the Liveness Probe, select Protocol "HTTP/1.X", Path "/liveness", Port 8000, Initial Delay Seconds 90, Period Seconds 10, Timeout Seconds 30, Success Threshold 1, Failure Threshold 6. Note: if your platform does not support health probes you can skip the two probe steps; the Sogni Worker has its own internal health probes to fall back on.
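The large Failure Threshold on the startup probe is deliberate: it gives the container a generous window for first-run model downloads before the platform restarts it. The budget works out to:

```shell
# Startup probe budget = Failure Threshold * Period Seconds
max_startup_wait=$((120 * 5))
echo "startup probe budget: ${max_startup_wait}s"   # prints "startup probe budget: 600s"
```

That is 10 minutes of non-ready time tolerated after the 90-second initial delay, after which the worker should answer on /startup.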
Under "Environment Variables" add the following:
API_KEY: The API Key for your Sogni Account.
NFT_TOKEN_ID: The NFT Token ID for your Sogni worker.
AUTO_DOWNLOAD_TO_MIN_MODEL_COUNT: The number of models to download when the worker starts. Ignored for the Flux worker, which has models preloaded.
DATA_DOG_API_KEY: An optional DataDog API Key for troubleshooting.
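For platforms that expose a raw command line rather than a configuration UI, the settings above can be sketched as a single docker run invocation. This is an illustrative sketch, not an official command: the env var names and image tags come from the steps above, but the placeholder values are yours to fill in, and most hosted platforms will set these fields through their own UI instead.

```shell
# Hypothetical local/manual invocation mirroring the settings above.
# Replace the placeholder values with your own credentials.
docker run --gpus all \
  -e API_KEY="your-sogni-api-key" \
  -e NFT_TOKEN_ID="your-nft-token-id" \
  -e AUTO_DOWNLOAD_TO_MIN_MODEL_COUNT=3 \
  sogni/sogni-stable-diffusion-worker:latest
```

Swap in sogni/sogni-flux-worker:latest for a Flux worker, in which case AUTO_DOWNLOAD_TO_MIN_MODEL_COUNT is ignored.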
Click "Deploy" and your worker will start. You can find the latest integration documentation at docs.sogni.ai, and reach out to [email protected] for any additional assistance.
Need help? Join our Discord!
If you have any issues or questions setting up your Sogni Fast Worker you can reach out on Discord or via email to [email protected] for technical support. Join Sogni Discord ✨