This repository contains information about Cloud GPU offerings for Machine Learning practitioners.
Updated Apr 20, 2021
🐍 | Python library for RunPod API and serverless worker SDK.
A list of cloud GPU service providers and their prices for servers, instances, etc.
Vast.ai Python SDK
Run ComfyUI on Modal with auto-scaling, GPU snapshots, and easy model management.
Train a bidirectional or standard LSTM recurrent neural network to generate text on a free GPU using any dataset. Just upload your text file and click run!
LTX-Video deployment on Modal — serverless AI video generation with cloud GPU scaling
Compare API providers, local GPUs, and cloud for any model
Runnable recipes for fine-tuning, training, and deploying models on VESSL Cloud.
☁️ Cloud GPU platform for AI/ML workloads. Instant access to H100, A100, and RTX GPUs for training and deploying AI models.
Lightweight toolkit for automating Bittensor miner setup, monitoring, and management on cloud GPUs
A practical, non-official deployment guide for Isaac Lab + GR00T on cloud GPU instances, based on real-world testing, troubleshooting, and reproducibility notes.
Cloud GPU Price Comparison 2026 - Real cost testing of 5 GPU cloud providers (GPUhub, RunPod, Vast.ai, Lambda, SaladCloud) for machine learning, deep learning, and AI workloads. Python cost calculator included.
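The comparison above ships a Python cost calculator; a minimal sketch of that idea looks like the following. Note that the provider names and hourly rates below are illustrative placeholders, not the actual prices of GPUhub, RunPod, Vast.ai, Lambda, or SaladCloud.

```python
# Minimal cloud-GPU cost-calculator sketch. All rates are ILLUSTRATIVE
# placeholders; real provider prices change frequently and vary by region.

ILLUSTRATIVE_RATES_USD_PER_HOUR = {
    ("example-provider-a", "A100"): 1.80,
    ("example-provider-b", "A100"): 2.10,
    ("example-provider-a", "RTX 4090"): 0.45,
}

def training_cost(provider: str, gpu: str, hours: float, num_gpus: int = 1) -> float:
    """Total on-demand cost for a training run, rounded to cents."""
    rate = ILLUSTRATIVE_RATES_USD_PER_HOUR[(provider, gpu)]
    return round(rate * hours * num_gpus, 2)

def cheapest(gpu: str, hours: float) -> list[tuple[str, float]]:
    """Rank providers offering the given GPU by total cost, cheapest first."""
    offers = [
        (provider, training_cost(provider, gpu, hours))
        for (provider, g) in ILLUSTRATIVE_RATES_USD_PER_HOUR
        if g == gpu
    ]
    return sorted(offers, key=lambda offer: offer[1])

if __name__ == "__main__":
    # 10-hour single-GPU A100 run, ranked across the placeholder providers.
    print(cheapest("A100", hours=10.0))
```

Swapping in current per-provider rates (and adding storage or egress fees) turns this into a practical comparison tool.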
RunPod guardrails for OpenAI Symphony + Linear agents: Codex/Claude Code workers, manifests, artifact proof, cleanup.
Advanced digital twin generator using IPAdapter FaceID with batch image embedding averaging, ControlNet integration, and LoRA training for photorealistic avatar creation
A template repository for fine-tuning LLMs on Lambda Cloud