"The announcement (opens in new tab) notes that the AI model runs on "under 10GB of VRAM on consumer GPUs." Essentially you can run it on a 10GB Nvidia GeForce RTX 3080, an AMD Radeon RX 6700 or potentially something less powerful, though there's nothing here about the minimum graphics requirements. That's still contrary to a lot of AI generation models, which tend to be hosted by servers since they take several Nvidia A100 GPUs to run (opens in new tab)."
"It was trained on 4000 A100s? That's a total of 320 TB VRAM if I'm not mistaken hahahaha
"Stable Diffusion is trained on Stability AI's 4,000 A100 Ezra-1 AI ultracluster, with more than 10,000 beta testers generating 1.7 million images per day in order to explore this approach.
"The core dataset for Stable Diffusion comes from the upcoming CLIP-based AI model LAION-Aesthetics, which filters the images based on how "beautiful" they are. I'm not exactly sure how beauty has been defined in this instance, however. LAION-Aesthetics selects and reworks images from LAION 5B (opens in new tab)'s massive database, that was created in order address the issue (opens in new tab) that datasets—such as the billions of image and text pairs used by Dall-E and CLIP—have not been made openly available.
"It is now available. i'm running it on a 3060 with no problem but it has to have at least 12gb of ram in my experience. I get to occupy 11.4gb in total so at the vram level it is demanding.
"It was confirmed to work on AMD MI200 cards but takes around 4x as long to render
"Would this work on an M1 Mac with 32gb of shared RAM? Yes, it takes 4 minutes.
"I have a 3090 with 24gb and can render at 768 (uses around 20gb) but I'd like to know if I NVLink two 3090's can I render 1024, 2048 or all the way up to 4096
"Already confirmed it runs on 5.1 GB - Emad Twitter user
GeForce RTX 3080 Laptop GPU
GeForce RTX 3070 Laptop GPU
GeForce RTX 3060 Laptop GPU
GeForce RTX 3050 Ti Laptop GPU
GeForce RTX 3050 Laptop GPU
GeForce GTX 1650 Ti
GeForce GTX 1050 (Mobile)
GeForce GTX 960M
Radeon Vega 8
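Pulling together the figures reported in the quotes above — a 5.1 GB confirmed-working floor and the announced "under 10GB" target — here is a minimal sketch of a sanity check one might run before attempting local generation. The thresholds come from those reports, not from any official requirement:

```python
# Sketch: classify a card's VRAM against the figures reported above.
# 5.1 GB is the confirmed-working floor; 10 GB is the announced target.
# These thresholds come from the quoted reports, not an official spec.
REPORTED_FLOOR_GB = 5.1
REPORTED_TARGET_GB = 10.0

def vram_verdict(vram_gb: float) -> str:
    """Classify a GPU's VRAM against the reported thresholds."""
    if vram_gb >= REPORTED_TARGET_GB:
        return "comfortable"
    if vram_gb >= REPORTED_FLOOR_GB:
        return "tight but reported working"
    return "below the reported minimum"

print(vram_verdict(12.0))  # e.g. an RTX 3060 12GB
print(vram_verdict(6.0))   # e.g. an RTX 3060 Laptop GPU with 6GB
print(vram_verdict(4.0))   # e.g. a GTX 1650 Ti with 4GB
```

Cards near the bottom of the list above (GTX 1050, 960M, Vega 8) fall well under both thresholds, which is consistent with the comments that nothing official defines a minimum.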