Arduino

Install Arduino software from https://docs.arduino.cc/software/ide-v1/tutorials/Linux

Got an error when I tried to upload the basic sketch to an Arduino Uno (permission denied on the serial port).

Did: sudo chmod a+rw /dev/ttyACM0 ( https://forum.arduino.cc/t/permission-denied-on-dev-ttyacm0/475568 ) ... and it seemed to work.
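Note that the chmod fix only lasts until the board is unplugged, since the device node is recreated on reconnect. The usual permanent fix is to add your user to the group that owns the serial port (typically dialout on Debian/Ubuntu; the group name can differ by distro):

```shell
# See which group owns the port (often "dialout")
ls -l /dev/ttyACM0

# Add the current user to that group; log out and back in for it to take effect
sudo usermod -aG dialout "$USER"
```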

TTTThis

VRAM for AI image processing, computers, video cards

"The announcement notes that the AI model runs on "under 10GB of VRAM on consumer GPUs." Essentially you can run it on a 10GB Nvidia GeForce RTX 3080, an AMD Radeon RX 6700 or potentially something less powerful, though there's nothing here about the minimum graphics requirements. That's still contrary to a lot of AI generation models, which tend to be hosted by servers since they take several Nvidia A100 GPUs to run."

"It was trained on 4000 A100s? That's a total of 320 TB VRAM if I'm not mistaken hahahaha
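The 320 TB figure checks out if those are the 80 GB A100 variant (an assumption; A100s also come in a 40 GB version):

```python
# Sanity-check the quote: 4,000 A100s at 80 GB each (assuming the 80 GB variant)
num_gpus = 4000
vram_per_gpu_gb = 80
total_tb = num_gpus * vram_per_gpu_gb / 1000  # GB -> TB
print(total_tb)  # 320.0
```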

"Stable Diffusion is trained on Stability AI's 4,000 A100 Ezra-1 AI ultracluster, with more than 10,000 beta testers generating 1.7 million images per day in order to explore this approach.

"The core dataset for Stable Diffusion comes from the upcoming CLIP-based AI model LAION-Aesthetics, which filters the images based on how "beautiful" they are. I'm not exactly sure how beauty has been defined in this instance, however. LAION-Aesthetics selects and reworks images from LAION 5B's massive database, which was created in order to address the issue that datasets—such as the billions of image and text pairs used by Dall-E and CLIP—have not been made openly available.

"It is now available. I'm running it on a 3060 with no problem, but it has to have at least 12GB of RAM in my experience. It gets to occupy 11.4GB in total, so at the VRAM level it is demanding.

"It was confirmed to work on AMD MI200 cards but takes around 4x as long to render

"Would this work on an M1 Mac with 32gb of shared RAM? Yes, it takes 4 minutes.

"I have a 3090 with 24gb and can render at 768 (uses around 20gb) but I'd like to know if I NVLink two 3090's can I render 1024, 2048 or all the way up to 4096
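A rough back-of-envelope for the question above: if 768x768 uses about 20 GB and memory grows roughly with pixel count (a crude assumption; actual usage depends on the model and attention implementation, and NVLink does not simply pool two cards' VRAM into one big pool), the estimates come out as:

```python
# Estimate VRAM at higher resolutions from the reported ~20 GB at 768x768,
# assuming memory scales with pixel count (a crude assumption)
base_res, base_gb = 768, 20.0
for res in (1024, 2048, 4096):
    est_gb = base_gb * (res / base_res) ** 2
    print(res, round(est_gb, 1))
```

By this estimate 1024 needs roughly 36 GB, already more than two 3090s' worth for a single card's working set, and 4096 is far out of reach.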

"Already confirmed it runs on 5.1 GB" - Emad, via Twitter

GeForce RTX 3080 Laptop GPU 16,618

GeForce RTX 3070 Laptop GPU 15,274

GeForce RTX 3060 Laptop GPU 12,870

GeForce RTX 3050 Ti Laptop GPU 9,742

GeForce RTX 3050 Laptop GPU 9,042

GeForce GTX 1650 Ti 7,498

GeForce GTX 1050 (Mobile) 4,461

GeForce GTX 960M 3,411

GeForce MX150 2,300

Radeon Vega 8 1,586

TTTThis

Samplerbox for Raspberry Pi pianos

https://homspace.nl/samplerbox/SBbuild.html#Xsamples

There is a basic version of the software, and then there is an improved version someone else made.

Then you need the SalamanderGrandPiano sample files. The full set is 1.1 GB; the 'Adapted' version is about 145 MB (but you can't really get good volume differences with it).

TTTThis

Markets, Q3 2022

Dec 21, 2018: S&P low (chart)

Feb 14, 2020: S&P high (chart)

March 20, 2020: S&P low (chart)

Dec 31, 2021: S&P high (chart)

June 15, 2022: S&P low (chart)

TTTThis

Old English, Greek Dictionaries

Old English

https://www.st-andrews.ac.uk/~cr30/vocabulary/

Greek

https://en.wiktionary.org/wiki/Appendix:Ancient_Greek_words_with_English_derivatives

TTTThis