And tape the outlet anyway because the painters are hooligans -electrician
That cat is gonna nut.
*checks notes* Shareholders?!
Now you need ext to compensate
Meanwhile, in the engineering dungeon
ASRock has done me well for budget builds; ASUS is what I happened to upgrade to for midrange. Honestly, I'm being dramatic, I just haven't cared for Gigabyte historically.
Good detective work! Adding liquefying thermal pads as a reason to avoid Gigabyte.
The pattern says liquid but the colors say heat damage. Both?
Cinnamon, GNOME, Xfce? Many flavors of Mint.
So many In-N-Out haters.
Ah, thanks. In my experience it's my AMD card causing the crashes with SD. NVIDIA is native to CUDA, hence the stability.
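For what it's worth, a quick sanity check of which backend a given venv's torch build actually targets (just a sketch, assuming PyTorch is installed in that venv):

```bash
# Run inside whichever venv the webui uses; prints the CUDA and ROCm/HIP
# versions the installed torch wheel was built against (None if absent)
# and whether torch can actually see a GPU device.
python -c "import torch; print('cuda:', torch.version.cuda, 'hip:', torch.version.hip, 'gpu visible:', torch.cuda.is_available())"
```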
What a treat! I just got done setting up a second venv within the SD folder: one called amd-venv, the other nvidia-venv. I copied the webui.sh and webui-user.sh scripts and made separate flavors of those as well, pointing each at its respective venv. Now if I just had my NVIDIA drivers working, I could probably set my power supply on fire running them in parallel.
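Roughly what that looks like, for anyone following along. This is only a sketch, assuming the stock AUTOMATIC1111 layout where webui-user.sh sets venv_dir; the copied script names and the wheel index URLs are just examples, so adjust them to your ROCm/CUDA versions:

```bash
# From the stable-diffusion-webui folder: one venv per GPU stack.
python3 -m venv amd-venv
python3 -m venv nvidia-venv

# One launcher flavor per venv (the copied names here are arbitrary).
cp webui-user.sh webui-user-amd.sh      # inside, set: venv_dir="amd-venv"
cp webui-user.sh webui-user-nvidia.sh   # inside, set: venv_dir="nvidia-venv"

# Install the matching torch build into each venv (ROCm wheel vs CUDA wheel).
./amd-venv/bin/pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm6.0
./nvidia-venv/bin/pip install torch torchvision --index-url https://download.pytorch.org/whl/cu121
```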
I had that concern as well, with it being a new card. It performs fine in gaming as well as in every glmark benchmark so far, so I've chalked it up to AMD support being in experimental status on Linux/SD. Any other stress tests you recommend while I'm in the return window!? lol
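One cheap thing to try while it's still returnable: loop a benchmark and watch temperatures at the same time. Sketch only; it assumes glmark2 is installed and that rocm-smi came with your ROCm install (lm-sensors is a generic fallback):

```bash
# Terminal 1: keep the GPU loaded by looping a benchmark.
while true; do glmark2; done

# Terminal 2: watch temperature/clock/power readouts every 2 seconds.
watch -n 2 rocm-smi
# or, generically (needs lm-sensors):
# watch -n 2 sensors
```

If temps stay sane and nothing artifacts or crashes after a long loop, that's about as much as a synthetic stress test will tell you.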
Thank you!! I may rely on this heavily. Too many different drivers to try willy-nilly. I am attempting this guide/driver for now; will report back with my luck or misfortunes: https://hub.tcno.co/ai/stable-diffusion/automatic1111-fast/
version for whatever reason. Does anyone know the current best nvidia driver for
I might take the Docker route for ease of troubleshooting, if nothing else. I'm so very sick of hard system freezes/crashes while kludging through the troubleshooting process. Any words of wisdom?
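If you do go Docker on the AMD side, the usual starting point is one of the ROCm images with the GPU devices passed through. Sketch only, assuming the host amdgpu/ROCm driver already works; the image tag and mount path are just examples:

```bash
# Interactive ROCm PyTorch container with the GPU passed through.
docker run -it \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video \
  --ipc=host --shm-size 8G \
  -v "$PWD/stable-diffusion-webui:/workspace/stable-diffusion-webui" \
  rocm/pytorch:latest bash

# If the container wedges, the host stays up and you just re-run it,
# which is most of the troubleshooting win over a bare-metal install.
```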
Since only one of us is feeling helpful, here is a 6 minute video for the rest of us to enjoy https://www.youtube.com/watch?v=lRBsmnBE9ZA
I started reading into the ONNX business here: https://rocm.blogs.amd.com/artificial-intelligence/stable-diffusion-onnx-runtime/README.html It didn't take long to see that was beyond me. Has anyone distilled an easy-to-use model converter/conversion process? One I saw required an HF token for the process, yeesh.
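The closest thing I've found to an easy converter is Hugging Face's optimum exporter. Sketch below; the model id is just an example of a public diffusers-format repo (a local model folder should also work), and no HF token should be needed for public models:

```bash
# Install the exporter tooling into your venv.
pip install "optimum[onnxruntime]"

# Export a diffusers-format Stable Diffusion repo (or local folder) to ONNX.
optimum-cli export onnx --model stabilityai/stable-diffusion-2-1 sd21_onnx/
```

The output folder should end up with the UNet, VAE, and text encoder as .onnx files that the ONNX runtime pipelines can load.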
All right I’ll bite. I crave more information.