I’m on GNU/Linux and I have a second graphics card, an Nvidia, that isn’t plugged into anything other than my motherboard. Can I still use it for CUDA-based AI stuff?
I’ve been using CUDA for cracking hashes without issue. Now if someone at Nvidia could work on their driver not sucking for daily driving on Linux, that would be great.
Nvidia: 😂
AMD is boss
Like not plugged into power, or not plugged into a display?
Not plugged into a display, still plugged into the motherboard.
I can’t speak to compatibility with AMD, but having a display connected isn’t a requirement.
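If you want to sanity-check that, here’s a minimal sketch (assuming the CUDA build of pytorch is installed) that lists every card pytorch can see, monitor or not:

```python
# Minimal check that a card with no monitor attached is still visible to
# PyTorch (assumes the CUDA build of torch is installed).
import torch

print("CUDA available:", torch.cuda.is_available())
for i in range(torch.cuda.device_count()):
    print(i, torch.cuda.get_device_name(i))   # headless cards show up here too
```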
ah perfect!
I also wanted to toy around with this new replacement for pytorch. This is great news.
I’m sorry, I misremembered everything. Seems like my Nvidia is hooked up to my screens; it’s actually the AMD that’s plugged into the motherboard while not connected to any screens. Can I use the AMD card for AI stuff as well? I was trying to use pytorch for now.
AMD’s compute stack is called ROCm.
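One thing worth knowing: the ROCm builds of pytorch reuse the torch.cuda API, so the same checks work unchanged on an AMD card. A rough sketch, assuming the ROCm wheel of torch is installed:

```python
# With the ROCm build of PyTorch, AMD GPUs show up through the torch.cuda
# namespace, so the usual checks work unchanged (ROCm wheel of torch assumed).
import torch

if torch.cuda.is_available():                  # True on a working ROCm install too
    print(torch.cuda.get_device_name(0))       # prints the AMD card's name
    y = torch.ones(4, 4, device="cuda") * 2    # actually runs on the AMD GPU via HIP
    print(y.mean().item())
else:
    print("no GPU visible to torch")
```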
thank you kindly sir!
Actually, I got an Nvidia card working with Easy Diffusion on Debian. The barrier to getting a text chat AI working with GPU acceleration is really that I don’t have the patience to deal with all that Python venv nonsense, so I use llama.cpp. It’s written in C++, which means no Python dependencies to fuck you with, at the cost of slower CPU-only generation.
Easy Diffusion just happens to be simple enough that I could actually figure out how to get it working (it’s in Python and needs a virtual environment), but it’s a different story for the text AIs.
If you actually had the patience and knowledge to deal with all the Python issues, and/or a distro that makes it easy (different distros deal with pip differently), I don’t doubt you could get Nvidia GPU acceleration working on some text chat AI.
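In case it helps anyone, the venv creation itself is just the standard library; a minimal sketch (the directory name is made up):

```python
# Creating an isolated environment with nothing but the standard library
# (the directory name is just an example). Afterwards you'd use
# ./ai-env/bin/pip and ./ai-env/bin/python instead of the system ones.
import venv

venv.create("ai-env", with_pip=True)
```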
I’ve been doing it on my laptop with a 4 GB Nvidia GPU. If you’re using the webui, there are optimization parameters like xformers and, I think, --opt-sdp-attention or something, that use about half as much memory. I had to update the graphics driver to get it working first.
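If I remember right, that sdp flag just switches the webui over to pytorch’s built-in scaled dot product attention, which can pick a memory-efficient kernel. Roughly this under the hood (a sketch with made-up shapes, needs torch >= 2.0):

```python
# Rough idea of what the sdp optimization uses under the hood: PyTorch 2's
# built-in scaled dot product attention, which can choose a memory-efficient
# kernel instead of materialising the full attention matrix.
# (Shapes are made up; requires torch >= 2.0 and a CUDA device for fp16.)
import torch
import torch.nn.functional as F

q = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 8, 1024, 64, device="cuda", dtype=torch.float16)

out = F.scaled_dot_product_attention(q, k, v)  # backend chosen automatically
print(out.shape)                               # same shape as q
```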
Yeah, for sure. I use my GTX 1070 for CUDA stuff all the time.
Yup, start by running
nvidia-smi
to get the details on your Nvidia. Then install pytorch. The rest is up to you!

Yes, just rotate it and connect it to the CPU bearings.
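To add to the nvidia-smi suggestion: once pytorch is installed, a quick way to check the card is actually doing compute (just a sketch, assuming the CUDA build of torch):

```python
# Quick smoke test after installing the CUDA build of PyTorch:
# run a small matmul on the GPU and make sure nothing blows up.
import torch

assert torch.cuda.is_available(), "driver/CUDA runtime not visible to torch"
dev = torch.device("cuda")
a = torch.randn(512, 512, device=dev)
b = torch.randn(512, 512, device=dev)
c = a @ b                      # runs on the GPU
torch.cuda.synchronize()       # wait for the kernel to actually finish
print("ok:", c.shape, "on", torch.cuda.get_device_name(dev))
```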