petsoi@discuss.tchncs.de to Linux@lemmy.ml · 9 months ago
Developer Explains Why Explicit Sync Will Finally Solve the NVIDIA/Wayland Issues (9to5linux.com)
42 comments
morrowind@lemmy.ml · 9 months ago
Good. This is the better overall solution.
RandomLegend [He/Him]@lemmy.dbzer0.com · 9 months ago
Well, I dearly miss CUDA, as I can't get ZLUDA to work properly with Stable Diffusion, and FSR is still leagues behind DLSS… but yeah, overall I am very happy.
Possibly linux@lemmy.zip · 9 months ago
You can run Ollama with AMD acceleration.
RandomLegend [He/Him]@lemmy.dbzer0.com · 9 months ago
Yes, I know, but CUDA is faster. Ollama is for LLMs; Stable Diffusion is for images.
Possibly linux@lemmy.zip · 9 months ago
I'm aware. I wanted to point out that AMD isn't totally useless for AI.
RandomLegend [He/Him]@lemmy.dbzer0.com · 9 months ago
Oh, it definitely isn't. Everything I need does run, and I finally don't run out of VRAM so easily 😅
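For reference, the AMD route mentioned above can be tried via Ollama's ROCm container image. A minimal sketch, assuming a Linux host with Docker and an AMD GPU (the image tag and device passthrough follow Ollama's published Docker instructions; the `HSA_OVERRIDE_GFX_VERSION` variable is a commonly cited workaround, not an official setting, and the value shown is only an example for some RDNA2 consumer cards):

```shell
# Run Ollama with AMD GPU acceleration using the ROCm image variant.
# The ROCm device nodes (/dev/kfd and /dev/dri) must be passed through.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm

# For cards ROCm does not officially support, forcing a nearby GFX target
# sometimes helps; the right value depends on your specific GPU:
#   -e HSA_OVERRIDE_GFX_VERSION=10.3.0
```

Whether this performs well depends on the card and ROCm version; it just demonstrates that AMD acceleration is a supported path, as the commenter says.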