klopstock@lemmy.specksick.com to Selfhosted@lemmy.world • "Self-Hosted AI is pretty darn cool" (edited, 5 months ago):
There is ipex-llm from Intel, which you can use with your Intel iGPU/GPU/CPU for LLMs; it also supports Ollama.
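For anyone curious, a rough setup sketch based on how the ipex-llm project describes its Ollama integration — the exact package extras, the `init-ollama` helper, and the environment variable names are assumptions from memory of Intel's docs, so double-check against the current ipex-llm documentation before running:

```shell
# Sketch only: install ipex-llm with its llama.cpp/Ollama backend
# (package extra name assumed; verify in the ipex-llm docs)
pip install --pre --upgrade "ipex-llm[cpp]"

# The project ships a helper that links an Intel-accelerated ollama
# binary into the current directory (helper name assumed)
init-ollama

# Serve models on the Intel iGPU/GPU; these env vars are the ones
# Intel's guides typically mention, but names may change between releases
export OLLAMA_NUM_GPU=999        # offload all layers to the GPU
export SYCL_CACHE_PERSISTENT=1   # cache SYCL kernels between runs
./ollama serve
```

From there it behaves like stock Ollama (`ollama run <model>` in another terminal), just with the Intel GPU doing the work.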