Describe the bug: I run ollama in intel-llm and use Continue to connect to ollama, asking questions in Continue that call the local ollama model. After I tried several times, ...
Describe the bug: Hi, I upgraded to the latest image (intelanalytics/ipex-llm-inference-cpp-xpu:2.3.0-SNAPSHOT, which for now is also the latest one) and noticed that deepseek r1 outputs meaningless ...