Instantly crashing when trying to run with Ollama #12925

Open
westcoastdevv opened this issue Mar 4, 2025 · 2 comments
@westcoastdevv

I am running IPEX-LLM in a Docker container, and for about a month now, as soon as I input a prompt, Ollama immediately returns an internal service error. I have looked through the logs but can't find anything useful that would point to a cause. Here are the logs: https://pastebin.com/pjuizqYN

And here are the environment variables I have set:

```
ONEAPI_DEVICE_SELECTOR=level_zero:0;level_zero:1
IPEX_LLM_NUM_CTX=16384
OLLAMA_PARALLEL=1
OLLAMA_MAX_LOADED_MODELS=0
OLLAMA_DEBUG=1
```
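
For reference, here is a minimal sketch of how these get passed to the container (the image name matches the mattcurf/ollama-intel-gpu container I mention below; the device and port flags are assumptions, adjust to your setup):

```bash
docker run -d --name ollama-intel-gpu \
  --device /dev/dri \
  -e ONEAPI_DEVICE_SELECTOR="level_zero:0;level_zero:1" \
  -e IPEX_LLM_NUM_CTX=16384 \
  -e OLLAMA_PARALLEL=1 \
  -e OLLAMA_MAX_LOADED_MODELS=0 \
  -e OLLAMA_DEBUG=1 \
  -p 11434:11434 \
  mattcurf/ollama-intel-gpu
```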

I am running it on two Arc A770s, a 9900K, and 32 GB of RAM.

And here are the versions I'm running on Ubuntu 24.04:

ollama-0.5.4-ipex-llm-2.2.0b20250220-ubuntu.tgz
https://github.com/oneapi-src/level-zero/releases/download/v1.19.2/level-zero_1.19.2+u24.04_amd64.deb
https://github.com/intel/intel-graphics-compiler/releases/download/v2.5.6/intel-igc-core-2_2.5.6+18417_amd64.deb
https://github.com/intel/intel-graphics-compiler/releases/download/v2.5.6/intel-igc-opencl-2_2.5.6+18417_amd64.deb
https://github.com/intel/compute-runtime/releases/download/24.52.32224.5/intel-level-zero-gpu_1.6.32224.5_amd64.deb
https://github.com/intel/compute-runtime/releases/download/24.52.32224.5/intel-opencl-icd_24.52.32224.5_amd64.deb
https://github.com/intel/compute-runtime/releases/download/24.52.32224.5/libigdgmm12_22.5.5_amd64.deb
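
In case it helps reproduce, these were installed roughly as follows (a sketch; the filenames come straight from the URLs above):

```bash
# Download the Level Zero loader, IGC, and compute-runtime packages listed above,
# then install them together so dpkg can resolve their inter-package dependencies.
wget https://github.com/oneapi-src/level-zero/releases/download/v1.19.2/level-zero_1.19.2+u24.04_amd64.deb \
     https://github.com/intel/intel-graphics-compiler/releases/download/v2.5.6/intel-igc-core-2_2.5.6+18417_amd64.deb \
     https://github.com/intel/intel-graphics-compiler/releases/download/v2.5.6/intel-igc-opencl-2_2.5.6+18417_amd64.deb \
     https://github.com/intel/compute-runtime/releases/download/24.52.32224.5/intel-level-zero-gpu_1.6.32224.5_amd64.deb \
     https://github.com/intel/compute-runtime/releases/download/24.52.32224.5/intel-opencl-icd_24.52.32224.5_amd64.deb \
     https://github.com/intel/compute-runtime/releases/download/24.52.32224.5/libigdgmm12_22.5.5_amd64.deb
sudo dpkg -i ./*.deb
```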

Any help is greatly appreciated.

@qiuxin2012
Contributor

The error message looks like #12654, but that issue is still unresolved.
Are you running on the host or in the container?

@westcoastdevv
Author

This is in the mattcurf/ollama-intel-gpu Docker container. I have also gotten the same result running the portable zip version directly on the host.
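
On the host, I launch it roughly like this (a sketch; the extracted directory name and start script follow the IPEX-LLM portable packaging, so adjust if yours differ):

```bash
# Extract the portable bundle and start the bundled Ollama server.
tar -xzf ollama-0.5.4-ipex-llm-2.2.0b20250220-ubuntu.tgz
cd ollama-0.5.4-ipex-llm-2.2.0b20250220-ubuntu   # directory name assumed from the tgz name
./start-ollama.sh                                # start script shipped with the IPEX-LLM portable zip
```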
