Error occurred while running Ollama on Windows Server 2022 #24

Open
shangguan0755 opened this issue Feb 28, 2025 · 0 comments
Labels: Investigation (Investigate user's questions)

Comments

@shangguan0755

The error log is as follows:

C:\Users\Administrator>llm_benchmark run
Total memory size : 383.87 GB
cpu_info: AMD EPYC 7713 64-Core Processor
gpu_info: NVIDIA Tesla T4 Microsoft Remote Display Adapter Microsoft Basic Display Adapter NVIDIA Tesla T4
os_version: Microsoft Windows Server 2022 Standard Evaluation

ollama_version: 0.5.12

LLM models file path:C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\llm_benchmark\data\benchmark_models_32gb_ram.yml
Checking and pulling the following LLM models
phi4:14b
Traceback (most recent call last):
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpx\_transports\default.py", line 72, in map_httpcore_exceptions
    yield
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpx\_transports\default.py", line 236, in handle_request
    resp = self._pool.handle_request(req)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpcore\_sync\connection_pool.py", line 256, in handle_request
    raise exc from None
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpcore\_sync\connection_pool.py", line 236, in handle_request
    response = connection.handle_request(
        pool_request.request
    )
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpcore\_sync\connection.py", line 101, in handle_request
    raise exc
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpcore\_sync\connection.py", line 78, in handle_request
    stream = self._connect(request)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpcore\_sync\connection.py", line 124, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpcore\_backends\sync.py", line 207, in connect_tcp
    with map_exceptions(exc_map):
         ~~~~~~~~~~~~~~^^^^^^^^^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\contextlib.py", line 162, in __exit__
    self.gen.throw(value)
    ~~~~~~~~~~~~~~^^^^^^^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [WinError 10049] The requested address is not valid in its context.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Scripts\llm_benchmark.exe\__main__.py", line 7, in <module>
    sys.exit(app())
             ~~~^^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\llm_benchmark\main.py", line 52, in run
    check_models.pull_models(models_file_path)
    ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\llm_benchmark\check_models.py", line 38, in pull_models
    ollama.pull(model_name)
    ~~~~~~~~~~~^^^^^^^^^^^^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\ollama\_client.py", line 223, in pull
    return self._request_stream(
           ~~~~~~~~~~~~~~~~~~~~^
        'POST',
        ^^^^^^^
    ...<6 lines>...
        stream=stream,
        ^^^^^^^^^^^^^^
    )
    ^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\ollama\_client.py", line 98, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\ollama\_client.py", line 69, in _request
    response = self._client.request(method, url, **kwargs)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpx\_client.py", line 837, in request
    return self.send(request, auth=auth, follow_redirects=follow_redirects)
           ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpx\_client.py", line 926, in send
    response = self._send_handling_auth(
        request,
    ...<2 lines>...
        history=[],
    )
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpx\_client.py", line 954, in _send_handling_auth
    response = self._send_handling_redirects(
        request,
        follow_redirects=follow_redirects,
        history=history,
    )
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpx\_client.py", line 991, in _send_handling_redirects
    response = self._send_single_request(request)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpx\_client.py", line 1027, in _send_single_request
    response = transport.handle_request(request)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpx\_transports\default.py", line 235, in handle_request
    with map_httpcore_exceptions():
         ~~~~~~~~~~~~~~~~~~~~~~~^^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\contextlib.py", line 162, in __exit__
    self.gen.throw(value)
    ~~~~~~~~~~~~~~^^^^^^^
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\httpx\_transports\default.py", line 89, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [WinError 10049] The requested address is not valid in its context.
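
A note on the failure mode: WinError 10049 ("The requested address is not valid in its context") is raised when a socket tries to connect to an address that cannot be used as a destination, most commonly 0.0.0.0. The ollama Python client builds its default host from the OLLAMA_HOST environment variable, so if that variable is set to the server's bind address (e.g. 0.0.0.0:11434), the pull step in check_models.pull_models() will try to dial 0.0.0.0 and fail exactly like this. Below is a minimal diagnostic sketch, not part of the original report, that exercises the same ollama.pull() path outside llm_benchmark; the 127.0.0.1 host is an assumed example and should be replaced with wherever the Ollama server actually listens.

```python
# Minimal repro of the connection step the traceback goes through.
# Assumes the standard ollama-python client; host value is an example.
import os

import ollama

# If this prints 0.0.0.0 (or an unreachable address), it would explain WinError 10049.
print("OLLAMA_HOST =", os.environ.get("OLLAMA_HOST"))

# Point the client at a concrete, reachable address instead of the environment default.
client = ollama.Client(host="http://127.0.0.1:11434")
print(client.list())      # fails fast if the server is unreachable
client.pull("phi4:14b")   # same call the benchmark makes
```

If the direct 127.0.0.1 client works, setting OLLAMA_HOST=http://127.0.0.1:11434 in the same shell before running `llm_benchmark run` should let the unmodified tool connect as well.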

@chuangtc added the Investigation label on Feb 28, 2025