Llama-cpp-python gives me an AssertionError even though I'm using the GGUF format.
Hi, I am trying to run an AI model in Python 3.7.2 with llama-cpp-python 0.1.85. Every time I run my code I get this error:
error loading model: MapViewOfFile failed: Not enough memory resources are available to process this command.
llama_load_model_from_file: failed to load model
Traceback (most recent call last):
File "server.py", line 26, in <module>
n_ctx=N_CTX,
File "D:\AI 2\Venv\lib\site-packages\llama_cpp\llama.py", line 323, in __init__
assert self.model is not None
AssertionError
I am using the GGUF format, so I don't know what the problem is. It works fine on a second computer but not on my main machine. Any help? Thanks.
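For context, the relevant part of server.py looks roughly like the sketch below (the model path and N_CTX value are placeholders, not my exact settings):

```python
# Minimal sketch of the loading call that triggers the traceback above.
# MODEL_PATH and N_CTX are placeholder values, not my real settings.
from llama_cpp import Llama

MODEL_PATH = "models/model-q4_k_m.gguf"  # path to the GGUF file
N_CTX = 2048                             # context window size

# Llama.__init__ calls llama_load_model_from_file; when that fails
# (here because MapViewOfFile reports not enough memory), the constructor
# hits `assert self.model is not None` and raises AssertionError.
llm = Llama(
    model_path=MODEL_PATH,
    n_ctx=N_CTX,
)
```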