To install with OpenBLAS, set `LLAMA_BLAS` and `LLAMA_BLAS_VENDOR` in the `CMAKE_ARGS` environment variable before installing:

```bash
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" FORCE_CMAKE=1 pip install llama-cpp-python
```

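OpenBLAS headers and libraries must be available at build time for the flag to have any effect. Package names vary by distribution; on Debian/Ubuntu-style systems the development package is typically `libopenblas-dev` (shown here only as an example, not part of the project's own instructions):

```bash
# Debian/Ubuntu example; substitute your distribution's OpenBLAS development package
sudo apt-get install libopenblas-dev
```
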
To install with cuBLAS, set `LLAMA_CUBLAS=on` in the `CMAKE_ARGS` environment variable before installing:

```bash
CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python
```

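If a CPU-only build of `llama-cpp-python` is already present, `pip` may keep the existing wheel instead of rebuilding with the new flags. A sketch of forcing a source rebuild using standard pip options:

```bash
# Rebuild from source, ignoring any previously installed or cached wheel
CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python --upgrade --force-reinstall --no-cache-dir
```
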
To install with CLBlast, set `LLAMA_CLBLAST=on` in the `CMAKE_ARGS` environment variable before installing:

```bash
CMAKE_ARGS="-DLLAMA_CLBLAST=on" FORCE_CMAKE=1 pip install llama-cpp-python
```

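The CLBlast build additionally needs an OpenCL loader and the CLBlast development files on the system. Package names again vary by distribution; on Debian/Ubuntu-style systems they are typically the following (an illustrative example only):

```bash
# Debian/Ubuntu example; other distributions package OpenCL and CLBlast under different names
sudo apt-get install ocl-icd-opencl-dev libclblast-dev
```
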
To install with Metal (MPS), set `LLAMA_METAL=on` in the `CMAKE_ARGS` environment variable before installing:

```bash
CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 pip install llama-cpp-python
```

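The Metal build is only useful from a native arm64 Python on Apple Silicon; under an Intel/Rosetta interpreter the install falls back to the slower x86 llama.cpp build mentioned above. A quick way to check which interpreter is active:

```bash
# Should print "arm64" on Apple Silicon; "x86_64" indicates an Intel/Rosetta Python
python -c "import platform; print(platform.machine())"
```
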
To install with hipBLAS / ROCm support for AMD cards, set `LLAMA_HIPBLAS=on` in the `CMAKE_ARGS` environment variable before installing:

```bash
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python
```

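Whichever GPU backend is built, layers are only offloaded if you request it when loading a model. A minimal sketch of a sanity check with the high-level API (the model path and layer count are placeholders for whatever local model you have; watch the startup log for the backend being initialized):

```bash
# Placeholder model path; any local model file supported by your llama.cpp build will do
python -c 'from llama_cpp import Llama; llm = Llama(model_path="./model.gguf", n_gpu_layers=32); print(llm("Q: 2+2= A:", max_tokens=4)["choices"][0]["text"])'
```
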
#### Windows remarks