# Contribute
Arey is under active development. We're building this app under a permissive license so it can be used anywhere, in any shape or form. We gladly accept contributions from the community.

Please create an issue to share your feedback, feature requests, or bug reports.
Thank you ❤️
## Development notes
```sh
# Install arey locally in editable mode.
> pip install -e .
> pip install -e '.[test]' # optional, if you wish to run tests

# Install with the samples dependency if you wish to run the samples.
> pip install -e '.[samples]'
```
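With the test extra installed, you can run the test suite before sending a change. This sketch assumes the project's tests are pytest-based; adjust if the repository uses a different runner:

```sh
# Run the test suite from the repository root (assumes pytest is the test runner).
> python -m pytest
```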
### CUDA support
If you have a GPU, use the following installation instead so that llama-cpp-python is rebuilt with CUDA support.

```sh
> pip uninstall llama-cpp-python
> CMAKE_ARGS="-DGGML_CUDA=on" FORCE_CMAKE=1 pip install llama-cpp-python --force-reinstall
```
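To confirm the rebuilt wheel can actually offload to the GPU, you can query llama-cpp-python's low-level binding. This assumes a llama-cpp-python version that exposes `llama_supports_gpu_offload`:

```sh
# Prints True when the installed build was compiled with GPU offload support.
> python -c "from llama_cpp import llama_supports_gpu_offload; print(llama_supports_gpu_offload())"
```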
### CPU with BLAS library
With OpenBLAS, model loading time is much smaller and inference is about 4-5% faster. Here's how to install llama-cpp-python with OpenBLAS support:
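A sketch of the reinstall, assuming the same GGML-style CMake flags as the CUDA build above; the exact flag names can vary between llama-cpp-python versions, so check its installation docs if the build fails:

```sh
> pip uninstall llama-cpp-python
> CMAKE_ARGS="-DGGML_BLAS=ON -DGGML_BLAS_VENDOR=OpenBLAS" FORCE_CMAKE=1 pip install llama-cpp-python --force-reinstall
```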