From 508d47f80ffa43ce0e7d097af0778bbc9c500300 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Adrien=20Gallou=C3=ABt?=
Date: Fri, 7 Feb 2025 12:12:13 +0000
Subject: [PATCH] Add README.md
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Signed-off-by: Adrien Gallouët
---
 backends/llamacpp/README.md | 24 ++++++++++++++++++++++++
 1 file changed, 24 insertions(+)
 create mode 100644 backends/llamacpp/README.md

diff --git a/backends/llamacpp/README.md b/backends/llamacpp/README.md
new file mode 100644
index 00000000..0971efc5
--- /dev/null
+++ b/backends/llamacpp/README.md
@@ -0,0 +1,24 @@
+# Llamacpp backend
+
+If all your dependencies are installed at the system level, running
+cargo build should be sufficient. However, if you want to experiment
+with different versions of llama.cpp, some additional setup is required.
+
+## Install llama.cpp
+
+    LLAMACPP_PREFIX=$(pwd)/llama.cpp.out
+
+    git clone https://github.com/ggerganov/llama.cpp
+    cd llama.cpp
+    cmake -B build \
+        -DCMAKE_INSTALL_PREFIX="$LLAMACPP_PREFIX" \
+        -DLLAMA_BUILD_COMMON=OFF \
+        -DLLAMA_BUILD_TESTS=OFF \
+        -DLLAMA_BUILD_EXAMPLES=OFF \
+        -DLLAMA_BUILD_SERVER=OFF
+    cmake --build build --config Release -j
+    cmake --install build
+
+## Build TGI
+
+    PKG_CONFIG_PATH="$LLAMACPP_PREFIX/lib/pkgconfig" cargo build