text-generation-inference/.github/workflows
Latest commit: c8c7ccd31e by Daniël de Kok, 2024-06-17 16:40:44 +02:00
Set maximum grpc message receive size to 2GiB (#2075)

* Set maximum grpc message receive size to 2GiB

  The previous default was 4 MiB, which is too small for the payloads of
  multi-modal models.

* Update to Rust 1.79.0

* Fix up formatting to make the PR pass
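For context, the limit in question is an ordinary gRPC per-message size option. The sketch below is illustrative only, not the diff from #2075: it shows how a Python gRPC (grpc.aio) server can raise the default 4 MiB receive limit to just under 2 GiB (the size options are 32-bit signed integers, so 2**31 - 1 is the ceiling). The bind address is a made-up placeholder.

```python
import asyncio

from grpc import aio

# One byte under 2 GiB: the gRPC size options are 32-bit signed integers,
# so this is the largest limit they can express.
MAX_MESSAGE_SIZE = 2 * 1024 * 1024 * 1024 - 1


async def serve() -> None:
    # The gRPC default receive limit is 4 MiB; multi-modal payloads such as
    # encoded images easily exceed that, so raise both directions to ~2 GiB.
    server = aio.server(
        options=[
            ("grpc.max_receive_message_length", MAX_MESSAGE_SIZE),
            ("grpc.max_send_message_length", MAX_MESSAGE_SIZE),
        ]
    )
    # Placeholder address; a real server would also register its service
    # handlers before starting.
    server.add_insecure_port("localhost:50051")
    await server.start()
    await server.wait_for_termination()


if __name__ == "__main__":
    asyncio.run(serve())
```

On the Rust client side, tonic-generated clients expose an analogous per-client `max_decoding_message_size` setting.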
File                         Last commit                                                    Last commit date
autodocs.yml                 Creating doc automatically for supported models. (#1929)       2024-05-22 16:22:57 +02:00
build_documentation.yml      fix: remove useless token (#1179)                              2023-10-19 14:04:44 +02:00
build_pr_documentation.yml   chore: add pre-commit (#1569)                                  2024-02-16 11:58:58 +01:00
build.yaml                   server: use chunked inputs                                     2024-06-07 08:09:04 +02:00
client-tests.yaml            feat: include token in client test like server tests (#1932)   2024-05-22 09:58:26 +02:00
load_test.yaml               chore: migrate ci region for more availability. (#581)         2023-07-12 10:01:01 +02:00
stale.yml                    Add a stale bot. (#1313)                                       2023-12-05 14:42:55 +01:00
tests.yaml                   Set maximum grpc message receive size to 2GiB (#2075)          2024-06-17 16:40:44 +02:00
trufflehog.yml               Support chat response format (#2046)                           2024-06-11 10:44:56 -04:00
upload_pr_documentation.yml  chore: add pre-commit (#1569)                                  2024-02-16 11:58:58 +01:00