text-generation-inference/docs/source
Latest commit: 849bd93dc3 "Using an enum for flash backens (paged/flashdecoding/flashinfer)" by Nicolas Patry, 2024-09-25 06:04:51 +00:00

| Name | Last commit message | Last commit date |
| --- | --- | --- |
| basic_tutorials/ | Update Quantization docs and minor doc fix. () | 2024-09-25 06:01:59 +00:00 |
| conceptual/ | Using an enum for flash backens (paged/flashdecoding/flashinfer) () | 2024-09-25 06:04:51 +00:00 |
| _toctree.yml | add usage stats to toctree () | 2024-09-25 05:27:40 +00:00 |
| architecture.md | add doc for intel gpus () | 2024-09-25 05:21:34 +00:00 |
| index.md | fix typos in docs and add small clarifications () | 2024-06-10 09:24:52 +03:00 |
| installation_amd.md | Preparing for release. () | 2024-09-25 05:38:48 +00:00 |
| installation_gaudi.md | MI300 compatibility () | 2024-07-17 05:36:58 +00:00 |
| installation_inferentia.md | MI300 compatibility () | 2024-07-17 05:36:58 +00:00 |
| installation_intel.md | Preparing for release. () | 2024-09-25 05:38:48 +00:00 |
| installation_nvidia.md | Preparing for release. () | 2024-09-25 05:38:48 +00:00 |
| installation.md | MI300 compatibility () | 2024-07-17 05:36:58 +00:00 |
| messages_api.md | chore: add pre-commit () | 2024-04-24 15:32:02 +03:00 |
| quicktour.md | Update documentation for Supported models () | 2024-09-25 06:04:51 +00:00 |
| supported_models.md | Update documentation for Supported models () | 2024-09-25 06:04:51 +00:00 |
| usage_statistics.md | refactor usage stats () | 2024-09-25 05:55:39 +00:00 |