text-generation-inference/router/src
drbh 13dd8e2361
fix: show warning with tokenizer config parsing error (#1488)
This tiny PR just prints the parsing error when a tokenizer config fails
to load.

This is helpful when a chat_template won't load because of formatting issues, e.g.
https://github.com/huggingface/text-generation-inference/pull/1427#issuecomment-1909226388
2024-01-26 10:41:39 +01:00
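
The change being described is small. As a rough sketch (not the exact diff in main.rs; the `HubTokenizerConfig` fields and the helper function below are assumptions), the idea is to log the `serde_json` error instead of silently discarding it:

```rust
use serde::Deserialize;
use std::fs;

/// Subset of tokenizer_config.json the router cares about (fields assumed for illustration).
#[derive(Debug, Default, Deserialize)]
struct HubTokenizerConfig {
    chat_template: Option<String>,
}

/// Hypothetical helper: parse the tokenizer config, warning (rather than staying silent)
/// when the file cannot be read or the JSON cannot be deserialized.
fn load_tokenizer_config(path: &str) -> HubTokenizerConfig {
    fs::read_to_string(path)
        .map_err(|err| tracing::warn!("Unable to read tokenizer config: {err}"))
        .ok()
        .and_then(|content| {
            serde_json::from_str::<HubTokenizerConfig>(&content)
                // The point of the PR: surface the parsing error in the logs so a
                // malformed chat_template is easy to diagnose.
                .map_err(|err| tracing::warn!("Unable to parse tokenizer config: {err}"))
                .ok()
        })
        .unwrap_or_default()
}
```

With this shape, a broken chat_template produces a visible warning in the router logs instead of an unexplained fallback to the default config.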
File             Last commit                                                     Date
health.rs        Rebased #617 (#868)                                             2023-08-28 11:43:47 +02:00
infer.rs         Add a new /tokenize route to get the tokenized input (#1471)    2024-01-25 14:19:03 +01:00
lib.rs           Add a new /tokenize route to get the tokenized input (#1471)    2024-01-25 14:19:03 +01:00
main.rs          fix: show warning with tokenizer config parsing error (#1488)   2024-01-26 10:41:39 +01:00
queue.rs         Speculative (#1308)                                             2023-12-11 12:46:30 +01:00
server.rs        Add a new /tokenize route to get the tokenized input (#1471)    2024-01-25 14:19:03 +01:00
validation.rs    Add a new /tokenize route to get the tokenized input (#1471)    2024-01-25 14:19:03 +01:00
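
Several of the entries above reference the new /tokenize route (#1471), which returns the tokenized form of an input string. A minimal client sketch follows (the router address, port, and JSON field names are assumptions, not taken from the listing above):

```rust
use serde_json::json;

// Hypothetical client: POST the prompt to the router's /tokenize route and
// pretty-print whatever JSON comes back. Requires the `tokio`, `reqwest`
// (with the "json" feature), and `serde_json` crates.
#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    let client = reqwest::Client::new();
    let response = client
        .post("http://localhost:3000/tokenize")
        .json(&json!({ "inputs": "What is Deep Learning?" }))
        .send()
        .await?;

    let tokens: serde_json::Value = response.json().await?;
    println!("{tokens:#}");
    Ok(())
}
```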