Directory: text-generation-inference/integration-tests/models/__snapshots__/test_flash_llama

Latest commit: 5a58226130 by OlivierDehaene
fix(server): fix decode token (#334)
Fixes #333

Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
2023-05-16 23:23:27 +02:00
test_flash_llama_all_params.json  feat(integration-tests): improve comparison and health checks (#336)  2023-05-16 20:22:11 +02:00
test_flash_llama_load.json        fix(server): fix decode token (#334)                                   2023-05-16 23:23:27 +02:00
test_flash_llama.json             feat(integration-tests): improve comparison and health checks (#336)  2023-05-16 20:22:11 +02:00