text-generation-inference/integration-tests/models/__snapshots__/test_completion_prompts
- test_flash_llama_completion_many_prompts_stream.json: Fixing the batching tokenization in flash causal lm. (2024-08-28 10:34:10 +02:00)
- test_flash_llama_completion_many_prompts.json: fix: adjust test snapshots and small refactors (#2323) (2024-07-29 11:38:38 -04:00)
- test_flash_llama_completion_single_prompt.json: v2.0.1 (2024-04-18 17:20:36 +02:00)