text-generation-inference/.dockerignore
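
# Paths excluded from the Docker build context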
aml
target
server/transformers
server/flash-attention
hf_cache/