Mirror of https://github.com/huggingface/text-generation-inference.git (synced 2025-05-05 17:22:06 +00:00)
Fix missing trust_remote_code flag for AutoTokenizer in utils.peft (#1270)
The PEFT loading function was missing the `trust_remote_code=trust_remote_code` argument, causing custom tokenizer code to not be found.

## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

@Narsil
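For context, a minimal sketch of why the flag matters: `AutoTokenizer.from_pretrained` only downloads and executes tokenizer code hosted in the model repository when `trust_remote_code=True` is passed. The model id below is a placeholder, not a repo referenced by this PR.

```python
from transformers import AutoTokenizer

# Placeholder id for a repo that ships its own tokenizer implementation
# (a Python file referenced via auto_map in tokenizer_config.json).
base_model_id = "some-org/model-with-custom-tokenizer"

# Without trust_remote_code=True, Transformers refuses to execute the
# repo-hosted tokenizer code, so the custom tokenizer class is never found.
tokenizer = AutoTokenizer.from_pretrained(base_model_id, trust_remote_code=True)
```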
This commit is contained in:
parent b226e469c9
commit 91111a0dc2
@@ -38,7 +38,7 @@ def download_and_unload_peft(model_id, revision, trust_remote_code):
     os.makedirs(model_id, exist_ok=True)
     cache_dir = model_id
     logger.info(f"Saving the newly created merged model to {cache_dir}")
-    tokenizer = AutoTokenizer.from_pretrained(base_model_id)
+    tokenizer = AutoTokenizer.from_pretrained(base_model_id, trust_remote_code=trust_remote_code)
     model.save_pretrained(cache_dir, safe_serialization=True)
     model.config.save_pretrained(cache_dir)
     tokenizer.save_pretrained(cache_dir)
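To put the hunk in context, here is a condensed sketch of what the surrounding function might look like after the fix. The adapter loading and merging lines are assumptions reconstructed from typical PEFT usage, not taken from this repository; only the save and tokenizer lines mirror the diff above.

```python
import os

from loguru import logger
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer


def download_and_unload_peft(model_id, revision, trust_remote_code):
    # Assumed loading step: fetch the adapter plus its base model, then merge
    # the adapter weights into the base weights so a plain model can be saved.
    model = AutoPeftModelForCausalLM.from_pretrained(
        model_id,
        revision=revision,
        trust_remote_code=trust_remote_code,
    )
    base_model_id = model.peft_config["default"].base_model_name_or_path
    model = model.merge_and_unload()

    os.makedirs(model_id, exist_ok=True)
    cache_dir = model_id
    logger.info(f"Saving the newly created merged model to {cache_dir}")

    # The fix: propagate trust_remote_code so repo-hosted tokenizer code is found.
    tokenizer = AutoTokenizer.from_pretrained(
        base_model_id, trust_remote_code=trust_remote_code
    )

    model.save_pretrained(cache_dir, safe_serialization=True)
    model.config.save_pretrained(cache_dir)
    tokenizer.save_pretrained(cache_dir)
```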