Yes, but doesn't the tokenizer change mean that?
You can train a tokenizer on old data just like you can train a model on old data.
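To make that concrete, here's a toy sketch of what "training a tokenizer" means: learning BPE-style merges from a corpus by repeatedly merging the most frequent adjacent symbol pair. Real tokenizers (SentencePiece, HF tokenizers) do this at scale, but the core loop is the same idea; all names below are illustrative, not any library's API.

```python
from collections import Counter

def train_bpe_merges(corpus, num_merges):
    """Learn BPE-style merges from a list of words (toy sketch)."""
    # Represent each word as a tuple of symbols, weighted by frequency.
    vocab = Counter()
    for word in corpus:
        vocab[tuple(word) + ("</w>",)] += 1

    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs across the weighted vocab.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for pair in zip(symbols, symbols[1:]):
                pairs[pair] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the chosen merge everywhere in the vocab.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] += freq
        vocab = new_vocab
    return merges

merges = train_bpe_merges(["low", "lower", "lowest", "low"], num_merges=3)
```

Run it on newer data and you get different merges, hence a different vocab, which is exactly where the compatibility question comes from.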
But you can't use an old model with a new tokenizer. Changing the tokenizer implies you trained the model from scratch.
A little bit of post-training will fix that. Folks on /r/LocalLLaMa have been making effective finetunes with diff. tokenizers for years.
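The usual recipe behind those finetunes is to transplant the old embedding matrix onto the new vocab before post-training: shared tokens copy their old vector, and novel tokens start from the mean of the old embeddings of their pieces under the old tokenizer. A hedged sketch, with made-up names (in a real setup you'd use something like `model.resize_token_embeddings` plus a custom init, then finetune):

```python
import numpy as np

def transplant_embeddings(old_emb, old_vocab, new_vocab, old_tokenize):
    """Initialize embeddings for a new vocab from an old model's matrix.

    Common heuristic: copy vectors for tokens present in both vocabs;
    for novel tokens, average the old embeddings of their pieces.
    Post-training then cleans up the rough edges.
    """
    dim = old_emb.shape[1]
    new_emb = np.zeros((len(new_vocab), dim))
    for token, new_id in new_vocab.items():
        if token in old_vocab:
            # Shared token: keep the old vector as-is.
            new_emb[new_id] = old_emb[old_vocab[token]]
        else:
            # Novel token: mean of its pieces' old embeddings.
            piece_ids = [old_vocab[p] for p in old_tokenize(token) if p in old_vocab]
            if piece_ids:
                new_emb[new_id] = old_emb[piece_ids].mean(axis=0)
            # else: left at zero (or random init) and learned in post-training
    return new_emb

# Toy demo: old vocab is characters, new vocab adds a merged token "ab".
old_vocab = {"a": 0, "b": 1}
old_emb = np.array([[1.0, 0.0], [0.0, 1.0]])
new_vocab = {"a": 0, "ab": 1}
new_emb = transplant_embeddings(old_emb, old_vocab, new_vocab, old_tokenize=list)
```

This is why it's not "from scratch": most of the network is untouched, and only the (re-initialized) embeddings plus a short finetune have to absorb the vocab change.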