Commit 0f5cbb08b3:

* Add Llama 3.1 rope
* Clippy
* Format
* Clippy
* Add support for multiple eos tokens
* Untagged either
* Remove either dep and fix settings.json
* Make the max positional embeddings configurable
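The "Add Llama 3.1 rope" item refers to the extended-context RoPE frequency scaling introduced with Llama 3.1. Below is a minimal sketch of that scaling rule, not the repository's actual implementation; the function name is illustrative and the constants are the values published for Llama 3.1 (scale factor 8, low/high frequency factors 1 and 4, original context length 8192), which a real implementation would read from the model config.

```rust
/// Sketch of Llama 3.1-style RoPE frequency scaling (illustrative, not the
/// repository's code). Takes the base inverse frequencies and rescales the
/// low-frequency components so positions beyond the original context length
/// can be represented.
fn scale_rope_freqs(freqs: &[f32]) -> Vec<f32> {
    let scale_factor = 8.0f32; // rope_scaling "factor" (assumed value)
    let low_freq_factor = 1.0f32;
    let high_freq_factor = 4.0f32;
    let old_context_len = 8192.0f32; // original_max_position_embeddings

    let low_freq_wavelen = old_context_len / low_freq_factor;
    let high_freq_wavelen = old_context_len / high_freq_factor;

    freqs
        .iter()
        .map(|&freq| {
            let wavelen = 2.0 * std::f32::consts::PI / freq;
            if wavelen < high_freq_wavelen {
                // High-frequency components are left untouched.
                freq
            } else if wavelen > low_freq_wavelen {
                // Low-frequency components are divided by the scale factor.
                freq / scale_factor
            } else {
                // Smooth interpolation between the two regimes.
                let smooth = (old_context_len / wavelen - low_freq_factor)
                    / (high_freq_factor - low_freq_factor);
                (1.0 - smooth) * freq / scale_factor + smooth * freq
            }
        })
        .collect()
}
```

This scaling only stretches the slowly rotating dimensions, which is what lets the "max positional embeddings" mentioned in the last commit item be raised well past the original 8K context without retraining the high-frequency components.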
| Name |
|---|
| benchmarks |
| bench_main.rs |