Running llama2.c Examples

Here, we provide two examples of how to run llama2.c written in Rust using a Candle-compiled WASM binary and runtimes.

Pure Rust UI

To build and test the UI made in Rust you will need Trunk. From the candle-wasm-examples/llama2-c directory, run the following steps.
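
If Trunk is not installed yet, it can typically be installed with Cargo:

cargo install trunk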

Download assets:

# Model and tokenizer

wget -c https://huggingface.co/spaces/lmz/candle-llama2/resolve/main/model.bin
wget -c https://huggingface.co/spaces/lmz/candle-llama2/resolve/main/tokenizer.json

Run hot reload server:

trunk serve --release --public-url / --port 8080
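
Once the build finishes, the app should be reachable in your browser at http://localhost:8080 (the port follows the --port flag above).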

Vanilla JS and WebWorkers

To build and test the UI made in Vanilla JS and WebWorkers, first we need to build the WASM library:

sh build-lib.sh

This will bundle the library under ./build, and we can then import it inside our WebWorker like a normal JS module:

import init, { Model } from "./build/m.js";
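
As a rough sketch of how the worker can then use this module (the loadModel helper and the Model constructor arguments below are assumptions for illustration, not the exact API; see llama2cWorker.js for the real usage), the flow is: initialize the WASM module, fetch the model and tokenizer bytes, and construct a Model:

// Sketch only: the helper name and Model constructor arguments are assumptions.
async function loadModel(weightsURL, tokenizerURL) {
  // Initialize the WASM module before using any exported types.
  await init();
  // Fetch the raw model and tokenizer bytes.
  const weights = new Uint8Array(await (await fetch(weightsURL)).arrayBuffer());
  const tokenizer = new Uint8Array(await (await fetch(tokenizerURL)).arrayBuffer());
  return new Model(weights, tokenizer);
}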

The full example can be found under ./lib-example.html. All the needed assets are fetched from the web, so there is no need to download anything. Finally, you can preview the example by running a local HTTP server. For example:

python -m http.server

Then open http://localhost:8000/lib-example.html in your browser.