Fix book MNIST reference (no more huggingface) (#1471)

Guillaume Lagrange 2024-03-15 15:30:08 -04:00 committed by GitHub
parent 4e68cb2175
commit cfc0a4d9fb
1 changed file with 7 additions and 7 deletions


@@ -1,18 +1,18 @@
# Data
Typically, one trains a model on some dataset. Burn provides a library of very useful dataset
-sources and transformations. In particular, there are Hugging Face dataset utilities that allow to
-download and store data from Hugging Face into an SQLite database for extremely efficient data
-streaming and storage. For this guide, we will use the MNIST dataset provided by Hugging Face.
+sources and transformations, such as Hugging Face dataset utilities that allow you to download and
+store data into an SQLite database for extremely efficient data streaming and storage. For this
+guide though, we will use the MNIST dataset from `burn::data::dataset::vision`, which requires no
+external dependency.
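As a quick illustration of the new dataset source, the vision MNIST dataset can be constructed directly. The sketch below is illustrative rather than part of the diffed guide; it assumes the crate's `vision` feature is enabled and uses the `MnistDataset::train()` constructor together with the `Dataset` trait's `len`/`get` methods.

```rust, ignore
use burn::data::dataset::{vision::MnistDataset, Dataset};

fn main() {
    // Loads the MNIST training split; no Hugging Face dependency involved.
    let dataset = MnistDataset::train();
    println!("training items: {}", dataset.len());

    // Each item carries a 28x28 image and its label.
    if let Some(item) = dataset.get(0) {
        println!("label of first item: {}", item.label);
    }
}
```

On first use the raw files are downloaded and cached locally, so subsequent runs should not need network access.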
To iterate over a dataset efficiently, we will define a struct which will implement the `Batcher`
trait. The goal of a batcher is to map individual dataset items into a batched tensor that can be
used as input to our previously defined model.
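Conceptually, a batcher is just a mapping from a `Vec` of dataset items to one batched value. Paraphrased from memory (the exact definition lives in `burn::data::dataloader::batcher` and may differ between Burn versions), the trait has roughly this shape:

```rust, ignore
/// Rough shape of the `Batcher` trait (paraphrased, not the verbatim definition).
pub trait Batcher<I, O>: Send {
    /// Map individual dataset items `I` into one batched value `O`,
    /// typically a struct holding batched tensors.
    fn batch(&self, items: Vec<I>) -> O;
}
```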
-Let us start by defining our dataset functionalities in a file `src/data.rs`. We shall omit some of the imports for
-brevity,
-but the full code for following this guide can be found
-at `examples/guide/` [directory](https://github.com/tracel-ai/burn/tree/main/examples/guide).
+Let us start by defining our dataset functionalities in a file `src/data.rs`. We shall omit some of
+the imports for brevity, but the full code for following this guide can be found at
+`examples/guide/` [directory](https://github.com/tracel-ai/burn/tree/main/examples/guide).
```rust , ignore
use burn::{