mirror of https://github.com/tracel-ai/burn.git
Burn WGPU Backend
This crate provides a WGPU backend for Burn, built on the wgpu crate. The backend supports Vulkan, Metal, DirectX 11/12, OpenGL, and WebGPU.
Usage Example

```rust
#[cfg(feature = "wgpu")]
mod wgpu {
    use burn_autodiff::ADBackendDecorator;
    use burn_wgpu::{AutoGraphicsApi, WgpuBackend, WgpuDevice};
    use mnist::training;

    pub fn run() {
        let device = WgpuDevice::default();
        training::run::<ADBackendDecorator<WgpuBackend<AutoGraphicsApi, f32, i32>>>(device);
    }
}
```
Configuration
You can set the BURN_WGPU_MAX_TASKS environment variable to a positive integer that determines how many compute tasks are batched before being submitted to the graphics API. The best value is the smallest one that still keeps the GPU at 100% usage; a higher value only increases GPU memory usage with no performance benefit.
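Because BURN_WGPU_MAX_TASKS is read from the environment, it can be set when launching your program. A minimal sketch is shown below; the value 64 is purely illustrative, not a recommendation, and should be tuned against GPU utilization for your workload:

```shell
# Illustrative value only: start low and raise it until GPU usage hits 100%.
export BURN_WGPU_MAX_TASKS=64

# Then launch your Burn program as usual, for example:
# cargo run --release
```

Setting the variable inline (`BURN_WGPU_MAX_TASKS=64 cargo run --release`) scopes it to a single invocation, which is convenient when comparing different batch sizes.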