Commit Graph

83 Commits

Author SHA1 Message Date
Nathaniel Simard f396094f0b
Fix training dashboard metrics switch (#1228) 2024-02-01 13:56:22 -05:00
Sylvain Benner c4e512d176
Create doc feature in all burn crates and use them for docs.rs build (#1212) 2024-02-01 12:47:46 -05:00
Sylvain Benner 4aa13d6b25
Bump Burn version to 0.13 (#1211) 2024-01-31 16:01:20 -05:00
Nathaniel Simard 2acf6561dc
Chore: put all dependencies versions in workspace (#1210) 2024-01-31 14:47:02 -05:00
Nathaniel Simard 0ec0fba869
Chore: Update ratatui version (#1204) 2024-01-31 11:26:09 -05:00
amfaber ff222b06b5
[burn-train] Fix keys not working reliably to change metrics in terminal UI (#1101)
* Exit event handler if KeyEvent is of release kind

* Switch logic to return on press

---------

Co-authored-by: Dilshod Tadjibaev <939125+antimora@users.noreply.github.com>
2024-01-30 22:26:47 -05:00
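The fix above can be sketched as follows. This is a minimal illustration with hypothetical names, not the actual burn-train code: some terminals report both Press and Release events for one keystroke, so the handler returns early on anything that is not a press, ensuring each physical keypress switches the metric view exactly once.

```rust
/// Hypothetical sketch of press-vs-release key handling in a terminal UI.
#[derive(Clone, Copy, PartialEq, Debug)]
enum KeyEventKind {
    Press,
    Release,
}

struct KeyEvent {
    kind: KeyEventKind,
    code: char,
}

/// Cycle the selected metric on Tab, but only for Press events.
fn handle_key(event: &KeyEvent, selected: &mut usize, num_metrics: usize) {
    // Return early on release so each keystroke is handled once.
    if event.kind != KeyEventKind::Press {
        return;
    }
    if event.code == '\t' {
        *selected = (*selected + 1) % num_metrics;
    }
}
```

Without the early return, a terminal emitting press/release pairs would advance the selection twice per keystroke, which matches the unreliable switching the commit describes.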
Nathaniel Simard eaa4dc3207
Feat/recorder/custom device (#1165) 2024-01-23 13:05:41 -05:00
Kirill Mavreshko 97297538b1
Remove _devauto functions (#518) (#1110) 2024-01-06 13:36:34 -05:00
Sylvain Benner fab344c565
[burn-compute] Add persistent cache for autotune in std environment (#1087)
* Add new persistent cache to tune cache

* Serialize autotune persistent cache using vectors

* Properly load and save the persistent cache

* Print an error when autotune cache cannot be loaded

* Add tests for persistent cache

Use the same logic as the already implemented tests

* Cargo fmt

* Silence clippy check about implementing default for CliMetricsRenderer

* Add burn-compute feature flag autotune-persistent-cache

This allows burn-compute to remain no-std compliant

* debug

* Git ignore .dir-locals.el files

* Update documentation for compute_checksum implementation

* Expect messages should be an expectation, not an error message

* Replace silent eprintln! with log::warn! macro

* Remove clippy allow attribute

* Fix typos in documentation

* Move creation of additional client into the test fn that requires it

* Create compute clients in test function to test different checksum

* Revert tui as a default feature in burn-train cargo file

* Use structs for autotune cache entries

* Unpack InMemoryCacheEntry for even better readability

* Remove unneeded checksum_checked field in no-std env

* Make sure that autotune cache directory exists

* Add test for autotune cache file path creation

* Prefix autotune cache file name with device info

* Use new compute in autotune cache integration tests

This prevents race conditions by always reloading the cache for
each test.

* Move burn-compute rand dependency into dev-dependencies

* Avoid creating the formatted message except in case of an actual error

* Fix burn-compute unused code warning in no-std env
2024-01-05 14:19:33 -05:00
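The persistent autotune cache described in the commit above can be sketched roughly as follows. All names and the on-disk format here are hypothetical, assumed for illustration only (the real implementation serializes entries differently): an in-memory map from benchmark key to the fastest kernel index, reloaded from a per-device file on startup and flushed back on save, with the cache directory created on demand and failures reported rather than panicking.

```rust
use std::collections::HashMap;
use std::fs;
use std::path::{Path, PathBuf};

/// Hypothetical sketch of a persistent autotune cache.
struct PersistentTuneCache {
    path: PathBuf,
    entries: HashMap<String, usize>,
}

impl PersistentTuneCache {
    /// Reload any previously persisted entries for this device.
    fn load(dir: &Path, device: &str) -> Self {
        // Prefix the file name with device info so caches from
        // different devices don't collide.
        let path = dir.join(format!("{device}-autotune.cache"));
        let mut entries = HashMap::new();
        if let Ok(text) = fs::read_to_string(&path) {
            for line in text.lines() {
                if let Some((key, idx)) = line.rsplit_once('\t') {
                    if let Ok(idx) = idx.parse::<usize>() {
                        entries.insert(key.to_string(), idx);
                    }
                }
            }
        }
        Self { path, entries }
    }

    fn insert(&mut self, key: &str, fastest: usize) {
        self.entries.insert(key.to_string(), fastest);
    }

    fn get(&self, key: &str) -> Option<usize> {
        self.entries.get(key).copied()
    }

    /// Make sure the cache directory exists before writing; warn
    /// instead of panicking if the cache cannot be persisted.
    fn save(&self) {
        if let Some(parent) = self.path.parent() {
            let _ = fs::create_dir_all(parent);
        }
        let body: String = self
            .entries
            .iter()
            .map(|(k, v)| format!("{k}\t{v}\n"))
            .collect();
        if fs::write(&self.path, body).is_err() {
            eprintln!("warning: could not persist autotune cache");
        }
    }
}
```

Reloading the cache fresh in each test, as the commit notes, avoids race conditions between tests sharing one in-memory instance.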
unrenormalizable 40ec289a92
Making Embedding.weight public (#1094) 2023-12-22 18:11:38 -05:00
Kirill Mavreshko 1fd07fcb4a
Explicit device tensors (#1081) 2023-12-20 17:49:59 -05:00
Alex Errant 610d64095e
cargo +nightly fmt (#1017) 2023-12-12 13:29:06 -05:00
David Chavez 71d3c1d142
chore(infra): Share some properties across workspace (#1039) 2023-12-12 09:39:07 -05:00
Louis Fortier-Dubois 8fc52113bc
Chore/bump v12 (#1048) 2023-12-04 10:47:54 -05:00
Louis Fortier-Dubois 3088c466a5
patch 0.11.1 (#1047) 2023-12-04 10:18:30 -05:00
nathaniel 3a8dcfb5f4 Fix burn-train feature flag 2023-12-01 16:01:22 -05:00
Nathaniel Simard ab1b5890f5
Chore/release (#1031) 2023-12-01 14:33:28 -05:00
Nathaniel Simard 322480b744
Feat/op fusion decorator (#939)
* WIP

* Impl backend decorator

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* Refactor

* Handle graph single ops execution

* WIP

* Starting to get concrete

* WIP

* Fix locator

* Implement add ops

* Start implementing ops

* Add more ops

* Add more ops

* More float ops

* Almost finish float ops

* Almost done with Int

* Some fix

* Into float

* Implement bool ops

* Almost done with MVP

* Fix adaptive pooling

* Add fusion as backend

* Fix memory leak

* Fix

* WIP Doc

* Doc all ops enum

* Initial docs

* Clippy

* Clippy v2

* Fix typos

* Fix doc

* Fix feature flags

* Add missing ops

* Some cleanup

* Revert u128 id

* cosmetic fixes

---------

Co-authored-by: louisfd <louisfd94@gmail.com>
2023-11-09 21:21:41 -05:00
Nathaniel Simard dddc138757
Add warmup logic when calculating eta (#923) 2023-11-03 08:57:09 -04:00
Louis Fortier-Dubois 2ac348c604
fix singular in estimated time (#928) 2023-11-03 08:52:48 -04:00
Luni-4 8c80c9b94a
ci/Speed up typos checks (#907) 2023-11-02 14:30:07 -04:00
Nathaniel Simard 96524d40a1
[Breaking] Refactor Backend Names (#904) 2023-10-29 18:27:49 -04:00
Nathaniel Simard 233922d60c
Chore: Bump version for next release (#900) 2023-10-24 19:31:13 -04:00
Nathaniel Simard 80fe58c604
[Burn-train] Improve panic messages (#885)
* [Burn-train] Improve panic messages

* Add new to in-memory logger
2023-10-23 10:49:46 -04:00
Damien Elmes 4cb27d289a
Fix train-minimal breakage (#882)
* Fix train-minimal breakage

* Ensure examples get checked in CI
2023-10-22 11:17:36 -04:00
Nathaniel Simard af813d09ed
Feat/early stopping + burn train refactor (#878) 2023-10-20 11:47:31 -04:00
Nathaniel Simard dd4e72a98f
Feat/checkpoint criteria (#862)
* WIP

* Setup

* Test metrics

* Fix bug

* Cleanup
2023-10-17 09:03:11 -04:00
Nathaniel Simard 620b86de98
Feat training events (#857) 2023-10-10 13:27:03 -04:00
Dilshod Tadjibaev 097fd956d0
Upgrade dependency versions (#854)
This updates dependencies including tch to 0.14.0, which uses Torch 2.1.
2023-10-09 14:29:44 -04:00
Nathaniel Simard 904ff1a974
Refactor burn-train (#847) 2023-10-05 13:10:54 -04:00
Nathaniel Simard ce120ead3a
Improve metrics (#844) 2023-10-03 18:15:43 -04:00
Nathaniel Simard aacf191161
Fix training checkpoints (#815) 2023-09-21 08:52:04 -04:00
Damien Elmes d7e9e75099
Fix train-minimal feature and ensure it gets tested (#802) 2023-09-16 09:52:14 -04:00
Nathaniel Simard 57d6a566be
Feat/dashboard tui (#790) 2023-09-13 10:45:14 -04:00
Nathaniel Simard af0be5cfeb
Chore: bump version (#777) 2023-09-06 12:15:13 -04:00
Damien Elmes 08e2ccbed3
Fix: log file creation could not be avoided (#754) 2023-09-03 08:50:48 -04:00
Damien Elmes a47d23c3dd
Add ability to interrupt training loop (#753) 2023-09-02 11:31:46 -04:00
Damien Elmes d80e0d1734
Add ui/metrics feature flags (#740) 2023-09-02 11:26:40 -04:00
Damien Elmes 3669d2a6d4
Migrate from log4rs to tracing (#739) 2023-08-31 21:07:26 -04:00
Damien Elmes ff1c0d8f1a
Fix: ensure final CLI update happens (#716)
The merge of #708 unearthed a bug in the CLI code: if the update at
completion time falls within the throttling period, the final output
can appear as if the process didn't fully complete.

More info: https://github.com/open-spaced-repetition/fsrs-optimizer-burn/pull/36#issuecomment-1696736807
2023-08-29 07:59:12 -04:00
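The bug described above can be sketched as follows, using hypothetical names rather than the actual CLI code: regular progress updates are throttled to limit redraw frequency, so the final update must bypass the throttle or it may be silently dropped, leaving the display stuck short of 100%.

```rust
use std::time::{Duration, Instant};

/// Hypothetical sketch of a throttled progress renderer.
struct ThrottledRenderer {
    last_render: Option<Instant>,
    min_interval: Duration,
    rendered: Vec<u32>,
}

impl ThrottledRenderer {
    fn new(min_interval: Duration) -> Self {
        Self {
            last_render: None,
            min_interval,
            rendered: Vec::new(),
        }
    }

    /// Regular updates inside the throttle window are silently dropped.
    fn update(&mut self, percent: u32) {
        let now = Instant::now();
        if let Some(last) = self.last_render {
            if now.duration_since(last) < self.min_interval {
                return; // throttled: this update never reaches the screen
            }
        }
        self.last_render = Some(now);
        self.rendered.push(percent);
    }

    /// The fix: completion always renders, regardless of throttling.
    fn finish(&mut self, percent: u32) {
        self.rendered.push(percent);
    }
}
```

Without a `finish`-style bypass, a run whose last `update(100)` lands inside the throttle window would end with a stale percentage on screen, which is exactly the symptom the commit describes.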
Damien Elmes a4a9844da3
Feat: Some tweaks to make it more practical to integrate in a GUI app (#706)
* feat: Add support for using a custom renderer

When integrating in an app, the CLI display is undesirable. This will
allow us to collect the progress of iterations, so they can be displayed
in a GUI.

Because CLIDashboardRenderer() writes to the console when ::new() is
called, the code had to be refactored to defer creation until .build()
is called. This means that instead of delegating metric assignments to
an already-created dashboard, we store them and add them to the
renderer later.

* feat: Allow opt-out of experiment.log
2023-08-28 16:23:31 -04:00
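The deferred-creation pattern described above can be sketched like this. The names here are hypothetical, not the real burn-train API: metric registrations are stored on the builder instead of being forwarded to a renderer, and only `build()` constructs the (console-writing) default renderer and replays the stored metrics into it, which also gives a GUI app a point to swap in its own renderer.

```rust
/// Hypothetical renderer abstraction a GUI app could implement.
trait MetricsRenderer {
    fn register_metric(&mut self, name: &str);
    fn metrics(&self) -> &[String];
}

struct CliRenderer {
    metrics: Vec<String>,
}

impl CliRenderer {
    fn new() -> Self {
        // The real CLI renderer starts writing to the console here,
        // which is why its construction must be deferred.
        Self { metrics: Vec::new() }
    }
}

impl MetricsRenderer for CliRenderer {
    fn register_metric(&mut self, name: &str) {
        self.metrics.push(name.to_string());
    }
    fn metrics(&self) -> &[String] {
        &self.metrics
    }
}

struct LearnerBuilder {
    pending_metrics: Vec<String>,
    renderer: Option<Box<dyn MetricsRenderer>>,
}

impl LearnerBuilder {
    fn new() -> Self {
        Self {
            pending_metrics: Vec::new(),
            renderer: None,
        }
    }

    /// Metric registrations are stored, not forwarded: no renderer exists yet.
    fn metric(mut self, name: &str) -> Self {
        self.pending_metrics.push(name.to_string());
        self
    }

    /// A GUI app can swap in its own renderer instead of the CLI one.
    fn renderer(mut self, renderer: Box<dyn MetricsRenderer>) -> Self {
        self.renderer = Some(renderer);
        self
    }

    /// Only now is the default renderer created and the stored
    /// metric assignments replayed into it.
    fn build(self) -> Box<dyn MetricsRenderer> {
        let mut renderer = self
            .renderer
            .unwrap_or_else(|| Box::new(CliRenderer::new()) as Box<dyn MetricsRenderer>);
        for name in &self.pending_metrics {
            renderer.register_metric(name);
        }
        renderer
    }
}
```

Until `build()` runs, nothing touches the console, so a GUI integration that supplies its own renderer never sees CLI output.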
Nathaniel Simard 481ff14fe1
feat: can add custom training and validation metric loggers (#690) 2023-08-25 07:16:45 -04:00
Nathaniel Simard efee0ac296
Feat/train/custom optimize method (#689)
* Add the possibility to add a custom optimize function for models

* Fix clippy
2023-08-25 07:14:36 -04:00
Elazrod56 dd5ea5251c
Training metrics (#647) 2023-08-21 14:13:36 -04:00
Louis Fortier-Dubois d659f11639
Perf/wgpu/autotune (#609) 2023-08-15 11:26:00 -04:00
Caio Piccirillo 1d3bbaab13
Typos (#608) 2023-08-08 17:57:51 -04:00
Nathaniel Simard 441a7011ce
Feat/tensor casting (#604) 2023-08-08 10:02:17 -04:00
Nathaniel Simard 0a5a2d729a
chore: bump version for next release (#533) 2023-07-26 09:46:28 -04:00
Nathaniel Simard 86b23d5117
fix: training epoch progress (#450) 2023-07-01 10:30:23 -04:00
Dilshod Tadjibaev 825aaa9977
Add missing documents (#424) 2023-06-23 09:28:34 -04:00