Nathaniel Simard
ce120ead3a
Improve metrics ( #844 )
2023-10-03 18:15:43 -04:00
Nathaniel Simard
aacf191161
Fix training checkpoints ( #815 )
2023-09-21 08:52:04 -04:00
Damien Elmes
d7e9e75099
Fix train-minimal feature and ensure it gets tested ( #802 )
2023-09-16 09:52:14 -04:00
Nathaniel Simard
57d6a566be
Feat/dashboard tui ( #790 )
2023-09-13 10:45:14 -04:00
Nathaniel Simard
af0be5cfeb
Chore: bump version ( #777 )
2023-09-06 12:15:13 -04:00
Damien Elmes
08e2ccbed3
Fix: log file creation could not be avoided ( #754 )
2023-09-03 08:50:48 -04:00
Damien Elmes
a47d23c3dd
Add ability to interrupt training loop ( #753 )
2023-09-02 11:31:46 -04:00
Damien Elmes
d80e0d1734
Add ui/metrics feature flags ( #740 )
2023-09-02 11:26:40 -04:00
Damien Elmes
3669d2a6d4
Migrate from log4rs to tracing ( #739 )
2023-08-31 21:07:26 -04:00
Damien Elmes
ff1c0d8f1a
Fix: ensure final CLI update happens ( #716 )
The merge of #708 unearthed a bug in the CLI code: if the final update falls
within the throttling period, the last output can look as though the process
never fully completed.
More info: https://github.com/open-spaced-repetition/fsrs-optimizer-burn/pull/36#issuecomment-1696736807
2023-08-29 07:59:12 -04:00
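The fix in #716 addresses a common throttling pitfall: intermediate updates are rate-limited, so the very last update must bypass the throttle or the display goes stale. The sketch below is a minimal illustration of that pattern with hypothetical names; it is not Burn's actual CLI renderer.

```rust
use std::time::{Duration, Instant};

// Hypothetical throttled renderer illustrating the bug class described above.
struct ThrottledRenderer {
    last_update: Option<Instant>,
    min_interval: Duration,
}

impl ThrottledRenderer {
    fn new(min_interval: Duration) -> Self {
        Self { last_update: None, min_interval }
    }

    // Regular updates may be skipped while inside the throttling window.
    fn update(&mut self, progress: f32) {
        let now = Instant::now();
        if self.last_update.map_or(true, |t| now - t >= self.min_interval) {
            println!("progress: {:.1}%", progress * 100.0);
            self.last_update = Some(now);
        }
    }

    // The final update bypasses the throttle; otherwise the last rendered
    // value can be stale and the run looks unfinished.
    fn finish(&mut self, progress: f32) {
        println!("progress: {:.1}%", progress * 100.0);
    }
}

fn main() {
    let mut renderer = ThrottledRenderer::new(Duration::from_millis(200));
    for step in 0..=100 {
        renderer.update(step as f32 / 100.0);
    }
    renderer.finish(1.0);
}
```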
Damien Elmes
a4a9844da3
Feat: Some tweaks to make it more practical to integrate in a GUI app ( #706 )
* feat: Add support for using a custom renderer
When integrating into an app, the CLI display is undesirable. This change
lets us collect the progress of iterations so it can be displayed in a GUI.
Because CLIDashboardRenderer() writes to the console when ::new() is
called, the code had to be refactored to defer creation until .build()
is called. This means that instead of delegating metric assignments to an
already-created dashboard, we store them and add them later.
* feat: Allow opt-out of experiment.log
2023-08-28 16:23:31 -04:00
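As a rough illustration of the direction described in #706, the sketch below shows a renderer that only records progress so a GUI can poll and display it, plus a builder that defers dashboard creation until .build() so nothing writes to the console beforehand. All names here are hypothetical stand-ins; Burn's actual trait and builder API may differ.

```rust
use std::sync::{Arc, Mutex};

// Hypothetical renderer trait, a stand-in for whatever the library exposes.
trait DashboardRenderer: Send {
    fn render_progress(&mut self, epoch: usize, iteration: usize, loss: f64);
}

// Collects progress instead of printing it, so a GUI can poll and display it.
#[derive(Default)]
struct CollectingRenderer {
    history: Arc<Mutex<Vec<(usize, usize, f64)>>>,
}

impl DashboardRenderer for CollectingRenderer {
    fn render_progress(&mut self, epoch: usize, iteration: usize, loss: f64) {
        self.history.lock().unwrap().push((epoch, iteration, loss));
    }
}

// Builder that stores the renderer and metric names, and only materializes
// the dashboard in `build()`, so nothing touches the console before then.
struct LearnerBuilder {
    renderer: Option<Box<dyn DashboardRenderer>>,
    metrics: Vec<String>,
}

impl LearnerBuilder {
    fn new() -> Self {
        Self { renderer: None, metrics: Vec::new() }
    }

    fn renderer(mut self, renderer: Box<dyn DashboardRenderer>) -> Self {
        self.renderer = Some(renderer);
        self
    }

    fn metric(mut self, name: &str) -> Self {
        self.metrics.push(name.to_string());
        self
    }

    fn build(self) -> Box<dyn DashboardRenderer> {
        // Deferred creation: the stored metric names would be registered
        // with the dashboard here, once it finally exists.
        self.renderer
            .unwrap_or_else(|| Box::new(CollectingRenderer::default()))
    }
}

fn main() {
    let mut dashboard = LearnerBuilder::new()
        .metric("Loss")
        .renderer(Box::new(CollectingRenderer::default()))
        .build();
    dashboard.render_progress(1, 10, 0.42);
}
```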
Nathaniel Simard
481ff14fe1
feat: can add custom training and validation metric loggers ( #690 )
2023-08-25 07:16:45 -04:00
Nathaniel Simard
efee0ac296
Feat/train/custom optimize method ( #689 )
* Add the possibility to provide a custom optimize function for models
* Fix clippy
2023-08-25 07:14:36 -04:00
Elazrod56
dd5ea5251c
Training metrics ( #647 )
2023-08-21 14:13:36 -04:00
Louis Fortier-Dubois
d659f11639
Perf/wgpu/autotune ( #609 )
2023-08-15 11:26:00 -04:00
Caio Piccirillo
1d3bbaab13
Typos ( #608 )
2023-08-08 17:57:51 -04:00
Nathaniel Simard
441a7011ce
Feat/tensor casting ( #604 )
2023-08-08 10:02:17 -04:00
Nathaniel Simard
0a5a2d729a
chore: bump version for next release ( #533 )
2023-07-26 09:46:28 -04:00
Nathaniel Simard
86b23d5117
fix: training epoch progress ( #450 )
2023-07-01 10:30:23 -04:00
Dilshod Tadjibaev
825aaa9977
Add missing documents ( #424 )
2023-06-23 09:28:34 -04:00
Dilshod Tadjibaev
fce45f51be
Doc fixes ( #418 )
2023-06-21 12:32:50 -04:00
Dilshod Tadjibaev
834c7ecc1f
Clean up cargo descriptions and formatting ( #403 )
2023-06-15 09:20:53 -04:00
Dilshod Tadjibaev
d57ca96695
Upgrade dep versions ( #399 )
2023-06-14 09:55:19 -04:00
Yu Sun
105c259d44
feat(learner): add RegressionOutput ( #380 )
2023-06-04 10:21:29 -04:00
Dilshod Tadjibaev
f24f91c651
Fix clippy error ( #377 )
2023-06-01 17:11:20 -04:00
Dilshod Tadjibaev
b170b539b7
Upgrade dep versions ( #359 )
2023-05-21 09:07:39 -04:00
Dilshod Tadjibaev
05763e1878
Bump version to the next minor to indicate dev ( #344 )
2023-05-10 18:02:08 -04:00
Nathaniel Simard
29eecd6383
Prepare next release ( #335 )
2023-05-06 10:32:23 -04:00
Nathaniel Simard
6c9a5c8e58
Refactor/record ( #323 )
2023-05-04 14:59:16 -04:00
Nathaniel Simard
04bcf9550a
Fix/text gen example ( #292 )
2023-04-11 17:18:45 -04:00
Nathaniel Simard
f45e6863d9
fix: text classification example ( #285 )
2023-04-11 12:46:42 -04:00
Nathaniel Simard
66028dc3cf
Feat/lr scheduler ( #276 )
2023-04-08 13:12:27 -04:00
Nathaniel Simard
f04fe101d8
Feat/module no grad ( #274 )
2023-04-07 09:01:27 -04:00
Nathaniel Simard
ca8ee0724d
Refactor/optim ( #272 )
2023-04-05 12:38:53 -04:00
Nathaniel Simard
b2cf37eb8b
Fix: load checkpoints ( #270 )
2023-04-04 13:09:58 -04:00
Nathaniel Simard
d3887bcd3d
Feat/module record ( #265 )
2023-04-02 16:22:05 -04:00
Nathaniel Simard
73f6d1916b
Feat/record ( #262 )
2023-04-02 10:09:29 -04:00
Nathaniel Simard
32d38bebc3
Refactor Param wrapping only for Tensor ( #259 )
2023-03-31 16:45:10 -04:00
Nathaniel Simard
6f43d983f7
State serialization/deserialization overhaul ( #247 )
2023-03-23 11:02:46 -04:00
nathaniel
00625d1527
fix: add version to path dependencies
2023-03-21 10:13:44 -04:00
Nathaniel Simard
4e28e2a776
chore: prepare release v0.6.0 ( #246 )
2023-03-21 09:47:37 -04:00
Nathaniel Simard
d8e5b3fed1
fix(burn-train): use single device loop ( #212 )
2023-03-08 19:35:10 -05:00
Nathaniel Simard
019c5f9c44
Refactor/int backend ( #197 )
* Update burn-tensor API
* Migrate burn-autodiff
* Update burn-tch
* Update burn-ndarray
* Add some doc
2023-03-06 14:45:58 -05:00
Nathaniel Simard
b8d9f012da
refactor(burn-core): module visitor mut ( #195 )
2023-03-05 14:40:47 -05:00
Nathaniel Simard
ffd3d35176
Refactor/tensor api ( #191 )
2023-03-05 09:23:42 -05:00
Nathaniel Simard
d4c298c221
Refactor/burn core ( #188 )
2023-03-01 16:10:58 -05:00
Nathaniel Simard
25deb5a13b
Refactor/autodiff ( #186 )
2023-02-28 20:01:26 -05:00
Dilshod Tadjibaev
fb925acc73
Make burn and burn-core packages no_std compatible ( #168 ) ( #173 )
* Make burn-ndarray and burn-tensor no_std compatible (#168 )
2023-02-25 09:38:01 -05:00
Nathaniel Simard
7d2f43dfca
Refactor Tensor API ( #163 )
2023-02-17 17:31:20 -05:00
Nathaniel Simard
2401d8ad96
Prepare next release ( #161 )
2023-02-12 15:32:29 -05:00