Commit Graph

103 Commits

Author SHA1 Message Date
Wouter Doppenberg 4c663b4cb7
Added correct expected weights, fixed adam state init (#621) 2023-08-09 17:55:39 -04:00
Caio Piccirillo cb283a9e5b
Max pool1d (#602) 2023-08-09 16:13:48 -04:00
Wouter Doppenberg 1f01fcb640
AdamW implementation (#566) 2023-08-09 16:09:17 -04:00
Dilshod Tadjibaev 716b7569de
Add Dropout ONNX OP and other refactoring (#606) 2023-08-09 09:53:20 -04:00
Caio Piccirillo 1d3bbaab13
Typos (#608) 2023-08-08 17:57:51 -04:00
Gadersd ed255c5561
Use buffered io for massive performance gains when loading and saving… (#593) 2023-08-06 12:56:27 -04:00
Nathaniel Simard ce8a175aa4
Feat/conv transpose1d backward (#586) 2023-08-06 10:50:10 -04:00
Dilshod Tadjibaev 1554a3c898
Full support for ONNX scalar operators and Constants (#578) 2023-08-04 16:51:51 -04:00
Nathaniel Simard ca9a8808d9
Feat/adaptive avg pool1d (#585) 2023-08-04 13:55:18 -04:00
Nathaniel Simard 8436d4ff66
Feat/tensor/adaptive avg pool2d (#572) 2023-08-04 10:23:59 -04:00
Nathaniel Simard 597eab524d
Feat/conv transpose2d (#574) 2023-08-03 15:42:18 -04:00
Nathaniel Simard 0a5a2d729a
chore: bump version for next release (#533) 2023-07-26 09:46:28 -04:00
Luni-4 914820e6e4
burn-core: Use a specific structure for 1d padding (#529) 2023-07-24 19:31:37 -04:00
polina guseva 64090a582b
Add static full method for tensor initialization with custom values (#486)
---------

Co-authored-by: Dilshod Tadjibaev <939125+antimora@users.noreply.github.com>
2023-07-24 13:03:27 -04:00
Luni-4 e066d95d2e
Implement padding for conv2d (#523) 2023-07-24 12:54:45 -04:00
Luni-4 f70ff4dc54
Add MaxPool2d to burn-import (#507) 2023-07-23 19:28:36 -04:00
Louis Fortier-Dubois 57a5476c89
bugfix for macos test (#503) 2023-07-18 16:15:00 -04:00
Dilshod Tadjibaev f433292a3f
Fix intermittent test failure (#497) 2023-07-13 19:40:54 -04:00
Dilshod Tadjibaev 53c088209d
Fix new clippy warnings that cause the CI to fail (#494) 2023-07-13 13:39:39 -04:00
Nathaniel Simard 4bc740a864
Fix: can't use init or load with a record with correct gradient requi… (#488) 2023-07-11 15:01:14 -04:00
Dilshod Tadjibaev 20213ade73
Add sinusoidal positional embedding module (#481) 2023-07-11 10:55:29 -04:00
Dilshod Tadjibaev e62ee1269b
Fix burn-tch's random implementation for standard dist (#469) 2023-07-06 08:50:50 -04:00
Nathaniel Simard 65bf6c1cbb
Refactor index => slice (#466) 2023-07-05 16:30:11 -04:00
Dilshod Tadjibaev a2b2d99aa3
Add Tensor as a module constant (#459) 2023-07-03 13:00:18 -04:00
Nathaniel Simard 5341c0260b
Perf/wgpu/benches (#448) 2023-07-02 12:57:15 -04:00
Dilshod Tadjibaev aced634827
Remove hardcoded "1" stride from Conv1d (#451) 2023-07-01 10:31:16 -04:00
Nathaniel Simard 13836ac599
Fix: wgpu powf (#443) 2023-06-28 12:50:16 -04:00
Dilshod Tadjibaev 61291cc30c
Revert temp half-rs fix for no_std (#430)
Fixes #268
2023-06-26 15:59:06 -04:00
Dilshod Tadjibaev eda241f8cf
Add missing docs and enable missing_docs warn lint (#420) 2023-06-21 14:12:13 -04:00
Yu Sun 9c5afa3469
fix(loss): Declare all contents of reduction as public (#423) 2023-06-21 13:00:57 -04:00
Dilshod Tadjibaev fce45f51be
Doc fixes (#418) 2023-06-21 12:32:50 -04:00
Dilshod Tadjibaev 4683acf726
Minor clean up of doc formatting and remove outdated TODO (#415) 2023-06-20 10:05:28 -04:00
Louis Fortier-Dubois e5cc50a837
refactor/initializer (#398)
* add tests to linear forward
* swap linear weight dims
* rename uniformdefault
* wip bias in init
* refactor initializer
* formatting
* fix get(0) to first
* wip refactor
* builder pattern + kaiming init
* kaiming initializer and major initializer refactor
* fix fan out
* easier fixes
* remove initializer options
* clippy fix
* revert to input*output linear
* gru uses xavier init
* should be normal actually
---------

Co-authored-by: louisfd <louisfortier-dubois@Louiss-MacBook-Pro.local>
2023-06-19 16:55:11 -04:00
Dilshod Tadjibaev 834c7ecc1f
Clean up cargo descriptions and formatting (#403) 2023-06-15 09:20:53 -04:00
Louis Fortier-Dubois fd88398ce4
feat/Xavier glorot (#394) 2023-06-11 09:54:44 -04:00
Mathias Insley f8cd38c071
Feat/gru (#393) 2023-06-09 18:56:40 -04:00
Mathias Insley 8a88a868ee
Feat/lstm (#370) 2023-06-06 14:33:22 -04:00
Yu Sun 498d163a7b
Feat: mse loss (#378) 2023-06-03 10:31:12 -04:00
Nathaniel Simard 974fdfaba1
Feat/wgpu backend setup (#376) 2023-06-02 11:52:47 -04:00
Nathaniel Simard 2a4ba5a6ab
Feat/gather scatter (#367) 2023-05-27 11:40:04 -04:00
Nathaniel Simard 3ef2a18d87
Fix flaky tests + add feature flag (#362) 2023-05-24 08:33:31 -04:00
Nathaniel Simard ba0ac112e1
Refactor: Burn Import use BurnGraph (#355) 2023-05-21 13:09:27 -04:00
Nathaniel Simard 18cb19cd03
Fix deserializer array of constant (#354) 2023-05-19 12:01:59 -04:00
Nathaniel Simard 2df268b674
feat: improve module recorder deserialization (#351) 2023-05-15 08:29:57 -04:00
Nathaniel Simard 976102fec0
Feat/avg pool1d (#349) 2023-05-15 08:29:45 -04:00
Dilshod Tadjibaev 05763e1878
Bump version to the next minor to indicate dev (#344) 2023-05-10 18:02:08 -04:00
Nathaniel Simard 29eecd6383
Prepare next release (#335) 2023-05-06 10:32:23 -04:00
Nathaniel Simard a50262b730
refactor: grad clipping (#326) 2023-05-05 12:56:35 -04:00
Mathias Insley d892f6186c
Gradient Clipping (#305) 2023-05-04 15:01:30 -04:00
Nathaniel Simard 6c9a5c8e58
Refactor/record (#323) 2023-05-04 14:59:16 -04:00