Commit Graph

78 Commits

Author | SHA1 | Message | Date
Caio Piccirillo | cb283a9e5b | Max pool1d (#602) | 2023-08-09 16:13:48 -04:00
Caio Piccirillo | 1d3bbaab13 | Typos (#608) | 2023-08-08 17:57:51 -04:00
Nathaniel Simard | 441a7011ce | Feat/tensor casting (#604) | 2023-08-08 10:02:17 -04:00
Nathaniel Simard | ca9a8808d9 | Feat/adaptive avg pool1d (#585) | 2023-08-04 13:55:18 -04:00
Nathaniel Simard | 8436d4ff66 | Feat/tensor/adaptive avg pool2d (#572) | 2023-08-04 10:23:59 -04:00
mmalczak | 73fb0eaa7e | Addition of abs tensor operator #506 (#553) | 2023-08-01 18:25:14 -04:00
Dilshod Tadjibaev | 74c41bdda2 | Add clamp, clamp_min, clamp_max tensor ops (#550) | 2023-07-26 20:02:38 -04:00
Nathaniel Simard | 0a5a2d729a | chore: bump version for next release (#533) | 2023-07-26 09:46:28 -04:00
polina guseva | 64090a582b | Add static full method for tensor initialization with custom values (#486) | 2023-07-24 13:03:27 -04:00
    Co-authored-by: Dilshod Tadjibaev <939125+antimora@users.noreply.github.com>
Nathaniel Simard | 0d17cde1cc | Increase repeat performance in tch backend (#499) | 2023-07-14 13:21:56 -04:00
Dilshod Tadjibaev | 53c088209d | Fix new clippy warnings that cause the CI to fail (#494) | 2023-07-13 13:39:39 -04:00
Dilshod Tadjibaev | e62ee1269b | Fix burn-tch's random implementation for standard dist (#469) | 2023-07-06 08:50:50 -04:00
Nathaniel Simard | 65bf6c1cbb | Refactor index => slice (#466) | 2023-07-05 16:30:11 -04:00
Nathaniel Simard | d26dead299 | fix arange (#440) | 2023-06-27 12:47:49 -04:00
Dilshod Tadjibaev | f0b266c8a3 | Add missing docs for burn-wgpu and burn-tch (#427) | 2023-06-23 09:31:37 -04:00
Nathaniel Simard | a8624590af | feat: mask_where (#409) | 2023-06-18 15:04:28 -04:00
Dilshod Tadjibaev | 834c7ecc1f | Clean up cargo descriptions and formatting (#403) | 2023-06-15 09:20:53 -04:00
Dilshod Tadjibaev | d57ca96695 | Upgrade dep versions (#399) | 2023-06-14 09:55:19 -04:00
Nathaniel Simard | 2a4ba5a6ab | Feat/gather scatter (#367) | 2023-05-27 11:40:04 -04:00
Dilshod Tadjibaev | b170b539b7 | Upgrade dep versions (#359) | 2023-05-21 09:07:39 -04:00
Nathaniel Simard | 976102fec0 | Feat/avg pool1d (#349) | 2023-05-15 08:29:45 -04:00
Dilshod Tadjibaev | fbbc8ac560 | Remove build panic from burn-tch (#347) | 2023-05-12 09:37:58 -04:00
Nathaniel Simard | 747e245cc4 | chore: bump tch version (#345) | 2023-05-11 14:21:52 -04:00
Dilshod Tadjibaev | 05763e1878 | Bump version to the next minor to indicate dev (#344) | 2023-05-10 18:02:08 -04:00
Nathaniel Simard | 73f99ef79f | Feat/maxmin numeric (#340) | 2023-05-09 16:35:55 -04:00
Nathaniel Simard | a88357ce1d | feat: add max & min ops (#339) | 2023-05-09 15:49:33 -04:00
Nathaniel Simard | 69001b0d69 | Feat/activation ops (#338) | 2023-05-09 08:32:35 -04:00
    * perf: GELU
    * Refactor relu
Nathaniel Simard | 29eecd6383 | Prepare next release (#335) | 2023-05-06 10:32:23 -04:00
Dilshod Tadjibaev | 39297a6479 | Readme updates (#325) | 2023-05-04 14:58:44 -04:00
    * Update text-generation readme for Mac users
    * Update root readme to reference import crate
    * Update import's readme
    * Update torch backend
Nathaniel Simard | b54f9302c7 | Feat/avg pool2d (#318) | 2023-04-30 12:25:14 -04:00
Sunny Gonnabathula | 02abc373d3 | add burn-tch support for bf16 (#303) | 2023-04-24 11:24:36 -04:00
Nathaniel Simard | c5e31b272f | Feat/group conv (#306) | 2023-04-22 15:00:41 -04:00
Nathaniel Simard | 78ac09fb7a | Support dilation in convolution operations (#301) | 2023-04-18 10:01:11 -04:00
Nathaniel Simard | bd58922784 | Feat/conv stride (#300) | 2023-04-16 22:03:16 -04:00
Yu Sun | 26cf555612 | Feat: add new tensor ops mask_scatter (#258) | 2023-04-13 08:58:31 -04:00
Nathaniel Simard | 04bcf9550a | Fix/text gen example (#292) | 2023-04-11 17:18:45 -04:00
Nathaniel Simard | f04fe101d8 | Feat/module no grad (#274) | 2023-04-07 09:01:27 -04:00
Nathaniel Simard | 6f43d983f7 | State serialization/deserialization overhaul (#247) | 2023-03-23 11:02:46 -04:00
nathaniel | 00625d1527 | fix: add version to path dependencies | 2023-03-21 10:13:44 -04:00
Nathaniel Simard | 4e28e2a776 | chore: prepare release v0.6.0 (#246) | 2023-03-21 09:47:37 -04:00
Nathaniel Simard | eb7b1ceb2f | Bump tch version (#245) | 2023-03-20 12:01:55 -04:00
Nathaniel Simard | c9e344a97f | Fix/batch norm (#238) | 2023-03-17 09:31:52 -04:00
Nathaniel Simard | d09ab44979 | Feat/index_select (#227) | 2023-03-12 17:44:22 -04:00
Nathaniel Simard | 9655b74b22 | Feat/index_select_dim ops (#225) | 2023-03-11 16:14:57 -05:00
Nathaniel Simard | 0544a915eb | Refactor/migrate more numeric func (#220) | 2023-03-10 10:47:15 -05:00
Nathaniel Simard | cf7847acb5 | fix: mix precision training on tch backend (#209) | 2023-03-07 09:42:51 -05:00
Nathaniel Simard | be96160065 | refactor: elems (#206) | 2023-03-06 18:39:49 -05:00
Nathaniel Simard | 019c5f9c44 | Refactor/int backend (#197) | 2023-03-06 14:45:58 -05:00
    * Update burn-tensor API
    * Migrate burn-autodiff
    * Update burn-tch
    * Update burn-ndarray
    * Add some doc
Nathaniel Simard | 02591e5a6f | fix(burn-tch): inplace binary ops (#193) | 2023-03-05 11:23:57 -05:00
Nathaniel Simard | 15ec42dd6f | Refactor/backend bool tensor (#192) | 2023-03-05 11:23:46 -05:00