Commit Graph

78 Commits

Author SHA1 Message Date
Nathaniel Simard 441a7011ce
Feat/tensor casting (#604) 2023-08-08 10:02:17 -04:00
Nathaniel Simard 8436d4ff66
Feat/tensor/adaptive avg pool2d (#572) 2023-08-04 10:23:59 -04:00
mmalczak 73fb0eaa7e
Addition of abs tensor operator #506 (#553) 2023-08-01 18:25:14 -04:00
Dilshod Tadjibaev 74c41bdda2
Add clamp, clamp_min, clamp_max tensor ops (#550) 2023-07-26 20:02:38 -04:00
Nathaniel Simard 0a5a2d729a
chore: bump version for next release (#533) 2023-07-26 09:46:28 -04:00
polina guseva 64090a582b
Add static full method for tensor initialization with custom values (#486)
---------

Co-authored-by: Dilshod Tadjibaev <939125+antimora@users.noreply.github.com>
2023-07-24 13:03:27 -04:00
Dilshod Tadjibaev e62ee1269b
Fix burn-tch's random implementation for standard dist (#469) 2023-07-06 08:50:50 -04:00
Nathaniel Simard 65bf6c1cbb
Refactor index => slice (#466) 2023-07-05 16:30:11 -04:00
Nathaniel Simard dc216f574b
fix ndarray index select assign (#441) 2023-06-27 12:48:01 -04:00
Dilshod Tadjibaev 825aaa9977
Add missing documents (#424) 2023-06-23 09:28:34 -04:00
Nathaniel Simard 3b9c997513
feat: provide embedding default impl (#417) 2023-06-20 16:16:31 -04:00
Nathaniel Simard 4d40bde7b9
feat: argmax + argmin (#412) 2023-06-20 10:03:00 -04:00
Nathaniel Simard a8624590af
feat: mask_where (#409) 2023-06-18 15:04:28 -04:00
Dilshod Tadjibaev 834c7ecc1f
Clean up cargo descriptions and formatting (#403) 2023-06-15 09:20:53 -04:00
Dilshod Tadjibaev d57ca96695
Upgrade dep versions (#399) 2023-06-14 09:55:19 -04:00
Dilshod Tadjibaev 80d145d629
Add run-before-pr.sh script and fix clippy errors (#397) 2023-06-14 09:10:24 -04:00
Nathaniel Simard 8c9802c363
Feat/wgpu/reduction (#392) 2023-06-08 16:54:36 -04:00
Nathaniel Simard 2a4ba5a6ab
Feat/gather scatter (#367) 2023-05-27 11:40:04 -04:00
Dilshod Tadjibaev b170b539b7
Upgrade dep versions (#359) 2023-05-21 09:07:39 -04:00
Dilshod Tadjibaev 6fece7e4cb
Dataset Improvements: Add Sqlite storage backend and HF importer improvements (#353) 2023-05-20 14:24:55 -04:00
Dilshod Tadjibaev 05763e1878
Bump version to the next minor to indicate dev (#344) 2023-05-10 18:02:08 -04:00
Nathaniel Simard 73f99ef79f
Feat/maxmin numeric (#340) 2023-05-09 16:35:55 -04:00
Nathaniel Simard 69001b0d69
Feat/activation ops (#338)
* perf: GELU

* Refactor relu
2023-05-09 08:32:35 -04:00
Nathaniel Simard 29eecd6383
Prepare next release (#335) 2023-05-06 10:32:23 -04:00
Dilshod Tadjibaev 314db93b7f
Add ability to load onnx state to the generated source code (#319) 2023-05-03 13:05:43 -04:00
Nathaniel Simard b54f9302c7
Feat/avg pool2d (#318) 2023-04-30 12:25:14 -04:00
Nathaniel Simard c5e31b272f
Feat/group conv (#306) 2023-04-22 15:00:41 -04:00
Nathaniel Simard 78ac09fb7a
Support dilation in convolution operations (#301) 2023-04-18 10:01:11 -04:00
Nathaniel Simard bd58922784
Feat/conv stride (#300) 2023-04-16 22:03:16 -04:00
Yu Sun 26cf555612
Feat: add new tensor ops mask_scatter (#258) 2023-04-13 08:58:31 -04:00
Nathaniel Simard 04bcf9550a
Fix/text gen example (#292) 2023-04-11 17:18:45 -04:00
Nathaniel Simard 66028dc3cf
Feat/lr scheduler (#276) 2023-04-08 13:12:27 -04:00
Nathaniel Simard f04fe101d8
Feat/module no grad (#274) 2023-04-07 09:01:27 -04:00
Mathias Insley d8f64ce1dd
Pretty Print Tensors (#257) 2023-04-06 20:06:38 -04:00
Nathaniel Simard 6f43d983f7
State serialization/deserialization overhaul (#247) 2023-03-23 11:02:46 -04:00
nathaniel 00625d1527
fix: add version to path dependencies 2023-03-21 10:13:44 -04:00
Nathaniel Simard 4e28e2a776
chore: prepare release v0.6.0 (#246) 2023-03-21 09:47:37 -04:00
Dilshod Tadjibaev 6222b887e9
Bump to the latest minor versions of dependencies (#237) 2023-03-15 21:54:47 -04:00
Dilshod Tadjibaev aa8c96d3fb
Update readme with blas-accelerate flag information (#236) 2023-03-15 21:44:13 -04:00
Nathaniel Simard d09ab44979
Feat/index_select (#227) 2023-03-12 17:44:22 -04:00
Nathaniel Simard 9655b74b22
Feat/index_select_dim ops (#225) 2023-03-11 16:14:57 -05:00
Nathaniel Simard 860051ca5c
Perf/ndarray maxpool (#223) 2023-03-10 19:14:18 -05:00
Nathaniel Simard 06c1997559
refactor(burn-ndarray): use single par iter in conv2d (#222) 2023-03-10 13:16:57 -05:00
Nathaniel Simard 0544a915eb
Refactor/migrate more numeric func (#220) 2023-03-10 10:47:15 -05:00
Nathaniel Simard a6a49cdc2a
perf: improve conv2d performance (#219) 2023-03-10 10:46:22 -05:00
Nathaniel Simard b34987f1f5
Perf/ndarray ipow (#217) 2023-03-09 15:23:14 -05:00
Nathaniel Simard a2ec774c37
draft: Perf/ndarray matmul (#214) 2023-03-09 14:00:35 -05:00
Nathaniel Simard be96160065
refactor: elems (#206) 2023-03-06 18:39:49 -05:00
Nathaniel Simard 019c5f9c44
Refactor/int backend (#197)
* Update burn-tensor API

* Migrate burn-autodiff

* Update burn-tch

* Update burn-ndarray

* Add some doc
2023-03-06 14:45:58 -05:00
Nathaniel Simard 15ec42dd6f
Refactor/backend bool tensor (#192) 2023-03-05 11:23:46 -05:00