Commit Graph

330 Commits

Author SHA1 Message Date
polina guseva 64090a582b
Add static full method for tensor initialization with custom values (#486)
---------

Co-authored-by: Dilshod Tadjibaev <939125+antimora@users.noreply.github.com>
2023-07-24 13:03:27 -04:00
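The `full` constructor added above creates a tensor of a given shape with every element set to one custom value. A minimal Python sketch of the semantics only (not Burn's actual Rust API; the function name and nested-list representation are illustrative):

```python
def full(shape, value):
    """Build a nested list of the given shape where every element equals `value`."""
    if not shape:
        return value  # zero-dimensional case: just the scalar
    return [full(shape[1:], value) for _ in range(shape[0])]

print(full((2, 3), 1.5))  # [[1.5, 1.5, 1.5], [1.5, 1.5, 1.5]]
```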
Louis Fortier-Dubois 9aca1837c2
Example/wgpu/mnist (#514)
* add wgpu for mnist

* auto graphics api

* fix display tests

* clippy

2023-07-20 17:12:13 -04:00
Nathaniel Simard d7ce52f0da
Feat/wgpu/conv (#512) 2023-07-20 15:14:42 -04:00
Louis Fortier-Dubois 4b60c0e7a0
continuous to contiguous (#511) 2023-07-20 11:28:35 -04:00
Nathaniel Simard c4afff182f
Feat/wgpu/max pool2d (#500) 2023-07-14 13:58:08 -04:00
Dilshod Tadjibaev ef421f0ae9
Add arange with steps op for int tensor (#490) 2023-07-13 17:09:57 -04:00
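The stepped `arange` op above produces an integer range that advances by a custom stride. A sketch of the expected semantics, assuming the usual half-open `[start, end)` convention (names are illustrative, not Burn's signature):

```python
def arange_step(start, end, step):
    """Integers from `start` up to (excluding) `end`, advancing by `step`."""
    out = []
    i = start
    while i < end:
        out.append(i)
        i += step
    return out

print(arange_step(0, 10, 3))  # [0, 3, 6, 9]
```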
Nathaniel Simard a2ac2057d8
Fix: wgpu scatter with different shapes (#489) 2023-07-12 13:21:19 -04:00
Jacob cb1ac26bb3
Add SiLU activation function (#482) 2023-07-09 11:28:14 -04:00
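SiLU (also called swish) is defined as `x * sigmoid(x)`; a scalar sketch of the math, independent of Burn's tensor API:

```python
import math

def silu(x):
    """SiLU / swish activation: x * sigmoid(x) = x / (1 + e^{-x})."""
    return x / (1.0 + math.exp(-x))

print(silu(0.0))  # 0.0 — sigmoid(0) = 0.5, times x = 0
```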
Nathaniel Simard 04ad14a32a
refactor: wgpu reductions (#471) 2023-07-06 11:40:37 -04:00
Dilshod Tadjibaev e62ee1269b
Fix burn-tch's random implementation for standard dist (#469) 2023-07-06 08:50:50 -04:00
Nathaniel Simard 65bf6c1cbb
Refactor index => slice (#466) 2023-07-05 16:30:11 -04:00
Nathaniel Simard 98d4abd9c0
Perf/wgpu/comparison (#460) 2023-07-04 09:33:05 -04:00
Nathaniel Simard a1c9970373
Refactor/wgpu/mask (#453) 2023-07-02 12:59:22 -04:00
Nathaniel Simard 77219f7262
Perf/unary binary (#446) 2023-06-29 08:56:35 -04:00
Louis Fortier-Dubois f99fe0fadd
Matmul 2D Tiling (#442) 2023-06-28 16:48:15 -04:00
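2D tiling computes the matmul block by block so each loaded tile of A and B is reused across a whole tile of C, which is what makes the kernel cache- and register-friendly on a GPU. A plain-Python sketch of the loop structure (the WGSL kernel in the PR is organized differently, but the blocking idea is the same):

```python
def matmul_tiled(a, b, tile=2):
    """C = A @ B over `tile` x `tile` blocks; inner loops stay within one block."""
    n, k, m = len(a), len(b), len(b[0])
    c = [[0.0] * m for _ in range(n)]
    for i0 in range(0, n, tile):          # tile row of C
        for j0 in range(0, m, tile):      # tile column of C
            for p0 in range(0, k, tile):  # tile along the shared dimension
                for i in range(i0, min(i0 + tile, n)):
                    for j in range(j0, min(j0 + tile, m)):
                        for p in range(p0, min(p0 + tile, k)):
                            c[i][j] += a[i][p] * b[p][j]
    return c

print(matmul_tiled([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]
```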
Nathaniel Simard 13836ac599
Fix: wgpu powf (#443) 2023-06-28 12:50:16 -04:00
Nathaniel Simard dc216f574b
fix ndarray index select assign (#441) 2023-06-27 12:48:01 -04:00
Nathaniel Simard d26dead299
fix arange (#440) 2023-06-27 12:47:49 -04:00
Dilshod Tadjibaev 61291cc30c
Revert temp half-rs fix for no_std (#430)
Fixes #268
2023-06-26 15:59:06 -04:00
Nathaniel Simard 6c834d3b22
Feat/wgpu/index select (#425) 2023-06-23 09:31:49 -04:00
Dilshod Tadjibaev 825aaa9977
Add missing documents (#424) 2023-06-23 09:28:34 -04:00
Nathaniel Simard c4e4c25fef
Feat/wgpu/mask (#422) 2023-06-21 13:01:18 -04:00
Yu Sun 9c5afa3469
fix(loss): Declare all contents of reduction as public (#423) 2023-06-21 13:00:57 -04:00
Dilshod Tadjibaev fce45f51be
Doc fixes (#418) 2023-06-21 12:32:50 -04:00
Nathaniel Simard 3b9c997513
feat: provide embedding default impl (#417) 2023-06-20 16:16:31 -04:00
Nathaniel Simard b6b684b9f2
Feat/wgpu/cat (#416) 2023-06-20 16:16:10 -04:00
Nathaniel Simard 4d40bde7b9
feat: argmax + argmin (#412) 2023-06-20 10:03:00 -04:00
Nathaniel Simard 323261b594
Feat/wgpu/comparisons (#411) 2023-06-19 19:11:49 -04:00
Louis Fortier-Dubois e5cc50a837
refactor/initializer (#398)
* add tests to linear forward

* swap linear weight dims

* rename uniformdefault

* wip bias in init

* refactor initializer

* formatting

* fix get(0) to first

* wip refactor

* builder pattern + kaiming init

* kaiming initializer and major initializer refactor

* fix fan out

* easier fixes

* remove initializer options

* clippy fix

* revert to input*output linear

* gru uses xavier init

* should be normal actually

---------

Co-authored-by: louisfd <louisfortier-dubois@Louiss-MacBook-Pro.local>
2023-06-19 16:55:11 -04:00
Nathaniel Simard 40743c477a
Feat/onnx/more ops (#410) 2023-06-19 16:10:50 -04:00
Nathaniel Simard a8624590af
feat: mask_where (#409) 2023-06-18 15:04:28 -04:00
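`mask_where` is an elementwise select: where the mask is true, take the value from the source tensor, otherwise keep the original. A 1-D sketch assuming that convention (Burn's actual argument order and broadcasting rules may differ):

```python
def mask_where(tensor, mask, source):
    """Pick source[i] where mask[i] is True, else tensor[i]."""
    return [s if m else t for t, m, s in zip(tensor, mask, source)]

print(mask_where([1, 2, 3], [True, False, True], [9, 9, 9]))  # [9, 2, 9]
```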
Dilshod Tadjibaev 834c7ecc1f
Clean up cargo descriptions and formatting (#403) 2023-06-15 09:20:53 -04:00
Louis Fortier-Dubois fd88398ce4
feat/Xavier glorot (#394) 2023-06-11 09:54:44 -04:00
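Xavier/Glorot uniform initialization draws weights from `U(-a, a)` with `a = gain * sqrt(6 / (fan_in + fan_out))`, balancing signal variance between the forward and backward passes. A sketch of the formula (illustrative names, not Burn's `Initializer` API):

```python
import math
import random

def xavier_uniform(fan_in, fan_out, gain=1.0):
    """One weight sampled from U(-a, a), a = gain * sqrt(6 / (fan_in + fan_out))."""
    a = gain * math.sqrt(6.0 / (fan_in + fan_out))
    return random.uniform(-a, a)
```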
Nathaniel Simard 8c9802c363
Feat/wgpu/reduction (#392) 2023-06-08 16:54:36 -04:00
Nathaniel Simard cb4f049eae
fix: improve assert approx (#390) 2023-06-06 14:34:20 -04:00
Mathias Insley 8a88a868ee
Feat/lstm (#370) 2023-06-06 14:33:22 -04:00
Nathaniel Simard bff752b1a8
Feat/wgpu/index (#386) 2023-06-06 12:29:14 -04:00
Nathaniel Simard c1e1e38a79
Add more ops (#387) 2023-06-06 12:21:20 -04:00
Nathaniel Simard ecc67c58f9
Feat/wgpu/swap dims (#381) 2023-06-04 19:34:35 -04:00
Nathaniel Simard 0a205a3603
feat: add basic matmul for wgpu backend (#379) 2023-06-03 10:32:57 -04:00
Nathaniel Simard 974fdfaba1
Feat/wgpu backend setup (#376) 2023-06-02 11:52:47 -04:00
Nathaniel Simard 2a4ba5a6ab
Feat/gather scatter (#367) 2023-05-27 11:40:04 -04:00
Nathaniel Simard ba0ac112e1
Refactor: Burn Import use BurnGraph (#355) 2023-05-21 13:09:27 -04:00
Nathaniel Simard 510d53f37e
Improve softmax doc (#352) 2023-05-16 14:39:42 -04:00
Nathaniel Simard 976102fec0
Feat/avg pool1d (#349) 2023-05-15 08:29:45 -04:00
Nathaniel Simard 8bbfbd8a8f
perf: softmax (#346) 2023-05-12 09:37:42 -04:00
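The standard trick behind a fast, safe softmax is to subtract the row maximum before exponentiating, which leaves the result unchanged but prevents overflow. A sketch of that formulation (the PR's actual optimization details are not shown here):

```python
import math

def softmax(xs):
    """Numerically stable softmax: exp(x - max) / sum(exp(x - max))."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([1000.0, 1000.0]))  # [0.5, 0.5] — no overflow despite huge inputs
```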
Dilshod Tadjibaev 05763e1878
Bump version to the next minor to indicate dev (#344) 2023-05-10 18:02:08 -04:00
Nathaniel Simard 73f99ef79f
Feat/maxmin numeric (#340) 2023-05-09 16:35:55 -04:00
Nathaniel Simard a88357ce1d
feat: add max & min ops (#339) 2023-05-09 15:49:33 -04:00
Nathaniel Simard 69001b0d69
Feat/activation ops (#338)
* perf: GELU

* Refactor relu
2023-05-09 08:32:35 -04:00
Nathaniel Simard 29eecd6383
Prepare next release (#335) 2023-05-06 10:32:23 -04:00
Nathaniel Simard b54f9302c7
Feat/avg pool2d (#318) 2023-04-30 12:25:14 -04:00
Sunny Gonnabathula 02abc373d3
add burn-tch support for bf16 (#303) 2023-04-24 11:24:36 -04:00
Nathaniel Simard c5e31b272f
Feat/group conv (#306) 2023-04-22 15:00:41 -04:00
Nathaniel Simard 78ac09fb7a
Support dilation in convolution operations (#301) 2023-04-18 10:01:11 -04:00
Nathaniel Simard bd58922784
Feat/conv stride (#300) 2023-04-16 22:03:16 -04:00
Yu Sun 26cf555612
Feat: add new tensor ops mask_scatter (#258) 2023-04-13 08:58:31 -04:00
Sunny Gonnabathula 69954c14ec
add bf16 element (#295) 2023-04-12 12:36:03 -04:00
Nathaniel Simard 04bcf9550a
Fix/text gen example (#292) 2023-04-11 17:18:45 -04:00
Nathaniel Simard 2220965b5c
Feat/add tensor checks (#283) 2023-04-10 19:16:15 -04:00
Nathaniel Simard f04fe101d8
Feat/module no grad (#274) 2023-04-07 09:01:27 -04:00
Mathias Insley d8f64ce1dd
Pretty Print Tensors (#257) 2023-04-06 20:06:38 -04:00
Dilshod Tadjibaev 4e9e6d2706
Move unsqueeze op to the tensor's base (#261) 2023-04-01 14:20:48 -04:00
Dilshod Tadjibaev 7364d09d32
Add flatten op to the tensor base (#260) 2023-03-31 16:44:48 -04:00
Nathaniel Simard 7d7504686a
Refactor/init modules (#250) 2023-03-24 15:38:02 -04:00
Nathaniel Simard a74e4cd0bc
fix: conv bias backward (#248) 2023-03-23 16:10:26 -04:00
Nathaniel Simard 6f43d983f7
State serialization/deserialization overhaul (#247) 2023-03-23 11:02:46 -04:00
nathaniel 00625d1527 fix: add version to path dependencies 2023-03-21 10:13:44 -04:00
Nathaniel Simard 4e28e2a776
chore: prepare release v0.6.0 (#246) 2023-03-21 09:47:37 -04:00
Dilshod Tadjibaev bf9d33e6fc
Add MNIST inference on the web demo crate (#228)
* Add MNIST inference on the web demo crate

* Fix problems identified during a PR review
2023-03-13 19:51:32 -04:00
Nathaniel Simard 403500f018
feat(burn-core): refactor cross entropy (#229) 2023-03-13 13:47:19 -04:00
Nathaniel Simard d09ab44979
Feat/index_select (#227) 2023-03-12 17:44:22 -04:00
Nathaniel Simard 9655b74b22
Feat/index_select_dim ops (#225) 2023-03-11 16:14:57 -05:00
Nathaniel Simard 860051ca5c
Perf/ndarray maxpool (#223) 2023-03-10 19:14:18 -05:00
Hariganesh Srinivasan 740b554047
Implemented sigmoid and log_sigmoid, Resolves #171 (#221)
* implemented sigmoid and log_sigmoid with tests

* added test for overflow and computation.
2023-03-10 19:03:01 -05:00
Nathaniel Simard 0544a915eb
Refactor/migrate more numeric func (#220) 2023-03-10 10:47:15 -05:00
Nathaniel Simard a2ec774c37
draft: Perf/ndarray matmul (#214) 2023-03-09 14:00:35 -05:00
Nathaniel Simard be96160065
refactor: elems (#206) 2023-03-06 18:39:49 -05:00
Nathaniel Simard 019c5f9c44
Refactor/int backend (#197)
* Update burn-tensor API

* Migrate burn-autodiff

* Update burn-tch

* Update burn-ndarray

* Add some doc
2023-03-06 14:45:58 -05:00
Nathaniel Simard 15ec42dd6f
Refactor/backend bool tensor (#192) 2023-03-05 11:23:46 -05:00
Nathaniel Simard ffd3d35176
Refactor/tensor api (#191) 2023-03-05 09:23:42 -05:00
Nathaniel Simard bddc1461ef
fix: running state (#190) 2023-03-01 17:33:32 -05:00
Nathaniel Simard d4c298c221
Refactor/burn core (#188) 2023-03-01 16:10:58 -05:00
Nathaniel Simard e6e7f4de42
feat: inplace tensor api. (#187) 2023-03-01 10:55:51 -05:00
Nathaniel Simard 25deb5a13b
Refactor/autodiff (#186) 2023-02-28 20:01:26 -05:00
Dilshod Tadjibaev fb925acc73
Make burn and burn-core packages no_std compatible (#168) (#173)
* Make burn-ndarray and burn-tensor no_std compatible (#168)
2023-02-25 09:38:01 -05:00
Dilshod Tadjibaev 9091363ada
Make burn-ndarray and burn-tensor no_std compatible (#168) (#169) 2023-02-21 08:35:24 -05:00
Nathaniel Simard 7d2f43dfca
Refactor Tensor API (#163) 2023-02-17 17:31:20 -05:00
Nathaniel Simard 2401d8ad96
Prepare next release (#161) 2023-02-12 15:32:29 -05:00
Yu Sun 0b85cb0eed
feat(trait-TensorOps): add log1p (#160) 2023-02-11 13:30:50 -05:00
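`log1p` computes `log(1 + x)` accurately for tiny `x`, where the naive form first rounds `1 + x` to `1.0` in floating point and loses the answer entirely. A quick demonstration with Python's own `math.log1p`:

```python
import math

x = 1e-18
naive = math.log(1.0 + x)  # 1.0 + 1e-18 rounds to exactly 1.0, so this is 0.0
accurate = math.log1p(x)   # ~1e-18, as expected
print(naive, accurate)
```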
Nathaniel Simard c7963d8485
refactor: device functions (#157) 2023-01-27 18:37:21 -05:00
Nathaniel Simard 2d4e514b41
Refactor/shape function (#156) 2023-01-27 15:18:55 -05:00
Makro f6f0d0e4f3
Add cos, sin and tanh operations (#155)
* Add cos, sin and tanh operations

* Add tests

* Fix formatting
2023-01-24 19:40:30 -05:00
Nathaniel Simard 34d233cd3e
Feat/max pooling backend (#152) 2023-01-21 15:39:21 -05:00
Nathaniel Simard 745c88f0a0
Feat/conv (#147) 2023-01-11 18:33:09 -05:00
Nathaniel Simard 2f179f12c9
Bump versions (#141) 2022-12-30 15:15:51 -05:00
Nathaniel Simard eea5a263bf
Feat/adam optimizer (#140) 2022-12-30 15:02:43 -05:00
Nathaniel Simard 248039da0a
Refactor/metric adaptor (#139) 2022-12-26 16:30:25 -05:00
Visual 567adfb93e
refactor: fix all clippy warnings (#137) 2022-12-25 11:22:25 -05:00
Visual 85f98b9d54
refactor, feat: clean Cargo.toml files, upgrade tch to 0.10 (#131)
* Clean Cargo.toml files, upgrade tch to 0.10

* Add pull_request hook to test.yml workflow
2022-12-25 10:36:23 -05:00
Nathaniel Simard 1ec35a9e1b
feat: from floats (#87) 2022-12-25 10:11:20 -05:00
Nathaniel Simard 3a9dfe6097
feat: cross entropy loss (#130) 2022-12-25 10:10:22 -05:00
Nathaniel Simard 1a1d86dc3e
refactor: save and load state (#129) 2022-12-24 13:02:37 -05:00
Nathaniel Simard 3a91c2c48e
Feat/multi device (#128) 2022-12-20 18:01:58 -05:00
Nathaniel Simard a599eaed88
Feat/module visitor (#127) 2022-12-17 14:25:36 -05:00
Nathaniel Simard d9592411c2
Feat/text generation example (#126) 2022-12-16 19:23:51 -05:00
Nathaniel Simard 63d8d39517
Feat/autoregressive transformer (#125) 2022-12-10 15:47:57 -05:00
Nathaniel Simard b99b23e1a7
refactor: backends (#124) 2022-12-02 19:28:34 -05:00
Nathaniel Simard 7c38a980c1
feat: improve bool tensor (#122) 2022-12-01 19:43:36 -05:00
Nathaniel Simard 8bd0b17296
Feat/mha (#118) 2022-11-26 15:48:26 -05:00
Nathaniel Simard 46d06f0c90
feat: module init (#117) 2022-11-25 22:02:26 -05:00
Nathaniel Simard acb14adc29
Fix/named tensor (#115) 2022-11-25 19:30:35 -05:00
Nathaniel Simard e0e787f87d
Experimental/named tensor (#113) 2022-11-23 19:05:46 -05:00
nathaniel a8e75f6164 fix/publish-crates-io 2022-11-20 13:08:54 -05:00
Nathaniel Simard 9ecd1be992
chore: get ready for next release (#111) 2022-11-20 12:59:10 -05:00
Nathaniel Simard ca94a9f105
refactor: autodiff gradients types (#107) 2022-11-19 19:43:49 -05:00
Nathaniel Simard dda067e79b
Refactor/backend autodiff (#106) 2022-11-19 12:37:06 -05:00
Nathaniel Simard d45d674a04
Refactor/backend ndarray (#105) 2022-11-18 20:37:38 -05:00
Nathaniel Simard 713f078602
refactor: burn tensor testgen (#104) 2022-11-16 21:02:32 -05:00
Nathaniel Simard ab51c22a55
Refactor/extract tch backend (#103) 2022-11-15 21:06:40 -05:00
Nathaniel Simard 23677b8e89
Refactor/zeros ones elems (#102) 2022-11-14 18:41:26 -05:00
Nathaniel Simard 1a45368878
refactor: relu ops (#101) 2022-11-12 13:28:45 -05:00
Nathaniel Simard da7a8e3f6a
refactor: cat ops (#100) 2022-11-12 13:02:10 -05:00
Nathaniel Simard ab39b8779b
refactor: erf ops (#99) 2022-11-12 12:27:31 -05:00
Nathaniel Simard ef01a4ed3f
refactor: pow ops (#98) 2022-11-12 12:06:53 -05:00
Nathaniel Simard 8c050c2904
refactor: exp + log ops (#97) 2022-11-12 11:50:47 -05:00
Nathaniel Simard 7684857282
refactor: args ops (#96) 2022-11-12 11:29:42 -05:00
Nathaniel Simard 0b77ef5dbc
refactor: precision ops (#95) 2022-11-12 11:13:47 -05:00
Nathaniel Simard 9d832a802a
refactor: aggregation ops (#94) 2022-11-12 10:23:00 -05:00
Nathaniel Simard e7094b92ac
refactor: detach-ops (#93) 2022-11-12 09:44:59 -05:00
Nathaniel Simard cba0db14db
refactor: comparison ops (#88) 2022-11-08 19:57:58 -05:00
Nathaniel Simard c5213b6c32
Fix: mnist example (#85) 2022-11-07 17:58:50 -05:00
Nathaniel Simard eee8bf4599
Refactor/mask fill (#74) 2022-11-05 22:09:40 -04:00
Nathaniel Simard 4111c46d6d
Refactor/index (#73) 2022-11-05 21:26:05 -04:00
Nathaniel Simard d369388036
Refactor/reshape (#72) 2022-11-05 20:49:44 -04:00
Nathaniel Simard e6541298b9
refactor: transpose (#71) 2022-11-05 20:18:31 -04:00
Nathaniel Simard ad23898d23
refactor: neg-ops (#70) 2022-11-05 17:19:04 -04:00
Nathaniel Simard 94b0283bac
refactor: matmul-ops (#69) 2022-11-05 16:29:52 -04:00
Nathaniel Simard 10d1c13c88
refactor/div-ops (#68) 2022-11-05 16:13:55 -04:00
Nathaniel Simard ee61e843a5
refactor/mul-ops (#67) 2022-11-05 14:21:52 -04:00
Nathaniel Simard 2bdad6fa00
refactor: sub ops (#66) 2022-11-05 10:00:52 -04:00
Nathaniel Simard 0f4c1e4c3e
refactor: add ops (#65) 2022-11-01 20:59:31 -04:00
Nathaniel Simard 0c4c657854
feat: repeat (#63) 2022-10-24 18:25:53 -04:00
Nathaniel Simard a78886d51e
Feat/arange (#62) 2022-10-23 10:32:44 -04:00
Nathaniel Simard 847243ddae
Feat/embedding (#61) 2022-10-23 10:18:35 -04:00
Nathaniel Simard b1df39e7fc
Refactor/device ops (#60) 2022-10-21 11:36:51 -04:00
Nathaniel Simard 0e1b0accd6
Refactor/tensor ops (#58) 2022-10-16 12:52:38 -04:00
Nathaniel Simard 72e44336b5
Refactor/ad backend decorator (#57) 2022-10-13 21:33:40 -04:00
nathaniel 31d512ed8f fix: dataset + optim 2022-10-10 11:09:09 -04:00
Nathaniel Simard f2f4fa8a92
Add swap dims (#56) 2022-10-08 11:54:53 -04:00