Author | Commit | Message | Date
nathaniel | b0f0d7e378 | refactor: create Tensor + Element trait | 2022-07-27 10:20:21 -04:00
nathaniel | 122cd842a2 | refactor: move tensor base ops | 2022-07-27 10:05:38 -04:00
nathaniel | 6f45e878f1 | refactor: move zeros + ones trait in tensor ops | 2022-07-27 09:43:57 -04:00
nathaniel | 51cb331aef | refactor: imports | 2022-07-27 09:37:56 -04:00
nathaniel | 66cd4ccb9c | perf: optimize from bmatrix for ndarray | 2022-07-26 18:12:26 -04:00
nathaniel | 3ba6c69875 | feat: full support for ndarray | 2022-07-26 18:01:29 -04:00
nathaniel | 7dcd9d5507 | feat: wip add ndarray | 2022-07-26 15:51:11 -04:00
nathaniel | ec32fa730c | feat: implement more ops for ndarray backend | 2022-07-26 14:21:07 -04:00
nathaniel | b2f3c42376 | feat: wip ndarray backend | 2022-07-26 12:22:24 -04:00
nathaniel | c7dad3977b | feat: support higher order autodiff | 2022-07-26 10:14:17 -04:00
nathaniel | 61b67b44ff | doc: add a simple example | 2022-07-26 10:04:13 -04:00
nathaniel | b306156cc2 | feat: init tensor | 2022-07-26 09:45:13 -04:00
nathaniel | fe1b9b6972 | feat: support transpose un auto diff | 2022-07-26 07:59:24 -04:00
nathaniel | 9af7bde608 | feat: support neg in auto diff | 2022-07-26 07:52:24 -04:00
nathaniel | 445a9fbcbe | feat: support neg in auto diff | 2022-07-26 07:52:17 -04:00
nathaniel | 824e1a345f | feat: support index + index_assign with auto diff | 2022-07-25 22:48:37 -04:00
nathaniel | f939b5c775 | feat: support reshape in AD | 2022-07-25 20:55:13 -04:00
nathaniel | ae994f367a | feat: support half precision | 2022-07-25 19:52:40 -04:00
nathaniel | 0dac3f6bdf | refactor: ad element | 2022-07-25 19:40:14 -04:00
Nathaniel Simard | 3769b63305 | refactor: breadth first search graph traversal (#5) | 2022-07-25 19:10:48 -04:00
Nathaniel Simard | a226eabcc8 | Refactor/stateless forward node (#4) | 2022-07-25 17:59:25 -04:00
  * refactor: forward + backward graph
  * refactor: always use execute_ops for all ops
  * refactor: extract forward and backward nodes
  * feat: forward multithread safe
  * refactor: backward state
  * feat: backward multi-threaded
  * feat: remove multi-thread backward
nathaniel | f9cbcd4db4 | feat: only use gradients struct for getting grad | 2022-07-25 11:28:42 -04:00
nathaniel | e430a62795 | feat: build gradients struct when backward | 2022-07-25 11:18:26 -04:00
nathaniel | 230cd01ea1 | fix: backward order | 2022-07-21 07:39:47 -04:00
nathaniel | 2f7f65cea5 | fix: some autodiff wrong results | 2022-07-21 07:07:05 -04:00
Nathaniel Simard | 408210f6b6 | Refactor/no reference counting on state (#3) | 2022-07-20 16:56:26 -04:00
Nathaniel Simard | 5e6f0aea75 | fix: recorded ops and node creating wrong grad (#2) | 2022-07-20 09:57:15 -04:00
Nathaniel Simard | 8e80502a05 | refactor: create tape only during backprop (#1) | 2022-07-19 20:59:32 -04:00
nathaniel | 30fadcb4d1 | chore: remove old code | 2022-07-19 12:50:12 -04:00
nathaniel | 2084aced63 | feat: implement matmul diff | 2022-07-18 21:55:39 -04:00
nathaniel | 5ce657ded9 | feat: support sub autograd | 2022-07-18 20:23:21 -04:00
nathaniel | f241a6c114 | fix: single recorded ops | 2022-07-18 20:03:30 -04:00
nathaniel | 6ee628a748 | feat: implement TensorBase for ADTensor | 2022-07-18 19:31:00 -04:00
nathaniel | 902f431fc1 | First Commit :D | 2022-07-18 19:19:13 -04:00
Nathaniel Simard | ebea760c54 | Initial commit | 2022-07-18 19:11:45 -04:00