@@ -205,27 +215,26 @@ Previously, we had incomplete support for the keyword arguments `out` and `where` in
```python
->>> import mindspore.numpy as np
->>>
->>> a = np.ones((3,3))
->>> b = np.ones((3,3))
->>> out = np.zeros((3,3))
->>> where = np.asarray([[True, False, True],[False, False, True],[True, True, True]])
->>> res = np.add(a, b, out=out, where=where) # `out` cannot be used as a reference, therefore it is misleading
+@C.add_flags(has_effect=True)
+def construct(self, *inputs):
+ ...
+ loss = self.network(*inputs)
+ init = self.alloc_status()
+ self.clear_status(init)
+ ...
```
|
```python
->>> import mindspore.numpy as np
->>>
->>> a = np.ones((3,3))
->>> b = np.ones((3,3))
->>> out = np.zeros((3,3))
->>> where = np.asarray([[True, False, True],[False, False, True],[True, True, True]])
->>> res = np.add(a, b)
->>> out = np.where(where, x=res, y=out) # instead of np.add(a, b, out=out, where=where)
+def construct(self, *inputs):
+ ...
+ loss = self.network(*inputs)
+ init = self.alloc_status()
+ init = F.depend(init, loss)
+ clear_status = self.clear_status(init)
+ ...
```
|
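For context on the new pattern above, here is a minimal, self-contained sketch of the `F.depend`-based overflow-status flow. The wrapper cell and its name are illustrative assumptions, not part of this release; the float-status operators (`P.NPUAllocFloatStatus` and friends) are Ascend-only.

```python
import mindspore.nn as nn
from mindspore.ops import functional as F
from mindspore.ops import operations as P

class TrainStepWithOverflowCheck(nn.Cell):
    """Hypothetical wrapper showing the F.depend-based status pattern."""
    def __init__(self, network):
        super(TrainStepWithOverflowCheck, self).__init__()
        self.network = network
        # Float-status operators (assumption: Ascend backend).
        self.alloc_status = P.NPUAllocFloatStatus()
        self.clear_status = P.NPUClearFloatStatus()

    def construct(self, *inputs):
        loss = self.network(*inputs)
        init = self.alloc_status()
        # F.depend orders the status buffer after the forward pass, so
        # clearing cannot be reordered before the loss computation.
        init = F.depend(init, loss)
        clear_status = self.clear_status(init)
        # Make the returned loss depend on the clear as well, keeping the
        # whole status sequence in the graph without has_effect flags.
        loss = F.depend(loss, clear_status)
        return loss
```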
@@ -442,6 +451,12 @@ MSTensor::DestroyTensorPtr(tensor);
- Fix pending executor tasks not executing in some heterogeneous cases. ([!13465](https://gitee.com/mindspore/mindspore/pulls/13465))
- Add passes to support frontend IR unification, including the following operations: SliceGrad([!11783](https://gitee.com/mindspore/mindspore/pulls/11783)), ApplyFtrl, ApplyMomentum, ApplyRMSProp, CenteredRMSProp([!11895](https://gitee.com/mindspore/mindspore/pulls/11895)), AvgPoolGrad([!12813](https://gitee.com/mindspore/mindspore/pulls/12813)), BatchNorm([!12115](https://gitee.com/mindspore/mindspore/pulls/12115))
+#### Dataset
+
+- Fix getter functions (e.g. GetDatasetSize) terminating abnormally when using Python multiprocessing; see the usage sketch below. ([!13571](https://gitee.com/mindspore/mindspore/pulls/13571), [!13823](https://gitee.com/mindspore/mindspore/pulls/13823))
+- Fix unclear error logs from data augmentation operators. ([!12398](https://gitee.com/mindspore/mindspore/pulls/12398), [!12883](https://gitee.com/mindspore/mindspore/pulls/12883), [!13176](https://gitee.com/mindspore/mindspore/pulls/13176))
+- Fix profiling behaving abnormally when sink_size = False, as data saving happened later than profiling analysis. ([!13944](https://gitee.com/mindspore/mindspore/pulls/13944))
+
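A usage sketch for the getter fix above; the dataset path and the toy transform are placeholder assumptions, not part of the fix itself.

```python
import numpy as np
import mindspore.dataset as ds

def scale(img):
    # Trivial Python transform; python_multiprocessing=True runs it in
    # worker processes, the setup under which getters used to abort.
    return (img / 255.0).astype(np.float32)

# "path/to/cifar10" is a placeholder for a CIFAR-10 binary directory.
data = ds.Cifar10Dataset("path/to/cifar10")
data = data.map(operations=scale, input_columns=["image"],
                num_parallel_workers=4, python_multiprocessing=True)

# get_dataset_size (GetDatasetSize in the C++ layer) previously could
# terminate abnormally in this multiprocessing configuration.
print(data.get_dataset_size())
```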
## MindSpore Lite
### Major Features and Improvements
@@ -655,9 +670,6 @@ class Allocator;
1. Fix the bug that the array in the kernel registrar is not initialized.
2. Fix the segmentation fault caused by mistakenly releasing OpParameter in the Crop kernel.
3. Fix the bug that a MINDIR aware-training model is ultimately interpreted as a weight-quant model.
-4. Fix getter functions(e.g. GetDatasetSize) terminated abnormally when use python multi-processing. ([!13571](https://gitee.com/mindspore/mindspore/pulls/13571), [!13823](https://gitee.com/mindspore/mindspore/pulls/13823))
-5. Fix unclear error log of data augmentation operators. ([!12398](https://gitee.com/mindspore/mindspore/pulls/12398), [!12883](https://gitee.com/mindspore/mindspore/pulls/12883), [!13176](https://gitee.com/mindspore/mindspore/pulls/13176))
-6. Fix profiling performs abnormally when sink_size = False, as saving data is later than profiling analysis. ([!13944](https://gitee.com/mindspore/mindspore/pulls/13944))
## Contributors