!30033 Add Chinese statement for set_dump API

Merge pull request !30033 from maning202007/code_docs_dump_api
i-robot 2022-02-18 06:53:53 +00:00 committed by Gitee
commit 63e7c9e608
4 changed files with 42 additions and 19 deletions


@@ -124,8 +124,6 @@ mindspore
.. mscnautosummary::
:toctree: mindspore
:nosignatures:
:template: classtemplate.rst
mindspore.set_dump


@@ -0,0 +1,23 @@
mindspore.set_dump
==================

.. py:function:: mindspore.set_dump(target, enabled=True)

    Enable or disable dump for the target and its contents.

    target should be an instance of Cell or Primitive. Please note that this API takes effect only when Asynchronous Dump is enabled and the `dump_mode` field in the dump configuration file is "2". For details, see the `dump document <https://mindspore.cn/docs/programming_guide/zh-CN/master/dump_in_graph_mode.html>`_ . By default, dump is disabled for Cell and Primitive instances.

    .. warning::
        This is an experimental API that is subject to change or deletion.

    .. note::
        - This API is only effective in graph mode with the Ascend backend.
        - When target is a Cell and enabled is True, dump will be enabled recursively for the Primitives of the Cell instance and of its child Cell instances. If an operator is not a member of the Cell instance, dump will not be enabled for that operator, e.g. `functional operators <https://www.mindspore.cn/docs/api/zh-CN/master/api_python/mindspore.ops.html#functional>`_ used directly in the construct method. To make this API effective, write self.some_op = SomeOp() in the Cell's __init__ method.
        - After set_dump(Cell, True) is used, operators in the forward computation of the Cell will be dumped, while most backward computation (computation generated by gradient operations) will not. However, due to graph optimization, a few pieces of backward computation data will still be dumped; you can ignore backward computation data whose filenames contain "Gradients".
        - This API only supports being called before training starts. If it is called during training, it may not take effect.
        - For the `nn.SoftmaxCrossEntropyWithLogits layer <https://www.mindspore.cn/docs/api/zh-CN/master/api_python/nn/mindspore.nn.SoftmaxCrossEntropyWithLogits.html#mindspore.nn.SoftmaxCrossEntropyWithLogits>`_ , the forward computation and backward computation use the same set of operators, so you can only see dump data from the backward computation. Please note that the nn.SoftmaxCrossEntropyWithLogits layer will also use these operators internally when initialized with sparse=True and reduction="mean".

    **Parameters:**

    - **target** (Union[Cell, Primitive]) - The Cell or Primitive instance on which to set the dump flag.
    - **enabled** (bool) - True means enable dump, False means disable dump. Default: True.
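The doc above says set_dump only takes effect when Asynchronous Dump is enabled and the `dump_mode` field in the dump configuration file is "2". A minimal sketch of what such a configuration file might look like, assuming the schema described in the linked dump document (the path and net_name are placeholders; verify the exact field names and values against that document):

```json
{
    "common_dump_settings": {
        "dump_mode": 2,
        "path": "/tmp/async_dump",
        "net_name": "MyNet",
        "iteration": "all",
        "input_output": 0,
        "kernels": [],
        "support_device": [0, 1, 2, 3, 4, 5, 6, 7],
        "op_debug_mode": 0
    }
}
```

The config file path is typically supplied to MindSpore via the MINDSPORE_DUMP_CONFIG environment variable before the process starts; see the dump document linked above for the authoritative schema.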


@@ -1,4 +1,4 @@
# Copyright 2021 Huawei Technologies Co., Ltd
# Copyright 2021-2022 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -23,22 +23,22 @@ def set_dump(target, enabled=True):
"""
Enable or disable dump for the target and its contents.
Target should be an instance of Cell or Primitive. The default enabled
status for a cell or primitive is False. Please note that this API takes
effect only when the dump_mode field in dump config file is 2. See the
`dump document <https://mindspore.cn/docs/programming_guide/zh-CN/master/dump_in_graph_mode.html>`_
for details.
Target should be an instance of Cell or Primitive. Please note that this API takes
effect only when Asynchronous Dump is enabled and the dump_mode field in dump config file is 2. See the
`dump document <https://mindspore.cn/docs/programming_guide/en/master/dump_in_graph_mode.html>`_
for details. The default enabled status for a cell or primitive is False.
.. warning::
This is an experimental prototype that is subject to change or deletion.
Note:
1. This API is only effective for GRAPH_MODE with Ascend backend.
2. When target is a cell and enabled is True, this API will the enable
2. When target is a cell and enabled is True, this API will enable
dump for the primitive operator members of the cell instance and
its child cell instances recursively. If an operator is not a
member of the cell instance, the dump flag will not be set for
this operator (e.g. functional operators used directly in
this operator (e.g. `functional operators
<https://www.mindspore.cn/docs/api/en/master/api_python/mindspore.ops.html#functional>`_ used directly in
construct method). To make this API effective, please use
self.some_op = SomeOp() in your cell's __init__ method.
3. After using set_dump(cell, True), operators in forward computation
@@ -47,15 +47,16 @@ def set_dump(target, enabled=True):
However, due to the graph optimization, a few backward computation
data will still be dumped. You can ignore the backward computation
data which contains "Gradients" in their filenames.
4. This API is not designed to use in the middle of training process.
If you call this API in the middle of training, it only takes effect
for the later compiled graphs. If there is no new graph compilation,
you will see no effect.
5. For operator SparseSoftmaxCrossEntropyWithLogits, the forward
4. This API only supports being called before training starts.
If you call this API during training, it may not be effective.
5. For `nn.SparseSoftmaxCrossEntropyWithLogits
<https://www.mindspore.cn/docs/api/en/master/api_python/nn/
mindspore.nn.SoftmaxCrossEntropyWithLogits.html#mindspore.nn
.SoftmaxCrossEntropyWithLogits>`_ layer, the forward
computation and backward computation use the same set of
operators. So you can only see dump data from backward computation.
Please note that operator SoftmaxCrossEntropyWithLogits will also use
the above operator internally when initialized with sparse=True and
Please note that nn.SoftmaxCrossEntropyWithLogits layer will also use
the above operators internally when initialized with sparse=True and
reduction="mean".
Args:
@@ -93,7 +94,7 @@ def set_dump(target, enabled=True):
>>> net = MyNet()
>>> set_dump(net.conv1)
>>> input_tensor = Tensor(np.ones([1, 5, 10, 10], dtype=np.float32))
>>> net(input_tensor)
>>> output = net(input_tensor)
"""
if security.enable_security():
raise ValueError('The set_dump API is not supported, please recompile '


@@ -336,7 +336,8 @@ class SummaryRecord:
>>> from mindspore.train.summary import SummaryRecord
>>> if __name__ == '__main__':
... with SummaryRecord(log_dir="./summary_dir", file_prefix="xx_", file_suffix="_yy") as summary_record:
... summary_record.record(step=2)
... result = summary_record.record(step=2)
... print(result)
...
True
"""