unify Conv's attr pad_list

yuchaojie 2021-01-19 22:10:55 +08:00
parent da92f1affb
commit 611a9a3654
16 changed files with 117 additions and 51 deletions


@@ -1,3 +1,47 @@
# MindSpore 1.1.1 Release Notes
## MindSpore
### API Change
#### Backwards Incompatible Change
##### Python API
###### `ops.AvgPool`, `ops.MaxPool`, `ops.MaxPoolWithArgmax` change attr name from 'ksize', 'padding' to 'kernel_size', 'pad_mode' ([!11350](https://gitee.com/mindspore/mindspore/pulls/11350))
Previously, the kernel size and pad mode attributes of the pooling operators were named "ksize" and "padding", which was confusing and inconsistent with the convolution operators, so they are renamed to "kernel_size" and "pad_mode".
<table>
<tr>
<td style="text-align:center"> 1.1.0 </td> <td style="text-align:center"> 1.1.1 </td>
</tr>
<tr>
<td>
```python
>>> from mindspore.ops import operations as P
>>>
>>> avg_pool = P.AvgPool(ksize=2, padding='same')
>>> max_pool = P.MaxPool(ksize=2, padding='same')
>>> max_pool_with_argmax = P.MaxPoolWithArgmax(ksize=2, padding='same')
```
</td>
<td>
```python
>>> from mindspore.ops import operations as P
>>>
>>> avg_pool = P.AvgPool(kernel_size=2, pad_mode='same')
>>> max_pool = P.MaxPool(kernel_size=2, pad_mode='same')
>>> max_pool_with_argmax = P.MaxPoolWithArgmax(kernel_size=2, pad_mode='same')
```
</td>
</tr>
</table>
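For call sites that cannot be migrated all at once, the rename can be bridged with a small keyword-remapping helper. This is a hypothetical sketch, not part of MindSpore: `remap_pool_kwargs` and `_RENAMED` are names invented here for illustration.

```python
import warnings

# Old 1.1.0 keyword names mapped onto the new 1.1.1 names.
_RENAMED = {"ksize": "kernel_size", "padding": "pad_mode"}

def remap_pool_kwargs(**kwargs):
    """Translate renamed pooling keywords, warning on each old name."""
    remapped = {}
    for name, value in kwargs.items():
        new_name = _RENAMED.get(name, name)
        if new_name != name:
            warnings.warn(f"'{name}' was renamed to '{new_name}' in 1.1.1",
                          DeprecationWarning)
        remapped[new_name] = value
    return remapped
```

A caller could then write `P.AvgPool(**remap_pool_kwargs(ksize=2, padding='same'))` during the transition and drop the shim once all scripts use the new names.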
# MindSpore 1.1.0 Release Notes
## MindSpore
@@ -12,7 +56,7 @@
- [STABLE] Openpose: proposes a bottom-up human pose estimation algorithm using Part Affinity Fields on COCO2017 dataset.(Ascend)
- [STABLE] CNN-CTC: proposes three major contributions to address scene text recognition (STR) on MJSynth and SynthText dataset.(Ascend)
- [STABLE] CenterFace: a practical anchor-free face detection and alignment method for edge devices on WiderFace dataset.(Ascend)
- [STABLE] ShuffleNetV2: a much faster and more accurate netowrk than the previous networks on ImageNet 2012 dataset.(GPU)
- [STABLE] ShuffleNetV2: a much faster and more accurate network than the previous networks on ImageNet 2012 dataset.(GPU)
- [STABLE] EfficientNet-B0: a new scaling method that uniformly scales all dimensions of depth/width/resolution using a simple yet highly effective compound coefficient on ImageNet 2012 dataset.(GPU)
- [BETA] SSD-GhostNet: based on a Ghost module structure which generates more features from cheap operations on Oxford-IIIT Pet dataset.(Ascend)
- [BETA] DS-CNN: Depthwise separable convolutional neural network on Speech commands dataset.(Ascend)
@@ -329,7 +373,7 @@ dtype is removed from GumbelCDF and is no longer an argument of the class.
###### `nn.layer.combined.Conv2dBnAct`, `nn.layer.combined.DenseBnAct` move from nn.layer.quant to nn.layer.combined ([!8187](https://gitee.com/mindspore/mindspore/pulls/8187))
Previously Conv2dBnAct and DenseBnAct are in nn.layer.quant, since they are not quant cells, now they are moved to nn.layer.combined. If you import Conv2dBnAct, DenseBnAct from mindspore.nn, then your code dosen't need any change.
Previously, Conv2dBnAct and DenseBnAct were in nn.layer.quant; since they are not quant cells, they are moved to nn.layer.combined. If you import Conv2dBnAct and DenseBnAct from mindspore.nn, your code doesn't need any change.
<table>
<tr>
@@ -414,7 +458,7 @@ In Ascend platform, if group > 1, the weight shape of Conv2D change from [in_cha
3. Add online graph optimization: by fusing Convolution/Matmul/Fullconnection with add/mul/pad/reshape, improve performance by up to 50%+ for some networks;
4. Add auto tuning: by online tuning in the graph compilation phase, optimize performance up to 10%;
5. Add weight quant: support weight quant
6. Add opencl kernel binary cache: improve Initilization time .
6. Add opencl kernel binary cache: improve initialization time.
#### Post quantization
@@ -461,7 +505,7 @@ The MindSpore Lite ToD framework is already in use in the newest Huawei Smart TV
##### Java API
- [Add] Implament JNI layer and add Java api for CPU and GPU backend
- [Add] Implement JNI layer and add Java API for CPU and GPU backend
#### Deprecations
@@ -686,7 +730,7 @@ Contributions of any kind are welcome!
- Python API
- improve interface '__bool__' for tensor([!4000](https://gitee.com/mindspore/mindspore/pulls/4000))
- fix GPU-ResizeNearestNeighbor([!3760](https://gitee.com/mindspore/mindspore/pulls/3760))
- fix topK multi dimention grad func([!3711](https://gitee.com/mindspore/mindspore/pulls/3711))
- fix topK multi dimension grad func([!3711](https://gitee.com/mindspore/mindspore/pulls/3711))
- fix scatterop error msg([!3699](https://gitee.com/mindspore/mindspore/pulls/3699))
- fix bug of cast dtype when using mixed precision in PyNative mode([!3730](https://gitee.com/mindspore/mindspore/pulls/3730))
- Executor
@@ -773,7 +817,7 @@ Contributions of any kind are welcome!
- Fix type check mistakes of InplaceAdd and InplaceSub ops([!2744](https://gitee.com/mindspore/mindspore/pulls/2744))
- Change order param only equal to group param([!2748](https://gitee.com/mindspore/mindspore/pulls/2748))
- Executor
- The performance of graph whith control flow is optimized([!2931](https://gitee.com/mindspore/mindspore/pulls/2931))
- The performance of graph with control flow is optimized([!2931](https://gitee.com/mindspore/mindspore/pulls/2931))
- Fix bug of wrong number of tuple layers([!3390](https://gitee.com/mindspore/mindspore/pulls/3390))
- Fix cpu multi graph memory exception([!3631](https://gitee.com/mindspore/mindspore/pulls/3631))
- Enable data sync when calling operator without defining a cell([!3081](https://gitee.com/mindspore/mindspore/pulls/3081))
@@ -968,7 +1012,7 @@ Contributions of any kind are welcome!
- Fix dropout, topK and addn errors in PyNative mode ([!1285](https://gitee.com/mindspore/mindspore/pulls/1285), [!1138](https://gitee.com/mindspore/mindspore/pulls/1138), [!1033](https://gitee.com/mindspore/mindspore/pulls/1033)).
- Fix memory leaks after execution in PyNative mode ([!1201](https://gitee.com/mindspore/mindspore/pulls/1201)).
- Fix HCCL failure in some special scenes ([!1204](https://gitee.com/mindspore/mindspore/pulls/1204), [!1252](https://gitee.com/mindspore/mindspore/pulls/1252)).
- Fix SSD network when Select failed, cann't find kernel info([!1449](https://gitee.com/mindspore/mindspore/pulls/1449)).
- Fix SSD network when Select failed, can't find kernel info([!1449](https://gitee.com/mindspore/mindspore/pulls/1449)).
- Fix Topk operator selection strategy bug between aicore and aicpu([!1367](https://gitee.com/mindspore/mindspore/pulls/1367)).
- Fix input memory size of 'assign' op unequal in control sink mode when assigning a data from one child graph to another child graph([!802](https://gitee.com/mindspore/mindspore/pulls/802)).
- Fix allreduce ir inconsistency([!989](https://gitee.com/mindspore/mindspore/pulls/989)).


@@ -301,9 +301,9 @@
{"op_name": "Fill", "inputs": [{"index": 0, "name": "dims", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "value", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [], "fusion_type": "ELEMWISE", "dtype_format": [[["int32", "DefaultFormat"], ["int32", "DefaultFormat"], ["int32", "DefaultFormat"]], [["int32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"]], [["int32", "DefaultFormat"], ["float16", "DefaultFormat"], ["int16", "DefaultFormat"]], [["int64", "DefaultFormat"], ["int32", "DefaultFormat"], ["int32", "DefaultFormat"]], [["int64", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"]], [["int64", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "fill.so", "compute_cost": 10, "kernel_name": "fill", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": true, "need_check_supported": false, "op_pattern": ""}
{"op_name": "Erf", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [], "fusion_type": "ELEMWISE", "dtype_format": [[["float16", ""], ["float16", ""]], [["float32", ""], ["float32", ""]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "erf.so", "compute_cost": 10, "kernel_name": "erf", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": "formatAgnostic"}
{"op_name": "Erfc", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [], "fusion_type": "ELEMWISE", "dtype_format": [[["float16", ""], ["float16", ""]], [["float32", ""], ["float32", ""]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "erfc.so", "compute_cost": 10, "kernel_name": "erfc", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": "formatAgnostic"}
{"op_name": "DepthwiseConv2dNative", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "bias", "need_compile": false, "param_type": "optional", "shape": "all"}, {"index": 3, "name": "offset_w", "need_compile": false, "param_type": "optional", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "stride", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilation", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pads", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "format", "param_type": "required", "type": "str", "value": "all"}, {"name": "offset_a", "param_type": "optional", "type": "int", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NC1HWC0"], ["float16", "C1HWNCoC0"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "NC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "depthwise_conv2d.so", "compute_cost": 10, "kernel_name": "depthwise_conv2d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "DepthwiseConv2dNativeBackpropFilter", "inputs": [{"index": 0, "name": "input", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "out_backprop", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "filter_grad", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [{"name": "filter_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "stride", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilation", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pads", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "format", "param_type": "required", "type": "str", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NC1HWC0"], ["float16", "NC1HWC0"], ["float32", "C1HWNCoC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "depthwise_conv2d_backprop_filter_d.so", "compute_cost": 10, "kernel_name": "depthwise_conv2d_backprop_filter_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "DepthwiseConv2dNativeBackpropInput", "inputs": [{"index": 0, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "out_backprop", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "input_grad", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [{"name": "input_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "stride", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilation", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pads", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "format", "param_type": "required", "type": "str", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "C1HWNCoC0"], ["float16", "NC1HWC0"], ["float16", "NC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "depthwise_conv2d_backprop_input_d.so", "compute_cost": 10, "kernel_name": "depthwise_conv2d_backprop_input_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "DepthwiseConv2dNative", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "bias", "need_compile": false, "param_type": "optional", "shape": "all"}, {"index": 3, "name": "offset_w", "need_compile": false, "param_type": "optional", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "stride", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilation", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pad_list", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "format", "param_type": "required", "type": "str", "value": "all"}, {"name": "offset_a", "param_type": "optional", "type": "int", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NC1HWC0"], ["float16", "C1HWNCoC0"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "NC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "depthwise_conv2d.so", "compute_cost": 10, "kernel_name": "depthwise_conv2d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "DepthwiseConv2dNativeBackpropFilter", "inputs": [{"index": 0, "name": "input", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "out_backprop", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "filter_grad", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [{"name": "filter_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "stride", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilation", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pad_list", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "format", "param_type": "required", "type": "str", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NC1HWC0"], ["float16", "NC1HWC0"], ["float32", "C1HWNCoC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "depthwise_conv2d_backprop_filter_d.so", "compute_cost": 10, "kernel_name": "depthwise_conv2d_backprop_filter_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "DepthwiseConv2dNativeBackpropInput", "inputs": [{"index": 0, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "out_backprop", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "input_grad", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [{"name": "input_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "stride", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilation", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pad_list", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "format", "param_type": "required", "type": "str", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "C1HWNCoC0"], ["float16", "NC1HWC0"], ["float16", "NC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "depthwise_conv2d_backprop_input_d.so", "compute_cost": 10, "kernel_name": "depthwise_conv2d_backprop_input_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "GreaterEqual", "inputs": [{"index": 0, "name": "x1", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "x2", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [], "fusion_type": "OPAQUE", "dtype_format": [[["int8", ""], ["int8", ""], ["bool", ""]], [["uint8", ""], ["uint8", ""], ["bool", ""]], [["int32", ""], ["int32", ""], ["bool", ""]], [["float16", ""], ["float16", ""], ["bool", ""]], [["float32", ""], ["float32", ""], ["bool", ""]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "greater_equal.so", "compute_cost": 10, "kernel_name": "greater_equal", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": "broadcast"}
{"op_name": "NotEqual", "inputs": [{"index": 0, "name": "x1", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "x2", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [], "fusion_type": "ELEMWISE", "dtype_format": [[["int8", ""], ["int8", ""], ["bool", ""]], [["uint8", ""], ["uint8", ""], ["bool", ""]], [["int32", ""], ["int32", ""], ["bool", ""]], [["float16", ""], ["float16", ""], ["bool", ""]], [["float32", ""], ["float32", ""], ["bool", ""]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "not_equal.so", "compute_cost": 10, "kernel_name": "not_equal", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": "broadcast"}
{"op_name": "FloorMod", "inputs": [{"index": 0, "name": "x1", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "x2", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [], "fusion_type": "ELEMWISE", "dtype_format": [[["float16", ""], ["float16", ""], ["float16", ""]], [["float32", ""], ["float32", ""], ["float32", ""]], [["int32", ""], ["int32", ""], ["int32", ""]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "floor_mod.so", "compute_cost": 10, "kernel_name": "floor_mod", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": "broadcast"}
@@ -440,10 +440,10 @@
{"op_name": "FakeQuantWithMinMaxVarsGradient", "inputs": [{"index": 0, "name": "gradients", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "min", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 3, "name": "max", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "backprops_wrt_x", "need_compile": true, "param_type": "required", "shape": "all"}, {"index": 1, "name": "backprops_wrt_min", "need_compile": true, "param_type": "required", "shape": "all"}, {"index": 2, "name": "backprops_wrt_max", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "num_bits", "param_type": "optional", "type": "int", "value": "all"}, {"name": "narrow_range", "param_type": "optional", "type": "bool", "value": "all"}], "fusion_type": "OPAQUE", "dtype_format": [[["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "fake_quant_with_min_max_vars_gradient.so", "compute_cost": 10, "kernel_name": "fake_quant_with_min_max_vars_gradient", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "FakeQuantWithMinMaxVarsPerChannel", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "min", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "max", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "num_bits", "param_type": "optional", "type": "int", "value": "all"}, {"name": "narrow_range", "param_type": "optional", "type": "bool", "value": "all"}], "fusion_type": "OPAQUE", "dtype_format": [[["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "fake_quant_with_min_max_vars_per_channel.so", "compute_cost": 10, "kernel_name": "fake_quant_with_min_max_vars_per_channel", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "FakeQuantWithMinMaxVarsPerChannelGradient", "inputs": [{"index": 0, "name": "gradients", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "min", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 3, "name": "max", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "backprops_wrt_x", "need_compile": true, "param_type": "required", "shape": "all"}, {"index": 1, "name": "backprops_wrt_min", "need_compile": true, "param_type": "required", "shape": "all"}, {"index": 2, "name": "backprops_wrt_max", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "num_bits", "param_type": "optional", "type": "int", "value": "all"}, {"name": "narrow_range", "param_type": "optional", "type": "bool", "value": "all"}], "fusion_type": "OPAQUE", "dtype_format": [[["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "fake_quant_with_min_max_vars_per_channel_gradient.so", "compute_cost": 10, "kernel_name": "fake_quant_with_min_max_vars_per_channel_gradient", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "Conv3D", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "bias", "need_compile": false, "param_type": "optional", "shape": "all"}, {"index": 3, "name": "offset_w", "need_compile": false, "param_type": "optional", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "strides", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pads", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilations", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "groups", "param_type": "optional", "type": "int", "value": "all"}, {"name": "format", "param_type": "optional", "type": "str", "value": "all"}, {"name": "offset_x", "param_type": "optional", "type": "int", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NDC1HWC0"], ["float16", "FRACTAL_Z_3D"], ["float16", "DefaultFormat"], ["int8", "DefaultFormat"], ["float16", "NDC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "conv3d.so", "compute_cost": 10, "kernel_name": "conv3d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "Conv3DBackpropInput", "inputs": [{"index": 0, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "out_backprop", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "input_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "strides", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pads", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilations", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "groups", "param_type": "optional", "type": "int", "value": "all"}, {"name": "format", "param_type": "optional", "type": "str", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "FRACTAL_Z_3D"], ["float16", "NDC1HWC0"], ["float16", "NDC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "conv3d_backprop_input_d.so", "compute_cost": 10, "kernel_name": "conv3d_backprop_input_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "Conv3DBackpropFilter", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "out_backprop", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "filter_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "strides", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pads", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilations", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "groups", "param_type": "optional", "type": "int", "value": "all"}, {"name": "format", "param_type": "optional", "type": "str", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NDC1HWC0"], ["float16", "NDC1HWC0"], ["float32", "FRACTAL_Z_3D"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "conv3d_backprop_filter_d.so", "compute_cost": 10, "kernel_name": "conv3d_backprop_filter_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "Conv3DTranspose", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "bias", "need_compile": false, "param_type": "optional", "shape": "all"}, {"index": 3, "name": "offset_w", "need_compile": false, "param_type": "optional", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "input_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "strides", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pads", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilations", "param_type": "optional", "type": "listInt", "value": "all"}, {"name": "groups", "param_type": "optional", "type": "int", "value": "all"}, {"name": "format", "param_type": "optional", "type": "str", "value": "all"}, {"name": "output_padding", "param_type": "optional", "type": "listInt", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NDC1HWC0"], ["float16", "FRACTAL_Z_3D"], ["float16", "DefaultFormat"], ["int8", "DefaultFormat"], ["float16", "NDC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "conv3d_transpose_d.so", "compute_cost": 10, "kernel_name": "conv3d_transpose_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "Conv3D", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "bias", "need_compile": false, "param_type": "optional", "shape": "all"}, {"index": 3, "name": "offset_w", "need_compile": false, "param_type": "optional", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "strides", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pad_list", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilations", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "groups", "param_type": "optional", "type": "int", "value": "all"}, {"name": "format", "param_type": "optional", "type": "str", "value": "all"}, {"name": "offset_x", "param_type": "optional", "type": "int", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NDC1HWC0"], ["float16", "FRACTAL_Z_3D"], ["float16", "DefaultFormat"], ["int8", "DefaultFormat"], ["float16", "NDC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "conv3d.so", "compute_cost": 10, "kernel_name": "conv3d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "Conv3DBackpropInput", "inputs": [{"index": 0, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "out_backprop", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "input_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "strides", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pad_list", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilations", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "groups", "param_type": "optional", "type": "int", "value": "all"}, {"name": "format", "param_type": "optional", "type": "str", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "FRACTAL_Z_3D"], ["float16", "NDC1HWC0"], ["float16", "NDC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "conv3d_backprop_input_d.so", "compute_cost": 10, "kernel_name": "conv3d_backprop_input_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "Conv3DBackpropFilter", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "out_backprop", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "filter_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "strides", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pad_list", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilations", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "groups", "param_type": "optional", "type": "int", "value": "all"}, {"name": "format", "param_type": "optional", "type": "str", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NDC1HWC0"], ["float16", "NDC1HWC0"], ["float32", "FRACTAL_Z_3D"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "conv3d_backprop_filter_d.so", "compute_cost": 10, "kernel_name": "conv3d_backprop_filter_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "Conv3DTranspose", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "filter", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "bias", "need_compile": false, "param_type": "optional", "shape": "all"}, {"index": 3, "name": "offset_w", "need_compile": false, "param_type": "optional", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": true, "param_type": "required", "shape": "all"}], "attr": [{"name": "input_size", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "strides", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "pad_list", "param_type": "required", "type": "listInt", "value": "all"}, {"name": "dilations", "param_type": "optional", "type": "listInt", "value": "all"}, {"name": "groups", "param_type": "optional", "type": "int", "value": "all"}, {"name": "format", "param_type": "optional", "type": "str", "value": "all"}, {"name": "output_padding", "param_type": "optional", "type": "listInt", "value": "all"}], "fusion_type": "CONVLUTION", "dtype_format": [[["float16", "NDC1HWC0"], ["float16", "FRACTAL_Z_3D"], ["float16", "DefaultFormat"], ["int8", "DefaultFormat"], ["float16", "NDC1HWC0"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "conv3d_transpose_d.so", "compute_cost": 10, "kernel_name": "conv3d_transpose_d", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "LambApplyOptimizerAssign", "inputs": [{"index": 0, "name": "grad", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "inputv", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "inputm", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 3, "name": "input3", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 4, "name": "mul0_x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 5, "name": "mul1_x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 6, "name": "mul2_x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 7, "name": "mul3_x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 8, "name": "add2_y", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 9, "name": "steps", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 10, "name": "do_use_weight", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 11, "name": "weight_decay_rate", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "output0", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 0, "name": "inputv", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 0, "name": "inputm", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [], "fusion_type": "ELEMWISE", "dtype_format": [[["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", 
"DefaultFormat"], ["float16", "DefaultFormat"]], [["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "lamb_apply_optimizer_assign.so", "compute_cost": 10, "kernel_name": "lamb_apply_optimizer_assign", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "LambApplyWeightAssign", "inputs": [{"index": 0, "name": "input0", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "input1", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "input2", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 3, "name": "input3", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 4, "name": "input_param", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "input_param", "need_compile": false, "param_type": "required", "shape": "all"}], "attr": [], "fusion_type": "ELEMWISE", "dtype_format": [[["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"], ["float16", "DefaultFormat"]], [["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "lamb_apply_weight_assign.so", "compute_cost": 10, "kernel_name": "lamb_apply_weight_assign", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}
{"op_name": "NLLLoss", "inputs": [{"index": 0, "name": "x", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "target", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 2, "name": "weight", "need_compile": false, "param_type": "required", "shape": "all"}], "outputs": [{"index": 0, "name": "y", "need_compile": false, "param_type": "required", "shape": "all"}, {"index": 1, "name": "total_weight", "need_compile": false, "param_type": "optional", "shape": "all"}], "attr": [{"name": "reduction", "param_type": "optional", "type": "str", "value": "all"}], "fusion_type": "OPAQUE", "dtype_format": [[["float32", "DefaultFormat"], ["int32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"], ["float32", "DefaultFormat"]]], "imply_type": "TBE", "async_flag": false, "binfile_name": "nll_loss.so", "compute_cost": 10, "kernel_name": "nll_loss", "partial_flag": true, "reshape_type": "", "dynamic_format": false, "dynamic_shape": false, "need_check_supported": false, "op_pattern": ""}

View File

@ -34,7 +34,6 @@ constexpr size_t kConv2DBackpropInputNum = 4;
constexpr size_t kConv2DAxisNum = 4;
constexpr auto kAttrOffsetA = "offset_a";
constexpr auto kAttrPadList = "pad_list";
constexpr auto kAttrPads = "pads";
constexpr auto kAttrMode = "mode";
constexpr auto kAttrChannelMultiplier = "channel_multiplier";
constexpr auto kAttrPerm = "perm";
@ -200,9 +199,8 @@ void SetCommonAttrs(const CNodePtr &conv2d, const CNodePtr &depth_conv) {
AnfAlgo::CopyNodeAttr(kAttrKernelSize, conv2d, depth_conv);
AnfAlgo::CopyNodeAttr(kAttrDilation, conv2d, depth_conv);
AnfAlgo::CopyNodeAttr(kAttrFormat, conv2d, depth_conv);
AnfAlgo::CopyNodeAttr(kAttrPadList, kAttrPads, conv2d, depth_conv);
AnfAlgo::CopyNodeAttr(kAttrPadList, conv2d, depth_conv);
AnfAlgo::CopyNodeAttr(kAttrPadMode, conv2d, depth_conv);
AnfAlgo::CopyNodeAttr(kAttrPad, conv2d, depth_conv);
AnfAlgo::SetNodeAttr(kAttrMode, MakeValue(3), depth_conv);
AnfAlgo::SetNodeAttr(kAttrChannelMultiplier, MakeValue(1), depth_conv);
}

View File

@ -843,7 +843,7 @@ void OnnxExporter::ExportPrimDepthwiseConv2d(const FuncGraphPtr & /*func_graph*/
onnx_attr_proto->set_s("SAME_UPPER");
} else {
onnx_attr_proto->set_name("pads");
SetAttrTupleValueToProto(prim->GetAttr("pads"), onnx::AttributeProto_AttributeType_INTS, onnx_attr_proto, prim);
SetAttrTupleValueToProto(prim->GetAttr("pad_list"), onnx::AttributeProto_AttributeType_INTS, onnx_attr_proto, prim);
}
// set strides
onnx_attr_proto = node_proto->add_attribute();

View File

@ -68,7 +68,7 @@ REG_ADPT_DESC(Conv2DBackpropFilterD, prim::kPrimConv2DBackpropFilter->name(), AD
INPUT_MAP(DepthwiseConv2D) = {{1, INPUT_DESC(x)}, {2, INPUT_DESC(filter)}, {3, INPUT_DESC(bias)}};
ATTR_MAP(DepthwiseConv2D) = {
{"stride", ATTR_DESC(strides, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"pads", ATTR_DESC(pads, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"pad_list", ATTR_DESC(pads, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"dilation", ATTR_DESC(dilations, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"format", ATTR_DESC(data_format, AnyTraits<std::string>())},
};
@ -81,7 +81,7 @@ INPUT_ATTR_MAP(DepthwiseConv2DBackpropInputD) = {
{1, ATTR_DESC(input_size, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())}};
ATTR_MAP(DepthwiseConv2DBackpropInputD) = {
{"stride", ATTR_DESC(strides, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"pads", ATTR_DESC(pads, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"pad_list", ATTR_DESC(pads, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"dilation", ATTR_DESC(dilations, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
};
OUTPUT_MAP(DepthwiseConv2DBackpropInputD) = {{0, OUTPUT_DESC(input_grad)}};
@ -94,7 +94,7 @@ INPUT_ATTR_MAP(DepthwiseConv2DBackpropFilterD) = {
{2, ATTR_DESC(filter_size, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())}};
ATTR_MAP(DepthwiseConv2DBackpropFilterD) = {
{"stride", ATTR_DESC(strides, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"pads", ATTR_DESC(pads, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"pad_list", ATTR_DESC(pads, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
{"dilation", ATTR_DESC(dilations, AnyTraits<std::vector<int64_t>>(), AnyTraits<std::vector<int64_t>>())},
};
OUTPUT_MAP(DepthwiseConv2DBackpropFilterD) = {{0, OUTPUT_DESC(filter_grad)}};

View File

@ -320,7 +320,7 @@ AbstractBasePtr InferImplConv2D(const AnalysisEnginePtr &, const PrimitivePtr &p
int64_t c_axis = 1;
int64_t h_axis = 2;
int64_t w_axis = 3;
auto data_format_ptr = primitive->GetAttr("data_format");
auto data_format_ptr = primitive->GetAttr("format");
std::string data_format = "NCHW";
if ((data_format_ptr != nullptr) && data_format_ptr->isa<StringImm>()) {
data_format = data_format_ptr->cast<StringImmPtr>()->value();

View File

@ -166,11 +166,11 @@ def get_bprop_extract_image_patches(self):
def get_bprop_depthwise_conv2d_native(self):
"""Grad definition for `DepthwiseConv2dNative` operation."""
input_grad = G.DepthwiseConv2dNativeBackpropInput(
self.channel_multiplier, self.kernel_size, self.pad_mode, self.pad, self.pads, self.mode, self.stride,
self.channel_multiplier, self.kernel_size, self.pad_mode, self.pad, self.pad_list, self.mode, self.stride,
self.dilation, self.group
)
filter_grad = G.DepthwiseConv2dNativeBackpropFilter(
self.channel_multiplier, self.kernel_size, self.pad_mode, self.pad, self.pads, self.mode, self.stride,
self.channel_multiplier, self.kernel_size, self.pad_mode, self.pad, self.pad_list, self.mode, self.stride,
self.dilation, self.group
)
get_shape = P.Shape()
@ -280,7 +280,7 @@ def _get_mean_matrix(x_shape, ksize, stride, pad_mode, x_dtype):
the value of a padded element is 0, otherwise it is 1.
Each output element is mapped to the sliding window `[h*h_stride : h*h_stride + h_ksize,
w*w_stride : w*w_stride + w_ksize]` of `assist_input_matrix`, so the sum over the sliding window is the
number of input that assosiate with output element.
number of inputs associated with the output element.
"""
n_input, c_input, h_input, w_input = x_shape

View File

@ -24,7 +24,7 @@ conv3d_op_info = TBERegOp("Conv3D") \
.kernel_name("conv3d") \
.partial_flag(True) \
.attr("strides", "required", "listInt", "all") \
.attr("pads", "required", "listInt", "all") \
.attr("pad_list", "required", "listInt", "all") \
.attr("dilations", "required", "listInt", "all") \
.attr("groups", "optional", "int", "all") \
.attr("format", "optional", "str", "all") \

View File

@ -25,7 +25,7 @@ conv3d_backprop_filter_op_info = TBERegOp("Conv3DBackpropFilter") \
.partial_flag(True) \
.attr("filter_size", "required", "listInt", "all") \
.attr("strides", "required", "listInt", "all") \
.attr("pads", "required", "listInt", "all") \
.attr("pad_list", "required", "listInt", "all") \
.attr("dilations", "required", "listInt", "all") \
.attr("groups", "optional", "int", "all") \
.attr("format", "optional", "str", "all") \

View File

@ -25,7 +25,7 @@ conv3d_backprop_input_op_info = TBERegOp("Conv3DBackpropInput") \
.partial_flag(True) \
.attr("input_size", "required", "listInt", "all") \
.attr("strides", "required", "listInt", "all") \
.attr("pads", "required", "listInt", "all") \
.attr("pad_list", "required", "listInt", "all") \
.attr("dilations", "required", "listInt", "all") \
.attr("groups", "optional", "int", "all") \
.attr("format", "optional", "str", "all") \

View File

@ -25,7 +25,7 @@ conv3d_transpose_op_info = TBERegOp("Conv3DTranspose") \
.partial_flag(True) \
.attr("input_size", "required", "listInt", "all") \
.attr("strides", "required", "listInt", "all") \
.attr("pads", "required", "listInt", "all") \
.attr("pad_list", "required", "listInt", "all") \
.attr("dilations", "optional", "listInt", "all") \
.attr("groups", "optional", "int", "all") \
.attr("format", "optional", "str", "all") \

View File

@ -25,7 +25,7 @@ depthwise_conv2d_op_info = TBERegOp("DepthwiseConv2dNative") \
.partial_flag(True) \
.attr("stride", "required", "listInt", "all") \
.attr("dilation", "required", "listInt", "all") \
.attr("pads", "required", "listInt", "all") \
.attr("pad_list", "required", "listInt", "all") \
.attr("format", "required", "str", "all") \
.attr("offset_a", "optional", "int", "all") \
.input(0, "x", False, "required", "all") \

View File

@ -26,7 +26,7 @@ depthwise_conv2d_backprop_filter_op_info = TBERegOp("DepthwiseConv2dNativeBackpr
.attr("filter_size", "required", "listInt", "all") \
.attr("stride", "required", "listInt", "all") \
.attr("dilation", "required", "listInt", "all") \
.attr("pads", "required", "listInt", "all") \
.attr("pad_list", "required", "listInt", "all") \
.attr("format", "required", "str", "all") \
.input(0, "input", False, "required", "all") \
.input(1, "out_backprop", False, "required", "all") \

View File

@ -26,7 +26,7 @@ depthwise_conv2d_backprop_input_op_info = TBERegOp("DepthwiseConv2dNativeBackpro
.attr("input_size", "required", "listInt", "all") \
.attr("stride", "required", "listInt", "all") \
.attr("dilation", "required", "listInt", "all") \
.attr("pads", "required", "listInt", "all") \
.attr("pad_list", "required", "listInt", "all") \
.attr("format", "required", "str", "all") \
.input(0, "filter", False, "required", "all") \
.input(1, "out_backprop", False, "required", "all") \

View File

@ -354,8 +354,9 @@ class Conv3DBackpropFilter(PrimitiveWithInfer):
if isinstance(pad, int):
pad = (pad,) * 6
validator.check_equal_int(len(pad), 6, 'pad size', self.name)
self.add_prim_attr('pad', self.pad)
self.pad_list = pad
self.add_prim_attr('pads', self.pad_list)
self.add_prim_attr('pad_list', self.pad_list)
self.pad_mode = validator.check_string(pad_mode.lower(), ['valid', 'same', 'pad'], 'pad_mode', self.name)
if self.pad_mode != 'pad' and self.pad_list != (0, 0, 0, 0, 0, 0):
@ -415,7 +416,7 @@ class Conv3DBackpropFilter(PrimitiveWithInfer):
pad_right = pad_needed_w - pad_left
self.pad_list = (pad_head, pad_tail, pad_top, pad_bottom, pad_left, pad_right)
self.add_prim_attr('pads', self.pad_list)
self.add_prim_attr('pad_list', self.pad_list)
out = {
'value': None,
'shape': w_size_v,
@ -432,7 +433,10 @@ class Conv2DBackpropFilter(PrimitiveWithInfer):
out_channel (int): The dimensionality of the output space.
kernel_size (Union[int, tuple[int]]): The size of the convolution window.
pad_mode (str): Modes to fill padding. It could be "valid", "same", or "pad". Default: "valid".
pad (int): The pad value to be filled. Default: 0.
pad (Union(int, tuple[int])): The pad value to be filled. Default: 0. If `pad` is an integer, the paddings of
top, bottom, left and right are the same, equal to pad. If `pad` is a tuple of four integers, the
padding of top, bottom, left and right equal to pad[0], pad[1], pad[2], and pad[3] correspondingly.
pad_list (tuple): The pad list like (top, bottom, left, right). Default: (0, 0, 0, 0).
mode (int): Modes for different convolutions. 0 Math convolution, 1 cross-correlation convolution,
2 deconvolution, 3 depthwise convolution. Default: 1.
stride (tuple): The stride to be applied to the convolution filter. Default: (1, 1).
@ -464,7 +468,11 @@ class Conv2DBackpropFilter(PrimitiveWithInfer):
self.mode = mode
pad_mode = pad_mode.upper()
self.add_prim_attr('pad_mode', pad_mode)
self.pad = pad
if isinstance(pad, int):
pad = (pad,) * 4
else:
validator.check_equal_int(len(pad), 4, 'pad size', self.name)
self.add_prim_attr("pad", pad)
if isinstance(stride, tuple) and len(stride) == 4:
self.stride = (stride[2], stride[3])
self.add_prim_attr('stride', self.stride)
@ -506,8 +514,10 @@ class DepthwiseConv2dNativeBackpropFilter(PrimitiveWithInfer):
mode (int): Modes for different convolutions. 0 Math convolution, 1 cross-correlation convolution,
2 deconvolution, 3 depthwise convolution. Default: 3.
pad_mode (str): The mode to fill padding which can be: "valid", "same" or "pad". Default: "valid".
pad (int): The pad value to be filled. Default: 0.
pads (tuple): The pad list like (top, bottom, left, right). Default: (0, 0, 0, 0).
pad (Union(int, tuple[int])): The pad value to be filled. Default: 0. If `pad` is an integer, the paddings of
top, bottom, left and right are the same, equal to pad. If `pad` is a tuple of four integers, the
padding of top, bottom, left and right equal to pad[0], pad[1], pad[2], and pad[3] correspondingly.
pad_list (tuple): The pad list like (top, bottom, left, right). Default: (0, 0, 0, 0).
stride (int): The stride to be applied to the convolution filter. Default: 1.
dilation (int): Specifies the space to use between kernel elements. Default: 1.
group (int): Splits input into groups. Default: 1.
@ -522,7 +532,7 @@ class DepthwiseConv2dNativeBackpropFilter(PrimitiveWithInfer):
kernel_size,
pad_mode="valid",
pad=0,
pads=(0, 0, 0, 0),
pad_list=(0, 0, 0, 0),
mode=3,
stride=1,
dilation=1,
@ -533,8 +543,12 @@ class DepthwiseConv2dNativeBackpropFilter(PrimitiveWithInfer):
self.kernel_size = kernel_size
self.mode = mode
self.pad_mode = pad_mode
self.pad = pad
self.pads = pads
if isinstance(pad, int):
pad = (pad,) * 4
else:
validator.check_equal_int(len(pad), 4, 'pad size', self.name)
self.add_prim_attr("pad", pad)
self.pad_list = pad_list
self.stride = stride
self.dilation = dilation
self.group = group
@ -567,8 +581,10 @@ class DepthwiseConv2dNativeBackpropInput(PrimitiveWithInfer):
mode (int): Modes for different convolutions. 0 Math convolution, 1 cross-correlation convolution,
2 deconvolution, 3 depthwise convolution. Default: 3.
pad_mode (str): Modes to fill padding. It could be "valid", "same", or "pad". Default: "valid".
pad (int): The pad value to be filled. Default: 0.
pads (tuple): The pad list like (top, bottom, left, right). Default: (0, 0, 0, 0).
pad (Union(int, tuple[int])): The pad value to be filled. Default: 0. If `pad` is an integer, the paddings of
top, bottom, left and right are the same, equal to pad. If `pad` is a tuple of four integers, the
padding of top, bottom, left and right equal to pad[0], pad[1], pad[2], and pad[3] correspondingly.
pad_list (tuple): The pad list like (top, bottom, left, right). Default: (0, 0, 0, 0).
stride (int): The stride to be applied to the convolution filter. Default: 1.
dilation (int): Specifies the space to use between kernel elements. Default: 1.
group (int): Splits input into groups. Default: 1.
@ -583,7 +599,7 @@ class DepthwiseConv2dNativeBackpropInput(PrimitiveWithInfer):
kernel_size,
pad_mode="valid",
pad=0,
pads=(0, 0, 0, 0),
pad_list=(0, 0, 0, 0),
mode=3,
stride=1,
dilation=1,
@ -594,8 +610,12 @@ class DepthwiseConv2dNativeBackpropInput(PrimitiveWithInfer):
self.kernel_size = kernel_size
self.mode = mode
self.pad_mode = pad_mode
self.pad = pad
self.pads = pads
if isinstance(pad, int):
pad = (pad,) * 4
else:
validator.check_equal_int(len(pad), 4, 'pad size', self.name)
self.add_prim_attr("pad", pad)
self.pad_list = pad_list
self.stride = stride
self.dilation = dilation
self.group = group

View File

@ -1295,6 +1295,7 @@ class Conv2D(PrimitiveWithCheck):
pad = (pad,) * 4
else:
validator.check_equal_int(len(pad), 4, 'pad size', self.name)
self.add_prim_attr("pad", pad)
self.padding = pad
self.pad_mode = validator.check_string(pad_mode, ['valid', 'same', 'pad'], 'pad_mode', self.name)
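The `isinstance(pad, int)` pattern repeated in these `__init__` hunks normalizes a scalar `pad` into the 4-tuple stored via `add_prim_attr("pad", pad)`. A standalone sketch, assuming a hypothetical helper name `normalize_pad`:

```python
# Hypothetical standalone version (normalize_pad is my name, not the source's)
# of the scalar-to-4-tuple normalization done in each __init__ above before
# add_prim_attr("pad", pad).
def normalize_pad(pad, name="pad"):
    if isinstance(pad, int):
        if pad < 0:
            raise ValueError(f"{name} must be non-negative, got {pad}")
        return (pad,) * 4              # same padding on top/bottom/left/right
    if not isinstance(pad, tuple) or len(pad) != 4:
        raise ValueError(f"{name} must be an int or a 4-tuple, got {pad!r}")
    return pad                         # already (top, bottom, left, right)
```

For instance, `normalize_pad(2)` yields `(2, 2, 2, 2)`, while a 4-tuple passes through unchanged.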
@ -1400,6 +1401,7 @@ class DepthwiseConv2dNative(PrimitiveWithInfer):
pad = (pad,) * 4
else:
validator.check_equal_int(len(pad), 4, 'pad size', self.name)
self.add_prim_attr("pad", pad)
self.padding = pad
self.pad_mode = validator.check_string(pad_mode, ['valid', 'same', 'pad'], 'pad_mode', self.name)
if pad_mode != 'pad' and pad != (0, 0, 0, 0):
@ -1450,7 +1452,7 @@ class DepthwiseConv2dNative(PrimitiveWithInfer):
w_out = math.floor(w_out)
self.pad_list = (pad_top, pad_bottom, pad_left, pad_right)
self.add_prim_attr('pads', self.pad_list)
self.add_prim_attr('pad_list', self.pad_list)
out_channel = self.channel_multiplier * x_shape[1]
out_shape = [x_shape[0], out_channel, h_out, w_out]
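The shape inference above floors `h_out`/`w_out` and records the resulting `pad_list`. A minimal sketch of the 'pad'-mode output-shape arithmetic for `DepthwiseConv2dNative` (the function `depthwise_out_shape` is illustrative, not part of the source; it assumes a square kernel and uniform stride/dilation):

```python
import math

# Illustrative only: 'pad'-mode output shape for DepthwiseConv2dNative with a
# square kernel and uniform stride/dilation; pad_list is (top, bottom, left, right).
def depthwise_out_shape(x_shape, kernel, stride, dilation, pad_list, channel_multiplier):
    n, c, h, w = x_shape               # NCHW input
    top, bottom, left, right = pad_list
    # out = floor((x + pad_begin + pad_end - dilation*(kernel - 1) - 1) / stride) + 1
    h_out = math.floor((h + top + bottom - dilation * (kernel - 1) - 1) / stride) + 1
    w_out = math.floor((w + left + right - dilation * (kernel - 1) - 1) / stride) + 1
    return [n, channel_multiplier * c, h_out, w_out]
```

With a 3x3 kernel, unit stride/dilation, and one cell of padding per side, a 5x5 input keeps its spatial size while the channel count is multiplied by `channel_multiplier`.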
@ -1817,12 +1819,12 @@ class Conv2DBackpropInput(PrimitiveWithInfer):
self.add_prim_attr('stride', self.stride)
self.dilation = _check_positive_int_or_tuple('dilation', dilation, self.name, allow_four=True, ret_four=True)
self.add_prim_attr('dilation', self.dilation)
validator.check_value_type('pad', pad, (int, tuple), self.name)
if isinstance(pad, int):
pad = (pad,) * 4
else:
validator.check_equal_int(len(pad), 4, 'pad size', self.name)
self.add_prim_attr("pad", pad)
self.padding = pad
self.pad_mode = validator.check_string(pad_mode, ['valid', 'same', 'pad'], 'pad_mode', self.name)
if pad_mode != 'pad' and pad != (0, 0, 0, 0):
@ -7039,6 +7041,7 @@ class Conv3D(PrimitiveWithInfer):
if isinstance(pad, int):
pad = (pad,) * 6
validator.check_equal_int(len(pad), 6, 'pad size', self.name)
self.add_prim_attr("pad", pad)
self.padding = pad
validator.check_int_range(self.padding[0], 0, kernel_size[0], Rel.INC_LEFT,
'pad_d belonging [0, kernel_size_d)', self.name)
@ -7128,7 +7131,7 @@ class Conv3D(PrimitiveWithInfer):
w_out = math.floor(w_out)
self.pad_list = [pad_head, pad_tail, pad_top, pad_bottom, pad_left, pad_right]
self.add_prim_attr('pads', (pad_head, pad_tail, pad_top, pad_bottom, pad_left, pad_right))
self.add_prim_attr('pad_list', (pad_head, pad_tail, pad_top, pad_bottom, pad_left, pad_right))
out_channel = self.out_channel
out_shape = [x_shape[0], out_channel, d_out, h_out, w_out]
_check_shape('output', out_shape, self.name)
@ -7206,6 +7209,7 @@ class Conv3DBackpropInput(PrimitiveWithInfer):
if isinstance(pad, int):
pad = (pad,) * 6
validator.check_equal_int(len(pad), 6, 'pad size', self.name)
self.add_prim_attr("pad", pad)
self.pad_list = pad
self.pad_mode = validator.check_string(pad_mode.lower(), ['valid', 'same', 'pad'], 'pad_mode', self.name)
@ -7264,7 +7268,7 @@ class Conv3DBackpropInput(PrimitiveWithInfer):
pad_right = pad_needed_w - pad_left
self.pad_list = (pad_head, pad_tail, pad_top, pad_bottom, pad_left, pad_right)
self.add_prim_attr('pads', self.pad_list)
self.add_prim_attr('pad_list', self.pad_list)
out = {
'value': None,
'shape': x_size_v,
@ -7385,7 +7389,7 @@ class Conv3DTranspose(PrimitiveWithInfer):
# infer shape
x_shape = x['shape']
w_shape = w['shape']
self.add_prim_attr('pads', self.pad_list)
self.add_prim_attr('pad_list', self.pad_list)
pad_head, pad_tail, pad_top, pad_bottom, pad_left, pad_right = self.pad_list
d_out = (x_shape[2] - 1) * self.stride[2] - (pad_head + pad_tail) + self.dilation[2] * \
(self.kernel_size[0] - 1) + self.output_padding[2] + 1
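The `d_out` expression above is the standard transposed-convolution output-size relation. A hedged one-dimension sketch (the helper `conv_transpose_out` is my own naming, not from the source):

```python
# Hedged sketch: one spatial dimension of the Conv3DTranspose output,
# matching the d_out line above.
def conv_transpose_out(x, stride, pad_begin, pad_end, dilation, kernel, output_padding):
    return (x - 1) * stride - (pad_begin + pad_end) + dilation * (kernel - 1) + output_padding + 1
```

For example, `x=4`, `stride=2`, no padding, `dilation=1`, `kernel=3`, `output_padding=0` gives `(4-1)*2 + 1*2 + 1 = 9`.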