move Metrics from mindspore.nn to mindspore.train

lvyufeng 2022-09-18 08:43:37 +08:00
parent 2b2edc30f4
commit 208147d55b
90 changed files with 372 additions and 280 deletions
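A minimal migration sketch (not part of the diff) showing how caller code changes after this move; it mirrors the Accuracy example updated later in this commit and assumes the new top-level re-exports from mindspore.train:

# Before this commit: from mindspore.nn import Accuracy
# After this commit: metrics live under mindspore.train / mindspore.train.metrics.
import numpy as np
import mindspore
from mindspore import Tensor
from mindspore.train import Accuracy

x = Tensor(np.array([[0.2, 0.5], [0.3, 0.1], [0.9, 0.6]]), mindspore.float32)
y = Tensor(np.array([1, 0, 1]), mindspore.float32)
metric = Accuracy('classification')
metric.clear()
metric.update(x, y)
print(metric.eval())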

View File

@ -263,42 +263,6 @@ Dropout Layer
mindspore.nn.SGD
mindspore.nn.thor
Evaluation Metrics
------------------
.. mscnplatformautosummary::
:toctree: nn
:nosignatures:
:template: classtemplate.rst
mindspore.nn.Accuracy
mindspore.nn.auc
mindspore.nn.BleuScore
mindspore.nn.ConfusionMatrix
mindspore.nn.ConfusionMatrixMetric
mindspore.nn.CosineSimilarity
mindspore.nn.Dice
mindspore.nn.F1
mindspore.nn.Fbeta
mindspore.nn.HausdorffDistance
mindspore.nn.get_metric_fn
mindspore.nn.Loss
mindspore.nn.MAE
mindspore.nn.MeanSurfaceDistance
mindspore.nn.Metric
mindspore.nn.MSE
mindspore.nn.names
mindspore.nn.OcclusionSensitivity
mindspore.nn.Perplexity
mindspore.nn.Precision
mindspore.nn.Recall
mindspore.nn.ROC
mindspore.nn.RootMeanSquareDistance
mindspore.nn.rearrange_inputs
mindspore.nn.Top1CategoricalAccuracy
mindspore.nn.Top5CategoricalAccuracy
mindspore.nn.TopKCategoricalAccuracy
Dynamic Learning Rate
---------------------

View File

@ -26,3 +26,48 @@ mindspore.train
mindspore.train.ReduceLROnPlateau
mindspore.train.RunContext
mindspore.train.TimeMonitor
Evaluation Metrics
------------------
.. mscnplatformautosummary::
:toctree: mindspore
:nosignatures:
:template: classtemplate.rst
mindspore.train.Accuracy
mindspore.train.BleuScore
mindspore.train.ConfusionMatrix
mindspore.train.ConfusionMatrixMetric
mindspore.train.CosineSimilarity
mindspore.train.Dice
mindspore.train.F1
mindspore.train.Fbeta
mindspore.train.HausdorffDistance
mindspore.train.Loss
mindspore.train.MAE
mindspore.train.MeanSurfaceDistance
mindspore.train.Metric
mindspore.train.MSE
mindspore.train.OcclusionSensitivity
mindspore.train.Perplexity
mindspore.train.Precision
mindspore.train.Recall
mindspore.train.ROC
mindspore.train.RootMeanSquareDistance
mindspore.train.Top1CategoricalAccuracy
mindspore.train.Top5CategoricalAccuracy
mindspore.train.TopKCategoricalAccuracy
Utils
-----
.. mscnplatformautosummary::
:toctree: mindspore
:nosignatures:
:template: classtemplate.rst
mindspore.train.auc
mindspore.train.get_metric_fn
mindspore.train.names
mindspore.train.rearrange_inputs

View File

@ -27,7 +27,7 @@ mindspore.SummaryLandscape
- mindspore.Model: User's model object.
- mindspore.nn.Cell: User's network object.
- mindspore.dataset: User's dataset object for creating the loss landscape.
- mindspore.nn.Metrics: User's metrics object.
- mindspore.train.Metrics: User's metrics object.
- **collect_landscape** (Union[dict, None]) - The meaning of the parameters for creating the loss landscape is consistent with the fields of the same name in SummaryCollector. Setting them here allows users to freely modify the loss-landscape creation parameters. Default: None.

View File

@ -1,7 +1,7 @@
mindspore.nn.Accuracy
=====================
mindspore.train.Accuracy
=========================
.. py:class:: mindspore.nn.Accuracy(eval_type='classification')
.. py:class:: mindspore.train.Accuracy(eval_type='classification')
Calculates the accuracy of data classification, including binary and multiclass classification.

View File

@ -1,7 +1,7 @@
mindspore.nn.BleuScore
======================
mindspore.train.BleuScore
==========================
.. py:class:: mindspore.nn.BleuScore(n_gram=4, smooth=False)
.. py:class:: mindspore.train.BleuScore(n_gram=4, smooth=False)
Calculates the BLEU score. BLEU is a metric for machine-translated text evaluated against one or more references.

View File

@ -1,11 +1,11 @@
mindspore.nn.ConfusionMatrix
============================
mindspore.train.ConfusionMatrix
================================
.. py:class:: mindspore.nn.ConfusionMatrix(num_classes, normalize='no_norm', threshold=0.5)
.. py:class:: mindspore.train.ConfusionMatrix(num_classes, normalize='no_norm', threshold=0.5)
Calculates the confusion matrix, which is commonly used to evaluate the performance of classification models, including binary and multiclass scenarios.
If you only need the confusion matrix, use this class. If you want to calculate other metrics, such as 'PPV', 'TPR', 'TNR', etc., use the class 'mindspore.nn.ConfusionMatrixMetric'.
If you only need the confusion matrix, use this class. If you want to calculate other metrics, such as 'PPV', 'TPR', 'TNR', etc., use the class 'mindspore.train.ConfusionMatrixMetric'.
Parameters:
- **num_classes** (int) - Number of classes in the dataset.

View File

@ -1,7 +1,7 @@
mindspore.nn.ConfusionMatrixMetric
==================================
mindspore.train.ConfusionMatrixMetric
======================================
.. py:class:: mindspore.nn.ConfusionMatrixMetric(skip_channel=True, metric_name='sensitivity', calculation_method=False, decrease='mean')
.. py:class:: mindspore.train.ConfusionMatrixMetric(skip_channel=True, metric_name='sensitivity', calculation_method=False, decrease='mean')
Calculates metrics related to the confusion matrix.
@ -9,7 +9,7 @@ mindspore.nn.ConfusionMatrixMetric
This function supports calculating all of the metric names listed in the description of the parameter metric_name.
If you want to use the confusion matrix to calculate metrics such as 'PPV', 'TPR', 'TNR', use this class.
If you only want to calculate the confusion matrix, use 'mindspore.nn.ConfusionMatrix'.
If you only want to calculate the confusion matrix, use 'mindspore.train.ConfusionMatrix'.
Parameters:
- **skip_channel** (bool) - Whether to skip the metric calculation on the first channel of the predicted output. Default: True.

View File

@ -1,7 +1,7 @@
mindspore.nn.Dice
==================
mindspore.train.Dice
=====================
.. py:class:: mindspore.nn.Dice(smooth=1e-5)
.. py:class:: mindspore.train.Dice(smooth=1e-5)
A measure of set similarity.

View File

@ -1,10 +1,10 @@
mindspore.nn.F1
mindspore.train.F1
=====================
.. py:class:: mindspore.nn.F1
.. py:class:: mindspore.train.F1
Calculates the F1 score. F1 is a special case of Fbeta when beta is 1.
Refer to class :class:`mindspore.nn.Fbeta` for more details.
Refer to class :class:`mindspore.train.Fbeta` for more details.
.. math::
F_1=\frac{2\cdot true\_positive}{2\cdot true\_positive + false\_negative + false\_positive}

View File

@ -1,7 +1,7 @@
mindspore.nn.Fbeta
==================
mindspore.train.Fbeta
======================
.. py:class:: mindspore.nn.Fbeta(beta)
.. py:class:: mindspore.train.Fbeta(beta)
Calculates the Fbeta score.

View File

@ -1,7 +1,7 @@
mindspore.nn.HausdorffDistance
mindspore.train.HausdorffDistance
============================================
.. py:class:: mindspore.nn.HausdorffDistance(distance_metric='euclidean', percentile=None, directed=False, crop=True)
.. py:class:: mindspore.train.HausdorffDistance(distance_metric='euclidean', percentile=None, directed=False, crop=True)
Calculates the Hausdorff distance. The Hausdorff distance is the maximum of the minimum distances between two point sets, and measures the greatest degree of mismatch between them.
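For reference, the symmetric Hausdorff distance between point sets :math:`X` and :math:`Y` described above can be written with the following standard formula (shown here for context, not part of the diff):

.. math::
    HD(X, Y) = \max\left\{\sup_{x \in X}\inf_{y \in Y} d(x, y),\ \sup_{y \in Y}\inf_{x \in X} d(x, y)\right\}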

View File

@ -1,7 +1,7 @@
mindspore.nn.Loss
=================
mindspore.train.Loss
====================
.. py:class:: mindspore.nn.Loss
.. py:class:: mindspore.train.Loss
Calculates the average value of the loss. If the `update` method is called once every :math:`n` iterations, the result of evaluation is:

View File

@ -1,7 +1,7 @@
mindspore.nn.MAE
================
mindspore.train.MAE
====================
.. py:class:: mindspore.nn.MAE
.. py:class:: mindspore.train.MAE
Calculates the Mean Absolute Error (MAE).

View File

@ -1,7 +1,7 @@
mindspore.nn.MSE
================
mindspore.train.MSE
====================
.. py:class:: mindspore.nn.MSE
.. py:class:: mindspore.train.MSE
Measures the Mean Squared Error (MSE).

View File

@ -1,7 +1,7 @@
mindspore.nn.MeanSurfaceDistance
mindspore.train.MeanSurfaceDistance
===============================================
.. py:class:: mindspore.nn.MeanSurfaceDistance(symmetric=False, distance_metric='euclidean')
.. py:class:: mindspore.train.MeanSurfaceDistance(symmetric=False, distance_metric='euclidean')
Calculates the average surface distance from `y_pred` to `y`. It is typically used in segmentation tasks to measure the difference between the prediction and the ground truth.

View File

@ -1,12 +1,12 @@
mindspore.nn.Metric
====================
mindspore.train.Metric
=======================
.. py:class:: mindspore.nn.Metric
.. py:class:: mindspore.train.Metric
Base class for computing evaluation metrics.
Computing an evaluation metric requires calling the three methods `clear`, `update` and `eval`; when inheriting this class to customize a metric, these three methods must also be implemented. Among them, `update` calculates the intermediate internal results, `eval` calculates the final evaluation result, and `clear` resets the intermediate results.
Never use this class directly; instantiate one of its subclasses instead, such as :class:`mindspore.nn.MAE` or :class:`mindspore.nn.Recall`.
Never use this class directly; instantiate one of its subclasses instead, such as :class:`mindspore.train.MAE` or :class:`mindspore.train.Recall`.
.. py:method:: clear()
:abstractmethod:
@ -36,7 +36,7 @@ mindspore.nn.Metric
Given (label0, label1, logits) as the inputs of `update`, setting `indexes` to [2, 1] means that (logits, label1) is used as the actual input of `update`.
.. note::
When inheriting this class to customize an evaluation function, decorate the `update` method with the decorator `mindspore.nn.rearrange_inputs`; otherwise the configured `indexes` value does not take effect.
When inheriting this class to customize an evaluation function, decorate the `update` method with the decorator `mindspore.train.rearrange_inputs`; otherwise the configured `indexes` value does not take effect.
Parameters:
- **indexes** (List(int)) - The target order of logits and labels.
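A minimal sketch (not from this commit) of a custom metric written against the moved API, assuming the `mindspore.train.Metric` base class, the `rearrange_inputs` decorator, and the inherited `_convert_data` helper behave as described above:

import numpy as np
from mindspore.train import Metric, rearrange_inputs

class MeanError(Metric):
    """Toy metric: mean absolute difference between predictions and labels."""
    def __init__(self):
        super().__init__()
        self.clear()

    def clear(self):
        # Reset intermediate results.
        self._abs_error_sum = 0.0
        self._samples_num = 0

    @rearrange_inputs  # makes set_indexes take effect on these inputs
    def update(self, *inputs):
        # _convert_data (inherited helper, assumed) turns Tensor inputs into numpy arrays.
        y_pred = self._convert_data(inputs[0])
        y = self._convert_data(inputs[1])
        self._abs_error_sum += np.abs(y_pred - y).sum()
        self._samples_num += y.shape[0]

    def eval(self):
        if self._samples_num == 0:
            raise RuntimeError('MeanError must be updated before being evaluated.')
        return self._abs_error_sum / self._samples_num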

View File

@ -1,7 +1,7 @@
mindspore.nn.OcclusionSensitivity
mindspore.train.OcclusionSensitivity
=============================================
.. py:class:: mindspore.nn.OcclusionSensitivity(pad_val=0.0, margin=2, n_batch=128, b_box=None)
.. py:class:: mindspore.train.OcclusionSensitivity(pad_val=0.0, margin=2, n_batch=128, b_box=None)
Calculates the occlusion sensitivity of the neural network for a given image. Occlusion sensitivity indicates which parts of the image are most important for the network's classification decision.

View File

@ -1,7 +1,7 @@
mindspore.nn.Perplexity
mindspore.train.Perplexity
===========================
.. py:class:: mindspore.nn.Perplexity(ignore_label=None)
.. py:class:: mindspore.train.Perplexity(ignore_label=None)
Calculates the perplexity. Perplexity is a measure of how well a probability distribution or language model predicts a sample; a low perplexity indicates that the language model predicts the sample well. It is calculated as follows:
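The formula itself lies outside this hunk; for reference, a standard formulation of perplexity over :math:`N` predicted tokens (an assumption, not taken from the diff) is:

.. math::
    PP(W) = \exp\left(-\frac{1}{N}\sum_{i=1}^{N} \log p(w_i)\right)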

View File

@ -1,7 +1,7 @@
mindspore.nn.Precision
======================
mindspore.train.Precision
==========================
.. py:class:: mindspore.nn.Precision(eval_type='classification')
.. py:class:: mindspore.train.Precision(eval_type='classification')
Calculates the precision of data classification, including single-label and multilabel scenarios.

View File

@ -1,7 +1,7 @@
mindspore.nn.Recall
=====================
mindspore.train.Recall
=======================
.. py:class:: mindspore.nn.Recall(eval_type='classification')
.. py:class:: mindspore.train.Recall(eval_type='classification')
Calculates the recall of data classification, including single-label and multilabel scenarios.

View File

@ -1,7 +1,7 @@
mindspore.nn.RootMeanSquareDistance
======================================
mindspore.train.RootMeanSquareDistance
=======================================
.. py:class:: mindspore.nn.RootMeanSquareDistance(symmetric=False, distance_metric='euclidean')
.. py:class:: mindspore.train.RootMeanSquareDistance(symmetric=False, distance_metric='euclidean')
Calculates the root-mean-square surface distance from `y_pred` to `y`.

View File

@ -1,6 +1,6 @@
mindspore.nn.Top1CategoricalAccuracy
====================================
mindspore.train.Top1CategoricalAccuracy
========================================
.. py:class:: mindspore.nn.Top1CategoricalAccuracy
.. py:class:: mindspore.train.Top1CategoricalAccuracy
Calculates the top-1 categorical accuracy. This class is a special case of TopKCategoricalAccuracy. Refer to :class:`.TopKCategoricalAccuracy` for more details.

View File

@ -1,6 +1,6 @@
mindspore.nn.Top5CategoricalAccuracy
=====================================
mindspore.train.Top5CategoricalAccuracy
========================================
.. py:class:: mindspore.nn.Top5CategoricalAccuracy
.. py:class:: mindspore.train.Top5CategoricalAccuracy
Calculates the top-5 categorical accuracy. This class is a special case of TopKCategoricalAccuracy. Refer to :class:`.TopKCategoricalAccuracy` for more details.

View File

@ -1,7 +1,7 @@
mindspore.nn.TopKCategoricalAccuracy
====================================
mindspore.train.TopKCategoricalAccuracy
========================================
.. py:class:: mindspore.nn.TopKCategoricalAccuracy(k)
.. py:class:: mindspore.train.TopKCategoricalAccuracy(k)
Calculates the top-k categorical accuracy.

View File

@ -1,7 +1,7 @@
mindspore.nn.auc
================
mindspore.train.auc
====================
.. py:function:: mindspore.nn.auc(x, y, reorder=False)
.. py:function:: mindspore.train.auc(x, y, reorder=False)
Computes the Area Under the Curve (AUC) using the trapezoidal rule. This is a general function that, given points on a curve,
is used to compute the area under the ROC (Receiver Operating Characteristic) curve.

View File

@ -1,12 +1,12 @@
mindspore.nn.get_metric_fn
===========================
mindspore.train.get_metric_fn
==============================
.. py:function:: mindspore.nn.get_metric_fn(name, *args, **kwargs)
.. py:function:: mindspore.train.get_metric_fn(name, *args, **kwargs)
Gets the metric method based on the input `name`.
Parameters:
- **name** (str) - The name of the metric method; the names can be obtained through the :class:`mindspore.nn.names` interface.
- **name** (str) - The name of the metric method; the names can be obtained through the :class:`mindspore.train.names` interface.
- **args** - Arguments for the metric function.
- **kwargs** - Keyword arguments for the metric function.
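A short usage sketch (not part of the diff); 'accuracy' is assumed to be one of the registered names returned by `names()`:

from mindspore.train import names, get_metric_fn

print(names())  # registered metric names, e.g. 'accuracy', 'mae', 'recall' (assumed)
metric = get_metric_fn('accuracy', eval_type='classification')  # constructed like Accuracy(eval_type='classification')
metric.clear()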

View File

@ -1,7 +1,7 @@
mindspore.nn.names
==================
mindspore.train.names
======================
.. py:function:: mindspore.nn.names()
.. py:function:: mindspore.train.names()
Gets the names of all metrics.

View File

@ -1,11 +1,11 @@
mindspore.nn.rearrange_inputs
==============================
mindspore.train.rearrange_inputs
=================================
.. py:function:: mindspore.nn.rearrange_inputs(func)
.. py:function:: mindspore.train.rearrange_inputs(func)
This decorator is used to rearrange the inputs according to the `indexes` attribute of the class.
This decorator is currently applied to the `update` method of :class:`mindspore.nn.Metric`.
This decorator is currently applied to the `update` method of :class:`mindspore.train.Metric`.
Parameters:
- **func** (Callable) - The candidate function to be decorated, whose inputs will be rearranged.

View File

@ -263,42 +263,6 @@ Optimizer
mindspore.nn.SGD
mindspore.nn.thor
Evaluation Metrics
------------------
.. msplatformautosummary::
:toctree: nn
:nosignatures:
:template: classtemplate.rst
mindspore.nn.Accuracy
mindspore.nn.auc
mindspore.nn.BleuScore
mindspore.nn.ConfusionMatrix
mindspore.nn.ConfusionMatrixMetric
mindspore.nn.CosineSimilarity
mindspore.nn.Dice
mindspore.nn.F1
mindspore.nn.Fbeta
mindspore.nn.HausdorffDistance
mindspore.nn.get_metric_fn
mindspore.nn.Loss
mindspore.nn.MAE
mindspore.nn.MeanSurfaceDistance
mindspore.nn.Metric
mindspore.nn.MSE
mindspore.nn.names
mindspore.nn.OcclusionSensitivity
mindspore.nn.Perplexity
mindspore.nn.Precision
mindspore.nn.Recall
mindspore.nn.ROC
mindspore.nn.RootMeanSquareDistance
mindspore.nn.rearrange_inputs
mindspore.nn.Top1CategoricalAccuracy
mindspore.nn.Top5CategoricalAccuracy
mindspore.nn.TopKCategoricalAccuracy
Dynamic Learning Rate
---------------------

View File

@ -30,3 +30,48 @@ Callback
mindspore.train.ReduceLROnPlateau
mindspore.train.RunContext
mindspore.train.TimeMonitor
Evaluation Metrics
------------------
.. msplatformautosummary::
:toctree: mindspore
:nosignatures:
:template: classtemplate.rst
mindspore.train.Accuracy
mindspore.train.BleuScore
mindspore.train.ConfusionMatrix
mindspore.train.ConfusionMatrixMetric
mindspore.train.CosineSimilarity
mindspore.train.Dice
mindspore.train.F1
mindspore.train.Fbeta
mindspore.train.HausdorffDistance
mindspore.train.Loss
mindspore.train.MAE
mindspore.train.MeanSurfaceDistance
mindspore.train.Metric
mindspore.train.MSE
mindspore.train.OcclusionSensitivity
mindspore.train.Perplexity
mindspore.train.Precision
mindspore.train.Recall
mindspore.train.ROC
mindspore.train.RootMeanSquareDistance
mindspore.train.Top1CategoricalAccuracy
mindspore.train.Top5CategoricalAccuracy
mindspore.train.TopKCategoricalAccuracy
Utils
-----
.. msplatformautosummary::
:toctree: mindspore
:nosignatures:
:template: classtemplate.rst
mindspore.train.auc
mindspore.train.get_metric_fn
mindspore.train.names
mindspore.train.rearrange_inputs

View File

@ -19,7 +19,7 @@ Pre-defined building blocks or computing units to construct neural networks.
"""
from __future__ import absolute_import
from mindspore.nn import layer, loss, optim, metrics, wrap, grad, probability, sparse, dynamic_lr, reinforcement
from mindspore.nn import layer, loss, optim, wrap, grad, metrics, probability, sparse, dynamic_lr, reinforcement
from mindspore.nn.learning_rate_schedule import *
from mindspore.nn.dynamic_lr import *
from mindspore.nn.cell import Cell, GraphCell

View File

@ -0,0 +1,53 @@
# Copyright 2020 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""
Metrics from mindspore.train.metrics
"""
from mindspore.train.metrics import Accuracy, HausdorffDistance, MAE, MSE, Metric, \
rearrange_inputs, Precision, Recall, Fbeta, F1, Dice, ROC, auc, \
TopKCategoricalAccuracy, Top1CategoricalAccuracy, Top5CategoricalAccuracy, Loss, \
MeanSurfaceDistance, RootMeanSquareDistance, BleuScore, CosineSimilarity, \
OcclusionSensitivity, Perplexity, ConfusionMatrixMetric, ConfusionMatrix, \
names, get_metric_fn, get_metrics
__all__ = [
"names",
"get_metric_fn",
"get_metrics",
"Accuracy",
"MAE", "MSE",
"Metric", "rearrange_inputs",
"Precision",
"HausdorffDistance",
"Recall",
"Fbeta",
"BleuScore",
"CosineSimilarity",
"OcclusionSensitivity",
"F1",
"Dice",
"ROC",
"auc",
"TopKCategoricalAccuracy",
"Top1CategoricalAccuracy",
"Top5CategoricalAccuracy",
"Loss",
"MeanSurfaceDistance",
"RootMeanSquareDistance",
"Perplexity",
"ConfusionMatrix",
"ConfusionMatrixMetric",
]
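With this shim in place, the old import path is expected to resolve to the same classes as the new location; a minimal check (a sketch, not part of the commit):

from mindspore.nn.metrics import Accuracy as NnAccuracy
from mindspore.train.metrics import Accuracy as TrainAccuracy

# The nn path is a re-export of the train path, not a copy.
assert NnAccuracy is TrainAccuracy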

View File

@ -32,6 +32,7 @@ from mindspore.train.callback import Callback, LossMonitor, TimeMonitor, ModelCh
History, LambdaCallback, ReduceLROnPlateau, EarlyStopping
from mindspore.train.summary import SummaryRecord
from mindspore.train.train_thor import ConvertNetUtils, ConvertModelUtils
from mindspore.train.metrics import *
__all__ = ["Model", "DatasetHelper", "amp", "connect_network_with_dataset", "build_train_network", "LossScaleManager",
"FixedLossScaleManager", "DynamicLossScaleManager", "save_checkpoint", "load_checkpoint",
@ -40,3 +41,4 @@ __all__ = ["Model", "DatasetHelper", "amp", "connect_network_with_dataset", "bui
__all__.extend(callback.__all__)
__all__.extend(summary.__all__)
__all__.extend(train_thor.__all__)
__all__.extend(metrics.__all__)

View File

@ -38,7 +38,7 @@ from mindspore.train.summary.enums import PluginEnum
from mindspore.train.anf_ir_pb2 import DataType
from mindspore.train._utils import check_value_type, _make_directory
from mindspore.train.dataset_helper import DatasetHelper
from mindspore.nn.metrics import get_metrics
from mindspore.train.metrics import get_metrics
from mindspore import context
# if there is no path, you need to set to empty list
@ -265,7 +265,7 @@ class SummaryLandscape:
- mindspore.Model: User's model object.
- mindspore.nn.Cell: User's network object.
- mindspore.dataset: User's dataset object for create loss landscape.
- mindspore.nn.Metrics: User's metrics object.
- mindspore.train.Metrics: User's metrics object.
collect_landscape (Union[dict, None]): The meaning of the parameters
when creating loss landscape is consistent with the fields
with the same name in SummaryCollector. The purpose of setting here

View File

@ -20,25 +20,25 @@ on the evaluation dataset. It's used to choose the best model.
"""
from __future__ import absolute_import
from mindspore.nn.metrics.accuracy import Accuracy
from mindspore.nn.metrics.hausdorff_distance import HausdorffDistance
from mindspore.nn.metrics.error import MAE, MSE
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.nn.metrics.precision import Precision
from mindspore.nn.metrics.recall import Recall
from mindspore.nn.metrics.fbeta import Fbeta, F1
from mindspore.nn.metrics.dice import Dice
from mindspore.nn.metrics.roc import ROC
from mindspore.nn.metrics.auc import auc
from mindspore.nn.metrics.topk import TopKCategoricalAccuracy, Top1CategoricalAccuracy, Top5CategoricalAccuracy
from mindspore.nn.metrics.loss import Loss
from mindspore.nn.metrics.mean_surface_distance import MeanSurfaceDistance
from mindspore.nn.metrics.root_mean_square_surface_distance import RootMeanSquareDistance
from mindspore.nn.metrics.bleu_score import BleuScore
from mindspore.nn.metrics.cosine_similarity import CosineSimilarity
from mindspore.nn.metrics.occlusion_sensitivity import OcclusionSensitivity
from mindspore.nn.metrics.perplexity import Perplexity
from mindspore.nn.metrics.confusion_matrix import ConfusionMatrixMetric, ConfusionMatrix
from mindspore.train.metrics.accuracy import Accuracy
from mindspore.train.metrics.hausdorff_distance import HausdorffDistance
from mindspore.train.metrics.error import MAE, MSE
from mindspore.train.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.precision import Precision
from mindspore.train.metrics.recall import Recall
from mindspore.train.metrics.fbeta import Fbeta, F1
from mindspore.train.metrics.dice import Dice
from mindspore.train.metrics.roc import ROC
from mindspore.train.metrics.auc import auc
from mindspore.train.metrics.topk import TopKCategoricalAccuracy, Top1CategoricalAccuracy, Top5CategoricalAccuracy
from mindspore.train.metrics.loss import Loss
from mindspore.train.metrics.mean_surface_distance import MeanSurfaceDistance
from mindspore.train.metrics.root_mean_square_surface_distance import RootMeanSquareDistance
from mindspore.train.metrics.bleu_score import BleuScore
from mindspore.train.metrics.cosine_similarity import CosineSimilarity
from mindspore.train.metrics.occlusion_sensitivity import OcclusionSensitivity
from mindspore.train.metrics.perplexity import Perplexity
from mindspore.train.metrics.confusion_matrix import ConfusionMatrixMetric, ConfusionMatrix
__all__ = [
"names",
@ -113,7 +113,7 @@ def get_metric_fn(name, *args, **kwargs):
Gets the metric method based on the input name.
Args:
name (str): The name of metric method. Names can be obtained by `mindspore.nn.names` .
name (str): The name of metric method. Names can be obtained by `mindspore.train.names` .
object for the currently supported metrics.
args: Arguments for the metric function.
kwargs: Keyword arguments for the metric function.

View File

@ -17,7 +17,7 @@ from __future__ import absolute_import
import numpy as np
from mindspore.nn.metrics.metric import EvaluationBase, rearrange_inputs, _check_onehot_data
from mindspore.train.metrics.metric import EvaluationBase, rearrange_inputs, _check_onehot_data
class Accuracy(EvaluationBase):
@ -42,11 +42,12 @@ class Accuracy(EvaluationBase):
Examples:
>>> import numpy as np
>>> import mindspore
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import Accuracy
>>>
>>> x = Tensor(np.array([[0.2, 0.5], [0.3, 0.1], [0.9, 0.6]]), mindspore.float32)
>>> y = Tensor(np.array([1, 0, 1]), mindspore.float32)
>>> metric = nn.Accuracy('classification')
>>> metric = Accuracy('classification')
>>> metric.clear()
>>> metric.update(x, y)
>>> accuracy = metric.eval()

View File

@ -39,15 +39,15 @@ def auc(x, y, reorder=False):
Examples:
>>> import numpy as np
>>> from mindspore import nn
>>> from mindspore.train import ROC, auc
>>>
>>> y_pred = np.array([[3, 0, 1], [1, 3, 0], [1, 0, 2]])
>>> y = np.array([[0, 2, 1], [1, 2, 1], [0, 0, 1]])
>>> metric = nn.ROC(pos_label=2)
>>> metric = ROC(pos_label=2)
>>> metric.clear()
>>> metric.update(y_pred, y)
>>> fpr, tpr, thre = metric.eval()
>>> output = nn.auc(fpr, tpr)
>>> output = auc(fpr, tpr)
>>> print(output)
0.5357142857142857
"""

View File

@ -19,7 +19,7 @@ from collections import Counter
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class BleuScore(Metric):
@ -38,12 +38,12 @@ class BleuScore(Metric):
``Ascend`` ``GPU`` ``CPU``
Examples:
>>> import mindspore.nn as nn
>>> from mindspore.train import BleuScore
>>>
>>> candidate_corpus = [['i', 'have', 'a', 'pen', 'on', 'my', 'desk']]
>>> reference_corpus = [[['i', 'have', 'a', 'pen', 'in', 'my', 'desk'],
... ['there', 'is', 'a', 'pen', 'on', 'the', 'desk']]]
>>> metric = nn.BleuScore()
>>> metric = BleuScore()
>>> metric.clear()
>>> metric.update(candidate_corpus, reference_corpus)
>>> bleu_score = metric.eval()

View File

@ -18,7 +18,7 @@ from __future__ import absolute_import
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class ConfusionMatrix(Metric):
@ -27,7 +27,7 @@ class ConfusionMatrix(Metric):
including binary classification and multiple classification.
If you only need confusion matrix, use this class. If you want to calculate other metrics, such as 'PPV',
'TPR', 'TNR', etc., use class 'mindspore.nn.ConfusionMatrixMetric'.
'TPR', 'TNR', etc., use class 'mindspore.train.ConfusionMatrixMetric'.
Args:
num_classes (int): Number of classes in the dataset.
@ -45,11 +45,12 @@ class ConfusionMatrix(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import ConfusionMatrix
>>>
>>> x = Tensor(np.array([1, 0, 1, 0]))
>>> y = Tensor(np.array([1, 0, 0, 1]))
>>> metric = nn.ConfusionMatrix(num_classes=2, normalize='no_norm', threshold=0.5)
>>> metric = ConfusionMatrix(num_classes=2, normalize='no_norm', threshold=0.5)
>>> metric.clear()
>>> metric.update(x, y)
>>> output = metric.eval()
@ -174,9 +175,10 @@ class ConfusionMatrixMetric(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import ConfusionMatrixMetric
>>>
>>> metric = nn.ConfusionMatrixMetric(skip_channel=True, metric_name="tpr",
>>> metric = ConfusionMatrixMetric(skip_channel=True, metric_name="tpr",
... calculation_method=False, decrease="mean")
>>> metric.clear()
>>> x = Tensor(np.array([[[0], [1]], [[1], [0]]]))

View File

@ -18,7 +18,7 @@ from __future__ import absolute_import
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class CosineSimilarity(Metric):
@ -35,10 +35,10 @@ class CosineSimilarity(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn
>>> from mindspore.train import CosineSimilarity
>>>
>>> test_data = np.array([[1, 3, 4, 7], [2, 4, 2, 5], [3, 1, 5, 8]])
>>> metric = nn.CosineSimilarity()
>>> metric = CosineSimilarity()
>>> metric.clear()
>>> metric.update(test_data)
>>> square_matrix = metric.eval()

View File

@ -18,7 +18,7 @@ from __future__ import absolute_import
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class Dice(Metric):
@ -40,11 +40,12 @@ class Dice(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import Dice
>>>
>>> x = Tensor(np.array([[0.2, 0.5], [0.3, 0.1], [0.9, 0.6]]))
>>> y = Tensor(np.array([[0, 1], [1, 0], [0, 1]]))
>>> metric = nn.Dice(smooth=1e-5)
>>> metric = Dice(smooth=1e-5)
>>> metric.clear()
>>> metric.update(x, y)
>>> dice = metric.eval()

View File

@ -17,7 +17,7 @@ from __future__ import absolute_import
import numpy as np
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class MAE(Metric):
@ -38,11 +38,12 @@ class MAE(Metric):
Examples:
>>> import numpy as np
>>> import mindspore
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import MAE
>>>
>>> x = Tensor(np.array([0.1, 0.2, 0.6, 0.9]), mindspore.float32)
>>> y = Tensor(np.array([0.1, 0.25, 0.7, 0.9]), mindspore.float32)
>>> error = nn.MAE()
>>> error = MAE()
>>> error.clear()
>>> error.update(x, y)
>>> result = error.eval()
@ -114,11 +115,12 @@ class MSE(Metric):
Examples:
>>> import numpy as np
>>> import mindspore
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import MSE
>>>
>>> x = Tensor(np.array([0.1, 0.2, 0.6, 0.9]), mindspore.float32)
>>> y = Tensor(np.array([0.1, 0.25, 0.5, 0.9]), mindspore.float32)
>>> error = nn.MSE()
>>> error = MSE()
>>> error.clear()
>>> error.update(x, y)
>>> result = error.eval()

View File

@ -19,7 +19,7 @@ import sys
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs, _check_onehot_data
from mindspore.train.metrics.metric import Metric, rearrange_inputs, _check_onehot_data
class Fbeta(Metric):
@ -40,11 +40,12 @@ class Fbeta(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import Fbeta
>>>
>>> x = Tensor(np.array([[0.2, 0.5], [0.3, 0.1], [0.9, 0.6]]))
>>> y = Tensor(np.array([1, 0, 1]))
>>> metric = nn.Fbeta(1)
>>> metric = Fbeta(1)
>>> metric.clear()
>>> metric.update(x, y)
>>> fbeta = metric.eval()
@ -141,7 +142,7 @@ class Fbeta(Metric):
class F1(Fbeta):
r"""
Calculates the F1 score. F1 is a special case of Fbeta when beta is 1.
Refer to class :class:`mindspore.nn.Fbeta` for more details.
Refer to class :class:`mindspore.train.Fbeta` for more details.
.. math::
F_1=\frac{2\cdot true\_positive}{2\cdot true\_positive + false\_negative + false\_positive}
@ -151,11 +152,12 @@ class F1(Fbeta):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import F1
>>>
>>> x = Tensor(np.array([[0.2, 0.5], [0.3, 0.1], [0.9, 0.6]]))
>>> y = Tensor(np.array([1, 0, 1]))
>>> metric = nn.F1()
>>> metric = F1()
>>> metric.update(x, y)
>>> result = metric.eval()
>>> print(result)

View File

@ -22,7 +22,7 @@ import numpy as np
from mindspore.common.tensor import Tensor
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class _ROISpatialData(metaclass=ABCMeta):
@ -98,11 +98,12 @@ class HausdorffDistance(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import HausdorffDistance
>>>
>>> x = Tensor(np.array([[3, 0, 1], [1, 3, 0], [1, 0, 2]]))
>>> y = Tensor(np.array([[0, 2, 1], [1, 2, 1], [0, 0, 1]]))
>>> metric = nn.HausdorffDistance()
>>> metric = HausdorffDistance()
>>> metric.clear()
>>> metric.update(x, y, 0)
>>> mean_average_distance = metric.eval()

View File

@ -15,7 +15,7 @@
"""Loss for evaluation"""
from __future__ import absolute_import
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class Loss(Metric):
@ -32,10 +32,11 @@ class Loss(Metric):
Examples:
>>> import numpy as np
>>> import mindspore
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import Loss
>>>
>>> x = Tensor(np.array(0.2), mindspore.float32)
>>> loss = nn.Loss()
>>> loss = Loss()
>>> loss.clear()
>>> loss.update(x)
>>> result = loss.eval()

View File

@ -19,7 +19,7 @@ from scipy.ndimage import morphology
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class MeanSurfaceDistance(Metric):
@ -61,10 +61,11 @@ class MeanSurfaceDistance(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import MeanSurfaceDistance
>>> x = Tensor(np.array([[3, 0, 1], [1, 3, 0], [1, 0, 2]]))
>>> y = Tensor(np.array([[0, 2, 1], [1, 2, 1], [0, 0, 1]]))
>>> metric = nn.MeanSurfaceDistance(symmetric=False, distance_metric="euclidean")
>>> metric = MeanSurfaceDistance(symmetric=False, distance_metric="euclidean")
>>> metric.clear()
>>> metric.update(x, y, 0)
>>> mean_average_distance = metric.eval()

View File

@ -28,7 +28,7 @@ def rearrange_inputs(func):
"""
This decorator is used to rearrange the inputs according to its `indexes` attribute of the class.
This decorator is currently applied on the `update` of :class:`mindspore.nn.Metric`.
This decorator is currently applied on the `update` of :class:`mindspore.train.Metric`.
Args:
func (Callable): A candidate function to be wrapped whose input will be rearranged.
@ -79,7 +79,7 @@ class Metric(metaclass=ABCMeta):
result, and `clear` will reinitialize the intermediate results.
Never use this class directly, but instantiate one of its subclasses instead, for examples,
:class:`mindspore.nn.MAE`, :class:`mindspore.nn.Recall` etc.
:class:`mindspore.train.MAE`, :class:`mindspore.train.Recall` etc.
Supported Platforms:
``Ascend`` ``GPU`` ``CPU``
@ -123,7 +123,7 @@ class Metric(metaclass=ABCMeta):
Note:
When customize a metric, decorate the `update` function with the decorator
:func:`mindspore.nn.rearrange_inputs` for the `indexes` to take effect.
:func:`mindspore.train.rearrange_inputs` for the `indexes` to take effect.
Args:
indexes (List(int)): The order of logits and labels to be rearranged.
@ -136,12 +136,13 @@ class Metric(metaclass=ABCMeta):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import Accuracy
>>>
>>> x = Tensor(np.array([[0.2, 0.5], [0.3, 0.1], [0.9, 0.6]]))
>>> y = Tensor(np.array([1, 0, 1]))
>>> y2 = Tensor(np.array([0, 0, 1]))
>>> metric = nn.Accuracy('classification').set_indexes([0, 2])
>>> metric = Accuracy('classification').set_indexes([0, 2])
>>> metric.clear()
>>> # indexes is [0, 2], using x as logits, y2 as label.
>>> metric.update(x, y, y2)

View File

@ -20,7 +20,7 @@ import numpy as np
from mindspore import nn
from mindspore.common.tensor import Tensor
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
try:
from tqdm import trange
@ -55,6 +55,7 @@ class OcclusionSensitivity(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore.train import OcclusionSensitivity
>>>
>>> class DenseNet(nn.Cell):
... def __init__(self):
@ -69,7 +70,7 @@ class OcclusionSensitivity(Metric):
>>> model = DenseNet()
>>> test_data = np.array([[0.1, 0.2, 0.3, 0.4]]).astype(np.float32)
>>> label = np.array(1).astype(np.int32)
>>> metric = nn.OcclusionSensitivity()
>>> metric = OcclusionSensitivity()
>>> metric.clear()
>>> metric.update(model, test_data, label)
>>> score = metric.eval()

View File

@ -19,7 +19,7 @@ import math
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class Perplexity(Metric):
@ -41,10 +41,11 @@ class Perplexity(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import Perplexity
>>> x = Tensor(np.array([[0.2, 0.5], [0.3, 0.1], [0.9, 0.6]]))
>>> y = Tensor(np.array([1, 0, 1]))
>>> metric = nn.Perplexity(ignore_label=None)
>>> metric = Perplexity(ignore_label=None)
>>> metric.clear()
>>> metric.update(x, y)
>>> perplexity = metric.eval()

View File

@ -19,7 +19,7 @@ import sys
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import EvaluationBase, rearrange_inputs, _check_onehot_data
from mindspore.train.metrics.metric import EvaluationBase, rearrange_inputs, _check_onehot_data
class Precision(EvaluationBase):
@ -43,11 +43,12 @@ class Precision(EvaluationBase):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import Precision
>>>
>>> x = Tensor(np.array([[0.2, 0.5], [0.3, 0.1], [0.9, 0.6]]))
>>> y = Tensor(np.array([1, 0, 1]))
>>> metric = nn.Precision('classification')
>>> metric = Precision('classification')
>>> metric.clear()
>>> metric.update(x, y)
>>> precision = metric.eval()

View File

@ -19,7 +19,7 @@ import sys
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import EvaluationBase, rearrange_inputs, _check_onehot_data
from mindspore.train.metrics.metric import EvaluationBase, rearrange_inputs, _check_onehot_data
class Recall(EvaluationBase):
@ -44,11 +44,12 @@ class Recall(EvaluationBase):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import Recall
>>>
>>> x = Tensor(np.array([[0.2, 0.5], [0.3, 0.1], [0.9, 0.6]]))
>>> y = Tensor(np.array([1, 0, 1]))
>>> metric = nn.Recall('classification')
>>> metric = Recall('classification')
>>> metric.clear()
>>> metric.update(x, y)
>>> recall = metric.eval()

View File

@ -18,7 +18,7 @@ from __future__ import absolute_import
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs, _binary_clf_curve
from mindspore.train.metrics.metric import Metric, rearrange_inputs, _binary_clf_curve
class ROC(Metric):
@ -38,12 +38,13 @@ class ROC(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import ROC
>>>
>>> # 1) binary classification example
>>> x = Tensor(np.array([3, 1, 4, 2]))
>>> y = Tensor(np.array([0, 1, 2, 3]))
>>> metric = nn.ROC(pos_label=2)
>>> metric = ROC(pos_label=2)
>>> metric.clear()
>>> metric.update(x, y)
>>> fpr, tpr, thresholds = metric.eval()
@ -58,7 +59,7 @@ class ROC(Metric):
>>> x = Tensor(np.array([[0.28, 0.55, 0.15, 0.05], [0.10, 0.20, 0.05, 0.05], [0.20, 0.05, 0.15, 0.05],
... [0.05, 0.05, 0.05, 0.75]]))
>>> y = Tensor(np.array([0, 1, 2, 3]))
>>> metric = nn.ROC(class_num=4)
>>> metric = ROC(class_num=4)
>>> metric.clear()
>>> metric.update(x, y)
>>> fpr, tpr, thresholds = metric.eval()

View File

@ -19,7 +19,7 @@ from scipy.ndimage import morphology
import numpy as np
from mindspore._checkparam import Validator as validator
from mindspore.nn.metrics.metric import Metric, rearrange_inputs
from mindspore.train.metrics.metric import Metric, rearrange_inputs
class RootMeanSquareDistance(Metric):
@ -60,11 +60,12 @@ class RootMeanSquareDistance(Metric):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import RootMeanSquareDistance
>>>
>>> x = Tensor(np.array([[3, 0, 1], [1, 3, 0], [1, 0, 2]]))
>>> y = Tensor(np.array([[0, 2, 1], [1, 2, 1], [0, 0, 1]]))
>>> metric = nn.RootMeanSquareDistance(symmetric=False, distance_metric="euclidean")
>>> metric = RootMeanSquareDistance(symmetric=False, distance_metric="euclidean")
>>> metric.clear()
>>> metric.update(x, y, 0)
>>> root_mean_square_distance = metric.eval()

View File

@ -17,7 +17,7 @@ from __future__ import absolute_import
import numpy as np
from mindspore.nn.metrics.metric import Metric, rearrange_inputs, _check_onehot_data
from mindspore.train.metrics.metric import Metric, rearrange_inputs, _check_onehot_data
class TopKCategoricalAccuracy(Metric):
@ -37,12 +37,13 @@ class TopKCategoricalAccuracy(Metric):
Examples:
>>> import mindspore
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> from mindspore import Tensor
>>> from mindspore.train import TopKCategoricalAccuracy
>>>
>>> x = Tensor(np.array([[0.2, 0.5, 0.3, 0.6, 0.2], [0.1, 0.35, 0.5, 0.2, 0.],
... [0.9, 0.6, 0.2, 0.01, 0.3]]), mindspore.float32)
>>> y = Tensor(np.array([2, 0, 1]), mindspore.float32)
>>> topk = nn.TopKCategoricalAccuracy(3)
>>> topk = TopKCategoricalAccuracy(3)
>>> topk.clear()
>>> topk.update(x, y)
>>> output = topk.eval()
@ -120,12 +121,13 @@ class Top1CategoricalAccuracy(TopKCategoricalAccuracy):
Examples:
>>> import numpy as np
>>> from mindspore import nn, Tensor
>>> import mindspore
>>> from mindspore import Tensor
>>> from mindspore.train import Top1CategoricalAccuracy
>>>
>>> x = Tensor(np.array([[0.2, 0.5, 0.3, 0.6, 0.2], [0.1, 0.35, 0.5, 0.2, 0.],
... [0.9, 0.6, 0.2, 0.01, 0.3]]), mindspore.float32)
>>> y = Tensor(np.array([2, 0, 1]), mindspore.float32)
>>> topk = nn.Top1CategoricalAccuracy()
>>> topk = Top1CategoricalAccuracy()
>>> topk.clear()
>>> topk.update(x, y)
>>> output = topk.eval()

View File

@ -27,7 +27,7 @@ from mindspore import log as logger
from mindspore.train.serialization import save_checkpoint, load_checkpoint
from mindspore.train.callback._checkpoint import ModelCheckpoint, _chg_ckpt_file_name_if_same_exist
from mindspore.common.tensor import Tensor
from mindspore.nn.metrics import get_metrics, get_metric_fn
from mindspore.train.metrics import get_metrics, get_metric_fn
from mindspore._checkparam import check_input_data, check_output_data, Validator
from mindspore.train.callback import _InternalCallbackParam, RunContext, _CallbackManager, Callback, TimeMonitor
from mindspore.train.callback import __all__ as internal_cb_names
@ -37,7 +37,7 @@ from mindspore.parallel._utils import _get_parallel_mode, _get_device_num, _get_
_reset_op_id_with_offset
from mindspore.parallel._ps_context import _is_role_worker, _is_role_pserver, _is_role_sched, _is_ps_mode, \
_cache_enable, _enable_distributed_mindrt
from mindspore.nn.metrics import Loss
from mindspore.train.metrics import Loss
from mindspore import nn
from mindspore.boost import AutoBoost
from mindspore.context import ParallelMode

View File

@ -20,7 +20,7 @@ import numpy as np
import mindspore.communication.management as distributedTool
import mindspore.nn as nn
from mindspore import context
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
from mindspore.train import Model
from mindspore.train.callback import LossMonitor, TimeMonitor
from tests.models.official.cv.lenet.src.dataset import create_dataset

View File

@ -23,7 +23,7 @@ import mindspore.dataset.vision as CV
import mindspore.nn as nn
from mindspore.common import dtype as mstype
from mindspore.dataset.vision import Inter
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
from mindspore.train import Model
from mindspore.train.callback import LossMonitor
from mindspore.common.initializer import TruncatedNormal

View File

@ -22,7 +22,7 @@ from mindspore.ops import composite as C
from mindspore.ops import operations as P
from mindspore.nn import Dropout
from mindspore.nn.optim import Adam
from mindspore.nn.metrics import Metric
from mindspore.train.metrics import Metric
from mindspore import nn, Tensor, ParameterTuple, Parameter
from mindspore.common.initializer import Uniform, initializer
from mindspore.train.callback import ModelCheckpoint, CheckpointConfig

View File

@ -26,7 +26,7 @@ from mindspore import log as logger
from mindspore.common import dtype as mstype
from mindspore.common.tensor import Tensor
from mindspore.nn.learning_rate_schedule import LearningRateSchedule, PolynomialDecayLR, WarmUpLR
from mindspore.nn.metrics import Metric
from mindspore.train.metrics import Metric
from mindspore.ops import operations as P
from mindspore.train.callback import Callback

View File

@ -14,7 +14,7 @@
# ============================================================================
"""mIou."""
import numpy as np
from mindspore.nn.metrics.metric import Metric
from mindspore.train.metrics import Metric
def confuse_matrix(target, pred, n):

View File

@ -27,7 +27,7 @@ from mindspore import Tensor, ParameterTuple
from mindspore.common import dtype as mstype
from mindspore.dataset.vision import Inter
from mindspore.nn import Dense, TrainOneStepCell, WithLossCell, ForwardValueAndGrad
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
from mindspore.nn.optim import Momentum
from mindspore.ops import operations as P
from mindspore.ops import functional as F

View File

@ -15,7 +15,7 @@
import mindspore.context as context
from mindspore import set_seed
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
from mindspore.train import Model
from mindspore.train.callback import LossMonitor, TimeMonitor
from mindspore.communication.management import init

View File

@ -15,7 +15,7 @@
import mindspore.context as context
from mindspore import set_seed
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
from mindspore.train import Model
from mindspore.train.callback import TimeMonitor
from mindspore.communication.management import init

View File

@ -22,7 +22,7 @@ import pytest
from mindspore import dataset as ds
from mindspore import nn, Tensor, context
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
from mindspore.nn.optim import Momentum
from mindspore.dataset.transforms import transforms as C
from mindspore.dataset.vision import transforms as CV

View File

@ -22,7 +22,7 @@ from mindspore.nn import EmbeddingLookup, SoftmaxCrossEntropyWithLogits
from mindspore.nn import Adam
from mindspore.train import Model
from mindspore.train.callback import CheckpointConfig, ModelCheckpoint
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
from mindspore.common import set_seed
from mindspore.communication.management import get_rank
import mindspore.ops.operations as op

View File

@ -23,7 +23,7 @@ import mindspore.dataset.vision as CV
import mindspore.nn as nn
from mindspore.common import dtype as mstype
from mindspore.dataset.vision import Inter
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
from mindspore.train import Model
from mindspore.train.callback import LossMonitor
from mindspore.common.initializer import TruncatedNormal

View File

@ -22,7 +22,7 @@ from mindspore import context
from mindspore import Tensor
from mindspore.common import dtype as mstype
import mindspore.nn as nn
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
from mindspore.train.callback import ModelCheckpoint, CheckpointConfig, LossMonitor
from mindspore import load_checkpoint, load_param_into_net, export
from mindspore.train import Model

View File

@ -25,7 +25,7 @@ import pytest
from mindspore.common import set_seed
from mindspore import nn, Tensor, context
from mindspore.common.initializer import Normal
from mindspore.nn.metrics import Loss
from mindspore.train.metrics import Loss
from mindspore.nn.optim import Momentum
from mindspore.ops import operations as P
from mindspore.train import Model

View File

@ -22,7 +22,7 @@ import pytest
from mindspore import nn, Tensor, context
from mindspore.common.initializer import Normal
from mindspore.nn.metrics import Loss
from mindspore.train.metrics import Loss
from mindspore.nn.optim import Momentum
from mindspore.ops import operations as P
from mindspore.train import Model

View File

@ -18,7 +18,7 @@ import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import Accuracy
from mindspore.train.metrics import Accuracy
def test_classification_accuracy():

View File

@ -17,7 +17,7 @@
import math
import numpy as np
from mindspore import Tensor
from mindspore.nn.metrics import ROC, auc
from mindspore.train.metrics import ROC, auc
def test_auc():

View File

@ -15,7 +15,7 @@
"""test_bleu_score"""
import math
import pytest
from mindspore.nn.metrics import BleuScore
from mindspore.train.metrics import BleuScore
def test_bleu_score():

View File

@ -16,7 +16,7 @@
import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import ConfusionMatrix
from mindspore.train.metrics import ConfusionMatrix
def test_confusion_matrix():

View File

@ -16,7 +16,7 @@
import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import ConfusionMatrixMetric
from mindspore.train.metrics import ConfusionMatrixMetric
def test_confusion_matrix_metric():

View File

@ -16,7 +16,7 @@
import pytest
import numpy as np
from sklearn.metrics import pairwise
from mindspore.nn.metrics import CosineSimilarity
from mindspore.train.metrics import CosineSimilarity
def test_cosine_similarity():

View File

@ -17,7 +17,7 @@ import math
import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import get_metric_fn, Dice
from mindspore.train.metrics import get_metric_fn, Dice
def test_classification_dice():

View File

@ -18,7 +18,7 @@ import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import MAE, MSE
from mindspore.train.metrics import MAE, MSE
def test_MAE():

View File

@ -17,7 +17,7 @@ import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import get_metric_fn, Fbeta
from mindspore.train.metrics import get_metric_fn, Fbeta
def test_classification_fbeta():

View File

@ -18,7 +18,7 @@ import math
import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import get_metric_fn, HausdorffDistance
from mindspore.train.metrics import get_metric_fn, HausdorffDistance
def test_hausdorff_distance():

View File

@ -17,7 +17,7 @@ import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import Loss
from mindspore.train.metrics import Loss
def test_loss_inputs_error():

View File

@ -18,7 +18,7 @@ import math
import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import get_metric_fn, MeanSurfaceDistance
from mindspore.train.metrics import get_metric_fn, MeanSurfaceDistance
def test_mean_surface_distance():

View File

@ -17,8 +17,7 @@ import math
import numpy as np
from mindspore import Tensor
from mindspore.nn.metrics import get_metric_fn
from mindspore.nn.metrics.metric import rearrange_inputs
from mindspore.train.metrics import get_metric_fn, rearrange_inputs
def test_classification_accuracy():

View File

@ -17,7 +17,7 @@ import pytest
import numpy as np
from mindspore import nn, context
from mindspore.common.tensor import Tensor
from mindspore.nn.metrics import OcclusionSensitivity
from mindspore.train.metrics import OcclusionSensitivity
context.set_context(mode=context.GRAPH_MODE)
class DenseNet(nn.Cell):

View File

@ -18,7 +18,7 @@ import math
import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import get_metric_fn, Perplexity
from mindspore.train.metrics import get_metric_fn, Perplexity
def test_perplexity():

View File

@ -18,7 +18,7 @@ import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import Precision
from mindspore.train.metrics import Precision
def test_classification_precision():

View File

@ -18,7 +18,7 @@ import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import Recall
from mindspore.train.metrics import Recall
def test_classification_recall():

View File

@ -17,7 +17,7 @@
import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import ROC
from mindspore.train.metrics import ROC
def test_roc():

View File

@ -18,7 +18,7 @@ import math
import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import get_metric_fn, RootMeanSquareDistance
from mindspore.train.metrics import get_metric_fn, RootMeanSquareDistance
def test_root_mean_square_distance():

View File

@ -18,7 +18,7 @@ import numpy as np
import pytest
from mindspore import Tensor
from mindspore.nn.metrics import TopKCategoricalAccuracy, Top1CategoricalAccuracy, Top5CategoricalAccuracy
from mindspore.train.metrics import TopKCategoricalAccuracy, Top1CategoricalAccuracy, Top5CategoricalAccuracy
def test_type_topk():

View File

@ -20,7 +20,7 @@ import pytest
from mindspore.common import set_seed
from mindspore import nn
from mindspore.nn.metrics import Loss
from mindspore.train.metrics import Loss
from mindspore.train import Model
from mindspore.train.callback import SummaryLandscape
from tests.security_utils import security_off_wrap