Release 0.1.0-alpha

Main Features

Ascend 910 Training and Inference Framework

  • Recommended OS: Ubuntu 16.04 (or later), EulerOS 2.5, or EulerOS 2.8
  • Python version: 3.7.5
  • Preset models
    • ResNet-50: widely used convolutional neural network (CNN) for image classification, built on residual structures.
    • AlexNet: classic CNN for image classification that achieved a landmark result in ImageNet LSVRC-2012.
    • LeNet: classic CNN for image classification, proposed by Yann LeCun.
    • VGG16: classic CNN for image classification, proposed by the Oxford Visual Geometry Group.
    • YoloV3: real-time object detection network.
    • NEZHA: BERT-based Chinese pre-training network developed by Huawei Noah's Ark Laboratory.
  • Execution modes
    • Graph mode: provides graph optimization methods such as memory overcommitment, IR fusion, and buffer fusion to achieve optimal execution performance.
    • PyNative mode: operator-by-operator (single-step) execution mode that facilitates debugging (see the execution-mode sketch after this list).
  • Debugging capability and methods
    • Save CheckPoint files and Summary data during training (see the checkpoint sketch after this list).
    • Support asynchronous printing.
    • Dump intermediate computation data for debugging.
    • Support performance profiling of the execution process.
  • Distributed execution
    • Support the AllReduce, AllGather, and Broadcast collective communication operations.
    • AllReduce data parallel: each device trains on a different slice of the data, which accelerates the overall training process (see the data-parallel sketch after this list).
    • Collective communication-based layer-wise parallelism: the model is partitioned across devices, which resolves out-of-memory issues when training large models and improves training speed.
    • Automatic parallel mode: a cost model predicts a favorable combination of data and model parallelism; this mode is recommended for ResNet-series networks.
  • Automatic differentiation
    • Implement automatic differentiation based on source-to-source transformation.
    • Support distributed scenarios, with automatic insertion of the communication operators required by the backward pass.
  • Data processing, augmentation, and save format
    • Load common datasets such as ImageNet, MNIST, CIFAR-10, and CIFAR-100.
    • Support common data loading pipeline operations, such as shuffle, repeat, batch, map, and sampler.
    • Provide basic operator libraries to cover common CV scenarios.
    • Allow users to define custom Python data augmentation operators through the PyFunc mechanism.
    • Support access to user-defined datasets through the GeneratorDataset mechanism (see the data-pipeline sketch after this list).
    • Provide the MindSpore data format, with data aggregation and storage, random access to examples, data partitioning, efficient parallel reads, user-defined indexes, and dataset search.
    • Convert user datasets to the MindSpore data format.
    • After data processing and augmentation, supply the data to training in either feed mode or graph mode.
  • FP32/FP16 mixed-precision computation, supporting both automatic and manual configuration (see the mixed-precision sketch after this list)
  • Provide common operator libraries (nn, math, array, etc.), with support for user-defined operators.
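
A minimal sketch of switching between the two execution modes through the context API; the module paths follow the usual MindSpore Python API and may differ slightly in this alpha release:

```python
from mindspore import context

# Compile the whole network into a graph for best execution performance.
context.set_context(mode=context.GRAPH_MODE, device_target="Ascend")

# Or run operator by operator for easier step-through debugging (PyNative mode):
# context.set_context(mode=context.PYNATIVE_MODE, device_target="Ascend")
```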
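
A sketch of saving CheckPoint files during training through the callback mechanism, assuming the usual `ModelCheckpoint`/`CheckpointConfig` classes; the prefix, directory, and step counts are illustrative:

```python
from mindspore.train.callback import ModelCheckpoint, CheckpointConfig

# Save a checkpoint every 1000 steps and keep at most the 10 most recent files.
ckpt_config = CheckpointConfig(save_checkpoint_steps=1000, keep_checkpoint_max=10)
ckpt_cb = ModelCheckpoint(prefix="lenet", directory="./checkpoints", config=ckpt_config)

# The callback is then passed to training, e.g.:
# model.train(epoch_size, train_dataset, callbacks=[ckpt_cb])
```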
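
A sketch of enabling AllReduce-based data parallelism on Ascend, assuming the collective backend is initialized with `init()` and the parallel mode is set through the auto-parallel context; argument names may differ in this alpha release:

```python
from mindspore import context
from mindspore.communication.management import init

context.set_context(mode=context.GRAPH_MODE, device_target="Ascend")
init()  # initialize the HCCL collective communication backend

# Gradients are synchronized across devices with AllReduce during training.
context.set_auto_parallel_context(parallel_mode="data_parallel")

# Using parallel_mode="auto_parallel" instead lets the cost model pick the
# data/model parallel split automatically (recommended for ResNet-series networks).
```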
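
A sketch of a data pipeline built from a user-defined Python source via `GeneratorDataset`, combined with the shuffle/map/batch/repeat operations; the toy generator and the lambda augmentation (a PyFunc) are illustrative only:

```python
import numpy as np
import mindspore.dataset as ds

def my_generator():
    # User-defined data source: 100 random "images" with integer labels.
    for i in range(100):
        yield np.random.rand(32, 32, 3).astype(np.float32), np.array(i % 10, dtype=np.int32)

data = ds.GeneratorDataset(my_generator, column_names=["image", "label"])
data = data.shuffle(buffer_size=16)
data = data.map(operations=lambda img: img / 255.0, input_columns=["image"])  # PyFunc augmentation
data = data.batch(8, drop_remainder=True)
data = data.repeat(2)
```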
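
A sketch of the two mixed-precision options on a toy network; `amp_level` on `Model` (automatic) and `Cell.to_float` (manual) follow the usual MindSpore API and may differ in this alpha release:

```python
import mindspore.nn as nn
from mindspore.train.model import Model
from mindspore.common import dtype as mstype

net = nn.Dense(16, 10)                       # toy network
loss = nn.SoftmaxCrossEntropyWithLogits()    # loss function
opt = nn.Momentum(net.trainable_params(), learning_rate=0.01, momentum=0.9)

# Automatic mixed precision: the framework casts eligible layers to FP16.
model = Model(net, loss_fn=loss, optimizer=opt, amp_level="O2")

# Manual mixed precision: explicitly run a cell in FP16.
net.to_float(mstype.float16)
```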

Inference Deployment

  • Deploy models in MindSpore format on the Ascend 310 platform for inference.
  • Save models in ONNX format (see the export sketch after this list).
  • Support saving models in LITE format and running models based on the lightweight inference framework.
    • Recommended OS: Android 4.3 or later
    • Supported network type: LeNet
    • Provide general operators generated by TVM as well as operators tuned for specific networks.
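
A sketch of exporting a trained network to ONNX via `mindspore.train.serialization.export`; the toy network, input shape, and file name are placeholders, and the set of supported `file_format` values may differ in this alpha release:

```python
import numpy as np
import mindspore.nn as nn
from mindspore import Tensor
from mindspore.train.serialization import export

net = nn.Conv2d(3, 8, 3)  # stand-in for a trained network (an nn.Cell)

# The dummy input fixes the exported graph's input shape.
dummy_input = Tensor(np.zeros([1, 3, 224, 224], dtype=np.float32))
export(net, dummy_input, file_name="net.onnx", file_format="ONNX")
```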

Other Hardware Support

  • GPU platform training
    • Recommended OS: Ubuntu 16.04
    • CUDA version: 9.2 or 10.1
    • cuDNN version: 7.6 or later
    • Python version: 3.7.5
    • NCCL version: 2.4.8-1
    • OpenMPI version: 3.1.5
    • Supported models: AlexNet, LeNet, and LSTM
    • Supported datasets: MNIST and CIFAR-10
    • Support data parallelism (see the GPU data-parallel sketch after this list).
  • CPU platform training
    • Recommended OS: Ubuntu 16.04
    • Python version: 3.7.5
    • Supported model: LeNet
    • Supported dataset: MNIST
    • Provide only a stand-alone (single-device) execution version.
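
A sketch of retargeting training to the GPU backend with NCCL-based data parallelism, assuming the script is launched under `mpirun`; argument names may differ in this alpha release:

```python
from mindspore import context
from mindspore.communication.management import init

# Target the GPU backend instead of Ascend.
context.set_context(mode=context.GRAPH_MODE, device_target="GPU")

# Initialize NCCL for the processes started by, e.g., `mpirun -n 8 python train.py`,
# then enable AllReduce-based data parallelism.
init("nccl")
context.set_auto_parallel_context(parallel_mode="data_parallel")
```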

Peripherals and Tools