Remove contents of RELEASE.md from whl package metadata

yanghaoran 2021-02-08 09:39:52 +08:00
parent d80f77e988
commit fc91501ff6
3 changed files with 20 additions and 19 deletions
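
Because `long_description` is copied verbatim into the wheel's `METADATA` file, dropping RELEASE.md shrinks the published package metadata to the README alone. Below is a minimal sketch of how one might verify that on a built wheel; the wheel filename and the search string are placeholders, not artifacts of this commit.

```python
# Sketch: inspect a built wheel's METADATA to confirm what the
# long_description now contains. Filename/marker are assumptions.
import zipfile

with zipfile.ZipFile('mindspore-1.1.0-py3-none-any.whl') as whl:
    meta_path = next(n for n in whl.namelist() if n.endswith('METADATA'))
    metadata = whl.read(meta_path).decode('utf-8')

print(meta_path, '-', len(metadata), 'characters')
print('README content present:', 'MindSpore Logo' in metadata)
```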

README.md

@@ -1,4 +1,4 @@
-![MindSpore Logo](docs/MindSpore-logo.png "MindSpore logo")
+![MindSpore Logo](https://gitee.com/mindspore/mindspore/raw/master/docs/MindSpore-logo.png "MindSpore logo")
[View Chinese](./README_CN.md)
@@ -34,7 +34,7 @@ processor, and software hardware co-optimization. At the meantime MindSpore as
a global AI open source community, aims to further advance the development and
enrichment of the AI software/hardware application ecosystem.
<img src="docs/MindSpore-architecture.png" alt="MindSpore Architecture" width="600"/>
<img src="https://gitee.com/mindspore/mindspore/raw/master/docs/MindSpore-architecture.png" alt="MindSpore Architecture" width="600"/>
For more details please check out our [Architecture Guide](https://www.mindspore.cn/doc/note/en/master/design/mindspore/architecture.html).
@@ -50,7 +50,7 @@ TensorFlow adopted static calculation diagrams in the early days, whereas PyTorc
But MindSpore finds another way, automatic differentiation based on source code conversion. On the one hand, it supports automatic differentiation of automatic control flow, so it is quite convenient to build models like PyTorch. On the other hand, MindSpore can perform static compilation optimization on neural networks to achieve great performance.
<img src="docs/Automatic-differentiation.png" alt="Automatic Differentiation" width="600"/>
<img src="https://gitee.com/mindspore/mindspore/raw/master/docs/Automatic-differentiation.png" alt="Automatic Differentiation" width="600"/>
The implementation of MindSpore automatic differentiation can be understood as the symbolic differentiation of the program itself. Because MindSpore IR is a functional intermediate expression, it has an intuitive correspondence with the composite function in basic algebra. The derivation formula of the composite function composed of arbitrary basic functions can be derived. Each primitive operation in MindSpore IR can correspond to the basic functions in basic algebra, which can build more complex flow control.
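
As a rough illustration of differentiating "the program itself" from the API side, here is a minimal sketch using MindSpore's public `GradOperation`; the toy cubic network is a made-up example rather than anything from this repository.

```python
# Minimal sketch: take the gradient of a toy cubic function through
# MindSpore's source-transform autodiff. The Cube cell is hypothetical.
import numpy as np
import mindspore.nn as nn
import mindspore.ops as ops
from mindspore import Tensor

class Cube(nn.Cell):
    def construct(self, x):
        return x * x * x

grad = ops.GradOperation()       # builds a function computing d(out)/d(in)
x = Tensor(np.array([2.0], np.float32))
print(grad(Cube())(x))           # d(x^3)/dx = 3x^2 -> [12.]
```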
@@ -58,9 +58,9 @@ The implementation of MindSpore automatic differentiation can be understood as t
The goal of MindSpore automatic parallel is to build a training method that combines data parallelism, model parallelism, and hybrid parallelism. It can automatically select a least cost model splitting strategy to achieve automatic distributed parallel training.
<img src="docs/Automatic-parallel.png" alt="Automatic Parallel" width="600"/>
<img src="https://gitee.com/mindspore/mindspore/raw/master/docs/Automatic-parallel.png" alt="Automatic Parallel" width="600"/>
-At present, MindSpore uses a fine-grained parallel strategy of splitting operators, that is, each operator in the figure is splitted into a cluster to complete parallel operations. The splitting strategy during this period may be very complicated, but as a developer advocating Pythonic, you don't need to care about the underlying implementation, as long as the top-level API compute is efficient.
+At present, MindSpore uses a fine-grained parallel strategy of splitting operators, that is, each operator in the figure is split into a cluster to complete parallel operations. The splitting strategy during this period may be very complicated, but as a developer advocating Pythonic, you don't need to care about the underlying implementation, as long as the top-level API compute is efficient.
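
Opting in to this strategy is a context switch rather than model surgery. A minimal sketch follows; the `device_num=8` cluster size is an assumption, and the distributed launch and training loop are omitted.

```python
# Sketch: enable operator-level automatic parallelism. device_num=8 is
# an assumed cluster size; distributed launch setup is not shown here.
from mindspore import context
from mindspore.context import ParallelMode

context.set_context(mode=context.GRAPH_MODE)
context.set_auto_parallel_context(parallel_mode=ParallelMode.AUTO_PARALLEL,
                                  device_num=8)
```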
## Installation
@@ -229,7 +229,7 @@ currently the containerized build options are supported as follows:
```
If you want to learn more about the building process of MindSpore docker images,
-please check out [docker](docker/README.md) repo for the details.
+please check out [docker](https://gitee.com/mindspore/mindspore/blob/master/docker/README.md) repo for the details.
## Quickstart
@@ -256,12 +256,13 @@ Check out how MindSpore Open Governance [works](https://gitee.com/mindspore/comm
## Contributing
-Welcome contributions. See our [Contributor Wiki](CONTRIBUTING.md) for
+Welcome contributions. See our [Contributor Wiki](https://gitee.com/mindspore/mindspore/blob/master/CONTRIBUTING.md) for
more details.
## Maintenance phases
Project stable branches will be in one of the following states:
| **State** | **Time frame** | **Summary** |
|-------------|---------------|--------------------------------------------------|
| Planning | 1 - 3 months | Features are under planning. |
@@ -286,8 +287,8 @@ Project stable branches will be in one of the following states:
## Release Notes
-The release notes, see our [RELEASE](RELEASE.md).
+The release notes, see our [RELEASE](https://gitee.com/mindspore/mindspore/blob/master/RELEASE.md).
## License
-[Apache License 2.0](LICENSE)
+[Apache License 2.0](https://gitee.com/mindspore/mindspore/blob/master/LICENSE)

README_CN.md

@@ -1,4 +1,4 @@
-![MindSpore Logo](docs/MindSpore-logo.png "MindSpore logo")
+![MindSpore Logo](https://gitee.com/mindspore/mindspore/raw/master/docs/MindSpore-logo.png "MindSpore logo")
[View English](./README.md)
@@ -31,7 +31,7 @@ MindSpore provides friendly design and efficient execution, aimed at improving data science
Meanwhile, as a global AI open source community, MindSpore is committed to further developing and enriching the AI software/hardware application ecosystem.
<img src="docs/MindSpore-architecture.png" alt="MindSpore Architecture" width="600"/>
<img src="https://gitee.com/mindspore/mindspore/raw/master/docs/MindSpore-architecture.png" alt="MindSpore Architecture" width="600"/>
For more details, please check out our [Overall Architecture](https://www.mindspore.cn/doc/note/zh-CN/master/design/mindspore/architecture.html).
@@ -47,7 +47,7 @@ TensorFlow adopted static computational graphs in its early days, while PyTorch adopted dynamic computational graphs
MindSpore found another approach: automatic differentiation based on source code transformation. On the one hand, it supports automatic differentiation of automatic control flow, so building models is as convenient as in PyTorch. On the other hand, MindSpore can perform static compilation optimization on neural networks to achieve better performance.
<img src="docs/Automatic-differentiation.png" alt="Automatic Differentiation" width="600"/>
<img src="https://gitee.com/mindspore/mindspore/raw/master/docs/Automatic-differentiation.png" alt="Automatic Differentiation" width="600"/>
The implementation of MindSpore automatic differentiation can be understood as symbolic differentiation of the program itself. Because MindSpore IR is a functional intermediate representation, it has an intuitive correspondence with composite functions in basic algebra. The derivation formulas of composite functions built from arbitrary basic functions can be derived, and each primitive operation in MindSpore IR corresponds to a basic function in basic algebra, which allows more complex flow control to be built.
@@ -55,7 +55,7 @@ The implementation of MindSpore automatic differentiation can be understood as symbolic differentiation of the program itself.
The goal of MindSpore automatic parallelism is to build a training method that combines data parallelism, model parallelism, and hybrid parallelism. It can automatically select the model splitting strategy with the lowest cost to achieve automatic distributed parallel training.
<img src="docs/Automatic-parallel.png" alt="Automatic Parallel" width="600"/>
<img src="https://gitee.com/mindspore/mindspore/raw/master/docs/Automatic-parallel.png" alt="Automatic Parallel" width="600"/>
目前MindSpore采用的是算子切分的细粒度并行策略即图中的每个算子被切分为一个集群完成并行操作。在此期间的切分策略可能非常复杂但是作为一名Python开发者您无需关注底层实现只要顶层API计算是有效的即可。
@@ -225,7 +225,7 @@ MindSpore Docker images are hosted on [Docker Hub](https://hub.docker.com/r/mindspore
[ 2. 2. 2. 2.]]]
```
-If you want to learn more about the building process of MindSpore Docker images, please check out the [docker](docker/README.md) repo for details.
+If you want to learn more about the building process of MindSpore Docker images, please check out the [docker](https://gitee.com/mindspore/mindspore/blob/master/docker/README.md) repo for details.
## Quickstart
@@ -250,11 +250,12 @@ MindSpore Docker images are hosted on [Docker Hub](https://hub.docker.com/r/mindspore
## Contributing
-Contributions are welcome. For more details, please see our [Contributor Wiki](CONTRIBUTING.md).
+Contributions are welcome. For more details, please see our [Contributor Wiki](https://gitee.com/mindspore/mindspore/blob/master/CONTRIBUTING.md).
## Branch Maintenance Policy
MindSpore release branches go through the following maintenance phases:
| **State** | **Duration** | **Description** |
|-------------|---------------|--------------------------------------------------|
| Planning | 1 - 3 months | Feature planning. |
@@ -279,8 +280,8 @@ MindSpore release branches go through the following maintenance phases
## Release Notes
-For the release notes, see [RELEASE](RELEASE.md).
+For the release notes, see [RELEASE](https://gitee.com/mindspore/mindspore/blob/master/RELEASE.md).
## License
-[Apache License 2.0](LICENSE)
+[Apache License 2.0](https://gitee.com/mindspore/mindspore/blob/master/LICENSE)

setup.py

@@ -40,7 +40,6 @@ def _read_file(filename):
readme = _read_file('README.md')
-release = _read_file('RELEASE.md')

def _write_version(file):
@@ -196,7 +195,7 @@ setup(
    },
    description='MindSpore is a new open source deep learning training/inference '
                'framework that could be used for mobile, edge and cloud scenarios.',
-    long_description="\n\n".join([readme, release]),
+    long_description="\n\n".join([readme]),
    long_description_content_type="text/markdown",
    packages=find_packages(),
    package_data=package_data,
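
For reference, the packaging pattern after this change condenses to the sketch below; the package name and version are placeholders, not MindSpore's real values.

```python
# Condensed sketch of the pattern after this commit: only README.md
# feeds the wheel's long_description. Name and version are placeholders.
import os
from setuptools import setup, find_packages

def _read_file(filename):
    pkg_dir = os.path.dirname(os.path.realpath(__file__))
    with open(os.path.join(pkg_dir, filename), encoding='UTF-8') as f:
        return f.read()

readme = _read_file('README.md')   # RELEASE.md is intentionally not read

setup(
    name='example-package',
    version='0.0.1',
    long_description=readme,
    long_description_content_type='text/markdown',
    packages=find_packages(),
)
```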