!37070 [Auto parallel] MoE adaptation

Merge pull request !37070 from Xiaoda/code_docs_136-moe-adpation
i-robot 2022-07-02 06:18:58 +00:00 committed by Gitee
commit dde6ce863d
1 changed file with 1 addition and 1 deletion


@@ -1,4 +1,4 @@
-.. py:class:: mindspore.nn.transformer.MoEConfig(expert_num=1, capacity_factor=1.1, aux_loss_factor=0.05, num_experts_chosen=1)
+.. py:class:: mindspore.nn.transformer.MoEConfig(expert_num=1, capacity_factor=1.1, aux_loss_factor=0.05, num_experts_chosen=1, expert_group_size=None)
 The configuration of MoE (Mixture of Expert).
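
For context, a minimal usage sketch of the updated signature. The import path and keyword names follow the documented class above; the specific values (expert_num=4, expert_group_size=64) are illustrative assumptions, not recommendations:

```python
from mindspore.nn.transformer import MoEConfig

# Construct an MoE configuration that exercises the new
# expert_group_size keyword added by this change. When left as
# None (the default), the pre-change behavior is preserved.
moe_config = MoEConfig(expert_num=4,          # assumed: number of experts
                       capacity_factor=1.1,
                       aux_loss_factor=0.05,
                       num_experts_chosen=1,
                       expert_group_size=64)  # assumed: tokens per expert group
```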