!37070 [Auto parallel] MoE adaption
Merge pull request !37070 from Xiaoda/code_docs_136-moe-adpation
Commit dde6ce863d
@@ -1,4 +1,4 @@
-.. py:class:: mindspore.nn.transformer.MoEConfig(expert_num=1, capacity_factor=1.1, aux_loss_factor=0.05, num_experts_chosen=1)
+.. py:class:: mindspore.nn.transformer.MoEConfig(expert_num=1, capacity_factor=1.1, aux_loss_factor=0.05, num_experts_chosen=1, expert_group_size=None)

     Configuration for MoE (Mixture of Expert).
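For context, a minimal usage sketch of the updated signature shown above. Only the keyword names and defaults come from this diff; the argument values below are illustrative, and the semantics of the new expert_group_size keyword are not described in this hunk beyond its default of None (which keeps the previous behaviour).

    # Minimal sketch, assuming the mindspore.nn.transformer module layout
    # implied by the docstring above; values are illustrative only.
    from mindspore.nn.transformer import MoEConfig

    moe_config = MoEConfig(
        expert_num=4,             # default is 1 per the signature above
        capacity_factor=1.1,      # default from the signature above
        aux_loss_factor=0.05,     # default from the signature above
        num_experts_chosen=1,     # default from the signature above
        expert_group_size=64,     # new keyword added by this change; defaults to None
    )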