moe adaptation
commit f944fe7065
parent e2e5850e0c
@@ -1,4 +1,4 @@
-.. py:class:: mindspore.nn.transformer.MoEConfig(expert_num=1, capacity_factor=1.1, aux_loss_factor=0.05, num_experts_chosen=1)
+.. py:class:: mindspore.nn.transformer.MoEConfig(expert_num=1, capacity_factor=1.1, aux_loss_factor=0.05, num_experts_chosen=1, expert_group_size=None)
 
 Configuration of MoE (Mixture of Expert).
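For context, a minimal usage sketch of the updated signature with the newly added `expert_group_size` parameter. The concrete argument values below are illustrative assumptions, not taken from the commit:

```python
# Minimal sketch of constructing MoEConfig with the expert_group_size
# parameter added in this commit. The values (4 experts, group size 64)
# are illustrative assumptions, not from the diff itself.
from mindspore.nn.transformer import MoEConfig

moe_config = MoEConfig(expert_num=4,          # number of experts
                       capacity_factor=1.1,   # per-expert token capacity factor
                       aux_loss_factor=0.05,  # weight of the auxiliary load-balancing loss
                       num_experts_chosen=1,  # experts selected per token
                       expert_group_size=64)  # newly added; defaults to None
```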