moe adaptation

This commit is contained in:
Xiaoda Zhang 2022-07-01 16:00:15 +08:00
parent e2e5850e0c
commit f944fe7065
1 changed file with 1 addition and 1 deletion


@@ -1,4 +1,4 @@
-.. py:class:: mindspore.nn.transformer.MoEConfig(expert_num=1, capacity_factor=1.1, aux_loss_factor=0.05, num_experts_chosen=1)
+.. py:class:: mindspore.nn.transformer.MoEConfig(expert_num=1, capacity_factor=1.1, aux_loss_factor=0.05, num_experts_chosen=1, expert_group_size=None)
 Configuration for MoE (Mixture of Expert).
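
For context, a minimal usage sketch of the updated constructor. The import path and parameter names come from the signature in the diff above; the concrete values (expert_num=4, expert_group_size=64, etc.) are illustrative assumptions, not part of the commit.

```python
from mindspore.nn.transformer import MoEConfig

# Minimal sketch of the updated MoEConfig signature. All values below
# are illustrative assumptions; defaults shown in comments are from the diff.
moe_config = MoEConfig(
    expert_num=4,           # number of experts (default 1)
    capacity_factor=1.1,    # per-expert token capacity multiplier (default 1.1)
    aux_loss_factor=0.05,   # weight of the auxiliary load-balancing loss (default 0.05)
    num_experts_chosen=1,   # experts selected per token (default 1)
    expert_group_size=64,   # parameter added by this commit (default None)
)
```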