!37767 Fix Vocab Embedding Doc Error
Merge pull request !37767 from huangxinjing/code_docs_fix_vocab_embedding2
Commit 1a29dca199
@@ -21,7 +21,7 @@

     Tuple, a tuple containing (`output`, `embedding_table`).

     - **output** (Tensor) - The embedding lookup result, with shape (batch_size, seq_length, embedding_size).
-    - **weight** (Tensor) - The embedding table, with shape (vocab_size, embedding_size).
+    - **embedding_table** (Tensor) - The embedding table, with shape (vocab_size, embedding_size).

     **Raises:**
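For context (not part of the patch), a minimal sketch of how the renamed second return value is unpacked, assuming the layer is importable as mindspore.nn.transformer.VocabEmbedding and accepts an input_ids Tensor of shape (batch_size, seq_length); the exact import path and defaults may differ between MindSpore versions:

    >>> # Hedged sketch: vocab_size/embedding_size and input shape are illustrative only.
    >>> import numpy as np
    >>> import mindspore as ms
    >>> from mindspore.nn.transformer import VocabEmbedding
    >>> model = VocabEmbedding(vocab_size=30, embedding_size=30)
    >>> input_ids = ms.Tensor(np.ones((20, 15)), ms.int32)
    >>> output, embedding_table = model(input_ids)
    >>> print(output.shape)            # (batch_size, seq_length, embedding_size)
    (20, 15, 30)
    >>> print(embedding_table.shape)   # (vocab_size, embedding_size)
    (30, 30)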
@@ -285,7 +285,6 @@ def get_group_size(group=GlobalComm.WORLD_COMM_GROUP):

     Examples:
         >>> import mindspore as ms
         >>> from mindspore.communication.management import init, get_group_size
         >>> ms.set_context(device_target="Ascend")
         >>> ms.set_auto_parallel_context(device_num=8)
         >>> init()
         >>> group_size = get_group_size()
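As a usage note rather than part of the patch: the returned group_size is commonly used to size per-device work, for example splitting a global batch evenly across the 8-device group configured above. A minimal sketch with an illustrative batch size:

    >>> global_batch_size = 256                        # illustrative value, not from the patch
    >>> per_device_batch = global_batch_size // group_size
    >>> per_device_batch                               # 32 when group_size == 8
    32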