
MoE PyTorch

MoE

PyTorch implementation of Sparsely-Gated Mixture-of-Experts (MoE).

Based on the paper "Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer" (Shazeer et al., 2017).
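
The core idea of the sparsely-gated MoE layer is a gating network that, for each input, selects only a small number of experts (top-k) from a larger pool and combines their outputs using the renormalised gate weights. Below is a minimal sketch of such a layer; the class names (`Expert`, `SparseMoE`) and constructor arguments are illustrative assumptions and do not necessarily match the API in this repository.

```python
# Minimal sketch of a sparsely-gated MoE layer (illustrative, not this repo's exact API).
import torch
import torch.nn as nn
import torch.nn.functional as F


class Expert(nn.Module):
    """A simple feed-forward expert network."""
    def __init__(self, input_dim, hidden_dim, output_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, output_dim),
        )

    def forward(self, x):
        return self.net(x)


class SparseMoE(nn.Module):
    """Sparsely-gated MoE: each input is routed to its top-k experts."""
    def __init__(self, input_dim, hidden_dim, output_dim, num_experts=8, k=2):
        super().__init__()
        self.k = k
        self.output_dim = output_dim
        self.experts = nn.ModuleList(
            [Expert(input_dim, hidden_dim, output_dim) for _ in range(num_experts)]
        )
        self.gate = nn.Linear(input_dim, num_experts)

    def forward(self, x):
        # x: (batch, input_dim)
        logits = self.gate(x)                              # (batch, num_experts)
        topk_vals, topk_idx = logits.topk(self.k, dim=-1)  # keep only the top-k experts
        weights = F.softmax(topk_vals, dim=-1)             # renormalise over the selected experts

        out = torch.zeros(x.size(0), self.output_dim, device=x.device)
        for slot in range(self.k):
            idx = topk_idx[:, slot]                        # expert chosen for this slot
            w = weights[:, slot].unsqueeze(-1)
            for e, expert in enumerate(self.experts):
                mask = idx == e                            # inputs routed to expert e
                if mask.any():
                    out[mask] += w[mask] * expert(x[mask])
        return out


if __name__ == "__main__":
    moe = SparseMoE(input_dim=32, hidden_dim=64, output_dim=32, num_experts=4, k=2)
    y = moe(torch.randn(16, 32))
    print(y.shape)  # torch.Size([16, 32])
```

For brevity, this sketch omits the noisy top-k gating and the load-balancing auxiliary loss described in the paper.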
