Latest from Google AI – Mixture-of-Experts with Expert Choice Routing
Posted by Yanqi Zhou, Research Scientist, Google Research Brain Team

The capacity of a neural network to absorb information is limited by the number of its parameters, and as a consequence, finding more effective ways to increase model parameters has become a trend in deep learning research. Mixture-of-experts (MoE), a type of conditional computation where…
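The excerpt cuts off before the routing details, but the central idea of expert choice routing in the paper the post announces is that experts select tokens rather than tokens selecting experts: each expert picks the tokens with the highest router affinity for it, up to a fixed capacity. Below is a minimal NumPy sketch of that selection step, assuming a plain softmax router; the function and variable names (expert_choice_route, router_weights, capacity) are illustrative and do not come from the post or its code release.

```python
import numpy as np

def expert_choice_route(tokens, router_weights, capacity):
    """Each expert picks its top-`capacity` tokens by router affinity.

    tokens:         (n_tokens, d_model) token representations
    router_weights: (d_model, n_experts) learned routing projection
    capacity:       number of tokens each expert processes
    Returns (indices, gates), both of shape (n_experts, capacity).
    """
    logits = tokens @ router_weights                 # (n_tokens, n_experts)
    # Token-to-expert affinity: softmax over the expert dimension.
    scores = np.exp(logits - logits.max(axis=-1, keepdims=True))
    scores = scores / scores.sum(axis=-1, keepdims=True)
    # Transpose so each row holds one expert's affinity for every token,
    # then let each expert choose its top-`capacity` tokens.
    per_expert = scores.T                            # (n_experts, n_tokens)
    indices = np.argsort(-per_expert, axis=-1)[:, :capacity]
    gates = np.take_along_axis(per_expert, indices, axis=-1)
    return indices, gates

# Toy usage: 8 tokens, model width 16, 4 experts, capacity 2 tokens per expert.
rng = np.random.default_rng(0)
tok = rng.normal(size=(8, 16))
w = rng.normal(size=(16, 4))
idx, g = expert_choice_route(tok, w, capacity=2)
print(idx.shape, g.shape)  # (4, 2) (4, 2)
```

Because each expert takes exactly `capacity` tokens, the load across experts is balanced by construction, which is the property the routing scheme is designed around; a token may be picked by several experts or by none.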