Meta's multimodal mixture-of-experts (MoE) model with 109B total parameters across 16 experts (17B active per token) and a 10M-token context window, intended for general-purpose AI tasks.