TAG-MoE: Task-Aware Gating for Unified Generative Mixture-of-Experts

Yu Xu, Hongbin Yan, Juan Cao, Yiji Cheng, Tiankai Hang, Runze He, Zijin Yin, Shiyi Zhang, Yuxin Zhang, Jintao Li, Chunyu Wang, Qinglin Lu, Tong-Yee Lee, Fan Tang
arXiv ID: 2601.08881
Published: January 12, 2026
Authors: 14
Hugging Face Likes: 11
Comments: 2

Abstract

Unified image generation and editing models suffer from severe task interference in dense diffusion transformer architectures, where a shared parameter space must compromise between conflicting objectives (e.g., local editing vs. subject-driven generation). While the sparse Mixture-of-Experts (MoE) paradigm is a promising solution, its gating networks remain task-agnostic: they operate on local features and are unaware of global task intent. This task-agnostic nature prevents meaningful specialization and fails to resolve the underlying task interference. In this paper, we propose a novel framework to inject semantic intent into MoE routing. We introduce a Hierarchical Task Semantic Annotation scheme to create structured task descriptors (e.g., scope, type, preservation). We then design Predictive Alignment Regularization to align internal routing decisions with the task's high-level semantics. This regularization evolves the gating network from a task-agnostic executor into a task-aware dispatch center. Our model effectively mitigates task interference, outperforming dense baselines in fidelity and quality, and our analysis shows that experts naturally develop clear, semantically correlated specializations.
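The paper's exact formulation is not given on this page, but the abstract's core idea, conditioning MoE routing on a global task descriptor and regularizing routing decisions toward task semantics, can be sketched. Below is a minimal PyTorch sketch assuming a learned task embedding, a top-k router over concatenated token and task features, and a KL-divergence alignment term; the class and parameter names (TaskAwareGate, task_prior, n_task_codes, etc.) are hypothetical and not the authors' API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskAwareGate(nn.Module):
    """Hypothetical task-aware MoE gate: routing logits depend on both
    local token features and a global task-descriptor embedding."""

    def __init__(self, d_model: int, n_experts: int, n_task_codes: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Embeds structured task descriptors (e.g., scope/type/preservation codes).
        self.task_embed = nn.Embedding(n_task_codes, d_model)
        # Router sees token features concatenated with the task embedding.
        self.router = nn.Linear(2 * d_model, n_experts)
        # Predicts an expected expert distribution directly from the task
        # descriptor; used as the alignment target (our assumption).
        self.task_prior = nn.Linear(d_model, n_experts)

    def forward(self, x: torch.Tensor, task_ids: torch.Tensor):
        # x: (batch, seq, d_model); task_ids: (batch,) integer task codes
        t = self.task_embed(task_ids)                       # (batch, d_model)
        t_tok = t.unsqueeze(1).expand(-1, x.size(1), -1)    # broadcast to tokens
        logits = self.router(torch.cat([x, t_tok], dim=-1))
        probs = F.softmax(logits, dim=-1)                   # (batch, seq, n_experts)

        # Predictive-alignment-style regularizer (sketch): pull the mean
        # routing distribution toward a task-conditioned prior via KL.
        mean_route = probs.mean(dim=1)                      # (batch, n_experts)
        prior = F.log_softmax(self.task_prior(t), dim=-1)   # log-probs for kl_div
        align_loss = F.kl_div(prior, mean_route, reduction="batchmean")

        # Standard sparse top-k dispatch: indices and weights per token.
        top_val, top_idx = probs.topk(self.top_k, dim=-1)
        return top_idx, top_val, align_loss
```

In training, align_loss would presumably be added to the main diffusion objective with a small weight so that routing specializes along task semantics without overriding the load-balancing signal; at inference only the top-k indices and weights drive expert dispatch.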
