This article introduces Improved MeanFlow (iMF), the latest one-step generative model from Kaiming He's team. iMF addresses key limitations of the original MeanFlow in training stability, guidance flexibility, and architectural efficiency: it recasts the training objective into a more stable instantaneous-velocity loss, introduces flexible classifier-free guidance (CFG), and adopts an efficient in-context conditioning architecture. On the ImageNet 256×256 benchmark, iMF achieves an FID of 1.72 with a single function evaluation (1-NFE), improving on the original MeanFlow by roughly 50% and matching the performance of multi-step diffusion models. The work was completed jointly by Kaiming He's team and a sophomore from Tsinghua's Yao Class, highlighting the research potential of the younger generation.
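To make the 1-NFE claim concrete: in the MeanFlow family, the network predicts an *average* velocity u(z, r, t) over the interval [r, t], so a sample can be produced from pure noise with one model call, x = z − u(z, 0, 1). The sketch below illustrates only this sampling rule; `mean_velocity` is a hypothetical toy stand-in for the learned network (the real model is a neural net, and the exact iMF parameterization is not shown here).

```python
import numpy as np

def mean_velocity(z, r, t):
    # Hypothetical stand-in for the learned average-velocity network u_theta(z, r, t).
    # In iMF/MeanFlow this is a neural net; a toy linear map suffices to show the API.
    return 0.5 * z * (t - r)

def one_step_sample(noise, model):
    # One-step (1-NFE) generation: a single call to the average-velocity model
    # carries a noise sample at t=1 directly to a data sample at t=0.
    return noise - model(noise, 0.0, 1.0)

rng = np.random.default_rng(0)
z1 = rng.standard_normal((4, 8))      # batch of 4 toy "latents" of dimension 8
x0 = one_step_sample(z1, mean_velocity)
print(x0.shape)  # (4, 8)
```

The point of the sketch is the call count: unlike a multi-step diffusion sampler that loops over many velocity evaluations, the average-velocity formulation collapses sampling into one evaluation.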


