Jingsen Wang
GLM-4.5: Agentic, Reasoning, and Coding (ARC) Foundation Models
We present GLM-4.5, an open-source Mixture-of-Experts (MoE) large language model with 355B total parameters and 32B activated parameters, featuring a hybrid reasoning method that supports both thinking and direct response modes. Through mu…
Towards Reliable Advertising Image Generation Using Human Feedback
In e-commerce, compelling advertising images are pivotal for attracting customer attention. While generative models automate image generation, they often produce substandard images that may mislead customers and require significa…