
QwenChat
Technology-Driven Emerging UX Challenges and Early Implications for Human–LLM Interaction
Year
2025
Project Length
2 Months
Platform
Web-based Application, Mobile App
When I interned as a UX Designer at Alibaba QwenChat, I worked on daily design tasks for the Qwen3 model; meanwhile, I led user experience research on how UX could evolve alongside the development of AI models.
My Role
UX Design Intern
Timeline
2025.06-2025.08
Team Member
1 UX Designer
1 Design Lead
4 Front-end Developers
2 Algorithm Experts
Tools
MasterGo, Keynote

Problem
Rapid LLM Evolution
Large language models (LLMs) have advanced at an unprecedented pace, reshaping how people interact with technology and making inroads into countless aspects of everyday life. Grounded in transformer-based neural networks and self-supervised training on massive datasets, LLMs have made rapid progress over the past two years.
Unstable Human–LLM Interaction
Users cannot form stable mental models or sustain engagement as LLM versions and behaviors continually evolve.
Design For Uncertainty
Emerging UX challenges show that the interaction landscape itself is becoming more fluid and harder to settle into stable patterns.
Process
Technology-Driven Emerging UX Challenges and Early Implications for Human–LLM Interaction
My Contribution

The Result
NDA Content
Reflection
When Technology Leads Design
Working with QwenChat taught me that when technology leads design, the designer’s role shifts from solving known user pain points to interpreting and shaping new possibilities created by the technology. Instead of designing fixed workflows, designers must think in terms of exploration—understanding model capabilities, anticipating unexpected behaviors, and imagining how new interactions might emerge. This also requires designers to work much more closely with algorithm engineers, continuously translating technical capabilities and limitations into experiences users can understand and trust.
When Designers Stop Designing for Fixed Systems
Working on AI products taught me that designers can no longer assume the system is stable, predictable, or fully defined before design begins. Instead of polishing a fixed flow, designers need to think in terms of changing behaviors, incomplete boundaries, and evolving capabilities. That shift requires a different mindset: staying close to the technology, learning from model behavior, and designing flexible experiences that can absorb uncertainty rather than hide it. In this context, design is less about defining one ideal interaction and more about building structures that can adapt as the system continues to change.
When Design Expands Beyond the Interface
Through my work on AI products, I learned that design is no longer limited to shaping screens or refining flows. The quality of the experience also depends on model behavior, system logic, and how technical capabilities are translated into something users can understand and trust. This means designers need to think beyond the interface itself—considering uncertainty, collaboration with engineers, and the evolving relationship between system capability and user expectation. In this context, design becomes not just about what users see, but about how the entire system works behind what they see.

