Noah Qin
2 min read

Reflections: Vivo Developer Conference 2025

Witnessing the intersection of BlueLM AI and humanistic technology.

Vivo Developer Conference Stage

On October 10, 2025, I stepped out of the classroom and into the Vivo Developer Conference (VDC). It was my first time attending a major industry event not as a student spectator, but as an active iOS developer.

While BlueLM, Vivo’s on-device large language model, was technically impressive in showing how AI is moving from the cloud to the edge, what truly moved me was something else entirely.

Technology’s Ultimate Goal

The highlight for me was the focus on Humanistic Care. As one speaker put it: “The ultimate goal of technology is to let people feel happiness and arrive at beauty.”

When the light of technology shines in, the needs of hidden groups are finally seen.

“Heard” and “Seen”

Vivo introduced two feature suites that bridge the gap for hearing-impaired and visually impaired users:

  • vivo Heard: For hearing-impaired users, the phone can now separate speakers in a chaotic multi-person conversation and flag key environmental sounds (like a doorbell or a boiling kettle) with vibration alerts; I sketch that idea in code after this list.
  • vivo Seen: Using “Smart Memory” and “Environment Q&A,” the phone becomes a pair of eyes, describing friends’ faces and surroundings in text or voice so visually impaired users can explore the world with dignity.
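
As an iOS developer, I couldn’t help imagining how the core of the Heard idea might look on the platform I know. The sketch below is purely my own guess, not Vivo’s implementation: it uses Apple’s SoundAnalysis framework to classify microphone audio and fires a haptic alert when a sound of interest is heard. The label names in alertLabels are placeholders I made up for illustration.

```swift
import AVFoundation
import SoundAnalysis
import UIKit

// My own sketch of the "important sound -> vibration" idea, not Vivo's code.
// Requires microphone permission (NSMicrophoneUsageDescription).
final class SoundAlerter: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?
    private let haptics = UINotificationFeedbackGenerator()
    // Placeholder labels; real identifiers come from the classifier's knownClassifications.
    private let alertLabels: Set<String> = ["door_bell", "smoke_detector", "water_boiling"]

    func start() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        // Stream microphone buffers into the sound classifier.
        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }

    // Called by SoundAnalysis for every classification result.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8,
              alertLabels.contains(top.identifier) else { return }
        // A real accessibility feature would pair this with a persistent visual banner.
        DispatchQueue.main.async { self.haptics.notificationOccurred(.warning) }
    }
}
```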

Event Atmosphere

My Takeaway

Seeing these features live made me realize that engineering isn’t just about optimization or logic; it’s about empathy. Code can be a bridge.

Leaving the conference, I felt more motivated than ever to push the boundaries of what a high schooler can create.