SeWol.

Layers of Time, Colors of Emotion.


Context

In Brown University’s CSCI 1951C: Designing Humanity-Centered Technologies, I explored near-future scenarios where intelligent systems—akin to today’s Large Language Model-powered products and services—become deeply integrated into our lives. Framing the project within this context of a rapidly evolving technological landscape sparked central questions:

  1. With the accelerating advancement of Artificial Intelligence, how can we design technologies that protect what matters most to us—our privacy, emotions, memories, and our sense of self?

  2. How can intelligent systems create moments of introspection, enabling us to reconnect with ourselves in an increasingly automated world?

For Project SeWol, I designed and developed an intelligent lamp that listens to users within an indoor space and visualizes the accumulation of emotions inspired by the Korean notion of 기운 (Giun) or 気 (Ki)—a subtle energy that permeates spaces, shaped by the actions, events, and emotions that unfold within them.

Team

Dave (🙋🏻)

Project Lead, Engineer

Ming Dong

Designer

Timeline

A 3-week project, Nov 2024

Skills / Tools

Prototyping, Raspberry Pi, Interaction Design, Design Thinking, Sentiment Analysis

Project Ideation 💡

The project began with the idea of using technology to augment human experiences and highlight our intrinsic characteristics. Specifically, I focused on displaying emotions as entities that can accumulate within a space—or, more accurately, as reflections of the emotional build-up within ourselves—in a way that inspires introspection and self-reflection.

The images below showcase the initial ideations and form factor explorations!

Affective Computing
Understanding Our Emotions

Understanding human emotions is a complex and interdisciplinary challenge. Affective computing systems aim to achieve high accuracy in emotion detection or sentiment analysis by leveraging machine learning techniques across diverse data inputs.

When it comes to representing emotions, two primary approaches exist: discrete models, which treat emotions as a fixed set of categories, and dimensional models, which place emotions along continuous axes such as valence and arousal.
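As a rough illustration of the two representations (the labels, axis ranges, and thresholds below are my own assumptions, not the project's), a discrete model enumerates categories while a dimensional model describes a point in a continuous space:

```python
from dataclasses import dataclass

# Discrete model: emotions as a fixed set of labeled categories.
DISCRETE_EMOTIONS = {"joy", "sadness", "anger", "fear", "surprise", "disgust"}

# Dimensional model: emotions as points in a continuous valence-arousal space.
@dataclass
class DimensionalEmotion:
    valence: float  # -1.0 (negative) .. 1.0 (positive)
    arousal: float  #  0.0 (calm)     .. 1.0 (excited)

    def to_discrete(self) -> str:
        """Coarse, illustrative mapping from the continuous space to a label."""
        if self.valence > 0.25:
            return "joy" if self.arousal > 0.5 else "contentment"
        if self.valence < -0.25:
            return "anger" if self.arousal > 0.5 else "sadness"
        return "neutral"

calm_negative = DimensionalEmotion(valence=-0.6, arousal=0.2)
print(calm_negative.to_discrete())  # sadness
```

The dimensional view is attractive for a lamp like SeWol because continuous values map naturally onto continuous color gradients, while discrete labels map onto a small color palette.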

Design
Functional Prototype

The goal of the functional prototype was to create a working model for observing how users interacted with the system as it performed similar—but simplified—tasks. For the prototype, a Raspberry Pi 4 Model B performed the input computation.

The functional prototype primarily utilized its microphone to capture conversation content, performing sentiment analysis on the transcribed text. A detailed schema of the pipeline is displayed below.
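The pipeline can be sketched roughly as follows. The function names and the keyword-based classifier here are toy placeholders of my own, standing in for the prototype's actual speech-to-text and sentiment-analysis stages:

```python
# Toy stand-ins for each pipeline stage; the real prototype used a
# microphone, a speech-to-text step, and a sentiment classifier.

def transcribe(audio_chunk: bytes) -> str:
    # Placeholder: decode captured audio to text. In the prototype this
    # stage would be a proper speech-to-text service.
    return audio_chunk.decode("utf-8")

# Tiny illustrative keyword lexicon (not the project's classifier).
POSITIVE = {"great", "happy", "love"}
NEGATIVE = {"sad", "angry", "hate"}

def classify_sentiment(text: str) -> str:
    words = set(text.lower().split())
    if words & NEGATIVE:
        return "negative"
    if words & POSITIVE:
        return "positive"
    return "neutral"

def process_utterance(audio_chunk: bytes) -> str:
    """microphone capture -> transcription -> sentiment label"""
    return classify_sentiment(transcribe(audio_chunk))

print(process_utterance(b"I love this rainy afternoon"))  # positive
```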

Final Prototype

The final prototype classifies sentiment into three discrete categories—negative, neutral, and positive—and uses corresponding colors to visualize the accumulation of emotions:

  • Red: Negative

  • Blue: Positive

  • White: Neutral

While the intended design featured subtle shades of red and blue, more vibrant tones were used in the demo for clarity and emphasis.
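As an illustrative sketch of how such a mapping could drive the lamp (the RGB values and the averaging rule are my assumptions, not the project's actual implementation), accumulated sentiment labels can be blended into a single hue:

```python
# Hypothetical color mapping; the project describes subtle shades of
# red and blue, with more vibrant tones used in the demo.
SENTIMENT_COLORS = {
    "negative": (255, 0, 0),      # red
    "positive": (0, 0, 255),      # blue
    "neutral":  (255, 255, 255),  # white
}

def blend_accumulated(history):
    """Average the colors of all sentiments seen so far, so the lamp's
    hue reflects the emotional build-up in the space over time."""
    if not history:
        return SENTIMENT_COLORS["neutral"]
    colors = [SENTIMENT_COLORS[label] for label in history]
    n = len(colors)
    return tuple(sum(c[i] for c in colors) // n for i in range(3))

print(blend_accumulated(["negative", "positive"]))  # (127, 0, 127)
```

Averaging is only one possible accumulation rule; a decaying weighted sum, for instance, would let older emotions fade from the space the way 기운 is imagined to dissipate.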

Understanding Our Emotions
Developing a Multimodal Algorithm
Work in Progress!

I am currently enhancing the algorithmic aspects of the prototype to advance the complexity of the emotion classification model. This involves shifting from simple text-based sentiment analysis using the OpenAI API to a multimodal approach that incorporates inputs from voice, text, and heart rate readings.
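One common way to combine such modalities is late (decision-level) fusion, where each modality produces its own estimate and the estimates are merged. The sketch below is a hypothetical illustration with made-up weights and thresholds, not the project's actual algorithm:

```python
# Hedged sketch of late fusion across text, voice, and heart-rate inputs.
# Modality weights, score ranges, and thresholds are illustrative assumptions.

def fuse_valence(text_score: float, voice_score: float, hr_score: float,
                 weights=(0.5, 0.3, 0.2)) -> float:
    """Each modality yields a valence estimate in [-1, 1]; combine them
    as a weighted average into a single fused estimate."""
    scores = (text_score, voice_score, hr_score)
    return sum(w * s for w, s in zip(weights, scores))

def to_label(valence: float) -> str:
    """Map the fused valence back to the lamp's three discrete categories."""
    if valence > 0.2:
        return "positive"
    if valence < -0.2:
        return "negative"
    return "neutral"

fused = fuse_valence(text_score=0.8, voice_score=0.4, hr_score=-0.1)
print(to_label(fused))  # positive
```

A fusion scheme like this degrades gracefully: if one sensor (say, the heart-rate reading) is noisy or absent, the remaining modalities still carry most of the signal.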

© 2025 Dave Song. All rights reserved.

Made with 🍞 and 🧈
