Oops, please try Firefox; the light may not follow your hovering mouse on other browsers :p
Jumping sheep GIF from giphy.com
(Try this prototype on a larger screen for better display and interactivity)
This project is my interview design exercise for ling.ai, an artificial intelligence technology brand focusing on home and education AI assistant products. I was challenged to analyze the landscape of the then-current (Spring 2018) smart speaker market and propose a conceptual design detailing both digital and physical interaction.
My idea is that the speaker is smart because of the little creature living inside it. Light and voice are the main media for interactivity. ling.ai already has a robot product that leverages mechanical movement, which is probably not suitable for a speaker. Meanwhile, ling.ai also has excellent VUI technology, so my idea is to integrate changing light with their existing VUI to bring the product to life.
While your little creature is hiding in the speaker, it can sense your hand gestures and follow you around, as if it wants to break out of its egg.
Different lighting patterns can express various emotions or functions, alone or together with voice:
Going beyond current technology, one day your little creature may even pop up in front of you with the help of a hologram projector.
Project:Individual interaction concept design
Duration:1 Week, Feb 2018
Waiting at laundromats can be frustrating, but downloading an app just to check machine availability is, um, too much trouble. An adorable chatbot you can reach at the same number you usually use to call the laundromat staff is an easier solution.
I did this project based on some previous assignments from my user research methods class. A partially functional chatbot was built with Ruby and Twilio. Here are the bot personality worksheet and a simple flowchart:
The bot is based on SMS/MMS, so it shares the laundromat's existing phone number and is therefore more accessible and acceptable to existing customers. This also helps keep the application small, smooth, and elegant. However, platforms like Messenger provide more opportunities for complex interactions beyond what plain text and images can convey, so they remain an interesting design space.
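The flowchart's keyword-routing idea can be sketched in plain Ruby. This is an illustrative sketch only: the keywords and replies below are assumptions, and the real bot runs behind a Twilio SMS webhook rather than as a standalone method.

```ruby
# Minimal sketch of keyword routing for an SMS laundromat bot.
# Keywords and reply text are hypothetical, not from the actual bot.
def reply_for(message)
  case message.strip.downcase
  when /status|available/
    "2 washers and 1 dryer are free right now."
  when /hours|open/
    "We're open 7am-11pm every day."
  when /help/
    "Text STATUS for machine availability or HOURS for opening times."
  else
    "Sorry, I didn't get that. Text HELP to see what I can do."
  end
end

puts reply_for("Is a washer available?")
```

In production, Twilio delivers each incoming SMS to a webhook, and the webhook returns the reply as TwiML; the routing logic itself stays this simple.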
Project:A simple chatbot prototype based on my previous user research course assignment
Duration:User Research: Sep - Oct 2017Chatbot: 2 weeks, Oct 2017
Tools:Ruby, Twilio, Heroku
I made a quick presentation for Zensors, a computer vision and natural language processing powered start-up on the CMU campus, to demonstrate my understanding of their product while contributing my perspective as a fresh pair of eyes. Zensors realizes the idea of IoT by using CV to understand video collected from CCTVs, thereby turning these devices into "sensors". I was asked to create a webpage targeting a specific segment of their users.
I suggested applying this idea to assist public and retail space design. The inspiration came from an earlier conversation, during my design research internship, with a colleague experienced in operating brick-and-mortar stores. Zensors' product can greatly help us understand customers' paths through a retail space and hence validate various design decisions such as communication signage, the arrangement of goods, etc. Here are the webpage and a brief of my research process:
Project:Research, product concept, mid-fidelity website quick demonstration
Duration:One week, Spring 2019
I teamed up with four computer science and electrical engineering students to participate in the two-week CMU Impact-A-Thon 2017. Teams were asked to identify, articulate, and develop a product/service/system solution for medical relief in emergencies caused by natural disasters such as storms, earthquakes, wildfires, floods, and tornadoes, which put people, communities, and infrastructure at risk and require rapid response in a variety of ways.
Considering the potential absence of power supply and radio communication devices, we proposed a novel low-cost system, based on adapting the existing light source on a life jacket, to quickly connect injured people with proper medical assistance. We prototyped the basic idea with Arduino and OpenCV.
Our system scans flooded areas with drones to map people's injuries and locations after the flood. After receiving images from the drones, artificial intelligence detects injured people and interprets the messages conveyed by their vests. This information forms a rescue queue, prioritized by the severity of injury and the time elapsed.
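The rescue-queue prioritization described above can be sketched as a simple sort. The field names and the ordering rule (severity first, then waiting time) are assumptions for illustration, not the actual system:

```ruby
# Illustrative rescue-queue sketch: more severe injuries come first;
# among equal severities, longer waits come first.
# Severity scale (1 = minor .. 5 = critical) is a hypothetical assumption.
Victim = Struct.new(:id, :severity, :minutes_waiting)

def rescue_queue(victims)
  victims.sort_by { |v| [-v.severity, -v.minutes_waiting] }
end

victims = [
  Victim.new("A", 2, 40),
  Victim.new("B", 5, 10),
  Victim.new("C", 2, 90),
]
puts rescue_queue(victims).map(&:id).join(", ")  # => B, C, A
```

In a real deployment the queue would also be updated continuously as new drone imagery arrives, rather than sorted once.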
Competition:Impact-A-Thon 2017 (a university-wide design challenge / Hackathon)
Prize:Third place
My role:Hardware prototyping, presentation material and setting
Collaborators:2 ECE students, 2 CS students
Duration:Two weeks, Fall 2017
This is my undergraduate capstone project. We made a hearth that is alive and smart: it controls the combustion of the wood inside its chamber by itself. With the help of shape memory alloy, the responsive façade of the hearth adjusts the airflow into and out of the chamber according to temperature variation.
The hearth tempers peak combustion and prolongs the burn of the same amount of fuel, hence saving energy.
The hearth "wakes up" and begins to close its "umbrellas" at 60°C / 140°F, approximately the upper limit that human skin can withstand.
Projects:Undergraduate Capstone Project (in a group of 3 people)
My roles:Industrial design/modeling, structure design, prototyping
Duration:Jan - May 2017
Spring 2018 | Course Link
Starting from David Rose's Enchanted Objects, we explored the relationships between humans, technology, and smart products and systems.
Fall 2018, Audit, Individual Assignments | Course Link
I learned basic concepts, along with mathematical and programming details, by sitting in on the master's-level introduction to machine learning course from the School of Computer Science.
Coursework approached VR/AR from different perspectives, including design, development, and literature survey.
Design and engineering course projects covering multiple stages of the data pipeline, such as sensing, cleaning, exploring, visualizing, and modeling.