Smart Wardrobe

Bryan Carbaugh and Nina Lutz. 6.S063 Final Project.

About

What if furniture was interface?

This was our guiding question for this project. Furniture is such a central part of our space and routine. Through this project we wanted to explore and design what might go into an interactive wardrobe.

We chose a wardrobe because it is a central part of one's morning routine and is something that you interact with differently every day, based on what you have to wear and where you are going. Usually these decisions are made on your phone: checking your calendar and the weather to determine your outfit. Not to mention that you first wake up to a blaring phone alarm that you snooze.

We decided to put this part of your morning routine, waking up and picking your outfit, into this interactive wardrobe. This wardrobe is meant to be an assistive presence. And yes, a little smart.

Below is a short demo video of our prototype explaining some of its features.

Below is the system diagram we intended for this prototype.

Software

Currently the software consists of two Processing scripts. Processing is an open-source framework for rapid prototyping. Using UDP, we transmit data over Wi-Fi from a heavy back-end script to the light front-end script that the user interacts with. This split is because we designed the front-end script to run on a 32-bit Raspberry Pi.
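The back-end-to-front-end link can be sketched as a simple datagram exchange. This is a minimal illustration in Python rather than Processing (which uses its own UDP library), and the port number and message format here are invented for the example, not taken from our actual scripts.

```python
import socket

FRONT_END_PORT = 6100  # hypothetical port the front end listens on

# Front end: bind a socket and wait for updates from the back end.
front_end = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
front_end.bind(("127.0.0.1", FRONT_END_PORT))

# Back end: push a status message over the network (loopback here,
# Wi-Fi in the real system).
back_end = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
back_end.sendto(b"CALENDAR:meeting@09:00", ("127.0.0.1", FRONT_END_PORT))

# Front end receives the message and would parse it into GUI state.
data, _ = front_end.recvfrom(1024)
print(data.decode())
```

The datagram model fits this setup well: messages are small, occasional, and the front end can simply render the most recent one it received.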

The front-end script runs the voice cues, the GUI visualizations, and the state-machine-like logic that calls upon the door to move. The back end handles the OpenCV setup for clothing inventory, HTTP requests to the Google Calendar API, Leap Motion handling, and the UDP connection.
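The front end's state-machine-like logic can be sketched as a transition table. The state and event names below are invented for illustration; the real sketch has more states and ties transitions to the alarm, door motor, and LEDs.

```python
# Toy transition table: (current state, event) -> next state.
TRANSITIONS = {
    ("SLEEPING", "alarm_time"): "ALARM",
    ("ALARM", "swipe"): "RECOMMENDING",
    ("RECOMMENDING", "outfit_chosen"): "SLEEPING",
}

def step(state, event):
    """Return the next state; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "SLEEPING"
for event in ["alarm_time", "swipe", "outfit_chosen"]:
    state = step(state, event)
    # e.g. entering RECOMMENDING would slide the door open and cue the LEDs
print(state)  # back to SLEEPING after the morning routine completes
```

Keeping the logic as a lookup table makes it easy to add interactions later without rewriting control flow.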

For our demo we used our laptops due to the finals-season-appropriate Raspberry Pi shortage. Parts of the demo are hardcoded, as this change delayed the rest of our system integration.

Electronics and Fabrication

For electronics we utilized addressable LEDs, a servo, and power management components, such that we could run everything from a central 12V power supply while using buffers to step the voltage down for some components. The electronics themselves can be wired to a Raspberry Pi or microcontroller and were installed into the wardrobe itself.

We fabricated the wardrobe from wooden panels and 2x4 beams. This shell is composed as seen below.

Above; some fabrication photos, shell drawing.

We then fabricated a door frame to hold a monitor and hung it on a rolling door track purchased from a hardware store, so the door rolled along the track while carrying the monitor. To motorize the track we added a rack and pinion from McMaster-Carr. At first we used a laser-cut version; while this worked for the frame and assured us the mechanism was sound, the acrylic couldn't quite handle the full torque.

Above; the final rack and pinion system and laser cut version.

Alongside the rack and pinion are the lights and microcontrollers that handle the electronics. The lights were installed in various parts of the wardrobe in the hopes of adding more interaction and guiding users to the correct clothes in a timely manner. There is also a camera installed to see the clothes and feed video to the main computer for the OpenCV code.

The user interacts with the mirror. A monitor in the door frame is covered by a piece of half-silvered acrylic to achieve the smart-mirror effect; we made this by tightly pressing car window tint onto clear acrylic. A Leap Motion sensor mounted on the door enables gesture control, as no one wants a smudged mirror.

Above; a closer look at the mirror, some mounting of LEDs, and a mounted camera.

Below is the diagram for the final system as well as some photos of the prototype we actually demoed on the 13th.

Interactions

Currently the interaction flow revolves around a user waking up in the morning to the wardrobe's alarm, swiping the alarm away, and getting an outfit recommendation from the voice assistant and cueing LEDs. Below is a marked-up model with the GUI in green, the LEDs in purple, the power brick in light blue, and the motion sensor in pink.

Files and materials

Things we built. Code we wrote. Photos of some moments.

Code

The code is available on our project GitHub.

Materials List

The materials we used can be found here.

Photos

Some lovely photos of our progress can be found here.

Future

There is future work planned for this project: finishing robust computer vision software to classify various clothing types, and getting a new Raspberry Pi so we can return to the original, proper integration we had intended.

But even more so, we want to prototype different and more advanced interactions, such as a system that understands the emotions of the person in the mirror and possibly monitors things like skin, sleep, etc. There are many possible implementations that this type of interface could lead to.