Projects
I'm a big fan of robotics, and I've worked on many personal projects in the field, both big and small. Most of my recent work has been in microrobots and swarm robotics, but I've also taken on projects across the field, from robotic grasping to autonomous optical testing stations. I also enjoy branching out into other parts of the tech stack, and I've done projects in areas like computer vision, machine learning, and server management.
I've listed some of my favorite personal projects below:
The Creator - Automated Robotic Analysis and Construction of MEGABLOKS Structures
Four friends and I programmed a robot to inspect a structure built out of MEGABLOKS and autonomously replicate it. We used the time-of-flight depth camera of a Microsoft Kinect for Xbox One and computer vision techniques to reconstruct which blocks compose a structure, including blocks hidden from the camera's view. This structure map was then sent to a Sawyer robotic arm to pick and place blocks in their respective positions. We gripped the blocks with a custom pneumatic balloon, which, unlike a two-fingered gripper, let us pick and place blocks directly adjacent to one another.
I designed this system as my final project for EECS C106A (Introduction to Robotics) at UC Berkeley, which generously supplied the Baxter and Sawyer robotic arms for the project. The robots were programmed using ROS.
The Creator setup
Demonstration of the Creator building a pyramid
A composite graphic of the computer vision setup and processing, including (from left to right): recording top-down depth maps using an AR tag for calibration, voxelizing the depth map into a MEGABLOK-sized grid, reconstructing which blocks are needed to generate the desired depth map, and postprocessing to remove impossible structures and account for noisy measurements.
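As a rough illustration of the voxelization step, here is a minimal sketch in Python. The block dimensions and helper names are hypothetical placeholders, not our actual calibration values:

```python
import numpy as np

# Hypothetical MEGABLOK footprint and height in millimeters;
# the real values came from the AR-tag calibration.
BLOCK_W, BLOCK_D, BLOCK_H = 31.8, 15.9, 21.0

def voxelize(depth_map_mm, px_per_mm):
    """Quantize a top-down height map into a grid of block-sized cells.

    depth_map_mm: 2D array of heights above the table, in millimeters.
    px_per_mm: image scale recovered from the AR-tag calibration.
    Returns an integer grid where each cell holds the stack height in blocks.
    """
    cell_h = int(BLOCK_D * px_per_mm)  # cell size along image rows
    cell_w = int(BLOCK_W * px_per_mm)  # cell size along image columns
    rows = depth_map_mm.shape[0] // cell_h
    cols = depth_map_mm.shape[1] // cell_w
    grid = np.zeros((rows, cols), dtype=int)
    for r in range(rows):
        for c in range(cols):
            cell = depth_map_mm[r * cell_h:(r + 1) * cell_h,
                                c * cell_w:(c + 1) * cell_w]
            # Median height is robust to depth noise at block edges.
            grid[r, c] = int(round(float(np.median(cell)) / BLOCK_H))
    return grid
```

The reconstruction stage then only has to explain each cell's stack height with valid block placements, including blocks the camera never saw directly.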
Predators at Cal - Simulating Predator/Prey Interactions with TurtleBots
Interested in applications for bio-mimetic swarm robotics, I worked in a group of three to simulate predator avoidance strategies using TurtleBots. We devised a realistic model for the prey's line of sight and roaming strategies and extended the ROS Rapidly Exploring Random Tree (RRT) library to incorporate success metrics for both the predator and the prey. We then demonstrated the success of our algorithms in both simulation and controlled real-world trials.
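For context, here is a minimal sketch of the core RRT step we built on, in plain Python with a simplified 2D point model. The step size and collision callback are illustrative assumptions, not our actual predator-aware extension:

```python
import math

STEP = 0.2  # illustrative step size, in meters

def nearest(tree, q):
    """Return the existing tree node closest to the sample q."""
    return min(tree, key=lambda n: math.dist(n, q))

def steer(q_near, q_rand, step=STEP):
    """Move from q_near toward q_rand by at most `step`."""
    d = math.dist(q_near, q_rand)
    if d <= step:
        return q_rand
    t = step / d
    return (q_near[0] + t * (q_rand[0] - q_near[0]),
            q_near[1] + t * (q_rand[1] - q_near[1]))

def rrt_step(tree, parents, sample_free, collision_free):
    """One RRT iteration: sample a point and extend the tree toward it."""
    q_rand = sample_free()
    q_near = nearest(tree, q_rand)
    q_new = steer(q_near, q_rand)
    if collision_free(q_near, q_new):
        tree.append(q_new)
        parents[q_new] = q_near
```

One natural way to incorporate the success metrics (my illustration, not necessarily our exact scheme) is to reject or penalize candidate edges that cross the predator's modeled line of sight.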
I designed this system as my final project for EECS C106B (Introduction to Robotics) at UC Berkeley, which generously supplied the TurtleBots for the project.
See the full project page here, or read the research paper on the project here!
Demo using a real robot, where AR tags are used to simulate predators. The AR tag's orientation indicates the predator's line of sight.
Demo in simulation. As shown, the prey avoids the area near the predator (indicated by the red dot) and roams around the lower half of the map.
Adding a Dandelion-Inspired Airfoil to a MEMS Ionocraft
Continuing my aforementioned passion for bio-mimetic robot design, for my first MEMS project I designed a dandelion-inspired airfoil as an add-on to the ionocraft designed by Drew et al. (featured here). We modified the airfoil design of Cummins et al. (featured here) to match the ionocraft's dimensions and demonstrated, through COMSOL Multiphysics simulations, the viability of such an airfoil for improving flight time and stability.
I designed this system as my final project for EE 147 (Introduction to MEMS) at UC Berkeley.
This is the mask layout we designed for fabricating the airfoil, together with the ionocraft. This layout was designed for a standard two-mask SOI process, with the red representing the SOI layer and the blue representing the TRENCH layer.
The figure above shows simulated airflow patterns for various Reynolds numbers in COMSOL Multiphysics. The images are inverted from their normal orientation, with air flowing from top to bottom. The structure consists of a porous disk representing the airfoil and a generic box-like structure representing the thruster. From left to right, the panels show Re = 100, 200, 300, and 400, respectively. An air pocket can be observed at the lower Re values but not at the higher ones, indicating that the separated vortex ring (SVR) loses stability above Re = 300. Above Re = 500, the airflow becomes unstable, producing convergence errors in COMSOL.
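For reference, the Reynolds number here follows its standard definition; taking the disk diameter as the characteristic length is my assumption for this geometry:

```latex
\mathrm{Re} = \frac{\rho v L}{\mu}
```

where \rho is the air density, v the inlet flow speed, L the characteristic length (here, the disk diameter), and \mu the dynamic viscosity of air.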
Learning Machines - Robust Computer Vision for Medical Equipment Categorization
This is an optical character recognition platform built specifically for medical equipment. Given an input image of an equipment package, it can parse out the lot number and expiration date, two of the key identification components necessary to log and use medical equipment in standard practice. For use in industrial settings, it's designed with high noise tolerance and built to recognize a wide variety of notations used to mark the lot number and expiration date on medical labels.
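As a minimal sketch of the parsing idea, assuming Tesseract via pytesseract; the two regex patterns are illustrative examples, not the full set of notations the platform recognizes:

```python
import re
import pytesseract
from PIL import Image

# Illustrative patterns; real medical labels use many more notations.
LOT_RE = re.compile(r"(?:LOT|BATCH)\s*[:#]?\s*([A-Z0-9-]+)", re.IGNORECASE)
EXP_RE = re.compile(r"(?:EXP|USE BY)\s*[:.]?\s*(\d{4}-\d{2}(?:-\d{2})?)",
                    re.IGNORECASE)

def parse_label(image_path):
    """OCR a label image and extract the lot number and expiration date."""
    text = pytesseract.image_to_string(Image.open(image_path))
    lot = LOT_RE.search(text)
    exp = EXP_RE.search(text)
    return (lot.group(1) if lot else None,
            exp.group(1) if exp else None)
```

The actual platform layers noise handling and many more label formats on top of this basic flow.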
I designed this system alongside the team at Medinas Health at the MASH Startup Hackathon hosted by UC Launch in March 2018.
You can find the source code for this project on GitHub. It is licensed under the Apache License, Version 2.0, with copyright owned by Medinas Health.
A demonstration of the computer vision process and output
Skrapmeister - A Smart Trash Can for Eco-friendly Waste Disposal
To promote responsible disposal and improve the quality of dorm life, I participated in the Robotics@Berkeley Dorm Ex Machina competition in October and November 2016, building an interactive trash can organizer in line with UC Berkeley's goal of Zero Waste by 2020. It featured separate bins for trash, compost, and recycling that could be selected via voice commands. Its enclosed design also keeps out flies and minimizes odors from old trash and compost, improving the hygiene of college life. This project won the Most Useful Design award.
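As a rough sketch of the voice-selection idea, using the SpeechRecognition library and leaving the actuation as a comment; this is an illustration, not our exact hardware stack:

```python
import speech_recognition as sr

BINS = {"trash": 0, "compost": 1, "recycling": 2}

def listen_for_bin():
    """Listen on the default microphone and return the requested bin, if any."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        phrase = recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        return None  # speech was unintelligible
    for name, index in BINS.items():
        if name in phrase:
            return index  # e.g., rotate a lid servo to expose this bin
    return None
```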
Front view, featuring our laser-cut logo
Top view, showing our microphone and the segmented bins