I'm a first-year graduate student from Singapore pursuing an MSEE at the University of Washington. I graduated from UW in 2020 with a BSEE with a concentration in embedded systems. My technical background is quite diverse, and I strive to bring that collection of experiences to all the work I do.
I am passionate about finding and building solutions to difficult problems, and about working with teams to deliver them. I have experience in both engineering and project management, and I am looking to apply that experience wherever I can.
I just started at UW Formula Motorsports as the Driverless team lead! More to come in the coming months.
To get familiar with the Raspberry Pi 4 (and to brush up on my C), I made an ultrasonic mouse bar. In short, I wanted a small device that lets me use my hand as a mouse input. To achieve this, I used a row of ultrasonic sensors and a simple Kalman filter to map the position of my hand to coordinates on the screen. I made a quick prototype out of cardboard, tape, and breadboards, and it works surprisingly well, even though the physical width of the sensors limits the resolution along the x-axis. You can check out my code here, and watch it in action here!
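For the curious, here is a minimal sketch of the kind of scalar Kalman filter I mean, smoothing a noisy distance reading and mapping it to a pixel coordinate. The noise variances, distance range, and screen size here are illustrative stand-ins, not the project's actual tuning:

```c
#include <stdio.h>

/* Minimal 1-D Kalman filter for smoothing ultrasonic distance readings.
 * All constants below are illustrative, not the project's real values. */
typedef struct {
    double x; /* estimated distance (cm) */
    double p; /* estimate variance */
    double q; /* process noise variance */
    double r; /* measurement noise variance */
} kf1d;

double kf1d_update(kf1d *f, double z) {
    f->p += f->q;                    /* predict: uncertainty grows */
    double k = f->p / (f->p + f->r); /* Kalman gain */
    f->x += k * (z - f->x);          /* correct toward measurement z */
    f->p *= (1.0 - k);
    return f->x;
}

/* Map a filtered distance (cm) onto a screen coordinate (px). */
int to_screen(double cm, double cm_min, double cm_max, int px_max) {
    if (cm < cm_min) cm = cm_min;
    if (cm > cm_max) cm = cm_max;
    return (int)((cm - cm_min) / (cm_max - cm_min) * px_max);
}

int main(void) {
    kf1d f = { .x = 20.0, .p = 1.0, .q = 0.05, .r = 4.0 };
    double samples[] = { 19.2, 20.7, 21.1, 35.0, 21.4 }; /* 35.0 = bogus echo */
    for (int i = 0; i < 5; i++) {
        double est = kf1d_update(&f, samples[i]);
        printf("raw %5.1f cm -> est %5.2f cm -> px %4d\n",
               samples[i], est, to_screen(est, 5.0, 40.0, 1080));
    }
    return 0;
}
```

A constant-position model is enough here because the sensors update far faster than a hand moves; the filter mostly just knocks down jitter and the occasional bogus echo.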
Manually repairing servers is labor-intensive and costly for large server farms. By implementing a workcell that can replace parts on a server with minimal human intervention, my capstone team and I produced an autonomous solution that can help reduce the cost and man-hours spent on menial tasks. Using two KUKA 7-axis arms and custom end effectors on a server workcell, we integrated machine vision with Cognex cameras to handle arbitrary server poses on a conveyor belt. I served as the team lead on the project, handling the Cognex camera integration (and computer vision), the Arduino communication with the conveyor belt, and the project management. Check out our poster here.
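To give a flavor of the conveyor integration, here is a hypothetical host-side helper of the kind that pauses and resumes the belt through the Arduino. The device path, baud rate, and command bytes are illustrative assumptions, not our actual protocol:

```c
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

/* Hypothetical helper: send a one-byte command ('G' = go, 'S' = stop)
 * to the Arduino driving the conveyor. Path, baud, and command bytes
 * are assumptions for illustration. */
int conveyor_cmd(const char *dev, char cmd) {
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return -1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);         /* raw bytes, no line processing */
    cfsetspeed(&tio, B9600); /* must match the Arduino sketch */
    tcsetattr(fd, TCSANOW, &tio);

    ssize_t n = write(fd, &cmd, 1);
    close(fd);
    return n == 1 ? 0 : -1;
}

int main(void) {
    conveyor_cmd("/dev/ttyUSB0", 'S'); /* pause belt while arms work */
    /* ... cameras locate the server, arms swap the part ... */
    conveyor_cmd("/dev/ttyUSB0", 'G'); /* resume belt */
    return 0;
}
```

Keeping the belt logic on a cheap microcontroller meant the vision side only ever had to say "go" or "stop".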
Optogenetics is a novel neuromodulation method that uses light to activate (or inhibit) neurons: optogenetic actuators expressed in the neurons make them responsive to light. In my time at Dr. Azadeh Yazdan's Neural Engineering and Rehabilitation Design lab, I designed a high-powered LED array alongside an adapter board to drive the array through the Grapevine Nomad neural interface processor. The LED array measures only 13 mm by 13 mm and was designed to fit snugly into the housing mounted on the monkey's skull.
In my 10 weeks in Scottsdale, Arizona, I was exposed to a huge variety of fields, including manufacturing, testing and validation, and medical testing for Taser exposure. Most of my time, however, was spent in the Conducted Electrical Weapons division as a Hardware Engineering Intern, where I primarily worked on firmware. My first project was a test firmware build that used the on-board accelerometer on the Taser 7's control board to investigate recoil patterns for different firearm primers. After finishing that project, I moved on to redesigning the self-test procedure for the high-voltage module to provide more reliable validation in both manufacturing and field-use scenarios.
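In spirit, the recoil-capture side worked something like the sketch below: fill a buffer with accelerometer samples around a trigger event, then dump them for offline analysis. The driver here is a stub so the sketch runs on a host; the sample window and interface are illustrative assumptions, not Axon's code:

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

typedef struct { int16_t x, y, z; } accel_sample_t;

/* Stub standing in for a board-specific accelerometer driver. */
static void accel_read_xyz(accel_sample_t *s) {
    s->x = (int16_t)(rand() % 2000 - 1000); /* raw counts */
    s->y = (int16_t)(rand() % 2000 - 1000);
    s->z = (int16_t)(rand() % 2000 - 1000);
}

#define WINDOW 256 /* samples captured around a trigger pull (assumed) */

int main(void) {
    accel_sample_t buf[WINDOW];
    for (int i = 0; i < WINDOW; i++)
        accel_read_xyz(&buf[i]);     /* fixed sample rate assumed */
    for (int i = 0; i < WINDOW; i++) /* CSV dump for offline analysis */
        printf("%d,%d,%d\n", buf[i].x, buf[i].y, buf[i].z);
    return 0;
}
```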
I also got exposed to the Taser 7 myself! If you're wondering why, it was partly because I was too curious about what it would feel like, and partly because I felt that if I ever ended up working at Axon, I should understand the impact of what I would be contributing to.
In my senior year at UW, I took EE 469 – Intro to Computer Architecture, and to this day it remains my favorite class at UW. The labs built up a 64-bit, 5-stage pipelined ARM CPU piece by piece as we learned about the CPU's constituent parts. The work culminated in a functioning ARM CPU written entirely in SystemVerilog, with pipelining features such as branch acceleration and forwarding, and with all gates limited to a maximum of 4 inputs (larger gates were composed from smaller ones). Everything was done with Quartus and ModelSim.
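The CPU itself was pure SystemVerilog, but the idea behind forwarding is easy to show as a toy C model (with hypothetical names; real hazard logic also covers loads and the zero register):

```c
#include <stdbool.h>
#include <stdio.h>

/* Toy model of EX-stage forwarding: if an older instruction sitting in
 * the EX/MEM or MEM/WB pipeline register is about to write the register
 * a younger instruction wants to read, take the value straight from
 * that pipeline register instead of the (stale) register file. */
typedef struct { bool reg_write; int rd; long value; } pipe_reg_t;

long forward_operand(int rn, long regfile_val,
                     const pipe_reg_t *exmem, const pipe_reg_t *memwb) {
    if (exmem->reg_write && exmem->rd == rn)
        return exmem->value; /* newest in-flight result wins */
    if (memwb->reg_write && memwb->rd == rn)
        return memwb->value;
    return regfile_val;      /* no hazard: use the register file */
}

int main(void) {
    pipe_reg_t exmem = { true, 5, 42 }; /* just computed a new X5 */
    pipe_reg_t memwb = { true, 5, 7 };  /* older, stale write to X5 */
    /* A consumer of X5 in EX should see 42, not 7 or the register file. */
    printf("forwarded X5 = %ld\n", forward_operand(5, 0, &exmem, &memwb));
    return 0;
}
```

In hardware this is just a pair of comparators and a mux in front of the ALU, which is what makes forwarding so much cheaper than stalling the pipeline.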