Vision sensing during Thanksgiving

It’s been two years, but the feeling is just like yesterday. One of my most interesting classes during grad school was vision sensing. I enjoyed the complexity of using a camera to process an image and its colors, and the absence of a human being in doing the work. It was a relatively new technology back then, and I’ve seen more developments recently in this area. So what is vision sensing, you may ask? Vision sensing uses a digital camera and image-processing software to extract data from an image. Here are a few places vision sensing is used: manufacturing inspection, tracking systems, and most recently the car industry, especially for government applications.
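As a toy illustration of what “extracting data from an image” can mean, here is a minimal sketch in pure Python: it thresholds a tiny synthetic RGB image for red-looking pixels and reports their centroid, the sort of primitive a tracking system builds on. The image, the color thresholds, and the function name are all hypothetical examples, not anything from the class project.

```python
# Toy vision-sensing step: threshold an RGB image for "red" pixels
# and compute the centroid of the matches. Thresholds are illustrative.

def find_red_centroid(image, r_min=200, gb_max=80):
    """Return the (row, col) centroid of red-looking pixels, or None."""
    hits = [
        (r, c)
        for r, row in enumerate(image)
        for c, (red, green, blue) in enumerate(row)
        if red >= r_min and green <= gb_max and blue <= gb_max
    ]
    if not hits:
        return None
    return (sum(r for r, _ in hits) / len(hits),
            sum(c for _, c in hits) / len(hits))

# A 4x4 grey image with a 2x2 red "object" in the lower-right corner.
grey = (128, 128, 128)
red = (255, 0, 0)
image = [
    [grey, grey, grey, grey],
    [grey, grey, grey, grey],
    [grey, grey, red,  red ],
    [grey, grey, red,  red ],
]

print(find_red_centroid(image))  # centroid of the red patch: (2.5, 2.5)
```

Real systems work on camera frames and use libraries like OpenCV, but the idea is the same: reduce a grid of pixels to a small piece of data, here a single position, with no human in the loop.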

Dr. Rife was an interesting professor. He was young, energetic, and very patient. Imagine this: one time I went to his office hours for some help, and about half an hour later one of my project partners came in, and half an hour after that another one came in. He explained the subject twice to each of us, and all in six different ways, to make sure we all understood. There were at least a couple of times he came down from Palo Alto on a rainy Sunday night via Caltrain to help us with the project. His PhD project was tracking jellyfish in open water. Extracting colors under the water is not that easy, especially with the low visibility in Monterey and colorless jellyfish. He’s my all-time favorite teacher.

“Follow the leader” was our last project. Basically, we had to control our robot to follow another robot and mimic all of its motions, which included: stop, turn, reverse, slow down, speed up, etc. The best part of the project was that we only had four weeks to work on it while sharing the lab and robots with two other groups. To get the most out of the lab time, my friend Danny and I decided to come in on Thanksgiving Day. I can’t remember exactly all the detailed work for that day (most of the time was programming, debugging, and programming again), but we stayed there ’til 7pm, and at the very last minute we were able to come up with the “zone method” algorithm to indicate the travel path and direction of the leading robot (see attached report), which was the most crucial part of the project. Just in time for a warm, hearty Thanksgiving dinner.

Complete report: follow-the-leader-_final_.pdf

