Olin Intelligent Vehicles Lab
The Olin Intelligent Vehicles Lab is currently working on multiple robotics projects, primarily centered around multirotors, but also including some ground vehicle projects and a fish tagging project. Information about all of the current projects can be found on the olinrobotics website. In the spring of my freshman year, I began working on the SnotBot project (see bottom), where we worked on a system for using multirotors to collect biological samples from whales. I continued working on that project over the summer between my freshman and sophomore years, then worked on expanding that code base to be more widely applicable during my sophomore year. The next summer, I worked in the robotics lab again, with the goal of making a drone autonomously land on a target using computer vision. In an effort to avoid duplicating work, I also set up a website as a central location for all of our documentation.
In the Intelligent Vehicles Lab, we use multirotors equipped with Pixhawk flight controllers and gimbal-mounted GoPro cameras. In our original system architecture, the drone carried a video transmitter and a two-way telemetry link to a ground station computer. All of the code we wrote ran on the ground station computer, based on the information it received from the video transmission and the two-way link, and it then sent commands back up to the drone. We use ROS to handle communication between different parts of our code, and the mavros package to communicate with the Pixhawk. During my second summer in the lab, I started running into issues with latency. The accumulated delays from sending video down to the computer, running calculations, and sending commands back up made it extremely difficult to autonomously land on a target. To reduce this latency, we decided to put a single-board computer on the drone. This way, all of the calculations could be done on the drone, and the ground station computer would only supply high-level commands, such as takeoff, land, or run a mission. Getting this to work involved figuring out how to run our code on an Odroid (the onboard computer we are using), setting up an ad-hoc network between the Odroid and the base station computer, and configuring ROS to work across multiple computers. This will not only be useful for autonomous landing, but will also provide a better platform for future projects, both on drones and on ground vehicles.
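Running one ROS graph across the Odroid and the ground station mostly comes down to pointing both machines at the same ROS master. A minimal sketch of that configuration using ROS's standard environment variables (the IP addresses here are placeholders, not our actual network setup):

```shell
# On the Odroid, which runs roscore and the vision/landing nodes
# (addresses are illustrative):
export ROS_MASTER_URI=http://192.168.1.10:11311   # the Odroid's own master
export ROS_IP=192.168.1.10

# On the ground station, which only publishes high-level commands
# (takeoff, land, run mission):
export ROS_MASTER_URI=http://192.168.1.10:11311   # point at the Odroid's master
export ROS_IP=192.168.1.20                        # ground station's address
```

Setting `ROS_IP` explicitly on both machines matters on an ad-hoc network, since ROS nodes otherwise advertise hostnames that the other machine may not be able to resolve.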
For all of our projects in which a multirotor launches from and returns to a moving vehicle, we need a method of landing with more precision than GPS alone can provide. Our goal was to use sensors already present on the vehicles to achieve these precise landings. The most obvious sensor was the gimbal-mounted GoPro camera: we just needed a target we could detect. For this, we used AprilTags, which are easy to detect using existing ROS packages and which give us the 3D position of the camera relative to the marker. The major issue with this target was scale: a tag large enough for the drone to see from a good distance in the air (about 40 feet) was too large to see when the drone was only a few feet above the target. We solved this by nesting AprilTags inside each other. This way, as the drone descends, it detects smaller and smaller markers, and it knows its position all the way down to the target. After running into latency issues from sending video to a ground computer and commands back to the drone, we added an Odroid computer to the multirotor's payload, which allowed all of the computation to be done onboard the vehicle and let us successfully land the drone on a 1 meter target. In addition, the onboard computer will give the lab more flexibility in the future to develop additional sensors and behaviors.
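The logic for using the nested markers is simple: on each frame, trust the pose from the smallest tag currently in view, since the smaller tags are the ones that remain visible as the drone closes in on the target. A rough sketch of that selection step (the data layout and function name here are hypothetical, not our actual detection message format):

```python
def best_detection(detections):
    """Pick the pose reported by the smallest visible AprilTag.

    detections: list of (tag_size_m, (x, y, z)) tuples, where (x, y, z)
    is the camera's position relative to that tag. The smallest tag in
    view is the one closest to the true landing point as the drone
    descends. Returns that tag's pose, or None if nothing was detected.
    """
    if not detections:
        return None
    smallest = min(detections, key=lambda d: d[0])
    return smallest[1]
```

In practice a step like this would sit between the tag detector's output and the landing controller, so the controller always tracks the most precise position estimate available.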
The goal of the SnotBot project is to develop an improved method of collecting biological samples from whales using drones. The current methods of collecting whale samples include shooting the whale with a special crossbow to collect a blubber sample, or driving up to the whale in a small boat and trying to capture the whale's blow. Both of these methods are inconsistent and dangerous because they involve getting very close to whales in small vessels. We would like to fly drones over whales to collect samples of whale blow while operating from a larger ship that can stay a safe distance away. The speed and altitude of a multirotor also make it much easier to collect breath samples with a drone than with a chase boat, which should make this method more consistent than current tactics.
During my time on this project, I worked on developing and testing code for multiple forms of waypoint navigation, as well as placing GPS boundaries on where the quadrotors could fly. In addition, I worked on upkeep of the multirotors and designed components to adapt the frames to hold the equipment we needed. For example, I designed a plate to mount a gimbal to a hexacopter frame so we could use our camera-based landing code on that vehicle (shown above, used for the video below), and I helped design and manufacture a new vacuum-formed case for one of our quadcopters so it could carry the video transmitting system that our landing code also requires (shown above). I also assembled, tested, and improved the waterproof hexacopters shown above.
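The GPS boundaries amount to a simple software geofence: before acting on a waypoint, check that it falls inside an approved latitude/longitude box. A toy version of that check (the function name and coordinates are illustrative, not from our actual code):

```python
def inside_fence(lat, lon, fence):
    """Return True if (lat, lon) lies inside a rectangular geofence.

    fence is a (south, north, west, east) tuple in decimal degrees.
    A real fence would need to handle boxes that cross the
    antimeridian; this sketch assumes a small local flying area
    and skips that case.
    """
    south, north, west, east = fence
    return south <= lat <= north and west <= lon <= east
```

A check like this can gate waypoint commands before they are sent to the flight controller, rejecting any destination outside the box the operator approved.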