How four students from Texas A&M University designed an AMR to guide users around campus
Texas A&M University has one of the largest college campuses in the United States, and navigating it by map alone can be a difficult task for someone completely unfamiliar with the area. Students Shanley Mullen, Jesse Rosart-Brodnitz, Austin Heibel, and Evan Maraist created VALE, a prototype of an autonomous outdoor robot that can assist in navigating the campus. You can learn more about the background of the TiM$10K Challenge here!
VALE was built on a pre-existing robot called SCUTTLE (Sensing, Connected Utility Transport Taxi for Level Environments), a project that began at Texas A&M University.
VALE’s success rests on a SICK TiM781 LiDAR sensor, the Google Maps API, a GPS receiver, a magnetometer, a printed circuit board, a Raspberry Pi 4, a weather-protection case, and severe-duty UniBrackets. These components work together and communicate so that specific destinations can be programmed into the robot. When the robot receives a target destination, it uses the Google Maps API to generate a path to the user-defined destination and then maps that path onto usable sidewalks.
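To give a feel for how the GPS and magnetometer might steer the robot along such a path, here is a minimal sketch in Python. It assumes the GPS supplies latitude/longitude fixes and the magnetometer a compass heading in degrees; the function names are hypothetical illustrations, not the team's actual code.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees) from the current GPS fix
    (lat1, lon1) to the next route waypoint (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def steering_error(heading_deg, target_bearing_deg):
    """Signed heading error in (-180, 180]: positive means turn right.
    heading_deg would come from the magnetometer."""
    return (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
```

A controller could call `steering_error` each cycle and turn the wheels to drive the error toward zero as the robot follows the waypoints returned by the routing step.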
Obstacle avoidance is crucial to keep the robot from being damaged or harming people. Whenever an obstacle comes within two meters, the LiDAR sensor reports its distance and angle, and the robot plots a path around it. Since the robot is made for outdoor tours, it also has a weather-protection case.
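The two-meter rule above can be sketched as a simple scan filter. This is an illustrative Python snippet, not the team's implementation: it assumes one LiDAR sweep arrives as a list of (angle, range) pairs, and it stands in for whatever planner VALE actually uses.

```python
SAFETY_RADIUS_M = 2.0  # from the article: obstacles within two meters trigger avoidance

def detect_obstacles(scan):
    """scan: list of (angle_deg, range_m) pairs from one LiDAR sweep.
    Returns the readings closer than the safety radius."""
    return [(angle, rng) for angle, rng in scan if rng < SAFETY_RADIUS_M]

def clearest_heading(scan):
    """Pick the scan angle with the largest measured range as a crude
    detour direction (a placeholder for a real path planner)."""
    return max(scan, key=lambda reading: reading[1])[0]
```

In practice a planner would also account for the robot's width and smooth the detour, but the core loop is the same: flag anything inside the safety radius, then steer toward open space.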
Although VALE was created with the intention of helping people navigate Texas A&M’s campus, it can be useful elsewhere. Instead of handing guests maps, venues such as zoos, amusement parks, and malls could use a robot like VALE to guide visitors.