Actions speak louder than words. The truth of this adage is magnified when an action can be broken down into a multi-dimensional array of floating markers captured at 12,000 frames per second with sub-millimeter accuracy. Funded by a $300,000 National Science Foundation grant awarded in December 2018, the Qualisys Oqus 7+ motion capture system now resides in the Collaborative Robotics and Computer Human Engineering (CROCHET) Lab. The system's cameras are rated for both indoor and outdoor use, offer resolutions of up to 12 megapixels, are highly mobile, and can be controlled from any desktop or laptop computer. The system can track hundreds of marker positions with remarkable accuracy and speed, and is in regular use by capstone design students, robotics and social computing research groups, and a range of other projects. These applications include gait pattern analysis to determine anatomical structure, analysis of the flight characteristics of small airborne structures, and tracking of soft tensegrity robots that locomote via vibration. In this talk I will discuss the scope of ongoing scientific inquiry enabled by this industry-grade tracking equipment and the dedicated students and faculty who use it, and explore potential future applications.