"MIT baseball coach uses sensors, motion capture technology to teach pitching"

Pitching demo using wireless sensors at MIT Nano.
Image by Praneeth Namburi

Visualization software and hardware could offer new possibilities for coaching and sports training - by Amanda Stoll (MIT.nano) - Published by MIT News Office

The field of sports analytics is best known for assessing player and team performance during competition, but MIT Baseball's pitching coach, Todd Carroll, is bringing a different kind of analytics to the practice field for his student athletes.

"A baseball player might practice a pitch 10,000 times before it becomes natural. Through technology, we can speed that process up," Carroll said in a recent seminar organized by the MIT.nano Immersion Lab. "To help players improve athletically, without taking up that much time, and keep them healthy — that's the goal."

The virtual talk — "Pitching in baseball: Using scientific tools to visualize what we know and learn what we don't" — grew out of a new research collaboration between MIT Baseball, the MIT Clinical Research Center (CRC), and the Immersion Lab.

Carroll started with an explanation of how pitching has evolved over time and what specific skills coaches measure to help players perfect their throw. Then, he and Research Laboratory of Electronics (RLE) postdoc Praneeth Namburi used the Immersion Lab's motion capture platform and wireless physiological sensors from the CRC to explore how biomechanical feedback and interactive visualization tools could change the future of sports.

Namburi stepped up to the (hypothetical) mound, with Carroll as his coach. By linking the physical and the digital in real time, the two were able to assess Namburi's pitches and make immediate adjustments that improved his athletic performance within a single session.

Visualizing sports data

Stride length, pitcher extension, hip-shoulder separation, and ground force production are all measurable aspects of pitching, explained Carroll. The capabilities of the Immersion Lab allow for digital tracking and visualization of these skills. Wearing wireless sensors on his body, Namburi threw several pitches inside the lab. The sensors tracked Namburi's position and movements through space, as shown in the first part of the video below. Adding in the physiological measurements, the second clip shows the activity of his rotation muscles (in green), his acceleration through space (in blue), and the pressure, or ground force, produced by his foot (in red).

By reviewing the motion capture frames together, Carroll could show Namburi how to modify his posture to increase stride length and extend his hip-shoulder separation by keeping his back foot on the ground. In this example, the technology improves communication between coach and player, leading to faster, more efficient improvements.
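Hip-shoulder separation of the kind Carroll tracks can be estimated directly from motion-capture marker positions: project the shoulder line and hip line onto the horizontal plane and take the angle between them. The sketch below is a minimal illustration of that idea, not the Immersion Lab's actual pipeline; the function names and the marker coordinates are hypothetical.

```python
import math

def segment_heading(left, right):
    """Heading (radians) of the line from the left marker to the right
    marker, projected onto the horizontal (x, y) plane."""
    dx = right[0] - left[0]
    dy = right[1] - left[1]
    return math.atan2(dy, dx)

def hip_shoulder_separation(l_shoulder, r_shoulder, l_hip, r_hip):
    """Signed angle (degrees) between the shoulder line and the hip line,
    wrapped to (-180, 180]. Markers are (x, y, z) tuples in meters."""
    diff = math.degrees(
        segment_heading(l_shoulder, r_shoulder)
        - segment_heading(l_hip, r_hip)
    )
    return (diff + 180.0) % 360.0 - 180.0

# Illustrative frame: hips already rotated ~30 degrees toward the plate,
# shoulders still square, giving ~30 degrees of separation.
shoulders = ((0.0, 0.0, 1.5), (0.30, 0.0, 1.5))
hips = ((0.0, 0.15, 1.0), (0.26, 0.0, 1.0))
print(round(hip_shoulder_separation(*shoulders, *hips), 1))  # → 30.0
```

Tracked over the course of a pitch, this single number gives the coach a concrete target ("open the hips earlier") rather than a subjective impression.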

Assessing physiological measurements alongside the motion capture can also help decrease injuries. Carroll emphasized how this technology can help rehabbing players, teaching them to trust their body again. "That's a big part of injury recovery, trusting the process. These students find comfort in the data and that allows them to push through."

Following the training session, Namburi overlaid the motion capture from his first and last throws, comparing his posture, spine position, stride length, and foot position. A visual compilation of all his throws compared the trajectory of his wrist, showing that, over time, his movement became more consistent and more natural.
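Consistency across throws, as in the wrist-trajectory compilation described above, can be quantified by measuring how far each throw's path strays from the average path. The sketch below is one simple way to do that, assuming each trajectory is resampled to the same number of points; the data and function names are hypothetical, not the lab's software.

```python
def mean_path(trajectories):
    """Per-sample mean position across throws. Each trajectory is a list
    of (x, y, z) points; all trajectories must have equal length."""
    n = len(trajectories)
    length = len(trajectories[0])
    return [
        tuple(sum(t[i][k] for t in trajectories) / n for k in range(3))
        for i in range(length)
    ]

def rms_spread(trajectories):
    """Root-mean-square deviation of all points from the mean path:
    lower values mean more repeatable movement."""
    mean = mean_path(trajectories)
    total, count = 0.0, 0
    for path in trajectories:
        for point, m in zip(path, mean):
            total += sum((a - b) ** 2 for a, b in zip(point, m))
            count += 1
    return (total / count) ** 0.5

# Toy wrist paths (two points each): early throws wander, later ones repeat.
early = [[(0.00, 0, 0), (1.00, 0, 0)], [(0.20, 0, 0), (1.20, 0, 0)]]
late = [[(0.00, 0, 0), (1.00, 0, 0)], [(0.02, 0, 0), (1.02, 0, 0)]]
print(rms_spread(early) > rms_spread(late))  # → True
```

A falling spread from the first throws to the last is exactly the "more consistent" trend the overlaid visualization makes visible at a glance.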

The seminar concluded with a live demonstration of novice pitchers in the Immersion Lab following the advice of Coach Carroll via Zoom. "Two people who have never thrown a baseball before today, and we're able to teach them remotely during a pandemic," reflected Carroll. "That's pretty cool."

Afterward, Namburi answered questions about how easily the physiological monitoring tools could be taken out to the field, and about capturing and measuring the movements of multiple athletes at once.

Immersed in collaboration

The MIT.nano Immersion Lab's new seminar series, IMMERSED, explores the possibilities enabled by technologies such as motion capture, virtual and augmented reality, photogrammetry, and related computational advances to gather, process, and interact with data from multiple modalities. The series highlights the capabilities available at the Immersion Lab, and the wide range of disciplines to which the tools and space can be applied.

"IMMERSED offers another avenue for any individual — scientists, artists, engineers, performers — to consider collaborative projects," says Brian W. Anthony, MIT.nano associate director. "The series combines lectures with demonstrations and tutorials so more people can see the wide breadth of research possible at the lab."

As a shared-access facility, MIT.nano's Immersion Lab is open to researchers from any department, lab, or center at MIT, as well as external partners. Learn more about the Immersion Lab and how to become a user.