This research presents an integrated system for near real-time analysis of, and feedback on, exercise form during deadlifts and running. Recognizing that improper technique can reduce performance and heighten injury risk, the proposed system combines video capture, computer vision, and machine learning to objectively assess user performance. During each exercise session, a video is recorded and automatically segmented into individual deadlift repetitions and running clips. A human pose estimation framework extracts 33 key joint landmarks in the 2D image plane, which are then processed by a convolutional neural network to derive biomechanical metrics critical for evaluating movement quality in each exercise. A separate lightweight deep learning model, deployed on-device via TensorFlow Lite, classifies each exercise as demonstrating either good or poor form based on the resulting biomechanical metrics. The system then provides actionable feedback through an interactive user interface, detailing how the user can improve their form. For repetitions identified as having suboptimal technique, the user can review the analyzed video segments, facilitating corrective adjustments and improved training outcomes. The proposed system achieved accuracies ranging from 79%-98% across the seven biomechanical running metrics and from 83%-99% across the three biomechanical deadlift metrics. Overall, this project contributes a novel tool for athletes and trainers alike, bridging the gap between advanced biomechanical analysis and everyday exercise practice.
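As an illustration of the metric-extraction step described above, the sketch below computes a joint angle (e.g., at the knee or hip) from three 2D pose landmarks, the kind of quantity a form classifier could consume. This is a minimal, hypothetical example: the abstract does not specify which pose framework or metrics are used (though the 33-landmark count matches frameworks such as MediaPipe Pose), and the function name and inputs here are assumptions for illustration only.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint `b`, formed by 2D landmarks a-b-c.

    Each landmark is an (x, y) tuple in image coordinates, as a pose
    estimation framework might provide. A straight limb yields ~180 deg.
    """
    # Vectors from the joint vertex to the two adjacent landmarks.
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

# Example: hip-knee-ankle landmarks of a fully extended leg.
print(joint_angle((0.0, 0.0), (0.0, 1.0), (0.0, 2.0)))  # → 180.0
```

In a full pipeline, angles like this would be computed per frame across a repetition and summarized (range, minimum depth, symmetry) before being passed to the on-device classifier.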