A Novel Approach to Studying Human Movement

Machine Learning Driven Biomechanical Analysis of Throwing Motion Using Bidirectional Long Short-Term Memory and Temporal Attention Models

Access the full analysis and model implementation in Google Colab: Open the Colab Notebook

Access the full inference and AI-powered system code on GitHub: Open GitHub Repo

Summary

This project explores a low-cost, AI-powered system for analyzing human movement, specifically the throwing motion in baseball. It uses wearable IMU sensors and a BiLSTM neural network with attention to predict pitch velocity and extract biomechanical insights.

Why Throwing?

Throwing is one of the most biomechanically complex human motions. It requires coordinated action from the legs, hips, torso, and arm. Injuries are common in youth baseball, and expensive analysis tools are not accessible to most players. This project presents a scalable and affordable alternative.

System Overview

Microcontrollers and IMUs: The system uses an Arduino Uno R4 WiFi microcontroller connected to two MPU6050 Inertial Measurement Units (IMUs) to collect motion data.

Firmware: Custom firmware written in C++ runs on the microcontroller, enabling real-time communication with the sensors.

Data Collection Software: A Python script running on a MacBook connects to the microcontroller over Bluetooth for real-time wireless data transfer, parsing the incoming sensor stream and storing it as CSV files.
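The logging step can be sketched as below. This is a minimal illustration, not the project's actual script: the Bluetooth packet format is an assumption (one comma-separated line per reading, six values per IMU for two IMUs), and the field names are hypothetical.

```python
import csv

# Hypothetical line format: each packet is a comma-separated line with
# 6 values per IMU (accel x/y/z, gyro x/y/z) for two MPU6050 sensors.
FIELDS = [f"{axis}{i}" for i in (1, 2)
          for axis in ("ax", "ay", "az", "gx", "gy", "gz")]

def parse_line(line):
    """Parse one 12-value sensor line; return None for malformed lines."""
    parts = line.strip().split(",")
    if len(parts) != len(FIELDS):
        return None
    try:
        return dict(zip(FIELDS, map(float, parts)))
    except ValueError:
        return None

def write_csv(lines, out):
    """Write parsed sensor lines to a CSV file object; return rows written."""
    writer = csv.DictWriter(out, fieldnames=FIELDS)
    writer.writeheader()
    count = 0
    for line in lines:
        row = parse_line(line)
        if row is not None:          # silently drop corrupted packets
            writer.writerow(row)
            count += 1
    return count
```

In a live setup the `lines` iterable would come from the Bluetooth serial connection (e.g. via `pyserial`); here any iterable of strings works, which also makes the parser easy to test offline.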

Analysis and Inference Software: TensorFlow, a machine learning framework, processes the stored data to train BiLSTM models with attention mechanisms and to run inference, providing insight into biomechanical patterns.

Model Architecture

A BiLSTM model with temporal attention was used to capture biomechanical dependencies in the throwing sequence. It achieved a Mean Absolute Error of 3.5 mph on the test set, learning patterns like hip-shoulder separation that correlate with higher speeds.
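The full model lives in the Colab notebook; the core of temporal attention, pooling BiLSTM hidden states into one context vector via softmax weights over time, can be illustrated in NumPy. The scoring vector `w` stands in for a learned parameter and is supplied directly here.

```python
import numpy as np

def temporal_attention(hidden, w):
    """
    hidden: (T, H) array of BiLSTM hidden states over T timesteps.
    w:      (H,)   scoring vector (learned in the real model).
    Returns the attention-pooled context vector and per-timestep weights.
    """
    scores = hidden @ w                        # (T,) unnormalized relevance
    scores = scores - scores.max()             # shift for numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # softmax over time
    context = alpha @ hidden                   # (H,) weighted sum of states
    return context, alpha
```

The weights `alpha` are what make the model interpretable: plotting them over the throwing sequence shows which phases of the motion the network attends to for a given velocity prediction.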

Data Collection

Twenty competitive baseball players performed 20 throws each, producing a dataset of 400 throwing sequences, roughly 120,000 timesteps in total, used to train the model. A complementary filter fused accelerometer and gyroscope data, yielding 20 features per timestep, which were normalized and padded for model input.
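A complementary filter blends the gyroscope's integrated angular rate (smooth but drifting) with the accelerometer's angle estimate (noisy but drift-free). A minimal one-axis sketch follows; the sample period `dt` and blend factor `alpha` are illustrative values, not the project's calibrated settings.

```python
import numpy as np

def complementary_filter(accel_angle, gyro_rate, dt=0.01, alpha=0.98):
    """
    Fuse accelerometer-derived angles (deg) with gyro angular rates
    (deg/s) into a smoothed angle trace. alpha near 1 trusts the
    integrated gyro short-term while the accelerometer slowly
    corrects its drift.
    """
    angle = accel_angle[0]              # initialize from the accelerometer
    fused = [angle]
    for a, g in zip(accel_angle[1:], gyro_rate[1:]):
        angle = alpha * (angle + g * dt) + (1 - alpha) * a
        fused.append(angle)
    return np.array(fused)
```

Running each sensor axis through this filter, then stacking the results with the raw readings, is one way to arrive at a multi-feature-per-timestep representation like the 20 features described above.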

Biomechanical Insights

High-speed throws showed strong activation in the hip sensors and broad attention across the motion sequence. Slower throws showed localized attention and shoulder dominance, suggesting less efficient mechanics and potentially higher injury risk.

Applications

Future Directions

View the full project poster: Download Poster (PDF)