
AI Running Gait Analysis at Home: How It Works and What to Look For

A practical guide to AI-powered gait analysis you can do at home with just a phone — what it measures, how accurate it is, and how to use the results.

A few years ago, gait analysis was a clinical procedure. You drove to a sports lab, ran on a force-plate treadmill while wearing reflective markers, and walked out with a 20-page PDF that mostly confirmed what your physio already suspected. It cost hundreds of dollars and took a week to interpret.

Today, a teenager with a smartphone and a free browser app can get a meaningful gait analysis in 60 seconds. This is the AI gait analysis revolution — and it's quietly transforming how recreational runners train.

What is gait analysis?

Gait analysis is the systematic measurement of how you walk or run. For runners, it focuses on:

  • Spatial parameters — stride length, step width, foot position at landing
  • Temporal parameters — cadence, contact time, flight time
  • Kinematic parameters — joint angles at the hip, knee, and ankle through the gait cycle
  • Postural parameters — trunk lean, pelvic drop, arm swing symmetry

Traditional clinical gait analysis uses motion capture cameras and force plates. AI gait analysis replaces the cameras with a single phone video and the force plates with computer-vision estimates of ground reaction forces.

How accurate is AI gait analysis vs. a clinic?

This is the question every runner asks. The honest answer:

For the metrics most runners actually need to improve — cadence, vertical oscillation, trunk lean, foot position at strike — modern 2D pose models are within 5–10% of laboratory gold standards when filmed from a clean side angle. That is more than accurate enough to identify your weakest link and track changes over time.

For the metrics where you need clinical precision — sub-degree joint angles, exact ground reaction force in newtons, sub-millisecond contact time — you still need a lab. But unless you're an elite athlete or recovering from a specific surgery, you don't need that precision.

What you can do at home with just a phone

A modern AI gait analysis app like FormStride does the following entirely in your browser:

  1. Detects 17 body keypoints in every frame using TensorFlow.js MoveNet
  2. Computes cadence by counting hip oscillations per minute
  3. Measures vertical oscillation as the range of hip vertical position
  4. Calculates trunk lean from the shoulder-to-hip vector relative to vertical
  5. Estimates knee flexion from the hip-knee-ankle angle
  6. Scores arm symmetry by comparing left and right arm swing amplitude
  7. Flags overstriding when ankle position at landing is too far ahead of the hip

All of this happens locally. Your video is never uploaded.
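The trunk-lean and knee-flexion steps above are plain 2D geometry on keypoint coordinates. Here is a minimal sketch of that math, assuming keypoints are `{x, y}` pixel positions with y increasing downward (MoveNet's image convention); the function names are illustrative, not FormStride's actual API:

```javascript
// Angle at the middle joint (e.g. the knee in hip-knee-ankle), in degrees.
function jointAngle(a, b, c) {
  const v1 = { x: a.x - b.x, y: a.y - b.y };
  const v2 = { x: c.x - b.x, y: c.y - b.y };
  const dot = v1.x * v2.x + v1.y * v2.y;
  const mag = Math.hypot(v1.x, v1.y) * Math.hypot(v2.x, v2.y);
  return (Math.acos(dot / mag) * 180) / Math.PI;
}

// Trunk lean: angle of the hip-to-shoulder vector relative to vertical.
function trunkLean(shoulder, hip) {
  const dx = shoulder.x - hip.x;
  const dy = hip.y - shoulder.y; // flip so "up" is positive
  return (Math.atan2(Math.abs(dx), dy) * 180) / Math.PI;
}

// A perfectly straight leg reads ≈ 180°; a deep bend reads much lower.
const straight = jointAngle({ x: 0, y: 0 }, { x: 0, y: 50 }, { x: 0, y: 100 });
// A shoulder directly above the hip reads ≈ 0° lean.
const upright = trunkLean({ x: 100, y: 40 }, { x: 100, y: 120 });
```

The same three-point angle function covers any joint; only the keypoint triple changes.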

How to film a gait analysis video at home

The quality of your analysis depends entirely on the quality of your footage. Follow these rules:

Camera position

  • Side-on, perpendicular to your direction of travel
  • 3–5 meters away
  • Phone at hip height (use a tripod, stack of books, or a friend)
  • Landscape orientation
  • Full body must stay in frame for the entire clip

Lighting

  • Bright, even light. Outdoors at midday or a well-lit gym
  • Avoid heavy backlighting — pose models struggle with silhouettes

Clothing

  • Form-fitting works best. Baggy shorts and oversized shirts hide the keypoints the AI needs to track

The run

  • 5–15 seconds at your normal training pace, not a sprint
  • A treadmill is ideal because the camera stays still
  • Outdoors, have a friend pan slowly to keep you centered

Reading your results

A good gait analysis report tells you three things: what's outside healthy ranges, why it matters, and what to do this week. If all you get back is a wall of numbers, the tool failed you.

Here's what the key ranges look like for recreational runners:

| Metric | Healthy range | Why it matters |
| --- | --- | --- |
| Cadence | 165–185 spm | Lower = more impact, more injury risk |
| Vertical oscillation | 5–8% of height | Higher = wasted energy |
| Trunk lean | 4–10° forward | Too upright = knee load; too much = hip strain |
| Knee flexion at strike | 130–160° | Straight legs = harder landing |
| Arm symmetry | >80% | Asymmetry signals mobility imbalance |
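Turning the table above into "what's outside healthy ranges" is a simple range check. A hedged sketch, with the ranges copied from the table and all names illustrative rather than any real API:

```javascript
// Healthy ranges for recreational runners, from the table above.
const HEALTHY_RANGES = {
  cadenceSpm: [165, 185],
  verticalOscillationPct: [5, 8],      // % of height
  trunkLeanDeg: [4, 10],               // forward lean
  kneeFlexionAtStrikeDeg: [130, 160],
  armSymmetryPct: [80, 100],
};

// Returns the names of metrics that fall outside their healthy range.
function flagOutOfRange(metrics) {
  return Object.entries(metrics)
    .filter(([name, value]) => {
      const range = HEALTHY_RANGES[name];
      return range && (value < range[0] || value > range[1]);
    })
    .map(([name]) => name);
}

const flags = flagOutOfRange({
  cadenceSpm: 158,              // low cadence: more impact per stride
  verticalOscillationPct: 6.5,
  trunkLeanDeg: 12,             // over-leaning: hip strain
  kneeFlexionAtStrikeDeg: 150,
  armSymmetryPct: 85,
});
// flags → ["cadenceSpm", "trunkLeanDeg"]
```

A good report then maps each flagged name to the "why it matters" column and a drill for the week, rather than dumping the raw numbers.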

How AI gait analysis compares to wearables

Watches and foot pods (Garmin, Stryd, Coros) measure cadence and contact time well but cannot see your posture. They have no idea if you're overstriding, leaning at the waist, or losing your form when you fatigue. Video-based AI analysis sees what wearables miss — and you don't need to buy any extra hardware.

The ideal stack: a wearable for daily run-by-run cadence tracking, plus a video analysis every 2–4 weeks for posture and overstriding checks.
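Cadence is the one metric both halves of that stack can measure, and the video-side version is simple: count the hip's vertical oscillations over the clip, since the hips bob once per step. A minimal sketch, assuming a per-frame series of hip heights with "up" positive (in raw image coordinates, where y grows downward, you would count minima instead); names are illustrative:

```javascript
// Estimate steps per minute by counting local maxima in hip height.
function estimateCadence(hipHeight, fps) {
  let peaks = 0;
  for (let i = 1; i < hipHeight.length - 1; i++) {
    // Each local maximum is one step's apex.
    if (hipHeight[i] > hipHeight[i - 1] && hipHeight[i] >= hipHeight[i + 1]) {
      peaks++;
    }
  }
  const seconds = hipHeight.length / fps;
  return (peaks / seconds) * 60;
}

// Synthetic check: 3 steps per second sampled at 30 fps for 2 seconds.
const fps = 30;
const hipHeight = Array.from({ length: 60 }, (_, i) =>
  Math.sin((2 * Math.PI * 3 * (i + 0.5)) / fps)
);
const cadence = estimateCadence(hipHeight, fps); // → 180 spm
```

Real keypoint series are noisier than a sine wave, so a production version would smooth the signal before peak-counting, but the idea is the same.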

The privacy problem with cloud-based gait apps

Most AI gait apps upload your video to a server, run analysis there, and store the file indefinitely. Their privacy policies usually grant them rights to use your video for "service improvement" — i.e., training their next model.

If that bothers you (it should), look for apps that explicitly run pose detection in your browser using TensorFlow.js. The compute happens on your device; only the final numbers are saved. FormStride works this way by design.

Start your gait analysis

The hardest part of running gait analysis used to be access. That barrier is gone. Drop a 10-second clip into the analyzer and see your cadence, oscillation, lean, and knee flexion in under a minute, entirely in your browser. The video never leaves your device.

Analyze your run for free

Private, in-browser pose analysis. Your video stays on your device.

Try FormStride