
  • Talk
  • 15/09/2021
  • Canada

Using Markerless Motion Capture for Automated Timed-Up-and-Go Sub-Task Segmentation

Description

Megan Saftich, a second-year master's student in the Human Mobility Research Lab at Queen's University, presents her research on automating the Timed Up and Go (TUG) test using markerless motion-capture technology paired with a deep-learning algorithm. The presentation reviews traditional methods of measuring mobility and balance through timed trials, highlights their practical limitations, and proposes a modern, contactless alternative based on 3D motion analysis.



By capturing each trial with eight video cameras and processing the data to identify key events, the proposed system extracts not only the total time taken for the test but also detailed timing for each of the individual sub-tasks. Initial validation demonstrates high accuracy in measuring time and identifying events, with sub-task-level detail that goes beyond what traditional timed methods provide.
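To make the processing pipeline concrete, the sketch below illustrates one way sub-task segmentation could be derived from a markerless pelvis trajectory. It is a minimal heuristic, assuming a simple threshold-based event detector rather than the deep-learning algorithm used in the presented work; the function names, thresholds, and synthetic example are illustrative assumptions only.

```python
import numpy as np

# Hypothetical sub-task labels for a TUG trial. The presented system pairs
# markerless motion capture with a deep-learning model; this sketch instead
# uses simple position/velocity heuristics purely to illustrate the idea of
# event-based segmentation. Names and thresholds are assumptions.
SUBTASKS = ["sit-to-stand", "walk-out", "turn", "walk-back", "turn-to-sit", "sit-down"]


def segment_tug(pelvis_xyz, fps=30.0, stand_rise=0.15, turn_distance=3.0):
    """Label TUG sub-tasks from an (n_frames, 3) pelvis trajectory in metres.

    Assumes x is the walking direction (0 at the chair) and z is vertical.
    Returns a list of (label, start_frame, end_frame) tuples.
    """
    x, z = pelvis_xyz[:, 0], pelvis_xyz[:, 2]
    vx = np.gradient(x) * fps                     # forward velocity, m/s

    # Sit-to-stand ends once the pelvis has risen by `stand_rise` metres.
    stand_idx = int(np.argmax(z > z[0] + stand_rise))
    # The turn is taken as the region near the 3 m mark where forward
    # velocity reverses; walk-back ends when the subject nears the chair.
    turn_start = int(np.argmax(x > turn_distance - 0.3))
    turn_end = turn_start + int(np.argmax(vx[turn_start:] < -0.1))
    back_end = turn_end + int(np.argmax(x[turn_end:] < 0.3))
    last = len(x) - 1

    bounds = [0, stand_idx, turn_start, turn_end, back_end,
              back_end + (last - back_end) // 2, last]
    return [(SUBTASKS[i], bounds[i], bounds[i + 1]) for i in range(len(SUBTASKS))]


if __name__ == "__main__":
    # Synthetic trial: rise from the chair, walk 3 m out, turn, walk back, sit.
    t = np.linspace(0.0, 1.0, 300)
    z = 0.5 + 0.4 * np.clip(t * 10.0, 0.0, 1.0)   # pelvis rises early in the trial
    x = 3.0 * np.sin(np.pi * t) ** 2              # out-and-back displacement
    pelvis = np.column_stack([x, np.zeros_like(t), z])
    for label, start, end in segment_tug(pelvis):
        print(f"{label:12s} frames {start:3d}-{end:3d}")
```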



Future work aims to improve the algorithm's performance in clinical populations and to extract biomechanical data during assessments, pointing toward more detailed, automated movement analysis in clinical evaluation. The combination of motion capture and algorithmic processing promises to make standard mobility assessments more efficient and more informative for practitioners.

DOI: 10.1302/3114-220954

Specialties