Project Team


Students

Pawan Harikrishnan
Computer Science
Penn State Harrisburg


Faculty Mentors

Truong Tran
Penn State Harrisburg
Department of Engineering, Science and Technology


Project


Project Video


Project Abstract


Despite advances in machine learning and augmented reality, classifying human body movement, particularly movements of the upper extremities such as flexion, rotation, and abduction of the elbow and shoulder, remains a tedious and expensive endeavor. The process typically involves 3D motion-capture cameras in environments specifically tuned for lighting and distance. To address this issue, we propose a simple smartphone application that leverages the built-in camera to accurately measure joint angles. The application overlays a scaled virtual skeleton onto the user's body using augmented reality and measures the angles between joints during a specific movement. This data is then used to assess how much motion is typically required for a given task. The measurements obtained with the application prove highly accurate when compared to existing pose estimation and gait recognition programs. The application has significant clinical relevance and can be used by doctors and physical therapists to better understand their patients' mobility and range of motion.
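The abstract does not describe how the angle between joints is computed from the virtual skeleton. A minimal sketch of one common approach, measuring the angle at a joint from three pose landmarks (for example shoulder, elbow, and wrist for elbow flexion) via the dot product, might look like this; the function name and the use of 2D image coordinates are assumptions for illustration, not the authors' implementation:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by landmarks a-b-c
    (e.g. shoulder-elbow-wrist for elbow flexion)."""
    # Vectors from the middle joint to the two outer landmarks.
    ba = (a[0] - b[0], a[1] - b[1])
    bc = (c[0] - b[0], c[1] - b[1])
    dot = ba[0] * bc[0] + ba[1] * bc[1]
    norm = math.hypot(*ba) * math.hypot(*bc)
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# A fully extended arm (collinear landmarks) gives 180 degrees;
# a right-angle bend gives 90 degrees.
print(round(joint_angle((0, 2), (0, 1), (0, 0))))  # 180
print(round(joint_angle((1, 0), (0, 0), (0, 1))))  # 90
```

The same formula extends to 3D landmarks by adding a z component to each vector, which is how an AR skeleton with depth information would typically be handled.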