Upper-limb loss affects millions of children worldwide, with the largest access gap in low-resource and humanitarian settings. Children who use prosthetic hands outgrow them quickly, making frequent replacements expensive and difficult to obtain.
Figure 1: Side-by-side image of a real hand and a 3D-printed Phoenix V3 hand, sized using the AI handsizer.
This humanitarian problem inspired me to work on a project that combines 3D printing and machine learning to make prosthetic sizing faster, more accurate, and more accessible.
I am currently developing a custom machine learning model that can estimate hand size from a single photo using a reference card. The goal is to simplify the sizing process so prosthetic hands can be produced and replaced more efficiently for people who need them.
How can you help?
To help my ML model make more accurate hand-sizing predictions, and to make a tangible difference in accessible healthcare, please fill out the hand-sizing survey below:
All submissions are welcome. Every image helps improve the accuracy and inclusivity of the model.
How the system works:
- A photo is taken of a hand next to any standard card (e.g., a credit card or ID card)
- The model detects the card and calculates the pixel-to-mm scaling
- The hand is measured digitally
- A correctly sized prosthetic hand is recommended
Figure 2: Workflow of the machine learning model. The system utilises a reference card to establish a pixel-to-mm ratio, measures the hand from the image, converts pixels to mm, and then maps the dimensions to the correct prosthetic size, outputting a recommended 3D-printing scale.
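To make the scaling step concrete, here is a minimal Python sketch of the conversion, assuming the card and hand have already been located in the photo by the detection model. The 85.60 mm card width is the real ISO/IEC 7810 ID-1 standard used by credit and ID cards; the function names, the base hand width, and the example pixel values are hypothetical placeholders for illustration, not the project's actual parameters.

```python
CARD_WIDTH_MM = 85.60  # standard width of a credit/ID card (ISO/IEC 7810 ID-1)


def pixels_to_mm(card_width_px: float, measurement_px: float) -> float:
    """Convert a pixel measurement to millimetres via the reference card."""
    mm_per_px = CARD_WIDTH_MM / card_width_px
    return measurement_px * mm_per_px


def recommend_print_scale(hand_width_mm: float,
                          base_hand_width_mm: float = 80.0) -> float:
    """Recommend a 3D-print scale (%) relative to an unscaled base model.

    base_hand_width_mm is a hypothetical reference width; the real value
    depends on the specific prosthetic hand design being printed.
    """
    return 100.0 * hand_width_mm / base_hand_width_mm


# Example: the card spans 412 px and the palm spans 310 px in the same photo.
card_px, palm_px = 412.0, 310.0
palm_mm = pixels_to_mm(card_px, palm_px)   # ~64.4 mm
scale = recommend_print_scale(palm_mm)     # ~80.5% of the base model
print(f"palm width = {palm_mm:.1f} mm, print scale = {scale:.1f}%")
```

Because the card and hand sit in the same image plane, one measured card width calibrates every other measurement in the photo, which is what lets a single snapshot replace physical measuring tools.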
Currently, I am working on improving the model's accuracy and integrating it into a simple sizing app. I hope this will make correctly sized prosthetics more accessible for those who need them most.
Kiyan Yaghnam, Year 11 Student.
