Smart Skeleton Introduction

The Smart Skeleton

The genesis of this project was a problem I’ve noted for years in my introductory Human Anatomy and Physiology (A&P) classes: students have a really hard time learning the actions of the body’s muscles. In our course at Macon State College, students are required to learn the name, location, and action of a list of muscles in order to pass the laboratory exam. I think the best approach is to first learn the name and/or location (these often inform each other; i.e., if you know the gluteal region is the rear end, then you have a pretty good idea of where you will find the gluteus maximus) but NOT to try to memorize the actions. Instead, I encourage the students to look at where a muscle begins and ends (its origin and insertion) and use their knowledge of the motion of the joints in between to determine what will happen to those joints if the muscle shortens. In other words, they have to memorize the name and/or location, but they should figure out the action on their own.

This requires some skill at imagining motion in three dimensions, and many students seem to lack three-dimensional visualization skills. Instead, they attempt to memorize the actions of muscles with little real conceptual understanding of what they are learning. In our laboratory, we have plastic models of human bodies and limbs with muscles to help with learning the names and locations, but, of course, these do not move and are not very useful for visualizing actions. We also have computer software with nice interactive animations of muscles contracting, but this is still essentially a two-dimensional representation of what is a three-dimensional problem.

I thought it might be useful to use a teaching skeleton to help students learn muscle actions. The skeletons are built to reasonably approximate the degrees of freedom of movement in the various joints of the body, and many of them are painted to mark the origins and insertions of muscles. My first thought was to use strings and pulleys attached to the skeleton to imitate the actions of muscles. This quickly proved to be more trouble than it was worth. I think it would work, but it would require extensive modification of the skeleton (which I was not authorized to do), and students would likely spend more time untangling strings and pulleys than learning muscle actions. I had recently started using an Arduino for another project, and I began thinking about the possibility of using electronic sensors to track joint angles as a student moved the skeleton. This could be coupled with software to prompt students to demonstrate muscle actions and provide feedback. The concept would combine the hands-on, three-dimensional aspects of a physical model (the skeleton) with the interactivity, flexibility, and instant feedback of computer software. The “smart skeleton” was born!

I considered several ways I could make a teaching skeleton “smart.” I thought about attaching stretch receptors and rotary encoders to the skeleton. This might have been a good idea if I were building the skeleton from the ground up, but I quickly realized that adding these sensors so that they would work reliably on an existing skeleton would require too much modification. For some time I worked on the idea of using accelerometers as tilt sensors to determine the 3-D orientation of the bones relative to gravity, and using this information to calculate joint angles. Inspired by another project I found in a web search, I bought a few of the cheapest 3-D accelerometer breakout boards that SparkFun Electronics had to offer. After much stumbling around, I realized that accelerometers were great for measuring flexion, extension, abduction, and adduction (pitch and roll), but they could not measure rotation (yaw), since changes in yaw do not change the relative direction of gravity. The fact that it took me so long to figure this out is a perfect lesson in what can happen when you work outside your area of expertise.
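To see why gravity alone pins down only two of the three angles, here is a minimal sketch (not code from this project) of how tilt is computed from a static 3-axis accelerometer reading, with axes and units (g’s) chosen for illustration:

```cpp
#include <cmath>

const double kPi = 3.14159265358979323846;

// Tilting the sensor changes how the gravity vector projects onto
// its x, y, and z axes, so pitch and roll can be recovered from a
// reading taken at rest.
double pitchDeg(double ax, double ay, double az) {
    return atan2(-ax, sqrt(ay * ay + az * az)) * 180.0 / kPi;
}

double rollDeg(double ax, double ay, double az) {
    return atan2(ay, az) * 180.0 / kPi;
}
```

Any rotation about the gravity vector itself leaves (ax, ay, az) unchanged, so no function of these three numbers can recover yaw.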

An early, accelerometer-based prototype attached to the femur.

I was getting frustrated and thinking about revisiting the idea of stretch sensors and rotary encoders when a few more web searches brought me into the wonderful world of inertial measurement units (IMUs) and sensor fusion. It turns out that (like any idea conceived in the last 100 years) many other people had already been thinking about this problem, and that IMUs, electronic devices capable of measuring yaw, pitch, and roll, had long been available. The problem I was trying to solve had been, at least in part, solved by guided missile engineers decades ago. Modern electronic IMUs use sensor fusion algorithms to combine the input of multiple MEMS sensors, such as a 3-D accelerometer, a 3-D gyroscope, and a 3-D magnetometer, to accurately estimate orientation. Fortunately for me, DIY enthusiasts and hackers have recently made tremendous strides in developing open-source electronic IMUs for use in projects such as autonomous flying drones. In my research I wound up at the website of Fabio Varesano, a Ph.D. student at the University of Torino, Italy. Fabio has developed the FreeIMU, an open-source IMU hardware design and Arduino library which uses a very efficient algorithm originally developed by Sebastian Madgwick, then a Ph.D. student at the University of Bristol, UK.

With an IMU attached to each of the movable elements in a teaching skeleton (humerus, radius, etc.), a computer should be able to calculate joint angles and determine whether students are moving the joints of the skeleton so as to correctly demonstrate particular muscle actions. Attaching the IMUs is simple and requires no permanent modification of the skeleton. Since Fabio’s design is a single IMU that communicates with a single Arduino board, I realized that a major design challenge for me was to adapt the design to allow multiple IMUs to quickly and efficiently send orientation data to an Arduino and/or a host computer. After much experimentation, I have come up with a reasonable solution to these problems: the “Smart Skeleton IMU.”
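For illustration, assuming each IMU reports its orientation as a unit quaternion (as the Madgwick algorithm does), the joint angle is just the rotation between the two adjacent bones’ orientations. A minimal sketch, not code from the project:

```cpp
#include <cmath>

const double kPi = 3.14159265358979323846;

struct Quat { double w, x, y, z; };  // unit quaternion

Quat conjugate(Quat q) { return {q.w, -q.x, -q.y, -q.z}; }

Quat multiply(Quat a, Quat b) {
    return {a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
            a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
            a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
            a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w};
}

// The relative rotation between parent and child bones is
// parent^-1 * child; its w component encodes half the joint angle.
double jointAngleDeg(Quat parent, Quat child) {
    Quat rel = multiply(conjugate(parent), child);
    double w = fmin(1.0, fmax(-1.0, fabs(rel.w)));
    return 2.0 * acos(w) * 180.0 / kPi;
}
```

For example, if the humerus IMU reports the identity orientation and the radius IMU reports a 90° rotation, this returns an elbow angle of 90°.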

First working prototype of the Smart Skeleton IMU.

2 Responses to Smart Skeleton Introduction

  1. Hi,
    I’m involved in a project which needs to monitor the motion of a floating structure. I need to have about 10 IMUs connected to a central microcontroller. As I understand, the MPU6050 only supports 2 slave addresses. Did you use more than two in your smart skeleton project? It would be great if you could guide me on how to add more than two MPU6050 units to the same I2C bus.
    Many Thanks,
    Mihika

  2. Jean-Luc Ward says:

    Hi John: I am very impressed with your skeleton project. I am currently involved in a project using part of your idea monitoring head and neck motion using 2 freeIMU, but I am so stumped it’s no longer funny. I would really appreciate your help or guidance for development of the arduino sketch, the processing sketch and the wiring of it all?
    Thank you very much for your time.

    JL
