iOS CoreMotion
13 May 2012. Posted by David Wilson in General.
Something else I've been experimenting with lately: CoreMotion under iOS on the iPhone.
You can see from the screen shot above that a lot of motion data can be extracted from iOS. Actually, that's very easy to do; the challenge comes in doing something useful with that data. I've been able to successfully implement reference frames and understand what's happening there. Unfortunately, the mathematicians had a field day after that and introduced the Quaternion (http://en.wikipedia.org/wiki/Quaternion) and the Rotation Matrix (http://en.wikipedia.org/wiki/Rotation_matrix).
What am I trying to do?
Many of the available articles on the net, and questions at http://stackoverflow.com, talk about detecting a golf swing… I'm not trying to detect a golf swing, but I am trying to detect a swing.
I may be making progress. Where I'm at now is measuring the userAcceleration of CMDeviceMotion. When userAcceleration on any axis (x, y, z) rises above an arbitrary threshold, the process of capturing samples begins… when the acceleration on all three axes falls back below that threshold, the samples are all added together, divided by the number of samples, and multiplied by the elapsed time, giving a velocity. This is shown in the screen shot above (red text on a light green background). Below is some sample debug output.
2012-05-13 14:52:24.511 iTest[5161:707] timeStart=116260.614214, TimeEnd=116261.019768, timeDuration=0.405554
2012-05-13 14:52:24.513 iTest[5161:707] x=-0.247293, y=0.263759, z=0.323034
2012-05-13 14:52:24.515 iTest[5161:707] x=0.037945, y=0.267724, z=-0.237428
2012-05-13 14:52:24.517 iTest[5161:707] x=1.686204, y=1.664073, z=-1.194653
2012-05-13 14:52:24.519 iTest[5161:707] x=-0.960586, y=2.531593, z=2.098828
2012-05-13 14:52:24.520 iTest[5161:707] x=0.000058, y=0.430671, z=0.036303
2012-05-13 14:52:24.522 iTest[5161:707] TOTAL x=0.516329, y=5.157821, z=1.026084
2012-05-13 14:52:24.524 iTest[5161:707] VELOCITY velocity=0.543462
Now, in theory(???), the velocity number will equate to a distance of movement… and I guess if I use the net x, y and z values I may even know where in 3D space the phone has been moved to?
I’m looking forward to getting past this and on to more of the application’s real implementation.