[Solved] Quaternion from IMU sensor to GameObject Orientation problem

Hello everyone,

Update:
Solved in the meantime.

The rotation that has to be applied is simple:

transform.rotation = Quaternion.Inverse(CalculatedQuaternionFromAlgorithm);

Quote from own comment below:

So basically, all of the available algorithms don't output the rotation of the sensor with respect to earth, but vice versa: the orientation of the earth frame with respect to the sensor frame. This becomes clearer when you use the magnetic field as well, not just acceleration for gravity, and look at the outcome. Simple as that: the calculated quaternion has to be inverted in order to apply the actual sensor orientation to the GameObject.

**Original question:**

I’ve just spent over 4½ hours struggling to apply an orientation I get from an inertial measurement unit (IMU) sensor to a simple cube in Unity.

Straight away: I’ve actually solved it with very simple code. But I have no idea why; this code should do the opposite…

So here’s what I do:

I calculate a quaternion from my sensor data and want to see a cube in Unity move the same way as my sensor. My problem was that when I rotated around a sensor axis, say X, the cube got rotated around the world’s X axis, not its local one. I read nearly everything Google and Unity Answers provided as information, and I understand a bit about quaternions.
However, the only and really only solution that worked is as follows:

void Update()
{
    diffQuat = Quaternion.Inverse(bodyQuat) * lastQuat;
    transform.Rotate(diffQuat.eulerAngles, Space.World);
    lastQuat = bodyQuat;
}
  1. I calculate a difference quaternion (diffQuat) from the last sensor quaternion (lastQuat) and the latest one (bodyQuat), which gets updated between frame updates. This difference calculation is necessary so that only the delta is applied each frame, instead of rotating infinitely fast around seemingly random axes. Doing it this way, I can use…
  2. … the Rotate() function, where I apply the Euler angles of my difference quaternion. Oddly, I have to set Space.World here. I would have expected Space.Self(!) in order to rotate around the cube’s axes, but somehow that leads to the opposite.
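The delta idea in step 1 can be checked outside Unity with plain quaternion algebra (a minimal Python sketch, not Unity code; the helper names `qmul`, `qconj`, and `qaxis` are my own, assuming the Hamilton convention and (w, x, y, z) order). For a unit quaternion the inverse is just the conjugate, and the delta carrying orientation q1 to q2 is q1⁻¹ ⊗ q2:

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qconj(q):
    """Conjugate; equals the inverse for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def qaxis(axis, degrees):
    """Unit quaternion for a rotation of `degrees` about a unit axis."""
    h = math.radians(degrees) / 2.0
    s = math.sin(h)
    return (math.cos(h), axis[0]*s, axis[1]*s, axis[2]*s)

# Two successive "sensor readings" about the same axis:
q_last = qaxis((1, 0, 0), 30)   # previous frame
q_now  = qaxis((1, 0, 0), 50)   # current frame

# Delta that carries q_last to q_now, i.e. q_last ⊗ delta = q_now
delta = qmul(qconj(q_last), q_now)

# Recombining the previous orientation with the delta recovers the new one.
recombined = qmul(q_last, delta)
print(all(abs(a - b) < 1e-9 for a, b in zip(recombined, q_now)))  # True
```

Here `delta` comes out as a 20° rotation about X, which is exactly the per-frame increment one would want to feed to Rotate() instead of the absolute orientation.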

Even more strange: Initially, I thought that

transform.Rotate(diffQuat.eulerAngles, Space.Self)

would be the same (in my case) as

transform.rotation = transform.rotation * diffQuat;

However, using transform.rotation, I also get only the rotation around the world’s axes when I move the sensor around.

So two questions:

  1. Why does the transform.rotation method not work?
  2. Why do I have to use Space.World in order to get rotations around the local axes?
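On question 2, the standard convention is that right-multiplying a delta applies it about the object's local axes, while left-multiplying applies it about the world axes. This can be demonstrated with plain quaternion algebra (a Python sketch with my own helpers, not Unity code; Hamilton convention, (w, x, y, z) order). Start with an object yawed 90° about Z, then apply an extra 90° about "X" on each side:

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qconj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def qaxis(axis, degrees):
    h = math.radians(degrees) / 2.0
    s = math.sin(h)
    return (math.cos(h), axis[0]*s, axis[1]*s, axis[2]*s)

def qrot(q, v):
    """Rotate vector v by unit quaternion q (active rotation)."""
    p = (0.0, v[0], v[1], v[2])
    w, x, y, z = qmul(qmul(q, p), qconj(q))
    return (x, y, z)

q0    = qaxis((0, 0, 1), 90)   # object already yawed 90° about world Z
delta = qaxis((1, 0, 0), 90)   # an extra 90° rotation about "X"
up    = (0, 0, 1)

# Left-multiplication: delta acts about the WORLD X axis.
world_result = qrot(qmul(delta, q0), up)   # -> (0, -1, 0)

# Right-multiplication: delta acts about the object's LOCAL X axis,
# which after the 90° yaw points along world Y.
local_result = qrot(qmul(q0, delta), up)   # -> (1, 0, 0)
```

Note that if the sensor quaternion is actually the frame-inverted one (earth with respect to sensor, as the update at the top of this thread suggests), the two roles effectively swap, which would be consistent with the puzzling Space.World behaviour described above.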

Can anybody tell me if that code is really the way to go, or if there’s something I haven’t realized yet about how to “attach” an IMU sensor to a simple cube in Unity?
By the way, this seems to be one of the most unresolved questions out there…

Update:

I’ve now spent two more days on this issue, and it’s solved for now. To be honest, as I’m quite new to Unity, I had to gain a lot more experience and understanding of rotations.

My question above has not been answered yet (and I couldn’t figure it out on my own), but I finally found THE solution that connects the sensor movement directly to the cube:

transform.rotation = Quaternion.Inverse(newSnsQuat);

I don’t know why I have to take the inverse, but I took this idea from a different piece of work, where these sensors have been used with an OpenGL framework. That’s the point where I admit that things are still more complicated or irritating than I thought. At least, it’s so stupidly simple that it takes no computational effort.

I would appreciate it a lot if someone with knowledge could clarify this for me:

Assigning the quaternion *without* the inverse in the above code rotates the cube around the world’s axes. Using the inverse rotates it around its own axes, perfectly fine, without needing something like transform.rotation = transform.rotation * newSnsQuat (which leads to continuous rotation around random axes anyway). So, what is the inverse of a quaternion in geometric terms?
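Geometrically, the inverse of a unit quaternion is its conjugate (negate the vector part) and represents the same rotation taken in the opposite direction; equivalently, it swaps which frame is expressed in which, so if q maps earth-frame coordinates into the sensor frame, q⁻¹ maps sensor into earth. A quick Python check (my own helpers, Hamilton convention, (w, x, y, z) order; not Unity code):

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qconj(q):
    """Conjugate; the inverse for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def qaxis(axis, degrees):
    h = math.radians(degrees) / 2.0
    s = math.sin(h)
    return (math.cos(h), axis[0]*s, axis[1]*s, axis[2]*s)

def qrot(q, v):
    """Rotate vector v by unit quaternion q."""
    p = (0.0, v[0], v[1], v[2])
    w, x, y, z = qmul(qmul(q, p), qconj(q))
    return (x, y, z)

q = qaxis((0, 0, 1), 90)

# Composing a unit quaternion with its conjugate yields the identity.
ident = qmul(q, qconj(q))                  # -> (1.0, 0.0, 0.0, 0.0)

# Rotating by q and then by its inverse returns the original vector.
v = (1.0, 2.0, 3.0)
back = qrot(qconj(q), qrot(q, v))
```

That is why inverting the AHRS output works here: the algorithm reports the earth frame relative to the sensor, and the inverse turns that into the sensor's orientation relative to earth, which is what the GameObject needs.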

The only other thing I had to do was to set the rotation of my object as the initial condition for my AHRS algorithm (the one that translates sensor data into a quaternion). And that’s it for now.
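One common way to handle that initial condition, if the AHRS itself cannot be seeded (an assumption on my part, not necessarily what was done here), is to capture the first sensor reading and pre-multiply an offset so the object keeps its starting pose. A Python sketch (my own helpers and names, Hamilton convention, (w, x, y, z) order):

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def qconj(q):
    """Conjugate; the inverse for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def qaxis(axis, degrees):
    h = math.radians(degrees) / 2.0
    s = math.sin(h)
    return (math.cos(h), axis[0]*s, axis[1]*s, axis[2]*s)

q_obj0 = qaxis((0, 1, 0), 45)   # object's starting pose in the scene (example)
q_s0   = qaxis((1, 0, 0), 10)   # first sensor reading (mounting offset, example)

# Captured once at startup: offset ⊗ q_s0 == q_obj0
offset = qmul(q_obj0, qconj(q_s0))

def apply(q_sensor):
    """Orientation to assign to the object for a given sensor reading."""
    return qmul(offset, q_sensor)

# At startup the object keeps its original pose: apply(q_s0) == q_obj0
```

After that, subsequent sensor readings move the object relative to its initial pose rather than snapping it to the sensor's absolute frame.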

I’ll keep this post updated for people who might find it important or necessary at some point and who had to struggle with the same issue(s).

Hi @emthele - I know it’s been a while since you posted this, but did you ever figure out the reason for the inverse?

Check this http://answers.unity3d.com/answers/1163849/view.html

I’m also looking at IMUs and hoping to send quaternion data over Bluetooth LE. I’m using two Arduino 101s (one on each wrist) for movement.

Did you use/make your own AHRS algorithm or one from the company that makes your IMU and then convert to quaternion?

I’m using some of the data that comes from FreeIMU or from the Intel Curie chip and followed some info from Phil’s blog (http://philstech.blogspot.com)

In my case, calibration of the initial position is important because the IMUs are not flat on a surface and also face in opposite directions.

Would love to hear how your projects are going for mo-cap too.

Cheers…

Blaine

Hi, I’m quite a beginner as well. I wanted to ask: which brand of IMU did you use, and how did you make Unity interface with it?