Posted by: David G, Chief Explorer

Understand the telemetry needed to ensure your 360 videos are loaded in 360 players facing the same direction as they were shot.

Did you see my post last week? Adjusting the yaw of an equirectangular 360 photo using ImageMagick.

The example used extracted equirectangular frames. However, in the case of videos there are more efficient ways to achieve the same thing, as I’ll show you in this post.

Assuming the yaw is off by a fixed amount for the duration of the video, for example because the camera was facing the wrong direction or the monopod was angled slightly left or right throughout, this is quite easy to correct using ffmpeg.

I’ll use the same World Lock example video from last week.

Here it is before yaw adjustment:

The v360 filter takes a fixed yaw value to apply to all frames, like so:

ffmpeg -i GS010013-worldlock.mp4 -vf v360=e:e:yaw=180 -c:v libx265 GS010013-worldlock-yaw180.mp4
  • v360 : filter name
    • e : abbreviation for “equirectangular” (this is the input format)
    • e : abbreviation for “equirectangular” (this is the desired output format)
    • yaw : horizontal center of the equirectangular output [0 - 360], relative to the current yaw (0)

Don’t forget to copy over global metadata too (the above ffmpeg command will only copy streams):

exiftool -TagsFromFile GS010013-worldlock.mp4 "-all:all>all:all" GS010013-worldlock-yaw180.mp4

And the result:

See how the video now faces in the opposite direction, because the yaw has been offset in each video frame by 180 degrees.

A note on XMP-GSpherical metadata tags

A better way to account for a fixed yaw offset is to use the XMP-GSpherical InitialViewHeadingDegrees video metadata tag to achieve the same result.

The InitialViewHeadingDegrees tag determines the starting yaw of the view in the player. It defaults to the center of the image. However, if you know there is a fixed offset you can account for it here.

So using my previous example, I could set XMP-GSpherical:InitialViewHeadingDegrees to 180 and the video would play in the viewer in the same way as my ffmpeg processed video above (without any need for post-processing).
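For example, something along these lines with exiftool (a sketch, and an assumption on my part: XMP-GSpherical tags are stored inside the video track, and not every exiftool version can write them, so verify the output afterwards with exiftool -XMP-GSpherical:all):

exiftool -XMP-GSpherical:InitialViewHeadingDegrees=180 GS010013-worldlock.mp4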

End note.

Of course, neither approach fixes my video processed in World Lock mode. Unlike a fixed yaw offset, the World Lock offset is dynamic, changing with each frame.

Luckily for us, the GoPro GPMD telemetry allows us to calculate true heading for each frame in the video.

I’ve talked about extracting GoPro telemetry previously with regards to GPS points.

The GPMD telemetry includes a whole host of data, including MAGN (values recorded by the camera’s Magnetometer) and CORI (Camera Orientation).

For reference, here is the axis configuration for the GoPro’s sensors:

GoPro Camera Axis Orientation

  • x rotation = pitch
  • y rotation = roll
  • z rotation = yaw

CORI (Camera orientation values)

In GPMD, camera orientation is a relative measurement (the orientation relative to the orientation the sensor had when the acquisition started), as opposed to an absolute measurement (like orientation relative to magnetic north).

The first CORI value for our example World Lock video (GS010013-worldlock.mp4) looks like this (extracted using gopro-telemetry):

"CORI":{
  "samples":[{
    "value":[0.9989318521683401,-0.024964140751365705,0.02621539963988159,0.029206213568529312],
    "cts":176.62,
    "date":"2022-05-26T08:35:42.485Z",
    "sticky":{
      "VPTS":1261037}
    },

The values shown (0.9989318521683401,-0.024964140751365705,0.02621539963988159,0.029206213568529312) are the components of a Quaternion.

Quaternions contain 4 scalar variables (sometimes known as Euler Parameters, not to be confused with Euler angles). For GoPro cameras these are printed in the following order: w,x,y,z (according to [this thread](https://github.com/gopro/gpmf-parser/issues/100)). w is a scalar that stores the rotation around the vector (x,y,z).

I won’t try explaining Quaternions here, but recommend this video which helped me to understand the concept and why they’re needed (because of Gimbal Lock):

I’d also recommend this post on the subject: How to think about Quaternions.

Camera Orientation (Quaternions) is reported at the same frame rate as the video (which can vary depending on the framerate setting chosen on the camera, and is also reported in the telemetry as "frames/second").

The relative quaternion samples can therefore be used to calculate yaw, pitch, and roll angles for each frame in the video.
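To make that concrete, here is a minimal sketch of the standard quaternion-to-Euler conversion (my own illustration, not from the GoPro docs), applied to the first CORI sample above. Given the GoPro axis convention, the rotation around z is the yaw we are after:

import math

def quaternion_to_euler(w, x, y, z):
    # Standard quaternion -> Euler conversion, returning angles in degrees.
    # With the GoPro axis convention above: x rotation = pitch,
    # y rotation = roll, z rotation = yaw.
    x_rot = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    y_rot = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    z_rot = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (x_rot, y_rot, z_rot))

# First CORI sample from GS010013-worldlock.mp4, in w,x,y,z order
pitch, roll, yaw = quaternion_to_euler(
    0.9989318521683401, -0.024964140751365705,
    0.02621539963988159, 0.029206213568529312)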

MAGN (Magnetometer values)

The first MAGN value for our original example video (GS010013-worldlock.mp4) looks like this (extracted using gopro-telemetry):

"MAGN":{
	"samples":[{
		"value":[-4,88,27],
		"cts":163.461,
		"date":"2022-05-26T08:35:42.485Z"
	},

Values from the Magnetometer are reported in the axis order z (yaw), x (pitch), y (roll), in MicroTeslas.

MicroTeslas measure magnetic flux density (often referred to as the magnetic field).

MAGN samples are taken at an approximate frequency of 24Hz (which can be lower than the framerate of the video; thus, not every frame has a directly corresponding MAGN measurement).
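One way to bridge this gap is to interpolate the magnetometer values onto each frame’s cts timestamp. A minimal sketch, assuming magn_samples and cori_samples hold the "samples" arrays shown in the telemetry extracts above:

import numpy as np

# cts timestamps (milliseconds) for each sample stream
magn_cts = np.array([s["cts"] for s in magn_samples])
magn_vals = np.array([s["value"] for s in magn_samples])  # columns: z, x, y
frame_cts = np.array([s["cts"] for s in cori_samples])    # one CORI sample per frame

# Linearly interpolate each magnetometer axis onto the frame timestamps
magn_per_frame = np.stack(
    [np.interp(frame_cts, magn_cts, magn_vals[:, i]) for i in range(3)],
    axis=1)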

Using the x, y components of the Magnetometer samples in addition to the pitch and roll angles calculated from the CORI samples, we can calculate the absolute degrees the camera was facing from magnetic North (its heading).
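For illustration, a common tilt-compensated compass formula looks something like this (a sketch under my own assumptions; remapping the GoPro’s z, x, y magnetometer ordering onto mx, my, mz, and checking sign conventions, is left for part 2):

import math

def heading_degrees(mx, my, mz, pitch, roll):
    # Project the magnetic field vector onto the horizontal plane,
    # using pitch and roll (in radians) to undo the camera tilt.
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    # Heading measured clockwise from magnetic North, in degrees [0, 360)
    return math.degrees(math.atan2(yh, xh)) % 360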

Calculating pitch, roll and yaw

Stay tuned for part 2 of this post next week, showing a proof-of-concept to calculate these values, and how to use the calculated yaw value to dynamically adjust the yaw of each frame in ffmpeg.


