TimeLapse video mode is perfect for creating beautiful timelapses, but the resulting videos pose some unique challenges when you try to split them into individual frames.

Earlier in the year I talked about turning videos shot on HERO and MAX cameras in TimeWarp mode into still frames.

The GoPro Fusion camera was released before TimeWarp mode was introduced by GoPro into their lineup of cameras (TimeWarp 1.0 with the Hero 7 in 2019 and TimeWarp 2.0 with HERO 8 Black in 2020).

TimeLapse video mode, available on all GoPro cameras (it was the only timelapse video option before TimeWarp arrived with the HERO 7), is better suited to recordings where you’re not moving (the benefit of TimeWarp is the stability/smoothing it adds to the video).

Like with TimeWarp videos, when it comes to turning GoPro TimeLapse videos into photo frames, there are a few extra things to consider when compared to normal GoPro videos (non-TimeLapse/TimeWarp).

If you’ve reached this post looking to convert a regular video into frames, the process for doing so is described in this post from 2021.

TimeLapse 101

From the GoPro user manuals:

Use this preset to capture timelapse video when your camera is mounted and still. It’s great for sunsets, street scenes, art projects, and other shots that unfold over a long period of time.

There are 7 interval options: 0.5s, 1s, 2s, 5s, 10s, 30s, and 60s.

The interval determines how often a frame is captured. For example, setting the interval to 10s captures one frame every 10 seconds.

GoPro then packs the resulting .mp4 (or dual .mp4s from the Fusion, or .360 from the MAX) into a video with a frame rate of 30 FPS (29.97 to be exact).

The interval setting therefore also determines how fast the real world time is sped up.

Here are some examples using each TimeLapse interval setting and the resulting output TimeLapse video lengths (for a 600 second recording):

Interval (secs) | Recording time (secs) | TimeLapse video length (secs)
0.5             | 600                   | 40
1               | 600                   | 20
2               | 600                   | 10
5               | 600                   | 4
10              | 600                   | 2
30              | 600                   | 0.67
60              | 600                   | 0.33

The calculation to work out video length (secs) in the table above is: recording time (secs) / (interval (secs) * 30 (fps)).

You can see when shooting at a 60 second interval, even 10 mins of recording (600 seconds) produces just 0.3 seconds worth of TimeLapse video (to get a second, you’d need to record for 30 minutes!).

It’s also possible to calculate actual recording time according to mode. We know the TimeLapse frames are packed at 29.97 per second, so we can multiply the interval by the frame rate.

Using TimeLapse 0.5 second mode as an example: 29.97 * 0.5 = 14.985, so 1 second of output video represents 14.985 seconds of actual recording time.

Or for TimeLapse 60 second mode: 29.97 * 60 = 1798.2, so 1 second of output video represents 1798.2 seconds (29.97 minutes) of actual recording time.
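The two calculations above can be sketched in Python (a minimal sketch; the function names are my own, not GoPro's):

```python
FPS = 29.97  # GoPro packs TimeLapse frames at 29.97 frames per second

def video_length(recording_secs, interval_secs):
    """Length of the output TimeLapse video for a given recording time."""
    return recording_secs / (interval_secs * FPS)

def recording_time(video_secs, interval_secs):
    """Real-world recording time represented by a given length of output video."""
    return video_secs * interval_secs * FPS

# 10 minutes recorded at a 60s interval yields roughly a third of a second of video
print(round(video_length(600, 60), 2))   # 0.33
# 1 second of video shot at a 60s interval represents 29.97 minutes of recording
print(round(recording_time(1, 60), 1))   # 1798.2
```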

Still a little confused? Here I cycled round my block in each TimeLapse mode to demonstrate what the output of the same circuit looks like (you can see the actual recording times are all very similar)…

GoPro MAX 360 Timelapse 0.5s (2_1) Sample

GoPro MAX 360 Timelapse 1s (1_1) Sample

GoPro MAX 360 Timelapse 2s (1_2) Sample

GoPro MAX 360 Timelapse 5s (1_5) Sample

GoPro MAX 360 Timelapse 10s (1_10) Sample

GoPro MAX 360 Timelapse 30s (1_30) Sample

  • Duration (.360): 0.20 s
  • Number of frames (.360): 0.2 * 29.97 = 5.994
  • Actual recording length (.360): 0.20 * (30 * 29.97) = 179.82s

Too short for YouTube upload (but can be processed by GoPro Player).

GoPro MAX 360 Timelapse 60s (1_60) Sample

  • Duration (.360): 0.10 s
  • Number of frames (.360): 0.1 * 29.97 = 2.997
  • Actual recording length (.360): 0.10 * (60 * 29.97) = 179.82s

Too short to be processed by GoPro Player.

Identifying videos shot in TimeLapse

GoPro exposes whether the video was shot in TimeLapse mode using the GoPro:Rate metatag as follows: 0.5s = 2_1SEC, 1s = 1_1SEC, 2s = 1_2SEC, 5s = 1_5SEC, 10s = 1_10SEC, 30s = 1_30SEC, 60s = 1_60SEC.

As an example, <GoPro:Rate>2_1SEC</GoPro:Rate> identifies a video shot in TimeLapse mode at a 0.5s interval.

  • FUSION:
    • for dual fisheye mp4 videos, in the front file (GPFR) file metadata
      • note, this tag is not in the single processed .mp4 from GoPro Fusion Studio software – it is therefore impossible to automatically detect if these files were shot in TimeLapse and what interval setting was used.
  • MAX:
    • for 360 videos, in the raw .360 file metadata
      • note, this tag is not in the processed .mp4 from GoPro Player software – it is therefore impossible to automatically detect if these files were shot in TimeLapse and what interval setting was used.
    • for HERO videos, in the .mp4 file metadata.
  • HERO:
    • for HERO videos, in the .mp4 file metadata.

Be careful: the GoPro:Rate metatag is used to identify TimeWarp videos too (but the values are different).
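Once you have read the GoPro:Rate value out of the file's metadata (for example with exiftool), mapping it back to an interval is a simple lookup. A minimal sketch, assuming the tag values listed above (the function name is my own; `15X` is used here only as a stand-in for a non-TimeLapse value):

```python
# GoPro:Rate values for TimeLapse mode, mapped to interval seconds
# (taken from the list above; TimeWarp uses different values, so
# anything unmapped is treated as not-TimeLapse).
TIMELAPSE_RATES = {
    "2_1SEC": 0.5,
    "1_1SEC": 1,
    "1_2SEC": 2,
    "1_5SEC": 5,
    "1_10SEC": 10,
    "1_30SEC": 30,
    "1_60SEC": 60,
}

def timelapse_interval(rate_tag):
    """Return the TimeLapse interval in seconds, or None if the
    GoPro:Rate value does not correspond to a TimeLapse mode."""
    return TIMELAPSE_RATES.get(rate_tag)

print(timelapse_interval("2_1SEC"))  # 0.5
print(timelapse_interval("15X"))     # None (a TimeWarp-style value)
```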

Also, a secondary check can be made to validate that the video does not have an audio track (no TimeLapse videos, in any mode, contain an audio track). If the below is present in the video metadata, it confirms the video was shot in normal video mode (not TimeLapse mode):

<TrackN:HandlerClass>Media Handler</TrackN:HandlerClass>
<TrackN:HandlerType>Audio Track</TrackN:HandlerType>
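This secondary check could be sketched as follows (a hypothetical helper of my own; note the track number in TrackN varies between files, so we match on the handler type value rather than the full tag):

```python
def has_audio_track(metadata_xml):
    """Secondary check: no TimeLapse video contains an audio track,
    so the presence of an audio handler rules out TimeLapse mode.
    TrackN varies (Track1, Track2, ...), hence the partial match."""
    return ">Audio Track<" in metadata_xml

# A video with this handler was shot in a normal video mode, not TimeLapse
print(has_audio_track("<Track3:HandlerType>Audio Track</Track3:HandlerType>"))  # True
print(has_audio_track("<Track1:HandlerType>Video Track</Track1:HandlerType>"))  # False
```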

Choosing the right framerate for extraction

We already know GoPro packs frames at 30 FPS in the final video.

However, GPS points (and other video metadata) are recorded in real-world time.

What this means is that the video track is sped up, but the telemetry track runs in normal time.

Therefore, when frames are extracted, we also need to normalise the timestamp assigned to each frame to match the real-world time it was taken.

We can work out the real-world time spacing of the extracted frames using the ffmpeg extraction framerate and the GoPro TimeLapse interval (as demonstrated in the last table) with the calculation: (30 fps * interval setting) / extraction rate.

The table below gives some examples;

TimeLapse interval (secs) | Photo spacing (secs) @ 0.1 FPS | @ 0.2 FPS | @ 0.5 FPS | @ 1 FPS | @ 2 FPS | @ 5 FPS
0.5                       | 150                            | 75        | 30        | 15      | 7.5     | 3
1                         | 300                            | 150       | 60        | 30      | 15      | 6
2                         | 600                            | 300       | 120       | 60      | 30      | 12
5                         | 1500                           | 750       | 300       | 150     | 75      | 30
10                        | 3000                           | 1500      | 600       | 300     | 150     | 60
60                        | 18000                          | 9000      | 3600      | 1800    | 900     | 360

For example, if a video was shot at a 2 second interval, and we extract frames from the video at 5 FPS using ffmpeg, each frame extracted by ffmpeg will be exactly 12 seconds apart in real-world time.
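The spacing calculation can be sketched in Python (a minimal sketch; the function name is my own):

```python
PACKED_FPS = 30  # GoPro packs TimeLapse frames at 30 FPS (29.97 to be exact)

def photo_spacing(interval_secs, extraction_fps):
    """Real-world seconds between consecutive frames extracted by ffmpeg."""
    return (PACKED_FPS * interval_secs) / extraction_fps

# A 2 second interval extracted at 5 FPS gives frames 12 seconds apart
print(photo_spacing(2, 5))    # 12.0
# A 0.5 second interval extracted at 1 FPS gives frames 15 seconds apart
print(photo_spacing(0.5, 1))  # 15.0
```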

The ffmpeg command for this would be something like;

$ ffmpeg -i GS019006.mp4 -r 5 -q:v 2 FRAMES/img%d.jpg

Setting frame times

The other crucial piece of information required to timestamp the frames is when the recording started.

To assign first photo time, we can use the first GPSDateTime value reported in telemetry.

Using the known time spacing between photos, you can then incrementally add the times to all subsequent photos extracted (this is why it is important to logically name your sequences when extracting with ffmpeg, e.g. in numerical order: img%d.jpg).

For example, if the time spacing between images is 12 seconds and the first GPSDateTime is 12:00:00, then image1’s time is 12:00:00, image2’s is 12:00:12, image3’s is 12:00:24, and so on.
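The incrementing step can be sketched as follows (a minimal sketch; the function name and the example start time are my own):

```python
from datetime import datetime, timedelta

def frame_times(first_gps_time, spacing_secs, n_frames):
    """Assign a real-world timestamp to each extracted frame, starting
    from the first GPSDateTime and incrementing by the known spacing."""
    return [first_gps_time + timedelta(seconds=spacing_secs * i)
            for i in range(n_frames)]

# 12 second spacing, first GPSDateTime of 12:00:00
start = datetime(2021, 9, 1, 12, 0, 0)
for i, t in enumerate(frame_times(start, 12, 3), start=1):
    print(f"img{i}.jpg -> {t.time()}")  # 12:00:00, 12:00:12, 12:00:24
```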

All that is left to do now is actually write the correct times into the photos.

The steps to do this are described in this post (as well as how to extract and write GPS points from the TimeLapse video to each image).


Posted by:

David G, Trek View Chief Explorer