An earth mosaic

So a while back I created a Twitter bot, @earthin24, that generates a video of Earth, as seen from the Himawari-8 satellite, every day.

To see the year out I wanted to combine those daily videos into a single mosaic video.

The generated videos aren’t that big, so I never bothered to clean up after my script runs on my server. That saved me having to generate them all again, phew!

First I can get a quick count of all the videos I have:

ls *.mp4 | wc -l

That gives me 196 videos captured so far. The next step is figuring out how I can arrange them into a tiled square.

Math.sqrt(196) // lets me know I can tile in a nice even 14x14 arrangement.

Creating a mosaic with ffmpeg

This handy write up not only saved me a lot of time but also broke down the often cryptic-looking CLI flags that ffmpeg has. I won’t reiterate what the syntax means; I highly recommend you read the article for a great breakdown of the command that follows.

Not wanting to hand-write this syntax for 196 videos, I turned to scripting it, in JS no less.

This is the output we want our JS to generate so I can copy it into a shell script to execute:

ffmpeg
        -i video.mp4 -i ...
        -filter_complex "
                color=s=720x720:c=black [base];
                [0:v] setpts=PTS-STARTPTS, scale=51x51 [vid0];
                [1:v] setpts=PTS-STARTPTS, scale=51x51 [vid1];
                //...
                [base][vid0] overlay=shortest=1 [tmp0];
                [tmp0][vid1] overlay=shortest=1:x=51 [tmp1];
                // ...
        "
        -c:v libx264 earth_mosaic.mp4

The first thing we can do is use the printf command to generate the -i flags in our command without having to write them out 196 times.

printf " -i %s" 2017*.mp4

This iterates over every file matching 2017*.mp4 and outputs each one as -i 20170620.mp4 and so on. So our command above can be changed to:

ffmpeg
        `printf " -i %s" 2017*.mp4`
        -filter_complex "
                color=s=720x720:c=black [base];
                [0:v] setpts=PTS-STARTPTS, scale=51x51 [vid0];
                [1:v] setpts=PTS-STARTPTS, scale=51x51 [vid1];
                //...
                [base][vid0] overlay=shortest=1 [tmp0];
                [tmp0][vid1] overlay=shortest=1:x=51 [tmp1];
                // ...
        "
        -c:v libx264 earth_mosaic.mp4

The backticks perform command substitution: the shell runs the enclosed command and replaces it with whatever that command writes to stdout.

Generating the complex filter argument

The first thing we need to do is generate the video stream slots and label them all appropriately, so we can refer to them in the second step to overlay each scaled video at the correct [x,y] position on our black video canvas.

var videos = Array(196).fill('vidya');

function slots(videos, scale) {
  return videos.reduce((acc, cur, idx) => {
    return acc + `[${idx}:v] setpts=PTS-STARTPTS, scale=${scale}x${scale} [vid${idx}];\n`
  }, '')
}

First we create an array that matches the length of the videos we have, 196.

Then we have a function that accepts the array and a scale argument.

var scale = Math.round(resolution/Math.sqrt(videos.length));

This rounds our resolution (2160 for 4K) divided by the square root of our video count (14) to a whole number, giving us a video slot size of 154×154, since we’re creating a square 2160×2160 video.
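Worked through with our numbers, the grid size and slot size fall out like this:

```javascript
// 196 videos on a square 2160x2160 (4K) canvas.
var resolution = 2160;
var perSide = Math.sqrt(196);                 // 14 videos per row/column
var scale = Math.round(resolution / perSide); // Math.round(154.28...) -> 154

console.log(perSide, scale); // 14 154
```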

return videos.reduce((acc, cur, idx) => {
  return acc + `[${idx}:v] setpts=PTS-STARTPTS, scale=${scale}x${scale} [vid${idx}];\n`
}, '')

Here we use Array#reduce to fold the video array into a single string containing all the video slots we’ll use in the next step.
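We can sanity-check slots() by running it over a tiny array — here a 2×2 grid of 4 videos at the 154px scale:

```javascript
function slots(videos, scale) {
  return videos.reduce((acc, cur, idx) => {
    return acc + `[${idx}:v] setpts=PTS-STARTPTS, scale=${scale}x${scale} [vid${idx}];\n`
  }, '')
}

console.log(slots(Array(4).fill('vidya'), 154));
// [0:v] setpts=PTS-STARTPTS, scale=154x154 [vid0];
// [1:v] setpts=PTS-STARTPTS, scale=154x154 [vid1];
// [2:v] setpts=PTS-STARTPTS, scale=154x154 [vid2];
// [3:v] setpts=PTS-STARTPTS, scale=154x154 [vid3];
```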

Now we’ve built up our 196 slots, we can overlay our videos into them, positioning each one in the right spot on our video canvas.

function overlay(videos, scale) {
  return videos.reduce((acc, cur, idx, arr) => {
    if (idx === 0) {
      return `${acc} [base][vid${idx}] overlay=shortest=1 [tmp${idx}];\n`;
    }

    var vidLen = videos.length;
    var sqrt = Math.sqrt(vidLen);
    var row = Math.floor(idx/sqrt);
    var tag = vidLen - 1 === idx ? '' : `[tmp${idx}];\n`;

    var y = scale * row;
    var x = scale * (idx % sqrt);

    return `${acc} [tmp${idx-1}][vid${idx}] overlay=x=${x}:y=${y} ${tag}`;
  }, '')
}

Much like the slots function this also accepts the videos array and a scale argument.

if (idx === 0) {
  return `${acc} [base][vid${idx}] overlay=shortest=1 [tmp${idx}];\n`;
}

The first item is handled differently, as we need to specify that the first video overlays our [base] video canvas. The [x,y] coordinates default to [0,0], so we don’t specify them for the first item.

var vidLen = videos.length;
var sqrt = Math.sqrt(vidLen);
var row = Math.floor(idx/sqrt);
var tag = vidLen-1 === idx ? '' : `[tmp${idx}];\n`;

The important variable here is row, where Math.floor tells us which row of the grid we’re currently on.

var y = scale * row;

With the row figured out we can multiply our scale (154) by the current row to get our y position on the video canvas.

var x = scale * (idx % sqrt);

The x position is slightly different, as we need to adjust it for each item in the row so the videos sit next to each other. To do this we use the handy remainder operator (%), which tells us where along the row the video should be overlaid, e.g.

Say we’re up to item 15, which sits in the second row:

15 % 14 // 1

This returns 1, telling us the video is the second slot along in its row: 14 goes into 15 once, leaving a remainder of 1, which is what the remainder operator (%) returns. (The row itself comes from Math.floor(15 / 14), which is also 1, i.e. the second row.)
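Putting the row and column arithmetic together for item 15, with our 14-wide grid and 154px slots:

```javascript
var sqrt = 14;   // videos per row
var scale = 154; // slot size in pixels
var idx = 15;

var row = Math.floor(idx / sqrt); // 1 -> second row
var col = idx % sqrt;             // 1 -> second slot along that row

var y = scale * row; // 154
var x = scale * col; // 154

console.log(x, y); // 154 154
```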

The last line just builds up the string we’ll return, which will look like the following:

[base][vid0] overlay=shortest=1 [tmp0];
[tmp0][vid1] overlay=x=154:y=0 [tmp1];
// ...
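Likewise we can sanity-check overlay() on a tiny 2×2 grid to see the full chain of [tmp] labels and positions it generates:

```javascript
function overlay(videos, scale) {
  return videos.reduce((acc, cur, idx, arr) => {
    if (idx === 0) {
      return `${acc} [base][vid${idx}] overlay=shortest=1 [tmp${idx}];\n`;
    }

    var vidLen = videos.length;
    var sqrt = Math.sqrt(vidLen);
    var row = Math.floor(idx/sqrt);
    var tag = vidLen - 1 === idx ? '' : `[tmp${idx}];\n`;

    var y = scale * row;
    var x = scale * (idx % sqrt);

    return `${acc} [tmp${idx-1}][vid${idx}] overlay=x=${x}:y=${y} ${tag}`;
  }, '')
}

console.log(overlay(Array(4).fill('vidya'), 154));
//  [base][vid0] overlay=shortest=1 [tmp0];
//  [tmp0][vid1] overlay=x=154:y=0 [tmp1];
//  [tmp1][vid2] overlay=x=0:y=154 [tmp2];
//  [tmp2][vid3] overlay=x=154:y=154
```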

Now we have our two functions to produce the slots and the video overlays in the correct positions, we can put them together into a function that outputs the ffmpeg command, ready to copy into a shell script and run against our videos to produce the mosaic.

function ffmpegcmd() {
  var resolution = 2160;
  var videos = Array(196).fill('vidya');
  var scale = Math.round(resolution/Math.sqrt(videos.length));

  return `ffmpeg \`printf " -i %s" 2017*.mp4\` -filter_complex "
    color=s=${resolution}x${resolution}:c=black [base];
    ${slots(videos, scale)}
    ${overlay(videos, scale)}
" -c:v libx264 earth_mosaic.mp4
`
}

As mentioned above, this function creates the videos and scale arguments to be passed into the slots and overlay functions. The string it returns is the same as the one shown at the start of the article.

One neat trick in the Firefox and Chrome dev tools is that you can wrap ffmpegcmd() in the copy() function to put the output straight on your clipboard.

copy(ffmpegcmd()) // returns undefined and adds output to your clipboard instead

ffmpeg is amazing

ffmpeg is an incredibly powerful tool that made this possible, and it’s amazing that it just worked, combining 196 videos like that.