Cropping my webcam with ffmpeg

My laptop is on a riser, at a healthy distance and height from me. But when I do video calls, you see a lot of my surroundings (which aren’t great) and little of me. I sometimes tilt the screen so that I end up vertically centered in the frame.

Since I’ve learnt ffmpeg is designed for real-time processing, I decided this would be a good problem to solve and learn with!

I created a loopback video4linux device, then ran ffmpeg to take my built-in laptop webcam’s stream, crop it, sharpen it, add a tiny bit of color saturation and feed the result into the loopback device. It looks pretty good! Not MacBook Pro-webcam-good, but good!
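
The loopback device comes from the v4l2loopback kernel module. Here’s a minimal sketch of loading it (the device number and label are just examples, not necessarily what I used):

sudo modprobe v4l2loopback video_nr=3 card_label="Cropped webcam" exclusive_caps=1

exclusive_caps=1 makes the device advertise itself as a plain capture device only while something is feeding it, which helps browsers and video call apps pick it up.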

Here’s the original webcam output and the final ffmpeg result, side by side:

Here’s my ffmpeg command, broken up for readability:

ffmpeg -f v4l2 \
  -framerate 30 \
  -video_size 640x480 \
  -input_format mjpeg \
  -i /dev/video1 \
  -vcodec rawvideo \
  -pix_fmt yuv420p \
  -filter:v "crop=494:370:103:111,scale=640:480,setsar=1,unsharp,eq=saturation=1.2" \
  -f v4l2 \
  /dev/video3

The magic happens in -filter:v. If you split it on the commas you’ll see the cropping, the scaling back to 640×480, the sharpening and the color saturation boost.
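
Roughly, this is what each filter in the chain does, in order (my own annotation, paraphrasing the filter documentation):

crop=494:370:103:111   keep a 494×370 region whose top-left corner is at (103,111)
scale=640:480          scale that region back up to the full 640×480 frame
setsar=1               set the sample aspect ratio to 1:1 so the output isn’t reported as stretched
unsharp                sharpen with the filter’s default settings
eq=saturation=1.2      boost color saturation by 20%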

I had never taken the time to properly learn ffmpeg and, while the learning curve is a bit steep, it makes perfect sense: it’s very well designed.

ffmpeg can do that?

I was reading Drew DeVault’s In praise of ffmpeg and came across this part:

I was recently hanging out at my local hackerspace and wanted to play some PS2 games on my laptop. My laptop is not powerful enough to drive PCSX2, but my workstation on the other side of town certainly was. So I forwarded my game controller to my workstation via USB/IP and pulled up the ffmpeg manual to figure out how to live-stream the game to my laptop. ffmpeg can capture video from KMS buffers directly, use the GPU to efficiently downscale them, grab audio from pulse, encode them with settings tuned for low-latency, and mux it into a UDP socket.

And I was like… ffmpeg can do that? I didn’t know it was possible to do such a complex thing using just ffmpeg. Fair enough, there’s the USB-over-IP thing for the gamepad, but still.
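
For reference, here’s a rough sketch of what such a pipeline might look like, pieced together from the ffmpeg documentation (this is not Drew’s actual command; the DRM device, host, resolution and bitrates are placeholders, the low-latency tuning is omitted, and kmsgrab needs CAP_SYS_ADMIN or root):

ffmpeg -device /dev/dri/card0 -f kmsgrab -i - \
  -f pulse -i default \
  -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1280:h=720:format=nv12' \
  -c:v h264_vaapi -b:v 4M -g 60 \
  -c:a aac -b:a 128k \
  -f mpegts 'udp://laptop.example:9000?pkt_size=1316'

The KMS frames are grabbed straight from the display, downscaled on the GPU via VAAPI, encoded in hardware, muxed together with PulseAudio audio into MPEG-TS and pushed out over a UDP socket.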

I mentioned it to Oliver and he explained some things I know very little about and should set aside time to learn: KMS, the role of the compositor, Wayland (I use X11), etc.

Creating a timelapse video with ffmpeg

If you have a collection of image files, you can build a timelapse video with ffmpeg like this:

ffmpeg -r 30 -pattern_type glob -i "*.png" -vcodec libx264 output.mp4

-r sets the number of images (frames) per second. For example, -r 1 shows each image for one second, while -r 30 gives a smooth 30-frames-per-second animation.
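
One caveat: if the resulting file doesn’t play in some players, it’s likely because libx264 picked a pixel format like yuv444p for the PNG input; forcing the widely supported yuv420p usually helps:

ffmpeg -r 30 -pattern_type glob -i "*.png" -vcodec libx264 -pix_fmt yuv420p output.mp4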