Automagically Creating a Nightly “News” Show from Videos I Can’t Watch During the Day

(How I built a quick-and-dirty, personalized version of YouTube’s Watch Later, but for the entire web.)

Michael David Murphy
5 min read · Dec 14, 2018

Today’s news comes at you full-bore. Following its twists and turns feels like being on the other end of a firehose.

I wondered if there might be a way to cobble together the day’s links, tweets and short video clips into a “news broadcast” of sorts. Back in the day, there were micro-services for this (I recall a universal watch-later bookmarklet from Boxee that allowed you to build a personal queue of video content, regardless of where it was located).

I’m not a developer, but I have a Mac, know a bit of command line, and muck around with Python scripts. What I wanted was a tool that would let me put my feet up and watch 30 minutes of the day’s nonsense after putting the kids to bed, all in a one-click compilation. Kind of like saving an article to read later in Instapaper, but for video.


Hazel is fantastic, and this project pushed me to dive deep into its automation/workflow features, which feel limitless, especially when paired with short commands and shell scripts. I’ve used it for years, but never with custom scripts.

ffmpeg is a workhorse, and although it took a few late nights to debug the errors it threw, it was essential to the project. I installed it via brew.

youtube-dl is frequently updated, totally fresh, infinitely configurable, and is just plain fun to work with. I think you can install it via brew, too.

Drafts is an iOS utility knife, and like Hazel, is limited only by your imagination. Two taps are all it takes to save anything I want to see later. It’s a writing tool and a text-saving tool, and I use it to trigger actions on a series of Dropbox-connected folders where the downloading/saving/renaming/transcoding happens.

Plex. You don’t need this to complete the flow. Maybe you’ve written your own XBMC-based video content management & viewing system; I use Plex.


I’m sure I over-engineered this whole thing, but when you don’t know how to elegantly write actual code, you sometimes have to walk around the block to get to your neighbor’s house.

Here’s the flow, as quickly (and succinctly) as I can describe it.

  1. On my phone, I see a tweet with a 2-minute video I want to watch later. I copy that tweet’s URL with the iOS share sheet and send it to Drafts.
  2. I run a one-tap Drafts action that writes the tweet’s URL to a blank txt file on Dropbox.
  3. Hazel (on a remote desktop machine) notices the txt file is now more than 0 bytes and passes the tweet’s URL to youtube-dl (via a shell script inside the Hazel rule).
  4. youtube-dl downloads the file as an mp4 and writes the completed URL into an archive so the flow doesn’t try to download it again; upon completion, ffmpeg transcodes the mp4 to an intermediate, slightly downsampled mpg. (I’m not sure this is necessary, but ffmpeg appeared to prefer working with mpgs over mp4s.)
  5. The downloaded mp4 gets shuffled off to a self-deleting archive folder, from which I can save anything I especially want to hang onto (everything else Hazel deletes automatically after a week or so), like this:
This gif was originally a tweet of a video, which my workflow downloaded; I turned it into a GIF because that’s what Medium prefers, and I wanted to link to the tweet, not embed it.

The five steps above occur whenever I see something I want to save (they require only four taps on my phone, and fewer if I’m at my desktop).
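Steps 3 and 4 can be sketched in a few lines. This is illustrative, not my exact Hazel rule: the paths and the output template are placeholders, and Hazel would hand the matched txt file’s path to the script as its first argument. The `--download-archive` flag is youtube-dl’s built-in “don’t fetch this twice” list.

```python
#!/usr/bin/env python3
"""Sketch of the Hazel-triggered download step (steps 3 and 4).

Hazel matches a non-empty .txt file in a Dropbox inbox folder and
passes its path as argv[1]. Paths and the output template are
illustrative placeholders, not the real workflow's values.
"""
import subprocess
import sys
from pathlib import Path

ARCHIVE = Path.home() / "Dropbox/news/archive.txt"           # youtube-dl's "already downloaded" list
OUT_TMPL = str(Path.home() / "Dropbox/news/raw/%(title)s.%(ext)s")

def build_ytdl_cmd(url: str) -> list:
    """youtube-dl command: fetch an mp4, record the URL in the archive."""
    return ["youtube-dl", "-f", "mp4",
            "--download-archive", str(ARCHIVE),
            "-o", OUT_TMPL, url]

def build_transcode_cmd(mp4: Path) -> list:
    """ffmpeg command: re-encode the mp4 into an intermediate .mpg."""
    return ["ffmpeg", "-i", str(mp4), "-q:v", "4", str(mp4.with_suffix(".mpg"))]

if __name__ == "__main__" and len(sys.argv) > 1:
    url = Path(sys.argv[1]).read_text().strip()   # the txt file holds one URL
    subprocess.run(build_ytdl_cmd(url), check=True)
```

The same shape works as a plain shell script inside the Hazel rule; building the commands as lists here just makes the moving parts easier to see.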


While cooking dinner, I enact the final trigger in Drafts (an action called “Make The News”). Here’s what happens:

  1. The intermediate mpgs are renamed into a sequence (00.mpg, 01.mpg, and so on) in the order in which they were downloaded, and they’re stitched together (in that order) into an HD mp4.
Here’s what it looks like with files automatically flying around, getting renamed and prepped for stitching.

  2. When that file is successfully written, Hazel deletes the intermediate files and moves the final, transcoded mp4 from Dropbox to a local, unsynced folder monitored by Plex.

  3. I sit back, turn on the TV and Roku, launch Plex, and watch my news.
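The renaming-and-stitching step boils down to something like this. It’s a sketch with made-up folder names, not the rule verbatim: sort the intermediates by download time, number them, and hand them to ffmpeg’s concat protocol (which joins MPEG-PS streams directly; one plausible reason the .mpg intermediates behave better than mp4s here).

```python
#!/usr/bin/env python3
"""Sketch of "Make The News": sequence the intermediates, then stitch.

Folder names are illustrative. ffmpeg's concat: protocol joins MPEG-PS
files directly, so the numbered .mpg clips can be strung together in one pass.
"""
from pathlib import Path

def sequence_clips(folder: Path) -> list:
    """Rename *.mpg to 00.mpg, 01.mpg, ... in download (mtime) order.

    Assumes a fresh batch with no name collisions.
    """
    clips = sorted(folder.glob("*.mpg"), key=lambda p: p.stat().st_mtime)
    renamed = []
    for i, clip in enumerate(clips):
        target = folder / f"{i:02d}.mpg"
        if clip != target:
            clip.rename(target)
        renamed.append(target)
    return renamed

def build_stitch_cmd(clips: list, out: Path) -> list:
    """ffmpeg command joining the numbered clips into one HD mp4."""
    joined = "concat:" + "|".join(str(c) for c in clips)
    return ["ffmpeg", "-i", joined, "-c:v", "libx264", "-c:a", "aac", str(out)]
```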


  • Many “GIF” files across platforms (Reddit, Imgur, Twitter, Giphy) are actually videos, which had bedeviled me until this project. Why take one of the world’s most ubiquitous and open file formats and make it less so? The fact that they *are* videos (in GIF’s clothing) makes them perfect fodder for this project.
  • youtube-dl supports so many platforms. While I lean on the twitter example for description, I’m using the workflow across the whole web, and am surprised at how it works on nearly every site.
  • Computers are so expensive, and so central to all of our lives, that it’s ridiculous we’re still in the infancy of getting them to do the specific, odd tasks we want them to do. I don’t want to wake up to these windows quite yet, but shouldn’t we be able to create these kinds of tools ourselves?
  • Platforms and content creators probably hate youtube-dl. I can’t tell you whether a download through my workflow counts as “ONE VIEW,” nudging the creator one more notch toward global dominance of views, likes, and $$$ from advertisers; the tool essentially frees the content from any of its platform-specific constraints. I just don’t know.
  • At some point I want to fold images into the flow as chapter headings, perhaps containing a link or reference to the original source. Currently the videos are stitched together without pause, and it’s a bit unrelenting. It would be much better with a 2-second gap between files, perhaps with a title card showing the original source, but only if it’s generated automatically.
  • I spent more time on Stack Exchange working through ffmpeg bugs than anyplace else. The biggest issue was figuring out how to take a group of files with all different aspect ratios and re-present them in a 1080p file without stretching or skewing. Good times!
  • In the examples below, you’ll see that the first clip is the same. That’s just a quick placeholder (00.mpg) I include at the start of each file to set the desired height and width. It worked better when included and not as well without; I’ll figure out why later.
  • Need to work on audio normalization, as can be heard in this example file.
This example uses the source files pictured in the GIF, above.
Example file from Dec. 12th, 2018
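For anyone fighting the same aspect-ratio battle: the usual ffmpeg answer is to scale each clip to fit the target frame and pad the leftover space (letterbox or pillarbox). A small helper that builds that filter string; the function name is mine, and I’m not claiming this is character-for-character the filter in my rule, just the standard recipe.

```python
#!/usr/bin/env python3
"""Builds an ffmpeg -vf filter that fits any clip into a fixed frame
without stretching: scale to fit, pad the remainder, reset the sample
aspect ratio. Standard letterbox/pillarbox recipe; exact rules may vary."""

def fit_filter(width: int = 1920, height: int = 1080) -> str:
    return (
        f"scale={width}:{height}:force_original_aspect_ratio=decrease,"
        f"pad={width}:{height}:(ow-iw)/2:(oh-ih)/2,setsar=1"
    )

# Used as: ffmpeg -i clip.mp4 -vf "<filter>" out.mpg
```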

MDM (20181214)

(I’m not really on Twitter or FB anymore, but am doing those kinds of things in a broadcast channel on Telegram. Come on over!)