
Automating a Blogroll in Hugo

Apr 30, 2022

#python #miniflux #hugo #rss #blogging

Recently I've spotted quite an uptick in discussions about blogging and content generation in this quickly evolving federated, user-hosted future. Blogs are "cool" again, tried-and-tested RSS is the tool to subscribe with, and OPML is back as the method for sharing your feeds.

For me, RSS never really went anywhere. People like to call it dead after Google Reader shut down, but if anything the shutdown unified the remaining users into building new tools and applications to consume RSS. On the website front, RSS is still there, just not front and centre, and most large websites still publish their RSS feeds. I've personally been using Miniflux as my primary RSS reader for a couple of years now; it's incredibly easy to self-host with only PostgreSQL as a dependency, and it has plenty of built-in tools for managing even awkward feeds that mangle their output.

Tom Critchlow popped up on Hacker News with a post about Increasing the surface area of blogging, which discusses RSS, OPML, and personal feeds of news and information. Sharing is a key component of this ecosystem, so I thought I'd take a crack at publishing my Miniflux feeds on my Hugo website.

The Script

First of all, I needed to get the OPML data out of Miniflux. This is easy to do, as it provides an API that is amazingly simple to use. Using Python 3, Requests, and no other external modules, I put together a small script that extracts the OPML from the API and writes out the raw XML, as well as an optional JSON-formatted version.

miniflux2opml.py
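The script itself is linked above; a minimal sketch of the same idea is shown below. It assumes Miniflux's documented /v1/export endpoint with an X-Auth-Token header, and an attribute-to-underscore JSON conversion that produces the keys the Hugo shortcode further down expects, so the details may differ from the real miniflux2opml.py.

```python
#!/usr/bin/env python3
"""Export the OPML feed list from a Miniflux instance.

A minimal sketch, not the exact miniflux2opml.py script: it assumes the
documented /v1/export API endpoint with an X-Auth-Token header, and converts
OPML attributes into underscore-prefixed JSON keys (_text, _htmlUrl, ...)
so Hugo can read them from data/feeds.json.
"""
import argparse
import json
import xml.etree.ElementTree as ET

import requests


def element_to_dict(element):
    """Recursively convert an XML element into a JSON-friendly dict."""
    node = {"_" + key: value for key, value in element.attrib.items()}
    children = list(element)
    text = (element.text or "").strip()
    if not node and not children:
        # Text-only leaves such as <dateCreated> collapse to a plain string.
        return text
    if text:
        node["_text"] = text
    for child in children:
        value = element_to_dict(child)
        # Repeated tags (e.g. nested <outline> elements) become lists.
        if child.tag in node:
            if not isinstance(node[child.tag], list):
                node[child.tag] = [node[child.tag]]
            node[child.tag].append(value)
        else:
            node[child.tag] = value
    return node


def main():
    parser = argparse.ArgumentParser(description="Export OPML from Miniflux")
    parser.add_argument("-u", "--url", required=True, help="Miniflux instance URL")
    parser.add_argument("-t", "--token", required=True, help="Miniflux API token")
    parser.add_argument("-o", "--output", help="write the raw OPML to this file")
    parser.add_argument("-j", "--json", help="write a JSON copy to this file")
    args = parser.parse_args()

    # Miniflux exposes the OPML export at /v1/export, authenticated by API token.
    response = requests.get(
        args.url.rstrip("/") + "/v1/export",
        headers={"X-Auth-Token": args.token},
        timeout=30,
    )
    response.raise_for_status()
    opml = response.text

    if args.output:
        with open(args.output, "w") as handle:
            handle.write(opml)
    else:
        print(opml)

    if args.json:
        root = ET.fromstring(opml)
        with open(args.json, "w") as handle:
            json.dump({root.tag: element_to_dict(root)}, handle, indent=2)


if __name__ == "__main__":
    main()
```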

To run it, all you need to do is provide your instance URL (or the hosted version at https://reader.miniflux.app) and your API token. It'll then connect to Miniflux and print the OPML to your console. If you want to write it out to a file, use -o, and if you want to write out the JSON-formatted version, use -j.
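A typical run looks something like this (the instance URL and token variable are placeholders; the flags match the workflow shown later):

```shell
python3 miniflux2opml.py -u https://rss.example.org -t "$MINIFLUX_API_TOKEN" \
    -o static/feeds.opml -j data/feeds.json
```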

Hugo Shortcode

Now that we've got a script to extract the data into JSON format, we need to parse it in Hugo. Thankfully, Hugo supports arbitrary data files stored in the data directory of your site root; these can then be accessed via .Site.Data values within your templates.
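For reference, the JSON written to data/feeds.json ends up shaped roughly like this (the category and feed entries here are invented examples); the underscore-prefixed keys are the original OPML attributes that the shortcode below relies on:

```json
{
  "opml": {
    "_version": "2.0",
    "head": { "dateCreated": "Sat, 30 Apr 2022 05:00:00 UTC" },
    "body": {
      "outline": [
        {
          "_text": "Technology",
          "_title": "Technology",
          "outline": [
            {
              "_text": "Example Blog",
              "_title": "Example Blog",
              "_xmlUrl": "https://blog.example.org/index.xml",
              "_htmlUrl": "https://blog.example.org/"
            }
          ]
        }
      ]
    }
  }
}
```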

Using that method, I created a shortcode under layouts/shortcodes/blogroll.html:

<p><small>Last Updated: <b>{{ dateFormat "2006-01-02 03:04" (.Site.Data.feeds.opml.head.dateCreated | time) }}</b></small></p>

{{ $opmlJson := index .Site.Data.feeds "opml" "body" "outline" }}
{{ range sort $opmlJson "_text" "asc" }}
    <h2>{{ ._text }}</h2>

    {{ if gt (len .outline) 0 }}
    <ul>
    {{ range .outline }}
    <li><a href="{{ ._htmlUrl }}" target="_blank">{{ ._title }}</a></li>
    {{ end }}
    </ul>
    {{ end }}

{{ end }}

This can then be used in a simple post:

---
title: "Blog Roll"
date: 2022-04-30
draft: false
toc: false
images:
tags:
    - blogroll
referral: false
---

My blogroll is a list of sites and feeds I follow using [Miniflux](https://miniflux.app). This list is auto-generated on a weekly basis using a small Python tool that wrangles the XML OPML format into JSON that Hugo can use nicely.

{{< blogroll >}}

You can see the result here.

Automating the updates

I don't want to run this command manually every so often; ideally it should be automated and left on autopilot. I use GitHub for my repository, so I'm able to make use of GitHub Actions workflows to regularly run the script and commit the output to the repository.

name: Pull OPML from Miniflux
on:
  workflow_dispatch:
  schedule:
    # * is a special character in YAML so you have to quote this string
    - cron:  '0 5 * * 6'
jobs:
  pull_opml:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Download and Commit OPML
        run: |
          tools/miniflux2opml.py -u https://rss.doofnet.uk -t '${{ secrets.MINIFLUX_API_KEY }}' -o static/feeds.opml -j data/feeds.json
          git config --global user.name 'Andrew Williams'
          git config --global user.email 'nikdoof@users.noreply.github.com'
          git add static/feeds.opml data/feeds.json
          git commit -am "Update OPML"
          git push          

With this, the feed is pulled and updated on a weekly basis, and my OPML file is also placed in the static folder to ensure it's available to the public. Now people can be subjected to the weird content I read in an automated way.