Automating Daily Blog Posts with GitHub Actions and Blogger API

Writing blog posts ahead of time is easy. Remembering to publish them on the right day is harder. Paperboy solves this by letting you queue up posts in a JSON file and having GitHub Actions publish them to Blogger automatically every day at 2 PM UTC — no server, no cron job on your machine, no manual publishing required.

This guide walks through exactly how it works, how to set it up, and how to add your own posts to the queue.

Prerequisites

Before you start, you'll need:

  • A Blogger blog and its Blog ID (found in your Blogger dashboard URL: blogger.com/blog/posts/YOUR_BLOG_ID)
  • A Google Cloud project with the Blogger API enabled
  • A Google OAuth2 credentials JSON file — either a token-based credentials file or a client secret file for first-time authentication
  • A GitHub account to fork the repository and store secrets
  • Python 3.9 (the version the workflow uses) if you want to test locally

How It Works

There are three pieces working together:

  1. blog_posts.json — a file in your repo where you write and schedule posts in advance
  2. run.py — a Python script that reads the JSON, connects to the Blogger API, and publishes any post whose post_date has passed
  3. daily-run.yml — a GitHub Actions workflow that runs the script every day at 14:00 UTC

Each time the workflow runs, the script fetches your blog_posts.json directly from GitHub, checks each post's post_date against the current time, and publishes any post that is due. Crucially, it also checks for duplicate titles — if a post with that title already exists on your blog, it skips it.
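The per-post decision can be sketched in a few lines of Python. The function name is_due and its arguments are illustrative, not the actual names used in run.py:

```python
from datetime import datetime

def is_due(post, existing_titles, now):
    """True if a queued post should be published: its post_date has
    passed and no post with the same title already exists."""
    post_date = datetime.strptime(post["post_date"], "%Y-%m-%dT%H:%M:%S")
    return post_date <= now and post["title"] not in existing_titles

post = {"post_date": "2024-03-01T14:00:00", "title": "My First Automated Post"}
print(is_due(post, existing_titles=set(), now=datetime(2024, 3, 1, 14, 5)))
# True: the date has passed and the title is new
```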

Step 1 — Fork the Repository

Fork raelldottin/paperboy to your own GitHub account, then clone it locally:

git clone https://github.com/YOUR_USERNAME/paperboy
cd paperboy

Step 2 — Install Dependencies

The project depends on the Google API client libraries and the requests package. Install them with:

pip install -r requirements.txt

The key libraries used by run.py are google-auth, google-auth-oauthlib, google-api-python-client, and requests.
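Those four packages translate directly into a requirements.txt. If you ever need to recreate the file, it would look like this (version pins omitted; add them if you want reproducible installs):

```
google-auth
google-auth-oauthlib
google-api-python-client
requests
```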

Step 3 — Configure GitHub Secrets

Go to your forked repository on GitHub, then navigate to Settings → Secrets and variables → Actions and add the following four secrets (GitHub secret names may only contain letters, digits, and underscores):

client_secret

The full contents of your Google OAuth2 client secret JSON file, pasted as a single string. This is the file you download from Google Cloud Console when creating OAuth2 credentials. It looks like:

{
  "installed": {
    "client_id": "YOUR_CLIENT_ID.apps.googleusercontent.com",
    "client_secret": "YOUR_CLIENT_SECRET",
    "redirect_uris": ["http://localhost"],
    ...
  }
}

credentials_json

A token credentials JSON string obtained after the first OAuth2 authentication flow. This contains the access and refresh tokens that allow the script to authenticate without user interaction. It looks like:

{
  "token": "ya29.YOUR_ACCESS_TOKEN",
  "refresh_token": "YOUR_REFRESH_TOKEN",
  "token_uri": "https://oauth2.googleapis.com/token",
  "client_id": "YOUR_CLIENT_ID",
  "client_secret": "YOUR_CLIENT_SECRET",
  "scopes": ["https://www.googleapis.com/auth/blogger"]
}

If credentials_json contains a token field, the script uses it directly. If not, it falls back to running the full OAuth2 flow using client_secret. That flow requires a browser, so it won't work in the automated workflow: always make sure credentials_json contains a valid token before relying on the scheduled run.
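The branch can be sketched as follows. The function choose_auth_path is an illustrative name, and the real calls (shown in comments) require the google-auth and google-auth-oauthlib packages:

```python
import json

SCOPES = ["https://www.googleapis.com/auth/blogger"]

def choose_auth_path(credentials_json):
    """Mirror the decision above: a token field means non-interactive
    authentication; otherwise the interactive flow is required."""
    info = json.loads(credentials_json)
    if "token" in info:
        # Non-interactive path (works in the workflow), roughly:
        #   from google.oauth2.credentials import Credentials
        #   creds = Credentials.from_authorized_user_info(info, SCOPES)
        return "token"
    # Interactive fallback (local only, needs a browser), roughly:
    #   from google_auth_oauthlib.flow import InstalledAppFlow
    #   flow = InstalledAppFlow.from_client_config(client_secret_info, SCOPES)
    #   creds = flow.run_local_server(port=0)
    return "oauth_flow"

print(choose_auth_path('{"token": "ya29.x", "refresh_token": "r"}'))
# token
```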

blog_id

Your Blogger blog's numeric ID. You can find this in the URL when you're logged into the Blogger dashboard:

https://www.blogger.com/blog/posts/1234567890123456
                                   ^^^^^^^^^^^^^^^^
                                   This is your Blog ID

github_repo

Your GitHub repository in owner/repo format, for example:

YOUR_USERNAME/paperboy

The script uses this to fetch blog_posts.json directly from GitHub at runtime, so the filename you use must match what you pass as the --json-file argument in the workflow.
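The fetch itself is a single HTTP GET against raw.githubusercontent.com. Here is a minimal sketch; the function names are illustrative, the branch name is an assumption, and the real script uses the requests library rather than the stdlib urllib used here:

```python
import json
import urllib.request

def queue_url(github_repo, json_file="blog_posts.json", branch="main"):
    """Raw-content URL for the queue file; the branch name is an assumption."""
    return f"https://raw.githubusercontent.com/{github_repo}/{branch}/{json_file}"

def fetch_queue(github_repo, json_file="blog_posts.json"):
    """Download and parse the queue, as happens on every workflow run."""
    with urllib.request.urlopen(queue_url(github_repo, json_file)) as resp:
        return json.loads(resp.read().decode("utf-8"))

print(queue_url("YOUR_USERNAME/paperboy"))
# https://raw.githubusercontent.com/YOUR_USERNAME/paperboy/main/blog_posts.json
```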

Step 4 — Write Your Posts in blog_posts.json

Open blog_posts.json. Each entry in the posts array is one blog post. Here is the exact structure the script expects:

{
  "posts": [
    {
      "post_date": "2024-03-01T14:00:00",
      "title": "My First Automated Post",
      "content": "<p>This post was published automatically by Paperboy.</p>"
    },
    {
      "post_date": "2024-03-02T14:00:00",
      "title": "A Follow-Up Post",
      "content": "<p>Another day, another post — fully automated.</p><h2>A Section</h2><p>Content here.</p>"
    }
  ]
}

A few important details about each field:

  • post_date — must be in YYYY-MM-DDTHH:MM:SS format (e.g. 2024-03-01T14:00:00). The script publishes a post once this datetime has passed, so to have it go out on March 1st at 2 PM UTC, set it to 2024-03-01T14:00:00.
  • title — the post's title. This is also used for duplicate detection — if a post with this exact title already exists on your blog, the script will skip it rather than publish a duplicate.
  • content — the post body as an HTML string. You can use any standard HTML tags: <p>, <h2>, <ul>, <pre>, <code>, <strong>, and so on.

If any of the three fields are missing from a post entry, the script will log a warning and skip that post rather than failing entirely.
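You can reproduce that check locally to lint your queue before committing. This is a sketch, not the code from run.py, and validate_posts is an illustrative name:

```python
REQUIRED_FIELDS = ("post_date", "title", "content")

def validate_posts(queue):
    """Split entries into publishable and skipped, mirroring the
    warn-and-skip behaviour described above."""
    valid, skipped = [], []
    for post in queue.get("posts", []):
        missing = [field for field in REQUIRED_FIELDS if field not in post]
        if missing:
            print(f"warning: skipping {post.get('title', '<untitled>')!r}: missing {missing}")
            skipped.append(post)
        else:
            valid.append(post)
    return valid, skipped

queue = {"posts": [
    {"post_date": "2024-03-01T14:00:00", "title": "Complete", "content": "<p>ok</p>"},
    {"title": "No date or content"},
]}
valid, skipped = validate_posts(queue)
print(len(valid), len(skipped))  # 1 1
```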

The GitHub Actions Workflow

The workflow file lives at .github/workflows/daily-run.yml. It triggers on two events: a daily cron schedule at 14:00 UTC, and on any push to the main branch (which is useful for testing).

name: Daily Automated Actions

on:
  schedule:
    - cron: '0 14 * * *'  # every day at 14:00 UTC
  push:
    branches:
      - main

jobs:
  daily-run:
    name: 'Runs daily'
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
        with:
          fetch-depth: 2

      - name: Setup Python version 3.9
        uses: actions/setup-python@v2
        with:
          python-version: 3.9

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt

      - name: Automate blog posts
        run: |
          python run.py \
            --client-secret '${{ secrets.client_secret }}' \
            --credentials-json '${{ secrets.credentials_json }}' \
            --blog-id '${{ secrets.blog_id }}' \
            --github-repo '${{ secrets.github_repo }}' \
            --json-file 'blog_posts.json'

Each step does the following:

  1. Checkout — clones your repository onto the GitHub Actions runner so the workflow has access to run.py and requirements.txt
  2. Setup Python — installs Python 3.9 on the runner
  3. Install dependencies — runs pip install -r requirements.txt to install the Google API and requests libraries
  4. Automate blog posts — runs run.py with your four secrets injected as arguments. The script fetches blog_posts.json from GitHub, finds any posts due for publishing, and creates them via the Blogger API

How run.py Works

When the script runs it follows this sequence:

  1. Authenticate — if the credentials-json contains a token field, it builds a Credentials object directly. Otherwise it runs the local OAuth2 flow using client-secret (only practical when running locally)
  2. Connect to Blogger — initialises the Blogger v3 API client using the authenticated credentials
  3. Fetch the post queue — downloads blog_posts.json from your repository's default branch via raw.githubusercontent.com and parses it
  4. Check for duplicates — calls the Blogger API to get a list of existing post titles on your blog
  5. Publish due posts — for each post in the queue, it checks three things: all fields are present, the title isn't already published, and post_date is in the past. If all three pass, it publishes the post
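The publish step itself boils down to one Blogger v3 posts.insert request. The request body can be built as below; the commented lines sketch the API call, which needs google-api-python-client and authenticated credentials, and post_body is an illustrative name:

```python
def post_body(title, content):
    """Request body for the Blogger v3 posts.insert endpoint."""
    return {"kind": "blogger#post", "title": title, "content": content}

# With authenticated credentials (see the auth step above), publishing is roughly:
#   from googleapiclient.discovery import build
#   service = build("blogger", "v3", credentials=creds)
#   service.posts().insert(blogId=blog_id, body=post_body(title, content)).execute()

print(post_body("My First Automated Post", "<p>Hello.</p>")["kind"])
# blogger#post
```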

Customising the Schedule

The default cron 0 14 * * * fires every day at 14:00 UTC. To change it, edit that line in daily-run.yml. Some examples:

  • 0 9 * * * — 9:00 AM UTC daily
  • 0 18 * * 1 — 6:00 PM UTC on Mondays only
  • 0 8 * * 1,3,5 — 8:00 AM UTC on Monday, Wednesday, and Friday

Note that GitHub Actions scheduled workflows are queued and can start late, sometimes by many minutes during periods of high load, so don't rely on the exact minute for anything time-sensitive.

Verifying It Worked

After a workflow run — whether from the daily schedule or a push to main — check two places:

  • The Actions tab in your GitHub repository shows the full run log. Any errors from authentication failures, missing fields, or API issues will appear there
  • Your Blogger dashboard at blogger.com should show the new post on its scheduled date

Summary

Once configured, your entire workflow is:

  1. Write posts in advance with HTML content
  2. Add them to blog_posts.json with a post_date
  3. Commit and push to main

GitHub Actions handles authentication, duplicate checking, and publishing on the right day — automatically, every day, with no further input from you.
