Build a Full Content Repurposing Workflow With Make and AI

Automate the entire content repurposing pipeline—from new YouTube upload to show notes, clips queue, and newsletter draft—using Make (formerly Integromat) and AI APIs.

Prerequisites

  • Make account (free tier: 1,000 operations/month; Core: $9/month)
  • YouTube channel (with API access enabled in Google Cloud Console)
  • OpenAI or Anthropic API key
  • Castmagic account (optional—can replace with direct transcription API)
  • Notion or Airtable for content queue
  • Basic familiarity with Make's visual workflow builder

What This Workflow Does

Trigger: A new video is published on your YouTube channel.

Automated steps:

  1. Fetches the video transcript from YouTube API
  2. Sends transcript to AI (Claude or GPT-4) to generate: show notes, key quotes, newsletter section, 5 social captions
  3. Creates a structured Notion page with all outputs
  4. Adds a "clips to create" task to your project manager
  5. Sends you a Slack/email notification that everything is ready

Your manual role: Review outputs (15-20 min), approve Opus Clip run, post content.

Step 1: Set up the YouTube trigger

In Make, create a new scenario. Search for the YouTube → Watch Videos trigger module. Authenticate with your Google account and select your channel. Set the trigger to run every hour (or every 15 minutes if you publish frequently).

Test the trigger by publishing a test video or selecting an existing video. Make sure you receive the video ID, title, description, and publish time.

Step 2: Fetch the video transcript

The standard YouTube Data API has no simple transcript endpoint (the captions download endpoint requires owner-level OAuth). Two options:

Option A – YouTube Transcript API (via webhook/serverless): Deploy a small cloud function (AWS Lambda, Vercel, or Cloudflare Workers) that accepts a video ID and returns the transcript using the youtube-transcript npm package. Call this from Make using an HTTP → Make a request module.
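
If you go with Option A, the cloud function can be very small. Here is a sketch in Python rather than Node, assuming the youtube-transcript-api package (the Python counterpart of the youtube-transcript npm package above) and an AWS Lambda-style event shape; the function names and route are illustrative:

```python
import json

def join_segments(segments):
    """Collapse caption segments into a single plain-text transcript."""
    return " ".join(seg["text"].strip() for seg in segments if seg["text"].strip())

def fetch_transcript(video_id):
    # Third-party dependency (assumption): youtube-transcript-api.
    # Imported lazily so the module loads even before the dependency is installed.
    from youtube_transcript_api import YouTubeTranscriptApi
    return YouTubeTranscriptApi.get_transcript(video_id)

def handler(event, context=None):
    """Lambda-style entry point for GET /transcript?video_id=abc123."""
    video_id = (event.get("queryStringParameters") or {}).get("video_id", "")
    if not video_id:
        return {"statusCode": 400,
                "body": json.dumps({"error": "missing video_id"})}
    transcript = join_segments(fetch_transcript(video_id))
    return {"statusCode": 200,
            "body": json.dumps({"video_id": video_id, "transcript": transcript})}
```

In Make, point the HTTP → Make a request module at the deployed URL and map the video ID from the trigger into the `video_id` query parameter.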

Option B – Castmagic API: Castmagic has an API that accepts a YouTube URL and returns AI-generated outputs directly. This is simpler to set up but costs API credits.

For most creators, Option B (Castmagic API) is the right tradeoff—less setup, production-ready outputs.

Step 3: Call the AI for extended outputs

Even if using Castmagic for basic outputs, you may want custom outputs the tool doesn't support. Add an HTTP → Make a request module calling:

POST https://api.anthropic.com/v1/messages
x-api-key: YOUR_ANTHROPIC_KEY
anthropic-version: 2023-06-01
content-type: application/json

Body (JSON; note that in the actual request, the literal newlines inside the "content" string must be escaped as \n):

{
  "model": "claude-opus-4-5",
  "max_tokens": 2000,
  "messages": [{
    "role": "user",
    "content": "Here is the transcript of a YouTube video titled '{{video_title}}':

{{transcript}}

Generate:
1. A 400-word show notes summary
2. 5 key quotes (verbatim from the transcript)
3. A 250-word newsletter section
4. One Instagram caption, one LinkedIn post, one Twitter thread (5 tweets)"
  }]
}

Map {{video_title}} and {{transcript}} from previous modules.
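
Outside of Make, the same request can be sketched with only the Python standard library. The `build_request` helper is hypothetical; the prompt and headers mirror the body above:

```python
import json
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"

def build_request(video_title, transcript, api_key, model="claude-opus-4-5"):
    """Assemble the Messages API request without sending it."""
    prompt = (
        f"Here is the transcript of a YouTube video titled '{video_title}':\n\n"
        f"{transcript}\n\n"
        "Generate:\n"
        "1. A 400-word show notes summary\n"
        "2. 5 key quotes (verbatim from the transcript)\n"
        "3. A 250-word newsletter section\n"
        "4. One Instagram caption, one LinkedIn post, one Twitter thread (5 tweets)"
    )
    body = json.dumps({
        "model": model,
        "max_tokens": 2000,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(API_URL, data=body, headers={
        "x-api-key": api_key,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    })

# Usage (performs a live call; requires a valid key):
# resp = urllib.request.urlopen(build_request(title, transcript, key))
# text = json.loads(resp.read())["content"][0]["text"]
```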

Step 4: Parse the AI response

The AI returns one long text block. Use Make's Text Parser module with regex patterns to split the output into sections. Alternatively, prompt the AI to return structured JSON:

{
  "show_notes": "...",
  "key_quotes": ["...", "...", "..."],
  "newsletter_section": "...",
  "instagram_caption": "...",
  "linkedin_post": "...",
  "twitter_thread": ["tweet1", "tweet2", "tweet3", "tweet4", "tweet5"]
}

JSON output is easier to parse in Make—use the JSON → Parse JSON module to access each field.
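
Models sometimes wrap the JSON in markdown fences or surround it with prose, so a tolerant parser helps if you post-process outside Make. A sketch, where `parse_ai_json` is a hypothetical helper:

```python
import json
import re

def parse_ai_json(raw):
    """Extract the JSON object from an AI reply that may wrap it in
    ```json fences or surround it with conversational text."""
    fenced = re.search(r"```(?:json)?\s*(\{.*\})\s*```", raw, re.DOTALL)
    if fenced:
        raw = fenced.group(1)
    else:
        # Fall back to the first {...} span in the reply.
        start, end = raw.find("{"), raw.rfind("}")
        if start == -1 or end == -1:
            raise ValueError("no JSON object found in AI response")
        raw = raw[start:end + 1]
    return json.loads(raw)
```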

Step 5: Create a Notion page with all outputs

Add a Notion → Create a Database Item module. Map your parsed fields to Notion properties:

  • Title: Video title + publish date
  • Status: "Needs Review"
  • YouTube URL: Video link
  • Show Notes: Full text in a Notion text property
  • Key Quotes: As a bulleted list
  • Newsletter Section: Full text
  • Instagram Caption: Full text
  • LinkedIn Post: Full text
  • Twitter Thread: Each tweet on a separate line
  • Clips Status: "Not started" (you'll update this after Opus Clip run)
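
If you ever swap the Notion module for a raw HTTP → Make a request call, the create-page body looks roughly like this sketch. The property names must match your own database schema, and which fields are rich text vs. select is an assumption:

```python
def notion_page_payload(database_id, video_title, publish_date, video_url, fields):
    """Build the body for POST https://api.notion.com/v1/pages.
    Notion caps each rich-text chunk at 2,000 characters, so very long
    show notes may need to be split across chunks."""
    def rich_text(text):
        return {"rich_text": [{"text": {"content": text}}]}

    return {
        "parent": {"database_id": database_id},
        "properties": {
            "Title": {"title": [{"text": {"content": f"{video_title} ({publish_date})"}}]},
            "Status": {"select": {"name": "Needs Review"}},
            "YouTube URL": {"url": video_url},
            "Show Notes": rich_text(fields["show_notes"]),
            "Key Quotes": rich_text("\n".join(f"• {q}" for q in fields["key_quotes"])),
            "Newsletter Section": rich_text(fields["newsletter_section"]),
            "Instagram Caption": rich_text(fields["instagram_caption"]),
            "LinkedIn Post": rich_text(fields["linkedin_post"]),
            "Twitter Thread": rich_text("\n".join(fields["twitter_thread"])),
            "Clips Status": {"select": {"name": "Not started"}},
        },
    }
```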

Step 6: Add a task to your clips queue

Add a second Notion → Create a Database Item (or Trello/Asana card, or Airtable record) in your clips project tracker:

  • Title: "Extract clips: [Video Title]"
  • YouTube URL: Link
  • Due date: Same day or next day
  • Status: "To do"

This is your cue to open Opus Clip and run the clip extraction.

Step 7: Send a notification

Add a Gmail → Send Email or Slack → Send Message module. Message:

"New video repurposing complete: [Video Title]. Notion page created with show notes, captions, and newsletter draft. Clips task added to queue. Review → [Notion link]"

Step 8: Test and monitor

Run the scenario manually on a published video. Check:

  • Did the transcript come through in full?
  • Did the AI generate all requested sections correctly?
  • Did the Notion page populate correctly?
  • Did the notification send?

Set up Make's error handling (Add error handler → route) to send you an email if any step fails. AI API calls can fail due to rate limits, timeouts, or transcripts that exceed the model's context window—the error handler ensures you don't silently lose data.
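
If part of the pipeline runs in your own cloud function (as in Option A), the same resilience idea can be sketched as a retry wrapper with exponential backoff; `call_with_retry` is a hypothetical helper:

```python
import time

def call_with_retry(fn, attempts=3, base_delay=2.0):
    """Retry a flaky API call with exponential backoff (2s, 4s, 8s, ...).
    Re-raises after the final attempt so the caller's error route still fires."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```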

Operating Costs

  • Make Core: $9/month
  • Claude API (per video, ~3K tokens in + 2K out): ~$0.05-0.15/video
  • Castmagic API (if used): ~$0.10-0.20/video
  • Total per video: ~$0.15-0.35

For a creator publishing 4 videos/week (about 17 videos/month), this workflow costs roughly $3-6/month in API fees plus the Make subscription—and saves 2-3 hours per video in manual repurposing work.
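
The arithmetic behind the monthly API estimate, as a quick sanity check on the per-video figures (`monthly_api_cost` is a hypothetical helper; a month averages 52/12 ≈ 4.33 weeks):

```python
def monthly_api_cost(videos_per_week, per_video_low, per_video_high):
    """Back-of-envelope monthly API spend from per-video cost bounds."""
    videos_per_month = videos_per_week * 52 / 12  # ~4.33 weeks/month
    return videos_per_month * per_video_low, videos_per_month * per_video_high

# 4 videos/week at $0.15-0.35 per video → roughly $2.60-$6.10/month,
# before the flat Make subscription.
low, high = monthly_api_cost(4, 0.15, 0.35)
```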
