Lesson 31: Rolling Stability Dashboard CSV Export and Webhook Ingest Automation

Lesson 30 gave you a manual rolling dashboard that stays honest about release-train health.

This lesson adds automation you can actually maintain: a small ingest path that pulls closed Lesson 29 tracker rows into the dashboard sheet or database on a schedule, plus an optional webhook lane when your incident tool already emits JSON.

The goal is not a full data platform. It is one boring pipeline that stops human copy errors from becoming fake green trains.


What you will build

By the end of this lesson, you will have:

  1. A versioned CSV contract that matches the Lesson 29 closing row schema
  2. A scheduled export job (CI or server cron) that uploads or appends rows idempotently
  3. A signed webhook receiver sketch you can implement behind HTTPS with replay protection
  4. A failure visibility rule so silent drops are impossible for more than one run

Step 1 - Freeze the row contract before you automate

Your ingest file must include at minimum the columns from Lesson 30’s mini exercise:

build_id, promotion_date, yellow_open_close, stability_signal, dialogue_signal, train_note, recurrence_flag

Add two automation columns:

  • ingest_batch_id (UUID or UTC timestamp from the exporter)
  • source (csv_export or webhook)
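With the two automation columns added, a contract-conformant file might look like this (all values are illustrative):

```csv
build_id,promotion_date,yellow_open_close,stability_signal,dialogue_signal,train_note,recurrence_flag,ingest_batch_id,source
0.9.7-rc2,2024-05-01,open,green,green,clean promotion,no,2024-05-01T06:00:00Z,csv_export
```

Freeze this header order in the contract file; every exporter and importer checks against it before touching data.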

If you skip the contract step, you will automate chaos.

Step 2 - Choose CSV first, webhooks second

CSV wins when:

  • your tracker already lives in Sheets, Notion, or Linear exports
  • your team is more comfortable with Git diffs than HTTP services

Webhooks win when:

  • PagerDuty, Opsgenie, or GitHub already emits structured incident payloads you want to map into train_note or recurrence_flag

Never start with both on day one. Ship CSV ingest, prove the dashboard moves correctly for two real patches, then add webhooks.

Step 3 - Implement a scheduled exporter

Pick the smallest runtime you already operate:

  • GitHub Actions with an on: schedule trigger that writes an artifact, plus an optional commit to a metrics/ branch
  • a single cron on a tiny VM that runs a Python or Node script and copies the CSV to your dashboard host via scp or rsync
  • Google Apps Script time-driven trigger if your dashboard is already Sheets-native
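If you take the GitHub Actions route, a minimal scheduled workflow could look like the sketch below. The script path, artifact name, and cron schedule are placeholders for your own setup:

```yaml
# Hypothetical scheduled exporter workflow; adjust paths and timing.
name: stability-csv-export
on:
  schedule:
    - cron: "0 6 * * *"   # daily at 06:00 UTC
  workflow_dispatch: {}    # manual re-run to exercise idempotency
jobs:
  export:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: python scripts/export_tracker_rows.py --out dashboard_ingest.csv
      - uses: actions/upload-artifact@v4
        with:
          name: dashboard-ingest
          path: dashboard_ingest.csv
```

The workflow_dispatch trigger matters: you want a one-click way to re-run the exporter when testing that re-runs do not duplicate rows.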

Hard requirements:

  • exporter writes UTF-8 CSV with stable header order
  • exporter sets ingest_batch_id once per run
  • exporter is idempotent: re-run replaces the same batch slice instead of duplicating rows
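The hard requirements above can be sketched in Python. The tracker rows arriving as dicts, and the helper name, are assumptions for illustration; the contract columns come from Step 1:

```python
# Sketch of an exporter core, assuming tracker rows are already loaded
# as dicts keyed by the contract column names from Step 1.
import csv
import io
import uuid

HEADER = [
    "build_id", "promotion_date", "yellow_open_close", "stability_signal",
    "dialogue_signal", "train_note", "recurrence_flag",
    "ingest_batch_id", "source",
]

def export_rows(rows):
    """Write tracker rows as UTF-8 CSV bytes with a stable header order.

    One ingest_batch_id is minted per run so a re-run can replace the
    same batch slice downstream instead of duplicating rows.
    """
    batch_id = str(uuid.uuid4())  # or a UTC timestamp, per the contract
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=HEADER, extrasaction="ignore")
    writer.writeheader()
    for row in rows:
        writer.writerow({**row, "ingest_batch_id": batch_id,
                         "source": "csv_export"})
    return buf.getvalue().encode("utf-8")

# Example with one fake closed row
payload = export_rows([{
    "build_id": "0.9.7-rc2",
    "promotion_date": "2024-05-01",
    "yellow_open_close": "open",
    "stability_signal": "green",
    "dialogue_signal": "green",
    "train_note": "clean promotion",
    "recurrence_flag": "no",
}])
```

Note that the header order is a module-level constant, not derived from dict key order, so the contract cannot drift silently between runs.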

Step 4 - Add idempotent merge rules in the dashboard

When importing:

  1. Match rows on build_id plus ingest_batch_id if you expect multiple partial uploads
  2. For the same build_id with a newer ingest_batch_id, overwrite the train columns only
  3. Never delete human annotations in free-text cells unless they live in a protected column outside the ingest range
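The merge rules above can be sketched as a pure function. This assumes ingest_batch_id values sort chronologically (UTC timestamps rather than random UUIDs), and the column set name is illustrative:

```python
# Sketch of the idempotent merge, assuming rows are dicts keyed by the
# contract columns and ingest_batch_id sorts chronologically (e.g. UTC
# timestamps). Columns outside TRAIN_COLUMNS, such as human annotation
# cells, are never touched.
TRAIN_COLUMNS = {
    "promotion_date", "yellow_open_close", "stability_signal",
    "dialogue_signal", "train_note", "recurrence_flag",
    "ingest_batch_id", "source",
}

def merge_batch(existing, incoming):
    """Merge incoming rows into existing rows keyed by build_id.

    A row with a newer ingest_batch_id overwrites the train columns
    only; unknown build_ids are appended as new rows.
    """
    by_build = {row["build_id"]: row for row in existing}
    for new in incoming:
        old = by_build.get(new["build_id"])
        if old is None:
            by_build[new["build_id"]] = dict(new)
        elif new["ingest_batch_id"] > old["ingest_batch_id"]:
            # Overwrite train columns only; keep protected columns as-is.
            for col in TRAIN_COLUMNS:
                if col in new:
                    old[col] = new[col]
    return list(by_build.values())
```

Because the function is deterministic for a given batch, re-running the same import is a no-op rather than a duplication.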

Document the merge rule beside the sheet tab so future you does not “fix” it with manual sorting.

Step 5 - Optional webhook receiver sketch

If you add HTTP ingest:

  • require HTTPS termination you control
  • verify a shared secret header or HMAC signature computed from raw body plus timestamp
  • reject replays older than a few minutes using the timestamp window
  • respond 200 only after the row is persisted; otherwise return 500 so the sender retries responsibly
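The signature and replay checks above can be sketched in a few lines. The header layout and the timestamp-dot-body signing scheme are illustrative assumptions, not a specific vendor's format:

```python
# Minimal sketch of webhook verification: HMAC-SHA256 over the raw body
# plus a timestamp, with a replay window. The signing layout here is an
# assumption, not any particular incident tool's scheme.
import hashlib
import hmac
import time

REPLAY_WINDOW_SECONDS = 300  # reject anything older than five minutes

def verify(raw_body, timestamp, signature_hex, secret, now=None):
    now = time.time() if now is None else now
    # Replay protection: timestamp must fall inside the window.
    if abs(now - float(timestamp)) > REPLAY_WINDOW_SECONDS:
        return False
    # Sign raw body plus timestamp, then compare in constant time.
    expected = hmac.new(secret, timestamp.encode() + b"." + raw_body,
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)
```

The constant-time comparison via hmac.compare_digest matters: a naive == comparison can leak signature bytes through timing.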

Map only the fields you need. Do not mirror entire incident payloads into player-visible sheets.

Mini exercise

Create ingest_contract_v1.md that lists:

  1. final column names and types
  2. example row for a fake build 0.9.7-rc2
  3. merge rule paragraph
  4. where secrets live (CI vault name only, not literal secrets)
  5. alert destination if ingest fails twice in a row

Troubleshooting

Duplicate rows after every cron

Your merge key is too weak. Move uniqueness to build_id plus highest ingest_batch_id per column group.

Webhook works in Postman but not production

Check TLS intermediates, clock skew on signature timestamps, and body canonicalization (JSON minification breaks naive HMAC strings).
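The canonicalization failure is easy to reproduce. A receiver that parses and re-dumps the JSON before signing is hashing different bytes than the sender sent, so the digests never match:

```python
# Demonstrates why HMACs must be computed over the raw request bytes:
# re-serializing JSON changes whitespace, so the two digests differ.
import hashlib
import hmac
import json

secret = b"shared-secret"
raw_body = b'{"build_id":"0.9.7-rc2","severity":2}'  # minified, as sent

sig_raw = hmac.new(secret, raw_body, hashlib.sha256).hexdigest()
# A naive receiver that parses then re-dumps signs different bytes.
reserialized = json.dumps(json.loads(raw_body)).encode()
sig_reserialized = hmac.new(secret, reserialized, hashlib.sha256).hexdigest()
```

Always verify against the raw request body bytes, before any JSON parsing.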

Dashboard owners distrust automation

Keep Lesson 29 manual export as a break-glass path for one release. Confidence follows choice, not force.

FAQ

Do we need a database

Not until three rolling columns feel tight. Sheets plus CSV is enough for many indies through first major launch.

How does this relate to player log correlation

Use the same build_id string in ingest rows, CI signing logs, and player-facing crash bundles. The Unity Build Profile and Signing Preflight Checklist chapter already nags you to align those tokens; treat this lesson as the dashboard side of that same habit.

Should webhooks include PII

No. Strip player identifiers in the mapper. Store counts and severities, not names.

Lesson recap

You can now:

  • export Lesson 29 closures on a schedule without copy drift
  • merge ingested rows safely into Lesson 30 views
  • add optional signed webhooks without inventing a new schema per vendor
  • see failures loudly when automation breaks

Next lesson teaser

When you outgrow Sheets, promote the same rows into a read-only warehouse layer with contracts and service accounts: Lesson 32: Read-Only Analytics Warehouse Contracts and Service Accounts. Until then, keep ingest batches small and your merge rules written down.

Related learning

Bookmark this lesson when your Lesson 30 sheet becomes the most opened tab in release week.