📘 Standard Operating Procedure

How to Set Up QA Ledger Automations & Interface

This guide walks you through configuring 6 Airtable automations and the QA Testing Portal interface — no scripts, no coding, just clicks.

Purpose — Configure the QA Ledger's Airtable automations and admin interface so that bugs are created automatically from failed tests, retests are generated when bugs are resolved, ad-hoc findings trigger investigations, and the team gets Slack notifications.
🗺️

The Big Picture

The QA Ledger uses 6 automations to keep everything connected. Here's the core loop at a glance: a finalized Failed test creates a Bug → resolving that Bug creates a Retest → the new test triggers a Slack notification.

💡
That's the core loop. Additionally, Ad-Hoc Findings (issues discovered outside test scope) automatically create both a Bug and a Gap Investigation. The rest of this SOP teaches you how to set up each piece.
📋

Before You Start

⚠️
Important: You must complete ALL setup steps before turning on any automations. A partially configured automation can create broken records.

What You Need

  • Airtable Access (✅ Required): Creator/Owner access to the QA Ledger Template base
  • Slack Workspace (🔶 Required for Automation 3 only): Admin access to the Slack workspace for notification integration
  • Tables Already Created (✅ Required): the User Tests, Bugs, Ad-Hoc Findings, and Gap Investigations tables must exist

Existing Automations to Replace

If you already have script-based automations, you need to turn them OFF first, then create the new script-free versions.

1

Turn Off Old Automations

  • Go to the Automations tab in your base
  • Find these automations (if they exist): Create Bug, Create new Test after Bug resolved, Send slack notification
  • Toggle each one OFF
  • DO NOT delete them yet — keep as backup until new ones are confirmed working
🐛

Automation 1: Failed Test → Create Bug

When a tester marks a test as Failed and checks Finalize, this automation creates a Bug record and links it back to the test.

ℹ️
Why just "Relevant Test"? All other Bug fields (QA Notes, Severity, Module, etc.) are lookup fields that auto-populate from the linked test. You only need to set the link — everything else fills itself.
1

Create the Automation

  • Click Automations tab → click + Create automation
  • Name it: Failed Test → Create Bug
2

Configure the Trigger

  • Trigger type: When record matches conditions
  • Table: User Tests
  • Add condition 1: Status → is → Failed
  • Add condition 2: Finalize → is → checked
3

Add Action: Create Record

  • Click + Add advanced logic or action
  • Select: Create record
  • Table: Bugs
4

Map the Field

  • Click + Choose field
  • Select: Relevant Test
  • Click the blue + insert button on the right
  • Choose: Airtable record ID (from the trigger step "When record matches conditions")
💡
That's the only field you need to map! All the lookup fields (QA Notes, Severity, Module, Relevant page, Relevant Role) will auto-fill because they pull from the linked User Test.
5

Test the Trigger

  • Click Generate a preview or Test trigger
  • Select a test record that has Status = Failed and Finalize = checked
  • Verify it finds a matching record
6

Turn It On

  • Toggle the automation ON
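No code is needed for this automation, but if a mental model helps, the trigger-and-action pair behaves like this sketch. Everything here is illustrative (the `maybe_create_bug` function and `record_id` key are not anything you configure in Airtable); the field names match the steps above.

```python
def maybe_create_bug(user_test):
    """Simulate Automation 1: fire only when Status is Failed AND Finalize is checked."""
    if user_test.get("Status") == "Failed" and user_test.get("Finalize"):
        # Only Relevant Test is mapped; the Bug's lookup fields
        # (QA Notes, Severity, Module, ...) auto-fill from this link.
        return {"Relevant Test": user_test["record_id"]}
    return None

# A finalized Failed test produces a bug; an unfinalized one does not.
bug = maybe_create_bug({"record_id": "recABC123", "Status": "Failed", "Finalize": True})
skipped = maybe_create_bug({"record_id": "recDEF456", "Status": "Failed", "Finalize": False})
```

Note that both conditions must hold at once, which is why a Failed test without the Finalize checkbox never creates a bug.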
🔄

Automation 2: Bug Resolved → Create Retest

When a bug is marked as Resolved, this automation creates a new User Test (retest) so the tester can verify the fix worked.

⚠️
Limitation: Without scripts, you cannot auto-increment the Iteration number or copy freeform text. The retest will need some manual cleanup after creation.
1

Create the Automation

  • Click + Create automation
  • Name it: Bug Resolved → Create Retest
2

Configure the Trigger

  • Trigger type: When record matches conditions
  • Table: Bugs
  • Add condition: Resolved → is → checked
3

Add Action: Create Record

  • Click + Add advanced logic or action
  • Select: Create record
  • Table: User Tests
4

Map the Fields

For each field below, click + Choose field, select the field, then click the blue + button to insert the value from the trigger.

Map these User Tests fields from the Bug trigger:

  • Test Name: type "Retest: " then insert the Bug ID from the trigger. The prefix helps identify retests.
  • Status: select Todo (static value).
  • Bugs: insert the Airtable record ID from the trigger. This links the retest back to the bug.
  • Iteration: type 2 (static value; manually update if the real iteration is higher).
💡
After the automation runs: Open the newly created retest and manually copy the Test Instructions, Relevant page, Relevant Role, Severity, and Module from the original test. Without scripts, Airtable can't copy these lookup values into writable fields.
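As a mental model only (you configure this with clicks, not code), the Create Record action above amounts to building a record like this. The `build_retest` function and `record_id` key are illustrative assumptions; the field values come from the map above.

```python
def build_retest(bug):
    """Simulate Automation 2's Create Record action, following the field map above."""
    return {
        "Test Name": "Retest: " + bug["Bug ID"],  # prefix identifies retests
        "Status": "Todo",                          # static value
        "Bugs": bug["record_id"],                  # links the retest back to the bug
        "Iteration": 2,                            # static; bump manually if higher
    }

retest = build_retest({"record_id": "recBUG001", "Bug ID": "BUG-042", "Resolved": True})
```

Notice that Test Instructions, Severity, and the other lookup-sourced fields are absent: without scripts they must be copied by hand after creation, as described in the tip above.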
5

Test and Turn On

  • Click Test action to verify it creates a record
  • Toggle the automation ON
💬

Automation 3: New Test → Slack Notification

When a new User Test record is created (e.g., a retest), this automation sends a Slack message to notify the QA team.

1

Create the Automation

  • Click + Create automation
  • Name it: New Test → Slack Notification
2

Configure the Trigger

  • Trigger type: When record is created
  • Table: User Tests
3

Add Action: Send Slack Message

  • Click + Add advanced logic or action
  • Select: Slack: Send a message
  • If prompted, click Connect Slack account and authorize your workspace
  • Select your QA team channel
4

Compose the Message

Use the blue + button to insert dynamic fields from the trigger:

Slack Message Template
🧪 New QA Test Available!
*Test:* {insert Test Name}
*Module:* {insert Module}
*Severity:* {insert Severity}
*Role:* {insert Relevant Role}
*Page:* {insert Relevant page}
📋 View in Airtable: {insert Record URL}
💡
Record URL: When inserting fields, look for Record URL in the list — Airtable provides this automatically for every trigger record.
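Conceptually, the composed message is just string interpolation over the trigger record's fields, as in this illustrative sketch (`compose_slack_message` is hypothetical; in Airtable you build the same text with the blue + insert button):

```python
def compose_slack_message(test):
    """Build the notification text from trigger fields, mirroring the template above."""
    return (
        "🧪 New QA Test Available!\n"
        f"*Test:* {test['Test Name']}\n"
        f"*Module:* {test['Module']}\n"
        f"*Severity:* {test['Severity']}\n"
        f"*Role:* {test['Relevant Role']}\n"
        f"*Page:* {test['Relevant page']}\n"
        f"📋 View in Airtable: {test['Record URL']}"
    )

message = compose_slack_message({
    "Test Name": "Retest: BUG-042",
    "Module": "Dashboard",
    "Severity": "High",
    "Relevant Role": "Admin",
    "Relevant page": "/dashboard",
    "Record URL": "https://airtable.com/rec123",
})
```

If a field is empty on the trigger record, the corresponding slot in the Slack message will simply be blank.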
5

Test and Turn On

  • Click Test action to send a test message to Slack
  • Verify the message appears in your QA channel
  • Toggle the automation ON
🔍

Automation 4: Ad-Hoc Finding → Create Bug

When someone logs an ad-hoc finding (an issue found outside normal testing), this automation creates a Bug record and links it back.

1

Create the Automation

  • Click + Create automation
  • Name it: Ad-Hoc Finding → Create Bug
2

Configure the Trigger

  • Trigger type: When record is created
  • Table: Ad-Hoc Findings
3

Add Action 1: Create Record in Bugs

  • Click + Add advanced logic or action
  • Select: Create record
  • Table: Bugs
Map this Bugs field from the Ad-Hoc Finding trigger:

  • Ad-Hoc Findings: insert the Airtable record ID from the trigger
ℹ️
Why only one field? Unlike test-linked bugs, ad-hoc bugs don't have a linked User Test — so lookup fields won't populate. The AI agent will read the finding description directly from the linked Ad-Hoc Finding when processing the bug.
4

Add Action 2: Update the Finding

  • Click + Add advanced logic or action
  • Select: Update record
  • Table: Ad-Hoc Findings
  • Record ID: Insert Airtable record ID from the trigger step
Set these Ad-Hoc Findings fields:

  • Created Bug: insert the Airtable record ID from Action 1 (the bug you just created)
  • Status: select Bug Created
5

Test and Turn On

  • Click Test action to verify it works
  • Toggle the automation ON
🕵️

Automation 5: Ad-Hoc Finding → Create Investigation

When an ad-hoc finding is created, this automation creates a Gap Investigation to analyze why the issue was missed by automated QA.

1

Create the Automation

  • Click + Create automation
  • Name it: Ad-Hoc Finding → Create Investigation
2

Configure the Trigger

  • Trigger type: When record is created
  • Table: Ad-Hoc Findings
3

Add Action 1: Create Record in Gap Investigations

  • Click + Add advanced logic or action
  • Select: Create record
  • Table: Gap Investigations
Map these Gap Investigations fields from the Ad-Hoc Finding trigger:

  • Investigation Title: insert the Title from the trigger
  • Source Finding: insert the Airtable record ID from the trigger
  • Status: select Open
4

Add Action 2: Update the Finding

  • Click + Add advanced logic or action
  • Select: Update record
  • Table: Ad-Hoc Findings
  • Record ID: Insert Airtable record ID from the trigger step
Set this Ad-Hoc Findings field:

  • Created Investigation: insert the Airtable record ID from Action 1 (the investigation you just created)
💡
No status update here. Automation 4 already sets the Finding status to "Bug Created." No need to overwrite it.
5

Test and Turn On

  • Click Test action to verify
  • Toggle the automation ON
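You never write this yourself, but the combined effect of Automations 4 and 5 firing on the same new finding can be sketched as follows. The `on_finding_created` function and the id generator are illustrative assumptions; the field names match the maps above.

```python
import itertools

def on_finding_created(finding, new_record_id):
    """Simulate Automations 4 and 5 both firing on one new Ad-Hoc Finding."""
    # Automation 4, action 1: create the Bug, linked to the finding.
    bug = {"record_id": new_record_id(), "Ad-Hoc Findings": finding["record_id"]}
    # Automation 5, action 1: create the Gap Investigation.
    investigation = {
        "record_id": new_record_id(),
        "Investigation Title": finding["Title"],
        "Source Finding": finding["record_id"],
        "Status": "Open",
    }
    # Both automations then update the finding; only Automation 4 sets Status.
    finding["Created Bug"] = bug["record_id"]
    finding["Created Investigation"] = investigation["record_id"]
    finding["Status"] = "Bug Created"
    return bug, investigation

ids = ("rec%03d" % n for n in itertools.count(1))
finding = {"record_id": "recFIND001", "Title": "TEST: Dashboard loading delay"}
bug, investigation = on_finding_created(finding, lambda: next(ids))
```

This is why both automations must be ON before either goes live: a finding processed by only one of them ends up with half its links missing.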

Automation 6: Investigation Verified → Update Finding

When a Gap Investigation is marked as Verified, this automation updates the linked Ad-Hoc Finding status to Investigation Complete.

1

Create the Automation

  • Click + Create automation
  • Name it: Investigation Verified → Update Finding
2

Configure the Trigger

  • Trigger type: When record matches conditions
  • Table: Gap Investigations
  • Add condition: Status → is → Verified
3

Add Action: Update Linked Finding

  • Click + Add advanced logic or action
  • Select: Update record
  • Table: Ad-Hoc Findings
  • Record ID: Insert Source Finding from the trigger (this is the linked finding's record ID)
Set this Ad-Hoc Findings field:

  • Status: select Investigation Complete
4

Test and Turn On

  • Click Test action
  • Toggle the automation ON
🖥️

QA Testing Portal Interface

Create a user-friendly interface so testers don't need to interact with raw tables. This is the "front door" for the QA team.

1

Create the Interface

  • Click the Interfaces tab at the top of your base
  • Click + Create interface
  • Choose Blank interface (start from scratch)
  • Name it: QA Testing Portal
2

Add Dashboard Page

This is your landing page. Add these elements:

Element 1: Summary Numbers

  • Add a Number element
  • Source: User Tests table
  • Add filtered counts for: Todo, Passed, Failed, Blocked

Element 2: My Tests Grid

  • Add a Grid element
  • Source: User Tests table, view: User
  • Show fields: Test Name, Status, Severity, Module, Relevant page
3

Add Test Detail Page

  • Add a Record detail page
  • Source: User Tests table
  • Show all fields the tester needs: Test Instructions, Status, QA Notes, Severity, Video Evidence, Finalize
  • Link this page from Grid clicks (Airtable does this automatically for record detail pages)
4

Add Ad-Hoc Findings Form

  • Add a new page: Report Ad-Hoc Finding
  • Add a Form element
  • Source: Ad-Hoc Findings table
  • Include fields: Title, Description, Severity, Category, Module, Page / URL, Video Evidence, Screenshots
  • Hide fields: Status, Created Bug, Created Investigation (these are auto-managed)
5

Add Bug Tracker Page

  • Add a new page: Bug Tracker
  • Add a Grid element
  • Source: Bugs table, view: Open Bugs
  • Show fields: Bug ID, Relevant Test, Severity, Module, Resolved, Decomp Level
6

Publish the Interface

  • Click Publish in the top-right corner
  • Share the interface link with your QA team
  • Testers use this link — they never need to touch the raw tables
🧪

Testing Your Setup

Run through this checklist to verify every automation is working correctly.

🚨
Test with real data! Create actual records — don't just rely on Airtable's "Test action" button. Real-world timing and field population can differ from test previews.

Test 1: Failed Test → Bug

T1

Trigger Automation 1

  • Go to User Tests table
  • Create a test record with Status = Failed
  • Check Finalize
  • Wait 5 seconds
  • Go to Bugs table
  • ✅ Verify: New bug exists with Relevant Test linked to your test
  • ✅ Verify: Lookup fields (QA Notes, Severity, etc.) are populated

Test 2: Bug Resolved → Retest

T2

Trigger Automation 2

  • Go to Bugs table
  • Check Resolved ✅ on the test bug
  • Wait 5 seconds
  • Go to User Tests table
  • ✅ Verify: New retest record exists with Status = Todo
  • ✅ Verify: Bugs field links back to the resolved bug

Test 3: Slack Notification

T3

Trigger Automation 3

  • The retest from Test 2 should have already triggered a Slack message
  • ✅ Verify: Message appears in your QA Slack channel
  • ✅ Verify: Message contains the test name and relevant details

Test 4: Ad-Hoc Finding → Bug + Investigation

T4

Trigger Automations 4 + 5

  • Go to Ad-Hoc Findings table
  • Create a new record: Title = "TEST: Dashboard loading delay", Description = "Takes 5+ seconds to load"
  • Wait 5 seconds
  • ✅ Verify: Created Bug field is populated (links to new Bug)
  • ✅ Verify: Created Investigation field is populated (links to new Investigation)
  • ✅ Verify: Status = Bug Created
  • Go to Gap Investigations table
  • ✅ Verify: New investigation exists with Source Finding linked back

Test 5: Investigation Verified

T5

Trigger Automation 6

  • Go to Gap Investigations table
  • Change the test investigation's Status to Verified
  • Wait 5 seconds
  • Go to Ad-Hoc Findings table
  • ✅ Verify: Finding status = Investigation Complete

🟢 All Tests Pass

  1. Delete all test records you created
  2. Delete the old script-based automations (now confirmed replaced)
  3. Notify the team that automations are live

🔴 A Test Fails

  1. Go to Automations → click the failing automation
  2. Click Runs or Run history
  3. Click into the failed run to see the error
  4. Check the Troubleshooting section below
📖

Field Reference & Troubleshooting

All 6 Tables in the QA Ledger

  • Flows: user journeys (seeded by AI). Key fields: Flow Name, Steps, Module
  • Quality Checks: quality standards (seeded by AI). Key fields: Check Name, Criteria, Category
  • User Tests: tests run by humans. Key fields: Status, QA Notes, Finalize, Severity
  • Bugs: bugs found during testing. Key fields: Relevant Test, Resolved, Decomp Level
  • Ad-Hoc Findings: issues found outside test scope. Key fields: Title, Description, Created Bug, Status
  • Gap Investigations: root cause analysis for missed issues. Key fields: Source Finding, Gap Type, Status

Automation Summary

  • 1. Failed Test → Create Bug. Trigger: User Test with Status = Failed and Finalize checked. Action: create a Bug record (linked to the test).
  • 2. Bug Resolved → Create Retest. Trigger: Bug with Resolved checked. Action: create a User Test (linked to the bug).
  • 3. New Test → Slack. Trigger: User Test created. Action: send a Slack message.
  • 4. Finding → Create Bug. Trigger: Ad-Hoc Finding created. Actions: create a Bug + update the Finding.
  • 5. Finding → Create Investigation. Trigger: Ad-Hoc Finding created. Actions: create an Investigation + update the Finding.
  • 6. Investigation Verified. Trigger: Investigation with Status = Verified. Action: update the Finding status.

Troubleshooting

Problem: An automation doesn't fire

Check: Is the automation toggled ON?

Check: Does the trigger condition match? For "When record matches conditions," the record must transition INTO the matching state; records that already match when the automation is first turned on won't trigger it.

Fix: Create a NEW record, or update an existing record so it newly matches the conditions.

Problem: The Bug's lookup fields are empty

Check: Is the Relevant Test field linked correctly to the User Test record?

Check: Does the User Test record have values in QA Notes, Severity, Module, etc.?

Fix: Lookup fields only populate if the source record has data. Fill in the User Test fields first.

Problem: The Investigation isn't linked to its Finding

Check: Is "Source Finding" a link field type? If it's a formula or text field, it won't work for linking.

Fix: Make sure "Source Finding" in Gap Investigations is a Link to another record field pointing at the Ad-Hoc Findings table.

Problem: A status update doesn't apply

Check: Is the status value an exact match? Single select options are case-sensitive and include emojis.

Fix: Select the status from the dropdown; don't type it manually.

Problem: The Slack message doesn't arrive

Check: Is the Slack workspace connected? Go to the automation's Slack action and verify the connection.

Check: Does the bot have permission to post in the selected channel?

Fix: Reconnect Slack and invite the Airtable bot to the channel.

Golden Rules

1
Turn off old automations first. Running both old (script-based) and new (script-free) automations simultaneously will create duplicate records.
2
Set up ALL automations before turning any ON. Automations 4 and 5 both trigger when an Ad-Hoc Finding is created. If only one is ON, you'll get incomplete data.
3
Test with fresh records. "When record matches conditions" only fires when a record newly transitions into the matching state — it won't trigger for records that already match.
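This edge-triggered behavior can be pictured with a small sketch (`newly_matches` is an illustrative helper, not an Airtable feature): the automation compares the record's state before and after a change, and fires only when the condition flips from false to true.

```python
def newly_matches(before, after, condition):
    """Edge-triggered check: fire only when the record transitions INTO the matching state."""
    return condition(after) and not condition(before)

def is_verified(rec):
    return rec.get("Status") == "Verified"

# A record that transitions into Verified fires; one that already matched does not.
fires = newly_matches({"Status": "Open"}, {"Status": "Verified"}, is_verified)
already = newly_matches({"Status": "Verified"}, {"Status": "Verified"}, is_verified)
```

This is why the testing checklist has you create fresh records or flip a checkbox yourself rather than relying on records that matched before the automation was turned on.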
4
Trust the lookup fields. For Automation 1, you only map one field (Relevant Test). All other Bug fields auto-populate. Don't try to manually map lookup fields.
5
Clean up test data. After confirming everything works, delete all test records and the old script-based automations.
6
Check the run history. If an automation doesn't fire, go to Automations → click it → Run history. The error message will tell you exactly what went wrong.

📋 SOP Details

General Information
SOP ID
SOP-QA-002
Version
1.0.0
Created
February 2026
Last Updated
February 17, 2026
Purpose
Guide admins through setting up 6 Airtable automations and the QA Testing Portal interface for the QA Ledger base.
Scope
Start Point
QA Ledger base with tables already created
End Point
All 6 automations active + Testing Portal interface published
Includes
  • 6 automation configurations (triggers, actions, field mappings)
  • QA Testing Portal interface setup
  • Testing procedures and troubleshooting
Excludes
  • Creating the Airtable tables (must be done first)
  • Running QA tests (covered by QA Tester SOP)
  • AI bug fixing workflow
Ownership
Responsible
QA Admin / Technical Lead
Accountable
Engineering Lead
Required Resources
Tools
  • Airtable (Creator/Owner access)
  • Slack workspace (for notification automation)
Access
  • QA Ledger Template base (appyyIrpL286aFRVh)
  • Airtable automation permissions
Performance Measures
Success Criteria
  • All 6 automations pass their test procedures
  • No duplicate or broken records are created
  • QA Testing Portal is accessible and functional
  • Slack notifications arrive within 10 seconds
Document Control
Review Cycle
Quarterly
Approved By
Engineering Lead

💬 Commenting Guide

You can leave comments on any specific text in this SOP using Hypothesis — a free, open-source annotation tool.

1

Create a free account

Sign up at hypothes.is/signup — takes about 30 seconds.

2

Open the sidebar

Click the arrow on the top-right corner of this page to expand the Hypothesis sidebar.

3

Highlight & comment

Select any text on the page → a tooltip appears → click Annotate to leave your comment on that exact passage.

4

Everyone can see it

Your annotations are visible to anyone viewing this page. Others can reply to start a discussion thread.