How to Set Up QA Ledger Automations & Interface
This guide walks you through configuring 6 Airtable automations and the QA Testing Portal interface — no scripts, no coding, just clicks.
The Big Picture
The QA Ledger uses 6 automations to keep everything connected. The Automation Summary table near the end of this guide shows the full flow at a glance.
Before You Start
What You Need
- An Airtable base containing the six QA Ledger tables (Flows, Quality Checks, User Tests, Bugs, Ad-Hoc Findings, Gap Investigations — see the Field Reference at the end of this guide)
- Edit access to the base's Automations and Interfaces tabs
- A Slack workspace with a QA channel you can connect to Airtable (used in Automation 3)
Existing Automations to Replace
If you already have script-based automations, you need to turn them OFF first, then create the new script-free versions.
Turn Off Old Automations
- Go to the Automations tab in your base
- Find these automations (if they exist): Create Bug, Create new Test after Bug resolved, Send slack notification
- Toggle each one OFF
- DO NOT delete them yet — keep as backup until new ones are confirmed working
Automation 1: Failed Test → Create Bug
When a tester marks a test as Failed and checks Finalize, this automation creates a Bug record and links it back to the test.
Create the Automation
- Click Automations tab → click + Create automation
- Name it: Failed Test → Create Bug
Configure the Trigger
- Trigger type: When record matches conditions
- Table: User Tests
- Add condition 1: Status → is → Failed
- Add condition 2: Finalize → is → checked ✅
Add Action: Create Record
- Click + Add advanced logic or action
- Select: Create record
- Table: Bugs
Map the Field
- Click + Choose field
- Select: Relevant Test
- Click the blue + insert button on the right
- Choose: Airtable record ID (from the trigger step "When record matches conditions")
Test the Trigger
- Click Generate a preview or Test trigger
- Select a test record that has Status = Failed and Finalize = checked
- Verify it finds a matching record
Turn It On
- Toggle the automation ON
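The automation itself is configured entirely with clicks, but the trigger-and-action logic can be sanity-checked with a short sketch. This Python model is illustrative only (no Airtable API involved; field names taken from this guide): it mirrors the two trigger conditions and the linked-record mapping.

```python
# Model of Automation 1: a test record fires the trigger only when
# Status is "Failed" AND the Finalize checkbox is checked.

def should_create_bug(test_record: dict) -> bool:
    """Mirror of the two trigger conditions."""
    return (test_record.get("Status") == "Failed"
            and test_record.get("Finalize") is True)

def make_bug(test_record_id: str) -> dict:
    """Mirror of the Create-record action: link the bug back to the test.
    Airtable link fields hold a list of record IDs, not plain text."""
    return {"Relevant Test": [test_record_id]}

# A finalized failed test produces a linked bug; anything else does nothing.
test = {"id": "recTEST001", "Status": "Failed", "Finalize": True}
if should_create_bug(test):
    bug = make_bug(test["id"])
    print(bug)  # {'Relevant Test': ['recTEST001']}
```

The list value is why the action inserts the Airtable record ID rather than the test's name: link fields only accept record IDs.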
Automation 2: Bug Resolved → Create Retest
When a bug is marked as Resolved, this automation creates a new User Test (retest) so the tester can verify the fix worked.
Create the Automation
- Click + Create automation
- Name it: Bug Resolved → Create Retest
Configure the Trigger
- Trigger type: When record matches conditions
- Table: Bugs
- Add condition: Resolved → is → checked ✅
Add Action: Create Record
- Click + Add advanced logic or action
- Select: Create record
- Table: User Tests
Map the Fields
For each field below, click + Choose field, select the field, then click the blue + button to insert the value from the trigger.
| User Tests Field | Map From (Bug trigger) | Notes |
|---|---|---|
| Test Name | Type: "Retest: " then insert Bug ID from trigger | Prefix helps identify retests |
| Status | Select: Todo | Static value |
| Bugs | Insert Airtable record ID from trigger | Links retest back to the bug |
| Iteration | Type: 2 | Static value (update manually for later retest rounds) |
Test and Turn On
- Click Test action to verify it creates a record
- Toggle the automation ON
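The field mapping above can be sketched the same way. In this illustrative model, `bug_id` stands for the Bugs table's Bug ID field (used in the retest's name) and `bug_record_id` for the Airtable record ID (used in the link field); both names are assumptions for the sketch.

```python
# Model of Automation 2's field mapping: a resolved bug produces a
# retest record in User Tests with the values from the table above.

def make_retest(bug_id: str, bug_record_id: str) -> dict:
    return {
        "Test Name": f"Retest: {bug_id}",  # prefix identifies retests
        "Status": "Todo",                  # static value
        "Bugs": [bug_record_id],           # link back to the resolved bug
        "Iteration": 2,                    # static; bump manually for later rounds
    }

retest = make_retest("BUG-42", "recBUG042")
print(retest["Test Name"])  # Retest: BUG-42
```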
Automation 3: New Test → Slack Notification
When a new User Test record is created (e.g., a retest), this automation sends a Slack message to notify the QA team.
Create the Automation
- Click + Create automation
- Name it: New Test → Slack Notification
Configure the Trigger
- Trigger type: When record is created
- Table: User Tests
Add Action: Send Slack Message
- Click + Add advanced logic or action
- Select: Slack: Send a message
- If prompted, click Connect Slack account and authorize your workspace
- Select your QA team channel
Compose the Message
Use the blue + button to insert dynamic fields from the trigger. For example, type "New test created: " and insert the Test Name field. The exact wording is up to you, but include the test name so the verification checklist later in this guide can confirm it.
Test and Turn On
- Click Test action to send a test message to Slack
- Verify the message appears in your QA channel
- Toggle the automation ON
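As an illustration of how the static text and inserted fields combine, here is one possible message template. The wording and the fields included are assumptions; only "contains the test name" is required by the verification checklist later in this guide.

```python
# Illustrative message template for Automation 3: static text mixed with
# values inserted from the triggering User Test record.

def compose_message(test_record: dict) -> str:
    return (
        f"🧪 New QA test created: {test_record['Test Name']}\n"
        f"Status: {test_record['Status']} | "
        f"Module: {test_record.get('Module', 'n/a')}"
    )

msg = compose_message({"Test Name": "Retest: BUG-42", "Status": "Todo"})
print(msg)
```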
Automation 4: Ad-Hoc Finding → Create Bug
When someone logs an ad-hoc finding (an issue found outside normal testing), this automation creates a Bug record and links it back.
Create the Automation
- Click + Create automation
- Name it: Ad-Hoc Finding → Create Bug
Configure the Trigger
- Trigger type: When record is created
- Table: Ad-Hoc Findings
Add Action 1: Create Record in Bugs
- Click + Add advanced logic or action
- Select: Create record
- Table: Bugs
| Bugs Field | Map From (Ad-Hoc Finding trigger) |
|---|---|
| Ad-Hoc Findings | Insert Airtable record ID from trigger |
Add Action 2: Update the Finding
- Click + Add advanced logic or action
- Select: Update record
- Table: Ad-Hoc Findings
- Record ID: Insert Airtable record ID from the trigger step
| Ad-Hoc Findings Field | Value |
|---|---|
| Created Bug | Insert Airtable record ID from Action 1 (the bug you just created) |
| Status | Select: Bug Created |
Test and Turn On
- Click Test action to verify it works
- Toggle the automation ON
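This automation is the first with two chained actions, and Action 2 depends on the record Action 1 just created. This sketch models that dependency (illustrative only; the generated bug ID is made up, and Automation 5 follows the same create-then-update pattern with Gap Investigations).

```python
# Model of Automation 4's two-step flow: create a Bug linked to the
# finding, then write the new bug's record ID and a status back onto
# the finding that triggered the automation.

def run_automation_4(finding: dict, bugs_table: list) -> None:
    # Action 1: create the bug, linked to the triggering finding
    bug = {"id": f"recBUG{len(bugs_table):03d}",
           "Ad-Hoc Findings": [finding["id"]]}
    bugs_table.append(bug)
    # Action 2: update the finding with the bug just created
    finding["Created Bug"] = [bug["id"]]
    finding["Status"] = "Bug Created"

bugs = []
finding = {"id": "recFIND001", "Title": "Dashboard loading delay"}
run_automation_4(finding, bugs)
print(finding["Status"])  # Bug Created
```

The order matters in the real automation too: Action 2's "Created Bug" value must come from Action 1's output, not from the trigger.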
Automation 5: Ad-Hoc Finding → Create Investigation
When an ad-hoc finding is created, this automation creates a Gap Investigation to analyze why the issue was missed by automated QA.
Create the Automation
- Click + Create automation
- Name it: Ad-Hoc Finding → Create Investigation
Configure the Trigger
- Trigger type: When record is created
- Table: Ad-Hoc Findings
Add Action 1: Create Record in Gap Investigations
- Click + Add advanced logic or action
- Select: Create record
- Table: Gap Investigations
| Gap Investigations Field | Map From (Ad-Hoc Finding trigger) |
|---|---|
| Investigation Title | Insert Title from trigger |
| Source Finding | Insert Airtable record ID from trigger |
| Status | Select: Open |
Add Action 2: Update the Finding
- Click + Add advanced logic or action
- Select: Update record
- Table: Ad-Hoc Findings
- Record ID: Insert Airtable record ID from the trigger step
| Ad-Hoc Findings Field | Value |
|---|---|
| Created Investigation | Insert Airtable record ID from Action 1 (the investigation you just created) |
Test and Turn On
- Click Test action to verify
- Toggle the automation ON
Automation 6: Investigation Verified → Update Finding
When a Gap Investigation is marked as Verified, this automation updates the linked Ad-Hoc Finding status to Investigation Complete.
Create the Automation
- Click + Create automation
- Name it: Investigation Verified → Update Finding
Configure the Trigger
- Trigger type: When record matches conditions
- Table: Gap Investigations
- Add condition: Status → is → Verified
Add Action: Update Linked Finding
- Click + Add advanced logic or action
- Select: Update record
- Table: Ad-Hoc Findings
- Record ID: Insert Source Finding from the trigger (this is the linked finding's record ID)
| Ad-Hoc Findings Field | Value |
|---|---|
| Status | Select: Investigation Complete |
Test and Turn On
- Click Test action
- Toggle the automation ON
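This is the only automation that updates a record other than the one that triggered it, which is why the Record ID comes from the Source Finding link rather than the trigger's own record ID. A minimal sketch of that indirection (illustrative only; field names from this guide):

```python
# Model of Automation 6: when an investigation's Status becomes "Verified",
# set the linked finding's Status via the Source Finding link field.

def on_investigation_verified(investigation: dict, findings: dict) -> None:
    # A link field holds a list of record IDs pointing at the other table.
    for finding_id in investigation["Source Finding"]:
        findings[finding_id]["Status"] = "Investigation Complete"

findings = {"recFIND001": {"Status": "Bug Created"}}
inv = {"Status": "Verified", "Source Finding": ["recFIND001"]}
on_investigation_verified(inv, findings)
print(findings["recFIND001"]["Status"])  # Investigation Complete
```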
QA Testing Portal Interface
Create a user-friendly interface so testers don't need to interact with raw tables. This is the "front door" for the QA team.
Create the Interface
- Click the Interfaces tab at the top of your base
- Click + Create interface
- Choose Blank interface (start from scratch)
- Name it: QA Testing Portal
Add Dashboard Page
This is your landing page. Add these elements:
Element 1: Summary Numbers
- Add Number elements
- Source: User Tests table
- Create one Number element per status, each filtered to count Todo, Passed, Failed, or Blocked records
Element 2: My Tests Grid
- Add a Grid element
- Source: User Tests table, view: User
- Show fields: Test Name, Status, Severity, Module, Relevant page
Add Test Detail Page
- Add a Record detail page
- Source: User Tests table
- Show all fields the tester needs: Test Instructions, Status, QA Notes, Severity, Video Evidence, Finalize
- Link this page from Grid clicks (Airtable does this automatically for record detail pages)
Add Ad-Hoc Findings Form
- Add a new page: Report Ad-Hoc Finding
- Add a Form element
- Source: Ad-Hoc Findings table
- Include fields: Title, Description, Severity, Category, Module, Page / URL, Video Evidence, Screenshots
- Hide fields: Status, Created Bug, Created Investigation (these are auto-managed)
Add Bug Tracker Page
- Add a new page: Bug Tracker
- Add a Grid element
- Source: Bugs table, view: Open Bugs
- Show fields: Bug ID, Relevant Test, Severity, Module, Resolved, Decomp Level
Publish the Interface
- Click Publish in the top-right corner
- Share the interface link with your QA team
- Testers use this link — they never need to touch the raw tables
Testing Your Setup
Run through this checklist to verify every automation is working correctly.
Test 1: Failed Test → Bug
Trigger Automation 1
- Go to User Tests table
- Create a test record with Status = Failed
- Check Finalize ✅
- Wait 5 seconds
- Go to Bugs table
- ✅ Verify: New bug exists with Relevant Test linked to your test
- ✅ Verify: Lookup fields (QA Notes, Severity, etc.) are populated
Test 2: Bug Resolved → Retest
Trigger Automation 2
- Go to Bugs table
- Check Resolved ✅ on the test bug
- Wait 5 seconds
- Go to User Tests table
- ✅ Verify: New retest record exists with Status = Todo
- ✅ Verify: Bugs field links back to the resolved bug
Test 3: Slack Notification
Trigger Automation 3
- The retest from Test 2 should have already triggered a Slack message
- ✅ Verify: Message appears in your QA Slack channel
- ✅ Verify: Message contains the test name and relevant details
Test 4: Ad-Hoc Finding → Bug + Investigation
Trigger Automations 4 + 5
- Go to Ad-Hoc Findings table
- Create a new record: Title = "TEST: Dashboard loading delay", Description = "Takes 5+ seconds to load"
- Wait 5 seconds
- ✅ Verify: Created Bug field is populated (links to new Bug)
- ✅ Verify: Created Investigation field is populated (links to new Investigation)
- ✅ Verify: Status = Bug Created
- Go to Gap Investigations table
- ✅ Verify: New investigation exists with Source Finding linked back
Test 5: Investigation Verified
Trigger Automation 6
- Go to Gap Investigations table
- Change the test investigation's Status to Verified
- Wait 5 seconds
- Go to Ad-Hoc Findings table
- ✅ Verify: Finding status = Investigation Complete
🟢 All Tests Pass
- Delete all test records you created
- Delete the old script-based automations (now confirmed replaced)
- Notify the team that automations are live
🔴 A Test Fails
- Go to Automations → click the failing automation
- Click Runs or Run history
- Click into the failed run to see the error
- Check the Troubleshooting section below
Field Reference & Troubleshooting
All 6 Tables in the QA Ledger
| Table | Purpose | Key Fields |
|---|---|---|
| Flows | User journeys (seeded by AI) | Flow Name, Steps, Module |
| Quality Checks | Quality standards (seeded by AI) | Check Name, Criteria, Category |
| User Tests | Tests run by humans | Status, QA Notes, Finalize, Severity |
| Bugs | Bugs found during testing | Relevant Test, Resolved, Decomp Level |
| Ad-Hoc Findings | Issues found outside test scope | Title, Description, Created Bug, Status |
| Gap Investigations | Root cause analysis for missed issues | Source Finding, Gap Type, Status |
Automation Summary
| # | Name | Trigger | Actions |
|---|---|---|---|
| 1 | Failed Test → Create Bug | User Test: Status=Failed + Finalize=checked | Create Bug record (link to test) |
| 2 | Bug Resolved → Create Retest | Bug: Resolved=checked | Create User Test (link to bug) |
| 3 | New Test → Slack | User Test created | Send Slack message |
| 4 | Finding → Create Bug | Ad-Hoc Finding created | Create Bug + Update Finding |
| 5 | Finding → Create Investigation | Ad-Hoc Finding created | Create Investigation + Update Finding |
| 6 | Investigation Verified | Investigation: Status=Verified | Update Finding status |
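The summary table can be read as an event chain. This sketch strings Automations 1 through 3 together to show how one failed test ripples through the system (names from this guide; the generated record IDs are made up for illustration).

```python
# Chain of Automations 1 -> 2 -> 3: failing a finalized test creates a bug;
# resolving that bug creates a retest; the retest's creation sends Slack mail.

bugs, user_tests, slack_messages = [], [], []

def on_test_failed(test):                      # Automation 1
    bug = {"id": f"recBUG{len(bugs)}",
           "Relevant Test": [test["id"]], "Resolved": False}
    bugs.append(bug)
    return bug

def on_test_created(test):                     # Automation 3
    slack_messages.append(f"New test created: {test['Test Name']}")

def on_bug_resolved(bug):                      # Automation 2
    retest = {"id": f"recTEST{len(user_tests) + 1}",
              "Test Name": f"Retest: {bug['id']}",
              "Status": "Todo", "Bugs": [bug["id"]]}
    user_tests.append(retest)
    on_test_created(retest)  # creating the retest also fires Automation 3
    return retest

bug = on_test_failed({"id": "recTEST0", "Status": "Failed", "Finalize": True})
bug["Resolved"] = True
on_bug_resolved(bug)
print(slack_messages)  # ['New test created: Retest: recBUG0']
```

The nesting in `on_bug_resolved` is the point: Automation 3 fires on any new User Test, so retests created by Automation 2 notify the team automatically.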
Troubleshooting
Automation doesn't fire
Check: Is the automation toggled ON?
Check: Does the trigger condition match? For "When record matches conditions," the record must transition INTO the matching state — records that already match when the automation is first turned on won't trigger it.
Fix: Create a NEW record or update an existing record so it newly matches the conditions.
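This "must transition into the matching state" behavior is edge-triggered logic; conceptually, Airtable compares the record before and after each change. A minimal sketch, using Automation 1's conditions as the example:

```python
# "When record matches conditions" fires only when a record changes from
# not-matching to matching, not while it merely sits in the matching state.

def matches(record: dict) -> bool:
    return (record.get("Status") == "Failed"
            and record.get("Finalize") is True)

def fires(before: dict, after: dict) -> bool:
    return matches(after) and not matches(before)

already = {"Status": "Failed", "Finalize": True}
print(fires(already, already))  # False: matched before the change too

fresh = {"Status": "Failed", "Finalize": False}
print(fires(fresh, {**fresh, "Finalize": True}))  # True: newly matches
```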
Bug record is missing lookup data
Check: Is the Relevant Test field linked correctly to the User Test record?
Check: Does the User Test record have values in QA Notes, Severity, Module, etc.?
Fix: Lookup fields only populate if the source record has data. Fill in the User Test fields first.
Investigation doesn't link back to the finding
Check: Is "Source Finding" a link field type? If it's a formula or text field, it won't work for linking.
Fix: Make sure "Source Finding" in Gap Investigations is a Link to another record field pointing at the Ad-Hoc Findings table.
Status update doesn't stick
Check: Is the status value an exact match? Single select options are case-sensitive and may include emojis that are easy to miss when typing.
Fix: Select the status from the dropdown — don't type it manually.
Slack message doesn't arrive
Check: Is the Slack workspace connected? Go to the automation's Slack action and verify the connection.
Check: Does the bot have permission to post in the selected channel?
Fix: Reconnect Slack and invite the Airtable bot to the channel.