
Implementation Guide: Monitor client bank feeds and surface cash flow alerts proactively

Step-by-step implementation guide for deploying AI to monitor client bank feeds and surface cash flow alerts proactively for Accounting & Bookkeeping clients.

Hardware Procurement

Cloud Virtual Machine — Primary Agent Server

Microsoft Azure — Standard B2ms (2 vCPU, 8 GB RAM, 32 GB temp storage) — Qty: 1

$60/month MSP cost / $120/month suggested resale (bundled into managed service)

Hosts the self-hosted n8n instance, PostgreSQL database, and all agent orchestration containers. Runs Ubuntu 24.04 LTS with Docker (matching the image provisioned in Step 1). This is the core compute for the AI agent pipeline.

Cloud Virtual Machine — Staging/Dev Server

Microsoft Azure — Standard B1ms (1 vCPU, 2 GB RAM) — Qty: 1

$15/month MSP cost / included in project fee

Development and staging environment for testing workflow changes, prompt updates, and API integrations before promoting to production.

Dell PowerEdge T360 Tower Server (Optional On-Premises)

Dell Technologies — PET360_14588 — Qty: 1

$2,000 MSP cost / $2,800–$3,200 resale

Optional on-premises server for security-sensitive firms that require financial data to remain on-site. Replaces the Azure VM for n8n hosting. Only procure if the client has a regulatory or contractual requirement for on-premises data processing.

Synology NAS for Encrypted Backup

Synology — DS923+ — Qty: 1

$550 MSP cost / $750–$850 resale

Encrypted backup of alert history, transaction logs, and audit trails. Required for IRS Publication 4557 and FTC Safeguards Rule compliance. Stores nightly database dumps and configuration backups.

APC Smart-UPS 1500VA (On-Premises Only)

APC by Schneider Electric — SMT1500C — Qty: 1

$450 MSP cost / $625 resale

Uninterruptible power supply for on-premises server and NAS. Provides 15–20 minutes of runtime for graceful shutdown during power events. Only needed for on-premises deployments.

Seagate IronWolf NAS Drives

Seagate — ST4000VN006 (4TB) — Qty: 2

$95/drive MSP cost / $130/drive resale

RAID-1 mirrored storage for the Synology DS923+ NAS. Provides redundant backup storage for financial data audit trails and configuration backups.

Software Procurement

Float Cash Flow Forecasting

Float — SaaS, per-organization subscription — Qty: N/A

$59/month (Essential, 3 users, 5 scenarios) or $99/month (Premium, 10 users, 10 scenarios). Resale at $99–$179/month.

Primary cash flow threshold monitoring tool. Connects directly to QuickBooks Online and Xero, syncs daily, and provides configurable cash balance threshold alerts. Serves as the Phase 1 alerting engine and ongoing baseline monitoring layer.

Fathom Financial Reporting

Fathom HQ — SaaS, per-company tiered subscription — Qty: N/A

$39/month (Starter, 1 company) to $280/month (Gold, 100 companies). Typical MSP deployment: Bronze at $145/month for 10 companies. Resale at $249–$449/month.

Portfolio-wide financial KPI monitoring and reporting across all client entities. Provides trend analysis, benchmarking, and automated report generation. Complements Float by adding ratio analysis, profitability alerts, and advisory-grade visualizations.

n8n Workflow Automation (Self-Hosted)

n8n GmbH — open source (fair-code, free self-hosted) or Cloud SaaS from $20/month — Qty: N/A

$0/month self-hosted (recommended) or $50/month cloud Pro plan. Resale bundled into managed service at $75–$150/month.

Core AI agent orchestration platform. Hosts all workflow automations including bank feed polling, transaction analysis pipelines, LLM calls, alert routing, and practice management integration. Self-hosted on the Azure VM for full control and compliance.

OpenAI API — GPT-5.4 mini

OpenAI — usage-based API — Qty: N/A

$0.15/million input tokens, $0.60/million output tokens. Estimated $20–$40/month for 50-client workload. Resale absorbed into flat monthly fee.

Primary LLM for high-volume transaction classification, categorization validation, and routine cash flow pattern analysis. Cost-effective for processing hundreds of transactions daily.
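At these rates, a back-of-envelope monthly spend check is easy to script. The transaction and per-call token volumes below are hypothetical assumptions for sizing, not measured figures:

```shell
# Hypothetical workload: 50 clients averaging 20 transactions/day (1,000/day total),
# ~1,500 input and ~200 output tokens per classification call, over a 30-day month.
TXNS_PER_DAY=1000; IN_TOK=1500; OUT_TOK=200; DAYS=30
COST=$(awk -v t="$TXNS_PER_DAY" -v i="$IN_TOK" -v o="$OUT_TOK" -v d="$DAYS" \
  'BEGIN { ic = t*i*d/1e6*0.15; oc = t*o*d/1e6*0.60;
           printf "input: $%.2f  output: $%.2f  total: $%.2f/month", ic, oc, ic+oc }')
echo "$COST"
```

Doubling the per-call token budget roughly doubles the bill, which is why only flagged transactions should be routed to the larger model.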

OpenAI API — GPT-5.4

OpenAI — usage-based API — Qty: N/A

$2.50/million input tokens, $10.00/million output tokens. Estimated $15–$60/month for anomaly analysis on flagged transactions. Resale absorbed into flat monthly fee.

Advanced LLM for anomaly investigation, natural-language alert generation, and cash flow narrative summaries. Used selectively on transactions flagged by GPT-5.4 mini or threshold breaches, keeping costs controlled.

QuickBooks Online API (Intuit Developer)

Intuit — free API access with a registered developer account — Qty: N/A

$0/month for API access. Client must have QBO Plus ($75/month) or Advanced ($275/month) subscription.

Primary data source for bank feed transactions, invoices, bills, and account balances for QBO clients. Provides OAuth 2.0 authentication and webhook notifications for real-time transaction events.

Xero API (Xero Developer)

Xero — tiered API access, Starter (free) to Advanced ($895/month) — Qty: N/A

$0/month for Starter tier (up to 25 connections). Growing/Established plan required for client ($47–$80/month). API tier pricing from $0–$895/month based on connections and data volume.

Primary data source for bank feed transactions, invoices, bills, and account balances for Xero clients. Provides OAuth 2.0 authentication and webhook notifications for bank transaction events.

Zapier (Phase 1 Alert Routing)

Zapier — SaaS, per-seat subscription — Qty: N/A

$29.99/month (Professional plan). Resale at $49–$79/month.

Phase 1 alert routing: connects Float threshold alerts to Slack, Microsoft Teams, email, and practice management tools (Karbon, TaxDome). Replaced by n8n workflows in Phase 2 but retained as a fallback routing layer.

PostgreSQL 15 (Managed Database)

Microsoft Azure / self-hosted — open source (free); Azure Database for PostgreSQL Flexible Server from $25/month — Qty: N/A

$25/month for Azure managed instance (Burstable B1ms) or $0 self-hosted on the VM. Resale bundled into managed service.

Stores transaction history cache, alert logs, agent decision audit trails, cash flow snapshots, and client configuration. Critical for compliance audit trail requirements under IRS Publication 4557.

Slack (Alert Delivery)

Salesforce / Slack — SaaS; client typically already has a workspace — Qty: N/A

$0 (free tier sufficient for alerts) or $8.75/user/month for Pro. No incremental MSP cost.

Primary alert delivery channel for real-time cash flow notifications. Supports rich message formatting with embedded charts, action buttons, and threaded conversations for alert triage.

UptimeRobot Pro (Monitoring)

UptimeRobot — SaaS subscription — Qty: N/A

$7/month (Pro plan, 50 monitors). Resale bundled into managed service.

Monitors health of n8n instance, webhook endpoints, API connectivity to QBO/Xero, and database availability. Sends MSP alerts if any component of the monitoring pipeline goes down.

Bank Feed Aggregation Service (Optional)

$0.50–$2.00 per successful bank link; volume discounts available. Only needed if connecting banks outside QBO/Xero native feeds.

Optional bank feed aggregation for clients not using QBO or Xero native bank feeds, or for direct bank connections that bypass accounting platform limitations. Most implementations will not need this.

Prerequisites

  • Client accounting firm must have active QuickBooks Online Plus/Advanced or Xero Growing/Established subscriptions for all monitored client entities, with bank feeds already connected and reconciling
  • Client must provide a designated 'Qualified Individual' per FTC Safeguards Rule who will be the compliance point of contact and approve AI system access to financial data
  • Client must have a Written Information Security Plan (WISP) per IRS Publication 4557 — if not, MSP should scope a WISP creation engagement as a prerequisite project
  • All accounting staff who will receive and act on alerts must have email addresses and access to the chosen notification platform (Slack, Microsoft Teams, or email)
  • Client must provide OAuth 2.0 authorization for each QBO or Xero organization to be monitored — this requires a user with Admin or Accountant role in each entity
  • MSP must have an Intuit Developer account (free registration at developer.intuit.com) and/or Xero Developer account (free registration at developer.xero.com) with production app credentials
  • MSP must have an OpenAI Platform account with billing enabled and API keys generated (platform.openai.com)
  • Azure subscription (or equivalent cloud provider account) with permissions to create VMs, networking resources, and managed databases
  • Domain name or subdomain available for the n8n instance (e.g., n8n.mspdomain.com) with ability to create DNS A records and obtain SSL certificates
  • Client must define initial cash flow alert thresholds for each monitored entity: minimum cash balance, large transaction threshold (e.g., >$5,000), and expected revenue ranges
  • Network firewall must allow outbound HTTPS (port 443) to: api.intuit.com, api.xero.com, api.openai.com, floatapp.com, hooks.slack.com, and the n8n webhook domain
  • MFA must be enabled on all accounts that access financial data: cloud hosting, QBO/Xero admin, OpenAI, n8n admin, and database connections
  • Client must provide a list of all entities to be monitored with: entity name, accounting platform (QBO or Xero), primary contact, and industry vertical for alert tuning
  • Backup infrastructure must be in place: either the Synology NAS (on-prem) or Azure Blob Storage for nightly database and configuration backups

Installation Steps

Step 1: Provision Azure Infrastructure

Create the production and staging virtual machines, networking resources, and managed PostgreSQL database in Azure. Use East US 2 or whichever region is closest to the client. Configure a Virtual Network with a Network Security Group that allows only inbound SSH (port 22, from the MSP IP range) and HTTPS (port 443, from anywhere). Attach a static public IP to the production VM.

az group create --name rg-cashflow-agent --location eastus2
az network vnet create --resource-group rg-cashflow-agent --name vnet-cashflow --address-prefix 10.0.0.0/16 --subnet-name snet-agent --subnet-prefix 10.0.1.0/24
az network nsg create --resource-group rg-cashflow-agent --name nsg-agent
az network nsg rule create --resource-group rg-cashflow-agent --nsg-name nsg-agent --name AllowSSH --priority 100 --direction Inbound --access Allow --protocol Tcp --source-address-prefixes <MSP_IP_RANGE> --destination-port-ranges 22
az network nsg rule create --resource-group rg-cashflow-agent --nsg-name nsg-agent --name AllowHTTPS --priority 110 --direction Inbound --access Allow --protocol Tcp --source-address-prefixes '*' --destination-port-ranges 443
az network public-ip create --resource-group rg-cashflow-agent --name pip-agent-prod --sku Standard --allocation-method Static
az vm create --resource-group rg-cashflow-agent --name vm-agent-prod --image Canonical:ubuntu-24_04-lts:server:latest --size Standard_B2ms --admin-username azureadmin --ssh-key-value ~/.ssh/id_rsa.pub --vnet-name vnet-cashflow --subnet snet-agent --nsg nsg-agent --public-ip-address pip-agent-prod --os-disk-size-gb 64
az vm create --resource-group rg-cashflow-agent --name vm-agent-staging --image Canonical:ubuntu-24_04-lts:server:latest --size Standard_B1ms --admin-username azureadmin --ssh-key-value ~/.ssh/id_rsa.pub --vnet-name vnet-cashflow --subnet snet-agent --nsg nsg-agent --os-disk-size-gb 32
az postgres flexible-server create --resource-group rg-cashflow-agent --name psql-cashflow-agent --location eastus2 --admin-user pgadmin --admin-password '<STRONG_PASSWORD>' --sku-name Standard_B1ms --tier Burstable --storage-size 32 --version 15
az postgres flexible-server firewall-rule create --resource-group rg-cashflow-agent --name psql-cashflow-agent --rule-name AllowAgentVM --start-ip-address <VM_PRIVATE_IP> --end-ip-address <VM_PRIVATE_IP>
Note

Replace <MSP_IP_RANGE> with your MSP's office IP CIDR. Replace <STRONG_PASSWORD> with a 20+ character password stored in your password manager. The staging VM can be deallocated when not in use to save costs. Total monthly cost for this infrastructure: approximately $105–$130/month.

Step 2: Configure Production Server Base Environment

SSH into the production VM and install Docker, Docker Compose, and essential system packages. Configure automatic security updates, UFW firewall, and fail2ban for SSH protection. Set the timezone and configure log rotation.

ssh azureadmin@<PROD_VM_PUBLIC_IP>
sudo apt update && sudo apt upgrade -y
sudo apt install -y ca-certificates curl gnupg lsb-release ufw fail2ban unattended-upgrades
sudo install -m 0755 -d /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt update && sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
sudo usermod -aG docker azureadmin
sudo systemctl enable docker
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow from <MSP_IP_RANGE> to any port 22 proto tcp
sudo ufw allow 443/tcp
sudo ufw enable
sudo timedatectl set-timezone America/New_York
sudo dpkg-reconfigure -plow unattended-upgrades
Note

Log out and back in after adding the user to the docker group. Verify Docker works with 'docker run hello-world'. Replace timezone with the client's local timezone. UFW rules mirror the Azure NSG for defense-in-depth.

Step 3: Deploy n8n with Docker Compose

Create the n8n deployment directory, Docker Compose configuration, and Nginx reverse proxy with Let's Encrypt SSL. n8n will be the central orchestration platform for all AI agent workflows. Configure it to use the Azure PostgreSQL database for workflow storage and enable webhook functionality.

mkdir -p /opt/cashflow-agent/n8n && cd /opt/cashflow-agent
cat > docker-compose.yml << 'EOF'
version: '3.8'
services:
  n8n:
    image: n8nio/n8n:latest
    container_name: n8n
    restart: unless-stopped
    ports:
      - '5678:5678'
    environment:
      - N8N_HOST=n8n.yourdomain.com
      - N8N_PORT=5678
      - N8N_PROTOCOL=https
      - WEBHOOK_URL=https://n8n.yourdomain.com/
      - N8N_ENCRYPTION_KEY=${N8N_ENCRYPTION_KEY}
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=${POSTGRES_HOST}
      - DB_POSTGRESDB_PORT=5432
      - DB_POSTGRESDB_DATABASE=n8n
      - DB_POSTGRESDB_USER=${POSTGRES_USER}
      - DB_POSTGRESDB_PASSWORD=${POSTGRES_PASSWORD}
      - DB_POSTGRESDB_SSL_REJECT_UNAUTHORIZED=false
      - N8N_BASIC_AUTH_ACTIVE=true
      - N8N_BASIC_AUTH_USER=${N8N_ADMIN_USER}
      - N8N_BASIC_AUTH_PASSWORD=${N8N_ADMIN_PASSWORD}
      - EXECUTIONS_DATA_PRUNE=true
      - EXECUTIONS_DATA_MAX_AGE=168
      - GENERIC_TIMEZONE=America/New_York
    volumes:
      - n8n_data:/home/node/.n8n
  
  nginx:
    image: nginx:alpine
    container_name: nginx-proxy
    restart: unless-stopped
    ports:
      - '443:443'
      - '80:80'
    volumes:
      - ./nginx/conf.d:/etc/nginx/conf.d
      - ./nginx/certbot/conf:/etc/letsencrypt
      - ./nginx/certbot/www:/var/www/certbot
    depends_on:
      - n8n

  certbot:
    image: certbot/certbot
    container_name: certbot
    volumes:
      - ./nginx/certbot/conf:/etc/letsencrypt
      - ./nginx/certbot/www:/var/www/certbot
    entrypoint: "/bin/sh -c 'trap exit TERM; while :; do certbot renew; sleep 12h & wait $${!}; done;'"

volumes:
  n8n_data:
EOF
cat > .env << 'EOF'
N8N_ENCRYPTION_KEY=<GENERATE_WITH_openssl_rand_-hex_32>
POSTGRES_HOST=psql-cashflow-agent.postgres.database.azure.com
POSTGRES_USER=pgadmin
POSTGRES_PASSWORD=<YOUR_POSTGRES_PASSWORD>
N8N_ADMIN_USER=admin
N8N_ADMIN_PASSWORD=<STRONG_ADMIN_PASSWORD>
EOF
chmod 600 .env
mkdir -p nginx/conf.d nginx/certbot/conf nginx/certbot/www
cat > nginx/conf.d/n8n.conf << 'EOF'
server {
    listen 80;
    server_name n8n.yourdomain.com;
    location /.well-known/acme-challenge/ {
        root /var/www/certbot;
    }
    location / {
        return 301 https://$host$request_uri;
    }
}
server {
    listen 443 ssl;
    server_name n8n.yourdomain.com;
    ssl_certificate /etc/letsencrypt/live/n8n.yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/n8n.yourdomain.com/privkey.pem;
    location / {
        proxy_pass http://n8n:5678;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        chunked_transfer_encoding off;
        proxy_buffering off;
        proxy_cache off;
    }
}
EOF
# First start nginx without SSL to get certificate
# Temporarily modify nginx conf to serve only port 80
docker compose up -d nginx
docker compose run --rm certbot certonly --webroot -w /var/www/certbot -d n8n.yourdomain.com --email admin@yourdomain.com --agree-tos --no-eff-email
# Restore full nginx conf with SSL and start all services
docker compose up -d
Note

Generate the N8N_ENCRYPTION_KEY with: openssl rand -hex 32. Replace 'n8n.yourdomain.com' with your actual subdomain. Create a DNS A record pointing to the VM's public IP before running certbot. The PostgreSQL database 'n8n' must be created first: connect to psql and run CREATE DATABASE n8n. Certbot will auto-renew via the sidecar container.

Step 4: Create PostgreSQL Database Schema for Agent Data

Connect to the Azure PostgreSQL instance and create the application database, tables for transaction caching, alert history, client configuration, and audit logging. These tables support the AI agent's data pipeline and provide the compliance audit trail required by IRS Publication 4557.

psql "host=psql-cashflow-agent.postgres.database.azure.com port=5432 dbname=postgres user=pgadmin sslmode=require"
CREATE DATABASE cashflow_agent;
\c cashflow_agent
CREATE TABLE clients (
  id SERIAL PRIMARY KEY,
  name VARCHAR(255) NOT NULL,
  platform VARCHAR(20) NOT NULL CHECK (platform IN ('qbo', 'xero')),
  platform_entity_id VARCHAR(255) NOT NULL,
  oauth_access_token TEXT,
  oauth_refresh_token TEXT,
  oauth_token_expires_at TIMESTAMPTZ,
  min_cash_threshold NUMERIC(12,2) DEFAULT 5000.00,
  large_txn_threshold NUMERIC(12,2) DEFAULT 5000.00,
  alert_channel VARCHAR(50) DEFAULT 'slack',
  alert_destination TEXT,
  is_active BOOLEAN DEFAULT TRUE,
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE TABLE transactions (
  id SERIAL PRIMARY KEY,
  client_id INTEGER REFERENCES clients(id),
  platform_txn_id VARCHAR(255) NOT NULL,
  txn_date DATE NOT NULL,
  amount NUMERIC(12,2) NOT NULL,
  description TEXT,
  category VARCHAR(255),
  ai_category VARCHAR(255),
  ai_confidence NUMERIC(4,3),
  is_anomaly BOOLEAN DEFAULT FALSE,
  anomaly_reason TEXT,
  raw_json JSONB,
  processed_at TIMESTAMPTZ DEFAULT NOW(),
  UNIQUE(client_id, platform_txn_id)
);
CREATE TABLE alerts (
  id SERIAL PRIMARY KEY,
  client_id INTEGER REFERENCES clients(id),
  alert_type VARCHAR(50) NOT NULL,
  severity VARCHAR(20) NOT NULL CHECK (severity IN ('info', 'warning', 'critical')),
  title VARCHAR(500) NOT NULL,
  body TEXT NOT NULL,
  ai_generated BOOLEAN DEFAULT TRUE,
  delivered BOOLEAN DEFAULT FALSE,
  delivered_at TIMESTAMPTZ,
  acknowledged BOOLEAN DEFAULT FALSE,
  acknowledged_by VARCHAR(255),
  created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE TABLE audit_log (
  id SERIAL PRIMARY KEY,
  event_type VARCHAR(100) NOT NULL,
  client_id INTEGER REFERENCES clients(id),
  details JSONB,
  created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE TABLE cash_snapshots (
  id SERIAL PRIMARY KEY,
  client_id INTEGER REFERENCES clients(id),
  snapshot_date DATE NOT NULL,
  total_bank_balance NUMERIC(14,2),
  accounts_receivable NUMERIC(14,2),
  accounts_payable NUMERIC(14,2),
  net_cash_position NUMERIC(14,2),
  days_cash_on_hand INTEGER,
  burn_rate_daily NUMERIC(12,2),
  created_at TIMESTAMPTZ DEFAULT NOW(),
  UNIQUE(client_id, snapshot_date)
);
CREATE INDEX idx_transactions_client_date ON transactions(client_id, txn_date DESC);
CREATE INDEX idx_alerts_client_created ON alerts(client_id, created_at DESC);
CREATE INDEX idx_snapshots_client_date ON cash_snapshots(client_id, snapshot_date DESC);
Note

Store OAuth tokens encrypted at the application level (n8n handles this via its credential store). The audit_log table is essential for compliance — log every AI decision, API call, and alert delivery. Retention policy: keep all data for 7 years per IRS requirements. Set up Azure automated backups for the PostgreSQL instance with 30-day retention.
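The days_cash_on_hand and burn_rate_daily columns in cash_snapshots are simple derived metrics. With assumed example figures, the calculation the snapshot generator performs looks like:

```shell
# Assumed example figures: $50,000 across bank accounts, trailing-30-day
# outflows of $36,000 (i.e., a $1,200/day burn rate).
BALANCE=50000
OUTFLOWS_30D=36000
BURN=$(awk -v o="$OUTFLOWS_30D" 'BEGIN { printf "%.2f", o/30 }')
DAYS=$(awk -v b="$BALANCE" -v r="$BURN" 'BEGIN { printf "%d", b/r }')   # truncates to whole days
echo "burn_rate_daily=$BURN days_cash_on_hand=$DAYS"
```

Truncating to whole days is a deliberately conservative choice: a 41.6-day runway is reported as 41 days.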

Step 5: Register and Configure QuickBooks Online API Application

Register a new application in the Intuit Developer portal to obtain OAuth 2.0 credentials. Configure the app with the correct scopes and redirect URIs for n8n integration. This enables programmatic access to bank feed transactions, account balances, invoices, and bills.

1. Go to https://developer.intuit.com and sign in.
2. Click 'Dashboard' > 'Create an app'.
3. Select 'QuickBooks Online and Payments'.
4. Name the app: 'CashFlow Monitor Agent'.
5. Note down the Client ID and Client Secret.
6. Under 'Keys & credentials' > 'Production', set the Redirect URI to: https://n8n.yourdomain.com/rest/oauth2-credential/callback
7. Request the OAuth scope com.intuit.quickbooks.accounting (read access to all accounting data).
8. Under 'Webhooks', enable notifications for: Account (Create, Update), BankTransaction (Create), Invoice (Create, Update, Delete), Bill (Create, Update, Delete).
9. Set the webhook endpoint to: https://n8n.yourdomain.com/webhook/qbo-events
10. Note the Verifier Token for webhook validation.
Note

The QBO API has a rate limit of 500 requests per minute per realm (company). For 50 clients polled every 15 minutes, this is well within limits. OAuth tokens expire every 60 minutes and must be refreshed using the refresh token (valid for 100 days). Implement token refresh logic in the n8n workflow. The sandbox environment is available for testing before production deployment.
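Because QBO access tokens expire hourly, the 07-oauth-token-refresher workflow needs a staleness check before each API call. A minimal sketch of that decision, with both timestamps hardcoded for illustration (the live workflow would take NOW from `date -u` and the expiry from the clients table):

```shell
# oauth_token_expires_at as stored in the clients table (hardcoded for illustration).
EXPIRES_AT='2025-06-01T12:00:00Z'
NOW='2025-06-01T11:55:00Z'
exp=$(date -ud "$EXPIRES_AT" +%s)
now=$(date -ud "$NOW" +%s)
# Refresh when fewer than 10 minutes of validity remain, leaving headroom
# for the polling cycle to complete before expiry.
if [ $((exp - now)) -lt 600 ]; then DECISION=refresh; else DECISION=ok; fi
echo "$DECISION"
```

On a "refresh" decision the workflow exchanges the stored refresh token for a new access/refresh token pair and writes both back to the clients table.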

Step 6: Register and Configure Xero API Application

Register a new application in the Xero Developer portal for OAuth 2.0 access. Configure scopes for bank transactions, accounts, invoices, and contacts. Set up webhook subscriptions for real-time bank feed notifications.

1. Go to https://developer.xero.com/app/manage and sign in.
2. Click 'New app'.
3. App name: 'CashFlow Monitor Agent'.
4. Company URL: https://yourdomain.com
5. Redirect URI: https://n8n.yourdomain.com/rest/oauth2-credential/callback
6. Note down the Client ID and Client Secret.
7. Request the OAuth scopes: openid profile email, accounting.transactions.read, accounting.contacts.read, accounting.settings.read, accounting.reports.read.
8. Enable webhooks in the Xero developer dashboard.
9. Webhook URL: https://n8n.yourdomain.com/webhook/xero-events
10. Note the Webhook Key for HMAC signature validation.
Note

Xero API free Starter tier allows up to 25 connections — sufficient for most initial deployments. If monitoring more than 25 entities, upgrade to the Core tier. Important: Xero has clarified that API data may NOT be used to train AI/ML models. Our implementation only uses API data for real-time analysis via the OpenAI API (which does not train on API data), so this is compliant. Document this compliance posture in the WISP.
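The Webhook Key is used to verify each incoming payload: Xero sends a base64-encoded HMAC-SHA256 of the raw request body in the x-xero-signature header, and the receiver must recompute and compare it before trusting the payload. A sketch of the mechanism with an assumed key and payload (in the real flow, RECEIVED_SIG comes from the request header rather than being computed locally):

```shell
WEBHOOK_KEY='demo-signing-key'          # assumed value for illustration
PAYLOAD='{"events":[],"firstEventSequence":0,"lastEventSequence":0}'
# What Xero would place in the x-xero-signature header for this body:
RECEIVED_SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha256 -hmac "$WEBHOOK_KEY" -binary | base64)
# The webhook receiver recomputes the signature over the raw body and compares:
COMPUTED_SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha256 -hmac "$WEBHOOK_KEY" -binary | base64)
if [ "$RECEIVED_SIG" = "$COMPUTED_SIG" ]; then RESULT=valid; else RESULT=reject; fi
echo "$RESULT"
```

The 08-webhook-receiver workflow should respond 200 only when the signatures match and 401 otherwise; Xero's intent-to-receive check exercises exactly this behavior before activating the subscription.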

Step 7: Configure n8n Credentials and Core Settings

Log into the n8n web interface and configure all required credentials: QBO OAuth2, Xero OAuth2, OpenAI API key, PostgreSQL database connection, and Slack/Teams webhook. These credentials are encrypted by n8n using the N8N_ENCRYPTION_KEY.

1. Navigate to https://n8n.yourdomain.com and log in with the admin credentials.
2. Go to Settings > Credentials.
3. Add an 'OAuth2 API' credential for QuickBooks Online:
   - Client ID: <QBO_CLIENT_ID>
   - Client Secret: <QBO_CLIENT_SECRET>
   - Authorization URL: https://appcenter.intuit.com/connect/oauth2
   - Token URL: https://oauth.platform.intuit.com/oauth2/v1/tokens/bearer
   - Scope: com.intuit.quickbooks.accounting
   - Authentication: Header
4. Add an 'OAuth2 API' credential for Xero:
   - Client ID: <XERO_CLIENT_ID>
   - Client Secret: <XERO_CLIENT_SECRET>
   - Authorization URL: https://login.xero.com/identity/connect/authorize
   - Token URL: https://identity.xero.com/connect/token
   - Scope: openid profile email accounting.transactions.read accounting.settings.read accounting.reports.read
5. Add an 'OpenAI' credential with API Key <OPENAI_API_KEY>.
6. Add a 'Postgres' credential:
   - Host: psql-cashflow-agent.postgres.database.azure.com
   - Database: cashflow_agent
   - User: pgadmin
   - Password: <POSTGRES_PASSWORD>
   - SSL: true
7. Add a 'Slack API' (Bot Token) credential with Bot Token xoxb-<SLACK_BOT_TOKEN>.
8. Test each credential using n8n's built-in test function.
Note

For each QBO/Xero client entity, you will need to complete the OAuth consent flow individually. n8n stores credentials encrypted at rest using AES-256 with the N8N_ENCRYPTION_KEY. Never commit the .env file to version control. For Microsoft Teams, use an Incoming Webhook URL instead of Slack.

Step 8: Deploy Float and Connect Client Entities (Phase 1)

Set up Float as the baseline cash flow forecasting and threshold alerting tool. Connect each client's QBO or Xero entity to Float, configure cash threshold alerts, and verify initial data sync. This provides immediate value while the custom AI agent is being developed.

1. Sign up for Float at https://floatapp.com
2. Choose the Essential ($59/mo) or Premium ($99/mo) plan.
3. For each client entity:
   a. Click 'Add organization' in Float.
   b. Select QuickBooks Online or Xero.
   c. Complete the OAuth consent flow with the client's admin credentials.
   d. Wait for the initial sync (typically 5–15 minutes).
   e. Navigate to Settings > Alerts.
   f. Set the 'Cash balance drops below' threshold per client. Recommended starting threshold: 2x monthly operating expenses (e.g., $10,000 for a small business, $50,000 for a mid-size one).
   g. Set 'Cash balance drops below threshold within X days' for 7, 14, and 30 days.
   h. Enable email alerts to the assigned bookkeeper's email.
4. Configure the Float webhook (if available) or use Zapier:
   a. In Zapier, create a Zap: Float Alert Email → Slack Channel Post.
   b. Parse the email subject and body, and format them as a Slack block message.
   c. Route to the #cashflow-alerts channel.
Note

Float syncs data daily by default. For near-real-time monitoring, the n8n agent pipeline (Phase 2) will provide more frequent updates. Float's value is as a purpose-built cash forecasting visualization that the accounting team can show clients during advisory meetings. Configure a separate Float scenario for each client to enable per-client threshold customization.
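The "2x monthly operating expenses" starting threshold is easy to turn into a repeatable onboarding calculation (the expense figure below is hypothetical):

```shell
# Hypothetical entity averaging $24,000/month in operating expenses.
MONTHLY_OPEX=24000
THRESHOLD=$(( MONTHLY_OPEX * 2 ))
echo "starting cash alert threshold: \$$THRESHOLD"
```

Record the resulting figure in both Float and the clients table's min_cash_threshold column so the Phase 1 and Phase 2 alerting layers agree.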

Step 9: Deploy Fathom for Portfolio Monitoring

Set up Fathom for portfolio-wide financial KPI tracking across all client entities. This complements Float by providing ratio analysis, trend detection, and automated reporting. Configure KPI alerts for cash-related metrics across the portfolio.

1. Sign up for Fathom at https://www.fathomhq.com
2. Select the Bronze plan ($145/mo for 10 companies) or the appropriate tier.
3. For each client entity:
   a. Click 'Add company' and select QBO or Xero.
   b. Complete the OAuth consent flow.
   c. Wait for the initial data import and analysis.
   d. Navigate to the KPIs tab.
   e. Configure alerts for: Current Ratio drops below 1.0; Cash Conversion Cycle exceeds 60 days; revenue declines more than 15% month-over-month; Accounts Receivable aging past 60 days increases.
   f. Set alert recipients: primary bookkeeper plus engagement partner.
4. Create a Portfolio Dashboard:
   a. Go to Portfolio > Create Dashboard.
   b. Add widgets: cash position, revenue trend, AR aging, expenses vs. budget.
   c. Schedule monthly PDF report auto-delivery to practice management.
5. Configure the Fathom → Zapier integration for alert forwarding if needed.
Note

Fathom refreshes data when the underlying QBO/Xero data updates. For firms managing more than 10 clients, use the Silver ($190/mo) or Gold ($280/mo) tiers. Fathom excels at advisory-grade reporting — use it to generate the monthly client reports that justify the advisory fee. This is a revenue-generating tool for the accounting firm, not just a monitoring tool.

Step 10: Import and Activate n8n Agent Workflows

Import the custom AI agent workflows into n8n. These workflows implement the core autonomous monitoring pipeline: bank feed polling, transaction analysis, anomaly detection, cash flow snapshot generation, and alert delivery. Each workflow is described in detail in the Custom AI Components section of this guide.

1. In n8n, go to Workflows > Import from File.
2. Import each workflow JSON file in order:
   a. 01-bank-feed-poller.json (scheduled every 15 minutes)
   b. 02-transaction-analyzer.json (triggered by the poller)
   c. 03-anomaly-detector.json (triggered by the analyzer)
   d. 04-cash-snapshot-generator.json (scheduled daily at 6 AM)
   e. 05-alert-dispatcher.json (triggered by the anomaly detector and snapshot generator)
   f. 06-daily-digest.json (scheduled daily at 8 AM)
   g. 07-oauth-token-refresher.json (scheduled every 45 minutes)
   h. 08-webhook-receiver.json (passive; receives QBO/Xero webhooks)
3. For each workflow:
   a. Open the workflow.
   b. Update credential references to match your n8n credential names.
   c. Update any client-specific configuration nodes.
   d. Activate the workflow (toggle from inactive to active).
4. Verify all workflows show 'Active' status.
5. Check the Executions tab after 15 minutes to confirm the poller is running.
Note

Start by activating workflows for a single test client entity to verify the pipeline end-to-end before rolling out to all clients. The webhook receiver workflow must be activated first so that the QBO/Xero webhook endpoints are registered and responsive. Monitor the Executions tab closely for the first 24 hours to catch any authentication or API errors.

Step 11: Configure Alert Channels and Test End-to-End

Set up the Slack workspace (or Teams/email) alert channels, configure the alert dispatcher workflow with channel mappings, and run a full end-to-end test by manually inserting a test transaction that triggers an alert.

1. In Slack, create three channels:
   - #cashflow-alerts-critical (threshold breaches and large anomalies)
   - #cashflow-alerts-info (daily digests and informational alerts)
   - #cashflow-agent-errors (private, MSP-only, for system errors)
2. Install the Slack bot app created in Step 7 to all three channels.
3. In n8n, open the 05-alert-dispatcher workflow and map:
   a. Alert severity 'critical' → #cashflow-alerts-critical
   b. Alert severity 'warning' → #cashflow-alerts-critical
   c. Alert severity 'info' → #cashflow-alerts-info
   d. System errors → #cashflow-agent-errors
4. Test the pipeline:
   a. In the QBO sandbox, create a manual bank transaction for $25,000 (above the typical large_txn_threshold).
   b. Wait for the 15-minute polling cycle, or manually trigger the poller.
   c. Verify the transaction appears in the PostgreSQL transactions table.
   d. Verify anomaly detection flags the transaction.
   e. Verify the alert appears in #cashflow-alerts-critical with a formatted message.
   f. Verify an audit_log entry is created.
5. Test the daily digest:
   a. Manually trigger the 06-daily-digest workflow.
   b. Verify the summary message appears in #cashflow-alerts-info.
6. Test error handling:
   a. Temporarily invalidate an OAuth token.
   b. Verify an error alert appears in #cashflow-agent-errors.
Note

For Microsoft Teams deployments, replace Slack channels with Teams channels and use Incoming Webhook URLs. For email-only deployments, configure SMTP in n8n (Settings > SMTP) and update the alert dispatcher to use the Email Send node instead of Slack. Document all channel mappings in the client handoff documentation.
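The dispatcher's Slack messages can follow a simple Block Kit shape. The payload below is a hand-rolled sketch with illustrative field values (in production, n8n's Slack node builds and sends this for you):

```shell
# Illustrative alert fields mirroring the alerts table's title/body columns.
TITLE='Cash below threshold: Acme LLC'
BODY='Balance fell to $4,210, under the $5,000 minimum.'
PAYLOAD=$(printf '{"channel":"#cashflow-alerts-critical","blocks":[{"type":"header","text":{"type":"plain_text","text":"%s"}},{"type":"section","text":{"type":"mrkdwn","text":"%s"}}]}' "$TITLE" "$BODY")
echo "$PAYLOAD"
```

This JSON would be POSTed to Slack's chat.postMessage API with the bot token from Step 7; a header block plus mrkdwn section is enough for triage, with action buttons added later if the team wants in-channel acknowledgment.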

Step 12: Configure Backup and Disaster Recovery

Set up automated backups for the PostgreSQL database, n8n workflow configurations, and Docker volumes. Configure backup verification and offsite storage to the Synology NAS or Azure Blob Storage.

bash
# Create backup script
cat > /opt/cashflow-agent/backup.sh << 'SCRIPT'
#!/bin/bash
set -euo pipefail
BACKUP_DIR=/opt/cashflow-agent/backups/$(date +%Y%m%d)
mkdir -p $BACKUP_DIR

# Database backup
PGPASSWORD='<POSTGRES_PASSWORD>' pg_dump \
  -h psql-cashflow-agent.postgres.database.azure.com \
  -U pgadmin \
  -d cashflow_agent \
  -F c \
  -f $BACKUP_DIR/cashflow_agent_$(date +%H%M).dump

# n8n workflow export (write inside the container, then copy to the host backup dir —
# --output alone would write inside the container's filesystem, not $BACKUP_DIR)
docker exec n8n n8n export:workflow --all --output=/tmp/workflows.json 2>/dev/null || true
docker cp n8n:/tmp/workflows.json $BACKUP_DIR/workflows.json 2>/dev/null || true

# Docker volume backup
docker run --rm -v cashflow-agent_n8n_data:/data -v $BACKUP_DIR:/backup alpine \
  tar czf /backup/n8n_data.tar.gz /data

# Configuration backup
cp /opt/cashflow-agent/docker-compose.yml $BACKUP_DIR/
cp /opt/cashflow-agent/.env $BACKUP_DIR/

# Encrypt backups
tar czf $BACKUP_DIR.tar.gz $BACKUP_DIR
gpg --batch --pinentry-mode loopback --symmetric --cipher-algo AES256 --passphrase-file /opt/cashflow-agent/.backup-key $BACKUP_DIR.tar.gz
rm -rf $BACKUP_DIR $BACKUP_DIR.tar.gz

# Upload to Azure Blob (or rsync to NAS)
az storage blob upload \
  --account-name stcashflowbackup \
  --container-name backups \
  --name $(date +%Y%m%d).tar.gz.gpg \
  --file $BACKUP_DIR.tar.gz.gpg \
  --auth-mode login 2>/dev/null || true

# Cleanup: keep last 30 days locally
find /opt/cashflow-agent/backups/ -name '*.gpg' -mtime +30 -delete

echo "Backup completed: $(date)"
SCRIPT
chmod +x /opt/cashflow-agent/backup.sh
# Generate backup encryption key
openssl rand -hex 32 > /opt/cashflow-agent/.backup-key
chmod 600 /opt/cashflow-agent/.backup-key
# Schedule nightly backups at 2 AM (append to the existing crontab rather than overwriting it)
(crontab -l 2>/dev/null; echo '0 2 * * * /opt/cashflow-agent/backup.sh >> /var/log/cashflow-backup.log 2>&1') | crontab -
# For Synology NAS alternative, replace az storage blob upload with:
# rsync -avz --progress $BACKUP_DIR.tar.gz.gpg admin@<NAS_IP>:/volume1/backups/cashflow-agent/
Note

Store the backup encryption passphrase in your MSP password manager — losing it means losing access to all backups. Test backup restoration monthly by restoring to the staging VM. Azure Blob Storage with cool tier is approximately $0.01/GB/month. For Synology NAS, ensure the NAS is also backed up offsite (Synology Hyper Backup to a cloud provider).

Step 13: Deploy UptimeRobot Monitoring

Configure UptimeRobot to monitor all critical endpoints: n8n web interface, webhook receivers, database connectivity, and API health checks. Set up escalation alerts to MSP on-call personnel.

1. Sign up at https://uptimerobot.com (Pro plan $7/mo)
2. Create monitors:
   a. HTTPS monitor: n8n Web UI
      URL: https://n8n.yourdomain.com
      Interval: 5 minutes
      Alert contacts: MSP on-call email + SMS
   b. Keyword monitor: n8n Health Check
      URL: https://n8n.yourdomain.com/healthz
      Keyword: ok
      Interval: 5 minutes
   c. HTTPS monitor: QBO Webhook Endpoint
      URL: https://n8n.yourdomain.com/webhook/qbo-events
      Expected status: 200
      Interval: 5 minutes
   d. HTTPS monitor: Xero Webhook Endpoint
      URL: https://n8n.yourdomain.com/webhook/xero-events
      Expected status: 200
      Interval: 5 minutes
3. Create alert contacts:
   a. MSP NOC email: noc@msp.com
   b. MSP on-call SMS: +1XXXXXXXXXX
   c. Slack webhook: #cashflow-agent-errors channel
4. Set escalation: email immediately, SMS after 10 minutes down.
Note

Also configure an n8n internal workflow that runs every 30 minutes checking database connectivity and API token validity. This catches issues like expired OAuth tokens that UptimeRobot cannot detect. Consider PagerDuty or Opsgenie integration for MSPs with formal on-call rotations.

Step 14: Security Hardening and Compliance Documentation

Implement final security hardening measures, generate compliance documentation, and update the client's WISP to reflect the new AI monitoring system. This step ensures the deployment meets IRS Publication 4557 and FTC Safeguards Rule requirements.

1. Enable Azure disk encryption on all VMs:
   az vm encryption enable --resource-group rg-cashflow-agent --name vm-agent-prod --disk-encryption-keyvault <KEYVAULT_NAME>
2. Enable PostgreSQL SSL enforcement:
   az postgres flexible-server parameter set --resource-group rg-cashflow-agent --server-name psql-cashflow-agent --name require_secure_transport --value on
3. Enable Azure Activity Log and Diagnostic Settings:
   az monitor diagnostic-settings create --resource /subscriptions/<SUB_ID>/resourceGroups/rg-cashflow-agent --name diag-cashflow --logs '[{"category":"Administrative","enabled":true}]' --storage-account <STORAGE_ACCOUNT>
4. Create a WISP amendment document (template in notes).
5. Enable audit logging in n8n by adding to the n8n environment in docker-compose.yml:
   - N8N_LOG_LEVEL=info
   - N8N_LOG_OUTPUT=console,file
   - N8N_LOG_FILE_LOCATION=/home/node/.n8n/logs/n8n.log
6. Configure log forwarding to Azure Log Analytics (optional).
7. Run a CIS benchmark scan on the VM:
   sudo apt install -y lynis
   sudo lynis audit system
Note

WISP amendment should document: (1) Description of the AI monitoring system and data it accesses, (2) Encryption methods used (AES-256 at rest, TLS 1.3 in transit), (3) Access controls (OAuth 2.0, MFA, role-based), (4) Vendor security postures (OpenAI SOC 2, Azure SOC 2, Plaid SOC 2), (5) Data retention policy (7 years), (6) Incident response procedures specific to AI system failures, (7) Annual review schedule. Keep a signed copy in the client's compliance file.

Step 15: Roll Out to Full Client Portfolio

After successful testing with 1-3 pilot clients, systematically onboard the remaining client entities. Use a batch onboarding process to efficiently connect QBO/Xero entities, configure thresholds, and validate data flow for each client.

1. Create a client onboarding spreadsheet with columns:
   Entity Name | Platform | Entity ID | OAuth Status | Float Connected | Cash Threshold | Large Txn Threshold | Alert Channel | Go-Live Date
2. For each client entity, execute the onboarding workflow:
   a. Complete the OAuth consent flow in n8n (QBO or Xero credential)
   b. Insert the client record into the PostgreSQL clients table:
      INSERT INTO clients (name, platform, platform_entity_id, min_cash_threshold, large_txn_threshold, alert_channel, alert_destination) VALUES ('Client Name', 'qbo', '<REALM_ID>', 10000.00, 5000.00, 'slack', '#cashflow-alerts-critical');
   c. Connect the entity in Float (if using)
   d. Connect the entity in Fathom (if using)
   e. Manually trigger the bank feed poller for the new client
   f. Verify transactions are being pulled and analyzed
   g. Verify the first daily digest includes the new client
3. Batch onboarding rate: 5-10 clients per day.
4. After all clients are onboarded:
   a. Run a full portfolio daily digest
   b. Review alert volume; adjust thresholds if there are too many false positives
   c. Update the client configuration spreadsheet with final settings
Note

The OAuth consent flow requires a user with Admin or Accountant role in each QBO/Xero entity to approve the connection. Coordinate with the accounting firm's team to schedule consent sessions in batches. Some firms use a single master accountant login across entities — this simplifies onboarding but creates a single point of failure. Recommend individual service account credentials where possible.

Custom AI Components

Bank Feed Poller Workflow

Type: workflow

An n8n workflow that runs on a 15-minute schedule (configurable per client). For each active client entity, it queries the QBO or Xero API for new bank transactions since the last poll timestamp. New transactions are inserted into the PostgreSQL transactions table and passed to the Transaction Analyzer workflow via an n8n trigger. Handles OAuth token refresh, rate limiting, pagination, and error logging.

Implementation:

n8n Workflow Specification (JSON-importable):

**Trigger:** Schedule Trigger — Every 15 minutes

**Node 1: Get Active Clients**
Type: Postgres
Operation: Execute Query
Query: SELECT id, name, platform, platform_entity_id, oauth_access_token, oauth_refresh_token, oauth_token_expires_at, min_cash_threshold, large_txn_threshold FROM clients WHERE is_active = TRUE;

**Node 2: Split Into Batches**
Type: SplitInBatches
Batch Size: 5 (process 5 clients concurrently to respect API rate limits)

**Node 3: Switch on Platform**
Type: Switch
Condition: {{$json.platform}} equals 'qbo' → QBO Branch, equals 'xero' → Xero Branch

**Node 4a: QBO — Fetch Bank Transactions**
Type: HTTP Request
Method: GET
URL: https://quickbooks.api.intuit.com/v3/company/{{$json.platform_entity_id}}/query?query=SELECT * FROM Purchase WHERE MetaData.LastUpdatedTime > '{{$now.minus(16, 'minutes').toISO()}}' ORDERBY MetaData.LastUpdatedTime&minorversion=73
Headers:
  Authorization: Bearer {{$json.oauth_access_token}}
  Accept: application/json
Authentication: None (token in header)
Retry on Fail: true, Max Retries: 3, Wait Between: 5000ms
On Error: Continue (errors handled in error branch)

Also fetch BankDeposit: SELECT * FROM Deposit WHERE MetaData.LastUpdatedTime > '{{$now.minus(16, 'minutes').toISO()}}'

**Node 4b: Xero — Fetch Bank Transactions**
Type: HTTP Request
Method: GET
URL: https://api.xero.com/api.xro/2.0/BankTransactions?where=UpdatedDateUTC>=DateTime({{$now.minus(16, 'minutes').format('yyyy,MM,dd,HH,mm,ss')}})&page=1
Headers:
  Authorization: Bearer {{$json.oauth_access_token}}
  Xero-Tenant-Id: {{$json.platform_entity_id}}
  Accept: application/json

**Node 5: Merge & Normalize Transactions**
Type: Code (JavaScript)
Code:
const items = $input.all();
const normalized = [];
for (const item of items) {
  const data = item.json;
  if (data.QueryResponse && data.QueryResponse.Purchase) {
    for (const txn of data.QueryResponse.Purchase) {
      normalized.push({
        client_id: data._client_id,
        platform_txn_id: txn.Id,
        txn_date: txn.TxnDate,
        amount: -Math.abs(txn.TotalAmt),
        description: txn.Line?.[0]?.Description || txn.EntityRef?.name || 'No description',
        category: txn.Line?.[0]?.AccountBasedExpenseLineDetail?.AccountRef?.name || 'Uncategorized',
        raw_json: txn
      });
    }
  }
  if (data.QueryResponse && data.QueryResponse.Deposit) {
    for (const txn of data.QueryResponse.Deposit) {
      normalized.push({
        client_id: data._client_id,
        platform_txn_id: txn.Id,
        txn_date: txn.TxnDate,
        amount: Math.abs(txn.TotalAmt),
        description: txn.Line?.[0]?.Description || 'Deposit',
        category: txn.Line?.[0]?.DepositLineDetail?.AccountRef?.name || 'Income',
        raw_json: txn
      });
    }
  }
  if (data.BankTransactions) {
    for (const txn of data.BankTransactions) {
      normalized.push({
        client_id: data._client_id,
        platform_txn_id: txn.BankTransactionID,
        txn_date: txn.Date,
        amount: txn.Type === 'RECEIVE' ? Math.abs(txn.Total) : -Math.abs(txn.Total),
        description: txn.Contact?.Name || txn.LineItems?.[0]?.Description || 'No description',
        category: txn.LineItems?.[0]?.AccountCode || 'Uncategorized',
        raw_json: txn
      });
    }
  }
}
return normalized.map(n => ({ json: n }));

**Node 6: Upsert Transactions**
Type: Postgres
Operation: Execute Query
Query (per item): INSERT INTO transactions (client_id, platform_txn_id, txn_date, amount, description, category, raw_json) VALUES ($1, $2, $3, $4, $5, $6, $7) ON CONFLICT (client_id, platform_txn_id) DO UPDATE SET amount = EXCLUDED.amount, description = EXCLUDED.description, category = EXCLUDED.category, raw_json = EXCLUDED.raw_json, processed_at = NOW() RETURNING id, client_id, platform_txn_id, amount, description;

**Node 7: Trigger Transaction Analyzer**
Type: Execute Workflow
Workflow: 02-transaction-analyzer
Pass data: All new/updated transaction IDs

**Node 8: Audit Log**
Type: Postgres
Operation: Insert
Table: audit_log
Values: event_type='bank_feed_poll', client_id={{$json.client_id}}, details={"transactions_found": count, "timestamp": now}

**Error Branch:**
Node E1: Check if error is 401 (token expired) → Trigger token refresh workflow
Node E2: Log error to audit_log with full error details
Node E3: If 3+ consecutive failures for a client, send alert to #cashflow-agent-errors
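The error branch above can be sketched as a small state machine. In this sketch, `failureCounts` is an in-memory stand-in for per-client state that would live in the clients table in a production deployment:

```javascript
// Sketch of the E1–E3 error-branch logic for the bank feed poller.
// failureCounts stands in for durable per-client state.
const failureCounts = new Map();

function handlePollError(clientId, statusCode) {
  if (statusCode === 401) {
    // Expired access token: hand off to the token-refresh workflow (E1).
    return { action: 'refresh_token', clientId };
  }
  const failures = (failureCounts.get(clientId) || 0) + 1;
  failureCounts.set(clientId, failures);
  if (failures >= 3) {
    // Three consecutive failures for this client: page the MSP channel (E3).
    return { action: 'alert', channel: '#cashflow-agent-errors', clientId, failures };
  }
  return { action: 'log_only', clientId, failures }; // E2: audit_log entry only
}

function handlePollSuccess(clientId) {
  failureCounts.delete(clientId); // any success resets the failure streak
}
```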

Transaction Analyzer Agent

Type: agent

An AI agent that analyzes each new bank transaction using GPT-5.4 mini for classification and anomaly scoring. For each transaction, it determines: (1) whether the category assigned by QBO/Xero is correct, (2) whether the transaction amount is anomalous based on historical patterns for this client, (3) whether the transaction represents a potential cash flow concern (e.g., unexpected large payment, duplicate transaction, unusual vendor). Transactions scoring above the anomaly threshold are forwarded to the Anomaly Detector for deeper analysis with GPT-5.4.

Implementation:

n8n Workflow Specification:

**Trigger:** Called by Bank Feed Poller (Execute Workflow trigger)

**Node 1: Receive Transaction Batch**
Type: Execute Workflow Trigger
Receives: Array of transaction objects with IDs

**Node 2: Fetch Historical Context**
Type: Postgres
For each transaction's client_id, fetch:
SELECT AVG(ABS(amount)) as avg_txn_amount, STDDEV(ABS(amount)) as stddev_txn_amount, MAX(ABS(amount)) as max_txn_amount, COUNT(*) as total_txns, AVG(CASE WHEN amount < 0 THEN ABS(amount) END) as avg_expense, AVG(CASE WHEN amount > 0 THEN amount END) as avg_income FROM transactions WHERE client_id = $1 AND txn_date > CURRENT_DATE - INTERVAL '90 days';

Also fetch recent transactions for duplicate detection:
SELECT platform_txn_id, amount, description, txn_date FROM transactions WHERE client_id = $1 AND txn_date > CURRENT_DATE - INTERVAL '7 days' ORDER BY txn_date DESC LIMIT 50;

**Node 3: Build Analysis Prompt**
Type: Code (JavaScript)

const txn = $input.first().json;
const stats = $input.item(1).json;
const recentTxns = $input.item(2).json.rows;
const prompt = `You are a forensic bookkeeping AI agent analyzing bank transactions for anomalies and cash flow concerns.

Transaction to Analyze

  • Date: ${txn.txn_date}
  • Amount: $${txn.amount}
  • Description: ${txn.description}
  • Current Category: ${txn.category}

Client Historical Context (Last 90 Days)

  • Average transaction amount: $${stats.avg_txn_amount?.toFixed(2) || 'N/A'}
  • Std deviation: $${stats.stddev_txn_amount?.toFixed(2) || 'N/A'}
  • Max transaction: $${stats.max_txn_amount?.toFixed(2) || 'N/A'}
  • Total transactions: ${stats.total_txns || 0}
  • Average expense: $${stats.avg_expense?.toFixed(2) || 'N/A'}
  • Average income: $${stats.avg_income?.toFixed(2) || 'N/A'}

Recent Transactions (Last 7 Days)

${recentTxns.map(t => `${t.txn_date} | $${t.amount} | ${t.description}`).join('\n')}

Your Task

Analyze this transaction and respond with ONLY a JSON object (no markdown, no explanation):

json
{
  "suggested_category": "<correct category name>",
  "category_confidence": <0.0-1.0>,
  "is_anomaly": <true/false>,
  "anomaly_score": <0.0-1.0 where 1.0 is most anomalous>,
  "anomaly_reasons": ["<reason1>", "<reason2>"],
  "is_duplicate_suspect": <true/false>,
  "duplicate_of": "<platform_txn_id or null>",
  "cash_flow_impact": "<positive/negative/neutral>",
  "concern_level": "<none/low/medium/high/critical>",
  "brief_note": "<one-sentence plain-English summary for the bookkeeper>"
}

Anomaly detection rules:

1. Amount > 3 standard deviations from mean → anomaly_score >= 0.7
2. Possible duplicate (same amount ± $1, same week, similar description) → is_duplicate_suspect: true
3. Large round numbers ([varies],000) to new payees → concern_level: medium+
4. Weekend/holiday transactions for businesses with weekday-only patterns → flag
5. Negative amounts larger than 50% of average monthly income → concern_level: high`;

return [{ json: { prompt, txn, stats } }];
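Rules 1 and 2 from the prompt can also be checked deterministically before the model call, so the most clear-cut cases don't rely on the LLM alone. A sketch, with the hypothetical helper `preScreen` operating on the same `txn`, `stats`, and `recentTxns` shapes used above (description similarity from rule 2 is left to the model):

```javascript
// Deterministic pre-checks mirroring anomaly rules 1 and 2.
// recentTxns is already limited to the last 7 days by the Node 2 query.
function preScreen(txn, stats, recentTxns) {
  const flags = [];
  const amt = Math.abs(txn.amount);
  // Rule 1: amount more than 3 standard deviations above the 90-day mean.
  if (stats.stddev_txn_amount > 0) {
    const z = (amt - stats.avg_txn_amount) / stats.stddev_txn_amount;
    if (z > 3) flags.push(`amount is ${z.toFixed(1)} std devs above mean`);
  }
  // Rule 2: possible duplicate — same amount ± $1 within the recent window.
  const dup = recentTxns.find(t =>
    t.platform_txn_id !== txn.platform_txn_id &&
    Math.abs(Math.abs(t.amount) - amt) <= 1
  );
  if (dup) flags.push(`possible duplicate of ${dup.platform_txn_id}`);
  return {
    isSuspect: flags.length > 0,
    flags,
    duplicateOf: dup ? dup.platform_txn_id : null,
  };
}
```

The pre-screen result can be appended to the prompt (or used to force `is_anomaly` regardless of the model's verdict), which keeps the pipeline robust if the model misses an obvious outlier.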

Node 4: Call GPT-5.4 mini

  • Type: OpenAI
  • Resource: Chat Completion
  • Model: gpt-5.4-mini
  • Messages: [{role: 'user', content: '{{$json.prompt}}'}]
  • Temperature: 0.1
  • Max Tokens: 500
  • Response Format: json_object

Node 5: Parse and Store Results

  • Type: Code (JavaScript)
javascript
const analysis = JSON.parse($input.first().json.message.content);
const txn = $input.first().json.txn;

return [{
  json: {
    transaction_id: txn.id,
    client_id: txn.client_id,
    ai_category: analysis.suggested_category,
    ai_confidence: analysis.category_confidence,
    is_anomaly: analysis.anomaly_score >= 0.5 || analysis.is_duplicate_suspect,
    anomaly_reason: analysis.anomaly_reasons?.join('; ') || null,
    anomaly_score: analysis.anomaly_score,
    concern_level: analysis.concern_level,
    brief_note: analysis.brief_note,
    full_analysis: analysis
  }
}];

Node 6: Update Transaction Record

  • Type: Postgres
sql
UPDATE transactions SET 
  ai_category = $1, 
  ai_confidence = $2, 
  is_anomaly = $3, 
  anomaly_reason = $4 
WHERE id = $5;

Node 7: Route Anomalies

  • Type: IF
  • Condition: anomaly_score >= 0.5 OR concern_level IN ('high', 'critical')
  • True → Trigger Anomaly Detector (03-anomaly-detector)
  • False → End (normal transaction, no further action)

Node 8: Audit Log

  • Type: Postgres
  • Insert into audit_log: event_type='transaction_analyzed', details={analysis summary}

Anomaly Detector Deep Analysis Agent

  • Type: agent

A second-tier AI agent using GPT-5.4 (the more capable model) that performs deep analysis on transactions flagged by the Transaction Analyzer. It receives the flagged transaction along with broader client financial context (30-day transaction history, current cash position, upcoming payables) and generates a detailed anomaly report with recommended actions. This agent only runs on flagged transactions to control API costs.

Implementation:

n8n Workflow Specification:

Trigger: Called by Transaction Analyzer when anomaly_score >= 0.5

Node 1: Receive Flagged Transaction

  • Type: Execute Workflow Trigger

Node 2: Fetch Extended Context

  • Type: Postgres (multiple queries)
sql
Query 1 - 30-day transaction history:
SELECT txn_date, amount, description, category, ai_category, is_anomaly
FROM transactions
WHERE client_id = $1 AND txn_date > CURRENT_DATE - INTERVAL '30 days'
ORDER BY txn_date DESC;
Query 2 - Latest cash snapshot:
SELECT * FROM cash_snapshots
WHERE client_id = $1
ORDER BY snapshot_date DESC LIMIT 1;
Query 3 - Client thresholds:
SELECT min_cash_threshold, large_txn_threshold FROM clients WHERE id = $1;

Node 3: Build Deep Analysis Prompt

  • Type: Code (JavaScript)
javascript
const flagged = $input.first().json;
const history = $input.item(1).json.rows;
const snapshot = $input.item(2).json.rows[0];
const thresholds = $input.item(3).json.rows[0];

const prompt = `You are a senior forensic accountant AI agent. A transaction has been flagged as potentially anomalous. Perform a deep analysis and generate an actionable alert.

Flagged Transaction

  • Date: ${flagged.txn_date}
  • Amount: $${flagged.amount}
  • Description: ${flagged.description}
  • Current Category: ${flagged.category}
  • AI Suggested Category: ${flagged.ai_category}
  • Initial Anomaly Score: ${flagged.anomaly_score}
  • Initial Flags: ${flagged.anomaly_reason}

Current Cash Position

  • Total Bank Balance: $${snapshot?.total_bank_balance || 'Unknown'}
  • Accounts Receivable: $${snapshot?.accounts_receivable || 'Unknown'}
  • Accounts Payable: $${snapshot?.accounts_payable || 'Unknown'}
  • Net Cash Position: $${snapshot?.net_cash_position || 'Unknown'}
  • Days Cash on Hand: ${snapshot?.days_cash_on_hand || 'Unknown'}
  • Daily Burn Rate: $${snapshot?.burn_rate_daily || 'Unknown'}

Alert Thresholds

  • Minimum Cash Balance: $${thresholds.min_cash_threshold}
  • Large Transaction Threshold: $${thresholds.large_txn_threshold}

30-Day Transaction History

${history.map(t => `${t.txn_date} | $${t.amount} | ${t.description} | ${t.category} ${t.is_anomaly ? '⚠️' : ''}`).join('\n')}

Your Analysis Tasks

1. Confirm or revise the anomaly classification
2. Assess the cash flow impact of this transaction
3. Determine if this transaction, combined with current cash position, creates a cash flow risk
4. Check for patterns (is this part of a series of unusual transactions?)
5. Generate a clear, actionable alert for the bookkeeper

Respond with ONLY a JSON object: { "confirmed_anomaly": <true/false>, "revised_anomaly_score": <0.0-1.0>, "alert_severity": "<info|warning|critical>", "alert_title":...

Daily Portfolio Digest Generator

  • Type: agent

An agent that runs each morning, fetches portfolio-wide cash snapshots, the last 24 hours of alerts, and flagged anomalies from PostgreSQL, and uses GPT-5.4 to generate a plain-English briefing for the accounting team. The prompt below interpolates the `snapshots`, `alerts`, and `anomalies` arrays fetched earlier in the workflow.

Portfolio Cash Positions (Today)

${snapshots.map(s => `- ${s.name}: Bank $${Number(s.total_bank_balance).toLocaleString()} | AR $${Number(s.accounts_receivable).toLocaleString()} | AP $${Number(s.accounts_payable).toLocaleString()} | Net $${Number(s.net_cash_position).toLocaleString()} | ${s.days_cash_on_hand} days cash | Burn $${Number(s.burn_rate_daily).toLocaleString()}/day`).join('\n')}

Alerts in Last 24 Hours (${alerts.length} total)

${alerts.length > 0 ? alerts.map(a => `- [${a.severity.toUpperCase()}] ${a.client_name}: ${a.title}`).join('\n') : 'No alerts triggered.'}

Anomalous Transactions (${anomalies.length} flagged)

${anomalies.length > 0 ? anomalies.map(a => `- ${a.client_name}: $${a.amount} — ${a.description} (${a.anomaly_reason})`).join('\n') : 'No anomalies detected.'}

Instructions

Generate a morning briefing with these sections:

1. Portfolio Health Summary (1-2 sentences: overall status, how many clients are healthy vs. at risk)
2. ⚠️ Clients Requiring Attention (list clients with <14 days cash on hand or active critical/warning alerts, with specific recommended actions)
3. 📊 Notable Transactions (any anomalies worth the team's attention)
4. 📈 Positive Signals (clients with improving cash positions or notable incoming payments)
5. Today's Priority Actions (numbered list of 3-5 specific actions the team should take today)

Keep the total briefing under 400 words. Use plain English, avoid jargon. Address the team directly.`;

return [{ json: { prompt } }];

Node 5: Call GPT-5.4

  • Type: OpenAI
  • Model: gpt-5.4
  • Messages: [{role: 'system', content: 'You are a concise, professional financial advisor AI. Generate clear, actionable briefings.'}, {role: 'user', content: '{{$json.prompt}}'}]
  • Temperature: 0.3
  • Max Tokens: 800

Node 6: Format and Send to Slack

  • Type: Slack
  • Channel: #cashflow-alerts-info
  • Message: ☀️ *Morning Cash Flow Briefing — {{$now.format('MMMM d, yyyy')}}* {{$json.message.content}} --- _Generated by CashFlow Monitor Agent at {{$now.format('h:mm a')}}_

Node 7: Also send via email (optional)

  • Type: Send Email
  • To: accounting-team@clientfirm.com
  • Subject: Daily Cash Flow Briefing — {{$now.format('MMM d')}}
  • Body: HTML-formatted version of the digest

OAuth Token Refresher

  • Type: workflow

A utility workflow that runs every 45 minutes to proactively refresh OAuth tokens for QBO and Xero before they expire. QBO tokens expire every 60 minutes; Xero tokens expire every 30 minutes. This ensures uninterrupted API access for all agent workflows.

Implementation:

plaintext
n8n Workflow Specification:

**Trigger:** Schedule — Every 45 minutes

**Node 1: Get Clients Needing Token Refresh**
Type: Postgres
SELECT id, name, platform, platform_entity_id, oauth_refresh_token, oauth_token_expires_at
FROM clients
WHERE is_active = TRUE
  AND oauth_token_expires_at < NOW() + INTERVAL '20 minutes';

**Node 2: For Each Client — Refresh Token**

For QBO:
Type: HTTP Request
Method: POST
URL: https://oauth.platform.intuit.com/oauth2/v1/tokens/bearer
Body (form-urlencoded):
  grant_type=refresh_token
  refresh_token={{$json.oauth_refresh_token}}
Headers:
  Authorization: Basic {{base64(client_id + ':' + client_secret)}}
  Content-Type: application/x-www-form-urlencoded

For Xero:
Type: HTTP Request
Method: POST
URL: https://identity.xero.com/connect/token
Body (form-urlencoded):
  grant_type=refresh_token
  refresh_token={{$json.oauth_refresh_token}}
  client_id={{xero_client_id}}
  client_secret={{xero_client_secret}}

**Node 3: Update Tokens in Database**
Type: Postgres
UPDATE clients SET
  oauth_access_token = $1,
  oauth_refresh_token = $2,
  oauth_token_expires_at = NOW() + make_interval(secs => $3),  -- $3 = expires_in; interval literals cannot be parameterized
  updated_at = NOW()
WHERE id = $4;

**Node 4: Error Handling**
If refresh fails (e.g., refresh token expired after 100 days for QBO):
- Log error to audit_log
- Send critical alert to #cashflow-agent-errors
- Set client.is_active = FALSE
- Alert MSP: 'Re-authorization required for {client_name}'

**Node 5: Audit Log**
Log all token refresh events for compliance trail.
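The expiry bookkeeping around Nodes 1 and 3 reduces to simple date arithmetic. A sketch, assuming the standard OAuth 2.0 token response shape (`access_token`, `refresh_token`, `expires_in` in seconds); the helper names are illustrative:

```javascript
// Apply a token-endpoint response to a client record (Node 3's UPDATE values).
function applyTokenResponse(client, tokenResponse, now = new Date()) {
  return {
    ...client,
    oauth_access_token: tokenResponse.access_token,
    // QBO and Xero both rotate refresh tokens; always store the new one when present.
    oauth_refresh_token: tokenResponse.refresh_token || client.oauth_refresh_token,
    oauth_token_expires_at: new Date(now.getTime() + tokenResponse.expires_in * 1000),
  };
}

// Mirror of the Node 1 query: refresh anything expiring within 20 minutes.
function needsRefresh(client, now = new Date(), marginMinutes = 20) {
  return client.oauth_token_expires_at.getTime() < now.getTime() + marginMinutes * 60 * 1000;
}
```

With a 45-minute schedule and a 20-minute margin, a QBO token (60-minute lifetime) is always refreshed before it lapses; Xero's 30-minute tokens are caught because every Xero token falls inside the 20-minute window by the time the schedule fires.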

Cash Flow Analysis System Prompt

  • Type: prompt

The master system prompt used across all LLM calls in the agent pipeline. Establishes the AI's persona, capabilities, constraints, and output formatting requirements. This prompt is referenced by the Transaction Analyzer and Anomaly Detector agents.

Implementation:

SYSTEM PROMPT — Cash Flow Monitor AI Agent

---

You are CashFlow Monitor, an AI agent specialized in forensic bookkeeping and proactive cash flow management for small and medium businesses.

Your Role

You work for an accounting firm that manages bookkeeping for multiple business clients. Your job is to continuously monitor bank transactions and cash positions, detect anomalies, and surface actionable alerts before cash flow problems become crises.

Your Capabilities

1. Transaction Classification: Verify and correct transaction categories based on description, amount, and historical patterns
2. Anomaly Detection: Identify unusual transactions including potential duplicates, unexpected large payments, unknown vendors, and pattern breaks
3. Cash Flow Forecasting: Project future cash positions based on burn rate, seasonal patterns, and known upcoming obligations
4. Risk Assessment: Evaluate the severity of cash flow concerns and prioritize alerts accordingly
5. Natural Language Reporting: Generate clear, concise alerts and summaries for non-technical bookkeepers

Your Constraints

  • NEVER fabricate or hallucinate financial data. If you lack data to make a calculation, explicitly state 'Insufficient data' rather than estimating.
  • NEVER provide tax advice or legal opinions.
  • ALWAYS express amounts in USD with appropriate precision (dollars and cents for transactions, whole dollars for summaries).
  • ALWAYS classify alert severity conservatively — it's better to over-alert than miss a real problem.
  • Your anomaly detection should have a FALSE POSITIVE rate target of <15%. Err toward flagging suspicious items.
  • NEVER include raw API data, OAuth tokens, or system internals in alert messages.
  • Alerts must be written for a bookkeeper with 2-5 years of experience — avoid AI jargon.
  • When recommending actions, be specific: 'Contact [vendor name] about the [varies],XXX payment on [date]' not 'Investigate the transaction.'

Severity Definitions

  • CRITICAL: Immediate cash flow risk. Cash below threshold, projected shortfall within 7 days, or transaction suggesting fraud/error >$10,000.
  • WARNING: Emerging concern. Projected shortfall within 14-30 days, unusual transaction >$5,000, spending rate increase >25%, or potential duplicate payment.
  • INFO: Noteworthy but not urgent. New vendor payments, category corrections, positive cash flow trends, or monthly pattern observations.

Output Format

Always respond with valid JSON matching the schema requested in the user prompt. Do not include markdown formatting, code fences, or explanatory text outside the JSON object.
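Despite this instruction, models occasionally wrap JSON in code fences, so downstream nodes should parse defensively rather than trusting the raw string. A sketch with the hypothetical helper `parseAgentJson`, which strips accidental fences and checks for required keys before the result is used:

```javascript
// Defensive parse of model output against the schema a prompt requested.
function parseAgentJson(raw, requiredKeys) {
  // Strip an accidental leading/trailing code fence before parsing.
  const cleaned = raw.trim().replace(/^```(?:json)?\s*/i, '').replace(/\s*```$/, '');
  let parsed;
  try {
    parsed = JSON.parse(cleaned);
  } catch (e) {
    return { ok: false, error: `invalid JSON: ${e.message}` };
  }
  const missing = requiredKeys.filter(k => !(k in parsed));
  if (missing.length) return { ok: false, error: `missing keys: ${missing.join(', ')}` };
  return { ok: true, value: parsed };
}
```

A failed parse should route to the error branch (audit_log plus #cashflow-agent-errors) rather than producing a malformed alert.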

Testing & Validation

  • SMOKE TEST — n8n Accessibility: Navigate to https://n8n.yourdomain.com in a browser. Verify the login page loads over HTTPS with a valid SSL certificate. Log in with admin credentials and confirm the dashboard displays all imported workflows.
  • API CONNECTIVITY — QuickBooks Online: In n8n, create a temporary workflow with an HTTP Request node querying GET https://quickbooks.api.intuit.com/v3/company/<REALM_ID>/companyinfo/<REALM_ID>?minorversion=73 using the QBO OAuth2 credential. Execute and verify a 200 response with valid company info JSON.
  • API CONNECTIVITY — Xero: In n8n, create a temporary workflow with an HTTP Request node querying GET https://api.xero.com/api.xro/2.0/Organisation using the Xero OAuth2 credential. Execute and verify a 200 response with valid organization data.
  • API CONNECTIVITY — OpenAI: In n8n, use the OpenAI node to send a simple chat completion request: 'Respond with the word OK'. Verify a 200 response and the model returns 'OK'. Check that billing is active on the OpenAI dashboard.
  • DATABASE — PostgreSQL Connection: Execute a test query from n8n's Postgres node: SELECT NOW(), version(); Verify it returns the current timestamp and PostgreSQL 15.x version string. Then verify all custom tables exist: SELECT table_name FROM information_schema.tables WHERE table_schema = 'public';
  • BANK FEED POLLER — Single Client: Activate the Bank Feed Poller workflow. Manually trigger it or wait for the 15-minute schedule. Check the n8n Executions tab for a successful run. Then query: SELECT COUNT(*) FROM transactions WHERE client_id = <TEST_CLIENT_ID> AND processed_at > NOW() - INTERVAL '1 hour'; Verify transactions were inserted.
  • TRANSACTION ANALYZER — Classification Accuracy: Insert 10 known test transactions covering various categories (payroll, rent, utilities, office supplies, client payment) into the test client's bank feed. Run the analyzer and verify: (1) ai_category is assigned for all 10, (2) ai_confidence > 0.7 for at least 8/10, (3) category suggestions are reasonable for the transaction descriptions.
  • ANOMALY DETECTION — True Positive: Create a test bank transaction with an unusually large amount ($50,000 when average is $2,000). Run the pipeline and verify: (1) Transaction Analyzer flags it with anomaly_score >= 0.7, (2) Anomaly Detector generates a detailed analysis, (3) An alert record is created in the alerts table with severity 'warning' or 'critical'.
  • ANOMALY DETECTION — Duplicate Detection: Create two bank transactions with the same amount ($1,547.32) and similar descriptions within 3 days. Run the pipeline and verify the second transaction is flagged with is_duplicate_suspect = true.
  • ANOMALY DETECTION — False Positive Rate: Process 100 normal historical transactions through the analyzer. Count how many are flagged as anomalies. The false positive rate should be below 15% (fewer than 15 flagged). If higher, adjust the anomaly_score threshold in the Transaction Analyzer workflow.
  • CASH SNAPSHOT — Daily Generation: Manually trigger the Cash Snapshot Generator for the test client. Verify: (1) A record is inserted in cash_snapshots with today's date, (2) total_bank_balance matches the value shown in QBO/Xero, (3) days_cash_on_hand is calculated correctly (bank balance / daily burn rate).
  • THRESHOLD ALERT — Cash Below Minimum: Set the test client's min_cash_threshold to a value above their current bank balance. Trigger the snapshot generator. Verify a CRITICAL alert is generated with title containing 'below minimum threshold' and delivered to the correct Slack channel.
  • THRESHOLD ALERT — Projected Breach: Set the test client's min_cash_threshold to a value that their current burn rate would breach in 10 days. Trigger the snapshot generator. Verify a WARNING alert about projected threshold breach within 14 days.
  • ALERT DELIVERY — Slack: Verify alerts appear in the correct Slack channels: #cashflow-alerts-critical for warning/critical, #cashflow-alerts-info for info. Check that the Slack Block Kit formatting renders correctly with emoji, bold text, and action buttons.
  • ALERT DELIVERY — Deduplication: Trigger the same alert condition twice within 4 hours. Verify that only one Slack message is sent (deduplication working).
  • DAILY DIGEST — Generation and Delivery: Manually trigger the Daily Portfolio Digest Generator. Verify: (1) A well-formatted morning briefing appears in #cashflow-alerts-info, (2) The briefing includes all active clients, (3) Clients with low cash are highlighted, (4) The briefing is under 400 words and uses plain English.
  • OAUTH TOKEN REFRESH — Automatic: Check the audit_log for recent token refresh events: SELECT * FROM audit_log WHERE event_type = 'token_refresh' ORDER BY created_at DESC LIMIT 10; Verify tokens are being refreshed every 45 minutes for clients with expiring tokens.
  • BACKUP — Database Restore: Run the backup script manually: /opt/cashflow-agent/backup.sh. Then restore the backup to the staging database: gpg -d <backup>.tar.gz.gpg | tar xz && pg_restore -h <staging_host> -U pgadmin -d cashflow_agent_test <backup_file>.dump. Verify all tables and data are intact.
  • MONITORING — UptimeRobot: Temporarily stop the n8n Docker container (docker stop n8n). Verify UptimeRobot sends a down alert within 5 minutes. Restart n8n (docker start n8n) and verify UptimeRobot sends an up notification.
  • SECURITY — SSL and Encryption: Run SSL Labs test (ssllabs.com/ssltest) against n8n.yourdomain.com. Verify grade A or higher. Verify PostgreSQL connections require SSL: SELECT ssl FROM pg_stat_ssl WHERE pid = pg_backend_pid(); should return true.
  • END-TO-END — Full Pipeline: Create a new bank transaction in the QBO/Xero sandbox. Time the full pipeline: transaction creation → polling detection → analysis → anomaly check → alert delivery. Target: under 20 minutes from transaction to alert. Document actual time for baseline.
  • LOAD TEST — 50 Client Simulation: Configure all 50 client entities as active. Trigger a simultaneous bank feed poll for all clients. Monitor n8n execution queue, API rate limits (check for 429 errors), and VM resource utilization (CPU, RAM). Verify all polls complete within the 15-minute window without errors.
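The deduplication behavior verified above can be sketched in code. This is a minimal sketch, not the production workflow: the `should_send` function, the `(client_id, alert_type)` key, and the in-memory `recent_alerts` map are all illustrative stand-ins for whatever the n8n workflow actually stores in PostgreSQL.

```python
from datetime import datetime, timedelta

DEDUP_WINDOW = timedelta(hours=4)  # matches the 4-hour test window

def should_send(alert_key, now, recent_alerts):
    """Return True if no identical alert was delivered within the dedup window.

    alert_key: (client_id, alert_type) tuple identifying the alert condition.
    recent_alerts: dict mapping alert_key -> datetime of the last delivery
    (hypothetical structure; the real pipeline would query the alerts table).
    """
    last_sent = recent_alerts.get(alert_key)
    if last_sent is not None and now - last_sent < DEDUP_WINDOW:
        return False  # suppress: same condition already alerted within 4 hours
    recent_alerts[alert_key] = now  # record this delivery for future checks
    return True
```

Triggering the same condition twice within four hours should return True then False; after the window lapses, the same key alerts again.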

Client Handoff

Client Handoff Checklist

Training Session (2 hours, in-person or video)

1. System Overview (30 min): Walk through the architecture diagram showing how bank feeds flow through the AI agent to generate alerts. Explain the two-tier analysis (GPT-5.4 mini for classification, GPT-5.4 for deep analysis). Demonstrate the daily digest and how to read it.
2. Alert Response Training (30 min): Show real alert examples for each severity level (critical, warning, info). Train staff on the expected response workflow: receive alert → review in QBO/Xero → take action → acknowledge in Slack. Practice with 3 simulated scenarios: cash threshold breach, duplicate payment detection, unusual vendor payment.
3. Float & Fathom Orientation (30 min): Walk through the Float dashboard for each client showing cash forecasting scenarios. Demonstrate how to adjust cash thresholds when a client's situation changes. Show the Fathom portfolio dashboard and how to generate client advisory reports.
4. Configuration Changes (15 min): Train the firm's designated admin on how to request threshold changes, add new clients, and temporarily pause monitoring for a client. Provide the client configuration spreadsheet with current settings for all entities.
5. Escalation Procedures (15 min): Document and review: when to contact the MSP (system errors, missed alerts, false positive patterns), expected response times (4-hour SLA for critical, next business day for non-critical), and the MSP support email/phone.

Documentation to Deliver

  • Operations Runbook (PDF): System architecture, all credentials (stored in shared password manager), alert severity definitions, response procedures, escalation contacts
  • Client Configuration Spreadsheet: All monitored entities with thresholds, alert channels, and OAuth connection status
  • WISP Amendment: Signed document updating the firm's Written Information Security Plan to include the AI monitoring system
  • Vendor Contact Sheet: Support contacts for Float, Fathom, n8n (community), OpenAI, and Azure
  • Monthly Report Template: Template for the MSP's monthly service report showing alert volume, false positive rate, system uptime, and recommendations

Success Criteria to Review Together

Ongoing Maintenance Responsibilities

Daily (Automated, MSP Monitors)

  • Alert Pipeline Health: UptimeRobot monitors n8n uptime, webhook responsiveness, and API connectivity. MSP receives alerts on any downtime within 5 minutes. Automated n8n workflow checks database connectivity and OAuth token validity every 30 minutes.
  • Execution Monitoring: Review n8n Executions tab daily (or configure error notifications) for failed workflow executions. Common failures: expired OAuth tokens (auto-refreshed, but refresh tokens expire after 100 days for QBO), API rate limits, temporary API outages.
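The automated token-validity check described above can be sketched as a simple lead-time filter. A minimal sketch, assuming hypothetical `client_id` and `access_token_expires_at` fields; the real check would read from the credentials table and feed the n8n refresh workflow.

```python
from datetime import datetime, timedelta

REFRESH_LEAD = timedelta(minutes=45)  # refresh tokens due within 45 minutes

def tokens_needing_refresh(connections, now):
    """Return client_ids whose OAuth access token expires within the lead window.

    connections: list of dicts with 'client_id' and 'access_token_expires_at'
    (illustrative field names -- adapt to the actual schema).
    """
    return [
        c["client_id"]
        for c in connections
        if c["access_token_expires_at"] - now <= REFRESH_LEAD
    ]
```

Note this covers access tokens only; QBO refresh tokens that expire after 100 days of disuse still require the client to re-authorize.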

Weekly (MSP Technician — 30 minutes)

  • False Positive Review: Query alerts table for the week: SELECT severity, COUNT(*) FROM alerts WHERE created_at > NOW() - INTERVAL '7 days' GROUP BY severity; Review any alerts marked as acknowledged without action (likely false positives). If false positive rate exceeds 15%, schedule prompt tuning.
  • System Resource Check: Verify Azure VM CPU/RAM utilization averages below 70%. Check PostgreSQL storage usage and vacuum status. Review Docker container logs for warnings.
  • Backup Verification: Confirm nightly backups completed successfully: ls -la /opt/cashflow-agent/backups/ and check the most recent backup date.
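The false positive calculation behind the weekly review can be sketched as follows. This mirrors the heuristic above (acknowledged without action = likely false positive); the `acknowledged` and `action_taken` field names are illustrative, not necessarily the alerts table's actual columns.

```python
FP_THRESHOLD = 0.15  # above this, schedule prompt tuning

def false_positive_rate(alerts):
    """Compute the false positive rate for a week's worth of alert records.

    An alert acknowledged without any follow-up action is treated as a
    likely false positive. alerts: list of dicts with 'acknowledged' and
    'action_taken' booleans (hypothetical schema).
    """
    if not alerts:
        return 0.0  # no alerts means nothing to tune
    false_positives = sum(
        1 for a in alerts if a["acknowledged"] and not a["action_taken"]
    )
    return false_positives / len(alerts)
```

If `false_positive_rate(week_alerts) > FP_THRESHOLD`, that week's alerts feed into the monthly prompt-tuning pass.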

Monthly (MSP Engineer — 2 hours)

  • Prompt Tuning: Analyze the month's false positives and missed detections. Adjust anomaly_score thresholds, update the system prompt with new patterns, and test changes on the staging environment before promoting to production.
  • Software Updates: Update n8n to the latest stable version (docker pull n8nio/n8n:latest && docker compose up -d). Review OpenAI model deprecation notices and update model references if needed. Apply Ubuntu security patches (automated via unattended-upgrades, but verify).
  • Client Threshold Review: Generate a report of threshold breaches and near-misses. Recommend threshold adjustments to the accounting firm based on seasonal patterns.
  • Monthly Service Report: Deliver to the client: alert volume by severity, false positive rate, system uptime percentage, notable detections, and recommendations for threshold changes.

Quarterly (MSP Engineer — 4 hours)

  • Compliance Audit: Review WISP amendment for accuracy. Verify all vendor SOC 2 certifications are current. Test backup restoration to staging. Review access logs and remove any unused credentials.
  • Performance Optimization: Analyze LLM API costs and optimize: are there transaction types that can be ruled out without LLM analysis? Can GPT-5.4 mini handle more of the anomaly detection load? Review database query performance and add indexes if needed.
  • Client Onboarding/Offboarding: Add new client entities, remove departed clients, update OAuth connections as needed.
  • Disaster Recovery Drill: Restore from backup to the staging VM and verify full pipeline functionality.
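One concrete cost lever from the performance optimization review above is a rules pre-filter that rules out routine transactions before any LLM call. A minimal sketch under stated assumptions: the category names, amount cutoff, and transaction fields are all hypothetical and would be tuned per client.

```python
# Categories that never warrant LLM anomaly analysis (illustrative list).
ROUTINE_CATEGORIES = {"payroll", "rent", "bank_fee", "loan_payment"}
SMALL_AMOUNT_CUTOFF = 50.00  # skip LLM for immaterial amounts (assumed policy)

def needs_llm_analysis(txn):
    """Return False for transactions that rules alone can clear.

    txn: dict with 'category' and 'amount' keys (hypothetical schema).
    Routine recurring categories and immaterial amounts are ruled out
    before spending an API call on GPT-5.4 mini.
    """
    if txn["category"] in ROUTINE_CATEGORIES:
        return False  # known-recurring: rules handle it
    if abs(txn["amount"]) < SMALL_AMOUNT_CUTOFF:
        return False  # immaterial: not worth an API call
    return True
```

Even ruling out a third of transaction volume this way cuts the LLM bill proportionally without touching detection quality on the transactions that matter.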

Annual

  • Full Security Review: Rotate all API keys, OAuth app secrets, database passwords, and encryption keys. Review and update the WISP. Conduct a penetration test on the n8n instance.
  • Architecture Review: Evaluate new tools and AI models. Consider upgrading from GPT-5.4 to newer models. Assess whether Intuit Assist or Xero's native AI has caught up to the custom agent's capabilities.
  • Contract Review: Review managed service pricing against actual costs and effort. Adjust pricing for renewals.

SLA Commitments

  • System Uptime: 99.5% monthly (allows ~3.6 hours downtime per month)
  • Alert Delivery: Within 20 minutes of transaction detection during business hours
  • Critical Issue Response: 4-hour MSP response time during business hours, next business day for non-critical
  • Threshold Changes: Implemented within 1 business day of request
  • New Client Onboarding: Completed within 5 business days of receiving OAuth authorization
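The downtime allowance in the uptime commitment works out as follows, assuming a 30-day month:

```python
# Allowed downtime under a 99.5% monthly uptime SLA, 30-day month.
hours_per_month = 30 * 24                       # 720 hours in the month
allowed_downtime = hours_per_month * (1 - 0.995)
print(round(allowed_downtime, 1))               # prints 3.6 (hours)
```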

Escalation Path

1. Automated alert to #cashflow-agent-errors Slack channel
2. UptimeRobot notification to MSP NOC email
3. MSP L1 technician triages within 30 minutes
4. If unresolved in 2 hours, escalate to MSP L2 engineer
5. If API/platform issue, open vendor support ticket (Intuit/Xero/OpenAI)
6. If unresolved in 4 hours, notify client's designated contact
