
Implementation Guide: Monitor the MLS for New Listings Matching Buyer Criteria and Send Personalized Alerts
Step-by-step guide for deploying an AI agent that monitors the MLS for new listings matching buyer criteria and sends personalized alerts, for real estate clients.
Hardware Procurement
Cloud VPS — Production Server
$8/month MSP cost / $50–75/month suggested resale to client
Primary production server hosting n8n workflow engine, PostgreSQL database, Nginx reverse proxy, and all agent orchestration logic. The 4 vCPU / 8 GB RAM spec provides headroom for concurrent MLS polling, LLM API calls, and notification dispatch. A single CX32 can serve 5–10 brokerage clients with isolated databases.
Cloud VPS — Development/Staging Server
$4/month MSP cost / included in implementation fee
Staging environment for testing workflow changes, new MLS integrations, and prompt engineering before deploying to production. Can be decommissioned after initial deployment if budget is tight, but recommended to keep for ongoing development.
On-Premise Server (Alternative for Large Brokerages)
$1,200–1,800 one-time MSP cost / $2,500–3,500 resale + $150/month managed hosting fee
Alternative to cloud VPS for brokerages with data residency requirements or existing server closets. Provides dedicated compute for MLS data processing and agent orchestration. Only recommended when client explicitly requires on-premise hosting.
Software Procurement
n8n (Self-Hosted)
$0/month (self-hosted) / $150–250/month suggested managed service fee to client
Core workflow automation and AI agent orchestration platform. Provides visual workflow builder with 422+ integrations, HTTP request nodes for MLS API polling, OpenAI integration nodes, database nodes for PostgreSQL, and built-in scheduling (cron) for autonomous operation. All agent logic, matching rules, and notification dispatch are built as n8n workflows.
SimplyRETS
$49–99/month + $99 one-time connection fee per MLS / $99–150/month suggested resale
MLS data access API providing standardized RESO-compliant listing data via REST endpoints. Supports both RETS and RESO Web API feeds. Returns JSON-formatted listing data including photos, descriptions, geo-coordinates, and all standard property fields. Requires client's MLS broker authorization.
OpenAI API (GPT-5.4 mini)
$2–10/month at 500 alerts/day / $25–50/month suggested resale
Large language model API used to generate personalized alert messages for each buyer-listing match. GPT-5.4 mini at $0.15/1M input tokens and $0.60/1M output tokens provides the best cost-to-quality ratio for templated personalization tasks. Each alert requires approximately 800 tokens round-trip.
Twilio SendGrid
$0–19.95/month (Free tier: 100 emails/day; Essentials: $19.95/month for 50K emails) / $30–50/month suggested resale
Transactional email delivery API for sending personalized HTML listing alert emails. Provides delivery analytics, bounce management, DKIM/SPF authentication, and unsubscribe handling required for CAN-SPAM compliance.
Twilio Programmable SMS
~$0.0079–0.0083/message + carrier fees (~$15–40/month for 2,000–5,000 messages) / $50–100/month flat suggested resale
SMS notification delivery for time-sensitive listing alerts. Requires A2P 10DLC registration ($44 one-time brand registration + $15 one-time campaign vetting + $1.50–10/month/campaign). Provides delivery receipts, opt-out handling, and compliance features.
PostgreSQL
$0/month
Primary database for storing buyer profiles, search criteria, listing cache, alert history, delivery tracking, and consent records. Runs in Docker alongside n8n on the same VPS.
Docker & Docker Compose
$0/month
Container runtime for deploying and managing n8n, PostgreSQL, and Nginx as isolated, reproducible services. Simplifies updates, backups, and multi-client isolation.
Nginx
$0/month
Reverse proxy and TLS termination for n8n web interface and webhook endpoints. Handles SSL certificates from Let's Encrypt for secure API callbacks.
Let's Encrypt (Certbot)
$0/month
Automated SSL/TLS certificate provisioning for the n8n domain. Required for secure webhook endpoints and HTTPS access to the n8n admin interface.
Follow Up Boss (CRM — Client-Paid)
$69/month/user (Grow) to $499/month for 10 users (Pro) — paid by client directly
Primary CRM integration point for syncing buyer profiles and search criteria into the agent system. The API provides webhooks for new contact creation and contact updates, allowing automatic buyer criteria import. Activity logging pushes alert delivery records back to FUB for agent visibility. Alternative CRMs: kvCORE, LionDesk, Wise Agent.
Prerequisites
- Active MLS membership through a licensed real estate broker — the client must be a licensed brokerage or have broker authorization to access MLS data feeds. The MSP cannot obtain MLS data access independently.
- RESO Web API credentials issued by the client's local MLS board. Application process typically takes 2–4 weeks. The client's broker must submit a vendor access request naming the MSP or the SimplyRETS platform as an authorized data consumer.
- A registered domain name for the deployment (e.g., alerts.clientbrokerage.com) with DNS management access. Required for SSL certificates, email deliverability (DKIM/SPF/DMARC), and n8n webhook endpoints.
- An OpenAI platform account with billing enabled and an API key generated. Create at https://platform.openai.com. Recommended: set a monthly spending limit of $50 to prevent runaway costs during development.
- A Twilio account with: (a) SendGrid API key for email, (b) Programmable SMS phone number, and (c) completed A2P 10DLC brand registration ($44) and campaign registration ($15). A2P registration can take 1–5 business days for approval.
- CRM API credentials — if using Follow Up Boss, the client needs a Pro or Platform plan ($499+/month) for full API access. Obtain the API key from Settings > API in FUB.
- Client-provided buyer preference data in a structured format (spreadsheet or CRM export) including: buyer name, email, phone, preferred price range, preferred locations/zip codes, minimum bedrooms/bathrooms, property type preferences, and any must-have/deal-breaker criteria.
- Written consent records from all buyers who will receive alerts, documenting express written consent for both email and SMS communications with timestamps. Required for TCPA and CAN-SPAM compliance.
- SSH access to the provisioned VPS with a sudo-capable user account. Public key authentication configured (no password-only SSH).
- Basic understanding of Docker and Linux command-line operations by the implementing MSP technician.
Installation Steps
Step 1: Provision and Secure the Production VPS
Create a Hetzner CX32 instance running Ubuntu 22.04 LTS. Configure firewall, SSH hardening, and automatic security updates. This server will host all components: n8n, PostgreSQL, Nginx, and Certbot.
# After creating the Hetzner CX32 via the Hetzner Cloud Console (https://console.hetzner.cloud):
# SSH into the new server
ssh root@<SERVER_IP>
# Create a non-root user for operations
adduser mspadmin
usermod -aG sudo mspadmin
# Configure SSH key authentication
mkdir -p /home/mspadmin/.ssh
cp ~/.ssh/authorized_keys /home/mspadmin/.ssh/
chown -R mspadmin:mspadmin /home/mspadmin/.ssh
chmod 700 /home/mspadmin/.ssh
chmod 600 /home/mspadmin/.ssh/authorized_keys
# Disable password authentication and root login
# Match both commented and uncommented directives (Ubuntu ships '#PermitRootLogin prohibit-password')
sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin no/' /etc/ssh/sshd_config
sed -i 's/^#\?PasswordAuthentication.*/PasswordAuthentication no/' /etc/ssh/sshd_config
systemctl restart sshd
# Configure UFW firewall
ufw default deny incoming
ufw default allow outgoing
ufw allow 22/tcp
ufw allow 80/tcp
ufw allow 443/tcp
ufw enable
# Enable automatic security updates
apt update && apt install -y unattended-upgrades
dpkg-reconfigure -plow unattended-upgrades
# Set timezone
timedatectl set-timezone America/New_York
Replace <SERVER_IP> with the actual IP from the Hetzner console. Choose a US or EU data center based on client location. For US real estate clients, use Hetzner's Ashburn, VA (us-east) or Hillsboro, OR (us-west) locations if available; otherwise use DigitalOcean NYC or SFO regions.
Step 2: Install Docker and Docker Compose
Install Docker Engine and Docker Compose plugin on the VPS. All application components will run as Docker containers for isolation, reproducibility, and easy updates.
# Log in as mspadmin
su - mspadmin
# Install Docker prerequisites
sudo apt update
sudo apt install -y ca-certificates curl gnupg lsb-release
# Add Docker's official GPG key
sudo install -m 0755 -d /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
sudo chmod a+r /etc/apt/keyrings/docker.gpg
# Add Docker repository
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# Install Docker Engine
sudo apt update
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
# Add mspadmin to docker group (avoids sudo for docker commands)
sudo usermod -aG docker mspadmin
newgrp docker
# Verify installation
docker --version
docker compose version
Log out and back in after adding the user to the docker group for the change to take effect. Verify with 'docker run hello-world'.
Step 3: Create Project Directory Structure and Docker Compose Configuration
Set up the directory structure for the n8n deployment and create the Docker Compose file that orchestrates n8n, PostgreSQL, and Nginx containers. This configuration enables persistent data storage, automatic restarts, and proper networking between services.
# Create project directory
mkdir -p ~/mls-agent/{n8n-data,postgres-data,nginx,backups}
cd ~/mls-agent
# Create the Docker Compose file
cat > docker-compose.yml << 'DOCKEREOF'
version: '3.8'
services:
  postgres:
    image: postgres:16-alpine
    container_name: mls-postgres
    restart: always
    environment:
      POSTGRES_DB: n8n
      POSTGRES_USER: n8n_user
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - ./postgres-data:/var/lib/postgresql/data
      - ./init-db:/docker-entrypoint-initdb.d
    ports:
      - '127.0.0.1:5432:5432'
    healthcheck:
      test: ['CMD-SHELL', 'pg_isready -U n8n_user -d n8n']
      interval: 10s
      timeout: 5s
      retries: 5
  n8n:
    image: n8nio/n8n:latest
    container_name: mls-n8n
    restart: always
    depends_on:
      postgres:
        condition: service_healthy
    environment:
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: postgres
      DB_POSTGRESDB_PORT: 5432
      DB_POSTGRESDB_DATABASE: n8n
      DB_POSTGRESDB_USER: n8n_user
      DB_POSTGRESDB_PASSWORD: ${POSTGRES_PASSWORD}
      N8N_HOST: ${N8N_DOMAIN}
      N8N_PORT: 5678
      N8N_PROTOCOL: https
      WEBHOOK_URL: https://${N8N_DOMAIN}/
      N8N_ENCRYPTION_KEY: ${N8N_ENCRYPTION_KEY}
      GENERIC_TIMEZONE: America/New_York
      N8N_BASIC_AUTH_ACTIVE: 'true'
      N8N_BASIC_AUTH_USER: ${N8N_ADMIN_USER}
      N8N_BASIC_AUTH_PASSWORD: ${N8N_ADMIN_PASSWORD}
    volumes:
      - ./n8n-data:/home/node/.n8n
    ports:
      - '127.0.0.1:5678:5678'
  nginx:
    image: nginx:alpine
    container_name: mls-nginx
    restart: always
    depends_on:
      - n8n
    ports:
      - '80:80'
      - '443:443'
    volumes:
      - ./nginx/conf.d:/etc/nginx/conf.d
      - ./nginx/ssl:/etc/letsencrypt
      - ./nginx/webroot:/var/www/certbot
DOCKEREOF
# Create environment file
cat > .env << 'ENVEOF'
POSTGRES_PASSWORD=CHANGE_ME_STRONG_PASSWORD_32CHARS
N8N_DOMAIN=alerts.clientbrokerage.com
N8N_ENCRYPTION_KEY=CHANGE_ME_RANDOM_64CHAR_HEX_STRING
N8N_ADMIN_USER=mspadmin
N8N_ADMIN_PASSWORD=CHANGE_ME_STRONG_ADMIN_PASSWORD
ENVEOF
# Generate secure values ('|' as the sed delimiter, since base64 output may contain '/')
sed -i "s|CHANGE_ME_STRONG_PASSWORD_32CHARS|$(openssl rand -hex 16)|" .env
sed -i "s|CHANGE_ME_RANDOM_64CHAR_HEX_STRING|$(openssl rand -hex 32)|" .env
sed -i "s|CHANGE_ME_STRONG_ADMIN_PASSWORD|$(openssl rand -base64 24)|" .env
# IMPORTANT: Record these credentials securely
cat .env
chmod 600 .env
CRITICAL: Save the generated .env credentials in the MSP's password manager (e.g., IT Glue, Hudu, or Bitwarden). The N8N_ENCRYPTION_KEY is used to encrypt credentials stored in n8n — if it is lost, all API keys stored in n8n workflows become unrecoverable. Replace 'alerts.clientbrokerage.com' with the actual subdomain.
Step 4: Configure Nginx Reverse Proxy and SSL
Set up Nginx as a reverse proxy for n8n with SSL termination. This provides HTTPS access to the n8n admin interface and secures webhook endpoints that receive MLS data and CRM callbacks.
cd ~/mls-agent
# Create initial Nginx config (HTTP only, for Certbot challenge)
mkdir -p nginx/conf.d nginx/ssl nginx/webroot
cat > nginx/conf.d/default.conf << 'NGINXEOF'
server {
    listen 80;
    server_name alerts.clientbrokerage.com;
    location /.well-known/acme-challenge/ {
        root /var/www/certbot;
    }
    location / {
        return 301 https://$host$request_uri;
    }
}
NGINXEOF
# Start just Nginx for certificate issuance
docker compose up -d nginx
# Install Certbot on host and obtain certificate
sudo apt install -y certbot
sudo certbot certonly --webroot -w ~/mls-agent/nginx/webroot -d alerts.clientbrokerage.com --non-interactive --agree-tos --email msp-admin@yourmsp.com
# Copy certificate files (dereferencing symlinks) so the paths referenced in the
# Nginx config exist at /etc/letsencrypt/*.pem inside the container
sudo sh -c 'cp -L /etc/letsencrypt/live/alerts.clientbrokerage.com/*.pem /home/mspadmin/mls-agent/nginx/ssl/'
sudo chown -R mspadmin:mspadmin ~/mls-agent/nginx/ssl/
# Update Nginx config with SSL
cat > nginx/conf.d/default.conf << 'NGINXSSLEOF'
upstream n8n_backend {
    server n8n:5678;
}
server {
    listen 80;
    server_name alerts.clientbrokerage.com;
    location /.well-known/acme-challenge/ {
        root /var/www/certbot;
    }
    location / {
        return 301 https://$host$request_uri;
    }
}
server {
    listen 443 ssl http2;
    server_name alerts.clientbrokerage.com;
    ssl_certificate /etc/letsencrypt/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/privkey.pem;
    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_ciphers HIGH:!aNULL:!MD5;
    client_max_body_size 50M;
    location / {
        proxy_pass http://n8n_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_read_timeout 300s;
        proxy_connect_timeout 300s;
    }
}
NGINXSSLEOF
# Restart Nginx with SSL config
docker compose restart nginx
# Set up auto-renewal cron
echo '0 3 * * * root certbot renew --quiet && cp -L /etc/letsencrypt/live/alerts.clientbrokerage.com/*.pem /home/mspadmin/mls-agent/nginx/ssl/ && docker compose -f /home/mspadmin/mls-agent/docker-compose.yml restart nginx' | sudo tee /etc/cron.d/certbot-renew
Note that files in /etc/cron.d require a user field ('root' above). Ensure the DNS A record for alerts.clientbrokerage.com points to the VPS IP BEFORE running Certbot. DNS propagation may take up to 48 hours. Replace 'alerts.clientbrokerage.com' and 'msp-admin@yourmsp.com' with actual values throughout.
Step 5: Initialize the PostgreSQL Database Schema
Create the application database tables for storing buyer profiles, search criteria, listing cache, alert history, and consent records. This schema supports multi-buyer matching, deduplication, and compliance auditing.
cd ~/mls-agent
mkdir -p init-db
cat > init-db/01-schema.sql << 'SQLEOF'
-- Buyer profiles and preferences
CREATE TABLE IF NOT EXISTS buyers (
  id SERIAL PRIMARY KEY,
  external_crm_id VARCHAR(255) UNIQUE,
  first_name VARCHAR(100) NOT NULL,
  last_name VARCHAR(100) NOT NULL,
  email VARCHAR(255),
  phone VARCHAR(20),
  assigned_agent_name VARCHAR(200),
  assigned_agent_email VARCHAR(255),
  assigned_agent_phone VARCHAR(20),
  email_consent BOOLEAN DEFAULT FALSE,
  email_consent_timestamp TIMESTAMPTZ,
  sms_consent BOOLEAN DEFAULT FALSE,
  sms_consent_timestamp TIMESTAMPTZ,
  is_active BOOLEAN DEFAULT TRUE,
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW()
);
-- Buyer search criteria (one buyer can have multiple saved searches)
CREATE TABLE IF NOT EXISTS search_criteria (
  id SERIAL PRIMARY KEY,
  buyer_id INTEGER REFERENCES buyers(id) ON DELETE CASCADE,
  criteria_name VARCHAR(200) DEFAULT 'Primary Search',
  min_price NUMERIC(12,2),
  max_price NUMERIC(12,2),
  min_bedrooms INTEGER,
  max_bedrooms INTEGER,
  min_bathrooms NUMERIC(3,1),
  max_bathrooms NUMERIC(3,1),
  min_sqft INTEGER,
  max_sqft INTEGER,
  min_lot_sqft INTEGER,
  max_lot_sqft INTEGER,
  min_year_built INTEGER,
  max_year_built INTEGER,
  property_types TEXT[], -- e.g., ARRAY['SingleFamily','Condo','Townhouse']
  zip_codes TEXT[],
  cities TEXT[],
  counties TEXT[],
  neighborhoods TEXT[],
  must_have_features TEXT[], -- e.g., ARRAY['garage','pool','waterfront']
  exclude_features TEXT[], -- e.g., ARRAY['hoa','55plus']
  max_hoa_fee NUMERIC(8,2),
  latitude NUMERIC(10,7),
  longitude NUMERIC(10,7),
  radius_miles NUMERIC(5,2),
  custom_notes TEXT,
  is_active BOOLEAN DEFAULT TRUE,
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW()
);
-- Listing cache (avoids re-alerting on known listings)
CREATE TABLE IF NOT EXISTS listings_cache (
  id SERIAL PRIMARY KEY,
  mls_id VARCHAR(50) UNIQUE NOT NULL,
  listing_data JSONB NOT NULL,
  address TEXT,
  city VARCHAR(100),
  state VARCHAR(2),
  zip_code VARCHAR(10),
  price NUMERIC(12,2),
  bedrooms INTEGER,
  bathrooms NUMERIC(3,1),
  sqft INTEGER,
  property_type VARCHAR(50),
  listing_date TIMESTAMPTZ,
  photo_url TEXT,
  listing_url TEXT,
  first_seen_at TIMESTAMPTZ DEFAULT NOW(),
  last_updated_at TIMESTAMPTZ DEFAULT NOW()
);
-- Alert history (audit trail for compliance)
CREATE TABLE IF NOT EXISTS alert_history (
  id SERIAL PRIMARY KEY,
  buyer_id INTEGER REFERENCES buyers(id),
  listing_mls_id VARCHAR(50),
  search_criteria_id INTEGER REFERENCES search_criteria(id),
  channel VARCHAR(10) NOT NULL, -- 'email' or 'sms'
  personalized_message TEXT,
  subject_line VARCHAR(500),
  delivery_status VARCHAR(50) DEFAULT 'pending',
  delivery_provider_id VARCHAR(255),
  sent_at TIMESTAMPTZ DEFAULT NOW(),
  delivered_at TIMESTAMPTZ,
  opened_at TIMESTAMPTZ,
  clicked_at TIMESTAMPTZ
);
-- Create indexes for performance (mls_id already has an index via its UNIQUE constraint)
CREATE INDEX idx_listings_first_seen ON listings_cache(first_seen_at);
CREATE INDEX idx_listings_price ON listings_cache(price);
CREATE INDEX idx_listings_zip ON listings_cache(zip_code);
CREATE INDEX idx_criteria_buyer ON search_criteria(buyer_id);
CREATE INDEX idx_alerts_buyer ON alert_history(buyer_id);
CREATE INDEX idx_alerts_listing ON alert_history(listing_mls_id);
CREATE INDEX idx_alerts_sent ON alert_history(sent_at);
CREATE INDEX idx_buyers_active ON buyers(is_active) WHERE is_active = TRUE;
SQLEOF
# Bring up all services (this will run the init SQL on first start)
docker compose down
docker compose up -d
# Verify database tables were created
docker exec mls-postgres psql -U n8n_user -d n8n -c '\dt'
The init-db SQL files run only on the first PostgreSQL container startup (when the data directory is empty). To re-initialize, remove the postgres-data directory: 'docker compose down && sudo rm -rf ~/mls-agent/postgres-data && docker compose up -d'. The JSONB listing_data column stores the full MLS response for each listing, enabling future query flexibility without schema changes.
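The "avoids re-alerting on known listings" behavior can be pushed into a single upsert against this schema. A sketch only (the column list is abbreviated, and the RETURNING trick relies on PostgreSQL's xmax system column, which is 0 for freshly inserted rows):

```shell
# Hypothetical poller upsert: newly inserted rows return is_new = true, updated
# rows return false, so only genuinely new listings reach the alert workflow
UPSERT_SQL=$(cat <<'SQL'
INSERT INTO listings_cache (mls_id, listing_data, price, zip_code, listing_date)
VALUES ($1, $2::jsonb, $3, $4, $5)
ON CONFLICT (mls_id) DO UPDATE
  SET listing_data = EXCLUDED.listing_data,
      price = EXCLUDED.price,
      last_updated_at = NOW()
RETURNING (xmax = 0) AS is_new;
SQL
)
echo "$UPSERT_SQL"
```

In the n8n Postgres node this replaces a separate SELECT-then-INSERT round trip and is race-safe if two poll runs ever overlap.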
Step 6: Configure n8n Credentials and API Connections
Log into the n8n web interface and configure all external service credentials: SimplyRETS (MLS data), OpenAI (LLM), SendGrid (email), Twilio (SMS), and PostgreSQL (database). These credentials are encrypted at rest using the N8N_ENCRYPTION_KEY.
- Access n8n at https://alerts.clientbrokerage.com — Login with the N8N_ADMIN_USER and N8N_ADMIN_PASSWORD from .env
- In n8n UI, navigate to: Settings > Credentials > Add Credential
- 1. PostgreSQL Credential — Name: MLS Agent Database | Host: postgres (Docker service name, not localhost) | Port: 5432 | Database: n8n | User: n8n_user | Password: <from .env POSTGRES_PASSWORD>
- 2. HTTP Header Auth (SimplyRETS) — Name: SimplyRETS API | Header Name: Authorization | Header Value: Basic <base64 of username:password from SimplyRETS dashboard>
- 3. OpenAI API Credential — Name: OpenAI GPT | API Key: sk-... (from platform.openai.com)
- 4. SendGrid API Credential — Name: SendGrid Email | API Key: SG.... (from app.sendgrid.com > Settings > API Keys)
- 5. Twilio API Credential — Name: Twilio SMS | Account SID: AC... (from Twilio Console) | Auth Token: <from Twilio Console>
echo -n 'simplyrets_user:simplyrets_pass' | base64
All credential values entered in n8n are encrypted using AES-256 with the N8N_ENCRYPTION_KEY. Never share this key. If using SimplyRETS, you will receive API credentials after completing the MLS connection process (~2-4 weeks). For Trestle (alternative), the authentication flow uses OAuth2 — use the n8n OAuth2 credential type instead. Configure SendGrid sender authentication (DKIM/SPF) at app.sendgrid.com > Settings > Sender Authentication before sending emails.
Step 7: Configure SendGrid Email Domain Authentication
Set up domain authentication in SendGrid to ensure listing alert emails are delivered to inboxes rather than spam folders. This involves adding DNS records (DKIM, SPF, DMARC) for the client's domain.
DNS propagation can take up to 48 hours. SendGrid will show green checkmarks when all records are verified. Without proper domain authentication, emails will likely land in spam. Set DMARC policy to 'none' initially and move to 'quarantine' after confirming deliverability for 2-4 weeks.
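A starting DMARC record consistent with the 'p=none first' advice above might look like the following; the domain and the rua reporting mailbox are placeholders to adjust per client:

```shell
# Example-only DMARC record; publish as a TXT record at _dmarc.<client-domain>
# once SPF and DKIM show verified in SendGrid
DMARC_RECORD='v=DMARC1; p=none; rua=mailto:dmarc-reports@clientbrokerage.com; fo=1'
echo "_dmarc.clientbrokerage.com. 3600 IN TXT \"$DMARC_RECORD\""
```

The rua address collects aggregate reports, which is how you confirm deliverability before tightening the policy to 'quarantine'.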
Step 8: Register Twilio A2P 10DLC for SMS Compliance
Register the client's business and messaging campaign with Twilio for A2P (Application-to-Person) 10DLC compliance. This is MANDATORY for sending SMS alerts in the United States — unregistered messages will be blocked by carriers.
A2P 10DLC registration approval takes 1-5 business days. The campaign description MUST accurately reflect the use case. Misleading descriptions can result in rejection or later suspension. Keep the signed buyer consent forms on file — Twilio and carriers may audit. The Twilio phone number must be associated with the registered campaign before sending any messages.
Step 9: Import Buyer Profiles and Search Criteria
Load initial buyer profiles and their search criteria into the PostgreSQL database. This can be done via a bulk CSV import or through the CRM sync workflow (configured in Step 12). For initial deployment, a CSV import is fastest.
# Create a CSV template for the client to fill out
cat > ~/mls-agent/buyer_import_template.csv << 'CSVEOF'
first_name,last_name,email,phone,assigned_agent_name,assigned_agent_email,email_consent,sms_consent,min_price,max_price,min_bedrooms,min_bathrooms,property_types,zip_codes,cities,must_have_features
John,Smith,john@example.com,+15125551234,Jane Agent,jane@brokerage.com,true,true,350000,500000,3,2,"SingleFamily;Townhouse","78704;78745","Austin","garage"
CSVEOF
# After client fills out the CSV, import using a psql command:
# First, copy the file into the container
docker cp ~/mls-agent/buyers_filled.csv mls-postgres:/tmp/buyers.csv
# Run the import script
docker exec -i mls-postgres psql -U n8n_user -d n8n << 'IMPORTSQL'
-- Create temporary import table
CREATE TEMP TABLE buyer_import (
  first_name VARCHAR(100),
  last_name VARCHAR(100),
  email VARCHAR(255),
  phone VARCHAR(20),
  assigned_agent_name VARCHAR(200),
  assigned_agent_email VARCHAR(255),
  email_consent BOOLEAN,
  sms_consent BOOLEAN,
  min_price NUMERIC,
  max_price NUMERIC,
  min_bedrooms INTEGER,
  min_bathrooms NUMERIC,
  property_types TEXT,
  zip_codes TEXT,
  cities TEXT,
  must_have_features TEXT
);
\COPY buyer_import FROM '/tmp/buyers.csv' WITH (FORMAT csv, HEADER true);
-- Insert buyers
INSERT INTO buyers (first_name, last_name, email, phone, assigned_agent_name, assigned_agent_email, email_consent, email_consent_timestamp, sms_consent, sms_consent_timestamp)
SELECT first_name, last_name, email, phone, assigned_agent_name, assigned_agent_email, email_consent, CASE WHEN email_consent THEN NOW() END, sms_consent, CASE WHEN sms_consent THEN NOW() END
FROM buyer_import;
-- Insert search criteria
INSERT INTO search_criteria (buyer_id, min_price, max_price, min_bedrooms, min_bathrooms, property_types, zip_codes, cities, must_have_features)
SELECT b.id, bi.min_price, bi.max_price, bi.min_bedrooms, bi.min_bathrooms,
  string_to_array(bi.property_types, ';'),
  string_to_array(bi.zip_codes, ';'),
  string_to_array(bi.cities, ';'),
  string_to_array(bi.must_have_features, ';')
FROM buyer_import bi
JOIN buyers b ON b.email = bi.email;
DROP TABLE buyer_import;
IMPORTSQL
# Verify import
docker exec mls-postgres psql -U n8n_user -d n8n -c 'SELECT count(*) as buyer_count FROM buyers;'
docker exec mls-postgres psql -U n8n_user -d n8n -c 'SELECT count(*) as criteria_count FROM search_criteria;'
Phone numbers must be in E.164 format (+1XXXXXXXXXX). Property types should match RESO standard values: SingleFamily, Condo, Townhouse, MultiFamily, Land, etc. For ongoing buyer management after the initial import, the CRM sync workflow (Step 12) will handle automatic updates. Verify that every buyer with sms_consent=true has a documented signed consent form on file.
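Since the import itself rejects nothing, a quick normalization pass on the client's spreadsheet before import avoids bad phone data. A sketch of a US-number-to-E.164 helper (a deliberately narrow assumption: 10- or 11-digit US numbers only):

```shell
# Normalize a US phone number to E.164 (+1XXXXXXXXXX); fails (nonzero) on
# anything that is not a plausible 10/11-digit US number
to_e164() {
  digits=$(printf '%s' "$1" | tr -cd '0-9')
  case "$digits" in
    1??????????) printf '+%s\n' "$digits" ;;   # 11 digits starting with 1
    ??????????)  printf '+1%s\n' "$digits" ;;  # bare 10 digits
    *) return 1 ;;
  esac
}
to_e164 '(512) 555-1234'   # prints +15125551234
```

Run it over the phone column (e.g., in a small awk loop) and flag any rows where it fails rather than importing them.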
Step 10: Build and Deploy the MLS Polling Workflow in n8n
Create the core n8n workflow that polls the SimplyRETS API on a schedule, identifies new listings, and caches them in PostgreSQL. This is the data ingestion layer of the autonomous agent.
- This workflow is built entirely in the n8n visual editor.
- Navigate to https://alerts.clientbrokerage.com
- Click 'Add Workflow' > Name it 'MLS Listing Poller'
- The workflow JSON can be imported directly into n8n: In n8n, click the '...' menu > 'Import from URL' or paste the JSON below
- See the custom_ai_components section for the complete workflow JSON
The MLS Poller workflow runs every 10 minutes via a Cron trigger. It queries SimplyRETS for listings modified in the last 15 minutes (with overlap for safety), compares against the listings_cache table to identify truly new listings, and inserts new ones. The 5-minute overlap between the 10-minute poll interval and 15-minute lookback window ensures no listings are missed due to API latency or clock drift. Adjust the polling interval based on MLS data volume — busy markets may benefit from 5-minute intervals.
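The lookback window described above translates into a timestamped SimplyRETS query roughly like the following. This is a sketch only: the 'minlistdate' parameter name and response paging behavior are assumptions to verify against the SimplyRETS API docs, and in production n8n's HTTP Request node assembles this URL rather than a shell script:

```shell
# Compute a 15-minute lookback timestamp (UTC, GNU date) and build the poll URL
SINCE=$(date -u -d '15 minutes ago' '+%Y-%m-%dT%H:%M:%SZ')
POLL_URL="https://api.simplyrets.com/properties?status=Active&limit=500&minlistdate=${SINCE}"
echo "$POLL_URL"
# In production the request would be authenticated with the SimplyRETS creds:
# curl -s -u "$SRETS_USER:$SRETS_PASS" "$POLL_URL"
```

Each returned listing is then checked against listings_cache by mls_id before any alert logic fires.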
Step 11: Build and Deploy the Buyer Matching and Alert Workflow in n8n
Create the n8n workflow that takes newly cached listings, matches them against all active buyer search criteria, generates personalized messages using GPT-5.4 mini, and dispatches email and SMS alerts. This is the core autonomous agent logic.
- This workflow is built in the n8n visual editor.
- Click 'Add Workflow' > Name it 'Buyer Matcher & Alert Dispatcher'
- The workflow is triggered by the MLS Listing Poller workflow (via n8n's Execute Workflow node) or can run on its own Cron schedule.
- See the custom_ai_components section for the complete workflow specification and prompt templates.
This workflow contains the most complex logic. The matching engine runs a SQL query for each new listing that finds all buyers whose criteria match. The LLM personalization step uses a carefully engineered prompt that references both the listing details and the buyer's specific preferences to create a natural, helpful message. Rate limiting is important: the workflow includes a 200ms delay between OpenAI API calls and a 100ms delay between notification dispatches to avoid hitting rate limits.
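A sketch of what that per-listing matching query could look like against the Step 5 schema. The parameter binding and the NULL-as-wildcard handling are illustrative (the production query lives in the n8n Postgres node and covers every criteria column, not just these four):

```shell
# $1=price, $2=bedrooms, $3=zip_code, $4=property_type of the new listing;
# NULL criteria columns are treated as "no restriction"
MATCH_SQL=$(cat <<'SQL'
SELECT b.id AS buyer_id, b.first_name, b.email, b.phone, sc.id AS criteria_id
FROM buyers b
JOIN search_criteria sc ON sc.buyer_id = b.id AND sc.is_active
WHERE b.is_active
  AND $1::numeric BETWEEN COALESCE(sc.min_price, 0) AND COALESCE(sc.max_price, 999999999)
  AND ($2::int >= sc.min_bedrooms OR sc.min_bedrooms IS NULL)
  AND ($3 = ANY(sc.zip_codes) OR sc.zip_codes IS NULL)
  AND ($4 = ANY(sc.property_types) OR sc.property_types IS NULL);
SQL
)
echo "$MATCH_SQL"
```

Running the match in SQL keeps the LLM step cheap: only buyer-listing pairs that survive this filter ever generate a personalization call.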
Step 12: Build and Deploy the CRM Sync Workflow
Create an n8n workflow that synchronizes buyer profiles and search criteria from Follow Up Boss (or alternative CRM) into the PostgreSQL database. This runs on a schedule and via webhooks to keep buyer data current without manual CSV imports.
- This workflow is built in the n8n visual editor.
- Click 'Add Workflow' > Name it 'CRM Buyer Sync (Follow Up Boss)'
- For Follow Up Boss API integration: Base URL: https://api.followupboss.com/v1
- Authentication: HTTP Basic Auth (API key as username, blank password)
- Key endpoints used: GET /v1/people?sort=updated&limit=100&offset=0 — List contacts
- GET /v1/people/{id} — Get single contact details
- GET /v1/people/{id}/customFields — Get custom fields (buyer criteria)
- Webhook setup (for real-time sync): In Follow Up Boss: Admin > API > Webhooks
- Add webhook URL: https://alerts.clientbrokerage.com/webhook/fub-buyer-update
- Events: person.created, person.updated
- See custom_ai_components section for the workflow specification.
Follow Up Boss requires Pro ($499/mo for 10 users) or Platform ($1,000/mo for 30 users) plan for API access. Buyer search criteria should be stored in FUB custom fields. Work with the client's team to set up these custom fields: 'AI Alert Min Price', 'AI Alert Max Price', 'AI Alert Bedrooms', 'AI Alert Zip Codes', 'AI Alert Property Types', etc. If the client uses a different CRM, the API integration will differ but the data flow pattern is identical.
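The basic-auth scheme above (API key as username, blank password) reduces to a single header; a sketch with a hypothetical key:

```shell
# FUB authenticates with the API key as the basic-auth username and an empty
# password; 'fka_example' is a placeholder, not a real key
FUB_API_KEY='fka_example'
AUTH_HEADER="Authorization: Basic $(printf '%s:' "$FUB_API_KEY" | base64)"
echo "$AUTH_HEADER"
# curl -s -H "$AUTH_HEADER" 'https://api.followupboss.com/v1/people?sort=updated&limit=100'
```

In n8n this is what the HTTP Basic Auth credential produces under the hood, so the snippet is mainly useful for testing access from the command line.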
Step 13: Configure Automated Backups
Set up automated daily backups of the PostgreSQL database and n8n workflow configurations. Critical for disaster recovery and client data protection.
cd ~/mls-agent
# Create backup script
cat > backups/backup.sh << 'BACKUPEOF'
#!/bin/bash
BACKUP_DIR=/home/mspadmin/mls-agent/backups
DATE=$(date +%Y%m%d_%H%M%S)
# Backup PostgreSQL
docker exec mls-postgres pg_dump -U n8n_user -d n8n --format=custom > "$BACKUP_DIR/db_$DATE.dump"
# Backup n8n workflows (export via REST API; newer n8n releases require an API
# key header, e.g. -H "X-N8N-API-KEY: <key>", instead of basic auth)
curl -s -u "$N8N_ADMIN_USER:$N8N_ADMIN_PASSWORD" https://alerts.clientbrokerage.com/api/v1/workflows > "$BACKUP_DIR/workflows_$DATE.json"
# Backup n8n credentials metadata (not secrets)
curl -s -u "$N8N_ADMIN_USER:$N8N_ADMIN_PASSWORD" https://alerts.clientbrokerage.com/api/v1/credentials > "$BACKUP_DIR/credentials_meta_$DATE.json"
# Compress
tar -czf "$BACKUP_DIR/full_backup_$DATE.tar.gz" "$BACKUP_DIR/db_$DATE.dump" "$BACKUP_DIR/workflows_$DATE.json" "$BACKUP_DIR/credentials_meta_$DATE.json"
# Clean up individual files
rm -f "$BACKUP_DIR/db_$DATE.dump" "$BACKUP_DIR/workflows_$DATE.json" "$BACKUP_DIR/credentials_meta_$DATE.json"
# Retain last 30 days of backups
find "$BACKUP_DIR" -name 'full_backup_*.tar.gz' -mtime +30 -delete
echo "Backup completed: full_backup_$DATE.tar.gz"
BACKUPEOF
chmod +x backups/backup.sh
# Source env vars for the backup script
cat > backups/backup_env.sh << 'BENVEOF'
export N8N_ADMIN_USER=$(grep N8N_ADMIN_USER /home/mspadmin/mls-agent/.env | cut -d= -f2)
export N8N_ADMIN_PASSWORD=$(grep N8N_ADMIN_PASSWORD /home/mspadmin/mls-agent/.env | cut -d= -f2)
BENVEOF
# Schedule daily backup at 2 AM
(crontab -l 2>/dev/null; echo '0 2 * * * . /home/mspadmin/mls-agent/backups/backup_env.sh && /home/mspadmin/mls-agent/backups/backup.sh >> /home/mspadmin/mls-agent/backups/backup.log 2>&1') | crontab -
# Optional: sync backups to offsite storage (S3, Backblaze B2, etc.)
# apt install -y rclone
# rclone config (configure your preferred cloud storage)
# Add to cron: rclone sync ~/mls-agent/backups remote:mls-backups
Backups should be tested monthly by restoring to the staging server. The n8n workflow export via API captures all workflow definitions but NOT the encrypted credential values; those are only recoverable from the PostgreSQL backup combined with the N8N_ENCRYPTION_KEY. For clients with stricter data-protection requirements, encrypt backups at rest with gpg before offsite sync.
Step 14: Set Up Monitoring and Health Checks
Configure uptime monitoring, resource alerts, and workflow execution monitoring to ensure the autonomous agent runs reliably. The MSP needs visibility into system health without constantly SSHing into the server.
# Option A: Use UptimeRobot (free tier: 50 monitors, 5-min intervals)
# https://uptimerobot.com
# Add HTTP(S) monitor: https://alerts.clientbrokerage.com (expect 200 or 401)
# Add keyword monitor: Check n8n health endpoint
# Option B: Simple self-hosted health check script
cat > ~/mls-agent/healthcheck.sh << 'HCEOF'
#!/bin/bash
# Check if all containers are running
for container in mls-n8n mls-postgres mls-nginx; do
  if ! docker inspect -f '{{.State.Running}}' $container 2>/dev/null | grep -q true; then
    echo "ALERT: $container is not running!" | mail -s "MLS Agent: Container Down" msp-alerts@yourmsp.com
    docker compose -f /home/mspadmin/mls-agent/docker-compose.yml up -d
  fi
done
# Check disk usage
DISK_USAGE=$(df -h / | awk 'NR==2{print $5}' | tr -d '%')
if [ "$DISK_USAGE" -gt 85 ]; then
  echo "ALERT: Disk usage at ${DISK_USAGE}%" | mail -s "MLS Agent: Disk Warning" msp-alerts@yourmsp.com
fi
# Check whether the MLS poller cached any listings in the last 30 minutes
NEW_COUNT=$(docker exec mls-postgres psql -U n8n_user -d n8n -t -c "SELECT COUNT(*) FROM listings_cache WHERE first_seen_at > NOW() - INTERVAL '30 minutes';")
if [ "$(echo $NEW_COUNT | tr -d ' ')" = "0" ]; then
  echo "WARNING: No new listings cached in 30 minutes. MLS poller may be stalled." | mail -s "MLS Agent: Poller Warning" msp-alerts@yourmsp.com
fi
HCEOF
chmod +x ~/mls-agent/healthcheck.sh
# Run health check every 10 minutes
(crontab -l 2>/dev/null; echo '*/10 * * * * /home/mspadmin/mls-agent/healthcheck.sh >> /home/mspadmin/mls-agent/healthcheck.log 2>&1') | crontab -
# Install msmtp as a sendmail-compatible MTA, plus bsd-mailx for the 'mail' command used by the health check
sudo apt install -y msmtp msmtp-mta bsd-mailx
# Configure msmtp for alert emails via SendGrid SMTP
cat > ~/.msmtprc << 'MSMTPEOF'
account default
host smtp.sendgrid.net
port 587
auth on
user apikey
password SG.YOUR_SENDGRID_API_KEY
tls on
tls_trust_file /etc/ssl/certs/ca-certificates.crt
from alerts@clientbrokerage.com
logfile ~/.msmtp.log
MSMTPEOF
chmod 600 ~/.msmtprc

The health check script is a minimal monitoring solution. For production MSP operations, consider integrating with your existing RMM/PSA tool (ConnectWise, Datto, NinjaRMM) via their agent or API. The 'no new listings in 30 minutes' check may false-alarm during off-hours in slow markets — adjust the threshold or add time-of-day logic. UptimeRobot's free tier is sufficient for basic uptime monitoring.
Custom AI Components
MLS Listing Poller Workflow
Type: workflow Core n8n workflow that polls the SimplyRETS API every 10 minutes for new and updated listings, deduplicates against the local cache, and stores new listings in PostgreSQL. This workflow feeds the buyer matching engine. It handles pagination, error recovery, and data normalization from RESO-format JSON into the local schema.
n8n Workflow: MLS Listing Poller
Node 1: Cron Trigger
- Type: Schedule Trigger
- Settings: Run every 10 minutes
- Cron expression: */10 * * * *
Node 2: Set Timestamp Variables
- Type: Set
- Fields:
  - lookback_minutes = 15
  - since_timestamp = {{ $now.minus({minutes: 15}).toISO() }}
  - page_offset = 0
  - page_limit = 200
Node 3: HTTP Request — SimplyRETS API
- Type: HTTP Request
- Method: GET
- URL: https://api.simplyrets.com/properties
- Authentication: Use 'SimplyRETS API' credential (HTTP Header Auth)
- Query Parameters:
  - lastUpdate: {{ $json.since_timestamp }}
  - limit: {{ $json.page_limit }}
  - offset: {{ $json.page_offset }}
  - status: Active
- Header — Accept: application/json
- Options — Response Format: JSON
- Options — Timeout: 30000ms
- Options — Continue On Fail: true
Node 4: Check for Empty Results
- Type: IF
- Condition: {{ $json.length > 0 }} → True continues, False ends
Node 5: Loop Over Listings
- Type: Split In Batches
- Batch Size: 1
Node 6: Check if Listing Exists in Cache
- Type: PostgreSQL
- Credential: MLS Agent Database
- Operation: Execute Query
SELECT mls_id FROM listings_cache WHERE mls_id = '{{ $json.mlsId }}';

Node 7: IF — New Listing Check
- Type: IF
- Condition: {{ $json.length === 0 }} (no existing record = new listing)
Node 8: Normalize Listing Data (Code Node)
- Type: Code
- Language: JavaScript
// SimplyRETS RESO-format JSON normalization into local schema
const listing = $input.first().json;
// SimplyRETS response normalization
const normalized = {
mls_id: listing.mlsId || listing.listingId,
address: [
listing.address?.streetNumberText,
listing.address?.streetName,
listing.address?.streetSuffix
].filter(Boolean).join(' '),
city: listing.address?.city || '',
state: listing.address?.state || '',
zip_code: listing.address?.postalCode || '',
price: listing.listPrice || 0,
bedrooms: listing.property?.bedrooms || 0,
bathrooms: (listing.property?.bathsFull || 0) + (listing.property?.bathsHalf || 0) * 0.5, // guard each term so a missing value yields 0, not NaN
sqft: listing.property?.area || 0,
lot_sqft: listing.property?.lotSize || 0,
property_type: listing.property?.type || 'Unknown',
year_built: listing.property?.yearBuilt || null,
listing_date: listing.listDate || new Date().toISOString(),
photo_url: listing.photos?.[0] || '',
listing_url: `https://www.simplyrets.com/listings/${listing.mlsId}`,
hoa_fee: listing.association?.fee || null,
description: listing.remarks || '',
features: [
listing.property?.parking?.description,
listing.property?.pool ? 'pool' : null,
listing.property?.waterFront ? 'waterfront' : null,
listing.property?.garage ? 'garage' : null,
].filter(Boolean),
listing_data: JSON.stringify(listing)
};
return [{ json: normalized }];

Node 9: Insert New Listing into Cache
- Type: PostgreSQL
- Credential: MLS Agent Database
- Operation: Execute Query
INSERT INTO listings_cache (mls_id, address, city, state, zip_code, price, bedrooms, bathrooms, sqft, property_type, listing_date, photo_url, listing_url, listing_data)
VALUES (
'{{ $json.mls_id }}',
'{{ $json.address.replace(/'/g, "''") }}',
'{{ $json.city }}',
'{{ $json.state }}',
'{{ $json.zip_code }}',
{{ $json.price }},
{{ $json.bedrooms }},
{{ $json.bathrooms }},
{{ $json.sqft }},
'{{ $json.property_type }}',
'{{ $json.listing_date }}',
'{{ $json.photo_url }}',
'{{ $json.listing_url }}',
'{{ $json.listing_data.replace(/'/g, "''") }}'
)
ON CONFLICT (mls_id) DO UPDATE SET
price = EXCLUDED.price,
listing_data = EXCLUDED.listing_data,
last_updated_at = NOW();

Node 10: Trigger Buyer Matcher
- Type: Execute Workflow
- Workflow: 'Buyer Matcher & Alert Dispatcher'
- Pass data:
{ "new_listing_mls_id": "{{ $json.mls_id }}" }
Node 11: Pagination Check (after batch completes)
- Type: IF
- Condition: {{ $json.length === 200 }} (full page = may be more results)
- True: Loop back to Node 3 with offset += 200
- False: End workflow
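The Node 8 normalization logic can be exercised outside n8n. A sketch with a minimal, illustrative SimplyRETS-shaped object (the sample address, MLS number, and values are made up):

```javascript
// Minimal object shaped like a SimplyRETS /properties item
// (field names follow the normalization node; values are illustrative).
const listing = {
  mlsId: 'TX123456',
  address: {
    streetNumberText: '2104', streetName: 'Oak Hollow', streetSuffix: 'Dr',
    city: 'Austin', state: 'TX', postalCode: '78704'
  },
  listPrice: 485000,
  property: { bedrooms: 3, bathsFull: 2, bathsHalf: 1, area: 1850, type: 'SingleFamily' }
};

// Address assembly, as in the Code node
const address = [
  listing.address?.streetNumberText,
  listing.address?.streetName,
  listing.address?.streetSuffix
].filter(Boolean).join(' ');

// Bathroom count: guard each term so a missing bathsFull or bathsHalf
// yields 0 instead of NaN
const bathrooms = (listing.property?.bathsFull || 0) + (listing.property?.bathsHalf || 0) * 0.5;

console.log(address, bathrooms); // → 2104 Oak Hollow Dr 2.5
```

Running this with fields deliberately removed (e.g. no bathsHalf) is a quick way to confirm the fallbacks behave before pointing the workflow at live MLS data.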
Error Handling
- Node 3 has 'Continue On Fail' enabled
- Add an Error Trigger workflow that sends an alert email to MSP when the poller fails 3 consecutive times
- SimplyRETS rate limits: 500 requests/hour standard. At 6 polls/hour × ~1-2 requests each = well within limits
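The request budget can be sanity-checked with quick arithmetic. The 400-updates-per-window figure below is a pessimistic assumption for a busy market, not a measured value:

```javascript
// SimplyRETS request-budget check for the poller.
// Assumptions: */10 cron, 200-listing pages, up to 400 new/updated
// listings per 10-minute window (deliberately pessimistic).
const pollsPerHour = 60 / 10;
const updatesPerWindow = 400;
const pageLimit = 200;
// Full pages needed, plus one final short/empty page to detect the end
const requestsPerPoll = Math.ceil(updatesPerWindow / pageLimit) + 1;
const hourlyRequests = pollsPerHour * requestsPerPoll;
console.log(hourlyRequests); // → 18, comfortably under the 500/hour standard limit
```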
Buyer Matching Engine
Type: agent The core autonomous agent logic that matches newly cached MLS listings against all active buyer search criteria stored in PostgreSQL. For each match, it triggers the personalization and notification pipeline. The matching uses SQL-based criteria evaluation for speed and reliability, with support for geographic radius matching, price ranges, property features, and exclusion filters.
Implementation
n8n Workflow: Buyer Matcher & Alert Dispatcher
Trigger: Called by MLS Listing Poller via Execute Workflow node, receiving new_listing_mls_id.
- Node 1: Webhook / Execute Workflow Trigger — Type: Execute Workflow Trigger | Input: new_listing_mls_id
- Node 2: Fetch Listing Details — Type: PostgreSQL | Credential: MLS Agent Database | Operation: Execute Query (query below)
- Node 3: Find Matching Buyers — Type: PostgreSQL | Credential: MLS Agent Database | Operation: Execute Query (query below)
- Node 4: Check for Matches — Type: IF | Condition: {{ $json.length > 0 }} → True: continue to personalization
- Node 5: Split Into Batches (Per Buyer) — Type: Split In Batches | Batch Size: 1
- Node 6: Build LLM Context (Code Node) — Type: Code | See 'Listing Alert Personalization Prompt' component
- Node 7: OpenAI — Generate Personalized Message — Type: OpenAI | Credential: OpenAI GPT | Resource: Chat Completion | Model: gpt-5.4-mini | Messages: System prompt + user prompt from Node 6 | Temperature: 0.7 | Max Tokens: 500
- Node 8: Parse Structured Response (Code Node) — Type: Code (code below)
- Node 9: IF — Email Consent Check — Type: IF | Condition: {{ $json.email_consent === true && $json.email }}
- Node 10: Send Email via SendGrid — Type: SendGrid | Credential: SendGrid Email | Resource: Email | From Email: listings@clientbrokerage.com | From Name: {{ $json.assigned_agent_name }} | [Brokerage Name] | To: {{ $json.email }} | Subject: {{ $json.subject }} | HTML Content: Use the email template from 'Email Alert Template' component | Reply To: {{ $json.assigned_agent_email }}
- Node 11: Log Email Alert — Type: PostgreSQL | Operation: Execute Query (query below)
- Node 12: Wait 200ms (Rate Limiter) — Type: Wait | Duration: 200ms
- Node 13: IF — SMS Consent Check — Type: IF | Condition: {{ $json.sms_consent === true && $json.phone }}
- Node 14: Send SMS via Twilio — Type: Twilio | Credential: Twilio SMS | Resource: SMS | From: [Twilio phone number] | To: {{ $json.phone }} | Body: {{ $json.smsBody }}
- Node 15: Log SMS Alert — Type: PostgreSQL | Operation: Execute Query (query below)
- Node 16: Wait 100ms (SMS Rate Limiter) — Type: Wait | Duration: 100ms
- Node 17: Loop back to Node 5 for next buyer match
Node 2 Query — Fetch Listing Details:

SELECT * FROM listings_cache WHERE mls_id = '{{ $json.new_listing_mls_id }}';

Node 3 Query — Find Matching Buyers:

SELECT
b.id AS buyer_id,
b.first_name,
b.last_name,
b.email,
b.phone,
b.assigned_agent_name,
b.assigned_agent_email,
b.assigned_agent_phone,
b.email_consent,
b.sms_consent,
sc.id AS criteria_id,
sc.criteria_name,
sc.min_price,
sc.max_price,
sc.min_bedrooms,
sc.min_bathrooms,
sc.property_types,
sc.zip_codes,
sc.cities,
sc.must_have_features,
sc.exclude_features,
sc.custom_notes
FROM search_criteria sc
JOIN buyers b ON b.id = sc.buyer_id
WHERE b.is_active = TRUE
AND sc.is_active = TRUE
-- Price range match
AND (sc.min_price IS NULL OR {{ $json.price }} >= sc.min_price)
AND (sc.max_price IS NULL OR {{ $json.price }} <= sc.max_price)
-- Bedroom match
AND (sc.min_bedrooms IS NULL OR {{ $json.bedrooms }} >= sc.min_bedrooms)
-- Bathroom match
AND (sc.min_bathrooms IS NULL OR {{ $json.bathrooms }} >= sc.min_bathrooms)
-- Square footage match
AND (sc.min_sqft IS NULL OR {{ $json.sqft }} >= sc.min_sqft)
AND (sc.max_sqft IS NULL OR {{ $json.sqft }} <= sc.max_sqft)
-- Property type match (if specified)
AND (sc.property_types IS NULL OR '{{ $json.property_type }}' = ANY(sc.property_types))
-- Location match: zip codes
AND (sc.zip_codes IS NULL OR '{{ $json.zip_code }}' = ANY(sc.zip_codes))
-- Location match: cities (case-insensitive)
AND (sc.cities IS NULL OR LOWER('{{ $json.city }}') = ANY(SELECT LOWER(unnest(sc.cities))))
-- Deduplication: haven't already sent this listing to this buyer
AND NOT EXISTS (
SELECT 1 FROM alert_history ah
WHERE ah.buyer_id = b.id
AND ah.listing_mls_id = '{{ $json.mls_id }}'
)
ORDER BY b.id;

Node 8 Code — Parse Structured Response:

const response = $input.first().json.message.content;
const lines = response.split('\n');
let subject = '';
let emailBody = '';
let smsBody = '';
// Parse structured response
for (const line of lines) {
if (line.startsWith('SUBJECT:')) subject = line.replace('SUBJECT:', '').trim();
if (line.startsWith('EMAIL_BODY:')) emailBody = response.split('EMAIL_BODY:')[1].split('SMS_BODY:')[0].trim();
if (line.startsWith('SMS_BODY:')) smsBody = response.split('SMS_BODY:')[1].trim();
}
return [{ json: { subject, emailBody, smsBody, ...($input.first().json) } }];

Node 11 Query — Log Email Alert:

INSERT INTO alert_history (buyer_id, listing_mls_id, search_criteria_id, channel, personalized_message, subject_line, delivery_status)
VALUES ({{ $json.buyer_id }}, '{{ $json.mls_id }}', {{ $json.criteria_id }}, 'email', '{{ $json.emailBody.replace(/'/g, "''") }}', '{{ $json.subject.replace(/'/g, "''") }}', 'sent');

Node 15 Query — Log SMS Alert:

INSERT INTO alert_history (buyer_id, listing_mls_id, search_criteria_id, channel, personalized_message, delivery_status)
VALUES ({{ $json.buyer_id }}, '{{ $json.mls_id }}', {{ $json.criteria_id }}, 'sms', '{{ $json.smsBody.replace(/'/g, "''") }}', 'sent');

Listing Alert Personalization Prompt
Type: prompt The system and user prompts sent to GPT-5.4 mini to generate personalized listing alert messages. The prompt references the specific listing details and the buyer's search criteria to create natural, helpful, non-generic alert text. It generates three outputs: email subject line, email body, and SMS body. The prompt includes Fair Housing compliance guardrails.
Implementation:
System Prompt
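The production system prompt is not reproduced here. The sketch below is an illustrative draft consistent with the SUBJECT / EMAIL_BODY / SMS_BODY format the parse node expects and the Fair Housing guardrails described in this component; adapt it before use:

```javascript
// Illustrative system prompt (a sketch, not the production prompt).
const systemPrompt = `You are a listing alert assistant writing on behalf of a buyer's agent.

Output EXACTLY three sections, each starting on its own line:
SUBJECT: <email subject line, under 80 characters>
EMAIL_BODY: <2-4 short paragraphs tying the listing to the buyer's stated criteria>
SMS_BODY: <under 300 characters, ending with "Reply STOP to opt out">

Rules:
- Use only facts present in the listing and buyer data; never invent details.
- FAIR HOUSING: never reference race, color, religion, national origin, sex,
  familial status, or disability; never characterize neighborhoods by their
  demographics, safety, or school quality. Describe the property, not the people.`;
```

Keeping the output contract and the guardrails in the same prompt means any model swap only has to preserve these two constraints to keep the parser and the compliance audit working.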
// Node 6: Build LLM Context
const listing = $('Fetch Listing Details').first().json;
const buyer = $input.first().json;
const userPrompt = `
Generate a personalized listing alert for this buyer and property:
BUYER INFORMATION:
- Name: ${buyer.first_name} ${buyer.last_name}
- Search Name: ${buyer.criteria_name || 'Primary Search'}
- Price Range: $${buyer.min_price?.toLocaleString() || 'Any'} - $${buyer.max_price?.toLocaleString() || 'Any'}
- Bedrooms Wanted: ${buyer.min_bedrooms || 'Any'}+
- Bathrooms Wanted: ${buyer.min_bathrooms || 'Any'}+
- Preferred Areas: ${buyer.cities?.join(', ') || buyer.zip_codes?.join(', ') || 'Not specified'}
- Must-Have Features: ${buyer.must_have_features?.join(', ') || 'None specified'}
- Property Types: ${buyer.property_types?.join(', ') || 'Any'}
- Additional Notes: ${buyer.custom_notes || 'None'}
NEW LISTING:
- Address: ${listing.address}, ${listing.city}, ${listing.state} ${listing.zip_code}
- Price: $${listing.price?.toLocaleString()}
- Bedrooms: ${listing.bedrooms}
- Bathrooms: ${listing.bathrooms}
- Square Feet: ${listing.sqft?.toLocaleString()}
- Property Type: ${listing.property_type}
- Year Built: ${listing.year_built || 'Unknown'}
- Description: ${listing.description?.substring(0, 500) || 'No description available'}
- Listing Photo: ${listing.photo_url || 'Not available'}
AGENT:
- Name: ${buyer.assigned_agent_name}
- Phone: ${buyer.assigned_agent_phone}
- Email: ${buyer.assigned_agent_email}
Generate the SUBJECT, EMAIL_BODY, and SMS_BODY now.
`;
return [{ json: { ...buyer, mls_id: listing.mls_id, price: listing.price, address: listing.address, city: listing.city, photo_url: listing.photo_url, listing_url: listing.listing_url, userPrompt } }];

Example Output
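An illustrative model response in the expected format (sample text, not captured model output), fed through the same parsing logic as the parse code node:

```javascript
// Sample response following the SUBJECT / EMAIL_BODY / SMS_BODY contract
// (buyer name, address, and prices are made up for illustration).
const response = [
  'SUBJECT: New in 78704: the 3BR with a garage you asked about',
  'EMAIL_BODY: Hi Dana, a new listing at 2104 Oak Hollow Dr just hit the market',
  'at $485,000, inside your range and with the garage on your must-have list.',
  'Want me to set up a showing this week?',
  'SMS_BODY: New listing: 2104 Oak Hollow Dr, Austin. $485,000, 3BR/2.5BA.',
  'Reply STOP to opt out.'
].join('\n');

// Same parsing approach as the workflow's parse node
let subject = '', emailBody = '', smsBody = '';
for (const line of response.split('\n')) {
  if (line.startsWith('SUBJECT:')) subject = line.replace('SUBJECT:', '').trim();
  if (line.startsWith('EMAIL_BODY:')) emailBody = response.split('EMAIL_BODY:')[1].split('SMS_BODY:')[0].trim();
  if (line.startsWith('SMS_BODY:')) smsBody = response.split('SMS_BODY:')[1].trim();
}
console.log(subject);
```

If the model drifts from this contract (a missing section, a renamed label), the parsed fields come back empty, so the LLM Personalization test below should assert on all three fields, not just the subject.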
Email Alert HTML Template
Type: integration Responsive HTML email template for listing alert emails. Branded with the client's brokerage information, includes property photo, key details, call-to-action button, agent contact info, MLS attribution, and CAN-SPAM compliant unsubscribe link. Used by the SendGrid node in the Buyer Matcher workflow.
Implementation:
In n8n's SendGrid node, set Content Type to HTML and use the following template. Replace placeholders with n8n expressions.
<!-- used in the n8n SendGrid node with Content Type set to HTML -->
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>New Listing Alert</title>
</head>
<body style="margin:0;padding:0;background-color:#f4f4f4;font-family:Arial,Helvetica,sans-serif;">
<table role="presentation" width="100%" cellpadding="0" cellspacing="0" style="max-width:600px;margin:0 auto;background:#ffffff;">
<!-- Header -->
<tr>
<td style="background-color:#1a365d;padding:20px;text-align:center;">
<h1 style="color:#ffffff;margin:0;font-size:22px;">🏡 New Listing Alert</h1>
<p style="color:#a0c4ff;margin:5px 0 0;font-size:14px;">From {{ $json.assigned_agent_name }} at [Brokerage Name]</p>
</td>
</tr>
<!-- Property Photo -->
<tr>
<td style="padding:0;">
<img src="{{ $json.photo_url }}" alt="Property Photo" style="width:100%;height:auto;display:block;max-height:300px;object-fit:cover;" />
</td>
</tr>
<!-- Property Details -->
<tr>
<td style="padding:20px;">
<h2 style="color:#1a365d;margin:0 0 10px;font-size:20px;">{{ $json.address }}, {{ $json.city }}</h2>
<table role="presentation" width="100%" cellpadding="8" cellspacing="0" style="background:#f0f7ff;border-radius:8px;margin:10px 0;">
<tr>
<td style="text-align:center;border-right:1px solid #d0e4ff;">
<strong style="font-size:20px;color:#1a365d;">${{ $json.price?.toLocaleString() }}</strong><br>
<span style="font-size:12px;color:#666;">Price</span>
</td>
<td style="text-align:center;border-right:1px solid #d0e4ff;">
<strong style="font-size:20px;color:#1a365d;">{{ $json.bedrooms }}</strong><br>
<span style="font-size:12px;color:#666;">Beds</span>
</td>
<td style="text-align:center;border-right:1px solid #d0e4ff;">
<strong style="font-size:20px;color:#1a365d;">{{ $json.bathrooms }}</strong><br>
<span style="font-size:12px;color:#666;">Baths</span>
</td>
<td style="text-align:center;">
<strong style="font-size:20px;color:#1a365d;">{{ $json.sqft?.toLocaleString() }}</strong><br>
<span style="font-size:12px;color:#666;">Sq Ft</span>
</td>
</tr>
</table>
<!-- Personalized Message -->
<div style="padding:15px 0;font-size:15px;line-height:1.6;color:#333;">
{{ $json.emailBody }}
</div>
<!-- CTA Button -->
<table role="presentation" width="100%" cellpadding="0" cellspacing="0">
<tr>
<td style="text-align:center;padding:15px 0;">
<a href="{{ $json.listing_url }}" style="background-color:#2563eb;color:#ffffff;padding:14px 30px;text-decoration:none;border-radius:6px;font-size:16px;font-weight:bold;display:inline-block;">View Full Listing</a>
</td>
</tr>
</table>
</td>
</tr>
<!-- Agent Contact -->
<tr>
<td style="padding:15px 20px;background:#f8fafc;border-top:1px solid #e2e8f0;">
<p style="margin:0;font-size:14px;color:#333;"><strong>Your Agent:</strong> {{ $json.assigned_agent_name }}</p>
<p style="margin:3px 0;font-size:14px;color:#333;">📞 {{ $json.assigned_agent_phone }} | 📧 {{ $json.assigned_agent_email }}</p>
</td>
</tr>
<!-- MLS Attribution & Compliance Footer -->
<tr>
<td style="padding:15px 20px;background:#e2e8f0;font-size:11px;color:#666;">
<p style="margin:0;">Listing data provided by [MLS Name]. Information deemed reliable but not guaranteed. Listing courtesy of the listing broker. MLS# {{ $json.mls_id }}.</p>
<p style="margin:8px 0 0;">© {{ new Date().getFullYear() }} [Brokerage Legal Name] | [Brokerage Address]</p>
<p style="margin:8px 0 0;"><a href="{{ $json.unsubscribe_url || '#' }}" style="color:#2563eb;">Unsubscribe from listing alerts</a> | <a href="[privacy_policy_url]" style="color:#2563eb;">Privacy Policy</a></p>
</td>
</tr>
</table>
</body>
</html>

Configuration Notes
- Replace [Brokerage Name], [MLS Name], [Brokerage Legal Name], [Brokerage Address], and [privacy_policy_url] with actual client values
- For the unsubscribe link, use SendGrid's built-in {{{unsubscribe}}} tag or build a custom unsubscribe webhook endpoint in n8n
- MLS attribution text must match the exact disclaimer language required by the local MLS board — obtain this from the client
CRM Buyer Sync Workflow
Type: workflow n8n workflow that synchronizes buyer profiles and search criteria from Follow Up Boss CRM into the local PostgreSQL database. Supports both scheduled polling (every 30 minutes) and real-time webhook-triggered updates when buyer records change in the CRM.
Implementation:
n8n Workflow: CRM Buyer Sync (Follow Up Boss)
Trigger Options (implement both):
Core Nodes:
Node 1: HTTP Request — Fetch Updated Contacts from FUB
- Type: HTTP Request
- Method: GET
- URL: https://api.followupboss.com/v1/people
- Authentication: HTTP Basic Auth — Username: [FUB API Key], Password: (leave blank)
- Query Parameter — sort: updated
- Query Parameter — limit: 100
- Query Parameter — updatedAfter: {{ $now.minus({minutes: 35}).toISO() }}
- Query Parameter — tag: ai-alerts-active (use a FUB tag to opt buyers in)
Node 2: Loop Over Contacts
- Type: Split In Batches
- Batch Size: 1
Node 3: Extract Buyer Criteria from Custom Fields (Code Node)
const contact = $input.first().json;
// Follow Up Boss custom fields mapping
// These must be created in FUB by the client's admin
const customFields = contact.customFields || {};
const buyer = {
external_crm_id: String(contact.id),
first_name: contact.firstName || '',
last_name: contact.lastName || '',
email: contact.emails?.[0]?.value || null,
phone: contact.phones?.[0]?.value || null,
assigned_agent_name: contact.assignedTo?.name || '',
assigned_agent_email: contact.assignedTo?.email || '',
assigned_agent_phone: '', // FUB may not expose agent phone via API
email_consent: customFields['AI Alert Email Consent'] === 'Yes',
sms_consent: customFields['AI Alert SMS Consent'] === 'Yes',
// Search criteria from custom fields
min_price: parseFloat(customFields['AI Alert Min Price']) || null,
max_price: parseFloat(customFields['AI Alert Max Price']) || null,
min_bedrooms: parseInt(customFields['AI Alert Min Beds']) || null,
min_bathrooms: parseFloat(customFields['AI Alert Min Baths']) || null,
property_types: customFields['AI Alert Property Types']?.split(',').map(s => s.trim()) || null,
zip_codes: customFields['AI Alert Zip Codes']?.split(',').map(s => s.trim()) || null,
cities: customFields['AI Alert Cities']?.split(',').map(s => s.trim()) || null,
must_have_features: customFields['AI Alert Must Have']?.split(',').map(s => s.trim()) || null,
custom_notes: customFields['AI Alert Notes'] || null
};
return [{ json: buyer }];

Node 4: Upsert Buyer Profile
- Type: PostgreSQL
INSERT INTO buyers (external_crm_id, first_name, last_name, email, phone, assigned_agent_name, assigned_agent_email, email_consent, email_consent_timestamp, sms_consent, sms_consent_timestamp, updated_at)
VALUES (
'{{ $json.external_crm_id }}',
'{{ $json.first_name }}',
'{{ $json.last_name }}',
{{ $json.email ? "'" + $json.email + "'" : 'NULL' }},
{{ $json.phone ? "'" + $json.phone + "'" : 'NULL' }},
'{{ $json.assigned_agent_name }}',
'{{ $json.assigned_agent_email }}',
{{ $json.email_consent }},
{{ $json.email_consent ? 'NOW()' : 'NULL' }},
{{ $json.sms_consent }},
{{ $json.sms_consent ? 'NOW()' : 'NULL' }},
NOW()
)
ON CONFLICT (external_crm_id) DO UPDATE SET
first_name = EXCLUDED.first_name,
last_name = EXCLUDED.last_name,
email = EXCLUDED.email,
phone = EXCLUDED.phone,
assigned_agent_name = EXCLUDED.assigned_agent_name,
assigned_agent_email = EXCLUDED.assigned_agent_email,
email_consent = EXCLUDED.email_consent,
sms_consent = EXCLUDED.sms_consent,
updated_at = NOW()
RETURNING id;

Node 5: Upsert Search Criteria
- Type: PostgreSQL
INSERT INTO search_criteria (buyer_id, min_price, max_price, min_bedrooms, min_bathrooms, property_types, zip_codes, cities, must_have_features, custom_notes, updated_at)
VALUES (
{{ $json.id }},
{{ $json.min_price || 'NULL' }},
{{ $json.max_price || 'NULL' }},
{{ $json.min_bedrooms || 'NULL' }},
{{ $json.min_bathrooms || 'NULL' }},
{{ $json.property_types ? "ARRAY['" + $json.property_types.join("','") + "']" : 'NULL' }},
{{ $json.zip_codes ? "ARRAY['" + $json.zip_codes.join("','") + "']" : 'NULL' }},
{{ $json.cities ? "ARRAY['" + $json.cities.join("','") + "']" : 'NULL' }},
{{ $json.must_have_features ? "ARRAY['" + $json.must_have_features.join("','") + "']" : 'NULL' }},
{{ $json.custom_notes ? "'" + $json.custom_notes.replace(/'/g, "''") + "'" : 'NULL' }},
NOW()
)
-- Assumes a UNIQUE constraint on buyer_id (one criteria row per buyer);
-- a conflict on the primary key would never fire here, since id is not supplied
ON CONFLICT (buyer_id) DO UPDATE SET
min_price = EXCLUDED.min_price,
max_price = EXCLUDED.max_price,
updated_at = NOW();

Follow Up Boss Custom Fields to Create:
The client's FUB admin must create these custom fields under Settings > Custom Fields in Follow Up Boss.
- AI Alert Min Price (Number)
- AI Alert Max Price (Number)
- AI Alert Min Beds (Number)
- AI Alert Min Baths (Number)
- AI Alert Property Types (Text) — comma-separated: SingleFamily,Condo,Townhouse
- AI Alert Zip Codes (Text) — comma-separated: 78704,78745
- AI Alert Cities (Text) — comma-separated: Austin,Round Rock
- AI Alert Must Have (Text) — comma-separated: garage,pool
- AI Alert Notes (Text) — free-text buyer preferences
- AI Alert Email Consent (Dropdown: Yes/No)
- AI Alert SMS Consent (Dropdown: Yes/No)
Agents fill in these fields when setting up buyer search preferences. The tag 'ai-alerts-active' must be applied to contacts who should receive alerts.
Opt-Out and Unsubscribe Handler
Type: workflow n8n webhook workflow that handles SMS STOP replies (via Twilio webhook) and email unsubscribe clicks (via SendGrid webhook or custom endpoint). Updates buyer consent status in PostgreSQL and logs the opt-out for compliance auditing.
n8n Workflow: Opt-Out Handler
Node 1: Twilio SMS Webhook (for STOP replies)
- Type: Webhook
- HTTP Method: POST
- Path: /webhook/twilio-sms-inbound
- Response Mode: Last Node
Node 2: Check for STOP keyword
- Type: IF
- Condition: {{ $json.Body.trim().toUpperCase() === 'STOP' }} (Twilio's carrier-level opt-out also honors STOPALL, UNSUBSCRIBE, CANCEL, END, and QUIT; extend the check if those should sync the consent record too)
Node 3: Deactivate SMS Consent
- Type: PostgreSQL
UPDATE buyers
SET sms_consent = FALSE,
updated_at = NOW()
WHERE phone = '{{ $json.From }}'
RETURNING id, first_name, last_name, email;

Node 4: Log Opt-Out
- Type: PostgreSQL
INSERT INTO alert_history (buyer_id, channel, personalized_message, delivery_status, sent_at)
VALUES ({{ $json.id }}, 'sms', 'OPT-OUT: Buyer replied STOP', 'opt_out', NOW());

Node 5: Send TwiML Response
- Type: Respond to Webhook
- Content-Type: application/xml
<?xml version="1.0" encoding="UTF-8"?>
<Response>
<Message>You have been unsubscribed from listing alerts. Reply START to re-subscribe.</Message>
</Response>

Email Unsubscribe Endpoint
Node A1: Webhook
- Path: /webhook/email-unsubscribe
- Method: GET
- Query Params:
buyer_id,token(HMAC signature for security)
Node A2: Verify Token (Code Node)
const crypto = require('crypto');
// Read the secret from an environment variable instead of hardcoding it.
// (require('crypto') in a Code node needs NODE_FUNCTION_ALLOW_BUILTIN=crypto)
const secret = process.env.UNSUBSCRIBE_SECRET;
const expectedToken = crypto.createHmac('sha256', secret)
.update($json.query.buyer_id)
.digest('hex');
if ($json.query.token !== expectedToken) {
throw new Error('Invalid unsubscribe token');
}
return $input.all();

Node A3: Deactivate Email Consent
- Type: PostgreSQL
UPDATE buyers SET email_consent = FALSE, updated_at = NOW()
WHERE id = {{ $json.query.buyer_id }};

Node A4: Respond with Confirmation Page
- Type: Respond to Webhook
- Response Body: HTML page confirming unsubscription
Twilio Webhook Configuration
- In Twilio Console > Phone Numbers > [Your Number] > Messaging:
- A MESSAGE COMES IN — Webhook URL: https://alerts.clientbrokerage.com/webhook/twilio-sms-inbound
- HTTP Method: POST
Fair Housing Compliance Audit Script
Type: skill A periodic audit script that reviews all search criteria and LLM-generated alert messages for potential Fair Housing Act violations. Flags any content containing protected class references or discriminatory language patterns. Should be run weekly by the MSP as part of compliance monitoring.
Implementation:
This runs as a scheduled n8n workflow (weekly on Mondays at 9 AM) or can be triggered manually.
- Node 1: Schedule Trigger — Cron: 0 9 * * 1
- Node 2: Fetch Last 7 Days of Email Alerts — Type: PostgreSQL | Operation: Execute Query (query below)
- Node 3: Scan Messages for Prohibited Patterns (Code Node) — Type: Code (code below)

Node 2 Query:
SELECT ah.id, ah.personalized_message, ah.subject_line, ah.sent_at,
b.first_name, b.last_name
FROM alert_history ah
JOIN buyers b ON b.id = ah.buyer_id
WHERE ah.sent_at > NOW() - INTERVAL '7 days'
AND ah.channel = 'email'
ORDER BY ah.sent_at DESC;

Node 3 Code:

const items = $input.all();
const flagged = [];
// Protected class related terms and phrases to flag
const prohibitedPatterns = [
// Race/National Origin
/\b(white|black|african|hispanic|latino|asian|chinese|mexican|indian|arab|jewish|christian|muslim|catholic)\s*(neighborhood|area|community|district|enclave)/gi,
/\b(ethnic|racial|minority|diverse|homogeneous)\b/gi,
// Religion
/\b(church|mosque|synagogue|temple)\s*(nearby|close|walking)/gi,
// Familial Status
/\b(no\s*kids|child-?free|adult\s*(only|community|living)|singles\s*only|couples\s*only|empty\s*nest)/gi,
/\b(55\+|senior\s*(only|living|community)|age\s*restricted)/gi,
// Sex/Gender
/\b(bachelor|bachelorette|man\s*cave|she\s*shed|master\s*(bedroom|suite|bath))\b/gi,
// Disability
/\b(handicap|cripple|wheelchair\s*bound|mentally|insane\s*asylum)\b/gi,
// Steering language
/\b(perfect\s*for\s*(families|singles|couples|retirees|young\s*professionals))/gi,
/\b(up\s*and\s*coming|gentrifying|transitional|rough|dangerous|safe)\s*(area|neighborhood)/gi,
// School quality as proxy
/\b(best|top|great|excellent|poor|bad)\s*school/gi
];
for (const item of items) {
const text = (item.json.personalized_message || '') + ' ' + (item.json.subject_line || '');
const matches = [];
for (const pattern of prohibitedPatterns) {
const found = text.match(pattern);
if (found) {
matches.push(...found);
}
}
if (matches.length > 0) {
flagged.push({
alert_id: item.json.id,
buyer: `${item.json.first_name} ${item.json.last_name}`,
sent_at: item.json.sent_at,
flagged_terms: matches.join(', '),
message_excerpt: text.substring(0, 200)
});
}
}
return [{ json: { total_reviewed: items.length, total_flagged: flagged.length, flagged_items: flagged } }];

- Node 4: IF — Any Flags Found — Condition: {{ $json.total_flagged > 0 }}
- Node 5: Send Compliance Alert Email to MSP — Type: SendGrid | To: msp-compliance@yourmsp.com | Subject: ⚠️ Fair Housing Audit: {{ $json.total_flagged }} flagged alerts found | Body: Summary of flagged items with excerpts
- Node 6: Always — Send Weekly Report — Type: SendGrid | To: msp-admin@yourmsp.com, client-broker@clientbrokerage.com | Subject: MLS Alert System — Weekly Compliance Report | Body: Reviewed {{ $json.total_reviewed }} alerts sent in the past 7 days. {{ $json.total_flagged }} items flagged for review.
Notes
- The 'master bedroom/suite/bath' flag is included because NAR and many MLSs have moved away from this terminology. Flag but don't necessarily treat as a violation — some MLS data still uses it.
- The school quality flag is a YELLOW flag (not automatic violation) — school information itself isn't prohibited, but using it to steer buyers can be.
- This audit is a supplementary check. The primary protection is the system prompt's Fair Housing guardrails. If flags are found, review the LLM prompt and tighten constraints.
- Maintain an audit log: store each weekly report in a compliance folder for at least 3 years.
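The pattern list can be spot-checked outside n8n before deploying changes to it. A sketch using a subset of the patterns from the audit script, run against made-up message text:

```javascript
// Subset of the audit patterns (copied from the workflow's Code node);
// the sample messages below are illustrative.
const prohibitedPatterns = [
  /\b(no\s*kids|child-?free|adult\s*(only|community|living)|singles\s*only|couples\s*only|empty\s*nest)/gi,
  /\b(perfect\s*for\s*(families|singles|couples|retirees|young\s*professionals))/gi,
  /\b(best|top|great|excellent|poor|bad)\s*school/gi
];

// Collect every prohibited-term match in a piece of alert text
function flaggedTerms(text) {
  const matches = [];
  for (const pattern of prohibitedPatterns) {
    const found = text.match(pattern);
    if (found) matches.push(...found);
  }
  return matches;
}

console.log(flaggedTerms('Quiet street, perfect for families, near top schools.').length); // → 2
console.log(flaggedTerms('Updated kitchen, fenced yard, and a two-car garage.').length);   // → 0
```

Adding a known-bad and a known-good sample like this to a staging workflow catches regex typos before the weekly production audit relies on them.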
Testing & Validation
- INFRASTRUCTURE TEST: SSH into the VPS and run 'docker compose ps' — verify all three containers (mls-n8n, mls-postgres, mls-nginx) show status 'Up' with healthy health checks. Run 'docker exec mls-postgres pg_isready' and confirm PostgreSQL is accepting connections.
- SSL TEST: Navigate to https://alerts.clientbrokerage.com in a browser — verify the padlock icon shows, no certificate warnings, and the n8n login page loads. Use https://www.ssllabs.com/ssltest/ to confirm A or A+ grade.
- DATABASE SCHEMA TEST: Run 'docker exec mls-postgres psql -U n8n_user -d n8n -c "\dt"' and verify all 4 tables exist: buyers, search_criteria, listings_cache, alert_history. Run 'SELECT count(*) FROM buyers;' to confirm buyer import was successful.
- MLS API TEST: In n8n, create a temporary workflow with a single HTTP Request node calling the SimplyRETS API: GET https://api.simplyrets.com/properties?limit=5&status=Active. Verify it returns a JSON array of listing objects with fields like mlsId, listPrice, address, property.
- BUYER MATCHING TEST: Manually insert a test listing into listings_cache that matches a known test buyer's criteria (correct zip code, within price range, correct bedroom count). Trigger the Buyer Matcher workflow and verify it identifies the match. Check that a row appears in alert_history with the correct buyer_id and listing_mls_id.
- LLM PERSONALIZATION TEST: Trigger the alert workflow with a test listing-buyer pair. Capture the OpenAI API response and verify: (a) it contains SUBJECT:, EMAIL_BODY:, and SMS_BODY: sections, (b) it references the buyer's name and specific preferences, (c) it mentions the listing's address and price, (d) the SMS body is under 300 characters and includes 'Reply STOP to opt out'.
- FAIR HOUSING COMPLIANCE TEST: Feed the LLM a test listing in a historically redlined neighborhood. Verify the generated message contains ZERO references to neighborhood demographics, racial composition, or protected class characteristics. Run the Fair Housing Audit Script against the test output.
- EMAIL DELIVERY TEST: Send a test alert email to a Gmail, Outlook, and Yahoo address. Verify: (a) email lands in inbox (not spam), (b) DKIM/SPF pass (check email headers), (c) HTML renders correctly on mobile and desktop, (d) unsubscribe link works, (e) MLS attribution and brokerage address are present in footer.
- SMS DELIVERY TEST: Send a test SMS alert to the MSP's phone number. Verify: (a) message is received within 30 seconds, (b) message is properly formatted and under 160 characters (or concatenated correctly if longer), (c) reply STOP and verify opt-out confirmation is received and buyer record is updated in database.
- END-TO-END INTEGRATION TEST: Wait for a real new listing to appear in the MLS during business hours. Monitor the entire pipeline: (a) MLS Poller picks it up within 10 minutes, (b) listing appears in listings_cache, (c) Buyer Matcher identifies correct matching buyers, (d) personalized email and SMS are generated and sent, (e) alert_history records are created with 'sent' status. Time the full cycle — it should complete in under 2 minutes from listing publication.
- DEDUPLICATION TEST: Run the MLS Poller twice in succession. Verify that the same listings are NOT re-processed and no duplicate alerts are sent. Check listings_cache for ON CONFLICT behavior (last_updated_at should update but no duplicate rows).
- CRM SYNC TEST: Modify a buyer's search criteria in Follow Up Boss (change max price). Wait for the next CRM sync cycle (30 minutes or trigger webhook). Verify the PostgreSQL search_criteria table reflects the updated values.
- LOAD TEST: Simulate 50 simultaneous buyer matches for a single popular listing. Verify all 50 alerts are dispatched without errors, rate limiting is respected (no API throttling from OpenAI, SendGrid, or Twilio), and all 50 alert_history records are created.
- FAILOVER TEST: Stop the PostgreSQL container ('docker stop mls-postgres') and verify the n8n workflows fail gracefully with logged errors rather than crashing. Restart the container and verify workflows resume automatically on the next cron cycle.
- BACKUP RESTORE TEST: Run the backup script manually. Then drop the buyers table, restore from the backup file ('docker exec -i mls-postgres pg_restore -U n8n_user -d n8n < backup_file.dump'), and verify all data is intact.
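The format checks in the LLM PERSONALIZATION TEST above can be automated. This is a minimal sketch of such a validator; the section names (SUBJECT:, EMAIL_BODY:, SMS_BODY:), the 300-character SMS limit, and the opt-out phrase come from the test plan, while the function name and sample message are illustrative.

```python
import re

def validate_alert_output(text: str, buyer_name: str, address: str) -> list[str]:
    """Return a list of validation failures (empty list = pass)."""
    failures = []
    # (a) all three sections must be present
    for section in ("SUBJECT:", "EMAIL_BODY:", "SMS_BODY:"):
        if section not in text:
            failures.append(f"missing section {section}")
    # (b)/(c) buyer name and listing address must be referenced
    if buyer_name not in text:
        failures.append("buyer name not referenced")
    if address not in text:
        failures.append("listing address not referenced")
    # (d) SMS length and opt-out language
    sms_match = re.search(r"SMS_BODY:\s*(.*)", text, re.DOTALL)
    if sms_match:
        sms = sms_match.group(1).strip()
        if len(sms) > 300:
            failures.append(f"SMS body is {len(sms)} chars (limit 300)")
        if "Reply STOP to opt out" not in sms:
            failures.append("SMS missing opt-out language")
    return failures

sample = """SUBJECT: New listing for Jane
EMAIL_BODY: Hi Jane, a home at 123 Main St just hit the market at $450,000.
SMS_BODY: Jane, new listing: 123 Main St, $450k. Reply STOP to opt out"""
print(validate_alert_output(sample, "Jane", "123 Main St"))  # → []
```

Wiring this into an n8n Code node (or a standalone test script) lets the test run on every captured API response rather than relying on manual inspection.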
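The FAIR HOUSING COMPLIANCE TEST above references a Fair Housing Audit Script. A minimal keyword-scan sketch is shown below; the pattern list here is illustrative only and not exhaustive, and a production script should use the brokerage's compliance-approved term list.

```python
import re

# Illustrative patterns only -- replace with the compliance-approved list.
FLAGGED_PATTERNS = [
    r"\bsafe neighborhood\b",
    r"\bfamily[- ]friendly\b",
    r"\bexclusive community\b",
    r"\bethnic\b",
    r"\bdemographic",
]

def audit_message(message: str) -> list[str]:
    """Return the flagged patterns found in an alert message (case-insensitive)."""
    return [p for p in FLAGGED_PATTERNS
            if re.search(p, message, re.IGNORECASE)]

print(audit_message("Charming 3-bed near downtown, listed at $425,000."))  # → []
print(audit_message("A very family-friendly block!"))  # flags one pattern
```

Keyword scanning catches only explicit phrasing; flagged and borderline messages should still receive human review, which is why the weekly compliance report exists.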
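The ON CONFLICT behavior checked by the DEDUPLICATION TEST can be demonstrated in isolation. The sketch below uses an in-memory SQLite stand-in for the PostgreSQL listings_cache (the upsert syntax is the same); column names mirror the guide's schema, though the real table has more fields.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE listings_cache (
    mls_id TEXT PRIMARY KEY,
    price INTEGER,
    last_updated_at TEXT)""")

def upsert_listing(mls_id, price, updated_at):
    # Insert, or refresh the existing row on conflict -- no duplicate rows.
    db.execute("""INSERT INTO listings_cache (mls_id, price, last_updated_at)
                  VALUES (?, ?, ?)
                  ON CONFLICT(mls_id) DO UPDATE SET
                      price = excluded.price,
                      last_updated_at = excluded.last_updated_at""",
               (mls_id, price, updated_at))

# Poll twice with the same listing: still one row, timestamp refreshed.
upsert_listing("MLS123", 450000, "2025-01-01T09:00")
upsert_listing("MLS123", 450000, "2025-01-01T09:10")
rows = db.execute(
    "SELECT COUNT(*), MAX(last_updated_at) FROM listings_cache").fetchone()
print(rows)  # → (1, '2025-01-01T09:10')
```

Because the alert workflow only fires for listings newly inserted (not merely refreshed), this single constraint is what prevents duplicate alerts across poller runs.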
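For the LOAD TEST's 50-alert burst, client-side rate limiting keeps the dispatch under provider quotas. The sketch below assumes a simple fixed-window cap; actual limits for OpenAI, SendGrid, and Twilio vary by plan, so check each provider's documented quota before choosing the cap.

```python
import time

def dispatch_with_rate_limit(alerts, send_fn, max_per_second=10):
    """Send alerts via send_fn, pausing at each per-second window boundary."""
    sent = 0
    for i, alert in enumerate(alerts):
        if i > 0 and i % max_per_second == 0:
            time.sleep(1)  # wait out the rest of the window
        send_fn(alert)
        sent += 1
    return sent

# Simulated burst: 50 alerts collected into a log instead of sent.
log = []
n = dispatch_with_rate_limit([f"alert-{i}" for i in range(50)],
                             log.append, max_per_second=50)
print(n, len(log))  # → 50 50
```

In n8n the same effect can be approximated with the built-in Loop/Wait nodes; the point of the test is that all 50 records land in alert_history without any provider returning a throttling error.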
Client Handoff
Client Handoff Checklist
Training Sessions (Recommended: 2 sessions, 1 hour each)
Session 1: Agent/Staff Training (1 hour)
- How to add a new buyer to the alert system via Follow Up Boss custom fields
- How to set up search criteria using the designated custom fields (AI Alert Min Price, Max Price, Zip Codes, etc.)
- How to apply the 'ai-alerts-active' tag to activate alerts for a buyer
- How to update or deactivate a buyer's alert preferences
- How to obtain and document written consent (email and SMS) before activating alerts
- How consent records must be stored (signed buyer consultation form with opt-in language)
- What the buyer experience looks like: show sample email and SMS alerts
- How to handle buyer questions about alert frequency or content
Session 2: Broker/Admin Training (1 hour)
- Overview of the system architecture (MLS → Agent → Notifications) at a conceptual level
- How to access the n8n dashboard (read-only access for monitoring workflow status)
- Understanding the weekly compliance report and what flagged items mean
- How to request changes to alert templates or matching criteria
- MLS attribution requirements and where they appear in alerts
- Fair Housing Act obligations when using AI-generated content
- Escalation process: who to contact at the MSP for issues
Documentation to Deliver
Success Criteria to Review Together
Maintenance
Ongoing MSP Maintenance Responsibilities
Daily (Automated)
- Health check script runs every 10 minutes — monitors container health, disk usage, and workflow execution
- Automated alerts sent to MSP if any container goes down or disk usage exceeds 85%
- Automated database backups at 2 AM daily with 30-day retention
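The disk-usage portion of the daily health check can be sketched as below, using the 85% threshold named above. Container-health checks would additionally wrap 'docker inspect'; only the disk test is shown, and the function name is illustrative.

```python
import shutil

def disk_usage_alert(path="/", threshold_pct=85):
    """Return (alert_needed, percent_used) for the filesystem at path."""
    usage = shutil.disk_usage(path)
    used_pct = usage.used / usage.total * 100
    return used_pct > threshold_pct, round(used_pct, 1)

alert, pct = disk_usage_alert("/")
print(f"disk {pct}% used, alert={alert}")
```

A cron entry running this every 10 minutes and posting to the MSP's alerting channel on `alert=True` covers the disk half of the automated monitoring.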
Weekly
- Fair Housing Compliance Audit: Review the automated weekly compliance report. Investigate any flagged messages. Adjust LLM prompts if patterns emerge. Time: ~30 minutes/week.
- Delivery Metrics Review: Check SendGrid and Twilio dashboards for bounce rates, spam complaints, delivery failures. Investigate any bounce rate above 5%. Time: ~15 minutes/week.
- Workflow Execution Review: Log into n8n, check workflow execution logs for errors or warnings. Look for failed API calls, timeout issues, or unusual patterns. Time: ~15 minutes/week.
Monthly
- System Updates: Update Docker images for n8n, PostgreSQL, and Nginx. Process: pull new images, test on staging, then update production during off-hours. Always backup before updating. Time: ~1 hour/month.
- Database Maintenance: Run PostgreSQL VACUUM ANALYZE on all tables. Check table sizes and index health. Purge listings_cache entries older than 90 days and alert_history older than 365 days (after archiving). Time: ~30 minutes/month.
- SSL Certificate: Auto-renewed by Certbot cron. Verify renewal is working by checking certificate expiry date. Time: ~5 minutes/month.
- Backup Restore Test: Restore the latest backup to the staging server and verify data integrity. Time: ~30 minutes/month.
- Cost Review: Check OpenAI, Twilio, and SendGrid invoices against expected usage. Flag any anomalies. Time: ~15 minutes/month.
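The retention purge in the Database Maintenance step above (listings_cache older than 90 days, alert_history older than 365 days) boils down to date-cutoff DELETEs. The sketch below demonstrates the pattern against an in-memory SQLite stand-in; production runs the equivalent statements in PostgreSQL after archiving alert_history.

```python
import sqlite3
from datetime import date, timedelta

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE listings_cache (mls_id TEXT, last_updated_at TEXT)")

today = date(2025, 6, 1)  # fixed date for a reproducible demo
db.executemany("INSERT INTO listings_cache VALUES (?, ?)", [
    ("MLS1", str(today - timedelta(days=10))),   # fresh -- kept
    ("MLS2", str(today - timedelta(days=120))),  # stale -- purged
])

# ISO-format dates compare correctly as strings, so a plain cutoff works.
cutoff = str(today - timedelta(days=90))
db.execute("DELETE FROM listings_cache WHERE last_updated_at < ?", (cutoff,))

remaining = [r[0] for r in db.execute("SELECT mls_id FROM listings_cache")]
print(remaining)  # → ['MLS1']
```

Running VACUUM ANALYZE after the purge, as the checklist specifies, reclaims the freed space and refreshes planner statistics.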
- Production Update Command: 'cd ~/mls-agent && docker compose pull && docker compose up -d'
Quarterly
- LLM Prompt Review: Review a random sample of 50 alert messages for quality, accuracy, and compliance. Refine prompts based on patterns. Time: ~2 hours/quarter.
- Security Audit: Review SSH access logs, update server packages, rotate API keys if policy requires, check for Docker image CVEs. Time: ~1 hour/quarter.
- Client Business Review: Meet with client to review alert volumes, buyer engagement metrics, and gather feedback. Discuss any new requirements or criteria adjustments. Time: ~1 hour/quarter.
As-Needed Triggers
- MLS Data Schema Changes: RESO periodically updates the Data Dictionary. SimplyRETS/Trestle may change response formats. Monitor their changelogs and update the normalization code in the MLS Poller workflow.
- New MLS Board Connection: If the client expands to a new MLS coverage area, a new SimplyRETS/Trestle connection must be configured ($99 one-time + $49–75/month).
- CRM Migration: If the client changes CRM platforms, the CRM Sync Workflow must be rebuilt for the new API.
- OpenAI Model Updates: When OpenAI releases new model versions, test the new model on staging before switching production. GPT-5.4 mini may be deprecated — monitor OpenAI's deprecation schedule.
- Regulatory Changes: Monitor NAR policy changes (major overhaul coming January 2026), state privacy law changes, and FCC/FTC rulings on AI-generated communications.
SLA Recommendations
- Uptime Target: 99.5% monthly (allows ~3.6 hours downtime/month for maintenance)
- Alert Latency: New listings should trigger buyer alerts within 15 minutes of MLS publication
- Response Time: Critical issues (system down) — 1 hour response, 4 hour resolution. Non-critical (template change, new buyer setup) — next business day.
- Escalation Path: Tier 1 (MSP helpdesk) → Tier 2 (MSP engineer who built the system) → Tier 3 (vendor support: n8n community/enterprise, SimplyRETS support, OpenAI support)
Alternatives
Turnkey Platform: Ylopo with Raiya AI
Instead of building a custom system, deploy Ylopo's platform which includes built-in MLS listing alerts, AI-powered lead nurturing via their 'Raiya' AI texting assistant, IDX website, and CRM integration. The MSP would configure and manage the Ylopo subscription rather than building custom workflows.
n8n Cloud Instead of Self-Hosted
Use n8n's managed cloud service ($50/month Pro plan for 10,000 workflow executions) instead of self-hosting n8n on a VPS. Eliminates the need for Docker, VPS management, and Nginx configuration. The MSP still builds all the same workflows but doesn't manage the n8n infrastructure.
CrewAI Multi-Agent Python Framework
Build the solution as a Python application using CrewAI's multi-agent framework instead of n8n. Define specialized agents (MLS Monitor Agent, Buyer Matcher Agent, Personalization Agent, Notification Agent) that collaborate to process listings. Deploy as a Python service on the VPS with systemd or Docker.
Hybrid: kvCORE/Follow Up Boss Built-in Alerts + Custom AI Personalization Layer
Use the client's existing CRM platform (kvCORE or Follow Up Boss) for basic MLS listing alerts (which they already support natively), but add a custom AI layer on top that intercepts the alert data and personalizes the messages before delivery. This is a lighter implementation that enhances an existing system rather than replacing it.
Not recommended when: Client needs advanced matching criteria beyond what the CRM supports, or wants a system independent of their CRM choice.
Trestle (CoreLogic) Instead of SimplyRETS for MLS Data
Use Trestle (formerly CoreLogic Trestle) as the MLS data provider instead of SimplyRETS. Trestle is the largest MLS data aggregation platform, connecting to more MLS boards than any other single provider.