
Implementation Guide: Benchmark client premiums vs. market and flag accounts due for remarketing

Step-by-step implementation guide for deploying AI to benchmark client premiums against the market and flag accounts due for remarketing, built for insurance agency clients.

Hardware Procurement

Dell PowerEdge T360 Tower Server

Dell Technologies | PET360-BASE (configured: Xeon E-2434, 32GB DDR5, 2x 480GB SSD RAID 1, 3yr ProSupport) | Qty: 1

$3,200 MSP cost / $4,500 suggested resale

Optional on-premise analytics server for hosting the ETL pipeline, SQL Server database for policy data staging, and Power BI Report Server if the agency prefers on-premise data residency. Only required if the agency has strict data-residency requirements or runs a legacy on-premise AMS installation. Most deployments will use Azure cloud resources instead.

Business Workstation - Dell OptiPlex 7420

Dell Technologies | OptiPlex 7420 SFF (i5-14500, 16GB DDR5, 512GB NVMe SSD) | Qty: 3

$850 MSP cost / $1,150 suggested resale per unit

Workstations for agency producers and account managers to access Power BI dashboards, CRM remarketing workflows, and AMS benchmarking tools. Minimum 16GB RAM and SSD required for responsive Power BI Desktop authoring and multi-tab browser sessions.

Fortinet FortiGate 60F Firewall

Fortinet | FG-60F with FortiGuard UTM Bundle (1yr) | Qty: 1

$550 MSP cost / $800 suggested resale

Secures API traffic between the agency LAN and cloud services (Azure, AMS cloud, Ivans, comparative raters). Provides SSL inspection, application control, and IPS to meet NAIC Model Law #668 network security requirements. Required for GLBA Safeguards Rule compliance.

Ubiquiti UniFi Switch Lite 16 PoE

Ubiquiti | USW-Lite-16-PoE | Qty: 1

$200 MSP cost / $350 suggested resale

Managed network switch for agency LAN segmentation. Enables VLAN separation between general office traffic and analytics/AMS traffic for enhanced security posture and compliance with data security requirements.

Software Procurement

Microsoft 365 E5

Microsoft | per-seat SaaS (CSP)

$57/user/month via CSP; includes Power BI Pro. For 10-user agency: $570/month

Provides Power BI Pro (included in E5) for premium benchmarking dashboards, Azure AD/Entra ID for SSO across all SaaS tools, Microsoft Teams for internal remarketing collaboration, Outlook for client communications, and advanced compliance/security features (DLP, Information Barriers) needed for GLBA compliance. If client already has M365 Business Premium, add Power BI Pro at $14/user/month instead.

Power BI Pro

Microsoft | per-seat SaaS (included in M365 E5, or standalone) | Qty: Typically 3-5 dashboard consumers at agency

$14/user/month standalone; $0 additional if M365 E5 is deployed

Core analytics and visualization platform for premium benchmarking dashboards. Enables scheduled data refresh from AMS APIs, interactive drill-down into policy-level premium comparisons, threshold-based alerting, and mobile access for producers reviewing accounts in the field.

Azure SQL Database

Microsoft | Standard S1 / S2 tier

Standard S1 tier: ~$30/month for 20 DTUs, 250GB. Standard S2 for larger books: ~$75/month

Cloud-hosted relational database for the premium data warehouse. Stores normalized policy data extracted from the AMS, market benchmark data from Ivans/Zywave feeds, historical premium trends, and remarketing workflow state. Serves as the central data source for Power BI dashboards.

Azure Data Factory

Microsoft | consumption-based PaaS

~$50-$150/month for a typical agency ETL workload (orchestration runs + data movement)

Managed ETL/ELT service for orchestrating data extraction from AMS APIs, transforming policy data into normalized schema, loading into Azure SQL, and scheduling daily/weekly refresh cycles. Provides visual pipeline designer and monitoring for the MSP.

InsuredMine CRM

InsuredMine | per-seat SaaS | Qty: 10 users

$69/user/month; annual subscription eligible for 5-10% discount. For 10 users: ~$620-$690/month

Insurance-specific CRM overlay that integrates with all major AMS platforms. Provides visual analytics, renewal pipeline management, automated remarketing campaign triggers, client communication tracking, and cross-sell/upsell opportunity identification. Acts as the remarketing workflow execution engine.

Zywave Analytics Cloud

Zywave | SaaS (custom/modular pricing)

Contact vendor; estimated $500-$1,500/month depending on modules and agency size

Provides commercial lines benchmark data covering 400,000+ U.S. insurance programs annually. Enables premium comparison against industry benchmarks by geography, coverage type, risk class, and industry vertical. Supplements personal lines data from other sources.

Applied Insurance AI - PLRI Module

Applied Systems | PLRI Module

Included with Applied Epic subscription; activation/configuration services: $2,000-$5,000 one-time

Built-in premium benchmarking at renewal within Applied Epic. Proactively compares term-over-term premium changes, highlights differences, and evaluates renewal premiums against similar policies. Only applicable for agencies running Applied Epic as their AMS.

EZLynx Rating Engine

Applied Systems (EZLynx)

Starting at $350/month for the EZLynx platform; rating engine included in base subscription

Comparative personal lines rater connecting to 330+ carriers across 48 states. Used in the remarketing workflow to automatically generate competitive quotes when accounts are flagged for remarketing based on premium increase thresholds.

ReFocus AI

ReFocus AI

Contact vendor; estimated $300-$800/month depending on agency size and volume

Automated remarketing workflow engine. Allows definition of triggers (premium increase >10%, policy type, client tenure, claims history) and automatically sends submissions to appointed carriers when thresholds are breached. Integrates with AgencyZoom and major AMS platforms.

Prerequisites

  • Active Agency Management System (AMS) subscription — Applied Epic, AMS360, EZLynx, or HawkSoft — with current supported version and API access enabled
  • Carrier download (IVANS) active and current for all appointed carriers — verify no carriers are sending paper-only; all must be on electronic download for premium data accuracy
  • Microsoft 365 Business Premium or E3/E5 tenant with Azure AD/Entra ID configured — required for SSO, Power BI licensing, and Azure resource management
  • Azure subscription (CSP) linked to the client's Microsoft tenant — required for Azure SQL Database and Azure Data Factory resources
  • Minimum 50 Mbps symmetrical internet connection with business-grade SLA — all SaaS platforms and API integrations require reliable, low-latency connectivity
  • AMS administrator credentials and familiarity with the agency's data schema — the agency must designate an internal AMS administrator or principal who can authorize API access and validate data mapping
  • Complete carrier appointment list with carrier codes and download status — needed to verify which carriers are actively sending policy data via IVANS
  • Agency-defined remarketing thresholds — the agency principal must pre-define what premium increase percentage triggers remarketing (typically 10-20%), minimum account premium for remarketing eligibility, and any exclusion criteria (e.g., claims-active accounts)
  • GLBA compliance documentation current — existing written information security plan per FTC Safeguards Rule; designated security coordinator identified. If not in place, this must be addressed as a prerequisite compliance engagement
  • Network infrastructure capable of VLAN segmentation — firewall and managed switch must support separating analytics traffic from general office use
  • Data Processing Agreements (DPAs) executed with all third-party vendors who will access policy data — InsuredMine, Zywave, ReFocus AI, and any other analytics vendors
  • Agency E&O insurance current and covering technology-assisted remarketing recommendations — verify with the agency's E&O carrier that AI-assisted benchmarking and remarketing workflows are within coverage scope
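The agency-defined thresholds above can be captured and sanity-checked before any infrastructure is provisioned. A minimal sketch, assuming illustrative LOB codes and numbers that mirror the defaults suggested later in this guide; the agency principal must confirm each value before go-live:

```python
from typing import Optional

# Illustrative remarketing-threshold configuration keyed by line of business.
# All values here are assumptions for discussion with the agency principal.
REMARKETING_THRESHOLDS = {
    "PERSONAL_AUTO":       {"increase_pct": 15.0, "min_policy_premium": 500.0,  "days_before_renewal": 90},
    "HOMEOWNERS":          {"increase_pct": 15.0, "min_policy_premium": 800.0,  "days_before_renewal": 90},
    "COMMERCIAL_PROPERTY": {"increase_pct": 10.0, "min_policy_premium": 2000.0, "days_before_renewal": 120},
}

def premium_change_pct(current: float, prior: Optional[float]) -> float:
    """Term-over-term premium change in percent; 0 when no prior term exists."""
    if not prior:
        return 0.0
    return round((current - prior) / prior * 100, 2)

def breaches_threshold(lob: str, current: float, prior: Optional[float]) -> bool:
    """True when a renewal's premium change exceeds the LOB's trigger
    and the policy meets the minimum-premium eligibility floor."""
    t = REMARKETING_THRESHOLDS.get(lob)
    if t is None or current < t["min_policy_premium"]:
        return False
    return premium_change_pct(current, prior) >= t["increase_pct"]
```

For example, a $1,000 personal auto policy renewing at $1,180 is an 18% increase and breaches the 15% trigger, while a 7.5% commercial property increase does not breach the 10% trigger.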

Installation Steps

...

Step 1: Environment Assessment and Data Quality Audit

Conduct a thorough assessment of the agency's current AMS environment, data quality, and technical infrastructure. This is the most critical phase — the accuracy of premium benchmarking depends entirely on the quality of underlying policy data. Log into the AMS and audit carrier download status, policy data completeness, premium field accuracy, and data freshness. Document all carriers with active downloads vs. those requiring manual entry. Identify data quality issues such as missing coverage codes, inconsistent naming conventions, or stale policies that haven't been updated.

Connect to AMS database
sql
-- Applied Epic SQL Server backend (if direct SQL access is available for
-- Applied Epic on-premise)

SELECT CarrierName,
       COUNT(*) AS PolicyCount,
       MAX(LastModifiedDate) AS MostRecent
FROM dbo.Policy
GROUP BY CarrierName
ORDER BY PolicyCount DESC;
  • Check IVANS download status via Applied Epic admin console: Navigate to: Settings > Downloads > Carrier Download Status
  • Export the carrier download activity report for the last 90 days
  • For AMS360, run the Data Integrity Report: Navigate to: Reports > System Reports > Data Integrity
  • Export as CSV for analysis
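The exported carrier download report can be scanned programmatically for stale carriers. A sketch, assuming the export has `Carrier` and `LastDownloadDate` (MM/DD/YYYY) columns; rename these to match the actual report headers from the agency's AMS:

```python
import csv
import io
from datetime import datetime, timedelta

def flag_stale_carriers(report_csv: str, as_of: datetime, max_age_days: int = 7):
    """Return (carrier, days_since_download) for carriers whose most recent
    IVANS download is older than max_age_days, worst offenders first.

    Column names 'Carrier' and 'LastDownloadDate' are assumptions; adjust
    them to the actual CSV export from the AMS.
    """
    stale = []
    for row in csv.DictReader(io.StringIO(report_csv)):
        last = datetime.strptime(row["LastDownloadDate"], "%m/%d/%Y")
        age = (as_of - last).days
        if age > max_age_days:
            stale.append((row["Carrier"], age))
    return sorted(stale, key=lambda x: -x[1])
```

Carriers surfacing here are candidates for the Step 2 remediation punch list (paper-only downloads or broken IVANS mappings).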
Network assessment
powershell
# Verify connectivity and latency to Azure/SaaS endpoints
Test-NetConnection -ComputerName login.microsoftonline.com -Port 443
# Test-NetConnection does not accept wildcards; use the agency's actual Azure SQL server name
Test-NetConnection -ComputerName '<your-sql-server>.database.windows.net' -Port 1433
Test-NetConnection -ComputerName api.appliedsystems.com -Port 443
# Invoke-WebRequest only confirms outbound HTTPS reachability; run a dedicated
# speed-test tool (e.g., the Ookla Speedtest CLI) for actual throughput numbers
Invoke-WebRequest -Uri 'https://www.speedtest.net' -UseBasicParsing
Note

This step typically takes 2-3 full days on-site. Bring a data quality checklist. Common issues: carriers not on electronic download (especially smaller regional carriers), duplicate client records, policies with $0 premium (expired but not archived), and inconsistent coverage type coding. Document everything in a findings report — this becomes the remediation punch list for Phase 2.
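The common issues listed in this note can be rolled into an automated scan of the exported client and policy lists. A minimal sketch, assuming hypothetical key names (`name`, `dob`, `number`, `premium`, `last_modified`) that should be mapped to the agency's actual AMS export columns:

```python
from collections import Counter
from datetime import datetime  # exports are expected to carry datetime timestamps

def audit_book(clients, policies, as_of, stale_days=400):
    """Summarize the common data-quality issues noted above: duplicate client
    records, $0-premium policies (expired but not archived), and policies not
    touched in `stale_days`. Key names are placeholders for the AMS export."""
    dupes = [k for k, n in Counter((c["name"], c["dob"]) for c in clients).items() if n > 1]
    zero = [p["number"] for p in policies if p["premium"] == 0]
    stale = [p["number"] for p in policies
             if (as_of - p["last_modified"]).days > stale_days]
    return {"duplicate_clients": dupes, "zero_premium": zero, "stale": stale}
```

The returned lists feed directly into the Step 1 findings report and the Step 2 remediation tracker.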

Step 2: Data Remediation and AMS Cleanup

Working with the agency's AMS administrator, remediate the data quality issues identified in Step 1. Enable IVANS carrier downloads for any carriers still on paper. Merge duplicate client records. Archive expired policies. Standardize coverage type codes and line-of-business classifications. This step is essential because the benchmarking engine will produce misleading results if the underlying data contains errors, duplicates, or inconsistencies.

  • Enable IVANS download for a carrier in Applied Epic
  • Navigate to: Settings > Downloads > Carrier Setup
  • Select carrier > Enable 'Policy Download' and 'Claims Download'
  • Set download frequency to 'Daily'
  • Save and test with a manual download pull
  • For HawkSoft: Enable carrier download integration
  • Navigate to: Setup > Downloads > Add Carrier
  • Enter IVANS carrier code (obtain from https://www.ivans.com/carrier-list)
  • Map policy fields to HawkSoft schema
SQL script to identify duplicate clients in Epic backend (read-only diagnostic)
sql
SELECT LastName, FirstName, DateOfBirth, COUNT(*) as DuplicateCount
FROM dbo.Client
GROUP BY LastName, FirstName, DateOfBirth
HAVING COUNT(*) > 1
ORDER BY DuplicateCount DESC
  • Bulk update coverage type standardization using Applied Epic REST API to update policy classification codes
Applied Epic REST API: update policy classification codes
http
POST https://api.appliedsystems.com/epic/v1/policies/{policyId}
Content-Type: application/json

{ "lineOfBusiness": "PERSONAL_AUTO", "coverageType": "FULL_COVERAGE" }
Note

Data remediation is the single largest hidden cost in this project. Budget 40-80 hours depending on data quality. Involve the agency's most experienced CSR who knows the book of business. Do NOT skip this step — launching benchmarking on dirty data will erode agency trust in the entire system. Create a remediation tracker spreadsheet and review progress weekly.

Step 3: Azure Environment Provisioning

Provision the Azure cloud resources that will host the premium data warehouse and ETL pipeline. Create a dedicated Azure Resource Group for the insurance analytics project, deploy Azure SQL Database for the premium data warehouse, and configure Azure Data Factory for the ETL orchestration. Set up network security groups, firewall rules, and diagnostic logging.

bash
# Login to Azure CLI
az login

# Set subscription context
az account set --subscription "<client-subscription-id>"

# Create Resource Group
az group create --name rg-insurance-analytics --location eastus2

# Create Azure SQL Server
az sql server create \
  --name sql-insurance-benchmarking \
  --resource-group rg-insurance-analytics \
  --location eastus2 \
  --admin-user sqladmin \
  --admin-password '<StrongPassword123!>'

# Configure firewall rules (allow Azure services + agency office IP)
az sql server firewall-rule create \
  --resource-group rg-insurance-analytics \
  --server sql-insurance-benchmarking \
  --name AllowAzureServices \
  --start-ip-address 0.0.0.0 \
  --end-ip-address 0.0.0.0

az sql server firewall-rule create \
  --resource-group rg-insurance-analytics \
  --server sql-insurance-benchmarking \
  --name AgencyOfficeIP \
  --start-ip-address <agency-public-ip> \
  --end-ip-address <agency-public-ip>

# Create the premium data warehouse database
az sql db create \
  --resource-group rg-insurance-analytics \
  --server sql-insurance-benchmarking \
  --name db-premium-benchmarking \
  --service-objective S1 \
  --max-size 250GB

# Enable TDE (Transparent Data Encryption) - enabled by default but verify
az sql db tde set \
  --resource-group rg-insurance-analytics \
  --server sql-insurance-benchmarking \
  --database db-premium-benchmarking \
  --status Enabled

# Enable auditing for compliance
az sql server audit-policy update \
  --resource-group rg-insurance-analytics \
  --name sql-insurance-benchmarking \
  --state Enabled \
  --storage-account <storage-account-name>

# Create Azure Data Factory instance
az datafactory create \
  --resource-group rg-insurance-analytics \
  --factory-name adf-insurance-etl \
  --location eastus2

# Create Azure Key Vault for API credentials
az keyvault create \
  --name kv-insurance-analytics \
  --resource-group rg-insurance-analytics \
  --location eastus2 \
  --sku standard

# Store AMS API credentials in Key Vault
az keyvault secret set \
  --vault-name kv-insurance-analytics \
  --name ams-api-client-id \
  --value '<ams-api-client-id>'

az keyvault secret set \
  --vault-name kv-insurance-analytics \
  --name ams-api-client-secret \
  --value '<ams-api-client-secret>'
Note

Use Azure CSP billing so the MSP manages the subscription and can mark up consumption. Enable Azure Cost Management alerts at $200/month and $500/month thresholds to avoid runaway costs. The S1 tier (20 DTUs) is sufficient for agencies with up to 5,000 policies; upgrade to S2 for larger books. Always use Key Vault for credential storage — never hardcode API keys in Data Factory pipelines.

Step 4: Database Schema Creation — Premium Data Warehouse

Create the normalized database schema in Azure SQL that will serve as the central premium data warehouse. This schema is designed to store policy data from any AMS, market benchmark data from Ivans/Zywave, and remarketing workflow state. The schema supports historical premium tracking for trend analysis and audit compliance.

Connect to Azure SQL Database and execute schema creation — use Azure Data Studio or SSMS
sql
-- Connect to Azure SQL Database and execute schema creation
-- Use Azure Data Studio or SSMS

-- Core policy data table
CREATE TABLE dbo.Policies (
    PolicyID UNIQUEIDENTIFIER DEFAULT NEWID() PRIMARY KEY,
    AMSPolicyNumber NVARCHAR(50) NOT NULL,
    AMSSource NVARCHAR(20) NOT NULL, -- 'EPIC', 'AMS360', 'EZLYNX', 'HAWKSOFT'
    ClientID UNIQUEIDENTIFIER NOT NULL,
    ClientName NVARCHAR(200) NOT NULL,
    CarrierCode NVARCHAR(20) NOT NULL,
    CarrierName NVARCHAR(200) NOT NULL,
    LineOfBusiness NVARCHAR(50) NOT NULL, -- 'PERSONAL_AUTO', 'HOMEOWNERS', 'COMMERCIAL_PROPERTY', etc.
    CoverageType NVARCHAR(100),
    PolicyEffectiveDate DATE NOT NULL,
    PolicyExpirationDate DATE NOT NULL,
    AnnualPremium DECIMAL(12,2) NOT NULL,
    PriorTermPremium DECIMAL(12,2),
    PremiumChangeAmount AS (AnnualPremium - ISNULL(PriorTermPremium, AnnualPremium)),
    PremiumChangePercent AS (
        CASE WHEN ISNULL(PriorTermPremium, 0) > 0 
        THEN ROUND(((AnnualPremium - PriorTermPremium) / PriorTermPremium) * 100, 2)
        ELSE 0 END
    ),
    State NVARCHAR(2) NOT NULL,
    ZipCode NVARCHAR(10),
    IndustryCode NVARCHAR(20), -- NAICS for commercial lines
    RenewalDate DATE,
    ProducerCode NVARCHAR(20),
    ProducerName NVARCHAR(200),
    ClaimsCount INT DEFAULT 0,
    ClaimsAmountTotal DECIMAL(12,2) DEFAULT 0,
    AccountTenureYears INT,
    TotalAccountPremium DECIMAL(12,2), -- all policies for this client combined
    RemarketingFlag BIT DEFAULT 0,
    RemarketingReason NVARCHAR(500),
    RemarketingDate DATE,
    RemarketingStatus NVARCHAR(20) DEFAULT 'NONE', -- 'NONE','FLAGGED','IN_PROGRESS','COMPLETED','DECLINED'
    LastSyncDate DATETIME2 DEFAULT GETUTCDATE(),
    CreatedDate DATETIME2 DEFAULT GETUTCDATE(),
    ModifiedDate DATETIME2 DEFAULT GETUTCDATE(),
    INDEX IX_Policies_Renewal (RenewalDate, RemarketingFlag),
    INDEX IX_Policies_LOB (LineOfBusiness, State),
    INDEX IX_Policies_Client (ClientID)
);

-- Market benchmark data table
CREATE TABLE dbo.MarketBenchmarks (
    BenchmarkID UNIQUEIDENTIFIER DEFAULT NEWID() PRIMARY KEY,
    DataSource NVARCHAR(50) NOT NULL, -- 'IVANS', 'ZYWAVE', 'MANUAL'
    LineOfBusiness NVARCHAR(50) NOT NULL,
    CoverageType NVARCHAR(100),
    State NVARCHAR(2) NOT NULL,
    ZipCodePrefix NVARCHAR(5), -- first 3-5 digits for geographic grouping
    IndustryCode NVARCHAR(20),
    BenchmarkPeriod DATE NOT NULL, -- first day of the benchmark period
    MedianPremium DECIMAL(12,2),
    AvgPremium DECIMAL(12,2),
    P25Premium DECIMAL(12,2), -- 25th percentile
    P75Premium DECIMAL(12,2), -- 75th percentile
    SampleSize INT,
    AvgPremiumChangePercent DECIMAL(6,2), -- market-wide average increase
    LastUpdated DATETIME2 DEFAULT GETUTCDATE(),
    INDEX IX_Benchmarks_Lookup (LineOfBusiness, State, BenchmarkPeriod)
);

-- Premium history for trend tracking
CREATE TABLE dbo.PremiumHistory (
    HistoryID UNIQUEIDENTIFIER DEFAULT NEWID() PRIMARY KEY,
    PolicyID UNIQUEIDENTIFIER FOREIGN KEY REFERENCES dbo.Policies(PolicyID),
    TermEffectiveDate DATE NOT NULL,
    TermExpirationDate DATE NOT NULL,
    TermPremium DECIMAL(12,2) NOT NULL,
    CarrierCode NVARCHAR(20),
    RecordedDate DATETIME2 DEFAULT GETUTCDATE()
);

-- Remarketing workflow tracking
CREATE TABLE dbo.RemarketingActions (
    ActionID UNIQUEIDENTIFIER DEFAULT NEWID() PRIMARY KEY,
    PolicyID UNIQUEIDENTIFIER FOREIGN KEY REFERENCES dbo.Policies(PolicyID),
    ClientID UNIQUEIDENTIFIER NOT NULL,
    TriggerType NVARCHAR(50) NOT NULL, -- 'PREMIUM_INCREASE', 'BENCHMARK_OUTLIER', 'MANUAL', 'RENEWAL_REVIEW'
    TriggerValue NVARCHAR(100), -- e.g., '18.5% increase'
    FlaggedDate DATE NOT NULL,
    AssignedProducer NVARCHAR(200),
    CarriersSubmitted NVARCHAR(MAX), -- JSON array of carrier names
    QuotesReceived INT DEFAULT 0,
    BestAlternativePremium DECIMAL(12,2),
    BestAlternativeCarrier NVARCHAR(200),
    Outcome NVARCHAR(20) DEFAULT 'PENDING', -- 'PENDING','REMARKETED','RETAINED','LOST'
    OutcomeDate DATE,
    SavingsAmount DECIMAL(12,2),
    Notes NVARCHAR(MAX),
    CreatedDate DATETIME2 DEFAULT GETUTCDATE(),
    ModifiedDate DATETIME2 DEFAULT GETUTCDATE()
);

-- Configuration table for remarketing thresholds
CREATE TABLE dbo.RemarketingThresholds (
    ThresholdID INT IDENTITY(1,1) PRIMARY KEY,
    LineOfBusiness NVARCHAR(50) NOT NULL,
    PremiumIncreaseThresholdPercent DECIMAL(5,2) NOT NULL DEFAULT 15.00,
    MinPolicyPremium DECIMAL(12,2) DEFAULT 500.00,
    MinAccountPremium DECIMAL(12,2) DEFAULT 1000.00,
    MinAccountTenureYears INT DEFAULT 0,
    MaxClaimsCount INT DEFAULT 99,
    ExcludeNewBusiness BIT DEFAULT 0,
    DaysBeforeRenewalToFlag INT DEFAULT 90,
    IsActive BIT DEFAULT 1,
    CreatedDate DATETIME2 DEFAULT GETUTCDATE()
);

-- Insert default thresholds
INSERT INTO dbo.RemarketingThresholds (LineOfBusiness, PremiumIncreaseThresholdPercent, MinPolicyPremium, MinAccountPremium, DaysBeforeRenewalToFlag)
VALUES 
('PERSONAL_AUTO', 15.00, 500.00, 1000.00, 90),
('HOMEOWNERS', 15.00, 800.00, 1500.00, 90),
('COMMERCIAL_PROPERTY', 10.00, 2000.00, 5000.00, 120),
('COMMERCIAL_AUTO', 10.00, 2000.00, 5000.00, 120),
('COMMERCIAL_GL', 10.00, 1500.00, 5000.00, 120),
('WORKERS_COMP', 10.00, 3000.00, 5000.00, 120),
('BOP', 12.00, 1000.00, 3000.00, 90),
('UMBRELLA', 20.00, 300.00, 1000.00, 60);
GO

-- Create stored procedure for the remarketing flagging engine
-- (CREATE PROCEDURE must be the first statement in its batch, hence the GO)
CREATE PROCEDURE dbo.usp_FlagAccountsForRemarketing
AS
BEGIN
    SET NOCOUNT ON;
    
    -- Reset flags for policies outside the renewal window
    UPDATE p SET 
        RemarketingFlag = 0,
        RemarketingStatus = 'NONE',
        ModifiedDate = GETUTCDATE()
    FROM dbo.Policies p
    WHERE p.RemarketingFlag = 1
    AND p.RemarketingStatus = 'FLAGGED'
    AND p.RenewalDate < DATEADD(DAY, -30, GETDATE());
    
    -- Flag policies that meet remarketing criteria
    UPDATE p SET
        RemarketingFlag = 1,
        RemarketingReason = 
            CASE 
                WHEN p.PremiumChangePercent >= t.PremiumIncreaseThresholdPercent 
                    THEN CONCAT('Premium increase of ', p.PremiumChangePercent, '% exceeds ', t.PremiumIncreaseThresholdPercent, '% threshold')
                WHEN p.AnnualPremium > ISNULL(b.P75Premium, 999999999)
                    THEN CONCAT('Premium $', FORMAT(p.AnnualPremium, 'N2'), ' exceeds 75th percentile market benchmark of $', FORMAT(b.P75Premium, 'N2'))
                ELSE 'Multiple criteria met'
            END,
        RemarketingDate = GETUTCDATE(),
        RemarketingStatus = 'FLAGGED',
        ModifiedDate = GETUTCDATE()
    FROM dbo.Policies p
    INNER JOIN dbo.RemarketingThresholds t ON p.LineOfBusiness = t.LineOfBusiness AND t.IsActive = 1
    LEFT JOIN dbo.MarketBenchmarks b ON 
        p.LineOfBusiness = b.LineOfBusiness 
        AND p.State = b.State
        AND b.BenchmarkPeriod = (
            SELECT MAX(BenchmarkPeriod) FROM dbo.MarketBenchmarks 
            WHERE LineOfBusiness = p.LineOfBusiness AND State = p.State
        )
    WHERE p.RemarketingStatus IN ('NONE', 'FLAGGED')
    AND p.RenewalDate BETWEEN GETDATE() AND DATEADD(DAY, t.DaysBeforeRenewalToFlag, GETDATE())
    AND p.AnnualPremium >= t.MinPolicyPremium
    AND ISNULL(p.TotalAccountPremium, p.AnnualPremium) >= t.MinAccountPremium
    AND ISNULL(p.ClaimsCount, 0) <= t.MaxClaimsCount
    AND ISNULL(p.AccountTenureYears, 99) >= t.MinAccountTenureYears
    AND (
        p.PremiumChangePercent >= t.PremiumIncreaseThresholdPercent
        OR p.AnnualPremium > ISNULL(b.P75Premium, 999999999)
    );
    
    -- Insert remarketing actions for newly flagged policies
    INSERT INTO dbo.RemarketingActions (PolicyID, ClientID, TriggerType, TriggerValue, FlaggedDate, AssignedProducer)
    SELECT 
        p.PolicyID,
        p.ClientID,
        CASE 
            WHEN p.PremiumChangePercent >= t.PremiumIncreaseThresholdPercent THEN 'PREMIUM_INCREASE'
            ELSE 'BENCHMARK_OUTLIER'
        END,
        CASE 
            WHEN p.PremiumChangePercent >= t.PremiumIncreaseThresholdPercent 
                THEN CONCAT(p.PremiumChangePercent, '% increase')
            ELSE CONCAT('Above P75: $', FORMAT(p.AnnualPremium, 'N2'))
        END,
        GETDATE(),
        p.ProducerName
    FROM dbo.Policies p
    INNER JOIN dbo.RemarketingThresholds t ON p.LineOfBusiness = t.LineOfBusiness
    WHERE p.RemarketingStatus = 'FLAGGED'
    AND NOT EXISTS (
        SELECT 1 FROM dbo.RemarketingActions ra 
        WHERE ra.PolicyID = p.PolicyID 
        AND ra.Outcome = 'PENDING'
    );
    
    -- Return summary
    SELECT 
        LineOfBusiness,
        COUNT(*) as FlaggedCount,
        AVG(PremiumChangePercent) as AvgPremiumIncrease,
        SUM(AnnualPremium) as TotalFlaggedPremium
    FROM dbo.Policies
    WHERE RemarketingFlag = 1 AND RemarketingStatus = 'FLAGGED'
    GROUP BY LineOfBusiness;
END;
GO
Note

This schema is AMS-agnostic — it normalizes data from any source AMS into a common format. The computed columns PremiumChangeAmount and PremiumChangePercent automatically calculate variances. The RemarketingThresholds table allows the agency to adjust triggers per line of business without code changes. Default thresholds are set at 10-15% based on industry standard practices. The stored procedure should be scheduled to run daily via Azure Data Factory or SQL Agent.
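Before scheduling the procedure against live data, the flagging criteria can be desk-checked in isolation. A sketch that mirrors the WHERE clause of `usp_FlagAccountsForRemarketing` for a single policy; the stored procedure remains the system of record, and the dict key names here are illustrative:

```python
from datetime import date, timedelta
from typing import Optional

def should_flag(policy: dict, t: dict, p75: Optional[float], as_of: date) -> bool:
    """Mirror of the stored procedure's eligibility logic for one policy.

    policy keys (illustrative): renewal_date, annual_premium, prior_premium,
    claims_count. t keys: increase_pct, min_policy_premium, max_claims,
    days_before_renewal. p75 is the 75th-percentile market benchmark, if known.
    """
    # Eligibility gates: renewal window, minimum premium, claims cap
    in_window = as_of <= policy["renewal_date"] <= as_of + timedelta(days=t["days_before_renewal"])
    if not (in_window
            and policy["annual_premium"] >= t["min_policy_premium"]
            and policy.get("claims_count", 0) <= t["max_claims"]):
        return False
    # Trigger: term-over-term increase, or premium above the P75 benchmark
    prior = policy.get("prior_premium")
    change_pct = (policy["annual_premium"] - prior) / prior * 100 if prior else 0.0
    above_benchmark = p75 is not None and policy["annual_premium"] > p75
    return change_pct >= t["increase_pct"] or above_benchmark
```

Running a handful of known accounts through this function with the agency's chosen thresholds is a quick way to validate the `RemarketingThresholds` rows before the first production run of the procedure.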

Step 5: AMS API Integration — Data Extraction Pipeline

Configure the data extraction pipeline to pull policy and premium data from the agency's AMS into the Azure SQL data warehouse. The specific integration method depends on the AMS platform. For Applied Epic, use the REST API SDK. For AMS360, use Vertafore's API. For HawkSoft and EZLynx, use available export mechanisms or third-party integration tools. Configure Azure Data Factory pipelines to orchestrate the extraction on a daily schedule.

Azure Data Factory
text
# Azure Data Factory - REST Linked Service for the AMS API (Applied Epic example)
# In ADF Studio, create a new REST Linked Service:
# Name: ls_AppliedEpicAPI
# Base URL: https://api.appliedsystems.com/epic/v1/
# Authentication Type: OAuth2 Client Credentials
# Client ID: @Microsoft.KeyVault(SecretUri=https://kv-insurance-analytics.vault.azure.net/secrets/ams-api-client-id/)
# Client Secret: @Microsoft.KeyVault(SecretUri=https://kv-insurance-analytics.vault.azure.net/secrets/ams-api-client-secret/)
# Token endpoint: https://auth.appliedsystems.com/oauth2/token
# Scope: epic.policies.read epic.clients.read
ams_extract.py
python
# AMS data extraction: OAuth2 auth, paginated policy pull, LOB normalization,
# and Azure SQL upsert. Runs as an Azure Function or ADF custom activity.
# Save as ams_extract.py

import requests
import pyodbc
import json
from datetime import datetime, timedelta

class AMSExtractor:
    def __init__(self, ams_type, api_base_url, client_id, client_secret, sql_conn_string):
        self.ams_type = ams_type
        self.api_base_url = api_base_url
        self.client_id = client_id
        self.client_secret = client_secret
        self.sql_conn = pyodbc.connect(sql_conn_string)
        self.token = None
    
    def authenticate(self):
        """Obtain OAuth2 token from AMS API"""
        if self.ams_type == 'EPIC':
            token_url = 'https://auth.appliedsystems.com/oauth2/token'
        elif self.ams_type == 'AMS360':
            token_url = 'https://api.vertafore.com/oauth/token'
        else:
            raise ValueError(f'Unsupported AMS type: {self.ams_type}')
        
        response = requests.post(token_url, data={
            'grant_type': 'client_credentials',
            'client_id': self.client_id,
            'client_secret': self.client_secret,
            'scope': 'policies.read clients.read'
        })
        response.raise_for_status()
        self.token = response.json()['access_token']
    
    def extract_policies(self, modified_since=None):
        """Extract policies modified since last sync"""
        if not self.token:
            self.authenticate()
        
        headers = {'Authorization': f'Bearer {self.token}', 'Content-Type': 'application/json'}
        
        if modified_since is None:
            modified_since = (datetime.utcnow() - timedelta(days=1)).isoformat()
        
        policies = []
        page = 1
        while True:
            params = {
                'modifiedSince': modified_since,
                'page': page,
                'pageSize': 100,
                'status': 'Active'
            }
            response = requests.get(
                f'{self.api_base_url}/policies',
                headers=headers,
                params=params
            )
            response.raise_for_status()
            data = response.json()
            
            if not data.get('items'):
                break
            policies.extend(data['items'])
            page += 1
        
        return policies
    
    def normalize_policy(self, raw_policy):
        """Normalize AMS-specific policy data to common schema"""
        if self.ams_type == 'EPIC':
            return {
                'AMSPolicyNumber': raw_policy.get('policyNumber'),
                'AMSSource': 'EPIC',
                'ClientName': raw_policy.get('insuredName', ''),
                'CarrierCode': raw_policy.get('carrierCode', ''),
                'CarrierName': raw_policy.get('carrierName', ''),
                'LineOfBusiness': self._map_lob(raw_policy.get('lineOfBusiness', '')),
                'CoverageType': raw_policy.get('coverageDescription', ''),
                'PolicyEffectiveDate': raw_policy.get('effectiveDate'),
                'PolicyExpirationDate': raw_policy.get('expirationDate'),
                'AnnualPremium': float(raw_policy.get('annualPremium', 0)),
                'PriorTermPremium': float(raw_policy.get('priorTermPremium', 0)) if raw_policy.get('priorTermPremium') else None,
                'State': raw_policy.get('ratingState', ''),
                'ZipCode': raw_policy.get('zipCode', ''),
                'ProducerCode': raw_policy.get('producerCode', ''),
                'ProducerName': raw_policy.get('producerName', ''),
            }
        # Add AMS360, EZLynx, HawkSoft mappings as needed
        return raw_policy
    
    def _map_lob(self, ams_lob):
        """Map AMS-specific LOB codes to standard codes"""
        lob_mapping = {
            'Personal Auto': 'PERSONAL_AUTO',
            'Homeowner': 'HOMEOWNERS',
            'Home': 'HOMEOWNERS',
            'Commercial Property': 'COMMERCIAL_PROPERTY',
            'Commercial Auto': 'COMMERCIAL_AUTO',
            'General Liability': 'COMMERCIAL_GL',
            'Workers Compensation': 'WORKERS_COMP',
            'BOP': 'BOP',
            'Businessowners': 'BOP',
            'Umbrella': 'UMBRELLA',
            'Excess': 'UMBRELLA',
        }
        return lob_mapping.get(ams_lob, ams_lob.upper().replace(' ', '_'))
    
    def upsert_to_warehouse(self, normalized_policies):
        """Upsert normalized policy data into Azure SQL"""
        cursor = self.sql_conn.cursor()
        for p in normalized_policies:
            cursor.execute("""
                MERGE dbo.Policies AS target
                USING (SELECT ? AS AMSPolicyNumber, ? AS AMSSource) AS source
                ON target.AMSPolicyNumber = source.AMSPolicyNumber AND target.AMSSource = source.AMSSource
                WHEN MATCHED THEN UPDATE SET
                    ClientName = ?, CarrierCode = ?, CarrierName = ?,
                    LineOfBusiness = ?, CoverageType = ?,
                    PolicyEffectiveDate = ?, PolicyExpirationDate = ?,
                    AnnualPremium = ?, PriorTermPremium = ?,
                    State = ?, ZipCode = ?,
                    ProducerCode = ?, ProducerName = ?,
                    RenewalDate = ?, LastSyncDate = GETUTCDATE(), ModifiedDate = GETUTCDATE()
                WHEN NOT MATCHED THEN INSERT (
                    AMSPolicyNumber, AMSSource, ClientID, ClientName,
                    CarrierCode, CarrierName, LineOfBusiness, CoverageType,
                    PolicyEffectiveDate, PolicyExpirationDate,
                    AnnualPremium, PriorTermPremium, State, ZipCode,
                    ProducerCode, ProducerName, RenewalDate
                ) VALUES (?, ?, NEWID(), ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
            """, (
                p['AMSPolicyNumber'], p['AMSSource'],
                p['ClientName'], p['CarrierCode'], p['CarrierName'],
                p['LineOfBusiness'], p['CoverageType'],
                p['PolicyEffectiveDate'], p['PolicyExpirationDate'],
                p['AnnualPremium'], p['PriorTermPremium'],
                p['State'], p['ZipCode'],
                p['ProducerCode'], p['ProducerName'],
                p['PolicyExpirationDate'],  # RenewalDate = expiration for renewal policies
                # INSERT values
                p['AMSPolicyNumber'], p['AMSSource'], p['ClientName'],
                p['CarrierCode'], p['CarrierName'],
                p['LineOfBusiness'], p['CoverageType'],
                p['PolicyEffectiveDate'], p['PolicyExpirationDate'],
                p['AnnualPremium'], p['PriorTermPremium'],
                p['State'], p['ZipCode'],
                p['ProducerCode'], p['ProducerName'],
                p['PolicyExpirationDate']
            ))
        cursor.commit()
        return len(normalized_policies)
pipeline_daily_policy_sync.json — Azure Data Factory pipeline definition; import into ADF Studio (the description lives here because JSON does not allow comments)
json

{
  "name": "pipeline_daily_policy_sync",
  "properties": {
    "activities": [
      {
        "name": "Extract_AMS_Policies",
        "type": "AzureFunctionActivity",
        "typeProperties": {
          "functionName": "ams_extract",
          "method": "POST",
          "body": {
            "ams_type": "EPIC",
            "modified_since": "@pipeline().TriggerTime"
          }
        }
      },
      {
        "name": "Run_Remarketing_Flagging",
        "type": "SqlServerStoredProcedure",
        "dependsOn": [{"activity": "Extract_AMS_Policies", "dependencyConditions": ["Succeeded"]}],
        "typeProperties": {
          "storedProcedureName": "dbo.usp_FlagAccountsForRemarketing"
        },
        "linkedServiceName": {
          "referenceName": "ls_AzureSQL_PremiumWarehouse",
          "type": "LinkedServiceReference"
        }
      }
    ]
  }
}
Note

The Python extraction script should be deployed as an Azure Function (Python runtime) for serverless execution, or as a custom activity in Azure Data Factory. The AMS API endpoints and authentication flows vary significantly between vendors — Applied Epic uses OAuth2 with their auth server, AMS360 uses Vertafore's TransactNOW platform, and HawkSoft may require CSV/XML export with file-based ingestion. For HawkSoft agencies, consider using their scheduled report export feature to generate CSV files that are uploaded to Azure Blob Storage and then ingested via ADF. Always implement incremental extraction (modifiedSince parameter) to avoid full reloads.
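The incremental-extraction pattern the note recommends can be sketched as a watermark lookup against the warehouse. This is a hedged sketch, not vendor code — it assumes the LastSyncDate/AMSSource columns from the warehouse schema used earlier, and a full-load epoch of 2000-01-01 on first run:

```python
from datetime import datetime, timezone

def get_modified_since(cursor, ams_source):
    """Return the high-water mark for incremental extraction: the most recent
    successful sync time recorded in the warehouse for this AMS source,
    falling back to a full historical load on first run."""
    cursor.execute(
        "SELECT MAX(LastSyncDate) FROM dbo.Policies WHERE AMSSource = ?",
        (ams_source,)
    )
    row = cursor.fetchone()
    last_sync = row[0] if row else None
    if last_sync is None:
        # First run: pull everything from an agreed epoch
        return datetime(2000, 1, 1, tzinfo=timezone.utc)
    return last_sync
```

Pass the returned value as the `modifiedSince` parameter on the AMS API call (or as `modified_since` in the ADF pipeline body) so each run only pulls policies changed since the last successful sync.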

Step 6: Market Benchmark Data Integration

Configure the ingestion of market benchmark data from Zywave Analytics and/or Ivans Insights into the MarketBenchmarks table. This data provides the comparison baseline for flagging accounts that are priced above market. For agencies using Applied Epic with PLRI, this data is built-in; for others, manual or API-based benchmark feeds must be configured.

Zywave Analytics API integration
python
# Azure Function to pull Zywave benchmark data monthly
# (only applicable if the agency has Zywave Analytics API access)

import requests
import pyodbc

def ingest_zywave_benchmarks(zywave_api_key, sql_conn_string):
    headers = {'Authorization': f'Bearer {zywave_api_key}'}
    
    lines_of_business = [
        'PERSONAL_AUTO', 'HOMEOWNERS', 'COMMERCIAL_PROPERTY',
        'COMMERCIAL_AUTO', 'COMMERCIAL_GL', 'WORKERS_COMP', 'BOP'
    ]
    states = ['AL','AK','AZ','AR','CA','CO','CT','DE','FL','GA','HI','ID','IL','IN','IA',
              'KS','KY','LA','ME','MD','MA','MI','MN','MS','MO','MT','NE','NV','NH','NJ',
              'NM','NY','NC','ND','OH','OK','OR','PA','RI','SC','SD','TN','TX','UT','VT',
              'VA','WA','WV','WI','WY']
    
    conn = pyodbc.connect(sql_conn_string)
    cursor = conn.cursor()
    
    for lob in lines_of_business:
        for state in states:
            response = requests.get(
                'https://api.zywave.com/analytics/v1/benchmarks',  # illustrative endpoint — confirm against current Zywave API docs
                headers=headers,
                params={'lineOfBusiness': lob, 'state': state},
                timeout=30
            )
            if response.status_code == 200:
                data = response.json()
                cursor.execute("""
                    MERGE dbo.MarketBenchmarks AS target
                    USING (SELECT ? AS DataSource, ? AS LineOfBusiness, ? AS State, ? AS BenchmarkPeriod) AS source
                    ON target.DataSource = source.DataSource 
                        AND target.LineOfBusiness = source.LineOfBusiness 
                        AND target.State = source.State
                        AND target.BenchmarkPeriod = source.BenchmarkPeriod
                    WHEN MATCHED THEN UPDATE SET
                        MedianPremium = ?, AvgPremium = ?, P25Premium = ?, P75Premium = ?,
                        SampleSize = ?, AvgPremiumChangePercent = ?, LastUpdated = GETUTCDATE()
                    WHEN NOT MATCHED THEN INSERT (
                        DataSource, LineOfBusiness, State, BenchmarkPeriod,
                        MedianPremium, AvgPremium, P25Premium, P75Premium,
                        SampleSize, AvgPremiumChangePercent
                    ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?);
                """, (
                    'ZYWAVE', lob, state, data.get('period'),
                    data.get('medianPremium'), data.get('avgPremium'),
                    data.get('p25Premium'), data.get('p75Premium'),
                    data.get('sampleSize'), data.get('avgChangePercent'),
                    'ZYWAVE', lob, state, data.get('period'),
                    data.get('medianPremium'), data.get('avgPremium'),
                    data.get('p25Premium'), data.get('p75Premium'),
                    data.get('sampleSize'), data.get('avgChangePercent')
                ))
    cursor.commit()
    conn.close()
Manual benchmark data import via CSV using BULK INSERT
sql
-- Alternative: Manual benchmark data import via CSV
-- For agencies without API access to Zywave/Ivans, prepare a CSV template
-- and import quarterly using ADF or SSMS

-- CSV Template (benchmark_import_template.csv):
-- DataSource,LineOfBusiness,CoverageType,State,ZipCodePrefix,BenchmarkPeriod,MedianPremium,AvgPremium,P25Premium,P75Premium,SampleSize,AvgPremiumChangePercent
-- INDUSTRY,HOMEOWNERS,,FL,,2025-01-01,2850.00,3100.00,2200.00,3800.00,15000,12.5
-- INDUSTRY,PERSONAL_AUTO,,FL,,2025-01-01,1950.00,2100.00,1500.00,2600.00,25000,8.3

-- Import command (on-premise SQL Server via SSMS). Note: Azure SQL Database
-- cannot read local file paths — for Azure SQL, stage the CSV in Azure Blob
-- Storage and reference it with the DATA_SOURCE option instead.
BULK INSERT dbo.MarketBenchmarks
FROM '/data/benchmark_import_template.csv'
WITH (
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FORMAT = 'CSV'
);
Note

Benchmark data availability varies significantly by line of business. Personal lines (auto, home) have the most robust benchmark data. Commercial lines benchmarks are harder to obtain and less granular. Zywave's 400,000+ program database is one of the best commercial lines sources. Ivans Insights currently focuses on homeowners/dwelling fire with more lines coming. For lines without market benchmark data, the system falls back to term-over-term premium change analysis (which is always available from the AMS). Update benchmark data at minimum quarterly; monthly is preferred. The MSP should establish a quarterly benchmark refresh as part of the managed service agreement.
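The fallback described above — use the market benchmark where one exists, otherwise term-over-term change — can be sketched as a small decision function. The 15% threshold and status labels here are illustrative; the production logic lives in dbo.usp_FlagAccountsForRemarketing with per-LOB thresholds from dbo.RemarketingThresholds:

```python
def benchmark_status(annual_premium, prior_premium, p75_premium=None,
                     increase_threshold_pct=15.0):
    """Decide whether a policy looks overpriced.

    Prefers the market benchmark (above the 75th percentile => flag); when no
    benchmark row exists for the LOB/state, falls back to term-over-term
    premium change, which is always available from the AMS.
    """
    if p75_premium is not None:
        return 'ABOVE_MARKET' if annual_premium > p75_premium else 'WITHIN_MARKET'
    # Fallback: term-over-term change
    if not prior_premium:  # new business or missing prior term — nothing to compare
        return 'NO_BASELINE'
    change_pct = (annual_premium - prior_premium) / prior_premium * 100
    return 'ABOVE_THRESHOLD' if change_pct > increase_threshold_pct else 'WITHIN_THRESHOLD'
```

For example, a $3,600 renewal on a $3,000 prior term with no benchmark row is a +20% change and comes back ABOVE_THRESHOLD at the default 15% threshold.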

Step 7: Power BI Dashboard Development

Build the Power BI dashboards that agency producers and account managers will use daily to view premium benchmarking results, identify remarketing candidates, and track remarketing outcomes. Create a multi-page report with executive summary, renewal pipeline, benchmark comparison, and remarketing tracking views. Connect Power BI to the Azure SQL premium data warehouse using DirectQuery for real-time data.

1
Open Power BI Desktop
2
Get Data > Azure > Azure SQL Database
3
Server: sql-insurance-benchmarking.database.windows.net
4
Database: db-premium-benchmarking
5
Data Connectivity Mode: DirectQuery (for real-time data)
6
Authentication: Azure AD (SSO)
DAX Measures for Premium Benchmarking Dashboard — add these to the Power BI data model
dax
// Measure: Total Policies
Total Policies = COUNTROWS(Policies)

// Measure: Total Premium
Total Premium = SUM(Policies[AnnualPremium])

// Measure: Average Premium Change %
Avg Premium Change = AVERAGE(Policies[PremiumChangePercent])

// Measure: Policies Flagged for Remarketing
Flagged for Remarketing = 
    CALCULATE(
        COUNTROWS(Policies),
        Policies[RemarketingFlag] = TRUE()
    )

// Measure: Remarketing Rate
Remarketing Rate = 
    DIVIDE(
        [Flagged for Remarketing],
        [Total Policies],
        0
    )

// Measure: Premium at Risk (total premium of flagged policies)
Premium at Risk = 
    CALCULATE(
        SUM(Policies[AnnualPremium]),
        Policies[RemarketingFlag] = TRUE()
    )

// Measure: Policies Above Market Benchmark
Above Market = 
    CALCULATE(
        COUNTROWS(Policies),
        FILTER(
            Policies,
            Policies[AnnualPremium] > RELATED(MarketBenchmarks[P75Premium])
        )
    )

// Measure: Remarketing Success Rate
Remarketing Success Rate = 
    DIVIDE(
        CALCULATE(COUNTROWS(RemarketingActions), RemarketingActions[Outcome] = "REMARKETED"),
        CALCULATE(COUNTROWS(RemarketingActions), RemarketingActions[Outcome] <> "PENDING"),
        0
    )

// Measure: Total Savings Achieved
Total Savings = 
    CALCULATE(
        SUM(RemarketingActions[SavingsAmount]),
        RemarketingActions[Outcome] = "REMARKETED"
    )

// Measure: Renewals Next 30/60/90 Days
Renewals Next 30 Days = 
    CALCULATE(
        COUNTROWS(Policies),
        Policies[RenewalDate] >= TODAY(),
        Policies[RenewalDate] <= TODAY() + 30
    )

Renewals Next 60 Days = 
    CALCULATE(
        COUNTROWS(Policies),
        Policies[RenewalDate] >= TODAY(),
        Policies[RenewalDate] <= TODAY() + 60
    )

Renewals Next 90 Days = 
    CALCULATE(
        COUNTROWS(Policies),
        Policies[RenewalDate] >= TODAY(),
        Policies[RenewalDate] <= TODAY() + 90
    )

Power BI Report Pages Layout Specification

  • PAGE 1: Executive Dashboard — KPI Cards: Total Policies, Total Premium, Avg Premium Change, Flagged for Remarketing, Premium at Risk; Donut Chart: Policies by Line of Business; Bar Chart: Premium Change % by Carrier (top 10); Line Chart: Monthly Premium Trend (12-month rolling); Map Visual: Premium by State
  • PAGE 2: Renewal Pipeline — Table: Upcoming Renewals (next 90 days) with columns: Client, Policy#, Carrier, Current Premium, Prior Premium, Change%, LOB, Renewal Date, Remarketing Flag; Slicer: Date Range (30/60/90 days); Slicer: Line of Business; Slicer: Producer; Conditional formatting: Red for >15% increase, Yellow for 10–15%, Green for <10%
  • PAGE 3: Benchmark Comparison — Scatter Plot: Client Premium (Y) vs Market Median (X) by LOB — points above diagonal are overpriced; Box Plot: Premium distribution by LOB with client premiums overlaid; Table: Policies above 75th percentile benchmark with delta amount; Slicer: State, LOB, Carrier
  • PAGE 4: Remarketing Tracker — Funnel: Flagged > In Progress > Quotes Received > Remarketed / Retained; Table: Active Remarketing Actions with status, assigned producer, carriers submitted, best quote; KPI Cards: Total Savings, Success Rate, Avg Days to Complete; Bar Chart: Remarketing Outcomes by Producer
  • PAGE 5: Producer Scorecard — Table: Producer-level metrics — book size, avg premium change, remarketing rate, retention rate, savings generated; Column Chart: Producer comparison on key metrics
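The conditional-formatting bands on Page 2 are worth pinning down, since 15% sits on a boundary. In Power BI this is a DAX measure driving conditional formatting; the Python below only documents the intended band edges (>15% red, 10–15% yellow, <10% green — exactly 15% falls in the yellow band):

```python
def renewal_rag_status(change_pct):
    """RAG color for the Page 2 renewal table conditional formatting:
    red for >15% increase, yellow for 10-15% inclusive, green for <10%."""
    if change_pct > 15:
        return 'RED'
    if change_pct >= 10:
        return 'YELLOW'
    return 'GREEN'
```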
1
In Power BI Desktop: File > Publish > Publish to Power BI
2
Select the client's Power BI workspace
3
Configure scheduled refresh (Import mode only — DirectQuery datasets query Azure SQL live and need no refresh): Workspace > Dataset > Settings > Scheduled Refresh
4
If using Import mode, set refresh to 4x daily (6am, 10am, 2pm, 6pm)
5
Configure row-level security (RLS) so producers only see their own book — see RLS Role Definition below
6
Assign users to RLS roles in Power BI Service: Workspace > Dataset > Security > Add members to ProducerFilter role
7
Configure email alerts: On the Remarketing KPI card, set alert — When 'Flagged for Remarketing' increases by > 5, send email to agency principal
Row-Level Security (RLS) DAX filter
dax
// Defines the ProducerFilter role so each producer sees only their own book
// Power BI Desktop > Modeling > Manage Roles
// Role Name: ProducerFilter
// Table: Policies
// In the Power BI Service, USERNAME() resolves to the user's UPN (email).
// If ProducerName holds display names rather than UPNs, add a ProducerEmail
// column and filter on it with USERPRINCIPALNAME() instead.
[ProducerName] = USERNAME()
Note

Power BI Pro licenses are required for all dashboard consumers ($14/user/month or included in M365 E5). Use DirectQuery mode for real-time data from Azure SQL — this ensures producers always see current remarketing flags without waiting for dataset refresh. However, DirectQuery has performance implications for complex DAX calculations; consider Import mode with 4x daily refresh if the agency has a large book (>10,000 policies) and query performance is slow. The Row-Level Security configuration is critical — producers should only see their own book of business. Agency principals and operations managers get unfiltered access. Leave the .pbix file with the client and document how to modify thresholds.

Step 8: InsuredMine CRM Integration for Remarketing Workflows

Deploy and configure InsuredMine as the CRM overlay for automated remarketing workflow execution. InsuredMine connects to the agency's AMS and provides pipeline-based workflow management, automated client communications, and producer task assignment. Configure remarketing pipeline stages, automated triggers, and email/text templates for client outreach during the remarketing process.

1
Account Creation & AMS Integration — Navigate to: Settings > Integrations > Agency Management System. Select AMS type (Applied Epic / AMS360 / EZLynx / HawkSoft). Enter AMS API credentials. Enable bi-directional sync: Contacts, Policies, Renewals. Set sync frequency: Every 4 hours.
2
Create Remarketing Pipeline — Navigate to: Pipelines > Create New Pipeline. Pipeline Name: 'Premium Remarketing Pipeline'. Stages: Stage 1: 'Flagged for Review' (auto-populated from benchmark system), Stage 2: 'Producer Review' (producer confirms remarketing action), Stage 3: 'Submission Sent' (submissions sent to carriers), Stage 4: 'Quotes Received' (competitive quotes returned), Stage 5: 'Client Presentation' (presenting options to client), Stage 6: 'Remarketed - Won' (client moved to new carrier), Stage 7: 'Retained - Current Carrier' (client stays, premium negotiated), Stage 8: 'Declined - No Action' (client declined all options).
3
Configure Pipeline Automation Rules — Navigate to: Automations > Create New Automation. Trigger: 'Deal moves to Submission Sent stage'. Action: 'Send email to client using template: Remarketing_In_Progress'. Action: 'Create task for producer: Follow up on quotes in 5 business days'.
Azure Function webhook receiver
python
# Syncs InsuredMine pipeline stage changes back to Azure SQL

# InsuredMine Webhook Configuration for Azure SQL Integration
# This webhook pushes remarketing status updates from InsuredMine back to Azure SQL

# Azure Function to receive InsuredMine webhook events:
import azure.functions as func
import pyodbc
import os
import json

def main(req: func.HttpRequest) -> func.HttpResponse:
    try:
        payload = req.get_json()
        event_type = payload.get('event_type')
        deal = payload.get('deal', {})
        
        conn_string = os.environ['SQL_CONNECTION_STRING']
        conn = pyodbc.connect(conn_string)
        cursor = conn.cursor()
        
        if event_type == 'deal.stage_changed':
            policy_number = deal.get('custom_fields', {}).get('policy_number')
            new_stage = deal.get('stage_name')
            
            status_mapping = {
                'Flagged for Review': 'FLAGGED',
                'Producer Review': 'FLAGGED',
                'Submission Sent': 'IN_PROGRESS',
                'Quotes Received': 'IN_PROGRESS',
                'Client Presentation': 'IN_PROGRESS',
                'Remarketed - Won': 'COMPLETED',
                'Retained - Current Carrier': 'COMPLETED',
                'Declined - No Action': 'DECLINED'
            }
            
            outcome_mapping = {
                'Remarketed - Won': 'REMARKETED',
                'Retained - Current Carrier': 'RETAINED',
                'Declined - No Action': 'LOST'
            }
            
            new_status = status_mapping.get(new_stage, 'IN_PROGRESS')
            outcome = outcome_mapping.get(new_stage)
            
            cursor.execute("""
                UPDATE dbo.Policies 
                SET RemarketingStatus = ?, ModifiedDate = GETUTCDATE()
                WHERE AMSPolicyNumber = ?
            """, (new_status, policy_number))
            
            if outcome:
                cursor.execute("""
                    UPDATE dbo.RemarketingActions 
                    SET Outcome = ?, OutcomeDate = GETUTCDATE(), ModifiedDate = GETUTCDATE()
                    WHERE PolicyID = (SELECT PolicyID FROM dbo.Policies WHERE AMSPolicyNumber = ?)
                    AND Outcome = 'PENDING'
                """, (outcome, policy_number))
            
            cursor.commit()
        
        conn.close()
        return func.HttpResponse('OK', status_code=200)
    except Exception as e:
        return func.HttpResponse(str(e), status_code=500)

Create the following email templates in InsuredMine > Templates > Email:

Template 1: Remarketing_Notification

Subject: Important: Your {{policy_type}} Policy Renewal Update

Dear {{client_first_name}},

As your trusted insurance advisor, we proactively review your policies before each renewal to ensure you're getting the best value. We've identified that your {{policy_type}} policy with {{carrier_name}} is renewing with a premium change. We're currently reviewing competitive options from our carrier partners to ensure you have the best coverage at the most competitive price.

Your current premium: ${{current_premium}}
Renewal premium: ${{renewal_premium}}

We'll be in touch shortly with options. No action is needed from you at this time.

Best regards,
{{producer_name}}
{{agency_name}}

Template 2: Remarketing_Options_Ready

Subject: Great News - Competitive Options for Your {{policy_type}} Insurance

Dear {{client_first_name}},

We've completed our market review and have competitive options for your {{policy_type}} policy. I'd like to schedule a brief call to review your options and ensure you have the coverage that best fits your needs.

Please click below to schedule a convenient time: {{booking_link}}

Best regards,
{{producer_name}}
Note

InsuredMine offers native integrations with Applied Epic, AMS360, EZLynx, HawkSoft, and others — verify the specific AMS integration is supported during procurement. The webhook integration back to Azure SQL is optional but recommended for keeping the Power BI dashboards in sync with remarketing progress tracked in InsuredMine. If ReFocus AI is used instead of or alongside InsuredMine, configure similar webhook-based status synchronization. Train the agency team on pipeline management — this is where they'll spend most of their daily workflow time.

Step 9: Automated Remarketing Trigger Configuration

Configure the automated trigger system that pushes flagged accounts from the Azure SQL data warehouse into the InsuredMine remarketing pipeline. This Azure Function runs after each daily policy sync, identifies newly flagged policies, and creates corresponding deals in InsuredMine via API. This is the bridge between the analytics engine and the operational workflow.

Azure Function: push_remarketing_to_crm.py — Trigger: Timer (runs daily at 7:00 AM after the 6:00 AM policy sync), or Event Grid trigger from ADF pipeline completion
python
# Azure Function: push_remarketing_to_crm.py
# Trigger: Timer (runs daily at 7:00 AM after the 6:00 AM policy sync)
# Or: Event Grid trigger from ADF pipeline completion

import azure.functions as func
import pyodbc
import requests
import json
import os
from datetime import datetime

def main(timer: func.TimerRequest) -> None:
    sql_conn_string = os.environ['SQL_CONNECTION_STRING']
    insuredmine_api_key = os.environ['INSUREDMINE_API_KEY']
    insuredmine_base_url = 'https://api.insuredmine.com/v1'
    
    conn = pyodbc.connect(sql_conn_string)
    cursor = conn.cursor()
    
    # Get newly flagged policies not yet pushed to CRM
    cursor.execute("""
        SELECT 
            p.PolicyID, p.AMSPolicyNumber, p.ClientName, p.CarrierName,
            p.LineOfBusiness, p.AnnualPremium, p.PriorTermPremium,
            p.PremiumChangePercent, p.RenewalDate, p.ProducerName,
            p.RemarketingReason, p.State
        FROM dbo.Policies p
        INNER JOIN dbo.RemarketingActions ra 
            ON p.PolicyID = ra.PolicyID AND ra.Outcome = 'PENDING'
        WHERE p.RemarketingFlag = 1 
        AND p.RemarketingStatus = 'FLAGGED'
        AND NOT EXISTS (
            SELECT 1 FROM dbo.RemarketingActions ra2 
            WHERE ra2.PolicyID = p.PolicyID 
            AND ra2.CRMDealCreated = 1
        )
    """)
    
    flagged_policies = cursor.fetchall()
    
    headers = {
        'Authorization': f'Bearer {insuredmine_api_key}',
        'Content-Type': 'application/json'
    }
    
    for policy in flagged_policies:
        deal_data = {
            'pipeline_name': 'Premium Remarketing Pipeline',
            'stage_name': 'Flagged for Review',
            'deal_name': f'Remarket: {policy.ClientName} - {policy.LineOfBusiness}',
            'deal_value': float(policy.AnnualPremium),
            'assigned_to': policy.ProducerName,
            'expected_close_date': policy.RenewalDate.isoformat() if policy.RenewalDate else None,
            'custom_fields': {
                'policy_number': policy.AMSPolicyNumber,
                'carrier': policy.CarrierName,
                'current_premium': float(policy.AnnualPremium),
                'prior_premium': float(policy.PriorTermPremium) if policy.PriorTermPremium else None,
                'premium_change_pct': float(policy.PremiumChangePercent),
                'line_of_business': policy.LineOfBusiness,
                'remarketing_reason': policy.RemarketingReason,
                'state': policy.State
            },
            'tags': ['auto-remarketing', policy.LineOfBusiness.lower()]
        }
        
        response = requests.post(
            f'{insuredmine_base_url}/deals',
            headers=headers,
            json=deal_data,
            timeout=30
        )
        
        if response.status_code in (200, 201):
            # Mark as pushed to CRM
            cursor.execute("""
                UPDATE dbo.RemarketingActions 
                SET CRMDealCreated = 1, ModifiedDate = GETUTCDATE()
                WHERE PolicyID = ? AND Outcome = 'PENDING'
            """, (str(policy.PolicyID),))
        # On a non-2xx response the row stays unflagged, so the next daily run retries it
    
    cursor.commit()
    conn.close()
Add CRMDealCreated column to RemarketingActions if not already present
sql
ALTER TABLE dbo.RemarketingActions ADD CRMDealCreated BIT DEFAULT 0;
Azure Function timer trigger configuration (function.json)
json
{
  "scriptFile": "push_remarketing_to_crm.py",
  "bindings": [
    {
      "name": "timer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 7 * * *"
    }
  ]
}
Note

This function is the critical automation bridge. If InsuredMine's API doesn't support deal creation via REST (check current API documentation during implementation), use InsuredMine's Zapier integration as a fallback — create a Zap that monitors a shared Google Sheet or Azure Blob CSV file and creates deals in InsuredMine. Alternatively, if ReFocus AI is used, their platform has native AMS integration for automatic remarketing triggers and can replace this custom function. Monitor the function daily during the first 2 weeks to ensure deals are being created correctly.
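If the Zapier fallback is needed, the drop file the Zap watches can be generated with the standard csv module. The column set below is an assumption modeled on the deal fields used in push_remarketing_to_crm.py; the upload to Azure Blob Storage (azure-storage-blob or azcopy) is omitted:

```python
import csv
import io

# Assumed column layout for the Zapier-watched drop file — adjust to match
# the fields your Zap maps into InsuredMine deal properties.
CSV_COLUMNS = ['policy_number', 'client_name', 'carrier', 'line_of_business',
               'current_premium', 'premium_change_pct', 'renewal_date', 'producer']

def flagged_policies_to_csv(policies):
    """Render flagged policies (a list of dicts keyed by CSV_COLUMNS) as CSV
    text, ready to be uploaded to the Blob container the Zap monitors."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=CSV_COLUMNS)
    writer.writeheader()
    for p in policies:
        writer.writerow(p)
    return buf.getvalue()
```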

Step 10: Network Security and Compliance Configuration

Configure the network infrastructure to meet GLBA Safeguards Rule and NAIC Model Law #668 requirements. This includes firewall rules for API traffic, VLAN segmentation, TLS enforcement, and audit logging. This step is essential for compliance and should be documented thoroughly for the agency's information security plan.

Fortinet FortiGate 60F VLAN, firewall policy, logging configuration, and Azure SQL security settings
shell
# Fortinet FortiGate 60F Configuration
# Access FortiGate web console at https://192.168.1.99 (default)

# 1. Create VLAN for Analytics Traffic
config system interface
    edit "analytics-vlan"
        set vdom "root"
        set ip 10.10.20.1 255.255.255.0
        set allowaccess ping https ssh
        set interface "internal"
        set vlanid 20
    next
end

# 2. Create address objects for cloud services
config firewall address
    edit "Azure-SQL"
        set type fqdn
        set fqdn "sql-insurance-benchmarking.database.windows.net"
    next
    edit "Azure-DataFactory"
        set type fqdn
        set fqdn "*.datafactory.azure.net"
    next
    edit "InsuredMine-API"
        set type fqdn
        set fqdn "api.insuredmine.com"
    next
    edit "Applied-API"
        set type fqdn
        set fqdn "api.appliedsystems.com"
    next
    edit "PowerBI-Service"
        set type fqdn
        set fqdn "*.powerbi.com"
    next
end

# 3. Create firewall policy for analytics VLAN outbound
config firewall policy
    edit 100
        set name "Analytics-to-Cloud"
        set srcintf "analytics-vlan"
        set dstintf "wan1"
        set srcaddr "all"
        set dstaddr "Azure-SQL" "Azure-DataFactory" "InsuredMine-API" "Applied-API" "PowerBI-Service"
        set action accept
        set schedule "always"
        set service "HTTPS" "MSSQL"
        set ssl-ssh-profile "certificate-inspection"
        set logtraffic all
        set comments "Allow analytics traffic to cloud services"
    next
end

# 4. Enable logging for compliance audit trail
config log fortianalyzer setting
    set status enable
end
config log syslogd setting
    set status enable
    set server "<syslog-server-ip>"
end
# Azure SQL - Enable Advanced Threat Protection
az sql db threat-policy update \
  --resource-group rg-insurance-analytics \
  --server sql-insurance-benchmarking \
  --name db-premium-benchmarking \
  --state Enabled \
  --email-addresses security@msp-domain.com agency-principal@agency.com

# Enable Azure SQL Auditing to Log Analytics
az monitor diagnostic-settings create \
  --name sql-audit-logs \
  --resource /subscriptions/<sub-id>/resourceGroups/rg-insurance-analytics/providers/Microsoft.Sql/servers/sql-insurance-benchmarking/databases/db-premium-benchmarking \
  --workspace /subscriptions/<sub-id>/resourceGroups/rg-insurance-analytics/providers/Microsoft.OperationalInsights/workspaces/<workspace-name> \
  --logs '[{"category":"SQLSecurityAuditEvents","enabled":true,"retentionPolicy":{"enabled":true,"days":365}}]'
# Enforce TLS 1.2 minimum on Azure SQL
az sql server update \
  --resource-group rg-insurance-analytics \
  --name sql-insurance-benchmarking \
  --minimal-tls-version 1.2
Note

GLBA Safeguards Rule compliance requires a written information security plan. Document all firewall rules, VLAN configurations, encryption settings, and access controls in a formal Information Security Policy document. Provide this to the agency principal for their compliance file. NAIC Model Law #668 requires annual risk assessments — add this to the MSP's annual service review checklist. Agencies with fewer than 10 employees are exempt from some NAIC #668 provisions but still must comply with breach notification requirements. Keep all audit logs for a minimum of 3 years.

Step 11: End-to-End Testing and Validation

Conduct thorough testing of the entire pipeline from AMS data extraction through benchmark comparison, remarketing flagging, CRM deal creation, and Power BI dashboard display. Use a representative sample of real policy data to validate accuracy. This phase should involve the agency principal and a senior CSR who can verify that benchmark comparisons and remarketing flags are reasonable and accurate.

bash
# Test 1: Verify AMS data extraction
# Run the extraction pipeline manually (Azure CLI)
az datafactory pipeline create-run \
  --resource-group rg-insurance-analytics \
  --factory-name adf-insurance-etl \
  --name pipeline_daily_policy_sync

# Verify data in Azure SQL — run in SSMS or Azure Data Studio (T-SQL, not bash):
SELECT TOP 20 
    AMSPolicyNumber, ClientName, CarrierName, LineOfBusiness,
    AnnualPremium, PriorTermPremium, PremiumChangePercent,
    RenewalDate, State
FROM dbo.Policies
ORDER BY LastSyncDate DESC;

# Test 2: Verify remarketing flagging (T-SQL)
EXEC dbo.usp_FlagAccountsForRemarketing;

-- Check flagged accounts
SELECT 
    p.ClientName, p.AMSPolicyNumber, p.CarrierName,
    p.AnnualPremium, p.PriorTermPremium, p.PremiumChangePercent,
    p.RemarketingReason, p.RemarketingStatus,
    t.PremiumIncreaseThresholdPercent as ThresholdUsed
FROM dbo.Policies p
INNER JOIN dbo.RemarketingThresholds t ON p.LineOfBusiness = t.LineOfBusiness
WHERE p.RemarketingFlag = 1
ORDER BY p.PremiumChangePercent DESC;

# Test 3: Verify benchmark comparison accuracy (T-SQL)
SELECT 
    p.ClientName, p.LineOfBusiness, p.State,
    p.AnnualPremium as ClientPremium,
    b.MedianPremium as MarketMedian,
    b.P75Premium as Market75thPctile,
    CASE WHEN p.AnnualPremium > b.P75Premium THEN 'ABOVE MARKET' ELSE 'WITHIN MARKET' END as BenchmarkStatus
FROM dbo.Policies p
INNER JOIN dbo.MarketBenchmarks b ON 
    p.LineOfBusiness = b.LineOfBusiness AND p.State = b.State
WHERE b.BenchmarkPeriod = (SELECT MAX(BenchmarkPeriod) FROM dbo.MarketBenchmarks)
ORDER BY (p.AnnualPremium - b.P75Premium) DESC;

# Test 4: Verify InsuredMine deal creation (bash)
# Check that flagged policies appear as deals in InsuredMine
curl -X GET 'https://api.insuredmine.com/v1/deals?pipeline=Premium%20Remarketing%20Pipeline&stage=Flagged%20for%20Review' \
  -H 'Authorization: Bearer <insuredmine-api-key>' | python -m json.tool
Note

Testing should span at least 5 business days to verify the daily automated cycle works end-to-end. Have the agency CSR manually verify 20-30 flagged accounts against what they know about those clients. Common issues during testing: (1) LOB mapping mismatches causing incorrect benchmark comparisons, (2) policies with $0 prior premium generating false 'infinite increase' flags, (3) timezone differences causing duplicate sync runs. Fix any issues before proceeding to training. Document all test results in a formal testing report for the client handoff.
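Issue (2) above — $0 prior premiums producing "infinite increase" flags — is worth guarding in any premium-change calculation. A minimal sketch of the guard, assuming NULL/zero priors should simply be excluded from flagging:

```python
def safe_premium_change_pct(current, prior):
    """Return the term-over-term premium change percentage, or None when the
    prior-term premium is missing or zero (new business, rewrites) so the
    policy is excluded from premium-change flagging instead of generating a
    huge false-positive percentage."""
    if prior is None or prior <= 0:
        return None
    return round((current - prior) / prior * 100, 2)
```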

Step 12: User Training and Go-Live

Conduct hands-on training sessions with agency staff covering dashboard navigation, remarketing workflow management, and threshold adjustment procedures. Training should be role-specific: producers learn their book review workflow, CSRs learn remarketing pipeline management, and the agency principal learns executive reporting and threshold configuration. Deploy the solution to production and begin the 30-day hypercare support period.

  • Session 1: Producers & Account Managers (3 hours)
      • Power BI dashboard navigation (45 min): Executive summary interpretation, Renewal pipeline filtering and sorting, Benchmark comparison scatter plot reading, Mobile app access for field use
      • Remarketing pipeline in InsuredMine (60 min): Reviewing flagged accounts, Moving deals through pipeline stages, Recording quotes and outcomes
      • Break (15 min)
      • Hands-on practice with real accounts (60 min): Each producer reviews their next 5 renewals, Practice remarketing workflow from flag to completion
  • Session 2: Operations & Management (2 hours)
      • Executive dashboard deep dive (30 min)
      • Remarketing threshold adjustment procedure (30 min): How to modify thresholds per LOB in the config table, When to adjust thresholds (quarterly review)
      • Reporting and export capabilities (30 min)
      • Compliance and data security overview (15 min)
      • Q&A and feedback (15 min)
  1. Dashboard_User_Guide.pdf — screenshots and navigation instructions
  2. Remarketing_Workflow_SOP.pdf — step-by-step pipeline management
  3. Threshold_Configuration_Guide.pdf — how to adjust remarketing triggers
  4. Data_Dictionary.pdf — explanation of all fields and calculated metrics
  5. Troubleshooting_FAQ.pdf — common issues and resolutions
  6. Information_Security_Policy.pdf — GLBA/NAIC compliance documentation
  7. Emergency_Contact_Card.pdf — MSP support contacts and escalation paths
Note

Schedule training for a Tuesday or Wednesday (avoid Monday/Friday). Record the training sessions via Teams/Zoom so new hires can reference them later. The 30-day hypercare period should include daily check-ins during week 1, then twice-weekly during weeks 2-4. During hypercare, monitor the pipeline daily and be prepared to adjust thresholds — the initial settings will almost certainly need tuning based on the agency's specific book characteristics. Common adjustment: lowering the personal auto threshold from 15% to 12% because personal auto rates are increasing broadly and the agency doesn't want to remarket every renewal.
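The threshold logic being tuned during hypercare is small enough to sanity-check outside SQL before touching the config table. A sketch with illustrative per-LOB values (the authoritative thresholds live in the agency's config table):

```python
# Per-LOB remarketing thresholds in percent increase (illustrative values;
# e.g. personal auto lowered to 12% as described in the hypercare note).
THRESHOLDS = {'PERSONAL_AUTO': 12.0, 'HOMEOWNERS': 15.0, 'DEFAULT': 15.0}

def should_flag(lob, change_pct, thresholds=THRESHOLDS):
    """Flag a renewal when its premium increase meets the LOB threshold."""
    limit = thresholds.get(lob, thresholds['DEFAULT'])
    return change_pct >= limit
```

Replaying the last quarter's renewals through candidate thresholds this way shows how many accounts a change would flag before it goes live.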

Custom AI Components

Premium Benchmarking Scoring Engine

Type: skill

A scoring algorithm that evaluates each policy on a 0-100 remarketing urgency scale by combining multiple signals: premium change percentage, deviation from market benchmark, account tenure, claims history, total account premium, and days until renewal. This score prioritizes the remarketing pipeline so producers focus on the highest-value, highest-urgency accounts first.

Implementation:

Stored Procedure: dbo.usp_CalculateRemarketingScore — Calculates a 0-100 remarketing urgency score for each policy. Higher score = more urgent remarketing candidate. Scoring weights: Premium Increase (30%), Market Benchmark Deviation (25%), Account Value (20%), Renewal Urgency (15%), Claims Favorability (10%). These can be tuned per agency. Run this procedure after `usp_FlagAccountsForRemarketing` in the daily pipeline.
sql
-- Stored Procedure: dbo.usp_CalculateRemarketingScore
-- Calculates a 0-100 remarketing urgency score for each policy
-- Higher score = more urgent remarketing candidate

CREATE PROCEDURE dbo.usp_CalculateRemarketingScore
AS
BEGIN
    SET NOCOUNT ON;

    -- Add RemarketingScore column if not exists
    IF NOT EXISTS (SELECT 1 FROM sys.columns WHERE object_id = OBJECT_ID('dbo.Policies') AND name = 'RemarketingScore')
        ALTER TABLE dbo.Policies ADD RemarketingScore DECIMAL(5,2) DEFAULT 0;

    UPDATE p SET RemarketingScore = (
        -- Component 1: Premium Increase Score (0-30 points)
        -- 0% increase = 0 points, 30%+ increase = 30 points, linear interpolation
        CASE 
            WHEN p.PremiumChangePercent >= 30 THEN 30.0
            WHEN p.PremiumChangePercent > 0 THEN p.PremiumChangePercent * 1.0
            ELSE 0.0
        END
        +
        -- Component 2: Market Benchmark Deviation Score (0-25 points)
        -- At or below median = 0 points, scaled linearly up to 2x median (or above) = 25 points
        CASE
            WHEN b.MedianPremium IS NULL THEN 0.0  -- no benchmark data
            WHEN p.AnnualPremium <= b.MedianPremium THEN 0.0
            WHEN p.AnnualPremium >= b.MedianPremium * 2 THEN 25.0
            ELSE ROUND(((p.AnnualPremium - b.MedianPremium) / NULLIF(b.MedianPremium, 0)) * 25.0, 2)
        END
        +
        -- Component 3: Account Value Score (0-20 points)
        -- Higher total account premium = higher priority for retention
        CASE
            WHEN ISNULL(p.TotalAccountPremium, p.AnnualPremium) >= 25000 THEN 20.0
            WHEN ISNULL(p.TotalAccountPremium, p.AnnualPremium) >= 10000 THEN 15.0
            WHEN ISNULL(p.TotalAccountPremium, p.AnnualPremium) >= 5000 THEN 10.0
            WHEN ISNULL(p.TotalAccountPremium, p.AnnualPremium) >= 2000 THEN 5.0
            ELSE 2.0
        END
        +
        -- Component 4: Renewal Urgency Score (0-15 points)
        -- Closer to renewal = more urgent
        CASE
            WHEN DATEDIFF(DAY, GETDATE(), p.RenewalDate) <= 14 THEN 15.0
            WHEN DATEDIFF(DAY, GETDATE(), p.RenewalDate) <= 30 THEN 12.0
            WHEN DATEDIFF(DAY, GETDATE(), p.RenewalDate) <= 60 THEN 8.0
            WHEN DATEDIFF(DAY, GETDATE(), p.RenewalDate) <= 90 THEN 4.0
            ELSE 1.0
        END
        +
        -- Component 5: Claims Favorability Score (0-10 points)
        -- Fewer claims = easier to remarket, higher score
        CASE
            WHEN ISNULL(p.ClaimsCount, 0) = 0 THEN 10.0
            WHEN p.ClaimsCount = 1 THEN 6.0
            WHEN p.ClaimsCount = 2 THEN 3.0
            ELSE 0.0  -- 3+ claims, very hard to remarket
        END
    ),
    ModifiedDate = GETUTCDATE()
    FROM dbo.Policies p
    LEFT JOIN dbo.MarketBenchmarks b ON 
        p.LineOfBusiness = b.LineOfBusiness 
        AND p.State = b.State
        AND b.BenchmarkPeriod = (
            SELECT MAX(BenchmarkPeriod) FROM dbo.MarketBenchmarks 
            WHERE LineOfBusiness = p.LineOfBusiness AND State = p.State
        )
    WHERE p.RemarketingFlag = 1;

    -- Output top remarketing candidates
    SELECT TOP 50
        p.ClientName, p.AMSPolicyNumber, p.CarrierName,
        p.LineOfBusiness, p.State,
        p.AnnualPremium, p.PriorTermPremium, p.PremiumChangePercent,
        p.RemarketingScore,
        p.RenewalDate,
        DATEDIFF(DAY, GETDATE(), p.RenewalDate) as DaysToRenewal,
        p.ClaimsCount,
        p.TotalAccountPremium,
        b.MedianPremium as MarketMedian,
        b.P75Premium as MarketP75
    FROM dbo.Policies p
    LEFT JOIN dbo.MarketBenchmarks b ON 
        p.LineOfBusiness = b.LineOfBusiness AND p.State = b.State
        AND b.BenchmarkPeriod = (SELECT MAX(BenchmarkPeriod) FROM dbo.MarketBenchmarks WHERE LineOfBusiness = p.LineOfBusiness AND State = p.State)
    WHERE p.RemarketingFlag = 1
    ORDER BY p.RemarketingScore DESC;
END;
GO
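Before changing the weights in the stored procedure, it can help to mirror the five components in a pure function and unit-test candidate values against known policies. A sketch (the SQL above remains the source of truth; this is a test harness, not deployed code):

```python
def remarketing_score(change_pct, premium, market_median,
                      account_premium, days_to_renewal, claims):
    """0-100 urgency score mirroring dbo.usp_CalculateRemarketingScore."""
    # 1. Premium increase: 1 point per percent, capped at 30
    score = min(max(change_pct, 0.0), 30.0)
    # 2. Benchmark deviation: 0 at/below median, 25 at 2x median or above
    if market_median:
        deviation = (premium - market_median) / market_median
        score += min(max(deviation, 0.0), 1.0) * 25.0
    # 3. Account value tiers (0-20 points)
    ap = account_premium or premium
    for floor, pts in [(25000, 20.0), (10000, 15.0), (5000, 10.0), (2000, 5.0)]:
        if ap >= floor:
            score += pts
            break
    else:
        score += 2.0
    # 4. Renewal urgency (0-15 points)
    for limit, pts in [(14, 15.0), (30, 12.0), (60, 8.0), (90, 4.0)]:
        if days_to_renewal <= limit:
            score += pts
            break
    else:
        score += 1.0
    # 5. Claims favorability (0-10 points)
    score += {0: 10.0, 1: 6.0, 2: 3.0}.get(claims, 0.0)
    return round(score, 2)
```

A maximally urgent account (30%+ increase, double the market median, $25k+ book, renewing inside two weeks, claims-free) scores 100; any reweighting should preserve that anchor.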

Renewal Intelligence Agent

Type: agent

An automated agent that runs daily to analyze the upcoming renewal pipeline, generate natural-language summaries of key remarketing opportunities, and send morning briefing emails to each producer with their top 5 remarketing-priority accounts for the week. Uses Azure OpenAI GPT-5.4 to generate personalized, contextual summaries.

Implementation:

renewal_intelligence_agent.py
python
# renewal_intelligence_agent.py
# Azure Function - Timer Trigger: runs daily at 7:30 AM weekdays

import azure.functions as func
import pyodbc
import os
import json
from openai import AzureOpenAI
from datetime import datetime
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def main(timer: func.TimerRequest) -> None:
    sql_conn = pyodbc.connect(os.environ['SQL_CONNECTION_STRING'])
    
    client = AzureOpenAI(
        api_key=os.environ['AZURE_OPENAI_API_KEY'],
        api_version='2024-06-01',
        azure_endpoint=os.environ['AZURE_OPENAI_ENDPOINT']
    )
    
    # Get unique producers with flagged accounts
    cursor = sql_conn.cursor()
    cursor.execute("""
        SELECT DISTINCT p.ProducerName, pc.ProducerEmail
        FROM dbo.Policies p
        INNER JOIN dbo.ProducerContacts pc ON p.ProducerCode = pc.ProducerCode
        WHERE p.RemarketingFlag = 1 
        AND p.RemarketingStatus = 'FLAGGED'
        AND p.RenewalDate BETWEEN GETDATE() AND DATEADD(DAY, 90, GETDATE())
    """)
    producers = cursor.fetchall()
    
    for producer in producers:
        producer_name = producer.ProducerName
        producer_email = producer.ProducerEmail
        
        # Get top 5 remarketing priorities for this producer
        cursor.execute("""
            SELECT TOP 5
                p.ClientName, p.AMSPolicyNumber, p.CarrierName,
                p.LineOfBusiness, p.AnnualPremium, p.PriorTermPremium,
                p.PremiumChangePercent, p.RemarketingScore,
                p.RenewalDate, p.TotalAccountPremium,
                p.ClaimsCount, p.AccountTenureYears, p.State,
                b.MedianPremium, b.P75Premium,
                p.RemarketingReason
            FROM dbo.Policies p
            LEFT JOIN dbo.MarketBenchmarks b ON 
                p.LineOfBusiness = b.LineOfBusiness AND p.State = b.State
                AND b.BenchmarkPeriod = (SELECT MAX(BenchmarkPeriod) FROM dbo.MarketBenchmarks WHERE LineOfBusiness = p.LineOfBusiness AND State = p.State)
            WHERE p.ProducerName = ?
            AND p.RemarketingFlag = 1
            AND p.RemarketingStatus = 'FLAGGED'
            ORDER BY p.RemarketingScore DESC
        """, (producer_name,))
        
        accounts = cursor.fetchall()
        if not accounts:
            continue
        
        # Build context for GPT
        account_summaries = []
        for a in accounts:
            summary = {
                'client': a.ClientName,
                'policy': a.AMSPolicyNumber,
                'carrier': a.CarrierName,
                'lob': a.LineOfBusiness,
                'current_premium': float(a.AnnualPremium),
                'prior_premium': float(a.PriorTermPremium) if a.PriorTermPremium else None,
                'change_pct': float(a.PremiumChangePercent),
                'score': float(a.RemarketingScore),
                'renewal_date': a.RenewalDate.strftime('%B %d, %Y'),
                'total_account': float(a.TotalAccountPremium) if a.TotalAccountPremium else None,
                'claims': int(a.ClaimsCount) if a.ClaimsCount else 0,
                'tenure_years': int(a.AccountTenureYears) if a.AccountTenureYears else None,
                'market_median': float(a.MedianPremium) if a.MedianPremium else None,
                'market_p75': float(a.P75Premium) if a.P75Premium else None,
                'reason': a.RemarketingReason
            }
            account_summaries.append(summary)
        
        # Generate AI briefing
        prompt = f"""You are an insurance agency operations assistant. Generate a morning briefing email 
for producer {producer_name} summarizing their top remarketing priority accounts.

For each account, provide:
1. A 2-sentence summary of why this account needs attention
2. A specific recommended action (e.g., 'Submit to your top three appointed carriers for competitive quotes')
3. Any risk factors to be aware of (claims, tenure, etc.)

Be concise, professional, and action-oriented. Use dollar amounts and percentages.
Do not fabricate carrier names for recommendations — use general advice like 'submit to your top 3 appointed carriers for this line'.

Accounts to summarize:
{json.dumps(account_summaries, indent=2, default=str)}

Format as an HTML email body with a professional layout. Use a table for the account summary 
and bullet points for action items. Include a header 'Your Remarketing Priority Briefing - {datetime.now().strftime("%A, %B %d, %Y")}'.
"""
        
        response = client.chat.completions.create(
            model='gpt-5.4',
            messages=[
                {'role': 'system', 'content': 'You are an insurance industry expert assistant that helps producers prioritize their renewal book and identify remarketing opportunities. Be specific, data-driven, and actionable.'},
                {'role': 'user', 'content': prompt}
            ],
            temperature=0.3,
            max_tokens=2000
        )
        
        email_body = response.choices[0].message.content
        
        # Send email via SMTP (or Microsoft Graph API)
        send_briefing_email(producer_email, producer_name, email_body)
    
    sql_conn.close()


def send_briefing_email(to_email, producer_name, html_body):
    smtp_server = os.environ.get('SMTP_SERVER', 'smtp.office365.com')
    smtp_port = int(os.environ.get('SMTP_PORT', 587))
    smtp_user = os.environ['SMTP_USER']
    smtp_pass = os.environ['SMTP_PASS']
    
    msg = MIMEMultipart('alternative')
    msg['Subject'] = f'Remarketing Priority Briefing - {datetime.now().strftime("%A, %B %d")}'
    msg['From'] = smtp_user
    msg['To'] = to_email
    
    msg.attach(MIMEText(html_body, 'html'))
    
    with smtplib.SMTP(smtp_server, smtp_port) as server:
        server.starttls()
        server.login(smtp_user, smtp_pass)
        server.sendmail(smtp_user, to_email, msg.as_string())
Note

Requires: Azure OpenAI Service (GPT-5.4 deployment, ~$0.005/briefing), Azure Function App (Python 3.11+), SMTP credentials or Microsoft Graph API access for email delivery. Add a ProducerContacts table to the database schema mapping ProducerCode to email addresses. Deploy as an Azure Function with timer trigger at '0 30 7 * * 1-5' (7:30 AM weekdays only).

Premium Trend Anomaly Detector

Type: workflow

An automated workflow that runs weekly to detect unusual premium patterns across the agency's book of business — such as a carrier systematically raising rates in a specific geography, an emerging hard market in a particular line of business, or accounts that have had consecutive annual increases exceeding market averages. Generates strategic intelligence reports for the agency principal.

Implementation:

Weekly anomaly detection stored procedure — detects carrier rate spikes, geographic rate concentration, consecutive annual increases, and benchmark outliers across the book of business.
sql
-- Weekly anomaly detection stored procedure
CREATE PROCEDURE dbo.usp_DetectPremiumAnomalies
AS
BEGIN
    SET NOCOUNT ON;

    -- Create temp table for results
    CREATE TABLE #Anomalies (
        AnomalyType NVARCHAR(50),
        Severity NVARCHAR(10), -- 'HIGH', 'MEDIUM', 'LOW'
        Description NVARCHAR(MAX),
        AffectedPolicies INT,
        AffectedPremium DECIMAL(14,2),
        RecommendedAction NVARCHAR(MAX)
    );

    -- Anomaly 1: Carrier with above-average rate increases
    INSERT INTO #Anomalies
    SELECT 
        'CARRIER_RATE_SPIKE' as AnomalyType,
        CASE 
            WHEN AVG(PremiumChangePercent) > 25 THEN 'HIGH'
            WHEN AVG(PremiumChangePercent) > 15 THEN 'MEDIUM'
            ELSE 'LOW'
        END as Severity,
        CONCAT(CarrierName, ' is averaging ', ROUND(AVG(PremiumChangePercent), 1), 
               '% premium increases across ', COUNT(*), ' policies in ', LineOfBusiness,
               '. Market average is ', 
               ISNULL(CAST((SELECT TOP 1 ROUND(AVG(AvgPremiumChangePercent), 1) FROM dbo.MarketBenchmarks WHERE LineOfBusiness = p.LineOfBusiness) AS NVARCHAR(20)), 'N/A'),
               '%.') as Description,
        COUNT(*) as AffectedPolicies,
        SUM(AnnualPremium) as AffectedPremium,
        CONCAT('Consider reducing exposure to ', CarrierName, ' for ', LineOfBusiness, 
               '. Evaluate appointing alternative carriers in this space.') as RecommendedAction
    FROM dbo.Policies p
    WHERE PremiumChangePercent > 10
    AND RenewalDate BETWEEN GETDATE() AND DATEADD(MONTH, 6, GETDATE())
    GROUP BY CarrierName, LineOfBusiness
    HAVING COUNT(*) >= 5 AND AVG(PremiumChangePercent) > 15;

    -- Anomaly 2: Geographic rate concentration
    INSERT INTO #Anomalies
    SELECT
        'GEO_RATE_CONCENTRATION' as AnomalyType,
        CASE 
            WHEN AVG(PremiumChangePercent) > 20 THEN 'HIGH'
            WHEN AVG(PremiumChangePercent) > 12 THEN 'MEDIUM'
            ELSE 'LOW'
        END,
        CONCAT('Premiums in ', State, ' for ', LineOfBusiness, ' are averaging ',
               ROUND(AVG(PremiumChangePercent), 1), '% increases across ', COUNT(*), ' policies.',
               ' This may indicate a hardening market in this geography.'),
        COUNT(*),
        SUM(AnnualPremium),
        CONCAT('Proactively remarket ', LineOfBusiness, ' policies in ', State,
               ' before renewal. Consider surplus lines carriers for difficult placements.')
    FROM dbo.Policies
    WHERE PremiumChangePercent > 8
    AND RenewalDate BETWEEN GETDATE() AND DATEADD(MONTH, 6, GETDATE())
    GROUP BY State, LineOfBusiness
    HAVING COUNT(*) >= 3 AND AVG(PremiumChangePercent) > 12;

    -- Anomaly 3: Consecutive annual increases (multi-year pain)
    INSERT INTO #Anomalies
    SELECT
        'CONSECUTIVE_INCREASES' as AnomalyType,
        'HIGH',
        CONCAT(p.ClientName, ' (', p.AMSPolicyNumber, ') has had premium increases for ',
               consecutive_years.YearsOfIncreases, ' consecutive years. Current premium $',
               FORMAT(p.AnnualPremium, 'N2'), ', cumulative increase of ',
               ROUND(consecutive_years.CumulativeIncrease, 1), '%.'),
        1,
        p.AnnualPremium,
        CONCAT('High retention risk. Schedule proactive outreach to ', p.ClientName,
               ' with competitive alternatives before the next renewal on ', FORMAT(p.RenewalDate, 'MMM dd, yyyy'), '.')
    FROM dbo.Policies p
    CROSS APPLY (
        SELECT 
            COUNT(*) as YearsOfIncreases,
            ROUND((EXP(SUM(LOG(1 + ph.PremiumChangePct/100.0))) - 1) * 100, 1) as CumulativeIncrease
        FROM (
            SELECT 
                ROW_NUMBER() OVER (ORDER BY TermEffectiveDate DESC) as rn,
                (TermPremium - LAG(TermPremium) OVER (ORDER BY TermEffectiveDate)) / 
                    NULLIF(LAG(TermPremium) OVER (ORDER BY TermEffectiveDate), 0) * 100 as PremiumChangePct
            FROM dbo.PremiumHistory
            WHERE PolicyID = p.PolicyID
        ) ph
        WHERE ph.PremiumChangePct > 0 AND ph.rn <= 3
        HAVING COUNT(*) >= 2
    ) consecutive_years
    WHERE p.RenewalDate BETWEEN GETDATE() AND DATEADD(DAY, 120, GETDATE());

    -- Anomaly 4: Accounts significantly above market benchmark
    INSERT INTO #Anomalies
    SELECT
        'BENCHMARK_OUTLIER' as AnomalyType,
        CASE 
            WHEN p.AnnualPremium > b.P75Premium * 1.5 THEN 'HIGH'
            ELSE 'MEDIUM'
        END,
        CONCAT(p.ClientName, '''s ', p.LineOfBusiness, ' premium of $', FORMAT(p.AnnualPremium, 'N2'),
               ' is ', ROUND(((p.AnnualPremium - b.MedianPremium) / NULLIF(b.MedianPremium, 0)) * 100, 0),
               '% above the market median of $', FORMAT(b.MedianPremium, 'N2'),
               ' and exceeds the 75th percentile ($', FORMAT(b.P75Premium, 'N2'), ').'),
        1,
        p.AnnualPremium,
        'Strong remarketing candidate. Current carrier may not be competitive for this risk profile. Submit to top 3 alternative carriers.'
    FROM dbo.Policies p
    INNER JOIN dbo.MarketBenchmarks b ON p.LineOfBusiness = b.LineOfBusiness AND p.State = b.State
        AND b.BenchmarkPeriod = (SELECT MAX(BenchmarkPeriod) FROM dbo.MarketBenchmarks WHERE LineOfBusiness = p.LineOfBusiness AND State = p.State)
    WHERE p.AnnualPremium > b.P75Premium * 1.25
    AND p.RenewalDate BETWEEN GETDATE() AND DATEADD(DAY, 120, GETDATE());

    -- Output anomalies
    SELECT * FROM #Anomalies ORDER BY 
        CASE Severity WHEN 'HIGH' THEN 1 WHEN 'MEDIUM' THEN 2 ELSE 3 END,
        AffectedPremium DESC;

    DROP TABLE #Anomalies;
END;
GO
  • Schedule this procedure to run weekly (Sunday night) via Azure Data Factory.
  • Output can be captured and emailed to the agency principal as a formatted HTML report using the same Azure Function email pattern as the Renewal Intelligence Agent.
  • Add a Power BI report page called 'Strategic Insights' that visualizes the anomaly data.
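The EXP(SUM(LOG(1 + pct/100))) expression in Anomaly 3 is the standard trick for compounding percentage changes inside a SQL aggregate. The same arithmetic in Python, for verification:

```python
import math

def cumulative_increase(yearly_pcts):
    """Compound a series of annual % increases into one cumulative %.
    Matches the SQL (EXP(SUM(LOG(1 + pct/100))) - 1) * 100 expression."""
    factor = math.exp(sum(math.log(1 + p / 100.0) for p in yearly_pcts))
    return round((factor - 1) * 100, 1)
```

Two consecutive 10% increases compound to 21%, not 20%, which is why the report multiplies factors rather than summing percentages.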

Client Remarketing Communication Prompt

Type: prompt

A structured prompt template for generating personalized client communication when a policy is flagged for remarketing. Used by producers within the InsuredMine CRM to quickly draft professional, compliant outreach emails that explain the remarketing process to clients without alarming them or implying the current carrier is inadequate.

Implementation:

System Prompt

You are a professional insurance agency communication assistant. You help insurance producers write client-facing emails about policy renewals and remarketing.

Your tone is:
  • Warm and reassuring (not alarming)
  • Professional and knowledgeable
  • Proactive and service-oriented
  • Compliant with insurance regulations (never guarantee savings, never disparage the current carrier)

IMPORTANT RULES:
  1. Never state that the current carrier is bad or overcharging
  2. Never guarantee savings or lower premiums
  3. Always frame remarketing as a proactive service the agency provides
  4. Always mention that coverage adequacy is the priority, not just price
  5. Never share specific competitor quotes in the initial outreach
  6. Always include a call-to-action for a conversation

User Prompt Template

Write a personalized email from [PRODUCER_NAME] at [AGENCY_NAME] to [CLIENT_FIRST_NAME] [CLIENT_LAST_NAME] regarding their [LINE_OF_BUSINESS] policy with [CARRIER_NAME].

Context:
  • Policy Number: [POLICY_NUMBER]
  • Current Annual Premium: $[CURRENT_PREMIUM]
  • Renewal Premium: $[RENEWAL_PREMIUM]
  • Premium Change: [CHANGE_PERCENT]%
  • Renewal Date: [RENEWAL_DATE]
  • Client has been with the agency for [TENURE] years
  • Claims in last 3 years: [CLAIMS_COUNT]
  • Total account premium (all policies): $[TOTAL_ACCOUNT_PREMIUM]

Scenario: [Select one]
  A) Initial notification - informing client we are reviewing their renewal
  B) Options ready - we have competitive alternatives to present
  C) Retention - current carrier is still the best option despite increase
  D) Remarketing complete - client is moving to a new carrier

Generate the email with:
  • Subject line
  • Greeting
  • Body (3-4 paragraphs max)
  • Call to action
  • Professional signature block

Example Output (Scenario A)

Subject: Your [Homeowners] Insurance Renewal - We're Working For You

Dear [Sarah],

I hope this message finds you well. As your renewal date of [March 15, 2025] approaches for your [homeowners] policy, I wanted to reach out personally to let you know that we're actively reviewing your coverage and options.

As part of our commitment to providing you with the best possible service, we proactively review every renewal in our book. I've noticed some movement in the [homeowners] insurance market in [Florida], and I want to make sure your coverage remains both comprehensive and competitively priced.

Over the next few weeks, I'll be evaluating your current coverage alongside options from our other carrier partners. My priority is ensuring you have the right protection for your home and family — and that you're getting strong value for your premium dollar.

I'll be in touch soon with my recommendations. In the meantime, if you've made any changes to your home, added valuable items, or have any questions about your coverage, please don't hesitate to reach out. I'm always happy to help.

Warm regards,
[Producer Name]
[Agency Name]
[Phone] | [Email]

This prompt template should be saved in InsuredMine as a reusable email template with merge fields, or provided to producers as a reference document. If the agency uses Azure OpenAI, this prompt can be integrated into a Teams bot or custom web form that auto-populates client data from the database and generates draft emails for producer review before sending.
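If the agency opts for the custom web-form route, the bracket-token substitution is straightforward. A sketch (the [FIELD] tokens are the ones defined in the template above; unknown tokens are left intact so a producer notices missing data before sending):

```python
import re

def fill_template(template, values):
    """Replace [FIELD] tokens with client data; leave unmatched
    tokens in place as a visible signal of missing merge data."""
    return re.sub(
        r'\[([A-Z_]+)\]',
        lambda m: str(values.get(m.group(1), m.group(0))),
        template,
    )
```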

AMS Data Normalizer Integration

Type: integration

A configurable data normalization layer that maps field names, codes, and formats from different AMS platforms (Applied Epic, AMS360, EZLynx, HawkSoft) into the common premium data warehouse schema. This integration component ensures the analytics engine works identically regardless of which AMS the agency uses.

Implementation:

ams_normalizer.py
python
# ams_normalizer.py
# Configurable AMS data normalization module

import json
from datetime import datetime
from typing import Dict, List, Optional

# AMS Field Mapping Configuration
AMS_FIELD_MAPPINGS = {
    'EPIC': {
        'policy_number': 'policyNumber',
        'client_name': 'insuredName',
        'carrier_code': 'carrierCode',
        'carrier_name': 'carrierName',
        'line_of_business': 'lineOfBusiness',
        'coverage_type': 'coverageDescription',
        'effective_date': 'effectiveDate',
        'expiration_date': 'expirationDate',
        'annual_premium': 'annualPremium',
        'prior_premium': 'priorTermPremium',
        'state': 'ratingState',
        'zip_code': 'zipCode',
        'producer_code': 'producerCode',
        'producer_name': 'producerName',
        'industry_code': 'naicsCode',
        'claims_count': 'claimCount',
        'claims_total': 'totalIncurred',
        'account_premium': 'totalAccountPremium',
    },
    'AMS360': {
        'policy_number': 'PolicyNumber',
        'client_name': 'InsuredName',
        'carrier_code': 'CompanyCode',
        'carrier_name': 'CompanyName',
        'line_of_business': 'PolicyType',
        'coverage_type': 'CoverageCode',
        'effective_date': 'EffDate',
        'expiration_date': 'ExpDate',
        'annual_premium': 'WrittenPremium',
        'prior_premium': 'PriorPremium',
        'state': 'StateCode',
        'zip_code': 'Zip',
        'producer_code': 'ExecCode',
        'producer_name': 'ExecName',
        'industry_code': 'SICCode',
        'claims_count': 'NumberOfClaims',
        'claims_total': 'TotalPaid',
        'account_premium': 'AccountPremium',
    },
    'EZLYNX': {
        'policy_number': 'policy_num',
        'client_name': 'named_insured',
        'carrier_code': 'carrier_id',
        'carrier_name': 'carrier_name',
        'line_of_business': 'line_of_business',
        'coverage_type': 'coverage_desc',
        'effective_date': 'eff_date',
        'expiration_date': 'exp_date',
        'annual_premium': 'total_premium',
        'prior_premium': 'prior_term_premium',
        'state': 'state',
        'zip_code': 'zip',
        'producer_code': 'agent_code',
        'producer_name': 'agent_name',
        'industry_code': None,
        'claims_count': 'num_claims',
        'claims_total': 'total_claims_paid',
        'account_premium': 'household_premium',
    },
    'HAWKSOFT': {
        'policy_number': 'PolicyNum',
        'client_name': 'ClientName',
        'carrier_code': 'CoCode',
        'carrier_name': 'CompanyName',
        'line_of_business': 'LOB',
        'coverage_type': 'Type',
        'effective_date': 'EffectiveDate',
        'expiration_date': 'ExpirationDate',
        'annual_premium': 'Premium',
        'prior_premium': 'PriorPremium',
        'state': 'State',
        'zip_code': 'ZipCode',
        'producer_code': 'AgentCode',
        'producer_name': 'AgentName',
        'industry_code': None,
        'claims_count': None,
        'claims_total': None,
        'account_premium': None,
    }
}

# Line of Business standardization mapping
LOB_STANDARDIZATION = {
    # Applied Epic
    'Personal Auto': 'PERSONAL_AUTO',
    'Personal Automobile': 'PERSONAL_AUTO',
    'Homeowner': 'HOMEOWNERS',
    'Homeowners': 'HOMEOWNERS',
    'Home': 'HOMEOWNERS',
    'Dwelling Fire': 'HOMEOWNERS',
    'Commercial Property': 'COMMERCIAL_PROPERTY',
    'Commercial Auto': 'COMMERCIAL_AUTO',
    'Commercial Automobile': 'COMMERCIAL_AUTO',
    'General Liability': 'COMMERCIAL_GL',
    'CGL': 'COMMERCIAL_GL',
    'Workers Compensation': 'WORKERS_COMP',
    'Workers Comp': 'WORKERS_COMP',
    'WC': 'WORKERS_COMP',
    'BOP': 'BOP',
    'Businessowners': 'BOP',
    'Business Owners': 'BOP',
    'Umbrella': 'UMBRELLA',
    'Excess Liability': 'UMBRELLA',
    'Commercial Umbrella': 'UMBRELLA',
    'Inland Marine': 'INLAND_MARINE',
    'Professional Liability': 'PROFESSIONAL_LIABILITY',
    'E&O': 'PROFESSIONAL_LIABILITY',
    'D&O': 'DIRECTORS_OFFICERS',
    'Cyber': 'CYBER_LIABILITY',
    'EPLI': 'EPLI',
    # AMS360 codes
    'PA': 'PERSONAL_AUTO',
    'HO': 'HOMEOWNERS',
    'CP': 'COMMERCIAL_PROPERTY',
    'CA': 'COMMERCIAL_AUTO',
    'GL': 'COMMERCIAL_GL',
    'WK': 'WORKERS_COMP',
    'BP': 'BOP',
    'UM': 'UMBRELLA',
    'IM': 'INLAND_MARINE',
}


class AMSNormalizer:
    def __init__(self, ams_type: str):
        if ams_type not in AMS_FIELD_MAPPINGS:
            raise ValueError(f'Unsupported AMS: {ams_type}. Supported: {list(AMS_FIELD_MAPPINGS.keys())}')
        self.ams_type = ams_type
        self.mapping = AMS_FIELD_MAPPINGS[ams_type]
    
    def normalize_policy(self, raw_data: Dict) -> Dict:
        """Transform a raw AMS policy record into the normalized warehouse schema."""
        normalized = {
            'AMSPolicyNumber': self._get_field(raw_data, 'policy_number'),
            'AMSSource': self.ams_type,
            'ClientName': self._get_field(raw_data, 'client_name', ''),
            'CarrierCode': self._get_field(raw_data, 'carrier_code', ''),
            'CarrierName': self._get_field(raw_data, 'carrier_name', ''),
            'LineOfBusiness': self._standardize_lob(self._get_field(raw_data, 'line_of_business', '')),
            'CoverageType': self._get_field(raw_data, 'coverage_type', ''),
            'PolicyEffectiveDate': self._parse_date(self._get_field(raw_data, 'effective_date')),
            'PolicyExpirationDate': self._parse_date(self._get_field(raw_data, 'expiration_date')),
            'AnnualPremium': self._parse_decimal(self._get_field(raw_data, 'annual_premium', 0)),
            'PriorTermPremium': self._parse_decimal(self._get_field(raw_data, 'prior_premium')),
            'State': self._standardize_state(self._get_field(raw_data, 'state', '')),
            'ZipCode': str(self._get_field(raw_data, 'zip_code', '')).strip()[:10],
            'ProducerCode': self._get_field(raw_data, 'producer_code', ''),
            'ProducerName': self._get_field(raw_data, 'producer_name', ''),
            'IndustryCode': self._get_field(raw_data, 'industry_code'),
            'ClaimsCount': self._parse_int(self._get_field(raw_data, 'claims_count', 0)),
            'ClaimsAmountTotal': self._parse_decimal(self._get_field(raw_data, 'claims_total', 0)),
            'TotalAccountPremium': self._parse_decimal(self._get_field(raw_data, 'account_premium')),
        }
        
        # Calculate renewal date (= expiration date for renewal policies)
        normalized['RenewalDate'] = normalized['PolicyExpirationDate']
        
        # Validate required fields
        if not normalized['AMSPolicyNumber']:
            raise ValueError('Policy number is required')
        if normalized['AnnualPremium'] is None or normalized['AnnualPremium'] < 0:
            raise ValueError(f'Invalid premium for policy {normalized["AMSPolicyNumber"]}')
        
        return normalized
    
    def normalize_batch(self, raw_records: List[Dict]) -> List[Dict]:
        """Normalize a batch of raw AMS records, skipping invalid ones."""
        results = []
        errors = []
        for record in raw_records:
            try:
                normalized = self.normalize_policy(record)
                if normalized['AnnualPremium'] > 0:  # Skip $0 premium policies
                    results.append(normalized)
            except (ValueError, KeyError) as e:
                errors.append({'record': record, 'error': str(e)})
        
        if errors:
            print(f'WARNING: {len(errors)} records failed normalization out of {len(raw_records)}')
            for err in errors[:5]:
                print(f'  - {err["error"]}')
        
        return results
    
    def _get_field(self, data: Dict, standard_field: str, default=None):
        ams_field = self.mapping.get(standard_field)
        if ams_field is None:
            return default
        return data.get(ams_field, default)
    
    def _standardize_lob(self, lob_value: str) -> str:
        if not lob_value:
            return 'OTHER'
        cleaned = lob_value.strip()
        return LOB_STANDARDIZATION.get(cleaned, cleaned.upper().replace(' ', '_'))
    
    def _standardize_state(self, state: str) -> str:
        if not state:
            return ''
        return state.strip().upper()[:2]
    
    def _parse_date(self, value) -> Optional[str]:
        if value is None:
            return None
        if isinstance(value, datetime):
            return value.strftime('%Y-%m-%d')
        if isinstance(value, str):
            for fmt in ('%Y-%m-%d', '%m/%d/%Y', '%Y-%m-%dT%H:%M:%S', '%Y-%m-%dT%H:%M:%SZ'):
                try:
                    return datetime.strptime(value.strip(), fmt).strftime('%Y-%m-%d')
                except ValueError:
                    continue
        # Last resort: some sources append a time component after a space; try
        # the first 10 characters as an ISO date, otherwise return None rather
        # than a truncated raw string
        try:
            return datetime.strptime(str(value)[:10], '%Y-%m-%d').strftime('%Y-%m-%d')
        except ValueError:
            return None
    
    def _parse_decimal(self, value) -> Optional[float]:
        if value is None:
            return None
        if isinstance(value, (int, float)):
            return round(float(value), 2)
        if isinstance(value, str):
            cleaned = value.replace('$', '').replace(',', '').strip()
            if cleaned:
                return round(float(cleaned), 2)
        return None
    
    def _parse_int(self, value) -> int:
        if value is None:
            return 0
        return int(float(str(value).strip() or 0))


# Usage example:
# normalizer = AMSNormalizer('EPIC')
# raw_policies = epic_api.get_policies(modified_since='2025-01-01')
# normalized = normalizer.normalize_batch(raw_policies)
# warehouse.upsert(normalized)

This normalizer is the key component that makes the solution AMS-agnostic. To support a new AMS, add an entry to AMS_FIELD_MAPPINGS with that system's field names, and register any new LOB codes in LOB_STANDARDIZATION. The normalizer then handles date format variations, currency string cleaning, and LOB code standardization automatically.
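
To illustrate, here is a sketch of what onboarding a new AMS might look like. The AMS name ('NEWAMS'), field names, and LOB codes below are illustrative assumptions, not actual vendor API fields:

```python
# Stand-ins for the module-level tables defined earlier in this guide.
AMS_FIELD_MAPPINGS = {}
LOB_STANDARDIZATION = {}

# Hypothetical mapping for a new AMS: keys are the normalizer's standard
# fields, values are that system's raw field names (illustrative only).
AMS_FIELD_MAPPINGS['NEWAMS'] = {
    'AMSPolicyNumber': 'PolicyNumber',
    'ClientName': 'InsuredName',
    'CarrierName': 'CompanyName',
    'AnnualPremium': 'AnnualizedPremium',
    'PolicyEffectiveDate': 'EffectiveDate',
    'PolicyExpirationDate': 'ExpirationDate',
    'LineOfBusiness': 'LOBCode',
    'State': 'RiskState',
}

# Register any LOB codes unique to this AMS in the shared vocabulary.
LOB_STANDARDIZATION.update({
    'HOME': 'HOMEOWNERS',
    'CAUTO': 'COMMERCIAL_AUTO',
})

# After that, the existing class works unchanged:
# normalizer = AMSNormalizer('NEWAMS')
```

Because all downstream logic (flagging, scoring, dashboards) consumes only the standard field names, no other code changes are needed.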

Testing & Validation

  • Verify AMS API connectivity by extracting 100 policies and confirming all required fields (policy number, client name, carrier, premium, effective/expiration dates, LOB, state) are populated — check for null values in each field and document completeness percentage; target >95% field completeness
  • Validate premium change calculations by selecting 20 policies with known prior-term premiums from the AMS, running them through the data warehouse, and comparing the calculated PremiumChangePercent against manual calculations — all 20 must match within $0.01 and 0.01%
  • Test the remarketing flagging stored procedure by inserting 10 test policies with known premium increases (5 above threshold, 5 below) and verifying that exactly the correct 5 are flagged — reset and re-run with modified thresholds to confirm threshold changes take effect immediately
  • Validate market benchmark comparison by selecting 10 policies across different LOBs and states, comparing their premiums against the MarketBenchmarks table manually, and confirming the Power BI benchmark scatter plot correctly positions them relative to median and P75 values
  • Verify the remarketing scoring engine produces expected priority ordering by creating 5 test policies with different score profiles (high premium increase + close renewal vs. moderate increase + high account value) and confirming the scores rank them in the expected order
  • Test InsuredMine CRM deal creation by manually flagging 5 policies in the database, running the push_remarketing_to_crm Azure Function, and verifying 5 new deals appear in the InsuredMine 'Premium Remarketing Pipeline' with correct client names, premium amounts, and producer assignments
  • Validate Power BI Row-Level Security by logging in as two different producers and confirming each sees only their own book of business — test with the agency principal account to confirm they see the full unfiltered view
  • Test the daily automated pipeline end-to-end by allowing the scheduled triggers to run for 3 consecutive business days without manual intervention — verify data freshness by checking LastSyncDate values in Azure SQL are within 24 hours of current time
  • Verify email alerts work by setting a low threshold (e.g., 'when Flagged for Remarketing > 0') on the Power BI dashboard and confirming the agency principal receives an email notification within 15 minutes of the threshold being crossed
  • Test the Renewal Intelligence Agent by running it manually and verifying that each producer with flagged accounts receives a briefing email with accurate policy data, reasonable AI-generated summaries, and no hallucinated carrier names or fabricated premium amounts
  • Validate the webhook integration between InsuredMine and Azure SQL by moving a deal through all pipeline stages in InsuredMine and confirming the corresponding Policies.RemarketingStatus and RemarketingActions.Outcome fields update correctly in the database within 5 minutes
  • Perform a compliance audit: verify TLS 1.2 is enforced on all Azure SQL connections, Azure SQL TDE is enabled, firewall rules restrict access to only authorized IPs, Key Vault secrets are not exposed in any logs or pipeline definitions, and audit logging is capturing all data access events
  • Stress test the data pipeline with a simulated large book — insert 10,000 test policies and verify the flagging procedure, scoring engine, and Power BI dashboard all perform within acceptable response times (flagging <30 seconds, dashboard page load <5 seconds)
  • Validate data freshness monitoring by intentionally breaking the AMS API connection (revoke credentials temporarily) and confirming that the Azure Data Factory pipeline failure generates an alert email to the MSP within 30 minutes
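
The premium-change spot check above can be scripted. This is a minimal sketch with hypothetical policy numbers and premium figures, assuming the warehouse computes PremiumChangePercent as ((current - prior) / prior) x 100:

```python
def premium_change_percent(prior: float, current: float) -> float:
    """Reference calculation to compare against the warehouse value."""
    return round((current - prior) / prior * 100, 2)

# Hypothetical spot-check rows: prior/current premiums pulled manually from
# the AMS, warehouse_pct read from the data warehouse for the same policies.
spot_checks = [
    {'policy': 'TEST-001', 'prior': 4000.00, 'current': 4600.00, 'warehouse_pct': 15.00},
    {'policy': 'TEST-002', 'prior': 1250.00, 'current': 1187.50, 'warehouse_pct': -5.00},
]

for row in spot_checks:
    expected = premium_change_percent(row['prior'], row['current'])
    delta = abs(expected - row['warehouse_pct'])
    status = 'OK' if delta <= 0.01 else 'MISMATCH'
    print(f"{row['policy']}: calculated {expected}%, warehouse {row['warehouse_pct']}% -> {status}")
```

Extending this loop to the full 20-policy sample makes the tolerance check repeatable after any pipeline change.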

Client Handoff

1. Solution overview — walk through the architecture diagram showing data flow from AMS to benchmarking dashboards to remarketing workflows
2. Power BI dashboard tour — demonstrate each of the 5 report pages, showing how to filter by producer, LOB, carrier, date range, and how to read the benchmark scatter plot
3. Remarketing workflow demonstration — show a live account moving through the InsuredMine pipeline from 'Flagged' to 'Remarketed'
4. Threshold management — demonstrate how to view and modify remarketing thresholds in the RemarketingThresholds table (provide a simple web form or documented SQL procedure)
5. Morning briefing emails — show a sample producer briefing email and explain the AI-generated content
6. Success metrics definition — agree on KPIs to track: remarketing rate (% of renewals remarketed), savings generated ($), retention rate improvement, and producer adoption rate
7. Documentation handoff — provide printed and digital copies of: Dashboard User Guide, Remarketing Workflow SOP, Threshold Configuration Guide, Data Dictionary, Troubleshooting FAQ, Information Security Policy, and Emergency Contact Card with MSP support phone/email and escalation paths
8. Support expectations — explain the 30-day hypercare period (daily check-ins week 1, twice-weekly weeks 2-4), then transition to standard monthly managed services with quarterly business reviews
9. Login credentials and access verification — ensure all users can log into Power BI, InsuredMine, and Azure portal (for principals only) from their workstations and mobile devices
10. Feedback mechanism — establish a shared Teams channel or email alias for ongoing feedback and feature requests

Maintenance

  • DAILY — Monitor Azure Data Factory pipeline execution status via ADF Monitor dashboard; verify daily policy sync completed successfully; check Azure Function execution logs for the remarketing push and intelligence agent; respond to any pipeline failure alerts within 4 hours SLA.
  • WEEKLY — Review the Premium Trend Anomaly Detector output and forward strategic insights to the agency principal; check Azure SQL DTU utilization and storage consumption; verify Power BI dataset refresh is completing on schedule; review InsuredMine webhook integration logs for any failed status updates.
  • MONTHLY — Generate a managed services report showing: number of policies synced, accounts flagged for remarketing, remarketing outcomes (won/retained/lost), total savings generated, and system uptime; conduct a 30-minute review call with the agency principal to discuss results and any threshold adjustments; apply Azure SQL and Azure Function runtime updates; review and rotate API credentials stored in Key Vault if approaching expiration.
  • QUARTERLY — Refresh market benchmark data from Zywave/Ivans sources; conduct a threshold tuning session with the agency based on 90 days of remarketing data; review Power BI dashboards with the agency and add/modify visualizations based on feedback; perform a compliance review (audit log review, access control verification, information security plan update); update the LOB standardization mappings if the agency has added new lines or carriers; generate a quarterly ROI report showing cumulative savings from remarketing vs. system cost.
  • ANNUALLY — Conduct a full NAIC Model Law #668 risk assessment; review and update the agency's written information security plan; evaluate whether to upgrade Azure SQL tier based on data growth; assess new AI features from Applied Systems, Zywave, or InsuredMine for potential integration; renegotiate software licensing as needed.
  • MODEL RETRAINING — The remarketing scoring weights (premium change 30%, benchmark deviation 25%, account value 20%, renewal urgency 15%, claims favorability 10%) should be reviewed quarterly by analyzing which scored accounts actually resulted in successful remarketing vs. those that didn't — adjust weights based on observed outcomes.
  • ESCALATION PATH — Tier 1 (MSP help desk): dashboard access issues, basic filtering questions, password resets; Tier 2 (MSP analytics team): data quality issues, pipeline failures, threshold adjustments, report modifications; Tier 3 (MSP solutions architect): AMS API integration issues, schema changes, new feature development, vendor escalation to Applied Systems/Zywave/InsuredMine support.
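
The weighted score described under MODEL RETRAINING reduces to a simple weighted sum. A minimal sketch, assuming each factor has already been normalized to a 0-100 scale (the factor names and example values are illustrative):

```python
# Weights from the maintenance plan; revisit quarterly against actual
# remarketing outcomes (won/retained/lost).
SCORE_WEIGHTS = {
    'premium_change': 0.30,
    'benchmark_deviation': 0.25,
    'account_value': 0.20,
    'renewal_urgency': 0.15,
    'claims_favorability': 0.10,
}

def remarketing_score(factors: dict) -> float:
    """Weighted sum of factor scores, each assumed pre-normalized to 0-100."""
    return round(sum(SCORE_WEIGHTS[name] * factors.get(name, 0) for name in SCORE_WEIGHTS), 1)

# Example: a large premium increase on a near-term renewal should outrank a
# moderate increase on a high-value account (hypothetical factor values).
urgent = remarketing_score({'premium_change': 90, 'benchmark_deviation': 70,
                            'account_value': 40, 'renewal_urgency': 95,
                            'claims_favorability': 50})
moderate = remarketing_score({'premium_change': 45, 'benchmark_deviation': 40,
                              'account_value': 90, 'renewal_urgency': 30,
                              'claims_favorability': 60})
```

A quarterly tuning session then amounts to adjusting SCORE_WEIGHTS (keeping the weights summing to 1.0) and re-scoring the trailing 90 days of flagged accounts to see whether the new ordering better matches observed remarketing wins.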

Alternatives

...

Applied PLRI Native Approach (Low Complexity)

For agencies already running Applied Epic, activate the built-in Applied Insurance AI Premium & Loss Ratio Intelligence (PLRI) module instead of building a custom benchmarking pipeline. PLRI provides term-over-term premium comparison and market benchmarking natively within the Epic interface. Pair with Applied Book Builder for cross-sell intelligence. No custom database, ETL pipeline, or Power BI development is required.

InsuredMine + Zywave SaaS-Only Approach (Medium Complexity)

Deploy InsuredMine as the CRM/analytics overlay and Zywave Analytics Cloud for benchmark data without building a custom Azure data warehouse. InsuredMine provides native AMS integrations, visual analytics dashboards, and pipeline-based workflow management. Zywave provides the benchmark data. The MSP configures both platforms and builds automation rules within InsuredMine rather than custom code.

Full Custom Data Warehouse with ML Models (High Complexity)

Build a comprehensive data warehouse on Snowflake or Azure Synapse Analytics with machine learning models that predict which accounts are most likely to be lost at renewal, which carriers will offer the best rates for a given risk profile, and what the optimal remarketing timing is. Uses Azure Machine Learning or Databricks for model training and inference.

ReFocus AI Automated Remarketing Approach

Deploy ReFocus AI as the primary remarketing automation engine instead of InsuredMine + custom Azure Functions. ReFocus AI is purpose-built for automated remarketing and can define triggers based on premium increases, policy type, client tenure, and claims history. It automatically sends submissions to appointed carriers when thresholds are breached, reducing the need for custom workflow automation code.

Power BI + Manual Benchmark Approach (Budget Option)

Build Power BI dashboards connected directly to the AMS database (or CSV exports) with manually maintained benchmark data. Skip the Azure SQL data warehouse, Azure Data Factory, and automated CRM integration. The MSP updates benchmark data quarterly from publicly available sources (state insurance department rate filings, industry reports). Remarketing is flagged via Power BI alerts but managed manually in the AMS.
