
Implementation Guide: Analyze sales velocity, margin, and inventory turns to identify winners and underperformers
A step-by-step implementation guide for deploying an AI-assisted analytics stack that analyzes sales velocity, margin, and inventory turns to identify winners and underperformers for retail clients.
Hardware Procurement
Dell OptiPlex Micro Plus 7020
$1,000–$1,200 MSP cost / $1,400–$1,530 suggested resale
Optional on-premises local BI server and ETL host for retailers who require data residency on-site or have unreliable internet. Hosts self-managed Metabase or Airbyte Docker containers. Compact form factor fits in retail back-office environments. Not required for cloud-first deployments.
Synology DiskStation DS925+
$550–$700 NAS + $400–$500 drives = $950–$1,200 MSP cost / $1,200–$1,400 suggested resale
Optional local backup and data archival appliance. Stores nightly data warehouse backups, POS export archives, and historical CSV snapshots. Provides Hyper Backup to Azure Blob or AWS S3 for offsite redundancy. Only required if client has on-premises data retention requirements.
Dashboard Display Monitor
$350–$400 MSP cost / $500–$550 suggested resale
Wall-mounted or desk-mounted display in the buyer/manager office or back-of-store for always-on KPI dashboard visibility. Power BI dashboard runs in kiosk mode via Chrome/Edge on the connected workstation.
Software Procurement
Microsoft Power BI Pro
$11–$12/user/month MSP CSP cost; $14–$16/user/month client price. Typical deployment: 3–5 users = $42–$80/month
Primary business intelligence and dashboard platform. Hosts interactive dashboards for sales velocity, margin analysis, inventory turns, and winner/underperformer identification. Includes natural language Q&A, scheduled refresh, and row-level security for multi-location access control.
Microsoft 365 Business Premium
$18–$20/user/month MSP CSP cost; $22–$24/user/month client price. Typical: 5–10 users = $110–$240/month
Foundation platform providing Azure AD for identity/SSO, Power Automate for alerting workflows, SharePoint for documentation, and Teams for alert delivery. Required for Power BI Pro integration and MFA compliance.
Fivetran Data Pipeline
Free tier for initial setup; Standard plan at $500/million monthly active rows. Typical small retailer: $0–$300/month depending on transaction volume
Automated ETL/ELT platform that extracts data from POS systems (Square, Shopify, Lightspeed), accounting platforms (QuickBooks Online, Xero), and e-commerce systems into the Azure SQL data warehouse. Provides 700+ pre-built connectors with automatic schema migration and incremental sync.
Azure SQL Database
Basic tier: $5/month; Standard S1: $30/month; Standard S2: $75/month. Typical retail deployment: $30–$75/month
Cloud-hosted relational data warehouse that stores transformed POS transactions, inventory snapshots, product catalog with COGS, and calculated KPI tables. Serves as the single source of truth for all Power BI reports.
Power Automate (included with M365)
$0 additional (included in M365 license)
Automation engine for alerting workflows. Triggers email/Teams notifications when KPI thresholds are breached — e.g., margin drops below target, inventory days-of-supply exceeds 90 days, or sales velocity declines week-over-week beyond a configurable percentage.
Azure Data Factory (optional)
$0.25/activity run + $0.25/1000 DIU-hours. Typical: $10–$50/month for scheduled retail ETL
Alternative or supplement to Fivetran for clients already deep in Azure ecosystem. Handles data transformation pipelines, stored procedure execution, and complex data cleansing workflows that go beyond Fivetran's default transformations.
Prerequisites
- Active POS system with API access enabled (Square, Shopify POS, Lightspeed Retail, or Clover). Verify API credentials can be generated and that the POS plan includes API access — some basic-tier plans restrict this.
- Accounting software with API access (QuickBooks Online or Xero preferred). The accounting system must contain accurate COGS/cost-of-goods data at the SKU or category level. Without COGS data, margin calculations will be impossible.
- Minimum 6 months (ideally 12 months) of historical transaction data in the POS system. Verify data has not been purged or archived. If less than 6 months, the initial velocity and turns calculations will have limited statistical significance.
- SKU-level product catalog with cost data. Each product must have a defined cost/landed cost in either the POS or accounting system. If cost data is missing or inconsistent, a data cleansing phase must be added (1–2 weeks).
- Current inventory on-hand quantities accessible via the POS or a separate inventory management system. Inventory snapshot data is required for turns and days-of-supply calculations.
- Microsoft 365 Business Premium (or at minimum Business Basic) deployed for all dashboard users. Azure AD tenant must be configured with MFA enabled. If not yet deployed, add M365 migration as a prerequisite project.
- Stable internet connectivity at each retail location: minimum 25 Mbps download / 10 Mbps upload. For multi-location deployments, each site needs independent broadband.
- A designated client-side project stakeholder — typically the store owner, general manager, or head buyer — who can define which KPI thresholds matter (e.g., minimum acceptable margin %, target inventory turns) and validate data accuracy during UAT.
- Client must sign a data processing agreement acknowledging that transaction data (excluding cardholder data) will be stored in Azure cloud infrastructure. Confirm no contractual restrictions from their POS vendor on data export.
- Admin credentials or delegated admin access to the client's POS system, accounting software, and Microsoft 365 tenant for the MSP implementation team.
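The 6-month historical-data prerequisite above can be checked mechanically during discovery. A minimal sketch (the `history_coverage_ok` helper is illustrative, not part of any POS vendor API):

```python
from datetime import date

# Hypothetical helper: given the earliest and latest transaction dates in
# the POS export, check whether the history meets the 6-month minimum
# (180 days) before scoping the project.
def history_coverage_ok(first_txn: date, last_txn: date,
                        minimum_days: int = 180) -> bool:
    """Return True when the POS export spans at least `minimum_days`."""
    return (last_txn - first_txn).days >= minimum_days

# Example: 14 months of history passes; 4 months does not.
print(history_coverage_ok(date(2024, 1, 5), date(2025, 3, 1)))   # True
print(history_coverage_ok(date(2024, 11, 1), date(2025, 3, 1)))  # False
```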
Installation Steps
Step 1: Discovery & Data Audit
Conduct a 2–4 hour on-site or remote discovery session with the client stakeholder. Map all data sources: POS system (brand, plan tier, API availability), accounting software (QuickBooks Online/Xero), any separate inventory management tools, and e-commerce platforms. Document the product catalog structure (how SKUs are organized, whether categories/departments are defined), verify COGS data completeness, and identify data quality issues. Export sample data from each source for schema analysis.
curl -X GET https://connect.squareup.com/v2/catalog/list \
-H 'Authorization: Bearer {ACCESS_TOKEN}' \
-H 'Content-Type: application/json' | python3 -m json.tool | head -100

curl -X GET https://{store}.myshopify.com/admin/api/2024-01/products/count.json \
-H 'X-Shopify-Access-Token: {ACCESS_TOKEN}'

# Verify the item list includes cost data (use the QBO API sandbox or a
# Postman collection to test)
GET https://quickbooks.api.intuit.com/v3/company/{companyId}/query?query=SELECT * FROM Item WHERE Type='Inventory' MAXRESULTS 10

This is the most critical step: the quality of the entire project depends on data completeness. Common red flags: (1) COGS is $0.00 for many items — indicates cost data was never entered; (2) the SKU count in the POS doesn't match the accounting system; (3) inventory quantities show negative values; (4) the POS API plan is restricted. Document all findings in a Data Audit Report and get client sign-off before proceeding.
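The red-flag checks above can be automated against the sample exports. A minimal sketch, assuming the catalog has been flattened to a list of dicts with hypothetical keys `sku`, `unit_cost`, and `on_hand` (adapt to the actual export schema):

```python
# Minimal data-audit sketch for the red flags in Step 1.
def audit_catalog(rows):
    """Return SKUs exhibiting the two most common data-quality problems."""
    zero_cost = [r["sku"] for r in rows if r["unit_cost"] == 0]
    negative_qty = [r["sku"] for r in rows if r["on_hand"] < 0]
    return {"zero_cost_skus": zero_cost, "negative_on_hand": negative_qty}

sample = [
    {"sku": "A100", "unit_cost": 4.50, "on_hand": 12},
    {"sku": "B200", "unit_cost": 0.00, "on_hand": 3},   # cost never entered
    {"sku": "C300", "unit_cost": 2.25, "on_hand": -2},  # negative inventory
]
report = audit_catalog(sample)
print(report["zero_cost_skus"])    # ['B200']
print(report["negative_on_hand"])  # ['C300']
```

Include the resulting SKU lists in the Data Audit Report so the client can see exactly which items need cost entry before the build starts.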
Step 2: Provision Azure SQL Database
Create an Azure SQL Database instance to serve as the central data warehouse. This database will store transformed transaction data, product catalog with costs, inventory snapshots, and pre-calculated KPI tables. Use the Standard S1 tier (20 DTUs) initially — this can be scaled up non-disruptively if query performance requires it.
az login
az group create \
--name rg-retail-analytics-{clientname} \
--location eastus
az sql server create \
--name sql-retail-{clientname} \
--resource-group rg-retail-analytics-{clientname} \
--location eastus \
--admin-user sqladmin \
--admin-password '{STRONG_PASSWORD_HERE}'
az sql db create \
--resource-group rg-retail-analytics-{clientname} \
--server sql-retail-{clientname} \
--name RetailAnalyticsDB \
--service-objective S1
az sql server firewall-rule create \
--resource-group rg-retail-analytics-{clientname} \
--server sql-retail-{clientname} \
--name AllowAzureServices \
--start-ip-address 0.0.0.0 \
--end-ip-address 0.0.0.0
az sql server firewall-rule create \
--resource-group rg-retail-analytics-{clientname} \
--server sql-retail-{clientname} \
--name MSPOffice \
--start-ip-address {MSP_OFFICE_IP} \
--end-ip-address {MSP_OFFICE_IP}
az sql server ad-admin create \
--resource-group rg-retail-analytics-{clientname} \
--server sql-retail-{clientname} \
--display-name 'SQL Admin Group' \
--object-id {AZURE_AD_GROUP_OBJECT_ID}

Use a strong, randomly generated password for the SQL admin account and store it in the MSP's password vault (e.g., IT Glue, Hudu). Verify Transparent Data Encryption (TDE) is enabled — it is on by default for Azure SQL. Enable auditing and threat detection in the Azure portal under Security settings. Estimated monthly cost: ~$30/month for the S1 tier.
Step 3: Create Data Warehouse Schema
Connect to the Azure SQL Database and create the dimensional data model. This schema follows a star schema pattern optimized for retail analytics: a central fact table for transactions, dimension tables for products, locations, time periods, and pre-calculated KPI summary tables that Power BI will query directly.
-- ==============================================
-- DIMENSION TABLES
-- ==============================================
CREATE TABLE dim_product (
product_id NVARCHAR(100) PRIMARY KEY,
sku NVARCHAR(50) NOT NULL,
product_name NVARCHAR(255) NOT NULL,
category NVARCHAR(100),
subcategory NVARCHAR(100),
brand NVARCHAR(100),
vendor_name NVARCHAR(200),
unit_cost DECIMAL(10,2) NOT NULL DEFAULT 0, -- landed cost / COGS per unit
current_retail_price DECIMAL(10,2),
is_active BIT DEFAULT 1,
created_date DATETIME2 DEFAULT GETUTCDATE(),
updated_date DATETIME2 DEFAULT GETUTCDATE()
);
CREATE TABLE dim_location (
location_id NVARCHAR(100) PRIMARY KEY,
location_name NVARCHAR(200) NOT NULL,
address NVARCHAR(500),
city NVARCHAR(100),
state NVARCHAR(50),
zip_code NVARCHAR(20),
is_active BIT DEFAULT 1
);
CREATE TABLE dim_date (
date_key INT PRIMARY KEY, -- YYYYMMDD format
full_date DATE NOT NULL,
day_of_week NVARCHAR(10),
day_of_month INT,
week_of_year INT,
month_num INT,
month_name NVARCHAR(20),
quarter_num INT,
year_num INT,
is_weekend BIT,
fiscal_week INT,
fiscal_month INT,
fiscal_year INT
);

-- ==============================================
-- FACT TABLES
-- ==============================================
CREATE TABLE fact_sales (
sale_id BIGINT IDENTITY(1,1) PRIMARY KEY,
transaction_id NVARCHAR(100) NOT NULL,
transaction_date DATETIME2 NOT NULL,
date_key INT NOT NULL REFERENCES dim_date(date_key),
product_id NVARCHAR(100) NOT NULL REFERENCES dim_product(product_id),
location_id NVARCHAR(100) NOT NULL REFERENCES dim_location(location_id),
quantity_sold INT NOT NULL,
unit_price DECIMAL(10,2) NOT NULL, -- actual selling price (after discounts)
unit_cost DECIMAL(10,2) NOT NULL, -- COGS at time of sale
gross_revenue DECIMAL(12,2) NOT NULL, -- quantity * unit_price
total_cogs DECIMAL(12,2) NOT NULL, -- quantity * unit_cost
gross_margin_dollars DECIMAL(12,2) NOT NULL, -- gross_revenue - total_cogs
gross_margin_pct AS (CASE WHEN gross_revenue > 0 THEN (gross_margin_dollars / gross_revenue) * 100 ELSE 0 END),
discount_amount DECIMAL(10,2) DEFAULT 0,
source_system NVARCHAR(50), -- 'square', 'shopify', 'lightspeed'
etl_loaded_at DATETIME2 DEFAULT GETUTCDATE()
);
CREATE INDEX IX_fact_sales_date ON fact_sales(date_key);
CREATE INDEX IX_fact_sales_product ON fact_sales(product_id);
CREATE INDEX IX_fact_sales_location ON fact_sales(location_id);
CREATE INDEX IX_fact_sales_txn_date ON fact_sales(transaction_date);
CREATE TABLE fact_inventory_snapshot (
snapshot_id BIGINT IDENTITY(1,1) PRIMARY KEY,
snapshot_date DATE NOT NULL,
date_key INT NOT NULL REFERENCES dim_date(date_key),
product_id NVARCHAR(100) NOT NULL REFERENCES dim_product(product_id),
location_id NVARCHAR(100) NOT NULL REFERENCES dim_location(location_id),
quantity_on_hand INT NOT NULL,
unit_cost DECIMAL(10,2) NOT NULL,
inventory_value DECIMAL(12,2) NOT NULL, -- quantity_on_hand * unit_cost
etl_loaded_at DATETIME2 DEFAULT GETUTCDATE()
);
CREATE INDEX IX_inventory_snapshot_date ON fact_inventory_snapshot(snapshot_date);
CREATE INDEX IX_inventory_snapshot_product ON fact_inventory_snapshot(product_id);

-- ==============================================
-- KPI SUMMARY TABLE (refreshed daily by ETL)
-- ==============================================
CREATE TABLE kpi_product_performance (
kpi_id BIGINT IDENTITY(1,1) PRIMARY KEY,
calculation_date DATE NOT NULL,
product_id NVARCHAR(100) NOT NULL REFERENCES dim_product(product_id),
location_id NVARCHAR(100) NOT NULL REFERENCES dim_location(location_id),
-- Sales Velocity
units_sold_7d INT DEFAULT 0,
units_sold_30d INT DEFAULT 0,
units_sold_90d INT DEFAULT 0,
avg_daily_velocity_30d DECIMAL(10,4) DEFAULT 0,
velocity_trend_pct DECIMAL(10,2) DEFAULT 0, -- 30d vs prior 30d
-- Margin
gross_margin_pct_30d DECIMAL(10,2) DEFAULT 0,
gross_margin_dollars_30d DECIMAL(12,2) DEFAULT 0,
margin_trend_pct DECIMAL(10,2) DEFAULT 0, -- 30d vs prior 30d
-- Inventory Turns
annualized_cogs DECIMAL(14,2) DEFAULT 0,
avg_inventory_value_30d DECIMAL(12,2) DEFAULT 0,
inventory_turns_annualized DECIMAL(10,2) DEFAULT 0,
-- GMROI
gmroi DECIMAL(10,2) DEFAULT 0,
-- Days of Supply
current_on_hand INT DEFAULT 0,
days_of_supply DECIMAL(10,1) DEFAULT 0,
-- Sell-Through Rate
sell_through_rate_30d DECIMAL(10,2) DEFAULT 0,
-- Classification
performance_tier NVARCHAR(20), -- 'STAR', 'OPPORTUNITY', 'UNDERPERFORMER', 'DEAD_STOCK'
calculated_at DATETIME2 DEFAULT GETUTCDATE()
);
CREATE INDEX IX_kpi_product_date ON kpi_product_performance(calculation_date);
CREATE INDEX IX_kpi_product_tier ON kpi_product_performance(performance_tier);

-- ==============================================
-- POPULATE dim_date (6 years: 2022-01-01 through 2027-12-31)
-- ==============================================
WITH DateCTE AS (
SELECT CAST('2022-01-01' AS DATE) AS dt
UNION ALL
SELECT DATEADD(DAY, 1, dt) FROM DateCTE WHERE dt < '2027-12-31'
)
INSERT INTO dim_date (date_key, full_date, day_of_week, day_of_month, week_of_year, month_num, month_name, quarter_num, year_num, is_weekend, fiscal_week, fiscal_month, fiscal_year)
SELECT
CAST(FORMAT(dt, 'yyyyMMdd') AS INT),
dt,
DATENAME(WEEKDAY, dt),
DAY(dt),
DATEPART(WEEK, dt),
MONTH(dt),
DATENAME(MONTH, dt),
DATEPART(QUARTER, dt),
YEAR(dt),
CASE WHEN DATEPART(WEEKDAY, dt) IN (1,7) THEN 1 ELSE 0 END,
DATEPART(WEEK, dt), -- Adjust for client fiscal calendar if needed
MONTH(dt),
YEAR(dt)
FROM DateCTE
OPTION (MAXRECURSION 2200);

Adjust the fiscal calendar fields in dim_date if the client uses a non-calendar fiscal year (common in retail — many use a 4-5-4 or 4-4-5 retail calendar). The performance_tier classification in kpi_product_performance is populated by the daily KPI calculation stored procedure (defined in the Custom AI Components section). Keep unit_cost in dim_product in sync with the accounting system — stale costs are the most common source of margin calculation errors.
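The KPI columns in kpi_product_performance follow standard retail formulas. A worked sketch with made-up numbers for a single SKU (purely illustrative):

```python
# Worked example of the KPI formulas stored in kpi_product_performance:
#   inventory turns = annualized COGS / average inventory value
#   GMROI           = gross margin dollars / average inventory value
#   days of supply  = on-hand units / average daily velocity
annualized_cogs = 12_000.00        # COGS over trailing 365 days
avg_inventory_value = 3_000.00     # average on-hand value over 30 days
gross_margin_30d = 900.00          # margin dollars, trailing 30 days
on_hand = 40                       # current units on hand
avg_daily_velocity = 2.0           # units/day over trailing 30 days

inventory_turns = annualized_cogs / avg_inventory_value   # 4.0 turns/year
gmroi = gross_margin_30d / avg_inventory_value            # 0.3 on a 30-day basis
days_of_supply = on_hand / avg_daily_velocity             # 20.0 days

print(inventory_turns, gmroi, days_of_supply)  # 4.0 0.3 20.0
```

Note that GMROI here is computed on a 30-day margin window; annualize it (multiply by ~12) if the client wants the conventional annual GMROI figure.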
Step 4: Configure Fivetran Data Connectors
Set up Fivetran to automatically extract data from the client's POS system and accounting software into the Azure SQL data warehouse. Fivetran handles incremental syncs, schema changes, and retry logic automatically. Configure one connector per data source.
CREATE LOGIN fivetran_user WITH PASSWORD = '{STRONG_PASSWORD}';
CREATE USER fivetran_user FOR LOGIN fivetran_user;
ALTER ROLE db_owner ADD MEMBER fivetran_user;
-- Fivetran requires db_owner to manage schema creation.
-- After the initial sync stabilizes, you can reduce to db_datawriter + db_ddladmin.

--- SQUARE CONNECTOR ---
Connector Type: Square
Auth: OAuth 2.0 (redirect to Square login)
Tables to sync: transactions, payments, catalog_items,
inventory_counts, locations
Sync frequency: Every 6 hours
Schema prefix: square_raw
--- SHOPIFY CONNECTOR (if applicable) ---
Connector Type: Shopify
Auth: Custom App API credentials
Tables to sync: orders, order_line_items, products,
variants, inventory_levels, locations
Sync frequency: Every 6 hours
Schema prefix: shopify_raw
--- QUICKBOOKS ONLINE CONNECTOR ---
Connector Type: QuickBooks Online
Auth: OAuth 2.0
Tables to sync: items, invoices, invoice_lines,
purchases, purchase_lines, accounts
Sync frequency: Every 12 hours
Schema prefix: qbo_raw

-- Verify synced schemas and tables after the first sync completes
SELECT TABLE_SCHEMA, TABLE_NAME,
(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS c
WHERE c.TABLE_SCHEMA = t.TABLE_SCHEMA AND c.TABLE_NAME = t.TABLE_NAME) as column_count
FROM INFORMATION_SCHEMA.TABLES t
WHERE TABLE_SCHEMA IN ('square_raw', 'shopify_raw', 'qbo_raw')
ORDER BY TABLE_SCHEMA, TABLE_NAME;
-- Check row counts on key tables ("rows" is a reserved word, so alias as row_count)
SELECT 'square_raw.transactions' as tbl, COUNT(*) as row_count FROM square_raw.transactions
UNION ALL
SELECT 'square_raw.catalog_items', COUNT(*) FROM square_raw.catalog_items
UNION ALL
SELECT 'qbo_raw.items', COUNT(*) FROM qbo_raw.items;

Fivetran's free tier allows 500K monthly active rows — sufficient for initial setup and testing with small retailers. Monitor MAR usage in the Fivetran dashboard after the first week. For a retailer doing ~1,000 transactions/day, expect ~30K-50K MAR across all connectors, well within the free or low-cost tiers. If the client uses Lightspeed or Clover instead of Square/Shopify, Fivetran has native connectors for both. The Clover connector requires a Clover Developer account and app approval.
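For sizing conversations with the client, the MAR figure can be estimated from transaction volume. A crude back-of-envelope sketch (the 1.5 rows-per-transaction multiplier is an assumption consistent with the ~30K-50K range quoted above; actual MAR depends on connector internals and should be confirmed in the Fivetran dashboard):

```python
# Back-of-envelope MAR (monthly active rows) estimator for Fivetran sizing.
# Assumes each POS transaction touches roughly 1-2 rows per month across
# the synced tables; treat the result as an order-of-magnitude guide only.
def estimate_mar(txns_per_day: int, rows_per_txn: float = 1.5) -> int:
    """Estimate monthly active rows from daily transaction volume."""
    return int(txns_per_day * rows_per_txn * 30)

print(estimate_mar(1_000))  # 45000 — inside the ~30K-50K range cited above
```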
Step 5: Build ETL Transformation Layer
Create stored procedures that transform raw data from Fivetran's staging schemas into the star schema dimensional model. These procedures handle data cleansing, deduplication, cost lookup, and loading into the fact and dimension tables. Schedule them to run after each Fivetran sync.
-- ==============================================
-- STORED PROCEDURE: Load dim_product from Square + QBO
-- ==============================================
CREATE OR ALTER PROCEDURE sp_load_dim_product
AS
BEGIN
SET NOCOUNT ON;
-- Merge products from Square catalog with costs from QBO
MERGE dim_product AS target
USING (
SELECT
sq.id AS product_id,
COALESCE(sq.item_data_name, 'UNKNOWN') AS sku,
COALESCE(sq.item_data_name, 'Unnamed Product') AS product_name,
sq.item_data_category_id AS category,
NULL AS subcategory,
NULL AS brand,
NULL AS vendor_name,
COALESCE(
-- Prefer the QBO purchase cost
qbo.purchase_cost,
-- Fall back to an estimated 50% of the Square retail price
CAST(sq.item_data_variations_0_item_variation_data_price_money_amount AS DECIMAL(10,2)) / 100 * 0.5,
-- Last resort: zero (these rows should be flagged in the data audit)
0
) AS unit_cost,
CAST(sq.item_data_variations_0_item_variation_data_price_money_amount AS DECIMAL(10,2)) / 100 AS current_retail_price,
CASE WHEN sq.is_deleted = 'true' THEN 0 ELSE 1 END AS is_active
FROM square_raw.catalog_objects sq
LEFT JOIN qbo_raw.items qbo ON LOWER(qbo.name) = LOWER(sq.item_data_name)
WHERE sq.type = 'ITEM'
) AS source
ON target.product_id = source.product_id
WHEN MATCHED THEN
UPDATE SET
product_name = source.product_name,
unit_cost = source.unit_cost,
current_retail_price = source.current_retail_price,
is_active = source.is_active,
updated_date = GETUTCDATE()
WHEN NOT MATCHED THEN
INSERT (product_id, sku, product_name, category, subcategory, brand, vendor_name, unit_cost, current_retail_price, is_active)
VALUES (source.product_id, source.sku, source.product_name, source.category, source.subcategory, source.brand, source.vendor_name, source.unit_cost, source.current_retail_price, source.is_active);
PRINT 'dim_product loaded: ' + CAST(@@ROWCOUNT AS VARCHAR) + ' rows affected';
END;
-- ==============================================
-- STORED PROCEDURE: Load fact_sales from Square transactions
-- ==============================================
CREATE OR ALTER PROCEDURE sp_load_fact_sales
@days_back INT = 7 -- Only process recent days for incremental loads
AS
BEGIN
SET NOCOUNT ON;
DECLARE @cutoff_date DATETIME2 = DATEADD(DAY, -@days_back, GETUTCDATE());
INSERT INTO fact_sales (
transaction_id, transaction_date, date_key, product_id, location_id,
quantity_sold, unit_price, unit_cost, gross_revenue, total_cogs,
gross_margin_dollars, discount_amount, source_system
)
SELECT
t.id AS transaction_id,
CAST(t.created_at AS DATETIME2) AS transaction_date,
CAST(FORMAT(CAST(t.created_at AS DATE), 'yyyyMMdd') AS INT) AS date_key,
li.catalog_object_id AS product_id,
t.location_id,
COALESCE(CAST(li.quantity AS INT), 1) AS quantity_sold,
CAST(li.base_price_money_amount AS DECIMAL(10,2)) / 100 AS unit_price,
COALESCE(p.unit_cost, 0) AS unit_cost,
CAST(li.total_money_amount AS DECIMAL(10,2)) / 100 AS gross_revenue,
COALESCE(p.unit_cost, 0) * COALESCE(CAST(li.quantity AS INT), 1) AS total_cogs,
(CAST(li.total_money_amount AS DECIMAL(10,2)) / 100) - (COALESCE(p.unit_cost, 0) * COALESCE(CAST(li.quantity AS INT), 1)) AS gross_margin_dollars,
COALESCE(CAST(li.total_discount_money_amount AS DECIMAL(10,2)) / 100, 0) AS discount_amount,
'square' AS source_system
FROM square_raw.order_line_items li
INNER JOIN square_raw.orders t ON li.order_id = t.id
LEFT JOIN dim_product p ON li.catalog_object_id = p.product_id
WHERE CAST(t.created_at AS DATETIME2) >= @cutoff_date
AND t.state = 'COMPLETED'
AND NOT EXISTS (
SELECT 1 FROM fact_sales fs
WHERE fs.transaction_id = t.id
AND fs.product_id = li.catalog_object_id
);
PRINT 'fact_sales loaded: ' + CAST(@@ROWCOUNT AS VARCHAR) + ' new rows';
END;
-- ==============================================
-- STORED PROCEDURE: Daily inventory snapshot
-- ==============================================
CREATE OR ALTER PROCEDURE sp_load_inventory_snapshot
AS
BEGIN
SET NOCOUNT ON;
DECLARE @today DATE = CAST(GETUTCDATE() AS DATE);
DECLARE @date_key INT = CAST(FORMAT(@today, 'yyyyMMdd') AS INT);
-- Delete today's snapshot if re-running
DELETE FROM fact_inventory_snapshot WHERE snapshot_date = @today;
INSERT INTO fact_inventory_snapshot (
snapshot_date, date_key, product_id, location_id,
quantity_on_hand, unit_cost, inventory_value
)
SELECT
@today,
@date_key,
inv.catalog_object_id AS product_id,
inv.location_id,
CAST(inv.quantity AS INT) AS quantity_on_hand,
COALESCE(p.unit_cost, 0) AS unit_cost,
CAST(inv.quantity AS INT) * COALESCE(p.unit_cost, 0) AS inventory_value
FROM square_raw.inventory_counts inv
INNER JOIN dim_product p ON inv.catalog_object_id = p.product_id
WHERE inv.state = 'IN_STOCK';
PRINT 'Inventory snapshot loaded for ' + CAST(@today AS VARCHAR) + ': ' + CAST(@@ROWCOUNT AS VARCHAR) + ' rows';
END;
-- ==============================================
-- MASTER ETL ORCHESTRATION PROCEDURE
-- ==============================================
CREATE OR ALTER PROCEDURE sp_run_daily_etl
AS
BEGIN
SET NOCOUNT ON;
DECLARE @start DATETIME2 = GETUTCDATE();
PRINT '=== RETAIL ANALYTICS ETL START: ' + CAST(@start AS VARCHAR) + ' ===';
EXEC sp_load_dim_product;
EXEC sp_load_fact_sales @days_back = 7;
EXEC sp_load_inventory_snapshot;
EXEC sp_calculate_product_kpis; -- Defined in Custom AI Components section
PRINT '=== ETL COMPLETE. Duration: ' +
CAST(DATEDIFF(SECOND, @start, GETUTCDATE()) AS VARCHAR) + ' seconds ===';
END;

These stored procedures are written for a Square POS source. Adapt the column names for Shopify (orders.line_items schema) or Lightspeed (sales.sale_lines schema) accordingly. The product matching between POS and accounting (QBO) uses the product name — this is fragile. During discovery, evaluate whether a more robust matching key exists (e.g., a SKU number shared between both systems). Run sp_load_fact_sales with @days_back = 365 for the initial historical load, then switch to 7 for daily incremental runs.
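When no shared SKU key exists and name matching is unavoidable, normalizing names before comparing recovers many matches that raw string equality misses. A minimal sketch of the idea (sample data is invented; a shared SKU key remains strongly preferred):

```python
import re

# Name-normalization fallback for POS-to-accounting item matching.
# Raw name equality breaks on case, punctuation, and whitespace
# differences; normalizing both sides first recovers many matches.
def normalize(name: str) -> str:
    """Lowercase, strip punctuation, collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", name.lower())).strip()

pos_items = {"Widget, Large (Blue)": "SQ-123"}   # POS name -> product_id
qbo_items = {"widget large blue": 4.25}          # QBO name -> purchase cost

for pos_name, product_id in pos_items.items():
    cost = qbo_items.get(normalize(pos_name))
    print(product_id, cost)  # SQ-123 4.25
```

The same normalization can be pushed into the SQL join (LOWER plus punctuation stripping) if the matching must happen inside sp_load_dim_product; log any items that still fail to match so their costs can be entered manually.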
Step 6: Schedule Daily ETL Execution
Set up an Azure SQL Agent Job equivalent (using Azure Automation or Azure Data Factory) to run the master ETL procedure daily after Fivetran syncs complete. Schedule for early morning (e.g., 5:00 AM local time) so dashboards show fresh data when the store opens.
Option A: Azure Automation Runbook (simplest for MSPs)
az automation account create \
--name auto-retail-{clientname} \
--resource-group rg-retail-analytics-{clientname} \
--location eastus

# PowerShell runbook that calls the stored procedure
$connectionString = "Server=sql-retail-{clientname}.database.windows.net;Database=RetailAnalyticsDB;User ID=sqladmin;Password={PASSWORD};Encrypt=True;TrustServerCertificate=False;"
$query = "EXEC sp_run_daily_etl;"
try {
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
$command.CommandTimeout = 600 # 10 minute timeout
$command.ExecuteNonQuery()
Write-Output "ETL completed successfully at $(Get-Date -Format 'yyyy-MM-dd HH:mm:ss UTC')"
} catch {
Write-Error "ETL FAILED: $($_.Exception.Message)"
throw
} finally {
if ($connection.State -eq 'Open') { $connection.Close() }
}

az automation schedule create \
--automation-account-name auto-retail-{clientname} \
--resource-group rg-retail-analytics-{clientname} \
--name DailyETL \
--frequency Day \
--interval 1 \
--start-time "2025-01-01T09:00:00Z" \
--timezone "America/New_York"

Store the SQL connection password as an Azure Automation encrypted variable — never hardcode it in the runbook. Set up an alert in Azure Monitor to notify the MSP team if the runbook fails (Action Group → email to MSP NOC). Azure Automation costs approximately $0.002 per minute of job runtime — negligible for a 2–5 minute daily ETL job.
Step 7: Build Power BI Data Model and Dashboards
Create the Power BI report file (.pbix) that connects to Azure SQL Database via DirectQuery or Import mode, defines the data model relationships, creates DAX measures for all KPIs, and builds the interactive dashboard pages. This is the core deliverable the client interacts with daily.
-- Create each measure in Power BI Desktop → Modeling → New Measure
-- === SALES VELOCITY ===
Total Units Sold = SUM(fact_sales[quantity_sold])
Avg Daily Velocity =
DIVIDE(
[Total Units Sold],
DATEDIFF(MIN(fact_sales[transaction_date]), MAX(fact_sales[transaction_date]), DAY) + 1,
0
)
Velocity Trend % =
VAR CurrentPeriod = CALCULATE([Total Units Sold], DATESINPERIOD(dim_date[full_date], MAX(dim_date[full_date]), -30, DAY))
VAR PriorPeriod = CALCULATE([Total Units Sold], DATESINPERIOD(dim_date[full_date], MAX(dim_date[full_date]) - 30, -30, DAY))
RETURN DIVIDE(CurrentPeriod - PriorPeriod, PriorPeriod, 0) * 100
-- === MARGIN ===
Total Revenue = SUM(fact_sales[gross_revenue])
Total COGS = SUM(fact_sales[total_cogs])
Gross Margin $ = [Total Revenue] - [Total COGS]
Gross Margin % = DIVIDE([Gross Margin $], [Total Revenue], 0) * 100
Margin Trend % =
VAR CurrentMargin = CALCULATE([Gross Margin %], DATESINPERIOD(dim_date[full_date], MAX(dim_date[full_date]), -30, DAY))
VAR PriorMargin = CALCULATE([Gross Margin %], DATESINPERIOD(dim_date[full_date], MAX(dim_date[full_date]) - 30, -30, DAY))
RETURN CurrentMargin - PriorMargin
-- === INVENTORY TURNS ===
Annualized COGS = CALCULATE([Total COGS], DATESINPERIOD(dim_date[full_date], MAX(dim_date[full_date]), -365, DAY))
Avg Inventory Value = AVERAGEX(
VALUES(fact_inventory_snapshot[snapshot_date]),
CALCULATE(SUM(fact_inventory_snapshot[inventory_value]))
)
Inventory Turns = DIVIDE([Annualized COGS], [Avg Inventory Value], 0)
-- === GMROI ===
GMROI = DIVIDE([Gross Margin $], [Avg Inventory Value], 0)
-- === DAYS OF SUPPLY ===
Current On Hand =
CALCULATE(
SUM(fact_inventory_snapshot[quantity_on_hand]),
LASTDATE(fact_inventory_snapshot[snapshot_date])
)
Days of Supply = DIVIDE([Current On Hand], [Avg Daily Velocity], 999)
-- === SELL-THROUGH RATE ===
Sell Through Rate =
DIVIDE([Total Units Sold], [Total Units Sold] + [Current On Hand], 0) * 100
-- === PERFORMANCE CLASSIFICATION ===
Performance Tier =
VAR Margin = [Gross Margin %]
VAR Velocity = [Avg Daily Velocity]
VAR MedianMargin = MEDIANX(ALL(dim_product), [Gross Margin %])
VAR MedianVelocity = MEDIANX(ALL(dim_product), [Avg Daily Velocity])
RETURN
SWITCH(TRUE(),
Velocity >= MedianVelocity && Margin >= MedianMargin, "⭐ STAR",
Velocity >= MedianVelocity && Margin < MedianMargin, "🔄 OPPORTUNITY",
Velocity < MedianVelocity && Margin >= MedianMargin, "⚠️ SLEEPER",
"🔴 UNDERPERFORMER"
)

PAGE 1: Executive Summary
- KPI cards: Total Revenue, Gross Margin %, Avg Inventory Turns, GMROI
- Trend line: Revenue & Margin % over time (last 12 months)
- Donut chart: Revenue split by Performance Tier
- Location comparison bar chart
PAGE 2: Winners & Underperformers
- Scatter plot: X = Avg Daily Velocity, Y = Gross Margin %, Size = Revenue (quadrant chart showing Stars, Opportunities, Sleepers, Underperformers)
- Top 20 table: sorted by GMROI descending
- Bottom 20 table: sorted by GMROI ascending (with Days of Supply)
- Slicers: Category, Location, Date Range
PAGE 3: Inventory Health
- Gauge chart: Overall Inventory Turns vs target (industry avg: 4–6 for general retail)
- Heat map: Category × Location showing turns
- Alert table: Items with Days of Supply > 90 (dead stock candidates)
- Waterfall chart: Inventory value change over time
PAGE 4: Category Deep Dive
- Drill-through page from any product/category
- 12-month velocity trend for the selected item
- Margin history showing price/cost changes
- Inventory on-hand vs sell-through
Use Import mode for best dashboard performance if total data volume is under 1 million rows. For larger datasets or near-real-time needs, use DirectQuery but expect slower dashboard interactions. Enable Row-Level Security (RLS) if different managers should only see their location's data. Publish to a dedicated Power BI workspace named 'Retail Analytics - {ClientName}' and configure workspace access with Viewer role for store managers and Contributor role for the client's data champion.
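The quadrant logic behind the Performance Tier DAX measure above can be sanity-checked outside Power BI. A Python rendition with made-up sample data (illustrative only; the DAX measure remains the source of truth in the report):

```python
from statistics import median

# Mirror of the Performance Tier measure: each product is compared
# against the catalog-wide median margin % and median daily velocity,
# producing the four quadrants used throughout the dashboards.
def classify(products):
    med_margin = median(p["margin_pct"] for p in products)
    med_velocity = median(p["velocity"] for p in products)
    tiers = {}
    for p in products:
        hi_v = p["velocity"] >= med_velocity
        hi_m = p["margin_pct"] >= med_margin
        tiers[p["sku"]] = ("STAR" if hi_v and hi_m else
                           "OPPORTUNITY" if hi_v else
                           "SLEEPER" if hi_m else
                           "UNDERPERFORMER")
    return tiers

sample = [
    {"sku": "A", "velocity": 5.0, "margin_pct": 55},  # fast and profitable
    {"sku": "B", "velocity": 4.0, "margin_pct": 30},  # fast, thin margin
    {"sku": "C", "velocity": 1.0, "margin_pct": 60},  # slow, rich margin
    {"sku": "D", "velocity": 0.5, "margin_pct": 20},  # slow and thin
]
print(classify(sample))
```

Running the same classification in the ETL layer (to populate performance_tier in kpi_product_performance) and in DAX gives a useful cross-check during UAT: the two should agree for every SKU.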
Step 8: Configure Automated Alerting via Power Automate
Set up Power Automate flows that monitor KPI thresholds and send alerts to store managers and buyers when action is needed. These alerts transform passive dashboards into proactive intelligence.
Flow 1: Daily Underperformer Alert
SELECT
p.product_name,
p.category,
l.location_name,
k.gross_margin_pct_30d,
k.avg_daily_velocity_30d,
k.days_of_supply,
k.inventory_turns_annualized,
k.gmroi,
k.performance_tier
FROM kpi_product_performance k
JOIN dim_product p ON k.product_id = p.product_id
JOIN dim_location l ON k.location_id = l.location_id
WHERE k.calculation_date = CAST(GETUTCDATE() AS DATE)
AND k.performance_tier = 'UNDERPERFORMER'
AND k.current_on_hand > 0
AND k.days_of_supply > 90
ORDER BY k.days_of_supply DESC;

Flow 2: Margin Erosion Alert
SELECT
p.product_name,
p.category,
k.gross_margin_pct_30d AS current_margin_pct,
k.margin_trend_pct AS margin_change,
k.gross_margin_dollars_30d
FROM kpi_product_performance k
JOIN dim_product p ON k.product_id = p.product_id
WHERE k.calculation_date = CAST(GETUTCDATE() AS DATE)
AND k.margin_trend_pct < -5 -- Margin dropped more than 5 points
AND k.gross_margin_dollars_30d > 100 -- Only flag items with meaningful revenue
ORDER BY k.margin_trend_pct ASC;

Flow 3: Dead Stock Alert
-- items with zero sales in 90 days but positive on-hand inventory
SELECT
p.product_name,
p.category,
l.location_name,
k.current_on_hand,
k.days_of_supply,
k.avg_daily_velocity_30d,
snap.inventory_value
FROM kpi_product_performance k
JOIN dim_product p ON k.product_id = p.product_id
JOIN dim_location l ON k.location_id = l.location_id
JOIN fact_inventory_snapshot snap
ON k.product_id = snap.product_id
AND k.location_id = snap.location_id
AND snap.snapshot_date = CAST(GETUTCDATE() AS DATE)
WHERE k.calculation_date = CAST(GETUTCDATE() AS DATE)
AND k.units_sold_90d = 0
AND k.current_on_hand > 0
ORDER BY snap.inventory_value DESC;

Power Automate premium connectors may be needed for the SQL Server connector — verify the client's M365 license includes this (Business Premium does). Test each flow with sample data before going live. Set up a shared mailbox (e.g., analytics@clientdomain.com) as the sender for all automated alerts. Configure the MSP's monitoring system to alert if any flow fails for 2+ consecutive days.
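After the dead-stock query runs, the flow's remaining work is turning result rows into a readable Teams/email body. A sketch of that formatting step (field names mirror the query's output columns; whether you do this in a Power Automate expression or a script step is a design choice):

```python
# Format dead-stock query rows into an alert message, highest inventory
# value first. Returns None when there is nothing to alert on, so the
# flow can skip sending entirely on clean days.
def format_dead_stock_alert(rows):
    if not rows:
        return None
    lines = [f"Dead stock alert — {len(rows)} item(s) with no sales in 90 days:"]
    for r in sorted(rows, key=lambda r: r["inventory_value"], reverse=True):
        lines.append(f"  {r['product_name']} @ {r['location_name']}: "
                     f"{r['current_on_hand']} on hand (${r['inventory_value']:,.2f})")
    return "\n".join(lines)

rows = [{"product_name": "Snow Shovel", "location_name": "Main St",
         "current_on_hand": 14, "inventory_value": 322.00}]
print(format_dead_stock_alert(rows))
```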
Step 9: Implement Row-Level Security and Access Controls
Configure Power BI Row-Level Security (RLS) so multi-location managers only see their location's data, while the owner/GM sees all locations. Set up Azure AD security groups for access management.
az ad group create --display-name 'PBI-RetailAnalytics-Admins' --mail-nickname pbi-retail-admins
az ad group create --display-name 'PBI-RetailAnalytics-LocationManagers' --mail-nickname pbi-retail-managers
az ad group create --display-name 'PBI-RetailAnalytics-Viewers' --mail-nickname pbi-retail-viewers
// Role: LocationManager
// Table: dim_location
// DAX Filter:
[location_id] = LOOKUPVALUE(
UserLocationMapping[location_id],
UserLocationMapping[user_email],
USERPRINCIPALNAME()
)
CREATE TABLE UserLocationMapping (
user_email NVARCHAR(200) NOT NULL,
location_id NVARCHAR(100) NOT NULL,
PRIMARY KEY (user_email, location_id)
);
INSERT INTO UserLocationMapping VALUES
('manager1@clientdomain.com', 'LOC001'),
('manager2@clientdomain.com', 'LOC002');
Test RLS by using the 'View as Role' feature in Power BI Desktop before publishing. Ensure the owner/executive user is in the Admin group without RLS restrictions. Audit Power BI access logs monthly (available in the Power BI Admin portal under Audit logs). Document all user-to-location mappings in the client's IT documentation system.
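The RLS behavior the DAX filter produces is easy to express outside Power BI for test planning. A sketch, assuming a hypothetical `visible_rows` helper and the same user-to-location mapping table:

```python
# Mirrors UserLocationMapping; admins bypass the filter entirely,
# matching the Admin role with no RLS restriction.
USER_LOCATION_MAPPING = {
    "manager1@clientdomain.com": {"LOC001"},
    "manager2@clientdomain.com": {"LOC002"},
}

def visible_rows(rows, user_email, is_admin=False):
    """Filter KPI rows the way the LocationManager role does."""
    if is_admin:
        return rows
    allowed = USER_LOCATION_MAPPING.get(user_email, set())
    return [r for r in rows if r["location_id"] in allowed]
```

A user not present in the mapping sees zero rows, which is the safe default; make sure that case is covered in your 'View as Role' tests.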
Step 10: Final Testing, UAT, and Go-Live
Conduct thorough testing of the entire pipeline — from POS data extraction through dashboard visualization — and perform User Acceptance Testing with the client stakeholder before going live.
-- 1. Data freshness check
SELECT
'fact_sales' AS table_name,
MAX(transaction_date) AS latest_record,
DATEDIFF(HOUR, MAX(transaction_date), GETUTCDATE()) AS hours_behind
FROM fact_sales
UNION ALL
SELECT
'fact_inventory_snapshot',
MAX(CAST(snapshot_date AS DATETIME2)),
DATEDIFF(HOUR, MAX(CAST(snapshot_date AS DATETIME2)), GETUTCDATE())
FROM fact_inventory_snapshot;
-- 2. Revenue reconciliation (compare to POS reports)
SELECT
FORMAT(transaction_date, 'yyyy-MM') AS month,
SUM(gross_revenue) AS total_revenue,
COUNT(DISTINCT transaction_id) AS transaction_count
FROM fact_sales
GROUP BY FORMAT(transaction_date, 'yyyy-MM')
ORDER BY month DESC;
-- 3. Margin sanity check (flag items with suspicious margins)
SELECT
p.product_name,
p.unit_cost,
p.current_retail_price,
(p.current_retail_price - p.unit_cost) / NULLIF(p.current_retail_price, 0) * 100 AS expected_margin_pct,
k.gross_margin_pct_30d AS actual_margin_pct
FROM kpi_product_performance k
JOIN dim_product p ON k.product_id = p.product_id
WHERE k.calculation_date = CAST(GETUTCDATE() AS DATE)
AND (k.gross_margin_pct_30d > 90 OR k.gross_margin_pct_30d < 0)
ORDER BY k.gross_margin_pct_30d;
-- 4. Inventory value reconciliation
SELECT
SUM(inventory_value) AS total_inventory_value,
COUNT(DISTINCT product_id) AS unique_skus,
SUM(quantity_on_hand) AS total_units
FROM fact_inventory_snapshot
WHERE snapshot_date = CAST(GETUTCDATE() AS DATE);
Revenue reconciliation is critical — compare the dashboard's monthly revenue totals against the POS system's native reports for the same period. Discrepancies of more than 2% require investigation (common causes: refunds handled differently, tax inclusion/exclusion, rounding). Have the client's bookkeeper or accountant validate that gross margin percentages match their expectations for key product categories. Document any known data discrepancies and their causes in the handoff documentation.
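The 2% reconciliation threshold is worth encoding so UAT is repeatable rather than eyeballed. A sketch, with a hypothetical `reconciliation_ok` helper:

```python
def reconciliation_ok(dashboard_total, pos_total, tolerance_pct=2.0):
    """Compare a month's dashboard revenue against the POS native report.
    Returns True when the discrepancy is within the tolerance."""
    if pos_total == 0:
        return dashboard_total == 0
    diff_pct = abs(dashboard_total - pos_total) / pos_total * 100
    return diff_pct <= tolerance_pct
```

Run it for at least three months of data, including the most recent complete month, and record the actual percentage in the handoff documentation even when it passes.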
Custom AI Components
Product Performance KPI Calculator
Type: workflow
A stored procedure that runs daily as part of the ETL pipeline to calculate all core KPIs (sales velocity, gross margin, inventory turns, GMROI, days of supply, sell-through rate) for every product-location combination and assigns a performance tier classification (STAR, OPPORTUNITY, SLEEPER, UNDERPERFORMER, DEAD_STOCK). This is the analytical engine that powers all dashboards and alerts.
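As a reference for the tier logic before reading the full procedure, the second-pass quadrant classification can be sketched in Python (hypothetical `assign_tier` function; the medians are computed over the day's KPI rows, as in the procedure):

```python
def assign_tier(velocity, margin_pct, units_sold_90d, on_hand,
                median_velocity, median_margin):
    """Median-split quadrant classification, dead stock checked first."""
    if units_sold_90d == 0 and on_hand > 0:
        return "DEAD_STOCK"
    if velocity >= median_velocity and margin_pct >= median_margin:
        return "STAR"           # high velocity, high margin
    if velocity >= median_velocity:
        return "OPPORTUNITY"    # sells well, margin below median
    if margin_pct >= median_margin:
        return "SLEEPER"        # healthy margin, slow mover
    return "UNDERPERFORMER"     # below median on both axes
```

Because the split is median-based, roughly half of active products will always land below each threshold; explain this to the client so "UNDERPERFORMER" is read as relative, not absolute.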
Implementation:
CREATE OR ALTER PROCEDURE sp_calculate_product_kpis
AS
BEGIN
SET NOCOUNT ON;
DECLARE @today DATE = CAST(GETUTCDATE() AS DATE);
DECLARE @date_key INT = CAST(FORMAT(@today, 'yyyyMMdd') AS INT);
-- Delete today's KPIs if re-running
DELETE FROM kpi_product_performance WHERE calculation_date = @today;
-- Calculate KPIs for every active product × location combination
INSERT INTO kpi_product_performance (
calculation_date, product_id, location_id,
units_sold_7d, units_sold_30d, units_sold_90d,
avg_daily_velocity_30d, velocity_trend_pct,
gross_margin_pct_30d, gross_margin_dollars_30d, margin_trend_pct,
annualized_cogs, avg_inventory_value_30d, inventory_turns_annualized,
gmroi, current_on_hand, days_of_supply, sell_through_rate_30d,
performance_tier
)
SELECT
@today AS calculation_date,
p.product_id,
l.location_id,
-- Units Sold (7d, 30d, 90d)
COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -7, @today) THEN fs.quantity_sold END), 0) AS units_sold_7d,
COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 0) AS units_sold_30d,
COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -90, @today) THEN fs.quantity_sold END), 0) AS units_sold_90d,
-- Avg Daily Velocity (30d)
COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 0) / 30.0 AS avg_daily_velocity_30d,
-- Velocity Trend: (current 30d units - prior 30d units) / prior 30d units * 100
CASE
WHEN COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -60, @today) AND fs.transaction_date < DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 0) > 0
THEN (
COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 0)
- COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -60, @today) AND fs.transaction_date < DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 0)
) * 100.0 / COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -60, @today) AND fs.transaction_date < DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 1)
ELSE 0
END AS velocity_trend_pct,
-- Gross Margin % (30d)
CASE
WHEN COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.gross_revenue END), 0) > 0
THEN COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.gross_margin_dollars END), 0)
* 100.0 / SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.gross_revenue END)
ELSE 0
END AS gross_margin_pct_30d,
-- Gross Margin $ (30d)
COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.gross_margin_dollars END), 0) AS gross_margin_dollars_30d,
-- Margin Trend (current 30d margin% - prior 30d margin%)
CASE
WHEN COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.gross_revenue END), 0) > 0
AND COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -60, @today) AND fs.transaction_date < DATEADD(DAY, -30, @today) THEN fs.gross_revenue END), 0) > 0
THEN (
COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.gross_margin_dollars END), 0) * 100.0
/ SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.gross_revenue END)
) - (
COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -60, @today) AND fs.transaction_date < DATEADD(DAY, -30, @today) THEN fs.gross_margin_dollars END), 0) * 100.0
/ SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -60, @today) AND fs.transaction_date < DATEADD(DAY, -30, @today) THEN fs.gross_revenue END)
)
ELSE 0
END AS margin_trend_pct,
-- Annualized COGS (trailing 12 months; fall back to 90d × 4 extrapolation, then 0)
COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -365, @today) THEN fs.total_cogs END),
SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -90, @today) THEN fs.total_cogs END) * 4, 0) AS annualized_cogs,
-- Avg Inventory Value (30d avg from snapshots)
COALESCE(inv_avg.avg_inv_value, 0) AS avg_inventory_value_30d,
-- Inventory Turns (annualized)
CASE
WHEN COALESCE(inv_avg.avg_inv_value, 0) > 0
THEN COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -365, @today) THEN fs.total_cogs END),
SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -90, @today) THEN fs.total_cogs END) * 4)
/ inv_avg.avg_inv_value
ELSE 0
END AS inventory_turns_annualized,
-- GMROI
CASE
WHEN COALESCE(inv_avg.avg_inv_value, 0) > 0
THEN COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -365, @today) THEN fs.gross_margin_dollars END),
SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -90, @today) THEN fs.gross_margin_dollars END) * 4)
/ inv_avg.avg_inv_value
ELSE 0
END AS gmroi,
-- Current On Hand
COALESCE(inv_current.quantity_on_hand, 0) AS current_on_hand,
-- Days of Supply
CASE
WHEN COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 0) > 0
THEN COALESCE(inv_current.quantity_on_hand, 0) * 30.0
/ SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.quantity_sold END)
ELSE 999 -- infinite days of supply if no sales
END AS days_of_supply,
-- Sell-Through Rate (30d)
CASE
WHEN (COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 0)
+ COALESCE(inv_current.quantity_on_hand, 0)) > 0
THEN COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 0) * 100.0
/ (COALESCE(SUM(CASE WHEN fs.transaction_date >= DATEADD(DAY, -30, @today) THEN fs.quantity_sold END), 0)
+ COALESCE(inv_current.quantity_on_hand, 0))
ELSE 0
END AS sell_through_rate_30d,
-- Performance Tier Classification
'PENDING' AS performance_tier -- Will be updated in second pass
FROM dim_product p
CROSS JOIN dim_location l
LEFT JOIN fact_sales fs
ON p.product_id = fs.product_id
AND l.location_id = fs.location_id
AND fs.transaction_date >= DATEADD(DAY, -365, @today)
LEFT JOIN (
SELECT product_id, location_id, AVG(inventory_value) AS avg_inv_value
FROM fact_inventory_snapshot
WHERE snapshot_date >= DATEADD(DAY, -30, CAST(GETUTCDATE() AS DATE))
GROUP BY product_id, location_id
) inv_avg ON p.product_id = inv_avg.product_id AND l.location_id = inv_avg.location_id
LEFT JOIN (
SELECT product_id, location_id, quantity_on_hand
FROM fact_inventory_snapshot
WHERE snapshot_date = CAST(GETUTCDATE() AS DATE)
) inv_current ON p.product_id = inv_current.product_id AND l.location_id = inv_current.location_id
WHERE p.is_active = 1
GROUP BY p.product_id, l.location_id, inv_avg.avg_inv_value, inv_current.quantity_on_hand;
-- Second pass: Assign performance tiers using percentile-based classification
-- Calculate median velocity and median margin across all active products
DECLARE @median_velocity DECIMAL(10,4);
DECLARE @median_margin DECIMAL(10,2);
-- PERCENTILE_CONT is a window function and cannot be assigned to a variable
-- directly; wrap it in a derived table first.
SELECT @median_velocity = median_v
FROM (
SELECT DISTINCT PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY avg_daily_velocity_30d) OVER () AS median_v
FROM kpi_product_performance
WHERE calculation_date = @today AND units_sold_30d > 0
) mv;
SELECT @median_margin = median_m
FROM (
SELECT DISTINCT PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY gross_margin_pct_30d) OVER () AS median_m
FROM kpi_product_performance
WHERE calculation_date = @today AND units_sold_30d > 0
) mm;
-- Assign tiers
UPDATE kpi_product_performance
SET performance_tier =
CASE
WHEN units_sold_90d = 0 AND current_on_hand > 0 THEN 'DEAD_STOCK'
WHEN avg_daily_velocity_30d >= COALESCE(@median_velocity, 0)
AND gross_margin_pct_30d >= COALESCE(@median_margin, 0) THEN 'STAR'
WHEN avg_daily_velocity_30d >= COALESCE(@median_velocity, 0)
AND gross_margin_pct_30d < COALESCE(@median_margin, 0) THEN 'OPPORTUNITY'
WHEN avg_daily_velocity_30d < COALESCE(@median_velocity, 0)
AND gross_margin_pct_30d >= COALESCE(@median_margin, 0) THEN 'SLEEPER'
ELSE 'UNDERPERFORMER'
END
WHERE calculation_date = @today;
-- Log summary
SELECT performance_tier, COUNT(*) AS product_count
FROM kpi_product_performance
WHERE calculation_date = @today
GROUP BY performance_tier;
PRINT 'KPI calculation complete for ' + CAST(@today AS VARCHAR);
END;
Seasonal Velocity Adjuster
Type: skill
A supplementary calculation that adjusts sales velocity baselines for seasonal patterns, preventing false positives in the winner/underperformer classification. For example, a winter coat with zero summer sales should not be flagged as dead stock in July if it historically sells well in Q4. This component computes a seasonality index per product based on historical monthly sales distribution.
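The index itself is just each month's average sales divided by the product's overall monthly average. A minimal sketch, assuming a hypothetical `seasonality_index` helper keyed the same way as the SQL CTEs below:

```python
from collections import defaultdict
from statistics import mean

def seasonality_index(monthly_units):
    """monthly_units: {(year, month): units_sold}. Returns {month: index},
    where 1.0 is an average month, >1 a peak, <1 a trough."""
    by_month = defaultdict(list)
    for (_, month), units in monthly_units.items():
        by_month[month].append(units)
    # Average each calendar month across years, then normalize by the
    # product's overall monthly average.
    avg_by_month = {m: mean(v) for m, v in by_month.items()}
    overall = mean(avg_by_month.values())
    if overall == 0:
        return {m: 1.0 for m in avg_by_month}
    return {m: a / overall for m, a in avg_by_month.items()}
```

For the winter-coat example, two Decembers of sales against two zero-sale Julys yields an index near 2.0 in December and 0.0 in July, so the July row is reclassified SEASONAL_LOW instead of DEAD_STOCK.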
Implementation:
CREATE OR ALTER PROCEDURE sp_calculate_seasonality_index
AS
BEGIN
SET NOCOUNT ON;
-- Create or refresh the seasonality index table
IF OBJECT_ID('product_seasonality_index', 'U') IS NULL
BEGIN
CREATE TABLE product_seasonality_index (
product_id NVARCHAR(100) NOT NULL,
month_num INT NOT NULL,
seasonality_index DECIMAL(6,4) NOT NULL, -- 1.0 = average month, >1 = peak, <1 = trough
avg_monthly_units DECIMAL(10,2),
data_months INT, -- number of months of data used
calculated_at DATETIME2 DEFAULT GETUTCDATE(),
PRIMARY KEY (product_id, month_num)
);
END
-- Truncate and recalculate
TRUNCATE TABLE product_seasonality_index;
-- Calculate seasonality index: ratio of each month's avg sales to the overall monthly average
;WITH monthly_sales AS (
SELECT
product_id,
MONTH(transaction_date) AS month_num,
YEAR(transaction_date) AS year_num,
SUM(quantity_sold) AS monthly_units
FROM fact_sales
WHERE transaction_date >= DATEADD(YEAR, -2, GETUTCDATE()) -- Use last 2 years
GROUP BY product_id, MONTH(transaction_date), YEAR(transaction_date)
),
avg_by_month AS (
SELECT
product_id,
month_num,
AVG(CAST(monthly_units AS DECIMAL(10,2))) AS avg_monthly_units,
COUNT(*) AS data_months
FROM monthly_sales
GROUP BY product_id, month_num
),
overall_avg AS (
SELECT
product_id,
AVG(avg_monthly_units) AS overall_avg_monthly
FROM avg_by_month
GROUP BY product_id
)
INSERT INTO product_seasonality_index (product_id, month_num, seasonality_index, avg_monthly_units, data_months)
SELECT
m.product_id,
m.month_num,
CASE
WHEN o.overall_avg_monthly > 0 THEN m.avg_monthly_units / o.overall_avg_monthly
ELSE 1.0
END AS seasonality_index,
m.avg_monthly_units,
m.data_months
FROM avg_by_month m
JOIN overall_avg o ON m.product_id = o.product_id;
-- Apply seasonal adjustment to KPI performance tiers
-- Products in their trough month (index < 0.3) with low velocity
-- should not be classified as UNDERPERFORMER or DEAD_STOCK
UPDATE k
SET k.performance_tier =
CASE
WHEN k.performance_tier IN ('UNDERPERFORMER', 'DEAD_STOCK')
AND s.seasonality_index < 0.3
AND s.data_months >= 2 -- Need at least 2 data points
THEN 'SEASONAL_LOW' -- Reclassify: expected seasonal dip, not actually underperforming
ELSE k.performance_tier
END
FROM kpi_product_performance k
LEFT JOIN product_seasonality_index s
ON k.product_id = s.product_id
AND MONTH(k.calculation_date) = s.month_num
WHERE k.calculation_date = CAST(GETUTCDATE() AS DATE);
-- PRINT cannot contain a subquery; capture the count in a variable first.
DECLARE @seasonal_low_count INT;
SELECT @seasonal_low_count = COUNT(*)
FROM kpi_product_performance
WHERE calculation_date = CAST(GETUTCDATE() AS DATE)
AND performance_tier = 'SEASONAL_LOW';
PRINT 'Seasonality index calculated. Products reclassified as SEASONAL_LOW: ' +
CAST(@seasonal_low_count AS VARCHAR);
END;
Integration: Call EXEC sp_calculate_seasonality_index as the final step in sp_run_daily_etl, after sp_calculate_product_kpis. The seasonality index is recalculated weekly (add conditional logic or a separate schedule). It requires a minimum of 12 months of historical data to be meaningful; skip it for new clients in the first year.
Winner-Underperformer Classification Prompt
Type: prompt
A natural language prompt template for use with Power BI Copilot or a GPT-based assistant that generates plain-English explanations of product performance for non-technical retail managers. This prompt takes KPI data and produces actionable buyer recommendations.
Implementation
System Prompt
You are a retail merchandising analyst assistant. Given product performance KPI data, generate clear, actionable insights for a retail buyer or store manager. Use plain business language. Always include specific action recommendations. Format output with headers and bullet points.
User Prompt Template
# paste into Power BI Copilot or GPT-based assistant
Analyze the following product performance data for {store_name} as of {date}:
### STAR Products (High Velocity + High Margin):
{star_products_table}
### UNDERPERFORMER Products (Low Velocity + Low Margin):
{underperformer_products_table}
### DEAD STOCK Products (Zero Sales in 90 Days with Inventory On Hand):
{dead_stock_table}
### MARGIN EROSION ALERTS (Margin dropped >5 points vs prior period):
{margin_erosion_table}
For each category, provide:
1. **Summary**: What's happening in 1-2 sentences
2. **Top 3 Action Items**: Specific, prioritized recommendations
3. **Financial Impact**: Estimated dollar impact of recommended actions
4. **Timeline**: When to act (immediate, this week, this month)
For STAR products, recommend: reorder quantities, whether to expand facings/placement, cross-sell opportunities.
For UNDERPERFORMERS, recommend: markdown percentages, bundle opportunities, vendor return eligibility, discontinuation.
For DEAD STOCK, recommend: clearance pricing (suggest specific discount %), donation write-off, or return-to-vendor.
For MARGIN EROSION, recommend: price adjustment, cost renegotiation with vendor, or substitution with higher-margin alternative.
Example Output Structure
### ⭐ Stars: Keep Winning
**Summary**: Your top performers are generating strong velocity and healthy margins...
**Actions**:
1. Increase reorder quantity for [Product X] by 25% — current 14-day supply is below your 30-day target
2. ...
### 🔴 Underperformers: Act Now
**Summary**: 12 items are consuming $4,200 in inventory capital while generating below-average returns...
**Actions**:
1. Mark down [Product Y] by 30% — at current velocity, you have 180 days of supply
2. ...
Usage & Integration Options
For the Azure OpenAI integration, use a current GPT model deployment (e.g., GPT-4o) with temperature=0.3 for consistent, factual output.
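Filling the template is plain string substitution; the only real work is rendering the KPI rows as compact tables. A sketch with hypothetical names (`render_prompt`, the column set shown), not tied to any particular API client:

```python
# Trimmed to one section for brevity; the real template carries all four.
USER_PROMPT_TEMPLATE = """Analyze the following product performance data for {store_name} as of {date}:

### STAR Products (High Velocity + High Margin):
{star_products_table}
"""

def render_prompt(store_name, as_of, star_rows):
    """Format KPI rows into the {star_products_table} placeholder."""
    table = "\n".join(
        f"| {r['product_name']} | {r['avg_daily_velocity_30d']:.1f}/day | {r['gross_margin_pct_30d']:.1f}% |"
        for r in star_rows
    )
    return USER_PROMPT_TEMPLATE.format(store_name=store_name, date=as_of,
                                       star_products_table=table)
```

Keep the tables short (top 10-15 rows per tier); dumping every SKU into the prompt dilutes the model's recommendations and inflates token cost.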
Inventory Reorder Recommendation Engine
Type: agent
An automated recommendation engine that combines velocity data, seasonality indexes, lead times, and safety stock calculations to generate suggested reorder quantities for winning products. Outputs are surfaced as a Power BI report page and optionally emailed as a weekly reorder suggestion report.
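The two formulas at the heart of the engine are the reorder point (seasonally adjusted velocity times lead time plus safety stock) and the top-up quantity to a 30-day supply. A sketch with a hypothetical `reorder_suggestion` helper, using the same default parameters as the procedure below:

```python
import math

def reorder_suggestion(velocity, on_hand, seasonality=1.0,
                       lead_time_days=14, safety_stock_days=7,
                       supply_target_days=30):
    """Return (reorder_point_units, suggested_order_qty)."""
    adj_velocity = velocity * seasonality
    # Units consumed over lead time plus the safety buffer
    reorder_point = math.ceil(adj_velocity * (lead_time_days + safety_stock_days))
    # Only reorder when stock drops below a 21-day adjusted supply,
    # then top up to the 30-day target.
    if on_hand < math.ceil(adj_velocity * 21):
        qty = math.ceil(adj_velocity * supply_target_days) - on_hand
    else:
        qty = 0
    return reorder_point, max(qty, 0)
```

For example, a product selling 2 units/day with 10 on hand has a reorder point of 42 units and a suggested order of 50; the same product with 100 on hand gets no order.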
Implementation
CREATE OR ALTER PROCEDURE sp_generate_reorder_recommendations
AS
BEGIN
SET NOCOUNT ON;
-- Create output table if not exists
IF OBJECT_ID('reorder_recommendations', 'U') IS NULL
BEGIN
CREATE TABLE reorder_recommendations (
recommendation_id BIGINT IDENTITY(1,1) PRIMARY KEY,
recommendation_date DATE NOT NULL,
product_id NVARCHAR(100) NOT NULL,
location_id NVARCHAR(100) NOT NULL,
product_name NVARCHAR(255),
current_on_hand INT,
avg_daily_velocity DECIMAL(10,4),
lead_time_days INT DEFAULT 14, -- Configurable per vendor
safety_stock_days INT DEFAULT 7, -- Buffer
reorder_point INT,
suggested_order_qty INT,
estimated_cost DECIMAL(12,2),
days_until_stockout DECIMAL(10,1),
priority NVARCHAR(20), -- URGENT, HIGH, NORMAL, LOW
performance_tier NVARCHAR(20),
seasonality_index DECIMAL(6,4),
calculated_at DATETIME2 DEFAULT GETUTCDATE()
);
END
DECLARE @today DATE = CAST(GETUTCDATE() AS DATE);
-- Clear today's recommendations
DELETE FROM reorder_recommendations WHERE recommendation_date = @today;
INSERT INTO reorder_recommendations (
recommendation_date, product_id, location_id, product_name,
current_on_hand, avg_daily_velocity, lead_time_days, safety_stock_days,
reorder_point, suggested_order_qty, estimated_cost,
days_until_stockout, priority, performance_tier, seasonality_index
)
SELECT
@today,
k.product_id,
k.location_id,
p.product_name,
k.current_on_hand,
k.avg_daily_velocity_30d,
14 AS lead_time_days, -- Default; TODO: configure per vendor via vendor_lead_times table
7 AS safety_stock_days,
-- Reorder Point = (daily velocity × lead time) + safety stock
-- Adjusted for seasonality
CEILING(
k.avg_daily_velocity_30d
* COALESCE(si.seasonality_index, 1.0) -- seasonal adjustment
* (14 + 7) -- lead time + safety stock days
) AS reorder_point,
-- Suggested Order Qty = 30-day supply (adjusted for season) minus current stock + reorder point gap
CASE
WHEN k.current_on_hand < CEILING(k.avg_daily_velocity_30d * COALESCE(si.seasonality_index, 1.0) * 21)
THEN CEILING(
k.avg_daily_velocity_30d
* COALESCE(si.seasonality_index, 1.0)
* 30 -- 30-day supply target
) - k.current_on_hand
ELSE 0
END AS suggested_order_qty,
-- Estimated cost
CASE
WHEN k.current_on_hand < CEILING(k.avg_daily_velocity_30d * COALESCE(si.seasonality_index, 1.0) * 21)
THEN (CEILING(k.avg_daily_velocity_30d * COALESCE(si.seasonality_index, 1.0) * 30) - k.current_on_hand) * p.unit_cost
ELSE 0
END AS estimated_cost,
-- Days until stockout
CASE
WHEN k.avg_daily_velocity_30d > 0
THEN k.current_on_hand / (k.avg_daily_velocity_30d * COALESCE(si.seasonality_index, 1.0))
ELSE 999
END AS days_until_stockout,
-- Priority
CASE
WHEN k.current_on_hand <= 0 AND k.avg_daily_velocity_30d > 0 THEN 'URGENT'
WHEN k.current_on_hand / NULLIF(k.avg_daily_velocity_30d * COALESCE(si.seasonality_index, 1.0), 0) < 14
AND k.performance_tier IN ('STAR', 'OPPORTUNITY') THEN 'HIGH'
WHEN k.current_on_hand / NULLIF(k.avg_daily_velocity_30d * COALESCE(si.seasonality_index, 1.0), 0) < 21 THEN 'NORMAL'
ELSE 'LOW'
END AS priority,
k.performance_tier,
COALESCE(si.seasonality_index, 1.0)
FROM kpi_product_performance k
JOIN dim_product p ON k.product_id = p.product_id
LEFT JOIN product_seasonality_index si
ON k.product_id = si.product_id
AND MONTH(@today) = si.month_num
WHERE k.calculation_date = @today
AND k.avg_daily_velocity_30d > 0 -- Only recommend for items that actually sell
AND p.is_active = 1
AND k.performance_tier IN ('STAR', 'OPPORTUNITY', 'SLEEPER', 'SEASONAL_LOW');
-- Remove recommendations where suggested qty = 0
DELETE FROM reorder_recommendations
WHERE recommendation_date = @today AND suggested_order_qty <= 0;
-- Summary output
SELECT priority, COUNT(*) AS item_count, SUM(estimated_cost) AS total_cost
FROM reorder_recommendations
WHERE recommendation_date = @today
GROUP BY priority
ORDER BY
CASE priority WHEN 'URGENT' THEN 1 WHEN 'HIGH' THEN 2 WHEN 'NORMAL' THEN 3 ELSE 4 END;
-- PRINT cannot contain a subquery; capture the count in a variable first.
DECLARE @rec_count INT;
SELECT @rec_count = COUNT(*) FROM reorder_recommendations WHERE recommendation_date = @today;
PRINT 'Reorder recommendations generated: ' + CAST(@rec_count AS VARCHAR) + ' items';
END;
Integration: Add EXEC sp_generate_reorder_recommendations to the end of sp_run_daily_etl.
POS Data Connector Adapter
Type: integration
A modular SQL-based adapter layer that normalizes data from different POS systems (Square, Shopify, Lightspeed) into the common star schema. This allows the same dashboard and KPI calculations to work regardless of which POS the client uses, making the solution portable across MSP clients.
Implementation:
-- ==============================================
-- ADAPTER: Square → Common Schema
-- ==============================================
CREATE OR ALTER VIEW v_adapter_square_sales AS
SELECT
oli.order_id + '_' + CAST(ROW_NUMBER() OVER (PARTITION BY oli.order_id ORDER BY oli.uid) AS VARCHAR) AS transaction_line_id,
o.id AS transaction_id,
CAST(o.created_at AS DATETIME2) AS transaction_date,
oli.catalog_object_id AS source_product_id,
o.location_id AS source_location_id,
COALESCE(CAST(oli.quantity AS INT), 1) AS quantity_sold,
CAST(oli.base_price_money_amount AS DECIMAL(12,2)) / 100.0 AS unit_price,
CAST(oli.total_money_amount AS DECIMAL(12,2)) / 100.0 AS line_total,
COALESCE(CAST(oli.total_discount_money_amount AS DECIMAL(12,2)) / 100.0, 0) AS discount_amount,
'square' AS source_system
FROM square_raw.order_line_items oli
JOIN square_raw.orders o ON oli.order_id = o.id
WHERE o.state = 'COMPLETED';
CREATE OR ALTER VIEW v_adapter_square_products AS
SELECT
co.id AS source_product_id,
COALESCE(co.item_data_name, 'UNKNOWN-' + co.id) AS sku,
COALESCE(co.item_data_name, 'Unnamed Product') AS product_name,
co.item_data_category_id AS category_id,
NULL AS subcategory,
NULL AS brand,
CAST(co.item_data_variations_0_item_variation_data_price_money_amount AS DECIMAL(12,2)) / 100.0 AS retail_price,
CASE WHEN co.is_deleted = 'true' THEN 0 ELSE 1 END AS is_active,
'square' AS source_system
FROM square_raw.catalog_objects co
WHERE co.type = 'ITEM';
CREATE OR ALTER VIEW v_adapter_square_inventory AS
SELECT
ic.catalog_object_id AS source_product_id,
ic.location_id AS source_location_id,
CAST(ic.quantity AS INT) AS quantity_on_hand,
ic.state,
'square' AS source_system
FROM square_raw.inventory_counts ic
WHERE ic.state = 'IN_STOCK';
-- ==============================================
-- ADAPTER: Shopify → Common Schema
-- ==============================================
CREATE OR ALTER VIEW v_adapter_shopify_sales AS
SELECT
CAST(oli.id AS VARCHAR) AS transaction_line_id,
CAST(o.id AS VARCHAR) AS transaction_id,
CAST(o.created_at AS DATETIME2) AS transaction_date,
CAST(oli.product_id AS VARCHAR) AS source_product_id,
CAST(o.location_id AS VARCHAR) AS source_location_id,
oli.quantity AS quantity_sold,
CAST(oli.price AS DECIMAL(12,2)) AS unit_price,
CAST(oli.price AS DECIMAL(12,2)) * oli.quantity AS line_total,
COALESCE(CAST(oli.total_discount AS DECIMAL(12,2)), 0) AS discount_amount,
'shopify' AS source_system
FROM shopify_raw.order_line_items oli
JOIN shopify_raw.orders o ON oli.order_id = o.id
WHERE o.financial_status IN ('paid', 'partially_refunded');
CREATE OR ALTER VIEW v_adapter_shopify_products AS
SELECT
CAST(p.id AS VARCHAR) AS source_product_id,
COALESCE(v.sku, CAST(p.id AS VARCHAR)) AS sku,
p.title AS product_name,
p.product_type AS category_id,
NULL AS subcategory,
p.vendor AS brand,
CAST(v.price AS DECIMAL(12,2)) AS retail_price,
CASE WHEN p.status = 'active' THEN 1 ELSE 0 END AS is_active,
'shopify' AS source_system
FROM shopify_raw.products p
LEFT JOIN shopify_raw.variants v ON p.id = v.product_id;
CREATE OR ALTER VIEW v_adapter_shopify_inventory AS
SELECT
CAST(il.inventory_item_id AS VARCHAR) AS source_product_id,
CAST(il.location_id AS VARCHAR) AS source_location_id,
il.available AS quantity_on_hand,
'IN_STOCK' AS state,
'shopify' AS source_system
FROM shopify_raw.inventory_levels il
WHERE il.available > 0;
-- ==============================================
-- UNIFIED ADAPTER VIEWS (source-agnostic)
-- ==============================================
CREATE OR ALTER VIEW v_unified_sales AS
SELECT * FROM v_adapter_square_sales
UNION ALL
SELECT * FROM v_adapter_shopify_sales;
-- Add: UNION ALL SELECT * FROM v_adapter_lightspeed_sales when needed
CREATE OR ALTER VIEW v_unified_products AS
SELECT * FROM v_adapter_square_products
UNION ALL
SELECT * FROM v_adapter_shopify_products;
CREATE OR ALTER VIEW v_unified_inventory AS
SELECT * FROM v_adapter_square_inventory
UNION ALL
SELECT * FROM v_adapter_shopify_inventory;
-- Then modify sp_load_fact_sales and other ETL procedures to read from
-- v_unified_sales / v_unified_products / v_unified_inventory instead of
-- directly from square_raw.* tables. This makes the solution POS-agnostic.
When onboarding a client on a new POS system, the MSP only needs to build that system's three adapter views (sales, products, inventory) and add them to the unified views. All downstream KPI calculations, dashboards, and alerts work without modification. This is the key to making this a repeatable, scalable MSP offering.
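The gotcha the adapter layer absorbs is unit conventions: Square reports money in integer cents, Shopify in decimal dollars. A sketch of the same normalization in Python, with hypothetical, simplified field names drawn from the adapter views above:

```python
def normalize_sale(record, source):
    """Normalize one raw POS line item into the common schema fields."""
    if source == "square":
        return {
            "quantity_sold": int(record.get("quantity", 1)),
            # Square money fields are integer cents
            "unit_price": record["base_price_money_amount"] / 100.0,
            "line_total": record["total_money_amount"] / 100.0,
            "source_system": "square",
        }
    if source == "shopify":
        qty = record["quantity"]
        price = float(record["price"])  # Shopify prices are decimal strings
        return {
            "quantity_sold": qty,
            "unit_price": price,
            "line_total": price * qty,
            "source_system": "shopify",
        }
    raise ValueError(f"no adapter for POS source: {source}")
```

Failing loudly on an unknown source is deliberate; silently passing raw values through is how a cents-vs-dollars mistake inflates revenue 100x.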
Testing & Validation
- DATA FRESHNESS TEST: Query fact_sales for MAX(transaction_date) and verify it is within 24 hours of the current date. Query fact_inventory_snapshot for MAX(snapshot_date) and verify it equals today's date. If either is stale, the Fivetran sync or ETL procedure has failed.
- REVENUE RECONCILIATION TEST: Compare total monthly revenue from the Power BI dashboard against the POS system's native monthly sales report for the same period. Discrepancy must be less than 2%. Test at least 3 different months including the most recent complete month.
- MARGIN ACCURACY TEST: Select 10 products at random across different categories. Manually calculate gross margin using the product's known cost (from invoice or accounting system) and average selling price. Compare to the dashboard's reported margin. Discrepancy must be less than 1 percentage point.
- INVENTORY TURNS VALIDATION: Calculate inventory turns manually for the client's top 3 categories using the formula (Annualized COGS ÷ Average Inventory Value). Compare to the dashboard's reported turns. Validate the turns number makes sense for the retail vertical (general retail typically 4–8x, grocery 12–15x, apparel 2–4x).
- PERFORMANCE TIER CLASSIFICATION TEST: Review the top 10 STAR products — confirm with the client that these are indeed their best sellers with healthy margins. Review the top 10 UNDERPERFORMER products — confirm these are recognized as slow movers. If the client disagrees with classifications, check for COGS data errors.
- ALERT DELIVERY TEST: Manually insert a test record into kpi_product_performance with days_of_supply = 999 and performance_tier = 'DEAD_STOCK'. Trigger the Power Automate underperformer alert flow and verify the email arrives at the designated recipient within 5 minutes. Delete the test record after validation.
- ROW-LEVEL SECURITY TEST: Log in to Power BI Service as a location manager user (not the admin). Verify that only their location's data is visible. Attempt to access other locations' data through URL manipulation or filter changes — confirm it is blocked. Then log in as the owner/admin and verify all locations are visible.
- ETL FAILURE RECOVERY TEST: Temporarily change the Azure SQL password used by the Automation runbook to simulate a failure. Verify that (1) the runbook fails gracefully with a logged error, (2) Azure Monitor sends a failure alert to the MSP notification email within 15 minutes, and (3) after correcting the password, the next scheduled run completes successfully and fills any data gaps.
- DASHBOARD LOAD PERFORMANCE TEST: Open the Power BI dashboard and measure page load time for each report page. Executive Summary should load in under 5 seconds. The Winners & Underperformers scatter plot should load in under 8 seconds. If slower, consider switching from DirectQuery to Import mode or adding aggregation tables.
- SEASONAL ADJUSTMENT TEST: If the client has 12+ months of data, verify that seasonal products (e.g., holiday items, summer goods) are classified as SEASONAL_LOW during their off-season months rather than UNDERPERFORMER or DEAD_STOCK. Check the product_seasonality_index table for reasonable index values (peak months should show index > 1.5, trough months < 0.5).
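The data freshness test at the top of this checklist is worth scripting so the MSP's monitoring can run it unattended. A sketch, assuming a hypothetical `freshness_issues` helper fed the MAX dates from the two fact tables:

```python
from datetime import date, datetime, timedelta, timezone

def freshness_issues(latest_sale, latest_snapshot, now=None):
    """latest_sale: MAX(transaction_date) from fact_sales (UTC datetime).
    latest_snapshot: MAX(snapshot_date) from fact_inventory_snapshot (date).
    Returns a list of stale-table names; empty means the pipeline is healthy."""
    now = now or datetime.now(timezone.utc)
    issues = []
    if now - latest_sale > timedelta(hours=24):
        issues.append("fact_sales stale")
    if latest_snapshot != now.date():
        issues.append("fact_inventory_snapshot stale")
    return issues
```

Injecting `now` keeps the check testable; in production, wire a non-empty result into the same alert channel as ETL failures.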
Client Handoff
The client handoff meeting should be a 2-hour session with the store owner/GM, head buyer, and any location managers who will use the dashboards. Cover the following topics:
1. Dashboard Walkthrough (45 minutes)
- Navigate each Power BI report page: Executive Summary, Winners & Underperformers, Inventory Health, Category Deep Dive
- Demonstrate how to use slicers (date range, category, location)
- Show drill-through navigation from summary to product detail
- Explain each KPI: what it means, why it matters, and what 'good' looks like for their business
- Show the scatter plot quadrant chart and explain STAR/OPPORTUNITY/SLEEPER/UNDERPERFORMER classifications
2. Alert System Overview (15 minutes)
- Show sample alert emails and explain what triggers each type
- Demonstrate how to adjust alert thresholds (MSP can modify on request)
- Explain the weekly reorder recommendation report
3. Action Playbook (30 minutes)
For each performance tier, review the recommended actions:
- STAR: Increase reorder quantities, expand shelf space, consider bundling
- OPPORTUNITY: Investigate margin — negotiate with vendor, adjust pricing, or reduce discounting
- SLEEPER: Test marketing/promotion to increase velocity; consider if the margin justifies the shelf space
- UNDERPERFORMER: Mark down 20–30%, bundle with winners, investigate return-to-vendor options
- DEAD STOCK: Clearance at 50–70% off, donate for tax write-off, discontinue
4. Data Responsibilities (15 minutes)
- Explain that COGS accuracy is the client's responsibility — new products must have cost entered in the POS/accounting system
- New locations or major product line changes should be communicated to the MSP
- Dashboard data refreshes daily at 6:00 AM; yesterday's data is always available by store opening
5. Support Model (15 minutes)
- How to contact the MSP for dashboard questions or issues
- Monthly check-in schedule (first 3 months, then quarterly)
- Scope of managed service: what's included (monitoring, refresh maintenance, minor dashboard tweaks) vs. what's billable (new report pages, additional data source integrations)
Documentation to Leave Behind
- KPI Data Dictionary (one-page reference defining each metric and its formula)
- Power BI Quick Start Guide (screenshots showing how to navigate, filter, and export)
- Alert Reference Card (list of all automated alerts, their triggers, and recommended responses)
- Escalation Contact Sheet (MSP support email, phone, response time SLAs)
- Data Quality Checklist (monthly checklist for the client to verify COGS accuracy and product catalog completeness)
Success Criteria to Review Together
Maintenance
Ongoing MSP Responsibilities:
Daily (Automated Monitoring - 15 min/day)
- Verify Azure Automation runbook completed successfully (check Azure Monitor dashboard)
- Verify Fivetran sync status — all connectors show 'Succeeded' with no schema drift warnings
- Monitor Power BI dataset refresh status in the Power BI Admin portal
- Respond to any ETL failure alerts within 4-hour SLA
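The daily refresh check can be reduced to a small decision function over the refresh history. The payload shape below mirrors the Power BI REST API "Get Refresh History" response (a 'value' list, newest first, with a 'status' field), but the sample data is fabricated for illustration and a real check would fetch the history over the API with a service principal token:

```python
# Decide whether the latest Power BI dataset refresh needs attention.
# Statuses other than "Completed" (e.g., "Failed", "Unknown" while running,
# "Disabled") are treated as alert-worthy for the daily check.

def refresh_needs_alert(history: dict) -> bool:
    entries = history.get("value", [])
    if not entries:
        return True  # no refresh recorded at all: investigate
    latest = entries[0]  # API returns newest first
    return latest.get("status") != "Completed"

sample = {"value": [
    {"status": "Completed", "endTime": "2024-05-02T06:07:00Z"},
    {"status": "Failed",    "endTime": "2024-05-01T06:12:00Z"},
]}
print(refresh_needs_alert(sample))  # False: latest refresh succeeded
```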
Weekly (30 minutes)
- Review Power BI usage analytics: who's accessing dashboards, which pages are most viewed
- Check data volume growth to ensure Azure SQL tier and Fivetran MAR are within plan limits
- Review any Power Automate flow failures and resolve
- Spot-check 2-3 KPI values against POS native reports for ongoing accuracy
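For the weekly spot-check, small discrepancies between the BI layer and the POS native report are normal (rounding, timezone cutoffs, in-flight transactions), so the comparison should use a tolerance rather than exact equality. The 1% default below is an illustrative starting point, not a prescribed value:

```python
# Compare a KPI from the dashboard against the POS native report and flag
# differences beyond a small relative tolerance.

def kpi_matches(bi_value: float, pos_value: float,
                tolerance_pct: float = 1.0) -> bool:
    if pos_value == 0:
        return bi_value == 0
    return abs(bi_value - pos_value) / abs(pos_value) * 100 <= tolerance_pct

print(kpi_matches(10482.50, 10478.00))  # True: difference is ~0.04%
```

Persistent drift beyond the tolerance usually points to a transformation bug or schema change and should be escalated to Tier 2.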
Monthly (2-3 hours)
- Run the seasonality index recalculation (sp_calculate_seasonality_index)
- Review and adjust KPI thresholds with client if needed (e.g., change margin target, velocity benchmarks)
- Check for POS system updates that may have changed API schemas — test Fivetran connectors
- Review Azure costs and optimize (scale down SQL tier if utilization is low, archive old data)
- Generate and send Monthly Analytics Summary to client (automated Power BI paginated report or manual)
- Update user access: add/remove dashboard users as client staff changes
Quarterly (4-6 hours - Quarterly Business Review)
- Present trend analysis: how KPIs have changed over the quarter
- Identify top wins: products promoted from Underperformer to Star based on client actions
- Calculate ROI: estimated revenue recovered or costs avoided from analytics-driven decisions
- Review dead stock clearance results and inventory reduction
- Discuss potential enhancements: additional data sources, new report pages, predictive forecasting
- Renew/adjust managed service agreement terms if needed
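The ROI figure presented in the QBR can be as simple as benefits (markdown revenue recovered plus carrying cost avoided on cleared dead stock) against the quarterly service fee. All figures and the 25% annual carrying cost rate below are illustrative placeholders to replace with the client's actual numbers:

```python
# Quarterly ROI sketch: return per dollar of service fee, expressed as a
# ratio (1.0 = benefits were double the fee).

def quarterly_roi(markdown_revenue: float, dead_stock_cost_cleared: float,
                  carrying_cost_rate: float, quarterly_service_fee: float) -> float:
    # One quarter's share of the annual carrying cost on cleared inventory.
    carrying_cost_avoided = dead_stock_cost_cleared * carrying_cost_rate / 4
    benefit = markdown_revenue + carrying_cost_avoided
    return (benefit - quarterly_service_fee) / quarterly_service_fee

roi = quarterly_roi(markdown_revenue=6000.0,
                    dead_stock_cost_cleared=20000.0,
                    carrying_cost_rate=0.25,   # ~25% annual carrying cost
                    quarterly_service_fee=3000.0)
print(f"{roi:.0%}")  # net return relative to the service fee
```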
Annually
- Full data pipeline audit: verify all connectors, transformations, and KPIs are still accurate
- Power BI license true-up (add/remove seats)
- Azure infrastructure review: right-size SQL tier, review storage retention, update security configurations
- Reassess PCI DSS compliance posture — confirm no cardholder data has leaked into analytics
- Review and update the Data Processing Agreement
Update/Patch Schedule
- Power BI Desktop: Update monthly (Microsoft releases monthly updates)
- Fivetran connectors: Auto-updated by Fivetran; monitor release notes for breaking changes
- Azure SQL: Automatic patching enabled; review major version upgrades semi-annually
- Stored procedures: Version-controlled in Git; update only with tested changes through staging environment
- Power Automate flows: Review quarterly; update when POS APIs change
Escalation Path
- Tier 1 (MSP Help Desk): Dashboard access issues, password resets, basic questions → 4-hour response
- Tier 2 (MSP Data Engineer): Data quality issues, ETL failures, connector problems → 8-hour response
- Tier 3 (MSP BI Developer): Dashboard modifications, new KPI requests, performance optimization → 2 business day response
- Vendor Escalation: Fivetran support for connector issues, Microsoft support for Power BI/Azure issues, POS vendor support for API changes
SLA Targets
- Dashboard availability: 99.5% uptime (aligned with Azure SQL SLA)
- Data freshness: Previous day's data available by 7:00 AM local time, 95% of the time
- Alert delivery: Within 30 minutes of scheduled trigger time
- Critical issue response: 4 hours during business hours, next business day after hours
- Dashboard modification requests: Scoped and estimated within 2 business days, delivered within 5 business days
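The data freshness target (previous day's data by 7:00 AM, 95% of the time) is straightforward to measure from refresh completion times. A minimal sketch, assuming a log of local completion times per day:

```python
# SLA attainment: fraction of days the previous day's data was ready in the
# dashboard by 07:00 local time, measured against the 95% target.

from datetime import time

TARGET = 0.95
CUTOFF = time(7, 0)

def freshness_attainment(ready_times: list[time]) -> float:
    on_time = sum(1 for t in ready_times if t <= CUTOFF)
    return on_time / len(ready_times)

# 30 days: 29 refreshes on time, one late refresh at 09:15.
month = [time(6, 10)] * 29 + [time(9, 15)]
rate = freshness_attainment(month)
print(rate >= TARGET)  # True: 29/30 is about 96.7%
```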
Alternatives
POS-Native Analytics Only (No Custom BI)
Instead of building a custom Power BI dashboard, rely entirely on the analytics built into the client's existing POS system. Shopify with Stocky provides inventory analytics including velocity, reorder suggestions, and performance reports. Lightspeed Retail includes Advanced Reporting with margin analysis, inventory turns, and sales performance by category. Square Analytics provides sales trends, item-level reporting, and basic inventory insights.
Zoho Analytics Single-Platform Stack
Replace the Power BI + Azure SQL + Fivetran stack with Zoho Analytics as the single platform. Zoho Analytics includes built-in data connectors for popular POS systems and accounting tools, a visual report builder, automated data sync, and pre-built retail analytics templates for margin analysis and inventory performance. The entire Zoho suite (Analytics + CRM + Inventory) provides an integrated retail management platform.
Open-Source Self-Hosted Stack (Metabase + Airbyte + PostgreSQL)
Replace all commercial SaaS with open-source alternatives: Metabase (open-source BI dashboards), Airbyte (open-source ETL), and PostgreSQL (open-source database). Host everything on a single cloud VM (Azure VM B2ms at ~$60/month) or on the Dell OptiPlex Micro on-premises. The MSP manages the entire stack.
Not recommended for MSPs without dedicated Linux/Docker expertise. The MSP is fully responsible for uptime, patches, and disaster recovery with no vendor support available.
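If an MSP does take this path, the database and BI layers can be stood up on one host with a short Docker Compose file; Airbyte is typically deployed separately via its own installer (abctl) rather than as a single image. Image tags, port mappings, and credentials below are placeholders to replace before use:

```yaml
# Minimal single-host sketch: PostgreSQL warehouse + Metabase BI.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: change-me   # placeholder; use a secret in practice
    volumes:
      - pgdata:/var/lib/postgresql/data
  metabase:
    image: metabase/metabase:latest
    ports:
      - "3000:3000"                  # Metabase web UI
    depends_on:
      - postgres
volumes:
  pgdata:
```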
Microsoft Fabric End-to-End (Premium Path)
Use Microsoft Fabric as the unified analytics platform, replacing the separate Azure SQL + Fivetran + Power BI components. Fabric includes a Lakehouse for data storage, Data Factory for ETL, Power BI for visualization, and Copilot for AI-powered insights — all in a single consumption-based platform. This is Microsoft's vision for the future of analytics.