
Implementation Guide: Generate personalized financial plan narratives and investment proposal documents
Step-by-step implementation guide for deploying AI to generate personalized financial plan narratives and investment proposal documents for Financial Advisory clients.
Hardware Procurement
Fortinet FortiGate 40F Next-Generation Firewall
$350 MSP cost / $600 suggested resale (includes initial configuration)
Secure API traffic between the advisory firm's network and Azure OpenAI endpoints. Provides TLS 1.3 inspection, application-layer filtering to prevent shadow AI usage (blocking unauthorized ChatGPT/Claude consumer endpoints), and VPN capability for remote advisors. Essential for GLBA and NY DFS 23 NYCRR 500 compliance.
Synology DiskStation NAS
$600 MSP cost for unit + $600 for 2x 4TB Seagate IronWolf drives / $1,800 suggested resale (configured and deployed)
On-premises compliance archive for AI-generated documents, prompt logs, and audit trails. SEC Rule 204-2 requires advisors to maintain all written communications relating to advisory business. This NAS provides encrypted, RAID-1 mirrored local archival with automated backup to Azure Blob Storage. Also stores document templates and prompt libraries.
APC Smart-UPS 1500VA
$550 MSP cost / $850 suggested resale
Battery backup for the NAS and firewall to ensure compliance archival continuity during power events. Required for business continuity in regulated environments.
Software Procurement
Azure OpenAI Service (GPT-4.1)
$2.00/1M input tokens + $8.00/1M output tokens; estimated $15–$50/month for a typical 5-advisor firm generating 50–150 plans/month. Resale at cost + 25% markup.
Primary LLM engine for generating financial plan narratives and investment proposal text. GPT-4.1's 1M token context window accommodates full client financial profiles. Azure OpenAI provides SOC 2 Type II certification, data residency controls, and content filtering — critical for financial services compliance.
Microsoft 365 Business Premium (CSP)
$22/user/month resale (MSP cost ~$18–20/user/month via CSP)
Foundation platform providing Exchange Online, SharePoint, OneDrive, Teams, and Entra ID (Azure AD). Entra ID handles API access management and MFA. SharePoint serves as the document management layer. Required for Copilot add-on eligibility.
Microsoft 365 Copilot
$30/user/month (MSP cost ~$28/user/month via CSP)
Optional but high-value add-on enabling advisors to interact with generated plan drafts in Word (refine sections, adjust tone) and create client-facing PowerPoint proposal decks from narrative content. Also enables Teams meeting summarization for client meeting notes that feed the AI pipeline.
Azure Key Vault
~$5–$15/month for standard operations volume
Secure storage and automatic rotation of API keys for Azure OpenAI, CRM APIs, and planning software integrations. Prevents hardcoded credentials and satisfies security audit requirements.
Azure Blob Storage (Archive Tier)
~$0.00099/GB/month for archive tier; estimated $5–$20/month for typical document volumes
Cloud-based compliance archive for all AI-generated documents and prompt/response logs. Provides geo-redundant, immutable (WORM) storage satisfying SEC Rule 17a-4 and FINRA Rule 4511 electronic records retention requirements. Syncs with on-premises Synology NAS for 3-2-1 backup strategy.
Holistiplan
$749/year (30 households) to $1,949/year (100 households); resale at list price + implementation fee
Specialized tax planning AI that OCRs uploaded tax returns and generates tax planning narratives. Integrates with the broader AI pipeline to enrich financial plan narratives with tax-specific insights. Already widely adopted in the advisory space, reducing advisor resistance.
FP Alpha
$1,795/year (all-in-one plan)
AI-powered financial planning analysis covering tax, estate, and insurance planning. Automatically surfaces recommendations that feed into the GPT-4.1 narrative engine for comprehensive financial plan documents. Advisor-trained AI models reduce hallucination risk for financial recommendations.
Redtail CRM (or existing CRM — Wealthbox/Salesforce FSC)
$39/user/month (Launch plan); most firms already have a CRM — no new procurement needed
Client data source providing demographic information, household structure, financial goals, risk profiles, and communication history. The AI pipeline pulls client context via API to personalize generated documents. Redtail offers 100+ API endpoints for deep integration.
Python Runtime Environment (Azure App Service or Azure Functions)
$30–$75/month for Azure App Service B2 plan or ~$15–$40/month for Azure Functions consumption plan
Hosts the custom middleware application that orchestrates the data pipeline: pulling client data from CRM/planning APIs, constructing prompts from templates, calling Azure OpenAI, formatting output documents, and logging all activity for compliance.
Prerequisites
- Active Microsoft 365 Business Premium or Enterprise subscription with Entra ID (Azure AD) configured and MFA enforced for all users
- Azure subscription with billing configured and a resource group designated for AI services (subscription Owner or Contributor role required for provisioning)
- Existing CRM system (Redtail, Wealthbox, or Salesforce Financial Services Cloud) with API access enabled and an API key or OAuth credentials provisioned for integration
- Existing financial planning software (eMoney Advisor, MoneyGuidePro, or RightCapital) with data export capability (API preferred; CSV export acceptable as fallback)
- At least 10 completed sample financial plans and 10 investment proposals in the firm's current format to use as training examples for prompt engineering
- Firm's brand style guide, approved disclosure language, and compliance-approved boilerplate paragraphs (risk disclosures, ADV references, etc.)
- Designated compliance officer or CCO who will participate in template review and approve the AI supervisory procedures documentation
- Business-grade internet connection (100+ Mbps symmetric minimum) at all office locations where advisors will use the system
- Python 3.10+ development environment available to the MSP integration developer (not required on client machines)
- Client-side admin has provided or will provide: DNS management access, firewall admin credentials, and NAS deployment location with available Ethernet port and power outlet
- Written engagement letter or SOW signed by the client acknowledging that AI-generated content requires human review before client delivery, and that the MSP is not providing financial advice
Installation Steps
Step 1: Environment Discovery and Audit
Conduct a thorough audit of the client's existing technology environment. Document all current software (CRM, planning tools, portfolio management, document management), network topology, user accounts, and compliance procedures. Identify API availability for each system. Interview 2-3 advisors to understand their current document creation workflow, typical plan structure, and pain points. Collect 10+ sample financial plans and 10+ investment proposals in their current branded format.
This step typically takes 2-3 days including advisor interviews. Schedule interviews during non-client-meeting hours. Do NOT proceed to provisioning until you have collected sample documents and compliance-approved disclosure language. The discovery output should be a formal Technical Assessment document shared with the client for sign-off.
Step 2: Network Security Hardening
Deploy the Fortinet FortiGate 40F firewall (or upgrade existing firewall policies). Configure TLS 1.3 inspection for outbound HTTPS traffic. Create application control policies that block consumer AI endpoints (chat.openai.com, claude.ai, gemini.google.com, etc.) to prevent shadow AI usage — all AI interactions must flow through the managed Azure OpenAI endpoint. Configure DNS filtering. Set up VPN profiles for remote advisors.
# Blocks consumer AI endpoints, allows Azure OpenAI, and enables deep TLS inspection
config system interface
    edit port1
        set ip 192.168.1.1 255.255.255.0
        set allowaccess ping https ssh
    next
end
config webfilter urlfilter
    edit 1
        set name 'Block-Shadow-AI'
        config entries
            edit 1
                set url 'chat.openai.com'
                set type wildcard
                set action block
            next
            edit 2
                set url 'claude.ai'
                set type wildcard
                set action block
            next
            edit 3
                set url 'gemini.google.com'
                set type wildcard
                set action block
            next
            edit 4
                set url 'bard.google.com'
                set type wildcard
                set action block
            next
        end
    next
end
config firewall address
    edit 'Azure-OpenAI-Endpoints'
        set type fqdn
        set fqdn '*.openai.azure.com'
    next
end
config firewall ssl-ssh-profile
    edit 'deep-inspection-financial'
        config https
            set ports 443
            set status deep-inspection
        end
    next
end

Shadow AI prevention is a critical compliance control. If advisors paste client PII into consumer ChatGPT, the firm faces GLBA violations and potential FINRA enforcement. Document the firewall configuration in the compliance file. If the client already has a managed firewall (e.g., Meraki MX68), configure equivalent policies on that device instead of deploying a new one.
Step 3: Provision Azure OpenAI Service
Create the Azure OpenAI resource in the client's Azure subscription (or the MSP's managed Azure tenant if managing centrally). Deploy GPT-4.1 model. Configure content filtering, rate limits, and diagnostic logging. Store the API endpoint and key in Azure Key Vault.
# Login to Azure CLI
az login
# Set subscription (use client's or MSP managed subscription)
az account set --subscription 'ClientName-Financial-AI'
# Create resource group
az group create --name rg-clientname-ai --location eastus2
# Create Azure OpenAI resource
az cognitiveservices account create \
--name oai-clientname-finplan \
--resource-group rg-clientname-ai \
--kind OpenAI \
--sku S0 \
--location eastus2 \
--custom-domain oai-clientname-finplan
# Deploy GPT-4.1 model
az cognitiveservices account deployment create \
--name oai-clientname-finplan \
--resource-group rg-clientname-ai \
--deployment-name gpt-41-finplan \
--model-name gpt-4.1 \
--model-version '2025-04-14' \
--model-format OpenAI \
--sku-capacity 80 \
--sku-name Standard
# Enable diagnostic logging to Log Analytics
az monitor diagnostic-settings create \
--name oai-diagnostics \
--resource /subscriptions/{sub-id}/resourceGroups/rg-clientname-ai/providers/Microsoft.CognitiveServices/accounts/oai-clientname-finplan \
--logs '[{"category":"RequestResponse","enabled":true},{"category":"Audit","enabled":true}]' \
--workspace /subscriptions/{sub-id}/resourceGroups/rg-clientname-ai/providers/Microsoft.OperationalInsights/workspaces/law-clientname
# Create Azure Key Vault
az keyvault create \
--name kv-clientname-ai \
--resource-group rg-clientname-ai \
--location eastus2 \
--sku standard
# Store the API key in Key Vault
API_KEY=$(az cognitiveservices account keys list --name oai-clientname-finplan --resource-group rg-clientname-ai --query key1 -o tsv)
az keyvault secret set --vault-name kv-clientname-ai --name oai-api-key --value $API_KEY
# Store the endpoint
az keyvault secret set --vault-name kv-clientname-ai --name oai-endpoint --value 'https://oai-clientname-finplan.openai.azure.com/'

Select the Azure region closest to the client for lowest latency. Use eastus2 or westus2 for US-based advisors. Enable content filtering at the default level — do NOT disable it, as this is required for compliance. The diagnostic logging captures every API request and response, which forms the core of the compliance audit trail. Capacity of 80K TPM (tokens per minute) is sufficient for up to 20 concurrent advisors. Monitor usage and increase if needed.
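To estimate monthly spend at the GPT-4.1 rates quoted in the Software Procurement section ($2.00/1M input, $8.00/1M output), a rough cost model can help set the client's expectations — the per-plan token counts below are assumptions, not measurements:

```python
# Rough monthly Azure OpenAI spend at the GPT-4.1 list pricing above.
INPUT_PER_M = 2.00   # $ per 1M input tokens
OUTPUT_PER_M = 8.00  # $ per 1M output tokens

def monthly_cost(plans: int, in_tokens: int, out_tokens: int) -> float:
    """Estimated dollars per month for `plans` generated documents."""
    return (plans * in_tokens / 1e6) * INPUT_PER_M + \
           (plans * out_tokens / 1e6) * OUTPUT_PER_M

# Assume ~40K input tokens (client profile + planning data + template)
# and ~8K output tokens (a 15-20 page narrative) per document.
print(round(monthly_cost(50, 40_000, 8_000), 2))   # light month
print(round(monthly_cost(150, 40_000, 8_000), 2))  # heavy month
```

Under these assumptions a 50-150 plan month lands in the single-digit to low-double-digit dollar range; regenerations and advisor-driven revisions can easily double it, which is consistent with the $15–$50/month estimate in the procurement list.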
Step 4: Deploy Compliance Archive Infrastructure
Set up the Synology NAS on-premises for local document archival, and configure Azure Blob Storage with immutable (WORM) policies for cloud-based compliance archival. Establish automated sync between NAS and Azure. This dual-archive approach satisfies SEC Rule 17a-4 requirements for non-rewritable, non-erasable storage.
Azure Blob Storage Setup
# Create storage account with immutable storage
az storage account create \
--name stclientnamearchive \
--resource-group rg-clientname-ai \
--location eastus2 \
--sku Standard_GRS \
--kind StorageV2
# Create container for AI documents
az storage container create \
--name ai-document-archive \
--account-name stclientnamearchive
# Set immutable policy (365-day retention — adjust per firm's retention policy)
az storage container immutability-policy create \
--resource-group rg-clientname-ai \
--account-name stclientnamearchive \
--container-name ai-document-archive \
--period 365

Synology NAS Setup
After physical installation and initial DSM setup via the web UI, configure the NAS for its archival role: create a RAID-1 storage pool across the two IronWolf drives, create a dedicated, encrypted shared folder for the archive, restrict access to a dedicated service account, and install the Cloud Sync package to replicate the archive share to the Azure Blob container using a SAS token.
Generate SAS Token for NAS-to-Azure Sync
az storage container generate-sas \
--account-name stclientnamearchive \
--name ai-document-archive \
--permissions rwl \
--expiry 2026-12-31 \
--auth-mode key \
-o tsv

The immutable storage policy CANNOT be shortened after it is locked — confirm the retention period with the client's compliance officer before locking. Most RIAs use 5-7 year retention; broker-dealers may require longer under FINRA Rule 4511 (6 years minimum). Set the immutability period accordingly. The NAS should be placed in a secure, climate-controlled area (server closet or locked office). Label the NAS clearly as 'COMPLIANCE ARCHIVE — DO NOT DISCONNECT'.
Step 5: Build the Middleware Application
Deploy the Python-based middleware application to Azure App Service. This application orchestrates the entire pipeline: it receives requests from the advisor-facing web interface, pulls client data from CRM and planning software APIs, constructs prompts using Jinja2 templates, calls Azure OpenAI, formats the response into branded Word documents using python-docx, logs everything for compliance, and presents the draft for advisor review.
mkdir -p finplan-ai/{app,templates,prompts,static,tests}
cd finplan-ai

cat > requirements.txt << 'EOF'
flask==3.1.0
gunicorn==22.0.0
openai==1.52.0
python-docx==1.1.0
Jinja2==3.1.4
requests==2.32.3
azure-identity==1.17.1
azure-keyvault-secrets==4.8.0
azure-storage-blob==12.22.0
python-dotenv==1.0.1
redis==5.0.8
celery==5.4.0
pydantic==2.9.0
EOF

cat > app/main.py << 'PYEOF'
import os
from flask import Flask, request, jsonify, send_file
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from app.pipeline import FinancialPlanPipeline
from app.compliance_logger import ComplianceLogger

app = Flask(__name__)

# Load secrets from Azure Key Vault
credential = DefaultAzureCredential()
kv_client = SecretClient(vault_url=os.environ['KEY_VAULT_URL'], credential=credential)
OAI_KEY = kv_client.get_secret('oai-api-key').value
OAI_ENDPOINT = kv_client.get_secret('oai-endpoint').value

pipeline = FinancialPlanPipeline(api_key=OAI_KEY, endpoint=OAI_ENDPOINT)
logger = ComplianceLogger()


@app.route('/api/generate-plan', methods=['POST'])
def generate_plan():
    try:
        payload = request.get_json()
        client_id = payload['client_id']
        document_type = payload['document_type']  # 'financial_plan' or 'investment_proposal'
        advisor_id = payload['advisor_id']
        customization_notes = payload.get('customization_notes', '')

        # Pull client data from CRM
        client_data = pipeline.fetch_client_data(client_id)

        # Pull planning data
        planning_data = pipeline.fetch_planning_data(client_id)

        # Generate document
        result = pipeline.generate_document(
            client_data=client_data,
            planning_data=planning_data,
            document_type=document_type,
            customization_notes=customization_notes
        )

        # Log for compliance (input + output)
        log_id = logger.log_generation(
            advisor_id=advisor_id,
            client_id=client_id,
            document_type=document_type,
            prompt_used=result['prompt'],
            raw_output=result['raw_text'],
            model_used=result['model'],
            tokens_used=result['usage']
        )

        return jsonify({
            'status': 'draft_ready',
            'log_id': log_id,
            'document_url': result['document_url'],
            'preview_text': result['raw_text'][:2000],
            'requires_review': True
        })
    except Exception as e:
        logger.log_error(str(e))
        return jsonify({'status': 'error', 'message': str(e)}), 500


@app.route('/api/approve-document', methods=['POST'])
def approve_document():
    payload = request.get_json()
    log_id = payload['log_id']
    advisor_id = payload['advisor_id']
    approval_notes = payload.get('notes', '')
    logger.log_approval(log_id, advisor_id, approval_notes)
    return jsonify({'status': 'approved', 'log_id': log_id})


if __name__ == '__main__':
    app.run(debug=False)
PYEOF

az webapp up \
--name app-clientname-finplan \
--resource-group rg-clientname-ai \
--runtime 'PYTHON:3.11' \
--sku B2

az webapp config appsettings set \
--name app-clientname-finplan \
--resource-group rg-clientname-ai \
--settings \
KEY_VAULT_URL='https://kv-clientname-ai.vault.azure.net/' \
CRM_TYPE='redtail' \
CRM_API_URL='https://smf.crm3.redtailtechnology.com/api/public/v1' \
PLANNING_SOFTWARE='emoney' \
AZURE_STORAGE_ACCOUNT='stclientnamearchive' \
ARCHIVE_CONTAINER='ai-document-archive'

az webapp identity assign --name app-clientname-finplan --resource-group rg-clientname-ai

PRINCIPAL_ID=$(az webapp identity show --name app-clientname-finplan --resource-group rg-clientname-ai --query principalId -o tsv)
az keyvault set-policy --name kv-clientname-ai --object-id $PRINCIPAL_ID --secret-permissions get list

The B2 App Service plan ($55/month) is sufficient for firms up to 25 advisors. For larger firms, scale to P1v3. The managed identity approach eliminates the need for API keys in application code — the web app authenticates to Key Vault using its Azure-assigned identity. IMPORTANT: CRM API credentials must be stored in Key Vault as separate secrets. The CRM_TYPE and PLANNING_SOFTWARE settings determine which integration module the pipeline loads.
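For the team building the advisor-facing interface, a sketch of the request contract the middleware exposes may be useful — the IDs and note text below are placeholders, not real records:

```python
import json

# Example request body for POST /api/generate-plan, matching the
# fields app/main.py reads from the JSON payload.
payload = {
    'client_id': 'RT-10042',            # placeholder CRM contact ID
    'advisor_id': 'jsmith',             # placeholder advisor login
    'document_type': 'financial_plan',  # or 'investment_proposal'
    'customization_notes': 'Emphasize the Roth conversion scenario.'
}
body = json.dumps(payload)

# On success the middleware returns status 'draft_ready' with a log_id,
# document_url, preview_text, and requires_review=True — the draft is
# never client-ready without advisor approval via /api/approve-document.
print(body)
```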
Step 6: Configure CRM Integration Module
Build and configure the integration module that pulls client data from the firm's CRM system. This module fetches client demographics, household members, financial goals, risk tolerance scores, and communication preferences. Support modules for Redtail, Wealthbox, and Salesforce FSC are provided — deploy the one matching the client's CRM.
cat > app/integrations/crm_redtail.py << 'PYEOF'
import requests
from typing import Dict, Any


class RedtailCRMClient:
    BASE_URL = 'https://smf.crm3.redtailtechnology.com/api/public/v1'

    def __init__(self, api_key: str):
        self.headers = {
            'Authorization': f'Userkeyauth {api_key}',
            'Content-Type': 'application/json'
        }

    def get_client(self, contact_id: int) -> Dict[str, Any]:
        resp = requests.get(
            f'{self.BASE_URL}/contacts/{contact_id}',
            headers=self.headers
        )
        resp.raise_for_status()
        contact = resp.json()
        return {
            'id': contact['id'],
            'first_name': contact.get('first_name', ''),
            'last_name': contact.get('last_name', ''),
            'date_of_birth': contact.get('date_of_birth', ''),
            'email': contact.get('email', ''),
            'phone': contact.get('phone', ''),
            'marital_status': contact.get('marital_status', ''),
            'employment_status': contact.get('employment_status', ''),
            'risk_tolerance': contact.get('category_tags', {}).get('risk_tolerance', 'Moderate'),
            'annual_income': contact.get('custom_fields', {}).get('annual_income', ''),
            'net_worth': contact.get('custom_fields', {}).get('net_worth', ''),
            'retirement_age': contact.get('custom_fields', {}).get('target_retirement_age', '65'),
            'goals': self._get_goals(contact_id),
            'household_members': self._get_household(contact_id)
        }

    def _get_goals(self, contact_id: int) -> list:
        resp = requests.get(
            f'{self.BASE_URL}/contacts/{contact_id}/notes',
            headers=self.headers,
            params={'category': 'Financial Goals'}
        )
        if resp.status_code == 200:
            return [note['body'] for note in resp.json().get('notes', [])]
        return []

    def _get_household(self, contact_id: int) -> list:
        resp = requests.get(
            f'{self.BASE_URL}/contacts/{contact_id}/relationships',
            headers=self.headers
        )
        if resp.status_code == 200:
            return resp.json().get('relationships', [])
        return []
PYEOF

az keyvault secret set --vault-name kv-clientname-ai --name crm-api-key --value 'REDTAIL_API_KEY_HERE'

Redtail API documentation is available at https://corporate.redtailtechnology.com/api-documentation. Custom fields (annual_income, net_worth, target_retirement_age) must be configured in Redtail to match the field names used here — work with the client's CRM administrator to map these. If the client uses Wealthbox, replace with the Wealthbox REST API module (similar structure, different endpoints). For Salesforce FSC, use the simple_salesforce Python library with OAuth 2.0 JWT bearer flow.
Step 7: Configure Financial Planning Software Integration
Build the integration module that pulls financial planning data — projections, asset allocations, cash flow analyses, and scenario results — from the firm's planning software. This data enriches the AI-generated narratives with specific numbers and projections.
# eMoney Advisor API integration client
import requests
from typing import Dict, Any, Optional


class EMoneyClient:
    """Integration with eMoney Advisor API for plan data extraction."""

    def __init__(self, api_key: str, base_url: str):
        self.base_url = base_url
        self.headers = {
            'Authorization': f'Bearer {api_key}',
            'Accept': 'application/json'
        }

    def get_plan_summary(self, client_id: str) -> Dict[str, Any]:
        """Fetch the most recent financial plan summary for a client."""
        resp = requests.get(
            f'{self.base_url}/clients/{client_id}/plans/current',
            headers=self.headers
        )
        if resp.status_code == 200:
            plan = resp.json()
            return {
                'plan_id': plan.get('id'),
                'plan_date': plan.get('created_date'),
                'retirement_probability': plan.get('monte_carlo_probability'),
                'projected_retirement_income': plan.get('annual_retirement_income'),
                'total_investable_assets': plan.get('total_assets'),
                'total_liabilities': plan.get('total_liabilities'),
                'net_worth': plan.get('net_worth'),
                'annual_savings_rate': plan.get('savings_rate'),
                'asset_allocation': plan.get('current_allocation', {}),
                'recommended_allocation': plan.get('target_allocation', {}),
                'social_security_estimate': plan.get('ss_benefit'),
                'insurance_coverage': plan.get('insurance_summary', {}),
                'estate_documents': plan.get('estate_docs_status', {}),
                'tax_bracket': plan.get('effective_tax_rate'),
                'scenarios': self._get_scenarios(client_id, plan.get('id'))
            }
        # Fallback: return empty structure for manual data entry
        return self._empty_plan_structure()

    def _get_scenarios(self, client_id: str, plan_id: str) -> list:
        resp = requests.get(
            f'{self.base_url}/clients/{client_id}/plans/{plan_id}/scenarios',
            headers=self.headers
        )
        if resp.status_code == 200:
            return resp.json().get('scenarios', [])
        return []

    def _empty_plan_structure(self) -> Dict[str, Any]:
        return {
            'plan_id': None,
            'retirement_probability': None,
            'total_investable_assets': None,
            'asset_allocation': {},
            'recommended_allocation': {},
            'scenarios': [],
            '_note': 'Plan data not available via API — advisor must input manually'
        }

# CSV fallback importer for planning tools without API access
import csv
import io
from typing import Dict, Any


class CSVPlanningImporter:
    """Fallback for planning software without API access.
    Advisor uploads a CSV export from their planning tool."""

    REQUIRED_FIELDS = [
        'client_name', 'total_assets', 'total_liabilities',
        'retirement_probability', 'annual_income', 'savings_rate'
    ]

    def parse_plan_export(self, csv_content: str) -> Dict[str, Any]:
        reader = csv.DictReader(io.StringIO(csv_content))
        rows = list(reader)
        if not rows:
            raise ValueError('Empty CSV file')
        row = rows[0]  # First row = client data
        return {
            'plan_id': 'csv-import',
            'total_investable_assets': row.get('total_assets', ''),
            'total_liabilities': row.get('total_liabilities', ''),
            'retirement_probability': row.get('retirement_probability', ''),
            'annual_savings_rate': row.get('savings_rate', ''),
            'asset_allocation': {
                'equities': row.get('equity_pct', ''),
                'fixed_income': row.get('fi_pct', ''),
                'alternatives': row.get('alt_pct', ''),
                'cash': row.get('cash_pct', '')
            },
            'scenarios': []
        }

eMoney has a robust API with 40+ custodial and CRM connections. MoneyGuidePro has more limited API access — contact Envestnet for partner API credentials. RightCapital offers a developer API for Premium tier ($1,800/yr) subscribers. For planning tools without API access, the CSV fallback module provides a manual upload path. During implementation, test API connectivity with the specific planning software version the client uses.
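To verify the CSV contract before go-live, a one-row sample export can be parsed with the same stdlib machinery the importer uses. The column names match the fields CSVPlanningImporter expects; all values are made up for illustration:

```python
import csv
import io

# One-row export matching the importer's expected columns (illustrative data).
SAMPLE = (
    'client_name,total_assets,total_liabilities,retirement_probability,'
    'annual_income,savings_rate,equity_pct,fi_pct,alt_pct,cash_pct\n'
    'Jane Sample,1250000,180000,87,210000,15,60,30,5,5\n'
)

row = next(csv.DictReader(io.StringIO(SAMPLE)))
required = ['client_name', 'total_assets', 'total_liabilities',
            'retirement_probability', 'annual_income', 'savings_rate']
missing = [f for f in required if f not in row]
assert not missing, f'export missing required columns: {missing}'
print(row['total_assets'], row['equity_pct'])  # → 1250000 60
```

Running this against a real export from the client's planning tool quickly surfaces header mismatches that the advisor's CRM or planning administrator must fix before the fallback path is usable.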
Step 8: Deploy Prompt Template Library
Create and deploy the Jinja2-based prompt templates that transform client data into structured prompts for GPT-4.1. Each document type (financial plan narrative, investment proposal, retirement analysis, etc.) has its own template with the firm's approved language, disclosure requirements, and formatting preferences. These templates are the core IP of the solution.
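Before writing the full templates, it helps to verify the rendering mechanics in isolation. A minimal sketch follows — the format_currency function here mirrors the behavior the pipeline's filter must provide (Step 9), and the field name is a simplified stand-in for the real template variables:

```python
from jinja2 import Environment

def format_currency(value):
    """Render numeric values as whole-dollar amounts for client documents."""
    try:
        return f'${float(value):,.0f}'
    except (ValueError, TypeError):
        return str(value) if value else '[AMOUNT NOT PROVIDED]'

env = Environment(autoescape=False)
env.filters['format_currency'] = format_currency

template = env.from_string(
    'Total Investable Assets: {{ plan.total_investable_assets | format_currency }}'
)
print(template.render(plan={'total_investable_assets': 1250000}))
# → Total Investable Assets: $1,250,000
```

The same pattern scales to the full templates below: any client or plan field piped through format_currency renders as a clean dollar figure, and missing values surface as an explicit placeholder rather than a fabricated number.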
mkdir -p prompts/{financial_plan,investment_proposal,retirement_analysis,common}

cat > prompts/financial_plan/narrative.j2 << 'TEMPLATEEOF'
You are a senior financial planning writer employed by {{ firm_name }}. You produce personalized, compliant financial plan narrative documents for clients of registered investment advisory firms.
## CRITICAL RULES:
1. NEVER fabricate financial data. Use ONLY the data provided below. If data is missing, write: "[DATA NEEDED: description of missing information]"
2. NEVER provide specific investment product recommendations (no ticker symbols, fund names, or specific securities)
3. ALWAYS use language consistent with fiduciary duty — the plan is in the client's best interest
4. Include the required disclosure paragraph at the end of every section that discusses investments or projections
5. Use second person ("you"/"your") when addressing the client
6. Write at a 10th-grade reading level — accessible but professional
7. All projected returns must be clearly labeled as hypothetical
## CLIENT PROFILE:
- Name: {{ client.first_name }} {{ client.last_name }}
- Age: {{ client.age }} | Target Retirement Age: {{ client.retirement_age }}
- Marital Status: {{ client.marital_status }}
- Employment: {{ client.employment_status }}
- Annual Household Income: {{ client.annual_income | format_currency }}
- Household Members: {% for member in client.household_members %}{{ member.name }} ({{ member.relationship }}, age {{ member.age }}){% if not loop.last %}, {% endif %}{% endfor %}
## FINANCIAL SNAPSHOT:
- Total Investable Assets: {{ plan.total_investable_assets | format_currency }}
- Total Liabilities: {{ plan.total_liabilities | format_currency }}
- Estimated Net Worth: {{ plan.net_worth | format_currency }}
- Current Savings Rate: {{ plan.annual_savings_rate }}%
- Effective Tax Bracket: {{ plan.tax_bracket }}%
## CURRENT ASSET ALLOCATION:
{% for asset_class, pct in plan.asset_allocation.items() %}- {{ asset_class }}: {{ pct }}%
{% endfor %}
## RECOMMENDED ALLOCATION:
{% for asset_class, pct in plan.recommended_allocation.items() %}- {{ asset_class }}: {{ pct }}%
{% endfor %}
## RETIREMENT PROJECTION:
- Monte Carlo Success Probability: {{ plan.retirement_probability }}%
- Projected Annual Retirement Income: {{ plan.projected_retirement_income | format_currency }}
- Social Security Estimate (at FRA): {{ plan.social_security_estimate | format_currency }}/month
## CLIENT GOALS:
{% for goal in client.goals %}- {{ goal }}
{% endfor %}
## SCENARIOS MODELED:
{% for scenario in plan.scenarios %}- {{ scenario.name }}: {{ scenario.description }} → Probability: {{ scenario.probability }}%
{% endfor %}
{% if customization_notes %}- ADVISOR NOTES: {{ customization_notes }}{% endif %}
## DOCUMENT STRUCTURE (generate each section):
1. **Executive Summary** (1 page) — warm, personal overview of the plan's key findings and recommendations
2. **Your Financial Picture Today** (1-2 pages) — net worth summary, income analysis, current allocation review
3. **Your Goals & What They Mean** (1-2 pages) — each goal restated with quantified targets and timelines
4. **Retirement Readiness Analysis** (2-3 pages) — projection results, Monte Carlo explanation in plain English, Social Security strategy
5. **Investment Strategy** (2-3 pages) — allocation rationale, risk management approach, rebalancing philosophy (NO specific product names)
6. **Tax Planning Opportunities** (1-2 pages) — tax bracket management, Roth conversion analysis if applicable, tax-loss harvesting overview
7. **Risk Management & Insurance Review** (1 page) — coverage assessment, gaps identified
8. **Estate Planning Considerations** (1 page) — beneficiary review, document status, basic planning needs
9. **Action Items & Next Steps** (1 page) — numbered, prioritized action list with responsible party and timeline
10. **Important Disclosures** — insert the following verbatim:
{{ firm_disclosures }}
Generate the complete document now. Use markdown formatting with headers. Each section should flow naturally into the next with transitional language.
TEMPLATEEOF

cat > prompts/investment_proposal/proposal.j2 << 'TEMPLATEEOF'
You are a senior investment writer employed by {{ firm_name }}. You produce personalized investment proposal documents for prospective and existing clients.
## CRITICAL RULES:
1. NEVER fabricate performance data. Use ONLY the data provided.
2. NEVER guarantee future returns or use language implying certainty of outcomes
3. All hypothetical projections must be clearly labeled as such
4. Do NOT name specific securities, mutual funds, or ETFs unless provided in the data below
5. Comply with SEC Marketing Rule 206(4)-1: no misleading statements, no cherry-picked performance
6. Focus on the STRATEGY and PROCESS, not on specific products
## PROSPECT/CLIENT PROFILE:
- Name: {{ client.first_name }} {{ client.last_name }}
- Current Portfolio Value: {{ plan.total_investable_assets | format_currency }}
- Risk Tolerance Score: {{ client.risk_tolerance }} / 100
- Investment Time Horizon: {{ client.investment_horizon }} years
- Primary Investment Objective: {{ client.primary_objective }}
## CURRENT ALLOCATION (if available):
{% for asset_class, pct in plan.asset_allocation.items() %}- {{ asset_class }}: {{ pct }}%
{% endfor %}
## PROPOSED ALLOCATION:
{% for asset_class, pct in plan.recommended_allocation.items() %}- {{ asset_class }}: {{ pct }}%
{% endfor %}
## FEE SCHEDULE:
{{ firm_fee_schedule }}
## DOCUMENT STRUCTURE:
1. **Cover Letter** — personalized to client's situation and stated goals
2. **Our Firm & Philosophy** — {{ firm_name }} overview, investment philosophy, fiduciary commitment
3. **Understanding Your Needs** — restate client goals, risk tolerance, time horizon
4. **Current Portfolio Assessment** — analysis of existing allocation, identified concerns
5. **Proposed Investment Strategy** — recommended allocation with rationale for each asset class
6. **Fee Transparency** — clear fee schedule, total cost comparison
7. **What to Expect** — onboarding process, communication cadence, reporting overview
8. **Appendix: Important Disclosures** — insert verbatim: {{ firm_disclosures }}
Generate the complete investment proposal now. Tone: confident but not aggressive, educational, and transparent about costs and risks.
TEMPLATEEOF

cat > prompts/common/disclosures.j2 << 'TEMPLATEEOF'
{{ firm_name }} is a registered investment adviser with the Securities and Exchange Commission. Registration does not imply a certain level of skill or training. The information contained in this document is for general informational purposes and should not be construed as personalized investment advice. Past performance is not indicative of future results. All investing involves risk, including the potential loss of principal. Hypothetical projections shown are not guarantees of future performance and are based on assumptions that may not be realized. Monte Carlo simulations provide probabilities based on historical data patterns and do not guarantee specific outcomes. {{ firm_name }} does not provide tax, legal, or accounting advice. Clients should consult their tax advisor regarding their specific tax situation. For additional information, including our Form ADV Part 2A, please visit {{ firm_website }} or contact us at {{ firm_phone }}.
TEMPLATEEOF

These templates are the MOST IMPORTANT deliverable of the entire project. Spend significant time with the client's compliance officer refining the disclosure language and critical rules. Every firm will have different approved language — never use generic disclosures without CCO sign-off. The format_currency Jinja2 filter must be implemented in the pipeline code. Store the firm-specific variables (firm_name, firm_disclosures, firm_fee_schedule, firm_website, firm_phone) in a configuration file that the compliance officer can update without developer intervention.
Step 9: Build the Document Generation Pipeline
Implement the core pipeline module that ties together CRM data, planning data, prompt templates, Azure OpenAI calls, and Word document generation. This is the orchestration engine that transforms raw data into branded, formatted financial documents.
cat > app/pipeline.py << 'PYEOF'
import os
import json
import datetime
from typing import Dict, Any, Optional
from jinja2 import Environment, FileSystemLoader
from openai import AzureOpenAI
from docx import Document
from docx.shared import Inches, Pt, RGBColor
from docx.enum.text import WD_ALIGN_PARAGRAPH
from azure.storage.blob import BlobServiceClient
from azure.identity import DefaultAzureCredential
class FinancialPlanPipeline:
def __init__(self, api_key: str, endpoint: str):
self.client = AzureOpenAI(
api_key=api_key,
api_version='2025-01-01-preview',
azure_endpoint=endpoint
)
self.jinja_env = Environment(
loader=FileSystemLoader('prompts'),
autoescape=False
)
self.jinja_env.filters['format_currency'] = self._format_currency
self._load_firm_config()
def _load_firm_config(self):
with open('config/firm_config.json', 'r') as f:
self.firm_config = json.load(f)
@staticmethod
def _format_currency(value):
try:
return f'${float(value):,.0f}'
except (ValueError, TypeError):
return str(value) if value else '[AMOUNT NOT PROVIDED]'
def fetch_client_data(self, client_id: str) -> Dict[str, Any]:
crm_type = os.environ.get('CRM_TYPE', 'redtail')
if crm_type == 'redtail':
from app.integrations.crm_redtail import RedtailCRMClient
crm = RedtailCRMClient(api_key=self._get_secret('crm-api-key'))
return crm.get_client(int(client_id))
# Add other CRM integrations as elif blocks
raise ValueError(f'Unsupported CRM type: {crm_type}')
def fetch_planning_data(self, client_id: str) -> Dict[str, Any]:
planning_type = os.environ.get('PLANNING_SOFTWARE', 'emoney')
if planning_type == 'emoney':
from app.integrations.planning_emoney import EMoneyClient
planner = EMoneyClient(
api_key=self._get_secret('planning-api-key'),
base_url=os.environ.get('PLANNING_API_URL', '')
)
return planner.get_plan_summary(client_id)
elif planning_type == 'csv':
return {} # Will be populated via CSV upload endpoint
raise ValueError(f'Unsupported planning software: {planning_type}')
def generate_document(self, client_data: Dict, planning_data: Dict,
document_type: str, customization_notes: str = '') -> Dict[str, Any]:
# Select and render prompt template
if document_type == 'financial_plan':
template = self.jinja_env.get_template('financial_plan/narrative.j2')
elif document_type == 'investment_proposal':
template = self.jinja_env.get_template('investment_proposal/proposal.j2')
else:
raise ValueError(f'Unknown document type: {document_type}')
prompt = template.render(
client=client_data,
plan=planning_data,
firm_name=self.firm_config['firm_name'],
firm_disclosures=self.firm_config['disclosures'],
firm_fee_schedule=self.firm_config.get('fee_schedule', ''),
firm_website=self.firm_config['website'],
firm_phone=self.firm_config['phone'],
customization_notes=customization_notes
)
# Call Azure OpenAI
response = self.client.chat.completions.create(
model='gpt-41-finplan',
messages=[
{'role': 'system', 'content': 'You are an expert financial planning document writer. Follow all rules exactly. Never fabricate data.'},
{'role': 'user', 'content': prompt}
],
temperature=0.3,
max_tokens=16000,
top_p=0.9
)
raw_text = response.choices[0].message.content
usage = {
'prompt_tokens': response.usage.prompt_tokens,
'completion_tokens': response.usage.completion_tokens,
'total_tokens': response.usage.total_tokens
}
# Generate Word document
doc_path = self._create_word_document(
raw_text, client_data, document_type
)
# Upload to archive
blob_url = self._archive_document(doc_path, client_data, document_type)
return {
'raw_text': raw_text,
'prompt': prompt,
'model': 'gpt-4.1',
'usage': usage,
'document_url': blob_url,
'document_path': doc_path
}
def _create_word_document(self, markdown_text: str, client_data: Dict,
document_type: str) -> str:
doc = Document()
style = doc.styles['Normal']
font = style.font
font.name = self.firm_config.get('font_name', 'Calibri')
font.size = Pt(11)
# Add firm logo if available
logo_path = self.firm_config.get('logo_path')
if logo_path and os.path.exists(logo_path):
doc.add_picture(logo_path, width=Inches(2.5))
last_paragraph = doc.paragraphs[-1]
last_paragraph.alignment = WD_ALIGN_PARAGRAPH.CENTER
# Title
title_text = 'Comprehensive Financial Plan' if document_type == 'financial_plan' else 'Investment Proposal'
title = doc.add_heading(title_text, level=0)
title.alignment = WD_ALIGN_PARAGRAPH.CENTER
# Subtitle with client name and date
subtitle = doc.add_paragraph()
subtitle.alignment = WD_ALIGN_PARAGRAPH.CENTER
run = subtitle.add_run(
f"Prepared for {client_data['first_name']} {client_data['last_name']}\n"
f"Date: {datetime.datetime.now().strftime('%B %d, %Y')}\n"
f"Prepared by: {self.firm_config['firm_name']}"
)
run.font.size = Pt(14)
run.font.color.rgb = RGBColor(70, 70, 70)
doc.add_page_break()
# Parse markdown sections into Word
for line in markdown_text.split('\n'):
line = line.strip()
if line.startswith('# '):
doc.add_heading(line[2:], level=1)
elif line.startswith('## '):
doc.add_heading(line[3:], level=2)
elif line.startswith('### '):
doc.add_heading(line[4:], level=3)
elif line.startswith('- '):
doc.add_paragraph(line[2:], style='List Bullet')
elif line.startswith('[DATA NEEDED:'):
p = doc.add_paragraph()
run = p.add_run(line)
run.font.color.rgb = RGBColor(255, 0, 0)
run.bold = True
elif line:
doc.add_paragraph(line)
# Save
timestamp = datetime.datetime.now().strftime('%Y%m%d_%H%M%S')
filename = f"{client_data['last_name']}_{document_type}_{timestamp}.docx"
filepath = os.path.join('output', filename)
os.makedirs('output', exist_ok=True)
doc.save(filepath)
return filepath
def _archive_document(self, doc_path: str, client_data: Dict,
document_type: str) -> str:
credential = DefaultAzureCredential()
blob_service = BlobServiceClient(
account_url=f"https://{os.environ['AZURE_STORAGE_ACCOUNT']}.blob.core.windows.net",
credential=credential
)
container = blob_service.get_container_client(os.environ['ARCHIVE_CONTAINER'])
blob_name = f"{datetime.datetime.now().strftime('%Y/%m')}/{os.path.basename(doc_path)}"
with open(doc_path, 'rb') as f:
container.upload_blob(name=blob_name, data=f, overwrite=False)
return f"https://{os.environ['AZURE_STORAGE_ACCOUNT']}.blob.core.windows.net/{os.environ['ARCHIVE_CONTAINER']}/{blob_name}"
def _get_secret(self, name: str) -> str:
from azure.keyvault.secrets import SecretClient
credential = DefaultAzureCredential()
kv = SecretClient(
vault_url=os.environ['KEY_VAULT_URL'],
credential=credential
)
return kv.get_secret(name).value
PYEOF

Temperature is set to 0.3 (low) for financial content — this reduces creativity/hallucination while maintaining natural language flow. Do NOT increase above 0.5 for compliance-sensitive documents. The max_tokens of 16000 accommodates a full 20-page plan narrative. Monitor token usage in the first month and adjust if plans are being truncated. The [DATA NEEDED] placeholders in red are a critical safety feature — they force the advisor to fill gaps rather than letting the AI fabricate data.
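Because the [DATA NEEDED] placeholders are the main guard against fabricated figures, they lend themselves to an automated pre-approval gate. A minimal sketch (the helper name and regex are illustrative assumptions, not part of pipeline.py as listed):

```python
import re

# Scan a generated draft for unresolved [DATA NEEDED: ...] placeholders so the
# app can refuse the "Approve for Client Delivery" action until every gap is
# filled by the advisor. The regex mirrors the placeholder format the prompt
# templates instruct GPT-4.1 to emit.
DATA_NEEDED_RE = re.compile(r'\[DATA NEEDED:\s*([^\]]+)\]')

def find_data_gaps(draft_text: str) -> list:
    """Return the description of every unresolved placeholder in a draft."""
    return [m.strip() for m in DATA_NEEDED_RE.findall(draft_text)]

draft = (
    "## Retirement Readiness\n"
    "Your plan projects [DATA NEEDED: Social Security estimate] in benefits.\n"
    "Current savings rate: 15% of gross income.\n"
)
print(find_data_gaps(draft))  # ['Social Security estimate']
```

Wiring this check into the approval endpoint (reject the request when the list is non-empty) turns the red highlighting from a visual cue into an enforced control.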
Step 10: Build the Compliance Logging System
Implement the compliance logging module that captures every AI interaction — the full prompt sent, the complete raw output received, the model used, token counts, timestamps, advisor identity, and approval status. This creates the audit trail required by SEC Rule 204-2 and FINRA Rule 3110.
cat > app/compliance_logger.py << 'PYEOF'
import json
import uuid
import datetime
import os
from azure.storage.blob import BlobServiceClient
from azure.identity import DefaultAzureCredential
class ComplianceLogger:
def __init__(self):
self.credential = DefaultAzureCredential()
self.blob_service = BlobServiceClient(
account_url=f"https://{os.environ['AZURE_STORAGE_ACCOUNT']}.blob.core.windows.net",
credential=self.credential
)
self.container = self.blob_service.get_container_client('ai-compliance-logs')
# Ensure container exists
try:
self.container.create_container()
except Exception:
pass # Already exists
def log_generation(self, advisor_id: str, client_id: str,
document_type: str, prompt_used: str,
raw_output: str, model_used: str,
tokens_used: dict) -> str:
log_id = str(uuid.uuid4())
timestamp = datetime.datetime.utcnow().isoformat() + 'Z'
log_entry = {
'log_id': log_id,
'timestamp': timestamp,
'event_type': 'document_generation',
'advisor_id': advisor_id,
'client_id': client_id,
'document_type': document_type,
'model_used': model_used,
'tokens': tokens_used,
'prompt_hash': self._hash(prompt_used),
'prompt_full': prompt_used,
'output_full': raw_output,
'output_hash': self._hash(raw_output),
'approval_status': 'pending_review',
'approved_by': None,
'approval_timestamp': None,
'approval_notes': None
}
# Write to Azure Blob (immutable archive)
blob_name = f"logs/{datetime.datetime.utcnow().strftime('%Y/%m/%d')}/{log_id}.json"
self.container.upload_blob(
name=blob_name,
data=json.dumps(log_entry, indent=2),
overwrite=False
)
# Also write to local NAS via file system
local_path = os.path.join(
os.environ.get('NAS_MOUNT_PATH', '/mnt/nas/AI-Compliance-Archive'),
'logs',
datetime.datetime.utcnow().strftime('%Y/%m/%d')
)
os.makedirs(local_path, exist_ok=True)
with open(os.path.join(local_path, f'{log_id}.json'), 'w') as f:
json.dump(log_entry, f, indent=2)
return log_id
def log_approval(self, log_id: str, advisor_id: str, notes: str = ''):
timestamp = datetime.datetime.utcnow().isoformat() + 'Z'
approval_record = {
'log_id': log_id,
'event_type': 'document_approval',
'approved_by': advisor_id,
'approval_timestamp': timestamp,
'approval_notes': notes,
'approval_status': 'approved'
}
blob_name = f"approvals/{datetime.datetime.utcnow().strftime('%Y/%m/%d')}/{log_id}_approval.json"
self.container.upload_blob(
name=blob_name,
data=json.dumps(approval_record, indent=2),
overwrite=False
)
def log_error(self, error_message: str):
log_id = str(uuid.uuid4())
timestamp = datetime.datetime.utcnow().isoformat() + 'Z'
error_record = {
'log_id': log_id,
'timestamp': timestamp,
'event_type': 'error',
'error_message': error_message
}
blob_name = f"errors/{datetime.datetime.utcnow().strftime('%Y/%m/%d')}/{log_id}.json"
self.container.upload_blob(
name=blob_name,
data=json.dumps(error_record, indent=2),
overwrite=False
)
@staticmethod
def _hash(content: str) -> str:
import hashlib
return hashlib.sha256(content.encode()).hexdigest()
PYEOF

az storage container create \
--name ai-compliance-logs \
--account-name stclientnamearchive
az storage container immutability-policy create \
--resource-group rg-clientname-ai \
--account-name stclientnamearchive \
--container-name ai-compliance-logs \
  --period 2555

The 2555-day (7-year) immutability period on the logs container aligns with FINRA Rule 4511's 6-year minimum retention plus a 1-year safety buffer. The SHA-256 hash of both the prompt and output provides tamper-evidence for the compliance record. CRITICAL: The full prompt (including client PII) is stored — ensure the storage account has encryption at rest enabled (default in Azure) and that access policies restrict who can read these logs to the CCO and designated compliance staff only. NAS_MOUNT_PATH should point to the Synology NAS share mounted via SMB on the Azure App Service (or configured as a separate write path for on-premises deployments).
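The hashes only provide tamper-evidence if someone actually recomputes them. A short verification sketch the CCO (or an examiner) could run against any archived log entry; the field names match the log_entry dict in app/compliance_logger.py:

```python
import hashlib

# Recompute the SHA-256 of the stored prompt and output and compare against
# the hashes written at generation time. Any edit to the archived text after
# the fact will cause a mismatch.
def verify_log_integrity(log_entry: dict) -> bool:
    def sha(text: str) -> str:
        return hashlib.sha256(text.encode()).hexdigest()
    return (sha(log_entry['prompt_full']) == log_entry['prompt_hash']
            and sha(log_entry['output_full']) == log_entry['output_hash'])

# Build a sample entry the same way ComplianceLogger does.
entry = {
    'prompt_full': 'Generate plan narrative for client 123',
    'output_full': '## Executive Summary\n...',
}
entry['prompt_hash'] = hashlib.sha256(entry['prompt_full'].encode()).hexdigest()
entry['output_hash'] = hashlib.sha256(entry['output_full'].encode()).hexdigest()

print(verify_log_integrity(entry))   # True
entry['output_full'] += ' [edited]'  # simulate tampering
print(verify_log_integrity(entry))   # False
```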
Step 11: Build the Advisor-Facing Web Interface
Deploy a simple, clean web interface that advisors use to request document generation, review drafts, make edits, and approve final documents. This interface enforces the human-in-the-loop review requirement. The UI is intentionally minimal — advisors should spend their time reviewing content, not learning software.
cat > templates/index.html << 'HTMLEOF'
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>{{ firm_name }} - AI Document Generator</title>
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css" rel="stylesheet">
<style>
.status-badge { font-size: 0.9rem; }
.review-banner { background-color: #fff3cd; border: 2px solid #ffc107; padding: 15px; border-radius: 8px; margin-bottom: 20px; }
.review-banner h5 { color: #856404; margin: 0; }
#output-preview { white-space: pre-wrap; font-family: 'Georgia', serif; line-height: 1.8; }
.data-gap { color: #dc3545; font-weight: bold; background: #fff5f5; padding: 2px 6px; border-radius: 3px; }
</style>
</head>
<body>
<nav class="navbar navbar-dark bg-dark">
<div class="container">
<span class="navbar-brand">{{ firm_name }} | AI Document Generator</span>
<span class="navbar-text text-warning">⚠️ All documents require advisor review before client delivery</span>
</div>
</nav>
<div class="container mt-4">
<div class="row">
<div class="col-md-4">
<div class="card">
<div class="card-header bg-primary text-white">Generate New Document</div>
<div class="card-body">
<form id="generateForm">
<div class="mb-3">
<label class="form-label">Client ID (from CRM)</label>
<input type="text" class="form-control" id="clientId" required>
</div>
<div class="mb-3">
<label class="form-label">Document Type</label>
<select class="form-select" id="documentType">
<option value="financial_plan">Comprehensive Financial Plan</option>
<option value="investment_proposal">Investment Proposal</option>
</select>
</div>
<div class="mb-3">
<label class="form-label">Customization Notes (optional)</label>
<textarea class="form-control" id="customNotes" rows="4" placeholder="E.g., Client is especially concerned about college funding for twins starting in 2028. Emphasize tax-efficient strategies."></textarea>
</div>
<button type="submit" class="btn btn-primary w-100" id="generateBtn">
Generate Draft
</button>
</form>
</div>
</div>
</div>
<div class="col-md-8">
<div id="reviewSection" style="display:none;">
<div class="review-banner">
<h5>⚠️ DRAFT — REQUIRES YOUR REVIEW AND APPROVAL</h5>
<small>This document was generated by AI and has NOT been reviewed for accuracy. Per firm compliance policy and FINRA Rule 2210, you must review all content before sharing with a client. Look for <span class="data-gap">[DATA NEEDED]</span> placeholders that require your input.</small>
</div>
<div class="card">
<div class="card-header d-flex justify-content-between">
<span>Document Preview</span>
<span class="badge bg-warning status-badge" id="statusBadge">Pending Review</span>
</div>
<div class="card-body" id="output-preview"></div>
<div class="card-footer">
<div class="mb-3">
<label class="form-label">Approval Notes</label>
<textarea class="form-control" id="approvalNotes" rows="2" placeholder="Note any changes made before delivery"></textarea>
</div>
<div class="d-flex gap-2">
<a id="downloadLink" class="btn btn-outline-primary" href="#" target="_blank">Download Word Document</a>
<button class="btn btn-success" id="approveBtn" onclick="approveDocument()">✓ Approve for Client Delivery</button>
<button class="btn btn-outline-danger" onclick="requestRevision()">↻ Regenerate with Changes</button>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
<script>
let currentLogId = null;
document.getElementById('generateForm').addEventListener('submit', async (e) => {
e.preventDefault();
const btn = document.getElementById('generateBtn');
btn.disabled = true;
btn.textContent = 'Generating... (30-90 seconds)';
try {
const response = await fetch('/api/generate-plan', {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({
client_id: document.getElementById('clientId').value,
document_type: document.getElementById('documentType').value,
advisor_id: '{{ current_user_id }}',
customization_notes: document.getElementById('customNotes').value
})
});
const data = await response.json();
if (data.status === 'draft_ready') {
currentLogId = data.log_id;
let preview = data.preview_text;
preview = preview.replace(/\[DATA NEEDED:[^\]]+\]/g, '<span class="data-gap">$&</span>');
document.getElementById('output-preview').innerHTML = preview;
document.getElementById('downloadLink').href = data.document_url;
document.getElementById('reviewSection').style.display = 'block';
} else {
alert('Error: ' + data.message);
}
} catch (err) { alert('Error: ' + err.message); }
btn.disabled = false;
btn.textContent = 'Generate Draft';
});
async function approveDocument() {
const response = await fetch('/api/approve-document', {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({
log_id: currentLogId,
advisor_id: '{{ current_user_id }}',
notes: document.getElementById('approvalNotes').value
})
});
const data = await response.json();
if (data.status === 'approved') {
document.getElementById('statusBadge').className = 'badge bg-success status-badge';
document.getElementById('statusBadge').textContent = 'Approved ✓';
document.getElementById('approveBtn').disabled = true;
}
}
function requestRevision() {
document.getElementById('customNotes').value += '\n[REVISION REQUEST]: ';
document.getElementById('customNotes').focus();
}
</script>
</body>
</html>
HTMLEOF

The review banner and compliance warnings are non-negotiable UI elements — they cannot be hidden or dismissed. The [DATA NEEDED] highlighting in red ensures advisors catch gaps before approving. Authentication should be handled via Entra ID SSO — add Azure AD authentication middleware to the Flask app using the msal library. The current_user_id template variable should be populated from the authenticated session. For firms that prefer Microsoft Teams integration, this can alternatively be built as a Teams tab app or Power App calling the same API backend.
Step 12: Create Firm Configuration and Branding
Set up the firm-specific configuration file containing branding, disclosure language, fee schedules, and document formatting preferences. This file is maintained by the MSP and updated when the compliance officer provides new approved language.
mkdir -p config
cat > config/firm_config.json << 'JSONEOF'
{
"firm_name": "[CLIENT FIRM NAME]",
"firm_dba": "[DBA IF DIFFERENT]",
"firm_website": "https://www.clientfirm.com",
"firm_phone": "(555) 123-4567",
"firm_email": "info@clientfirm.com",
"crd_number": "[FIRM CRD NUMBER]",
"sec_registration": "[SEC FILE NUMBER]",
"logo_path": "static/firm_logo.png",
"font_name": "Calibri",
"brand_color_primary": "#1B3A5C",
"brand_color_secondary": "#4A90D9",
"disclosures": "[CLIENT FIRM NAME] is a registered investment adviser with the Securities and Exchange Commission (SEC File Number: [SEC FILE NUMBER]). Registration does not imply a certain level of skill or training. The information in this document is for informational purposes only and does not constitute investment advice. Past performance is not indicative of future results. All investments involve risk, including loss of principal. Hypothetical projections are based on assumptions that may not materialize and are not guarantees. Monte Carlo analysis results represent statistical probabilities, not certain outcomes. [CLIENT FIRM NAME] does not provide tax, legal, or accounting advice. Please consult your tax advisor. For our Form ADV Part 2A brochure, visit [WEBSITE] or call [PHONE].",
"fee_schedule": "Advisory Fee Schedule:\n- First $500,000: 1.00% annually\n- $500,001 - $1,000,000: 0.85% annually\n- $1,000,001 - $3,000,000: 0.70% annually\n- Over $3,000,000: 0.50% annually\n\nMinimum account size: $250,000. Fees are billed quarterly in advance based on the market value of assets under management.",
"advisor_roster": [
{"name": "[ADVISOR 1 NAME]", "title": "CFP®, CFA", "email": "advisor1@clientfirm.com"},
{"name": "[ADVISOR 2 NAME]", "title": "CFP®", "email": "advisor2@clientfirm.com"}
]
}
JSONEOF
echo 'Replace all [BRACKETED] values with actual firm information before deployment.'

Every bracketed placeholder must be filled with actual firm information before go-live. The disclosures text MUST be reviewed and approved by the firm's CCO — do not use the template language as-is. Request the firm's ADV Part 2A brochure as a reference. The fee_schedule field should exactly match the firm's current ADV Part 2A fee disclosure. Get the firm's logo as a high-resolution PNG (minimum 600px wide, transparent background preferred).
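A pre-deployment check can catch leftover placeholders automatically. A small sketch (this script is an assumption, not one of the listed deliverables) that fails fast if any [BRACKETED] value survives anywhere in firm_config.json:

```python
import json
import re

# Match uppercase bracketed placeholders such as [CLIENT FIRM NAME] or
# [SEC FILE NUMBER] anywhere in the serialized config, including inside
# the long disclosures string.
PLACEHOLDER_RE = re.compile(r'\[[A-Z][A-Z0-9 _-]*\]')

def find_placeholders(config: dict) -> list:
    """Return the sorted set of unresolved [BRACKETED] values in a config."""
    return sorted(set(PLACEHOLDER_RE.findall(json.dumps(config))))

sample = {
    'firm_name': '[CLIENT FIRM NAME]',
    'firm_phone': '(555) 123-4567',
    'disclosures': '[CLIENT FIRM NAME] is a registered investment adviser...',
}
print(find_placeholders(sample))  # ['[CLIENT FIRM NAME]']
```

Running this as part of the deployment script (and refusing to deploy when the list is non-empty) makes "replace all bracketed values" an enforced step rather than a checklist item.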
Step 13: Deploy and Configure End-to-End Application
Push the complete application to Azure App Service, configure all environment variables, set up SSL, configure custom domain if applicable, and enable Entra ID authentication for single sign-on.
cd finplan-ai
az webapp up \
--name app-clientname-finplan \
--resource-group rg-clientname-ai \
--runtime 'PYTHON:3.11' \
--sku B2 \
--location eastus2
az webapp config set \
--name app-clientname-finplan \
--resource-group rg-clientname-ai \
--startup-file 'gunicorn --bind=0.0.0.0:8000 --timeout 120 --workers 2 app.main:app'
az webapp config appsettings set \
--name app-clientname-finplan \
--resource-group rg-clientname-ai \
--settings \
KEY_VAULT_URL='https://kv-clientname-ai.vault.azure.net/' \
CRM_TYPE='redtail' \
CRM_API_URL='https://smf.crm3.redtailtechnology.com/api/public/v1' \
PLANNING_SOFTWARE='emoney' \
PLANNING_API_URL='https://api.emoneyadvisor.com/v1' \
AZURE_STORAGE_ACCOUNT='stclientnamearchive' \
ARCHIVE_CONTAINER='ai-document-archive' \
NAS_MOUNT_PATH='/mnt/nas/AI-Compliance-Archive' \
FLASK_ENV='production'
az webapp update \
--name app-clientname-finplan \
--resource-group rg-clientname-ai \
--https-only true
az webapp auth update \
--name app-clientname-finplan \
--resource-group rg-clientname-ai \
--enabled true \
--action LoginWithAzureActiveDirectory \
--aad-allowed-token-audiences 'api://app-clientname-finplan'
az monitor app-insights component create \
--app appinsights-clientname-finplan \
--location eastus2 \
--resource-group rg-clientname-ai \
--application-type web
INSTRUMENTATION_KEY=$(az monitor app-insights component show --app appinsights-clientname-finplan --resource-group rg-clientname-ai --query instrumentationKey -o tsv)
az webapp config appsettings set \
--name app-clientname-finplan \
--resource-group rg-clientname-ai \
--settings APPINSIGHTS_INSTRUMENTATIONKEY=$INSTRUMENTATION_KEY
echo 'Deployment complete. Access at: https://app-clientname-finplan.azurewebsites.net'

The gunicorn timeout of 120 seconds is important — GPT-4.1 can take 30-90 seconds to generate a full financial plan. Default timeouts will cause 504 errors. B2 App Service plan ($55/month) is appropriate for up to 25 concurrent users. The Entra ID authentication ensures only firm employees can access the application. For firms requiring a custom domain (e.g., plans.clientfirm.com), configure a CNAME record and add an App Service Managed Certificate (free with custom domain).
Step 14: Compliance Documentation and Supervisory Procedures
Create the required compliance documentation: AI Usage Policy, AI Supervisory Procedures addendum, and Vendor Due Diligence records. These documents are critical for FINRA/SEC examination readiness. The MSP prepares drafts; the firm's CCO must review and formally adopt.
mkdir -p compliance_docs

cat > compliance_docs/AI_Usage_Policy_TEMPLATE.md << 'DOCEOF'
# [FIRM NAME] Artificial Intelligence Usage Policy
## Effective Date: [DATE]
## Approved by: [CCO NAME], Chief Compliance Officer
### 1. Purpose
This policy governs the use of artificial intelligence tools for generating client-facing financial plan narratives and investment proposal documents at [FIRM NAME].
### 2. Scope
This policy applies to all associated persons who use the firm's AI Document Generator system.
### 3. Approved AI Tools
- Azure OpenAI GPT-4.1 (accessed exclusively via the firm's managed application at [APP URL])
- No other AI tools are approved for generating client-facing content
- Use of consumer AI tools (ChatGPT, Claude, Gemini, etc.) for any client-related work is PROHIBITED
### 4. Human Review Requirement
ALL AI-generated documents MUST be reviewed by a qualified advisor before delivery to any client or prospect. Specifically:
a) The generating advisor must read the entire document for accuracy
b) All financial figures must be verified against source data
c) All [DATA NEEDED] placeholders must be resolved
d) Any AI-generated recommendations must be validated for client suitability
e) The advisor must click 'Approve for Client Delivery' in the system, creating an electronic record of review
### 5. Prohibited Uses
- Generating specific security recommendations
- Creating performance marketing materials without compliance pre-review
- Sharing AI-generated content with clients without completing the approval workflow
- Inputting client data into any non-approved AI system
### 6. Recordkeeping
All AI interactions (prompts, outputs, approvals) are automatically logged and archived in compliance with SEC Rule 204-2 and FINRA Rule 4511. Records are retained for [7] years in immutable storage.
### 7. Supervisory Review
The CCO or designated principal will conduct quarterly reviews of a random sample of AI-generated documents (minimum 10% or 25 documents, whichever is greater) to ensure quality and compliance.
### 8. Vendor Due Diligence
The firm has conducted due diligence on Microsoft Azure OpenAI confirming: SOC 2 Type II certification, data encryption at rest and in transit, no training on customer data, data residency in the United States.
### 9. Annual Review
This policy will be reviewed and updated at least annually, or more frequently as regulatory guidance evolves.
---
Acknowledged by: ___________________________ Date: ___________
DOCEOF
echo 'Template created. CCO must customize, review, and formally adopt.'

This policy template is a STARTING POINT — the firm's compliance consultant or outside counsel must review and customize it. For FINRA-registered broker-dealers, additional supervisory procedures under Rule 3110 are required. The MSP should offer to facilitate a meeting between the firm's CCO and the MSP's compliance consultant (if available) to finalize these documents. Document the completion of vendor due diligence with Azure OpenAI's SOC 2 Type II report (available from Microsoft upon request). Keep a copy of Microsoft's data processing agreement (DPA) in the compliance files.
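Section 7 of the policy template calls for a quarterly random sample of "minimum 10% or 25 documents, whichever is greater." That selection rule is easy to implement reproducibly. A sketch (the helper is an assumption; in practice the log IDs would be listed from the ai-compliance-logs container):

```python
import random

def supervisory_sample(log_ids, seed=None):
    """Pick max(25, ceil(10%)) document log IDs at random for CCO review.

    Passing a fixed seed makes the quarterly sample reproducible, which is
    useful if an examiner later asks how the sample was drawn.
    """
    n = max(25, -(-len(log_ids) * 10 // 100))  # negative floor-div = ceil(10%)
    n = min(n, len(log_ids))                   # small firms: review everything
    return random.Random(seed).sample(log_ids, n)

quarter_logs = [f'doc-{i}' for i in range(300)]
picked = supervisory_sample(quarter_logs, seed=42)
print(len(picked))  # 30 (10% of 300 exceeds the 25-document floor)
```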
Custom AI Components
Financial Plan Narrative Prompt Template
Type: prompt
Jinja2-based prompt template that transforms structured client and planning data into a comprehensive instruction set for GPT-4.1 to generate a 15-25 page financial plan narrative. Includes sections for executive summary, current financial picture, goal analysis, retirement readiness, investment strategy, tax planning, risk management, estate planning, and action items. Enforces critical guardrails: no fabricated data, no specific product recommendations, fiduciary language, required disclosures.
Implementation
The template uses Jinja2 syntax with the following variables:
- client (dict with first_name, last_name, age, retirement_age, marital_status, employment_status, annual_income, household_members, goals, risk_tolerance)
- plan (dict with total_investable_assets, total_liabilities, net_worth, annual_savings_rate, tax_bracket, asset_allocation, recommended_allocation, retirement_probability, projected_retirement_income, social_security_estimate, insurance_coverage, estate_documents, scenarios)
- firm_name
- firm_disclosures
- customization_notes
The template enforces 7 critical rules at the top of the prompt that GPT-4.1 must follow. Missing data triggers [DATA NEEDED: description] placeholders rather than fabrication.
- The format_currency filter converts numbers to $X,XXX format.
- Temperature: 0.3
- max_tokens: 16000
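The interaction between the format_currency filter and the no-fabrication rule can be seen in a small, simplified rendering example (the template string here is illustrative; the filter body mirrors the one registered in app/pipeline.py):

```python
from jinja2 import Environment

# Same behavior as FinancialPlanPipeline._format_currency: numbers become
# $X,XXX strings, and missing values surface as an explicit placeholder
# instead of silently rendering as "None".
def format_currency(value):
    try:
        return f'${float(value):,.0f}'
    except (ValueError, TypeError):
        return str(value) if value else '[AMOUNT NOT PROVIDED]'

env = Environment(autoescape=False)
env.filters['format_currency'] = format_currency
tpl = env.from_string(
    'Net worth: {{ plan.net_worth | format_currency }}. '
    'Social Security: {{ plan.social_security_estimate | format_currency }}.')

out = tpl.render(plan={'net_worth': 1250000, 'social_security_estimate': None})
print(out)
# Net worth: $1,250,000. Social Security: [AMOUNT NOT PROVIDED].
```

Because the gap is visible in the rendered prompt itself, the model sees the placeholder and (per the critical rules) emits a [DATA NEEDED] marker rather than inventing a figure.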
Investment Proposal Prompt Template
Type: prompt
Jinja2-based prompt template for generating personalized investment proposals for prospective and existing clients. Produces a structured proposal covering firm philosophy, client needs assessment, current portfolio analysis, proposed strategy, fee transparency, and onboarding expectations. Explicitly compliant with SEC Marketing Rule 206(4)-1.
Implementation
See Step 8 - prompts/investment_proposal/proposal.j2. Uses the same client and plan data structures as the financial plan template but with a different document structure focused on: Cover Letter, Firm & Philosophy, Understanding Needs, Current Portfolio Assessment, Proposed Investment Strategy, Fee Transparency, What to Expect, and Disclosures.
- Additional variables: firm_fee_schedule (string), client.investment_horizon, client.primary_objective
- Critical rules emphasize no performance guarantees, no cherry-picked data, strategy/process focus over product names
- Temperature: 0.3
- max_tokens: 12000
CRM Data Integration Agent
Type: integration
Python module that interfaces with the client's CRM system (Redtail, Wealthbox, or Salesforce FSC) to fetch client demographics, household information, goals, risk tolerance, and communication history. Implements the adapter pattern, supporting multiple CRM vendors through a common interface.
Implementation
See Step 6 - app/integrations/crm_redtail.py. The RedtailCRMClient class authenticates using Userkeyauth header format. Key methods: get_client(contact_id) returns a standardized dict with fields: id, first_name, last_name, date_of_birth, email, phone, marital_status, employment_status, risk_tolerance, annual_income, net_worth, retirement_age, goals (list), household_members (list). Goals are fetched from Redtail notes categorized as 'Financial Goals'. Household members from the relationships endpoint. For Wealthbox integration, create crm_wealthbox.py using REST API at api.crmworkspace.com with Bearer token auth. For Salesforce FSC, create crm_salesforce.py using simple_salesforce library with JWT bearer flow, querying FinancialAccount, FinancialGoal, and Contact objects.
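The adapter pattern described above implies a common interface every vendor client conforms to. A sketch of that shape (the base class, stub adapter, and factory names are illustrative assumptions; the factory mirrors the CRM_TYPE dispatch in fetch_client_data()):

```python
from abc import ABC, abstractmethod
from typing import Any, Dict

class CRMClient(ABC):
    """Common interface the Redtail/Wealthbox/Salesforce adapters implement."""
    @abstractmethod
    def get_client(self, contact_id: str) -> Dict[str, Any]:
        """Return the standardized client dict (id, first_name, ... goals)."""

class StubCRMClient(CRMClient):
    """Test double returning fixed data; real adapters call the vendor API."""
    def get_client(self, contact_id: str) -> Dict[str, Any]:
        return {'id': contact_id, 'first_name': 'Pat', 'last_name': 'Doe',
                'risk_tolerance': 'moderate', 'goals': [],
                'household_members': []}

def make_crm_client(crm_type: str) -> CRMClient:
    # Real code would register 'redtail', 'wealthbox', 'salesforce' here.
    registry = {'stub': StubCRMClient}
    if crm_type not in registry:
        raise ValueError(f'Unsupported CRM type: {crm_type}')
    return registry[crm_type]()

print(make_crm_client('stub').get_client('42')['first_name'])  # Pat
```

Keeping the standardized dict identical across adapters means the prompt templates and pipeline never need to know which CRM a given firm runs.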
Financial Planning Data Integration Agent
Type: integration
Python module that pulls financial planning data (projections, allocations, scenarios, Monte Carlo results) from the firm's planning software. Supports the eMoney Advisor API, with a CSV fallback for MoneyGuidePro and RightCapital where API access is limited.
Implementation
The EMoneyClient class authenticates via Bearer token. Key method: get_plan_summary(client_id) returns dict with the following fields:
- plan_id
- plan_date
- retirement_probability
- projected_retirement_income
- total_investable_assets
- total_liabilities
- net_worth
- annual_savings_rate
- asset_allocation (dict)
- recommended_allocation (dict)
- social_security_estimate
- insurance_coverage
- estate_documents
- tax_bracket
- scenarios (list of dicts with name, description, probability)
CSV fallback accepts an uploaded CSV with required columns: client_name, total_assets, total_liabilities, retirement_probability, annual_income, savings_rate, equity_pct, fi_pct, alt_pct, cash_pct. Returns same standardized structure with scenarios as empty list.
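A sketch of the CSV fallback mapping (the function name and column-to-field mapping are assumptions consistent with the required columns and the standardized dict described above):

```python
import csv
import io

def parse_planning_csv(csv_text: str) -> dict:
    """Map one row of the fallback CSV onto the get_plan_summary() structure."""
    row = next(csv.DictReader(io.StringIO(csv_text)))
    assets = float(row['total_assets'])
    liabilities = float(row['total_liabilities'])
    return {
        'total_investable_assets': assets,
        'total_liabilities': liabilities,
        'net_worth': assets - liabilities,
        'retirement_probability': float(row['retirement_probability']),
        'annual_savings_rate': float(row['savings_rate']),
        'asset_allocation': {
            'Equities': float(row['equity_pct']),
            'Fixed Income': float(row['fi_pct']),
            'Alternatives': float(row['alt_pct']),
            'Cash': float(row['cash_pct']),
        },
        'scenarios': [],  # no scenario data available via CSV
    }

sample = (
    'client_name,total_assets,total_liabilities,retirement_probability,'
    'annual_income,savings_rate,equity_pct,fi_pct,alt_pct,cash_pct\n'
    'Doe,900000,150000,0.82,210000,0.15,60,30,5,5\n'
)
plan = parse_planning_csv(sample)
print(plan['net_worth'])  # 750000.0
```

Fields the CSV cannot supply (tax_bracket, insurance_coverage, estate_documents) stay absent, which is what triggers the [DATA NEEDED] placeholders downstream instead of fabricated values.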
Document Generation Pipeline Orchestrator
Type: workflow
Core orchestration engine that coordinates the end-to-end document generation workflow: receives advisor request, pulls client data from CRM, pulls planning data from planning software, renders Jinja2 prompt template with data, calls Azure OpenAI GPT-4.1, formats the AI response into a branded Word document using python-docx, archives the document to Azure Blob Storage, logs everything for compliance, and returns the draft for advisor review.
Implementation
- See Step 9 - app/pipeline.py.
- The FinancialPlanPipeline class is initialized with Azure OpenAI credentials.
- Main method: generate_document(client_data, planning_data, document_type, customization_notes) orchestrates the full pipeline.
- Azure OpenAI call uses: model='gpt-41-finplan', temperature=0.3, max_tokens=16000, top_p=0.9.
- System message: 'You are an expert financial planning document writer. Follow all rules exactly. Never fabricate data.'
- Word document generation uses python-docx with firm branding (logo, fonts, colors from firm_config.json).
- Markdown-to-Word conversion handles H1-H3 headings, bullet lists, and highlights [DATA NEEDED] placeholders in red bold.
- Documents are auto-archived to Azure Blob Storage with year/month folder structure.
- Returns dict with: raw_text, prompt, model, usage (tokens), document_url, document_path.
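The orchestration flow above can be condensed into a sketch. The collaborators (llm_call, renderer, archiver, logger) are injected stand-ins for the Azure OpenAI SDK, Jinja2 rendering, Blob upload, and compliance logging; the real app/pipeline.py also does the python-docx formatting, which this sketch omits.

```python
SYSTEM_MSG = ("You are an expert financial planning document writer. "
              "Follow all rules exactly. Never fabricate data.")

class FinancialPlanPipeline:
    def __init__(self, llm_call, renderer, archiver, logger):
        self.llm_call = llm_call   # wraps the Azure OpenAI chat completion call
        self.renderer = renderer   # renders the Jinja2 prompt template with data
        self.archiver = archiver   # uploads the draft, returns (url, path)
        self.logger = logger       # writes the compliance audit record

    def generate_document(self, client_data, planning_data,
                          document_type, customization_notes=""):
        # 1. Render the prompt template with CRM + planning data
        prompt = self.renderer(document_type, client=client_data,
                               plan=planning_data, notes=customization_notes)
        # 2. Call the model (production settings: temperature=0.3,
        #    max_tokens=16000, top_p=0.9 on deployment 'gpt-41-finplan')
        raw_text, usage = self.llm_call(SYSTEM_MSG, prompt)
        # 3. Archive the draft and log everything for compliance
        url, path = self.archiver(raw_text, document_type)
        self.logger(prompt=prompt, output=raw_text, usage=usage)
        return {"raw_text": raw_text, "prompt": prompt,
                "model": "gpt-41-finplan", "usage": usage,
                "document_url": url, "document_path": path}
```

Dependency injection here is deliberate: the same pipeline can be exercised in tests with stub collaborators, and later repointed at a different inference endpoint (see the self-hosted LLM alternative below) without changing the orchestration logic.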
Compliance Audit Trail Logger
Type: skill
Logging service that creates tamper-evident audit records for every AI document generation event. Captures the full prompt (including client data), complete AI output, model used, token counts, generating advisor identity, timestamps, and subsequent approval/rejection events. Writes to both Azure Blob Storage (immutable/WORM) and local NAS for regulatory examination readiness.
Implementation
See Step 10 - app/compliance_logger.py. The ComplianceLogger class writes JSON log entries to both Azure Blob Storage (ai-compliance-logs container with 7-year immutability policy) and local NAS filesystem. Each log entry contains: log_id (UUID), timestamp (ISO 8601 UTC), event_type (document_generation | document_approval | error), advisor_id, client_id, document_type, model_used, tokens (prompt_tokens, completion_tokens, total_tokens), prompt_hash (SHA-256), prompt_full, output_full, output_hash (SHA-256), approval_status (pending_review | approved), approved_by, approval_timestamp, approval_notes. The SHA-256 hashes provide tamper evidence. Blob paths use date-based hierarchy: logs/YYYY/MM/DD/{log_id}.json. Approvals stored separately: approvals/YYYY/MM/DD/{log_id}_approval.json.
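A sketch of the log-entry assembly and blob-path logic described above, following the field list and path hierarchy in this section (the Blob/NAS write calls themselves are omitted):

```python
import hashlib
import uuid
from datetime import datetime, timezone

def _sha256(text: str) -> str:
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def build_log_entry(advisor_id, client_id, document_type, model_used,
                    prompt_full, output_full, tokens):
    """Assemble one tamper-evident document_generation log entry."""
    return {
        "log_id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),  # ISO 8601 UTC
        "event_type": "document_generation",
        "advisor_id": advisor_id,
        "client_id": client_id,
        "document_type": document_type,
        "model_used": model_used,
        "tokens": tokens,  # dict: prompt_tokens, completion_tokens, total_tokens
        "prompt_hash": _sha256(prompt_full),   # tamper evidence
        "prompt_full": prompt_full,
        "output_full": output_full,
        "output_hash": _sha256(output_full),   # tamper evidence
        "approval_status": "pending_review",
        "approved_by": None,
        "approval_timestamp": None,
        "approval_notes": None,
    }

def blob_path(entry: dict) -> str:
    """Date-based hierarchy: logs/YYYY/MM/DD/{log_id}.json."""
    d = datetime.fromisoformat(entry["timestamp"])
    return f"logs/{d:%Y/%m/%d}/{entry['log_id']}.json"
```

Storing the hashes alongside the full text lets an examiner (or the MSP) recompute SHA-256 over prompt_full and output_full years later and prove the record was not altered after write.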
Advisor Review and Approval Workflow
Type: workflow
Web-based human-in-the-loop review interface that ensures no AI-generated document reaches a client without explicit advisor review and approval. Displays draft content with highlighted data gaps, provides download link for the formatted Word document, captures approval notes, and records the approval event in the compliance audit trail.
Implementation
See Step 11 - templates/index.html. Flask-served HTML/JS interface with Bootstrap 5 UI.
Authentication via Entra ID SSO ensures advisor_id is captured from authenticated session. The approval button is the critical compliance control — it creates the electronic supervisory record.
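The logic behind that approval button can be sketched as follows. This is an assumed shape for the handler's core, separated from the Flask route so it can be shown standalone; the approvals/ path format comes from the compliance logger section.

```python
from datetime import datetime, timezone

def record_approval(log_id: str, advisor_id: str, notes: str = ""):
    """Build the supervisory approval record and its approvals/ blob path."""
    now = datetime.now(timezone.utc)
    record = {
        "log_id": log_id,
        "event_type": "document_approval",
        "approval_status": "approved",
        # advisor_id must come from the authenticated Entra ID session,
        # never from client-supplied form data
        "approved_by": advisor_id,
        "approval_timestamp": now.isoformat(),
        "approval_notes": notes,
    }
    path = f"approvals/{now:%Y/%m/%d}/{log_id}_approval.json"
    return record, path
```

The Flask route would call this after validating the session, then write the record to both Blob Storage and the NAS, mirroring the dual-write pattern used for generation logs.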
Testing & Validation
- CONNECTIVITY TEST: From the Azure App Service, verify connectivity to Azure OpenAI by sending a test completion request (see command below) — expect 200 response with completion text.
- CRM INTEGRATION TEST: Call the CRM API endpoint for 3 known test client IDs and verify that the returned data structure contains all required fields (first_name, last_name, risk_tolerance, goals, household_members). Cross-reference returned data against the CRM UI to confirm accuracy.
- PLANNING DATA TEST: Call the planning software API for the same 3 test clients and verify that financial data (total_assets, asset_allocation, retirement_probability) matches the planning software's displayed values within $1 tolerance.
- PROMPT RENDERING TEST: Render both prompt templates (financial_plan/narrative.j2 and investment_proposal/proposal.j2) with test data and verify: (a) all Jinja2 variables are properly substituted, (b) no raw {{ variable }} syntax appears in output, (c) currency values are properly formatted, (d) firm disclosures appear verbatim at the end.
- DOCUMENT GENERATION TEST — FINANCIAL PLAN: Generate a complete financial plan narrative for a test client with full data. Verify: (a) document generates without errors in under 120 seconds, (b) all 10 sections are present, (c) financial figures in the document match source data, (d) no specific security names appear unless provided in input data, (e) disclosures section appears at the end, (f) Word document opens correctly in Microsoft Word with proper formatting.
- DOCUMENT GENERATION TEST — INVESTMENT PROPOSAL: Generate a complete investment proposal for a test prospect. Verify: (a) all 8 sections present, (b) fee schedule matches firm_config.json exactly, (c) no performance guarantees or certainty language detected, (d) risk disclosures present.
- DATA GAP HANDLING TEST: Generate a document for a test client with deliberately missing data (e.g., no retirement_probability, no annual_income). Verify that the output contains [DATA NEEDED: ...] placeholders in the appropriate locations rather than fabricated numbers. Verify these placeholders render in red bold in the Word document.
- COMPLIANCE LOGGING TEST: After generating a test document, query Azure Blob Storage ai-compliance-logs container and verify: (a) log entry exists with correct log_id, (b) prompt_full contains the complete prompt with client data, (c) output_full contains the complete AI response, (d) SHA-256 hashes match when recomputed, (e) advisor_id is correctly captured, (f) approval_status is 'pending_review'.
- APPROVAL WORKFLOW TEST: From the web interface, generate a test document, review it, then click 'Approve for Client Delivery'. Verify: (a) approval record appears in Azure Blob Storage approvals/ path, (b) approved_by field matches the logged-in advisor, (c) approval_timestamp is correct, (d) status badge updates to 'Approved' in the UI.
- ARCHIVE INTEGRITY TEST: Verify that generated Word documents appear in both: (a) Azure Blob Storage ai-document-archive container with correct year/month path structure, and (b) local Synology NAS AI-Compliance-Archive share. Compare file checksums to confirm they match.
- FIREWALL SHADOW AI TEST: From an advisor workstation, attempt to access chat.openai.com, claude.ai, and gemini.google.com in a web browser. Verify all are blocked by the FortiGate web filter. Then verify that the firm's AI Document Generator application at app-clientname-finplan.azurewebsites.net loads correctly.
- AUTHENTICATION TEST: Attempt to access the application URL from an unauthenticated browser session. Verify redirect to Entra ID login. Attempt access with a non-firm Microsoft account — verify access denied. Login with a firm employee account — verify access granted and advisor_id populated.
- IMMUTABILITY TEST: Attempt to delete or modify a document in the ai-document-archive and ai-compliance-logs Azure Blob Storage containers. Verify that delete/overwrite operations are rejected due to the immutability policy.
- LOAD TEST: Simulate 5 concurrent document generation requests (representing a busy morning at a 5-advisor firm). Verify all 5 complete successfully within 3 minutes and that no request times out or fails.
- COMPLIANCE REVIEW SIMULATION: Present 3 AI-generated test documents to the firm's CCO for review. Document their feedback on: (a) accuracy of financial figures, (b) appropriateness of language and tone, (c) completeness of disclosures, (d) any compliance concerns. Iterate on prompt templates based on feedback. This step must be completed before go-live.
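The hash check in the compliance logging test above can be scripted rather than done by hand. A minimal sketch, assuming the log entries use the prompt_full/output_full and hash field names from the compliance logger section:

```python
import hashlib

def verify_log_entry(entry: dict) -> bool:
    """Recompute SHA-256 over the stored text and compare to the logged hashes."""
    def sha(text: str) -> str:
        return hashlib.sha256(text.encode("utf-8")).hexdigest()
    return (entry["prompt_hash"] == sha(entry["prompt_full"])
            and entry["output_hash"] == sha(entry["output_full"]))
```

Running this across a sample of entries pulled from the ai-compliance-logs container turns the tamper-evidence claim into a repeatable check the MSP can rerun at each quarterly review.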
curl -X POST 'https://oai-clientname-finplan.openai.azure.com/openai/deployments/gpt-41-finplan/chat/completions?api-version=2025-01-01-preview' \
  -H 'api-key: {KEY}' \
  -H 'Content-Type: application/json' \
  -d '{"messages":[{"role":"user","content":"Hello"}]}'
Client Handoff
The client handoff should include the following components delivered over a 2-day on-site (or virtual) training and documentation session:
Training Sessions (Day 1 — Advisors)
Training Session (Day 1 — Compliance Officer)
Documentation Package (Day 2 — Delivery)
- User Guide (PDF): Step-by-step screenshots of the generation, review, and approval workflow
- Quick Reference Card (laminated): Single-page reference with login URL, steps, and support contact
- AI Usage Policy (final, CCO-approved version): For firm compliance manual
- Supervisory Procedures Addendum: For firm's existing WSPs
- Vendor Due Diligence File: Azure OpenAI SOC 2 report, DPA, and security documentation
- System Architecture Diagram: For the firm's technology inventory records
- Emergency Procedures: What to do if the system is down, who to contact, manual fallback process
Success Criteria Review
Maintenance
Ongoing MSP Responsibilities:
Weekly (15 minutes)
- Review Azure Application Insights dashboard for errors, failed requests, and response time trends
- Check Azure OpenAI usage metrics — flag if monthly costs exceed budget threshold by >20%
- Verify NAS-to-Azure sync is current (Synology Cloud Sync status should show 'Up to date')
Monthly (1-2 hours)
- Review and rotate API keys in Azure Key Vault (90-day rotation policy)
- Check for Azure OpenAI model updates — Microsoft deprecates model versions with 90-day notice
- Review compliance log volume and storage costs; archive older logs to cold/archive tier if needed
- Generate monthly usage report for client: documents generated by advisor, token costs, approval rates
- Verify Synology NAS health (DSM Health dashboard, drive SMART status, RAID integrity)
- Apply OS and security patches to Azure App Service runtime
Quarterly (2-4 hours)
- Meet with CCO to review a random sample of AI-generated documents for quality trends
- Update prompt templates based on advisor feedback and compliance review findings
- Review and update firm_config.json if fee schedules, disclosures, or advisor roster have changed
- Performance benchmark: generate 5 test documents and compare quality/speed to previous quarter
- Review CRM and planning software API changes — vendors may update endpoints or deprecate versions
- Update FortiGate firmware and web filter rules (new consumer AI sites to block)
- Verify immutability policies are still active on Azure Blob Storage containers
Semi-Annually (4-6 hours)
- Full disaster recovery test: restore from Azure Blob archive, verify document integrity
- Review Azure OpenAI content filtering settings against current best practices
- Update the AI Usage Policy and Supervisory Procedures if regulatory guidance has changed (monitor FINRA Regulatory Notices and SEC Risk Alerts)
- Conduct a tabletop exercise with the firm: 'What happens if the AI generates something incorrect and it reaches a client?'
Annually
- Full system security review and penetration test
- Renew and update vendor due diligence files (request current SOC 2 reports from Microsoft)
- Review pricing: compare Azure OpenAI costs against alternatives (Anthropic, new models)
- Assess whether prompt templates need major revision based on accumulated feedback
- Support the firm during FINRA/SEC examination if AI usage is examined
SLA Commitments
- System availability: 99.5% uptime (dependent on Azure SLA)
- Response time for generation requests: <120 seconds for 95th percentile
- Support response: 4-hour response for critical issues (system down), 1 business day for non-critical
- Escalation path: Level 1 (MSP helpdesk) → Level 2 (MSP solution architect) → Level 3 (Microsoft Azure support / vendor escalation)
Model Update Triggers
- When Microsoft releases a new GPT model version on Azure OpenAI, evaluate within 30 days
- When a model version deprecation notice is received, migrate within 60 days
- When prompt template changes produce consistently better results in testing, deploy within 2 weeks
- When regulatory guidance changes (new FINRA Notice or SEC rule), update templates and policies within 30 days
Estimated Monthly Managed Services Cost: $1,500–$2,500/month (covers all weekly, monthly, and quarterly activities above)
Alternatives
Pre-Built Financial AI Platform (FP Alpha + Holistiplan)
Instead of building a custom Azure OpenAI pipeline, deploy FP Alpha ($1,795/year) for comprehensive financial plan generation and Holistiplan ($749–$1,949/year) for tax planning narratives. These are purpose-built for financial advisors with pre-tuned AI models, built-in compliance guardrails, and native integrations with common CRM and planning tools. The MSP role shifts from building custom middleware to deploying, configuring, and integrating these SaaS platforms.
Microsoft 365 Copilot-Only Approach
Leverage Microsoft 365 Copilot ($30/user/month) as the sole AI layer. Advisors use Copilot in Word to generate plan narratives from structured prompts that reference client data stored in SharePoint. Investment proposals are generated in PowerPoint. No custom middleware, no API integration — advisors manually provide context to Copilot via a standardized prompt checklist. The MSP configures M365, creates prompt playbooks, and trains advisors.
Self-Hosted Open-Source LLM (Llama 3 / Mistral)
Deploy an open-source LLM (Meta Llama 3 70B or Mistral Large) on dedicated hardware (Dell PowerEdge R760xa with NVIDIA L40S GPUs, ~$25,000–$40,000) or on Azure Virtual Machines with GPU. All client data stays on-premises or in the firm's own Azure tenant with no third-party API calls. The same middleware pipeline is used but points to a local inference endpoint instead of Azure OpenAI.
Hybrid: Azure OpenAI + Nitrogen (Riskalyze) Proposal Generation
Use Azure OpenAI for financial plan narratives (the custom pipeline) but leverage Nitrogen (formerly Riskalyze) for investment proposals specifically. Nitrogen is the industry-standard risk tolerance and proposal generation platform ($99+/month) with built-in compliance features, custodian integrations, and advisor-familiar Risk Number methodology. This hybrid approach plays to each tool's strength.