
Implementation Guide: Auto-generate work orders from customer purchase orders
Step-by-step implementation guide for deploying AI to auto-generate work orders from customer purchase orders for manufacturing clients.
Hardware Procurement
Fujitsu ScanSnap iX1600 Document Scanner
$370 MSP cost / $475–$520 suggested resale (25–35% margin)
Primary document scanner for converting paper purchase orders into searchable PDF files. 40 ppm duplex scanning with built-in Wi-Fi. ScanSnap Home software auto-routes scanned PDFs to a monitored OneDrive/SharePoint folder or email inbox for automated pickup. Required only if the client receives paper POs.
Fujitsu fi-8170 Workgroup Scanner
$650 MSP cost / $850–$950 suggested resale (25–35% margin)
High-volume alternative to the iX1600 for clients processing 50+ paper POs per day. 70 ppm duplex with 100-sheet ADF. Recommended for shops with a dedicated receiving clerk who batch-scans incoming paper POs. Only procure this instead of the iX1600 if paper PO volume justifies it.
Zebra DS2208 Barcode Scanner
$105 MSP cost per unit / $150 suggested resale per unit (30–40% margin)
Handheld barcode scanners for the shop floor to scan work order barcodes printed on auto-generated work order packets. Enables production staff to pull up work order details, log start/stop times, and confirm completion in the ERP. Provides the physical bridge between the digital work order and the shop floor.
Dell PowerEdge T150 Tower Server
$2,100 MSP cost / $2,800–$3,200 suggested resale (25–35% margin)
On-premise server for self-hosting n8n workflow engine and/or Azure AI Document Intelligence containers. ONLY required for ITAR-controlled environments or clients with strict data-residency requirements that prohibit cloud processing of PO documents. Spec: Intel Xeon E-2436, 32GB ECC RAM, 2x 1TB SSD RAID-1. Most implementations will NOT need this — use cloud services instead.
Software Procurement
...
Azure AI Document Intelligence
Prebuilt invoice/PO model: $10 per 1,000 pages; Read (OCR) tier: $1.50 per 1,000 pages. Typical client at 200 POs/month (1–2 pages each) = $2–$4/month. Mark up via Azure CSP at 10–15% margin.
Cloud-based OCR and document parsing service. Uses prebuilt invoice/purchase order models to extract structured fields (PO number, line items, quantities, part numbers, ship-to address, dates) from PDF and scanned PO documents with high accuracy. Primary document intelligence layer of the solution.
Parseur
Starter: $59/month (1,000 docs/month); Premium: $149/month (3,000 docs/month). Bundle into managed service fee.
Alternative to Azure AI Document Intelligence for email-centric PO intake. Parseur monitors a dedicated email inbox, automatically extracts data from PO PDFs and email body text using template-based parsing. Easiest no-code setup. Recommended for clients where 90%+ of POs arrive as email attachments and the MSP wants minimal OCR configuration.
Make.com (formerly Integromat)
Pro plan: $16/month (10,000 operations); Teams: $34/month (unlimited active scenarios). Best value per operation for complex workflows.
Primary workflow automation and integration platform (iPaaS). Orchestrates the entire PO-to-work-order pipeline: monitors email inbox → triggers Azure Document Intelligence → maps extracted fields → performs BOM lookup via ERP API → validates data → routes for human approval → creates work order via ERP API. Visual canvas makes complex branching logic maintainable.
Microsoft Power Automate Premium
$15/user/month via Microsoft 365 CSP. AI Builder add-on: $500/unit/month (only if using Power Automate's built-in document processing instead of Azure Doc Intelligence).
Alternative to Make.com for clients already invested in the Microsoft 365 ecosystem. Provides cloud flow automation with native connectors to Outlook, SharePoint, Dataverse, and Azure services. Also supports RPA (desktop flows) for legacy ERP systems without APIs. Resell via CSP at 10–15% margin.
n8n (Self-Hosted Community Edition)
Self-hosted: Free (community edition, unlimited executions). Cloud: $20/month (Starter, 2,500 executions) or $50/month (Pro, 10,000 executions). Self-hosted requires ~$30/month Azure VM or on-prem server.
Alternative workflow automation platform. Primary choice for ITAR-controlled environments where PO data cannot leave the client's network. Self-hosted on-premise or in a US-only Azure VM. 400+ built-in integrations. Fair-code license allows commercial use.
MRPeasy
Professional: $69/user/month; Enterprise: $99/user/month. API access available on all paid plans. Referral commission opportunity: 10–20%.
Cloud-native MRP system used as the work order target. If the client does not yet have an MRP/ERP, MRPeasy is the recommended SMB option due to its simple API, built-in BOM management, and low cost. REST API supports programmatic creation of manufacturing orders, customer orders, and BOM lookups.
Odoo Manufacturing
Standard: $24.90/user/month (cloud); Custom: $37.40/user/month. Community Edition: Free (self-hosted). Manufacturing module included.
Open-source modular ERP alternative. Full REST and XML-RPC APIs for work order creation. Best for clients wanting maximum flexibility and self-hosting capability. Community Edition is free and can be self-hosted for ITAR environments.
SPS Commerce EDI
Starting at ~$60/month per trading partner relationship. Custom pricing for high-volume.
EDI translation service for clients receiving EDI 850 Purchase Order transactions from large customers (e.g., Tier 1 automotive, defense primes, big-box retailers). Translates ANSI X12 EDI 850 documents into structured data that can be consumed by the workflow automation layer. Only needed if client has EDI trading partners.
Microsoft 365 Business Premium
$22/user/month via CSP. Client likely already has this.
Provides the shared mailbox (e.g., purchaseorders@clientdomain.com) that serves as the primary PO intake channel. Also provides SharePoint/OneDrive for PO document archival and Teams for approval notifications. Exchange Online is the email backbone monitored by the automation.
Prerequisites
- Active ERP/MRP system with API access enabled (MRPeasy Professional+, Katana Core, Odoo Standard, Fishbowl, Epicor Kinetic, or JobBOSS²). The API must support programmatic creation of work orders/manufacturing orders and BOM lookups.
- Stable internet connection of 25+ Mbps at the client site for cloud API calls to OCR and workflow services.
- Microsoft 365 or Google Workspace email with a dedicated shared mailbox for PO intake (e.g., orders@clientdomain.com or po@clientdomain.com). The mailbox must support IMAP or have a connector available in the chosen workflow platform.
- Complete and accurate Bill of Materials (BOMs) entered in the ERP system for all products the client manufactures. The automation maps PO line items to BOMs — if BOMs are incomplete, work orders will be incomplete.
- A customer-to-SKU mapping table that translates customer part numbers to internal part numbers/SKUs. Most customers use their own part numbering system on POs. This crosswalk table is critical and must be built during discovery.
- At least 20–50 sample purchase orders from the client's top 5–10 customers, representing all common PO formats. These are needed to configure OCR templates and test field extraction accuracy.
- An Azure subscription (for Azure AI Document Intelligence) or the selected alternative OCR platform account. If using Azure, the MSP's CSP tenant can host this.
- Administrative access to the client's ERP system to create API credentials (API keys, OAuth client ID/secret) and configure webhook endpoints if supported.
- A designated client-side project champion — typically the production manager or operations manager — who understands the current PO-to-WO process, can validate mapped fields, and can approve the automation logic.
- For ITAR environments only: Confirmation that all cloud services are deployed in US-only data centers, or approval to deploy the on-premise server stack (Dell PowerEdge T150 + self-hosted n8n + Azure AI containers).
- Document scanner installed and configured (Fujitsu ScanSnap iX1600 or fi-8170) if paper POs are part of the intake process. ScanSnap Home software configured to auto-save scans to a monitored folder or email.
- Network firewall rules allowing outbound HTTPS (port 443) to Azure cognitive services endpoints, Make.com/Power Automate endpoints, and the ERP API endpoint.
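The outbound-HTTPS prerequisite can be spot-checked before installation with a short script. This is a minimal sketch; the hostnames listed are placeholders for the client's actual Azure resource, Make.com region, and ERP API endpoints and must be substituted during discovery.

```python
import socket

# Example endpoints the automation must reach over outbound HTTPS (port 443).
# Hostnames are illustrative placeholders -- substitute the client's actual
# Azure resource name, Make.com region, and ERP API host.
REQUIRED_ENDPOINTS = [
    "docai-clientname.cognitiveservices.azure.com",  # Azure Document Intelligence
    "hook.us1.make.com",                             # Make.com webhooks
    "api.mrpeasy.com",                               # ERP API
]

def reachable(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (from the client network):
#   for host in REQUIRED_ENDPOINTS:
#       print(host, "OK" if reachable(host) else "BLOCKED")
```

Run it from the workstation or server that will host any on-prem components; a BLOCKED result means a firewall rule is still needed.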
Installation Steps
...
Step 1: Discovery & Process Audit
Conduct a detailed audit of the client's current PO-to-work-order process. Interview the production manager, office administrator, and any staff involved in PO processing. Document: (1) How POs arrive (email %, paper %, EDI %, portal %), (2) What fields are manually keyed into the ERP, (3) How long each WO takes to create, (4) Current error rate and common mistakes, (5) Special handling rules (rush orders, custom specs, partial shipments), (6) Approval workflows currently in place. Collect 20–50 sample POs from top customers. Photograph or screenshot the ERP work order creation screens to understand required fields.
This step is non-technical but absolutely critical. The quality of the automation depends entirely on understanding the client's actual process, not the idealized one. Expect 4–8 hours on-site. Deliverable: a PO Process Map document and a Field Mapping Spreadsheet.
Step 2: Build Customer Part Number Crosswalk Table
Create a master mapping table that translates each customer's part numbers to the client's internal SKUs/part numbers in the ERP. This is typically a CSV or database table with columns: customer_name, customer_part_number, internal_sku, internal_product_name, default_bom_id, unit_of_measure, notes. Populate this by exporting the ERP's item master and cross-referencing with customer PO history. Store this in a SharePoint list, Airtable base, or a dedicated table in the ERP if supported.
# Example SharePoint list schema (create via PowerShell or manual)
# Columns: CustomerName (Text), CustomerPartNumber (Text), InternalSKU (Text),
# ProductName (Text), BOM_ID (Text), UOM (Choice), IsActive (Boolean)
# If using Airtable, create via API:
curl -X POST 'https://api.airtable.com/v0/meta/bases/{baseId}/tables' \
-H 'Authorization: Bearer {pat}' \
-H 'Content-Type: application/json' \
-d '{"name": "PO_SKU_Crosswalk", "fields": [{"name": "CustomerName", "type": "singleLineText"}, {"name": "CustomerPartNumber", "type": "singleLineText"}, {"name": "InternalSKU", "type": "singleLineText"}, {"name": "BOM_ID", "type": "singleLineText"}, {"name": "UOM", "type": "singleSelect", "options": {"choices": [{"name": "EA"}, {"name": "LB"}, {"name": "FT"}, {"name": "KG"}]}}]}'
This is often the most time-consuming step and requires significant client input. For a client with 10 customers and 200 unique customer part numbers, expect 8–16 hours to build and validate. The client's office staff who currently process POs are the best source. This table will need ongoing maintenance as new customers/parts are added — build this into the managed service.
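The lookup behavior the workflow needs from this table can be sketched in a few lines. This is a minimal illustration, assuming the crosswalk is exported as CSV with the columns described above; note the normalization (trim, upper-case) so that "acme-100 " on a PO still matches "ACME-100" in the table.

```python
import csv
from io import StringIO

# Illustrative two-row export matching the crosswalk schema described above.
SAMPLE_CSV = """customer_name,customer_part_number,internal_sku,default_bom_id,unit_of_measure
Acme Corp,ACME-100,SKU-0042,BOM-17,EA
Acme Corp,ACME-200,SKU-0043,BOM-18,EA
"""

def normalize(part: str) -> str:
    """Trim whitespace and upper-case so minor PO formatting still matches."""
    return part.strip().upper()

def load_crosswalk(csv_text: str) -> dict:
    """Build a {(customer, customer_part): row} lookup from the CSV export."""
    table = {}
    for row in csv.DictReader(StringIO(csv_text)):
        key = (normalize(row["customer_name"]), normalize(row["customer_part_number"]))
        table[key] = row
    return table

def lookup(table: dict, customer: str, customer_part: str):
    """Return the crosswalk row, or None when the mapping is missing."""
    return table.get((normalize(customer), normalize(customer_part)))

crosswalk = load_crosswalk(SAMPLE_CSV)
hit = lookup(crosswalk, "Acme Corp", " acme-100 ")   # matches despite spacing/case
miss = lookup(crosswalk, "Acme Corp", "UNKNOWN-1")   # None -> flag for review
```

A `None` result is exactly the "no match, flag for manual review" branch built later in Step 8.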
Step 3: Configure Shared PO Intake Mailbox
Create or configure a dedicated shared mailbox in Microsoft 365 for PO intake. All customer purchase orders should be directed to this mailbox. Configure the mailbox with: (1) Auto-reply acknowledging PO receipt, (2) Retention policy for 7-year archival (SOX/ITAR compliance), (3) Appropriate permissions for the service account that will be used by the automation platform.
# Connect to Exchange Online PowerShell
Connect-ExchangeOnline -UserPrincipalName admin@clientdomain.com
# Create shared mailbox
New-Mailbox -Shared -Name 'Purchase Orders' -DisplayName 'Purchase Orders' -Alias 'purchaseorders' -PrimarySmtpAddress 'po@clientdomain.com'
# Grant full access to service account
Add-MailboxPermission -Identity 'po@clientdomain.com' -User 'svc-automation@clientdomain.com' -AccessRights FullAccess -InheritanceType All
# Grant Send-As permission for auto-replies
Add-RecipientPermission -Identity 'po@clientdomain.com' -Trustee 'svc-automation@clientdomain.com' -AccessRights SendAs -Confirm:$false
# Set retention policy (create first if needed)
New-RetentionPolicyTag -Name 'PO-7Year' -Type All -RetentionEnabled $true -AgeLimitForRetention 2555 -RetentionAction MoveToArchive
New-RetentionPolicy -Name 'PO Retention Policy' -RetentionPolicyTagLinks 'PO-7Year'
Set-Mailbox -Identity 'po@clientdomain.com' -RetentionPolicy 'PO Retention Policy'
If the client already has a shared mailbox for POs, use it — just ensure the service account has FullAccess permission. For Google Workspace clients, create a Google Group with a collaborative inbox or a dedicated Gmail account. The service account should use OAuth 2.0 app-only authentication, not a user password.
Step 4: Provision Azure AI Document Intelligence
Set up the Azure AI Document Intelligence resource that will extract structured data from PO documents. Create the resource in the client's Azure subscription (or the MSP's CSP-managed subscription). Configure the prebuilt invoice model first — it extracts PO numbers, line items, quantities, amounts, dates, and vendor/customer information out of the box. If PO formats are non-standard, plan to train a custom extraction model in a later step.
# Login to Azure CLI
az login
# Set subscription (use CSP subscription for billing)
az account set --subscription 'MSP-CSP-ClientName'
# Create resource group
az group create --name rg-clientname-docai --location eastus
# Create Document Intelligence resource (S0 tier for production)
az cognitiveservices account create \
--name docai-clientname \
--resource-group rg-clientname-docai \
--kind FormRecognizer \
--sku S0 \
--location eastus \
--yes
# Get the endpoint and key
az cognitiveservices account show \
--name docai-clientname \
--resource-group rg-clientname-docai \
--query properties.endpoint -o tsv
az cognitiveservices account keys list \
--name docai-clientname \
--resource-group rg-clientname-docai \
--query key1 -o tsv
# Test with a sample PO (using curl)
ENDPOINT=$(az cognitiveservices account show --name docai-clientname --resource-group rg-clientname-docai --query properties.endpoint -o tsv)
KEY=$(az cognitiveservices account keys list --name docai-clientname --resource-group rg-clientname-docai --query key1 -o tsv)
curl -X POST "${ENDPOINT}formrecognizer/documentModels/prebuilt-invoice:analyze?api-version=2023-07-31" \
-H "Content-Type: application/pdf" \
-H "Ocp-Apim-Subscription-Key: ${KEY}" \
--data-binary @sample-po.pdf
Use 'eastus' or 'eastus2' region for lowest latency in North America. For ITAR clients, verify the region meets US-only data residency requirements — Azure Government may be needed. The prebuilt-invoice model works surprisingly well for most PO formats since POs and invoices share similar structure. Keep the API key in Azure Key Vault for production use, not in plain text. The S0 tier has no monthly commitment — you pay only per page analyzed.
Step 5: Configure Document Scanner (If Paper POs)
Install and configure the Fujitsu ScanSnap iX1600 (or fi-8170) document scanner at the client's receiving desk or front office. Configure ScanSnap Home software to: (1) Save scans as searchable PDF (with built-in OCR), (2) Auto-route scanned files to a monitored OneDrive/SharePoint folder, (3) Use consistent file naming (e.g., SCAN_YYYYMMDD_HHMMSS.pdf). This ensures paper POs enter the same digital pipeline as email POs.
- Open ScanSnap Home > Settings > Scan Settings
- Create new profile named 'Purchase Orders'
- Set Color Mode: Auto Detect
- Set Resolution: 300 dpi (optimal for OCR)
- Set File Format: PDF (Searchable)
- Set Destination: OneDrive for Business folder — Path: /PurchaseOrders/Incoming/
- Enable 'Blank page removal' for double-sided scanning
- Set file naming: SCAN_{date}_{time}
- Verify OneDrive sync client is installed and signed in on the scanning workstation
- Create the monitored folder in SharePoint/OneDrive (can also be done via SharePoint admin or Teams) — the automation workflow will monitor this folder for new files
onedrive --version
Skip this step entirely if the client receives no paper POs. The ScanSnap iX1600 has excellent out-of-box OCR via its ABBYY FineReader engine, but we rely on Azure AI Document Intelligence for the actual field extraction — the scanner's OCR just makes the PDF searchable as a backup. Test with 10 representative paper POs before proceeding. Ensure the scanner is placed on a stable surface near the person who opens mail.
Step 6: Set Up Make.com Automation Account and Connections
Create the Make.com organization and configure all necessary connections (modules) that the workflow will use. Make.com is the recommended iPaaS for this project due to its visual workflow builder, excellent error handling, and cost-effective operation pricing. Set up connections to: Microsoft 365 (email monitoring), Azure AI Document Intelligence (OCR), the client's ERP/MRP API, SharePoint/OneDrive (document storage), and Microsoft Teams (approval notifications).
# Make.com setup is GUI-based via https://www.make.com
#
# 1. Create Organization: 'ClientName Manufacturing'
# 2. Subscribe to Pro plan ($16/month, 10,000 ops)
# 3. Create Connections:
#
# Connection 1: Microsoft 365 (OAuth 2.0)
# - Module: Microsoft 365 Email
# - Auth: Sign in with svc-automation@clientdomain.com
# - Permissions: Mail.Read, Mail.ReadWrite, Mail.Send
#
# Connection 2: Azure Document Intelligence (API Key)
# - Module: HTTP > Make a Request (custom)
# - Base URL: {endpoint from Step 4}
# - Headers: Ocp-Apim-Subscription-Key: {key from Step 4}
#
# Connection 3: ERP/MRP API (varies by ERP)
# - For MRPeasy: HTTP module with API key auth
# Base URL: https://api.mrpeasy.com/v1
# Header: Authorization: Bearer {mrpeasy_api_key}
# - For Odoo: HTTP module with session auth
# Base URL: https://clientname.odoo.com/api/v1
#
# Connection 4: Microsoft Teams (OAuth 2.0)
# - Module: Microsoft Teams
# - Auth: Sign in with svc-automation@clientdomain.com
# - Used for sending approval requests to production manager
#
# Connection 5: SharePoint (OAuth 2.0)
# - Module: Microsoft SharePoint
#   - Used for reading/writing crosswalk table and archiving POs
Use a dedicated service account (svc-automation@clientdomain.com) for all connections, not a personal account. This prevents automation breakage when employees leave. Assign the service account a Microsoft 365 Business Basic license ($6/user/month) minimum. Store all API keys and credentials in Make.com's built-in connection manager — do not hardcode them in scenarios. Enable Make.com's built-in execution logging for audit trail compliance.
Step 7: Build Core PO Intake and OCR Extraction Scenario
Create the first Make.com scenario that monitors the PO intake mailbox for new emails with PDF attachments, downloads the attachment, sends it to Azure AI Document Intelligence for field extraction, and outputs structured PO data. This scenario handles the 'Document Ingestion → Data Extraction' portion of the pipeline.
The Azure Document Intelligence prebuilt-invoice model maps PO fields to invoice field names (InvoiceId = PO Number, VendorName = Customer Name on a PO since the perspective is reversed). You may need to train a custom model if the client's POs have unusual layouts. The async polling pattern (POST to analyze, then GET result) is required — Azure processes documents asynchronously. Add a Retry loop on Module 5 with 3 attempts at 5-second intervals in case analysis is not yet complete. Enable error handling on every module with email notification to the MSP.
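Once the async analysis completes (POST to `:analyze`, then poll the URL returned in the `Operation-Location` header until `status` is `succeeded`), the workflow must pull PO fields out of the result payload. The sketch below parses a trimmed, illustrative response; field names (`InvoiceId`, `VendorName`, `Items`) follow the prebuilt-invoice model, but the sample payload itself is an assumption for demonstration, not a captured Azure response.

```python
# Trimmed, illustrative prebuilt-invoice analyze result (API version
# 2023-07-31). Real responses carry many more fields and confidence scores.
SAMPLE_RESULT = {
    "status": "succeeded",
    "analyzeResult": {
        "documents": [{
            "fields": {
                "InvoiceId": {"content": "PO-20250114"},    # maps to PO number
                "VendorName": {"content": "Acme Corp"},     # customer (perspective reversed)
                "Items": {"valueArray": [
                    {"valueObject": {
                        "Description": {"content": "ACME-100"},
                        "Quantity": {"valueNumber": 250},
                    }},
                ]},
            }
        }]
    },
}

def extract_po(result: dict) -> dict:
    """Pull the fields the downstream workflow needs from an analyzeResult."""
    fields = result["analyzeResult"]["documents"][0]["fields"]
    items = []
    for entry in fields.get("Items", {}).get("valueArray", []):
        obj = entry.get("valueObject", {})
        items.append({
            "customer_part": obj.get("Description", {}).get("content"),
            "quantity": obj.get("Quantity", {}).get("valueNumber"),
        })
    return {
        "po_number": fields.get("InvoiceId", {}).get("content"),
        "customer_name": fields.get("VendorName", {}).get("content"),
        "line_items": items,
    }

po = extract_po(SAMPLE_RESULT)
```

In Make.com the same field paths are mapped visually; the Python form is useful for an n8n Code node or for unit-testing the mapping against the 20–50 sample POs collected in Step 1.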
Step 8: Build Field Mapping and Validation Scenario
Create the second Make.com scenario (or extend the first) that takes the extracted PO data, maps customer part numbers to internal SKUs using the crosswalk table, validates all required fields, checks quantities against BOM availability in the ERP, and flags any exceptions for human review.
# Make.com Scenario: 'PO-Validation-Mapping'
#
# Module 1: Data Store > Search Records (get extracted PO data)
# OR continue from previous scenario via Router
#
# Module 2: Iterator > Iterate Line Items
# - Loop through each PO line item
#
# Module 3: SharePoint > Search List Items (Crosswalk Lookup)
# - List: PO_SKU_Crosswalk
# - Filter: CustomerName = {{po.customer_name}}
# AND CustomerPartNumber = {{line_item.description}}
# - Output: InternalSKU, BOM_ID, UOM
#
# Module 4: Router (Branching)
# - Branch 1 (Match Found): Continue to ERP BOM lookup
# - Branch 2 (No Match): Flag for manual review
#
# Branch 1, Module 5: HTTP > Make a Request (ERP BOM Lookup)
# - For MRPeasy: GET https://api.mrpeasy.com/v1/bom/{bom_id}
# - Validate: BOM exists, is active, has required materials
#
# Branch 1, Module 6: Set Variable > Build Work Order Payload
# - internal_sku: {{3.InternalSKU}}
# - bom_id: {{3.BOM_ID}}
# - quantity: {{line_item.quantity}}
# - due_date: calculate from PO required delivery date
# - priority: map from PO terms (e.g., 'RUSH' = High)
# - source_po_number: {{po.po_number}}
# - customer_name: {{po.customer_name}}
#
# Branch 2, Module 5b: Microsoft Teams > Create Approval
# - Send adaptive card to Production Manager
# - Message: 'Unknown part number {customer_part} on PO {po_number}
# from {customer}. Please map to internal SKU.'
# - Include link to Crosswalk table for manual update
#
# Module 7: Aggregator > Aggregate all line item results
# - Combine validated line items into a single WO payload array
#
# Module 8: Filter > Check if all items validated
# - If all items mapped: proceed to approval
#   - If any items flagged: hold entire PO for review
The crosswalk lookup is the most critical step. If a customer part number is not found, the automation MUST NOT create a work order with wrong materials — it must flag for human intervention. Build a simple web form or Teams adaptive card that lets the production manager add new part number mappings directly, which are then saved to the crosswalk table. This 'teach the system' loop is how the automation improves over time. For the BOM lookup, different ERPs have different API endpoints — see the custom_ai_components section for ERP-specific API call templates.
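The map-or-flag rule above — and the "one miss holds the whole PO" policy — can be expressed compactly. This is a sketch, assuming an in-memory crosswalk dict; in production the lookup hits the SharePoint list or Airtable base from Step 2.

```python
# Illustrative in-memory crosswalk; production code queries SharePoint/Airtable.
CROSSWALK = {
    ("ACME CORP", "ACME-100"): {"internal_sku": "SKU-0042", "bom_id": "BOM-17"},
}

def map_line_items(customer: str, items: list) -> dict:
    """Map each PO line to an internal SKU/BOM; flag any unknown part number."""
    mapped, flagged = [], []
    for item in items:
        key = (customer.strip().upper(), item["customer_part"].strip().upper())
        match = CROSSWALK.get(key)
        if match:
            mapped.append({**item, **match})
        else:
            flagged.append(item)  # unknown part number -> human review
    return {
        # Hold the entire PO if ANY line failed to map -- never create a
        # work order with wrong or missing materials.
        "status": "ready" if not flagged else "hold_for_review",
        "mapped": mapped,
        "flagged": flagged,
    }

result = map_line_items("Acme Corp", [
    {"customer_part": "ACME-100", "quantity": 250},
    {"customer_part": "ACME-999", "quantity": 10},   # not in crosswalk
])
```

The `flagged` list feeds the Teams adaptive card in Branch 2; once the manager adds the mapping, the PO is reprocessed and passes cleanly.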
Step 9: Build Human Approval Workflow
Implement a human-in-the-loop approval step using Microsoft Teams Adaptive Cards (or email-based approval). Before any work order is created in the ERP, the production manager or designated approver receives a summary of the proposed work order with all mapped fields, and must explicitly approve or reject it. This satisfies ISO 9001 document control requirements and builds trust during the initial rollout period.
# Make.com Module: Microsoft Teams > Send Adaptive Card
#
# Adaptive Card JSON Template:
# {
# "type": "AdaptiveCard",
# "version": "1.4",
# "body": [
# {"type": "TextBlock", "text": "New Work Order Approval Required", "weight": "Bolder", "size": "Large"},
# {"type": "FactSet", "facts": [
# {"title": "PO Number", "value": "{{po_number}}"},
# {"title": "Customer", "value": "{{customer_name}}"},
# {"title": "PO Date", "value": "{{po_date}}"},
# {"title": "Required Date", "value": "{{required_date}}"}
# ]},
# {"type": "TextBlock", "text": "Line Items:", "weight": "Bolder"},
# {"type": "Table", "columns": [
# {"width": 2}, {"width": 1}, {"width": 1}, {"width": 2}
# ], "rows": [
# {"cells": [
# {"items": [{"type": "TextBlock", "text": "Customer Part#"}]},
# {"items": [{"type": "TextBlock", "text": "Qty"}]},
# {"items": [{"type": "TextBlock", "text": "Internal SKU"}]},
# {"items": [{"type": "TextBlock", "text": "Product Name"}]}
# ]}
# ]}
# ],
# "actions": [
# {"type": "Action.Submit", "title": "✅ Approve", "data": {"action": "approve", "po_id": "{{po_number}}"}},
# {"type": "Action.Submit", "title": "❌ Reject", "data": {"action": "reject", "po_id": "{{po_number}}"}},
# {"type": "Action.Submit", "title": "✏️ Edit & Approve", "data": {"action": "edit", "po_id": "{{po_number}}"}}
# ]
# }
#
# Next Module: Microsoft Teams > Wait for Response
# - Timeout: 4 hours (configurable)
# - If no response in 4 hours: send reminder
# - If no response in 8 hours: escalate to backup approver
#
# Routing after response:
# - approve: proceed to work order creation
# - reject: log rejection, notify sales/CS team
#   - edit: open WO details in ERP for manual adjustment, then auto-complete
The approval step can be removed after the system proves reliable (typically after 2–4 weeks of 100% accuracy). However, many ISO 9001 auditors appreciate having a documented approval record, so clients may choose to keep it permanently. If the client does not use Microsoft Teams, implement email-based approval using Outlook actionable messages or a simple approve/reject link that calls a Make.com webhook. For Power Automate implementations, use the built-in Approvals connector which provides a richer approval UI with mobile support.
Step 10: Build ERP Work Order Creation Integration
Create the final automation step that takes the approved, validated work order data and creates the actual manufacturing work order (or manufacturing order/production order, depending on ERP terminology) in the client's ERP system via its REST API. This step also updates the source PO status and logs the conversion for audit purposes.
# MRPeasy, Odoo, and Fishbowl API examples
# ============================================================
# MRPeasy API - Create Manufacturing Order
# ============================================================
# Make.com HTTP Module:
# Method: POST
# URL: https://api.mrpeasy.com/v1/manufacturing-orders
# Headers:
# Content-Type: application/json
# Authorization: Bearer {mrpeasy_api_key}
# Body:
# {
# "article_id": "{{internal_sku}}",
# "quantity": {{quantity}},
# "deadline": "{{due_date}}",
# "bom_id": "{{bom_id}}",
# "customer_order_number": "{{po_number}}",
# "notes": "Auto-generated from PO #{{po_number}} | Customer: {{customer_name}} | Approved by: {{approver_name}} at {{approval_timestamp}}",
# "priority": "{{priority}}"
# }
# ============================================================
# Odoo XML-RPC - Create Manufacturing Order
# ============================================================
# Python snippet (for n8n Code node or standalone script):
import xmlrpc.client

url = 'https://clientname.odoo.com'
db = 'clientname'
username = 'api_user@clientname.com'  # dedicated integration user
password = 'api_key_here'             # Odoo API key generated for that user

# Authenticate via the common endpoint to obtain the user ID
common = xmlrpc.client.ServerProxy(f'{url}/xmlrpc/2/common')
uid = common.authenticate(db, username, password, {})
models = xmlrpc.client.ServerProxy(f'{url}/xmlrpc/2/object')

# internal_product_id, quantity, bom_id, due_date, po_number come from the
# mapped work order payload built in Step 8
mo_id = models.execute_kw(db, uid, password, 'mrp.production', 'create', [{
    'product_id': internal_product_id,
    'product_qty': quantity,
    'bom_id': bom_id,
    'date_planned_start': due_date,
    'origin': f'PO #{po_number}',
    'company_id': 1,
}])
print(f'Created Manufacturing Order ID: {mo_id}')
# ============================================================
# Fishbowl REST API - Create Work Order
# ============================================================
# Make.com HTTP Module:
# Method: POST
# URL: https://{fishbowl_server}:443/api/work-orders
# Headers:
# Content-Type: application/json
# Authorization: Bearer {fishbowl_token}
# Body:
# {
# "partNumber": "{{internal_sku}}",
# "quantityToManufacture": {{quantity}},
# "dueDate": "{{due_date}}",
# "customerPO": "{{po_number}}",
# "note": "Auto-generated from PO #{{po_number}}"
# }The exact API call varies significantly by ERP. The examples above cover the three most common SMB ERPs. For ERPs without a REST API (older versions of JobBOSS, Epicor Vantage, etc.), use Power Automate Desktop flows (RPA) to fill in the work order creation form in the ERP's desktop client. For Epicor Kinetic, use the REST v2 API with BAQ (Business Activity Query) for data lookups and BO (Business Object) methods for WO creation. Always create the work order with a status of 'Planned' or 'Draft' — never 'Released' — so the production planner can schedule it appropriately.
Step 11: Configure Audit Logging and Document Archival
Set up comprehensive audit logging to track every PO-to-WO conversion. This satisfies ISO 9001 document control, ITAR record-keeping (5-year retention), and SOX requirements (7-year retention). Every automation run must log: source PO document (archived), extracted data, mapping decisions, approval action, created WO ID, and timestamps.
- RunID (Text, auto-generated GUID)
- Timestamp (DateTime)
- SourcePO_Number (Text)
- SourcePO_FileName (Text)
- SourcePO_Link (Hyperlink, to archived PDF)
- CustomerName (Text)
- ExtractedLineItems (Multi-line text, JSON)
- MappedSKUs (Multi-line text, JSON)
- ValidationResult (Choice: Pass/Fail/Partial)
- ApproverName (Text)
- ApprovalTimestamp (DateTime)
- ApprovalDecision (Choice: Approved/Rejected/Edited)
- CreatedWO_ID (Text)
- CreatedWO_Link (Hyperlink, to ERP WO)
- ErrorMessages (Multi-line text)
- ProcessingTimeSeconds (Number)
az storage table create --name POWOAuditLog --account-name clientstorageacct
az storage entity insert --table-name POWOAuditLog --entity PartitionKey=2025-01 RowKey={guid} ...
SharePoint is the easiest option for M365 clients and provides built-in retention policies, eDiscovery, and content search. For high-volume clients (500+ POs/month), Azure Table Storage is more cost-effective. The audit log should be immutable — configure the SharePoint list with 'No delete' permissions for all users except a compliance admin. The original PO PDF must be archived alongside the log entry — this is the source document that auditors will want to see.
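Building the log entry itself is straightforward; a sketch assuming the column names listed above (a subset shown here), with a generated GUID for RunID and a UTC timestamp. The example values are illustrative.

```python
import json
import uuid
from datetime import datetime, timezone

def build_audit_entry(po_number, customer, line_items, wo_id, approver,
                      decision, processing_seconds, errors=""):
    """Build one audit-log record matching (a subset of) the schema above."""
    return {
        "RunID": str(uuid.uuid4()),                          # auto-generated GUID
        "Timestamp": datetime.now(timezone.utc).isoformat(),
        "SourcePO_Number": po_number,
        "CustomerName": customer,
        "ExtractedLineItems": json.dumps(line_items),        # stored as JSON text
        "ApproverName": approver,
        "ApprovalDecision": decision,
        "CreatedWO_ID": wo_id,
        "ErrorMessages": errors,
        "ProcessingTimeSeconds": processing_seconds,
    }

entry = build_audit_entry("PO-20250114", "Acme Corp",
                          [{"sku": "SKU-0042", "qty": 250}],
                          "MO-1088", "J. Smith", "Approved", 42.5)
```

Write the resulting dict to the SharePoint list or Azure Table from the workflow's final module, alongside the archived PO PDF link.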
Step 12: Set Up Error Handling and Alerting
Configure comprehensive error handling in Make.com so that failures are caught, logged, and escalated appropriately. Manufacturing operations cannot afford silent failures — a missed PO means a missed shipment. Set up three tiers of alerting: (1) Automated retry for transient errors, (2) MSP notification for system errors, (3) Client notification for business logic errors.
The daily health-check scenario is critical during the first 30 days. It catches any POs that were received but not processed due to errors, format issues, or edge cases. Configure Make.com's built-in execution history retention to 30 days minimum for troubleshooting. For Power Automate implementations, use the built-in 'Run after' failure configuration and Flow Analytics dashboard.
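Tier 1 (automated retry for transient errors) follows a standard retry-with-exponential-backoff pattern; Make.com has this built in per module, but the logic is worth stating explicitly for n8n Code nodes or standalone scripts. Attempt counts and delays below are illustrative defaults, not prescribed values.

```python
import time

def retry(fn, attempts=3, base_delay=1.0, backoff=2.0, sleep=time.sleep):
    """Call fn(); on exception, retry up to `attempts` total tries, doubling
    the delay each time. Re-raises the last error so the caller can escalate
    to tier 2/3 alerting (MSP / client notification)."""
    delay = base_delay
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise          # out of retries -> escalate
            sleep(delay)
            delay *= backoff

# Usage sketch:
#   retry(lambda: submit_wo_to_erp(payload), attempts=3, base_delay=2.0)
```

Only transient errors (timeouts, HTTP 429/5xx) belong in tier 1; business-logic failures like an unknown part number should skip retries and go straight to the review queue.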
Step 13: Parallel Testing (Shadow Mode)
Run the automation in parallel with the existing manual process for 2–4 weeks. During this period, the automation processes every PO and generates proposed work orders, but they are NOT created in the ERP. Instead, the proposed WO data is logged to a comparison spreadsheet alongside the manually created WO data. The client's staff continues to process POs manually as before. At the end of the parallel period, compare automated vs. manual results to measure accuracy.
- Modify the ERP API call module to DISABLE actual creation
- In Make.com, add a boolean variable at the start of the scenario: SHADOW_MODE = true
- Before the ERP API call, add a Router with two branches:
- Branch 1 (SHADOW_MODE = true): Log proposed WO to SharePoint list 'WO_Shadow_Comparison' with columns: PO_Number, ProposedSKU, ProposedQty, ProposedDueDate, ManualWO_Number (filled in by staff), Match (formula)
- Branch 2 (SHADOW_MODE = false): Execute actual ERP API call to create WO
- Instruct client staff to manually fill in the ManualWO_Number column in the comparison spreadsheet after they create each WO manually
- After 2 weeks, run comparison analysis covering: Total POs processed, Automated extraction accuracy %, Field-level accuracy (PO#, Qty, Part#, Date), SKU mapping accuracy %, False positives (wrong WO proposed), False negatives (PO missed entirely)
- Target: 95%+ accuracy before going live
- If <95%: identify failure patterns, adjust OCR templates or crosswalk
- If >98%: can reduce/eliminate approval step
Shadow mode is non-negotiable for manufacturing. The consequences of a wrong work order are real: wrong materials ordered, wrong parts machined, missed deliveries. Two weeks is the minimum parallel period. Have the client's most experienced PO processor review every comparison entry. Document all discrepancies and their root causes. Common issues: (1) OCR misreads handwritten POs, (2) customer uses abbreviations not in crosswalk, (3) multi-page POs with continuation lines, (4) POs with special instructions that need human interpretation.
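The end-of-period comparison analysis can be scripted against an export of the comparison list. This is a sketch; the paired "Manual…" column names are an assumption for illustration (the list described above records only ManualWO_Number, so the manual SKU/qty/date columns would need to be added or joined from the ERP).

```python
# Assumed proposed -> manual column pairing for the exported comparison rows.
FIELD_PAIRS = {
    "ProposedSKU": "ManualSKU",
    "ProposedQty": "ManualQty",
    "ProposedDueDate": "ManualDueDate",
}

def field_accuracy(rows):
    """Per-field match rate (percent) across all shadow-mode rows, plus the
    simple average as an overall figure to compare against the 95% gate."""
    report = {}
    for proposed, manual in FIELD_PAIRS.items():
        matches = sum(1 for r in rows if r[proposed] == r[manual])
        report[proposed] = round(100.0 * matches / len(rows), 1)
    report["overall"] = round(sum(report[p] for p in FIELD_PAIRS) / len(FIELD_PAIRS), 1)
    return report

rows = [
    {"ProposedSKU": "SKU-0042", "ManualSKU": "SKU-0042",
     "ProposedQty": 250, "ManualQty": 250,
     "ProposedDueDate": "2025-02-01", "ManualDueDate": "2025-02-01"},
    {"ProposedSKU": "SKU-0043", "ManualSKU": "SKU-0044",   # mismapped SKU
     "ProposedQty": 10, "ManualQty": 10,
     "ProposedDueDate": "2025-02-03", "ManualDueDate": "2025-02-03"},
]
report = field_accuracy(rows)
```

Per-field rates matter more than the overall number: a 50% SKU match with perfect quantities points at the crosswalk, not the OCR.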
Step 14: Go-Live and Hypercare
After achieving 95%+ accuracy in shadow mode, switch the automation to live mode. The ERP API calls are now active and work orders are created automatically (after human approval). During the 2-week hypercare period, the MSP monitors all executions daily, responds to issues within 2 hours, and makes real-time adjustments to handle edge cases discovered in production.
- In Make.com, update the shadow mode variable: SHADOW_MODE = false
- Verify the change by processing a test PO: Send a known test PO to po@clientdomain.com
- Wait for approval notification in Teams
- Approve the work order
- Verify work order appears in ERP with correct fields
- Verify audit log entry is created
- If accuracy remains >98%: transition to standard managed service
- If accuracy is 95-98%: extend hypercare 1 week, address issues
- If accuracy <95%: root cause analysis, likely needs OCR retraining
Go-live should happen on a Monday morning so the MSP has a full work week to monitor. Avoid go-live during the client's busiest shipping periods or month-end close. Keep the client's manual process as a fallback for the first week — production staff should know they can create WOs manually if anything goes wrong. Publish a simple one-pager for the production manager: 'What to do if the automation fails' with steps to manually process the PO and notify the MSP.
Custom AI Components
PO Email Intake Monitor
Type: workflow
Make.com scenario that monitors the shared PO mailbox for new emails with PDF attachments, filters out non-PO emails (auto-replies, spam, newsletters), and routes valid PO documents to the OCR extraction pipeline. Handles both single-attachment and multi-attachment emails, and also monitors a OneDrive folder for scanned paper POs.
Implementation:
{
"name": "PO-Email-Intake-Monitor",
"scheduling": {"type": "interval", "interval": 5},
"modules": [
{
"id": 1,
"module": "microsoft365:watchEmails",
"parameters": {
"mailbox": "po@clientdomain.com",
"folder": "Inbox",
"hasAttachments": true,
"markAsRead": true,
"limit": 10
}
},
{
"id": 2,
"module": "builtin:filter",
"parameters": {
"condition": "{{1.subject}} does not contain 'Auto-Reply' AND {{1.subject}} does not contain 'Out of Office' AND {{1.from.address}} is not in ['noreply@', 'mailer-daemon@']"
}
},
{
"id": 3,
"module": "builtin:iterator",
"parameters": {
"array": "{{1.attachments}}"
}
},
{
"id": 4,
"module": "builtin:filter",
"parameters": {
"condition": "{{3.filename}} matches '*.pdf' OR {{3.filename}} matches '*.PDF' OR {{3.filename}} matches '*.tif' OR {{3.filename}} matches '*.tiff' OR {{3.filename}} matches '*.png'"
}
},
{
"id": 5,
"module": "microsoft365:getAttachmentContent",
"parameters": {
"messageId": "{{1.id}}",
"attachmentId": "{{3.id}}"
}
},
{
"id": 6,
"module": "builtin:setVariable",
"parameters": {
"variables": [
{"name": "source_type", "value": "email"},
{"name": "source_email_from", "value": "{{1.from.address}}"},
{"name": "source_email_subject", "value": "{{1.subject}}"},
{"name": "source_email_date", "value": "{{1.receivedDateTime}}"},
{"name": "original_filename", "value": "{{3.filename}}"},
{"name": "file_content", "value": "{{5.data}}"}
]
}
},
{
"id": 7,
"module": "webhook:trigger",
"mapper": "call PO-OCR-Extraction scenario"
}
]
}

Companion scenario for the OneDrive scan folder:

{
"name": "PO-Scan-Folder-Monitor",
"scheduling": {"type": "interval", "interval": 5},
"modules": [
{
"id": 1,
"module": "onedrive:watchFiles",
"parameters": {
"folder": "/PurchaseOrders/Incoming/",
"fileTypes": ["pdf", "tif", "tiff", "png"]
}
},
{
"id": 2,
"module": "onedrive:downloadFile",
"parameters": {
"fileId": "{{1.id}}"
}
},
{
"id": 3,
"module": "onedrive:moveFile",
"parameters": {
"fileId": "{{1.id}}",
"destination": "/PurchaseOrders/Processing/"
}
},
{
"id": 4,
"module": "builtin:setVariable",
"parameters": {
"variables": [
{"name": "source_type", "value": "scan"},
{"name": "original_filename", "value": "{{1.name}}"},
{"name": "file_content", "value": "{{2.data}}"}
]
}
},
{
"id": 5,
"module": "webhook:trigger",
"mapper": "call PO-OCR-Extraction scenario"
}
]
}
- Set polling to 5 minutes; reduce to 1 minute for high-urgency shops
- The filter in module 2 prevents processing auto-replies and bounce-backs
- The attachment filter in module 4 ensures only document files are processed
- Scanned files are moved from Incoming to Processing to prevent re-processing
- Both scenarios feed into the same OCR extraction scenario via webhook call
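The two filter modules reduce to simple predicate checks. A sketch of the same logic outside Make.com (sender prefixes, subject markers, and extensions are taken from the modules above):

```python
BLOCKED_SENDER_PREFIXES = ("noreply@", "mailer-daemon@")
BLOCKED_SUBJECT_MARKERS = ("auto-reply", "out of office")
DOCUMENT_EXTENSIONS = (".pdf", ".tif", ".tiff", ".png")

def is_candidate_po_email(subject, sender):
    """Mirror of module 2: drop auto-replies, bounces, and no-reply senders."""
    s = subject.lower()
    if any(marker in s for marker in BLOCKED_SUBJECT_MARKERS):
        return False
    return not sender.lower().startswith(BLOCKED_SENDER_PREFIXES)

def is_document_attachment(filename):
    """Mirror of module 4: only forward document file types to OCR."""
    return filename.lower().endswith(DOCUMENT_EXTENSIONS)
```

If a client's customers attach POs as Excel or Word files, extend DOCUMENT_EXTENSIONS and add a conversion step before OCR rather than loosening the filter wholesale.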
Azure Document Intelligence PO Extractor
Type: integration
Sends PO documents to Azure AI Document Intelligence prebuilt-invoice model, handles the asynchronous polling pattern, extracts and normalizes all PO fields into a standardized internal format regardless of the original PO layout. Handles multi-page POs and continuation pages.
Implementation:
## Make.com Scenario Module Sequence: PO-OCR-Extraction
### Module Sequence
Webhook Trigger (from intake)
→ POST to Azure Analyze
→ Sleep 8s
→ GET Result (with retry loop)
→ Parse JSON
→ Normalize Fields
→ Output to Mapping Scenario
### Module 1: HTTP - Submit Document for Analysis
Method: POST
URL: {{AZURE_ENDPOINT}}/formrecognizer/documentModels/prebuilt-invoice:analyze?api-version=2023-07-31
Headers:
Ocp-Apim-Subscription-Key: {{AZURE_KEY}}
Content-Type: application/pdf
Body Type: Binary
Body: {{webhook.file_content}}
Parse Response: Yes
Advanced:
Timeout: 30 seconds
Follow Redirect: Yes
### Module 2: Tools - Sleep
Delay: 8 seconds
### Module 3: HTTP - Poll for Results (with Repeater)
Method: GET
URL: {{1.headers.operation-location}}
Headers:
Ocp-Apim-Subscription-Key: {{AZURE_KEY}}
Parse Response: Yes
Error Handler:
- If status != 'succeeded': Retry (max 5, interval 5s)
- If status == 'failed': Break + notify MSP
### Module 4: Code - Field Extraction Mapping
// Field Extraction Mapping
// Azure prebuilt-invoice field -> Internal field name
const result = analyzeResult.documents[0].fields;
const extractedPO = {
// Header fields
po_number: result.InvoiceId?.content || result.PurchaseOrder?.content || 'UNKNOWN',
po_date: result.InvoiceDate?.content || result.InvoiceDate?.valueDate || null,
due_date: result.DueDate?.content || result.DueDate?.valueDate || null,
// Customer information (note: on a PO, the 'Vendor' in invoice model = the customer)
customer_name: result.VendorName?.content || result.CustomerName?.content || '',
customer_address: result.VendorAddress?.content || '',
customer_id: result.VendorAddressRecipient?.content || result.CustomerId?.content || '',
// Ship-to information
ship_to_name: result.ShippingAddressRecipient?.content || '',
ship_to_address: result.ShippingAddress?.content || '',
// Financial
subtotal: result.SubTotal?.valueCurrency?.amount || 0,
total: result.InvoiceTotal?.valueCurrency?.amount || 0,
currency: result.InvoiceTotal?.valueCurrency?.currencyCode || 'USD',
// Payment terms
payment_terms: result.PaymentTerm?.content || '',
// Line items
line_items: (result.Items?.valueArray || []).map((item, index) => ({
line_number: index + 1,
description: item.valueObject?.Description?.content || '',
product_code: item.valueObject?.ProductCode?.content || '',
quantity: item.valueObject?.Quantity?.valueNumber || 0,
unit: item.valueObject?.Unit?.content || 'EA',
unit_price: item.valueObject?.UnitPrice?.valueCurrency?.amount || 0,
amount: item.valueObject?.Amount?.valueCurrency?.amount || 0,
date: item.valueObject?.Date?.content || null
})),
// Confidence scores
confidence: {
overall: analyzeResult.documents[0].confidence,
po_number: result.InvoiceId?.confidence || 0,
customer_name: result.VendorName?.confidence || 0
},
// Metadata
page_count: analyzeResult.pages?.length || 1,
extraction_timestamp: new Date().toISOString(),
source_type: webhook.source_type,
original_filename: webhook.original_filename
};
return extractedPO;
### Module 5: Filter - Confidence Check
Condition: extractedPO.confidence.overall >= 0.70
AND extractedPO.po_number != 'UNKNOWN'
AND extractedPO.line_items.length > 0
If PASS: Continue to mapping scenario
If FAIL: Route to manual review queue with low-confidence alert
Note: to use a custom-trained extraction model instead of prebuilt-invoice, point Module 1 at:
{{AZURE_ENDPOINT}}/formrecognizer/documentModels/{{CUSTOM_MODEL_ID}}:analyze?api-version=2023-07-31
Customer Part Number Crosswalk Mapper
Type: skill
Takes extracted PO line items and maps customer-specific part numbers to internal SKUs, BOM IDs, and product names using the crosswalk lookup table. Handles fuzzy matching for common variations (dashes, spaces, leading zeros) and maintains a confidence score for each mapping. Flags unmapped items for human review and supports the 'teach' workflow where new mappings are added by production staff.
SharePoint List Schema: PO_SKU_Crosswalk
Columns:
- ID (auto, SharePoint built-in)
- CustomerName (Single line text, indexed)
- CustomerPartNumber (Single line text, indexed)
- CustomerPartVariants (Multi-line text) // JSON array of known variations
- InternalSKU (Single line text)
- ProductName (Single line text)
- BOM_ID (Single line text)
- DefaultUOM (Choice: EA, LB, FT, KG, GAL, BOX, SET)
- UOMConversionFactor (Number, default 1) // e.g., customer orders in DOZ, internal is EA, factor = 12
- DefaultLeadTimeDays (Number)
- IsActive (Yes/No, default Yes)
- LastUsedDate (DateTime)
- CreatedBy (Person)
- Notes (Multi-line text)
Make.com Mapping Module Sequence
For each line_item in extractedPO.line_items:
Module A: SharePoint > Search List Items
List: PO_SKU_Crosswalk
Filter:
CustomerName = {{extractedPO.customer_name}}
AND CustomerPartNumber = {{line_item.product_code}}
AND IsActive = Yes
Limit: 1
Branch 1 (Exact Match Found):
Set mapped_sku = {{A.InternalSKU}}
Set mapped_bom = {{A.BOM_ID}}
Set mapped_qty = {{line_item.quantity}} * {{A.UOMConversionFactor}}
Set mapping_confidence = 1.0
Set mapping_method = 'exact'
// Update LastUsedDate
SharePoint > Update List Item
ID: {{A.ID}}
LastUsedDate: now()
# Router Branch 2 Setup (No Exact Match, Try Fuzzy)
Branch 2 (No Exact Match - Try Fuzzy):
// Normalize the customer part number
Set normalized_part = UPPER(REPLACE(REPLACE(REPLACE(
{{line_item.product_code}}, '-', ''), ' ', ''), '.', ''))
// Search with normalized comparison
SharePoint > Search List Items
Filter: CustomerName = {{extractedPO.customer_name}}
AND IsActive = Yes
Limit: 100 // get all for this customer
// Fuzzy matching code module (JavaScript)
const normalizePartNum = (pn) =>
pn.toUpperCase().replace(/[-\s.\/]/g, '').replace(/^0+/, '');
const searchNormalized = normalizePartNum(line_item.product_code);
// Also check description-based matching
const searchDesc = line_item.description.toLowerCase();
let bestMatch = null;
let bestScore = 0;
for (const row of crosswalkRows) {
// Check primary part number
const rowNormalized = normalizePartNum(row.CustomerPartNumber);
if (rowNormalized === searchNormalized) {
bestMatch = row;
bestScore = 0.95; // normalized exact match
break;
}
// Check variant list
if (row.CustomerPartVariants) {
const variants = JSON.parse(row.CustomerPartVariants);
for (const v of variants) {
if (normalizePartNum(v) === searchNormalized) {
bestMatch = row;
bestScore = 0.90;
break;
}
}
}
// Check if part number is contained in description
if (searchDesc.includes(row.InternalSKU.toLowerCase()) ||
searchDesc.includes(row.ProductName.toLowerCase())) {
if (bestScore < 0.70) {
bestMatch = row;
bestScore = 0.70;
}
}
}
return { bestMatch, bestScore };
- If bestScore >= 0.85: Use match, flag for verification
- If bestScore 0.50–0.84: Suggest match, require approval
- If bestScore < 0.50 or null: No match, flag for manual mapping
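The three-way routing on bestScore can be expressed as a single dispatch function (the branch names returned here are illustrative, not Make.com identifiers):

```python
def route_mapping(best_score):
    """Map a fuzzy-match confidence score to the next workflow step,
    using the thresholds defined above."""
    if best_score is None or best_score < 0.50:
        return "manual_mapping"          # Branch 3: Teams adaptive card to staff
    if best_score < 0.85:
        return "suggest_with_approval"   # suggest the SKU, require human approval
    return "use_with_verification"       # auto-use the match, flag for spot check
```

Keeping the thresholds in one function (or one scenario variable) makes it easy to tighten them per client during hypercare.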
# Router Branch 3 (No Match, Manual Review via Teams Adaptive Card)
Branch 3 (No Match Found - Manual Review):
// Create Teams adaptive card for manual mapping
Teams > Send Adaptive Card to Production Manager:
Title: 'New Part Number Mapping Required'
Body:
PO: {{extractedPO.po_number}}
Customer: {{extractedPO.customer_name}}
Customer Part#: {{line_item.product_code}}
Description: {{line_item.description}}
Qty: {{line_item.quantity}}
[Dropdown: Select Internal SKU from ERP item list]
[Submit: Save Mapping]
// On response: Create new crosswalk entry
SharePoint > Create List Item in PO_SKU_Crosswalk:
CustomerName: {{extractedPO.customer_name}}
CustomerPartNumber: {{line_item.product_code}}
InternalSKU: {{teams_response.selected_sku}}
BOM_ID: (lookup from ERP)
CreatedBy: {{teams_response.responder}}
Initial Crosswalk Population Script
import pandas as pd
# Export from ERP: past 12 months of sales orders with customer PO references
history = pd.read_csv('sales_order_history.csv')
# Group by customer + their part number, find the internal SKU used
crosswalk = history.groupby(['customer_name', 'customer_part_number']).agg({
'internal_sku': 'first',
'product_name': 'first',
'bom_id': 'first',
'uom': 'first',
'order_date': 'max' # most recent usage
}).reset_index()
crosswalk.to_csv('initial_crosswalk.csv', index=False)
print(f'Generated {len(crosswalk)} crosswalk mappings')
# Import this CSV into the SharePoint list
Work Order Payload Builder
Type: skill
Takes validated and mapped PO data and constructs the ERP-specific API payload for work order creation. Handles ERP-specific field requirements, date calculations (lead time, scheduling), priority mapping, and multi-line PO to multiple work order splitting logic. Supports MRPeasy, Odoo, and Fishbowl API formats.
Implementation:
Make.com Code Module (JavaScript)
// Make.com Code Module (JavaScript)
// Input: validated PO with mapped SKUs
// Output: Array of WO payloads ready for ERP API
function buildWorkOrderPayloads(validatedPO, erpType) {
const workOrders = [];
for (const item of validatedPO.line_items) {
if (!item.mapped_sku || item.mapping_confidence < 0.85) {
continue; // Skip unmapped items (already flagged for review)
}
// Calculate dates
const poDate = new Date(validatedPO.po_date);
const requestedDate = validatedPO.due_date
? new Date(validatedPO.due_date)
: new Date(poDate.getTime() + (item.default_lead_time_days || 14) * 86400000);
// Determine priority
let priority = 'normal';
const poText = (validatedPO.payment_terms + ' ' + (validatedPO.notes || '')).toUpperCase();
if (poText.includes('RUSH') || poText.includes('EXPEDITE') || poText.includes('HOT')) {
priority = 'high';
} else if (poText.includes('BLANKET') || poText.includes('SCHEDULE')) {
priority = 'low';
}
// Calculate quantity with UOM conversion
const quantity = item.quantity * (item.uom_conversion_factor || 1);
// Build notes/reference string
const notes = [
`Auto-generated from PO #${validatedPO.po_number}`,
`Customer: ${validatedPO.customer_name}`,
`Customer Part#: ${item.original_product_code}`,
`PO Line: ${item.line_number}`,
`Approved by: ${validatedPO.approver_name} at ${validatedPO.approval_timestamp}`,
item.notes ? `Special Instructions: ${item.notes}` : ''
].filter(Boolean).join(' | ');
// Build ERP-specific payload
let payload;
switch (erpType) {
case 'mrpeasy':
payload = {
endpoint: 'POST /v1/manufacturing-orders',
body: {
article_id: item.mapped_sku,
quantity: quantity,
deadline: requestedDate.toISOString().split('T')[0],
bom_id: item.mapped_bom_id,
customer_order_number: validatedPO.po_number,
notes: notes,
priority: priority === 'high' ? 1 : (priority === 'low' ? 3 : 2),
status: 'planned' // Never auto-release
}
};
break;
case 'odoo':
payload = {
endpoint: 'POST /api/v1/mrp.production',
body: {
product_id: parseInt(item.mapped_sku),
product_qty: quantity,
bom_id: parseInt(item.mapped_bom_id),
date_planned_start: requestedDate.toISOString(),
origin: `PO #${validatedPO.po_number}`,
priority: priority === 'high' ? '1' : '0', // Odoo mrp.production: '0' normal, '1' urgent
state: 'confirmed', // Odoo: draft -> confirmed -> planned -> progress -> done
company_id: 1
}
};
break;
case 'fishbowl':
payload = {
endpoint: 'POST /api/work-orders',
body: {
partNumber: item.mapped_sku,
quantityToManufacture: quantity,
dueDate: requestedDate.toISOString().split('T')[0],
customerPO: validatedPO.po_number,
note: notes,
priority: priority === 'high' ? 5 : (priority === 'low' ? 1 : 3),
statusId: 10 // Issued/Planned
}
};
break;
case 'epicor':
payload = {
endpoint: 'POST /api/v2/odata/Erp.BO.JobEntrySvc/Jobs',
body: {
PartNum: item.mapped_sku,
ProdQty: quantity,
ReqDueDate: requestedDate.toISOString().split('T')[0],
PONum: validatedPO.po_number,
CommentText: notes,
JobFirm: false, // Unfirm initially
JobEngineered: true,
JobReleased: false
}
};
break;
default:
throw new Error(`Unsupported ERP type: ${erpType}`);
}
workOrders.push({
...payload,
_meta: {
po_number: validatedPO.po_number,
po_line: item.line_number,
customer: validatedPO.customer_name,
customer_part: item.original_product_code,
internal_sku: item.mapped_sku,
quantity: quantity,
priority: priority,
requested_date: requestedDate.toISOString()
}
});
}
return workOrders;
}
// Execute
const erpType = 'mrpeasy'; // Configure per client
const payloads = buildWorkOrderPayloads(validatedPO, erpType);
return payloads;
Multi-WO Splitting Rules
- One WO per PO line item (standard): Each line item becomes its own manufacturing order
- Grouped by product: If multiple PO lines reference the same SKU, optionally combine into one WO with summed quantity
- Split by capacity: If WO quantity exceeds single-run capacity (configured per product), split into multiple WOs
- Configure splitting behavior via a WO_GENERATION_RULES variable at scenario level
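The three splitting rules can be sketched as one function. The rules dict stands in for the scenario-level WO_GENERATION_RULES variable; its key names are assumptions for illustration:

```python
import math

def split_work_orders(line_items, rules):
    """Apply WO generation rules to mapped PO line items.

    rules = {"group_by_sku": bool, "run_capacity": {sku: max_qty_per_run}}
    Each line item: {"sku": str, "qty": int}. Returns a list of WO dicts.
    """
    items = line_items
    # Rule 2: optionally combine lines referencing the same SKU
    if rules.get("group_by_sku"):
        combined = {}
        for it in items:
            combined[it["sku"]] = combined.get(it["sku"], 0) + it["qty"]
        items = [{"sku": s, "qty": q} for s, q in combined.items()]
    # Rule 1 (default) and Rule 3: one WO per item, split when over run capacity
    work_orders = []
    for it in items:
        cap = rules.get("run_capacity", {}).get(it["sku"])
        if cap and it["qty"] > cap:
            remaining = it["qty"]
            for _ in range(math.ceil(it["qty"] / cap)):
                take = min(cap, remaining)
                work_orders.append({"sku": it["sku"], "qty": take})
                remaining -= take
        else:
            work_orders.append(dict(it))
    return work_orders
```

For example, two lines of the same SKU (40 + 20) against a run capacity of 50 yield two WOs of 50 and 10 when grouping is enabled.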
PO-to-WO Audit Logger
Type: integration
Comprehensive audit logging component that records every step of the PO-to-WO conversion process. Creates immutable records in SharePoint with links to source documents and generated work orders. Supports ISO 9001, ITAR, and SOX retention requirements.
SharePoint List: PO_WO_Audit_Log — List Schema
{
"listName": "PO_WO_Audit_Log",
"columns": [
{"name": "RunID", "type": "SingleLineText", "indexed": true},
{"name": "Timestamp", "type": "DateTime", "indexed": true},
{"name": "Stage", "type": "Choice", "choices": ["Received", "OCR_Complete", "Mapped", "Validation_Pass", "Validation_Fail", "Approval_Sent", "Approved", "Rejected", "WO_Created", "WO_Failed", "Error"]},
{"name": "PONumber", "type": "SingleLineText", "indexed": true},
{"name": "POFileName", "type": "SingleLineText"},
{"name": "POArchiveLink", "type": "Hyperlink"},
{"name": "CustomerName", "type": "SingleLineText", "indexed": true},
{"name": "SourceType", "type": "Choice", "choices": ["Email", "Scan", "EDI", "Portal"]},
{"name": "ExtractedDataJSON", "type": "MultiLineText"},
{"name": "MappedDataJSON", "type": "MultiLineText"},
{"name": "OCRConfidence", "type": "Number"},
{"name": "MappingConfidence", "type": "Number"},
{"name": "LineItemCount", "type": "Number"},
{"name": "MappedItemCount", "type": "Number"},
{"name": "UnmappedItemCount", "type": "Number"},
{"name": "ValidationErrors", "type": "MultiLineText"},
{"name": "ApproverName", "type": "SingleLineText"},
{"name": "ApprovalTimestamp", "type": "DateTime"},
{"name": "ApprovalDecision", "type": "Choice", "choices": ["Approved", "Rejected", "Edited", "Auto-Approved", "Pending"]},
{"name": "CreatedWOIDs", "type": "MultiLineText"},
{"name": "ERPResponse", "type": "MultiLineText"},
{"name": "ProcessingTimeMs", "type": "Number"},
{"name": "ErrorDetails", "type": "MultiLineText"},
{"name": "MSPTechNotes", "type": "MultiLineText"}
],
"settings": {
"versioningEnabled": true,
"majorVersionLimit": 500,
"enableAttachments": false,
"noCrawl": false
}
}
Make.com Logger Module (reusable sub-scenario)
- Scenario: PO-WO-Audit-Logger
- Trigger: Webhook (called by other scenarios at each stage)
- Input Parameters: run_id (string), stage (string enum), po_data (object, varies by stage), error (string, optional)
- Module 1: SharePoint > Create List Item — List: PO_WO_Audit_Log, Fields: Map from input parameters
- Module 2: Conditional (if stage == 'Error' or stage == 'WO_Failed') — HTTP > POST to error notification webhook
Logging Points in Main Workflow
Retention Policy PowerShell
Connect-PnPOnline -Url 'https://clientname.sharepoint.com/sites/Manufacturing' -Interactive
# Retention labels must be created in Microsoft Purview first (Security & Compliance
# PowerShell), then applied to the list via PnP.PowerShell. Sketch — verify cmdlet
# availability for your module versions:
Connect-IPPSSession
New-RetentionComplianceTag -Name 'PO-WO-Audit-7Year' `
    -RetentionDuration 2555 `
    -RetentionAction Keep `
    -IsRecordLabel $true
# Apply the published label to the audit list
Set-PnPLabel -List 'PO_WO_Audit_Log' -Label 'PO-WO-Audit-7Year'
Monthly Audit Report (scheduled Make.com scenario)
- Trigger: Schedule — 1st of each month at 6:00 AM
- Module 1: SharePoint > Get List Items — Filter: Timestamp >= first day of last month AND Timestamp < first day of this month
- Module 2: Code > Calculate metrics — Total POs processed, Success rate (WO_Created / Received), Average processing time, Top errors by frequency, Unmapped part numbers (need crosswalk updates)
- Module 3: Email > Send report to client production manager + MSP account manager
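The Module 2 metric computation can be sketched as follows; the stage names match the audit log's Choice column, and the dict keys mirror the PO_WO_Audit_Log schema:

```python
from collections import Counter

def monthly_metrics(entries):
    """Summarize one month of audit-log entries.

    entries: list of dicts with 'Stage', optional 'ErrorDetails' and
    'ProcessingTimeMs' keys.
    """
    stages = Counter(e["Stage"] for e in entries)
    received = stages["Received"]
    created = stages["WO_Created"]
    # Average processing time over completed runs that logged a duration
    times = [e["ProcessingTimeMs"] for e in entries
             if e["Stage"] == "WO_Created" and e.get("ProcessingTimeMs")]
    # Rank error messages by frequency to spot systemic issues
    errors = Counter(e.get("ErrorDetails", "unknown")
                     for e in entries if e["Stage"] in ("Error", "WO_Failed"))
    return {
        "pos_received": received,
        "wos_created": created,
        "success_rate": created / received if received else 0.0,
        "avg_processing_sec": sum(times) / len(times) / 1000 if times else None,
        "top_errors": errors.most_common(3),
    }
```

The top_errors list doubles as the crosswalk-gap signal: recurring "no mapping found" messages mean the quarterly crosswalk review should happen sooner.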
Daily Health Check Monitor
Type: workflow
Scheduled daily workflow that verifies the automation pipeline is functioning correctly. Compares PO intake count to work order creation count, checks for stuck or failed executions, validates API connectivity to all services, and sends a daily status summary to the MSP and client stakeholders.
Implementation:
## Make.com Scenario: PO-WO-Daily-HealthCheck
### Scenario Configuration
Name: PO-WO-Daily-HealthCheck
Schedule: Daily at 7:00 AM (client local time)
Timeout: 5 minutes
Module 1: Tools > Set Multiple Variables
- check_date: {{formatDate(addDays(now; -1); 'YYYY-MM-DD')}}
- check_start: {{check_date}}T00:00:00Z
- check_end: {{check_date}}T23:59:59Z
- alerts: [] (empty array)
Module 2: SharePoint > Get List Items (Audit Log)
List: PO_WO_Audit_Log
Filter: Timestamp >= {{check_start}} AND Timestamp <= {{check_end}}
Limit: 500
Module 3: Code > Compute Daily Metrics (JavaScript)
const entries = items; // from Module 2
const alerts = [];
// Count by stage
const received = entries.filter(e => e.Stage === 'Received').length;
const ocrDone = entries.filter(e => e.Stage === 'OCR_Complete').length;
const mapped = entries.filter(e => e.Stage === 'Mapped').length;
const approved = entries.filter(e => e.Stage === 'Approved').length;
const created = entries.filter(e => e.Stage === 'WO_Created').length;
const errors = entries.filter(e => e.Stage === 'Error' || e.Stage === 'WO_Failed');
const pending = entries.filter(e => e.Stage === 'Approval_Sent' &&
!entries.some(e2 => e2.RunID === e.RunID && ['Approved','Rejected'].includes(e2.Stage)));
// Alert: POs received but no WO created (dropped POs)
const receivedRunIDs = new Set(entries.filter(e => e.Stage === 'Received').map(e => e.RunID));
const createdRunIDs = new Set(entries.filter(e => e.Stage === 'WO_Created').map(e => e.RunID));
const droppedIDs = [...receivedRunIDs].filter(id => !createdRunIDs.has(id));
const droppedNonPending = droppedIDs.filter(id =>
!pending.some(p => p.RunID === id));
if (droppedNonPending.length > 0) {
alerts.push({
severity: 'HIGH',
message: `${droppedNonPending.length} PO(s) received but no WO created (not pending approval). Run IDs: ${droppedNonPending.join(', ')}`
});
}
// Alert: approvals sent yesterday still pending (all are 8+ hours old by the 7 AM check)
if (pending.length > 0) {
alerts.push({
severity: 'MEDIUM',
message: `${pending.length} PO(s) awaiting approval for 8+ hours. PO numbers: ${pending.map(p => p.PONumber).join(', ')}`
});
}
// Alert: Error rate above threshold
if (received > 0 && errors.length / received > 0.1) {
alerts.push({
severity: 'HIGH',
message: `Error rate is ${((errors.length/received)*100).toFixed(1)}% (${errors.length}/${received}). Review error details.`
});
}
// Alert: Zero POs (might mean intake is broken)
const dayOfWeek = new Date(check_date).getDay();
if (received === 0 && dayOfWeek >= 1 && dayOfWeek <= 5) {
alerts.push({
severity: 'MEDIUM',
message: 'No POs received on a weekday. Verify email intake is functioning.'
});
}
// Calculate average processing time
const completedRuns = entries.filter(e => e.Stage === 'WO_Created' && e.ProcessingTimeMs);
const avgTime = completedRuns.length > 0
? (completedRuns.reduce((sum, e) => sum + e.ProcessingTimeMs, 0) / completedRuns.length / 1000).toFixed(1)
: 'N/A';
return {
summary: {
date: check_date,
pos_received: received,
ocr_completed: ocrDone,
successfully_mapped: mapped,
approved: approved,
wos_created: created,
errors: errors.length,
pending_approval: pending.length,
avg_processing_time_sec: avgTime,
success_rate: received > 0 ? ((created/received)*100).toFixed(1) + '%' : 'N/A'
},
alerts: alerts,
error_details: errors.map(e => ({runId: e.RunID, po: e.PONumber, error: e.ErrorDetails}))
};
Module 4: HTTP > Test Azure Doc Intelligence API
Method: GET
URL: {{AZURE_ENDPOINT}}/formrecognizer/info?api-version=2023-07-31
Headers: Ocp-Apim-Subscription-Key: {{AZURE_KEY}}
Expected: HTTP 200
On failure: Add to alerts array
Module 5: HTTP > Test ERP API Connectivity
Method: GET
URL: (ERP health endpoint, e.g., MRPeasy: GET /v1/articles?limit=1)
Headers: Authorization: Bearer {{ERP_API_KEY}}
Expected: HTTP 200
On failure: Add to alerts array
Module 6: Router
Branch A (alerts.length > 0 AND any HIGH severity):
Email > Send to MSP alerts + Client production manager
Subject: '[ACTION REQUIRED] PO Automation Alert - {{client_name}} - {{check_date}}'
Branch B (alerts.length > 0, MEDIUM only):
Email > Send to MSP monitoring only
Subject: '[INFO] PO Automation Notice - {{client_name}} - {{check_date}}'
Branch C (No alerts - all clear):
Email > Send daily summary to Client production manager
Subject: 'Daily PO Automation Summary - {{check_date}} - {{summary.pos_received}} POs / {{summary.wos_created}} WOs'
Module 7: SharePoint > Create List Item
List: HealthCheck_History
Fields: Date, Summary JSON, Alerts JSON, API_Status
Email body template (HTML):
<h2>PO Automation Daily Report — {{check_date}}</h2>
<table border='1' cellpadding='8'>
<tr><td><b>POs Received</b></td><td>{{summary.pos_received}}</td></tr>
<tr><td><b>WOs Created</b></td><td>{{summary.wos_created}}</td></tr>
<tr><td><b>Success Rate</b></td><td>{{summary.success_rate}}</td></tr>
<tr><td><b>Avg Processing Time</b></td><td>{{summary.avg_processing_time_sec}}s</td></tr>
<tr><td><b>Pending Approvals</b></td><td>{{summary.pending_approval}}</td></tr>
<tr><td><b>Errors</b></td><td>{{summary.errors}}</td></tr>
</table>
{{#if alerts}}
<h3 style='color:red'>⚠️ Alerts</h3>
<ul>
{{#each alerts}}<li><b>[{{severity}}]</b> {{message}}</li>{{/each}}
</ul>
{{/if}}
Testing & Validation
- TEST 1 — Email PO Intake: Send a known test PO PDF as an email attachment to po@clientdomain.com. Verify within 5 minutes that Make.com execution history shows a successful run of the PO-Email-Intake-Monitor scenario. Confirm the email is marked as read and the PDF appears in the SharePoint archive folder.
- TEST 2 — Scanned PO Intake: Scan a paper PO using the Fujitsu ScanSnap iX1600. Verify the PDF auto-saves to the OneDrive /PurchaseOrders/Incoming/ folder and triggers the PO-Scan-Folder-Monitor scenario within 5 minutes. Confirm the file is moved to /PurchaseOrders/Processing/.
- TEST 3 — OCR Field Extraction Accuracy: Process 10 sample POs from 5 different customers through Azure AI Document Intelligence. For each, compare extracted fields (PO number, date, customer name, line items, quantities, unit prices) against the actual PO document. Target: 95%+ field-level accuracy across all 10 POs. Document any fields that extract incorrectly — these indicate a need for custom model training.
- TEST 4 — Crosswalk Mapping (Known Parts): Process a PO containing 3 line items with customer part numbers that ARE in the crosswalk table. Verify all 3 items are correctly mapped to internal SKUs with confidence = 1.0 and no manual review flags.
- TEST 5 — Crosswalk Mapping (Unknown Parts): Process a PO containing 1 line item with a customer part number that is NOT in the crosswalk table. Verify the automation: (a) does NOT create a work order for this item, (b) sends a Teams adaptive card to the production manager requesting manual mapping, (c) logs a 'Validation_Fail' entry in the audit log.
- TEST 6 — Fuzzy Matching: Add a crosswalk entry for customer part 'ABC-1234'. Send a PO with the same part numbered as 'ABC1234' (no dash). Verify the fuzzy matching logic finds the correct mapping with confidence ≥ 0.90.
- TEST 7 — Human Approval Workflow: Process a valid PO through the full pipeline. Verify the production manager receives a Teams adaptive card with correct PO summary. Test all three actions: Approve (verify WO is created), Reject (verify WO is NOT created and rejection is logged), and Edit (verify WO details are flagged for manual editing).
- TEST 8 — Approval Timeout Escalation: Process a PO and intentionally do not respond to the approval request. Verify a reminder is sent after 4 hours and an escalation to the backup approver is sent after 8 hours.
- TEST 9 — ERP Work Order Creation: After approving a test PO, verify in the ERP system that: (a) a manufacturing/work order exists with the correct SKU, quantity, and due date, (b) the WO status is 'Planned' or 'Draft' (not released), (c) the WO notes contain the source PO number, customer name, and approver information, (d) the BOM is correctly linked.
- TEST 10 — Multi-Line PO Processing: Process a PO with 5+ line items, including: 2 known parts, 1 unknown part, and 2 parts requiring UOM conversion. Verify: known parts generate WOs, unknown part is flagged, UOM conversion is applied correctly (e.g., customer orders 3 dozen, WO shows 36 each).
- TEST 11 — Audit Log Completeness: After processing a PO end-to-end, query the PO_WO_Audit_Log SharePoint list for that RunID. Verify entries exist for all stages: Received, OCR_Complete, Mapped, Validation_Pass, Approval_Sent, Approved, WO_Created. Verify each entry has correct timestamps, PO number, and relevant data.
- TEST 12 — Error Handling (OCR Failure): Send a corrupted or blank PDF to the PO mailbox. Verify the automation: (a) does NOT crash, (b) logs an Error entry in the audit log, (c) sends an error notification to the MSP alert email, (d) does NOT create any work orders.
- TEST 13 — Error Handling (ERP API Down): Temporarily revoke the ERP API key or block outbound traffic to the ERP endpoint. Send a valid PO through the pipeline. Verify the automation retries 3 times, then logs a WO_Failed entry, and sends a CRITICAL alert to the MSP and client.
- TEST 14 — Daily Health Check: After running test POs for a full day, verify the Daily HealthCheck scenario fires at 7:00 AM the next day. Check that the email summary contains accurate counts matching the audit log. If errors were intentionally injected, verify they appear in the alerts section.
- TEST 15 — Duplicate PO Detection: Send the same PO PDF twice (same PO number). Verify the automation detects the duplicate (by PO number + customer name match in audit log) and flags it for review instead of creating duplicate work orders.
- TEST 16 — End-to-End Timing: Measure the total elapsed time from PO email arrival to work order appearing in the ERP (including approval time). Target: under 2 minutes of processing time (excluding human approval wait). Document the actual timing for the client handoff report.
- TEST 17 — Volume Stress Test: Send 20 POs to the mailbox within a 5-minute window. Verify all 20 are processed without errors, no POs are dropped, and Make.com execution queue handles the burst without timeout. Verify Azure Document Intelligence rate limits are not exceeded.
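The duplicate check exercised by TEST 15 is a lookup against prior audit-log entries, keyed on customer plus PO number. A sketch, normalizing case and whitespace so trivially different re-sends still match (field names mirror the PO_WO_Audit_Log schema):

```python
def is_duplicate_po(po_number, customer_name, audit_entries):
    """Return True if this customer/PO pair has already been received.

    audit_entries: prior audit-log rows as dicts with 'Stage',
    'PONumber', and 'CustomerName' keys.
    """
    key = (customer_name.strip().lower(), po_number.strip().upper())
    for e in audit_entries:
        if e.get("Stage") != "Received":
            continue
        if (e.get("CustomerName", "").strip().lower(),
                e.get("PONumber", "").strip().upper()) == key:
            return True
    return False
```

In the live workflow this check runs right after intake; a hit routes the PO to review (it may be a revision) rather than silently discarding it.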
Client Handoff
The client handoff meeting should be a 2-hour session with the production manager, office administrator (whoever currently processes POs), and the designated approver. Cover these topics:
Maintenance
- Weekly Maintenance (15-30 min, MSP responsibility): Review Make.com execution history for warnings or partial failures
- Check Azure AI Document Intelligence usage metrics against budget
- Review audit log for any recurring mapping failures that indicate crosswalk gaps
- Verify daily health check emails are being sent and received
- Monthly Maintenance (1-2 hours, MSP responsibility):
  - Review monthly processing metrics: POs received, WOs created, success rate, avg processing time
  - Check for Make.com platform updates and apply any scenario updates if connectors change
  - Verify SharePoint storage consumption for audit logs and archived POs
  - Update Azure AI Document Intelligence API version if new versions are available (test in dev first)
  - Review ERP API changelog for any breaking changes in upcoming releases
  - Check and rotate API keys/secrets if required by security policy (recommended: every 90 days)
- Quarterly Optimization Review (2-4 hours, MSP + client):
  - Present automation performance report to client stakeholders
  - Review unmapped part number frequency — add commonly occurring ones to crosswalk
  - Assess if new customer PO formats need OCR template adjustments or custom model retraining
  - Evaluate if the human approval step can be reduced or eliminated based on accuracy data
  - Review client's ERP/MRP for any changes (new products, new BOMs) that affect the automation
  - Discuss any process changes the client wants (new PO sources, additional validations, etc.)
  - Update documentation as needed
- Trigger-Based Maintenance (as needed):
  - New customer onboarded: Add their PO format samples to OCR, create crosswalk entries for their part numbers (estimate: 2-4 hours per new customer)
  - ERP system upgraded: Test all API calls against new version, update endpoints/payloads if needed
  - Make.com connector updates: Review changelog, update affected modules, retest
  - Azure Document Intelligence model updates: Test prebuilt model accuracy against current PO samples, retrain custom model if accuracy degrades
  - OCR accuracy drops below 90%: Retrain Azure custom model with new labeled samples
  - Client adds new product lines: Update crosswalk table and verify BOM associations
- SLA Recommendations:
  - P1 (System completely down, no POs processing): 2-hour response, 4-hour resolution
  - P2 (Partial failure, some POs failing): 4-hour response, 8-hour resolution
  - P3 (Cosmetic issue, minor field inaccuracy): Next business day response
  - P4 (Enhancement request, new feature): Scheduled for next quarterly review
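The monthly processing metrics can be computed directly from an export of the audit log. A minimal sketch, assuming each log row carries a `status` field (with `wo_created` marking success) and a `processing_seconds` field — both hypothetical names for whatever your audit log actually records:

```python
def monthly_metrics(rows):
    """Summarize audit-log rows: PO count, WOs created, success rate,
    and average processing time for successful runs."""
    total = len(rows)
    succeeded = [r for r in rows if r["status"] == "wo_created"]
    avg_secs = (sum(r["processing_seconds"] for r in succeeded) / len(succeeded)
                if succeeded else 0.0)
    return {
        "pos_received": total,
        "wos_created": len(succeeded),
        "success_rate": round(len(succeeded) / total, 3) if total else 0.0,
        "avg_processing_seconds": round(avg_secs, 1),
    }

rows = [
    {"status": "wo_created", "processing_seconds": 84},
    {"status": "wo_created", "processing_seconds": 96},
    {"status": "flagged_for_review", "processing_seconds": 40},
]
print(monthly_metrics(rows))
# {'pos_received': 3, 'wos_created': 2, 'success_rate': 0.667, 'avg_processing_seconds': 90.0}
```

Averaging only over successful runs keeps the processing-time figure comparable to the sub-2-minute target in TEST 16, which likewise excludes failed and review-flagged runs.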
Alternatives
Parseur Email-Only Approach (Lowest Complexity)
Replace Azure AI Document Intelligence and Make.com with Parseur as the sole extraction and integration platform. Parseur monitors the PO email inbox, extracts data using template-based parsing, and can push extracted data directly to webhooks or Zapier. Eliminates the need for Azure configuration entirely. Works best when 95%+ of POs arrive as email attachments from a small number of recurring customers with consistent PO templates.
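To gauge the integration effort of the Parseur route, here is a hedged sketch of normalizing an incoming webhook payload before pushing it to the ERP. The JSON field names (`PONumber`, `CustomerName`, `LineItems`, etc.) are assumptions — Parseur's webhook body mirrors whatever fields your parsing template defines, so adjust accordingly:

```python
import json

def normalize_parseur_payload(raw_json):
    """Convert a (hypothetical) Parseur webhook body into the flat
    structure the downstream work-order step expects."""
    data = json.loads(raw_json)
    return {
        "po_number": data["PONumber"].strip(),
        "customer": data["CustomerName"].strip(),
        "lines": [
            {"part": item["PartNumber"], "qty": int(item["Quantity"])}
            for item in data.get("LineItems", [])
        ],
    }

raw = ('{"PONumber": " PO-20017 ", "CustomerName": "Acme Corp", '
       '"LineItems": [{"PartNumber": "ACM-100", "Quantity": "25"}]}')
print(normalize_parseur_payload(raw))
# {'po_number': 'PO-20017', 'customer': 'Acme Corp', 'lines': [{'part': 'ACM-100', 'qty': 25}]}
```

Normalizing at the webhook boundary (trimming whitespace, casting quantities to integers) keeps the rest of the pipeline identical regardless of which extraction platform feeds it.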
Microsoft Power Automate + AI Builder (All-Microsoft Stack)
Replace Make.com with Power Automate Premium and use AI Builder's prebuilt document processing model instead of Azure AI Document Intelligence as a standalone service. Everything runs within the Microsoft 365 ecosystem. Power Automate handles flow orchestration, AI Builder handles OCR, SharePoint stores data, and Teams handles approvals. No third-party iPaaS needed.
n8n Self-Hosted (ITAR/Air-Gapped Environments)
Replace Make.com with self-hosted n8n Community Edition on a local Dell PowerEdge T150 server or a US-only Azure VM. Run Azure AI Document Intelligence on-premise using its Docker container distribution. All PO data stays within the client's network or a controlled US data center; no data transits any third-party SaaS platform.
Custom Python Script (Developer-Managed)
Instead of a no-code/low-code iPaaS, build the entire pipeline as a Python application using Azure Document Intelligence SDK, the ERP's REST API client library, and an email library (exchangelib or imaplib). Deploy as an Azure Function (serverless) or on a small VM. Managed via Git repository with CI/CD.
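The mapping core of such a script might look like the following sketch: extracted PO line items are translated through the customer part-number crosswalk, and unmapped parts are returned for human review rather than silently dropped. The field names and the crosswalk shape (a dict keyed by customer and customer part number) are illustrative assumptions:

```python
def build_work_orders(po_lines, crosswalk):
    """Map customer part numbers to internal parts via the crosswalk.
    Returns (work_order_lines, unmapped) so unmapped parts can be
    routed to the review queue instead of being lost."""
    work_orders, unmapped = [], []
    for line in po_lines:
        internal = crosswalk.get((line["customer"], line["part"]))
        if internal is None:
            unmapped.append(line)
        else:
            work_orders.append({"internal_part": internal, "qty": line["qty"]})
    return work_orders, unmapped

crosswalk = {("Acme Corp", "ACM-100"): "WID-0100"}
lines = [
    {"customer": "Acme Corp", "part": "ACM-100", "qty": 25},
    {"customer": "Acme Corp", "part": "ACM-999", "qty": 5},  # not in crosswalk
]
wos, review = build_work_orders(lines, crosswalk)
print(wos)     # [{'internal_part': 'WID-0100', 'qty': 25}]
print(review)  # [{'customer': 'Acme Corp', 'part': 'ACM-999', 'qty': 5}]
```

Whether the pipeline runs as an Azure Function or an iPaaS scenario, keeping this mapping step as a pure function makes it easy to unit-test against the crosswalk before touching the ERP API.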
RPA Approach for Legacy ERPs (Power Automate Desktop)
For clients running legacy ERP systems without REST APIs (e.g., older Epicor Vantage, MAPICS, Visual Manufacturing, or custom FoxPro/Access-based systems), use Power Automate Desktop (RPA) to automate the work order creation step. The OCR and mapping layers remain cloud-based, but the final step uses a desktop robot to click through the ERP's GUI and fill in work order fields, simulating a human user.