Thousands of SKUs. On-prem ERP. Excel pivot tables for everything. Leadership asking the same question and getting three different answers.
A CPG distributor with thousands of SKUs was running analytics through Excel pivots connected to an on-prem ERP. Work-Smart built a live sales dashboard in 10 business days. Result: real-time visibility by country, customer, SKU, product line, salesperson, replacing hours of pivot table filtering.
A previous vendor spent four months. The output was a dashboard nobody used.
The company is a Miami-based CPG distributor operating across multiple export markets. They've been in business for decades. Their on-prem ERP holds every transaction, every invoice, every shipment. Thirty years of data.
The VP of Sales needed to answer one question regularly: "How are we doing this month in Market X?" To get the answer, they opened Excel, connected to the ERP via queries, filtered by market, by product line, by customer, by SKU, and waited. The pivot tables worked. They had worked for years. But "works" and "works fast enough to make decisions" are two different things.
The operation manages thousands of SKUs across dozens of customers in multiple countries. At that scale, filtering a pivot table is not analytics; it's data archaeology. Every new product line meant new tabs. Every new market meant new filters. The same report pulled by two different people returned two different numbers, not because either was wrong, but because their filter selections were different. Leadership asked one question and got three answers.
They had already invested in fixing this. Another vendor had been working on a dashboard project for three to four months. The output was slow. The numbers didn't match what leadership trusted in Sage. The tool wasn't usable day to day. The investment was real. The result wasn't.
When the VP reached out, the ask was direct: "I want a snapshot of my operation. Sales by country, by client, by SKU, by product line, by vendor. In dollars and in cases. Something fast. Something I can trust."
The data existed. The problem was accessibility, not quality.
The ERP had everything, decades of transactional history, clean enough to run a business on. The problem wasn't data quality. The problem was data accessibility. Three gaps compounded into one operational bottleneck.
Data Accessibility
The ERP held the data, but it was locked inside a system that wasn't designed for modern visibility. The only way to access it was through complicated queries feeding Excel spreadsheets. No way to share a live view with the team without manual work. Every data request started from scratch.
Real-Time Visibility
Missing entirely. There was no shared, real-time view of the operation. Each person maintained their own Excel file with their own filters. The CEO and VP of Sales had no way to see what was happening without rebuilding a report manually. Meetings ran on stale data, or no data.
Reporting Automation
Every report was manually refreshed. Every export was manually filtered. Every time leadership wanted a different view (a different date range, market, or product line), someone had to rebuild the analysis. The previous vendor had tried to solve this by building a dashboard. They spent months on a platform that didn't validate its numbers against what leadership already trusted. The output was technically a dashboard; it was just one nobody used.
The fix was straightforward: build a read layer on top of Sage that extracts the data the team already uses, structures it for fast queries, and delivers it through a web interface that loads in seconds. Start with what they trust. Match the numbers. Then expand.
Sage Access Setup and Baseline Validation
The IT team set up read-only access to the ERP in under two hours. Nothing in the ERP was modified. The first task was pulling the baseline report the team already trusted, the same view they built manually in Excel, and validating that the numbers matched exactly. This is where the previous vendor had failed: they spent months on architecture and delivered numbers that didn't match what leadership trusted. We matched the baseline first. Everything else came after.
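The baseline check can be sketched in a few lines. This is a hypothetical illustration, not the shipped code: Python's sqlite3 stands in for the Sage database, and the table, columns, and figures are invented for the example. The point is the discipline, not the schema: query the same aggregation the trusted Excel pivot produces, and refuse to proceed until the numbers match.

```python
import sqlite3

# Stand-in for the ERP database; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE invoice_lines (
        invoice_date TEXT, country TEXT, sku TEXT,
        cases INTEGER, amount_usd REAL
    );
    INSERT INTO invoice_lines VALUES
        ('2024-05-02', 'CO', 'SKU-001', 10, 1250.00),
        ('2024-05-03', 'CO', 'SKU-002',  4,  480.00),
        ('2024-05-04', 'PE', 'SKU-001',  6,  750.00);
""")

def monthly_total(conn, country, month):
    """The same aggregation the team's trusted Excel pivot produces."""
    row = conn.execute(
        "SELECT COALESCE(SUM(amount_usd), 0) FROM invoice_lines "
        "WHERE country = ? AND invoice_date LIKE ?",
        (country, f"{month}-%"),
    ).fetchone()
    return row[0]

# The numbers leadership already trusts, copied from the Excel baseline.
trusted_baseline = {"CO": 1730.00, "PE": 750.00}

for country, expected in trusted_baseline.items():
    got = monthly_total(conn, country, "2024-05")
    # Match the baseline before building anything else.
    assert abs(got - expected) < 0.01, f"{country}: {got} != {expected}"
```

If an assertion fires, the connection or the query logic gets fixed, not the expectation.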
Data Extraction and Structuring
With Sage access validated, the extraction layer was built. The data needed to be restructured for fast queries: sales by country, by customer, by SKU, by product line, by salesperson, in both dollars and cases. Thirty years of transactional history, clean enough to run a business on, now accessible in milliseconds instead of requiring an Excel refresh cycle.
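One way to get millisecond responses out of decades of transactions is to pre-aggregate into a summary table keyed by the dimensions the team filters on. A minimal sketch, again with sqlite3 standing in for the ERP and hypothetical table names and figures:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (
        txn_date TEXT, country TEXT, customer TEXT, sku TEXT,
        product_line TEXT, salesperson TEXT, cases INTEGER, amount_usd REAL
    );
    INSERT INTO transactions VALUES
        ('2024-05-02','CO','Acme','SKU-001','Snacks','Diaz', 10, 1250.0),
        ('2024-05-02','CO','Acme','SKU-001','Snacks','Diaz',  5,  625.0),
        ('2024-05-03','PE','Bravo','SKU-002','Drinks','Lee',  4,  480.0);

    -- Pre-aggregated read layer: one row per date x dimension combination,
    -- so dashboard filters hit a small summary table, not raw history.
    CREATE TABLE sales_summary AS
    SELECT txn_date, country, customer, sku, product_line, salesperson,
           SUM(cases) AS cases, SUM(amount_usd) AS amount_usd
    FROM transactions
    GROUP BY txn_date, country, customer, sku, product_line, salesperson;
""")

rows = conn.execute(
    "SELECT country, cases, amount_usd FROM sales_summary ORDER BY country"
).fetchall()
# The two raw Acme lines collapse into one summary row per day.
```

In both dollars (`amount_usd`) and cases, every dashboard view reads from the summary layer; the raw transaction history is never touched at query time.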
Web Dashboard Build
A custom web dashboard with all the views, filters, and export functionality the team needed. Total sales, monthly sales, year-to-date, updated daily. Breakdowns by every dimension the team thinks in. Daily and weekly trend lines. Top customers and top SKUs ranked by volume. Fast filters on every view with CSV export for team members who still want to do deeper analysis in Excel. Loads in seconds. Not after a pivot table refresh.
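The filter-and-export pattern behind those views is simple enough to sketch. This is a hypothetical illustration (the row shape and field names are invented, not the production schema): exact-match filters over pre-aggregated rows, plus a CSV export built with the standard library.

```python
import csv
import io

# Hypothetical rows, shaped as the summary layer would return them.
ROWS = [
    {"country": "CO", "customer": "Acme",  "sku": "SKU-001", "cases": 15, "usd": 1875.0},
    {"country": "PE", "customer": "Bravo", "sku": "SKU-002", "cases": 4,  "usd": 480.0},
]

def filter_rows(rows, **criteria):
    """Dashboard-style filtering: every keyword argument must match exactly."""
    return [r for r in rows if all(r.get(k) == v for k, v in criteria.items())]

def to_csv(rows):
    """CSV export for team members who want to dig deeper in Excel."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

co_rows = filter_rows(ROWS, country="CO")
print(to_csv(co_rows))
```

Each view in the dashboard is the same idea: a filtered slice of the summary table, with the CSV button handing the slice back to Excel.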
Number Validation, Team Training, and Go-Live
Final validation: every number in the dashboard checked against the Sage source. Team training: 45 minutes with the VP of Sales and key users. Go live. The dashboard was in daily use by the end of week two. Not piloted. Not tested by one person. Used by the team, because the numbers matched what they already trusted, and it was faster than what they had before.
- Report generation: Hours → Seconds
- Shared visibility: None → 1 dashboard
- Time to delivery: 4 months (failed) → 10 days
- Data currency: Stale → Daily auto-refresh
- The VP of Sales opens a browser tab instead of opening Excel. The CEO gets the same view; no more waiting for someone to compile a report before a meeting.
- When someone asks "how are we doing in a given market this month?" the answer takes 10 seconds, not 10 minutes.
- The same question no longer gets three different answers depending on whose filter selections were active in whose Excel file. One source of truth.
- The team still exports to CSV when they want deeper analysis in Excel; that workflow didn't go away. The starting point changed. They begin from a dashboard that shows the full picture and drill down from there.
- The foundation is in place for the AI layer. Structured data, validated numbers, and a daily refresh cadence are the prerequisites that make anomaly detection and purchasing suggestions actually work. Without this foundation, AI is just a more expensive way to get unreliable answers.
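The AI layer was out of scope for Phase 1, but the bullets above explain why the foundation matters. As a hypothetical sketch (not the shipped system), a first anomaly check on validated daily totals can be as simple as flagging days that deviate sharply from a trailing window:

```python
from statistics import mean, stdev

def flag_anomalies(daily_sales, window=7, threshold=3.0):
    """Flag the index of any day whose total deviates more than `threshold`
    standard deviations from the trailing `window`-day mean."""
    flags = []
    for i in range(window, len(daily_sales)):
        trailing = daily_sales[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma > 0 and abs(daily_sales[i] - mu) > threshold * sigma:
            flags.append(i)
    return flags

# Fourteen ordinary days, then one wildly out-of-pattern day (index 14).
sales = [100, 102, 98, 101, 99, 103, 97, 100, 102, 98, 101, 99, 103, 97, 1000]
print(flag_anomalies(sales))
```

This only produces trustworthy alerts because the underlying numbers were validated against the ERP first; run it on unvalidated data and every alert is noise.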
Questions About This Case Study
How long did the build take?
10 business days. Days 1-2 were ERP access setup and baseline validation. Days 3-5 were data extraction and structuring. Days 6-8 were the dashboard build. Days 9-10 were number validation and team training. The dashboard was live and in daily use by the end of week two.
What was included in the engagement?
This was a Phase 1 sales performance dashboard build executed in 10 business days. The engagement covered ERP integration, data extraction and structuring, web dashboard development with all views and filters, number validation, and team training. Optional ongoing maintenance covers daily refresh monitoring, connection upkeep, and performance tuning. The AI layer (anomaly detection, purchasing suggestions, inventory alerts) is typically scoped separately and builds on top of the Phase 1 foundation.
Can you connect to an on-prem ERP?
Yes. This engagement connected to the on-prem ERP via read-only SQL/ODBC, the same connection method the team's Excel pivot tables already used. The approach works with any ERP that has a database or API: Sage, SAP Business One, NetSuite, QuickBooks Enterprise, SYSPRO, Acumatica. The connection is read-only; nothing in your ERP is modified.
Why do dashboard projects like this usually fail?
The most common failure pattern I see: the vendor spends months connecting to data, the numbers don't match what leadership trusts, and the output is too slow to replace what the team already has. This build starts with the baseline report you already trust and validates against it before going live. If the numbers don't match, we fix the connection, not the expectation. That's why it takes 10 days, not 4 months.
Does this work with ERPs other than Sage?
Yes. The approach is ERP-agnostic. The connection method varies (SQL, ODBC, API, direct database access), but the outcome is the same: your ERP data in a fast, filterable dashboard your team actually uses. The dashboard structure (sales by country, customer, SKU, product line, salesperson) applies to any distribution operation regardless of the underlying system.
How quickly did the team see results?
10 business days from kickoff to working dashboard. The ERP data was structured enough to build on immediately. The 65 hours of monthly reporting (pulling data, building pivot tables, cross-referencing) dropped to 2 hours of review and exception handling. The dashboard refreshes automatically every day.
If your distribution operation runs on an ERP that holds the data and Excel pivot tables that show it (slowly), the gap between "data exists" and "leadership can see what's happening right now" is costing you real decisions.
The first step is a 30-minute call where I ask about your ERP, your current reporting setup, and what your team needs to see. If the answer is a Phase 1 sales dashboard, it's 10 business days from kickoff to go-live.