Strategy
Post-M&A data and AI strategy for a federated logistics holding
Authored the data-and-AI roadmap for a holding company with dozens of acquired brands across 150+ locations and fragmented TMS / WMS / ERP systems — modern and legacy. The strategy: keep regional teams fast and autonomous, unified by a single trusted data layer and a small set of AI capabilities (document automation, anomaly detection, semantic search over freight docs and SOPs).
Adjacent Industry — Vertical Farming
IoT and machine-vision data pipelines
Designed and operated the data pipeline that collected sensor and camera feeds from controlled-environment growing facilities at scale, then processed and curated them into training data for machine-vision models. The same 'IoT capture → curated dataset → vision model' pattern translates directly to dock damage detection, reefer monitoring, container-ID capture, and package inspection.
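The curation gate at the heart of that pattern, in sketch form. Frames are assumed to arrive as OpenCV arrays; the thresholds and dataset layout here are illustrative, not the production values:

```python
from pathlib import Path

import cv2  # frames assumed to be OpenCV BGR arrays

DATASET = Path("curated")  # illustrative corpus location

def is_label_worthy(frame) -> bool:
    """Gate out frames too dark or too blurry to be worth labeling."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if gray.mean() < 20:  # underexposed
        return False
    return cv2.Laplacian(gray, cv2.CV_64F).var() > 100  # blur score

def curate(frame_stream) -> int:
    """Write only usable frames into the training corpus; return count kept."""
    DATASET.mkdir(exist_ok=True)
    kept = 0
    for i, frame in enumerate(frame_stream):
        if is_label_worthy(frame):
            cv2.imwrite(str(DATASET / f"frame_{i:06d}.jpg"), frame)
            kept += 1
    return kept
```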
Pipeline QA
Data-lake and data-warehouse hardening for an enterprise shipper
Inherited a data platform with 120 ingestion jobs running without automated tests, monitoring that had been silently failing for weeks, and 60+ reference files maintained by hand. Delivered a self-service tool for the data-steward team, an automated test suite covering every ingestion job, and a delivery pipeline that catches problems before they hit production.
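The test-suite half, compressed to its core idea: parametrize one contract test over the full job list, so no job can run untested. A sketch only — `run_job` and `row_count` stand in for the platform's real hooks:

```python
import pytest

from platform_hooks import row_count, run_job  # hypothetical helpers

INGESTION_JOBS = ["shipments_daily", "invoices_hourly", "rates_weekly"]  # 120 in production

@pytest.mark.parametrize("job", INGESTION_JOBS)
def test_ingestion_job_lands_rows(job):
    """Contract: every job completes cleanly and lands rows in staging."""
    result = run_job(job, target="staging")
    assert result.exit_code == 0
    assert row_count(f"staging.{job}") > 0
```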
Modernization
Legacy reporting rebuilt as a modern BI layer
50+ legacy reports spanning a modern TMS and a legacy mainframe, covering 7.7 million shipments and 10.7 million invoice records, rebuilt as a semantic-layer BI estate. Every rebuilt report proven equivalent to the original before retirement, with zero direct database access (firewall-restricted environment).
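"Proven equivalent" reduced to its essence: sort both extracts on the business key and demand an exact frame match. File names and keys below are placeholders:

```python
import pandas as pd

def assert_equivalent(legacy_csv: str, rebuilt_csv: str, keys: list[str]) -> None:
    """Fail loudly if the rebuilt report diverges from the legacy extract."""
    legacy = pd.read_csv(legacy_csv).sort_values(keys).reset_index(drop=True)
    rebuilt = pd.read_csv(rebuilt_csv).sort_values(keys).reset_index(drop=True)
    # check_like=True tolerates column reordering but nothing else
    pd.testing.assert_frame_equal(legacy, rebuilt, check_like=True)

assert_equivalent("legacy/shipments.csv", "rebuilt/shipments.csv", keys=["shipment_id"])
```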
AI / Anomaly Detection
Anomaly detection on a 100M+ row operational feed
Mined 18 months of history to learn the natural rhythm of a critical pricing feed, then built tiered alerting that catches both whole-feed outages and individual supplier silences. AI is an optional interpretation layer — never sees raw data, has a kill switch, leaves an audit log.
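The tiering, in miniature. Production thresholds came from the 18-month baseline; the 1-hour window and 3x-cadence multiplier below are placeholders:

```python
from datetime import datetime, timedelta

def feed_is_down(last_seen: dict[str, datetime], now: datetime) -> bool:
    """Tier 1: the whole feed has gone quiet."""
    return all(now - ts > timedelta(hours=1) for ts in last_seen.values())

def silent_suppliers(last_seen: dict[str, datetime],
                     cadence: dict[str, timedelta],
                     now: datetime) -> list[str]:
    """Tier 2: suppliers quiet for much longer than their learned cadence."""
    return [s for s, ts in last_seen.items() if now - ts > 3 * cadence[s]]
```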
Migration
Excel-to-platform migration for daily commodity-pricing operations
Roughly 10 million formula occurrences across 23 large workbooks. We treated the formula graph as a program: parsed it, untangled the business logic embedded across the cells, and reimplemented it as a Python pipeline running on Keboola + Snowflake. Honest gap analysis for the cases that needed redesign rather than direct conversion. Daily commodity pricing for fuel-terminal operations, out of `.xlsb` chaos and into a real platform.
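The "formula graph as a program" move, in sketch form: pull cell references out of each formula, build the dependency DAG, and topologically sort it into an evaluation order. The regex stands in for a real formula parser:

```python
import re

import networkx as nx

CELL_REF = re.compile(r"\$?([A-Z]{1,3})\$?(\d+)")

def evaluation_order(formulas: dict[str, str]) -> list[str]:
    """formulas maps cell -> formula text, e.g. {'C1': '=A1+B1'}."""
    g = nx.DiGraph()
    for cell, formula in formulas.items():
        g.add_node(cell)
        for col, row in CELL_REF.findall(formula):
            g.add_edge(f"{col}{row}", cell)  # referenced cell feeds this one
    return list(nx.topological_sort(g))  # upstream cells first

print(evaluation_order({"C1": "=A1+B1", "D1": "=C1*2"}))
# -> ['A1', 'B1', 'C1', 'D1']
```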
Adjacent Industry — Utility Solar
IoT telemetry pipeline that scaled an operator past 1 GW of deployed capacity
Telemetry from weather stations, inverters, and field equipment, ingested reliably over unreliable connectivity; powered both real-time operations dashboards and trading-grade analytics. The same patterns apply directly to fleet, reefer, yard, and warehouse telemetry.
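The core trick behind "reliably over unreliable connectivity" is store-and-forward: journal every reading locally before any network attempt, drain oldest-first, delete only on ack. A compressed sketch; `send` stands in for the real uplink:

```python
import json
import sqlite3

db = sqlite3.connect("telemetry_buffer.db")
db.execute("CREATE TABLE IF NOT EXISTS buffer (id INTEGER PRIMARY KEY, payload TEXT)")

def record(reading: dict) -> None:
    """Durable local write first; the network is never in the hot path."""
    db.execute("INSERT INTO buffer (payload) VALUES (?)", (json.dumps(reading),))
    db.commit()

def drain(send) -> None:
    """Forward oldest-first; a reading is deleted only after send() acks it."""
    rows = db.execute("SELECT id, payload FROM buffer ORDER BY id").fetchall()
    for row_id, payload in rows:
        if not send(payload):  # link down: stop and retry later
            break
        db.execute("DELETE FROM buffer WHERE id = ?", (row_id,))
        db.commit()
```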
EDI
Retail-grade EDI integration with Kroger
Vertical-farming engagement in which our team delivered EDI integration with Kroger, a Fortune 20 grocer with one of the most demanding EDI compliance bars in retail. ASNs, invoicing, chargeback-grade SLAs. The same discipline transfers directly to freight tendering, status, and customs.
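For readers who haven't met X12: an ASN is a rigid segment-per-line transaction set, and the compliance bar is about getting every qualifier, count, and control number exactly right. The toy 856 below shows the shape only; production documents went through a certified translator, not hand-built strings:

```python
def toy_asn_856(shipment_id: str, date: str, time_: str) -> str:
    """Illustrative X12 856 skeleton; qualifiers and control numbers are toy values."""
    segments = [
        "ST*856*0001",                           # transaction set header
        f"BSN*00*{shipment_id}*{date}*{time_}",  # beginning segment for ASN
        "HL*1**S",                               # hierarchy level: shipment
        "HL*2*1*O",                              # hierarchy level: order
        "SE*5*0001",                             # trailer: segment count incl. ST/SE
    ]
    return "~".join(segments) + "~"

print(toy_asn_856("SHP001", "20240115", "0830"))
```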
Custom Dashboard
Inventory operations dashboard replacing spreadsheet workflows across a multi-brand consumer-goods portfolio
Four-layer dashboard built around the inventory team's actual workflow — governed PostgreSQL underneath, a Python engine replicating two decades of Excel column logic, a FastAPI layer with row-level role-based access, and a Next.js front end designed by the engineers who modeled the data. Replaced a fragile chain of workbooks and 80+ VBA macros (6,400+ lines) wired to SharePoint, across multiple brand units and dozens of distribution sites. Power BI and Tableau couldn't model the workflow; building a dashboard that fit it took less time than another year of patching the spreadsheets — the Approach thesis in production.
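The row-level piece of that FastAPI layer, reduced to a sketch: the caller's brand list drives the filter, so two users hitting the same endpoint see different rows. Auth and schema below are placeholders:

```python
from fastapi import Depends, FastAPI

app = FastAPI()

def current_user() -> dict:
    """Placeholder dependency; the real version decoded a signed token."""
    return {"name": "demo", "brands": ["brand_a", "brand_c"]}

@app.get("/inventory")
def inventory(user: dict = Depends(current_user)):
    # Row-level access: the filter is injected server-side from the user's
    # entitlements, never from query parameters the client controls.
    sql = "SELECT sku, site, qty FROM inventory WHERE brand = ANY(%(brands)s)"
    params = {"brands": user["brands"]}
    return {"sql": sql, "params": params}  # sketch returns the query, not rows
```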
Circa 2012 — Origins
Container ID capture by mobile, crowd, and computer vision
Among our earliest creative projects — and the one that maps most directly to today's logistics machine-vision work. A mobile app photographed the backs of shipping containers and transcribed the ISO codes stamped there. Two pipelines ran in parallel: Amazon Mechanical Turk for human labels, and an OpenCV model trained continuously on the same corpus — the humans bought the model time, then the model gradually replaced them. Photographers worked on smartphones in developing countries; we gamified participation with regional leaderboards and a grand prize — a home built from a real shipping container. The lesson held: when you move enough freight, the data isn't a byproduct — it's part of the cargo.
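One piece of that system is public standard rather than our invention, and worth showing: ISO 6346 container codes carry a check digit, so both the Turk labels and the model's reads could be sanity-checked for free. Letters map to values that skip multiples of 11, each of the first ten characters is weighted by 2^position, and the sum mod 11 mod 10 is the check digit:

```python
# Build the ISO 6346 letter-value table: A=10 upward, skipping 11, 22, 33.
VALUES, v = {}, 10
for ch in "ABCDEFGHIJKLMNOPQRSTUVWXYZ":
    if v % 11 == 0:
        v += 1
    VALUES[ch] = v
    v += 1

def check_digit(code10: str) -> int:
    """Check digit for the first 10 characters, e.g. 'CSQU305438' -> 3."""
    total = sum((VALUES[c] if c.isalpha() else int(c)) << i  # weight 2**i
                for i, c in enumerate(code10))
    return total % 11 % 10

assert check_digit("CSQU305438") == 3  # a well-known worked example
```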