Loyalty Automation Platform

The Problem

Running a multi-brand restaurant loyalty program involves a surprising amount of manual work. Before the Loyalty Automation Platform existed, the teams responsible for keeping the system running (engineering, QA, and operations) were stitching together disconnected tools and manual processes to get through their day.

Offer creation was a manual file-building exercise. To test each offer, someone had to walk it through the UI by hand. Customer-segment assignments required filing an IT ticket and waiting for execution. Performance reporting either depended on manual data pulls or scheduled reports that often missed the question being asked.

None of this was edge-case work. It was the standard operating procedure for a loyalty platform serving multiple national restaurant brands. Every manual step introduced a potential failure point, and every team handoff added latency to work that should have been routine.

The underlying challenge was fragmentation more than inefficiency. The tools were in place but did not talk to each other, and the processes depended on tribal knowledge and on specific individuals being available.

The Approach

The Loyalty Automation Platform was built as an integration platform: an iPaaS-style system designed to consolidate disconnected workflows into a single, automated, self-service interface.

The platform wasn’t commissioned through a formal initiative. It grew out of deep familiarity with the loyalty system’s pain points, the kind of familiarity that only comes from operating as the technical authority across engineering, QA, operations, and vendor relationships simultaneously.

The architecture uses a tiered approach to handle the range of requests the platform serves. Incoming requests are routed through intent classification that determines the complexity and cost of fulfilling them: from simple lookups that can be resolved instantly to complex operations that require coordinating across multiple systems. AI agents backed by RAG provide users with natural language access to operational data, documentation, and system knowledge.
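A minimal sketch of that tiered routing, assuming a regex tier ahead of an LLM fallback. The intent names, patterns, and fallback are hypothetical placeholders, not the platform's actual implementation:

```python
import re
from dataclasses import dataclass
from typing import Callable

# Hypothetical regex-first tier: cheap patterns resolve common intents
# before any LLM call; unmatched requests fall through to a classifier.
@dataclass
class Route:
    pattern: re.Pattern
    intent: str

ROUTES = [
    Route(re.compile(r"^(show|get)\s+offer\s+\w+", re.I), "offer_lookup"),
    Route(re.compile(r"^assign\s+segment\s+\w+", re.I), "segment_assignment"),
]

def classify(request: str, llm_fallback: Callable[[str], str]) -> str:
    """Return an intent label, preferring the pre-LLM regex tier."""
    for route in ROUTES:
        if route.pattern.search(request):
            return route.intent      # resolved instantly, no API cost
    return llm_fallback(request)     # only ambiguous requests pay for an LLM call

# Stub fallback standing in for the thin LLM classifier.
print(classify("show offer ABC123", lambda r: "unknown"))  # → offer_lookup
print(classify("why did redemptions dip last week?", lambda r: "analytics_question"))
```

The point of the split is economics: the fast path handles the predictable majority of requests at zero marginal cost, and the fallback is invoked only when pattern matching cannot decide.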

The platform integrates with vendor systems for secure file delivery and connects to internal data sources to surface loyalty performance data, documentation, and operational context, all through a conversational interface that doesn’t require users to know which system holds the answer.

The Outcome

The offer lifecycle, previously a multi-day, multi-team process involving manual file creation, manual testing, and IT-driven assignment, is now fully self-service. A single person can create, test, and assign offers in minutes.

The operations team has reclaimed the hours that used to go into repetitive configuration. QA now spends its time on tests that actually require human judgment, not on manually validating every offer config. Engineering is no longer the bottleneck for requests that a well-designed interface can handle.

The platform serves as a single point of access for teams that previously had to navigate multiple disconnected tools, track down documentation across systems, or ask specific people for information that should have been readily available.

Decisions & Trade-offs

Solo-build vs. team initiative. This platform was not commissioned. It was built by one person who had enough domain context to see the full picture and enough technical ability to execute end-to-end. That was a deliberate choice: assembling a team would have required a formal business case, prioritization against existing roadmap items, and months of alignment before a single line of code was written. The trade-off is that the platform’s architecture reflects one person’s judgment, which means it needs to be well-documented and transferable to survive beyond its creator. That documentation and knowledge transfer work is ongoing.

Regex-first intent classification vs. LLM-for-everything. The tiered classification architecture (regex resolution before LLM, thin classifier before full agent) was driven by cost and latency constraints, not technical elegance. Routing 40–60% of requests through pre-LLM resolution eliminates unnecessary API calls and keeps response times under a second for common operations. The trade-off is maintenance overhead: every new intent type requires updating the regex layer, not just the prompt. For this use case, the economics justified the complexity.

iPaaS-style consolidation vs. microservices. The platform was designed as a single integration surface rather than a collection of independent services. For a system that coordinates across vendor APIs, internal databases, file delivery pipelines, and testing infrastructure, a unified orchestration layer was simpler to reason about, debug, and operate than distributed services communicating through message queues. The trade-off is that the platform is monolithic in the integration layer. This is acceptable at current scale, but something that would need to be revisited if multiple teams were contributing simultaneously.

What I’d Do Differently

If I were starting this platform today with the benefit of hindsight, I would invest in formal organizational adoption earlier. The platform works (the value is clear to everyone who uses it), but building bottom-up without executive sponsorship means adoption is driven by word-of-mouth rather than mandate. A parallel workstream for stakeholder alignment, even lightweight, would have accelerated the path from “useful tool one team relies on” to “organizational capability.”

I would also design the testing infrastructure differently from the start. The automated smoke tester works reliably, but the anti-bot resilience and failure categorization layers were bolted on reactively as edge cases surfaced in production. A more deliberate upfront investment in the testing framework’s extensibility would have reduced the rework.

The Loyalty Automation Platform continues to expand. The next priorities are broader organizational adoption beyond the original team and deeper integration across the loyalty platform's operational surface, particularly the reporting and analytics paths that still depend on manual workflows.

For the philosophy behind why this platform was built the way it was, read Every Great Platform Started as a Pain Point.