How Do You Orchestrate Real-Time Event Streaming Across Legacy Systems Without Breaking the Enterprise?
Project Overview
Role: Lead Product Strategist
Timeline: 4 months (May – August 2024)
Platform: Salesforce Event Mesh + Azure + OpenShift + Kafka
Impact: Unified real-time event architecture across 7 business units, enabling secure, scalable data flow
Executive Summary
In a sprawling enterprise ecosystem with legacy Exchange Online workflows, Azure-based app registrations, and containerized services on OpenShift, real-time data sharing was a pipe dream. Each system spoke its own language, and stakeholders were drowning in latency, duplication, and manual reconciliation. This case study explores how I led the design and implementation of a secure, scalable Event Mesh architecture using Salesforce, Kafka, and Azure — transforming fragmented systems into a unified, event-driven powerhouse.
The Challenge
"We have data in Exchange, apps in Azure, services in OpenShift, and Kafka in the middle — but nothing talks to each other in real time."
This quote from a senior enterprise architect captured the core dysfunction. The organization had invested in modern platforms, but integration was brittle, slow, and siloed.
Project Constraints
Multi-cloud architecture with strict security policies
No downtime allowed during rollout
Stakeholders across 7 departments with conflicting priorities
Existing Kafka deployment with limited observability
Research & Discovery
Stakeholder Interviews: The Trust Gap
I conducted 20+ interviews across engineering, security, and business teams. The recurring theme:
“We don’t know what’s happening until it’s too late.”
This wasn’t just a technical problem — it was a trust crisis.
Architecture Audit: The Integration Abyss
I mapped the existing architecture and uncovered:
Kafka topics with no ownership
Azure apps publishing events with no schema enforcement
OpenShift services consuming stale data
Exchange workflows triggering based on outdated polling
Key Insight:
The organization didn’t need more integrations — it needed a governed, observable, real-time event mesh.
Design Strategy
Event Mesh Principles
Decoupled Publishing: Systems publish events without knowing who consumes them.
Governed Consumption: Consumers subscribe through contracts, not guesswork.
Secure Registration: Every publisher is authenticated via Azure App Registration.
Observable Flow: Kafka topics are monitored, versioned, and documented.
The Solution: Salesforce Event Mesh as the Backbone
I designed a three-layer architecture:
Layer 1: Secure App Registration
Every publisher (Exchange, OpenShift, Azure apps) registered via Azure
OAuth2 tokens enforced for publishing rights
Metadata stored in centralized registry
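Publishing rights in this layer hinge on Azure AD's OAuth2 client-credentials grant. A minimal sketch of how a publisher might build that token request — the tenant, client, and scope values here are placeholders for illustration, not the project's actual registrations:

```python
# Sketch: constructing an Azure AD v2.0 client-credentials token request.
# All identifiers below are placeholders, not real registrations.
from urllib.parse import urlencode


def build_token_request(tenant_id: str, client_id: str,
                        client_secret: str, scope: str) -> tuple[str, str]:
    """Return the Azure AD v2.0 token endpoint URL and a form-encoded body."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # By convention, the app-ID URI of the protected API plus "/.default"
        "scope": scope,
    })
    return url, body
```

A publisher would POST this body to the returned URL and attach the resulting bearer token to every publish call; the mesh gateway then maps the token's app identity back to the centralized registry.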
Layer 2: Kafka Governance
Introduced schema registry for topic validation
Created topic naming conventions and ownership tags
Implemented audit logging for every publish/subscribe action
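The naming conventions and ownership tags can be enforced mechanically. Here is a small sketch of such a check — the `<domain>.<entity>.<event>.v<version>` pattern and the required tag names are assumptions for illustration, not the project's exact convention:

```python
import re

# Assumed convention for illustration: <domain>.<entity>.<event>.v<version>,
# e.g. "sales.order.created.v1". The real convention may differ.
TOPIC_PATTERN = re.compile(r"^[a-z]+\.[a-z-]+\.[a-z-]+\.v\d+$")
REQUIRED_TAGS = {"owner", "schema", "purpose"}


def validate_topic(name: str, tags: dict) -> list[str]:
    """Return a list of governance violations; an empty list means the topic passes."""
    errors = []
    if not TOPIC_PATTERN.match(name):
        errors.append(f"name '{name}' violates naming convention")
    for tag in sorted(REQUIRED_TAGS - tags.keys()):
        errors.append(f"missing required tag '{tag}'")
    return errors
```

Running a check like this in CI is what lets topic creation be gated behind pull requests rather than created ad hoc.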
Layer 3: Salesforce Event Mesh Integration
Salesforce acted as the orchestrator, routing events based on metadata
Subscriptions managed via declarative config
Real-time updates pushed to business units with guaranteed delivery
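Metadata-based routing with declarative subscriptions can be pictured as a simple match over event metadata. This sketch shows the idea only — the subscription shape is illustrative, not Salesforce Event Mesh's actual configuration format:

```python
# Illustrative declarative subscriptions: each business unit declares the
# metadata it cares about; the orchestrator routes on exact matches.
SUBSCRIPTIONS = [
    {"unit": "finance",   "match": {"domain": "sales", "event": "order.created"}},
    {"unit": "logistics", "match": {"domain": "sales"}},
]


def route(event_metadata: dict) -> list[str]:
    """Return the business units whose declared match criteria all hold."""
    return [
        sub["unit"]
        for sub in SUBSCRIPTIONS
        if all(event_metadata.get(k) == v for k, v in sub["match"].items())
    ]
```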
Key Design Decisions
Decision 1: Kafka Topic Ownership
Problem: Topics were created ad hoc, leading to duplication and confusion.
Solution:
Created a topic registry with owner, schema, and purpose
Enforced topic creation via pull requests and an approval workflow
Impact: Reduced topic sprawl by 60% and increased trust in data
Decision 2: Azure App Registration Contracts
Problem: Apps published events without authentication or schema enforcement.
Solution:
Required app registration via Azure with OAuth2
Linked each app to a publishing contract stored in Git
Impact: Secured event publishing and enabled traceability
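A Git-stored publishing contract can be as simple as a mapping from a registered app to the topics it may publish. This is a sketch under that assumption — the app and topic names are hypothetical:

```python
# Illustrative contracts as they might be stored in Git: each registered
# app's client ID maps to the topics it is allowed to publish.
CONTRACTS = {
    "app-order-service": {"allowed_topics": ["sales.order.created.v1"]},
}


def may_publish(client_id: str, topic: str) -> bool:
    """True only if the app is under contract and the topic is in its allow-list."""
    contract = CONTRACTS.get(client_id)
    return contract is not None and topic in contract["allowed_topics"]
```

Because the contract lives in version control, every change to an app's publishing rights leaves an auditable trail.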
Decision 3: Stakeholder Engagement via Event Catalog
Problem: Business units didn’t know what events existed or how to use them.
Solution:
Built an internal Event Catalog with searchable metadata
Included sample payloads, publisher info, and subscription instructions
Impact: Empowered teams to self-serve and reduced integration requests by 40%
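The self-serve experience boils down to searchable metadata. A minimal sketch of the catalog's search path — the entry fields and example data are illustrative assumptions:

```python
# Illustrative catalog entries: topic, publisher, description, sample payload.
CATALOG = [
    {"topic": "sales.order.created.v1",
     "publisher": "order-service",          # hypothetical publisher name
     "description": "Emitted when a customer places an order",
     "sample": {"orderId": "A-1001", "total": 42.50}},
]


def search_catalog(query: str) -> list[dict]:
    """Case-insensitive substring search over topic names and descriptions."""
    q = query.lower()
    return [entry for entry in CATALOG
            if q in entry["topic"].lower() or q in entry["description"].lower()]
```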
Implementation & Validation
Technical Architecture
Kafka cluster hardened with schema registry and ACLs
Azure AD integrated with Kafka for secure publishing
OpenShift services refactored to publish/subscribe via Event Mesh
Salesforce configured as event router with declarative subscriptions
Validation
Simulated 1000+ events across systems with zero data loss
Penetration testing validated OAuth2 security model
Stakeholder UAT confirmed real-time delivery across Exchange, OpenShift, and Salesforce
Results & Impact
Kafka Topic Duplication: Decreased by 60%
Integration Requests: Decreased by 40%
Event Delivery Latency: Reduced by 85%
Stakeholder Satisfaction: Increased by 90%
App Registration Compliance:
Lessons Learned
Governance is Product
Treating Kafka topics and app registrations as product features — with ownership, documentation, and lifecycle — changed how teams engaged with the architecture.
Visibility Builds Trust
The Event Catalog wasn’t just a tool — it was a trust accelerator. When teams could see what was happening, they stopped fearing the unknown.
Real-Time is a Mindset
Moving from batch workflows to event-driven architecture required cultural change. I led workshops to shift thinking from “data at rest” to “data in motion.”
What’s Next
Event Replay Service: Enable consumers to replay historical events for debugging and analytics
Contract Enforcement: Auto-reject events that violate schema or lack registration
Cross-cloud Mesh Expansion: Extend mesh to AWS-hosted services
Business Intelligence Hooks: Stream events directly into BI dashboards for real-time insights
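The planned contract-enforcement gate could work roughly like this sketch: reject any event whose publisher is unregistered or whose payload violates the topic's schema. The publisher set and schema shape here are illustrative assumptions, not the system's actual design:

```python
# Sketch of the planned auto-reject gate. All names and schemas are illustrative.
REGISTERED_PUBLISHERS = {"order-service"}
SCHEMAS = {
    "sales.order.created.v1": {"orderId": str, "total": float},
}


def accept(topic: str, publisher: str, payload: dict) -> bool:
    """Accept an event only if its publisher is registered and its payload
    carries every schema field with the expected type."""
    if publisher not in REGISTERED_PUBLISHERS:
        return False
    schema = SCHEMAS.get(topic)
    if schema is None:
        return False  # unknown topics have no contract, so reject
    return all(isinstance(payload.get(field), ftype)
               for field, ftype in schema.items())
```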