Discovery: From the outset, we purposefully sought to understand their needs, and the breadth of the challenge, before committing to any technical solutions.
We did this initially via two Discovery workshops.
The first workshop explored their challenges around data handling. Using Impact Mapping, we identified the actors, impacts and deliverables before delving into data types. From this we were able to sketch out a potential event-based architecture that demonstrated contemporary coding principles and new ways of working. By involving our client's BI team in the design of the solution, we ensured that the best metrics were used to monitor success. By the end of the first workshop, the stakeholders all had a shared understanding of their company's data usage intentions.
Spanning three days, the second workshop allowed for a more thorough exploration and evolution of their data strategy:
Day 1: By discussing the current approach with people from across the organisation, we enabled company-wide involvement in their transformative journey. Collectively we identified three key goals: giving Executives greater real-time access to critical information; improving inter-team collaboration; and making software delivery more efficient.
Day 2: Through the exploration of data ownership, we established how their teams worked together, and how a new data paradigm would require these relationships to evolve.
Day 3: Based on our findings we were able to provide insights to guide the data strategy, and agreed to build a Proof of Capability (PoC). This would allow us to explore the detail of the proposed solution, and rigorously test the strategy against real-world constraints.
Proving Production Capability
Having established our client's first use-case – replacing the incumbent technology that handled call records and related information – the PoC offered us an opportunity to unlock and modernise an interesting dataset. During development of the PoC, we paired with members of our client's Data Team, sharing our expertise in end-to-end data architecture and transferring technical knowledge through hands-on implementation. Collaborating at every stage of design and build was crucial to ensuring that those responsible for maintaining the solution in the future fully understood it.
Architecture and Technical Solution
Having agreed to decouple producers of data from consumers of data, we employed subscriptions to streams of events, which allowed our client's teams to work independently. Calling upon our experience with distributed data engineering, we decided on a persistent log of events built on AWS Kinesis. Delivered as-a-service, this provided an easy introduction to the overall solution model without compromising flexibility. We also introduced Event Store as the source of truth for our client's business events, which provided flexibility and resilience (e.g. replaying events after a downstream system failure, and enabling new consumers to build new streams from historical data).
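The decoupling and replay behaviour can be sketched with a minimal in-memory event log. This is an illustrative Python sketch only – the actual solution used AWS Kinesis and Event Store, and every name below is hypothetical rather than taken from the client's codebase:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Hypothetical sketch of the core idea: a persistent, append-only log that
# decouples producers from consumers. Producers only append; consumers
# subscribe independently and can replay the full history.

@dataclass
class EventLog:
    events: List[Dict] = field(default_factory=list)  # durable, ordered history

    def publish(self, event: Dict) -> None:
        # Producers know nothing about who consumes the stream.
        self.events.append(event)

    def subscribe(self, handler: Callable[[Dict], None],
                  from_start: bool = True) -> None:
        # A new consumer can replay every historical event - e.g. to
        # rebuild state after a downstream failure, or to derive a
        # brand-new view of the data.
        start = 0 if from_start else len(self.events)
        for event in self.events[start:]:
            handler(event)

log = EventLog()
log.publish({"type": "CallRecorded", "duration_s": 120})
log.publish({"type": "CallRecorded", "duration_s": 45})

# A consumer added after the fact still sees the complete history.
received: List[Dict] = []
log.subscribe(received.append)
print(len(received))  # 2
```

The key design choice this illustrates is that the log, not any individual consumer, is the source of truth: adding a consumer never requires changing a producer.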