In Praise Of Event Sourcing

A few days ago, at Brussels station, Milan and I engaged in an exciting discussion about interpreting the question: "What are the benefits of a given solution?"

During the conversation, I contended that the answer should only include the benefits directly arising from utilizing that particular solution. In my opinion, the question was equivalent to asking: "Why should I adopt that solution?"

Milan, by contrast, argued that all the positive aspects resulting from the solution should be listed as benefits, even those only indirectly related to it, stemming instead from the prerequisites it entails.

We were so engrossed in grappling with this little dilemma that I nearly missed my train.

This entire discourse originated from a question we received the day before at Voxxed Days Brussels: "What are the benefits of event sourcing?"

To provide an answer to this question, we must first clarify what we mean by event sourcing. Over the years, I've observed that many people employ this term with different interpretations, which further complicates the situation.

I want to share my understanding of Event Sourcing without claiming to provide a universally accepted interpretation. In the vast realm of event-driven architecture, there are several ways to leverage events. 

Events are often employed to decouple a fact from the possible reactions it may trigger across different software components. In this context, events are typically ordered in a specific time sequence and distributed to all interested components.

This particular use of events is what I refer to as Event Streaming.

Event streaming systems typically use a publish-subscribe messaging model to distribute events to multiple consumers. Event streaming requires some form of event persistence to ensure temporal decoupling among the software components. Even if a component is temporarily unavailable when an event is published, it can catch up with missed events once it becomes available again. However, Event Streaming does not necessarily entail persisting events indefinitely. In many cases, events are persisted only for the duration necessary to ensure their propagation to the interested components.
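To make the catch-up mechanism concrete, here is a minimal in-memory sketch (the `EventLog` and `Consumer` names are illustrative, not a real broker's API): each consumer remembers its own offset, so it can process the events published while it was unavailable.

```python
from dataclasses import dataclass, field

@dataclass
class EventLog:
    """Append-only log; events are kept long enough for consumers to catch up."""
    events: list = field(default_factory=list)

    def publish(self, event):
        self.events.append(event)

    def read_from(self, offset):
        """Return the events a consumer has not yet seen."""
        return self.events[offset:]

@dataclass
class Consumer:
    """Tracks its own position so it can resume after downtime."""
    offset: int = 0
    seen: list = field(default_factory=list)

    def catch_up(self, log: EventLog):
        for event in log.read_from(self.offset):
            self.seen.append(event)  # react to the event
            self.offset += 1

log = EventLog()
consumer = Consumer()
log.publish("OrderPlaced")
consumer.catch_up(log)
# The consumer goes offline; events keep arriving in the meantime.
log.publish("OrderPaid")
log.publish("OrderShipped")
consumer.catch_up(log)  # picks up the missed events in order
```

The publisher never knows who consumes its events; the offset is all the state a consumer needs to achieve temporal decoupling.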

Event streaming applications typically determine which events to publish based on the current state stored in a dedicated database.

Event sourcing differs from event streaming in this aspect.

Event Sourcing uses past events to validate the publication of new events. The events published in the past determine the consistency condition that must be met for a new event to be published, and the event store guarantees this consistency by allowing the publication only if the condition holds.

In essence, Event Sourcing employs events as a means to maintain the integrity of invariants, going beyond the scope of event streaming.

Like Event Streaming, Event Sourcing also implies the persistence of events. In addition, Event Sourcing requires the events to be persisted forever since they may be needed anytime to rebuild the data structure used to validate invariants. 

Let's now pair each of these uses of events with the advantages and disadvantages it provides.

Persistence of Events

Persisting events indefinitely serves as a natural form of audit. Furthermore, event persistence allows storing several additional pieces of information that can be used for different purposes, such as data mining and analytics. However, it's important to note that holding events indefinitely can lead to significant disk space requirements as the number of events continues to grow over time.

Event Streaming

Event streaming is an ideal fit for asynchronous communication. It enables any component to publish events without concern for the consequences those events may trigger. This is an extremely powerful concept because it allows dividing a process into many small actions, isolating the deterministic part, the business rule governing the events' publication, from the non-deterministic part, the consequences of the published events.

While event streaming facilitates reproducing production problems, it can complicate debugging. When combined with the indefinite persistence of events, event streaming allows for event replay and enables the implementation of new features that leverage past events. However, applications must then remain able to handle every event ever published, no matter how old.
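As a sketch of this replay capability, assuming a hypothetical order history: a read model conceived long after the events were recorded can still be computed from them.

```python
# Events persisted since day one; the event shapes here are illustrative.
events = [
    {"type": "OrderPlaced", "amount": 40},
    {"type": "OrderPlaced", "amount": 60},
    {"type": "OrderCancelled", "amount": 60},
]

def total_revenue(history):
    """A read model added later, built by replaying the full event history."""
    total = 0
    for e in history:
        if e["type"] == "OrderPlaced":
            total += e["amount"]
        elif e["type"] == "OrderCancelled":
            total -= e["amount"]
    return total

revenue = total_revenue(events)  # 40 + 60 - 60 = 40
```

The feature works retroactively precisely because the events were never thrown away; the flip side is that `total_revenue` must handle every event type, old or new, that the stream may contain.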

Event Sourcing

Event sourcing arises from a technical necessity. The requirement to use a dedicated tool for event streaming introduces a challenge: it becomes impossible to publish events and save the current state within the same transaction.

There are two resources that do not share the same transactional scope:

  • the first is the tool used to persist the current state, which is necessary for validating invariants;
  • the second is the tool dedicated to publishing and streaming events.

Since distributed transactionality cannot be guaranteed, consistency between the current data structure and the events becomes unattainable.

To address this issue, event sourcing relies on a single resource for both purposes: the event store provides the current state, represented by past events, and evolves this state by publishing new events. The event store guarantees that events are only published when consistent with the past by rejecting an append that fails to meet the expected consistency condition.
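A minimal sketch of this single-resource idea (the `EventStore` API is illustrative, not that of any real product): the same append that records an event also publishes it, and the store rejects the append when the expected consistency condition, here simply the position at which the state was read, no longer holds.

```python
class ConcurrencyError(Exception):
    """Raised when an append violates the expected consistency condition."""

class EventStore:
    """One resource holds the state (as past events) and publishes new ones,
    so no distributed transaction is needed."""
    def __init__(self):
        self._events = []

    def read(self):
        """Return the past events plus the position they were read at."""
        return list(self._events), len(self._events)

    def append(self, event, expected_position):
        """Append only if no events were added since the state was read."""
        if expected_position != len(self._events):
            raise ConcurrencyError("state changed since it was read")
        self._events.append(event)

store = EventStore()
_, pos = store.read()
store.append("CourseCreated", expected_position=pos)  # condition holds
try:
    store.append("CourseRenamed", expected_position=pos)  # stale position
    rejected = False
except ConcurrencyError:
    rejected = True  # the second writer must re-read and retry
```

Because validation and publication happen against the same resource, the window for inconsistency between "what we decided" and "what we published" disappears.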

Event sourcing's detractors often highlight the complexity of reconstructing the current state from past events. While this aspect holds some validity, the advantages of Dynamic Consistency Boundaries (DCBs) outweigh the drawbacks. By decoupling the way data is stored - the events - from the structure used to validate invariants, Event Sourcing grants immense flexibility. Through the adoption of DCBs, the ability to dynamically reconstruct any necessary structure from a stream of past events transforms this from a disadvantage into a significant advantage.
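As a sketch of the flexibility this decoupling grants, assuming a hypothetical seat-booking domain: several decision models can be folded on demand from the same stored events, and new models can be added later without migrating the stored data.

```python
# The stored form is just a stream of events; the shapes below are illustrative.
events = [
    {"type": "SeatBooked", "course": "math"},
    {"type": "SeatBooked", "course": "physics"},
    {"type": "SeatBooked", "course": "math"},
]

def seats_taken(history, course):
    """One decision model: fold only the events relevant to one course."""
    return sum(1 for e in history
               if e["type"] == "SeatBooked" and e["course"] == course)

def active_courses(history):
    """Another model built from the same events, defined later, no migration."""
    return sorted({e["course"] for e in history if e["type"] == "SeatBooked"})
```

Each model selects only the events it cares about, which is the essence of a dynamic consistency boundary: the boundary is drawn per decision, not fixed in advance by a stored aggregate shape.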

Event Sourcing simplifies the implementation of optimistic locking. Since events are immutable, it is sufficient to verify that no new events capable of influencing the decision were appended after the current state was read.
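A sketch of this check, assuming a hypothetical relevance predicate: an append succeeds even if unrelated events arrived concurrently, and fails only when a later event could have influenced the decision.

```python
class ConcurrencyError(Exception):
    """Raised when a relevant event was appended after the state was read."""

def append_if_unaffected(log, position_read, new_event, affects):
    """Optimistic locking over an immutable log: succeed unless an event
    matching the `affects` predicate was added after `position_read`."""
    if any(affects(e) for e in log[position_read:]):
        raise ConcurrencyError("a relevant event was appended in the meantime")
    log.append(new_event)

log = ["SeatBooked:math"]
position = len(log)                # state read here
log.append("SeatBooked:physics")   # concurrent, but irrelevant to math
append_if_unaffected(log, position, "SeatBooked:math",
                     affects=lambda e: e.endswith(":math"))  # still succeeds
```

Because past events never change, the check reduces to inspecting the suffix of the log added after the read; a plain database row offers no such stable point to compare against.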

In conclusion, in response to the question of why use event sourcing, I would say:

Event Sourcing establishes events as the single source of truth within the system; furthermore, through the adoption of DCBs, event sourcing provides highly flexible use of events, allowing optimal design to emerge over time.
