In the fintech landscape of 2026, data management is no longer just about preserving the current state: it is about being able to demonstrate, verifiably, how that state was reached. Banking event sourcing is the paradigm shift needed to meet stringent compliance regulations (such as PSD3 and the evolving Basel framework) and operational transparency requirements. In this technical guide, we explore why the CRUD (Create, Read, Update, Delete) model is no longer adequate for critical financial CRMs and how to implement a system based on immutable events for managing mortgage applications.
The Problem with CRUD in the Financial Sector
For decades, software development has relied on the CRUD model. In a classic relational database, when a mortgage application moves from “Underwriting” to “Approved”, we execute an UPDATE command that overwrites the previous value. While efficient for storage, this approach carries a critical cost: the history is lost.
In a banking context, overwriting data is an unacceptable risk. To guarantee the audit trail, developers often resort to separate log tables or complex database triggers. However, this approach has two fatal flaws:
- Misalignment: The log is a side effect, not the source of truth. If the log fails but the update succeeds, the audit is corrupted.
- Lack of context: We know the data changed, but we often lose the “why” (the business intent).
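Both flaws can be illustrated with a minimal sketch, using in-memory stand-ins for the application table and a separate audit log (all names here are hypothetical, not taken from any real CRM):

```python
# Minimal sketch of the CRUD pitfall: state and audit log are written
# separately, so they can silently diverge.

applications = {"MORTGAGE-001": {"status": "Underwriting"}}
audit_log = []

def update_status_crud(app_id: str, new_status: str, log_fails: bool = False) -> None:
    # 1) The UPDATE overwrites the previous value: the history is gone.
    applications[app_id]["status"] = new_status
    # 2) The audit entry is a *side effect*; if it fails, the update survives.
    if log_fails:
        return  # simulated log failure (e.g., a dropped connection)
    audit_log.append({"app_id": app_id, "new_status": new_status})

update_status_crud("MORTGAGE-001", "Approved", log_fails=True)

print(applications["MORTGAGE-001"]["status"])  # "Approved" -- state changed
print(len(audit_log))                          # 0 -- but no audit trace
```

The state says “Approved”, the log says nothing ever happened: exactly the misalignment described above. Note also that even when the log write succeeds, it records only the new value, not the business intent behind the change.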
Event Sourcing: The Application as History, Not State

Banking event sourcing inverts this model. Instead of storing the current state of a mortgage application, we store the sequence of events that led to that state. The database no longer contains a modifiable row, but an immutable append-only log.
According to principles described by experts like Martin Fowler, the current state of the application is purely a derivation: it is the result of replaying all past events, in order, from the beginning.
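This “state as a derivation” idea is literally a left fold over the event stream. A minimal sketch (event names and fields are illustrative, not a prescribed schema):

```python
from functools import reduce

# Each event is an immutable fact; state is derived by replaying them in order.
events = [
    {"type": "MortgageApplicationCreated", "amount": 250_000},
    {"type": "CreditScoringCalculated", "score": 712},
    {"type": "DecisionIssued", "decision": "Approved"},
]

def apply_event(state: dict, event: dict) -> dict:
    # Pure function: never mutates the input, always returns a new state.
    if event["type"] == "MortgageApplicationCreated":
        return {**state, "amount": event["amount"], "status": "Created"}
    if event["type"] == "CreditScoringCalculated":
        return {**state, "score": event["score"]}
    if event["type"] == "DecisionIssued":
        return {**state, "status": event["decision"]}
    return state

current_state = reduce(apply_event, events, {})
print(current_state)  # derived on demand, never stored as the source of truth
```

Because `apply_event` is pure, replaying the same log always yields the same state, which is what makes the derivation reproducible for an auditor.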
Domain Modeling: From Objects to Events
Let’s imagine the lifecycle of a mortgage. In a CRUD system, we would have an Applications table. In Event Sourcing, we define precise domain events:
- MortgageApplicationCreated (contains customer ID, requested amount, date)
- IncomeDocumentationUploaded (contains references to the PDFs, metadata)
- CreditScoringCalculated (contains the credit bureau score at the time of calculation)
- InterestRateLocked (contains the IRS rate of the day)
- DecisionIssued (contains the ID of the approver)
Each event is an immutable historical fact. It cannot be deleted or modified, only compensated by a subsequent event (e.g., DecisionCancelled).
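These events can be modeled as immutable value objects, with compensation expressed as a new fact rather than a mutation. A sketch using frozen dataclasses (the field names are assumptions for illustration):

```python
from dataclasses import dataclass, FrozenInstanceError
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen=True makes instances immutable
class DecisionIssued:
    application_id: str
    approver_id: str
    decision: str
    occurred_at: datetime

@dataclass(frozen=True)
class DecisionCancelled:
    application_id: str
    reason: str
    occurred_at: datetime

issued = DecisionIssued("MORTGAGE-2026-8899", "agent_rossi", "Approved",
                        datetime.now(timezone.utc))
try:
    issued.decision = "Rejected"  # attempting to rewrite history...
except FrozenInstanceError:
    print("events cannot be modified, only compensated")

# The correction is a *new* fact, appended after the original one:
compensation = DecisionCancelled("MORTGAGE-2026-8899", "documentation error",
                                 datetime.now(timezone.utc))
```

The original DecisionIssued stays in the log forever; the DecisionCancelled that follows it is what changes the derived state.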
Reference Architecture: CQRS and Streaming

Implementing a banking event sourcing system almost always requires adopting the CQRS (Command Query Responsibility Segregation) pattern. Since reading a sequence of 100 events to reconstruct the state of an application every time an operator opens the dashboard is inefficient, we separate writing from reading.
1. The Write Side (Command)
The heart of the system is the Event Store. Technologies like Apache Kafka or Amazon Kinesis are ideal for this purpose due to their distributed log nature and durable persistence.
When a CRM agent clicks on “Approve Income”, the system:
- Validates the command against the current state (reconstructed in memory).
- Generates the IncomeVerified event.
- Writes the event to a Kafka topic (e.g., mortgage-events-v1).
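The three write-side steps can be sketched with an in-memory list standing in for the mortgage-events-v1 topic (a real implementation would use a Kafka producer; the command handler and field names here are illustrative):

```python
import uuid
from datetime import datetime, timezone

# An in-memory append-only list stands in for the Kafka topic.
event_store: list[dict] = []

def current_state(application_id: str) -> dict:
    """Step 1 precondition: rebuild state by replaying past events."""
    state = {"status": "New"}
    for event in event_store:
        if event["aggregateId"] == application_id:
            state["status"] = event["eventType"]
    return state

def handle_approve_income(application_id: str, agent_id: str) -> dict:
    state = current_state(application_id)
    # 1) Validate the command against the reconstructed state.
    if state["status"] == "IncomeVerified":
        raise ValueError("income already verified")
    # 2) Generate the IncomeVerified event.
    event = {
        "eventId": str(uuid.uuid4()),
        "eventType": "IncomeVerified",
        "aggregateId": application_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "metadata": {"userId": agent_id},
    }
    # 3) Append it to the topic (here: the in-memory log).
    event_store.append(event)
    return event

handle_approve_income("MORTGAGE-2026-8899", "agent_rossi")
```

A second call for the same application would be rejected by step 1: the validation runs against state derived from the log itself, not against a separately maintained table.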
2. The Read Side (Query) – The Projections
To display data in the CRM, we use “Consumers” that listen to the Kafka topic and update read-optimized databases (Projections). We can have different projections for the same data stream:
- Operational Projection (SQL/NoSQL): An ActiveApplications table on PostgreSQL or MongoDB containing the current state for the agent’s UI.
- Analytical Projection (Elasticsearch): An index that allows the marketing team to search for “all applications rejected due to insufficient income in the last month”.
- Audit Projection (Cold Storage): Archiving on S3/Glacier for long-term compliance.
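Each projection is just a consumer that folds events into a read-optimized shape. A minimal sketch of the operational projection, with in-memory stand-ins for the topic and the ActiveApplications table (names are illustrative):

```python
# Read side: a consumer folds the event stream into a read-optimized table.
incoming_events = [
    {"eventType": "MortgageApplicationCreated", "aggregateId": "M-1",
     "payload": {"amount": 250_000}},
    {"eventType": "IncomeVerified", "aggregateId": "M-1", "payload": {}},
    {"eventType": "DecisionIssued", "aggregateId": "M-1",
     "payload": {"decision": "Approved"}},
]

active_applications: dict[str, dict] = {}  # the operational projection

def project(event: dict) -> None:
    """Consume one event and update the read model."""
    row = active_applications.setdefault(event["aggregateId"], {"status": "New"})
    if event["eventType"] == "MortgageApplicationCreated":
        row["amount"] = event["payload"]["amount"]
        row["status"] = "Created"
    elif event["eventType"] == "IncomeVerified":
        row["status"] = "IncomeVerified"
    elif event["eventType"] == "DecisionIssued":
        row["status"] = event["payload"]["decision"]

for event in incoming_events:  # in production: a Kafka consumer loop
    project(event)

print(active_applications["M-1"])  # the current state the agent's UI reads
```

The analytical and audit projections would consume the same stream with different fold logic; none of them is the source of truth, so any projection can be dropped and rebuilt from the log.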
Critical Advantages for Fintech
Native and “By Design” Audit Trail
In event sourcing, the audit trail is not an add-on feature: it is the database itself. It is impossible to modify the state without leaving an indelible trace. This natively satisfies the non-repudiation requirements demanded by supervisory bodies.
Time-Travel Debugging
This is perhaps the most powerful feature for developers and auditors. Imagine a customer disputes an interest rate applied six months ago. In a CRUD system, you would only see the current rate. With event sourcing, you can:
- Take the application ID.
- Replay events from 0 up to the exact date of the dispute (e.g., 1 November 2025 at 2:30 PM).
- See exactly the system state, input data, and business rules active at that precise moment.
This allows answering questions like: “Why did the system reject the application on that day?” by reconstructing the exact context, including any bugs present in the code on that past date.
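Replaying up to a cutoff is just a filter plus the same fold used for the current state. A sketch (event shapes and rates are illustrative):

```python
from datetime import datetime, timezone

# Time-travel: replay only the events that existed at the disputed moment.
events = [
    {"type": "InterestRateLocked", "rate": 3.10,
     "at": datetime(2025, 10, 1, tzinfo=timezone.utc)},
    {"type": "InterestRateLocked", "rate": 3.45,
     "at": datetime(2025, 12, 5, tzinfo=timezone.utc)},
]

def state_at(events: list[dict], cutoff: datetime) -> dict:
    """Fold only the events recorded on or before the cutoff."""
    state: dict = {}
    for event in sorted(events, key=lambda e: e["at"]):
        if event["at"] > cutoff:
            break
        if event["type"] == "InterestRateLocked":
            state["rate"] = event["rate"]
    return state

dispute_moment = datetime(2025, 11, 1, 14, 30, tzinfo=timezone.utc)
print(state_at(events, dispute_moment))  # the rate the customer actually saw
```

The later rate change is simply excluded from the fold, so the reconstructed state matches what the system held at the disputed moment.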
Technical Implementation: Event Snippet
Here is how a structured JSON event for a banking system might look:
```json
{
  "eventId": "550e8400-e29b-41d4-a716-446655440000",
  "eventType": "RiskAssessmentCompleted",
  "aggregateId": "MORTGAGE-2026-8899",
  "timestamp": "2026-01-11T10:15:30Z",
  "version": 1,
  "metadata": {
    "userId": "agent_rossi",
    "ipAddress": "192.168.1.50",
    "correlationId": "req-123-abc"
  },
  "payload": {
    "riskScore": "LOW",
    "maxLTV": 0.80,
    "interestRateSpread": 1.25,
    "rulesVersion": "v2025.12"
  }
}
```
Note the rulesVersion field in the payload: recording the version of the business rules that were applied is essential to justify automated decisions during an audit.
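Before appending, a writer can enforce that every event carries these audit fields. A minimal sketch (the required-field list is an assumption, not a standard):

```python
# Reject events that lack the audit fields an examiner will later need.
REQUIRED_TOP_LEVEL = {"eventId", "eventType", "aggregateId", "timestamp", "version"}
REQUIRED_METADATA = {"userId", "correlationId"}

def validate_event(event: dict) -> list[str]:
    """Return the list of missing audit fields (empty list = valid)."""
    missing = sorted(REQUIRED_TOP_LEVEL - event.keys())
    missing += sorted(f"metadata.{key}"
                      for key in REQUIRED_METADATA - event.get("metadata", {}).keys())
    if "rulesVersion" not in event.get("payload", {}):
        missing.append("payload.rulesVersion")  # needed to justify automated decisions
    return missing

incomplete = {"eventType": "RiskAssessmentCompleted",
              "metadata": {"userId": "agent_rossi"}}
print(validate_event(incomplete))
```

Running such a check at write time keeps malformed events out of the log entirely, which matters because an append-only store offers no second chance to fix them in place.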
Challenges and Final Considerations
Adopting banking event sourcing is not without costs. Architectural complexity increases and requires careful management of:
- Schema Evolution: How to handle events created 5 years ago with a different structure than the current one? (Solution: Upcasters).
- Snapshotting: For applications with thousands of events, replaying everything from scratch is slow. Periodic “snapshots” of the state are created to speed up loading.
- GDPR and Right to be Forgotten: Deleting data in an immutable log is complex. The “Crypto-shredding” technique (encrypting sensitive data and deleting the decryption key) is often used.
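Snapshotting, in particular, can be sketched as: persist the folded state every N events, then on load replay only the tail (in-memory and illustrative; the interval and event shape are assumptions):

```python
# Snapshotting: store the folded state every N events so a load only
# replays the tail, not the whole history.
SNAPSHOT_EVERY = 100

def apply_event(state: dict, event: dict) -> dict:
    # Toy fold: returns a new dict, so stored snapshots never mutate.
    return {**state, "count": state.get("count", 0) + event["delta"]}

events = [{"seq": i, "delta": 1} for i in range(250)]

# Writer side: take a snapshot at every multiple of SNAPSHOT_EVERY.
snapshots: dict[int, dict] = {}
state: dict = {}
for event in events:
    state = apply_event(state, event)
    if (event["seq"] + 1) % SNAPSHOT_EVERY == 0:
        snapshots[event["seq"]] = state

# Loader side: start from the latest snapshot, replay only the tail.
latest_seq = max(snapshots)
loaded = snapshots[latest_seq]
for event in events:
    if event["seq"] > latest_seq:
        loaded = apply_event(loaded, event)

print(loaded["count"], "reconstructed by replaying only",
      len(events) - latest_seq - 1, "events")
```

With 250 events and snapshots every 100, the loader replays only the 50 events after the last snapshot yet arrives at the same state as a full replay; the snapshot is a pure optimization and can always be discarded and rebuilt from the log.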
Despite these challenges, for core banking systems and modern financial CRMs, the benefits in terms of security, traceability, and resilience far outweigh the implementation costs. Moving to the event model means no longer destroying data, and instead building a historical information asset of lasting value.
Frequently Asked Questions

What is banking event sourcing and why does it matter in fintech?
Banking event sourcing is an architectural paradigm that stores data as an immutable sequence of historical events rather than overwriting the current state. This approach is crucial in modern fintech because it guarantees total transparency and allows the reconstruction of every step of an application, meeting regulatory requirements such as PSD3 and the Basel framework.
Why is the CRUD model risky for banking systems?
Using the CRUD model in banking systems is risky because the update operation overwrites previous data, erasing the history and the intent behind every change. This leads to the loss of information critical for the audit trail and creates potential misalignments between the main database and system logs, compromising financial data security.
How does the CQRS pattern work in a banking CRM?
The CQRS pattern clearly separates write operations from read operations to optimize CRM performance. In the banking context, events are written to a high-reliability distributed log like Apache Kafka, while information is read from dedicated projections on fast databases, allowing operators to view the status of applications in real time without slowdowns.
Why is the audit trail “by design” in event sourcing?
With event sourcing, the audit trail is not an accessory feature but constitutes the very structure of the database. Since every action is recorded as an immutable event that cannot be modified or deleted, the system natively offers the proof of non-repudiation and the complete traceability required by supervisory bodies for every operational decision.
What is Time-Travel Debugging?
Time-Travel Debugging is a powerful feature that allows replaying the sequence of events up to a precise moment in the past. This enables banks to reconstruct exactly the context, data, and business rules active at the moment a decision was made, providing precise answers in case of disputes over rates or decisions that occurred months earlier.
How is the GDPR right to be forgotten reconciled with an immutable log?
To reconcile the immutability of the event log with the GDPR right to be forgotten, the crypto-shredding technique is often adopted. Sensitive data is saved in encrypted form, and in the event of a deletion request, only the decryption key is permanently deleted, rendering the historical information unreadable without having to alter the physical sequence of the log.
Did you find this article helpful? Is there another topic you’d like to see me cover?
Write it in the comments below! I take inspiration directly from your suggestions.