Where database blog posts get flame-broiled to perfection
One stumbles across these little... announcements from the industry front lines and one is forced to put down one's tea. It seems the children have discovered a new set of crayons, and they are using them to scribble all over the foundations of computer science. I suppose, for the sake of pedagogy, one must break down the myriad fallacies into a digestible format for the modern attention span.
First, we are presented with the grand concept of "observability." A rather elaborate term for "watching your creation flail in real-time." A properly designed system, one built upon the rigorous mathematical certainty of the relational model, does not require a constant, frantic stream of "telemetry" to ensure its correctness. Its correctness is inherent in its design. This obsession with monitoring is merely an admission that you have built something so needlessly complex and fragile that you cannot possibly reason about its state without a dashboard of flashing lights.
They offer "full control": not over data integrity, you understand, but over its visualization. How delightful. One can now construct a beautiful pie chart detailing the precise rate at which one is violating Codd's third rule. They are so preoccupied with measuring the engine's temperature that they've forgotten the principles of internal combustion.
Full control over monitoring, visualization, and alerting. A solution in search of a problem that wouldn't exist if they'd simply adhered to the principles of normalization. "But is it in Boyce-Codd Normal Form?" I ask. The response, I fear, would be a blank stare followed by an enthusiastic pitch for a new charting library.
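For the benefit of the charting-library enthusiasts, the complaint can be made concrete. The sketch below (table and column names invented for illustration) uses Python's built-in sqlite3 to show a schema where a supplier's city depends on the supplier alone rather than on the full (supplier, part) key, a non-key functional dependency that BCNF forbids, and the decomposition that cures it. No dashboard required.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The offending design: city is determined by supplier alone, not by the
# (supplier, part) key -- a determinant that is not a key, so not in BCNF.
cur.execute("CREATE TABLE shipments_bad (supplier TEXT, part TEXT, city TEXT)")
cur.executemany("INSERT INTO shipments_bad VALUES (?, ?, ?)",
                [("Acme", "bolt", "London"),
                 ("Acme", "nut", "London"),   # 'London' stored twice: an update anomaly in waiting
                 ("Bilt", "bolt", "Paris")])

# The decomposition BCNF demands: every determinant becomes a key.
cur.execute("CREATE TABLE suppliers (supplier TEXT PRIMARY KEY, city TEXT)")
cur.execute("CREATE TABLE shipments (supplier TEXT, part TEXT, "
            "PRIMARY KEY (supplier, part))")
cur.execute("INSERT INTO suppliers SELECT DISTINCT supplier, city FROM shipments_bad")
cur.execute("INSERT INTO shipments SELECT supplier, part FROM shipments_bad")

# Each city is now recorded exactly once; a join losslessly reconstructs the original.
rows = cur.execute("""SELECT s.supplier, s.part, u.city
                      FROM shipments s JOIN suppliers u USING (supplier)
                      ORDER BY s.supplier, s.part""").fetchall()
print(rows)
```

The point, of course, is that the anomaly is removed by design rather than graphed in real time.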
This frantic need to stream every internal gasp of the database suggests a system teetering on the very edge of the CAP theorem, likely making a complete hash of it. Clearly, they've never read Brewer's conjecture, let alone the subsequent proofs. They sacrifice consistency for availability and then celebrate the invention of a glorified seismograph to measure the resulting tremors. It's not an innovation; it's an intellectual surrender.
And what of our dear, forgotten friend, ACID? One shudders to think. In a world of "eventual consistency" and streamed metrics, the transactional guarantees that form the very bedrock of reliable data management are treated as quaint suggestions. Atomicity, Consistency, Isolation, and Durability have been replaced by Monitoring, Visualizing, Alerting, and Panicking. "Eventually consistent" is what we used to call "wrong."
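Atomicity, at least, is simple enough to demonstrate even to the dashboard-addled. A minimal sketch with Python's sqlite3 (the account names are invented for illustration): a transfer either commits whole or rolls back whole, with a CHECK constraint standing in for consistency. No seismograph is consulted at any point.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts ("
             "name TEXT PRIMARY KEY, "
             "balance INTEGER NOT NULL CHECK (balance >= 0))")  # consistency, declared once
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

try:
    with conn:  # one transaction: commits on success, rolls back on exception
        # Attempt to move 200 out of an account holding 100.
        conn.execute("UPDATE accounts SET balance = balance - 200 "
                     "WHERE name = 'alice'")  # violates the CHECK constraint
        conn.execute("UPDATE accounts SET balance = balance + 200 "
                     "WHERE name = 'bob'")
except sqlite3.IntegrityError:
    pass  # the whole transfer is undone -- not half of it

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # alice still has 100, bob still has 50
```

The connection-as-context-manager is the standard sqlite3 idiom for transaction scope; the database enforces the invariant, and the application merely observes that nothing went wrong.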
It all speaks to a fundamental, almost willful ignorance of the literature. The problems they are so proudly "solving" with these baroque "observability stacks" are artifacts of their own poor design choices, choices made because, one must conclude, nobody reads the papers anymore. Clearly, they've never read Stonebraker's seminal work on the architecture of database systems, or they would understand that a robust model preempts the need for this sort of digital hand-wringing.
Ah, well. I suppose there's another grant proposal to write. One must try to keep the lights on, even as the barbarians are not just at the gates, but cheerfully selling tickets to watch the citadel burn.