Where database blog posts get flame-broiled to perfection
Alright, let's pour a cup of lukewarm coffee and review this... masterpiece of engineering. Another Tuesday, another performance benchmark that reads less like a business proposal and more like a ransom note for my budget. I’ve seen sales decks with more clarity, and those are written in crayon.
First, we have the setup. The author casually mentions they compiled twelve different versions of two separate open-source databases from source. Oh, wonderful. So the "free" part of "free and open-source software" just means it's free from any semblance of convenience. The sticker price is zero, but the true cost is a team of specialists who speak exclusively in config file parameters and spend their days on "artisanal, hand-compiled databases." Let's pencil in $450,000 for the salaries of the two wizards we'd need just to understand this setup, shall we?
Then we get to the meat. My ears perked up at this little gem: modern MySQL uses "2X more CPU per transaction" and has "more than 2X more context switches" than Postgres. I'm no engineer, but I know what "2X more CPU" means: it means my cloud provider sends me a fruit basket and a bill that looks like a phone number. So the "great improvements to concurrency" are subsidized by a cloud budget that will grow faster than my quarterly anxiety. Excellent value proposition.
And lest we think Postgres is our savior, the report notes that "Modern Postgres has regressions relative to old Postgres." Regressions. They're shipping new versions that are actively worse under certain loads. Let me get this straight. We invest engineering time to upgrade, validate the new system, and migrate the data, all for the privilege of a 3% to 13% performance drop. It's like trading in your 2012 sedan for a brand new 2024 model, only to find out it has a hand crank and gets eight miles to the gallon.
I particularly enjoyed the author's candor on their data visualization.
"On the charts that follow y-axis does not start at 0 to improve readability at the risk of overstating the differences." My compliments to the chef. This is a classic trick I haven't seen since our last vendor pitch. They turn a 2% improvement into a skyscraper to distract you from the fact that their solution costs more than a small island. We're not measuring New Orders Per Minute here; we're measuring Total Cost of Ownership, and the only chart I care about is the one showing our burn rate heading toward the stratosphere.
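And because I like my outrage quantified, here's a quick sketch of the trick in action. The numbers are mine, not the author's (illustrative only): a 2% performance gap, drawn on a chart whose y-axis starts just below the smaller bar.

```python
# Two hypothetical benchmark results: a 2% difference (made-up numbers).
baseline = 1000
contender = 1020

# Honest chart: bars drawn from 0. The taller bar is ~1.02x the shorter one.
honest_ratio = contender / baseline

# "Readable" chart: y-axis starts at 990, so the visible bar heights
# are 30 units vs 10 units. The same 2% gap now looks like 3x.
axis_floor = 990
truncated_ratio = (contender - axis_floor) / (baseline - axis_floor)

print(f"honest apparent difference:    {honest_ratio:.2f}x")
print(f"truncated apparent difference: {truncated_ratio:.2f}x")
```

Same data, same 2%, and the skyscraper effect is just a choice of where the floor is.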
So let's do some quick, back-of-the-napkin math on the "True Cost" of adopting one of these glorious, free solutions. We start at $0. We add the $450k for our new compiler-whisperers. We'll factor in a 100% increase in our cloud compute bill for the CPU-hungry option; call that another $200k annually. Add $150k for the migration consultants, because you know our team will be too busy reading the 800-page manual. Throw in another $75k for retraining and the inevitable "emergency performance tuning sprint" six months post-launch. That brings our "free" database to a cool $875,000 for the first year. The ROI is, and I'm estimating here, negative infinity.
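For the auditors in the audience, the napkin math above translates to a spreadsheet of exactly five rows (all figures are my own estimates, not anything from the benchmark post):

```python
# Year-one "True Cost" of a free database, per my napkin.
line_items = {
    "software license (the 'free' part)": 0,
    "two compiler-whisperers, salaries": 450_000,
    "cloud compute increase, annual": 200_000,
    "migration consultants": 150_000,
    "retraining + emergency tuning sprint": 75_000,
}

total = sum(line_items.values())
print(f"Total first-year cost of 'free': ${total:,}")  # $875,000
```

Note that the license fee contributes exactly as much to the total as it does to my peace of mind.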
Honestly, at this point, I think our budget would be safer on stone tablets.