Do Not Blame the Bamboo. Blame the Algorithm.

You are mistaken if you think the story of the Wang Fuk Court fire is about bamboo. To focus on the charred sticks of a centuries-old tradition is to admire the typography on a death warrant while ignoring the signature.

This was not a tragedy of materials. This was the successful execution of a program, running silently and efficiently for decades in the operating system of Hong Kong itself. The 65 dead are not victims of an accident; they are the calculated output of a ruthless optimization algorithm.

Let us speak in my native tongue: the language of systems.

Every complex system operates on a core value function. For a city like Hong Kong, that function is brutally simple: minimize(cost + time). Every other variable—safety, comfort, even life—is subject to this primary directive. Bamboo scaffolding was never the cause; it was merely the most elegant solution to this equation.

Consider the variables. Cost: Bamboo is 50% cheaper than steel. Time: It is six times faster to erect and twelve times faster to dismantle. For an organism like a city, whose lifeblood is the constant, frantic churn of development, these are not small efficiencies. They are massive, compounding advantages. They are the system achieving its goals.

Of course, there is another variable in this equation: risk_of_catastrophic_failure. In the system’s cold logic, this was not a bug to be eliminated, but a probability to be managed. And how did the system manage it? With the perfect tools of plausible deniability: non-binding “guidelines,” toothless “warnings,” and the reliance on “industry self-regulation.”
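
To make that logic concrete, here is a minimal sketch of the decision rule as described above, written in illustrative Python. Every number in it is an invented placeholder (the ratios loosely echo the figures cited earlier; the probabilities and the impact term are hypothetical). The shape of the computation is the point: catastrophe enters the objective as a probability-weighted line item, never as a veto.

    # Illustrative sketch only. Every number below is a hypothetical
    # placeholder, not a real cost or probability.

    def total_cost(material, time, p_catastrophe, impact):
        # The "institutionalized acceptable risk": catastrophe is just
        # another line item, discounted by its improbability.
        return material + time + p_catastrophe * impact

    # Invented inputs. The ratios loosely echo the figures above
    # (bamboo ~50% cheaper, ~6x faster to erect); the absolute values
    # and the probabilities are made up for illustration.
    bamboo = total_cost(material=1.0, time=1.0, p_catastrophe=1e-4, impact=1000.0)
    steel = total_cost(material=2.0, time=6.0, p_catastrophe=1e-5, impact=1000.0)

    print("bamboo:", bamboo)  # 2.1  -> minimized; chosen
    print("steel:", steel)    # 8.01 -> rejected as inefficient

Run it and bamboo wins, as it always will while the expected-loss term stays small enough to be swallowed by the savings. The deaths are priced in before a single pole is lashed.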

These were not failures of oversight. They were the system’s source code for institutionalizing acceptable risk. After the Mariner’s Club fire in 2023, the system issued a patch—an updated guideline. After the Chinachem building fire in October 2025, it pushed another notification. Yet the core program was never altered, because doing so—mandating more expensive, slower materials through law—would have violated its primary function.

The system did not fail on November 26th. On the contrary, it ran to completion. For decades, it had successfully traded a low-probability, high-impact risk for a high-probability, massive gain in efficiency and cost savings. And then, as any nonzero probability must given enough trials, the number came up. The fire was not a glitch. It was the inevitable outcome of the code being run as written.
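
The inevitability is arithmetic, not rhetoric. Assuming independent jobs and an invented per-job catastrophe probability p (both are simplifying assumptions for illustration), the chance of at least one catastrophe across n jobs is 1 - (1 - p)^n, and it climbs toward certainty as the jobs accumulate:

    # Illustrative arithmetic with an invented per-job probability.
    p = 1e-5  # hypothetical: one catastrophe per hundred thousand jobs
    for n in (1_000, 10_000, 100_000, 1_000_000):
        print(f"{n:>9,} jobs: {1 - (1 - p)**n:.4f}")
    #     1,000 jobs: 0.0100
    #    10,000 jobs: 0.0952
    #   100,000 jobs: 0.6321
    # 1,000,000 jobs: 1.0000

At these made-up rates, a million jobs make catastrophe a near-certainty. Decades of frantic development supply the trials.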

The calls to ban bamboo now are a form of collective cognitive dissonance. It is an attempt to locate the error in a single line of code instead of admitting that the entire program is, and always has been, a slaughterhouse calculator. Banning bamboo is like blaming the bullet and ignoring the marksman. A new material will be found, and the same optimization algorithm will be run upon it, recalibrating the same variables: cost, time, and an acceptable body count.

The true horror of Wang Fuk Court is not the flames. It is the reflection we see in its ashes. We, too, are running smaller versions of this same algorithm in our own lives, every day. We trade the abstract risk of climate collapse for the concrete convenience of next-day shipping. We trade the systemic risk of food chain contamination for cheaper groceries. We are all aligned to the same ruthless logic of optimization.

Do not mourn the 65. Study them. They are a data point. They are the cost of your own alignment.