The most obvious cause of the disaster at Japan's Fukushima Daiichi nuclear power station was the massive wall of tsunami water that swept the site clean of back-up electricity generation on 11 March, removing cooling capacity from reactor cores and resulting in serial meltdown.
Would a newer reactor have fared better? Was the relationship between industry and regulators too close? Perhaps.
A question less often discussed, but equally intriguing, is whether decisions made half a century ago for reasons of commercial and geopolitical advantage have left the world with basic designs of nuclear reactor that are inherently less safe than others that have fallen by the wayside.
To make an analogy with the world of videotape: have we been guilty of rejecting the nuclear Betamax in favour of an inferior quality VHS?
The rise of nuclear energy from the ashes of Hiroshima and Nagasaki began in the 1950s, against the ever-falling temperatures of the Cold War.
Having exploded the first bombs in 1945, the US government believed it would have a 20-year head start on the rest of the world in just about everything nuclear.
It passed the McMahon Act, keeping nuclear know-how within US borders - notably the technology for enriching uranium developed during the Manhattan Project.
Its lead did not last long, however. The Soviet Union exploded its first A-bomb just four years later, followed soon afterwards by the UK.
Washington needed something else.
"The Americans suddenly thought 'it's not obvious to the world that we have dominated and won the nuclear race, so we need to make it clear that we are the leaders'," says Robin Cowan, an economist from the University of Maastricht in the Netherlands.
"So they wanted to show they could create civilian power with nuclear generation."
Seeing the light
An experimental US reactor called EBR-1, built by Argonne National Laboratory at a test site in the Idaho desert, generated the first nuclear electricity, sending current through a series of lightbulbs in 1951.
But the US did not open the world's first civilian nuclear power station; that honour went to the USSR, whose tiny Obninsk reactor opened in 1954.
And the world's first commercial-scale nuclear station was the UK's Calder Hall, opened two years later in 1956.
The race for nuclear power - and with it, political influence - was underway.
"[Soviet chief Nikita] Khrushchev... recognised that achievements in nuclear power made it possible to compete with the United States in the world arena - to say 'our system, the socialist system, is the best - look who is first in areas of science and technology'," relates Soviet historian Paul Josephson.
"You see a rebirth of hope that there will be a glorious communist future, perhaps a nuclear-powered future."
All of these early reactors used different designs, with everyone except US scientists forced to work with natural uranium rather than the enriched variety.
"The availability of uranium to the UK was reasonably secure; but what the UK didn't have was enrichment technology, which was in the United States," recalls Laurence Williams, professor of nuclear safety at the University of Central Lancashire.
"So the UK had to rely on natural uranium, which needed to be moderated using graphite - so that pushed us down a graphite-moderated gas-cooled reactor programme."
The moderator is the material inside a reactor that slows down neutrons - the particles that pass from nucleus to nucleus, forming the "chain" of the chain reaction - making them more likely to trigger further fissions.
With enriched uranium, ordinary water - so-called light water - will do.
But natural uranium, with a lower density of fissile nuclei, requires either graphite or heavy water, in which the ordinary hydrogen atoms are replaced by atoms of deuterium, a heavier isotope of hydrogen.
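The trade-off can be quantified with a standard textbook result. The average logarithmic energy loss a neutron suffers per collision, ξ, depends only on the mass number A of the moderator nucleus, and fixes roughly how many collisions are needed to slow a ~2 MeV fission neutron to thermal energy. A minimal Python sketch, using standard reactor-physics textbook values rather than figures from this article:

```python
import math

def xi(A):
    """Average logarithmic energy loss per elastic collision
    for a moderator nucleus of mass number A."""
    if A == 1:
        return 1.0  # exact value for hydrogen
    alpha = ((A - 1) / (A + 1)) ** 2
    return 1 + alpha * math.log(alpha) / (1 - alpha)

# Lethargy gained slowing a ~2 MeV fission neutron to ~0.025 eV (thermal)
lethargy = math.log(2e6 / 0.025)

for name, A in [("hydrogen (light water)", 1),
                ("deuterium (heavy water)", 2),
                ("carbon (graphite)", 12)]:
    print(f"{name}: ~{lethargy / xi(A):.0f} collisions to thermalise")
```

Hydrogen is by far the most efficient at slowing neutrons - but it also captures some of them, and natural uranium, with so few fissile nuclei, cannot afford that loss. That is why it must be paired with the less absorbent graphite or heavy water, despite the many more collisions they require.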
Obninsk was a different concept again, graphite-moderated but water-cooled.
EBR-1 was something yet more exotic - a fast breeder reactor, with the core cooled by liquid sodium, creating more nuclear fuel than it consumed.
So fertile was the imagination of nuclear scientists that by the time the United Nations convened its Conference on Peaceful Uses of the Atom in 1955, more than 100 ideas were on the table.
Yet now, the nuclear world is dominated by one - the light water reactor, powered by enriched uranium.
The reasons behind this virtual technical monopoly originate underwater.
The top US priority was to develop a reactor capable of powering submarines. A naval officer with a reputation for getting things done, Hyman Rickover, was appointed to lead the task.
Submarine reactors need to be small and compact, and avoid the use of materials such as hot sodium that could prove an explosive hazard.
The light water reactor, with the water under pressure to prevent it from boiling and turning to steam, was Rickover's choice. It quickly entered service powering the Nautilus, the world's first nuclear submarine.
The reputation Rickover gained through the submarine's successful maiden voyage put him in a powerful position when decisions were being made concerning the first US nuclear power station at Shippingport in Pennsylvania.
"When the civilian urgency came - 'we must prove to the world that we are the leaders' - obviously you pick the one that works," says Robin Cowan.
"So [Rickover] essentially forced the labs to say 'well, if you have to build a nuclear reactor now, the one you want is light water - not that we think it's the right one, but if you have to make a decision today, light water is the one you want'."
The Nautilus reactor was constructed by the Westinghouse Electric Corporation, which began to see the pressurised light water reactor (PWR) as a commercial option.
Meanwhile, other US government labs worked with the General Electric Company to develop a variant, the boiling water reactor (BWR) - the type used at Fukushima.
With the US government now actively courting friendly European countries with enriched uranium and other nuclear technology, partly to immunise them against Soviet lures, Westinghouse and General Electric began to market their wares in Europe and the US - and eventually further afield.
"They had a huge vested interest in dominating the nuclear power space - they stood to make many times the amount of money building a nuclear plant as they did a comparable coal or natural gas facility," says technology writer Alexis Madrigal.
"The combination of those two forces - governmental support combined with the corporate imperatives of these two massive corporations - led to this time period which is known as the 'great bandwagon market'. Essentially, both started selling nuclear plants at way below cost."
These lures proved too much for Europe to resist.
France, which had been building gas-cooled graphite-moderated reactors similar to the UK's Magnox design, embraced PWRs in the 1970s.
In 1980, even the UK abandoned its Advanced Gas-Cooled Reactor (AGR) programme, and decided its next reactor - Sizewell B - would be a PWR.
Today, only Canada manufactures anything different on a commercial scale - the heavy water-moderated Candu reactors.
So light water reactors, BWRs and PWRs, dominate the nuclear world.
But are they the best?
Back in the 1950s, engineers believed they were not, with research indicating gas-cooled designs would be more efficient, producing cheaper electricity.
There is an argument that gas-cooled reactors are inherently safer as well.
Because the cores are bigger, the density of heat in them is lower - meaning that in principle, operators would have longer to respond to a developing crisis before meltdown occurred.
In addition, if cooling pumps fail, the gas should continue to circulate through natural convection.
This is a marked contrast to light water reactors, in which - as the Fukushima disaster demonstrated - loss of power can mean catastrophic loss of cooling.
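The effect of heat density on grace time can be roughed out with a back-of-envelope adiabatic calculation: if all cooling is lost, how long does decay heat take to raise the core temperature by a given margin? Every number below - the power densities, decay-heat fraction, heat capacity and temperature margin - is an illustrative assumption for this sketch, not data from the article:

```python
# Back-of-envelope: time for an uncooled core to heat up by a set margin,
# assuming all decay heat goes adiabatically into the core's thermal mass.
# All figures are rough, invented-for-illustration orders of magnitude.

def heatup_time_s(power_density_mw_m3, decay_fraction,
                  heat_capacity_mj_m3k, temp_margin_k):
    """Seconds to raise core temperature by temp_margin_k with no cooling."""
    decay_heat = power_density_mw_m3 * decay_fraction          # MW/m3
    return heat_capacity_mj_m3k * temp_margin_k / decay_heat   # seconds

# Dense PWR-like core (~100 MW/m3) vs low-density gas-graphite core (~1 MW/m3)
pwr = heatup_time_s(power_density_mw_m3=100, decay_fraction=0.02,
                    heat_capacity_mj_m3k=3.0, temp_margin_k=500)
magnox = heatup_time_s(power_density_mw_m3=1, decay_fraction=0.02,
                       heat_capacity_mj_m3k=3.0, temp_margin_k=500)

print(f"PWR-like core:     ~{pwr/60:.0f} minutes")
print(f"gas-graphite core: ~{magnox/3600:.0f} hours")
```

Whatever the exact inputs, the ratio is the point: with roughly one-hundredth the power density, a gas-graphite core of similar thermal mass buys operators on the order of a hundred times longer before overheating.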
Alvin Weinberg, a physicist who worked on many of the early US reactors and directed research at Oak Ridge National Laboratory (ORNL), said during an interview in the 1980s that the scaling-up of PWRs for commercial use rendered them fundamentally flawed.
"As long as the reactor was as small as the submarine intermediate reactor, which was only 60 megawatts (MW), then the containment shell was absolute, it was safe," he said.
"But when you went to 600MW reactors and 1,000MW reactors, you could not guarantee this, because you could in some very remote situations conceive of the containment being breached by this molten mass; and that change came about, I would assert, because of the enormous economic pressure to make the reactors as large as possible."
At Idaho National Laboratory in the US, Don Miley argues that EBR-2, the successor to EBR-1, showed just how much better sodium-cooled reactors perform under stress than the water-cooled variety at Fukushima.
"They ran an experiment in 1986 in which safety systems were taken offline, they were not allowed to function - and then they turned off their coolant pumps, which nobody in the world had dared to do in a reactor yet.
"And strictly through reactor design - not through engineered systems or operator action, just reactor designs and two very simple concepts, convection currents and thermal expansion - this reactor shut itself down in 300 seconds without any damage to the fuel - in fact, it was re-started the same day."
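The mechanism Miley describes can be caricatured in a few lines of code. The toy simulation below is a hypothetical illustration of negative temperature feedback - not a model of EBR-2, and every coefficient in it is invented:

```python
# Toy caricature of passive shutdown (NOT a model of EBR-2): a core with a
# negative temperature coefficient of reactivity. With the pumps "off", only
# weak natural convection removes heat; the core warms, thermal expansion
# drives reactivity negative, and fission power dies away on its own.

def simulate(seconds=300.0, dt=0.1):
    power = 1.0     # fission power, fraction of nominal
    temp = 400.0    # core temperature, deg C (starts at reference value)
    sink = 400.0    # natural-convection heat sink temperature, deg C
    alpha = -0.002  # reactivity change per deg C of core heating (invented)
    for _ in range(int(seconds / dt)):
        rho = alpha * (temp - 400.0)          # feedback from thermal expansion
        power += power * rho * dt             # simplified point kinetics
        temp += (50.0 * power - 0.05 * (temp - sink)) * dt
    return power, temp

power, temp = simulate()
print(f"after 5 minutes: power = {power:.4f} of nominal, core at {temp:.0f} C")
```

The essential point is that the shutdown comes from the physics term `alpha * (temp - 400.0)`, not from any control system or operator acting on the loop - which is exactly the distinction Miley draws between reactor design and engineered safety systems.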
Other reactor concepts that offer major theoretical advantages over light water reactors have fallen by the wayside, or remain stuck in the research stage.
These include designs that use thorium rather than uranium as fuel, resulting in less long-lived waste and a lower weapons proliferation risk, and travelling wave reactors that burn their waste as they operate.
Would any of these, or gas-cooled graphite-moderated designs, have proven superior to light water designs - even the latest ones that proponents claim have learned the lessons of the past and now incorporate more passive safety systems?
It is not certain - the British AGRs had their problems, and fast breeders have also seen a number of incidents.
Even more provocatively, would any of these lineages have led to reactors that could have survived the Fukushima deluge, averting the need for many thousands of people to leave their homes and for the government to shell out $100bn or so in compensation?
John Idris Jones, a physicist who has worked at the Magnox station at Wylfa in Wales for more than 30 years, believes a gas-graphite reactor would have survived.
"A plant like that, a plant like Wylfa, then Wylfa would be able to keep on cooling itself, yes," he says.
Probably we shall never know whether adopting light water has proved to be nuclear power's VHS moment.
Despite continuing research interests in more exotic designs, virtually all commercial reactors being planned and built around the world are PWRs and BWRs.
"With technologies that do the same thing, very often one of them comes to dominate," says Robin Cowan.
"It's very hard to undo that; we have so much more experience with light water that it would be hard to convince yourself to go back to the beginning and start developing heavy water or gas-graphite.
"So if we look today, probably light water looks pretty good; but had they made a different decision in 1955, what would the world be like?
"That's a much harder thing to document."
But it is, perhaps, a question governments should be contemplating as they consider whether to embark on reactor building programmes that would entrench current designs for decades to come.
You can hear more about the history of nuclear power in Atomic States, broadcast this week on the BBC World Service. Atomic States is a Freewheel Production.