| Little Boy | |
|---|---|
| Type | Nuclear weapon |
| Place of origin | United States |
| Production history | |
| Designer | Los Alamos Laboratory |
| Manufacturer | |
| Produced | 1945–1947 |
| No. built | 1 wartime + 5 postwar |
| Specifications | |
| Mass | 9,700 pounds (4,400 kg) |
| Length | 10 feet (3.0 m) |
| Diameter | 28 inches (71 cm) |
| Filling | Highly enriched uranium |
| Filling weight | 64 kg (141 lb) |
| Blast yield | 15 kilotons of TNT (63 TJ) |
Little Boy was a type of atomic bomb created by the United States as part of the Manhattan Project during World War II. The name is also often used to describe the specific bomb (L-11) used in the bombing of the Japanese city of Hiroshima by the Boeing B-29 Superfortress Enola Gay on 6 August 1945, making it the first nuclear weapon used in warfare; its detonation was the second nuclear explosion in history, after the Trinity nuclear test. It exploded with an energy of approximately 15 kilotons of TNT (63 TJ) and had an explosion radius of approximately 1.3 kilometres (0.81 mi), causing widespread death across the city. It was a gun-type fission weapon which used uranium enriched in the isotope uranium-235 to power its explosive reaction.
Little Boy was developed by Lieutenant Commander Francis Birch's group at the Los Alamos Laboratory. It was the successor to a plutonium-fueled gun-type fission design, Thin Man, which was abandoned in 1944 after technical difficulties were discovered. Little Boy used a charge of cordite, a nitrocellulose-based propellant, to fire a hollow cylinder (the "bullet") of highly enriched uranium through an artillery gun barrel into a solid cylinder (the "target") of the same material. The design was highly inefficient: the weapon used on Hiroshima contained 64 kilograms (141 lb) of uranium, but less than a kilogram underwent nuclear fission. Unlike the implosion design developed for the Trinity test and the Fat Man bomb design that was used against Nagasaki, which required sophisticated coordination of shaped explosive charges, the simpler but inefficient gun-type design was considered almost certain to work, and it was never tested prior to its use at Hiroshima.
After the war, numerous components for additional Little Boy bombs were built. By 1950, at least five weapons were completed; all were retired by November 1950.
There are two primary accounts of how the first atomic bombs got their names. Los Alamos Laboratory and Project Alberta physicist Robert Serber stated, many decades after the fact, that he had named the first two atomic bomb designs during World War II based on their shapes: Thin Man and Fat Man. The "Thin Man" was a long, thin device, and its name came from the Dashiell Hammett detective novel and the series of movies about The Thin Man. The "Fat Man" was round and fat, so it was named after Kasper Gutman, a rotund character in Hammett's 1930 novel The Maltese Falcon, played by Sydney Greenstreet in the 1941 film version. Little Boy was named by others as an allusion to Thin Man, since its design was derived from Thin Man's. [1] [2] It was also sometimes referred to as the "Mark I" nuclear bomb design, with "Mark II" referring to the abandoned Thin Man, and "Mark III" to the "Fat Man". [3]
In September 1945, another Project Alberta physicist, Norman F. Ramsey, stated in his brief "History of Project A" that the early bomb ballistic test shapes were referred to as "Thin Man" and "Fat Man" by (unspecified) "Air Force representatives" for "security reasons," so that their communications over telephones sounded "as if they were modifying a plane to carry Roosevelt (the Thin Man) and Churchill (the Fat Man)," as opposed to modifying the B-29s to carry the two atomic bomb shapes as part of Project Silverplate in late 1943. [4] [5]
Another explanation of the names, from a classified United States Air Force history of Project Silverplate from the 1950s, implies a possible reconciliation of the two versions: that the terms "Thin Man" and "Fat Man" were first developed by someone at or from Los Alamos (i.e., Serber), but were consciously adopted by the officers in Silverplate when they were adopting their own codenames for their own project (including "Silverplate"). As Silverplate involved modifying B-29s for a secret purpose, deliberately using codenames that would align with modifying vehicles for Roosevelt and Churchill would serve their needs well. [6]
Because of its perceived simplicity, the gun-type nuclear weapon design was the first approach pursued by the scientists working on bomb design during the Manhattan Project. In 1942, it was not yet known which of the two fissile materials pathways being simultaneously pursued—uranium-235 or plutonium-239—would be successful, or if there were significant differences between the two fuels that would impact the design work. Coordination with British scientists in May 1942 convinced the American scientists, led by J. Robert Oppenheimer, that the atomic bomb would not be difficult to design and that the difficulty would lie only in the production of fuel. Calculations in mid-1942 by theoretical physicists working on the project reinforced the idea that an ordinary artillery gun barrel would be able to impart sufficient velocity to the fissile material projectile. [7]
Several different weapon designs, including autocatalytic assembly, a nascent version of implosion, and alternative gun designs (e.g., using high explosives as a propellant, or creating a "double gun" with two projectiles), were pursued in the early years of the project, while the facilities to manufacture fissile material were being constructed. The belief that the gun design would be an easy engineering task once fuel was available led to a sense of optimism at Los Alamos, although Oppenheimer established a small research group to study implosion as a fallback in early 1943. [8] A full ordnance program for gun-design development was established by March 1943, with expertise provided by E.L. Rose, an experienced gun designer and engineer. Work began on studying the properties of barrels, internal and external ballistics, and tampers of gun weapons. Oppenheimer led aspects of the effort, telling Rose that "at the present time [May 1943] our estimates are so ill founded that I think it better for me to take responsibility for putting them forward." He soon delegated the work to Navy Captain William Sterling Parsons, who, along with Edwin McMillan, Charles Critchfield, and Joseph Hirschfelder, would be responsible for rendering the theory into practice. [9]
Concern that impurities in reactor-bred plutonium would make predetonation more likely meant that much of the gun-design work was focused on the plutonium gun. To achieve high projectile velocities, the plutonium gun had to be 17 feet (5.2 m) long with a narrow diameter (hence its codename, Thin Man), which created considerable difficulties with its ballistics when dropped from aircraft and with fitting it into the bomb bay of a B-29. [10]
In early 1944, Emilio G. Segrè and his P-5 Group at Los Alamos received the first samples of plutonium produced from a nuclear reactor, the X-10 Graphite Reactor at Clinton Engineer Works in Oak Ridge, Tennessee. Analyzing it, they discovered that the presence of the isotope plutonium-240 (Pu-240) raised the rate of spontaneous fission of the plutonium to an unacceptable amount. Previous analyses of plutonium had been made from samples created by cyclotrons and did not have as much of the contaminating isotope. If reactor-bred plutonium was used in a gun-type design, they concluded, it would predetonate, causing the weapon to destroy itself before achieving the conditions for a large-scale explosion. [11]
As a consequence of the discovery of the Pu-240 contamination problem, in July 1944 almost all research at Los Alamos was redirected to the implosion-type plutonium weapon, and the laboratory was entirely reorganized around the implosion problem. Work on the gun-type weapon continued under Parsons's Ordnance (O) Division, for use exclusively with highly enriched uranium as a fuel. All the design, development, and technical work on it at Los Alamos was consolidated under Lieutenant Commander Francis Birch's group. [12]
In contrast to the plutonium implosion-type nuclear weapon and the plutonium gun-type fission weapon, the uranium gun-type weapon was much simpler to design. As a high-velocity gun was no longer required, the overall length of the gun barrel could be dramatically decreased, and this allowed the weapon to fit into a B-29 bomb bay without difficulty. Though not an optimal use of fissile material compared to the implosion design, it was seen as a nearly guaranteed weapon. [2]
The design specifications were completed in February 1945, and contracts were let to build the components. Three different plants were used so that no one would have a copy of the complete design. The gun and breech were made by the Naval Gun Factory in Washington, D.C.; the target case and some other components by the Naval Ordnance Plant in Center Line, Michigan; and the tail fairing and mounting brackets by the Expert Tool and Die Company in Detroit, Michigan. [13] The bomb, except for the uranium payload, was ready at the beginning of May 1945. [14] Manhattan District Engineer Kenneth Nichols expected on 1 May 1945 to have enriched uranium "for one weapon before August 1 and a second one sometime in December", assuming the second weapon would be a gun type; designing an implosion bomb for enriched uranium was considered, and this would increase the production rate. [15] The enriched uranium projectile was completed on 15 June, and the target was completed on 24 July. [16] The target and bomb pre-assemblies (partly assembled bombs without the fissile components) left Hunters Point Naval Shipyard, California, on 16 July aboard the heavy cruiser USS Indianapolis, arriving on 26 July. [17] The target inserts followed by air on 30 July. [16]
Although all of its components had been individually tested, [16] no full test of a gun-type nuclear weapon occurred before the Little Boy was dropped over Hiroshima. The only test explosion of a nuclear weapon concept had been of an implosion-type device employing plutonium as its fissile material, which took place on 16 July 1945 at the Trinity nuclear test. There were several reasons for not testing a Little Boy type of device. Primarily, there was the issue of fissile material availability. K-25 at Clinton Engineer Works was designed to produce around 30 kilograms of enriched uranium per month, and the Little Boy design used over 60 kilograms per bomb, so a test would have meant a considerable delay before the weapon could be used. (By comparison, B Reactor at the Hanford Site was designed to produce around 20 kilograms of plutonium per month, and each Fat Man bomb used around 6 kilograms of material.) [18] Because of the simplicity of the gun-type design, laboratory testing could establish that its parts worked correctly on their own: for example, dummy projectiles could be shot down the gun barrel to make sure they "seated" correctly onto a dummy target. For the implosion-type design, by contrast, it was much more difficult to establish without a full-scale test whether the necessary simultaneity of compression had been achieved. While there was at least one prominent scientist (Ernest O. Lawrence) who advocated for a full-scale test, by early 1945 Little Boy was regarded as nearly a sure thing and was expected to have a higher yield than the first-generation implosion bombs. [19]
Though Little Boy incorporated various safety mechanisms, an accidental detonation of a fully assembled weapon remained quite possible. Should the bomber carrying the device crash, the hollow "bullet" could be driven into the "target" cylinder; tests suggested that gravity alone was unlikely to detonate the bomb, but the impact could easily create a critical mass that would release dangerous amounts of radiation. [20] A crash of the B-29 and subsequent fire could ignite the propellant, causing the weapon to detonate. [21] If immersed in water, the uranium components were subject to a neutron moderator effect, which would not cause an explosion but would release radioactive contamination. For this reason, pilots were advised to crash on land rather than at sea. [20] Ultimately, Parsons opted to keep the propellant out of the Little Boy bomb until after the B-29 had taken off, to avoid the risk of a crash that could destroy or damage the military base from which the weapon was launched. [22]
The Little Boy was 120 inches (300 cm) in length, 28 inches (71 cm) in diameter, and weighed approximately 9,700 pounds (4,400 kg). [23] The design used the gun method to explosively force a hollow sub-critical mass of enriched uranium and a solid target cylinder together into a super-critical mass, initiating a nuclear chain reaction. [24] This was accomplished by shooting one piece of the uranium onto the other by means of four cylindrical silk bags of cordite powder. Cordite was a widely used smokeless propellant consisting of a mixture of 65 percent nitrocellulose, 30 percent nitroglycerine, 3 percent petroleum jelly, and 2 percent carbamite that was extruded into tubular granules. This form gave it a high burning surface area and hence a rapid burning rate, and it could attain pressures of up to 40,000 pounds per square inch (280,000 kPa). Cordite for the wartime Little Boy was sourced from Canada; propellant for post-war Little Boys was obtained from the Picatinny Arsenal. [25] The bomb contained 64 kilograms (141 lb) of enriched uranium. Most was enriched to 89% but some was only 50% uranium-235, for an average enrichment of 80%. [24] Less than a kilogram of the uranium underwent nuclear fission, and of this mass only 0.7 grams (0.025 oz) was transformed into several forms of energy, mostly kinetic energy, but also heat and radiation. [26]
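As a rough consistency check (not a figure from the source), converting that 0.7 grams of mass to energy with mass–energy equivalence reproduces the quoted yield:

```latex
E = mc^2 \approx (7\times10^{-4}\,\mathrm{kg})\,(3.0\times10^{8}\,\mathrm{m/s})^2
  \approx 6.3\times10^{13}\,\mathrm{J} = 63\,\mathrm{TJ}
  \approx \frac{63\,\mathrm{TJ}}{4.184\,\mathrm{TJ/kt}} \approx 15\,\mathrm{kt\ of\ TNT}.
```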
Inside the weapon, the uranium-235 material was divided into two parts, following the gun principle: the "projectile" and the "target". The projectile was a hollow cylinder containing 60% of the total fissile mass (38.5 kilograms [85 lb]). It consisted of a stack of nine uranium rings, each 6.25 inches (159 mm) in diameter with a 4-inch (100 mm) bore in the center, and a total length of 7 inches (180 mm), pressed together into the front end of a thin-walled projectile 16.25 inches (413 mm) long. Filling the remainder of the space behind these rings in the projectile was a tungsten carbide disc with a steel back. At firing, the projectile was pushed 42 inches (1,100 mm) along the 72-inch-long (1,800 mm), 6.5-inch-bore (170 mm) smooth-bore gun barrel. The target "insert" was a 4-inch (100 mm) cylinder, 7 inches (180 mm) in length, with a 1-inch (25 mm) axial hole; it comprised 40% of the total fissile mass (25.6 kilograms or 56 lb). The insert was a stack of six washer-like uranium discs, somewhat thicker than the projectile rings, that were slid over a 1-inch rod. This rod then extended forward through the tungsten carbide plug, impact-absorbing anvil, and nose plug backstop, eventually protruding out of the front of the bomb casing. This entire target assembly was secured at both ends with locknuts. [27] [28]
When the hollow-front projectile reached the target and slid over the target insert, the assembled super-critical mass of uranium would be completely surrounded by a tamper and neutron reflector of tungsten carbide and steel, both materials having a combined mass of 2,300 kilograms (5,100 lb). [29] Neutron initiators inside the assembly were activated by the impact of the projectile into the target. [30]
The fissile material was thus split roughly in half: at one end was a group of rings of highly enriched uranium holding 40% of the supercritical mass, and at the other end was another group of slightly larger rings holding 60%, which was fired onto the smaller group; four polonium-beryllium neutron initiators then started the chain reaction in the assembled supercritical mass. [31] [32]
A hole in the center of the larger piece dispersed the mass and increased the surface area, allowing more fission neutrons to escape and thus preventing a premature chain reaction. [33] But for this larger, hollow piece to have minimal contact with the tungsten carbide tamper, it had to be the projectile, since only the projectile's back end was in contact with the tamper prior to detonation. The rest of the tungsten carbide tamper surrounded the sub-critical target cylinder (called the "insert" by the designers), with an air space between it and the insert. This arrangement packs the maximum amount of fissile material into a gun-assembly design. [33]
For the first fifty years after 1945, every published description and drawing of the Little Boy mechanism assumed that a small, solid projectile was fired into the center of a larger, stationary target. [34] However, critical mass considerations dictated that in Little Boy the larger, hollow piece would be the projectile. A hollow cylinder has a higher critical mass than a solid piece of fissile material, because neutrons generated in the material are more likely to escape without causing further fissions. The larger piece would also avoid the effects of neutron reflection from the tungsten carbide tamper until it was fully joined with the rest of the fuel. Once joined and with its neutrons reflected, the assembled fissile core would comprise more than two critical masses of uranium-235. [35] In 2004, John Coster-Mullen, a truck driver and model maker from Illinois who had studied every photograph and document on the Hiroshima bomb to make an accurate model, corrected the earlier published accounts. [31]
The fuzing system was designed to trigger the bomb at the most destructive altitude, which calculations suggested was 1,900 feet (580 m). It employed a three-stage interlock: a timer prevented detonation for the first seconds after release, a barometric stage closed as the bomb neared the target altitude, and redundant radar altimeters fired the gun at the detonation height. [36]
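The following sketch illustrates the logic of such a three-stage interlock. It is a schematic illustration only; the thresholds and the two-of-four altimeter vote are assumptions for the example, not the weapon's documented settings.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FuzeInputs:
    seconds_since_release: float     # from the clock-started timer stage
    barometric_altitude_ft: float    # from the barometric stage
    radar_altitudes_ft: List[float]  # readings from the redundant radar altimeters

def firing_permitted(inputs: FuzeInputs,
                     timer_delay_s: float = 15.0,         # assumed arming delay
                     baro_arm_altitude_ft: float = 6500,   # assumed arming altitude
                     burst_altitude_ft: float = 1900,      # burst height given in the text
                     altimeter_votes_needed: int = 2) -> bool:
    """All three stages must close, in the order of the fall, before the gun can fire."""
    timer_closed = inputs.seconds_since_release >= timer_delay_s
    baro_closed = inputs.barometric_altitude_ft <= baro_arm_altitude_ft
    votes = sum(1 for alt in inputs.radar_altitudes_ft if alt <= burst_altitude_ft)
    radar_closed = votes >= altimeter_votes_needed
    return timer_closed and baro_closed and radar_closed

# Example: late in the fall, timer and barometric stages closed, two altimeters agree.
print(firing_permitted(FuzeInputs(40.0, 2100, [1880, 1895, 2400, 1950])))  # True
```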
The Little Boy pre-assemblies were designated L-1, L-2, L-3, L-4, L-5, L-6, L-7, and L-11. Of these, L-1, L-2, L-5, and L-6 were expended in test drops. The first drop test was conducted with L-1 on 23 July 1945. It was dropped over the sea near Tinian by the B-29 later known as Big Stink, piloted by Colonel Paul W. Tibbets, the commander of the 509th Composite Group, in order to test the radar altimeter. Two more drop tests over the sea were made on 24 and 25 July, using the L-2 and L-5 units in order to test all components. Tibbets was the pilot for both missions, but this time the bomber used was the one subsequently known as Jabit. L-6 was used as a dress rehearsal on 29 July. The B-29 Next Objective, piloted by Major Charles W. Sweeney, flew to Iwo Jima, where emergency procedures for loading the bomb onto a standby aircraft were practiced. This rehearsal was repeated on 31 July, but this time L-6 was reloaded onto a different B-29, Enola Gay, piloted by Tibbets, and the bomb was test dropped near Tinian. L-11 was the assembly used for the Hiroshima bomb, and was fully assembled with its nuclear fuel by 31 July. [37] [38]
Parsons, the Enola Gay's weaponeer, was concerned about the possibility of an accidental detonation if the plane crashed on takeoff, so he decided not to load the four cordite powder bags into the gun breech until the aircraft was in flight. After takeoff, Parsons and his assistant, Second Lieutenant Morris R. Jeppson, made their way into the bomb bay along the narrow catwalk on the port side. Jeppson held a flashlight while Parsons disconnected the primer wires, removed the breech plug, inserted the powder bags, replaced the breech plug, and reconnected the wires. Before climbing to altitude on approach to the target, Jeppson switched the three safety plugs between the electrical connectors of the internal battery and the firing mechanism from green to red. The bomb was then fully armed. Jeppson monitored the bomb's circuits. [39]
The bomb was dropped at approximately 08:15 (JST) on 6 August 1945. After falling for 44.4 seconds, the time and barometric triggers started the firing mechanism. The detonation happened at an altitude of 1,968 ± 50 feet (600 ± 15 m). It was less powerful than the Fat Man, which was dropped on Nagasaki, but the damage and the number of victims at Hiroshima were much higher, as Hiroshima was on flat terrain, while the hypocenter of Nagasaki lay in a small valley. According to figures published in 1945, 66,000 people were killed as a direct result of the Hiroshima blast, and 69,000 were injured to varying degrees. [40] Later estimates put the deaths as high as 140,000 people. [41] The United States Strategic Bombing Survey estimated that out of 24,158 Imperial Japanese Army soldiers in Hiroshima at the time of the bombing, 6,789 were killed or missing as a result of the bombing. [42]
The exact measurement of the explosive yield of the bomb was problematic since the weapon had never been tested. President Harry S. Truman officially announced that the yield was 20 kilotons of TNT (84 TJ). This was based on Parsons's visual assessment that the blast was greater than what he had seen at the Trinity nuclear test. Since that had been estimated at 18 kilotons of TNT (75 TJ), speechwriters rounded up to 20 kilotons. Further discussion was then suppressed, for fear of lessening the impact of the bomb on the Japanese. Data had been collected by Luis Alvarez, Harold Agnew, and Lawrence H. Johnston on the instrument plane, The Great Artiste, but this was not used to calculate the yield at the time. [43] More rigorous estimates of the bomb yield and conventional bomb equivalent were made when more data was acquired following the end of the war. A 1985 study estimated the bomb's yield was around 15 kilotons of TNT (63 TJ). [44]
After being selected in April 1945, Hiroshima was spared conventional bombing to serve as a pristine target, where the effects of a nuclear bomb on an undamaged city could be observed. [45] While damage could be studied later, the energy yield of the untested Little Boy design could be determined only at the moment of detonation, using instruments dropped by parachute from a plane flying in formation with the one that dropped the bomb. Radio-transmitted data from these instruments indicated a yield of about 15 kilotons. [44]
Comparing this yield to the observed damage produced a rule of thumb called the 5 pounds per square inch (34 kPa) lethal area rule: nearly all of the people inside the area where the shock wave delivers that overpressure or more will be killed. [46] At Hiroshima, that area was 2.2 miles (3.5 km) in diameter. [47]
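For scale, the corresponding lethal circle at Hiroshima works out as follows (a back-of-envelope figure, not one quoted in the sources):

```latex
A = \pi r^2 = \pi\,(1.1\,\mathrm{mi})^2 \approx 3.8\,\mathrm{mi}^2 \approx 9.8\,\mathrm{km}^2
```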
The damage came from three main effects: blast, fire, and radiation. [48]
The blast from a nuclear bomb is the result of X-ray-heated air (the fireball) sending a shock wave or pressure wave in all directions, initially at a velocity greater than the speed of sound, [49] analogous to thunder generated by lightning. Knowledge about urban blast destruction is based largely on studies of Little Boy at Hiroshima. Nagasaki buildings suffered similar damage at similar distances, but the Nagasaki bomb detonated 2.0 miles (3.2 km) from the city center over hilly terrain that was partially bare of buildings. [50]
In Hiroshima, almost everything within 1.0 mile (1.6 km) of the point directly under the explosion was completely destroyed, except for about 50 heavily reinforced, earthquake-resistant concrete buildings, only the shells of which remained standing. Most were completely gutted, with their windows, doors, sashes, and frames ripped out. [51] The perimeter of severe blast damage approximately followed the 5 pounds per square inch (34 kPa) contour at 1.1 miles (1.8 km).
Later test explosions of nuclear weapons with houses and other test structures nearby confirmed the 5 psi overpressure threshold. Ordinary urban buildings experiencing it were crushed, toppled, or gutted by the force of air pressure. The picture at right shows the effects of a nuclear bomb-generated 5 psi pressure wave on a test structure in Nevada in 1953. [52]
A major effect of this kind of structural damage was that it created fuel for fires that were started simultaneously throughout the severe destruction region.
The first effect of the explosion was blinding light, accompanied by radiant heat from the fireball. The Hiroshima fireball was 1,200 feet (370 m) in diameter, with a surface temperature of about 6,000 °C (10,800 °F), roughly the same as the temperature at the surface of the sun. [53] Near ground zero, everything flammable burst into flame. One famous, anonymous Hiroshima victim, sitting on stone steps 850 feet (260 m) from the hypocenter, left a permanent shadow, having absorbed the fireball heat that permanently bleached the surrounding stone. [54] Simultaneous fires were started throughout the blast-damaged area by fireball heat and by overturned stoves and furnaces, electrical shorts, and the like. Twenty minutes after the detonation, these fires had merged into a firestorm, pulling in surface air from all directions to feed an inferno which consumed everything flammable. [55]
The Hiroshima firestorm was roughly 2.0 miles (3.2 km) in diameter, corresponding closely to the severe blast-damage zone. (See the USSBS [56] map, right.) Blast-damaged buildings provided fuel for the fire. Structural lumber and furniture were splintered and scattered about. Debris-choked roads obstructed firefighters. Broken gas pipes fueled the fire, and broken water pipes rendered hydrants useless. [55] At Nagasaki, the fires failed to merge into a single firestorm, and the fire-damaged area was only one-quarter as great as at Hiroshima, due in part to a southwest wind that pushed the fires away from the city. [57]
As the map shows, the Hiroshima firestorm jumped natural firebreaks (river channels), as well as prepared firebreaks. The spread of fire stopped only when it reached the edge of the blast-damaged area, encountering less available fuel. [58] The Manhattan Project report on Hiroshima estimated that 60% of immediate deaths were caused by fire, but with the caveat that "many persons near the center of explosion suffered fatal injuries from more than one of the bomb effects." [59]
Local fallout is dust and ash from a bomb crater, contaminated with radioactive fission products. It falls to earth downwind of the crater and can produce, with radiation alone, a lethal area much larger than that from blast and fire. With an air burst, the fission products rise into the stratosphere, where they dissipate and become part of the global environment. Because Little Boy was an air burst 580 meters (1,900 ft) above the ground, there was no bomb crater and no local radioactive fallout. [60]
However, a burst of intense neutron and gamma radiation came directly from the fission of the uranium. Its lethal radius was approximately 1.3 kilometers (0.8 mi), [61] [62] covering about half of the firestorm area. An estimated 30% of immediate fatalities were people who received lethal doses of this direct radiation, but died in the firestorm before their radiation injuries would have become apparent. Over 6,000 people survived the blast and fire, but died of radiation injuries. [59] Among injured survivors, 30% had radiation injuries [63] from which they recovered, but with a lifelong increase in cancer risk. [64] [65] To date, no radiation-related evidence of heritable diseases has been observed among the survivors' children. [66] [67] [68]
After the surrender of Japan was finalized, Manhattan Project scientists immediately began to survey the city of Hiroshima to better understand the damage, and to communicate with Japanese physicians about radiation effects in particular. The collaboration became the Atomic Bomb Casualty Commission in 1946, a joint U.S.–Japanese project to track radiation injuries among survivors. In 1975, its work was superseded by the Radiation Effects Research Foundation. [69]
In 1962, scientists at Los Alamos created a mockup of Little Boy, as part of an effort known as "Project Ichiban", in order to answer some of the unanswered questions about the exact radiation output of the bomb, which would be useful for setting benchmarks for interpreting the relationship between radiation exposure and later health outcomes. But it failed to clear up all the issues. In 1982, Los Alamos created a replica Little Boy from the original drawings and specifications. This was then tested with enriched uranium, but in a safe configuration that would not cause a nuclear explosion. A hydraulic lift was used to move the projectile, and experiments were run to assess neutron emission. [70]
After hostilities ended, a survey team from the Manhattan Project that included William Penney, Robert Serber, and George T. Reynolds was sent to Hiroshima to evaluate the effects of the blast. From evaluating the effects on objects and structures, Penney concluded that the yield was 12 ± 1 kilotons. [71] Later calculations based on charring pointed to a yield of 13 to 14 kilotons. [72] In 1953, Frederick Reines calculated the yield as 15 kilotons of TNT (63 TJ). [43] Based on the Project Ichiban data, and the pressure-wave data from The Great Artiste, the yield was estimated in the 1960s at 16.6 ± 0.3 kilotons. [73] A review conducted by a scientist at Los Alamos in 1985 concluded, on the basis of existing blast, thermal, and radiological data, and then-current models of weapons effects, that the best estimate of the yield was 15 kilotons of TNT (63 TJ) with an uncertainty of 20% (±3 kt). By comparison, the best value for the Nagasaki bomb was evaluated as 21 kilotons of TNT (88 TJ) with an uncertainty of 10% (±2 kt), the difference in uncertainty owing to having better data on the latter. [44]
To put these numerical differences into context, the acute effects of nuclear detonations, especially the blast and thermal effects, do not scale linearly with yield; the distance at which a given effect occurs scales roughly as the cube root of the yield, that is, as the yield raised to the power of 1⁄3. [74] So the range of the 5 pounds per square inch (34 kPa) overpressure damage expected from a 12-kiloton weapon burst at a height of 1,968 feet (600 m) would be about 0.98 miles (1.58 km), whereas for a 20-kiloton weapon the same range would extend to about 1.12 miles (1.80 km), a difference of only 0.14 miles (0.23 km). The areas affected would be 3.02 square miles (7.8 km2) and 3.91 square miles (10.1 km2), respectively. The practical differences in effects across this range of yields are therefore smaller than one might expect by assuming a linear relationship between yield and damage. [75]
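A minimal sketch of the cube-root scaling relation is below. It uses the 12-kiloton figure above as its reference point; the exact figures in the text come from weapons-effects models that also account for burst height, so this idealized scaling reproduces them only approximately.

```python
import math

def scaled_range_mi(ref_range_mi: float, ref_yield_kt: float, new_yield_kt: float) -> float:
    """Cube-root scaling: the distance to a given overpressure grows as yield**(1/3)."""
    return ref_range_mi * (new_yield_kt / ref_yield_kt) ** (1.0 / 3.0)

# Reference: ~0.98 mi to the 5 psi contour for a 12 kt burst (from the text above).
for yield_kt in (12, 15, 20):
    r = scaled_range_mi(0.98, 12, yield_kt)
    area = math.pi * r ** 2
    print(f"{yield_kt:>2} kt: 5 psi radius ~ {r:.2f} mi, area ~ {area:.2f} sq mi")
```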
Although Little Boy exploded with the energy equivalent of around 15 kilotons of TNT, in 1946 the Strategic Bombing Survey estimated that the same blast and fire effect could have been caused by 2.1 kilotons of conventional bombs distributed evenly over the same target area: "220 B-29s carrying 1.2 kilotons of incendiary bombs, 400 tons of high-explosive bombs, and 500 tons of anti-personnel fragmentation bombs." [76] Since the target was spread across a two-dimensional plane, the vertical component of a single spherical nuclear explosion was largely wasted. A cluster bomb pattern of smaller explosions would have been a more energy-efficient match to the target. [76]
When the war ended, it was not expected that the inefficient Little Boy design would ever again be required, and many plans and diagrams were destroyed. However, by mid-1946 the Hanford Site reactors were suffering badly from the Wigner effect. Faced with the prospect of no more plutonium for new cores and no more polonium for the initiators for the cores that had already been produced, the Director of the Manhattan Project, Major General Leslie R. Groves, ordered that some Little Boys be prepared as an interim measure until a solution could be found. No Little Boy assemblies were available, and no comprehensive set of diagrams of the Little Boy could be found, although there were drawings of the various components, and stocks of spare parts. [77] [78]
At Sandia Base, three Army officers, Captains Albert Bethel, Richard Meyer, and Bobbie Griffin, attempted to re-create the Little Boy. They were supervised by Harlow W. Russ, an expert on Little Boy who had served with Project Alberta on Tinian, and who was now leader of the Z-11 Group of the Los Alamos Laboratory's Z Division at Sandia. Gradually, they managed to locate the correct drawings and parts, and figured out how they went together. Eventually, they built six Little Boy assemblies. Although the casings, barrels, and components were tested, no enriched uranium was supplied for the bombs. By early 1947, the problem caused by the Wigner effect was on its way to solution, and the three officers were reassigned. [77] [78]
The Navy Bureau of Ordnance began producing 25 "revised" Little Boy mechanical assemblies in 1947, for use by the nuclear-capable, carrier-based Lockheed P2V Neptune aircraft (which could be launched from, but not land on, the Midway-class aircraft carriers). Components were produced by the Naval Ordnance Plants in Pocatello, Idaho, and Louisville, Kentucky. Enough fissionable material was available by 1948 to build ten projectiles and targets, although there were only enough initiators for six. However, no fissionable components had actually been produced by the end of 1948, and only two outer casings were available. [79] By the end of 1950, only five complete Little Boy assemblies had been built. All were retired by November 1950. [80]
The Smithsonian Institution displayed a Little Boy (complete, except for enriched uranium), until 1986. The Department of Energy took the weapon from the museum to remove its inner components, so the bomb could not be stolen and detonated with fissile material. The government returned the emptied casing to the Smithsonian in 1993. Three other disarmed bombs are on display in the United States; another is at the Imperial War Museum in London. [34]
"Fat Man" was the type of nuclear weapon the United States detonated over the Japanese city of Nagasaki on 9 August 1945. It was the second and largest of the only two nuclear weapons ever used in warfare, the first being Little Boy, and its detonation marked the third nuclear explosion in history. The first one was built by scientists and engineers at Los Alamos Laboratory using plutonium manufactured at the Hanford Site and was dropped from the Boeing B-29 Superfortress Bockscar piloted by Major Charles Sweeney.
The Manhattan Project was a research and development program undertaken during World War II to produce the first nuclear weapons. It was led by the United States in collaboration with the United Kingdom and Canada. From 1942 to 1946, the project was directed by Major General Leslie Groves of the U.S. Army Corps of Engineers. Nuclear physicist J. Robert Oppenheimer was the director of the Los Alamos Laboratory that designed the bombs. The Army program was designated the Manhattan District, as its first headquarters were in Manhattan; the name gradually superseded the official codename, Development of Substitute Materials, for the entire project. The project absorbed its earlier British counterpart, Tube Alloys, and subsumed the program from the American civilian Office of Scientific Research and Development. The Manhattan Project employed nearly 130,000 people at its peak and cost nearly US$2 billion, over 80 percent of which was for building and operating the plants that produced the fissile material. Research and production took place at more than 30 sites across the US, the UK, and Canada.
Trinity was the code name of the first detonation of a nuclear weapon, conducted by the United States Army at 5:29 a.m. MWT on July 16, 1945, as part of the Manhattan Project. The test was of an implosion-design plutonium bomb, nicknamed "The Gadget", of the same design as the Fat Man bomb later detonated over Nagasaki, Japan, on August 9, 1945. Concerns about whether the complex Fat Man design would work led to a decision to conduct the first nuclear test. The code name "Trinity" was assigned by J. Robert Oppenheimer, the director of the Los Alamos Laboratory, possibly inspired by the poetry of John Donne.
Nuclear technology is technology that involves the nuclear reactions of atomic nuclei. Among the notable nuclear technologies are nuclear reactors, nuclear medicine and nuclear weapons. It is also used, among other things, in smoke detectors and gun sights.
Nuclear weapon designs are the physical, chemical, and engineering arrangements that cause the physics package of a nuclear weapon to detonate. There are three existing basic design types: pure fission weapons, boosted fission weapons, and staged thermonuclear weapons.
In nuclear engineering, a critical mass is the smallest amount of fissile material needed for a sustained nuclear chain reaction. The critical mass of a fissionable material depends upon its nuclear properties, density, shape, enrichment, purity, temperature, and surroundings. The concept is important in nuclear weapon design.
Building on major scientific breakthroughs made during the 1930s, the United Kingdom began the world's first nuclear weapons research project, codenamed Tube Alloys, in 1941, during World War II. The United States, in collaboration with the United Kingdom, initiated the Manhattan Project the following year to build a weapon using nuclear fission. The project also involved Canada. In August 1945, the atomic bombings of Hiroshima and Nagasaki were conducted by the United States, with British consent, against Japan at the close of that war, standing to date as the only use of nuclear weapons in hostilities.
Operation Sandstone was a series of nuclear weapon tests in 1948. It was the third series of American tests, following Trinity in 1945 and Crossroads in 1946, and preceding Ranger. Like the Crossroads tests, the Sandstone tests were carried out at the Pacific Proving Grounds, although at Enewetak Atoll rather than Bikini Atoll. They differed from Crossroads in that they were conducted by the Atomic Energy Commission, with the armed forces having only a supporting role. The purpose of the Sandstone tests was also different: they were primarily tests of new bomb designs rather than of the effects of nuclear weapons. Three tests were carried out in April and May 1948 by Joint Task Force 7, with a work force of 10,366 personnel, of whom 9,890 were military.
The Soviet atomic bomb project was authorized by Joseph Stalin in the Soviet Union to develop nuclear weapons during and after World War II.
A boosted fission weapon usually refers to a type of nuclear bomb that uses a small amount of fusion fuel to increase the rate, and thus yield, of a fission reaction. The neutrons released by the fusion reactions add to the neutrons released due to fission, allowing for more neutron-induced fission reactions to take place. The rate of fission is thereby greatly increased such that much more of the fissile material is able to undergo fission before the core explosively disassembles. The fusion process itself adds only a small amount of energy to the process, perhaps 1%.
The explosive yield of a nuclear weapon is the amount of energy released when it is detonated, in the form of blast, thermal, and nuclear radiation. It is usually expressed as a TNT equivalent (the standardized equivalent mass of trinitrotoluene which, if detonated, would produce the same energy discharge), either in kilotonnes (kt, thousands of tonnes of TNT), in megatonnes (Mt, millions of tonnes of TNT), or sometimes in terajoules (TJ). An explosive yield of one terajoule is equal to 0.239 kilotonnes of TNT. Because the accuracy of any measurement of the energy released by TNT has always been problematic, the conventional definition is that one kilotonne of TNT is held simply to be equivalent to 10¹² calories.
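Tying the quoted conversion factors together (a simple consistency check, not a figure from the source, using 1 cal = 4.184 J):

```latex
1\ \mathrm{kt\ TNT} \equiv 10^{12}\ \mathrm{cal} \times 4.184\ \mathrm{J/cal} = 4.184\times10^{12}\ \mathrm{J},
\qquad
1\ \mathrm{TJ} = \frac{10^{12}\ \mathrm{J}}{4.184\times10^{12}\ \mathrm{J/kt}} \approx 0.239\ \mathrm{kt}.
```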
Gun-type fission weapons are fission-based nuclear weapons whose design assembles their fissile material into a supercritical mass by the use of the "gun" method: shooting one piece of sub-critical material into another. Although this is sometimes pictured as two sub-critical hemispheres driven together to make a supercritical sphere, typically a hollow projectile is shot onto a spike, which fills the hole in its center. Its name is a reference to the fact that it is shooting the material through an artillery barrel as if it were a projectile.
A nuclear explosion is an explosion that occurs as a result of the rapid release of energy from a high-speed nuclear reaction. The driving reaction may be nuclear fission or nuclear fusion or a multi-stage cascading combination of the two, though to date all fusion-based weapons have used a fission device to initiate fusion, and a pure fusion weapon remains a hypothetical device. Nuclear explosions are used in nuclear weapons and nuclear testing.
"Thin Man" was the code name for a proposed plutonium-fueled gun-type nuclear bomb that the United States was developing during the Manhattan Project. Its development was abandoned in 1944 after it was discovered that the spontaneous fission rate of nuclear reactor-bred plutonium was too high for use in a gun-type design due to the high concentration of the isotope plutonium-240.
The Mark 18 nuclear bomb, also known as the SOB or Super Oralloy Bomb, was an American nuclear bomb design which was the highest yield fission bomb produced by the US. The Mark 18 had a design yield of 500 kilotons. Nuclear weapon designer Ted Taylor was the lead designer for the Mark 18.
Plutonium is a chemical element; it has symbol Pu and atomic number 94. It is a silvery-gray actinide metal that tarnishes when exposed to air, and forms a dull coating when oxidized. The element normally exhibits six allotropes and four oxidation states. It reacts with carbon, halogens, nitrogen, silicon, and hydrogen. When exposed to moist air, it forms oxides and hydrides that can expand the sample up to 70% in volume, which in turn flake off as a powder that is pyrophoric. It is radioactive and can accumulate in bones, which makes the handling of plutonium dangerous.
In nuclear weapon design, the pit is the core of an implosion nuclear weapon, consisting of fissile material and any neutron reflector or tamper bonded to it. Some weapons tested during the 1950s used pits made with uranium-235 alone, or as a composite with plutonium. All-plutonium pits are the smallest in diameter and have been the standard since the early 1960s. The pit is named after the hard core found in stonefruit such as peaches and apricots.
The Los Alamos Laboratory, also known as Project Y, was a secret scientific laboratory established by the Manhattan Project and overseen by the University of California during World War II. It was operated in partnership with the United States Army. Its mission was to design and build the first atomic bombs. J. Robert Oppenheimer was its first director, serving from 1943 to December 1945, when he was succeeded by Norris Bradbury. In order to enable scientists to freely discuss their work while preserving security, the laboratory was located on the isolated Pajarito Plateau in Northern New Mexico. The wartime laboratory occupied buildings that had once been part of the Los Alamos Ranch School.
In a nuclear weapon, a tamper is an optional layer of dense material surrounding the fissile material. It is used in nuclear weapon design to reduce the critical mass and to delay the expansion of the reacting material through its inertia, which delays the thermal expansion of the fissioning fuel mass, keeping it supercritical longer. Often the same layer serves both as tamper and as neutron reflector. The weapon disintegrates as the reaction proceeds, and this stops the reaction, so the use of a tamper makes for a longer-lasting, more energetic and more efficient explosion. The yield can be further enhanced using a fissionable tamper.