The Rise and Fall of Medieval Iron Technology
The iron technology that the early colonists brought with them to America had changed little since its development in the Middle Ages. Its essentials were: 1) smelting iron from its ore in a blast furnace, 2) fueling the furnace with charcoal, 3) using a blast of forced air to intensify the heat, and 4) refining and shaping the raw pig iron at a forge. This technology did not appear suddenly, nor did it, like Athena, spring forth full-blown from the brow of Zeus. Rather, it developed from efforts in the early Middle Ages to revive and improve the iron technology of the Classical Era, which ended with the fall of Rome.
The technology of the Ancients had been a variety of what would later be called the "bloomery" method of making iron. Chunks of iron ore, mixed with charcoal, were heated and hammered in the equivalent of blacksmiths' forges, with hand-operated leather bellows used to intensify the heat. This method of producing iron was slow, yielding only about 50 pounds per batch, and the iron "blooms," containing much carbon and bits of waste called slag, required additional refining.
Advances in metallurgy during the Middle Ages began with efforts to increase the efficiency and the size of bloomeries. By the eighth century the Catalan forge had been perfected in northern Spain. Catalonians explored the advantages of taller furnaces and introduced the nozzle, or tuyere, for a blast of air produced by leather bellows. A five-hour heat of a Catalan forge produced a pasty batch of slag-free iron weighing about 350 pounds.
In the eighth and ninth centuries in Austria, Saxony, and along the Rhine, the height of furnaces (Stückofen) was extended to as much as 16 feet. These furnaces produced malleable iron directly from the ore and turned out up to 700 pounds per heat. Eventually furnaces grew so tall that leather bellows worked by men could not produce a strong enough blast, so waterpower was used to compress air into a stronger one. As these changes accumulated one by one, medieval blast furnace technology emerged. No single date or place is accepted for this development; it is agreed, however, that true blast furnaces were operating at several places, including Lapphyttan, Sweden, between 1150 and 1350, and at sites in modern-day Belgium by 1340 and France by 1409.
Medieval blast furnace technology died in the same manner as the classical bloomery, and for the same reasons: it could not keep pace with the growing need for greater output and more diversity in iron. One reason was the exhaustion of the "endless forests" that supplied charcoal. A rule of thumb held that each full day of operation of a blast furnace consumed all the charcoal that could be made from the wood cut from an acre of hardwood; a furnace in blast for most of the year would thus strip hundreds of acres annually. After a few years of coaling at that rate, the growing distance of the charcoal mounds from the furnaces drove up hauling costs, and many enterprises failed from depletion of their hardwood forests. In England, where the problem was particularly acute, coked bituminous coal was substituted for charcoal as early as 1709.
Not until the 1840s and 1850s was mineral coal successfully used to fuel iron furnaces in America. The first variety tried was anthracite (often called "stone coal" at the time), which, although almost pure carbon, was difficult to ignite. Success came when hot blasts were substituted for cold-air blasts. Anthracite, abundant in eastern Pennsylvania, was about half as costly as charcoal. It was also less frangible, meaning it did not crush as easily as charcoal when loaded into the furnace and so was less apt to clog the blast. This enabled ironworks to construct larger furnaces and realize greater efficiency and profits; charcoal furnaces had already reached the size limit imposed by frangibility. By 1850, historian Paul Paskoff has written, it was foolish to try to make money by building a cold-blast charcoal furnace. Those already owning such furnaces experimented with ways of heating their blasts.
For about a decade new ironworks sprang up in eastern Pennsylvania near the anthracite coal fields, which exist almost nowhere else in the United States. By the close of the Civil War, however, it was found that bituminous coal, located in fields in many parts of the country, was even better than anthracite when coked, and the next generation of furnaces sprang up near the coking fields south of Pittsburgh. Meanwhile the Bessemer converter was sweeping medieval iron and steel technology into history's dustbin.
The rise of rail transportation was probably the most important factor in rendering bar iron from blast furnaces obsolete. Railroads made it possible to move the raw materials for making iron cheaply from rural iron plantations to cities where capital, labor, and markets were abundant. Moreover, railroads themselves were an enormous market for iron and steel products: rails, rolling stock, girders, and bridges were only a few of the items needed. Not only was the demand for different kinds of iron than blacksmiths preferred; the iron also had to be much cheaper and mass-produced.
In 1870 the federal manufacturing census showed that 199 blast furnaces in Pennsylvania, with just under 11,000 workers, produced 1,033,272 tons of pig iron that year: an average of 5,192 tons per furnace, or 94 tons per worker. Two years later, in 1872, Andrew Carnegie, foremost of the iron and steel producers using the new technology, produced 13,361 tons of pig iron at just one of his many furnaces. By 1900 he was offering steel railroad rails for sale at $20 per ton! Not only could medieval technology not match the selling price or the quantity needed, it could not even make the desired product.
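Readers who want to check the census arithmetic can do so with a minimal sketch like the following, which assumes the figures as quoted above and rounds the workforce to a flat 11,000:

```python
# Back-of-the-envelope check of the 1870 census averages quoted above.
# Figures are taken from the article; "just under 11,000 workers" is
# approximated here as exactly 11,000.
pig_iron_tons = 1_033_272
furnaces = 199
workers = 11_000

tons_per_furnace = pig_iron_tons / furnaces  # about 5,192
tons_per_worker = pig_iron_tons / workers    # about 94

print(f"Tons per furnace: {tons_per_furnace:,.0f}")
print(f"Tons per worker:  {tons_per_worker:,.0f}")
```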
Even so, the old blast furnace technology did not die immediately. Blacksmiths preferred working with charcoal-fueled, blast furnace bar iron, and even modern steel works liked to blend it into alloys for special qualities. Old-fashioned ironworks operators persisted: the Curtins of Centre County, Pennsylvania, unwisely built one of the last charcoal furnaces in 1848 and operated it until it burned down in 1921, except for a hiatus from 1889 to 1900. They carved out a small niche by supplying old-fashioned charcoal iron to those who wanted it, pushed their workers to be more efficient, and added a small rolling mill and other modern machines to increase output. From the depression that followed the Civil War onward, however, they struggled to keep going while unsuccessfully offering their properties for sale. There were no buyers until the 1960s, when the Commonwealth of Pennsylvania took over part of the property to preserve as a historic site.