Background

The first crude metal objects, knives for hunting and tools for farming, appeared during the Stone Age. Forged metal (heated, then hammered into shape) was used to make simple blades and hoes. Historical records indicate that metal casting (melting metal and pouring it into a shaped mold) was being done around 4000 B.C. Forged copper weapons preceded castings and led directly to the discovery of the casting art.

Liquid metal was discovered accidentally during forging: while copper was being heated for shaping, some of it would melt. It followed naturally that the melted metal could be cast into a shape. The casting process grew during the Bronze Age and developed rapidly in the Orient, where the casting art first matured into an industry. Development in Europe progressed with the casting of guns, bells, stoves, and ornamental iron.

With the advent of the Industrial Revolution, the production of machines and engines increased dramatically. Mining became an extremely important industry, supplying the world with the materials required to build the machines that transformed society. Manufacturing the metals that would be used to create strong, durable engine pieces led to the development of industrial complexes and mining towns centered on either the production or manufacture of metal products.

Steel, which is stronger than plain iron, is made by alloying iron with carbon and small amounts of other elements. Modern, large-scale production of steel in the United States generally is considered to have begun in the mid-1800s, when the first commercial batch of steel made by the Bessemer process was produced in Wyandotte, Michigan. This pneumatic process, developed independently by an American named William Kelly and an Englishman named Henry Bessemer, made it possible for the first time to produce steel by the ton instead of by the pound.

As competition from abroad increased, steel production in the United States began to decline in the 1970s. In the early 1980s, steel producers suffered heavy economic losses, and more than 200,000 workers were laid off. Top steel producers lost almost $6 billion. The steel industry began an extensive restructuring: companies closed some plants, rebuilt others, and modernized the rest. In addition to installing new equipment, such as basic oxygen furnaces, they changed many of their processes to be more time- and cost-efficient. These changes helped to make the steel industry stronger but generally did not increase employment, because automation and improved processes allowed for greater production with fewer workers.

Despite a slight recovery during the late 1980s, the steel industry continued to be affected by changes in the economy. The industry was faced with anemic market growth, an expensive labor pool, increased production costs caused in part by new environmental legislation, and stagnant market prices. In addition, foreign competition continued to be a major threat, and steel was slowly being replaced in its largest market, the automotive industry, by substitute materials such as plastics and aluminum.

As a result, the steel industry again restructured itself in the early 1990s. Companies streamlined their operations, looking for ways to cut costs and improve productivity. During this period, minimills came into their own. These smaller mills compete with larger integrated steel mills by producing low-cost steel using smaller electric arc furnaces and high-tech methods.

Producers of other metals such as copper, aluminum, zinc, and lead have faced similar cycles. In order to remain viable, they have had to develop new markets, streamline production, and reduce costs. Since the 1980s, a strong secondary market for recycled (scrap) metals has developed for copper and aluminum in particular.

Today the U.S. steel industry continues to struggle in the face of competition from foreign steel manufacturers despite some efforts by the government to improve the situation.