CHANGING LANDSCAPE OF THE COAL MINING INDUSTRY IN THE UNITED STATES
Bituminous coal was the first coal mined on a large scale in the United States. This changed between 1843 and 1868, when anthracite mining expanded: anthracite was used in iron smelting, and the cleaner, smokeless fuel became preferred in cities. Limited anthracite reserves, however, could not meet rising demand, and bituminous coal regained its dominance. Production of sub-bituminous coal later began a slow rise; as of 2010 it exceeded bituminous coal production.
Until the 1950s, coal was mined primarily underground, east of the Mississippi River. By the 1970s, cheaper surface mining had proved a viable alternative for the coal industry. Today the Powder River Basin, the Appalachian Basin, and the Illinois Basin are among the largest coal-producing regions in the US.
In the early 1950s, oil and natural gas became the primary sources of US energy. Today, petroleum, natural gas, and coal together provide 87% of US energy, and coal is used mostly for electricity generation, steel and liquid-fuels production, and cement manufacturing.
US coal production dropped 37% over the last decade, from over 1.2 billion tons to about 800 million tons in 2017, and the number of operating coal mines fell with it. Coal consumption declined because of the growing supply of cheaper and cleaner natural gas, the growth of renewable energy sources, and stricter environmental regulations. Coal provided close to 60% of US electricity in the mid-1980s, but only about 30% in 2017. Of the 1,436 coal-fired power plants operating in the US in 2009, only about 600 remain, as plants are retired due to aging equipment and environmental requirements.
Coal mining is an industry in transition, as new technologies and policies continually reshape the energy landscape. With increased competition from other energy sources and a steady decline in the number of coal-fired power plants, the coal industry will need innovation to move forward successfully.