Why ending the gold standard led to a bigger government and a smaller dollar
As The Hill’s Robert P. Murphy notes, most Americans associate the end of the gold standard and the ensuing erosion of the dollar with Nixon and his 1971 decision to “close the gold window.” In truth, however, the U.S. dollar was weakening long before that, a decline best understood by tracing how the tether between gold and the dollar was gradually lengthened over time.
Between our nation’s founding and the Civil War, there was practically no difference between gold and currency. U.S. coins were minted with face values based on the quantity and price of the gold or silver they contained. This kind of policy minimized the government’s role in monetary affairs. Instead of being stored in vaults, the nation’s precious metals circulated. This arrangement essentially allowed the public to dictate monetary policy based on the natural supply of the metals: individuals presented gold or silver to the U.S. Mint to be struck into coins.
The Civil War saw the first shift away from this approach, when both the Union and the Confederacy issued paper notes not immediately redeemable in gold or silver coin, primarily to pay for the war. Both sides engaged in inflation, an obvious temptation when nothing else constrains the issuance of money.
Between 1879 and 1914, the government did away with silver monetization but restored convertibility between the dollar and gold at roughly $20.67 an ounce. By this time, however, the public had already come to accept paper as money. So long as a $5 bill and a $5 gold coin are fungible (interchangeable, with no loss of value between them), carrying $100 in paper money is simply more convenient than carrying $100 in coin.
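For readers wondering where that figure comes from, a quick back-of-the-envelope sketch helps (this assumes the statutory definition of the gold dollar as 23.22 grains of pure gold, with 480 grains to a troy ounce, details not spelled out in the original piece):

480 grains per troy ounce ÷ 23.22 grains of gold per dollar ≈ $20.67 per ounce.

In other words, the $20.67 price was not a market quote but the arithmetic consequence of how much gold the law said a dollar contained.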