Gold and silver have been used as currencies for thousands of years. The Egyptians began producing their gold-bearing shekels around 1500 B.C., and silver coins started appearing in the same areas roughly 700 years later. In other words, gold and silver have been part of the human experience for most of recorded history.
Though both gold and silver have fascinating elemental properties, the truth of the matter is that gold and silver have retained their popularity throughout human history because they are shiny and brilliant. There are reliable records that gold and silver were used as media for jewelry even before the Egyptians began their production of the shekel.
Needless to say, by the time the Founding Fathers secured their independence from England in 1783, using gold and silver as currency was a well-established practice across the world. So, it was no surprise that, only three years after the US Constitution took effect, Congress passed the Coinage Act of 1792.
The country’s first national government, the Continental Congress, had issued its own paper currency (called “Continentals”), but failed to tie the currency to any standard – gold, silver, or otherwise. Thus, the bills quickly became worthless and created an early crisis for the Founding Fathers. Their solution was the Coinage Act.
The Act created the United States Mint and ordered it to begin producing coins of various denominations in gold, silver, and copper. The Coinage Act also certified that the new country’s currency would be the dollar.
Finally, the Coinage Act was the first law to put in writing an exchange rate for the new US dollar. Specifically, the Act declared that 15 pounds of pure silver were equivalent to 1 pound of pure gold under US law. It also defined each dollar as 24.75 grains of fine gold (about 0.052 troy ounces) or 371.25 grains of fine silver (about 0.773 troy ounces) – figures that embody that same 15:1 ratio.
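The Act’s internal consistency is easy to verify with a few lines of arithmetic. This is just a sketch in Python; the grain figures come from the Act as quoted above, and the 480-grains-per-troy-ounce conversion is the standard one.

```python
# Definitions from the Coinage Act of 1792 (grains of fine metal per dollar).
gold_grains_per_dollar = 24.75
silver_grains_per_dollar = 371.25

# Standard conversion: 1 troy ounce = 480 grains.
GRAINS_PER_TROY_OZ = 480

# The statutory 15:1 silver-to-gold ratio falls directly out of the
# two dollar definitions.
ratio = silver_grains_per_dollar / gold_grains_per_dollar
print(ratio)  # 15.0

# Per-dollar metal content in troy ounces.
print(f"{gold_grains_per_dollar / GRAINS_PER_TROY_OZ:.4f}")    # 0.0516
print(f"{silver_grains_per_dollar / GRAINS_PER_TROY_OZ:.4f}")  # 0.7734
```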
Establishing a tie between new currency and precious metals is one thing. Finding an abundant source of said metals (so as to bolster the US economy) is another.
In 1799, a young man named Conrad Reed discovered a 17-pound yellow rock in Little Meadow Creek, a small body of water on his father John’s land. Conrad brought the rock home to serve as a doorstop for the family home until 1802, when it was sold for the princely sum of $3.50 to a Fayetteville merchant. The merchant later resold it for $3,500 when it was identified as gold.
After John Reed himself found a 28-pound gold nugget, the first gold rush in the nation began. Between 1804 and 1828, North Carolina was the sole provider of gold to the US Mint.
The United States was in the business of expanding its lands dramatically in the 19th century. Most settlers intended to farm or raise livestock on the land.
However, in 1848, gold was discovered at Sutter’s Mill in California. Hundreds of thousands of gold-frenzied Americans beat a path to the West Coast in search of the yellow metal.
Eleven years later, the area immediately east of California experienced a similar influx of people. The discovery of vast quantities of silver in Nevada in 1859 – now more commonly known as the Comstock Lode – provoked a 15-year period of intense mining for the white and flashy metal in and around Mount Davidson.
A third rush began when prospectors discovered gold in the Klondike region of Canada’s Yukon in 1896. Stampeders poured through Alaska on their way to the goldfields, and subsequent strikes – notably at Nome in 1899 – spread the rush to Alaska itself, where gold mining arguably continues to this day.
It’s important to understand the relationship that the public had with the US Mint during this period. The Mint’s primary duty – or one of them – during the early years was to accept deposits for raw gold and silver from citizens, strike the metals into coins, and return the newly-minted coins back to the depositors.
It’s also important to understand the ongoing effect that the Coinage Act of 1792 continued to have on the US monetary system as time went by. The law had created what was, in essence, a bimetallic standard. Both gold and silver were accepted mediums for legal tender. A dollar in gold was worth a dollar in silver.
Thus, the system soon found itself increasingly subject to Gresham’s Law, a 16th-century economic principle which states that when two forms of money circulate interchangeably at a fixed legal ratio, the overvalued money drives the undervalued money out of circulation. This happens because holders of the undervalued metal – in this case, gold – wisely conclude that it is better to hoard it and use silver for transactions whenever possible.
Lawmakers in Washington did not want to see gold pushed out as a standard of trade in the US, in part because it would create problems in commerce with other gold-standard nations. In response, they spent much of the early 1870s drafting a new coinage act. The new law, the Coinage Act of 1873, moved the country toward the gold standard and away from silver by ending the free coinage of silver dollars.
Though the inspiration behind the Coinage Act of 1873 made some logical sense, its abrupt implementation and lack of demographic understanding led to massive pushback from the public. In essence, silver had become the currency of the people over the past few decades. Most farmers, miners, and residents of the western United States used the white metal as their medium of exchange and did not often deal in gold.
So, when the bill passed in 1873, these individuals suddenly found that they could not exchange their bullion for coins. The new law, dubbed the “Crime of 1873” by its detractors, had effectively demonetized silver in the United States. This demonetization, coupled with a similar move in Germany and several other factors, precipitated the worldwide economic crisis known as the Panic of 1873.
The Panic allowed the rise of powerful opposition to the move away from the silver standard. The Free Silver Movement, which consisted of those same miners, mine owners, and farmers, found an audience within Congress and secured some initial legislative successes. Notably, the Bland-Allison Act of 1878 restored the silver dollar as legal tender and obliged the Treasury to purchase and coin between $2 million and $4 million worth of silver each month.
The Sherman Silver Purchase Act of 1890 further strengthened the government’s commitment to the purchase of silver, compelling the Treasury to buy a whopping 4.5 million ounces of silver every month. It did not, however, restore the free and unlimited coinage of silver – the thing Free Silver advocates wanted above all else.
As is often the case with important issues, the question of silver versus gold became more politicized with each passing year. Conservatives blamed the Sherman Act for the Panic of 1893 and moved quickly to repeal it in order to halt the drain on the Treasury’s gold reserves.
Undeterred, the Democrats took up the mantle of the common man, embracing the silver cause that had, until then, belonged largely to the Populists. They staked their platform in the 1896 presidential election on the free coinage of silver.
Now, the president at the time, Grover Cleveland, was a Democrat himself, but was not a supporter of silver. In fact, he had been one of the primary drivers behind the repeal of the Sherman Act in 1893. He begged the Democrats to abandon their course, but his pleas fell on deaf ears.
Instead, the Democrats chose to nominate William Jennings Bryan as their candidate. Bryan was a staunch advocate for silver, and his advocacy culminated in his Cross of Gold speech, where he rhetorically expanded the debate about the two metals into a referendum on democracy itself.
Bryan essentially became a one-issue candidate after the speech, and though he gained a tremendous number of votes, he failed in his quest for the White House. Instead, Republican William McKinley became the new President of the United States.
McKinley’s presidency bore witness to a dramatic increase in the gold supply in the country, thanks to new discoveries and an improvement in refining techniques. Buoyed by the strength of the new largesse and the overall repudiation of the silver standard, McKinley signed the Gold Standard Act in 1900. The country was now on the gold standard.
The gold standard worked without much difficulty for the first three decades of the 1900s. Aside from a brief suspension during World War I (a step many other countries also took), dollars were, as required by law, backed by gold. The Federal Reserve Act of 1913, which created the Federal Reserve System and reshaped monetary policy, required Federal Reserve notes to be backed by a gold reserve of at least 40%.
The first signs that the gold standard might be in jeopardy occurred in response to the onset of the Great Depression. For reasons more complex than this page can cover, consumption and investment dried up quickly and, as confidence fell, more and more people began hoarding money and holding onto their funds as best they could.
President Franklin D. Roosevelt took office in March 1933, in the depths of the Depression, and immediately set out to reverse the lack of spending, confidence, and overall malaise that gripped the economy. One problem, as he saw it, was the gold hoarded by the public. The Federal Reserve, which kept a supply of gold as backing for its issued notes, had reached the limit of its ability to increase the money supply.
Increasing the money supply was, in FDR’s mind, one key to escaping the Great Depression as it would allow him to build out his massive government spending programs and – hopefully – get people back to work. So, he issued Executive Order 6102 in April 1933.
This EO mandated a dramatic and sweeping exchange of all citizens’ gold coins, bullion, and gold certificates with the Federal Reserve. Americans had less than a month to surrender their gold and receive $20.67 per troy ounce in return, or face fines of up to $10,000 and up to ten years in prison.
Roosevelt further strengthened his grip on the nation’s gold reserves in January 1934. The Gold Reserve Act mandated that the Federal Reserve transfer all of its gold, and title to it, to the US Treasury. Furthermore, Roosevelt raised the official price of gold from $20.67 to $35 per troy ounce, devaluing the dollar in order to spur inflation and help fund his programs.
This $35 per troy ounce valuation remained the exchange rate in the US until 1971. Of course, most citizens could no longer own much gold themselves, so it was largely a monetary policy issue.
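The scale of that devaluation is worth making concrete. A quick sketch, using only the two official prices quoted above:

```python
# Official gold prices before and after the Gold Reserve Act of 1934.
old_price = 20.67  # dollars per troy ounce before January 1934
new_price = 35.00  # dollars per troy ounce afterward (until 1971)

# Gold's dollar price rose by roughly 69%...
price_increase = new_price / old_price - 1
print(f"{price_increase:.1%}")  # 69.3%

# ...which means each dollar was worth roughly 41% less in gold terms.
dollar_devaluation = 1 - old_price / new_price
print(f"{dollar_devaluation:.1%}")  # 40.9%
```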
FDR also committed to a similar course of action with Executive Order 6814. The August 1934 order required citizens to surrender their silver holdings to the United States Mint. Once again, the penalties for failing to do so were quite onerous.
Whether FDR’s policies were successful or legal remains a matter for debate. What is clear is that the notion of owning gold changed fundamentally during the turbulent 1930s in the United States.
However, though FDR’s executive orders had widespread effects on the American monetary system and, frankly, on the public’s relationship with gold, they also set the stage for a stabilizing monetary policy. As countries emerged from the profligate spending of World War II, they sought to create a sort of baseline valuation for all currencies.
So, in July 1944, delegates from 44 Allied nations – including the US, Canada, Great Britain, and much of western Europe – met at Bretton Woods, New Hampshire, to create a new worldwide system of valuing money. The Bretton Woods Agreement required its signatories to peg their currencies’ exchange rates to within 1% of the US dollar. In turn, foreign governments could exchange their dollars for gold bullion at FDR’s aforementioned $35 per troy ounce rate.
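That ±1% band amounts to a simple rule. Here is a minimal sketch of the check; the helper function and the exchange-rate figures are hypothetical illustrations, not historical pegs.

```python
def within_band(par_value: float, observed_rate: float, band: float = 0.01) -> bool:
    """Return True if an observed exchange rate stays within the
    Bretton Woods-style +/-1% band around its declared par value."""
    return abs(observed_rate - par_value) / par_value <= band

# Hypothetical currency pegged at 4.20 units per dollar:
print(within_band(4.20, 4.23))  # True  (0.71% deviation)
print(within_band(4.20, 4.30))  # False (2.38% deviation)
```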
The Bretton Woods system created the International Monetary Fund as an agency to provide reserves and loans in order to stabilize currencies further. It also created the International Bank for Reconstruction and Development, an agency which issues loans to developing countries and which is now part of the World Bank.
The problem with all of these moves is that they are heavily inflationary. For leaders who believe that increased government spending leads to a stronger economy, it is advantageous to allow inflation to proceed at reasonable levels. However, for those who wish to reduce spending and price levels, such policies are anathema. It was only a matter of time before a US leader came along with a deflationary mindset.
As it turned out, that man was Richard Nixon. The early 1970s bore witness to relatively high levels of both inflation and unemployment in the US. Nixon, who viewed the weakening dollar as one of the major causes of the country’s problems, announced in August 1971 that he was suspending foreign governments’ ability to exchange dollars for gold. He also imposed temporary wage and price freezes and instituted a 10% surcharge on imports to encourage Americans to buy domestic products.
His actions essentially killed the Bretton Woods system. Although they did not do so explicitly, closing the gold window to foreign governments made the system unworkable. The US now had a purely fiat currency, untethered to gold, and entered the era of floating exchange rates.
A legislative change within the Ford administration provided the last adjustment to the American relationship with gold. American citizens regained the right to own gold in December 1974.
Shortly thereafter, the US Treasury held two auctions and unloaded massive quantities of gold to the public. Nearly 1.25 million troy ounces of metal flooded into the open market in 1975 as a result of the January and June sales.
At almost the same time, gold futures began trading on New York’s Commodity Exchange (COMEX) and the Chicago Mercantile Exchange’s International Monetary Market. Thus, gold quickly began to settle into its new role as two things – a variable store of value, and a bit of a novelty.
50 years after the Nixon Shock, gold remains in this same role. Those who buy gold see it as a hedge against inflation and, in some ways, a tangible way to hold value in one’s hands.
Gold is also much more valuable than it used to be. Once the US moved to a fiat currency, there was no limit to how far the dollar’s value could drift from that of a troy ounce of gold.