The idea of credit has been around for a long time. Certainly, the 1930s Popeye comic strip character, Wimpy, understood credit with his catchphrase, “I’ll gladly pay you Tuesday for a hamburger today.” Credit cards and their associated rates have evolved through history into the financial option that we know today.
Start your credit card search now by using the FREE credit card chaser tool to choose a credit card!
Credit card rates have changed quite a lot since the introduction of the credit card. While the date of the first credit card’s introduction is debated, the history of rate changes is well documented. Spurred by government legislation, economic factors, profit margins, and public opinion, credit card rates have played a major role in shaping this country’s financial history.
The First Credit Card
References to credit cards can be traced back to the 1920s, according to the Encyclopedia Britannica, when cards were issued by individual companies, such as hotels, to their own customers. Diners Club issued a card in 1950 for customers to use at various restaurants, but many argue that, since it was, and still is, a charge card, it does not count as the first credit card.
In 1958, Bank of America released the first general-use credit card that mirrors today’s cards: it was issued by a bank and accepted by many different merchants, and it was the first example of revolving credit. Called the BankAmericard, it took on the Visa brand name in 1976. The MasterCharge credit card was created in 1966 to compete with the BankAmericard; it later became MasterCard in much the same way Visa developed.
Credit Card Rate Changes From Historical Events
In 1863, the U.S. Congress passed The National Bank Act, creating the Office of the Comptroller of the Currency (OCC), which was charged with creating a system for U.S. banking and a national currency. Another provision was to grant states the right to continue to create their own banking laws, rather than having all banks regulated by the federal government. This is important in terms of credit card rates because each state has different laws, called usury laws, pertaining to the rates that can be charged.
The first credit cards, however, responded to a demand for life’s pleasures among GIs returning from World War II, according to a report by FRONTLINE on PBS. Soldiers felt that, having risked their lives on the battlefield, they deserved the comforts of life now. This was a definite departure from the Depression-era generation, which was wary of buying on credit.
The credit card industry did not really take off until a 1978 Supreme Court ruling, Marquette National Bank v. First of Omaha, effectively deregulated interest rates. Prior to the ruling, credit card companies and banks had to follow the usury laws set by the state where a customer lived. After the ruling, they could follow the usury laws of the state where the bank or company was located. Banks could now offer high interest rates to those living in states that capped credit card interest rates at a lower level.
Many credit card companies moved to states with lax usury laws, or states changed their laws to attract the jobs created by credit card companies and banks. This effectively ended any real caps on interest rates.
The rise of computer technology in the 1980s and 1990s allowed banks and credit card companies to calculate a person’s credit risk through formulas, according to TIME Magazine. Credit scores were born, quantifying every adult’s financial decisions down to a certain number that could be used by lenders to assess risk and set interest rates.
In 2009, the federal government passed the Credit CARD Act in response to consumers who had had enough of credit card rates, fees, and practices. Credit card companies raised interest rates partly to recoup revenue lost to the Act and partly because of an overall increase in default risk in the unstable economy that followed the 2008 Wall Street meltdown.
Credit Card Interest Rates Highs and Lows
Most credit cards carry a variable interest rate that is tied to the U.S. Prime Rate. The Prime Rate is an agreed-upon benchmark for bank lending, and it in turn tracks the federal funds rate, which is set by the Federal Reserve. Credit card companies and banks set credit card rates based on a customer’s repayment risk, so rates will vary.
According to the Wall Street Journal’s report on the Prime Rate, the lowest rate on record came in the first recorded year, 1947, at 1.75%. Rates stayed under 8% until 1969 and reached 10% and above in the 1970s. The highest Prime Rate, 21.50%, was recorded in late 1980. Rates have steadily fallen over the years to the current 3.25% as of 2012. Since the economic crisis of 2008, the Prime Rate has been held at this low level to help stimulate the U.S. economy.
Fox Business lists the average consumer credit card rate at the end of February 2012 at 16.99%. Credit card companies add a set margin, such as 11.25% or 13.75%, to the Prime Rate, which is what makes the rate variable.
If the Prime Rate goes up, so does the credit card’s rate.
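As a rough illustration of the variable-rate arithmetic described above, the sketch below adds a hypothetical issuer margin to the 3.25% Prime Rate cited in this article. The margin figures are examples only, not any issuer’s actual terms.

```python
# Sketch: a variable credit card APR is the Prime Rate plus a fixed
# margin the issuer sets based on the cardholder's repayment risk.
# The margin here (13.75%) is a hypothetical example.

PRIME_RATE = 3.25  # U.S. Prime Rate in 2012, per the article


def variable_apr(prime_rate: float, margin: float) -> float:
    """APR (in percent) = Prime Rate + issuer margin."""
    return prime_rate + margin


def monthly_interest(balance: float, apr: float) -> float:
    """Approximate one month's interest on a carried balance."""
    return balance * (apr / 100) / 12


apr = variable_apr(PRIME_RATE, 13.75)  # 3.25 + 13.75 = 17.00
print(f"APR: {apr:.2f}%")
print(f"One month's interest on $1,000: ${monthly_interest(1000, apr):.2f}")
```

If the Prime Rate rose by one point, the same margin would push the APR up by one point as well, which is the behavior the paragraph above describes.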
While the average interest rate is around 17%, credit cards also carry penalty interest rates triggered by late payments; the most common penalty rate is 29.99%. The highest credit card interest rate in the U.S., however, was from Premier Bankcard, at a whopping 79.90%, according to Fox Business. The lender withdrew the card in early 2011 because too many customers were defaulting.
Get a credit card that won’t charge you high interest rates by using the FREE credit card finder now!
- Will regulators ever cap credit card rates?
- Are today’s rates the highest credit card rates ever?
- What is the highest credit card interest rate allowed by law?
- Are there laws capping credit card interest rates?
- Where can I find local credit card companies in my area?
- Usury & Interest Guide: Usury Definition, Usury Rates, and Is Your Credit Card Usurious?