captaincharisma:
If they're such a security risk, why has the rest of the first world been using these chipped cards for decades now? It's funny how the US is considered the most powerful country in the world (or was), and yet it's so far behind in technology.
The U.S. is last because the U.S. was first to get magnetic swipe card readers. The huge installed base of magnetic readers at U.S. merchants created a lot of inertia. Merchants were reluctant to pay for newer readers when only a few people had cards that could use them, and banks were reluctant to issue new cards when merchants wouldn't buy the new readers.
Same thing happened with cell phone service. The U.S. was first to develop an analog cell phone network. By the time digital cell phones rolled around, the analog network was firmly entrenched in the U.S. Other countries without an analog network (like Korea) jumped straight to digital cell phones. The U.S. was last to make the switch. (This is also why for a long time you had to pay 10-15 cents per text message. Texting started out free in other countries because nobody knew how insanely popular it would become. The U.S. carriers got to see in other countries which features were most popular - texts and ringtones - and charged U.S. customers up the wazoo for things that were free elsewhere.)
Or if you want a non-U.S. example (kinda hard, because the U.S. is first in a lot of technology), Japan was the acknowledged world leader in HDTV. They started HDTV research in the 1960s and were broadcasting analog HDTV by the late 1980s. But their HDTV standard was analog. In the 1990s, digital signal processors became fast enough that you could decode a digital data stream into an HDTV-resolution 60 fps video in real time. The U.S. quickly developed a digital HDTV broadcast standard, and digital standards completely supplanted the Japanese analog one.
They said chip and sign is bad but chip and PIN is good, and chip and fingerprint may be better.
Chip and fingerprint is terrible, worse than chip and sign. When you hand over your card to the waiter, it has your fingerprints on it. Apple has put a lot of marketing into getting people to believe fingerprints are the next big thing in security. They aren't - you leave copies of your fingerprint everywhere, and if your fingerprint is ever compromised you can't change it for a new one. Fingerprints are OK for convenience and casual security, like preventing anyone else from making calls on your phone. But anything protecting your money or private data needs stronger protection.
Chip and PIN is best because it combines something you have (the chip) with something you know (the PIN). As long as you keep the PIN secret, the physical card is useless to a thief. The same isn't true for the other two - someone can lift your fingerprint or get a copy of your signature. And if your PIN is ever revealed, you can just change it.
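To make the two-factor point concrete, here's a minimal sketch (Python, with made-up names - real EMV involves certificates, application cryptograms, and issuer-side checks, not this toy flow) of why the card and the PIN are only useful together:

    import hmac
    import hashlib
    import secrets

    class ChipCard:
        # Toy model: the secret key and the PIN check live on the card
        # and never leave it. (Hypothetical simplification of EMV-style
        # offline PIN verification plus cryptogram generation.)
        def __init__(self, pin: str):
            self._key = secrets.token_bytes(16)  # "something you have"
            self._pin = pin                      # "something you know"
            self._tries_left = 3

        def authorize(self, pin_entered: str, transaction: bytes):
            # Returns a MAC over the transaction only if the PIN is right.
            if self._tries_left == 0:
                return None                      # too many bad PINs: card locks
            if not hmac.compare_digest(pin_entered, self._pin):
                self._tries_left -= 1
                return None
            self._tries_left = 3
            # Only the real chip can compute this; a magstripe clone or a
            # stolen card number has no key to sign with.
            return hmac.new(self._key, transaction, hashlib.sha256).digest()

    card = ChipCard(pin="4921")
    txn = b"merchant=corner-cafe;amount=12.50;nonce=8f3a"
    assert card.authorize("0000", txn) is None      # stolen card alone: useless
    assert card.authorize("4921", txn) is not None  # card + PIN: authorized

Notice that a lifted fingerprint or a copied signature hands an attacker a permanent, reusable second factor, while a PIN checked inside the chip gives them nothing reusable - and the bank can issue you a new one if it leaks.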
I was under the impression banks weren't upgrading cards in the U.S. because of how profitable selling fraud protection is.
The banks pushed chip and sign instead of chip and PIN because the latter is pretty much bulletproof. If any fraud happens with chip and PIN, it's almost certainly the bank's fault. So they end up paying for any fraud. Chip and sign leaves enough wiggle room that they can blame the merchant, and force the merchant to pay for any fraud like they do now. "The signature you collected at the point of sale does not match the cardholder's signature. The fraud is your fault - you pay for it."
That's been the main reason credit card security hasn't improved these past three decades. The banks and credit card companies successfully shifted the cost of credit card fraud onto the merchants. No, those exorbitant interest rates don't pay for fraud - they pay for credit card holders who default on their debt. Fraud is paid for by the merchant, who passes the cost on to you and me in the form of higher prices. Since the banks controlled the credit card system but weren't bearing any of the costs of fraud, they had no incentive to implement better security.