
Imagine a burly doorman at an exclusive party.  When someone claims to be a guest, the doorman checks their invitation and runs it against the names on the list.  If it doesn’t match up, the person won’t make it through the velvet rope.  But what happens if the doorman isn’t doing his job?  His lapse could allow a ringer into the party to scarf up the hors d’oeuvres and steal the valuables. 

It’s not a perfect analogy, of course, but the FTC’s settlements with credit information company Credit Karma and movie ticket site Fandango demonstrate the dangers when companies override the default settings of operating systems designed to authenticate and secure the connections used to transmit sensitive information.

Here’s how things work after a consumer has downloaded an app onto a device.  Think of Secure Sockets Layer (SSL), the industry-standard protocol for establishing encrypted connections, as the doorman.  When the app connects to an online service, the service presents an SSL certificate to vouch for its identity.  Once the app validates the certificate, the service is allowed through the velvet rope and an encrypted connection to the device is established so the consumer can send information.  This one-two punch of validation through an SSL certificate and encryption creates a safer way for people to transmit sensitive data.

But fraudsters have been known to use spoofing techniques to mount what are called man-in-the-middle attacks.  If the app doesn’t check the SSL certificate, an attacker can use an invalid certificate to get their foot in the door and establish a connection to intercept information sent between the app and the online service.  Neither the person using the app nor the online service realizes what’s going on.

Securing the transmission of personal information against threats like man-in-the-middle attacks is so important that the iOS and Android operating systems provide developers with easy-to-use application programming interfaces – APIs – to implement SSL.  By default, these APIs automatically validate SSL certificates and reject the connection if the certificate is invalid.
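
To make that concrete, here’s a minimal sketch in Kotlin (for Android, with a hypothetical endpoint) of what relying on the defaults looks like.  A plain HttpsURLConnection validates the server’s certificate automatically and refuses the connection if validation fails, so the app doesn’t need any certificate-checking code of its own.

    import java.net.URL
    import javax.net.ssl.HttpsURLConnection

    // A minimal sketch; the endpoint is hypothetical and error handling is pared down.
    fun fetchAccountSummary(): String {
        val connection = URL("https://api.example.com/account/summary")
            .openConnection() as HttpsURLConnection
        return try {
            // With the defaults left alone, the platform validates the server's
            // certificate chain and hostname before any data is exchanged.  An
            // invalid certificate makes this throw an SSLHandshakeException.
            connection.inputStream.bufferedReader().use { it.readText() }
        } finally {
            connection.disconnect()
        }
    }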

The developer documentation for both the iOS and Android operating systems uses particularly strong language to warn against disabling those default validation settings.  According to the iOS documentation, failing to validate SSL certificates “eliminates any benefit you might otherwise have gotten from using a secure connection.  The resulting connection is no safer than sending the request via unencrypted HTTP because it provides no protection from spoofing by a fake server.”  The Android documentation doesn’t mince words either:  An app that doesn’t validate SSL certificates “might as well not be encrypting communication, because anyone can attack users at a public Wi-Fi hot spot . . . [and] the attacker can then record passwords and personal data.”
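
For illustration only, here’s a sketch in Kotlin of the kind of override those warnings are aimed at.  Code along these lines, built on the standard javax.net.ssl APIs, silently accepts any certificate, which is exactly what leaves an app open to man-in-the-middle attacks.

    import java.security.cert.X509Certificate
    import javax.net.ssl.HttpsURLConnection
    import javax.net.ssl.SSLContext
    import javax.net.ssl.X509TrustManager

    // DON'T DO THIS.  A "trust everything" TrustManager disables the default
    // certificate validation, so any server (including an attacker's) gets
    // past the velvet rope.
    val trustEverything = object : X509TrustManager {
        override fun checkClientTrusted(chain: Array<X509Certificate>, authType: String) {}
        override fun checkServerTrusted(chain: Array<X509Certificate>, authType: String) {}
        override fun getAcceptedIssuers(): Array<X509Certificate> = emptyArray()
    }

    fun disableCertificateValidation() {
        val context = SSLContext.getInstance("TLS")
        context.init(null, arrayOf(trustEverything), null)
        // Every HttpsURLConnection in the process now skips certificate checks.
        HttpsURLConnection.setDefaultSSLSocketFactory(context.socketFactory)
    }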

According to the FTC, Credit Karma and Fandango ignored those “Don’t go there” warnings.  While developing its iOS app, which lets consumers get their credit scores and monitor other financial data, Credit Karma authorized a service provider to use code that disabled SSL certificate validation for the purpose of testing.  But the FTC says Credit Karma let the app go to market without turning the default settings back on.  So between July 18, 2012, and around January 1, 2013, the company’s iOS app was vulnerable to man-in-the-middle attacks, putting users’ Social Security numbers, dates of birth, and credit report data at risk.

How did Credit Karma find out about the problem?  According to the FTC, not through its own in-house checks and monitoring.  The complaint alleges that a user contacted Credit Karma, leading the company’s engineers to update the app in January 2013 to restore the default settings.

But that’s not the end of the Credit Karma story.  A short time later, FTC staff contacted Credit Karma about the problem.  Only then did the company’s in-house team run a security review on both versions of the app.  Was it a complicated, expensive, time-consuming thing?  No.  According to the FTC, it took just a few hours and cost next to nothing.  And guess what it revealed?  In February 2013 – after Credit Karma had been told about the iOS vulnerability – the company launched the Android version of its app with the exact same problem.  The review also revealed another security glitch:  The iOS app was storing authentication tokens and passcodes on the device in an insecure manner.

The FTC’s lawsuit against Fandango charges the company with similar lapses.  From March 2009 until March 2013, the iOS version of Fandango’s app failed to validate SSL certificates, overriding the system’s security defaults.  According to the FTC, Fandango didn’t test its app before release to make sure it was validating SSL certificates and securely transmitting consumers’ personal data, including credit card numbers, expiration dates, and security codes.  Yes, Fandango commissioned some audits in 2011, a full two years after the app was released.  But even then, it limited the scope to include only threats posed when the attacker had physical access to a consumer’s device.  It didn't test for secure data transmission.  Thus, Fandango missed an opportunity to detect the vulnerability it had introduced by overriding the defaults.

The FTC says Fandango compounded the problem by not having an effective channel for people to report security problems.  According to the complaint, a researcher contacted the company in December 2012 through the only method readily available – a Customer Service web form.  Because the researcher’s message included the term “password,” Fandango’s Customer Service system treated it as a routine password reset request and responded with a canned message.  The system then dismissed the security warning as “resolved.”

When did Fandango finally fix the problem?  According to the complaint, it wasn’t until the company heard from FTC staff.  Only then did Fandango run the simple test that revealed that its app failed to validate SSL certificates.  Fandango also found out that the vulnerability affected a separate movie ticket app it hosted for a third party.  Within three weeks, Fandango issued an update of both iOS apps that restored the default settings, thereby plugging that security hole.

The proposed settlements with Credit Karma and Fandango require the companies to put comprehensive security programs in place to address risks related to the development and management of new and existing products and to protect the security, integrity, and confidentiality of information covered by the order.  Consistent with other settlements, Credit Karma and Fandango will need stem-to-stern security audits from an independent professional every other year for the next 20 years.  Of course, the terms of the agreements apply just to those companies, but savvy businesses will want to read the proposed orders to see what’s required of Credit Karma and Fandango.  You can file a comment about the proposed settlements by April 28, 2014.

What else can companies learn from these cases?

1.  Exercise extreme care when modifying security defaults.  Had the companies left well enough alone, the security defaults of the operating systems would have protected consumers’ personal information from man-in-the-middle attacks.  Of course, we’re not saying it’s always illegal to modify a default setting.  In fact, there are ways you can go above and beyond the default SSL certificate validation by implementing an even stronger authentication method known as “certificate pinning.”  But modifying security defaults is the brain surgery of app development.  Companies need to be darn sure they know what they’re doing.
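
To illustrate that “above and beyond” option, here’s a minimal certificate pinning sketch in Kotlin.  It uses the third-party OkHttp library, and the hostname and pin value are placeholders, so treat it as one common approach rather than the only way to pin.

    import okhttp3.CertificatePinner
    import okhttp3.OkHttpClient
    import okhttp3.Request

    // A sketch of certificate pinning with OkHttp; the hostname and the
    // sha256 pin below are hypothetical placeholders.
    val pinner = CertificatePinner.Builder()
        .add("api.example.com", "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=")
        .build()

    val client = OkHttpClient.Builder()
        .certificatePinner(pinner)
        .build()

    fun fetchWithPinning(): String? {
        val request = Request.Builder()
            .url("https://api.example.com/account/summary")
            .build()
        // If the server's certificate doesn't match the pinned public-key hash,
        // OkHttp refuses the connection instead of sending any data.
        client.newCall(request).execute().use { response ->
            return response.body?.string()
        }
    }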

2.  Test your app thoroughly before releasing it.  Carpenters have an old adage:  “Measure twice, cut once.”  The corollary for app developers:  Take advantage of readily available free or low-cost methods for testing the security of your apps before you put them into consumers’ hands.
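
One low-cost check along those lines, sketched in Kotlin: point the app’s networking code at hosts that deliberately present bad certificates and confirm the connections fail.  The badssl.com addresses below are public test endpoints; treat the snippet as a rough illustration rather than a complete test plan.

    import java.net.URL
    import javax.net.ssl.HttpsURLConnection
    import javax.net.ssl.SSLException

    // A rough pre-release check: hosts that deliberately present bad certificates
    // should be rejected.  The badssl.com hosts are public test endpoints; swap in
    // whatever invalid-certificate test servers you prefer.
    fun certificateValidationIsEnabled(): Boolean {
        val badHosts = listOf(
            "https://self-signed.badssl.com/",
            "https://expired.badssl.com/",
            "https://wrong.host.badssl.com/"
        )
        return badHosts.all { host ->
            try {
                val connection = URL(host).openConnection() as HttpsURLConnection
                connection.inputStream.close()   // forces the TLS handshake and request
                connection.disconnect()
                false  // the bad certificate was accepted: validation has been disabled
            } catch (e: SSLException) {
                true   // the bad certificate was rejected, as the defaults require
            }
        }
    }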

3.  Consider how people will use your apps.  There’s a reason why SSL is so important in the mobile environment and why the iOS and Android developer documentation makes such a big deal about it:  because people often use mobile apps on unsecured public Wi-Fi networks.  Like chess players, developers need to think a few moves ahead.  Before releasing an app, think through how people are likely to use it and secure it with those real-world considerations in mind.

4.  You’re responsible for what others do on your behalf.  According to the complaint, Credit Karma authorized a service provider to disable the SSL certificate validation process during pre-release testing, but didn’t see to it that the security settings were restored after that.  The first concern:  The testing could have been done without turning the defaults off.  But even so, it’s critically important that companies make sure everything is back in apple pie order before consumers get the app.

5.  Keep your ear to the ground.  There’s an active research community out there that shares information about potential security vulnerabilities.  But by responding to a serious warning with a standard “bedbug letter,” Fandango missed the opportunity to fix the problems.  Has a knowledgeable person contacted your company recently about a potential risk?  And is that message languishing unread in an email box?

6.  Consult available resources. The FTC brochure, Mobile App Developers: Start with Security, offers advice for companies about protecting against this type of vulnerability:

To protect users, developers often deploy SSL/TLS in the form of HTTPS. Consider using HTTPS or another industry-standard method. There’s no need to reinvent the wheel.  If you use HTTPS, use a digital certificate and ensure your app checks it properly. A no-frills digital certificate from a reputable vendor is inexpensive and helps your customers ensure they’re communicating with your servers, and not someone else’s.  But standards change, so keep an eye on current technologies, and make sure you’re using the latest and greatest security features.

Bookmark the FTC’s Privacy & Security page and consult other public sources for free information about developing safer apps.

 
