Outsourced security program failure leads to $100K regulatory fine


Another reminder of the importance of managing third party vendor relationships…

The Commodity Futures Trading Commission fined AMP Global Clearing (an electronic trading firm) $100,000 after a misconfigured network attached storage device exposed 97,000 files containing customer information to an unauthorized third party.

AMP had outsourced parts of its information systems security program to a third party provider who had failed to detect the exposed data during three successive vulnerability audits of AMP’s systems.

Outsourcing can be a really effective tool for augmenting a firm’s infosec program, but business leaders and CSOs need to remember that ultimate responsibility for protecting corporate and customer data remains with them.  And when the firm is a regulated entity, the risks of relying on an outsider to perform critical parts of the infosec program without adequate supervision outweigh the (admittedly attractive) cost savings.

Monitoring third party service provider performance is a hard problem.  Most firms don’t have the resources to perform in-person audits, and most providers can’t accommodate an audit from every customer.  This is why external independent audits of third party providers’ security practices are so important.  These audits need to be performed against generally accepted security standards with objective audit criteria.  ISO 27001 and SSAE 18 SOC 2 are two examples of such audit types.

Even if a business partner gets a clean bill of health from an independent auditor, their performance must be monitored by the line of business that engaged them as well as by the infosec department.  Recently, I have been seeing more and more inquiries from my firm’s customers arriving between their annual due diligence reviews of our services.  Most of these inquiries occur when there is a “celebrity vulnerability” like Spectre/Meltdown – what I am hoping to see in the future are more questions confirming “security 101” procedures and practices.

The advent of security ratings firms like Security Scorecard and Bitsight can also be helpful in this area.  While their security ratings cover specific aspects of a vendor’s security program (practices that can be seen from the Internet), they can provide an ongoing data point to be used to detect potential problems in between those annual security reviews.  I believe that this industry is in its early stages and that the results that they provide must be examined carefully against the specific requirements of your security program.

As companies outsource infrastructure, applications and services to third parties in order to concentrate on their core competencies, the importance of third party vendor management is going to continue to grow.


Leaky buckets and acquisition best practices


There are three interesting things for CSOs to think about in this story on a leak of passport and other personal information on tens of thousands of people:

  1. If you are going to use Infrastructure as a Service providers like Amazon, make sure that the people using them take the time to learn about and use the security features.  Amazon provides the means to store data securely and has a wealth of documentation on security best practices.  Having a breach due to an improperly configured S3 bucket is amateur hour, folks.
  2. When acquiring new companies, especially small ones, security due diligence needs to be job one.  Finding out where sensitive information is stored and how it is protected is a must.
  3. Know your third parties (and those of your acquisitions) – FedEx blamed the breach on an un-named third party.  Remember – you can outsource the function, but you cannot outsource responsibility for security.  When doing an acquisition, look at the list of every vendor that the target company pays and figure out which ones might be holding data.
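On the first point, the failure mode behind most leaky buckets is an ACL grant to S3’s global AllUsers or AuthenticatedUsers groups.  Here is a toy checker for that pattern – the grant structure mirrors the shape that boto3’s `get_bucket_acl` returns, but the sample ACL itself is hypothetical:

```python
# Group URIs that make an S3 bucket readable by the world (AllUsers)
# or by any AWS account holder (AuthenticatedUsers).
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(grants):
    """Return the ACL grants that expose a bucket publicly.

    `grants` follows the shape of boto3's get_bucket_acl()["Grants"]:
    a list of {"Grantee": {...}, "Permission": "..."} dicts.
    """
    return [
        g for g in grants
        if g.get("Grantee", {}).get("Type") == "Group"
        and g["Grantee"].get("URI") in PUBLIC_GROUPS
    ]

# Hypothetical ACL for a misconfigured bucket: the owner grant is fine,
# but the second grant lets anyone on the Internet read the contents.
acl = [
    {"Grantee": {"Type": "CanonicalUser", "ID": "abc123"},
     "Permission": "FULL_CONTROL"},
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "READ"},
]
print(public_grants(acl))  # flags the AllUsers READ grant
```

In real life you would feed this from an inventory sweep of every bucket in every account – including the ones your acquisitions brought with them.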

I have been through the acquisition process a few times in the past ten years – identifying show stopper issues during due diligence is important, but it is vital to keep the process going after the deal is done.  The more you dig into the security of the acquired firm, the more “interesting” security issues you will find.


Malicious data leaks and corporate liability – a tale of two countries


Databreaches.net had a link to a very interesting article about corporate liability for an employee’s malicious leaking of employee information.  What was most striking to me was the contrast between cases in the UK and the US.

In the UK, a disgruntled employee who had been disciplined for bad behavior leaked payroll data for 100,000 employees of a very large supermarket chain to newspapers in order to embarrass the firm.  The courts found that employees have the right to sue the supermarket chain for damages, as it was “vicariously responsible” for the acts of its employee.

In contrast, a similar case in the US against Coca Cola had a very different outcome.  A Coca Cola employee sold laptops that they had been tasked with destroying; the laptops contained personal information of employees.  Employees sued, but the courts dismissed all of their claims, saying that Coca Cola could not have known about the rogue employee’s activities.

These cases hold a few lessons for infosec professionals:

First, if your firm operates in multiple jurisdictions, the laws and norms in these jurisdictions can be very different.  When judging risk and formulating policy, work with your legal departments to make sure you understand these differences.

Second, I feel that this case also shows the differences in attitudes to personal information in the US and the rest of the world.  It seems like the US does not value individual privacy nearly as much as other countries. Again, if you operate in multiple jurisdictions, you need to keep this in mind.

As the stakes get higher for organizations (for example $20M or 4% of global revenues for each breach of the EU GDPR), these are things we need to worry about.  Buy your general counsel a beer and talk it out before you have to deal with a lawsuit.


Two factor authentication on web apps should be the default


tl;dr – If you are using Microsoft Office 365 (or any other hosted email solution) and have not enabled two factor authentication, you are bad and you should feel bad

Microsoft and other cloud vendors really need to make two factor authentication the default for their email and other business critical cloud applications.  You should have to make an active decision to turn off 2FA and be forced to watch a video about companies who were hacked as a result of lack of 2FA in order to make the decision stick.
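For context on what that second factor actually is: the codes that authenticator apps generate are TOTP values (RFC 6238, built on the RFC 4226 HOTP algorithm).  A minimal sketch using only the Python standard library – this is the standard algorithm, not any vendor’s specific implementation:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over an 8-byte big-endian counter."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: the low 4 bits of the last byte pick an offset
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP keyed to the current 30-second time window."""
    return hotp(secret, int(time.time()) // step, digits)

# RFC 4226 test-vector secret; counters 0 and 1 yield 755224 and 287082
print(hotp(b"12345678901234567890", 0))  # 755224
print(hotp(b"12345678901234567890", 1))  # 287082
```

The point of walking through it: a phished password alone is useless against an account protected this way, because the attacker would also need the current 30-second code.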

I spent too much time today dealing with two business partners (one small and the other large) from whom my users received multiple emails containing PDF phishing documents.  These emails were hard for users to recognize as bad –  they came from a real email account of a real person at a real firm that they had done business with.

What had happened was that our partners were using hosted email and had not enabled two factor authentication.  A user at each got phished, and the attacker in each case took control of their email to send the evil documents to all of their contacts.

Fortunately for us, our protections worked – user awareness training and multiple layers of web and email filtering alerted us to the problem and none of our users fell into the trap laid by the attacker.

It could have been much worse.  A more sophisticated attacker could have utilized the identities of the email senders in a more sophisticated way, such as to redirect payments on invoices or to get our users to disclose confidential information.  Or who knows what.

That being said, it still is pretty bad – any information we sent to those email accounts in the past is now in the hands of who knows who.  We are reviewing the traffic to the hacked accounts to determine what could have been exposed.  While it seems that these guys were not after intellectual property, we will never know where that information ends up.

The decision on the part of these two partners not to use 2FA has real costs for my firm – users had to be notified, all emails sent to those partners need to be reviewed for sensitive information, and an incident report has to be written.

For now, I am pulling all of our email logs to determine which of our vendors are using various hosted email platforms and sending them a note inquiring as to whether they use 2FA.   If not, we are going to have some serious talks with them about their security posture.  We’re also going to start monitoring for partners who move from on-prem to hosted email.
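For the log-mining step, one practical approach is to look up the MX records of each vendor domain and match them against the published MX endpoints of the big hosted platforms.  A toy classifier – the suffix-to-provider mapping reflects the commonly documented MX targets, but treat any match as a hint to investigate rather than proof, and the example hostnames are made up:

```python
# Suffixes of the MX hostnames each hosted platform publishes for its
# customers' domains. Assumption: these are current; verify before relying
# on them, since providers change their endpoints over time.
PROVIDER_SUFFIXES = {
    ".mail.protection.outlook.com": "Microsoft 365",
    ".google.com": "Google Workspace",
    ".googlemail.com": "Google Workspace",
}

def classify_mx(mx_host: str) -> str:
    """Map an MX target hostname to a likely hosted-email provider."""
    host = mx_host.rstrip(".").lower()
    for suffix, provider in PROVIDER_SUFFIXES.items():
        if host.endswith(suffix):
            return provider
    return "other/on-prem"

print(classify_mx("example-com.mail.protection.outlook.com"))  # Microsoft 365
print(classify_mx("aspmx.l.google.com"))                       # Google Workspace
print(classify_mx("mx1.example.net"))                          # other/on-prem
```

Feed it the sender domains pulled from your mail logs (the actual DNS lookups would use a resolver library) and you get a starting list of partners to quiz about 2FA.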

This type of attack is happening way too often and opens up companies who never signed up for these hosted services to risk which just should not be there.

Off to look at emails…


Great DerbyCon talk on hunting for the bad guys

Wabbits or bad guys, all the same to me

It sometimes seems to me that a lack of data is not the issue when patrolling your networks for signs of evil badness – quite the opposite: operating systems, security logs and other sources are drowning us in data which we don’t leverage.  This talk from DerbyCon 2015, “Intrusion Hunting for the Masses – A Practical Guide,” really opened my eyes to a number of ways to leverage data that we already have to look for signs of sophisticated intrusions early in the kill chain.  If you manage infosec for your organization or are in the bad guy hunting business, I highly recommend this information- and idea-packed 45 minute talk by Dave Sharpe (@sharpesecurity).  I love stuff like this – you don’t have to make huge investments in new hardware or software to do this kind of analysis and the potential payoffs are pretty big.  Best con talk I have watched in a long time.
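In the spirit of using data you already have, one classic hunt is frequency analysis of process-creation telemetry: parent/child process pairs that appear only once or twice across the fleet are worth a look.  A toy sketch – the event source (e.g. Windows 4688 or Sysmon process-creation logs) and the sample records are illustrative assumptions:

```python
from collections import Counter

def rare_pairs(events, threshold=1):
    """Flag parent->child process pairs seen at most `threshold` times.

    `events` is an iterable of (parent_process, child_process) tuples,
    as might be extracted from process-creation logs across a fleet.
    """
    counts = Counter(events)
    return sorted(pair for pair, n in counts.items() if n <= threshold)

# Hypothetical day of process-creation telemetry
events = [
    ("explorer.exe", "chrome.exe"),
    ("explorer.exe", "chrome.exe"),
    ("explorer.exe", "outlook.exe"),
    ("explorer.exe", "outlook.exe"),
    ("winword.exe", "powershell.exe"),  # Office spawning a shell: rare, suspicious
]
print(rare_pairs(events))  # [('winword.exe', 'powershell.exe')]
```

The rare pair is exactly the kind of early-kill-chain signal the talk is about – a macro-laden document spawning a shell stands out long before anything noisy happens.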



The Practitioner’s Perspective on Cybersecurity – June 2015

On June 16th, 2015, I was privileged to participate in a panel entitled “The Practitioner’s Perspective on Cybersecurity” at the SmartBrief Cybersecurity forum, held at the New York Yacht Club.  At this event, co-sponsored by SIFMA, I and a panel of other financial services security professionals bloviated on the challenges facing us today.

Here is a 15 minute “highlights reel” from the panel…

And here is the full discussion, which ran approximately 45 minutes…

The participants were:

Al Berg, Chief Security and Risk Officer, Liquidnet Holdings Inc.
Robert Cornish, Chief Technology Officer and Chief Information Security Officer, International Securities Exchange (ISE)
Boaz Gelbord, Chief Information Security Officer, Bloomberg LP
George Rettas, Managing Director and Chief of Staff, Global Information Security Department – Information Protection Directorate, Citigroup
Moderator: Sean McMahon, Senior Finance Editor, SmartBrief

More videos from this event can be found here.


What should InfoSec people be doing?

Every once in a while, I like to take a step back and look at just what it is that I, as a Security and Risk professional, am supposed to be doing for the people who seem to be regularly depositing money into my bank account.  Sometimes, getting caught up in the day to day tasks of keeping my company off of page 1 of the Wall Street Journal clouds the bigger picture.  I sat down this weekend and gave this issue some thought and (at the risk of being accused of navel gazing) came up with the following thoughts on what we security people should be doing and why:

  • The purpose of the Information Security/Risk Management function is to protect the organization and its stakeholders while enabling it to achieve its business goals.  Information Security/Risk Management should not be the department that says “No,” it should be the department that says “Here’s how we can move forward – safely.”

  • Understanding the goals of the organization and the processes, procedures and products used to meet those goals is vital to the work of Information Security and Risk Management.  Every organization (and sometimes divisions within the organization) has a different risk appetite, leading to a unique set of policies, procedures and technologies.

  • The foundation of Information Security and Risk Management is the organization’s people and culture.  Technology certainly has a large role to play in building defenses, but a well educated and vigilant management team and work force (the “Human Firewall”) is the keystone of a successful information security program.  Management’s choices as to risk must be informed and the CSRO must provide them with the information needed to make the right decisions.

  • While “advanced persistent threats” and cutting edge attacks get a lot of press attention, most security breaches result from the organization’s failure to implement the boring, basic, but vital “Security 101” measures.

  • Information security as a practice has changed significantly in the past decade.  While once, we built moats and castle walls to keep the bad guys out of our networks, today we face attackers who can “parachute in” to an organization by taking control of an employee’s computer.  Perimeter controls are still necessary, but networks must be able to withstand an attack from within.

  • The Information Security and Risk professional must always be learning – about their organization and their industry as well as about new risks, threat actors and defensive techniques.  Both the business and Security and Risk landscapes change daily and only by keeping pace with these changes can the Security and Risk professional remain relevant.