In a move that is likely to raise the blood pressure of more than a few security professionals by a couple of points, the U.S. federal government has now determined that companies need to take responsibility for the applications they use to handle and store people’s data.
In June of 2015, the U.S. National Institute of Standards and Technology (NIST) released its latest set of guidelines for the handling of Controlled Unclassified Information (CUI), which comprises data like personally identifiable information, banking and health information, and other sensitive data that one would not want falling into the wrong hands.
The December 31, 2017, due date has come and gone, and yet many organizations are still asking questions about what these guidelines mean for them.
The new recommendations, referred to as NIST 800-171, were laid out in a paper titled “Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations,” which outlined NIST’s expectations for contractors that work with the federal government and handle this kind of data.
Derived from Executive Order 13556, signed by former President Barack Obama in 2010, NIST’s stated goal in this effort is to bring the “non-federal entities” (read: contractors) that work with the government into line with some of the same standards that the feds have been working toward in recent years.
New Expectations for Application Security
The NIST guidelines cover a wide range of security aspects like access controls, staff training for implementing better security, and plenty of other important best practices.
To get a better handle on where the U.S. government could be headed here, we spoke with NIST Fellow Ron Ross, one of the authors of the publication, to discuss the goals and thinking behind the new measures.
Ross first points out that the focus here is on confidentiality, one of the pillars of data security. While availability and integrity are certainly important, the primary concern was protecting the massive amounts of data that contractors could be holding onto. He adds that his team reasoned that if confidentiality was being secured, then integrity would be covered by design as well.
The good news for many contractors is that not all data is considered CUI. Data categorized as Low does not qualify and can be scratched off the list for the guidelines. However, all data ranked as Medium or High (Social Security and credit card numbers, for example) will, of course, fall into this grouping.
While the issue of application security was less prominently discussed in the document, there is a clear line between data security and the applications that are used to manage those data.
Section 3.11.2, which describes Risk Assessment, requires that contractors, “Scan for vulnerabilities in the information system and applications periodically and when new vulnerabilities affecting the system are identified.”
Not only do contractors have to perform scans, but they are also expected to remediate the vulnerabilities they find. This essentially shifts the responsibility for security to the contractors, sending the message that the government wants the companies supplying it with services to take ownership of their products.
Ross and his team believe that many application security concerns will be addressed by organizations using standardized operating systems that should, in theory, be kept updated and secured.
From my reading here, however, it is not enough to assume that the products your organization is using are necessarily secure. This is especially true concerning the open source components that make up the vast majority of applications. Research from Gartner has shown that open source components comprise between 60% and 80% of the code in applications. So even if your organization is using solutions such as SAST and DAST to secure the proprietary code that is written in-house, the applications that are working with your data are not necessarily protected.
As we saw in the 2017 Equifax breach, in which attackers exploited a vulnerability in the Apache Struts2 open source component to steal personally identifiable information (PII) belonging to 145 million people, third-party code can have significant negative impacts on a company’s reputation and integrity.
The problem with open source components when assessing risk is that developers do not always know what is in their code. Far too often, they will take code from sources like GitHub to acquire certain functions but then fail to document which code they have committed to their products. It is therefore up to organizations to incorporate technologies such as Software Composition Analysis (SCA), the only category of tools capable of detecting open source components. This way, when the open source community discovers new vulnerabilities and issues alerts calling for a patch, developer teams will know to act in a timely fashion rather than sit as easy targets for opportunistic attackers.
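As a rough illustration of what an SCA tool does under the hood, the sketch below checks a dependency manifest against an advisory list. The manifest format, the `ADVISORIES` data, and the function names here are all hypothetical simplifications; a real SCA product draws on curated vulnerability databases rather than a hard-coded table.

```python
# Minimal sketch of Software Composition Analysis (SCA)-style checking.
# The advisory data below is illustrative only; real tools pull from
# curated vulnerability databases, not a hard-coded dict.

# Hypothetical advisories: component -> list of (vulnerable_version, advisory_id)
ADVISORIES = {
    "struts2": [("2.3.31", "CVE-2017-5638")],  # the Struts2 flaw behind Equifax
    "left-pad": [("1.0.0", "EXAMPLE-0001")],   # made-up entry for illustration
}

def parse_manifest(text):
    """Parse 'name==version' lines into (name, version) pairs."""
    deps = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, _, version = line.partition("==")
        deps.append((name.strip(), version.strip()))
    return deps

def scan(deps):
    """Return (name, version, advisory_id) for each dependency with a known flaw."""
    findings = []
    for name, version in deps:
        for vuln_version, advisory in ADVISORIES.get(name, []):
            if version == vuln_version:
                findings.append((name, version, advisory))
    return findings

if __name__ == "__main__":
    manifest = "struts2==2.3.31\nrequests==2.31.0\n"
    for name, version, advisory in scan(parse_manifest(manifest)):
        print(f"{name} {version}: known vulnerability {advisory}")
```

The point of the sketch is the inventory step: the tool only catches the vulnerable component because the manifest records it, which is exactly what goes missing when developers copy code in without documenting it.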
An interesting point to note here is that contractors will essentially be on the honor system when it comes to compliance. Some might take this as an opportunity to avoid taking the regulations seriously, but there is a serious pitfall in the event that an incident occurs.
If a breach occurs and data is compromised, these contractors are legally obligated to disclose it, so a failure to comply compounds the damage: not only has the company lost control of the data, it has also fallen short of the expectations for plugging the holes. These security gaps can be particularly glaring if a company was negligent in putting a solution in place for mitigating the risk from known vulnerabilities. The end result is that a company could not only lose its business with the government but also be charged with fraud, a double knock to the head.
Walking the Line: Giving Significant Flexibility to Keep Acquisitions Flowing
In planning the rollout, Ross says that they wanted to be as fair as possible to contractors and give them ample wiggle room to prepare. Organizations were given a year and a half to get ready for compliance and to work out schedules for how long it will take them to meet the standards in areas where they are not yet in line.
Ross tells us that in rolling out the new standards, contractors will be expected to produce a report of where they are able to meet the standards, and where they are falling behind. From there, they can start a conversation with acquisition officers from the government to give them a schedule of how long it will take them to comply with the remaining requirements.
From the government’s side, Ross says that they did not want to completely shut down the purchasing pipeline, so acquisition officers and contractors were given some wiggle room to find workable arrangements, though the new standards do add some considerations to the process.
He says that a company that is in compliance may be a more attractive option for the team in charge of acquiring the new services, looking better than another contractor that might have a great product but is not on a tenable schedule to come up to standards.
The Ripple Effect: How NIST 800-171 Could Have a Wider-Reaching Impact
While it is still early in the process, this decision to expect more from contractors can be considered a huge step in the right direction and could potentially have far-reaching consequences outside of the direct government sphere.
One of the elements of these regulations is that they apply not only to the primary contractors working with the U.S. Government but to their subcontractors as well. This means that contractors need to worry not only about their own apps being in compliance but about those of their subcontractors, too.
While some contractors may see these regulations as a hassle they must comply with in order to keep their current business flowing, we view this as an opportunity: companies that want to apply for government contracts can get ahead of the competition by coming to market already compliant.
It is also reasonable to assume that the public will consider companies that are not up to the government standard as falling behind and not quite as trustworthy in handling their data.
Conceivably, we might begin to see these data protection standards expand from the federal government to the private sector. As contractors and their subcontractors are held to a higher level of expectation, it stands to reason that they will provide the same level of security to their non-government clients. After all, why maintain a vulnerable version of a product for one client when you are already putting in the effort to secure it for another?
3 Tips for Approaching NIST 800-171 Compliance
If preparing for the new standards to take effect is making you tear your hair out, Ross has a few tips to help you get started and maintain your sanity.
- Try to see if there is a cloud service provider that is FedRAMP-approved. FedRAMP approval allows a cloud provider to make a verified claim that its products or services meet the requirements, and every agency can go into FedRAMP and see which providers are on the approved list. Relying on that evaluation will save you a lot of time and money, since those providers have already been vetted.
- Figure out whether you actually hold any CUI on your servers, or will be receiving it as part of a contract with the federal government or the DoD. Find out whether you meet the criteria to be affected by 800-171 by consulting the National Archives’ CUI Registry.
- If you do hold CUI, determine where it actually resides within your network; it can be spread across dozens of servers or workstations. Try to consolidate it into the smallest area possible. Ross points out that it is easier to protect data kept in one spot than data spread across the entire enterprise.