
Could The U.S. Government's Move To The Commercial Cloud Stop Leaks And Breaches?


It seems not a week goes by these days without another leak or breach of sensitive government data, from a contractor setting the wrong access permissions on a storage directory to a data center compromise to a rogue employee walking out the door with an agency's crown jewels. At the same time, concerns are being raised about the government’s increasing centralization of computing services into the commercial cloud, with the US military most recently following the intelligence community to Amazon’s classified cloud. Instead of creating new risks, perhaps the government’s move into the commercial cloud could finally help it secure its sprawling digital empire and stop the growing flood of leaks and breaches.

In today’s cybersecurity landscape it is an unfortunate fact of life that no matter how hard an organization works to secure its most sensitive data, some of it will inevitably walk out the door. The US Government faces an especially difficult task securing its digital secrets against the resources of the world’s most sophisticated cyber adversaries. Complicating this task is the sheer magnitude of the government’s computational footprint and its inability to hire and retain top cyber expertise spanning the myriad agencies that make up the US Government. Add to this the almost uncountable number of companies and individuals that perform contracting on behalf of the government, and the task of protecting the government’s data becomes nearly impossible.

In the pre-digital era leaking a large number of records was a Herculean feat. Today it’s simply a matter of hitting “print,” popping in a writable CD, plugging in a USB drive or just copying the data to a remote server somewhere on the web. Moreover, as the government increasingly relies on contractors and opens its data repositories to an ever broader group of users, the pool of potential bad actors is rapidly expanding.

Coupled with this ever-expanding attack surface is a marked shift in how those new users, especially the younger generation, view secrecy and the right of the government to keep sensitive information secret. Instead of taking concerns to your superior, in today’s world, you simply send that information to a reporter or leak it to the web. Everything is open season now and younger employees, in particular, seem to feel it is their right to disclose whatever they wish, regardless of the impact it might have on national security.

Even where employees are trustworthy, all it takes is one obsolete 10-year-old server sitting in a long-forgotten basement networking closet, or an employee desktop with automatic patching turned off, to offer attackers that all-critical foothold into the local network, from which the organization’s entire data archive can be shipped out in an evening.
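The forgotten, unpatched server is exactly the kind of failure a centralized inventory can catch mechanically. As a minimal sketch (the inventory format and field names here are invented for illustration, not any real agency system), a periodic job could flag hosts whose last recorded patch is older than a threshold:

```python
from datetime import date

def stale_hosts(inventory, today, max_age_days=30):
    """Return hostnames whose last recorded patch date is older
    than max_age_days. Field names are illustrative only."""
    return [
        rec["host"]
        for rec in inventory
        if (today - rec["last_patched"]).days > max_age_days
    ]

inventory = [
    {"host": "app-01", "last_patched": date(2018, 5, 1)},
    {"host": "basement-closet-03", "last_patched": date(2008, 2, 14)},
]

# The decade-old box in the networking closet is flagged; the
# recently patched application server is not.
print(stale_hosts(inventory, today=date(2018, 5, 20)))
# → ['basement-closet-03']
```

Run against every machine in a consolidated cloud inventory, a check like this turns the "server nobody remembers" from an invisible liability into an alert.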

Adding to this mix, contractors increasingly store sensitive government data on their own systems, where it can be readily stolen by adversaries that might not be able to penetrate the government’s more heavily secured copy. Large sprawling collaborative teams spread across time zones and continually exchanging code and data, not to mention countless subcontractors and engineers operating under tight deadlines, all contribute to a mindset that favors “ease of access” over “security first.” Specialty engineers and data scientists may not fully understand the security settings of the systems available to them, setting a cloud storage bucket to world readable without understanding the implications of what they’ve just done. Programmers accustomed to a traditionally on-premises security model may post sensitive data under private URLs, not realizing that security through obscurity doesn’t work in a cloud world where storage buckets can be configured to provide a master directory listing of all folders and files.
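The world-readable bucket mistake is also detectable mechanically. Amazon S3 ACLs, for example, represent "anyone on the internet" with the well-known AllUsers group URI. A sketch of a scanner over an ACL document in the shape boto3's get_bucket_acl returns (the sample ACL below is fabricated for illustration) might look like:

```python
# Well-known S3 group URI meaning "anyone on the internet."
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def public_grants(acl):
    """Return the permissions an S3-style ACL grants to everyone."""
    return [
        grant["Permission"]
        for grant in acl.get("Grants", [])
        if grant.get("Grantee", {}).get("URI") == ALL_USERS
    ]

# Fabricated sample in the shape returned by boto3's get_bucket_acl.
acl = {
    "Grants": [
        {"Grantee": {"Type": "Group", "URI": ALL_USERS},
         "Permission": "READ"},
        {"Grantee": {"Type": "CanonicalUser", "ID": "abc123"},
         "Permission": "FULL_CONTROL"},
    ]
}

print(public_grants(acl))  # → ['READ']: the bucket is world readable
```

An engineer who does not understand ACL semantics does not need to; a central scanner applying this check to every bucket catches the mistake before an adversary does.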

Securing and monitoring such a massive, sprawling hybrid computational ecosystem that spans every government agency, an uncountable number of contractors and subcontractors, and millions of users is next to impossible. Centralizing it all into a single hardened cloud, built and run by the top cloud and cyber experts in the world and offering a unified security model, unified policies and a single monitoring infrastructure, could solve a great deal of this security challenge.

Centralizing all government services into a single cloud eliminates the forgotten obsolete server in the closet and permits continual auditing of all systems, including their complete software stack, with centralized alerting of delayed patching or vulnerable configurations. Understaffed agencies that can’t afford a dedicated IT staffer would no longer place their information at risk, as security would be enforced centrally. Security audits, vulnerability scans, fuzzing and all the modern security best practices could be applied government-wide on a continual basis. Critical vulnerabilities identified by government or private security researchers could be checked across the entire government footprint with a single audit. Default security configurations, such as preventing cloud storage buckets from listing their files and requiring authorization from a security officer before making content world readable, would go a long way towards improving the government’s cybersecurity posture.
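Defaults like these can be enforced as policy-as-code: every bucket configuration is checked centrally against the same rules, with no opt-out for understaffed agencies. A minimal sketch, using invented configuration fields rather than any real cloud API:

```python
def audit_bucket(cfg):
    """Return policy violations for one bucket config.
    Fields are invented for illustration, not a real cloud API."""
    violations = []
    # Rule 1: buckets must never expose a directory listing.
    if cfg.get("listing_enabled"):
        violations.append("directory listing enabled")
    # Rule 2: world-readable content requires security-officer sign-off.
    if cfg.get("world_readable") and not cfg.get("security_officer_approved"):
        violations.append("world readable without security-officer approval")
    return violations

buckets = [
    {"name": "payroll", "listing_enabled": True, "world_readable": False},
    {"name": "press-kits", "listing_enabled": False, "world_readable": True,
     "security_officer_approved": True},
]

for b in buckets:
    print(b["name"], audit_bucket(b))
```

The design choice is that the rules live in one place and run against every configuration automatically, which is precisely the advantage a single consolidated cloud has over thousands of independently administered systems.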

The move to the commercial cloud would be especially beneficial for the myriad small government agencies and contractors that store sensitive government data, enabling them to benefit from modern security best practices even when they have no dedicated security staff of their own.

Moreover, as the commercial cloud increasingly pioneers zero-trust authentication, in which every device and every user is treated as untrusted until verified, and as applications increasingly shift back to the thin client model in which data remains on the server, less and less data will leave the secure confines of this hardened cloud. The increasing use of “big data” analysis to understand government records further enshrines the cloud as a walled garden from which data does not emerge.

The eventual centralization of the entire government’s data footprint in one place means that it will eventually be possible to completely audit, in real time, every byte of traffic in and out of every government system housed in this walled garden of a cloud. While such auditing could be performed at a network level today, having the ability to reach back down to the virtual machines underlying all of that traffic makes security diagnostics far more powerful and would close off many of the pathways that leakers take today, while cutting off much of the attack surface of the government’s data footprint. Outsourcing the hosting would even ensure government services continue during shutdowns.

Putting this all together, the government’s growing move to the commercial cloud could dramatically improve its cybersecurity posture, reduce unauthorized leaking and lead to a modern security-first data culture.