Data Security Trends for 2017

Digital transformation was the running theme in 2016 as companies focused on innovation. The trade-off for this attention to innovation was a lack of attention to security, and we saw continued breaches at large enterprises. As a result of this non-stop assault on companies' evolving security postures, we see three big security trends for 2017:

  • A focused, proactive approach
  • Software-defined everything
  • Increased cloud acceptance

Proactive Approach to Security

Security professionals are adjusting their security posture along the lines of a safeguard pioneered by professional cycling. When doping scandals ravaged the international pro cycling community, the sport's governing bodies turned to new technology to prevent cheating. The solution, the biological passport, establishes a baseline for a variety of biological variables in each athlete; examining how those variables change over time makes cheating nearly impossible to hide. How are companies using this concept to enhance their security?
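The biological-passport idea maps directly onto baseline-driven anomaly detection: record what "normal" looks like for a monitored variable, then flag readings that drift outside it. The sketch below is a minimal, hypothetical illustration (the metric, sample values, and three-sigma threshold are all assumptions for the example, not a reference implementation):

```python
from statistics import mean, stdev

def build_baseline(samples):
    """Record the normal range of a monitored variable from historical samples."""
    return {"mean": mean(samples), "stdev": stdev(samples)}

def is_anomalous(baseline, value, sigmas=3.0):
    """Flag a reading that drifts outside the established baseline."""
    return abs(value - baseline["mean"]) > sigmas * baseline["stdev"]

# Hypothetical metric: daily outbound traffic (GB) for one server
history = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0]
baseline = build_baseline(history)

print(is_anomalous(baseline, 10.4))   # within the normal range -> False
print(is_anomalous(baseline, 45.0))   # far outside the baseline -> True
```

Just as with the athlete's passport, the power comes from the longitudinal view: a single reading proves little, but a departure from an individual's own established baseline is hard to explain away.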

Application whitelisting is a good example of this proactive approach to security, and VMware's forthcoming Goldilocks project is expected to add whitelisting at the hypervisor level as well. These are good indications that the tide has turned: instead of only trying to keep bad actors out (and failing), we are also defining what a good actor is and rejecting everything else.
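At its core, application whitelisting inverts the blacklist model: a binary runs only if its fingerprint appears on an approved list. A minimal sketch of that check, using SHA-256 digests as the fingerprint (the allowlist contents here are illustrative, not drawn from any real product):

```python
import hashlib

# Hypothetical allowlist: SHA-256 digests of the only binaries permitted to run
ALLOWED_HASHES = {
    "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9",  # sha256 of b"hello world"
}

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def may_execute(binary: bytes) -> bool:
    """Default-deny: only binaries whose digest is on the allowlist may run."""
    return sha256_of(binary) in ALLOWED_HASHES

print(may_execute(b"hello world"))       # on the allowlist -> True
print(may_execute(b"unknown payload"))   # anything unlisted is denied -> False
```

Note the failure mode is the safe one: unknown or tampered code is denied by default, rather than permitted until someone writes a signature for it.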

Software-Defined Everything

Being truly proactive about security means designing the components of your infrastructure to be secure from the ground up. The solutions readily available today let us layer security on top of existing infrastructure, but the underlying problem remains: that infrastructure is inherently insecure. Software-defined anything and everything means we can design each component with very specific traits, and then enforce those traits, prohibiting not only unapproved applications from running but also any communication or behavior that has not been proactively defined. Beyond being a more secure way to design infrastructure, this approach also lets us automate it from the ground up: because defining permissible behavior means defining all the traits a component will have, we can deploy components with minimal human intervention.
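The enforcement model described above is declarative and default-deny: every permissible trait of a component is stated up front, and anything not declared is refused. A hypothetical sketch of such a trait specification and the checks against it (the tier names, ports, and spec format are invented for illustration):

```python
# Hypothetical declarative spec for a web-tier component: every permissible
# trait is stated up front; anything not declared is denied by default.
WEB_TIER_SPEC = {
    "allowed_processes": {"nginx"},
    "allowed_outbound": {("app-tier", 8080), ("dns-resolver", 53)},
}

def allow_process(spec, name):
    """A process may run only if the spec explicitly names it."""
    return name in spec["allowed_processes"]

def allow_connection(spec, dest, port):
    """A connection is permitted only if the (destination, port) pair is declared."""
    return (dest, port) in spec["allowed_outbound"]

print(allow_process(WEB_TIER_SPEC, "nginx"))             # declared -> True
print(allow_process(WEB_TIER_SPEC, "cryptominer"))       # undeclared -> False
print(allow_connection(WEB_TIER_SPEC, "db-tier", 5432))  # undeclared -> False
```

The same spec that enforces behavior can also drive automated deployment: since it already enumerates everything the component is allowed to be and do, tooling can provision the component from the spec with minimal human intervention.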

Beyond automation and security, software-defined everything also eases integration of multiple data centers while maintaining a single security standard. This means that migration to public cloud is no longer as complex or scary as it has been.

Increased Cloud Acceptance

Another tide that has turned is the increasingly cloud-first approach taken by many technology teams. Whether the driver of this evolution is financial or technical is irrelevant; what matters is that cloud is no longer just an option, it's an accepted fact of life. This broader acceptance has also meant that many workloads have moved back from the cloud to legacy on-premises infrastructure because of poor preparation. That poor preparation stems partly from a lack of understanding of how public cloud providers work, but also from a lack of quality tools for moving workloads to the cloud and back when needed. The software-defined everything model will be the true catalyst for broad deployment of cloud infrastructure, and as those tools mature and cloud migration becomes less complex, they will in turn encourage more aggressive conversion of on-premises workloads to off-premises.

If we compare these trends for 2017 with what we saw in 2016, there is little change in general direction. We have the benefit of an additional year of experience and maturity that will enable us to meet technology challenges in ways that are efficient, secure, modular, and portable. If you want to discuss how these issues apply in your environment, the Consulting Team at Edge Solutions is happy to have a more nuanced conversation. Please give us a call at 888-861-8884 or contact us online today!