3 data-driven strategies to secure the atomized network

This article was contributed by Martin Roesch, CEO of Netography.

In October 1969, the first computer-to-computer message was successfully sent from a computer at UCLA to another at the Stanford Research Institute, heralding the birth of ARPANET. The original intent of ARPANET’s design was to establish a decentralized communications network that would remain operational even if part of it were destroyed.

As the internet evolved over the ensuing five decades, we’ve witnessed wholesale shifts in how network and compute models are delivered — from centralized mainframes to the distributed desktop revolution, and then back again to centralized data centers, where security controls and data governance could be effectively consolidated.

Now, we are witnessing this pendulum swing once again — yet this time around, the decentralized computing model is an altogether different beast. These days, applications and data are both scattered and automatically replicated across multiple public cloud environments. They might also live in an on-premises data center or a dedicated private cloud. And now two years into a global pandemic, any notion of a network perimeter has been all but obliterated by the demands of employees to work from anywhere. 

Welcome to the atomized network, a fluid computing environment where applications, data, and even system resources are in a perpetual state of motion. And perhaps no one has benefited more from the emergence of the atomized network than a new generation of opportunistic threat actors who now have at their disposal an expansive and fragmented surface area upon which to wage their attacks. 

The security challenges of the atomized network

Security has always been challenging, but securing the atomized network ups the ante considerably. In 2021, organizations were using an average of 110 cloud-based applications while also supporting hundreds of custom applications that run in the cloud.  

The versatility of being able to mix and match different clouds has been a boon to IT leaders who require more responsive and flexible infrastructure. However, as is the case with so many IT decisions, there are trade-offs to consider. Each cloud environment that a network or application connects to adds complexity to the equation. And of course, the more distributed your network becomes, the harder it is to see everything in it and everything connected to it.

And then of course there’s the most important asset of all: your data. In the atomized network, data now moves seamlessly across these distributed environments via the cloud, as well as to and from a shifting fleet of remote work locations. Each of these environments comes with its own security controls. However, these tools were never designed to work together, nor do they share a common interface to help security leaders truly understand what’s happening within their networks.

All the complexity generated by the atomized network is one of the reasons why it can take months or even years for companies to realize that their network has been compromised in the first place. IBM estimates that it takes an average of 280 days to identify and contain a breach. And with each passing day that an attacker remains undetected, they are afforded the luxury of time to observe, learn and isolate the weak points in their victim’s infrastructure — all of which can mean the difference between a minor incident and a massive breach.

Three data strategies to defend the atomized network

While no one knows what the network of the future will look like, it’s likely to only grow more decentralized as enterprises look to offload more of their applications and workloads into the cloud. So, given these challenges, what steps should security teams take to protect the atomized network? Consider the following:

1. Leverage network metadata as a primary source of threat intelligence 

Conventional network threat detection and response has historically relied on deep packet inspection appliances deployed throughout network environments. The rapid adoption of zero-trust initiatives, which encrypt network traffic by default, is steadily blinding deep packet inspection. As zero trust becomes the norm, the utility and practicality of deep packet inspection will decline dramatically. After all, you can’t inspect traffic that you can’t decrypt, nor is there a feasible place to locate these “middle box” appliances; in the atomized network, there is no middle anymore.

However, that doesn’t mean that the enterprise is unable to analyze encrypted traffic that it sees on the network. As NIST points out, “the enterprise can collect metadata about the encrypted traffic and use that to detect possible malware communicating on the network or an active attacker. Machine learning techniques … can be used to analyze traffic that cannot be decrypted and examined.” It should be noted that any attacker who has successfully penetrated a network must also use that same network to escalate privileges, and, try as they might, they will invariably leave a trace in the form of network metadata. The ability to collect and analyze network metadata in real time will therefore become a critical capability for modern security teams.
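
To make that concrete, here is a minimal sketch of what metadata-based detection can look like, assuming flow records in a NetFlow/IPFIX-like shape. Every field name, threshold, and record below is an illustrative assumption rather than any particular product’s schema; the point is simply that periodic, machine-like callbacks stand out in metadata even when the payload itself is encrypted.

```python
# A minimal sketch of metadata-based detection, assuming flow records
# shaped roughly like NetFlow/IPFIX exports. Field names, thresholds,
# and sample data are illustrative assumptions, not a product schema.
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical flow records: (timestamp, src_ip, dst_ip, dst_port, bytes)
flows = [
    (1000.0, "10.0.0.5", "203.0.113.9", 443, 820),
    (1060.2, "10.0.0.5", "203.0.113.9", 443, 810),
    (1120.1, "10.0.0.5", "203.0.113.9", 443, 815),
    (1180.3, "10.0.0.5", "203.0.113.9", 443, 822),
    (1400.0, "10.0.0.7", "198.51.100.4", 80, 150000),
]

def detect_beaconing(flows, min_events=4, max_jitter=0.1):
    """Flag (src, dst, port) tuples whose flows recur at suspiciously
    regular intervals -- a common trace left by command-and-control
    callbacks, visible even when the traffic is encrypted."""
    by_conversation = defaultdict(list)
    for ts, src, dst, port, _ in flows:
        by_conversation[(src, dst, port)].append(ts)

    suspects = []
    for conv, times in by_conversation.items():
        if len(times) < min_events:
            continue
        times.sort()
        gaps = [b - a for a, b in zip(times, times[1:])]
        # Low relative jitter in the gaps suggests machine-like periodicity.
        if mean(gaps) > 0 and stdev(gaps) / mean(gaps) < max_jitter:
            suspects.append(conv)
    return suspects

print(detect_beaconing(flows))  # [('10.0.0.5', '203.0.113.9', 443)]
```

In practice, these records would stream in from routers, cloud flow logs, and agents rather than a hard-coded list, but the statistical trace an attacker leaves in the metadata is the same.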

2. Move beyond binary security controls

For the past two decades, the whitelisting and blacklisting of discrete applications and entities has served as a practical first line of defense. However, maintaining these lists is cumbersome, and the lists themselves do little to address how threat actors have evolved their tactics. Just as sophisticated hackers quickly adapted to evade signature-based detection tools, they have likewise found novel ways to circumvent these methods. The issue becomes especially pronounced in the atomized network, where there is no shortage of entry and exit points and every minute matters. While these methods and tools will likely continue to serve a function in the security team’s toolbox, defending the atomized network will require the ability to interpret and act decisively on a far broader range of data.
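
A toy comparison illustrates the gap, assuming a hypothetical blocklist and made-up traffic attributes: a binary control passes anything it has never seen before, while even a crude behavioral check can flag an unlisted destination that is acting suspiciously.

```python
# A toy contrast, not production code: a static blocklist check versus
# a decision that also weighs behavioral context. Every name, address,
# and threshold here is an illustrative assumption.
BLOCKLIST = {"198.51.100.4"}

def binary_verdict(ip: str) -> bool:
    # Classic allow/deny: anything not on the list sails through.
    return ip in BLOCKLIST

def contextual_verdict(ip: str, bytes_out: int, new_destination: bool) -> bool:
    # Even an "unknown" IP gets flagged when its behavior is anomalous,
    # e.g., a first-seen destination receiving a large outbound transfer.
    if ip in BLOCKLIST:
        return True
    return new_destination and bytes_out > 50_000_000

print(binary_verdict("203.0.113.9"))                        # False: the list misses it
print(contextual_verdict("203.0.113.9", 80_000_000, True))  # True: behavior flags it
```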

3. Enrich your data sources to provide behavioral context

Data enrichment strategies should be considered a critical factor in effective threat detection, threat forensics, and remediation. Enrichment adds event and non-event contextual information to security event data, transforming raw records into meaningful insights. It’s also important to be able to enrich data in real time and supplement it with business and threat intelligence details.
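
As a sketch of the idea, the snippet below attaches geographic, asset-ownership, and threat-intelligence context to a raw network event. The lookup tables stand in for real GeoIP databases, asset inventories, and intel feeds; all names and values are hypothetical.

```python
# A minimal enrichment sketch. The lookup tables below are stand-ins
# for real GeoIP, asset-inventory, and threat-intelligence sources;
# every name and value is a hypothetical placeholder.
GEO = {"203.0.113.9": "NL", "198.51.100.4": "US"}
ASSETS = {"10.0.0.5": {"owner": "finance", "criticality": "high"}}
THREAT_INTEL = {"203.0.113.9": "known-C2"}

def enrich(event: dict) -> dict:
    """Attach business and threat context to a raw network event so an
    analyst (or an automated rule) can judge it at a glance."""
    enriched = dict(event)
    enriched["dst_country"] = GEO.get(event["dst_ip"], "unknown")
    enriched["src_asset"] = ASSETS.get(event["src_ip"], {})
    enriched["ti_tag"] = THREAT_INTEL.get(event["dst_ip"])
    return enriched

raw = {"src_ip": "10.0.0.5", "dst_ip": "203.0.113.9", "dst_port": 443}
print(enrich(raw))
# {'src_ip': '10.0.0.5', 'dst_ip': '203.0.113.9', 'dst_port': 443,
#  'dst_country': 'NL',
#  'src_asset': {'owner': 'finance', 'criticality': 'high'},
#  'ti_tag': 'known-C2'}
```

With that context attached, an event that looked like routine HTTPS traffic now reads as a high-criticality finance asset talking to a known command-and-control address.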

While many security leaders are striving for full visibility into their atomized networks, the practical path there is to gather vital metadata from all the disparate systems on those networks. That metadata delivers visibility and attack detection, enables reusable integrations that eliminate blind spots, and supports blocking threats and alerting on malicious traffic. It is the best way to protect them.

Martin Roesch is the CEO of Netography, the security company for the atomized network.

