Hybrid and Multi-Cloud Security: Bulletproof Software Defined Perimeter Implementations
By Don Boxley, CEO and Co-Founder, DH2i
The decentralization of today’s enterprise is a recognized fact. The multitude of cloud benefits—cheap storage, pay-per-use pricing, disaster recovery (DR), and on-demand resources—will undoubtedly continue to drive adoption rates for some time.
Similarly indisputable is the fact that security breaches are increasing in regularity (if not severity) with each new assault. A disconcerting number of data breaches occur in the cloud, jeopardizing the utility of conceivably the most innovative technology advancement of our times.
What’s necessary to fortify the cloud’s value proposition is a security paradigm as flexible and as low-latency as the very opportunities and threats cloud computing affords. It should minimize the attack surface while eluding the notice of intruders; it should be deeply embedded within an organization to safeguard its applications and information as the enterprise “family jewels” that they are.
Software defined perimeter is a progressive security model delivering these benefits and others. When properly implemented, it secures gateways at the application layer both to and between clouds for unassailable security, with cloaked micro-tunnels hackers can’t see or detect.
The best of these implementations rely on rarely used protocols that attackers seldom target, offer micro-tunnel failover for continuous application connectivity between clouds and on-premises settings, and can be dynamically positioned wherever resources are.
With encryption capabilities to ensure even third-party software providers aren’t privy to transmissions, they’re the most fortified deep segmentation perimeter method purposefully designed for hybrid and multi-cloud deployments.
Hybrid and multi-cloud deployments are becoming increasingly necessary to reduce organizational costs and boost productivity. In fact, according to 451 Research’s Voice of the Enterprise: Cloud Hosting and Managed Services, Budgets and Outlook survey of 644 enterprise IT decision-makers, 58% of organizations are pursuing a hybrid strategy involving integrated on-premises systems and off-premises cloud/hosted resources. Moving datacenters or specific applications to the cloud to enable uniform access for distributed locations is a common use case; establishing different nodes in the major public cloud providers for various pricing options, failovers, or burst performance needs is another.

Typical perimeter security measures in these examples and others involve establishing Virtual Private Networks (VPNs), which actually multiply risk in numerous ways. VPNs were designed for traditional on-premises security; they’re less effective in the cloud because they expand the network surface area, leaving more room for lateral movement attacks. This credential-based security method is also difficult to manage, with messy access control lists and the continual reconfiguration of firewalls.
Competitive software defined perimeter solutions overcome these limitations in several ways. They effectively implement segmented micro-tunnels between applications or servers—in different clouds and on-premises—creating micro-perimeters that decrease the network attack surface, not expand it. Because the network itself is never extended, users are simply connected at the application layer via a micro-tunnel gateway that effectively cloaks this conduit, so intruders have nothing to scan. In comparison, VPNs leave ports open for hackers to detect. All the access control lists, firewall concerns, costs, and risks of standard VPN measures are obsolete with software defined perimeter security.
Because software defined perimeter options facilitate these invisible security ports directly between applications or servers, they’re highly transferable between settings. They result in a dynamic deployment of perimeter security wherever it’s needed, isolating specific services for fine-grained user access. Certain implementations of these solutions, however, offer more protection than others. Most platforms create micro-tunnels with Transmission Control Protocol (TCP), which is widely used and well known to malicious actors. More competitive approaches involve User Datagram Protocol (UDP), which is used much less frequently and is therefore less familiar to potential hackers. One reason TCP is more commonly used than UDP is that it has built-in sequencing and retransmission mechanisms that keep data in order. By supplementing UDP with comparable ordering and delivery guarantees, competitive software defined perimeter solutions keep data packets in order while relying on a lesser-known protocol for improved security and lower data transmission latencies.
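The ordering layer described above can be sketched in a few lines. This is a minimal, hypothetical illustration—not any vendor’s actual protocol—that tags each UDP datagram with a sequence number and reassembles the stream in order at the receiver, assuming loopback sockets and a simple 4-byte header:

```python
# Illustrative sketch: adding TCP-like ordering on top of UDP.
# Each datagram carries a 4-byte sequence number; the receiver buffers
# out-of-order packets and releases them in sequence.
import socket
import struct

def make_packet(seq: int, payload: bytes) -> bytes:
    return struct.pack("!I", seq) + payload

def parse_packet(data: bytes):
    (seq,) = struct.unpack("!I", data[:4])
    return seq, data[4:]

# Receiver bound to an ephemeral localhost UDP port.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))
addr = recv_sock.getsockname()

# Sender deliberately transmits the chunks out of order.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for seq, chunk in [(2, b"world"), (0, b"hello"), (1, b" ")]:
    send_sock.sendto(make_packet(seq, chunk), addr)

# Reorder buffer: hold packets until the next expected sequence arrives.
expected, buffer, assembled = 0, {}, b""
while expected < 3:
    seq, payload = parse_packet(recv_sock.recv(2048))
    buffer[seq] = payload
    while expected in buffer:
        assembled += buffer.pop(expected)
        expected += 1

send_sock.close()
recv_sock.close()
print(assembled.decode())  # hello world
```

Real implementations also handle retransmission of lost datagrams; this sketch covers only the reordering half of the guarantee.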
Thus, when distributed, on-premises Oracle client applications use such a solution to talk to an application server in the Azure cloud for a financial services use case, for example, one of the first things to transpire is the opening of randomly generated UDP ports between the on-premises micro-tunnel gateway and the Azure micro-tunnel gateway. Security is enhanced by the random generation of the port, whereas many applications rely on standard ports known to all users, and by the fact that most algorithms are trained to home in on TCP, not UDP, ports. Once the micro-tunnels are in place, the client application and cloud server application hosts communicate only via their respective micro-tunnel gateways. Their ports are never exposed to the internet, effectively cloaking them from everyone.
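Random port generation itself is easy to demonstrate. In the hypothetical sketch below (not DH2i’s actual mechanism), binding a UDP socket to port 0 asks the operating system for an unpredictable ephemeral port, so the gateway never sits on a well-known port that scanners routinely probe:

```python
# Minimal sketch of a randomly assigned gateway port: binding to port 0
# lets the OS choose an ephemeral port rather than a well-known one.
import socket

gateway = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
gateway.bind(("127.0.0.1", 0))   # port 0 -> OS picks an ephemeral port
host, port = gateway.getsockname()
print(f"micro-tunnel gateway listening on UDP {host}:{port}")
assert port >= 1024              # never a privileged/well-known port
gateway.close()
```

Each new tunnel negotiation would repeat this step, so the port in use changes from session to session.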
Encryption and Availability
The most robust software defined perimeter implementations offer a pair of advantages competitors don’t. The first is application-level encryption and public key authentication. Even if attackers did manage to find and access these invisible ports, they’d only get encrypted data. Many providers of this form of security don’t encrypt the data themselves, which leaves them privy to that information. Impregnable implementations of this paradigm involve software that connects the micro-tunnels between applications without any further involvement with the data, because the data is already encrypted.
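The property being described—the tunnel software relays only opaque ciphertext—can be illustrated with a toy sketch. This is NOT production cryptography and not any vendor’s implementation: real solutions use vetted AEAD ciphers (e.g., AES-GCM) and public-key authentication, whereas this stdlib-only example uses a SHA-256 keystream with an HMAC tag purely to show that the relay never sees plaintext:

```python
# Toy illustration (not production crypto): the client seals data before
# handing it to the tunnel, so the tunnel only ever relays ciphertext.
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag          # what the tunnel actually carries

def unseal(key: bytes, sealed: bytes) -> bytes:
    nonce, ct, tag = sealed[:16], sealed[16:-32], sealed[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

key = secrets.token_bytes(32)        # shared only by the two endpoints
sealed = seal(key, secrets.token_bytes(16), b"quarterly ledger")
assert b"quarterly ledger" not in sealed   # the relay sees only ciphertext
print(unseal(key, sealed).decode())        # quarterly ledger
```

The point of the sketch is the division of labor: the tunnel moves `sealed` verbatim, and only the endpoints hold `key`.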
The second advantage is unique to this implementation: the gateways themselves are highly available. All users have to do is implement multiple gateways between settings. If the micro-tunnel between an on-premises application and AWS failed for any reason, for example, the data could automatically fail over to an Azure cloud for continued availability. Another use case for multi-cloud deployments involves burst performance. If users had, for example, a three-node cluster on-premises, in Azure, and in AWS for OLTP, they could rely on this implementation of software defined perimeter to burst to larger nodes in the cloud for end-of-week or end-of-month tallying, which would otherwise tax their on-premises resources. If one provider failed for any reason, users could securely shift to the other to continue operating.
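The failover logic might look like the following sketch. The gateway names and send functions are hypothetical stand-ins (a production system would monitor tunnel health continuously rather than catch an error per send); the sketch simply tries each gateway in preference order and falls back when one is down:

```python
# Hedged sketch of gateway failover: attempt delivery through each
# micro-tunnel gateway in order, falling back if one is unreachable.
def send_via(gateways, payload):
    """Try each (name, send_fn) pair until one delivers the payload."""
    for name, send_fn in gateways:
        try:
            send_fn(payload)
            return name                       # delivered through this gateway
        except ConnectionError:
            continue                          # gateway down: fail over
    raise ConnectionError("all micro-tunnel gateways unreachable")

def aws_gateway(payload):
    raise ConnectionError("AWS tunnel down")  # simulate a provider outage

delivered = []
def azure_gateway(payload):
    delivered.append(payload)                 # healthy standby gateway

used = send_via([("aws", aws_gateway), ("azure", azure_gateway)], b"txn-001")
print(used)  # azure
```

The same ordered list could just as easily encode a burst policy: route to the on-premises gateway by default and spill over to the cloud gateways under load.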
Not only do such software defined perimeter implementations exceed traditional security measures for hybrid and multi-cloud access, but their protocols, encryption, and high availability surpass those of other implementations. They’re also cloud agnostic for complete flexibility between clouds, enabling users to eschew vendor lock-in with the most effective security for multi-cloud and hybrid usage.