November 21, 2019
6 min read

A brief guide to multicloud security

Five challenges and seven best practices to consider for your multicloud strategy.


Many agree that multicloud is worth the risk.

The multicloud trend has taken hold in recent years, with RightScale finding
that 84% of enterprises run a multicloud strategy. With multicloud,
organizations deploy applications across two or more cloud platforms, like
AWS, Azure, or Google Cloud.

Increased flexibility is one of the biggest appeals of a multicloud strategy.
Companies avoid vendor lock-in by deploying workloads to different cloud platforms
based on cost and application needs. Hyperscale cloud vendors have data centers
across the globe, so organizations are able to control their cloud expenditures
by scheduling workloads based on location and local time. Multicloud also
protects business operations by reducing downtime and improving resilience in
the event of an outage or workload-disruptive breach (like a DDoS attack).

However, multicloud still has drawbacks that require careful consideration.
The increased complexity of a multicloud environment significantly expands
an organization’s attack surface and level of risk. Most of these risks can be
mitigated with a thorough assessment and a strategy that addresses security
needs – and as a study from IDG and IBM found, 70% of survey respondents
agreed that the benefits of multicloud outweigh the risks.

That being said, there’s a lot to consider. In this blog, we’ll run through
some of the top security challenges of multicloud, and dig into the strategies
to conquer them. If you're short on time, feel free to skip down to the best
practices section.

Key security challenges and how to manage them

Access and permissioning

Multicloud adds complexity to your identity and access management efforts.
Employees need access to multiple cloud services as part of their daily work,
and will access your data from a multitude of locations and devices. We
recommend you take a Zero Trust approach here: Allow access on an as-needed
basis, and no more. Data classification levels can help you streamline access
determinations across different clouds, but the key idea is that limited access
will both protect your most mission critical and sensitive information, and
allow you a clear view of when (and by whom) that information is accessed.
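As a sketch of what “as-needed and no more” can look like in code, here is a minimal, hypothetical access check keyed on data classification levels. The role, level names, and `decide_access` function are illustrative assumptions, not any cloud provider’s API:

```python
# Hypothetical least-privilege check: deny by default, allow only when
# both the cloud and the classification level are explicitly granted.
from dataclasses import dataclass, field

CLASSIFICATION_RANK = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

@dataclass
class Role:
    name: str
    max_classification: str                    # highest level this role may read
    clouds: set = field(default_factory=set)   # clouds this role may touch

def decide_access(role: Role, cloud: str, classification: str) -> bool:
    """Zero Trust default: no grant means no access."""
    if cloud not in role.clouds:
        return False
    return CLASSIFICATION_RANK[classification] <= CLASSIFICATION_RANK[role.max_classification]

analyst = Role("analyst", "internal", {"aws", "gcp"})
print(decide_access(analyst, "aws", "internal"))      # True
print(decide_access(analyst, "azure", "public"))      # False: cloud not granted
print(decide_access(analyst, "gcp", "confidential"))  # False: level too high
```

Keeping the decision in one function like this also gives you a single place to log who accessed what, and when.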

Staying up to date

While this is a security concern for any cloud use, upgrades and patching in
multicloud are more challenging because the vulnerabilities and mitigations
from each cloud service provider are different. Multicloud complexity also
makes it difficult to keep track of vulnerabilities as applications communicate
across multiple clouds. Mike Bursell from Red Hat
calls this need “workload freshness”
and suggests that maintaining it might require you
to upgrade or patch in place, restart the workload with the latest image, or
check and reload recent dependencies, so that any dependent libraries,
middleware, or executables stay at their most recent versions.
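A hedged sketch of such a freshness check might compare the running image digest and dependency versions against the latest known-good set. The digests and package versions below are made up; in practice the inputs would come from your registry and vulnerability scanner:

```python
# Illustrative "workload freshness" decision: a workload is stale if its
# image digest or any tracked dependency lags behind the patched set.
def needs_refresh(running_digest: str, latest_digest: str,
                  running_deps: dict, patched_deps: dict) -> bool:
    if running_digest != latest_digest:
        return True
    return any(running_deps.get(name) != version
               for name, version in patched_deps.items())

running = {"openssl": "3.0.7", "glibc": "2.36"}   # made-up versions
patched = {"openssl": "3.0.13", "glibc": "2.36"}

print(needs_refresh("sha256:abc", "sha256:abc", running, patched))  # True: openssl lags
print(needs_refresh("sha256:abc", "sha256:abc", patched, patched))  # False: fully fresh
```

Running a check like this per cloud, on a schedule, is one way to keep Bursell’s “freshness” visible even when each provider patches on its own cadence.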

A disjointed view of security

Most cloud vendors offer native tools to help you manage security within their
cloud platform, and most of those tools can’t be applied to other vendors. This
disjointed approach to monitoring makes it difficult to gain a thorough
understanding of all the vulnerabilities present in your infrastructure.

Instead of piecing together security insights from separate tools, adopt a
multicloud management tool that serves as a single pane of glass into all the
happenings across all of your cloud platforms. Bursell notes that any
monitoring tool needs to be fully aware
of the scope of your deployment. It’s also important to have regular, if not
real-time, updates to your data view so that you’re aware of unusual changes or
activities and can address attacks as they come in. A centralized tool is also
valuable for conducting forensic analysis of your systems in the event of a
late-discovered breach.
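To illustrate the single-pane idea (not any vendor’s actual schema), the core of it is normalizing each cloud’s findings into one shared shape so they can be sorted and triaged together. The field names and severity scale here are assumptions:

```python
# Toy normalization of per-cloud findings into one unified, sortable view.
SEVERITY_RANK = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def normalize(cloud: str, raw: dict) -> dict:
    """Map a cloud-specific finding onto one shared shape."""
    return {"cloud": cloud,
            "resource": raw["resource"],
            "severity": raw["severity"].lower()}

aws_findings = [{"resource": "s3://logs", "severity": "HIGH"}]       # invented
gcp_findings = [{"resource": "gs://backups", "severity": "critical"}]

unified = ([normalize("aws", f) for f in aws_findings] +
           [normalize("gcp", f) for f in gcp_findings])
unified.sort(key=lambda f: SEVERITY_RANK[f["severity"]], reverse=True)

print([f["resource"] for f in unified])  # ['gs://backups', 's3://logs']
```

The same normalized records double as a forensic timeline if you timestamp them and retain them centrally.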

Control plane complexity

Red Hat’s Bursell defines the control plane as any communication which controls
your applications or how they are run. In addition to securing communications
between and within applications, all scheduling, monitoring, and routing
communications should also be encrypted. It’s critical to secure the
administration, logging, and audit functionality of your applications –
otherwise you hand attackers an opening to take down your entire
infrastructure. David Locke of World Wide Technology writes
that security functionality and enforcement needs to be uniform across all of
your cloud environments, allowing those functions to communicate and coordinate
between themselves and support security automation.
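As one small example of what “encrypt the control plane” can mean in practice, Python’s standard-library ssl module can enforce certificate verification and a TLS version floor for any such connection. The commented-out certificate paths are placeholders for a mutual-TLS setup, not real files:

```python
# Minimal sketch: a client-side TLS context for control-plane traffic
# that refuses plaintext, unverified peers, and pre-1.2 TLS versions.
import ssl

def control_plane_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context()            # verifies server certs by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # floor out old protocol versions
    # For mutual TLS, each component would also present its own certificate:
    # ctx.load_cert_chain("component.crt", "component.key")  # placeholder paths
    return ctx

ctx = control_plane_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```

The same floor-and-verify posture applies to scheduling, monitoring, and routing traffic, whatever library actually carries it.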

Application hardening

When hardening your infrastructure, Bursell recommends knowing what APIs are
exposed, understanding what controls you have on them, and planning what
mitigations you can apply if they come under attack. Tripwire notes that any
software your organization develops or acquires from a third party must be
patched and security-hardened by your organization.
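A toy inventory-driven audit along the lines Bursell suggests might look like the following. The endpoints and control flags are invented for illustration; a real inventory would be generated from your gateways and API specs:

```python
# Hypothetical API inventory: flag exposed endpoints missing a control
# (authentication or rate limiting) so mitigations can be planned.
endpoints = [
    {"path": "/admin",         "exposed": True,  "auth": True,  "rate_limit": False},
    {"path": "/health",        "exposed": True,  "auth": False, "rate_limit": True},
    {"path": "/internal/jobs", "exposed": False, "auth": False, "rate_limit": False},
]

def audit(endpoints: list) -> list:
    """Return exposed endpoints lacking at least one required control."""
    return [e["path"] for e in endpoints
            if e["exposed"] and not (e["auth"] and e["rate_limit"])]

print(audit(endpoints))  # ['/admin', '/health']
```

Even a list this simple answers the two questions that matter: what is exposed, and what controls does it have.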

Best practices

Need a TL;DR? We’ve got you covered:

Key security capabilities and strategies: Multi-factor authentication,
cloud workload security, security analytics, encryption, identity and access
management, cloud security gateways, microsegmentation, threat modeling,
threat intelligence, and endpoint detection and response.

Keep things consistent: Develop a set of security policies and procedures
to enforce on all of your clouds (and any on-prem software too, for that matter).
While there will almost always be some kind of incompatibility, a benchmark or
standardized security policy will reduce the risk of oversights.

Cloud-agnostic software: Use security tools that can easily integrate with
any cloud service, and that can scale with increased apps and workloads.

Go beyond your CSP’s tools: Your cloud providers have tools to keep their
offerings safe, but protection of the data itself falls to you. Some vendors
may be able to advise which capabilities you need within their infrastructure
to keep your data safe.

Confidential computing: Data protection usually focuses on data at rest and
in transit, but what about data in use? Protect data as it is being processed,
and always know where the data is being used. Confidential computing will
allow encrypted data to be processed in memory without exposing it to the rest
of the system. This is a relatively new area, so consider keeping tabs on
the Confidential Computing Consortium to
stay in the loop.

Anticipate unforeseen changes: Planning for the unknown seems like an
oxymoron – but in tech, it’s not. Things change constantly, and often in ways
we don’t predict. Make sure your systems and environments can adapt to whatever
the market throws at you.

Stay informed of new computing trends: For instance, Nick Ismail from
Information Age highlights that serverless computing adoption is growing
because it allows cloud instances to be scaled and patched instantly, and
that machine learning will help servers identify patterns of malicious
behavior and respond faster than human administrators can.
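The “keep things consistent” practice above lends itself to policy as code: express the benchmark once, then compare each cloud’s reported settings against it and surface the gaps. The setting names below are illustrative, not a real benchmark:

```python
# Sketch of a cross-cloud policy benchmark with made-up setting names.
baseline = {"encryption_at_rest": True, "mfa_required": True, "public_buckets": False}

clouds = {
    "aws":   {"encryption_at_rest": True, "mfa_required": True,  "public_buckets": False},
    "azure": {"encryption_at_rest": True, "mfa_required": False, "public_buckets": False},
}

def gaps(clouds: dict, baseline: dict) -> dict:
    """List each cloud's deviations from the baseline policy."""
    return {name: [k for k, v in baseline.items() if settings.get(k) != v]
            for name, settings in clouds.items()
            if any(settings.get(k) != v for k, v in baseline.items())}

print(gaps(clouds, baseline))  # {'azure': ['mfa_required']}
```

One baseline checked everywhere won’t remove every incompatibility between providers, but it does make oversights show up as an explicit, reviewable diff.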

Looking ahead

Just like every market, cloud will continue to change as vendors make new
alliances and focus on new capabilities. For 2020, Forrester predicts
that hyperscale global public cloud leaders will form more alliances, while
cloud management vendors will shift their focus to security – after a
high-visibility data breach. Take steps to ensure that breach isn’t yours
by assessing the current and future state of your cloud strategy, and infusing
security into everything you do.

Cover image by Michael Weidner on Unsplash.
