Overcome problems with public cloud storage providers
Security and compliance concerns are chief obstacles to public cloud storage adoption, as IT managers are hesitant to have their critical data reside outside the data center.
If you have a new app or use case requiring scalable, on-demand or pay-as-you-go storage, one or more public cloud storage services will probably make your short list. It's likely your development team has at least dabbled with cloud storage, and you may be using cloud storage today to support secondary uses such as backup, archiving or analytics.
While cloud storage has come a long way, its use for production apps remains relatively limited. Taneja Group surveyed enterprises and midsize businesses in 2014 and again in 2016, asking whether they were running any business-critical workloads (e.g., ERP, customer relationship management [CRM] or other line-of-business apps) in a public cloud (see "Deployments on the rise"). Fewer than half were running one or more critical apps in the cloud in 2014, and that percentage grew to just over 60% in 2016. Though cloud adoption for critical apps has increased significantly, many IT managers remain hesitant about committing production apps and data to public cloud storage providers.
Adoption hurdles
Concerns about security and compliance are big obstacles to public cloud storage adoption, as IT managers balk at having critical data move and reside outside data center walls. Poor application performance, often stemming from unpredictable spikes in network latency, is another top-of-mind issue. And then there's the cost and difficulty of moving large volumes of data in and out of the cloud or within the cloud itself, say when pursuing a multicloud approach or switching providers. Another challenge is the need to reliably and efficiently back up cloud-based data, traditionally not well supported by most public cloud storage providers.
How can you overcome these kinds of issues and ensure your public cloud storage deployment will be successful, including for production workloads? We suggest using a three-step process to assess, compare and contrast providers' key capabilities, service-level agreements (SLAs) and track records so you can make a better-informed decision (see: "Three-step approach to cloud storage adoption").
Let's examine specific security, compliance and performance capabilities as well as SLA commitments you should look for when evaluating public cloud storage providers.
Security
Cloud data storage security generally operates under a shared responsibility model: The provider is responsible for securing the underlying infrastructure, and you are responsible for the data you place in the cloud as well as any devices or data you connect to it.
All three major cloud storage infrastructure-as-a-service providers (Amazon Web Services [AWS], Microsoft Azure and Google Cloud) have made significant investments to protect their physical data center facilities and cloud infrastructure, placing a particular emphasis on securing their networks from attacks, intrusions and the like. Smaller and regional players also tend to focus on securing their cloud infrastructure. Still, take the time to review technical white papers and best practices documents to fully understand the available security provisions.
Though you will be responsible for securing the data you connect or move to the cloud, public cloud storage providers offer tools and capabilities to assist. These generally fall into one of three categories of protection: data access, data in transit or data at rest.
Data access: Overall, providers allow you to protect and control access to user accounts, compute instances, APIs and data, just as you would in your own data center. This is accomplished through authentication credentials such as passwords, cryptographic keys, certificates or digital signatures. Specific data access capabilities and policies let you restrict and regulate access to particular storage buckets, objects or files. For example, within Amazon Simple Storage Service (S3), you can use Access Control Lists (ACLs) to grant groups of AWS users read or write access to specific buckets or objects and employ Bucket Policies to enable or disable permissions across some or all of the objects in a given bucket. Check each provider's credentials and policies to verify they satisfy your internal requirements. Though most make multifactor authentication optional, we recommend enabling it for account logins.
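To make this concrete, here is a minimal sketch using the AWS SDK for Python (boto3) that applies a canned ACL to one object and a write-restricting policy to a bucket. The bucket name, object key, account ID and IAM user are hypothetical placeholders; other providers expose comparable controls through their own APIs.

```python
# Minimal sketch: granting and restricting access to an S3 bucket with boto3.
# The bucket name, account ID and IAM user are hypothetical placeholders.
import json
import boto3

s3 = boto3.client("s3")

# Grant read access to a single object via a canned ACL:
# readable only by authenticated AWS users.
s3.put_object_acl(
    Bucket="example-reports-bucket",
    Key="2016/q3-summary.csv",
    ACL="authenticated-read",
)

# Apply a bucket policy that allows writes only from one IAM user.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowSingleUserWrites",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:user/etl-writer"},
        "Action": "s3:PutObject",
        "Resource": "arn:aws:s3:::example-reports-bucket/*",
    }],
}
s3.put_bucket_policy(Bucket="example-reports-bucket", Policy=json.dumps(policy))
```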
Data in transit: To protect data in transit, public cloud storage providers offer one or more forms of transport-level or client-side encryption. For example, Microsoft recommends using HTTPS to ensure secure transmission of data over the public internet to and from Azure Storage, and offers client-side encryption to encrypt data before it's transferred to Azure Storage. Similarly, Amazon provides SSL-encrypted endpoints to enable secure uploading and downloading of data between S3 and client endpoints, whether they reside within or outside of AWS. Verify that the encryption approach in each provider's service is rigorous enough to comply with relevant security or industry-level standards.
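As a rough illustration on the AWS side, the boto3 sketch below keeps API traffic on encrypted endpoints and hands a client a short-lived HTTPS presigned URL instead of long-term credentials; the bucket and key are hypothetical, and Azure and Google offer analogous mechanisms (HTTPS endpoints, signed URLs) through their own SDKs.

```python
# Minimal sketch: keeping S3 transfers on encrypted endpoints with boto3.
# Bucket and key names are hypothetical placeholders.
import boto3

# use_ssl is True by default; stating it explicitly documents the intent
# that all API traffic to S3 travels over HTTPS.
s3 = boto3.client("s3", use_ssl=True)

# Generate a short-lived HTTPS URL for a single object download.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-reports-bucket", "Key": "2016/q3-summary.csv"},
    ExpiresIn=900,  # URL expires after 15 minutes
)
print(url)
```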
Data at rest: To secure data at rest, some public cloud storage providers automatically encrypt data when it's stored, while others offer a choice of having them encrypt the data or doing it yourself. Google Cloud Platform services, for instance, always encrypt customer content stored at rest. Google encrypts new data stored in persistent disks using the 256-bit Advanced Encryption Standard (AES-256) and offers you the choice of having Google supply and manage the encryption keys or doing it yourself. Microsoft Azure, on the other hand, enables you to encrypt data using client-side encryption (protecting it both in transit and at rest) or to rely on Storage Service Encryption (SSE) to automatically encrypt data as it is written to Azure Storage. Amazon's offering for encrypting data at rest in S3 is nearly identical to Microsoft Azure's.
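For the per-object flavor of this, the boto3 sketch below asks S3 to encrypt one object with provider-managed AES-256 keys (SSE-S3) and another with a customer-managed KMS key (SSE-KMS); the bucket, object keys and KMS alias are hypothetical placeholders.

```python
# Minimal sketch: requesting server-side encryption for objects written to S3.
# Bucket, object keys and the KMS key alias are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

# Let S3 encrypt the object at rest with AES-256 provider-managed keys (SSE-S3).
s3.put_object(
    Bucket="example-reports-bucket",
    Key="2016/q3-summary.csv",
    Body=b"region,revenue\nwest,1200\n",
    ServerSideEncryption="AES256",
)

# Or supply a customer-managed KMS key instead (SSE-KMS).
s3.put_object(
    Bucket="example-reports-bucket",
    Key="2016/q3-summary-kms.csv",
    Body=b"region,revenue\neast,900\n",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId="alias/example-storage-key",
)
```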
Three-step approach to cloud storage adoption
Customers furthest along in adopting public cloud storage tend to follow a systematic, three-step approach to help them select the best provider(s) and optimize their cloud deployments:
- Begin by documenting business, IT and regulatory requirements for your specific use cases, which serve as a checklist for initial assessment. Include on the list objectives or expectations for data availability, security and application performance, among other things.
- Next, evaluate public cloud storage providers' offerings against your requirements and other deployment criteria to determine "on paper" which best meet your needs. Review service descriptions, capability lists and best practices documents to get a good feel for each offering. Look for third-party audited benchmarks or certifications, covering standards, metrics or observed performance in areas such as security, compliance and data durability. Talk to colleagues or peers in other organizations to learn about their experience with the providers on your list, and scan community blogs to get a sense of user satisfaction levels and any other potential issues.
- Finally, test-drive the selected provider's services, starting with nonproduction data and transitioning to more critical data and apps as your comfort level increases. For example, if you're considering using a service such as AWS Kinesis to load, analyze and process streaming data, test the service with recorded data streams, as sketched below, and wait to introduce production streams until your test criteria have been met.
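A minimal sketch of that kind of dry run with boto3, assuming a hypothetical test stream and a local file of recorded events, might look like this:

```python
# Minimal sketch: replaying recorded events into a test Kinesis stream before
# any production traffic is pointed at it. The stream name and the
# recorded-events file are hypothetical placeholders.
import json
import boto3

kinesis = boto3.client("kinesis")

with open("recorded_events.jsonl") as f:
    for line in f:
        event = json.loads(line)
        kinesis.put_record(
            StreamName="test-clickstream",
            Data=json.dumps(event).encode("utf-8"),
            PartitionKey=str(event.get("user_id", "unknown")),
        )
```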
Also, check for data access logging -- which creates a record of access requests to specific buckets or objects -- and for data disposal (wiping) provisions, to ensure data is fully destroyed if you decide to move it to a new provider's service.
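On AWS, for instance, server access logging can be enabled through the API; the boto3 sketch below, with hypothetical bucket names, directs access logs for one bucket into a separate log bucket (the target bucket must already permit log delivery). Disposal guarantees, by contrast, are largely contractual, so review them in the provider's terms rather than expecting an API call to cover them.

```python
# Minimal sketch: turning on S3 server access logging so requests against a
# bucket are recorded to a separate log bucket. Bucket names are hypothetical,
# and the target bucket must already grant log-delivery permissions.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_logging(
    Bucket="example-reports-bucket",
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "example-access-logs",
            "TargetPrefix": "reports-bucket/",
        }
    },
)
```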
Compliance standards
Your provider should offer resources and controls that allow you to comply with key security standards and industry regulations. For example, depending on your industry, business focus and IT requirements, you may look for help in complying with the Health Insurance Portability and Accountability Act (HIPAA), Service Organization Controls 1 (SOC 1) financial reporting, the Payment Card Industry Data Security Standard (PCI DSS) or FedRAMP security controls for information stored and processed in the cloud. So be sure to check out each provider's list of supported compliance standards, including third-party certifications and accreditations.
Performance capabilities
Unlike security and compliance, for which you can make an objective assessment, application performance is highly dependent on your IT environment, including cloud infrastructure configuration, network connection speeds and the other traffic running over that connection. If you're achieving I/O latency of 5 to 10 milliseconds with traditional storage on premises, or better than that with flash storage, you will want to prequalify application performance before committing to a cloud provider. It's difficult to anticipate how well a latency-sensitive application will perform in a public cloud environment without actually testing it under the kinds of conditions you expect to see in production.
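One way to prequalify is to run a simple probe from the environment where the application will actually live and look at the latency distribution, not just the average. Below is a minimal Python/boto3 sketch, assuming a small hypothetical probe object already stored in the target bucket.

```python
# Minimal sketch: measuring round-trip latency for small object reads from S3,
# run from the environment where the application would actually execute.
# Bucket and key names are hypothetical placeholders.
import time
import boto3

s3 = boto3.client("s3")
samples = []

for _ in range(100):
    start = time.perf_counter()
    s3.get_object(Bucket="example-reports-bucket",
                  Key="probe/4kb-object.bin")["Body"].read()
    samples.append((time.perf_counter() - start) * 1000)  # milliseconds

samples.sort()
print(f"median: {samples[len(samples) // 2]:.1f} ms")
print(f"p95:    {samples[int(len(samples) * 0.95)]:.1f} ms")
```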
Speed of access is based, in part, on data location, so expect better performance if you colocate apps with their data in the cloud. If you're planning to store primary data in the cloud but keep production workloads running on premises, evaluate the use of an on-premises cloud storage gateway -- such as Azure StorSimple or AWS Storage Gateway -- to cache frequently accessed data locally and (likely) compress or deduplicate it before it's sent to the cloud.
Third parties make cloud storage more effective
If you're looking to make your cloud storage deployment work more effectively, check out the third-party offerings in provider marketplaces or ecosystems. Though we're focusing here on security and performance, provider ecosystems include products in a wide range of other storage-related areas, such as backup, archive, disaster recovery and file transfer (data movement). Look especially for those that have been prequalified or certified for use on a provider's cloud.
Sample third-party security offerings
Infrastructure security: To better protect apps and data from cyberattacks and other advanced threats (e.g., Trend Micro Deep Security for AWS or Azure, Palo Alto Networks VM-Series for AWS).
Access and control: To tighten policy-based access and improve business governance through single sign-on and multifactor authentication (e.g., OneLogin One Cloud).
Vulnerability assessment: To inspect app deployments for security risks and help remediate vulnerabilities (e.g., Qualys Virtual Scanner Appliance for AWS or Azure).
Also check out Microsoft Azure Security Center, a security monitoring service with hooks to support a broad range of third-party offerings.
Sample third-party performance products
High-performance file/block storage: Low-latency, high-IOPS and high-throughput file and block storage (e.g., Zadara Virtual Private Storage Array).
Hybrid file services/cloud gateways: Hybrid or multicloud file sharing, often accessed via a gateway appliance, to enable improved access for enterprise file sync and sharing, collaboration and so forth across sites or regions (e.g., CTERA Enterprise File Services, Avere Hybrid Cloud NAS, Panzura Cloud Controllers).
To further address the performance needs of I/O-intensive use cases and applications, major public cloud storage providers offer premium storage capabilities, along with instances that are optimized for such workloads. For example, Microsoft Azure offers Premium Storage, allowing virtual machine disks to store data on SSDs. This helps solve the latency issue by enabling I/O-hungry enterprise workloads such as CRM, messaging and other database apps to be moved to the cloud. As you might expect, these premium storage services come with a higher price tag than conventional cloud storage.
Bottom line on application performance: Try before you buy.
What to look for in an SLA
A cloud storage service-level agreement spells out guarantees for minimum uptime during monthly billing periods, along with the recourse you're entitled to if those commitments aren't met. Contrary to many customers' wishes, SLAs do not include objectives or commitments for other important aspects of the storage service, such as maximum latency, minimum I/O performance or worst-case data durability.
In the case of the "big three" providers' services, the monthly uptime percentage is calculated by subtracting from 100% the average percentage of service requests not fulfilled due to "errors," with the percentages calculated every five minutes (or one hour in the case of Microsoft Azure Storage) and averaged over the course of the month.
Typically, when the uptime percentage for a provider's single-region, standard storage service falls below 99.9% during the month, you will be entitled to a service credit. (Though it's not calculated this way for SLA purposes, 99.9% availability implies no more than 43 minutes of downtime in a 30-day month.) The provider will typically credit 10% of the current monthly charges for uptime levels between 99% and 99.9%, and 25% for uptime levels below 99% (Google Cloud Storage credits up to 50% if uptime falls below 95%). Microsoft Azure Storage counts storage transactions as failures if they exceed a maximum processing time (based on request type), while Amazon S3 and Google Cloud Storage rely on internally generated error codes to measure failed storage requests. Note that the burden is on you as the customer to request a service credit in a timely manner if a monthly uptime guarantee isn't met.
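To see how those tiers play out, here is a back-of-the-envelope Python sketch that turns a month of hypothetical five-minute error-rate samples into an uptime percentage and the corresponding credit tier described above; the numbers are invented purely for illustration.

```python
# Minimal sketch: estimating monthly uptime and the resulting service credit,
# following the tiering described above. The error-rate samples (one fraction
# of failed requests per five-minute interval) are hypothetical.
error_rates = [0.0] * 8480 + [0.35] * 100 + [1.0] * 60  # 8,640 intervals = 30 days

uptime_pct = 100.0 - (sum(error_rates) / len(error_rates)) * 100.0

if uptime_pct >= 99.9:
    credit_pct = 0
elif uptime_pct >= 99.0:
    credit_pct = 10
else:
    credit_pct = 25  # some providers credit more at still-lower uptime levels

print(f"monthly uptime: {uptime_pct:.3f}%  ->  credit: {credit_pct}% of the bill")
```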
Also, carefully evaluate the SLAs to determine whether they satisfy your availability requirements for both data and workloads. If a single-region service isn't likely to meet your needs, it may make sense to pay the premium for a multi-region service, in which copies of data are dispersed across multiple geographies. This approach increases data availability, but it won't protect you from instances of data corruption or accidental deletions, which are simply propagated across regions as data is replicated.
Is cloud storage right for you?
With these guidelines and caveats in mind, you can better assess whether public cloud storage makes sense for your particular use cases, data and applications. If public cloud storage providers' service-level commitments and capabilities fall short of meeting your requirements, consider developing a private cloud or taking advantage of managed cloud services.
Though public cloud storage may not be an ideal fit for your production data and workloads, you may find it fits the bill for some of your less demanding use cases.