Google Cloud Platform: Your cloud destination for mission-critical SAP workloads
Babu Prasad Elumalai
Technical Lead, SAP on GCP
Pramod Mahadevan
Head, Global SAP Alliance
SAP kicked off its annual TechEd conference this week, and we thought we’d share an update on the new capabilities and integrations we’ve built in collaboration with SAP to give customers more infrastructure options for their SAP workloads.
Optimize your workloads with custom machine types
Custom machine types now let you right-size your SAP workload by configuring the optimal amount of CPU and memory. We worked with SAP to define productive support for custom machine types with SAP applications. Combining right-sizing on custom machine types with sustained use discounts can offer potential infrastructure cost savings of more than 50 percent.
Here’s how it works. SAP workload sizing is typically expressed in SAPS (SAP Application Performance Standard). Let’s say you determine that meeting the SAPS requirement for your SAP workload calls for 20 vCPUs. Without certification for custom machine types, you would have to provision a predefined machine type with either 16 vCPUs (under-provisioning) or 32 vCPUs (over-provisioning). With custom machine types, however, you can provision a machine with exactly 20 vCPUs and pay only for what you need (20 vCPUs instead of 32). And if your load characteristics change later, you can scale up simply by restarting the machine with more vCPUs.
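As an illustration, the sketch below uses the google-api-python-client library to request a 20-vCPU custom machine type (custom-20-131072, i.e., 20 vCPUs with 128 GB of memory) through the Compute Engine API. The project ID, zone, instance name, and image are placeholder assumptions; adapt them to your own environment and SAP sizing results.

```python
# Minimal sketch: create a VM with a 20-vCPU custom machine type.
# Project, zone, instance name, memory size, and image are illustrative
# placeholders, not prescriptive values.
from googleapiclient import discovery

PROJECT = "my-sap-project"   # hypothetical project ID
ZONE = "us-central1-a"       # hypothetical zone

compute = discovery.build("compute", "v1")

# Custom machine types are addressed as custom-{vCPUs}-{memoryMB};
# here, 20 vCPUs with 131072 MB (128 GB) of memory.
machine_type = f"zones/{ZONE}/machineTypes/custom-20-131072"

config = {
    "name": "sap-app-server-1",
    "machineType": machine_type,
    "disks": [{
        "boot": True,
        "autoDelete": True,
        "initializeParams": {
            # Use an OS image certified for your SAP stack; this family is only an example.
            "sourceImage": "projects/suse-sap-cloud/global/images/family/sles-12-sp3-sap",
        },
    }],
    "networkInterfaces": [{"network": "global/networks/default"}],
}

# Starts instance creation; a production script would poll the returned
# operation until it reports DONE.
operation = compute.instances().insert(project=PROJECT, zone=ZONE, body=config).execute()
print(operation["name"])
```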
Here’s the current state of certifications for SAP workloads on custom machine types:
Announcing support for dynamic tiering
SAP HANA dynamic tiering provides disk-based, large-volume extended storage for your SAP HANA warm data. This lets you keep only the most useful or active data in memory, thereby reducing costs. Dynamic tiering also fits naturally into existing workflows: you query data in extended storage with the same tools you already use to query SAP HANA.
Today, we’re announcing productive support for dynamic tiering on GCP for your SAP HANA workloads. To make sure you have the best possible experience on GCP, we conducted a series of functional tests covering querying, backups, monitoring, and operations, and followed SAP’s sizing guidelines to define the optimal storage, networking, CPU, and memory requirements.
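To give a feel for how this fits into existing workflows, here’s a minimal Python sketch using SAP’s hdbcli driver: it creates a warm-data table in extended storage and then queries it with ordinary SQL. The host, port, credentials, and table definition are hypothetical placeholders, and the sketch assumes the dynamic tiering service is already installed on your SAP HANA system.

```python
# Minimal sketch: create and query an extended-storage (warm data) table.
# Host, port, credentials, and the table itself are illustrative placeholders.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana-host.example.com",  # hypothetical HANA host
    port=30015,                       # hypothetical SQL port
    user="SYSTEM",
    password="********",
)
cursor = conn.cursor()

# Warm data lives in an extended (disk-based) table ...
cursor.execute("""
    CREATE TABLE SALES_HISTORY (
        ORDER_ID   INTEGER,
        ORDER_DATE DATE,
        AMOUNT     DECIMAL(15,2)
    ) USING EXTENDED STORAGE
""")

# ... but is queried with the same SQL you use for in-memory tables.
cursor.execute(
    "SELECT COUNT(*) FROM SALES_HISTORY WHERE ORDER_DATE < ADD_YEARS(CURRENT_DATE, -2)"
)
print(cursor.fetchone()[0])

cursor.close()
conn.close()
```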
You can find more information about the resourcing requirements in our planning documentation.
New certification to back up SAP HANA into Google Cloud Storage
Backint for SAP HANA is an interface through which a third party such as Google Cloud can provide a backup agent that installs in your SAP HANA environment, letting you back up directly to third-party storage. We developed a Backint agent that backs up your SAP HANA database to a Google Cloud Storage bucket, and we worked with SAP to make sure this implementation meets all of their certification requirements for SAP HANA backup and restore.
With this certification, you can use Google Cloud Storage to back up your production SAP HANA databases. That means you can take advantage of features such as geo-redundancy, lifecycle management, and instant access to build a comprehensive backup and disaster recovery strategy for your mission-critical SAP HANA databases.
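Once the Backint agent for Cloud Storage is installed and configured (deployment details will follow in the documentation mentioned below), triggering a backup is just a standard SAP HANA backup statement. The sketch below, with hypothetical connection details, shows one way to issue it with the hdbcli Python driver; you can equally run the same statement from SAP HANA cockpit or hdbsql.

```python
# Minimal sketch: trigger a full SAP HANA data backup through the Backint
# interface. Assumes the Backint agent for Cloud Storage is already installed
# and configured; host, port, and credentials are illustrative placeholders.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana-host.example.com",  # hypothetical HANA host
    port=30015,                       # hypothetical SQL port
    user="SYSTEM",
    password="********",
)
cursor = conn.cursor()

# With Backint configured, the backup streams to the configured Cloud Storage
# bucket rather than to local backup files.
cursor.execute("BACKUP DATA USING BACKINT ('COMPLETE_DATA_BACKUP')")

cursor.close()
conn.close()
```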
This certification further expands our SAP HANA ecosystem through integration with first-class GCP services.
We’ll share more details on deployment automation, as well as documentation to help you use the Backint interface with Google Cloud Storage, in the coming weeks.
Additional product certifications on GCP
As part of our ongoing certification efforts, today we’re announcing the availability of certification for the following:
SAP MaxDB version 7.9.09 and newer
SAP liveCache technology
SAP Content Server
SAP MaxDB is a relational database system, and SAP liveCache technology is an object-based enhancement of MaxDB that can also be used as an in-memory database system. With these certifications, we hope to better serve customers running legacy SAP workloads on MaxDB.
The certification of MaxDB further extends our portfolio of databases supported with SAP NetWeaver:
SAP HANA 1.0 and 2.0
Microsoft SQL Server
SAP ASE
IBM DB2
MaxDB
You can get started today with SAP MaxDB on GCP with our comprehensive guides.
In addition to these updates, SAP IQ, SAP’s columnar relational database management system (RDBMS) optimized for big data analytics, is also now certified to run on GCP.
We’re delighted that these new integrations expand upon our deep work with SAP. We’ve partnered with SAP to certify SAP applications like SAP S/4HANA and the SAP HANA database platform on GCP, giving customers a world-class cloud infrastructure on which to run their SAP applications. Our platform offers robust SLAs, along with the reliability, security, and SAP-specific capabilities that mission-critical SAP applications require.
These platform capabilities are available now to customers:
Minimized planned downtime through live migration.
Over 1 Petabit/sec of total bisection bandwidth to connect applications that require a high-performance network.
Data encryption at rest and in transit.
Customer-controlled encryption using customer-supplied encryption keys (CSEK).
4TB VMs certified for SAP HANA.
Partner-delivered managed services for SAP HANA, with instances as large as any certified SAP HANA hardware available in the market today.
And, we’re excited for new integrations with SAP coming soon, including:
Virtual machines backed by the upcoming Intel Optane DC Persistent Memory for SAP HANA workloads.
You can find more information on our partnership with SAP on our website. And if you plan on joining us at SAP TechEd, we’ll have plenty of detailed demos that provide a hands-on look at integrating GCP with SAP technology. Please stop by booth 335 to say hello, and be sure to check out the DevGarage at SAP TechEd. See you in Vegas!