Why GCP? A comparison with the rivals.

Let’s be honest: Google is not the first public cloud choice for most companies. In fact, as of early 2020, it’s the third player, with around 7% of the whole market. When it comes to SAP implementations it looks a little better: at least in my experience and observation, more than 10% of companies select GCP.

Nevertheless, the public cloud market is getting more and more consolidated. Just a few years ago, it was shared among dozens of competitors, with over 50% of the revenue made by players outside the “big four” (AWS, Azure, GCP and Alibaba). Today, those four companies are responsible for 84% of the market. Google is gaining momentum with over 80% growth last year, and the best years for the platform are still ahead.

After analyzing the main players’ offers, I believe there are quite a few reasons to consider Google as the primary target for moving SAP to the public cloud. Here is why:

  • Google has unquestionable technological advantages in networking. Gmail, YouTube and other apps were the reason for building a global fiber network in the early days, as no one else was able to meet Google’s networking needs. A lot of this infrastructure was already in place when Google launched its public cloud platform, the network is constantly being improved, and massive investments keep being made in this area. What’s unique is that Google’s Virtual Private Cloud (VPC) network is global: it can span many regions (even continents!), greatly reducing networking complexity. Not to mention, it’s all logical: most constraints of physical networks simply don’t exist here. Google does not just promise that its network performs well: there are already managed services (BigQuery, Cloud Spanner etc.) that decouple storage and processing. In this model, it’s no longer necessary to move data around all the time; instead, it’s processed right where it’s located by remote compute nodes that make good use of Google’s fiber network. This shows how powerful this beast is.
  • Subnetworks in Google’s VPC are regional and can span zones, which greatly simplifies network design for applications distributed across different data centers within a region. Also, subnets are just logical concepts that allow you to manage resources in a better way.
  • Google promises that the network round-trip between its data centers (= zones) within each region is below 1 ms in most cases. This makes it really tempting to deploy SAP application instances (or even HANA instances) in different zones, thus improving the SLA of the whole landscape. In fact, multi-zone implementations of distributed / highly available SAP systems are nothing unusual in GCP.
This statement actually defines a region!
  • Google uses Virtual Machines only. Even if we’re talking about big HANA instances (the maximum is 12 TB for a single box as of early 2020), it’s still all virtual and – what’s even more important – certified by SAP! These are among the biggest machines out there, and they’re just a few clicks away… plus, nothing stops you from joining more of them in a HANA scale-out scenario!
  • You’re not constrained by multi-year contracts for big, multi-terabyte HANA VMs like with some other vendors. You can just upgrade your current workloads to such a size (after making sure your quota allows it). And then you can switch the machine off to stop billing in just seconds. Simple and effective!
  • GCP features like Live Migration and Automatic Restart minimize the downtime needed for infrastructure maintenance, and they can even outperform standard SAP HANA features like HANA Host Auto-Failover.
  • Security is one of the first potential concerns when moving the sensitive data that resides in SAP databases. Fortunately, Google is not playing around here: your data is protected by strict security controls, ranging from physical data center access control all the way up to encryption of your data at rest and in transit. You are encouraged to design user privileges following the principle of least privilege, and Google fulfills the strict requirements of various industry-known standards and regulations.
  • Google is well known for its commitment to Artificial Intelligence and Machine Learning, so you have a lot of innovation opportunities at your disposal. SAP HANA can be integrated with Google’s TensorFlow library, Google Cloud Storage and Google’s TPUs (Tensor Processing Units) to provide a complete set of tools for innovation and for building complementary services.
  • Tools like BigQuery can significantly enhance your OLAP-style processing of data that resides in SAP databases. SAP integrates easily with BigQuery and lets you analyze petabytes of data originating from different sources (HANA, BigQuery, etc.).
  • You don’t need any “super-performance-optimized-premium” persistent disks to get characteristics good enough for running SAP on HANA. You just choose SSD-type disks and make them big enough to fulfill all requirements: the size of a Persistent Disk, together with the number of vCPUs assigned to the VM, determines disk performance.
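To make the size-performance relationship concrete, here is a minimal Python sketch. The per-GB scaling rates and the vCPU-dependent cap are illustrative assumptions, not official figures; the real numbers are published in Google’s Persistent Disk documentation:

```python
# Sketch: how Persistent Disk size drives performance on GCP.
# The per-GB rates and the per-VM cap below are ASSUMED values
# for illustration -- check Google's current docs for real numbers.

PD_SSD_READ_IOPS_PER_GB = 30          # assumed scaling factor
PD_SSD_THROUGHPUT_MBPS_PER_GB = 0.48  # assumed scaling factor

def estimated_pd_ssd_performance(size_gb: int, vcpu_cap_iops: int = 60000):
    """Return (read_iops, throughput_mbps) for an SSD persistent disk.

    Baseline performance grows linearly with disk size, until a
    per-VM cap (which depends on the vCPU count) kicks in.
    """
    iops = min(size_gb * PD_SSD_READ_IOPS_PER_GB, vcpu_cap_iops)
    throughput = size_gb * PD_SSD_THROUGHPUT_MBPS_PER_GB
    return iops, throughput

# A 1,000 GB disk under these assumed rates:
iops, mbps = estimated_pd_ssd_performance(1000)
```

Under these assumed rates, doubling the disk size doubles baseline IOPS until the per-VM cap is reached, which is why oversizing a disk is a legitimate way to buy performance.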
  • Google’s official partnership with SAP dates back to 2017. Since then, SAP HANA and some other SAP products have become available in the GCP Marketplace, and custom, SAP-ready operating system images (RHEL for SAP, SLES for SAP) are at your disposal.
  • Real-world, enterprise-scale SAP migrations to GCP have already happened.
  • When running your workloads on GCP, you can count on per-second billing and sustained usage discounts right out of the box. Google has a clear policy in this area, and you can get up to 30% off the list price simply by running your VMs for a larger portion of each month.
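The incremental nature of sustained usage discounts can be sketched in a few lines of Python. The tier boundaries and multipliers below reflect the scheme for N1 machine types as I understand it; verify them against current GCP pricing documentation:

```python
# Sketch of GCP sustained-use discounts (N1-style tiers).
# Tier boundaries and multipliers are my reading of the published
# scheme -- treat them as assumptions, not authoritative pricing.

TIERS = [  # (fraction-of-month upper bound, price multiplier)
    (0.25, 1.00),  # first quarter of the month: full price
    (0.50, 0.80),  # second quarter: 20% off
    (0.75, 0.60),  # third quarter: 40% off
    (1.00, 0.40),  # last quarter: 60% off
]

def effective_cost(base_monthly_price: float, usage_fraction: float) -> float:
    """Cost of running a VM for usage_fraction of a month, applying
    each tier's multiplier only to the usage that falls inside it."""
    cost, covered = 0.0, 0.0
    for upper, mult in TIERS:
        portion = max(0.0, min(usage_fraction, upper) - covered)
        cost += base_monthly_price * portion * mult
        covered = upper
    return cost

# Full-month usage: 0.25*(1.0 + 0.8 + 0.6 + 0.4) = 0.70 of list price,
# i.e. the 30% effective discount mentioned above.
full_month = effective_cost(100.0, 1.0)
```

Because the discount is applied automatically per tier, you get it without any upfront commitment.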
  • You can go even further with committed usage discounts. When committing to long-term resource usage, you can freely distribute those resources between different Virtual Machines as your needs change.
  • You can use custom VM types for running SAP workloads. Why is this important? Let’s say SAP sizing determines that you need about 20 vCPUs. With standard machine types, you can get a 16-vCPU machine, and the next size up is 32 vCPUs, which is way too much for you. Instead of choosing a bigger box (and a bigger bill each month…) you can create a custom machine that matches your requirements and has a positive effect on your TCO.
When using custom VM type, vCPU to memory ratio is important.
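A quick back-of-the-envelope comparison shows the effect on TCO. The per-vCPU and per-GB prices below are hypothetical placeholders, and the 3.75 GB-per-vCPU ratio mirrors n1-standard types (custom types must stay within Google’s allowed ratio range):

```python
# Sketch: why a right-sized custom machine type can lower TCO.
# Prices below are HYPOTHETICAL placeholders, not real GCP rates.

PRICE_PER_VCPU_HOUR = 0.033   # assumed $/vCPU/hour
PRICE_PER_GB_HOUR = 0.0045    # assumed $/GB-RAM/hour

def hourly_price(vcpus: int, memory_gb: float) -> float:
    """Linear vCPU + memory pricing model (an assumption)."""
    return vcpus * PRICE_PER_VCPU_HOUR + memory_gb * PRICE_PER_GB_HOUR

# SAP sizing calls for ~20 vCPUs; standard types force 16 or 32.
# Keeping the n1-standard-like 3.75 GB-per-vCPU memory ratio:
std_32 = hourly_price(32, 32 * 3.75)      # oversized standard box
custom_20 = hourly_price(20, 20 * 3.75)   # right-sized custom type
savings_per_hour = std_32 - custom_20
```

Over a month of 24/7 operation, even a modest hourly difference compounds into a noticeable saving, which is the whole point of right-sizing with custom types.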
  • After joining GCP, you get 300 USD of credits (valid for 1 year) that should allow you to roll out a not-so-small bunch of resources for SAP solutions. If you’re careful about provisioning, de-provisioning and shutting down resources you’re not using at the moment, those credits should last you quite some time. That’s more than other vendors give you, both in value and in duration.
  • You get a free machine (which is actually a container deployed on Google Compute Engine) called Cloud Shell that provides a set of tools and executables to manage your GCP environment. It’s a no-cost, Linux-based administrative host, accessible from anywhere, with 5 GB of free disk space.
  • Google has been a carbon-neutral company since 2007 and matches 100% of its energy consumption with renewable sources. In recent years, being environmentally responsible has become more important, and in some cases it can be a deciding factor when choosing a cloud provider.
  • Last but not least, I am located in Poland (Central Europe), which was never considered a significant market by cloud vendors. That changed in 2019, when Google – as the first major public cloud vendor – decided to build a new region near Poland’s capital, Warsaw. Can’t wait to deploy my first SAP system there 🙂

Now that we have some arguments for choosing Google Cloud Platform as the primary target for your SAP workloads, in the next article I will explain some details of the security area. If you’d like to be informed about new content, make sure to subscribe to my newsletter!