PGS: Taking seismic intelligence to new depths

About PGS

PGS is an integrated marine geophysics company based in Norway, with offices in 14 countries. Combining its own fleet of purpose-built, customized vessels and one of the largest uses of computing power in the world, PGS maps out the subsurface of the ocean to support a sustainable energy industry, including offshore renewables and carbon storage.

Industries: Technology
Location: Norway


With Google Cloud, PGS has built one of the most powerful high-performance compute solutions in the world in order to provide detailed, high-quality marine geophysical intelligence from seismic data.

Google Cloud results

  • Boosts High Performance Compute and storage capacity with a supercomputer built with Google Kubernetes Engine
  • Reduces capital expenditure by running servers at 100% utilization and paying only for what is used
  • Cuts seismic survey turnaround times from approximately 20 days to 2-3 days with the ability to scale resources on demand

Tool performs up to 30 quadrillion calculations a second

In 2012, Dr. Paul Snelgrove, Associate Scientific Director at the Ocean Frontier Institute, said in his TED Talk "A Census of the Ocean" that we know more about the surface of the moon and of Mars than we do about the deep seafloor. PGS is trying to change that.

Headquartered in Norway, PGS is at the cutting edge of marine geophysics, collecting and interpreting seismic data in order to map the ocean's subsurface. "We call ourselves an integrated seismic intelligence provider," says Jon Oluf Brodersen, Chief Information Officer at PGS. "We cover all aspects of seismic data. We have large physical assets, in that we own a fleet of highly specialized vessels that go out and collect seismic data from beneath the seafloor. But we also have the IT capability to analyze and visualize that data, in a process we call imaging, and provide the highest quality recommendations to our customers."

Working with companies and governments around the world, PGS' vision is to support the search for affordable and sustainable energy for all. As the world moves away from hydrocarbons, more and more of PGS' work has involved green energy projects, such as looking for suitable sites for offshore wind farms or searching for suitable cavities in the seabed for carbon capture projects.

As demand for PGS' services grew over the years, the company wanted to expand capacity for its incredibly resource-intensive imaging processes. Adding more on-premises infrastructure would create technical overheads without the flexibility to scale according to demand. For instance, most of the compute workload is needed after a survey is completed; while a vessel is actually carrying out the survey, on-premises compute resources sit largely idle waiting for the data to come in, resulting in inefficiency. So in 2019, PGS turned to Google Cloud.

"Our imaging solutions work with tens of petabytes of data, requiring the use of on-premises supercomputers to carry out quadrillions of calculations every second," explains Brodersen. "We saw Google Cloud as the ideal way to expand our capabilities while keeping capital expenditure and overheads as low as possible."


Echoes from the deep

What sets PGS apart is its ability to carry out all aspects of a survey, from data gathering to analysis to delivering the final product. Its fleet of purpose-built, precision-tooled vessels uses echo signals from the crust below the sea bottom to build a comprehensive model of the surveyed area. Seismic readings are passed through bespoke algorithms, which account for factors such as speed, angle, and rock type to build a detailed picture of the ocean floor.

Sifting through, processing, and analyzing all of these data streams takes enormous resources, and PGS had previously relied on some of the world's most powerful supercomputers, in addition to its six data centers around the world. But on-premises High Performance Compute (HPC) resources like these are inflexible and time-consuming to configure and maintain. While cloud computing can be more expensive in the initial outlay, operating costs are significantly reduced over time because of the ability to scale up and down on demand, work with newer kinds of hardware, and find more efficient ways of computing and storing data in the cloud.

"We came up with the concept of a 'burst model'," explains Brodersen. "We wanted to be able to very quickly expand capacity by up to 30% in periods of increased demand. That could only be done with the cloud."

PGS chose to deploy its HPC expansion with Google Cloud for three reasons. Firstly, the company felt that the culture of engineering and innovation at Google was a perfect fit. Secondly, Google Cloud and its commitment to open-source tools such as TensorFlow and Kubernetes afforded PGS the opportunity to develop the tools it wanted without having to worry about vendor lock-in. Finally, Google Cloud had a presence in the Nordics, where PGS is based, and having access to local engineers was a priority for the company.

Building a supercomputer in the cloud

PGS signed an agreement with Google Cloud in 2019 and set about building an HPC presence in the cloud, beginning with a "lift and shift" migration. However, simply recreating the on-premises architecture in the cloud was not the optimal solution. "We found a lift and shift approach ended up not being very user friendly," says Brodersen.

In 2020, Covid-19 reduced demand for PGS' services, but it gave the technology team time to rethink its strategy without having to worry about a depreciating on-premises infrastructure. Following a summit with some of Google Cloud's top engineers about how best to use Google Cloud for the kind of HPC projects that PGS needed, the company moved from replicating the on-premises infrastructure to a more cloud-native architecture, based around Google Kubernetes Engine (GKE).

"Google Kubernetes Engine is the core of our cloud-based services, acting as our High Performance Compute scheduler," says Brodersen. "It allows us to scale up and down so much faster than before and has boosted our capacity significantly. It's a real game-changer for us. We were actually able to create a 30 petaflops supercomputer with 1.2 million vCPU in a few hours and have it disappear automatically when the work is done."
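The figures in the quote above imply a concrete per-vCPU throughput, which a quick back-of-the-envelope calculation makes explicit. The sketch below uses only the two numbers quoted (30 petaflops, 1.2 million vCPUs); the derived per-vCPU rate is an illustration, not an official figure from PGS or Google.

```python
# Back-of-the-envelope check on the supercomputer figures quoted above:
# 30 petaflops spread across 1.2 million vCPUs. The derived per-vCPU
# throughput is illustrative only.

PETA = 10**15
GIGA = 10**9

total_flops = 30 * PETA   # 30 petaflops, from the quote
vcpus = 1_200_000         # 1.2 million vCPUs, from the quote

flops_per_vcpu = total_flops / vcpus
print(f"Implied throughput per vCPU: {flops_per_vcpu / GIGA:.0f} GFLOPS")
# 30e15 / 1.2e6 = 25e9, i.e. 25 GFLOPS per vCPU
```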

Just as GKE provided Brodersen and his team with the ability to compute at unprecedented speeds, Cloud Storage has provided an equally powerful storage solution that allows them to read and write petabytes of data at high speed.

In addition to building out its HPC workload on the cloud, PGS migrated most of its enterprise applications onto Google Cloud. BigQuery has proven especially useful as a highly scalable data warehouse for the company's business intelligence needs, while Anthos has helped integrate the existing on-premises infrastructure with the new cloud services.


High performance computing on demand

While the original plan was for a 30% increase in HPC capacity with a "burst model", PGS went much further after seeing early successes with Google Cloud, and GKE in particular. When the company's on-premises supercomputers came to the end of their service period in 2022, PGS had adapted its eight most used algorithms to the cloud, and was able to move 80% of its HPC workload onto Google Cloud and shut down two of its main data centers.

The sheer scale at which PGS can operate with Google Cloud is clearly represented by the number of cores it has used so far. Its on-premises infrastructure used 202,000 cores, but at its peak, the company has used 1.2 million vCPU with Google Cloud.

Since the migration to Google Cloud, the company has seen an almost fourfold increase in compute power. If the cloud HPC workload were a single on-premises supercomputer, it would rank among the top 25 supercomputers in existence.
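The two unit counts in the text can be put side by side. A cloud vCPU typically corresponds to a single hardware thread rather than a physical core, which is one reason the roughly fourfold compute-power increase cited above is smaller than the raw ratio of units; the sketch below computes only that raw ratio from the numbers given.

```python
# Scale comparison using the figures in the text. A cloud vCPU is
# typically one hardware thread, while an on-premises "core" is usually
# a physical core (often two threads), so the counts are not directly
# equivalent -- hence ~4x compute from ~6x the raw units.

on_prem_cores = 202_000       # on-premises core count, from the text
peak_cloud_vcpus = 1_200_000  # peak vCPU count on Google Cloud, from the text

ratio = peak_cloud_vcpus / on_prem_cores
print(f"Raw unit ratio: {ratio:.1f}x")  # about 5.9x
```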

Higher utilization rates, lower capital expenditure, faster surveys

When PGS ran on-premises supercomputers, the average utilization rate was around 55%. That meant that for just under half the time they were up, the company was running idle servers, incurring large maintenance costs on top of what it had already paid to purchase and install them. Moving to the cloud means that PGS can scale its HPC workloads up to levels far in excess of its previous capacity, and then scale back down once the work is done, paying only for what it uses.
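The utilization figure above translates directly into an effective cost penalty, sketched below. The 55% utilization comes from the text; the per-core-hour rate is a normalized placeholder, not a real price.

```python
# Illustration of why low utilization inflates the effective cost of
# on-premises capacity. The 55% figure is from the text; the hourly
# rate is a made-up, normalized placeholder.

utilization = 0.55        # average on-premises utilization, from the text
rate_per_core_hour = 1.0  # hypothetical normalized cost of one core-hour

# On premises, every hour is paid for whether the core is busy or idle,
# so the cost per *useful* core-hour is inflated by the idle fraction.
on_prem_effective = rate_per_core_hour / utilization

# With pay-per-use cloud pricing, idle time costs nothing.
cloud_effective = rate_per_core_hour

print(f"Effective cost per useful core-hour, on-prem: {on_prem_effective:.2f}")
print(f"Effective cost per useful core-hour, cloud:   {cloud_effective:.2f}")
# 1 / 0.55 is roughly 1.82: about 80% more per useful core-hour at 55% utilization
```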

"With Google Cloud, we reduced our capital expenditure significantly, because we don't have to run idle servers when we don't need them," says Brodersen. "This means that we not only have more to spend on our industrial assets, but we have also reduced our carbon footprint considerably."

As beneficial as these technical achievements have been for PGS, their real value lies in the benefits that can be passed on to customers. The most obvious is the impact on turnaround time. Previously, processing steps could take approximately 20 days to complete, most of which was due to the time the HPC workloads took. With the ability to scale resources on demand with GKE, PGS can process data much faster and deliver the final product to customers more quickly.


New horizons with the cloud

In addition, Google Cloud has provided new opportunities for the company. Thanks to the capacity and speed of Cloud Storage, PGS has built a new product called Versal, in a joint venture with TGS, CGG, and SLB, which allows clients to search through entire libraries of seismic data and visualize it on an interactive map. "It's an online portal that allows customers to see what we have and exactly what our seismic data covers very easily," says Brodersen. "It's an effective bit of marketing for our surveys."

Brodersen and his team are now focused on bringing the rest of their systems into Google Cloud. "We're shifting the rest of our HPC capacity and eventually all of our storage in order to be 100% in the cloud," he says. "We've already dramatically expanded our HPC capacity, but we're not even using the full resources available to us. If we did, we would be running a cloud-based supercomputer that was seven times more powerful than what we had before."

