LOVOO’s love affair with Spanner

October 7, 2021
Johannes Braun

Head of Engineering, LOVOO

Editor’s note: In this blog, we look at how German dating app LOVOO broke up with its monolith system for a microservices architecture, powered in part by the fully managed, scalable Cloud Spanner. 

Founded in 2011, LOVOO is one of Europe’s leading dating apps, available in 15 languages. We currently have approximately 170 employees from more than 25 nations, with offices in Dresden and Berlin. LOVOO changes people's lives by changing how they meet. We do this through innovative location-based algorithms, an app radar feature, and live streaming that helps people find successful matches through chat and real-time video.

Three years ago, we started to encounter growing pains. Our user base was growing at a steady clip, and their activity within the app was growing as well. We had built the app on an on-premises monolith architecture. As we grew, the old system was unable to keep up with the speed and scale we needed to serve our users. 

After we assessed the options available to us in 2018, Google’s open-source-driven approach and cutting-edge technology were the key drivers for our decision to migrate to Google Cloud and its managed services, including Cloud Spanner. Spanner now hosts more than 20 databases for us, powers 40 microservices, and integrates perfectly with our other Google Cloud services. With Spanner’s open source auto-scaler, we can seamlessly scale from 14 to 16 nodes during busier hours, when we perform 20,000 queries per second. One of our databases handles 25 million queries per day and collects 100 GB of new data every month. We feel confident in the platform’s ability to scale for our future needs and address our growing customer base while supporting new services and capabilities.

Breaking up with the monolith

Before migrating to Google Cloud, our infrastructure lived on-premises and used open-source PostgreSQL as our database. However, we encountered challenges with bottlenecks in performance, difficulty scaling during peak times, and constantly needing to add new hardware. The cloud promised to give our engineers and product teams a faster, smoother development process, which was a big selling point for us. We performed a lift-and-shift migration of our architecture, but used the migration as a catalyst to modernize and make important changes. We separated some responsibilities from the monolith into microservices, moving them directly onto Google Kubernetes Engine (GKE). We started out by converting about a dozen functions from the monolith into microservices, and we’re now up to over 40 microservices split out of the original monolith.

We completed the migration smoothly within a six-month timeline, as we wanted to finish within the time remaining on our on-premises contracts. We have plans to eventually move entirely to a microservices-based architecture, but we are taking it one step at a time. Our billing database and logic are complex and were built on PostgreSQL, our original database solution. In this specific case, we chose to lift and shift the workload to Cloud SQL for PostgreSQL, Google’s fully managed database service.

Falling in love with Spanner

Spanner was the foundation of our move to Google Cloud, and it is our preferred solution for large distributed databases. Spanner is a fully managed relational database service with unlimited scale and up to 99.999% availability, which means our prior scale and speed problems are effectively solved. Our developers love managed services like Spanner because routine headaches like infrastructure management, updates, and maintenance are taken care of for us, and we can devote our energy to building new features for LOVOO.

We have roughly 20 databases in one Spanner instance, a mix of production and development databases. It’s a kind of multi-tenancy architecture, and most of our services are connected one-to-one with a database. At the moment we have 20 TB of data on 14 nodes (16 at peak) in a single regional deployment.

Among our use cases for Spanner is a notifications database, which is our largest database. This database is where we save the data needed to send out notifications to our app’s users when other users take an action on their profiles, such as a view or a match. So when you indicate you are interested in a person and they have already shown interest in you, that translates to a row in the notifications table. When the other person logs in, we query the new notifications they have, and they will see that they matched with you.
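
As a rough illustration of that flow, here is a minimal sketch using the Go client library for Spanner. The Notifications table, its columns, and the package layout are hypothetical stand-ins rather than our actual schema, and the commit-timestamp write assumes the CreatedAt column is declared with allow_commit_timestamp = true.

```go
package notifications

import (
	"context"

	"cloud.google.com/go/spanner"
	"google.golang.org/api/iterator"
)

// RecordMatch inserts one row into a hypothetical Notifications table when
// two users match, so the recipient sees it the next time they log in.
// CreatedAt is assumed to be a commit-timestamp column.
func RecordMatch(ctx context.Context, client *spanner.Client, recipientID, actorID string) error {
	m := spanner.Insert("Notifications",
		[]string{"RecipientId", "ActorId", "Type", "CreatedAt", "Seen"},
		[]interface{}{recipientID, actorID, "MATCH", spanner.CommitTimestamp, false})
	_, err := client.Apply(ctx, []*spanner.Mutation{m})
	return err
}

// UnseenNotifications queries the notifications a user has not seen yet,
// as would happen when they open the app.
func UnseenNotifications(ctx context.Context, client *spanner.Client, recipientID string) ([]string, error) {
	stmt := spanner.Statement{
		SQL:    `SELECT ActorId FROM Notifications WHERE RecipientId = @id AND Seen = FALSE`,
		Params: map[string]interface{}{"id": recipientID},
	}
	var actors []string
	iter := client.Single().Query(ctx, stmt)
	defer iter.Stop()
	for {
		row, err := iter.Next()
		if err == iterator.Done {
			return actors, nil
		}
		if err != nil {
			return nil, err
		}
		var actor string
		if err := row.Columns(&actor); err != nil {
			return nil, err
		}
		actors = append(actors, actor)
	}
}
```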

We also have a database on Spanner for our user messaging. Users have conversations in our real-time chats, and messages within those conversations may include various media types they can send to each other, such as photos, audio, and GIFs. The microservice that powers this real-time chat feature has a WebSocket connection to the clients, and it stores the message text and contents in Spanner. We have a table for conversations and a table for individual messages (where each message has a conversation ID).
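
A minimal sketch of what such a schema and write path could look like is shown below, with Spanner DDL embedded in Go. The table and column names are illustrative rather than our production schema, and interleaving Messages in Conversations is just one possible layout for keeping a conversation’s messages close to their parent row.

```go
package chat

import (
	"context"

	"cloud.google.com/go/spanner"
)

// Hypothetical schema for the two tables described above; the real schema
// may differ. SentAt uses a commit timestamp so ordering comes from Spanner.
const messagesDDL = `
CREATE TABLE Conversations (
  ConversationId STRING(36) NOT NULL,
  CreatedAt      TIMESTAMP  NOT NULL,
) PRIMARY KEY (ConversationId);

CREATE TABLE Messages (
  ConversationId STRING(36) NOT NULL,
  MessageId      STRING(36) NOT NULL,
  SenderId       STRING(36) NOT NULL,
  MediaType      STRING(16) NOT NULL,
  Body           STRING(MAX),
  SentAt         TIMESTAMP  NOT NULL OPTIONS (allow_commit_timestamp = true),
) PRIMARY KEY (ConversationId, MessageId),
  INTERLEAVE IN PARENT Conversations ON DELETE CASCADE;
`

// SaveMessage persists one chat message received over the WebSocket
// connection into the Messages table, keyed by its conversation ID.
func SaveMessage(ctx context.Context, client *spanner.Client, conversationID, messageID, senderID, mediaType, body string) error {
	m := spanner.Insert("Messages",
		[]string{"ConversationId", "MessageId", "SenderId", "MediaType", "Body", "SentAt"},
		[]interface{}{conversationID, messageID, senderID, mediaType, body, spanner.CommitTimestamp})
	_, err := client.Apply(ctx, []*spanner.Mutation{m})
	return err
}
```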

A third use case for Spanner is our in-app credit transaction service, where users can gift each other credits. You can think about it almost like a virtual currency payments system. That means we have a table with all our users, and for each one we have their credit balance. When you send out a gift, we decrease the credit number in your row and increase theirs. We also have a “payments” ledger table that has a row for every credit gift ever made. This capability is where Spanner’s transactional consistency shines, because we can perform all these operations atomically in one transaction.
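
The sketch below shows how such a transfer could be expressed with the Go client’s read-write transactions. The Users and Payments tables and their columns are hypothetical stand-ins for our real schema, and the ledger’s CreatedAt column is assumed to allow commit timestamps.

```go
package credits

import (
	"context"
	"fmt"

	"cloud.google.com/go/spanner"
)

// GiftCredits moves `amount` credits from one user to another and appends a
// ledger row, all in a single read-write transaction so the balances and the
// ledger can never diverge. Table and column names are illustrative.
func GiftCredits(ctx context.Context, client *spanner.Client, fromID, toID string, amount int64) error {
	_, err := client.ReadWriteTransaction(ctx, func(ctx context.Context, txn *spanner.ReadWriteTransaction) error {
		// Read the sender's current balance inside the transaction.
		row, err := txn.ReadRow(ctx, "Users", spanner.Key{fromID}, []string{"Balance"})
		if err != nil {
			return err
		}
		var balance int64
		if err := row.Columns(&balance); err != nil {
			return err
		}
		if balance < amount {
			return fmt.Errorf("user %s has insufficient credits", fromID)
		}

		// Decrement the sender, increment the recipient, and append to the
		// ledger; all three statements commit atomically or not at all.
		stmts := []spanner.Statement{
			{SQL: `UPDATE Users SET Balance = Balance - @amount WHERE UserId = @from`,
				Params: map[string]interface{}{"amount": amount, "from": fromID}},
			{SQL: `UPDATE Users SET Balance = Balance + @amount WHERE UserId = @to`,
				Params: map[string]interface{}{"amount": amount, "to": toID}},
			{SQL: `INSERT INTO Payments (PaymentId, FromId, ToId, Amount, CreatedAt)
			       VALUES (GENERATE_UUID(), @from, @to, @amount, PENDING_COMMIT_TIMESTAMP())`,
				Params: map[string]interface{}{"from": fromID, "to": toID, "amount": amount}},
		}
		_, err = txn.BatchUpdate(ctx, stmts)
		return err
	})
	return err
}
```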

Planning a future with Google Cloud

We’ve also been pleased with the Spanner Emulator, which has made our development process a lot easier. Without needing direct access to Spanner, an engineer can debug their code on their machine by running the emulator locally. As part of our build process, we launch an emulator so we can have our software tests run against it. Our engineers also use it to run integration tests on demand on their machines. This ensures that the same API calls we use when we build the code will work when we deploy the code.
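
For illustration, a test against the emulator could look roughly like the following in Go. The project, instance, and database names are placeholders, and the test assumes the emulator has been started separately (for example by the build pipeline) with the test database already created; the Go client automatically targets the emulator when SPANNER_EMULATOR_HOST is set.

```go
package notifications_test

import (
	"context"
	"os"
	"testing"

	"cloud.google.com/go/spanner"
)

// TestAgainstEmulator shows the pattern: point the client library at a
// locally running emulator and issue the same API calls used in production.
func TestAgainstEmulator(t *testing.T) {
	if os.Getenv("SPANNER_EMULATOR_HOST") == "" {
		t.Skip("emulator not running; set SPANNER_EMULATOR_HOST, e.g. localhost:9010")
	}

	ctx := context.Background()
	client, err := spanner.NewClient(ctx,
		"projects/test-project/instances/test-instance/databases/test-db")
	if err != nil {
		t.Fatalf("creating client: %v", err)
	}
	defer client.Close()

	// Any query or mutation issued here goes to the emulator, so the test
	// exercises the exact client calls that ship to production.
	iter := client.Single().Query(ctx, spanner.Statement{SQL: "SELECT 1"})
	defer iter.Stop()
	if _, err := iter.Next(); err != nil {
		t.Fatalf("query against emulator failed: %v", err)
	}
}
```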

Our plan is to build all of our new features on top of Spanner and to continue pulling services out of our monolith. We’re currently migrating our user device representation database, which tracks all of a user’s various devices. We also want to continue moving away from PHP for future use cases, and we’d like to use gRPC, Google’s open source remote procedure call framework, to connect the clients directly with the microservices instead of going through PHP.

With Spanner and other Google Cloud-managed services saving us time and delivering on speed and scalability, we’ll be charting our future roadmap with them on our side. Google Cloud is the right match for us.

Read more about LOVOO and Cloud Spanner. Or read about how Spanner helped Merpay, a fintech enterprise, scale to millions of users.
