BasisAI: Supporting responsible AI applications from code to production to win customers’ trust

About BasisAI

Founded in 2018, BasisAI helps enterprises accelerate AI adoption. It offers end-to-end AI services, from consultation to fully managed machine learning operations (MLOps), on its proprietary operating system Bedrock. Bedrock is a cloud-based platform-as-a-service (PaaS) that helps enterprises quickly deploy responsible AI, free of biases, in a real-world setting. BasisAI works with large enterprises across industries, from financial services and insurance to healthcare and transportation at every stage of their AI journey.

Industries: Technology
Location: Singapore

Tell us your challenge. We're here to help.

Contact us

BasisAI helps enterprises derive business value from AI in a shorter time by using Google Kubernetes Engine to simplify the development experience and reduce administrative burden.

Google Cloud results

  • Speeds up business impact for customers deploying ML models from prototype to production in minutes, not months
  • Reduces infrastructure burden by 25% with machine learning operations (MLOps) that enable data scientists to focus on modeling
  • Improves security and compliance with Center for Internet Security (CIS) benchmarks to identify potential issues such as misconfigurations

Reduces the time-to-market of ML systems by up to 70%

From chatbots to machine learning algorithms, artificial intelligence (AI) drives digital transformation all over the world. According to Gartner, 75% of companies will deploy real-world AI use cases by the end of 2024.

"We want to help progressive companies power their digital stack with responsible, real-time AI solutions. To enable that, we chose to deploy to Google Cloud as it provides best-in-class machine learning, cloud orchestration, and security technology that allows our engineers to provide enterprises with Google-grade security and resilience."

Liu Feng-Yuan, CEO and co-founder, BasisAI

Whether it's for improving customer experience or harnessing data insights, AI is widely used among technology companies. The founders of BasisAI are tech veterans who want to help large enterprises in sectors such as finance and telecommunications unlock the power of AI. To bring their vision to life, they developed Bedrock, a platform that helps customers build and manage responsible AI applications, from development to deployment.

"We want to help progressive companies power their digital stack with responsible, real-time AI solutions," says Liu Feng-Yuan, CEO and co-founder at BasisAI. "To enable that, we chose to deploy to Google Cloud as it provides best-in-class machine learning, cloud orchestration, and security technology that allows our engineers to provide enterprises with Google-grade security and resilience."

Brands stand to lose consumer trust if the decisions they make, such as setting credit limits, favor one group of customers over another due to AI system biases. Bedrock provides tools for machine learning operations (MLOps), the practice that enables data scientists and operations professionals to collaborate on managing the machine learning production lifecycle. These tools help decision-makers demonstrate that their AI systems are fair and unbiased.

BasisAI operates Bedrock on Google Cloud using Google Kubernetes Engine (GKE) and stores log messages from customer projects with Cloud Logging, a feature of Google Cloud’s operations suite. With the help of Security Command Center, BasisAI gains visibility into security issues to minimize potential brand damage and downtime. The company uses Google Workspace applications such as Google Drive and Gmail for productivity and collaboration.

“The easy-to-use Google Cloud documentation and gentle learning curve, combined with a healthy ecosystem of automation tools such as Terraform, enabled us to get our staging environment up and running in less than two weeks,” says Chua Yong Wen, DevOps engineer at BasisAI.

Faster time to market from prototype to production

Taking AI from code to production requires tight collaboration between data scientists and DevOps engineers within an organization. Once the data scientist designs and trains a model, the DevOps engineer takes over with deployment. By using Bedrock, data scientists have full control of the data and code, while BasisAI manages the machine learning infrastructure.

With Google Cloud, Bedrock enables MLOps practices, reducing the time-to-market of machine learning (ML) systems by up to 70%. It does this by automating workflows for training, reviewing, and deploying ML systems in a reproducible manner at scale.

"Each model is deployed as a production-grade microservice on Google Kubernetes Engine in minutes, instead of what would take months,” says Yong Wen. “The entire deployment process is automated so that data scientists can spend time on prototyping."

"We run a lean team of DevOps engineers at BasisAI. Thanks to automated scaling and deployment with Google Kubernetes Engine, we spend only 25% of our time on infrastructure work, which means the rest of the time, we can focus on building new features."

Chua Yong Wen, DevOps Engineer, BasisAI

Automated tools to minimize infrastructure burden

BasisAI moved to a fully managed environment with GKE in January 2019 and uses the infrastructure-as-code tool Terraform with Google Cloud to create and manage all aspects of customer projects, from GKE clusters to Identity and Access Management (IAM).
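An infrastructure-as-code setup like the one described above might look roughly like the following Terraform sketch. All project IDs, names, regions, and sizes here are illustrative assumptions, not BasisAI's actual configuration.

```hcl
# Illustrative sketch only: a GKE cluster, an autoscaling node pool,
# and an IAM binding for a hypothetical customer project.

provider "google" {
  project = "example-customer-project"   # hypothetical project ID
  region  = "asia-southeast1"
}

resource "google_container_cluster" "bedrock" {
  name     = "bedrock-cluster"
  location = "asia-southeast1"

  # Manage node pools separately instead of using the default pool.
  remove_default_node_pool = true
  initial_node_count       = 1
}

resource "google_container_node_pool" "models" {
  name     = "model-serving"
  cluster  = google_container_cluster.bedrock.name
  location = google_container_cluster.bedrock.location

  autoscaling {
    min_node_count = 1
    max_node_count = 5
  }

  node_config {
    machine_type = "e2-standard-4"
  }
}

# Grant a hypothetical data scientist read access to project logs.
resource "google_project_iam_member" "log_viewer" {
  project = "example-customer-project"
  role    = "roles/logging.viewer"
  member  = "user:data-scientist@example.com"
}
```

Keeping clusters and IAM bindings in code like this makes each customer project reproducible and reviewable, rather than configured by hand.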

Compared to BasisAI’s legacy environment, Google Kubernetes Engine dynamically allocates resources to customer projects, so the engineers don't have to do it manually. "Autoscaling on Google Kubernetes Engine allows us and our customers to save costs on compute resources by only provisioning what we need at any point in time,” says Yong Wen.

For example, traffic spikes on a website utilizing a content recommendation engine may increase CPU and memory usage for the model. Autoscaling on Google Kubernetes Engine enables Bedrock to increase the number of nodes to serve more requests without impacting performance.
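As a rough illustration of the pattern described above, a Kubernetes HorizontalPodAutoscaler like the sketch below could scale a model-serving deployment on CPU utilization, with GKE's cluster autoscaler separately adding nodes once pods no longer fit. The names and thresholds are hypothetical.

```yaml
# Hypothetical autoscaling sketch for a recommendation model service.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: recommender-hpa          # illustrative name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: recommender-model      # illustrative Deployment
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # add replicas above 70% average CPU
```

When a traffic spike pushes average CPU above the target, replicas are added up to the maximum; when traffic subsides, they are scaled back down, which is what keeps compute costs proportional to demand.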

Running a Kubernetes cluster from scratch can also be challenging, but with GKE, the company's lean DevOps team saves a lot of time because they don't have to worry about the minute details of running a cluster. "We run a lean team of DevOps engineers at BasisAI,” says Yong Wen. “Thanks to automated scaling and deployment with Google Kubernetes Engine, we spend only 25% of our time on infrastructure work, which means the rest of the time, we can focus on building new features."

BasisAI team

Operating AI models at speed with robust monitoring tools

For many BasisAI customers who run online predictions, data needs to be processed with as little latency as possible.

"Finance customers may use AI systems to make real-time predictions such as fraud detection. Any latency may cause a delay in a notification to other systems to prevent cybercrime,” says Yong Wen. “Cloud Logging and Cloud Monitoring help us detect and mitigate issues in Google Cloud before they become a problem for our customers and their customers.”

For example, an out-of-memory error may flag that an AI model is running close to 100% of its allocated memory and needs to be reconfigured to prevent downtime. If the model continues to consume more memory than allocated, BasisAI and the customer may need to review the code to find the root cause of the memory leak.

If CPU or memory consumption rises, customers pay more for cloud resources. BasisAI shares Cloud Monitoring metrics to help customers track CPU and memory usage and allocate the right amount of resources to reduce costs.
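The right-sizing described above is typically expressed in Kubernetes as resource requests and limits on the serving container. The fragment below is an illustrative sketch with hypothetical names and sizes: requests reserve capacity for scheduling, while exceeding the memory limit causes the container to be OOM-killed and restarted.

```yaml
# Hypothetical right-sizing sketch for a fraud-detection model.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: fraud-model
spec:
  replicas: 2
  selector:
    matchLabels:
      app: fraud-model
  template:
    metadata:
      labels:
        app: fraud-model
    spec:
      containers:
        - name: model
          image: registry.example.com/fraud-model:v1   # illustrative image
          resources:
            requests:          # reserved for scheduling
              cpu: "500m"
              memory: 1Gi
            limits:            # hard caps; memory overrun => OOM kill
              cpu: "1"
              memory: 2Gi
```

Monitoring actual usage against these values is how a team spots a model that needs either a bigger allocation or a fix for a leak, while avoiding paying for headroom it never uses.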

Improving AI fairness with continuous monitoring

BasisAI designed Bedrock with built-in governance capabilities, providing visibility into all parts of the AI workflow. To promote fairness, Bedrock provides customers with full transparency to explain how each AI model works, why it drives a prediction, and the input data used for training.

Monitoring AI through the ML life cycle is critical for enterprises to prevent unwanted biases such as age and gender from creeping into the predictions. Bedrock makes it easy for data scientists to retrain models on new datasets on Google Cloud without the frustration of managing the platform.

The company adopts software engineering best practices to test code before rolling it out to production. For example, Google Kubernetes Engine supports canary releases, where customers compare the performance of different AI model versions using the same pipeline. To do so, customers send 10% of traffic to a new version of a model, while 90% goes to the existing version. If the new model doesn't behave as it should, users are rerouted to the old version while the team troubleshoots the issue before full deployment.
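One simple way to achieve the 90/10 split described above on plain Kubernetes is to run the stable and canary versions as two Deployments behind one Service, sized 9:1, so traffic divides roughly by replica count. The manifests below are an illustrative sketch with hypothetical names; a service mesh would allow more precise, replica-independent splits.

```yaml
# Hypothetical canary sketch: both Deployments match the Service
# selector, so requests split roughly 9:1 across their pods.
apiVersion: v1
kind: Service
metadata:
  name: model-svc
spec:
  selector:
    app: model                 # matches both stable and canary pods
  ports:
    - port: 80
      targetPort: 8080
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-stable
spec:
  replicas: 9                  # ~90% of traffic
  selector:
    matchLabels: {app: model, track: stable}
  template:
    metadata:
      labels: {app: model, track: stable}
    spec:
      containers:
        - name: model
          image: registry.example.com/model:v1   # existing version
          ports:
            - containerPort: 8080
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-canary
spec:
  replicas: 1                  # ~10% of traffic
  selector:
    matchLabels: {app: model, track: canary}
  template:
    metadata:
      labels: {app: model, track: canary}
    spec:
      containers:
        - name: model
          image: registry.example.com/model:v2   # new version
          ports:
            - containerPort: 8080
```

Rolling back is then just scaling the canary Deployment to zero, which matches the rerouting behavior described above.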

BasisAI team at work

Boosting cloud security and privacy for AI

Data privacy is crucial for BasisAI customers, especially those in regulated industries such as financial services and the public sector. As such, customers provide IAM access to BasisAI to manage the AI model hosted on their own Google Cloud environment.

"Robust IAM on Google Cloud allows our customers to easily control and revoke access to their data and code,” says Yong Wen. “For full accountability of data access, there is visibility of who made what changes at what time to Google Cloud through audit logs in Cloud Logging. If a prediction error happens on an AI system, the customer can pinpoint the source and rectify the mistake."

To enhance its security posture, BasisAI uses Security Command Center to identify and resolve potential threats from misconfigurations and compliance violations.

"We have enterprise customers, and we must adhere to cloud security benchmarks set by organizations such as the Center for Internet Security (CIS),” says Yong Wen. “Security Command Center helps to identify any violations and advise steps for remediation."

“We’re keen to better serve customers who want to build ML models in any environment, and with new products such as Anthos clusters, we’ll be able to make this happen. As we continue to build out our AI solutions, we’re keen to explore even more possibilities with Google Cloud.”

Chua Yong Wen, DevOps Engineer, BasisAI

An evolving journey with Google Workspace and Google Cloud

BasisAI has been using Google Workspace as its productivity suite from day one. The company's 31 employees use Gmail, Google Calendar, Google Drive, Google Docs, and Google Sheets daily.

"We collaborate extensively on Docs to create design documents, and on Google Slides to build out client presentations,” says Yong Wen. “We also use Drive to share files such as webinar recordings for colleagues who may have missed a session.”

Meanwhile, Feng-Yuan is excited about the regular addition of new features on Google Kubernetes Engine. “Every new feature we adopt helps us take our platform to the next stage of its evolution,” he says. “We’re keen to better serve customers who want to build ML models in any environment, and with new products such as Anthos clusters, we’ll be able to make this happen. As we continue to build out our AI solutions, we’re keen to explore even more possibilities with Google Cloud.”
