Katonic.ai: Helping customers reduce their Enterprise Generative AI costs
About Katonic.ai
Australian startup Katonic.ai is an AI and machine learning business that provides MLOps and generative AI platforms to customers worldwide. With a presence in Sydney, Singapore, and India, the business was featured in the Everest Group's MLOps Products PEAK Matrix in 2022 and won the Frost & Sullivan Best Practices Entrepreneurial Company of the Year Award in the APAC MLOps Industry.
Katonic.ai is making generative AI accessible at scale with Google Cloud so businesses can innovate quickly and more efficiently.
Google Cloud results
- Accelerates feature release cycles from once a month to once every two weeks
- Improves customer experience as measured by a 25% increase in customer satisfaction ratings
- Enables installation of the product in a Google Cloud environment in just 40 minutes
- Reduces customer infrastructure costs from running the platform by 70%
Katonic.ai's generative AI platform can be installed in a customer's Google Cloud environment in just 40 minutes, saving time, cost, and effort.
Artificial intelligence (AI) is becoming mainstream, with management consultancy McKinsey estimating that generative AI could add up to $4.4 trillion in value annually across 63 use cases, and IDC forecasting that global spending on AI will rise 26 percent year over year to $154 billion in 2023. Against this backdrop, Australian startup Katonic.ai is helping businesses surf the AI wave by providing a platform that makes generative AI accessible from both a cost and a technology standpoint.
"Traditionally, you need data scientists, complex models and detailed use cases to leverage AI, but the rapid development of generative AI has made deriving value from the technology much easier," says Prem Naraindas, an experienced technology entrepreneur and Founder and Chief Executive Officer of Katonic.ai.
The company started operations during the pandemic, with its founders bootstrapping the business and keeping a rigorous focus on cost for its first two years. Today, the business operates on a distributed model with an engineering team based in India.
Katonic.ai provides a foundational machine learning operations (MLOps) platform, on top of which runs an enterprise platform, Katonic Playground, for customers to build generative AI applications. The platform enables customers to work with 90 large language and foundation models before deciding on one that best meets their application development needs.
Achieving scalability and staying ahead of the curve
The company started out by building and running its own infrastructure, but this proved expensive and time-consuming. As well as investing in technology, the business had to divert team members from product development and customer experience to infrastructure maintenance and other low-value tasks.
"At a strategic level, we needed a partner that could help us stay ahead of the innovation curve. Google Cloud, with its fully managed Kubernetes through Google Kubernetes Engine, generative AI tools and models through Vertex AI, and the PaLM 2 next-generation LLM from Google AI, met our criteria."
—Prem Naraindas, Founder and Chief Executive Officer, Katonic.ai

"While traditional AI is data- and compute-intensive, the biggest issue for generative AI is scalability and availability," says Naraindas. As such, Katonic.ai had to ensure it could support peaks and troughs in demand in the most cost-effective way.
"At a strategic level, we needed a cloud solution that could help us stay ahead of the innovation curve. Google Cloud, with its fully managed Kubernetes through Google Kubernetes Engine, generative AI tools and models through Vertex AI, and the PaLM 2 next-generation LLM from Google AI, met our criteria." The business also uses Compute Engine virtual machines and Cloud Storage for object storage, and integrates all Google Cloud and AI services into its platform through APIs.
The account and technical teams of Google Cloud provided Katonic.ai with detailed advice on provisioning, configuration, fine-tuning, and best practices that accelerated the migration process. The business completed the migration to Google Cloud in early 2023, and its platforms are now optimized for customer use in Google Cloud environments.
"As a business, we are database, storage and connector agnostic, but most of our customers use Google Cloud with our services, and benefit from being part of the Google Cloud ecosystem," says Naraindas. According to Katonic.ai, customers on Google Cloud can reduce the infrastructure costs of running the enterprise generative AI platform by 70 percent, and achieve better performance than customers using other infrastructure.
Optimizing performance to support generative AI
Generative AI platforms require large GPU resources to operate smoothly. As such, Katonic.ai leverages Cloud GPUs to support its customers' needs. "Our customers prefer open source models tuned to their environments when building generative AI applications, so they need quick access to GPU resources," explains Naraindas. "The speed and availability of GPUs of various configurations across different regions through Google Cloud has been very important to our ability to meet customer expectations."
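For context on how such GPU access works in practice, the sketch below (an illustrative example rather than Katonic.ai's actual deployment; the namespace, image, and accelerator type are assumptions) shows a workload on Google Kubernetes Engine requesting a specific Cloud GPU type via the Kubernetes Python client:

```python
# Illustrative sketch (not Katonic.ai's code) of requesting a Cloud GPU for a
# GKE workload; the image name and accelerator type are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside the cluster

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="llm-inference"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        # Schedule onto a GKE node pool with the desired accelerator attached.
        node_selector={"cloud.google.com/gke-accelerator": "nvidia-tesla-t4"},
        containers=[
            client.V1Container(
                name="model-server",
                image="us-docker.pkg.dev/example-project/serving/llm:latest",
                # Request one GPU; GKE exposes attached GPUs under this resource name.
                resources=client.V1ResourceRequirements(limits={"nvidia.com/gpu": "1"}),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```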
"The speed and availability of GPUs of various configurations across different regions through Google Cloud has been very important to our ability to meet customer expectations."
—Prem Naraindas, Founder and Chief Executive Officer, Katonic.ai

In addition, Katonic.ai customers have multiple options for keeping the data that models are trained on fresh. These include using connectors and pipelines from Katonic.ai itself or native services from Google Cloud, including Vertex AI and BigQuery.
Making generative AI accessible to established businesses
Running on Google Cloud has enabled Katonic.ai to make generative AI accessible to established industries, including property, insurance, banking, and healthcare.
"At one of Australia's largest property companies, after every project, team members create best practice and lessons learned documents," explains Naraindas. "However, over time, due primarily to mergers and acquisitions, the company ended up with widely varying documentation. The company turned to us to develop a generative AI system that would answer any relevant question and create a checklist for future projects."
He also describes a loan automation use case in which a bank worked with Katonic.ai to use generative AI alongside human reviewers to check payslips submitted for loan applications. "With our support, a bank built an application that enabled the drag and drop of digital payslip copies into a location from which a large language model performs a series of checks," says Naraindas. "This reduced the average time to check payslips from eight minutes to two minutes, and overall processing times by 50 percent. In addition, the people involved are being redeployed to more interesting work."
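As a rough illustration of how such a check could be structured (the prompt, field names, and helper function below are hypothetical, not the bank's or Katonic.ai's implementation), a large language model can be asked to extract key payslip fields and return a machine-readable verdict for a human reviewer:

```python
# Hypothetical illustration of an LLM-driven payslip check; the prompt, fields,
# and model choice are assumptions, not the bank's or Katonic.ai's implementation.
# Assumes vertexai.init(project=..., location=...) has been called as in the
# earlier sketch.
import json

from vertexai.language_models import TextGenerationModel

CHECK_PROMPT = """You are verifying a payslip submitted with a loan application.
From the payslip text below, extract employer_name, pay_period, gross_pay, and
net_pay, then check that net_pay is consistent with gross_pay after the listed
deductions. Respond with JSON only, using the keys: fields, checks_passed
(true/false), and issues (a list of strings).

Payslip text:
{payslip_text}
"""

def check_payslip(payslip_text: str) -> dict:
    model = TextGenerationModel.from_pretrained("text-bison")
    response = model.predict(
        CHECK_PROMPT.format(payslip_text=payslip_text),
        temperature=0.0,  # favor deterministic output for a verification task
        max_output_tokens=512,
    )
    # In a production pipeline the raw output would be validated before use;
    # here we simply parse the JSON verdict for a human reviewer to confirm.
    return json.loads(response.text)
```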
Delivering a high quality customer experience and supporting growth
With Google Cloud's infrastructure, Katonic.ai has grown the number of users of its enterprise generative AI platform by 300 in just weeks, and plans to grow the number of users of its free SaaS service from 3,000 to 20,000. Reaching this figure would help the business to promote its paid enterprise generative AI platform to more customers.
However, performance and availability are critical to customer experience and, consequently, growth, and this is where Google Kubernetes Engine comes in. "Customers expect near-100 percent uptime from a SaaS product, and when the number of users increases, uptime and availability become a function of the stability and scalability of the underlying Kubernetes infrastructure," says Naraindas. "This infrastructure lives and breathes, and expands and contracts constantly."
Google Kubernetes Engine also enables Katonic.ai to make its enterprise generative AI platform available within a customer's Google Cloud environment in just 40 minutes. According to Naraindas, this makes its service very easy to market to prospective customers.
Running on Google Cloud has also helped Katonic.ai operate more efficiently and focus on innovation. "We are always short of people and we want our team to focus on building exciting features for customers rather than taking care of infrastructure," says Naraindas. "As well as installation, including our CI/CD pipeline, we have automated our testing frameworks. As a result, we can undertake a new release every two weeks rather than once a month, increasing the frequency with which we ship exciting features to our customers." This has seen the business reduce its overall costs by 40 percent, while achieving a 25 percent increase in user satisfaction.
Accessing a wider pool of customers and adding new products
Currently, Katonic.ai is working on making the paid generative AI platform available to a wider pool of potential customers via Google Cloud Marketplace.
"With Google Cloud, we have the infrastructure to scale, timely access to GPUs, support from skilled account teams and products such as PaLM 2 that we offer as a default chatbot to our customers. Google Cloud has everything we need as a startup, and our customers love it."
—Prem Naraindas, Founder and Chief Executive Officer, Katonic.ai

Katonic.ai is also adding new products, including a single-tenant managed service for businesses on Google Cloud. In the future, the business anticipates generating 40 percent of its revenue from its managed services and multi-tenant offerings.
"With Google Cloud, we have the infrastructure to scale, timely access to GPUs, support from skilled account teams and products such as PaLM 2 that we offer as a default chatbot to our customers," concludes Naraindas. "Google Cloud has everything we need as a startup, and our customers love it."