AI & Machine Learning

SAVI transforms global surgical instrument tracking with Google Cloud

January 5, 2023
https://storage.googleapis.com/gweb-cloudblog-publish/images/healthcare_2022_N9JWanV.max-2500x2500.jpg
Michael Blake

VP Engineering and Architecture, Max Kelsen

Cameron Bean

FAIDH, Global Health Care and Life Sciences Lead


Powered by Vertex AI (Google Cloud’s platform for accelerating development and deployment of machine learning models into production), SAVI (Semi Automated Vision Inspection)1 is transforming surgical instrument identification and cataloging, leading to fewer canceled surgeries and easing pressure on surgery waitlists. 

Max Kelsen, an analytics and software agency that specializes in machine learning, has worked closely with Google Cloud and Johnson & Johnson MedTech to create a system that can manage tens of thousands of individual devices, their characteristics, and how they apply to each set or tray used by a surgeon. SAVI does this while delivering a one in 10,000 real-world error rate, much faster and more accurately than manual processes currently in use across the industry. Implementing SAVI can also unlock end-to-end visibility and traceability across the surgical set supply chain and provide advanced analytics and insights. 

Eliminating time-consuming manual processes

Surgeons need a large number of specialist instruments and devices to complete complex, delicate procedures. Because each tray of these instruments can typically cost more than $350,000, and keeping every type of set on the shelf at every surgical facility is not feasible, manufacturers generally loan them to hospitals for procedures, such as inserting one of the manufacturers’ implants into a patient’s knee. Once a procedure is complete, the hospital returns the instrument tray to the manufacturer for storage and redistribution to other hospitals as needed.

Each time a hospital returns a tray, the manufacturer needs to check that each instrument is there, correctly placed, cleaned, and fit for the purpose of the next procedure. As each set may hold more than 400 instruments, completing this process manually is complex and time-consuming. While each tray is checked before and after surgery at the hospital, and again when it arrives and leaves the manufacturer’s facility, Max Kelsen finds that 5% of surgeries can still be affected by missing, broken or bent instruments. This has a severe downstream impact on private hospitals in particular, directly affecting patient safety and outcomes; in Australia, for example, around 60% of surgeries are performed in private hospitals. 

Johnson & Johnson MedTech has 60,000 surgical trays across the Asia-Pacific, and loans these trays out about 100,000 times per month. The manufacturer approached Max Kelsen to help design and develop a solution to make the supply chain more efficient, and to give more visibility into asset movement. As a Google Cloud Partner specializing in applying machine learning at scale in healthcare contexts, Max Kelsen had the expertise and track record to meet Johnson & Johnson MedTech’s need for a globally scalable solution that was engineered for quality and performance.

The first step was to establish a baseline for the project by determining how long the manufacturer’s team took to process each tray, and to set an efficiency target. We then spent six months determining and evaluating how to deliver a robust, accurate solution that outperformed the current manual, labor-intensive methods for processing instruments and trays globally. Our work included extensive technical feasibility research on a sample representative of the variety and complexity of sets, trays, and devices needed for different types of surgery, including orthopedic, spinal trauma, and maxillofacial groups.

Working with Google Cloud to accelerate and de-risk the project 

This is a familiar, industry-wide problem. It has been explored with a number of technologies over several years without producing the scalability and performance needed for a feasible solution. Google Cloud partnered with Max Kelsen to accelerate and de-risk this large, strategic project for a mutual customer.

Technical feasibility took four months, prior to a year-long production pilot of SAVI in a distribution center in Queensland that services over 100 hospitals. After obtaining enough real-world data and experience to validate that the solution was as scalable and as accurate as needed, an Asia-Pacific rollout of the system commenced. SAVI is now live across Johnson & Johnson MedTech’s operations in Australia, New Zealand, and Japan, garnering recognition with a JAISA excellence award.

Google Cloud machine learning is integral to SAVI. Google Cloud’s technologies were a big differentiator for Max Kelsen’s engineering team in delivering the breakthroughs needed at scale, and in production, to meet Johnson & Johnson MedTech’s needs.

Reducing checking and documentation time

Running SAVI in Google Cloud has reduced the time Johnson & Johnson MedTech needs to check and document inspections of these surgical instrument sets by over 40%. The application also delivers consistent, measurable quality that is hard to achieve at scale with manual processes. During the pandemic, the application enabled Johnson & Johnson MedTech to operate with a lower headcount for the same volume output, enabling the organization to quickly service a backlog of waitlisted surgeries.

In addition, the automation delivered with SAVI has reduced the time required to bring technicians up to speed on quality control processes, from eight to 12 months down to just three months, enhancing productivity and performance while delivering a more robust workforce.

So how does SAVI work in a real-world context? SAVI is deployed via a tablet, and a web-based application uses an API to photograph the medical device trays, as shown below. Max Kelsen captures the photograph and sends it to a range of different services via an API endpoint hosted on Google Cloud (a sketch of this flow follows the list):

  1. Image information is stored in Cloud Storage

  2. Data relating to the trays is stored in Cloud SQL for PostgreSQL

  3. APIs and web UI components run in Cloud Run 

  4. Analytics data is stored within BigQuery
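
The following Python sketch illustrates what this onboarding flow could look like. It is a minimal illustration, not the production implementation: the bucket, table names, connection string, and schema are assumptions, and in practice the handler would sit behind the Cloud Run API mentioned above.

```python
# Minimal sketch of the tray-onboarding flow described above.
# BUCKET, BQ_TABLE, PG_DSN, and the table schema are hypothetical placeholders.
import datetime
import uuid

from google.cloud import bigquery, storage
from sqlalchemy import create_engine, text

BUCKET = "savi-tray-images"                                 # hypothetical bucket
BQ_TABLE = "my-project.savi.scan_events"                    # hypothetical BigQuery table
PG_DSN = "postgresql+psycopg2://user:pass@10.0.0.5/savi"    # Cloud SQL for PostgreSQL


def onboard_tray_photo(local_photo: str, tray_id: str) -> str:
    scan_id = str(uuid.uuid4())
    taken_at = datetime.datetime.utcnow().isoformat()

    # 1. Store the image itself in Cloud Storage.
    blob = storage.Client().bucket(BUCKET).blob(f"trays/{tray_id}/{scan_id}.jpg")
    blob.upload_from_filename(local_photo, content_type="image/jpeg")

    # 2. Record tray metadata in Cloud SQL for PostgreSQL.
    engine = create_engine(PG_DSN)
    with engine.begin() as conn:
        conn.execute(
            text(
                "INSERT INTO tray_scans (scan_id, tray_id, image_uri, taken_at) "
                "VALUES (:scan_id, :tray_id, :image_uri, :taken_at)"
            ),
            {
                "scan_id": scan_id,
                "tray_id": tray_id,
                "image_uri": f"gs://{BUCKET}/{blob.name}",
                "taken_at": taken_at,
            },
        )

    # 3. Stream an analytics row to BigQuery for downstream reporting.
    errors = bigquery.Client().insert_rows_json(
        BQ_TABLE, [{"scan_id": scan_id, "tray_id": tray_id, "taken_at": taken_at}]
    )
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")

    return scan_id
```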

Once this tray and device onboarding stage is completed, the next step is to perform inferences from the images and data. By hosting online models with Kubeflow model serving on GKE, we enable a model to identify all the instruments in a tray at low latency.
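
As a rough illustration, a low-latency request against a Kubeflow-served model might look like the following. The host, model name, and payload shape are assumptions for the sake of the example; the actual SAVI serving interface is not published.

```python
# Hedged sketch of calling a model hosted with Kubeflow model serving on GKE.
# INFERENCE_HOST and MODEL_NAME are hypothetical, as is the payload schema.
import base64

import requests

INFERENCE_HOST = "http://savi-serving.example.internal"  # hypothetical GKE service
MODEL_NAME = "tray-ortho-knee-v3"                         # hypothetical per-set model


def identify_instruments(image_bytes: bytes) -> list:
    # KFServing/KServe v1 protocol: POST {"instances": [...]} to /v1/models/<name>:predict
    payload = {
        "instances": [{"image_b64": base64.b64encode(image_bytes).decode("ascii")}]
    }
    resp = requests.post(
        f"{INFERENCE_HOST}/v1/models/{MODEL_NAME}:predict",
        json=payload,
        timeout=5,  # keep the round trip short so the tablet UI stays responsive
    )
    resp.raise_for_status()
    # Each prediction would describe a detected instrument and its placement.
    return resp.json()["predictions"]
```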

https://storage.googleapis.com/gweb-cloudblog-publish/images/1_SAVI_10523.max-1900x1900.jpg

Vertex AI Workbench notebooks are used for data exploration and modeling. Kubeflow training pipelines hosted on GKE are executed to produce machine learning models for specific surgical instrument sets. Several hundred machine learning models are then hosted with Kubeflow model serving on GKE, with state and analytics managed using Firebase. Machine learning infers from the images whether any devices are incorrectly placed, dirty, or otherwise not fit for purpose, and the results are returned to the tablet for the user to respond accordingly.
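
For readers unfamiliar with Kubeflow Pipelines, a per-set training pipeline could be defined along these lines using the KFP v1 SDK. The container image, arguments, and set identifiers are illustrative assumptions only, not the actual SAVI pipeline.

```python
# Illustrative Kubeflow Pipelines (KFP v1 SDK) definition of a per-set training run.
# The trainer image, its arguments, and the default set ID are hypothetical.
import kfp
from kfp import dsl


@dsl.pipeline(
    name="savi-set-training",
    description="Train an instrument-detection model for one surgical set.",
)
def train_set_pipeline(
    set_id: str = "ortho-knee",
    data_uri: str = "gs://savi-data/ortho-knee",
):
    # One containerized training step; in practice this would be followed by
    # evaluation and a deployment step that updates the serving cluster on GKE.
    dsl.ContainerOp(
        name="train",
        image="gcr.io/my-project/savi-trainer:latest",  # hypothetical trainer image
        arguments=["--set-id", set_id, "--data-uri", data_uri],
    )


if __name__ == "__main__":
    # Compile to a workflow spec that the Kubeflow cluster on GKE can execute.
    kfp.compiler.Compiler().compile(train_set_pipeline, "savi_set_training.yaml")
```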

https://storage.googleapis.com/gweb-cloudblog-publish/images/2_SAVI_10523.max-1300x1300.jpg

Based on our success to date, SAVI is now available on Google Cloud Marketplace to help healthcare organizations achieve machine learning-powered efficiencies across a range of use cases, and ultimately improve patient safety and outcomes.


1. Not to be confused with the Visual Inspection Model (Assembly) available in Vertex AI Vision.