Develop a Java producer application
Learn how to develop a Java producer application that authenticates with a
Managed Service for Apache Kafka cluster by using
Application Default Credentials
(ADC). ADC lets applications running on Google Cloud automatically find and
use the right credentials for authenticating to Google Cloud services.
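ADC resolves credentials by checking a sequence of locations: the GOOGLE_APPLICATION_CREDENTIALS environment variable, then the well-known gcloud user-credentials file, then the attached service account via the metadata server. A rough, stdlib-only sketch of that search order (the class and method names here are illustrative, not part of any Google library):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class AdcProbe {
  // Illustrative model of the ADC search order, not a real Google API.
  // Returns which credential source would be chosen, given the value of
  // GOOGLE_APPLICATION_CREDENTIALS (may be null) and the user config dir.
  static String adcSource(String envCredsPath, Path configDir) {
    if (envCredsPath != null && !envCredsPath.isEmpty()) {
      return "GOOGLE_APPLICATION_CREDENTIALS file: " + envCredsPath;
    }
    Path wellKnown = configDir.resolve("gcloud/application_default_credentials.json");
    if (Files.exists(wellKnown)) {
      return "gcloud user credentials: " + wellKnown;
    }
    return "attached service account via metadata server";
  }

  public static void main(String[] args) {
    System.out.println(adcSource(System.getenv("GOOGLE_APPLICATION_CREDENTIALS"),
        Paths.get(System.getProperty("user.home"), ".config")));
  }
}
```

On a Compute Engine VM with neither the environment variable nor the gcloud file present, the last branch applies, which is why the client VM in this tutorial needs no key files.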
Before you begin

Before you start this tutorial, create a new cluster by following the steps in
Create a cluster in Managed Service for Apache Kafka.
If you already have a Managed Service for Apache Kafka cluster, you can
skip this step.
Set up a client VM
A producer application must run on a machine with network access to the cluster.
This tutorial uses a Compute Engine virtual machine (VM) instance. The VM must
be in the same region as the Kafka cluster, and in the VPC that contains the
subnet used in the cluster configuration. Set up the VM as follows:
1. To create the client VM, run the following command:

       gcloud compute instances create test-instance \
           --scopes=https://www.googleapis.com/auth/cloud-platform \
           --subnet=projects/PROJECT_ID/regions/REGION/subnetworks/default \
           --zone=REGION-f
2. Give the Compute Engine default service account the necessary
   permissions to connect to the cluster and authenticate. You need to grant the
   Managed Kafka Client role (roles/managedkafka.client), the Service Account
   Token Creator role (roles/iam.serviceAccountTokenCreator), and the Service
   Account OpenID Token Creator role (roles/iam.serviceAccountOpenIdTokenCreator).

       gcloud projects add-iam-policy-binding PROJECT_ID \
           --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
           --role=roles/managedkafka.client

       gcloud projects add-iam-policy-binding PROJECT_ID \
           --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
           --role=roles/iam.serviceAccountTokenCreator

       gcloud projects add-iam-policy-binding PROJECT_ID \
           --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
           --role=roles/iam.serviceAccountOpenIdTokenCreator

   Replace PROJECT_NUMBER with the number of the project containing the
   cluster. You can look up this number by running gcloud projects describe PROJECT_ID.
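The --member value in the bindings above is derived mechanically from the project number: the Compute Engine default service account always has the form PROJECT_NUMBER-compute@developer.gserviceaccount.com. As a small illustration, a hypothetical helper (not part of any Google SDK) that builds that string:

```java
public class SaMember {
  // Hypothetical helper: builds the --member value used in the gcloud
  // bindings above. The Compute Engine default service account always
  // takes the form PROJECT_NUMBER-compute@developer.gserviceaccount.com.
  static String defaultComputeSaMember(long projectNumber) {
    return "serviceAccount:" + projectNumber
        + "-compute@developer.gserviceaccount.com";
  }

  public static void main(String[] args) {
    System.out.println(defaultComputeSaMember(123456789012L));
  }
}
```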
Set up an Apache Maven project

1. Connect to the client VM using SSH. One way to do this is to
   run the following command:

       gcloud compute ssh --project=PROJECT_ID \
           --zone=REGION-f test-instance

   For more information about connecting using SSH, see About SSH connections.

2. Install Java and Maven with the command
   sudo apt-get install maven openjdk-17-jdk.

   Note: The installation of OpenJDK might take more than 5 minutes and might
   appear to get stuck. You can try interrupting the installation with Ctrl+C.
   In most cases, the necessary packages are installed even if the process is
   interrupted. To confirm the installation succeeded, proceed to the next step.

3. Set up an Apache Maven project. The following command creates a package
   com.google.example in a directory called demo:

       mvn archetype:generate -DartifactId=demo -DgroupId=com.google.example \
           -DarchetypeArtifactId=maven-archetype-quickstart \
           -DarchetypeVersion=1.5 -DinteractiveMode=false

4. Change into the project directory with cd demo.

Create a Java producer application

This section guides you through creating a Java application that
produces messages to a Kafka topic. You write and compile the Java code
using Maven, configure the necessary parameters in a
kafka-client.properties file, and then run the
application to send messages.
Write the Producer code
Replace the code in src/main/java/com/google/example/App.java with
the following code:

    package com.google.example;

    import java.util.Properties;
    import org.apache.kafka.clients.producer.Callback;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.clients.producer.RecordMetadata;

    // Reports whether each send succeeded or failed.
    class SendCallback implements Callback {
      public void onCompletion(RecordMetadata m, Exception e) {
        if (e == null) {
          System.out.println("Produced a message successfully.");
        } else {
          System.out.println(e.getMessage());
        }
      }
    }

    public class App {
      public static void main(String[] args) throws Exception {
        // Load the broker address, serializers, and authentication settings.
        Properties p = new Properties();
        p.load(new java.io.FileReader("kafka-client.properties"));

        KafkaProducer<String, String> producer = new KafkaProducer<>(p);
        ProducerRecord<String, String> message =
            new ProducerRecord<>("topicName", "key", "value");
        producer.send(message, new SendCallback());
        producer.close();
      }
    }
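Note that producer.send is asynchronous: SendCallback.onCompletion runs later, with record metadata on success or an exception on failure. The same success/failure branching can be sketched with only the standard library (no Kafka classes involved; the names here are illustrative):

```java
import java.util.concurrent.CompletableFuture;

public class CallbackSketch {
  // Mirrors the branching that SendCallback.onCompletion performs:
  // a null exception means the delivery succeeded.
  static String onCompletion(String metadata, Exception e) {
    if (e == null) {
      return "Produced a message successfully.";
    }
    return e.getMessage();
  }

  public static void main(String[] args) {
    // Stand-in for the future that an asynchronous send resolves later.
    CompletableFuture<String> delivery =
        CompletableFuture.completedFuture("topicName-0@42");
    delivery.whenComplete((m, t) ->
        System.out.println(onCompletion(m, (Exception) t)));
  }
}
```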
Compile the application
To compile this application, you need the general Kafka client packages
as well as the Google Cloud-specific authentication logic.
1. In the demo project directory, open pom.xml, which contains the
   Maven configuration for this project.
   Add the following lines to the <dependencies>
   section of pom.xml:

       <dependency>
         <groupId>org.apache.kafka</groupId>
         <artifactId>kafka-clients</artifactId>
         <version>3.7.2</version>
       </dependency>
       <dependency>
         <groupId>com.google.cloud.hosted.kafka</groupId>
         <artifactId>managed-kafka-auth-login-handler</artifactId>
         <version>1.0.5</version>
       </dependency>

2. Compile the application with mvn compile.
Configure and run the application

1. The producer expects client configuration parameters in a file called
   kafka-client.properties. Create this file in the demo project directory
   (the directory containing pom.xml) with the following contents:

       bootstrap.servers=bootstrap.CLUSTER_ID.REGION.managedkafka.PROJECT_ID.cloud.goog:9092
       value.serializer=org.apache.kafka.common.serialization.StringSerializer
       key.serializer=org.apache.kafka.common.serialization.StringSerializer
       security.protocol=SASL_SSL
       sasl.mechanism=OAUTHBEARER
       sasl.login.callback.handler.class=com.google.cloud.hosted.kafka.auth.GcpLoginCallbackHandler
       sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;

2. You are now ready to run the application:

       mvn exec:java -Dexec.mainClass="com.google.example.App" --quiet
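The bootstrap address follows a fixed pattern assembled from the cluster ID, region, and project ID. If you prefer configuring the producer in code rather than from the properties file, the same settings can be built programmatically; the sketch below is illustrative (the class name and the placeholder values passed in main are hypothetical):

```java
import java.util.Properties;

public class ProducerConfigBuilder {
  // Builds the same settings the kafka-client.properties file carries.
  // clusterId, region, and projectId are substituted into the fixed
  // bootstrap address pattern used by Managed Service for Apache Kafka.
  static Properties producerProps(String clusterId, String region, String projectId) {
    Properties p = new Properties();
    p.setProperty("bootstrap.servers",
        String.format("bootstrap.%s.%s.managedkafka.%s.cloud.goog:9092",
            clusterId, region, projectId));
    p.setProperty("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    p.setProperty("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    p.setProperty("security.protocol", "SASL_SSL");
    p.setProperty("sasl.mechanism", "OAUTHBEARER");
    p.setProperty("sasl.login.callback.handler.class",
        "com.google.cloud.hosted.kafka.auth.GcpLoginCallbackHandler");
    p.setProperty("sasl.jaas.config",
        "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;");
    return p;
  }

  public static void main(String[] args) {
    // Hypothetical cluster values, for illustration only.
    producerProps("my-cluster", "us-central1", "my-project")
        .forEach((k, v) -> System.out.println(k + "=" + v));
  }
}
```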
Clean up

To avoid incurring charges to your Google Cloud account for the resources
used on this page, follow these steps.

To delete the cluster, run the gcloud managed-kafka clusters delete command:

    gcloud managed-kafka clusters delete CLUSTER_ID --location=REGION

To delete the VM, run the gcloud compute instances delete command:

    gcloud compute instances delete test-instance \
        --zone=REGION-f

What's next

- Use the schema registry to produce Avro messages.
- Overview of Managed Service for Apache Kafka.
- Authenticate to the Managed Kafka API.

Apache Kafka® is a registered trademark of The Apache Software Foundation
or its affiliates in the United States and/or other countries.

Last updated 2025-09-05 UTC.