Apache Kafka Setup on Google Cloud
This blog post guides you through setting up a basic Kafka environment on Google Cloud Platform for learning purposes. Kafka is a powerful distributed event streaming platform used for real-time data processing. We'll walk through launching a Kafka cluster, creating a topic, and sending and consuming messages.
Kafka is a distributed event streaming platform that lets you read, write, store, and process events (also called records or messages in the documentation) across many machines.
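The core abstraction behind this is an append-only log: producers append records in order, and each consumer reads from its own offset. As a rough sketch of that idea using a plain file (no Kafka involved, purely for illustration):

```shell
# Toy model of a single topic partition as an append-only log.
# Real Kafka adds partitioning, replication, and durable offset
# tracking on top of this basic shape.
LOG=$(mktemp)               # stands in for one topic partition

echo "event-1" >> "$LOG"    # a producer appends records in order
echo "event-2" >> "$LOG"

sed -n '1p' "$LOG"          # a consumer at offset 0 reads "event-1"
tail -n +2 "$LOG"           # another consumer at offset 1 reads "event-2"

rm -f "$LOG"
```

Appending and reading never modify existing records, which is why many consumers can read the same topic independently.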
Prerequisites:
- A Google Cloud Platform account
Steps:
- Deploying Kafka:
- Head over to the Google Cloud Marketplace: https://console.cloud.google.com/marketplace/product/google/kafka
- Click on "LAUNCH" and proceed with the deployment configuration.
- Important:
- For service account, you can choose an existing one or create a new one with appropriate permissions.
- Select a deployment region closest to you for optimal performance.
- Keep the disk space settings at default for this learning exercise.
- Once done, click on "Create" to deploy the Kafka cluster.
- Connecting via SSH:
- Once deployed, access the Kafka cluster instance through SSH using the provided credentials.
- Starting Kafka Services:
- Navigate to the Kafka installation directory:
- Bash
cd /opt/kafka/
Start ZooKeeper, which handles cluster coordination:
- Bash
sudo bin/zookeeper-server-start.sh config/zookeeper.properties
Open a new SSH terminal and start the Kafka broker service:
- Bash
cd /opt/kafka/
sudo bin/kafka-server-start.sh config/server.properties
These commands launch the essential Kafka services for your learning environment.
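With both services running, you can sanity-check that the broker is reachable before creating topics. One way is to query its supported API versions (this assumes the broker listens on the default localhost:9092):

```shell
cd /opt/kafka/
# A response listing API versions means the broker is up and
# accepting connections on localhost:9092.
sudo bin/kafka-broker-api-versions.sh --bootstrap-server localhost:9092
```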
- Creating a Topic:
- Open another SSH terminal and navigate to the Kafka directory:
- Bash
cd /opt/kafka/
Create a topic named "quickstart-events" to hold your messages:
- Bash
sudo bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092
- This command should print "Created topic quickstart-events.", confirming the topic was created.
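The create command above falls back to the broker defaults for partition count and replication factor. kafka-topics.sh also accepts explicit --partitions and --replication-factor flags; a sketch using a hypothetical second topic name ("demo-events") so it doesn't clash with the one just created:

```shell
cd /opt/kafka/
# On a single-broker learning setup the replication factor cannot
# exceed 1, since there is only one broker to hold replicas.
sudo bin/kafka-topics.sh --create --topic demo-events \
  --partitions 1 --replication-factor 1 \
  --bootstrap-server localhost:9092
```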
- Verifying Topic Details:
- Use the command below to see details about the created topic:
- Bash
sudo bin/kafka-topics.sh --describe --topic quickstart-events --bootstrap-server localhost:9092
You should see information about the topic, including the number of partitions and replicas.
- Producing Messages:
- Kafka uses a producer-consumer model for data flow. The producer writes messages to a topic.
- Use the producer console:
- Bash
sudo bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092
Start typing your messages and press Enter to send them to the "quickstart-events" topic.
Press Ctrl+C to stop sending messages.
- Consuming Messages:
- Open another SSH terminal and use the consumer console:
- Bash
sudo bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092
The consumer will start displaying any messages sent to the "quickstart-events" topic, demonstrating successful message flow.
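By default the console consumer keeps running until you press Ctrl+C. For scripted checks it also supports a --max-messages flag, which makes it exit after reading a fixed number of records (this assumes at least two messages were produced above):

```shell
cd /opt/kafka/
# Read the first two messages from the beginning, then exit.
sudo bin/kafka-console-consumer.sh --topic quickstart-events \
  --from-beginning --max-messages 2 \
  --bootstrap-server localhost:9092
```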