Kafka Spring Demo
Download: http://download.csdn.net/download/knight_black_bob/9709057
Installation guide: http://knight-black-bob.iteye.com/blog/2343192
After sending messages with a timer, the result is as follows:
Kafka installation
15. Install Kafka

cd /usr/local/
wget http://mirror.bit.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz
tar xf kafka_2.10-0.10.0.0.tgz
ln -s /usr/local/kafka_2.10-0.10.0.0 /usr/local/kafka
chown -R hdfs:hadoop /usr/local/kafka_2.10-0.10.0.0 /usr/local/kafka
chown -R root:root /usr/local/kafka_2.10-0.10.0.0 /usr/local/kafka

Create the /kafka chroot node in ZooKeeper:

/usr/local/zookeeper/bin/zkCli.sh
create /kafka ''

Edit the broker configuration:

vim /usr/local/kafka/config/server.properties
broker.id=0
zookeeper.connect=dev10.aoiplus.openpf:2181,dev06.aoiplus.openpf:2181,dev05.aoiplus.openpf:2181/kafka

Copy the package and configuration to the other nodes:

scp -r /usr/local/kafka_2.10-0.10.0.0.tgz root@dev05.aoiplus.openpf:/usr/local/
scp -r /usr/local/kafka_2.10-0.10.0.0.tgz root@dev06.aoiplus.openpf:/usr/local/
scp -r /usr/local/kafka/config/server.properties root@dev05.aoiplus.openpf:/usr/local/kafka/config/server.properties
scp -r /usr/local/kafka/config/server.properties root@dev06.aoiplus.openpf:/usr/local/kafka/config/server.properties

Start the broker on master and slaves:

/usr/local/kafka/bin/kafka-server-start.sh /usr/local/kafka/config/server.properties &

Create and inspect the topic:

/usr/local/kafka/bin/kafka-topics.sh --create --zookeeper dev10.aoiplus.openpf:2181,dev06.aoiplus.openpf:2181,dev05.aoiplus.openpf:2181/kafka --replication-factor 3 --partitions 5 --topic baoy-topic
/usr/local/kafka/bin/kafka-topics.sh --describe --zookeeper dev10.aoiplus.openpf:2181,dev06.aoiplus.openpf:2181,dev05.aoiplus.openpf:2181/kafka --topic baoy-topic

Console producer and consumer for a quick test:

/usr/local/kafka/bin/kafka-console-producer.sh --broker-list dev10.aoiplus.openpf:9092,dev05.aoiplus.openpf:9092,dev06.aoiplus.openpf:9092 --topic baoy-topic
/usr/local/kafka/bin/kafka-console-consumer.sh --zookeeper dev10.aoiplus.openpf:2181,dev05.aoiplus.openpf:2181,dev06.aoiplus.openpf:2181/kafka --from-beginning --topic baoy-topic
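The two server.properties edits above are the ones that matter most; in particular the /kafka chroot suffix on zookeeper.connect must match the node created with zkCli.sh. A minimal standalone sketch that loads those two settings and sanity-checks them (the property values are copied from the listing above; the class name is illustrative):

```java
import java.io.StringReader;
import java.util.Properties;

public class ServerPropsCheck {
    public static void main(String[] args) throws Exception {
        // The two settings edited in /usr/local/kafka/config/server.properties.
        // Without the /kafka suffix the broker registers at the ZooKeeper root
        // instead of under the /kafka node created with zkCli.sh.
        String edited =
            "broker.id=0\n"
          + "zookeeper.connect=dev10.aoiplus.openpf:2181,dev06.aoiplus.openpf:2181,dev05.aoiplus.openpf:2181/kafka\n";

        Properties props = new Properties();
        props.load(new StringReader(edited));

        System.out.println("broker.id = " + props.getProperty("broker.id"));
        System.out.println("chroot ok = " + props.getProperty("zookeeper.connect").endsWith("/kafka"));
    }
}
```

Each broker in the cluster needs a distinct broker.id (0, 1, 2, ...) but an identical zookeeper.connect value.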
Testing after installation
Producer
Consumer
Spring receiving the messages
Code
applicationContext-kafka-productor.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:int="http://www.springframework.org/schema/integration"
    xmlns:int-kafka="http://www.springframework.org/schema/integration/kafka"
    xmlns:task="http://www.springframework.org/schema/task"
    xsi:schemaLocation="http://www.springframework.org/schema/integration/kafka http://www.springframework.org/schema/integration/kafka/spring-integration-kafka.xsd
        http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
        http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd">

    <!-- commons config -->
    <bean id="stringSerializer" class="org.apache.kafka.common.serialization.StringSerializer" />

    <bean id="kafkaEncoder"
        class="org.springframework.integration.kafka.serializer.avro.AvroReflectDatumBackedKafkaEncoder">
        <constructor-arg value="java.lang.String" />
    </bean>

    <bean id="producerProperties"
        class="org.springframework.beans.factory.config.PropertiesFactoryBean">
        <property name="properties">
            <props>
                <prop key="topic.metadata.refresh.interval.ms">3600000</prop>
                <prop key="message.send.max.retries">5</prop>
                <prop key="serializer.class">kafka.serializer.StringEncoder</prop>
                <prop key="request.required.acks">1</prop>
            </props>
        </property>
    </bean>

    <!-- topic test config -->
    <int:channel id="pChannel">
        <int:queue />
    </int:channel>

    <int-kafka:outbound-channel-adapter
        id="kafkaOutboundChannelAdapterProductor"
        kafka-producer-context-ref="producerContext"
        auto-startup="true" channel="pChannel" order="3">
        <int:poller fixed-delay="1000" time-unit="MILLISECONDS"
            receive-timeout="1" task-executor="taskProductorExecutor" />
    </int-kafka:outbound-channel-adapter>

    <task:executor id="taskProductorExecutor" pool-size="5"
        keep-alive="120" queue-capacity="500" />

    <int-kafka:producer-context id="producerContext"
        producer-properties="producerProperties">
        <int-kafka:producer-configurations>
            <int-kafka:producer-configuration
                broker-list="172.23.27.120:9092,172.23.27.115:9092,172.23.27.116:9092"
                key-serializer="stringSerializer"
                value-class-type="java.lang.String"
                value-serializer="stringSerializer"
                topic="baoy-topic" />
        </int-kafka:producer-configurations>
    </int-kafka:producer-context>
</beans>
applicationContext-kafka-consumer.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:int="http://www.springframework.org/schema/integration"
    xmlns:int-kafka="http://www.springframework.org/schema/integration/kafka"
    xmlns:task="http://www.springframework.org/schema/task"
    xsi:schemaLocation="http://www.springframework.org/schema/integration/kafka http://www.springframework.org/schema/integration/kafka/spring-integration-kafka.xsd
        http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
        http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd">

    <!-- topic test conf -->
    <int:channel id="cChannel">
        <int:dispatcher task-executor="kafkaMessageExecutor" />
    </int:channel>

    <!-- ZooKeeper configuration; multiple servers may be listed -->
    <int-kafka:zookeeper-connect id="zookeeperConnect"
        zk-connect="172.23.27.120:2181,172.23.27.115:2181,172.23.27.116:2181/kafka"
        zk-connection-timeout="6000" zk-session-timeout="6000" zk-sync-time="2000" />

    <!-- channel adapter; auto-startup must be "true", otherwise no data is received -->
    <int-kafka:inbound-channel-adapter id="kafkaInboundChannelAdapter"
        kafka-consumer-context-ref="consumerContext"
        auto-startup="true" channel="cChannel">
        <int:poller fixed-delay="1" time-unit="MILLISECONDS" />
    </int-kafka:inbound-channel-adapter>

    <task:executor id="kafkaMessageExecutor" pool-size="8"
        keep-alive="120" queue-capacity="500" />

    <bean id="kafkaDecoder"
        class="org.springframework.integration.kafka.serializer.common.StringDecoder" />

    <bean id="consumerProperties"
        class="org.springframework.beans.factory.config.PropertiesFactoryBean">
        <property name="properties">
            <props>
                <prop key="auto.offset.reset">smallest</prop>
                <prop key="socket.receive.buffer.bytes">10485760</prop> <!-- 10M -->
                <prop key="fetch.message.max.bytes">5242880</prop>
                <prop key="auto.commit.interval.ms">1000</prop>
            </props>
        </property>
    </bean>

    <!-- message-receiving bean -->
    <bean id="kafkaConsumerService" class="com.curiousby.baoy.cn.kafka.KafkaConsumerService" />

    <!-- the method that handles received messages -->
    <int:outbound-channel-adapter channel="cChannel"
        ref="kafkaConsumerService" method="process" />

    <int-kafka:consumer-context id="consumerContext"
        consumer-timeout="1000" zookeeper-connect="zookeeperConnect"
        consumer-properties="consumerProperties">
        <int-kafka:consumer-configurations>
            <int-kafka:consumer-configuration group-id="default"
                value-decoder="kafkaDecoder" key-decoder="kafkaDecoder"
                max-messages="5000">
                <int-kafka:topic id="baoy-topic" streams="5" />
            </int-kafka:consumer-configuration>
        </int-kafka:consumer-configurations>
    </int-kafka:consumer-context>
</beans>
KafkaConsumerService
@Service
public class KafkaConsumerService {

    // Payload shape: topic -> (partition -> message)
    public void process(Map<String, Map<Integer, String>> msgs) {
        for (Map.Entry<String, Map<Integer, String>> entry : msgs.entrySet()) {
            System.out.println("======================================Consumer Message received: ");
            System.out.println("=====================================Suchit Topic:" + entry.getKey());
            for (String msg : entry.getValue().values()) {
                System.out.println("================================Suchit Consumed Message: " + msg);
            }
        }
    }
}
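The process method receives a nested map keyed first by topic and then by partition. A standalone sketch of that payload shape, using the same iteration as the service above (the sample data is made up for illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PayloadShapeDemo {
    public static void main(String[] args) {
        // topic -> (partition -> message), the structure handed to process(...)
        Map<String, Map<Integer, String>> msgs = new LinkedHashMap<>();
        Map<Integer, String> partitions = new LinkedHashMap<>();
        partitions.put(0, "hello");
        partitions.put(3, "world");
        msgs.put("baoy-topic", partitions);

        // Same loop structure as KafkaConsumerService.process
        for (Map.Entry<String, Map<Integer, String>> entry : msgs.entrySet()) {
            System.out.println("Topic: " + entry.getKey());
            for (String msg : entry.getValue().values()) {
                System.out.println("Consumed: " + msg);
            }
        }
    }
}
```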
KafkaProductorService
@Service
public class KafkaProductorService {

    @Autowired
    @Qualifier("pChannel")
    private MessageChannel messageChannel;

    public void sendInfo(String topic, Object obj) {
        System.out.println("---Service:KafkaService------sendInfo------");
        messageChannel.send(MessageBuilder.withPayload(obj)
                .setHeader(KafkaHeaders.TOPIC, topic).build());
    }
}
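The timer-driven result shown earlier comes from calling sendInfo periodically. A framework-free sketch of the same idea with ScheduledExecutorService (the message channel is replaced by an in-memory list so the snippet runs standalone; names are illustrative):

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class TimedSenderSketch {
    public static void main(String[] args) throws Exception {
        // Stand-in for KafkaProductorService.sendInfo("baoy-topic", ...)
        List<String> sent = new CopyOnWriteArrayList<>();

        ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        timer.scheduleAtFixedRate(
                () -> sent.add("baoy-topic:msg-" + sent.size()),
                0, 50, TimeUnit.MILLISECONDS);

        Thread.sleep(200);           // let a few ticks fire
        timer.shutdown();
        timer.awaitTermination(1, TimeUnit.SECONDS);

        System.out.println("sent at least 3 messages: " + (sent.size() >= 3));
        System.out.println("first = " + sent.get(0));
    }
}
```

In the Spring setup, the same periodic trigger could come from a task:scheduled element or an @Scheduled method that calls sendInfo.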
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.curiousby.baoyou.cn</groupId>
    <artifactId>SpringKafkaDEMO</artifactId>
    <packaging>war</packaging>
    <version>0.0.1-SNAPSHOT</version>
    <name>SpringKafkaDEMO Maven Webapp</name>
    <url>http://maven.apache.org</url>

    <!-- properties constant -->
    <properties>
        <spring.version>4.2.5.RELEASE</spring.version>
    </properties>

    <dependencies>
        <!-- junit4 -->
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.7</version>
            <type>jar</type>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.dbunit</groupId>
            <artifactId>dbunit</artifactId>
            <version>2.4.9</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>com.github.springtestdbunit</groupId>
            <artifactId>spring-test-dbunit</artifactId>
            <version>1.1.0</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
            <version>${spring.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>javax.servlet-api</artifactId>
            <version>3.1.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjrt</artifactId>
            <version>1.7.2</version>
        </dependency>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjweaver</artifactId>
            <version>1.7.2</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-aspects</artifactId>
            <version>${spring.version}</version>
            <type>jar</type>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-core</artifactId>
            <version>${spring.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-web</artifactId>
            <version>${spring.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-webmvc</artifactId>
            <version>${spring.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.integration</groupId>
            <artifactId>spring-integration-kafka</artifactId>
            <version>1.3.0.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>commons-logging</groupId>
            <artifactId>commons-logging</artifactId>
            <version>1.1.1</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>1.6.4</version>
            <type>jar</type>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.6.4</version>
            <type>jar</type>
        </dependency>
        <dependency>
            <groupId>javax</groupId>
            <artifactId>javaee-api</artifactId>
            <version>7.0</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>2.7.6</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.7.6</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-annotations</artifactId>
            <version>2.7.6</version>
        </dependency>
        <dependency>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro</artifactId>
            <version>1.7.7</version>
        </dependency>
    </dependencies>

    <build>
        <finalName>SpringKafkaDEMO</finalName>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.3</version>
                <dependencies>
                    <dependency>
                        <groupId>org.codehaus.plexus</groupId>
                        <artifactId>plexus-compiler-javac</artifactId>
                        <version>2.5</version>
                    </dependency>
                </dependencies>
                <configuration>
                    <source>1.7</source>
                    <target>1.7</target>
                    <encoding>UTF-8</encoding>
                    <compilerArguments>
                        <verbose />
                        <bootclasspath>${java.home}/lib/rt.jar:${java.home}/lib/jce.jar</bootclasspath>
                    </compilerArguments>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
Problems encountered:
1. The logging framework versions used with Spring must be kept consistent; here I use org.slf4j 1.6.4:
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.6.4</version>
    <type>jar</type>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.6.4</version>
    <type>jar</type>
</dependency>
2. Hosts problem: for a long time the client could not connect. Using 172.23.27.120:2181,172.23.27.115:2181,172.23.27.116:2181 as the ZooKeeper connect string did connect, but that is not what I needed; I needed 172.23.27.120:2181,172.23.27.115:2181,172.23.27.116:2181/kafka (the /kafka chroot). With the /kafka suffix the connection kept failing, no matter what. It eventually turned out that the brokers register themselves in ZooKeeper under their hostnames, which my machine could not resolve; adding those hostnames to the local hosts file solved the problem.
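A quick way to check whether a broker hostname resolves before touching the hosts file; a minimal sketch (localhost is used here so the snippet runs anywhere — substitute the dev*.aoiplus.openpf names in practice):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class HostResolveCheck {
    public static void main(String[] args) {
        // In practice: check("dev10.aoiplus.openpf"), etc.
        check("localhost");
    }

    static void check(String host) {
        try {
            InetAddress addr = InetAddress.getByName(host);
            System.out.println(host + " -> " + addr.getHostAddress());
        } catch (UnknownHostException e) {
            // This is the failure mode fixed by adding the name to the hosts file
            System.out.println(host + " UNRESOLVED, add it to the hosts file");
        }
    }
}
```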
3. The spring-integration-kafka dependency used:
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-kafka</artifactId>
    <version>1.3.0.RELEASE</version>
</dependency>