DIY
A lot of examples were shown in the previous sections; now let's get ready for some hands-on work. The code is available in the code bundle for reference. Read the README.MD in the code bundle for instructions on executing the program.
Source to sink – Flink execution:
We have already seen examples that integrate Apache Kafka or RabbitMQ as the source and Cassandra as the sink. Here we will integrate Apache Kafka as the source and Elasticsearch 2.x as the sink. Only the sink connector changes: instead of the Cassandra connector, we will use the Elasticsearch one.
Here are the required imports:
package com.boof.flink.diy;

import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.flink.api.common.functions.RuntimeContext;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink...
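A job wired from these imports could take the following shape. This is a hedged sketch, not the code bundle's implementation: the topic name, Elasticsearch index/type, cluster name, and host addresses are assumptions, and it presumes Flink 1.x with the Kafka 0.9 and elasticsearch2 connector artifacts on the classpath.

```java
// Sketch only: names below (topic, index, hosts) are illustrative assumptions.
Properties kafkaProps = new Properties();
kafkaProps.setProperty("bootstrap.servers", "localhost:9092");
kafkaProps.setProperty("group.id", "flink-diy");

StreamExecutionEnvironment env =
    StreamExecutionEnvironment.getExecutionEnvironment();

// Kafka as source: consume string messages from a topic.
DataStream<String> stream = env.addSource(
    new FlinkKafkaConsumer09<>("input-topic",
        new SimpleStringSchema(), kafkaProps));

// Elasticsearch 2.x sink configuration: cluster name must match the
// running cluster; flushing after every action simplifies testing.
Map<String, String> esConfig = new HashMap<>();
esConfig.put("cluster.name", "elasticsearch");
esConfig.put("bulk.flush.max.actions", "1");

// Transport addresses of the Elasticsearch nodes (port 9300 is the
// default transport port in 2.x).
List<InetSocketAddress> transports = new ArrayList<>();
transports.add(new InetSocketAddress(
    InetAddress.getByName("127.0.0.1"), 9300));

// Elasticsearch as sink: each element is indexed as a JSON document.
stream.addSink(new ElasticsearchSink<>(esConfig, transports,
    new ElasticsearchSinkFunction<String>() {
        @Override
        public void process(String element, RuntimeContext ctx,
                            RequestIndexer indexer) {
            Map<String, String> json = new HashMap<>();
            json.put("data", element);
            indexer.add(Requests.indexRequest()
                .index("flink-diy-index")
                .type("event")
                .source(json));
        }
    }));

env.execute("Kafka to Elasticsearch");
```

Compared with the earlier Cassandra examples, only the `addSink` call changes; the Kafka source and the rest of the pipeline stay the same, which is the point of Flink's pluggable connector model.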