Thursday, 8 February 2018

Spark TCP streaming example without Kafka

Spark Streaming is an extension of the core Spark API that enables scalable, high-throughput, fault-tolerant stream processing of live data streams. Data can be ingested from many sources like Kafka, Flume, Kinesis, or TCP sockets, and processed data can be pushed out to filesystems, databases, and live dashboards. You can also apply Spark's machine learning and graph-processing algorithms to data streams.

Spark Streaming is useful for reading data from a producer and distributing it across multiple machines in cluster or YARN mode.

A few terms related to Spark Streaming:

RDD stands for Resilient Distributed Dataset. In Spark Streaming, one RDD is created from the data received during each batch interval. A DStream (discretized stream), such as the JavaDStream used below, represents the continuous sequence of these RDDs.

A simple example of TCP socket streaming is given below:



 import java.util.ArrayList;
 import java.util.List;

 import org.apache.spark.SparkConf;
 import org.apache.spark.api.java.JavaRDD;
 import org.apache.spark.api.java.StorageLevels;
 import org.apache.spark.api.java.function.VoidFunction;
 import org.apache.spark.streaming.Duration;
 import org.apache.spark.streaming.api.java.JavaDStream;
 import org.apache.spark.streaming.api.java.JavaStreamingContext;

 public class TcpStreamingExample {

  @SuppressWarnings("resource")
  public static void main(String[] args) throws InterruptedException {

   SparkConf sparkConf = new SparkConf().setMaster("spark-master-url").setAppName("xyz")
     .set("spark.executor.memory", "1g").set("spark.cores.max", "5")
     .set("spark.driver.cores", "2").set("spark.driver.memory", "2g");

   // Batch interval of 3 seconds: one RDD is produced per 3-second batch
   JavaStreamingContext ssc = new JavaStreamingContext(sparkConf, new Duration(3000));

   int bindport = 9999; // port of the socket server supplying the data

   // Each socketTextStream starts a receiver that connects to the given
   // host/port and reads newline-delimited text
   JavaDStream<String> jsonReq1 = ssc.socketTextStream("bindIP", bindport, StorageLevels.MEMORY_AND_DISK_SER);
   JavaDStream<String> jsonReq2 = ssc.socketTextStream("bindIP", bindport, StorageLevels.MEMORY_AND_DISK_SER);

   // Merge the two input streams into a single DStream
   List<JavaDStream<String>> streamList = new ArrayList<>();
   streamList.add(jsonReq1);
   JavaDStream<String> unionStream = ssc.union(jsonReq2, streamList);

   // Print every record; this runs on the executors, so the output
   // appears in the executor logs rather than on the driver
   unionStream.foreachRDD(new VoidFunction<JavaRDD<String>>() {

    private static final long serialVersionUID = 1L;

    public void call(JavaRDD<String> rdd) throws Exception {
     rdd.foreach(new VoidFunction<String>() {

      private static final long serialVersionUID = 1L;

      public void call(String s) throws Exception {
       System.out.println(s);
      }
     });
    }
   });

   // count() returns a DStream with each batch's record count;
   // print() outputs it on the driver every batch interval
   unionStream.count().print();

   ssc.start();
   ssc.awaitTermination();
  }
 }


Terms like bindIP and bindport are placeholders for the host and port of the socket server that the Spark receivers connect to. To test this application you can create a basic socket server program that listens for connections from the Spark executors and writes newline-delimited text to them; a minimal sketch is given below.
spark-master-url should be the URL of the machine where the Spark master is running; a Spark master URL generally looks like spark://machineip:port.
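
Here is a minimal sketch of such a test server, assuming the Spark job connects on port 9999; the class name and the JSON payload are made up for illustration. It writes one line per second to every client that connects (each Spark receiver is a client):

 import java.io.PrintWriter;
 import java.net.ServerSocket;
 import java.net.Socket;

 // Hypothetical test server: writes one newline-delimited record per
 // second to every connected client (i.e. every Spark receiver)
 public class TestSocketServer {

  public static void main(String[] args) throws Exception {
   int port = 9999; // must match the bindport used in the Spark job
   try (ServerSocket server = new ServerSocket(port)) {
    while (true) {
     Socket client = server.accept(); // a Spark receiver connects here
     new Thread(() -> {
      try (PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
       int i = 0;
       while (true) {
        out.println("{\"msg\": \"event-" + i++ + "\"}");
        Thread.sleep(1000);
       }
      } catch (Exception e) {
       // client disconnected; stop writing to it
      }
     }).start();
    }
   }
  }
 }

Alternatively, you can skip writing a server and test with netcat: run nc -lk 9999 in a terminal and type lines by hand.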

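Once the application is packaged as a jar, it is typically launched with spark-submit. A rough sketch, where the class name, jar name, and master host are placeholders (7077 is the default Spark master port):

 spark-submit --class com.example.TcpStreamingExample \
   --master spark://machineip:7077 \
   tcp-streaming-example.jar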

Feedback always helps improvement. If you have any query or suggestion, feel free to comment, and keep visiting my blog to encourage me to keep blogging.
