Monday, 19 February 2018

Spark tutorial - Understanding and exploring Spark core components: RDDs

Spark is a fast, easy-to-use data processing engine. It was developed as a replacement for Hadoop MapReduce, yet it runs on top of Hadoop and can use the Hadoop Distributed File System (HDFS). Spark has been an open source Apache product since 2014, and its popularity is growing at a rapid pace. Spark doesn't demand an extra skill set: knowledge of core Java and distributed computing is enough to get started.

There is some terminology that Spark developers find difficult to understand. I am going to explain those terms in my own words.

Spark RDDs 


A Spark RDD is a Resilient Distributed Dataset. RDDs are immutable datasets that can be created from an internal source (i.e., by parallelizing an in-memory collection) or from an external source (e.g., a Spark stream, a text file, or an input file format). An RDD's elements can be distributed across the Spark cluster for processing. RDDs can only be created by reading data from stable storage (such as a TCP stream or files) or by transformations on existing RDDs.
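As a small illustration of both creation paths, here is a minimal sketch. The "local[*]" master, app name, and file path are my own placeholder assumptions, not values from this post:

 import java.util.Arrays;
 import org.apache.spark.SparkConf;
 import org.apache.spark.api.java.JavaRDD;
 import org.apache.spark.api.java.JavaSparkContext;

 // Placeholder config: local master and app name are assumptions for this sketch.
 SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("RDD creation demo");
 JavaSparkContext sc = new JavaSparkContext(conf);

 // Internal source: parallelize an in-memory collection into an RDD.
 JavaRDD<String> fromCollection = sc.parallelize(Arrays.asList("E1", "E2", "E3"));

 // External source: read a text file from stable storage (one element per line).
 JavaRDD<String> fromFile = sc.textFile("/path/to/input.txt");

 // A transformation on an existing RDD produces a new, immutable RDD.
 JavaRDD<Integer> lengths = fromCollection.map(s -> s.length());

 System.out.println(lengths.collect());
 sc.close();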


For example, suppose as a Spark developer you write a Spark TCP streaming job with a batch interval of 1 second. Spark continuously receives data and forms an RDD from the data received in that one second.
Look at the image below. The RDD consists of elements E1, E2, ..., En, each of which can be JSON, text, a line, or any other string. This RDD can be transformed into a new RDD, but I am not talking about that here.
The elements of the RDD can be processed using a foreach loop, and those elements will be distributed across the executors in the Spark cluster.

[Image: Spark RDDs - elements E1...En of an RDD distributed across executors]


Point (1) - This explains how an RDD is created from the data received on the Spark TCP socket during one batch interval.


Point (2) - This explains how the elements of the RDD are assigned to executors for processing.


Below is the code of a simple Spark streaming job, where a socket is opened to an IP and port and RDDs are formed every 3 seconds. Note where points (1) and (2) lie in the code.

 import org.apache.spark.SparkConf;
 import org.apache.spark.api.java.JavaRDD;
 import org.apache.spark.api.java.StorageLevels;
 import org.apache.spark.api.java.function.VoidFunction;
 import org.apache.spark.streaming.Duration;
 import org.apache.spark.streaming.api.java.JavaDStream;
 import org.apache.spark.streaming.api.java.JavaStreamingContext;

 SparkConf sparkConf = new SparkConf().setMaster("spark://10.1.0.5:8088").setAppName("App name");
 JavaStreamingContext ssc = new JavaStreamingContext(sparkConf, new Duration(3000));
 JavaDStream<String> stream = ssc.socketTextStream("socket IP", 9000, StorageLevels.MEMORY_AND_DISK_SER);

 stream.foreachRDD(new VoidFunction<JavaRDD<String>>() {
     private static final long serialVersionUID = 1L;
     public void call(JavaRDD<String> rdd) throws Exception {
         /** Point (1): an RDD is formed from the data received in each 3-second batch **/
         rdd.foreach(new VoidFunction<String>() {
             private static final long serialVersionUID = 1L;
             public void call(String s) throws Exception {
                 /** Point (2): each element of the RDD is processed on an executor **/
                 System.out.println(s);
             }
         });
     }
 });

 ssc.start();
 ssc.awaitTermination();
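If you want to try this job out, one simple option (my assumption, not part of the original setup) is to start a plain socket server with netcat, for example nc -lk 9000, and type lines into it; the lines received during each 3-second batch then become one RDD.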


Feedback always helps with improvement. If you have any query or suggestion, feel free to comment, and keep visiting my blog to encourage me to keep blogging.
