Posted to dev@spark.apache.org by Rohith P <rp...@couponsinc.com> on 2015/09/28 17:31:39 UTC

using JavaRDD in spark-redis connector

Hi all,
      I am trying to work with the spark-redis connector (Redis Labs), which
requires all transactions between Redis and Spark to go through RDDs. The
language I am using is Java, but the connector does not accept JavaRDDs, so I
tried using SparkContext in my code instead of JavaSparkContext. But when I
wanted to create an RDD using sc.parallelize, it asked for Scala-related
parameters rather than Java lists. When I tried to have both a
JavaSparkContext and a SparkContext (for the connector), I got the error that
multiple contexts cannot be open at once.
 The code that I have been trying:


// initialize spark context
	private static RedisContext config() {
		conf = new SparkConf().setAppName("redis-jedis");
		sc2 = new SparkContext(conf);
		RedisContext rc = new RedisContext(sc2);
		return rc;
	}

// write to redis, which requires the data to be in an RDD
	private static void WriteUserTacticData(RedisContext rc, String userid,
			String tacticsId, String value) {
		hostTup = calling(redisHost, redisPort);
		String key = userid + "-" + tacticsId;
		RDD<Tuple2<String, String>> newTup = createTuple(key, value);
		rc.toRedisKV(newTup, hostTup);
	}

// createTuple, where the RDD that will be inserted into redis is created
	private static RDD<Tuple2<String, String>> createTuple(String key,
			String value) {
		sc = new JavaSparkContext(conf);
		ArrayList<Tuple2<String, String>> list = new ArrayList<Tuple2<String, String>>();
		Tuple2<String, String> e = new Tuple2<String, String>(key, value);
		list.add(e);
		JavaRDD<Tuple2<String, String>> javardd = sc.parallelize(list);
		RDD<Tuple2<String, String>> newTupRdd = JavaRDD.toRDD(javardd);
		sc.close();
		return newTupRdd;
	}



How would I create an RDD (not a JavaRDD) in Java that will be accepted by
the redis connector? Any help related to the topic would be
appreciated.





--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/using-JavaRDD-in-spark-redis-connector-tp14391.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: using JavaRDD in spark-redis connector

Posted by Rohith P <rp...@couponsinc.com>.
Got it, thank you.






Re: using JavaRDD in spark-redis connector

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
You can create a JavaRDD as normal and then call .rdd() on it to get the
underlying RDD.
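
For example, a minimal sketch: the class name, the sample key/value, and the
commented-out RedisContext wiring below are illustrative only, and the exact
toRedisKV signature depends on your spark-redis version. The key points are
using a single JavaSparkContext (its underlying Scala SparkContext is
available via sc()) and unwrapping the JavaRDD with .rdd():

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.rdd.RDD;

import scala.Tuple2;

public class RedisWriteSketch {
	public static void main(String[] args) {
		// setMaster("local[*]") is only for local testing
		SparkConf conf = new SparkConf().setAppName("redis-jedis").setMaster("local[*]");

		// One JavaSparkContext only; no second context is needed, because
		// jsc.sc() exposes the underlying Scala SparkContext.
		JavaSparkContext jsc = new JavaSparkContext(conf);

		JavaRDD<Tuple2<String, String>> javaRdd = jsc.parallelize(
				Arrays.asList(new Tuple2<String, String>("user1-tactic1", "value")));

		// .rdd() unwraps the JavaRDD into the Scala RDD the connector expects
		RDD<Tuple2<String, String>> rdd = javaRdd.rdd();

		// Hypothetical wiring, following the snippet in your question; check
		// your spark-redis version for the exact RedisContext/toRedisKV API:
		// RedisContext rc = new RedisContext(jsc.sc());
		// rc.toRedisKV(rdd, hostTup);

		jsc.stop();
	}
}

This also avoids the "multiple contexts" error, since the RedisContext and
the JavaRDD share the same underlying SparkContext.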

Thanks
Best Regards
