Posted to user@spark.apache.org by Deepa Jayaveer <de...@tcs.com> on 2014/12/09 14:09:13 UTC

reg JDBCRDD code

Hi All,
I am new to Spark. I am trying to connect to MySQL from Spark and want to
write the code in Java, but I am getting a runtime exception. I guess the
issue is with the Function0 and Function1 objects being passed to JdbcRDD.

I have tried my best and attached the code; can you please help me fix the
issue?



Thanks 
Deepa



Re: reg JDBCRDD code

Posted by Deepa Jayaveer <de...@tcs.com>.
Working!!!  Thanks a lot Akhil!!!


Thanks and Regards
Deepa Devi Jayaveer
Mobile No: 9940662806
Tata Consultancy Services
Mailto: deepa.jayaveer@tcs.com
Website: http://www.tcs.com







Re: reg JDBCRDD code

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Hi Deepa,

I have it working here



And here's the updated code.

https://gist.github.com/akhld/0d9299aafc981553bc34

Thanks
Best Regards


Re: reg JDBCRDD code

Posted by Deepa Jayaveer <de...@tcs.com>.
Hi Akhil,
I am getting the same error. I guess the issue is in the Function1
implementation.
Is it enough to override the apply method in the Function1 class?
Thanks
Deepa







Re: reg JDBCRDD code

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Try changing this line

JdbcRDD rdd = new JdbcRDD(sc, getConnection, sql, 0, 0, 1,
                getResultset, ClassTag$.MODULE$.apply(String.class));

to

JdbcRDD rdd = new JdbcRDD(sc, getConnection, sql, 0, 100, 1,
                getResultset, ClassTag$.MODULE$.apply(String.class));

Here:

0   - lower bound
100 - upper bound
1   - number of partitions, I believe.
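
As a rough sketch (paraphrasing the Spark 1.x partitioning logic, so treat the
exact arithmetic as an approximation), this is how the closed range
[lowerBound, upperBound] is split across partitions; with 0, 100 and a single
partition, the one task simply covers ids 0 through 100:

public class JdbcRddPartitionSketch {
    public static void main(String[] args) {
        long lowerBound = 0, upperBound = 100;
        int numPartitions = 1;
        // JdbcRDD issues one query per partition, each covering a contiguous
        // slice of [lowerBound, upperBound].
        long length = 1 + upperBound - lowerBound;
        for (int i = 0; i < numPartitions; i++) {
            long start = lowerBound + (i * length) / numPartitions;
            long end = lowerBound + ((i + 1) * length) / numPartitions - 1;
            System.out.println("partition " + i + " covers ids " + start + " .. " + end);
        }
    }
}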


Thanks
Best Regards


Re: reg JDBCRDD code

Posted by Deepa Jayaveer <de...@tcs.com>.
Thanks Akhil, but it is expecting Function1 instead of Function. I tried
writing a new class implementing Function1 but got an error. Can you please
help me get it resolved?

The JdbcRDD is created as
JdbcRDD rdd = new JdbcRDD(sc, getConnection, sql, 0, 0, 1,
                getResultset, ClassTag$.MODULE$.apply(String.class));


The overridden 'apply' method in Function1:

public String apply(ResultSet arg0) {

    String ss = null;

    try {
        ss = arg0.getString(1);
    } catch (SQLException e) {
        e.printStackTrace();
    }

    System.out.println(ss);

    return ss;
}
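
(The method body above is essentially fine; the usual pre-Spark-1.3 Java
pattern is to put it in a class that extends scala.runtime.AbstractFunction1,
so only apply has to be written, and implements Serializable so Spark can ship
the function to the executors. A minimal sketch, with a made-up class name
rather than the one in the attached code:)

import java.io.Serializable;
import java.sql.ResultSet;
import java.sql.SQLException;
import scala.runtime.AbstractFunction1;

class RowMapper extends AbstractFunction1<ResultSet, String> implements Serializable {
    @Override
    public String apply(ResultSet rs) {
        try {
            return rs.getString(1);   // read the first column of the current row
        } catch (SQLException e) {
            throw new RuntimeException(e);
        }
    }
}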

Error log:
Exception in thread "main" org.apache.spark.SparkException: Job aborted
due to stage failure: Task 0.0:0 failed 1 times, most recent failure:
Exception failure in TID 0 on host localhost: java.sql.SQLException:
Parameter index out of range (1 > number of parameters, which is 0).
        com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
        com.mysql.jdbc.SQLError.createSQLException(SQLError.java:987)
        com.mysql.jdbc.SQLError.createSQLException(SQLError.java:982)
        com.mysql.jdbc.SQLError.createSQLException(SQLError.java:927)
        com.mysql.jdbc.PreparedStatement.checkBounds(PreparedStatement.java:3709)
        com.mysql.jdbc.PreparedStatement.setInternal(PreparedStatement.java:3693)
        com.mysql.jdbc.PreparedStatement.setInternal(PreparedStatement.java:3735)
        com.mysql.jdbc.PreparedStatement.setLong(PreparedStatement.java:3751)
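
The trace itself points at the likely root cause: JdbcRDD always binds the
partition's lower and upper bound as parameters 1 and 2 of a prepared
statement, so a SQL string that contains no '?' placeholders fails in setLong
exactly like this. A rough paraphrase (not the actual Spark source) of the
per-partition work, with made-up table and column names in the example query:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

class PerPartitionQuerySketch {
    // The query must contain two '?' placeholders, e.g.
    // "SELECT emp_name FROM employee WHERE emp_id >= ? AND emp_id <= ?"
    static ResultSet run(Connection conn, String sql, long lower, long upper)
            throws SQLException {
        PreparedStatement stmt = conn.prepareStatement(sql);
        stmt.setLong(1, lower);   // first '?'; with no placeholders this throws
        stmt.setLong(2, upper);   // "Parameter index out of range"
        return stmt.executeQuery();
    }
}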

Thanks 
Deepa





Re: reg JDBCRDD code

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Hi Deepa,

In Scala, you will do something like
https://gist.github.com/akhld/ccafb27f098163bea622

With the Java API it will be something like
https://gist.github.com/akhld/0d9299aafc981553bc34
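
In case the gist is ever unavailable, here is a minimal sketch of the same
general shape from Java. It is not necessarily identical to the gist; the
connection URL, credentials, table and column names are all made up, and it
assumes the MySQL JDBC driver is on the classpath:

import java.io.Serializable;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.rdd.JdbcRDD;

import scala.reflect.ClassTag$;
import scala.runtime.AbstractFunction0;
import scala.runtime.AbstractFunction1;

public class JdbcRddJavaSketch {

    // Function0<Connection>: opens a JDBC connection on each executor.
    static class ConnectionFactory extends AbstractFunction0<Connection>
            implements Serializable {
        @Override
        public Connection apply() {
            try {
                Class.forName("com.mysql.jdbc.Driver");
                return DriverManager.getConnection(
                        "jdbc:mysql://localhost:3306/test", "user", "password");
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }

    // Function1<ResultSet, String>: maps each row to its first column.
    static class RowMapper extends AbstractFunction1<ResultSet, String>
            implements Serializable {
        @Override
        public String apply(ResultSet rs) {
            try {
                return rs.getString(1);
            } catch (SQLException e) {
                throw new RuntimeException(e);
            }
        }
    }

    public static void main(String[] args) {
        JavaSparkContext jsc = new JavaSparkContext(
                new SparkConf().setMaster("local[2]").setAppName("JdbcRddJavaSketch"));

        // The two '?' placeholders receive the lower and upper bound (0 and 100 below).
        String sql = "SELECT emp_name FROM employee WHERE emp_id >= ? AND emp_id <= ?";

        JdbcRDD<String> rdd = new JdbcRDD<String>(
                jsc.sc(),                 // the underlying SparkContext
                new ConnectionFactory(),  // scala.Function0<Connection>
                sql,
                0,                        // lowerBound
                100,                      // upperBound
                1,                        // numPartitions
                new RowMapper(),          // scala.Function1<ResultSet, String>
                ClassTag$.MODULE$.<String>apply(String.class));

        System.out.println("rows fetched: " + rdd.count());
        jsc.stop();
    }
}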



Thanks
Best Regards
