Posted to user@spark.apache.org by Naveen Kumar Pokala <np...@spcapitaliq.com> on 2014/11/11 16:18:00 UTC

scala.MatchError

Hi,

This is my Instrument java constructor.

public Instrument(Issue issue, Issuer issuer, Issuing issuing) {
    super();
    this.issue = issue;
    this.issuer = issuer;
    this.issuing = issuing;
}


I am trying to create a JavaSchemaRDD:

JavaSchemaRDD schemaInstruments = sqlCtx.applySchema(distData, Instrument.class);

Remarks:
============

Instrument, Issue, Issuer, and Issuing are all plain Java classes.

distData holds a List<Instrument>.


I am getting the following error.



Exception in thread "Driver" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162)
Caused by: scala.MatchError: class sample.spark.test.Issue (of class java.lang.Class)
        at org.apache.spark.sql.api.java.JavaSQLContext$$anonfun$getSchema$1.apply(JavaSQLContext.scala:189)
        at org.apache.spark.sql.api.java.JavaSQLContext$$anonfun$getSchema$1.apply(JavaSQLContext.scala:188)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
        at org.apache.spark.sql.api.java.JavaSQLContext.getSchema(JavaSQLContext.scala:188)
        at org.apache.spark.sql.api.java.JavaSQLContext.applySchema(JavaSQLContext.scala:90)
        at sample.spark.test.SparkJob.main(SparkJob.java:33)
        ... 5 more

Please help me.

Regards,
Naveen.

RE: scala.MatchError

Posted by Naveen Kumar Pokala <np...@spcapitaliq.com>.
Hi,

Do you mean that, in Java, I shouldn't have the Issue class as a property (attribute) of the Instrument class?

Example:

class Issue {
    int a;
}

class Instrument {
    Issue issue;
}


How about Scala? Does it support such user-defined datatypes as fields of case classes? For example:

case class Issue(a: Int = 0)

case class Instrument(issue: Issue = null)




-Naveen

From: Michael Armbrust [mailto:michael@databricks.com]
Sent: Wednesday, November 12, 2014 12:09 AM
To: Xiangrui Meng
Cc: Naveen Kumar Pokala; user@spark.apache.org
Subject: Re: scala.MatchError

Xiangrui is correct that it must be a Java bean; also, nested classes are not yet supported in Java.

On Tue, Nov 11, 2014 at 10:11 AM, Xiangrui Meng <me...@gmail.com> wrote:
I think you need a Java bean class instead of a normal class. See
example here: http://spark.apache.org/docs/1.1.0/sql-programming-guide.html
(switch to the java tab). -Xiangrui

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: scala.MatchError

Posted by Michael Armbrust <mi...@databricks.com>.
Xiangrui is correct that it must be a Java bean; also, nested classes are
not yet supported in Java.
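In practice that means reworking Instrument into a JavaBean whose fields are all SQL-compatible types, flattening the nested objects. A minimal sketch (the field names such as issueId are illustrative, not taken from the original code):

```java
import java.io.Serializable;

// JavaBean version of Instrument: no-arg constructor, private fields,
// public getters/setters, and only field types the 1.1.x Java API maps
// to SQL types (primitives, boxed types, String). In a real project this
// would be a public top-level class in its own file.
class InstrumentBean implements Serializable {
    private String issueId;
    private String issuerName;
    private double issuingPrice;

    public InstrumentBean() {}  // no-arg constructor required for reflection

    public String getIssueId() { return issueId; }
    public void setIssueId(String issueId) { this.issueId = issueId; }

    public String getIssuerName() { return issuerName; }
    public void setIssuerName(String issuerName) { this.issuerName = issuerName; }

    public double getIssuingPrice() { return issuingPrice; }
    public void setIssuingPrice(double issuingPrice) { this.issuingPrice = issuingPrice; }
}

public class Main {
    public static void main(String[] args) {
        // Plain property round-trip; with Spark on the classpath, an RDD of
        // such beans should then work with sqlCtx.applySchema(rdd, InstrumentBean.class).
        InstrumentBean b = new InstrumentBean();
        b.setIssueId("I-1");
        b.setIssuerName("ACME");
        b.setIssuingPrice(99.5);
        System.out.println(b.getIssueId() + " " + b.getIssuerName() + " " + b.getIssuingPrice());
    }
}
```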

On Tue, Nov 11, 2014 at 10:11 AM, Xiangrui Meng <me...@gmail.com> wrote:

> I think you need a Java bean class instead of a normal class. See
> example here:
> http://spark.apache.org/docs/1.1.0/sql-programming-guide.html
> (switch to the java tab). -Xiangrui

Re: scala.MatchError

Posted by Xiangrui Meng <me...@gmail.com>.
I think you need a Java bean class instead of a normal class. See
example here: http://spark.apache.org/docs/1.1.0/sql-programming-guide.html
(switch to the java tab). -Xiangrui
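For context: the Java API discovers the schema through standard JavaBean introspection of getter/setter pairs, which appears to be why a plain class (or one whose getters return unsupported types such as Issue) fails with a MatchError. A JDK-only sketch using the Person bean from the linked guide, with no Spark dependency:

```java
import java.beans.BeanInfo;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.io.Serializable;
import java.util.TreeSet;

// The Person bean from the programming guide's Java example.
class Person implements Serializable {
    private String name;
    private int age;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}

public class Main {
    // Bean property names as standard introspection reports them; these
    // getter/setter pairs are what the schema columns are derived from.
    static TreeSet<String> propertyNames(Class<?> beanClass) {
        try {
            TreeSet<String> names = new TreeSet<>();
            BeanInfo info = Introspector.getBeanInfo(beanClass, Object.class);
            for (PropertyDescriptor pd : info.getPropertyDescriptors()) {
                names.add(pd.getName());
            }
            return names;
        } catch (java.beans.IntrospectionException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(propertyNames(Person.class));  // prints [age, name]
    }
}
```

A class with public fields but no getters yields an empty property set here, which matches the advice above: a bean class, not a normal class.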

