Posted to user@spark.apache.org by diplomatic Guru <di...@gmail.com> on 2016/01/25 13:38:18 UTC

[Spark] Reading avro file in Spark 1.3.0

Hello guys,

I've been trying to read an Avro file using Spark's DataFrame API, but it's
throwing this error:
java.lang.NoSuchMethodError:
org.apache.spark.sql.SQLContext.read()Lorg/apache/spark/sql/DataFrameReader;

This is what I've done so far:

I've added the dependency to pom.xml:

    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-avro_2.10</artifactId>
        <version>1.0.0</version>
    </dependency>

Java code:

    JavaSparkContext sc = new JavaSparkContext(sparkConf);
    SQLContext sqlContext = new SQLContext(sc);
    DataFrame df = sqlContext.read().format("com.databricks.spark.avro").load(args[0]);

Could you please let me know what I am doing wrong?

Thanks.

Re: [Spark] Reading avro file in Spark 1.3.0

Posted by Kevin Mellott <ke...@gmail.com>.
I think you may be looking at documentation for a more recent version of
Spark. Try the examples linked below, which apply to Spark 1.3. There aren't
many Java examples, but the code should be very similar to the Scala ones
(i.e. using "load" instead of "read" on the SQLContext).

https://github.com/databricks/spark-avro/tree/branch-1.0
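To make that concrete: SQLContext.read() was only added in Spark 1.4, which is why 1.3.0 throws NoSuchMethodError at runtime. A minimal sketch of the 1.3-style call, assuming the same command-line path argument as in your snippet (the class name here is just for illustration):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class AvroReadExample {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf().setAppName("AvroReadExample");
        JavaSparkContext sc = new JavaSparkContext(sparkConf);
        SQLContext sqlContext = new SQLContext(sc);

        // Spark 1.3.x: pass the data source name to SQLContext.load(path, source)
        // instead of calling read().format(...).load(...), which exists only
        // from Spark 1.4 onwards.
        DataFrame df = sqlContext.load(args[0], "com.databricks.spark.avro");

        df.printSchema();
        sc.stop();
    }
}
```

The pom dependency stays the same; only the load call changes. (Untested sketch; it assumes a Spark 1.3.0 runtime and an Avro file at args[0].)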
