Posted to user@spark.apache.org by "vdiwakar.malladi" <vd...@gmail.com> on 2014/11/14 11:35:12 UTC

saveAsParquetFile throwing exception

Hi,

I'm trying to load a JSON file and store it as a Parquet file from a
standalone program, but when saving the Parquet file I get the following
exception. Can anyone help me with this?

Exception in thread "main" java.lang.RuntimeException: Unsupported dataType:
StructType(ArrayBuffer(<here displayed all StructFields>)), [1.982] failure:
`,' expected but `' found
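
For reference, this is a stripped-down sketch of what the program does
(the object name and paths are placeholders, not my actual ones):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object JsonToParquet {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("JsonToParquet"))
    val sqlContext = new SQLContext(sc)

    // Load the JSON file; Spark SQL infers the schema from the records.
    val schemaRdd = sqlContext.jsonFile("hdfs:///path/to/input.json")

    // The exception above is thrown here, while saving as Parquet.
    schemaRdd.saveAsParquetFile("hdfs:///path/to/output.parquet")
  }
}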

Note: This method worked for me with sample JSON files.

Thanks.





Re: saveAsParquetFile throwing exception

Posted by Cheng Lian <li...@gmail.com>.
Hm, I'm not sure whether this is the official way to upgrade CDH Spark,
but maybe you can check out https://github.com/cloudera/spark, apply the
required patches, and then compile your own version.

On 11/14/14 8:46 PM, vdiwakar.malladi wrote:
> Thanks for your response. I'm using Spark 1.1.0.
>
> Currently I have the Spark setup that comes with Hadoop CDH (via Cloudera
> Manager). Could you please suggest how I can make use of the patch?
>
> Thanks in advance.
>
>
>




Re: saveAsParquetFile throwing exception

Posted by "vdiwakar.malladi" <vd...@gmail.com>.
Thanks for your response. I'm using Spark 1.1.0.

Currently I have the Spark setup that comes with Hadoop CDH (via Cloudera
Manager). Could you please suggest how I can make use of the patch?

Thanks in advance.





Re: saveAsParquetFile throwing exception

Posted by Cheng Lian <li...@gmail.com>.
Which version are you using? You probably hit this bug:
https://issues.apache.org/jira/browse/SPARK-3421. It occurs if some field
name in the JSON contains characters other than [a-zA-Z0-9_].

This has been fixed in https://github.com/apache/spark/pull/2563
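
If it helps, something along these lines should reproduce it on 1.1.0 (the
dash in the "user-name" field is the trigger; the object name and output
path are just placeholders):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object Spark3421Repro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("Spark3421Repro").setMaster("local"))
    val sqlContext = new SQLContext(sc)

    // A field name containing a character outside [a-zA-Z0-9_] (the dash
    // in "user-name") is what triggers the failure on affected versions.
    val json = sc.parallelize(Seq("""{"user-name": "alice", "age": 30}"""))
    val schemaRdd = sqlContext.jsonRDD(json)

    // Fails with "Unsupported dataType: ..." until the fix in the PR above
    // is applied.
    schemaRdd.saveAsParquetFile("/tmp/spark-3421-repro.parquet")
  }
}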

On 11/14/14 6:35 PM, vdiwakar.malladi wrote:
> Hi,
>
> I'm trying to load a JSON file and store it as a Parquet file from a
> standalone program, but when saving the Parquet file I get the following
> exception. Can anyone help me with this?
>
> Exception in thread "main" java.lang.RuntimeException: Unsupported dataType:
> StructType(ArrayBuffer(<here displayed all StructFields>)), [1.982] failure:
> `,' expected but `' found
>
> Note: This method worked for me with sample JSON files.
>
> Thanks.
>
>
>
>

