Posted to issues@spark.apache.org by "Georg Heiler (JIRA)" <ji...@apache.org> on 2016/03/29 21:25:25 UTC

[jira] [Created] (SPARK-14250) Parquet import failure: No predefined schema found

Georg Heiler created SPARK-14250:
------------------------------------

             Summary: Parquet import failure: No predefined schema found
                 Key: SPARK-14250
                 URL: https://issues.apache.org/jira/browse/SPARK-14250
             Project: Spark
          Issue Type: Bug
          Components: Spark Core, SparkR
    Affects Versions: 1.6.1
         Environment: OS X using Docker Toolbox, as well as Linux/Ubuntu
            Reporter: Georg Heiler


Trying to read a Parquet file with SparkR running inside Docker raises the following exception:

    No predefined schema found, and no Parquet data files or summary files found.

Reading the same Parquet file works without any problems on a local Spark installation.

Related Stack Overflow question describing the same setup:
http://stackoverflow.com/questions/36283703/spark-in-docker-parquet-error-no-predefined-schema-found?noredirect=1#comment60211289_36283703
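
For reference, a minimal SparkR 1.6.1 sketch of the kind of read that triggers the error; the master URL and Parquet path below are placeholders for illustration only, not values taken from this report:

    # Minimal SparkR (1.6.x) Parquet read; path and master are hypothetical.
    library(SparkR)

    sc <- sparkR.init(master = "local[*]", appName = "parquet-read")
    sqlContext <- sparkRSQL.init(sc)

    # This is the call that fails with
    # "No predefined schema found, and no Parquet data files or summary files found."
    # when the given path contains no Parquet part files or summary metadata
    # as seen from inside the container.
    df <- read.df(sqlContext, "/data/example.parquet", source = "parquet")

    printSchema(df)
    head(df)

    sparkR.stop()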


