Posted to issues@spark.apache.org by "Alessio (JIRA)" <ji...@apache.org> on 2016/10/13 18:56:20 UTC

[jira] [Updated] (SPARK-17918) Default Warehouse location apparently in HDFS

     [ https://issues.apache.org/jira/browse/SPARK-17918?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alessio updated SPARK-17918:
----------------------------
    Description: 
It seems that the default warehouse location in Spark 2.0.1 not only points at a folder that does not exist on Macintosh systems (/user/hive/warehouse) - see the first INFO line below - but that path is then also resolved against HDFS - see the error below.

This was fixed in 2.0.0, as previously reported issues show, but reappears in 2.0.1: scripts that ran fine under 2.0.0 now throw the errors below.


16/10/13 20:47:36 INFO internal.SharedState: Warehouse path is '/user/hive/warehouse'.

py4j.protocol.Py4JJavaError: An error occurred while calling o32.load.
: org.apache.spark.SparkException: Unable to create database default as failed to create its directory hdfs://localhost:9000/user/hive/warehouse
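
A possible workaround - only a minimal PySpark sketch, not a confirmed fix; the app name "warehouse-workaround" and the path file:///tmp/spark-warehouse are arbitrary examples - is to set spark.sql.warehouse.dir to an explicit local path when building the session, so the warehouse is not resolved against the default HDFS filesystem:

from pyspark.sql import SparkSession

# Point the warehouse at an explicit local path so that the default
# /user/hive/warehouse is not qualified against hdfs://localhost:9000.
# file:///tmp/spark-warehouse is an arbitrary example path.
spark = SparkSession.builder \
    .appName("warehouse-workaround") \
    .config("spark.sql.warehouse.dir", "file:///tmp/spark-warehouse") \
    .getOrCreate()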

  was:
It seems that the default warehouse location in Spark 2.0.1 not only points at a folder that does not exist on Macintosh systems (/user/hive/warehouse) - see the first INFO line below - but that path is then also resolved against HDFS - see the error below.

This was fixed in 2.0.0, as previously reported issues show, but reappears in 2.0.1.

`16/10/13 20:47:36 INFO internal.SharedState: Warehouse path is '/user/hive/warehouse'.`

py4j.protocol.Py4JJavaError: An error occurred while calling o32.load.
: org.apache.spark.SparkException: Unable to create database default as failed to create its directory hdfs://localhost:9000/user/hive/warehouse


> Default Warehouse location apparently in HDFS
> ----------------------------------------------
>
>                 Key: SPARK-17918
>                 URL: https://issues.apache.org/jira/browse/SPARK-17918
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.1
>         Environment: Macintosh
>            Reporter: Alessio
>
> It seems that the default warehouse location in Spark 2.0.1 not only points at a folder that does not exist on Macintosh systems (/user/hive/warehouse) - see the first INFO line below - but that path is then also resolved against HDFS - see the error below.
> This was fixed in 2.0.0, as previously reported issues show, but reappears in 2.0.1: scripts that ran fine under 2.0.0 now throw the errors below.
> 16/10/13 20:47:36 INFO internal.SharedState: Warehouse path is '/user/hive/warehouse'.
> py4j.protocol.Py4JJavaError: An error occurred while calling o32.load.
> : org.apache.spark.SparkException: Unable to create database default as failed to create its directory hdfs://localhost:9000/user/hive/warehouse



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org