Posted to issues@spark.apache.org by "Thomas Graves (JIRA)" <ji...@apache.org> on 2016/06/01 13:37:59 UTC

[jira] [Resolved] (SPARK-15683) spark sql local FS spark.sql.warehouse.dir throws on YARN

     [ https://issues.apache.org/jira/browse/SPARK-15683?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Thomas Graves resolved SPARK-15683.
-----------------------------------
    Resolution: Duplicate

> spark sql local FS spark.sql.warehouse.dir throws on YARN
> ---------------------------------------------------------
>
>                 Key: SPARK-15683
>                 URL: https://issues.apache.org/jira/browse/SPARK-15683
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Thomas Graves
>            Priority: Critical
>
> I'm trying to use DataFrames with Spark 2.0. It was built with Hive support, but when I try to run a DataFrame command I get the error:
> 16/05/31 20:24:21 ERROR ApplicationMaster: User class threw exception: java.lang.IllegalArgumentException: Wrong FS: file:/grid/2/tmp/yarn-local/usercache/tgraves/appcache/application_1464289177693_1036410/container_e14_1464289177693_1036410_01_000001/spark-warehouse, expected: hdfs://nn1.com:8020
> java.lang.IllegalArgumentException: Wrong FS: file:/grid/2/tmp/yarn-local/usercache/tgraves/appcache/application_1464289177693_1036410/container_e14_1464289177693_1036410_01_000001/spark-warehouse, expected: hdfs://nn1.com:8020
> 	at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:648)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:194)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.access$000(DistributedFileSystem.java:106)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043)
> 	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1061)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036)
> 	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1880)
> 	at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.liftedTree1$1(InMemoryCatalog.scala:123)
> 	at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.createDatabase(InMemoryCatalog.scala:122)
> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:142)
> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(SessionCatalog.scala:84)
> 	at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:94)
> 	at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:94)
> 	at org.apache.spark.sql.internal.SessionState$$anon$1.<init>(SessionState.scala:110)
> 	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:110)
> 	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:109)
> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:48)
> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:62)
> 	at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:371)
> 	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:154)
> 	at org.apache.spark.sql.DataFrameReader.parquet(DataFrameReader.scala:419)
> 	at yahoo.spark.SparkFlickrLargeJoin$.main(SparkFlickrLargeJoin.scala:26)
> 	at yahoo.spark.SparkFlickrLargeJoin.main(SparkFlickrLargeJoin.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:617)
> It seems https://issues.apache.org/jira/browse/SPARK-15565 changed it to default to the local filesystem. Even before that change it didn't work either, just with a different error -> https://issues.apache.org/jira/browse/SPARK-15034.
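> A possible workaround (untested here; the warehouse and input paths below are only placeholders) might be to point spark.sql.warehouse.dir at an HDFS URI explicitly when building the SparkSession, e.g.:
> 	import org.apache.spark.sql.SparkSession
> 	// Explicitly set the warehouse dir to an HDFS path so the catalog does not
> 	// mix a file:/ path with the hdfs:// default filesystem when running on YARN.
> 	val spark = SparkSession.builder()
> 	  .appName("SparkFlickrLargeJoin")
> 	  .config("spark.sql.warehouse.dir", "hdfs://nn1.com:8020/user/tgraves/spark-warehouse")
> 	  .getOrCreate()
> 	// Reading Parquet as before should then resolve the warehouse location on HDFS.
> 	val df = spark.read.parquet("hdfs://nn1.com:8020/path/to/input")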



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org