Posted to issues@spark.apache.org by "Rivkin Andrey (JIRA)" <ji...@apache.org> on 2017/01/20 15:04:26 UTC
[jira] [Created] (SPARK-19312) Spark gives wrong error message when it fails to create a file due to HDFS quota limit.
Rivkin Andrey created SPARK-19312:
-------------------------------------
Summary: Spark gives wrong error message when it fails to create a file due to HDFS quota limit.
Key: SPARK-19312
URL: https://issues.apache.org/jira/browse/SPARK-19312
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 1.6.0
Environment: CDH 5.8
Reporter: Rivkin Andrey
Priority: Minor
If we set a quota on the user's space and then try to create a table through Hive on Spark that needs more space than is available, Spark fails with:
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException): failed to create file /user/xxxx/hive_db/.hive-staging_hive_..../_task_tmp.-ext-10003/_tmp.000030_0 for DFSClient_NONMAPREDUCE_-27052423_230 for client 192.168.x.x because current leaseholder is trying to recreate file.
If we change the Hive execution engine to MR and execute the same create-table command, we instead get:
Caused by: org.apache.hadoop.hdfs.protocol.DSQuotaExceededException: The DiskSpace quota of /user/xxxx is exceeded: quota = 10737418240 B = 10 GB but diskspace consumed = 11098812438 B = 10.34 GB
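For reference, the quota reported in the MR trace above can be set and inspected with standard HDFS admin commands. This is a sketch of a reproduction setup on a test cluster; the path is the masked placeholder from the report, and the quota-setting commands require superuser rights:

```shell
# Set a 10 GB space quota on the user directory (HDFS superuser only;
# /user/xxxx is the masked placeholder path from the report)
hdfs dfsadmin -setSpaceQuota 10g /user/xxxx

# Show the quota and the space consumed under the directory
hdfs dfs -count -q -h /user/xxxx

# Remove the quota again after testing
hdfs dfsadmin -clrSpaceQuota /user/xxxx
```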
After increasing the quota, Hive on Spark works again. The problem is the error message, which is inaccurate and unhelpful: it reports a lease conflict instead of the underlying quota violation.
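The two traces suggest the root DSQuotaExceededException is buried inside nested wrapper exceptions, and only the outermost message is surfaced. A minimal sketch of walking a cause chain to recover the innermost error; the exception classes here are hypothetical stand-ins for illustration, not the real Hadoop/Hive classes or Spark's actual error handling:

```python
# Hypothetical stand-in classes for the exceptions seen in the report;
# they are NOT the real Hadoop/Hive classes, just an illustration.
class HiveException(Exception):
    pass

class AlreadyBeingCreatedException(Exception):
    pass

class DSQuotaExceededException(Exception):
    pass

def root_cause(exc):
    """Walk the __cause__/__context__ chain to the innermost exception."""
    while exc.__cause__ is not None or exc.__context__ is not None:
        exc = exc.__cause__ or exc.__context__
    return exc

def simulate_failure():
    """Reproduce the nesting seen in the report:
    quota error -> lease error -> Hive error."""
    try:
        try:
            try:
                raise DSQuotaExceededException(
                    "The DiskSpace quota of /user/xxxx is exceeded")
            except DSQuotaExceededException as quota_err:
                raise AlreadyBeingCreatedException(
                    "failed to create file because current leaseholder "
                    "is trying to recreate file") from quota_err
        except AlreadyBeingCreatedException as lease_err:
            raise HiveException("query failed") from lease_err
    except HiveException as top:
        return top

top_exc = simulate_failure()
# Reporting only the outermost message hides the quota problem;
# the root cause is what the user actually needs to see.
print(type(root_cause(top_exc)).__name__)  # DSQuotaExceededException
```

Logging the root cause alongside (or instead of) the lease-conflict message would have pointed straight at the quota limit.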
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)