Posted to user@hive.apache.org by Blake Martin <bm...@demonware.net> on 2016/04/25 18:38:31 UTC

Using a different FileSystem for Staging

Hi Hive Folks,

We're writing to S3-backed tables, but we'd like to use HDFS for staging and
merging. When we set hive.exec.stagingdir to an HDFS location, we get:

16/04/19 22:59:54 [main]: ERROR ql.Driver: FAILED: IllegalArgumentException Wrong FS: hdfs://<redacted>:8020/tmp/hivestaging_hive_2016-04-19_22-59-52_470_66381161375278005-1, expected: s3a://<redacted>
java.lang.IllegalArgumentException: Wrong FS: hdfs://ip-172-20-109-129.node.dc1.consul:8020/tmp/hivestaging_hive_2016-04-19_22-59-52_470_66381161375278005-1, expected: s3a://dwh-dev
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:646)
        at org.apache.hadoop.fs.FileSystem.makeQualified(FileSystem.java:466)
        at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:230)
        at org.apache.hadoop.hive.ql.Context.getExtTmpPathRelTo(Context.java:426)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFileSinkPlan(SemanticAnalyzer.java:6271)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:9007)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8898)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9743)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9636)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10109)
        at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:329)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10120)
        at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:211)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:456)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:316)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1181)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1229)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1118)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1108)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
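For context, this is roughly how we hit it (the namenode host, staging path, and table names below are illustrative placeholders, not our actual values):

```shell
# Sketch of the failing setup -- hostnames, paths, and table names
# here are placeholders for illustration only.
hive \
  --hiveconf hive.exec.stagingdir=hdfs://namenode:8020/tmp/hivestaging \
  -e "INSERT OVERWRITE TABLE s3_backed_table SELECT * FROM source_table"
# Compilation fails with the "Wrong FS" error shown above: Hive
# qualifies the staging path against the target table's filesystem
# (s3a://...), so it rejects an hdfs:// staging directory.
```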

Is there an easy way to do this, without patching Hive or Hadoop?

Thanks!