Posted to user@hive.apache.org by Mark Memory <gh...@gmail.com> on 2016/05/18 11:24:33 UTC

Hello, I have an issue with HCatalog

Hello guys, sorry to bother you.

I'm using HCatalog to write Hive tables, but I don't know how to handle NameNode HA.
My code was copied from
https://github.com/apache/hive/blob/master/hcatalog/core/src/test/java/org/apache/hive/hcatalog/data/TestReaderWriter.java

*Below is my config:*

    hiveConf.setVar(HiveConf.ConfVars.HADOOPBIN, "/opt/modules/hadoop/bin");
    hiveConf.setVar(HiveConf.ConfVars.LOCALSCRATCHDIR, "/opt/modules/hive/temp");
    hiveConf.setVar(HiveConf.ConfVars.DOWNLOADED_RESOURCES_DIR, "/opt/modules/hive/temp");
    hiveConf.setBoolVar(HiveConf.ConfVars.HIVE_SUPPORT_CONCURRENCY, false);
    hiveConf.setVar(HiveConf.ConfVars.METASTOREWAREHOUSE, "/warehouse");
    hiveConf.setVar(HiveConf.ConfVars.METASTOREURIS, "thrift://127.0.0.1:9083");
    hiveConf.setVar(HiveConf.ConfVars.METASTORE_CONNECTION_DRIVER, "com.mysql.jdbc.Driver");
    hiveConf.setVar(HiveConf.ConfVars.METASTORECONNECTURLKEY, "jdbc:mysql://192.168.5.29:3306/hive?createDatabaseIfNotExist=true");
    hiveConf.setVar(HiveConf.ConfVars.METASTORE_CONNECTION_USER_NAME, "hive");
    hiveConf.setVar(HiveConf.ConfVars.METASTOREPWD, "123456");
    hiveConf.setVar(HiveConf.ConfVars.HIVEHISTORYFILELOC, "/opt/modules/hive/temp");

*and the error is:*

Caused by: java.lang.IllegalArgumentException: java.net.UnknownHostException: cluster
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:312)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)

*Can anyone help me? Thank you!*
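
In a NameNode HA setup, "cluster" in the exception is presumably the logical HA nameservice id. The UnknownHostException suggests the HDFS client never received the dfs.* HA settings that map that logical name to real NameNode hosts, so it falls back to resolving "cluster" as a hostname. Below is a minimal, hypothetical sketch of supplying those settings on the HiveConf (which extends Hadoop's Configuration); the nn1/nn2 ids and the host:port values are placeholders that must match the cluster's actual hdfs-site.xml, and putting the real core-site.xml and hdfs-site.xml on the classpath achieves the same thing:

    // Hypothetical sketch: HA client settings that let the logical nameservice
    // "cluster" resolve; nn1/nn2 and the host:port values are placeholders.
    hiveConf.set("fs.defaultFS", "hdfs://cluster");
    hiveConf.set("dfs.nameservices", "cluster");
    hiveConf.set("dfs.ha.namenodes.cluster", "nn1,nn2");
    hiveConf.set("dfs.namenode.rpc-address.cluster.nn1", "namenode1.example.com:8020");
    hiveConf.set("dfs.namenode.rpc-address.cluster.nn2", "namenode2.example.com:8020");
    hiveConf.set("dfs.client.failover.proxy.provider.cluster",
        "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");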

Re: Hello, I have an issue with HCatalog

Posted by Mark Memory <gh...@gmail.com>.
Yeah, I guess so, but I don't know how to configure the connection to HDFS. I had
thought that HiveConf.ConfVars.HADOOPBIN would help connect to HDFS, but I was wrong.

Below are the relevant defaults, but what do these defaults mean? Which one
establishes the connection to HDFS?

    // Hadoop Configuration Properties
    // Properties with null values are ignored and exist only for the purpose of giving us
    // a symbolic name to reference in the Hive source code. Properties with non-null
    // values will override any values set in the underlying Hadoop configuration.
    HADOOPBIN("hadoop.bin.path", findHadoopBinary(), "", true),
    HIVE_FS_HAR_IMPL("fs.har.impl", "org.apache.hadoop.hive.shims.HiveHarFileSystem",
        "The implementation for accessing Hadoop Archives. Note that this won't be applicable to Hadoop versions less than 0.20"),
    HADOOPFS(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPFS"), null, "", true),
    HADOOPMAPFILENAME(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPMAPFILENAME"), null, "", true),
    HADOOPMAPREDINPUTDIR(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPMAPREDINPUTDIR"), null, "", true),
    HADOOPMAPREDINPUTDIRRECURSIVE(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPMAPREDINPUTDIRRECURSIVE"), false, "", true),
    MAPREDMAXSPLITSIZE(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDMAXSPLITSIZE"), 256000000L, "", true),
    MAPREDMINSPLITSIZE(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDMINSPLITSIZE"), 1L, "", true),
    MAPREDMINSPLITSIZEPERNODE(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDMINSPLITSIZEPERNODE"), 1L, "", true),
    MAPREDMINSPLITSIZEPERRACK(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDMINSPLITSIZEPERRACK"), 1L, "", true),

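Of these, HADOOPFS is presumably the relevant one: it is the symbolic name the shims map to Hadoop's fs.defaultFS (fs.default.name on older Hadoop releases). Setting it alone is not enough for HA, though; the dfs.* settings that resolve the logical nameservice still have to be present. A hedged sketch:

    // Hypothetical: point Hive at the HA nameservice via fs.defaultFS, which is
    // what the HADOOPFS symbolic name refers to. The dfs.* HA properties from
    // hdfs-site.xml must also be loaded for "cluster" to resolve.
    hiveConf.set("fs.defaultFS", "hdfs://cluster");
    // or, equivalently, via the symbolic name (assuming setVar accepts it):
    // hiveConf.setVar(HiveConf.ConfVars.HADOOPFS, "hdfs://cluster");
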
2016-05-19 8:12 GMT+08:00 Alan Gates <al...@gmail.com>:

> This looks to me like a Hadoop issue rather than Hive.  It appears that
> you cannot connect to HDFS.  Have you tried connecting to HDFS outside of
> Hive/HCatalog?
>
> Alan.
>

Re: Hello, I have an issue with HCatalog

Posted by Alan Gates <al...@gmail.com>.
This looks to me like a Hadoop issue rather than Hive.  It appears that you cannot connect to HDFS.  Have you tried connecting to HDFS outside of Hive/HCatalog?  

Alan.
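
A minimal sketch of such a standalone check, independent of Hive/HCatalog, assuming the Hadoop client jars are on the classpath and that core-site.xml/hdfs-site.xml (with the HA settings for the "cluster" nameservice) are either on the classpath or loaded explicitly; the config path in the comment is a guess based on the /opt/modules/hadoop install above:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsCheck {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // If the XML files are not on the classpath, load them explicitly
        // (path is hypothetical):
        // conf.addResource(new Path("file:///opt/modules/hadoop/etc/hadoop/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);              // resolves fs.defaultFS
        for (FileStatus status : fs.listStatus(new Path("/"))) {
          System.out.println(status.getPath());            // list the HDFS root
        }
      }
    }

If this works on its own but the HCatalog code still fails, the problem is in how the HiveConf is built rather than in HDFS itself.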
