Posted to user@hbase.apache.org by "Ratner, Alan S (IS)" <Al...@ngc.com> on 2012/11/26 22:03:37 UTC

Runs in Eclipse but not as a Jar

I am running HBase 0.94.2 on 6 servers, with ZooKeeper 3.4.5 on 3 of them.  HBase works from its shell and from within Eclipse, but not as a jar file.  When I run within Eclipse I can verify that it worked by using HBase shell commands (such as scan).



I seem to have 2 separate problems.



Problem 1: when I create a jar file from Eclipse it won't run at all:

ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable -classpath "/home/ngc/hbase-0.94.2/*"

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration at HBase.CreateBiTable.run(CreateBiTable.java:26) [line 26 is: Configuration conf = HBaseConfiguration.create();]



Problem 2: when I create a "runnable" jar file from Eclipse it communicates with Zookeeper but then dies with:

Exception in thread "main" java.lang.IllegalArgumentException: Not a host:port pair: \ufffd

      5800@hadoop1hadoop1.aj.c2fse.northgrum.com,60000,1353949574468



I'd prefer to use a regular jar (5 KB) rather than a runnable jar (100 MB).  But I assume that if I fix Problem 1 then it will proceed until it crashes with Problem 2.



Thanks in advance for any suggestions --- Alan.



-----------------------------

CLASSPATH

ngc@hadoop1:~/hadoop-1.0.4$ env | grep CLASSPATH
CLASSPATH=/home/ngc/hadoop-1.0.4:/home/ngc/hbase-0.94.2/bin:/home/ngc/zookeeper-3.4.5/bin:/home/ngc/accumulo-1.3.5-incubating



-----------------------------

HBASE PROGRAM

package HBase;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class CreateBiTable extends Configured implements Tool {
    public static String TableName = "BiIPTable";
    public static String cf = "cf";  // column family
    public static String c1 = "c1";  // column 1

    public static void main(String[] args) throws Exception {
        long startTime = System.currentTimeMillis();
        int res = ToolRunner.run(new Configuration(), new CreateBiTable(), args);
        double duration = (System.currentTimeMillis() - startTime) / 1000.0;
        System.out.println(">>>> Job Finished in " + duration + " seconds");
        System.exit(res);
    }

    public int run(String[] arg0) throws Exception {
        Configuration conf = HBaseConfiguration.create(); // System.out.println("Configuration created");
        System.out.println("\t" + conf.toString());
        HBaseAdmin admin = new HBaseAdmin(conf); // System.out.println("\t"+admin.toString());
        // Disable and delete the table if it already exists
        if (admin.tableExists(TableName)) {
            admin.disableTable(TableName);
            admin.deleteTable(TableName);
            System.out.println(TableName + " exists so deleted");
        }
        // Create the table with a single column family
        HTableDescriptor htd = new HTableDescriptor(TableName);
        HColumnDescriptor hcd = new HColumnDescriptor(cf);
        htd.addFamily(hcd);
        admin.createTable(htd);
        System.out.println("Table created: " + htd);
        // Does the table exist now?
        if (admin.tableExists(TableName))
            System.out.println(TableName + " creation succeeded");
        else
            System.out.println(TableName + " creation failed");
        return 0;
    }
}



-----------------------------

OUTPUT FROM RUNNING WITHIN ECLIPSE

            Configuration: core-default.xml, core-site.xml, hbase-default.xml, hbase-site.xml

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/home/ngc/mahout-distribution-0.7/mahout-examples-0.7-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/home/ngc/hadoop-1.0.4/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:host.name=hadoop1.aj.c2fse.northgrum.com

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_25

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.home=/home/ngc/jdk1.6.0_25/jre

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/home/ngc/AlanSpace/HadoopPrograms/bin:/home/ngc/hadoop-1.0.4/hadoop-core-1.0.4.jar:/home/ngc/zookeeper-3.4.5/zookeeper-3.4.5.jar:/home/ngc/JavaLibraries/Jama/Jama-1.0.2.jar:/home/ngc/AlansOpenCVStuff/core.jar:/home/ngc/OpenCV-2.2.0/javacv/javacpp.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-linux-x86.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-linux-x86_64.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-macosx-x86_64.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-windows-x86.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-windows-x86_64.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv.jar:/home/ngc/OpenCV-2.2.0/lib:/home/ngc/javafaces/lib/colt.jar:/home/ngc/AlansOpenCVStuff/commons-math3-3.0/commons-math3-3.0.jar:/home/ngc/AlansOpenCVStuff/commons-math3-3.0/commons-math3-3.0-javadoc.jar:/home/ngc/Downloads/jtransforms-2.4.jar:/home/ngc/mahout-distribution-0.7/mahout-core-0.7.jar:/home/ngc/mahout-distribution-0.7/mahout-core-0.7-job.jar:/home/ngc/mahout-distribution-0.7/mahout-integration-0.7.jar:/home/ngc/hbase-0.94.2/hbase-0.94.2.jar:/home/ngc/mahout-distribution-0.7/mahout-math-0.7.jar:/home/ngc/mahout-distribution-0.7/mahout-examples-0.7.jar:/home/ngc/mahout-distribution-0.7/mahout-examples-0.7-job.jar:/home/ngc/mahout-distribution-0.7/lib/commons-cli-2.0-mahout.jar:/home/ngc/mahout-distribution-0.7/lib/uncommons-maths-1.2.2.jar:/home/ngc/pig-0.10.0/pig-0.10.0.jar:/home/ngc/Cascading/cascading-core-2.1.0-wip-76.jar:/home/ngc/Cascading/cascading-hadoop-2.1.0-wip-76.jar:/home/ngc/Cascading/cascading-local-2.1.0-wip-76.jar:/home/ngc/Cascading/cascading-xml-2.1.0-wip-76.jar:/home/ngc/hadoop-1.0.4/hadoop-ant-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-client-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-examples-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-minicluster-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-test-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-tools-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/asm-3.2.jar:/home/ngc/hadoop-1.0.4/lib/aspect
jrt-1.6.5.jar:/home/ngc/hadoop-1.0.4/lib/aspectjtools-1.6.5.jar:/home/ngc/hadoop-1.0.4/lib/commons-beanutils-1.7.0.jar:/home/ngc/hadoop-1.0.4/lib/commons-beanutils-core-1.8.0.jar:/home/ngc/hadoop-1.0.4/lib/commons-cli-1.2.jar:/home/ngc/hadoop-1.0.4/lib/commons-codec-1.4.jar:/home/ngc/hadoop-1.0.4/lib/commons-collections-3.2.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-configuration-1.6.jar:/home/ngc/hadoop-1.0.4/lib/commons-daemon-1.0.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-digester-1.8.jar:/home/ngc/hadoop-1.0.4/lib/commons-el-1.0.jar:/home/ngc/hadoop-1.0.4/lib/commons-io-2.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-logging-1.1.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-logging-api-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/commons-math-2.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-net-1.4.1.jar:/home/ngc/hadoop-1.0.4/lib/core-3.1.1.jar:/home/ngc/hadoop-1.0.4/lib/hadoop-capacity-scheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/hadoop-fairscheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/hadoop-thriftfs-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/hsqldb-1.8.0.10.jar:/home/ngc/hadoop-1.0.4/lib/jackson-core-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/lib/jackson-mapper-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/lib/jdeb-0.8.jar:/home/ngc/hadoop-1.0.4/lib/jersey-core-1.8.jar:/home/ngc/hadoop-1.0.4/lib/jersey-json-1.8.jar:/home/ngc/hadoop-1.0.4/lib/jersey-server-1.8.jar:/home/ngc/hadoop-1.0.4/lib/jets3t-0.6.1.jar:/home/ngc/hadoop-1.0.4/lib/jetty-6.1.26.jar:/home/ngc/hadoop-1.0.4/lib/jetty-util-6.1.26.jar:/home/ngc/hadoop-1.0.4/lib/jsch-0.1.42.jar:/home/ngc/hadoop-1.0.4/lib/junit-4.5.jar:/home/ngc/hadoop-1.0.4/lib/kfs-0.2.2.jar:/home/ngc/hadoop-1.0.4/lib/mockito-all-1.8.5.jar:/home/ngc/hadoop-1.0.4/lib/oro-2.0.8.jar:/home/ngc/hadoop-1.0.4/lib/slf4j-api-1.4.3.jar:/home/ngc/hadoop-1.0.4/lib/slf4j-log4j12-1.4.3.jar:/home/ngc/hadoop-1.0.4/lib/xmlenc-0.52.jar:/home/ngc/Data/SchemaB.jar:/home/ngc/hive/lib/hive-exec-0.7.0.jar:/home/ngc/hbase-0.94.2/hbase-0.94.2-tests.jar:/home/ngc/hbase-0.94.2/lib/activation-1.1.jar:/
home/ngc/hbase-0.94.2/lib/avro-1.5.3.jar:/home/ngc/hbase-0.94.2/lib/avro-ipc-1.5.3.jar:/home/ngc/hbase-0.94.2/lib/commons-digester-1.8.jar:/home/ngc/hbase-0.94.2/lib/commons-httpclient-3.1.jar:/home/ngc/hbase-0.94.2/lib/commons-lang-2.5.jar:/home/ngc/hbase-0.94.2/lib/guava-11.0.2.jar:/home/ngc/hbase-0.94.2/lib/high-scale-lib-1.1.1.jar:/home/ngc/hbase-0.94.2/lib/httpclient-4.1.2.jar:/home/ngc/hbase-0.94.2/lib/httpcore-4.1.3.jar:/home/ngc/hbase-0.94.2/lib/jackson-jaxrs-1.8.8.jar:/home/ngc/hbase-0.94.2/lib/jackson-xc-1.8.8.jar:/home/ngc/hbase-0.94.2/lib/jamon-runtime-2.3.1.jar:/home/ngc/hbase-0.94.2/lib/jasper-compiler-5.5.23.jar:/home/ngc/hbase-0.94.2/lib/jasper-runtime-5.5.23.jar:/home/ngc/hbase-0.94.2/lib/jaxb-api-2.1.jar:/home/ngc/hbase-0.94.2/lib/jaxb-impl-2.2.3-1.jar:/home/ngc/hbase-0.94.2/lib/jettison-1.1.jar:/home/ngc/hbase-0.94.2/lib/jruby-complete-1.6.5.jar:/home/ngc/hbase-0.94.2/lib/jsp-2.1-6.1.14.jar:/home/ngc/hbase-0.94.2/lib/jsp-api-2.1-6.1.14.jar:/home/ngc/hbase-0.94.2/lib/jsr305-1.3.9.jar:/home/ngc/hbase-0.94.2/lib/junit-4.10-HBASE-1.jar:/home/ngc/hbase-0.94.2/lib/libthrift-0.8.0.jar:/home/ngc/hbase-0.94.2/lib/log4j-1.2.16.jar:/home/ngc/hbase-0.94.2/lib/metrics-core-2.1.2.jar:/home/ngc/hbase-0.94.2/lib/netty-3.2.4.Final.jar:/home/ngc/hbase-0.94.2/lib/protobuf-java-2.4.0a.jar:/home/ngc/hbase-0.94.2/lib/servlet-api-2.5-6.1.14.jar:/home/ngc/hbase-0.94.2/lib/snappy-java-1.0.3.2.jar:/home/ngc/hbase-0.94.2/lib/stax-api-1.0.1.jar:/home/ngc/hbase-0.94.2/lib/velocity-1.7.jar

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/home/ngc/AlansOpenCVStuff

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:os.version=3.2.0-24-generic

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:user.name=ngc

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/ngc

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/ngc/AlanSpace/HadoopPrograms

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection

12/11/26 13:48:54 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 23098@hadoop1

12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)

12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session

12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13b3dab1196001f, negotiated timeout = 40000

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=catalogtracker-on-org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@30ff8c74

12/11/26 13:48:54 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 23098@hadoop1

12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)

12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Socket connection established to localhost/0:0:0:0:0:0:0:1:2181, initiating session

12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/0:0:0:0:0:0:0:1:2181, sessionid = 0x13b3dab11960020, negotiated timeout = 40000

12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Session: 0x13b3dab11960020 closed

12/11/26 13:48:54 INFO zookeeper.ClientCnxn: EventThread shut down

12/11/26 13:48:54 INFO client.HBaseAdmin: Started disable of BiIPTable

12/11/26 13:48:56 INFO client.HBaseAdmin: Disabled BiIPTable

12/11/26 13:48:57 INFO client.HBaseAdmin: Deleted BiIPTable
BiIPTable exists so deleted
Table created: {NAME => 'BiIPTable', FAMILIES => [{NAME => 'cf', DATA_BLOCK_ENCODING => 'NONE', BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', COMPRESSION => 'NONE', VERSIONS => '3', TTL => '2147483647', MIN_VERSIONS => '0', KEEP_DELETED_CELLS => 'false', BLOCKSIZE => '65536', ENCODE_ON_DISK => 'true', IN_MEMORY => 'false', BLOCKCACHE => 'true'}]}

12/11/26 13:48:59 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=catalogtracker-on-org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@30ff8c74

12/11/26 13:48:59 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 23098@hadoop1

12/11/26 13:48:59 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)

12/11/26 13:48:59 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session

12/11/26 13:48:59 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13b3dab11960021, negotiated timeout = 40000

12/11/26 13:48:59 INFO zookeeper.ZooKeeper: Session: 0x13b3dab11960021 closed

12/11/26 13:48:59 INFO zookeeper.ClientCnxn: EventThread shut down
BiIPTable creation succeeded

>>>> Job Finished in 5.177 seconds



-----------------------------

OUTPUT FROM REGULAR JAR FILE

ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

            at HBase.CreateBiTable.run(CreateBiTable.java:26)

            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)

            at HBase.CreateBiTable.main(CreateBiTable.java:19)

            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

            at java.lang.reflect.Method.invoke(Method.java:597)

            at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration

            at java.net.URLClassLoader$1.run(URLClassLoader.java:202)

            at java.security.AccessController.doPrivileged(Native Method)

            at java.net.URLClassLoader.findClass(URLClassLoader.java:190)

            at java.lang.ClassLoader.loadClass(ClassLoader.java:306)

            at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

            ... 8 more



-----------------------------

OUTPUT FROM RUNNABLE JAR FILE

ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable

            Configuration: core-default.xml, core-site.xml, hbase-default.xml, hbase-site.xml

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.3.3-1073969, built on 02/23/2011 22:27 GMT

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:host.name=hadoop1.aj.c2fse.northgrum.com

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_25

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.home=/home/ngc/jdk1.6.0_25/jre

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/home/ngc/hadoop-1.0.4/libexec/../conf:/home/ngc/jdk1.6.0_25/lib/tools.jar:/home/ngc/hadoop-1.0.4/libexec/..:/home/ngc/hadoop-1.0.4/libexec/../hadoop-core-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/asm-3.2.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/aspectjrt-1.6.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/aspectjtools-1.6.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-beanutils-1.7.0.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-cli-1.2.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-codec-1.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-collections-3.2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-configuration-1.6.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-daemon-1.0.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-digester-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-el-1.0.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-httpclient-3.0.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-io-2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-lang-2.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-logging-1.1.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-logging-api-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-math-2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-net-1.4.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/core-3.1.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hadoop-capacity-scheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hadoop-fairscheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hadoop-thriftfs-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hsqldb-1.8.0.10.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jasper-compiler-5.5.12.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jasper-runtime-5.5.12.jar:
/home/ngc/hadoop-1.0.4/libexec/../lib/jdeb-0.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jersey-core-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jersey-json-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jersey-server-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jets3t-0.6.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jetty-6.1.26.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jetty-util-6.1.26.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jsch-0.1.42.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/junit-4.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/kfs-0.2.2.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/log4j-1.2.15.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/mockito-all-1.8.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/oro-2.0.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/servlet-api-2.5-20081211.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/slf4j-api-1.4.3.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/xmlenc-0.52.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jsp-2.1/jsp-api-2.1.jar

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/home/ngc/hadoop-1.0.4/libexec/../lib/native/Linux-amd64-64

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:os.version=3.2.0-24-generic

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:user.name=ngc

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/ngc

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/ngc/hadoop-1.0.4

12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection

12/11/26 14:15:17 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181

12/11/26 14:15:17 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session

12/11/26 14:15:17 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13b3dab11960023, negotiated timeout = 40000
Exception in thread "main" java.lang.IllegalArgumentException: Not a host:port pair: \ufffd

      5800@hadoop1hadoop1.aj.c2fse.northgrum.com,60000,1353949574468

            at org.apache.hadoop.hbase.HServerAddress.<init>(HServerAddress.java:60)

            at org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:63)

            at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:352)

            at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:90)

            at HBase.CreateBiTable.run(CreateBiTable.java:29)

            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)

            at HBase.CreateBiTable.main(CreateBiTable.java:19)

            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

            at java.lang.reflect.Method.invoke(Method.java:597)

            at org.apache.hadoop.util.RunJar.main(RunJar.java:156)



Alan


Re: Runs in Eclipse but not as a Jar

Posted by Suraj Varma <sv...@gmail.com>.
The difference is your classpath.
For problem 1, you need to put the jars under /hbase-0.94.2/lib on your
classpath. You only need a subset of them, but to get past this error,
first set your classpath with all of those jars. I don't think
specifying a wildcard ("*") works, as in:

ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable -classpath "/home/ngc/hbase-0.94.2/*"

You can run "bin/hbase classpath" to print out the full classpath, which
you can then include in your command-line script.

In addition to the jars, you also need to add your (client-side)
hbase-site.xml to the classpath. That should take care of your problem 2.
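
Combining the two suggestions, a minimal launcher script might look like the sketch below. This is only illustrative: the paths mirror the ones shown elsewhere in this thread, HADOOP_CLASSPATH is the standard variable bin/hadoop consults for extra client jars, and the dotted class name (HBase.CreateBiTable) is assumed rather than the slash form.

```shell
#!/bin/sh
# Sketch: put the HBase client jars and the client-side hbase-site.xml
# on the classpath before launching the job jar. Paths are illustrative.
HBASE_HOME=/home/ngc/hbase-0.94.2

# "bin/hbase classpath" prints everything the HBase shell itself uses,
# including $HBASE_HOME/conf (and therefore hbase-site.xml).
export HADOOP_CLASSPATH=$("$HBASE_HOME"/bin/hbase classpath)

cd /home/ngc/hadoop-1.0.4
bin/hadoop jar ../eclipse/CreateBiTable.jar HBase.CreateBiTable
```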

Hope that helps.
--Suraj

>
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:user.name=ngc
>
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/ngc
>
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/ngc/AlanSpace/HadoopPrograms
>
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
>
> 12/11/26 13:48:54 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 23098@hadoop1
>
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)
>
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
>
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13b3dab1196001f, negotiated timeout = 40000
>
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=catalogtracker-on-org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@30ff8c74
>
> 12/11/26 13:48:54 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 23098@hadoop1
>
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)
>
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Socket connection established to localhost/0:0:0:0:0:0:0:1:2181, initiating session
>
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/0:0:0:0:0:0:0:1:2181, sessionid = 0x13b3dab11960020, negotiated timeout = 40000
>
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Session: 0x13b3dab11960020 closed
>
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: EventThread shut down
>
> 12/11/26 13:48:54 INFO client.HBaseAdmin: Started disable of BiIPTable
>
> 12/11/26 13:48:56 INFO client.HBaseAdmin: Disabled BiIPTable
>
> 12/11/26 13:48:57 INFO client.HBaseAdmin: Deleted BiIPTable
>
> BiIPTable exists so deleted
>
> Table created: {NAME => 'BiIPTable', FAMILIES => [{NAME => 'cf', DATA_BLOCK_ENCODING => 'NONE', BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', COMPRESSION => 'NONE', VERSIONS => '3', TTL => '2147483647', MIN_VERSIONS => '0', KEEP_DELETED_CELLS => 'false', BLOCKSIZE => '65536', ENCODE_ON_DISK => 'true', IN_MEMORY => 'false', BLOCKCACHE => 'true'}]}
>
> 12/11/26 13:48:59 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=catalogtracker-on-org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@30ff8c74
>
> 12/11/26 13:48:59 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 23098@hadoop1
>
> 12/11/26 13:48:59 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)
>
> 12/11/26 13:48:59 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
>
> 12/11/26 13:48:59 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13b3dab11960021, negotiated timeout = 40000
>
> 12/11/26 13:48:59 INFO zookeeper.ZooKeeper: Session: 0x13b3dab11960021 closed
>
> 12/11/26 13:48:59 INFO zookeeper.ClientCnxn: EventThread shut down
>
> BiIPTable creation succeeded
>
>>>>> Job Finished in 5.177 seconds
>
>
>
> -----------------------------
>
> OUTPUT FROM REGULAR JAR FILE
>
> ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
>
>             at HBase.CreateBiTable.run(CreateBiTable.java:26)
>
>             at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>
>             at HBase.CreateBiTable.main(CreateBiTable.java:19)
>
>             at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
>             at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>
>             at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>
>             at java.lang.reflect.Method.invoke(Method.java:597)
>
>             at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
>
>             at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>
>             at java.security.AccessController.doPrivileged(Native Method)
>
>             at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>
>             at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>
>             at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>
>             ... 8 more
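The ClassNotFoundException above suggests the HBase jars never reach the JVM that `bin/hadoop jar` starts: anything after the main class on that command line (including `-classpath`) is handed to the program as arguments by RunJar, not to the JVM. One common workaround, sketched here and not verified on this cluster (the paths match the layout shown above but are assumptions), is to export HADOOP_CLASSPATH before launching:

```shell
# Sketch: put the HBase client jars on the classpath that bin/hadoop
# itself builds, instead of passing -classpath as a program argument.
HBASE_HOME=${HBASE_HOME:-/home/ngc/hbase-0.94.2}
# Join every jar in $HBASE_HOME and $HBASE_HOME/lib into one ':' list.
hbase_jars=$(printf '%s:' "$HBASE_HOME"/*.jar "$HBASE_HOME"/lib/*.jar)
export HADOOP_CLASSPATH="$HBASE_HOME/conf:${hbase_jars%:}"
# Then run the plain (non-runnable) jar as before:
# bin/hadoop jar ../eclipse/CreateBiTable.jar HBase.CreateBiTable
```

With HADOOP_CLASSPATH set this way, the small 5 KB jar should resolve HBaseConfiguration without bundling 100 MB of dependencies.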
>
>
>
> -----------------------------
>
> OUTPUT FROM RUNNABLE JAR FILE
>
> ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable
>
>             Configuration: core-default.xml, core-site.xml, hbase-default.xml, hbase-site.xml
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.3.3-1073969, built on 02/23/2011 22:27 GMT
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:host.name=hadoop1.aj.c2fse.northgrum.com
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_25
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.home=/home/ngc/jdk1.6.0_25/jre
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/home/ngc/hadoop-1.0.4/libexec/../conf:/home/ngc/jdk1.6.0_25/lib/tools.jar:/home/ngc/hadoop-1.0.4/libexec/..:/home/ngc/hadoop-1.0.4/libexec/../hadoop-core-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/asm-3.2.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/aspectjrt-1.6.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/aspectjtools-1.6.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-beanutils-1.7.0.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-cli-1.2.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-codec-1.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-collections-3.2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-configuration-1.6.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-daemon-1.0.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-digester-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-el-1.0.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-httpclient-3.0.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-io-2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-lang-2.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-logging-1.1.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-logging-api-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-math-2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-net-1.4.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/core-3.1.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hadoop-capacity-scheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hadoop-fairscheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hadoop-thriftfs-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hsqldb-1.8.0.10.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jasper-compiler-5.5.12.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jasper-runtime-5.5.12.ja
r:/home/ngc/hadoop-1.0.4/libexec/../lib/jdeb-0.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jersey-core-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jersey-json-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jersey-server-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jets3t-0.6.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jetty-6.1.26.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jetty-util-6.1.26.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jsch-0.1.42.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/junit-4.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/kfs-0.2.2.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/log4j-1.2.15.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/mockito-all-1.8.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/oro-2.0.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/servlet-api-2.5-20081211.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/slf4j-api-1.4.3.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/xmlenc-0.52.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/home/ngc/hadoop-1.0.4/libexec/../lib/native/Linux-amd64-64
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:os.version=3.2.0-24-generic
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:user.name=ngc
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/ngc
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/ngc/hadoop-1.0.4
>
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
>
> 12/11/26 14:15:17 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181
>
> 12/11/26 14:15:17 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
>
> 12/11/26 14:15:17 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13b3dab11960023, negotiated timeout = 40000
>
> Exception in thread "main" java.lang.IllegalArgumentException: Not a host:port pair: \ufffd
>
>       5800@hadoop1hadoop1.aj.c2fse.northgrum.com,60000,1353949574468
>
>             at org.apache.hadoop.hbase.HServerAddress.<init>(HServerAddress.java:60)
>
>             at org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:63)
>
>             at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:352)
>
>             at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:90)
>
>             at HBase.CreateBiTable.run(CreateBiTable.java:29)
>
>             at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>
>             at HBase.CreateBiTable.main(CreateBiTable.java:19)
>
>             at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
>             at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>
>             at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>
>             at java.lang.reflect.Method.invoke(Method.java:597)
>
>             at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
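Note the `zookeeper.version=3.3.3` line in this run versus 3.4.5 on the cluster: the runnable jar bundles its own, older client stack. The "Not a host:port pair" failure in HServerAddress commonly indicates an HBase client/server version mismatch (an older client cannot parse the master znode written by a newer master), so it may be worth checking which HBase version Eclipse actually packaged. A possible check, sketched and unverified (the jar path is the one from the command above; `hbase-default.xml` sits at the jar root in recent HBase releases, and its absence would itself suggest an old client):

```shell
# Sketch: list what the fat jar bundles and read the HBase version it
# declares in hbase-default.xml, if that resource is present.
unzip -p ../eclipse/CreateBiTable.jar hbase-default.xml \
  | grep -A1 'hbase.defaults.for.version'
```

If the reported version differs from 0.94.2, rebuilding the runnable jar against the cluster's own hbase-0.94.2 and zookeeper-3.4.5 jars would be the first thing to try.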
>
>
>
> Alan
>