Posted to user@chukwa.apache.org by Eric Yang <er...@gmail.com> on 2013/02/01 06:37:40 UTC

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Are there multiple versions of the Hadoop jar files in the classpath?  This
error looks like the HDFS client is from Hadoop 1.x.  If there is an older
hadoop-core*.jar file on the classpath, it can generate this error.
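
A quick way to check is to list every Hadoop jar that Chukwa will load, for
example (the lib path below is the default 0.5.0 tarball layout, so adjust it
to your install):

  find chukwa-0.5.0/share/chukwa/lib -name 'hadoop-*.jar'
  echo $CLASSPATH | tr ':' '\n' | grep hadoop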

regards,
Eric

On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
mohandes.zebeleh.67@gmail.com> wrote:

> Hi there,
> I downloaded & installed Chukwa 0.5 on Hadoop version 2 (CDH 4.0.0). But
> when the collector runs, it shows this error:
> Server IPC version 7 cannot communicate with client version 4
>
> I copied the jars from /usr/lib/hadoop/*.jar & /usr/lib/hadoop-hdfs/*.jar,
> but it didn't help.
>
> I'd be glad if someone can help me.
> Tnx
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Farrokh Shahriari <mo...@gmail.com>.
How can I enable the other logs? Do I need to configure something special, or not?

On Tue, Feb 5, 2013 at 8:55 AM, Eric Yang <er...@gmail.com> wrote:

> Some of the logs are disabled by default because we started on Hadoop 0.16
> and haven't updated the logging configuration for Hadoop 1.x.
>
> Refer to the programming guide on how to stream your custom logs to HBase:
>
> http://incubator.apache.org/chukwa/docs/r0.5.0/programming.html
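>
> As a rough illustration (the adaptor class and the agent control port below
> are assumed defaults, not something verified against your setup; double-check
> them against the guide), a custom log file can be registered with a running
> agent via telnet:
>
>   telnet localhost 9093
>   add filetailer.FileTailingAdaptor MyLogType /var/log/myapp.log 0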
>
> regards,
> Eric
>
>
> On Sun, Feb 3, 2013 at 11:37 PM, Farrokh Shahriari <
> mohandes.zebeleh.67@gmail.com> wrote:
>
>> Thanks a lot Eric. Yeah, you were right, I forgot to insert this snippet in
>> chukwa-collector-conf.xml:
>> <property>
>>  <name>chukwaCollector.writerClass</name>
>>
>>  <value>org.apache.hadoop.chukwa.datacollection.writer.hbase.HBaseWriter</value>
>> </property>
>>
>> Now I can select the SystemMetrics and chukwa columns, but not the others
>> like Hadoop, HadoopLog, clustersummary. Why are these tables' columns
>> disabled? How about my own table, like 'myTable', which I created in HBase?
>>
>>
>>
>> On Sun, Feb 3, 2013 at 10:58 PM, Eric Yang <er...@gmail.com> wrote:
>>
>>> There are two possibilities.  First, HBase is not configured to point to
>>> the ZooKeeper instance that is storing the table information.
>>>
>>> Another possibility is that hicc is not configured with HBASE_CONF_DIR
>>> to access the correct ZooKeeper instance described in hbase-site.xml.
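>>>
>>> For example, chukwa-env.sh in $CHUKWA_CONF_DIR would typically export the
>>> HBase client configuration directory; the path below is just a common CDH
>>> layout, not taken from your setup:
>>>
>>>   export HBASE_CONF_DIR=/etc/hbase/conf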
>>>
>>> After you have solved the first problem, make sure that the Chukwa HBase
>>> schema is populated.  This is done with:
>>>
>>>  hbase shell < $CHUKWA_CONF_DIR/hbase.schema
>>>
>>> This step is required to create the Chukwa tables in HBase.
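>>>
>>> You can verify afterwards with something like:
>>>
>>>   echo "list" | hbase shell
>>>
>>> and check that the Chukwa tables (SystemMetrics, etc.) show up.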
>>>
>>> regards,
>>> Eric
>>>
>>>
>>> On Sun, Feb 3, 2013 at 2:12 AM, Farrokh Shahriari <
>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>
>>>> Thanks for your answer.
>>>> With your help I can now run hicc & the web UI. Now, in graph_explorer it
>>>> shows my tables in HBase, but I can't select any of them, meaning I can't
>>>> select any of their column families.
>>>> hicc.log says:
>>>>
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Client
>>>> environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Client environment:java.io.tmpdir=/tmp
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Client environment:java.compiler=<NA>
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Client environment:os.name=Linux
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Client environment:os.arch=amd64
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Client environment:os.version=2.6.32-220.el6.x86_64
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Client environment:user.name=root
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Client environment:user.home=/root
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Client environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
>>>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>>>> Initiating client connection, connectString=hadoop-standalone:2181
>>>> sessionTimeout=180000 watcher=hconnection
>>>> 2013-02-03 13:34:09,250 INFO 819062722@qtp-651528505-7-SendThread()
>>>> ClientCnxn - Opening socket connection to server /192.168.150.254:2181
>>>> 2013-02-03 13:34:09,250 INFO 819062722@qtp-651528505-7RecoverableZooKeeper - The identifier of this process is
>>>> 9290@hadoop-standalone.soc.net
>>>> 2013-02-03 13:34:09,254 WARN 819062722@qtp-651528505-7-SendThread(
>>>> hadoop-standalone.soc.net:2181) ZooKeeperSaslClient -
>>>> SecurityException: java.lang.SecurityException: Unable to locate a login
>>>> configuration occurred when trying to find JAAS configuration.
>>>> 2013-02-03 13:34:09,254 INFO 819062722@qtp-651528505-7-SendThread(
>>>> hadoop-standalone.soc.net:2181) ZooKeeperSaslClient - Client will not
>>>> SASL-authenticate because the default JAAS configuration section 'Client'
>>>> could not be found. If you are not using SASL, you may ignore this. On the
>>>> other hand, if you expected SASL to work, please fix your JAAS
>>>> configuration.
>>>> 2013-02-03 13:34:09,254 INFO 819062722@qtp-651528505-7-SendThread(
>>>> hadoop-standalone.soc.net:2181) ClientCnxn - Socket connection
>>>> established to hadoop-standalone.soc.net/192.168.150.254:2181,
>>>> initiating session
>>>> 2013-02-03 13:34:09,264 INFO 819062722@qtp-651528505-7-SendThread(
>>>> hadoop-standalone.soc.net:2181) ClientCnxn - Session establishment
>>>> complete on server hadoop-standalone.soc.net/192.168.150.254:2181,
>>>> sessionid = 0x13c9adf3ab20075, negotiated timeout = 40000
>>>> 2013-02-03 13:34:09,294 WARN 819062722@qtp-651528505-7 Configuration -
>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>> 2013-02-03 13:34:12,428 WARN 2030128673@qtp-651528505-4HConnectionManager$HConnectionImplementation - Encountered problems when
>>>> prefetch META table:
>>>> org.apache.hadoop.hbase.TableNotFoundException: Cannot find row in
>>>> .META. for table: null, row=null,,99999999999999
>>>>         at
>>>> org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:158)
>>>>         at
>>>> org.apache.hadoop.hbase.client.MetaScanner.access$000(MetaScanner.java:52)
>>>>         at
>>>> org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:130)
>>>>         at
>>>> org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:127)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HConnectionManager.execute(HConnectionManager.java:360)
>>>>         at
>>>> org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:127)
>>>>         at
>>>> org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:103)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:876)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:930)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:818)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:782)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:259)
>>>>         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:223)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HTableFactory.createHTableInterface(HTableFactory.java:36)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HTablePool.createHTable(HTablePool.java:268)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HTablePool.findOrCreateTable(HTablePool.java:198)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:173)
>>>>         at
>>>> org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getFamilyNames(ChukwaHBaseStore.java:106)
>>>>         at
>>>> org.apache.hadoop.chukwa.hicc.rest.MetricsController.getFamilies(MetricsController.java:137)
>>>>         at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
>>>>
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>          at
>>>> com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
>>>>         at
>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
>>>>
>>>>
>>>> What do you think about this?
>>>>
>>>>
>>>> On Sun, Feb 3, 2013 at 1:33 PM, Eric Yang <er...@gmail.com> wrote:
>>>>
>>>>> Hicc is unable to connect to HDFS on hadoop-standalone.soc.net:8020.
>>>>> Is the configuration correct?  Make sure port 8020 is not blocked by a
>>>>> firewall.
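>>>>>
>>>>> A quick sanity check from the hicc host (hostname and port taken from
>>>>> your log above) would be something like:
>>>>>
>>>>>   telnet hadoop-standalone.soc.net 8020
>>>>>   hdfs dfs -ls hdfs://hadoop-standalone.soc.net:8020/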
>>>>>
>>>>> The second error seems to be in hbase-site.xml, where the HBase master
>>>>> hostname has non-alphanumeric characters garbled into it.
>>>>>
>>>>> ؟�2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
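>>>>>
>>>>> One way to see what the client is actually picking up (the znode path
>>>>> below is the HBase default, and zkCli.sh ships with ZooKeeper; both are
>>>>> assumptions about your install) is to read the master address straight
>>>>> out of ZooKeeper:
>>>>>
>>>>>   echo "get /hbase/master" | zkCli.sh -server hadoop-standalone:2181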
>>>>>
>>>>> regards,
>>>>> Eric
>>>>>
>>>>> On Sat, Feb 2, 2013 at 11:29 PM, Farrokh Shahriari <
>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>
>>>>>> And also, when I checked this URL:
>>>>>> http://machine:4080/hicc/jsp/graph_explorer.jsp, I got this error:
>>>>>>
>>>>>> 2013-02-03 11:04:42,349 INFO 616991384@qtp-220467482-7 /hicc - jsp:
>>>>>> init
>>>>>> 2013-02-03 11:04:42,630 INFO 616991384@qtp-220467482-7 ZooKeeper -
>>>>>> Initiating client connection, connectString=hadoop-standalone:2181
>>>>>> sessionTimeout=180000 watcher=hconnection
>>>>>> 2013-02-03 11:04:42,631 INFO 616991384@qtp-220467482-7-SendThread()
>>>>>> ClientCnxn - Opening socket connection to server hadoop-standalone/
>>>>>> 192.168.150.254:2181
>>>>>> 2013-02-03 11:04:42,632 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
>>>>>> ClientCnxn - Socket connection established to hadoop-standalone/
>>>>>> 192.168.150.254:2181, initiating session
>>>>>> 2013-02-03 11:04:42,677 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
>>>>>> ClientCnxn - Session establishment complete on server hadoop-standalone/
>>>>>> 192.168.150.254:2181, sessionid = 0x13c9adf3ab20060, negotiated
>>>>>> timeout = 40000
>>>>>> 2013-02-03 11:04:42,682 ERROR 616991384@qtp-220467482-7ChukwaHBaseStore - java.lang.IllegalArgumentException: Not a host:port
>>>>>> pair: ُ؟�2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
>>>>>> ,60000,1359815006323
>>>>>>         at
>>>>>> org.apache.hadoop.hbase.HServerAddress.<init>(HServerAddress.java:60)
>>>>>>         at
>>>>>> org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:63)
>>>>>>         at
>>>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:354)
>>>>>>         at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:94)
>>>>>>         at
>>>>>> org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getTableNames(ChukwaHBaseStore.java:122)
>>>>>>         at
>>>>>> org.apache.hadoop.chukwa.hicc.rest.MetricsController.getTables(MetricsController.java:125)
>>>>>>
>>>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>         at
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>         at
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>         at
>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>         at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>         at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>         at
>>>>>> javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>>>>         at
>>>>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>>>>         at
>>>>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>>>>         at
>>>>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>>>>         at
>>>>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>>>>         at
>>>>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>>>>         at
>>>>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>>>>         at
>>>>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>>>>         at
>>>>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>>>>         at
>>>>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>>>>         at
>>>>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>>>>         at
>>>>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>>>>         at
>>>>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>>>>         at
>>>>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>>>>         at
>>>>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>>>>         at
>>>>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>>>>
>>>>>> Tnx for your help
>>>>>>
>>>>>>
>>>>>> On Sun, Feb 3, 2013 at 9:04 AM, Farrokh Shahriari <
>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>
>>>>>>> This is my latest error after I ran hicc & checked it on port 4080 (in
>>>>>>> the web UI I got the message "Error in loading dashboard"), & here is
>>>>>>> hicc.log:
>>>>>>>
>>>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Client
>>>>>>> environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
>>>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Client environment:java.io.tmpdir=/tmp
>>>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Client environment:java.compiler=<NA>
>>>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Client environment:os.name=Linux
>>>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Client environment:os.arch=amd64
>>>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Client environment:os.version=2.6.32-220.el6.x86_64
>>>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Client environment:user.name=root
>>>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Client environment:user.home=/root
>>>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Client environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
>>>>>>> 2013-02-03 08:56:53,940 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>>>> Initiating client connection, connectString=hadoop-standalone:2181
>>>>>>> sessionTimeout=180000 watcher=hconnection
>>>>>>> 2013-02-03 08:56:53,946 INFO 127861719@qtp-1979873666-7-SendThread()
>>>>>>> ClientCnxn - Opening socket connection to server hadoop-standalone/
>>>>>>> 192.168.150.254:2181
>>>>>>> 2013-02-03 08:56:53,947 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>>>>>>> ClientCnxn - Socket connection established to hadoop-standalone/
>>>>>>> 192.168.150.254:2181, initiating session
>>>>>>> 2013-02-03 08:56:53,964 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>>>>>>> ClientCnxn - Session establishment complete on server hadoop-standalone/
>>>>>>> 192.168.150.254:2181, sessionid = 0x13c9adf3ab2003d, negotiated
>>>>>>> timeout = 40000
>>>>>>> 2013-02-03 08:56:55,168 INFO 1152423575@qtp-1979873666-6ChukwaConfiguration - chukwaConf is
>>>>>>> /etc/Chukwa/chukwa-incubating-0.5.0/etc/chukwa
>>>>>>> 2013-02-03 08:56:55,335 ERROR 127861719@qtp-1979873666-7 ViewStore
>>>>>>> - java.io.IOException: Call to
>>>>>>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>>>>>>> exception: java.io.IOException: Broken pipe
>>>>>>>         at
>>>>>>> org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>>>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>>>>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>>>>>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>>>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>>>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>>>>>>         at
>>>>>>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>>>>>>         at
>>>>>>> org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>>>>>>         at
>>>>>>> org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>>>>>>         at
>>>>>>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>>>>>>         at
>>>>>>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>>>>>>         at
>>>>>>> org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>>>>>         at
>>>>>>> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>>>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>>>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>>>>>>         at
>>>>>>> org.apache.hadoop.chukwa.datastore.ViewStore.load(ViewStore.java:74)
>>>>>>>         at
>>>>>>> org.apache.hadoop.chukwa.datastore.ViewStore.<init>(ViewStore.java:61)
>>>>>>>         at
>>>>>>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getView(ViewResource.java:52)
>>>>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>>>>>>> Method)
>>>>>>>         at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>>         at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>         at
>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>         at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>         at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>         at
>>>>>>> javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>>>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>>>>>         at
>>>>>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>>>>>         at
>>>>>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>>>>> Caused by: java.io.IOException: Broken pipe
>>>>>>>         at sun.nio.ch.FileDispatcher.write0(Native Method)
>>>>>>>         at
>>>>>>> sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:29)
>>>>>>>         at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:69)
>>>>>>>         at sun.nio.ch.IOUtil.write(IOUtil.java:40)
>>>>>>>         at
>>>>>>> sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:336)
>>>>>>>         at
>>>>>>> org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
>>>>>>>         at
>>>>>>> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
>>>>>>>         at
>>>>>>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
>>>>>>>         at
>>>>>>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
>>>>>>>         at
>>>>>>> java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
>>>>>>>         at
>>>>>>> java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
>>>>>>>         at java.io.DataOutputStream.flush(DataOutputStream.java:106)
>>>>>>>         at
>>>>>>> org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:779)
>>>>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1047)
>>>>>>>         ... 52 more
>>>>>>>
>>>>>>> 2013-02-03 08:56:55,335 ERROR 1152423575@qtp-1979873666-6 ViewStore
>>>>>>> - java.io.IOException: Call to
>>>>>>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>>>>>>> exception: java.io.IOException: Broken pipe
>>>>>>>         at
>>>>>>> org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>>>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>>>>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>>>>>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>>>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>>>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>>>>>>         at
>>>>>>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>>>>>>         at
>>>>>>> org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>>>>>>         at
>>>>>>> org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>>>>>>         at
>>>>>>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>>>>>>         at
>>>>>>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>>>>>>         at
>>>>>>> org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>>>>>         at
>>>>>>> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>>>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>>>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>>>>>>         at
>>>>>>> org.apache.hadoop.chukwa.datastore.ViewStore.list(ViewStore.java:208)
>>>>>>>         at
>>>>>>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getUserViewList(ViewResource.java:158)
>>>>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>>>>>>> Method)
>>>>>>>         at
>>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>>         at
>>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>>         at
>>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>>         at
>>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>>         at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>>         at
>>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>>         at
>>>>>>> javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>>>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>>>>>         at
>>>>>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>>>>>         at
>>>>>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>>>>>         at
>>>>>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>>>>> Caused by: java.io.IOException: Broken pipe
>>>>>>>  ... ...
>>>>>>>
>>>>>>>
>>>>>>> On Sat, Feb 2, 2013 at 10:29 PM, Eric Yang <er...@gmail.com>wrote:
>>>>>>>
>>>>>>>> Yes, if the hadoop/hbase/zookeeper jar files are packaged in
>>>>>>>> hicc.war, then you should replace those too.  But I am not sure that was
>>>>>>>> the source of the problem.  Can you show more of the stack trace so we
>>>>>>>> can determine the problem?  This looks like a configuration property is
>>>>>>>> missing.  I am not sure if it is HDFS, HBase, or ZooKeeper related.
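>>>>>>>>
>>>>>>>> For example (run from the directory that contains hicc.war; the jar
>>>>>>>> names and layout are illustrative, not taken from your install), the
>>>>>>>> bundled client jars can be inspected and swapped with the standard jar
>>>>>>>> tool:
>>>>>>>>
>>>>>>>>   jar tf hicc.war | grep -E 'hadoop|hbase|zookeeper'
>>>>>>>>   mkdir hicc-tmp && cd hicc-tmp && jar xf ../hicc.war
>>>>>>>>   # swap the old jars under WEB-INF/lib/ for your versions, then repack:
>>>>>>>>   jar cf ../hicc.war .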
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> Eric
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sat, Feb 2, 2013 at 10:50 AM, Farrokh Shahriari <
>>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Yeah, you were right, I should have updated zookeeper.jar.
>>>>>>>>> Now there is another problem: when I run chukwa hicc, I get this
>>>>>>>>> error:
>>>>>>>>>
>>>>>>>>> java.lang.IllegalArgumentException: Not a host:port pair:
>>>>>>>>>
>>>>>>>>> I read somewhere that the HBase jar files should be updated (I copied
>>>>>>>>> my HBase jar files to share/chukwa/lib/), but I still have the problem.
>>>>>>>>> Should I change what is inside hicc.war too?
>>>>>>>>>
>>>>>>>>> Tnx
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com>wrote:
>>>>>>>>>
>>>>>>>>>> Make sure you also update the HBase and ZooKeeper jar files to
>>>>>>>>>> your versions.
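>>>>>>>>>>
>>>>>>>>>> For example (the CDH package paths below are assumptions; adjust to
>>>>>>>>>> wherever your HBase and ZooKeeper jars live):
>>>>>>>>>>
>>>>>>>>>>   cp /usr/lib/hbase/hbase-*.jar /usr/lib/zookeeper/zookeeper-*.jar \
>>>>>>>>>>      chukwa-0.5.0/share/chukwa/lib/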
>>>>>>>>>>
>>>>>>>>>> regards,
>>>>>>>>>> Eric
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
>>>>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Thanks Eric,
>>>>>>>>>>> but my Chukwa classpath is this:
>>>>>>>>>>>
>>>>>>>>>>> export
>>>>>>>>>>> CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>>>>>>>>>>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>>>>>>>>>>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>>>>>>>>>>
>>>>>>>>>>> And I've deleted hadoop-core-1.0.0.jar and hadoop-test-1.0.0.jar
>>>>>>>>>>> from "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but I still
>>>>>>>>>>> get errors.
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com>wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Are there multiple versions of the Hadoop jar files in the
>>>>>>>>>>>> classpath?  This error looks like the HDFS client is from Hadoop
>>>>>>>>>>>> 1.x.  If there is an older hadoop-core*.jar file on the classpath,
>>>>>>>>>>>> it can generate this error.
>>>>>>>>>>>>
>>>>>>>>>>>> regards,
>>>>>>>>>>>> Eric
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>>>>>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Hi there,
>>>>>>>>>>>>> I downloaded & installed Chukwa 0.5 on Hadoop version 2 (CDH
>>>>>>>>>>>>> 4.0.0). But when the collector runs, it shows this error:
>>>>>>>>>>>>> Server IPC version 7 cannot communicate with client version 4
>>>>>>>>>>>>>
>>>>>>>>>>>>> I copied the jars from /usr/lib/hadoop/*.jar &
>>>>>>>>>>>>> /usr/lib/hadoop-hdfs/*.jar, but it didn't help.
>>>>>>>>>>>>>
>>>>>>>>>>>>> I'd be glad if someone can help me.
>>>>>>>>>>>>> Tnx
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Eric Yang <er...@gmail.com>.
Some of the logs are disabled by default because we started on Hadoop 0.16
and haven't updated the logging configuration for Hadoop 1.x.

Refer to the programming guide on how to stream your custom logs to HBase:

http://incubator.apache.org/chukwa/docs/r0.5.0/programming.html

regards,
Eric

>>>>>>         at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>>         at
>>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>>         at
>>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>>         at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>>         at
>>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>>         at
>>>>>> javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>>>>         at
>>>>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>>>>         at
>>>>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>>>>         at
>>>>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>>>>         at
>>>>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>>>>         at
>>>>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>>>>         at
>>>>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>>>>         at
>>>>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>>>>         at
>>>>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>>>>         at
>>>>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>>>>         at
>>>>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>>>>         at
>>>>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>>>>         at
>>>>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>>>>         at
>>>>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>>>>         at
>>>>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>>>>         at
>>>>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>>>> Caused by: java.io.IOException: Broken pipe
>>>>>>  ... ...
>>>>>>
>>>>>>
>>>>>> On Sat, Feb 2, 2013 at 10:29 PM, Eric Yang <er...@gmail.com> wrote:
>>>>>>
>>>>>>> Yes, if the hadoop/hbase/zookeeper jar files are packaged in
>>>>>>> hicc.war, then you should replace those too.  But I am not sure if that was
>>>>>>> the source of the problem.  Can you show more of the stack trace to
>>>>>>> determine the problem.  This looks like a configuration property is
>>>>>>> missing.  I am not sure if it is hdfs, hbase, or zookeeper related.
>>>>>>>
>>>>>>> regards,
>>>>>>> Eric
>>>>>>>
>>>>>>>
>>>>>>> On Sat, Feb 2, 2013 at 10:50 AM, Farrokh Shahriari <
>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>
>>>>>>>> Yeah,you were right,I should've update zookeeper.jar.
>>>>>>>> Now there is another probelm,when I've run chukwa hicc,I've got
>>>>>>>> this error :
>>>>>>>>
>>>>>>>> java.lang.IllegalArgumentException: Not a host: port pair:
>>>>>>>>
>>>>>>>> I read in a place that the hbase jar files should be updated ( I
>>>>>>>> copied my hbase jar files to share/chukwa/lib/ ),but still have problem,
>>>>>>>> should I change the inside of hicc.war too ?
>>>>>>>>
>>>>>>>> Tnx
>>>>>>>>
>>>>>>>>
>>>>>>>> On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> Make sure you also update HBase jar file and ZooKeeper jar files
>>>>>>>>> to your versions.
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>> Eric
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
>>>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Tnx Eric,
>>>>>>>>>> but my chukwa classpath is this :
>>>>>>>>>>
>>>>>>>>>> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>>>>>>>>>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>>>>>>>>>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>>>>>>>>>
>>>>>>>>>> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar
>>>>>>>>>> from "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've
>>>>>>>>>> got errors.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com>wrote:
>>>>>>>>>>
>>>>>>>>>>> Is there multiple version of hadoop jar files in the class path?
>>>>>>>>>>>  This error looks like hdfs client is from Hadoop 1.x.  If there is older
>>>>>>>>>>> version of hadoop-core*.jar file, it can generate this error.
>>>>>>>>>>>
>>>>>>>>>>> regards,
>>>>>>>>>>> Eric
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>>>>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi there,
>>>>>>>>>>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh
>>>>>>>>>>>> 4.0.0).But when collector runs,it has showed this error :
>>>>>>>>>>>> Server IPC version 7 cannot communicate with client version 4
>>>>>>>>>>>>
>>>>>>>>>>>> I copied lib from /user/lib/hadoop/*.jar &
>>>>>>>>>>>> /user/lib/hadoop-hdfs/*.jar, but couldn't get result.
>>>>>>>>>>>>
>>>>>>>>>>>> I'd be glad if someone can help me.
>>>>>>>>>>>> Tnx
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Farrokh Shahriari <mo...@gmail.com>.
Tnx a lot Eric. Yeah, you were right, I forgot to insert this code in
chukwa-collector-conf.xml:
<property>
 <name>chukwaCollector.writerClass</name>
 <value>org.apache.hadoop.chukwa.datacollection.writer.hbase.HBaseWriter</value>
</property>
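
(An alternative that the 0.5 admin guide seems to describe is to go through the pipelined writer
instead; assuming the stock class names, that would look roughly like this:

<property>
 <name>chukwaCollector.writerClass</name>
 <value>org.apache.hadoop.chukwa.datacollection.writer.PipelineStageWriter</value>
</property>
<property>
 <name>chukwaCollector.pipeline</name>
 <value>org.apache.hadoop.chukwa.datacollection.writer.SocketTeeWriter,org.apache.hadoop.chukwa.datacollection.writer.hbase.HBaseWriter</value>
</property>

Either way HBaseWriter has to end up in the writer chain, otherwise nothing gets written to HBase.)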

Now I can select the SystemMetrics and chukwa columns, but not the others
like Hadoop, HadoopLog, and clustersummary. Why are these tables' columns
disabled? And what about my own table, 'myTable', which I created in HBase?
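
(The table itself was created from the hbase shell with something like the line below, where 'cf'
just stands in for the real column family name:

create 'myTable', 'cf'

so the table definitely exists, it just never shows up as selectable in the graph explorer.)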


On Sun, Feb 3, 2013 at 10:58 PM, Eric Yang <er...@gmail.com> wrote:

> There are two possibilities.  First, HBase is not configured properly to
> the instance of ZooKeeper that is storing the table information.
>
> Another possibility is that hicc is not configured with HBASE_CONF_DIR to
> access correct ZooKeeper instance described in hbase-site.xml.
>
> After you solved the first problem, make sure that Chukwa hbase schema is
> populated.  This is done with:
>
>  hbase shell < $CHUKWA_CONF_DIR/hbase.schema
>
> This step is required to create Chukwa table on HBase.
>
> regards,
> Eric
>
>
> On Sun, Feb 3, 2013 at 2:12 AM, Farrokh Shahriari <
> mohandes.zebeleh.67@gmail.com> wrote:
>
>> Tnx for your answer,
>> By your help I can now run hicc & webui, now in the graph_explorer,It
>> shows my table in hbase, but I can't select any of them means I can't
>> select any columnFamily of them,
>> the hicc.log says :
>>
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Client
>> environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Client environment:java.io.tmpdir=/tmp
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Client environment:java.compiler=<NA>
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Client environment:os.name=Linux
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Client environment:os.arch=amd64
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Client environment:os.version=2.6.32-220.el6.x86_64
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Client environment:user.name=root
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Client environment:user.home=/root
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Client environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
>> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
>> Initiating client connection, connectString=hadoop-standalone:2181
>> sessionTimeout=180000 watcher=hconnection
>> 2013-02-03 13:34:09,250 INFO 819062722@qtp-651528505-7-SendThread()
>> ClientCnxn - Opening socket connection to server /192.168.150.254:2181
>> 2013-02-03 13:34:09,250 INFO 819062722@qtp-651528505-7RecoverableZooKeeper - The identifier of this process is
>> 9290@hadoop-standalone.soc.net
>> 2013-02-03 13:34:09,254 WARN 819062722@qtp-651528505-7-SendThread(
>> hadoop-standalone.soc.net:2181) ZooKeeperSaslClient - SecurityException:
>> java.lang.SecurityException: Unable to locate a login configuration
>> occurred when trying to find JAAS configuration.
>> 2013-02-03 13:34:09,254 INFO 819062722@qtp-651528505-7-SendThread(
>> hadoop-standalone.soc.net:2181) ZooKeeperSaslClient - Client will not
>> SASL-authenticate because the default JAAS configuration section 'Client'
>> could not be found. If you are not using SASL, you may ignore this. On the
>> other hand, if you expected SASL to work, please fix your JAAS
>> configuration.
>> 2013-02-03 13:34:09,254 INFO 819062722@qtp-651528505-7-SendThread(
>> hadoop-standalone.soc.net:2181) ClientCnxn - Socket connection
>> established to hadoop-standalone.soc.net/192.168.150.254:2181,
>> initiating session
>> 2013-02-03 13:34:09,264 INFO 819062722@qtp-651528505-7-SendThread(
>> hadoop-standalone.soc.net:2181) ClientCnxn - Session establishment
>> complete on server hadoop-standalone.soc.net/192.168.150.254:2181,
>> sessionid = 0x13c9adf3ab20075, negotiated timeout = 40000
>> 2013-02-03 13:34:09,294 WARN 819062722@qtp-651528505-7 Configuration -
>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>> 2013-02-03 13:34:12,428 WARN 2030128673@qtp-651528505-4HConnectionManager$HConnectionImplementation - Encountered problems when
>> prefetch META table:
>> org.apache.hadoop.hbase.TableNotFoundException: Cannot find row in .META.
>> for table: null, row=null,,99999999999999
>>         at
>> org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:158)
>>         at
>> org.apache.hadoop.hbase.client.MetaScanner.access$000(MetaScanner.java:52)
>>         at
>> org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:130)
>>         at
>> org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:127)
>>         at
>> org.apache.hadoop.hbase.client.HConnectionManager.execute(HConnectionManager.java:360)
>>         at
>> org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:127)
>>         at
>> org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:103)
>>         at
>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:876)
>>         at
>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:930)
>>         at
>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:818)
>>         at
>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:782)
>>         at
>> org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:259)
>>         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:223)
>>         at
>> org.apache.hadoop.hbase.client.HTableFactory.createHTableInterface(HTableFactory.java:36)
>>         at
>> org.apache.hadoop.hbase.client.HTablePool.createHTable(HTablePool.java:268)
>>         at
>> org.apache.hadoop.hbase.client.HTablePool.findOrCreateTable(HTablePool.java:198)
>>         at
>> org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:173)
>>         at
>> org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getFamilyNames(ChukwaHBaseStore.java:106)
>>         at
>> org.apache.hadoop.chukwa.hicc.rest.MetricsController.getFamilies(MetricsController.java:137)
>>         at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
>>
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>          at
>> com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
>>         at
>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
>>
>>
>> what do you think about this ?
>>
>>
>> On Sun, Feb 3, 2013 at 1:33 PM, Eric Yang <er...@gmail.com> wrote:
>>
>>> Hicc is unable to connect to HDFS on hadoop-standalone.soc.net:8020.
>>>  Is the configuration correct?  Make sure port 8020 is not blocked by
>>> firewall.
>>>
>>> Second error seems to be in hbase-site.xml, where the hbase master
>>> hostname has non-alpha numeric characters garbled in the hostname.
>>>
>>> ؟�2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
>>>
>>> regards,
>>> Eric
>>>
>>> On Sat, Feb 2, 2013 at 11:29 PM, Farrokh Shahriari <
>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>
>>>> And I also when I checked this url :
>>>> http://machine:4080/hicc/jsp/graph_explorer.jsp,I've got this error :
>>>>
>>>> 2013-02-03 11:04:42,349 INFO 616991384@qtp-220467482-7 /hicc - jsp:
>>>> init
>>>> 2013-02-03 11:04:42,630 INFO 616991384@qtp-220467482-7 ZooKeeper -
>>>> Initiating client connection, connectString=hadoop-standalone:2181
>>>> sessionTimeout=180000 watcher=hconnection
>>>> 2013-02-03 11:04:42,631 INFO 616991384@qtp-220467482-7-SendThread()
>>>> ClientCnxn - Opening socket connection to server hadoop-standalone/
>>>> 192.168.150.254:2181
>>>> 2013-02-03 11:04:42,632 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
>>>> ClientCnxn - Socket connection established to hadoop-standalone/
>>>> 192.168.150.254:2181, initiating session
>>>> 2013-02-03 11:04:42,677 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
>>>> ClientCnxn - Session establishment complete on server hadoop-standalone/
>>>> 192.168.150.254:2181, sessionid = 0x13c9adf3ab20060, negotiated
>>>> timeout = 40000
>>>> 2013-02-03 11:04:42,682 ERROR 616991384@qtp-220467482-7ChukwaHBaseStore - java.lang.IllegalArgumentException: Not a host:port
>>>> pair: ُ؟�2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
>>>> ,60000,1359815006323
>>>>         at
>>>> org.apache.hadoop.hbase.HServerAddress.<init>(HServerAddress.java:60)
>>>>         at
>>>> org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:63)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:354)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:94)
>>>>         at
>>>> org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getTableNames(ChukwaHBaseStore.java:122)
>>>>         at
>>>> org.apache.hadoop.chukwa.hicc.rest.MetricsController.getTables(MetricsController.java:125)
>>>>
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>         at
>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>         at
>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>         at
>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>         at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>         at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>>         at
>>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>>         at
>>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>>         at
>>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>>         at
>>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>>         at
>>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>>         at
>>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>>         at
>>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>>         at
>>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>>         at
>>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>>         at
>>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>>         at
>>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>>         at
>>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>>         at
>>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>>         at
>>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>>         at
>>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>>
>>>> Tnx for your help
>>>>
>>>>
>>>> On Sun, Feb 3, 2013 at 9:04 AM, Farrokh Shahriari <
>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>
>>>>> This is my last error after i ran hicc & check it on port 4080 ( in
>>>>> web ui I got this message : Error in loading dashboard ) , & here is
>>>>> hicc.log :
>>>>>
>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Client
>>>>> environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Client environment:java.io.tmpdir=/tmp
>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Client environment:java.compiler=<NA>
>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Client environment:os.name=Linux
>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Client environment:os.arch=amd64
>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Client environment:os.version=2.6.32-220.el6.x86_64
>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Client environment:user.name=root
>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Client environment:user.home=/root
>>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Client environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
>>>>> 2013-02-03 08:56:53,940 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>>> Initiating client connection, connectString=hadoop-standalone:2181
>>>>> sessionTimeout=180000 watcher=hconnection
>>>>> 2013-02-03 08:56:53,946 INFO 127861719@qtp-1979873666-7-SendThread()
>>>>> ClientCnxn - Opening socket connection to server hadoop-standalone/
>>>>> 192.168.150.254:2181
>>>>> 2013-02-03 08:56:53,947 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>>>>> ClientCnxn - Socket connection established to hadoop-standalone/
>>>>> 192.168.150.254:2181, initiating session
>>>>> 2013-02-03 08:56:53,964 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>>>>> ClientCnxn - Session establishment complete on server hadoop-standalone/
>>>>> 192.168.150.254:2181, sessionid = 0x13c9adf3ab2003d, negotiated
>>>>> timeout = 40000
>>>>> 2013-02-03 08:56:55,168 INFO 1152423575@qtp-1979873666-6ChukwaConfiguration - chukwaConf is
>>>>> /etc/Chukwa/chukwa-incubating-0.5.0/etc/chukwa
>>>>> 2013-02-03 08:56:55,335 ERROR 127861719@qtp-1979873666-7 ViewStore -
>>>>> java.io.IOException: Call to
>>>>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>>>>> exception: java.io.IOException: Broken pipe
>>>>>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>>>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>>>>         at
>>>>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>>>>         at
>>>>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>>>>         at
>>>>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>>>>         at
>>>>> org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>>>         at
>>>>> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>>>>         at
>>>>> org.apache.hadoop.chukwa.datastore.ViewStore.load(ViewStore.java:74)
>>>>>         at
>>>>> org.apache.hadoop.chukwa.datastore.ViewStore.<init>(ViewStore.java:61)
>>>>>         at
>>>>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getView(ViewResource.java:52)
>>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>         at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>         at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>         at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>         at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>         at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>         at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>         at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>         at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>         at
>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>         at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>         at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>>>         at
>>>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>>>         at
>>>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>>>         at
>>>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>>>         at
>>>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>>>         at
>>>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>>>         at
>>>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>>>         at
>>>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>>>         at
>>>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>>>         at
>>>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>>>         at
>>>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>>>         at
>>>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>>>         at
>>>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>>>         at
>>>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>>>         at
>>>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>>>         at
>>>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>>> Caused by: java.io.IOException: Broken pipe
>>>>>         at sun.nio.ch.FileDispatcher.write0(Native Method)
>>>>>         at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:29)
>>>>>         at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:69)
>>>>>         at sun.nio.ch.IOUtil.write(IOUtil.java:40)
>>>>>         at
>>>>> sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:336)
>>>>>         at
>>>>> org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
>>>>>         at
>>>>> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
>>>>>         at
>>>>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
>>>>>         at
>>>>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
>>>>>         at
>>>>> java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
>>>>>         at
>>>>> java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
>>>>>         at java.io.DataOutputStream.flush(DataOutputStream.java:106)
>>>>>         at
>>>>> org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:779)
>>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1047)
>>>>>         ... 52 more
>>>>>
>>>>> 2013-02-03 08:56:55,335 ERROR 1152423575@qtp-1979873666-6 ViewStore -
>>>>> java.io.IOException: Call to
>>>>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>>>>> exception: java.io.IOException: Broken pipe
>>>>>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>>>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>>>>         at
>>>>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>>>>         at
>>>>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>>>>         at
>>>>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>>>>         at
>>>>> org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>>>         at
>>>>> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>>>>         at
>>>>> org.apache.hadoop.chukwa.datastore.ViewStore.list(ViewStore.java:208)
>>>>>         at
>>>>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getUserViewList(ViewResource.java:158)
>>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>         at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>         at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>         at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>>         at
>>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>>         at
>>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>>         at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>>         at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>>         at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>>         at
>>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>>         at
>>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>>         at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>>         at
>>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>>>         at
>>>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>>>         at
>>>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>>>         at
>>>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>>>         at
>>>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>>>         at
>>>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>>>         at
>>>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>>>         at
>>>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>>>         at
>>>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>>>         at
>>>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>>>         at
>>>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>>>         at
>>>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>>>         at
>>>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>>>         at
>>>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>>>         at
>>>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>>>         at
>>>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>>> Caused by: java.io.IOException: Broken pipe
>>>>>  ... ...
>>>>>
>>>>>
>>>>> On Sat, Feb 2, 2013 at 10:29 PM, Eric Yang <er...@gmail.com> wrote:
>>>>>
>>>>>> Yes, if the hadoop/hbase/zookeeper jar files are packaged in
>>>>>> hicc.war, then you should replace those too.  But I am not sure if that was
>>>>>> the source of the problem.  Can you show more of the stack trace to
>>>>>> determine the problem.  This looks like a configuration property is
>>>>>> missing.  I am not sure if it is hdfs, hbase, or zookeeper related.
>>>>>>
>>>>>> regards,
>>>>>> Eric
>>>>>>
>>>>>>
>>>>>> On Sat, Feb 2, 2013 at 10:50 AM, Farrokh Shahriari <
>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>
>>>>>>> Yeah,you were right,I should've update zookeeper.jar.
>>>>>>> Now there is another probelm,when I've run chukwa hicc,I've got this
>>>>>>> error :
>>>>>>>
>>>>>>> java.lang.IllegalArgumentException: Not a host: port pair:
>>>>>>>
>>>>>>> I read in a place that the hbase jar files should be updated ( I
>>>>>>> copied my hbase jar files to share/chukwa/lib/ ),but still have problem,
>>>>>>> should I change the inside of hicc.war too ?
>>>>>>>
>>>>>>> Tnx
>>>>>>>
>>>>>>>
>>>>>>> On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com> wrote:
>>>>>>>
>>>>>>>> Make sure you also update HBase jar file and ZooKeeper jar files to
>>>>>>>> your versions.
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> Eric
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
>>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Tnx Eric,
>>>>>>>>> but my chukwa classpath is this :
>>>>>>>>>
>>>>>>>>> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>>>>>>>>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>>>>>>>>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>>>>>>>>
>>>>>>>>> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar
>>>>>>>>> from "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've
>>>>>>>>> got errors.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com>wrote:
>>>>>>>>>
>>>>>>>>>> Is there multiple version of hadoop jar files in the class path?
>>>>>>>>>>  This error looks like hdfs client is from Hadoop 1.x.  If there is older
>>>>>>>>>> version of hadoop-core*.jar file, it can generate this error.
>>>>>>>>>>
>>>>>>>>>> regards,
>>>>>>>>>> Eric
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>>>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi there,
>>>>>>>>>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh
>>>>>>>>>>> 4.0.0).But when collector runs,it has showed this error :
>>>>>>>>>>> Server IPC version 7 cannot communicate with client version 4
>>>>>>>>>>>
>>>>>>>>>>> I copied lib from /user/lib/hadoop/*.jar &
>>>>>>>>>>> /user/lib/hadoop-hdfs/*.jar, but couldn't get result.
>>>>>>>>>>>
>>>>>>>>>>> I'd be glad if someone can help me.
>>>>>>>>>>> Tnx
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Eric Yang <er...@gmail.com>.
There are two possibilities.  First, HBase is not configured to point at
the ZooKeeper instance that is storing the table information.

Another possibility is that hicc is not configured with HBASE_CONF_DIR to
access the correct ZooKeeper instance described in hbase-site.xml.

After you have solved the first problem, make sure that the Chukwa HBase schema is
populated.  This is done with:

 hbase shell < $CHUKWA_CONF_DIR/hbase.schema

This step is required to create the Chukwa tables in HBase.
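
For example, the hbase-site.xml that hicc picks up through HBASE_CONF_DIR should name the same
ZooKeeper quorum that the HBase master registers in (the host and path below are placeholders,
use your own values):

<property>
 <name>hbase.zookeeper.quorum</name>
 <value>hadoop-standalone</value>
</property>
<property>
 <name>hbase.zookeeper.property.clientPort</name>
 <value>2181</value>
</property>

and chukwa-env.sh should export the directory that holds that file, for example:

export HBASE_CONF_DIR=/etc/hbase/conf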

regards,
Eric

On Sun, Feb 3, 2013 at 2:12 AM, Farrokh Shahriari <
mohandes.zebeleh.67@gmail.com> wrote:

> Tnx for your answer,
> By your help I can now run hicc & webui, now in the graph_explorer,It
> shows my table in hbase, but I can't select any of them means I can't
> select any columnFamily of them,
> the hicc.log says :
>
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
> environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
> environment:java.io.tmpdir=/tmp
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
> environment:java.compiler=<NA>
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
> environment:os.name=Linux
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
> environment:os.arch=amd64
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
> environment:os.version=2.6.32-220.el6.x86_64
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
> environment:user.name=root
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
> environment:user.home=/root
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
> environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
> 2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
> Initiating client connection, connectString=hadoop-standalone:2181
> sessionTimeout=180000 watcher=hconnection
> 2013-02-03 13:34:09,250 INFO 819062722@qtp-651528505-7-SendThread()
> ClientCnxn - Opening socket connection to server /192.168.150.254:2181
> 2013-02-03 13:34:09,250 INFO 819062722@qtp-651528505-7RecoverableZooKeeper - The identifier of this process is
> 9290@hadoop-standalone.soc.net
> 2013-02-03 13:34:09,254 WARN 819062722@qtp-651528505-7-SendThread(
> hadoop-standalone.soc.net:2181) ZooKeeperSaslClient - SecurityException:
> java.lang.SecurityException: Unable to locate a login configuration
> occurred when trying to find JAAS configuration.
> 2013-02-03 13:34:09,254 INFO 819062722@qtp-651528505-7-SendThread(
> hadoop-standalone.soc.net:2181) ZooKeeperSaslClient - Client will not
> SASL-authenticate because the default JAAS configuration section 'Client'
> could not be found. If you are not using SASL, you may ignore this. On the
> other hand, if you expected SASL to work, please fix your JAAS
> configuration.
> 2013-02-03 13:34:09,254 INFO 819062722@qtp-651528505-7-SendThread(
> hadoop-standalone.soc.net:2181) ClientCnxn - Socket connection
> established to hadoop-standalone.soc.net/192.168.150.254:2181, initiating
> session
> 2013-02-03 13:34:09,264 INFO 819062722@qtp-651528505-7-SendThread(
> hadoop-standalone.soc.net:2181) ClientCnxn - Session establishment
> complete on server hadoop-standalone.soc.net/192.168.150.254:2181,
> sessionid = 0x13c9adf3ab20075, negotiated timeout = 40000
> 2013-02-03 13:34:09,294 WARN 819062722@qtp-651528505-7 Configuration -
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> 2013-02-03 13:34:12,428 WARN 2030128673@qtp-651528505-4HConnectionManager$HConnectionImplementation - Encountered problems when
> prefetch META table:
> org.apache.hadoop.hbase.TableNotFoundException: Cannot find row in .META.
> for table: null, row=null,,99999999999999
>         at
> org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:158)
>         at
> org.apache.hadoop.hbase.client.MetaScanner.access$000(MetaScanner.java:52)
>         at
> org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:130)
>         at
> org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:127)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager.execute(HConnectionManager.java:360)
>         at
> org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:127)
>         at
> org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:103)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:876)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:930)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:818)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:782)
>         at
> org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:259)
>         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:223)
>         at
> org.apache.hadoop.hbase.client.HTableFactory.createHTableInterface(HTableFactory.java:36)
>         at
> org.apache.hadoop.hbase.client.HTablePool.createHTable(HTablePool.java:268)
>         at
> org.apache.hadoop.hbase.client.HTablePool.findOrCreateTable(HTablePool.java:198)
>         at
> org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:173)
>         at
> org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getFamilyNames(ChukwaHBaseStore.java:106)
>         at
> org.apache.hadoop.chukwa.hicc.rest.MetricsController.getFamilies(MetricsController.java:137)
>         at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
>
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at
> com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
>         at
> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
>
>
> what do you think about this ?
>
>
> On Sun, Feb 3, 2013 at 1:33 PM, Eric Yang <er...@gmail.com> wrote:
>
>> Hicc is unable to connect to HDFS on hadoop-standalone.soc.net:8020.
>>  Is the configuration correct?  Make sure port 8020 is not blocked by
>> firewall.
>>
>> Second error seems to be in hbase-site.xml, where the hbase master
>> hostname has non-alpha numeric characters garbled in the hostname.
>>
>> ؟�2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
>>
>> regards,
>> Eric
>>
>> On Sat, Feb 2, 2013 at 11:29 PM, Farrokh Shahriari <
>> mohandes.zebeleh.67@gmail.com> wrote:
>>
>>> And I also when I checked this url :
>>> http://machine:4080/hicc/jsp/graph_explorer.jsp,I've got this error :
>>>
>>> 2013-02-03 11:04:42,349 INFO 616991384@qtp-220467482-7 /hicc - jsp: init
>>> 2013-02-03 11:04:42,630 INFO 616991384@qtp-220467482-7 ZooKeeper -
>>> Initiating client connection, connectString=hadoop-standalone:2181
>>> sessionTimeout=180000 watcher=hconnection
>>> 2013-02-03 11:04:42,631 INFO 616991384@qtp-220467482-7-SendThread()
>>> ClientCnxn - Opening socket connection to server hadoop-standalone/
>>> 192.168.150.254:2181
>>> 2013-02-03 11:04:42,632 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
>>> ClientCnxn - Socket connection established to hadoop-standalone/
>>> 192.168.150.254:2181, initiating session
>>> 2013-02-03 11:04:42,677 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
>>> ClientCnxn - Session establishment complete on server hadoop-standalone/
>>> 192.168.150.254:2181, sessionid = 0x13c9adf3ab20060, negotiated timeout
>>> = 40000
>>> 2013-02-03 11:04:42,682 ERROR 616991384@qtp-220467482-7ChukwaHBaseStore - java.lang.IllegalArgumentException: Not a host:port
>>> pair: ُ؟�2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
>>> ,60000,1359815006323
>>>         at
>>> org.apache.hadoop.hbase.HServerAddress.<init>(HServerAddress.java:60)
>>>         at
>>> org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:63)
>>>         at
>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:354)
>>>         at
>>> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:94)
>>>         at
>>> org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getTableNames(ChukwaHBaseStore.java:122)
>>>         at
>>> org.apache.hadoop.chukwa.hicc.rest.MetricsController.getTables(MetricsController.java:125)
>>>
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>         at
>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>         at
>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>         at
>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>         at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>         at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>         at
>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>         at
>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>         at
>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>         at
>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>         at
>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>         at
>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>         at
>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>         at
>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>         at
>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>         at
>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>         at
>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>         at
>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>         at
>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>         at
>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>         at
>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>
>>> Tnx for your help
>>>
>>>
>>> On Sun, Feb 3, 2013 at 9:04 AM, Farrokh Shahriari <
>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>
>>>> This is my last error after i ran hicc & check it on port 4080 ( in web
>>>> ui I got this message : Error in loading dashboard ) , & here is hicc.log :
>>>>
>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Client
>>>> environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Client environment:java.io.tmpdir=/tmp
>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Client environment:java.compiler=<NA>
>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Client environment:os.name=Linux
>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Client environment:os.arch=amd64
>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Client environment:os.version=2.6.32-220.el6.x86_64
>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Client environment:user.name=root
>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Client environment:user.home=/root
>>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Client environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
>>>> 2013-02-03 08:56:53,940 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>>> Initiating client connection, connectString=hadoop-standalone:2181
>>>> sessionTimeout=180000 watcher=hconnection
>>>> 2013-02-03 08:56:53,946 INFO 127861719@qtp-1979873666-7-SendThread()
>>>> ClientCnxn - Opening socket connection to server hadoop-standalone/
>>>> 192.168.150.254:2181
>>>> 2013-02-03 08:56:53,947 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>>>> ClientCnxn - Socket connection established to hadoop-standalone/
>>>> 192.168.150.254:2181, initiating session
>>>> 2013-02-03 08:56:53,964 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>>>> ClientCnxn - Session establishment complete on server hadoop-standalone/
>>>> 192.168.150.254:2181, sessionid = 0x13c9adf3ab2003d, negotiated
>>>> timeout = 40000
>>>> 2013-02-03 08:56:55,168 INFO 1152423575@qtp-1979873666-6ChukwaConfiguration - chukwaConf is
>>>> /etc/Chukwa/chukwa-incubating-0.5.0/etc/chukwa
>>>> 2013-02-03 08:56:55,335 ERROR 127861719@qtp-1979873666-7 ViewStore -
>>>> java.io.IOException: Call to
>>>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>>>> exception: java.io.IOException: Broken pipe
>>>>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>>>         at
>>>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>>>         at
>>>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>>>         at
>>>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>>>         at
>>>> org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>>         at
>>>> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>>>         at
>>>> org.apache.hadoop.chukwa.datastore.ViewStore.load(ViewStore.java:74)
>>>>         at
>>>> org.apache.hadoop.chukwa.datastore.ViewStore.<init>(ViewStore.java:61)
>>>>         at
>>>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getView(ViewResource.java:52)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>         at
>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>         at
>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>         at
>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>         at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>         at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>>         at
>>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>>         at
>>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>>         at
>>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>>         at
>>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>>         at
>>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>>         at
>>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>>         at
>>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>>         at
>>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>>         at
>>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>>         at
>>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>>         at
>>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>>         at
>>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>>         at
>>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>>         at
>>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>>         at
>>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>> Caused by: java.io.IOException: Broken pipe
>>>>         at sun.nio.ch.FileDispatcher.write0(Native Method)
>>>>         at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:29)
>>>>         at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:69)
>>>>         at sun.nio.ch.IOUtil.write(IOUtil.java:40)
>>>>         at
>>>> sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:336)
>>>>         at
>>>> org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
>>>>         at
>>>> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
>>>>         at
>>>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
>>>>         at
>>>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
>>>>         at
>>>> java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
>>>>         at
>>>> java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
>>>>         at java.io.DataOutputStream.flush(DataOutputStream.java:106)
>>>>         at
>>>> org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:779)
>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1047)
>>>>         ... 52 more
>>>>
>>>> 2013-02-03 08:56:55,335 ERROR 1152423575@qtp-1979873666-6 ViewStore -
>>>> java.io.IOException: Call to
>>>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>>>> exception: java.io.IOException: Broken pipe
>>>>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>>>         at
>>>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>>>         at
>>>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>>>         at
>>>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>>>         at
>>>> org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>>         at
>>>> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>>>         at
>>>> org.apache.hadoop.chukwa.datastore.ViewStore.list(ViewStore.java:208)
>>>>         at
>>>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getUserViewList(ViewResource.java:158)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>         at
>>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>>         at
>>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>>         at
>>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>>         at
>>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>>         at
>>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>>         at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>>         at
>>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>>         at
>>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>>         at
>>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>>         at
>>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>>         at
>>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>>         at
>>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>>         at
>>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>>         at
>>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>>         at
>>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>>         at
>>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>>         at
>>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>>         at
>>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>>         at
>>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>>         at
>>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>>         at
>>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>>         at
>>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>>> Caused by: java.io.IOException: Broken pipe
>>>>  ... ...
>>>>
>>>>
>>>> On Sat, Feb 2, 2013 at 10:29 PM, Eric Yang <er...@gmail.com> wrote:
>>>>
>>>>> Yes, if the hadoop/hbase/zookeeper jar files are packaged in hicc.war,
>>>>> then you should replace those too.  But I am not sure if that was the
>>>>> source of the problem.  Can you show more of the stack trace to determine
>>>>> the problem.  This looks like a configuration property is missing.  I am
>>>>> not sure if it is hdfs, hbase, or zookeeper related.
>>>>>
>>>>> regards,
>>>>> Eric
>>>>>
>>>>>
>>>>> On Sat, Feb 2, 2013 at 10:50 AM, Farrokh Shahriari <
>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>
>>>>>> Yeah,you were right,I should've update zookeeper.jar.
>>>>>> Now there is another probelm,when I've run chukwa hicc,I've got this
>>>>>> error :
>>>>>>
>>>>>> java.lang.IllegalArgumentException: Not a host: port pair:
>>>>>>
>>>>>> I read in a place that the hbase jar files should be updated ( I
>>>>>> copied my hbase jar files to share/chukwa/lib/ ),but still have problem,
>>>>>> should I change the inside of hicc.war too ?
>>>>>>
>>>>>> Tnx
>>>>>>
>>>>>>
>>>>>> On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com> wrote:
>>>>>>
>>>>>>> Make sure you also update HBase jar file and ZooKeeper jar files to
>>>>>>> your versions.
>>>>>>>
>>>>>>> regards,
>>>>>>> Eric
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>
>>>>>>>> Tnx Eric,
>>>>>>>> but my chukwa classpath is this :
>>>>>>>>
>>>>>>>> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>>>>>>>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>>>>>>>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>>>>>>>
>>>>>>>> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar
>>>>>>>> from "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've
>>>>>>>> got errors.
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> Is there multiple version of hadoop jar files in the class path?
>>>>>>>>>  This error looks like hdfs client is from Hadoop 1.x.  If there is older
>>>>>>>>> version of hadoop-core*.jar file, it can generate this error.
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>> Eric
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Hi there,
>>>>>>>>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh
>>>>>>>>>> 4.0.0).But when collector runs,it has showed this error :
>>>>>>>>>> Server IPC version 7 cannot communicate with client version 4
>>>>>>>>>>
>>>>>>>>>> I copied lib from /user/lib/hadoop/*.jar &
>>>>>>>>>> /user/lib/hadoop-hdfs/*.jar, but couldn't get result.
>>>>>>>>>>
>>>>>>>>>> I'd be glad if someone can help me.
>>>>>>>>>> Tnx
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Farrokh Shahriari <mo...@gmail.com>.
Tnx for your answer,
With your help I can now run hicc and the web UI. In graph_explorer it shows
my tables in HBase, but I can't select any of them, which means I can't select
any of their column families.
The hicc.log says:

2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
environment:java.io.tmpdir=/tmp
2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
environment:java.compiler=<NA>
2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
environment:os.name=Linux
2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
environment:os.arch=amd64
2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
environment:os.version=2.6.32-220.el6.x86_64
2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
environment:user.name=root
2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
environment:user.home=/root
2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper - Client
environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
2013-02-03 13:34:09,242 INFO 819062722@qtp-651528505-7 ZooKeeper -
Initiating client connection, connectString=hadoop-standalone:2181
sessionTimeout=180000 watcher=hconnection
2013-02-03 13:34:09,250 INFO 819062722@qtp-651528505-7-SendThread()
ClientCnxn - Opening socket connection to server /192.168.150.254:2181
2013-02-03 13:34:09,250 INFO 819062722@qtp-651528505-7 RecoverableZooKeeper
- The identifier of this process is 9290@hadoop-standalone.soc.net
2013-02-03 13:34:09,254 WARN 819062722@qtp-651528505-7-SendThread(
hadoop-standalone.soc.net:2181) ZooKeeperSaslClient - SecurityException:
java.lang.SecurityException: Unable to locate a login configuration
occurred when trying to find JAAS configuration.
2013-02-03 13:34:09,254 INFO 819062722@qtp-651528505-7-SendThread(
hadoop-standalone.soc.net:2181) ZooKeeperSaslClient - Client will not
SASL-authenticate because the default JAAS configuration section 'Client'
could not be found. If you are not using SASL, you may ignore this. On the
other hand, if you expected SASL to work, please fix your JAAS
configuration.
2013-02-03 13:34:09,254 INFO 819062722@qtp-651528505-7-SendThread(
hadoop-standalone.soc.net:2181) ClientCnxn - Socket connection established
to hadoop-standalone.soc.net/192.168.150.254:2181, initiating session
2013-02-03 13:34:09,264 INFO 819062722@qtp-651528505-7-SendThread(
hadoop-standalone.soc.net:2181) ClientCnxn - Session establishment complete
on server hadoop-standalone.soc.net/192.168.150.254:2181, sessionid =
0x13c9adf3ab20075, negotiated timeout = 40000
2013-02-03 13:34:09,294 WARN 819062722@qtp-651528505-7 Configuration -
hadoop.native.lib is deprecated. Instead, use io.native.lib.available
2013-02-03 13:34:12,428 WARN
2030128673@qtp-651528505-4HConnectionManager$HConnectionImplementation
- Encountered problems when
prefetch META table:
org.apache.hadoop.hbase.TableNotFoundException: Cannot find row in .META.
for table: null, row=null,,99999999999999
        at
org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:158)
        at
org.apache.hadoop.hbase.client.MetaScanner.access$000(MetaScanner.java:52)
        at
org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:130)
        at
org.apache.hadoop.hbase.client.MetaScanner$1.connect(MetaScanner.java:127)
        at
org.apache.hadoop.hbase.client.HConnectionManager.execute(HConnectionManager.java:360)
        at
org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:127)
        at
org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:103)
        at
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:876)
        at
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:930)
        at
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:818)
        at
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:782)
        at
org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:259)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:223)
        at
org.apache.hadoop.hbase.client.HTableFactory.createHTableInterface(HTableFactory.java:36)
        at
org.apache.hadoop.hbase.client.HTablePool.createHTable(HTablePool.java:268)
        at
org.apache.hadoop.hbase.client.HTablePool.findOrCreateTable(HTablePool.java:198)
        at
org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:173)
        at
org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getFamilyNames(ChukwaHBaseStore.java:106)
        at
org.apache.hadoop.chukwa.hicc.rest.MetricsController.getFamilies(MetricsController.java:137)
        at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at
com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
        at
com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)


What do you think about this?
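
In case it helps, here is the sanity check I can run on my side (only a rough
sketch -- I'm assuming the hbase client is on the PATH and that the tables
were created from the stock hbase.schema):

# list the tables hicc should see (e.g. SystemMetrics, Hadoop, HadoopLog, ...)
echo "list" | hbase shell

# show the column families of one of them, e.g. SystemMetrics
echo "describe 'SystemMetrics'" | hbase shell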

On Sun, Feb 3, 2013 at 1:33 PM, Eric Yang <er...@gmail.com> wrote:

> Hicc is unable to connect to HDFS on hadoop-standalone.soc.net:8020<http://hadoop-standalone.soc.net/192.168.150.254:8020>.
>  Is the configuration correct?  Make sure port 8020 is not blocked by
> firewall.
>
> Second error seems to be in hbase-site.xml, where the hbase master
> hostname has non-alpha numeric characters garbled in the hostname.
>
> ؟�2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
>
> regards,
> Eric
>
> On Sat, Feb 2, 2013 at 11:29 PM, Farrokh Shahriari <
> mohandes.zebeleh.67@gmail.com> wrote:
>
>> And I also when I checked this url :
>> http://machine:4080/hicc/jsp/graph_explorer.jsp,I've got this error :
>>
>> 2013-02-03 11:04:42,349 INFO 616991384@qtp-220467482-7 /hicc - jsp: init
>> 2013-02-03 11:04:42,630 INFO 616991384@qtp-220467482-7 ZooKeeper -
>> Initiating client connection, connectString=hadoop-standalone:2181
>> sessionTimeout=180000 watcher=hconnection
>> 2013-02-03 11:04:42,631 INFO 616991384@qtp-220467482-7-SendThread()
>> ClientCnxn - Opening socket connection to server hadoop-standalone/
>> 192.168.150.254:2181
>> 2013-02-03 11:04:42,632 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
>> ClientCnxn - Socket connection established to hadoop-standalone/
>> 192.168.150.254:2181, initiating session
>> 2013-02-03 11:04:42,677 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
>> ClientCnxn - Session establishment complete on server hadoop-standalone/
>> 192.168.150.254:2181, sessionid = 0x13c9adf3ab20060, negotiated timeout
>> = 40000
>> 2013-02-03 11:04:42,682 ERROR 616991384@qtp-220467482-7 ChukwaHBaseStore
>> - java.lang.IllegalArgumentException: Not a host:port pair: ُ؟�
>> 2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
>> ,60000,1359815006323
>>         at
>> org.apache.hadoop.hbase.HServerAddress.<init>(HServerAddress.java:60)
>>         at
>> org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:63)
>>         at
>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:354)
>>         at
>> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:94)
>>         at
>> org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getTableNames(ChukwaHBaseStore.java:122)
>>         at
>> org.apache.hadoop.chukwa.hicc.rest.MetricsController.getTables(MetricsController.java:125)
>>
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at
>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>         at
>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>         at
>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>         at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>         at
>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>         at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>         at
>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>         at
>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>         at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>         at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>         at
>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>         at
>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>         at
>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>         at
>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>         at
>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>         at
>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>         at
>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>         at
>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>         at
>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>         at
>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>         at
>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>         at
>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>         at
>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>         at
>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>         at
>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>
>> Tnx for your help
>>
>>
>> On Sun, Feb 3, 2013 at 9:04 AM, Farrokh Shahriari <
>> mohandes.zebeleh.67@gmail.com> wrote:
>>
>>> This is my last error after i ran hicc & check it on port 4080 ( in web
>>> ui I got this message : Error in loading dashboard ) , & here is hicc.log :
>>>
>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Client
>>> environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Client environment:java.io.tmpdir=/tmp
>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Client environment:java.compiler=<NA>
>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Client environment:os.name=Linux
>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Client environment:os.arch=amd64
>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Client environment:os.version=2.6.32-220.el6.x86_64
>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Client environment:user.name=root
>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Client environment:user.home=/root
>>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Client environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
>>> 2013-02-03 08:56:53,940 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>>> Initiating client connection, connectString=hadoop-standalone:2181
>>> sessionTimeout=180000 watcher=hconnection
>>> 2013-02-03 08:56:53,946 INFO 127861719@qtp-1979873666-7-SendThread()
>>> ClientCnxn - Opening socket connection to server hadoop-standalone/
>>> 192.168.150.254:2181
>>> 2013-02-03 08:56:53,947 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>>> ClientCnxn - Socket connection established to hadoop-standalone/
>>> 192.168.150.254:2181, initiating session
>>> 2013-02-03 08:56:53,964 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>>> ClientCnxn - Session establishment complete on server hadoop-standalone/
>>> 192.168.150.254:2181, sessionid = 0x13c9adf3ab2003d, negotiated timeout
>>> = 40000
>>> 2013-02-03 08:56:55,168 INFO 1152423575@qtp-1979873666-6ChukwaConfiguration - chukwaConf is
>>> /etc/Chukwa/chukwa-incubating-0.5.0/etc/chukwa
>>> 2013-02-03 08:56:55,335 ERROR 127861719@qtp-1979873666-7 ViewStore -
>>> java.io.IOException: Call to
>>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>>> exception: java.io.IOException: Broken pipe
>>>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>>         at
>>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>>         at
>>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>>         at
>>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>         at
>>> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>>         at
>>> org.apache.hadoop.chukwa.datastore.ViewStore.load(ViewStore.java:74)
>>>         at
>>> org.apache.hadoop.chukwa.datastore.ViewStore.<init>(ViewStore.java:61)
>>>         at
>>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getView(ViewResource.java:52)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>         at
>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>         at
>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>         at
>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>         at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>         at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>         at
>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>         at
>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>         at
>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>         at
>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>         at
>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>         at
>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>         at
>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>         at
>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>         at
>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>         at
>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>         at
>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>         at
>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>         at
>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>         at
>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>         at
>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>> Caused by: java.io.IOException: Broken pipe
>>>         at sun.nio.ch.FileDispatcher.write0(Native Method)
>>>         at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:29)
>>>         at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:69)
>>>         at sun.nio.ch.IOUtil.write(IOUtil.java:40)
>>>         at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:336)
>>>         at
>>> org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
>>>         at
>>> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
>>>         at
>>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
>>>         at
>>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
>>>         at
>>> java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
>>>         at
>>> java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
>>>         at java.io.DataOutputStream.flush(DataOutputStream.java:106)
>>>         at
>>> org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:779)
>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1047)
>>>         ... 52 more
>>>
>>> 2013-02-03 08:56:55,335 ERROR 1152423575@qtp-1979873666-6 ViewStore -
>>> java.io.IOException: Call to
>>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>>> exception: java.io.IOException: Broken pipe
>>>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>>         at
>>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>>         at
>>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>>         at
>>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>         at
>>> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>>         at
>>> org.apache.hadoop.chukwa.datastore.ViewStore.list(ViewStore.java:208)
>>>         at
>>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getUserViewList(ViewResource.java:158)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>         at
>>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>>         at
>>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>>         at
>>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>>         at
>>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>>         at
>>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>>         at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>>         at
>>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>>         at
>>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>>         at
>>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>>         at
>>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>>         at
>>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>>         at
>>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>>         at
>>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>>         at
>>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>>         at
>>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>>         at
>>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>>         at
>>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>>         at
>>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>>         at
>>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>>         at
>>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>>         at
>>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>>         at
>>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>>> Caused by: java.io.IOException: Broken pipe
>>>  ... ...
>>>
>>>
>>> On Sat, Feb 2, 2013 at 10:29 PM, Eric Yang <er...@gmail.com> wrote:
>>>
>>>> Yes, if the hadoop/hbase/zookeeper jar files are packaged in hicc.war,
>>>> then you should replace those too.  But I am not sure if that was the
>>>> source of the problem.  Can you show more of the stack trace to determine
>>>> the problem.  This looks like a configuration property is missing.  I am
>>>> not sure if it is hdfs, hbase, or zookeeper related.
>>>>
>>>> regards,
>>>> Eric
>>>>
>>>>
>>>> On Sat, Feb 2, 2013 at 10:50 AM, Farrokh Shahriari <
>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>
>>>>> Yeah,you were right,I should've update zookeeper.jar.
>>>>> Now there is another probelm,when I've run chukwa hicc,I've got this
>>>>> error :
>>>>>
>>>>> java.lang.IllegalArgumentException: Not a host: port pair:
>>>>>
>>>>> I read in a place that the hbase jar files should be updated ( I
>>>>> copied my hbase jar files to share/chukwa/lib/ ),but still have problem,
>>>>> should I change the inside of hicc.war too ?
>>>>>
>>>>> Tnx
>>>>>
>>>>>
>>>>> On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com> wrote:
>>>>>
>>>>>> Make sure you also update HBase jar file and ZooKeeper jar files to
>>>>>> your versions.
>>>>>>
>>>>>> regards,
>>>>>> Eric
>>>>>>
>>>>>>
>>>>>> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>
>>>>>>> Tnx Eric,
>>>>>>> but my chukwa classpath is this :
>>>>>>>
>>>>>>> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>>>>>>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>>>>>>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>>>>>>
>>>>>>> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar
>>>>>>> from "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've
>>>>>>> got errors.
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com> wrote:
>>>>>>>
>>>>>>>> Is there multiple version of hadoop jar files in the class path?
>>>>>>>>  This error looks like hdfs client is from Hadoop 1.x.  If there is older
>>>>>>>> version of hadoop-core*.jar file, it can generate this error.
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> Eric
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Hi there,
>>>>>>>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh
>>>>>>>>> 4.0.0).But when collector runs,it has showed this error :
>>>>>>>>> Server IPC version 7 cannot communicate with client version 4
>>>>>>>>>
>>>>>>>>> I copied lib from /user/lib/hadoop/*.jar &
>>>>>>>>> /user/lib/hadoop-hdfs/*.jar, but couldn't get result.
>>>>>>>>>
>>>>>>>>> I'd be glad if someone can help me.
>>>>>>>>> Tnx
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Eric Yang <er...@gmail.com>.
Hicc is unable to connect to HDFS on
hadoop-standalone.soc.net:8020 (192.168.150.254).
Is the configuration correct?  Make sure port 8020 is not blocked by a
firewall.
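
A couple of quick checks from the hicc machine may help (only a sketch --
adjust the host to whatever fs.default.name points to in core-site.xml):

# is the NameNode RPC port reachable at all?
nc -vz hadoop-standalone.soc.net 8020

# can a plain HDFS client list the root directory over that port?
hadoop fs -ls hdfs://hadoop-standalone.soc.net:8020/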

The second error seems to be in hbase-site.xml: the HBase master hostname
has non-alphanumeric characters garbled into it:

؟�2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
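
A rough way to see where the bad value comes from (paths and commands are only
assumptions -- adjust them to your install):

# the master/zookeeper host values configured on disk should be plain ASCII
grep -B1 -A2 'master' /etc/hbase/conf/hbase-site.xml

# what HBase actually registered under the master znode in ZooKeeper
# (if your hbase build ships the zkcli helper)
echo 'get /hbase/master' | hbase zkcli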

regards,
Eric

On Sat, Feb 2, 2013 at 11:29 PM, Farrokh Shahriari <
mohandes.zebeleh.67@gmail.com> wrote:

> And I also when I checked this url :
> http://machine:4080/hicc/jsp/graph_explorer.jsp,I've got this error :
>
> 2013-02-03 11:04:42,349 INFO 616991384@qtp-220467482-7 /hicc - jsp: init
> 2013-02-03 11:04:42,630 INFO 616991384@qtp-220467482-7 ZooKeeper -
> Initiating client connection, connectString=hadoop-standalone:2181
> sessionTimeout=180000 watcher=hconnection
> 2013-02-03 11:04:42,631 INFO 616991384@qtp-220467482-7-SendThread()
> ClientCnxn - Opening socket connection to server hadoop-standalone/
> 192.168.150.254:2181
> 2013-02-03 11:04:42,632 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
> ClientCnxn - Socket connection established to hadoop-standalone/
> 192.168.150.254:2181, initiating session
> 2013-02-03 11:04:42,677 INFO 616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
> ClientCnxn - Session establishment complete on server hadoop-standalone/
> 192.168.150.254:2181, sessionid = 0x13c9adf3ab20060, negotiated timeout =
> 40000
> 2013-02-03 11:04:42,682 ERROR 616991384@qtp-220467482-7 ChukwaHBaseStore
> - java.lang.IllegalArgumentException: Not a host:port pair: ُ؟�
> 2297@hadoop-standalone.soc.nethadoop-standalone.soc.net
> ,60000,1359815006323
>         at
> org.apache.hadoop.hbase.HServerAddress.<init>(HServerAddress.java:60)
>         at
> org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:63)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:354)
>         at
> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:94)
>         at
> org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getTableNames(ChukwaHBaseStore.java:122)
>         at
> org.apache.hadoop.chukwa.hicc.rest.MetricsController.getTables(MetricsController.java:125)
>
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at
> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>         at
> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>         at
> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>         at
> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>         at
> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>         at
> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>         at
> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>         at
> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>         at
> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>         at
> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>         at
> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>         at
> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>         at
> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>         at
> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>         at
> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>         at
> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>         at
> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>         at
> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>         at
> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>         at org.mortbay.jetty.Server.handle(Server.java:326)
>         at
> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>         at
> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>         at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>         at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>         at
> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>         at
> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>
> Tnx for your help
>
>
> On Sun, Feb 3, 2013 at 9:04 AM, Farrokh Shahriari <
> mohandes.zebeleh.67@gmail.com> wrote:
>
>> This is my last error after i ran hicc & check it on port 4080 ( in web
>> ui I got this message : Error in loading dashboard ) , & here is hicc.log :
>>
>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Client
>> environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Client environment:java.io.tmpdir=/tmp
>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Client environment:java.compiler=<NA>
>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Client environment:os.name=Linux
>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Client environment:os.arch=amd64
>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Client environment:os.version=2.6.32-220.el6.x86_64
>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Client environment:user.name=root
>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Client environment:user.home=/root
>> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Client environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
>> 2013-02-03 08:56:53,940 INFO 127861719@qtp-1979873666-7 ZooKeeper -
>> Initiating client connection, connectString=hadoop-standalone:2181
>> sessionTimeout=180000 watcher=hconnection
>> 2013-02-03 08:56:53,946 INFO 127861719@qtp-1979873666-7-SendThread()
>> ClientCnxn - Opening socket connection to server hadoop-standalone/
>> 192.168.150.254:2181
>> 2013-02-03 08:56:53,947 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>> ClientCnxn - Socket connection established to hadoop-standalone/
>> 192.168.150.254:2181, initiating session
>> 2013-02-03 08:56:53,964 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
>> ClientCnxn - Session establishment complete on server hadoop-standalone/
>> 192.168.150.254:2181, sessionid = 0x13c9adf3ab2003d, negotiated timeout
>> = 40000
>> 2013-02-03 08:56:55,168 INFO 1152423575@qtp-1979873666-6ChukwaConfiguration - chukwaConf is
>> /etc/Chukwa/chukwa-incubating-0.5.0/etc/chukwa
>> 2013-02-03 08:56:55,335 ERROR 127861719@qtp-1979873666-7 ViewStore -
>> java.io.IOException: Call to
>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>> exception: java.io.IOException: Broken pipe
>>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>         at
>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>         at
>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>         at
>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>         at
>> org.apache.hadoop.chukwa.datastore.ViewStore.load(ViewStore.java:74)
>>         at
>> org.apache.hadoop.chukwa.datastore.ViewStore.<init>(ViewStore.java:61)
>>         at
>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getView(ViewResource.java:52)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at
>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>         at
>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>         at
>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>         at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>         at
>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>         at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>         at
>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>         at
>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>         at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>         at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>         at
>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>         at
>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>         at
>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>         at
>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>         at
>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>         at
>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>         at
>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>         at
>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>         at
>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>         at
>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>         at
>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>         at
>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>         at
>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>         at
>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>         at
>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>> Caused by: java.io.IOException: Broken pipe
>>         at sun.nio.ch.FileDispatcher.write0(Native Method)
>>         at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:29)
>>         at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:69)
>>         at sun.nio.ch.IOUtil.write(IOUtil.java:40)
>>         at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:336)
>>         at
>> org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
>>         at
>> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
>>         at
>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
>>         at
>> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
>>         at
>> java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
>>         at
>> java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
>>         at java.io.DataOutputStream.flush(DataOutputStream.java:106)
>>         at
>> org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:779)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1047)
>>         ... 52 more
>>
>> 2013-02-03 08:56:55,335 ERROR 1152423575@qtp-1979873666-6 ViewStore -
>> java.io.IOException: Call to
>> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
>> exception: java.io.IOException: Broken pipe
>>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>         at $Proxy65.getProtocolVersion(Unknown Source)
>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>         at
>> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>>         at
>> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>>         at
>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>>         at
>> org.apache.hadoop.chukwa.datastore.ViewStore.list(ViewStore.java:208)
>>         at
>> org.apache.hadoop.chukwa.rest.resource.ViewResource.getUserViewList(ViewResource.java:158)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at
>> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>>         at
>> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>>         at
>> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>>         at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>         at
>> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>>         at
>> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>>         at
>> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>>         at
>> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>>         at
>> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>>         at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>>         at
>> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>>         at
>> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>>         at
>> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>>         at
>> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>>         at
>> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>>         at
>> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>>         at
>> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>>         at
>> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>>         at
>> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>>         at
>> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>>         at org.mortbay.jetty.Server.handle(Server.java:326)
>>         at
>> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>>         at
>> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>>         at
>> org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>>         at
>> org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>>         at
>> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>>         at
>> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
>> Caused by: java.io.IOException: Broken pipe
>>  ... ...
>>
>>
>> On Sat, Feb 2, 2013 at 10:29 PM, Eric Yang <er...@gmail.com> wrote:
>>
>>> Yes, if the hadoop/hbase/zookeeper jar files are packaged in hicc.war,
>>> then you should replace those too.  But I am not sure if that was the
>>> source of the problem.  Can you show more of the stack trace to determine
>>> the problem.  This looks like a configuration property is missing.  I am
>>> not sure if it is hdfs, hbase, or zookeeper related.
>>>
>>> regards,
>>> Eric
>>>
>>>
>>> On Sat, Feb 2, 2013 at 10:50 AM, Farrokh Shahriari <
>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>
>>>> Yeah,you were right,I should've update zookeeper.jar.
>>>> Now there is another probelm,when I've run chukwa hicc,I've got this
>>>> error :
>>>>
>>>> java.lang.IllegalArgumentException: Not a host: port pair:
>>>>
>>>> I read in a place that the hbase jar files should be updated ( I copied
>>>> my hbase jar files to share/chukwa/lib/ ),but still have problem, should I
>>>> change the inside of hicc.war too ?
>>>>
>>>> Tnx
>>>>
>>>>
>>>> On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com> wrote:
>>>>
>>>>> Make sure you also update HBase jar file and ZooKeeper jar files to
>>>>> your versions.
>>>>>
>>>>> regards,
>>>>> Eric
>>>>>
>>>>>
>>>>> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>
>>>>>> Tnx Eric,
>>>>>> but my chukwa classpath is this :
>>>>>>
>>>>>> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>>>>>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>>>>>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>>>>>
>>>>>> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar from
>>>>>> "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've got
>>>>>> errors.
>>>>>>
>>>>>>
>>>>>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com> wrote:
>>>>>>
>>>>>>> Is there multiple version of hadoop jar files in the class path?
>>>>>>>  This error looks like hdfs client is from Hadoop 1.x.  If there is older
>>>>>>> version of hadoop-core*.jar file, it can generate this error.
>>>>>>>
>>>>>>> regards,
>>>>>>> Eric
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi there,
>>>>>>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh
>>>>>>>> 4.0.0).But when collector runs,it has showed this error :
>>>>>>>> Server IPC version 7 cannot communicate with client version 4
>>>>>>>>
>>>>>>>> I copied lib from /user/lib/hadoop/*.jar &
>>>>>>>> /user/lib/hadoop-hdfs/*.jar, but couldn't get result.
>>>>>>>>
>>>>>>>> I'd be glad if someone can help me.
>>>>>>>> Tnx
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Farrokh Shahriari <mo...@gmail.com>.
Also, when I checked this URL:
http://machine:4080/hicc/jsp/graph_explorer.jsp, I got this error:

2013-02-03 11:04:42,349 INFO 616991384@qtp-220467482-7 /hicc - jsp: init
2013-02-03 11:04:42,630 INFO 616991384@qtp-220467482-7 ZooKeeper -
Initiating client connection, connectString=hadoop-standalone:2181
sessionTimeout=180000 watcher=hconnection
2013-02-03 11:04:42,631 INFO 616991384@qtp-220467482-7-SendThread()
ClientCnxn - Opening socket connection to server hadoop-standalone/
192.168.150.254:2181
2013-02-03 11:04:42,632 INFO
616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
ClientCnxn - Socket connection established to hadoop-standalone/
192.168.150.254:2181, initiating session
2013-02-03 11:04:42,677 INFO
616991384@qtp-220467482-7-SendThread(hadoop-standalone:2181)
ClientCnxn - Session establishment complete on server hadoop-standalone/
192.168.150.254:2181, sessionid = 0x13c9adf3ab20060, negotiated timeout =
40000
2013-02-03 11:04:42,682 ERROR 616991384@qtp-220467482-7 ChukwaHBaseStore -
java.lang.IllegalArgumentException: Not a host:port pair: ُ؟�
2297@hadoop-standalone.soc.nethadoop-standalone.soc.net,60000,1359815006323
        at
org.apache.hadoop.hbase.HServerAddress.<init>(HServerAddress.java:60)
        at
org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:63)
        at
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:354)
        at
org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:94)
        at
org.apache.hadoop.chukwa.datastore.ChukwaHBaseStore.getTableNames(ChukwaHBaseStore.java:122)
        at
org.apache.hadoop.chukwa.hicc.rest.MetricsController.getTables(MetricsController.java:125)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at
com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
        at
com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
        at
com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
        at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
        at
com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
        at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
        at
com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
        at
com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
        at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
        at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
        at
org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
        at
org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
        at
org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
        at
org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
        at
org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
        at
org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
        at
org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
        at
org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
        at
org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
        at org.mortbay.jetty.Server.handle(Server.java:326)
        at
org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
        at
org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
        at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
        at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
        at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
        at
org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
        at
org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
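
The garbled bytes after "Not a host:port pair" are usually a sign that the
HBase client jar being loaded is older than the HBase master it talks to
(newer HBase writes a richer server name into that znode, which an old
client cannot parse). A minimal way to compare versions, assuming the
default Chukwa 0.5 layout ($CHUKWA_HOME below is the
chukwa-incubating-0.5.0 directory) and that the hbase command is installed:

# version of the HBase that is actually running
hbase version
# HBase/ZooKeeper jars on Chukwa's side, including any copies bundled in hicc.war
ls $CHUKWA_HOME/share/chukwa/lib/ | grep -E 'hbase|zookeeper'
unzip -l $(find $CHUKWA_HOME -name hicc.war) | grep -E 'hbase|zookeeper'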

Tnx for your help

On Sun, Feb 3, 2013 at 9:04 AM, Farrokh Shahriari <
mohandes.zebeleh.67@gmail.com> wrote:

> This is my last error after i ran hicc & check it on port 4080 ( in web ui
> I got this message : Error in loading dashboard ) , & here is hicc.log :
>
> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Client
> environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Client environment:java.io.tmpdir=/tmp
> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Client environment:java.compiler=<NA>
> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Client environment:os.name=Linux
> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Client environment:os.arch=amd64
> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Client environment:os.version=2.6.32-220.el6.x86_64
> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Client environment:user.name=root
> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Client environment:user.home=/root
> 2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Client environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
> 2013-02-03 08:56:53,940 INFO 127861719@qtp-1979873666-7 ZooKeeper -
> Initiating client connection, connectString=hadoop-standalone:2181
> sessionTimeout=180000 watcher=hconnection
> 2013-02-03 08:56:53,946 INFO 127861719@qtp-1979873666-7-SendThread()
> ClientCnxn - Opening socket connection to server hadoop-standalone/
> 192.168.150.254:2181
> 2013-02-03 08:56:53,947 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
> ClientCnxn - Socket connection established to hadoop-standalone/
> 192.168.150.254:2181, initiating session
> 2013-02-03 08:56:53,964 INFO 127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
> ClientCnxn - Session establishment complete on server hadoop-standalone/
> 192.168.150.254:2181, sessionid = 0x13c9adf3ab2003d, negotiated timeout =
> 40000
> 2013-02-03 08:56:55,168 INFO 1152423575@qtp-1979873666-6ChukwaConfiguration - chukwaConf is
> /etc/Chukwa/chukwa-incubating-0.5.0/etc/chukwa
> 2013-02-03 08:56:55,335 ERROR 127861719@qtp-1979873666-7 ViewStore -
> java.io.IOException: Call to
> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local exception:
> java.io.IOException: Broken pipe
>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>         at $Proxy65.getProtocolVersion(Unknown Source)
>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>         at
> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>         at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>         at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>         at
> org.apache.hadoop.chukwa.datastore.ViewStore.load(ViewStore.java:74)
>         at
> org.apache.hadoop.chukwa.datastore.ViewStore.<init>(ViewStore.java:61)
>         at
> org.apache.hadoop.chukwa.rest.resource.ViewResource.getView(ViewResource.java:52)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at
> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>         at
> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>         at
> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>         at
> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>         at
> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>         at
> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>         at
> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>         at
> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>         at
> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>         at
> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>         at
> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>         at
> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>         at
> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>         at
> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>         at
> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>         at
> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>         at
> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>         at
> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>         at
> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>         at org.mortbay.jetty.Server.handle(Server.java:326)
>         at
> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>         at
> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>         at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>         at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>         at
> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>         at
> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
> Caused by: java.io.IOException: Broken pipe
>         at sun.nio.ch.FileDispatcher.write0(Native Method)
>         at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:29)
>         at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:69)
>         at sun.nio.ch.IOUtil.write(IOUtil.java:40)
>         at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:336)
>         at
> org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
>         at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
>         at
> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
>         at
> org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
>         at
> java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
>         at
> java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
>         at java.io.DataOutputStream.flush(DataOutputStream.java:106)
>         at
> org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:779)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1047)
>         ... 52 more
>
> 2013-02-03 08:56:55,335 ERROR 1152423575@qtp-1979873666-6 ViewStore -
> java.io.IOException: Call to
> hadoop-standalone.soc.net/192.168.150.254:8020 failed on local exception:
> java.io.IOException: Broken pipe
>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>         at $Proxy65.getProtocolVersion(Unknown Source)
>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>         at
> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>         at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>         at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>         at
> org.apache.hadoop.chukwa.datastore.ViewStore.list(ViewStore.java:208)
>         at
> org.apache.hadoop.chukwa.rest.resource.ViewResource.getUserViewList(ViewResource.java:158)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at
> com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
>         at
> com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
>         at
> com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
>         at
> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>         at
> com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
>         at
> com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
>         at
> com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
>         at
> com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
>         at
> com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
>         at
> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
>         at
> com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
>         at
> org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
>         at
> org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
>         at
> org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
>         at
> org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
>         at
> org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
>         at
> org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
>         at
> org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
>         at
> org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
>         at
> org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
>         at org.mortbay.jetty.Server.handle(Server.java:326)
>         at
> org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
>         at
> org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
>         at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
>         at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
>         at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
>         at
> org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
>         at
> org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
> Caused by: java.io.IOException: Broken pipe
>  ... ...
>
>
> On Sat, Feb 2, 2013 at 10:29 PM, Eric Yang <er...@gmail.com> wrote:
>
>> Yes, if the hadoop/hbase/zookeeper jar files are packaged in hicc.war,
>> then you should replace those too.  But I am not sure if that was the
>> source of the problem.  Can you show more of the stack trace to determine
>> the problem.  This looks like a configuration property is missing.  I am
>> not sure if it is hdfs, hbase, or zookeeper related.
>>
>> regards,
>> Eric
>>
>>
>> On Sat, Feb 2, 2013 at 10:50 AM, Farrokh Shahriari <
>> mohandes.zebeleh.67@gmail.com> wrote:
>>
>>> Yeah,you were right,I should've update zookeeper.jar.
>>> Now there is another probelm,when I've run chukwa hicc,I've got this
>>> error :
>>>
>>> java.lang.IllegalArgumentException: Not a host: port pair:
>>>
>>> I read in a place that the hbase jar files should be updated ( I copied
>>> my hbase jar files to share/chukwa/lib/ ),but still have problem, should I
>>> change the inside of hicc.war too ?
>>>
>>> Tnx
>>>
>>>
>>> On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com> wrote:
>>>
>>>> Make sure you also update HBase jar file and ZooKeeper jar files to
>>>> your versions.
>>>>
>>>> regards,
>>>> Eric
>>>>
>>>>
>>>> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>
>>>>> Tnx Eric,
>>>>> but my chukwa classpath is this :
>>>>>
>>>>> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>>>>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>>>>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>>>>
>>>>> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar from
>>>>> "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've got
>>>>> errors.
>>>>>
>>>>>
>>>>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com> wrote:
>>>>>
>>>>>> Is there multiple version of hadoop jar files in the class path?
>>>>>>  This error looks like hdfs client is from Hadoop 1.x.  If there is older
>>>>>> version of hadoop-core*.jar file, it can generate this error.
>>>>>>
>>>>>> regards,
>>>>>> Eric
>>>>>>
>>>>>>
>>>>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>>
>>>>>>> Hi there,
>>>>>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh
>>>>>>> 4.0.0).But when collector runs,it has showed this error :
>>>>>>> Server IPC version 7 cannot communicate with client version 4
>>>>>>>
>>>>>>> I copied lib from /user/lib/hadoop/*.jar &
>>>>>>> /user/lib/hadoop-hdfs/*.jar, but couldn't get result.
>>>>>>>
>>>>>>> I'd be glad if someone can help me.
>>>>>>> Tnx
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Farrokh Shahriari <mo...@gmail.com>.
This is the latest error after I ran hicc and checked it on port 4080 (in the
web UI I got the message "Error in loading dashboard"); here is hicc.log:

2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper - Client
environment:java.library.path=/usr/lib/hadoop/lib/native/Linux-i386-32
2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper - Client
environment:java.io.tmpdir=/tmp
2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper - Client
environment:java.compiler=<NA>
2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper - Client
environment:os.name=Linux
2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper - Client
environment:os.arch=amd64
2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper - Client
environment:os.version=2.6.32-220.el6.x86_64
2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper - Client
environment:user.name=root
2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper - Client
environment:user.home=/root
2013-02-03 08:56:53,939 INFO 127861719@qtp-1979873666-7 ZooKeeper - Client
environment:user.dir=/etc/Chukwa/chukwa-incubating-0.5.0
2013-02-03 08:56:53,940 INFO 127861719@qtp-1979873666-7 ZooKeeper -
Initiating client connection, connectString=hadoop-standalone:2181
sessionTimeout=180000 watcher=hconnection
2013-02-03 08:56:53,946 INFO 127861719@qtp-1979873666-7-SendThread()
ClientCnxn - Opening socket connection to server hadoop-standalone/
192.168.150.254:2181
2013-02-03 08:56:53,947 INFO
127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
ClientCnxn - Socket connection established to hadoop-standalone/
192.168.150.254:2181, initiating session
2013-02-03 08:56:53,964 INFO
127861719@qtp-1979873666-7-SendThread(hadoop-standalone:2181)
ClientCnxn - Session establishment complete on server hadoop-standalone/
192.168.150.254:2181, sessionid = 0x13c9adf3ab2003d, negotiated timeout =
40000
2013-02-03 08:56:55,168 INFO
1152423575@qtp-1979873666-6ChukwaConfiguration - chukwaConf is
/etc/Chukwa/chukwa-incubating-0.5.0/etc/chukwa
2013-02-03 08:56:55,335 ERROR 127861719@qtp-1979873666-7 ViewStore -
java.io.IOException: Call to
hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
exception: java.io.IOException: Broken pipe
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
        at org.apache.hadoop.ipc.Client.call(Client.java:1071)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at $Proxy65.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
        at
org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
        at
org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
        at
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
        at
org.apache.hadoop.chukwa.datastore.ViewStore.load(ViewStore.java:74)
        at
org.apache.hadoop.chukwa.datastore.ViewStore.<init>(ViewStore.java:61)
        at
org.apache.hadoop.chukwa.rest.resource.ViewResource.getView(ViewResource.java:52)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at
com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
        at
com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
        at
com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
        at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
        at
com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
        at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
        at
com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
        at
com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
        at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
        at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
        at
org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
        at
org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
        at
org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
        at
org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
        at
org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
        at
org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
        at
org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
        at
org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
        at
org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
        at org.mortbay.jetty.Server.handle(Server.java:326)
        at
org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
        at
org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
        at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
        at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
        at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
        at
org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
        at
org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Caused by: java.io.IOException: Broken pipe
        at sun.nio.ch.FileDispatcher.write0(Native Method)
        at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:29)
        at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:69)
        at sun.nio.ch.IOUtil.write(IOUtil.java:40)
        at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:336)
        at
org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
        at
org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
        at
org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
        at
org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
        at
java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
        at java.io.DataOutputStream.flush(DataOutputStream.java:106)
        at
org.apache.hadoop.ipc.Client$Connection.sendParam(Client.java:779)
        at org.apache.hadoop.ipc.Client.call(Client.java:1047)
        ... 52 more

2013-02-03 08:56:55,335 ERROR 1152423575@qtp-1979873666-6 ViewStore -
java.io.IOException: Call to
hadoop-standalone.soc.net/192.168.150.254:8020 failed on local
exception: java.io.IOException: Broken pipe
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
        at org.apache.hadoop.ipc.Client.call(Client.java:1071)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at $Proxy65.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
        at
org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
        at
org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
        at
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
        at
org.apache.hadoop.chukwa.datastore.ViewStore.list(ViewStore.java:208)
        at
org.apache.hadoop.chukwa.rest.resource.ViewResource.getUserViewList(ViewResource.java:158)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at
com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
        at
com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:70)
        at
com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:279)
        at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
        at
com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:86)
        at
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:136)
        at
com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:74)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1357)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1289)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1239)
        at
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1229)
        at
com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:420)
        at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:497)
        at
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:684)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
        at
org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
        at
org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
        at
org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
        at
org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
        at
org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
        at
org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)
        at
org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
        at
org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
        at
org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
        at org.mortbay.jetty.Server.handle(Server.java:326)
        at
org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
        at
org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)
        at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
        at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
        at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
        at
org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
        at
org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Caused by: java.io.IOException: Broken pipe
 ... ...
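
The "Broken pipe" on the call to hadoop-standalone.soc.net:8020 means the
NameNode closed the connection, which can happen when an incompatible
(Hadoop 1.x) client talks to a Hadoop 2 NameNode, so this may be the same
jar mismatch showing up from the HDFS client inside hicc. A quick way to
separate the two possibilities (a sketch; it assumes the CDH4 command-line
client and /etc/hadoop/conf are set up correctly):

# if this works, the cluster side is fine and the stale client jar is on hicc's classpath
hadoop --config /etc/hadoop/conf fs -ls hdfs://hadoop-standalone.soc.net:8020/
# then check which Hadoop jars hicc can pick up
ls $CHUKWA_HOME/share/chukwa/lib/ | grep -i hadoop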

On Sat, Feb 2, 2013 at 10:29 PM, Eric Yang <er...@gmail.com> wrote:

> Yes, if the hadoop/hbase/zookeeper jar files are packaged in hicc.war,
> then you should replace those too.  But I am not sure if that was the
> source of the problem.  Can you show more of the stack trace to determine
> the problem.  This looks like a configuration property is missing.  I am
> not sure if it is hdfs, hbase, or zookeeper related.
>
> regards,
> Eric
>
>
> On Sat, Feb 2, 2013 at 10:50 AM, Farrokh Shahriari <
> mohandes.zebeleh.67@gmail.com> wrote:
>
>> Yeah,you were right,I should've update zookeeper.jar.
>> Now there is another probelm,when I've run chukwa hicc,I've got this
>> error :
>>
>> java.lang.IllegalArgumentException: Not a host: port pair:
>>
>> I read in a place that the hbase jar files should be updated ( I copied
>> my hbase jar files to share/chukwa/lib/ ),but still have problem, should I
>> change the inside of hicc.war too ?
>>
>> Tnx
>>
>>
>> On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com> wrote:
>>
>>> Make sure you also update HBase jar file and ZooKeeper jar files to your
>>> versions.
>>>
>>> regards,
>>> Eric
>>>
>>>
>>> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>
>>>> Tnx Eric,
>>>> but my chukwa classpath is this :
>>>>
>>>> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>>>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>>>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>>>
>>>> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar from
>>>> "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've got
>>>> errors.
>>>>
>>>>
>>>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com> wrote:
>>>>
>>>>> Is there multiple version of hadoop jar files in the class path?  This
>>>>> error looks like hdfs client is from Hadoop 1.x.  If there is older version
>>>>> of hadoop-core*.jar file, it can generate this error.
>>>>>
>>>>> regards,
>>>>> Eric
>>>>>
>>>>>
>>>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>>
>>>>>> Hi there,
>>>>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh
>>>>>> 4.0.0).But when collector runs,it has showed this error :
>>>>>> Server IPC version 7 cannot communicate with client version 4
>>>>>>
>>>>>> I copied lib from /user/lib/hadoop/*.jar &
>>>>>> /user/lib/hadoop-hdfs/*.jar, but couldn't get result.
>>>>>>
>>>>>> I'd be glad if someone can help me.
>>>>>> Tnx
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Eric Yang <er...@gmail.com>.
Yes, if the hadoop/hbase/zookeeper jar files are packaged in hicc.war, then
you should replace those too.  But I am not sure that was the source of the
problem.  Can you show more of the stack trace so we can determine the
problem?  This looks like a configuration property is missing; I am not
sure whether it is hdfs-, hbase-, or zookeeper-related.
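
If they are bundled, one way to swap them is to unpack the war, drop in the
cluster's jars, and repack it. This is only a rough sketch; the war location,
the jar names, and the /usr/lib/hbase and /usr/lib/zookeeper paths are
assumptions that depend on your install:

# locate and unpack the web app
WAR=$(find $CHUKWA_HOME -name hicc.war)
mkdir /tmp/hicc && cd /tmp/hicc && jar xf "$WAR"
# drop the bundled Hadoop 1.x / old HBase / old ZooKeeper client jars
rm -f WEB-INF/lib/hadoop-*.jar WEB-INF/lib/hbase-*.jar WEB-INF/lib/zookeeper-*.jar
# copy in the jars that match the cluster (CDH4 package paths assumed)
cp /usr/lib/hadoop/*.jar /usr/lib/hadoop-hdfs/*.jar WEB-INF/lib/
cp /usr/lib/hbase/hbase-*.jar /usr/lib/zookeeper/zookeeper-*.jar WEB-INF/lib/
# repack
jar cf "$WAR" .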

regards,
Eric

On Sat, Feb 2, 2013 at 10:50 AM, Farrokh Shahriari <
mohandes.zebeleh.67@gmail.com> wrote:

> Yeah,you were right,I should've update zookeeper.jar.
> Now there is another probelm,when I've run chukwa hicc,I've got this error
> :
>
> java.lang.IllegalArgumentException: Not a host: port pair:
>
> I read in a place that the hbase jar files should be updated ( I copied my
> hbase jar files to share/chukwa/lib/ ),but still have problem, should I
> change the inside of hicc.war too ?
>
> Tnx
>
>
> On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com> wrote:
>
>> Make sure you also update HBase jar file and ZooKeeper jar files to your
>> versions.
>>
>> regards,
>> Eric
>>
>>
>> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
>> mohandes.zebeleh.67@gmail.com> wrote:
>>
>>> Tnx Eric,
>>> but my chukwa classpath is this :
>>>
>>> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>>
>>> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar from
>>> "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've got
>>> errors.
>>>
>>>
>>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com> wrote:
>>>
>>>> Is there multiple version of hadoop jar files in the class path?  This
>>>> error looks like hdfs client is from Hadoop 1.x.  If there is older version
>>>> of hadoop-core*.jar file, it can generate this error.
>>>>
>>>> regards,
>>>> Eric
>>>>
>>>>
>>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>>
>>>>> Hi there,
>>>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh
>>>>> 4.0.0).But when collector runs,it has showed this error :
>>>>> Server IPC version 7 cannot communicate with client version 4
>>>>>
>>>>> I copied lib from /user/lib/hadoop/*.jar &
>>>>> /user/lib/hadoop-hdfs/*.jar, but couldn't get result.
>>>>>
>>>>> I'd be glad if someone can help me.
>>>>> Tnx
>>>>>
>>>>
>>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Farrokh Shahriari <mo...@gmail.com>.
Yeah, you were right, I should've updated zookeeper.jar.
Now there is another problem: when I run chukwa hicc, I get this error:

java.lang.IllegalArgumentException: Not a host:port pair:

I read somewhere that the hbase jar files should be updated (I copied my
hbase jar files to share/chukwa/lib/), but I still have the problem. Should
I also replace the jars inside hicc.war?

Tnx

On Sat, Feb 2, 2013 at 9:13 PM, Eric Yang <er...@gmail.com> wrote:

> Make sure you also update HBase jar file and ZooKeeper jar files to your
> versions.
>
> regards,
> Eric
>
>
> On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
> mohandes.zebeleh.67@gmail.com> wrote:
>
>> Tnx Eric,
>> but my chukwa classpath is this :
>>
>> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
>> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
>> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>>
>> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar from
>> "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've got
>> errors.
>>
>>
>> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com> wrote:
>>
>>> Is there multiple version of hadoop jar files in the class path?  This
>>> error looks like hdfs client is from Hadoop 1.x.  If there is older version
>>> of hadoop-core*.jar file, it can generate this error.
>>>
>>> regards,
>>> Eric
>>>
>>>
>>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>>> mohandes.zebeleh.67@gmail.com> wrote:
>>>
>>>> Hi there,
>>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh
>>>> 4.0.0).But when collector runs,it has showed this error :
>>>> Server IPC version 7 cannot communicate with client version 4
>>>>
>>>> I copied lib from /user/lib/hadoop/*.jar & /user/lib/hadoop-hdfs/*.jar,
>>>> but couldn't get result.
>>>>
>>>> I'd be glad if someone can help me.
>>>> Tnx
>>>>
>>>
>>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Eric Yang <er...@gmail.com>.
Make sure you also update the HBase and ZooKeeper jar files to match your
cluster's versions.
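
For example, to see which copies ship with Chukwa 0.5 and replace them (the
file names and the CDH package locations below are assumptions; adjust them
to wherever your HBase and ZooKeeper are installed):

ls $CHUKWA_HOME/share/chukwa/lib/ | grep -E 'hbase|zookeeper'
rm $CHUKWA_HOME/share/chukwa/lib/hbase-*.jar $CHUKWA_HOME/share/chukwa/lib/zookeeper-*.jar
cp /usr/lib/hbase/hbase-*.jar /usr/lib/zookeeper/zookeeper-*.jar $CHUKWA_HOME/share/chukwa/lib/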

regards,
Eric

On Fri, Feb 1, 2013 at 9:08 PM, Farrokh Shahriari <
mohandes.zebeleh.67@gmail.com> wrote:

> Tnx Eric,
> but my chukwa classpath is this :
>
> export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
> export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
> export HADOOP_CONF_DIR="/etc/hadoop/conf/"
>
> And I've deleted the hadoop-core-1.0.0.jar,hadoop-test-1.0.0.jar from
> "chukwa-0.5.0/share/chukwa/lib/" as the manual said, but still I've got
> errors.
>
>
> On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com> wrote:
>
>> Is there multiple version of hadoop jar files in the class path?  This
>> error looks like hdfs client is from Hadoop 1.x.  If there is older version
>> of hadoop-core*.jar file, it can generate this error.
>>
>> regards,
>> Eric
>>
>>
>> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
>> mohandes.zebeleh.67@gmail.com> wrote:
>>
>>> Hi there,
>>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh 4.0.0).But
>>> when collector runs,it has showed this error :
>>> Server IPC version 7 cannot communicate with client version 4
>>>
>>> I copied lib from /user/lib/hadoop/*.jar & /user/lib/hadoop-hdfs/*.jar,
>>> but couldn't get result.
>>>
>>> I'd be glad if someone can help me.
>>> Tnx
>>>
>>
>>
>

Re: problem with hadoop version2 ( IPC version 7 ) and chukwa 0.5

Posted by Farrokh Shahriari <mo...@gmail.com>.
Tnx Eric,
but my chukwa classpath is this :

export CLASSPATH=${CLASSPATH}:${HBASE_CONF_DIR}:${HADOOP_CONF_DIR}
export HBASE_CONF_DIR="${HBASE_CONF_DIR}"
export HADOOP_CONF_DIR="/etc/hadoop/conf/"

And I've deleted hadoop-core-1.0.0.jar and hadoop-test-1.0.0.jar from
"chukwa-0.5.0/share/chukwa/lib/" as the manual said, but I still get
errors.
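
Two quick checks for leftovers (a sketch; $CHUKWA_HOME stands for the
chukwa-incubating-0.5.0 directory, and the file layout is assumed from the
defaults):

# only the CDH4 jars that were copied in should remain
ls $CHUKWA_HOME/share/chukwa/lib/ | grep -i hadoop
# make sure nothing else re-exports an old Hadoop onto the classpath
grep -n 'HADOOP\|CLASSPATH' $CHUKWA_HOME/etc/chukwa/chukwa-env.sh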

On Fri, Feb 1, 2013 at 9:07 AM, Eric Yang <er...@gmail.com> wrote:

> Is there multiple version of hadoop jar files in the class path?  This
> error looks like hdfs client is from Hadoop 1.x.  If there is older version
> of hadoop-core*.jar file, it can generate this error.
>
> regards,
> Eric
>
>
> On Tue, Jan 29, 2013 at 11:24 PM, Farrokh Shahriari <
> mohandes.zebeleh.67@gmail.com> wrote:
>
>> Hi there,
>> I downloaded & installed chuckwa 0.5 on hadoop version 2 (cdh 4.0.0).But
>> when collector runs,it has showed this error :
>> Server IPC version 7 cannot communicate with client version 4
>>
>> I copied lib from /user/lib/hadoop/*.jar & /user/lib/hadoop-hdfs/*.jar,
>> but couldn't get result.
>>
>> I'd be glad if someone can help me.
>> Tnx
>>
>
>