Posted to solr-user@lucene.apache.org by Gopal Patwa <go...@gmail.com> on 2014/01/02 17:20:58 UTC
Solr use with Cloudera HDFS failed creating directory
I am trying to setup Solr with HDFS following this wiki
https://cwiki.apache.org/confluence/display/solr/Running+Solr+on+HDFS
My Setup:
***********
VMWare: Cloudera Quick Start VM 4.4.0-1 default setup (only hdfs1,
hive1,hue1,mapreduce1 and zookeeper1 is running)
http://www.cloudera.com/content/support/en/downloads/download-components/download-products.html?productID=F6mO278Rvo
SolrCloud:
Mac OS 10.7.5 - running Solr 4.6 with the Maven Jetty plugin in Eclipse,
outside the Cloudera VM, so it accesses HDFS as a remote service
External ZooKeeper, 3 nodes
Java 1.6, Jetty container 8.1
Collection with 1 shard and 1 replica
************
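For reference, the wiki page above wires HDFS in through solrconfig.xml roughly like this (the namenode address matches the one in the log below; option names should be confirmed against the Running Solr on HDFS page for your Solr version):

```xml
<!-- In solrconfig.xml: store the index in HDFS instead of the local disk -->
<directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
  <!-- root path under which Solr keeps index data in HDFS -->
  <str name="solr.hdfs.home">hdfs://10.249.132.29:8020/solr-hdfs</str>
  <!-- optional: point Solr at the Hadoop client configuration, if available -->
  <str name="solr.hdfs.confdir">/etc/hadoop/conf</str>
  <bool name="solr.hdfs.blockcache.enabled">true</bool>
</directoryFactory>

<!-- In the <indexConfig> section: use the HDFS-aware lock implementation -->
<lockType>hdfs</lockType>
```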
But I am getting the error below, "Problem creating directory:". I have
created this directory manually in HDFS. Do I need to set up some special
user permission in Solr, or do I always need to run the Solr instance on an
HDFS data node?
[cloudera@localhost ~]$ sudo -u hdfs hadoop fs -mkdir /solr-hdfs
Directory permission in HDFS:
solr-hdfs rwxr-xr-x hdfs supergroup
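Since the directory above is owned by user hdfs with group write disabled, one common cause of write failures is ownership: a remote Solr JVM authenticates to HDFS as the local OS user. A typical check and fix looks like the following (the user name gpatwa is an assumption here; substitute whatever user the Solr process runs as). Note, though, that the stack trace below fails in the RPC layer before any permission check is reached.

```
sudo -u hdfs hadoop fs -ls /
sudo -u hdfs hadoop fs -chown gpatwa:supergroup /solr-hdfs
```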
Startup Log:
2014-01-01 20:21:57.433:INFO:oejs.Server:jetty-8.1.7.v20120910
2014-01-01 20:21:59.710:INFO:omjp.MavenWebInfConfiguration:Adding overlay:
file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/
2014-01-01 20:22:02.249:INFO:oejpw.PlusConfiguration:No Transaction manager
found - if your webapp requires one, please configure one.
2014-01-01 20:22:03.368:INFO:oejsh.ContextHandler:started
o.m.j.p.JettyWebAppContext{/solr,[file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/,
file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/]},file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/
2014-01-01 20:22:03.369:INFO:oejsh.ContextHandler:started
o.m.j.p.JettyWebAppContext{/solr,[file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/,
file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/]},file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/
2014-01-01 20:22:03.369:INFO:oejsh.ContextHandler:started
o.m.j.p.JettyWebAppContext{/solr,[file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/,
file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/]},file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/
0 [main] INFO org.apache.solr.servlet.SolrDispatchFilter –
SolrDispatchFilter.init()
29 [main] INFO org.apache.solr.core.SolrResourceLoader – No /solr/home
in JNDI
30 [main] INFO org.apache.solr.core.SolrResourceLoader – using system
property solr.solr.home: /Users/gpatwa/opensource/solr-hdfs-home
32 [main] INFO org.apache.solr.core.SolrResourceLoader – new
SolrResourceLoader for directory: '/Users/gpatwa/opensource/solr-hdfs-home/'
220 [main] INFO org.apache.solr.core.ConfigSolr – Loading container
configuration from /Users/gpatwa/opensource/solr-hdfs-home/solr.xml
348 [main] INFO org.apache.solr.core.ConfigSolrXml – Config-defined core
root directory:
358 [main] INFO org.apache.solr.core.CoreContainer – New CoreContainer
445620464
359 [main] INFO org.apache.solr.core.CoreContainer – Loading cores into
CoreContainer [instanceDir=/Users/gpatwa/opensource/solr-hdfs-home/]
374 [main] INFO
org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
socketTimeout to: 120000
375 [main] INFO
org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
urlScheme to: http://
375 [main] INFO
org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
connTimeout to: 15000
375 [main] INFO
org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
maxConnectionsPerHost to: 20
375 [main] INFO
org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
corePoolSize to: 0
376 [main] INFO
org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
maximumPoolSize to: 2147483647
376 [main] INFO
org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
maxThreadIdleTime to: 5
376 [main] INFO
org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
sizeOfQueue to: -1
378 [main] INFO
org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
fairnessPolicy to: false
645 [main] INFO org.apache.solr.logging.LogWatcher – SLF4J impl is
org.slf4j.impl.Log4jLoggerFactory
646 [main] INFO org.apache.solr.logging.LogWatcher – Registering Log
Listener [Log4j (org.slf4j.impl.Log4jLoggerFactory)]
647 [main] INFO org.apache.solr.core.ZkContainer – Zookeeper
client=localhost:2181/search/catalog
653 [main] INFO org.apache.solr.cloud.ZkController – zkHost includes
chroot
762 [main] INFO org.apache.solr.common.cloud.ConnectionManager – Waiting
for client to connect to ZooKeeper
5781 [main-EventThread] INFO
org.apache.solr.common.cloud.ConnectionManager – Watcher
org.apache.solr.common.cloud.ConnectionManager@25630eb6name:ZooKeeperConnection
Watcher:localhost:2181 got event WatchedEvent
state:SyncConnected type:None path:null path:null type:None
5783 [main] INFO org.apache.solr.common.cloud.ConnectionManager – Client
is connected to ZooKeeper
5792 [main] INFO org.apache.solr.common.cloud.ConnectionManager – Waiting
for client to connect to ZooKeeper
5827 [main-EventThread] INFO
org.apache.solr.common.cloud.ConnectionManager – Watcher
org.apache.solr.common.cloud.ConnectionManager@52f5bad0name:ZooKeeperConnection
Watcher:localhost:2181/search/catalog got event
WatchedEvent state:SyncConnected type:None path:null path:null type:None
5827 [main] INFO org.apache.solr.common.cloud.ConnectionManager – Client
is connected to ZooKeeper
5852 [main] INFO org.apache.solr.common.cloud.ZkStateReader – Updating
cluster state from ZooKeeper...
6877 [main] INFO org.apache.solr.cloud.ZkController – Register node as
live in ZooKeeper:/live_nodes/127.0.0.1:8983_solr
6880 [main] INFO org.apache.solr.common.cloud.SolrZkClient – makePath:
/live_nodes/127.0.0.1:8983_solr
6885 [main-EventThread] INFO org.apache.solr.common.cloud.ZkStateReader –
Updating live nodes... (1)
6892 [main] INFO org.apache.solr.common.cloud.SolrZkClient – makePath:
/overseer_elect/leader
6896 [main] INFO org.apache.solr.cloud.Overseer – Overseer
(id=90988393900081217-127.0.0.1:8983_solr-n_0000000017) starting
6913 [Overseer-90988393900081217-127.0.0.1:8983_solr-n_0000000017] INFO
org.apache.solr.cloud.OverseerCollectionProcessor – Process current queue
of collection creations
6917 [Thread-5] INFO org.apache.solr.cloud.Overseer – Starting to work on
the main queue
6946 [main] INFO org.apache.solr.core.CoresLocator – Looking for core
definitions underneath /Users/gpatwa/opensource/solr-hdfs-home
6979 [main] INFO org.apache.solr.core.CoresLocator – Found core
event_shard1_replica1 in
/Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1/
6980 [main] INFO org.apache.solr.core.CoresLocator – Found 1 core
definitions
6981 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.cloud.ZkController – publishing core=event_shard1_replica1
state=down
6984 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.cloud.ZkController – waiting to find shard id in
clusterstate for event_shard1_replica1
6984 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.CoreContainer – Creating SolrCore
'event_shard1_replica1' using instanceDir:
/Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1
6984 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.cloud.ZkController – Check for collection zkNode:event
6985 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.cloud.ZkController – Collection zkNode exists
6986 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.cloud.ZkController – Load collection config
from:/collections/event
6987 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.SolrResourceLoader – new SolrResourceLoader for
directory: '/Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1/'
7036 [coreLoadExecutor-4-thread-1] WARN org.apache.solr.core.Config – You
should not use LUCENE_CURRENT as luceneMatchVersion property: if you use
this setting, and then Solr upgrades to a newer release of Lucene, sizable
changes may happen. If precise back compatibility is important then you
should instead explicitly specify an actual Lucene version.
7172 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrConfig –
Using Lucene MatchVersion: LUCENE_CURRENT
7283 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.Config –
Loaded SolrConfig: solrconfig.xml
7292 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.schema.IndexSchema – Reading Solr Schema from schema.xml
7354 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.schema.IndexSchema – [event_shard1_replica1] Schema
name=event-hdfs
7686 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.schema.IndexSchema – default search field in schema is
searchKeywords_en_US
7688 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.schema.IndexSchema – query parser default operator is OR
7691 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.schema.IndexSchema – unique key field: id
7826 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
solr.HdfsDirectoryFactory
7836 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.HdfsDirectoryFactory – Solr Kerberos Authentication
disabled
7836 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
[event_shard1_replica1] Opening new SolrCore at
/Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1/,
dataDir=hdfs://10.249.132.29:8020/solr-hdfs/
7838 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.JmxMonitoredMap – No JMX servers found, not exposing
Solr information with JMX.
7845 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.HdfsDirectoryFactory – creating directory factory for
path hdfs://10.249.132.29:8020/solr-hdfs
7857 [coreLoadExecutor-4-thread-1] INFO
org.apache.hadoop.metrics.jvm.JvmMetrics – Initializing JVM Metrics with
processName=blockcache, sessionId=1388636531350
2014-01-01 20:22:11.488 java[62306:10b03] Unable to load realm info from
SCDynamicStore
8001 [coreLoadExecutor-4-thread-1] WARN
org.apache.hadoop.util.NativeCodeLoader – Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
8426 [Thread-5] INFO org.apache.solr.common.cloud.ZkStateReader –
Updating cloud state from ZooKeeper...
8428 [Thread-5] INFO org.apache.solr.cloud.Overseer – Update state
numShards=1 message={
"operation":"state",
"state":"down",
"base_url":"http://127.0.0.1:8983/solr",
"core":"event_shard1_replica1",
"roles":null,
"node_name":"127.0.0.1:8983_solr",
"shard":"shard1",
"shard_range":null,
"shard_state":"active",
"shard_parent":null,
"collection":"event",
"numShards":"1",
"core_node_name":"core_node1"}
8450 [main-EventThread] INFO org.apache.solr.common.cloud.ZkStateReader –
A cluster state change: WatchedEvent state:SyncConnected
type:NodeDataChanged path:/clusterstate.json, has occurred - updating...
(live nodes size: 1)
8513 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
[event_shard1_replica1] CLOSING SolrCore
org.apache.solr.core.SolrCore@5035135a
8513 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.update.SolrCoreState – Closing SolrCoreState
8513 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.update.DefaultSolrCoreState – SolrCoreState ref count has
reached 0 - closing IndexWriter
8514 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
[event_shard1_replica1] Closing main searcher on request.
8514 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.CachingDirectoryFactory – Closing
HdfsDirectoryFactory - 0 directories currently being tracked
8516 [coreLoadExecutor-4-thread-1] ERROR
org.apache.solr.core.CoreContainer – Unable to create core:
event_shard1_replica1
org.apache.solr.common.SolrException: Problem creating directory: hdfs://
10.249.132.29:8020/solr-hdfs
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:834)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:625)
at org.apache.solr.core.ZkContainer.createFromZk(ZkContainer.java:256)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:590)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:271)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:263)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(
ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(
ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.RuntimeException: Problem creating directory: hdfs://
10.249.132.29:8020/solr-hdfs
at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:68)
at org.apache.solr.core.HdfsDirectoryFactory.create(
HdfsDirectoryFactory.java:154)
at org.apache.solr.core.CachingDirectoryFactory.get(
CachingDirectoryFactory.java:350)
at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:251)
at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:465)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:755)
... 13 more
Caused by: java.io.IOException: Failed on local exception:
com.google.protobuf.InvalidProtocolBufferException: Protocol message
contained an invalid tag (zero).; Host Details : local host is:
"LM-SFA-00713958/192.168.1.66"; destination host is: "10.249.132.29":8020;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
at org.apache.hadoop.ipc.Client.call(Client.java:1351)
at org.apache.hadoop.ipc.Client.call(Client.java:1300)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(
ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(
NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(
DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(
RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
at
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(
ClientNamenodeProtocolTranslatorPB.java:651)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1679)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
DistributedFileSystem.java:1106)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
DistributedFileSystem.java:1102)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(
FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(
DistributedFileSystem.java:1102)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1397)
at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:63)
... 18 more
Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
message contained an invalid tag (zero).
at com.google.protobuf.InvalidProtocolBufferException.invalidTag(
InvalidProtocolBufferException.java:89)
at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:108)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(
RpcHeaderProtos.java:1398)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(
RpcHeaderProtos.java:1362)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(
RpcHeaderProtos.java:1492)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(
RpcHeaderProtos.java:1487)
at com.google.protobuf.AbstractParser.parsePartialFrom(
AbstractParser.java:200)
at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(
AbstractParser.java:241)
at com.google.protobuf.AbstractParser.parseDelimitedFrom(
AbstractParser.java:253)
at com.google.protobuf.AbstractParser.parseDelimitedFrom(
AbstractParser.java:259)
at com.google.protobuf.AbstractParser.parseDelimitedFrom(
AbstractParser.java:49)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(
RpcHeaderProtos.java:2364)
at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(
Client.java:996)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
8519 [coreLoadExecutor-4-thread-1] ERROR
org.apache.solr.core.CoreContainer –
null:org.apache.solr.common.SolrException: Unable to create core:
event_shard1_replica1
at org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:977)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:601)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:271)
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:263)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(
ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(
ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:680)
Caused by: org.apache.solr.common.SolrException: Problem creating
directory: hdfs://10.249.132.29:8020/solr-hdfs
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:834)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:625)
at org.apache.solr.core.ZkContainer.createFromZk(ZkContainer.java:256)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:590)
... 10 more
Caused by: java.lang.RuntimeException: Problem creating directory: hdfs://
10.249.132.29:8020/solr-hdfs
at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:68)
at org.apache.solr.core.HdfsDirectoryFactory.create(
HdfsDirectoryFactory.java:154)
at org.apache.solr.core.CachingDirectoryFactory.get(
CachingDirectoryFactory.java:350)
at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:251)
at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:465)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:755)
... 13 more
Caused by: java.io.IOException: Failed on local exception:
com.google.protobuf.InvalidProtocolBufferException: Protocol message
contained an invalid tag (zero).; Host Details : local host is:
"LM-SFA-00713958/192.168.1.66"; destination host is: "10.249.132.29":8020;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
at org.apache.hadoop.ipc.Client.call(Client.java:1351)
at org.apache.hadoop.ipc.Client.call(Client.java:1300)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(
ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(
NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(
DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(
RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
at
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(
ClientNamenodeProtocolTranslatorPB.java:651)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1679)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
DistributedFileSystem.java:1106)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
DistributedFileSystem.java:1102)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(
FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(
DistributedFileSystem.java:1102)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1397)
at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:63)
... 18 more
Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
message contained an invalid tag (zero).
at com.google.protobuf.InvalidProtocolBufferException.invalidTag(
InvalidProtocolBufferException.java:89)
at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:108)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(
RpcHeaderProtos.java:1398)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(
RpcHeaderProtos.java:1362)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(
RpcHeaderProtos.java:1492)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(
RpcHeaderProtos.java:1487)
at com.google.protobuf.AbstractParser.parsePartialFrom(
AbstractParser.java:200)
at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(
AbstractParser.java:241)
at com.google.protobuf.AbstractParser.parseDelimitedFrom(
AbstractParser.java:253)
at com.google.protobuf.AbstractParser.parseDelimitedFrom(
AbstractParser.java:259)
at com.google.protobuf.AbstractParser.parseDelimitedFrom(
AbstractParser.java:49)
at
org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(
RpcHeaderProtos.java:2364)
at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(
Client.java:996)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
8520 [main] INFO org.apache.solr.servlet.SolrDispatchFilter –
user.dir=/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs
8521 [main] INFO org.apache.solr.servlet.SolrDispatchFilter –
SolrDispatchFilter.init() done
2014-01-01 20:22:12.344:INFO:oejs.AbstractConnector:Started
SelectChannelConnector@127.0.0.1:8983
[INFO] Started Jetty Server
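A note on the root exception in the trace above: in the protobuf wire format, every field is prefixed by a varint tag, (field_number << 3) | wire_type, and field number 0 is reserved, so a zero tag byte is always invalid. Hitting "invalid tag (zero)" while parsing an RPC response header therefore usually means the client and server are not speaking the same framing (e.g. mismatched Hadoop/protobuf versions) rather than a data problem. A toy decoder illustrating the wire rule (not Hadoop code, just the format):

```python
def read_tag(data: bytes):
    """Decode the leading varint of a protobuf stream and split it into
    (field_number, wire_type) per the wire format:
    tag = (field_number << 3) | wire_type."""
    value, shift = 0, 0
    for b in data:
        value |= (b & 0x7F) << shift  # accumulate 7 payload bits per byte
        if not b & 0x80:              # high bit clear -> last varint byte
            break
        shift += 7
    field_number, wire_type = value >> 3, value & 0x07
    if field_number == 0:
        # The condition Hadoop's RPC layer reports as
        # "Protocol message contained an invalid tag (zero)."
        raise ValueError("invalid tag (zero)")
    return field_number, wire_type

print(read_tag(b"\x08\x01"))  # field 1, wire type 0 (varint)
```

A zero first byte, as in read_tag(b"\x00"), raises the error, which is consistent with the version-mismatch diagnosis in the replies below.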
Re: Solr use with Cloudera HDFS failed creating directory
Posted by soodyogesh <so...@gmail.com>.
Has anyone been able to sort this one out? I'm hitting the same error.
Is there a way to fix this by copying the right version of the jars? I tried
copying an older version of the jar into the Solr lib but got the same error.
Solr: 4.6.1
Hadoop: 2.0.0 (CDH)
--
View this message in context: http://lucene.472066.n3.nabble.com/Solr-use-with-Cloudera-HDFS-failed-creating-directory-tp4109143p4123082.html
Sent from the Solr - User mailing list archive at Nabble.com.
Re: Solr use with Cloudera HDFS failed creating directory
Posted by Mark Miller <ma...@gmail.com>.
Yup - sounds right. Previous releases were not built against Hadoop 2 GA, but against whatever was available at the time (the beta release, etc.).
You can only be sure things will work right with the version we build against, though things will often work with other versions, depending on what has changed.
- Mark
On Jan 5, 2014, at 5:33 PM, Gopal Patwa <go...@gmail.com> wrote:
> I gave another try with Solr 4.4, which ships with the Cloudera VM as Cloudera
> Search, but got the same result. It seems there is a compatibility issue with
> the protobuf library dependency between the Hadoop Java client and the HDFS
> server itself.
>
> Solr 4.4 depends on protobuf-java-2.4.0a.jar
> Solr 4.6 depends on protobuf-java-2.5.0.jar
>
> Finally I tried the Hortonworks HDFS distribution
> http://hortonworks.com/products/hortonworks-sandbox/#install
>
> Wow!!! it worked without any issue.
>
> Log Snippet:
>
> 4933 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.core.JmxMonitoredMap – No JMX servers found, not exposing
> Solr information with JMX.
>
> 4942 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.core.HdfsDirectoryFactory – creating directory factory for
> path hdfs://10.249.132.15:8020/solr-hdfs
>
> 4953 [coreLoadExecutor-4-thread-1] INFO
> org.apache.hadoop.metrics.jvm.JvmMetrics – Initializing JVM Metrics with
> processName=blockcache, sessionId=1388960072262
>
> 2014-01-05 14:14:32.403 java[46758:10b03] Unable to load realm info from
> SCDynamicStore
>
> 5115 [coreLoadExecutor-4-thread-1] WARN
> org.apache.hadoop.util.NativeCodeLoader – Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
>
> 5962 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.core.CachingDirectoryFactory – return new directory for
> hdfs://10.249.132.15:8020/solr-hdfs
>
> 5999 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
> New index directory detected: old=null new=hdfs://
> 10.249.132.15:8020/solr-hdfs/index/
>
> 6075 [coreLoadExecutor-4-thread-1] WARN org.apache.solr.core.SolrCore –
> [event_shard1_replica1] Solr index directory 'hdfs:/
> 10.249.132.15:8020/solr-hdfs/index' doesn't exist. Creating new index...
>
> 6085 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.core.HdfsDirectoryFactory – creating directory factory for
> path hdfs://10.249.132.15:8020/solr-hdfs/index
>
> 6086 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.core.HdfsDirectoryFactory – Number of slabs of block cache
> [1] with direct memory allocation set to [true]
>
> 6086 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.core.HdfsDirectoryFactory – Block cache target memory
> usage, slab size of [134217728] will allocate [1] slabs and use
> ~[134217728] bytes
>
> 6087 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.store.blockcache.BufferStore – Initializing the 1024
> buffers with [8192] buffers.
>
> 6114 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.store.blockcache.BufferStore – Initializing the 8192
> buffers with [8192] buffers.
>
> 6408 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.core.CachingDirectoryFactory – return new directory for
> hdfs://10.249.132.15:8020/solr-hdfs/index
>
> 7907 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
> SolrDeletionPolicy.onCommit: commits: num=1
>
> commit{dir=NRTCachingDirectory(org.apache.solr.store.hdfs.HdfsDirectory@6cab6dcblockFactory=org.apache.solr.store.hdfs.HdfsLockFactory@4a6d0362;
> maxCacheMB=192.0 maxMergeSizeMB=16.0),segFN=segments_1,generation=1}
>
>
>
>
> On Thu, Jan 2, 2014 at 8:20 AM, Gopal Patwa <go...@gmail.com> wrote:
>
>> I am trying to setup Solr with HDFS following this wiki
>>
>> https://cwiki.apache.org/confluence/display/solr/Running+Solr+on+HDFS
>>
>> My Setup:
>>
>> ***********
>>
>> VMWare: Cloudera Quick Start VM 4.4.0-1 default setup (only hdfs1,
>> hive1,hue1,mapreduce1 and zookeeper1 is running)
>>
>>
>> http://www.cloudera.com/content/support/en/downloads/download-components/download-products.html?productID=F6mO278Rvo
>>
>> SolrCloud:
>>
>> Mac OS 10.7.5 -> -Running Solr 4.6 with maven jetty plugin in eclipse
>> outside from HDFS (Cloudera VM) , so it is accessing HDFS as remote service
>>
>> External zookeeper 3 nodes
>>
>> Java 1.6, Jett Container 8.1
>>
>> Collection with 1 shard and 1 replica
>>
>> ************
>>
>> But I am getting below error "Problem creating directory:" I have created
>> this directory manually in hdfs. Do I need to setup some special user
>> permission in Solr?. or do I need to always run solr instance in HDFS (Data
>> Node)?
>>
>> [cloudera@localhost ~]$ sudo -u hdfs hadoop fs -mkdir /solr-hdfs
>>
>> Directory permisson in HDFS:
>>
>> solr-hdfs rwxr-xr-x hdfs supergroup
>>
>> Startup Log:
>>
>> 2014-01-01 20:21:57.433:INFO:oejs.Server:jetty-8.1.7.v20120910
>>
>> 2014-01-01 20:21:59.710:INFO:omjp.MavenWebInfConfiguration:Adding overlay:
>> file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/
>>
>> 2014-01-01 20:22:02.249:INFO:oejpw.PlusConfiguration:No Transaction
>> manager found - if your webapp requires one, please configure one.
>>
>> 2014-01-01 20:22:03.368:INFO:oejsh.ContextHandler:started
>> o.m.j.p.JettyWebAppContext{/solr,[file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/,
>> file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/]},file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/
>>
>> 2014-01-01 20:22:03.369:INFO:oejsh.ContextHandler:started
>> o.m.j.p.JettyWebAppContext{/solr,[file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/,
>> file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/]},file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/
>>
>> 2014-01-01 20:22:03.369:INFO:oejsh.ContextHandler:started
>> o.m.j.p.JettyWebAppContext{/solr,[file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/,
>> file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/]},file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/
>>
>> 0 [main] INFO org.apache.solr.servlet.SolrDispatchFilter –
>> SolrDispatchFilter.init()
>>
>> 29 [main] INFO org.apache.solr.core.SolrResourceLoader – No /solr/home
>> in JNDI
>>
>> 30 [main] INFO org.apache.solr.core.SolrResourceLoader – using system
>> property solr.solr.home: /Users/gpatwa/opensource/solr-hdfs-home
>>
>> 32 [main] INFO org.apache.solr.core.SolrResourceLoader – new
>> SolrResourceLoader for directory: '/Users/gpatwa/opensource/solr-hdfs-home/'
>>
>> 220 [main] INFO org.apache.solr.core.ConfigSolr – Loading container
>> configuration from /Users/gpatwa/opensource/solr-hdfs-home/solr.xml
>>
>> 348 [main] INFO org.apache.solr.core.ConfigSolrXml – Config-defined
>> core root directory:
>>
>> 358 [main] INFO org.apache.solr.core.CoreContainer – New CoreContainer
>> 445620464
>>
>> 359 [main] INFO org.apache.solr.core.CoreContainer – Loading cores into
>> CoreContainer [instanceDir=/Users/gpatwa/opensource/solr-hdfs-home/]
>>
>> 374 [main] INFO
>> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
>> socketTimeout to: 120000
>>
>> 375 [main] INFO
>> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
>> urlScheme to: http://
>>
>> 375 [main] INFO
>> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
>> connTimeout to: 15000
>>
>> 375 [main] INFO
>> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
>> maxConnectionsPerHost to: 20
>>
>> 375 [main] INFO
>> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
>> corePoolSize to: 0
>>
>> 376 [main] INFO
>> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
>> maximumPoolSize to: 2147483647
>>
>> 376 [main] INFO
>> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
>> maxThreadIdleTime to: 5
>>
>> 376 [main] INFO
>> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
>> sizeOfQueue to: -1
>>
>> 378 [main] INFO
>> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
>> fairnessPolicy to: false
>>
>> 645 [main] INFO org.apache.solr.logging.LogWatcher – SLF4J impl is
>> org.slf4j.impl.Log4jLoggerFactory
>>
>> 646 [main] INFO org.apache.solr.logging.LogWatcher – Registering Log
>> Listener [Log4j (org.slf4j.impl.Log4jLoggerFactory)]
>>
>> 647 [main] INFO org.apache.solr.core.ZkContainer – Zookeeper
>> client=localhost:2181/search/catalog
>>
>> 653 [main] INFO org.apache.solr.cloud.ZkController – zkHost includes
>> chroot
>>
>> 762 [main] INFO org.apache.solr.common.cloud.ConnectionManager –
>> Waiting for client to connect to ZooKeeper
>>
>> 5781 [main-EventThread] INFO
>> org.apache.solr.common.cloud.ConnectionManager – Watcher
>> org.apache.solr.common.cloud.ConnectionManager@25630eb6 name:ZooKeeperConnection Watcher:localhost:2181 got event WatchedEvent
>> state:SyncConnected type:None path:null path:null type:None
>>
>> 5783 [main] INFO org.apache.solr.common.cloud.ConnectionManager – Client
>> is connected to ZooKeeper
>>
>> 5792 [main] INFO org.apache.solr.common.cloud.ConnectionManager –
>> Waiting for client to connect to ZooKeeper
>>
>> 5827 [main-EventThread] INFO
>> org.apache.solr.common.cloud.ConnectionManager – Watcher
>> org.apache.solr.common.cloud.ConnectionManager@52f5bad0 name:ZooKeeperConnection Watcher:localhost:2181/search/catalog got event
>> WatchedEvent state:SyncConnected type:None path:null path:null type:None
>>
>> 5827 [main] INFO org.apache.solr.common.cloud.ConnectionManager – Client
>> is connected to ZooKeeper
>>
>> 5852 [main] INFO org.apache.solr.common.cloud.ZkStateReader – Updating
>> cluster state from ZooKeeper...
>>
>> 6877 [main] INFO org.apache.solr.cloud.ZkController – Register node as
>> live in ZooKeeper:/live_nodes/127.0.0.1:8983_solr
>>
>> 6880 [main] INFO org.apache.solr.common.cloud.SolrZkClient – makePath:
>> /live_nodes/127.0.0.1:8983_solr
>>
>> 6885 [main-EventThread] INFO org.apache.solr.common.cloud.ZkStateReader
>> – Updating live nodes... (1)
>>
>> 6892 [main] INFO org.apache.solr.common.cloud.SolrZkClient – makePath:
>> /overseer_elect/leader
>>
>> 6896 [main] INFO org.apache.solr.cloud.Overseer – Overseer
>> (id=90988393900081217-127.0.0.1:8983_solr-n_0000000017) starting
>>
>> 6913 [Overseer-90988393900081217-127.0.0.1:8983_solr-n_0000000017] INFO
>> org.apache.solr.cloud.OverseerCollectionProcessor – Process current queue
>> of collection creations
>>
>> 6917 [Thread-5] INFO org.apache.solr.cloud.Overseer – Starting to work
>> on the main queue
>>
>> 6946 [main] INFO org.apache.solr.core.CoresLocator – Looking for core
>> definitions underneath /Users/gpatwa/opensource/solr-hdfs-home
>>
>> 6979 [main] INFO org.apache.solr.core.CoresLocator – Found core
>> event_shard1_replica1 in
>> /Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1/
>>
>> 6980 [main] INFO org.apache.solr.core.CoresLocator – Found 1 core
>> definitions
>>
>> 6981 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.cloud.ZkController – publishing core=event_shard1_replica1
>> state=down
>>
>> 6984 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.cloud.ZkController – waiting to find shard id in
>> clusterstate for event_shard1_replica1
>>
>> 6984 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.core.CoreContainer – Creating SolrCore
>> 'event_shard1_replica1' using instanceDir:
>> /Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1
>>
>> 6984 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.cloud.ZkController – Check for collection zkNode:event
>>
>> 6985 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.cloud.ZkController – Collection zkNode exists
>>
>> 6986 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.cloud.ZkController – Load collection config
>> from:/collections/event
>>
>> 6987 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.core.SolrResourceLoader – new SolrResourceLoader for
>> directory: '/Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1/'
>>
>> 7036 [coreLoadExecutor-4-thread-1] WARN org.apache.solr.core.Config –
>> You should not use LUCENE_CURRENT as luceneMatchVersion property: if you
>> use this setting, and then Solr upgrades to a newer release of Lucene,
>> sizable changes may happen. If precise back compatibility is important then
>> you should instead explicitly specify an actual Lucene version.
>>
>> 7172 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrConfig
>> – Using Lucene MatchVersion: LUCENE_CURRENT
>>
>> 7283 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.Config –
>> Loaded SolrConfig: solrconfig.xml
>>
>> 7292 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.schema.IndexSchema – Reading Solr Schema from schema.xml
>>
>> 7354 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.schema.IndexSchema – [event_shard1_replica1] Schema
>> name=event-hdfs
>>
>> 7686 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.schema.IndexSchema – default search field in schema is
>> searchKeywords_en_US
>>
>> 7688 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.schema.IndexSchema – query parser default operator is OR
>>
>> 7691 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.schema.IndexSchema – unique key field: id
>>
>> 7826 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
>> solr.HdfsDirectoryFactory
>>
>> 7836 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.core.HdfsDirectoryFactory – Solr Kerberos Authentication
>> disabled
>>
>> 7836 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
>> [event_shard1_replica1] Opening new SolrCore at
>> /Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1/,
>> dataDir=hdfs://10.249.132.29:8020/solr-hdfs/
>>
>> 7838 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.core.JmxMonitoredMap – No JMX servers found, not exposing
>> Solr information with JMX.
>>
>> 7845 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.core.HdfsDirectoryFactory – creating directory factory for
>> path hdfs://10.249.132.29:8020/solr-hdfs
>>
>> 7857 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.hadoop.metrics.jvm.JvmMetrics – Initializing JVM Metrics with
>> processName=blockcache, sessionId=1388636531350
>>
>> 2014-01-01 20:22:11.488 java[62306:10b03] Unable to load realm info from
>> SCDynamicStore
>>
>> 8001 [coreLoadExecutor-4-thread-1] WARN
>> org.apache.hadoop.util.NativeCodeLoader – Unable to load native-hadoop
>> library for your platform... using builtin-java classes where applicable
>>
>> 8426 [Thread-5] INFO org.apache.solr.common.cloud.ZkStateReader –
>> Updating cloud state from ZooKeeper...
>>
>> 8428 [Thread-5] INFO org.apache.solr.cloud.Overseer – Update state
>> numShards=1 message={
>>
>> "operation":"state",
>>
>> "state":"down",
>>
>> "base_url":"http://127.0.0.1:8983/solr",
>>
>> "core":"event_shard1_replica1",
>>
>> "roles":null,
>>
>> "node_name":"127.0.0.1:8983_solr",
>>
>> "shard":"shard1",
>>
>> "shard_range":null,
>>
>> "shard_state":"active",
>>
>> "shard_parent":null,
>>
>> "collection":"event",
>>
>> "numShards":"1",
>>
>> "core_node_name":"core_node1"}
>>
>> 8450 [main-EventThread] INFO org.apache.solr.common.cloud.ZkStateReader
>> – A cluster state change: WatchedEvent state:SyncConnected
>> type:NodeDataChanged path:/clusterstate.json, has occurred - updating...
>> (live nodes size: 1)
>>
>> 8513 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
>> [event_shard1_replica1] CLOSING SolrCore
>> org.apache.solr.core.SolrCore@5035135a
>>
>> 8513 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.update.SolrCoreState – Closing SolrCoreState
>>
>> 8513 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.update.DefaultSolrCoreState – SolrCoreState ref count has
>> reached 0 - closing IndexWriter
>>
>> 8514 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
>> [event_shard1_replica1] Closing main searcher on request.
>>
>> 8514 [coreLoadExecutor-4-thread-1] INFO
>> org.apache.solr.core.CachingDirectoryFactory – Closing
>> HdfsDirectoryFactory - 0 directories currently being tracked
>>
>> 8516 [coreLoadExecutor-4-thread-1] ERROR
>> org.apache.solr.core.CoreContainer – Unable to create core:
>> event_shard1_replica1
>>
>> org.apache.solr.common.SolrException: Problem creating directory: hdfs://
>> 10.249.132.29:8020/solr-hdfs
>>
>> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:834)
>>
>> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:625)
>>
>> at org.apache.solr.core.ZkContainer.createFromZk(ZkContainer.java:256)
>>
>> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:590)
>>
>> at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:271)
>>
>> at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:263)
>>
>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>
>> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>
>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>
>> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>
>> at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(
>> ThreadPoolExecutor.java:895)
>>
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(
>> ThreadPoolExecutor.java:918)
>>
>> at java.lang.Thread.run(Thread.java:680)
>>
>> Caused by: java.lang.RuntimeException: Problem creating directory: hdfs://
>> 10.249.132.29:8020/solr-hdfs
>>
>> at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:68)
>>
>> at org.apache.solr.core.HdfsDirectoryFactory.create(
>> HdfsDirectoryFactory.java:154)
>>
>> at org.apache.solr.core.CachingDirectoryFactory.get(
>> CachingDirectoryFactory.java:350)
>>
>> at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:251)
>>
>> at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:465)
>>
>> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:755)
>>
>> ... 13 more
>>
>> Caused by: java.io.IOException: Failed on local exception:
>> com.google.protobuf.InvalidProtocolBufferException: Protocol message
>> contained an invalid tag (zero).; Host Details : local host is:
>> "LM-SFA-00713958/192.168.1.66"; destination host is:
>> "10.249.132.29":8020;
>>
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>>
>> at org.apache.hadoop.ipc.Client.call(Client.java:1351)
>>
>> at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>
>> at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(
>> ProtobufRpcEngine.java:206)
>>
>> at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
>>
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>
>> at sun.reflect.NativeMethodAccessorImpl.invoke(
>> NativeMethodAccessorImpl.java:39)
>>
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(
>> DelegatingMethodAccessorImpl.java:25)
>>
>> at java.lang.reflect.Method.invoke(Method.java:597)
>>
>> at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(
>> RetryInvocationHandler.java:186)
>>
>> at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
>> RetryInvocationHandler.java:102)
>>
>> at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
>>
>> at
>> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(
>> ClientNamenodeProtocolTranslatorPB.java:651)
>>
>> at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1679)
>>
>> at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
>> DistributedFileSystem.java:1106)
>>
>> at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
>> DistributedFileSystem.java:1102)
>>
>> at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(
>> FileSystemLinkResolver.java:81)
>>
>> at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(
>> DistributedFileSystem.java:1102)
>>
>> at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1397)
>>
>> at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:63)
>>
>> ... 18 more
>>
>> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message contained an invalid tag (zero).
>>
>> at com.google.protobuf.InvalidProtocolBufferException.invalidTag(
>> InvalidProtocolBufferException.java:89)
>>
>> at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:108)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(
>> RpcHeaderProtos.java:1398)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(
>> RpcHeaderProtos.java:1362)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(
>> RpcHeaderProtos.java:1492)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(
>> RpcHeaderProtos.java:1487)
>>
>> at com.google.protobuf.AbstractParser.parsePartialFrom(
>> AbstractParser.java:200)
>>
>> at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(
>> AbstractParser.java:241)
>>
>> at com.google.protobuf.AbstractParser.parseDelimitedFrom(
>> AbstractParser.java:253)
>>
>> at com.google.protobuf.AbstractParser.parseDelimitedFrom(
>> AbstractParser.java:259)
>>
>> at com.google.protobuf.AbstractParser.parseDelimitedFrom(
>> AbstractParser.java:49)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(
>> RpcHeaderProtos.java:2364)
>>
>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(
>> Client.java:996)
>>
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
>>
>> 8519 [coreLoadExecutor-4-thread-1] ERROR
>> org.apache.solr.core.CoreContainer –
>> null:org.apache.solr.common.SolrException: Unable to create core:
>> event_shard1_replica1
>>
>> at org.apache.solr.core.CoreContainer.recordAndThrow(
>> CoreContainer.java:977)
>>
>> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:601)
>>
>> at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:271)
>>
>> at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:263)
>>
>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>
>> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>
>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>
>> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>
>> at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(
>> ThreadPoolExecutor.java:895)
>>
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(
>> ThreadPoolExecutor.java:918)
>>
>> at java.lang.Thread.run(Thread.java:680)
>>
>> Caused by: org.apache.solr.common.SolrException: Problem creating
>> directory: hdfs://10.249.132.29:8020/solr-hdfs
>>
>> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:834)
>>
>> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:625)
>>
>> at org.apache.solr.core.ZkContainer.createFromZk(ZkContainer.java:256)
>>
>> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:590)
>>
>> ... 10 more
>>
>> Caused by: java.lang.RuntimeException: Problem creating directory: hdfs://
>> 10.249.132.29:8020/solr-hdfs
>>
>> at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:68)
>>
>> at org.apache.solr.core.HdfsDirectoryFactory.create(
>> HdfsDirectoryFactory.java:154)
>>
>> at org.apache.solr.core.CachingDirectoryFactory.get(
>> CachingDirectoryFactory.java:350)
>>
>> at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:251)
>>
>> at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:465)
>>
>> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:755)
>>
>> ... 13 more
>>
>> Caused by: java.io.IOException: Failed on local exception:
>> com.google.protobuf.InvalidProtocolBufferException: Protocol message
>> contained an invalid tag (zero).; Host Details : local host is:
>> "LM-SFA-00713958/192.168.1.66"; destination host is:
>> "10.249.132.29":8020;
>>
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>>
>> at org.apache.hadoop.ipc.Client.call(Client.java:1351)
>>
>> at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>>
>> at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(
>> ProtobufRpcEngine.java:206)
>>
>> at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
>>
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>
>> at sun.reflect.NativeMethodAccessorImpl.invoke(
>> NativeMethodAccessorImpl.java:39)
>>
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(
>> DelegatingMethodAccessorImpl.java:25)
>>
>> at java.lang.reflect.Method.invoke(Method.java:597)
>>
>> at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(
>> RetryInvocationHandler.java:186)
>>
>> at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(
>> RetryInvocationHandler.java:102)
>>
>> at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
>>
>> at
>> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(
>> ClientNamenodeProtocolTranslatorPB.java:651)
>>
>> at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1679)
>>
>> at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
>> DistributedFileSystem.java:1106)
>>
>> at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
>> DistributedFileSystem.java:1102)
>>
>> at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(
>> FileSystemLinkResolver.java:81)
>>
>> at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(
>> DistributedFileSystem.java:1102)
>>
>> at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1397)
>>
>> at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:63)
>>
>> ... 18 more
>>
>> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message contained an invalid tag (zero).
>>
>> at com.google.protobuf.InvalidProtocolBufferException.invalidTag(
>> InvalidProtocolBufferException.java:89)
>>
>> at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:108)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(
>> RpcHeaderProtos.java:1398)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(
>> RpcHeaderProtos.java:1362)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(
>> RpcHeaderProtos.java:1492)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(
>> RpcHeaderProtos.java:1487)
>>
>> at com.google.protobuf.AbstractParser.parsePartialFrom(
>> AbstractParser.java:200)
>>
>> at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(
>> AbstractParser.java:241)
>>
>> at com.google.protobuf.AbstractParser.parseDelimitedFrom(
>> AbstractParser.java:253)
>>
>> at com.google.protobuf.AbstractParser.parseDelimitedFrom(
>> AbstractParser.java:259)
>>
>> at com.google.protobuf.AbstractParser.parseDelimitedFrom(
>> AbstractParser.java:49)
>>
>> at
>> org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(
>> RpcHeaderProtos.java:2364)
>>
>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(
>> Client.java:996)
>>
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
>>
>>
>> 8520 [main] INFO org.apache.solr.servlet.SolrDispatchFilter –
>> user.dir=/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs
>>
>> 8521 [main] INFO org.apache.solr.servlet.SolrDispatchFilter –
>> SolrDispatchFilter.init() done
>>
>> 2014-01-01 20:22:12.344:INFO:oejs.AbstractConnector:Started
>> SelectChannelConnector@127.0.0.1:8983
>>
>> [INFO] Started Jetty Server
>>
>>
>>
Re: Solr use with Cloudera HDFS failed creating directory
Posted by Gopal Patwa <go...@gmail.com>.
I gave it another try with Solr 4.4, which ships with the Cloudera VM as
Cloudera Search, but got the same result. It seems there is a compatibility
issue between the protobuf library dependency in the Hadoop Java client and
the HDFS server itself.
Solr 4.4 depends on protobuf-java-2.4.0a.jar
Solr 4.6 depends on protobuf-java-2.5.0.jar
Finally I tried the Hortonworks HDFS distribution
http://hortonworks.com/products/hortonworks-sandbox/#install
Wow!!! It worked without any issue.
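The "invalid tag (zero)" in the stack trace above is protobuf's way of saying
the two sides disagree on the wire format. As an illustration only (this code
is not from Solr or Hadoop): a protobuf field tag is a varint encoding of
(field_number << 3) | wire_type, and field numbers start at 1, so a decoded
tag of 0 can never be valid. A client and server with mismatched RPC framing
end up reading stray bytes (often zeros) where a tag is expected, which is
exactly the error the Hadoop client reported:

```python
# Minimal sketch of a protobuf tag decoder, showing why a tag of zero is
# rejected. Not Hadoop or Solr code; for illustration of the error only.

def read_varint(buf, pos):
    """Decode a base-128 varint starting at buf[pos]; return (value, new_pos)."""
    result = 0
    shift = 0
    while True:
        byte = buf[pos]
        pos += 1
        result |= (byte & 0x7F) << shift
        if not byte & 0x80:  # high bit clear -> last byte of the varint
            return result, pos
        shift += 7

def read_tag(buf, pos):
    """Read one field tag; reject tag 0 the way real protobuf parsers do."""
    tag, pos = read_varint(buf, pos)
    if tag == 0:
        raise ValueError("Protocol message contained an invalid tag (zero).")
    return tag >> 3, tag & 0x07, pos  # (field_number, wire_type, new_pos)

# Field 1 with wire type 0 (varint) encodes as tag byte 0x08:
field, wire, pos = read_tag(bytes([0x08, 0x2A]), 0)
assert (field, wire) == (1, 0)

# A zero byte where a tag is expected (e.g. mismatched framing) fails:
try:
    read_tag(bytes([0x00]), 0)
except ValueError as e:
    print(e)  # Protocol message contained an invalid tag (zero).
```

This is why matching the client's protobuf/Hadoop jars to the cluster's
version matters: the bytes themselves are not corrupt, the two sides are
simply parsing different framings of the same stream.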
Log Snippet:
4933 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.JmxMonitoredMap – No JMX servers found, not exposing
Solr information with JMX.
4942 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.HdfsDirectoryFactory – creating directory factory for
path hdfs://10.249.132.15:8020/solr-hdfs
4953 [coreLoadExecutor-4-thread-1] INFO
org.apache.hadoop.metrics.jvm.JvmMetrics – Initializing JVM Metrics with
processName=blockcache, sessionId=1388960072262
2014-01-05 14:14:32.403 java[46758:10b03] Unable to load realm info from
SCDynamicStore
5115 [coreLoadExecutor-4-thread-1] WARN
org.apache.hadoop.util.NativeCodeLoader – Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
5962 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.CachingDirectoryFactory – return new directory for
hdfs://10.249.132.15:8020/solr-hdfs
5999 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
New index directory detected: old=null new=hdfs://
10.249.132.15:8020/solr-hdfs/index/
6075 [coreLoadExecutor-4-thread-1] WARN org.apache.solr.core.SolrCore –
[event_shard1_replica1] Solr index directory
'hdfs://10.249.132.15:8020/solr-hdfs/index' doesn't exist. Creating new index...
6085 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.HdfsDirectoryFactory – creating directory factory for
path hdfs://10.249.132.15:8020/solr-hdfs/index
6086 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.HdfsDirectoryFactory – Number of slabs of block cache
[1] with direct memory allocation set to [true]
6086 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.HdfsDirectoryFactory – Block cache target memory
usage, slab size of [134217728] will allocate [1] slabs and use
~[134217728] bytes
6087 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.store.blockcache.BufferStore – Initializing the 1024
buffers with [8192] buffers.
6114 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.store.blockcache.BufferStore – Initializing the 8192
buffers with [8192] buffers.
6408 [coreLoadExecutor-4-thread-1] INFO
org.apache.solr.core.CachingDirectoryFactory – return new directory for
hdfs://10.249.132.15:8020/solr-hdfs/index
7907 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore –
SolrDeletionPolicy.onCommit: commits: num=1
commit{dir=NRTCachingDirectory(org.apache.solr.store.hdfs.HdfsDirectory@6cab6dcb lockFactory=org.apache.solr.store.hdfs.HdfsLockFactory@4a6d0362;
maxCacheMB=192.0 maxMergeSizeMB=16.0),segFN=segments_1,generation=1}
On Thu, Jan 2, 2014 at 8:20 AM, Gopal Patwa <go...@gmail.com> wrote:
> I am trying to setup Solr with HDFS following this wiki
>
> https://cwiki.apache.org/confluence/display/solr/Running+Solr+on+HDFS
>
> My Setup:
>
> ***********
>
> VMWare: Cloudera Quick Start VM 4.4.0-1 default setup (only hdfs1,
> hive1,hue1,mapreduce1 and zookeeper1 is running)
>
>
> http://www.cloudera.com/content/support/en/downloads/download-components/download-products.html?productID=F6mO278Rvo
>
> SolrCloud:
>
> Mac OS 10.7.5 -> -Running Solr 4.6 with maven jetty plugin in eclipse
> outside from HDFS (Cloudera VM) , so it is accessing HDFS as remote service
>
> External zookeeper 3 nodes
>
> Java 1.6, Jett Container 8.1
>
> Collection with 1 shard and 1 replica
>
> ************
>
> But I am getting below error "Problem creating directory:" I have created
> this directory manually in hdfs. Do I need to setup some special user
> permission in Solr?. or do I need to always run solr instance in HDFS (Data
> Node)?
>
> [cloudera@localhost ~]$ sudo -u hdfs hadoop fs -mkdir /solr-hdfs
>
> Directory permisson in HDFS:
>
> solr-hdfs rwxr-xr-x hdfs supergroup
>
> Startup Log:
>
> 2014-01-01 20:21:57.433:INFO:oejs.Server:jetty-8.1.7.v20120910
>
> 2014-01-01 20:21:59.710:INFO:omjp.MavenWebInfConfiguration:Adding overlay:
> file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/
>
> 2014-01-01 20:22:02.249:INFO:oejpw.PlusConfiguration:No Transaction
> manager found - if your webapp requires one, please configure one.
>
> 2014-01-01 20:22:03.368:INFO:oejsh.ContextHandler:started
> o.m.j.p.JettyWebAppContext{/solr,[file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/,
> file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/]},file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/
>
> 2014-01-01 20:22:03.369:INFO:oejsh.ContextHandler:started
> o.m.j.p.JettyWebAppContext{/solr,[file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/,
> file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/]},file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/
>
> 2014-01-01 20:22:03.369:INFO:oejsh.ContextHandler:started
> o.m.j.p.JettyWebAppContext{/solr,[file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/,
> file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/target/tmp/solr-4_6_0_war/]},file:/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs/src/main/webapp/
>
> 0 [main] INFO org.apache.solr.servlet.SolrDispatchFilter –
> SolrDispatchFilter.init()
>
> 29 [main] INFO org.apache.solr.core.SolrResourceLoader – No /solr/home
> in JNDI
>
> 30 [main] INFO org.apache.solr.core.SolrResourceLoader – using system
> property solr.solr.home: /Users/gpatwa/opensource/solr-hdfs-home
>
> 32 [main] INFO org.apache.solr.core.SolrResourceLoader – new
> SolrResourceLoader for directory: '/Users/gpatwa/opensource/solr-hdfs-home/'
>
> 220 [main] INFO org.apache.solr.core.ConfigSolr – Loading container
> configuration from /Users/gpatwa/opensource/solr-hdfs-home/solr.xml
>
> 348 [main] INFO org.apache.solr.core.ConfigSolrXml – Config-defined
> core root directory:
>
> 358 [main] INFO org.apache.solr.core.CoreContainer – New CoreContainer
> 445620464
>
> 359 [main] INFO org.apache.solr.core.CoreContainer – Loading cores into
> CoreContainer [instanceDir=/Users/gpatwa/opensource/solr-hdfs-home/]
>
> 374 [main] INFO
> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
> socketTimeout to: 120000
>
> 375 [main] INFO
> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
> urlScheme to: http://
>
> 375 [main] INFO
> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
> connTimeout to: 15000
>
> 375 [main] INFO
> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
> maxConnectionsPerHost to: 20
>
> 375 [main] INFO
> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
> corePoolSize to: 0
>
> 376 [main] INFO
> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
> maximumPoolSize to: 2147483647
>
> 376 [main] INFO
> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
> maxThreadIdleTime to: 5
>
> 376 [main] INFO
> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
> sizeOfQueue to: -1
>
> 378 [main] INFO
> org.apache.solr.handler.component.HttpShardHandlerFactory – Setting
> fairnessPolicy to: false
>
> 645 [main] INFO org.apache.solr.logging.LogWatcher – SLF4J impl is
> org.slf4j.impl.Log4jLoggerFactory
>
> 646 [main] INFO org.apache.solr.logging.LogWatcher – Registering Log
> Listener [Log4j (org.slf4j.impl.Log4jLoggerFactory)]
>
> 647 [main] INFO org.apache.solr.core.ZkContainer – Zookeeper
> client=localhost:2181/search/catalog
>
> 653 [main] INFO org.apache.solr.cloud.ZkController – zkHost includes
> chroot
>
> 762 [main] INFO org.apache.solr.common.cloud.ConnectionManager –
> Waiting for client to connect to ZooKeeper
>
> 5781 [main-EventThread] INFO
> org.apache.solr.common.cloud.ConnectionManager – Watcher
> org.apache.solr.common.cloud.ConnectionManager@25630eb6name:ZooKeeperConnection Watcher:localhost:2181 got event WatchedEvent
> state:SyncConnected type:None path:null path:null type:None
>
> 5783 [main] INFO org.apache.solr.common.cloud.ConnectionManager – Client
> is connected to ZooKeeper
>
> 5792 [main] INFO org.apache.solr.common.cloud.ConnectionManager –
> Waiting for client to connect to ZooKeeper
>
> 5827 [main-EventThread] INFO
> org.apache.solr.common.cloud.ConnectionManager – Watcher
> org.apache.solr.common.cloud.ConnectionManager@52f5bad0name:ZooKeeperConnection Watcher:localhost:2181/search/catalog got event
> WatchedEvent state:SyncConnected type:None path:null path:null type:None
>
> 5827 [main] INFO org.apache.solr.common.cloud.ConnectionManager – Client
> is connected to ZooKeeper
>
> 5852 [main] INFO org.apache.solr.common.cloud.ZkStateReader – Updating
> cluster state from ZooKeeper...
>
> 6877 [main] INFO org.apache.solr.cloud.ZkController – Register node as
> live in ZooKeeper:/live_nodes/127.0.0.1:8983_solr
>
> 6880 [main] INFO org.apache.solr.common.cloud.SolrZkClient – makePath:
> /live_nodes/127.0.0.1:8983_solr
>
> 6885 [main-EventThread] INFO org.apache.solr.common.cloud.ZkStateReader
> – Updating live nodes... (1)
>
> 6892 [main] INFO org.apache.solr.common.cloud.SolrZkClient – makePath:
> /overseer_elect/leader
>
> 6896 [main] INFO org.apache.solr.cloud.Overseer – Overseer
> (id=90988393900081217-127.0.0.1:8983_solr-n_0000000017) starting
>
> 6913 [Overseer-90988393900081217-127.0.0.1:8983_solr-n_0000000017] INFO
> org.apache.solr.cloud.OverseerCollectionProcessor – Process current queue
> of collection creations
>
> 6917 [Thread-5] INFO org.apache.solr.cloud.Overseer – Starting to work
> on the main queue
>
> 6946 [main] INFO org.apache.solr.core.CoresLocator – Looking for core
> definitions underneath /Users/gpatwa/opensource/solr-hdfs-home
>
> 6979 [main] INFO org.apache.solr.core.CoresLocator – Found core
> event_shard1_replica1 in
> /Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1/
>
> 6980 [main] INFO org.apache.solr.core.CoresLocator – Found 1 core
> definitions
>
> 6981 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.cloud.ZkController – publishing core=event_shard1_replica1
> state=down
>
> 6984 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.cloud.ZkController – waiting to find shard id in
> clusterstate for event_shard1_replica1
>
> 6984 [coreLoadExecutor-4-thread-1] INFO
> org.apache.solr.core.CoreContainer – Creating SolrCore
> 'event_shard1_replica1' using instanceDir:
> /Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1
>
> 6984 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.cloud.ZkController – Check for collection zkNode:event
> 6985 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.cloud.ZkController – Collection zkNode exists
> 6986 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.cloud.ZkController – Load collection config from:/collections/event
> 6987 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrResourceLoader – new SolrResourceLoader for directory: '/Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1/'
> 7036 [coreLoadExecutor-4-thread-1] WARN org.apache.solr.core.Config – You should not use LUCENE_CURRENT as luceneMatchVersion property: if you use this setting, and then Solr upgrades to a newer release of Lucene, sizable changes may happen. If precise back compatibility is important then you should instead explicitly specify an actual Lucene version.
> 7172 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrConfig – Using Lucene MatchVersion: LUCENE_CURRENT
> 7283 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.Config – Loaded SolrConfig: solrconfig.xml
> 7292 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.schema.IndexSchema – Reading Solr Schema from schema.xml
> 7354 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.schema.IndexSchema – [event_shard1_replica1] Schema name=event-hdfs
> 7686 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.schema.IndexSchema – default search field in schema is searchKeywords_en_US
> 7688 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.schema.IndexSchema – query parser default operator is OR
> 7691 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.schema.IndexSchema – unique key field: id
> 7826 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore – solr.HdfsDirectoryFactory
> 7836 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.HdfsDirectoryFactory – Solr Kerberos Authentication disabled
> 7836 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore – [event_shard1_replica1] Opening new SolrCore at /Users/gpatwa/opensource/solr-hdfs-home/event_shard1_replica1/, dataDir=hdfs://10.249.132.29:8020/solr-hdfs/
> 7838 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.JmxMonitoredMap – No JMX servers found, not exposing Solr information with JMX.
> 7845 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.HdfsDirectoryFactory – creating directory factory for path hdfs://10.249.132.29:8020/solr-hdfs
> 7857 [coreLoadExecutor-4-thread-1] INFO org.apache.hadoop.metrics.jvm.JvmMetrics – Initializing JVM Metrics with processName=blockcache, sessionId=1388636531350
> 2014-01-01 20:22:11.488 java[62306:10b03] Unable to load realm info from SCDynamicStore
> 8001 [coreLoadExecutor-4-thread-1] WARN org.apache.hadoop.util.NativeCodeLoader – Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>
> 8426 [Thread-5] INFO org.apache.solr.common.cloud.ZkStateReader – Updating cloud state from ZooKeeper...
> 8428 [Thread-5] INFO org.apache.solr.cloud.Overseer – Update state numShards=1 message={
>   "operation":"state",
>   "state":"down",
>   "base_url":"http://127.0.0.1:8983/solr",
>   "core":"event_shard1_replica1",
>   "roles":null,
>   "node_name":"127.0.0.1:8983_solr",
>   "shard":"shard1",
>   "shard_range":null,
>   "shard_state":"active",
>   "shard_parent":null,
>   "collection":"event",
>   "numShards":"1",
>   "core_node_name":"core_node1"}
> 8450 [main-EventThread] INFO org.apache.solr.common.cloud.ZkStateReader – A cluster state change: WatchedEvent state:SyncConnected type:NodeDataChanged path:/clusterstate.json, has occurred - updating... (live nodes size: 1)
>
> 8513 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore – [event_shard1_replica1] CLOSING SolrCore org.apache.solr.core.SolrCore@5035135a
> 8513 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.update.SolrCoreState – Closing SolrCoreState
> 8513 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.update.DefaultSolrCoreState – SolrCoreState ref count has reached 0 - closing IndexWriter
> 8514 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.SolrCore – [event_shard1_replica1] Closing main searcher on request.
> 8514 [coreLoadExecutor-4-thread-1] INFO org.apache.solr.core.CachingDirectoryFactory – Closing HdfsDirectoryFactory - 0 directories currently being tracked
>
> 8516 [coreLoadExecutor-4-thread-1] ERROR org.apache.solr.core.CoreContainer – Unable to create core: event_shard1_replica1
> org.apache.solr.common.SolrException: Problem creating directory: hdfs://10.249.132.29:8020/solr-hdfs
> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:834)
> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:625)
> at org.apache.solr.core.ZkContainer.createFromZk(ZkContainer.java:256)
> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:590)
> at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:271)
> at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:263)
> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
> at java.lang.Thread.run(Thread.java:680)
> Caused by: java.lang.RuntimeException: Problem creating directory: hdfs://10.249.132.29:8020/solr-hdfs
> at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:68)
> at org.apache.solr.core.HdfsDirectoryFactory.create(HdfsDirectoryFactory.java:154)
> at org.apache.solr.core.CachingDirectoryFactory.get(CachingDirectoryFactory.java:350)
> at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:251)
> at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:465)
> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:755)
> ... 13 more
>
> Caused by: java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero).; Host Details : local host is: "LM-SFA-00713958/192.168.1.66"; destination host is: "10.249.132.29":8020;
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
> at org.apache.hadoop.ipc.Client.call(Client.java:1351)
> at org.apache.hadoop.ipc.Client.call(Client.java:1300)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
> at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
> at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:651)
> at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1679)
> at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1106)
> at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
> at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
> at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1397)
> at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:63)
> ... 18 more
>
> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero).
> at com.google.protobuf.InvalidProtocolBufferException.invalidTag(InvalidProtocolBufferException.java:89)
> at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:108)
> at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(RpcHeaderProtos.java:1398)
> at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.<init>(RpcHeaderProtos.java:1362)
> at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(RpcHeaderProtos.java:1492)
> at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto$1.parsePartialFrom(RpcHeaderProtos.java:1487)
> at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
> at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241)
> at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253)
> at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259)
> at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49)
> at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcHeaderProtos.java:2364)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:996)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
>
> 8519 [coreLoadExecutor-4-thread-1] ERROR org.apache.solr.core.CoreContainer – null:org.apache.solr.common.SolrException: Unable to create core: event_shard1_replica1
> at org.apache.solr.core.CoreContainer.recordAndThrow(CoreContainer.java:977)
> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:601)
> at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:271)
> at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:263)
> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
> at java.lang.Thread.run(Thread.java:680)
> Caused by: org.apache.solr.common.SolrException: Problem creating directory: hdfs://10.249.132.29:8020/solr-hdfs
> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:834)
> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:625)
> at org.apache.solr.core.ZkContainer.createFromZk(ZkContainer.java:256)
> at org.apache.solr.core.CoreContainer.create(CoreContainer.java:590)
> ... 10 more
> Caused by: java.lang.RuntimeException: Problem creating directory: hdfs://10.249.132.29:8020/solr-hdfs
> at org.apache.solr.store.hdfs.HdfsDirectory.<init>(HdfsDirectory.java:68)
> at org.apache.solr.core.HdfsDirectoryFactory.create(HdfsDirectoryFactory.java:154)
> at org.apache.solr.core.CachingDirectoryFactory.get(CachingDirectoryFactory.java:350)
> at org.apache.solr.core.SolrCore.getNewIndexDir(SolrCore.java:251)
> at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:465)
> at org.apache.solr.core.SolrCore.<init>(SolrCore.java:755)
> ... 13 more
>
>
>
> 8520 [main] INFO org.apache.solr.servlet.SolrDispatchFilter – user.dir=/Users/gpatwa/workspaces/workspace_pb_search_platform_fr/solr-hdfs
> 8521 [main] INFO org.apache.solr.servlet.SolrDispatchFilter – SolrDispatchFilter.init() done
> 2014-01-01 20:22:12.344:INFO:oejs.AbstractConnector:Started SelectChannelConnector@127.0.0.1:8983
> [INFO] Started Jetty Server
>
>
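If I read the root cause correctly, the failure is not a permissions problem at all: the client died inside CodedInputStream.readTag while decoding the RPC response header. A protobuf field tag is a varint encoding (field_number << 3) | wire_type, and a tag value of 0 is illegal, so a response that starts with a zero byte usually means whatever answered on 10.249.132.29:8020 is not speaking the protobuf RPC format my Hadoop client jars expect (e.g. a client/server Hadoop version mismatch, or the wrong port). A minimal sketch of that tag check (my own toy reader for illustration, not Hadoop's or protobuf's actual code):

```python
def read_varint(buf, pos=0):
    """Decode a base-128 varint (protobuf wire format) starting at pos."""
    result = shift = 0
    while True:
        b = buf[pos]
        result |= (b & 0x7F) << shift
        pos += 1
        if not b & 0x80:  # high bit clear -> last byte of the varint
            return result, pos
        shift += 7

def read_tag(buf, pos=0):
    """Read a field tag; the protobuf spec reserves tag value 0 as invalid."""
    tag, pos = read_varint(buf, pos)
    if tag == 0:  # the check that fired in the trace above
        raise ValueError("Protocol message contained an invalid tag (zero).")
    return tag >> 3, tag & 0x7, pos  # (field number, wire type, next offset)
```

So any response stream beginning with 0x00 trips exactly this exception, which is why I suspect a protocol/version mismatch between my client and the CDH NameNode rather than HDFS permissions.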
>