Posted to solr-user@lucene.apache.org by Rishabh Patel <ri...@gmail.com> on 2016/10/03 18:38:03 UTC

CheckHdfsIndex with Kerberos not working

Hello,

My SolrCloud 5.5 installation has Kerberos enabled. The CheckHdfsIndex test
fails to run. However, without Kerberos, I am able to run the test with no
issues.

I ran the following command:

java -cp
"./server/solr-webapp/webapp/WEB-INF/lib/*:./server/lib/ext/*:/hadoop/hadoop-client/lib/servlet-api-2.5.jar"
-ea:org.apache.lucene... org.apache.solr.index.hdfs.CheckHdfsIndex
hdfs://<NODE_NAME>:8020/apps/solr/data/ExampleCollection/core_node1/data/index

The error is:

ERROR: could not open hdfs directory
"hdfs://<NODE_NAME>:8020/apps/solr/data/ExampleCollection/core_node1/data/index";
exiting org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]

Does this error message imply that the test cannot run with Kerberos
enabled?

For reference, I followed this blog post:
http://yonik.com/solr-5-5/

-- 
Regards,
*Rishabh Patel*

Re: CheckHdfsIndex with Kerberos not working

Posted by Rishabh Patel <ri...@gmail.com>.
Thanks Kevin, this worked for me.

On Mon, Oct 3, 2016 at 11:48 AM, Kevin Risden <co...@gmail.com>
wrote:

> You need to have the Hadoop pieces on the classpath, like core-site.xml and
> hdfs-site.xml. There is an hdfs classpath command that can help, but its
> output may include too many pieces; you may need only core-site.xml and
> hdfs-site.xml so you don't get conflicting jars.
>
> Something like this may work for you:
>
> java -cp
> "$(hdfs classpath):./server/solr-webapp/webapp/WEB-INF/lib/*:./server/lib/ext/*:/hadoop/hadoop-client/lib/servlet-api-2.5.jar"
> -ea:org.apache.lucene... org.apache.solr.index.hdfs.CheckHdfsIndex
> hdfs://<NODE_NAME>:8020/apps/solr/data/ExampleCollection/core_node1/data/index
>
> Kevin Risden
>



-- 
Sincerely,
*Rishabh Patel*

Re: CheckHdfsIndex with Kerberos not working

Posted by Kevin Risden <co...@gmail.com>.
You need to have the Hadoop pieces on the classpath, like core-site.xml and
hdfs-site.xml. There is an hdfs classpath command that can help, but its
output may include too many pieces; you may need only core-site.xml and
hdfs-site.xml so you don't get conflicting jars.
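
As a sketch of that narrower classpath (not from the thread; /etc/hadoop/conf is an assumed location, adjust for your distribution), you could splice in just the Hadoop configuration directory instead of the full hdfs classpath output:

```shell
# Sketch, assuming the Hadoop conf directory (which holds core-site.xml and
# hdfs-site.xml) lives at /etc/hadoop/conf. Putting only the conf directory
# ahead of the Solr jars avoids pulling in conflicting Hadoop jars.
HADOOP_CONF_DIR="/etc/hadoop/conf"
SOLR_LIBS="./server/solr-webapp/webapp/WEB-INF/lib/*:./server/lib/ext/*"
CP="$HADOOP_CONF_DIR:$SOLR_LIBS"
echo "$CP"
```

On a Kerberized cluster you also need a valid Kerberos ticket (e.g. obtained via kinit) in the environment before running the check, since the HDFS client authenticates with it.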

Something like this may work for you:

java -cp
"$(hdfs classpath):./server/solr-webapp/webapp/WEB-INF/lib/*:./server/lib/ext/*:/hadoop/hadoop-client/lib/servlet-api-2.5.jar"
-ea:org.apache.lucene... org.apache.solr.index.hdfs.CheckHdfsIndex
hdfs://<NODE_NAME>:8020/apps/solr/data/ExampleCollection/core_node1/data/index

Kevin Risden
