Posted to issues@phoenix.apache.org by "Narendra Kumar (JIRA)" <ji...@apache.org> on 2019/03/15 05:52:00 UTC

[jira] [Comment Edited] (PHOENIX-5146) Phoenix missing class definition: java.lang.NoClassDefFoundError: org/apache/phoenix/shaded/org/apache/http/Consts

    [ https://issues.apache.org/jira/browse/PHOENIX-5146?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16793328#comment-16793328 ] 

Narendra Kumar edited comment on PHOENIX-5146 at 3/15/19 5:51 AM:
------------------------------------------------------------------

Hi [~elserj], so I am basically modifying the Spark config and pointing it at the Phoenix jar.
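
Roughly along these lines (a sketch of the kind of change, not the exact test config; the property names are standard Spark ones, and the jar path is the one from this cluster):

{noformat}
# spark_defaults.conf sketch: put the Phoenix client jar on both the
# driver and executor classpaths
spark.driver.extraClassPath   /usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-client.jar
spark.executor.extraClassPath /usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-client.jar
{noformat}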

But after doing some digging and comparing with older runs against older versions of the stack components (where the test ran fine), it seems that the class has been moved from one jar to another in Phoenix.

The test I run loads "/usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-client.jar", which no longer contains the required class (it used to work fine until October 2018, with older versions of the stack components). The class is now found in "/usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-thin-client.jar". (*So I do need to fix the test code, for sure, to load all client jars.*)
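
Which jar bundles the shaded class is easy to confirm with standard tools (a quick check using the jar paths above; the grep pattern is just the class's relocated path):

{noformat}
# check which of the two jars actually bundles the shaded class
unzip -l /usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-client.jar | grep 'org/apache/phoenix/shaded/org/apache/http/Consts'
unzip -l /usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-thin-client.jar | grep 'org/apache/phoenix/shaded/org/apache/http/Consts'
{noformat}

And the test-side fix would be something like passing both client jars to spark-shell (a sketch, not the actual test code; the flags match the invocation in the log below):

{noformat}
# load both client jars instead of only the fat client jar
spark-shell --master yarn \
  --jars /usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-client.jar,/usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-thin-client.jar
{noformat}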

 

*But is this change intentional, or just a build issue?*



> Phoenix missing class definition: java.lang.NoClassDefFoundError: org/apache/phoenix/shaded/org/apache/http/Consts
> ------------------------------------------------------------------------------------------------------------------
>
>                 Key: PHOENIX-5146
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-5146
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 5.0.0
>         Environment: 3 node kerberised cluster.
> Hbase 2.0.2
>            Reporter: Narendra Kumar
>            Priority: Major
>              Labels: NoClassDefFoundError
>
> While running a SparkCompatibility check for Phoenix, we hit this issue:
> {noformat}
> 2019-02-15 09:03:38,470|INFO|MainThread|machine.py:169 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|RUNNING: echo "
>  import org.apache.spark.graphx._;
>  import org.apache.phoenix.spark._;
>  val rdd = sc.phoenixTableAsRDD(\"EMAIL_ENRON\", Seq(\"MAIL_FROM\", \"MAIL_TO\"), zkUrl=Some(\"huaycloud012.l42scl.hortonworks.com:2181:/hbase-secure\"));
>  val rawEdges = rdd.map
> { e => (e(\"MAIL_FROM\").asInstanceOf[VertexId], e(\"MAIL_TO\").asInstanceOf[VertexId])}
> ;
>  val graph = Graph.fromEdgeTuples(rawEdges, 1.0);
>  val pr = graph.pageRank(0.001);
>  pr.vertices.saveToPhoenix(\"EMAIL_ENRON_PAGERANK\", Seq(\"ID\", \"RANK\"), zkUrl = Some(\"huaycloud012.l42scl.hortonworks.com:2181:/hbase-secure\"));
>  " | spark-shell --master yarn --jars /usr/hdp/current/hadoop-client/lib/hadoop-lzo-0.6.0.3.1.0.0-75.jar --properties-file /grid/0/log/cluster/run_phoenix_secure_ha_all_1/artifacts/spark_defaults.conf 2>&1 | tee /grid/0/log/cluster/run_phoenix_secure_ha_all_1/artifacts/Spark_clientLogs/phoenix-spark.txt
>  2019-02-15 09:03:38,488|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SPARK_MAJOR_VERSION is set to 2, using Spark2
>  2019-02-15 09:03:39,901|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SLF4J: Class path contains multiple SLF4J bindings.
>  2019-02-15 09:03:39,902|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-75/phoenix/phoenix-5.0.0.3.1.0.0-75-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>  2019-02-15 09:03:39,902|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-75/spark2/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>  2019-02-15 09:03:39,902|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|SLF4J: See [http://www.slf4j.org/codes.html#multiple_bindings] for an explanation.
>  2019-02-15 09:03:41,400|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|Setting default log level to "WARN".
>  2019-02-15 09:03:41,400|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
>  2019-02-15 09:03:54,837|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|java.lang.NoClassDefFoundError: org/apache/phoenix/shaded/org/apache/http/Consts
>  2019-02-15 09:03:54,838|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.phoenix.shaded.org.apache.http.client.utils.URIBuilder.digestURI(URIBuilder.java:181)
>  2019-02-15 09:03:54,839|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.phoenix.shaded.org.apache.http.client.utils.URIBuilder.<init>(URIBuilder.java:82)
>  2019-02-15 09:03:54,839|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createURL(KMSClientProvider.java:468)
>  2019-02-15 09:03:54,839|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.hadoop.crypto.key.kms.KMSClientProvider.getDelegationToken(KMSClientProvider.java:1023)
>  2019-02-15 09:03:54,840|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$1.call(LoadBalancingKMSClientProvider.java:252)
>  2019-02-15 09:03:54,840|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$1.call(LoadBalancingKMSClientProvider.java:249)
>  2019-02-15 09:03:54,840|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.doOp(LoadBalancingKMSClientProvider.java:172)
>  2019-02-15 09:03:54,841|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.getDelegationToken(LoadBalancingKMSClientProvider.java:249)
>  2019-02-15 09:03:54,841|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.hadoop.security.token.DelegationTokenIssuer.collectDelegationTokens(DelegationTokenIssuer.java:95)
>  2019-02-15 09:03:54,841|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.hadoop.security.token.DelegationTokenIssuer.collectDelegationTokens(DelegationTokenIssuer.java:107)
>  2019-02-15 09:03:54,842|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.hadoop.security.token.DelegationTokenIssuer.addDelegationTokens(DelegationTokenIssuer.java:76)
>  2019-02-15 09:03:54,842|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider$$anonfun$org$apache$spark$deploy$security$HadoopFSDelegationTokenProvider$$fetchDelegationTokens$1.apply(HadoopFSDelegationTokenProvider.scala:98)
>  2019-02-15 09:03:54,842|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider$$anonfun$org$apache$spark$deploy$security$HadoopFSDelegationTokenProvider$$fetchDelegationTokens$1.apply(HadoopFSDelegationTokenProvider.scala:96)
>  2019-02-15 09:03:54,843|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at scala.collection.immutable.Set$Set1.foreach(Set.scala:94)
>  2019-02-15 09:03:54,843|INFO|MainThread|machine.py:184 - run()||GUID=1566a829-b1df-4757-8c3d-73a7fa302b84|at
>  {noformat}


