Posted to user@phoenix.apache.org by cmbendre <ch...@zeotap.com> on 2017/06/01 07:59:51 UTC
Class org.apache.phoenix.mapreduce.bulkload.TableRowkeyPair not found
I am trying to bulk load a CSV file with Phoenix 4.9.0 on EMR.
This is the command:
export HADOOP_CLASSPATH=$(hbase mapredcp):/usr/lib/hbase/conf

hadoop jar /usr/lib/phoenix/phoenix-client.jar \
    org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    -Dfs.permissions.umask-mode=000 \
    --table PROFILESTORE --input /user/merged3.csv
But it throws the following error:

Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.bulkload.TableRowkeyPair not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2227)
	at org.apache.hadoop.mapred.JobConf.getMapOutputKeyClass(JobConf.java:813)
	at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapOutputKeyClass(JobContextImpl.java:142)
	at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:832)
	at org.apache.phoenix.mapreduce.MultiHfileOutputFormat.configureIncrementalLoad(MultiHfileOutputFormat.java:698)
	at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.submitJob(AbstractBulkLoadTool.java:301)
	at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:270)
	at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:183)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
	at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:101)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.bulkload.TableRowkeyPair not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2219)
	... 16 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.phoenix.mapreduce.bulkload.TableRowkeyPair not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
	... 17 more
I then unset HADOOP_CLASSPATH, as suggested in this JIRA:
https://issues.apache.org/jira/browse/PHOENIX-3835
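That is, the same command as above with no HADOOP_CLASSPATH in the environment:

```shell
# Same job as before, but without any HADOOP_CLASSPATH exported.
unset HADOOP_CLASSPATH

hadoop jar /usr/lib/phoenix/phoenix-client.jar \
    org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    -Dfs.permissions.umask-mode=000 \
    --table PROFILESTORE --input /user/merged3.csv
```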
That run fails with a different error:
Error: java.lang.RuntimeException: java.sql.SQLException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
	at org.apache.phoenix.mapreduce.FormatToBytesWritableMapper.setup(FormatToBytesWritableMapper.java:142)
	at org.apache.phoenix.mapreduce.CsvToKeyValueMapper.setup(CsvToKeyValueMapper.java:67)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:796)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.sql.SQLException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
	at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2432)
	at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2352)
	at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
	at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2352)
	at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:232)
	at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:147)
	at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:208)
	at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:337)
	at org.apache.phoenix.util.QueryUtil.getConnectionOnServer(QueryUtil.java:324)
	at org.apache.phoenix.mapreduce.FormatToBytesWritableMapper.setup(FormatToBytesWritableMapper.java:130)
	... 9 more
Why is this happening, and what is the fix?
--
View this message in context: http://apache-phoenix-user-list.1124778.n5.nabble.com/Class-org-apache-phoenix-mapreduce-bulkload-TableRowkeyPair-not-found-tp3620.html
Sent from the Apache Phoenix User List mailing list archive at Nabble.com.
Re: Class org.apache.phoenix.mapreduce.bulkload.TableRowkeyPair not found
Posted by Sergey Soldatov <se...@gmail.com>.
You may try removing the $(hbase mapredcp) entries and keeping only
/etc/hbase/conf in HADOOP_CLASSPATH.
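In other words, something like this (the conf-dir path and the rest of the command are taken from the original post; adjust paths for your cluster):

```shell
# Keep only the HBase conf dir on the classpath; drop $(hbase mapredcp)
# so the Phoenix client jar's own copies of the MapReduce classes are used.
export HADOOP_CLASSPATH=/etc/hbase/conf

hadoop jar /usr/lib/phoenix/phoenix-client.jar \
    org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    -Dfs.permissions.umask-mode=000 \
    --table PROFILESTORE --input /user/merged3.csv
```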
Thanks,
Sergey
On Thu, Jun 1, 2017 at 12:59 AM, cmbendre <ch...@zeotap.com> wrote: