Posted to user@hbase.apache.org by Mich Talebzadeh <mi...@gmail.com> on 2018/08/19 18:17:25 UTC
ImportTsv fails, expecting hadoop-mapreduce-client-core-2.5.1.jar in HDFS!
I am trying to import data into an HBase table from a CSV file.
The HBase version is 1.2.6.
This used to work in an older version of HBase:
$HBASE_HOME/bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.separator=',' \
  -Dimporttsv.columns="HBASE_ROW_KEY,stock_daily:stock,stock_daily:ticker,stock_daily:Date,stock_daily:open,stock_daily:high,stock_daily:low,stock_daily:close,stock_daily:volume" \
  -Dimporttsv.skip.bad.lines=true tsco hdfs://rhes75:9000/data/stocks/tsco.csv
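For context, a row of tsco.csv is expected to carry nine comma-separated fields matching the -Dimporttsv.columns mapping, with the first field becoming the row key. A hypothetical sample (the values are invented for illustration):

```shell
# Hypothetical sample row for tsco.csv: the row key, then the eight
# stock_daily columns named in -Dimporttsv.columns above.
line='TSCO-2018-08-17,Tesco PLC,TSCO,17-Aug-18,221.5,222.3,219.1,221.9,5432100'
# Count the comma-separated fields to confirm the line matches the mapping.
echo "$line" | awk -F',' '{print NF}'
```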
But now it throws the following error: "File does not exist:
hdfs://rhes75:9000/data6/hduser/hbase-1.2.6/lib/hadoop-mapreduce-client-core-2.5.1.jar"
2018-08-19 19:11:09,445 INFO [main] Configuration.deprecation: io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2018-08-19 19:11:09,557 INFO [main] mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-hduser/mapred/staging/hduser1440777758/.staging/job_local1440777758_0001
Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://rhes75:9000/data6/hduser/hbase-1.2.6/lib/hadoop-mapreduce-client-core-2.5.1.jar
I am at a loss as to why it is looking in an HDFS directory for this jar
file. The jar exists locally in $HBASE_HOME/lib:
ls $HBASE_HOME/lib/hadoop-mapreduce-client-core-2.5.1.jar
/data6/hduser/hbase-1.2.6/lib/hadoop-mapreduce-client-core-2.5.1.jar
This has also been reported here, with a somewhat unconventional solution:
https://stackoverflow.com/questions/50966661/running-a-mapreduce-job-fails-file-does-not-exist?rq=1
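For what it is worth, the workaround in that thread (which I have not verified) amounts to mirroring the missing jar into HDFS at the exact path the job submitter resolves; putting the HBase MapReduce classpath on HADOOP_CLASSPATH before submitting is another commonly suggested approach. A sketch of both, assuming the paths from the error message above:

```shell
# Unverified workaround from the Stack Overflow thread: copy the local jar
# into HDFS at the path the submitter resolves against fs.defaultFS.
hdfs dfs -mkdir -p /data6/hduser/hbase-1.2.6/lib
hdfs dfs -put $HBASE_HOME/lib/hadoop-mapreduce-client-core-2.5.1.jar \
    /data6/hduser/hbase-1.2.6/lib/

# Less ad hoc alternative: put the HBase MapReduce dependencies on the
# Hadoop classpath before submitting ("hbase mapredcp" prints that list).
export HADOOP_CLASSPATH=$($HBASE_HOME/bin/hbase mapredcp)
```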
Thanks
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com
Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.