Posted to user@hadoop.apache.org by Håvard Wahl Kongsgård <ha...@gmail.com> on 2015/09/27 12:06:47 UTC

Error importing hbase table on new system

Hi, I am trying to import an old backup to a new, smaller system (just a
single node, to get the data out).

When I run

sudo -u hbase hbase -Dhbase.import.version=0.94
org.apache.hadoop.hbase.mapreduce.Import crawler
/crawler_hbase/crawler

I get this error in the tasks. Is this a permission problem?


2015-09-26 23:56:32,995 ERROR
org.apache.hadoop.security.UserGroupInformation:
PriviledgedActionException as:mapred (auth:SIMPLE)
cause:java.io.IOException: keyvalues=NONE read 4096 bytes, should read
14279
2015-09-26 23:56:32,996 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: keyvalues=NONE read 4096 bytes, should read 14279
at org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2221)
at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:74)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
2015-09-26 23:56:33,002 INFO org.apache.hadoop.mapred.Task: Runnning
cleanup for the task



-- 
Håvard Wahl Kongsgård
Data Scientist
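The dump being imported here is a Hadoop SequenceFile of serialized Results (that is what the Export job writes). A minimal sketch of a header check, assuming a part file has been copied locally and that the class names are shorter than 128 bytes so their vint length prefix fits in a single byte; the function name is illustrative, not from the thread:

```python
def read_sequencefile_header(path):
    """Return (version, key class, value class) from a SequenceFile header.

    A SequenceFile begins with the magic bytes b"SEQ", a one-byte
    version, then the key and value class names, each prefixed with a
    vint length (a single byte for names under 128 bytes).
    """
    with open(path, "rb") as f:
        if f.read(3) != b"SEQ":
            raise ValueError("not a SequenceFile")
        version = f.read(1)[0]

        def read_class_name():
            length = f.read(1)[0]  # single-byte vint; fine for short class names
            return f.read(length).decode("utf-8")

        return version, read_class_name(), read_class_name()
```

Run against one of the part files, this would report org.apache.hadoop.hbase.io.ImmutableBytesWritable and org.apache.hadoop.hbase.client.Result, matching the header pasted later in the thread; the stray "1", "%", and "*" characters in that paste are the single-byte length prefixes rendered as text.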

Re: Error importing hbase table on new system

Posted by Håvard Wahl Kongsgård <ha...@gmail.com>.
But isn't that tool for reading regions, not exports (dumps)?

-Håvard
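That distinction matters here: the HFile tool reads store files under region directories, while an Export dump is a SequenceFile whose header is partly plain text. A quick check is possible without any tool, sketched below with a synthetic stand-in for the first bytes of a part file (the file name is a typical Export output name, used only as an example):

```shell
# Synthetic stand-in for the first bytes of an Export part file:
# the SequenceFile magic "SEQ", a version byte, then the key class name
# with its single-byte length prefix ("1").
printf 'SEQ\x061org.apache.hadoop.hbase.io.ImmutableBytesWritable' > part-m-00000

# On a real dump this would be something like:
#   hadoop fs -cat /crawler_hbase/crawler/part-m-00000 | head -c 200 | strings -n 3
head -c 200 part-m-00000 | strings -n 3
```

If the output starts with SEQ followed by Writable class names, the file is an Export dump for the Import job; an HFile would not begin this way.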

On Sun, Sep 27, 2015 at 6:23 PM, Ted Yu <yu...@gmail.com> wrote:
> Have you used HFile tool ?
>
> The tool would print out that information as part of metadata.
>
> Cheers
>
> On Sun, Sep 27, 2015 at 9:19 AM, Håvard Wahl Kongsgård
> <ha...@gmail.com> wrote:
>>
>> Yes, I have tried to read them on another system as well. It worked
>> there. But I don't know if they are in HFile v1 or HFile v2 format (any
>> way to check?)
>>
>> These are the first lines from one of the files:
>>
>> SEQ
>> 1org.apache.hadoop.hbase.io.ImmutableBytesWritable%org.apache.hadoop.hbase.client.Result
>> *org.apache.hadoop.io.compress.DefaultCodec
>> [binary compressed record data omitted]
>>
>> -Håvard
>>
>> On Sun, Sep 27, 2015 at 4:06 PM, Ted Yu <yu...@gmail.com> wrote:
>> > Have you verified that the files to be imported are in HFilev2 format ?
>> >
>> > http://hbase.apache.org/book.html#_hfile_tool
>> >
>> > Cheers
>> >
>> > On Sun, Sep 27, 2015 at 4:47 AM, Håvard Wahl Kongsgård
>> > <ha...@gmail.com> wrote:
>> >>
>> >> >Is the single node system secure ?
>> >>
>> >> No, I have not activated security; just the defaults.
>> >>
>> >> The mapred conf:
>> >>
>> >> <?xml version="1.0"?>
>> >>
>> >> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>> >>
>> >>
>> >> <configuration>
>> >>
>> >>   <property>
>> >>
>> >>     <name>mapred.job.tracker</name>
>> >>
>> >>     <value>rack3:8021</value>
>> >>
>> >>   </property>
>> >>
>> >>
>> >>   <!-- Enable Hue plugins -->
>> >>
>> >>   <property>
>> >>
>> >>     <name>mapred.jobtracker.plugins</name>
>> >>
>> >>     <value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>
>> >>
>> >>     <description>Comma-separated list of jobtracker plug-ins to be
>> >> activated.
>> >>
>> >>     </description>
>> >>
>> >>   </property>
>> >>
>> >>   <property>
>> >>
>> >>     <name>jobtracker.thrift.address</name>
>> >>
>> >>     <value>0.0.0.0:9290</value>
>> >>
>> >>   </property>
>> >>
>> >> </configuration>
>> >>
>> >>
>> >> >>Have you checked hdfs healthiness ?
>> >>
>> >>
>> >> sudo -u hdfs hdfs dfsadmin -report
>> >>
>> >> Configured Capacity: 2876708585472 (2.62 TB)
>> >>
>> >> Present Capacity: 1991514849280 (1.81 TB)
>> >>
>> >> DFS Remaining: 1648230617088 (1.50 TB)
>> >>
>> >> DFS Used: 343284232192 (319.71 GB)
>> >>
>> >> DFS Used%: 17.24%
>> >>
>> >> Under replicated blocks: 52
>> >>
>> >> Blocks with corrupt replicas: 0
>> >>
>> >> Missing blocks: 0
>> >>
>> >>
>> >> -------------------------------------------------
>> >>
>> >> Datanodes available: 1 (1 total, 0 dead)
>> >>
>> >>
>> >> Live datanodes:
>> >>
>> >> Name: 127.0.0.1:50010 (localhost)
>> >>
>> >> Hostname: rack3
>> >>
>> >> Decommission Status : Normal
>> >>
>> >> Configured Capacity: 2876708585472 (2.62 TB)
>> >>
>> >> DFS Used: 343284232192 (319.71 GB)
>> >>
>> >> Non DFS Used: 885193736192 (824.40 GB)
>> >>
>> >> DFS Remaining: 1648230617088 (1.50 TB)
>> >>
>> >> DFS Used%: 11.93%
>> >>
>> >> DFS Remaining%: 57.30%
>> >>
>> >> Last contact: Sun Sep 27 13:44:45 CEST 2015
>> >>
>> >>
>> >> >>To which release of hbase were you importing ?
>> >>
>> >> HBase 0.94 (CDH 4)
>> >>
>> >> the new one is CDH 5.4
>> >>
>> >> On Sun, Sep 27, 2015 at 1:32 PM, Ted Yu <yu...@gmail.com> wrote:
>> >> > Is the single node system secure ?
>> >> > Have you checked hdfs healthiness ?
>> >> > To which release of hbase were you importing ?
>> >> >
>> >> > Thanks
>> >> >
>> >> >> On Sep 27, 2015, at 3:06 AM, Håvard Wahl Kongsgård
>> >> >> <ha...@gmail.com> wrote:
>> >> >>
>> >> >> Hi, Iam trying to import a old backup to a new smaller system (just
>> >> >> single node, to get the data out)
>> >> >>
>> >> >> when I use
>> >> >>
>> >> >> sudo -u hbase hbase -Dhbase.import.version=0.94
>> >> >> org.apache.hadoop.hbase.mapreduce.Import crawler
>> >> >> /crawler_hbase/crawler
>> >> >>
>> >> >> I get this error in the tasks . Is this a permission problem?
>> >> >>
>> >> >>
>> >> >> 2015-09-26 23:56:32,995 ERROR
>> >> >> org.apache.hadoop.security.UserGroupInformation:
>> >> >> PriviledgedActionException as:mapred (auth:SIMPLE)
>> >> >> cause:java.io.IOException: keyvalues=NONE read 4096 bytes, should
>> >> >> read
>> >> >> 14279
>> >> >> 2015-09-26 23:56:32,996 WARN org.apache.hadoop.mapred.Child: Error
>> >> >> running child
>> >> >> java.io.IOException: keyvalues=NONE read 4096 bytes, should read
>> >> >> 14279
>> >> >> at
>> >> >>
>> >> >> org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2221)
>> >> >> at
>> >> >>
>> >> >> org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:74)
>> >> >> at
>> >> >>
>> >> >> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
>> >> >> at
>> >> >>
>> >> >> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>> >> >> at
>> >> >>
>> >> >> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>> >> >> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>> >> >> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
>> >> >> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>> >> >> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>> >> >> at java.security.AccessController.doPrivileged(Native Method)
>> >> >> at javax.security.auth.Subject.doAs(Subject.java:415)
>> >> >> at
>> >> >>
>> >> >> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>> >> >> at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> >> >> 2015-09-26 23:56:33,002 INFO org.apache.hadoop.mapred.Task: Runnning
>> >> >> cleanup for the task
>> >> >>
>> >> >>
>> >> >>
>> >> >> --
>> >> >> Håvard Wahl Kongsgård
>> >> >> Data Scientist
>> >>
>> >>
>> >>
>> >> --
>> >> Håvard Wahl Kongsgård
>> >> Data Scientist
>> >
>> >
>>
>>
>>
>> --
>> Håvard Wahl Kongsgård
>> Data Scientist
>
>



-- 
Håvard Wahl Kongsgård
Data Scientist




Re: Error importing hbase table on new system

Posted by Ted Yu <yu...@gmail.com>.
Have you used HFile tool ?

The tool would print out that information as part of metadata.

Cheers

On Sun, Sep 27, 2015 at 9:19 AM, Håvard Wahl Kongsgård <
haavard.kongsgaard@gmail.com> wrote:

> Yes, I have tried to read them on another system as well. It worked
> there. But I don't know if they are HFilev1 or HFilev2 format(any way
> to check ?? )
>
> This is the first lines from one of the files
>
> SEQ
> 1org.apache.hadoop.hbase.io.ImmutableBytesWritable%org.apache.hadoop.hbase.client.Result
> *org.apache.hadoop.io.compress.DefaultCodec���N��
> $��a&t ��wb%!100000107712083-10152358443612846x��� P ]�6:� w |pw
>  ���   �$��K 0�   Npw�$x�@� ���s�;�Ɦ���V�^��ݽW� �� �
> �қ ��<� /��0/ � ?'/ �
>  ����� /��� �� 7{kG [(���  ���w�� OY^I ���}9 � �l��;�TJ�� �� �J� ‹  ����
>  pu���V�   ӡm�\E @ ��V6�oe45U ���,�3 ���Ͻ�w��O���zڼ�/��歇�KȦ/ ����?��
> Y;����� / ��� �� �� }� ��룫-�'_�k����� ��q� $ ��˨� � ���^
> ��� i��� tH$/��e.J��{S �\��S���� >G d���1~ p#��  o �� ��M
> �!٠��;c��I kQ
> �A)|d�i�(Z�f��o Pb �j {�  �x��� � `�����b���cbb`�"�}�������� �����
> HCG��&�����JG�%��',*!!��
> �� �  � ��& �����_Q��R�2�1��_��~>:� b  � �������w @�B    �  ~Y�H�(�h/FR
> ����_+��nX `#�
> |D��� �j��܏�� f ��ƨT��k/ 颚h ��4` +Q#�ⵕ�,Z�80�V:�
>  )Y)4Lq��[�   z#���T<b�  �-*.�����/��m{? �8|�] �6� �sk�t�L\UjT=
>
> -Håvard
>
> On Sun, Sep 27, 2015 at 4:06 PM, Ted Yu <yu...@gmail.com> wrote:
> > Have you verified that the files to be imported are in HFilev2 format ?
> >
> > http://hbase.apache.org/book.html#_hfile_tool
> >
> > Cheers
> >
> > On Sun, Sep 27, 2015 at 4:47 AM, Håvard Wahl Kongsgård
> > <ha...@gmail.com> wrote:
> >>
> >> >Is the single node system secure ?
> >>
> >> No have not activated, just defaults
> >>
> >> the mapred conf.
> >>
> >> <?xml version="1.0"?>
> >>
> >> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
> >>
> >>
> >> <configuration>
> >>
> >>   <property>
> >>
> >>     <name>mapred.job.tracker</name>
> >>
> >>     <value>rack3:8021</value>
> >>
> >>   </property>
> >>
> >>
> >>   <!-- Enable Hue plugins -->
> >>
> >>   <property>
> >>
> >>     <name>mapred.jobtracker.plugins</name>
> >>
> >>     <value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>
> >>
> >>     <description>Comma-separated list of jobtracker plug-ins to be
> >> activated.
> >>
> >>     </description>
> >>
> >>   </property>
> >>
> >>   <property>
> >>
> >>     <name>jobtracker.thrift.address</name>
> >>
> >>     <value>0.0.0.0:9290</value>
> >>
> >>   </property>
> >>
> >> </configuration>
> >>
> >>
> >> >>Have you checked hdfs healthiness ?
> >>
> >>
> >> sudo -u hdfs hdfs dfsadmin -report
> >>
> >> Configured Capacity: 2876708585472 (2.62 TB)
> >>
> >> Present Capacity: 1991514849280 (1.81 TB)
> >>
> >> DFS Remaining: 1648230617088 (1.50 TB)
> >>
> >> DFS Used: 343284232192 (319.71 GB)
> >>
> >> DFS Used%: 17.24%
> >>
> >> Under replicated blocks: 52
> >>
> >> Blocks with corrupt replicas: 0
> >>
> >> Missing blocks: 0
> >>
> >>
> >> -------------------------------------------------
> >>
> >> Datanodes available: 1 (1 total, 0 dead)
> >>
> >>
> >> Live datanodes:
> >>
> >> Name: 127.0.0.1:50010 (localhost)
> >>
> >> Hostname: rack3
> >>
> >> Decommission Status : Normal
> >>
> >> Configured Capacity: 2876708585472 (2.62 TB)
> >>
> >> DFS Used: 343284232192 (319.71 GB)
> >>
> >> Non DFS Used: 885193736192 (824.40 GB)
> >>
> >> DFS Remaining: 1648230617088 (1.50 TB)
> >>
> >> DFS Used%: 11.93%
> >>
> >> DFS Remaining%: 57.30%
> >>
> >> Last contact: Sun Sep 27 13:44:45 CEST 2015
> >>
> >>
> >> >>To which release of hbase were you importing ?
> >>
> >> HBase 0.94 (CDH 4)
> >>
> >> the new one is CDH 5.4
> >>
> >> On Sun, Sep 27, 2015 at 1:32 PM, Ted Yu <yu...@gmail.com> wrote:
> >> > Is the single node system secure ?
> >> > Have you checked hdfs healthiness ?
> >> > To which release of hbase were you importing ?
> >> >
> >> > Thanks
> >> >
> >> >> On Sep 27, 2015, at 3:06 AM, Håvard Wahl Kongsgård
> >> >> <ha...@gmail.com> wrote:
> >> >>
> >> >> Hi, I am trying to import an old backup to a new, smaller system (just
> >> >> single node, to get the data out)
> >> >>
> >> >> when I use
> >> >>
> >> >> sudo -u hbase hbase -Dhbase.import.version=0.94
> >> >> org.apache.hadoop.hbase.mapreduce.Import crawler
> >> >> /crawler_hbase/crawler
> >> >>
> >> >> I get this error in the tasks. Is this a permission problem?
> >> >>
> >> >>
> >> >> 2015-09-26 23:56:32,995 ERROR
> >> >> org.apache.hadoop.security.UserGroupInformation:
> >> >> PriviledgedActionException as:mapred (auth:SIMPLE)
> >> >> cause:java.io.IOException: keyvalues=NONE read 4096 bytes, should
> read
> >> >> 14279
> >> >> 2015-09-26 23:56:32,996 WARN org.apache.hadoop.mapred.Child: Error
> >> >> running child
> >> >> java.io.IOException: keyvalues=NONE read 4096 bytes, should read
> 14279
> >> >> at
> >> >>
> org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2221)
> >> >> at
> >> >>
> org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:74)
> >> >> at
> >> >>
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
> >> >> at
> >> >>
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >> >> at
> >> >>
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >> >> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >> >> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
> >> >> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
> >> >> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >> >> at java.security.AccessController.doPrivileged(Native Method)
> >> >> at javax.security.auth.Subject.doAs(Subject.java:415)
> >> >> at
> >> >>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> >> >> at org.apache.hadoop.mapred.Child.main(Child.java:262)
> >> >> 2015-09-26 23:56:33,002 INFO org.apache.hadoop.mapred.Task: Runnning
> >> >> cleanup for the task
> >> >>
> >> >>
> >> >>
> >> >> --
> >> >> Håvard Wahl Kongsgård
> >> >> Data Scientist
> >>
> >>
> >>
> >> --
> >> Håvard Wahl Kongsgård
> >> Data Scientist
> >
> >
>
>
>
> --
> Håvard Wahl Kongsgård
> Data Scientist
>




Re: Error importing hbase table on new system

Posted by Håvard Wahl Kongsgård <ha...@gmail.com>.
Yes, I have tried to read them on another system as well. It worked
there. But I don't know whether they are in HFile v1 or HFile v2 format
(is there any way to check?)

This is the first lines from one of the files

SEQ1org.apache.hadoop.hbase.io.ImmutableBytesWritable%org.apache.hadoop.hbase.client.Result*org.apache.hadoop.io.compress.DefaultCodec���N��
[remainder of the dump is binary compressed record data, omitted]
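The "SEQ" header in the paste above can be decoded by hand: after the three magic bytes and a one-byte version come the key and value class names, each length-prefixed (the stray "1", "%", and "*" characters are those length bytes: 0x31 = 49, 0x25 = 37, 0x2A = 42). A small sketch, assuming the version byte is 6 (the common value; it is non-printable and therefore invisible in the paste) and that each length fits in a single byte, which holds for class names this short:

```python
def parse_seq_header(data: bytes):
    """Parse the start of a Hadoop SequenceFile header.

    Returns (version, [key_class, value_class]). Assumes the class-name
    length vints fit in one byte, true for names under 113 characters.
    """
    if data[:3] != b"SEQ":
        raise ValueError("not a SequenceFile")
    version = data[3]
    pos = 4
    names = []
    for _ in range(2):  # key class, then value class
        length = data[pos]  # single-byte length prefix
        pos += 1
        names.append(data[pos:pos + length].decode("utf-8"))
        pos += length
    return version, names

# Reconstruct the header bytes visible in the pasted dump:
header = (b"SEQ\x06"
          b"\x31org.apache.hadoop.hbase.io.ImmutableBytesWritable"
          b"\x25org.apache.hadoop.hbase.client.Result")
print(parse_seq_header(header))
# → (6, ['org.apache.hadoop.hbase.io.ImmutableBytesWritable',
#        'org.apache.hadoop.hbase.client.Result'])
```

The key/value classes (ImmutableBytesWritable / Result) match what the Import job expects, which supports the SequenceFile itself being well-formed at the start.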

-Håvard

On Sun, Sep 27, 2015 at 4:06 PM, Ted Yu <yu...@gmail.com> wrote:
> Have you verified that the files to be imported are in HFilev2 format ?
>
> http://hbase.apache.org/book.html#_hfile_tool
>
> Cheers
>
> On Sun, Sep 27, 2015 at 4:47 AM, Håvard Wahl Kongsgård
> <ha...@gmail.com> wrote:
>>
>> >Is the single node system secure ?
>>
>> No, security has not been enabled; it is running with defaults.
>>
>> the mapred conf.
>>
>> <?xml version="1.0"?>
>>
>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>
>>
>> <configuration>
>>
>>   <property>
>>
>>     <name>mapred.job.tracker</name>
>>
>>     <value>rack3:8021</value>
>>
>>   </property>
>>
>>
>>   <!-- Enable Hue plugins -->
>>
>>   <property>
>>
>>     <name>mapred.jobtracker.plugins</name>
>>
>>     <value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>
>>
>>     <description>Comma-separated list of jobtracker plug-ins to be
>> activated.
>>
>>     </description>
>>
>>   </property>
>>
>>   <property>
>>
>>     <name>jobtracker.thrift.address</name>
>>
>>     <value>0.0.0.0:9290</value>
>>
>>   </property>
>>
>> </configuration>
>>
>>
>> >>Have you checked hdfs healthiness ?
>>
>>
>> sudo -u hdfs hdfs dfsadmin -report
>>
>> Configured Capacity: 2876708585472 (2.62 TB)
>>
>> Present Capacity: 1991514849280 (1.81 TB)
>>
>> DFS Remaining: 1648230617088 (1.50 TB)
>>
>> DFS Used: 343284232192 (319.71 GB)
>>
>> DFS Used%: 17.24%
>>
>> Under replicated blocks: 52
>>
>> Blocks with corrupt replicas: 0
>>
>> Missing blocks: 0
>>
>>
>> -------------------------------------------------
>>
>> Datanodes available: 1 (1 total, 0 dead)
>>
>>
>> Live datanodes:
>>
>> Name: 127.0.0.1:50010 (localhost)
>>
>> Hostname: rack3
>>
>> Decommission Status : Normal
>>
>> Configured Capacity: 2876708585472 (2.62 TB)
>>
>> DFS Used: 343284232192 (319.71 GB)
>>
>> Non DFS Used: 885193736192 (824.40 GB)
>>
>> DFS Remaining: 1648230617088 (1.50 TB)
>>
>> DFS Used%: 11.93%
>>
>> DFS Remaining%: 57.30%
>>
>> Last contact: Sun Sep 27 13:44:45 CEST 2015
>>
>>
>> >>To which release of hbase were you importing ?
>>
>> HBase 0.94 (CDH 4)
>>
>> the new one is CDH 5.4
>>
>> On Sun, Sep 27, 2015 at 1:32 PM, Ted Yu <yu...@gmail.com> wrote:
>> > Is the single node system secure ?
>> > Have you checked hdfs healthiness ?
>> > To which release of hbase were you importing ?
>> >
>> > Thanks
>> >
>> >> On Sep 27, 2015, at 3:06 AM, Håvard Wahl Kongsgård
>> >> <ha...@gmail.com> wrote:
>> >>
>> >> Hi, I am trying to import an old backup to a new, smaller system (just
>> >> single node, to get the data out)
>> >>
>> >> when I use
>> >>
>> >> sudo -u hbase hbase -Dhbase.import.version=0.94
>> >> org.apache.hadoop.hbase.mapreduce.Import crawler
>> >> /crawler_hbase/crawler
>> >>
>> >> I get this error in the tasks. Is this a permission problem?
>> >>
>> >>
>> >> 2015-09-26 23:56:32,995 ERROR
>> >> org.apache.hadoop.security.UserGroupInformation:
>> >> PriviledgedActionException as:mapred (auth:SIMPLE)
>> >> cause:java.io.IOException: keyvalues=NONE read 4096 bytes, should read
>> >> 14279
>> >> 2015-09-26 23:56:32,996 WARN org.apache.hadoop.mapred.Child: Error
>> >> running child
>> >> java.io.IOException: keyvalues=NONE read 4096 bytes, should read 14279
>> >> at
>> >> org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2221)
>> >> at
>> >> org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:74)
>> >> at
>> >> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
>> >> at
>> >> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>> >> at
>> >> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>> >> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>> >> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
>> >> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>> >> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>> >> at java.security.AccessController.doPrivileged(Native Method)
>> >> at javax.security.auth.Subject.doAs(Subject.java:415)
>> >> at
>> >> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>> >> at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> >> 2015-09-26 23:56:33,002 INFO org.apache.hadoop.mapred.Task: Runnning
>> >> cleanup for the task
>> >>
>> >>
>> >>
>> >> --
>> >> Håvard Wahl Kongsgård
>> >> Data Scientist
>>
>>
>>
>> --
>> Håvard Wahl Kongsgård
>> Data Scientist
>
>



-- 
Håvard Wahl Kongsgård
Data Scientist

Re: Error importing hbase table on new system

Posted by Håvard Wahl Kongsgård <ha...@gmail.com>.
Yes, I have tried to read them on another system as well. It worked
there. But I don't know if they are HFilev1 or HFilev2 format(any way
to check ?? )

This is the first lines from one of the files

SEQ1org.apache.hadoop.hbase.io.ImmutableBytesWritable%org.apache.hadoop.hbase.client.Result*org.apache.hadoop.io.compress.DefaultCodec���N��
$��a&t��wb%!100000107712083-10152358443612846x���P]�6:�w|pw
��� �$��K 0� Npw�$x�@����s�;�Ɦ���V�^��ݽW����
�қ��<�/��0/�?'/�
�����/�����7{kG[(������w��OY^I���}9��l��;�TJ�����J�‹����pu���V�ӡm�\E@��V6�oe45U���,�3���Ͻ�w��O���zڼ�/��歇�KȦ/����?��Y;�����/�������}���룫-�'_�k�������q�$��˨�����^
���i���tH$/��e.J��{S�\��S����>Gd���1~p#��o����M
�!٠��;c��IkQ
�A)|d�i�(Z�f��oPb�j{��x����`�����b���cbb`�"�}�������������HCG��&�����JG�%��',*!!��
������&�����_Q��R�2�1��_��~>:�b��������w@�B�~Y�H�(�h/FR����_+��nX`#�
|D����j��܏��f��ƨT��k/颚h��4`+Q#�ⵕ�,Z�80�V:�
)Y)4Lq��[�z#���T<b��-*.�����/��m{?�8|�]�6��sk�t�L\UjT=

-Håvard

On Sun, Sep 27, 2015 at 4:06 PM, Ted Yu <yu...@gmail.com> wrote:
> Have you verified that the files to be imported are in HFilev2 format ?
>
> http://hbase.apache.org/book.html#_hfile_tool
>
> Cheers
>
> On Sun, Sep 27, 2015 at 4:47 AM, Håvard Wahl Kongsgård
> <ha...@gmail.com> wrote:
>>
>> >Is the single node system secure ?
>>
>> No have not activated, just defaults
>>
>> the mapred conf.
>>
>> <?xml version="1.0"?>
>>
>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>
>>
>> <configuration>
>>
>>   <property>
>>
>>     <name>mapred.job.tracker</name>
>>
>>     <value>rack3:8021</value>
>>
>>   </property>
>>
>>
>>   <!-- Enable Hue plugins -->
>>
>>   <property>
>>
>>     <name>mapred.jobtracker.plugins</name>
>>
>>     <value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>
>>
>>     <description>Comma-separated list of jobtracker plug-ins to be
>> activated.
>>
>>     </description>
>>
>>   </property>
>>
>>   <property>
>>
>>     <name>jobtracker.thrift.address</name>
>>
>>     <value>0.0.0.0:9290</value>
>>
>>   </property>
>>
>> </configuration>
>>
>>
>> >>Have you checked hdfs healthiness ?
>>
>>
>> sudo -u hdfs hdfs dfsadmin -report
>>
>> Configured Capacity: 2876708585472 (2.62 TB)
>>
>> Present Capacity: 1991514849280 (1.81 TB)
>>
>> DFS Remaining: 1648230617088 (1.50 TB)
>>
>> DFS Used: 343284232192 (319.71 GB)
>>
>> DFS Used%: 17.24%
>>
>> Under replicated blocks: 52
>>
>> Blocks with corrupt replicas: 0
>>
>> Missing blocks: 0
>>
>>
>> -------------------------------------------------
>>
>> Datanodes available: 1 (1 total, 0 dead)
>>
>>
>> Live datanodes:
>>
>> Name: 127.0.0.1:50010 (localhost)
>>
>> Hostname: rack3
>>
>> Decommission Status : Normal
>>
>> Configured Capacity: 2876708585472 (2.62 TB)
>>
>> DFS Used: 343284232192 (319.71 GB)
>>
>> Non DFS Used: 885193736192 (824.40 GB)
>>
>> DFS Remaining: 1648230617088 (1.50 TB)
>>
>> DFS Used%: 11.93%
>>
>> DFS Remaining%: 57.30%
>>
>> Last contact: Sun Sep 27 13:44:45 CEST 2015
>>
>>
>> >>To which release of hbase were you importing ?
>>
>> Hbase 0.94 (CHD 4)
>>
>> the new one is CHD 5.4
>>
>> On Sun, Sep 27, 2015 at 1:32 PM, Ted Yu <yu...@gmail.com> wrote:
>> > Is the single node system secure ?
>> > Have you checked hdfs healthiness ?
>> > To which release of hbase were you importing ?
>> >
>> > Thanks
>> >
>> >> On Sep 27, 2015, at 3:06 AM, Håvard Wahl Kongsgård
>> >> <ha...@gmail.com> wrote:
>> >>
>> >> Hi, Iam trying to import a old backup to a new smaller system (just
>> >> single node, to get the data out)
>> >>
>> >> when I use
>> >>
>> >> sudo -u hbase hbase -Dhbase.import.version=0.94
>> >> org.apache.hadoop.hbase.mapreduce.Import crawler
>> >> /crawler_hbase/crawler
>> >>
>> >> I get this error in the tasks. Is this a permission problem?
>> >>
>> >>
>> >> 2015-09-26 23:56:32,995 ERROR
>> >> org.apache.hadoop.security.UserGroupInformation:
>> >> PriviledgedActionException as:mapred (auth:SIMPLE)
>> >> cause:java.io.IOException: keyvalues=NONE read 4096 bytes, should read
>> >> 14279
>> >> 2015-09-26 23:56:32,996 WARN org.apache.hadoop.mapred.Child: Error
>> >> running child
>> >> java.io.IOException: keyvalues=NONE read 4096 bytes, should read 14279
>> >> at
>> >> org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2221)
>> >> at
>> >> org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:74)
>> >> at
>> >> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
>> >> at
>> >> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>> >> at
>> >> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>> >> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>> >> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
>> >> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>> >> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>> >> at java.security.AccessController.doPrivileged(Native Method)
>> >> at javax.security.auth.Subject.doAs(Subject.java:415)
>> >> at
>> >> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>> >> at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> >> 2015-09-26 23:56:33,002 INFO org.apache.hadoop.mapred.Task: Runnning
>> >> cleanup for the task
>> >>
>> >>
>> >>
>> >> --
>> >> Håvard Wahl Kongsgård
>> >> Data Scientist
>>
>>
>>
>> --
>> Håvard Wahl Kongsgård
>> Data Scientist
>
>



-- 
Håvard Wahl Kongsgård
Data Scientist

Re: Error importing hbase table on new system

Posted by Håvard Wahl Kongsgård <ha...@gmail.com>.
Yes, I have tried to read them on another system as well, and it worked
there. But I don't know whether they are in HFilev1 or HFilev2 format
(is there any way to check?)

These are the first lines from one of the files:

SEQ1org.apache.hadoop.hbase.io.ImmutableBytesWritable%org.apache.hadoop.hbase.client.Result*org.apache.hadoop.io.compress.DefaultCodec���N��
$��a&t��wb%!100000107712083-10152358443612846x���P]�6:�w|pw
��� �$��K 0� Npw�$x�@����s�;�Ɦ���V�^��ݽW����
�қ��<�/��0/�?'/�
�����/�����7{kG[(������w��OY^I���}9��l��;�TJ�����J�‹����pu���V�ӡm�\E@��V6�oe45U���,�3���Ͻ�w��O���zڼ�/��歇�KȦ/����?��Y;�����/�������}���룫-�'_�k�������q�$��˨�����^
���i���tH$/��e.J��{S�\��S����>Gd���1~p#��o����M
�!٠��;c��IkQ
�A)|d�i�(Z�f��oPb�j{��x����`�����b���cbb`�"�}�������������HCG��&�����JG�%��',*!!��
������&�����_Q��R�2�1��_��~>:�b��������w@�B�~Y�H�(�h/FR����_+��nX`#�
|D����j��܏��f��ƨT��k/颚h��4`+Q#�ⵕ�,Z�80�V:�
)Y)4Lq��[�z#���T<b��-*.�����/��m{?�8|�]�6��sk�t�L\UjT=

-Håvard
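
[Editor's note] One way to answer the "any way to check?" question without cluster tooling is to look at the first few bytes of a file. A minimal sketch, assuming only the byte layout of the two formats; the function name and return strings are illustrative, not part of any HBase API:

```python
def classify_export_header(header: bytes) -> str:
    """Guess the container format of a backup file from its leading bytes.

    Files written by org.apache.hadoop.hbase.mapreduce.Export are Hadoop
    SequenceFiles: three magic bytes "SEQ", a version byte, then the
    length-prefixed key and value class names. HFiles keep their magic in
    a trailer at the END of the file, so they never start with "SEQ".
    """
    if header[:3] == b"SEQ":
        return "sequencefile"
    return "unknown (possibly an HFile; its magic lives in the trailer)"

# The hexdump above begins with
#   SEQ <version> <len> org.apache.hadoop.hbase.io.ImmutableBytesWritable ...
# i.e. the SequenceFile signature, so these files are Export output
# (serialized Result values), not HFilev1/HFilev2 store files.
```

If that holds, the `hbase ... Import` invocation in this thread is the right tool for these files; the HFile-version question only matters for store files meant for bulk loading.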

On Sun, Sep 27, 2015 at 4:06 PM, Ted Yu <yu...@gmail.com> wrote:
> Have you verified that the files to be imported are in HFilev2 format ?
>
> http://hbase.apache.org/book.html#_hfile_tool
>
> Cheers
>
> On Sun, Sep 27, 2015 at 4:47 AM, Håvard Wahl Kongsgård
> <ha...@gmail.com> wrote:
>>
>> >Is the single node system secure ?
>>
>> No, I have not activated security, just the defaults.
>>
>> The mapred conf:
>>
>> <?xml version="1.0"?>
>>
>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>>
>>
>> <configuration>
>>
>>   <property>
>>
>>     <name>mapred.job.tracker</name>
>>
>>     <value>rack3:8021</value>
>>
>>   </property>
>>
>>
>>   <!-- Enable Hue plugins -->
>>
>>   <property>
>>
>>     <name>mapred.jobtracker.plugins</name>
>>
>>     <value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>
>>
>>     <description>Comma-separated list of jobtracker plug-ins to be
>> activated.
>>
>>     </description>
>>
>>   </property>
>>
>>   <property>
>>
>>     <name>jobtracker.thrift.address</name>
>>
>>     <value>0.0.0.0:9290</value>
>>
>>   </property>
>>
>> </configuration>
>>
>>
>> >>Have you checked hdfs healthiness ?
>>
>>
>> sudo -u hdfs hdfs dfsadmin -report
>>
>> Configured Capacity: 2876708585472 (2.62 TB)
>>
>> Present Capacity: 1991514849280 (1.81 TB)
>>
>> DFS Remaining: 1648230617088 (1.50 TB)
>>
>> DFS Used: 343284232192 (319.71 GB)
>>
>> DFS Used%: 17.24%
>>
>> Under replicated blocks: 52
>>
>> Blocks with corrupt replicas: 0
>>
>> Missing blocks: 0
>>
>>
>> -------------------------------------------------
>>
>> Datanodes available: 1 (1 total, 0 dead)
>>
>>
>> Live datanodes:
>>
>> Name: 127.0.0.1:50010 (localhost)
>>
>> Hostname: rack3
>>
>> Decommission Status : Normal
>>
>> Configured Capacity: 2876708585472 (2.62 TB)
>>
>> DFS Used: 343284232192 (319.71 GB)
>>
>> Non DFS Used: 885193736192 (824.40 GB)
>>
>> DFS Remaining: 1648230617088 (1.50 TB)
>>
>> DFS Used%: 11.93%
>>
>> DFS Remaining%: 57.30%
>>
>> Last contact: Sun Sep 27 13:44:45 CEST 2015
>>
>>
>> >>To which release of hbase were you importing ?
>>
>> HBase 0.94 (CDH 4)
>>
>> the new one is CDH 5.4
>>
>> On Sun, Sep 27, 2015 at 1:32 PM, Ted Yu <yu...@gmail.com> wrote:
>> > Is the single node system secure ?
>> > Have you checked hdfs healthiness ?
>> > To which release of hbase were you importing ?
>> >
>> > Thanks
>> >
>> >> On Sep 27, 2015, at 3:06 AM, Håvard Wahl Kongsgård
>> >> <ha...@gmail.com> wrote:
>> >>
>> >> Hi, I am trying to import an old backup to a new, smaller system (just
>> >> a single node, to get the data out)
>> >>
>> >> when I use
>> >>
>> >> sudo -u hbase hbase -Dhbase.import.version=0.94
>> >> org.apache.hadoop.hbase.mapreduce.Import crawler
>> >> /crawler_hbase/crawler
>> >>
>> >> I get this error in the tasks. Is this a permission problem?
>> >>
>> >>
>> >> 2015-09-26 23:56:32,995 ERROR
>> >> org.apache.hadoop.security.UserGroupInformation:
>> >> PriviledgedActionException as:mapred (auth:SIMPLE)
>> >> cause:java.io.IOException: keyvalues=NONE read 4096 bytes, should read
>> >> 14279
>> >> 2015-09-26 23:56:32,996 WARN org.apache.hadoop.mapred.Child: Error
>> >> running child
>> >> java.io.IOException: keyvalues=NONE read 4096 bytes, should read 14279
>> >> at
>> >> org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2221)
>> >> at
>> >> org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:74)
>> >> at
>> >> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
>> >> at
>> >> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>> >> at
>> >> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>> >> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>> >> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
>> >> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>> >> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>> >> at java.security.AccessController.doPrivileged(Native Method)
>> >> at javax.security.auth.Subject.doAs(Subject.java:415)
>> >> at
>> >> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>> >> at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> >> 2015-09-26 23:56:33,002 INFO org.apache.hadoop.mapred.Task: Runnning
>> >> cleanup for the task
>> >>
>> >>
>> >>
>> >> --
>> >> Håvard Wahl Kongsgård
>> >> Data Scientist
>>
>>
>>
>> --
>> Håvard Wahl Kongsgård
>> Data Scientist
>
>



-- 
Håvard Wahl Kongsgård
Data Scientist

Re: Error importing hbase table on new system

Posted by Ted Yu <yu...@gmail.com>.
Have you verified that the files to be imported are in HFilev2 format ?

http://hbase.apache.org/book.html#_hfile_tool

Cheers

On Sun, Sep 27, 2015 at 4:47 AM, Håvard Wahl Kongsgård <
haavard.kongsgaard@gmail.com> wrote:

> >Is the single node system secure ?
>
> No, I have not activated security, just the defaults.
>
> The mapred conf:
>
> <?xml version="1.0"?>
>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>
>
> <configuration>
>
>   <property>
>
>     <name>mapred.job.tracker</name>
>
>     <value>rack3:8021</value>
>
>   </property>
>
>
>   <!-- Enable Hue plugins -->
>
>   <property>
>
>     <name>mapred.jobtracker.plugins</name>
>
>     <value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>
>
>     <description>Comma-separated list of jobtracker plug-ins to be
> activated.
>
>     </description>
>
>   </property>
>
>   <property>
>
>     <name>jobtracker.thrift.address</name>
>
>     <value>0.0.0.0:9290</value>
>
>   </property>
>
> </configuration>
>
>
> >>Have you checked hdfs healthiness ?
>
>
> sudo -u hdfs hdfs dfsadmin -report
>
> Configured Capacity: 2876708585472 (2.62 TB)
>
> Present Capacity: 1991514849280 (1.81 TB)
>
> DFS Remaining: 1648230617088 (1.50 TB)
>
> DFS Used: 343284232192 (319.71 GB)
>
> DFS Used%: 17.24%
>
> Under replicated blocks: 52
>
> Blocks with corrupt replicas: 0
>
> Missing blocks: 0
>
>
> -------------------------------------------------
>
> Datanodes available: 1 (1 total, 0 dead)
>
>
> Live datanodes:
>
> Name: 127.0.0.1:50010 (localhost)
>
> Hostname: rack3
>
> Decommission Status : Normal
>
> Configured Capacity: 2876708585472 (2.62 TB)
>
> DFS Used: 343284232192 (319.71 GB)
>
> Non DFS Used: 885193736192 (824.40 GB)
>
> DFS Remaining: 1648230617088 (1.50 TB)
>
> DFS Used%: 11.93%
>
> DFS Remaining%: 57.30%
>
> Last contact: Sun Sep 27 13:44:45 CEST 2015
>
>
> >>To which release of hbase were you importing ?
>
> HBase 0.94 (CDH 4)
>
> the new one is CDH 5.4
>
> On Sun, Sep 27, 2015 at 1:32 PM, Ted Yu <yu...@gmail.com> wrote:
> > Is the single node system secure ?
> > Have you checked hdfs healthiness ?
> > To which release of hbase were you importing ?
> >
> > Thanks
> >
> >> On Sep 27, 2015, at 3:06 AM, Håvard Wahl Kongsgård <
> haavard.kongsgaard@gmail.com> wrote:
> >>
> >> Hi, I am trying to import an old backup to a new, smaller system (just
> >> a single node, to get the data out)
> >>
> >> when I use
> >>
> >> sudo -u hbase hbase -Dhbase.import.version=0.94
> >> org.apache.hadoop.hbase.mapreduce.Import crawler
> >> /crawler_hbase/crawler
> >>
> >> I get this error in the tasks. Is this a permission problem?
> >>
> >>
> >> 2015-09-26 23:56:32,995 ERROR
> >> org.apache.hadoop.security.UserGroupInformation:
> >> PriviledgedActionException as:mapred (auth:SIMPLE)
> >> cause:java.io.IOException: keyvalues=NONE read 4096 bytes, should read
> >> 14279
> >> 2015-09-26 23:56:32,996 WARN org.apache.hadoop.mapred.Child: Error
> running child
> >> java.io.IOException: keyvalues=NONE read 4096 bytes, should read 14279
> >> at
> org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2221)
> >> at
> org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:74)
> >> at
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
> >> at
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >> at
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
> >> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
> >> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >> at java.security.AccessController.doPrivileged(Native Method)
> >> at javax.security.auth.Subject.doAs(Subject.java:415)
> >> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> >> at org.apache.hadoop.mapred.Child.main(Child.java:262)
> >> 2015-09-26 23:56:33,002 INFO org.apache.hadoop.mapred.Task: Runnning
> >> cleanup for the task
> >>
> >>
> >>
> >> --
> >> Håvard Wahl Kongsgård
> >> Data Scientist
>
>
>
> --
> Håvard Wahl Kongsgård
> Data Scientist
>

Re: Error importing hbase table on new system

Posted by Håvard Wahl Kongsgård <ha...@gmail.com>.
>Is the single node system secure ?

No, I have not activated security, just the defaults.

The mapred conf:

<?xml version="1.0"?>

<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>


<configuration>

  <property>

    <name>mapred.job.tracker</name>

    <value>rack3:8021</value>

  </property>


  <!-- Enable Hue plugins -->

  <property>

    <name>mapred.jobtracker.plugins</name>

    <value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>

    <description>Comma-separated list of jobtracker plug-ins to be activated.

    </description>

  </property>

  <property>

    <name>jobtracker.thrift.address</name>

    <value>0.0.0.0:9290</value>

  </property>

</configuration>


>>Have you checked hdfs healthiness ?


sudo -u hdfs hdfs dfsadmin -report

Configured Capacity: 2876708585472 (2.62 TB)
Present Capacity: 1991514849280 (1.81 TB)
DFS Remaining: 1648230617088 (1.50 TB)
DFS Used: 343284232192 (319.71 GB)
DFS Used%: 17.24%
Under replicated blocks: 52
Blocks with corrupt replicas: 0
Missing blocks: 0

-------------------------------------------------
Datanodes available: 1 (1 total, 0 dead)

Live datanodes:
Name: 127.0.0.1:50010 (localhost)
Hostname: rack3
Decommission Status : Normal
Configured Capacity: 2876708585472 (2.62 TB)
DFS Used: 343284232192 (319.71 GB)
Non DFS Used: 885193736192 (824.40 GB)
DFS Remaining: 1648230617088 (1.50 TB)
DFS Used%: 11.93%
DFS Remaining%: 57.30%
Last contact: Sun Sep 27 13:44:45 CEST 2015
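Since the dfsadmin summary looks healthy, it may also be worth checking the export files themselves for missing or truncated blocks. A sketch of what I would run next (the path is taken from the Import command in the original post; this needs a live cluster):

```shell
# Check the exported SequenceFiles for missing/corrupt blocks.
# /crawler_hbase/crawler is the input path from the Import command.
sudo -u hdfs hdfs fsck /crawler_hbase/crawler -files -blocks -locations

# Compare on-disk file sizes with the original cluster, if it is still
# available; a partially copied file would explain the size mismatch
# in the task error ("read 4096 bytes, should read 14279").
sudo -u hdfs hdfs dfs -ls -R /crawler_hbase/crawler
```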


>>To which release of hbase were you importing ?

The old system is HBase 0.94 (CDH 4); the new one is CDH 5.4.
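For what it's worth, "read 4096 bytes, should read 14279" means the SequenceFile reader received fewer bytes than the record header promised, which usually indicates a truncated or partially copied file rather than a permission problem. A small standalone sketch (plain Python, not Hadoop code) showing the same length-prefixed failure mode:

```python
import io
import struct

def read_record(f):
    """Read one length-prefixed record: a 4-byte big-endian length,
    then that many payload bytes. Raises IOError on truncation,
    mirroring the 'read N bytes, should read M' failure."""
    header = f.read(4)
    if not header:
        return None  # clean end of file
    (length,) = struct.unpack(">I", header)
    payload = f.read(length)
    if len(payload) != length:
        raise IOError("read %d bytes, should read %d" % (len(payload), length))
    return payload

# Simulate a truncated copy of an export file.
good = struct.pack(">I", 5) + b"hello"
truncated = good[:-2]  # last two payload bytes lost in transfer

assert read_record(io.BytesIO(good)) == b"hello"
try:
    read_record(io.BytesIO(truncated))
except IOError as e:
    print(e)  # read 3 bytes, should read 5
```

So the first thing I would rule out is whether the backup was copied over completely.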

On Sun, Sep 27, 2015 at 1:32 PM, Ted Yu <yu...@gmail.com> wrote:
> Is the single node system secure ?
> Have you checked hdfs healthiness ?
> To which release of hbase were you importing ?
>
> Thanks
>
>> On Sep 27, 2015, at 3:06 AM, Håvard Wahl Kongsgård <ha...@gmail.com> wrote:
>>
>> Hi, Iam trying to import a old backup to a new smaller system (just
>> single node, to get the data out)
>>
>> when I use
>>
>> sudo -u hbase hbase -Dhbase.import.version=0.94
>> org.apache.hadoop.hbase.mapreduce.Import crawler
>> /crawler_hbase/crawler
>>
>> I get this error in the tasks . Is this a permission problem?
>>
>>
>> 2015-09-26 23:56:32,995 ERROR
>> org.apache.hadoop.security.UserGroupInformation:
>> PriviledgedActionException as:mapred (auth:SIMPLE)
>> cause:java.io.IOException: keyvalues=NONE read 4096 bytes, should read
>> 14279
>> 2015-09-26 23:56:32,996 WARN org.apache.hadoop.mapred.Child: Error running child
>> java.io.IOException: keyvalues=NONE read 4096 bytes, should read 14279
>> at org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2221)
>> at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:74)
>> at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
>> at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>> at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:415)
>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>> at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> 2015-09-26 23:56:33,002 INFO org.apache.hadoop.mapred.Task: Runnning
>> cleanup for the task
>>
>>
>>
>> --
>> Håvard Wahl Kongsgård
>> Data Scientist



-- 
Håvard Wahl Kongsgård
Data Scientist

Re: Error importing hbase table on new system

Posted by Ted Yu <yu...@gmail.com>.
Is the single node system secure ?
Have you checked hdfs healthiness ?
To which release of hbase were you importing ?

Thanks

> On Sep 27, 2015, at 3:06 AM, Håvard Wahl Kongsgård <ha...@gmail.com> wrote:
> 
> Hi, Iam trying to import a old backup to a new smaller system (just
> single node, to get the data out)
> 
> when I use
> 
> sudo -u hbase hbase -Dhbase.import.version=0.94
> org.apache.hadoop.hbase.mapreduce.Import crawler
> /crawler_hbase/crawler
> 
> I get this error in the tasks . Is this a permission problem?
> 
> 
> 2015-09-26 23:56:32,995 ERROR
> org.apache.hadoop.security.UserGroupInformation:
> PriviledgedActionException as:mapred (auth:SIMPLE)
> cause:java.io.IOException: keyvalues=NONE read 4096 bytes, should read
> 14279
> 2015-09-26 23:56:32,996 WARN org.apache.hadoop.mapred.Child: Error running child
> java.io.IOException: keyvalues=NONE read 4096 bytes, should read 14279
> at org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:2221)
> at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:74)
> at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:483)
> at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> at org.apache.hadoop.mapred.Child.main(Child.java:262)
> 2015-09-26 23:56:33,002 INFO org.apache.hadoop.mapred.Task: Runnning
> cleanup for the task
> 
> 
> 
> -- 
> Håvard Wahl Kongsgård
> Data Scientist
