Posted to user@kylin.apache.org by Roberto Tardío <ro...@stratebi.com> on 2017/10/16 09:19:49 UTC

Persistent error after Kafka streaming cube source stopped

Hi,

I'm doing a PoC to build a cube in Kylin from streaming data in 
Kafka. The Kafka connection and cube creation work correctly. I 
tested using the Kafka test generator included with Kylin, and I 
scheduled the process with crontab to run every 10 minutes over a 
period of 24 hours. However, when I stopped the Kafka generator and 
the crontab schedule, I got errors in Kylin:

  * The Monitor page shows an error when it tries to load jobs. This
    affects not only the Kafka test project, but also the other batch
    projects.
      o The Kylin log shows this:
          + ERROR [http-bio-7070-exec-4] controller.BasicController:57 :
            org.apache.kylin.rest.exception.InternalErrorException:
            java.lang.RuntimeException:
            org.apache.kylin.job.exception.PersistentException:
            com.fasterxml.jackson.databind.JsonMappingException: No
            content to map due to end-of-input
              at [Source: java.io.DataInputStream@189348ee; line: 1,
            column: 1]
  * I cannot purge or drop the Kafka test streaming cube.
      o The Kylin UI shows the following error:
          + Failed to delete cube. Caused by:
            org.apache.kylin.job.exception.hdfs.BlockMissingException:
            Could not obtain block: BP-699932432.....
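
For reference, the 10-minute schedule described above corresponds to a 
crontab entry along these lines. This is only a sketch: the install 
path, topic, and broker address are assumptions, not taken from this 
thread; the generator class is the one shipped with Kylin's streaming 
tutorial.

```shell
# Hypothetical crontab entry: run Kylin's Kafka sample data generator
# every 10 minutes. Install path, topic, and broker are assumed values.
*/10 * * * * cd /opt/apache-kylin && ./bin/kylin.sh org.apache.kylin.source.kafka.util.KafkaSampleProducer --topic kylin_streaming_topic --broker localhost:9092 >> /tmp/kafka_gen.log 2>&1
```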

I have tried Kylin 1.6 and Kylin 2.1 and received the same error with 
both. Kylin is still running, but I have to solve these errors before 
I can use it normally (e.g. to build new cubes).

Can anyone help me?

Thanks in advance,

*Roberto Tardío Olmos*

/Senior Big Data & Business Intelligence Consultant/
Avenida de Brasil, 17, Planta 16.28020 Madrid
Fijo: 91.788.34.10

Re: Persistent error after Kafka streaming cube source stopped

Posted by Roberto Tardío <ro...@stratebi.com>.
Yes, of course. The log is attached to this email.

It is the full log, but you can search for a line like this:

2017-10-16 18:44:52,527 WARN  [Thread-12] hdfs.DFSClient:1002 : Could 
not obtain block: BP-69932432-127.0.1.1-1475238054044:blk_1073816582_75797
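
To pull those lines out of the full log, a simple grep works. The 
sketch below runs against a sample line written to /tmp so it is 
self-contained; on a real install the log is typically 
$KYLIN_HOME/logs/kylin.log.

```shell
# Write one sample log line (stand-in for the attached kylin.log),
# then count occurrences of the block-missing warning.
printf '2017-10-16 18:44:52,527 WARN  [Thread-12] hdfs.DFSClient:1002 : Could not obtain block: BP-69932432-...\n' > /tmp/kylin_sample.log
grep -c 'Could not obtain block' /tmp/kylin_sample.log
```

This prints 1, the number of matching lines.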

Thanks!


El 17/10/2017 a las 11:19, Billy Liu escribió:
> Could you provide more logs about "Kylin can not start"?
>
> 2017-10-17 16:50 GMT+08:00 Roberto Tardío <roberto.tardio@stratebi.com 
> <ma...@stratebi.com>>:
>
>     Thanks Billy and ShaoFeng,
>
>     I have checked that the error was due to an issue with the HDFS
>     data nodes, so the last streaming cube build could not write
>     some data to HDFS/HBase. However, although I have solved the
>     HDFS issue, Kylin cannot start, maybe due to the corrupted data
>     from the streaming cube. What do you recommend for cases like
>     this one? What is the best way to delete a cube and model with
>     corrupted data?
>
>     Best Regards,
>

-- 

*Roberto Tardío Olmos*

/Senior Big Data & Business Intelligence Consultant/
Avenida de Brasil, 17, Planta 16.28020 Madrid
Fijo: 91.788.34.10

Re: Persistent error after Kafka streaming cube source stopped

Posted by Billy Liu <bi...@apache.org>.
Could you provide more logs about "Kylin can not start"?

2017-10-17 16:50 GMT+08:00 Roberto Tardío <ro...@stratebi.com>:

> Thanks Billy and ShaoFeng,
>
> I have checked that the error was due to an issue with the HDFS data
> nodes, so the last streaming cube build could not write some data to
> HDFS/HBase. However, although I have solved the HDFS issue, Kylin
> cannot start, maybe due to the corrupted data from the streaming
> cube. What do you recommend for cases like this one? What is the
> best way to delete a cube and model with corrupted data?
>
> Best Regards,
>

Re: Persistent error after Kafka streaming cube source stopped

Posted by Roberto Tardío <ro...@stratebi.com>.
Thanks Billy and ShaoFeng,

I have checked that the error was due to an issue with the HDFS data 
nodes, so the last streaming cube build could not write some data to 
HDFS/HBase. However, although I have solved the HDFS issue, Kylin 
cannot start, maybe due to the corrupted data from the streaming 
cube. What do you recommend for cases like this one? What is the best 
way to delete a cube and model with corrupted data?

Best Regards,
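
(For reference on the deletion question above: Kylin ships a metastore 
tool that can back up the metadata and remove a broken cube's entries 
directly. A minimal sketch, assuming a standard $KYLIN_HOME layout and 
a hypothetical cube name; check the actual resource paths with 
"metastore.sh list" before removing anything.)

```shell
# Back up all Kylin metadata first, then remove the broken cube's
# resources ("my_streaming_cube" is a placeholder name).
$KYLIN_HOME/bin/metastore.sh backup
$KYLIN_HOME/bin/metastore.sh remove /cube/my_streaming_cube.json
$KYLIN_HOME/bin/metastore.sh remove /cube_desc/my_streaming_cube.json
```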


El 16/10/2017 a las 15:18, ShaoFeng Shi escribió:
> From this message, it is probably caused by unstable Hadoop env:
>
> org.apache.kylin.job.exception.hdfs.BlockMissingException: Could not 
> obtain block: BP-699932432.....
>
> Need full log to detect the root cause.
> -- 
> Best regards,
>
> Shaofeng Shi 史少锋
>

-- 

*Roberto Tardío Olmos*

/Senior Big Data & Business Intelligence Consultant/
Avenida de Brasil, 17, Planta 16.28020 Madrid
Fijo: 91.788.34.10

Re: Persistent error after Kafka streaming cube source stopped

Posted by ShaoFeng Shi <sh...@apache.org>.
From this message, it is probably caused by unstable Hadoop env:

org.apache.kylin.job.exception.hdfs.BlockMissingException: Could not obtain
block: BP-699932432.....

Need full log to detect the root cause.
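
As a side note, the missing-block symptom can be confirmed from the 
HDFS side with the standard fsck tool. A hedged sketch (the /kylin 
working directory is Kylin's default, but verify yours in 
kylin.properties):

```shell
# List all files with corrupt or missing blocks
hdfs fsck / -list-corruptfileblocks

# Inspect block locations under Kylin's HDFS working directory
hdfs fsck /kylin -files -blocks -locations
```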

2017-10-16 20:13 GMT+08:00 Billy Liu <bi...@apache.org>:

> Hi Roberto,
>
> Could you upload more logs? There should be more logs in the context.


-- 
Best regards,

Shaofeng Shi 史少锋

Re: Persistent error after Kafka streaming cube source stopped

Posted by Billy Liu <bi...@apache.org>.
Hi Roberto,

Could you upload more logs? There should be more logs in the context.
