Posted to dev@eagle.apache.org by Huizhi Lu <ih...@gmail.com> on 2016/06/14 04:49:26 UTC

Re: Requirement for detailed document of eagle

Hi Qianqian,

Here are some troubleshooting tips:

1) Run the command "hdfs dfs -cat /tmp/private", then use the following
command to check whether the Kafka consumer is able to consume the message.
Change the --zookeeper option if needed.

bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic
sandbox_hdfs_audit_log

If you can see the message consumed by Kafka, then your Logstash or Log4j
Kafka appender is configured correctly.
If not, the configuration is incorrect and you need to double-check it.
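For reference, a Log4j Kafka appender entry in the NameNode's
log4j.properties looks roughly like the sketch below. The appender class
name and property keys here are assumptions that differ across Eagle and
Kafka releases, so verify them against the docs for your version.

```properties
# Sketch only: the appender class and property names below are assumptions
# and vary across Eagle/Kafka releases; check your version's documentation.
log4j.appender.KAFKA_HDFS_AUDIT=org.apache.eagle.log4j.kafka.KafkaLog4jAppender
log4j.appender.KAFKA_HDFS_AUDIT.Topic=sandbox_hdfs_audit_log
log4j.appender.KAFKA_HDFS_AUDIT.BrokerList=localhost:9092
log4j.appender.KAFKA_HDFS_AUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA_HDFS_AUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n

# Attach the appender to the HDFS audit logger so audit events reach Kafka.
log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=INFO,KAFKA_HDFS_AUDIT
```

On a multi-node cluster, BrokerList must point at your own brokers rather
than localhost, and the NameNode needs a restart for the change to take
effect.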

2) Run the Kafka console producer to send a message to the topic, and use
the consumer command to verify the message is consumed. Then check whether
you get an alert.

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic
sandbox_hdfs_audit_log

bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic
sandbox_hdfs_audit_log


3) Check the Storm logs and find the log file for your topology.
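A minimal sketch of step 3, assuming a default Storm layout: the log
directory (/var/log/storm) and the topology name fragment (hdfsAuditLog)
are assumptions here, so adjust them to match your deployment.

```shell
#!/bin/sh
# Sketch only: STORM_LOG_DIR and the topology name "hdfsAuditLog" are
# assumptions; adjust both to match your Storm install and topology.
STORM_LOG_DIR="${STORM_LOG_DIR:-/var/log/storm}"

if [ -d "$STORM_LOG_DIR" ]; then
    # Show the most recently written logs first.
    ls -lt "$STORM_LOG_DIR" | head -n 20

    # Find worker logs that mention the topology, then look for alert activity.
    grep -il "hdfsAuditLog" "$STORM_LOG_DIR"/worker-*.log 2>/dev/null
    grep -i "alert" "$STORM_LOG_DIR"/worker-*.log 2>/dev/null | tail -n 20
else
    echo "Storm log directory not found: $STORM_LOG_DIR"
fi
```

Worker logs live on the supervisor hosts, not the nimbus host, so on a
three-server setup you may need to check each supervisor machine.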

If you still have questions, feel free to post them to the community.

-Huizhi


On Mon, Jun 13, 2016 at 8:39 PM, 左倩茜 <zq...@163.com> wrote:

> Hello,
> Thanks a lot for all your effort on Eagle.
> I am a student and I want to research Eagle. I downloaded HDP's sandbox
> and set up the environment for Eagle, and it runs well.
> But when I use my own environment (three servers), it doesn't work.
> Following the tutorial on your website, I tested sandbox_hdfs_audit_log
> step by step.
> After I create a policy in the Eagle UI and run the command "hdfs dfs -cat
> /tmp/private", the alert info doesn't appear.
> My Kafka can produce and consume messages if I use curl, so I guess some
> configuration is not correct.
> Could you offer some detailed documentation for me?
>
>
>
> Best Regards,
>
> Ms. Zuo Qianqian
>
> 2016-06-14