Posted to common-dev@hadoop.apache.org by "Konstantin Boudnik (JIRA)" <ji...@apache.org> on 2009/06/09 22:11:07 UTC

[jira] Issue Comment Edited: (HADOOP-6003) AspectJ framework for HDFS code and tests

    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12717807#action_12717807 ] 

Konstantin Boudnik edited comment on HADOOP-6003 at 6/9/09 1:10 PM:
--------------------------------------------------------------------

This patch includes the following additions:
- AspectJ framework (version 1.6.4) is added to the Ivy resolver's configuration
- the implementation of the simple probability calculation and configuration needed for fault injection
- two aspects for the datanode classes BlockReceiver and FSDataset are created and tested (see the sketch after this list)
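
A minimal sketch of what such an aspect could look like, written in AspectJ code style; the pointcut, the advised method (receivePacket) and the helper (ProbabilityModel.injectCriteria) are illustrative assumptions, not necessarily what the attached patch contains:
{code}
package org.apache.hadoop.hdfs.server.datanode;

import java.io.IOException;

// Hypothetical aspect: assumes BlockReceiver.receivePacket() exists and declares IOException.
public aspect BlockReceiverAspects {

  // Match executions of BlockReceiver.receivePacket(), a plausible fault point.
  pointcut receivePacketExecution() :
    execution(* BlockReceiver.receivePacket(..));

  // Before the method runs, throw an IOException with the configured probability.
  before() throws IOException : receivePacketExecution() {
    if (ProbabilityModel.injectCriteria("BlockReceiver")) {
      throw new IOException("Injected fault: simulated error in BlockReceiver.receivePacket");
    }
  }
}
{code}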

It is expected that some unit tests will fail with the faults in place. We might need to develop different kinds of tests to make better use of fault injection.

The interface of the new framework is as follows:
- ant injectfaults weaves the aspects in place after the normal compilation of the HDFS classes is complete
- ant run-test-hdfs executes the unit tests as usual, but faults will be injected according to the rules below
- ant jar creates Hadoop's jar as usual, but if 'injectfaults' has been executed beforehand the jar file will include the instrumented classes, i.e. with fault invocations (a typical sequence is shown after this list)
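
For example, a typical fault-injection run would chain these targets:
{code}
  ant injectfaults                            # compile the HDFS classes and weave the aspects in
  ant run-test-hdfs -Dfault.probability.FSDataset=3   # run the unit tests with faults enabled (see the rules below)
  ant jar                                     # package a jar containing the instrumented classes
{code}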

The rules of faults injection probability calculation are as follows:
* the default probability level is set to 0, so even with the aspects woven into the classes, no faults will be injected/executed unless explicitly requested
* to set the fault probability level for a particular class, one needs to specify a system property in the following format:
{code}
  ant run-test-hdfs -Dfault.probability.FSDataset=3 
{code}
which will set the probability of injecting faults into the FSDataset class to about 3%
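
A minimal sketch of how such a property could be interpreted, assuming a helper like the hypothetical ProbabilityModel referenced in the aspect sketch above (the class name, method name and the percent-based interpretation are assumptions, not necessarily what the patch implements):
{code}
import java.util.Random;

// Hypothetical helper: looks up the per-class fault probability from a
// -Dfault.probability.<ClassName>=<percent> system property (default 0, i.e. no faults).
public class ProbabilityModel {
  private static final String PROP_PREFIX = "fault.probability.";
  private static final Random RANDOM = new Random();

  // Returns true roughly percent% of the time for the given class name.
  public static boolean injectCriteria(String className) {
    int percent = Integer.getInteger(PROP_PREFIX + className, 0);
    return RANDOM.nextInt(100) < percent;
  }
}
{code}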


> AspectJ framework for HDFS code and tests
> -----------------------------------------
>
>                 Key: HADOOP-6003
>                 URL: https://issues.apache.org/jira/browse/HADOOP-6003
>             Project: Hadoop Core
>          Issue Type: Sub-task
>    Affects Versions: 0.20.0
>            Reporter: Konstantin Boudnik
>            Assignee: Konstantin Boudnik
>             Fix For: 0.21.0
>
>         Attachments: HADOOP-6003.patch, HADOOP-6003.sh
>
>
> This subtask takes care of the HDFS part of Hadoop only; other parts will be added later as needed. It will include only the development of new aspects and modifications of the build.xml file

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.