Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2011/02/11 19:01:35 UTC

[Hadoop Wiki] Update of "GitAndHadoop" by SteveLoughran

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "GitAndHadoop" page has been changed by SteveLoughran.
The comment on this change is: details on ivy cache contamination.
http://wiki.apache.org/hadoop/GitAndHadoop?action=diff&rev1=13&rev2=14

--------------------------------------------------

  
  This Ant target not only builds the JAR files, it also copies them to the local {{{${user.home}/.m2}}} directory, where they will be picked up by the "internal" resolver. You can check that this is taking place by running {{{ant ivy-report}}} on a project and seeing where it gets its dependencies.
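  
  A quick way to see which artifacts have actually been published locally is to list the relevant part of the local repository. The path below assumes the standard {{{~/.m2/repository}}} layout and the {{{hadoop-common}}} artifact name; adjust it to match your branch:
  
  {{{
  # list the locally installed hadoop-common artifacts and their file timestamps
  # (path assumes the standard Maven repository layout)
  ls -l ~/.m2/repository/org/apache/hadoop/hadoop-common/
  }}}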
  
- '''Warning:''' it's easy for old JAR versions to get cached and picked up. You will notice this early if something in hadoop-hdfs or hadoop-mapreduce doesn't compile, but if you are unlucky things do compile, just not work as your updates are not picked up. Run {{{ant clean-cache}}} to fix this 
+ '''Warning:''' it's easy for old JAR versions to get cached and picked up. You will notice this early if something in hadoop-hdfs or hadoop-mapreduce doesn't compile; if you are unlucky, everything compiles but doesn't work, because your updates are not being picked up. Run {{{ant clean-cache}}} to fix this.
+ 
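+ For example, assuming the publish target described earlier is {{{mvn-install}}} (substitute whatever target your build uses to install into {{{~/.m2}}}):
+ 
+ {{{
+ # purge the stale cached artifacts
+ ant clean-cache
+ # then rebuild and republish the local JARs
+ # (mvn-install is assumed here; use whatever target your build publishes with)
+ ant mvn-install
+ }}}
+ 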
+ By default, the trunks of the HDFS and MapReduce projects are set to pull in the snapshot versions that are built and published to the Apache snapshot repository nightly. While this saves developers in those projects the complexity of having to build and publish the upstream artifacts themselves, it doesn't work if you do want to make changes to something like hadoop-common. You need to make sure your local projects are picking up what is being built locally.
+ 
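+ For example, a hadoop-hdfs build can be pointed at the locally published artifacts by selecting the internal resolver on the command line; {{{compile}}} is just one example target here:
+ 
+ {{{
+ # resolve dependencies through the internal resolver (the local repository)
+ # rather than the Apache snapshot repository
+ ant compile -Dresolvers=internal
+ }}}
+ 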
+ To check this in the hadoop-hdfs project, generate the Ivy dependency reports using the internal resolver:
+ {{{
+ ant ivy-report -Dresolvers=internal
+ }}}
+ 
+ Then browse to the report page listed at the end of the build output, switch to the "common" tab, and look for the hadoop-common JAR. It should have a publication timestamp containing the date and time of your local build. For example, the string "20110211174419" means the date 2011-02-11 and the time 17:44:19. If an older version is listed, you probably have it cached in the Ivy cache; you can fix this by removing everything under org.apache.hadoop from that cache:
+ 
+ {{{
+ rm -rf ~/.ivy2/cache/org.apache.hadoop
+ }}}
+ 
+ Rerun the {{{ivy-report}}} target and check that the publication timestamp is now current; this verifies that the locally built version is the one being picked up.
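+ 
+ For example, using the same command as before:
+ 
+ {{{
+ # re-resolve and regenerate the report; the hadoop-common entry should now
+ # show a publication timestamp matching your local build
+ ant ivy-report -Dresolvers=internal
+ }}}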
+ 
  
  === Testing ===