Posted to common-user@hadoop.apache.org by Petrucci Andreas <pe...@hotmail.com> on 2010/12/07 20:46:56 UTC

HDFS and libhdfs

Hello there, I'm trying to compile libhdfs (in order to mount HDFS via FUSE), but there are some problems. According to http://wiki.apache.org/hadoop/MountableHDFS I have already installed FUSE. With ant compile-c++-libhdfs -Dlibhdfs=1 the build is successful.

However, when I try ant package -Djava5.home=... -Dforrest.home=..., the build fails and the output is below:

 [exec] 
     [exec] Exception in thread "main" java.lang.UnsupportedClassVersionError: Bad version number in .class file
     [exec]     at java.lang.ClassLoader.defineClass1(Native Method)
     [exec]     at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
     [exec]     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
     [exec]     at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
     [exec]     at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
     [exec]     at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
     [exec]     at java.security.AccessController.doPrivileged(Native Method)
     [exec]     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
     [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
     [exec]     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
     [exec]     at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
     [exec]     at org.apache.avalon.excalibur.logger.DefaultLogTargetFactoryManager.configure(DefaultLogTargetFactoryManager.java:113)
     [exec]     at org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.java:201)
     [exec]     at org.apache.avalon.excalibur.logger.LogKitLoggerManager.setupTargetFactoryManager(LogKitLoggerManager.java:436)
     [exec]     at org.apache.avalon.excalibur.logger.LogKitLoggerManager.configure(LogKitLoggerManager.java:400)
     [exec]     at org.apache.avalon.framework.container.ContainerUtil.configure(ContainerUtil.java:201)
     [exec]     at org.apache.cocoon.core.CoreUtil.initLogger(CoreUtil.java:607)
     [exec]     at org.apache.cocoon.core.CoreUtil.init(CoreUtil.java:169)
     [exec]     at org.apache.cocoon.core.CoreUtil.<init>(CoreUtil.java:115)
     [exec]     at org.apache.cocoon.bean.CocoonWrapper.initialize(CocoonWrapper.java:128)
     [exec]     at org.apache.cocoon.bean.CocoonBean.initialize(CocoonBean.java:97)
     [exec]     at org.apache.cocoon.Main.main(Main.java:310)
     [exec] Java Result: 1
     [exec] 
     [exec]   Copying broken links file to site root.
     [exec]       
     [exec] 
     [exec] BUILD FAILED
     [exec] /apache-forrest-0.8/main/targets/site.xml:175: Warning: Could not find file /hadoop-0.20.2/src/docs/build/tmp/brokenlinks.xml to copy.
     [exec] 
     [exec] Total time: 4 seconds

BUILD FAILED
/hadoop-0.20.2/build.xml:867: exec returned: 1


Any ideas what's wrong?
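
In case it matters: as I understand it, java5.home should point at a Java 5 JDK (Forrest 0.8 needs one for the docs build) and forrest.home at the Forrest install, so my invocation looks roughly like this (the paths here are just examples, not my real ones):

    ant package -Djava5.home=/opt/jdk1.5.0 -Dforrest.home=/opt/apache-forrest-0.8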

RE: HDFS and libhdfs

Posted by Petrucci Andreas <pe...@hotmail.com>.
Yes, my JAVA_HOME is properly set. However, with the Hadoop 0.20.2 I'm using, when I run ant compile-contrib -Dlibhdfs=1 -Dcompile.c++=1 from HADOOP_HOME, the tail of the output is the following:

    [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsUtime':
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1488: error: 'JNIEnv' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1488: error: 'env' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1490: error: 'errno' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1494: error: 'jobject' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1494: error: expected ';' before 'jFS'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1497: error: expected ';' before 'jPath'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1498: error: 'jPath' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1503: error: 'jlong' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1503: error: expected ';' before 'jmtime'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1504: error: expected ';' before 'jatime'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1507: error: 'jthrowable' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1507: error: expected ';' before 'jExc'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1508: error: 'jExc' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1508: error: 'jFS' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1510: error: 'jmtime' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1510: error: 'jatime' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsGetHosts':
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1533: error: 'JNIEnv' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1533: error: 'env' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1535: error: 'errno' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1539: error: 'jobject' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1539: error: expected ';' before 'jFS'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1542: error: expected ';' before 'jPath'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1543: error: 'jPath' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1547: error: 'jvalue' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1547: error: expected ';' before 'jFSVal'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1548: error: 'jthrowable' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1548: error: expected ';' before 'jFSExc'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1549: error: 'jFSVal' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1549: error: 'jFSExc' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1549: error: 'jFS' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1559: error: expected ';' before 'jFileStatus'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1563: error: 'jobjectArray' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1563: error: expected ';' before 'jBlockLocations'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1564: error: expected ';' before 'jVal'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1565: error: expected ';' before 'jExc'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1566: error: 'jVal' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1566: error: 'jExc' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1570: error: 'jFileStatus' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1577: error: 'jBlockLocations' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1581: error: 'jsize' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1581: error: expected ';' before 'jNumFileBlocks'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1583: error: 'jNumFileBlocks' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1597: error: expected ';' before 'jFileBlock'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1600: error: expected ';' before 'jVal'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1601: error: expected ';' before 'jFileBlockHosts'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1602: error: 'jFileBlock' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1613: error: 'jFileBlockHosts' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1616: error: expected ';' before 'jNumBlockHosts'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1617: error: 'jNumBlockHosts' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1633: error: 'jstring' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1633: error: expected ';' before 'jHost'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1637: error: 'jHost' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1638: warning: implicit declaration of function 'strdup'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1638: warning: incompatible implicit declaration of built-in function 'strdup'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsGetDefaultBlockSize':
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1677: error: 'JNIEnv' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1677: error: 'env' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1679: error: 'errno' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1683: error: 'jobject' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1683: error: expected ';' before 'jFS'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1687: error: 'jvalue' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1687: error: expected ';' before 'jVal'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1688: error: 'jthrowable' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1688: error: expected ';' before 'jExc'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1689: error: 'jVal' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1689: error: 'jExc' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1689: error: 'jFS' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsGetCapacity':
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1708: error: 'JNIEnv' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1708: error: 'env' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1710: error: 'errno' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1714: error: 'jobject' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1714: error: expected ';' before 'jFS'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1716: error: 'jFS' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1717: warning: implicit declaration of function 'globalClassReference'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1724: error: 'jvalue' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1724: error: expected ';' before 'jVal'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1725: error: 'jthrowable' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1725: error: expected ';' before 'jExc'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1726: error: 'jVal' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1726: error: 'jExc' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsGetUsed':
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1744: error: 'JNIEnv' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1744: error: 'env' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1746: error: 'errno' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1750: error: 'jobject' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1750: error: expected ';' before 'jFS'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1752: error: 'jFS' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1760: error: 'jvalue' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1760: error: expected ';' before 'jVal'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1761: error: 'jthrowable' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1761: error: expected ';' before 'jExc'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1762: error: 'jVal' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1762: error: 'jExc' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: At top level:
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1775: error: expected ')' before '*' token
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1909: error: expected ')' before '*' token
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsListDirectory':
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1962: error: 'JNIEnv' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1962: error: 'env' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1964: error: 'errno' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1968: error: 'jobject' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1968: error: expected ';' before 'jFS'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1971: error: expected ';' before 'jPath'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1972: error: 'jPath' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1978: error: 'jobjectArray' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1978: error: expected ';' before 'jPathList'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1979: error: 'jvalue' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1979: error: expected ';' before 'jVal'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1980: error: 'jthrowable' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1980: error: expected ';' before 'jExc'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1981: error: 'jVal' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1981: error: 'jExc' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1981: error: 'jFS' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1989: error: 'jPathList' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1992: error: 'jsize' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1992: error: expected ';' before 'jPathListSize'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:1993: error: 'jPathListSize' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2007: error: expected ';' before 'i'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2008: error: expected ';' before 'tmpStat'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2009: error: 'i' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2010: error: 'tmpStat' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2011: warning: implicit declaration of function 'getFileInfoFromStat'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsGetPathInfo':
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2041: error: 'JNIEnv' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2041: error: 'env' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2043: error: 'errno' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2047: error: 'jobject' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2047: error: expected ';' before 'jFS'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2050: error: expected ';' before 'jPath'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2051: error: 'jPath' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2056: warning: implicit declaration of function 'getFileInfo'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2056: error: 'jFS' undeclared (first use in this function)
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c: In function 'hdfsFreeFileInfo':
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2080: error: 'hdfsFileInfo' has no member named 'mOwner'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2081: error: 'hdfsFileInfo' has no member named 'mOwner'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2083: error: 'hdfsFileInfo' has no member named 'mGroup'
     [exec] /home/hy59045/sfakiana/hadoop-0.20.2/src/c++/libhdfs/hdfs.c:2084: error: 'hdfsFileInfo' has no member named 'mGroup'
     [exec] make: *** [hdfs.lo] Error 1

BUILD FAILED
/home/hy59045/sfakiana/hadoop-0.20.2/build.xml:1478: exec returned: 2


Any ideas?
Thanks in advance.
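
For what it's worth, every error above is a JNI type (JNIEnv, jobject, jthrowable, jsize) being reported as undeclared, so I suspect jni.h is never getting included; that usually means the build cannot find the JDK headers. A quick sanity check I can run (illustrative, assuming a Linux JDK layout):

    # JAVA_HOME must point at a full JDK, not a JRE; jni.h lives here
    ls "$JAVA_HOME/include/jni.h"
    # the platform-specific companion header on Linux
    ls "$JAVA_HOME/include/linux/jni_md.h"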

> From: cos@apache.org
> Date: Tue, 7 Dec 2010 14:29:03 -0800
> Subject: Re: HDFS and libhdfs
> To: common-user@hadoop.apache.org
> 
> It seems that you're trying to run ant with Java 5. Make sure your
> JAVA_HOME is set properly.
> --
>   Take care,
> Konstantin (Cos) Boudnik
> 
> 
> 
> 2010/12/7 Petrucci Andreas <pe...@hotmail.com>:
> >
> > Hello there, I'm trying to compile libhdfs (in order to mount HDFS via FUSE), but there are some problems. [...]
> >
> > [snip: build output quoted in full in the original message above]
> >

Re: HDFS and libhdfs

Posted by Konstantin Boudnik <co...@apache.org>.
It seems that you're trying to run ant with Java 5. Make sure your
JAVA_HOME is set properly.
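
A minimal check (the grep pattern matches the java.version line that
ant -diagnostics prints among its system properties):

    # the JDK your shell sees
    echo "$JAVA_HOME"
    "$JAVA_HOME/bin/java" -version

    # the JVM ant itself runs under; the two should agree
    ant -diagnostics | grep 'java.version'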
--
  Take care,
Konstantin (Cos) Boudnik



2010/12/7 Petrucci Andreas <pe...@hotmail.com>:
>
> Hello there, I'm trying to compile libhdfs (in order to mount HDFS via FUSE), but there are some problems. [...]
>
> [snip: build output quoted in full in the original message above]
>

Re: HDFS and libhdfs

Posted by Edward Capriolo <ed...@gmail.com>.
2010/12/7 Petrucci Andreas <pe...@hotmail.com>:
>
> Hello there, I'm trying to compile libhdfs (in order to mount HDFS via FUSE), but there are some problems. [...]
>
> [snip: build output quoted in full in the original message above]
>

I've never seen this usage:
-Djava5.home
Try:
export JAVA_HOME=/usr/java

" Bad version number in .class file " means you are mixing and
matching java versions somehow.
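
A quick way to rule that out (the JDK path below is illustrative; the
point is that java, javac, and ant must all agree on one JDK):

    # substitute your own Java 6 install for this example path
    export JAVA_HOME=/usr/java/jdk1.6.0
    export PATH="$JAVA_HOME/bin:$PATH"

    java -version     # should report the JDK above
    javac -version    # should match java -version

Then re-run the same ant package command with your existing
-Djava5.home and -Dforrest.home values.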