Posted to codereview@trafodion.apache.org by zellerh <gi...@git.apache.org> on 2015/10/13 17:09:03 UTC

[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

GitHub user zellerh opened a pull request:

    https://github.com/apache/incubator-trafodion/pull/118

    TRAFODION-1521 Build Trafodion without having HBase installed

    Adding a new script, core/sqf/sql/scripts/get_libhdfs_files. The new
    script will download Google Protocol Buffers 2.5.0 and a source tar
    file for Hadoop and build protobuf-2.5.0 and the Hadoop native
    libraries. It will then copy the libraries (libhadoop.so and
    libhdfs.so) to $MY_SQROOT/export/lib${SQ_MBTYPE}. It will also copy
    the include file hdfs.h to $MY_SQROOT/include. Since this step is very
    time-consuming, the script takes a shortcut if Hadoop is already
    installed (e.g. with install_local_hadoop); in that case it copies the
    existing native libraries.  To address the remaining needs for jar
    files during the build, the hbasetmlib2 build is converted to
    Maven.  We also remove the case in sqenvcom.sh where a
    configuration does not have install_local_hadoop, Cloudera,
    Hortonworks or MapR installed, but has a TOOLSDIR. Such cases should
    use the recently added case for a native Apache Hadoop installation or
    the case where no HBase installation is found. That last case will
    print out a reminder that can be ignored.
    
    Note: Users will need to do "sqgen" to update their classpath. This is
    because we need a new jar in the classpath: trafodion-dtm-1.2.0.jar.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/zellerh/incubator-trafodion bug/1521

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/incubator-trafodion/pull/118.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #118
    
----
commit 77eab6badc3cebaef392a03b25c3d9e1dc5106c2
Author: Hans Zeller <ha...@esgyn.com>
Date:   2015-10-13T14:40:46Z

    TRAFODION-1521 Build Trafodion without having HBase installed
    
    Adding a new script, core/sqf/sql/scripts/get_libhdfs_files. The new
    script will download Google Protocol Buffers 2.5.0 and a source tar
    file for Hadoop and build protobuf-2.5.0 and the Hadoop native
    libraries. It will then copy the libraries (libhadoop.so and
    libhdfs.so) to $MY_SQROOT/export/lib${SQ_MBTYPE}. It will also copy
    the include file hdfs.h to $MY_SQROOT/include. Since this step is very
    time-consuming, the script takes a shortcut if Hadoop is already
    installed (e.g. with install_local_hadoop); in that case it copies the
    existing native libraries.  To address the remaining needs for jar
    files during the build, the hbasetmlib2 build is converted to
    Maven.  We also remove the case in sqenvcom.sh where a
    configuration does not have install_local_hadoop, Cloudera,
    Hortonworks or MapR installed, but has a TOOLSDIR. Such cases should
    use the recently added case for a native Apache Hadoop installation or
    the case where no HBase installation is found. That last case will
    print out a reminder that can be ignored.
    
    Note: Users will need to do "sqgen" to update their classpath. This is
    because we need a new jar in the classpath: trafodion-dtm-1.2.0.jar.

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by robertamarton <gi...@git.apache.org>.
Github user robertamarton commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41887745
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    --- End diff --
    
    There is another library, libhadoop.so; the assumption is that if libhdfs.so exists, then so does libhadoop.so.
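
    The check being discussed could also test all three artifacts explicitly instead of relying on that assumption. A minimal sketch (the needs_rebuild helper and its directory arguments are hypothetical names standing in for the script's inline test on TGT_INC_DIR and TGT_LIB_DIR):

    ```shell
    # Hypothetical variant of the script's freshness check: test all three
    # required artifacts explicitly rather than assuming that libhadoop.so
    # is present whenever libhdfs.so is. Returns success (0) when any of
    # the three files is missing, i.e. when a rebuild is needed.
    needs_rebuild() {
      inc_dir="$1"; lib_dir="$2"
      if [ ! -e "$inc_dir/hdfs.h" ] || \
         [ ! -e "$lib_dir/libhdfs.so" ] || \
         [ ! -e "$lib_dir/libhadoop.so" ]; then
        return 0   # something is missing: rebuild
      fi
      return 1     # all three present: up to date
    }
    ```

    Exit status 0 means at least one artifact is missing; this mirrors the script's if-condition but makes the libhadoop.so dependency explicit.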



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41894882
  
    --- Diff: core/sqf/sqenvcom.sh ---
    @@ -580,6 +547,13 @@ EOF
         echo "**** ERROR: Unable to determine location of HBase lib directory"
       fi
     
    +  if [[ -d $TOOLSDIR/thrift-0.9.0 ]]; then
    +    # this is mostly for a build environment, where we need
    +    # thrift from TOOLSDIR
    +    export THRIFT_LIB_DIR=$TOOLSDIR/thrift-0.9.0/lib
    +    export THRIFT_INC_DIR=$TOOLSDIR/thrift-0.9.0/include
    +  fi
    +
       if [ -n "$HBASE_CNF_DIR" -a -n "$HADOOP_CNF_DIR" -a \
    --- End diff --
    
    No, these are still there. Good to know, I was wondering what those are used for. Thanks for checking.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by robertamarton <gi...@git.apache.org>.
Github user robertamarton commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41886534
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    --- End diff --
    
    Would it be useful to run "protoc --version" and, if the installed version is 2.5.0 or greater, skip this download step and use the installed version?
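
    As a sketch of that suggestion (helper and variable names are mine, not from the PR): probe for an installed protoc first, and only fall back to downloading and building protobuf 2.5.0 when none is found or it is too old.

    ```shell
    # Hypothetical guard for the download step: compare an installed
    # protoc's version against the required minimum before building one.
    version_ge() {
      # true (exit 0) when version $1 >= $2, comparing dot-separated fields
      awk -v a="$1" -v b="$2" '
        BEGIN {
          na = split(a, x, "."); nb = split(b, y, ".")
          n = (na > nb) ? na : nb
          for (i = 1; i <= n; i++) {
            ai = (i <= na) ? x[i] + 0 : 0
            bi = (i <= nb) ? y[i] + 0 : 0
            if (ai > bi) exit 0
            if (ai < bi) exit 1
          }
          exit 0   # versions are equal
        }'
    }

    NEED_PROTOBUF_BUILD=true
    if command -v protoc >/dev/null 2>&1; then
      installed_ver=$(protoc --version | awk '{print $2}')
      if version_ge "$installed_ver" "2.5.0"; then
        NEED_PROTOBUF_BUILD=false
      fi
    fi
    ```

    One caveat: if I recall correctly, Hadoop's Maven build checks for the exact protoc version it expects, so "2.5.0 or greater" may need to be "exactly 2.5.0" in practice.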



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/incubator-trafodion/pull/118



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by robertamarton <gi...@git.apache.org>.
Github user robertamarton commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41886129
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    --- End diff --
    
    Do we need the Apache header?



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41888635
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    --- End diff --
    
    Sorry, I don't remember anymore why I picked 2.6.0, but it does work. We use 2.5.0-cdh5.3.0 elsewhere, which is different from Apache 2.5.0, and I might have run into some interface issues with the Apache 2.5.0 version.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by robertamarton <gi...@git.apache.org>.
Github user robertamarton commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41888772
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    +  if [ ! -d $LIBHDFS_TEMP_DIR ]; then
    +    mkdir $LIBHDFS_TEMP_DIR
    +  fi
    +
    +  cd $LIBHDFS_TEMP_DIR
    +
    +  if [[ ! -f ${PROTOBUF_TAR} ]]; then
    +    echo "Downloading Google Protocol Buffers..." | tee -a ${LOGFILE}
    +    wget ${PROTOBUF_MIRROR_URL}/${PROTOBUF_TAR} >${LOGFILE}
    +  fi
    +
    +  if [[ $FORCE_BUILD == true ]]; then
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_ID}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +  fi
    +
    +  if [[ ! -d ${PROTOBUF_ID} ]]; then
    +    echo "Unpacking Google Protocol Buffer tar file..." | tee -a ${LOGFILE}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +    tar -xzf ${PROTOBUF_TAR} >>${LOGFILE}
    +  fi
    +
    +  if [[ ! -d $PROTOBUF_TGT_ID ]]; then
    +    cd ${PROTOBUF_ID}
    +    echo "Building Google Protocol Buffers, this could take a while..." | tee -a ${LOGFILE}
    +    if [[ $VERBOSE == true ]]; then
    +      ./configure --prefix=${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID} 2>&1 | tee -a ${LOGFILE}
    +    else
    +      ./configure --prefix=${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID} 2>&1 >>${LOGFILE}
    +    fi
    +    if [[ $? != 0 ]]; then
    +      echo "Error during configure step, exiting" | tee -a ${LOGFILE}
    +      exit 1
    +    fi
    +    make 2>&1 >>${LOGFILE}
    +    if [[ $? != 0 ]]; then
    +      echo "Error during make step, exiting" | tee -a ${LOGFILE}
    +      exit 1
    +    fi
    +    # skip the tests
    +    # make check 2>&1 >>${LOGFILE}
    +    if [[ $? != 0 ]]; then
    +      echo "Error during check step, exiting" | tee -a ${LOGFILE}
    +      exit 1
    +    fi
    +    make install 2>&1 >>${LOGFILE}
    +    if [[ $? != 0 ]]; then
    +      echo "Error during install step, exiting" | tee -a ${LOGFILE}
    +      # remove partial results, if any
    +      rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +      exit 1
    +    fi
    +  fi
    +
    +  cd $LIBHDFS_TEMP_DIR
    +  export HADOOP_PROTOC_PATH=${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}/bin/protoc
    +
    +  if [[ ! -f ${HADOOP_SRC_TAR} ]]; then
    +    echo "Downloading Hadoop tar file ${HADOOP_SRC_TAR}..." | tee -a ${LOGFILE}
    +    wget ${HADOOP_SRC_MIRROR_URL}/${HADOOP_SRC_TAR} 2>&1 >>${LOGFILE}
    +  fi
    +
    +  if [[ $FORCE_BUILD == true ]]; then
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${HADOOP_SRC_ID}
    +  fi
    +
    +  if [[ ! -d ${HADOOP_SRC_ID} ]]; then
    +    echo "Unpacking Hadoop tar file..." | tee -a ${LOGFILE}
    +    tar -xzf ${HADOOP_SRC_TAR}
    +  fi
    +
    +  if [[ ! -d ${LIBHDFS_TEMP_DIR}/${HADOOP_SRC_ID}/hadoop-dist/target ]]; then
    +    cd ${HADOOP_SRC_ID}
    +    echo "Building native library, this will take several minutes..." | tee -a ${LOGFILE}
    +    if [[ $VERBOSE == true ]]; then
    +      mvn package -Pdist,native -Dmaven.javadoc.skip=true -DskipTests -Dtar 2>&1 | tee -a ${LOGFILE}
    +    else
    +      mvn package -Pdist,native -Dmaven.javadoc.skip=true -DskipTests -Dtar 2>&1 >>${LOGFILE}
    +    fi
    +    if [[ $? != 0 ]]; then
    +      echo "Error during Maven build step for libhdfs, exiting" | tee -a ${LOGFILE}
    +      exit 1
    +    fi
    +  fi
    +
    +  echo "Copying include file and built libraries to Trafodion export dir..." | tee -a ${LOGFILE}
    +  if [[ $VERBOSE == true ]]; then
    +    set -x
    +  fi
    +  cp -f ${LIBHDFS_TEMP_DIR}/${HADOOP_SRC_ID}/hadoop-dist/target/${HADOOP_ID}/include/hdfs.h ${TGT_INC_DIR}
    +  cp -Pf ${LIBHDFS_TEMP_DIR}/${HADOOP_SRC_ID}/hadoop-dist/target/${HADOOP_ID}/lib/native/libhdfs*.so* ${TGT_LIB_DIR}
    +  cp -Pf ${LIBHDFS_TEMP_DIR}/${HADOOP_SRC_ID}/hadoop-dist/target/${HADOOP_ID}/lib/native/libhadoop*.so* ${TGT_LIB_DIR}
    +
    +  ls -l ${TGT_INC_DIR}/hdfs.h       >> ${LOGFILE}
    +  ls -l ${TGT_LIB_DIR}/libhdfs.so   >> ${LOGFILE}
    +  ls -l ${TGT_LIB_DIR}/libhadoop.so >> ${LOGFILE}
    +
    +  # Final check whether all the needed files are there
    +  if [[ ! -r ${TGT_INC_DIR}/hdfs.h || \
    +        ! -r ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +    echo "Error, not all files were created" | tee -a ${LOGFILE}
    +    ls -l ${TGT_INC_DIR}/hdfs.h
    +    ls -l ${TGT_LIB_DIR}/libhdfs.so
    +    exit 1
    +  fi
    +fi
    --- End diff --
    
    This is great - thanks for adding this make step.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by robertamarton <gi...@git.apache.org>.
Github user robertamarton commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41886366
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    --- End diff --
    
    We have been using 2.5.? versions.  Is 2.6.0 compatible with HBase 0.98?



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by DaveBirdsall <gi...@git.apache.org>.
Github user DaveBirdsall commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41884484
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    +  if [ ! -d $LIBHDFS_TEMP_DIR ]; then
    +    mkdir $LIBHDFS_TEMP_DIR
    +  fi
    +
    +  cd $LIBHDFS_TEMP_DIR
    +
    +  if [[ ! -f ${PROTOBUF_TAR} ]]; then
    +    echo "Downloading Google Protocol Buffers..." | tee -a ${LOGFILE}
    +    wget ${PROTOBUF_MIRROR_URL}/${PROTOBUF_TAR} >${LOGFILE}
    +  fi
    +
    +  if [[ $FORCE_BUILD == true ]]; then
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_ID}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +  fi
    +
    +  if [[ ! -d ${PROTOBUF_ID} ]]; then
    +    echo "Unpacking Google Protocol Buffer tar file..." | tee -a ${LOGFILE}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    --- End diff --
    
    Might get an error on this if you went through the code path above that also does this "rm".
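
    For reference, this interaction can be checked in isolation: with the -f flag, rm is specified to succeed even when the operand no longer exists, so the repeated removal by itself should not error. A small sketch (paths are made up):

    ```shell
    # Check that removing the same path twice with "rm -rf" is benign:
    # -f suppresses the error for a nonexistent operand (POSIX semantics),
    # so both cleanup code paths can run back to back.
    workdir=$(mktemp -d)
    target="$workdir/protobuf-tgt"
    mkdir -p "$target"

    rm -rf "$target"        # first code path removes the directory
    first_status=$?
    rm -rf "$target"        # second code path hits an already-missing path
    second_status=$?

    echo "first=$first_status second=$second_status"   # expected: first=0 second=0
    rmdir "$workdir"
    ```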



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by DaveBirdsall <gi...@git.apache.org>.
Github user DaveBirdsall commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41885893
  
    --- Diff: core/sql/nskgmake/Makerules.linux ---
    @@ -80,20 +80,14 @@ OBJSUFFIX := o
     #######################################################################
     # Define several directories used by various parts of the Linux build
     #######################################################################
    -NSK_SQ    := $(MY_SQROOT)
    -NSK       := $(MY_SQROOT)/sql/nsk
    -XMPIROOT        = $(NSK_SQ)/export/lib$(SQ_MBTYPE)
    -NSKBIN		= $(NSK_SQ)/export/bin$(SQ_MBTYPE)
    -NSKROOT         = $(NSK_SQ)/sql/nsk
    +LIBROOT         = $(MY_SQROOT)/export/lib$(SQ_MBTYPE)
    +TRAF_BIN	= $(MY_SQROOT)/export/bin$(SQ_MBTYPE)
    +XMPIROOT        = $(LIBROOT)
     ifeq ($(SQ_BUILD_TYPE),release)
    --- End diff --
    
    Glad to see the NSK stuff cleaned up.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by robertamarton <gi...@git.apache.org>.
Github user robertamarton commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41887305
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    --- End diff --
    
    It looks like someone can set LIBHDFS_TEMP_DIR before calling this script, but if they specify the -d option, the temp dir is overwritten; I assume this is the desired behavior. Also, LOGFILE is set earlier based on LIBHDFS_TEMP_DIR; if the user changes the value here, should LOGFILE also be updated to match?
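
    One way to address the LOGFILE question, sketched with hypothetical names (this is not code from the PR): derive LOGFILE only after the argument loop has run, so a -d/--tempDir override is reflected in where the log ends up.

    ```shell
    # Hypothetical restructuring: compute LOGFILE from the final temp dir,
    # after option parsing, instead of from the pre-parse default.
    default_temp_dir="${MY_SQROOT}/sql/libhdfs_files"

    set_log_file() {
      # $1 is the temp dir chosen after parsing; empty means "use the default"
      LIBHDFS_TEMP_DIR="${1:-$default_temp_dir}"
      LOGFILE="${LIBHDFS_TEMP_DIR}/build.log"
    }
    ```

    The script would call set_log_file once, right after the while/case loop, passing whatever -d/--tempDir left in place.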



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41887027
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    +  if [ ! -d $LIBHDFS_TEMP_DIR ]; then
    +    mkdir $LIBHDFS_TEMP_DIR
    +  fi
    +
    +  cd $LIBHDFS_TEMP_DIR
    +
    +  if [[ ! -f ${PROTOBUF_TAR} ]]; then
    +    echo "Downloading Google Protocol Buffers..." | tee -a ${LOGFILE}
    +    wget ${PROTOBUF_MIRROR_URL}/${PROTOBUF_TAR} >${LOGFILE}
    +  fi
    +
    +  if [[ $FORCE_BUILD == true ]]; then
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_ID}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +  fi
    +
    +  if [[ ! -d ${PROTOBUF_ID} ]]; then
    +    echo "Unpacking Google Protocol Buffer tar file..." | tee -a ${LOGFILE}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +    tar -xzf ${PROTOBUF_TAR} >>${LOGFILE}
    +  fi
    +
    +  if [[ ! -d $PROTOBUF_TGT_ID ]]; then
    +    cd ${PROTOBUF_ID}
    +    echo "Building Google Protocol Buffers, this could take a while..." | tee -a ${LOGFILE}
    +    if [[ $VERBOSE == true ]]; then
    +      ./configure --prefix=${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID} 2>&1 | tee -a ${LOGFILE}
    +    else
    +      ./configure --prefix=${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID} 2>&1 >>${LOGFILE}
    +    fi
    +    if [[ $? != 0 ]]; then
    +      echo "Error during configure step, exiting" | tee -a ${LOGFILE}
    +      exit 1
    +    fi
    +    make 2>&1 >>${LOGFILE}
    +    if [[ $? != 0 ]]; then
    +      echo "Error during make step, exiting" | tee -a ${LOGFILE}
    +      exit 1
    +    fi
    +    # skip the tests
    +    # make check 2>&1 >>${LOGFILE}
    +    if [[ $? != 0 ]]; then
    --- End diff --
    
    Sure, will do.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41890253
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    +  if [ ! -d $LIBHDFS_TEMP_DIR ]; then
    +    mkdir $LIBHDFS_TEMP_DIR
    +  fi
    +
    +  cd $LIBHDFS_TEMP_DIR
    +
    +  if [[ ! -f ${PROTOBUF_TAR} ]]; then
    +    echo "Downloading Google Protocol Buffers..." | tee -a ${LOGFILE}
    +    wget ${PROTOBUF_MIRROR_URL}/${PROTOBUF_TAR} >${LOGFILE}
    +  fi
    +
    +  if [[ $FORCE_BUILD == true ]]; then
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_ID}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +  fi
    +
    +  if [[ ! -d ${PROTOBUF_ID} ]]; then
    +    echo "Unpacking Google Protocol Buffer tar file..." | tee -a ${LOGFILE}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +    tar -xzf ${PROTOBUF_TAR} >>${LOGFILE}
    +  fi
    +
    +  if [[ ! -d $PROTOBUF_TGT_ID ]]; then
    +    cd ${PROTOBUF_ID}
    +    echo "Building Google Protocol Buffers, this could take a while..." | tee -a ${LOGFILE}
    +    if [[ $VERBOSE == true ]]; then
    --- End diff --
    
    I didn't see a place where you can download binaries. The RPMs that are available for protoc on CentOS don't include version 2.5.0. Let me know if you find a good place to download this as a binary.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41889869
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    +  if [ ! -d $LIBHDFS_TEMP_DIR ]; then
    +    mkdir $LIBHDFS_TEMP_DIR
    +  fi
    --- End diff --
    
    Thanks, will fix that.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41888212
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    --- End diff --
    
    Yes, thanks.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41889662
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    --- End diff --
    
    I left out libhadoop.so on purpose. Some of the tar files don't seem to include it and it is not needed to build, only at runtime. There might be a problem if it is missing both at compile time and at runtime.
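
    The distinction (libhdfs.so required at build time, libhadoop.so needed only at runtime) could be captured in the copy step. A sketch of that idea; the function name and message wording are assumptions, not code from the PR:

```shell
#!/bin/sh
# Sketch: copy libhdfs.so unconditionally (build fails without it), but
# treat libhadoop.so as optional since some Hadoop tar files omit it and
# it is only needed at runtime.

copy_hadoop_libs() {
  src_dir=$1
  tgt_dir=$2
  # libhdfs.so is mandatory for the build; fail loudly if missing
  if [ ! -e "$src_dir/libhdfs.so" ]; then
    echo "Error: libhdfs.so not found in $src_dir" >&2
    return 1
  fi
  cp "$src_dir/libhdfs.so" "$tgt_dir/"
  # libhadoop.so is optional at build time; warn instead of failing
  if [ -e "$src_dir/libhadoop.so" ]; then
    cp "$src_dir/libhadoop.so" "$tgt_dir/"
  else
    echo "Warning: libhadoop.so not found; it will be needed at runtime" >&2
  fi
}

# Example: source dir with only libhdfs.so copies fine, with a warning
src=$(mktemp -d); tgt=$(mktemp -d)
touch "$src/libhdfs.so"
copy_hadoop_libs "$src" "$tgt"
```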



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by robertamarton <gi...@git.apache.org>.
Github user robertamarton commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41887592
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    +  if [ ! -d $LIBHDFS_TEMP_DIR ]; then
    +    mkdir $LIBHDFS_TEMP_DIR
    +  fi
    --- End diff --
    
    What happens if the directory could not be created?  For example, the user may need sudo access to create the temp directory, or may have specified an invalid location.
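
    A defensive version of that mkdir could look like the sketch below (the helper name and error message wording are assumptions): `mkdir -p` creates missing parents and is a no-op when the directory already exists, and the explicit checks give a clear message instead of a cascade of later failures.

```shell
#!/bin/sh
# Sketch: create the temp directory up front and stop with a clear
# message if creation fails or the directory is not writable.

ensure_dir() {
  dir=$1
  if ! mkdir -p "$dir" 2>/dev/null; then
    echo "Error: could not create $dir (check permissions and path)" >&2
    return 1
  fi
  if [ ! -w "$dir" ]; then
    echo "Error: $dir exists but is not writable" >&2
    return 1
  fi
  return 0
}

# Example: nested path is created in one call
ensure_dir "$(mktemp -d)/nested/workdir" && echo "directory ready"
```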



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41886935
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    +  if [ ! -d $LIBHDFS_TEMP_DIR ]; then
    +    mkdir $LIBHDFS_TEMP_DIR
    +  fi
    +
    +  cd $LIBHDFS_TEMP_DIR
    +
    +  if [[ ! -f ${PROTOBUF_TAR} ]]; then
    +    echo "Downloading Google Protocol Buffers..." | tee -a ${LOGFILE}
    +    wget ${PROTOBUF_MIRROR_URL}/${PROTOBUF_TAR} >${LOGFILE}
    +  fi
    +
    +  if [[ $FORCE_BUILD == true ]]; then
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_ID}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +  fi
    +
    +  if [[ ! -d ${PROTOBUF_ID} ]]; then
    +    echo "Unpacking Google Protocol Buffer tar file..." | tee -a ${LOGFILE}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    --- End diff --
    
    The "-rf" option will suppress errors if the files don't exist.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by robertamarton <gi...@git.apache.org>.
Github user robertamarton commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41885829
  
    --- Diff: core/sqf/sqenvcom.sh ---
    @@ -580,6 +547,13 @@ EOF
         echo "**** ERROR: Unable to determine location of HBase lib directory"
       fi
     
    +  if [[ -d $TOOLSDIR/thrift-0.9.0 ]]; then
    +    # this is mostly for a build environment, where we need
    +    # thrift from TOOLSDIR
    +    export THRIFT_LIB_DIR=$TOOLSDIR/thrift-0.9.0/lib
    +    export THRIFT_INC_DIR=$TOOLSDIR/thrift-0.9.0/include
    +  fi
    +
       if [ -n "$HBASE_CNF_DIR" -a -n "$HADOOP_CNF_DIR" -a \
    --- End diff --
    
    You removed Thrift from the required list in bldenvchk and set the variable only if Thrift is installed in the TOOLSDIR.  What happens if Thrift is not installed?
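
    If a guard is wanted, a build-time check could fail fast when neither TOOLSDIR nor the distribution provided Thrift. A minimal sketch (function name and message wording are assumptions, not code from sqenvcom.sh):

```shell
#!/bin/sh
# Sketch: verify that the Thrift variables were set by one of the
# earlier configuration branches before attempting to build against it.

check_thrift() {
  if [ -z "$THRIFT_LIB_DIR" ] || [ -z "$THRIFT_INC_DIR" ]; then
    echo "**** ERROR: Thrift not found (THRIFT_LIB_DIR/THRIFT_INC_DIR unset)" >&2
    return 1
  fi
  return 0
}

# Example: with both variables set, the check passes silently
THRIFT_LIB_DIR=/opt/thrift-0.9.0/lib
THRIFT_INC_DIR=/opt/thrift-0.9.0/include
check_thrift && echo "Thrift configured"
```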



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by DaveBirdsall <gi...@git.apache.org>.
Github user DaveBirdsall commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41884769
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    +  if [ ! -d $LIBHDFS_TEMP_DIR ]; then
    +    mkdir $LIBHDFS_TEMP_DIR
    +  fi
    +
    +  cd $LIBHDFS_TEMP_DIR
    +
    +  if [[ ! -f ${PROTOBUF_TAR} ]]; then
    +    echo "Downloading Google Protocol Buffers..." | tee -a ${LOGFILE}
    +    wget ${PROTOBUF_MIRROR_URL}/${PROTOBUF_TAR} >${LOGFILE}
    +  fi
    +
    +  if [[ $FORCE_BUILD == true ]]; then
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_ID}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +  fi
    +
    +  if [[ ! -d ${PROTOBUF_ID} ]]; then
    +    echo "Unpacking Google Protocol Buffer tar file..." | tee -a ${LOGFILE}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +    tar -xzf ${PROTOBUF_TAR} >>${LOGFILE}
    +  fi
    +
    +  if [[ ! -d $PROTOBUF_TGT_ID ]]; then
    +    cd ${PROTOBUF_ID}
    +    echo "Building Google Protocol Buffers, this could take a while..." | tee -a ${LOGFILE}
    +    if [[ $VERBOSE == true ]]; then
    +      ./configure --prefix=${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID} 2>&1 | tee -a ${LOGFILE}
    +    else
    +      ./configure --prefix=${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID} 2>&1 >>${LOGFILE}
    +    fi
    +    if [[ $? != 0 ]]; then
    +      echo "Error during configure step, exiting" | tee -a ${LOGFILE}
    +      exit 1
    +    fi
    +    make 2>&1 >>${LOGFILE}
    +    if [[ $? != 0 ]]; then
    +      echo "Error during make step, exiting" | tee -a ${LOGFILE}
    +      exit 1
    +    fi
    +    # skip the tests
    +    # make check 2>&1 >>${LOGFILE}
    +    if [[ $? != 0 ]]; then
    --- End diff --
    
    Probably should comment out this "if" block too since the "make check" is commented out. Though it is harmless (the "if" should always be false).



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by robertamarton <gi...@git.apache.org>.
Github user robertamarton commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41888302
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    +       exit 1
    +    ;;
    +  esac
    +
    +  shift
    +done
    +
    +
    +if [[ $FORCE_BUILD == true || \
    +      ! -e ${TGT_INC_DIR}/hdfs.h || \
    +      ! -e ${TGT_LIB_DIR}/libhdfs.so ]]; then
    +
    +  if [ ! -d $LIBHDFS_TEMP_DIR ]; then
    +    mkdir $LIBHDFS_TEMP_DIR
    +  fi
    +
    +  cd $LIBHDFS_TEMP_DIR
    +
    +  if [[ ! -f ${PROTOBUF_TAR} ]]; then
    +    echo "Downloading Google Protocol Buffers..." | tee -a ${LOGFILE}
    +    wget ${PROTOBUF_MIRROR_URL}/${PROTOBUF_TAR} >${LOGFILE}
    +  fi
    +
    +  if [[ $FORCE_BUILD == true ]]; then
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_ID}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +  fi
    +
    +  if [[ ! -d ${PROTOBUF_ID} ]]; then
    +    echo "Unpacking Google Protocol Buffer tar file..." | tee -a ${LOGFILE}
    +    rm -rf ${LIBHDFS_TEMP_DIR}/${PROTOBUF_TGT_ID}
    +    tar -xzf ${PROTOBUF_TAR} >>${LOGFILE}
    +  fi
    +
    +  if [[ ! -d $PROTOBUF_TGT_ID ]]; then
    +    cd ${PROTOBUF_ID}
    +    echo "Building Google Protocol Buffers, this could take a while..." | tee -a ${LOGFILE}
    +    if [[ $VERBOSE == true ]]; then
    --- End diff --
    
    Why can't we just download binaries for protocol buffers?



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by sandhyasun <gi...@git.apache.org>.
Github user sandhyasun commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41891342
  
    --- Diff: core/sqf/sqenvcom.sh ---
    @@ -580,6 +547,13 @@ EOF
         echo "**** ERROR: Unable to determine location of HBase lib directory"
       fi
     
    +  if [[ -d $TOOLSDIR/thrift-0.9.0 ]]; then
    +    # this is mostly for a build environment, where we need
    +    # thrift from TOOLSDIR
    +    export THRIFT_LIB_DIR=$TOOLSDIR/thrift-0.9.0/lib
    +    export THRIFT_INC_DIR=$TOOLSDIR/thrift-0.9.0/include
    +  fi
    +
       if [ -n "$HBASE_CNF_DIR" -a -n "$HADOOP_CNF_DIR" -a \
    --- End diff --
    
    Are CURL_INC_DIR and CURL_LIB_DIR removed? We need them for LOB support. The code is not yet active due to a few issues, hence no regression test for this case yet.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41886771
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    --- End diff --
    
    Thanks, will fix that.
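    For what it's worth, the fix being discussed could look roughly like this sketch. `parse_arg` is an illustrative stand-in, not the actual script; the `USAGE` text mirrors the one in the diff. The point is simply to print the full `$USAGE` string from the unknown-option branch instead of the abbreviated one-flag message:

```shell
#!/bin/sh

# Keep the full usage text in one place and print it from the
# unknown-option branch, instead of the abbreviated one-flag message.
USAGE="Usage $0 [ -f | --force ]\
                [ -v | --verbose ]\
                [ -d <temp dir> | --tempDir <temp dir> ]"

parse_arg() {
  case "$1" in
    -f|--force)   echo "force" ;;
    -v|--verbose) echo "verbose" ;;
    *)
      echo "Unknown command line option: $1" >&2
      echo "$USAGE" >&2
      return 1
      ;;
  esac
}

# An unknown option now shows every supported flag:
parse_arg --bogus 2>&1 || true
```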



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by DaveBirdsall <gi...@git.apache.org>.
Github user DaveBirdsall commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41883676
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    +
    +    *)
    +       echo "Unknown command line option: $arg"
    +       echo "Usage $0 [ -f | --force ]"
    --- End diff --
    
    Did you mean to echo $USAGE here?



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41895218
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    +# Directories to copy the built libhdfs library and corresponding include file
    +TGT_INC_DIR=$MY_SQROOT/export/include
    +TGT_LIB_DIR=$MY_SQROOT/export/lib${SQ_MBTYPE}
    +
    +FORCE_BUILD=false
    +VERBOSE=false
    +
    +USAGE="Usage $0 [ -f | --force ]\
    +                [ -v | --verbose ]\
    +                [ -d <temp dir> | --tempDir <temp dir> ]"
    +
    +while [[ $# > 0 ]]
    +do
    +  arg="$1"
    +
    +  case $arg in
    +    -f|--force)
    +       FORCE_BUILD=true
    +       ;;
    +
    +    -v|--verbose)
    +       VERBOSE=true
    +       ;;
    +
    +    -d|--tempDir)
    +       shift
    +       if [[ $# < 1 ]]; then
    +         echo "Expecting argument after -d or --tempDir"
    +         exit 1
    +       fi
    +       LIBHDFS_TEMP_DIR="$1"
    +       ;;
    --- End diff --
    
    Yes, the command line options should override environment variables, if specified (mixing the two is probably not a good idea). I will fix the log file issue and update the LOGFILE variable here. Thanks.
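    Concretely, the override-plus-deferred-LOGFILE idea might be sketched like this (illustrative only; `parse_temp_dir` is a hypothetical wrapper, while the default path and option names follow the diff above):

```shell
#!/bin/sh

# The environment variable supplies the default; -d/--tempDir overrides it.
# LOGFILE is derived only after all options are parsed, so the log always
# lands in the directory that was finally chosen.
parse_temp_dir() {
  LIBHDFS_TEMP_DIR="${LIBHDFS_TEMP_DIR:-$MY_SQROOT/sql/libhdfs_files}"
  while [ $# -gt 0 ]; do
    case "$1" in
      -d|--tempDir)
        shift
        if [ $# -lt 1 ]; then
          echo "Expecting argument after -d or --tempDir" >&2
          return 1
        fi
        LIBHDFS_TEMP_DIR="$1"
        ;;
    esac
    shift
  done
  LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
}

# The command-line flag wins over the environment variable:
LIBHDFS_TEMP_DIR=/from/env
parse_temp_dir --tempDir /from/flag
echo "$LOGFILE"
```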



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41892116
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    +# files to build required version of Google Protocol Buffers
    +PROTOBUF_MIRROR_URL=https://github.com/google/protobuf/releases/download/v2.5.0
    +PROTOBUF_ID=protobuf-2.5.0
    +PROTOBUF_TAR=${PROTOBUF_ID}.tar.gz
    +
    +# result of protobuf build
    +PROTOBUF_TGT_ID=protobuf-tgt
    +
    --- End diff --
    
    Yes, good idea, I'll add that.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41888824
  
    --- Diff: core/sqf/sql/scripts/get_libhdfs_files ---
    @@ -0,0 +1,186 @@
    +#!/bin/sh
    +
    +# This script downloads and/or makes the required libhdfs files
    +# to be able to build Trafodion, which acts as a libhdfs client.
    +#
    +# Basically, what we need are three files:
    +#
    +# hdfs.h       (copied to $TGT_INC_DIR)
    +# libhdfs.so   (copied to $TGT_LIB_DIR)
    +# libhadoop.so (copied to $TGT_LIB_DIR)
    +
    +# Working dir in the Trafodion source tree to extract and build libhdfs files
    +# (can be specified as an environment variable)
    +if [[ -z ${LIBHDFS_TEMP_DIR} ]]; then
    +  LIBHDFS_TEMP_DIR=${MY_SQROOT}/sql/libhdfs_files
    +fi
    +LOGFILE=${LIBHDFS_TEMP_DIR}/build.log
    +
    +# Hadoop source tar file to build libhdfs from
    +HADOOP_SRC_MIRROR_URL=https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0
    +HADOOP_ID=hadoop-2.6.0
    +HADOOP_SRC_ID=${HADOOP_ID}-src
    +HADOOP_SRC_TAR=${HADOOP_SRC_ID}.tar.gz
    +
    --- End diff --
    
    It might also have been because at some point Apache started to distribute the 64-bit native library in the binary distribution. That would not be a strong reason, of course, since we are building the library here anyway.



[GitHub] incubator-trafodion pull request: TRAFODION-1521 Build Trafodion w...

Posted by zellerh <gi...@git.apache.org>.
Github user zellerh commented on a diff in the pull request:

    https://github.com/apache/incubator-trafodion/pull/118#discussion_r41888073
  
    --- Diff: core/sqf/sqenvcom.sh ---
    @@ -580,6 +547,13 @@ EOF
         echo "**** ERROR: Unable to determine location of HBase lib directory"
       fi
     
    +  if [[ -d $TOOLSDIR/thrift-0.9.0 ]]; then
    +    # this is mostly for a build environment, where we need
    +    # thrift from TOOLSDIR
    +    export THRIFT_LIB_DIR=$TOOLSDIR/thrift-0.9.0/lib
    +    export THRIFT_INC_DIR=$TOOLSDIR/thrift-0.9.0/include
    +  fi
    +
       if [ -n "$HBASE_CNF_DIR" -a -n "$HADOOP_CNF_DIR" -a \
    --- End diff --
    
    Good point, I should not have removed it from bldenvchk (I did that when I tried, unsuccessfully, to eliminate Thrift altogether). I'll put the check back into bldenvchk, because without it the build would fail.
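    A bldenvchk-style check of this kind could be sketched as follows. The helper name and the sample value are hypothetical; bldenvchk's real checks may differ, and the error-message style just follows sqenvcom.sh's `**** ERROR:` convention:

```shell
#!/bin/sh

# Fail early, with a readable message, when a variable the build needs
# is unset or does not name an existing directory.
check_dir_var() {
  eval "_dir=\$$1"
  if [ -z "$_dir" ] || [ ! -d "$_dir" ]; then
    echo "**** ERROR: $1 is not set or not a directory: '$_dir'" >&2
    return 1
  fi
  echo "$1 OK: $_dir"
}

# e.g. the variables discussed in this thread:
CURL_INC_DIR=/usr/include     # stand-in value for illustration
check_dir_var CURL_INC_DIR
check_dir_var CURL_LIB_DIR || echo "build environment incomplete"
```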

