Posted to issues@ambari.apache.org by "Alejandro Fernandez (JIRA)" <ji...@apache.org> on 2017/05/02 02:17:04 UTC

[jira] [Created] (AMBARI-20910) HDP 3.0 TP - Unable to install Spark, cannot find package/scripts dir

Alejandro Fernandez created AMBARI-20910:
--------------------------------------------

             Summary: HDP 3.0 TP - Unable to install Spark, cannot find package/scripts dir
                 Key: AMBARI-20910
                 URL: https://issues.apache.org/jira/browse/AMBARI-20910
             Project: Ambari
          Issue Type: Bug
          Components: stacks
    Affects Versions: 3.0.0
            Reporter: Alejandro Fernandez
            Assignee: Alejandro Fernandez
             Fix For: trunk


STR:
* Install Ambari 3.0 (last build was 650)
* Install HDP 3.0 (last build was 197) with ZK, HDFS, and YARN. Note: installation will fail on ResourceManager start and Service Checks.
* Because Hive does not yet compile, temporarily comment out HIVE as a required service for Spark, and HIVE_METASTORE as a required co-hosted component, in
/var/lib/ambari-server/resources/common-services/SPARK/2.2.0/metainfo.xml
* Restart Ambari Server
* Attempt to add Spark as a service.
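A quick way to sanity-check the temporary metainfo.xml edit before restarting Ambari Server is to parse the file and confirm HIVE is gone from Spark's required services. This is a minimal sketch using a sample document; the tag layout mirrors the standard Ambari metainfo.xml schema, and in practice you would parse the path above instead of the inline string.

```python
import xml.etree.ElementTree as ET

# Sample standing in for SPARK/2.2.0/metainfo.xml after the edit; in this
# state the HIVE entry under <requiredServices> has been removed.
sample = """<metainfo>
  <services>
    <service>
      <name>SPARK</name>
      <requiredServices>
        <service>ZOOKEEPER</service>
      </requiredServices>
    </service>
  </services>
</metainfo>"""

root = ET.fromstring(sample)
# Collect every <service> child of <requiredServices>.
required = [s.text for rs in root.iter("requiredServices") for s in rs]
hive_still_required = "HIVE" in required
```

If `hive_still_required` is True after editing, the comment-out did not take and the add-service wizard will still demand Hive.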

Error:
{noformat}
Caught an exception while executing custom service command: <type 'exceptions.KeyError'>: 'service_package_folder'; 'service_package_folder'
{noformat}

The message is produced by this exception handler in the agent's CustomServiceOrchestrator.py:
{code}
    except Exception, e: # We do not want to let agent fail completely
      exc_type, exc_obj, exc_tb = sys.exc_info()
      message = "Caught an exception while executing "\
        "custom service command: {0}: {1}; {2}".format(exc_type, exc_obj, str(e))
      logger.exception(message)
{code}
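The doubled 'service_package_folder' in the log line is an artifact of this formatting: for a KeyError, both {{exc_obj}} and {{str(e)}} render as the missing key. A minimal sketch (the agent code above is Python 2; this uses Python 3 except-syntax):

```python
import sys

try:
    # Simulate the command structure missing the 'service_package_folder' key.
    {}["service_package_folder"]
except Exception as e:
    exc_type, exc_obj, exc_tb = sys.exc_info()
    # Same format string as the agent's handler: exc_obj and str(e) are the
    # same exception, so the missing key prints twice.
    message = ("Caught an exception while executing "
               "custom service command: {0}: {1}; {2}"
               .format(exc_type, exc_obj, str(e)))
```

So the repeated key is cosmetic; the real signal is simply that the lookup of 'service_package_folder' failed.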

It looks like the SPARK/2.2.0 common-services definition is missing its package/scripts directory, so the agent cannot resolve service_package_folder for the command.
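This can be confirmed on the Ambari Server host with a quick check; the base path is taken from the report above, so adjust it for your install:

```python
import os

# Path from the report; hypothetical for other installs.
base = "/var/lib/ambari-server/resources/common-services/SPARK/2.2.0"
scripts_dir = os.path.join(base, "package", "scripts")

# The agent expects this directory to exist for the service definition.
present = os.path.isdir(scripts_dir)
```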



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)