Posted to commits@hop.apache.org by ha...@apache.org on 2022/08/04 13:38:05 UTC

[hop] branch master updated (321131d941 -> 4c92ff311c)

This is an automated email from the ASF dual-hosted git repository.

hansva pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/hop.git


    from 321131d941 Merge pull request #1615 from gkfabs/gkfabs/master
     new 99a8a55c0e HOP-4090: Add Spark integration test and fix starting SparkJob from Hop
     new 70764ac519 [DOC] fix broken link
     new 4c92ff311c Merge pull request #1616 from hansva/master

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
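As an aside, the "new" revisions above are simply the commits reachable from the new tip but not from the old one; in a clone of hop.git you could list them with `git log --oneline 321131d941..4c92ff311c` (the hashes reported in this email). A minimal sketch in a hypothetical throwaway repository (all names and messages here are made up for illustration):

```shell
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q .
# Three empty commits stand in for the branch history.
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "base"
old=$(git rev-parse HEAD)   # plays the role of 321131d941
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "feature work"
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m "merge tip"
# Commits reachable from HEAD but not from $old -- the "new" revisions:
git log --oneline "$old"..HEAD
```

The two-dot range syntax is what produces exactly the set of revisions a push notification like this one describes.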


Summary of changes:
 assemblies/plugins/engines/beam/pom.xml            |  6 +--
 .../plugins/engines/beam/src/assembly/assembly.xml | 10 +++--
 .../plugins/tech/parquet/src/assembly/assembly.xml |  1 -
 core/pom.xml                                       |  8 ++--
 ...file.unit-tests => Dockerfile.unit-tests-spark} | 23 ++++++++--
 ...base.yaml => integration-tests-base-spark.yaml} |  2 +-
 ...sts-neo4j.yaml => integration-tests-spark.yaml} | 35 +++++++--------
 docker/integration-tests/spark/Dockerfile.master   | 49 +++++++++++++++++++++
 docker/integration-tests/spark/Dockerfile.worker   | 50 ++++++++++++++++++++++
 .../spark/scripts/execute-step.sh                  | 17 +++++---
 .../integration-tests/spark/scripts/finish-step.sh | 17 +++++---
 .../integration-tests/spark/scripts/master.sh      | 18 ++++++--
 .../spark/scripts/wait-for-step.sh                 | 17 +++++---
 .../integration-tests/spark/scripts/worker.sh      | 16 +++++--
 .../modules/ROOT/pages/pipeline/transforms.adoc    |  1 -
 .../0001-test-spark-cluster.hpl}                   |  0
 .../{flink => spark}/dev-env-config.json           |  0
 integration-tests/{flink => spark}/hop-config.json |  0
 .../main-0001-test-spark-cluster.hwf}              | 10 ++---
 .../metadata/pipeline-run-configuration/local.json |  0
 .../metadata/pipeline-run-configuration/spark.json | 26 +++++++++++
 .../metadata/workflow-run-configuration/local.json |  0
 .../project-config.json                            |  0
 23 files changed, 244 insertions(+), 62 deletions(-)
 copy docker/integration-tests/{Dockerfile.unit-tests => Dockerfile.unit-tests-spark} (55%)
 copy docker/integration-tests/{integration-tests-base.yaml => integration-tests-base-spark.yaml} (98%)
 copy docker/integration-tests/{integration-tests-neo4j.yaml => integration-tests-spark.yaml} (65%)
 create mode 100644 docker/integration-tests/spark/Dockerfile.master
 create mode 100644 docker/integration-tests/spark/Dockerfile.worker
 copy integration-tests/scripting/0002-shell-test.sh => docker/integration-tests/spark/scripts/execute-step.sh (63%)
 mode change 100755 => 100644
 copy integration-tests/scripting/0002-shell-test.sh => docker/integration-tests/spark/scripts/finish-step.sh (64%)
 mode change 100755 => 100644
 copy integration-tests/scripting/0002-shell-test.sh => docker/integration-tests/spark/scripts/master.sh (60%)
 mode change 100755 => 100644
 copy integration-tests/scripting/0002-shell-test.sh => docker/integration-tests/spark/scripts/wait-for-step.sh (65%)
 mode change 100755 => 100644
 copy integration-tests/scripting/0002-shell-test.sh => docker/integration-tests/spark/scripts/worker.sh (65%)
 mode change 100755 => 100644
 copy integration-tests/{flink/0001-test-flink-cluster.hpl => spark/0001-test-spark-cluster.hpl} (100%)
 copy integration-tests/{flink => spark}/dev-env-config.json (100%)
 copy integration-tests/{flink => spark}/hop-config.json (100%)
 copy integration-tests/{flink/main-0001-test-flink-cluster.hwf => spark/main-0001-test-spark-cluster.hwf} (91%)
 copy integration-tests/{actions => spark}/metadata/pipeline-run-configuration/local.json (100%)
 create mode 100644 integration-tests/spark/metadata/pipeline-run-configuration/spark.json
 copy integration-tests/{actions => spark}/metadata/workflow-run-configuration/local.json (100%)
 copy integration-tests/{beam_directrunner => spark}/project-config.json (100%)
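The stat and summary block above is the standard output of `git diff --stat --summary` between the old and new tips. A minimal sketch reproducing the "create mode" and "mode change" lines in a hypothetical throwaway repository (file names here are invented; `git update-index --chmod=+x` is used so the mode change is recorded regardless of the local filesystem):

```shell
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q .
echo hello > old.txt
git add old.txt
git -c user.email=a@b -c user.name=a commit -q -m "base"
echo world > new.txt               # new file -> "create mode" summary line
git add new.txt
git update-index --chmod=+x old.txt  # mode 100644 -> 100755 -> "mode change" line
git -c user.email=a@b -c user.name=a commit -q -m "changes"
# Same kind of report as the notification email:
git diff --stat --summary HEAD~1 HEAD
```

Adding `-C` (copy detection) to the diff would additionally produce the `copy old => new (NN%)` lines seen above for files that were duplicated with small edits.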