Posted to commits@hudi.apache.org by xu...@apache.org on 2020/08/24 16:28:43 UTC

[hudi] branch master updated: [MINOR] Update README.md (#2010)

This is an automated email from the ASF dual-hosted git repository.

xushiyan pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/hudi.git


The following commit(s) were added to refs/heads/master by this push:
     new 111a975  [MINOR] Update README.md (#2010)
111a975 is described below

commit 111a9753a0dc8a8634a5aaf549ceff5cbd89bcb8
Author: Raymond Xu <27...@users.noreply.github.com>
AuthorDate: Mon Aug 24 09:28:29 2020 -0700

    [MINOR] Update README.md (#2010)
    
    - add maven profile to test running commands
    - remove -DskipITs for packaging commands
---
 README.md | 17 +++++++++++------
 1 file changed, 11 insertions(+), 6 deletions(-)

diff --git a/README.md b/README.md
index 542ff2e..2416b3f 100644
--- a/README.md
+++ b/README.md
@@ -54,7 +54,7 @@ Prerequisites for building Apache Hudi:
 ```
 # Checkout code and build
 git clone https://github.com/apache/hudi.git && cd hudi
-mvn clean package -DskipTests -DskipITs
+mvn clean package -DskipTests
 
 # Start command
 spark-2.4.4-bin-hadoop2.7/bin/spark-shell \
@@ -73,7 +73,7 @@ mvn clean javadoc:aggregate -Pjavadocs
 The default Scala version supported is 2.11. To build for Scala 2.12 version, build using `scala-2.12` profile
 
 ```
-mvn clean package -DskipTests -DskipITs -Dscala-2.12
+mvn clean package -DskipTests -Dscala-2.12
 ```
 
 ### Build without spark-avro module
@@ -83,7 +83,7 @@ The default hudi-jar bundles spark-avro module. To build without spark-avro modu
 ```
 # Checkout code and build
 git clone https://github.com/apache/hudi.git && cd hudi
-mvn clean package -DskipTests -DskipITs -Pspark-shade-unbundle-avro
+mvn clean package -DskipTests -Pspark-shade-unbundle-avro
 
 # Start command
 spark-2.4.4-bin-hadoop2.7/bin/spark-shell \
@@ -94,14 +94,19 @@ spark-2.4.4-bin-hadoop2.7/bin/spark-shell \
 
 ## Running Tests
 
-All tests can be run with maven
+Unit tests can be run with maven profile `unit-tests`.
 ```
-mvn test
+mvn -Punit-tests test
+```
+
+Functional tests, which are tagged with `@Tag("functional")`, can be run with maven profile `functional-tests`.
+```
+mvn -Pfunctional-tests test
 ```
 
 To run tests with spark event logging enabled, define the Spark event log directory. This allows visualizing test DAG and stages using Spark History Server UI.
 ```
-mvn test -DSPARK_EVLOG_DIR=/path/for/spark/event/log
+mvn -Punit-tests test -DSPARK_EVLOG_DIR=/path/for/spark/event/log
 ```
 
 ## Quickstart