Posted to reviews@spark.apache.org by souravaswal <gi...@git.apache.org> on 2017/10/20 07:53:46 UTC

[GitHub] spark pull request #19542: Branch 1.1.

GitHub user souravaswal opened a pull request:

    https://github.com/apache/spark/pull/19542

    Branch 1.1.

    ## What changes were proposed in this pull request?
    
    (Please fill in changes proposed in this fix)
    
    ## How was this patch tested?
    
    (Please explain how this patch was tested. E.g. unit tests, integration tests, manual tests)
    (If this patch involves UI changes, please attach a screenshot; otherwise, remove this)
    
    Please review http://spark.apache.org/contributing.html before opening a pull request.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/apache/spark branch-1.1

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19542.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19542
    
----
commit c5d8d82272ed1790b155eac023a1dfd7248899f8
Author: Eric Liang <ek...@google.com>
Date:   2014-09-08T00:57:59Z

    [SPARK-3394] [SQL] Fix crash in TakeOrdered when limit is 0
    
    This resolves https://issues.apache.org/jira/browse/SPARK-3394
    
    Author: Eric Liang <ek...@google.com>
    
    Closes #2264 from ericl/spark-3394 and squashes the following commits:
    
    c87355b [Eric Liang] refactor
    bfb6140 [Eric Liang] change RDD takeOrdered instead
    7a51528 [Eric Liang] fix takeordered when limit = 0
    
    (cherry picked from commit 6754570d83044c4fbaf0d2ac2378a0e081a93629)
    Signed-off-by: Matei Zaharia <ma...@databricks.com>
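
As an illustrative sketch (not Spark's actual code; the helper below is ours), the fix boils down to guarding the limit-0 case before any per-partition aggregation happens:

```scala
// Minimal sketch of the takeOrdered guard: with num == 0 there are zero
// per-partition priority queues to reduce over, which is where the crash
// occurred, so return an empty result up front.
def takeOrdered[T](items: Iterator[T], num: Int)(implicit ord: Ordering[T]): Seq[T] =
  if (num <= 0) Seq.empty                    // limit 0: nothing to aggregate
  else items.toSeq.sorted(ord).take(num)
```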

commit 65dae63fa32d9b4ab6b35bd0caaa40f86f986290
Author: Michael Armbrust <mi...@databricks.com>
Date:   2014-09-08T01:34:46Z

    [SQL] Update SQL Programming Guide
    
    Author: Michael Armbrust <mi...@databricks.com>
    Author: Yin Huai <hu...@cse.ohio-state.edu>
    
    Closes #2258 from marmbrus/sqlDocUpdate and squashes the following commits:
    
    f3d450b [Michael Armbrust] fix brackets
    bea3bfa [Michael Armbrust] Davies suggestions
    3a29fe2 [Michael Armbrust] tighten visibility
    a71aa36 [Michael Armbrust] Draft of doc updates
    52932c0 [Michael Armbrust] Merge remote-tracking branch 'origin/master' into sqlDocUpdate
    1e8c849 [Yin Huai] Update the example used for applySchema.
    9457c39 [Yin Huai] Update doc.
    31ba240 [Yin Huai] Merge remote-tracking branch 'upstream/master' into dataTypeDoc
    29bc668 [Yin Huai] Draft doc for data type and schema APIs.
    
    (cherry picked from commit 39db1bfdab434c867044ad4c70fe93a96fb287ad)
    Signed-off-by: Michael Armbrust <mi...@databricks.com>

commit d555c2ee6879499d666e5905397172d39c46c1a8
Author: Reynold Xin <rx...@apache.org>
Date:   2014-09-08T01:42:24Z

    [SPARK-3408] Fixed Limit operator so it works with sort-based shuffle.
    
    Author: Reynold Xin <rx...@apache.org>
    
    Closes #2281 from rxin/sql-limit-sort and squashes the following commits:
    
    1ef7780 [Reynold Xin] [SPARK-3408] Fixed Limit operator so it works with sort-based shuffle.
    
    (cherry picked from commit e2614038e78f4693fafedeee15b6fdf0ea1be473)
    Signed-off-by: Reynold Xin <rx...@apache.org>

commit e45bfa80124b5a6ee02160f1e9c7e3234cddd231
Author: Cheng Lian <li...@gmail.com>
Date:   2014-09-08T03:38:32Z

    Fixed typos in make-distribution.sh
    
    `hadoop.version` and `yarn.version` are properties rather than profiles, so they should be set with `-D` instead of `-P`.
    
    /cc pwendell
    
    Author: Cheng Lian <li...@gmail.com>
    
    Closes #2121 from liancheng/fix-make-dist and squashes the following commits:
    
    4c49158 [Cheng Lian] Also mentions Hadoop version related Maven profiles
    ed5b42a [Cheng Lian] Fixed typos in make-distribution.sh
    (cherry picked from commit 9d69a782bd2fc45193f269d8d8434795ea1580a4)
    
    Signed-off-by: Patrick Wendell <pw...@gmail.com>

commit 8c6306a036d478d7160b85bc7ca2082fdaee0523
Author: Reynold Xin <rx...@apache.org>
Date:   2014-09-08T03:56:04Z

    [SPARK-938][doc] Add OpenStack Swift support
    
    See compiled doc at
    http://people.apache.org/~rxin/tmp/openstack-swift/_site/storage-openstack-swift.html
    
    This is based on #1010. Closes #1010.
    
    Author: Reynold Xin <rx...@apache.org>
    Author: Gil Vernik <gi...@il.ibm.com>
    
    Closes #2298 from rxin/openstack-swift and squashes the following commits:
    
    ff4e394 [Reynold Xin] Two minor comments from Patrick.
    279f6de [Reynold Xin] core-sites -> core-site
    dfb8fea [Reynold Xin] Updated based on Gil's suggestion.
    846f5cb [Reynold Xin] Added a link from overview page.
    0447c9f [Reynold Xin] Removed sample code.
    e9c3761 [Reynold Xin] Merge pull request #1010 from gilv/master
    9233fef [Gil Vernik] Fixed typos
    6994827 [Gil Vernik] Merge pull request #1 from rxin/openstack
    ac0679e [Reynold Xin] Fixed an unclosed tr.
    47ce99d [Reynold Xin] Merge branch 'master' into openstack
    cca7192 [Gil Vernik] Removed white spases from pom.xml
    99f095d [Reynold Xin] Pending openstack changes.
    eb22295 [Reynold Xin] Merge pull request #1010 from gilv/master
    39a9737 [Gil Vernik] Spark integration with Openstack Swift
    c977658 [Gil Vernik] Merge branch 'master' of https://github.com/gilv/spark
    2aba763 [Gil Vernik] Fix to docs/openstack-integration.md
    9b625b5 [Gil Vernik] Merge branch 'master' of https://github.com/gilv/spark
    eff538d [Gil Vernik] SPARK-938 - Openstack Swift object storage support
    ce483d7 [Gil Vernik] SPARK-938 - Openstack Swift object storage support
    b6c37ef [Gil Vernik] Openstack Swift support
    (cherry picked from commit eddfeddac19870fc265ef406d87e1c3db9b54249)
    
    Signed-off-by: Patrick Wendell <pw...@gmail.com>

commit 7a236dcf8e4721472cea6f1ae7b652618c118f43
Author: Henry Cook <hc...@eecs.berkeley.edu>
Date:   2014-09-08T21:56:37Z

    [SQL] Minor edits to sql programming guide.
    
    Author: Henry Cook <hc...@eecs.berkeley.edu>
    
    Closes #2316 from hcook/sql-docs and squashes the following commits:
    
    373f94b [Henry Cook] Minor edits to sql programming guide.
    
    (cherry picked from commit 26bc7655de18ab0191ded3f75cb77bc756dc1c03)
    Signed-off-by: Michael Armbrust <mi...@databricks.com>

commit e884805ce8b42b60534b616cb82f7bb6b8d7f907
Author: Mark Hamstra <ma...@gmail.com>
Date:   2014-09-09T03:51:56Z

    SPARK-2425 Don't kill a still-running Application because of some misbehaving Executors
    
    Introduces a LOADING -> RUNNING ApplicationState transition and prevents Master from removing an Application with RUNNING Executors.
    
    Two basic changes: 1) Instead of allowing MAX_NUM_RETRY abnormal Executor exits over the entire lifetime of the Application, allow that many since any Executor successfully began running the Application; 2) Don't remove the Application while Master still thinks that there are RUNNING Executors.
    
    This should be fine as long as the ApplicationInfo doesn't believe any Executors are forever RUNNING when they are not.  I think that any non-RUNNING Executors will eventually no longer be RUNNING in Master's accounting, but another set of eyes should confirm that.  This PR also doesn't try to detect which nodes have gone rogue or to kill off bad Workers, so repeatedly failing Executors will continue to fail and fill up log files with failure reports as long as the Application keeps running.
    
    Author: Mark Hamstra <ma...@gmail.com>
    
    Closes #1360 from markhamstra/SPARK-2425 and squashes the following commits:
    
    f099c0b [Mark Hamstra] Reuse appInfo
    b2b7b25 [Mark Hamstra] Moved 'Application failed' logging
    bdd0928 [Mark Hamstra] switched to string interpolation
    1dd591b [Mark Hamstra] SPARK-2425 introduce LOADING -> RUNNING ApplicationState transition and prevent Master from removing Application with RUNNING Executors
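
The two changes above can be sketched as follows (class and method names are ours, not Spark's): the retry counter resets whenever an executor successfully reaches RUNNING, and the application is only removed while nothing is running.

```scala
// Hedged sketch of the SPARK-2425 policy: count abnormal exits since the
// last successful LOADING -> RUNNING transition, not over the app lifetime,
// and never remove the app while executors are still RUNNING.
class AppRetryTracker(maxRetries: Int) {
  private var failuresSinceLastSuccess = 0
  private var runningExecutors = 0

  def executorStarted(): Unit = {      // LOADING -> RUNNING
    runningExecutors += 1
    failuresSinceLastSuccess = 0       // a success resets the retry window
  }

  def executorFailed(wasRunning: Boolean): Unit = {
    if (wasRunning) runningExecutors -= 1
    failuresSinceLastSuccess += 1
  }

  // Remove the application only when retries are exhausted AND no executor
  // is still believed to be RUNNING.
  def shouldRemoveApp: Boolean =
    failuresSinceLastSuccess >= maxRetries && runningExecutors == 0
}
```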

commit 24262684e4b2a4615466395d2beac9c24987c84d
Author: scwf <wa...@huawei.com>
Date:   2014-09-09T18:57:01Z

    [SPARK-3193] Output error info when Process exit code is not zero in test suite
    
    https://issues.apache.org/jira/browse/SPARK-3193
    I noticed that PR tests sometimes failed because the Process exit code was != 0; refer to
    https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18688/consoleFull
    https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19118/consoleFull
    
    [info] SparkSubmitSuite:
    [info] - prints usage on empty input
    [info] - prints usage with only --help
    [info] - prints error with unrecognized options
    [info] - handle binary specified but not class
    [info] - handles arguments with --key=val
    [info] - handles arguments to user program
    [info] - handles arguments to user program with name collision
    [info] - handles YARN cluster mode
    [info] - handles YARN client mode
    [info] - handles standalone cluster mode
    [info] - handles standalone client mode
    [info] - handles mesos client mode
    [info] - handles confs with flag equivalents
    [info] - launch simple application with spark-submit *** FAILED ***
    [info]   org.apache.spark.SparkException: Process List(./bin/spark-submit, --class, org.apache.spark.deploy.SimpleApplicationTest, --name, testApp, --master, local, file:/tmp/1408854098404-0/testJar-1408854098404.jar) exited with code 1
    [info]   at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:872)
    [info]   at org.apache.spark.deploy.SparkSubmitSuite.runSparkSubmit(SparkSubmitSuite.scala:311)
    [info]   at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$14.apply$mcV$sp(SparkSubmitSuite.scala:291)
    [info]   at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$14.apply(SparkSubmitSuite.scala:284)
    [info]   at org.apac

    Spark assembly has been built with Hive, including Datanucleus jars on classpath
    
    This PR outputs the process error info on failure, which can be helpful for diagnosis.
    
    Author: scwf <wa...@huawei.com>
    
    Closes #2108 from scwf/output-test-error-info and squashes the following commits:
    
    0c48082 [scwf] minor fix according to comments
    563fde1 [scwf] output errer info when Process exitcode not zero
    
    (cherry picked from commit 26862337c97ce14794178d6378fb4155dd24acb9)
    Signed-off-by: Andrew Or <an...@gmail.com>
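
The idea behind the change can be sketched like this (a simplified stand-in, not the real `Utils.executeAndGetOutput`): capture stderr alongside stdout and include it in the exception when the process exits nonzero, instead of reporting only the exit code.

```scala
import scala.sys.process._

// Hypothetical sketch: run a command, collect stdout and stderr separately,
// and surface the stderr text when the exit code is nonzero.
def executeAndGetOutput(cmd: Seq[String]): String = {
  val out = new StringBuilder
  val err = new StringBuilder
  val logger = ProcessLogger(
    line => out.append(line).append('\n'),   // stdout
    line => err.append(line).append('\n'))   // stderr
  val exitCode = cmd.!(logger)
  if (exitCode != 0)
    throw new RuntimeException(s"Process $cmd exited with code $exitCode: $err")
  out.toString
}
```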

commit e5f77ae97bf78a2b57ef2e0e2ff8c9ed4bb8c50d
Author: Liang-Chi Hsieh <vi...@gmail.com>
Date:   2014-09-04T00:04:53Z

    [SPARK-3345] Do correct parameters for ShuffleFileGroup
    
    In the method `newFileGroup` of class `FileShuffleBlockManager`, the parameters used to create the new `ShuffleFileGroup` object are in the wrong order.

    In the current code the parameters `shuffleId` and `fileId` are not used, so this causes no problem now. However, it should be corrected for readability and to avoid future problems.
    
    Author: Liang-Chi Hsieh <vi...@gmail.com>
    
    Closes #2235 from viirya/correct_shufflefilegroup_params and squashes the following commits:
    
    fe72567 [Liang-Chi Hsieh] Do correct parameters for ShuffleFileGroup.
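
The hazard here is easy to reproduce in a purely illustrative snippet (the real signature lives in `FileShuffleBlockManager`; the class below is a stand-in): when several parameters share a type, a swapped positional call still compiles silently, and named arguments make the call site self-checking.

```scala
// Illustrative stand-in for the real ShuffleFileGroup constructor.
case class ShuffleFileGroup(shuffleId: Int, fileId: Int, files: Seq[String])

val fileId = 7
val shuffleId = 42
// Positionally swapped, this would still compile: ShuffleFileGroup(fileId, shuffleId, Nil)
// Named arguments make the order mistake impossible:
val group = ShuffleFileGroup(shuffleId = shuffleId, fileId = fileId, files = Nil)
```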

commit 23fd3e8b95845b956b3c90df660bc3cf0ed42d28
Author: Josh Rosen <jo...@apache.org>
Date:   2014-09-02T17:45:14Z

    [SPARK-3061] Fix Maven build under Windows
    
    The Maven build was failing on Windows because it tried to call the unix `unzip` utility to extract the Py4J files into core's build directory.  I've fixed this issue by using the `maven-antrun-plugin` to perform the unzipping.
    
    I also fixed an issue that prevented tests from running under Windows:
    
    In the Maven ScalaTest plugin, the filename listed in <filereports> is placed under the <reportsDirectory>; the current code places it in a subdirectory of reportsDirectory, e.g.
    
    ```
    ${project.build.directory}/surefire-reports/${project.build.directory}/SparkTestSuite.txt
    ```
    
    This caused problems under Windows because it would try to create a subdirectory named "c:\\".
    
    Note that the tests still fail under Windows (for other reasons); this PR just allows them to run and fail rather than crash when trying to create the test reports directory.
    
    Author: Josh Rosen <jo...@apache.org>
    Author: Josh Rosen <ro...@gmail.com>
    Author: Josh Rosen <jo...@databricks.com>
    
    Closes #2165 from JoshRosen/windows-support and squashes the following commits:
    
    651d210 [Josh Rosen] Unzip to python/build instead of core/build
    fbf3e61 [Josh Rosen] 4 spaces -> 2 spaces
    e347668 [Josh Rosen] Fix Maven scalatest filereports path:
    4994af1 [Josh Rosen] [SPARK-3061] Use maven-antrun-plugin to unzip Py4J.

commit 359cd59d1517cbe32a6d6a27a1bf604b53eea08b
Author: Andrew Or <an...@gmail.com>
Date:   2014-09-02T17:47:05Z

    [SPARK-1919] Fix Windows spark-shell --jars
    
    We were trying to add `file:/C:/path/to/my.jar` to the class path. We should add `C:/path/to/my.jar` instead. Tested on Windows 8.1.
    
    Author: Andrew Or <an...@gmail.com>
    
    Closes #2211 from andrewor14/windows-shell-jars and squashes the following commits:
    
    262c6a2 [Andrew Or] Oops... Add the new code to the correct place
    0d5a0c1 [Andrew Or] Format jar path only for adding to shell classpath
    42bd626 [Andrew Or] Remove unnecessary code
    0049f1b [Andrew Or] Remove embarrassing log messages
    b1755a0 [Andrew Or] Format jar paths properly before adding them to the classpath
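
The gist of the fix can be sketched as follows (the helper name is ours, not Spark's): strip the `file:` URI scheme before putting a jar on the shell classpath, so `C:/path/to/my.jar` is used rather than `file:/C:/path/to/my.jar`.

```scala
import java.net.URI

// Hedged sketch: convert a file: URI back to a plain local path before
// adding it to the classpath; leave non-URI paths untouched.
def toClasspathEntry(jar: String): String = {
  val uri = new URI(jar)
  if (uri.getScheme == "file") new java.io.File(uri).getPath else jar
}
```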

commit e51ce9a5539a395b7ceff6dcdc77bf7f033e51d8
Author: Patrick Wendell <pw...@gmail.com>
Date:   2014-09-11T05:14:55Z

    HOTFIX: Changing color on doc menu

commit 06fb2d057beb50e9b690bf8b6d5bb7bdb16d8546
Author: Chris Cope <cc...@resilientscience.com>
Date:   2014-09-11T13:13:07Z

    [SPARK-2140] Updating heap memory calculation for YARN stable and alpha.
    
    Updated pull request, reflecting YARN stable and alpha states. I am getting intermittent test failures on my own test infrastructure. Is that tracked anywhere yet?
    
    Author: Chris Cope <cc...@resilientscience.com>
    
    Closes #2253 from copester/master and squashes the following commits:
    
    5ad89da [Chris Cope] [SPARK-2140] Removing calculateAMMemory functions since they are no longer needed.
    52b4e45 [Chris Cope] [SPARK-2140] Updating heap memory calculation for YARN stable and alpha.
    
    (cherry picked from commit ed1980ffa9ccb87d76694ba910ef22df034bca49)
    Signed-off-by: Thomas Graves <tg...@apache.org>

commit 2ffc7980c6818eec05e32141c52e335bc71daed9
Author: Andrew Or <an...@gmail.com>
Date:   2014-09-12T00:18:46Z

    [Spark-3490] Disable SparkUI for tests
    
    We currently open many ephemeral ports during the tests, and as a result we occasionally can't bind to new ones. This has caused the `DriverSuite` and the `SparkSubmitSuite` to fail intermittently.
    
    By disabling the `SparkUI` when it's not needed, we already cut down on the number of ports opened significantly, on the order of the number of `SparkContexts` ever created. We must keep it enabled for a few tests for the UI itself, however.
    
    Author: Andrew Or <an...@gmail.com>
    
    Closes #2363 from andrewor14/disable-ui-for-tests and squashes the following commits:
    
    332a7d5 [Andrew Or] No need to set spark.ui.port to 0 anymore
    30c93a2 [Andrew Or] Simplify streaming UISuite
    a431b84 [Andrew Or] Fix streaming test failures
    8f5ae53 [Andrew Or] Fix no new line at the end
    29c9b5b [Andrew Or] Disable SparkUI for tests
    
    (cherry picked from commit 6324eb7b5b0ae005cb2e913e36b1508bd6f1b9b8)
    Signed-off-by: Andrew Or <an...@gmail.com>
    
    Conflicts:
    	pom.xml
    	yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
    	yarn/common/src/main/scala/org/apache/spark/scheduler/cluster/YarnClientSchedulerBackend.scala

commit 4245404e860efbf09c720c6e1afcc115f2537f1b
Author: Andrew Ash <an...@andrewash.com>
Date:   2014-09-12T00:28:36Z

    [SPARK-3429] Don't include the empty string "" as a defaultAclUser
    
    Changes logging from
    
    ```
    14/09/05 02:01:08 INFO SecurityManager: Changing view acls to: aash,
    14/09/05 02:01:08 INFO SecurityManager: Changing modify acls to: aash,
    14/09/05 02:01:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(aash, ); users with modify permissions: Set(aash, )
    ```
    to
    ```
    14/09/05 02:28:28 INFO SecurityManager: Changing view acls to: aash
    14/09/05 02:28:28 INFO SecurityManager: Changing modify acls to: aash
    14/09/05 02:28:28 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(aash); users with modify permissions: Set(aash)
    ```
    
    Note that the first set of logs have a Set of size 2 containing "aash" and the empty string ""
    
    cc tgravescs
    
    Author: Andrew Ash <an...@andrewash.com>
    
    Closes #2286 from ash211/empty-default-acl and squashes the following commits:
    
    18cc612 [Andrew Ash] Use .isEmpty instead of ==""
    cf973a1 [Andrew Ash] Don't include the empty string "" as a defaultAclUser
    
    (cherry picked from commit ce59725b8703d18988e495dbaaf86ddde4bdfc5a)
    Signed-off-by: Andrew Or <an...@gmail.com>
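
A minimal sketch of the fix (function name is ours): drop empty entries when assembling the ACL user set, so a blank default user never lands in the Set alongside real users.

```scala
// Build the ACL set from default users plus a comma-separated acl string,
// filtering out empty entries such as "" from an unset default user.
def buildAcls(defaultUsers: Seq[String], aclString: String): Set[String] =
  (defaultUsers ++ aclString.split(',')).map(_.trim).filter(_.nonEmpty).toSet
```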

commit e69deb81842639ee089b518e994080e27a343297
Author: Davies Liu <da...@gmail.com>
Date:   2014-09-12T01:53:26Z

    [SPARK-3465] fix task metrics aggregation in local mode
    
    Before overwrite t.taskMetrics, take a deepcopy of it.
    
    Author: Davies Liu <da...@gmail.com>
    
    Closes #2338 from davies/fix_metric and squashes the following commits:
    
    a5cdb63 [Davies Liu] Merge branch 'master' into fix_metric
    7c879e0 [Davies Liu] add more comments
    754b5b8 [Davies Liu] copy taskMetrics only when isLocal is true
    5ca26dc [Davies Liu] fix task metrics aggregation in local mode
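
The local-mode hazard can be shown with a toy stand-in for TaskMetrics (names are ours, not Spark's): in local mode the driver and "executor" share the same mutable object, so a copy must be taken before the overwrite.

```scala
// Toy metrics object; in local mode driver and executor share one instance.
case class TaskMetrics(var executorRunTime: Long)

// Copy the shared metrics only when running locally, so later mutation of
// the shared instance does not corrupt the recorded snapshot.
def recordMetrics(shared: TaskMetrics, isLocal: Boolean): TaskMetrics =
  if (isLocal) shared.copy() else shared
```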

commit f17b7957a4283952021d9e4106c5bd9994148128
Author: Andrew Or <an...@gmail.com>
Date:   2014-09-12T17:40:03Z

    Revert "[Spark-3490] Disable SparkUI for tests"
    
    This reverts commit 2ffc7980c6818eec05e32141c52e335bc71daed9.

commit 6cbf83c05c7a073d4df81b59a1663fea38ce65f6
Author: Cheng Hao <ha...@intel.com>
Date:   2014-09-12T18:29:30Z

    [SPARK-3481] [SQL] Eliminate the error log in local Hive comparison test
    
    Logically, we should remove the Hive Table/Database first and then reset the Hive configuration, repointing to the new data warehouse directory, etc.
    Otherwise it raises exceptions like "Database doesn't not exists: default" during local testing.
    
    Author: Cheng Hao <ha...@intel.com>
    
    Closes #2352 from chenghao-intel/test_hive and squashes the following commits:
    
    74fd76b [Cheng Hao] eliminate the error log
    
    (cherry picked from commit 8194fc662c08eb445444c207264e22361def54ea)
    Signed-off-by: Michael Armbrust <mi...@databricks.com>

commit 9c06c723018d4ef96ff31eb947226a6273ed8080
Author: Davies Liu <da...@gmail.com>
Date:   2014-09-13T02:05:39Z

    [SPARK-3500] [SQL] use JavaSchemaRDD as SchemaRDD._jschema_rdd
    
    Currently, SchemaRDD._jschema_rdd is a SchemaRDD, so the Scala API (coalesce(), repartition()) cannot easily be called from Python; there is no way to specify the implicit parameter `ord`. The _jrdd is a JavaRDD, so _jschema_rdd should also be a JavaSchemaRDD.

    In this patch, _jschema_rdd is changed to a JavaSchemaRDD, and an assert is added for it. If some methods are missing from JavaSchemaRDD, they are called via _jschema_rdd.baseSchemaRDD().xxx().
    
    BTW, Do we need JavaSQLContext?
    
    Author: Davies Liu <da...@gmail.com>
    
    Closes #2369 from davies/fix_schemardd and squashes the following commits:
    
    abee159 [Davies Liu] use JavaSchemaRDD as SchemaRDD._jschema_rdd
    
    (cherry picked from commit 885d1621bc06bc1f009c9707c3452eac26baf828)
    Signed-off-by: Josh Rosen <jo...@apache.org>
    
    Conflicts:
    	python/pyspark/tests.py

commit 44e534eb286381030ae068ca89573ff84fb2a579
Author: Cheng Lian <li...@gmail.com>
Date:   2014-09-13T03:14:09Z

    [SPARK-3515][SQL] Moves test suite setup code to beforeAll rather than in constructor
    
    Please refer to the JIRA ticket for details.
    
    **NOTE** We should check all test suites that do similar initialization-like side effects in their constructors. This PR only fixes `ParquetMetastoreSuite` because it breaks our Jenkins Maven build.
    
    Author: Cheng Lian <li...@gmail.com>
    
    Closes #2375 from liancheng/say-no-to-constructor and squashes the following commits:
    
    0ceb75b [Cheng Lian] Moves test suite setup code to beforeAll rather than in constructor
    
    (cherry picked from commit 6d887db7891be643f0131b136e82191b5f6eb407)
    Signed-off-by: Michael Armbrust <mi...@databricks.com>

commit 70f93d5a931331c182d6b8e225246ef70b8d434f
Author: Nicholas Chammas <ni...@gmail.com>
Date:   2014-09-13T19:34:20Z

    [SQL] [Docs] typo fixes
    
    * Fixed random typo
    * Added in missing description for DecimalType
    
    Author: Nicholas Chammas <ni...@gmail.com>
    
    Closes #2367 from nchammas/patch-1 and squashes the following commits:
    
    aa528be [Nicholas Chammas] doc fix for SQL DecimalType
    3247ac1 [Nicholas Chammas] [SQL] [Docs] typo fixes
    
    (cherry picked from commit a523ceaf159733dabcef84c7adc1463546679f65)
    Signed-off-by: Michael Armbrust <mi...@databricks.com>

commit 78887f94a0ae9cdcfb851910ab9c7d51a1ef2acb
Author: Bertrand Bossy <be...@gmail.com>
Date:   2014-09-15T04:10:17Z

    SPARK-3039: Allow spark to be built using avro-mapred for hadoop2
    
    SPARK-3039: Adds the maven property "avro.mapred.classifier" to build spark-assembly with avro-mapred with support for the new Hadoop API. Sets this property to hadoop2 for Hadoop 2 profiles.
    
    I am not very familiar with maven, nor do I know whether this potentially breaks something in the hive part of spark. There might be a more elegant way of doing this.
    
    Author: Bertrand Bossy <be...@gmail.com>
    
    Closes #1945 from bbossy/SPARK-3039 and squashes the following commits:
    
    c32ce59 [Bertrand Bossy] SPARK-3039: Allow spark to be built using avro-mapred for hadoop2
    
    (cherry picked from commit c243b21a8ba2610266702e00d7d4b5443cb1f687)
    Signed-off-by: Patrick Wendell <pw...@gmail.com>

commit 99a6c5e5edb861bf5a39217d5aa8e69dd04918b5
Author: Kousuke Saruta <sa...@oss.nttdata.co.jp>
Date:   2014-09-15T23:11:41Z

    [SPARK-3518] Remove wasted statement in JsonProtocol
    
    Author: Kousuke Saruta <sa...@oss.nttdata.co.jp>
    
    Closes #2380 from sarutak/SPARK-3518 and squashes the following commits:
    
    8a1464e [Kousuke Saruta] Replaced a variable with simple field reference
    c660fbc [Kousuke Saruta] Removed useless statement in JsonProtocol.scala
    
    (cherry picked from commit e59fac1f97c3fbeeb6defd12625a49763a353156)
    Signed-off-by: Patrick Wendell <pw...@gmail.com>

commit 75158a7eb4a7e8ae59a3c0c99f102271078856dc
Author: Michael Armbrust <mi...@databricks.com>
Date:   2014-09-16T18:51:46Z

    [SQL][DOCS] Improve section on thrift-server
    
    Taken from liancheng's updates. Merged conflicts with #2316.
    
    Author: Michael Armbrust <mi...@databricks.com>
    
    Closes #2384 from marmbrus/sqlDocUpdate and squashes the following commits:
    
    2db6319 [Michael Armbrust] @liancheng's updates
    
    (cherry picked from commit 84073eb1172dc959936149265378f6e24d303685)
    Signed-off-by: Michael Armbrust <mi...@databricks.com>

commit 856156b40640cbdd7e88ff3165f4884cf0374043
Author: Andrew Or <an...@gmail.com>
Date:   2014-09-16T23:03:20Z

    [SPARK-3555] Fix UISuite race condition
    
    The test "jetty selects different port under contention" is flaky.
    
    If another process binds to 4040 before the test starts, then the first server we start there will fail, and the subsequent servers we start thereafter may successfully bind to 4040 if it was released between the servers starting. Instead, we should just let Java find a random free port for us and hold onto it for the duration of the test.
    
    Author: Andrew Or <an...@gmail.com>
    
    Closes #2418 from andrewor14/fix-port-contention and squashes the following commits:
    
    0cd4974 [Andrew Or] Stop them servers
    a7071fe [Andrew Or] Pick random port instead of 4040
    
    (cherry picked from commit 0a7091e689a4c8b1e7b61e9f0873e6557f40d952)
    Signed-off-by: Andrew Or <an...@gmail.com>
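
The "let Java find a random free port" approach amounts to binding to port 0 (a small sketch, not the UISuite code itself):

```scala
import java.net.ServerSocket

// Bind to port 0 so the OS picks a free ephemeral port, instead of racing
// other processes for a fixed port like 4040; hold the socket for the
// duration of the body so the port stays reserved.
def withFreePort[T](body: Int => T): T = {
  val socket = new ServerSocket(0)   // port 0 = "any free port"
  try body(socket.getLocalPort)
  finally socket.close()
}
```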

commit 937de93e80e6d299c4d08be426da2d5bc2d66f98
Author: Andrew Or <an...@gmail.com>
Date:   2014-09-17T01:23:28Z

    [SPARK-3490] Disable SparkUI for tests (backport into 1.1)
    
    Original PR: #2363
    
    Author: Andrew Or <an...@gmail.com>
    
    Closes #2415 from andrewor14/disable-ui-for-tests-1.1 and squashes the following commits:
    
    8d9df5a [Andrew Or] Oops, missed one.
    509507d [Andrew Or] Backport #2363 (SPARK-3490) into branch-1.1

commit 85e7c52bac6ae2f58a9340fde1dc94506666049d
Author: Michael Armbrust <mi...@databricks.com>
Date:   2014-09-17T19:41:49Z

    [SQL][DOCS] Improve table caching section
    
    Author: Michael Armbrust <mi...@databricks.com>
    
    Closes #2434 from marmbrus/patch-1 and squashes the following commits:
    
    67215be [Michael Armbrust] [SQL][DOCS] Improve table caching section
    
    (cherry picked from commit cbf983bb4a550ff26756ed7308fb03db42cffcff)
    Signed-off-by: Michael Armbrust <mi...@databricks.com>

commit 0690410e4f5b34ad41146cfe2a36457394d394d8
Author: Andrew Ash <an...@andrewash.com>
Date:   2014-09-17T22:07:57Z

    Docs: move HA subsections to a deeper indentation level
    
    Makes the table of contents read better
    
    Author: Andrew Ash <an...@andrewash.com>
    
    Closes #2402 from ash211/docs/better-indentation and squashes the following commits:
    
    ea0e130 [Andrew Ash] Move HA subsections to a deeper indentation level
    
    (cherry picked from commit b3830b28f8a70224d87c89d8491c514c4c191d23)
    Signed-off-by: Andrew Or <an...@gmail.com>

commit 3f1f9744b176424e00d262256eba9bc721cef18b
Author: Kousuke Saruta <sa...@oss.nttdata.co.jp>
Date:   2014-09-17T23:31:58Z

    [SPARK-3564][WebUI] Display App ID on HistoryPage
    
    Author: Kousuke Saruta <sa...@oss.nttdata.co.jp>
    
    Closes #2424 from sarutak/display-appid-on-webui and squashes the following commits:
    
    417fe90 [Kousuke Saruta] Added "App ID column" to HistoryPage
    
    (cherry picked from commit 6688a266f2cb84c2d43b8e4d27f710718c4cc4a0)
    Signed-off-by: Andrew Or <an...@gmail.com>

commit 32f2222e915f31422089139944a077e2cbd442f9
Author: WangTaoTheTonic <ba...@aliyun.com>
Date:   2014-09-18T04:59:23Z

    [SPARK-3565]Fix configuration item not consistent with document
    
    https://issues.apache.org/jira/browse/SPARK-3565
    
    "spark.ports.maxRetries" should be "spark.port.maxRetries". Make the configuration keys in document and code consistent.
    
    Author: WangTaoTheTonic <ba...@aliyun.com>
    
    Closes #2427 from WangTaoTheTonic/fixPortRetries and squashes the following commits:
    
    c178813 [WangTaoTheTonic] Use blank lines trigger Jenkins
    646f3fe [WangTaoTheTonic] also in SparkBuild.scala
    3700dba [WangTaoTheTonic] Fix configuration item not consistent with document
    
    (cherry picked from commit 3f169bfe3c322bf4344e13276dbbe34279b59ad0)
    Signed-off-by: Patrick Wendell <pw...@gmail.com>

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #19542: Branch 1.1.

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/19542
  
    Can one of the admins verify this patch?


---



[GitHub] spark issue #19542: Branch 1.1.

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/19542
  
    Close this


---



[GitHub] spark pull request #19542: Branch 1.1.

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/19542


---
