Posted to issues@beam.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2020/09/30 18:51:00 UTC

[jira] [Work logged] (BEAM-5058) Python precommits should run E2E tests

     [ https://issues.apache.org/jira/browse/BEAM-5058?focusedWorklogId=493119&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-493119 ]

ASF GitHub Bot logged work on BEAM-5058:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 30/Sep/20 18:50
            Start Date: 30/Sep/20 18:50
    Worklog Time Spent: 10m 
      Work Description: kennknowles commented on a change in pull request #6707:
URL: https://github.com/apache/beam/pull/6707#discussion_r497728357



##########
File path: buildSrc/src/main/groovy/org/apache/beam/gradle/BeamModulePlugin.groovy
##########
@@ -1492,5 +1492,79 @@ artifactId=${project.name}
         dependsOn ':beam-sdks-java-container:docker'
       }
     }
+
+    /** ***********************************************************************************************/
+
+    project.ext.applyPythonNature = {
+
+      // Define common lifecycle tasks and artifact types
+      project.apply plugin: "base"
+
+      // For some reason the base plugin doesn't define a test task, so we define one
+      // below and make check depend on it. This gives the Python project a task layout
+      // similar to Java projects, see https://docs.gradle.org/4.2.1/userguide/img/javaPluginTasks.png
+      project.task('test', type: Test) {}
+      project.check.dependsOn project.test
+
+      project.evaluationDependsOn(":beam-runners-google-cloud-dataflow-java-fn-api-worker")
+
+      project.ext.envdir = project.findProperty('envBaseDir') ?: "${project.rootProject.buildDir}"
+      project.ext.envdir = project.ext.envdir + "/${project.name}/gradleenv"

Review comment:
       I just happened across this while doing virtualenv work and reading about BEAM-4256. Do we know what the limit is, or have any idea about the cause? Is it just having one directory with a huge name?
   
   Seems like the most natural place for this would be `${project.buildDir}/venv`, since that is where task outputs typically live. Does that also get too long?
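
For concreteness, the reviewer's alternative could look something like the following sketch. This is hypothetical and not part of the PR; the `envBaseDir` property name is carried over from the diff, and the `venv` directory name is assumed:

```groovy
// Hypothetical sketch of the review suggestion: anchor the virtualenv under
// the project's own build directory rather than the root project's, while
// keeping the envBaseDir override property from the original diff.
project.ext.envdir = project.findProperty('envBaseDir') ?: "${project.buildDir}"
project.ext.envdir = "${project.ext.envdir}/venv"
```

Whether this path stays short enough would still depend on the workspace layout that BEAM-4256 describes.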




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


Issue Time Tracking
-------------------

    Worklog Id:     (was: 493119)
    Time Spent: 7h  (was: 6h 50m)

> Python precommits should run E2E tests
> --------------------------------------
>
>                 Key: BEAM-5058
>                 URL: https://issues.apache.org/jira/browse/BEAM-5058
>             Project: Beam
>          Issue Type: Task
>          Components: sdk-py-core, testing
>            Reporter: Udi Meiri
>            Assignee: Mark Liu
>            Priority: P2
>             Fix For: Not applicable
>
>          Time Spent: 7h
>  Remaining Estimate: 0h
>
> According to [https://beam.apache.org/contribute/testing/] (which I'm working on), end-to-end tests should be run in precommit on each combination of {batch, streaming} × {SDK language} × {supported runner}.
> At least 2 tests need to be added to Python's precommit: wordcount and wordcount_streaming on Dataflow, and possibly on other supported runners (the direct runner and new runners, please).
> These tests should be configured to run from a Gradle sub-project, so that they run in parallel with the unit tests.
> Example that parallelizes Java precommit integration tests: [https://github.com/apache/beam/pull/5731]
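
As an illustration of that sub-project wiring, a hedged sketch (not taken from PR #5731; the task name, module path, and flags here are hypothetical):

```groovy
// Hypothetical Gradle wiring: give the E2E test its own task so Gradle can
// schedule it in parallel with the unit-test task of sibling sub-projects.
task wordcountIT(type: Exec) {
  // Placeholder invocation; the real precommit command lives in the Beam build.
  commandLine 'python', '-m', 'apache_beam.examples.wordcount',
      '--runner', 'TestDataflowRunner',
      '--output', "${buildDir}/wordcount-it-output"
}

// Hooking into check makes the E2E test part of the standard verification
// lifecycle, mirroring how the unit tests are attached.
check.dependsOn wordcountIT
```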



--
This message was sent by Atlassian Jira
(v8.3.4#803005)