Posted to issues@spark.apache.org by "chenliang (JIRA)" <ji...@apache.org> on 2019/01/07 06:39:00 UTC

[jira] [Created] (SPARK-26557) The configuration of maven-checkstyle-plugin is incorrect for mvn install

chenliang created SPARK-26557:
---------------------------------

             Summary: The configuration of maven-checkstyle-plugin is incorrect for mvn install
                 Key: SPARK-26557
                 URL: https://issues.apache.org/jira/browse/SPARK-26557
             Project: Spark
          Issue Type: Dependency upgrade
          Components: Build
    Affects Versions: 2.3.0, 2.2.0
            Reporter: chenliang


When I build Spark with the following Maven command:

{code:java}
mvn install -Psparkr -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.3  -Phive-thriftserver -Pnetlib-lgpl -Pspark-ganglia-lgpl -DskipTests -Denforcer.skip=true  
{code}

the build fails with this error:

{panel:title=Error}
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.17:check (default) on project spark-parent_2.11: Unable to parse configuration of mojo org.apache.maven.plugins:*maven-checkstyle-plugin:2.17*:check for parameter sourceDirectories: Cannot assign configuration entry '*sourceDirectories*' with value 
'  /../../main/java,   /../../main/scala' of type java.lang.String to property of type java.util.List -> [Help 1]
{panel}



Then I checked the documentation for *maven-checkstyle-plugin:2.17*: http://maven.apache.org/components/plugins/maven-checkstyle-plugin/



The configuration parameter was only renamed from "sourceDirectory" to "sourceDirectories" in the 3.0.0 major version upgrade, but the plugin version set in the Spark build is still 2.17.

There are two possible solutions (see the sketch after this list):
1. Change *sourceDirectories* back to *sourceDirectory* in the plugin configuration;
2. Upgrade maven-checkstyle-plugin to a newer version.
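
A minimal sketch of the second option, assuming the plugin is declared in the parent pom.xml; the paths and surrounding configuration are illustrative, not the exact Spark settings:

{code:xml}
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <!-- upgraded from 2.17; 3.0.0 accepts a list for sourceDirectories -->
  <version>3.0.0</version>
  <configuration>
    <!-- declare each directory as a nested element instead of a
         comma-separated string, so Maven can map it to java.util.List -->
    <sourceDirectories>
      <sourceDirectory>${basedir}/src/main/java</sourceDirectory>
      <sourceDirectory>${basedir}/src/main/scala</sourceDirectory>
    </sourceDirectories>
  </configuration>
</plugin>
{code}

For the first option, note that in 2.17 *sourceDirectory* takes a single directory, so covering both the java and scala source trees would need to be handled another way.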




