Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/08/30 03:52:00 UTC

[jira] [Resolved] (SPARK-40221) Not able to format using scalafmt

     [ https://issues.apache.org/jira/browse/SPARK-40221?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-40221.
----------------------------------
    Fix Version/s: 3.4.0
         Assignee: Hyukjin Kwon
       Resolution: Fixed

> Not able to format using scalafmt
> ---------------------------------
>
>                 Key: SPARK-40221
>                 URL: https://issues.apache.org/jira/browse/SPARK-40221
>             Project: Spark
>          Issue Type: Question
>          Components: Build
>    Affects Versions: 3.4.0
>            Reporter: Ziqi Liu
>            Assignee: Hyukjin Kwon
>            Priority: Major
>             Fix For: 3.4.0
>
>
> I'm following the guidance in [https://spark.apache.org/developer-tools.html] using 
> {code:java}
> ./dev/scalafmt{code}
> to format the code, but I get this error:
> {code:java}
> [ERROR] Failed to execute goal org.antipathy:mvn-scalafmt_2.12:1.1.1640084764.9f463a9:format (default-cli) on project spark-parent_2.12: Error formatting Scala files: missing setting 'version'. To fix this problem, add the following line to .scalafmt.conf: 'version=3.2.1'. -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException {code}
>  
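> As the error message itself suggests, a likely workaround is to pin the scalafmt version in the repository's {{.scalafmt.conf}} (the exact path within the Spark repo may differ; this is a sketch based on the message above):
> {code:java}
> # .scalafmt.conf -- version suggested by the mvn-scalafmt error message
> version = 3.2.1
> {code}
> After adding the setting, re-running {{./dev/scalafmt}} should no longer fail with "missing setting 'version'".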



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org