Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2021/01/07 11:28:41 UTC

[GitHub] [flink] klion26 commented on a change in pull request #14064: [FLINK-19454][docs-zh] Translate page 'Importing Flink into an IDE' into Chinese

klion26 commented on a change in pull request #14064:
URL: https://github.com/apache/flink/pull/14064#discussion_r553267144



##########
File path: docs/flinkDev/ide_setup.zh.md
##########
@@ -25,105 +25,82 @@ under the License.
 * Replaced by the TOC
 {:toc}
 
-The sections below describe how to import the Flink project into an IDE
-for the development of Flink itself. For writing Flink programs, please
-refer to the [Java API]({% link dev/project-configuration.zh.md %})
-and the [Scala API]({% link dev/project-configuration.zh.md %})
-quickstart guides.
+以下章节描述了如何在 IDE 中导入 Flink 项目以此进行 Flink 本身的源码开发。若需要开发 Flink 应用程序,请参考 [Java API]({% link dev/project-configuration.zh.md %}) 
+和 [Scala API]({% link dev/project-configuration.zh.md %})的快速入门指南。

Review comment:
       This could be merged with the previous line; otherwise an extra space shows up in the rendered output.
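       For illustration, a sketch of what the merged source line could look like (the text is taken verbatim from the two added lines in the patch):

```md
以下章节描述了如何在 IDE 中导入 Flink 项目以此进行 Flink 本身的源码开发。若需要开发 Flink 应用程序,请参考 [Java API]({% link dev/project-configuration.zh.md %}) 和 [Scala API]({% link dev/project-configuration.zh.md %})的快速入门指南。
```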

##########
File path: docs/flinkDev/ide_setup.zh.md
##########
@@ -25,105 +25,82 @@ under the License.
 * Replaced by the TOC
 {:toc}
 
-The sections below describe how to import the Flink project into an IDE
-for the development of Flink itself. For writing Flink programs, please
-refer to the [Java API]({% link dev/project-configuration.zh.md %})
-and the [Scala API]({% link dev/project-configuration.zh.md %})
-quickstart guides.
+以下章节描述了如何在 IDE 中导入 Flink 项目以此进行 Flink 本身的源码开发。若需要开发 Flink 应用程序,请参考 [Java API]({% link dev/project-configuration.zh.md %}) 
+和 [Scala API]({% link dev/project-configuration.zh.md %})的快速入门指南。
 
-**NOTE:** Whenever something is not working in your IDE, try with the Maven
-command line first (`mvn clean package -DskipTests`) as it might be your IDE
-that has a bug or is not properly set up.
+**注意:** 当你的 IDE 无法正常运行时,请优先尝试使用 Maven 命令 (`mvn clean package -DskipTests`) 因为这可能是你的 IDE 存在问题或设置不正确.
 
-## Preparation
+## 准备工作

Review comment:
       Please follow the [wiki](https://cwiki.apache.org/confluence/display/FLINK/Flink+Translation+Specifications) and add an <a> tag to each translated heading; the specific tag name can be taken from the URL of the English version.
   You can run `./docs/docker/run.sh` locally, follow the prompts to test it, and open localhost:4000 to check the result.
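       As a sketch only (the exact anchor name should be copied from the English page's URL; `preparation` below is just an assumed example), the translated heading could look like:

```md
## 准备工作 <a name="preparation"></a>
```

       With an anchor like this, `#preparation` links that target the heading on the English page should keep resolving on the translated page.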

##########
File path: docs/flinkDev/ide_setup.zh.md
##########
@@ -142,73 +119,67 @@ Each file needs to include the Apache license as a header. This can be automated
    See the License for the specific language governing permissions and 
    limitations under the License.
    ```
-5. Apply the changes
+5. 点击 Apply 保存变更配置
 
-### FAQ
+### 常见问题
 
-This section lists issues that developers have run into in the past when working with IntelliJ:
+本节列出了开发人员过去使用 IntelliJ 时遇到的问题:
 
-- Compilation fails with `invalid flag: --add-exports=java.base/sun.net.util=ALL-UNNAMED`
+- 编译失败并出现 `invalid flag: --add-exports=java.base/sun.net.util=ALL-UNNAMED`
 
-This means that IntelliJ activated the `java11` profile despite an older JDK being used.
-Open the Maven tool window (View -> Tool Windows -> Maven), uncheck the `java11` profile and reimport the project.
+这表明 IntelliJ 虽然使用了较旧版本的 JDK,但是同时使用了 `java11` 的配置文件。
+打开 Maven 工具栏 (View -> Tool Windows -> Maven), 在 Profiles下取消选中 `java11` 并点击 Reimport 重新导入项目。
 
-- Compilation fails with `cannot find symbol: symbol: method defineClass(...) location: class sun.misc.Unsafe`
+- 编译失败并出现 `cannot find symbol: symbol: method defineClass(...) location: class sun.misc.Unsafe`
 
-This means that IntelliJ is using JDK 11 for the project, but you are working on a Flink version which doesn't
-support Java 11.
-This commonly happens when you have setup IntelliJ to use JDK 11 and checkout older versions of Flink (<= 1.9).
-Open the project settings window (File -> Project Structure -> Project Settings: Project) and select JDK 8 as the project
-SDK.
-You may have to revert this after switching back to the new Flink version if you want to use JDK 11.
+这表明 IntelliJ 中项目使用的JDK 版本是 JDK 11,但是你使用的 Flink 版本不支持 Java 11。
+通常会发生这种情况是因为在 IntelliJ 中设置了使用 JDK 11,但是使用的 Flink 版本过低导致的 (<= 1.9)。
+打开项目配置窗口 (File -> Project Structure -> Project Settings: Project) 并选择 JDK 8 作为项目的 SDK。
+当切换到 Flink 新版本并且希望使用 JDK 11 时则可能需要重新设置 SDK。
 
-- Examples fail with a `NoClassDefFoundError` for Flink classes.
+- 运行 Flink Examples 时出现关于 Flink 相关类的 `NoClassDefFoundError`
 
-This is likely due to Flink dependencies being set to provided, resulting in them not being put automatically on the 
-classpath.
-You can either tick the "Include dependencies with 'Provided' scope" box in the run configuration, or create a test
-that calls the `main()` method of the example (`provided` dependencies are available on the test classpath).
+这可能是由于将 Flink 的依赖设置成了 `provided`,导致依赖没有在类路径中被自动加载。
+可以在运行配置中勾选 "Include dependencies with 'Provided' scope",或新建可以调用 Example 类的 `main()` 方法的测试 (`provided` 相关的依赖存在于测试的类加载路径).
 
 ## Eclipse
 
-**NOTE:** From our experience, this setup does not work with Flink
-due to deficiencies of the old Eclipse version bundled with Scala IDE 3.0.3 or
-due to version incompatibilities with the bundled Scala version in Scala IDE 4.4.1.
+**注意:** 根据我们的经验,由于捆绑 Scala IDE 3.0.3 的旧版本 Eclipse 存在缺陷或由于捆绑 Scala IDE 4.4.1 的 Eclipse 存在版本不兼容问题,

Review comment:
       Could this sentence be polished a bit? It reads somewhat awkwardly.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org