Posted to commits@seatunnel.apache.org by GitBox <gi...@apache.org> on 2022/09/29 07:47:22 UTC

[GitHub] [incubator-seatunnel] FWLamb opened a new issue, #2948: [Bug] [Connector-V2-Spark] 'start-seatunnel-spark-connector-v2.sh' file has problems with the Windows test code

FWLamb opened a new issue, #2948:
URL: https://github.com/apache/incubator-seatunnel/issues/2948

   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/incubator-seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### What happened
   
   When I ran the seatunnel-e2e test case, execution failed with the following error:
   
   ERROR AbstractTestContainer: bash: /tmp/seatunnel/bin/start-seatunnel-spark-connector-v2.sh: /bin/bash^M: bad interpreter: No such file or directory.
   
   Before the script ran, I set a breakpoint, manually entered the container to edit the script file, and the test then passed.
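
   For reference, the in-container edit amounts to stripping the carriage returns from the script; a minimal sketch (the container reference is a placeholder, only the script path comes from the error message):

   ```shell
   # Illustrative only: strip trailing carriage returns so the shebang resolves to
   # /bin/bash instead of /bin/bash^M. <spark-container> is a placeholder for the
   # container started by the e2e test.
   docker exec <spark-container> sed -i 's/\r$//' /tmp/seatunnel/bin/start-seatunnel-spark-connector-v2.sh
   ```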
   
   ### SeaTunnel Version
   
   2.1.3-SNAPSHOT
   
   ### SeaTunnel Config
   
   ```conf
   null
   ```
   
   
   ### Running Command
   
   ```shell
   null
   ```
   
   
   ### Error Exception
   
   ```log
   22/09/29 15:27:56 INFO DockerClientProviderStrategy: Loaded org.testcontainers.dockerclient.NpipeSocketClientProviderStrategy from ~/.testcontainers.properties, will try it first
   22/09/29 15:27:57 INFO DockerClientProviderStrategy: Found Docker environment with local Npipe socket (npipe:////./pipe/docker_engine)
   22/09/29 15:27:57 INFO DockerClientFactory: Docker host IP address is localhost
   22/09/29 15:27:57 INFO DockerClientFactory: Connected to docker: 
     Server Version: 20.10.17
     API Version: 1.41
     Operating System: Docker Desktop
     Total Memory: 11976 MB
   22/09/29 15:27:57 INFO ImageNameSubstitutor: Image name substitution will be performed by: DefaultImageNameSubstitutor (composite of 'ConfigurationFileImageNameSubstitutor' and 'PrefixingImageNameSubstitutor')
   22/09/29 15:27:58 INFO RegistryAuthLocator: Credential helper/store (docker-credential-desktop) does not have credentials for index.docker.io
   22/09/29 15:27:59 INFO DockerClientFactory: Ryuk started - will monitor and terminate Testcontainers containers on JVM exit
   22/09/29 15:27:59 INFO DockerClientFactory: Checking the system...
   22/09/29 15:27:59 INFO DockerClientFactory: ✔︎ Docker server version should be at least 1.6.0
   22/09/29 15:27:59 INFO DockerClientFactory: ✔︎ Docker environment should have more than 2GB free disk space
   22/09/29 15:28:00 INFO 6]: Creating container for image: bitnami/spark:2.4.6
   22/09/29 15:28:00 INFO 6]: Container bitnami/spark:2.4.6 is starting: 6de9c7b98336cc5c8f4be68032fd6662c277d66841df78d0211ff5ecaae8d686
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR:  07:28:02.07 
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR:  07:28:02.07 Welcome to the Bitnami spark container
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR:  07:28:02.07 Subscribe to project updates by watching https://github.com/bitnami/bitnami-docker-spark
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR:  07:28:02.07 Submit issues and feature requests at https://github.com/bitnami/bitnami-docker-spark/issues
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR:  07:28:02.07 
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR:  07:28:02.07 INFO  ==> ** Starting Spark setup **
   22/09/29 15:28:02 INFO 6]: Container bitnami/spark:2.4.6 started in PT2.159S
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR:  07:28:02.23 INFO  ==> Generating Spark configuration file...
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR:  07:28:02.23 INFO  ==> ** Spark setup finished! **
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDOUT: 
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR:  07:28:02.24 INFO  ==> ** Starting Spark in master mode **
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDOUT: starting org.apache.spark.deploy.master.Master, logging to /opt/bitnami/spark/logs/spark--org.apache.spark.deploy.master.Master-1-6de9c7b98336.out
   22/09/29 15:28:02 INFO AbstractSparkContainer: The TestContainer[connector-v2/spark:2.4.3] is running.
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR: Spark Command: /opt/bitnami/java/bin/java -cp /opt/bitnami/spark/conf/:/opt/bitnami/spark/jars/* -Xmx1g org.apache.spark.deploy.master.Master --host 6de9c7b98336 --port 7077 --webui-port 8080
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR: ========================================
   22/09/29 15:28:02 INFO AbstractTestSparkContainer: STDERR: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 INFO Master: Started daemon with process name: 55@6de9c7b98336
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 INFO SignalUtils: Registered signal handler for TERM
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 INFO SignalUtils: Registered signal handler for HUP
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 INFO SignalUtils: Registered signal handler for INT
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 INFO SecurityManager: Changing view acls to: spark
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 INFO SecurityManager: Changing modify acls to: spark
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 INFO SecurityManager: Changing view acls groups to: 
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 INFO SecurityManager: Changing modify acls groups to: 
   22/09/29 15:28:03 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(spark); groups with view permissions: Set(); users  with modify permissions: Set(spark); groups with modify permissions: Set()
   22/09/29 15:37:26 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:04 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
   22/09/29 15:37:26 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:04 INFO Master: Starting Spark master at spark://6de9c7b98336:7077
   22/09/29 15:37:26 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:04 INFO Master: Running Spark version 2.4.6
   22/09/29 15:37:26 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:04 INFO Utils: Successfully started service 'MasterUI' on port 8080.
   22/09/29 15:37:26 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:04 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://6de9c7b98336:8080
   22/09/29 15:37:26 INFO AbstractTestSparkContainer: STDERR: 22/09/29 07:28:04 INFO Master: I have been elected leader! New state: ALIVE
   22/09/29 15:37:26 INFO AbstractTestContainer: 
   22/09/29 15:37:26 ERROR AbstractTestContainer: bash: /tmp/seatunnel/bin/start-seatunnel-spark-connector-v2.sh: /bin/bash^M: bad interpreter: No such file or directory
   
   
   org.opentest4j.AssertionFailedError: 
   Expected :0
   Actual   :126
   <Click to see difference>
   
   
   	at org.junit.jupiter.api.AssertionFailureBuilder.build(AssertionFailureBuilder.java:151)
   	at org.junit.jupiter.api.AssertionFailureBuilder.buildAndThrow(AssertionFailureBuilder.java:132)
   	at org.junit.jupiter.api.AssertEquals.failNotEqual(AssertEquals.java:197)
   	at org.junit.jupiter.api.AssertEquals.assertEquals(AssertEquals.java:150)
   	at org.junit.jupiter.api.AssertEquals.assertEquals(AssertEquals.java:145)
   	at org.junit.jupiter.api.Assertions.assertEquals(Assertions.java:527)
   	at org.apache.seatunnel.e2e.spark.v2.file.FakeSourceToFileIT.testFakeSourceToLocalFileText(FakeSourceToFileIT.java:40)
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:727)
   	at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
   	at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:156)
   	at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:147)
   	at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:86)
   	at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(InterceptingExecutableInvoker.java:103)
   	at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.lambda$invoke$0(InterceptingExecutableInvoker.java:93)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
   	at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
   	at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.invoke(InterceptingExecutableInvoker.java:92)
   	at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.invoke(InterceptingExecutableInvoker.java:86)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:217)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:213)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:138)
   	at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:68)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
   	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
   	at java.util.ArrayList.forEach(ArrayList.java:1257)
   	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
   	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
   	at java.util.ArrayList.forEach(ArrayList.java:1257)
   	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
   	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
   	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
   	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
   	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35)
   	at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
   	at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54)
   	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:147)
   	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:127)
   	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:90)
   	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:55)
   	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:102)
   	at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:54)
   	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114)
   	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:86)
   	at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:86)
   	at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:53)
   	at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:57)
   	at com.intellij.rt.junit.IdeaTestRunner$Repeater$1.execute(IdeaTestRunner.java:38)
   	at com.intellij.rt.execution.junit.TestsRepeater.repeat(TestsRepeater.java:11)
   	at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:35)
   	at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:235)
   	at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
   
   22/09/29 15:37:27 INFO AbstractSparkContainer: The TestContainer[connector-v2/spark:2.4.3] is closed.
   ```
   
   
   ### Flink or Spark Version
   
   spark:2.4.3
   
   ### Java or Scala Version
   
   jdk8
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@seatunnel.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [incubator-seatunnel] liugddx commented on issue #2948: [Bug] [Connector-V2-Spark] 'start-seatunnel-spark-connector-v2.sh' file has problems with the Windows test code

Posted by GitBox <gi...@apache.org>.
liugddx commented on issue #2948:
URL: https://github.com/apache/incubator-seatunnel/issues/2948#issuecomment-1261904904

   Please change the file's line endings from CRLF to LF.


[GitHub] [incubator-seatunnel] liugddx commented on issue #2948: [Bug] [Connector-V2-Spark] 'start-seatunnel-spark-connector-v2.sh' file has problems with the Windows test code

Posted by GitBox <gi...@apache.org>.
liugddx commented on issue #2948:
URL: https://github.com/apache/incubator-seatunnel/issues/2948#issuecomment-1261917025

   > > > > Please change the file's line endings from CRLF to LF.
   > > > 
   > > > Which file should I modify?
   > > 
   > > `start-seatunnel-spark-connector-v2.sh`, or `start-seatunnel-flink-connector-v2.sh` if you use the Flink engine.
   > 
   > I couldn't find these files in the project; they only exist inside the Docker container. To modify them I have to set breakpoints, and every test run then means entering the container to edit the file, which is too troublesome.
   
   https://github.com/apache/incubator-seatunnel/blob/dev/seatunnel-core/seatunnel-spark-starter/src/main/bin/start-seatunnel-spark-connector-v2.sh
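
   For anyone else hitting this, converting the script in a local checkout before running the e2e tests is enough; a minimal sketch (the path is the one linked above, and `dos2unix` is assumed to be installed — the `sed` form is an alternative):

   ```shell
   # Rewrite CRLF line endings as LF in the launcher script before the e2e test copies it into the container.
   dos2unix seatunnel-core/seatunnel-spark-starter/src/main/bin/start-seatunnel-spark-connector-v2.sh
   # Equivalent without dos2unix:
   sed -i 's/\r$//' seatunnel-core/seatunnel-spark-starter/src/main/bin/start-seatunnel-spark-connector-v2.sh
   ```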


[GitHub] [incubator-seatunnel] FWLamb commented on issue #2948: [Bug] [Connector-V2-Spark] 'start-seatunnel-spark-connector-v2.sh' file has problems with the Windows test code

Posted by GitBox <gi...@apache.org>.
FWLamb commented on issue #2948:
URL: https://github.com/apache/incubator-seatunnel/issues/2948#issuecomment-1261929755

   > Please change the file's line endings from CRLF to LF.
   
   The problem is solved, thank you.


[GitHub] [incubator-seatunnel] FWLamb commented on issue #2948: [Bug] [Connector-V2-Spark] 'start-seatunnel-spark-connector-v2.sh' file has problems with the Windows test code

Posted by GitBox <gi...@apache.org>.
FWLamb commented on issue #2948:
URL: https://github.com/apache/incubator-seatunnel/issues/2948#issuecomment-1261908425

   > Please change the file's line endings from CRLF to LF.
   
   Which file should I modify?


[GitHub] [incubator-seatunnel] FWLamb closed issue #2948: [Bug] [Connector-V2-Spark] 'start-seatunnel-spark-connector-v2.sh' file has problems with the Windows test code

Posted by GitBox <gi...@apache.org>.
FWLamb closed issue #2948: [Bug] [Connector-V2-Spark] 'start-seatunnel-spark-connector-v2.sh' file has problems with the Windows test code
URL: https://github.com/apache/incubator-seatunnel/issues/2948


[GitHub] [incubator-seatunnel] liugddx commented on issue #2948: [Bug] [Connector-V2-Spark] 'start-seatunnel-spark-connector-v2.sh' file has problems with the Windows test code

Posted by GitBox <gi...@apache.org>.
liugddx commented on issue #2948:
URL: https://github.com/apache/incubator-seatunnel/issues/2948#issuecomment-1261910530

   > > Please change the file's line endings from CRLF to LF.
   > 
   > Which file should I modify?
   
   `start-seatunnel-spark-connector-v2.sh`, or `start-seatunnel-flink-connector-v2.sh` if you use the Flink engine.


[GitHub] [incubator-seatunnel] FWLamb commented on issue #2948: [Bug] [Connector-V2-Spark] 'start-seatunnel-spark-connector-v2.sh' file has problems with the Windows test code

Posted by GitBox <gi...@apache.org>.
FWLamb commented on issue #2948:
URL: https://github.com/apache/incubator-seatunnel/issues/2948#issuecomment-1261915301

   > > > Please change the file's line endings from CRLF to LF.
   > > 
   > > Which file should I modify?
   > 
   > `start-seatunnel-spark-connector-v2.sh`, or `start-seatunnel-flink-connector-v2.sh` if you use the Flink engine.
   
   I couldn't find these files in the project; they only exist inside the Docker container. To modify them I have to set breakpoints, and every test run then means entering the container to edit the file, which is too troublesome.
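
   A way to avoid touching the container at all is to keep the working copy's `.sh` files in LF on Windows, so they are copied in with a valid shebang; a minimal sketch of the Git settings involved (neither is claimed to be in the project already):

   ```shell
   # Check out text files with LF even on Windows (set before cloning, or re-clone afterwards).
   git config --global core.autocrlf input

   # Or pin it per path with a .gitattributes rule at the repository root:
   echo '*.sh text eol=lf' >> .gitattributes
   ```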

