Posted to commits@beam.apache.org by "Elek, Marton (JIRA)" <ji...@apache.org> on 2017/04/27 12:53:04 UTC
[jira] [Updated] (BEAM-2093) Update Jackson version to 2.8.8 in archetype (or align with parent pom)
[ https://issues.apache.org/jira/browse/BEAM-2093?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Elek, Marton updated BEAM-2093:
-------------------------------
Description:
The Jackson version used by the latest examples archetype is not compatible with the latest Beam.
Even executing the plain word-count example (generated by the archetype):
{code}
mvn compile exec:java -Dexec.mainClass=org.apache.beam.examples.WordCount -Dexec.args="--inputFile=pom.xml --output=counts --runner=spark" -Pspark-runner
{code}
fails with:
{code}
11:58:48.231 [main] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Exception in thread "main" java.lang.RuntimeException: java.lang.ExceptionInInitializerError
at org.apache.beam.runners.spark.SparkPipelineResult.runtimeExceptionFrom(SparkPipelineResult.java:57)
at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:74)
at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:110)
at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:98)
at com.hortonworks.hcube.codestream.AsfStat.main(AsfStat.java:54)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ExceptionInInitializerError
at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
at org.apache.spark.rdd.RDD.mapPartitions(RDD.scala:706)
at org.apache.spark.api.java.JavaRDDLike$class.mapPartitionsToPair(JavaRDDLike.scala:194)
at org.apache.spark.api.java.AbstractJavaRDDLike.mapPartitionsToPair(JavaRDDLike.scala:46)
at org.apache.beam.runners.spark.translation.TransformTranslator$6.evaluate(TransformTranslator.java:356)
at org.apache.beam.runners.spark.translation.TransformTranslator$6.evaluate(TransformTranslator.java:340)
at org.apache.beam.runners.spark.SparkRunner$Evaluator.doVisitTransform(SparkRunner.java:409)
at org.apache.beam.runners.spark.SparkRunner$Evaluator.visitPrimitiveTransform(SparkRunner.java:395)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:488)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:483)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:483)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:483)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$400(TransformHierarchy.java:232)
at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:207)
at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:383)
at org.apache.beam.runners.spark.SparkRunner$2.run(SparkRunner.java:210)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.8.8
at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:745)
at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:81)
at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
... 21 more
{code}
The long-term solution is to enable filtering in the archetype and reference the Jackson version through the corresponding Maven property, so the generated project always matches the parent pom.
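A minimal sketch of that approach (the file layout follows the usual Maven archetype conventions, and the {{jackson.version}} property name is an assumption; the exact substitution mechanics depend on how the archetype module is built):

{code}
<!-- archetype-metadata.xml: mark the generated pom as filtered so that
     property references in it are substituted at generation time. -->
<fileSet filtered="true" encoding="UTF-8">
  <includes>
    <include>pom.xml</include>
  </includes>
</fileSet>

<!-- archetype-resources/pom.xml: reference the shared property instead of
     a hard-coded Jackson version (property name is an assumption). -->
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>${jackson.version}</version>
</dependency>
{code}

Until that lands, the conflicting Jackson artifacts in a generated project can be inspected with {{mvn dependency:tree -Dincludes=com.fasterxml.jackson*}}.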
> Update Jackson version to 2.8.8 in archetype (or align with parent pom)
> -----------------------------------------------------------------------
>
> Key: BEAM-2093
> URL: https://issues.apache.org/jira/browse/BEAM-2093
> Project: Beam
> Issue Type: Bug
> Components: project-management
> Reporter: Jean-Baptiste Onofré
> Assignee: Elek, Marton
> Fix For: First stable release
>
>
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)