Posted to issues@kudu.apache.org by "Yanlong_Zheng (JIRA)" <ji...@apache.org> on 2016/11/24 08:24:58 UTC

[jira] [Commented] (KUDU-1757) Issue with newInsert() in spark-shell

    [ https://issues.apache.org/jira/browse/KUDU-1757?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15692621#comment-15692621 ] 

Yanlong_Zheng commented on KUDU-1757:
-------------------------------------

This issue still occurs in the current Kudu build (1.2.0-SNAPSHOT, kudu-client-1.2.0-20161124.045847-24.jar).
When newInsert() is run in spark-shell, the REPL tries to print the result of each line, so it ends up stringifying the insert before any columns have been set on it.
The line below, in appendCellValueDebugString in PartialRow.java, is what actually throws the exception and breaks the calling code:
    Preconditions.checkState(columnsBitSet.get(idx), "Column %s is not set", col.getName());
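
The spark-shell REPL is only the trigger here; any call to toString() on a freshly created insert goes through the same code path. A minimal sketch of the same failure outside the REPL (the master address and table name below are placeholders):

    import org.apache.kudu.client.Insert;
    import org.apache.kudu.client.KuduClient;
    import org.apache.kudu.client.KuduTable;

    public class ToStringRepro {
      public static void main(String[] args) throws Exception {
        // Placeholder master address and table name.
        KuduClient client = new KuduClient.KuduClientBuilder("master-host:7051").build();
        try {
          KuduTable table = client.openTable("kudu_scala_test");
          Insert insert = table.newInsert();
          // No columns have been set yet. This is what the REPL does implicitly when it
          // echoes `val kins = ktbl.newInsert()`, and it throws
          // java.lang.IllegalStateException: Column id is not set
          System.out.println(insert.toString());
        } finally {
          client.close();
        }
      }
    }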

I think this check should be relaxed: the purpose of this method is only to append debug information, so it should not break the calling code.
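
As a rough, standalone illustration of what I mean (the class and field names below are made up, not the actual PartialRow internals), a debug-string helper can render a placeholder for unset columns instead of throwing:

    import java.util.BitSet;

    // Hypothetical stand-in for the per-column debug formatting: columns that are not
    // marked in the "set" bitset are printed as a placeholder instead of failing the
    // whole toString() call.
    class RowDebugStringSketch {
      static String debugString(String[] columnNames, Object[] values, BitSet setColumns) {
        StringBuilder sb = new StringBuilder("(");
        for (int idx = 0; idx < columnNames.length; idx++) {
          if (idx > 0) sb.append(", ");
          sb.append(columnNames[idx]).append('=');
          // Instead of Preconditions.checkState(setColumns.get(idx), ...):
          sb.append(setColumns.get(idx) ? String.valueOf(values[idx]) : "<unset>");
        }
        return sb.append(')').toString();
      }

      public static void main(String[] args) {
        String[] names = {"id", "name"};
        Object[] values = new Object[names.length];    // nothing assigned yet
        BitSet setColumns = new BitSet(names.length);  // nothing set yet
        System.out.println(debugString(names, values, setColumns));  // (id=<unset>, name=<unset>)
      }
    }

With that behavior the REPL's implicit echo would just show an operation with unset key columns rather than failing the statement.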


> Issue with newInsert() in spark-shell
> -------------------------------------
>
>                 Key: KUDU-1757
>                 URL: https://issues.apache.org/jira/browse/KUDU-1757
>             Project: Kudu
>          Issue Type: Bug
>          Components: api, client
>    Affects Versions: 1.0.1
>         Environment: hardware: Azure cloud, 3 nodes
> OS: CentOS 6.6
> Kudu: 1.0.0-1.kudu1.0.0.p0.6
> Kudu client jar: kudu-client-1.0.1.jar
> spark: spark 1.6.0
>            Reporter: Yanlong_Zheng
>            Priority: Minor
>              Labels: easytest
>
> Hi,
> When trying the Kudu Java API in spark-shell, I hit an error when trying to insert data into Kudu.
> In spark-shell:
> scala> import org.apache.kudu.ColumnSchema;
> import org.apache.kudu.ColumnSchema
> scala> import org.apache.kudu.Schema;
> import org.apache.kudu.Schema
> scala> import org.apache.kudu.Type;
> import org.apache.kudu.Type
> scala> import org.apache.kudu.client._
> import org.apache.kudu.client._
> scala> import java.util.ArrayList;
> import java.util.ArrayList
> scala> import java.util.List;
> import java.util.List
> scala> val kc = new KuduClient.KuduClientBuilder("XXX.XXX.XXX.XXX:7051").build()
> kc: org.apache.kudu.client.KuduClient = org.apache.kudu.client.KuduClient@2b852e9
> scala> val ktbl = kc.openTable("kudu_scala_test")
> ktbl: org.apache.kudu.client.KuduTable = org.apache.kudu.client.KuduTable@7e26fb98
> scala> val ksession = kc.newSession()
> ksession: org.apache.kudu.client.KuduSession = org.apache.kudu.client.KuduSession@cc1f71e
> scala> val kins = ktbl.newInsert()
> java.lang.IllegalStateException: Column id is not set      <== id is the first column in the table.
>      at org.apache.kudu.client.shaded.com.google.common.base.Preconditions.checkState(Preconditions.java:197)
>      at org.apache.kudu.client.PartialRow.appendDebugString(PartialRow.java:559)
>      at org.apache.kudu.client.PartialRow.stringifyRowKey(PartialRow.java:532)
>      at org.apache.kudu.client.Operation.toString(Operation.java:202)
>      at scala.runtime.ScalaRunTime$.scala$runtime$ScalaRunTime$$inner$1(ScalaRunTime.scala:324)
>      at scala.runtime.ScalaRunTime$.stringOf(ScalaRunTime.scala:329)
>      at scala.runtime.ScalaRunTime$.replStringOf(ScalaRunTime.scala:337)
>      at .<init>(<console>:10)
>      at .<clinit>(<console>)
>      at $print(<console>)
>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>      at java.lang.reflect.Method.invoke(Method.java:606)
>      at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
>      at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
>      at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
>      at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
>      at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
>      at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>      at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>      at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>      at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>      at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>      at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>      at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>      at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>      at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>      at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>      at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>      at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1064)
>      at org.apache.spark.repl.Main$.main(Main.scala:31)
>      at org.apache.spark.repl.Main.main(Main.scala)
>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>      at java.lang.reflect.Method.invoke(Method.java:606)
>      at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>      at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>      at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>      at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>      at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> I have tried the Kudu Java sample, and it works fine in plain Java.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)