Posted to issues@spark.apache.org by "Lunen (JIRA)" <ji...@apache.org> on 2015/09/16 12:45:46 UTC
[jira] [Created] (SPARK-10633) Persisting Spark stream to MySQL - Spark tries to create the table for every stream even if it already exists.
Lunen created SPARK-10633:
-----------------------------
Summary: Persisting Spark stream to MySQL - Spark tries to create the table for every stream even if it already exists.
Key: SPARK-10633
URL: https://issues.apache.org/jira/browse/SPARK-10633
Project: Spark
Issue Type: Bug
Components: SQL, Streaming
Affects Versions: 1.5.0, 1.4.0
Environment: Ubuntu 14.04
IntelliJ IDEA 14.1.4
sbt
mysql-connector-java 5.1.35 (Tested and working with Spark 1.3.1)
Reporter: Lunen
Priority: Blocker
Persisting a Spark Kafka stream to MySQL:
Spark 1.4+ tries to create the target table automatically every time the stream is written to the specified table.
Please note that Spark 1.3.1 works.
Code sample:

import java.util.Properties
import javax.sql.rowset.{CachedRowSet, RowSetProvider}

val url = "jdbc:mysql://host:port/db?user=user&password=password"
val crp = RowSetProvider.newFactory()
val crsSql: CachedRowSet = crp.createCachedRowSet()
val crsTrg: CachedRowSet = crp.createCachedRowSet()
crsSql.beforeFirst()
crsTrg.beforeFirst()

// Read stream from Kafka
// Produce SQL INSERT string
streamT.foreachRDD { rdd =>
  if (rdd.toLocalIterator.nonEmpty) {
    sqlContext.read.json(rdd).registerTempTable(serverEvents + "_events")
    while (crsSql.next) {
      sqlContext.sql("SQL INSERT STRING").write.jdbc(url, "SCHEMA_NAME", new Properties)
      println("Persisted Data: " + "SQL INSERT STRING")
    }
    crsSql.beforeFirst()
  }
  // stmt and conn are created earlier (not shown in this sample)
  stmt.close()
  conn.close()
}
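As a possible workaround on 1.4+, setting an explicit save mode on the writer should avoid the failure when the table already exists, since DataFrameWriter defaults to SaveMode.ErrorIfExists. A minimal sketch, assuming a working SQLContext and an already-populated temp table; the table and temp-table names here are hypothetical, not from the report above:

```scala
import java.util.Properties
import org.apache.spark.sql.SaveMode

val props = new Properties()
props.setProperty("user", "user")
props.setProperty("password", "password")

// Append rows to the existing MySQL table instead of letting Spark
// attempt CREATE TABLE on every micro-batch (the ErrorIfExists default).
sqlContext.sql("SELECT * FROM server_events")   // hypothetical temp table
  .write
  .mode(SaveMode.Append)                        // do not fail if the table exists
  .jdbc("jdbc:mysql://host:port/db", "db.events", props)
```

Whether Append is acceptable depends on the pipeline; it trades the existence check for the risk of duplicate inserts on replayed batches.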
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)