Posted to user@spark.apache.org by justneeraj <ju...@gmail.com> on 2016/05/10 09:21:36 UTC
Reading table schema from Cassandra
Hi,
We are using the Spark Cassandra connector in our app,
and I am trying to create higher-level roll-up tables, e.g. a minutes table
from a seconds table.
If my tables are already defined, how can I read the schema of a table,
so that I can load it into a DataFrame and create the aggregates?
Any help would be much appreciated.
Thanks,
Neeraj
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Reading-table-schema-from-Cassandra-tp26915.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org
RE: Reading table schema from Cassandra
Posted by Mohammed Guller <mo...@glassbeam.com>.
You can create a DataFrame directly from a Cassandra table using something like this:
val dfCassTable = sqlContext.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "your_column_family", "keyspace" -> "your_keyspace"))
  .load()
Then, you can get the schema:
val dfCassTableSchema = dfCassTable.schema
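Building on that, the roll-up itself can be sketched as below. This is a minimal, illustrative example, not code from the thread: it assumes a hypothetical seconds-level table named metrics_seconds with columns sensor_id, ts (timestamp), and value, a pre-created metrics_minutes table, and a Cassandra cluster reachable from Spark. The connector infers the schema from Cassandra's own table metadata, so no case classes are needed.

```scala
import org.apache.spark.sql.functions._

// Read the existing seconds-level table; the schema is inferred
// from Cassandra's table metadata.
val secondsDf = sqlContext.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "metrics_seconds", "keyspace" -> "your_keyspace"))
  .load()

// Inspect the inferred schema before aggregating.
secondsDf.printSchema()

// Roll up to one row per sensor per minute by flooring the
// timestamp to 60-second buckets.
val minutesDf = secondsDf
  .withColumn("minute",
    (floor(col("ts").cast("long") / 60) * 60).cast("timestamp"))
  .groupBy(col("sensor_id"), col("minute"))
  .agg(avg("value").as("avg_value"), max("value").as("max_value"))

// Append the aggregates to a pre-created minutes table.
minutesDf.write
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "metrics_minutes", "keyspace" -> "your_keyspace"))
  .mode("append")
  .save()
```

Note that mode("append") is needed because the connector's DataFrame writer errors out by default when the target table already exists.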
Mohammed
Author: Big Data Analytics with Spark
-----Original Message-----
From: justneeraj [mailto:justneeraj@gmail.com]
Sent: Tuesday, May 10, 2016 2:22 AM
To: user@spark.apache.org
Subject: Reading table schema from Cassandra