Posted to issues@spark.apache.org by "Kevin Huang (JIRA)" <ji...@apache.org> on 2016/09/09 07:41:20 UTC
[jira] [Created] (SPARK-17467) Spark SQL: Return incorrect result for the data files on Swift
Kevin Huang created SPARK-17467:
-----------------------------------
Summary: Spark SQL: Return incorrect result for the data files on Swift
Key: SPARK-17467
URL: https://issues.apache.org/jira/browse/SPARK-17467
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 1.6.2
Reporter: Kevin Huang
I have an AVRO file on Swift (awclassic.avro). I ran a Spark SQL query to count its records but got an incorrect result: the file has 60855 records, yet the query returned 42451.
The following is the sample code I used:
{code:java}
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class SparkTest {
    public static void main(String[] args) throws InterruptedException, IOException {
        SparkConf sparkConf = new SparkConf()
                .setAppName("Spark-Swift-App")
                .setMaster("local[8]");
        JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);

        Configuration hadoopConf = sparkContext.hadoopConfiguration();
        Configuration extraHadoopConf = new Configuration(false);
        // swift-site.xml is a custom file that stores the Swift configuration.
        extraHadoopConf.addResource("swift-site.xml");
        hadoopConf.addResource(extraHadoopConf);

        SQLContext sqlContext = new SQLContext(sparkContext);
        DataFrame df = sqlContext.read()
                .format("com.databricks.spark.avro")
                .load("swift://bdd-edp.bddcs/awclassic/awclassic.avro");
        df.registerTempTable("awclassic");
        DataFrame result = sqlContext.sql("SELECT COUNT(*) FROM awclassic");
        result.show();
    }
}
{code}
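The reporter's swift-site.xml is not attached. For context, a minimal sketch of what such a file typically contains for the hadoop-openstack Swift connector is shown below; the service name "bddcs" is taken from the swift:// URL above, and all values are placeholders, not the reporter's actual settings.

{code:xml}
<configuration>
  <property>
    <name>fs.swift.impl</name>
    <value>org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem</value>
  </property>
  <property>
    <name>fs.swift.service.bddcs.auth.url</name>
    <value>https://auth.example.com/v2.0/tokens</value> <!-- placeholder -->
  </property>
  <property>
    <name>fs.swift.service.bddcs.tenant</name>
    <value>example-tenant</value> <!-- placeholder -->
  </property>
  <property>
    <name>fs.swift.service.bddcs.username</name>
    <value>example-user</value> <!-- placeholder -->
  </property>
  <property>
    <name>fs.swift.service.bddcs.password</name>
    <value>example-password</value> <!-- placeholder -->
  </property>
</configuration>
{code}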
I also noticed something interesting: changing the value of fs.swift.blocksize changes the result. For details, please see the following table.
||fs.swift.blocksize (KB) ||SELECT COUNT\(*\) FROM awclassic ||
|16384 (16 MB) | 51683|
|32768 (32 MB, default) | 42451|
|65536 (64 MB) | 30459|
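To illustrate why a count could both fall short of the true total and vary with the block size, here is a toy, pure-Java sketch (this is not Spark or Swift-connector code, just a hypothetical failure mode): if a per-block reader skips the leading partial record and stops at the block end, any record straddling a block boundary is dropped by both sides, and the undercount changes with the block size.

{code:java}
import java.nio.charset.StandardCharsets;

public class SplitCountDemo {
    // Correct count: newline-terminated records in the whole stream.
    public static int countWhole(byte[] data) {
        int n = 0;
        for (byte b : data) if (b == '\n') n++;
        return n;
    }

    // Buggy per-block count: each block is scanned independently. Any
    // leading partial record is skipped, and scanning stops at the block
    // end, so a record straddling a boundary is lost by both blocks.
    public static int countPerBlock(byte[] data, int blockSize) {
        int total = 0;
        for (int start = 0; start < data.length; start += blockSize) {
            int end = Math.min(start + blockSize, data.length);
            int i = start;
            if (start != 0) {
                // skip the partial record carried over from the previous block
                while (i < end && data[i] != '\n') i++;
                i++; // step past the newline (if one was found)
            }
            for (; i < end; i++) if (data[i] == '\n') total++;
        }
        return total;
    }

    public static void main(String[] args) {
        byte[] data = "aaaa\nbbbb\ncccc\ndddd\n".getBytes(StandardCharsets.UTF_8);
        System.out.println(countWhole(data));       // 4 records in total
        System.out.println(countPerBlock(data, 7)); // 2: boundary records lost
        System.out.println(countPerBlock(data, 6)); // 1: a different undercount
    }
}
{code}

A correct reader instead finishes the record that straddles the boundary (reading past the block end) and only then hands off to the next split, so every record is counted exactly once regardless of block size.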
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org