Posted to issues@spark.apache.org by "James Greenwood (JIRA)" <ji...@apache.org> on 2015/07/06 16:29:04 UTC
[jira] [Created] (SPARK-8842) Spark SQL - Insert into table Issue
James Greenwood created SPARK-8842:
--------------------------------------
Summary: Spark SQL - Insert into table Issue
Key: SPARK-8842
URL: https://issues.apache.org/jira/browse/SPARK-8842
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 1.4.0
Reporter: James Greenwood
I am running Spark 1.4 and currently experiencing an issue when inserting data into a table. The data is loaded into an initial table, then selected from that table, processed, and inserted into a second table. The issue is that some of the data goes missing from the second table when running in a multi-worker configuration (a master, a worker on the master, and a worker on a different host).
I have narrowed the problem down to the insert into the second table. A minimal process to reproduce the problem is below.
Generate a file (for example /home/spark/test) with the numbers 1 to 50 on separate lines.
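The input file from the step above can be generated with a one-liner; this sketch writes to the current directory rather than /home/spark/test (the path used in the report) so it is self-contained:

```shell
# Generate the input file: the numbers 1 to 50, one per line.
# The report loads it from /home/spark/test; adjust the path as needed.
seq 1 50 > test
wc -l < test   # 50
```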
spark-sql --master spark://spark-master:7077 --hiveconf hive.metastore.warehouse.dir=/spark
(/spark is shared between all hosts)
create table test(field string);
load data inpath '/home/spark/test' into table test;
create table processed(field string);
from test insert into table processed select *;
select * from processed;
The result from the final select does not contain all the numbers 1 to 50.
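One way to see exactly which rows went missing is to diff the query result against the expected list. This is a sketch only: the spark-sql step (shown commented out) assumes the same --master and --hiveconf settings as above and a running cluster, so only the expected-side preparation runs here:

```shell
# Build the expected result: 1..50, sorted the same way the actual
# output will be sorted.
seq 1 50 | sort > expected.txt
wc -l < expected.txt   # 50

# With the cluster from the report available, the actual side would be:
#   spark-sql --master spark://spark-master:7077 \
#     --hiveconf hive.metastore.warehouse.dir=/spark \
#     -e 'select * from processed;' | sort > actual.txt
#   diff expected.txt actual.txt
# Lines prefixed '<' in the diff are rows missing from the processed table.
```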
I have also run the above example in some different configurations:
- When there is just one worker, running on the master. The result of the final select is the rows 1-50, i.e. all data as expected.
- When there is just one worker, running on a host which is not the master. The final select returns no rows.
No errors are logged in the log files.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)