Posted to issues@spark.apache.org by "Joseph Batchik (JIRA)" <ji...@apache.org> on 2015/07/30 22:12:04 UTC

[jira] [Created] (SPARK-9486) Add aliasing to data sources to allow external packages to register themselves with Spark

Joseph Batchik created SPARK-9486:
-------------------------------------

             Summary: Add aliasing to data sources to allow external packages to register themselves with Spark
                 Key: SPARK-9486
                 URL: https://issues.apache.org/jira/browse/SPARK-9486
             Project: Spark
          Issue Type: Improvement
          Components: SQL
            Reporter: Joseph Batchik
            Priority: Minor


Currently Spark allows users to use external data sources like spark-avro, spark-csv, etc. by specifying the data source's full class name:

{code:java}
sqlContext.read.format("com.databricks.spark.avro").load(path)
{code}

Typing in a fully qualified class name is error-prone, so it would be nice to let external packages register themselves with Spark, allowing users to do something like:

{code:java}
sqlContext.read.format("avro").load(path)
{code}

This would make the external data source packages follow the same convention as the built-in data sources (parquet, json, jdbc, etc.).

This could be accomplished by using Java's ServiceLoader mechanism.
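A minimal sketch of what ServiceLoader-based alias resolution could look like; the {{DataSourceRegister}} interface and {{DataSourceResolver}} class below are hypothetical names for illustration, not existing Spark API:

```java
import java.util.ServiceLoader;

public class DataSourceResolver {

    // Hypothetical registration interface that external packages would
    // implement and advertise via a META-INF/services entry in their jar.
    public interface DataSourceRegister {
        String shortName();
    }

    // Resolve an alias like "avro" to a provider class: first ask
    // ServiceLoader for registered providers (it scans META-INF/services
    // entries on the classpath), then fall back to treating the alias as
    // a fully qualified class name, which preserves today's behavior.
    public static Class<?> resolve(String alias) throws ClassNotFoundException {
        for (DataSourceRegister provider : ServiceLoader.load(DataSourceRegister.class)) {
            if (provider.shortName().equalsIgnoreCase(alias)) {
                return provider.getClass();
            }
        }
        return Class.forName(alias);
    }
}
```

With this scheme, a package like spark-avro would only need to ship a service file naming its provider class; users with no such packages on the classpath would see no behavior change, since unmatched aliases still go through {{Class.forName}}.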



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org