Posted to dev@sqoop.apache.org by Qian Xu <sx...@googlemail.com> on 2014/10/21 03:58:48 UTC

Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/
-----------------------------------------------------------

Review request for Sqoop.


Bugs: SQOOP-1588
    https://issues.apache.org/jira/browse/SQOOP-1588


Repository: sqoop-sqoop2


Description
-------

Create a basic Kite connector that can write data (e.g. from a JDBC connection) to HDFS.

The scope is defined as follows:
* Destination: HDFS
* File Format: Avro, Parquet, and CSV.
* Compression Codec: Use default
* Partitioner Strategy: Not supported
* Column Mapping: Not supported


Diffs
-----

  connector/connector-kite/pom.xml PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
  connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
  connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestExecutor.java PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestLoader.java PRE-CREATION 
  connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
  connector/pom.xml e98a0fc 
  execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
  pom.xml f25a29f 
  server/pom.xml 67baaa5 
  test/pom.xml 7a80710 

Diff: https://reviews.apache.org/r/26963/diff/


Testing
-------

New unittests included. All passed.


Thanks,

Qian Xu


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Veena Basavaraj <vb...@cloudera.com>.

> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java, line 35
> > <https://reviews.apache.org/r/26963/diff/1/?file=726929#file726929line35>
> >
> >     headsup: this will need to change once 1551 is committed. Jarcec is still reviewing it.
> 
> Qian Xu wrote:
>     Added a TODO marker for that

The patch is in; if you rebase and have issues, let me know.


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java, line 37
> > <https://reviews.apache.org/r/26963/diff/1/?file=726930#file726930line37>
> >
> >     can we not call it KiteToConnector.
> >     
> >     I saw this convention in Hbase and Hdfs.
> 
> Qian Xu wrote:
>     KiteConnector is correct. A connector will have FROM and TO. A FROM has FromInitializer Partitioner Extractor and a FromDestroyer. A TO has ToInitializer Loader and ToDestroyer.

makes sense if we are adding the FROM part in another patch


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java, line 40
> > <https://reviews.apache.org/r/26963/diff/1/?file=726930#file726930line40>
> >
> >     will the initializer/destroyer be different for FROM? 
> >     
> >     What is the plan for FROM? If I understand correctly this will not have a From is it?
> >     
> >     If that is the case, then why not call this KiteToConnector. Also please add javadoc and add notes that is here, would be useful to read what this connector does and does not support.
> >     
> >     
> >     Destination: HDFS
> >     File Format: Avro Parquet and CSV.
> >     Compression Codec: Use default
> >     Partitioner Strategy: Not supported
> >     Column Mapping: Not supported
> 
> Qian Xu wrote:
>     KiteConnector will have both FROM and TO. FROM will be added in another JIRA. In this patch, any calls of FROM will raise exception with text "Not implemented".

Please also add javadocs on the Kite connector features.


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java, line 28
> > <https://reviews.apache.org/r/26963/diff/1/?file=726932#file726932line28>
> >
> >     can we just make the constants a interface? It is pretty much a standard pattern to use if we need to extend some common constants in a sub entitiy.
> >     
> >     
> >     why do we need constructor for a class that holds static final strings?
> >     
> >     Might be another ticket, but it is one line change and 2 lines to delete.
> >     
> >     public interface Constants {
> >     
> >       /**
> >        * All job related configuration is prefixed with this:
> >        * <tt>org.apache.sqoop.job.</tt>
> >        */
> >       public static final String PREFIX_CONFIG = "org.apache.sqoop.job.";
> >     
> >       public static final String JOB_ETL_NUMBER_PARTITIONS = PREFIX_CONFIG
> >           + "etl.number.partitions";
> >     
> >       public static final String JOB_ETL_FIELD_NAMES = PREFIX_CONFIG
> >           + "etl.field.names";
> >     
> >       public static final String JOB_ETL_OUTPUT_DIRECTORY = PREFIX_CONFIG
> >           + "etl.output.directory";
> >     
> >       public static final String JOB_ETL_INPUT_DIRECTORY = PREFIX_CONFIG
> >           + "etl.input.directory";
> >     
> >     
> >     }
> 
> Qian Xu wrote:
>     Googled a bit, it is considered as bad practice so far. I'm neutral for that, because it really simplifies code. Better open a new clean-up jira.

Can you post a link? I'd like to read it.


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java, line 31
> > <https://reviews.apache.org/r/26963/diff/1/?file=726934#file726934line31>
> >
> >     Q: is not the target here assumed to be always hdfs? can this be used to write to anything else?
> 
> Qian Xu wrote:
>     Not limited to HDFS. Here is an useful link http://kitesdk.org/docs/current/guide/URIs/

thanks, this makes sense!


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java, line 83
> > <https://reviews.apache.org/r/26963/diff/1/?file=726936#file726936line83>
> >
> >     why cant be we return null ?
> 
> Qian Xu wrote:
>     Good point. I checked the code, somewhere else (e.g. Matcher) will check schema with `schema.isEmpty()`. If it returns `null` here, any check schema methods should be changed to `schema == null || schema.isEmpty()`.

We should guard the matching code; it needs to be defensive. How can we expect that everyone will return a new Schema? My first gut reaction was to return null.

Can you file an issue for that?


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java, line 24
> > <https://reviews.apache.org/r/26963/diff/1/?file=726939#file726939line24>
> >
> >     can we please move thi class to a sdk? and resue it in all connectors?
> >     
> >     also nitpick please rename link to linkconfig.
> >     
> >     link has a very different meaning in sqoop

please file a ticket for me, I can move it


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java, line 21
> > <https://reviews.apache.org/r/26963/diff/1/?file=726940#file726940line21>
> >
> >     can we be more specific on supported formats for hdfs via kite sdk, since this will evolve as kote sdk evolves with newer formats.
> >     
> >     Why cant we use a enum class from Kite itself if there is one
> 
> Qian Xu wrote:
>     Unfortunately, ConfigUtils supports String Map Integer Boolean and Enum only. Kite's Format is actually a class.

I am not sure I understand. I mean, why not use the Kite SDK Format class for Avro, CSV, Parquet... if those are the standard formats for Kite?

Why do we need another class here?


- Veena


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review57536
-----------------------------------------------------------


On Oct. 21, 2014, 12:07 a.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 21, 2014, 12:07 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Qian Xu <sx...@googlemail.com>.

> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java, line 24
> > <https://reviews.apache.org/r/26963/diff/1/?file=726939#file726939line24>
> >
> >     can we please move thi class to a sdk? and resue it in all connectors?
> >     
> >     also nitpick please rename link to linkconfig.
> >     
> >     link has a very different meaning in sqoop
> 
> Veena Basavaraj wrote:
>     please file a ticket for me, I can move it

LinkConfiguration differs from connector to connector. Are you sure you can reuse it?


- Qian


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review57536
-----------------------------------------------------------


On Oct. 21, 2014, 3:07 p.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 21, 2014, 3:07 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Veena Basavaraj <vb...@cloudera.com>.

> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java, line 66
> > <https://reviews.apache.org/r/26963/diff/1/?file=726930#file726930line66>
> >
> >     Instead of exception, we could encourage
> >     
> >     We should use the NullConfiguration class in this case, saying this connector does not support FROM
> >     
> >     just a thought.
> 
> Qian Xu wrote:
>     Good thought. From the name, I guess NullConfiguration means nothing to configure. Actually FROM has something to configure. For short-term wait, throw an exception is IMHO acceptable.

Example of using EmptyConfiguration: https://reviews.apache.org/r/27004/. Jarcec committed it as well.


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java, line 71
> > <https://reviews.apache.org/r/26963/diff/1/?file=726930#file726930line71>
> >
> >     does even default make sense here? We have a enum for direction and it is already typed, so why will there ever be the the default case?
> >     
> >     public enum Direction {
> >       FROM,
> >       TO
> >     }
> 
> Qian Xu wrote:
>     `default` here is used to suppress a compiler error, otherwise compiler will complain about missing a `return` statement.

You could return null as well; this code path should not happen in reality. I would not want to throw exceptions when there is no real use case for them.


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java, line 83
> > <https://reviews.apache.org/r/26963/diff/1/?file=726936#file726936line83>
> >
> >     why cant be we return null ?
> 
> Qian Xu wrote:
>     Good point. I checked the code, somewhere else (e.g. Matcher) will check schema with `schema.isEmpty()`. If it returns `null` here, any check schema methods should be changed to `schema == null || schema.isEmpty()`.
> 
> Veena Basavaraj wrote:
>     we should guard the matched code, it needs to be defensive, how can we expect that everyine will return a new Schema, my first gut was to return null.
>     
>     can you file a issue for that?

Please file an issue for this and assign it to me.


> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java, line 24
> > <https://reviews.apache.org/r/26963/diff/1/?file=726939#file726939line24>
> >
> >     can we please move thi class to a sdk? and resue it in all connectors?
> >     
> >     also nitpick please rename link to linkconfig.
> >     
> >     link has a very different meaning in sqoop
> 
> Veena Basavaraj wrote:
>     please file a ticket for me, I can move it
> 
> Qian Xu wrote:
>     LinkConfiguration is per connector different. Are you sure you can reuse them?

If the class has the exact same field name, linkConfig, I am pretty sure we can reuse it. If they have different field names, it won't be possible; you are right.

But in your case it is just one field, and you can return the common class in your connector. It should work.
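
For illustration, roughly the shape I mean (a sketch only; the Sqoop 2 config annotations are left out, and the field inside LinkConfig is just an assumed example):

// Sketch of the thin per-connector LinkConfiguration wrapper under discussion.
// The real classes carry Sqoop 2 model annotations, which are omitted here.
public class LinkConfigurationSketch {
  public final LinkConfigSketch linkConfig = new LinkConfigSketch();

  public static class LinkConfigSketch {
    // Connector-specific link inputs live here; this authority field is only an
    // assumed example, and other connectors would carry different fields.
    public String authority;
  }
}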


- Veena


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review57536
-----------------------------------------------------------


On Oct. 23, 2014, 1:43 a.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 23, 2014, 1:43 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Veena Basavaraj <vb...@cloudera.com>.

> On Oct. 20, 2014, 9:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java, line 21
> > <https://reviews.apache.org/r/26963/diff/1/?file=726940#file726940line21>
> >
> >     can we be more specific on supported formats for hdfs via kite sdk, since this will evolve as kote sdk evolves with newer formats.
> >     
> >     Why cant we use a enum class from Kite itself if there is one
> 
> Qian Xu wrote:
>     Unfortunately, ConfigUtils supports String Map Integer Boolean and Enum only. Kite's Format is actually a class.
> 
> Veena Basavaraj wrote:
>     I am not sure I understand, I mean why not use the Kite sdk format class for avro, csv, parquet... if those are standard formats for the kite
>     
>     why doe we need another class here.

I saw this in the HDFS connector.

Can we have a common enum in the connector SDK? Why do we need duplication in every connector?


/**
 * Various supported formats on disk
 */
public enum ToFormat {
  /**
   * Comma separated text file
   */
  TEXT_FILE,

  /**
   * Sequence file
   */
  SEQUENCE_FILE,
}


- Veena


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review57536
-----------------------------------------------------------


On Oct. 21, 2014, 12:07 a.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 21, 2014, 12:07 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Qian Xu <sx...@googlemail.com>.

> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java, line 40
> > <https://reviews.apache.org/r/26963/diff/1/?file=726930#file726930line40>
> >
> >     will the initializer/destroyer be different for FROM? 
> >     
> >     What is the plan for FROM? If I understand correctly this will not have a From is it?
> >     
> >     If that is the case, then why not call this KiteToConnector. Also please add javadoc and add notes that is here, would be useful to read what this connector does and does not support.
> >     
> >     
> >     Destination: HDFS
> >     File Format: Avro Parquet and CSV.
> >     Compression Codec: Use default
> >     Partitioner Strategy: Not supported
> >     Column Mapping: Not supported

KiteConnector will have both FROM and TO. FROM will be added in another JIRA. In this patch, any call to FROM will raise an exception with the text "Not implemented".


> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java, line 37
> > <https://reviews.apache.org/r/26963/diff/1/?file=726930#file726930line37>
> >
> >     can we not call it KiteToConnector.
> >     
> >     I saw this convention in Hbase and Hdfs.

KiteConnector is correct. A connector will have both FROM and TO. A FROM has a FromInitializer, Partitioner, Extractor, and a FromDestroyer. A TO has a ToInitializer, Loader, and a ToDestroyer.
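
Roughly, the split looks like this (an illustrative sketch with made-up helper names, not the actual Sqoop 2 SPI classes):

import java.util.Arrays;
import java.util.List;

// Illustrative sketch only: these are made-up names, not the real Sqoop 2 SPI.
// The point is which lifecycle classes each direction owns.
enum Direction { FROM, TO }

class KiteConnectorOutline {
  List<String> lifecycleClasses(Direction direction) {
    switch (direction) {
      case TO:    // shipped in this patch
        return Arrays.asList("KiteToInitializer", "KiteLoader", "KiteToDestroyer");
      case FROM:  // follow-up JIRA; calling FROM currently raises "Not implemented"
        return Arrays.asList("FromInitializer", "Partitioner", "Extractor", "FromDestroyer");
      default:
        throw new IllegalArgumentException("Unknown direction: " + direction);
    }
  }
}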


> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java, line 66
> > <https://reviews.apache.org/r/26963/diff/1/?file=726930#file726930line66>
> >
> >     Instead of exception, we could encourage
> >     
> >     We should use the NullConfiguration class in this case, saying this connector does not support FROM
> >     
> >     just a thought.

Good thought. From the name, I guess NullConfiguration means there is nothing to configure. Actually, FROM does have something to configure. For the short term, throwing an exception is IMHO acceptable.


> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java, line 31
> > <https://reviews.apache.org/r/26963/diff/1/?file=726934#file726934line31>
> >
> >     Q: is not the target here assumed to be always hdfs? can this be used to write to anything else?

Not limited to HDFS. Here is a useful link: http://kitesdk.org/docs/current/guide/URIs/
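
A rough illustration of such dataset URIs (the exact URI shapes and the Datasets.load signature below are taken from my reading of the Kite docs, so treat them as assumptions):

import org.apache.avro.generic.GenericRecord;
import org.kitesdk.data.Dataset;
import org.kitesdk.data.Datasets;

public class DatasetUriExample {
  public static void main(String[] args) {
    // The scheme in the URI picks the storage backend, so the same loader code
    // is not tied to HDFS:
    //   dataset:hdfs:/datasets/sqoop/users   -> files on HDFS
    //   dataset:file:/tmp/datasets/users     -> local filesystem (handy for tests)
    //   dataset:hive:default/users           -> a Hive-managed table
    Dataset<GenericRecord> users =
        Datasets.load("dataset:hdfs:/datasets/sqoop/users", GenericRecord.class);
    System.out.println(users.getDescriptor());
  }
}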


> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java, line 83
> > <https://reviews.apache.org/r/26963/diff/1/?file=726936#file726936line83>
> >
> >     why cant be we return null ?

Good point. I checked the code; elsewhere (e.g. Matcher) the schema is checked with `schema.isEmpty()`. If we returned `null` here, every schema check would have to change to `schema == null || schema.isEmpty()`.
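
To make the trade-off concrete, a tiny sketch with a hypothetical stand-in Schema class (not Sqoop's real one):

import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for the schema type discussed above.
class Schema {
  private final List<String> columns = new ArrayList<String>();
  boolean isEmpty() { return columns.isEmpty(); }
}

class SchemaChecks {
  // Current convention: callers (e.g. Matcher) assume a non-null schema.
  static boolean hasColumns(Schema schema) {
    return !schema.isEmpty();                      // NPE if schema were null
  }
  // What every check would need to become if initializers could return null:
  static boolean hasColumnsDefensive(Schema schema) {
    return schema != null && !schema.isEmpty();
  }
}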


> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java, line 35
> > <https://reviews.apache.org/r/26963/diff/1/?file=726929#file726929line35>
> >
> >     headsup: this will need to change once 1551 is committed. Jarcec is still reviewing it.

Added a TODO marker for that


> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java, line 71
> > <https://reviews.apache.org/r/26963/diff/1/?file=726930#file726930line71>
> >
> >     does even default make sense here? We have a enum for direction and it is already typed, so why will there ever be the the default case?
> >     
> >     public enum Direction {
> >       FROM,
> >       TO
> >     }

`default` here is used to suppress a compiler error; otherwise, the compiler will complain about a missing `return` statement.
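
A minimal illustration (not the actual KiteConnector code): without the default branch, javac reports a missing return statement, because it does not treat the enum switch as exhaustive:

enum Direction { FROM, TO }

class SwitchSketch {
  // Illustrative only; the return values are placeholders.
  static String describe(Direction direction) {
    switch (direction) {
      case TO:
        return "ToJobConfiguration";
      case FROM:
        throw new UnsupportedOperationException("Not implemented");
      default:
        // Unreachable for well-formed enum values, but without it javac cannot
        // prove that every path returns or throws.
        throw new IllegalArgumentException("Unknown direction: " + direction);
    }
  }
}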


> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java, line 31
> > <https://reviews.apache.org/r/26963/diff/1/?file=726931#file726931line31>
> >
> >     what does destination mean, can we just say TO ?

Text changed to "Dataset exists already"


> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java, line 28
> > <https://reviews.apache.org/r/26963/diff/1/?file=726932#file726932line28>
> >
> >     can we just make the constants a interface? It is pretty much a standard pattern to use if we need to extend some common constants in a sub entitiy.
> >     
> >     
> >     why do we need constructor for a class that holds static final strings?
> >     
> >     Might be another ticket, but it is one line change and 2 lines to delete.
> >     
> >     public interface Constants {
> >     
> >       /**
> >        * All job related configuration is prefixed with this:
> >        * <tt>org.apache.sqoop.job.</tt>
> >        */
> >       public static final String PREFIX_CONFIG = "org.apache.sqoop.job.";
> >     
> >       public static final String JOB_ETL_NUMBER_PARTITIONS = PREFIX_CONFIG
> >           + "etl.number.partitions";
> >     
> >       public static final String JOB_ETL_FIELD_NAMES = PREFIX_CONFIG
> >           + "etl.field.names";
> >     
> >       public static final String JOB_ETL_OUTPUT_DIRECTORY = PREFIX_CONFIG
> >           + "etl.output.directory";
> >     
> >       public static final String JOB_ETL_INPUT_DIRECTORY = PREFIX_CONFIG
> >           + "etl.input.directory";
> >     
> >     
> >     }

I googled a bit; the constant-interface pattern is generally considered bad practice these days. I'm neutral on it, because it really does simplify code. Better to open a new clean-up JIRA.
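
For reference, the usual alternative to a constant interface is a final class with a private constructor, which also answers the constructor question; a sketch with placeholder constant names:

public final class ConstantsSketch {
  // Placeholder values for illustration; the real constants live in KiteConstants.java.
  public static final String PREFIX = "org.apache.sqoop.connector.kite.";
  public static final String CONF_URI = PREFIX + "dataset.uri";

  private ConstantsSketch() {
    // No instances: the class exists only to group constants, so the constructor
    // is hidden instead of exposing the constants through an interface.
  }
}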


> On Oct. 21, 2014, 12:14 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java, line 21
> > <https://reviews.apache.org/r/26963/diff/1/?file=726940#file726940line21>
> >
> >     can we be more specific on supported formats for hdfs via kite sdk, since this will evolve as kote sdk evolves with newer formats.
> >     
> >     Why cant we use a enum class from Kite itself if there is one

Unfortunately, ConfigUtils supports only String, Map, Integer, Boolean, and Enum. Kite's Format is actually a class.
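
So the enum has to stay on the connector side and be mapped to Kite at the edge. A sketch of that mapping, assuming Kite exposes org.kitesdk.data.Format through the Formats.AVRO/PARQUET/CSV constants (worth double-checking against the Kite version we pull in):

import org.kitesdk.data.Format;
import org.kitesdk.data.Formats;

// Connector-side enum (something ConfigUtils can handle) mapped to Kite's Format class.
public enum ToFormatSketch {
  AVRO, PARQUET, CSV;

  public Format toKiteFormat() {
    switch (this) {
      case AVRO:    return Formats.AVRO;
      case PARQUET: return Formats.PARQUET;
      case CSV:     return Formats.CSV;
      default:      throw new IllegalArgumentException("Unsupported format: " + this);
    }
  }
}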


- Qian


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review57536
-----------------------------------------------------------


On Oct. 21, 2014, 3:07 p.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 21, 2014, 3:07 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Veena Basavaraj <vb...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review57536
-----------------------------------------------------------


Love it; overall a good start. Please see if we can add tests for the initializer/destroyer code as well. There is logic in them, just as in the loader/executor!


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java
<https://reviews.apache.org/r/26963/#comment98267>

    headsup: this will need to change once 1551 is committed. Jarcec is still reviewing it.



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java
<https://reviews.apache.org/r/26963/#comment98268>

    Please use the code in util for this. Are we sure we don't need any tests for this?
    
    I created one ticket, since I don't see tests for any of the existing connectors' upgrade logic.
    
    If you are up for it,
    
    https://issues.apache.org/jira/browse/SQOOP-1595



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98269>

    Can we not call it KiteToConnector?
    
    I saw this convention in HBase and HDFS.



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98270>

    will the initializer/destroyer be different for FROM? 
    
    What is the plan for FROM? If I understand correctly, this will not have a FROM, is it?
    
    If that is the case, then why not call this KiteToConnector? Also, please add javadoc and notes here; it would be useful to read what this connector does and does not support.
    
    Destination: HDFS
    File Format: Avro, Parquet, and CSV.
    Compression Codec: Use default
    Partitioner Strategy: Not supported
    Column Mapping: Not supported



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98272>

    Instead of an exception, we could encourage an alternative: use the NullConfiguration class in this case, saying this connector does not support FROM.
    
    Just a thought.



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98271>

    Does default even make sense here? We have an enum for direction and it is already typed, so why will there ever be the default case?
    
    public enum Direction {
      FROM,
      TO
    }



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment98274>

    heads up this will change



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java
<https://reviews.apache.org/r/26963/#comment98276>

    What does destination mean? Can we just say TO?



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java
<https://reviews.apache.org/r/26963/#comment98282>

    Can we just make the constants an interface? It is pretty much a standard pattern to use if we need to extend some common constants in a sub-entity.
    
    Why do we need a constructor for a class that holds static final strings?
    
    Might be another ticket, but it is a one-line change and 2 lines to delete.
    
    public interface Constants {
    
      /**
       * All job related configuration is prefixed with this:
       * <tt>org.apache.sqoop.job.</tt>
       */
      public static final String PREFIX_CONFIG = "org.apache.sqoop.job.";
    
      public static final String JOB_ETL_NUMBER_PARTITIONS = PREFIX_CONFIG
          + "etl.number.partitions";
    
      public static final String JOB_ETL_FIELD_NAMES = PREFIX_CONFIG
          + "etl.field.names";
    
      public static final String JOB_ETL_OUTPUT_DIRECTORY = PREFIX_CONFIG
          + "etl.output.directory";
    
      public static final String JOB_ETL_INPUT_DIRECTORY = PREFIX_CONFIG
          + "etl.input.directory";
    
    }



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java
<https://reviews.apache.org/r/26963/#comment98296>

    Q: is not the target here assumed to be always hdfs? can this be used to write to anything else?



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java
<https://reviews.apache.org/r/26963/#comment98297>

    Can we add a few comments on why this is done in this step? It is unusual to be doing this kind of logic in the destroyer.



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java
<https://reviews.apache.org/r/26963/#comment98298>

    Why can't we return null?



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java
<https://reviews.apache.org/r/26963/#comment98299>

    Can we please move this class to the SDK and reuse it in all connectors?
    
    Also, nitpick: please rename link to linkConfig.
    
    Link has a very different meaning in Sqoop.



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java
<https://reviews.apache.org/r/26963/#comment98300>

    Can we be more specific about the supported formats for HDFS via the Kite SDK, since this will evolve as the Kite SDK evolves with newer formats?
    
    Why can't we use an enum class from Kite itself, if there is one?



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java
<https://reviews.apache.org/r/26963/#comment98301>

    Please add a ticket for feature TODOs and add the number here if you intend to address this later.



connector/connector-kite/src/main/resources/kite-connector-config.properties
<https://reviews.apache.org/r/26963/#comment98302>

    nitpick rename to connector configs



connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestExecutor.java
<https://reviews.apache.org/r/26963/#comment98303>

    TestKiteExecutor: please rename the test class to match the class it is testing, for consistency.



connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestLoader.java
<https://reviews.apache.org/r/26963/#comment98304>

    ditto as above



connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestLoader.java
<https://reviews.apache.org/r/26963/#comment98305>

    Nitpick: please do not use job and link in cases where it really means configs. They are different.


- Veena Basavaraj


On Oct. 20, 2014, 6:58 p.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 20, 2014, 6:58 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Abraham Elmahrek <ab...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review58150
-----------------------------------------------------------



pom.xml
<https://reviews.apache.org/r/26963/#comment99069>

    I think the Kite dependencies need to be added to the dependencyManagement section.


- Abraham Elmahrek


On Oct. 23, 2014, 8:43 a.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 23, 2014, 8:43 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Qian Xu <sx...@googlemail.com>.

> On Nov. 1, 2014, 10:44 a.m., Gwen Shapira wrote:
> > Do we want to add Hive and HBase as possible destinations as well? I understood that Kite supports it relatively easily.

Sorry for ignoring the comment. HBase will be handled and tested in another JIRA, SQOOP-1744.


- Qian


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review59454
-----------------------------------------------------------


On Nov. 17, 2014, 3:31 p.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Nov. 17, 2014, 3:31 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/FileFormat.java PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/JarUtil.java PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   pom.xml 233d3ce 
>   server/pom.xml 4a5eb5e 
>   test/pom.xml 2dbb8c5 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Gwen Shapira <gs...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review59454
-----------------------------------------------------------


Do we want to add Hive and HBase as possible destinations as well? I understood that Kite supports them relatively easily.

- Gwen Shapira


On Oct. 31, 2014, 4:48 a.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 31, 2014, 4:48 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Abraham Elmahrek <ab...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review61881
-----------------------------------------------------------

Ship it!


I'm +1. Let's address the below comment in a separate Jira as well.


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java
<https://reviews.apache.org/r/26963/#comment103806>

    Should include opencsv, parquet, and avro jars as well.
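    
    For illustration, a rough sketch of how the initializer could declare those jars, assuming Sqoop's ClassUtils.jarForClass helper is available; the classes used to locate each jar are representative choices, not necessarily what the patch uses:
    
        // Hypothetical sketch, not the actual KiteToInitializer code.
        import java.util.LinkedList;
        import java.util.List;
        import org.apache.sqoop.utils.ClassUtils;
        
        public final class KiteJarsSketch {
          // Jars the TO side needs on the job classpath, located via one class from each library.
          public static List<String> getJars() {
            List<String> jars = new LinkedList<String>();
            jars.add(ClassUtils.jarForClass("org.kitesdk.data.Datasets"));             // Kite SDK
            jars.add(ClassUtils.jarForClass("org.apache.avro.generic.GenericRecord")); // Avro
            jars.add(ClassUtils.jarForClass("parquet.hadoop.ParquetWriter"));          // Parquet
            jars.add(ClassUtils.jarForClass("au.com.bytecode.opencsv.CSVWriter"));     // opencsv
            return jars;
          }
        }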


- Abraham Elmahrek


On Nov. 17, 2014, 7:31 a.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Nov. 17, 2014, 7:31 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/FileFormat.java PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/JarUtil.java PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   pom.xml 233d3ce 
>   server/pom.xml 4a5eb5e 
>   test/pom.xml 2dbb8c5 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Qian Xu <sx...@googlemail.com>.

> On Nov. 18, 2014, 9:47 a.m., Abraham Elmahrek wrote:
> > I have a couple of high level comments and an exception.
> > 
> > It seems like the LinkConfiguration and the ToJobConfiguration are holding the wrong pieces? For example, the dataset URI should be part of the LinkConfiguration and the data format (Avro, CSV, etc.) should be part of the job? Or perhaps they should all be part of the link configuration?
> > 
> > Also, I think there should be a validation step for the HDFS dataset URI. We should check for the "hdfs" type and require a host. It's fine for now for this to be a runtime exception, but let's log a separate Jira?
> > 
> >     Exception has occurred during processing command 
> > Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0001:Server has returned exception
> > Stack trace: 
> >          at  org.apache.sqoop.client.request.ResourceRequest (ResourceRequest.java:115) 
> >          at  org.apache.sqoop.client.request.ResourceRequest (ResourceRequest.java:148) 
> >          at  org.apache.sqoop.client.request.JobResourceRequest (JobResourceRequest.java:63) 
> >          at  org.apache.sqoop.client.request.SqoopResourceRequests (SqoopResourceRequests.java:116) 
> >          at  org.apache.sqoop.client.SqoopClient (SqoopClient.java:406) 
> >          at  org.apache.sqoop.shell.CreateJobFunction (CreateJobFunction.java:99) 
> >          at  org.apache.sqoop.shell.CreateJobFunction (CreateJobFunction.java:64) 
> >          at  org.apache.sqoop.shell.SqoopFunction (SqoopFunction.java:51) 
> >          at  org.apache.sqoop.shell.SqoopCommand (SqoopCommand.java:127) 
> >          at  org.apache.sqoop.shell.SqoopCommand (SqoopCommand.java:103) 
> >          at  org.codehaus.groovy.tools.shell.Command$execute (null:-1) 
> >          at  org.codehaus.groovy.tools.shell.Shell (Shell.groovy:101) 
> >          at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:-1) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
> >          at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
> >          at  java.lang.reflect.Method (Method.java:606) 
> >          at  org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90) 
> >          at  groovy.lang.MetaMethod (MetaMethod.java:233) 
> >          at  groovy.lang.MetaClassImpl (MetaClassImpl.java:1054) 
> >          at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128) 
> >          at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:173) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
> >          at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
> >          at  java.lang.reflect.Method (Method.java:606) 
> >          at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267) 
> >          at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52) 
> >          at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:141) 
> >          at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:121) 
> >          at  org.codehaus.groovy.tools.shell.Shell (Shell.groovy:114) 
> >          at  org.codehaus.groovy.tools.shell.Shell$leftShift$0 (null:-1) 
> >          at  org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:88) 
> >          at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
> >          at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
> >          at  java.lang.reflect.Method (Method.java:606) 
> >          at  org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90) 
> >          at  groovy.lang.MetaMethod (MetaMethod.java:233) 
> >          at  groovy.lang.MetaClassImpl (MetaClassImpl.java:1054) 
> >          at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128) 
> >          at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148) 
> >          at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:100) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
> >          at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
> >          at  java.lang.reflect.Method (Method.java:606) 
> >          at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267) 
> >          at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52) 
> >          at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:137) 
> >          at  org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:57) 
> >          at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
> >          at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
> >          at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
> >          at  java.lang.reflect.Method (Method.java:606) 
> >          at  org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90) 
> >          at  groovy.lang.MetaMethod (MetaMethod.java:233) 
> >          at  groovy.lang.MetaClassImpl (MetaClassImpl.java:1054) 
> >          at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128) 
> >          at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148) 
> >          at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:66) 
> >          at  java_lang_Runnable$run (null:-1) 
> >          at  org.codehaus.groovy.runtime.callsite.CallSiteArray (CallSiteArray.java:42) 
> >          at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:108) 
> >          at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:112) 
> >          at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:463) 
> >          at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:402) 
> >          at  org.apache.sqoop.shell.SqoopShell (SqoopShell.java:128) 
> > Caused by: Exception: java.lang.NullPointerException Message: 
> > Stack trace:
> >          at  java.util.regex.Matcher (Matcher.java:1234) 
> >          at  java.util.regex.Matcher (Matcher.java:308) 
> >          at  java.util.regex.Matcher (Matcher.java:228) 
> >          at  java.util.regex.Pattern (Pattern.java:1088) 
> >          at  org.apache.sqoop.connector.kite.util.InputValidation (InputValidation.java:35) 
> >          at  org.apache.sqoop.connector.kite.configuration.ToJobConfig$ConfigValidator (ToJobConfig.java:38) 
> >          at  org.apache.sqoop.connector.kite.configuration.ToJobConfig$ConfigValidator (ToJobConfig.java:33) 
> >          at  org.apache.sqoop.validation.ConfigValidationRunner (ConfigValidationRunner.java:172) 
> >          at  org.apache.sqoop.validation.ConfigValidationRunner (ConfigValidationRunner.java:140) 
> >          at  org.apache.sqoop.validation.ConfigValidationRunner (ConfigValidationRunner.java:121) 
> >          at  org.apache.sqoop.validation.ConfigValidationRunner (ConfigValidationRunner.java:82) 
> >          at  org.apache.sqoop.model.ConfigUtils (ConfigUtils.java:220) 
> >          at  org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:220) 
> >          at  org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:102) 
> >          at  org.apache.sqoop.server.v1.JobServlet (JobServlet.java:91) 
> >          at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63) 
> >          at  javax.servlet.http.HttpServlet (HttpServlet.java:643) 
> >          at  javax.servlet.http.HttpServlet (HttpServlet.java:723) 
> >          at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290) 
> >          at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206) 
> >          at  org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:392) 
> >          at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:235) 
> >          at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206) 
> >          at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233) 
> >          at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191) 
> >          at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127) 
> >          at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:103) 
> >          at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109) 
> >          at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293) 
> >          at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:861) 
> >          at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:606) 
> >          at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489) 
> >          at  java.lang.Thread (Thread.java:744)

Added https://issues.apache.org/jira/browse/SQOOP-1751


- Qian


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review61857
-----------------------------------------------------------


On Nov. 17, 2014, 3:31 p.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Nov. 17, 2014, 3:31 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/FileFormat.java PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/JarUtil.java PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   pom.xml 233d3ce 
>   server/pom.xml 4a5eb5e 
>   test/pom.xml 2dbb8c5 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Abraham Elmahrek <ab...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review61857
-----------------------------------------------------------


I have a couple of high level comments and an exception.

It seems like the LinkConfiguration and the ToJobConfiguration are holding the wrong pieces? For example, the dataset URI should be part of the LinkConfiguration and the data format (Avro, CSV, etc.) should be part of the job? Or perhaps they should all be part of the link configuration?
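
For illustration, one possible split along those lines (a rough sketch only; the @ConfigClass/@Input annotations follow the Sqoop2 config model and the field names are assumptions, not the actual patch):

    // Hypothetical sketch: dataset URI on the link, file format on the job.
    @ConfigClass
    public class LinkConfig {
      @Input(size = 255)
      public String uri;            // e.g. dataset:hdfs://namenode:8020/datasets/ns/table
    }

    @ConfigClass
    public class ToJobConfig {
      @Input
      public FileFormat fileFormat; // CSV, AVRO or PARQUET
    }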

Also, I think there should be a validation step for the HDFS dataset URI. We should check for the "hdfs" type and require a host. It's fine for now for this to be a runtime exception, but let's log a separate Jira?

    Exception has occurred during processing command 
Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0001:Server has returned exception
Stack trace: 
         at  org.apache.sqoop.client.request.ResourceRequest (ResourceRequest.java:115) 
         at  org.apache.sqoop.client.request.ResourceRequest (ResourceRequest.java:148) 
         at  org.apache.sqoop.client.request.JobResourceRequest (JobResourceRequest.java:63) 
         at  org.apache.sqoop.client.request.SqoopResourceRequests (SqoopResourceRequests.java:116) 
         at  org.apache.sqoop.client.SqoopClient (SqoopClient.java:406) 
         at  org.apache.sqoop.shell.CreateJobFunction (CreateJobFunction.java:99) 
         at  org.apache.sqoop.shell.CreateJobFunction (CreateJobFunction.java:64) 
         at  org.apache.sqoop.shell.SqoopFunction (SqoopFunction.java:51) 
         at  org.apache.sqoop.shell.SqoopCommand (SqoopCommand.java:127) 
         at  org.apache.sqoop.shell.SqoopCommand (SqoopCommand.java:103) 
         at  org.codehaus.groovy.tools.shell.Command$execute (null:-1) 
         at  org.codehaus.groovy.tools.shell.Shell (Shell.groovy:101) 
         at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:-1) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
         at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
         at  java.lang.reflect.Method (Method.java:606) 
         at  org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90) 
         at  groovy.lang.MetaMethod (MetaMethod.java:233) 
         at  groovy.lang.MetaClassImpl (MetaClassImpl.java:1054) 
         at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128) 
         at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:173) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
         at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
         at  java.lang.reflect.Method (Method.java:606) 
         at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267) 
         at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52) 
         at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:141) 
         at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:121) 
         at  org.codehaus.groovy.tools.shell.Shell (Shell.groovy:114) 
         at  org.codehaus.groovy.tools.shell.Shell$leftShift$0 (null:-1) 
         at  org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:88) 
         at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
         at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
         at  java.lang.reflect.Method (Method.java:606) 
         at  org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90) 
         at  groovy.lang.MetaMethod (MetaMethod.java:233) 
         at  groovy.lang.MetaClassImpl (MetaClassImpl.java:1054) 
         at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128) 
         at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148) 
         at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:100) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
         at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
         at  java.lang.reflect.Method (Method.java:606) 
         at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce (PogoMetaMethodSite.java:267) 
         at  org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite (PogoMetaMethodSite.java:52) 
         at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:137) 
         at  org.codehaus.groovy.tools.shell.ShellRunner (ShellRunner.groovy:57) 
         at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:-1) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:-2) 
         at  sun.reflect.NativeMethodAccessorImpl (NativeMethodAccessorImpl.java:57) 
         at  sun.reflect.DelegatingMethodAccessorImpl (DelegatingMethodAccessorImpl.java:43) 
         at  java.lang.reflect.Method (Method.java:606) 
         at  org.codehaus.groovy.reflection.CachedMethod (CachedMethod.java:90) 
         at  groovy.lang.MetaMethod (MetaMethod.java:233) 
         at  groovy.lang.MetaClassImpl (MetaClassImpl.java:1054) 
         at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:128) 
         at  org.codehaus.groovy.runtime.ScriptBytecodeAdapter (ScriptBytecodeAdapter.java:148) 
         at  org.codehaus.groovy.tools.shell.InteractiveShellRunner (InteractiveShellRunner.groovy:66) 
         at  java_lang_Runnable$run (null:-1) 
         at  org.codehaus.groovy.runtime.callsite.CallSiteArray (CallSiteArray.java:42) 
         at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:108) 
         at  org.codehaus.groovy.runtime.callsite.AbstractCallSite (AbstractCallSite.java:112) 
         at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:463) 
         at  org.codehaus.groovy.tools.shell.Groovysh (Groovysh.groovy:402) 
         at  org.apache.sqoop.shell.SqoopShell (SqoopShell.java:128) 
Caused by: Exception: java.lang.NullPointerException Message: 
Stack trace:
         at  java.util.regex.Matcher (Matcher.java:1234) 
         at  java.util.regex.Matcher (Matcher.java:308) 
         at  java.util.regex.Matcher (Matcher.java:228) 
         at  java.util.regex.Pattern (Pattern.java:1088) 
         at  org.apache.sqoop.connector.kite.util.InputValidation (InputValidation.java:35) 
         at  org.apache.sqoop.connector.kite.configuration.ToJobConfig$ConfigValidator (ToJobConfig.java:38) 
         at  org.apache.sqoop.connector.kite.configuration.ToJobConfig$ConfigValidator (ToJobConfig.java:33) 
         at  org.apache.sqoop.validation.ConfigValidationRunner (ConfigValidationRunner.java:172) 
         at  org.apache.sqoop.validation.ConfigValidationRunner (ConfigValidationRunner.java:140) 
         at  org.apache.sqoop.validation.ConfigValidationRunner (ConfigValidationRunner.java:121) 
         at  org.apache.sqoop.validation.ConfigValidationRunner (ConfigValidationRunner.java:82) 
         at  org.apache.sqoop.model.ConfigUtils (ConfigUtils.java:220) 
         at  org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:220) 
         at  org.apache.sqoop.handler.JobRequestHandler (JobRequestHandler.java:102) 
         at  org.apache.sqoop.server.v1.JobServlet (JobServlet.java:91) 
         at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63) 
         at  javax.servlet.http.HttpServlet (HttpServlet.java:643) 
         at  javax.servlet.http.HttpServlet (HttpServlet.java:723) 
         at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290) 
         at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206) 
         at  org.apache.hadoop.security.authentication.server.AuthenticationFilter (AuthenticationFilter.java:392) 
         at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:235) 
         at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206) 
         at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233) 
         at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191) 
         at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127) 
         at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:103) 
         at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109) 
         at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293) 
         at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:861) 
         at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:606) 
         at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489) 
         at  java.lang.Thread (Thread.java:744)
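
For reference, a minimal sketch of the kind of up-front check that would avoid the NullPointerException above (hypothetical helper; names and placement are assumptions, not the actual InputValidation code):

    // Hypothetical validation sketch: require a dataset:hdfs URI with a host.
    import java.net.URI;
    import java.net.URISyntaxException;

    public final class DatasetUriCheck {
      public static void validateHdfsUri(String uri) {
        if (uri == null || !uri.startsWith("dataset:hdfs:")) {
          throw new IllegalArgumentException("Expected a dataset:hdfs: URI, got: " + uri);
        }
        try {
          // Strip the "dataset:" prefix so java.net.URI can parse the hdfs part.
          URI hdfsUri = new URI(uri.substring("dataset:".length()));
          if (hdfsUri.getHost() == null) {
            throw new IllegalArgumentException("HDFS dataset URI must include a host: " + uri);
          }
        } catch (URISyntaxException e) {
          throw new IllegalArgumentException("Malformed dataset URI: " + uri, e);
        }
      }
    }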

- Abraham Elmahrek


On Nov. 17, 2014, 7:31 a.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Nov. 17, 2014, 7:31 a.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/FileFormat.java PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/JarUtil.java PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   pom.xml 233d3ce 
>   server/pom.xml 4a5eb5e 
>   test/pom.xml 2dbb8c5 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Qian Xu <sx...@googlemail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/
-----------------------------------------------------------

(Updated Nov. 17, 2014, 3:31 p.m.)


Review request for Sqoop.


Changes
-------

(1) Updated the code per review comments (thanks, Veena). (2) Refactored the code per recent SPI changes.


Bugs: SQOOP-1588
    https://issues.apache.org/jira/browse/SQOOP-1588


Repository: sqoop-sqoop2


Description
-------

Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.

The scope is defined as follows:
* Destination: HDFS
* File Format: Avro Parquet and CSV.
* Compression Codec: Use default
* Partitioner Strategy: Not supported
* Column Mapping: Not supported


Diffs (updated)
-----

  connector/connector-kite/pom.xml PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
  connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
  connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
  connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
  connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
  connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/FileFormat.java PRE-CREATION 
  connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/JarUtil.java PRE-CREATION 
  connector/pom.xml e98a0fc 
  pom.xml 233d3ce 
  server/pom.xml 4a5eb5e 
  test/pom.xml 2dbb8c5 

Diff: https://reviews.apache.org/r/26963/diff/


Testing
-------

New unittests included. All passed.


Thanks,

Qian Xu


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Veena Basavaraj <vb...@cloudera.com>.

> On Oct. 31, 2014, 3:41 p.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java, line 49
> > <https://reviews.apache.org/r/26963/diff/4/?file=745369#file745369line49>
> >
> >     please add a comment to this and link to the ticket that is open about not doing the merge in destroy. I would even want us to have two methods on destroy. I am going to ping Jarcec on this ticket. Here is the ticket for your review.
> >     https://issues.apache.org/jira/browse/SQOOP-1602
> >     
> >     This will slip through the cracks if we don't have a TODO to revisit. I will update it when the ticket is resolved.

it is https://issues.apache.org/jira/browse/SQOOP-1603

BTW, did you fix https://issues.apache.org/jira/browse/SQOOP-1602? 

What happened to the bug you raised about not splitting the load across 2 loaders in the TO phase?


- Veena


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review59430
-----------------------------------------------------------


On Oct. 30, 2014, 9:48 p.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 30, 2014, 9:48 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Qian Xu <sx...@googlemail.com>.

> On Nov. 1, 2014, 6:41 a.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java, line 137
> > <https://reviews.apache.org/r/26963/diff/4/?file=745367#file745367line137>
> >
> >     Is there a ticket for this, or are you waiting for someone to confirm? Create a ticket and add the details there.

I checked the Hive documentation. There is no 32-character limitation.


> On Nov. 1, 2014, 6:41 a.m., Veena Basavaraj wrote:
> > connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java, line 49
> > <https://reviews.apache.org/r/26963/diff/4/?file=745369#file745369line49>
> >
> >     please add a comment to this and link to the ticket that is open about not doing the merge in destroy. I would even want us to have two methods on destroy. I am going to ping Jarcec on this ticket. Here is the ticket for your review.
> >     https://issues.apache.org/jira/browse/SQOOP-1602
> >     
> >     This will slip through the cracks if we don't have a TODO to revisit. I will update it when the ticket is resolved.
> 
> Veena Basavaraj wrote:
>     it is https://issues.apache.org/jira/browse/SQOOP-1603
>     
>     BTW, did you fix https://issues.apache.org/jira/browse/SQOOP-1602? 
>     
>     what happend to the bug you raised about not splitting the load across 2 loaders in the TO phase?

It has no effect on the result. I'll keep SQOOP-1603 in mind. BTW, SQOOP-1602 is an improvement idea, not an issue; it does not need to be fixed before this gets committed.
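
For context, a rough sketch of what merging a per-loader temporary dataset into the target dataset during destroy could look like with the Kite SDK (the URIs are placeholders and the helper is hypothetical, not necessarily what KiteDatasetExecutor or KiteToDestroyer does):

    // Rough sketch using Kite SDK calls; URIs are placeholders.
    import org.apache.avro.generic.GenericRecord;
    import org.kitesdk.data.Dataset;
    import org.kitesdk.data.DatasetReader;
    import org.kitesdk.data.DatasetWriter;
    import org.kitesdk.data.Datasets;

    public final class MergeSketch {
      public static void merge(String tempUri, String targetUri) {
        Dataset<GenericRecord> temp = Datasets.load(tempUri, GenericRecord.class);
        Dataset<GenericRecord> target = Datasets.load(targetUri, GenericRecord.class);
        DatasetReader<GenericRecord> reader = temp.newReader();
        DatasetWriter<GenericRecord> writer = target.newWriter();
        try {
          // Copy every record from the temporary dataset into the target.
          while (reader.hasNext()) {
            writer.write(reader.next());
          }
        } finally {
          reader.close();
          writer.close();
        }
        // Drop the temporary dataset once its records have been merged.
        Datasets.delete(tempUri);
      }
    }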


- Qian


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review59430
-----------------------------------------------------------


On Nov. 17, 2014, 3:31 p.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Nov. 17, 2014, 3:31 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/FileFormat.java PRE-CREATION 
>   connector/connector-sdk/src/main/java/org/apache/sqoop/connector/common/JarUtil.java PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   pom.xml 233d3ce 
>   server/pom.xml 4a5eb5e 
>   test/pom.xml 2dbb8c5 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Veena Basavaraj <vb...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/#review59430
-----------------------------------------------------------

Ship it!


Please fix the remaining ones. Some might be considered nitpicks, but we have to get to this kind of standard so that someone new joining the team isn't left wondering what everything means and what it does.

If you disagree, please shoot me an email or ping me and we shall work something out together :)


connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment100702>

    Can you add some Javadocs to this, or is it documented somewhere else? I think we all need to be more careful about docs and tests.
    
    I don't mind even linking something off from here.
    
    If you have suggestions, let me know.



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java
<https://reviews.apache.org/r/26963/#comment100703>

    thanks for following this guideline of linking a ticket number.



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java
<https://reviews.apache.org/r/26963/#comment100704>

    Is there a ticket for this, or are you waiting for someone to confirm? Create a ticket and add the details there.



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java
<https://reviews.apache.org/r/26963/#comment100705>

    please add a comment to this and link to the ticket that is open about not doing the merge in destroy. I would even want us to have two methods on destroy. I am going to ping Jarcec on this ticket. Here is the ticket for your review.
    https://issues.apache.org/jira/browse/SQOOP-1602
    
    This will slip through the cracks if we don't have a TODO to revisit. I will update it when the ticket is resolved.



connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java
<https://reviews.apache.org/r/26963/#comment100706>

    Are you thinking of moving this to the SDK so we can share it with other connectors? DRY!
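    
    For instance, a shared enum in connector-sdk could look roughly like this (a sketch; the FileFormat.java that shows up in the updated diff may differ):
    
        package org.apache.sqoop.connector.common;
        
        // Output file formats that several connectors (Kite, HDFS, ...) could share.
        public enum FileFormat {
          CSV,
          AVRO,
          PARQUET
        }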



connector/connector-kite/src/main/resources/connector-configs.properties
<https://reviews.apache.org/r/26963/#comment100707>

    It's called link (I fixed it in generic jdbc this week).



connector/connector-kite/src/main/resources/connector-configs.properties
<https://reviews.apache.org/r/26963/#comment100708>

    Nice examples, man! I love that you pay attention to the details.



connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java
<https://reviews.apache.org/r/26963/#comment100709>

    cool tests.



pom.xml
<https://reviews.apache.org/r/26963/#comment100701>

    nice! I love this library for unit tests


I have

- Veena Basavaraj


On Oct. 30, 2014, 9:48 p.m., Qian Xu wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26963/
> -----------------------------------------------------------
> 
> (Updated Oct. 30, 2014, 9:48 p.m.)
> 
> 
> Review request for Sqoop.
> 
> 
> Bugs: SQOOP-1588
>     https://issues.apache.org/jira/browse/SQOOP-1588
> 
> 
> Repository: sqoop-sqoop2
> 
> 
> Description
> -------
> 
> Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.
> 
> The scope is defined as follows:
> * Destination: HDFS
> * File Format: Avro Parquet and CSV.
> * Compression Codec: Use default
> * Partitioner Strategy: Not supported
> * Column Mapping: Not supported
> 
> 
> Diffs
> -----
> 
>   connector/connector-kite/pom.xml PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
>   connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
>   connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
>   connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
>   connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
>   connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
>   connector/pom.xml e98a0fc 
>   pom.xml f25a29f 
>   server/pom.xml 67baaa5 
>   test/pom.xml 7a80710 
> 
> Diff: https://reviews.apache.org/r/26963/diff/
> 
> 
> Testing
> -------
> 
> New unittests included. All passed.
> 
> 
> Thanks,
> 
> Qian Xu
> 
>


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Qian Xu <sx...@googlemail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/
-----------------------------------------------------------

(Updated Oct. 31, 2014, 12:48 p.m.)


Review request for Sqoop.


Bugs: SQOOP-1588
    https://issues.apache.org/jira/browse/SQOOP-1588


Repository: sqoop-sqoop2


Description
-------

Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.

The scope is defined as follows:
* Destination: HDFS
* File Format: Avro Parquet and CSV.
* Compression Codec: Use default
* Partitioner Strategy: Not supported
* Column Mapping: Not supported
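
For a rough idea of the kind of Kite SDK calls involved, here is a sketch only; it is not code from this patch, and the dataset URI, schema and record values below are made up for illustration.

    // Hypothetical sketch: writing Avro records into an HDFS-backed Kite dataset
    // in Parquet format. The URI, schema and field names are illustrative only.
    import org.apache.avro.Schema;
    import org.apache.avro.SchemaBuilder;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.kitesdk.data.Dataset;
    import org.kitesdk.data.DatasetDescriptor;
    import org.kitesdk.data.DatasetWriter;
    import org.kitesdk.data.Datasets;
    import org.kitesdk.data.Formats;

    public class KiteWriteSketch {
      public static void main(String[] args) {
        // Avro schema describing the records to be written.
        Schema schema = SchemaBuilder.record("example").fields()
            .requiredInt("id")
            .requiredString("name")
            .endRecord();

        // Descriptor picks the on-disk format (Parquet here; Avro also works).
        DatasetDescriptor descriptor = new DatasetDescriptor.Builder()
            .schema(schema)
            .format(Formats.PARQUET)
            .build();

        // The dataset URI is illustrative; a real job points at the target cluster.
        Dataset<GenericRecord> dataset = Datasets.create(
            "dataset:hdfs://namenode:8020/tmp/example", descriptor,
            GenericRecord.class);

        DatasetWriter<GenericRecord> writer = dataset.newWriter();
        try {
          GenericRecord record = new GenericData.Record(schema);
          record.put("id", 1);
          record.put("name", "sample");
          writer.write(record);
        } finally {
          writer.close();
        }
      }
    }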


Diffs (updated)
-----

  connector/connector-kite/pom.xml PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorConstants.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/InputValidation.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
  connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
  connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToDestroyer.java PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteToInitializer.java PRE-CREATION 
  connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
  connector/pom.xml e98a0fc 
  pom.xml f25a29f 
  server/pom.xml 67baaa5 
  test/pom.xml 7a80710 

Diff: https://reviews.apache.org/r/26963/diff/


Testing
-------

New unittests included. All passed.


Thanks,

Qian Xu


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Qian Xu <sx...@googlemail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/
-----------------------------------------------------------

(Updated Oct. 23, 2014, 4:43 p.m.)


Review request for Sqoop.


Bugs: SQOOP-1588
    https://issues.apache.org/jira/browse/SQOOP-1588


Repository: sqoop-sqoop2


Description
-------

Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.

The scope is defined as follows:
* Destination: HDFS
* File Format: Avro Parquet and CSV.
* Compression Codec: Use default
* Partitioner Strategy: Not supported
* Column Mapping: Not supported


Diffs (updated)
-----

  connector/connector-kite/pom.xml PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
  connector/connector-kite/src/main/resources/connector-configs.properties PRE-CREATION 
  connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
  connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
  connector/pom.xml e98a0fc 
  execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
  pom.xml f25a29f 
  server/pom.xml 67baaa5 
  test/pom.xml 7a80710 

Diff: https://reviews.apache.org/r/26963/diff/


Testing
-------

New unittests included. All passed.


Thanks,

Qian Xu


Re: Review Request 26963: SQOOP-1588: TO-side of Kite Connector - Write data to HDFS

Posted by Qian Xu <sx...@googlemail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26963/
-----------------------------------------------------------

(Updated Oct. 21, 2014, 3:07 p.m.)


Review request for Sqoop.


Changes
-------

Left 4 comments open and responded to the remaining 11 items.


Bugs: SQOOP-1588
    https://issues.apache.org/jira/browse/SQOOP-1588


Repository: sqoop-sqoop2


Description
-------

Create a basic Kite connector that can write data (i.e. from a jdbc connection) to HDFS.

The scope is defined as follows:
* Destination: HDFS
* File Format: Avro Parquet and CSV.
* Compression Codec: Use default
* Partitioner Strategy: Not supported
* Column Mapping: Not supported


Diffs (updated)
-----

  connector/connector-kite/pom.xml PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConfigUpgrader.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnector.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConnectorError.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteConstants.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteDatasetExecutor.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteLoader.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToDestroyer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteToInitializer.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/KiteValidator.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/LinkConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToFormat.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfig.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/configuration/ToJobConfiguration.java PRE-CREATION 
  connector/connector-kite/src/main/java/org/apache/sqoop/connector/kite/util/KiteDataTypeUtil.java PRE-CREATION 
  connector/connector-kite/src/main/resources/kite-connector-config.properties PRE-CREATION 
  connector/connector-kite/src/main/resources/sqoopconnector.properties PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteExecutor.java PRE-CREATION 
  connector/connector-kite/src/test/java/org/apache/sqoop/connector/kite/TestKiteLoader.java PRE-CREATION 
  connector/connector-kite/src/test/resources/log4j.properties PRE-CREATION 
  connector/pom.xml e98a0fc 
  execution/mapreduce/src/main/java/org/apache/sqoop/job/mr/SqoopDestroyerExecutor.java b385926 
  pom.xml f25a29f 
  server/pom.xml 67baaa5 
  test/pom.xml 7a80710 

Diff: https://reviews.apache.org/r/26963/diff/


Testing
-------

New unittests included. All passed.


Thanks,

Qian Xu