Posted to user@flink.apache.org by Alberto Ramón <a....@gmail.com> on 2016/10/07 19:38:34 UTC

jdbc.JDBCInputFormat

I want to use createInput + buildJDBCInputFormat to access a database from Scala.

Problem 1:

import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
Error:(25, 37) object jdbc is not a member of package
org.apache.flink.api.java.io
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat

Then, I can't use it:
[image: inline image 1]

I also tried downloading the code from git and recompiling.

Re: jdbc.JDBCInputFormat

Posted by Timo Walther <tw...@apache.org>.
Hi Alberto,

you need to check out the branch and run `mvn clean install` to put this 
version into your local Maven repo; the `pom.xml` of your project should 
then point to Flink version `1.2-SNAPSHOT`.
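
For example, roughly (a sketch; the local branch name `pr-2619` is arbitrary):

    git fetch https://github.com/apache/flink pull/2619/head:pr-2619
    git checkout pr-2619
    mvn clean install -DskipTests   # installs 1.2-SNAPSHOT into your local ~/.m2

and in your project's `pom.xml`:

    <flink.version>1.2-SNAPSHOT</flink.version>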

Timo


On 11/10/16 at 19:47, Alberto Ramón wrote:
> Hello
>
> I downloaded and compiled your branch:
> [image: inline image 3]
>
> And the error is the same:
> [image: inline image 2]
>
> (I tested with SQuirreL and it works OK)
>
>
> If you want any log / test, feel free to contact me.
>
>
>
> Additional info:
>   Scala: version 2.11.8
>
> I created the template project with:
> mvn archetype:generate \
>   -DarchetypeGroupId=org.apache.flink \
>   -DarchetypeArtifactId=flink-quickstart-scala \
>   -DarchetypeVersion=1.1.2 \
>   -DgroupId=org.apache.flink.quickstart \
>   -DartifactId=flink-scala-project \
>   -Dversion=0.1 \
>   -Dpackage=org.apache.flink.quickstart \
>   -DinteractiveMode=false
> And the code is attached:
>
> 2016-10-11 12:01 GMT+02:00 Alberto Ramón <a.ramonportoles@gmail.com>:
>
>     I will check it tonight
>
>     Thanks
>
>     2016-10-11 11:24 GMT+02:00 Timo Walther <twalthr@apache.org>:
>
>         I have opened a PR (https://github.com/apache/flink/pull/2619).
>         Would be great if you could try it and comment whether it
>         solves your problem.
>
>         Timo
>
>         On 10/10/16 at 17:48, Timo Walther wrote:
>>         I could reproduce the error locally. I will prepare a fix for it.
>>
>>         Timo
>>
>>         On 10/10/16 at 11:54, Alberto Ramón wrote:
>>>         It's from June and unassigned   :(
>>>         Is there a workaround?
>>>
>>>         I will try to contact the reporter, Martin Scholl.
>>>
>>>         2016-10-10 11:04 GMT+02:00 Timo Walther <twalthr@apache.org>:
>>>
>>>             I think you already found the correct issue describing
>>>             your problem (FLINK-4108). This should get higher priority.
>>>
>>>             Timo
>>>
>>>             On 09/10/16 at 13:27, Alberto Ramón wrote:
>>>>
>>>>             After solving some issues, I connected to Kylin, but I
>>>>             can't read data.
>>>>
>>>>             import org.apache.flink.api.scala._
>>>>             import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>>             import org.apache.flink.api.table.Row
>>>>             import org.apache.flink.api.table.typeutils.RowTypeInfo
>>>>             import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
>>>>
>>>>             var stringColum: TypeInformation[Int] = createTypeInformation[Int]
>>>>             val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>>>>
>>>>             val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>>>>                .setDrivername("org.apache.kylin.jdbc.Driver")
>>>>                .setDBUrl("jdbc:kylin://172.17.0.2:7070/learn_kylin")
>>>>                .setUsername("ADMIN")
>>>>                .setPassword("KYLIN")
>>>>                .setQuery("select count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt")
>>>>                .setRowTypeInfo(DB_ROWTYPE)
>>>>                .finish()
>>>>
>>>>             val dataset = env.createInput(inputFormat)
>>>>             dataset.print()
>>>>
>>>>             The error is:
>>>>             [image: inline image 1]
>>>>
>>>>
>>>>             (I checked that the queries and config are correct with SQuirreL.)
>>>>             (It isn't a connection problem, because if I turn off the database the error is different: "Reused Connection".)
>>>>
>>>>
>>>>             Can you see a problem in my code? (I found the unsolved issue FLINK-4108; I don't know if it is related.)
>>>>
>>>>             BR, Alberto
>>>>             2016-10-07 21:46 GMT+02:00 Fabian Hueske <fhueske@gmail.com>:
>>>>
>>>>                 As the exception says, the class
>>>>                 org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>>>                 does not exist. You have to do: import
>>>>                 org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>>
>>>>                 There is no Scala implementation of this class, but
>>>>                 you can also use Java classes in Scala.
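>>>>
>>>>                 For example, roughly (a sketch; the builder from the
>>>>                 Java API is used unchanged from Scala):
>>>>
>>>>                     import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>>                     // the Java builder can be called from Scala code as-is
>>>>                     val builder = JDBCInputFormat.buildJDBCInputFormat()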
>>>>                 2016-10-07 21:38 GMT+02:00 Alberto Ramón
>>>>                 <a.ramonportoles@gmail.com>:
>>>>
>>>>                     I want to use createInput + buildJDBCInputFormat
>>>>                     to access a database from Scala.
>>>>                     Problem 1:
>>>>
>>>>                     import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>>>                     Error:(25, 37) object jdbc is not a member of
>>>>                     package org.apache.flink.api.java.io
>>>>                     import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>>
>>>>                     Then, I can't use it:
>>>>                     [image: inline image 1]
>>>>
>>>>                     I also tried downloading the code from git and recompiling.
>>>>
>>>             -- 
>>>             Freundliche Grüße / Kind Regards
>>>
>>>             Timo Walther
>>>
>>>             Follow me: @twalthr
>>>             https://www.linkedin.com/in/twalthr
>>>
>>         -- 
>>         Freundliche Grüße / Kind Regards
>>
>>         Timo Walther
>>
>>         Follow me: @twalthr
>>         https://www.linkedin.com/in/twalthr
>
>         -- 
>         Freundliche Grüße / Kind Regards
>
>         Timo Walther
>
>         Follow me: @twalthr
>         https://www.linkedin.com/in/twalthr
>
-- 
Freundliche Grüße / Kind Regards

Timo Walther

Follow me: @twalthr
https://www.linkedin.com/in/twalthr

Re: jdbc.JDBCInputFormat

Posted by Chesnay Schepler <ch...@apache.org>.
Doesn't this just mean that there is a Scala version mismatch? I.e., Flink 
was compiled with 2.10 but you run it with 2.11?
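
For example, with Scala 2.11 every Scala-suffixed Flink artifact in the POM
would need the matching suffix (a sketch of only the artifact IDs):

    <artifactId>flink-scala_2.11</artifactId>
    <artifactId>flink-streaming-scala_2.11</artifactId>
    <artifactId>flink-clients_2.11</artifactId>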

On 12.10.2016 12:39, sunny patel wrote:
> Thanks, Timo,
>
> I have updated `flink-parent` and the Flink version to `1.2-SNAPSHOT`,
> but still I am facing the version errors.
>
> could you please advise me on this?
>
> Information:12/10/2016, 11:34 - Compilation completed with 2 errors 
> and 0 warnings in 7s 284ms
> /Users/janaidu/faid/src/main/scala/fgid/JDBC.scala
> Error:(17, 67) can't expand macros compiled by previous versions of Scala
>     val stringColum: TypeInformation[Int] = createTypeInformation[Int]
> Error:(29, 33) can't expand macros compiled by previous versions of Scala
>     val dataset =env.createInput(inputFormat)
>
>
>
>
> ========= POM.XML FILE
>
>
> <?xml version="1.0" encoding="UTF-8"?> <project 
> xmlns="http://maven.apache.org/POM/4.0.0" 
> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
> xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
> http://maven.apache.org/xsd/maven-4.0.0.xsd">
>     <modelVersion>4.0.0</modelVersion>
>     <parent>
>        <artifactId>flink-parent</artifactId>
>        <groupId>org.apache.flink</groupId>
>        <version>1.2-SNAPSHOT</version>
>     </parent>
>
>     <groupId>org.apache.flink.quickstart</groupId>
>     <artifactId>flink-scala-project</artifactId>
>     <version>0.1</version>
>     <packaging>jar</packaging>
>
>     <name>Flink Quickstart Job</name>
>     <url>http://www.myorganization.org</url>
>
>     <repositories>
>        <repository>
>           <id>apache.snapshots</id>
>           <name>Apache Development Snapshot Repository</name>
>           <url>https://repository.apache.org/content/repositories/snapshots/</url>
>           <releases>
>              <enabled>false</enabled>
>           </releases>
>           <snapshots>
>           </snapshots>
>        </repository>
>     </repositories>
>
>     <properties>
>        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>        <flink.version>1.2-SNAPSHOT</flink.version>
>     </properties>
>
>     <!-- Execute "mvn clean package -Pbuild-jar" to build a jar file out 
> of this project! How to use the Flink Quickstart pom: a) Adding new 
> dependencies: You can add dependencies to the list below. Please check 
> if the maven-shade-plugin below is filtering out your dependency and 
> remove the exclude from there. b) Build a jar for running on the 
> cluster: There are two options for creating a jar from this project 
> b.1) "mvn clean package" -> this will create a fat jar which contains 
> all dependencies necessary for running the jar created by this pom in 
> a cluster. The "maven-shade-plugin" excludes everything that is 
> provided on a running Flink cluster. b.2) "mvn clean package 
> -Pbuild-jar" -> This will also create a fat-jar, but with much nicer 
> dependency exclusion handling. This approach is preferred and leads to 
> much cleaner jar files. --> <dependencies>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-jdbc</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-table_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-scala_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-streaming-scala_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-clients_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>     </dependencies>
>
>     <profiles>
>        <profile>
>           <!-- Profile for packaging correct JAR files --> <id>build-jar</id>
>           <activation>
>           </activation>
>           <dependencies>
>              <dependency>
>                 <groupId>org.apache.flink</groupId>
>                 <artifactId>flink-scala_2.10</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>              </dependency>
>              <dependency>
>                 <groupId>org.apache.flink</groupId>
>                 <artifactId>flink-streaming-scala_2.10</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>              </dependency>
>              <dependency>
>                 <groupId>org.apache.flink</groupId>
>                 <artifactId>flink-clients_2.10</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>              </dependency>
>           </dependencies>
>
>           <build>
>              <plugins>
>                 <!-- disable the exclusion rules --> <plugin>
>                    <groupId>org.apache.maven.plugins</groupId>
>                    <artifactId>maven-shade-plugin</artifactId>
>                    <version>2.4.1</version>
>                    <executions>
>                       <execution>
>                          <phase>package</phase>
>                          <goals>
>                             <goal>shade</goal>
>                          </goals>
>                          <configuration>
>                             <artifactSet>
>                                <excludes combine.self="override"></excludes>
>                             </artifactSet>
>                          </configuration>
>                       </execution>
>                    </executions>
>                 </plugin>
>              </plugins>
>           </build>
>        </profile>
>     </profiles>
>
>     <!-- We use the maven-assembly plugin to create a fat jar that 
> contains all dependencies except flink and its transitive 
> dependencies. The resulting fat-jar can be executed on a cluster. 
> Change the value of Program-Class if your program entry point changes. 
> --> <build>
>        <plugins>
>           <!-- We use the maven-shade plugin to create a fat jar that contains 
> all dependencies except flink and it's transitive dependencies. The 
> resulting fat-jar can be executed on a cluster. Change the value of 
> Program-Class if your program entry point changes. --> <plugin>
>              <groupId>org.apache.maven.plugins</groupId>
>              <artifactId>maven-shade-plugin</artifactId>
>              <version>2.4.1</version>
>              <executions>
>                 <!-- Run shade goal on package phase --> <execution>
>                    <phase>package</phase>
>                    <goals>
>                       <goal>shade</goal>
>                    </goals>
>                    <configuration>
>                       <artifactSet>
>                          <excludes>
>                             <!-- This list contains all dependencies of flink-dist Everything else 
> will be packaged into the fat-jar --> <exclude>org.apache.flink:flink-annotations</exclude>
>                             <exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
>                             <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
>                             <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
>                             <exclude>org.apache.flink:flink-core</exclude>
>                             <exclude>org.apache.flink:flink-java</exclude>
>                             <exclude>org.apache.flink:flink-scala_2.10</exclude>
>                             <exclude>org.apache.flink:flink-runtime_2.10</exclude>
>                             <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
>                             <exclude>org.apache.flink:flink-clients_2.10</exclude>
>                             <exclude>org.apache.flink:flink-avro_2.10</exclude>
>                             <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
>                             <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
>                             <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>
>
>                             <!-- Also exclude very big transitive dependencies of Flink WARNING: 
> You have to remove these excludes if your code relies on other 
> versions of these dependencies. --> <exclude>org.scala-lang:scala-library</exclude>
>                             <exclude>org.scala-lang:scala-compiler</exclude>
>                             <exclude>org.scala-lang:scala-reflect</exclude>
>                             <exclude>com.typesafe.akka:akka-actor_*</exclude>
>                             <exclude>com.typesafe.akka:akka-remote_*</exclude>
>                             <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
>                             <exclude>io.netty:netty-all</exclude>
>                             <exclude>io.netty:netty</exclude>
>                             <exclude>commons-fileupload:commons-fileupload</exclude>
>                             <exclude>org.apache.avro:avro</exclude>
>                             <exclude>commons-collections:commons-collections</exclude>
>                             <exclude>com.thoughtworks.paranamer:paranamer</exclude>
>                             <exclude>org.xerial.snappy:snappy-java</exclude>
>                             <exclude>org.apache.commons:commons-compress</exclude>
>                             <exclude>org.tukaani:xz</exclude>
>                             <exclude>com.esotericsoftware.kryo:kryo</exclude>
>                             <exclude>com.esotericsoftware.minlog:minlog</exclude>
>                             <exclude>org.objenesis:objenesis</exclude>
>                             <exclude>com.twitter:chill_*</exclude>
>                             <exclude>com.twitter:chill-java</exclude>
>                             <exclude>commons-lang:commons-lang</exclude>
>                             <exclude>junit:junit</exclude>
>                             <exclude>org.apache.commons:commons-lang3</exclude>
>                             <exclude>org.slf4j:slf4j-api</exclude>
>                             <exclude>org.slf4j:slf4j-log4j12</exclude>
>                             <exclude>log4j:log4j</exclude>
>                             <exclude>org.apache.commons:commons-math</exclude>
>                             <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
>                             <exclude>commons-logging:commons-logging</exclude>
>                             <exclude>commons-codec:commons-codec</exclude>
>                             <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
>                             <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
>                             <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
>                             <exclude>stax:stax-api</exclude>
>                             <exclude>com.typesafe:config</exclude>
>                             <exclude>org.uncommons.maths:uncommons-maths</exclude>
>                             <exclude>com.github.scopt:scopt_*</exclude>
>                             <exclude>commons-io:commons-io</exclude>
>                             <exclude>commons-cli:commons-cli</exclude>
>                          </excludes>
>                       </artifactSet>
>                       <filters>
>                          <filter>
>                             <artifact>org.apache.flink:*</artifact>
>                             <excludes>
>                                <!-- exclude shaded google but include shaded curator --> <exclude>org/apache/flink/shaded/com/**</exclude>
>                                <exclude>web-docs/**</exclude>
>                             </excludes>
>                          </filter>
>                          <filter>
>                             <!-- Do not copy the signatures in the META-INF folder. Otherwise, 
> this might cause SecurityExceptions when using the JAR. --> <artifact>*:*</artifact>
>                             <excludes>
>                                <exclude>META-INF/*.SF</exclude>
>                                <exclude>META-INF/*.DSA</exclude>
>                                <exclude>META-INF/*.RSA</exclude>
>                             </excludes>
>                          </filter>
>                       </filters>
>                       <!-- If you want to use ./bin/flink run <quickstart jar> uncomment the 
> following lines. This will add a Main-Class entry to the manifest file 
> --> <!-- <transformers> <transformer 
> implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"> 
> <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass> 
> </transformer> </transformers> --> <createDependencyReducedPom>false</createDependencyReducedPom>
>                    </configuration>
>                 </execution>
>              </executions>
>           </plugin>
>
>           <plugin>
>              <groupId>org.apache.maven.plugins</groupId>
>              <artifactId>maven-compiler-plugin</artifactId>
>              <version>3.1</version>
>              <configuration>
>                 <source>1.7</source>
>                 <target>1.7</target>
>              </configuration>
>           </plugin>
>           <plugin>
>              <groupId>net.alchim31.maven</groupId>
>              <artifactId>scala-maven-plugin</artifactId>
>              <version>3.1.4</version>
>              <executions>
>                 <execution>
>                    <goals>
>                       <goal>compile</goal>
>                       <goal>testCompile</goal>
>                    </goals>
>                 </execution>
>              </executions>
>           </plugin>
>
>           <!-- Eclipse Integration --> <plugin>
>              <groupId>org.apache.maven.plugins</groupId>
>              <artifactId>maven-eclipse-plugin</artifactId>
>              <version>2.8</version>
>              <configuration>
>                 <downloadSources>true</downloadSources>
>                 <projectnatures>
>                    <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
>                    <projectnature>org.eclipse.jdt.core.javanature</projectnature>
>                 </projectnatures>
>                 <buildcommands>
>                    <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
>                 </buildcommands>
>                 <classpathContainers>
>                    <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
>                    <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
>                 </classpathContainers>
>                 <excludes>
>                    <exclude>org.scala-lang:scala-library</exclude>
>                    <exclude>org.scala-lang:scala-compiler</exclude>
>                 </excludes>
>                 <sourceIncludes>
>                    <sourceInclude>**/*.scala</sourceInclude>
>                    <sourceInclude>**/*.java</sourceInclude>
>                 </sourceIncludes>
>              </configuration>
>           </plugin>
>
>           <!-- Adding scala source directories to build path --> <plugin>
>              <groupId>org.codehaus.mojo</groupId>
>              <artifactId>build-helper-maven-plugin</artifactId>
>              <version>1.7</version>
>              <executions>
>                 <!-- Add src/main/scala to eclipse build path --> <execution>
>                    <id>add-source</id>
>                    <phase>generate-sources</phase>
>                    <goals>
>                       <goal>add-source</goal>
>                    </goals>
>                    <configuration>
>                       <sources>
>                          <source>src/main/scala</source>
>                       </sources>
>                    </configuration>
>                 </execution>
>                 <!-- Add src/test/scala to eclipse build path --> <execution>
>                    <id>add-test-source</id>
>                    <phase>generate-test-sources</phase>
>                    <goals>
>                       <goal>add-test-source</goal>
>                    </goals>
>                    <configuration>
>                       <sources>
>                          <source>src/test/scala</source>
>                       </sources>
>                    </configuration>
>                 </execution>
>              </executions>
>           </plugin>
>        </plugins>
>     </build>
> </project>
> ==========
>
>
> Thanks
> S
>
> On Wed, Oct 12, 2016 at 11:19 AM, Timo Walther <twalthr@apache.org> wrote:
>
>     Hi Sunny,
>
>     you are using different versions of Flink. `flink-parent` is set
>     to `1.2-SNAPSHOT` but the property `flink.version` is still `1.1.2`.
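>
>     For example, the property in the quickstart POM would need to be (a
>     sketch of just the property block, everything else unchanged):
>
>        <properties>
>           <flink.version>1.2-SNAPSHOT</flink.version>
>        </properties>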
>
>     Hope that helps.
>
>     Timo
>
>
>
>     On 12/10/16 at 11:49, sunny patel wrote:
>>     Hi guys,
>>
>>     I am facing the following error message in the Flink Scala JDBC word count.
>>     Could you please advise me on this?
>>
>>     Information:12/10/2016, 10:43 - Compilation completed with 2
>>     errors and 0 warnings in 1s 903ms
>>     /Users/janaidu/faid/src/main/scala/fgid/JDBC.scala
>>
>>     Error:(17, 67) can't expand macros compiled by previous versions
>>     of Scala
>>         val stringColum: TypeInformation[Int] = createTypeInformation[Int]
>>
>>     Error:(29, 33) can't expand macros compiled by previous versions
>>     of Scala
>>         val dataset = env.createInput(inputFormat)
>>
>>     ------------ code
>>
>>
>>     package DataSources
>>
>>     import org.apache.flink.api.common.typeinfo.TypeInformation
>>     import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>     import org.apache.flink.api.scala._
>>     import org.apache.flink.api.table.typeutils.RowTypeInfo
>>
>>     object WordCount {
>>        def main(args: Array[String]) {
>>
>>          val PATH = getClass.getResource("").getPath
>>
>>          // set up the execution environment
>>          val env = ExecutionEnvironment.getExecutionEnvironment
>>
>>          // Read data from JDBC
>>          val stringColum: TypeInformation[Int] = createTypeInformation[Int]
>>          val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>>
>>          val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>>            .setDrivername("org.postgresql.Driver")   // the Postgres driver class is org.postgresql.Driver
>>            .setDBUrl("jdbc:postgresql://localhost:5432/mydb")
>>            .setUsername("MI")
>>            .setPassword("MI")
>>            .setQuery("select * FROM identity")
>>            .setRowTypeInfo(DB_ROWTYPE)
>>            .finish()
>>
>>          val dataset = env.createInput(inputFormat)
>>          dataset.print()
>>
>>          println(PATH)
>>        }
>>     }
>>     ---------pom.xml
>>     <?xml version="1.0" encoding="UTF-8"?>
>>     <project xmlns="http://maven.apache.org/POM/4.0.0"
>>         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>>         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
>>         http://maven.apache.org/xsd/maven-4.0.0.xsd">
>>         <modelVersion>4.0.0</modelVersion>
>>         <parent>
>>            <artifactId>flink-parent</artifactId>
>>            <groupId>org.apache.flink</groupId>
>>            <version>1.2-SNAPSHOT</version>
>>         </parent>
>>
>>         <groupId>org.apache.flink.quickstart</groupId>
>>         <artifactId>flink-scala-project</artifactId>
>>         <version>0.1</version>
>>         <packaging>jar</packaging>
>>
>>         <name>Flink Quickstart Job</name>
>>         <url>http://www.myorganization.org</url>
>>
>>         <repositories>
>>            <repository>
>>               <id>apache.snapshots</id>
>>               <name>Apache Development Snapshot Repository</name>
>>               <url>https://repository.apache.org/content/repositories/snapshots/</url>
>>               <releases>
>>                  <enabled>false</enabled>
>>               </releases>
>>               <snapshots>
>>               </snapshots>
>>            </repository>
>>         </repositories>
>>
>>         <properties>
>>            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>>            <flink.version>1.1.2</flink.version>
>>         </properties>
>>
>>         <!-- Execute "mvn clean package -Pbuild-jar" to build a jar file
>>     out of this project! How to use the Flink Quickstart pom: a)
>>     Adding new dependencies: You can add dependencies to the list
>>     below. Please check if the maven-shade-plugin below is filtering
>>     out your dependency and remove the exclude from there. b) Build a
>>     jar for running on the cluster: There are two options for
>>     creating a jar from this project b.1) "mvn clean package" -> this
>>     will create a fat jar which contains all dependencies necessary
>>     for running the jar created by this pom in a cluster. The
>>     "maven-shade-plugin" excludes everything that is provided on a
>>     running Flink cluster. b.2) "mvn clean package -Pbuild-jar" ->
>>     This will also create a fat-jar, but with much nicer dependency
>>     exclusion handling. This approach is preferred and leads to much
>>     cleaner jar files. --> <dependencies>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-jdbc</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-table_2.10</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-scala_2.10</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-streaming-scala_2.10</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-clients_2.10</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>         </dependencies>
>>
>>         <profiles>
>>            <profile>
>>               <!-- Profile for packaging correct JAR files --> <id>build-jar</id>
>>               <activation>
>>               </activation>
>>               <dependencies>
>>                  <dependency>
>>                     <groupId>org.apache.flink</groupId>
>>                     <artifactId>flink-scala_2.10</artifactId>
>>                     <version>${flink.version}</version>
>>                     <scope>provided</scope>
>>                  </dependency>
>>                  <dependency>
>>                     <groupId>org.apache.flink</groupId>
>>                     <artifactId>flink-streaming-scala_2.10</artifactId>
>>                     <version>${flink.version}</version>
>>                     <scope>provided</scope>
>>                  </dependency>
>>                  <dependency>
>>                     <groupId>org.apache.flink</groupId>
>>                     <artifactId>flink-clients_2.10</artifactId>
>>                     <version>${flink.version}</version>
>>                     <scope>provided</scope>
>>                  </dependency>
>>               </dependencies>
>>
>>               <build>
>>                  <plugins>
>>                     <!-- disable the exclusion rules --> <plugin>
>>                        <groupId>org.apache.maven.plugins</groupId>
>>                        <artifactId>maven-shade-plugin</artifactId>
>>                        <version>2.4.1</version>
>>                        <executions>
>>                           <execution>
>>                              <phase>package</phase>
>>                              <goals>
>>                                 <goal>shade</goal>
>>                              </goals>
>>                              <configuration>
>>                                 <artifactSet>
>>                                    <excludes combine.self="override"></excludes>
>>                                 </artifactSet>
>>                              </configuration>
>>                           </execution>
>>                        </executions>
>>                     </plugin>
>>                  </plugins>
>>               </build>
>>            </profile>
>>         </profiles>
>>
>>         <!-- We use the maven-assembly plugin to create a fat jar that
>>     contains all dependencies except flink and its transitive
>>     dependencies. The resulting fat-jar can be executed on a cluster.
>>     Change the value of Program-Class if your program entry point
>>     changes. --> <build>
>>            <plugins>
>>               <!-- We use the maven-shade plugin to create a fat jar that
>>     contains all dependencies except flink and it's transitive
>>     dependencies. The resulting fat-jar can be executed on a cluster.
>>     Change the value of Program-Class if your program entry point
>>     changes. --> <plugin>
>>                  <groupId>org.apache.maven.plugins</groupId>
>>                  <artifactId>maven-shade-plugin</artifactId>
>>                  <version>2.4.1</version>
>>                  <executions>
>>                     <!-- Run shade goal on package phase --> <execution>
>>                        <phase>package</phase>
>>                        <goals>
>>                           <goal>shade</goal>
>>                        </goals>
>>                        <configuration>
>>                           <artifactSet>
>>                              <excludes>
>>                                 <!-- This list contains all dependencies of flink-dist Everything
>>     else will be packaged into the fat-jar --> <exclude>org.apache.flink:flink-annotations</exclude>
>>                                 <exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
>>                                 <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
>>                                 <exclude>org.apache.flink:flink-core</exclude>
>>                                 <exclude>org.apache.flink:flink-java</exclude>
>>                                 <exclude>org.apache.flink:flink-scala_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-runtime_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-clients_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-avro_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>
>>
>>                                 <!-- Also exclude very big transitive dependencies of Flink
>>     WARNING: You have to remove these excludes if your code relies on
>>     other versions of these dependencies. --> <exclude>org.scala-lang:scala-library</exclude>
>>                                 <exclude>org.scala-lang:scala-compiler</exclude>
>>                                 <exclude>org.scala-lang:scala-reflect</exclude>
>>                                 <exclude>com.typesafe.akka:akka-actor_*</exclude>
>>                                 <exclude>com.typesafe.akka:akka-remote_*</exclude>
>>                                 <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
>>                                 <exclude>io.netty:netty-all</exclude>
>>                                 <exclude>io.netty:netty</exclude>
>>                                 <exclude>commons-fileupload:commons-fileupload</exclude>
>>                                 <exclude>org.apache.avro:avro</exclude>
>>                                 <exclude>commons-collections:commons-collections</exclude>
>>                                 <exclude>com.thoughtworks.paranamer:paranamer</exclude>
>>                                 <exclude>org.xerial.snappy:snappy-java</exclude>
>>                                 <exclude>org.apache.commons:commons-compress</exclude>
>>                                 <exclude>org.tukaani:xz</exclude>
>>                                 <exclude>com.esotericsoftware.kryo:kryo</exclude>
>>                                 <exclude>com.esotericsoftware.minlog:minlog</exclude>
>>                                 <exclude>org.objenesis:objenesis</exclude>
>>                                 <exclude>com.twitter:chill_*</exclude>
>>                                 <exclude>com.twitter:chill-java</exclude>
>>                                 <exclude>commons-lang:commons-lang</exclude>
>>                                 <exclude>junit:junit</exclude>
>>                                 <exclude>org.apache.commons:commons-lang3</exclude>
>>                                 <exclude>org.slf4j:slf4j-api</exclude>
>>                                 <exclude>org.slf4j:slf4j-log4j12</exclude>
>>                                 <exclude>log4j:log4j</exclude>
>>                                 <exclude>org.apache.commons:commons-math</exclude>
>>                                 <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
>>                                 <exclude>commons-logging:commons-logging</exclude>
>>                                 <exclude>commons-codec:commons-codec</exclude>
>>                                 <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
>>                                 <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
>>                                 <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
>>                                 <exclude>stax:stax-api</exclude>
>>                                 <exclude>com.typesafe:config</exclude>
>>                                 <exclude>org.uncommons.maths:uncommons-maths</exclude>
>>                                 <exclude>com.github.scopt:scopt_*</exclude>
>>                                 <exclude>commons-io:commons-io</exclude>
>>                                 <exclude>commons-cli:commons-cli</exclude>
>>                              </excludes>
>>                           </artifactSet>
>>                           <filters>
>>                              <filter>
>>                                 <artifact>org.apache.flink:*</artifact>
>>                                 <excludes>
>>                                    <!-- exclude shaded google but include shaded curator --> <exclude>org/apache/flink/shaded/com/**</exclude>
>>                                    <exclude>web-docs/**</exclude>
>>                                 </excludes>
>>                              </filter>
>>                              <filter>
>>                                 <!-- Do not copy the signatures in the META-INF folder.
>>     Otherwise, this might cause SecurityExceptions when using the
>>     JAR. --> <artifact>*:*</artifact>
>>                                 <excludes>
>>                                    <exclude>META-INF/*.SF</exclude>
>>                                    <exclude>META-INF/*.DSA</exclude>
>>                                    <exclude>META-INF/*.RSA</exclude>
>>                                 </excludes>
>>                              </filter>
>>                           </filters>
>>                           <!-- If you want to use ./bin/flink run <quickstart jar>
>>     uncomment the following lines. This will add a Main-Class entry
>>     to the manifest file --> <!-- <transformers> <transformer
>>     implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
>>     <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass>
>>     </transformer> </transformers> --> <createDependencyReducedPom>false</createDependencyReducedPom>
>>                        </configuration>
>>                     </execution>
>>                  </executions>
>>               </plugin>
>>
>>               <plugin>
>>                  <groupId>org.apache.maven.plugins</groupId>
>>                  <artifactId>maven-compiler-plugin</artifactId>
>>                  <version>3.1</version>
>>                  <configuration>
>>                     <source>1.7</source>
>>                     <target>1.7</target>
>>                  </configuration>
>>               </plugin>
>>               <plugin>
>>                  <groupId>net.alchim31.maven</groupId>
>>                  <artifactId>scala-maven-plugin</artifactId>
>>                  <version>3.1.4</version>
>>                  <executions>
>>                     <execution>
>>                        <goals>
>>                           <goal>compile</goal>
>>                           <goal>testCompile</goal>
>>                        </goals>
>>                     </execution>
>>                  </executions>
>>               </plugin>
>>
>>               <!-- Eclipse Integration --> <plugin>
>>                  <groupId>org.apache.maven.plugins</groupId>
>>                  <artifactId>maven-eclipse-plugin</artifactId>
>>                  <version>2.8</version>
>>                  <configuration>
>>                     <downloadSources>true</downloadSources>
>>                     <projectnatures>
>>                        <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
>>                        <projectnature>org.eclipse.jdt.core.javanature</projectnature>
>>                     </projectnatures>
>>                     <buildcommands>
>>                        <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
>>                     </buildcommands>
>>                     <classpathContainers>
>>                        <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
>>                        <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
>>                     </classpathContainers>
>>                     <excludes>
>>                        <exclude>org.scala-lang:scala-library</exclude>
>>                        <exclude>org.scala-lang:scala-compiler</exclude>
>>                     </excludes>
>>                     <sourceIncludes>
>>                        <sourceInclude>**/*.scala</sourceInclude>
>>                        <sourceInclude>**/*.java</sourceInclude>
>>                     </sourceIncludes>
>>                  </configuration>
>>               </plugin>
>>
>>               <!-- Adding scala source directories to build path --> <plugin>
>>                  <groupId>org.codehaus.mojo</groupId>
>>                  <artifactId>build-helper-maven-plugin</artifactId>
>>                  <version>1.7</version>
>>                  <executions>
>>                     <!-- Add src/main/scala to eclipse build path --> <execution>
>>                        <id>add-source</id>
>>                        <phase>generate-sources</phase>
>>                        <goals>
>>                           <goal>add-source</goal>
>>                        </goals>
>>                        <configuration>
>>                           <sources>
>>                              <source>src/main/scala</source>
>>                           </sources>
>>                        </configuration>
>>                     </execution>
>>                     <!-- Add src/test/scala to eclipse build path --> <execution>
>>                        <id>add-test-source</id>
>>                        <phase>generate-test-sources</phase>
>>                        <goals>
>>                           <goal>add-test-source</goal>
>>                        </goals>
>>                        <configuration>
>>                           <sources>
>>                              <source>src/test/scala</source>
>>                           </sources>
>>                        </configuration>
>>                     </execution>
>>                  </executions>
>>               </plugin>
>>            </plugins>
>>         </build>
>>     </project>
>>     Cheers
>>     S
>
>     -- 
>     Freundliche Grüße / Kind Regards
>
>     Timo Walther
>
>     Follow me: @twalthr
>     https://www.linkedin.com/in/twalthr
>

Re: jdbc.JDBCInputFormat

Posted by Alberto Ramón <a....@gmail.com>.
"Dear Timo "  .... Thanks ¡¡¡    :)

Both work fine: your pull request (https://github.com/apache/flink/pull/2619)
and FLINK-4108 (https://issues.apache.org/jira/browse/FLINK-4108).
I tested with Long, BigDecimal and Date types:

[image: inline image 1]
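
For example, a row type covering those columns would look roughly like this
(a sketch; the column order is assumed):

    val rowType = new RowTypeInfo(Seq(
      BasicTypeInfo.LONG_TYPE_INFO,      // Long column
      BasicTypeInfo.BIG_DEC_TYPE_INFO,   // BigDecimal column
      BasicTypeInfo.DATE_TYPE_INFO))     // Date column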

Have a nice day, Alb

2016-10-12 13:37 GMT+02:00 Timo Walther <tw...@apache.org>:

> Are you sure that the same Scala version is used everywhere? Maybe it
> helps to clean your local Maven repo and build the version again.
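>
> For example (a sketch; this just removes the locally cached Flink artifacts
> so the next `mvn clean install` re-installs them fresh):
>
>     rm -rf ~/.m2/repository/org/apache/flink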
>
>
> Am 12/10/16 um 12:39 schrieb sunny patel:
>
> Thanks, Timo,
>
> I have updated `flink-parent' and Flink version to 1.2-SNAPSHOT`
> but still, I am facing the version errors.
>
> could you please advise me on this?
>
> Information:12/10/2016, 11:34 - Compilation completed with 2 errors and 0
> warnings in 7s 284ms
> /Users/janaidu/faid/src/main/scala/fgid/JDBC.scala
> Error:(17, 67) can't expand macros compiled by previous versions of Scala
>     val stringColum: TypeInformation[Int] = createTypeInformation[Int]
> Error:(29, 33) can't expand macros compiled by previous versions of Scala
>     val dataset =env.createInput(inputFormat)
>
>
>
>
> ========= POM.XML FILE
>
>
> <?xml version="1.0" encoding="UTF-8"?><project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
>    <modelVersion>4.0.0</modelVersion>
>    <parent>
>       <artifactId>flink-parent</artifactId>
>       <groupId>org.apache.flink</groupId>
>       <version>1.2-SNAPSHOT</version>
>    </parent>
>
>    <groupId>org.apache.flink.quickstart</groupId>
>    <artifactId>flink-scala-project</artifactId>
>    <version>0.1</version>
>    <packaging>jar</packaging>
>
>    <name>Flink Quickstart Job</name>
>    <url>http://www.myorganization.org</url>
>
>    <repositories>
>       <repository>
>          <id>apache.snapshots</id>
>          <name>Apache Development Snapshot Repository</name>
>          <url>https://repository.apache.org/content/repositories/snapshots/</url>
>          <releases>
>             <enabled>false</enabled>
>          </releases>
>          <snapshots>
>          </snapshots>
>       </repository>
>    </repositories>
>
>    <properties>
>       <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>       <flink.version>1.2-SNAPSHOT</flink.version>
>    </properties>
>
>    <!--      Execute "mvn clean package -Pbuild-jar"      to build a jar file out of this project!      How to use the Flink Quickstart pom:      a) Adding new dependencies:         You can add dependencies to the list below.         Please check if the maven-shade-plugin below is filtering out your dependency         and remove the exclude from there.      b) Build a jar for running on the cluster:         There are two options for creating a jar from this project         b.1) "mvn clean package" -> this will create a fat jar which contains all               dependencies necessary for running the jar created by this pom in a cluster.               The "maven-shade-plugin" excludes everything that is provided on a running Flink cluster.         b.2) "mvn clean package -Pbuild-jar" -> This will also create a fat-jar, but with much               nicer dependency exclusion handling. This approach is preferred and leads to               much cleaner jar files.   -->   <dependencies>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-jdbc</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-table_2.10</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-scala_2.10</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-streaming-scala_2.10</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>       <dependency>
>          <groupId>org.apache.flink</groupId>
>          <artifactId>flink-clients_2.10</artifactId>
>          <version>${flink.version}</version>
>       </dependency>
>    </dependencies>
>
>    <profiles>
>       <profile>
>          <!-- Profile for packaging correct JAR files -->         <id>build-jar</id>
>          <activation>
>          </activation>
>          <dependencies>
>             <dependency>
>                <groupId>org.apache.flink</groupId>
>                <artifactId>flink-scala_2.10</artifactId>
>                <version>${flink.version}</version>
>                <scope>provided</scope>
>             </dependency>
>             <dependency>
>                <groupId>org.apache.flink</groupId>
>                <artifactId>flink-streaming-scala_2.10</artifactId>
>                <version>${flink.version}</version>
>                <scope>provided</scope>
>             </dependency>
>             <dependency>
>                <groupId>org.apache.flink</groupId>
>                <artifactId>flink-clients_2.10</artifactId>
>                <version>${flink.version}</version>
>                <scope>provided</scope>
>             </dependency>
>          </dependencies>
>
>          <build>
>             <plugins>
>                <!-- disable the exclusion rules -->               <plugin>
>                   <groupId>org.apache.maven.plugins</groupId>
>                   <artifactId>maven-shade-plugin</artifactId>
>                   <version>2.4.1</version>
>                   <executions>
>                      <execution>
>                         <phase>package</phase>
>                         <goals>
>                            <goal>shade</goal>
>                         </goals>
>                         <configuration>
>                            <artifactSet>
>                               <excludes combine.self="override"></excludes>
>                            </artifactSet>
>                         </configuration>
>                      </execution>
>                   </executions>
>                </plugin>
>             </plugins>
>          </build>
>       </profile>
>    </profiles>
>
>    <!-- We use the maven-assembly plugin to create a fat jar that contains all dependencies      except flink and its transitive dependencies. The resulting fat-jar can be executed      on a cluster. Change the value of Program-Class if your program entry point changes. -->   <build>
>       <plugins>
>          <!-- We use the maven-shade plugin to create a fat jar that contains all dependencies         except flink and it's transitive dependencies. The resulting fat-jar can be executed         on a cluster. Change the value of Program-Class if your program entry point changes. -->         <plugin>
>             <groupId>org.apache.maven.plugins</groupId>
>             <artifactId>maven-shade-plugin</artifactId>
>             <version>2.4.1</version>
>             <executions>
>                <!-- Run shade goal on package phase -->               <execution>
>                   <phase>package</phase>
>                   <goals>
>                      <goal>shade</goal>
>                   </goals>
>                   <configuration>
>                      <artifactSet>
>                         <excludes>
>                            <!-- This list contains all dependencies of flink-dist                           Everything else will be packaged into the fat-jar                           -->                           <exclude>org.apache.flink:flink-annotations</exclude>
>                            <exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
>                            <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
>                            <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
>                            <exclude>org.apache.flink:flink-core</exclude>
>                            <exclude>org.apache.flink:flink-java</exclude>
>                            <exclude>org.apache.flink:flink-scala_2.10</exclude>
>                            <exclude>org.apache.flink:flink-runtime_2.10</exclude>
>                            <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
>                            <exclude>org.apache.flink:flink-clients_2.10</exclude>
>                            <exclude>org.apache.flink:flink-avro_2.10</exclude>
>                            <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
>                            <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
>                            <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>
>
>                            <!-- Also exclude very big transitive dependencies of Flink                           WARNING: You have to remove these excludes if your code relies on other                           versions of these dependencies.                           -->                           <exclude>org.scala-lang:scala-library</exclude>
>                            <exclude>org.scala-lang:scala-compiler</exclude>
>                            <exclude>org.scala-lang:scala-reflect</exclude>
>                            <exclude>com.typesafe.akka:akka-actor_*</exclude>
>                            <exclude>com.typesafe.akka:akka-remote_*</exclude>
>                            <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
>                            <exclude>io.netty:netty-all</exclude>
>                            <exclude>io.netty:netty</exclude>
>                            <exclude>commons-fileupload:commons-fileupload</exclude>
>                            <exclude>org.apache.avro:avro</exclude>
>                            <exclude>commons-collections:commons-collections</exclude>
>                            <exclude>com.thoughtworks.paranamer:paranamer</exclude>
>                            <exclude>org.xerial.snappy:snappy-java</exclude>
>                            <exclude>org.apache.commons:commons-compress</exclude>
>                            <exclude>org.tukaani:xz</exclude>
>                            <exclude>com.esotericsoftware.kryo:kryo</exclude>
>                            <exclude>com.esotericsoftware.minlog:minlog</exclude>
>                            <exclude>org.objenesis:objenesis</exclude>
>                            <exclude>com.twitter:chill_*</exclude>
>                            <exclude>com.twitter:chill-java</exclude>
>                            <exclude>commons-lang:commons-lang</exclude>
>                            <exclude>junit:junit</exclude>
>                            <exclude>org.apache.commons:commons-lang3</exclude>
>                            <exclude>org.slf4j:slf4j-api</exclude>
>                            <exclude>org.slf4j:slf4j-log4j12</exclude>
>                            <exclude>log4j:log4j</exclude>
>                            <exclude>org.apache.commons:commons-math</exclude>
>                            <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
>                            <exclude>commons-logging:commons-logging</exclude>
>                            <exclude>commons-codec:commons-codec</exclude>
>                            <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
>                            <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
>                            <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
>                            <exclude>stax:stax-api</exclude>
>                            <exclude>com.typesafe:config</exclude>
>                            <exclude>org.uncommons.maths:uncommons-maths</exclude>
>                            <exclude>com.github.scopt:scopt_*</exclude>
>                            <exclude>commons-io:commons-io</exclude>
>                            <exclude>commons-cli:commons-cli</exclude>
>                         </excludes>
>                      </artifactSet>
>                      <filters>
>                         <filter>
>                            <artifact>org.apache.flink:*</artifact>
>                            <excludes>
>                               <!-- exclude shaded google but include shaded curator -->
>                               <exclude>org/apache/flink/shaded/com/**</exclude>
>                               <exclude>web-docs/**</exclude>
>                            </excludes>
>                         </filter>
>                         <filter>
>                            <!-- Do not copy the signatures in the META-INF folder.
>                                 Otherwise, this might cause SecurityExceptions when using the JAR. -->
>                            <artifact>*:*</artifact>
>                            <excludes>
>                               <exclude>META-INF/*.SF</exclude>
>                               <exclude>META-INF/*.DSA</exclude>
>                               <exclude>META-INF/*.RSA</exclude>
>                            </excludes>
>                         </filter>
>                      </filters>
>                      <!-- If you want to use ./bin/flink run <quickstart jar> uncomment the following lines.
>                           This will add a Main-Class entry to the manifest file -->
>                      <!--
>                      <transformers>
>                         <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
>                            <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass>
>                         </transformer>
>                      </transformers>
>                      -->
>                      <createDependencyReducedPom>false</createDependencyReducedPom>
>                   </configuration>
>                </execution>
>             </executions>
>          </plugin>
>
>          <plugin>
>             <groupId>org.apache.maven.plugins</groupId>
>             <artifactId>maven-compiler-plugin</artifactId>
>             <version>3.1</version>
>             <configuration>
>                <source>1.7</source>
>                <target>1.7</target>
>             </configuration>
>          </plugin>
>          <plugin>
>             <groupId>net.alchim31.maven</groupId>
>             <artifactId>scala-maven-plugin</artifactId>
>             <version>3.1.4</version>
>             <executions>
>                <execution>
>                   <goals>
>                      <goal>compile</goal>
>                      <goal>testCompile</goal>
>                   </goals>
>                </execution>
>             </executions>
>          </plugin>
>
>          <!-- Eclipse Integration -->
>          <plugin>
>             <groupId>org.apache.maven.plugins</groupId>
>             <artifactId>maven-eclipse-plugin</artifactId>
>             <version>2.8</version>
>             <configuration>
>                <downloadSources>true</downloadSources>
>                <projectnatures>
>                   <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
>                   <projectnature>org.eclipse.jdt.core.javanature</projectnature>
>                </projectnatures>
>                <buildcommands>
>                   <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
>                </buildcommands>
>                <classpathContainers>
>                   <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
>                   <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
>                </classpathContainers>
>                <excludes>
>                   <exclude>org.scala-lang:scala-library</exclude>
>                   <exclude>org.scala-lang:scala-compiler</exclude>
>                </excludes>
>                <sourceIncludes>
>                   <sourceInclude>**/*.scala</sourceInclude>
>                   <sourceInclude>**/*.java</sourceInclude>
>                </sourceIncludes>
>             </configuration>
>          </plugin>
>
>          <!-- Adding scala source directories to build path -->
>          <plugin>
>             <groupId>org.codehaus.mojo</groupId>
>             <artifactId>build-helper-maven-plugin</artifactId>
>             <version>1.7</version>
>             <executions>
>                <!-- Add src/main/scala to eclipse build path -->
>                <execution>
>                   <id>add-source</id>
>                   <phase>generate-sources</phase>
>                   <goals>
>                      <goal>add-source</goal>
>                   </goals>
>                   <configuration>
>                      <sources>
>                         <source>src/main/scala</source>
>                      </sources>
>                   </configuration>
>                </execution>
>                <!-- Add src/test/scala to eclipse build path -->
>                <execution>
>                   <id>add-test-source</id>
>                   <phase>generate-test-sources</phase>
>                   <goals>
>                      <goal>add-test-source</goal>
>                   </goals>
>                   <configuration>
>                      <sources>
>                         <source>src/test/scala</source>
>                      </sources>
>                   </configuration>
>                </execution>
>             </executions>
>          </plugin>
>       </plugins>
>    </build>
> </project>
>
> ==========
>
>
> Thanks
> S
>
> On Wed, Oct 12, 2016 at 11:19 AM, Timo Walther <tw...@apache.org> wrote:
>
>> Hi Sunny,
>>
>> you are using different versions of Flink. `flink-parent` is set to
>> `1.2-SNAPSHOT` but the property `flink.version` is still `1.1.2`.
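>>
>> The two declarations should agree. A minimal sketch of the relevant pom
>> fragments, assuming you want to build against 1.2-SNAPSHOT:
>>
>>    <parent>
>>       <groupId>org.apache.flink</groupId>
>>       <artifactId>flink-parent</artifactId>
>>       <version>1.2-SNAPSHOT</version>
>>    </parent>
>>
>>    <properties>
>>       <flink.version>1.2-SNAPSHOT</flink.version>
>>    </properties>
>>
>> With that, every ${flink.version} reference in the dependencies resolves to
>> the same snapshot you installed locally.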
>>
>> Hope that helps.
>>
>> Timo
>>
>>
>>
>> On 12/10/16 at 11:49, sunny patel wrote:
>>
>> Hi guys,
>>
>> I am facing the following error message in the Flink Scala JDBC word count.
>> Could you please advise me on this?
>>
>> Information:12/10/2016, 10:43 - Compilation completed with 2 errors and
>> 0 warnings in 1s 903ms
>> /Users/janaidu/faid/src/main/scala/fgid/JDBC.scala
>>
>> Error:(17, 67) can't expand macros compiled by previous versions of
>> Scala
>>     val stringColum: TypeInformation[Int] = createTypeInformation[Int]
>>
>> Error:(29, 33) can't expand macros compiled by previous versions of
>> Scala
>>     val dataset =env.createInput(inputFormat)
>>
>>
>> ------------ code
>>
>>
>> package DataSources
>> import org.apache.flink.api.common.typeinfo.TypeInformation
>> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>> import org.apache.flink.api.scala._
>> import org.apache.flink.api.table.typeutils.RowTypeInfo
>> object WordCount {
>>   def main(args: Array[String]) {
>>
>>     val PATH = getClass.getResource("").getPath
>>
>>     // set up the execution environment
>>     val env = ExecutionEnvironment.getExecutionEnvironment
>>     // Read data from JDBC (Kylin in our case)
>>     val stringColum: TypeInformation[Int] = createTypeInformation[Int]
>>     val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>>
>>     val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>>       .setDrivername("org.postgresql.Driver")
>>       .setDBUrl("jdbc:postgresql://localhost:5432/mydb")
>>       .setUsername("MI")
>>       .setPassword("MI")
>>       .setQuery("select * FROM identity")
>>       .setRowTypeInfo(DB_ROWTYPE)
>>       .finish()
>>
>>     val dataset =env.createInput(inputFormat)
>>     dataset.print()
>>
>>     println(PATH)
>>   }
>> }
>>
>> ---------pom.xml
>>
>>    <?xml version="1.0" encoding="UTF-8"?>
>>    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
>>    <modelVersion>4.0.0</modelVersion>
>>    <parent>
>>       <artifactId>flink-parent</artifactId>
>>       <groupId>org.apache.flink</groupId>
>>       <version>1.2-SNAPSHOT</version>
>>    </parent>
>>
>>    <groupId>org.apache.flink.quickstart</groupId>
>>    <artifactId>flink-scala-project</artifactId>
>>    <version>0.1</version>
>>    <packaging>jar</packaging>
>>
>>    <name>Flink Quickstart Job</name>
>>    <url>http://www.myorganization.org</url>
>>
>>    <repositories>
>>       <repository>
>>          <id>apache.snapshots</id>
>>          <name>Apache Development Snapshot Repository</name>
>>          <url>https://repository.apache.org/content/repositories/snapshots/</url>
>>          <releases>
>>             <enabled>false</enabled>
>>          </releases>
>>          <snapshots>
>>          </snapshots>
>>       </repository>
>>    </repositories>
>>
>>    <properties>
>>       <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>>       <flink.version>1.1.2</flink.version>
>>    </properties>
>>
>>    <!-- Execute "mvn clean package -Pbuild-jar" to build a jar file out of this project!
>>         How to use the Flink Quickstart pom:
>>         a) Adding new dependencies:
>>            You can add dependencies to the list below. Please check if the maven-shade-plugin
>>            below is filtering out your dependency and remove the exclude from there.
>>         b) Build a jar for running on the cluster:
>>            There are two options for creating a jar from this project
>>            b.1) "mvn clean package" -> this will create a fat jar which contains all
>>                 dependencies necessary for running the jar created by this pom in a cluster.
>>                 The "maven-shade-plugin" excludes everything that is provided on a running Flink cluster.
>>            b.2) "mvn clean package -Pbuild-jar" -> This will also create a fat-jar, but with much
>>                 nicer dependency exclusion handling. This approach is preferred and leads to
>>                 much cleaner jar files.
>>    -->
>>    <dependencies>
>>       <dependency>
>>          <groupId>org.apache.flink</groupId>
>>          <artifactId>flink-jdbc</artifactId>
>>          <version>${flink.version}</version>
>>       </dependency>
>>       <dependency>
>>          <groupId>org.apache.flink</groupId>
>>          <artifactId>flink-table_2.10</artifactId>
>>          <version>${flink.version}</version>
>>       </dependency>
>>       <dependency>
>>          <groupId>org.apache.flink</groupId>
>>          <artifactId>flink-scala_2.10</artifactId>
>>          <version>${flink.version}</version>
>>       </dependency>
>>       <dependency>
>>          <groupId>org.apache.flink</groupId>
>>          <artifactId>flink-streaming-scala_2.10</artifactId>
>>          <version>${flink.version}</version>
>>       </dependency>
>>       <dependency>
>>          <groupId>org.apache.flink</groupId>
>>          <artifactId>flink-clients_2.10</artifactId>
>>          <version>${flink.version}</version>
>>       </dependency>
>>    </dependencies>
>>
>>    <profiles>
>>       <profile>
>>          <!-- Profile for packaging correct JAR files -->
>>          <id>build-jar</id>
>>          <activation>
>>          </activation>
>>          <dependencies>
>>             <dependency>
>>                <groupId>org.apache.flink</groupId>
>>                <artifactId>flink-scala_2.10</artifactId>
>>                <version>${flink.version}</version>
>>                <scope>provided</scope>
>>             </dependency>
>>             <dependency>
>>                <groupId>org.apache.flink</groupId>
>>                <artifactId>flink-streaming-scala_2.10</artifactId>
>>                <version>${flink.version}</version>
>>                <scope>provided</scope>
>>             </dependency>
>>             <dependency>
>>                <groupId>org.apache.flink</groupId>
>>                <artifactId>flink-clients_2.10</artifactId>
>>                <version>${flink.version}</version>
>>                <scope>provided</scope>
>>             </dependency>
>>          </dependencies>
>>
>>          <build>
>>             <plugins>
>>                <!-- disable the exclusion rules -->
>>                <plugin>
>>                   <groupId>org.apache.maven.plugins</groupId>
>>                   <artifactId>maven-shade-plugin</artifactId>
>>                   <version>2.4.1</version>
>>                   <executions>
>>                      <execution>
>>                         <phase>package</phase>
>>                         <goals>
>>                            <goal>shade</goal>
>>                         </goals>
>>                         <configuration>
>>                            <artifactSet>
>>                               <excludes combine.self="override"></excludes>
>>                            </artifactSet>
>>                         </configuration>
>>                      </execution>
>>                   </executions>
>>                </plugin>
>>             </plugins>
>>          </build>
>>       </profile>
>>    </profiles>
>>
>>    <!-- We use the maven-assembly plugin to create a fat jar that contains all dependencies
>>         except flink and its transitive dependencies. The resulting fat-jar can be executed
>>         on a cluster. Change the value of Program-Class if your program entry point changes. -->
>>    <build>
>>       <plugins>
>>          <!-- We use the maven-shade plugin to create a fat jar that contains all dependencies
>>               except flink and its transitive dependencies. The resulting fat-jar can be executed
>>               on a cluster. Change the value of Program-Class if your program entry point changes. -->
>>          <plugin>
>>             <groupId>org.apache.maven.plugins</groupId>
>>             <artifactId>maven-shade-plugin</artifactId>
>>             <version>2.4.1</version>
>>             <executions>
>>                <!-- Run shade goal on package phase -->
>>                <execution>
>>                   <phase>package</phase>
>>                   <goals>
>>                      <goal>shade</goal>
>>                   </goals>
>>                   <configuration>
>>                      <artifactSet>
>>                         <excludes>
>>                            <!-- This list contains all dependencies of flink-dist
>>                                 Everything else will be packaged into the fat-jar -->
>>                            <exclude>org.apache.flink:flink-annotations</exclude>
>>                            <exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
>>                            <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
>>                            <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
>>                            <exclude>org.apache.flink:flink-core</exclude>
>>                            <exclude>org.apache.flink:flink-java</exclude>
>>                            <exclude>org.apache.flink:flink-scala_2.10</exclude>
>>                            <exclude>org.apache.flink:flink-runtime_2.10</exclude>
>>                            <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
>>                            <exclude>org.apache.flink:flink-clients_2.10</exclude>
>>                            <exclude>org.apache.flink:flink-avro_2.10</exclude>
>>                            <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
>>                            <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
>>                            <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>
>>
>>                            <!-- Also exclude very big transitive dependencies of Flink
>>                                 WARNING: You have to remove these excludes if your code relies on other
>>                                 versions of these dependencies. -->
>>                            <exclude>org.scala-lang:scala-library</exclude>
>>                            <exclude>org.scala-lang:scala-compiler</exclude>
>>                            <exclude>org.scala-lang:scala-reflect</exclude>
>>                            <exclude>com.typesafe.akka:akka-actor_*</exclude>
>>                            <exclude>com.typesafe.akka:akka-remote_*</exclude>
>>                            <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
>>                            <exclude>io.netty:netty-all</exclude>
>>                            <exclude>io.netty:netty</exclude>
>>                            <exclude>commons-fileupload:commons-fileupload</exclude>
>>                            <exclude>org.apache.avro:avro</exclude>
>>                            <exclude>commons-collections:commons-collections</exclude>
>>                            <exclude>com.thoughtworks.paranamer:paranamer</exclude>
>>                            <exclude>org.xerial.snappy:snappy-java</exclude>
>>                            <exclude>org.apache.commons:commons-compress</exclude>
>>                            <exclude>org.tukaani:xz</exclude>
>>                            <exclude>com.esotericsoftware.kryo:kryo</exclude>
>>                            <exclude>com.esotericsoftware.minlog:minlog</exclude>
>>                            <exclude>org.objenesis:objenesis</exclude>
>>                            <exclude>com.twitter:chill_*</exclude>
>>                            <exclude>com.twitter:chill-java</exclude>
>>                            <exclude>commons-lang:commons-lang</exclude>
>>                            <exclude>junit:junit</exclude>
>>                            <exclude>org.apache.commons:commons-lang3</exclude>
>>                            <exclude>org.slf4j:slf4j-api</exclude>
>>                            <exclude>org.slf4j:slf4j-log4j12</exclude>
>>                            <exclude>log4j:log4j</exclude>
>>                            <exclude>org.apache.commons:commons-math</exclude>
>>                            <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
>>                            <exclude>commons-logging:commons-logging</exclude>
>>                            <exclude>commons-codec:commons-codec</exclude>
>>                            <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
>>                            <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
>>                            <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
>>                            <exclude>stax:stax-api</exclude>
>>                            <exclude>com.typesafe:config</exclude>
>>                            <exclude>org.uncommons.maths:uncommons-maths</exclude>
>>                            <exclude>com.github.scopt:scopt_*</exclude>
>>                            <exclude>commons-io:commons-io</exclude>
>>                            <exclude>commons-cli:commons-cli</exclude>
>>                         </excludes>
>>                      </artifactSet>
>>                      <filters>
>>                         <filter>
>>                            <artifact>org.apache.flink:*</artifact>
>>                            <excludes>
>>                               <!-- exclude shaded google but include shaded curator -->
>>                               <exclude>org/apache/flink/shaded/com/**</exclude>
>>                               <exclude>web-docs/**</exclude>
>>                            </excludes>
>>                         </filter>
>>                         <filter>
>>                            <!-- Do not copy the signatures in the META-INF folder.
>>                                 Otherwise, this might cause SecurityExceptions when using the JAR. -->
>>                            <artifact>*:*</artifact>
>>                            <excludes>
>>                               <exclude>META-INF/*.SF</exclude>
>>                               <exclude>META-INF/*.DSA</exclude>
>>                               <exclude>META-INF/*.RSA</exclude>
>>                            </excludes>
>>                         </filter>
>>                      </filters>
>>                      <!-- If you want to use ./bin/flink run <quickstart jar> uncomment the following lines.
>>                           This will add a Main-Class entry to the manifest file -->
>>                      <!--
>>                      <transformers>
>>                         <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
>>                            <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass>
>>                         </transformer>
>>                      </transformers>
>>                      -->
>>                      <createDependencyReducedPom>false</createDependencyReducedPom>
>>                   </configuration>
>>                </execution>
>>             </executions>
>>          </plugin>
>>
>>          <plugin>
>>             <groupId>org.apache.maven.plugins</groupId>
>>             <artifactId>maven-compiler-plugin</artifactId>
>>             <version>3.1</version>
>>             <configuration>
>>                <source>1.7</source>
>>                <target>1.7</target>
>>             </configuration>
>>          </plugin>
>>          <plugin>
>>             <groupId>net.alchim31.maven</groupId>
>>             <artifactId>scala-maven-plugin</artifactId>
>>             <version>3.1.4</version>
>>             <executions>
>>                <execution>
>>                   <goals>
>>                      <goal>compile</goal>
>>                      <goal>testCompile</goal>
>>                   </goals>
>>                </execution>
>>             </executions>
>>          </plugin>
>>
>>          <!-- Eclipse Integration -->
>>          <plugin>
>>             <groupId>org.apache.maven.plugins</groupId>
>>             <artifactId>maven-eclipse-plugin</artifactId>
>>             <version>2.8</version>
>>             <configuration>
>>                <downloadSources>true</downloadSources>
>>                <projectnatures>
>>                   <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
>>                   <projectnature>org.eclipse.jdt.core.javanature</projectnature>
>>                </projectnatures>
>>                <buildcommands>
>>                   <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
>>                </buildcommands>
>>                <classpathContainers>
>>                   <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
>>                   <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
>>                </classpathContainers>
>>                <excludes>
>>                   <exclude>org.scala-lang:scala-library</exclude>
>>                   <exclude>org.scala-lang:scala-compiler</exclude>
>>                </excludes>
>>                <sourceIncludes>
>>                   <sourceInclude>**/*.scala</sourceInclude>
>>                   <sourceInclude>**/*.java</sourceInclude>
>>                </sourceIncludes>
>>             </configuration>
>>          </plugin>
>>
>>          <!-- Adding scala source directories to build path -->
>>          <plugin>
>>             <groupId>org.codehaus.mojo</groupId>
>>             <artifactId>build-helper-maven-plugin</artifactId>
>>             <version>1.7</version>
>>             <executions>
>>                <!-- Add src/main/scala to eclipse build path -->
>>                <execution>
>>                   <id>add-source</id>
>>                   <phase>generate-sources</phase>
>>                   <goals>
>>                      <goal>add-source</goal>
>>                   </goals>
>>                   <configuration>
>>                      <sources>
>>                         <source>src/main/scala</source>
>>                      </sources>
>>                   </configuration>
>>                </execution>
>>                <!-- Add src/test/scala to eclipse build path -->
>>                <execution>
>>                   <id>add-test-source</id>
>>                   <phase>generate-test-sources</phase>
>>                   <goals>
>>                      <goal>add-test-source</goal>
>>                   </goals>
>>                   <configuration>
>>                      <sources>
>>                         <source>src/test/scala</source>
>>                      </sources>
>>                   </configuration>
>>                </execution>
>>             </executions>
>>          </plugin>
>>       </plugins>
>>    </build>
>> </project>
>>
>> Cheers
>>
>> S
>>
>> --
>> Freundliche Grüße / Kind Regards
>>
>> Timo Walther
>>
>> Follow me: @twalthr https://www.linkedin.com/in/twalthr
>>
>> --
> Freundliche Grüße / Kind Regards
>
> Timo Walther
>
> Follow me: @twalthr https://www.linkedin.com/in/twalthr
>
>

Re: jdbc.JDBCInputFormat

Posted by Timo Walther <tw...@apache.org>.
Are you sure that the same Scala version is used everywhere? Maybe it 
helps to clean your local Maven repo and build the version again.
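
A minimal sketch of the steps I mean, assuming the default local repository
location ~/.m2/repository (the /path/to/... directories below are
placeholders for your own checkouts):

  # drop possibly stale Flink artifacts from the local Maven repository
  rm -rf ~/.m2/repository/org/apache/flink

  # rebuild and install the 1.2-SNAPSHOT artifacts from the Flink sources
  cd /path/to/flink
  mvn clean install -DskipTests

  # then rebuild the quickstart project against the freshly installed artifacts
  cd /path/to/flink-scala-project
  mvn clean package

The "can't expand macros compiled by previous versions of Scala" error
usually indicates that some artifact on the classpath was built with an older
Scala compiler, so make sure your project uses the same Scala version as the
Flink artifacts it depends on.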


On 12/10/16 at 12:39, sunny patel wrote:
> Thanks, Timo,
>
> I have updated `flink-parent` and `flink.version` to `1.2-SNAPSHOT`,
> but I am still facing the version errors.
>
> Could you please advise me on this?
>
> Information:12/10/2016, 11:34 - Compilation completed with 2 errors 
> and 0 warnings in 7s 284ms
> /Users/janaidu/faid/src/main/scala/fgid/JDBC.scala
> Error:(17, 67) can't expand macros compiled by previous versions of Scala
>     val stringColum: TypeInformation[Int] = createTypeInformation[Int]
> Error:(29, 33) can't expand macros compiled by previous versions of Scala
>     val dataset =env.createInput(inputFormat)
>
>
>
>
> ========= POM.XML FILE
>
>
> [... pom.xml snipped; identical to the pom quoted above except that
> flink.version is set to 1.2-SNAPSHOT ...]
> ==========
>
>
> Thanks
> S
>
> On Wed, Oct 12, 2016 at 11:19 AM, Timo Walther <twalthr@apache.org 
> <ma...@apache.org>> wrote:
>
>     Hi Sunny,
>
>     you are using different versions of Flink. `flink-parent` is set
>     to `1.2-SNAPSHOT` but the property `flink.version` is still `1.1.2`.
>
>     Hope that helps.
>
>     Timo
>
>
>
>     On 12/10/16 at 11:49, sunny patel wrote:
>>     Hi guys,
>>
>>     I am facing the following error message in the Flink Scala JDBC word count.
>>     Could you please advise me on this?
>>
>>     Information:12/10/2016, 10:43 - Compilation completed with 2
>>     errors and 0 warnings in 1s 903ms
>>     /Users/janaidu/faid/src/main/scala/fgid/JDBC.scala
>>
>>     Error:(17, 67) can't expand macros compiled by previous versions
>>     of Scala
>>         val stringColum: TypeInformation[Int] =
>>     createTypeInformation[Int]
>>
>>     Error:(29, 33) can't expand macros compiled by previous versions
>>     of Scala
>>         val dataset =env.createInput(inputFormat)
>>
>>     ------------ code
>>
>>
>>     package DataSources
>>
>>     import org.apache.flink.api.common.typeinfo.TypeInformation
>>     import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>     import org.apache.flink.api.scala._
>>     import org.apache.flink.api.table.typeutils.RowTypeInfo
>>
>>     object WordCount {
>>        def main(args: Array[String]) {
>>
>>          val PATH = getClass.getResource("").getPath
>>
>>          // set up the execution environment
>>          val env = ExecutionEnvironment.getExecutionEnvironment
>>          // Read data from JDBC (Kylin in our case)
>>          val stringColum: TypeInformation[Int] = createTypeInformation[Int]
>>          val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>>
>>          val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>>            .setDrivername("org.postgresql.Driver")
>>            .setDBUrl("jdbc:postgresql://localhost:5432/mydb")
>>            .setUsername("MI")
>>            .setPassword("MI")
>>            .setQuery("select * FROM identity")
>>            .setRowTypeInfo(DB_ROWTYPE)
>>            .finish()
>>
>>          val dataset =env.createInput(inputFormat)
>>          dataset.print()
>>
>>          println(PATH)
>>        }
>>     }
>>     ---------pom.xml
>>     <?xml version="1.0" encoding="UTF-8"?>
>>     <project xmlns="http://maven.apache.org/POM/4.0.0"
>>              xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
>>              xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
>>         <modelVersion>4.0.0</modelVersion>
>>         <parent>
>>            <artifactId>flink-parent</artifactId>
>>            <groupId>org.apache.flink</groupId>
>>            <version>1.2-SNAPSHOT</version>
>>         </parent>
>>
>>         <groupId>org.apache.flink.quickstart</groupId>
>>         <artifactId>flink-scala-project</artifactId>
>>         <version>0.1</version>
>>         <packaging>jar</packaging>
>>
>>         <name>Flink Quickstart Job</name>
>>         <url>http://www.myorganization.org</url>
>>
>>         <repositories>
>>            <repository>
>>               <id>apache.snapshots</id>
>>               <name>Apache Development Snapshot Repository</name>
>>               <url>https://repository.apache.org/content/repositories/snapshots/</url>
>>               <releases>
>>                  <enabled>false</enabled>
>>               </releases>
>>               <snapshots>
>>               </snapshots>
>>            </repository>
>>         </repositories>
>>
>>         <properties>
>>            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>>            <flink.version>1.1.2</flink.version>
>>         </properties>
>>
>>         <!-- Execute "mvn clean package -Pbuild-jar" to build a jar file
>>     out of this project! How to use the Flink Quickstart pom: a)
>>     Adding new dependencies: You can add dependencies to the list
>>     below. Please check if the maven-shade-plugin below is filtering
>>     out your dependency and remove the exclude from there. b) Build a
>>     jar for running on the cluster: There are two options for
>>     creating a jar from this project b.1) "mvn clean package" -> this
>>     will create a fat jar which contains all dependencies necessary
>>     for running the jar created by this pom in a cluster. The
>>     "maven-shade-plugin" excludes everything that is provided on a
>>     running Flink cluster. b.2) "mvn clean package -Pbuild-jar" ->
>>     This will also create a fat-jar, but with much nicer dependency
>>     exclusion handling. This approach is preferred and leads to much
>>     cleaner jar files. --> <dependencies>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-jdbc</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-table_2.10</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-scala_2.10</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-streaming-scala_2.10</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>            <dependency>
>>               <groupId>org.apache.flink</groupId>
>>               <artifactId>flink-clients_2.10</artifactId>
>>               <version>${flink.version}</version>
>>            </dependency>
>>         </dependencies>
>>
>>         <profiles>
>>            <profile>
>>               <!-- Profile for packaging correct JAR files --> <id>build-jar</id>
>>               <activation>
>>               </activation>
>>               <dependencies>
>>                  <dependency>
>>                     <groupId>org.apache.flink</groupId>
>>                     <artifactId>flink-scala_2.10</artifactId>
>>                     <version>${flink.version}</version>
>>                     <scope>provided</scope>
>>                  </dependency>
>>                  <dependency>
>>                     <groupId>org.apache.flink</groupId>
>>                     <artifactId>flink-streaming-scala_2.10</artifactId>
>>                     <version>${flink.version}</version>
>>                     <scope>provided</scope>
>>                  </dependency>
>>                  <dependency>
>>                     <groupId>org.apache.flink</groupId>
>>                     <artifactId>flink-clients_2.10</artifactId>
>>                     <version>${flink.version}</version>
>>                     <scope>provided</scope>
>>                  </dependency>
>>               </dependencies>
>>
>>               <build>
>>                  <plugins>
>>                     <!-- disable the exclusion rules --> <plugin>
>>                        <groupId>org.apache.maven.plugins</groupId>
>>                        <artifactId>maven-shade-plugin</artifactId>
>>                        <version>2.4.1</version>
>>                        <executions>
>>                           <execution>
>>                              <phase>package</phase>
>>                              <goals>
>>                                 <goal>shade</goal>
>>                              </goals>
>>                              <configuration>
>>                                 <artifactSet>
>>                                    <excludes combine.self="override"></excludes>
>>                                 </artifactSet>
>>                              </configuration>
>>                           </execution>
>>                        </executions>
>>                     </plugin>
>>                  </plugins>
>>               </build>
>>            </profile>
>>         </profiles>
>>
>>         <!-- We use the maven-assembly plugin to create a fat jar that
>>     contains all dependencies except flink and its transitive
>>     dependencies. The resulting fat-jar can be executed on a cluster.
>>     Change the value of Program-Class if your program entry point
>>     changes. --> <build>
>>            <plugins>
>>               <!-- We use the maven-shade plugin to create a fat jar that
>>     contains all dependencies except flink and it's transitive
>>     dependencies. The resulting fat-jar can be executed on a cluster.
>>     Change the value of Program-Class if your program entry point
>>     changes. --> <plugin>
>>                  <groupId>org.apache.maven.plugins</groupId>
>>                  <artifactId>maven-shade-plugin</artifactId>
>>                  <version>2.4.1</version>
>>                  <executions>
>>                     <!-- Run shade goal on package phase --> <execution>
>>                        <phase>package</phase>
>>                        <goals>
>>                           <goal>shade</goal>
>>                        </goals>
>>                        <configuration>
>>                           <artifactSet>
>>                              <excludes>
>>                                 <!-- This list contains all dependencies of flink-dist Everything
>>     else will be packaged into the fat-jar --> <exclude>org.apache.flink:flink-annotations</exclude>
>>                                 <exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
>>                                 <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
>>                                 <exclude>org.apache.flink:flink-core</exclude>
>>                                 <exclude>org.apache.flink:flink-java</exclude>
>>                                 <exclude>org.apache.flink:flink-scala_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-runtime_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-clients_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-avro_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
>>                                 <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>
>>
>>                                 <!-- Also exclude very big transitive dependencies of Flink
>>     WARNING: You have to remove these excludes if your code relies on
>>     other versions of these dependencies. --> <exclude>org.scala-lang:scala-library</exclude>
>>                                 <exclude>org.scala-lang:scala-compiler</exclude>
>>                                 <exclude>org.scala-lang:scala-reflect</exclude>
>>                                 <exclude>com.typesafe.akka:akka-actor_*</exclude>
>>                                 <exclude>com.typesafe.akka:akka-remote_*</exclude>
>>                                 <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
>>                                 <exclude>io.netty:netty-all</exclude>
>>                                 <exclude>io.netty:netty</exclude>
>>                                 <exclude>commons-fileupload:commons-fileupload</exclude>
>>                                 <exclude>org.apache.avro:avro</exclude>
>>                                 <exclude>commons-collections:commons-collections</exclude>
>>                                 <exclude>com.thoughtworks.paranamer:paranamer</exclude>
>>                                 <exclude>org.xerial.snappy:snappy-java</exclude>
>>                                 <exclude>org.apache.commons:commons-compress</exclude>
>>                                 <exclude>org.tukaani:xz</exclude>
>>                                 <exclude>com.esotericsoftware.kryo:kryo</exclude>
>>                                 <exclude>com.esotericsoftware.minlog:minlog</exclude>
>>                                 <exclude>org.objenesis:objenesis</exclude>
>>                                 <exclude>com.twitter:chill_*</exclude>
>>                                 <exclude>com.twitter:chill-java</exclude>
>>                                 <exclude>commons-lang:commons-lang</exclude>
>>                                 <exclude>junit:junit</exclude>
>>                                 <exclude>org.apache.commons:commons-lang3</exclude>
>>                                 <exclude>org.slf4j:slf4j-api</exclude>
>>                                 <exclude>org.slf4j:slf4j-log4j12</exclude>
>>                                 <exclude>log4j:log4j</exclude>
>>                                 <exclude>org.apache.commons:commons-math</exclude>
>>                                 <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
>>                                 <exclude>commons-logging:commons-logging</exclude>
>>                                 <exclude>commons-codec:commons-codec</exclude>
>>                                 <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
>>                                 <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
>>                                 <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
>>                                 <exclude>stax:stax-api</exclude>
>>                                 <exclude>com.typesafe:config</exclude>
>>                                 <exclude>org.uncommons.maths:uncommons-maths</exclude>
>>                                 <exclude>com.github.scopt:scopt_*</exclude>
>>                                 <exclude>commons-io:commons-io</exclude>
>>                                 <exclude>commons-cli:commons-cli</exclude>
>>                              </excludes>
>>                           </artifactSet>
>>                           <filters>
>>                              <filter>
>>                                 <artifact>org.apache.flink:*</artifact>
>>                                 <excludes>
>>                                    <!-- exclude shaded google but include shaded curator --> <exclude>org/apache/flink/shaded/com/**</exclude>
>>                                    <exclude>web-docs/**</exclude>
>>                                 </excludes>
>>                              </filter>
>>                              <filter>
>>                                 <!-- Do not copy the signatures in the META-INF folder.
>>                                      Otherwise, this might cause SecurityExceptions when using the JAR. -->
>>                                 <artifact>*:*</artifact>
>>                                 <excludes>
>>                                    <exclude>META-INF/*.SF</exclude>
>>                                    <exclude>META-INF/*.DSA</exclude>
>>                                    <exclude>META-INF/*.RSA</exclude>
>>                                 </excludes>
>>                              </filter>
>>                           </filters>
>>                           <!-- If you want to use ./bin/flink run <quickstart jar>
>>                                uncomment the following lines. This will add a Main-Class entry
>>                                to the manifest file -->
>>                           <!--
>>                           <transformers>
>>                              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
>>                                 <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass>
>>                              </transformer>
>>                           </transformers>
>>                           -->
>>                           <createDependencyReducedPom>false</createDependencyReducedPom>
>>                        </configuration>
>>                     </execution>
>>                  </executions>
>>               </plugin>
>>
>>               <plugin>
>>                  <groupId>org.apache.maven.plugins</groupId>
>>                  <artifactId>maven-compiler-plugin</artifactId>
>>                  <version>3.1</version>
>>                  <configuration>
>>                     <source>1.7</source>
>>                     <target>1.7</target>
>>                  </configuration>
>>               </plugin>
>>               <plugin>
>>                  <groupId>net.alchim31.maven</groupId>
>>                  <artifactId>scala-maven-plugin</artifactId>
>>                  <version>3.1.4</version>
>>                  <executions>
>>                     <execution>
>>                        <goals>
>>                           <goal>compile</goal>
>>                           <goal>testCompile</goal>
>>                        </goals>
>>                     </execution>
>>                  </executions>
>>               </plugin>
>>
>>               <!-- Eclipse Integration -->
>>               <plugin>
>>                  <groupId>org.apache.maven.plugins</groupId>
>>                  <artifactId>maven-eclipse-plugin</artifactId>
>>                  <version>2.8</version>
>>                  <configuration>
>>                     <downloadSources>true</downloadSources>
>>                     <projectnatures>
>>                        <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
>>                        <projectnature>org.eclipse.jdt.core.javanature</projectnature>
>>                     </projectnatures>
>>                     <buildcommands>
>>                        <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
>>                     </buildcommands>
>>                     <classpathContainers>
>>                        <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
>>                        <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
>>                     </classpathContainers>
>>                     <excludes>
>>                        <exclude>org.scala-lang:scala-library</exclude>
>>                        <exclude>org.scala-lang:scala-compiler</exclude>
>>                     </excludes>
>>                     <sourceIncludes>
>>                        <sourceInclude>**/*.scala</sourceInclude>
>>                        <sourceInclude>**/*.java</sourceInclude>
>>                     </sourceIncludes>
>>                  </configuration>
>>               </plugin>
>>
>>               <!-- Adding scala source directories to build path -->
>>               <plugin>
>>                  <groupId>org.codehaus.mojo</groupId>
>>                  <artifactId>build-helper-maven-plugin</artifactId>
>>                  <version>1.7</version>
>>                  <executions>
>>                     <!-- Add src/main/scala to eclipse build path -->
>>                     <execution>
>>                        <id>add-source</id>
>>                        <phase>generate-sources</phase>
>>                        <goals>
>>                           <goal>add-source</goal>
>>                        </goals>
>>                        <configuration>
>>                           <sources>
>>                              <source>src/main/scala</source>
>>                           </sources>
>>                        </configuration>
>>                     </execution>
>>                     <!-- Add src/test/scala to eclipse build path -->
>>                     <execution>
>>                        <id>add-test-source</id>
>>                        <phase>generate-test-sources</phase>
>>                        <goals>
>>                           <goal>add-test-source</goal>
>>                        </goals>
>>                        <configuration>
>>                           <sources>
>>                              <source>src/test/scala</source>
>>                           </sources>
>>                        </configuration>
>>                     </execution>
>>                  </executions>
>>               </plugin>
>>            </plugins>
>>         </build>
>>     </project>
>>     Cheers
>>     S
>
>     -- 
> Freundliche Grüße / Kind Regards
>
>     Timo Walther
>
>     Follow me: @twalthr
> https://www.linkedin.com/in/twalthr
>
-- 
Freundliche Grüße / Kind Regards

Timo Walther

Follow me: @twalthr
https://www.linkedin.com/in/twalthr

Re: jdbc.JDBCInputFormat

Posted by sunny patel <su...@gmail.com>.
Thanks, Timo,

I have updated `flink-parent` and the Flink version to `1.2-SNAPSHOT`,
but I am still facing the version errors.

Could you please advise me on this?

Information:12/10/2016, 11:34 - Compilation completed with 2 errors and 0
warnings in 7s 284ms
/Users/janaidu/faid/src/main/scala/fgid/JDBC.scala
Error:(17, 67) can't expand macros compiled by previous versions of Scala
    val stringColum: TypeInformation[Int] = createTypeInformation[Int]
Error:(29, 33) can't expand macros compiled by previous versions of Scala
    val dataset =env.createInput(inputFormat)
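
The macro errors here are a Scala binary-compatibility symptom: `createTypeInformation`
is a macro in `flink-scala`, and it can only be expanded by a compiler on the same
Scala line that compiled it. Below is a minimal sketch of a compiler pin that keeps
the project on the Scala line of the `_2.10` Flink artifacts in this pom; the 2.10.6
patch version is an illustrative assumption:

   <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.1.4</version>
      <configuration>
         <!-- assumption: compile with the same Scala line (2.10.x)
              as the _2.10 Flink dependencies declared in this pom -->
         <scalaVersion>2.10.6</scalaVersion>
      </configuration>
   </plugin>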




========= POM.XML FILE


<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>
   <parent>
      <artifactId>flink-parent</artifactId>
      <groupId>org.apache.flink</groupId>
      <version>1.2-SNAPSHOT</version>
   </parent>

   <groupId>org.apache.flink.quickstart</groupId>
   <artifactId>flink-scala-project</artifactId>
   <version>0.1</version>
   <packaging>jar</packaging>

   <name>Flink Quickstart Job</name>
   <url>http://www.myorganization.org</url>

   <repositories>
      <repository>
         <id>apache.snapshots</id>
         <name>Apache Development Snapshot Repository</name>
         <url>https://repository.apache.org/content/repositories/snapshots/</url>
         <releases>
            <enabled>false</enabled>
         </releases>
         <snapshots>
         </snapshots>
      </repository>
   </repositories>

   <properties>
      <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      <flink.version>1.2-SNAPSHOT</flink.version>
   </properties>

   <!--

      Execute "mvn clean package -Pbuild-jar"
      to build a jar file out of this project!

      How to use the Flink Quickstart pom:

      a) Adding new dependencies:
         You can add dependencies to the list below.
         Please check if the maven-shade-plugin below is filtering out
your dependency
         and remove the exclude from there.

      b) Build a jar for running on the cluster:
         There are two options for creating a jar from this project

         b.1) "mvn clean package" -> this will create a fat jar which
contains all
               dependencies necessary for running the jar created by
this pom in a cluster.
               The "maven-shade-plugin" excludes everything that is
provided on a running Flink cluster.

         b.2) "mvn clean package -Pbuild-jar" -> This will also create
a fat-jar, but with much
               nicer dependency exclusion handling. This approach is
preferred and leads to
               much cleaner jar files.
   -->

   <dependencies>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-jdbc</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-table_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-scala_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-streaming-scala_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-clients_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
   </dependencies>

   <profiles>
      <profile>
         <!-- Profile for packaging correct JAR files -->
         <id>build-jar</id>
         <activation>
         </activation>
         <dependencies>
            <dependency>
               <groupId>org.apache.flink</groupId>
               <artifactId>flink-scala_2.10</artifactId>
               <version>${flink.version}</version>
               <scope>provided</scope>
            </dependency>
            <dependency>
               <groupId>org.apache.flink</groupId>
               <artifactId>flink-streaming-scala_2.10</artifactId>
               <version>${flink.version}</version>
               <scope>provided</scope>
            </dependency>
            <dependency>
               <groupId>org.apache.flink</groupId>
               <artifactId>flink-clients_2.10</artifactId>
               <version>${flink.version}</version>
               <scope>provided</scope>
            </dependency>
         </dependencies>

         <build>
            <plugins>
               <!-- disable the exclusion rules -->
               <plugin>
                  <groupId>org.apache.maven.plugins</groupId>
                  <artifactId>maven-shade-plugin</artifactId>
                  <version>2.4.1</version>
                  <executions>
                     <execution>
                        <phase>package</phase>
                        <goals>
                           <goal>shade</goal>
                        </goals>
                        <configuration>
                           <artifactSet>
                              <excludes combine.self="override"></excludes>
                           </artifactSet>
                        </configuration>
                     </execution>
                  </executions>
               </plugin>
            </plugins>
         </build>
      </profile>
   </profiles>

   <!-- We use the maven-assembly plugin to create a fat jar that
contains all dependencies
      except flink and its transitive dependencies. The resulting
fat-jar can be executed
      on a cluster. Change the value of Program-Class if your program
entry point changes. -->
   <build>
      <plugins>
         <!-- We use the maven-shade plugin to create a fat jar that
contains all dependencies
         except flink and it's transitive dependencies. The resulting
fat-jar can be executed
         on a cluster. Change the value of Program-Class if your
program entry point changes. -->
         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.1</version>
            <executions>
               <!-- Run shade goal on package phase -->
               <execution>
                  <phase>package</phase>
                  <goals>
                     <goal>shade</goal>
                  </goals>
                  <configuration>
                     <artifactSet>
                        <excludes>
                           <!-- This list contains all dependencies of
flink-dist
                           Everything else will be packaged into the fat-jar
                           -->
                           <exclude>org.apache.flink:flink-annotations</exclude>
                           <exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
                           <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
                           <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
                           <exclude>org.apache.flink:flink-core</exclude>
                           <exclude>org.apache.flink:flink-java</exclude>
                           <exclude>org.apache.flink:flink-scala_2.10</exclude>
                           <exclude>org.apache.flink:flink-runtime_2.10</exclude>
                           <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
                           <exclude>org.apache.flink:flink-clients_2.10</exclude>
                           <exclude>org.apache.flink:flink-avro_2.10</exclude>
                           <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
                           <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
                           <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>

                           <!-- Also exclude very big transitive dependencies of Flink.
                                WARNING: You have to remove these excludes if your code
                                relies on other versions of these dependencies. -->
                           <exclude>org.scala-lang:scala-library</exclude>
                           <exclude>org.scala-lang:scala-compiler</exclude>
                           <exclude>org.scala-lang:scala-reflect</exclude>
                           <exclude>com.typesafe.akka:akka-actor_*</exclude>
                           <exclude>com.typesafe.akka:akka-remote_*</exclude>
                           <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
                           <exclude>io.netty:netty-all</exclude>
                           <exclude>io.netty:netty</exclude>
                           <exclude>commons-fileupload:commons-fileupload</exclude>
                           <exclude>org.apache.avro:avro</exclude>
                           <exclude>commons-collections:commons-collections</exclude>
                           <exclude>com.thoughtworks.paranamer:paranamer</exclude>
                           <exclude>org.xerial.snappy:snappy-java</exclude>
                           <exclude>org.apache.commons:commons-compress</exclude>
                           <exclude>org.tukaani:xz</exclude>
                           <exclude>com.esotericsoftware.kryo:kryo</exclude>
                           <exclude>com.esotericsoftware.minlog:minlog</exclude>
                           <exclude>org.objenesis:objenesis</exclude>
                           <exclude>com.twitter:chill_*</exclude>
                           <exclude>com.twitter:chill-java</exclude>
                           <exclude>commons-lang:commons-lang</exclude>
                           <exclude>junit:junit</exclude>
                           <exclude>org.apache.commons:commons-lang3</exclude>
                           <exclude>org.slf4j:slf4j-api</exclude>
                           <exclude>org.slf4j:slf4j-log4j12</exclude>
                           <exclude>log4j:log4j</exclude>
                           <exclude>org.apache.commons:commons-math</exclude>
                           <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
                           <exclude>commons-logging:commons-logging</exclude>
                           <exclude>commons-codec:commons-codec</exclude>
                           <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
                           <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
                           <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
                           <exclude>stax:stax-api</exclude>
                           <exclude>com.typesafe:config</exclude>
                           <exclude>org.uncommons.maths:uncommons-maths</exclude>
                           <exclude>com.github.scopt:scopt_*</exclude>
                           <exclude>commons-io:commons-io</exclude>
                           <exclude>commons-cli:commons-cli</exclude>
                        </excludes>
                     </artifactSet>
                     <filters>
                        <filter>
                           <artifact>org.apache.flink:*</artifact>
                           <excludes>
                              <!-- exclude shaded google but include shaded curator -->
                              <exclude>org/apache/flink/shaded/com/**</exclude>
                              <exclude>web-docs/**</exclude>
                           </excludes>
                        </filter>
                        <filter>
                           <!-- Do not copy the signatures in the META-INF folder.
                                Otherwise, this might cause SecurityExceptions when using the JAR. -->
                           <artifact>*:*</artifact>
                           <excludes>
                              <exclude>META-INF/*.SF</exclude>
                              <exclude>META-INF/*.DSA</exclude>
                              <exclude>META-INF/*.RSA</exclude>
                           </excludes>
                        </filter>
                     </filters>
                     <!-- If you want to use ./bin/flink run <quickstart jar>
                          uncomment the following lines.
                          This will add a Main-Class entry to the manifest file -->
                     <!--
                     <transformers>
                        <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                           <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass>
                        </transformer>
                     </transformers>
                     -->
                     <createDependencyReducedPom>false</createDependencyReducedPom>
                  </configuration>
               </execution>
            </executions>
         </plugin>

         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.1</version>
            <configuration>
               <source>1.7</source>
               <target>1.7</target>
            </configuration>
         </plugin>
         <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.1.4</version>
            <executions>
               <execution>
                  <goals>
                     <goal>compile</goal>
                     <goal>testCompile</goal>
                  </goals>
               </execution>
            </executions>
         </plugin>

         <!-- Eclipse Integration -->
         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-eclipse-plugin</artifactId>
            <version>2.8</version>
            <configuration>
               <downloadSources>true</downloadSources>
               <projectnatures>
                  <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
                  <projectnature>org.eclipse.jdt.core.javanature</projectnature>
               </projectnatures>
               <buildcommands>
                  <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
               </buildcommands>
               <classpathContainers>
                  <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
                  <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
               </classpathContainers>
               <excludes>
                  <exclude>org.scala-lang:scala-library</exclude>
                  <exclude>org.scala-lang:scala-compiler</exclude>
               </excludes>
               <sourceIncludes>
                  <sourceInclude>**/*.scala</sourceInclude>
                  <sourceInclude>**/*.java</sourceInclude>
               </sourceIncludes>
            </configuration>
         </plugin>

         <!-- Adding scala source directories to build path -->
         <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>build-helper-maven-plugin</artifactId>
            <version>1.7</version>
            <executions>
               <!-- Add src/main/scala to eclipse build path -->
               <execution>
                  <id>add-source</id>
                  <phase>generate-sources</phase>
                  <goals>
                     <goal>add-source</goal>
                  </goals>
                  <configuration>
                     <sources>
                        <source>src/main/scala</source>
                     </sources>
                  </configuration>
               </execution>
               <!-- Add src/test/scala to eclipse build path -->
               <execution>
                  <id>add-test-source</id>
                  <phase>generate-test-sources</phase>
                  <goals>
                     <goal>add-test-source</goal>
                  </goals>
                  <configuration>
                     <sources>
                        <source>src/test/scala</source>
                     </sources>
                  </configuration>
               </execution>
            </executions>
         </plugin>
      </plugins>
   </build>
</project>

==========
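
For reference, the two packaging routes described in the pom comment above come
down to the following commands; the jar name follows from this project's
artifactId and version, and the `flink` launcher is assumed to be the one shipped
in the Flink distribution's bin/ directory:

   # fat jar containing all dependencies
   mvn clean package

   # leaner fat jar: the build-jar profile marks Flink itself as provided
   mvn clean package -Pbuild-jar

   # submit the result to a running cluster
   bin/flink run target/flink-scala-project-0.1.jar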


Thanks
S


Re: jdbc.JDBCInputFormat

Posted by Timo Walther <tw...@apache.org>.
Hi Sunny,

you are using different versions of Flink. `flink-parent` is set to 
`1.2-SNAPSHOT` but the property `flink.version` is still `1.1.2`.
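
Concretely, both coordinates should read the same, for example:

   <parent>
      <artifactId>flink-parent</artifactId>
      <groupId>org.apache.flink</groupId>
      <version>1.2-SNAPSHOT</version>
   </parent>

   <properties>
      <flink.version>1.2-SNAPSHOT</flink.version>
   </properties>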

Hope that helps.

Timo



Am 12/10/16 um 11:49 schrieb sunny patel:
> Hi guys,
>
> I am facing following error message in flink scala JDBC wordcount.
> could you please advise me on this?
>
> *Information:12/10/2016, 10:43 - Compilation completed with 2 errors 
> and 0 warnings in 1s 903ms*
> */Users/janaidu/faid/src/main/scala/fgid/JDBC.scala*
> *
> *
> *Error:(17, 67) can't expand macros compiled by previous versions of 
> Scala*
> *    val stringColum: TypeInformation[Int] = createTypeInformation[Int]*
> *
> *
> *Error:(29, 33) can't expand macros compiled by previous versions of 
> Scala*
> *    val dataset =env.createInput(inputFormat)*
> *
> *
>
> ------------ code
>
>
> package DataSources
>
> import org.apache.flink.api.common.typeinfo.TypeInformation
> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
> import org.apache.flink.api.scala._
> import org.apache.flink.api.table.typeutils.RowTypeInfo
>
> object WordCount {
>    def main(args: Array[String]) {
>
>      val PATH = getClass.getResource("").getPath
>
>      // set up the execution environment val env = ExecutionEnvironment.getExecutionEnvironment // Read data from JDBC (Kylin in our case) val stringColum: TypeInformation[Int] =createTypeInformation[Int]
>      val DB_ROWTYPE =new RowTypeInfo(Seq(stringColum))
>
>      val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>        .setDrivername("org.postgresql.jdbc.Driver")
>        .setDBUrl("jdbc:postgresql://localhost:5432/mydb")
>        .setUsername("MI")
>        .setPassword("MI")
>        .setQuery("select * FROM identity")
>        .setRowTypeInfo(DB_ROWTYPE)
>        .finish()
>
>      val dataset =env.createInput(inputFormat)
>      dataset.print()
>
>      println(PATH)
>    }
> }
> ---------pom.xml
> <?xml version="1.0" encoding="UTF-8"?> <project 
> xmlns="http://maven.apache.org/POM/4.0.0" 
> xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
> xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
> http://maven.apache.org/xsd/maven-4.0.0.xsd">
>     <modelVersion>4.0.0</modelVersion>
>     <parent>
>        <artifactId>flink-parent</artifactId>
>        <groupId>org.apache.flink</groupId>
>        <version>1.2-SNAPSHOT</version>
>     </parent>
>
>     <groupId>org.apache.flink.quickstart</groupId>
>     <artifactId>flink-scala-project</artifactId>
>     <version>0.1</version>
>     <packaging>jar</packaging>
>
>     <name>Flink Quickstart Job</name>
>     <url>http://www.myorganization.org</url>
>
>     <repositories>
>        <repository>
>           <id>apache.snapshots</id>
>           <name>Apache Development Snapshot Repository</name>
>           <url>https://repository.apache.org/content/repositories/snapshots/</url>
>           <releases>
>              <enabled>false</enabled>
>           </releases>
>           <snapshots>
>           </snapshots>
>        </repository>
>     </repositories>
>
>     <properties>
>        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>        <flink.version>1.1.2</flink.version>
>     </properties>
>
>     <!-- Execute "mvn clean package -Pbuild-jar" to build a jar file out 
> of this project! How to use the Flink Quickstart pom: a) Adding new 
> dependencies: You can add dependencies to the list below. Please check 
> if the maven-shade-plugin below is filtering out your dependency and 
> remove the exclude from there. b) Build a jar for running on the 
> cluster: There are two options for creating a jar from this project 
> b.1) "mvn clean package" -> this will create a fat jar which contains 
> all dependencies necessary for running the jar created by this pom in 
> a cluster. The "maven-shade-plugin" excludes everything that is 
> provided on a running Flink cluster. b.2) "mvn clean package 
> -Pbuild-jar" -> This will also create a fat-jar, but with much nicer 
> dependency exclusion handling. This approach is preferred and leads to 
> much cleaner jar files. --> <dependencies>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-jdbc</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-table_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-scala_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-streaming-scala_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>        <dependency>
>           <groupId>org.apache.flink</groupId>
>           <artifactId>flink-clients_2.10</artifactId>
>           <version>${flink.version}</version>
>        </dependency>
>     </dependencies>
>
>     <profiles>
>        <profile>
>           <!-- Profile for packaging correct JAR files --> <id>build-jar</id>
>           <activation>
>           </activation>
>           <dependencies>
>              <dependency>
>                 <groupId>org.apache.flink</groupId>
>                 <artifactId>flink-scala_2.10</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>              </dependency>
>              <dependency>
>                 <groupId>org.apache.flink</groupId>
>                 <artifactId>flink-streaming-scala_2.10</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>              </dependency>
>              <dependency>
>                 <groupId>org.apache.flink</groupId>
>                 <artifactId>flink-clients_2.10</artifactId>
>                 <version>${flink.version}</version>
>                 <scope>provided</scope>
>              </dependency>
>           </dependencies>
>
>           <build>
>              <plugins>
>                 <!-- disable the exclusion rules --> <plugin>
>                    <groupId>org.apache.maven.plugins</groupId>
>                    <artifactId>maven-shade-plugin</artifactId>
>                    <version>2.4.1</version>
>                    <executions>
>                       <execution>
>                          <phase>package</phase>
>                          <goals>
>                             <goal>shade</goal>
>                          </goals>
>                          <configuration>
>                             <artifactSet>
>                                <excludes combine.self="override"></excludes>
>                             </artifactSet>
>                          </configuration>
>                       </execution>
>                    </executions>
>                 </plugin>
>              </plugins>
>           </build>
>        </profile>
>     </profiles>
>
>     <!-- We use the maven-assembly plugin to create a fat jar that 
> contains all dependencies except flink and its transitive 
> dependencies. The resulting fat-jar can be executed on a cluster. 
> Change the value of Program-Class if your program entry point changes. 
> --> <build>
>        <plugins>
>           <!-- We use the maven-shade plugin to create a fat jar that contains 
> all dependencies except flink and it's transitive dependencies. The 
> resulting fat-jar can be executed on a cluster. Change the value of 
> Program-Class if your program entry point changes. --> <plugin>
>              <groupId>org.apache.maven.plugins</groupId>
>              <artifactId>maven-shade-plugin</artifactId>
>              <version>2.4.1</version>
>              <executions>
>                 <!-- Run shade goal on package phase --> <execution>
>                    <phase>package</phase>
>                    <goals>
>                       <goal>shade</goal>
>                    </goals>
>                    <configuration>
>                       <artifactSet>
>                          <excludes>
>                             <!-- This list contains all dependencies of flink-dist Everything else 
> will be packaged into the fat-jar --> <exclude>org.apache.flink:flink-annotations</exclude>
>                             <exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
>                             <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
>                             <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
>                             <exclude>org.apache.flink:flink-core</exclude>
>                             <exclude>org.apache.flink:flink-java</exclude>
>                             <exclude>org.apache.flink:flink-scala_2.10</exclude>
>                             <exclude>org.apache.flink:flink-runtime_2.10</exclude>
>                             <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
>                             <exclude>org.apache.flink:flink-clients_2.10</exclude>
>                             <exclude>org.apache.flink:flink-avro_2.10</exclude>
>                             <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
>                             <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
>                             <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>
>
>                             <!-- Also exclude very big transitive dependencies of Flink WARNING: 
> You have to remove these excludes if your code relies on other 
> versions of these dependencies. --> <exclude>org.scala-lang:scala-library</exclude>
>                             <exclude>org.scala-lang:scala-compiler</exclude>
>                             <exclude>org.scala-lang:scala-reflect</exclude>
>                             <exclude>com.typesafe.akka:akka-actor_*</exclude>
>                             <exclude>com.typesafe.akka:akka-remote_*</exclude>
>                             <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
>                             <exclude>io.netty:netty-all</exclude>
>                             <exclude>io.netty:netty</exclude>
>                             <exclude>commons-fileupload:commons-fileupload</exclude>
>                             <exclude>org.apache.avro:avro</exclude>
>                             <exclude>commons-collections:commons-collections</exclude>
>                             <exclude>com.thoughtworks.paranamer:paranamer</exclude>
>                             <exclude>org.xerial.snappy:snappy-java</exclude>
>                             <exclude>org.apache.commons:commons-compress</exclude>
>                             <exclude>org.tukaani:xz</exclude>
>                             <exclude>com.esotericsoftware.kryo:kryo</exclude>
>                             <exclude>com.esotericsoftware.minlog:minlog</exclude>
>                             <exclude>org.objenesis:objenesis</exclude>
>                             <exclude>com.twitter:chill_*</exclude>
>                             <exclude>com.twitter:chill-java</exclude>
>                             <exclude>commons-lang:commons-lang</exclude>
>                             <exclude>junit:junit</exclude>
>                             <exclude>org.apache.commons:commons-lang3</exclude>
>                             <exclude>org.slf4j:slf4j-api</exclude>
>                             <exclude>org.slf4j:slf4j-log4j12</exclude>
>                             <exclude>log4j:log4j</exclude>
>                             <exclude>org.apache.commons:commons-math</exclude>
>                             <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
>                             <exclude>commons-logging:commons-logging</exclude>
>                             <exclude>commons-codec:commons-codec</exclude>
>                             <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
>                             <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
>                             <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
>                             <exclude>stax:stax-api</exclude>
>                             <exclude>com.typesafe:config</exclude>
>                             <exclude>org.uncommons.maths:uncommons-maths</exclude>
>                             <exclude>com.github.scopt:scopt_*</exclude>
>                             <exclude>commons-io:commons-io</exclude>
>                             <exclude>commons-cli:commons-cli</exclude>
>                          </excludes>
>                       </artifactSet>
>                       <filters>
>                          <filter>
>                             <artifact>org.apache.flink:*</artifact>
>                             <excludes>
>                                <!-- exclude shaded google but include shaded curator --> <exclude>org/apache/flink/shaded/com/**</exclude>
>                                <exclude>web-docs/**</exclude>
>                             </excludes>
>                          </filter>
>                          <filter>
>                             <!-- Do not copy the signatures in the META-INF folder. Otherwise, 
> this might cause SecurityExceptions when using the JAR. --> <artifact>*:*</artifact>
>                             <excludes>
>                                <exclude>META-INF/*.SF</exclude>
>                                <exclude>META-INF/*.DSA</exclude>
>                                <exclude>META-INF/*.RSA</exclude>
>                             </excludes>
>                          </filter>
>                       </filters>
>                       <!-- If you want to use ./bin/flink run <quickstart jar> uncomment the 
> following lines. This will add a Main-Class entry to the manifest file 
> --> <!-- <transformers> <transformer 
> implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"> 
> <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass> 
> </transformer> </transformers> --> <createDependencyReducedPom>false</createDependencyReducedPom>
>                    </configuration>
>                 </execution>
>              </executions>
>           </plugin>
>
>           <plugin>
>              <groupId>org.apache.maven.plugins</groupId>
>              <artifactId>maven-compiler-plugin</artifactId>
>              <version>3.1</version>
>              <configuration>
>                 <source>1.7</source>
>                 <target>1.7</target>
>              </configuration>
>           </plugin>
>           <plugin>
>              <groupId>net.alchim31.maven</groupId>
>              <artifactId>scala-maven-plugin</artifactId>
>              <version>3.1.4</version>
>              <executions>
>                 <execution>
>                    <goals>
>                       <goal>compile</goal>
>                       <goal>testCompile</goal>
>                    </goals>
>                 </execution>
>              </executions>
>           </plugin>
>
>           <!-- Eclipse Integration --> <plugin>
>              <groupId>org.apache.maven.plugins</groupId>
>              <artifactId>maven-eclipse-plugin</artifactId>
>              <version>2.8</version>
>              <configuration>
>                 <downloadSources>true</downloadSources>
>                 <projectnatures>
>                    <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
>                    <projectnature>org.eclipse.jdt.core.javanature</projectnature>
>                 </projectnatures>
>                 <buildcommands>
>                    <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
>                 </buildcommands>
>                 <classpathContainers>
>                    <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
>                    <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
>                 </classpathContainers>
>                 <excludes>
>                    <exclude>org.scala-lang:scala-library</exclude>
>                    <exclude>org.scala-lang:scala-compiler</exclude>
>                 </excludes>
>                 <sourceIncludes>
>                    <sourceInclude>**/*.scala</sourceInclude>
>                    <sourceInclude>**/*.java</sourceInclude>
>                 </sourceIncludes>
>              </configuration>
>           </plugin>
>
>           <!-- Adding scala source directories to build path -->
>           <plugin>
>              <groupId>org.codehaus.mojo</groupId>
>              <artifactId>build-helper-maven-plugin</artifactId>
>              <version>1.7</version>
>              <executions>
>                 <!-- Add src/main/scala to eclipse build path -->
>                 <execution>
>                    <id>add-source</id>
>                    <phase>generate-sources</phase>
>                    <goals>
>                       <goal>add-source</goal>
>                    </goals>
>                    <configuration>
>                       <sources>
>                          <source>src/main/scala</source>
>                       </sources>
>                    </configuration>
>                 </execution>
>                 <!-- Add src/test/scala to eclipse build path -->
>                 <execution>
>                    <id>add-test-source</id>
>                    <phase>generate-test-sources</phase>
>                    <goals>
>                       <goal>add-test-source</goal>
>                    </goals>
>                    <configuration>
>                       <sources>
>                          <source>src/test/scala</source>
>                       </sources>
>                    </configuration>
>                 </execution>
>              </executions>
>           </plugin>
>        </plugins>
>     </build>
> </project>
> Cheers
> S


-- 
Freundliche Grüße / Kind Regards

Timo Walther

Follow me: @twalthr
https://www.linkedin.com/in/twalthr


Re: jdbc.JDBCInputFormat

Posted by sunny patel <su...@gmail.com>.
Hi guys,

I am facing the following error message in a Flink Scala JDBC word count.
Could you please advise me on this?

*Information:12/10/2016, 10:43 - Compilation completed with 2 errors and 0
warnings in 1s 903ms*
*/Users/janaidu/faid/src/main/scala/fgid/JDBC.scala*

*Error:(17, 67) can't expand macros compiled by previous versions of Scala*
*    val stringColum: TypeInformation[Int] = createTypeInformation[Int]*

*Error:(29, 33) can't expand macros compiled by previous versions of Scala*
*    val dataset =env.createInput(inputFormat)*
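
Both errors come from Scala macros: createTypeInformation is a macro, and
env.createInput materializes a TypeInformation through the same macro. The
message usually indicates a Scala binary-version mismatch, i.e. the Flink
Scala artifacts were compiled against a different major Scala version than
the compiler building the project (note that the pom below mixes a
1.2-SNAPSHOT flink-parent with flink.version 1.1.2 and _2.10 artifacts).
The clean fix is to align those versions. As a stopgap, a minimal sketch
(assuming Flink 1.1.x and its table API) that sidesteps the macro by
providing the type information explicitly:

import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
import org.apache.flink.api.table.Row
import org.apache.flink.api.table.typeutils.RowTypeInfo

// Build the type information by hand instead of via the createTypeInformation macro.
val stringColum = BasicTypeInfo.INT_TYPE_INFO
val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))

// A local implicit TypeInformation[Row] takes precedence over the imported
// macro-based one, so the compiler no longer tries to expand the macro at
// the env.createInput call site.
implicit val rowInfo: TypeInformation[Row] = DB_ROWTYPE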


------------ code


package DataSources

import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
import org.apache.flink.api.scala._
import org.apache.flink.api.table.typeutils.RowTypeInfo

object WordCount {
  def main(args: Array[String]) {

    val PATH = getClass.getResource("").getPath

    // set up the execution environment
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Read data from JDBC (Kylin in our case)
    val stringColum: TypeInformation[Int] = createTypeInformation[Int]
    val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))

    val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
      .setDrivername("org.postgresql.Driver") // the PostgreSQL driver class is org.postgresql.Driver, not org.postgresql.jdbc.Driver
      .setDBUrl("jdbc:postgresql://localhost:5432/mydb")
      .setUsername("MI")
      .setPassword("MI")
      .setQuery("select * FROM identity")
      .setRowTypeInfo(DB_ROWTYPE)
      .finish()

    val dataset =env.createInput(inputFormat)
    dataset.print()

    println(PATH)
  }
}


---------pom.xml

<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>
   <parent>
      <artifactId>flink-parent</artifactId>
      <groupId>org.apache.flink</groupId>
      <version>1.2-SNAPSHOT</version>
   </parent>

   <groupId>org.apache.flink.quickstart</groupId>
   <artifactId>flink-scala-project</artifactId>
   <version>0.1</version>
   <packaging>jar</packaging>

   <name>Flink Quickstart Job</name>
   <url>http://www.myorganization.org</url>

   <repositories>
      <repository>
         <id>apache.snapshots</id>
         <name>Apache Development Snapshot Repository</name>
         <url>https://repository.apache.org/content/repositories/snapshots/</url>
         <releases>
            <enabled>false</enabled>
         </releases>
         <snapshots>
         </snapshots>
      </repository>
   </repositories>

   <properties>
      <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      <flink.version>1.1.2</flink.version>
   </properties>

   <!--

      Execute "mvn clean package -Pbuild-jar"
      to build a jar file out of this project!

      How to use the Flink Quickstart pom:

      a) Adding new dependencies:
         You can add dependencies to the list below.
         Please check if the maven-shade-plugin below is filtering out
your dependency
         and remove the exclude from there.

      b) Build a jar for running on the cluster:
         There are two options for creating a jar from this project

         b.1) "mvn clean package" -> this will create a fat jar which
contains all
               dependencies necessary for running the jar created by
this pom in a cluster.
               The "maven-shade-plugin" excludes everything that is
provided on a running Flink cluster.

         b.2) "mvn clean package -Pbuild-jar" -> This will also create
a fat-jar, but with much
               nicer dependency exclusion handling. This approach is
preferred and leads to
               much cleaner jar files.
   -->

   <dependencies>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-jdbc</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-table_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-scala_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-streaming-scala_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
      <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-clients_2.10</artifactId>
         <version>${flink.version}</version>
      </dependency>
   </dependencies>

   <profiles>
      <profile>
         <!-- Profile for packaging correct JAR files -->
         <id>build-jar</id>
         <activation>
         </activation>
         <dependencies>
            <dependency>
               <groupId>org.apache.flink</groupId>
               <artifactId>flink-scala_2.10</artifactId>
               <version>${flink.version}</version>
               <scope>provided</scope>
            </dependency>
            <dependency>
               <groupId>org.apache.flink</groupId>
               <artifactId>flink-streaming-scala_2.10</artifactId>
               <version>${flink.version}</version>
               <scope>provided</scope>
            </dependency>
            <dependency>
               <groupId>org.apache.flink</groupId>
               <artifactId>flink-clients_2.10</artifactId>
               <version>${flink.version}</version>
               <scope>provided</scope>
            </dependency>
         </dependencies>

         <build>
            <plugins>
               <!-- disable the exclusion rules -->
               <plugin>
                  <groupId>org.apache.maven.plugins</groupId>
                  <artifactId>maven-shade-plugin</artifactId>
                  <version>2.4.1</version>
                  <executions>
                     <execution>
                        <phase>package</phase>
                        <goals>
                           <goal>shade</goal>
                        </goals>
                        <configuration>
                           <artifactSet>
                              <excludes combine.self="override"></excludes>
                           </artifactSet>
                        </configuration>
                     </execution>
                  </executions>
               </plugin>
            </plugins>
         </build>
      </profile>
   </profiles>

   <!-- We use the maven-shade plugin to create a fat jar that contains all dependencies
        except Flink and its transitive dependencies. The resulting fat jar can be executed
        on a cluster. Change the value of Program-Class if your program entry point changes. -->
   <build>
      <plugins>
         <!-- We use the maven-shade plugin to create a fat jar that contains all dependencies
              except Flink and its transitive dependencies. The resulting fat jar can be executed
              on a cluster. Change the value of Program-Class if your program entry point changes. -->
         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.1</version>
            <executions>
               <!-- Run shade goal on package phase -->
               <execution>
                  <phase>package</phase>
                  <goals>
                     <goal>shade</goal>
                  </goals>
                  <configuration>
                     <artifactSet>
                        <excludes>
                           <!-- This list contains all dependencies of flink-dist.
                                Everything else will be packaged into the fat jar. -->
                           <exclude>org.apache.flink:flink-annotations</exclude>
                           <exclude>org.apache.flink:flink-shaded-hadoop1_2.10</exclude>
                           <exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
                           <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
                           <exclude>org.apache.flink:flink-core</exclude>
                           <exclude>org.apache.flink:flink-java</exclude>
                           <exclude>org.apache.flink:flink-scala_2.10</exclude>
                           <exclude>org.apache.flink:flink-runtime_2.10</exclude>
                           <exclude>org.apache.flink:flink-optimizer_2.10</exclude>
                           <exclude>org.apache.flink:flink-clients_2.10</exclude>
                           <exclude>org.apache.flink:flink-avro_2.10</exclude>
                           <exclude>org.apache.flink:flink-examples-batch_2.10</exclude>
                           <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude>
                           <exclude>org.apache.flink:flink-streaming-java_2.10</exclude>

                           <!-- Also exclude very big transitive dependencies of Flink.

                                WARNING: You have to remove these excludes if your code
                                relies on other versions of these dependencies. -->

                           <exclude>org.scala-lang:scala-library</exclude>
                           <exclude>org.scala-lang:scala-compiler</exclude>
                           <exclude>org.scala-lang:scala-reflect</exclude>
                           <exclude>com.typesafe.akka:akka-actor_*</exclude>
                           <exclude>com.typesafe.akka:akka-remote_*</exclude>
                           <exclude>com.typesafe.akka:akka-slf4j_*</exclude>
                           <exclude>io.netty:netty-all</exclude>
                           <exclude>io.netty:netty</exclude>
                           <exclude>commons-fileupload:commons-fileupload</exclude>
                           <exclude>org.apache.avro:avro</exclude>
                           <exclude>commons-collections:commons-collections</exclude>
                           <exclude>com.thoughtworks.paranamer:paranamer</exclude>
                           <exclude>org.xerial.snappy:snappy-java</exclude>
                           <exclude>org.apache.commons:commons-compress</exclude>
                           <exclude>org.tukaani:xz</exclude>
                           <exclude>com.esotericsoftware.kryo:kryo</exclude>
                           <exclude>com.esotericsoftware.minlog:minlog</exclude>
                           <exclude>org.objenesis:objenesis</exclude>
                           <exclude>com.twitter:chill_*</exclude>
                           <exclude>com.twitter:chill-java</exclude>
                           <exclude>commons-lang:commons-lang</exclude>
                           <exclude>junit:junit</exclude>
                           <exclude>org.apache.commons:commons-lang3</exclude>
                           <exclude>org.slf4j:slf4j-api</exclude>
                           <exclude>org.slf4j:slf4j-log4j12</exclude>
                           <exclude>log4j:log4j</exclude>
                           <exclude>org.apache.commons:commons-math</exclude>
                           <exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
                           <exclude>commons-logging:commons-logging</exclude>
                           <exclude>commons-codec:commons-codec</exclude>
                           <exclude>com.fasterxml.jackson.core:jackson-core</exclude>
                           <exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
                           <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
                           <exclude>stax:stax-api</exclude>
                           <exclude>com.typesafe:config</exclude>
                           <exclude>org.uncommons.maths:uncommons-maths</exclude>
                           <exclude>com.github.scopt:scopt_*</exclude>
                           <exclude>commons-io:commons-io</exclude>
                           <exclude>commons-cli:commons-cli</exclude>
                        </excludes>
                     </artifactSet>
                     <filters>
                        <filter>
                           <artifact>org.apache.flink:*</artifact>
                           <excludes>
                              <!-- exclude shaded google but include shaded curator -->
                              <exclude>org/apache/flink/shaded/com/**</exclude>
                              <exclude>web-docs/**</exclude>
                           </excludes>
                        </filter>
                        <filter>
                           <!-- Do not copy the signatures in the META-INF folder.
                                Otherwise, this might cause SecurityExceptions when using the JAR. -->
                           <artifact>*:*</artifact>
                           <excludes>
                              <exclude>META-INF/*.SF</exclude>
                              <exclude>META-INF/*.DSA</exclude>
                              <exclude>META-INF/*.RSA</exclude>
                           </excludes>
                        </filter>
                     </filters>
                     <!-- If you want to use ./bin/flink run <quickstart jar>, uncomment the
                          following lines. This will add a Main-Class entry to the manifest file. -->
                     <!--
                     <transformers>
                        <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                           <mainClass>org.apache.flink.quickstart.StreamingJob</mainClass>
                        </transformer>
                     </transformers>
                     -->
                     <createDependencyReducedPom>false</createDependencyReducedPom>
                  </configuration>
               </execution>
            </executions>
         </plugin>

         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.1</version>
            <configuration>
               <source>1.7</source>
               <target>1.7</target>
            </configuration>
         </plugin>
         <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.1.4</version>
            <executions>
               <execution>
                  <goals>
                     <goal>compile</goal>
                     <goal>testCompile</goal>
                  </goals>
               </execution>
            </executions>
         </plugin>

         <!-- Eclipse Integration -->
         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-eclipse-plugin</artifactId>
            <version>2.8</version>
            <configuration>
               <downloadSources>true</downloadSources>
               <projectnatures>
                  <projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
                  <projectnature>org.eclipse.jdt.core.javanature</projectnature>
               </projectnatures>
               <buildcommands>
                  <buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
               </buildcommands>
               <classpathContainers>
                  <classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
                  <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
               </classpathContainers>
               <excludes>
                  <exclude>org.scala-lang:scala-library</exclude>
                  <exclude>org.scala-lang:scala-compiler</exclude>
               </excludes>
               <sourceIncludes>
                  <sourceInclude>**/*.scala</sourceInclude>
                  <sourceInclude>**/*.java</sourceInclude>
               </sourceIncludes>
            </configuration>
         </plugin>

         <!-- Adding scala source directories to build path -->
         <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>build-helper-maven-plugin</artifactId>
            <version>1.7</version>
            <executions>
               <!-- Add src/main/scala to eclipse build path -->
               <execution>
                  <id>add-source</id>
                  <phase>generate-sources</phase>
                  <goals>
                     <goal>add-source</goal>
                  </goals>
                  <configuration>
                     <sources>
                        <source>src/main/scala</source>
                     </sources>
                  </configuration>
               </execution>
               <!-- Add src/test/scala to eclipse build path -->
               <execution>
                  <id>add-test-source</id>
                  <phase>generate-test-sources</phase>
                  <goals>
                     <goal>add-test-source</goal>
                  </goals>
                  <configuration>
                     <sources>
                        <source>src/test/scala</source>
                     </sources>
                  </configuration>
               </execution>
            </executions>
         </plugin>
      </plugins>
   </build>
</project>



Cheers

S

Re: jdbc.JDBCInputFormat

Posted by Alberto Ramón <a....@gmail.com>.
Hello

I downloaded and compiled your branch:
[image: inline image 3]

And the error is the same:
[image: inline image 2]

(I tested with SQuirreL and it works OK)


*If you want any log / test, feel free to contact me!*



*Additional INFO:*
  Scala:  String = version 2.11.8

I created the template project with:

mvn archetype:generate                           \
  -DarchetypeGroupId=org.apache.flink            \
  -DarchetypeArtifactId=flink-quickstart-scala   \
  -DarchetypeVersion=1.1.2                       \
  -DgroupId=org.apache.flink.quickstart          \
  -DartifactId=flink-scala-project               \
  -Dversion=0.1                                  \
  -Dpackage=org.apache.flink.quickstart          \
  -DinteractiveMode=false

And the code is attached:

2016-10-11 12:01 GMT+02:00 Alberto Ramón <a....@gmail.com>:

> I will check it tonight
>
> Thanks
>
> 2016-10-11 11:24 GMT+02:00 Timo Walther <tw...@apache.org>:
>
>> I have opened a PR (https://github.com/apache/flink/pull/2619). Would be
>> great if you could try it and comment if it solves your problem.
>>
>> Timo
>>
>> Am 10/10/16 um 17:48 schrieb Timo Walther:
>>
>> I could reproduce the error locally. I will prepare a fix for it.
>>
>> Timo
>>
>> Am 10/10/16 um 11:54 schrieb Alberto Ramón:
>>
>> It's from June and unassigned   :(
>> Is there a workaround?
>>
>> I will try to contact the reporter, Martin Scholl.
>>
>> 2016-10-10 11:04 GMT+02:00 Timo Walther <tw...@apache.org>:
>>
>>> I think you already found the correct issue describing your problem (
>>> FLINK-4108). This should get higher priority.
>>>
>>> Timo
>>>
>>> Am 09/10/16 um 13:27 schrieb Alberto Ramón:
>>>
>>>
>>> After solving some issues, I connected to Kylin, but I can't read data
>>>
>>> import org.apache.flink.api.scala._
>>> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>> import org.apache.flink.api.table.Row
>>> import org.apache.flink.api.table.typeutils.RowTypeInfo
>>> import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
>>>
>>> var stringColum: TypeInformation[Int] = createTypeInformation[Int]
>>> val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>>> val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>>>   .setDrivername("org.apache.kylin.jdbc.Driver")
>>>   .setDBUrl("jdbc:kylin://172.17.0.2:7070/learn_kylin")
>>>   .setUsername("ADMIN")
>>>   .setPassword("KYLIN")
>>>   .setQuery("select count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt")
>>>   .setRowTypeInfo(DB_ROWTYPE)
>>>   .finish()
>>>
>>>   val dataset = env.createInput(inputFormat)
>>> dataset.print()
>>>
>>>
>>> The error is: [image: inline image 1]
>>>
>>>
>>> (I checked that the queries and config are correct with SQuirreL)
>>>
>>> (It isn't a connection problem, because if I turn off the database the error is different: "Reused Connection")
>>>
>>>
>>>
>>> Can you see a problem in my code? (I found the unresolved issue FLINK-4108, I don't know if it is related)
>>>
>>>
>>> BR, Alberto
>>>
>>> 2016-10-07 21:46 GMT+02:00 Fabian Hueske <fh...@gmail.com>:
>>>>
>>>> As the exception says the class org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>>> does not exist. You have to do: import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>>
>>>> There is no Scala implementation of this class but you can also use
>>>> Java classes in Scala.
>>>> 2016-10-07 21:38 GMT+02:00 Alberto Ramón <a....@gmail.com>:
>>>>>
>>>>> I want to use CreateInput + buildJDBCInputFormat to access a database
>>>>> in Scala
>>>>> PB1:
>>>>>
>>>>> import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>>>> Error:(25, 37) object jdbc is not a member of package org.apache.flink.api.java.io
>>>>> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>>>
>>>>> Then, I can't use: [image: inline image 1]
>>>>>
>>>>> I also tried to download the code from git and recompile it.
>>>>>
>>>>> --
>>> Freundliche Grüße / Kind Regards
>>>
>>> Timo Walther
>>>
>>> Follow me: @twalthr
>>> https://www.linkedin.com/in/twalthr
>>>
>>> --
>> Freundliche Grüße / Kind Regards
>>
>> Timo Walther
>>
>> Follow me: @twalthr
>> https://www.linkedin.com/in/twalthr
>>
>> --
>> Freundliche Grüße / Kind Regards
>>
>> Timo Walther
>>
>> Follow me: @twalthr
>> https://www.linkedin.com/in/twalthr
>>
>>
>

Re: jdbc.JDBCInputFormat

Posted by Alberto Ramón <a....@gmail.com>.
I will check it tonight

Thanks

2016-10-11 11:24 GMT+02:00 Timo Walther <tw...@apache.org>:

> I have opened a PR (https://github.com/apache/flink/pull/2619). Would be
> great if you could try it and comment if it solves your problem.
>
> Timo
>
> Am 10/10/16 um 17:48 schrieb Timo Walther:
>
> I could reproduce the error locally. I will prepare a fix for it.
>
> Timo
>
> Am 10/10/16 um 11:54 schrieb Alberto Ramón:
>
> It's from June and unassigned   :(
> Is there a workaround?
>
> I will try to contact the reporter, Martin Scholl.
>
> 2016-10-10 11:04 GMT+02:00 Timo Walther <tw...@apache.org>:
>
>> I think you already found the correct issue describing your problem (
>> FLINK-4108). This should get higher priority.
>>
>> Timo
>>
>> Am 09/10/16 um 13:27 schrieb Alberto Ramón:
>>
>>
>> After solving some issues, I connected to Kylin, but I can't read data
>>
>> import org.apache.flink.api.scala._
>> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>> import org.apache.flink.api.table.Row
>> import org.apache.flink.api.table.typeutils.RowTypeInfo
>> import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
>>
>> var stringColum: TypeInformation[Int] = createTypeInformation[Int]
>> val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>> val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>>   .setDrivername("org.apache.kylin.jdbc.Driver")
>>   .setDBUrl("jdbc:kylin://172.17.0.2:7070/learn_kylin")
>>   .setUsername("ADMIN")
>>   .setPassword("KYLIN")
>>   .setQuery("select count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt")
>>   .setRowTypeInfo(DB_ROWTYPE)
>>   .finish()
>>
>>   val dataset = env.createInput(inputFormat)
>> dataset.print()
>>
>>
>> The error is: [image: inline image 1]
>>
>>
>> (I checked that the queries and config are correct with SQuirreL)
>>
>> (It isn't a connection problem, because if I turn off the database the error is different: "Reused Connection")
>>
>>
>>
>> Can you see a problem in my code? (I found the unresolved issue FLINK-4108, I don't know if it is related)
>>
>>
>> BR, Alberto
>>
>> 2016-10-07 21:46 GMT+02:00 Fabian Hueske <fh...@gmail.com>:
>>>
>>> As the exception says the class org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>> does not exist. You have to do: import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>
>>> There is no Scala implementation of this class but you can also use Java
>>> classes in Scala.
>>> 2016-10-07 21:38 GMT+02:00 Alberto Ramón <a....@gmail.com>:
>>>>
>>>> I want to use CreateInput + buildJDBCInputFormat to access a database
>>>> in Scala
>>>> PB1:
>>>>
>>>> import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>>> Error:(25, 37) object jdbc is not a member of package org.apache.flink.api.java.io
>>>> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>>
>>>> Then, I can't use: [image: inline image 1]
>>>>
>>>> I also tried to download the code from git and recompile it.
>>>>
>>>> --
>> Freundliche Grüße / Kind Regards
>>
>> Timo Walther
>>
>> Follow me: @twalthr
>> https://www.linkedin.com/in/twalthr
>>
>> --
> Freundliche Grüße / Kind Regards
>
> Timo Walther
>
> Follow me: @twalthr
> https://www.linkedin.com/in/twalthr
>
> --
> Freundliche Grüße / Kind Regards
>
> Timo Walther
>
> Follow me: @twalthr
> https://www.linkedin.com/in/twalthr
>
>

Re: jdbc.JDBCInputFormat

Posted by Timo Walther <tw...@apache.org>.
I have opened a PR (https://github.com/apache/flink/pull/2619). Would be 
great if you could try it and comment if it solves your problem.

Timo

Am 10/10/16 um 17:48 schrieb Timo Walther:
> I could reproduce the error locally. I will prepare a fix for it.
>
> Timo
>
> Am 10/10/16 um 11:54 schrieb Alberto Ramón:
>> It's from June and unassigned   :(
>> Is there a workaround?
>>
>> I will try to contact the reporter, Martin Scholl.
>>
>> 2016-10-10 11:04 GMT+02:00 Timo Walther <twalthr@apache.org 
>> <ma...@apache.org>>:
>>
>>     I think you already found the correct issue describing your
>>     problem ( FLINK-4108). This should get higher priority.
>>
>>     Timo
>>
>>     Am 09/10/16 um 13:27 schrieb Alberto Ramón:
>>>
>>>     After solving some issues, I connected to Kylin, but I can't
>>>     read data
>>>
>>>     import org.apache.flink.api.scala._
>>>     import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>     import org.apache.flink.api.table.Row
>>>     import org.apache.flink.api.table.typeutils.RowTypeInfo
>>>     import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
>>>     var stringColum: TypeInformation[Int] = createTypeInformation[Int]
>>>     val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>>>
>>>     val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>>>        .setDrivername("org.apache.kylin.jdbc.Driver")
>>>        .setDBUrl("jdbc:kylin://172.17.0.2:7070/learn_kylin")
>>>        .setUsername("ADMIN")
>>>        .setPassword("KYLIN")
>>>        .setQuery("select count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt")
>>>        .setRowTypeInfo(DB_ROWTYPE)
>>>        .finish()
>>>
>>>     val dataset = env.createInput(inputFormat)
>>>     dataset.print()
>>>
>>>     The error is:
>>>     [image: inline image 1]
>>>
>>>
>>>     (I checked that the queries and config are correct with SQuirreL)
>>>     (It isn't a connection problem, because if I turn off the database the error is different: "Reused Connection")
>>>
>>>
>>>     Can you see a problem in my code? (I found the unresolved issue FLINK-4108, I don't know if it is related)
>>>
>>>     BR, Alberto
>>>     2016-10-07 21:46 GMT+02:00 Fabian Hueske <fhueske@gmail.com
>>>     <ma...@gmail.com>>:
>>>
>>>         As the exception says the class
>>>         org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>>         does not exist. You have to do: import
>>>         org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>         There is no Scala implementation of this class but you can
>>>         also use Java classes in Scala.
>>>         2016-10-07 21:38 GMT+02:00 Alberto Ramón
>>>         <a.ramonportoles@gmail.com <ma...@gmail.com>>:
>>>
>>>             I want to use CreateInput + buildJDBCInputFormat to access
>>>             a database in Scala
>>>             PB1:
>>>
>>>             import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>>             Error:(25, 37) object jdbc is not a member of package
>>>             org.apache.flink.api.java.io
>>>             import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>
>>>             Then, I can't use:
>>>             [image: inline image 1]
>>>
>>>             I also tried to download the code from git and recompile it.
>>>
>>     -- 
>>     Freundliche Grüße / Kind Regards
>>
>>     Timo Walther
>>
>>     Follow me: @twalthr
>>     https://www.linkedin.com/in/twalthr
>>     <https://www.linkedin.com/in/twalthr>
>>
> -- 
> Freundliche Grüße / Kind Regards
>
> Timo Walther
>
> Follow me: @twalthr
> https://www.linkedin.com/in/twalthr

-- 
Freundliche Grüße / Kind Regards

Timo Walther

Follow me: @twalthr
https://www.linkedin.com/in/twalthr

Re: jdbc.JDBCInputFormat

Posted by Timo Walther <tw...@apache.org>.
I could reproduce the error locally. I will prepare a fix for it.

Timo

Am 10/10/16 um 11:54 schrieb Alberto Ramón:
> It's from June and unassigned   :(
> Is there a workaround?
>
> I will try to contact the reporter, Martin Scholl.
>
> 2016-10-10 11:04 GMT+02:00 Timo Walther <twalthr@apache.org 
> <ma...@apache.org>>:
>
>     I think you already found the correct issue describing your
>     problem ( FLINK-4108). This should get higher priority.
>
>     Timo
>
>     Am 09/10/16 um 13:27 schrieb Alberto Ramón:
>>
>>     After solving some issues, I connected to Kylin, but I can't
>>     read data
>>
>>     import org.apache.flink.api.scala._
>>     import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>     import org.apache.flink.api.table.Row
>>     import org.apache.flink.api.table.typeutils.RowTypeInfo
>>     import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
>>     var stringColum: TypeInformation[Int] = createTypeInformation[Int]
>>     val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>>
>>     val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>>        .setDrivername("org.apache.kylin.jdbc.Driver")
>>        .setDBUrl("jdbc:kylin://172.17.0.2:7070/learn_kylin")
>>        .setUsername("ADMIN")
>>        .setPassword("KYLIN")
>>        .setQuery("select count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt")
>>        .setRowTypeInfo(DB_ROWTYPE)
>>        .finish()
>>
>>     val dataset = env.createInput(inputFormat)
>>     dataset.print()
>>
>>     The error is:
>>     [image: inline image 1]
>>
>>
>>     (I checked that the queries and config are correct with SQuirreL)
>>     (It isn't a connection problem, because if I turn off the database the error is different: "Reused Connection")
>>
>>
>>     Can you see a problem in my code? (I found the unresolved issue FLINK-4108, I don't know if it is related)
>>
>>     BR, Alberto
>>     2016-10-07 21:46 GMT+02:00 Fabian Hueske <fhueske@gmail.com
>>     <ma...@gmail.com>>:
>>
>>         As the exception says the class
>>         org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>         does not exist. You have to do: import
>>         org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>         There is no Scala implementation of this class but you can
>>         also use Java classes in Scala.
>>         2016-10-07 21:38 GMT+02:00 Alberto Ramón
>>         <a.ramonportoles@gmail.com <ma...@gmail.com>>:
>>
>>             I want to use CreateInput + buildJDBCInputFormat to access
>>             a database in Scala
>>             PB1:
>>
>>             import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>             Error:(25, 37) object jdbc is not a member of package
>>             org.apache.flink.api.java.io
>>             import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>
>>             Then, I can't use:
>>             [image: inline image 1]
>>
>>             I also tried to download the code from git and recompile it.
>>
>     -- 
>     Freundliche Grüße / Kind Regards
>
>     Timo Walther
>
>     Follow me: @twalthr
>     https://www.linkedin.com/in/twalthr
>     <https://www.linkedin.com/in/twalthr>
>
-- 
Freundliche Grüße / Kind Regards

Timo Walther

Follow me: @twalthr
https://www.linkedin.com/in/twalthr

Re: jdbc.JDBCInputFormat

Posted by Alberto Ramón <a....@gmail.com>.
It's from June and unassigned   :(
Is there a workaround?

I will try to contact the reporter, Martin Scholl.

2016-10-10 11:04 GMT+02:00 Timo Walther <tw...@apache.org>:

> I think you already found the correct issue describing your problem (
> FLINK-4108). This should get higher priority.
>
> Timo
>
> Am 09/10/16 um 13:27 schrieb Alberto Ramón:
>
>
> After solving some issues, I connected to Kylin, but I can't read data
>
> import org.apache.flink.api.scala._
> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
> import org.apache.flink.api.table.Row
> import org.apache.flink.api.table.typeutils.RowTypeInfo
> import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
>
> var stringColum: TypeInformation[Int] = createTypeInformation[Int]
> val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
> val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>   .setDrivername("org.apache.kylin.jdbc.Driver")
>   .setDBUrl("jdbc:kylin://172.17.0.2:7070/learn_kylin")
>   .setUsername("ADMIN")
>   .setPassword("KYLIN")
>   .setQuery("select count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt")
>   .setRowTypeInfo(DB_ROWTYPE)
>   .finish()
>
>   val dataset = env.createInput(inputFormat)
> dataset.print()
>
>
> The error is: [image: inline image 1]
>
>
> (I checked that the queries and config are correct with SQuirreL)
>
> (It isn't a connection problem, because if I turn off the database the error is different: "Reused Connection")
>
>
>
> Can you see a problem in my code? (I found the unresolved issue FLINK-4108, I don't know if it is related)
>
>
> BR, Alberto
>
> 2016-10-07 21:46 GMT+02:00 Fabian Hueske <fh...@gmail.com>:
>>
>> As the exception says the class org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>> does not exist. You have to do: import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>
>> There is no Scala implementation of this class but you can also use Java
>> classes in Scala.
>> 2016-10-07 21:38 GMT+02:00 Alberto Ramón <a....@gmail.com>:
>>>
>>> I want to use CreateInput + buildJDBCInputFormat to access a database
>>> in Scala
>>> PB1:
>>>
>>> import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>>> Error:(25, 37) object jdbc is not a member of package org.apache.flink.api.java.io
>>> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>>
>>> Then, I can't use: [image: inline image 1]
>>>
>>> I also tried to download the code from git and recompile it.
>>>
>>> --
> Freundliche Grüße / Kind Regards
>
> Timo Walther
>
> Follow me: @twalthr
> https://www.linkedin.com/in/twalthr
>
>

Re: jdbc.JDBCInputFormat

Posted by Timo Walther <tw...@apache.org>.
I think you already found the correct issue describing your problem ( 
FLINK-4108). This should get higher priority.

Timo

Am 09/10/16 um 13:27 schrieb Alberto Ramón:
>
> After solving some issues, I connected to Kylin, but I can't read data
>
> import org.apache.flink.api.scala._
> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
> import org.apache.flink.api.table.Row
> import org.apache.flink.api.table.typeutils.RowTypeInfo
> import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
>
>
> var stringColum: TypeInformation[Int] = createTypeInformation[Int]
> val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))
>
> val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
>    .setDrivername("org.apache.kylin.jdbc.Driver")
>    .setDBUrl("jdbc:kylin://172.17.0.2:7070/learn_kylin")
>    .setUsername("ADMIN")
>    .setPassword("KYLIN")
>    .setQuery("select count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt")
>    .setRowTypeInfo(DB_ROWTYPE)
>    .finish()
>
> val dataset = env.createInput(inputFormat)
> dataset.print()
>
> The error is:
> [image: inline image 1]
>
>
> (I checked that the queries and config are correct with SQuirreL)
> (It isn't a connection problem, because if I turn off the database the error is different: "Reused Connection")
>
>
> Can you see a problem in my code? (I found the unresolved issue FLINK-4108, I don't know if it is related)
>
> BR, Alberto
> 2016-10-07 21:46 GMT+02:00 Fabian Hueske <fhueske@gmail.com 
> <ma...@gmail.com>>:
>
>     As the exception says the class
>     org.apache.flink.api.scala.io.jdbc.JDBCInputFormat does
>     not exist. You have to do: import
>     org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>     There is no Scala implementation of this class but you can also
>     use Java classes in Scala.
>     2016-10-07 21:38 GMT+02:00 Alberto Ramón
>     <a.ramonportoles@gmail.com <ma...@gmail.com>>:
>
>         I want to use CreateInput + buildJDBCInputFormat to access
>         a database in Scala
>         PB1:
>
>         import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>         Error:(25, 37) object jdbc is not a member of package
>         org.apache.flink.api.java.io
>         import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>
>         Then, I can't use:
>         [image: inline image 1]
>
>         I also tried to download the code from git and recompile it.
>
-- 
Freundliche Grüße / Kind Regards

Timo Walther

Follow me: @twalthr
https://www.linkedin.com/in/twalthr

Re: jdbc.JDBCInputFormat

Posted by Alberto Ramón <a....@gmail.com>.
After solving some issues, I connected to Kylin, but I can't read data

import org.apache.flink.api.scala._
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
import org.apache.flink.api.table.Row
import org.apache.flink.api.table.typeutils.RowTypeInfo
import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}



var stringColum: TypeInformation[Int] = createTypeInformation[Int]
val DB_ROWTYPE = new RowTypeInfo(Seq(stringColum))

val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
  .setDrivername("org.apache.kylin.jdbc.Driver")
  .setDBUrl("jdbc:kylin://172.17.0.2:7070/learn_kylin")
  .setUsername("ADMIN")
  .setPassword("KYLIN")
  .setQuery("select count(distinct seller_id) as sellers from
kylin_sales group by part_dt order by part_dt")
  .setRowTypeInfo(DB_ROWTYPE)
  .finish()

  val dataset = env.createInput(inputFormat)
dataset.print()

The error is:
[image: inline image 1]


(I checked that the queries and config are correct with SQuirreL)

(It isn't a connection problem, because if I turn off the database the
error is different: "Reused Connection")


Can you see a problem in my code? (I found the unresolved issue
FLINK-4108, I don't know if it is related)
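
One detail that may be worth double-checking, as a hedged guess rather
than a confirmed diagnosis: count(distinct ...) usually arrives over JDBC
as a BIGINT, i.e. a Long column, so a Long-typed row type may match the
query result better than the Int-typed one above:

import org.apache.flink.api.common.typeinfo.BasicTypeInfo
import org.apache.flink.api.table.typeutils.RowTypeInfo

// "sellers" is the single aggregate column of the query above; BIGINT maps to Long.
val DB_ROWTYPE = new RowTypeInfo(Seq(BasicTypeInfo.LONG_TYPE_INFO))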

BR, Alberto




2016-10-07 21:46 GMT+02:00 Fabian Hueske <fh...@gmail.com>:

> As the exception says the class org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
> does not exist.
>
> You have to do:
>
> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>
> There is no Scala implementation of this class but you can also use Java
> classes in Scala.
>
> 2016-10-07 21:38 GMT+02:00 Alberto Ramón <a....@gmail.com>:
>
>>
>> I want to use CreateInput + buildJDBCInputFormat to access a database
>> in Scala
>>
>> PB1:
>>
>> import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
>> Error:(25, 37) object jdbc is not a member of package org.apache.flink.api.java.io
>> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>>
>> Then, I can't use:
>> [image: inline image 1]
>>
>> I also tried to download the code from git and recompile it.
>>
>>
>

Re: jdbc.JDBCInputFormat

Posted by Fabian Hueske <fh...@gmail.com>.
As the exception says the class
org.apache.flink.api.scala.io.jdbc.JDBCInputFormat does not exist.

You have to do:

import org.apache.flink.api.java.io.jdbc.JDBCInputFormat

There is no Scala implementation of this class but you can also use Java
classes in Scala.
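
Concretely, a minimal sketch of calling the Java class from Scala, in the
spirit of the code earlier in this thread (the driver class, URL,
credentials and query below are placeholders; it assumes the flink-jdbc
and flink-table 1.1.x artifacts with matching Scala versions):

import org.apache.flink.api.scala._
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
import org.apache.flink.api.table.typeutils.RowTypeInfo
import org.apache.flink.api.common.typeinfo.BasicTypeInfo

object JdbcReadSketch {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // One INT column; adjust to match the columns your query returns.
    val rowType = new RowTypeInfo(Seq(BasicTypeInfo.INT_TYPE_INFO))

    val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
      .setDrivername("org.example.Driver")          // placeholder driver class
      .setDBUrl("jdbc:example://localhost:1234/db") // placeholder URL
      .setUsername("user")                          // placeholder credentials
      .setPassword("secret")
      .setQuery("SELECT some_int_column FROM some_table") // placeholder query
      .setRowTypeInfo(rowType)
      .finish()

    val dataset = env.createInput(inputFormat)
    dataset.print()
  }
}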

2016-10-07 21:38 GMT+02:00 Alberto Ramón <a....@gmail.com>:

>
> I want to use CreateInput + buildJDBCInputFormat to access a database in Scala
>
> PB1:
>
> import org.apache.flink.api.scala.io.jdbc.JDBCInputFormat
> Error:(25, 37) object jdbc is not a member of package org.apache.flink.api.java.io
> import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
>
> Then, I can't use:
> [image: inline image 1]
>
> I also tried to download the code from git and recompile it.
>
>