Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2019/09/17 07:39:43 UTC

[GitHub] [incubator-hudi] HariprasadAllaka1612 edited a comment on issue #888: Exception in thread "main" com.uber.hoodie.exception.DatasetNotFoundException: Hoodie dataset not found in path

URL: https://github.com/apache/incubator-hudi/issues/888#issuecomment-532071607
 
 
   Hi Vinoth,
   
   Thanks for the reply.
   
   a. This is the first time I am writing data to that path.
   b. I tried with Overwrite, but I get the same exception.
   
   Below is the exception with the Overwrite save mode for your reference (a rough sketch of the write call follows the stack trace):
   Exception in thread "main" com.uber.hoodie.exception.DatasetNotFoundException: Hoodie dataset not found in path s3a://gat-datalake-raw-dev/Games3/.hoodie
   	at com.uber.hoodie.exception.DatasetNotFoundException.checkValidDataset(DatasetNotFoundException.java:45)
   	at com.uber.hoodie.common.table.HoodieTableMetaClient.<init>(HoodieTableMetaClient.java:91)
   	at com.uber.hoodie.common.table.HoodieTableMetaClient.<init>(HoodieTableMetaClient.java:78)
   	at com.uber.hoodie.common.table.HoodieTableMetaClient.initializePathAsHoodieDataset(HoodieTableMetaClient.java:310)
   	at com.uber.hoodie.common.table.HoodieTableMetaClient.initTableType(HoodieTableMetaClient.java:248)
   	at com.uber.hoodie.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:136)
   	at com.uber.hoodie.DefaultSource.createRelation(DefaultSource.scala:91)
   	at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
   	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
   	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
   	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
   	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
   	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
   	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
   	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
   	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
   	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
   	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
   	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
   	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
   	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
   	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
   	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
   	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
   	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:668)
   	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:276)
   	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:270)
   	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:228)
   	at com.playngodataengg.scala.dao.DataAccessS3.writeDataToRefinedS3(DataAccessS3.scala:28)
   	at com.playngodataengg.scala.controller.GameAndProviderDataTransform.processData(GameAndProviderDataTransform.scala:29)
   	at com.playngodataengg.scala.action.GameAndProviderData$.main(GameAndProviderData.scala:10)
   	at com.playngodataengg.scala.action.GameAndProviderData.main(GameAndProviderData.scala)
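   
   For context, the write call that produces this is shaped roughly like the sketch below. This is a minimal reconstruction, not the exact code in DataAccessS3.scala; the table name, record key, and precombine field are placeholder assumptions.
   
   import org.apache.spark.sql.{DataFrame, SaveMode}
   
   object DataAccessS3Sketch {
     // Write the DataFrame as a Hudi (com.uber.hoodie 0.4.x) dataset on S3.
     // All option values below are illustrative placeholders.
     def writeDataToRefinedS3(df: DataFrame): Unit = {
       df.write
         .format("com.uber.hoodie")                                       // Hudi 0.4.x Spark datasource
         .option("hoodie.table.name", "games")                            // placeholder table name
         .option("hoodie.datasource.write.recordkey.field", "gameId")     // placeholder record key field
         .option("hoodie.datasource.write.precombine.field", "updatedAt") // placeholder precombine field
         .mode(SaveMode.Overwrite)                                        // the save mode tried in (b) above
         .save("s3a://gat-datalake-raw-dev/Games3")                       // base path from the exception
     }
   }
   
   Note that the stack trace shows the failure inside initTableType/initializePathAsHoodieDataset, i.e. while Hudi is initializing the .hoodie metadata folder at the base path, so the Overwrite save mode takes the same code path.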
   
   I also wanted to share the dependencies I have in my Maven project:
   
   <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
     <modelVersion>4.0.0</modelVersion>
     <groupId>com.playngodataengg.scala</groupId>
     <artifactId>playngodataengg</artifactId>
     <version>1.0-SNAPSHOT</version>
     <inceptionYear>2008</inceptionYear>
     <properties>
       <scala.version>2.11.12</scala.version>
       <spark.version>2.4.0</spark.version>
       <spark.scala.lib.version>2.11</spark.scala.lib.version>
     </properties>
   
     <repositories>
       <repository>
         <id>scala-tools.org</id>
         <name>Scala-Tools Maven2 Repository</name>
         <url>http://scala-tools.org/repo-releases</url>
       </repository>
       <repository>
         <id>redshift</id>
         <url>http://redshift-maven-repository.s3-website-us-east-1.amazonaws.com/release</url>
       </repository>
     </repositories>
   
     <pluginRepositories>
       <pluginRepository>
         <id>scala-tools.org</id>
         <name>Scala-Tools Maven2 Repository</name>
         <url>http://scala-tools.org/repo-releases</url>
       </pluginRepository>
     </pluginRepositories>
   
     <dependencies>
       <dependency>
         <groupId>org.apache.httpcomponents</groupId>
         <artifactId>httpasyncclient</artifactId>
         <version>4.0.2</version>
         <exclusions>
           <exclusion>
             <groupId>org.apache.httpcomponents</groupId>
             <artifactId>httpcore</artifactId>
           </exclusion>
         </exclusions>
       </dependency>
       <dependency>
         <groupId>org.apache.httpcomponents</groupId>
         <artifactId>httpclient</artifactId>
         <version>4.5.2</version>
       </dependency>
       <dependency>
         <groupId>org.apache.httpcomponents</groupId>
         <artifactId>httpcore</artifactId>
         <version>4.4.5</version>
       </dependency>
       <dependency>
         <groupId>org.scala-tools</groupId>
         <artifactId>maven-scala-plugin</artifactId>
         <version>${spark.scala.lib.version}</version>
       </dependency>
       <dependency>
         <groupId>org.scala-lang</groupId>
         <artifactId>scala-library</artifactId>
         <version>${scala.version}</version>
       </dependency>
       <dependency>
         <groupId>junit</groupId>
         <artifactId>junit</artifactId>
         <version>4.4</version>
         <scope>test</scope>
       </dependency>
       <dependency>
         <groupId>org.specs</groupId>
         <artifactId>specs</artifactId>
         <version>1.2.5</version>
         <scope>test</scope>
       </dependency>
       <dependency>
         <groupId>com.typesafe</groupId>
         <artifactId>config</artifactId>
         <version>1.3.4</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.typesafe.scala-logging/scala-logging -->
       <dependency>
         <groupId>com.typesafe.scala-logging</groupId>
         <artifactId>scala-logging_${spark.scala.lib.version}</artifactId>
         <version>3.9.2</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-scala -->
       <dependency>
         <groupId>org.apache.logging.log4j</groupId>
         <artifactId>log4j-scala</artifactId>
         <version>11.0</version>
         <type>pom</type>
       </dependency>
       <dependency>
         <groupId>commons-codec</groupId>
         <artifactId>commons-codec</artifactId>
         <version>1.13</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
       <dependency>
         <groupId>org.apache.spark</groupId>
         <artifactId>spark-core_${spark.scala.lib.version}</artifactId>
         <version>${spark.version}</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
       <dependency>
         <groupId>org.apache.spark</groupId>
         <artifactId>spark-sql_${spark.scala.lib.version}</artifactId>
         <version>${spark.version}</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql-kafka-0-10 -->
       <dependency>
         <groupId>org.apache.spark</groupId>
         <artifactId>spark-sql-kafka-0-10_${spark.scala.lib.version}</artifactId>
         <version>${spark.version}</version>
         <scope>provided</scope>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk -->
       <dependency>
         <groupId>com.amazonaws</groupId>
         <artifactId>aws-java-sdk</artifactId>
         <version>1.11.623</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-aws -->
       <dependency>
         <groupId>org.apache.hadoop</groupId>
         <artifactId>hadoop-aws</artifactId>
         <version>3.2.0</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
       <dependency>
         <groupId>org.apache.hadoop</groupId>
         <artifactId>hadoop-common</artifactId>
         <version>3.2.0</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.databricks/spark-avro -->
       <dependency>
         <groupId>com.databricks</groupId>
         <artifactId>spark-avro_${spark.scala.lib.version}</artifactId>
         <version>4.0.0</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.databricks/spark-redshift -->
       <dependency>
         <groupId>com.databricks</groupId>
         <artifactId>spark-redshift_${spark.scala.lib.version}</artifactId>
         <version>2.0.1</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.amazon.redshift/redshift-jdbc41 -->
       <dependency>
         <groupId>com.amazon.redshift</groupId>
         <artifactId>redshift-jdbc41</artifactId>
         <version>1.2.27.1051</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.uber.hoodie/hoodie-common -->
       <dependency>
         <groupId>com.uber.hoodie</groupId>
         <artifactId>hoodie-common</artifactId>
         <version>0.4.7</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.uber.hoodie/hoodie-hive -->
       <dependency>
         <groupId>com.uber.hoodie</groupId>
         <artifactId>hoodie-hive</artifactId>
         <version>0.4.7</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.uber.hoodie/hoodie-hadoop-mr -->
       <dependency>
         <groupId>com.uber.hoodie</groupId>
         <artifactId>hoodie-hadoop-mr</artifactId>
         <version>0.4.7</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.uber.hoodie/hoodie-spark -->
       <dependency>
         <groupId>com.uber.hoodie</groupId>
         <artifactId>hoodie-spark</artifactId>
         <version>0.4.7</version>
       </dependency>
       <!-- https://mvnrepository.com/artifact/com.uber.hoodie/hoodie-client -->
       <dependency>
         <groupId>com.uber.hoodie</groupId>
         <artifactId>hoodie-client</artifactId>
         <version>0.4.7</version>
       </dependency>
     </dependencies>
   
     <build>
       <sourceDirectory>src/main/scala</sourceDirectory>
       <testSourceDirectory>src/test/scala</testSourceDirectory>
       <plugins>
         <plugin>
           <groupId>org.scala-tools</groupId>
           <artifactId>maven-scala-plugin</artifactId>
           <executions>
             <execution>
               <goals>
                 <goal>compile</goal>
                 <goal>testCompile</goal>
               </goals>
             </execution>
           </executions>
           <configuration>
             <scalaVersion>${scala.version}</scalaVersion>
             <args>
               <arg>-target:jvm-1.5</arg>
             </args>
           </configuration>
         </plugin>
         <plugin>
           <groupId>org.apache.maven.plugins</groupId>
           <artifactId>maven-eclipse-plugin</artifactId>
           <configuration>
             <downloadSources>true</downloadSources>
             <buildcommands>
               <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
             </buildcommands>
             <additionalProjectnatures>
               <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
             </additionalProjectnatures>
             <classpathContainers>
               <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
               <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
             </classpathContainers>
           </configuration>
         </plugin>
       </plugins>
     </build>
     <reporting>
       <plugins>
         <plugin>
           <groupId>org.scala-tools</groupId>
           <artifactId>maven-scala-plugin</artifactId>
           <configuration>
             <scalaVersion>${scala.version}</scalaVersion>
           </configuration>
         </plugin>
       </plugins>
     </reporting>
   </project>
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services