Posted to issues@carbondata.apache.org by GitBox <gi...@apache.org> on 2021/02/09 15:12:13 UTC

[GitHub] [carbondata] Indhumathi27 commented on a change in pull request #4090: [CARBONDATA-4122] Support HDFS Carbon writer for Flink Carbon Streaming

Indhumathi27 commented on a change in pull request #4090:
URL: https://github.com/apache/carbondata/pull/4090#discussion_r572968625



##########
File path: docs/flink-integration-guide.md
##########
@@ -78,7 +78,7 @@ limitations under the License.
     val carbonProperties = new Properties
     // Set the carbon properties here, such as date format, store location, etc.
      
-    // Create carbon bulk writer factory. Two writer types are supported: 'Local' and 'S3'.
+    // Create carbon bulk writer factory. Three writer types are supported: 'Local', 'Hdfs' and 'S3'.

Review comment:
       Yes, I thought about the same. But since writers for the LOCAL and S3 types are already implemented, I have implemented one for HDFS. From what I can see, the differences are only in the S3 writer: it needs some extra configurations, and it does not create directories while writing the stage directories in S3. You can check CarbonS3Writer.commit.
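
       For context, a minimal sketch of how the writer type could be selected when building the bulk writer factory, modelled on the existing 'Local' example in flink-integration-guide.md. The exact builder/build signature, the 'Hdfs' type string, and the database/table names and paths below are assumptions for illustration, not the final API of this PR.

           import java.util.Properties
           import org.apache.carbon.flink.CarbonWriterFactory

           val tableProperties  = new Properties
           val writerProperties = new Properties
           val carbonProperties = new Properties
           // Set the carbon properties here, such as date format, store location, etc.

           // The type string ('Local', 'Hdfs' or 'S3') selects the concrete bulk writer;
           // with this change, 'Hdfs' would map to the new HDFS writer.
           val writerFactory = CarbonWriterFactory.builder("Hdfs").build(
             "default",                                   // database name (placeholder)
             "test_table",                                // table name (placeholder)
             "hdfs://namenode:8020/warehouse/test_table", // table path (placeholder)
             tableProperties,
             writerProperties,
             carbonProperties
           )

       Unlike the local and HDFS cases, the S3 writer additionally needs the S3 access configuration and, as noted above, skips creating directories when committing stage files, presumably because S3 object storage has no real directories.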



