Posted to hdfs-dev@hadoop.apache.org by "Jiqiu (JIRA)" <ji...@apache.org> on 2013/04/09 10:28:15 UTC
[jira] [Created] (HDFS-4678) libhdfs casts Japanese character incorrectly to Java API
Jiqiu created HDFS-4678:
---------------------------
Summary: libhdfs casts Japanese character incorrectly to Java API
Key: HDFS-4678
URL: https://issues.apache.org/jira/browse/HDFS-4678
Project: Hadoop HDFS
Issue Type: Bug
Components: libhdfs
Affects Versions: 1.1.2
Environment: Platform: Linux64
Locale: Japanese (ja_JP.UTF-8)
Reporter: Jiqiu
Priority: Critical
Fix For: 1.2.0
Put a local file with Japanese characters in its name to HDFS; when browsing it in HDFS, the characters cannot be recognized.
Here is the test.c:
#include "hdfs.h"
#include <stdio.h>
#include <stdlib.h>   /* exit */
#include <string.h>   /* strlen */
#include <locale.h>

int main(int argc, char **argv) {
    /* The test environment locale is ja_JP.UTF-8 (see Environment above). */
    if (!setlocale(LC_CTYPE, "ja_JP.UTF-8")) {
        printf("Can not set locale type\n");
    }
    printf("0\n");
    hdfsFS fs = hdfsConnect("localhost", 9000);
    printf("1\n");
    /* "\xF0\xA0\x80\x8B" is the standard UTF-8 encoding of U+2000B. */
    const char* writePath = "/tmp/\xF0\xA0\x80\x8B.txt";
    printf("2\n");
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0);
    if (!writeFile) {
        fprintf(stderr, "Failed to open %s for writing!\n", writePath);
        exit(-1);
    }
    const char* buffer = "Hello, World! \xF0\xA0\x80\x8B";
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer)+1);
    if (hdfsFlush(fs, writeFile)) {
        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
        exit(-1);
    }
    printf("3\n");
    hdfsCloseFile(fs, writeFile);
    hdfsDisconnect(fs);
    return 0;
}
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira