Posted to user-zh@flink.apache.org by "yinghua_zh@163.com" <yi...@163.com> on 2021/01/14 10:22:51 UTC
Re: Re: flink-sql field type issue
Replied to the wrong thread, sorry!
yinghua_zh@163.com
From: yinghua_zh@163.com
Sent: 2021-01-14 18:16
To: user-zh
Subject: Re: Fwd: flink-sql field type issue
[root@sdp-10-88-100-147 flink-1.11.3]# hdfs dfs -ls hdfs://hdfsCluster/apps/ccp/flink/checkpoints/10001/39ed8aee0a2c4497be9a9d826355f595/chk-6
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
log4j:WARN No such property [datePattern] in org.apache.log4j.RollingFileAppender.
21/01/14 17:05:50 INFO util.NativeCodeLoader: Loaded the native-hadoop library
Found 1 items
-rw-rw-r-- 3 yarn hdfs 5388 2021-01-14 17:03 hdfs://hdfsCluster/apps/ccp/flink/checkpoints/10001/39ed8aee0a2c4497be9a9d826355f595/chk-6/_metadata // Queried after the JobManager showed the checkpoint as complete: the directory was indeed created and already contains the _metadata file
[root@sdp-10-88-100-147 flink-1.11.3]# hdfs dfs -ls hdfs://hdfsCluster/apps/ccp/flink/checkpoints/10001/39ed8aee0a2c4497be9a9d826355f595/chk-6 // After I stopped the job and queried again, the directory had been deleted; the error follows
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
log4j:WARN No such property [datePattern] in org.apache.log4j.RollingFileAppender.
21/01/14 17:06:17 INFO util.NativeCodeLoader: Loaded the native-hadoop library
ls: `hdfs://hdfsCluster/apps/ccp/flink/checkpoints/10001/39ed8aee0a2c4497be9a9d826355f595/chk-6': No such file or directory // error message
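A side note on why the chk-* directory disappears: by default Flink does not retain checkpoints once a job is cancelled. If the checkpoint directory should survive a manual stop, externalized checkpoint retention has to be enabled. A sketch for flink-conf.yaml (Flink 1.11; assuming the job is cancelled rather than failed):

```yaml
# Keep the latest checkpoint directory on HDFS after the job is cancelled,
# so it can be used with "flink run -s <checkpoint-path>" to resume.
execution.checkpointing.externalized-checkpoint-retention: RETAIN_ON_CANCELLATION
```

The default, DELETE_ON_CANCELLATION, matches the behavior in the log above: the directory exists while the job runs and is removed on cancel.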
yinghua_zh@163.com
From: 郝文强
Sent: 2021-01-14 17:24
To: user-zh
Subject: Fwd: flink-sql field type issue
郝文强
18846086541@163.com
--------- Forwarded message ---------
From: 郝文强 <18...@163.com>
Date: 2021-01-14 17:23
To: dev@flink.apache.org <de...@flink.apache.org>
Subject: Fwd: flink-sql field type issue
郝文强
18846086541@163.com
--------- Forwarded message ---------
From: 郝文强 <18...@163.com>
Date: 2021-01-14 17:22
To: dev-help@flink.apache.org <de...@flink.apache.org>
Subject: flink-sql field type issue
Creating the table in sql-client fails with: java.math.BigInteger cannot be cast to java.lang.Long
Could anyone take a look?
The source table is MySQL's information_schema.tables; its schema is:
table_catalog varchar(64)
table_schema varchar(64)
table_name varchar(64)
table_type enum('base table','view','system view')
engine varchar(64)
version int
row_format enum('fixed','dynamic','compressed','redundant','compact','paged')
table_rows bigint unsigned
avg_row_length bigint unsigned
data_length bigint unsigned
max_data_length bigint unsigned
index_length bigint unsigned
data_free bigint unsigned
auto_increment bigint unsigned
create_time timestamp
update_time datetime
check_time datetime
table_collation varchar(64)
checksum bigint
create_options varchar(256)
table_comment text
My Flink SQL DDL:
CREATE TABLE info_table (
TABLE_CATALOG STRING,
TABLE_SCHEMA STRING,
TABLE_NAME STRING,
TABLE_TYPE STRING,
ENGINE STRING,
VERSION INT,
ROW_FORMAT STRING,
TABLE_ROWS BIGINT,
AVG_ROW_LENGTH BIGINT,
DATA_LENGTH BIGINT,
MAX_DATA_LENGTH BIGINT,
INDEX_LENGTH BIGINT,
DATA_FREE BIGINT,
AUTO_INCREMENT BIGINT,
CREATE_TIME TIMESTAMP,
UPDATE_TIME TIMESTAMP,
CHECK_TIME TIMESTAMP,
TABLE_COLLATION STRING,
CHECKSUM INTEGER,
CREATE_OPTIONS STRING,
TABLE_COMMENT STRING,
PRIMARY KEY (`TABLE_NAME`) NOT ENFORCED
) WITH (
'connector' = 'jdbc',
'url' = 'jdbc:mysql://localhost:3306/information_schema',
'username' = 'root',
'password' = 'root',
'table-name' = 'TABLES'
);
I changed the types several times, and each attempt fails with one of:
java.math.BigInteger cannot be cast to java.lang.Integer
java.lang.Long cannot be cast to java.math.BigDecimal
java.lang.Long cannot be cast to java.lang.Integer
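If I read the Flink JDBC connector's type mapping correctly, MySQL BIGINT UNSIGNED maps to Flink DECIMAL(20, 0) (an unsigned 64-bit value does not fit in a signed BIGINT, so the driver hands it back as BigInteger), while signed BIGINT maps to BIGINT. A sketch of the DDL following that mapping, with the connection options copied unchanged from the statement above:

```sql
CREATE TABLE info_table (
  TABLE_CATALOG STRING,
  TABLE_SCHEMA STRING,
  TABLE_NAME STRING,
  TABLE_TYPE STRING,
  ENGINE STRING,
  VERSION INT,
  ROW_FORMAT STRING,
  TABLE_ROWS DECIMAL(20, 0),      -- bigint unsigned -> DECIMAL(20, 0)
  AVG_ROW_LENGTH DECIMAL(20, 0),
  DATA_LENGTH DECIMAL(20, 0),
  MAX_DATA_LENGTH DECIMAL(20, 0),
  INDEX_LENGTH DECIMAL(20, 0),
  DATA_FREE DECIMAL(20, 0),
  AUTO_INCREMENT DECIMAL(20, 0),
  CREATE_TIME TIMESTAMP,
  UPDATE_TIME TIMESTAMP,
  CHECK_TIME TIMESTAMP,
  TABLE_COLLATION STRING,
  CHECKSUM BIGINT,                -- signed bigint -> BIGINT, not INTEGER
  CREATE_OPTIONS STRING,
  TABLE_COMMENT STRING,
  PRIMARY KEY (`TABLE_NAME`) NOT ENFORCED
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:mysql://localhost:3306/information_schema',
  'username' = 'root',
  'password' = 'root',
  'table-name' = 'TABLES'
);
```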
郝文强
18846086541@163.com
Re: Re: flink-sql field type issue
Posted by zhang hao <zh...@gmail.com>.
Looking at the source, a BigInteger will always fail to convert; none of the accessors match that type:
public boolean isNullAt(int pos) {
    return this.fields[pos] == null;
}

@Override
public boolean getBoolean(int pos) {
    return (boolean) this.fields[pos];
}

@Override
public byte getByte(int pos) {
    return (byte) this.fields[pos];
}

@Override
public short getShort(int pos) {
    return (short) this.fields[pos];
}

@Override
public int getInt(int pos) {
    return (int) this.fields[pos];
}

@Override
public long getLong(int pos) {
    return (long) this.fields[pos];
}

@Override
public float getFloat(int pos) {
    return (float) this.fields[pos];
}

@Override
public double getDouble(int pos) {
    return (double) this.fields[pos];
}