Posted to issues@flink.apache.org by "Aleksandr Filichkin (JIRA)" <ji...@apache.org> on 2018/07/19 08:02:00 UTC
[jira] [Created] (FLINK-9893) Cannot run Flink job in IDE when we have more than 1 taskslot
Aleksandr Filichkin created FLINK-9893:
------------------------------------------
Summary: Cannot run Flink job in IDE when we have more than 1 taskslot
Key: FLINK-9893
URL: https://issues.apache.org/jira/browse/FLINK-9893
Project: Flink
Issue Type: Wish
Components: Streaming
Affects Versions: 1.5.1
Reporter: Aleksandr Filichkin
The problem is that I cannot run my job in the IDE when it requires more than 1 taskslot.
import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;

public class StreamingJob {
    public static void main(String[] args) throws Exception {
        // set up the streaming execution environment
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);

        Properties kafkaProperties = new Properties();
        kafkaProperties.setProperty("bootstrap.servers", "localhost:9092");
        kafkaProperties.setProperty("group.id", "test");

        // Two distinct slot sharing groups force the source and the sink into separate slots.
        DataStream<String> kafkaSource = env
                .addSource(new FlinkKafkaConsumer010<>("flink-source", new SimpleStringSchema(), kafkaProperties))
                .name("Kafka-Source")
                .slotSharingGroup("Kafka-Source");
        kafkaSource.print().slotSharingGroup("Print");

        env.execute("Flink Streaming Java API Skeleton");
    }
}
I know that this job needs 2 slots, and in a Flink cluster I can have two taskmanagers, but how can I run it locally in the IDE?
Currently, when running locally, I have to give all operators the same slotSharingGroup name so that they share a single slot. But that is not flexible.
How do you handle this?
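One possible workaround (a sketch, not from this thread: it assumes a Flink 1.5.x classpath and uses `StreamExecutionEnvironment.createLocalEnvironment` plus `TaskManagerOptions.NUM_TASK_SLOTS` to give the embedded mini cluster more than one slot) would be:

```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.TaskManagerOptions;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class LocalSlotsExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical setup: start the embedded local cluster with 2 task slots,
        // so both slot sharing groups ("Kafka-Source" and "Print") can be scheduled.
        Configuration config = new Configuration();
        config.setInteger(TaskManagerOptions.NUM_TASK_SLOTS, 2);

        final StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironment(1, config);

        // ... build the same pipeline as above, then:
        // env.execute("Flink Streaming Java API Skeleton");
    }
}
```

The trade-off is that `createLocalEnvironment` hard-codes local execution, whereas `getExecutionEnvironment()` picks the right environment both in the IDE and on a cluster, so this is only a sketch for local debugging.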
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)