Posted to jira@kafka.apache.org by GitBox <gi...@apache.org> on 2021/03/30 02:53:34 UTC

[GitHub] [kafka] kebab-mai-haddi opened a new pull request #10432: KAFKA-12506

kebab-mai-haddi opened a new pull request #10432:
URL: https://github.com/apache/kafka/pull/10432


   This PR writes data to the Kafka topic as part of the test setup.
   
   Change of behavior: set up a Kafka producer and publish data to a topic as part of the AdjustStreamThreadCountTest integration test.
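   
   A minimal sketch of what this setup step could look like, assuming the test class already holds an `EmbeddedKafkaCluster` (called `CLUSTER` below) and a shared `inputTopic` string; the helper name `produceTestRecord` and the key/value are placeholders, not the PR's actual code:
   
   ```
   // Imports assumed at the top of the test file.
   import java.util.Properties;
   import org.apache.kafka.clients.producer.KafkaProducer;
   import org.apache.kafka.clients.producer.ProducerConfig;
   import org.apache.kafka.clients.producer.ProducerRecord;
   import org.apache.kafka.common.serialization.StringSerializer;
   
   // Hypothetical setup helper: produce one record to the shared input topic so the
   // Streams threads have data to process. CLUSTER and inputTopic are assumed to be
   // fields of the enclosing test class.
   private static void produceTestRecord(final String inputTopic) {
       final Properties props = new Properties();
       // Take the bootstrap address from the embedded cluster rather than hard-coding it.
       props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, CLUSTER.bootstrapServers());
       props.put(ProducerConfig.ACKS_CONFIG, "all");
       props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
       props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
   
       // try-with-resources closes (and flushes) the producer once the record is sent.
       try (final KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
           producer.send(new ProducerRecord<>(inputTopic, "key", "value"));
       }
   }
   ```
   
   Produced this way, the record lands on whichever broker the embedded cluster actually started, so the test does not depend on a fixed port.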
   
   ### Committer Checklist (excluded from commit message)
   - [ ] Verify design and implementation 
   - [ ] Verify test coverage and CI build status
   - [ ] Verify documentation (including upgrade notes)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [kafka] kebab-mai-haddi closed pull request #10432: KAFKA-12506

Posted by GitBox <gi...@apache.org>.
kebab-mai-haddi closed pull request #10432:
URL: https://github.com/apache/kafka/pull/10432


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [kafka] kebab-mai-haddi edited a comment on pull request #10432: KAFKA-12506

Posted by GitBox <gi...@apache.org>.
kebab-mai-haddi edited a comment on pull request #10432:
URL: https://github.com/apache/kafka/pull/10432#issuecomment-813124316


   @ableegoldman, I need your help here.
   
   # My design:
   I plan to publish the data in the test's `setup()` function.
   Since multiple tests use the same `inputTopic`, I intend to publish a message to this topic.
   
   # Implementation:
   
   I wrote a new private function that creates a producer and sends a message to the aforementioned topic. 
   
   A problem I am facing is that my tests pass, but a few other tests fail non-deterministically. For example, testAddPartitionDuringDeleteTopic() and AclAuthorizationWithZkSaslTest (its testAclUpdateWithAuthFailure()) failed; rerunning them, they both passed. I also ran all the tests (`./gradlew test`) and that run was successful as well.
   
   When I remove my code, all tests pass smoothly.
   ## Code:
   ```
   private void publishDummyDataToTopic(final String inputTopic, final EmbeddedKafkaCluster cluster) {
           final Properties props = new Properties();
           props.put("acks", "all");
           props.put("retries", 1);
           props.put("transactional.id", "my-transactional-id");
           props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9000");
           props.put(ProducerConfig.CLIENT_ID_CONFIG, "test-client");
           props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
           props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
           final KafkaProducer<String, String> dummyProducer = new KafkaProducer<>(props);
           dummyProducer.send(new ProducerRecord<String, String>(inputTopic, Integer.toString(4), Integer.toString(4)));
           dummyProducer.close();
           return;
       }
   ```
   # My concerns [Help needed]
   ## Should I delete the data once all the tests of this package are over?
   ## Is the bootstrap server’s address correct? 
   - I considered other tests and they use the same address (localhost:9000).
   - I am talking about tests such as KafkaProducerTest->testFlushCompleteSendOfInflightBatches() that use localhost:9000 as the address.
   ## Is the way I create properties above standardized and good practice as far as Kafka tests are concerned?
   ## Is the way I send the message (dummyProducer.send()) standardized and good practice?
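   
   On the bootstrap-address and idiomatic-usage questions above: the Streams integration tests generally take the address from the embedded cluster itself (`cluster.bootstrapServers()`) rather than a fixed `localhost:9000`, and usually produce through the shared test helpers instead of a hand-rolled producer. A hedged sketch, assuming the Streams test utilities (`IntegrationTestUtils` and `TestUtils` from the Streams test sources) are on the classpath and `CLUSTER` is this test's `EmbeddedKafkaCluster`:
   
   ```
   // Imports assumed at the top of the test file.
   import java.util.Collections;
   import java.util.Properties;
   import org.apache.kafka.common.serialization.StringSerializer;
   import org.apache.kafka.common.utils.Time;
   import org.apache.kafka.streams.KeyValue;
   import org.apache.kafka.streams.integration.utils.IntegrationTestUtils;
   import org.apache.kafka.test.TestUtils;
   
   // Sketch only (not this PR's code): produce a single key/value pair to the
   // shared input topic, taking the bootstrap address from the running cluster.
   final Properties producerConfig = TestUtils.producerConfig(
       CLUSTER.bootstrapServers(),                    // no hard-coded localhost:9000
       StringSerializer.class,
       StringSerializer.class);
   
   IntegrationTestUtils.produceKeyValuesSynchronously(
       inputTopic,                                    // topic shared by the tests in this class
       Collections.singletonList(KeyValue.pair("4", "4")),
       producerConfig,
       Time.SYSTEM);
   ```
   
   Note that a producer configured with `transactional.id` must call `initTransactions()` before `send()`; the sketch above simply omits the transactional settings. On the cleanup question: if the embedded cluster is started and torn down with the test class, the produced records disappear with it, so an explicit delete step is usually unnecessary.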
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [kafka] kebab-mai-haddi edited a comment on pull request #10432: KAFKA-12506

Posted by GitBox <gi...@apache.org>.
kebab-mai-haddi edited a comment on pull request #10432:
URL: https://github.com/apache/kafka/pull/10432#issuecomment-813124316


   @ableegoldman, I need your help here.
   
   # My design:
   I plan to publish the data in the test's `setup()` function.
   Since multiple tests use the same `inputTopic`, I intend to publish a message to this topic.
   
   # Implementation:
   
   I wrote a new private function that creates a producer and sends a message to the aforementioned topic. 
   
   A problem I am facing is that my tests pass, but a few other tests fail non-deterministically. For example, testAddPartitionDuringDeleteTopic() and AclAuthorizationWithZkSaslTest (its testAclUpdateWithAuthFailure()) failed; rerunning them, they both passed.
   
   When I remove my code, all tests pass smoothly.
   
   A few reservations about my code:
   
   ## Code:
   ```
   private void publishDummyDataToTopic(final String inputTopic, final EmbeddedKafkaCluster cluster) {
          final Properties props = new Properties();
          props.put("acks", "all");
          props.put("retries", 1);
          props.put("transactional.id", "my-transactional-id");
          props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9000");
          props.put(ProducerConfig.CLIENT_ID_CONFIG, "test-client");
          props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          final KafkaProducer<String, String> dummyProducer = new KafkaProducer<>(props);
          System.out.println("Avi created a producer");
          dummyProducer.send(new ProducerRecord<String, String>(inputTopic, Integer.toString(4), Integer.toString(4)));
          dummyProducer.close();
          return;
      }
   ```
   # My concerns
   ## Should I delete the data once all the tests of this package are over?
   ## Is the bootstrap server’s address correct? 
   - I considered other tests and they use the same address (localhost:9000).
   - I am talking about tests such as KafkaProducerTest->testFlushCompleteSendOfInflightBatches() that use localhost:9000 as the address.
   ## Is the way I create properties above standardized and good practice as far as Kafka tests are concerned?
   ## Is the way I send the message (dummyProducer.send()) standardized and good practice?
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [kafka] kebab-mai-haddi commented on pull request #10432: KAFKA-12506

Posted by GitBox <gi...@apache.org>.
kebab-mai-haddi commented on pull request #10432:
URL: https://github.com/apache/kafka/pull/10432#issuecomment-813124316


   @ableegoldman, I need your help here.
   
   - My design:
   I plan to publish the data in the test's setup() function.
   Since multiple tests use the same inputTopic, I intend to publish a message to this topic.
   
   - Implementation:
   
   I wrote a new private function that creates a producer and sends a message to the aforementioned topic. 
   
   A problem I am facing is that my tests pass, but a few other tests fail non-deterministically. For example, testAddPartitionDuringDeleteTopic() and AclAuthorizationWithZkSaslTest (its testAclUpdateWithAuthFailure()) failed; rerunning them, they both passed.
   
   When I remove my code, all tests pass smoothly.
   
   A few reservations about my code:
   
   Code:
   ```
   private void publishDummyDataToTopic(final String inputTopic, final EmbeddedKafkaCluster cluster) {
          System.out.println("The input topic is: " + inputTopic);
          System.out.println(cluster.bootstrapServers());
   
          final Properties props = new Properties();
          props.put("acks", "all");
          props.put("retries", 1);
          props.put("transactional.id", "my-transactional-id");
          props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9000");
          props.put(ProducerConfig.CLIENT_ID_CONFIG, "test-client");
          props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          final KafkaProducer<String, String> dummyProducer = new KafkaProducer<>(props);
          System.out.println("Avi created a producer");
   //        dummyProducer.initTransactions();
          dummyProducer.send(new ProducerRecord<String, String>(inputTopic, Integer.toString(4), Integer.toString(4)));
   //        try {
   //            dummyProducer.beginTransaction();
   //            for (int i = 0; i < 2; i++) {
   //                dummyProducer.send(new ProducerRecord<String, String>(inputTopic, Integer.toString(i), Integer.toString(i)));
   //            }
   //            dummyProducer.commitTransaction();
   //        } catch (final KafkaException e) {
   //            // For all other exceptions, just abort the transaction and try again.
   //            dummyProducer.abortTransaction();
   //        }
          dummyProducer.close();
          System.out.println("Avi sent a message!");
          return;
       }
   ```
   
   Should I delete the data once all the tests of this package are over?
   Is the bootstrap server’s address correct? I considered other tests and they use the same address (localhost:9000). I am talking about tests such as KafkaProducerTest->testFlushCompleteSendOfInflightBatches() that use localhost:9000 as the address.
   Is the way I create properties above standardized and a good practice as far as Kafka tests are concerned?
   Is the way I send the message (dummyProducer.send()) standardized and a good practice?
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [kafka] kebab-mai-haddi edited a comment on pull request #10432: KAFKA-12506

Posted by GitBox <gi...@apache.org>.
kebab-mai-haddi edited a comment on pull request #10432:
URL: https://github.com/apache/kafka/pull/10432#issuecomment-813124316


   @ableegoldman, I need your help here.
   
   # My design:
   I plan to publish the data in the test's setup() function.
   Since multiple tests use the same `inputTopic`, I intend to publish a message to this topic.
   
   # Implementation:
   
   I wrote a new private function that creates a producer and sends a message to the aforementioned topic. 
   
   A problem I am facing is that my tests pass, but a few other tests fail non-deterministically. For example, testAddPartitionDuringDeleteTopic() and AclAuthorizationWithZkSaslTest (its testAclUpdateWithAuthFailure()) failed; rerunning them, they both passed.
   
   When I remove my code, all tests pass smoothly.
   
   A few reservations about my code:
   
   ## Code:
   ```
   private void publishDummyDataToTopic(final String inputTopic, final EmbeddedKafkaCluster cluster) {
          final Properties props = new Properties();
          props.put("acks", "all");
          props.put("retries", 1);
          props.put("transactional.id", "my-transactional-id");
          props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9000");
          props.put(ProducerConfig.CLIENT_ID_CONFIG, "test-client");
          props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          final KafkaProducer<String, String> dummyProducer = new KafkaProducer<>(props);
          System.out.println("Avi created a producer");
          dummyProducer.send(new ProducerRecord<String, String>(inputTopic, Integer.toString(4), Integer.toString(4)));
          dummyProducer.close();
          return;
      }
   ```
   # My concerns
   ## Should I delete the data once all the tests of this package are over?
   ## Is the bootstrap server’s address correct? 
   - I considered other tests and they use the same address (localhost:9000).
   - I am talking about tests such as KafkaProducerTest->testFlushCompleteSendOfInflightBatches() that use localhost:9000 as the address.
   ## Is the way I create properties above standardized and good practice as far as Kafka tests are concerned?
   ## Is the way I send the message (dummyProducer.send()) standardized and good practice?
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [kafka] kebab-mai-haddi edited a comment on pull request #10432: KAFKA-12506

Posted by GitBox <gi...@apache.org>.
kebab-mai-haddi edited a comment on pull request #10432:
URL: https://github.com/apache/kafka/pull/10432#issuecomment-813124316


   @ableegoldman, I need your help here.
   
   # My design:
   I plan to publish the data in the test's `setup()` function.
   Since multiple tests use the same `inputTopic`, I intend to publish a message to this topic.
   
   # Implementation:
   
   I wrote a new private function that creates a producer and sends a message to the aforementioned topic. 
   
   A problem I am facing is that my tests pass, but a few other tests fail non-deterministically. For example, testAddPartitionDuringDeleteTopic() and AclAuthorizationWithZkSaslTest (its testAclUpdateWithAuthFailure()) failed; rerunning them, they both passed. I also ran all the tests (`./gradlew test`) and that run was successful as well.
   
   When I remove my code, all tests pass smoothly.
   
   A few reservations about my code:
   
   ## Code:
   ```
   private void publishDummyDataToTopic(final String inputTopic, final EmbeddedKafkaCluster cluster) {
           final Properties props = new Properties();
           props.put("acks", "all");
           props.put("retries", 1);
           props.put("transactional.id", "my-transactional-id");
           props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9000");
           props.put(ProducerConfig.CLIENT_ID_CONFIG, "test-client");
           props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
           props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
           final KafkaProducer<String, String> dummyProducer = new KafkaProducer<>(props);
           dummyProducer.send(new ProducerRecord<String, String>(inputTopic, Integer.toString(4), Integer.toString(4)));
           dummyProducer.close();
           return;
       }
   ```
   # My concerns
   ## Should I delete the data once all the tests of this package are over?
   ## Is the bootstrap server’s address correct? 
   - I considered other tests and they use the same address (localhost:9000).
   - I am talking about tests such as KafkaProducerTest->testFlushCompleteSendOfInflightBatches() that use localhost:9000 as the address.
   ## Is the way I create properties above standardized and good practice as far as Kafka tests are concerned?
   ## Is the way I send the message (dummyProducer.send()) standardized and good practice?
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org