Posted to jira@kafka.apache.org by "Adam Bellemare (Jira)" <ji...@apache.org> on 2021/02/11 20:00:00 UTC

[jira] [Created] (KAFKA-12323) Record timestamps not populated in event

Adam Bellemare created KAFKA-12323:
--------------------------------------

             Summary: Record timestamps not populated in event
                 Key: KAFKA-12323
                 URL: https://issues.apache.org/jira/browse/KAFKA-12323
             Project: Kafka
          Issue Type: Bug
    Affects Versions: 2.7.0
            Reporter: Adam Bellemare


Upgraded a Kafka Streams application from 2.6.0 to 2.7.0. Noticed that the events being produced had a "CreatedAt" timestamp of 0, causing downstream failures since we depend on those timestamps. Reverting to 2.6.0/2.6.1 fixed the issue. The version bump was the only change to the Kafka Streams application.
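
For reference, the zero timestamps are also visible from a plain Java consumer on the output topic. A minimal probe along these lines shows it (the broker address, group id, and topic name below are placeholders, not our actual setup):
```
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class OutputTimestampProbe {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "timestamp-probe");           // placeholder
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("streams-output-topic")); // placeholder topic
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<byte[], byte[]> record : records) {
                // With the app on 2.7.0 this prints "CreateTime:0" for every record
                System.out.println(record.timestampType() + ":" + record.timestamp());
            }
        }
    }
}
```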


Consuming the event stream produced by the 2.6.0 app with the `kafka-avro-console-consumer` and `--property print.timestamp=true` shows events prepended with their event times, such as:
```
CreateTime:1613072202271 <key> <value>
CreateTime:1613072203412 <key> <value>
CreateTime:1613072205431 <key> <value>
```

etc.

However, when those events are produced by the Kafka Streams app using 2.7.0, we get:

```
CreateTime:0 <key> <value>
CreateTime:0 <key> <value>
CreateTime:0 <key> <value>
```
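
For what it's worth, the 2.6.0 values are ordinary epoch-millisecond timestamps from around when the events were produced, while 0 is the Unix epoch itself, which is why downstream time-based logic falls over. A quick check:
```
import java.time.Instant;

public class EpochCheck {
    public static void main(String[] args) {
        // One of the CreateTime values from the 2.6.0 output above
        System.out.println(Instant.ofEpochMilli(1613072202271L)); // 2021-02-11T19:36:42.271Z
        // The CreateTime produced by the app on 2.7.0
        System.out.println(Instant.ofEpochMilli(0L));             // 1970-01-01T00:00:00Z
    }
}
```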

I don't know if there is a default value somewhere that changed, but this is a blocker for our use cases: we now need to work around it (or roll back to 2.6.1, which brings other issues we would then have to deal with). I am not sure which unit tests in the code base would validate this, but I wanted to log this bug now in case someone else has already seen it or an open issue exists (I didn't find one).
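
One possible stopgap for downstream Kafka Streams consumers (it does not fix the records themselves) would be a custom TimestampExtractor that tolerates the zero timestamps. This is only a sketch; the class name and the fallback policy are made up:
```
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.streams.processor.TimestampExtractor;

// Sketch of a fallback extractor (hypothetical class name and policy):
// use the record's CreateTime when it looks valid, otherwise fall back to the
// highest timestamp seen so far on the partition so stream time keeps advancing.
public class ZeroTimestampFallbackExtractor implements TimestampExtractor {
    @Override
    public long extract(ConsumerRecord<Object, Object> record, long partitionTime) {
        long ts = record.timestamp();
        return ts > 0 ? ts : Math.max(partitionTime, 0L);
    }
}

// Registered in the downstream app via
// StreamsConfig.DEFAULT_TIMESTAMP_EXTRACTOR_CLASS_CONFIG.
```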


