Spring Kafka Wait For Assignment Failed After Update To SpringBoot 3.2.0 #2978
Thanks for getting in touch. Please try downgrading your Spring Kafka dependency. If that doesn't change the problem, please provide a complete minimal sample that reproduces it. You can share it with us by pushing it to a separate repository on GitHub or by zipping it and attaching it to this issue.
I have the same issue.
@scottfrederick Hi, thank you for your feedback. Yes, I will attach a sample project that reproduces the issue.
In my integration tests I had the same issue after upgrading to 3.2. I worked around it by making sure each integration test uses a single topic and has partitions set to 1 on the embedded Kafka annotation. It seems like a weird race condition: when I attached a debugger I could see the number of expected partitions was correct, but when just running the test it would fail with the same error as reported here.
Hi @ThomHurks, could you add an example project that reproduces the error? When I tried with my project, the error didn't appear.
If you would like us to look at this issue, please provide the requested information. If the information is not provided within the next 7 days, this issue will be closed.
Closing due to lack of requested feedback. If you would like us to look at this issue, please provide the requested information and we will re-open the issue.
More information on my observation of this change in behaviour. Steps to reproduce:
Thanks, @rgolder1. The problem does not occur with Spring Boot 3.2.1 when Spring Kafka is downgraded to 3.0.13 by adding a version override (see the sketch below). /cc @artembilan, who maintains Spring Kafka.
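The exact override snippet was lost from the comment above; for a Maven build that inherits from the Spring Boot parent, pinning the managed version property is the usual mechanism (a sketch, assuming Maven; Gradle users would override the same managed version differently):

```xml
<!-- Sketch: override Spring Boot's managed Spring Kafka version.
     The spring-kafka.version property is defined by spring-boot-dependencies. -->
<properties>
    <spring-kafka.version>3.0.13</spring-kafka.version>
</properties>
```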
Thank you for the great example to play with! Although it is too complicated to digest quickly, here is a workaround for now.

Starting with the Spring for Apache Kafka version pulled in by Spring Boot 3.2, the embedded broker apparently does not take into account the number of partitions set in the `@EmbeddedKafka` configuration when topics are auto-created. The workaround is to declare the topics on the annotation, or more fine-grained per test, to make the embedded Kafka create those topics for us upfront, as in the sketch below.
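The original comment's code snippets did not survive extraction; a minimal sketch of the declare-topics-upfront workaround, assuming a JUnit 5 test and a hypothetical topic name, might look like this:

```java
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.test.context.junit.jupiter.SpringJUnitConfig;

// Hypothetical sketch: declaring the topics on the annotation makes the
// embedded broker create them upfront with the requested partition count,
// instead of relying on broker-side auto-creation (which hits the bug).
@SpringJUnitConfig
@EmbeddedKafka(partitions = 2, topics = { "test-topic" })
class MyKafkaIntegrationTests {
    // ... test methods that consume from "test-topic"
}
```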
As for the problem with KRaft: the broker property that controls the partition count of auto-created topics is not populated from the annotation, so we propagate a specific property into the Kafka broker started via embedded Kafka for topic auto-creation; see the sketch below.
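The original snippet was lost; assuming the property in question is the broker's `num.partitions` (which matches the fix commit quoted below), a sketch of pushing it in via the annotation's `brokerProperties` attribute:

```java
import org.springframework.kafka.test.context.EmbeddedKafka;

// Hypothetical sketch: force the broker's default partition count so that
// topics auto-created on the broker side match what the test expects.
@EmbeddedKafka(brokerProperties = { "num.partitions=2" })
class MyKafkaIntegrationTests {
    // ...
}
```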
So, feel free to raise a respective issue in https://github.com/spring-projects/spring-kafka/issues. Or, @wilkinsona, we can just transfer this issue over there.
Fixes: #2978

If we don't create topics manually, that can be done automatically on the broker side according to its configuration. For that goal, the `EmbeddedKafkaKraftBroker` fails to populate the `KafkaConfig.NumPartitionsProp(): "" + this.partitionsPerTopic` broker property from the `@EmbeddedKafka` configuration.

* Propagate the `partitionsPerTopic` option down to the embedded broker(s) in the `EmbeddedKafkaKraftBroker`
* Some other simple refactoring in the `EmbeddedKafkaKraftBroker`
* Verify the option is propagated via a new unit test, `KafkaTestUtilsTests.topicAutomaticallyCreatedWithProperNumberOfPartitions()`
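The diff itself is not included in this thread; based on the description above, the heart of the change is presumably along these lines (a sketch, not the actual commit; the class below is a stand-in rather than the real `EmbeddedKafkaKraftBroker`):

```java
import java.util.Properties;

import kafka.server.KafkaConfig;

// Sketch of the fix described in the commit message: before the embedded
// KRaft broker starts, the partition count configured on @EmbeddedKafka is
// copied into the broker properties so that topics auto-created on the
// broker side get the expected number of partitions.
final class PartitionPropagationSketch {

    private final int partitionsPerTopic;

    PartitionPropagationSketch(int partitionsPerTopic) {
        this.partitionsPerTopic = partitionsPerTopic;
    }

    Properties brokerProperties() {
        Properties config = new Properties();
        // KafkaConfig.NumPartitionsProp() resolves to "num.partitions"
        config.put(KafkaConfig.NumPartitionsProp(), "" + this.partitionsPerTopic);
        return config;
    }
}
```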
Problem:
Caused by: java.lang.IllegalStateException: Expected 2 but got 1 partitions
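For context, this exception message matches what `ContainerTestUtils.waitForAssignment(...)` in `spring-kafka-test` throws when a listener container is not assigned the expected number of partitions within the timeout; a minimal sketch of a call that would fail this way (the helper method and its arguments are assumptions for illustration):

```java
import org.springframework.kafka.listener.MessageListenerContainer;
import org.springframework.kafka.test.EmbeddedKafkaBroker;
import org.springframework.kafka.test.utils.ContainerTestUtils;

// Sketch: wait until the container has been assigned as many partitions as
// the embedded broker was configured with. With the bug described in this
// issue, the broker auto-creates the topic with 1 partition, so waiting
// for 2 fails with "Expected 2 but got 1 partitions".
class AssignmentHelper {

    void waitForAssignment(MessageListenerContainer container, EmbeddedKafkaBroker broker) {
        ContainerTestUtils.waitForAssignment(container, broker.getPartitionsPerTopic());
    }
}
```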