
ref: Add comment about assumption currently made by eventstream (#37367)

We are in the process of splitting the events and transactions topics.
The current implementation assumes they are either the same topic or on
the same cluster, and will not work otherwise. Add a comment to make
this clearer.
Lyn Nagara, 2 years ago
parent commit: cfd6eed6a4
1 changed file with 5 additions and 0 deletions
  1. src/sentry/eventstream/kafka/backend.py

+ 5 - 0
src/sentry/eventstream/kafka/backend.py

@@ -29,6 +29,11 @@ class KafkaEventStream(SnubaProtocolEventStream):
 
     @cached_property
     def producer(self):
+        # TODO: The producer is currently hardcoded to KAFKA_EVENTS. This assumes that the transactions
+        # topic is either the same as (or on the same cluster as) the events topic. Since we are in the
+        # process of splitting the topic, this will no longer be true. This requirement should be dropped
+        # when KafkaEventStream is refactored to be agnostic of dataset-specific details, and the correct
+        # topic should be passed in here instead of hardcoding events.
         return kafka.producers.get(settings.KAFKA_EVENTS)
 
     def delivery_callback(self, error, message):
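
The refactor the TODO describes could look roughly like the sketch below: look up the topic per dataset and cache one producer per topic, instead of always using `settings.KAFKA_EVENTS`. The names `TOPIC_BY_DATASET` and `get_producer`, and the stand-in producer objects, are hypothetical illustrations, not Sentry's actual API.

```python
# Hypothetical dataset -> topic mapping, replacing the hardcoded
# settings.KAFKA_EVENTS lookup. Topic names here are placeholders.
TOPIC_BY_DATASET = {
    "events": "events",
    "transactions": "transactions",  # would differ once the topics are split
}

# Cache of one producer per topic, so datasets sharing a topic/cluster
# also share a producer.
_producers: dict[str, str] = {}

def get_producer(dataset: str) -> str:
    """Return (and cache) a producer for the topic backing `dataset`."""
    topic = TOPIC_BY_DATASET[dataset]
    if topic not in _producers:
        # Stand-in for constructing a real Kafka producer for `topic`.
        _producers[topic] = f"producer<{topic}>"
    return _producers[topic]

print(get_producer("transactions"))  # producer<transactions>
```

With this shape, `KafkaEventStream.producer` would no longer need to assume the transactions topic lives on the events cluster; each dataset resolves its own topic and producer.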