Allow mysql consumer to continue in case of duplicate key error

Authored by Ottomata.

Description

Allow mysql consumer to continue in case of duplicate key error

Now that EventLogging can run on Kafka, it will try to pick up
where it left off based on consumer offsets. This might cause
events that have been consumed to be reconsumed, which could
result in mysql duplicate key errors. This change catches
those errors and continues.

Note that this might cause missed messages in the case of batch
mysql insertion. However, this is no worse than before, when the
consumer would simply die and start back up again from the end of
the 0mq stream. As a future enhancement, we could sequentially
insert the batch of events in the case of a duplicate key error,
so that events that have not actually been inserted yet still have
a real chance of making it into mysql.

Change-Id: I45a8eb5b5b8f0ece8133aa82e2dd3de7e74cc315

Details

Committed
Ottomata, Sep 8 2015, 6:32 PM
Parents
rEEVL0fa41cfd1271: Setup.py only matches eventlogging tests
Branches
Unknown
Tags
Unknown
References
refs/changes/50/236850/2
ChangeId
I45a8eb5b5b8f0ece8133aa82e2dd3de7e74cc315