* Protobuf all the things
* oops
* Protobufize events to protect against malformed JSON (see the protobuf sketch after this list)
* format the generated files (will need to remember this for the future)
* format
* clean up kafka produce serializer
* fixes
* Use ReplacingMergeTree for elements, remove element_groups and use elements_hash as a virtual "pk" (see the table sketch after this list)
* remove unused ELEMENT_GROUP_TABLE_SQL
* merge fixes
* use redis cache to avoid writing duplicate elements to clickhouse (see the dedup sketch after this list)
* move fakeredis to requirements.txt
* add team_id to cache key
* remove elements_group kafka table references
* add elements_hash to clickhouse element serializer
* fix cache key
* rename a few keys
* add test runner to ease pycharm dev
* fix some mypy errors
* fix typo
Co-authored-by: Eric <eeoneric@gmail.com>
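
A rough illustration of what protobuf-encoding an event before producing it to Kafka could look like. The `events_pb2.Event` class and its fields are assumptions standing in for the generated schema, not the project's real definitions.

```python
# Hypothetical sketch: encode an event as protobuf before producing it to
# Kafka, so a malformed JSON payload can never reach the consumer.
# `events_pb2.Event` stands in for a class generated from the .proto schema;
# the field names below are illustrative, not the actual schema.
import json

from events_pb2 import Event  # assumed generated module


def serialize_event(data: dict) -> bytes:
    event = Event(
        uuid=str(data["uuid"]),
        event=data["event"],
        team_id=data["team_id"],
        # Free-form properties can still travel as a JSON string inside the
        # protobuf envelope.
        properties=json.dumps(data.get("properties", {})),
    )
    return event.SerializeToString()
```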
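A sketch of the kind of ClickHouse DDL the ReplacingMergeTree change implies, with `elements_hash` acting as the de-duplication key in place of a separate element_groups table. The column list and ordering key are assumptions for illustration, not the project's actual schema.

```python
# Illustrative only: elements stored on ReplacingMergeTree, de-duplicated by
# (team_id, elements_hash, position in chain) rather than via element_groups.
# Re-inserting the same chain produces rows that collapse on merge.
ELEMENTS_TABLE_SQL = """
CREATE TABLE IF NOT EXISTS elements (
    team_id Int64,
    elements_hash String,
    order_idx Int32,
    tag_name String,
    text String,
    attributes String,
    created_at DateTime64(6, 'UTC')
) ENGINE = ReplacingMergeTree(created_at)
ORDER BY (team_id, elements_hash, order_idx)
"""
```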
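And a sketch of the Redis-backed de-duplication guard: before inserting an element chain into ClickHouse, check a cache key built from the team_id and the elements_hash. The function name, key format, and TTL are illustrative assumptions; in tests a fakeredis backend can stand in for a real Redis.

```python
# Illustrative guard: skip the ClickHouse insert if this team's element chain
# (identified by elements_hash) was written recently. Key format and TTL are
# assumptions for this sketch.
from django.core.cache import cache

ELEMENTS_CACHE_TTL = 24 * 60 * 60  # one day, in seconds


def should_write_elements(team_id: int, elements_hash: str) -> bool:
    key = f"elements_written_{team_id}_{elements_hash}"
    if cache.get(key):
        return False  # already written, avoid a duplicate ClickHouse row
    cache.set(key, True, timeout=ELEMENTS_CACHE_TTL)
    return True
```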
* Publish events to Kafka for consumption
* Commit Avro IDLs for event schemas
* convert client to use github.com/dpkp/kafka-python (see the producer sketch after this list)
* events loaded into clickhouse from Kafka (see the Kafka engine table sketch after this list)
* remove cruft
* include kafka migrations
* bugfixes for migrations
* use constants for consistency
* wrap up local migrations
* small fixes
* tune ups
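
A minimal sketch of producing serialized events with kafka-python (github.com/dpkp/kafka-python); the broker address and topic name are placeholders, not the project's actual configuration.

```python
# Minimal kafka-python producer sketch; broker and topic are placeholders.
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")


def produce_event(payload: bytes, topic: str = "events_ingestion") -> None:
    producer.send(topic, payload)
    producer.flush()  # block until the broker has acknowledged the batch
```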
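And a sketch of the ClickHouse side of loading events from Kafka, the kind of objects the kafka migrations would create: a Kafka engine table that consumes the topic, plus a materialized view that copies rows into the main events table. Table names, columns, and settings are assumptions; with protobuf-encoded messages the `kafka_format` and `kafka_schema` settings would change accordingly.

```python
# Illustrative ClickHouse DDL: a Kafka engine table that consumes the topic,
# plus a materialized view that moves rows into the main events table.
# Names, columns, and settings are placeholders for this sketch.
KAFKA_EVENTS_TABLE_SQL = """
CREATE TABLE IF NOT EXISTS kafka_events (
    uuid UUID,
    event String,
    team_id Int64,
    properties String,
    timestamp DateTime64(6, 'UTC')
) ENGINE = Kafka
SETTINGS kafka_broker_list = 'kafka:9092',
         kafka_topic_list = 'events_ingestion',
         kafka_group_name = 'clickhouse_events',
         kafka_format = 'JSONEachRow'
"""

EVENTS_MV_SQL = """
CREATE MATERIALIZED VIEW IF NOT EXISTS events_mv TO events
AS SELECT uuid, event, team_id, properties, timestamp FROM kafka_events
"""
```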