Currently `migrate_clickhouse` would throw on a fresh install with an error similar to:
```
posthog-events Migration would get applied: 0024_materialize_window_and_session_id
posthog-events Traceback (most recent call last):
posthog-events   File "manage.py", line 21, in <module>
posthog-events     main()
posthog-events   File "manage.py", line 17, in main
posthog-events     execute_from_command_line(sys.argv)
posthog-events   File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", li
posthog-events     utility.execute()
posthog-events   File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", li
posthog-events     self.fetch_command(subcommand).run_from_argv(self.argv)
posthog-events   File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 3
posthog-events     self.execute(*args, **cmd_options)
posthog-events   File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 3
posthog-events     output = self.handle(*args, **options)
posthog-events   File "/home/posthog/code/ee/management/commands/migrate_clickhouse.py", line 41, in
posthog-events     self.migrate(CLICKHOUSE_HTTP_URL, options)
posthog-events   File "/home/posthog/code/ee/management/commands/migrate_clickhouse.py", line 57, in
posthog-events     sql = getattr(op, "_sql")
posthog-events AttributeError: 'RunPython' object has no attribute '_sql'
```
This happens because the command assumes every migration operation exposes a `_sql` attribute, which `RunPython` operations don't have.
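A minimal sketch of how the dry-run printing could tolerate `RunPython` operations; only the `_sql` attribute comes from the traceback above, while the function name and its signature are illustrative rather than PostHog's actual implementation:

```python
from typing import Iterable, Sequence, Tuple


def print_migration_plan(unapplied: Iterable[Tuple[str, Sequence[object]]]) -> None:
    """Print what each unapplied ClickHouse migration would run.

    `unapplied` yields (migration_name, operations) pairs; how they are
    collected is up to the real management command.
    """
    for migration_name, operations in unapplied:
        print(f"Migration would get applied: {migration_name}")
        for op in operations:
            # RunPython operations wrap a callable rather than SQL, so fall
            # back gracefully instead of raising AttributeError on `_sql`.
            sql = getattr(op, "_sql", None)
            if sql is not None:
                print(f"  {sql}")
            else:
                print(f"  non-SQL operation: {type(op).__name__}")
```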
* Make clickhouse server image version configurable
* Attempt to use a reusable workflow for backend tests
* Try a composite action
* Add missing shell: bash
* Temporarily remove a step
* Cache id parameter
* Include the action when needing to rerun
* Move checking migrations to separate file
This didn't work in the composite-action tests and slowed down every
parallelized test execution for no reason. It should have been a
separate job in the first place.
* Rename job
* Improve check-migrations
* Start stack in the new job
* Remove shell
* Solve flaky test
* Remove unused file
* Create a test timings job
* Use checkout@v2 - https://github.com/cds-snc/github-actions/issues/23
* Ignore coverage files
* Save backend test durations
* Install pytest-split
* Clean up ci-backend
* Move actions code around
* wording improvement
* Save backend test durations
* Add a comment
* Make it possible to run tests with a different clickhouse version
* Remove debugging code
* Fix typo
* Boolean inputs please work
Fixes overeager test running as per https://github.com/actions/runner/issues/1483
Co-authored-by: PostHog Bot <hey@posthog.com>
* option to expand taxonomic filter infinite lists
* make the button blue, update copy
* remove duplicate
* test for is_event_property filter
* fix test
* refactor fetching
* test expandable infinite list
* fix clicking on "$time" in event prop cypress
* remove debug
* fix data-attr
* clean up shared attrs
* test with flag
* use "scoped endpoint" instead of "extended"
* don't click on skeleton
* describe scoped endpoint
* Revert "describe scoped endpoint"
This reverts commit 8450b66ef5.
* describe scoped endpoint
* remove comment
* type div props
* pluralize list
* cleaner variable names
* make sure there's something there
* pluralize some more
* don't tab if we can select something in the list
* super lazy vms
* address review
* update tests
* fix typo
* test updates
* ts magic
* sort stuff out
* don't import from another test
* fix lazy tests and hopefully everything will be right in the world
* update teardown tests
* increase timeout in e2e.timeout
* sort imports
* test delay
* import delay
* fix leaky failures and 🚀
* fix optimize table timeout on 0002
* refactor
* fixes + optimize last and try-catch
* formatting
* test
* Move env to be normal (not dynamic), as we don't have access that early
Co-authored-by: Tiina Turban <tiina303@gmail.com>
* perf(postgresql): remove team_id index on persondistinctids table
We already have a composite index on (team_id, distinct_id) which is being
used. To avoid the overhead on persondistinctid (pdi) writes we can remove
the standalone team_id index, thereby reducing the IO we're experiencing on
PostgreSQL.
Refers to https://github.com/PostHog/product-internal/issues/256
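A minimal sketch of what such a migration could look like, assuming the standalone `team_id` index comes from the ForeignKey's default `db_index=True`; the app label, model, field, and dependency below are illustrative, not the actual PostHog migration:

```python
# Hypothetical Django migration; names and dependencies are placeholders.
import django.db.models.deletion
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("posthog", "0001_initial"),  # placeholder for the real predecessor
    ]

    operations = [
        # Drop the standalone team_id index; lookups keep using the
        # existing composite (team_id, distinct_id) index.
        migrations.AlterField(
            model_name="persondistinctid",
            name="team",
            field=models.ForeignKey(
                db_index=False,
                on_delete=django.db.models.deletion.CASCADE,
                to="posthog.team",
            ),
        ),
    ]
```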
* format
* remove unused import