ci: Split integration tests into per-provider matrix jobs#6091
franciscojavierarceo wants to merge 5 commits into master
Conversation
Split the monolithic integration test job into 5 parallel matrix jobs (Local/File, GCP, AWS, Snowflake, Redis) so each provider shows as a separate check in PRs, improving failure visibility and parallelism. Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: e2879b59f2
```diff
-          run: make test-python-integration
+          run: |
+            uv run python -m pytest --tb=short -v -n 8 --integration --color=yes --durations=10 --timeout=1200 --timeout_method=thread --dist loadgroup \
+              -k '${{ matrix.pytest_k }}' \
```
Restore coverage for provider-agnostic integration tests
This workflow now always applies -k '${{ matrix.pytest_k }}', but every matrix filter only includes provider/store keywords, so integration tests whose node IDs do not contain those terms are deselected in all shards (for example sdk/python/tests/integration/cli/test_universal_cli.py::test_universal_cli and ::test_odfv_apply). The previous command path (make test-python-integration) ran the full integration suite except a narrow Snowflake exclusion, so this change silently removes meaningful CI coverage.
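The deselection risk is easy to reproduce with a simplified model of pytest's `-k` matching (a bare word is true when it occurs in the test's node ID as a case-insensitive substring; `and`/`or`/`not` combine terms). The filter strings below are illustrative placeholders, not the PR's exact matrix values:

```python
import re

def k_matches(expr: str, node_id: str) -> bool:
    """Simplified model of pytest -k selection: each bare word in the
    expression is true when it appears (case-insensitively) in the node ID;
    and/or/not combine terms. Real pytest also consults markers and class
    names, so this is only a sketch."""
    node = node_id.lower()
    def term(m: re.Match) -> str:
        word = m.group(0)
        return word if word in ("and", "or", "not") else str(word.lower() in node)
    return bool(eval(re.sub(r"\w+", term, expr)))

# Hypothetical per-provider filters (assumed names, not the PR's strings).
provider_filters = [
    "Snowflake or test_snowflake_materialization",
    "Redis",
    "Bigquery or GCS",
    "dynamo or Aws",
]
node = "sdk/python/tests/integration/cli/test_universal_cli.py::test_universal_cli"

# No provider shard selects the provider-agnostic test...
assert not any(k_matches(f, node) for f in provider_filters)

# ...but a catch-all shard whose filter negates the union would pick it up.
catch_all = " and ".join(f"not ({f})" for f in provider_filters)
assert k_matches(catch_all, node)
```

One way to close the gap, then, is a sixth "Other" matrix entry whose `pytest_k` is the negated union of the five provider filters, so every integration test lands in exactly one shard.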
```yaml
    needs_snowflake: false
    needs_redis: false
  - name: "Snowflake"
    pytest_k: "Snowflake or test_snowflake_materialization"
```
Preserve Snowflake exclusion for historical retrieval test
The Snowflake shard filter Snowflake or test_snowflake_materialization no longer carries forward the existing test-python-integration guard that excluded Snowflake variants of test_historical_features_main (-k "(not snowflake or not test_historical_features_main)" in Makefile). This re-enables the previously excluded Snowflake historical retrieval case and can reintroduce the instability that exclusion was handling.
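The guard can be folded into the shard filter itself. The sketch below hand-evaluates a combined expression, `(Snowflake or test_snowflake_materialization) and not test_historical_features_main`, against two illustrative node IDs (both names are assumptions, not verified against the Feast test suite):

```python
def snowflake_shard_selects(node_id: str) -> bool:
    """Hand-rolled equivalent of the -k expression
    '(Snowflake or test_snowflake_materialization) and not test_historical_features_main',
    treating each term as a case-insensitive substring match, as pytest -k does."""
    node = node_id.lower()
    included = "snowflake" in node or "test_snowflake_materialization" in node
    excluded = "test_historical_features_main" in node
    return included and not excluded

# The materialization test is kept...
assert snowflake_shard_selects("offline/test_snowflake_materialization.py::test_pull")
# ...while the historically flaky Snowflake variant stays excluded.
assert not snowflake_shard_selects(
    "offline/test_historical_features_main.py::test_historical_features[snowflake]"
)
```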
Add PYTEST_K variable to test-python-integration Make target so the workflow can pass provider-specific filters while keeping the Makefile as the single source of truth for pytest flags. Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
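A minimal sketch of how such a variable might be wired (target name and test path are taken from the PR description; the recipe itself is an assumption, not the repo's actual Makefile):

```make
# Hypothetical sketch: PYTEST_K is empty by default (full suite); the CI
# workflow overrides it per matrix shard, e.g.
#   make test-python-integration PYTEST_K='Snowflake or test_snowflake_materialization'
PYTEST_K ?=

test-python-integration:
	uv run python -m pytest --integration \
		$(if $(PYTEST_K),-k "$(PYTEST_K)") \
		sdk/python/tests
```

The `$(if ...)` guard means a plain `make test-python-integration` still runs everything, keeping the Makefile the single source of truth for pytest flags.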
- Restore Redis as a `services:` container available to all matrix jobs, since cross-product configs (e.g. BigQuery + Redis) need it running
- Add a test_historical_features_main exclusion to the Snowflake filter to preserve the existing instability guard from the original -k filter

Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
@franciscojavierarceo I already have something similar that's almost working at #6073

pretty huge PR!
Use needs_snowflake flag to conditionally set Snowflake secrets so non-Snowflake jobs don't unnecessarily receive credentials. Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
@tokoko i'll close mine in favor of yours
What this PR does / why we need it:
Split the monolithic integration test job into 5 parallel matrix jobs (Local/File, GCP, AWS, Snowflake, Redis) so each provider shows as a separate check in PRs, improving failure visibility and parallelism.
Which issue(s) this PR fixes:
Misc