
Fix conflicting Oracle and SSH providers tests #28337

Merged: 1 commit merged into apache:main on Dec 13, 2022

Conversation

@Taragolis (Contributor)

Fix broken main due to side effects between test DAGs.

    def do_execute(self, cursor, statement, parameters, context=None):
>       cursor.execute(statement, parameters)
E       sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "dag_pkey"
E       DETAIL:  Key (dag_id)=(dag) already exists.
E       
E       [SQL: INSERT INTO dag (dag_id, root_dag_id, is_paused, is_subdag, is_active, last_parsed_time, last_pickled, last_expired, scheduler_lock, pickle_id, fileloc, processor_subdir, owners, description, default_view, schedule_interval, timetable_description, max_active_tasks, max_active_runs, has_task_concurrency_limits, has_import_errors, next_dagrun, next_dagrun_data_interval_start, next_dagrun_data_interval_end, next_dagrun_create_after) VALUES (%(dag_id)s, %(root_dag_id)s, %(is_paused)s, %(is_subdag)s, %(is_active)s, %(last_parsed_time)s, %(last_pickled)s, %(last_expired)s, %(scheduler_lock)s, %(pickle_id)s, %(fileloc)s, %(processor_subdir)s, %(owners)s, %(description)s, %(default_view)s, %(schedule_interval)s, %(timetable_description)s, %(max_active_tasks)s, %(max_active_runs)s, %(has_task_concurrency_limits)s, %(has_import_errors)s, %(next_dagrun)s, %(next_dagrun_data_interval_start)s, %(next_dagrun_data_interval_end)s, %(next_dagrun_create_after)s)]
E       [parameters: {'dag_id': 'dag', 'root_dag_id': None, 'is_paused': False, 'is_subdag': False, 'is_active': False, 'last_parsed_time': None, 'last_pickled': None, 'last_expired': None, 'scheduler_lock': None, 'pickle_id': None, 'fileloc': None, 'processor_subdir': None, 'owners': None, 'description': None, 'default_view': None, 'schedule_interval': 'null', 'timetable_description': None, 'max_active_tasks': 16, 'max_active_runs': 16, 'has_task_concurrency_limits': True, 'has_import_errors': False, 'next_dagrun': None, 'next_dagrun_data_interval_start': None, 'next_dagrun_data_interval_end': None, 'next_dagrun_create_after': None}]
E       (Background on this error at: https://sqlalche.me/e/14/gkpj)
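For context, the failure above is the classic symptom of two test suites persisting a DAG with the same dag_id into the shared metadata DB without cleaning up afterwards. A minimal, hypothetical sketch of that pattern (not the actual Oracle/SSH provider test code):

```python
# Hypothetical sketch of the conflicting pattern, not the actual provider test code.
# Two provider test suites each persist a DAG row with the same dag_id and never
# clean it up, so whichever suite runs second hits the "dag_pkey" unique constraint.
from airflow.models import DagModel
from airflow.utils.session import create_session


def test_persists_a_dag_without_cleanup():
    with create_session() as session:
        session.add(DagModel(dag_id="dag"))  # committed on context exit, never deleted
```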

@potiuk (Member) left a comment

Another side effect.

@potiuk merged commit 4dd6b2c into apache:main on Dec 13, 2022
@Taragolis deleted the fix-main-conflicts-between-providers branch on Dec 13, 2022 at 15:07
@potiuk (Member) commented on Dec 13, 2022

Interesting. It was merged because the SSH tests were not run in #27319 due to selective checks. But it's next to impossible to prevent this kind of side effect from creeping in :)

I guess we could add isolation that resets the DB between each package. That should be quite feasible from a performance point of view, and it would prevent this kind of error from ever happening.
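As a rough illustration of what "reset the DB between each package" could mean, assuming the helpers that already live in tests/test_utils/db.py (the wrapper name below is made up):

```python
# Illustrative only: clear the metadata tables that most often leak between
# provider test packages. Assumes the clear_db_* helpers in tests/test_utils/db.py.
from tests.test_utils.db import clear_db_dags, clear_db_runs


def reset_provider_test_db():  # hypothetical wrapper name
    clear_db_runs()  # drop leftover run-related records
    clear_db_dags()  # drop leftover rows from the "dag" table, e.g. dag_id="dag"
```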

@Taragolis (Contributor, Author)

I thought this would be possible to achieve with an autouse fixture in the providers' conftest.py at module/package scope.
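Something along these lines, as a sketch only (fixture name, scope, and helper imports are assumptions, not the change that was eventually made):

```python
# Sketch of an autouse, package-scoped fixture in a providers-level conftest.py.
import pytest

from tests.test_utils.db import clear_db_dags, clear_db_runs


@pytest.fixture(autouse=True, scope="package")
def _reset_db_between_provider_packages():
    # Let the package's tests run, then wipe whatever they persisted so the
    # next provider package starts from a clean metadata DB.
    yield
    clear_db_runs()
    clear_db_dags()
```

As noted later in the thread, plain "package" scope needed some extra handling in practice, so the final fixture would likely differ from this sketch.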

@potiuk (Member) commented on Dec 28, 2022

I thought this would be possible to achieve with an autouse fixture in the providers' conftest.py at module/package scope.

Yes, that's exactly what I thought. Would you like to give it a shot, @Taragolis?

@Taragolis (Contributor, Author)

I tried to do it locally. It required some additional logic in the fixture because, for some unknown reason, it does not work well at "package" scope.

I need to find that branch, complete it, and open a PR.
