Update Python version to 3.13 in Dockerfile and Pipfile #985

Merged
merged 27 commits into from
Feb 3, 2025
Changes from all commits
27 commits
def9fb8
Update Python version to 3.13 in Dockerfile and Pipfile
dasunpubudumal Jan 23, 2025
f56a37a
Add ODBC Driver installation step for SQL Server in CI workflow
dasunpubudumal Jan 23, 2025
419c982
Remove ODBC Driver installation step for SQL Server from CI workflow
dasunpubudumal Jan 23, 2025
f02aed4
Update CI workflow to use Python 3.13 and install ODBC Driver for SQL…
dasunpubudumal Jan 24, 2025
4df0260
Update ODBC Driver installation to version 18 in CI workflow
dasunpubudumal Jan 24, 2025
d4ea3e5
Add setuptools dependency to Pipfile
dasunpubudumal Jan 24, 2025
be1b8cb
Fixing CI (try 01)
dasunpubudumal Jan 24, 2025
ccff437
Fixing CI (try 02)
dasunpubudumal Jan 24, 2025
a8b0413
Fixing CI (try 03)
dasunpubudumal Jan 24, 2025
21466a7
Remove setuptools dependency from Pipfile
dasunpubudumal Jan 24, 2025
ecce3df
Update Python version in Pipfile from 3.13 to 3.11
dasunpubudumal Jan 24, 2025
1c7c38f
Update Dockerfile and Pipfile to use Python 3.11
dasunpubudumal Jan 24, 2025
9aeec47
Update CI configuration to use Python 3.11
dasunpubudumal Jan 24, 2025
f9c520f
Reorder SQL Server database setup in CI workflow
dasunpubudumal Jan 24, 2025
de9b397
Update CI workflow to install msodbcsql17 instead of msodbcsql18
dasunpubudumal Jan 24, 2025
071623f
Update CI workflow to use Python 3.13 and install msodbcsql18
dasunpubudumal Jan 24, 2025
8dffcac
Update CI workflow and Dockerfile to use Python 3.12
dasunpubudumal Jan 24, 2025
74f98cc
Update Python version to 3.12 and change SQL Server ODBC driver to ve…
dasunpubudumal Jan 24, 2025
8e93969
Add TrustServerCertificate option to SQL Server connection string
dasunpubudumal Jan 24, 2025
ed6e49c
Add numpy dependency to Pipfile
dasunpubudumal Jan 24, 2025
72796ac
Update Python version to 3.13 and modify CI workflow for dynamic vers…
dasunpubudumal Jan 27, 2025
8e1ddca
Update SQL Server ODBC driver to version 18 in connection strings
dasunpubudumal Jan 27, 2025
1dcce33
Flake8 fixes
dasunpubudumal Jan 27, 2025
65b5e68
Flake8 fixes
dasunpubudumal Jan 27, 2025
6f66739
Adding possible test fixes
dasunpubudumal Jan 27, 2025
f3f5b0e
Upgrading to 3.13 (latest)
dasunpubudumal Jan 27, 2025
26335be
Upgrading to 3.13 (latest)
dasunpubudumal Jan 27, 2025
36 changes: 29 additions & 7 deletions .github/workflows/ci.yml
Original file line number Diff line number Diff line change
@@ -11,14 +11,24 @@ on:
- master

jobs:
setup:
runs-on: ubuntu-latest
outputs:
python_version: ${{ steps.read_python_version.outputs.python_version }}
steps:
- uses: actions/checkout@v4
- name: Read Python version
id: read_python_version
run: echo "::set-output name=python_version::$(cat .python-version)"
black:
runs-on: ubuntu-latest
needs: setup
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
python-version: ${{ needs.setup.outputs.python_version }}
- uses: actions/cache@v4
with:
path: ~/.cache/pip
@@ -37,12 +47,13 @@ jobs:
python -m black --check .
flake8:
runs-on: ubuntu-latest
needs: setup
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
python-version: ${{ needs.setup.outputs.python_version }}
- uses: actions/cache@v4
with:
path: ~/.cache/pip
@@ -61,12 +72,13 @@ jobs:
flake8
mypy:
runs-on: ubuntu-latest
needs: setup
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
python-version: ${{ needs.setup.outputs.python_version }}
- uses: actions/cache@v4
with:
path: ~/.cache/pip
@@ -84,6 +96,7 @@ jobs:
python -m mypy .
test:
runs-on: ubuntu-latest
needs: setup
services:
mysql:
image: mysql:8.0
@@ -104,13 +117,17 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.8
python-version: ${{ needs.setup.outputs.python_version }}
- uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('**/Pipfile') }}
restore-keys: |
${{ runner.os }}-pip-
- name: Install dependencies for building Python wheels (Ubuntu)
run: |
sudo apt-get update
sudo apt-get install -y build-essential python3-dev libffi-dev libssl-dev libpq-dev unixodbc-dev
- name: Install pipenv
run: |
pip install pipenv
@@ -122,12 +139,17 @@
with:
mongodb-version: 4.2
mongodb-replica-set: heron_rs
- name: Create SQL Server testing database
run: |
python setup_sqlserver_test_db.py
- name: Setup the test MLWH and Events databases
run: |
curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
curl https://packages.microsoft.com/config/ubuntu/$(lsb_release -rs)/prod.list | sudo tee /etc/apt/sources.list.d/mssql-release.list
sudo apt-get update
sudo ACCEPT_EULA=Y apt-get install -y msodbcsql18
sudo apt-get install -y unixodbc-dev
python setup_test_db.py
- name: Create SQL Server testing database
run: |
python setup_sqlserver_test_db.py
- name: Test with pytest
run: |
python -m pytest -x
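The new `setup` job reads the interpreter version from the committed `.python-version` file and passes it to the other jobs through a job output, so the version is no longer hard-coded as `3.8` in every job. One caveat worth noting: the `::set-output` workflow command used in the "Read Python version" step has been deprecated by GitHub Actions in favour of appending to the `$GITHUB_OUTPUT` file, and newer releases of `actions/setup-python` can also read `.python-version` directly via a `python-version-file` input. A minimal sketch of that step rewritten against `$GITHUB_OUTPUT` (not part of this PR, shown in Python purely for illustration):

```python
# Sketch: publish the pinned Python version as a step output via GITHUB_OUTPUT
# instead of the deprecated ::set-output workflow command.
import os
from pathlib import Path

version = Path(".python-version").read_text().strip()
output_file = os.environ.get("GITHUB_OUTPUT")

if output_file:
    # Inside GitHub Actions: append "name=value" to the output file.
    with open(output_file, "a") as handle:
        handle.write(f"python_version={version}\n")
else:
    # Local/dry run: just show what would be exported.
    print(f"python_version={version}")
```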
6 changes: 3 additions & 3 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -84,9 +84,6 @@ target/
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
@@ -132,3 +129,6 @@ dmypy.json
.pyre/

tests/data/reports/*

# PyCharm indexes
.idea
1 change: 1 addition & 0 deletions .python-version
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
3.13.0
2 changes: 1 addition & 1 deletion Dockerfile
Original file line number Diff line number Diff line change
@@ -1,5 +1,5 @@
# Use slim for a smaller image size and install only the required packages
FROM python:3.8-slim-buster
FROM python:3.13-slim

# Use the following on M1; for odbc connection to mssql.
# FROM --platform=linux/amd64 python:3.8-slim-buster
4 changes: 3 additions & 1 deletion Pipfile
Original file line number Diff line number Diff line change
@@ -31,9 +31,11 @@ requests = "~=2.32"
slackclient = "~=2.9"
sqlalchemy = "~=2.0"
pymongo = "~=4.8.0"
setuptools = "*"
numpy = "*"
Comment on lines +34 to +35: Should these be loosely version locked?

Comment on lines +34 to +35: Numpy is a big dependency to not be formally included? Is it used directly in the code?

[requires]
python_version = "3.8"
python_version = "3.13"

[pipenv]
allow_prereleases = true
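The review comments above ask whether `setuptools = "*"` and `numpy = "*"` should be loosely version locked like the other dependencies. Adding `setuptools` explicitly is plausibly needed because virtual environments created on Python 3.12+ no longer ship it by default; pinning both with the project's usual `~=` style would still allow compatible upgrades. As a rough illustration of what such a pin accepts — the `~=2.1` constraint below is an assumed example, not taken from the PR — using the third-party `packaging` library:

```python
# Sketch: behaviour of a loose "compatible release" pin such as ~=2.1.
from packaging.specifiers import SpecifierSet

loose_pin = SpecifierSet("~=2.1")

for candidate in ["2.1.0", "2.1.3", "2.2.0", "3.0.0"]:
    # ~=2.1 accepts any 2.x release from 2.1 upwards but rejects 3.0.0.
    print(candidate, candidate in loose_pin)
```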
2,682 changes: 1,351 additions & 1,331 deletions Pipfile.lock

Large diffs are not rendered by default.

Original file line number Diff line number Diff line change
@@ -23,7 +23,7 @@ def _is_valid_destination_coordinate_not_duplicated(self, wells):
coordinates = [well["destination_coordinate"] for well in wells]
duplicates = set([coor for coor in coordinates if coordinates.count(coor) > 1])
if len(duplicates) > 0:
raise RetrievalError(f"Some coordinates have clashing samples/controls: { duplicates }")
raise RetrievalError(f"Some coordinates have clashing samples/controls: {duplicates}")

@property
def errors(self) -> List[str]:
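This hunk, like the similar ones in the files below, only removes the padding spaces inside f-string braces; the rendered strings are identical, because whitespace around the expression in `{ ... }` belongs to the expression rather than the output. These appear to have become flake8 findings after the upgrade because Python 3.12+ tokenizes f-strings per PEP 701, letting the whitespace checks see inside the braces — hence the "Flake8 fixes" commits. A quick check of the equivalence:

```python
# Sketch: padding inside f-string braces does not change the rendered string.
duplicates = ["A1", "A1"]
assert f"clashing samples/controls: { duplicates }" == f"clashing samples/controls: {duplicates}"
print(f"clashing samples/controls: {duplicates}")  # -> clashing samples/controls: ['A1', 'A1']
```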
Original file line number Diff line number Diff line change
@@ -35,7 +35,7 @@ def _well_samples(self):
def _is_valid_no_duplicate_uuids(self, uuids):
duplicates = set([uuid for uuid in uuids if uuids.count(uuid) > 1])
if len(duplicates) > 0:
raise RetrievalError(f"There is duplication in the lh sample uuids provided: { list(duplicates) }")
raise RetrievalError(f"There is duplication in the lh sample uuids provided: {list(duplicates)}")

def _get_sample_with_uuid(self, samples, uuid):
for sample in samples:
Original file line number Diff line number Diff line change
@@ -138,27 +138,27 @@ def retrieval_scope(self):
Returns:
ContextManager - A context specifically created to handle a retrieval process
"""
logger.info(f"At { self._source_code_position_for_logging() } - Start retrieval")
logger.info(f"At {self._source_code_position_for_logging()} - Start retrieval")

try:
if not self.is_valid():
raise ValidationError(f"At { self._source_code_position_for_logging() } - Exception during validation")
raise ValidationError(f"At {self._source_code_position_for_logging()} - Exception during validation")
yield
except Exception as exc:
logger.error(f"At { self._source_code_position_for_logging() } - Exception during retrieval")
logger.error(f"At {self._source_code_position_for_logging()} - Exception during retrieval")

self._is_valid = False
msg = f"Exception during retrieval: {exc}"
if msg not in self._errors:
self._errors.append(msg)
raise exc

logger.info(f"At { self._source_code_position_for_logging() } - End retrieval")
logger.info(f"At {self._source_code_position_for_logging()} - End retrieval")

def _source_code_position_for_logging(self):
klass = type(self)
frame = sys._getframe(3)
function_name = frame.f_code.co_name
line_no = frame.f_lineno

return f"{ klass.__qualname__ }::{ function_name } - line { line_no }"
return f"{klass.__qualname__}::{function_name} - line {line_no}"
8 changes: 4 additions & 4 deletions lighthouse/classes/event_properties/validations.py
Original file line number Diff line number Diff line change
@@ -17,7 +17,7 @@ def validate_param_not_missing(self, param: str) -> None:
bool - True/False indicating if this condition is met
"""
with self.validation_scope():
self.process_validation(self.get_param_value(param) is not None, f"'{ param }' is missing")
self.process_validation(self.get_param_value(param) is not None, f"'{param}' is missing")

def validate_param_not_empty(self, param: str) -> None:
"""
@@ -31,7 +31,7 @@ def validate_param_not_empty(self, param: str) -> None:
bool - True/False indicating if this condition is met
"""
with self.validation_scope():
self.process_validation(self.get_param_value(param) != "", f"'{ param }' should not be an empty string")
self.process_validation(self.get_param_value(param) != "", f"'{param}' should not be an empty string")

def validate_param_no_whitespaces(self, param: str) -> None:
"""
@@ -47,7 +47,7 @@ def validate_param_no_whitespaces(self, param: str) -> None:
with self.validation_scope():
text_to_check = self.get_param_value(param)
self.process_validation(
text_to_check is None or (" " not in text_to_check), f"'{ param }' should not contain any whitespaces"
text_to_check is None or (" " not in text_to_check), f"'{param}' should not contain any whitespaces"
)

def validate_param_is_integer(self, param: str) -> None:
@@ -65,4 +65,4 @@ def validate_param_is_integer(self, param: str) -> None:
bool - True/False indicating if this condition is met
"""
with self.validation_scope():
self.process_validation(is_integer(self.get_param_value(param)), f"'{ param }' should be an integer")
self.process_validation(is_integer(self.get_param_value(param)), f"'{param}' should be an integer")
2 changes: 1 addition & 1 deletion lighthouse/classes/events/biosero/destination_completed.py
Original file line number Diff line number Diff line change
@@ -78,4 +78,4 @@ def process_event(self) -> None:
response = message.send_to_ss()

if not response.ok:
raise Exception(f"There was some problem when sending message to Sequencescape: { response.text }")
raise Exception(f"There was some problem when sending message to Sequencescape: {response.text}")
Original file line number Diff line number Diff line change
@@ -78,4 +78,4 @@ def process_event(self) -> None:
response = message.send_to_ss()

if not response.ok:
raise Exception(f"There was some problem when sending message to Sequencescape: { response.text }")
raise Exception(f"There was some problem when sending message to Sequencescape: {response.text}")
Original file line number Diff line number Diff line change
@@ -78,4 +78,4 @@ def process_event(self) -> None:
response = message.send_to_ss()

if not response.ok:
raise Exception(f"There was some problem when sending message to Sequencescape: { response.text }")
raise Exception(f"There was some problem when sending message to Sequencescape: {response.text}")
Original file line number Diff line number Diff line change
@@ -78,4 +78,4 @@ def process_event(self) -> None:
response = message.send_to_ss()

if not response.ok:
raise Exception(f"There was some problem when sending message to Sequencescape: { response.text }")
raise Exception(f"There was some problem when sending message to Sequencescape: {response.text}")
2 changes: 1 addition & 1 deletion lighthouse/classes/services/cherrytrack.py
Original file line number Diff line number Diff line change
@@ -14,7 +14,7 @@ class CherrytrackServiceMixin:
def raise_error_from_response(self, response):
json = response.json()
if not (isinstance(json, dict)):
raise Exception(f"Response from Cherrytrack is not a valid JSON: { json }")
raise Exception(f"Response from Cherrytrack is not a valid JSON: {json}")

if json and ("errors" in json):
raise Exception(f"Response from Cherrytrack is not OK: {','.join(json['errors'])}")
2 changes: 1 addition & 1 deletion lighthouse/classes/services/labwhere.py
Original file line number Diff line number Diff line change
@@ -29,7 +29,7 @@ def transfer_to_bin(self: Any) -> None:
user_barcode=robot_barcode,
)
if not response.ok:
raise Exception(f"There was a problem attempting to change a location in LabWhere: { response.text }")
raise Exception(f"There was a problem attempting to change a location in LabWhere: {response.text}")

@staticmethod
def _destroyed_barcode() -> str:
2 changes: 1 addition & 1 deletion lighthouse/config/defaults.py
Original file line number Diff line number Diff line change
@@ -184,7 +184,7 @@

# NB: Remember to copy this definition to any config which redefines any of the variables that are used to create it.
DART_SQL_SERVER_CONNECTION_STRING = (
"DRIVER={ODBC Driver 17 for SQL Server};"
"DRIVER={ODBC Driver 18 for SQL Server};"
f"SERVER=tcp:{DART_SQL_SERVER_HOST};"
f"DATABASE={DART_SQL_SERVER_DATABASE};"
f"UID={DART_SQL_SERVER_USER};"
5 changes: 3 additions & 2 deletions lighthouse/config/test.py
Original file line number Diff line number Diff line change
@@ -65,11 +65,12 @@
# NB: Create the connection string here as we define the database here. Since a f-string is evaluated immediately,
# it cannot only live in defaults.py if we redefine any of the variables that are used to create it.
DART_SQL_SERVER_CONNECTION_STRING = (
"DRIVER={ODBC Driver 17 for SQL Server};"
"DRIVER={ODBC Driver 18 for SQL Server};"
f"SERVER=tcp:{DART_SQL_SERVER_HOST};"
f"DATABASE={DART_SQL_SERVER_DATABASE};"
f"UID={DART_SQL_SERVER_USER};"
f"PWD={DART_SQL_SERVER_PASSWORD}"
f"PWD={DART_SQL_SERVER_PASSWORD};"
f"TrustServerCertificate=yes;"
)

###
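The driver bump from `ODBC Driver 17` to `ODBC Driver 18` is what forces the `TrustServerCertificate=yes` addition: Driver 18 encrypts connections by default, so the test SQL Server's self-signed certificate would otherwise be rejected during the TLS handshake. Given how many of the commits above flip between msodbcsql17 and msodbcsql18, a quick runtime check of which drivers are actually registered can save a CI round-trip (a sketch, not part of the PR):

```python
# Sketch: confirm the msodbcsql18 driver is visible to pyodbc before connecting.
import pyodbc

drivers = pyodbc.drivers()
print(drivers)

if "ODBC Driver 18 for SQL Server" not in drivers:
    raise RuntimeError("ODBC Driver 18 is not installed or not registered in odbcinst.ini")
```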
4 changes: 2 additions & 2 deletions setup_sqlserver_test_db.py
Original file line number Diff line number Diff line change
@@ -6,9 +6,9 @@

cnxn = pyodbc.connect(
(
"DRIVER={ODBC Driver 17 for SQL Server};" # noqa: F541
"DRIVER={ODBC Driver 18 for SQL Server};" # noqa: F541
f"SERVER=tcp:{LOCALHOST};"
"DATABASE=master;UID=SA;PWD=MyS3cr3tPassw0rd"
"DATABASE=master;UID=SA;PWD=MyS3cr3tPassw0rd;TrustServerCertificate=yes"
),
autocommit=True,
)
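Putting the pieces together, the updated setup script now connects with Driver 18 while trusting the test server's self-signed certificate. A self-contained sketch of the equivalent connection — the host and credentials mirror the test values visible in the diff and should be treated as placeholders:

```python
# Sketch: connect to the local test SQL Server with ODBC Driver 18.
# TrustServerCertificate=yes accepts the self-signed certificate that would
# otherwise fail Driver 18's encrypt-by-default handshake.
import pyodbc

connection_string = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=tcp:localhost;"  # placeholder for the LOCALHOST setting
    "DATABASE=master;"
    "UID=SA;"
    "PWD=MyS3cr3tPassw0rd;"
    "TrustServerCertificate=yes"
)

cnxn = pyodbc.connect(connection_string, autocommit=True)
print(cnxn.execute("SELECT @@VERSION").fetchone()[0])
cnxn.close()
```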
2 changes: 1 addition & 1 deletion tests/endpoints/events/beckman/test_source_completed.py
Original file line number Diff line number Diff line change
@@ -77,7 +77,7 @@ def test_get_event_source_completed(

mocked_rabbit_channel.basic_publish.assert_called_with(
exchange="lighthouse.test.examples",
routing_key=f"test.event.{ Beckman.EVENT_SOURCE_COMPLETED }",
routing_key=f"test.event.{Beckman.EVENT_SOURCE_COMPLETED}",
body='{"event": {"uuid": "'
+ int_to_uuid(1)
+ (
Original file line number Diff line number Diff line change
@@ -79,7 +79,7 @@ def test_get_event_source_no_pickable_wells(

mocked_rabbit_channel.basic_publish.assert_called_with(
exchange="lighthouse.test.examples",
routing_key=f"test.event.{ Beckman.EVENT_SOURCE_ALL_NEGATIVES }",
routing_key=f"test.event.{Beckman.EVENT_SOURCE_ALL_NEGATIVES}",
body='{"event": {"uuid": "'
+ int_to_uuid(1)
+ (
Original file line number Diff line number Diff line change
@@ -54,7 +54,7 @@ def test_event_no_plate_map_data(

mocked_rabbit_channel.basic_publish.assert_called_with(
exchange="lighthouse.test.examples",
routing_key=f"test.event.{ Beckman.EVENT_SOURCE_NO_PLATE_MAP_DATA }",
routing_key=f"test.event.{Beckman.EVENT_SOURCE_NO_PLATE_MAP_DATA}",
body='{"event": {"uuid": "'
+ int_to_uuid(1)
+ (
2 changes: 1 addition & 1 deletion tests/endpoints/events/beckman/test_source_unrecognised.py
Original file line number Diff line number Diff line change
@@ -38,7 +38,7 @@ def test_get_event_source_unrecognised(

mocked_rabbit_channel.basic_publish.assert_called_with(
exchange="lighthouse.test.examples",
routing_key=f"test.event.{ Beckman.EVENT_SOURCE_UNRECOGNISED }",
routing_key=f"test.event.{Beckman.EVENT_SOURCE_UNRECOGNISED}",
body='{"event": {"uuid": "'
+ int_to_uuid(1)
+ (
Original file line number Diff line number Diff line change
@@ -160,7 +160,7 @@ def test_post_event_destination_completed(

mocked_rabbit_channel.basic_publish.assert_called_with(
exchange="lighthouse.test.examples",
routing_key=f"test.event.{ Biosero.EVENT_DESTINATION_COMPLETED }",
routing_key=f"test.event.{Biosero.EVENT_DESTINATION_COMPLETED}",
body=event_message,
)

2 changes: 1 addition & 1 deletion tests/endpoints/events/biosero/test_destination_failed.py
Original file line number Diff line number Diff line change
@@ -112,7 +112,7 @@ def test_post_event_partially_completed(

mocked_rabbit_channel.basic_publish.assert_called_with(
exchange="lighthouse.test.examples",
routing_key=f"test.event.{ Biosero.EVENT_DESTINATION_FAILED }",
routing_key=f"test.event.{Biosero.EVENT_DESTINATION_FAILED}",
body=event_message,
)

6 changes: 3 additions & 3 deletions tests/endpoints/events/biosero/test_destination_partial.py
Original file line number Diff line number Diff line change
@@ -31,7 +31,7 @@ def test_post_destination_completed_missing_barcode(app, client, biosero_auth_heade
assert response.json == {
"_status": "ERR",
"_issues": {
"event_type": (f"'barcode' cannot be empty with the '{ Biosero.EVENT_DESTINATION_PARTIAL }' event")
"event_type": (f"'barcode' cannot be empty with the '{Biosero.EVENT_DESTINATION_PARTIAL}' event")
},
"_error": {"code": 422, "message": "Insertion failure: 1 document(s) contain(s) error(s)"},
}
Expand All @@ -51,7 +51,7 @@ def test_post_destination_partial_missing_run_id(app, client, biosero_auth_heade
assert response.json == {
"_status": "ERR",
"_issues": {
"event_type": f"'{ Biosero.EVENT_DESTINATION_PARTIAL }' requires a corresponding 'run_id' parameter"
"event_type": f"'{Biosero.EVENT_DESTINATION_PARTIAL}' requires a corresponding 'run_id' parameter"
},
"_error": {"code": 422, "message": "Insertion failure: 1 document(s) contain(s) error(s)"},
}
@@ -133,7 +133,7 @@ def test_post_event_partial(

mocked_rabbit_channel.basic_publish.assert_called_with(
exchange="lighthouse.test.examples",
routing_key=f"test.event.{ Biosero.EVENT_DESTINATION_PARTIAL }",
routing_key=f"test.event.{Biosero.EVENT_DESTINATION_PARTIAL}",
body=event_message,
)

Original file line number Diff line number Diff line change
@@ -147,7 +147,7 @@ def test_post_event_destination_partially_completed(

mocked_rabbit_channel.basic_publish.assert_called_with(
exchange="lighthouse.test.examples",
routing_key=f"test.event.{ Biosero.EVENT_DESTINATION_PARTIAL_COMPLETED }",
routing_key=f"test.event.{Biosero.EVENT_DESTINATION_PARTIAL_COMPLETED}",
body=event_message,
)
