
exp/logging-messages-style #1444

Closed
cacrespo wants to merge 3 commits

Conversation

cacrespo commented Jun 5, 2024

Description

Hi all!

I noticed that the log level names are currently left-aligned to a width of 21 characters. Is there a specific reason for this?

Here's an example of the current formatting:
[screenshot: current log output with padded level names]

While it's not a major issue, I believe removing the extra whitespace would make the output easier to read.
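
For reference, this kind of padding typically comes from a width specifier in a Python logging format string. A minimal sketch (illustrative format strings, not dlt's actual defaults):

```py
import logging

# "%(levelname)-21s" left-aligns the level name in a 21-character field,
# which is where the trailing whitespace comes from
padded = logging.Formatter("%(asctime)s|[%(levelname)-21s]|%(message)s")
compact = logging.Formatter("%(asctime)s|[%(levelname)s]|%(message)s")

record = logging.LogRecord("demo", logging.WARNING, __file__, 1, "careful!", None, None)
print(padded.format(record))   # ...|[WARNING              ]|careful!
print(compact.format(record))  # ...|[WARNING]|careful!
```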

In this PR, I've adjusted the formatting accordingly and included a new test to validate the output format, specifically for error and warning messages:
[screenshot: adjusted log output without padding]

Another option could be centering the text and reducing the total width slightly...

Perhaps in the future, we could explore using colors to differentiate between different log levels.
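
For example, a minimal sketch of that idea (ANSI escape codes around the level name; just an illustration, not part of this PR):

```py
import logging

# map level names to ANSI color codes (illustrative color choices)
COLORS = {"WARNING": "\033[33m", "ERROR": "\033[31m", "CRITICAL": "\033[41m"}
RESET = "\033[0m"

class ColorFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        # colorize only levels we have a color for
        color = COLORS.get(record.levelname, "")
        if color:
            record.levelname = f"{color}{record.levelname}{RESET}"
        return super().format(record)
```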

Related Issues

Nope

Thanks!

netlify bot commented Jun 5, 2024

Deploy Preview for dlt-hub-docs canceled.

| Name | Link |
| --- | --- |
| 🔨 Latest commit | 5b29bde |
| 🔍 Latest deploy log | https://app.netlify.com/sites/dlt-hub-docs/deploys/66634379a18cfc000848068b |

rudolfix (Collaborator) commented Jun 5, 2024

@cacrespo heh, good point :) I took this formatter from my older projects when we started writing dlt. The intention was to align all log lines where the file name and line number were... but since then we've added more things and it no longer makes sense.

btw, the log format is configurable via toml files and env variables (RUNTIME__LOG_FORMAT), so if you want colors you can add them (but no variable colors without changing code, I'm afraid).
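
e.g. something like this (a quick sketch; double-check the exact config key and placeholder syntax against the docs):

```py
import os

# must be set before dlt initializes logging; the same value can also go
# into config.toml (runtime section)
os.environ["RUNTIME__LOG_FORMAT"] = "%(asctime)s [%(levelname)s] %(name)s: %(message)s"
```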

I'm ok with merging this. @burnash @sh-rp WDYT?

cacrespo (Author) commented Jun 6, 2024

@rudolfix, thanks for your response.

I see that some checks have failed. Give me a few days to try to fix them and think about what more I can do. 💪

cacrespo (Author) commented Jun 7, 2024

Hi there!

I've just reviewed the tests and found something odd. If you run test_logging.py before tests/common/runners, everything works fine. However, if you execute the logging tests after the runners, we encounter problems (as is happening now).

The conflict occurs here in particular (if we comment out these lines, everything is OK):

@pytest.fixture(scope="module", autouse=True)
def logger_autouse() -> None:
    init_test_logging()

(...)

@pytest.mark.parametrize("method", ALL_METHODS)
def test_pool_runner_process_methods_forced(method) -> None:
    multiprocessing.set_start_method(method, force=True)
    r = _TestRunnableWorker(4)
    # make sure signals and logging is initialized
    C = resolve_configuration(RunConfiguration())
    initialize_runtime(C)

    runs_count = runner.run_pool(configure(ProcessPoolConfiguration), r)
    assert runs_count == 1
    assert [v[0] for v in r.rv] == list(range(4))


@pytest.mark.parametrize("method", ALL_METHODS)
def test_pool_runner_process_methods_configured(method) -> None:
    r = _TestRunnableWorker(4)
    # make sure signals and logging is initialized
    C = resolve_configuration(RunConfiguration())
    initialize_runtime(C)

    runs_count = runner.run_pool(ProcessPoolConfiguration(start_method=method), r)
    assert runs_count == 1
    assert [v[0] for v in r.rv] == list(range(4))

My conclusion is that something goes wrong when the configuration is instantiated across the different tests and processes.

Honestly, I don't know enough about pytest, but I made some changes (see the last commit), and the problems now seem to be resolved (at least in my local environment).
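
One idea I considered (just a sketch, not what I committed) is a fixture that snapshots and restores the root logger, so that logging initialized in one module cannot leak into tests that run later:

```py
import logging

import pytest

@pytest.fixture(autouse=True)
def restore_root_logger():
    # snapshot the root logger's handlers and level before each test
    root = logging.getLogger()
    handlers, level = root.handlers[:], root.level
    yield
    # restore them afterwards so init_test_logging() / initialize_runtime()
    # in one module cannot affect tests in another
    root.handlers[:] = handlers
    root.setLevel(level)
```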

Whatever you need, I'm here to help! 😃

rudolfix (Collaborator) commented
@cacrespo OK! Let me check it. Runners initialize logging on their own and we do not have a proper way to "uninitialize" them, so some settings may not be applied. I'm running CI again to see which tests are failing. thx!

@rudolfix rudolfix self-assigned this Jun 24, 2024
sh-rp (Collaborator) commented Jun 24, 2024

@cacrespo You can change the logging format via your config.toml if you like by the way.

@rudolfix rudolfix assigned sh-rp and unassigned rudolfix Jun 26, 2024
@rudolfix rudolfix added the sprint label Jun 26, 2024
cacrespo (Author) commented Jun 26, 2024

> @cacrespo You can change the logging format via your config.toml if you like by the way.

Yes, you're right. I think that's my best option instead of trying to change the application logger. 😜

For your information, I used pytest-randomly and I believe there are some flaky tests (apart from my problem). Here are my steps and an example:

1. Add the package:

   poetry add pytest-randomly

2. Run the tests without randomness (everything goes fine):

   poetry run pytest -p no:randomly tests/common/runners tests/common/runtime

3. Run the tests with randomness (we encounter some errors):

   poetry run pytest --randomly-seed=1330321047 tests/common/runners tests/common/runtime

Hope this helps improve this great tool. Thanks!

@sh-rp
Copy link
Collaborator

sh-rp commented Jun 26, 2024

@cacrespo I have created a new branch (since this one is yours) with only the change you suggested: #1517. I'm happy to attribute it to you; I don't think we need any tests for this at the moment.

cacrespo (Author) commented
Closed by #1517

@cacrespo cacrespo closed this Jun 26, 2024
@rudolfix rudolfix removed the sprint label Aug 21, 2024