reduce loglevel for xpath page parsing errors #322

Merged
merged 2 commits into develop from xpath-error-logging on Jul 22, 2018
Conversation

@tisba (Collaborator) commented on Jul 22, 2018

As with #94, we should reduce the loglevel for errors that occur when parsing the current page as XML/HTML while applying XPath expressions for dynvars.

The current problem is that we generate a lot of log data because we dump the entire page. It is printed as a binary (ts_search:(3:<0.3361.0>) Page couldn't be parsed:(error:function_clause) Page:<<72,84,84,80,47,49,46,49,32,50,48,48,33,…>>).

When we run large-scale load tests, errors on the backend are to be expected. In those cases we end up with log files several hundred MB in size, which puts quite a bit of stress on the Tsung nodes and impacts the test as a side effect.

Besides reducing the log level, this PR also adds a new error counter reported to ts_mon: error_xml_unparsable, comparable to error_json_unparsable. An error is still logged (at loglevel error: Couldn't execute XPath: page not parsed (varname=~p)).
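For illustration, here is a minimal sketch of how such a parse failure could be handled after this change. It assumes Tsung's ?LOGF/?ERR/?NOTICE macros from ts_macros.hrl, the ts_mon:add/1 counter API, and mochiweb_html for parsing; the module and function names are hypothetical and this is not the actual diff of the PR.

```erlang
%% Hypothetical sketch, not the actual diff of this PR.
-module(xpath_parse_sketch).
-export([parse_for_xpath/2]).
-include("ts_macros.hrl").

parse_for_xpath(Page, VarName) ->
    try mochiweb_html:parse(Page) of
        Tree ->
            Tree
    catch
        Type:Reason ->
            %% Count the failure so it shows up in ts_mon reports,
            %% analogous to error_json_unparsable for JSONPath dynvars.
            ts_mon:add({count, error_xml_unparsable}),
            %% Dump the page body only at a low log level to avoid
            %% inflating log files during large-scale runs.
            ?LOGF("Page couldn't be parsed:(~p:~p) Page:~p~n",
                  [Type, Reason, Page], ?NOTICE),
            %% Keep a short error-level message without the page body.
            ?LOGF("Couldn't execute XPath: page not parsed (varname=~p)~n",
                  [VarName], ?ERR),
            undefined
    end.
```

The key point of the design is that the full page body only appears at the lower log level, so backend errors during a large run no longer blow up the logs, while the new counter still makes parse failures visible in the ts_mon statistics.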

@tisba merged commit b2d5cf8 into develop on Jul 22, 2018
@tisba deleted the xpath-error-logging branch on Jul 22, 2018 11:27
@nniclausse added this to the 1.8.0 milestone on Feb 26, 2023