reduce loglevel for xpath page parsing errors #322
Merged
As with #94, we should reduce the log level for errors that occur when the current page cannot be parsed as XML/HTML while applying XPath expressions for dynamic variables.

The current problem is that we generate a lot of log data because the entire page is dumped into the log, printed as a binary:

```
ts_search:(3:<0.3361.0>) Page couldn't be parsed:(error:function_clause) Page:<<72,84,84,80,47,49,46,49,32,50,48,48,33,…>>
```

When running large-scale load tests, errors on the backend are to be expected. In those cases we end up with log files several hundred MB in size, which puts considerable stress on the Tsung nodes and affects the test as a side effect.
Besides reducing the log level, this PR also adds a new error counter, `error_xml_unparsable`, which is reported to `ts_mon`. It is comparable to the existing `error_json_unparsable` counter. Note that an error is still logged (at log level `error`) when the XPath expression cannot be executed: `Couldn't execute XPath: page not parsed (varname=~p)`.
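A minimal sketch of the pattern this PR applies, assuming Tsung's `?LOGF` macro and log-level constants (`?DEB`, `?ERR`) and the `ts_mon:add/1` counter API; the exact function, variable names, and parse call below are illustrative, not the actual `ts_search` source:

```erlang
%% Hypothetical sketch: demote the noisy page dump to debug level and
%% count the failure instead of logging it at error level.
parse_page(Body, VarName) ->
    case catch parse_html(Body) of               % parse_html/1 is a stand-in
        {'EXIT', Reason} ->
            %% Before: the whole page body was logged at ?ERR, producing
            %% huge log files under load. Now it only appears at debug level.
            ?LOGF("Page couldn't be parsed:(~p) Page:~p~n",
                  [Reason, Body], ?DEB),
            %% New: report the failure to ts_mon so it shows up in the stats,
            %% analogous to error_json_unparsable.
            ts_mon:add({count, error_xml_unparsable}),
            %% A short error (without the page dump) is still emitted when the
            %% XPath lookup subsequently fails:
            ?LOGF("Couldn't execute XPath: page not parsed (varname=~p)~n",
                  [VarName], ?ERR),
            undefined;
        Tree ->
            Tree
    end.
```

The key design point is that the per-request counter keeps the failure visible in the aggregated `ts_mon` report while the expensive page dump is only produced when debug logging is explicitly enabled.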