
MINOR: [R] Avoid {glue}'s whitespace trimming #12770

Closed

Conversation

@jonkeane (Member) commented Mar 31, 2022

For a while now, glue has been stripping the leading tabs shown below (this might be from tidyverse/glue@8369f9a, which changed a different whitespace-stripping issue we had noticed: tidyverse/glue#247).

Regardless, the fix is easy enough: don't include newlines around the line we are gluing.

Before the change, the relevant lines of arrowExports.cpp are formatted as:

```
static const R_CallMethodDef CallEntries[] = {
{ "_arrow_available", (DL_FUNC)& _arrow_available, 0 },
{ "_dataset_available", (DL_FUNC)& _dataset_available, 0 },
{ "_engine_available", (DL_FUNC)& _engine_available, 0 },
{ "_parquet_available", (DL_FUNC)& _parquet_available, 0 },
{ "_s3_available", (DL_FUNC)& _s3_available, 0 },
{ "_json_available", (DL_FUNC)& _json_available, 0 },
{ "_arrow_test_SET_STRING_ELT", (DL_FUNC) &_arrow_test_SET_STRING_ELT, 1},
```

When they should be:

```
static const R_CallMethodDef CallEntries[] = {
	{ "_arrow_available", (DL_FUNC)& _arrow_available, 0 },
	{ "_dataset_available", (DL_FUNC)& _dataset_available, 0 },
	{ "_engine_available", (DL_FUNC)& _engine_available, 0 },
	{ "_parquet_available", (DL_FUNC)& _parquet_available, 0 },
	{ "_s3_available", (DL_FUNC)& _s3_available, 0 },
	{ "_json_available", (DL_FUNC)& _json_available, 0 },
	{ "_arrow_test_SET_STRING_ELT", (DL_FUNC) &_arrow_test_SET_STRING_ELT, 1},
```

This change restores the indentation we expect there, so the file won't change on rebuilds.
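The trimming behavior being worked around can be reproduced directly with glue (a minimal sketch, not the arrow code-generation itself; the `entry` value is a made-up stand-in for one generated CallEntries line):

```r
library(glue)

# Hypothetical stand-in for one generated CallEntries line.
entry <- "{ \"_arrow_available\", (DL_FUNC)& _arrow_available, 0 },"

# With the default .trim = TRUE, glue() drops the surrounding newlines and
# the common leading indentation, so the tab in front of the line is lost:
trimmed <- glue("\n\t{entry}\n")

# Disabling trimming preserves the whitespace exactly (the PR's actual fix
# instead keeps the default trimming but glues the line without the
# surrounding newlines):
kept <- glue("\t{entry}", .trim = FALSE)
```

Here `trimmed` comes back without the tab, while `kept` retains it.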

@paleolimbot (Member) left a comment


This has been bugging me for a while too! Thanks!

@jonkeane (Member, Author)

I would be curious whether this is a new bug in glue, or whether it is really just alignment with the documented behavior and we were relying on the not-to-spec behavior. (I'm leaning toward the second option, but would love a second opinion!)

@jonkeane jonkeane closed this in a1f32fa Mar 31, 2022
@ursabot commented Mar 31, 2022

Benchmark runs are scheduled for baseline = a1a255b and contender = a1f32fa. a1f32fa is a master commit associated with this PR. Results will be available as each benchmark for each run completes.
Conbench compare runs links:
[Finished ⬇️0.0% ⬆️0.0%] ec2-t3-xlarge-us-east-2
[Finished ⬇️0.17% ⬆️0.04%] test-mac-arm
[Failed ⬇️1.07% ⬆️0.0%] ursa-i9-9960x
[Finished ⬇️0.09% ⬆️0.04%] ursa-thinkcentre-m75q
Buildkite builds:
[Finished] <https://buildkite.com/apache-arrow/arrow-bci-benchmark-on-ec2-t3-xlarge-us-east-2/builds/427| a1f32fab ec2-t3-xlarge-us-east-2>
[Finished] <https://buildkite.com/apache-arrow/arrow-bci-benchmark-on-test-mac-arm/builds/412| a1f32fab test-mac-arm>
[Failed] <https://buildkite.com/apache-arrow/arrow-bci-benchmark-on-ursa-i9-9960x/builds/412| a1f32fab ursa-i9-9960x>
[Finished] <https://buildkite.com/apache-arrow/arrow-bci-benchmark-on-ursa-thinkcentre-m75q/builds/422| a1f32fab ursa-thinkcentre-m75q>
[Finished] <https://buildkite.com/apache-arrow/arrow-bci-benchmark-on-ec2-t3-xlarge-us-east-2/builds/426| a1a255bf ec2-t3-xlarge-us-east-2>
[Finished] <https://buildkite.com/apache-arrow/arrow-bci-benchmark-on-test-mac-arm/builds/411| a1a255bf test-mac-arm>
[Finished] <https://buildkite.com/apache-arrow/arrow-bci-benchmark-on-ursa-i9-9960x/builds/411| a1a255bf ursa-i9-9960x>
[Finished] <https://buildkite.com/apache-arrow/arrow-bci-benchmark-on-ursa-thinkcentre-m75q/builds/421| a1a255bf ursa-thinkcentre-m75q>
Supported benchmarks:
ec2-t3-xlarge-us-east-2: Supported benchmark langs: Python, R. Runs only benchmarks with cloud = True
test-mac-arm: Supported benchmark langs: C++, Python, R
ursa-i9-9960x: Supported benchmark langs: Python, R, JavaScript
ursa-thinkcentre-m75q: Supported benchmark langs: C++, Java

jcralmeida pushed a commit to rafael-telles/arrow that referenced this pull request Apr 19, 2022

Closes apache#12770 from jonkeane/whitespace

Authored-by: Jonathan Keane <jkeane@gmail.com>
Signed-off-by: Jonathan Keane <jkeane@gmail.com>
3 participants