perf(metrics): Improve Performance for Inserting/Deleting Large Numbers of Lines in Buffer #18
Overview
This PR improves the performance of handling large numbers of rows in buffers by refactoring the `CsvViewMetrics:_add_row_placeholders` and `CsvViewMetrics:_remove_rows` methods. The changes address inefficiencies in `table.insert` and `table.remove`: each call shifts the table's tail, which is O(n) per call, so inserting or removing n rows one at a time costs O(n^2) for large datasets.

Changes

- Optimized `CsvViewMetrics:_add_row_placeholders`: replaced `table.insert` with a custom method that shifts elements directly, significantly improving insertion performance for large buffers.
- Optimized `CsvViewMetrics:_remove_rows`: replaced `table.remove` with a direct element-shifting method, improving efficiency when removing multiple rows.

Performance Improvements
In testing, the time required to insert or delete 100,000 rows decreased from approximately 5 seconds to just over 1 second, a substantial reduction in latency for large-scale edits.
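The element-shifting technique described in the Changes section can be sketched roughly as follows. This is a minimal illustration, not the actual csvview.nvim implementation; the function names and the placeholder shape are assumptions. The idea is to move the table's tail exactly once per batch operation, rather than once per inserted or removed row.

```lua
--- Insert `count` placeholder entries at position `pos` by shifting the
--- tail once, instead of calling table.insert `count` times (O(n) each).
local function add_placeholders(rows, pos, count)
  -- Walk from the end so values are copied before they are overwritten.
  for i = #rows, pos, -1 do
    rows[i + count] = rows[i]
  end
  -- Fill the freed slots with placeholders (shape is hypothetical).
  for i = pos, pos + count - 1 do
    rows[i] = { placeholder = true }
  end
end

--- Remove `count` entries starting at `pos` by shifting the tail down once.
local function remove_rows(rows, pos, count)
  local len = #rows
  -- Overwrite the removed range by sliding later entries down.
  for i = pos, len - count do
    rows[i] = rows[i + count]
  end
  -- Clear the now-unused tail slots so the table shrinks.
  for i = len - count + 1, len do
    rows[i] = nil
  end
end
```

Both functions touch each tail element once, so a batch of `count` insertions or deletions is O(n) instead of the O(n * count) cost of repeated `table.insert`/`table.remove` calls at the same position.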