Stack errors with very large CSV files #92
Comments
Note that even if the on-data function does absolutely nothing, I'll still run into the RangeError.
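For context, the RangeError here is Node's "Maximum call stack size exceeded". A library-free sketch of the failure mode (helper names are hypothetical, not from fast-csv): invoking one synchronous callback per row grows the call stack without bound, while deferring each step with `setImmediate` gives every row a fresh stack.

```javascript
// Hypothetical sketch: one stack frame per "row".
function syncNext(i, done) {
  if (i === 0) return done();
  syncNext(i - 1, done); // synchronous recursion: the stack grows with i
}

let overflowed = false;
try {
  syncNext(1e6, function () {}); // a million "rows" overflows the default stack
} catch (e) {
  overflowed = e instanceof RangeError;
}
console.log('sync overflowed:', overflowed); // true

// Deferring each step breaks the recursion across event-loop turns.
function deferredNext(i, done) {
  if (i === 0) return done();
  setImmediate(function () { deferredNext(i - 1, done); });
}

let completed = false;
deferredNext(1e5, function () {
  completed = true;
  console.log('deferred completed without overflow');
});
```

This is the same reason the transform snippet below wraps its work in `process.nextTick`: it prevents the per-row callback chain from accumulating stack frames.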
What version of node are you using?
0.10.28
+1, same issue here while running the following against a 6M-line CSV file (1.51GB) on Node v0.10.33:

```js
var fs = require('fs');
var csv = require('fast-csv');
var moment = require('moment');

var output = fs.createWriteStream(resultFile, { encoding: 'utf-8' });
var input = fs.createReadStream(file);

csv
  .fromStream(input, { headers: true })
  .transform(function (obj, next) {
    // Defer each row to the next tick so the transform callback
    // chain does not grow the call stack synchronously.
    process.nextTick(function () {
      if (obj.TIMESTAMP) {
        obj.TIMESTAMP = moment(obj.TIMESTAMP, 'YYYYMMDDHHmmss.SSSZ').toISOString();
      }
      return next(null, obj);
    });
  }) // no semicolon here, or the .pipe() chain below breaks
  .pipe(csv.createWriteStream({ headers: true }))
  .pipe(output);
```
So I found the issue, which had to do with … Please let me know if the problem persists, but this was the only way I was able to reproduce. -Doug
I should note you will have to upgrade to
Awesome. I've verified the fix with my example. Thanks!
Awesome... thanks for the bug report. -Doug
v0.6.0 solved the issue, thank you!
👍 |
Solved for me too, thanks! |
I'm getting a RangeError when I read very large (570k) CSV files.
I do not have the same problem when I write a file of this size, only when I turn around and read it back.
Here's the pattern I'm using: