Out of memory while trying to create a large CSV file

Using Meteor, on the server I'm trying to generate a large CSV file by looping over a Meteor collection and inserting one line per document. At some point the server hits an out-of-memory error; my guess is that it runs out of memory before the loop finishes, depending on the size of the collection. How can I fix this (somehow free the memory)? The code is below:

    var job = Jobs.findOne();
    var fs = Npm.require('fs');
    var file = '/tmp/csv-' + job._id + '.csv';
    var headers = ["Email", "Processed?", "Integration", "Passed?", "Reason", "Date"];
    var wstream = fs.createWriteStream(file);
    var first_line = headers.join() + '\n';
    var emails = rawEmails.find();
    wstream.write(first_line);
    emails.forEach(function (rawemail) {
        var line_item = [];
        line_item.push(rawemail.email);
        // optional fields are only included when present
        if (rawemail.processed === true || rawemail.processed === false)
            line_item.push(rawemail.processed);
        if (rawemail.integration)
            line_item.push(rawemail.integration);
        if (rawemail.passed === true || rawemail.passed === false)
            line_item.push(rawemail.passed);
        if (rawemail.reason)
            line_item.push(rawemail.reason);
        if (rawemail.updated_at)
            line_item.push(rawemail.updated_at);
        var to_write = line_item.join() + '\n';
        wstream.write(to_write);
    });
    wstream.end();

    var emails = rawEmails.find();

Not good. You will need to limit and paginate the query, writing the large number of records to the file in batches:

    var BATCH = 100;
    var skip = 0;
    // fetch() pulls one page of documents into memory at a time
    var batch = rawEmails.find({}, {limit: BATCH, skip: skip}).fetch();
    while (batch.length > 0) {
        batch.forEach(function (rawemail) {
            // build the CSV line and wstream.write() it, as in the question
        });
        skip += BATCH;
        batch = rawEmails.find({}, {limit: BATCH, skip: skip}).fetch();
    }
    wstream.end();
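One caveat with skip/limit paging: MongoDB still walks past all of the skipped documents, so each page gets slower as skip grows. A minimal alternative sketch, assuming the collection's default _id index and that the ids compare consistently (Meteor's default string ids do), pages by _id range instead:

    var BATCH = 100;
    var lastId = null;
    while (true) {
        // resume after the last id seen instead of skipping N documents
        var selector = lastId ? {_id: {$gt: lastId}} : {};
        var batch = rawEmails.find(selector, {sort: {_id: 1}, limit: BATCH}).fetch();
        if (batch.length === 0)
            break;
        batch.forEach(function (rawemail) {
            // build the CSV line and wstream.write() it, as in the question
        });
        lastId = batch[batch.length - 1]._id;
    }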

Note that if the number of records is very large, the Node process can also consume a lot of memory on behalf of the writeStream: every write() call that outruns the disk queues its data in the stream's internal buffer, which triggers the out-of-memory error all over again. Consider writing several smaller files and compressing them before sending the result back to the client (if the client is meant to download it). A sketch of both ideas follows.
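A minimal sketch of both points, assuming Node's core zlib module and the file variable from the question; the writeLines helper is hypothetical, not part of the original code. It honors the boolean that write() returns, pausing until 'drain' fires so the stream's buffer cannot grow without bound, and it pipes the output through gzip so the client downloads a compressed file:

    var fs = Npm.require('fs');
    var zlib = Npm.require('zlib');

    // Rows written to `gzip` are compressed and streamed straight to disk.
    var gzip = zlib.createGzip();
    gzip.pipe(fs.createWriteStream(file + '.gz'));

    // Hypothetical helper: write an array of CSV lines with backpressure.
    function writeLines(lines, onDone) {
        var i = 0;
        (function next() {
            var ok = true;
            // keep writing while the stream accepts data synchronously
            while (i < lines.length && ok) {
                ok = gzip.write(lines[i] + '\n');
                i++;
            }
            if (i < lines.length) {
                gzip.once('drain', next); // buffer full: resume once it empties
            } else {
                onDone();
            }
        })();
    }

Call writeLines once per batch fetched by the paging loop above, and gzip.end() after the last batch. Splitting the output across several part files follows the same pattern, with a fresh stream opened per part.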