47

Hey, I am trying to export a CSV from Node.js (pulling the data from MongoDB). I already have the data being pulled and separated by commas, but now I am trying to figure out how to actually send it. I am sticking this code in my routes file. Any advice on how to take the array of data and send it to the user for download on request?

Here is the code (my attempt is the bottom part, the second function):

// Note: async, the csv module, and the Friend/User Mongoose models are
// required elsewhere in this routes file.
exports.downloadContacts = function(req, res) {
    async.waterfall([
        function(callback) {
            var source = [];
            Friend.find({userId: req.signedCookies.userid}, function(err, friends) {
                if (err) {
                    console.log('err with friends for download');
                    return callback(err);
                }
                // collect the friend ids so the matching users can be looked up
                var userMap = {};
                var friendIds = friends.map(function (user) {
                    userMap[user.friend_id] = user;
                    return user.friend_id;
                });
                console.log(friends);
                User.find({_id: {$in: friendIds}}, function(err, users) {
                    if (err) {
                        console.log(err);
                        return callback(err);
                    }
                    for (var i = 0; i < users.length; i++) {
                        console.log('users');
                        //console.log(users[i]);
                        source.push(users[i].firstNameTrue, users[i].lastNameTrue, users[i].emailTrue, users[i].phone, users[i].emailList, users[i].phoneList);
                    }
                    console.log(source);
                    callback(null, source);
                });
            });
        }
    ],
    function(err, source) {
        if (err) return res.status(500).send(err);

        var result = [];

        res.contentType('csv');

        csv()
        .from(source)
        .on('data', function(data) {
            result.push(data.join());
        })
        .on('end', function() {
            res.send(result.join('\n'));
        });
    });
};

10 Answers

44

Here is what I did:

  1. Use json2csv to build csv data from mongodb data

var json2csv = require('json2csv');
var fields = ['name', 'phone', 'mobile', 'email', 'address', 'notes'];
var fieldNames = ['Name', 'Phone', 'Mobile', 'Email', 'Address', 'Notes'];
var data = json2csv({ data: docs, fields: fields, fieldNames: fieldNames });
  2. Send csv data to client

res.attachment('filename.csv');
res.status(200).send(data);
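
For context, a rough sketch of how those two fragments could sit inside a single Express route. The Contact model and handler name are illustrative, not part of the original answer:

var json2csv = require('json2csv');

exports.downloadContacts = function(req, res) {
    // Contact is a hypothetical Mongoose model standing in for your own data source
    Contact.find({ userId: req.signedCookies.userid }, function(err, docs) {
        if (err) return res.status(500).send(err);

        var fields = ['name', 'phone', 'mobile', 'email', 'address', 'notes'];
        var fieldNames = ['Name', 'Phone', 'Mobile', 'Email', 'Address', 'Notes'];
        var data = json2csv({ data: docs, fields: fields, fieldNames: fieldNames });

        // attachment() sets Content-Disposition (and a CSV content type) for the download
        res.attachment('contacts.csv');
        res.status(200).send(data);
    });
};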

6 Comments

How do you consume this on the front end?
@Ohjay44 I used it for my front-end (FE) website using the Express framework. If you separate the API and the FE, I think the FE and the API should communicate via JSON, and you would then export that JSON data to CSV on the FE. Do I understand your question correctly?
It turns out all I needed to do was trigger the download as a file once I received the data in JavaScript. All I was receiving was the file in raw format, so I was confused about how to start the CSV file download once my call was complete (a browser-side sketch follows these comments).
@VietTran will it create the file on the server in case it is deployed on a server? I don't want the server to create too many files with the above solution in case of multiple download requests.
@y_159 It's been too long since I worked on Node.js, but no, the code above doesn't create any file on the server. The data variable lives in memory only.
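
For anyone wondering how to trigger the browser download once the raw CSV response comes back, as described above, here is a minimal browser-side sketch. The endpoint path and file name are illustrative:

// Fetch the CSV response and hand it to the browser as a file download
fetch('/contacts/download')
  .then(function (response) { return response.blob(); })
  .then(function (blob) {
    var url = window.URL.createObjectURL(blob);
    var a = document.createElement('a');
    a.href = url;
    a.download = 'contacts.csv';
    document.body.appendChild(a);
    a.click();
    a.remove();
    window.URL.revokeObjectURL(url);
  });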
39

Have you tried something like this, with a content type of "application/octet-stream"?

res.set('Content-Type', 'application/octet-stream');
res.send(<your data>);

or simply

res.send(Buffer.from(<your data>));

Express send() docs.
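
If the goal is a "nicer" download (a proper file name and a CSV content type rather than a generic binary), a small sketch along these lines should work, assuming csvString already holds the comma-separated text:

// csvString is assumed to already contain the CSV text
res.set('Content-Type', 'text/csv');
res.set('Content-Disposition', 'attachment; filename="contacts.csv"');
res.send(csvString);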

6 Comments

Yeah that works, but how do I set the content type correctly... is it supposed to show up as type "file" in my downloads folder? Or is there a friendlier way to send it? This works fine and opens with any app, but I was just wondering whether there is a better way to set the type or make it look nicer, or whether this is the way people do it when sending a file.
What do you mean by "nicer"? Maybe "text/plain"? Could you give me a little more info about what you are trying to do? Cheers.
Not sure yet what you actually need, but this could help: expressjs.com/api.html#res.download or expressjs.com/api.html#res.sendfile. Not sure if you can send the buffer instead of the file; maybe you can send it directly without saving it to the fs. This will prompt the client to download the file. Hope this helps.
What type of data do we have to send in this case, an array or JSON?
new Buffer is deprecated, use Buffer.from instead
20

The json2csv package has been updated since the highest-voted answer was written; the newer version has slightly different syntax:

var { Parser } = require('json2csv')

const fields = [{
    label: 'header 1',
    value: 'field1_name'
}, {
    label: 'header 2',
    value: 'field2_name'
}]

const json2csv = new Parser({ fields: fields })

try {
    const csv = json2csv.parse(data)
    res.attachment('data.csv')
    res.status(200).send(csv)
} catch (error) {
    console.log('error:', error.message)
    res.status(500).send(error.message)
}

Note: res.attachment is a function, not an attribute; I needed to remove the equals sign for this to work.
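
For reference, the data passed to json2csv.parse() above is expected to be an array of plain objects whose keys match the value entries in fields; a made-up example:

// Illustrative input shape for the fields mapping above
const data = [
    { field1_name: 'row 1, col 1', field2_name: 'row 1, col 2' },
    { field1_name: 'row 2, col 1', field2_name: 'row 2, col 2' }
];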

2 Comments

Important, since the top answer no longer works...
@svarrall will it create the file on the server in case it is deployed on a server? I don't want the server to create too many files with the above solution in case of multiple download requests.
4

Based on @VietTD's answer, you have to use json2csv to build the CSV data from the MongoDB data.

Update package.json with the dependencies:

  "dependencies": {
    "mongodb": "^2.2.10",
    "json2csv": "*",
    "express": "*"
  }

Create the json2csv data. Remember that this maps your document (db table), so the names in "fields" must match the document's field names; the fieldNames are up to you, but they matter too since they become the CSV column headers:

var json2csv = require('json2csv');
var fields = ['name', 'phone', 'mobile', 'email', 'address', 'notes'];
var fieldNames = ['Name', 'Phone', 'Mobile', 'Email', 'Address', 'Notes'];
var data = json2csv({ data: docs, fields: fields, fieldNames: fieldNames });

Send csv data to the client:

res.attachment('filename.csv');
res.status(200).send(data);

This code shows how to export a CSV file based on a MongoDB document (database table).

I have created a brief GitHub repo sampling the idea, and also created a database with dummy data at the MongoLab website, so this code will run immediately on your computer. Based on @VietTD's answer.

In case you are interested in testing this code, just remember to change the following lines.

Change the db URL (use your own db; this one works but is only for demonstration purposes).

        var url = 'mongodb://admin:[email protected]:63946/misale_dev';

Change the target document (or db table).

var collection = db.collection('_dummy');

Change the document columns (or the db table's column fields):

var fields = ['_id', 'JobID', 'LastApplied'];

Finally, set your CSV column titles as well as your CSV file name:

var fieldNames = ['ID_OR_PK', 'JOB_UNIQUE_ID_TITLE', 'APPLICATION_DATE'];

Last but not least:

res.attachment('yourfilenamehere.csv');

Please feel free to improve the sample code, I'll appreciate that! Or just download it and take a look. It already comes with a database, so it will be easy to understand and run. In case you don't want to clone it, here is the whole code; maybe you will see something interesting:

//
// EXPRESS JS SERVER INIT
//
var express = require('express')
var app = express()

//
// MONGO DB INIT
//
var MongoClient = require('mongodb').MongoClient, assert = require('assert');
app.get('/', function (req, res) {
    var url = 'mongodb://admin:[email protected]:63946/misale_dev';
    //
    // This function should be used for migrating a db table to a TBD format
    //
    var migrateMongoDBTable = function(db, callback) {
        // Get the documents collection
        console.log("Reading database records");
        // Get the documents collection
        var collection = db.collection('_dummy');
        // Find some documents
        //collection.find({'a': 3}).toArray(function(err, docs) {
        collection.find({}).toArray(function(err, docs) {
          assert.equal(err, null);
          //console.log(docs);
          //console.log('docs.length ---> ', docs.length);
          console.log('Creating CSV...');
          //console.log('-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=');
          var json2csv = require('json2csv');
          var fields = ['_id', 'JobID', 'LastApplied'];
          var fieldNames = ['ID_OR_PK', 'JOB_UNIQUE_ID_TITLE', 'APPLICATION_DATE'];
          var data = json2csv({ data: docs, fields: fields, fieldNames: fieldNames });
          //console.log('-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-==-=');
          // EXPORT FILE
          res.attachment('yourfilenamehere.csv');
          res.status(200).send(data);
          callback(docs);
        });
    };

    // Use connect method to connect to the server
    MongoClient.connect(url, function(err, db) {
      assert.equal(null, err);
      console.log("Connected successfully to server");
      //
      // migrate db table to some format TBD
      //
      migrateMongoDBTable(db, function() {
        db.close();
      });
    });

})

app.listen(3000, function () {
  console.log('Example app listening on port 3000!')
})
// Connection URL
//var url = 'mongodb://localhost:27017/myproject';

1 Comment

json2csv is not a function
2

I'm not totally sure what your question is, but I think what you're looking for can be solved by setting the Content-Disposition header. Take a look at this response on another SO question: https://stackoverflow.com/a/7288883/2320243.

1 Comment

I am trying to send a user's friend list offline, letting them download their friend list as a CSV that is formed from MongoDB and sent directly to the user for download, without saving it on our system... that did not help.
2

Using the json2csv library you can export CSV from MongoDB data.

const fs = require('fs');
const path = require('path');
const json2csv = require('json2csv').parse;

// For a unique file name
const dateTime = new Date().toISOString().slice(-24).replace(/\D/g, '').slice(0, 14);

const filePath = path.join(__dirname, "../../../", "public", "exports", "csv-" + dateTime + ".csv");

let csv;

const student = await req.db.collection('Student').find({}).toArray();

// Logging student
// [{id:1,name:"John",country:"USA"},{id:1,name:"Ronny",country:"Germany"}]

const fields = ['id', 'name', 'country'];

try {
    csv = json2csv(student, { fields });
} catch (err) {
    return res.status(500).json({ err });
}

fs.writeFile(filePath, csv, function (err) {
    if (err) {
        return res.status(500).json(err);
    } else {
        // delete the file after 30 seconds, then send it to the client
        setTimeout(function () {
            fs.unlink(filePath, function (err) {
                if (err) {
                    console.error(err);
                }
                console.log('File has been deleted');
            });
        }, 30000);
        res.download(filePath);
    }
});
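
A possible tweak, not part of the original answer: res.download() accepts a completion callback, so the temporary file could be removed as soon as the transfer finishes instead of waiting on a fixed 30-second timer:

// Sketch: delete the temp file once the download has completed (or failed)
res.download(filePath, function (err) {
    if (err) console.error(err);
    fs.unlink(filePath, function (unlinkErr) {
        if (unlinkErr) console.error(unlinkErr);
    });
});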

1 Comment

Here's the link to the gist with working examples of exporting to CSV: gist.github.com/aabiskar/e7f028e78e586921a0b7f3435b65b5f1
2

A real example that may be helpful.

BACKEND - NodeJs

router.js

var { Parser } = require('json2csv');
const express = require('express');
const cors = require('cors');
const dataService = require('../service');

const router = express.Router();

router.get('/exportData', cors(), async (req, res) => {
    const list = await dataService.getData(req);
    const fields = [
        { label: 'Id', value: 'Id' },
        { label: 'Name', value: 'Name' },
        { label: 'Age', value: 'Age' },
        { label: 'Date', value: 'Date' },
    ];
    const json2csv = new Parser({ fields: fields });
    const csv = json2csv.parse(list);
    res.status(200).send(Buffer.from(csv));
});

service.js

const sequelize = require('sequelize');
const moment = require('moment');
const { CallLogs } = require("../models");
const { Op } = sequelize;

exports.getData = async (req) => {
    let list = [];
    try {
        const { query: { Date_gte, Date_lt } } = req;
        const request = {
            where: {},
            order: [['Date', 'DESC']],
        };
        if (Date_gte && Date_lt) {
            request.where.Date = {
                [Op.between]: [Date_gte, Date_lt]
            }
        } else {
            if (Date_gte) {
                request.where.Date = {
                    [Op.gte]: Date_gte
                }
            }
            if (Date_lt) {
                request.where.Date = {
                    [Op.lt]: Date_lt
                }
            }
        }

        list = JSON.parse(JSON.stringify(await CallLogs.findAll(request)));

        list = list.map(c => {
            let obj = { ...c };
            obj.Date = moment(new Date(obj.Date)).format('DD.MM.YYYY');
            return obj;
        });
    } catch (error) {
        console.log(error);
    }
    return list;
}

FRONT END - REACT

app.js

import download from "downloadjs";
import Axios from "axios";

async function exportResultToExcel() {
    try {
        let filters = { ...this.state.filter };
        const res = (await Axios.get(`api/exportData`, {
            params: filters,
            responseType: "blob"
        })).data;
        if (res) {
            download(res, new Date().toLocaleDateString() + '-data.csv');
        } else {
            alert("Data not found");
        }
    } catch (error) {
        console.log(error);
    }
}

<Button onClick={() => { this.exportResultToExcel() }}>Export to CSV</Button>

Comments

1

express-csv is a great module for writing CSV content to a stream from a Node.js server, which is then sent as the response to the client (and downloaded as a file). Very easy to use.

app.get('/', function(req, res) {
  res.csv([
    ["a", "b", "c"]
  , ["d", "e", "f"]
  ]);
});

The docs: https://www.npmjs.com/package/express-csv

When you pass an object, you need to prepend the headers explicitly (if you want them). Here's my example using npm mysql:

router.route('/api/report')
    .get(function(req, res) {
        var query = connection.query('select * from table where table_id=1;', function(err, rows, fields) {
            if (err) {
                return res.send(err);
            }
            // build a header row from the column names of the first result row
            var headers = {};
            for (var key in rows[0]) {
                headers[key] = key;
            }
            rows.unshift(headers);
            res.csv(rows);
        });
    });

3 Comments

How does one go from the code above (attaching a csv object to the response) to offering it as a file the user can download?
I see now that's all that's required to present a download. Pretty sweet.
I could not send a result with a million rows of data. Is there any way to handle this?
1

If you don't want to use a module, I would break it into two steps:

Step 1: Convert JSON to flat array

What's important here is that the JSON keys become the first row of the array

/**
 * Convert JSON list of objects to 2D array
 * Headers from first list item create first row of array
 * @param Array JSON list of objects
 * @return Array 2D array with first row as the object keys from the first object in list
 */
function json2array (data) {
  let headers = []
  let output = new Array(data.length + 1)
  // get object keys for first item if one exists
  if (data.length > 0) headers = Object.keys(data[0])
  output[0] = headers
  data.forEach((dataRow, row) => {
    const outputRow = new Array(headers.length)
    // populate array
    headers.forEach((header, column) => {
      outputRow[column] = dataRow[header]
    })
    output[row + 1] = outputRow
  })
  return output
}

// jsonData = [{email: "[email protected]", name: "John"}, {email: "[email protected]", name: "Jane"}] 

const arr = json2array(jsonData)

// arr = [["email", "name"],["[email protected]", "John"],["[email protected]", "Jane"]]

Step 2: Convert 2D array to CSV

Then you can save to file or output to browser, etc.

/**
 * Convert a 2D array into CSV notation
 * @param Array a 2D array with the first row as headers
 * @return String CSV-formatted text
 */
function array2csv (arr) {
  let csvOut = ''
  arr.forEach((arrRow, rowNum) => {
    let csvRow = ''
    arrRow.forEach((value, colNum) => {
      if (colNum > 0) csvRow += ',' // comma-separate columns
      const valueType = typeof(value)
      if (valueType == 'number') csvRow += value
      else if (valueType == 'boolean') csvRow += value ? 'TRUE':'FALSE'
      // quote strings (and any other values, including missing ones) and escape embedded quotes
      else csvRow += '"' + String(value).replace(/"/g, '""') + '"'
    })
    if (rowNum > 0) csvOut += "\n" // newline-split rows
    csvOut += csvRow
  })
  return csvOut
}

const csvOut = array2csv(arr)

/*
"email","name"
"[email protected]","John"
"[email protected]","Jane"
*/

These methods are O(n) in the number of cells.
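
Wiring the two helpers into an Express handler could then look something like this (a sketch; the Friend model, query, and route path are illustrative):

// Sketch: send the generated CSV as a download using the helpers above
app.get('/contacts.csv', function (req, res) {
  Friend.find({ userId: req.signedCookies.userid }).lean().exec(function (err, docs) {
    if (err) return res.status(500).send(err)
    const csvOut = array2csv(json2array(docs))
    res.attachment('contacts.csv')
    res.send(csvOut)
  })
})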

Comments

0

I found a solution at http://nikgrozev.com/2017/05/10/mongo-query-to-CSV-download-expressjs/

Keyword

stream the output to the HTTP response
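
The gist of that approach, roughly sketched with a Mongoose query cursor (the model and field names are illustrative, and real data would need proper quoting/escaping):

// Sketch: stream documents straight into the HTTP response as CSV rows
app.get('/contacts.csv', function (req, res) {
  res.attachment('contacts.csv');
  res.write('Name,Email\n'); // header row

  const cursor = User.find({}).lean().cursor();
  cursor.on('data', function (doc) {
    res.write(doc.firstNameTrue + ',' + doc.emailTrue + '\n');
  });
  cursor.on('error', function (err) {
    console.error(err);
    res.end();
  });
  cursor.on('end', function () {
    res.end();
  });
});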

Comments
