Answer from jesusbotella on Stack Overflow:
As long as the file is inside your project folder (a config file, for example), you can load it synchronously by requiring it directly in Node.js:
var test = require('./array.json');
The content will then be available in your variable on the next executed statement.
If you console.log it, it will print:
[ { name: 'c', content: '3', _prototype: 'item' },
  { name: 'd', content: '4', _prototype: 'item' } ]
in exactly the order the file had.
fs.stat is async, so your function is async.
You want fs.statSync instead (fs.fstatSync takes an already-open file descriptor, not a path).
require() does the JSON parsing for you and returns an object.
You can use Object.values() to get an array that contains only the values of this object (and ignore the keys):
const data = require('../../data/usershops.json');
const arr = Object.values(data)
console.log(arr);
// [ { nani: 'meow' }, { nani: 'woof' } ]
But please be aware that the order of values in the array returned by Object.values() might not always be the one you expect (integer-like keys, for example, are iterated in ascending numeric order rather than insertion order).
Unless you iterate over the array and use all the values in one operation that does not depend on their order, I recommend you don't use an object this way.
let data = {
  "cat": {
    "nani": "meow"
  },
  "dog": {
    "nani": "woof"
  }
};
let array = Object.entries(data).map(([key, value]) => value);
console.log(array[0].nani);
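A quick illustration of the ordering caveat mentioned above (the object literal is made up for the demo):

```javascript
// integer-like keys are iterated in ascending numeric order,
// regardless of the order they were written in the literal;
// plain string keys keep insertion order
const obj = { "2": "b", "1": "a", "name": "x" };
console.log(Object.values(obj)); // [ 'a', 'b', 'x' ]
```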
Just read out the movies property:
var parsedJSON = require('./filename.json');
var result = parsedJSON.movies;
You can parse the file contents using JSON.parse(...):
const fs = require('fs');
var fileString = fs.readFileSync('movies.txt', 'utf-8');
var fileObj = JSON.parse(fileString);
var moviesArr = fileObj.movies;
console.log(moviesArr);
As a variant:
'use strict';
const fs = require('fs');
const path = require('path');
const dir = './data';
function init() {
  return fs.readdirSync(dir)
    .filter(name => path.extname(name) === '.json')
    // resolve to an absolute path; require('data/x.json') without a
    // leading './' would be looked up as a package in node_modules
    .map(name => require(path.resolve(dir, name)));
}
You can use the glob package:
const glob = require("glob");
const path = require("path");
glob('data/*.json', function (er, files) {
  files.forEach(function (file) {
    // Do your thing
  });
});
-
With CommonJS modules you can do it by
const object = require('./config.json');
The object will contain your parsed JSON.
With ES modules (note that recent Node.js versions require an import attribute for JSON):
import object from './config.json' with { type: 'json' };
-
Using the fs module synchronously:
const fs = require('fs')
const jsonData = JSON.parse(fs.readFileSync('config.json', 'utf-8'))
Async example, if you're using ES6 modules:
import fs from 'fs/promises';
const {prefix, token} = JSON.parse(
await fs.readFile('./config.json', 'utf-8')
);
See also:
- https://nodejs.org/api/fs.html#promises-api
- https://nodejs.org/api/packages.html#determining-module-system
I’m working on a chat client and server to learn about APIs, and now I need to write the messages submitted by users into the master chat log (a JSON file with a messages array inside it). I have been searching for different approaches and they all error out somehow. I’m using Node and Express for routing, if that helps.
tl;dr: how should one append an object to an array inside a JSON file using Node?
edit: I’ve taken carcigenocate’s advice. I open the file, make the changes in memory, then write to the file and reload it to reflect the changes. I am still open to improvements on this design!