If gzip doesn't compress well enough, chances are your binary format won't either, especially if you want to be able to decode it via JavaScript within a reasonable amount of time.

Remember that gzip decompression is done natively by the browser and is orders of magnitude faster than anything you can do in JavaScript.

If you feel that JSON deserialization is too slow because you support older browsers like IE7, which lack native JSON decoding and depend on eval for the job, consider moving away from JSON to a custom encoding based on string splitting, which is much faster to deserialize.
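As a hedged sketch of that string-splitting idea (the delimiters and the field order are assumptions here, agreed on between client and server, not part of any standard): the server ships rows as one delimited string and the client splits it apart instead of invoking a JSON parser.

```javascript
// Hypothetical row format: fields separated by ',', rows by ';'.
// The field order (id, name, points) is an assumed, agreed-on schema.
var payload = '1,ann,100;2,bob,85;3,cy,92';

var rows = payload.split(';').map(function (r) {
  var f = r.split(',');
  return { id: +f[0], name: f[1], points: +f[2] };
});

console.log(rows[1]); // { id: 2, name: 'bob', points: 85 }
```

`String.prototype.split` is cheap even in old engines, which is the whole point of the Flickr approach linked below.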

For inspiration, try reading this article:

http://code.flickr.com/blog/2009/03/18/building-fast-client-side-searches/

Answer from Martin Jespersen on Stack Overflow
🌐
GitHub
github.com › JeffreyRiggle › bison
GitHub - JeffreyRiggle/bison: JSON to binary
A JSON to binary tool. This tool takes in JSON and converts it to a binary structure to save some bytes.
Starred by 8 users
Forked by 2 users
Languages   TypeScript 93.1% | JavaScript 6.9%
Top answer
1 of 2
17

First of all, browsers treat binary data differently than Node.js. In the browser, binary data shows up as a Blob or an ArrayBuffer, while in Node.js it is handled as a Buffer, which is not interchangeable with an ArrayBuffer. I won't go too deep into this, but you need to handle the data differently between the browser and Node.js.

When using WebSocket on the browser side, data is transmitted as either a string or binary. If binary is used, you have to specify binaryType, and in this particular case I will use 'arraybuffer'.

As for converting a string to a buffer, I suggest the standard UTF-8, since there are two ways of encoding UTF-16. For example, '\u0024' is stored as 00 24 in UTF-16BE, while in UTF-16LE it is stored as 24 00. That is, if you are going to use UTF-16, then you should use TextEncoder and TextDecoder. Otherwise you can simply do this (note it only survives code points up to 255):

const strToAB = str =>
  new Uint8Array(str.split('')
    .map(c => c.charCodeAt(0))).buffer;

const ABToStr = ab =>
  new Uint8Array(ab).reduce((p, c) =>
    p + String.fromCharCode(c), '');

console.log(ABToStr(strToAB('hello world!')));
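The charCode trick above silently mangles anything outside code points 0–255. A minimal sketch of the safer route via TextEncoder/TextDecoder (UTF-8, which is all TextEncoder emits in modern engines):

```javascript
// TextEncoder always produces UTF-8; TextDecoder can decode several encodings.
const enc = new TextEncoder();
const dec = new TextDecoder('utf-8');

const buf = enc.encode('héllo 🌍').buffer; // ArrayBuffer, safe for any code point
const str = dec.decode(buf);
console.log(str); // 'héllo 🌍'
```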

For UTF-16, the browser code should be something like:

const ENCODING = 'utf-16le';
var ws = new WebSocket('ws://localhost');

ws.binaryType = 'arraybuffer';
ws.onmessage = event => {
  let str = new TextDecoder(ENCODING).decode(event.data),
    json = JSON.parse(str);
  console.log('received', json);
};
ws.onopen = () => {
  let json = { client: 'hi server' },
    str = JSON.stringify(json);
  console.log('sent', json);

  //JSON.toString() returns "[object Object]", which isn't what you want,
  //so ws.send(json) would send the wrong data.
  //Caveat: per the current Encoding spec, TextEncoder always emits UTF-8
  //and ignores its constructor argument, so to truly send UTF-16 you
  //would have to build the byte pairs yourself.
  ws.send(new TextEncoder(ENCODING).encode(str));
};

On the server side, data arrives as a Buffer, which handles most of this natively. You do, however, need to specify the encoding unless it is UTF-8.

const ENCODING = 'utf-16le';
//You may use a different websocket implementation, but the core
//logic remains the same as they all build on top of Buffer.
var WebSocketServer = require('websocket').server,
  http = require('http'),
  //This is only here so WebSocketServer can be initialized.
  wss = new WebSocketServer({
    httpServer: http.createServer()
      .listen({ port: 80 })});

wss.on('request', request => {
  var connection = request.accept(null, request.origin);
  connection.on('message', msg => {
    if (msg.type === 'binary') {
      //In NodeJS (Buffer), you can use toString(encoding) to get
      //the string representation of the buffer.
      let str = msg.binaryData.toString(ENCODING);
      console.log(`message : ${str}`);

      //send data back to browser.
      let json = JSON.parse(str);
      json.server = 'Go away!';
      str = JSON.stringify(json);

      //Buffer.from(string, encoding) replaces the deprecated
      //new Buffer(); the default encoding is UTF-8.
      let buf = Buffer.from(str, ENCODING);
      connection.sendBytes(buf);
    }
  });
});
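The Buffer behaviour the answer relies on can be checked in isolation: 'utf-16le' lays out two bytes per BMP character, so the same string doubles in size compared to ASCII-only UTF-8.

```javascript
const str = '{"a":1}';
const utf16 = Buffer.from(str, 'utf-16le'); // two bytes per character here
const utf8 = Buffer.from(str, 'utf8');      // one byte per ASCII character

console.log(utf16.length); // 14
console.log(utf8.length);  // 7
console.log(utf16.toString('utf-16le')); // '{"a":1}'
```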
2 of 2
2

Try it:

Sending data example:

var data = [{
    id: 1,
    name: "test",
    position: [1234, 850], //random position on the map
    points: 100 //example points
}];

//Serialize to a string first; a Uint16Array built directly from an
//array of objects would just contain zeros.
var str = JSON.stringify(data);
var data2 = new Uint16Array(str.length);
for (var i = 0; i < str.length; i++) {
    data2[i] = str.charCodeAt(i);
}
socket.send(data2);

In your websocket's onMessage event, try this:

function onMessage(event) {

    if (event.data instanceof ArrayBuffer) {
        var data3 = JSON.parse(String.fromCharCode.apply(null, new Uint16Array(event.data)));
    }

}
🌐
npm
npmjs.com › package › typed-binary-json
typed-binary-json - npm
December 3, 2024 - It stores known object prototypes in a JSON header, and serializes the data in a binary format following the header. TBJSON is useful for serializing known objects, classes, or types, otherwise it will offer little advantage if any in terms of size or performance over JSON. For a browser compatible version of this package, use TBJSON in the Browser. Each file starts off with .tbj to signify that it is a Typed Binary JSON file, followed by a uint32 which is the length of the header.
      » npm install typed-binary-json
    
Published   Dec 03, 2024
Version   1.21.2
Author   Jeff Seaman
🌐
npm
npmjs.com › package › js-binary
js-binary - npm
November 27, 2017 - Encode/decode to a custom binary format, much more compact and faster than JSON/BSON. Latest version: 1.2.0, last published: 8 years ago. Start using js-binary in your project by running `npm i js-binary`. There are 5 other projects in the npm registry using js-binary.
      » npm install js-binary
    
Published   Nov 27, 2017
Version   1.2.0
Author   Sitegui
🌐
IETF
ietf.org › archive › id › draft-hallambaker-jsonbcd-24.html
Binary Encodings for JavaScript Object Notation: JSON-B, JSON-C, JSON-D
Ability to convert from JSON to binary encoding in a streaming mode (i.e. without reading the entire binary data block before beginning encoding.¶ ... The ability to support JSON tag compression and extended data types are considered desirable but not essential for typical network applications.¶ ... Encodes JSON data in binary. Only the JavaScript data model is supported (i.e.
🌐
DotNetCurry
dotnetcurry.com › csharp › 1279 › serialize-json-data-binary
Serializing JSON Data into Binary Form | DotNetCurry
The XML form is acceptable over Binary, but the Xml data needs encoding when it is passed over HTTP. You have to escape/unescape an XML file removing traces of characters (< is replaced with &lt; & is replaced with &amp; and so on ) that could be treated as markup. This makes data communication more complex because the size of the data message increases as a result of the encoding. JSON is JavaScript ...
🌐
npm
npmjs.com › search
binary json - npm search
Node Bindings for abieos: Binary <> JSON conversion using ABIs.
🌐
GitHub
github.com › mrdoob › three.js › issues › 1667
Convert JSON to Binary? · Issue #1667 · mrdoob/three.js
April 8, 2012 - Hi all, Wondering if there is a tool to convert an existing Three.js JSON file to its binary equivalent? I searched and found a few versions of this question on StackOverflow etc., with no satisfac...
🌐
Online Tools
onlinetools.com › json › convert-json-to-bson
Convert JSON to BSON – Online JSON Tools
Click the "Save as" button to download the binary file. By default, the output BSON file has the ".bson" extension but you can change it to another in the options. Other common BSON extensions are ".bjson" and ".bin". If you need to convert a machine-readable BSON file back to a human-readable JSON file, you can use our Convert BSON to JSON tool. Json-abulous! This tool converts JSON (JavaScript Object Notation) to BSON (Binary JSON).
🌐
Index.dev
index.dev › blog › json-to-buffer-typescript
Convert JSON to Buffer in TypeScript: A Step-by-Step Guide
December 24, 2024 - This guide taught us how to use TypeScript to convert JSON to Buffer, a method that's particularly helpful for file storage, network transfer, and interacting with binary-only systems. Parsing JSON, turning it into a string, and then encoding it into a buffer were among the procedures.
🌐
Medium
medium.com › @th30z › yet-another-json-binary-encoding-382d6e785273
Yet Another JSON Binary Encoding. YAJBE is a compact binary data format… | by Matteo Bertozzi | Medium
April 5, 2023 - Yet Another JSON Binary Encoding YAJBE is a compact binary data format built to be a drop-in replacement for JSON (JavaScript Object Notation). You can find the repository here …
Top answer
1 of 16
593

There are 94 Unicode characters which can be represented as one byte according to the JSON spec (if your JSON is transmitted as UTF-8). With that in mind, I think the best you can do space-wise is base85 which represents four bytes as five characters. However, this is only a 7% improvement over base64, it's more expensive to compute, and implementations are less common than for base64 so it's probably not a win.

You could also simply map every input byte to the corresponding character in U+0000-U+00FF, then do the minimum encoding required by the JSON standard to pass those characters; the advantage here is that the required decoding is nil beyond builtin functions, but the space efficiency is bad -- a 105% expansion (if all input bytes are equally likely) vs. 25% for base85 or 33% for base64.
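The 105% figure can be checked mechanically: map every byte value 0–255 to its U+0000–U+00FF character, let JSON.stringify add the minimum required escaping, and count the UTF-8 bytes on the wire (a Node sketch; 526/256 ≈ 2.05 is the claimed 105% expansion):

```javascript
// Controls get escaped (\n, \u001f, ...), '"' and '\\' get backslashes,
// and 0x80-0xFF each become two UTF-8 bytes.
const all = Array.from({ length: 256 }, (_, i) => String.fromCharCode(i)).join('');
const json = JSON.stringify(all);                 // escaping + surrounding quotes
const wire = Buffer.byteLength(json, 'utf8') - 2; // drop the two quote chars

console.log(wire, (wire / 256).toFixed(2)); // 526 2.05
```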

Final verdict: base64 wins, in my opinion, on the grounds that it's common, easy, and not bad enough to warrant replacement.

See also: Base91 and Base122
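A round-trip sketch of the winning option (Node's Buffer here; in browsers, btoa/atob over a binary string plays the same role):

```javascript
const bytes = Uint8Array.from([0, 1, 254, 255]);

// Binary -> base64 string, safe to embed in a JSON document.
const wrapped = JSON.stringify({ data: Buffer.from(bytes).toString('base64') });

// And back: parse the JSON, then decode the base64 field.
const back = Uint8Array.from(Buffer.from(JSON.parse(wrapped).data, 'base64'));
console.log(back); // round-trips to [0, 1, 254, 255]
```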

2 of 16
340

I ran into the same problem, and thought I'd share a solution: multipart/form-data.

With a multipart form you first send your JSON meta-data as a string, and then separately send the raw binary (images, wavs, etc.), indexed by the Content-Disposition name.

Here's a nice tutorial on how to do this in obj-c, and here is a blog article that explains how to partition the string data with the form boundary, and separate it from the binary data.

The only change you really need to do is on the server side; you will have to capture your meta-data which should reference the POST'ed binary data appropriately (by using a Content-Disposition boundary).

Granted it requires additional work on the server side, but if you are sending many images or large images, this is worth it. Combine this with gzip compression if you want.

IMHO sending base64 encoded data is a hack; the RFC multipart/form-data was created for issues such as this: sending binary data in combination with text or meta-data.
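A minimal browser-side sketch of that approach using the standard FormData API (the '/upload' endpoint and the part names 'meta'/'file' are assumptions for illustration):

```javascript
const meta = { filename: 'photo.png', tags: ['demo'] };
const binary = new Blob([Uint8Array.from([137, 80, 78, 71])]); // e.g. PNG bytes

const form = new FormData();
form.append('meta', JSON.stringify(meta)); // text part, Content-Disposition name="meta"
form.append('file', binary, 'photo.png'); // binary part, name="file"

// fetch fills in the multipart boundary automatically:
// fetch('/upload', { method: 'POST', body: form });
```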

🌐
GitHub
github.com › juliangruber › binary-extract
GitHub - juliangruber/binary-extract: Extract a value from a buffer of json without parsing the whole thing
With the object from bench.js, extract() is ~2-4x faster than JSON.parse(buf.toString()). It is also way more memory efficient as the blob stays out of the V8 heap. The big perf gain comes mainly from not parsing everything and not converting the buffer to a string. $ npm install binary-extract ·
Starred by 154 users
Forked by 4 users
Languages   JavaScript 98.8% | Makefile 1.2%
🌐
DeepWiki
deepwiki.com › bellard › quickjs › 8.4-binary-json
Binary JSON | bellard/quickjs | DeepWiki
September 1, 2025 - Binary JSON is a compact binary serialization format in QuickJS for JavaScript values. It provides a way to convert JavaScript objects into a binary representation and vice versa. Binary JSON is a sub
🌐
GeeksforGeeks
geeksforgeeks.org › javascript › how-to-convert-json-to-base64-in-javascript
How to Convert JSON to base64 in JavaScript ? - GeeksforGeeks
August 5, 2025 - First, the JSON data is converted to a UTF-8 string using unescape and encodeURIComponent, and then btoa encodes this UTF-8 string to Base64. ... Example: The below example uses btoa function to convert JSON to Base64 in JavaScript...