
# base64-async

> Non-blocking chunked Base64 encoding


Process large Base64 documents without blocking the event loop.

A configurable chunk size option lets you optimise for your use case.

### Note

Base64 in Node.js is already crazy fast. Breaking the work up into chunks and adding async logic adds overhead. If you aren't dealing with large files, it will probably be more performant to just block the event loop for the small amount of time it takes Node.js to process Base64 synchronously.
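For small payloads, the synchronous route the note recommends is just Node's built-in `Buffer` API; a minimal sketch:

```javascript
// Synchronous Base64 with Node's built-in Buffer - fine for small data,
// since the event loop is only blocked for a very short time.
const encoded = Buffer.from('hi mum').toString('base64');
console.log(encoded); // aGkgbXVt

const decoded = Buffer.from(encoded, 'base64').toString();
console.log(decoded); // hi mum
```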

## Install

```shell
npm install --save base64-async
```

## Usage

```js
const b64 = require('base64-async');
const fs = require('fs');
const fileBuffer = fs.readFileSync('somehugefile.jpg');

b64.encode(fileBuffer).then(b64String => console.log(b64String));
// aGkgbXVt...

b64.decode(b64String).then(buffer => console.log(buffer));
// <Buffer 68 69 20 6d 75 6d ... >

// or, for the cool kids
const b64String = await b64.encode(fileBuffer);
const originalFileBuffer = await b64.decode(b64String);

// which is equivalent to this
const b64String = await b64(fileBuffer);
const originalFileBuffer = await b64(b64String);
// If no method is specified, buffers are encoded, strings are decoded
```
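The chunked, non-blocking approach described above can be sketched roughly as follows. This is an illustrative implementation, not base64-async's actual source: the `encodeChunked` name and the default size are made up for this example. The chunk size must be a multiple of 3 bytes so each slice encodes to Base64 without internal padding, letting the encoded parts be safely concatenated.

```javascript
// Illustrative sketch: encode one chunk per turn of the event loop,
// yielding with setImmediate between chunks so other work can run.
// chunkSize must be a multiple of 3 so no chunk produces '=' padding.
function encodeChunked(buffer, chunkSize = 249 * 1024) {
	return new Promise(resolve => {
		const parts = [];
		let offset = 0;
		function next() {
			// Encode the current slice synchronously - this is the small
			// unit of blocking work.
			parts.push(buffer.slice(offset, offset + chunkSize).toString('base64'));
			offset += chunkSize;
			if (offset < buffer.length) {
				setImmediate(next); // let pending I/O and timers run first
			} else {
				resolve(parts.join(''));
			}
		}
		next();
	});
}

// Usage: a tiny 3-byte chunk size just to show the chunking at work.
encodeChunked(Buffer.from('hi mum'), 3).then(s => console.log(s));
// aGkgbXVt
```

Decoding can be chunked the same way, slicing the string at multiples of 4 characters instead.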

## License

MIT © Luke Childs