### pipe

##### `miss.pipe(stream1, stream2, ..., streamN, [cb])`

When using standard `source.pipe(destination)`, the source will _not_ be destroyed if the destination emits `close` or `error`. You are also not able to provide a callback to tell when the pipe has finished.

`miss.pipe` does these two things for you, ensuring you handle stream errors 100% of the time (unhandled errors are probably the most common bug in most Node.js streams code).
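For example, a minimal sketch of copying a file (the file names are hypothetical):

```js
var fs = require('fs')
var miss = require('mississippi')

// the callback fires exactly once: on success, or on the first error,
// in which case all of the streams are destroyed
miss.pipe(
  fs.createReadStream('original.zip'), // hypothetical source file
  fs.createWriteStream('copy.zip'),
  function (err) {
    if (err) return console.error('Copy error!', err)
    console.log('Copied successfully')
  }
)
```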
### each

##### `miss.each(stream, each, [done])`

Iterate the data in `stream` one chunk at a time. Your `each` function will be called with `(data, next)`, where `data` is a data chunk and `next` is a callback. Call `next` when you are ready to consume the next chunk.

Optionally you can call `next` with an error to destroy the stream. You can also pass the optional third argument, `done`, which is a function that will be called with `(err)` when the stream ends. The `err` argument will be populated with an error if the stream emitted an error.
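A minimal sketch, assuming a hypothetical input file `chunks.txt`:

```js
var fs = require('fs')
var miss = require('mississippi')

miss.each(fs.createReadStream('chunks.txt'), function (data, next) {
  console.log('got a chunk of %d bytes', data.length)
  next() // ready for the next chunk; pass an error here to destroy the stream
}, function (err) {
  if (err) return console.error('stream errored', err)
  console.log('stream ended')
})
```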
### pipeline

##### `miss.pipeline(stream1, stream2, ..., streamN)`

Builds a pipeline from all the transform streams passed in as arguments by piping them together and returning a single stream object that lets you write to the first stream and read from the last stream.

If any of the streams in the pipeline emits an error or gets destroyed, or you destroy the stream it returns, all of the streams will be destroyed and cleaned up for you.
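For example, a minimal sketch combining a `miss.through` transform with gzip (the streams and the output file name are just for illustration):

```js
var fs = require('fs')
var zlib = require('zlib')
var miss = require('mississippi')

var uppercase = miss.through(function (chunk, enc, cb) {
  cb(null, chunk.toString().toUpperCase())
})

// one stream: writes go into `uppercase`, reads come out of the gzip end
var pipeline = miss.pipeline(uppercase, zlib.createGzip())

pipeline.pipe(fs.createWriteStream('shouting.gz'))
pipeline.end('hello world')
```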
### duplex

##### `miss.duplex([writable, readable, opts])`

Take two separate streams, a writable and a readable, and turn them into a single [duplex (readable and writable) stream](https://nodejs.org/api/stream.html#stream_class_stream_duplex).

The returned stream will emit data from the readable. When you write to it, it writes to the writable.

You can either choose to supply the writable and the readable at the time you create the stream, or you can do it later using the `.setWritable` and `.setReadable` methods; data written to the stream in the meantime will be buffered for you.
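A minimal sketch: wrap a gzip/gunzip pair into a single round-trip duplex stream (the pairing is just for illustration):

```js
var zlib = require('zlib')
var miss = require('mississippi')

var gzip = zlib.createGzip()
var gunzip = zlib.createGunzip()
gzip.pipe(gunzip)

// writes go into gzip, reads come out of gunzip
var roundTrip = miss.duplex(gzip, gunzip)

roundTrip.pipe(process.stdout)
roundTrip.end('hello world')
```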
### through

##### `miss.through([options, transformFunction, flushFunction])`

Make a custom [transform stream](https://nodejs.org/docs/latest/api/stream.html#stream_class_stream_transform).

The `options` object is passed to the internal transform stream and can be used to create an `objectMode` stream (or use the shortcut `miss.through.obj([...])`).

The `transformFunction` is called when data is available for the writable side and has the signature `(chunk, encoding, cb)`. Within the function, add data to the readable side any number of times with `this.push(data)`. Call `cb()` to indicate processing of the `chunk` is complete. Or, to easily emit a single error or chunk, call `cb(err, chunk)`.

The `flushFunction`, with signature `(cb)`, is called just before the stream is complete and should be used to wrap up stream processing.
#### example

```js
// `read`/`write` are file streams, e.g. from fs.createReadStream/createWriteStream
var uppercaser = miss.through(function (chunk, enc, cb) {
  cb(null, chunk.toString().toUpperCase()) // emit the uppercased chunk
})

miss.pipe(read, uppercaser, write, function (err) {
  if (err) return console.error('Trouble uppercasing!')
  console.log('Splendid uppercasing!')
})
```
### from
##### `miss.from([opts], read)`
Make a custom [readable stream](https://nodejs.org/docs/latest/api/stream.html#stream_class_stream_readable).
`opts` contains the options to pass on to the Readable stream constructor, e.g. for creating a readable object stream (or use the shortcut `miss.from.obj([...])`).
Returns a readable stream that calls `read(size, next)` when data is requested from the stream.
- `size` is the recommended amount of data (in bytes) to retrieve.
- `next(err, chunk)` should be called when you're ready to emit more data.
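A minimal sketch that streams a string out in `size`-byte chunks:

```js
var miss = require('mississippi')

function fromString (string) {
  return miss.from(function (size, next) {
    // if there's no content left, end the stream with a null chunk
    if (string.length <= 0) return next(null, null)
    // pull off at most `size` bytes and emit them
    var chunk = string.slice(0, size)
    string = string.slice(size)
    next(null, chunk)
  })
}

fromString('hello world').pipe(process.stdout)
```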
### to

##### `miss.to([options], write, [flush])`

Make a custom [writable stream](https://nodejs.org/docs/latest/api/stream.html#stream_class_stream_writable).

`opts` contains the options to pass on to the Writable stream constructor, e.g. for creating a writable object stream (or use the shortcut `miss.to.obj([...])`).

Returns a writable stream that calls `write(data, enc, cb)` when data is written to the stream.

- `data` is the received data to write to the destination.
- `enc` is the encoding of the chunk of data received.
- `cb(err)` should be called when you're ready to write more data, or when you've encountered an error.

`flush(cb)` is called before `finish` is emitted and allows for cleanup steps to occur.
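A minimal sketch that logs each chunk as it is written:

```js
var miss = require('mississippi')

var stream = miss.to(function (data, enc, cb) {
  console.log('writing', data.toString())
  cb() // ready for the next chunk
}, function (cb) {
  console.log('all data written')
  cb() // cleanup done; the stream can now emit finish
})

stream.write('hello')
stream.write('world')
stream.end()
```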
### concat

##### `miss.concat(cb)`

Calling `miss.concat(cb)` returns a writable stream. `cb` is called when the writable stream is finished, e.g. when all data is done being written to it. `cb` is called with a single argument, `(data)`, which will contain the result of concatenating all the data written to the stream.
#### example

```js
var fs = require('fs')
var miss = require('mississippi')

miss.pipe(fs.createReadStream('cat.png'), miss.concat(gotPicture), handleError)

function gotPicture (imageBuffer) {
  // imageBuffer is all of `cat.png` as a node.js Buffer
}

function handleError (err) {
  // handle your error appropriately here, e.g.:
  console.error(err) // print the error to STDERR
  process.exit(1) // exit program with non-zero exit code
}
```
### finished
##### `miss.finished(stream, cb)`
Waits for `stream` to finish or error and then calls `cb` with `(err)`. `cb` will only be called once. `err` will be null if the stream finished without error, or else it will be populated with the error from the stream's `error` event.

This function is useful for simplifying stream handling code as it lets you handle success or error conditions in a single code path. It's used internally by `miss.pipe`.
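A minimal sketch, using plain `.pipe` and `miss.finished` to learn when the write side is done (the file names are hypothetical):

```js
var fs = require('fs')
var miss = require('mississippi')

var copySource = fs.createReadStream('./movie.mp4')
var copyDest = fs.createWriteStream('./movie-copy.mp4')

copySource.pipe(copyDest)

miss.finished(copyDest, function (err) {
  if (err) return console.log('write failed', err)
  console.log('write success')
})
```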
### parallel

##### `miss.parallel(concurrency, each)`

This works like `through` except you can process items in parallel, while still preserving the original input order.

This is handy if you wanna take advantage of node's async I/O and process streams of items in batches. With this module you can build your very own streaming parallel job queue.

Note that `miss.parallel` preserves input ordering; if you don't need that, you can use [through2-concurrent](https://github.com/almost/through2-concurrent) instead, which is otherwise very similar.
#### original module
`miss.parallel` is provided by [`require('parallel-transform')`](https://npmjs.org/parallel-transform)
#### example
This example fetches the GET HTTP headers for a stream of input URLs 5 at a time in parallel.
```js
function getResponse (item, cb) {
  var r = request(item.url)
  r.on('error', function (err) {
    cb(err)
  })
  r.on('response', function (re) {
    // we only want the headers, so emit the result row and abort the request
    cb(null, {url: item.url, date: new Date(), status: re.statusCode, headers: re.headers})
    r.abort()
  })
}
miss.pipe(
  fs.createReadStream('./urls.txt'), // one url per line