
doc: remove "above" and "below" references

The docs were recently refactored, and some "above" and "below"
references were no longer accurate. This commit removes many
such references, and replaces others with links.

PR-URL: https://github.com/nodejs/node/pull/4499
Reviewed-By: Colin Ihrig <cjihrig@gmail.com>
Reviewed-By: James M Snell <jasnell@gmail.com>
Author: Richard Sun
Committed by: cjihrig
Commit: a2e77cedef
  1. doc/api/child_process.markdown (17)
  2. doc/api/crypto.markdown (3)
  3. doc/api/dns.markdown (9)
  4. doc/api/errors.markdown (3)
  5. doc/api/fs.markdown (5)
  6. doc/api/http.markdown (2)
  7. doc/api/modules.markdown (6)
  8. doc/api/path.markdown (4)
  9. doc/api/process.markdown (4)
  10. doc/api/repl.markdown (2)
  11. doc/api/stream.markdown (40)
  12. doc/api/util.markdown (5)
  13. doc/api/zlib.markdown (2)

doc/api/child_process.markdown (17)

@@ -33,10 +33,9 @@ function provides equivalent functionality in a synchronous manner that blocks
the event loop until the spawned process either exits of is terminated.
For convenience, the `child_process` module provides a handful of synchronous
and asynchronous alternatives to `child_process.spawn()` and
`child_process.spawnSync()`, each of which are documented fully [below][].
*Note that each of these alternatives are implemented on top of
`child_process.spawn()` or `child_process.spawnSync()`.*
and asynchronous alternatives to [`child_process.spawn()`][] and
[`child_process.spawnSync()`][]. *Note that each of these alternatives are
implemented on top of `child_process.spawn()` or `child_process.spawnSync()`.*
* `child_process.exec()`: spawns a shell and runs a command within that shell,
passing the `stdout` and `stderr` to a callback function when complete.
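
As an aside, a minimal sketch of the `child_process.exec()` pattern this bullet describes; the `ls -lh /usr` command is only an illustrative placeholder:

```javascript
const { exec } = require('child_process');

// Spawns a shell, runs the (illustrative) command inside it, and buffers
// stdout/stderr until the process completes, then passes them to the callback.
exec('ls -lh /usr', (error, stdout, stderr) => {
  if (error) {
    console.error(`exec error: ${error}`);
    return;
  }
  console.log(`stdout: ${stdout}`);
  console.error(`stderr: ${stderr}`);
});
```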
@@ -222,7 +221,8 @@ spawned directly as a new process making it slightly more efficient than
(Default: `process.execArgv`)
* `silent` {Boolean} If true, stdin, stdout, and stderr of the child will be
piped to the parent, otherwise they will be inherited from the parent, see
the `'pipe'` and `'inherit'` options for [`spawn()`][]'s [`stdio`][] for more details
the `'pipe'` and `'inherit'` options for [`spawn()`][]'s [`stdio`][] for
more details
(default is false)
* `uid` {Number} Sets the user identity of the process. (See setuid(2).)
* `gid` {Number} Sets the group identity of the process. (See setgid(2).)
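
A hedged sketch of the `silent` option just listed; the `child.js` module path is hypothetical:

```javascript
const { fork } = require('child_process');

// 'child.js' is a hypothetical module path.
// With silent: true, the child's stdio is piped back to the parent
// rather than inherited from it.
const child = fork('child.js', [], { silent: true });

child.stdout.on('data', (data) => {
  console.log(`child stdout: ${data}`);
});
```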
@@ -262,10 +262,10 @@ not clone the current process.*
* `cwd` {String} Current working directory of the child process
* `env` {Object} Environment key-value pairs
* `stdio` {Array|String} Child's stdio configuration. (See
[below](#child_process_options_stdio))
[`options.stdio`][])
* `detached` {Boolean} Prepare child to run independently of its parent
process. Specific behavior depends on the platform, see
[below](#child_process_options_detached))
[`options.detached`][])
* `uid` {Number} Sets the user identity of the process. (See setuid(2).)
* `gid` {Number} Sets the group identity of the process. (See setgid(2).)
* return: {ChildProcess object}
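
A minimal sketch of `child_process.spawn()` using the `stdio` and `detached` options listed here; the `worker.js` script name is a placeholder:

```javascript
const { spawn } = require('child_process');

// Start a child that can keep running after the parent exits:
// detach it and ignore its stdio so no pipes tie it to the parent.
const child = spawn('node', ['worker.js'], {
  detached: true,
  stdio: 'ignore'
});

// Drop the child from the parent's event loop reference count.
child.unref();
```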
@@ -933,6 +933,7 @@ to the same value.
[`EventEmitters`]: events.html#events_class_events_eventemitter
[`net.Server`]: net.html#net_class_net_server
[`net.Socket`]: net.html#net_class_net_socket
[`options.detached`]: #child_process_options_detached
[`options.stdio`]: #child_process_options_stdio
[`stdio`]: #child_process_options_stdio
[below]: #child_process_asynchronous_process_creation
[synchronous counterparts]: #child_process_synchronous_process_creation

doc/api/crypto.markdown (3)

@@ -972,7 +972,7 @@ supported groups are: `'modp1'`, `'modp2'`, `'modp5'` (defined in
[RFC 2412][], but see [Caveats][]) and `'modp14'`, `'modp15'`,
`'modp16'`, `'modp17'`, `'modp18'` (defined in [RFC 3526][]). The
returned object mimics the interface of objects created by
[`crypto.createDiffieHellman()`][] above, but will not allow changing
[`crypto.createDiffieHellman()`][], but will not allow changing
the keys (with [`diffieHellman.setPublicKey()`][] for example). The
advantage of using this method is that the parties do not have to
generate nor exchange a group modulus beforehand, saving both processor
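
For context, a small sketch of the `crypto.getDiffieHellman()` flow described in this hunk, using one of the predefined groups:

```javascript
const crypto = require('crypto');

// Both parties use the same well-known modp group, so no modulus
// needs to be generated or exchanged beforehand.
const alice = crypto.getDiffieHellman('modp14');
const bob = crypto.getDiffieHellman('modp14');

alice.generateKeys();
bob.generateKeys();

const aliceSecret = alice.computeSecret(bob.getPublicKey()).toString('hex');
const bobSecret = bob.computeSecret(alice.getPublicKey()).toString('hex');

console.log(aliceSecret === bobSecret); // true
```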
@@ -1249,6 +1249,7 @@ See the reference for other recommendations and details.
[OpenSSL's SPKAC implementation]: https://www.openssl.org/docs/apps/spkac.html
[`createCipher()`]: #crypto_crypto_createcipher_algorithm_password
[`createCipheriv()`]: #crypto_crypto_createcipheriv_algorithm_key_iv
[`createHash()`]: #crypto_crypto_createhash_algorithm
[`crypto.createDecipher`]: #crypto_crypto_createdecipher_algorithm_password
[`crypto.createDecipheriv`]: #crypto_crypto_createdecipheriv_algorithm_key_iv
[`crypto.createDiffieHellman()`]: #crypto_crypto_creatediffiehellman_prime_prime_encoding_generator_generator_encoding

doc/api/dns.markdown (9)

@@ -69,7 +69,7 @@ Alternatively, `options` can be an object containing these properties:
`getaddrinfo` flags. If `hints` is not provided, then no flags are passed to
`getaddrinfo`. Multiple flags can be passed through `hints` by logically
`OR`ing their values.
See [supported `getaddrinfo` flags][] below for more information on supported
See [supported `getaddrinfo` flags][] for more information on supported
flags.
* `all`: {Boolean} - When `true`, the callback returns all resolved addresses
in an array, otherwise returns a single address. Defaults to `false`.
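
A minimal sketch of `dns.lookup()` with the `hints` and `all` options described in this hunk; the hostname is illustrative:

```javascript
const dns = require('dns');

const options = {
  family: 4,
  hints: dns.ADDRCONFIG, // only return address families configured locally
  all: true              // return every resolved address, not just the first
};

// 'example.org' is just a placeholder hostname.
dns.lookup('example.org', options, (err, addresses) => {
  if (err) throw err;
  console.log(addresses); // e.g. [ { address: '…', family: 4 }, … ]
});
```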
@@ -151,10 +151,10 @@ Valid values for `rrtype` are:
The `callback` function has arguments `(err, addresses)`. When successful,
`addresses` will be an array. The type of each item in `addresses` is
determined by the record type, and described in the documentation for the
corresponding lookup methods below.
corresponding lookup methods.
On error, `err` is an [`Error`][] object, where `err.code` is
one of the error codes listed below.
one of the error codes listed [here](#dns_error_codes).
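
And a short illustrative `dns.resolve()` call with an explicit `rrtype`:

```javascript
const dns = require('dns');

// 'example.org' is a placeholder hostname.
// The shape of each item in `addresses` depends on the record type;
// for 'MX' each entry has `priority` and `exchange` properties.
dns.resolve('example.org', 'MX', (err, addresses) => {
  if (err) throw err;
  console.log(addresses);
});
```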
## dns.resolve4(hostname, callback)
@@ -250,7 +250,7 @@ The `callback` function has arguments `(err, hostnames)`, where `hostnames`
is an array of resolved hostnames for the given `ip`.
On error, `err` is an [`Error`][] object, where `err.code` is
one of the error codes listed below.
one of the [DNS error codes][].
## dns.setServers(servers)
@@ -335,6 +335,7 @@ processing that happens on libuv's threadpool that [`dns.lookup()`][] can have.
They do not use the same set of configuration files than what [`dns.lookup()`][]
uses. For instance, _they do not use the configuration from `/etc/hosts`_.
[DNS error codes]: #dns_error_codes
[`dns.lookup()`]: #dns_dns_lookup_hostname_options_callback
[`dns.resolve()`]: #dns_dns_resolve_hostname_rrtype_callback
[`dns.resolve4()`]: #dns_dns_resolve4_hostname_callback

doc/api/errors.markdown (3)

@@ -317,7 +317,7 @@ The number of frames captured by the stack trace is bounded by the smaller of
loop tick.
System-level errors are generated as augmented `Error` instances, which are
detailed [below](#errors_system_errors).
detailed [here](#errors_system_errors).
## Class: RangeError
@@ -503,7 +503,6 @@ found [here][online].
[`process.on('uncaughtException')`]: process.html#process_event_uncaughtexception
[`try / catch` construct]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/try...catch
[`try { } catch(err) { }`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/try...catch
[below]: #errors_error_propagation_and_interception
[domains]: domain.html
[event emitter-based]: events.html#events_class_events_eventemitter
[file descriptors]: https://en.wikipedia.org/wiki/File_descriptor

doc/api/fs.markdown (5)

@@ -659,7 +659,7 @@ Synchronous rmdir(2). Returns `undefined`.
## fs.stat(path, callback)
Asynchronous stat(2). The callback gets two arguments `(err, stats)` where
`stats` is a [`fs.Stats`][] object. See the [`fs.Stats`][] section below for more
`stats` is a [`fs.Stats`][] object. See the [`fs.Stats`][] section for more
information.
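
An illustrative `fs.stat()` call matching the signature described here; the file path is a placeholder:

```javascript
const fs = require('fs');

// '/tmp/example.txt' is a placeholder path.
fs.stat('/tmp/example.txt', (err, stats) => {
  if (err) throw err;
  // `stats` is an fs.Stats object.
  console.log(stats.isFile());
  console.log(stats.size);
});
```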
## fs.statSync(path)
@@ -743,7 +743,7 @@ The supported boolean members are `persistent` and `recursive`. `persistent`
indicates whether the process should continue to run as long as files are being
watched. `recursive` indicates whether all subdirectories should be watched, or
only the current directory. This applies when a directory is specified, and only
on supported platforms (See Caveats below).
on supported platforms (See [Caveats][]).
The default is `{ persistent: true, recursive: false }`.
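
A minimal sketch of `fs.watch()` with that options object; the directory path is a placeholder, and `recursive` only takes effect on supported platforms:

```javascript
const fs = require('fs');

// '/tmp' is a placeholder directory.
fs.watch('/tmp', { persistent: true, recursive: false }, (event, filename) => {
  // `event` is 'rename' or 'change'; `filename` may be null on some platforms.
  console.log(`${event}: ${filename}`);
});
```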
@@ -929,6 +929,7 @@ Synchronous versions of [`fs.write()`][]. Returns the number of bytes written.
[`Buffer.byteLength`]: buffer.html#buffer_class_method_buffer_bytelength_string_encoding
[`Buffer`]: buffer.html#buffer_buffer
[Caveats]: #fs_caveats
[`fs.access()`]: #fs_fs_access_path_mode_callback
[`fs.accessSync()`]: #fs_fs_accesssync_path_mode
[`fs.appendFile()`]: fs.html#fs_fs_appendfile_file_data_options_callback

doc/api/http.markdown (2)

@@ -52,7 +52,7 @@ require developers to manually close the HTTP clients using
KeepAlive.
If you opt into using HTTP KeepAlive, you can create an Agent object
with that flag set to `true`. (See the [constructor options][] below.)
with that flag set to `true`. (See the [constructor options][].)
Then, the Agent will keep unused sockets in a pool for later use. They
will be explicitly marked so as to not keep the Node.js process running.
However, it is still a good idea to explicitly [`destroy()`][] KeepAlive
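
A hedged sketch of opting in to KeepAlive via the constructor option; the host and path are placeholders:

```javascript
const http = require('http');

// Sockets used by this agent are pooled for reuse, but marked so they
// do not keep the Node.js process running on their own.
const agent = new http.Agent({ keepAlive: true });

// 'example.org' and '/' are placeholder request targets.
http.get({ hostname: 'example.org', path: '/', agent: agent }, (res) => {
  res.resume();
  res.on('end', () => {
    // Explicitly destroy the agent once it is no longer needed.
    agent.destroy();
  });
});
```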

doc/api/modules.markdown (6)

@@ -94,9 +94,9 @@ may itself have dependencies, and in some cases, these dependencies may even
collide or form cycles.
Since Node.js looks up the `realpath` of any modules it loads (that is,
resolves symlinks), and then looks for their dependencies in the
`node_modules` folders as described above, this situation is very simple to
resolve with the following architecture:
resolves symlinks), and then looks for their dependencies in the `node_modules`
folders as described [here](#modules_loading_from_node_modules_folders), this
situation is very simple to resolve with the following architecture:
* `/usr/lib/node/foo/1.2.3/` - Contents of the `foo` package, version 1.2.3.
* `/usr/lib/node/bar/4.3.2/` - Contents of the `bar` package that `foo`

doc/api/path.markdown (4)

@@ -83,7 +83,7 @@ an empty string. Examples:
## path.format(pathObject)
Returns a path string from an object, the opposite of `path.parse` above.
Returns a path string from an object, the opposite of [`path.parse`][].
path.format({
root : "/",
@@ -286,3 +286,5 @@ An example on Windows:
Provide access to aforementioned `path` methods but always interact in a win32
compatible way.
[`path.parse`]: #path_path_parse_pathstring
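
A short illustration of the `path.parse()` / `path.format()` round trip referenced in this change:

```javascript
const path = require('path');

const parsed = path.parse('/home/user/dir/file.txt');
// { root: '/', dir: '/home/user/dir', base: 'file.txt',
//   ext: '.txt', name: 'file' }

// path.format() is the inverse: it rebuilds the string from the object.
console.log(path.format(parsed)); // '/home/user/dir/file.txt'
```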

doc/api/process.markdown (4)

@@ -165,8 +165,7 @@ In cases like this, you may not want to track the rejection as a developer
error like you would for other `'unhandledRejection'` events. To address
this, you can either attach a dummy `.catch(function() { })` handler to
`resource.loaded`, preventing the `'unhandledRejection'` event from being
emitted, or you can use the `'rejectionHandled'` event. Below is an
explanation of how to do that.
emitted, or you can use the [`'rejectionHandled'`][] event.
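
To make the two options concrete, a hedged sketch in which `resource.loaded` stands in for the kind of lazily handled promise the paragraph discusses:

```javascript
// `resource.loaded` is assumed to be a promise that will be handled later.

// Option 1: attach a no-op handler so 'unhandledRejection' never fires.
resource.loaded.catch(() => {});

// Option 2: let the event fire, then clear the bookkeeping once the
// rejection is handled later on.
const unhandled = new Map();
process.on('unhandledRejection', (reason, promise) => {
  unhandled.set(promise, reason);
});
process.on('rejectionHandled', (promise) => {
  unhandled.delete(promise);
});
```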
## Exit Codes
@@ -953,6 +952,7 @@ Will print something like:
[`net.Socket`]: net.html#net_class_net_socket
[`process.exit()`]: #process_process_exit_code
[`promise.catch(...)`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/catch
[`'rejectionHandled'`]: #process_event_rejectionhandled
[`require.main`]: modules.html#modules_accessing_the_main_module
[`setTimeout(fn, 0)`]: timers.html#timers_settimeout_callback_delay_arg
[Signal Events]: #process_signal_events

doc/api/repl.markdown (2)

@@ -59,7 +59,7 @@ Previously in Node.js/io.js v2.x, REPL history was controlled by using a
format. This variable has now been deprecated, and your REPL history will
automatically be converted to using plain text. The new file will be saved to
either your home directory, or a directory defined by the `NODE_REPL_HISTORY`
variable, as documented below.
variable, as documented [here](#repl_environment_variable_options).
## REPL Features

doc/api/stream.markdown (40)

@@ -37,14 +37,14 @@ and properties depending on whether they are Readable, Writable, or
Duplex.
If a stream is both Readable and Writable, then it implements all of
the methods and events below. So, a [Duplex][] or [Transform][] stream is
the methods and events. So, a [Duplex][] or [Transform][] stream is
fully described by this API, though their implementation may be
somewhat different.
It is not necessary to implement Stream interfaces in order to consume
streams in your programs. If you **are** implementing streaming
interfaces in your own program, please also refer to
[API for Stream Implementors][] below.
[API for Stream Implementors][].
Almost all Node.js programs, no matter how simple, use Streams in some
way. Here is an example of using Streams in an Node.js program:
@@ -95,7 +95,7 @@ server.listen(1337);
### Class: stream.Duplex
Duplex streams are streams that implement both the [Readable][] and
[Writable][] interfaces. See above for usage.
[Writable][] interfaces.
Examples of Duplex streams include:
@@ -464,8 +464,8 @@ Note that `stream.unshift(chunk)` cannot be called after the `'end'` event
has been triggered; a runtime error will be raised.
If you find that you must often call `stream.unshift(chunk)` in your
programs, consider implementing a [Transform][] stream instead. (See API
for Stream Implementors, below.)
programs, consider implementing a [Transform][] stream instead. (See [API
for Stream Implementors][].)
```javascript
// Pull off a header delimited by \n\n
@@ -514,7 +514,7 @@ reading state appropriately, however it is best to simply avoid calling
* `stream` {Stream} An "old style" readable stream
Versions of Node.js prior to v0.10 had streams that did not implement the
entire Streams API as it is today. (See "Compatibility" below for
entire Streams API as it is today. (See [Compatibility][] for
more information.)
If you are using an older Node.js library that emits `'data'` events and
@@ -542,7 +542,7 @@ myReader.on('readable', () => {
Transform streams are [Duplex][] streams where the output is in some way
computed from the input. They implement both the [Readable][] and
[Writable][] interfaces. See above for usage.
[Writable][] interfaces.
Examples of Transform streams include:
@@ -789,7 +789,7 @@ of stream class you are writing:
</table>
In your implementation code, it is very important to never call the
methods described in [API for Stream Consumers][] above. Otherwise, you
methods described in [API for Stream Consumers][]. Otherwise, you
can potentially cause adverse side effects in programs that consume
your streaming interfaces.
@@ -843,7 +843,7 @@ it can come in handy as a building block for novel sorts of streams.
`stream.Readable` is an abstract class designed to be extended with an
underlying implementation of the [`_read(size)`][] method.
Please see above under [API for Stream Consumers][] for how to consume
Please see [API for Stream Consumers][] for how to consume
streams in your programs. What follows is an explanation of how to
implement Readable streams in your programs.
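
As a minimal sketch of where `_read()` fits (illustrative only, and distinct from the doc's own `Counter` example):

```javascript
const { Readable } = require('stream');

// Pushes three chunks, then signals end-of-stream by pushing null.
class ThreeChunks extends Readable {
  constructor(options) {
    super(options);
    this._count = 0;
  }

  _read() {
    this._count += 1;
    if (this._count > 3) {
      this.push(null); // no more data
    } else {
      this.push(`chunk ${this._count}\n`);
    }
  }
}

new ThreeChunks().pipe(process.stdout);
```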
@@ -979,12 +979,13 @@ Counter.prototype._read = function() {
#### Example: SimpleProtocol v1 (Sub-optimal)
This is similar to the `parseHeader` function described above, but
implemented as a custom stream. Also, note that this implementation
does not convert the incoming data to a string.
This is similar to the `parseHeader` function described
[here](#stream_readable_unshift_chunk), but implemented as a custom stream.
Also, note that this implementation does not convert the incoming data to a
string.
However, this would be better implemented as a [Transform][] stream. See
below for a better implementation.
[SimpleProtocol v2][] for a better implementation.
```javascript
// A parser for a simple data protocol.
@@ -1201,9 +1202,10 @@ your own extension classes.
#### Example: `SimpleProtocol` parser v2
The example above of a simple protocol parser can be implemented
simply by using the higher level [Transform][] stream class, similar to
the `parseHeader` and `SimpleProtocol v1` examples above.
The example [here](#stream_example_simpleprotocol_v1_sub_optimal) of a simple
protocol parser can be implemented simply by using the higher level
[Transform][] stream class, similar to the `parseHeader` and `SimpleProtocol
v1` examples.
In this example, rather than providing the input as an argument, it
would be piped into the parser, which is a more idiomatic Node.js stream
@@ -1284,7 +1286,7 @@ SimpleProtocol.prototype._transform = function(chunk, encoding, done) {
`stream.Writable` is an abstract class designed to be extended with an
underlying implementation of the [`_write(chunk, encoding, callback)`][] method.
Please see above under [API for Stream Consumers][] for how to consume
Please see [API for Stream Consumers][] for how to consume
writable streams in your programs. What follows is an explanation of
how to implement Writable streams in your programs.
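
Similarly, a minimal `_write()` sketch, purely illustrative:

```javascript
const { Writable } = require('stream');

// Forwards every chunk to stderr, then signals completion via callback().
class StderrLogger extends Writable {
  _write(chunk, encoding, callback) {
    process.stderr.write(chunk);
    callback();
  }
}

const logger = new StderrLogger();
logger.write('hello\n');
logger.end('world\n');
```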
@@ -1500,7 +1502,7 @@ simpler, but also less powerful and less useful.
meant that you still had to be prepared to receive `'data'` events
even when the stream was in a paused state.
In Node.js v0.10, the Readable class described below was added.
In Node.js v0.10, the [Readable][] class was added.
For backwards compatibility with older Node.js programs, Readable streams
switch into "flowing mode" when a `'data'` event handler is added, or
when the [`resume()`][] method is called. The effect is that, even if
@@ -1710,6 +1712,7 @@ horribly wrong.
[API for Stream Implementors]: #stream_api_for_stream_implementors
[child process stdin]: child_process.html#child_process_child_stdin
[child process stdout and stderr]: child_process.html#child_process_child_stdout
[Compatibility]: #stream_compatibility_with_older_node_js_versions
[crypto streams]: crypto.html
[crypto]: crypto.html
[Duplex]: #stream_class_stream_duplex
@@ -1722,6 +1725,7 @@ horribly wrong.
[Object mode]: #stream_object_mode
[Readable]: #stream_class_stream_readable
[request to an HTTP server]: http.html#http_http_incomingmessage
[SimpleProtocol v2]: #stream_example_simpleprotocol_parser_v2
[tcp sockets]: net.html#net_class_net_socket
[Transform]: #stream_class_stream_transform
[unpiped]: #stream_readable_unpipe_destination

doc/api/util.markdown (5)

@@ -161,7 +161,8 @@ formatted string:
`2`. To make it recurse indefinitely pass `null`.
- `colors` - if `true`, then the output will be styled with ANSI color codes.
Defaults to `false`. Colors are customizable, see below.
Defaults to `false`. Colors are customizable, see [Customizing
`util.inspect` colors][].
- `customInspect` - if `false`, then custom `inspect(depth, opts)` functions
defined on the objects being inspected won't be called. Defaults to `true`.
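
A short sketch of those `util.inspect()` options on a hypothetical nested object:

```javascript
const util = require('util');

const nested = { a: { b: { c: { d: 'deep' } } } };

// depth: null recurses indefinitely; colors: true adds ANSI color codes.
console.log(util.inspect(nested, { depth: null, colors: true }));

// customInspect: false ignores any custom inspect() defined on the object.
console.log(util.inspect(nested, { customInspect: false }));
```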
@@ -501,4 +502,6 @@ Deprecated predecessor of `console.log`.
[`Array.isArray`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/isArray
[constructor]: https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Object/constructor
[Customizing `util.inspect` colors]: #util_customizing_util_inspect_colors
[here]: #util_customizing_util_inspect_colors
[`Error`]: errors.html#errors_class_error

doc/api/zlib.markdown (2)

@@ -44,7 +44,7 @@ on requests, and the [content-encoding][] header on responses.
**Note: these examples are drastically simplified to show
the basic concept.** Zlib encoding can be expensive, and the results
ought to be cached. See [Memory Usage Tuning][] below for more information
ought to be cached. See [Memory Usage Tuning][] for more information
on the speed/memory/compression tradeoffs involved in zlib usage.
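
Separate from the HTTP client/server examples this hunk refers to, a minimal buffer-level round trip with `zlib.gzip()` / `zlib.gunzip()`:

```javascript
const zlib = require('zlib');

zlib.gzip('hello world', (err, compressed) => {
  if (err) throw err;
  zlib.gunzip(compressed, (err, original) => {
    if (err) throw err;
    console.log(original.toString()); // 'hello world'
  });
});
```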
// client request example
