Problem: It was not possible to detect the reason for a premature
connection termination in http requests.
This patch provides a new `err` argument to the 'close' event which
can be inspected to differentiate between a timeout and a client
actively terminating the connection.
Also contains tests for this new behavior for http and https.
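For illustration, a minimal sketch of how a server might inspect the
new argument; the exact error codes (e.g. ETIMEDOUT) and the event
target are assumptions, not part of this patch's description:

    var http = require('http');

    var server = http.createServer(function(req, res) {
      // The `err` argument on 'close' is the behavior added by this
      // patch; unpatched builds pass no argument at all.
      req.on('close', function(err) {
        if (!err) {
          console.log('connection closed normally');
        } else if (err.code === 'ETIMEDOUT') { // assumed error code
          console.log('connection timed out');
        } else {
          console.log('client terminated the connection early');
        }
      });
      setTimeout(function() { res.end('ok'); }, 1000);
    });

    server.listen(8000);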
The change for #954 introduced a regression that would cause
the url parser to fail on special chars found in the auth
segment. Fix that, and also don't create invalid urls when
format() is called on an object whose auth member contains
'@' characters or delimiters.
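As an example of the intended behavior (the exact escaping shown is an
assumption based on the description above):

    var url = require('url');

    // Special chars in the auth segment should no longer break the parser.
    var parsed = url.parse('http://user:p@ss:w0rd@example.com/');
    console.log(parsed.auth);     // should be 'user:p@ss:w0rd'
    console.log(parsed.hostname); // should be 'example.com'

    // format() should escape the '@' in auth instead of emitting an
    // invalid url.
    console.log(url.format({
      protocol: 'http:',
      auth: 'user:p@ss',
      hostname: 'example.com',
      pathname: '/'
    }));
    // should be something like 'http://user:p%40ss@example.com/'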
This fixes a critical bug seen in MJR's production. It is very difficult
to build a test case. Sometimes the HTTPS server gets sockets that are
hanging in a half-duplex state.
1. Allow single-quotes in urls, but escape them.
2. Add comments about which RFCs we're following for guidance.
3. Handle any invalid character in the hostname portion.
4. Lowercase the protocol and hostname portions, since they are
   case-insensitive.
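A quick sketch of points 1, 3 and 4 (the exact escape sequences shown
are illustrative):

    var url = require('url');

    var parsed = url.parse("HTTP://ExAmPlE.CoM/it's-here");
    console.log(parsed.protocol); // 'http:'  (lowercased)
    console.log(parsed.hostname); // 'example.com'  (lowercased)
    console.log(parsed.pathname); // "/it%27s-here" (single quote escaped)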
When creating multiple .pipe()s to the same destination stream, the
first source to end would close the destination, breaking all remaining
pipes. This patch fixes the problem by keeping track of all open
pipes, so that we only call end on destinations that have no more
sources piping to them.
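A sketch of the scenario this addresses, with placeholder file names;
under the patched behavior the destination is only ended once both
sources have ended:

    var fs = require('fs');

    var dest = fs.createWriteStream('combined.txt');
    var a = fs.createReadStream('a.txt');
    var b = fs.createReadStream('b.txt');

    a.pipe(dest);
    b.pipe(dest);

    dest.on('close', function() {
      // With this patch, we only get here after *both* a and b have ended.
      console.log('destination closed');
    });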
Closes #929.
Cherry-pick fail. Breaks Linux. Will land again shortly.
This reverts commit e8cf98c841.
This reverts commit d953856d87.
This reverts commit 752bbd6b42.
Problem: Sometimes it is useful to read a file from a certain position
to its end. The current implementation was already perfectly capable
of this, but decided to throw an error when the user tried to omit
the end option. The only way to do this was to pass {end: Infinity}.
Solution: Automatically assume {end: Infinity} when omitted, and remove
the previous exception thrown. Also updated the docs.
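For example, with a placeholder file name, reading from byte 100 to the
end of the file now works without an explicit end option:

    var fs = require('fs');

    // Previously this threw; {end: Infinity} is now assumed.
    var stream = fs.createReadStream('data.log', { start: 100 });
    stream.on('data', function(chunk) {
      process.stdout.write(chunk);
    });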
Closes #801.
However, this test is failing due to some quite unrelated issue.
Getting some odd "socket hang up" crashes, and only the first request
ever makes it to the server.
Calling resume() immediately after calling pause() would trigger
a race condition where it would try to read() from a file
descriptor that was already being read from, causing an EBADF error.
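The triggering sequence looked roughly like this (placeholder file
name):

    var fs = require('fs');

    var stream = fs.createReadStream('data.log');
    stream.on('data', function(chunk) {
      stream.pause();
      // Calling resume() right away used to race with the read that was
      // still in flight, producing an EBADF error.
      stream.resume();
    });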
I have seen a lot of people trying to pass objects to crypto's update
functions, assuming that they would somehow serialize the object before
hashing.
In reality, the object was converted to '[object Object]' which was
then hashed, without any error message showing.
This patch modifies the DecodeBytes function (used exclusively by
crypto at this point) to complain when receiving anything but a
string or buffer.
Overall this should be a less-surprising, more robust behavior.
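For example (the exact error message is up to the implementation):

    var crypto = require('crypto');

    var hash = crypto.createHash('sha1');

    // Strings and Buffers are accepted as before.
    hash.update('some data');
    hash.update(new Buffer('more data')); // Buffer.from() on newer node

    // A plain object used to be hashed as '[object Object]'; it now throws.
    try {
      hash.update({ some: 'data' });
    } catch (err) {
      console.log('rejected:', err.message);
    }

    console.log(hash.digest('hex'));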
This does 3 things:
1. Delimiters and "unwise" characters are never included in the
hostname or path.
2. url.format will sanitize string URLs that are passed to it.
3. The parsed url's 'href' member will be the sanitized url, which may
not match the argument to url.parse.
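For example (the specific escape sequences are illustrative):

    var url = require('url');

    var parsed = url.parse('http://example.com/a path{with}unwise|chars');
    console.log(parsed.pathname); // delimiters and "unwise" chars escaped
    console.log(parsed.href);     // the sanitized url, not the raw input

    // Passing a string straight to format() sanitizes it the same way.
    console.log(url.format('http://example.com/a path'));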