git ssb


Dominic / pull-stream



Commit 7d0c918beaee414a6b11f020b51dfb097122ede0

move .md files into docs/

Michael Williams committed on 6/12/2016, 12:32:24 PM
Parent: 11f0d082fcb805336ce40ead36918a2dc3cf4ee7

Files changed

README.md changed
sinks/collect.md deleted
sinks/concat.md deleted
sinks/drain.md deleted
sinks/index.md deleted
sinks/log.md deleted
sinks/on-end.md deleted
sinks/reduce.md deleted
sources/count.md deleted
sources/empty.md deleted
sources/error.md deleted
sources/index.md deleted
sources/infinite.md deleted
sources/keys.md deleted
sources/once.md deleted
sources/values.md deleted
throughs/async-map.md deleted
throughs/filter-not.md deleted
throughs/filter.md deleted
throughs/flatten.md deleted
throughs/index.md deleted
throughs/map.md deleted
throughs/non-unique.md deleted
throughs/take.md deleted
throughs/through.md deleted
throughs/unique.md deleted
examples.md deleted
docs/examples.md added
docs/glossary.md added
docs/pull.md added
docs/sinks/collect.md added
docs/sinks/concat.md added
docs/sinks/drain.md added
docs/sinks/index.md added
docs/sinks/log.md added
docs/sinks/on-end.md added
docs/sinks/reduce.md added
docs/sources/count.md added
docs/sources/empty.md added
docs/sources/error.md added
docs/sources/index.md added
docs/sources/infinite.md added
docs/sources/keys.md added
docs/sources/once.md added
docs/sources/values.md added
docs/spec.md added
docs/throughs/async-map.md added
docs/throughs/filter-not.md added
docs/throughs/filter.md added
docs/throughs/flatten.md added
docs/throughs/index.md added
docs/throughs/map.md added
docs/throughs/non-unique.md added
docs/throughs/take.md added
docs/throughs/through.md added
docs/throughs/unique.md added
glossary.md deleted
pull.md deleted
spec.md deleted
README.md
@@ -44,11 +44,11 @@
4444 (aka "Source") and a `reader` stream (aka "Sink"). A Through stream
4545 is a Sink that returns a Source.
4646
4747 See also:
48-* [Sources](./sources/index.md)
49-* [Throughs](./throughs/index.md)
50-* [Sinks](./sinks/index.md)
48+* [Sources](./docs/sources/index.md)
49+* [Throughs](./docs/throughs/index.md)
50+* [Sinks](./docs/sinks/index.md)
5151
5252 ### Source (aka, Readable)
5353
5454 The readable stream is just a `function read(end, cb)`,
@@ -79,10 +79,10 @@
7979
8080 A sink is just a `reader` function that calls a Source (read function),
8181 until it decides to stop, or the readable ends. `cb(err || true)`
8282
83-All [Throughs](./throughs/index.md)
84-and [Sinks](./sinks/index.md)
83+All [Throughs](./docs/throughs/index.md)
84+and [Sinks](./docs/sinks/index.md)
8585 are reader streams.
8686
8787 ```js
8888 //read source and log it.
@@ -300,16 +300,16 @@
300300
301301
302302 ## Further Examples
303303
304-- https://github.com/dominictarr/pull-stream-examples
305-- https://github.com/pull-stream/pull-stream/blob/master/examples.md
304+- [dominictarr/pull-stream-examples](https://github.com/dominictarr/pull-stream-examples)
305+- [./docs/examples](./docs/examples.md)
306306
307307 Explore this repo further for more information about
308-[sources](./sources/index.md),
309-[throughs](./throughs/index.md),
310-[sinks](./sinks/index.md), and
311-[glossary](./glossary.md).
308+[sources](./docs/sources/index.md),
309+[throughs](./docs/throughs/index.md),
310+[sinks](./docs/sinks/index.md), and
311+[glossary](./docs/glossary.md).
312312
313313
314314 ## License
315315
sinks/collect.md
@@ -1,10 +1,0 @@
1-# pull-stream/sinks/collect
2-
3-## usage
4-
5-### `collect = require('pull-stream/sinks/collect')`
6-
7-### `collect(cb)`
8-
9-Read the stream into an array, then callback.
10-
sinks/concat.md
@@ -1,9 +1,0 @@
1-# pull-stream/sinks/concat
2-
3-## usage
4-
5-### `concat = require('pull-stream/sinks/concat')`
6-
7-### `concat(cb)`
8-
9-concat stream of strings into single string, then callback.
sinks/drain.md
@@ -1,11 +1,0 @@
1-# pull-stream/sinks/drain
2-
3-## usage
4-
5-### `drain = require('pull-stream/sinks/drain')`
6-
7-### `drain(op?, done?)`
8-
9-Drain the stream, calling `op` on each `data`.
10-call `done` when stream is finished.
11-If op returns `===false`, abort the stream.
sinks/index.md
@@ -1,22 +1,0 @@
1-# Sinks
2-
3-A Sink is a stream that is not readable.
4-You *must* have a sink at the end of a pipeline
5-for data to move towards.
6-
7-You can only use _one_ sink per pipeline.
8-
9-``` js
10-pull(source, through, sink)
11-```
12-
13-See also:
14-* [Sources](../sources/index.md)
15-* [Throughs](../throughs/index.md)
16-
17-## [drain](./drain.md)
18-## [reduce](./reduce.md)
19-## [concat](./collect.md)
20-## [collect](./collect.md)
21-## [onEnd](./on-end.md)
22-## [log](./log.md)
sinks/log.md
@@ -1,9 +1,0 @@
1-# pull-stream/sinks/log
2-
3-## usage
4-
5-### `log = require('pull-stream/sinks/log')`
6-
7-### `log()`
8-
9-output the stream to `console.log`
sinks/on-end.md
@@ -1,9 +1,0 @@
1-# pull-stream/sinks/onEnd
2-
3-## usage
4-
5-### `onEnd = require('pull-stream/sinks/onEnd')`
6-
7-### `onEnd(cb)`
8-
9-Drain the stream and then callback when done.
sinks/reduce.md
@@ -1,9 +1,0 @@
1-# pull-stream/sinks/reduce
2-
3-## usage
4-
5-### `reduce = require('pull-stream/sinks/reduce')`
6-
7-### `reduce (reduce, initial, cb)`
8-
9-reduce stream into single value, then callback.
sources/count.md
@@ -1,12 +1,0 @@
1-# pull-stream/sources/count
2-
3-## usage
4-
5-### `count = require('pull-stream/sources/count')`
6-
7-### `count(max, onAbort)`
8-
9-create a stream that outputs `0 ... max`.
10-by default, `max = Infinity`, see
11-[take](../throughs/take.md)
12-
sources/empty.md
@@ -1,20 +1,0 @@
1-# pull-stream/sources/empty
2-
3-## usage
4-
5-### `empty = require('pull-stream/sources/empty')`
6-
7-### `empty()`
8-
9-A stream with no contents (it just ends immediately)
10-
11-``` js
12-pull(
13- pull.empty(),
14- pull.collect(function (err, ary) {
15- console.log(arg)
16- // ==> []
17- })
18-}
19-```
20-
sources/error.md
@@ -1,9 +1,0 @@
1-# pull-stream/sources/error
2-
3-## usage
4-
5-### `error = require('pull-stream/sources/error')`
6-
7-### `error(err)`
8-
9-a stream that errors immediately
sources/index.md
@@ -1,23 +1,0 @@
1-# Sources
2-
3-A source is a stream that is not writable.
4-You *must* have a source at the start of a pipeline
5-for data to move through.
6-
7-in general:
8-
9-``` js
10-pull(source, through, sink)
11-```
12-
13-See also:
14-* [Throughs](../throughs/index.md)
15-* [Sinks](../sinks/index.md)
16-
17-## [values](./values.md)
18-## [keys](./keys.md)
19-## [count](./count.md)
20-## [infinite](./infinite.md)
21-## [empty](./empty.md)
22-## [once](./once.md)
23-## [error](./error.md)
sources/infinite.md
@@ -1,11 +1,0 @@
1-# pull-stream/sources/infinite
2-
3-## usage
4-
5-### `infinite = require('pull-stream/sources/infinite')`
6-
7-### `infinite(generator, onAbort)`
8-
9-create an unending stream by repeatedly calling a generator
10-function (by default, `Math.random`)
11-see [take](../throughs/take.md)
sources/keys.md
@@ -1,10 +1,0 @@
1-# pull-stream/sources/keys
2-
3-## usage
4-
5-### `keys = require('pull-stream/sources/keys')`
6-
7-### `keys(array | object, onAbort)`
8-
9-stream the key names from an object (or array)
10-
sources/once.md
@@ -1,9 +1,0 @@
1-# pull-stream/sources/once
2-
3-## usage
4-
5-### `once = require('pull-stream/sources/once')`
6-
7-### `once(value, onAbort)`
8-
9-a stream with a single value
sources/values.md
@@ -1,9 +1,0 @@
1-# pull-stream/sources/values
2-
3-## usage
4-
5-### `values = require('pull-stream/sources/values')`
6-
7-### `values(array | object, onAbort)`
8-
9-create a SourceStream that reads the values from an array or object and then stops.
throughs/async-map.md
@@ -1,10 +1,0 @@
1-# pull-stream/throughs/async-map
2-
3-## usage
4-
5-### `asyncMap = require('pull-stream/throughs/async-map')`
6-
7-### `asyncMap(fn)`
8-
9-Like [`map`](./map.md) but the signature of `fn` must be
10-`function (data, cb) { cb(null, data) }`
throughs/filter-not.md
@@ -1,9 +1,0 @@
1-# pull-stream/throughs/filterNot
2-
3-## usage
4-
5-### `filterNot = require('pull-stream/throughs/filter-not')`
6-
7-### `filterNot(test)`
8-
9-Like [`filter`](./filter.md), but remove items where the filter returns true.
throughs/filter.md
@@ -1,11 +1,0 @@
1-# pull-stream/throughs/filter
2-
3-## usage
4-
5-### `filter = require('pull-stream/throughs/filter')`
6-
7-### `filter(test)`
8-
9-Like `[].filter(function (data) {return true || false})`
10-only `data` where `test(data) == true` are let through
11-to the next stream.
throughs/flatten.md
@@ -1,9 +1,0 @@
1-# pull-stream/throughs/flatten
2-
3-## usage
4-
5-### `flatten = require('pull-stream/throughs/flatten')`
6-
7-### `flatten()`
8-
9-Turn a stream of arrays into a stream of their items, (undoes group).
throughs/index.md
@@ -1,46 +1,0 @@
1-# Throughs
2-
3-A Through is a stream that both reads and is read by
4-another stream.
5-
6-Through streams are optional.
7-
8-Put through streams in-between [sources](../sources/index.md) and [sinks](../sinks/index.md),
9-like this:
10-
11-```js
12-pull(source, through, sink)
13-```
14-
15-Also, if you don't have the source/sink yet,
16-you can pipe multiple through streams together
17-to get one through stream!
18-
19-```js
20-var throughABC = function () {
21- return pull(
22- throughA(),
23- throughB(),
24- throughC()
25- )
26-}
27-```
28-
29-Which can then be treated like a normal through stream!
30-
31-```js
32-pull(source(), throughABC(), sink())
33-```
34-
35-See also:
36-* [Sources](../sources/index.md)
37-* [Sinks](../sinks/index.md)
38-
39-## [map](./map.md)
40-## [asyncMap](./async-map.md)
41-## [filter](./filter.md)
42-## [filterNot](./filter-not.md)
43-## [unique](./unique.md)
44-## [nonUnique](./non-unique.md)
45-## [take](./take.md)
46-## [flatten](./flatten.md)
throughs/map.md
@@ -1,54 +1,0 @@
1-# pull-stream/throughs/map
2-
3-> [].map for pull-streams
4-
5-## Background
6-
7-Pull-streams are arrays of data in time rather than space.
8-
9-As with a [`[].map`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/map), we may want to map a function over a stream.
10-
11-## Example
12-
13-```js
14-var map = require('pull-stream/throughs/map')
15-```
16-
17-```js
18-pull(
19- values([0, 1, 2, 3]),
20- map(function (x) {
21- return x * x
22- }),
23- log()
24-)
25-// 0
26-// 1
27-// 4
28-// 9
29-```
30-
31-## Usage
32-
33-### `map = require('pull-stream/throughs/map')`
34-
35-### `map((data) => data)`
36-
37-`map(fn)` returns a through stream that calls the given `fn` for each chunk of incoming data and outputs the return value, in the same order as before.
38-
39-## Install
40-
41-With [npm](https://npmjs.org/) installed, run
42-
43-```
44-$ npm install pull-stream
45-```
46-
47-## See Also
48-
49-- [`brycebaril/through2-map`](https://github.com/brycebaril/through2-map)
50-- [`Rx.Obsevable#map`](http://xgrommx.github.io/rx-book/content/observable/observable_instance_methods/map.html)
51-
52-## License
53-
54-[MIT](https://tldrlegal.com/license/mit-license)
throughs/non-unique.md
@@ -1,10 +1,0 @@
1-# pull-stream/throughs/nonUnique
2-
3-## usage
4-
5-### `nonUnique = require('pull-stream/throughs/non-unique')`
6-
7-### `nonUnique(prop)`
8-
9-Filter unique items -- get the duplicates.
10-The inverse of [`unique`](./unique.md)
throughs/take.md
@@ -1,13 +1,0 @@
1-# pull-stream/throughs/take
2-
3-## usage
4-
5-### `take = require('pull-stream/throughs/take')`
6-
7-### `take(test [, opts])`
8-
9-If test is a function, read data from the source stream and forward it downstream until test(data) returns false.
10-
11-If `opts.last` is set to true, the data for which the test failed will be included in what is forwarded.
12-
13-If test is an integer, take n item from the source.
throughs/through.md
@@ -1,5 +1,0 @@
1-# pull-stream/throughs/filter
2-
3-## usage
4-
5-### `filter = require('pull-stream/throughs/filter')`
throughs/unique.md
@@ -1,11 +1,0 @@
1-# pull-stream/throughs/unique
2-
3-## usage
4-
5-### `unique = require('pull-stream/throughs/unique')`
6-
7-### `unique(prop)`
8-
9-Filter items that have a repeated value for `prop()`,
10-by default, `prop = function (it) {return it }`, if prop is a string,
11-it will filter nodes which have repeated values for that property.
examples.md
@@ -1,91 +1,0 @@
1-
2-This document describes some examples of where various features
3-of pull streams are used in simple real-world examples.
4-
5-Much of the focus here is handling the error cases. Indeed,
6-distributed systems are _all about_ handling the error cases.
7-
8-# simple source that ends correctly. (read, end)
9-
10-A normal file (source) is read, and sent to a sink stream
11-that computes some aggregation upon that input.
12-such as the number of bytes, or number of occurances of the `\n`
13-character (i.e. the number of lines).
14-
15-The source reads a chunk of the file at each time it's called,
16-there is some optimium size depending on your operating system,
17-file system, physical hardware,
18-and how many other files are being read concurrently.
19-
20-when the sink gets a chunk, it iterates over the characters in it
21-counting the `\n` characters. when the source returns `end` to the
22-sink, the sink calls a user provided callback.
23-
24-# source that may fail. (read, err, end)
25-
26-download a file over http and write it to fail.
27-The network should always be considered to be unreliable,
28-and you must design your system to recover from failures.
29-So there for the download may fail (wifi cuts out or something)
30-
31-The read stream is just the http download, and the sink
32-writes it to a tempfile. If the source ends normally,
33-the tempfile is moved to the correct location.
34-If the source errors, the tempfile is deleted.
35-
36-(you could also write the file to the correct location,
37-and delete it if it errors, but the tempfile method has the advantage
38-that if the computer or process crashes it leaves only a tempfile
39-and not a file that appears valid. stray tempfiles can be cleaned up
40-or resumed when the process restarts)
41-
42-# sink that may fail
43-
44-If we read a file from disk, and upload it,
45-then it is the sink that may error.
46-The file system is probably faster than the upload,
47-so it will mostly be waiting for the sink to ask for more.
48-usually, the sink calls read, and the source gets more from the file
49-until the file ends. If the sink errors, it calls `read(true, cb)`
50-and the source closes the file descriptor and stops reading.
51-In this case the whole file is never loaded into memory.
52-
53-# sink that may fail out of turn.
54-
55-A http client connects to a log server and tails a log in realtime.
56-(another process writes to the log file,
57-but we don't need to think about that)
58-
59-The source is the server log stream, and the sink is the client.
60-First the source outputs the old data, this will always be a fast
61-response, because that data is already at hand. When that is all
62-written then the output rate may drop significantly because it will
63-wait for new data to be added to the file. Because of this,
64-it becomes much more likely that the sink errors (the network connection
65-drops) while the source is waiting for new data. Because of this,
66-it's necessary to be able to abort the stream reading (after you called
67-read, but before it called back). If it was not possible to abort
68-out of turn, you'd have to wait for the next read before you can abort
69-but, depending on the source of the stream, that may never come.
70-
71-# a through stream that needs to abort.
72-
73-Say we read from a file (source), JSON parse each line (through),
74-and then output to another file (sink).
75-because there is valid and invalid JSON, the parse could error,
76-if this parsing is a fatal error, then we are aborting the pipeline
77-from the middle. Here the source is normal, but then the through fails.
78-When the through finds an invalid line, it should abort the source,
79-and then callback to the sink with an error. This way,
80-by the time the sink receives the error, the entire stream has been cleaned up.
81-
82-(you could abort the source, and error back to the sink in parallel,
83-but if something happened to the source while aborting, for the user
84-to know they'd have to give another callback to the source, this would
85-get called very rarely so users would be inclined to not handle that.
86-better to have one callback at the sink.)
87-
88-In some cases you may want the stream to continue, and just ignore
89-an invalid line if it does not parse. An example where you definately
90-want to abort if it's invalid would be an encrypted stream, which
91-should be broken into chunks that are encrypted separately.
docs/examples.md
@@ -1,0 +1,91 @@
1+
2+This document describes some examples of where various features
3+of pull streams are used in simple real-world examples.
4+
5+Much of the focus here is handling the error cases. Indeed,
6+distributed systems are _all about_ handling the error cases.
7+
8+# simple source that ends correctly. (read, end)
9+
10+A normal file (source) is read, and sent to a sink stream
11+that computes some aggregation upon that input,
12+such as the number of bytes, or the number of occurrences of the `\n`
13+character (i.e. the number of lines).
14+
15+The source reads a chunk of the file each time it's called;
16+there is some optimum size depending on your operating system,
17+file system, physical hardware,
18+and how many other files are being read concurrently.
19+
20+When the sink gets a chunk, it iterates over the characters in it,
21+counting the `\n` characters. When the source returns `end` to the
22+sink, the sink calls a user-provided callback.
23+
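The chunk-and-count loop described above can be sketched with the raw pull-stream protocol alone (no library required). The "file" is simulated here by an array of string chunks; `chunkSource` and `countLines` are illustrative names, not part of the module:

```javascript
// Source: hands out one chunk of the simulated "file" per read.
function chunkSource (chunks) {
  var i = 0
  return function read (end, cb) {
    if (end) return cb(end)                 // aborted by the sink
    if (i >= chunks.length) return cb(true) // end of "file"
    cb(null, chunks[i++])
  }
}

// Sink: counts '\n' characters, then calls back with the line count.
function countLines (done) {
  return function (read) {
    var count = 0
    read(null, function next (end, data) {
      if (end === true) return done(null, count) // normal end
      if (end) return done(end)                  // error
      for (var j = 0; j < data.length; j++) {
        if (data[j] === '\n') count++
      }
      read(null, next)
    })
  }
}

countLines(function (err, lines) {
  console.log(lines) // logs 3 for this input
})(chunkSource(['a\nb', '\nc\n']))
```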
24+# source that may fail. (read, err, end)
25+
26+Download a file over http and write it to a file.
27+The network should always be considered unreliable,
28+and you must design your system to recover from failures.
29+Therefore the download may fail (wifi cuts out or something).
30+
31+The read stream is just the http download, and the sink
32+writes it to a tempfile. If the source ends normally,
33+the tempfile is moved to the correct location.
34+If the source errors, the tempfile is deleted.
35+
36+(you could also write the file to the correct location,
37+and delete it if it errors, but the tempfile method has the advantage
38+that if the computer or process crashes it leaves only a tempfile
39+and not a file that appears valid. Stray tempfiles can be cleaned up
40+or resumed when the process restarts.)
41+
42+# sink that may fail
43+
44+If we read a file from disk, and upload it,
45+then it is the sink that may error.
46+The file system is probably faster than the upload,
47+so it will mostly be waiting for the sink to ask for more.
48+Usually, the sink calls read, and the source gets more from the file
49+until the file ends. If the sink errors, it calls `read(true, cb)`
50+and the source closes the file descriptor and stops reading.
51+In this case the whole file is never loaded into memory.
52+
53+# sink that may fail out of turn.
54+
55+An http client connects to a log server and tails a log in realtime.
56+(another process writes to the log file,
57+but we don't need to think about that)
58+
59+The source is the server log stream, and the sink is the client.
60+First the source outputs the old data, this will always be a fast
61+response, because that data is already at hand. When that is all
62+written then the output rate may drop significantly because it will
63+wait for new data to be added to the file. Because of this,
64+it becomes much more likely that the sink errors (the network connection
65+drops) while the source is waiting for new data. Because of this,
66+it's necessary to be able to abort the stream reading (after you called
67+read, but before it called back). If it was not possible to abort
68+out of turn, you'd have to wait for the next read before you can abort
69+but, depending on the source of the stream, that may never come.
70+
71+# a through stream that needs to abort.
72+
73+Say we read from a file (source), JSON parse each line (through),
74+and then output to another file (sink).
75+Because there is both valid and invalid JSON, the parse could error.
76+If this parsing error is fatal, then we are aborting the pipeline
77+from the middle. Here the source is normal, but then the through fails.
78+When the through finds an invalid line, it should abort the source,
79+and then callback to the sink with an error. This way,
80+by the time the sink receives the error, the entire stream has been cleaned up.
81+
82+(You could abort the source and error back to the sink in parallel,
83+but then, for the user to know whether something happened to the source
84+while aborting, they'd have to give another callback to the source; this
85+would get called very rarely, so users would be inclined not to handle it.
86+Better to have one callback at the sink.)
87+
88+In some cases you may want the stream to continue, and just ignore
89+an invalid line if it does not parse. An example where you definitely
90+want to abort if it's invalid would be an encrypted stream, which
91+should be broken into chunks that are encrypted separately.
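The abort-from-the-middle behaviour described in the JSON-parsing scenario can be sketched in raw protocol terms. `parseJson`, `lineSource`, and `collect` are hypothetical helpers written for this illustration, not the library's own functions:

```javascript
// Hypothetical through: JSON-parses each line; on a bad line it aborts
// the source first, then delivers the error to the sink.
function parseJson (read) {
  return function (end, cb) {
    read(end, function (end, line) {
      if (end) return cb(end)
      var obj
      try { obj = JSON.parse(line) } catch (err) {
        return read(true, function () { cb(err) }) // abort source, then error
      }
      cb(null, obj)
    })
  }
}

// Tiny source and sink to drive it.
function lineSource (lines) {
  var i = 0
  return function (end, cb) {
    if (end) return cb(end)
    i < lines.length ? cb(null, lines[i++]) : cb(true)
  }
}

function collect (done) {
  return function (read) {
    var out = []
    read(null, function next (end, data) {
      if (end) return done(end === true ? null : end, out)
      out.push(data)
      read(null, next)
    })
  }
}

collect(function (err, objs) {
  console.log(err instanceof SyntaxError, objs) // true [ { a: 1 } ]
})(parseJson(lineSource(['{"a":1}', 'not json'])))
```

By the time the sink's callback fires with the error, the source has already been aborted and cleaned up, which is exactly the ordering the section above argues for.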
docs/glossary.md
@@ -1,0 +1,47 @@
1+# Glossary
2+
3+## read (end, cb)
4+
5+A function that retrieves the next chunk.
6+All readable streams (sources, and throughs)
7+must return a `read` function.
8+
9+## reader (read,...)
10+
11+A function to create a reader. It takes a `read` function
12+as the first argument, and any other options after that.
13+
14+When passed to `pipeable` or `pipeableSource`,
16+a new function is created that adds `.pipe(dest)`.
16+
17+## Lazy vs Eager
18+
19+Lazy means to avoid doing something until you know you have
20+to do it.
21+
22+Eager means to do something early, so you have it ready
23+immediately when you need it.
24+
25+## Source
26+
27+The first stream in the pipeline. The Source is not writable.
28+
29+## Sink
30+
31+The last Stream in the pipeline. The Sink is not readable.
32+
33+## Push vs Pull
34+
35+A pull-stream is a stream where the movement of data
36+is initiated by the sink, and a push-stream
37+is a stream where the movement of data is initiated
38+by the source.
39+
40+## Reader vs Writable
41+
42+In push streams, destination streams (Through and Sink),
43+are _writable_. They are written to by the source streams.
44+
45+In pull streams, destination streams _read_ from the source
46+streams. They are the active participant, so they are called
47+_readers_ rather than _writables_.
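The `read (end, cb)` shape defined at the top of this glossary can be illustrated with a minimal source; `fromArray` is a throwaway name for this sketch:

```javascript
// Minimal `read(end, cb)`: a source over a fixed array.
function fromArray (array) {
  var i = 0
  return function read (end, cb) {
    if (end) return cb(end)                 // reader aborted the stream
    if (i >= array.length) return cb(true)  // normal end
    cb(null, array[i++])                    // next chunk
  }
}

var read = fromArray(['a', 'b'])
read(null, function (end, data) { console.log(end, data) }) // null a
read(null, function (end, data) { console.log(end, data) }) // null b
read(null, function (end, data) { console.log(end) })       // true
```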
docs/pull.md
@@ -1,0 +1,106 @@
1+# pull-stream/pull
2+
3+> pipe many pull streams into a pipeline
4+
5+## Background
6+
7+In pull-streams, you need a complete pipeline before data will flow.
8+
9+That means: a source, zero or more throughs, and a sink.
10+
11+But you can still create a _partial_ pipeline, which is great for tiny pull-stream modules.
12+
13+## Usage
14+
15+```js
16+var pull = require('pull-stream/pull')
17+```
18+
19+Create a simple complete pipeline:
20+
21+```js
22+pull(source, sink) => undefined
23+```
24+
25+Create a source modified by a through:
26+
27+```js
28+pull(source, through) => source
29+```
30+
31+Create a sink, but modify its input on the way in:
32+
33+```js
34+pull(through, sink) => sink
35+```
36+
37+Create a through, by chaining several throughs:
38+
39+```js
40+pull(through1, through2) => through
41+```
42+
43+These streams combine just like normal streams.
44+
45+```js
46+pull(
47+ pull(source, through),
48+ pull(through1, through2),
49+ pull(through, sink)
50+) => undefined
51+```
52+
53+The complete pipeline returns undefined, because it cannot be piped to anything else.
54+
55+Pipe duplex streams like this:
56+
57+```js
58+var a = duplex()
59+var b = duplex()
60+
61+pull(a.source, b.sink)
62+pull(b.source, a.sink)
63+
64+//which is the same as
65+
66+b.sink(a.source); a.sink(b.source)
67+
68+//but the easiest way is to allow pull to handle this
69+
70+pull(a, b, a)
71+
72+//"pull from a to b and then back to a"
73+```
74+
75+## API
76+
77+```js
78+var pull = require('pull-stream/pull')
79+```
80+
81+### `pull(...streams)`
82+
83+`pull` is a variadic function: it receives any number of stream arguments and connects them into a pipeline.
84+
85+`pull` detects the type of stream by checking function arity: if the function takes only one argument, it's either a sink or a through; otherwise it's a source. A duplex stream is an object with the shape `{ source, sink }`.
86+
87+If the pipeline is complete (reduces into a source being passed into a sink), then `pull` returns `undefined`, as the data is flowing.
88+
89+If the pipeline is partial (reduces into either a source, a through, or a sink), then `pull` returns the partial pipeline, as it must be composed with other streams before the data will flow.
90+
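The reduction described above can be sketched in a few lines. This toy version assumes the first argument is always a source; the real module also handles partial pipelines and duplex objects, and the `values`/`map`/`collect` helpers below are inlined for illustration only:

```javascript
// Toy pull(): feed each stream the result of the previous one.
function pull () {
  var streams = [].slice.call(arguments)
  return streams.reduce(function (source, stream) {
    return stream(source)
  })
}

// Tiny source / through / sink to demonstrate the composition.
function values (ary) {
  var i = 0
  return function (end, cb) {
    if (end) return cb(end)
    i < ary.length ? cb(null, ary[i++]) : cb(true)
  }
}

function map (fn) {
  return function (read) {
    return function (end, cb) {
      read(end, function (end, data) {
        end ? cb(end) : cb(null, fn(data))
      })
    }
  }
}

function collect (done) {
  return function (read) {
    var out = []
    read(null, function next (end, data) {
      if (end) return done(end === true ? null : end, out)
      out.push(data)
      read(null, next)
    })
  }
}

pull(
  values([1, 2, 3]),
  map(function (x) { return x * 10 }),
  collect(function (err, ary) { console.log(ary) }) // [ 10, 20, 30 ]
)
```

Because the sink consumes the source, the complete pipeline reduces to `undefined`, matching the behaviour described above.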
91+## Install
92+
93+With [npm](https://npmjs.org/) installed, run
94+
95+```sh
96+$ npm install pull-stream
97+```
98+
99+## See Also
100+
101+- [`mafintosh/pump`](https://github.com/mafintosh/pump)
102+- [`mafintosh/pumpify`](https://github.com/mafintosh/pumpify)
103+
104+## License
105+
106+[MIT](https://tldrlegal.com/license/mit-license)
docs/sinks/collect.md
@@ -1,0 +1,10 @@
1+# pull-stream/sinks/collect
2+
3+## usage
4+
5+### `collect = require('pull-stream/sinks/collect')`
6+
7+### `collect(cb)`
8+
9+Read the stream into an array, then callback.
10+
docs/sinks/concat.md
@@ -1,0 +1,9 @@
1+# pull-stream/sinks/concat
2+
3+## usage
4+
5+### `concat = require('pull-stream/sinks/concat')`
6+
7+### `concat(cb)`
8+
9+concat stream of strings into single string, then callback.
docs/sinks/drain.md
@@ -1,0 +1,11 @@
1+# pull-stream/sinks/drain
2+
3+## usage
4+
5+### `drain = require('pull-stream/sinks/drain')`
6+
7+### `drain(op?, done?)`
8+
9+Drain the stream, calling `op` on each `data`.
10+call `done` when stream is finished.
11+If op returns `===false`, abort the stream.
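A sketch of the behaviour `drain` documents above (not the module's actual source), with a tiny inline `values` source to drive it:

```javascript
// drain(op, done): call `op` per item; abort upstream if it returns false.
function drain (op, done) {
  return function (read) {
    read(null, function next (end, data) {
      if (end) return done && done(end === true ? null : end)
      if (op && op(data) === false) {
        // op asked to stop: abort the source, then report completion
        return read(true, function (end) {
          done && done(end === true ? null : end)
        })
      }
      read(null, next)
    })
  }
}

function values (ary) {
  var i = 0
  return function (end, cb) {
    if (end) return cb(end)
    i < ary.length ? cb(null, ary[i++]) : cb(true)
  }
}

var seen = []
drain(function (d) {
  seen.push(d)
  if (d === 3) return false // stop early: aborts the source
}, function (err) {
  console.log(seen) // [ 1, 2, 3 ]
})(values([1, 2, 3, 4, 5]))
```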
docs/sinks/index.md
@@ -1,0 +1,22 @@
1+# Sinks
2+
3+A Sink is a stream that is not readable.
4+You *must* have a sink at the end of a pipeline
5+for data to move towards.
6+
7+You can only use _one_ sink per pipeline.
8+
9+``` js
10+pull(source, through, sink)
11+```
12+
13+See also:
14+* [Sources](../sources/index.md)
15+* [Throughs](../throughs/index.md)
16+
17+## [drain](./drain.md)
18+## [reduce](./reduce.md)
19+## [concat](./concat.md)
20+## [collect](./collect.md)
21+## [onEnd](./on-end.md)
22+## [log](./log.md)
docs/sinks/log.md
@@ -1,0 +1,9 @@
1+# pull-stream/sinks/log
2+
3+## usage
4+
5+### `log = require('pull-stream/sinks/log')`
6+
7+### `log()`
8+
9+output the stream to `console.log`
docs/sinks/on-end.md
@@ -1,0 +1,9 @@
1+# pull-stream/sinks/on-end
2+
3+## usage
4+
5+### `onEnd = require('pull-stream/sinks/on-end')`
6+
7+### `onEnd(cb)`
8+
9+Drain the stream and then callback when done.
docs/sinks/reduce.md
@@ -1,0 +1,9 @@
1+# pull-stream/sinks/reduce
2+
3+## usage
4+
5+### `reduce = require('pull-stream/sinks/reduce')`
6+
7+### `reduce (reduce, initial, cb)`
8+
9+reduce stream into single value, then callback.
docs/sources/count.md
@@ -1,0 +1,12 @@
1+# pull-stream/sources/count
2+
3+## usage
4+
5+### `count = require('pull-stream/sources/count')`
6+
7+### `count(max, onAbort)`
8+
9+create a stream that outputs `0 ... max`.
10+by default, `max = Infinity`, see
11+[take](../throughs/take.md)
12+
docs/sources/empty.md
@@ -1,0 +1,20 @@
1+# pull-stream/sources/empty
2+
3+## usage
4+
5+### `empty = require('pull-stream/sources/empty')`
6+
7+### `empty()`
8+
9+A stream with no contents (it just ends immediately)
10+
11+``` js
12+pull(
13+ pull.empty(),
14+ pull.collect(function (err, ary) {
15+    console.log(ary)
16+ // ==> []
17+ })
18+)
19+```
20+
docs/sources/error.md
@@ -1,0 +1,9 @@
1+# pull-stream/sources/error
2+
3+## usage
4+
5+### `error = require('pull-stream/sources/error')`
6+
7+### `error(err)`
8+
9+a stream that errors immediately
docs/sources/index.md
@@ -1,0 +1,23 @@
1+# Sources
2+
3+A source is a stream that is not writable.
4+You *must* have a source at the start of a pipeline
5+for data to move through.
6+
7+in general:
8+
9+``` js
10+pull(source, through, sink)
11+```
12+
13+See also:
14+* [Throughs](../throughs/index.md)
15+* [Sinks](../sinks/index.md)
16+
17+## [values](./values.md)
18+## [keys](./keys.md)
19+## [count](./count.md)
20+## [infinite](./infinite.md)
21+## [empty](./empty.md)
22+## [once](./once.md)
23+## [error](./error.md)
docs/sources/infinite.md
@@ -1,0 +1,11 @@
1+# pull-stream/sources/infinite
2+
3+## usage
4+
5+### `infinite = require('pull-stream/sources/infinite')`
6+
7+### `infinite(generator, onAbort)`
8+
9+create an unending stream by repeatedly calling a generator
10+function (by default, `Math.random`)
11+see [take](../throughs/take.md)
docs/sources/keys.md
@@ -1,0 +1,10 @@
1+# pull-stream/sources/keys
2+
3+## usage
4+
5+### `keys = require('pull-stream/sources/keys')`
6+
7+### `keys(array | object, onAbort)`
8+
9+stream the key names from an object (or array)
10+
docs/sources/once.md
@@ -1,0 +1,9 @@
1+# pull-stream/sources/once
2+
3+## usage
4+
5+### `once = require('pull-stream/sources/once')`
6+
7+### `once(value, onAbort)`
8+
9+a stream with a single value
docs/sources/values.md
@@ -1,0 +1,9 @@
1+# pull-stream/sources/values
2+
3+## usage
4+
5+### `values = require('pull-stream/sources/values')`
6+
7+### `values(array | object, onAbort)`
8+
9+create a SourceStream that reads the values from an array or object and then stops.
docs/spec.md
@@ -1,0 +1,67 @@
1+# Synopsis
2+
3+In Pull-Streams there are two fundamental types of streams, `Source`s and `Sink`s, and two composite types, `Through` (aka transform) and `Duplex`. A Through stream is a Sink that returns a Source: it reads from the stream it is given, and can itself be read from. A Duplex stream is a pair of streams (`{source, sink}`).
4+
5+# Pull-Streams
6+## Source Streams
7+
8+A Source Stream (aka readable stream) is an asynchronous function that may be called repeatedly until it returns a terminal state. Pull-streams have back pressure, but it is implicit rather than an explicit back-pressure signal. If a source
9+needs the sink to slow down, it may delay returning a read. If a sink needs the source to slow down, it just waits until it reads the source again.
10+
11+For example, the Source Stream `fn(abort, cb)` may have an internal implementation that will read data from disk or the network. If `fn` is called with the first argument (`abort`) being truthy, the callback will be passed `abort` as its first argument. The callback has three different argument configurations...
12+
13+ 1. `cb(null, data)`, indicates that there is data.
14+ 2. `cb(true)`, indicates the stream has ended normally.
15+ 3. `cb(error)`, indicates that there was an error.
16+
17+The read method *must not* be called until the previous call has returned, except for a call to abort the stream.
18+
19+### End
20+The stream may be terminated, for example `cb(err|end)`. The read method *must not* be called after the stream has terminated. Just as a normal end is propagated up the pipeline, an error should be propagated too, because it also means the end of the stream. `cb(end=true)` is an "end", i.e. a valid termination; `cb(err)` is an error.
21+`error` and `end` are mostly the same. If you are buffering inputs and see an `end`, process those inputs and then the end.
22+If you are buffering inputs and get an `error`, then you _may_ throw away that buffer and return the error immediately.
23+
24+### Abort
25+Sometimes it's the sink that errors, and if it can't read anymore then we _must_ abort the source. (For example, the source is a file stream from the local fs, and the sink is an http upload. Perhaps the network drops or the remote server crashes; in this case we should abort the source, so that its resources can be released.)
26+
27+To abort the source, call read with a truthy first argument. You may abort a source _before_ it has returned from a regular read. (If you waited for the previous read to complete, you could get a deadlock when reading a stream that takes a long time; for example, `tail -f` is reading a file, but nothing has been appended to that file yet.)
28+
29+When a stream is aborted during a read, the callback provided to the read function *must* be called first, with an error, and then the abort callback.
30+
31+## Sink Streams
32+
33+A Sink Stream (aka writable stream) is a function that a Source Stream is passed to. The Sink Stream calls the `read` function of the Source Stream, abiding by the rules above about when `read` may not be called.
34+
35+### Abort
36+The Sink Stream may also abort the source if it can no longer read from it.
37+
38+## Through Streams
39+
40+A through stream is a sink stream that returns another source when it is passed a source.
41+A through stream may be thought of as wrapping a source.
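That wrapping can be sketched as follows (a mapping through; illustrative only, the library's throughs handle more edge cases):

```js
// A minimal sketch of a Through Stream: a function that takes the
// upstream read and returns a new read (here, mapping each chunk by fn).
function mapThrough (fn) {
  return function (read) {          // sink side: receives the source's read
    return function (abort, cb) {   // source side: the wrapped read
      read(abort, function (end, data) {
        if (end) return cb(end)     // pass end/error straight through
        cb(null, fn(data))
      })
    }
  }
}
```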
42+
43+## Duplex Streams
44+
45+A pair of independent streams, one Source and one Sink. The purpose of a duplex stream is not transformation of the data that passes through it; it is meant for communication only.
46+
47+# Composing Streams
48+
49+Since a Sink is a function that takes a Source, a Source may be fed into a Sink by simply passing the Source to the Sink.
50+For example, `sink(source)`. Since a transform is a Sink that returns a Source, you can just add to that pattern by wrapping the source. For example, `sink(transform(source))`. This works, but it reads from right-to-left, and we are used to left-to-right.
51+
52+A method can create a left-to-right reading pipeline of pull-streams. For example, it could implement the following interface...
53+
54+```
55+pull([source] [,transform ...] [,sink ...])
56+```
57+
58+The interface could allow for the following scenarios...
59+
60+1. Connect a complete pipeline: `pull(source, transform,* sink)`. This connects a source to a sink via zero or more transforms.
61+
62+2. If a sink is not provided: `pull(source, transform+)` then pull should return the last `source`,
63+this way streams can be easily combined in a functional way.
64+
65+3. If a source is not provided: `pull(transform,* sink)` then pull should return a sink that will complete the pipeline when
66+it is passed a source: `function (source) { return pull(source, pipeline) }`.
67+If neither a source nor a sink is provided, this will return a sink that returns another source (via 2), i.e. a through stream.
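Scenario 1 (the complete pipeline) can be sketched as a simple left fold; this naive version does not handle the partial cases 2 and 3, which a real implementation must also cover:

```js
// A naive sketch of scenario 1 only: pull(source, t1, t2, sink)
// is just sink(t2(t1(source))), expressed as a reduce.
function pull () {
  var streams = Array.prototype.slice.call(arguments)
  return streams.reduce(function (piped, next) { return next(piped) })
}
```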
docs/throughs/async-map.mdView
@@ -1,0 +1,10 @@
1+# pull-stream/throughs/async-map
2+
3+## usage
4+
5+### `asyncMap = require('pull-stream/throughs/async-map')`
6+
7+### `asyncMap(fn)`
8+
9+Like [`map`](./map.md) but the signature of `fn` must be
10+`function (data, cb) { cb(null, data) }`
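A sketch of how an asyncMap-style through could work (illustrative; not the module's actual source):

```js
// Map each chunk through an asynchronous fn(data, cb),
// waiting for fn's callback before handing the result downstream.
function asyncMap (fn) {
  return function (read) {
    return function (abort, cb) {
      read(abort, function (end, data) {
        if (end) return cb(end)
        fn(data, function (err, result) {
          if (err) return cb(err)   // fn's error terminates the stream
          cb(null, result)
        })
      })
    }
  }
}
```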
docs/throughs/filter-not.mdView
@@ -1,0 +1,9 @@
1+# pull-stream/throughs/filter-not
2+
3+## usage
4+
5+### `filterNot = require('pull-stream/throughs/filter-not')`
6+
7+### `filterNot(test)`
8+
9+Like [`filter`](./filter.md), but remove items where the filter returns true.
docs/throughs/filter.mdView
@@ -1,0 +1,11 @@
1+# pull-stream/throughs/filter
2+
3+## usage
4+
5+### `filter = require('pull-stream/throughs/filter')`
6+
7+### `filter(test)`
8+
9+Like `[].filter(function (data) { return true || false })`:
10+only `data` where `test(data) === true` is let through
11+to the next stream.
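A sketch of how such a filter through could be implemented (illustrative; not the module's source):

```js
// Chunks failing the test are skipped by simply reading again,
// so the downstream reader never sees them.
function filter (test) {
  return function (read) {
    return function next (abort, cb) {
      read(abort, function (end, data) {
        if (end) return cb(end)
        if (test(data)) return cb(null, data)
        next(abort, cb)             // skip this chunk, pull the next
      })
    }
  }
}
```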
docs/throughs/flatten.mdView
@@ -1,0 +1,9 @@
1+# pull-stream/throughs/flatten
2+
3+## usage
4+
5+### `flatten = require('pull-stream/throughs/flatten')`
6+
7+### `flatten()`
8+
9+Turn a stream of arrays into a stream of their items (undoes `group`).
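A sketch of how flatten could work (illustrative; not the module's source):

```js
// Buffer each incoming array and emit its items one read at a time,
// pulling the next array from upstream when the buffer runs dry.
function flatten () {
  return function (read) {
    var buffer = []
    return function next (abort, cb) {
      if (abort) return read(abort, cb)
      if (buffer.length) return cb(null, buffer.shift())
      read(null, function (end, data) {
        if (end) return cb(end)
        buffer = data.slice()       // copy the incoming array
        next(null, cb)
      })
    }
  }
}
```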
docs/throughs/index.mdView
@@ -1,0 +1,46 @@
1+# Throughs
2+
3+A Through is a stream that both reads from another stream and is
4+read by another stream.
5+
6+Through streams are optional.
7+
8+Put through streams in-between [sources](../sources/index.md) and [sinks](../sinks/index.md),
9+like this:
10+
11+```js
12+pull(source, through, sink)
13+```
14+
15+Also, if you don't have the source/sink yet,
16+you can pipe multiple through streams together
17+to get one through stream!
18+
19+```js
20+var throughABC = function () {
21+ return pull(
22+ throughA(),
23+ throughB(),
24+ throughC()
25+ )
26+}
27+```
28+
29+Which can then be treated like a normal through stream!
30+
31+```js
32+pull(source(), throughABC(), sink())
33+```
34+
35+See also:
36+* [Sources](../sources/index.md)
37+* [Sinks](../sinks/index.md)
38+
39+## [map](./map.md)
40+## [asyncMap](./async-map.md)
41+## [filter](./filter.md)
42+## [filterNot](./filter-not.md)
43+## [unique](./unique.md)
44+## [nonUnique](./non-unique.md)
45+## [take](./take.md)
46+## [flatten](./flatten.md)
docs/throughs/map.mdView
@@ -1,0 +1,54 @@
1+# pull-stream/throughs/map
2+
3+> `[].map` for pull-streams
4+
5+## Background
6+
7+Pull-streams are arrays of data in time rather than space.
8+
9+As with [`[].map`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/map), we may want to map a function over a stream.
10+
11+## Example
12+
13+```js
14+var map = require('pull-stream/throughs/map')
15+```
16+
17+```js
18+pull(
19+ values([0, 1, 2, 3]),
20+ map(function (x) {
21+ return x * x
22+ }),
23+ log()
24+)
25+// 0
26+// 1
27+// 4
28+// 9
29+```
30+
31+## Usage
32+
33+### `map = require('pull-stream/throughs/map')`
34+
35+### `map((data) => data)`
36+
37+`map(fn)` returns a through stream that calls the given `fn` for each chunk of incoming data and outputs the return value, in the same order as before.
38+
39+## Install
40+
41+With [npm](https://npmjs.org/) installed, run
42+
43+```
44+$ npm install pull-stream
45+```
46+
47+## See Also
48+
49+- [`brycebaril/through2-map`](https://github.com/brycebaril/through2-map)
50+- [`Rx.Observable#map`](http://xgrommx.github.io/rx-book/content/observable/observable_instance_methods/map.html)
51+
52+## License
53+
54+[MIT](https://tldrlegal.com/license/mit-license)
docs/throughs/non-unique.mdView
@@ -1,0 +1,10 @@
1+# pull-stream/throughs/non-unique
2+
3+## usage
4+
5+### `nonUnique = require('pull-stream/throughs/non-unique')`
6+
7+### `nonUnique(prop)`
8+
9+Filter out unique items -- keep only the duplicates.
10+The inverse of [`unique`](./unique.md).
docs/throughs/take.mdView
@@ -1,0 +1,13 @@
1+# pull-stream/throughs/take
2+
3+## usage
4+
5+### `take = require('pull-stream/throughs/take')`
6+
7+### `take(test [, opts])`
8+
9+If `test` is a function, read data from the source stream and forward it downstream until `test(data)` returns false.
10+
11+If `opts.last` is set to true, the item for which the test failed will also be forwarded.
12+
13+If `test` is an integer, take that many items from the source.
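The integer case can be sketched like this (illustrative only; the library's take also aborts the upstream source once it is done):

```js
// End the stream after n items have been read.
function take (n) {
  return function (read) {
    return function (abort, cb) {
      if (abort) return read(abort, cb)
      if (n-- <= 0) return cb(true) // stop after n items
      read(null, cb)
    }
  }
}
```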
docs/throughs/through.mdView
@@ -1,0 +1,5 @@
1+# pull-stream/throughs/through
2+
3+## usage
4+
5+### `through = require('pull-stream/throughs/through')`
docs/throughs/unique.mdView
@@ -1,0 +1,11 @@
1+# pull-stream/throughs/unique
2+
3+## usage
4+
5+### `unique = require('pull-stream/throughs/unique')`
6+
7+### `unique(prop)`
8+
9+Filter out items that have a repeated value for `prop(item)`.
10+By default `prop = function (item) { return item }`; if `prop` is a string,
11+items with repeated values for that property are filtered out.
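A sketch of how unique could work (illustrative; not the module's source):

```js
// Drop items whose key has already been seen.
// prop may be a key-extracting function, a property name, or omitted.
function unique (prop) {
  var getKey = typeof prop === 'string'
    ? function (item) { return item[prop] }
    : (prop || function (item) { return item })
  var seen = {}
  return function (read) {
    return function next (abort, cb) {
      read(abort, function (end, data) {
        if (end) return cb(end)
        var key = getKey(data)
        if (seen[key]) return next(abort, cb) // duplicate: skip, read again
        seen[key] = true
        cb(null, data)
      })
    }
  }
}
```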
glossary.mdView
@@ -1,47 +1,0 @@
1-# Glossary
2-
3-## read (end, cb)
4-
5-A function that retrieves the next chunk.
6-All readable streams (sources, and throughs)
7-must return a `read` function.
8-
9-## reader (read,...)
10-
11-A function to create a reader. It takes a `read` function
12-as the first argument, and any other options after that.
13-
14-When passed to `pipeable` or `pipeableSource`,
15-a new function is created that adds `.pipe(dest)`
16-
17-## Lazy vs Eager
18-
19-Lazy means to avoid doing something until you know you have
20-to do it.
21-
22-Eager means to do something early, so you have it ready
23-immediately when you need it.
24-
25-## Source
26-
27-The first stream in the pipeline. The Source is not writable.
28-
29-## Sink
30-
31-The last Stream in the pipeline. The Sink is not readable.
32-
33-## Push vs Pull
34-
35-A pull-stream is a stream where the movement of data
36-is initiated by the sink, and a push-stream
37-is a stream where the movement of data is initiated
38-by the source.
39-
40-## Reader vs Writable
41-
42-In push streams, destination streams (Through and Sink),
43-are _writable_. They are written to by the source streams.
44-
45-In pull streams, destination streams _read_ from the source
46-streams. They are the active participant, so they are called
47-_readers_ rather than _writables_.
pull.mdView
@@ -1,106 +1,0 @@
1-# pull-stream/pull
2-
3-> pipe many pull streams into a pipeline
4-
5-## Background
6-
7-In pull-streams, you need a complete pipeline before data will flow.
8-
9-That means: a source, zero or more throughs, and a sink.
10-
11-But you can still create a _partial_ pipeline, which is a great for tiny pull-stream modules.
12-
13-## Usage
14-
15-```js
16-var pull = require('pull-stream/pull')
17-```
18-
19-Create a simple complete pipeline:
20-
21-```js
22-pull(source, sink) => undefined
23-```
24-
25-Create a source modified by a through:
26-
27-```js
28-pull(source, through) => source
29-```
30-
31-Create a sink, but modify its input before it arrives:
32-
33-```js
34-pull(through, sink) => sink
35-```
36-
37-Create a through, by chaining several throughs:
38-
39-```js
40-pull(through1, through2) => through
41-```
42-
43-These streams combine just like normal streams.
44-
45-```js
46-pull(
47- pull(source, through),
48- pull(through1, through2),
49- pull(through, sink)
50-) => undefined
51-```
52-
53-The complete pipeline returns undefined, because it cannot be piped to anything else.
54-
55-Pipe duplex streams like this:
56-
57-```js
58-var a = duplex()
59-var b = duplex()
60-
61-pull(a.source, b.sink)
62-pull(b.source, a.sink)
63-
64-//which is the same as
65-
66-b.sink(a.source); a.sink(b.source)
67-
68-//but the easiest way is to allow pull to handle this
69-
70-pull(a, b, a)
71-
72-//"pull from a to b and then back to a"
73-```
74-
75-## API
76-
77-```js
78-var pull = require('pull-stream/pull')
79-```
80-
81-### `pull(...streams)`
82-
83-`pull` is a function that receives n-arity stream arguments and connects them into a pipeline.
84-
85-`pull` detects the type of stream by checking function arity, if the function takes only one argument it's either a sink or a through. Otherwise it's a source. A duplex stream is an object with the shape `{ source, sink }`.
86-
87-If the pipeline is complete (reduces into a source being passed into a sink), then `pull` returns `undefined`, as the data is flowing.
88-
89-If the pipeline is partial (reduces into either a source, a through, or a sink), then `pull` returns the partial pipeline, as it must be composed with other streams before the data will flow.
90-
91-## Install
92-
93-With [npm](https://npmjs.org/) installed, run
94-
95-```sh
96-$ npm install pull-stream
97-```
98-
99-## See Also
100-
101-- [`mafintosh/pump`](https://github.com/mafintosh/pump)
102-- [`mafintosh/pumpify`](https://github.com/mafintosh/pumpify)
103-
104-## License
105-
106-[MIT](https://tldrlegal.com/license/mit-license)
spec.mdView
@@ -1,67 +1,0 @@
1-# Synopsis
2-
3-In Pull-Streams, there are two fundamental types of streams, `Source`s and `Sink`s, and two composite types, `Through` (aka transform) and `Duplex`. A Through Stream is a sink stream that reads from a Source Stream and can itself be read like a source. A Duplex Stream is a pair of streams (`{Source, Sink}`).
4-
5-# Pull-Streams
6-## Source Streams
7-
8-A Source Stream (aka readable stream) is an asynchronous function that may be called repeatedly until it returns a terminal state. Pull-streams have back pressure, but it is implicit instead of sending an explicit back pressure signal. If a source
9-needs the sink to slow down, it may delay returning a read. If a sink needs the source to slow down, it just waits until it reads the source again.
10-
11-For example, the Source Stream `fn(abort, cb)` may have an internal implementation that reads data from a disk or the network. If `fn` is called with a truthy first argument (`abort`), the callback will be passed `abort` as its first argument. The callback has three possible argument configurations...
12-
13- 1. `cb(null, data)`, indicates that there is data.
14- 2. `cb(true)`, indicates the stream has ended normally.
15- 3. `cb(error)`, indicates that there was an error.
16-
17-The read method *must not* be called until the previous call has returned, except for a call to abort the stream.
18-
19-### End
20-The stream may be terminated, for example with `cb(err|end)`. The read method *must not* be called after the stream has terminated. Just as a normal end is propagated up the pipeline, an error should be propagated too, because it also means the end of the stream. `cb(end=true)` is an "end", i.e. a valid termination; `cb(err)` is an error.
21-`error` and `end` are mostly the same. If you are buffering inputs and see an `end`, process those inputs and then the end.
22-If you are buffering inputs and get an `error`, you _may_ throw away that buffer and propagate the error immediately.
23-
24-### Abort
25-Sometimes it is the sink that errors, and if it cannot read any more then we _must_ abort the source. (For example, the source is a file stream from the local fs and the sink is an HTTP upload. Perhaps the network drops or the remote server crashes; in this case we should abort the source so that its resources can be released.)
26-
27-To abort the source, call `read` with a truthy first argument. You may abort a source _before_ it has returned from a regular read. (If you waited for the previous read to complete, you could get a deadlock when reading a stream that takes a long time; for example, `tail -f` is reading a file, but nothing has been appended to that file yet.)
28-
29-When a stream is aborted during a read, the callback provided to the read function *must* be called first, with an error, and then the abort callback.
30-
31-## Sink Streams
32-
33-A Sink Stream (aka writable stream) is a function that a Source Stream is passed to. The Sink Stream calls the `read` function of the Source Stream, abiding by the rules above about when `read` may not be called.
34-
35-### Abort
36-The Sink Stream may also abort the source if it can no longer read from it.
37-
38-## Through Streams
39-
40-A through stream is a sink stream that returns another source when it is passed a source.
41-A through stream may be thought of as wrapping a source.
42-
43-## Duplex Streams
44-
45-A pair of independent streams, one Source and one Sink. The purpose of a duplex stream is not transformation of the data that passes through it; it is meant for communication only.
46-
47-# Composing Streams
48-
49-Since a Sink is a function that takes a Source, a Source may be fed into a Sink by simply passing the Source to the Sink.
50-For example, `sink(source)`. Since a transform is a Sink that returns a Source, you can just add to that pattern by wrapping the source. For example, `sink(transform(source))`. This works, but it reads from right-to-left, and we are used to left-to-right.
51-
52-A method can create a left-to-right reading pipeline of pull-streams. For example, it could implement the following interface...
53-
54-```
55-pull([source] [,transform ...] [,sink ...])
56-```
57-
58-The interface could allow for the following scenarios...
59-
60-1. Connect a complete pipeline: `pull(source, transform,* sink)`. This connects a source to a sink via zero or more transforms.
61-
62-2. If a sink is not provided: `pull(source, transform+)` then pull should return the last `source`,
63-this way streams can be easily combined in a functional way.
64-
65-3. If a source is not provided: `pull(transform,* sink)` then pull should return a sink that will complete the pipeline when
66-it is passed a source: `function (source) { return pull(source, pipeline) }`.
67-If neither a source nor a sink is provided, this will return a sink that returns another source (via 2), i.e. a through stream.

Built with git-ssb-web