scuttlebot

Secure-scuttlebutt API server

get: async

Get a message by its hash-id.

get {msgid}
get(msgid, cb)
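
For example, a minimal JavaScript sketch of fetching a message, assuming the ssb-client module (not described here) is used to connect to a running sbot server, and that msgid is a placeholder for a known message id:

var ssbClient = require('ssb-client')

// connect to the local sbot server using the default keys
ssbClient(function (err, sbot) {
  if (err) throw err
  var msgid = '%...'   // placeholder: the hash-id of some known message
  sbot.get(msgid, function (err, msg) {
    if (err) throw err
    console.log(msg)   // the message value
    sbot.close()       // close the connection when done
  })
})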

createFeedStream: source

(feed) Fetch messages ordered by their claimed timestamps.

feed [--live] [--gt index] [--gte index] [--lt index] [--lte index] [--reverse]  [--keys] [--values] [--limit n]
createFeedStream({ live:, gt:, gte:, lt:, lte:, reverse:, keys:, values:, limit:, fillCache:, keyEncoding:, valueEncoding: })

Create a stream of the data in the database, ordered by the timestamp claimed by the author. NOTE: this timestamp is not verified and may be incorrect. The range queries (gt, gte, lt, lte) filter against this claimed timestamp.
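
A sketch of consuming this source from JavaScript. Source methods return pull-streams, so the pull-stream module (an assumption, not part of this document) is used here, along with a connected sbot handle as in the get example above:

var pull = require('pull-stream')

pull(
  sbot.createFeedStream({ reverse: true, limit: 10 }),
  pull.collect(function (err, msgs) {
    if (err) throw err
    console.log(msgs)   // the 10 newest messages by claimed timestamp
  })
)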

createLogStream: source

(log) Fetch messages ordered by the time received.

log [--live] [--gt index] [--gte index] [--lt index] [--lte index] [--reverse]  [--keys] [--values] [--limit n]
createLogStream({ live:, gt:, gte:, lt:, lte:, reverse:, keys:, values:, limit:, fillCache:, keyEncoding:, valueEncoding: })

Creates a stream of the messages that have been written to this instance, in the order they arrived. The objects in this stream will be of the form:

{ key: Hash, value: Message, timestamp: timestamp }

timestamp is the time at which the message was received. It is generated by monotonic-timestamp. The range queries (gt, gte, lt, lte) filter against this receive timestamp.
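
For example, a sketch of tailing the local log live (same assumed sbot handle and pull-stream module as above):

var pull = require('pull-stream')

pull(
  sbot.createLogStream({ live: true }),
  pull.drain(function (data) {
    // each item has the { key, value, timestamp } form described above
    console.log(data)
  })
)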

messagesByType: source

(logt) Retrieve messages with a given type, ordered by receive-time.

logt --type {type} [--live] [--gt index] [--gte index] [--lt index] [--lte index] [--reverse]  [--keys] [--values] [--limit n]
messagesByType({ type:, live:, gt:, gte:, lt:, lte:, reverse:, keys:, values:, limit:, fillCache:, keyEncoding:, valueEncoding: })

All messages must have a type, so this is a good way to select messages that an application might use. As in createLogStream, the range queries (gt, gte, lt, lte) filter against the receive timestamp.
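
A sketch of collecting all messages of one type (same assumptions as above; 'post' is just an example type):

var pull = require('pull-stream')

pull(
  sbot.messagesByType({ type: 'post' }),
  pull.collect(function (err, msgs) {
    if (err) throw err
    console.log(msgs.length + ' post messages')
  })
)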

createHistoryStream: source

(hist) Fetch messages from a specific user, ordered by sequence numbers.

hist {feedid} [seq] [live]
hist --id {feedid} [--seq n] [--live] [--limit n] [--keys] [--values]
createHistoryStream(id, seq, live)
createHistoryStream({ id:, seq:, live:, limit:, keys:, values: })

createHistoryStream and createUserStream serve the same purpose. createHistoryStream exists as a separate call because it provides fewer range parameters, which makes it safer for RPC between untrusted peers.
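
A sketch of reading a feed's history from the beginning (same assumptions as above; feedid is a placeholder for a feed's public-key id):

var pull = require('pull-stream')

var feedid = '@...'   // placeholder feed id

pull(
  sbot.createHistoryStream({ id: feedid, seq: 1 }),
  pull.drain(function (msg) {
    console.log(msg)   // messages from this feed, in sequence order
  })
)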

createUserStream: source

Fetch messages from a specific user, ordered by sequence numbers.

createUserStream --id {feedid} [--live] [--gt index] [--gte index] [--lt index] [--lte index] [--reverse]  [--keys] [--values] [--limit n]
createUserStream({ id:, live:, gt:, gte:, lt:, lte:, reverse:, keys:, values:, limit:, fillCache:, keyEncoding:, valueEncoding: })

createHistoryStream and createUserStream serve the same purpose. createHistoryStream exists as a separate call because it provides fewer range parameters, which makes it safer for RPC between untrusted peers.

The range queries (gt, gte, lt, lte) filter against the sequence number.
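
For example, a sketch of fetching only a slice of a feed by sequence number (same assumptions as above):

var pull = require('pull-stream')

pull(
  sbot.createUserStream({ id: feedid, gte: 100, lte: 200 }),
  pull.collect(function (err, msgs) {
    if (err) throw err
    console.log(msgs.length)   // at most 101 messages: sequences 100 through 200
  })
)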

links: source

Get a stream of messages, feeds, or blobs that are linked to/from an id.

links [--source id|filter] [--dest id|filter] [--rel value] [--keys] [--values] [--live] [--reverse]
links({ source:, dest:, rel:, keys:, values:, live:, reverse: })

The objects in this stream will be of the form:

{ source: ID, rel: String, dest: ID, key: MsgID }
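
A sketch of listing messages that link to a given message (same assumptions as above; the 'root' rel is only an example relation used by threaded content):

var pull = require('pull-stream')

pull(
  sbot.links({ dest: msgid, rel: 'root', values: true }),
  pull.collect(function (err, links) {
    if (err) throw err
    console.log(links)   // items of the { source, rel, dest, key } form, with values included here
  })
)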

relatedMessages: async

Retrieve the tree of messages related to the given id.

relatedMessages --id {msgid} [--rel value] [--count] [--parent]
relatedMessages({ id:, rel:, count:, parent: }, cb)

This is ideal for collecting things like threaded replies. The output is a recursive structure like this:

{
  key: <id>,
  value: <msg>,
  related: [
    <recursive>,...
  ],
  //number of messages below this point. (when opts.count = true)
  count: <int>,
  //the message this message links to. this will not appear on the bottom level.
  //(when opts.parent = true)
  parent: <parent_id>
}
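
A sketch of fetching a reply tree with counts (same assumed sbot handle and msgid placeholder as above):

sbot.relatedMessages({ id: msgid, count: true }, function (err, tree) {
  if (err) throw err
  // tree has the recursive { key, value, related, count } shape shown above
  console.log(tree.count + ' related messages')
})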

add: async

Add a well-formed message to the database.

cat ./message.json | add
add --author {feedid} --sequence {number} --previous {msgid} --timestamp {number} --hash sha256 --signature {sig} --content.type {type} --content.{...}
add({ author:, sequence:, previous:, timestamp:, hash: 'sha256', signature:, content: { type:, ... } }, cb)

publish: async

Construct a message using sbot's current user, and add it to the DB.

cat ./message-content.json | publish
publish --type {string} [--other-attributes...]
publish({ type:, ... }, cb)

This is the recommended method for publishing new messages, as it handles the tasks of correctly setting the message's timestamp, sequence number, previous-hash, and signature.
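
For example, a sketch of publishing a simple post (same assumed sbot handle as above; the 'post' type and text field are content conventions, not requirements of this call):

sbot.publish({ type: 'post', text: 'hello, world' }, function (err, msg) {
  if (err) throw err
  console.log(msg)   // the stored message, with timestamp, sequence, previous and signature filled in
})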

getAddress: sync

Get the address of the server.

getAddress
getAddress(cb)

getLatest: async

Get the latest message in the database by the given feedid.

getLatest {feedid}
getLatest(id, cb)
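
A sketch (same assumptions as above):

var feedid = '@...'   // placeholder feed id

sbot.getLatest(feedid, function (err, latest) {
  if (err) throw err
  console.log(latest)   // the latest message from that feed
})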

latest: source

Get the seq numbers of the latest messages of all users in the database.

latest
latest()
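
A sketch of draining this source (same assumptions as above):

var pull = require('pull-stream')

pull(
  sbot.latest(),
  pull.drain(function (item) {
    console.log(item)   // one item per known feed, with its latest sequence number
  })
)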

latestSequence: async

Get the sequence and local timestamp of the last received message from a given feedId.

latestSequence {feedId}
latestSequence(feedId, cb)

whoami: sync

Get information about the current sbot user.

whoami
whoami(cb)

Outputs information in the following form:

{ id: FeedID }
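
A sketch (same assumed sbot handle as above):

sbot.whoami(function (err, info) {
  if (err) throw err
  console.log(info.id)   // the current user's feed id
})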

progress: sync

Returns an object reflecting the progress state of various plugins. The return value is an object whose sub-objects each have the form {start, current, target} to represent progress. Currently implemented are migration (legacy-to-flume migration progress) and indexes (index regeneration).

getVectorClock: async
