dstore-dist
Command Arguments
dstore-dist [-config file] [-debug] [-help] [-cpuprofile file] [-memprofile file]
Description
dstore-dist acts as a router/distributor of the protobuf messages that are generated by the PowerDNS Recursor and dnsdist. It is configured using a YAML-based configuration file.
dstore-dist is configured with a set of destinations, which indicate all the possible destinations for a message. It is also configured with a set of routes; each route can send messages to one or more destinations, and can also be configured to perform filtering on the messages.
Flags

| Flag | Argument | Description |
|---|---|---|
| -config | file | Load configuration from file |
| -debug | | Generate debug logging |
| -help | | Display a helpful message and exit |
| -cpuprofile | file | Write CPU profile to file |
| -memprofile | file | Write memory profile to file |
Files
/etc/pdns-dstore-dist/dstore-dist.yml: Default location of the config file
Configuration
See Configuration of dstore-dist for details of the configuration file format.
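To make the destination/route split described above concrete, the sketch below shows the general shape such a configuration might take. The key names, destination names, and fields are purely illustrative assumptions, not the authoritative schema; consult Configuration of dstore-dist for the real format.

```yaml
# Illustrative sketch only: every key and name here is hypothetical.
# See the dstore-dist configuration documentation for the actual schema.
destinations:
  my_kafka:            # a Kafka destination
    type: kafka
    # ... brokers, topic, encoding, etc.
  my_pdns:             # a pdns (framed protobuf over TCP) destination
    type: pdns
    # ... address, framing, etc.

routes:
  all_to_kafka:
    destinations:
      - my_kafka
    # optional message filtering would be configured here
```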
Network Protocol/Encoding
When sending messages to destinations, the protocol/encoding used is slightly different depending on the destination.
For destinations of type pdns, dstore-dist will send messages over a TCP stream as serialised protobuf messages, each preceded by framing bytes (the type of framing is configurable). The destination does not send any responses.
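As a rough sketch of what a receiver for a pdns-type destination could look like, the Go program below accepts TCP connections and reads length-prefixed protobuf messages. It assumes the common default framing of a 2-byte big-endian length before each serialised message (the framing is configurable, as noted above), leaves protobuf decoding to generated bindings that are not shown, and uses a placeholder listen address.

```go
package main

import (
	"encoding/binary"
	"io"
	"log"
	"net"
)

// readFrames reads length-prefixed protobuf messages from one connection,
// assuming a 2-byte big-endian length prefix (an assumption about the
// default framing).
func readFrames(conn net.Conn) {
	defer conn.Close()
	for {
		var lenBuf [2]byte
		if _, err := io.ReadFull(conn, lenBuf[:]); err != nil {
			return // connection closed or broken
		}
		msg := make([]byte, binary.BigEndian.Uint16(lenBuf[:]))
		if _, err := io.ReadFull(conn, msg); err != nil {
			return
		}
		// msg now holds one serialised protobuf message; decode it with
		// the generated DNSMessage bindings (not shown here).
		log.Printf("received %d-byte protobuf message", len(msg))
	}
}

func main() {
	ln, err := net.Listen("tcp", ":2000") // placeholder listen address
	if err != nil {
		log.Fatal(err)
	}
	for {
		conn, err := ln.Accept()
		if err != nil {
			log.Fatal(err)
		}
		go readFrames(conn)
	}
}
```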
For destinations of type kafka, the protocol used is the Kafka protocol, and each Kafka Message is encoded as follows (a consumer sketch is shown after this list):

- Key: This can be either "mm" or "sm".
  - If the Key is "mm", each Kafka Message contains potentially multiple protobuf messages, encoded as repeated protobuf fields. If JSON encoding is selected, the multiple JSON messages are separated by the `,` character.
  - If the Key is "sm", each Kafka Message contains only one protobuf or JSON message.
- Value: The protobuf message(s), encoded as described above.
- Headers: The following Kafka headers are set:
  - msgType: DNSMessage
  - instanceName: (Optional) The name of the instance, if set
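The sketch below shows how a consumer might interpret these Keys and Headers. It uses the third-party github.com/segmentio/kafka-go client purely as an example; the broker address and topic are placeholders, and decoding of the Value (protobuf or JSON) is omitted.

```go
package main

import (
	"context"
	"log"

	"github.com/segmentio/kafka-go"
)

func main() {
	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"}, // placeholder broker
		Topic:   "dnsmessages",              // placeholder topic
	})
	defer r.Close()

	for {
		m, err := r.ReadMessage(context.Background())
		if err != nil {
			log.Fatal(err)
		}
		for _, h := range m.Headers {
			// e.g. msgType=DNSMessage, instanceName=<name> if set
			log.Printf("header %s=%s", h.Key, string(h.Value))
		}
		switch string(m.Key) {
		case "mm":
			// Value holds several messages: repeated protobuf fields, or
			// comma-separated JSON documents if JSON encoding is in use.
			log.Printf("multi-message value, %d bytes", len(m.Value))
		case "sm":
			// Value holds exactly one protobuf or JSON message.
			log.Printf("single-message value, %d bytes", len(m.Value))
		}
	}
}
```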
For destinations of type storage, the encoding can be either protobuf (using the default 16-bit framing), JSON, or BIND query log format (see the BIND 9 Administrator Reference Manual).
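For the protobuf case, a reader of such a storage file can reuse the same length-prefix loop shown earlier for pdns destinations. The Go sketch below assumes the default 16-bit framing is a big-endian length prefix and uses a hypothetical file path; protobuf decoding is again left to generated bindings.

```go
package main

import (
	"bufio"
	"encoding/binary"
	"io"
	"log"
	"os"
)

func main() {
	// Hypothetical path to a file written by a storage destination using
	// the default 16-bit framing (assumed to be a big-endian length prefix).
	f, err := os.Open("/var/lib/dstore/messages.pb")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	r := bufio.NewReader(f)
	for {
		var lenBuf [2]byte
		if _, err := io.ReadFull(r, lenBuf[:]); err != nil {
			if err == io.EOF {
				break // end of file reached cleanly
			}
			log.Fatal(err)
		}
		msg := make([]byte, binary.BigEndian.Uint16(lenBuf[:]))
		if _, err := io.ReadFull(r, msg); err != nil {
			log.Fatal(err)
		}
		// Decode msg with the generated protobuf bindings (not shown).
		log.Printf("read %d-byte protobuf record", len(msg))
	}
}
```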