Expand documentation on CLI and workflows (#130)

Justin Ibarra
2020-08-18 14:27:51 -05:00
committed by GitHub
parent 9b70383898
commit 28c869fb5f
4 changed files with 151 additions and 13 deletions
+132
@@ -0,0 +1,132 @@
# Command Line Interface (CLI)
This guide covers more advanced CLI use cases and workflows. To [get started](README.md#getting-started) with the CLI, refer to
the [README](README.md). Basic usage, such as [creating a rule](CONTRIBUTING.md#creating-a-rule-with-the-cli) or
[testing](CONTRIBUTING.md#testing-a-rule-with-the-cli), is covered in the [contribution guide](CONTRIBUTING.md).
## Using a config file or environment variables
CLI commands which are tied to Kibana and Elasticsearch are capable of parsing auth-related keyword args from a config
file or environment variables.
If a value is set in multiple places, such as config file and environment variable, the order of precedence will be as
follows:
* explicitly passed args (such as `--user joe`)
* environment variables
* config values
* prompt (this only applies to certain values)
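The precedence above can be sketched as follows. This is a minimal illustration of the resolution order, not the repo's actual implementation; `resolve` and its signature are hypothetical:

```python
import os

def resolve(name, cli_args, config, prompt=None):
    """Resolve a value: explicit CLI arg > environment variable > config file > prompt."""
    # 1. explicitly passed args (such as `--user joe`)
    if cli_args.get(name) is not None:
        return cli_args[name]
    # 2. environment variables (DR_<UPPERCASED_ARG_NAME>)
    env_value = os.environ.get('DR_' + name.upper())
    if env_value is not None:
        return env_value
    # 3. config file values
    if name in config:
        return config[name]
    # 4. prompt (only for certain values)
    return prompt() if prompt else None

os.environ['DR_USERNAME'] = 'from-env'
print(resolve('username', {'username': 'joe'}, {'username': 'from-config'}))  # joe
print(resolve('username', {}, {'username': 'from-config'}))                   # from-env
```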
#### Setting up a config file
In the root directory of this repo, create the file `.detection-rules-cfg.json` and add the relevant values.
Currently supported arguments:
* elasticsearch_url
* kibana_url
* cloud_id
* username
* password
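For example, a `.detection-rules-cfg.json` might look like the following (all values here are placeholders):

```json
{
  "elasticsearch_url": "https://localhost:9200",
  "kibana_url": "https://localhost:5601",
  "username": "joe",
  "password": "example-password"
}
```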
#### Using environment variables
Environment variables using the format `DR_<UPPERCASED_ARG_NAME>` will be parsed by commands which expect them.
For example: `DR_USER=joe`
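For instance, credentials could be supplied inline for a single invocation rather than passed as flags. The rule file path below is just an example:

```console
DR_USER=joe DR_PASSWORD=example python -m detection_rules kibana-upload my-rules/example_custom_rule.toml
```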
## Importing rules into the repo
You can import rules into the repo using the `create-rule` or `import-rules` commands. Both of these commands will
require that the rules are schema-compliant and able to pass full validation. The biggest benefit to using these
commands is that they will strip[*](#note) additional fields[**](#note-2) and prompt for missing required
fields.
Alternatively, you can manually place rule files in the directory and run tests to validate as well.
<a id="note">\* Note</a>: This is currently limited to flat fields and may not apply to nested values.<br>
<a id="note-2">\** Note</a>: Additional fields are based on the current schema at the time the command is used.
#### `create-rule`
```console
Usage: detection_rules create-rule [OPTIONS] PATH
Create a detection rule.
Options:
-c, --config FILE Rule or config file
--required-only Only prompt for required fields
-t, --rule-type [machine_learning|saved_query|query|threshold]
Type of rule to create
-h, --help Show this message and exit.
```
This command will allow you to pass a rule file using the `-c/--config` parameter. This is limited to one rule at a time
and will accept any valid rule in the following formats:
* toml
* json
* yaml (yup)
* ndjson (as long as it contains only a single rule and has the extension `.ndjson` or `.jsonl`)
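For example, to convert an exported JSON rule into a new toml rule file, prompting only for required fields (both file paths here are hypothetical):

```console
python -m detection_rules create-rule rules/windows/my_new_rule.toml -c exported_rule.json --required-only
```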
#### `import-rules`
```console
Usage: detection_rules import-rules [OPTIONS] [INFILE]...
Import rules from json, toml, or Kibana exported rule file(s).
Options:
-d, --directory DIRECTORY Load files from a directory
-h, --help Show this message and exit.
```
The primary advantage of using this command is the ability to import multiple rules at once. Multiple rule paths can be
specified explicitly with unlimited arguments, recursively within a directory using `-d/--directory`[*](#note-3), or
a combination of both.
In addition to the formats accepted by `create-rule`, this will also accept an `.ndjson`/`.jsonl` file
containing multiple rules (as would be the case with a bulk export).
This will also strip additional fields and prompt for missing required fields.
<a id="note-3">\* Note</a>: This will attempt to parse ALL files recursively within a specified directory.
## Uploading rules to Kibana
Toml formatted rule files can be uploaded as custom rules using the `kibana-upload` command. To upload more than one
file, specify multiple files at a time as individual args. This command is meant to support uploading and testing of
rules and is not intended for production use in its current state.
```console
python -m detection_rules kibana-upload my-rules/example_custom_rule.toml
```
_*To load a custom rule, the proper index must be set up first. The simplest way to do this is to click
the `Load prebuilt detection rules and timeline templates` button on the `detections` page in the Kibana security app._
## Converting between JSON and TOML
[Importing rules](#importing-rules-into-the-repo) will convert any supported format to toml. The `view-rule` command
also lets you view a converted rule without importing it, by specifying the `--rule-format` flag.
To view a rule in JSON format instead, use `view-rule` with the `--api-format` flag, which is the default.
(See the [note](#a-note-on-version-handling) on JSON formatted rules and versioning.)
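For example (the rule file names here are hypothetical):

```console
# print the converted rule (toml) format of a rule file without importing it
python -m detection_rules view-rule --rule-file exported_rule.json --rule-format

# print the final api (JSON) format, which is the default
python -m detection_rules view-rule --rule-file my_rule.toml --api-format
```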
## A note on version handling
The rule toml files exist in a slightly different form than their final state as JSON files in Kibana. The files are
whitespace-stripped, normalized, sorted, and indented prior to their json conversion. Everything within the `metadata`
table is also stripped out, as it is meant to be used only in the context of this repository and not in Kibana.
Additionally, the `version` of the rule is added to the file prior to exporting it. This ensures that version bumps
occur intentionally, right before we create a release. Versions are auto-incremented based on detected changes in
rules, determined by a hash of the rule computed in the following order:
* sorted json
* serialized
* b64 encoded
* sha256 hash
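The hashing steps above can be sketched as follows. This is a minimal illustration of the described order, not the repo's exact code; `rule_hash` is a hypothetical name:

```python
import base64
import hashlib
import json

def rule_hash(rule: dict) -> str:
    # sorted json -> serialized string -> b64 encoded bytes -> sha256 hash
    serialized = json.dumps(rule, sort_keys=True)
    encoded = base64.b64encode(serialized.encode('utf-8'))
    return hashlib.sha256(encoded).hexdigest()

# key order does not affect the hash, but any change to a value does
rule = {'rule_id': 'abc', 'name': 'Example rule'}
assert rule_hash(rule) == rule_hash({'name': 'Example rule', 'rule_id': 'abc'})
```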
As a result, all cases where rules are shown or converted to JSON are not just simple conversions from TOML.
+1
@@ -75,6 +75,7 @@ Commands:
The [contribution guide](CONTRIBUTING.md) describes how to use the `create-rule` and `test` commands to create and test a new rule when contributing to Detection Rules.
+For more advanced command line interface (CLI) usage, refer to the [CLI guide](CLI.md).
## How to contribute
+14 -10
@@ -28,10 +28,12 @@ def es_group():
"""Helper commands for integrating with Elasticsearch."""
def get_es_client(user, password, host=None, cloud_id=None, **kwargs):
def get_es_client(user, password, elasticsearch_url=None, cloud_id=None, **kwargs):
"""Get an auth-validated elsticsearch client."""
assert host or cloud_id, 'You must specify a host or cloud-id to authenticate to elasticsearch instance'
hosts = [host] if host else host
assert elasticsearch_url or cloud_id, \
'You must specify a host or cloud_id to authenticate to an elasticsearch instance'
hosts = [elasticsearch_url] if elasticsearch_url else elasticsearch_url
client = Elasticsearch(hosts=hosts, cloud_id=cloud_id, http_auth=(user, password), **kwargs)
# force login to test auth
@@ -177,7 +179,7 @@ class CollectEvents(object):
@es_group.command('collect-events')
@click.argument('agent-hostname')
-@click.option('--host', callback=set_param_values, expose_value=True)
+@click.option('--elasticsearch-url', '-u', callback=set_param_values, expose_value=True)
@click.option('--cloud-id', callback=set_param_values, expose_value=True)
@click.option('--user', '-u', callback=set_param_values, expose_value=True, hide_input=False)
@click.option('--password', '-p', callback=set_param_values, expose_value=True, hide_input=True)
@@ -186,14 +188,16 @@ class CollectEvents(object):
@click.option('--rta-name', '-r', help='Name of RTA in order to save events directly to unit tests data directory')
@click.option('--rule-id', help='Updates rule mapping in rule-mapping.yml file (requires --rta-name)')
@click.option('--view-events', is_flag=True, help='Print events after saving')
-def collect_events(agent_hostname, host, cloud_id, user, password, index, agent_type, rta_name, rule_id, view_events):
+def collect_events(agent_hostname, elasticsearch_url, cloud_id, user, password, index, agent_type, rta_name, rule_id,
+                   view_events):
"""Collect events from Elasticsearch."""
match = {'agent.type': agent_type} if agent_type else {}
try:
-        client = get_es_client(host=host, use_ssl=True, cloud_id=cloud_id, user=user, password=password)
+        client = get_es_client(elasticsearch_url=elasticsearch_url, use_ssl=True, cloud_id=cloud_id, user=user,
+                               password=password)
except AuthenticationException:
-        click.secho('Failed authentication for {}'.format(host or cloud_id), fg='red', err=True)
+        click.secho('Failed authentication for {}'.format(elasticsearch_url or cloud_id), fg='red', err=True)
return ERRORS['FAILED_ES_AUTH']
try:
@@ -225,17 +229,17 @@ def normalize_file(events_file):
@root.command("kibana-upload")
@click.argument("toml-files", nargs=-1, required=True)
-@click.option('--url', callback=set_param_values, expose_value=True)
+@click.option('--kibana-url', '-u', callback=set_param_values, expose_value=True)
@click.option('--cloud-id', callback=set_param_values, expose_value=True)
@click.option('--user', '-u', callback=set_param_values, expose_value=True, hide_input=False)
@click.option('--password', '-p', callback=set_param_values, expose_value=True, hide_input=True)
-def kibana_upload(toml_files, url, cloud_id, user, password):
+def kibana_upload(toml_files, kibana_url, cloud_id, user, password):
"""Upload a list of rule .toml files to Kibana."""
from uuid import uuid4
from .packaging import manage_versions
from .schemas import downgrade
-    with Kibana(cloud_id=cloud_id, url=url) as kibana:
+    with Kibana(cloud_id=cloud_id, url=kibana_url) as kibana:
kibana.login(user, password)
file_lookup = load_rule_files(paths=toml_files)
+4 -3
@@ -123,9 +123,9 @@ def mass_update(ctx, query, field):
@root.command('view-rule')
@click.argument('rule-id', required=False)
@click.option('--rule-file', '-f', type=click.Path(dir_okay=False), help='Optionally view a rule from a specified file')
-@click.option('--as-api/--as-rule', default=True, help='Print the rule in final api or rule format')
+@click.option('--api-format/--rule-format', default=True, help='Print the rule in final api or rule format')
@click.pass_context
-def view_rule(ctx, rule_id, rule_file, as_api):
+def view_rule(ctx, rule_id, rule_file, api_format):
"""View an internal rule or specified rule file."""
rule = None
@@ -147,7 +147,8 @@ def view_rule(ctx, rule_id, rule_file, as_api):
click.secho('Unknown format!', fg='red')
ctx.exit(1)
-    click.echo(toml_write(rule.rule_format()) if not as_api else json.dumps(rule.contents, indent=2, sort_keys=True))
+    click.echo(toml_write(rule.rule_format()) if not api_format else
+               json.dumps(rule.contents, indent=2, sort_keys=True))
return rule