Re-organize commands under more specific click groups (#356)
* Restructure commands under more specific click groups
* Standardize CLI error handling
* Add global debug options
* Move es and kibana clients into their click groups
* Move commands and groups to dedicated files
* Distinguish variable names for better env/config parsing
@@ -30,11 +30,11 @@ jobs:

      - name: Python License Check
        run: |
-         python -m detection_rules license-check
+         python -m detection_rules dev license-check

      - name: Build release package
        run: |
-         python -m detection_rules build-release
+         python -m detection_rules dev build-release

      - name: Archive production artifacts
        uses: actions/upload-artifact@v2
@@ -25,8 +25,8 @@ Currently supported arguments:

* elasticsearch_url
* kibana_url
* cloud_id
-* username
-* password
+* *_username (kibana and es)
+* *_password (kibana and es)

#### Using environment variables
@@ -92,14 +92,79 @@ This will also strip additional fields and prompt for missing required fields.

<a id="note-3">\* Note</a>: This will attempt to parse ALL files recursively within a specified directory.


## Commands using Elasticsearch and Kibana clients

Commands which connect to Elasticsearch or Kibana are embedded under the subcommands:
* es
* kibana

These command groups will leverage their respective clients and will automatically use parsed config options if
defined; otherwise, arguments should be passed to the sub-command as:

`python -m detection_rules kibana -u <username> -p <password> upload-rule <...>`
```console
python -m detection_rules es -h

Usage: detection_rules es [OPTIONS] COMMAND [ARGS]...

  Commands for integrating with Elasticsearch.

Options:
  -e, --elasticsearch-url TEXT
  --cloud-id TEXT
  -u, --es-user TEXT
  -p, --es-password TEXT
  -t, --timeout INTEGER         Timeout for elasticsearch client
  -h, --help                    Show this message and exit.

Commands:
  collect-events  Collect events from Elasticsearch.
```

```console
python -m detection_rules kibana -h

Usage: detection_rules kibana [OPTIONS] COMMAND [ARGS]...

  Commands for integrating with Kibana.

Options:
  -k, --kibana-url TEXT
  --cloud-id TEXT
  -u, --kibana-user TEXT
  -p, --kibana-password TEXT
  -t, --timeout INTEGER       Timeout for kibana client
  -h, --help                  Show this message and exit.

Commands:
  upload-rule  Upload a list of rule .toml files to Kibana.
```

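When flags are omitted, these groups fall back to the parsed config options listed earlier. A minimal sketch of that kind of lookup chain follows; `getdefault_sketch`, the `DR_` environment-variable prefix, and the config shape are illustrative assumptions, not the repository's actual `getdefault` helper:

```python
import os


def getdefault_sketch(name, config=None):
    # Hypothetical resolution order: environment variable first, then the
    # parsed config file, then None (letting the CLI prompt the user).
    env_value = os.environ.get(f"DR_{name.upper()}")
    if env_value is not None:
        return env_value
    return (config or {}).get(name)
```

Keeping per-client keys distinct (`kibana_user` vs `es_user`) is what lets one config file serve both groups without ambiguity.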
## Uploading rules to Kibana

-Toml formatted rule files can be uploaded as custom rules using the `kibana-upload` command. To upload more than one
+Toml formatted rule files can be uploaded as custom rules using the `kibana upload-rule` command. To upload more than one
file, specify multiple files at a time as individual args. This command is meant to support uploading and testing of
rules and is not intended for production use in its current state.

```console
-python -m detection_rules kibana-upload my-rules/example_custom_rule.toml
+python -m detection_rules kibana upload-rule -h

Kibana client:
Options:
  -k, --kibana-url TEXT
  --cloud-id TEXT
  -u, --kibana-user TEXT
  -p, --kibana-password TEXT
  -t, --timeout INTEGER       Timeout for kibana client

Usage: detection_rules kibana upload-rule [OPTIONS] TOML_FILES...

  Upload a list of rule .toml files to Kibana.

Options:
  -h, --help  Show this message and exit.
```

_*To load a custom rule, the proper index must be setup first. The simplest way to do this is to click
@@ -130,3 +195,11 @@ rules. This is based on the hash of the rule in the following format:

* sha256 hash

As a result, all cases where rules are shown or converted to JSON are not just simple conversions from TOML.

## Debugging

Most CLI errors will print a concise, user-friendly message. To enable debug mode and see full error stacktraces,
define `"debug": true` in your config file, or run `python -m detection_rules -d <commands...>`.

The flag takes precedence over the config file, so if debug is enabled in your config and you run
`python -m detection_rules --no-debug`, debugging will be disabled.
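This precedence works because click models `--debug/--no-debug` as a tri-state value: `default=None` means the flag was never passed, so only then does the config file apply. A minimal sketch of the resolution logic (`resolve_debug` is a hypothetical helper, not a function in the repository):

```python
def resolve_debug(cli_flag, config):
    """Resolve debug mode: an explicit CLI flag wins over the config file."""
    # cli_flag is tri-state: True (--debug), False (--no-debug), None (not passed)
    return cli_flag if cli_flag is not None else bool(config.get("debug", False))
```

This mirrors the `debug = debug if debug is not None else parse_config().get('debug')` line in the root group.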
@@ -35,7 +35,7 @@ pytest: $(VENV) deps
.PHONY: license-check
license-check: $(VENV) deps
	@echo "LICENSE CHECK"
-	$(PYTHON) -m detection_rules license-check
+	$(PYTHON) -m detection_rules dev license-check

.PHONY: lint
lint: $(VENV) deps
@@ -48,7 +48,7 @@ test: $(VENV) lint pytest
.PHONY: release
release: deps
	@echo "RELEASE: $(app_name)"
-	$(PYTHON) -m detection_rules build-release
+	$(PYTHON) -m detection_rules dev build-release
	rm -rf dist
	mkdir dist
	cp -r releases/*/*.zip dist/
@@ -56,4 +56,4 @@ release: deps
.PHONY: kibana-commit
kibana-commit: deps
	@echo "PREP KIBANA-COMMIT: $(app_name)"
-	$(PYTHON) -m detection_rules kibana-commit
+	$(PYTHON) -m detection_rules dev kibana-commit
@@ -55,22 +55,23 @@ Usage: detection_rules [OPTIONS] COMMAND [ARGS]...

  Commands for detection-rules repository.

Options:
-  -h, --help  Show this message and exit.
+  -d, --debug / -n, --no-debug  Print full exception stacktrace on errors
+  -h, --help                    Show this message and exit.

Commands:
-  build-release         Assemble all the rules into Kibana-ready release files.
-  create-rule           Create a new rule TOML file.
-  es                    Helper commands for integrating with Elasticsearch.
-  kibana-diff           Diff rules against their version represented in...
-  load-from-file        Load rules from file(s).
-  mass-update           Update multiple rules based on eql results.
-  rule-search           Use EQL to search the rules.
-  test                  Run unit tests over all of the rules.
-  toml-lint             Cleanup files with some simple toml formatting.
-  update-lock-versions  Update rule hashes in version.lock.json file...
-  validate-all          Check if all rules validates against a schema.
-  validate-rule         Check if a rule staged in rules dir validates...
-  view-rule             View an internal rule or specified rule file.
+  create-rule     Create a detection rule.
+  dev             Commands for development and management by internal...
+  es              Commands for integrating with Elasticsearch.
+  import-rules    Import rules from json, toml, or Kibana exported rule...
+  kibana          Commands for integrating with Kibana.
+  mass-update     Update multiple rules based on eql results.
+  normalize-data  Normalize Elasticsearch data timestamps and sort.
+  rule-search     Use KQL or EQL to find matching rules.
+  test            Run unit tests over all of the rules.
+  toml-lint       Cleanup files with some simple toml formatting.
+  validate-all    Check if all rules validates against a schema.
+  validate-rule   Check if a rule staged in rules dir validates against a...
+  view-rule       View an internal rule or specified rule file.
```

The [contribution guide](CONTRIBUTING.md) describes how to use the `create-rule` and `test` commands to create and test a new rule when contributing to Detection Rules.

@@ -3,8 +3,10 @@
# you may not use this file except in compliance with the Elastic License.

"""Detection rules."""
+from . import devtools
from . import docs
from . import eswrap
+from . import kbwrap
from . import main
from . import mappings
from . import misc
@@ -14,8 +16,10 @@ from . import schemas
from . import utils

__all__ = (
+    'devtools',
    'docs',
    'eswrap',
+    'kbwrap',
    'mappings',
    "main",
    'misc',
@@ -0,0 +1,203 @@
# Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
# or more contributor license agreements. Licensed under the Elastic License;
# you may not use this file except in compliance with the Elastic License.

"""CLI commands for internal detection_rules dev team."""
import glob
import io
import json
import os
import shutil
import subprocess

import click
from eql import load_dump

from . import rule_loader
from .main import root
from .misc import PYTHON_LICENSE, client_error
from .packaging import PACKAGE_FILE, Package, manage_versions, RELEASE_DIR
from .rule import Rule
from .utils import get_path


RULES_DIR = get_path('rules')


@root.group('dev')
def dev_group():
    """Commands related to the Elastic Stack rules release lifecycle."""


@dev_group.command('build-release')
@click.argument('config-file', type=click.Path(exists=True, dir_okay=False), required=False, default=PACKAGE_FILE)
@click.option('--update-version-lock', '-u', is_flag=True,
              help='Save version.lock.json file with updated rule versions in the package')
def build_release(config_file, update_version_lock):
    """Assemble all the rules into Kibana-ready release files."""
    config = load_dump(config_file)['package']
    click.echo('[+] Building package {}'.format(config.get('name')))
    package = Package.from_config(config, update_version_lock=update_version_lock, verbose=True)
    package.save()
    package.get_package_hash(verbose=True)
    click.echo('- {} rules included'.format(len(package.rules)))


@dev_group.command('update-lock-versions')
@click.argument('rule-ids', nargs=-1, required=True)
def update_lock_versions(rule_ids):
    """Update rule hashes in version.lock.json file without bumping version."""
    from .packaging import manage_versions

    if not click.confirm('Are you sure you want to update hashes without a version bump?'):
        return

    rules = [r for r in rule_loader.load_rules(verbose=False).values() if r.id in rule_ids]
    changed, new = manage_versions(rules, exclude_version_update=True, add_new=False, save_changes=True)

    if not changed:
        click.echo('No hashes updated')

    return changed


@dev_group.command('kibana-diff')
@click.option('--rule-id', '-r', multiple=True, help='Optionally specify rule ID')
@click.option('--branch', '-b', default='master', help='Specify the kibana branch to diff against')
@click.option('--threads', '-t', type=click.IntRange(1), default=50, help='Number of threads to use to download rules')
def kibana_diff(rule_id, branch, threads):
    """Diff rules against their version represented in kibana if exists."""
    from .misc import get_kibana_rules

    if rule_id:
        rules = {r.id: r for r in rule_loader.load_rules(verbose=False).values() if r.id in rule_id}
    else:
        rules = {r.id: r for r in rule_loader.get_production_rules()}

    # add versions to the rules
    manage_versions(list(rules.values()), verbose=False)
    repo_hashes = {r.id: r.get_hash() for r in rules.values()}

    kibana_rules = {r['rule_id']: r for r in get_kibana_rules(branch=branch, threads=threads).values()}
    kibana_hashes = {r['rule_id']: Rule.dict_hash(r) for r in kibana_rules.values()}

    missing_from_repo = list(set(kibana_hashes).difference(set(repo_hashes)))
    missing_from_kibana = list(set(repo_hashes).difference(set(kibana_hashes)))

    rule_diff = []
    for rid, rhash in repo_hashes.items():
        if rid in missing_from_kibana:
            continue
        if rhash != kibana_hashes[rid]:
            rule_diff.append(
                f'versions - repo: {rules[rid].contents["version"]}, kibana: {kibana_rules[rid]["version"]} -> '
                f'{rid} - {rules[rid].name}'
            )

    diff = {
        'missing_from_kibana': [f'{r} - {rules[r].name}' for r in missing_from_kibana],
        'diff': rule_diff,
        'missing_from_repo': [f'{r} - {kibana_rules[r]["name"]}' for r in missing_from_repo]
    }

    diff['stats'] = {k: len(v) for k, v in diff.items()}
    diff['stats'].update(total_repo_prod_rules=len(rules), total_gh_prod_rules=len(kibana_rules))

    click.echo(json.dumps(diff, indent=2, sort_keys=True))
    return diff

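The kibana-diff comparison above boils down to set operations over two `{rule_id: hash}` maps. Distilled into a standalone function (`diff_hashes` is a hypothetical helper for illustration, not part of the repository):

```python
def diff_hashes(repo_hashes, kibana_hashes):
    """Compare {rule_id: hash} maps the way kibana-diff does."""
    # ids present on only one side
    missing_from_repo = sorted(set(kibana_hashes) - set(repo_hashes))
    missing_from_kibana = sorted(set(repo_hashes) - set(kibana_hashes))
    # ids present on both sides whose hashes disagree
    changed = sorted(rule_id for rule_id, rule_hash in repo_hashes.items()
                     if rule_id in kibana_hashes and rule_hash != kibana_hashes[rule_id])
    return missing_from_repo, missing_from_kibana, changed
```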

@dev_group.command("kibana-commit")
@click.argument("local-repo", default=get_path("..", "kibana"))
@click.option("--kibana-directory", "-d", help="Directory to overwrite in Kibana",
              default="x-pack/plugins/security_solution/server/lib/detection_engine/rules/prepackaged_rules")
@click.option("--base-branch", "-b", help="Base branch in Kibana", default="master")
@click.option("--ssh/--http", is_flag=True, help="Method to use for cloning")
@click.option("--github-repo", "-r", help="Repository to use for the branch", default="elastic/kibana")
@click.option("--message", "-m", help="Override default commit message")
@click.pass_context
def kibana_commit(ctx, local_repo, github_repo, ssh, kibana_directory, base_branch, message):
    """Prep a commit and push to Kibana."""
    git_exe = shutil.which("git")

    package_name = load_dump(PACKAGE_FILE)['package']["name"]
    release_dir = os.path.join(RELEASE_DIR, package_name)
    message = message or f"[Detection Rules] Add {package_name} rules"

    if not os.path.exists(release_dir):
        click.secho("Release directory doesn't exist.", fg="red", err=True)
        click.echo(f"Run {click.style('python -m detection_rules build-release', bold=True)} to populate", err=True)
        ctx.exit(1)

    if not git_exe:
        click.secho("Unable to find git", err=True, fg="red")
        ctx.exit(1)

    try:
        if not os.path.exists(local_repo):
            if not click.confirm(f"Kibana repository doesn't exist at {local_repo}. Clone?"):
                ctx.exit(1)

            url = f"git@github.com:{github_repo}.git" if ssh else f"https://github.com/{github_repo}.git"
            subprocess.check_call([git_exe, "clone", url, local_repo, "--depth", "1"])

        def git(*args, show_output=False):
            method = subprocess.call if show_output else subprocess.check_output
            return method([git_exe, "-C", local_repo] + list(args), encoding="utf-8")

        git("checkout", base_branch)
        git("pull")
        git("checkout", "-b", f"rules/{package_name}", show_output=True)
        git("rm", "-r", kibana_directory)

        source_dir = os.path.join(release_dir, "rules")
        target_dir = os.path.join(local_repo, kibana_directory)
        os.makedirs(target_dir)

        for name in os.listdir(source_dir):
            _, ext = os.path.splitext(name)
            path = os.path.join(source_dir, name)

            if ext in (".ts", ".json"):
                shutil.copyfile(path, os.path.join(target_dir, name))

        git("add", kibana_directory)

        git("commit", "-S", "-m", message)
        git("status", show_output=True)

        click.echo(f"Kibana repository {local_repo} prepped. Push changes when ready")
        click.secho(f"cd {local_repo}", bold=True)

    except subprocess.CalledProcessError as e:
        client_error(e.returncode, e, ctx=ctx)


@dev_group.command('license-check')
@click.pass_context
def license_check(ctx):
    """Check that all code files contain a valid license."""

    failed = False

    for path in glob.glob(get_path("**", "*.py"), recursive=True):
        if path.startswith(get_path("env", "")):
            continue

        relative_path = os.path.relpath(path)

        with io.open(path, "rt", encoding="utf-8") as f:
            contents = f.read()

        # skip over shebang lines
        if contents.startswith("#!/"):
            _, _, contents = contents.partition("\n")

        if not contents.lstrip("\r\n").startswith(PYTHON_LICENSE):
            if not failed:
                click.echo("Missing license headers for:", err=True)

            failed = True
            click.echo(relative_path, err=True)

    ctx.exit(int(failed))
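The heart of the license check is a prefix test that tolerates a shebang line. Isolated into a standalone function (`has_license_header` is a hypothetical name for illustration; the command above inlines this logic):

```python
def has_license_header(contents, license_text):
    """Check file contents for a license header, skipping any shebang line."""
    if contents.startswith("#!/"):
        # drop the first line so scripts with shebangs still pass
        _, _, contents = contents.partition("\n")
    return contents.lstrip("\r\n").startswith(license_text)
```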
@@ -2,30 +2,20 @@
# or more contributor license agreements. Licensed under the Elastic License;
# you may not use this file except in compliance with the Elastic License.

-"""Elasticsearch cli and tmp."""
+"""Elasticsearch cli commands."""
import json
import os
import time

import click
from elasticsearch import AuthenticationException, Elasticsearch
-from kibana import Kibana, RuleResource

from .main import root
-from .misc import getdefault
-from .utils import normalize_timing_and_sort, unix_time_to_formatted, get_path
-from .rule_loader import get_rule, rta_mappings, load_rule_files, load_rules
+from .misc import client_error, getdefault
+from .utils import format_command_options, normalize_timing_and_sort, unix_time_to_formatted, get_path
+from .rule_loader import get_rule, rta_mappings

COLLECTION_DIR = get_path('collections')
-ERRORS = {
-    'NO_EVENTS': 1,
-    'FAILED_ES_AUTH': 2
-}
-
-
-@root.group('es')
-def es_group():
-    """Helper commands for integrating with Elasticsearch."""


def get_es_client(user, password, elasticsearch_url=None, cloud_id=None, **kwargs):
@@ -177,103 +167,72 @@ class CollectEvents(object):
        return Events(agent_hostname, events)


-@es_group.command('collect-events')
-@click.argument('agent-hostname')
-@click.option('--elasticsearch-url', '-u', default=getdefault("elasticsearch_url"))
-@click.option('--cloud-id', default=getdefault("cloud_id"))
-@click.option('--user', '-u', default=getdefault("user"))
-@click.option('--password', '-p', default=getdefault("password"))
-@click.option('--index', '-i', multiple=True, help='Index(es) to search against (default: all indexes)')
-@click.option('--agent-type', '-a', help='Restrict results to a source type (agent.type) ex: auditbeat')
-@click.option('--rta-name', '-r', help='Name of RTA in order to save events directly to unit tests data directory')
-@click.option('--rule-id', help='Updates rule mapping in rule-mapping.yml file (requires --rta-name)')
-@click.option('--view-events', is_flag=True, help='Print events after saving')
-def collect_events(agent_hostname, elasticsearch_url, cloud_id, user, password, index, agent_type, rta_name, rule_id,
-                   view_events):
-    """Collect events from Elasticsearch."""
-    match = {'agent.type': agent_type} if agent_type else {}
-
-    if not (cloud_id or elasticsearch_url):
-        raise click.ClickException("Missing required --cloud-id or --elasticsearch-url")
-
-    # don't prompt for these until there's a cloud id or elasticsearch URL
-    user = user or click.prompt("user")
-    password = password or click.prompt("password", hide_input=True)
-
-    try:
-        client = get_es_client(elasticsearch_url=elasticsearch_url, use_ssl=True, cloud_id=cloud_id, user=user,
-                               password=password)
-    except AuthenticationException:
-        click.secho('Failed authentication for {}'.format(elasticsearch_url or cloud_id), fg='red', err=True)
-        return ERRORS['FAILED_ES_AUTH']
-
-    try:
-        collector = CollectEvents(client)
-        events = collector.run(agent_hostname, index, **match)
-        events.save(rta_name)
-    except AssertionError:
-        click.secho('No events collected! Verify events are streaming and that the agent-hostname is correct',
-                    err=True, fg='red')
-        return ERRORS['NO_EVENTS']
-
-    if rta_name and rule_id:
-        events.evaluate_against_rule_and_update_mapping(rule_id, rta_name)
-
-    if view_events and events.events:
-        events.echo_events(pager=True)
-
-    return events
-
-
-@es_group.command('normalize-data')
+@root.command('normalize-data')
@click.argument('events-file', type=click.File('r'))
-def normalize_file(events_file):
+def normalize_data(events_file):
    """Normalize Elasticsearch data timestamps and sort."""
    file_name = os.path.splitext(os.path.basename(events_file.name))[0]
    events = Events('_', {file_name: [json.loads(e) for e in events_file.readlines()]})
    events.save(dump_dir=os.path.dirname(events_file.name))


-@root.command("kibana-upload")
-@click.argument("toml-files", nargs=-1, required=True)
-@click.option('--kibana-url', '-u', default=getdefault("kibana_url"))
+@root.group('es')
+@click.option('--elasticsearch-url', '-e', default=getdefault("elasticsearch_url"))
@click.option('--cloud-id', default=getdefault("cloud_id"))
-@click.option('--user', '-u', default=getdefault("user"))
-@click.option('--password', '-p', default=getdefault("password"))
-def kibana_upload(toml_files, kibana_url, cloud_id, user, password):
-    """Upload a list of rule .toml files to Kibana."""
-    from uuid import uuid4
-    from .packaging import manage_versions
-    from .schemas import downgrade
+@click.option('--es-user', '-u', default=getdefault("es_user"))
+@click.option('--es-password', '-p', default=getdefault("es_password"))
+@click.option('--timeout', '-t', default=60, help='Timeout for elasticsearch client')
+@click.pass_context
+def es_group(ctx: click.Context, **es_kwargs):
+    """Commands for integrating with Elasticsearch."""
+    ctx.ensure_object(dict)

-    if not (cloud_id or kibana_url):
-        raise click.ClickException("Missing required --cloud-id or --kibana-url")
+    # only initialize an es client if the subcommand is invoked without help (hacky)
+    if click.get_os_args()[-1] in ctx.help_option_names:
+        click.echo('Elasticsearch client:')
+        click.echo(format_command_options(ctx))

-    # don't prompt for these until there's a cloud id or kibana URL
-    user = user or click.prompt("user")
-    password = password or click.prompt("password", hide_input=True)
+    else:
+        if not (es_kwargs['cloud_id'] or es_kwargs['elasticsearch_url']):
+            client_error("Missing required --cloud-id or --elasticsearch-url")

-    with Kibana(cloud_id=cloud_id, url=kibana_url) as kibana:
-        kibana.login(user, password)
+        # don't prompt for these until there's a cloud id or elasticsearch URL
+        es_kwargs['es_user'] = es_kwargs['es_user'] or click.prompt("es_user")
+        es_kwargs['es_password'] = es_kwargs['es_password'] or click.prompt("es_password", hide_input=True)

-    file_lookup = load_rule_files(paths=toml_files)
-    rules = list(load_rules(file_lookup=file_lookup).values())
+        try:
+            client = get_es_client(use_ssl=True, **es_kwargs)
+            ctx.obj['es'] = client
+        except AuthenticationException as e:
+            error_msg = f'Failed authentication for {es_kwargs.get("elasticsearch_url") or es_kwargs.get("cloud_id")}'
+            client_error(error_msg, e, ctx=ctx, err=True)

-    # assign the versions from etc/versions.lock.json
-    # rules that have changed in hash get incremented, others stay as-is.
-    # rules that aren't in the lookup default to version 1
-    manage_versions(rules, verbose=False)

-    api_payloads = []
+@es_group.command('collect-events')
+@click.argument('agent-hostname')
+@click.option('--index', '-i', multiple=True, help='Index(es) to search against (default: all indexes)')
+@click.option('--agent-type', '-a', help='Restrict results to a source type (agent.type) ex: auditbeat')
+@click.option('--rta-name', '-r', help='Name of RTA in order to save events directly to unit tests data directory')
+@click.option('--rule-id', help='Updates rule mapping in rule-mapping.yml file (requires --rta-name)')
+@click.option('--view-events', is_flag=True, help='Print events after saving')
+@click.pass_context
+def collect_events(ctx, agent_hostname, index, agent_type, rta_name, rule_id, view_events):
+    """Collect events from Elasticsearch."""
+    match = {'agent.type': agent_type} if agent_type else {}
+    client = ctx.obj['es']

-    for rule in rules:
-        payload = rule.contents.copy()
-        meta = payload.setdefault("meta", {})
-        meta["original"] = dict(id=rule.id, **rule.metadata)
-        payload["rule_id"] = str(uuid4())
-        payload = downgrade(payload, kibana.version)
-        rule = RuleResource(payload)
-        api_payloads.append(rule)
+    try:
+        collector = CollectEvents(client)
+        events = collector.run(agent_hostname, index, **match)
+        events.save(rta_name)

-    rules = RuleResource.bulk_create(api_payloads)
-    click.echo(f"Successfully uploaded {len(rules)} rules")
+        if rta_name and rule_id:
+            events.evaluate_against_rule_and_update_mapping(rule_id, rta_name)

+        if view_events and events.events:
+            events.echo_events(pager=True)

+        return events
+    except AssertionError as e:
+        error_msg = 'No events collected! Verify events are streaming and that the agent-hostname is correct'
+        client_error(error_msg, e, ctx=ctx)
@@ -0,0 +1,73 @@
# Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
# or more contributor license agreements. Licensed under the Elastic License;
# you may not use this file except in compliance with the Elastic License.

"""Kibana cli commands."""
import click
from kibana import Kibana, RuleResource

from .main import root
from .misc import client_error, getdefault
from .rule_loader import load_rule_files, load_rules
from .utils import format_command_options


@root.group('kibana')
@click.option('--kibana-url', '-k', default=getdefault('kibana_url'))
@click.option('--cloud-id', default=getdefault('cloud_id'))
@click.option('--kibana-user', '-u', default=getdefault('kibana_user'))
@click.option('--kibana-password', '-p', default=getdefault("kibana_password"))
@click.pass_context
def kibana_group(ctx: click.Context, **kibana_kwargs):
    """Commands for integrating with Kibana."""
    ctx.ensure_object(dict)

    # only initialize a kibana client if the subcommand is invoked without help (hacky)
    if click.get_os_args()[-1] in ctx.help_option_names:
        click.echo('Kibana client:')
        click.echo(format_command_options(ctx))

    else:
        if not (kibana_kwargs['cloud_id'] or kibana_kwargs['kibana_url']):
            client_error("Missing required --cloud-id or --kibana-url")

        # don't prompt for these until there's a cloud id or Kibana URL
        kibana_user = kibana_kwargs.pop('kibana_user', None) or click.prompt("kibana_user")
        kibana_password = kibana_kwargs.pop('kibana_password', None) or click.prompt("kibana_password", hide_input=True)

        with Kibana(**kibana_kwargs) as kibana:
            kibana.login(kibana_user, kibana_password)
            ctx.obj['kibana'] = kibana


@kibana_group.command("upload-rule")
@click.argument("toml-files", nargs=-1, required=True)
@click.pass_context
def upload_rule(ctx, toml_files):
    """Upload a list of rule .toml files to Kibana."""
    from uuid import uuid4
    from .packaging import manage_versions
    from .schemas import downgrade

    kibana = ctx.obj['kibana']
    file_lookup = load_rule_files(paths=toml_files)
    rules = list(load_rules(file_lookup=file_lookup).values())

    # assign the versions from etc/versions.lock.json
    # rules that have changed in hash get incremented, others stay as-is.
    # rules that aren't in the lookup default to version 1
    manage_versions(rules, verbose=False)

    api_payloads = []

    for rule in rules:
        payload = rule.contents.copy()
        meta = payload.setdefault("meta", {})
        meta["original"] = dict(id=rule.id, **rule.metadata)
        payload["rule_id"] = str(uuid4())
        payload = downgrade(payload, kibana.version)
        rule = RuleResource(payload)
        api_payloads.append(rule)

    rules = RuleResource.bulk_create(api_payloads)
    click.echo(f"Successfully uploaded {len(rules)} rules")
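The per-rule transformation in upload-rule can be isolated into a pure function: stash the original id and metadata under `meta.original`, then assign a fresh `rule_id` so the uploaded copy does not collide with the prepackaged rule. A sketch of just that step (`prepare_upload_payload` is a hypothetical helper; the real command also downgrades each payload to the target Kibana version):

```python
import uuid


def prepare_upload_payload(rule_contents, original_id, metadata):
    """Prepare a rule dict for upload as a custom rule."""
    payload = dict(rule_contents)
    # keep a breadcrumb back to the original rule and its metadata
    meta = payload.setdefault("meta", {})
    meta["original"] = dict(id=original_id, **metadata)
    # a fresh id avoids colliding with the prepackaged rule in Kibana
    payload["rule_id"] = str(uuid.uuid4())
    return payload
```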
@@ -4,21 +4,16 @@
|
||||
|
||||
"""CLI commands for detection_rules."""
|
||||
import glob
|
||||
import io
|
||||
import json
|
||||
import os
|
||||
import re
|
||||
import shutil
|
||||
import subprocess
|
||||
|
||||
import click
|
||||
import jsonschema
|
||||
import pytoml
|
||||
from eql import load_dump
|
||||
|
||||
from .misc import PYTHON_LICENSE, nested_set
|
||||
from . import rule_loader
|
||||
from .packaging import PACKAGE_FILE, Package, manage_versions, RELEASE_DIR
|
||||
from .misc import client_error, nested_set, parse_config
|
||||
from .rule import Rule
|
||||
from .rule_formatter import toml_write
|
||||
from .schemas import CurrentSchema
|
||||
@@ -29,8 +24,15 @@ RULES_DIR = get_path('rules')
|
||||
|
||||
|
||||
@click.group('detection-rules', context_settings={'help_option_names': ['-h', '--help']})
|
||||
def root():
|
||||
@click.option('--debug/--no-debug', '-D/-N', is_flag=True, default=None,
|
||||
help='Print full exception stacktrace on errors')
|
||||
@click.pass_context
|
||||
def root(ctx, debug):
|
||||
"""Commands for detection-rules repository."""
|
||||
debug = debug if debug is not None else parse_config().get('debug')
|
||||
ctx.obj = {'debug': debug}
|
||||
if debug:
|
||||
click.secho('DEBUG MODE ENABLED', fg='yellow')
|
||||
|
||||
|
||||
@root.command('create-rule')
|
||||
@@ -140,15 +142,12 @@ def view_rule(ctx, rule_id, rule_file, api_format):
|
||||
try:
|
||||
rule = Rule(rule_file, contents)
|
||||
except jsonschema.ValidationError as e:
|
||||
click.secho(e.args[0], fg='red')
|
||||
ctx.exit(1)
|
||||
client_error(f'Rule: {rule_id or os.path.basename(rule_file)} failed validation', e, ctx=ctx)
|
||||
else:
|
||||
click.secho('Unknown rule!', fg='red')
|
||||
ctx.exit(1)
|
||||
client_error('Unknown rule!')
|
||||
|
||||
if not rule:
|
||||
click.secho('Unknown format!', fg='red')
|
||||
ctx.exit(1)
|
||||
client_error('Unknown format!')
|
||||
|
||||
click.echo(toml_write(rule.rule_format()) if not api_format else
|
||||
json.dumps(rule.contents, indent=2, sort_keys=True))
|
||||
```diff
@@ -160,51 +159,19 @@ def view_rule(ctx, rule_id, rule_file, api_format):
 @click.argument('rule-id', required=False)
 @click.option('--rule-name', '-n')
 @click.option('--path', '-p', type=click.Path(dir_okay=False))
-def validate_rule(rule_id, rule_name, path):
-    """Check if a rule staged in rules dir validates against a schema."""
-    rule = rule_loader.get_rule(rule_id, rule_name, path, verbose=False)
-
-    if not rule:
-        return click.secho('Rule not found!', fg='red')
-
-    try:
-        rule.validate(as_rule=True)
-    except jsonschema.ValidationError as e:
-        click.echo(e)
-
-    click.echo('Rule validation successful')
-
-    return rule
-
-
-@root.command('license-check')
 @click.pass_context
-def license_check(ctx):
-    """Check that all code files contain a valid license."""
-    failed = False
-
-    for path in glob.glob(get_path("**", "*.py"), recursive=True):
-        if path.startswith(get_path("env", "")):
-            continue
-
-        relative_path = os.path.relpath(path)
-
-        with io.open(path, "rt", encoding="utf-8") as f:
-            contents = f.read()
-
-        # skip over shebang lines
-        if contents.startswith("#!/"):
-            _, _, contents = contents.partition("\n")
-
-        if not contents.lstrip("\r\n").startswith(PYTHON_LICENSE):
-            if not failed:
-                click.echo("Missing license headers for:", err=True)
-
-            failed = True
-            click.echo(relative_path, err=True)
-
-    ctx.exit(int(failed))
+def validate_rule(ctx, rule_id, rule_name, path):
+    """Check if a rule staged in rules dir validates against a schema."""
+    try:
+        rule = rule_loader.get_rule(rule_id, rule_name, path, verbose=False)
+        if not rule:
+            client_error('Rule not found!')
+
+        rule.validate(as_rule=True)
+        click.echo('Rule validation successful')
+        return rule
+    except jsonschema.ValidationError as e:
+        client_error(e.args[0], e, ctx=ctx)


 @root.command('validate-all')
```
```diff
@@ -268,84 +235,6 @@ def search_rules(query, columns, language, verbose=True):
     return filtered


-@root.command('build-release')
-@click.argument('config-file', type=click.Path(exists=True, dir_okay=False), required=False, default=PACKAGE_FILE)
-@click.option('--update-version-lock', '-u', is_flag=True,
-              help='Save version.lock.json file with updated rule versions in the package')
-def build_release(config_file, update_version_lock):
-    """Assemble all the rules into Kibana-ready release files."""
-    config = load_dump(config_file)['package']
-    click.echo('[+] Building package {}'.format(config.get('name')))
-    package = Package.from_config(config, update_version_lock=update_version_lock, verbose=True)
-    package.save()
-    package.get_package_hash(verbose=True)
-    click.echo('- {} rules included'.format(len(package.rules)))
-
-
-@root.command('update-lock-versions')
-@click.argument('rule-ids', nargs=-1, required=True)
-def update_lock_versions(rule_ids):
-    """Update rule hashes in version.lock.json file without bumping version."""
-    from .packaging import manage_versions
-
-    if not click.confirm('Are you sure you want to update hashes without a version bump?'):
-        return
-
-    rules = [r for r in rule_loader.load_rules(verbose=False).values() if r.id in rule_ids]
-    changed, new = manage_versions(rules, exclude_version_update=True, add_new=False, save_changes=True)
-
-    if not changed:
-        click.echo('No hashes updated')
-
-    return changed
-
-
-@root.command('kibana-diff')
-@click.option('--rule-id', '-r', multiple=True, help='Optionally specify rule ID')
-@click.option('--branch', '-b', default='master', help='Specify the kibana branch to diff against')
-@click.option('--threads', '-t', type=click.IntRange(1), default=50, help='Number of threads to use to download rules')
-def kibana_diff(rule_id, branch, threads):
-    """Diff rules against their version represented in kibana if exists."""
-    from .misc import get_kibana_rules
-
-    if rule_id:
-        rules = {r.id: r for r in rule_loader.load_rules(verbose=False).values() if r.id in rule_id}
-    else:
-        rules = {r.id: r for r in rule_loader.get_production_rules()}
-
-    # add versions to the rules
-    manage_versions(list(rules.values()), verbose=False)
-    repo_hashes = {r.id: r.get_hash() for r in rules.values()}
-
-    kibana_rules = {r['rule_id']: r for r in get_kibana_rules(branch=branch, threads=threads).values()}
-    kibana_hashes = {r['rule_id']: Rule.dict_hash(r) for r in kibana_rules.values()}
-
-    missing_from_repo = list(set(kibana_hashes).difference(set(repo_hashes)))
-    missing_from_kibana = list(set(repo_hashes).difference(set(kibana_hashes)))
-
-    rule_diff = []
-    for rid, rhash in repo_hashes.items():
-        if rid in missing_from_kibana:
-            continue
-        if rhash != kibana_hashes[rid]:
-            rule_diff.append(
-                f'versions - repo: {rules[rid].contents["version"]}, kibana: {kibana_rules[rid]["version"]} -> '
-                f'{rid} - {rules[rid].name}'
-            )
-
-    diff = {
-        'missing_from_kibana': [f'{r} - {rules[r].name}' for r in missing_from_kibana],
-        'diff': rule_diff,
-        'missing_from_repo': [f'{r} - {kibana_rules[r]["name"]}' for r in missing_from_repo]
-    }
-
-    diff['stats'] = {k: len(v) for k, v in diff.items()}
-    diff['stats'].update(total_repo_prod_rules=len(rules), total_gh_prod_rules=len(kibana_rules))
-
-    click.echo(json.dumps(diff, indent=2, sort_keys=True))
-    return diff
-
-
 @root.command("test")
 @click.pass_context
 def test_rules(ctx):
```
```diff
@@ -354,69 +243,3 @@ def test_rules(ctx):

     clear_caches()
     ctx.exit(pytest.main(["-v"]))
-
-
-@root.command("kibana-commit")
-@click.argument("local-repo", default=get_path("..", "kibana"))
-@click.option("--kibana-directory", "-d", help="Directory to overwrite in Kibana",
-              default="x-pack/plugins/security_solution/server/lib/detection_engine/rules/prepackaged_rules")
-@click.option("--base-branch", "-b", help="Base branch in Kibana", default="master")
-@click.option("--ssh/--http", is_flag=True, help="Method to use for cloning")
-@click.option("--github-repo", "-r", help="Repository to use for the branch", default="elastic/kibana")
-@click.option("--message", "-m", help="Override default commit message")
-@click.pass_context
-def kibana_commit(ctx, local_repo, github_repo, ssh, kibana_directory, base_branch, message):
-    """Prep a commit and push to Kibana."""
-    git_exe = shutil.which("git")
-
-    package_name = load_dump(PACKAGE_FILE)['package']["name"]
-    release_dir = os.path.join(RELEASE_DIR, package_name)
-    message = message or f"[Detection Rules] Add {package_name} rules"
-
-    if not os.path.exists(release_dir):
-        click.secho("Release directory doesn't exist.", fg="red", err=True)
-        click.echo(f"Run {click.style('python -m detection_rules build-release', bold=True)} to populate", err=True)
-        ctx.exit(1)
-
-    if not git_exe:
-        click.secho("Unable to find git", err=True, fg="red")
-        ctx.exit(1)
-
-    try:
-        if not os.path.exists(local_repo):
-            if not click.confirm(f"Kibana repository doesn't exist at {local_repo}. Clone?"):
-                ctx.exit(1)
-
-            url = f"git@github.com:{github_repo}.git" if ssh else f"https://github.com/{github_repo}.git"
-            subprocess.check_call([git_exe, "clone", url, local_repo, "--depth", 1])
-
-        def git(*args, show_output=False):
-            method = subprocess.call if show_output else subprocess.check_output
-            return method([git_exe, "-C", local_repo] + list(args), encoding="utf-8")
-
-        git("checkout", base_branch)
-        git("pull")
-        git("checkout", "-b", f"rules/{package_name}", show_output=True)
-        git("rm", "-r", kibana_directory)
-
-        source_dir = os.path.join(release_dir, "rules")
-        target_dir = os.path.join(local_repo, kibana_directory)
-        os.makedirs(target_dir)
-
-        for name in os.listdir(source_dir):
-            _, ext = os.path.splitext(name)
-            path = os.path.join(source_dir, name)
-
-            if ext in (".ts", ".json"):
-                shutil.copyfile(path, os.path.join(target_dir, name))
-
-        git("add", kibana_directory)
-
-        git("commit", "-S", "-m", message)
-        git("status", show_output=True)
-
-        click.echo(f"Kibana repository {local_repo} prepped. Push changes when ready")
-        click.secho(f"cd {local_repo}", bold=True)
-
-    except subprocess.CalledProcessError as exc:
-        ctx.exit(exc.returncode)
```
```diff
@@ -31,6 +31,31 @@ JS_LICENSE = """
 """.strip().format("\n".join(' * ' + line for line in LICENSE_LINES))


+class ClientError(click.ClickException):
+    """Custom CLI error to format output or full debug stacktrace."""
+
+    def __init__(self, message, original_error=None):
+        super(ClientError, self).__init__(message)
+        self.original_error = original_error
+
+    def show(self, file=None, err=True):
+        """Print the error to the console."""
+        err = f' ({self.original_error})' if self.original_error else ''
+        click.echo(f'{click.style(f"CLI Error{err}", fg="red", bold=True)}: {self.format_message()}',
+                   err=err, file=file)
+
+
+def client_error(message, exc: Exception = None, debug=None, ctx: click.Context = None, file=None, err=None):
+    config_debug = True if ctx and ctx.ensure_object(dict) and ctx.obj.get('debug') is True else False
+    debug = debug if debug is not None else config_debug
+
+    if debug:
+        click.echo(click.style('DEBUG: ', fg='yellow') + message, err=err, file=file)
+        raise
+    else:
+        raise ClientError(message, original_error=type(exc).__name__)
+
+
 def nested_get(_dict, dot_key, default=None):
     """Get a nested field from a nested dict with dot notation."""
     if _dict is None or dot_key is None:
```
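The `client_error` helper above centralizes the old `click.secho(...); ctx.exit(1)` pairs: it wraps the original exception in a `ClientError` for clean output, or re-raises it untouched when debug mode is on. A stdlib-only sketch of that wrap-or-reraise behavior (no click dependency; names and simplified signature are illustrative):

```python
class ClientError(Exception):
    """CLI error carrying the name of the exception it wraps."""

    def __init__(self, message, original_error=None):
        super().__init__(message)
        self.original_error = original_error


def client_error(message, exc=None, debug=False):
    """Raise a formatted ClientError, or the raw exception when debugging."""
    if debug and exc is not None:
        # surface the full stacktrace of the underlying failure
        raise exc
    raise ClientError(message, original_error=type(exc).__name__)
```

Commands then fail with one readable line by default, while `--debug` preserves the original traceback for development.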
```diff
@@ -113,7 +113,8 @@ def load_rules(file_lookup=None, verbose=True, error=True):
             err_msg = "Invalid rule file in {}\n{}".format(rule_file, click.style(e.args[0], fg='red'))
             errors.append(err_msg)
             if error:
-                print(err_msg)
+                if verbose:
+                    print(err_msg)
                 raise e

     if failed:
@@ -184,6 +185,7 @@ rta_mappings = RtaMappings()


 __all__ = (
+    "load_rule_files",
     "load_rules",
     "get_file_name",
     "get_production_rules",
@@ -232,3 +232,23 @@ def load_rule_contents(rule_file: str, single_only=False) -> list:
         return rule
     else:
         raise ValueError(f"Expected a list or dictionary in {rule_file}")
+
+
+def format_command_options(ctx):
+    """Echo options for a click command."""
+    formatter = ctx.make_formatter()
+    opts = []
+
+    for param in ctx.command.get_params(ctx):
+        if param.name == 'help':
+            continue
+
+        rv = param.get_help_record(ctx)
+        if rv is not None:
+            opts.append(rv)
+
+    if opts:
+        with formatter.section('Options'):
+            formatter.write_dl(opts)
+
+    return formatter.getvalue()
```
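The added `format_command_options` helper renders only a command's Options section, which lets the `es`/`kibana` groups fold client option listings into their own help text. A rough usage sketch against a throwaway command (the helper body mirrors the diff above; the `demo` command is hypothetical):

```python
import click


def format_command_options(ctx):
    """Echo options for a click command."""
    formatter = ctx.make_formatter()
    opts = []

    for param in ctx.command.get_params(ctx):
        if param.name == 'help':
            continue

        rv = param.get_help_record(ctx)
        if rv is not None:
            opts.append(rv)

    if opts:
        with formatter.section('Options'):
            formatter.write_dl(opts)

    return formatter.getvalue()


@click.command()
@click.option('--kibana-user', '-k', help='Username for the Kibana client')
def demo(kibana_user):
    """Throwaway command used only to render its options."""


output = format_command_options(click.Context(demo))
```

`ctx.make_formatter()` reuses click's own help formatter, so the rendered section matches what `--help` would print.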
```diff
@@ -20,14 +20,14 @@ class Kibana(object):

     CACHED = False

-    def __init__(self, cloud_id=None, url=None, verify=True, elasticsearch=None):
+    def __init__(self, cloud_id=None, kibana_url=None, verify=True, elasticsearch=None):
         """"Open a session to the platform."""
         self.authenticated = False
         self.session = requests.Session()
         self.session.verify = verify

         self.cloud_id = cloud_id
-        self.kibana_url = url
+        self.kibana_url = kibana_url
         self.elastic_url = None
         self.status = None

@@ -99,9 +99,9 @@ class Kibana(object):
         """Perform an HTTP DELETE."""
         return self.request('DELETE', uri, params=params, error=error)

-    def login(self, username, password):
+    def login(self, kibana_username, kibana_password):
         """Authenticate to Kibana using the API to update our cookies."""
-        payload = {'username': username, 'password': password}
+        payload = {'username': kibana_username, 'password': kibana_password}
         path = '/internal/security/login'

         self.post(path, data=payload, error=True)
@@ -110,7 +110,7 @@ class Kibana(object):

         # create ES and force authentication
         if self.elasticsearch is None and self.elastic_url is not None:
-            self.elasticsearch = Elasticsearch(hosts=[self.elastic_url], http_auth=(username, password))
+            self.elasticsearch = Elasticsearch(hosts=[self.elastic_url], http_auth=(kibana_username, kibana_password))
             self.elasticsearch.info()

         # make chaining easier
```
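Renaming the generic `username`/`password` parameters to `kibana_*` (and their `es_*` counterparts) lets env/config parsing route each setting to the right client by name prefix alone, as the commit message notes. A hypothetical sketch of that routing (the `DR_*` variable names and helper are illustrative, not the repo's actual parser):

```python
def split_client_config(variables, prefix='DR_'):
    """Route flat env/config entries to the kibana or es client by name prefix."""
    clients = {'kibana': {}, 'es': {}}

    for key, value in variables.items():
        if not key.startswith(prefix):
            continue

        name = key[len(prefix):].lower()
        for client in clients:
            marker = client + '_'
            if name.startswith(marker):
                # e.g. DR_KIBANA_USERNAME -> clients['kibana']['username']
                clients[client][name[len(marker):]] = value

    return clients
```

With distinct names, `kibana_username` and `es_username` can coexist in one flat namespace without clobbering each other.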