Compare commits

...

79 Commits

Author SHA1 Message Date
Thomas Patzke 0cdfc776de Sigma tools release 0.5 2018-07-03 00:07:43 +02:00
Thomas Patzke 3e40a48ce1 Merge branch 'SaltyHash123-master' 2018-07-02 23:31:43 +02:00
Thomas Patzke 0bacba05aa Added backend 'splunkxml' to CI tests 2018-07-02 23:20:02 +02:00
Thomas Patzke 67158ba1d2 Merge branch 'master' of https://github.com/SaltyHash123/sigma into SaltyHash123-master 2018-07-02 23:14:04 +02:00
Florian Roth 48582a1c93 Bugfix in Flash Downloader Rule 2018-06-30 23:39:38 +02:00
Florian Roth 2a74a62c67 Config file for SPARK scanner 2018-06-29 16:42:16 +02:00
Florian Roth c3bf968462 High FP Rule 2018-06-29 16:01:46 +02:00
Florian Roth c26c3ee426 Trying to fix rule 2018-06-28 16:39:47 +02:00
Florian Roth fa98595ad6 Added SPARK Sigma rule scan feature to list 2018-06-28 16:28:07 +02:00
Florian Roth 9e0abc5f0b Adjusted rules to the new specs reg "not null" usage 2018-06-28 09:30:31 +02:00
Florian Roth 336f4c83e0 Merge pull request #97 from scherma/patch-1
False positive circumstance
2018-06-27 23:18:56 +02:00
scherma 19ba5df207 False positive circumstance 2018-06-27 21:14:38 +01:00
Florian Roth 86e6518764 Changed (any) statements to (not null) to comply with the newest specs 2018-06-27 20:57:58 +02:00
Florian Roth a61052fc0a Rule fixes 2018-06-27 18:47:52 +02:00
Florian Roth 9705366060 Adjusted some rules 2018-06-27 16:54:44 +02:00
Florian Roth fc72bd16af Fixed bugs 2018-06-27 09:20:41 +02:00
Thomas Patzke c3d582bc13 Cleanup 2018-06-26 23:37:21 +02:00
Florian Roth 5843fe2590 Update README.md 2018-06-25 18:59:36 +02:00
Florian Roth 467b8c80f4 Update README.md 2018-06-25 18:58:05 +02:00
Florian Roth 2ae57166ac Updated README 2018-06-25 18:29:02 +02:00
Florian Roth 3283c52c0f Added WDATP in the list of supported backends 2018-06-25 18:09:21 +02:00
Florian Roth f4b150def8 Rule: Powershell remote thread creation in Rundll32 2018-06-25 15:23:19 +02:00
Florian Roth 1a1011b0ad Merge pull request #96 from yt0ng/master
Detects the creation of a schtask via PowerSploit Default Configuration
2018-06-23 17:15:14 +02:00
yt0ng c59d0c7dca Added additional options 2018-06-23 15:54:31 +02:00
yt0ng cc3fd9f5d0 Detects the creation of a schtask via PowerSploit Default Configuration
https://github.com/0xdeadbeefJERKY/PowerSploit/blob/8690399ef70d2cad10213575ac67e8fa90ddf7c3/Persistence/Persistence.psm1
2018-06-23 15:45:58 +02:00
Roey 14464f8c79 Added support of splunk dashboards (xml) 2018-06-22 14:17:58 +02:00
Florian Roth 28a7e64212 Rule: Sysprep on AppData folder 2018-06-22 14:02:55 +02:00
Thomas Patzke 7d1b801858 Merge branch 'devel-sigmac-wdatp' 2018-06-22 00:43:23 +02:00
Thomas Patzke d8e036f737 sigmac: Parameter for ignoring "not supported" errors
Used to pass tests with complete rule set that would fail for backends
which target systems don't support required features.
2018-06-22 00:23:59 +02:00
Thomas Patzke 31727b3b25 Added Windows Defender ATP backend
Missing:
* Aggregations
2018-06-22 00:03:10 +02:00
Thomas Patzke df6ad82770 Removed redundant attribute from rule
EventID 4657 already implies the modification.
2018-06-21 23:59:55 +02:00
Thomas Patzke e72c0d5de4 SingleTextQueryBackend ignores empty components in composed queries
Example: one component of a AND-composition is ignored if invoked
generate* call returns None.
2018-06-21 23:59:41 +02:00
Thomas Patzke d8a7bcad39 Reordered rule generation
Generation of query parts before and after main query gives access to
information possibly gathered while main query generation.
2018-06-21 23:50:13 +02:00
Florian Roth b05856eae1 Rule: Update suspicious TLD downloads 2018-06-13 00:08:46 +02:00
Florian Roth 3d52030391 Changed help text for -r flag 2018-06-13 00:08:46 +02:00
Florian Roth 946c946366 Rule: NTLM logon 2018-06-13 00:08:46 +02:00
Florian Roth 7edd95744a Windows NTLM 2018-06-13 00:08:46 +02:00
Florian Roth e23cdafb85 Rule: Fixed missing description 2018-06-13 00:08:46 +02:00
Florian Roth c9658074dd Removed "not yet implemented" comment from -r flag 2018-06-13 00:08:46 +02:00
Florian Roth df2745ec6c Merge pull request #92 from yt0ng/patch-2
Update proxy_ua_apt.yml
2018-06-10 10:29:16 +02:00
Florian Roth f6f718c54f Cosmetics 2018-06-10 10:28:59 +02:00
yt0ng 3166bf5b05 Update proxy_ua_apt.yml
user Agent seen in https://www.hybrid-analysis.com/sample/a80e29c0757bee05338fd5c22a542d852ad86c477068e3eb4aacc1c3e59e2eef?environmentId=100
2018-06-10 10:17:02 +02:00
Thomas Patzke dbc25b6bfa Integrated Qualys backend to CI testing 2018-06-07 23:33:47 +02:00
Thomas Patzke f6d5e5dd99 Sigmac parameter -I now ignores all backend errors
New backends introduced further exceptions and the intention of -I is to
get a successful run.
2018-06-07 23:33:12 +02:00
Thomas Patzke 8ddb369df3 Integration of Qualys backend
* Changed description text to one-liner
* Output to intended class
* Minor code optimizations
2018-06-07 23:31:09 +02:00
Thomas Patzke ce9db548ff Integration of ArcSight backend
* Rename
* Changed description to one line to beautify output of backend list
* Small bugfix in handling of numeric values
2018-06-07 23:04:36 +02:00
Thomas Patzke 17c894005c Merge branch 'master' of https://github.com/socprime/sigma into socprime-backends 2018-06-07 22:18:51 +02:00
nikotin d13e8d7bd3 Added ArcSight & Qualys backends 2018-06-07 16:18:23 +03:00
Florian Roth bd61f223ee Sofacy Zebrocy samples 2018-06-06 23:24:18 +02:00
Florian Roth 667b3b4935 Rule: Added 2 more Sofacy User-Agents 2018-06-06 22:38:50 +02:00
Florian Roth 9640806678 Rules: Telegram Bot API access 2018-06-05 16:25:43 +02:00
Florian Roth 9c817a493b Rule: DCSync 2018-06-03 16:00:57 +02:00
Florian Roth d1d4473505 Rule: ADS with executable
https://twitter.com/0xrawsec/status/1002478725605273600
2018-06-03 02:08:57 +02:00
Florian Roth 4eabc5ea5c Sigmac Usage 2018-06-01 10:33:11 +02:00
Florian Roth 8e500d2caa Bugfix in rule 2018-05-29 14:11:12 +02:00
Florian Roth 0d97522b5a Merge pull request #88 from noraj/patch-1
enhance web server paths
2018-05-29 11:54:46 +02:00
Alexandre ZANNI 74da324d8f remove old public_html
remove old public_html
2018-05-29 11:44:38 +02:00
Alexandre ZANNI a1de770b64 enhance web server paths
- specify when it is apache only
- add Per-user path
- add archlinux paths
2018-05-29 11:41:36 +02:00
Florian Roth f9596c1ae0 MISP added 2018-05-28 09:15:48 +02:00
Florian Roth fc8a21fac5 Evt2Sigma 2018-05-28 09:13:08 +02:00
Florian Roth 51c6d0a767 Rule: Proxy User-Agent VPNFilter 2018-05-24 00:34:07 +02:00
Florian Roth 65cc78f9e8 Windows Config Update - DNS logs 2018-05-22 16:59:58 +02:00
Florian Roth 2db00b8559 Rule: whoami execution 2018-05-22 16:59:58 +02:00
Thomas Patzke bd23946f06 Merge of Graylog backend pull request 2018-05-18 15:55:02 +02:00
Thomas Patzke 21040f04cc Added CI test for Graylog backend 2018-05-18 15:53:25 +02:00
Thomas Patzke b28480495e Merge branch 'master' of https://github.com/DefenceLogic/sigma into DefenceLogic-master 2018-05-18 15:49:19 +02:00
Thomas Patzke 079c04f28d Fixed rule scope 2018-05-18 14:23:52 +02:00
Paul Dutot 715a88542d Graylog backend added 2018-05-17 15:51:25 +01:00
Paul Dutot 05e108a4d1 Merge pull request #1 from Neo23x0/master
Updating Fork
2018-05-17 10:49:54 +01:00
Florian Roth 1fd4172832 Merge pull request #84 from mgreen27/patch-1
Update_WebDAV
2018-05-17 09:40:32 +02:00
Florian Roth 57dc02aa9f Merge pull request #85 from HacknowledgeCH/es-dsl-patch
patched es-dsl
2018-05-17 09:39:55 +02:00
milkmix 37ee355a77 patched es-dsl 2018-05-17 08:44:50 +02:00
Matthew Green 16365b7793 Update_WebDAV
Made the name a bit generic as WebDAV can be used by several download cradles.
Added in HttpMethod as a select as GET requests makes for a great filter point with much less false positives.
2018-05-16 13:05:15 +10:00
Thomas Patzke 33ffd2683e Disabled failing pypy3 build 2018-05-13 22:52:25 +02:00
Thomas Patzke 738d03c751 Fixed position of line separation if rulecomment and verbose is active 2018-05-13 22:36:51 +02:00
Thomas Patzke 6a3fcdc68c Unified 0x values with other rules 2018-05-13 22:28:43 +02:00
Florian Roth 429ae0729a README Update 2018-05-12 08:33:31 +02:00
Florian Roth 1aaed07dd7 Rule: Suspicious base64 encoded part of DNS query 2018-05-10 14:08:52 +02:00
Florian Roth 62b490396d Rule: Cobalt Strike DNS Beaconing 2018-05-10 14:08:52 +02:00
43 changed files with 1211 additions and 187 deletions
@@ -2,7 +2,6 @@ language: python
python:
- 3.5
- 3.6
- pypy3
services:
- elasticsearch
cache: pip
@@ -18,10 +18,15 @@ test-sigmac:
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t es-qs rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -O rulecomment -rvdI -t es-qs rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t kibana rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t graylog rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t xpack-watcher rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t splunk rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t splunkxml rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t logpoint rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t wdatp rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t es-dsl rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t arcsight -c tools/config/arcsight.yml rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t qualys -c tools/config/qualys.yml rules/ > /dev/null
coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t splunk -f 'level>=high,level<=critical,status=stable,logsource=windows' rules/ > /dev/null
! coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t splunk -f 'level>=high,level<=critical,status=xstable,logsource=windows' rules/ > /dev/null
! coverage run -a --include=$(COVSCOPE) tools/sigmac -rvdI -t splunk -f 'level>=high,level<=xcritical,status=stable,logsource=windows' rules/ > /dev/null
@@ -56,7 +61,7 @@ test-sigmac:
! coverage run -a --include=$(COVSCOPE) tools/sigmac -t es-qs -c not_existing rules/windows/sysmon/sysmon_mimikatz_detection_lsass.yml
! coverage run -a --include=$(COVSCOPE) tools/sigmac -t es-qs -c tests/invalid_yaml.yml rules/windows/sysmon/sysmon_mimikatz_detection_lsass.yml
! coverage run -a --include=$(COVSCOPE) tools/sigmac -t es-qs -c tests/invalid_config.yml rules/windows/sysmon/sysmon_mimikatz_detection_lsass.yml
! coverage run -a --include=$(COVSCOPE) tools/sigmac -rvI -c tools/config/elk-defaultindex.yml -t kibana rules/ > /dev/null
! coverage run -a --include=$(COVSCOPE) tools/sigmac -rv -c tools/config/elk-defaultindex.yml -t kibana rules/ > /dev/null
test-merge:
tests/test-merge.sh
@@ -26,26 +26,17 @@ This repository contains:
# Use Cases
* Describe your once discovered detection method in Sigma to make it sharable
* Share the signature in the appendix of your analysis along with file hashes and C2 servers
* Describe your detection method in Sigma to make it sharable
* Write your SIEM searches in Sigma to avoid a vendor lock-in
* Share the signature in the appendix of your analysis along with IOCs and YARA rules
* Share the signature in threat intel communities - e.g. via MISP
* Provide Sigma signatures for malicious behaviour in your own application (Error messages, access violations, manipulations)
* Integrate a new log into your SIEM and check the Sigma repository for available rules
* Write a rule converter for your custom log analysis tool and process new Sigma rules automatically
* Provide a free or commercial feed for Sigma signatures
* Provide Sigma signatures for malicious behaviour in your own application
# Why Sigma
Today, everyone collects log data for analysis. People start working on their own, processing numerous white papers, blog posts and log analysis guidelines, extracting the necessary information and building their own searches and dashboards. Some of their searches and correlations are great and very useful, but they lack a standardized format in which they can share their work with others.
Others provide excellent analyses for threat groups, sharing file indicators, C2 servers and YARA rules to detect the malicious files, but describe a certain malicious service install or remote thread injection in a separate paragraph. Security analysts who read that paragraph then have to extract the necessary information and create rules in their SIEM systems. The detection method never finds its way into a repository that is shared, structured and archived.
The lower layers of the OSI model are well known and described. Every SIEM vendor has rules to detect port scans, ping sweeps and threats like the ['smurf attack'](https://en.wikipedia.org/wiki/Smurf_attack). But the higher layers contain numerous applications and protocols with special characteristics that write their own custom log files. SIEM vendors consider these signatures and correlations their intellectual property and tend not to share details on their coverage.
Sigma is meant to be an open standard in which detection mechanisms can be defined, shared and collected in order to improve the detection capabilities on the application layers for everyone.
![sigma_why](./images/Problem_OSI_v01.png)
Others provide excellent analyses, include IOCs and YARA rules to detect the malicious files and network connections, but have no way to describe a specific or generic detection method in log events. Sigma is meant to be an open standard in which such detection mechanisms can be defined, shared and collected in order to improve the detection capabilities for everyone.
## Slides
@@ -61,8 +52,19 @@ The current specification is a proposal. Feedback is requested.
# Getting Started
## Rule Creation
Florian wrote a short [rule creation tutorial](https://www.nextron-systems.com/2018/02/10/write-sigma-rules/) that can help you get started.
## Rule Usage
1. Download or clone the repository
2. Check the `./rules` subdirectory for an overview of the rule base
3. Run `python sigmac --help` in the `./tools` folder for help on the rule converter
4. Convert a rule of your choice with `sigmac` like `python sigmac -t splunk ../rules/windows/builtin/win_susp_process_creations.yml`
5. Convert a whole rule directory with `python sigmac -t splunk -r ../rules/proxy/`
6. Check the `./tools/config` folder and the [wiki](https://github.com/Neo23x0/sigma/wiki/Converter-Tool-Sigmac) if you need custom field or log source mappings in your environment
# Examples
Windows 'Security' Eventlog: Access to LSASS Process with Certain Access Mask / Object Type (experimental)
@@ -80,7 +82,9 @@ Sysmon: Web Shell Detection
Windows 'Security' Eventlog: Suspicious Number of Failed Logons from a Single Source Workstation
![sigma_rule example5](./images/Sigma_rule_example5.png)
## Sigma Tools
# Sigma Tools
## Sigmac
Sigmac converts Sigma rules into queries or inputs of the supported targets listed below. It acts as a frontend to the
Sigma library that may be used to integrate Sigma support in other projects. Further, there's `merge_sigma.py` which
@@ -91,17 +95,22 @@ merges multiple YAML documents of a Sigma rule collection into simple Sigma rule
### Supported Targets
* [Splunk](https://www.splunk.com/)
* [Elasticsearch](https://www.elastic.co/)
* [ElasticSearch](https://www.elastic.co/)
* [ElasticSearch Query DSL](https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl.html)
* [Kibana](https://www.elastic.co/de/products/kibana)
* [Elastic X-Pack Watcher](https://www.elastic.co/guide/en/x-pack/current/xpack-alerting.html)
* [Logpoint](https://www.logpoint.com)
* [Windows Defender Advanced Threat Protection (WDATP)](https://www.microsoft.com/en-us/windowsforbusiness/windows-atp)
* Grep with Perl-compatible regular expression support
New targets are continuously developed. A current list can be obtained with `sigmac --target-list` or `sigmac -l`.
Current work-in-progress
* [Splunk Data Models](https://docs.splunk.com/Documentation/Splunk/7.1.0/Knowledge/Aboutdatamodels)
New targets are continuously developed. You can get a list of supported targets with `sigmac --target-list` or `sigmac -l`.
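As a rough, hypothetical sketch of what such a backend target does (the real backends in `tools/sigma/` handle far more: wildcard matching, null checks, aggregations and field mappings), a rule's field/value selection can be flattened into a query string; the function name here is made up for illustration:

```python
# Hypothetical, simplified sketch of a Sigma backend: flatten a rule's
# field -> value selection into an AND-joined query string. Real sigmac
# backends also handle wildcards, null values, aggregations and mappings.

def convert_selection(selection):
    """Render a field/value mapping as a flat query string."""
    parts = []
    for field, value in selection.items():
        if isinstance(value, list):
            # A list of values is OR-ed together per the Sigma spec
            ored = " OR ".join('{}:"{}"'.format(field, v) for v in value)
            parts.append("(" + ored + ")")
        else:
            parts.append('{}:"{}"'.format(field, value))
    return " AND ".join(parts)

# Selection resembling the DCSync rule added in this change set
selection = {
    "EventID": 4662,
    "Properties": ["*Replicating Directory Changes All*"],
}
print(convert_selection(selection))
# EventID:"4662" AND (Properties:"*Replicating Directory Changes All*")
```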
### Requirements
The usage of Sigmac or the underlying library requires Python >= 3.5 and PyYAML.
The usage of Sigmac (the Sigma Rule Converter) or the underlying library requires Python >= 3.5 and PyYAML.
### Installation
@@ -123,6 +132,10 @@ For development (e.g. execution of integration tests with `make` and packaging),
pip3 install -r tools/requirements-devel.txt
```
## Evt2Sigma
[Evt2Sigma](https://github.com/Neo23x0/evt2sigma) helps you with the rule creation. It generates a Sigma rule from a log entry.
## Contributed Scripts
The directory `contrib` contains scripts that were contributed by the community:
@@ -134,20 +147,17 @@ These tools are not part of the main toolchain and maintained separately by thei
# Next Steps
* Integration of feedback into the rule specifications
* Integration into Threat Intel Exchanges, e.g. [MISP](http://www.misp-project.org/)
* Integration of MITRE ATT&CK framework identifier to the rule set
* Integration into Threat Intel Exchanges
* Attempts to convince others to use the rule format in their reports, threat feeds, blog posts, threat sharing platforms
# Projects that use Sigma
* [Augmentd](https://augmentd.co/)
* [MISP](http://www.misp-project.org/2017/03/26/MISP.2.4.70.released.html) (since version 2.4.70, March 2017)
* [TA-Sigma-Searches](https://github.com/dstaulcu/TA-Sigma-Searches) (Splunk App)
# Credits
This is a private project mainly developed by Florian Roth and Thomas Patzke with feedback from many fellow analysts and friends. Rules are our own or have been derived from blog posts, tweets or other public sources that are referenced in the rules.
Copyright for Tree Image: [studiobarcelona / 123RF Stock Photo](http://www.123rf.com/profile_studiobarcelona)
* [SOC Prime - Sigma Rule Editor](https://tdm.socprime.com/sigma/)
* [ypsilon](https://github.com/P4T12ICK/ypsilon) - Automated Use Case Testing
* [SPARK](https://www.nextron-systems.com/2018/06/28/spark-applies-sigma-rules-in-eventlog-scan/) - Scan with Sigma rules on endpoints
# Licenses
@@ -156,3 +166,9 @@ The content of this repository is released under the following licenses:
* The toolchain (everything under `tools/`) is licensed under the [GNU Lesser General Public License](https://www.gnu.org/licenses/lgpl-3.0.en.html).
* The [Sigma specification](https://github.com/Neo23x0/sigma/wiki) is public domain.
* Everything else, especially the rules contained in the `rules/` directory is released under the [GNU General Public License](https://www.gnu.org/licenses/gpl-3.0.en.html).
# Credits
This is a private project mainly developed by Florian Roth and Thomas Patzke with feedback from many fellow analysts and friends. Rules are our own or have been derived from blog posts, tweets or other public sources that are referenced in the rules.
Copyright for Tree Image: [studiobarcelona / 123RF Stock Photo](http://www.123rf.com/profile_studiobarcelona)
@@ -15,12 +15,15 @@ detection:
# Temporary folder
- '/tmp/*'
# Web server
- '/var/www/*' # Standard
- '/usr/local/apache2/*' # Classical Apache
- '/usr/local/httpd/*' # Old SuSE Linux 6.*
- '/var/apache/*' # Solaris
- '/srv/www/*' # SuSE Linux 9.*
- '/home/httpd/html/*' # Redhat 6 or older
- '/var/www/*' # Standard
- '/home/*/public_html/*' # Per-user
- '/usr/local/apache2/*' # Classical Apache
- '/usr/local/httpd/*' # Old SuSE Linux 6.* Apache
- '/var/apache/*' # Solaris Apache
- '/srv/www/*' # SuSE Linux 9.*
- '/home/httpd/html/*' # Redhat 6 or older Apache
- '/srv/http/*' # ArchLinux standard
- '/usr/share/nginx/html/*' # ArchLinux nginx
# Data dirs of typically exploited services (incomplete list)
- '/var/lib/pgsql/data/*'
- '/usr/local/mysql/data/*'
@@ -28,8 +31,6 @@ detection:
- '/var/vsftpd/*'
- '/etc/bind/*'
- '/var/named/*'
# Others
- '*/public_html/*'
condition: selection
falsepositives:
- Admin activity (especially in /tmp folders)
@@ -6,8 +6,8 @@ logsource:
detection:
selection:
pam_message: "authentication failure"
pam_user: not null
pam_rhost: not null
pam_user: '*'
pam_rhost: '*'
timeframe: 24h
condition: selection | count(pam_user) by pam_rhost > 3
falsepositives:
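The `not null` → `'*'` change above follows the updated specification: a value of `'*'` now expresses "field present with any value", while `null` means the field is absent. A hypothetical rendering function (a sketch, not the actual sigmac code; the function name is invented) might special-case both:

```python
def render_field(field, value):
    """Sketch: translate one Sigma field/value pair to a query fragment."""
    if value == "*":
        # Per the updated spec, '*' means the field exists with any value
        return "{}=*".format(field)
    if value is None:
        # YAML 'null' means the field must be absent from the event
        return "NOT {}=*".format(field)
    return '{}="{}"'.format(field, value)

print(render_field("pam_user", "*"))    # pam_user=*
print(render_field("pam_rhost", None))  # NOT pam_rhost=*
print(render_field("pam_message", "authentication failure"))
# pam_message="authentication failure"
```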
@@ -0,0 +1,19 @@
title: Cobalt Strike DNS Beaconing
status: experimental
description: Detects suspicious DNS queries known from Cobalt Strike beacons
references:
- https://www.icebrg.io/blog/footprints-of-fin7-tracking-actor-patterns
author: Florian Roth
date: 2018/05/10
logsource:
category: dns
detection:
selection:
query:
- 'aaa.stage.*'
- 'post.1*'
condition: selection
falsepositives:
- Unknown
level: high
@@ -0,0 +1,17 @@
title: Suspicious DNS Query with B64 Encoded String
status: experimental
description: Detects suspicious DNS queries using base64 encoding
references:
- https://github.com/krmaxwell/dns-exfiltration
author: Florian Roth
date: 2018/05/10
logsource:
category: dns
detection:
selection:
query:
- '*==.*'
condition: selection
falsepositives:
- Unknown
level: medium
@@ -0,0 +1,20 @@
title: Telegram Bot API Request
status: experimental
description: Detects suspicious DNS queries to api.telegram.org used by Telegram Bots of any kind
references:
- https://core.telegram.org/bots/faq
- https://researchcenter.paloaltonetworks.com/2018/03/unit42-telerat-another-android-trojan-leveraging-telegrams-bot-api-to-target-iranian-users/
- https://blog.malwarebytes.com/threat-analysis/2016/11/telecrypt-the-ransomware-abusing-telegram-api-defeated/
- https://www.welivesecurity.com/2016/12/13/rise-telebots-analyzing-disruptive-killdisk-attacks/
author: Florian Roth
date: 2018/06/05
logsource:
category: dns
detection:
selection:
query:
- 'api.telegram.org' # Telegram Bot API Request https://core.telegram.org/bots/faq
condition: selection
falsepositives:
- Legitimate use of Telegram bots in the company
level: medium
@@ -5,8 +5,9 @@ references:
- https://www.symantec.com/connect/blogs/shady-tld-research-gdn-and-our-2016-wrap
- https://promos.mcafee.com/en-US/PDF/MTMW_Report.pdf
- https://www.spamhaus.org/statistics/tlds/
- https://krebsonsecurity.com/2018/06/bad-men-at-work-please-dont-click/
author: Florian Roth
date: 2017/11/07
date: 2018/06/13
logsource:
category: proxy
detection:
@@ -60,7 +61,7 @@ detection:
- '*.cricket'
- '*.space'
- '*.top'
# McAfee report
# McAfee report
- '*.info'
- '*.vn'
- '*.cm'
@@ -83,7 +84,6 @@ detection:
- '*.tt'
- '*.name'
- '*.tv'
- '*.tv'
- '*.kz'
- '*.tc'
- '*.mobi'
@@ -93,10 +93,16 @@ detection:
- '*.link'
- '*.trade'
- '*.accountant'
# Spamhaus 2018 https://krebsonsecurity.com/2018/06/bad-men-at-work-please-dont-click/
- '*.click'
- '*.cf'
- '*.gq'
- '*.ml'
- '*.ga'
condition: selection
fields:
- ClientIP
- URL
falsepositives:
- All kind of software downloads
- All kinds of software downloads
level: low
@@ -1,6 +1,6 @@
title: Windows PowerShell WebDav User Agent
title: Windows WebDAV User Agent
status: experimental
description: Detects Windows PowerShell Web Access
description: Detects WebDAV download cradles
references:
- https://mgreen27.github.io/posts/2018/04/02/DownloadCradle.html
author: Florian Roth
@@ -10,12 +10,15 @@ logsource:
detection:
selection:
UserAgent: 'Microsoft-WebDAV-MiniRedir/*'
HttpMethod: 'GET'
condition: selection
fields:
- ClientIP
- URL
- UserAgent
- HttpMethod
falsepositives:
- Administrative scripts that download files from the Internet
- Administrative scripts that retrieve certain website contents
- Legitimate WebDAV administration
level: high
@@ -12,7 +12,7 @@ detection:
- '*/install_flash_player.exe'
- '*/flash_install.php*'
filter:
cs-uri-query: '*.adobe.com/*'
cs-uri-stem: '*.adobe.com/*'
condition: selection and not filter
falsepositives:
- Unknown flash download locations
@@ -0,0 +1,29 @@
title: Telegram API Access
status: experimental
description: Detects suspicious requests to Telegram API without the usual Telegram User-Agent
references:
- https://researchcenter.paloaltonetworks.com/2018/03/unit42-telerat-another-android-trojan-leveraging-telegrams-bot-api-to-target-iranian-users/
- https://blog.malwarebytes.com/threat-analysis/2016/11/telecrypt-the-ransomware-abusing-telegram-api-defeated/
- https://www.welivesecurity.com/2016/12/13/rise-telebots-analyzing-disruptive-killdisk-attacks/
author: Florian Roth
date: 2018/06/05
logsource:
category: proxy
detection:
selection:
r-dns:
- 'api.telegram.org' # Often used by Bots
filter:
UserAgent:
# Used https://core.telegram.org/bots/samples for this list
- '*Telegram*'
- '*Bot*'
condition: selection and not filter
fields:
- ClientIP
- URL
- UserAgent
falsepositives:
- Legitimate use of Telegram bots in the company
level: medium
@@ -30,6 +30,11 @@ detection:
- 'Mozilla/4.0 (compatible; MSIE 11.0; Windows NT 6.1; SV1)' # Bronze Butler - Daserf
- 'Mozilla/4.0 (compatible; MSIE 8.0; Win32)' # TSCookie https://app.any.run/tasks/0996b314-5133-491b-8d23-d431ffdec597
- 'Mozilla v5.1 (Windows NT 6.1; rv:6.0.1) Gecko/20100101 Firefox/6.0.1' # Delphi downloader https://www.welivesecurity.com/2018/04/24/sednit-update-analysis-zebrocy/
- 'Mozilla/6.1 (compatible; MSIE 9.0; Windows NT 5.3; Trident/5.0)' # VPNFilter https://blog.talosintelligence.com/2018/05/VPNFilter.html
- 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.04506.30; .NET CLR 3.0.04506.648; InfoPath.1)' # Sofacy User-Agent https://researchcenter.paloaltonetworks.com/2018/06/unit42-sofacy-groups-parallel-attacks/
- 'Mozilla/5.0 (Windows NT 6.1; WOW64) WinHttp/1.6.3.8 (WinHTTP/5.1) like Gecko' # Sofacy User-Agent https://researchcenter.paloaltonetworks.com/2018/06/unit42-sofacy-groups-parallel-attacks/
- 'Mozilla v5.1 *' # Sofacy Zebrocy samples
- 'MSIE 8.0' # Sofacy Azzy Backdoor from https://www.hybrid-analysis.com/sample/a80e29c0757bee05338fd5c22a542d852ad86c477068e3eb4aacc1c3e59e2eef?environmentId=100
condition: selection
fields:
- ClientIP
@@ -1,5 +1,5 @@
title: Access to ADMIN$ Share
description:
description: Detects access to ADMIN$ share
status: experimental
author: Florian Roth
logsource:
@@ -9,7 +9,7 @@ logsource:
description: 'Requirements: Audit Policy : Policy Change > Audit Authorization Policy Change, Group Policy : Computer Configuration\Windows Settings\Security Settings\Advanced Audit Policy Configuration\Audit Policies\Policy Change\Audit Authorization Policy Change'
detection:
selection:
EventID: 4707
EventID: 4704
keywords:
- 'SeEnableDelegationPrivilege'
condition: all of them
@@ -12,7 +12,8 @@ logsource:
detection:
selection1:
EventID: 4738
AllowedToDelegateTo: '*'
filter1:
AllowedToDelegateTo: null
selection2:
EventID: 5136
AttributeLDAPDisplayName: 'msDS-AllowedToDelegateTo'
@@ -20,7 +21,7 @@ detection:
EventID: 5136
ObjectClass: 'user'
AttributeLDAPDisplayName: 'servicePrincipalName'
condition: 1 of them
condition: (selection1 and not filter1) or selection2 or selection3
falsepositives:
- Unknown
level: high
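The new condition replaces the ambiguous `1 of them` with explicit boolean logic. Evaluated per event it behaves roughly as below (illustrative Python only, with selection3 omitted for brevity — sigmac compiles the condition into the target query language rather than evaluating events directly, and the helper names are made up):

```python
def matches(event, selection):
    """True if every field/value pair of the selection matches the event."""
    return all(event.get(f) == v for f, v in selection.items())

selection1 = {"EventID": 4738}
selection2 = {"EventID": 5136,
              "AttributeLDAPDisplayName": "msDS-AllowedToDelegateTo"}

def condition(event):
    # (selection1 and not filter1) or selection2 -- filter1 matches when
    # AllowedToDelegateTo is null, so "not filter1" means it is present
    s1 = (matches(event, selection1)
          and event.get("AllowedToDelegateTo") is not None)
    return s1 or matches(event, selection2)

assert condition({"EventID": 4738, "AllowedToDelegateTo": "cifs/host"})
assert not condition({"EventID": 4738})  # filter1 applies: field is null
```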
@@ -3,7 +3,7 @@ description: This method detects well-known keywords, certain field combination
author: Florian Roth
logsource:
product: windows
service: system
service: security
detection:
# Ruler https://github.com/sensepost/ruler
selection1:
@@ -0,0 +1,22 @@
title: Mimikatz DC Sync
description: Detects Mimikatz DC sync security events
status: experimental
date: 2018/06/03
author: Benjamin Delpy, Florian Roth
references:
- https://twitter.com/gentilkiwi/status/1003236624925413376
- https://gist.github.com/gentilkiwi/dcc132457408cf11ad2061340dcb53c2
logsource:
product: windows
service: security
detection:
selection:
EventID: 4662
Properties:
- '*Replicating Directory Changes All*'
- '*1131f6ad-9c07-11d1-f79f-00c04fc2dcd2*'
condition: selection
falsepositives:
- Unknown
level: critical
@@ -32,7 +32,6 @@ logsource:
detection:
selection2:
EventID: 4657
OperationType: 'Existing registry value modified'
ObjectName: '\REGISTRY\MACHINE\SYSTEM\*ControlSet*\Control\Lsa'
ObjectValueName:
- 'LmCompatibilityLevel'
@@ -12,19 +12,19 @@ author: juju4
detection:
selection:
CommandLine:
- '^'
- '@'
#- '^'
#- '@'
# 0x002D -, 0x2013 , 0x2014 , 0x2015 ― ... FIXME! how to match hexa form?
- '-'
- '―'
- 'c:/'
# - '-'
# - '―'
#- 'c:/'
- '<TAB>'
- '^h^t^t^p'
- 'h"t"t"p'
condition: selection
falsepositives:
- False positives depend on scripts and administrative tools used in the monitored environment
level: medium
level: low
---
# Windows Audit Log
logsource:
@@ -10,11 +10,11 @@ detection:
- 4625
- 4776
Status:
- 0xC0000072
- 0xC000006F
- 0xC0000070
- 0xC0000413
- 0xC000018C
- '0xC0000072'
- '0xC000006F'
- '0xC0000070'
- '0xC0000413'
- '0xC000018C'
condition: selection
falsepositives:
- User using a disabled account
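Quoting the status codes above matters because YAML 1.1 parses an unquoted `0x…` scalar as an integer, so the rule value would no longer equal the literal string that appears in the event's Status field. The numeric interpretation can be checked with plain Python:

```python
# Unquoted in YAML 1.1, 0xC0000072 becomes this integer rather than the
# literal string found in the Windows event's Status field:
numeric = int("0xC0000072", 16)
print(numeric)  # 3221225586
assert hex(numeric) == "0xc0000072"

# Quoted as '0xC0000072', the value stays a string and never silently
# turns into a number, so string comparison against the log field works
assert "0xC0000072" != numeric
```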
@@ -9,12 +9,12 @@ detection:
EventID:
- 529
- 4625
UserName: not null
WorkstationName: not null
UserName: '*'
WorkstationName: '*'
selection2:
EventID: 4776
UserName: not null
Workstation: not null
UserName: '*'
Workstation: '*'
timeframe: 24h
condition:
- selection1 | count(UserName) by WorkstationName > 3
@@ -21,6 +21,6 @@ detection:
- 'mpengine.dll'
condition: 1 of selection* and all of keywords
falsepositives:
- Unknown
- MsMpEng.exe can crash when C:\ is full
level: high
@@ -0,0 +1,20 @@
title: NTLM Logon
status: experimental
description: Detects logons using NTLM, which could be caused by a legacy source or attackers
references:
- https://twitter.com/JohnLaTwC/status/1004895028995477505
- https://goo.gl/PsqrhT
author: Florian Roth
date: 2018/06/08
logsource:
product: windows
service: ntlm
description: Requires events from Microsoft-Windows-NTLM/Operational
detection:
selection:
EventID: 8002
CallingProcessName: '*' # We use this to avoid false positives with ID 8002 on other log sources if the logsource isn't set correctly
condition: selection
falsepositives:
- Legacy hosts
level: low
@@ -1,83 +0,0 @@
action: global
title: Phantom DLLs Usage
description: Detects Phantom DLLs usage and matching executable
status: experimental
references:
- http://www.hexacorn.com/blog/2013/12/08/beyond-good-ol-run-key-part-5/
- http://www.hexacorn.com/blog/2015/02/23/beyond-good-ol-run-key-part-28/
author: juju4
detection:
selection:
CommandLine:
- '*ntbackup*'
- '*\edbbcli.dll*'
- '*\esebcli2.dll*'
# - '*mrt*'
- '*\bcrypt.dll*'
- '*sessmgr*'
- '*\SalemHook.dll*'
- '*certreq*'
- '*\msfte.dll*'
- '*\mstracer.dll*'
- '*fxscover*'
- '*\TPPrnUIENU.dll*'
- '*dxdiag*'
- '*\DXGIDebug.dll*'
- '*msinfo32*'
- '*\fveapi.dll*'
- '*narrator*'
- '*\MSTTSLocEnUS.dll*'
- '*\Wow64Log.dll*'
- '*Dism*'
- '*\Dism\wimgapi.dll*'
- '*\DismCore.dll*'
- '*FileHistory*'
- '*\Microsoft.NET\Framework\v4.0.30319\api-ms-win-core-winrt-l1-1-0.dll*'
- '*\Microsoft.NET\Framework\v4.0.30319\mscoree.dll*'
- '*\Microsoft.NET\Framework\v4.0.30319\ole32.dll*'
- '*\Microsoft.NET\Framework\v4.0.30319\urlmon.dll*'
# - '*mmc*'
- '*\Microsoft.Net\assembly\GAC_32\mscorlib\v4.0_4.0.0.0__b77a5c561934e089\oleaut32.dll*'
- '*\Microsoft.Net\assembly\GAC_32\mscorlib\v4.0_4.0.0.0__b77a5c561934e089\shell32.dll*'
- '*\Microsoft.Net\assembly\GAC_MSIL\MIGUIControls\v4.0_1.0.0.0__31bf3856ad364e35\ntdll.dll*'
- '*\Microsoft.Net\assembly\GAC_MSIL\System.Windows.Forms\v4.0_4.0.0.0__b77a5c561934e089\comctl32.dll*'
- '*\Microsoft.Net\assembly\GAC_MSIL\System.Windows.Forms\v4.0_4.0.0.0__b77a5c561934e089\uxtheme.dll*'
- '*\Microsoft.NET\Framework\v4.0.30319\api-ms-win-core-winrt-l1-1-0.dll*'
- '*\Microsoft.NET\Framework\v4.0.30319\mscoree.dll*'
- '*\Microsoft.NET\Framework\v4.0.30319\ole32.dll*'
- '*\Microsoft.NET\Framework\v4.0.30319\VERSION.dll*'
- '*Narrator*'
- '*speech\engines\tts\MSTTSLocEnUS.DLL'
- '*omadmclient*'
- '*cmnet.dll*'
- '*PresentationHost*'
- '*\Microsoft.NET\Framework\v4.0.30319\WPF\PresentationHost_v0400.dll*'
- '*provtool*'
- '*MvHelper.dll*'
- '*SearchIndexer*'
- '*msfte.dll*'
- '*msTracer.dll*'
- '*SearchProtocolHost*'
- '*msfte.dll*'
- '*msTracer.dll*'
condition: selection
falsepositives:
- False positives depend on environment
level: medium
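The selection above pairs each phantom DLL path with the executable known to probe for it, expressed with Sigma's `*` wildcards in `CommandLine`. As an illustrative sketch (not part of the rule or of any backend), the intended match semantics can be expressed with Python's `fnmatch`:

```python
# Illustrative only: how a Sigma '*'-wildcard pattern matches a command line.
# Backends translate the same patterns into their target query language.
from fnmatch import fnmatchcase

def matches_any(command_line, patterns):
    """Case-insensitive wildcard match, as Sigma string values are matched."""
    cl = command_line.lower()
    return any(fnmatchcase(cl, p.lower()) for p in patterns)

patterns = ['*ntbackup*', '*\\edbbcli.dll*']
```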
---
# Windows Audit Log
logsource:
product: windows
service: security
description: 'Requirements: Audit Policy : Detailed Tracking > Audit Process creation, Group Policy : Administrative Templates\System\Audit Process Creation'
detection:
selection:
EventID: 4688
---
# Sysmon
logsource:
product: windows
service: sysmon
detection:
selection:
EventID: 1
@@ -6,11 +6,12 @@ logsource:
    service: security
 detection:
    samrpipe:
-        EventID: 5145
-        RelativeTargetName: samr
+        EventID: 5145
+        RelativeTargetName: samr
    passwordchanged:
-        EventID: 4738
-        PasswordLastSet: (any)
+        EventID: 4738
+    passwordchanged_filter:
+        PasswordLastSet: null
    timeframe: 15s
-    condition: samrpipe | near passwordchanged
+    condition: ( passwordchanged and not passwordchanged_filter ) | near samrpipe
    level: medium
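The `| near` aggregation above correlates the two selections within the 15-second timeframe: a 4738 password-change event (with `PasswordLastSet` present) near a 5145 access to the `samr` pipe. A rough, backend-agnostic sketch of that correlation (illustrative only; field and event names are taken from the rule, the function is not Sigma or backend code):

```python
# Illustrative correlation of the rule's two selections within a timeframe.
def correlate(events, timeframe=15):
    """True if a 4738 event with PasswordLastSet set occurs within
    `timeframe` seconds of a 5145 event on the samr pipe."""
    samr = [e['ts'] for e in events
            if e.get('EventID') == 5145 and e.get('RelativeTargetName') == 'samr']
    changed = [e['ts'] for e in events
               if e.get('EventID') == 4738 and e.get('PasswordLastSet') is not None]
    return any(abs(a - b) <= timeframe for a in changed for b in samr)

events = [
    {'ts': 100, 'EventID': 5145, 'RelativeTargetName': 'samr'},
    {'ts': 108, 'EventID': 4738, 'PasswordLastSet': '2018-06-28 09:30'},
]
```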
@@ -0,0 +1,34 @@
---
action: global
title: Sysprep on AppData Folder
status: experimental
description: Detects a suspicious sysprep process start with an AppData folder as target (as used by the Trojan Syndicasec in the Thrip report by Symantec)
references:
- https://www.symantec.com/blogs/threat-intelligence/thrip-hits-satellite-telecoms-defense-targets
- https://app.any.run/tasks/61a296bb-81ad-4fee-955f-3b399f4aaf4b
author: Florian Roth
date: 2018/06/22
detection:
selection:
CommandLine:
- '*\sysprep.exe *\AppData\*'
- 'sysprep.exe *\AppData\*'
condition: selection
falsepositives:
- False positives depend on scripts and administrative tools used in the monitored environment
level: medium
---
logsource:
product: windows
service: sysmon
detection:
selection:
EventID: 1
---
logsource:
product: windows
service: security
description: 'Requirements: Audit Policy : Detailed Tracking > Audit Process creation, Group Policy : Administrative Templates\System\Audit Process Creation'
detection:
selection:
EventID: 4688
@@ -0,0 +1,33 @@
---
action: global
title: Whoami Execution
status: experimental
description: 'Detects the execution of whoami, which is often used by attackers after exploitation / privilege escalation but rarely used by administrators'
references:
- https://twitter.com/haroonmeer/status/939099379834658817
- https://twitter.com/c_APT_ure/status/939475433711722497
author: Florian Roth
date: 2018/05/22
detection:
condition: selection
falsepositives:
- Admin activity
- Scripts and administrative tools used in the monitored environment
level: high
---
logsource:
product: windows
service: sysmon
detection:
selection:
EventID: 1
CommandLine: 'whoami'
---
logsource:
product: windows
service: security
description: 'Requirements: Audit Policy : Detailed Tracking > Audit Process creation, Group Policy : Administrative Templates\System\Audit Process Creation'
detection:
selection:
EventID: 4688
NewProcessName: '*\whoami.exe'
@@ -1,6 +1,6 @@
-title: Malicious PowerShell Commandlets
+title: Malicious PowerShell Keywords
 status: experimental
-description: Detects Commandlet names from well-known PowerShell exploitation frameworks
+description: Detects keywords from well-known PowerShell exploitation frameworks
 references:
     - https://adsecurity.org/?p=2921
 author: Sean Metcalf (source), Florian Roth (rule)
@@ -0,0 +1,24 @@
title: Executable in ADS
status: experimental
description: Detects the creation of an ADS data stream that contains an executable (non-empty imphash)
references:
- https://twitter.com/0xrawsec/status/1002478725605273600?s=21
author: Florian Roth, @0xrawsec
date: 2018/06/03
logsource:
product: windows
service: sysmon
description: 'Requirements: Sysmon config with Imphash logging activated'
detection:
selection:
EventID: 15
filter:
Imphash: '00000000000000000000000000000000'
condition: selection and not filter
fields:
- TargetFilename
- Image
falsepositives:
- unknown
level: critical
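Sysmon event 15 (FileCreateStreamHash) logs an imphash only for PE content; the all-zero imphash excluded by the filter marks streams without an import table, i.e. non-executable data. A minimal sketch of the rule's filter logic (illustrative only, not backend code):

```python
# Illustrative restatement of the rule: match Sysmon EID 15 where the
# imphash is NOT the all-zero value logged for non-PE stream content.
EMPTY_IMPHASH = '00000000000000000000000000000000'

def suspicious_ads(event):
    return event.get('EventID') == 15 and event.get('Imphash') != EMPTY_IMPHASH
```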
@@ -13,5 +13,3 @@ detection:
falsepositives:
- unknown
level: high
@@ -0,0 +1,23 @@
title: Default PowerSploit Schtasks Persistence
status: experimental
description: Detects the creation of a schtask via PowerSploit Default Configuration
references:
- https://github.com/0xdeadbeefJERKY/PowerSploit/blob/8690399ef70d2cad10213575ac67e8fa90ddf7c3/Persistence/Persistence.psm1
author: Markus Neis
date: 2018/03/06
logsource:
product: windows
service: sysmon
detection:
selection:
ParentImage:
- '*\Powershell.exe'
CommandLine:
- '*\schtasks.exe*/Create*/RU*system*/SC*ONLOGON*'
- '*\schtasks.exe*/Create*/RU*system*/SC*DAILY*'
- '*\schtasks.exe*/Create*/RU*system*/SC*ONIDLE*'
- '*\schtasks.exe*/Create*/RU*system*/SC*HOURLY*'
condition: selection
falsepositives:
- False positives are possible, depends on organisation and processes
level: high
@@ -1,8 +1,7 @@
 title: Suspicious Certutil Command
 status: experimental
 description: Detects a suspicious Microsoft certutil execution with sub commands like 'decode', which is sometimes used to decode malicious code with the built-in certutil utility
-author:
-    - Florian Roth, juju4
+author: Florian Roth, juju4
 references:
     - https://twitter.com/JohnLaTwC/status/835149808817991680
     - https://twitter.com/subTee/status/888102593838362624
@@ -27,4 +27,4 @@ fields:
     - ParentCommandLine
 falsepositives:
     - Will need to be tuned. If using Splunk, I recommend | stats count by Computer,CommandLine following the search for easy hunting by computer/CommandLine.
-level: medium
+level: low
@@ -0,0 +1,19 @@
title: PowerShell Rundll32 Remote Thread Creation
status: experimental
description: Detects PowerShell remote thread creation in Rundll32.exe
author: Florian Roth
references:
- https://www.fireeye.com/blog/threat-research/2018/06/bring-your-own-land-novel-red-teaming-technique.html
date: 2018/06/25
logsource:
product: windows
service: sysmon
detection:
selection:
EventID: 8
SourceImage: '*\powershell.exe'
TargetImage: '*\rundll32.exe'
condition: selection
falsepositives:
- Unknown
level: high
@@ -8,3 +8,72 @@ command line tools:
* Elasticsearch X-Pack Watcher
* Logpoint queries
* *merge_sigma*: Merge Sigma collections into simple Sigma rules.
## Sigmac
### Usage
usage: sigmac [-h] [--recurse] [--filter FILTER]
[--target {es-dsl,es-qs,graylog,kibana,xpack-watcher,logpoint,splunk,grep,fieldlist}]
[--target-list] [--config CONFIG] [--output OUTPUT]
[--backend-option BACKEND_OPTION] [--defer-abort]
[--ignore-not-implemented] [--verbose] [--debug]
[inputs [inputs ...]]
Convert Sigma rules into SIEM signatures.
positional arguments:
inputs Sigma input files
optional arguments:
-h, --help show this help message and exit
--recurse, -r Recurse into subdirectories (not yet implemented)
--filter FILTER, -f FILTER
Define comma-separated filters that must match (AND-
linked) to rule to be processed. Valid filters:
level<=x, level>=x, level=x, status=y, logsource=z. x
is one of: low, medium, high, critical. y is one of:
experimental, testing, stable. z is a word appearing
in an arbitrary log source attribute. Multiple log
source specifications are AND linked.
--target {es-dsl,es-qs,graylog,kibana,xpack-watcher,logpoint,splunk,grep,fieldlist}, -t {es-dsl,es-qs,graylog,kibana,xpack-watcher,logpoint,splunk,grep,fieldlist}
Output target format
--target-list, -l List available output target formats
--config CONFIG, -c CONFIG
Configuration with field name and index mapping for
target environment (not yet implemented)
--output OUTPUT, -o OUTPUT
Output file or filename prefix if multiple files are
generated (not yet implemented)
--backend-option BACKEND_OPTION, -O BACKEND_OPTION
Options and switches that are passed to the backend
--defer-abort, -d Don't abort on parse or conversion errors, proceed
with next rule. The exit code from the last error is
returned
--ignore-not-implemented, -I
Only return error codes for parse errors and ignore
errors for rules with not implemented features
--verbose, -v Be verbose
--debug, -D Debugging output
Backend options:
es-dsl
es : Host and port of Elasticsearch instance (default: http://localhost:9200)
output : Output format: import = JSON search request, curl = Shell script that do the search queries via curl (default: import)
es-qs
rulecomment: Prefix generated query with comment containing title (default: False)
graylog
rulecomment: Prefix generated query with comment containing title (default: False)
kibana
output : Output format: import = JSON file manually imported in Kibana, curl = Shell script that imports queries in Kibana via curl (jq is additionally required) (default: import)
es : Host and port of Elasticsearch instance (default: localhost:9200)
index : Kibana index (default: .kibana)
prefix : Title prefix of Sigma queries (default: Sigma: )
xpack-watcher
output : Output format: curl = Shell script that imports queries in Watcher index with curl (default: curl)
es : Host and port of Elasticsearch instance (default: localhost:9200)
mail : Mail address for Watcher notification (only logging if not set) (default: None)
logpoint
rulecomment: Prefix generated query with comment containing title (default: False)
splunk
rulecomment: Prefix generated query with comment containing title (default: False)
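The `--filter` semantics described in the help output (AND-linked `level<=x`, `level>=x`, `level=x`, `status=y` expressions) can be sketched as follows. This is a hypothetical re-implementation for illustration; `rule_passes` and `LEVELS` are not part of sigmac:

```python
# Hypothetical sketch of sigmac's documented --filter semantics.
LEVELS = ['low', 'medium', 'high', 'critical']

def rule_passes(rule, filters):
    """All comma-separated filters must match (AND-linked)."""
    for f in filters.split(','):
        if f.startswith('level<='):
            if LEVELS.index(rule['level']) > LEVELS.index(f[7:]):
                return False
        elif f.startswith('level>='):
            if LEVELS.index(rule['level']) < LEVELS.index(f[7:]):
                return False
        elif f.startswith('level='):
            if rule['level'] != f[6:]:
                return False
        elif f.startswith('status='):
            if rule['status'] != f[7:]:
                return False
    return True
```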
@@ -0,0 +1,102 @@
logsources:
linux:
product: linux
conditions:
deviceVendor: Unix
linux-sshd:
product: linux
service: sshd
conditions:
deviceVendor: Unix
linux-auth:
product: linux
service: auth
conditions:
deviceVendor: Unix
linux-clamav:
product: linux
service: clamav
conditions:
deviceVendor: Unix
windows-dns:
product: windows
service: dns-server
conditions:
deviceVendor: Microsoft
deviceProduct: DNS-Server
windows-pc:
product: windows
service: powershell-classic
conditions:
deviceVendor: Microsoft
windows-sys:
product: windows
service: sysmon
conditions:
deviceVendor: Microsoft
deviceProduct: Sysmon
windows-sec:
product: windows
service: security
conditions:
deviceVendor: Microsoft
deviceProduct: Microsoft Windows
windows-power:
product: windows
service: powershell
conditions:
deviceVendor: Microsoft
windows-system:
product: windows
service: system
conditions:
deviceVendor: Microsoft
windows-driver:
product: windows
service: driver-framework
conditions:
deviceVendor: Microsoft
windows-app:
product: windows
service: application
conditions:
deviceVendor: Microsoft
proxy:
category: proxy
conditions:
categoryDeviceGroup: /Proxy
python:
product: python
conditions:
deviceProduct: Python
categoryDeviceGroup: /Application
ruby_on_rails:
product: ruby_on_rails
conditions:
deviceProduct: Ruby on Rails
categoryDeviceGroup: /Application
spring:
product: spring
conditions:
deviceProduct: Spring
categoryDeviceGroup: /Application
apache:
product: apache
conditions:
deviceProduct: Apache
categoryDeviceGroup: /Application
firewall:
product: firewall
conditions:
categoryDeviceGroup: /Firewall
fieldmappings:
EventID: externalId
dst:
- destinationAddress
dst_ip:
- destinationAddress
src:
- sourceAddress
src_ip:
- sourceAddress
@@ -0,0 +1,17 @@
fieldmappings:
dst:
- network.remote.address.ip
dst_ip:
- network.remote.address.ip
src:
- network.local.address.ip
src_ip:
- network.local.address.ip
file_hash:
- file.hash.md5
- file.hash.sha256
NewProcessName: process.name
ServiceName: process.name
ServiceFileName: process.name
TargetObject: registry.path
@@ -0,0 +1,54 @@
logsources:
windows-application:
product: windows
service: application
sources:
- 'WinEventLog:Application'
windows-security:
product: windows
service: security
sources:
- 'WinEventLog:Security'
windows-system:
product: windows
service: system
sources:
- 'WinEventLog:System'
windows-sysmon:
product: windows
service: sysmon
sources:
- 'WinEventLog:Microsoft-Windows-Sysmon/Operational'
windows-powershell:
product: windows
service: powershell
sources:
- 'WinEventLog:Microsoft-Windows-PowerShell/Operational'
windows-taskscheduler:
product: windows
service: taskscheduler
sources:
- 'WinEventLog:Microsoft-Windows-TaskScheduler/Operational'
windows-wmi:
product: windows
service: wmi
sources:
- 'WinEventLog:Microsoft-Windows-WMI-Activity/Operational'
apache:
category: webserver
sources:
- 'File:/var/log/apache/*.log'
- 'File:/var/log/apache2/*.log'
- 'File:/var/log/httpd/*.log'
linux-auth:
product: linux
service: auth
sources:
- 'File:/var/log/auth.log'
- 'File:/var/log/auth.log.?' # auth.log.1, auth.log.2, ...
linux-syslog:
product: linux
service: syslog
sources:
- 'File:/var/log/syslog'
- 'File:/var/log/syslog.?' # syslog.1, syslog.2 ...
@@ -42,12 +42,23 @@ logsources:
windows-dns-server:
product: windows
service: dns-server
category: dns
conditions:
source: 'DNS Server'
windows-dns-server-audit:
product: windows
service: dns-server-audit
conditions:
source: 'Microsoft-Windows-DNS-Server/Audit'
windows-driver-framework:
product: windows
service: driver-framework
conditions:
source: 'Microsoft-Windows-DriverFrameworks-UserMode/Operational'
windows-ntlm:
product: windows
service: ntlm
conditions:
source: 'Microsoft-Windows-NTLM/Operational'
fieldmappings:
EventID: EventCode
@@ -13,7 +13,7 @@ with open(path.join(here, 'README.md'), encoding='utf-8') as f:
 setup(
     name='sigmatools',
-    version='0.4',
+    version='0.5',
     description='Tools for the Generic Signature Format for SIEM Systems',
     long_description=long_description,
     url='https://github.com/Neo23x0/sigma',
@@ -39,6 +39,22 @@ setup(
     extras_require={
         'test': ['coverage', 'yamllint'],
     },
-    data_files=[('etc/sigma', ['config/elk-windows.yml', 'config/elk-linux.yml', 'config/elk-defaultindex.yml', 'config/splunk-windows-all.yml', 'config/splunk-windows-all.yml', 'config/logpoint-windows-all.yml'])],
-    scripts=['sigmac', 'merge_sigma']
+    data_files=[
+        ('etc/sigma', [
+            'config/arcsight.yml',
+            'config/elk-defaultindex-filebeat.yml',
+            'config/elk-defaultindex-logstash.yml',
+            'config/elk-defaultindex.yml',
+            'config/elk-linux.yml',
+            'config/elk-windows.yml',
+            'config/helk.yml',
+            'config/logpoint-windows-all.yml',
+            'config/qualys.yml',
+            'config/spark.yml',
+            'config/splunk-windows-all.yml',
+        ])],
+    scripts=[
+        'sigmac',
+        'merge_sigma'
+    ]
 )
@@ -102,13 +102,14 @@ class BaseBackend:
     def generate(self, sigmaparser):
         """Method is called for each sigma rule and receives the parsed rule (SigmaParser)"""
         for parsed in sigmaparser.condparsed:
-            query = self.generateQuery(parsed)
             before = self.generateBefore(parsed)
-            after = self.generateAfter(parsed)
             if before is not None:
                 self.output.print(before, end="")
+            query = self.generateQuery(parsed)
             if query is not None:
                 self.output.print(query)
+            after = self.generateAfter(parsed)
             if after is not None:
                 self.output.print(after, end="")
@@ -209,10 +210,14 @@ class RulenameCommentMixin:
     def generateBefore(self, parsed):
         if self.rulecomment:
             try:
-                return "\n%s%s\n" % (self.prefix, parsed.sigmaParser.parsedyaml['title'])
+                return "%s%s\n" % (self.prefix, parsed.sigmaParser.parsedyaml['title'])
             except KeyError:
                 return ""

+    def generateAfter(self, parsed):
+        if self.rulecomment:
+            return "\n"
class ElasticsearchDSLBackend(RulenameCommentMixin, BaseBackend):
"""ElasticSearch DSL backend"""
identifier = 'es-dsl'
@@ -285,16 +290,16 @@ class ElasticsearchDSLBackend(RulenameCommentMixin, BaseBackend):
         if type(value) is list:
             res = {'bool': {'should': []}}
             for v in value:
-                res['bool']['should'].append({'match': {key: v}})
+                res['bool']['should'].append({'match_phrase': {key: v}})
             return res
         else:
-            return {'match': {key: value}}
+            return {'match_phrase': {key: value}}

     def generateValueNode(self, node):
-        return {'multi_match': {'query': node, 'fields': []}}
+        return {'multi_match': {'query': node, 'fields': [], 'type': 'phrase'}}

     def generateNULLValueNode(self, node):
-        return {'missing': {'field': node.item}}
+        return {'bool': {'must_not': {'exists': {'field': node.item}}}}

     def generateNotNULLValueNode(self, node):
         return {'exists': {'field': node.item}}
@@ -352,12 +357,19 @@ class ElasticsearchDSLBackend(RulenameCommentMixin, BaseBackend):
         if self.indices is not None and len(self.indices) == 1:
             index = '%s/'%self.indices[0]
-        for query in self.queries:
-            if self.output_type == 'curl':
+        if self.output_type == 'curl':
+            for query in self.queries:
                 self.output.print("\curl -XGET '%s/%s_search?pretty' -H 'Content-Type: application/json' -d'"%(self.es, index))
-            self.output.print(json.dumps(query, indent=2))
-            if self.output_type == 'curl':
+                self.output.print(json.dumps(query, indent=2))
                 self.output.print("'")
+        else:
+            if len(self.queries) == 1:
+                self.output.print(json.dumps(self.queries[0], indent=2))
+            else:
+                self.output.print(json.dumps(self.queries, indent=2))
class SingleTextQueryBackend(RulenameCommentMixin, BaseBackend, QuoteCharMixin):
"""Base class for backends that generate one text-based expression from a Sigma rule"""
@@ -380,16 +392,34 @@ class SingleTextQueryBackend(RulenameCommentMixin, BaseBackend, QuoteCharMixin):
     mapListValueExpression = None       # Syntax for field/value conditions where map value is a list

     def generateANDNode(self, node):
-        return self.andToken.join([self.generateNode(val) for val in node])
+        generated = [ self.generateNode(val) for val in node ]
+        filtered = [ g for g in generated if g is not None ]
+        if filtered:
+            return self.andToken.join(filtered)
+        else:
+            return None

     def generateORNode(self, node):
-        return self.orToken.join([self.generateNode(val) for val in node])
+        generated = [ self.generateNode(val) for val in node ]
+        filtered = [ g for g in generated if g is not None ]
+        if filtered:
+            return self.orToken.join(filtered)
+        else:
+            return None

     def generateNOTNode(self, node):
-        return self.notToken + self.generateNode(node.item)
+        generated = self.generateNode(node.item)
+        if generated is not None:
+            return self.notToken + generated
+        else:
+            return None

     def generateSubexpressionNode(self, node):
-        return self.subExpression % self.generateNode(node.items)
+        generated = self.generateNode(node.items)
+        if generated:
+            return self.subExpression % generated
+        else:
+            return None
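The pattern introduced in this hunk, dropping subterms whose generators return None so the join never receives an unconvertible node (e.g. an EventID condition consumed by WDATP table selection), can be shown in isolation. An illustrative sketch, not the backend code itself:

```python
# Standalone illustration of the None-filtering join pattern:
# subterms that produce no query text are dropped; if nothing
# remains, the whole node yields None instead of an empty string.
def and_join(terms, and_token=' and '):
    filtered = [t for t in terms if t is not None]
    return and_token.join(filtered) if filtered else None
```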
def generateListNode(self, node):
if not set([type(value) for value in node]).issubset({str, int}):
@@ -464,6 +494,29 @@ class ElasticsearchQuerystringBackend(SingleTextQueryBackend):
mapExpression = "%s:%s"
mapListsSpecialHandling = False
# Graylog reserved characters in search && || : \ / + - ! ( ) { } [ ] ^ " ~ * ?
# Modified from Elasticsearch backend reserved character at https://www.elastic.co/guide/en/elasticsearch/reference/2.1/query-dsl-query-string-query.html#_reserved_characters
# Elasticsearch characters + - = && || > < ! ( ) { } [ ] ^ " ~ * ? : \ /
class GraylogQuerystringBackend(SingleTextQueryBackend):
"""Converts Sigma rule into Graylog query string. Only searches, no aggregations."""
identifier = "graylog"
active = True
reEscape = re.compile("([+\\-!(){}\\[\\]^\"~:/]|\\\\(?![*?])|&&|\\|\\|)")
reClear = None
andToken = " AND "
orToken = " OR "
notToken = "NOT "
subExpression = "(%s)"
listExpression = "(%s)"
listSeparator = " "
valueExpression = "\"%s\""
nullExpression = "NOT _exists_:%s"
notNullExpression = "_exists_:%s"
mapExpression = "%s:%s"
mapListsSpecialHandling = False
class KibanaBackend(ElasticsearchQuerystringBackend, MultiRuleOutputMixin):
"""Converts Sigma rule into Kibana JSON Configuration files (searches only)."""
identifier = "kibana"
@@ -832,6 +885,254 @@ class SplunkBackend(SingleTextQueryBackend):
else:
return " | stats %s(%s) as val by %s | search val %s %s" % (agg.aggfunc_notrans, agg.aggfield, agg.groupfield, agg.cond_op, agg.condition)
class WindowsDefenderATPBackend(SingleTextQueryBackend):
"""Converts Sigma rule into Windows Defender ATP Hunting Queries."""
identifier = "wdatp"
active = True
reEscape = re.compile('("|\\\\(?![*?]))')
reClear = None
andToken = " and "
orToken = " or "
notToken = "not "
subExpression = "(%s)"
listExpression = "(%s)"
listSeparator = ", "
valueExpression = "\"%s\""
nullExpression = "isnull(%s)"
notNullExpression = "isnotnull(%s)"
mapExpression = "%s == %s"
mapListsSpecialHandling = True
mapListValueExpression = "%s in %s"
def __init__(self, *args, **kwargs):
"""Initialize field mappings"""
super().__init__(*args, **kwargs)
self.fieldMappings = { # mapping between Sigma and ATP field names
# Supported values:
# (field name mapping, value mapping): distinct mappings for field name and value, may be a string (direct mapping) or function maps name/value to ATP target value
# (mapping function,): receives field name and value as parameter, return list of 2 element tuples (destination field name and value)
# (replacement, ): Replaces field occurrence with static string
"AccountName" : (self.id_mapping, self.default_value_mapping),
"CommandLine" : ("ProcessCommandLine", self.default_value_mapping),
"ComputerName" : (self.id_mapping, self.default_value_mapping),
"DestinationHostname" : ("RemoteUrl", self.default_value_mapping),
"DestinationIp" : ("RemoteIP", self.default_value_mapping),
"DestinationIsIpv6" : ("RemoteIP has \":\"", ),
"DestinationPort" : ("RemotePort", self.default_value_mapping),
"Details" : ("RegistryValueData", self.default_value_mapping),
"EventType" : ("ActionType", self.default_value_mapping),
"Image" : ("FolderPath", self.default_value_mapping),
"ImageLoaded" : ("FolderPath", self.default_value_mapping),
"LogonType" : (self.id_mapping, self.logontype_mapping),
"NewProcessName" : ("FolderPath", self.default_value_mapping),
"ObjectValueName" : ("RegistryValueName", self.default_value_mapping),
"ParentImage" : ("InitiatingProcessFolderPath", self.default_value_mapping),
"SourceImage" : ("InitiatingProcessFolderPath", self.default_value_mapping),
"TargetFilename" : ("FolderPath", self.default_value_mapping),
"TargetImage" : ("FolderPath", self.default_value_mapping),
"TargetObject" : ("RegistryKey", self.default_value_mapping),
"User" : (self.decompose_user, ),
}
def id_mapping(self, src):
"""Identity mapping, source == target field name"""
return src
def default_value_mapping(self, val):
op = "=="
if "*" in val[1:-1]: # value contains * inside string - use regex match
op = "matches regex"
val = re.sub('([".^$]|\\\\(?![*?]))', '\\\\\g<1>', val)
val = re.sub('\\*', '.*', val)
val = re.sub('\\?', '.', val)
else: # value possibly only starts and/or ends with *, use prefix/postfix match
if val.endswith("*") and val.startswith("*"):
op = "contains"
val = self.cleanValue(val[1:-1])
elif val.endswith("*"):
op = "startswith"
val = self.cleanValue(val[:-1])
elif val.startswith("*"):
op = "endswith"
val = self.cleanValue(val[1:])
return "%s \"%s\"" % (op, val)
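A simplified restatement of the operator selection above may make the wildcard handling easier to follow (assumption: the regex escaping of the original is omitted, and the operator is returned as a tuple rather than formatted into the query string):

```python
# Simplified mirror of default_value_mapping's operator choice:
# inner '*' -> regex; leading/trailing '*' -> contains/startswith/endswith.
def wildcard_op(val):
    if '*' in val[1:-1]:                 # wildcard inside the string
        return 'matches regex', val
    if val.startswith('*') and val.endswith('*'):
        return 'contains', val[1:-1]
    if val.endswith('*'):
        return 'startswith', val[:-1]
    if val.startswith('*'):
        return 'endswith', val[1:]
    return '==', val
```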
def logontype_mapping(self, src):
"""Value mapping for logon events to reduced ATP LogonType set"""
logontype_mapping = {
2: "Interactive",
3: "Network",
4: "Batch",
5: "Service",
7: "Interactive", # unsure
8: "Network",
9: "Interactive", # unsure
10: "Remote interactive (RDP) logons", # really the value?
11: "Interactive"
}
try:
return logontype_mapping[int(src)]
except KeyError:
raise NotSupportedError("Logon type %d unknown and can't be mapped" % src)
def decompose_user(self, src_field, src_value):
"""Decompose domain\\user User field of Sysmon events into ATP InitiatingProcessAccountDomain and InititatingProcessAccountName."""
reUser = re.compile("^(.*?)\\\\(.*)$")
m = reUser.match(src_value)
if m:
domain, user = m.groups()
return (("InitiatingProcessAccountDomain", domain), ("InititatingProcessAccountName", user))
else: # assume only user name is given if backslash is missing
return (("InititatingProcessAccountName", src_value),)
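The domain\user split performed by `decompose_user` can be exercised standalone; this sketch uses the same regex and fallback behavior (only the user name is assumed when no backslash is present):

```python
import re

# Same split as decompose_user above: 'DOMAIN\\user' -> (domain, user),
# bare 'user' -> (None, user).
reUser = re.compile(r'^(.*?)\\(.*)$')

def split_user(value):
    m = reUser.match(value)
    return m.groups() if m else (None, value)
```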
def generate(self, sigmaparser):
self.table = None
try:
self.product = sigmaparser.parsedyaml['logsource']['product']
self.service = sigmaparser.parsedyaml['logsource']['service']
except KeyError:
self.product = None
self.service = None
super().generate(sigmaparser)
def generateBefore(self, parsed):
if self.table is None:
raise NotSupportedError("No WDATP table could be determined from Sigma rule")
return "%s | where " % self.table
def generateMapItemNode(self, node):
"""
ATP queries refer to event tables instead of Windows logging event identifiers. This method catches conditions that refer to this field
and creates an appropriate table reference.
"""
key, value = node
if type(value) == list: # handle map items with values list like multiple OR-chained conditions
return self.generateORNode(
[(key, v) for v in value]
)
elif key == "EventID": # EventIDs are not reflected in condition but in table selection
if self.product == "windows":
if self.service == "sysmon" and value == 1 \
or self.service == "security" and value == 4688: # Process Execution
self.table = "ProcessCreationEvents"
return None
elif self.service == "sysmon" and value == 3: # Network Connection
self.table = "NetworkCommunicationEvents"
return None
elif self.service == "sysmon" and value == 7: # Image Load
self.table = "ImageLoadEvents"
return None
elif self.service == "sysmon" and value == 8: # Create Remote Thread
self.table = "MiscEvents"
return "ActionType == \"CreateRemoteThread\""
elif self.service == "sysmon" and value == 11: # File Creation
self.table = "FileCreationEvents"
return None
elif self.service == "sysmon" and value == 13 \
or self.service == "security" and value == 4657: # Set Registry Value
self.table = "RegistryEvents"
return "ActionType == \"SetValue\""
elif self.service == "security" and value == 4624:
self.table = "LogonEvents"
return None
elif type(value) in (str, int): # default value processing
try:
mapping = self.fieldMappings[key]
except KeyError:
raise NotSupportedError("No mapping defined for field '%s'" % key)
if len(mapping) == 1:
mapping = mapping[0]
if type(mapping) == str:
return mapping
elif callable(mapping):
conds = mapping(key, value)
return self.generateSubexpressionNode(
self.generateANDNode(
[cond for cond in mapping(key, value)]
)
)
elif len(mapping) == 2:
result = list()
for mapitem, val in zip(mapping, node): # iterate mapping and mapping source value synchronously over key and value
if type(mapitem) == str:
result.append(mapitem)
elif callable(mapitem):
result.append(mapitem(val))
return "{} {}".format(*result)
else:
raise TypeError("Backend does not support map values of type " + str(type(value)))
return super().generateMapItemNode(node)
class SplunkXMLBackend(SingleTextQueryBackend, MultiRuleOutputMixin):
"""Converts Sigma rule into XML used for Splunk Dashboard Panels"""
identifier = "splunkxml"
active = True
index_field = "index"
panel_pre = "<row><panel><title>"
panel_inf = "</title><table><search><query>"
panel_suf = "</query><earliest>$field1.earliest$</earliest><latest>$field1.latest$</latest><sampleRatio>1</sampleRatio>" \
"</search><option name=\"count\">20</option><option name=\"dataOverlayMode\">none</option><option name=\"" \
"drilldown\">row</option><option name=\"percentagesRow\">false</option><option name=\"refresh.display\">" \
"progressbar</option><option name=\"rowNumbers\">false</option><option name=\"totalsRow\">false</option>" \
"<option name=\"wrap\">true</option></table></panel></row>"
dash_pre = "<form><label>MyDashboard</label><fieldset submitButton=\"false\"><input type=\"time\" token=\"field1\">" \
"<label></label><default><earliest>-24h@h</earliest><latest>now</latest></default></input></fieldset>"
dash_suf = "</form>"
queries = dash_pre
reEscape = re.compile('("|\\\\(?![*?]))')
reClear = SplunkBackend.reClear
andToken = SplunkBackend.andToken
orToken = SplunkBackend.orToken
notToken = SplunkBackend.notToken
subExpression = SplunkBackend.subExpression
listExpression = SplunkBackend.listExpression
listSeparator = SplunkBackend.listSeparator
valueExpression = SplunkBackend.valueExpression
nullExpression = SplunkBackend.nullExpression
notNullExpression = SplunkBackend.notNullExpression
mapExpression = SplunkBackend.mapExpression
mapListsSpecialHandling = SplunkBackend.mapListsSpecialHandling
mapListValueExpression = SplunkBackend.mapListValueExpression
def generateMapItemListNode(self, key, value):
return "(" + (" OR ".join(['%s=%s' % (key, self.generateValueNode(item)) for item in value])) + ")"
def generateAggregation(self, agg):
if agg == None:
return ""
if agg.aggfunc == sigma.parser.SigmaAggregationParser.AGGFUNC_NEAR:
return ""
if agg.groupfield == None:
return " | stats %s(%s) as val | search val %s %s" % (agg.aggfunc_notrans, agg.aggfield, agg.cond_op, agg.condition)
else:
return " | stats %s(%s) as val by %s | search val %s %s" % (agg.aggfunc_notrans, agg.aggfield, agg.groupfield, agg.cond_op, agg.condition)
def generate(self, sigmaparser):
"""Method is called for each sigma rule and receives the parsed rule (SigmaParser)"""
for parsed in sigmaparser.condparsed:
query = self.generateQuery(parsed)
if query is not None:
self.queries += self.panel_pre
self.queries += self.getRuleName(sigmaparser)
self.queries += self.panel_inf
query = query.replace("<", "&lt;")
query = query.replace(">", "&gt;")
self.queries += query
self.queries += self.panel_suf
def finalize(self):
self.queries += self.dash_suf
self.output.print(self.queries)
class GrepBackend(BaseBackend, QuoteCharMixin):
"""Generates Perl compatible regular expressions and puts 'grep -P' around it"""
identifier = "grep"
@@ -948,3 +1249,228 @@ class BackendError(Exception):
class NotSupportedError(BackendError):
"""Exception is raised if some output is required that is not supported by the target language."""
pass
class PartialMatchError(Exception):
pass
class FullMatchError(Exception):
pass
class ArcSightBackend(SingleTextQueryBackend):
"""Converts Sigma rule into ArcSight saved search. Contributed by SOC Prime. https://socprime.com"""
identifier = "arcsight"
active = True
andToken = " AND "
orToken = " OR "
notToken = " NOT "
subExpression = "(%s)"
listExpression = "(%s)"
listSeparator = " OR "
valueExpression = "\"%s\""
containsExpression = "%s CONTAINS %s"
nullExpression = "NOT _exists_:%s"
notNullExpression = "_exists_:%s"
mapExpression = "%s = %s"
mapListsSpecialHandling = True
mapListValueExpression = "%s = %s"
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
aFL = ["deviceVendor", "categoryDeviceGroup", "deviceProduct"]
for item in self.sigmaconfig.fieldmappings.values():
if item.target_type is list:
aFL.extend(item.target)
else:
aFL.append(item.target)
self.allowedFieldsList = list(set(aFL))
# Skip logsource value from sigma document for separate path.
def generateCleanValueNodeLogsource(self, value):
return self.valueExpression % (self.cleanValue(str(value)))
# Clearing values from special characters.
def CleanNode(self, node):
search_ptrn = re.compile(r"[\/\\@?#&_%*',\(\)\" ]")
replace_ptrn = re.compile(r"[ \/\\@?#&_%*',\(\)\" ]")
match = search_ptrn.search(str(node))
new_node = list()
if match:
replaced_str = replace_ptrn.sub('*', node)
node = [x for x in replaced_str.split('*') if x]
new_node.extend(node)
else:
new_node.append(node)
node = new_node
return node
# Clearing values from special characters.
def generateMapItemNode(self, node):
key, value = node
if key in self.allowedFieldsList:
if self.mapListsSpecialHandling == False and type(value) in (
str, int, list) or self.mapListsSpecialHandling == True and type(value) in (str, int):
return self.mapExpression % (key, self.generateCleanValueNodeLogsource(value))
elif type(value) is list:
return self.generateMapItemListNode(key, value)
else:
raise TypeError("Backend does not support map values of type " + str(type(value)))
else:
if self.mapListsSpecialHandling == False and type(value) in (
str, int, list) or self.mapListsSpecialHandling == True and type(value) in (str, int):
if type(value) is str:
new_value = list()
value = self.CleanNode(value)
if type(value) == list:
new_value.append(self.andToken.join([self.valueExpression % val for val in value]))
else:
new_value.append(value)
if len(new_value)==1:
return "(" + self.generateANDNode(new_value) + ")"
else:
return "(" + self.generateORNode(new_value) + ")"
else:
return self.generateValueNode(value)
elif type(value) is list:
new_value = list()
for item in value:
item = self.CleanNode(item)
if type(item) is list and len(item) == 1:
new_value.append(self.valueExpression % item[0])
elif type(item) is list:
new_value.append(self.andToken.join([self.valueExpression % val for val in item]))
else:
new_value.append(item)
return self.generateORNode(new_value)
else:
raise TypeError("Backend does not support map values of type " + str(type(value)))
    # Parenthesize keyword values that contain 'AND'.
    def generateValueNode(self, node):
        if type(node) is int:
            return self.cleanValue(str(node))
        if 'AND' in node:
            return "(" + self.cleanValue(str(node)) + ")"
        else:
            return self.cleanValue(str(node))
    # Join the elements of an ArcSight search with OR.
    def generateMapItemListNode(self, key, value):
        itemslist = list()
        for item in value:
            if key in self.allowedFieldsList:
                itemslist.append('%s = %s' % (key, self.generateValueNode(item)))
            else:
                itemslist.append('%s' % (self.generateValueNode(item)))
        return " OR ".join(itemslist)
    # Append the constant tail to every translated query.
    def generate(self, sigmaparser):
        """Method is called for each sigma rule and receives the parsed rule (SigmaParser)"""
        const_title = ' AND type != 2 | rex field = flexString1 mode=sed "s//Sigma: {}/g"'
        for parsed in sigmaparser.condparsed:
            self.output.print(self.generateQuery(parsed) + const_title.format(sigmaparser.parsedyaml["title"]))
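As a rough illustration of the tail that `generate` appends, the snippet below builds one full output line; the query string and rule title are made-up placeholders, only `const_title` is taken from the method above.

```python
# Constant tail copied from ArcSightBackend.generate above.
const_title = ' AND type != 2 | rex field = flexString1 mode=sed "s//Sigma: {}/g"'

# Hypothetical query and rule title, for illustration only.
query = '(deviceVendor = "Microsoft")'
full = query + const_title.format("Example Rule")
print(full)
```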
    # Wrap subexpressions in parentheses.
    def generateSubexpressionNode(self, node):
        return self.subExpression % self.generateNode(node.items)

    # generateORNode algorithm for the ArcSightBackend class.
    def generateORNode(self, node):
        if type(node) == sigma.parser.ConditionOR and all(isinstance(item, str) for item in node):
            new_value = list()
            for value in node:
                value = self.CleanNode(value)
                if type(value) is list:
                    new_value.append(self.andToken.join([self.valueExpression % val for val in value]))
                else:
                    new_value.append(value)
            return "(" + self.orToken.join([self.generateNode(val) for val in new_value]) + ")"
        return "(" + self.orToken.join([self.generateNode(val) for val in node]) + ")"
class QualysBackend(SingleTextQueryBackend):
    """Converts Sigma rule into Qualys saved search. Contributed by SOC Prime. https://socprime.com"""
    identifier = "qualys"
    active = True
    andToken = " and "
    orToken = " or "
    notToken = "not "
    subExpression = "(%s)"
    listExpression = "%s"
    listSeparator = " "
    valueExpression = "%s"
    nullExpression = "%s is null"
    notNullExpression = "not (%s is null)"
    mapExpression = "%s:`%s`"
    mapListsSpecialHandling = True
    PartialMatchFlag = False

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        fl = []
        for item in self.sigmaconfig.fieldmappings.values():
            if item.target_type == list:
                fl.extend(item.target)
            else:
                fl.append(item.target)
        self.allowedFieldsList = list(set(fl))
    def generateORNode(self, node):
        new_list = []
        for val in node:
            if type(val) == tuple and not (val[0] in self.allowedFieldsList):
                pass
                # self.PartialMatchFlag = True
            else:
                new_list.append(val)
        generated = [self.generateNode(val) for val in new_list]
        filtered = [g for g in generated if g is not None]
        return self.orToken.join(filtered)

    def generateANDNode(self, node):
        new_list = []
        for val in node:
            if type(val) == tuple and not (val[0] in self.allowedFieldsList):
                self.PartialMatchFlag = True
            else:
                new_list.append(val)
        generated = [self.generateNode(val) for val in new_list]
        filtered = [g for g in generated if g is not None]
        return self.andToken.join(filtered)
    def generateMapItemNode(self, node):
        key, value = node
        if self.mapListsSpecialHandling == False and type(value) in (
                str, int, list) or self.mapListsSpecialHandling == True and type(value) in (str, int):
            if key in self.allowedFieldsList:
                return self.mapExpression % (key, self.generateNode(value))
            else:
                return self.generateNode(value)
        elif type(value) == list:
            return self.generateMapItemListNode(key, value)
        else:
            raise TypeError("Backend does not support map values of type " + str(type(value)))

    def generateMapItemListNode(self, key, value):
        itemslist = []
        for item in value:
            if key in self.allowedFieldsList:
                itemslist.append('%s:`%s`' % (key, self.generateValueNode(item)))
            else:
                itemslist.append('%s' % (self.generateValueNode(item)))
        return "(" + (" or ".join(itemslist)) + ")"
    def generate(self, sigmaparser):
        """Method is called for each sigma rule and receives the parsed rule (SigmaParser)"""
        all_keys = set()
        for parsed in sigmaparser.condparsed:
            query = self.generateQuery(parsed)
            if query == "()":
                self.PartialMatchFlag = None
            if self.PartialMatchFlag == True:
                raise PartialMatchError(query)
            elif self.PartialMatchFlag == None:
                raise FullMatchError(query)
            else:
                self.output.print(query)
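A small sketch of the query shape the `mapExpression` and `orToken` attributes above produce for a field with multiple values; the field name and values are hypothetical.

```python
# Attribute values copied from the QualysBackend class above.
mapExpression = "%s:`%s`"
orToken = " or "

# Hypothetical field and values, for illustration only.
items = [mapExpression % ("EventID", v) for v in ("4688", "4689")]
query = "(" + orToken.join(items) + ")"
print(query)  # (EventID:`4688` or EventID:`4689`)
```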
@@ -67,7 +67,7 @@ class SigmacArgumentParser(argparse.ArgumentParser):
         return helptext
 
 argparser = SigmacArgumentParser(description="Convert Sigma rules into SIEM signatures.")
-argparser.add_argument("--recurse", "-r", action="store_true", help="Recurse into subdirectories (not yet implemented)")
+argparser.add_argument("--recurse", "-r", action="store_true", help="Use directory as input (recurse into subdirectories is not implemented yet)")
 argparser.add_argument("--filter", "-f", help="""
 Define comma-separated filters that must match (AND-linked) to rule to be processed.
 Valid filters: level<=x, level>=x, level=x, status=y, logsource=z.
@@ -82,7 +82,7 @@ argparser.add_argument("--config", "-c", help="Configuration with field name and
 argparser.add_argument("--output", "-o", default=None, help="Output file or filename prefix if multiple files are generated (not yet implemented)")
 argparser.add_argument("--backend-option", "-O", action="append", help="Options and switches that are passed to the backend")
 argparser.add_argument("--defer-abort", "-d", action="store_true", help="Don't abort on parse or conversion errors, proceed with next rule. The exit code from the last error is returned")
-argparser.add_argument("--ignore-not-implemented", "-I", action="store_true", help="Only return error codes for parse errors and ignore errors for rules with not implemented features")
+argparser.add_argument("--ignore-backend-errors", "-I", action="store_true", help="Only return error codes for parse errors and ignore errors for rules that cause backend errors. Useful, when you want to get as much queries as possible.")
 argparser.add_argument("--verbose", "-v", action="store_true", help="Be verbose")
 argparser.add_argument("--debug", "-D", action="store_true", help="Debugging output")
 argparser.add_argument("inputs", nargs="*", help="Sigma input files")
@@ -153,18 +153,37 @@ for sigmafile in get_inputs(cmdargs.inputs, cmdargs.recurse):
         error = 4
         if not cmdargs.defer_abort:
             sys.exit(error)
+    except backends.NotSupportedError as e:
+        print("The Sigma rule requires a feature that is not supported by the target system: " + str(e), file=sys.stderr)
+        if not cmdargs.ignore_backend_errors:
+            error = 9
+            if not cmdargs.defer_abort:
+                sys.exit(error)
     except backends.BackendError as e:
         print("Backend error in %s: %s" % (sigmafile, str(e)), file=sys.stderr)
-        error = 8
-        if not cmdargs.defer_abort:
-            sys.exit(error)
+        if not cmdargs.ignore_backend_errors:
+            error = 8
+            if not cmdargs.defer_abort:
+                sys.exit(error)
     except NotImplementedError as e:
         print("An unsupported feature is required for this Sigma rule: " + str(e), file=sys.stderr)
         print("Feel free to contribute for fun and fame, this is open source :) -> https://github.com/Neo23x0/sigma", file=sys.stderr)
-        if not cmdargs.ignore_not_implemented:
+        if not cmdargs.ignore_backend_errors:
             error = 42
             if not cmdargs.defer_abort:
                 sys.exit(error)
+    except backends.PartialMatchError as e:
+        print("Partial field match error: %s" % str(e), file=sys.stderr)
+        if not cmdargs.ignore_backend_errors:
+            error = 80
+            if not cmdargs.defer_abort:
+                sys.exit(error)
+    except backends.FullMatchError as e:
+        print("Full field match error", file=sys.stderr)
+        if not cmdargs.ignore_backend_errors:
+            error = 90
+            if not cmdargs.defer_abort:
+                sys.exit(error)
     finally:
         try:
             f.close()
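The gating pattern this change applies to every backend error can be sketched in isolation; `cmdargs` is replaced here by a hypothetical stand-in object, and only the flag names and error code mirror the diff above.

```python
import sys

# Hypothetical stand-in for the parsed cmdargs; attribute names mirror the diff.
class Args:
    ignore_backend_errors = True   # --ignore-backend-errors / -I
    defer_abort = False            # --defer-abort / -d

cmdargs = Args()
error = 0

# Same gating the diff introduces for backends.BackendError (code 8):
# the error code is only set, and the abort only taken, when -I is absent.
if not cmdargs.ignore_backend_errors:
    error = 8
    if not cmdargs.defer_abort:
        sys.exit(error)

print(error)  # 0 -- with -I set, a backend error neither aborts nor sets the code
```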