Logstash Import Source
Overview
You can use the Information Security Analytics Manager to import Logstash records as Data Sources into the IKANOW Information Security Analytics platform.
Using the wizard, it is easy to specify header rows, field names, and quote and separator characters.
Start
To import a Logstash record and start configuring the source
- From the Threat Analytics Dashboard, click on Data Sources.
- Click on Add New Source.
- Under "What kind of source would you like to create?" specify "Logstash Import Source."
- Click on Next.
- Configure the fields as described in the table below.
Field | Description | Note |
---|---|---|
Source | File/HDFS: file path to a Hadoop Distributed File System (HDFS) location. S3: URL to an Amazon S3 location, e.g. test.ikanow.com | |
S3 Username | Username for the Amazon S3 bucket. | Only displayed if S3 is selected as the Source. |
S3 Password | Password for the Amazon S3 bucket. | Only displayed if S3 is selected as the Source. |
Quote Character | Default: " | Standard characters have been selected for the quote and separator. You may choose from the list of other common options or add your own based on your data. |
Separator | Default: , | Standard characters have been selected for the quote and separator. You may choose from the list of other common options or add your own based on your data. |
Column Headers | Used to manually specify each column header for separated-values files (e.g. CSV). | |
Date Column | Column name that holds the date/time. | |
Date/Time Mask | Date/time format, e.g. MM/dd/yyyy HH:mm:ss. The date filter is used to parse dates from fields, and the resulting date or timestamp becomes the Logstash timestamp for the event. For example, syslog events usually have timestamps like "Apr 17 09:32:01", which you would parse with the date format "MMM dd HH:mm:ss" (see the example following this table). | Select from the dropdown of commonly used settings or specify Other for a custom setting. |
Timezone | Select the appropriate timezone from the dropdown. | |
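These settings map onto standard Logstash filters: the Quote Character, Separator, and Column Headers fields configure CSV parsing, while the Date Column, Date/Time Mask, and Timezone fields configure date parsing. Below is a minimal sketch of the kind of filter block they correspond to; the column names and mask shown are illustrative assumptions, not values taken from an actual generated configuration.

```
filter {
  # Split each line on the configured separator, honoring the quote character.
  csv {
    separator  => ","
    quote_char => "\""
    columns    => [ "log_date", "src_ip", "message" ]   # hypothetical Column Headers
  }
  # Parse the date column and use it as the Logstash timestamp for the event.
  date {
    match    => [ "log_date", "MM/dd/yyyy HH:mm:ss" ]   # the Date/Time Mask
    timezone => "UTC"                                   # the selected Timezone
  }
}
```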
Advanced Options
The following fields are also available for more advanced Logstash import settings.
Field | Description |
---|---|
Geo IP | Column name to use for IP geo-tagging (see the sketch following this table). |
Type | Default value: SNORT. Describes the type of file for the Logstash configuration (e.g. "apache", "syslog"); mainly used for filter activation. Possible values: Firewall, Proxy, RADIUS, IDS, IPS, HBSS, Web_Server, DNS, SQL, Router, DHCP, Other. |
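The Geo IP option corresponds to Logstash's geoip filter, which enriches an event with location fields derived from an IP address. A minimal sketch, assuming a hypothetical column named "src_ip":

```
filter {
  # Add geo fields (country, city, coordinates, etc.) based on the IP column.
  geoip {
    source => "src_ip"   # the column named in the Geo IP setting
  }
}
```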
Configure and Test
Once you have entered the input settings, you will need to perform additional configuration and testing.
To test the source
- Provide a name for the source.
- Select the previously created Data Group.
- Specify Media Type.
- Indicate if the Data Origin is Internal or External. For more information, see Data Sources.
- Specify the frequency at which the source should be harvested (e.g. once per day).
- Click on Test or Save Source.
About Testing
If the source has been configured properly, testing will return results and you will be able to move forward with Publishing the new source. Otherwise, a failure message is generated that can be used for troubleshooting. You can always Save your source and come back to fix any testing errors later.
Saving or Publishing
Saving
To save the source after testing
- Click on Save.
The source is saved and you are redirected to the Source Manager. You can return to the source later for Testing and Publishing.
Publishing
To publish the source after testing
- Click on Publish.
The source is published and you are redirected to the Source Manager.