
Overview

You can use the Threat Analytics Manager to import Logstash records into Data Sources.

Using the wizard, you can easily specify header rows, field names, and quote and separator characters.

Importing the Logstash Record

To import the Logstash record

  1. From the Threat Analytics Dashboard, click on Data Sources (top right).
  2. Under "What kind of source would you like to create?" specify "Logstash Import Source."
  3. Click on Next.
  4. Configure the fields as described in the table below.
S3 Location: URL for the Amazon S3 bucket (e.g. test.ikanow.com).

S3 Username: Username for the Amazon S3 bucket.

S3 Password: Password for the Amazon S3 bucket.

Quote Character: Default: "

Separator: Default: ,

Column Headers: Used to manually specify each column header for separated-values files (e.g. CSV).

Note: Standard characters have been selected for the quote and separator. You may choose from the list of other common options or add your own based on your data.
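The wizard fields above correspond roughly to a Logstash pipeline with an S3 input and a CSV filter. A minimal sketch (the bucket name, credentials, and column names below are placeholders, not values the product requires):

```
input {
  s3 {
    bucket            => "test.ikanow.com"         # S3 Location
    access_key_id     => "YOUR_S3_USERNAME"        # S3 Username
    secret_access_key => "YOUR_S3_PASSWORD"        # S3 Password
  }
}

filter {
  csv {
    separator  => ","                              # Separator
    quote_char => "\""                             # Quote Character
    columns    => ["src_ip", "timestamp", "msg"]   # Column Headers
  }
}
```

The generated configuration may differ; this sketch only illustrates how the wizard settings map onto standard Logstash plugin options.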

Advanced Options

The following fields are also available for more advanced Logstash import settings.

Geo IP: The source IP address field (e.g. "src_ip") that you want associated with this record. It is used to geocode the CSV field to latitude and longitude information.

Date Column: Column name that holds the date/time.

Date/Time Mask: Date/time format (e.g. MM/dd/yyyy HH:mm:ss). The date filter is used for parsing dates from fields, and then using that date or timestamp as the Logstash timestamp for the event. For example, syslog events usually have timestamps like "Apr 17 09:32:01"; you would use the date format "MMM dd HH:mm:ss" to parse this.

Type: Default value: SNORT. Optional string that describes the type of file for the Logstash configuration (e.g. "apache", "syslog"). Mainly used for filter activation.
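The advanced options map onto Logstash's date and geoip filters. A sketch, assuming a date column named "timestamp" and a source IP column named "src_ip" (both hypothetical names for illustration):

```
filter {
  date {
    # Date Column plus Date/Time Mask: parse the field and use it
    # as the Logstash timestamp for the event.
    match => ["timestamp", "MM/dd/yyyy HH:mm:ss"]
  }
  geoip {
    # Geo IP: geocode the source IP column to latitude/longitude.
    source => "src_ip"
  }
}
```

The Type setting corresponds to the `type` option on the input, which downstream filters can match against for activation.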

Configuring and Testing

After entering the input settings, you will need to perform additional configuration and testing.

To configure and test

  1. Provide a name for the source.
  2. Select the previously created Data Group.
  3. Specify the frequency at which the source should be harvested (e.g. once per day).
  4. Click on Test.

About Testing

If the source has been configured properly, testing will return test results and you will be able to move forward with publishing the new source. Otherwise, a failure message is generated that can be used for troubleshooting (currently it only says FAIL). You can always Save your source and come back to fix any testing errors later.

Saving or Publishing

Saving 

To save the source after testing

  • Click on Save.

The source is saved and you are redirected to the Source Manager.

Publishing

To publish the source after testing

  • Click on Publish.

The source is published and you are redirected to the Source Manager list.

 

 
