Overview

You can use the Threat Analytics Manager to import CSV Logstash records as Data Sources into the IKANOW Information Security Analytics platform.

Using the wizard, you can easily specify CSV header rows, field names, and quote and separator characters.
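
For example, a simple comma-separated input file with a header row and quoted values might look like the following (the column names and records here are purely hypothetical):

  timestamp,src_ip,action,message
  "04/17/2015 09:32:01","10.0.0.5","ALLOW","inbound connection accepted"
  "04/17/2015 09:32:04","10.0.0.9","DENY","blocked by policy"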

Importing the CSV File


Start

To import the CSV Logstash record and start configuring the source

  1. From the Threat Analytics Dashboard, click on Data Sources (top right).
  2. Click on Add New Source.
  3. Under "What kind of source would you like to create?" specify "CSV Logstash Import Source."
  4. Click on Next.
  5. Configure the fields as described in the table below.
Field: Source
Description: Location of the CSV file to import. One of:
  • File: local file path where the CSV file can be located.
  • HDFS: file path to a Hadoop Distributed File System (HDFS) location.
  • S3: URL of the Amazon S3 location (eg. test.ikanow.com).

Field: Date
Description: Specific date that you would like to associate with the imported CSV file.
Note: What is the benefit of this, especially for files where each record has a different date/time?

Field: Time
Description: Specific time that you would like to associate with the imported CSV file.

Field: S3 Username
Description: Username for the Amazon S3 bucket.
Note: Only displayed if S3 is selected as Source.

Field: S3 Password
Description: Password for the Amazon S3 bucket.
Note: Only displayed if S3 is selected as Source.

Field: Quote Character
Description: Character used to quote field values. Default: "
Note: These should not require manual input, and should take the defaults if the user specifies nothing (currently throws an error). Default values should be viewable on the GUI.

Field: Separator
Description: Character used to separate field values. Default: ,

Field: Escape
Description: Character used to escape special characters. Default: \
Note: Shouldn't Escape also be on the GUI?

Standard characters have been selected for the quote, separator, and escape characters. You may choose from the list of other common options or add your own based on your data.

Field: Column Headers
Description: Used to manually specify each column header for separated-values files (eg. CSV).
Note: Shouldn't have to do this. Eg. if each header field starts with '#', the user should only have to specify this, or leave it blank for the platform to detect the headers automatically.

Field: Date Column
Description: Column name that holds the date/time.

Field: Date/Time Mask
Description: Date/time format, eg. MM/dd/yyyy HH:mm:ss. The date filter is used for parsing dates from fields, and then using that date or timestamp as the Logstash timestamp for the event. For example, syslog events usually have timestamps like "Apr 17 09:32:01"; you would use the date format "MMM dd HH:mm:ss" to parse this. Select from the dropdown of commonly used settings or specify Other for a custom setting (see the illustrative sketch after this table).

Field: Timezone
Description: Select the appropriate timezone from the dropdown.
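
In standard Logstash terms, the quote, separator, column header, date column, and date/time mask settings above correspond roughly to the csv and date filters. The sketch below is illustrative only, reuses the hypothetical columns from the example file in the Overview, and is not necessarily the configuration the platform generates:

  filter {
    csv {
      columns    => ["timestamp", "src_ip", "action", "message"]   # Column Headers
      separator  => ","                                            # Separator
      quote_char => "\""                                           # Quote Character
    }
    date {
      match    => ["timestamp", "MM/dd/yyyy HH:mm:ss"]             # Date Column and Date/Time Mask
      timezone => "UTC"                                            # Timezone
    }
  }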

Advanced Options

The following fields are also available for more advanced CSV Logstash import settings.

Field: Geo IP
Description: Column name to use for IP geo tagging. Input the column (eg. "src_ip", the source IP address) that you want associated with this record; it will be used to geocode the CSV field to latitude and longitude information.

Field: Type
Description: Describes the type of file for the Logstash configuration (eg. "apache", "syslog"). Mainly used for filter activation. Default value: SNORT.
Possible values: Firewall, Proxy, RADIUS, IDS, IPS, HBSS, Web_Server, DNS, SQL, Router, DHCP, Other.
Note: Not sure what this is referring to.
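
In standard Logstash terms, the Geo IP and Type settings roughly map to a geoip filter and to type-based conditionals that activate further filters. Again, this is only an illustrative sketch; the "src_ip" column and the "IDS" type value are examples carried over from above, not fixed names:

  filter {
    geoip {
      source => "src_ip"            # Geo IP: column to use for IP geo tagging
    }
    if [type] == "IDS" {            # Type: mainly used for filter activation
      # type-specific parsing or enrichment would be enabled here
    }
  }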

 

Configure and Test

Once you have made the input settings, you will need to perform additional configuration and testing.

To configure and test the source

  1. Provide a name for the source.
  2. Select the previously created Data Group.
  3. Specify Media Type.
  4. Indicate if the Data Origin is Internal or External.  For more information, see Data Sources.
  5. Specify the frequency at which the source should be harvested (eg. once per day).
  6. Click on Test or Save Source.

About Testing

If the source has been configured properly, testing will return test results and you will be able to move forward with publishing the new source. Otherwise, a failure message is generated which can be used for troubleshooting (currently it only says FAIL). You can always save your source and come back to fix any testing errors later.

Saving or Publishing

Saving 

To save the source after testing

  • Click on Save.

The source is saved and you are re-directed to the Source Manager.  You can return to the source later for Testing and Publishing.

Publishing

To publish the source after testing

  • Click on Publish.

The source is published and you are re-directed to the Source Manager.
