Using Logstash to import CSV and JSON files into Elasticsearch

Using Logstash to import CSV files into Elasticsearch


The test.csv file contains the following data:
01/01/2012 12:01:00 AM,18900 TIMES AV,SAN LORENZO,CA,94541,290,VANDALISM ($400 OR MORE),ACSO,12000048,"(37.67705445, -122.11298064)"
01/01/2012 12:01:00 AM,21000 SHERMAN DR,CASTRO VALLEY,CA,94552,250,FORGERY,ACSO,13006643,"(37.71437982, -122.02231781)"
01/01/2012 12:03:00 AM,A ST / MISSION BL,HAYWARD,CA,94541,999,REPORT NUMBER WAS CANCELLED,ACSO,11021284,"(37.66655037, -122.10000611)"
01/01/2012 12:03:00 AM,A ST / MISSION BL,HAYWARD,CA,94541,999,REPORT NUMBER WAS CANCELLED,ACSO,12000010,"(37.66655037, -122.10000611)"
01/01/2012 12:03:00 AM,A ST / MISSION BL,HAYWARD,CA,94541,90D,DUI ALCOHOL/DRUGS,ACSO,11021283,"(37.66655037, -122.10000611)"

Logstash conf file 

input {
  file {
    path => "/home/ubuntu/test.csv"
    # read the file from the beginning instead of only tailing new lines
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["DateTime","Block","City","State","Zip","CrimeCode","CrimeDescription","AgencyId","CrimeId","Location"]
  }
}

output {
  # print each parsed event to the console for debugging
  stdout {
    codec => rubydebug
  }
  # index the events into Elasticsearch with default settings
  elasticsearch {}
}
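An empty elasticsearch {} output sends events to the default logstash-YYYY.MM.dd index. As a rough sketch (not in the original post), the output can name the target explicitly; the host and index name below are assumptions, and the option is called host on Logstash 1.x but hosts on later versions:

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]   # use host => "localhost" on Logstash 1.x
    index => "crimes"             # hypothetical index name
  }
}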


The separator can be any character, for example a tab (\t), a colon (:), a slash (/), and so on. For a tab-separated file:
separator => "\t"
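For example, a minimal sketch of the same csv filter for a semicolon-separated file (the column names here are just placeholders):

filter {
  csv {
    separator => ";"
    columns => ["Field1","Field2","Field3"]
  }
}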

Using Logstash to import JSON files into Elasticsearch


The test.json file contains the following data:
{"Query":{"project_id":"a7565b911f324a9199a91854ea18de7e","vsp_timestamp":1392076800,"vsp_tx_id":"2e20a255448742cebdd2ccf5c207cd4e","vuforia_token":"3F23A788D06DD5FE9745D140C264C2A4D7A8C0E6acf4a4e01ba39c66c7c9cbd6a123588b22dc3a24"}}
{"Response":{"result_code":"Success","project_id":"a7565b911f324a9199a91854ea18de7e","vsp_timestamp":1392076801,"http_status_code":200,"vsp_tx_id":"2e20a255448742cebdd2ccf5c207cd4e","vuforia_token":"3F23A788D06DD5FE9745D140C264C2A4D7A8C0E6acf4a4e01ba39c66c7c9cbd6a123588b22dc3a24","targets":[]}}
{"Query":{"project_id":"a7565b911f324a9199a91854ea18de7e","vsp_timestamp":1392076801,"vsp_tx_id":"f7f68c7fb14f4959a1db1a206c88a5b7","vuforia_token":"3F23A788D06DD5FE9745D140C264C2A4D7A8C0E6acf4a4e01ba39c66c7c9cbd6a123588b22dc3a24"}}

Logstash conf file 

input {
  file {
    path => ["/home/ubuntu/test.json"]
    # parse each line of the file as a JSON document
    codec => json
    start_position => "beginning"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {}
}
Note: JSON data can also be uploaded directly into Elasticsearch using the Bulk API.
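As a rough sketch of that approach (not from the original post), the same newline-delimited JSON file could be pushed through the Bulk API with Python's requests library. The host, index name, and type name below are assumptions; older Elasticsearch versions require the _type field, while recent ones drop it.

import json
import requests

BULK_URL = "http://localhost:9200/_bulk"   # assumed local Elasticsearch

actions = []
with open("/home/ubuntu/test.json") as f:
    for line in f:
        line = line.strip()
        if not line:
            continue
        # each document is preceded by an action/metadata line
        actions.append(json.dumps({"index": {"_index": "vuforia", "_type": "logs"}}))
        actions.append(line)

# the bulk payload must end with a newline
payload = "\n".join(actions) + "\n"
resp = requests.post(BULK_URL, data=payload,
                     headers={"Content-Type": "application/x-ndjson"})
print(resp.json())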
