Logstash - _grokparsefailure on varnish log


My message looks like this:

1.2.3.4 "-" - - [19/Apr/2016:11:42:18 +0200] "GET http://monsite.vpù/api/opa/status HTTP/1.1" 200 92 "-" "curl - api-player - preprod" hit opa-preprod-api - 0.000144958

My grok pattern is:

grok {
  match => { "message" => "%{IP:clientip} \"%{DATA:x_forwarded_for}\" %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{QS:referrer} %{QS:agent} (%{NOTSPACE:hitmiss}|-) (%{NOTSPACE:varnish_conf}|-) (%{NOTSPACE:varnish_backend}|-) %{NUMBER:time_firstbyte}" }
}

I get a _grokparsefailure tag even though all the fields are filled in correctly, except the last one, which shows 0 instead of 0.000144958.

The full document in Elasticsearch is:

{   "_index": "logstash-2016.04.19",   "_type": "syslog",   "_id": "avqt7wscn-2lsqj9ziiq",   "_score": null,   "_source": {     "message": "212.95.71.201 \"-\" - - [19/apr/2016:11:50:12 +0200] \"get http://monsite.com/api/opa/status http/1.1\" 200 92 \"-\" \"curl - api-player - preprod\" hit opa-preprod-api - 0.000132084",     "@version": "1",     "@timestamp": "2016-04-19t09:50:12.000z",     "type": "syslog",     "host": "212.95.70.80",     "tags": [       "_grokparsefailure"     ],     "application": "varnish-preprod",     "clientip": "1.2.3.4",     "x_forwarded_for": "-",     "ident": "-",     "auth": "-",     "timestamp": "19/apr/2016:11:50:12 +0200",     "verb": "get",     "request": "http://monsite.com/api/opa/status",     "httpversion": "1.1",     "response": "200",     "bytes": "92",     "referrer": "\"-\"",     "agent": "\"curl - api-player - preprod\"",     "hitmiss": "hit",     "varnish_conf": "opa-preprod-api",     "varnish_backend": "-",     "time_firstbyte": "0.000132084",     "geoip": {       "ip": "1.2.3.4",       "country_code2": "fr",       "country_code3": "fra",       "country_name": "france",       "continent_code": "eu",       "region_name": "c1",       "city_name": "strasbourg",       "latitude": 48.60040000000001,       "longitude": 7.787399999999991,       "timezone": "europe/paris",       "real_region_name": "alsace",       "location": [         7.787399999999991,         48.60040000000001       ]     },     "agentname": "other",     "agentos": "other",     "agentdevice": "other"   },   "fields": {     "@timestamp": [       1461059412000     ]   },   "highlight": {     "agent": [       "\"curl - api-player - @kibana-highlighted-field@preprod@/kibana-highlighted-field@\""     ],     "varnish_conf": [       "opa-@kibana-highlighted-field@preprod@/kibana-highlighted-field@-api"     ],     "application": [       "@kibana-highlighted-field@varnish@/kibana-highlighted-field@-@kibana-highlighted-field@preprod@/kibana-highlighted-field@"     ],     "message": [       "1.2.3.4 \"-\" - - [19/apr/2016:11:50:12 +0200] \"get http://monsote.com/api/opa/status http/1.1\" 200 92 \"-\" \"curl - api-player - @kibana-highlighted-field@preprod@/kibana-highlighted-field@\" hit opa-@kibana-highlighted-field@preprod@/kibana-highlighted-field@-api - 0.000132084"     ]   },   "sort": [     1461059412000   ] } 

(It turned out that Kibana was simply not displaying such a small number.)

You get _grokparsefailure when a grok fails. Since your fields are being populated, it is probably not this grok that is producing the tag. Use the tag_on_failure parameter in your groks to give each grok a unique tag.
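For example, a minimal sketch (the pattern is abbreviated and the tag name is illustrative, not taken from your config):

grok {
  match => { "message" => "%{IP:clientip} ..." }
  # per-filter tag so you can tell which grok actually failed
  tag_on_failure => ["_grokparsefailure_varnish"]
}

If the tag that shows up on the event is not this one, the failure is coming from some other grok in your pipeline.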

As for the parsing problem, I'll bet the grok is working fine. Note that Elasticsearch creates fields dynamically and guesses the type of a field based on the first data seen. If the first value was "0", it would have made the field an integer, and later entries are cast to that type. You can pull the mapping to see what happened.
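For instance, you can inspect the mapping of that day's index (assuming Elasticsearch is reachable on localhost:9200; the index name is taken from the document above):

curl -XGET 'http://localhost:9200/logstash-2016.04.19/_mapping?pretty'

If time_firstbyte shows up there as a long or integer, that explains the 0 you see.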

You need to control the mapping that gets created. You can specify that the field is a float in the grok itself (%{NUMBER:myfield:float}) or by creating your own index template.
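Applied to your case, a sketch of both options (the template name below is illustrative, not from your setup). The grok cast only changes the last token of your pattern:

%{NUMBER:time_firstbyte:float}

Or an index template (Elasticsearch 2.x syntax, using the "syslog" type from your document):

curl -XPUT 'http://localhost:9200/_template/varnish_floats' -d '
{
  "template": "logstash-*",
  "mappings": {
    "syslog": {
      "properties": {
        "time_firstbyte": { "type": "float" }
      }
    }
  }
}'

Note that a template only applies to indices created after it is installed; an existing index keeps its current mapping.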

Also notice that NOTSPACE matches "-", so groups like (%{NOTSPACE:varnish_backend}|-) are not entirely correct: the literal "-" alternative is never reached, and the field ends up set to "-" instead of being left unset.
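A possible tweak (a sketch, not from the original config) is to try the literal first, so that a lone "-" leaves the field unset:

(?:-|%{NOTSPACE:varnish_backend})

The same applies to the hitmiss and varnish_conf groups.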

