[ https://jira.fiware.org/browse/HELP-16756?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Fernando Lopez reassigned HELP-16756:
-------------------------------------

    Assignee: Chandra Challagonda

> [fiware-stackoverflow] Parse FIWARE logs using fluentd
> ------------------------------------------------------
>
>           Key: HELP-16756
>           URL: https://jira.fiware.org/browse/HELP-16756
>       Project: Help-Desk
>    Issue Type: Monitor
>    Components: FIWARE-TECH-HELP
>      Reporter: Backlog Manager
>      Assignee: Chandra Challagonda
>        Labels: fiware, fluentd, kibana
>
> Created question in FIWARE Q/A platform on 27-05-2020 at 11:05
> {color: red}Please, ANSWER this question AT{color} https://stackoverflow.com/questions/62039849/parse-fiware-logs-using-fluentd
>
> +Question:+
> Parse FIWARE logs using fluentd
>
> +Description:+
> I can't parse FIWARE logs with a Fluentd parser. I would like to know how to do it, so that I can represent the contents of the logs in a Kibana dashboard.
>
> An example of a log line that I want to parse is:
>
> time=Wednesday 27 May 09:20:29 2020.830Z | lvl=INFO | corr=4ef1c162-9ffb-11ea-a000-02420a000008 | trans=1590570988-174-00000000007 | from=127.0.0.1 | srv=<none> | subsrv=<none> | comp=Orion | op=logMsg.h[1844]:lmTransactionStart | msg=Starting transaction from 127.0.0.1:46122/version
>
> Fluentd config file:
>
> <source>
>   @type forward
>   port 24224
>   bind 0.0.0.0
> </source>
>
> <filter **>
>   @type record_transformer
>   <record>
>     service_name ${tag_parts[1]}
>     stack_name ${tag_parts[0]}
>     hostname "#{Socket.gethostname}"
>   </record>
> </filter>
>
> <match *.**>
>   @type copy
>   <store>
>     @type elasticsearch
>     host elasticsearch
>     port 9200
>     logstash_format true
>     logstash_prefix fluentd
>     logstash_dateformat %Y.%m.%d
>     include_tag_key true
>     type_name access_log
>     tag_key @log_name
>     <buffer>
>       flush_interval 1s
>       flush_thread_count 2
>     </buffer>
>   </store>
>   <store>
>     @type stdout
>   </store>
> </match>

--
This message was sent by Atlassian JIRA
(v6.4.1#64016)
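One possible direction for the question above: the Orion log lines are pipe-separated key=value pairs, which Fluentd's built-in parser filter can split into structured fields before they reach Elasticsearch/Kibana. Below is a minimal sketch, not a verified answer; it assumes the raw line arrives in a record field named "log" (typical for Docker-forwarded logs) and that records are tagged so "orion.**" matches them. Both names are illustrative assumptions, as is the time_format mapping for Orion's "Wednesday 27 May 09:20:29 2020.830Z" style timestamps.

```
# Hypothetical filter to insert between <source> and the existing
# record_transformer filter. Assumes the raw line is in the "log" field
# and the tag matches orion.** -- adjust both to the actual setup.
<filter orion.**>
  @type parser
  key_name log
  reserve_data true   # keep the fields added by record_transformer
  <parse>
    @type regexp
    # One named capture group per "key=value" segment, split on " | "
    expression /^time=(?<time>[^|]+) \| lvl=(?<lvl>[^|]+) \| corr=(?<corr>[^|]+) \| trans=(?<trans>[^|]+) \| from=(?<from>[^|]+) \| srv=(?<srv>[^|]+) \| subsrv=(?<subsrv>[^|]+) \| comp=(?<comp>[^|]+) \| op=(?<op>[^|]+) \| msg=(?<msg>.*)$/
    time_key time
    # Assumed strptime mapping for "Wednesday 27 May 09:20:29 2020.830Z"
    time_format %A %d %b %H:%M:%S %Y.%LZ
  </parse>
</filter>
```

With a filter along these lines, each segment (lvl, corr, trans, comp, op, msg, ...) becomes its own document field in Elasticsearch, so Kibana can aggregate and filter on them directly instead of on one opaque string.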