[Bro] logging to elasticsearch git clone

Mo Jia life.130815 at gmail.com
Sat May 2 07:02:25 PDT 2015

I did what you said, and ElasticSearch was installed.

I think it succeeded at first. But after I removed the directory elasticsearch-1.5.2/data/
and tried again, it doesn't work any more.

The logs directory only contains stderr.log and stdout.log (disabling local logging took effect).

ikfb at ikfb:/usr/local/bro/logs/current$ cat stderr.log
listening on eth0, capture length 8192 bytes

ikfb at ikfb:/usr/local/bro/logs/current$ cat stdout.log
max memory size         (kbytes, -m) unlimited
data seg size           (kbytes, -d) unlimited
virtual memory          (kbytes, -v) unlimited
core file size          (blocks, -c) unlimited

That seems to work fine.
I thought elasticsearch-1.5.2/data/ would be rebuilt after I removed it. I didn't
change the Bro setup. Any suggestions for debugging why Bro can't connect?

I added print statements in share/bro/elasticsearch/logs-to-elasticsearch.bro:

event bro_init() &priority=-5
	{
	if ( server_host == "" )
		return;

	print "before for";

	for ( stream_id in Log::active_streams )
		{
		if ( stream_id in excluded_log_ids ||
		     (|send_logs| > 0 && stream_id !in send_logs) )
			next;

		print "after if";

		local filter: Log::Filter = [$name = "default-es",
		                             $writer = Log::WRITER_ELASTICSEARCH,
		                             $interv = LogElasticSearch::rotation_interval];
		Log::add_filter(stream_id, filter);
		}
	}

The messages don't show up in the broctl session where I started it. I think
they may end up somewhere else and I am looking in the wrong place.

2015-05-02 6:29 GMT+08:00 Daniel Guerra <daniel.guerra69 at gmail.com>:
> Logging locally and then parsing (the logstash way) is not really preferred.
> I have been playing with docker and created a docker image for bro with
> elasticsearch. This works great: bro uses elasticsearch to log; only kibana
> needs a different timestamp (ts).
> To check that your bro can do elasticsearch, run:
> /usr/local/bro/bin/bro -N Bro::ElasticSearch
> It should give:
> Bro::ElasticSearch - ElasticSearch log writer (dynamic, version 1.0)
> To set up elasticsearch:
> vi /usr/local/bro/share/bro/base/frameworks/logging/main.bro
> and set
> const enable_local_logging = F
> to avoid local logging
> vi /usr/local/bro/lib/bro/plugins/Bro_ElasticSearch/scripts/init.bro
> and set

By the way: can we just add these lines to local.bro?

@load elasticsearch/logs-to-elasticsearch
export {
	redef Log::enable_local_logging = F;
	redef LogAscii::json_timestamps = JSON::TS_ISO8601;
}

> ## Name of the ES cluster.
>         const cluster_name = "<clustername>" &redef;
>         ## ES server.
>         const server_host = "<yourip>" &redef;
> To get the clustername and ip, check with your browser:
> http://<elasticip>:9200/_nodes
> mkdir /usr/local/bro/share/bro/elasticsearch and copy from the bro git
> source dir
> aux/plugins/elasticsearch/scripts/Bro/ElasticSearch/logs-to-elasticsearch.bro
> to
> /usr/local/bro/share/bro/elasticsearch
> add to /usr/local/bro/share/bro/base/init-default.bro
> @load elasticsearch/logs-to-elasticsearch
> You are now ready to log to elasticsearch
> In kibana use bro-* to get your indices or check
> http://<elasticip>:9200/_cat/indices?v
> Hopefully bro will be able to log a YYYY:mm:dd HH:MM:ss format for ts; work in progress
> …….
> Regards,
> Daniel
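
On the ts format Daniel mentions: Bro writes ts as a Unix epoch value with
sub-second precision, while Kibana wants an ISO 8601 date. A minimal Python
sketch of that conversion (the function name is mine, purely illustrative):

```python
from datetime import datetime, timezone

def bro_ts_to_iso8601(ts: float) -> str:
    """Convert a Bro epoch timestamp (seconds, possibly fractional)
    to an ISO 8601 string in UTC, the shape Kibana expects."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime(
        "%Y-%m-%dT%H:%M:%S.%f")

# Midnight UTC on 2015-05-02 expressed as a Bro-style epoch timestamp:
print(bro_ts_to_iso8601(1430524800.0))  # 2015-05-02T00:00:00.000000
```

Something along these lines could be done in a logstash/ingest step until Bro
itself can emit that format.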
