In the last writeup we were able to generate logs based on file changes. But that alone does not help us much, because we would still have to monitor every single machine in the environment manually, which is impractical. It is far better to centralize all the logs into a single dashboard. Since ELK is one of the most common stacks for searching and visualizing data, we can ship the logs through Logstash to Elasticsearch and eventually view them in a Kibana dashboard. In this writeup we will push the logs to our Elasticsearch Logstash Kibana (ELK) system.

Now let's bring up Elasticsearch:

user1@osquery1:~$ sudo apt install
[sudo] password for user1: 
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following additional packages will be installed:
  bridge-utils containerd git git-man liberror-perl pigz runc ubuntu-fan
Suggested packages:
  ifupdown aufs-tools btrfs-progs cgroupfs-mount | cgroup-lite debootstrap docker-doc
  rinse zfs-fuse | zfsutils git-daemon-run | git-daemon-sysvinit git-doc git-email
  git-gui gitk gitweb git-cvs git-mediawiki git-svn
The following NEW packages will be installed:
  bridge-utils containerd git git-man liberror-perl pigz runc ubuntu-fan
0 upgraded, 9 newly installed, 0 to remove and 158 not upgraded.
Need to get 78.4 MB of archives.
After this operation, 316 MB of additional disk space will be used.
Do you want to continue? [Y/n] Y
Get:1 jammy/universe amd64 pigz amd64 2.6-1 [63.6 kB]
... [SNIP] ...
Setting up git (1:2.34.1-1ubuntu1.10) ...
Processing triggers for man-db (2.10.2-1) ...
$ sudo docker network create elastic
$ sudo docker pull
8.6.2: Pulling from elasticsearch/elasticsearch
963f82b80814: Pull complete 
77da6088d9c4: Pull complete 
53c1cc4b2357: Pull complete 
89732bc75041: Pull complete 
a9889a85f79d: Pull complete 
15c2ef330c04: Pull complete 
dc39009e3b6b: Pull complete 
18c8712e2883: Pull complete 
1910ac4f96c8: Pull complete 
fa50b81a1497: Pull complete 
Digest: sha256:1c53c89d04f207beb99d56cc4a1cc23516bd9c386858843d5082a98257c04d1c
Status: Downloaded newer image for
$ sudo sysctl -w vm.max_map_count=262144
vm.max_map_count = 262144
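Elasticsearch's bootstrap checks require `vm.max_map_count` to be at least 262144, so it is worth confirming the setting actually took effect before starting the container. A minimal check, reading the live kernel value straight from procfs:

```shell
# Read the current kernel value; Elasticsearch's bootstrap check
# refuses to start a production node if this is below 262144.
cat /proc/sys/vm/max_map_count
```

Note that `sysctl -w` does not survive a reboot; to make the setting permanent, add `vm.max_map_count=262144` to a file under `/etc/sysctl.d/`.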
$ sudo docker run --name es01 --net elastic -p 9200:9200 -p 9300:9300 -t
... [SNIP] ...

If everything is successful you should see something like this:

✅ Elasticsearch security features have been automatically configured!
✅ Authentication is enabled and cluster connections are encrypted.

ℹ️  Password for the elastic user (reset with `bin/elasticsearch-reset-password -u elastic`):

ℹ️  HTTP CA certificate SHA-256 fingerprint:

ℹ️  Configure Kibana to use this cluster:
• Run Kibana and click the configuration link in the terminal when Kibana starts.
• Copy the following enrollment token and paste it into Kibana in your browser (valid for the next 30 minutes):

ℹ️ Configure other nodes to join this cluster:
• Copy the following enrollment token and start new Elasticsearch nodes with `bin/elasticsearch --enrollment-token <token>` (valid for the next 30 minutes):

  If you're running in Docker, copy the enrollment token and run:
  `docker run -e "ENROLLMENT_TOKEN=<token>"`

This output contains the credentials needed to communicate with the Elasticsearch system.
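Before wiring up Kibana, you can verify the cluster is reachable by querying the REST API with the generated password. This is a sketch, assuming Elasticsearch was published on localhost:9200 as in the `docker run` command above; substitute the elastic password printed in your own console output:

```shell
# Hypothetical connectivity check: replace <ENTER PASSWORD HERE> with the
# password printed above. -k skips TLS verification because the cluster
# uses a self-signed CA; for a stricter check, copy the HTTP CA out of
# the container and pass it with --cacert instead.
curl -k -u elastic:'<ENTER PASSWORD HERE>' https://localhost:9200
```

A successful response is a small JSON document describing the cluster (name, version, tagline).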

Now let's bring up Kibana:

$ sudo docker pull
[sudo] password for user1: 
8.6.2: Pulling from kibana/kibana
963f82b80814: Already exists 
... [SNIP] ...
$ sudo docker run --name kib01 --net elastic -p 5601:5601
[2023-08-31T02:53:29.911+00:00][INFO ][node] Kibana process configured with roles: [background_tasks, ui]
[2023-08-31T02:53:57.594+00:00][INFO ][plugins-service] Plugin "cloudChat" is disabled.
[2023-08-31T02:53:57.599+00:00][INFO ][plugins-service] Plugin "cloudExperiments" is disabled.
[2023-08-31T02:53:57.600+00:00][INFO ][plugins-service] Plugin "cloudFullStory" is disabled.
[2023-08-31T02:53:57.601+00:00][INFO ][plugins-service] Plugin "cloudGainsight" is disabled.
[2023-08-31T02:53:57.611+00:00][INFO ][plugins-service] Plugin "profiling" is disabled.
[2023-08-31T02:53:57.765+00:00][INFO ][http.server.Preboot] http server running at
[2023-08-31T02:53:57.848+00:00][INFO ][plugins-system.preboot] Setting up [1] plugins: [interactiveSetup]
[2023-08-31T02:53:57.852+00:00][INFO ][preboot] "interactiveSetup" plugin is holding setup: Validating Elasticsearch connection configuration…
... [SNIP] ...
[2023-08-31T02:53:57.909+00:00][INFO ][root] Holding setup until preboot stage is completed.

i Kibana has not been configured.

Go to to get started.

As we can see, Kibana is not yet configured, so we go to the link provided above and enter the enrollment token to configure it. 042cd327b4120314ecca1e1bc47e1842.png

After successful configuration, let's log in using the username and password provided in your console output. 1f0b2742f6b731eb517b3e3feaec5e25.png

Now let's install Logstash. You will have to install it on the system where the osquery-generated log file you want to monitor lives. To install it, run the commands given below:

$ wget -qO - | sudo gpg --dearmor -o /usr/share/keyrings/elastic-keyring.gpg
$ sudo apt-get install apt-transport-https
$ echo "deb [signed-by=/usr/share/keyrings/elastic-keyring.gpg] stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-8.x.list
$ sudo apt-get update && sudo apt-get install logstash

Now copy the following configuration to /etc/logstash/conf.d/os-query-export.conf:

input {
  file {
    path => "/var/log/osquery/osqueryd.results.log"
    type => "osquery_json"
    codec => "json"
  }
}

filter {
  if [type] == "osquery_json" {
    date {
      match => [ "unixTime", "UNIX" ]
    }
  }
}

output {
  stdout {}
  elasticsearch {
    hosts => [""]
    user => "elastic"
    password => "<ENTER PASSWORD HERE>"
    ssl_certificate_verification => false
    data_stream => true
  }
}
Let's run Logstash in another terminal using the command provided below:

$ sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/os-query-export.conf --log.level trace

If everything goes well, we will soon see a data stream in Kibana, and we can create a data view for it. a543236a5e7d85d27920e180ceb0acf5.png

After creating the data view we should see logs coming in. 97f3bea0e0e8245385f3879e26581fa6.png

Now let's create a file in the .ssh folder of user1's home directory. This should trigger osquery to log the event to the results file; Logstash will then pick it up and forward it to Elasticsearch, where we can view it in Kibana.

$ touch ~/.ssh/this_is_my_test.txt


As we can see, the created file has been logged into Elasticsearch via osquery.