osquery and FIM using ELK
In the last writeup we generated logs on file changes, but that alone does not help much: we would still have to check every single machine in the environment by hand, which does not scale. It is far better to centralize all the logs into a single dashboard. Since ELK (Elasticsearch, Logstash, Kibana) is one of the most common stacks for searching and visualizing data, we can ship the logs with Logstash to Elasticsearch and eventually view them in a Kibana dashboard. In this writeup we will push the osquery logs into our ELK system.
Now let's bring up Elasticsearch:
user1@osquery1:~$ sudo apt install docker.io
[sudo] password for user1:
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following additional packages will be installed:
bridge-utils containerd git git-man liberror-perl pigz runc ubuntu-fan
Suggested packages:
ifupdown aufs-tools btrfs-progs cgroupfs-mount | cgroup-lite debootstrap docker-doc
rinse zfs-fuse | zfsutils git-daemon-run | git-daemon-sysvinit git-doc git-email
git-gui gitk gitweb git-cvs git-mediawiki git-svn
The following NEW packages will be installed:
bridge-utils containerd docker.io git git-man liberror-perl pigz runc ubuntu-fan
0 upgraded, 9 newly installed, 0 to remove and 158 not upgraded.
Need to get 78.4 MB of archives.
After this operation, 316 MB of additional disk space will be used.
Do you want to continue? [Y/n] Y
Get:1 http://gb.archive.ubuntu.com/ubuntu jammy/universe amd64 pigz amd64 2.6-1 [63.6 kB]
... [SNIP] ...
Setting up git (1:2.34.1-1ubuntu1.10) ...
Processing triggers for man-db (2.10.2-1) ...
$ sudo docker network create elastic
9fe0694c89c302da1d2e75c30dbb27122be123e6e9c2d49696778d07aa611171
$ sudo docker pull docker.elastic.co/elasticsearch/elasticsearch:8.6.2
8.6.2: Pulling from elasticsearch/elasticsearch
963f82b80814: Pull complete
77da6088d9c4: Pull complete
53c1cc4b2357: Pull complete
89732bc75041: Pull complete
a9889a85f79d: Pull complete
15c2ef330c04: Pull complete
dc39009e3b6b: Pull complete
18c8712e2883: Pull complete
1910ac4f96c8: Pull complete
fa50b81a1497: Pull complete
Digest: sha256:1c53c89d04f207beb99d56cc4a1cc23516bd9c386858843d5082a98257c04d1c
Status: Downloaded newer image for docker.elastic.co/elasticsearch/elasticsearch:8.6.2
docker.elastic.co/elasticsearch/elasticsearch:8.6.2
$ sudo sysctl -w vm.max_map_count=262144
vm.max_map_count = 262144
$ sudo docker run --name es01 --net elastic -p 9200:9200 -p 9300:9300 -t docker.elastic.co/elasticsearch/elasticsearch:8.6.2
... [SNIP] ...
If everything is successful you should see something like this:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
✅ Elasticsearch security features have been automatically configured!
✅ Authentication is enabled and cluster connections are encrypted.
ℹ️ Password for the elastic user (reset with `bin/elasticsearch-reset-password -u elastic`):
UzyMRtMFsg_nqV9PGyaU
ℹ️ HTTP CA certificate SHA-256 fingerprint:
4064bf0d320d017733d7aa03fc3a9dc660bfef5d6f72f2abd5ab109f9c382d51
ℹ️ Configure Kibana to use this cluster:
• Run Kibana and click the configuration link in the terminal when Kibana starts.
• Copy the following enrollment token and paste it into Kibana in your browser (valid for the next 30 minutes):
eyJ2ZXIiOiI4LjYuMiIsImFkciI6WyIxNzIuMTguMC4yOjkyMDAiXSwiZmdyIjoiNDA2NGJmMGQzMjBkMDE3NzMzZDdhYTAzZmMzYTlkYzY2MGJmZWY1ZDZmNzJmMmFiZDVhYjEwOWY5YzM4MmQ1MSIsImtleSI6IjdHTjdTWW9CcFJ3VXN6eDlna3RGOjBqd3FFWVl1UWUtZzVSSXBxT1VaZVEifQ==
ℹ️ Configure other nodes to join this cluster:
• Copy the following enrollment token and start new Elasticsearch nodes with `bin/elasticsearch --enrollment-token <token>` (valid for the next 30 minutes):
eyJ2ZXIiOiI4LjYuMiIsImFkciI6WyIxNzIuMTguMC4yOjkyMDAiXSwiZmdyIjoiNDA2NGJmMGQzMjBkMDE3NzMzZDdhYTAzZmMzYTlkYzY2MGJmZWY1ZDZmNzJmMmFiZDVhYjEwOWY5YzM4MmQ1MSIsImtleSI6IjdtTjdTWW9CcFJ3VXN6eDlna3QtOlR5OUVWcnV4UkZDelBTZEJuRUowYUEifQ==
If you're running in Docker, copy the enrollment token and run:
`docker run -e "ENROLLMENT_TOKEN=<token>" docker.elastic.co/elasticsearch/elasticsearch:8.6.2`
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
This output contains the credentials needed to communicate with the Elasticsearch instance; note down the elastic user's password, as we will need it later.
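As an aside, the enrollment tokens above are just base64-encoded JSON. Decoding one shows the cluster address and CA certificate fingerprint that Kibana will use (the token below is the Kibana enrollment token copied from the output above):

```shell
# Decode the Kibana enrollment token printed above; it is base64-encoded
# JSON carrying the cluster address ("adr") and the CA fingerprint ("fgr").
TOKEN='eyJ2ZXIiOiI4LjYuMiIsImFkciI6WyIxNzIuMTguMC4yOjkyMDAiXSwiZmdyIjoiNDA2NGJmMGQzMjBkMDE3NzMzZDdhYTAzZmMzYTlkYzY2MGJmZWY1ZDZmNzJmMmFiZDVhYjEwOWY5YzM4MmQ1MSIsImtleSI6IjdHTjdTWW9CcFJ3VXN6eDlna3RGOjBqd3FFWVl1UWUtZzVSSXBxT1VaZVEifQ=='
echo "$TOKEN" | base64 -d
```

This is only for inspection; the token should still be pasted into Kibana as-is.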
Now let's bring up Kibana:
$ sudo docker pull docker.elastic.co/kibana/kibana:8.6.2
[sudo] password for user1:
8.6.2: Pulling from kibana/kibana
963f82b80814: Already exists
... [SNIP] ...
docker.elastic.co/kibana/kibana:8.6.2
$ sudo docker run --name kib01 --net elastic -p 5601:5601 docker.elastic.co/kibana/kibana:8.6.2
[2023-08-31T02:53:29.911+00:00][INFO ][node] Kibana process configured with roles: [background_tasks, ui]
[2023-08-31T02:53:57.594+00:00][INFO ][plugins-service] Plugin "cloudChat" is disabled.
[2023-08-31T02:53:57.599+00:00][INFO ][plugins-service] Plugin "cloudExperiments" is disabled.
[2023-08-31T02:53:57.600+00:00][INFO ][plugins-service] Plugin "cloudFullStory" is disabled.
[2023-08-31T02:53:57.601+00:00][INFO ][plugins-service] Plugin "cloudGainsight" is disabled.
[2023-08-31T02:53:57.611+00:00][INFO ][plugins-service] Plugin "profiling" is disabled.
[2023-08-31T02:53:57.765+00:00][INFO ][http.server.Preboot] http server running at http://0.0.0.0:5601
[2023-08-31T02:53:57.848+00:00][INFO ][plugins-system.preboot] Setting up [1] plugins: [interactiveSetup]
[2023-08-31T02:53:57.852+00:00][INFO ][preboot] "interactiveSetup" plugin is holding setup: Validating Elasticsearch connection configuration…
... [SNIP] ...
[2023-08-31T02:53:57.909+00:00][INFO ][root] Holding setup until preboot stage is completed.
i Kibana has not been configured.
Go to http://0.0.0.0:5601/?code=213588 to get started.
As we can see, Kibana is not yet configured, so we open the link above in a browser and paste in the enrollment token from the Elasticsearch output to configure it.
After the configuration completes, let's log in using the elastic username and the password printed in the Elasticsearch console output.
Now let's install Logstash. It has to be installed on every system whose osquery-generated log file you want to monitor. To install it, run the commands below:
$ wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elastic-keyring.gpg
$ sudo apt-get install apt-transport-https
$ echo "deb [signed-by=/usr/share/keyrings/elastic-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-8.x.list
$ sudo apt-get update && sudo apt-get install logstash
Now save the following configuration as /etc/logstash/conf.d/os-query-export.conf:
input {
  file {
    path => "/var/log/osquery/osqueryd.results.log"
    type => "osquery_json"
    codec => "json"
  }
}

filter {
  if [type] == "osquery_json" {
    date {
      match => [ "unixTime", "UNIX" ]
    }
  }
}

output {
  stdout {}
  elasticsearch {
    hosts => ["https://127.0.0.1:9200"]
    user => "elastic"
    password => "<ENTER PASSWORD HERE>"
    ssl_certificate_verification => false
    data_stream => true
  }
}
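For context, osqueryd writes one JSON object per line to osqueryd.results.log. The record below is a hypothetical, abbreviated example of a file_events result (the field values are made up for illustration); its unixTime field, in epoch seconds, is what the date filter above maps onto the event timestamp:

```shell
# A hypothetical (abbreviated) osquery file_events result line:
cat <<'EOF'
{"name":"file_events","hostIdentifier":"osquery1","unixTime":1693450000,"columns":{"target_path":"/home/user1/.ssh/this_is_my_test.txt","action":"CREATED"}}
EOF

# The date filter converts those epoch seconds into the event's
# @timestamp; the equivalent conversion with GNU date:
date -u -d @1693450000 +%Y-%m-%dT%H:%M:%SZ
```

Without the date filter, events would be stamped with the time Logstash read the line rather than the time osquery recorded the change.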
Let's run Logstash in another terminal using the command below:
$ sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/os-query-export.conf --log.level trace
If everything goes well, a data stream will soon appear in Kibana; let's create a data view for it.
After creating the data view, we should see logs coming in.
Now let's create a file in the .ssh folder of user1's home directory. This should trigger osquery to log the event to its results file, from where Logstash picks it up and forwards it to Elasticsearch, where we can view it with Kibana.
$ touch ~/.ssh/this_is_my_test.txt
As we can see, the created file has been logged into Elasticsearch via osquery.
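Besides the Kibana UI, you can confirm ingestion straight from Elasticsearch's search API. The command below is a sketch rather than something runnable as-is: substitute the elastic password from your console output, and note that logs-generic-default is Logstash's default data stream name when data_stream is enabled, so it may differ if you customized the stream settings:

```shell
# Sketch: search the default Logstash data stream for our test file.
# Replace <ENTER PASSWORD HERE> with the elastic user's password;
# -k skips TLS verification for the self-signed certificate.
curl -k -u 'elastic:<ENTER PASSWORD HERE>' \
  'https://127.0.0.1:9200/logs-generic-default/_search?q=this_is_my_test'
```

A hit in the response confirms the event made it end to end without going through Kibana.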