Creating Splunk Alerts (aka 'Saved Searches') from the command line
Splunk Alerts (also called saved searches) are a great way to have Splunk send you data on a scheduled basis, or when certain conditions are met (e.g. a metric crosses a threshold). While these alerts can easily be created in the web UI (by clicking “Save As/Alert” in a search), in many cases it would be nice to do it programmatically. This makes it easy to set up alerts for many individual searches, while keeping everything under source control.
Specifying the search
First of all, let’s specify our search. We will put each search in its own text file. As an example, here is a really simple search that counts the number of 404 errors by sourcetype:
index=* | stats count(eval(status="404")) AS count_status BY sourcetype
This is the same search definition as you would enter in the Splunk search app UI. It’s a good idea to test and refine the search interactively until you’re happy with it, then save it in a file called my_search.txt.
The Splunk API
Next we’re going to write some Python to drive the Splunk API. The following code assumes you have a valid auth token in the environment variable SPLUNK_AUTH_TOKEN. Here is a little helper script you can use to set this variable (assuming you have xmllint installed):
#!/bin/bash
#
# login to splunk and set SPLUNK_AUTH_TOKEN
#
# Usage: eval $( ./splunk-login.sh )
#
SPLUNK_HOST="https://splunk.int.corp:8089"
read -p "Username: " USERNAME
read -s -p "Password: " PASSWORD
echo >&2
# --data-urlencode keeps special characters in the password intact
response=$( curl -s -k --data-urlencode "username=${USERNAME}" --data-urlencode "password=${PASSWORD}" "${SPLUNK_HOST}/services/auth/login" )
SPLUNK_AUTH_TOKEN=$( echo "$response" | xmllint --nowarning --xpath '//response/sessionKey/text()' - 2>/dev/null )
if [[ $? -eq 0 ]] ; then
    echo "export SPLUNK_AUTH_TOKEN=${SPLUNK_AUTH_TOKEN}"
else
    echo "$response" | xmllint --xpath '//response/messages/msg/text()' - >&2
    echo >&2
fi
You’ll also need to install the Splunk SDK for Python. This should be as simple as typing pip install splunk-sdk, depending on how your environment is configured.
Ok, on to the code!
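The embedded snippet isn’t reproduced here, so here’s a minimal sketch of the connection step. The host name is the same placeholder as in the login script, and connect_to_splunk is just a convenience wrapper I’m using for this sketch:

```python
import os


def connect_to_splunk(host="splunk.int.corp", port=8089):
    """Return a splunklib Service handle, authenticating with the
    session key exported by splunk-login.sh (SPLUNK_AUTH_TOKEN)."""
    # Imported here so the module loads even before the SDK is installed
    import splunklib.client as client  # pip install splunk-sdk
    return client.connect(
        host=host,
        port=port,
        token=os.environ["SPLUNK_AUTH_TOKEN"],
    )
```

Call service = connect_to_splunk() and you’re connected.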
Great, we now have a reference to the Splunk API service. So how do we use the SDK to create a saved search? The Splunk API documentation is slightly terrifying. Luckily, we don’t need to worry about the vast majority of the available parameters. Let’s create an alert that runs on a schedule and sends its results to a webhook:
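The embedded snippet isn’t shown here, so here’s a sketch of the parameter dictionary, built from documented parameters of the savedsearches REST endpoint (the webhook URL is a placeholder for whatever endpoint you want Splunk to POST to):

```python
def alert_params(webhook_url, cron="0 1 * * *"):
    """Parameters for a scheduled alert that POSTs its results to a webhook."""
    return {
        "actions": "webhook",                     # trigger a webhook action...
        "action.webhook.param.url": webhook_url,  # ...POSTing results to this URL
        "alert.digest_mode": "1",      # one notification containing all results
        "alert_type": "always",        # fire on every run, no trigger condition
        "is_scheduled": "1",
        "cron_schedule": cron,         # daily at 1:00 am, Splunk server time
        "dispatch.earliest_time": "-30d",  # search over the last 30 days...
        "dispatch.latest_time": "now",     # ...up to the moment the search runs
    }
```

With the service handle from earlier and the search text loaded from my_search.txt, creating the alert is then a single call: service.saved_searches.create("errors_404_by_sourcetype", search, **alert_params("https://hooks.example.com/splunk")) — the name and URL here are made up.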
And that’s it! Easy.
There are a few things in that dictionary of parameters to the API call that are worth calling attention to.
- The 'actions' and 'action.webhook.param.url' parameters specify the action to trigger for our alert: in this example, a webhook and the URL that Splunk will POST results to. You could also specify 'actions': 'email' and 'action.email.to' to send the alert as an email instead (*).
- 'cron_schedule' specifies when to run the saved search, as a crontab entry. This example is “daily at 1:00 am” (in the Splunk server’s time zone).
- 'dispatch.earliest_time' and 'dispatch.latest_time' define the time range over which to run the search (in this case “the last 30 days”).
- The digest mode and alert trigger settings ('alert.digest_mode' and 'alert_type') cause this alert to send results every time it is invoked, rather than depending on a condition being met. Creating a conditional alert is left as an exercise for the reader…
(*) Note that I have not yet tested this, so I’m not sure which other parameters are required.
Fully worked example
The above code is all you actually need, but here’s a slightly expanded example that accepts command line arguments and multiple search files. It also deletes any existing search of the same name (i.e. your new search replaces the old one), and randomises the crontab spec slightly to spread out load on the Splunk server.
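The original script isn’t embedded here, so below is a sketch along those lines. The default host and the --webhook-url flag are assumptions you’d adapt to your own setup:

```python
#!/usr/bin/env python
"""Create (or replace) Splunk alerts from search files given on the command line."""
import argparse
import os
import random
from pathlib import Path


def randomized_cron(hour=1):
    """Daily at the given hour, at a random minute, to spread out server load."""
    return "%d %d * * *" % (random.randrange(60), hour)


def create_alert(service, name, search, webhook_url):
    """Create a scheduled webhook alert, replacing any existing one of the same name."""
    if name in service.saved_searches:
        service.saved_searches.delete(name)
    service.saved_searches.create(name, search, **{
        "actions": "webhook",
        "action.webhook.param.url": webhook_url,
        "alert.digest_mode": "1",
        "alert_type": "always",
        "is_scheduled": "1",
        "cron_schedule": randomized_cron(),
        "dispatch.earliest_time": "-30d",
        "dispatch.latest_time": "now",
    })


def main():
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument("--host", default="splunk.int.corp")
    parser.add_argument("--port", type=int, default=8089)
    parser.add_argument("--webhook-url", required=True)
    parser.add_argument("search_files", nargs="+")
    args = parser.parse_args()

    import splunklib.client as client  # pip install splunk-sdk
    service = client.connect(host=args.host, port=args.port,
                             token=os.environ["SPLUNK_AUTH_TOKEN"])

    for path in args.search_files:
        # Each saved search takes its name from its file name (sans extension)
        name = Path(path).stem
        create_alert(service, name, Path(path).read_text().strip(),
                     args.webhook_url)
        print("created alert:", name)


if __name__ == "__main__":
    main()
```

Run it as, for example: ./create_alerts.py --webhook-url https://hooks.example.com/splunk my_search.txt other_search.txt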
Conclusion
So despite the somewhat lacking official documentation, creating Splunk saved searches is actually pretty straightforward. Thanks to Alexander Leonov for this post that got me headed in the right direction: https://avleonov.com/2019/01/17/creating-splunk-alerts-using-api/