Regularly Export SSH Replays

Scenario: You want to export SSH replay captures from your organization on a regular basis. This document explains how to do so using the sdm audit ssh CLI command. Instructions are included for local export and for exporting to either AWS S3 or Google Cloud Platform (GCP) cloud storage. For more information about the command, see the sdm audit ssh reference.

Initial Setup

Create a new Linux system user with restricted permissions to run the audit. In this example, we use sdm. Download and install the SDM CLI by following the Linux Installation Guide.

You do not need to log into the SDM client. The admin token serves as authentication.

Create an Admin Token

To create an admin token, sign in to the StrongDM Admin UI and go to Audit > API & Admin Tokens. From there you can create an admin token with only the rights you require, which in this case is the Audit > SSH Captures permission.

After you click Create, a dialog appears displaying the admin token. Copy the token and save it for later use in /etc/sdm-admin.token, in the format SDM_ADMIN_TOKEN=<YOUR_TOKEN>.

This file must be owned by the user that runs the export (sdm in this example):

chown sdm:sdm /etc/sdm-admin.token

For more details on creating admin tokens, see Create Admin Tokens.

Export to a JSON File

Set up a script to run a periodic SSH export. The following example script writes captured SSH sessions to a JSON document every five minutes.

#!/bin/bash
. /etc/sdm-admin.token && export SDM_ADMIN_TOKEN # load the admin token saved during setup
START=$(date -d "5 minutes ago" '+%Y-%m-%dT%H:%M:00') # start of audit slice, defaulting to 5 minutes ago
FN=$(date '+%Y%m%d%H%M') # timestamp string to append to the output filename
END=$(date '+%Y-%m-%dT%H:%M:00') # end of audit slice, defaulting to now, at the top of the minute
TARGET=/var/log/sdm # location where JSON files will be written

/opt/strongdm/bin/sdm audit ssh --from "$START" --to "$END" -j > "$TARGET/ssh.$FN.json"

Add a crontab entry

Although most Linux systems have standard locations for scripts that run daily, weekly, and so on, this script is configured by default to run every five minutes. As such, the best approach is to place it directly in the crontab for a user or for the system.

Add this line to the crontab of your choice, modifying the interval to match what you set in the script:
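For example, a five-minute interval would look like the following. The script path is a placeholder assumption; use the location where you saved the export script.

```shell
# Run the SSH export script every five minutes; adjust the path to where you saved the script
*/5 * * * * /home/sdm/sdm-ssh-export.sh
```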

Export to Cloud Storage

If you configured logging to a cloud environment, use the following methods to extract SSH captures before or after log export.

SSH session extraction prior to export

Set up and run a periodic export to extract SSH sessions before shipping the logs to your cloud storage. The SSH captures are compressed and exported every hour.
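A minimal sketch of such an hourly script follows. This is not an official script: the gateway/relay log path (/var/log/sdm.log), the S3 bucket name, and the output directory are placeholder assumptions, and the aws CLI must be installed and configured. The sdm ssh split command is the same one described later in this document.

```shell
#!/bin/bash
# Hourly SSH capture extraction sketch. Placeholder assumptions:
# gateway/relay log at /var/log/sdm.log, S3 bucket s3://YOUR_BUCKET.
TEMPDIR=$(mktemp -d)
TS=$(date '+%Y%m%d%H')

cp /var/log/sdm.log "$TEMPDIR/sdmaudit.log"       # snapshot the current log
cd "$TEMPDIR" || exit 1
/opt/strongdm/bin/sdm ssh split sdmaudit.log      # writes one file per SSH session ID
rm sdmaudit.log                                   # keep only the extracted sessions
tar czf "/var/log/sdm/ssh-sessions.$TS.tar.gz" .  # compress the extracted sessions
aws s3 cp "/var/log/sdm/ssh-sessions.$TS.tar.gz" "s3://YOUR_BUCKET/ssh-captures/"
cd / && rm -rf "$TEMPDIR"                         # clean up the working directory
```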

Configure this script to run every hour in cron.

If you have chosen to save your StrongDM gateway or relay logs in JSON, you need to add the -j option to perform this operation correctly (for example, sdm ssh split -j $TEMPDIR/sdmaudit.log).

SSH session extraction after export

  1. To extract SSH sessions from exported logs, first determine the ID of the session you want to view. Do this by running sdm audit ssh with the relevant --from and --to flags, and note the session's ID and duration.

  2. Next, copy the logs from the relevant timeframe back from your cloud storage. Note that an SSH session may span several log files, so pay attention to the duration of the session as revealed in step 1.

  3. Unzip the logs and compile them into a single file.

  4. Run sdm ssh split <logfile> to extract all SSH sessions from this log. They are named after the session ID. At this point, you can view the relevant session file (in JSON format).

If you have chosen to save your StrongDM gateway or relay logs in JSON, you need to add the -j option to perform this operation correctly (for example, sdm ssh split -j combined-logs).
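Put together, the four steps above might look like the following, assuming the logs were exported to an S3 bucket. The bucket name, dates, and file patterns are placeholder assumptions.

```shell
# 1. Find the session ID and duration (dates shown are placeholders)
sdm audit ssh --from "2024-01-15 00:00:00" --to "2024-01-15 23:59:59"

# 2. Copy the logs covering that timeframe back from cloud storage
aws s3 cp s3://YOUR_BUCKET/logs/ . --recursive --exclude "*" --include "*2024-01-15*"

# 3. Unzip the logs and compile them into a single file
gunzip *.gz
cat *.log > combined-logs

# 4. Extract all SSH sessions; output files are named after their session IDs
sdm ssh split combined-logs
```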
