[Ref: OpenBSD 4.8 amd64, 4.9 amd64 & i386]
Now that we have some analysis tools going, we have a basic workflow for proactive investigation of what’s happening on our network.
Enter FlowViewer (for those who don’t want to investigate at the command line with flow-tools) and the related tools FlowTracker and FlowGrapher.
FlowViewer gives you a GUI front-end for interrogating the database, for quick analysis of the data. For the above scenario, you can quickly drill through to see which hosts were generating the highest traffic at that time, and to/from whom.
After the previous netflow analysis tools (flow-tools, FlowScan, CUFlow), installing FlowViewer is rather straightforward.
Refer to Michael W. Lucas’ book “Network Flow Analysis”, which already identifies the minimal prerequisites to get FlowViewer up and running.
I install the binary tools from OpenBSD packages, but the Perl modules may require installing from CPAN.
If you got here without installing and testing FlowScan/CUFlow, things should still work, but interpreting the data will really need more support from the above book.
The Package Binaries
The binaries should be installable from ports/packages. Check the FlowViewer website for which features require which binaries and whether you need them; they are relatively easy to install, so while we’re learning, just install everything.
In summary, we need the gd-related tools for graphing (e.g. FlowGrapher) and RRDtool for tracking and charting the tracking data.
Ref: FlowViewer
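For reference, something like the following pulls the prerequisites in from packages. The exact package names (and whether a given Perl module is packaged or needs CPAN) vary by OpenBSD release, and some of these may already be in place from the earlier flow-tools/CUFlow setup, so treat this as a sketch:
# Graphing and charting prerequisites (names may vary by release; set PKG_PATH first).
pkg_add gd rrdtool
# Perl bindings; anything missing from packages can come from CPAN instead
# (e.g. the GD, GD::Graph and RRDs modules).
pkg_add p5-GD p5-GD-Graph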
As per the installation instructions (’cause I know you were reading them).
Installation Tips
Get the current release from the FlowViewer home site (http://ensight.eos.nasa.gov/FlowViewer/) and follow the basic install instructions that come with the software.
curl -O http://ensight.eos.nasa.gov/FlowViewer/FlowViewer_3.4.tar
FlowViewer normally installs under /var/www/cgi-bin/FlowViewer; for the example below, I’m using a base installation directory of:
| Path | Description |
| --- | --- |
| /var/www/flowviewer | Base Directory |
| /var/www/flowviewer/cgi | CGI Scripts |
| /var/www/flowviewer/docs | Documents |
| /var/www/flowviewer/saves | SAVE Directory |
| /var/www/var/flowviewer | Scratch/Work Directory |
Using the above directory path construction, let’s make all the appropriate directories:
basedir=/var/www/flowviewer
cgidir=$basedir/cgi
docs=$basedir/docs
workdir=/var/www/var/flowviewer
for d in saves docs cgi; \
do mkdir -p $basedir/$d/; chmod -R a=rwx $basedir/$d/; done
for d in FlowGrapher FlowTracker FlowWorking; \
do mkdir -p $docs/$d/; chmod -R a=rwx $docs/$d/; done
for d in names filters rrdtool exporter log; \
do mkdir -p $workdir/$d/; chmod -R a=rwx $workdir/$d/; done
Untar the files and copy them into the CGI scripts directory:
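The top-level directory name inside the tarball depends on the release; assuming the 3.4 tarball from above extracts into a FlowViewer_3.4 directory, something along these lines keeps the extracted tree (used later for its images) under the CGI directory and puts the scripts where the rest of this walkthrough expects them:
# Extract the distribution.
tar -xf FlowViewer_3.4.tar
# Keep the full tree handy under the cgi directory (the images get copied from here later)...
cp -R FlowViewer_3.4 $cgidir/FlowViewer
# ...and place the scripts and configuration directly in the cgi directory itself.
cp -R $cgidir/FlowViewer/* $cgidir/
chmod a+rx $cgidir/*.cgi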
Note: using a ‘personalised’ path structure is ’nice’, but it also means the real documentation doesn’t always align well with this one. That’s deliberate: don’t use this documentation as a cut-and-paste instruction set.
Make the changes appropriate for getting your own site up and running, with your own data, logo, etc.
One option for modifying the standard configuration is to add your changes at the bottom of the configuration file, so they are clearly differentiated from the defaults.
File: FlowViewer_Configuration.pm
$FlowViewer_server = "10.0.0.2"; # (IP address or hostname)
$trackings_title = "My Company Title";
$user_logo = "My.Company.Logo.jpg"; # For a nice look make your logo 86 pixels high
$user_hyperlink = "http://www.example.com/";
@devices = ("sensorXY");
#@exporters = ("sensorXY_ipaddress:sensorXY Title");
This sets the basic configuration: the service title, logo, and which sensors are being reviewed.
For my configuration file, I’ve created three new variables:
These are obviously specific to this example, and they simplify later modifications.
File: FlowViewer_Configuration.pm
$basedir = "/var/www/flowviewer";
$baseurl = "/FlowViewer";
$workdir = "/var/www/var/flowviewer";
$viewdocs = "$basedir/docs";
$viewdocs_url = "$baseurl/docs/";
$cgi_bin_directory = "$basedir/cgi";
$cgi_bin_short = "$baseurl/cgi";
$reports_directory = "$viewdocs";
$reports_short = "$viewdocs_url";
$graphs_directory = "$viewdocs/FlowGrapher";
$graphs_short = "$viewdocs_url/FlowGrapher";
$tracker_directory = "$viewdocs/FlowTracker";
$tracker_short = "$viewdocs_url/FlowTracker";
$work_directory = "$viewdocs/FlowWorking";
$work_short = "$viewdocs_url/FlowWorking";
$save_directory = "$basedir/saves";
$save_short = "$baseurl/saves";
$names_directory = "$workdir/names";
$filter_directory = "$workdir/filters";
$rrdtool_directory = "$workdir/rrdtool";
The above configuration specifies the local path where files are to be stored (X_directory), and the Web URL (X_short) matched to that directory.
The second configuration update is to set the location for the data.
File: FlowViewer_Configuration.pm
$flow_data_directory = "/var/netflow";
$exporter_directory = "/var/www/var/flowviewer/exporter";
$flow_bin_directory = "/usr/local/bin";
$rrdtool_bin_directory = "/usr/local/bin";
$log_directory = "/var/www/var/flowviewer/log";
Make sure that the data and binary directories are correct for your configuration. On a regular OpenBSD install (using ports/packages), the binaries should be in the paths shown above.
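A quick sanity check that the flow-tools and RRDtool binaries are where the configuration expects them (paths assumed from a standard package install):
# The flow-tools utilities and rrdtool should all be under /usr/local/bin.
ls -l /usr/local/bin/flow-cat /usr/local/bin/flow-nfilter /usr/local/bin/rrdtool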
During the initial startup of the CGI scripts, they will attempt to create the relevant reporting paths.
Copy your images to the new “working” directories:
cp $basedir/cgi/FlowViewer/*.png $docs
cp My.Company.Logo.jpg $docs
To use FlowTracker and FlowGrapher, we start the Perl daemons that collate the data:
SCRIPTDIR=/var/www/flowviewer/cgi
LOGDIR=/var/www/var/flowviewer/log
PERL5LIB=${SCRIPTDIR} ${SCRIPTDIR}/FlowTracker_Collector \
>> ${LOGDIR}/FlowTracker_Collector.log 2>&1 &
PERL5LIB=${SCRIPTDIR} ${SCRIPTDIR}/FlowTracker_Grapher \
>> ${LOGDIR}/FlowTracker_Grapher.log 2>&1 &
Validate that the command lines work (the most common problems I’ve encountered come from not having the right paths).
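A quick way to confirm that both daemons stayed up (a sketch; the grep pattern simply matches the script names used above):
# Both the collector and the grapher daemons should appear in the process list.
ps -axw | grep FlowTracker_ | grep -v grep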
Your log file should have something like the below:
At 07/29/2015 09:05:52 started next collection. Period: July 29, 2015 8:30:00 to July 29, 2015 8:35:00
0 trackings had a zero value. 2 trackings had a positive value.
At 07/29/2015 09:05:57 finished this loop. Update period: 1438133700  2 Trackings. Loop took: 5 seconds
With the above logs verified, you can add the command lines to your /etc/rc.local to ensure they are started with each machine restart.
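For example, appending something like this to /etc/rc.local (paths as per this example layout):
# File extract: /etc/rc.local -- start the FlowTracker daemons at boot.
SCRIPTDIR=/var/www/flowviewer/cgi
LOGDIR=/var/www/var/flowviewer/log
if [ -x ${SCRIPTDIR}/FlowTracker_Collector ]; then
        echo -n ' FlowTracker'
        PERL5LIB=${SCRIPTDIR} ${SCRIPTDIR}/FlowTracker_Collector \
                >> ${LOGDIR}/FlowTracker_Collector.log 2>&1 &
        PERL5LIB=${SCRIPTDIR} ${SCRIPTDIR}/FlowTracker_Grapher \
                >> ${LOGDIR}/FlowTracker_Grapher.log 2>&1 &
fi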
Customising the web server configuration is basically choosing what paths you want to expose to your browser. The classic example is a straight-out alias using the capitalisation of the application name, FlowViewer.
File extract: /var/www/conf/httpd.conf
Alias /FlowViewer/cgi/ /var/www/flowviewer/cgi/
Alias /FlowViewer/docs/ /var/www/flowviewer/docs/
Alias /FlowViewer/saves/ /var/www/flowviewer/saves/
<Directory /var/www/flowviewer/cgi/>
Options ExecCGI
AllowOverride None
Order allow,deny
Allow from [list.of.ips.you.trust]
</Directory>
<Directory /var/www/flowviewer/docs/>
Options Indexes
AllowOverride None
Order allow,deny
Allow from [list.of.ips.you.trust]
</Directory>
<Directory /var/www/flowviewer/saves/>
Options Indexes
AllowOverride None
Order allow,deny
Allow from [list.of.ips.you.trust]
</Directory>
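Depending on your stock httpd.conf, you may also need to make sure .cgi files are actually handed to the CGI handler; if the reports come back as plain text instead of executing, a directive along these lines (globally, or inside the cgi <Directory> block) is the usual fix:
AddHandler cgi-script .cgi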
As noted in the above, the working files live under the path /var/www/flowviewer/.
And now we can point our browser at the collector to start looking at reports:
http://collector-ip-address/FlowViewer/cgi/FlowViewer.cgi
Fortunately, that single URL can connect you to the other parts of the toolkit.
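Assuming the standard script names shipped with the distribution, the companion tools sit alongside it:
http://collector-ip-address/FlowViewer/cgi/FlowGrapher.cgi
http://collector-ip-address/FlowViewer/cgi/FlowTracker.cgi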
Remember that OpenBSD’s default Apache instance runs in a chroot(2) environment. To use FlowViewer as installed above, disable this environment using the “-u” option to httpd.
-u By default httpd will chroot(2) to the ``ServerRoot'' path. The -u option disables this behaviour, and returns httpd to the expanded "unsecure" behaviour.
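On OpenBSD of this vintage, one place to set that flag is /etc/rc.conf.local (assuming you let rc start Apache for you):
# File extract: /etc/rc.conf.local -- run Apache without the chroot(2) jail.
httpd_flags="-u"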
If you’re really paranoid, you could configure all the relevant tools to work within a chroot’d environment.
If something doesn’t display, some directories may not have been created, or their permissions may not be usable.
Review the /var/www/logs/error_log file:
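A couple of quick checks, assuming the directory layout used above:
# Watch for path/permission problems reported as you browse the CGI pages.
tail -f /var/www/logs/error_log
# Confirm the report and work directories exist and are writable (as created earlier).
ls -ld /var/www/flowviewer/docs/Flow* /var/www/var/flowviewer/*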
A simple day of active monitoring could revolve around a scenario such as the one below:
FlowViewer is good for getting snapshot views of traffic behaviour, whether the snapshot is 5 minutes or 10 days. It is a good way of isolating the hosts and patterns of heavy use, so that we can identify further points for investigation.
To view historical data on the identified traffic pattern, we can use FlowGrapher to generate a chart to give us a better view of previous behaviour.
I love using this feature to justify what modifications may (or may not) be required on the infrastructure.
Other links that may be relevant, or give better clues.