“Linux is Not Susceptible to Malware” and Other Fairytales

There is a huge misconception that Linux systems are immune to malware. There is zero evidence supporting that claim, because it's quite simply not true. On poorly managed systems, a lack of regular security updates, misconfigured services, or unapproved software being downloaded and executed can create an entry point for malware on any system. ANY system.

Here are just a few examples of trojans, viruses, and worms that historically impacted Linux:

Class    Name
------   ---------------
Trojan   Kaiten
Trojan   Rexob
Trojan   Xorddos
Trojan   Cdorked
Trojan   Chapro
Trojan   Ddssh
Trojan   Mumblehard
Trojan   Fokirtor
Virus    MetaPHOR
Virus    ZipWorm
Virus    Alaeda
Virus    Winux (Lindose)
Virus    Herm1t
Virus    Vit
Virus    OSF.8759
Worm     Slapper
Worm     Adore
Worm     Ramen
Worm     Cheese
Worm     Linux.Darlloz
Worm     DevNull
Worm     Adm

You get the idea…

Antivirus software is only capable of neutralizing the low-hanging fruit of the malware kingdom, but that doesn't exclude it from a Defense in Depth strategy, especially when antivirus protection can be easily integrated into your information security policy. How? Through the miracle of automation. The more work that can be accomplished with the least manual human intervention, the better. Nice and efficient.

ClamAV Prep Work

You will need to install ClamAV on your system, which can be done with your distribution's package manager, although the version in the repository may be outdated. Keeping up with the latest release of ClamAV is always a good idea, and it supports a security-driven mindset of maintaining current software on your system; you can find the latest version on ClamAV's website. Once installed, make sure the virus signature database is up to date, because antivirus software is only effective if it knows what to look for regarding emerging threats. Running the 'freshclam' command is one way of updating the virus definition databases, and your /etc/clamav/freshclam.conf file can be configured to pull updates from specific ClamAV databases automatically at predetermined intervals.
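As a minimal sketch, assuming a Debian/Ubuntu-based system (package names, service names, and lock-file behavior vary by distribution), installation and a manual definition update might look like this:

# Install ClamAV and the freshclam updater (Debian/Ubuntu package names).
sudo apt update && sudo apt install clamav clamav-freshclam

# The clamav-freshclam service may hold a lock on the signature database;
# stop it before a manual update, then start it again afterward.
sudo systemctl stop clamav-freshclam
sudo freshclam
sudo systemctl start clamav-freshclam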

We are going to drop the following command into a shell script to automate your daily scans, but let’s isolate the command used for the scan itself to better understand what we’re trying to accomplish:

clamscan --quiet -ri --copy=/tmp/infected_files "$S" --exclude-dir="^/sys" --exclude-dir="^/dev" --exclude-dir="^/media" --exclude-dir="^/proc" -l "$LOGFILE"

--quiet  This flag keeps the command output suppressed, which isn't strictly necessary since we'll be using it as a cron job, but in the event the command needs to be plucked out of the script and run manually, I like keeping it in so I don't forget it.

-r  The single 'r' flag tells ClamAV to scan directories recursively, meaning subdirectories will be included. Since we're scanning the root directory, this covers the entire filesystem hierarchy beneath it, minus what we choose to exclude.

-i  The single 'i' flag asks ClamAV to report only infected files in its scan output.

--copy  Rather than using the "remove" or "move" flag, we'll set the copy flag to send a copy of any potentially infected files to a temporary directory. This way any false positives do not disrupt crucial system files by displacing them from their expected directories. Should any potentially infected files be copied here, you can investigate them through binary analysis and by checking exactly which definition triggered the alert, both to eliminate malware and to fine-tune any custom definitions. You'll need to run mkdir /tmp/infected_files so ClamAV has a destination to copy these files to. HOMEWORK: Get crafty with some shell scripting and create an automated means of being alerted when files are copied to this directory, triggering your own Incident Response plan (a starting-point sketch follows the test run below).

$S  The 'S' variable being called here tells the command to scan the directories defined in the bash script.

--exclude-dir  The multiple exclude entries are pretty self-explanatory, and prevent large virtual filesystem directories full of dynamic data from being included in the scan.

-l  This flag instructs the scan to produce a log file, which can be given a literal file path, or you can use a variable in the script for the sake of brevity, which we like. In the example, we've assigned this to the '$LOGFILE' variable. Tied together, this is what we'll be dealing with as far as a simple script:

#!/bin/bash

# Dated log file and the directories to scan.
LOGFILE="/var/log/clamav/clamav-$(date +'%Y-%m-%d').log"
DIRTOSCAN="/"

# Make sure the quarantine destination exists before scanning.
mkdir -p /tmp/infected_files

# DIRTOSCAN is left unquoted on purpose so a space-separated list of
# directories can be scanned in turn.
for S in ${DIRTOSCAN}; do

	echo "Starting daily ClamAV scan of the $S directory at $(date)."

	clamscan --quiet -ri --copy=/tmp/infected_files "$S" \
		--exclude-dir="^/sys" --exclude-dir="^/dev" \
		--exclude-dir="^/media" --exclude-dir="^/proc" \
		-l "$LOGFILE"

done

exit 0

Save your bash script as “clamav_daily_scan.sh”, and make it executable by entering the command:

chmod u+x clamav_daily_scan.sh

Test out your script from the command line by running:

./clamav_daily_scan.sh

The echo line we included will inform you the scan has started, and you can use the 'top' command to monitor resource consumption as it runs. Antivirus scans consume a lot of system resources, so don't panic when you see your CPU usage maxing out while one runs.
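For the homework assigned above, here is one possible starting point: a minimal sketch that assumes the inotify-tools package is installed for the inotifywait command, with a hypothetical alert log path:

#!/bin/bash
# Watch the quarantine directory and log an alert whenever clamscan
# copies a file into it.

WATCHDIR="/tmp/infected_files"
ALERTLOG="/var/log/clamav/quarantine-alerts.log"

mkdir -p "$WATCHDIR"

# -m watches indefinitely; -e create fires once per new file.
inotifywait -m -e create --format '%f' "$WATCHDIR" | while read -r FILE; do
	echo "$(date): possible infection quarantined: $FILE" >> "$ALERTLOG"
	# Hook your own Incident Response steps in here (mail, webhook, etc.).
done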

Setting Up Your Scheduled Scan

Using cron jobs for scheduling removes the burden of having to remember to log in and run jobs, and is really simple to do. Lucky us.

Change the file access permissions using:

chmod 0755 /root/clamav_daily_scan.sh

Then link it into the /etc/cron.daily directory, whose contents cron runs once per day. Note that on many distributions the run-parts utility behind cron.daily skips file names containing a dot, so drop the '.sh' extension from the link:

ln /root/clamav_daily_scan.sh /etc/cron.daily/clamav_daily_scan
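Alternatively, if you'd rather control exactly when the scan runs instead of relying on your distribution's cron.daily window, a root crontab entry (edited via 'sudo crontab -e') is a sketch along these lines:

# m h dom mon dow  command
30 2 * * * /root/clamav_daily_scan.sh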

Ok, you should be all set for automated virus scans of your system. Check your destination directory for log file output after the scheduled run time. Or, if a few days go by, take a peek in the directory; if you have several dated log files full of scan data, you've got it. You've already tested your script, so if something goes awry, it's your cron scheduling that likely needs poking at. You have the internet, so I'm sure you'll figure it out.

Considerations for Logging Output

Off-disk Storage Implementation – Future-proofing your Local Area Network (LAN) ecosystem against insufficient storage for log files, packet captures, backups, and anything else you may start archiving is a critical concern to address. Network Attached Storage (NAS) with a few terabytes of space will work for a Small Office/Home Office setup, but expect to double that number when using RAID for redundancy in the event of disk failure. This can be a fairly uncomfortable expense depending on how seriously you take system administration and information security policy within your home LAN, but if you are enhancing your infrastructure it's certainly worth the cost. Cloud storage solutions do, however, alleviate the burden of infrastructure management for data storage, and they have become more affordable over the last couple of years, so this is a rapidly adopted computing strategy worth researching before shelling out the cash for a mini server room and making your kids sleep on the roof.

Timing is Everything – Process scheduling is an essential part of modern system administration, so taking the time to write shell scripts or small Python programs that can be included in your system's cron schedule is a necessary chore. You don't want to lump everything into some generic "at the stroke of midnight" window, because doing so can dramatically impact system and network performance; spreading processes throughout the day and applying logic to your schedule is key.
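To illustrate, a staggered root crontab might look like the sketch below; everything other than our scanner is a hypothetical placeholder:

# m h dom mon dow  command
30 1 * * *  /root/clamav_daily_scan.sh   # daily AV scan at 01:30
15 3 * * *  /root/backup_home.sh         # hypothetical backup at 03:15
45 4 * * 0  /root/rotate_pcaps.sh        # hypothetical weekly pcap rotation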

Automating Dismal Errands – The ability to automate common or repetitive tasks has been in the holster of systems administrators since the beginning of the modern computing age, and for good reason. Manually engaging in hundreds of tasks throughout the day isn't just annoying, it's unrealistic. This is a segue into log file archiving and rotation, which can be accomplished by compressing files and warehousing them on a centralized storage system. This not only keeps things organized and makes backups more manageable, but also helps prevent intruders from altering files to cover their tracks.
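As a minimal sketch, a logrotate rule (assuming the logrotate package, standard on most distributions) dropped into /etc/logrotate.d/ could compress and prune the dated scan logs our script produces:

/var/log/clamav/clamav-*.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}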

Antivirus is Dead

Already? Well, there are schools of thought holding that antivirus software is more of a feel-good security implementation that does not do its job very well given the sophistication of today's threat actors. Looking at Information Systems Security from this point of view, there is one important word to learn: whitelisting. Before you write off whitelisting based on rumors you may have heard, go read this article on InfoSec Institute's website to dispel any ridiculous myths. With antivirus being a form of blacklisting, the complementary benefits provided by application whitelisting further harden your environment against code execution as a threat vector.

The implementation process, the level of requisite maintenance and support, an understandable lack of expertise, and the inability to mesh a home-grown whitelisting solution with a mature environment may leave you in an even worse situation than not using one at all, thanks to the placebo effect. That's why a pre-rolled solution can kick-start a whitelisting campaign with much less hassle. Barring any commercial expenses for home use, there are free-of-charge Open Source solutions available, one of which is fapolicyd. It's worth checking out if you are interested in a simple application whitelisting approach, and as always, if there's something you don't like or need to change, that's the beauty of Open Source software.
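As a hedged sketch of getting started on a Fedora/RHEL-family system, where fapolicyd is packaged (availability and package names vary elsewhere), using a hypothetical locally built binary as the example:

# Install and enable the daemon.
sudo dnf install fapolicyd
sudo systemctl enable --now fapolicyd

# Add a hypothetical locally built binary to the trust database so it is
# allowed to execute, then tell the daemon to reload its trust data.
sudo fapolicyd-cli --file add /usr/local/bin/mytool
sudo fapolicyd-cli --update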

Now that you have a brand new antivirus solution for your Linux systems you can sleep a little better tonight.

~ Dan