Why use FIM in the first place?
For most people, the answer is “because my auditor / bank / security consultant said we had to!” Security standards such as PCI DSS explicitly require periodic file integrity checks, covering log files and backups among other assets, and this is the trigger that leads most organizations to implement FIM.
Unlike antivirus and firewall technology, FIM is not yet considered a pervasive security requirement. In some respects, FIM is similar to data encryption, in that both are undeniably valuable security safeguards to implement, but both are used sparingly, reserved for niche or specialized security requirements.
How does FIM help with data security?
At a basic level, File Integrity Monitoring verifies that important system files and configuration files have not changed; in other words, that the integrity of the files has been maintained.
Why is this important? System files (programs, applications, or operating system files) should change only when an update, patch, or upgrade is applied; at all other times, they should never change.
Most security breaches involving the theft of data from a system use either a keylogger to capture data as it is entered on a PC (the theft is then perpetrated via spoofed access later), or some kind of data-transfer conduit program used to divert information from the server. In all cases, some type of malware has to be implanted on the system, usually operating as a Trojan: the malware impersonates a legitimate system file so that it can be executed and granted access privileges to system files and data.
In these cases, a file integrity check will detect the presence of the Trojan, and since targeted zero-day threats and APT (Advanced Persistent Threat) attacks will evade antivirus measures, FIM stands out as an indispensable security defense. To have the necessary peace of mind that a file has not changed, the attributes that govern its security and permissions should be tracked, along with the file’s length and a cryptographic hash of its contents.
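The attributes described above can be captured in a few lines. Here is a minimal sketch in Python (the function names `fingerprint` and `has_changed` are illustrative, not part of any particular FIM product) that records permissions, ownership, size, and a SHA-256 hash, then compares a file against a stored baseline:

```python
import hashlib
import os
import stat

def fingerprint(path):
    """Capture the attributes a FIM tool typically tracks for a file:
    permissions, ownership, size, and a cryptographic hash."""
    st = os.stat(path)
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash the file in chunks so large files don't exhaust memory
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return {
        "mode": stat.filemode(st.st_mode),  # e.g. '-rwxr-xr-x'
        "uid": st.st_uid,
        "gid": st.st_gid,
        "size": st.st_size,
        "sha256": h.hexdigest(),
    }

def has_changed(baseline, path):
    """True if any tracked attribute differs from the stored baseline."""
    return fingerprint(path) != baseline
```

A real FIM agent would persist the baseline securely and run this comparison on a schedule or on file-system events, but the principle is the same: any deviation in content or permissions is flagged.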
Similarly, for configuration files, the settings that restrict access to the host or restrict privileges for its users must also be maintained. For example, a new account provisioned on the host with root or administrator privileges is an obvious potential vector for data theft: the account can be used to access host data directly, or to install malware that will provide access to confidential data.
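On a Linux host, one concrete check of this kind is scanning /etc/passwd for accounts with UID 0, since any such account has root privileges. A minimal sketch (the function name `superuser_accounts` is illustrative; it takes the file’s text so it can be tested against sample data):

```python
def superuser_accounts(passwd_text):
    """Return the account names with UID 0 (root privileges) found in
    /etc/passwd-style content. More than one entry, or an unexpected
    name, is a red flag worth alerting on."""
    accounts = []
    for line in passwd_text.splitlines():
        if not line or line.startswith("#"):
            continue
        fields = line.split(":")
        # /etc/passwd fields: name:password:UID:GID:comment:home:shell
        if len(fields) >= 3 and fields[2] == "0":
            accounts.append(fields[0])
    return accounts
```

In practice a FIM tool would run this against the live file (e.g. `superuser_accounts(open("/etc/passwd").read())`) and alert whenever the result differs from the expected single `root` entry.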
File integrity monitoring and configuration hardening
Which brings us to the topic of configuration hardening. Hardening a configuration is intended to counteract the wide range of potential threats to a host, and best-practice guides are available for all versions of Solaris, Ubuntu, RedHat, Windows, and most network devices. Known security vulnerabilities are mitigated by using a fundamentally secure configuration for the host.
For example, a basic key to protecting a host is a strong password policy. On Solaris, Ubuntu, or other Linux hosts, this is implemented by editing /etc/login.defs or a similar file, whereas a Windows host requires the relevant settings to be defined within the Local or Group Security Policy. Either way, the configuration exists as a file that can be parsed and checked for consistency (even if, in the case of Windows, that “file” is a registry value or the output of a command-line program).
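Parsing such a file and comparing it against the desired policy is straightforward. A minimal sketch, assuming the /etc/login.defs key/value format (the function name `check_login_defs` is illustrative):

```python
def check_login_defs(text, expected):
    """Compare /etc/login.defs-style content against a desired password
    policy, e.g. {"PASS_MAX_DAYS": "90"}. Returns a dict of failures
    mapping each non-compliant setting to (expected, found)."""
    found = {}
    for raw in text.splitlines():
        # Strip comments and blank lines; entries are 'KEY  value'
        line = raw.split("#", 1)[0].strip()
        if not line:
            continue
        parts = line.split(None, 1)
        if len(parts) == 2:
            found[parts[0]] = parts[1].strip()
    return {key: (want, found.get(key))
            for key, want in expected.items()
            if found.get(key) != want}
```

Running this against a default install often reveals gaps such as `PASS_MAX_DAYS 99999` (passwords that never expire), exactly the kind of drift from a hardened build that FIM should surface.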
Thus, file integrity monitoring ensures that a server or network device remains secure in two key dimensions: protected from Trojans or other changes to system files, and kept in a hardened, secure state.
File integrity is assured, but was it the right file to begin with?
But is using FIM enough to ensure that configuration and system files remain unchanged? It guarantees that the monitored system remains in its original state, but it also risks perpetuating a misconfiguration: a classic case of “garbage in, garbage out” computing. In other words, the system may have been built from a tainted source in the first place. The recent Citadel keylogger scam, estimated to have generated more than $500 million in funds stolen from bank accounts, targeted PCs that had been set up using pirated Windows operating system DVDs, each with the keylogger malware included for free.
In the business world, operating system images, patches, and updates are typically downloaded directly from the manufacturer’s website, providing an original and reliable source. However, it will always be necessary to apply the necessary configuration settings to fully harden the host, and in this case, file integrity monitoring technology can provide an additional and invaluable function.
The best enterprise FIM solutions can not only detect changes to configuration files and settings, but also analyze those settings to ensure that security configuration best practices have been applied.
In this way, all hosts can be verified as secure and configured in accordance not only with industry best-practice recommendations for safe operation, but also with any individual corporate hardened build standard.
A hardened build standard is a prerequisite for secure operations and is required by all formal security standards, such as PCI DSS, SOX, HIPAA, and ISO 27K.
Even if FIM is being adopted simply to meet the requirements of a compliance audit, there are a wide range of benefits to be gained in addition to simply passing the audit.
Protecting host systems against Trojan or malware infections cannot be left to antivirus technology alone. AV’s blind spot for zero-day threats and APT-style attacks leaves too much doubt about system integrity not to use FIM as an additional defense.
But preventing security breaches in the first place is the better starting point, and hardening a server, PC, or network device will block the vast majority of external infiltration attempts. Using a FIM system with auditing capabilities built around secure-configuration checklist best practices makes expert-level hardening straightforward.
Don’t just monitor the integrity of your files, check them first!