The combination of worsening, highly publicized data breaches and stricter regulatory compliance requirements is increasing the focus on database security.
Databases are subject to unique threats that cannot be handled by firewalls, intrusion prevention systems, or other perimeter defenses. The threat landscape is constantly evolving and becoming more sophisticated and specialized (e.g., attacks through memory backdoors inside databases).
From a risk management perspective, there is no difference between an external attacker and an insider threat. In other words, when planning for database security, assume the threat is an insider: an external attacker's plan will typically be to capture an insider's credentials and then attack the database.
Database Activity Monitoring
Database Activity Monitoring and Protection is a key component in Cymbel’s next-generation defense-in-depth architecture focused on applications, users, and data.
As with most security controls, there are network-based and host-based approaches to database security, and native logging is generally considered a third approach. Each has pros and cons:
Logging requires turning on the database product’s native logging capability. The main advantage of this approach is that it is a standard feature included with every database. Some database vendors, such as Oracle, also offer a complete but separately priced Database Activity Monitoring solution, which they claim supports other vendors’ databases as well. Here are the issues with logging:
- Logging cannot block actions that violate policies; it records activity after the fact. Even if you are logging to a SIEM that can take action, seconds or minutes pass before the SIEM processes the events, by which time the exploit will be at least partially complete.
- You need a log management or Security Information and Event Management (SIEM) system to normalize each vendor’s log format into a standard format, so you can correlate events across different databases and store the large volume of events generated. If you have already committed to a SIEM product, this might not be an issue, assuming the SIEM vendor does a good job with database logs.
- There can be significant performance overhead on the database associated with logging, possibly as high as 50%.
- Database administrators can tamper with the logs. Also, if an external attacker gains control of the database server, he/she is likely to turn logging off or delete the logs.
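To illustrate the normalization burden a SIEM takes on, here is a minimal sketch that maps two vendors’ audit-log lines into one vendor-neutral event schema. The log formats, field names, and regexes are invented for illustration; real audit formats differ per product and version.

```python
import re
from datetime import datetime

# Hypothetical vendor log formats -- real audit formats differ per product.
ORACLE_RE = re.compile(r"(?P<ts>[\d\-]+ [\d:]+) USER=(?P<user>\w+) ACTION=(?P<sql>.+)")
MYSQL_RE = re.compile(r"(?P<ts>[\d\-]+T[\d:]+)\s+(?P<user>\w+)\s+Query\s+(?P<sql>.+)")

def normalize(line, vendor):
    """Map a raw audit-log line to a vendor-neutral event dict."""
    pattern = {"oracle": ORACLE_RE, "mysql": MYSQL_RE}[vendor]
    m = pattern.match(line)
    if not m:
        return None
    ts_format = "%Y-%m-%d %H:%M:%S" if vendor == "oracle" else "%Y-%m-%dT%H:%M:%S"
    return {
        "timestamp": datetime.strptime(m.group("ts"), ts_format),
        "db_user": m.group("user"),
        "statement": m.group("sql").strip(),
        "source": vendor,
    }

# Two raw lines in different formats become two comparable events.
events = [
    normalize("2010-06-01 14:03:22 USER=SCOTT ACTION=SELECT * FROM payroll", "oracle"),
    normalize("2010-06-01T14:03:25 appuser Query DELETE FROM audit_trail", "mysql"),
]
```

Once events share a schema, cross-database correlation (e.g., the same user touching sensitive tables on two servers) becomes a simple query over the event store.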
In the network-based approach, an appliance is connected to a tap or a span port on the switch that sits in front of the database servers. Traffic to, and in most cases from, the databases is captured and analyzed. Clearly this puts no performance burden on the database servers at all. It also provides a degree of isolation from the database administrators. Here are the issues:
- Local database calls and stored procedures are not seen. Therefore you have an incomplete picture of database activity.
- You must have the network infrastructure to support these appliances.
- It can get expensive depending on how many databases you have and how geographically dispersed they are.
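Conceptually, the appliance reassembles captured traffic and pulls database statements out of it. The toy sketch below scans a raw payload for SQL-looking statements; a real appliance decodes each vendor’s wire protocol (TNS, TDS, the MySQL protocol, and so on), and the framing bytes here are invented.

```python
import re

# Match a SQL verb and everything up to a NUL framing byte (illustrative only).
SQL_RE = re.compile(rb"(SELECT|INSERT|UPDATE|DELETE|DROP|ALTER)\b[^\x00]*", re.IGNORECASE)

def extract_statements(payload: bytes):
    """Pull SQL-looking statements out of a captured TCP payload.

    A real appliance decodes the vendor wire protocol; keyword
    scanning is purely an illustration of the idea.
    """
    return [m.group(0).decode(errors="replace") for m in SQL_RE.finditer(payload)]

# Simulated payload as a tap might capture it (protocol framing is made up).
packet = b"\x03\x00\x1aSELECT card_number FROM payments\x00"
statements = extract_statements(packet)
```

Note that this passive view is exactly why local calls and stored procedures are invisible to the appliance: they never cross the wire.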
In the host-based approach, an agent is installed directly on each database server. The overhead is much lower than with native database logging, as low as 1% to 5%, although you should test this for yourself. The agent also sees everything, including stored procedures, and database administrators will have a hard time interfering with it without being noticed. Deployment is simple: neither the networking group nor the datacenter team needs to be involved. Finally, the installation process should not require a database restart. Here are the issues:
- Building and maintaining the agent software is more difficult and time-consuming for the vendor than the network approach. However, this is the vendor’s issue, not the user’s.
- The analysis is performed by the agent right on the database. This could mean additional overhead, but has the advantage of being able to block a query that is not “in policy.”
- Under heavy load, transactions could be missed. But even if this is true, it’s still better than the network based approach which surely misses local actions and stored procedures.
- IT administrators could use the agent to snoop on database transactions to which they would not normally have access.
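To make the blocking point concrete, here is a minimal sketch of an in-agent policy check that decides whether a statement is “in policy” before it reaches the database. The user names, table names, policy rules, and regex are all invented for illustration; a real agent uses a proper SQL parser and a much richer policy model.

```python
import re

# Hypothetical policy: which users may perform which actions on which tables.
POLICY = {
    "app_user": {"orders": {"SELECT", "INSERT"}, "customers": {"SELECT"}},
    "etl_user": {"orders": {"SELECT"}},
}

# Naive matcher for simple single-table statements (illustration only).
STMT_RE = re.compile(
    r"^\s*(SELECT|INSERT|UPDATE|DELETE)\b.*?\b(?:FROM|INTO)\s+(\w+)",
    re.IGNORECASE | re.DOTALL,
)

def check_query(user, sql):
    """Return True if the statement is 'in policy', False to block it."""
    m = STMT_RE.match(sql)
    if not m:
        return False  # fail closed on anything we cannot parse
    verb, table = m.group(1).upper(), m.group(2).lower()
    return verb in POLICY.get(user, {}).get(table, set())
```

Because the check runs in-line on the server, the agent can reject the statement outright, which is precisely what after-the-fact logging cannot do.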
The ideal solution is a combination of network-based protection plus host-based agents where needed.
Database Encryption
Organizations are increasingly turning to database encryption as a layer of a defense-in-depth architecture to protect sensitive information because (1) there have been highly publicized data breaches that database encryption would have prevented, and (2) regulatory compliance regimes like PCI DSS require it. In Massachusetts, at present, MA 201 CMR 17 only requires encryption for laptops and “…all transmitted records and files containing personal information that will travel across public networks, and encryption of all data containing personal information to be transmitted wirelessly.”
Data Masking
Many functions within the organization require database data beyond the production systems; high on the list is the development team’s need for test data. Unfortunately, the easiest way to provide it is simply to copy data from the production databases, which creates a risk of data breaches. Data Masking is an algorithmic method of converting live data into test data that (1) is still useful to developers and (2) does not expose confidential or compliance-protected information.
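A minimal sketch of the idea, with invented field names and masking rules: sensitive values are replaced deterministically (so joins across masked tables still work) or format-preservingly, while non-sensitive fields pass through untouched. Commercial masking products offer many more transformations (shuffling, date skewing, format-preserving encryption, and so on).

```python
import hashlib

def mask_record(record, secret="not-a-real-key"):
    """Produce a development-safe copy of a production row."""
    masked = dict(record)
    # Deterministic pseudonym: the same input always maps to the same
    # fake name, so referential integrity across masked tables survives.
    digest = hashlib.sha256((secret + record["name"]).encode()).hexdigest()
    masked["name"] = "user_" + digest[:8]
    # Preserve the format but destroy the value of the card number,
    # keeping the last four digits for realistic-looking test data.
    masked["card_number"] = "#" * 12 + record["card_number"][-4:]
    # Non-sensitive fields stay usable for testing as-is.
    return masked

row = {"name": "Alice Smith", "card_number": "4111111111111111", "city": "Boston"}
test_row = mask_record(row)
```

Developers get rows that look and behave like production data, but a breach of the test environment exposes nothing confidential.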
If you have a question or a comment, or would like more information, please let us know by completing the Contact Us box on the upper right side of this page.