Tuesday, April 09, 2013

Oracle database security blueprint against network attacks

Databases play a vital role in many current enterprise systems. They are commonly used to store vital, critical and confidential data about customers, finance, logistics and other operations within the company. Due to the central role a lot of databases play in the overall architectural landscape of a company, it can be expected that companies take all measures to ensure the security of a database. Security can be seen from many different angles. For example, availability is a security aspect which is often not considered to be part of security. When people talk about security in general they think about how to prevent unwanted and unauthorised people from accessing a system or data.

When thinking about how to protect a system, a database in this case, from being accessed by people who are not intended to access it, a lot of people think about security in the following order: user accounts, networking, applications, operating systems and then the rest. All are equally important, however thinking about security is something that needs to be done very carefully. For example, network security is not simply placing a firewall between the database and the application server, or directly towards the rest of the world.

Below is the start of a blueprint which might help you to start your own database security blueprint. In this case we have taken a situation in which the database is used in combination with an application server which is connected to the public internet. A couple of things to keep in mind: we only take network security as a topic in this blogpost and we only think about security in a way to prevent users from attacking the database via the network to gain access. This rules out DDoS-style attacks and any attacks on the application server (directly).

In the image below you can see the implementation of the blueprint for a database in a more than averagely secured landscape. This is not yet considered a fully secure architecture, however it does provide you security against a large number of the general attacks towards the database that might be undertaken on a web-facing application.

It is common practice for most companies to place web-facing application servers in a DMZ for security reasons. What is not common practice is that both firewalls should be of a different make and model. The reason for this is that if an attacker were able to compromise the first firewall, it would be very simple to hack the second DMZ-inside firewall if it were of the same make and model.

Next to this you can notice that the application server is attached to two different (V)LANs. The reason for this is that on the user (V)LAN you most likely only want to have one single port open, which is exactly the same port you will allow to be accessed from the outside world. Due to this setup it is important that you have at least two different NICs: one NIC attached to the user VLAN and one attached to the application VLAN. On the application VLAN you can have more ports open than you will have on the user VLAN. As you can see, all servers in the blueprint design shown above are also hardened themselves by making use of a local firewall. This means that even though you have firewalls available on the network level, you also have a local firewall on every server as an extra layer of security. On Linux servers you would use iptables for this.
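As an illustration, a minimal iptables rule set for such a locally hardened application server could look like the sketch below. The interface names (eth0 for the user VLAN, eth1 for the application VLAN) and the port numbers are assumptions made for this example, not part of the blueprint itself; adjust them to your own landscape.

```shell
# Default to dropping all incoming traffic (assumed baseline policy)
iptables -F
iptables -P INPUT DROP
iptables -P FORWARD DROP
iptables -P OUTPUT ACCEPT

# Allow loopback traffic and replies to sessions we initiated
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# User VLAN (eth0, assumed): only the single public application port
iptables -A INPUT -i eth0 -p tcp --dport 443 -j ACCEPT

# Application VLAN (eth1, assumed): internal traffic plus administration
iptables -A INPUT -i eth1 -p tcp --dport 1521 -j ACCEPT   # SQL*Net towards the database firewall
iptables -A INPUT -i eth1 -p tcp --dport 22 -j ACCEPT     # SSH for administration
```

Note that this is only the host-local layer; the network firewalls in the blueprint stay in place on top of it.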

In your application design and your database schema and user design you already have to make sure the most common security measures are in place, for example by making use of deep application boundary validation in your code, which I already discussed in a previous blogpost.
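To illustrate the idea of boundary validation in combination with safe statement handling, the sketch below shows input checking plus bind variables instead of string concatenation. It uses Python with an in-memory SQLite database purely for illustration (the blueprint itself uses Oracle); the table and function names are made up for this example.

```python
import sqlite3

def find_customer(conn, customer_name):
    """Look up a customer using a bind variable, never string
    concatenation, so attacker-supplied input cannot alter the SQL."""
    # Boundary validation: reject input that violates basic expectations
    if not isinstance(customer_name, str) or len(customer_name) > 100:
        raise ValueError("invalid customer name")
    # The '?' placeholder keeps data strictly separated from the SQL text
    cur = conn.execute("SELECT id, name FROM customers WHERE name = ?",
                       (customer_name,))
    return cur.fetchall()

# Demonstration with an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Alice')")

print(find_customer(conn, "Alice"))             # normal lookup
print(find_customer(conn, "Alice' OR '1'='1"))  # injection attempt matches nothing
```

With concatenated SQL the second call would have returned every row; with a bind variable the injection string is just an unmatched literal.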

On a more database-specific security topic: in the blueprint design shown above you can see that the application server is not connected to the database server directly. Instead it is connected to a database firewall server which sits in its own Oracle database firewall DMZ. The reason the Oracle database firewall is placed here is that all the other firewalls, including the Linux internal firewalls, are only there to protect against unauthorised network routing. Those firewalls simply state whether a connection between two systems on a specified port is allowed. The Oracle database firewall adds to this that it checks the actual SQL statements that are executed. The Oracle database firewall protects you against a potential attack via, for example, SQL injection. I have discussed the Oracle database firewall in more detail in a previous blogpost.

The Oracle database firewall will be the point to which your application server connects, as if it were a normal database. The Oracle database firewall will check the statements it receives for a specific database against a whitelist of statements and, if approved, act as a "proxy" towards the database. This is shown in the below image from Oracle.
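From the application server's point of view this is transparent: its tnsnames.ora simply points at the database firewall host instead of the database host. A sketch of such an entry is shown below; the alias, hostname and service name are made-up examples, not values from the blueprint.

```
APPDB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbfw.example.internal)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = appdb.example.internal))
  )
```

The application keeps using the APPDB alias as before, while the firewall inspects the statements and forwards approved traffic to the real database.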

Now we have firewalls in place on the hosts, protecting the hosts with iptables. We also have firewalls in between the network segments and have separated the different network segments. We have also deployed an Oracle database firewall to ensure that not only the network traffic is controlled, but also that the statements that are sent to the database are valid and do not contain anything that could be used to exploit the database.

As a last line of defence we use a technique that is less known, even by most Oracle DBAs: we limit the hosts that can connect to the database on the database itself. In case someone is able to circumvent all firewalls and bypass the Oracle database firewall, we have an option to state in the database instance itself which hosts can connect, a sort of whitelist of hosts. A side note to this is that you will have to add, for example, your application servers and the Oracle database firewall to this list. If someone gains access to a shell on those servers and starts a SQL session from there, it is considered valid. However, it will hold back all SQL sessions from IPs that are not in the whitelist.

To enable this valid node checking function you have to add some information to your $TNS_ADMIN/sqlnet.ora configuration. You have to add the following to the file:

tcp.validnode_checking = YES
tcp.invited_nodes = ( X.X.X.X, hostname, ... )

Do note that if you do not add the IPs or hostnames of the machines your DBA is using, they will also be unable to connect to the system. Setting the tcp.validnode_checking option to YES is part of Oracle's security best practices and should (in my opinion) always be done unless you have a very valid reason not to.

A good thing to note is that if you use tcp.validnode_checking in an Oracle eBS setup, this is supported by the AutoConfig functionality. AutoConfig supports automated configuration of this setting. If the profile option "SQLNet Access" (FND_SQLNET_ACCESS) is set to "ALLOW_RESTRICTED" at the site level when AutoConfig is run on the database server, AutoConfig will add the IP restrictions to sqlnet.ora. The list of hosts will be all those from the FND_NODES table that are registered as an EBS node.
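If you want to see which nodes AutoConfig would pick up, you can look at the FND_NODES table directly. A sketch of such a query is below; the exact columns available can differ per EBS release, so verify the column names against your own environment before relying on this.

```sql
-- Registered EBS nodes as recorded by AutoConfig (column names may vary per release)
SELECT node_name, server_address
  FROM fnd_nodes
 ORDER BY node_name;
```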

For more information, refer to MOS Note 387859.1: Using AutoConfig to Manage System Configurations with Oracle Applications Release 12 - or the Oracle Applications Concepts manual.

Friday, April 05, 2013

Cloud computing risk of data silo

Companies are moving to the cloud more and more. Cloud applications in the form of SaaS are getting more common in enterprises, and hosting in the cloud in the form of DBaaS, PaaS or IaaS is getting more common by the day. In the "early" days of cloud computing the companies who adopted it were mostly small and medium businesses and startup companies. Now we see a move from enterprises towards the cloud. Large vendors are jumping on cloud solutions and almost every day new cloud services and providers see the light of day. The velocity at which new companies are being created brings back memories of the days before the Internet bubble; maybe we are creating a cloud bubble at the moment.

However, we have seen that from a collapsing bubble good things can also come. When the Internet bubble collapsed the good were separated from the bad, and this is something we might also see from a potential collapse of a potential cloud bubble.

Even though the cloud brings a lot of good, there are some things to consider when moving to the cloud. I already discussed the legal implications of moving to the cloud in another blogpost. There are however more things to consider. One of them is related to a potential collapse of a cloud bubble.

One of the things to consider, next to the legal impact of cloud computing and the fact that your data might be hosted in a country that has a different legal system than the country where your company is located, is the risk of data silos. The risk of creating a data silo is currently on the agenda of a lot of CIOs, and it is indeed a potential risk of cloud computing.

When we talk about a data silo we talk about a cloud-based application into which we put a lot of information, but which lacks the ability to extract this information. A large number of cloud-based companies, and especially SaaS-like companies, do offer a great way of having your application in the cloud and provide you with options to enter information into this system. However, the raw stored data is in most cases not accessible, or only in a very limited way. This makes it almost impossible to leave the cloud vendor and take your data with you to a new vendor.

In cases where this is possible, based upon a contractual obligation or via built-in functionality, it often is a hard task to get the data in such a format that it is usable to be migrated to a new system. In cases where you have time to plan for a migration this is often costly, however not impossible. In cases where this has to be done suddenly, for example due to a vendor that goes bankrupt, you will not have the time to start a project for this. In those cases it might happen that you are unable to extract the data, or unable to extract it in a form that is usable.

It is good practice to check with your cloud vendor who is offering you a SaaS solution, or even a PaaS or IaaS solution, what your options are to access your data and what services are available to extract it. Next to this it is good practice to check with your vendor what the rundown policy and exit clauses in the contract are, both in case the contract is ended and in case of a sudden bankruptcy.

In an ideal situation your vendor has an escrow-like mechanism in place for your data to ensure that your data is still available in case of a sudden bankruptcy. This will protect you from data loss in such a situation; it will however not protect you from service loss.

In case you want to extract your data during the contract period or at the end of a contract period, it is ideal to have a mechanism in place for data extraction. Such a mechanism can be a simple dump of all information in a pre-defined format, a direct coupling to the database holding your data, or access to your data via an API.
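As a sketch of what such an API-based extraction could look like, the Python example below pages through a record source and writes the result out as vendor-neutral JSON. The fetch_page function here is a local stand-in for a hypothetical vendor API (a real extraction would make authenticated HTTP calls); all names and data are made up for illustration.

```python
import json

# Stand-in for a hypothetical vendor API that serves records page by page.
RECORDS = [{"id": i, "name": "customer-%d" % i} for i in range(1, 8)]

def fetch_page(page, page_size=3):
    """Return one page of records; an empty list signals the end."""
    start = (page - 1) * page_size
    return RECORDS[start:start + page_size]

def export_all():
    """Page through the API until an empty page, collecting every record
    so the data can be stored in a format a new vendor could import."""
    out, page = [], 1
    while True:
        batch = fetch_page(page)
        if not batch:
            break
        out.extend(batch)
        page += 1
    return out

data = export_all()
print(json.dumps(data))   # vendor-neutral JSON dump of all records
print(len(data))          # 7 records in this made-up data set
```

The point is not the specific code but the contract behind it: a documented, paginated read interface over your own raw data is what keeps you out of the data silo.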