
Perimeter
5/24/2011 11:41 AM
Adrian Lane
Commentary

Oracle 11G Available On AWS

When testing Oracle on Amazon AWS, consider how you will secure your data

Amazon announced availability of Oracle 11G machine images on Amazon AWS. It's fairly apparent that what's being provided is not really intended for production deployments at this time. Still, it's a neat way for customers to test out the elasticity of Platform as a Service and sample Oracle 11G's capabilities without having to invest in database licenses and new hardware.

But Amazon environments are very different from traditional IT. Sure, you can spin up a database really fast, but to do anything with it, you'll need to configure just about every aspect of your environment. And it's harder to inspect and monitor servers in the cloud. That applies to security as well as general administration.

In Amazon's announcement, security information is limited to automatic patching of the image files, and the following: "You can also control access and security for your instance(s) and manage your database backups and snapshots." Automated patching is a great advantage.

Unfortunately, there is a lot more to Oracle security than patching: you'll have to work out access control and archive encryption yourself, because Amazon's guidance leaves big gaps between theory and execution on both fronts. From a security perspective, there are many steps you need to perform before the database can reliably store any data, much less sensitive information.

It's pretty clear that this service -- at least for the time being -- is not for production usage. Even if you are not planning on building a production database, it's beneficial to consider the security implications while you run your tests. Here is a quick rundown of what you need to consider for access controls and archive security:

Encryption: You are going to need to encrypt your archives and snapshots; Amazon S3 is simple, but not necessarily secure. I'll jump right to the point and say you'll want transparent database encryption so that any archives or snapshots are automatically encrypted. You'll need to acquire the add-on Oracle package or install an OS-layer encryption product like Vormetric. In either case, since Amazon is patching the image files, you'll need to understand how this affects additional encryption features; most likely you will re-apply setup scripts or re-install products on the virtual image.
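To make the re-apply-on-patch idea concrete, here is a minimal sketch that generates the sqlnet.ora wallet stanza and the Oracle 11g TDE setup SQL a startup script could re-emit after each image patch. The wallet directory, password, and the sample table/column are hypothetical placeholders, not values from the Amazon images.

```python
# Sketch: emit the sqlnet.ora entry and SQL needed to enable Oracle 11g
# transparent data encryption (TDE). The wallet path and example column are
# illustrative; regenerate these from a startup script, since Amazon may
# re-patch the image files underneath you.

WALLET_DIR = "/u01/app/oracle/admin/orcl/wallet"  # hypothetical path

def sqlnet_wallet_entry(wallet_dir: str) -> str:
    """Return the ENCRYPTION_WALLET_LOCATION stanza for sqlnet.ora."""
    return (
        "ENCRYPTION_WALLET_LOCATION=\n"
        "  (SOURCE=(METHOD=FILE)(METHOD_DATA=\n"
        f"    (DIRECTORY={wallet_dir})))\n"
    )

def tde_setup_sql(wallet_password: str) -> list[str]:
    """SQL to create the wallet/master key and encrypt a sample column."""
    return [
        f'ALTER SYSTEM SET ENCRYPTION KEY IDENTIFIED BY "{wallet_password}";',
        "ALTER TABLE customers MODIFY (ssn ENCRYPT);",  # example column
    ]
```

With this in place, an encrypted column's data -- and anything archived or snapshotted from it -- stays encrypted at rest, which is the property you want before trusting S3 with backups.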

Authentication: You need to determine how you are going to authenticate users: internally or externally. I recommend external, but that still leaves a couple of options for how you do it. You can create and deploy an LDAP service in the cloud, leverage Amazon services for credentials, or link back to your existing IT services. Whichever way you go, you are responsible for user setup and for validating the security of what you deploy.
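If you do end up managing credentials yourself (the internal option above), at minimum store salted, stretched hashes rather than raw passwords. A minimal stdlib sketch using PBKDF2; the iteration count and salt size here are illustrative, not a recommendation tuned for your hardware.

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 100_000) -> tuple[bytes, bytes]:
    """Derive a salted PBKDF2-SHA256 hash; returns (salt, digest)."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 100_000) -> bool:
    """Constant-time comparison against the stored digest."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)
```

External authentication makes this someone else's problem, which is exactly why I recommend it.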

If you plan on doing anything more than basic testing and proofs of concept -- and then totally dismantling the database afterward -- here are several other things you should consider:

Certificate Management: There is a lot of management to the machine images, disk images, virtual networking, and data management; you will be running scripts to auto-configure images at startup and link all of the resources together. You'll have certificates issued to validate connections and admin capabilities, so you need to keep these certificates in a secure location and distribute them to a select few. I don't recommend keeping public and private keys in the same directory, and I definitely don't recommend installing certificates on images where they can be compromised.
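One cheap sanity check before publishing an image is to scan it for exactly the mistake described above: PEM private keys left on disk next to the certificates. A minimal sketch; the `.pem` glob and marker string are simplifications -- a real audit would also cover other key formats and file permissions.

```python
# Sketch: scan a directory tree for PEM files containing private-key
# material -- the situation the paragraph above warns against.
from pathlib import Path

PRIVATE_MARKER = "PRIVATE KEY"  # matches RSA/EC/PKCS#8 PEM headers

def find_private_keys(root: str) -> list[str]:
    """Return paths of .pem files under root that contain a private key."""
    hits = []
    for path in Path(root).rglob("*.pem"):
        try:
            if PRIVATE_MARKER in path.read_text(errors="ignore"):
                hits.append(str(path))
        except OSError:
            continue  # unreadable file; skip it
    return hits
```

Run it against the image's filesystem before you snapshot; an empty result is what you want.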

Network Access: The database should only be indirectly accessible through your applications, or through a secured connection from your existing IT environment. You have the option of creating a virtual network with Amazon's Virtual Private Cloud (VPC) and controlling how database connections can be created. You'll want to set up a VPN tunnel for management connections, and you'll want to give the database a private IP address so that it cannot be publicly addressed.
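A simple automated check can catch the worst misconfiguration here: the Oracle listener port open to the entire Internet. The rule format below is a simplified stand-in for illustration, not the actual AWS API shape.

```python
# Sketch: a pre-launch check over firewall/security-group rules (represented
# here as plain dicts) that flags the Oracle listener port exposed to the
# world. Rule shape is hypothetical.
ORACLE_LISTENER_PORT = 1521

def publicly_exposed(rules: list[dict]) -> list[dict]:
    """Return rules that open the listener port to 0.0.0.0/0."""
    return [
        r for r in rules
        if r["cidr"] == "0.0.0.0/0"
        and r["from_port"] <= ORACLE_LISTENER_PORT <= r["to_port"]
    ]
```

Ideally the only inbound rules that touch port 1521 name your application subnet or the VPN endpoint, never a public CIDR.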

Assessment: Assess the database configuration to ensure you turn off unwanted services and reset default passwords. If you are quickly spinning up and shutting down databases, it's easy to miss configuration details, so get an assessment tool to validate security settings. Consider creating a script that runs prior to launching the images so your systems have a secure baseline configuration.
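As one small piece of such a baseline script, you can diff the database's open accounts against well-known Oracle default accounts. The list below is a tiny sample for illustration; real assessment tools carry far larger checklists.

```python
# Sketch: flag well-known default Oracle accounts that are still open.
# KNOWN_DEFAULT_ACCOUNTS is a small illustrative sample, not exhaustive.
KNOWN_DEFAULT_ACCOUNTS = {"SCOTT", "DBSNMP", "OUTLN", "MDSYS", "CTXSYS"}

def risky_accounts(open_accounts: set[str]) -> set[str]:
    """Accounts to lock or re-password before the instance goes live."""
    return {a.upper() for a in open_accounts} & KNOWN_DEFAULT_ACCOUNTS
```

Feed it the result of a query against DBA_USERS for unlocked accounts, and fail the launch script if the set is non-empty.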

Key Management: Consider how you want to manage encryption keys for the database. Sure, you can install the keys on the disk image, but that's not very secure, as they can be read by attackers. You'll likely need a key server in the cloud or, once again, one hosted in your existing IT environment.
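The design point is the seam between "key baked into the image" and "key fetched at boot." A minimal sketch of that seam; the remote fetch is a stub standing in for an authenticated call to whatever key server you deploy, which is entirely hypothetical here.

```python
# Sketch: keep the master key off the disk image by fetching it from an
# external key service at startup, so a snapshot or stolen image never
# contains the key. The remote transport is left as a caller-supplied stub.
from abc import ABC, abstractmethod
from typing import Callable

class KeyProvider(ABC):
    @abstractmethod
    def master_key(self) -> bytes: ...

class LocalFileKeyProvider(KeyProvider):
    """What the paragraph above warns against: key readable off the image."""
    def __init__(self, path: str):
        self.path = path
    def master_key(self) -> bytes:
        with open(self.path, "rb") as f:
            return f.read()

class RemoteKeyProvider(KeyProvider):
    """Fetch the key from a key server at instance startup."""
    def __init__(self, fetch: Callable[[], bytes]):
        self._fetch = fetch  # authenticated network call, supplied by caller
    def master_key(self) -> bytes:
        return self._fetch()
```

Swapping providers is then a one-line configuration change, which makes it easy to start with the insecure local option during testing and tighten up before any real data arrives.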

Masking: If you're testing a new application, or the viability of an existing application in the cloud, you'll need test data. Any data that you put in the cloud should not be production data until you have audited database security. Do yourself a favor and get a masking tool that will auto-generate data for you, or obfuscate existing data, before moving it into the cloud database.

Adrian Lane is an analyst/CTO with Securosis LLC, an independent security consulting practice. Special to Dark Reading.
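Returning to the masking step above: a minimal sketch of deterministic masking, so the same input always masks to the same output and test data stays referentially consistent across tables. The secret is an illustrative placeholder, and a commercial masking tool will preserve formats far more faithfully than this digit-by-digit substitution.

```python
# Sketch: deterministically replace digits in a value (e.g. an SSN or
# account number) using an HMAC keystream, preserving length and separators.
# MASKING_SECRET is a hypothetical per-environment secret; rotate it and
# keep it out of the cloud image.
import hashlib
import hmac

MASKING_SECRET = b"rotate-me"

def mask_digits(value: str, secret: bytes = MASKING_SECRET) -> str:
    """Replace each digit with one derived from an HMAC of the whole value."""
    stream = hmac.new(secret, value.encode(), hashlib.sha256).hexdigest()
    out, i = [], 0
    for ch in value:
        if ch.isdigit():
            out.append(str(int(stream[i % len(stream)], 16) % 10))
            i += 1
        else:
            out.append(ch)
    return "".join(out)
```

Because the output depends only on the input and the secret, a customer ID masked in one table matches the same ID masked in another, which keeps joins in your test queries working.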
