
Monday, June 12, 2017

Oracle Data Visualization Cloud Service - upload files

Within the Oracle Cloud portfolio, Oracle has positioned the Oracle Data Visualization Cloud Service as the tool to explore your data, visualize it, and share insights with other people within the enterprise. The Oracle Data Visualization Cloud Service can be used as part of a data democratization strategy within an enterprise to give all users access to data where and whenever they need it. The concept of data democratization is discussed in another blogpost on this blog.

Currently the Oracle Data Visualization Cloud Service provides two main ways of getting data into the service. One is by connecting it to an Oracle Database source, for example one located in the Oracle Database Cloud Service; the other is by uploading a file. For file uploads, primarily CSV and XLSX files are supported.

In most cases it is not a best practice to upload files, as the data is relatively static and is not connected to a live data source as it would be with a database connection. However, in some cases it can be a good way to get data in, for example for users who add their own content and do not have the means to connect to an Oracle database, or for data that is relatively static anyway.

Example data from DUO
In the example below we add a relatively static piece of data to the Oracle Data Visualization Cloud Service: a year-over-year report of people dropping out of schools in the Netherlands. The data is broken down per year, per location and per education type, and is freely available as open data from the DUO website. You can locate the file for reference here.

Loading a data file
When a user wants to load data into the Oracle Data Visualization Cloud Service, the easiest way to do so from an end-user perspective is to use the GUI. Loading data takes only a few steps.

1) Within the Data Sources section, navigate to "Create" - "Data Source". The type "File" is selected by default; once selected, you are presented with the option to pick a file on your local file system.


2) The next step, after the file is uploaded, is to verify and, if needed, modify the definition of the uploaded data.


3) After this step is completed you will find the file ready for use in your data sources, as shown below.

In effect, those are the only actions needed by a user to add data to the Oracle Data Visualization Cloud Service.
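
If the raw export needs light preparation before it is uploaded (for example selecting columns or renaming headers so the data definition is easier to verify in step 2), a short script can help. The sketch below is a minimal, hypothetical example using pandas; the file name, separator and column names are assumptions and will not match the actual DUO export layout.

import pandas as pd

# Hypothetical example: the file name, separator and column names are
# assumptions, not the actual DUO export layout.
raw = pd.read_csv("dropouts_raw.csv", sep=";")

# Keep only the columns we want to visualize and give them readable headers.
prepared = raw.rename(columns={
    "JAAR": "year",
    "GEMEENTE": "municipality",
    "ONDERWIJSSOORT": "education_type",
    "AANTAL": "dropouts",
})[["year", "municipality", "education_type", "dropouts"]]

# Write a clean, comma-separated CSV ready to be uploaded as a data source.
prepared.to_csv("dropouts_prepared.csv", index=False)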

Wednesday, December 28, 2016

Oracle Cloud - Governance Risk and Compliance Framework based deployments

Regulations, compliance rules, guidelines and security directives should all be part of the overall governance, risk and compliance framework implemented within your company. Governance, risk and compliance frameworks are developed and implemented to bring structure to how companies organize themselves to protect against risks and how they react in case of an incident.

When developing a governance, risk and compliance framework to be implemented within a large organisation, companies often look at market best-practice standards, regulatory requirements and internal standards. Within a wider model, organisations will be required to look at strategic, operational and tactical risks to ensure the entire organisation is covered by the governance, risk and compliance framework.


Information systems and data - tier 3
As can be seen in the above image, which gives a high-level view of a popular governance, risk and compliance framework outline, such a framework consists of three tiers. Only tier 3 includes information systems and data as a primary focus. A framework and its implementation is only complete, and only makes sense, if you focus on all three tiers and not only on one or two of them.

Even though this post focuses on tier 3, information systems and data, in an overall governance, risk and compliance framework all layers should be covered. That said, all three tiers influence each other and the end-to-end model cannot be created without cross-functions and overlap.

Implementation by standardization and Enterprise Architecture
When looking at tier 3 specifically, one of the things that becomes obvious rather quickly is that creating and implementing a governance, risk and compliance framework requires standardization.

To be able to build a good and workable governance, risk and compliance framework you will have to standardize and limit the number of technologies as much as possible. Building standardized building blocks, adopting them in your organisation's enterprise architecture repository and ensuring they are complemented with standard implementation and security rules is a good default rule. Ensuring the standardized solution building blocks and the governance, risk and compliance framework complement each other is a vital cornerstone for being successful in tier 3 implementations.

The below diagram shows the part of TOGAF relevant to building a governance, risk and compliance framework-enabled Enterprise Architecture.


When developing a governance, risk and compliance framework you will have to ensure that it is adopted in the Enterprise Architecture framework as a standard. This means that the framework will have to be implemented in the Standards Information Base. Doing so ensures that it is included in the architecture landscape and, as a result, ends up in the solution building blocks.

The "Standards Information Base" will hold standards on architecture, standards on configuration as well as coding and development standards. This will also hold all standards coming from the governance, risk and compliance framework to ensure that this is embedded in the foundation of the architecture.

It is of vital importance not only to include the standards coming from the governance, risk and compliance framework in the "Standards Information Base"; it is equally important to ensure that they are applied and used, and that an architecture compliance review process is in place to verify this.

Having the standards derived from the governance, risk and compliance framework embedded in the enterprise architecture, and ensuring they are applied at the architecture level and used in the right manner, will help enforce the implementation in tier 3.

Compliance assessment
Having a governance, risk and compliance framework in place, embedding it in the "Standards Information Base" of your Enterprise Architecture Repository and ensuring with an architecture compliance review process that the standards are included in the resulting architectures and solution building blocks is only a part of the end-to-end solution.

Ensuring that your architecture is in line with the requirements stated in the governance, risk and compliance framework is no guarantee that it is implemented in that manner. And even when it is implemented in compliance with the standards, there is no guarantee it will remain that way during operations.

What is required to ensure the correct level of compliance with the standards is constant monitoring of the current deployments and of the degree to which they are compliant. Solutions like Puppet can be used to complete this task up to a point and report the level of deviation from the standard requirements; however, Puppet and similar tools are not designed for this specific purpose and can only take the task so far.
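
To illustrate the kind of check such a tool performs, the sketch below compares the actual configuration of a system against a small set of required values and reports the deviations as a score. It is a minimal, generic illustration; the rule names and values are assumptions and are not taken from any specific benchmark or product.

# Minimal sketch of a compliance check: compare actual settings against
# required values and report deviations. Rule names and values are
# hypothetical and not taken from any specific benchmark.
required = {
    "ssh_permit_root_login": "no",
    "password_max_days": 90,
    "audit_daemon_enabled": True,
}

actual = {
    "ssh_permit_root_login": "yes",
    "password_max_days": 90,
    "audit_daemon_enabled": False,
}

violations = {rule: (expected, actual.get(rule))
              for rule, expected in required.items()
              if actual.get(rule) != expected}

score = 100 * (len(required) - len(violations)) / len(required)
print(f"compliance score: {score:.0f}%")
for rule, (expected, found) in violations.items():
    print(f"violation: {rule} expected {expected!r}, found {found!r}")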

Oracle provides a fully built-for-purpose solution as part of the Oracle Management Cloud Service. The Oracle Compliance Cloud Service is a software-as-a-service solution that enables the IT and business compliance function to assess and score industry-standard benchmarks, REST-based cloud resources and your own custom rules. With the Oracle Compliance Cloud Service you can score, assign and remediate compliance violations both on premises and in the cloud.



The Oracle Compliance Cloud Service allows you to monitor systems and applications deployed in the Oracle Cloud, in other public clouds, in your local datacenter and in your private cloud. It provides constant monitoring and reporting so that you have real-time insight into the level of compliance, either against the standards defined by your own organisation or against industry and regulatory standards.

Having constant real-time insight and the ability to define automatic actions when a check fails ensures that you gain more control over the actual implementation of the governance, risk and compliance framework in tier 3. Continuous real-time assessments directly uncover situations that might lead to issues and empower IT to ensure security, reliability and compliance at all times.

Monday, December 19, 2016

Oracle Cloud – Changing the tool chain for DevOps

Organisations used to make use of a waterfall-based strategy with a clear split between development and operations: a model in which developers were tasked with developing new functionality and improving existing functionality based upon change requests, while operations departments were tasked with running the production systems with the code developed by the development teams, without a clear feedback channel back to the developers. With the adoption of new ways of working and the rise of DevOps, a change is happening in companies of all sizes.

Development and operations departments are merged into DevOps teams focused on a set of products from both a development and an operational run point of view, as opposed to being focused on development only or operational support only.

The general view, and the general outcome, of this model is that by merging operational and development teams into one team responsible for the entire lifecycle of a product, both the overall technical quality of the solution and the quality of its operational use improve.

Transition challenges
Traditionally organized teams making the transition from a split operations-and-development model to a DevOps model will face a number of challenges. One of the most challenging parts of this transition is the culture shift and the change of responsibilities and expectations for each individual on the team. People tasked with operational support will now also be required to work on improving the solution and extending functionality. People who are used to working only on new code are now also expected to support the solution in the run phase.

Another challenging part of the transition is the adoption of a totally new way of working and a new toolset. When working with the Oracle Cloud in a DevOps way you have the option to use all the tools from the standard DevOps tool chain.

Changing the tool chain
When moving from a more traditional way of IT operations to a DevOps operational model you will see that the tool chain changes. The interesting part is that most of the common tools in the DevOps tool chain are open, and most of them are open source. Secondly, there is no single, one-size-fits-all DevOps tool. Rather, the most effective results come from standardizing on a tool chain that maps directly to best practices such as version control, peer review and continuous delivery, all built on a foundation of managing infrastructure as code and continuous delivery and deployment.

There is a large set of tools commonly used in different DevOps setups, in conventional IT, private cloud, public cloud and hybrid cloud situations alike. The combination of tools and how they are used in a specific DevOps footprint is primarily driven by the type of applications that are maintained, the level of expertise of the DevOps team and the level of integration with tools already present in the tool chain.

Oracle Cloud DevOps ToolChain

DevOps tool chain in the Oracle Cloud
A growing part of the DevOps tool chain is available by default in the Oracle Public Cloud, enabling your DevOps teams to start using it from day one. When using the Oracle Public Cloud, consuming for example PaaS and IaaS services, and building a DevOps tool chain around the Oracle Public Cloud and other public and private clouds, it is wise to look into the options Oracle provides.

Oracle Cloud DevOps ToolChain

As can be seen in the above image, the Oracle Developer Cloud Service already provides a large set of DevOps tools out of the box. In addition, the Oracle Compute Cloud Service provides IaaS and the ability to run Oracle Linux instances, which you can use to run whatever DevOps tool you need and integrate it with the tools already available in the Oracle Developer Cloud Service.

Monitoring and Orchestration 
As DevOps revolves not only around developing and deploying code but also includes constant monitoring of your environment and orchestration of the end-to-end flow, a large part of the DevOps team's time is spent on these tasks. Within the open and open-source tool chain a lot of different tools can be found that are capable of doing this.

DevOps teams can decide to make use of those tools, or they can make use of services provided from within the Oracle Cloud. The tools the Oracle Cloud provides for this can be used for DevOps tasks in the Oracle Cloud as well as in other clouds or for on-premises deployed solutions.

As an example, the Oracle Orchestration Cloud Service can take on a large part of the end-to-end orchestration tasks within the DevOps model. Next to this, the management portfolio of the Oracle Public Cloud contains tools for application monitoring, infrastructure monitoring and log analytics, tasks also commonly given to solutions like the Elasticsearch stack or Splunk.

The Oracle Public Cloud picture
If we combine the above in one picture, we see that we can run the entire DevOps tool chain in the Oracle Cloud. This includes the feedback needed from continuous monitoring, by leveraging the products from the Oracle Cloud monitoring portfolio.


As you can see, the Oracle Cloud is not the only target when running the DevOps tool chain in the Oracle Cloud; you can use it as a central tool location to develop and operate solutions deployed at any location. This can be the Oracle Public Cloud, a private cloud or another, non-Oracle public cloud.

Thursday, December 01, 2016

Oracle - Profitability and Cost Management Cloud Service

Often it is hard to find the true costs of a product and make a true calculation of its profitability. As an example, a retail organization might figure that the sales price minus the purchase price is the profitability of a single product. The hidden overall costs for overhead, transportation, IT services and others are then deducted from the overall company revenue. Even though this gives you the correct overall company revenue, it does not give you the correct profitability figures per product or service.

Not being able to see on a product or service level what the exact profitability is might result in having a sub-optimal set of products. Being able to see on a product or service level which products and services are profitable and which are not can help companies to create a clean portfolio and become more profitable overall.


With the launch of the Profitability and Cost Management Cloud Service, Oracle aims to provide a solution for this.

It takes information streams like production cost figures, facility cost figures and human resource cost figures and combines those with the information from your core general ledger. The combined set of figures is loaded into the performance ledger, enabling the Profitability and Cost Management Cloud to analyze and calculate the true costs of a product.

The core of the product is a web-based interface which allows analysts to combine, include and exclude sets of data to create a calculation model, enabling them to see the true costs, including the hidden costs.
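
As a simplified illustration of what such a calculation model does, the sketch below allocates a shared overhead cost to products in proportion to a chosen driver (here, units sold) and derives a per-product profit. The figures and the single allocation rule are invented for the example; the actual service supports far richer, multi-step allocation models.

# Simplified illustration of overhead allocation; the figures and the
# allocation driver (units sold) are invented for this example.
products = {
    # name: (revenue, direct_cost, units_sold)
    "product_a": (100_000, 60_000, 5_000),
    "product_b": (80_000, 30_000, 1_000),
}
shared_overhead = 40_000  # overhead, transportation, IT services, ...

total_units = sum(units for _, _, units in products.values())

for name, (revenue, direct_cost, units) in products.items():
    allocated = shared_overhead * units / total_units
    profit = revenue - direct_cost - allocated
    print(f"{name}: allocated overhead {allocated:,.0f}, true profit {profit:,.0f}")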

As Profitability and Cost Management Cloud makes use of Hyperion in the background, you are also able to use the Smart View for Office option and include the data and the data model results you create in Profitability and Cost Management Cloud in your local Excel workbooks. As a lot of people still like to do additional analysis in Microsoft Excel, the inclusion of Smart View for Office makes a lot of sense to business users.

In the above video you can see an introduction to the new cloud service from Oracle. 

Monday, November 21, 2016

Using Oracle cloud to integrate Salesforce and Amazon hosted SAP

Oracle Integration Cloud Service (ICS) delivers “Hybrid” Integration. Oracle Integration Cloud Service is a simple and powerful integration platform in the cloud to maximize the value of your investments in SaaS and on-premises applications. It includes an intuitive web-based integration designer for point-and-click integration between applications and a rich monitoring dashboard that provides real-time insight into the transactions, all running on Oracle Public Cloud. Oracle Integration Cloud Service helps accelerate integration projects and significantly shorten time-to-market through its intuitive and simplified designer, an intelligent data mapper, and a library of adapters to connect to various applications.

Oracle Integration Cloud Service can also be leveraged during a transition from on-premises to cloud or when building a multi-cloud strategy. As an example, Oracle provides a standardized connection between Salesforce and SAP, as shown above.
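
To give an idea of what such an integration conceptually does, the sketch below maps a Salesforce account record to an SAP customer structure. In ICS this mapping is configured point-and-click in the data mapper rather than written as code, and the field names used here are assumptions chosen purely for illustration.

# Conceptual illustration of a field mapping between the two applications.
# In ICS this is configured in the data mapper, not written as code;
# the field names below are assumptions for illustration only.
def map_salesforce_account_to_sap_customer(sf_account: dict) -> dict:
    return {
        "KUNNR": sf_account["AccountNumber"],   # customer number
        "NAME1": sf_account["Name"],            # customer name
        "LAND1": sf_account["BillingCountry"],  # country key
        "ORT01": sf_account["BillingCity"],     # city
    }

sf_account = {
    "AccountNumber": "0012345",
    "Name": "Example Corp",
    "BillingCountry": "NL",
    "BillingCity": "Amsterdam",
}
print(map_salesforce_account_to_sap_customer(sf_account))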


An outline of how to achieve this integration is shown in the below video, which covers the options and the ease of developing an integration between Salesforce and SAP and ensuring the two solutions work as one integrated, hybrid solution.



As enterprises move more and more towards a full cloud strategy, having a central integration point already positioned in the cloud is ideal. As an example, SAP can run on Amazon. During a test and migration path to the cloud you will most likely want to ensure you can test your integration between Salesforce and SAP without the need to re-code and re-develop the integration.



By using Oracle Integration Cloud Service as your central integration solution, the move to a cloud strategy for your non-cloud-native applications becomes much easier. You can add a second integration during your test and migration phase, and when your migration to, for example, Amazon has been completed you can discontinue the integration to your old on-premises SAP instances.


This will finally result in an all-cloud deployment where you have certain business functions running in Salesforce and your SAP systems running in Amazon, while you leverage Oracle Integration Cloud Service to bind all systems together and make it a true hybrid multi-cloud solution.

Friday, October 14, 2016

Obtaining OPCinit for Oracle Linux

When deploying an Oracle Linux instance on the Oracle Public Cloud you will most likely use the default Oracle Linux templates. That is, up until the moment that you need more than what is provided by the template.

It might very well be that at one point in time you feel that scripting additional configuration to be applied after deployment is no longer satisfactory, and for some reason you would like to have your own private template. Oracle provides some good documentation on how to do this. You can read some of this in the "Using Oracle Compute Cloud Service" documentation under the "Building Your Own Machine Images" section.

The documentation, however, lacks one very important point: while you can find references to using OPCinit when creating your template, until recently OPCinit itself was missing online and you were not able to download it. You could reverse engineer OPCinit from an existing template and use it, but the vanilla download was not available, nor was it available on the Oracle Linux YUM repository.

Now Oracle has solved this by providing a download link to a zip file containing two RPMs that you can install in your template to ensure it makes use of OPCinit.

You can download OPCinit from the Oracle website at this location. Unfortunately it is not available on the public Oracle Linux YUM repository, so you have to download it manually.

Sunday, September 18, 2016

Oracle Cloud master orchestrations

When deploying a new instance in the Oracle Compute Cloud Service you will notice this is driven by an orchestration. If you look at the orchestration tab you will notice it is not a single orchestration; it consists of three orchestrations: one that bundles the others together and two that each create a tangible object. In our case the tangible objects are an instance and a storage object.

  • Master orchestration which binds the two others together.
  • Instance orchestration which will create the actual compute instance.
  • Storage orchestration which will create the required storage volume. 
When working with orchestrations you have to remember that an orchestration is only the instruction set for creating the actual end result. This means that when you stop an orchestration, the end result is not only stopped, it is also removed. When you start it again, the end result will be created again from scratch.

In the below screenshot you can see 3 orchestrations which were used to create a compute instance named TESTBOX08 and the associated storage which was needed.


If we open the details of the master orchestration we can see that it is actually a JSON file containing instructions on what to create. In essence the master is used to bundle the instance orchestration and the storage orchestration together and make them a single set of instructions.


As you can see in the above JSON used in the TESTBOX08_master orchestration, there is a relationship defined in the master between TESTBOX08_storage and TESTBOX08_instance. The relationship references an oplan named TESTBOX08_instance, meaning that TESTBOX08_instance is the actual oplan. An object plan, or oplan, is the primary building block of an orchestration.

As you can see above and in the below example, this is how a relationship within an oplan is defined.

{
 "oplan": "<name of oplan1>",
 "to_oplan": "<label of the oplan on which oplan1 depends>",
 "type": "depends"
}

  • oplan : the name of oplan1 
  • to_oplan : Label of an oplan on which oplan1 depends
  • type: Type of the relationship. It must be depends

For this plan that means the instance depends on the storage, which also means that when you execute the master orchestration the storage will be created first and the instance after that.
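
For the TESTBOX08 example, that relationship would schematically be filled in as shown below. The snippet simply builds the relationship as a Python dictionary and prints it as JSON; a real orchestration file contains more attributes, so treat this as a schematic illustration of the "depends" relation only.

import json

# Schematic illustration of the relationship from the TESTBOX08 example:
# the instance oplan depends on the storage oplan, so the storage is
# created first when the master orchestration is started.
relationship = {
    "oplan": "TESTBOX08_instance",
    "to_oplan": "TESTBOX08_storage",
    "type": "depends",
}
print(json.dumps(relationship, indent=1))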

Be careful when stopping the master orchestration
When selecting the “stop” command on the TESTBOX08_master orchestration you will get a warning which looks like this:

"Orchestration "TESTBOX08_master" will be stopped. Stopping an orchestration will destroy all objects that were created using the orchestration. If you created instances using this orchestration, those instances will be deleted. If you provisioned storage volumes using this orchestration, those storage volumes will be deleted and all data stored on them will be lost. However, objects created outside this orchestration and merely referenced in this orchestration won't be deleted. At any time, you can re-create objects defined in this orchestration by starting it. Are you sure you want to stop this orchestration?"


As you can see in the above screenshot, when you stop the master orchestration it takes some time before the associated orchestrations are stopped. In the screenshot below you can see how all three orchestrations are stopped.


As you can see in the above screenshot, the storage orchestration has also stopped, and remembering the warning, this means the attached storage is stopped and therefore removed. If you check the storage tab you will see that the storage volume is indeed no longer available.

Now, if I select “start” again on the master orchestration it will start executing the storage and the instance orchestration again. It will first start the storage, as this is a prerequisite for the instance. The issue with this way of working, and the risk, is that you have to be aware that your storage has really been removed and is created again from scratch.

Meaning, you will have a fresh environment again; everything you have ever done to the system is lost. That might very well be what you want… however, if your goal was to stop the instance for some period of time, start it again at a later moment and continue working on it, this is not the right direction. In that case, when you would like to “pause” your instance, you have to stop the instance orchestration only, which is described in more detail in this blogpost.

Tuesday, April 26, 2016

Oracle Hybrid Cloud

Recently I presented together with Marcel Giacomini from Oracle on Oracle public, private and hybrid cloud. The hybrid cloud is a direction I personally feel the market will move towards very quickly. Even though cloud companies would like to see enterprises adopting a full cloud model I think a majority of the large enterprises and companies will take the route of hybrid cloud first.

To see more on the capabilities around hybrid cloud from Oracle have a look at the deck we presented during Advantage You.


Thursday, May 28, 2015

Oracle building blocks for future enterprise services

As we observe the direction enterprises are heading with regard to their IT footprint, we can see a number of interesting trends. None of them are new; however, they are picking up more and more momentum and becoming the new standard within enterprise IT. If we take a look at the directions enterprises are moving in and at the demands from internal users in the form of business departments, we see the challenges they face.

The questions asked by the business sometimes go against the traditional way of working and doing things. To be able to implement them and satisfy the business, radical change is needed in some cases, not only in the way IT departments work but also in the way the entire IT landscape is architected and has traditionally been built.

To be able to move away from the traditional way of working, in most cases a combination of application and infrastructure modernization and rationalization is needed.

To read the full blogpost please visit the Capgemini.com Oracle blog.

Sunday, September 28, 2014

The future of the small cloud

When talking about cloud, Amazon, Azure and Oracle Cloud immediately come to mind for a lot of people. When talking about private cloud, the general idea is that this is a model valuable only for large customers running hundreds or thousands of environments, and one that requires a large investment in hardware, software, networking and human resources to deploy.

Even though the public cloud provides a lot of benefits and relieves companies of CAPEX costs, in some cases it is beneficial to create a private cloud. This is not only the case for large enterprises running thousands of services; it is also the case for small companies. Some of the reasons a private cloud can be more applicable than using a public cloud are, for example:

Legal requirements
Compliancy rules and regulations
Confidentiality of data and/or source code
Specific needs around control beyond the possibilities of public cloud
Specific needs around performance beyond the possibilities of public cloud
Specific architectural and technical requirements beyond the possibilities of public cloud

There are more specific reasons why a private cloud, or hybrid cloud, can be more beneficial than a public cloud for small companies, and these can be determined on a case-by-case basis. Capgemini provides roadmap architecture services to support customers in determining the best solution for a specific case, which can be public cloud, private cloud or a mix of both in the form of a hybrid cloud. This is next to more traditional solutions that are still very valid in many cases for customers.

One of the main misconceptions around private cloud is that it is considered to be valid only for large deployments and large enterprises. The general opinion is that there is a need for a high initial investment in hardware, software and knowledge. As stated, this is a misconception. By using both Oracle hardware and software there is an option to build a relatively low-cost private cloud which can be managed for a large part from a central graphical user interface in the form of Oracle Enterprise Manager.

A private cloud can be started by a simple deployment of two or more Sun X4-* servers and using Oracle VM as a hypervisor for the virtualization. This can be the starting point for a simple self-service enabled private cloud where departments and developers can provision systems in an infrastructure as a service manner or provision databases and middleware in the same fashion.


By using the above setup in combination with Oracle Enterprise Manager you can have a simple private cloud up and running in a matter of days. This enables your business or your local development teams to use a private in-house cloud, where they can use self-service portals to deploy new virtual machines, databases or applications in a matter of minutes, based upon templates provided by Oracle.

Sunday, July 13, 2014

Oracle will take three years to become a cloud company

Traditional software vendors who have been relying on a steady income of license revenue are forced to either change their standing business model radically or be overrun by new and upcoming companies. The change that cloud computing is bringing is compared by some industry analysts to the introduction of the Internet. The introduction and rapid growth of the internet started a completely new sub-industry within IT and created the IT bubble, which made numerous companies bankrupt when it burst.

As the established companies see the threat and the possibilities of cloud computing rising, they are trying to change direction to ensure survival. Oracle, being one of the biggest enterprise-oriented software vendors at this moment, is currently changing direction and stepping into cloud computing in full swing. It does this by extending the more traditional way of doing business, providing tools to create private cloud solutions for customers, and also by becoming a cloud vendor itself in the form of IaaS, SaaS, DBaaS and other forms of cloud computing.

According to a recent article from Investor's Business Daily, the transition for Oracle will take around three years to complete. According to Susan Anthony, an analyst for Mirabaud Securities, it will take around five years until cloud-based solutions contribute significantly more than the current license sales model:

"As the shift takes place, the software vendors' new license revenues will ... be replaced to some extent by the cloud-subscription model, which within three years will match the revenues that would have been generated by the equivalent perpetual license and, over five years, contribute significantly more"

The key to success for Oracle and for other companies will be to attract differently minded people than they currently have. The traditional way of thinking is so deeply embedded in these companies that a more cloud-minded generation will be needed to help turn the cloud transformation of traditional companies into a success. Michael Turits, an analyst for Raymond James & Associates, states the following on this critical success factor:

"It takes a lot to turn the battleship and transition a legacy (software) company into a cloud company, We believe they are hiring people to focus on cloud sales and that the incentive structure is being altered to speed the transition."

Analysts are united in the belief that this is a transition Oracle needs in order to survive, but also that, in the short term, it will hurt the company's revenue stream and thereby negatively influence the stock price for the upcoming years. Rick Sherlund, a Nomura Securities analyst, wrote in a June 25 research note:

"Oracle, like other traditional, on-premise software vendors, will be financially disadvantaged over the short term as its upfront on-premise license revenues are cannibalized by the recurring cloud-based revenues, therefore, we model expected license revenues to be flat to down for the next two years (during) the transition."

We can already see the transition taking place: on June 25, 2014, Mark Hurd presented the Oracle cloud strategy for the upcoming years, covering not only the expansion in global datacenters for hosting the new business model but also the growth predictions for the upcoming years. Looking at the growth in datacenters, you can see that Oracle is serious about the cloud strategy and transformation.


The full presentation deck can be found embedded below:


Monday, June 30, 2014

Puppet and Oracle Enterprise Manager

Enterprises have been using virtualization for years as part of their datacenter strategy. Recently, virtualization solutions have been turning into private cloud solutions, which enables business users even more to quickly request new systems and make full use of the benefits of cloud within the confines of their own datacenter. A number of vendors provide both the software and the hardware to kickstart the deployment of a private cloud.

Oracle provides an engineered system in the form of the Oracle Virtual Compute Appliance, a pre-installed combination of hardware and software which enables customers to get up and running in days instead of months. However, a similar solution can also be created “manually”: all software components are available separately from the OVCA. Central to the private cloud strategy from Oracle is Oracle Enterprise Manager 12c in combination with Oracle VM and Oracle Linux.

In the below diagram you can see a typical deployment of a private cloud solution based upon Oracle software.


As you can see in the above diagram, Oracle Enterprise Manager plays a vital role in the Oracle private cloud architecture. Oracle positions Oracle Enterprise Manager as the central monitoring and provisioning tooling for both the infrastructure components and the application and database components. Next to this, Oracle Enterprise Manager is used for patching operating system components as well as application and database components. In general, Oracle positions Oracle Enterprise Manager as the central solution for your entire private cloud. Oracle Enterprise Manager ties in with Oracle VM Manager and enables customers to request and provision new virtual servers from an administrator role, but also through the cloud self-service portals where users can create (and destroy) their own virtual servers. Before you can do so, however, you have to ensure that your Oracle VM Manager is connected to Oracle Enterprise Manager and that Oracle VM itself is configured.

The initial steps to configure Oracle VM to be able to run virtual machines are outlined below and are commonly only needed once.


As you can observe, quite a number of steps are needed before you will be able to create your first virtual machine. Not included in this diagram are the efforts needed to set up Oracle Enterprise Manager, combine it with Oracle VM Manager, and activate the self-service portals that users can use to create a virtual machine without the need for an administrator.

In general, when you create or provision a new virtual machine via Oracle tooling you will be making use of a template: a pre-defined template which can contain one or more virtual machines and which can potentially contain a full application or database. For this you can make use of the Oracle VM template builder, or you can download pre-defined templates. The templates stored in your Oracle VM template repository can be used to provision a new virtual machine. A commonly used strategy is to find the right mix between things you put into your template and things you configure and install using a first-boot script, which runs the first time a new virtual machine starts. Even though you can do a lot in the first-boot script, this still potentially requires you to create and maintain a large set of templates which might differ substantially per application you would like to install or per environment it will be used in.

In a more ideal scenario you will not be developing a large set of templates; you will only maintain one (or a very limited set of) templates and use external tooling to ensure the correct actions are taken. Recently Oracle has changed some of its policies, and the Oracle Linux template for Oracle VM which you can download from the Oracle website is nothing more than a bare-minimum installation where almost every package you might take for granted is missing. This means there will not be an overhead of packages and services started that you do not want or need; the system will be fully yours to configure. This configuration can be done using first-boot scripting, which you would need to build and customize for a large part yourself, or you can use external tooling for this.

A good solution for this is to make use of Puppet. This means that the first-boot script only needs to install the Puppet agent on the newly created virtual machine. By making use of node classification, the Puppet master will be able to see what the intended use of this new machine is and install packages and configure the machine accordingly.
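
As a rough sketch of what such a minimal first-boot step could look like, the snippet below installs the Puppet agent and triggers a single agent run against a Puppet master, leaving all further configuration to node classification on the master. The package name, master hostname and the yum-based install are assumptions; in practice a first-boot script is typically a shell script baked into the template.

import subprocess

# Rough sketch of a minimal first-boot step: install the Puppet agent and
# run it once against the master. Package name, master hostname and the
# yum-based install are assumptions for illustration only.
PUPPET_MASTER = "puppetmaster.example.local"

def first_boot_configure():
    # Install the agent from the configured yum repositories.
    subprocess.run(["yum", "-y", "install", "puppet"], check=True)
    # Run the agent once; node classification on the master decides which
    # packages and configuration this new machine receives.
    subprocess.run(["puppet", "agent", "--onetime", "--no-daemonize",
                    "--server", PUPPET_MASTER], check=True)

if __name__ == "__main__":
    first_boot_configure()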


Even though this is not part of the Oracle best practices, it is a solution for companies who have a large set of different types of virtual machines they need to provision automatically or semi-automatically. By implementing Puppet you will be able to keep the number of Oracle VM templates to a minimum and keep the maintenance on the templates extremely limited. All changes to the provisioning of a certain type of virtual machine can be made by changing Puppet scripts. An additional benefit is that this is a non-intrusive customization of the Oracle VM way of working. This way you can stay true to the Oracle best practices and add the Puppet best practices.

As a warning: on the Puppet website a number of scripts for Oracle databases and other Oracle software are available. Even though they tend to work, it is advised to be extremely cautious about using them, as they might harm your application and database software installation. It is good to look at their inner workings before applying them in your production cloud. However, when tested and approved as working for your needs, they might help you speed up deployments.

Saturday, June 07, 2014

Oracle Cloud Periodic Table

Cloud computing, and cloud in general, is a well-discussed topic which defines a new era of computing and how we think about computing in that era. Defining the cloud is hard and very much depends on your point of view. As many vendors have tried to describe what cloud computing is, you might find that they all have a different explanation based on their point of view, which makes creating a single description of cloud computing hard. When you are looking for a pure definition of cloud computing, the best source to turn to is NIST (the National Institute of Standards and Technology), which has given a definition of cloud computing that might be one of the best ways of stating it.

The NIST definition lists five essential characteristics of cloud computing: on-demand self-service, broad network access, resource pooling, rapid elasticity or expansion, and measured service. It also lists three "service models" (software, platform and infrastructure), and four "deployment models" (private, community, public and hybrid) that together categorize ways to deliver cloud services. The definition is intended to serve as a means for broad comparisons of cloud services and deployment strategies, and to provide a baseline for discussion from what is cloud computing to how to best use cloud computing.

To help customers understand cloud and cloud computing better, and to show that cloud computing is not a single solution but consists of many solutions which can be combined to form other solutions, Oracle has released a short video that creates a mindset using an analogy with the Periodic Table of Elements, called the Oracle Cloud Periodic Table.


This video shows the vision of Oracle on cloud computing, or at least a part of that vision, and creates a mindset for understanding that your specific cloud solution will most likely be a combination of a number of modules offered from within a cloud platform. Oracle is not the only one making use of this model; it is a growing trend in hybrid clouds and is largely based upon open standards and the as-a-Service way of thinking.


Saturday, April 19, 2014

Oracle Database Backup Service explained

Oracle databases are commonly used for mission-critical systems; in many cases databases are configured in a high-availability setup spanning two or more datacenters. Even though a dual or triple datacenter setup protects you against a number of risks, for example a fire in one of the datacenters, it does not excuse you from implementing a proper backup and recovery strategy. In cases where your data is corrupted, or you need to consult a backup for any other reason, you will most likely rely on Oracle RMAN. RMAN is the default tool for backup and recovery and ships with the Oracle database.

The below diagram shows a proper way of conducting backups. In this case all the data in databases A and B is written to the tape library in the other datacenter, and databases C and D write their data in the opposite direction. This ensures that your data is always in two locations. If for some reason datacenter 1 should be considered a total loss, you can still recover your data from the other datacenter. For mission-critical systems you will most likely also have a standby database in the other datacenter; however, this is not included in this diagram.

Even though this is considered a best practice, for some companies it is a costly implementation. Especially smaller companies do not want to invest in a dual, or even triple, datacenter architecture. For this reason you commonly see that the data is written to tape in the same datacenter where the database is hosted and that a person collects the tapes on a daily basis. Or, in some worst-case scenarios, the tapes simply stay in the same datacenter. This means that in case of a fire the entire data collection of a company can be considered lost.

Oracle has recently introduced a solution for this issue by adding a cloud backup service to its cloud services portfolio. The Oracle database backup cloud provides the option to keep using your standard RMAN tooling; however, instead of talking to a local tape library, or one in another datacenter, you write your backup to the Oracle cloud. This cloud service, named the Oracle Database Backup Service, requires you to install the Oracle Database Cloud Backup Module on your database server. You can use the installed module as an RMAN channel to do your backup. By using encryption and compression you can ensure that your backup is sent quickly and securely to the Oracle Database Backup Service.
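
Schematically, a backup through the module looks like a normal RMAN backup that allocates an SBT channel pointing at the module's library and its configuration file. The sketch below wraps that in a small Python helper; the library path, configuration file location and exact channel parameters are assumptions that depend on where and how the Oracle Database Cloud Backup Module was installed, so treat it as an outline rather than a ready-to-run script.

import subprocess

# Outline of an RMAN backup through an SBT channel that points at the cloud
# backup module. The library and configuration file paths are assumptions
# that depend on the local installation of the module.
RMAN_SCRIPT = """
run {
  allocate channel c1 device type sbt
    parms 'SBT_LIBRARY=/home/oracle/lib/libopc.so, ENV=(OPC_PFILE=/home/oracle/config/opc_config.ora)';
  backup as compressed backupset database;
  release channel c1;
}
"""

def backup_to_cloud():
    # Feed the script to RMAN connected to the local target database.
    subprocess.run(["rman", "target", "/"],
                   input=RMAN_SCRIPT, text=True, check=True)

if __name__ == "__main__":
    backup_to_cloud()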


The above diagram shows the flow used in case you back up to the Oracle Database Backup Service. This model works when you have, for example, only a single datacenter. However, it can also work as a strategic model when you have multiple datacenters, even if you have mixed this with cloud-based hosting.

The above diagram shows how you can use the Oracle Database Backup Service to do a cloud-to-cloud backup. If you, for example, host your database at Azure or Amazon and would like to back up your data with the same backup service provider all your other datacenters are using, or you want to have it at Oracle to ensure your data is not with one single company, you can use the same mechanism to perform the backup to the Oracle Database Backup Service.

Creating an account at Oracle and ordering backup space is easy and can be done completely online. As you can see from the screenshot below you can order per terabyte of backup space.


One thing you have to keep in mind, as with all cloud-based solutions, is that there are some legal considerations you need to review. When using the Oracle Database Backup Service you are moving your data away from your company and into the trust of another company. Oracle has provided numerous security options to ensure your data is safe; however, from a legal point of view you have to be sure you are allowed to move the data into the trust of Oracle. For most US-based companies this will not be an issue; for US-based government agencies and non-US companies it is something you might want to check with your legal department, just to be sure.

Friday, April 18, 2014

Enterprise cloud spending $235B

Companies are moving to the cloud. The trend is more and more to move business functions to cloud-based solutions. A couple of years ago companies were not including cloud as a main consideration when thinking about new or improved IT solutions; currently we see cloud-based solutions as a viable option on almost every shortlist. This shows in the forecasts and the history of spending on cloud technology and the cloud-based architectures on which companies deploy enterprise functionality.

Ryan Huang reports on ZDNet on the growth of cloud-based spending and the forecast for 2017. Below you can see the graph showing the predicted rise of cloud spending in the upcoming years.


This prediction shows that all companies currently investing in building cloud-based platforms are making a solid investment, as the trend is that cloud-based solutions, and the associated customer investment, will continue to grow, for all good reasons.

Saturday, December 28, 2013

Oracle Infrastructure as a service announcement

Oracle has already been securing its place in the cloud era for some time by providing a number of cloud-based solutions as well as software and hardware that enable companies to build private (or hybrid) clouds within their own datacenters. One area, however, that has not been touched that much by Oracle is the Infrastructure as a Service area.

If you take a close look at the Oracle cloud strategy that has been forming over the past couple of years, and if you have been looking closely at the products in both hardware and software, this should not come as a surprise to many. What you could see is that Oracle has been working on a lot of solutions that are all part of a cloud / IaaS foundation. In recent years it has become clear that Oracle uses the public market as a sort of testing ground, with the products it puts out in the market ending up in the Oracle cloud solutions. Another giveaway was the fact that Oracle started to embrace OpenStack, a clear sign that IaaS was the way to go in the minds of the Oracle executives.

In the recent second-quarter financials call from Oracle, given by Larry Ellison, the statement made was that "his company intends to be price competitive with established cloud infrastructure-as-a-service players like Amazon, Microsoft and Rackspace". Larry also stated: "We intend to compete aggressively in the commodity infrastructure as a service marketplace". According to Oracle this should start unfolding in the first half of 2014.

This news, even though one could have expected it, means that other IaaS competitors will have to start watching and battling Oracle in a field in which Oracle previously was not a player.

Monday, November 25, 2013

Deploying Ceph and OpenStack with Juju

A lot of people working on building private or public clouds will be familiar with OpenStack and will use it, or at least have adopted some of the technological thinking behind OpenStack. Less known might be Juju and Ceph. Akash Chandrashekar works as a solution engineer at Canonical, the organization behind Ubuntu Linux, and he gives a very clear explanation of how you can build clouds by using OpenStack, Juju and Ceph.

To give some high-level background on the three components before watching the video: OpenStack, Juju and Ceph.

OpenStack: OpenStack is a cloud-computing project that aims to provide infrastructure as a service (IaaS). It is free and open-source software released under the terms of the Apache License. The project is managed by the OpenStack Foundation, a non-profit corporate entity established in September 2012 to promote OpenStack software and its community.

Juju: Juju (formerly Ensemble) is a service orchestration management tool developed by Canonical Ltd. It is an open-source project hosted on Launchpad and released under the Affero General Public License (AGPL). Juju concentrates on the notion of a service, abstracting the notion of a machine or server, and defines relations between those services that are automatically updated when two linked services observe a notable modification. This allows services to be scaled up and down very easily through a single command. For example, a web service described as a Juju charm that has an established relation with a load balancer can be scaled horizontally with a single juju "add-unit" call, without having to worry about re-configuring the load balancer to declare the new instances: the charm's event-based relations will take care of that.

Ceph: Ceph is a free software storage platform designed to present object, block, and file storage from a single distributed computer cluster. Ceph's main goals are to be completely distributed without a single point of failure, scalable to the exabyte level, and freely available. The data is replicated, making it fault tolerant.



In case you want to view the slides of the presentation on deploying OpenStack and Ceph with Juju at your own pace and comfort, please find the slides below as they are shared on SlideShare.




Thursday, October 03, 2013

Oracle Software Defined Datacenter enabling strategy

When using a cloud service, fewer and fewer people think about how things work "under the cloud". The cloud is taken as a given, without thinking about how a cloud vendor ensures everything is working and is capable of providing the scalability and flexibility that come with a true cloud solution. In many cases there is also no need to think about this, unless you are the one building the cloud solution and/or responsible for maintaining it.

As already stated by Pat Gelsinger, the VMware CEO, we are entering the third wave of IT, the mobile-cloud wave. This third wave is making life much simpler for a number of people: when you need an environment, you can simply request one from your infrastructure-as-a-service provider and most things will be arranged for you. When you, for example, request a new instance at Amazon Web Services, you can simply click your network components together and magically everything works.

The more complicated factor coming into play, which was not (that much) the case in the client-server era, is that more and more components need to be virtualised and should be controllable from a central software-based portal. This is where SDDC comes into play. SDDC stands for Software-Defined Data Center and is an architecture approach in which the entire IT infrastructure extends on the virtualisation concept: all infrastructure components are delivered as if they were software components. In general, the three main components of an SDDC architecture are:

Compute virtualisation, which is a software implementation of a computer.

Network and security virtualization. Network virtualization, sometimes referred to as software-defined networking, is the process of merging hardware and software resources and networking functionality into a software-based virtual network. The network and security virtualization layer untethers the software-defined data center from the underlying physical network and firewall architecture.

Software-defined storage, or storage virtualization, enables data center administrators to manage multiple storage types and brands from a single software interface. High availability, which is unbundled from the actual storage hardware, allows for the addition of any storage arrays as needed.

When we take a look at the Oracle portfolio we see a tendency towards software-defined datacenter solutions. As Oracle is adopting cloud thinking and is not only providing a cloud platform but also providing the building blocks for customers to build their own (internal) clouds, it is only logical that we find SDDC-supporting solutions.

Oracle Compute virtualisation;
It is without any doubt that Oracle is working on a number of virtualisation technologies, of which Oracle VM is the most noteworthy and most used. Next to this Oracle is working on a Solaris containers approach; however, for the x86 platforms the common standard is becoming Oracle VM, which is based on the Xen hypervisor.

Software defined networking;
In this field Oracle is taking some great steps. Oracle SDN (Software Defined Network) was launched some time ago. Oracle SDN boosts application performance and management flexibility by dynamically connecting virtual machines (VMs) and servers to any resource in your data center fabric. Oracle SDN redefines server connectivity by employing the basic concepts of virtualisation. Unlike legacy port- and switch-based networking, which defines connectivity via complex LAN configurations, Oracle SDN defines connectivity entirely in software, using a supremely elegant resource: the private virtual interconnect. A private virtual interconnect is a software-defined link between two resources. It enables you to connect any virtual machine or server to any other resource, including virtual machines, virtual appliances, bare metal servers, networks, and storage devices, anywhere in the data center.

The SDN solution from Oracle provides a great set of management and monitoring tools, which enables administrators and architects to manage the virtual network in a more efficient way and also tie this into a flexible cloud solution that is architected from the ground up and is fully automated.

Software defined storage;
Within the field of software-defined storage Oracle is, at this moment, not providing a clear path to the future. However, when searching the Oracle website you can find some reports that hint at or talk about the subject.

There is an IDC report on the Oracle website where IDC poses the following question without answering it directly: "Will Oracle leverage ZFS or ZFS/OpenStack for a software-only, software-defined storage solution for hyperscale cloud builders? Given that Oracle does not have a material storage hardware business to protect and it has an excellent software stack with ZFS (and more enhancements coming), Oracle could really become a strategic supplier to next-generation cloud builders."

In my opinion this is a bit off the mark, as Oracle has a storage hardware department where they build and sell storage appliances. It is true that this is not the main focus of the company; however, it can become a more and more valuable part of the company in the time to come.

Next to this there is a report from Dragon Slayer Consulting, which can be found on the Oracle website, that also talks a bit about software-defined storage and gives some hints on how ZFS appliances can be used in combination with Oracle Enterprise Manager in a software-defined storage solution.

Even though there are a lot of options to "trick" components into acting like software-defined storage solutions, and a lot can be done by using Oracle Enterprise Manager, there is no really good definition or clear path from Oracle on what role they will play with regard to software-defined storage in the future.

Oracle Enterprise Manager;

We do see a trend of Oracle integrating the monitoring and management options into Oracle Enterprise Manager and making this the central location for all management tasks. Oracle has also announced that it will integrate with OpenStack and will provide OpenStack Swift APIs. By extending the Oracle Enterprise Manager capabilities with OpenStack APIs and making more and more components "software defined", Oracle is building a portfolio that can form the basis for a full Oracle Red Stack private cloud solution, not only for "small" enterprises but also for large cloud vendors who are willing to provide large-scale cloud solutions to large numbers of (internal or external) customers.

Saturday, September 21, 2013

2013 state of cloud computing adoption

It is beyond any doubt that the cloud way of doing things is picking up; more and more companies tend to look at cloud solutions in one way or another. Some might even use cloud-based solutions without knowing it, and in some cases enterprises are not aware that departments are using cloud-based solutions on their own account. This last part is, or at least should be, a concern for the security department. It is better to channel the use of cloud solutions than to stay unaware of it, or to block it only to find that users find other ways of still doing it in a slightly different way.

What is interesting to see is how cloud is used, by what kind of companies, and how they adopt the cloud way of thinking. Cloud is a given and it is not going away anytime soon. Now we can see how this is unfolding and being adopted in the daily way of doing things within companies.

A good source for such information is the survey from RightScale: "RightScale surveyed technical professionals across a broad cross-section of organisations about their adoption of cloud computing." One thing we have to keep in mind when reading those figures is that the people who were sent this survey are (A) technically knowledgeable and (B) most likely have some form of connection with RightScale, and so might be more cloud-oriented than the average business user. Keeping this in mind, the report still provides a good view of the current adoption level.

One of the interesting things is that they show the adoption in four levels, comparing enterprises with small and medium businesses. For this they use the levels "POC/Experiment", "First Project", "Several Apps" and "Heavy Use". What you can see from this is that the SMB market is already making much heavier use of cloud than the enterprises, with 41% of SMBs in the heavy-use category against 17% in the enterprise section of the market.



The levels used for the adoption are somewhat comparable with the cloud adoption model shown below, represented as a pyramid. In this model we use five levels instead of four; RightScale does not take into account the virtualization layer, which is represented as level 1 in the below pyramid. For the rest you could see "POC/Experiment" as level 2 Cloud Experimentation, "First Project" as level 3 Cloud Foundations, "Several Apps" as level 4 Cloud Exploitation and "Heavy Use" as level 5 Hyper Cloud.

As we have seen from the above information, the SMB market is more willing to adopt cloud computing in a "heavy" way, and the majority of the SMB companies who use cloud computing do this in a level 5 Hyper Cloud way. What is interesting, however, is that based upon the RightScale 2013 report there is not much relative difference between the number of enterprises and SMB companies who use cloud: 75% of all companies make use of cloud computing, which breaks down into 77% of the enterprises and 73% of the small and medium businesses.

This means that the percentage of enterprises using the cloud is higher, but that they are less advanced in their level of adoption of the cloud computing platform. Small and medium businesses are lagging behind a little; however, when they start adopting the cloud they tend to go in much more aggressively and move very quickly to a higher rate of adoption.

One of the reasons for this could be that cloud computing is picking up at the moment and still needs to find its place. We are still in the "client-server" era of computing, and since enterprises tend to move more slowly and in general have more, and more complex, systems than small and medium businesses, it takes them longer to move to the "cloud and mobile" era. Next to this, small and medium businesses tend to have a less complex chain of command, and decisions can be made more quickly and without the sometimes complex bureaucracy of large enterprises.

This means small and medium businesses are more agile and can move into new technologies more quickly than enterprises. An additional complexity for adoption is that the amount of money involved for an enterprise is in general much higher than that for an SMB company, due to the scale and complexity of the applications used.

What we do see, however, is that large enterprises are very much willing to adopt cloud computing, but it takes them longer to implement. The market for enterprise cloud computing is opening up and should be a focus point for all companies that sell cloud solutions in one form or another.