Enterprises are starting to adopt the hybrid cloud model. In some cases this means that applications normally hosted on-premise are moved into the cloud in full; in other cases only parts of an application are moved to the cloud, while some of the systems the application depends on stay on premise.
A common question when discussing moving parts of an IT estate to the cloud is how to bridge the gap between the systems in the cloud and the systems that remain on premise. The diagram below shows a common deployment in enterprises where one application depends on the database of another application.
Figure 1 - shared database deployment
In the deployment shown in figure 1 the following applies:
- Application 1 makes use of database instance A
- Application 2 makes use of database instance B and of database instance A
When deployed in a single datacenter the connections between the applications and the databases are, in general, all equal. No significant delay or latency is to be expected, and the user experience is the same for the users of application 1 and application 2.
Moving to the cloud
If a requirement is stated that application 2 is moved to the cloud while application 1 (for a specific reason) stays on premise, including its directly associated database, the deployment model starts to look as shown in figure 2.
Figure 2 - Crossing the line
In this case the worry of many companies is connection A shown in the figure above: they worry about potential latency over connection A, and about what might happen to the availability of application 2 if the connection becomes unavailable for a moment.
A possible solution is to make use of a caching mechanism, an approach often used to offload work from a database server and speed up application performance. A caching solution like this can also be used to bridge the gap between cloud-based applications and on-premise data stores. Do note the term data stores: this can also be something other than the Oracle database used in this example.
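To illustrate the caching idea, the sketch below shows the cache-aside pattern in Python: answer a read from the cache when possible and only go to the data store on a miss. It is a minimal sketch only; the in-process dictionary is a stand-in for a real caching service, and query_database_a() is a hypothetical placeholder for whatever call the application normally makes against database A.

```python
import time

# In-process stand-in for a cache, purely to illustrate the cache-aside pattern;
# a real deployment would use a dedicated caching service instead.
_cache = {}

def query_database_a(key):
    # Hypothetical placeholder for the call the application normally makes
    # against database A (for example an SQL query over connection A).
    raise NotImplementedError

def get_value(key, ttl_seconds=300):
    """Cache-aside read: answer from the cache when possible, otherwise query the data store."""
    entry = _cache.get(key)
    if entry is not None:
        value, stored_at = entry
        if time.time() - stored_at < ttl_seconds:
            return value                   # served from cache, no call to the data store

    value = query_database_a(key)          # cache miss or expired entry: hit the data store
    _cache[key] = (value, time.time())
    return value
```

The expiry time is the main tuning knob here: a longer TTL reduces the number of calls that have to reach the data store, at the cost of serving potentially older data.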
Using a cache as a bridge
A good open source solution for this is memcached, which is used by a large number of enterprises. Memcached is an in-memory key-value store for small chunks of arbitrary data (strings, objects), such as the results of database calls, API calls or page rendering. If your application is able to function based upon this principle, memcached is a very good solution to implement to mitigate the risk of a broken or limited connection between application 2 and database A. This would result in a deployment as shown in figure 3.
Figure 3 - Using memcached
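To make the deployment in figure 3 more concrete, the sketch below shows how application 2 could combine memcached with a fallback to a longer-lived cached copy when connection A is briefly unavailable. It is a minimal sketch in Python using the pymemcache client library; the host and port, the TTL values and the query_database_a() placeholder (the call that normally goes over connection A) are assumptions for illustration, not part of the original example.

```python
from pymemcache.client.base import Client

# Memcached runs close to application 2 in the cloud; host and port are assumptions.
cache = Client(("127.0.0.1", 11211))

def query_database_a(key):
    # Hypothetical placeholder for the real call over connection A to database A.
    raise NotImplementedError

def get_value(key, fresh_ttl=300):
    """Prefer a fresh cached copy, then database A, then a stale cached copy."""
    fresh = cache.get(key)
    if fresh is not None:
        return fresh.decode("utf-8")            # no round trip over connection A needed

    try:
        value = query_database_a(key)           # cache miss: go over connection A
    except Exception:
        # Connection A is unavailable for a moment: fall back to the long-lived
        # copy written on the last successful read, if there is one.
        stale = cache.get(key + ":stale")
        if stale is not None:
            return stale.decode("utf-8")
        raise                                   # nothing cached either: give up

    encoded = value.encode("utf-8")
    cache.set(key, encoded, expire=fresh_ttl)       # short-lived "fresh" copy
    cache.set(key + ":stale", encoded, expire=0)    # long-lived fallback copy (0 = never expires)
    return value
```

The second, non-expiring copy is only there to survive short outages of connection A; how much staleness is acceptable in that situation is an application-specific decision.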
Understanding memcached
To fully grasp the possibilities it is important to take a closer look at memcached and how it can be used. As stated, memcached is an open source, in-memory key-value store for small chunks of arbitrary data (strings, objects), such as the results of database calls, API calls or page rendering. You can run memcached on an Oracle Linux instance; this can be an instance in the Oracle public cloud as part of the compute service (as shown in this example), or an Oracle Linux instance deployed in a private cloud or a traditional bare-metal deployment.
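On Oracle Linux, memcached is typically installed from the distribution's package repositories and runs as a service listening on port 11211 by default; the exact package and service names can differ per release, so treat those as assumptions. From the application, connecting to such an instance and checking that it responds could look like the sketch below; the host name is a hypothetical example and the timeout values are assumptions chosen so that a slow or unreachable cache does not stall the application.

```python
from pymemcache.client.base import Client

# "memcached.example.internal" is a hypothetical host name for the Oracle Linux
# compute instance running memcached; 11211 is the default memcached port.
cache = Client(
    ("memcached.example.internal", 11211),
    connect_timeout=2,  # seconds to wait for the TCP connection
    timeout=2,          # seconds to wait for an individual get/set call
)

# Simple smoke test: write a value, read it back and print the server statistics.
cache.set("healthcheck", b"ok", expire=60)
print(cache.get("healthcheck"))  # b'ok'
print(cache.stats())             # uptime, hits, misses and other server counters
```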