Thursday, March 29, 2007

Dell and Linux

Dell could very well be shipping Linux desktop computers soon. On the Dell company blog, the most popular idea at the moment is shipping PCs with Linux.

For some time now it has been possible to order a Linux server from Dell, but it now looks like Dell is also betting on Linux in the consumer market. Some news agencies are already stating that Dell has decided to do so; in reality, however, the matter is still under investigation.

Still, it looks like they will be making a decision in favor of Linux.

Consistent Greenhouse Gas Effect

New calculations show that sensitivity of Earth's climate to changes in the greenhouse gas carbon dioxide (CO2) has been consistent for the last 420 million years, according to an article in Nature by geologists at Yale and Wesleyan Universities.

A popular predictor of future climate sensitivity is the change in global temperature produced by each doubling of CO2 in the atmosphere. This study confirms that in the Earth's past 420 million years, each doubling of atmospheric CO2 translates to an average global temperature increase of about 3° Celsius, or roughly 5.4° Fahrenheit.
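The doubling rule above is commonly written as dT = S * log2(C/C0). As a quick illustration of my own (not a calculation from the paper), with S = 3 °C per doubling, a quadrupling of CO2 is two doublings:

```shell
# dT = S * log2(C/C0); quadrupling (C/C0 = 4) is two doublings,
# so with S = 3 the expected warming is 2 * 3 = 6 degrees C.
awk 'BEGIN { S = 3.0; C = 4.0; C0 = 1.0
             printf "%.1f\n", S * log(C / C0) / log(2) }'   # prints 6.0
```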

According to the authors, since there has continuously been life on the planet over this time span, there must be an ongoing balance between CO2 entering and leaving the atmosphere from the rocks and waters at Earth's surface. Their simulations examined a wide span of possible relationships between atmospheric CO2 and temperature and the likelihood they could have occurred based on proxy data from geological samples.

Most estimates of climate sensitivity have been based on computer simulations of climate or records of climate change over the past few decades to thousands of years, when carbon dioxide concentrations and global temperatures were similar to or lower than today. Such estimates could underestimate the magnitude of large climate-change events.

To keep Earth's carbon cycle in balance, atmospheric CO2 has varied over geologic time. Carbon-cycle models balance chemical reactions that involve carbon, such as photosynthesis and the formation of limestone, on a global scale. To better predict future trends in global warming, these researchers compared estimates from long-term modeling of Earth's carbon cycle with the recent proxy measurements of CO2.

This study used 500 data points in the geological records as "proxy data" and evaluated them in the context of the CO2 cycling models of co-author Robert Berner, professor emeritus of geology and geophysics at Yale who pioneered models of the balance of CO2 in the Earth and Earth's atmosphere.

"Proxy data are indirect measurements of CO2 -- they are a measure of the effects of CO2," explained co-author Jeffrey Park, professor of geology and geophysics at Yale who created the computer simulations for the project. "While we cannot actually measure the CO2 that was in the atmosphere millions of years ago, we can measure the geologic record of its presence. For example, measurement of carbon isotopes in ancient ocean-plankton material reflects atmospheric CO2 concentrations."

Led by Dana L. Royer, assistant professor of Earth and Environmental Sciences at Wesleyan University, who did his graduate work in geology at Yale, the collaboration simulated 10,000 variations in the carbon-cycle processes such as the sensitivity of plant growth to extra CO2 in the atmosphere. They evaluated these variations for a range of atmospheric warming conditions, using the agreement with the geologic data to determine the most likely warming scenarios. The model-estimated atmospheric CO2 variations were tested against data from ancient rocks.

Other proxy measurements of soil, rock and fossils provided estimates of CO2 over the past 420 million years. Calculation of the climate sensitivity in this way did not require independent estimates of temperature. It incorporated information from times when the Earth was substantially warmer and colder than today, and reflects the sensitivity of the carbon-cycle balance over millions of years.

"Our results are consistent with estimates from shorter-term records, and indicate that climate sensitivity was almost certainly greater than 1.5, but less than 5.5 degrees Celsius over this period," said Park. "At those extremes of CO2 sensitivity, [1.5°C or 5.5°C] the carbon-cycle would have been in a 'perfect storm' condition."

Citation: Nature (March 29, 2007)

Wednesday, March 28, 2007

Rank Malicious Web Sites

Have you ever wondered how fraudulent or malicious websites can rank highly on search engines like Google or Yahoo?

Queensland University of Technology IT researcher Professor Audun Josang said a website's ranking was determined by the number of people who visited the site - the more hits the higher the ranking.

But this system is fraught with danger and can easily be manipulated to direct people to unreliable, low-quality and fraudulent sites, according to Professor Josang.

"Just because a website ranks highly on a search engine doesn't mean it's a good website, in fact highly ranked websites can be malicious websites," he said.

To safeguard against this type of threat, Professor Josang believes the answer is to develop a new type of internet security system based on "reputation" where a community of users can rank the quality of a website.

He said this could then be used to warn others from visiting that site.

"For example most people are able to recognise a website that tries to trick them into giving confidential information (a phishing attack) when they see it," he said.

"With this system, aware users can rate such websites as malicious and as a result a phishing site will be quickly and universally recognised as dangerous, warning unsuspecting users against visiting that site."

Professor Josang said using this "social control" approach could provide protection against this type of online threat, by preventing attacks before they occurred.
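As a toy illustration of such a reputation scheme (the site names and the 1-to-5 rating scale are invented here, not taken from Professor Josang's project): collect the community's ratings per site, average them, and flag anything that falls below a threshold.

```shell
# Toy reputation tally: one "site rating" pair per line, averaged per
# site; an average below 2.5 marks the site as suspect.
cat > /tmp/ratings.txt <<'EOF'
example-bank-login.com 1
example-bank-login.com 1
example-bank-login.com 2
goodshop.example 5
goodshop.example 4
EOF
awk '{ sum[$1] += $2; n[$1]++ }
     END { for (s in sum) {
               avg = sum[s] / n[s]
               verdict = (avg < 2.5) ? "FLAGGED as suspect" : "ok"
               printf "%s avg=%.1f %s\n", s, avg, verdict } }' /tmp/ratings.txt
```

A real system would of course also have to weigh who is rating, to keep attackers from gaming the reputation scores themselves.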

"Social control methods, also known as soft security, adhere to common ethical norms by parties in a community.

"They make it possible to identify and sanction those participants who breach the norms and to recognise and reward members who adhere to them."

Professor Josang said in today's technologically advanced world of business, high ranking of a company's web page was a crucial factor for its success.

"This is why the control of search engines is so important and why it can be financially worthwhile for businesses to manipulate the system."

The central idea of Professor Josang's research is to take search engines one step further by using them to make the internet a safer place to interact and transact.

"This project is about a new type of internet security that can be supported by search engines. There is a deception waiting for you around every corner on the internet and the technology we develop will protect people from that.

"I think in the future reputation systems, integrated into search engines, can be used to weed out such websites by giving them a low ranking and thereby making them invisible to unsuspecting users."

Tuesday, March 27, 2007

Playstation 3 Computing Cluster

The Sony Playstation 3, Xbox and Nintendo Wii have captivated a generation of computer gamers with bold graphics and rapid-fire animation. But these high-tech toys can do a lot more than just play games. At North Carolina State University, Dr. Frank Mueller imagined using the power of the new PS3 to create a high-powered computing environment for a fraction of the cost of the supercomputers on the market.

Mueller, an associate professor of computer science, has built a supercomputing cluster capable of both high-performance computing and running the latest in computer gaming. His cluster of eight PS3 machines – the first such academic cluster in the world – packs the power of a small supercomputer, but at a total cost of about $5,000, it costs less than some desktop computers that have only a fraction of the computing power.

“Clusters are not new to the computing world,” Mueller says. “Google, the stock market, automotive design companies and scientists use clusters, but this is the first academic computing cluster built from Playstation 3s.

“Scientific computing is just number crunching, which the PS3s are very good at given the Cell processor and deploying them in a cluster,” Mueller says. “Right now one limitation is the 256 megabyte RAM memory constraint, but it might be possible to retrofit more RAM. We just haven’t cracked the case and explored that option yet.” Another problem lies in limited speed for double-precision calculations required by scientific applications, but announcements for the next-generation Cell processor address this issue.

“In the computing world there is a list of the top 500 fastest computers,” Mueller says. Currently the fastest is BlueGene/L, a supercomputer with more than 130,000 processors at Lawrence Livermore National Laboratory. The PS3 cluster at NC State does not break into the top 500, but Mueller estimates that with approximately 10,000 PS3 machines anyone could create the fastest computer in the world – albeit with limited single-precision capabilities and networking constraints.

The PS3 allows the Linux operating system to be installed, and IBM designed the programming environment for the Cell processor (including eight vectorization units), which combines tremendous computing power within a single PS3. According to Mueller, each PS3 unit contains six operational special-purpose cores for number crunching and one general-purpose core that is two-way multithreaded in his configuration, so the eight clustered machines have 64 logical processors, providing plenty of number-crunching ability in addition to running the latest games.
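That processor count is simple arithmetic: under Linux each PS3 exposes six usable SPEs plus one dual-threaded PPE, and there are eight machines in the cluster.

```shell
# Per PS3 under Linux: 6 usable special-purpose cores (SPEs) plus one
# general-purpose core (PPE) with 2 hardware threads = 8 logical CPUs.
machines=8
spes=6
ppe_threads=2
echo $(( machines * (spes + ppe_threads) ))   # prints 64
```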

“Jan. 3 is the ‘birthdate’ of this cluster,” Mueller says. “Of course, here at NC State we will use it for educational purposes and for research. We are working with scientists to determine the needs and how our cluster can be used to their benefit, and our computer science faculty is already using the cluster to teach classes in operating systems, with parallel systems, compilers and gaming likely to follow.”

Note: This story is based on a news release issued by North Carolina State University.

Monday, March 19, 2007

su : could not open session

[root@termtest etc]# su nagios
could not open session
[root@termtest etc]#

Problem: unable to su to a different account. When you log in as root and try to switch to a different account using the su command, you get the error “could not open session”.

Today this problem occurred on one of the Red Hat Linux servers after restoring the /etc/passwd and /etc/shadow files from backup. The problem is that the permissions on the /etc/passwd file were not restored correctly.

If you experience this problem and check the file, you will most likely see something similar to this example:

[root@termtest etc]# ls -l passwd
-rw------- 1 root root 1871 Mar 6 16:35 passwd
[root@termtest etc]#

The correct permissions on the file should be:
-rw-r--r-- 1 root root 1871 Mar 6 16:35 passwd

To correct this you should invoke the chmod command to change the permissions on the file.

[root@termtest etc]# chmod 644 passwd

This will solve the problem and enable you to su to another user.
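The check-and-fix above can be scripted. This sketch demonstrates the idea on a copy of the file in /tmp, so the live /etc/passwd is never touched; the chmod 600 simulates the bad restore:

```shell
# Demonstrate checking and repairing passwd permissions on a copy.
cp /etc/passwd /tmp/passwd.restored
chmod 600 /tmp/passwd.restored            # simulate the bad restore
mode=$(stat -c %a /tmp/passwd.restored)   # GNU stat; BSD uses stat -f %Lp
if [ "$mode" != "644" ]; then
    echo "fixing mode $mode -> 644"
    chmod 644 /tmp/passwd.restored
fi
stat -c %a /tmp/passwd.restored           # prints 644
```

On the real system the same check against /etc/passwd, followed by chmod 644, restores su.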

Sunday, March 11, 2007

MySQL naming convention

Currently I am working on a private project with some other people. We are developing a new web-based system that will use a MySQL database with a PHP interface. Because several people are working on the project, the person responsible for the data model used Microsoft Access to create the basic database model. I installed a Windows development server running MySQL in combination with PHP and an Apache webserver. I had intended to run it on a Debian server, but found that the best support for migrating from Access to MySQL was on the Windows platform, using the MySQL Migration Toolkit.

After migrating from Access to MySQL I developed a lot of queries and functions. When I then moved from MySQL on Windows to MySQL on a Linux server at the datacenter, I suddenly found almost every query failing. The reason is that, because of the way MySQL is built, object names are by default NOT case-sensitive on Windows, while on a Linux/UNIX platform they ARE case-sensitive.

In MySQL, databases and tables correspond to directories and files within those directories. Consequently, the case-sensitivity of the underlying operating system determines the case-sensitivity of database and table names. This means database and table names are case-insensitive in Windows, and case-sensitive in most varieties of Unix. One prominent exception here is Mac OS X, when the default HFS+ file system is being used.
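Because a table is stored in files named after it, the behaviour is easy to demonstrate with nothing but the filesystem (the table name and the .frm stand-in file below are only illustrative; no MySQL is involved):

```shell
# A table named TBL_USERS lives on disk in files named after it.
mkdir -p /tmp/casedemo && cd /tmp/casedemo
touch TBL_USERS.frm
ls tbl_users.frm 2>/dev/null || echo "no such file: tbl_users.frm"
# On Linux (case-sensitive) the lowercase lookup fails, so a query
# written as "select * from tbl_users" breaks; on Windows or default
# HFS+ (case-insensitive) it would still succeed.
```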

This means that if you are working on a MySQL project, remember that you will need a strong naming convention for all the database objects you create during the project. You may be convinced it will always run on a Windows server, but that can change during the course of the project, and then you will have a hell of a time renaming all the objects in the database and in all the functions, queries and scripts.

Basically you can create your own naming convention for your database objects: use lowercase, use uppercase, or use them in combination, as long as you know what the convention is and all the members of the team are aware of it and keep to it. My personal flavor is to use only uppercase in the naming of database objects, and lowercase for the column names.

Another rule that can make your life a lot easier is to always start object names with a type prefix:
  • TBL_{name of the table}
  • V_{name of the view}
  • SEQ_{name of the sequence}
  • etc etc etc

Saturday, March 10, 2007

Nagios plugins for Oracle.

Nagios is a host and service monitor designed to inform you of network problems before your clients, end-users or managers do. It has been designed to run under the Linux operating system, but works fine under most *NIX variants as well. The monitoring daemon runs intermittent checks on hosts and services you specify using external "plugins" which return status information to Nagios. When problems are encountered, the daemon can send notifications out to administrative contacts in a variety of different ways (email, instant message, SMS, etc.). Current status information, historical logs, and reports can all be accessed via a web browser.

It is the plugin option that makes Nagios so powerful. Not everyone is willing or capable of writing a Nagios clone for themselves, but most Linux/UNIX administrators and developers will be able to write a plugin. Another strength of Nagios is that you can write a plugin in any language you like, as long as it can write its status to stdout. This means that the person writing the plugin can do so using C/C++, Java, Bash, Perl or any language of their choice.
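What a plugin must deliver is small: one status line on stdout and one of the standard Nagios exit codes (0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN). A minimal sketch in Bash — the load source (/proc/loadavg, Linux-specific) and the threshold are example values of my own, not from any stock plugin:

```shell
# Minimal Nagios-style plugin: status line on stdout, standard exit code.
check_load() {
    load=$(awk '{ print $1 }' /proc/loadavg)   # 1-minute load average
    if awk -v l="$load" 'BEGIN { exit !(l < 100) }'; then
        echo "OK - load average is $load"
        return 0          # Nagios: 0 = OK
    else
        echo "CRITICAL - load average is $load"
        return 2          # Nagios: 2 = CRITICAL
    fi
}
check_load
```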

This is one of the main drivers of the success of Nagios and the adoption of the system in a wide range of companies. Almost every company serious about running Linux/UNIX servers has a Nagios server running, or should think about it. As my new job requires me to monitor a lot of Linux servers besides working on Oracle projects, I will have to learn all the ins and outs of the Nagios system. This will save me a lot of time and frustration, and in combination with iLO it will save me a lot of drives to the datacenter.

As I will always be an Oracle person, I have been searching Google for Nagios plugins that can be used to monitor Oracle, and found several. A good website I can advise you to take a look at is NagiosExchange, which offers a large number of plugins, including several Nagios Oracle plugins. Plugins to check whether the database is in archive mode, check the buffer cache, monitor tablespace usage, execute your own PL/SQL checking scripts, verify that you are able to write to a database instance, and a lot more can be found there.

If we take a look at the other types of plugins, you will find one for almost every system and piece of network equipment out there, and if not, you will find that it is not that hard to write your own custom plugin. As an example you can take a look at the plugins in the nagios-plugins-1.4.tar.gz file, which you can download from
If you take a look at the "contrib" directory you will find a lot of files you can use as an example or use out-of-the-box. You can also use Google Code Search to take a look without downloading the file. As an example, here is the code used to check if a process is running on a server:


#!/bin/bash
#
# Program : Process running check plugin for Nagios
# License : GPL
# Copyright (c) 2002 Jerome Tytgat
# v 1.0 2002/09/18 15:28
#
# Description :
# This plugin checks if at least one instance of a process is running
#
# Usage :
# check_proc.sh -p process_name
#
# Example :
# To know if snort is running
# ./check_proc.sh -p snort
# > OK - process snort is running : PID=23441
#
# Tested on : Linux Red Hat 7.3

help_usage() {
	echo "Usage:"
	echo " $0 -p <process_name>"
	echo " $0 (-v | --version)"
	echo " $0 (-h | --help)"
}

help_version() {
	echo "check_proc.sh (nagios-plugins) 1.0"
	echo "The nagios plugins come with ABSOLUTELY NO WARRANTY. You may redistribute"
	echo "copies of the plugins under the terms of the GNU General Public License."
	echo "For more information about these matters, see the file named COPYING."
	echo "Copyright (c) 2002 Jerome Tytgat"
	echo "Greetings goes to Websurg which kindly let me took time to develop this"
	echo " Manu Feig and Jacques Kern who were my beta testers, thanks to them !"
}

verify_dep() {
	needed="bash cut egrep expr grep let ps sed sort tail test tr wc"
	for i in $needed
	do
		if ! type "$i" > /dev/null 2>&1
		then
			echo "I am missing an important component : $i"
			echo "Cannot continue, sorry, try to find the missing one..."
			exit 3
		fi
	done
}

verify_dep

if [ "$1" = "-h" -o "$1" = "--help" ]
then
	help_usage
	echo ""
	echo "This plugin will check if a process is running."
	echo ""
	echo "Required Arguments:"
	echo " -p, --process STRING"
	echo "    process name we want to verify"
	echo ""
	exit 3
fi

if [ "$1" = "-v" -o "$1" = "--version" ]
then
	help_version
	exit 3
fi

if [ `echo $@ | tr "=" " " | wc -w` -lt 2 ]
then
	echo "Bad arguments number (need two)!"
	help_usage
	exit 3
fi

# Test of the command line arguments
while test $# -gt 0
do
	case "$1" in
	-p|--process)
		if [ -n "$process_name" ]
		then
			echo "Only one --process argument is useful..."
			exit 3
		fi
		shift
		process_name="`echo $1 | tr \",\" \"|\"`"
		;;
	*)
		echo "Unknown argument $1"
		help_usage
		exit 3
		;;
	esac
	shift
done

# ps line construction set...
pid_list=""
for i in `ps ho pid -C $process_name`
do
	pid_list="$pid_list $i"
done

crit=0
if [ -z "$pid_list" ]
then
	crit=1
fi

# Finally inform Nagios of what we found...
if [ $crit -eq 1 ]
then
	echo "CRITICAL - process $process_name is not running !"
	exit 2
else
	echo "OK - process $process_name is running : PID=$pid_list"
	exit 0
fi

# Hey what are we doing here ???
exit 3

If you want some more basic insight into how Nagios works, there is a very nice introduction guide written by Mark Duling, which you can find at

Friday, March 09, 2007

Mercury Messenger V1.8 MSN client

For a long time I tried not to use MSN, but now I have failed. At my new job most of the communication with people working at another location (think a customer location) is done via MSN. At my previous job we used an IBM corporate tool, but my new company is not big enough to run its own chat server just for those cases where someone is at a remote location, so we use MSN. Meaning: I have to use MSN. My big problem was that it is a Microsoft service and I did not like using it. Why not use IRC (Internet Relay Chat) instead of something like MSN? But more and more people are using it, and because of this I have now started using it too.

Because I wanted to run it on my Mac I needed a Mac client for MSN. I tried several MSN clients for the Mac and finally found Mercury Messenger V1.8 the most suitable for my needs. Mercury Messenger was formerly known as dMSN. Mercury is a Java-based MSN client and it works like a charm, in my opinion. I just installed aMSN on my girlfriend's Linux laptop, so, including the others I tested and discarded, I have tried quite a few non-Microsoft MSN clients, and I have to say they all work better than the client provided by Microsoft.

So if you need to use MSN, download a non-Windows client and start having fun.

Monday, March 05, 2007

Make a screenshot using a Mac.

Just recently, last week, I purchased a new laptop. I was in need of a smaller and lighter laptop, and a nice side effect was that I could give my old laptop running Ubuntu Linux to my girlfriend to use during her new study. After some searching I found myself in favor of a MacBook: a very nicely designed laptop with a great-looking operating system, and a Linux/UNIX-like operating system under the nice graphics layer.

Even though I have already had a MacMini for some time, I still do not know in detail how the Mac operating system works. I used my MacMini mostly for surfing the web and some graphics design; for the rest it has mostly been a toy to play with, and it never became a serious computer. However, now that I have a MacBook I will have to find out all the secrets of the Mac, because this computer will be my traveling mate for the time to come, and most of the stuff I work on will be done from this laptop. I will have my work laptop, but that runs Windows, so for all the good stuff I will need to know how to operate a Mac.

Under the hood runs Darwin, an open-source UNIX-like operating system that I can manage and understand. The challenge is more in the graphical mode: all the new buttons and toys to play with... and it all looks really great. Now I have to start mastering it.

One of the questions that came to mind was: how do I make a simple screenshot? Looking at the keyboard I could not find a "print screen" button, and looking in the menus and the Applications folder I was unable to locate a screenshot application. However, David Battino from O'Reilly came to the rescue.

  • Command+Shift+3 : Capture entire screen and save as a file
  • Command+Control+Shift+3 : Capture entire screen and copy to the clipboard
  • Command+Shift+4 : Capture dragged area and save as a file
  • Command+Control+Shift+4 : Capture dragged area and copy to the clipboard
  • Command+Shift+4 then Space bar : Capture a window, menu, desktop icon, or the menu bar and save as a file
  • Command+Control+Shift+4 then Space bar : Capture a window, menu, desktop icon, or the menu bar and copy to the clipboard
For his complete guide to making good screenshots on a Mac, take a look at his article named Mac OS-X Screenshot Secrets.
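For scripted captures there is also the built-in screencapture command-line tool. A small wrapper (the "grab" name and the default filename are my own) that saves a muted capture to a file:

```shell
# Wrapper around macOS's screencapture; -x mutes the shutter sound.
grab() {
    out="${1:-$HOME/Desktop/screenshot-$(date +%Y%m%d-%H%M%S).png}"
    screencapture -x "$out" && echo "saved to $out"
}
# usage: grab                # timestamped file on the Desktop
#        grab /tmp/shot.png  # explicit path
```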