James Dixon’s Blog

James Dixon’s thoughts on commercial open source and open source business intelligence

Archive for November 2010

150,000 installations year-to-date for Pentaho

with 8 comments

Our most recent figures show that 156,000 copies of Pentaho software have been installed so far this year. These are not download numbers: they count installed software that has actually been used, including Pentaho servers and some Pentaho client tools. They do not represent only long-term installations, nor do they capture all of Pentaho’s software distributions or installations. Since these numbers are not absolute, they are most useful for relative comparisons.

An analysis by country of these numbers shows interesting results.

The Long Tail

This chart shows the number of new installations year-to-date for each country. Our data shows new Pentaho installations in 176 countries so far this year. That’s out of a total of 229 countries.

This is clearly a classic long tail. In fact, after the first 20 or 30 countries it is difficult to read values from the chart. This second chart uses a log scale. The line on this chart is almost perfectly linear, which on a log scale means the installation counts fall off roughly exponentially from country to country.

Over the same time period, Pentaho has customers in 46 countries. This is a larger geographic spread than that of most of the proprietary BI companies.

Since we are dealing with country-based data, here is the analysis I did using Google Geo Map, Pentaho Data Integration, and Pentaho BI Server.

New Pentaho Installs Jan-Oct 2010

This shows the geographic spread of the installations.

It is fairly obvious from the map above that the highest numbers of installations were in the USA, China, and Brazil, followed by India and parts of Europe. But this simplistic graphic does not take into account the economics or demographics of the countries. How does the number of installations relate to the size or economic power of each country?

New Installations Per $Billion GDP

If we look at the number of new installations of Pentaho software per billion dollars of GDP, we see a different picture. The GDP data is from the CIA World Factbook.

I capped ‘Installs Per Bn GDP’ at 10 to prevent outliers from skewing the color gradient.
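
That capping is just a clamp on the metric column before it feeds the color gradient. A minimal sketch in Python/pandas (hypothetical column names and values – the actual work was done with Pentaho tools, not pandas):

    import pandas as pd

    # Hypothetical country-level metric; tiny economies produce huge outliers.
    df = pd.DataFrame({
        "country": ["USA", "Brazil", "Monaco"],
        "installs_per_bn_gdp": [4.2, 8.9, 63.0],
    })

    # Clamp at 10 so one outlier doesn't compress the color scale
    # for every other country on the map.
    df["installs_per_bn_gdp"] = df["installs_per_bn_gdp"].clip(upper=10)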

Compared with the first map, the prominence of the USA and China is reduced, and the areas of high activity are shown to be South America, followed by Europe and parts of Asia. But analysis using GDP alone does not take into account things like exchange rates and the cost of living within a country – as a result there is probably a bias towards countries like the South American ones. So I went to find metrics that should remove the bias of these economic factors.

New Installations Per 100k Labor Force

If we look at the number of new Pentaho installs compared with the labor force of each country, we get a slightly different picture. The labor force data is from the CIA World Factbook.

I capped ‘Installs Per 100k Labor Force’ at 50 to prevent outliers from skewing the color gradient.

Compared with the first two maps, this one shows the countries of South America, Europe, and North America as roughly equal to each other. Australia and New Zealand are also comparable. Asia, Africa, and the Middle East are shown to be generally behind. What is odd about this graphic is that countries like India, generally considered to be significant open source consumers, do not appear among the leading countries. This is, I’m assuming, because a large percentage of the labor force is agricultural and, as such, less likely to be doing much BI.

New Installations Per 100k Internet Users

So instead of labor force, let’s look at new installations of Pentaho for every 100k internet users within a country.

I capped ‘Installs Per 100k Internet Users’ at 50 to prevent outliers from skewing the color gradient.

Here we see that South America is still prominent, along with southern Europe. The rest of Europe and North America come second, along with India, other parts of Asia, and Australia. South Africa also makes a showing for the first time. China, however, does not show strongly.

This metric – Installations per 100k Internet Users – seems like a reasonable way to compare the adoption of software between countries. ‘Internet Users’, by definition, have access to a computer (needed to run FOSS) and to the internet (needed to get FOSS). This metric is not skewed by the percentage of the population that are not internet users, and is not skewed by cost-of-living or exchange rates.
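
As arithmetic, the metric is just new installs scaled to the internet-using population. A one-liner with made-up numbers:

    # Hypothetical country: 1,200 new installs, 3.5m internet users.
    installs = 1_200
    internet_users = 3_500_000

    installs_per_100k_users = installs / internet_users * 100_000
    print(round(installs_per_100k_users, 1))  # 34.3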

Here are the top 40 countries for new Pentaho installations per 100k internet users (for countries with over 1 million internet users).

There is still a bias. Countries with a lower percentage of internet users in the total population will be rated higher than those with a very high percentage. This is because in the first case the individuals with internet access tend to be those in business, i.e. those with a higher-than-normal need for BI tools, whereas in the second case the internet users include relatively more families and individuals – those with less need for BI tools. This bias would not affect the installation figures of software such as Firefox, but it does affect the ratings in Pentaho’s case.

If we group the countries into regions we see some other interesting things. This scatter chart shows the number of internet users on the X axis and the number of new installations of Pentaho software on the Y axis.

Interestingly, the USA, South America, and Asia come out with around the same total number of installations (approx. 30,000), but the chart shows a large difference (from 100m up to 550m) in the number of internet users within those regions. Europe, as a region, has the highest number of new installations, with a 50% margin over the second-place region.

So which metric do you think is most valuable? And for what purpose?

Also interesting to note is that the 2010 installation numbers represent, for each country, 40-50% of the all-time (2007-2010) installation figures. This means that the number of new installations so far in 2010 is about the same as the number of installations in the previous three years combined.

About This Analysis

And yes, I used Pentaho software to do this analysis – I used Pentaho’s Agile BI process.

  • Iteration 1: I first loaded the ‘new installations’ data into a table to do the histograms and the first map. After seeing the map, it occurred to me that just looking at the installation figures was not very interesting, and that comparing installations to GDP might be better.
  • Iteration 2: So I went to find GDP data and added it to the table using Pentaho Data Integration. After seeing the ‘Pentaho Installs Per $Bn GDP’ map it occurred to me that other metrics might show different, and better, results – so I went to find other data sets, not knowing what I might find.
  • Iteration 3: At the CIA World Factbook I found Labor Force and Internet Users. I added these to the table and looked at the maps. At this point I decided that comparing installation counts with the total number of internet users in a country was a good metric.

It took three iterations of finding data, merge/calculate/load, and visualization before I settled on an analysis I thought was optimal. The important point here is that until I saw the data visualized, the next question did not occur to me, so a one-pass ‘requirements’ -> ‘design’ -> ‘implement’ -> ‘visualize’ process would not have worked.
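
The merge/calculate/load step in each iteration boils down to a join on country plus one more derived column. A condensed Python sketch of the equivalent logic (file and column names are invented; the real pipeline was built in Pentaho Data Integration):

    import pandas as pd

    # Invented file names standing in for the PDI transformation's inputs.
    installs = pd.read_csv("new_installs_2010.csv")  # country, installs
    factbook = pd.read_csv("cia_factbook.csv")       # country, gdp_bn, labor_force, internet_users

    df = installs.merge(factbook, on="country", how="left")

    # One derived metric per iteration of the analysis.
    df["installs_per_bn_gdp"] = df["installs"] / df["gdp_bn"]
    df["installs_per_100k_labor"] = df["installs"] / df["labor_force"] * 100_000
    df["installs_per_100k_users"] = df["installs"] / df["internet_users"] * 100_000

    df.to_csv("installs_metrics.csv", index=False)   # load step, feeding the maps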

Written by James

November 15, 2010 at 3:27 pm

Data Quality Issue Sparks Invasion of Costa Rica

leave a comment »

When it comes to data, quality is king. Validating that the data is right is an important part of a BI project. Get the data wrong, and bad things can happen.

For example: the State Department sends Google bad border data – and Costa Rica gets invaded.

http://searchengineland.com/nicaragua-raids-costa-rica-blames-google-maps-54885

http://edition.cnn.com/2010/TECH/web/11/05/nicaragua.raid.google.maps/

That’s my new favorite data quality anecdote.

Written by James

November 7, 2010 at 3:37 am

SalesForce needs a good CRM system

with 3 comments

Today I received some sales spam from SalesForce.com. Nothing unusual about that except that we are a customer of SalesForce.com.

It’s not a good demonstration of their product capabilities if the king of SaaS Customer Relationship Management can’t internally identify who their customers are. #Fail

SalesForce needs a better CRM system. Any suggestions?

Written by James

November 5, 2010 at 11:30 pm

Posted in Uncategorized

Comparing open source and proprietary software markets

with 22 comments

An excellent post on decisionstats.com in response to Jason Stamper’s CBR interview with SAS CEO Jim Goodnight highlights some issues that arise when you try to compare open source, commercial open source, and proprietary software in the marketplace.

The issues arise because the economics of the offerings are so different that traditional measures don’t work as well as they used to. As pointed out on decisionstats, the commercial open source offerings (as yet) don’t amount to much when you compare revenue dollars on paper.

There are several problems when attempting an analysis like this.

1. Stale Metrics

A few years ago the analysts of the operating systems market were confused. Everywhere they went they saw Linux machines in the server room. But every time they surveyed the market their results showed that Linux had a 2% market share. Their statistical data did not match the real world they witnessed. Why? They were asking the wrong question.

I’ll illustrate this problem using another domain. Let’s say we compare the actual usage of automobiles (vs unit sales) by looking at the total amount of gasoline bought by their owners. Up until a while ago the differences in fuel consumption between vehicles were small enough to be safely ignored. But if you use that metric today you’ll come to the conclusion that hybrid cars are not driven very far each year, and that electric cars are not driven at all. Clearly ‘gasoline spend’ is the wrong metric today, but it would have been fine in the past.

Let’s switch back to software. After the Red Hat purchase of JBoss I received a survey from a well-known analyst company. They were trying to gauge market reaction to the purchase. Here are two sample questions:

  • How much did you spend on JBoss licenses last year?
  • How much do you expect to spend on JBoss licenses next year?

My truthful answer to these questions would have to be $0. JBoss does not have a license fee. It is purchased on a subscription basis. There were other questions about spending on JBoss vs WebSphere and WebLogic. Even if I answer based on subscription dollars, $1 of JBoss buys me much more than $1 of WebSphere, so dollar-for-dollar comparisons aren’t valid.

For most commercial open source companies the subscription is around the same as the annual maintenance for the equivalent proprietary software – with no up-front license fee. Let’s say the proprietary software is $100k with 20% maintenance. So the first-year revenue is $120k, and $20k per year after that. The (approximately) equivalent commercial open source software will have a $0 license fee and an annual subscription of $20k or less per year. Over three years the proprietary software would cost $160k and the commercial open source software would cost $60k – roughly a third of the price. Independent assessments show the savings to be greater than that, but I’ll stick to 3x for now.
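
Here is that three-year arithmetic worked through (a sketch using the illustrative figures above, not real price lists):

    # Proprietary: $100k up-front license plus 20% annual maintenance.
    license_fee = 100_000
    maintenance = 0.20 * license_fee                  # $20k per year
    proprietary_3yr = license_fee + 3 * maintenance   # $160k over three years

    # Commercial open source: $0 license, ~$20k annual subscription.
    subscription = 20_000
    open_source_3yr = 3 * subscription                # $60k over three years

    print(proprietary_3yr / open_source_3yr)          # ~2.7, i.e. roughly 3x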

Any company using pure open source or a community edition will report even lower figures when it comes to software spend.

The analysts in the operating systems market now ask many more questions about how many CPUs are deployed, instead of just questions about money. Some even ask about Linux variants like Fedora. They ask these questions because they have learned that they don’t have a clear picture of the market without them.

Each analyst decides what questions they want to ask. In the BI space, to date, most of the questions are about spending, not usage.

2. Hidden Utilization

Just after Sun acquired MySQL, Jonathan Schwartz (Sun CEO) took Marten Mickos (MySQL CEO) with him on a visit to the CIO of a large financial services company. The CIO was pleased to meet Mickos but didn’t know why he was there – ‘we don’t use MySQL’, he said. Mickos informed the CIO that there had been over 1,000 downloads of MySQL by employees in his company. At this point the IT director informed his CIO that they used MySQL all over the place. Ask a CIO how much open source software is used in his company and you will often get an underestimate – sometimes wrong by many orders of magnitude.

3. Factoring in Community Edition

There is another problem with questions about software expenditure – they assume that the open source software has no value outside of a subscription.

Many commentators on software markets won’t include data about companies who use software but do not have a support agreement. I assume the logic is that if you don’t have a support agreement, you’re not really serious about it. Maybe it’s just because it’s a lot easier to analyze the market this way. But at some point it becomes like analyzing the travel market and saying that if you don’t use a travel agent you’re not really serious about traveling, and therefore don’t count. At some point the analysis no longer reflects reality.

The most popular open source software – Firefox, OpenOffice, MySQL, Linux, etc. – is used by thousands or millions of people. Look at the effect of the Apache HTTP server on its market. How can you assume that this usage does not affect the market share across that market segment?

So usage of open source BI is hard to ignore, but also hard to quantify. Maybe we factor it in under a new metric.

4. Intangible Market

There is a portion of any given software market that is intangible because open source software is being used. This intangible market is not monetized by the commercial open source company – it is not revenue or bookings. It is value (or savings if you prefer) that the market derives from the open source software.

Here is one way to estimate the size of the ‘Intangible Market’ of a commercial open source company that makes $10m in revenue:

  • These companies deliver functionality about 3-5 times cheaper than the proprietary software vendors – let’s use 3
  • These companies get about 70% of their revenue from subscriptions – a factor of 0.7
  • The ratio of community installations to customer installations is between 1:5000 and 1:100 – let’s use 1:100
  • The same number of installations, under a proprietary model, would generate $10m x 3 x 0.7 x 100 = $2.1 bn

So is the ‘intangible market’ of a $10m commercial software company really over $2bn? This seems a little extreme, but then so does assuming that usage of community edition software has zero impact. The $2.1bn is clearly strongly affected by assuming that every installation of the community edition has the same economic weight as a paying customer. So let’s mitigate. Many of these installations are:

  • At deployments where there is no software budget for BI tools.
  • In geographies where none of the proprietary vendors offer sales or support.
  • In economies where exchange rates make the proprietary offerings non-viable.

These installations represent a new market that has not been counted before, so assuming a dollar-for-dollar exchange with the ‘old’ market is not reasonable. On the other hand, many of these installations are:

  • At deployments of SMB and enterprise companies with BI budgets.
  • In North America, Europe and other regions where proprietary vendors offer sales or support.
  • In economies where the proprietary offerings are frequently purchased.

So how do we balance these? I’m going to go with a factor of 0.1, which says an installation of open source BI software (with no support contract) is effectively worth 10 cents where a supported commercial open source package or proprietary package would be worth $1.

So a commercial open source company with $10m in annual revenue has an intangible market of $10m x 3 x 0.7 x 100 / 10 = $210m. Overall these factors come out to 21x annual revenue.
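
The whole estimate in one place (each factor is one of the assumptions listed above, not a measured value):

    revenue = 10_000_000      # annual revenue of the commercial open source company
    price_ratio = 3           # proprietary equivalents cost roughly 3x more
    subscription_share = 0.7  # ~70% of revenue comes from subscriptions
    community_ratio = 100     # community installs per paying customer (conservative end)
    discount = 0.1            # an unsupported install counts as 10 cents on the dollar

    intangible_market = revenue * price_ratio * subscription_share * community_ratio * discount
    print(intangible_market)            # ~210,000,000
    print(intangible_market / revenue)  # ~21.0 – the FIMF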

So there you have it. My initial estimate of a generic FOSS Intangible Market Factor (FIMF) is 21. To be accurate you’d need to estimate the FIMF for each open source offering within a given market.

Then again, I’m a code jockey, not an analyst. Maybe they can estimate each of these factors more accurately.

Written by James

November 2, 2010 at 4:15 am

SAS under pressure from Pentaho

with 22 comments

Computer Business Review recently reported an interview with SAS CEO Jim Goodnight in an article titled SAS CEO says CEP, open source and cloud have “limited” appeal.

Firstly, welcome to the party, Jim.

This article is good news for Pentaho. Here’s why. For proprietary software companies the first rule of marketing against open source competitors is ‘don’t mention them unless you absolutely have to’. Microsoft discovered this over 12 years ago, and since then has not managed to create an effective anti-open source marketing campaign (read about the Halloween Documents here and here for more info). That’s with over a decade to think about it and over $10 billion to spend. So SAS must be under some pressure, otherwise Jim Goodnight (the CEO no less) would be keeping his mouth shut. Hence the title of this post.

What this means is that SAS has moved from the Ignorance phase to the Ridicule phase of battling open source; they only have Fighting and Losing to go. See Red Hat’s Truth Happens video for an entertaining description of the phases.

There are a couple of other interesting points in the article.

Jim Goodnight says of open source BI:

We haven’t noticed that (open source BI) a lot. Most of our companies need industrial strength software that has been tested, put through every possible scenario or failure to make sure everything works correctly.

He states that SAS does not come across companies like Pentaho in their part of the BI market. Interestingly he picks on quality and the extent of the testing environment, which is actually a strength of open source development over proprietary development. He implies that his customers have additional requirements that other BI consumers might not have. He seems to be saying that SAS has enterprise products and enterprise customers, leading us to conclude that open source BI is only a competitive threat in the small and medium sized business (SMB) market. There are a couple of problems with this for SAS. Firstly, they list their customers on their web site, and it’s very easy to cross-reference that list against the companies we know have downloaded and installed Pentaho’s software – a quick comparison shows the overlap is higher than 30%. Ouch. So much for SAS’s customers not having an interest in open source BI. You can check the Pentaho forums and see the activity of SAS customers integrating with Pentaho, and (oh no!) migrating from SAS to Pentaho. Google Trends shows us how interested people are in Jim Goodnight’s company vs ours.
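
That cross-reference is nothing more than a set intersection. A toy sketch (the company names are made up; the real lists come from SAS’s public customer page and Pentaho’s installation data):

    # Made-up names for illustration only.
    sas_customers = {"Acme Corp", "Globex", "Initech", "Umbrella"}
    pentaho_installs = {"Globex", "Initech", "Stark Industries"}

    overlap = sas_customers & pentaho_installs
    print(f"{len(overlap) / len(sas_customers):.0%} overlap")  # 50% here; >30% in the real data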

The reason SAS has not noticed open source BI is that those projects have gone off the radar and their sales reps don’t even hear about them. They don’t see because they’re not looking. Meanwhile, the SAS sales reps almost certainly have a deck of slides to position SAS against open source alternatives. If they are anything like the slide decks of the other proprietary BI vendors, they are amusingly inaccurate FUD. Send me a copy if you have one – I’d enjoy it.

Also, Jim Goodnight’s viewpoint on open source BI is interesting because it shows where the ‘Open Source Tide’ is. Here is how the tide has been rising:

  • Operating system vendors (Microsoft, 1990s) say open source is for hobbyists.
  • A few years later, database vendors say open source is fine for your OS but not for your databases.
  • A few years after that, middleware vendors say open source is fine for OS and databases, but not for middleware.
  • A few years after that, application vendors say open source is fine for the rest of the stack, but not for applications.

At this point in the tide we are into sub-markets. Jim Goodnight is implying that open source is suitable for SMB BI but not for SAS’s kind of BI. My favorite Open Source Tide article is a Forbes one titled A Fatal Flaw for Open Source. In this article it is essentially stated that open source is ok for everything except multi-tenant hosted applications. Wow. The tide is getting pretty high. Not much ground left to stand on. Of course the Forbes interview is with the CEO of a company that provides hosting services. The common theme here is that these CEOs are telling us ‘open source is fine for everything except what we sell’. Predictable really. But they can never really tell you why – how can SAS tell us open source BI is not enterprise-ready when they support Linux, MySQL, PostgreSQL, etc.?

Here is another thought. A few years ago, weren’t the established BI vendors saying their avenues for growth were going to be the SMB market and emerging markets? Now the CEO of SAS is saying he’s comfy staying in the enterprise zone. Hmm, so emerging markets are the fallback? SAS has customers in 45 countries. Pentaho has installations in over 180 countries. Good luck entering those geographies with expensive proprietary software when open source BI is already the incumbent.

With open source presenting challenges in both SMB and emerging markets, you might expect the growth of BI for the old-guard companies to be rather flat – oh look:

  • Business Objects (SAP): -0.2% market share growth last year
  • Hyperion (Oracle): 2.3% market share growth last year
  • SAS: 2.7% market share growth last year
  • MicroStrategy: -6.4% product license revenue growth Q3 2010 from Q3 2009
  • Pentaho is on target for 150% growth this year

One final comment: I’ve worked for proprietary BI companies including Hyperion. If you’re going to pick on open source, don’t pick on quality.

Written by James

November 1, 2010 at 6:53 pm