
On the rise and fall of the GNU GPL

Back in 2011 we caused something of a stir, to say the least, when we covered the trend towards permissive licensing at the expense of reciprocal copyleft licenses.

Since some people were dubious of Black Duck’s statistics, to put it mildly, we also validated our initial findings, at Bradley M Kuhn’s suggestion, using a selection of data from FLOSSmole, which confirmed the rate of decline in the proportion of projects using the GPL family of licenses between October 2008 and May 2011.

Returning to Black Duck’s figures, we later projected that if the rate of decline continued the GPL family of licenses (including the LGPL and AGPL) would account for only 50% of all open source software by September 2012.

As 2012 draws to a close it seems like a good time to revisit that projection and check the latest statistics.

I will preface this with an admission that yes, we know these figures only provide a very limited perspective on the open source projects in question. A more rounded study would look at other aspects such as how many lines of code a project has, how often it is downloaded, its popularity in terms of number of users or developers, how often the project is being updated, how many of the developers are employed by a single vendor, and what proportion of the codebase is contributed by developers other than the core committers. Since that would involve checking all these for more than 300,000 projects I’m going to pass on that.

Additionally, while all that is true, it does not mean that there is no value in examining the proportion of projects using a certain license. I am more interested in what the data does tell us, than what it doesn’t.

Data sources:
We analysed two distinct data sources for our previous analysis: Black Duck’s license data and a selection of data collected by FLOSSmole. Specifically, we chose data from Rubyforge, Freecode (fka Freshmeat), ObjectWeb and the Free Software Foundation because those were the only sets for which historical (October 2008) data was available in mid 2011. For this update we have to use FLOSSmole’s data from September 2012, since the November 2012 dataset for the Free Software Foundation is incomplete. It is not possible to get a picture of GPLv2 traction using this FLOSSmole data since the majority of projects on Freecode are labelled “GPL” with no version number. In addition, for this update we have also looked at FLOSSmole data from Google Code, comparing datasets for November 2011 and November 2012, to get a sense of the trends on a newer project hosting site.

Black Duck’s data
According to Black Duck’s data the proportion of projects using the GNU GPL family of licenses declined from 70% in June 2008 to 53.24% today. The first thing to note therefore is that the rate of decline seen a year ago did not continue, and that the GNU GPL family of licenses continues to account for more than 50% of all open source software. The rate of the decline of the GNU GPLv2 has actually accelerated over the past year, however, and its usage is now almost the same as the combination of permissive licenses (I went with MIT/Apache/BSD/Ms-PL, you can argue about that last one if you like, but I’ve got to stick with it for consistency) at around 32%.

FLOSSmole’s data
Also in the interests of consistency I should clarify that we made a slight error in our previous calculations relating to the data from FLOSSmole. When we looked at the FLOSSmole data in June 2011 we reported a decline from 70.77% in October 2008 to 59.31% in May 2011. In calculating the data for this update I identified an error: the figure should have been 62.8% in 2011. So less of a decline, but a decline nonetheless. The figures show that despite the total number of projects increasing from 54,000 in 2011 to 57,069 in September 2012, the proportion of projects using the GNU GPL family of licenses has remained steady at 62.8%. However, the proportion of projects using permissive licenses has grown, from 10.9% in 2008 to 13.4% in 2011 and 13.7% in September 2012.

Google Code data
The data from Google Code involves a much larger data set: 237,810 projects in 2011 and 300,465 in 2012. It also presents something of a problem, since one of the choices on Google Code is dual licensing under the Artistic License/GPL. Including these projects in the GNU GPL family count, we see that the proportion of projects hosted on Google Code using the GNU GPL family of licenses declined from 54.7% in November 2011 to 52.7% in November 2012. Interestingly, though, the proportion of projects using permissive licenses also fell, from 38% in 2011 to 37.1% today. As a side note, the use of “other open source licenses” grew from 2.0% in 2011 to 4.3% in 2012.

What does it all mean? You can read as much or as little into the statistics as you wish. Since I am fed up with being accused of being a shill for providing analysis of the numbers I won’t bother to do so on this occasion – you are perfectly free to figure it out for yourselves.

Here’s everything in a single chart:

Back to the future of commercial open source

It’s been tempting to write a post about open source licensing trends and how they relate to commercial business strategies, given ongoing interest in our previous posts about the relative decline of the GPL.

Every time I start to write a post though I realise that I’d just be repeating myself, most notably The future of commercial open source business strategies from December 2011, but also Control and Community – and the future of commercial open source strategies from late 2010.

You can trace the origins of the theories and research in those posts back to The golden age of open source? in August 2010, and even further to Commercial open source business strategies in 2009 and beyond from early 2009.

That post in particular contains the core elements about why we believed we were at a tipping point with regards to commercial open source strategies, prompting the shift from vendor-led strategies that emphasised control via copyleft licenses, to community-led strategies that emphasised collaboration via permissive licenses.

The one aspect that those posts didn’t cover is what happens after this shift. That is a question that has recently been addressed by Simon Phipps, who predicts that the pendulum will swing to the centre and weak-copyleft licenses and specifically the recently released MPLv2.

While I don’t dispute the logic of that prediction, I can see nothing in the data that we have previously collected and analysed that indicates a shift to weak-copyleft. As you can see, while there was a strong shift from vendors towards non-copyleft licenses from 2007 onwards, we have seen no such shift with regards to weak-copyleft.

Which is not to say that it won’t happen – just that we see no evidence of it right now, and that we would have to see an enormous swing towards weak-copyleft licenses in the next couple of years. It will be interesting to see whether the release of MPLv2 will be the event that triggers that swing.

That’s not science: the FSF’s analysis of GPL usage

The Free Software Foundation has responded to our analysis of figures that indicate that the proportion of open source projects using the GPL is in decline.

Specifically, FSF executive director John Sullivan gave a presentation at FOSDEM which asked “Is copyleft being framed?” You can find his slides here, a write-up about the presentation here, and Slashdot discussion here.

Most of the opposition to the earlier posts on this subject addressed perceived problems with the underlying data, specifically that it comes from Black Duck, which does not publish details of its methodology. John’s response is no exception. “That’s not science,” he asserts, with regards to the lack of clarity.

This is a valid criticism, which is why – prompted by Bradley M Kuhn – I previously went to a lot of effort to analyze data from Rubyforge, Freshmeat, ObjectWeb and the Free Software Foundation collected and published by FLOSSmole, only to find that it confirmed the trend suggested by Black Duck’s figures. I was personally therefore happy to use Black Duck’s figures for our update.

John Sullivan is not overly impressed with the FLOSSmole numbers either, noting that while they are verifiable, they do leave a number of questions related to the breadth and depth of the sample, the relative activity of the projects, whether all lines of code and applications should be treated equally, and how packages with multiple licenses are treated.

These are all also valid questions. As we previously noted, a study that *might* satisfy all questions related to license usage would have to take into account how many lines of code a project has; how often it is downloaded; its popularity in terms of number of users or developers; how often the project is being updated; how many of the developers are employed by a single vendor; and what proportion of the codebase is contributed by developers other than the core committers.

John offers some evidence of his own that suggests that the use of the GPL is in fact growing. Anyone hoping for the all-encompassing study mentioned above is in for some disappointment, however. It is based on a script-based analysis of the Debian GNU/Linux distribution codebase.

Nothing wrong with the script-based analysis – but a single GNU/Linux distribution considered to be a representative sample of all free and open source software?

That’s not science.

The future of commercial open source business strategies

The reason we are confident that the comparative decline in the use of the GNU GPL family of licenses, and the increasing significance of complementary vendors in funding for open source software-related vendors, will continue is our analysis of our database of more than 400 open source software-related vendors, past and present.

We previously used the database to analyze the engagement of vendors with open source projects for our Control and Community report, plotting the strategies used by the vendors against the year in which they first began to engage with open source projects to get an approximate view of open source-related strategy changes over time.

For example, we found that the engagement of vendors with projects that used strong copyleft licenses peaked in 2006, while the engagement of vendors with projects using non-copyleft licenses had been rising steadily since 2002.

Analysis of our updated database shows that the number of new vendors engaging with open source projects in each year has risen steadily in recent years, from 26 in 2008 to 44 in 2011. However, as noted last week, we have also seen a shift towards ‘complementary vendors’ – those that are dependent on open source software to build their products and services, even though those products and services may not themselves be open source.

2010 was the first year in which we saw more complementary vendors engage with open source projects than open source specialists, and that trend accelerated in 2011.

As previously explained, complementary vendors were responsible for over 30% of open source software-related funding raised in 2011, and we should expect that proportion to remain high given that over 57% of the vendors engaging with open source in 2011 were complementary vendors.

We have also seen that complementary vendors are more likely to engage with projects with non-copyleft licenses (38% of complementary vendors have engaged with projects with non-copyleft licenses, compared to 24% that have engaged with projects with strong copyleft licenses).

If we look at all 400+ vendors in our database in terms of open source software license preference, the trend towards new vendors engaging with non-copyleft licenses is clear.

There has been a strong shift from vendors towards non-copyleft licenses in recent years, accelerated in 2011 by the likes of Apache Hadoop and OpenStack in particular. This does not mean that the number of projects using strong copyleft licensing has decreased (although as we previously saw the proportion of projects using the GPL family of licenses has declined).

It is indicative, we believe, of the shift away from specialist open source vendors using vendor-led projects and strong copyleft licenses towards multi-vendor collaborative projects and proprietary implementations of open source code, however.

This trend should not really surprise anyone. For some time we have seen open source becoming part of the fabric of modern software development and licensing strategies, rather than a competitive differentiator. Back in 2009 we predicted the increased importance of business strategies that relied on vendor-led development communities, rather than projects dominated by a single vendor.

We called this “open source 4.0” and later suggested that it might be considered the golden age of open source, based on our belief that vendors had learned that they stand to gain more from collaborating on open source projects and differentiating at another stage in the software stack than they do from attempting to control open source projects.

Updating the results of our analysis to the end of 2011 and 400+ vendors indicates that, from the perspective of the commercial adoption of open source business strategies at least, we were not far off.

Some might not consider the proliferation of multi-vendor open source communities and proprietary distributions of open source software as the peak of achievement for open source. Everyone is of course entitled to come to their own conclusions about the implications.

Our perspective, as always, is that open source methodologies present a potentially disruptive, and also valuable, asset that complements the way both vendors and enterprise IT organizations conduct their businesses.

Our analysis indicates, however, that open source methodologies are increasingly being employed by ‘complementary vendors’ with a leaning towards more permissive licensing.

VC funding for OSS hits new high. Or does it?

One of the favourite blog topics on CAOS Theory blog over the years has been our quarterly and annual updates on venture capital funding for open source-related businesses, based on our database of over 600 funding deals since January 1997 involving nearly 250 companies, and over $4.8bn.

There are still a few days left for funding deals to be announced in 2011 but it is already clear that 2011 will be a record year. $672.8m has been invested in open source-related vendors in 2011, according to our preliminary figures, an increase of over 48% on 2010, and the highest total amount invested in any year, beating the previous best of $623.6m, raised in 2006.

Following the largest single quarter for funding for open source-related vendors ever in Q3, Q4 was the second largest single quarter for funding for open source-related vendors ever, as $230.4m was invested in companies including Cloudera, Hortonworks, and Rapid7.

As with Q3, however, the list of vendors presents us with something of an existential dilemma, as we see an increasing amount of activity by what we have referred to as ‘complementary vendors’ – those that are dependent on open source software to build their products and services, even though those products and services may not themselves be open source – as opposed to open source specialists.

The list of complementary vendors has grown rapidly in 2011, particularly around projects such as OpenStack and Apache Hadoop. If we examine the figures in more detail we find that over 30% of the funding raised in 2011 was raised by complementary vendors, compared to just 4% in 2006.

In fact, as the chart below indicates, VC funding for specialist open source vendors in 2011 was actually less than that in 2006 and 2008, and only marginally up on 2010, when again just 4% of funding went to complementary vendors.

The low amount of funding for complementary vendors in 2010 shows that the significance of complementary vendors is not growing at a constant rate, although for reasons that will become clear when we publish a follow-up post on the latest trends regarding the engagement of vendors with open source projects, we do expect that the proportion of funding related to complementary vendors is more likely to increase in the future, rather than decline.

This has implications for the ongoing trends related to open source software licensing, as covered yesterday. Examining our database of over 400 open source-related vendors – funded and unfunded, complementary and specialist – indicates that specialist vendors are much more likely to engage with projects using strong copyleft licenses than complementary vendors.

Specifically, our data indicates that 55% of open source specialists have engaged with projects that use strong copyleft licenses, while just 20% have engaged with projects with non-copyleft licenses. In comparison, 38% of complementary vendors have engaged with projects with non-copyleft licenses, compared to 24% that have engaged with projects with strong copyleft licenses.

We will take a more detailed look at the trends related to the engagement of vendors with open source projects in the concluding part of this series of posts.

On the continuing decline of the GPL

Our most popular CAOS blog post of the year, by some margin, was this one, from early June, looking at the trend towards permissive licensing, and the decline in the usage of the GNU GPL family of licenses.

Prompted by this post by Bruce Byfield, I thought it might be interesting to bring that post up to date with a look at the latest figures.

NB: I am relying on the current set of figures published by Black Duck Software for this post, combined with our previous posts on the topic. I am aware that some people are distrustful of Black Duck’s figures given the lack of transparency on the methodology for collecting them. Since I previously went to a lot of effort to analyze data collected and published by FLOSSmole to find that it confirmed the trend suggested by Black Duck’s figures, I am confident that the trends are an accurate reflection of the situation.

The figures indicate that not only has the usage of the GNU GPL family of licenses (GPL2+3, LGPL2+3, AGPL) continued to decline since June, but that the decline has accelerated. The GPL family now accounts for about 57% of all open source software, compared to 61% in June.

As you can see from the chart below, if the current rate of decline continues, we project that the GPL family of licenses will account for only 50% of all open source software by September 2012.

That is still a significant proportion of course, but would be down from 70% in June 2008. Our projection also suggests that permissive licenses (specifically in this case, MIT/Apache/BSD/Ms-PL) will account for close to 30% of all open source software by September 2012, up from 15% in June 2009 (we don’t have a figure for June 2008 unfortunately).
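The arithmetic behind a projection like this is simple linear extrapolation from two observations. As a rough sketch (using the approximate percentages quoted in these posts, not Black Duck’s raw data):

```python
# A minimal sketch of the linear projection described above, assuming the
# June-to-December 2011 rate of decline simply continues. Dates are
# expressed in months for simplicity; inputs are approximate figures
# quoted in the post, not Black Duck's raw data.
def project_linear(share_then, share_now, months_elapsed, months_ahead):
    """Extrapolate a license share linearly from two observations."""
    rate = (share_now - share_then) / months_elapsed  # points per month
    return share_now + rate * months_ahead

# Roughly 61% in June 2011 and 57% in December 2011;
# project 9 months forward to September 2012:
projected = project_linear(61.0, 57.0, 6, 9)
print(f"projected GPL-family share: {projected:.0f}%")  # ≈ 51%
```

With these rounded inputs the extrapolation lands at about 51%, in the same ballpark as the 50% projection above.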

Of course, there is no guarantee that the current rate of decline will continue – as the chart indicates the rate of decline slowed between June 2009 and June 2011, and it may well do so again. Or it could accelerate further.

Interestingly, however, while the more rapid rate of decline prior to June 2009 was clearly driven by the declining use of the GPLv2 in particular, Black Duck’s data suggests that the usage of the GPL family declined at a faster rate between June 2011 and December 2011 (6.7%) than the usage of the GPLv2 specifically (6.2%).

UPDATE – It has been rightly noted that this decline relates to the proportion of all open source software, while the number of projects using the GPL family has increased in real terms. Using Black Duck’s figures we can calculate that the number of projects using the GPL family of licenses in fact grew 15% between June 2009 and December 2011, from 105,822 to 121,928. However, in the same time period the total number of open source projects grew 31% in real terms, while the number of projects using permissive licenses grew 117%. – UPDATE
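The point in the update is worth making concrete: a license family’s absolute project count can grow while its share of all projects falls, as long as the total grows faster. A quick check using the Black Duck figures quoted above:

```python
# Verifying the update's arithmetic with the figures quoted above:
# GPL-family project counts in June 2009 and December 2011.
gpl_2009, gpl_2011 = 105_822, 121_928

gpl_growth = gpl_2011 / gpl_2009 - 1          # absolute growth of the family
print(f"GPL family growth: {gpl_growth:.0%}")  # ≈ 15%

# The total project count grew 31% over the same period, so the
# family's *share* necessarily shrinks even as its count rises:
share_ratio = (1 + gpl_growth) / 1.31         # new share / old share
print(f"share shrinks to {share_ratio:.0%} of its 2009 level")  # ≈ 88%
```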

As indicated in June, we believe there are some wider trends that need to be discussed in relation to license usage, particularly with regards to vendor engagement with open source projects and a decline in the number of vendors engaging with strong copyleft licensed software.

The analysis indicated that the previous dominance of strong copyleft licenses was achieved and maintained to a significant degree due to vendor-led open source projects, and that the ongoing shift away from projects controlled by a single vendor toward community projects was in part driving a shift towards more permissive non-copyleft licenses.

We will update this analysis over the next few days with a look at the latest trends regarding the engagement of vendors with open source projects, and venture funding for open source-related vendors, providing some additional context for the trends related to licensing.

MySQL at the core of commercial open source

Oracle last week quietly announced the addition of new extended capabilities in MySQL Enterprise Edition, confirming the adoption of the open core licensing strategy, as we reported last November.

The news was both welcomed and derided. Rather than re-hashing previous arguments about open core licensing, what interests me more about the move is how it illustrates the different strategies adopted by Sun and Oracle for driving revenue from MySQL, and how a single project can be used to describe most of the major strategies from generating revenue from open source software.

Like most open source-related software vendors, MySQL started out life offering support, training and consulting around the open source database. The company also saw success in offering a closed source variant of the database for embedding in closed source systems, and it was this dual licensing strategy that drove much of the company’s early revenue. That began to change with the arrival of MySQL Enterprise (initially ‘MySQL Network’) – a subscription offering that delivered monitoring and (later) backup capabilities to paying customers only. While some people see this as an example of the open core licensing strategy, as we have previously explained, it is not. While open core builds on the dual licensing strategy by adding closed extensions, MySQL AB’s MySQL Enterprise, as the graphic above illustrates, actually paired those extensions with the open source MySQL Community edition – a subtle difference from the MySQL Enterprise licensing strategy adopted by Oracle (more of which later).

MySQL flirted with the open core licensing model in early 2008 with plans to introduce new features into Enterprise Edition that would not be available under an open source license. Those plans were ultimately reversed at the behest of new owner Sun Microsystems. To understand why Sun did this one must consider the company’s wider strategy for open source at the time. While a software freedom philosophy played a part, Jonathan Schwartz’s map of open source downloads, each representing ‘a potential customer that cost Sun nothing to acquire’, explains how Sun was less interested in driving direct revenue from MySQL (and other open source software) than it was in helping open source users to become customers for Sun’s commodity hardware and other products and services. (Although as Henrik notes in the comments, Sun did also increase MySQL direct revenue as well).

Sun never got the chance to prove whether this model would have worked (I’m being polite), but in any case contrast Sun’s approach with Oracle’s strategy for open source. While the majority of Oracle’s revenue clearly comes from other products, it is not looking to drive revenue for those products via open source downloads. Witness Larry Ellison’s recent proclamation that he doesn’t care if Oracle’s x86 server business (typically used to run MySQL) goes to zero. Instead (for better or worse) the company is focused on driving revenue directly from each individual product, whether that is a high margin server, or closed or open source software. That has resulted in an increased investment in embedded opportunities for MySQL, as well as traditional software license agreements. While customers might choose to use MySQL Community and purchase additional support subscriptions, as of November 2010 Oracle prefers that Standard Edition and Enterprise Edition customers enter into a commercial license agreement with the company. That was a strategy that was in place in advance of last week’s addition of high availability, scalability and security features, but one that clearly looks set to continue.

Whether this is a good or a bad thing depends on your perspective. Monty Widenius does a good job of outlining the down sides to an open core licensing strategy, while Giuseppe Maxia focuses on the positives. Certainly Oracle will have to be mindful to balance the control and community aspects, but as we have previously covered (451 Group clients) there are a number of new capabilities in development for the core MySQL database itself. It is also worth noting, incidentally, that MySQL Enterprise Edition remains priced at $5,000 per server per year.

FLOSSmole data confirms declining GPL usage

Last week we published a post looking at some statistics suggesting a decline in the usage of the GNU GPL.

The post sparked some interesting debate, not least about the validity of Black Duck Software’s numbers, which we had used to compare usage of the various FLOSS licenses over recent years.

While we have no specific reason to doubt Black Duck’s figures, Bradley M Kuhn, in particular, suggested that Black Duck’s data should be “ignored by serious researchers” since the company doesn’t disclose enough detail about its data collection methods.

He added that “AFAICT, FLOSSmole is the only project attempting to generate this kind of data and analysis thereof in a scientifically verifiable way”.

You can probably guess where this is going…

Started in 2004, FLOSSmole* collects data on open source software projects. FLOSSmole’s data is freely available via Google Code.

In order to test Black Duck’s data we downloaded FLOSSmole data from four sources for which both current (May 2011) and historical (October 2008) data was available: Rubyforge, Freshmeat, ObjectWeb and the Free Software Foundation.

We then sorted each data set and generated subtotals for each license type, checking the data manually to make sure we had combined all the relevant data (data tagged GPL2, GPLv2 and GNU GPLv2 for example).

Given the wide variety of ways in which the various GNU licenses have been tagged across the four data sources (a huge number of Freshmeat projects are tagged simply “General Public License” with no version number) it also made sense to group the licenses together into the GPL family (including the LGPL and AGPL).
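The grouping step described above amounts to folding each raw tag into a coarse family before tallying. A hypothetical sketch (the variant lists here are illustrative, not the exact ones we used):

```python
# Illustrative sketch of the label-grouping step: FLOSSmole exports tag
# the same license many different ways, so we fold known variants into
# one family before computing proportions. The variant set below is a
# made-up sample, not the full list used in the analysis.
from collections import Counter

GPL_FAMILY = {
    "GPL", "GPL2", "GPLv2", "GNU GPLv2", "GPLv3", "GNU GPLv3",
    "General Public License", "LGPL", "LGPLv2.1", "LGPLv3", "AGPLv3",
}

def license_family(raw_label):
    """Map a raw license tag to a coarse family name."""
    label = raw_label.strip()
    return "GPL family" if label in GPL_FAMILY else label

def family_share(raw_labels):
    """Proportion of projects per license family."""
    counts = Counter(license_family(l) for l in raw_labels)
    total = sum(counts.values())
    return {family: n / total for family, n in counts.items()}

# Example with made-up tags:
sample = ["GPLv2", "GNU GPLv2", "General Public License", "MIT", "Apache"]
shares = family_share(sample)  # shares["GPL family"] == 0.6
```

In practice the tricky part is the manual check that no variant spelling slips through; the code is the easy bit.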

The results show that the GPL family of licenses accounted for 70.77% of all 53,914 projects in the sample in October 2008. In May 2011 that figure had declined to 59.31% of 54,800.

As a reminder, the figures from Black Duck showed the proportion of projects using the GPL family of licenses had declined from 70% in June 2008 to 61% today. So the FLOSSmole figures actually show a more rapid decline in GPL usage than Black Duck’s.

One important point to note is that a significant number of projects (5,775) in the 2011 Freshmeat data do not have license details. Removing these projects from the sample would result in the GPL family of licenses representing 66.3% of 49,025 projects in 2011.
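The adjustment is straightforward to reproduce from the figures quoted in this post:

```python
# Back-of-the-envelope check of the adjustment described above, using
# the figures quoted in the post: 54,800 projects in May 2011, of which
# 5,775 Freshmeat entries carry no license details, and 59.31% of the
# full sample use the GPL family.
total = 54_800
unlicensed = 5_775
gpl_family = round(total * 0.5931)       # ≈ 32,502 GPL-family projects

adjusted_total = total - unlicensed      # 49,025 projects with a license
adjusted_share = gpl_family / adjusted_total
print(f"{adjusted_share:.1%}")           # ≈ 66.3%
```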

Either way, the FLOSSmole results confirm a decline in GPL usage.

UPDATE: Just to be clear, the figures for ‘GPL family’ above include both LGPL and AGPL as well. FLOSSmole’s figures show both increased from 2008-2011, from 6.22% to 7.21% and 0.11% to 0.36% respectively.

2ND UPDATE: Of course, the % of total projects is only one way to measure adoption, and some people will argue it’s not a particularly good one. Certainly we’re not going to get carried away with the fact that the % of projects hosted by the Free Software Foundation using the GPL family has declined from 81.2% to 76.7%. Although it is kind of interesting.

*Howison, J., Conklin, M., & Crowston, K. (2006). FLOSSmole: A collaborative repository for FLOSS research data and analyses. International Journal of Information Technology and Web Engineering, 1(3), 17–26. (more)

The trend towards permissive licensing

Ian Skerrett last week suggested that there is a growing trend in favour of permissive non-copyleft licenses at the expense of reciprocal copyleft licenses. Ian asked “name one popular community open source project created in the last 5 years that uses the AGPL or GPL?”

The responses didn’t exactly come thick and fast. I certainly couldn’t think of one. But the question did prompt me to look for some evidence for the trend away from copyleft licenses.

License usage
The first port of call for evidence of trends related to open source license use is Black Duck’s Open Source Resource Center. The latest figures show that GPLv2 is used for 45.33% of projects in Black Duck’s KnowledgeBase, while the GPL family accounts for roughly 61% of all projects.

While the GPL family is dominant, comparing the latest figures with those provided in June 2008, June 2009, and some previous CAOS research from March 2010 indicates a steady decline in the use of the GPL family and the GPLv2 in particular.

According to Black Duck’s figures the proportion of open source projects using the GPL family of licenses has fallen to 61% today from 70% in June 2008, while the GPLv2 has fallen to 45% from 58% three years ago.

It is worth noting that the number of projects using the GPL licenses has increased in real terms over the past few years. According to our calculations based on Black Duck’s figures, the number of GPLv2 projects rose 5.5% between June 2009 and June 2011, while the total number of open source projects grew over 16%.

We should expect to see slower growth for the GPLv2 given it has been superseded but even though the number of AGPLv3 and GPLv3 projects grew 90% and 85% respectively over the past two years, that only resulted in 29% growth for the GPL family overall (while A/L/GPLv3 adoption appears to be slowing).

In comparison the number of Apache licensed projects grew 46% over the past two years, while the number of MIT licensed projects grew 152%. Indeed Black Duck’s figures indicate that the MIT License has been the biggest gainer in the last two years, jumping from 3.8% of all projects in June 2009 to 8.23% today, leapfrogging Apache, BSD, GPLv3 and LGPLv2.1 in the process.

While the level of adoption of copyleft licenses remains dominant, and continues to rise in terms of the number of projects, there is no escaping the continuing overall decline in terms of ‘license share’.

UPDATE – Since some people did not trust Black Duck’s data I also took a look at data collected by FLOSSmole. The results are remarkably similar. – UPDATE

Vendor formation
Black Duck’s data is not the only indication that the importance of copyleft licenses has decreased in recent years. The research we conducted as part of of our Control and Community report also indicated a decline in the number of vendors engaging with strong copyleft licensed software.

Specifically, we evaluated the open source-related strategies of 300 software vendors and subsidiaries, including the license choice, development model, copyright strategy and revenue generator.

By plotting the results of this analysis against the year in which the companies were founded (for open source specialists) or began to engage with open source (for complementary vendors) we are able to gain a perspective on the changing popularity of the individual strategies*.

Having updated the results to the end of 2010, our analysis now covers 321 vendors and shows that 2010 was the first year in which there were more companies formed around projects with non-copyleft licences than with strong copyleft licences.

The formation of vendors around open source software with strong copyleft licenses peaked in 2006, having risen steadily between 1997 and 2006 – although there have been gains since 2007. By comparison, the formation of vendors around open source software with non-copyleft licenses has been steadily increasing since 2002.

The results get even more interesting in terms of Ian’s question if we filter them by development model. Looking at community-led development projects, we see that there have been significantly more companies formed around community-led projects with non-copyleft licenses than with strong copyleft licenses since 2007.

In fact, strong copyleft licenses have been much more popular for vendor-led development projects, but even here there was an increase in the use of non-copyleft licenses in 2010.

This last chart illustrates something significant about the previous dominance of strong copyleft licenses: that it was achieved and maintained to a significant degree due to the vendor-led open source projects, rather than community-led projects.

One of the main findings of our Control and Community report was the ongoing shift away from projects controlled by a single vendor and back toward community and collaboration. While some might expect that to mean increased adoption of strong copyleft licenses – given that they are associated with collaborative development projects such as GNU and the Linux kernel – the charts above indicate a shift towards non copyleft.

As previously noted, while free software projects utilize strong copyleft to ensure that the software in question remains open (or as Bradley M Kuhn recently put it, to keep developers “honest”), vendors using the open core licensing strategy use strong copyleft licenses, along with copyright ownership, to ensure that only they have the opportunity to take it closed.

Either way, strong copyleft is used as a means of control on the code and the project, and our analysis backs up Ian’s contention that there is a trend away from control and towards more permissive non-copyleft licenses.

This is part of what we called the fourth stage of commercial open source business strategies and is being driven by the increased engagement of previously closed-source vendors with open source projects.

The fourth stage is about balancing the ability to create closed source derivatives with collaborative development through multi-vendor open source projects and permissive licensing, and as such it not only avoids the need to control a project through licensing, it actively discourages control through licensing.

That is why, in my opinion, the decline of the copyleft licenses has only just begun.

*The method is not perfect, since it plots the license being used today against the year of formation, and as such does not reflect licensing changes in the interim. It does provide us with an overview of general historical trends, however.

Opening up the Open Source Initiative

One of the ironies of open source over the years has been that the organisation formed to “educate about and advocate for the benefits of open source”, the Open Source Initiative, was itself *perceived to be* something of a closed shop [see the comments for clarification on this point].

That is set to change as the OSI has publicly launched its plan to encourage greater participation by shifting to a membership model and elected board members. The plan was announced during a session at the Open Source Business Conference (slides) and is part of an effort to focus on the second half of the organisation’s mission statement: “to build bridges among different constituencies in the open source community”.

As OSI director Simon Phipps explains in an interview with The H, the plan is to begin welcoming existing open source-related organisations into an affiliate programme, starting with non-profit open source foundations (hopefully by OSCON), followed by for-profit organisations (but not corporations) and government bodies. There will also be a separate Corporate Advisory Board for corporations, and a Personal Affiliate Scheme for individuals.

The exact plan for voting rights and the board election process is still to be decided but a complex arrangement involving an electoral college formed of delegates from groups of affiliates and the OSI’s working groups has been proposed.

The complexity seems to have been born out of two requirements: a desire to ensure that working group participants are not out-numbered by affiliates, and to ensure that the OSI cannot be subverted by any single affiliate (or more specifically corporation) gaining too much power.

Protecting the OSI from subversion is clearly an important goal, although having read the proposal and discussion it does seem to me that this is receiving more attention than is strictly necessary (as an aside, Henrik Ingo has proposed supermajority voting requirements to shield the Open Source Definition from unnecessary tinkering, which to my mind provides the required security without the associated complexity).

Arguably, a fate equal to the subversion of the OSI would be irrelevance. Rather than assuming that organisations will seek to over-run the OSI, I believe more attention should be placed on ensuring that organisations will seek to join. The OSI remains well-respected, but I believe that for many of the different constituencies in the open source community it is not entirely clear what the OSI contributes beyond its traditional role of protecting the Open Source Definition and approving associated licenses.

During the launch presentation Simon Phipps made a good case for the increasing relevance of the OSI (such as its involvement in the investigation of the sale of Novell’s patents to CPTN), but the launch was very sparsely attended, even taking into account the business-focused audience and the fact that it was a late addition to the schedule, and media attention for the membership plan has been thin on the ground.

For those already involved in the OSI its importance is self-evident. It is those outside that need to be brought in, however, and I know there are some that will need persuading that the OSI remains relevant if they are to give their time – and also their money – to join.

The expanded membership scheme is likely to require full-time staff that will need to be paid for, and it seems likely that the OSI will need to raise funds either through the affiliate schemes or corporate advisory board.

Deciding the nature and involvement of that Corporate Advisory Board is something that hasn’t been addressed as yet and could pose a significant challenge as different people will have different perspectives about what business entities and strategies are considered acceptable in relation to open source.

For example, during the opening remarks at the OSBC presentation Simon Phipps made reference to companies with “dubious” business strategies attempting to “game” open source.

While I am sure Simon was expressing his own views, rather than those of the OSI, he is clearly not the only person involved in this process with that perspective (hence the concern about subversion).

To be clear, there was no direct suggestion that such companies would not be welcome, but if the new OSI is to “build bridges among different constituencies in the open source community” it will be important to avoid any comments that could be seen to be discriminatory.

Open source means different things to different organisations, such that opening up the membership of the OSI was always going to be a complicated process. The plans for the affiliate programme are well thought out (if arguably overly complex) and it is understandable that corporate involvement has been set aside until the end of the year.

Previous attempts to create a membership/affiliate scheme have foundered, so the OSI board is to be congratulated on its progress so far. Anyone interested in joining the process should start with the slides, which include the relevant links and contact details.

The 451 Take on the Future of Open Source

As previously mentioned, The 451 Group was very pleased to be able to participate in this year’s Future of Open Source Survey. We believe that the results provide critical insight into the wants and needs of end users that will help shape the evolution of vendor business strategies designed to meet the long-term needs of the industry.

The 451 Group’s research has previously shown that the benefits of open source software are many and varied and The Future of Open Source Survey highlights the fact that multiple factors are driving the increased adoption of open source software, including freedom from vendor lock-in, greater flexibility and lower cost.

As part of our involvement in the survey we have produced a report providing our perspective on the results and what they mean for the industry. The report is available to 451 Group clients here, and is also available to non-clients here. The report is free, although you will need to complete a short form to receive the report, which will also give you the opportunity to trial additional 451 Group research.

Time for a new open source definition?

Andrew C Oliver recently wrote “I think most know by now that a license is insufficient to make something actually open source.”

What makes this fascinating is that it involves a director of the Open Source Initiative – the stewards of the Open Source Definition – stating that the Open Source Definition is not enough to define software as open source.

There is nothing surprising in this statement for anyone who has been following open source for some time, however. Over recent years we have observed a growing tendency among some open source advocates to define open source beyond the software license.

Another recent example comes from Greylock partner and former Mozilla CEO John Lilly: “The open source world should not let Android redefine it to mean ‘publishes the source code.’ That’s a different thing,” he stated with reference to Andy Rubin’s attempt to explain Android’s openness.

But who is doing the redefining here?

Nothing is black and white when it comes to open source except source code availability and the license: either the source code is available or it isn’t (which means that Honeycomb is not open source), and either the license meets the Open Source Definition and is approved by the OSI, or it doesn’t.

Everything else – such as the development methodology, the release cycle, copyright ownership, or the associated product licensing and revenue strategy – can be placed somewhere on a spectrum made up of various shades of grey.

It is true to say that the vendors and users adopting software from one end of that spectrum enjoy more of the benefits associated with open source but that doesn’t mean that the software at the other end of the spectrum isn’t open source.

When a person or company ‘publishes the source code’ (using an appropriate license) it *is* open source, and always has been. If that is considered insufficient to make something open source then perhaps the time has come for a new open source definition.

UPDATE – Just to be absolutely clear, I am not suggesting there is anything wrong with the Open Source Definition. What I am suggesting is that if you are trying to define open source using something other than the Open Source Definition, then you need another definition of open source – UPDATE

Necessity is the mother of NoSQL

As we noted last week, necessity is one of the six key factors that are driving the adoption of alternative data management technologies identified in our latest long format report, NoSQL, NewSQL and Beyond.

Necessity is particularly relevant when looking at the history of the NoSQL databases. While it is easy for the incumbent database vendors to dismiss the various NoSQL projects as development playthings, it is clear that the vast majority of NoSQL projects were developed by companies and individuals in response to the fact that existing database products and vendors could not meet their requirements with regard to the other five factors: scalability, performance, relaxed consistency, agility and intricacy.

The genesis of much – although by no means all – of the momentum behind the NoSQL database movement can be attributed to two research papers: Google’s BigTable: A Distributed Storage System for Structured Data, presented at the Seventh Symposium on Operating System Design and Implementation, in November 2006, and Amazon’s Dynamo: Amazon’s Highly Available Key-Value Store, presented at the 21st ACM Symposium on Operating Systems Principles, in October 2007.

The importance of these two projects is highlighted by The NoSQL Family Tree, a graphic representation of the relationships between (most of) the various major NoSQL projects:

Not only were the existing database products and vendors unsuitable to meet their requirements, but Google and Amazon, as well as the likes of Facebook, LinkedIn, Powerset and Zvents, could not rely on the incumbent vendors to develop anything suitable, given the vendors’ desire to protect their existing technologies and installed bases.

Werner Vogels, Amazon’s CTO, has explained that as far as Amazon was concerned, the database layer required to support the company’s various Web services was too critical to be trusted to anyone else – Amazon had to develop Dynamo itself.

Vogels also pointed out, however, that this situation is suboptimal. The fact that Facebook, LinkedIn, Google and Amazon have had to develop and support their own database infrastructure is not a healthy sign. In a perfect world, they would all have better things to do than focus on developing and managing database platforms.

That explains why these companies have all chosen to share their projects. Google and Amazon did so through the publication of research papers, which enabled the likes of Powerset, Facebook, Zvents and LinkedIn to create their own implementations.

These implementations were then shared through the publication of source code, which has enabled the likes of Yahoo, Digg and Twitter to collaborate with each other and additional companies on their ongoing development.

Additionally, the NoSQL movement also boasts a significant number of developer-led projects initiated by individuals – in the tradition of open source – to scratch their own technology itches.

Examples include Apache CouchDB, originally created by the now-CTO of Couchbase, Damien Katz, to be an unstructured object store to support an RSS feed aggregator; and Redis, which was created by Salvatore Sanfilippo to support his real-time website analytics service.

We would also note that even some of the major vendor-led projects, such as Couchbase and 10gen, have been heavily influenced by non-vendor experience. 10gen was founded by former Doubleclick executives to create the software they felt was needed at the digital advertising firm, while online gaming firm Zynga was heavily involved in the development of the original Membase Server memcached-based key-value store (now Elastic Couchbase).

In this context it is interesting to note, therefore, that while the majority of NoSQL databases are open source, the NewSQL providers have largely chosen to avoid open source licensing, with VoltDB being the notable exception.

These NewSQL technologies are no less a child of necessity than NoSQL, although it is a vendor’s necessity to fill a gap in the market, rather than a user’s necessity to fill a gap in its own infrastructure. It will be intriguing to see whether the various other NewSQL vendors will turn to open source licensing in order to grow adoption and benefit from collaborative development.

NoSQL, NewSQL and Beyond is available now from both the Information Management and Open Source practices (non-clients can apply for trial access). I will also be presenting the findings at the forthcoming Open Source Business Conference.

OpenStack: balancing control and community

“the trends appear to be moving away from control and back toward community and collaborative development, which is why The 451 Group has advised that established vendors that rely on controlling open source development projects need to evaluate how they might be able to transition to more collaborative development practices and permissive licensing”
The 451 Group: Control and Community, November 2010

Shifting from control to community is not easy. Recent weeks have provided a number of examples of how the demand for collaborative development from the community can outpace corporate strategy.

A prime example would be the reaction to Google’s decision to withhold the code for Honeycomb until it deems it to be ready for wider distribution.

While Eric Raymond cautioned against over-reacting to this news, Stephen Walli meanwhile provided a timely reminder that retaining too much control can be damaging, specifically how it played a part in the decline of the Symbian Foundation.

If that wasn’t enough, Rick Clark also published his concerns about the OpenStack project and whether Rackspace has overstepped the mark in trying to control the project rather than influence it.

OpenStack is a benchmark in the shift away from control seen in the past few years, since the project was itself born out of a desire to shift the balance towards community-led development.

As we wrote in Control and Community:

“NASA… was formerly using Eucalyptus’ open source cloud platform, but in July created the alternative OpenStack project with Rackspace following a disagreement with Eucalyptus. The exact nature of that disagreement is itself a matter of dispute, but it is clear that it was related to the copyright attribution agreement used by Eucalyptus for external contributions to ensure that it was in a position to continue its open core licensing approach. It is no coincidence that the OpenStack project involves distributed copyright ownership and a non-copyleft license, designed to ensure that the core software will remain open source while providing all participants with equal opportunities to create closed source derivatives and complementary products and services.”

Quite how equal the opportunities for participants are was drawn into question by Rackspace’s recent acquisition of Anso Labs and the launch of its Cloud Builders business providing support and services for OpenStack deployments.

While this was unsettling for some, it should not have been a surprise, as Rackspace had made no secret of its desire to explore commercial support and services revenue opportunities, although it had stated that it had no desire to sell closed source variants.

As we wrote in July 2010:

“The vendor says it does not plan any commercial licensing or products from OpenStack. Its main focus is making cloud computing easier to consume and repeat, although it is anticipating that users will deploy the software on-premises to create their own on-ramp to the Rackspace cloud and is considering providing commercial support for on-premises implementations.”

Even so, the acquisition of Anso gave cause for concern. As Glyn Moody noted:

“Anso Labs held one of the four seats on the OpenStack governance board and three of the nine seats on the project oversight committee. The purchase of Anso by Rackspace means that Rackspace now dominate OpenStack’s governance, three to one, and project oversight, eight to one; the “one” in both cases being Citrix.”

And it is the governance of the project, rather than product or licensing plans, that has raised concern. Specifically, as Rick Clark explains:

“Basically, Rackspace made governance changes without talking to the development community or the sitting governance board. This is extremely problematic for the health of the project… The sad thing here, is that the governing body would have probably approved it with only minor changes. The changes are for the most part good, but the process shows a serious flaw in Rackspace’s thinking.”

As noted above, OpenStack was specifically designed to be more open than the alternative and the organisations behind it specifically chose distributed copyright ownership and a non-copyleft license, as well as a promise of openness in order to shift the balance away from vendor control towards community.

In the context of the wider shift away from control it is interesting to note that these steps are not considered enough and that the call has gone out for even less vendor influence and the creation of a non-profit foundation.

While it is tempting to suggest that Rackspace was not open enough in creating the OpenStack project, another viewpoint might suggest that the result of any level of openness is the demand from the community for more.

It is too easy to assume that in the balancing act between control and community the demand for control is exclusive to the vendor. The vendor is in a privileged position, and must recognise this and act accordingly.

However, we must also recognise that the community is often also seeking control, albeit with the aim of sharing that control between collaborating participants.

The balance between control and community is not simply a matter of balancing between the vendor and developers/users, but between all participants in any collaborative initiative.

When commercial open source goes bad

One of the primary proof-points of the success of open source has been its adoption by previously proprietary software vendors.

In February 2007 The 451 Group’s CAOS practice released its third report, Going Open, which examined the increasing adoption of open source licensing by traditionally-licensed software companies and captured the industry best-practices to ensure a successful transition from closed to open.

Four years is a long time in the software industry and in a few months we will publish a follow-up to Going Open, updating our analysis of the trends and best-practices and revisiting the vendors profiled in the report to see how they have fared following the transition to open source licensing.

As well as examining open source successes it is also important to consider those examples where open source licensing has not delivered the expected results, and the new report will also examine vendors that have “Gone Closed” and abandoned open source licensing efforts.

There are a few examples we are already aware of through our ongoing research. One analytic database vendor recently changed the license of its Community Edition project, abandoning the GNU GPL in favour of a license that would not meet the Open Source Definition (I won’t name them now since I haven’t given them the opportunity to explain themselves).

Another example involves a company set up by some prominent former employees of one of the big names in open source software. The first version was released using an open source license but was never updated, as the company focused all its attention on the closed source version instead.

Meanwhile one of the prominent “open source” systems management vendors appears to have removed all mention of its Community Edition software from its website, while the Community Edition itself has not been updated for 15 months. While the project is not officially “dead” it is, to say the least, “pining for the fjords” and the company in question could be said to be open source in name only.

We are sure there are plenty of other examples of companies that have launched an open source project or “Community Edition” only to decide later that maintaining the project was not in their best commercial interests. The question is, why did these companies fail to benefit from open source licensing, and what can commercial open source companies learn from their experiences?

If you have any examples of dead or “resting” open source projects, please let us know and we will investigate.

Please note, however, that in this instance we are not interested in companies that have simply gone “open core” or adopted copyright contribution agreements. As fascinating as those subjects are (and may well be contributing factors in the demise of a project) we are interested for this report primarily in the discontinued use of an open source license.

451 CAOS Links 2011.02.25

UK Govt goes big on open source. DotNetNuke acquires Active Modules. And more.

# This week the UK Government confirmed that it really is serious about open source software adoption. Mark Taylor reported from the UK Cabinet Office’s Open Source System Integrator Forum, while ComputerWeekly rounded up the latest changes to the UK government’s strategy for open source.

# DotNetNuke acquired social collaboration solutions provider Active Modules.

# Acunu raised $3.6m in series A funding for its Apache Cassandra-based data storage software.

# Canonical and Banshee agreed to disagree on music store revenue.

# SAP’s HANA appliance runs SUSE Linux Enterprise Server.

# Vaadin released a Vaadin Pro subscription with a set of commercial tools, components and support.

# Oracle Technology Network published an article on using Berkeley DB as a NoSQL data store.

# Quest Software is sponsoring the Sudo project.

# LINBIT’s DRBD replication software is now available for Red Hat Enterprise Linux.

# Marten Mickos discussed how open source impacts company culture.

# Openbravo introduced “Agile ERP” with Openbravo 3.

# The Chemistry open source implementation of CMIS is now a top level Apache project.

# The first release of Spring Gemfire integration is now generally available.

# Sencha introduced PhiloGL, an open source WebGL framework.

# A project has been started to create a Qt implementation for Android.

# Karmasphere updated its Studio and Analyst development and analytics products for Hadoop.

# Andy Updegrove discussed best practices in open source foundation governance.

# arstechnica explained why Microsoft was right to ban GPL apps from its app store, and why Apple should do the same.

# Yahoo plans to release its internal cloud computing engine under an open source license.

A graphic example of Microsoft’s relationship with OSS licenses

At first glance there doesn’t seem to be a way to search Microsoft’s Windows Marketplace app store by license, but out of interest, here’s a chart showing the applications on Microsoft’s CodePlex code hosting site, by license.

Make of it what you will.

Updated open source business strategy framework

We have had a couple of queries this week regarding the open source business strategy framework we have used for the last two years or so in our analysis of open source-related business strategies.

The framework has evolved over time based on changing strategies, our research, and feedback from clients (and non-clients), but the last publicly-available version of the framework (to which the query related) was only a work in progress.

Since there is interest in making wider use of this framework – we are pleased to have seen several OSS-related vendors using it to explain their strategy – the easiest thing to do is to publish it here.

To understand how we made use of the framework to analyse the open source-related strategies of 300 software vendors and subsidiaries, see our recently published Control and Community report.

The framework, and a brief explanation of terms, is below. Any comments or suggestions gratefully received.

Software license
• Strong copyleft
Reciprocal licenses that ensure redistributed modifications and derived works based on or including the code must be made available under the same license. For example the GNU GPL and the Affero GPL.

• Weak copyleft
Reciprocal licenses that enable integration with closed source software without the entire derived work having to be made available under the same license. For example the GNU Lesser GPL, the Eclipse Public License, the Mozilla Public License, the Common Development and Distribution License (CDDL).

• Non-copyleft
Permissive licenses that do not place restrictions on code usage, enabling it to be integrated with closed source software and the combined code to be distributed under a closed source license. For example BSD licenses, the X11/MIT license, the Apache License.

• No preference
The vendor commercializes software that combines or utilizes multiple open source software licenses and has no discernible preference.

Development model
• The Cathedral
The source code is available with each software release, but is developed privately by an exclusive group of developers.

• The Bazaar
The code is developed in public, with builds and updates constantly made available on a public forge available to anyone.

• Aggregate
The vendor commercializes software that utilizes a combination of publicly and privately developed software and has no discernible preference.

And separately:
• Community
The software is predominantly developed by a community.

• Vendor
The software is predominantly developed by a vendor.

• Mixed
The vendor commercializes software that utilizes a combination of community- and vendor-developed software and has no discernible preference.

Copyright ownership
• Vendor
The copyright is owned by a single vendor.

• Foundation
The copyright is owned by a foundation.

• Distributed
Copyright ownership is distributed across the individual developers.

• Withheld
The copyright is owned by another vendor.

Product licensing
• Single open source
The software and all associated features are available under a single open source license.

• Multiple open source
The software and all associated features are available using a combination of open source licenses.

• Dual licensing
The software is available using an open source license, or a closed source license.

• Open core
The core project is open source, but a version with additional functionality is available using a closed source license.

• Open complement
Complementary products and services are available using a closed source license.

• Open edge
The core product is closed source, but extensions and complementary features are open source.

• Open foundation
The core product is closed source, but is built on open source software.

• Open platform
Open source software has been used to create a platform for the delivery of software services and Web applications.

Revenue generator
• Closed source license
Either for a version of the full project, or a larger software package or hardware appliance based on the project, or for extensions to the open source core.

• Support subscription
An annual, repeatable support and service agreement.

• Value-add subscription
An annual, repeatable support and service agreement with additional features/functionality delivered as a service.

• Service/support
Ad hoc support calls, service, training and consulting contracts.

• Software services
Users pay to access and use the software via hosted or cloud services.

• Advertising
The software is free to use and is funded by associated advertising.

• Custom Development
Customers pay for the software to be customized to meet their specific requirements.

• Other Products and Services
The open source software is not used to directly generate revenue. Complementary products provide the revenue.
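Since several vendors have used the framework to describe their own strategies, it may help to see it expressed as data. The sketch below is my own illustration, not part of the framework itself: the category values mirror the lists above, while the field names and the example vendor profile are entirely hypothetical.

```python
# The framework's categories as data. Values mirror the lists above;
# the dictionary keys are illustrative names, not official terminology.
FRAMEWORK = {
    "software_license": ["strong copyleft", "weak copyleft",
                         "non-copyleft", "no preference"],
    "development_model": ["cathedral", "bazaar", "aggregate"],
    "development_lead": ["community", "vendor", "mixed"],
    "copyright_ownership": ["vendor", "foundation", "distributed", "withheld"],
    "product_licensing": ["single open source", "multiple open source",
                          "dual licensing", "open core", "open complement",
                          "open edge", "open foundation", "open platform"],
    "revenue_generator": ["closed source license", "support subscription",
                          "value-add subscription", "service/support",
                          "software services", "advertising",
                          "custom development", "other products and services"],
}

def validate(profile):
    """Check that a vendor profile only uses values defined in the framework."""
    return all(profile[k] in FRAMEWORK.get(k, []) for k in profile)

# A hypothetical open-core vendor, for illustration only.
example = {"software_license": "strong copyleft",
           "development_model": "cathedral",
           "development_lead": "vendor",
           "copyright_ownership": "vendor",
           "product_licensing": "open core",
           "revenue_generator": "closed source license"}
print(validate(example))  # True
```

Encoding the framework this way is what makes the kind of aggregate analysis in Control and Community possible: each of the 300-plus vendors becomes one profile, and trends can then be plotted per category and per year.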

Copyright assignment – a little commercial perspective

Gather the pitchforks and light the torches. Hordes of marketing men are gathering, intent on invading the free and open source software village armed with copyright assignment policies and turning everyone into mindless corporate contributors. As Michael Meeks (via LWN.net) has warned, there is “‘a sustained marketing drive coming’ to push the copyright-assignment agenda”. As you read this very post, faceless marketing drones are calling your bosses, spreading pernicious lies about the necessity of copyright assignment policies.

Meanwhile, back in commercial reality, copyright assignment agreements have been a valid, albeit controversial, element of open source software development for decades. They are used by vendors to protect their rights, some would argue unnecessarily, but are neither new nor growing in usage.

If anything, in fact, there is a growing realisation that copyright assignment policies have a negative effect on community development and contributions, and that participant agreements and permissive licensing, which ensure a more equal distribution of rights, are more successful in protecting the rights of both vendors and potential contributors.

We noted over a year ago that copyright control was increasingly being recognised as a core element in open source-related business strategies and added it to the list of categories against which we assessed 300 vendors for our recent Control and Community report.

The results of that research provide some interesting context for the ongoing debate about copyright assignment.

We asked 286 open source software users to express their preference for various copyright ownership options. We were not at all surprised to find that 47% expressed a preference for copyright owned by a foundation, compared to just 6% for copyright owned by a single vendor.

What did surprise us was that just 8% expressed a preference for copyright ownership distributed across the various contributors, since that is the model used for many of the most successful open source projects, including the Linux kernel and the various Apache projects.

We must be careful not to read too much into this, however. Roberto Galoppini is not correct when he states that “respondents don’t like copyright ownership distributed across the various contributors”, but he is correct when he clarifies that “at least they like it no more than copyright owned by a single vendor”.

Apart from anything else, we must take into account the fact that 38% of respondents expressed no preference either way. This is a significant proportion of open source users who don’t care who owns the copyright to the software they are using. Perhaps this perspective may come back to haunt them, but we should not assume that this lack of preference is a vote against vendor-owned, distributed, or foundation-owned copyright.

It is also worth noting that copyright ownership was given an average importance rating of 3.1 out of 5, making it the least important of our five factors influencing open source business strategies, according to open source users. Again, this perspective might be short-sighted, but it cannot simply be dismissed.

Another thing that cannot be ignored is the fact that vendor-owned copyright has been the dominant choice for open source-related vendors for the last ten years.

Our research showed that 50% of the 300 vendors assessed own the copyright to the related open source software project, compared with 28% that are involved with projects with distributed copyright ownership, and just 3% with foundation-owned copyright. The remaining 19% are involved with projects for which the copyright is owned by another vendor or organisation.

While there are valid issues to be raised about copyright assignment policies, it is a bit late to be raising the alarm.

In fact, our research indicates that the formation of vendors around projects for which the copyright is owned by that vendor has been in decline since 2005 (albeit with a slight increase in 2010). By comparison, the formation of vendors around projects with distributed copyright ownership has risen (albeit with a slight dip in 2006) since 2002.

There is no doubt that copyright assignment has its problems in restricting community contributions. Our research indicates that where a vendor owns the copyright for a project, only 29% use bazaar development and just 13% use community development.

That is, in part, why we believe we are seeing vendors re-assess their use of copyright assignment policies and why we have highlighted, as Simon Phipps has, the difference between copyright assignment and participant agreements.

Should developers and vendors alike be wary of copyright assignment agreements? Of course they should. Is there a calculated marketing campaign designed to convince the world of the necessity of copyright assignment? Of course there isn’t.

Open core licensing is free software’s evil twin

Or, why free software advocates love to hate open core

I’ve been trying to figure out why it is that free software advocates are so fixated on the open core licensing strategy and recently came to the conclusion that there is only one explanation: open core is free software’s evil twin.

To clarify I do not believe that open core is evil, but that the relationship between free software and open core is the equivalent of the literary device where two protagonists share certain characteristics (such as general appearance) but have inverted moralities and visual differentiators (usually a goatee beard).

Spock and Evil Spock, image courtesy of Dave Friedel. See also Evil David Hasselhoff, aka Garthe Knight.

If we look at the relationship between open core licensing and free software we also see common characteristics along with diverging moralities.

With regards to the common characteristics witness the fact that the two strategies are united by a dependence on strong copyleft licensing.

According to our recent research on open-source-related business strategies, 67% of vendors utilizing the open core licensing strategy are associated with a project that uses a strong copyleft license. We have also found that 52% of vendors taking a single open source licensing approach use a strong copyleft license.

Looking at it another way we see that 30% of vendors associated with strong copyleft licenses are using open core licensing, while a very similar number – 29% – of vendors associated with strong copyleft licenses are using single open source licensing.

That is where the similar character traits end and the differences begin.

While free software projects utilize strong copyleft to ensure that the software in question remains open, vendors using the open core licensing strategy use strong copyleft licenses, along with copyright ownership, to ensure that only they have the opportunity to take it closed.

While 83% of vendors utilizing the open core licensing strategy are associated with a project for which the vendor owns the copyright, 88% of vendors associated with foundation-owned copyright were using single open source licensing.

Meanwhile 56% of vendors taking a single open source licensing approach were using the bazaar development model, compared to 61% of those taking an open core approach using the cathedral development model.

Similarly 43% of vendors taking a single open source licensing approach were using the community-led development model, compared to 80% of those taking an open core approach using the vendor-led development model.

Finally, while 96% of vendors utilizing the open core licensing strategy generate the largest proportion of their revenue from closed source software, 32% of those associated with single open source licensing generate revenue from support subscriptions, and the same proportion from ad hoc support/services.

The evil twin theory doesn’t explain why the debate is so enduring, however, or why free software advocates seem to be so fixated on open core. That is, unless you add in the theory that the twins are also symbiotically dependent on each other.

You don’t have to look hard for evidence that open core is dependent on free software – the statistics above demonstrate how open core relates to the strong copyleft licensing strategy – but what of free software’s dependence on open core?

It seems to me that in a world where the line between proprietary and free and open source is increasingly blurred advocates of free software are becoming increasingly dependent on open core as the bogeyman to define the line and differentiate the free software approach. Where once ‘proprietary’ was considered the opposite of free, now it is open core that is considered the opposite of open source.

The dependence has gone so far, in fact, that we have seen examples of free software advocates labeling projects and vendors as open core, even when they are not, in order to highlight the benefits of a pure open source approach. Witness Bradley M Kuhn and Alexandre Oliva attempting to pin open core’s goatee beard on Canonical and the Linux kernel respectively.

If we look back at the creation of the term ‘open core’ it was coined in order to provide an alternative to terms such as ‘bait and switch’. In hindsight it was inevitable that the negative connotations would simply be applied to the new terminology.

What wasn’t obvious was how important open core would become to the software freedom movement in articulating the benefits of software freedom. That is why free software advocates love to hate open core, and that is why the open core debate will endure.