August 9th, 2013 — Linux, Software
There’s been an interesting debate on the OpenStack cloud computing project and its API compatibility with Amazon. The discussion and debate over the open source cloud software’s compatibility with cloud leader Amazon’s proprietary APIs was just beginning when the 451 Group released The OpenStack Tipping Point in April. With the advancement of the OpenStack software and community — along with lingering questions about the desired level of compatibility with Amazon’s cloud — the matter is heating up. However, Amazon cloud compatibility is largely a non-issue.
Enterprise customers are focused on solving their computing and business challenges. They typically center on promptly providing their customers and internal users and divisions with adequate resources and infrastructure; speeding application development and deployment; and avoiding so-called “Shadow IT,” which normally involves use of Amazon’s cloud. Read the full article at LinuxInsider.
I’m not the only one with an opinion around here. My 451 Research colleagues have also weighed in on the matter and 451 Research subscribers can view their argument that Amazon API compatibility may be a fool’s errand.
October 5th, 2011 — Software
Red Hat’s $136m acquisition of open source storage vendor Gluster marks Red Hat’s biggest buy since JBoss and starts the fourth quarter with a very interesting deal. The acquisition is definitely good for Red Hat since it bolsters its Cloud Forms IaaS and OpenShift PaaS technology and strategy with storage, which is often the starting point for enterprise and service provider cloud computing deployments. The acquisition also gives Red Hat another weapon in its fight against VMware, Microsoft and others, including OpenStack, of which Gluster is a member (more on that further down). The deal is also good for Gluster given the sizeable price Red Hat is paying for the provider of open source, software-based, scale-out storage for unstructured data, and as validation of both open source and software in today’s IT and cloud computing storage.
This is exactly the kind of disruption we’ve been seeing and expecting as Linux vendors compete with new rivals in virtualization, cloud computing and different layers of the stack, including storage (VMware, Microsoft, OpenStack, Oracle, Amazon and others), as covered in our recent special report, The Changing Linux Landscape.
While the deal makes perfect sense for both Red Hat and for Gluster, it also has implications for the white hot open source cloud computing project OpenStack. There was no mention of OpenStack in Red Hat’s FAQ on the deal, but there was a reference to ongoing support for Gluster partners, many of which are fellow OpenStack members. OpenStack was also highlighted among Gluster’s key open standards participation, along with the Linux Foundation and the Red Hat-led Open Virtualization Alliance oriented around KVM. Sources at both Gluster and Red Hat, who point to OpenStack support being bundled into Red Hat’s coming Fedora 16, also reiterated to me that Red Hat is indeed planning to continue involvement with OpenStack around the Gluster technologies. I suspect Red Hat is looking to leverage Gluster more for its own purposes than for OpenStack’s, but I must also acknowledge Red Hat’s understanding of the value of openness, community and compatibility. Taking that idea a step further, Gluster may represent a way that Red Hat can integrate with and tap into the OpenStack community by blending it with its own community around Fedora, RHEL, JBoss, RHEV, Cloud Forms and OpenShift.
The deal also leads many to wonder what may be next for Red Hat in terms of acquisitions. We’ve long thought database and data management technologies were areas where we might see Red Hat building out. This was also the subject of renewed rumors recently, and we believe it might still be an attractive piece for Red Hat given the open source opportunities and targets around NoSQL technologies such as the Apache Hadoop distributed data management framework and the Cassandra distributed database management software. We’ve also believed systems management to be a potential place for Red Hat to further expand. Given its need to largely stay within open source, we would expect targets in this area to include GroundWork Open Source, which joins Linux and Windows systems in its monitoring and management, and Zenoss, which works with Cisco and Red Hat rival VMware in monitoring and managing systems with its open source software. Another potential target that would increase Red Hat’s depth in open source virtualization and cloud computing is Convirture, which might also be an avenue for Red Hat to reach out to midmarket and SMB customers and channel players. Red Hat was among the non-OpenStack members we listed as potential acquirers when considering the M&A possibilities (451 subscribers) out of OpenStack.
Given its recent quarterly earnings report and topping the $1 billion annual revenue mark, Red Hat seems again to be bucking the bad economy. We’ve written before in 2008 and more recently how bad economic conditions can be good for open source software. Red Hat is atop the list of open source vendors that suffer as traditional, enterprise IT customers such as banks freeze spending or worse, fail. However, the company’s deal for Gluster is yet another sign it is thriving and expanding despite economic difficulty and uncertainty.
You don’t have to just look at Red Hat’s earnings or take our word for it. On Jim Cramer’s ‘Mad Money’ this week, we heard Red Hat CEO Jim Whitehurst praised for Red Hat’s performance and traction where most companies and many economists are throwing the blame: financial services, government and Europe. Cramer credited Red Hat for a ‘spectacular quarter’ and allowed Whitehurst to tout the benefits of the Gluster technology and acquisition, particularly Gluster’s software-based storage technology that matches cloud computing. It was quite a contrast to the news out of Oracle Open World, where hardware was a focal point.
September 28th, 2011 — Software
It’s been some time now that we’ve been talking about devops, the pushing together of application development and application deployment via IT operations, in the enterprise. To keep up to speed on the trend, 451 CAOS attended PuppetConf, a conference for the Puppet Labs community of IT administrators, developers and industry leaders around the open source Puppet server configuration and automation software. One thing seems clear: given the talk about agile development and operations, cloud computing, business and culture, our definition of devops continues to be accurate.
Another consistent part of devops that also emerged at PuppetConf last week was the way it tends to introduce additional stakeholders beyond software developers and IT administrators. This might be the web or mobile folks, sales and CRM people, security professionals or others, but it is typically about applying business operations methodology to applications and IT, thus bringing in more of the business minds as well. The introduction of additional stakeholders was also a theme we heard from Puppet Labs CEO Luke Kanies in his keynote address. Kanies then discussed how the community was working to make Puppet the ‘language of operations,’ which it arguably is, alongside competitors Chef from Opscode and CFEngine, when it comes to devops implementations.
There was another interesting point on the PuppetConf stage from DTO Solutions co-founder and President Damon Edwards, who said devops should not be sold as a way to achieve cost savings, but rather as something that will bring return on investment (ROI). This is similar to the shift of open source software drivers we’ve seen in the enterprise, which are sometimes changing from cost savings and time to factors of performance, reliability and innovation.
Later in the conference during his keynote, Eucalyptus Systems CEO Marten Mickos also had some interesting observations concerning devops, which he described as managing the cloud from both sides. One of his points was that developers have the most to learn about operations. While I would agree to some extent, this statement is interesting when considered alongside my contention that most of the change in devops is happening on the IT administrator and operations side. Later in an interview, Mickos elaborated on his devops thinking, indicating the experts who orchestrate applications in cloud computing — both developers and admins — must understand the entire lifecycle and environment. Continuing our comparison of devops to open source, Mickos indicated the open source MySQL database that he helped usher into the enterprise was disrupting old technology, while devops is innovating new technology.
While it remains early days for devops in the case of many enterprise organizations, we continue to see and hear signs that devops practices, technologies, ideas and culture are making their way into more and more mainstream enterprise IT shops. While we expect devops practices to be implemented by many enterprises based on utility and need to leverage cloud computing, we see a higher level of awareness and engagement from leadership and executives than we did with open source software. This means we expect uptake of devops to happen more quickly and to generate more revenue and opportunity.
May 26th, 2011 — Software
It’s coming up on a couple of years since I wrote about the reasonable approach toward open source software adoption put forth by the U.S. Department of Defense, which was ready and willing to use open source, but was not requiring a less-realistic all-open source or only-open source approach.
Today, we see that measured consideration of open source and its adoption has served the DoD well, given it just published a guide (PDF) regarding its experience with policy and adoption of open source software. This provides a valuable lesson to enterprise organizations considering use of, participation in, or increased adoption of open source software. Based on our findings that more than 60% of open source users and customers have no policies or guidelines for contributing to open source software (November 2009 survey of 1,711 open source users and customers), it is also needed.
Some highlights from the guide, titled ‘Open Technology Deployment – Lessons Learned and Best Practices for Military Software,’ which was nicely released under the open source Creative Commons Attribution ShareAlike 3.0 License, include:
*The guide begins with a nice explanation of ‘off-the-shelf’ software, a common phrase for commercial software purchased/procured by governments, as well as explaining open source software.
*It also walks through some of the fundamentals of open source that are often overlooked or lowered in priority in favor of cost, flexibility or other advantages of open source. This includes intellectual property rights, reuse, governance, forking and licensing.
*In addition to some more technical, government-related infrastructure needs and demands, the guide does a wonderful job covering some of the more aesthetic components of open source software development and community management, including the need to be inclusive, avoiding private conversations, practice of conspicuous code review, awareness and communication of roles and dealing with rude or poisonous members of communities. One key phrase in the report that sums up the wisdom here: ‘Community first, technology second.’
*Interestingly, the guide touches on practices associated with ‘devops’ – the confluence of application development and application deployment via IT operations. In particular, it focuses on continuous development and delivery, more rapid development and release cycles, testing, transitioning to operations and maintenance. This is another indicator of how significant open source software can be to devops, and also of how pervasive the trend is becoming.
*Finally, the guide cuts through some FUD that may persist in some circles and verticals, including the public sector, regarding open source software, indicating nearly all open source software is backed commercially and available as commercial off-the-shelf (COTS) software, an important classification for government adoption. The guide also differentiates open source from freeware and shareware, which are often limited both perceptually and legally in government use.
The DoD guide — which similar to its memo on open source a year and a half ago represents a pragmatic, realistic approach to adopting and using open source software — is also another indicator of the drivers, advantages and challenges of open source software, which have typically been about cost, flexibility and avoiding vendor lock-in. We are tracking changes in those drivers, advantages and challenges as well with our take on the recent Future of Open Source Survey.
It’s encouraging to see this happening with the DoD and government, which has long had procurement, procedure and policy that was typically mismatched for open source software. The situation has now changed with vendors providing more support, certification, listings and adjustment to government adoption and use. Governments, led by organizations such as the DoD, are also adapting their way of doing things so that open source, cost savings, collaboration, avoiding vendor lock-in and all of the other benefits of open source are things they too can leverage.
My coverage of the first DoD memo on open source software in October 2009 also included the idea that this policy was taking shape amid more official, above-board adoption of open source software by both governments and enterprises. This means that rather than sneaking into organizations through developers, administrators, teams and divisions — largely under the door and through the cracks — open source software is now being adopted as part of official procurement and use. This trend, which we see continuing, also means a larger opportunity for paid support, services, components and other products from vendors focused on open source software.
December 21st, 2010 — Software
There was a recent skirmish about open source software in the enterprise regarding a contention that open source is not really used by big business, which was refuted, naturally, by open source vendors. Nevertheless, my experience among not only vendors, but also investors and particularly large enterprise end users, is that open source is typically atop the list of priorities, strategies and options. Granted, I’m an analyst primarily covering open source software in the enterprise, but I have many conversations with non-open source companies, and the end users with whom I speak are focused on open source among many other things.
I wrote about big companies using open source last year, and today I find that most companies, whatever vertical or industry, are leveraging open source software in one way or another, whether infrastructure software and operating systems such as Linux, middleware where we see Apache Tomcat and Red Hat JBoss going strong or applications, where every category has open source options, and most categories have paid, commercial open source options.
I am currently repeating a theme that I came up with when economic conditions were driving the use of open source, including paid use, mission-critical use, production use and, yes, big business use. The theme is this: a few years ago, enterprise organizations might say they were not using open source or did not want to use open source for this reason or that reason, and it was probably accepted as somewhat reasonable. Flash forward to today, and the commercial support and credibility of open source have evolved amid a drive toward open source alternatives from economic conditions. Thus, to say that an organization avoids or bans open source software today is tantamount to saying that organization does not save money, does not do things efficiently and is not progressive. There may be those who continue to believe that the use of open source is still relegated to geeky development or IT operations teams, or that it is limited to test and dev projects, but it has already made inroads into production. Whether the leadership of big business knows it or not may be another matter.
June 9th, 2010 — Software
The 451 Group has published another open source strategy Spotlight report, this time turning our attention to longtime Linux server vendor Hewlett-Packard, which continues to dedicate resources to Linux and other open source software communities, but which also has a lower profile than others known for their open source contributions.
HP has long been a big supporter of Linux and other open source software, particularly through its testing, certification and support of Linux on its ProLiant x86 and now Integrity IA-64-based servers. But despite its top market position, the company has also historically been overshadowed by others similarly supporting Linux and open source.
HP’s work with Linux centers on enabling, qualifying and supporting Linux on hardware and with its management software, and this may help explain why its open source contribution is sometimes viewed as more self-serving than for the open source community. Still, with its contributions to the Linux kernel in architecture, virtualization, security, filesystems and hardware device drivers, the company’s support and contribution have a significant impact.
In addition to numerous printer drivers contributed and embedding open source in more than 200 of its own products, HP is responsible for initiating more than 3,000 open source projects, providing more than 200 open source tools, utilities and libraries, paying 300 Linux developers and supporting some 2,500 Linux and other open source developers. The company also works with Intel and Yahoo on the Open Cirrus cloud testbed and partners with other open source players, including Cloudera, which bases its business on the Hadoop data management framework, and Canonical, distributor of Ubuntu Linux. Although HP cannot support every flavor of Linux (it estimates there are more than 700 of them), the company offers arguably the largest range of enterprise Linux support, spanning from unpaid community Linux such as CentOS to enterprise subscription versions such as Red Hat Enterprise Linux (RHEL) and SUSE Linux Enterprise Server (SLES).
HP recognizes that users and customers – in financial services, insurance, telecommunications, healthcare and among other early adopters – no longer need to be convinced about Linux. What they need now is guidance on adapting their strategy and effectively incorporating Linux and other open source software.
More is available in the HP Spotlight report, which is available to existing 451 Group clients. Non-clients, as always, may apply for trial access via the same link.
May 26th, 2010 — Software
Just when you thought open source and its licensing were getting a bit dull (okay, that will probably never happen) … Sure, the GPL is giving up some of its dominance. OEM, embedded, mobile and other expansion areas for open source are keeping open source licenses relevant, as are virtualization and cloud computing, and these are all areas where open source licenses such as the AGPLv3 hold both promise and burden, depending on who you ask. It’s clear open source licensing is heating up again as a topic and as we assess what is really open and what is really not.
Matt recently asked about Google’s newly announced WebM – whether it is open source and what this tells us about the open source license definition and approval process. WebM, a Web video format that is available for free, is intended as open and even open source, but it is not actually licensed under an OSI-approved open source license, thus making it fall short of the definition of open source.
We may see Google get that OSI approval. It’s certainly not out of the ordinary, and even Microsoft has successfully lobbied and certified some of its own licenses as open source. However, for the time being, WebM falls under the category of ‘not open source,’ and I believe it reflects Google’s challenge of getting open enough. On the other hand, Google’s Android OS, which is also backed by a broad consortium of other software, hardware, wireless carrier and other players, is sometimes criticized or questioned on its openness, particularly amid its recent progress. The fact of the matter is the kernel and core of the OS is based on Linux, and the OS itself is licensed under the Apache License 2.0, one of the top open source licenses we discuss in our report, The Myth of Open Source License Proliferation, and one we see gaining use and prominence.
‘Open enough’ is another topic we’ve discussed on the CAOS Theory blog before, but I believe we are seeing cases of non-open source software, such as Amazon’s APIs for EC2 and its cloud computing services, being open and available enough in many regards. Yet the fact these are not open standards and not open source brings persisting concerns about what the future might hold. This also highlights how lock-in, which we saw fade to some extent as a factor driving open source, is becoming more significant again. Although there has been an evolving acceptance of some lock-in, particularly as the debate has moved to open data, many early and established cloud computing users worry about having a single source for their infrastructure and services (vendor and product shutdowns, consolidation and rigid roadmaps are among the legitimate customer fears). In response, many are looking to ‘alternative’ software pieces and stacks for their private and hybrid cloud computing endeavors, and these are frequently, if not mostly, open source.
Back to the licensing matter, we’re also seeing some friction on software licensing from virtualization and cloud computing, where the wants and needs of suppliers and consumers do not necessarily align. In terms of open source, this dilemma shows how flexibility and leverage — either with the vendor or with the software itself given the ability to access source code and build on it or influence its development — can help set open source apart as users contemplate their licensing and deployment strategy. Still, there are also challenges that come with open source software licensing, such as requiring the sharing of code and modifications and limited use of the open source code in combination with other software and in other products.
All of this highlights the ongoing need and importance of the OSI and broader industry definition of open source and its licenses, particularly as open source continues to blur and blend with non-open source in mobile and other electronic devices, virtualization, cloud computing and elsewhere.
May 19th, 2010 — Software
LinuxCare, which recently relaunched a new cloud computing-based Linux services business, frankly represented a lot of the Linux support business, promise and opportunity that never quite lived up to the hype and expectations. LinuxCare, which suffered from a lack of leadership and execution, later became Levanta, and we eventually questioned its Linux-only approach in an enterprise IT world increasingly made up of mixed-OS deployments. Levanta shut down, along with some other missed systems management efforts, in 2008.
The lack of novelty and uniqueness about Linux continued, and as we saw with LinuxWorld 2008, Linux had become so well ingrained in enterprise IT that it truly seemed nobody cared. Like Levanta, LinuxWorld is now gone.
So why would now be the right time for another go at the Linux support business? I believe the answer lies in the same response I’ve been offering a number of users, vendors and clients: cloud computing. We began watching more closely the use of Linux, including unpaid community Linux, in cloud computing a couple of years ago with our report, The Rise of Community Linux. Last year, we continued to track the use of Linux, and again community Linux, in cloud computing as we were still hearing about the use of both paid versions such as Red Hat Enterprise Linux and SUSE Linux Enterprise Server, but also community versions such as CentOS, Debian, and Ubuntu, which is growing both its paid and unpaid use in cloud computing environments.
We’ve also seen cloud-specific versions and vendors, such as CloudLinux, which typically aim their cloud-tuned Linux software and support at specific verticals and industries. For CloudLinux, as well as for the major Linux vendors and others, that specific industry is hosting and other service providers moving to the clouds.
So a LinuxCare relaunch with a focus on supporting Linux for cloud computing infrastructure, applications and services makes some sense, and also highlights the continued business, benefits and opportunities of Linux as opposed to competing operating systems.
November 30th, 2009 — Software
Every once in a while, we are reminded of years past and some of the old attitudes that used to be popular, but have lost credibility over time. The latest comes via a European CIO for GE, who reportedly describes open source as largely relegated to ‘playground’ development and a ‘huge risk’ in mission critical applications and uses.
First off, I would question how aware this CIO is of his company’s use of open source software, including but not limited to Linux, in production, mission-critical environments and applications. In fact, I have heard GE named by at least two open source software vendors I’ve talked with, and the deployments were most certainly not limited to any internal development or ‘playground’ setting. I also wrote about GE as one of the big brands that is using open source.
Second, we have seen a dramatic shift in the risk/benefit outlook on open source software in the enterprise. Driven largely by difficult economic conditions, tightened budgets and time and business pressures, customers who were previously unsure about open source software are now willing to live with that uncertainty and risk to give open source a chance. Our recent survey of open source software users and customers reinforces the idea that open source is now more positively associated with cost savings than it is negatively associated with risk. So far, open source seems to be largely passing the test, meeting or exceeding cost-savings expectations for nearly 90% of our more than 1,700 survey respondents.
Further evidence of open source acceptance and use can be found in a recent survey from Black Duck Software, which found that 22% of code in each of more than 170 sampled projects and applications is open source. Which projects and applications, specifically, and were they mission-critical? The list speaks for itself: voice applications, video applications, financial software, IT infrastructure, Web sites, customer relations applications, embedded solutions, desktop applications, business process management, mobile infrastructure and handsets, e-commerce markets, defense electronics. Most if not all are arguably mission critical, and most developers and administrators familiar with today’s markets and datacenters will tell you there is plenty of open source software throughout.
Still, there is room for nostalgia and remembering the days when open source software really was viewed as a ‘risk’ for production and mission-critical use. Let’s not forget, though, that those days are long gone.
September 7th, 2009 — M&A, Software
Matt Asay has written an interesting post speculating that Oracle might use the delay caused by the European Commission investigation into its acquisition of Sun to drive the price down. Sounds reasonable enough to me.
In it, Matt makes a couple of statements, one I agree with: “Oracle… likely will prove to be a better manager of this asset than Sun was”; and one that I have real doubts about: “MySQL’s… doing just fine, thank you”.
MySQL might well be doing fine. Unfortunately Sun’s financial results don’t actually provide any evidence either way.
Billings for the MySQL/Infrastructure segment were up 51% to $313m in FY09, according to information presented with Sun’s financial results, with revenue hitting $100m (up 10%) in Sun’s fourth quarter.
That sounds pretty good, but it is unclear how much of that was attributable to MySQL, and how much to “Infrastructure”. What we do know, based on Sun’s prior financial information, is that whatever Infrastructure is, it delivered revenue of $198m in FY07, the last full year without the contribution of MySQL.
(It was unclear to me at first whether revenue from MySQL had been back-dated into these figures, so I checked with Sun earlier this year – it hadn’t).
Earlier this year Roy Hann speculated that the net contribution to billings by MySQL was roughly $63m during its first four quarters under Sun, “assuming the billings for ‘infrastructure’ didn’t take a massive tumble”.
It is possible that they did, of course. The first quarter (3Q08) that MySQL was added to Sun saw MySQL/Infrastructure revenue plunge 20% YoY to $40m, which would imply that Infrastructure took a nosedive, at least in that quarter. My point is, we just don’t know.
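To make the uncertainty concrete, the year-ago figures implied by the reported growth rates can be back-computed. A quick sketch (using only the figures cited in this post; Sun never broke MySQL out separately, so none of this isolates MySQL’s standalone contribution):

```python
# Back-of-the-envelope figures implied by Sun's reported growth rates,
# all in $m. These derive only from the numbers cited above.

def prior_year(current, growth_pct):
    """Value a year earlier, given the current value and YoY growth in percent."""
    return current / (1 + growth_pct / 100)

# FY09 MySQL/Infrastructure billings: $313m, up 51% -> implied FY08 billings
fy08_billings = prior_year(313, 51)      # ~$207m

# 4Q FY09 revenue: $100m, up 10% -> implied 4Q FY08 revenue
q4_fy08_revenue = prior_year(100, 10)    # ~$91m

# 3Q08 revenue: $40m, down 20% YoY -> implied 3Q07 revenue
q3_fy07_revenue = prior_year(40, -20)    # $50m

print(round(fy08_billings), round(q4_fy08_revenue), round(q3_fy07_revenue))
```

Note that implied FY08 billings of roughly $207m sit only modestly above Infrastructure’s $198m of FY07 revenue, which is exactly why the combined segment tells us so little about MySQL on its own.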
Zack Urlocker recently claimed that MySQL sales inquiries were growing nicely (although I can’t find the original tweet), but those outside Sun don’t know from what baseline, or to what level. Mind you, those outside MySQL/Sun never knew what MySQL’s revenue was in the first place.
And we haven’t even mentioned the state of health of MySQL’s development/sales teams.
August 24th, 2009 — Software
I recently attended Open Source World, concurrent with Cloud World in San Francisco and naturally, a good place to converse on and consider the intersection of open source and cloud computing and what it means for vendors, customers and the software. I found a general consistency among views on this, but did hear some surprising input as well. Below are some points and perspectives I heard while attending the conference, with some of our thoughts on those views included.
*While IT staff may be reluctant to let go of control (and embrace cloud computing), the fact of the matter is they can’t keep up with the business side of enterprise organizations and strategies. This continues to be the case for many organizations that turn to both open source software and cloud computing as tools to allow them to keep pace. Interestingly, open source software seems to be more of an established pillar, with some trepidation around cloud computing but familiarity with, and general fondness for, open source software. Open source does represent cost savings and flexibility, and when it intersects with cloud computing, this includes avoiding vendor lock-in.
*Since we’re always asking about how vendors and customers typically roll out cloud computing initiatives and infrastructures, I heard a lot of talk about the cloud computing customer process. Many vendors reinforced the idea that open source has paved the way to the cloud, a topic covered here on the CAOS Theory blog by Matt. Consider Amazon’s use of the Xen open source hypervisor at the core of its EC2 cloud offerings as a prime example. We’re also hearing more and more about internal cloud use, public cloud use and hybrid scenarios where organizations look to their own, private resources first, then use Amazon, Google or other cloud options for higher scale and heavy lifting. Many organizations don’t want to compete with the initial, large cloud players as much as they want to emulate and leverage them. A lot of folks also referred to cloud computing and its potential to take virtualization beyond the hypervisor and beyond the server so that applications, databases and data are all abstracted.
*I came to the conference thinking we’re not yet seeing much in the way of actual deployment of internal cloud infrastructures, based on previous vendor and customer conversations, but heard indications it may be more of a case of early adopters not wanting to disclose their cloud plans and architectures. This is a familiar theme in open source software, where early adopters and major users can often be shy about their use of open source software for competitive or other reasons. One vendor reported that 9 out of 10 customers are going ahead with some sort of internal cloud plans. We also heard about new cloud computing providers who are picking their customers, rather than pitching to whole swaths of customers.
*Cloud computing, similar to Linux and other open source software, is also clearly emerging as a major opportunity for hosters and service providers, as well as vendors that cater to them.
*We also heard that some large IT users, such as those in academia, are accustomed to using large cluster and high-performance IT infrastructures, so they are likely to be among the early cloud customers and users. There is also experience in building such infrastructures among academia and newer, emerging companies such as Canonical.
*It was not all cloud computing cheerleading at the conference, as one meeting revealed some concerns that cloud computing will not sufficiently meet high-performance computing (HPC) needs. However, there was quickly a response that HPC can fit with cloud computing as it does with grid computing, where a grid infrastructure can still use reinforcement from the cloud during spikes, for example.
*There was also an ongoing theme of mixed environments, and it seems the trend will continue in the clouds, where we expect to see proprietary and open source pieces used together frequently. Open source and proprietary software used to often be a choice of either/or, but with both increasingly deployed and supported together, the choices grow and the customer gains more control. We see advantages, such as licensing costs and flexibility, in open source software and truly open standards, but we also realize there are preferences and legacy ties for proprietary software as well.
July 1st, 2009 — Software
There has been no shortage of lively discussion on open source software licenses, with recent shifts in the top licenses, perspectives on the licenses (or lack of them) for networked, SaaS and cloud-based software, the increased prominence of a Microsoft open source license and concern over the openness (or closedness, depending on your perspective) of the latest devices. Amid all of it, we’re pleased to present our latest long-form report, CAOS 12 – The Myth of Open Source License Proliferation.
In the report, we cover how the spread and structure of open source software licenses has indeed led to some proliferation, but rather than being a bad thing for the enterprise, we believe the variety and abundance of open source licenses has enabled broader enterprise use of open source. Furthermore, there has been an evolutionary natural selection of the most popular open source licenses, with the GNU GPL family, BSD family, Artistic, Apache and MIT licenses dominating both open source software hosted on repositories and open source software in use, according to vulnerability reporting and analysis from Airius Internet Solutions. Another key finding in CAOS 12: vendors such as Sun Microsystems and IBM are contributing to license consolidation — retiring open source licenses in Sun’s case and, for IBM, superseding the Common Public License with the Eclipse Public License, which, similar to the Mozilla Public License, is growing in the types of software it covers and in popularity, particularly given mixed licensing within open source.
The report also carries on the themes of increased open core models, whereby open source software and licensing is combined with commercial licensing, that we covered in CAOS Nine – Open Source is Not a Business Model, as we consider how the need to generate revenue and reward investors can impact decisions on open source licenses. The report also identifies where different open source software licenses are most prominent, both in terms of the layer of the enterprise software stack and types of environments, from mobile and embedded software to SaaS environments to cloud computing.
Despite some recent doubts about it, we see GPLv2 still widely popular beyond its most prominent projects, Linux and MySQL, which nonetheless help bolster its significance. Still, it is a onetime favorite that may be fading, as it is used less in new projects, which opt instead for more modern terms and coverage from GPLv3, AGPLv3, CPAL or other open source licenses. There is no question that GPLv3, by contrast, is on the rise and, despite not addressing what is commonly known as the ASP, network or SaaS loophole in GPLv2, is generally viewed as more modern. However, there is still strong resistance to GPLv3, particularly outside of the U.S., where we see the European Union turning to its own EUPL for more appropriate language and license coverage. This puts EUPL on our CAOS 12 list of licenses to watch, and another interesting license that joins it there is AGPLv3, which we’ve covered on the CAOS Theory blog before. As covered in the report, while AGPLv3 has failed to gain the same level of support and traction as its cousin GPLv3, it is the open source license of choice among some interesting new cloud plays, such as 10gen and Enomaly, which we’ve also covered here. If a project or vendor can demonstrate development, distribution or collaboration advantages from AGPLv3, we believe it could lead to a broad embrace of the license in the enterprise. We should point out, however, that this has yet to occur; at present, AGPLv3 is often viewed as onerous, to the extent that Google does not support the license in its Project Hosting.
With implications for vendors, both open source and proprietary competitors, for investors and for end users and customers of enterprise open source software, CAOS 12 is also intended as a guide to which open source licenses are most popular and appropriate, and why, for the many enterprise uses of open source software, whether in development, infrastructure, middleware or applications. Looking ahead, we don’t see the most popular open source license list changing much, as vendors tend to stick with the one or two licenses that suit them and rarely change. However, there will be some interesting jockeying among those top dozen licenses. The emergent models of virtual appliances, SaaS, virtualized and cloud environments will certainly impact license decisions and direction, but things will most likely follow the evolutionary path that open source licenses have traveled thus far.
May 1st, 2009 — Podcast
Topics for this podcast:
*Oracle-Sun roundup on open source
*Canonical’s Ubuntu 9.04 hits servers, PCs, netbooks
*New cloud plays look to open source
*Is SMB and midmarket opportunity growing?
iTunes or direct download (26:59, 6.2 MB)
April 23rd, 2009 — Software
One thing that seems clear in cloud computing right now — the combination of operating system, hypervisor, clustering, applications and other cloud infrastructure components in the mix is creating some interesting competition. We’ve written before about the fight over the OS role and its relevancy as hypervisor vendors race to cover the OS parts, OS vendors race to cover the hypervisor parts and so on. We’re now seeing a similar battle in cloud computing, where open source looms large, as other vendors step up to the opportunities.
VMware’s announcement of vSphere, billed as ‘the industry’s first cloud operating system,’ is a good example of how this fight continues. More specifically, VMware says vSphere 4 is ‘the first operating system for building the internal cloud, enabling the delivery of efficient, flexible and reliable IT as a service.’ In her report on vSphere, our own 451 Group Research Director Rachel Chalmers highlights the automation and self-service tools of vSphere, but also points out much of this doesn’t really arrive until VMware’s vCenter Suite later this year. In addition, while vSphere offers a good virtualized abstraction layer and tools to build internal clouds on commodity hardware, Xen is free and open source and is broadly used for cloud infrastructure.
In terms of the ‘first cloud OS,’ I think Ubuntu Linux vendor Canonical may beg to differ. The company’s Ubuntu Linux is already a popular choice for cloud deployment thanks to its free availability and lack of the licensing royalties that can quickly cancel out cloud cost advantages. With its latest release this week, Ubuntu 9.04, Canonical is doing more to back this cloud deployment of Ubuntu. Its new server version is tuned for the cloud, primarily thanks to incorporation of the Eucalyptus open source clustering and cloud infrastructure software. Similar to VMware, which is offering bits and pieces of full functionality that is yet to come, Canonical offers a ‘preview’ of Ubuntu Enterprise Cloud in 9.04. For its part, Canonical bills UEC and Ubuntu as ‘the first commercially-supported distribution to enable businesses to build cloud environments inside their firewalls.’ It is also worth noting Ubuntu Server Edition 9.04 will itself be available on Amazon’s Elastic Compute Cloud (EC2).
These are certainly a couple of interesting ‘firsts.’ It is also perhaps the first time these two vendors are competing so directly. We should probably expect to see more of this as a variety of vendors large and small from different software areas push into the clouds.
March 18th, 2009 — Software
UPDATED – I had to update this post after a conversation with RightScale founder and CTO Thorsten von Eicken and for Sun’s Open Cloud announcement, which are both now included below.
There has been some substantial technology and news regarding open source software in cloud computing lately. More proof that open source is reaching into nearly all aspects of enterprise and broader IT, and also reinforcement of the idea that open source software will continue to have a pervasive and disruptive impact on the way organizations of all shapes and sizes do their computing and deal with their data.
First up is RightScale, which as detailed by 451 colleague and Principal Analyst William Fellows, is up and running across the pond on Amazon’s EU EC2. As WiF reports, RightScale started with Red Hat Linux clone CentOS, but is seeing demand and traction among its users with Canonical’s Ubuntu Linux, which it recently began supporting in full. Our report also highlights Ubuntu packaging and integrated AWS-compatible Eucalyptus APIs. For its part, RightScale says its cloud infrastructure now includes cloud-ready ServerTemplates for Ubuntu — pre-built templates for common cloud configurations.
In my recent conversation with RightScale founder and CTO Thorsten von Eicken, he indicated that as ISVs and others contemplate how to publish, sell, support and monetize applications in the cloud, they can benefit from the lessons and advantages of open source software. von Eicken and I agree that open source represents a different usage and payment model that is more conducive to cloud computing than traditional software licensing and payment models.
Next up is my own coverage of Cittio and its initiation of Project Zeppelin to create a standard, open agent and open source instrumentation for cloud monitoring. One of the most interesting aspects of Zeppelin is its intent to provide a standard way to compare clouds — both public ones from Amazon and others and internal deployments — and match applications to infrastructure by looking at discovery, monitoring, evaluation and auditing data. Monitoring of the clouds is also a place we see Hyperic, the most cloud-centric of the systems management and monitoring vendors centered on open source.
We’re also hearing a lot about the Apache Hadoop Project, most notably the new commercial play around it – Cloudera (covered recently in Matt’s latest CAOS Links and late last year in a blog). With Hadoop in use at places such as Facebook, Google and Yahoo! and recent $5m in funding from Accel Partners and others, the company certainly has some opportunity that is not pie in the sky. Indeed, Hadoop, which is also a focus for Cittio, and Cloudera are all further evidence of how real open source software is for cloud computing.
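For readers unfamiliar with why Hadoop matters here, its core abstraction is the MapReduce pattern: a map phase emits key-value pairs in parallel across nodes, and a reduce phase aggregates values by key. As a rough illustration only (plain single-process Python standing in for Hadoop’s distributed Java API; the function names are my own), the canonical word-count example looks like this:

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Shuffle/reduce: group pairs by key and sum the values,
    as Hadoop would do across many worker nodes."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["the cloud is open", "the source is open"]
print(reduce_phase(map_phase(lines)))
# → {'the': 2, 'cloud': 1, 'is': 2, 'open': 2, 'source': 1}
```

The appeal for the Facebooks and Yahoos of the world is that because map tasks are independent and reduction is per-key, the same logic scales out across commodity machines with no changes to the application code.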
Although it may be getting lost in the noise around the potential IBM-Sun acquisition rumors, Sun Microsystems made a significant cloud announcement involving open source, as well. With its release of the Sun Cloud aimed at ‘developers, students and startups,’ Sun is relying on several open source components such as Java, MySQL, OpenSolaris and Open Storage.
So while many Linux and open source fans and followers have, unfortunately, grown used to hearing about Linux in this or open source in that when it turns out to be just for the buzz and attention created by those key words, Linux and open source in the clouds is more than mere mist.
October 19th, 2008 — Software
Microsoft continued its moves to make its Windows OS and other software more supportive and integrated with open source last week, releasing Web Application Installer software to facilitate development and use of popular Web applications, including open source software such as DotNetNuke web application framework, Drupal content management software, osCommerce e-commerce software and WordPress blogging software.
The release and Microsoft’s statements and stance are being viewed as both supportive and detrimental to open source. While I would agree developments such as these continue to blur the line between what is, or is not, an open source vendor, I do not agree with Microsoft’s contention that all software players are becoming ‘mixed-source’ companies. Sure, vendors and users seem to care less about whether the software they use, support, sell and pay for is open source or not, but those using open source to make products move faster and cost less, such as Red Hat, continue to differentiate themselves based primarily on open source.
I believe that Microsoft’s earnest intent is to make open source on Windows, ASP.Net and Silverlight as simple and supported as open source on Linux and Apache infrastructure, following on its previous movement toward open source. Would Microsoft benefit from making these newly-supported open source pieces and products less efficient or integrated? Would it benefit from seeing them sway toward proprietary licensing and development? I don’t think so, since the company could already create that without open source. No, I believe Microsoft is genuinely looking to provide support for open source software as good as, if not better than, anyone else’s. Consider Drupal creator Dries Buytaert (also CEO of Drupal-based startup Acquia) and his excitement at the prospect (even though he couldn’t test it, since he didn’t have a Windows computer). Sure, Microsoft may be less inclined to offer its support in more competitive areas, such as the OS with Linux or office software against OpenOffice.org. However, even in those cases we see recognition of reality and thus collaboration, integration and support for open source from Microsoft.
For those who fear Microsoft’s talk of mixed-source means the company is looking to muddy the waters, consider a couple of things. First, there is a whole range of trends, issues and places — virtualization, SaaS, cloud computing, systems management, SOA, Web services, etc. — where the mixed-source mantra is getting pushed along quite well regardless of Redmond. Second, I have argued before that Microsoft’s involvement in open source will rightfully draw the full and complete scrutiny of open source supporters, thus providing some good vetting of involvement by Microsoft and other proprietary players. In the end (and after several more confrontational approaches have failed), I think Microsoft may end up providing a significant paradigm for how proprietary software companies can successfully confront and coexist with open source.