Tips on Scaling Open Source in the Cloud


This article was sponsored by Alibaba and written by Linux.com.

After much anticipation, LinuxCon, ContainerCon, and CloudOpen China will soon be officially underway. Some of the world's top technologists and open source leaders are gathering at the China National Convention Center in Beijing. The excitement is building around the discoveries and discussions on Linux, containers, cloud technologies, networking, microservices, and more. Attendees will also exchange insights and tips on how to navigate and lead in the open source community, and what better way than to network in person at LinuxCon China?

To preview how some leading companies are using open source and participating in the open source community, Linux.com interviewed several companies attending the conference. In this segment, Alibaba discusses how to successfully manage scaling open source in the cloud.

We spoke with Hong Tang, chief architect of Alibaba Cloud. Here are some of the insights he shared.

Linux.com: What are some of the advantages of using open source in the cloud?

Hong: I can summarize that in three points for application developers: a shorter learning curve, better security with less hassle, and more resources with increased agility.

First is the shortened learning curve. Developers just want to develop applications when they use open source. They want to focus on their particular application logic and they want to decide what features to develop. They do not want to spend time and effort on managing the physical infrastructure, an aggravation cloud computing eliminates.

Further, developers are aware that many open source products are not easy to set up and configure properly — particularly those running on a distributed set of machines, where the software is much more than a single library you can just link into your application. Managing open source on the cloud lowers the learning curve on those issues for developers.

Also, because there are so many open source options on the cloud, developers can try several and quickly figure out which will work for them. They don't waste time learning how to set up, configure, and use a piece of software only to discover it doesn't deliver what they need. So that's the first big advantage of using open source in the cloud.

The second thing I think is very important is security. Given the openness of open source software, everyone can see the source code, so it's much easier to find the security vulnerabilities in the software. But not all developers are highly focused on security, so sometimes they fall behind on things like applying patches or upgrading to the latest version. This is especially true when the newer version might not be compatible, because then an upgrade may mean reconfiguring everything. The cloud is very helpful with that, since patches and upgrades are automatic.

Also, we have dedicated teams watching the vulnerabilities of all those open source options, and of commercial software as well. We can manage them and protect them at the perimeter, because protection can be applied outside their virtual machines or cloud instances.

Third, running open source on the cloud combines the advantages of both open source and the cloud. Not everything a developer seeks may be available in open source, or the best-of-breed option may be something that is not open source. By using both cloud and open source, developers don't have to restrict themselves to what is within the open source software. They can combine the best of open source with cloud services that open source does not provide yet. We have plenty of those, by the way.

Those are the three reasons I see why running open source on the cloud matters.

Linux.com: What are some of the problems you see in scaling open source on the cloud?

Hong: It's not that there is a direct problem with scaling the adoption of open source on the cloud. We see people using open source and creating applications comfortably on the cloud, and we see pretty good growth of open source options there. But certainly, I think there is a lot we can do to help developers better leverage open source on the cloud. So I wouldn't call it a problem, but there are things we can do to unlock the advantages of open source on the cloud.

The first thing is to make open source more manageable. A lot of the things we talked about previously require integrations between open source and the cloud to deliver that increased manageability. Essentially, we want developers to use open source as managed services on the cloud.

Why is that? Well, if they just repeat what they are already doing and simply put their software, including the open source parts, on the cloud, they’ll probably discover there’s not much difference in running their applications in an on-premises environment or on the cloud. A lot of people doing this kind of application migration essentially mirror the on-premises environment in a cloud environment, but that basically means they didn’t really leverage the advantages of the cloud.

We want to educate developers on how to properly architect applications on the cloud so that they can capture all the benefits.

Linux.com: How does embracing DevOps make a positive difference in scaling properly?

Hong: The key difference between on-premises and cloud environments is that in an on-premises environment, the developer has a fixed set of iron boxes and services, and they have to fit their application pieces into those boxes. Of course, private cloud solutions like VMware or Docker make things a little easier, but they still have a fixed physical infrastructure. Basically, developers follow a fixed deployment plan.

Developers have to think: how many QPS does this application need to handle? How many servers do I need to provision? Further, they think the deployment through and decide the type of servers they want to run the application on, with customizations for memory size, faster disks, or faster CPUs. That's the way they do it: they buy one set of boxes for this application, another set of boxes for that application, and so on.
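The fixed-provisioning arithmetic Hong describes can be sketched as a simple back-of-the-envelope calculation. All the numbers and the headroom fraction below are hypothetical illustrations, not figures from the interview:

```python
# Back-of-the-envelope capacity planning for a fixed on-premises deployment.
# peak_qps and qps_per_server are hypothetical example figures.
import math

def servers_needed(peak_qps: float, qps_per_server: float,
                   headroom: float = 0.3) -> int:
    """Return how many servers to buy for an expected peak load.

    headroom: extra fraction of capacity reserved for spikes and failures.
    """
    required = peak_qps * (1 + headroom)
    return math.ceil(required / qps_per_server)

# e.g. 12,000 QPS at peak, 800 QPS per box, 30% headroom -> 20 boxes
print(servers_needed(peak_qps=12_000, qps_per_server=800))
```

The point of the example is that every figure must be decided up front, and the boxes are then bought for that one application.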

On the cloud, it's different, because there are "unlimited resources" underneath, which means you can get any combination of server specs. If you want high performance, high memory, or high-performing disks, you can get that. And you get exactly what you want with an API call, so there's no gap between the physical infrastructure provisioned and the applications running on top of it. We provide the pieces to do this. For example, there's a component called an elastic scaler that monitors the load on the backend, decides when you need to acquire another server instance for the application, and puts a load balancer in front to hide those details.
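The elastic-scaling behavior Hong describes can be sketched as a simple decision loop. The thresholds and the simulated load numbers here are hypothetical; a real elastic scaler would read metrics from the cloud provider's monitoring API and call its instance API:

```python
# Minimal sketch of an autoscaling decision, assuming average per-instance
# load is reported as a fraction between 0 and 1. Thresholds are hypothetical.

def scale_decision(current_instances: int, avg_load: float,
                   high_water: float = 0.75, low_water: float = 0.25,
                   min_instances: int = 1) -> int:
    """Return the new instance count given average per-instance load."""
    if avg_load > high_water:
        return current_instances + 1   # acquire another server instance
    if avg_load < low_water and current_instances > min_instances:
        return current_instances - 1   # release an idle instance
    return current_instances           # steady state

# The load balancer in front hides these changes from clients.
print(scale_decision(4, 0.9))  # overloaded: scale out to 5
print(scale_decision(4, 0.1))  # idle: scale in to 3
```

Real autoscalers add cooldown periods and scale by percentages rather than one instance at a time, but the monitor-decide-adjust shape is the same.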

We now have what the industry calls serverless computing. With that, you don't have to put this process in that box; you don't have to care where all the processing and storage happen. That's why it's called serverless. Open source also provides some of these capabilities, with systems like HBase and Cassandra, where you don't know, and don't need to care, where a piece of data is stored or where the application's processing happens. So you can see that by leveraging both open source and cloud services, a developer's work becomes much easier and faster with this multitude of options.

Also, on the cloud we have resource orchestration. You can choose resources, label them, and with that spin up a testing version of your services directly. This is also sometimes called agility: you can test more easily at full scale, rather than against mocks.
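The labeled-resources idea can be sketched roughly as follows. The template format below is a hypothetical illustration, not the schema of any particular orchestration service; the point is only that a full-scale test stack is a relabeled copy of the production description:

```python
# Sketch of resource orchestration: a stack is described declaratively,
# and a test copy is spun up by cloning and relabeling the description.
# The template structure and specs here are hypothetical.
import copy

production = {
    "label": "prod",
    "resources": [
        {"type": "vm", "count": 8, "spec": "8-core/32GB"},
        {"type": "db", "count": 2, "spec": "ssd"},
    ],
}

def spin_up_test_stack(template: dict, label: str = "test") -> dict:
    """Return a relabeled copy of a stack description, ready to provision."""
    stack = copy.deepcopy(template)
    stack["label"] = label
    return stack

test_stack = spin_up_test_stack(production)
print(test_stack["label"], len(test_stack["resources"]))
```

Because the test stack is provisioned from the same description as production, testing happens at full scale instead of against a mocked environment.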

All of these capabilities and options bring a different mentality to writing applications targeted for the cloud versus the on-premises environment. Developers who take advantage of them save a lot of hassle in reasoning about the scalability of their components or deciding how many resources they need.

The application can simply scale along with the workload.

Linux.com: Any final thoughts?

Hong: I hope to see many of the people reading this at LinuxCon China. We are working hard every day to engage developers, provide them with new tools, and build services they tell us they want and features that we discover by listening to attendees at conferences like this one. See you there!

This article is sponsored by Alibaba Cloud. Alibaba Cloud, Alibaba Group's cloud computing arm, develops highly scalable platforms for cloud computing and data management. It provides a comprehensive suite of cloud computing services to support participants in Alibaba Group's online and mobile commerce ecosystem, including sellers and other third-party customers and businesses.