Paulus Schoutsen created Home Assistant in 2013 “as a simple script to turn on the lights when the sun was setting,” as he told attendees of his recent Embedded Linux Conference and OpenIoT Summit presentation, “Automating your Home with Home Assistant: Python’s Answer to the Internet of Things.”
Ensuring U.S. government agencies have a compliant cloud-based infrastructure is the task of the General Services Administration’s 18F digital services, which created cloud.gov, a Cloud Foundry-based hosted cloud service specifically for federal agencies.
In this episode of The New Stack Makers embedded below, The New Stack founder Alex Williams and co-host Lee Calcote sat down with Diego Lapiduz, 18F software engineer and cloud.gov director; Bret Mogilefsky, innovation specialist and product lead at 18F; and Barton George, senior principal engineer in Dell’s Office of the Chief Technology Officer. The interview took place during the Cloud Foundry Summit 2016. Follow along to learn more about how both 18F and Dell have used open source and Cloud Foundry to develop not only secure technology for today’s federal agencies but also flexible infrastructures across the cloud.
Good code is cheap; it’s the scarcity of operational knowledge that’s holding back big data from solving the great problems of our time.
Solving those operational difficulties with a modular, easy-to-use system was the solution Mark Shuttleworth laid out in his keynote entitled “More Fun, Less Friction” at Apache Big Data in Vancouver in May.
“If we take the friction out, we can unleash all sorts of creativity,” Shuttleworth said.
Shuttleworth is the founder of Canonical and is known for his work on the Linux distribution Ubuntu, which was created for the purpose of making the open operating system as easy as possible to use.
Now Canonical is working on a system called Juju, which models applications and services together so they can be deployed on any cloud or on bare metal. The reusable packages of operational code behind those models are called charms, and companies from IBM to Cloudera to Couchbase, along with any number of Apache projects, are creating them to run on Canonical’s system.
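Charms capture exactly that operational knowledge as code. As a rough, hypothetical sketch (the hook names come from Juju’s classic charm model; the package name here is a placeholder), an install hook is just an executable that encodes the steps an operator would otherwise run by hand:

```python
# Hypothetical sketch of a classic Juju charm "install" hook.
# Real charms ship several such hooks (install, config-changed,
# start) that Juju invokes on whatever substrate the model
# targets: a local container, an AWS instance, or a metal rack.
import subprocess

def install_command(package: str) -> list[str]:
    # The operational step an expert would run by hand,
    # captured as code so Juju can repeat it anywhere.
    return ["apt-get", "install", "-y", package]

def install(package: str = "example-service") -> None:
    # Juju runs this hook once when the application unit is deployed.
    subprocess.run(install_command(package), check=True)
```

Because the hook is ordinary code, the same charm behaves identically on a laptop, a public cloud, or bare metal, which is what makes the "same commands everywhere" demonstration below possible.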
During his keynote, Shuttleworth spent the first half of his talk — about 8 minutes — spinning up a containerized Linux server on a laptop, an Amazon Web Services server, and a physical server rack using Juju, demonstrating how each environment handled the application cluster with the exact same commands.
“We’re using a common modeling language to describe these different applications, and then essentially binding all of those to the underlying resources needed in each of those environments,” Shuttleworth said. “Why is this interesting? It’s interesting because free software is becoming expensive… Scarcity, what’s rare, is not the code anymore. What’s scarce is the knowledge on how to operate that code.”
By reducing the complexity of spinning up servers to run dozens of combinations of different types of big data technology, and then getting those systems to play nice with each other, Shuttleworth and Canonical are trying to enable people working on big data to focus effort on making money, doing groundbreaking research into things like AI, or just making cool stuff.
“There are 33 big data projects in Apache. How many people know how to operate all of those?” Shuttleworth said. “And the complexity of those architectures is growing. It’s not two apps across two machines, it’s many apps across many machines. And architecture today is really a discussion about how we map those things together.
“We call this a phase change … because all of these pieces of software … they all have that same line crossing — the software is becoming free, but the operation is becoming expensive.”
Reducing complexity means reducing expense, and that means increased adoption and — hopefully — increased results.
So, you’ve just written some kick-ass code that you want to give out to the community as an open source project. You get a GitHub repo and push your code up for everyone to consume, your website and blog are looking slick, and you have managed to obtain the appropriate Twitter handle, so the last thing to do is choose an open source license. Which, as anyone who has looked into the huge range of open source licenses can attest to, is easier said than done.
For those who are overwhelmed, there are a number of sites that attempt to guide you through the process of selecting the correct license for your project.
Weaveworks announced the public beta of Weave Cloud, a hosted product that combines versions of the company’s container networking and management software.
Of particular interest: Weave Cloud offers native Docker container integration with Amazon Web Services (AWS) Virtual Private Cloud (VPC), so that Docker containers can run directly on AWS VPC.
Like many trends in software, there’s no one clear view of what ‘Serverless’ is, and that isn’t helped by the term coming to mean two different but overlapping things:
Serverless was first used to describe applications that significantly or fully depend on third-party applications or services (‘in the cloud’) to manage server-side logic and state. These are typically ‘rich client’ applications (think single-page web apps, or mobile apps) that use the vast ecosystem of cloud-accessible databases (like Parse or Firebase), authentication services (Auth0, AWS Cognito), and so on. These types of services have previously been described as ‘(Mobile) Backend as a Service’, and I’ll be using ‘BaaS’ as shorthand in the rest of this article.
Serverless can also mean applications where some amount of server-side logic is still written by the application developer but, unlike in traditional architectures, is run in stateless compute containers that are event-triggered, ephemeral (they may last for only one invocation), and fully managed by a third party.
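That second, Functions-as-a-Service style can be sketched in a few lines. The handler signature below loosely follows AWS Lambda’s Python convention, but the event fields are hypothetical:

```python
# Sketch of a FaaS-style function: stateless, invoked once per
# event, with the platform (not the developer) managing the server.
# The handler signature loosely follows AWS Lambda's Python
# convention; the event fields below are made up for illustration.

def handler(event, context=None):
    # No state survives between invocations: anything the function
    # needs must arrive in the event or live in an external store.
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}
```

The platform spins up a container to run `handler` when an event arrives and may tear it down immediately afterward, which is what "ephemeral" means in practice.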
At Travix we run about half of our 100 in-house developed applications in Google Container Engine, Google’s hosted version of the Kubernetes container cluster manager. We started using it in May 2015, when Kubernetes was still in alpha, and have since embraced it as our default hosting platform for any new application.
In this article I’ll describe how to deploy your application to Kubernetes and expose it as a public service.
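In outline, that takes two Kubernetes objects: a Deployment that runs replicated pods, and a Service of type LoadBalancer that gives them a public IP. A minimal sketch, expressed here as Python dicts rather than YAML (the image name, labels, and ports are placeholders; the Deployment API group shown is the beta one current around 2016):

```python
# Minimal sketch of the two Kubernetes objects involved in
# deploying and publicly exposing an application on GKE.
# Image name, labels, and ports are placeholders.

deployment = {
    "apiVersion": "extensions/v1beta1",  # beta Deployment API group, circa 2016
    "kind": "Deployment",
    "metadata": {"name": "my-app"},
    "spec": {
        "replicas": 3,
        "template": {
            "metadata": {"labels": {"app": "my-app"}},
            "spec": {
                "containers": [{
                    "name": "my-app",
                    "image": "gcr.io/my-project/my-app:1.0.0",
                    "ports": [{"containerPort": 8080}],
                }]
            },
        },
    },
}

service = {
    "apiVersion": "v1",
    "kind": "Service",
    "metadata": {"name": "my-app"},
    "spec": {
        "type": "LoadBalancer",         # asks the cloud for a public IP
        "selector": {"app": "my-app"},  # routes to the Deployment's pods
        "ports": [{"port": 80, "targetPort": 8080}],
    },
}
```

The Service finds the pods through the label selector, so the selector must match the labels on the Deployment’s pod template.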
There are many ways to work with Git. The workflows vary depending on the size of the team, organization, and on the way of working — is it distributed, is it sprint-based, is it a company, or an open-source project, where a maintainer approves pull requests?
You can use vanilla Git, or you can use GitHub, Bitbucket, GitLab, or Stash. On the client side you can use the command line, IDE integration, or stand-alone clients like SourceTree. The workflows differ mostly in the way you organize your branches and the way you merge them. Do you branch off branches?
As an applications developer, you are a problem solver. You design, implement and support next-generation applications that are utilized to meet company needs. You offer solutions to drive overall business performance and success. It should come as no surprise then that you are a hiring priority, as companies are in the market for professionals who can help them solve real-world problems.
Open source developers are in high demand.
Developers are the most sought after open source professionals, with 74 percent of hiring managers seeking developers, according to the most recent Open Source Jobs Report from Dice and The Linux Foundation.
While you may be in high demand, that doesn’t mean you can ignore what’s trending in tech. As companies look to be more “open,” they need applications developers who are not only familiar with emerging open source technologies, but who are also experts in writing code and finding valuable insights through data mining.
With the bar set high in terms of what employers expect, it is important to educate yourself on what specific skillsets companies are looking for from an applications developer. This not only will allow you to grow and move up the ranks within your current company, but it will make you more marketable when looking to change employers.
Below are just a few of the most sought after skills employers are requesting on Dice when in the market for an open source applications developer.*
Big Data: Developing and running applications requires using large amounts of data. For this reason, companies need applications developers who have experience working with Big Data technologies, like Hadoop or Apache Spark, which help collect, process, and analyze huge data sets. For those less acquainted with Big Data, it may be wise to get yourself up to speed on this skillset. Professionals with Big Data expertise earn $121,328 on average in the United States and are amongst the highest paid professionals, according to Dice’s latest annual salary survey.
Cloud: With the rise of cloud computing, you are seeing more and more companies looking for open source professionals with cloud expertise. In the 2016 Open Source Jobs Report, 51 percent of hiring managers surveyed said knowledge of cloud technologies has the biggest impact on their open source hiring decisions. OpenStack, in particular, is one cloud-related skill that is gaining momentum, with e-commerce and security companies alike looking for open source professionals with familiarity with this cloud-based operating system.
Mobile: For most companies today, understanding mobile isn’t so much an option as it is a requirement. Employers on Dice are looking for applications developers with a strong working knowledge of mobile coupled with UI/UX experience, a solid programming background (i.e., Java and C/C++), and project management skills.
JavaScript: JavaScript is a core programming language that employers want in an applications developer. On any given day, there are over 13,000 JavaScript job postings on Dice, which represents approximately 15 percent of Dice’s overall job count. Expertise in JavaScript is also needed when working with open-source frameworks like AngularJS.
While these skillsets are a good jumping off point if you are looking to further develop yourself as an open source applications developer, there is always more you can learn. According to the 2016 Open Source Jobs Report, free online tutorials are the most common method used by open source professionals to keep their skills up-to-date, but there is also other formal training you can rely upon to gain additional skills expertise, such as meet-ups or instructor-led courses.
In the end, whichever method(s) you choose to brush up on or expand your skills base doesn’t really matter. What matters is that you are continuously learning and keeping current on what’s trending in tech. As a problem solver, you need to have a keen familiarity with the latest platforms and skills in order to offer up the best solutions.
* Applications Development is a broad job category that encompasses a wide range of job titles including but not limited to “applications developer,” “software development engineer,” “software developer,” “mobile developer” and “big data engineer.”
Ultimately, I realized I wanted to develop software, and really wanted to develop in a native language where I had access to the hardware. At the time I used Linux as my main operating system, but also worked with people using Windows and Mac OS X on a regular basis. I wanted them to be able to use the software I developed, and didn’t want to write it three times. This was when I started aligning on a software stack, largely influenced by KDE and the opportunity I had to do a Google Summer of Code project with them. It is now about nine years since I did that Google Summer of Code project, and I am largely using the same stack with some additions/updates to facilitate collaborative, open source, cross-platform development.
C++ is a standardized language with a number of powerful open source compilers and many proprietary compilers that span everything from embedded systems to the biggest supercomputers on the planet. My projects regularly target both extremes of that spectrum, as well as a lot of the desktop/laptop space in between.