
Hyperledger Works on Its Open-Source Footing

Taking a bootstrapped initiative to a healthy open-source project is difficult. But when there are only about 100 developers in the world with a deep understanding of a technology such as blockchain, the difficulty increases dramatically.

Open-source veteran Brian Behlendorf was aware of the challenges when the Linux Foundation tapped him to lead the Hyperledger Project as its executive director in May.

“The job really is to be an independent voice for the project that is not affiliated with one company or another,” he told Markets Media. “It’s also to bring to the party everything that the Linux Foundation knows about running open-source projects. My job is to corral all that towards the purpose of building a great community and a great collection of code.”

Read more at Markets Media

Steady User Growth Characterizes Cloud Foundry Ecosystem

The CF community now includes 173 user groups with 33,400-plus individual members across 105 cities in 48 countries, CEO Sam Ramji said.

Cloud Foundry might be the only PaaS to have its own user conference—a three-day one, at that.

Cloud Foundry is an open source cloud platform as a service originally developed by VMware and now run by Pivotal Software, which is a joint venture owned by EMC, VMware and General Electric. Cloud Foundry was designed and developed by a small team from Google led by Derek Collison and originally was called Project B29.

Read more at eWeek

The Rise of Deep Learning in the Tech Industry

Tech analysts love trending topics. In fact, that’s their job: forecasting and analyzing trends. Some years ago we had “Big Data”, more recently “Machine Learning”, and now it’s the time of “Deep Learning”. So let’s dive in and try to understand what’s behind it and what impact it can have on our society.

What’s new?

Neural Network algorithms are the main science behind Deep Learning. They are not new but became more popular in the mid-2000s after Geoffrey Hinton and Ruslan Salakhutdinov published a paper explaining how we could train a many-layered feedforward neural network one layer at a time. The large-scale impact of Deep Learning in Big Tech Companies began around 2010 with speech recognition.

It took around 30 years for the technique to become mainstream: computers were not powerful enough, and companies didn’t have large enough datasets. When the researcher Yann LeCun experimented with his first algorithms in the 1980s, a single run took him three days! Deep Learning has only been truly mainstream for a few years. In 2012, ImageNet, a popular challenge for scientists in the field of image recognition, was won for the first time using Deep Learning, by Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton. This result drew a lot of attention to the field across the tech sector.

The technology behind Deep Learning is neural networks stacked together into multiple layers. One of the challenges for the humans who implement them is understanding exactly what information each layer extracts. Each stack of neurons extracts higher-level information than the one before it, so that at the end the network can recognize very complex patterns. Humans are sometimes skeptical of this model because, even though it’s based on well-known mathematical equations, we know little about why some models work.
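To make the idea of stacked layers concrete, here is a toy sketch in plain Python. The weights are made up for illustration, not learned: each layer computes weighted sums of the previous layer’s outputs, and a simple non-linearity (ReLU) between layers is what lets the stack represent patterns no single layer could.

```python
def relu(xs):
    # non-linearity applied between layers
    return [max(0.0, x) for x in xs]

def layer(inputs, weights, biases):
    # each output neuron is a weighted sum of all inputs plus a bias
    return [sum(w * i for w, i in zip(ws, inputs)) + b
            for ws, b in zip(weights, biases)]

# toy two-layer network: 3 inputs -> 2 hidden neurons -> 1 output
hidden_w = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]]
hidden_b = [0.0, 0.1]
out_w = [[1.0, -1.0]]
out_b = [0.2]

x = [1.0, 2.0, 3.0]
h = relu(layer(x, hidden_w, hidden_b))  # first layer extracts simple features
y = layer(h, out_w, out_b)              # second layer combines them
print("output:", y)
```

In a real Deep Learning system the weights would be learned from data by backpropagation, and there would be many more layers and neurons, but the stacking principle is the same.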

This is only the beginning. There are many challenges to tackle on topics like “NLP” (Natural Language Processing) or understanding spoken language. One key for this is the context. When speech is limited to a small scope (e.g., in a legal document or a food recipe) machines can interpret the meaning. For now, much of the nuance and complexity of human language is difficult for machines (for instance it’s very hard for a machine to understand a joke).

This is a big turn in history. Before neural networks, humans assumed they were best at designing algorithms by hand. Now they have to accept that a machine can beat them even at producing an algorithm: machines trained to recognize patterns with Deep Learning outperform the old rule-based algorithms.

This video of a simple machine trained with the DeepMind algorithm is a very good illustration of the superior “intelligence” of the machine. The computer learns to win the game and at the end discovers tricks that nobody found before. It’s no longer about Brute-Force algorithms, but about the replication of complex human behavior. For instance, the same DeepMind team (recently bought by Google) won the game of Go against the best European player, something that no computer could do before Deep Learning.

Applications

A well-known application of Deep Learning is face recognition. Google Photos, for instance, is a very good example of this technology. It can even recognize your face from 20 years ago! To simplify, we could say that the first layer of neurons can recognize a circle, the second an iris, and the third an eye. If the computer has been trained well enough, it can recognize abstract entities like a face with a good probability.

After video, speech, and translation, Google now uses Deep Learning for search, its core business. Ranking no longer relies only on human-designed algorithms (like the well-known PageRank); thanks to RankBrain, a Deep Learning algorithm, Google now achieves greater accuracy and precision.

Of course, one of the trending topics in Deep Learning is the autonomous car. The National Highway Traffic Safety Administration said the Artificial Intelligence system piloting a self-driving car could be considered as the driver under federal law. This is a major step toward ultimately winning approval for autonomous vehicles on the roads.

Many tech companies have recently understood the benefits that new A.I. techniques can bring. Facebook, Google, Apple, Microsoft, IBM and many others are building Deep Learning teams to tackle these challenges.

Facebook hired Yann LeCun to head its new A.I. lab and one year later hired Vladimir Vapnik, a main developer of the Vapnik–Chervonenkis theory of statistical learning. Apple recently bought three startups in Deep Learning as well. Google, as we underlined before, hired an amazing crew including Geoffrey Hinton. Finally, Baidu hired Andrew Ng, one of the most famous teachers and scientists in Machine Learning, to head its new research lab.

The battle is starting, and we don’t yet know who will win this Deep Learning fight. The main question is how it will impact our daily lives. Will we become as powerful as James Bond, with a personal Moneypenny in our pocket (like Facebook’s virtual assistant “M”)? Will we all lose our jobs and be replaced by machines? Maybe both?

What kind of future will appear?

The super-intelligence of connected machines, which humans may not be able to fully understand, could become a potential threat tomorrow. Stephen Hawking, Bill Gates, and Elon Musk have warned us about it.

“I am in the camp that is concerned about super intelligence. First, the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that though the intelligence is strong enough to be a concern. I agree with Elon Musk and some others on this and don’t understand why some people are not concerned,” said Bill Gates.

Indeed, it’s certain that we will save many lives and reduce boring, repetitive tasks with Deep Learning, but it could also have a huge negative impact on our society. As we have all seen in recent sci-fi movies, artificial (super-)intelligence could be used to destroy things or manipulate humans. One way to limit this potential threat is to open-source the code, so that the whole community can inspect the algorithms and know the state of the art. TensorFlow and OpenAI are good examples of this idea.

“Because of AI’s surprising history, it’s hard to predict when human-level AI might come within reach. When it does, it’ll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest,” reads OpenAI’s founding manifesto.

One of the other consequences we fear most is the end of many jobs. Because each major technological innovation spreads across the whole economy, it’s certain that many sectors will be impacted by the exponential growth of such technologies. As economist Joseph Schumpeter taught us, it will also probably create many jobs in other sectors (mainly services and on-demand jobs). Maybe “this time will be different” and we will need new social institutions to take care of this. New economic ideas like Basic Income could be an interesting way to decrease the shock caused by the invasion of Deep Learning everywhere. Some institutions are already prepared to experiment with it.

Today, each AI is built with data from Internet sources like Google searches or Facebook feeds. But in the near future, each AI could be built with data from our personal devices. We don’t yet know which applications will emerge. We can be sure, as Android co-founder Andy Rubin stated, that Deep Learning will become easier and cheaper to implement, so that every piece of software or hardware will be able to run its own intelligent algorithms.

Deep Learning is on its way to becoming a commodity.

This article was contributed by a student at Holberton School and should be used for educational purposes only.

Rancher & Vapor IO Perform New Tricks With Apache Mesos

Rancher Labs and Vapor IO are announcing moves related to Apache Mesos, the open source container orchestration platform. Rancher is adding Mesos support to its container management environment (also named Rancher), while Vapor IO is bringing its data center management software into Mesosphere’s DC/OS.

The announcements are coming out Wednesday morning at MesosCon North America, being held in Denver. Rancher already supports three types of container scheduling: Kubernetes, Docker Swarm from Docker Inc., and the startup’s own Cattle.

Read more at SDx Central

Linux Foundation Backs HPE’s Open Source Switch OS

OpenSwitch, the operating system for data center network switches Hewlett-Packard Enterprise launched last year as an open source project together with a number of other networking heavyweights, has become an official Linux Foundation project, the foundation announced today.

The foundation provides infrastructure and management resources for open source projects it accepts, as well as exposure to open source developers who may be more inclined to contribute because of the organization’s pedigree. It hosts some of the most influential open source infrastructure projects, such as Cloud Foundry, OpenDaylight, and the Xen Project.

Read more at Data Center Knowledge

Bring Networking Projects Under A Common Umbrella, Urges Cisco’s Dave Ward

As a “networking guy,” Cisco CTO of Engineering and Chief Architect Dave Ward finds it frustrating that today, although somebody can fire up an application and ask for CPU, RAM, and storage, they can’t even ask for bandwidth. Developers have only very simple networking primitives available, all the way up to the PaaS (Platform as a Service) layer.

Developers shouldn’t have to “keep the whole stack in their head,” said Ward, in his Collaboration Summit keynote. “From that developer’s point of view, I want to be able to fire up my workload, and I just want it to work.”

In his wide-ranging talk, titled “Umbrellas Aren’t Just for When It’s Raining,” Ward offered his thoughts on points including “building network projects in the stack so developers don’t have to know or care what’s going on.” A “no-stack developer” wants all of the controllers, analytics, orchestration, and service chaining just to work.

Ward’s goal is for the infrastructure to just do what a developer needs to have happen… thereby “creating a no-stack developer environment in which intent can be driven directly into the network.”

Ward discussed various open source projects that have sprung up in the past two years, and he said, “The Linux Foundation has done an outstandingly good job of pulling together communities that fill certain niches and certain functionality inside this stack.”

“The Linux Foundation has proven itself to be a perfect place for us to collaborate,” said Ward, with more than a dozen network projects, millions of lines of code under management, and many corporate sponsors and developers working together on multiple projects.

“I’m trying to catalyze, through this talk, a conversation about how to take all the projects we have and pull them together under an umbrella,” said Ward.

Toward that end, he says, “We could continue with ‘Let a thousand flowers bloom, and let a thousand communities rise’ and continue the way we are currently operating now.” But, suggests Ward, it would be good to have some planning around how to allocate resources: what’s the key focus, what needs to be built inside that architecture, and then align the cost.

Ward says, “It’s time to talk about creating a networking umbrella over all these foundations and projects.” He clarifies that he is talking about “the actual mechanism by how we can do this with an understanding of the governance structures, not the technical structures.” This could get the industry to the point where it could fill in and complete all the pieces necessary for orchestration, configuration, provisioning, and resources.

At minimum, urged Ward, “If we can’t get an umbrella architecture, we know a lot of the places that we need to fill in and have to work as an industry to create communities around those projects to get the job done.”

Watch Dave Ward’s full keynote, below:

https://www.youtube.com/watch?v=eEckX2hn4y4


Cumulus Linux 3.0 NOS Now in the Wild

Cumulus Networks is touting a bunch of heavyweights as supporting the latest iteration of its white-box Linux network OS.

On board for the launch of the Cumulus Linux 3.0 network operating system are Dell, EdgeCore Networks, Mellanox, Penguin Computing, and Supermicro.

For Cumulus, one of the biggest aspects of the launch is that version 3.0 is 100 Gbps Ethernet-capable, something it reckons will be important for the data centre market.

Four of the products already certified in its hardware compatibility list target that space: Dell’s Z9100, Penguin’s 3200CP and Supermicro’s SSE-C3632 (all using Broadcom Tomahawk silicon), and Mellanox’s own-silicon SN2700.

Read more at The Register

HPE Targets DevOps and Agile with New Application Lifecycle Management Software

On Wednesday, Hewlett Packard Enterprise (HPE) announced the general availability of ALM Octane, its cloud-based application lifecycle management offering, geared toward making customers’ DevOps processes more efficient.

The platform makes use of common toolsets and frameworks, such as Jenkins, Git, and Gherkin, while also providing insights to developers and application testers. This could potentially help enterprises deliver applications more quickly, without having to cut corners in the vetting process.

“HPE ALM Octane is specifically designed for Agile and DevOps-ready teams, bringing a cloud-first approach that’s accessible anytime and anywhere, bolstered by big data-style analytics to help deliver speed, quality, and scale across all modes of IT,” said Raffi Margaliot, senior vice president and general manager of application delivery management for HPE.

Read more at TechRepublic

Samba Server installation on Ubuntu 16.04

This guide explains the installation and configuration of a Samba server on Ubuntu 16.04 with anonymous and secured Samba shares. Samba is an Open Source/Free Software suite that provides seamless file and print services to SMB/CIFS clients. Samba is freely available, unlike other SMB/CIFS implementations, and allows for interoperability between Linux/Unix servers and Windows-based clients.
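As a minimal illustration of what the guide configures (the share name and path below are placeholders, not taken from the guide itself), an anonymous share in /etc/samba/smb.conf might look like this:

```
[global]
   workgroup = WORKGROUP
   security = user
   map to guest = Bad User

[anonymous]
   path = /srv/samba/anonymous
   browsable = yes
   writable = yes
   guest ok = yes
```

After editing the file, the Samba service must be restarted (on Ubuntu 16.04, `sudo systemctl restart smbd`) for the new share to take effect; a secured share would instead set `guest ok = no` and restrict access to valid Samba users.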

 

4 Steps To Secure Serverless Applications

Serverless applications remove a lot of the operational burden from your team: no more managing operating systems or running low-level infrastructure.

This lets you and your team focus on building…and that’s a wonderful thing.

But don’t let the lack of day-to-day operational tasks fool you into thinking that there’s nothing to do but write code. With a serverless design, you still have operational tasks. These tasks are different and tend to be more directly tied to delivering business value but they still exist.

Read more at Marknca Blog