
Linux Chgrp Command for Beginners (5 Examples)

Here at HowtoForge, we recently discussed the chown command, which lets users change the owner as well as the group of a file (or a directory) in Linux. But did you know there is a dedicated command-line utility for changing group-related information alone? The tool in question is chgrp, and in this tutorial, we will discuss it using easy-to-understand examples.

But before we do that, it’s worth noting that all examples and instructions in this tutorial have been tested on Ubuntu 16.04 LTS.

Linux chgrp command

As you may have gathered by now, if the requirement is only to change the group of a file or directory, you can use chgrp instead of chown. The tool provides several command-line options for different situations. Here’s the generic syntax of chgrp:
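    chgrp [OPTION]... GROUP FILE...

For example, chgrp developers report.txt assigns the group "developers" to report.txt, and adding -R applies the change recursively to a directory. The same change can also be scripted; the following is a minimal Python sketch, assuming the group "developers" and the files shown actually exist on your system:

    import shutil
    import subprocess

    # Change only the group of a single file; the owner is left untouched.
    shutil.chown("report.txt", group="developers")

    # Equivalent call to the chgrp binary itself, recursing into a directory.
    subprocess.run(["chgrp", "-R", "developers", "project/"], check=True)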

Read more at HowToForge

Commerce Seeks Input On Fighting Botnets

The Commerce Department is asking for public input on what the government should do to combat cyberattacks launched by armies of infected computers.

The request follows a May executive order that directed the Commerce and Homeland Security departments to lead “an open and transparent process” to organize tech companies and other stakeholders to help secure the internet against the automated and distributed attack groups known as botnets.

The botnet section was among the most contentious portions of the otherwise noncontroversial executive order when it was in draft form. Industry was concerned the government might place additional mandates on businesses.

Read more at NextGov

ODPi Launches Apache Bigtop Grant Fund Program

ODPi, a nonprofit organization accelerating the open ecosystem of big data solutions, today announced the Apache™ Bigtop “Test Drive” Grant Program, a new grant funding program designed to increase developer involvement in the Apache Software Foundation (ASF) project. Through the program, ODPi is investing $50,000 to fund work by the world’s top Apache and big data developers and architects to expand Bigtop’s functionality and usability.

To apply to participate in the Bigtop “Test Drive” Grant Program, submit proposals here by Friday, July 14 at 11:59pm PST.

Read more at ODPi

To Cloud Native & Beyond: The Journey to Build the Cloud Foundry Certified Developer Exam

By Amar Rao

Creating an exam for complicated technologies is tricky business. Doing it right takes time. It took 12 months of work to create and launch the Cloud Foundry Certified Developer (CFCD) exam. This blog post is about that journey.

The exam is meant for developers to demonstrate proficiency in developing cloud-native applications on the Cloud Foundry platform and can be taken anywhere, at any time. It is deployed on The Linux Foundation’s cloud-based exam infrastructure, leveraging the latest tools available. The environment through which candidates access the Cloud Foundry instance is hosted on Swisscom. The project was developed as a collaborative team effort among Cloud Foundry community subject matter experts (SMEs), The Linux Foundation, and Biarca.

This certification exam is designed as a performance-based test in which the candidate’s proficiency is assessed not by testing knowledge of the subject, but by solving real-world tasks encountered on the job. Candidates are presented with problems that require them to work in a “real” CF environment and to deploy and manipulate application code.

Development began in June 2016, following a series of workshops facilitated by Wallace Judd, a psychometrician from Authentic Testing, in which Cloud Foundry SMEs from leading global organizations were assembled to formulate the exam questions, with Steve Greenberg of Resilient Scale as a technical advisor and SME. Given that all participants live across the country and are involved in other full-time work, coordinating this activity was a challenge, but we worked together to make it happen.

Biarca created the automation for provisioning each question’s setup and configuration to provide the environment in which candidates answer the questions. In addition, an automated grading system had to be developed through which candidates are assessed and graded. In parallel, The Linux Foundation was responsible for the end-to-end integration of all the various systems to deploy the exams on its cloud-based exam platform.

The Cloud Foundry examination process includes exam provisioning, proctoring, grading and result announcement. The exam architecture is shown in the image below:

Fig1: Cloud Foundry Exam Infrastructure

The decision was made to use Python for the automation scripts that handle question provisioning, configuration, and grading, to ensure compatibility with the Linux Foundation exam platform. The Biarca team created a Python client library to address some of the unique requirements of administering the CF exam on the LF platform. One issue the team had to overcome was setting up Eclipse Che, a browser-based IDE, to make it more intuitive and closer to a real-life environment for candidates. It was also a requirement to provide multi-language support so candidates can choose their language, be it Java, Node, or Ruby. This had never been done before in a performance-based test, but the team rose to the challenge.
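As a rough illustration of what a Python grading check of this kind might look like (this is not the actual Biarca library; the route and the simple pass/fail criterion of an HTTP 200 response are hypothetical), a single check might be:

    import urllib.request
    import urllib.error

    def check_app_route(url, timeout=10):
        """Return True if the candidate's deployed app answers with HTTP 200 at its route."""
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                return response.status == 200
        except (urllib.error.URLError, OSError):
            # Treat an unreachable route the same as a failed deployment.
            return False

    # Hypothetical route assigned to one candidate's exam item.
    score = 1 if check_app_route("https://candidate-app.example.com") else 0
    print("Item score:", score)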

Fig2: Eclipse Che project running in the browser

The development team also concentrated on providing an integrated experience, designed to let candidates focus on the question and deploy the best answer possible without dealing with extraneous issues.

Another important part of the work was to provide reference answer orchestration, which is required to verify the accuracy of the exam setup provisioning and the grading process as a pre-check. Answer orchestration takes place before the exam setup is handed to the candidate: the answer scripts are run to answer the individual exam items and to ensure that the setup and grading work as expected. Developers also had to address challenges such as route collisions among the multiple applications deployed by candidates as part of answering the exam.
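As a rough sketch of the pre-check idea (the script names, directory layout, and scoring convention below are hypothetical, not the project’s actual files), verifying one exam item end to end might look like this:

    import subprocess

    def precheck_item(item_id):
        """Run the reference answer for an item, then grade it; full marks are expected."""
        # Apply the reference answer against the freshly provisioned environment.
        subprocess.run(["python3", f"answers/item_{item_id}.py"], check=True)

        # Grade the environment; the grader prints a numeric score on stdout.
        result = subprocess.run(
            ["python3", f"grading/item_{item_id}.py"],
            check=True, capture_output=True, text=True,
        )
        score = float(result.stdout.strip())
        assert score == 1.0, f"Item {item_id} pre-check failed with score {score}"

    precheck_item(3)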

The grading software is designed to assess not just the answer, but also the thought process leading up to it. The Cloud Foundry exam involves continuous interaction with a public cloud, so the software design had to account for network failures, non-availability of the cloud platform, and more.

The grading scripts had to capture the raw scores for each item answered by the candidate. These raw scores must then be adjusted to account for the different weighting of different items based on complexity and other factors. The culmination of this is the final grade and the publication of a pass/fail result.
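As a simple illustration of that last step (the weights and passing threshold below are invented for the example, not the exam’s real values), the weighted aggregation might look like this:

    # Raw per-item scores in [0, 1] as produced by the grading scripts.
    raw_scores = {"item_1": 1.0, "item_2": 0.5, "item_3": 0.0}

    # Heavier weights for more complex items (hypothetical values).
    weights = {"item_1": 1.0, "item_2": 2.0, "item_3": 3.0}

    weighted_total = sum(raw_scores[i] * weights[i] for i in raw_scores)
    final_grade = weighted_total / sum(weights.values()) * 100

    PASSING_SCORE = 70.0  # hypothetical threshold
    verdict = "PASS" if final_grade >= PASSING_SCORE else "FAIL"
    print(f"Final grade: {final_grade:.1f} -> {verdict}")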

The development team wrapped up its work by December 2016. The months of January and February 2017 were focused on alpha testing the exam, followed by beta testing in March and April.

The exam is being formally launched at Cloud Foundry Summit Silicon Valley in Santa Clara today, June 13.

This article originally appeared at Cloud Foundry.

It’s the Ecosystem, Stupid

I’ve been writing for a while on topics related to product and supply chain management in the context of open source communities, and I’ve noticed a few consistent themes in my articles and blog posts. Most notable is the call for companies to move from the “not invented here” syndrome to a more externally focused view. After all, if so much innovation is taking place in open source projects, why not take advantage of it to the fullest extent possible? You can see this theme manifested in the following ways:

  • Open source program office and management: This is all about providing structure for creating and implementing open source strategy, including programs and systems to help engineering teams meet legal requirements for participating in external open source communities. This is only valuable if the company places strategic importance on influencing external communities.

Read more at OpenSource.com

Artificial Intelligence: Open Source and Standards Bodies Drive Opportunities

Artificial intelligence (AI) and machine learning (ML) skillsets are now becoming a crucial way for technology-focused workers to differentiate themselves from the pack. Moreover, from Elon Musk’s OpenAI organization to Google’s sweeping new open AI initiatives announced at the recent Google I/O conference, investment in AI is driving many new opportunities. For technologists who straddle the arenas of open source and AI, opportunities are looking particularly promising.

At Google’s recent developer conference, the company introduced a project called AutoML from its Google Brain artificial intelligence research group.  It is designed to help automate many of the toughest aspects of designing machine learning and AI tools. Google is looking to grow the number of developers able to leverage machine learning by reducing the expertise required and is aiming to drum up community involvement.

As The Verge recently noted, the company’s AI initiatives “attract talent to Google and help make the company’s in-house software the standard for machine learning.” The bottom line is that AI and machine learning talent is in high demand.

Organized Responses to the Promise of AI

Powerful consortiums are taking shape to help drive the future of open artificial intelligence. Partnership on AI is one of the most notable.  According to its founders: “We are at an inflection point in the development and application of AI technologies. The upswing in AI competencies, fueled by data, computation, and advances in algorithms for machine learning, perception, planning, and natural language, promise great value to people and society… We are excited about the prospect of coming together to collaborate on addressing concerns, rough edges, and rising challenges around AI, as well as to work together to pursue the grand opportunities and possibilities of the long-term dream of mastering the computational science of intelligence. It is our intention that the Partnership on AI will be collaborative, constructive, and work openly with all.”

More than 20 companies have joined Partnership on AI. The organizations range from Facebook to Intel to Salesforce and SAP. Many of these companies are actively contributing open source AI and machine learning projects to the community.

Meanwhile, Elon Musk’s OpenAI is creating new types of opportunities, including the release of open source tools. “We seek to broadcast our work to the world as papers, blog posts, software, talks, and tutorials,” the organization reports, and OpenAI is also hiring. 

Most recently, OpenAI has delivered an open toolkit for training robots via virtual reality. It has also open sourced a toolkit called Universe, which is middleware that can help AI agents solve arbitrary tasks and learn as they solve problems.

Building Out Your Skillset

So how can you gain skills that can become valuable as AI and machine learning advance? Coursera offers a popular class focused on machine learning, taught by a Stanford University expert. Udacity also offers free courses on AI, and has a notable course on deep learning developed with one of the principal scientists at Google. The course shows you how to train and optimize basic neural networks, convolutional neural networks, and long short term memory networks. It also introduces Google’s open source AI tools.

One of the more popular online courses on AI is found on the edX platform. The course is offered in conjunction with Columbia University and taught by a Columbia professor. The course covers building intelligent agents, open source AI tools, machine learning, and more. Check out more free courses in this area, rounded up by the HackerEarth blog.

There are also many good online tutorials focused on AI and machine learning. Here, you can find many of them for TensorFlow, Google’s flexible and popular open source framework that can be applied to image recognition tasks, neural networking, and more. You can also find many tutorials for H2O.ai’s popular AI and machine learning tools here.

To learn more about the promise of machine learning and artificial intelligence, watch a video featuring David Meyer, Chairman of the Board at OpenDaylight, a Collaborative Project at The Linux Foundation.

Are you interested in how organizations are bootstrapping their own open source programs internally? You can learn more in the Fundamentals of Professional Open Source Management training course from The Linux Foundation. Download a sample chapter now!

 

Test-Driven Security With Chef InSpec

Test-driven security is the implementation of tests into the development process, and Chef InSpec is one tool that will help you get started with this process. These security tests are intended to define the security features required for a system to be production ready.

In this post, we will walk through the process of applying test-driven security, with prescriptive security tests, using Chef InSpec.

Regression Testing Security

Regression testing is the testing of software to ensure that changes do not break existing behavior. As new features are added, or even as bugs are fixed, we want to verify that previously working behavior does not break and result in new bugs. To keep ensuring quality as software is developed, regression tests are added. These tests help prevent duplicated work and ensure a better user experience.
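Chef InSpec expresses these checks in a Ruby-based DSL, but the underlying idea is language-neutral. Here is a rough Python sketch of two security regression tests; the specific checks and paths are illustrative and assume a typical Linux host. Run on every change, tests like these keep a hardening fix from quietly regressing:

    import os
    import stat

    def test_shadow_file_not_world_readable():
        """/etc/shadow must never be readable by other users."""
        mode = os.stat("/etc/shadow").st_mode
        assert not mode & stat.S_IROTH

    def test_root_ssh_login_disabled():
        """sshd must not allow direct root logins."""
        with open("/etc/ssh/sshd_config") as f:
            settings = [line.split() for line in f
                        if line.strip() and not line.startswith("#")]
        assert ["PermitRootLogin", "no"] in settings

    if __name__ == "__main__":
        test_shadow_file_not_world_readable()
        test_root_ssh_login_disabled()
        print("All security regression checks passed.")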

Read more at ThreatStack

Calçado’s Microservices Prerequisites

When you decide to adopt microservices, you are explicitly moving away from having just one or a few moving pieces to a more complex system. In this new world, the many moving parts act in unpredictable ways as teams and services are created, changed, and destroyed continuously. This system’s ability to change and adapt quickly can provide significant benefits for your organisation, but you need to make sure that some guardrails are in place or your delivery can come to a standstill amid the never-ending change.

These guardrails are the prerequisites we discuss here. It is possible to successfully adopt a new technology without some or all of these in place, but their presence is expected to increase the probability of success and reduce the noise and confusion during the migration process.

Admittedly, the list of prerequisites presented here is long and, depending on your organisation’s culture and infrastructure, might require a massive investment. This upfront cost should be expected, though. A microservices architecture isn’t supposed to be any easier than other styles, and you need to make sure that you assess the Return on Investment before making a decision.

Read more at Phil Calçado

Deploying Minio Cloud Storage to DC/OS

Container orchestration is gaining traction as the default way to deploy applications. Developers are architecting their modern applications from the ground-up to run in containers, which enables faster deployment and more resilience. Even legacy applications are adopting containers in every way they can to access these advantages.

Of the many characteristics that make an application container-ready, the way it handles unstructured data is one of the most important. Back in the day, the default way to handle unstructured data was to dump all of it onto the server’s file system, but using the host filesystem doesn’t make sense for containerized apps. In an orchestrated environment, a container can be scheduled (or rescheduled) on any of the hosts in a cluster, and data written to a previous host cannot move with that container.
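One approach, and the subject of this article, is to hand unstructured data to a networked object store such as Minio instead of the local filesystem. Below is a minimal sketch using the minio Python SDK; the endpoint, credentials, bucket name, and file path are placeholders for your own deployment:

    # Assumes the "minio" Python SDK is installed (pip install minio).
    from minio import Minio

    client = Minio(
        "minio.example.com:9000",      # hypothetical Minio endpoint
        access_key="YOUR-ACCESS-KEY",
        secret_key="YOUR-SECRET-KEY",
        secure=True,
    )

    # Create the bucket once, then upload an object into it.
    if not client.bucket_exists("app-uploads"):
        client.make_bucket("app-uploads")

    client.fput_object("app-uploads", "report.pdf", "/tmp/report.pdf")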

Read more at DZone

New Software Needed to Support a New Kind of Processor

Analysis of big data that can reveal early signs of an Ebola outbreak or the first traces of a cyberattack requires a different kind of processor than those developed for large-scale scientific studies. Because the data might come from disparate sources (say, medical records and GPS locations in the case of Ebola), they are organized in such a way that conventional computer processors handle them inefficiently.

Now, the U.S. military research organization DARPA has announced a new effort to build a processor for this kind of data — and the software to run on it. A group of computer scientists at the U.S. Department of Energy’s Pacific Northwest National Laboratory will receive $7 million over five years to create a software development kit for big data analysis.

Read more at ACM