
How to Squeeze the Most out of Linux File Compression

If you have any doubt about the many commands and options available on Linux systems for file compression, you might want to take a look at the output of the apropos compress command. Chances are you'll be surprised by the many commands that you can use for compressing and decompressing files, as well as for comparing compressed files, examining and searching through the content of compressed files, and even changing a compressed file from one format to another (e.g., from .z format to .gz format).

You’re likely to see all of these entries just for the suite of bzip2 compression commands. Add in zip, gzip, and xz, and you’ve got a lot of interesting options.
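For instance, a quick experiment illustrates the point. The sketch below (the file name sample.txt is hypothetical) compresses the same data with gzip, bzip2, and xz, compares the resulting sizes, and examines compressed content without decompressing it to disk:

```shell
# List compression-related commands available on this system
apropos compress

# Create a sample file to experiment with
seq 1 10000 > sample.txt

# Compress copies of the same data with three different tools
# (-k keeps the original file alongside the compressed copy)
gzip  -k sample.txt            # produces sample.txt.gz
bzip2 -k sample.txt            # produces sample.txt.bz2
xz    -k sample.txt            # produces sample.txt.xz

# Compare the resulting file sizes
ls -l sample.txt*

# Examine and search compressed content without decompressing to disk
zcat sample.txt.gz | head -3
zgrep -c '42' sample.txt.gz
```

Tools such as zcat, zgrep, and their bzip2 counterparts (bzcat, bzgrep) are what make it practical to work with compressed files in place.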

Read more at NetworkWorld

CoreOS’s Open Cloud Services Could Bring Cloud Portability to Container-Native Apps

With the release of Tectonic 1.8, CoreOS provides a way to easily deploy container-native applications as services, even across multiple service providers and in-house resources.

“We take open source APIs, make them super easy to consume, and create a catalog of these things to run on top of Kubernetes so they are portable no matter where you go,” said Brandon Philips, CoreOS chief technology officer.

The company launched this latest iteration of Tectonic, its commercial distribution of the Kubernetes open source container orchestration engine, at the Cloud Native Computing Foundation's KubeCon 2017 event, held last week in Austin.

Read more at The New Stack

HPC Storage Grows Cloudier, Flashier

Organizations running high performance computing (HPC) workloads are increasingly seeking out cloud-based storage solutions and speedy flash-enabled systems to help them cope with growing complexity and the sheer amounts of data they are managing nowadays, according to new research from DataDirect Networks (DDN).

For starters, organizations are making use of more data, the company found in its survey of over 100 HPC professionals. Eighty-five percent of respondents reported that they are using or managing more than one petabyte (PB) of storage, a 12-percent increase over last year's results. Nearly 30 percent said they are in charge of over 10PB of storage.

Nearly half (48 percent) of all respondents said they planned to stash at least some of their data on a public or private cloud, an 11 percent jump compared to 2016. Yet, only five percent of those polled said they expect to place more than 30 percent of their data in the cloud.

Read more at Datamation

3 Essential Questions to Ask at Your Next Tech Interview

The annual Open Source Jobs Report from Dice and The Linux Foundation reveals a lot about prospects for open source professionals and hiring activity in the year ahead. In this year's report, 86 percent of tech professionals said that knowing open source has advanced their careers. Yet what happens with all that experience when it comes time to advance within their own organization or apply for a new role elsewhere?

Interviewing for a new job is never easy. Aside from the complexities of juggling your current work while preparing for a new role, there’s the added pressure of coming up with the necessary response when the interviewer asks “Do you have any questions for me?”

At Dice, we're in the business of careers, advice, and connecting tech professionals with employers. But we also hire tech talent at our organization to work on open source projects. In fact, the Dice platform is based on a number of Linux distributions, and we leverage open source databases as the basis for our search functionality. In short, we couldn't run Dice without open source software, so it's vital that we hire professionals who understand, and love, open source.

Over the years, I’ve learned the importance of asking good questions during an interview. It’s an opportunity to learn about your potential new employer, as well as better understand if they are a good match for your skills.

Here are three essential questions to ask and the reason they’re important:

1. What is the company’s position on employees contributing to open source projects or writing code in their spare time?

The answer to this question will tell you a lot about the company you're interviewing with. In general, companies want tech pros who contribute to websites or projects, as long as those contributions don't conflict with their work for the firm. Allowing such outside work also fosters an entrepreneurial spirit within the tech organization and teaches skills that you might not otherwise pick up in the normal course of your day.

2. How are projects prioritized here?

As all companies have become tech companies, there is often a division between innovative, customer-facing tech projects and those that improve the platform itself. Will you be working on keeping the existing platform up to date? Or on new products for the public? Depending on where your interests lie, the answer could determine whether the company is the right fit for you.

3. Who primarily makes decisions on new products and how much input do developers have in the decision-making process?

This question is one part understanding who is responsible for innovation at the company (and how closely you'll be working with that person) and one part discovering your career path at the firm. A good company will talk to its developers and open source talent before developing new products. It seems like a no-brainer, but it's a step that's sometimes missed, and it can mean the difference between a collaborative environment and a chaotic process ahead of new product releases.

Interviewing can be stressful. However, with 58 percent of companies telling Dice and The Linux Foundation that they need to hire open source talent in the months ahead, it's important to remember that this heightened demand puts professionals like you in the driver's seat. Steer your career in the direction you desire.

Download the full 2017 Open Source Jobs Report now.

What Open Means to OpenStack

In his keynote at OpenStack Summit in Australia, Jonathan Bryce (Executive Director of the OpenStack Foundation) stressed the meaning of both "Open" and "Stack" in the name of the project and focused on the importance of collaboration within the OpenStack ecosystem.

OpenStack has enjoyed unprecedented success since its early days. It has excited the IT industry about applications at scale and created new ways to consume cloud. The adoption rate of OpenStack and the growth of its community have exceeded even those of the biggest open source project on the planet, Linux. In its short life of six years, OpenStack has achieved more than Linux did in a similar time span.

So, why does OpenStack need to redefine the meaning of the project and stress collaboration? Why now?

“We have reached a point where the technology has proven itself,” said Mark Collier, the CTO of the OpenStack Foundation. “You have seen all the massive use cases of OpenStack all around the globe.”

Collier said that the OpenStack community is all about solving problems. Although they continue to refine compute, storage, and networking, they also look beyond that.

Read more at The Linux Foundation

Many Cloud-Native Hands Try to Make Light Work of Kubernetes

The Cloud Native Computing Foundation, home of the Kubernetes open-source community, grew wildly this year. It welcomed membership from industry giants like Amazon Web Services Inc. and broke attendance records at last week’s KubeCon + CloudNativeCon conference in Austin, Texas. This is all happy news for Kubernetes — the favored platform for orchestrating containers (a virtualized method for running distributed applications). The technology needs all the untangling, simplifying fingers it can get.

This is also why most in the community are happy to tamp down their competitive instincts to chip away at common difficulties. “You kind of have to,” said Michelle Noorali (pictured), senior software engineer at Microsoft and co-chair of KubeCon + CloudNativeCon North America & Europe 2017. “These problems are really hard.”

Read more at SiliconAngle

Asynchronous Decision-Making: Helping Remote Teams Succeed

Asynchronous decision-making is a strategy that enables geographically and culturally distributed software teams to make decisions more efficiently. In this article, I’ll discuss some of the principles and tools that make this approach possible.

Synchronous decision-making, in which participants interact with each other in real time, can be expensive for people who work on a Maker's Schedule, and it is often impractical for remote teams. We've all seen how such meetings can devolve into inefficient time-wasters that we all dread and avoid.

In contrast, asynchronous decision-making, which is often used in large open source projects—for example, at the Apache Software Foundation (ASF), where I’m most active—provides an efficient way for teams to move forward with minimal meetings. Many open source projects involve only a few meetings each year (and some none at all), yet development teams consistently produce high-quality software.

How does asynchronous decision-making work?

Read more at OpenSource.com

Leveraging NFV and SDN for Network Slicing

Network slicing is poised to play a pivotal role in the enablement of 5G. The technology allows operators to run multiple virtual networks on top of a single, physical infrastructure. With 5G commercialization set for 2020, many are wondering to what extent network functions virtualization (NFV) and software-defined networking (SDN) can help move network slicing forward.

Virtualized infrastructure

NFV and SDN are two similar but distinct technologies that are spearheading the digital transformation of network infrastructure in the telecom industry. NFV is an initiative to provide network services that conventionally ran on proprietary hardware with virtual machines, where a virtual machine is understood as an operating system that imitates dedicated hardware. With NFV, network functions such as routing, load balancing and firewalls are delivered by virtual machines. Using NFV, resources are no longer bound to data centers, but pervade the network to accelerate the productivity of internal operations.

Read more at RCR Wireless News

Juniper Moves OpenContrail to the Linux Foundation

Juniper Networks is moving the codebase for its OpenContrail network virtualization platform to the Linux Foundation.

Juniper first released its Contrail products as open source in 2013 and built a community around the project. However, many stakeholders complained that Juniper didn’t work very hard to build the community, and some called it “faux-pen source.”

In today’s announcement, Juniper said adding OpenContrail’s codebase to the Linux Foundation will further its objective to grow the use of open source platforms in cloud ecosystems.

Read more at SDxCentral

Language Bugs Infest Downstream Software, Fuzzer Finds

Developers following secure development guidelines can still be bitten by upstream bugs in the languages they use. That's the conclusion of research presented last week at Black Hat Europe by IOActive's Fernando Arnaboldi.

As Arnaboldi wrote in his Black Hat Europe paper [PDF]: “software developers may unknowingly include code in an application that can be used in a way that the designer did not foresee. Some of these behaviors pose a security risk to applications that were securely developed according to guidelines.”

Arnaboldi found bugs in the major programming languages JavaScript, Perl, PHP, Python and Ruby, and in all cases, he said the vulnerabilities could expose software written using those languages.

Read more at The Register