
Getting Started with the Unbreakable Enterprise Kernel Release 5 for Oracle Linux on Oracle Cloud Infrastructure

Oracle Linux images available on Oracle Cloud Infrastructure are frequently updated to help ensure access to the latest software. The latest Oracle Linux images provided in Oracle Cloud Infrastructure now include Oracle Linux 7 Update 5 with the Unbreakable Enterprise Kernel Release 5 (UEK R5). UEK R5 is an extensively tested and optimized Linux kernel designed for 64-bit (Intel x86_64) and ARM (aarch64)…
Click to Read More at Oracle Linux Kernel Development


Kid’s Day at Open Source Summit

The Linux Foundation strives to make Open Source Summit one of the most inclusive tech events in a variety of ways, offering activities such as the “Women in Open Source” lunch, a diversity social, a first-time attendees get-together, and more. They have activities focused on children, too. Not only does Open Source Summit offer free on-site childcare for attendees’ children, it also sponsors a Kid’s Day.

At this year’s Kid’s Day in Vancouver, the primary goal was to introduce the kids to coding via HTML, and very little computer knowledge or experience was required to participate. “The basics, typing, browsing the Internet and minor computer operation, are all your child needs to participate,” according to the website.

For this event, The Linux Foundation collaborated with Banks Family Tech, which organized the four-hour workshop. The workshop was geared toward children ages 9–18 and was open to children from the community as well as those of event attendees. The kids who participated actually ranged in age from 5 to 13, and many already had some coding experience. Some had tried Scratch, and others had written scripts for games.

“We are going to teach how to go from nothing and become coders,” said Phillip Banks, founder of Banks Family Tech.

HTML workshop

The workshop focused squarely on HTML, one of the easiest languages to learn. “It’s close to English and it’s not hard text and syntax to learn. It allows us to squeeze a lot of things into a day and get them excited so that they can go home and learn more,” said Banks. “After that, maybe, you can go to Python but HTML is so easy as they get a quick return by manipulating objects, text color and other things on a web-page immediately.”

This Kid’s Day event had a great mix of participants. While some of the kids accompanied their parents who were attending the conference, the majority were from the local community, whose parents learned about the workshop from social networks like Facebook. Khristine Carino, Director for Communications of SCWIST (Society for Canadian Women In Science and Technology), not only brought her own kids but also invited families from underrepresented minorities in Vancouver.

In the workshop, the children learned HTML basics: how to use fonts and colors, how to add images and videos, and how to choose a background for their website. They also had the opportunity to share what they created with the whole group and learn from each other.
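A first webpage of the kind the workshop describes might look something like this (an illustrative sketch, not the workshop’s actual material; the file names are made up):

```html
<!-- A beginner-style page using the kinds of tags covered in the workshop:
     fonts, colors, images, video, and a page background. -->
<!DOCTYPE html>
<html>
  <head>
    <title>My First Webpage</title>
  </head>
  <body style="background-color: lightblue;">
    <h1 style="color: darkgreen; font-family: sans-serif;">Hello, world!</h1>
    <p>This is my first webpage.</p>
    <!-- Adding an image and a video -->
    <img src="my-photo.jpg" alt="A photo I chose" width="300">
    <video src="my-clip.mp4" controls width="300"></video>
  </body>
</html>
```

Opening a file like this in any browser shows the result immediately, which is the “quick return” Banks describes.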

“It’s not so much about learning to code, just to be a coder; it’s learning to understand how things work,” said Banks. You can hear more in the video below.

Check out the full list of activities coming up at Open Source Summit in Europe and sign up to receive updates.

This article originally appeared at The Linux Foundation

Linux strace Command Tutorial for Beginners (8 Examples)

The Linux command line offers many tools that are helpful for software developers. One of them is strace, the basics of which we’ll discuss in this tutorial using some easy-to-understand examples.

But before we do that, it’s worth mentioning that all examples in this article have been tested on an Ubuntu 18.04 LTS machine.

Linux strace command

The strace command in Linux lets you trace system calls and signals. Following is its syntax:

strace [OPTIONS] command
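For example, a couple of basic invocations (assuming strace is installed, e.g. via `apt install strace` on Ubuntu) might look like:

```shell
# Trace the system calls made by 'ls', writing the trace to a file.
strace -o trace.log ls

# Print a summary table of system calls, call counts, and time spent.
strace -c ls
```

The trace output (or the `-c` summary) is written per system call, which makes strace useful for debugging programs without access to their source.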

Read more at HowToForge

Using Text Mining and Machine Learning to Enhance the Credit Risk Assessment Process

Advances in technology have instigated a substantial shift in consumer expectations. Today’s financial services customers demand access to a range of services, real-time updates and a seamless customer experience. At Open FinTech Forum, I will provide some insight into Spotcap’s approach to credit risk assessment using text mining and machine learning.

Bruce Brenkus, Chief Risk Officer, Spotcap

A recent survey by Oracle found that, although customers are generally satisfied with basic banking services, their satisfaction drops when attempting more complex transactions such as securing a loan. We have observed the same sentiment across the business community. This is why, at Spotcap, we’ve turned tradition on its head and created a more efficient take on business loans.

We undertake cash flow based, rather than credit-score based underwriting, and use technology to speed up the process. Combining tried and tested credit assessment principles with innovative technology such as our automated data scraping services, machine learning credit models, and skilled human analysts enables us to offer a more efficient take on business loans.

Machine learning credit algorithms

Our risk assessment draws on numerous sources but relies heavily on three main ones – borrower profile, bank account, and business profile – and is supported by a set of machine learning credit algorithms. This approach allows us to accurately and fairly assess how a business is performing today and make a prediction about its future performance.

Whilst we feed our models with hundreds of data points sourced from credit bureaus, tax agencies, business records and the applicants themselves, it is bank account transactional data that often paints the most accurate picture.

Spotcap’s Bank Account Model incorporates more than 200 numerical variables. Business bank account data, when structured correctly, is one of the strongest sources of predictive information for short-term lending and risk mitigation. We transform the raw data found in a bank account into structured variables that enable us to derive meaningful insights.

We have also developed bank account text mining tools to identify key negative factors such as payment reversals, late fees, and collections transactions. However, this requires a supervised approach to minimize the risk of false positives.
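The core idea of flagging negative transactions can be sketched as a simple keyword match over transaction descriptions. This is an illustrative sketch only, not Spotcap’s actual system; the keyword list and field names are assumptions:

```python
# Illustrative sketch: flag bank-account transactions whose descriptions
# contain known negative keywords. A real system would use supervised
# text-mining models to keep false positives down.
NEGATIVE_KEYWORDS = {"reversal", "late fee", "collections", "nsf"}

def flag_negative(transactions):
    """Return the transactions whose description matches a negative keyword."""
    flagged = []
    for tx in transactions:
        desc = tx["description"].lower()
        if any(kw in desc for kw in NEGATIVE_KEYWORDS):
            flagged.append(tx)
    return flagged

txs = [
    {"description": "Payment reversal - invoice 1041", "amount": -250.0},
    {"description": "Grocery store", "amount": -42.5},
    {"description": "Late fee assessed", "amount": -35.0},
]
print([t["description"] for t in flag_negative(txs)])
# → ['Payment reversal - invoice 1041', 'Late fee assessed']
```

A naive matcher like this would flag a merchant named “Reversal Records,” which is exactly the kind of false positive a supervised approach is meant to catch.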

The more data you feed into your machine learning models, the more accurate your results will be. But it’s not only about quantity; it’s primarily the quality of data that matters. Well-specified machine learning models can help lenders make faster and more informed decisions. However, even the most powerful machine learning algorithm will fail if applied to data with measurement error. The better your understanding of your data, the more accurate and insightful your results. Our underwriters and data scientists continuously add new knowledge and risk drivers to our models to get even more precise outcomes.

It’s all about automating the right parts of your analysis and remembering that human interaction is important at every stage of the model life cycle because we’re dealing with real people and real businesses, which are by nature complex. Human expertise combined with advanced technology enables us to make accurate, yet flexible credit decisions within one day.

Sign up to receive updates on Open FinTech Forum:

What Is Deep Learning AI? A Simple Guide With 8 Practical Examples

The amount of data we generate every day is staggering—currently estimated at 2.6 quintillion bytes—and it’s the resource that makes deep learning possible. Since deep-learning algorithms require a ton of data to learn from, this increase in data creation is one reason that deep learning capabilities have grown in recent years. In addition to more data creation, deep learning algorithms benefit from the stronger computing power that’s available today as well as the proliferation of Artificial Intelligence (AI) as a Service. AI as a Service has given smaller organizations access to artificial intelligence technology and specifically the AI algorithms required for deep learning without a large initial investment.

Deep learning allows machines to solve complex problems even when using a data set that is very diverse, unstructured and inter-connected. The more deep learning algorithms learn, the better they perform.

8 practical examples of deep learning

Now that we’re in a time when machines can learn to solve complex problems without human intervention, what exactly are the problems they are tackling? Here are just a few of the tasks that deep learning supports today, and the list will continue to grow as the algorithms keep learning via the infusion of data.

Read more at Forbes

How Open Source Projects Are Pushing the Shift to Edge Computing

Gnanavelkandan Kathirvel of AT&T is sure of one thing: it will take a large group of open-source projects working together to push computing closer to the edge.

He’s behind the telecom’s efforts at Akraino Edge Stack, a Linux Foundation project that aims to create open source software for the edge. The AT&T contribution is designed for carrier-scale edge computing applications running in virtual machines and containers to support reliability and performance requirements.

To accomplish this, Akraino will count on collaboration from other open source projects including ONAP, OpenStack, Airship, Kubernetes, Docker, Ceph, ONF, EdgeX Foundry, and more. Ensuring there are no gaps in functionality will require strong collaboration between Akraino and upstream open-source communities.

Read more at SuperUser

AT&T Details Open White Box Specs for Linux-Based 5G Routers

This week AT&T will release detailed specs to the Open Compute Project for building white box cell site gateway routers for 5G. Over the next few years, more than 60,000 white box routers built by a variety of manufacturers will be deployed as 5G routers in AT&T’s network.

In its Oct. 1 announcement, AT&T said it will load the routers with its Debian Linux based Vyatta Network Operating System (NOS) stack. Vyatta NOS forms the basis for AT&T’s open source dNOS platform, which in turn is the basis for a new Linux Foundation open source NOS project called DANOS, which similarly stands for Disaggregated Network Operating System (see below).

AT&T’s white box blueprint “decouples hardware from software” so any organization can build its own compliant systems running other software. This will provide the cellular gateway industry with flexibility as well as the security of building on an interoperable, long-lifecycle platform. The white box spec appears to be OS-agnostic. However, routers typically run Linux-based NOS stacks, and that does not appear to be changing with 5G.

The release of specs to the Open Compute Project — an organization that helps standardize open white box designs — departs from the traditional practice of contracting a few vendors to build proprietary solutions for cellular routers. AT&T’s next-gen router blueprint will enable any hardware manufacturer willing to build to spec to compete for the orders. By attracting more manufacturers, AT&T aims to reduce costs, spur innovation, and more quickly meet the “surging data demands” for 5G.

“We now carry more than 222 petabytes of data on an average business day,” stated Chris Rice, SVP, Network Cloud and Infrastructure at AT&T. “The old hardware model simply can’t keep up, and we need to get faster and more efficient.”

The reference design blueprint is said to be flexible enough to enable manufacturers to offer custom platforms for different use cases. In addition to offering faster mobile services, AT&T’s 5G services will enable new applications in “autonomous cars, drones, augmented reality and virtual reality systems, smart factories, and more,” says AT&T.

5G technology will not only provide a major boost in bandwidth for mobile customers, it should also enable wireless services to better compete with the cable providers’ wired broadband Internet services for the home. This week, AT&T rival Verizon opened pre-orders for consumer customers to sign up for 5G home internet service targeted for a launch in 2019.

At publication time, neither AT&T nor the Open Compute Project had yet published the white box specs, but AT&T offered a few details:

  • Supports a wide range of client-side speeds including “100M/1G needed for legacy Baseband Unit systems and next generation 5G Baseband Unit systems operating at 10G/25G and backhaul speeds up to 100G”

  • Supports industrial temperature ranges (-40 to 65°C)

  • Integrates the Broadcom Qumran-AX switching chip with deep buffers to support advanced features and QOS

  • Integrates a baseboard management controller (BMC) for platform health status monitoring and recovery

  • Includes a “powerful CPU for network operating software”

  • Provides timing circuitry that supports a variety of I/O

Vyatta NOS to dNOS to DANOS

Vyatta launched the Debian-based, OpenVPN-compliant Vyatta Community Edition over a decade ago. The distribution, which later added features like Quagga support and a standardized management console, was available in both subscription-based and open source Vyatta Core versions.

When Brocade acquired Vyatta in 2012, it discontinued the open source version. However, independent developers forked Vyatta Core to create an open source VyOS platform. Last year, Brocade sold its proprietary Vyatta assets to AT&T, which developed it as Vyatta NOS.

AT&T will initially load the proprietary, “production-hardened” Vyatta NOS on the white box routers it purchases. However, the goal appears to be to eventually replace this with AT&T’s dNOS stack under the emerging DANOS framework.

Robert Bays, assistant VP of Vyatta Development at AT&T Labs, stated: “Consistent with our previous announcements to create the DANOS open source project, hosted by the Linux Foundation, we are now sorting out which components of the open cell site gateway router NOS we will be contributing to open source.”

dNOS/DANOS aims to be the world’s first open source, carrier-grade operating system for wide area networks. The software is designed to interoperate with the widely endorsed ONAP (Open Network Automation Platform), a Linux Foundation project for standardizing open source cloud networking software. In AT&T’s dNOS announcement in January, which preceded the DANOS project launch in March, the company stated: “Just as the ONAP platform has become the open network operating system for the network cloud, the dNOS project aims to be the open operating system for white box.”

The DANOS project is also aligned with Linux Foundation projects like FRRouting, OpenSwitch, and the AT&T-derived Akraino Edge Stack. The Akraino project aims to standardize open source edge computing software for basestations and will also support telecom, enterprise networking, and IoT edge platforms.

Different Akraino blueprints will target technologies and standards such as DANOS, Ceph, Kata Containers, Kubernetes, StarlingX, OpenStack, Acumos AI, and EdgeX Foundry. In a few years, we will likely see DANOS-based white box gateway routers running Akraino software to enable 5G applications ranging from autonomous car communications to augmented reality.

Join us at Open Source Summit + Embedded Linux Conference Europe in Edinburgh, UK on October 22-24, 2018, for 100+ sessions on Linux, Cloud, Containers, AI, Community, and more.

Open Source Communities Unite Around Cloud-Native Network Functions

Cloud Native Computing Foundation (CNCF), chiefly responsible for Kubernetes, and the recently established Linux Foundation Networking (LF Networking) group are collaborating on a new class of software tools called Cloud-native Network Functions (CNFs).

CNFs are next-generation Virtual Network Functions (VNFs) designed specifically for private, public, and hybrid cloud environments, packaged inside application containers based on Kubernetes.

VNFs are primarily used by telecommunications providers; CNFs are aimed at telecommunications providers that have shifted to cloud architectures, and will be especially useful in the deployment of 5G networks.

Read more at Datacenter Dynamics