Nearline is framed as the polar opposite of tiered storage, since Google’s new storage option promises virtually immediate access.
IBM Platform Computing – Ready to Run Clusters in the Cloud
The demands of users running scientific, technical, financial, or research applications can easily outstrip the capabilities of in-house server clusters. IT departments have to anticipate compute and storage needs for their most demanding users, which can lead to extra CAPEX and OPEX spending once the workload changes.
How to Test Your Internet Speed Bidirectionally from Command Line Using ‘Speedtest-CLI’ Tool
We often need to check the speed of the Internet connection at home and at the office. How do we usually do this? We go to a website such as Speedtest.net and start a test. The site loads JavaScript in the web browser, selects the best server based on ping, and outputs the result. It also uses a Flash player to produce graphical results.
But what about a headless server, where no web browser is available? After all, most servers are headless. Another limitation of browser-based speed testing is that you can’t schedule tests at regular intervals. Enter “Speedtest-cli”, an application that removes these limitations and lets you test the speed of your Internet connection from the command line.
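As a sketch of the scheduling workflow mentioned above, a single cron entry is enough to log results at a regular interval. This assumes speedtest-cli is already installed (e.g. via pip) at the path shown; the binary location and log path are assumptions you should adjust for your system.

```shell
# Crontab fragment (add via `crontab -e`): run a speed test every hour
# and append the plain-text result (ping, download, upload) to a log file.
# Both /usr/bin/speedtest-cli and the log path below are assumptions.
0 * * * * /usr/bin/speedtest-cli --simple >> /var/log/speedtest.log 2>&1
```

The `--simple` flag limits output to the three key numbers, which keeps the log easy to parse or graph later.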
http://www.tecmint.com/check-internet-speed-from-command-line-in-linux/
Virginia Tech’s Linux Laptop Orchestra Puts a New Twist on Orchestral Music
The Linux Laptop Orchestra doesn’t need instruments to make music.
“It’s basically my crazy idea,” said Ivica Ico Bukvic, a professor at Virginia Tech and the founder of Linux Laptop Orchestra, or L²Ork, a group that makes music with computers and motion-sensing remotes. Hosted by the Design | Cultures + Creativity and Honors Humanities living-learning programs and the architecture school, L²Ork will bring its digital art to this university through a performance, lecture and design workshop.
Bukvic and L²Ork have fostered musical progress since 2009. Using relatively cheap but innovative tools, the group built the world’s first Linux-based laptop orchestra.
Read more at the Diamondback.
Linux Adopts Conflict Resolution Code
“If you can’t take the heat, get out of the kitchen” could be the unofficial motto of the Linux kernel community. Over the years, there has been one conflict after another in the heart of the Linux development community, the Linux Kernel Mailing List (LKML). Now, in order to make the LKML more peaceful, the group has adopted a Code of Conduct.
This title isn’t quite accurate; it’s not a code of conduct. Rather, it describes a method to resolve conflicts.
Read more at ZDNet.
Tor Browser: An Ultimate Web Browser for Anonymous Web Browsing in Linux
Most of us spend a considerable amount of time on the Internet. The primary application we need for our Internet activity is a browser, a web browser to be more precise. Most of our online activity is logged on server/client machines, including our IP address, geographical location, and search/activity trends, along with a whole lot of other information that can potentially be very harmful if used with ill intent…
http://www.tecmint.com/tor-browser-for-anonymous-web-browsing/#
Heeding Jim Zemlin’s Call to Help Secure Internet Infrastructure Projects
Stephen Walli works on the Helion converged cloud team at Hewlett-Packard where he does business strategy and evangelism.
Jim Zemlin has always been quotable. His keynote at this year’s Linux Foundation Collaboration Summit provided a great summary (as always) of the growth of the Linux ecosystem, but also focused on the serious problems in the security of the Internet in an era when key breaches have their own branding and logos. [Think Heartbleed and Shellshock.] He ran through some scary facts:
- OpenSSL secures most of the Internet, yet the small team at the OpenSSL Foundation (including the “two Steves”, Steve Henson and Steve Marquess) was pulling in only about US $2,000 in donations per year.
- Theo de Raadt supports OpenSSH part-time.
- Harlan Stenn “runs the clocks on the Internet”, i.e. he’s responsible for ntpd, and until recently was earning about US $25K/year.
- Werner Koch maintains GnuPG, which secures a lot of email and provides the confirmation that a software package is what it says it is. According to a recent interview, he was going broke.
Before readers unversed in open source software grow concerned about the security of open source software, let us remember this is a software problem, not an open source problem. Closed proprietary products have their share of exploits, too. But with open source licensed software, the broad community can actually do something about it.
Review creates secure code
If Linus’s Law is true, “Given enough eyeballs, all bugs are shallow,” it is perplexing that such security problems persist. Jim suggested, as he gave the above examples, that “there just aren’t enough eyes.” I’d offer a corollary. I think vibrant projects live a culture of review before code gets committed, and I think this is because the developers have perspective and context that can never be built into a static analysis tool. Tools can find obvious portability breakage or some security-related issues (e.g. buffer overflow problems), that is, issues that are essentially syntactic, but a human can understand the semantic content of the code in front of them.
There’s even research to back this up:
- “Code review is more effective than test because in review the faults in the code are found directly, while testing uncovers only the symptoms of problems, requiring debugging to find the direct cause. The seriousness of the wrong behavior by the system does not have a relation to the type of mistake, since even simple mistakes can cause complex behaviors.”
– Walcélio Melo, Forrest Shull, Guilherme Horta Travassos, “Software Review Guidelines”, Technical Report ES 556/01, COPPE/UFRJ, August 2001.
- “Software inspections are indeed cost-effective: They find about 70% of the recorded defects, take 6% to 9% of the development effort, and yield an estimated saving of 21% to 34%. i.e., finding and correcting defects before testing pays off, so indeed ‘quality is free’.” and: “Individual inspection reading and individual Code Reviews are the most cost-effective techniques to detect defects, while System Tests are the least cost-effective.”
– From the Conclusion of Reidar Conradi, Amarjit Singh Marjara, Øivind Hantho, Torbjørn Frotveit, and Børge Skåtevik, “A Study of Inspections and Testing at Ericsson, NO”, a rewritten edition of the paper presented at PROFES’99 (Oulu, Finland, June 1999).
In the open source community bugs are found quickly, but this happens after the fact. Vibrant projects live a culture of review before code gets committed, so it’s much more likely they find the bugs before they ship. Many of the key projects that have had breaches are taken for granted and don’t necessarily have the vibrancy of a Linux or an OpenStack. These projects have simply become part of the fabric of the Internet.
Invest in Core Infrastructure
The Linux Foundation is stepping up to tackle these problems with the Core Infrastructure Initiative (CII). The Foundation is in an excellent position at this point in time to be the centre of gravity for such industry action. Jim talked about the Initiative in his keynote. A broad collection of players have banded together to provide a three-pronged approach to the problem of securing the open source software that secures the Internet.
1. Invest in the people that are the most knowledgeable about key projects.
2. Provide deep audits for the key projects to work to prevent the next security breach.
3. Develop and disseminate best practices and guidelines for developing and deploying secure software.
Jim’s big concerns are how not to perturb the market economics that drive open source software ecosystems and how to avoid creating an open source welfare state. He rightly pointed to the I-35W bridge collapse as an example of failing infrastructure that should have been fixed before a key interstate artery collapsed. I think that’s the right idea economically.
Governments invest in infrastructure to enable economic growth. Support and investment for rights of way for railroads, deep port infrastructure, or interstate highway systems creates the transportation infrastructure that enables economic growth and free markets for all. All the projects Jim discussed are fundamental Internet infrastructure. If a project under consideration implements or secures an underlying universal communications protocol or cryptographic algorithm then it is probably a good candidate for CII investigation.
Likewise, not being owned by a corporation seems to be a necessary attribute. A database, even one as broadly deployed as MySQL, shouldn’t be a candidate. A fabulous engine for application deployment (node.js) is owned by a company. I’m pretty sure the investors would love the vendor community to invest in securing node.js “because it’s so hugely important going forward at enabling blah blah marketing blah.” Sorry: if a company controls the copyright, then you’re off the list. Private roads didn’t get government-funded bridges.
The Linux Foundation is obviously not a government, but it is a well-funded, well-organized industry non-profit. As such it provides an excellent place for the vendors that best benefit from the Internet infrastructure to collectively support the infrastructure on which they all depend.
The Core Infrastructure Initiative efforts are fundamentally important. A complete list of participants to date exists on the Linux Foundation site. Jim’s excellent keynote is up on YouTube (below), the slides will hopefully be up soon, and Jim’s blog post introducing CII is published on the Linux Foundation blogs. If your company isn’t supporting the initiative, it is well worth exploring how best to participate.
This article is republished with permission from Stephen Walli’s blog.
Red Hat Enterprise Linux (RHEL) 7.1 Released – A Quick Review and Installation Instructions
Red Hat Enterprise Linux, commonly (though not officially) abbreviated as RHEL, is a commercially developed Linux distribution. Red Hat provides its source code for free but restricts free redistribution of its officially supported builds. Third-party derivatives and community-supported distributions, be it CentOS, Oracle Linux, or Scientific Linux, build and redistribute the distribution after removing Red Hat’s non-free components and trademarks.
http://www.tecmint.com/redhat-enterprise-linux-7-1-installation/
Facebook Develops ‘Yosemite’ Open Server Chassis with Intel
The social networking company is contributing Yosemite and open networking gear to the Open Compute Project Foundation.
The Latest Round Of GNOME’s Outreach Program For Women Wraps Up
The ninth and latest round of GNOME’s Outreach Program for Women (OPW) came to an end yesterday…