
Ancient Greece had its Great Explainers, one of whom was Plato. The open source community has its Great Explainers, one of whom is Michael Tiemann.

The market is dealing with product delays and more energy-efficient x86 chips from Intel and AMD, Dell server General Manager Forrest Norrod says.
The global Smart Cities Market is expected to reach a value of USD 1,265.85 billion by 2019, growing at a CAGR of 14.0% from 2013 to 2019. Increasing migration from rural to urban areas is the major factor driving the growth of the smart cities market globally. North America was the largest contributor to the smart cities market, accounting for a 34.5% share in 2012. This is mainly attributed to growing smart grid investments and upgrades to water infrastructure and the transportation sector. Manufacturers in the region are investing more in smart meters and smart grids, which provide a strong foundation for smart city programs.
Browse the full Smart Cities Market Report at http://www.transparencymarketresearch.com/smart-cities-market.html
Among the different application categories, the smart transportation segment held the largest share, around 16%, in 2012. This was driven by growing demand for advanced traffic management, a better urban environment and a reduced volume of delivery vehicles. Smart transportation links modes of transport to improve traffic flow in both urban and inter-urban networks, and helps minimize the economic burden on governments by reducing travel delays and fuel consumption. Smart security is the fastest-growing segment and is expected to grow at a CAGR of 15.0% during the forecast period from 2013 to 2019. One reason for its growing popularity is that it prevents third-party misuse by imposing strict security requirements on the underlying technology.
In terms of geography, North America represents the largest market for smart cities and is expected to reach a market size of USD 392.41 billion by 2019. Regional governments are taking steps to reduce their carbon footprint by increasing the use of renewable energy resources, and governments in North America are currently working toward a target of zero energy wastage by 2020.
Some of the major players in the smart cities market include Siemens AG, ABB Ltd., IBM Corporation, Hitachi Ltd., Alcatel-Lucent S.A., Honeywell International Inc., Alstom S.A., General Electric Company, Telefonaktiebolaget L. M. Ericsson, Cisco Systems Inc., Oracle Corporation and others.
The global smart cities market is segmented as below:
Smart Cities Market, By Application
Smart homes
Smart buildings
Smart energy management
Smart industrial automation
Smart healthcare
Smart transportation
Smart security
Others (smart water management, smart education, and so on)
Smart Cities Market, By Geography
North America
Europe
Asia Pacific
Rest of the World (RoW)
Browsers have been a hot topic lately here in the Linux blogosphere, not just because of all the woes plaguing Tor in recent weeks, but also because of the increasingly worried mumbling about Firefox’s future. It’s been difficult to discern where a FOSS fan should turn for his or her Internet browsing needs, so it was with great relief that Linux Girl recently came upon an article dedicated to that very topic. Web browsers for the Linux desktop have evolved over the years, said Datamation’s Matt Hartley.
The Samsung supported Flash-Friendly File-System (F2FS) will sport some new functionality with the Linux 3.17 kernel release…
Companies increasingly understand that the key to developing innovative software faster and better than the competition is through the use of open source software (OSS). It’s nearly impossible to use only commercially sourced code and get your software to market with the speed and cost constraints required by today’s product life cycles. Without the ability to choose and integrate best-of-breed OSS, some of the greatest product ideas might never see the light of day.
With the use of open source, however, comes a different set of challenges. While your teams can gain speed and agility, it’s often more difficult to ascertain the code’s true origin and assure that it is secure.
As the OpenSSL Heartbleed vulnerability proved, not knowing what code is in your application or finished product can potentially create critical security threats that require time-consuming remediation efforts. Conversely, having an accurate inventory of what OSS components and versions are used and where can prove invaluable for quickly responding to and remediating vulnerabilities.
The Heartbleed bug reminded developers and companies just how important security is. While there has been widespread debate over whether proprietary or open source software is more secure, the issue is largely moot. The reality is that code defects exist in most pieces of software, regardless of origin, and some affect security.
Security challenges can become even more complex when open source is integrated with internal, proprietary code. In addition to the obvious risk of not properly managing license compliance, tracking code origins and use throughout an organization can become very difficult, very quickly.
To have a truly accurate understanding of your potential vulnerabilities, you need to understand three things:
What code is in your current products and applications?
What code is being used in the front end of the development process and where are developers acquiring these components?
What components are being used at the back end of the process and where does code need to be validated before it is deployed?
All companies should check their code against common vulnerability databases, such as the United States National Institute of Standards and Technology’s National Vulnerability Database (NVD). Resources like the NVD track security vulnerabilities and provide severity rankings to help companies keep their code secure and up to date.
If you’ve never reviewed your code against a vulnerability database, it may seem like a daunting task. Fortunately, there are tools that leverage these databases to regularly and automatically identify all open source security vulnerabilities, alerting and tracking where affected components are in use and in need of remediation.
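To make the idea concrete, here is a minimal sketch of the kind of automated matching such tools perform: checking a component inventory against a list of known vulnerabilities. All the component names, versions and CVE entries below are invented for illustration (except CVE-2014-0160, Heartbleed); real tools pull their data from feeds such as the NVD rather than a hand-maintained table.

```python
# Known-vulnerable (component, version) pairs with their advisories.
# Hand-maintained here for illustration; real tools query the NVD feed.
KNOWN_VULNERABILITIES = {
    ("openssl", "1.0.1f"): ["CVE-2014-0160"],  # Heartbleed
    ("libfoo", "2.3.0"): ["CVE-2014-9999"],    # hypothetical entry
}

def find_vulnerable(inventory):
    """Return inventory entries that match a known vulnerability.

    Each inventory entry is (component, version, location), where
    location records which product or application uses the component.
    """
    findings = []
    for component, version, location in inventory:
        cves = KNOWN_VULNERABILITIES.get((component, version))
        if cves:
            findings.append((component, version, location, cves))
    return findings

# Example inventory: only the exact vulnerable version is flagged.
inventory = [
    ("openssl", "1.0.1f", "payments-service"),
    ("openssl", "1.0.1h", "web-frontend"),
    ("zlib",    "1.2.8",  "build-tools"),
]

for component, version, location, cves in find_vulnerable(inventory):
    print(f"{component} {version} in {location}: {', '.join(cves)}")
```

The key point the sketch illustrates is that an accurate inventory makes remediation a lookup: once you know exactly which versions are deployed where, flagging affected components is trivial.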
Continuously monitoring your codebase helps guarantee that unknown code is identified, code origin is understood, license information is up to date and future security vulnerabilities are quickly flagged for resolution. If your company has an accurate code inventory in place, you can easily find vulnerable code and remediate it to ensure your business – and your customers – remain secure.
Most developers are attracted to OSS because it’s easy to access and free to acquire, usually allowing them to forgo a formal procurement process. Yet, while many development organizations have policies or guidelines for open source use, they are not always enforced and often not properly tracked. It’s important to track what code is coming into your organization, whether it’s been approved for use and where it’s used throughout your organization.
Once you know what you have, you need to establish governance. By implementing a management framework throughout the development process, you can ensure accurate descriptions of the code are captured and eliminate questions as to what code is where and whether it’s up to date. Manually managing this process is nearly impossible, which is why best-in-class companies actively manage their use of open source through automated code management and audit solutions.
Although every company and development team is different, the following processes have been proven to help organizations of all sizes manage and secure their use of OSS:
Automate Approvals and Cataloging – Capture and track all relevant attributes of OSS components, assess license compliance and review possible security vulnerabilities through automated scanning, approval and inventory processes.
Maintain Updated Code Versions – Assess code quality and make sure your product is built using the most updated versions of the code.
Verify Code – Evaluate all OSS in use; audit code for security, license, or export risk and remediate any issues.
Ensure Compliance – Create and implement an open source policy, and establish an automated compliance process to ensure that policies, regulations and legal obligations are followed across the organization.
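The approval and compliance steps above can be sketched as a simple policy gate: a component is cleared for use only if it appears in an approved catalog and carries a license the policy allows. The catalog entries and allowed-license list below are invented for the example; real systems layer scanning, version tracking and audit trails on top of this basic check.

```python
# Hypothetical policy: only these licenses are permitted.
ALLOWED_LICENSES = {"MIT", "Apache-2.0", "BSD-3-Clause"}

# Approved catalog: component name -> declared license.
CATALOG = {
    "requests": "Apache-2.0",
    "left-pad": "WTFPL",   # cataloged, but license not allowed by policy
}

def approve(component):
    """Return (approved, reason) for a requested component."""
    license_ = CATALOG.get(component)
    if license_ is None:
        return False, "not in approved catalog"
    if license_ not in ALLOWED_LICENSES:
        return False, f"license {license_} not permitted by policy"
    return True, "approved"

print(approve("requests"))    # cleared
print(approve("left-pad"))    # blocked by license policy
print(approve("unknownlib"))  # blocked: not cataloged
```

Automating this gate is what lets the policy actually be enforced at developer speed, rather than relying on a manual review that is skipped under deadline pressure.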
As the use of software across industries proliferates, open source will continue to play a crucial role in developing the newest innovations. To prevent security vulnerabilities in this increasingly complex environment, companies must actively manage the flow of open source throughout their organization and establish processes to regularly check their code against vulnerability databases for fast and easy remediation.
Bill Ledingham is the Chief Technology Officer (CTO) and Executive Vice President of Engineering at Black Duck Software. Previously, Bill was CTO of Verdasys, a leader in information and cyber security, where he worked closely with leading Global 2000 companies and government organizations to safeguard their most sensitive information. Bill has been on the founding team of four companies, is active in the Boston start-up community, and has been a partner/investor with CommonAngels for the past 6 years.
Ugoos is prepping an Android 4.4 "S85" media player dongle with a quad-core Amlogic S805 Cortex-A5 SoC clocked to 1.5GHz, and a quad-core Mali-450 GPU. Ugoos has spun a variety of Android media player boxes and dongles over the last few years, including a UT3 box, featuring Rockchip’s quad-core, Cortex-A17 RK3288 system-on-chip with a 16-core […]
Congratulations to Zach Villers, the second winner of our Linux poetry writing contest. Zach has won a free pass to LinuxCon and CloudOpen North America, Aug. 20-22 in Chicago, for his Systemd haiku.
“I wrote the haiku after a particularly frustrating go-round with Arch Linux,” he writes. “Being fairly new to Arch, I somehow ended up with three or four separate daemons trying to manage network connectivity. I had to read through some documentation to figure out all of the commands to list (list-units), start, and stop the extra services that were conflicting with NetworkManager.
“I had just learned init and this was a little foreign to me, but sometimes breaking something is the best way to learn about it. Fedora, RHEL, and CentOS all now use systemd, Debian and Ubuntu are heading there as well, so we might as well start saying goodbye to init,” he said.
The contest ended Aug. 1 and was inspired by software developer Morgan Phillips, who is teaching herself about Linux by writing poetry.
Systemd Haiku
List-units, start, stop
enable and disable
goodbye to init
Alleging that the company is being stiffed by Samsung, Microsoft turns to the courts.