
Google Drive Will Provide Storage Rescue for Chrome OS

Following years of rumors that Google would launch a cloud-based storage service to compete with players such as Dropbox, Google did indeed introduce Google Drive in late April. You can sign up for 5GB of free cloud storage from Google and use it efficiently with your Android device, but we made the point last month that a big part of Google Drive's purpose is to fill a gap in Google's Chrome OS. As we've reported before, with Chrome OS, Google bet heavily on the idea that consumers and business users would have no problem storing data and using applications in the cloud, rather than the locally stored data/applications model that most people are used to. Now, there are clear signs that Google Drive is going to be the stopgap that solves this Chrome OS problem. In a recent post, I wrote: "Google could create useful synergies between a new cloud-based storage service and Chrome OS, and there might even be room to give people storage incentives in the cloud if they choose Chrome OS. That kind of incentive might entice some businesses to adopt Chromebooks and Google's operating system." The price-per-gigabyte of storage has been dropping for many...


The Web vs. the Cloud: Which Metaphor Survives?

A recent list of what are frequently called the "most valuable Web startups" is topped by the likes of Facebook, Zynga, Groupon, Twitter and Dropbox. The common factor among them, in everyday conversation, is that they're considered Web businesses. The key word here is "considered." If you ask Salesforce CEO Marc Benioff, Facebook already ate the Web and is licking up the remains. Perhaps more than any other company, Salesforce represents the perspective of "the cloud," and from this point of view it is the cloud, not the Web, that is driving the Internet today. ...


Google's New BigQuery Commoditizes Big Data Analytics

Google is moving the goalposts significantly in the market for big data tools, at least for organizations that can work with its canned tools and are willing to trust the search giant with their data. After some time in a limited preview, Google has unveiled Google BigQuery for public consumption. Google is giving developers the ability to query up to 100GB of data per month for free, or store up to 2TB of data, without having to contact sales at all, which sets a very low bar for working with big data. Google's BigQuery is a Platform-as-a-Service (PaaS) for working with "massive datasets" that can run to billions of rows. It has a SQL-like query language and promises to analyze large data sets "in seconds." Note that organizations that want a Google-hosted SQL database can tap the Cloud SQL offering. What's most interesting about BigQuery is that it provides big-data analytics as a completely hosted offering. Organizations don't have...
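To give a feel for the SQL-like aggregation style BigQuery is built around, here is a minimal local sketch using Python's built-in sqlite3 module. This is not BigQuery itself, and the table and column names (`pageviews`, `country`, `views`) are hypothetical; the point is only the shape of a GROUP BY aggregation over an event-style dataset:

```python
import sqlite3

# Hypothetical event log standing in for a "massive dataset".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pageviews (country TEXT, views INTEGER)")
conn.executemany(
    "INSERT INTO pageviews VALUES (?, ?)",
    [("US", 120), ("US", 80), ("DE", 50), ("JP", 30), ("DE", 20)],
)

# BigQuery-style aggregation: total views per country, largest first.
rows = conn.execute(
    "SELECT country, SUM(views) AS total "
    "FROM pageviews GROUP BY country ORDER BY total DESC"
).fetchall()

for country, total in rows:
    print(country, total)
```

The same query shape scales from a five-row toy table to the billions of rows BigQuery targets; the hosted service's value is that the scan happens on Google's infrastructure rather than your own.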


Consumerization vs. Vendor Lock-In

The term, "vendor lock-in" strikes terror throughout the IT community. And yet in reality, many companies are pursuing strategies destined to increase their dependence on a limited number of vendors mostly driven by the ineffectiveness of IT to provide simple connectivity capabilities between various corporate applications. By shrinking the number of vendors, IT is actually creating vendor lock-in. Instead, IT should be aligned with its business users as they seek to increase information access, both internally and externally, by promoting increased vendor participation - more expansion, inclusion and diversity of information sources. Restricting information flow is a flawed control tactic doomed to fail. Guest author John Yapaola is CEO of Kapow Software. He has a successful track record of managing and growing high-tech startups. With Kapow Software, he has created the industry's leading provider of cloud, mobile, social and Big Data application integration solutions that drive enterprise innovation and transformation for companies like Audi, NetApp, Intel and Commerzbank, and dozens of federal agencies. At a recent IT forum, attending CIOs were asked, "How many cloud offerings do you currently have in your organization?" The responses varied, but many had more than a dozen presently and growing. One of the CIO panelists noted, "We are now beginning to restrict the expansion of cloud offerings in the company. We slap their hands if they add any more." That was a disheartening response. Policing their customers (the business users) is not a viable solution. How about being less CIOfficer and more CIOptimizer? The Crowd-Sourced Browser Standard and controls are two words not often...


Cassandra 1.1 Brings Cache Tuning, Mixed Storage Support

Apache has dished out another serving of Cassandra, the open source NoSQL database popular for handling big data. The improvements speak to a maturing NoSQL database that's well-suited for big data deployments. This time around, Cassandra brings improvements to its query language, along with tuning improvements that will help companies trying to boost performance with a mixture of magnetic media and solid state drives (SSDs). Its continued development helps maintain open-source dominance in the big data/NoSQL market. Cassandra 1.1 arrives just over six months after Apache released Cassandra 1.0 in October 2011. The major features in 1.1 point to Cassandra's focus on very large data sets.

Notable Features

Jonathan Ellis, vice president of the project and CTO of DataStax, pointed to several features that make 1.1 more than just a minor update. One of the most interesting is Cassandra's support for intelligently mixing magnetic and SSD media. Ellis says that a Cassandra deployment may have some tables that are updated more frequently than others, so it makes sense to put some tables on magnetic media (which is much slower) and others on SSD. Prior to the 1.1 release, Cassandra had no way of distinguishing between the two. This meant that if you mixed media, you could see very uneven results. The alternative, going all SSD or all spinning disks, was either very expensive (SSD) or much slower (magnetic media). Cassandra deployments can hit hundreds of terabytes of data. The largest known production cluster, according to Apache, exceeds 300TB of data spread...
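The hot/cold placement logic Ellis describes can be sketched in a few lines of Python. This is only an illustration of the reasoning, not how Cassandra configures storage (which happens through its own configuration, not application code); the table names, update rates, and threshold below are all hypothetical:

```python
# Sketch of the hot/cold placement idea behind Cassandra 1.1's mixed
# SSD + magnetic storage support. All numbers and names are made up.

# Approximate updates per second observed for each table.
update_rates = {
    "user_sessions": 5000,  # hot: rewritten constantly
    "click_stream": 1200,   # hot
    "user_profiles": 40,    # warm
    "audit_archive": 2,     # cold: mostly append-and-forget
}

HOT_THRESHOLD = 100  # updates/sec above which fast storage pays off


def assign_tier(rate: int) -> str:
    """Place frequently updated tables on SSD, the rest on magnetic media."""
    return "ssd" if rate >= HOT_THRESHOLD else "magnetic"


placement = {table: assign_tier(rate) for table, rate in update_rates.items()}

for table, tier in sorted(placement.items()):
    print(f"{table}: {tier}")
```

The payoff is the one the article describes: expensive SSD capacity goes only to the tables that churn, while cold archival tables sit on cheap spinning disks, instead of forcing an all-SSD or all-magnetic deployment.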

