How To Learn From Your Data

Data Center

Five years ago, the founder of Microsoft Live Labs, Gary Flake, asked how you can take a big collection of things and make sense of it. The question led to the development of Pivot, released in February 2010.

The idea was to transform data into a collection of images and textual data. By condensing this data into thumbnails, a user could glance over a collage and zoom in on the information he or she was seeking.

Data presented in this way was easier to organize into different configurations. The graphical interface was simple and intuitive. It was a remarkable way to gain insight into data collections by having the information pop out visually at you.

Unfortunately, Microsoft Live Labs closed down, and this ambitious project to use visualization technology to dynamically sort and organize data never quite caught on. Today, Microsoft Pivot has become PivotViewer, which falls under the aegis of the Silverlight Developer Center and is now distributed as part of Silverlight.

Meanwhile, a company called IQMS is working on ERP and MRP systems. Let’s break down the geek speak here. ERP stands for Enterprise Resource Planning, and MRP stands for Material Requirements Planning. Essentially, what IQMS does is condense the large amounts of data faced by manufacturers into organized information. In practice, that means figuring out how to plan production, schedule work, and control the inventory involved in manufacturing.

What both these companies are trying to do is figure out how to handle large amounts of data. While Microsoft Live Labs was not able to scale up its idea, IQMS has found a workable way of organizing the large influx of manufacturing data.

How Big Is Big Data?

Although these are bold, heroic, even ingenious answers, and they do work for selected data sets, no comprehensive solution has emerged for the continuous influx of data, which will be so humongous that it will be measured in zettabytes.

One problem is that much of the data is locked behind corporate firewalls, which means that independent innovators can’t access it. As a result, those trying to jump in and work on an answer have only an abstract idea of what big data is really like.

The Big Deal About Big Data

Although the buzzword is new, the idea is not. Increasing amounts of data have been accumulating for centuries; what is different now is the rate at which data is being collected and the promise of analytics to make sense of it.

Never before was there an opportunity to get excited about the possibility of transforming retrospective marketing into predictive marketing. In theory, if big data can be analyzed, it will be possible to spot trends, make correlations, and run more efficient and profitable businesses. That is the Holy Grail which makes Big Data a big deal.

The Holy Grail Of Marketing

What if we now have enough technology to make sense of mountains of information that were previously stored away because they were so overwhelming?

What if we can now sort out, prioritize, and analyze structured and unstructured data?

Currently, it’s estimated that 80% of data is unstructured while only 20% is structured. Chief information officers around the world are beginning to wonder about the possibility of contextualizing structured data and mapping unstructured data into a meaningful pattern.

Accelerated Data Creation

What every company wants to know is what customers think about it.

This information actually exists in many conversations conducted on social media, video conference meetings, PDF documents, emails, fax messages, and so on. Much of this information is recorded on computers, smartphones, and tablets.

Society, as a whole, has become very good at creating vast amounts of content that could help an organization estimate its future and anticipate consumer trends. Creating data has become easy; figuring out how to use it has become more difficult.

Putting Big Data Into Perspective

The idea of unlocking big data is a little like the Faustian dream of knowing everything. Information is only going to continue to proliferate, and even if we can get machines to do our thinking for us and make a dent in this sea of knowledge, will we have the patience to become wise enough to use the data well?

Something that has worked in a humble way to make sense of large volumes of data is enterprise resource planning (ERP) software.

While ERP is not a solution to Big Data, it is an elegant way of handling the large amount of structured data that a company has to deal with as it tries to work out its destiny.

Here is an excellent definition from Webopedia:

“Enterprise Resource Planning (ERP) is business process management software that allows an organization to use a system of integrated applications to manage the business and automate many back office functions related to technology, services and human resources. ERP software integrates all facets of an operation, including product planning, development, manufacturing, sales, and marketing.”

In putting the quixotic quest for Big Data into perspective, perhaps we are ignoring many of the things that are already working well for us and that we can continue to augment to become even more useful. Investing in ERP research to improve it further appears to be a more realistic approach than lusting after the unrealistic total assimilation of Big Data.

This article was written by BusinessVibes from Business2Community and was legally licensed through the NewsCred publisher network.
