By Sumit Gupta, Vice President, High-Performance Computing and OpenPOWER, IBM
December 8, 2015 | Provided by IBM
You’re constantly leaving an electronic trail, and you’re not alone.
We’re all constantly sharing billions of pieces of data about our shopping habits, where we like to eat, and places we visit often, as well as the messages, pictures, and video we post on social media. Companies want to use this data to provide us better and new services—and of course, sell us more stuff.
From a company’s point of view, the most critical aspect of this process is time to insight—that is, how long it takes to obtain actionable insights from the data they’ve captured and analyzed. For example, imagine that you’re on a flight that’s running late. During the delay, the flight attendant offers you a complimentary drink that just happens to be your favorite cocktail. That kind of service requires real-time analysis of data to quickly obtain insights about your preferences and habits.
Creating the computer systems and software needed to sift through all that data and provide faster time to insight is among today’s biggest challenges—and opportunities—for the high-technology industry. Corporate customers in this space are turning to tech companies for help in building bigger computer systems and sophisticated software to enable fast, effective big-data analytics.
Today, big-data computer systems and software are increasingly being built on an innovative, collaborative open-source foundation. In fact, 170 technology companies, many of which compete with each other, have formed a consortium to provide some of the technologies needed to fuel big-data analytics. The new organization, the OpenPOWER Foundation, allows these companies to collaborate in an open ecosystem.