How to avoid big data project failures: Your 5-step guide
The big data project failure rate is 60%, according to 2015 research from Gartner. A major contributor to this failure rate is political or people-oriented rather than technological. What can big data project managers do to avoid these failures?
"To succeed, you must develop a viable strategy to deliver business value from a big data initiative," according to Gartner research director Svetlana Sicular, as stated in a Gartner blog post. "Then map out and acquire or develop the missing and specialized skills that are needed. Once strategy and skill priorities are addressed, then you can move on to big data analytics."
Artificial data reduces privacy concerns and helps with big data analysis
Much has been said about how big data will help solve many of the world's thorniest problems, including pandemics, hunger, cancer treatments, and conservation. However, because of the seriousness of these problems and the complexity of big data and its analysis, a great deal of testing is required before any results can be considered trustworthy. Unfortunately, most businesses and organizations lack the in-house capability to establish that trust, so the normal procedure has been to outsource the work to third-party vendors.
How Colleges Use Big Data to Target the Students They Want
A decade ago, Saint Louis University found itself in a precarious situation. About half of the university's 8,600 undergraduates were from Missouri and Illinois, and the demographic forecast for the Midwest looked bleak: the number of high-school graduates from the region was projected to drop by nearly a third by 2028.
So the university started to dig deeper for prospects in its backyard, purchasing more names of prospective high-school students from the College Board and ACT and targeting those teenagers with marketing materials. At one point, admissions officials at Saint Louis University were buying upwards of 250,000 names annually.