To the cloud, big data sisters and brothers, to the cloud
While reports of big data’s death have been greatly exaggerated, the skepticism is not unwarranted. The cloud may have some of the answers, but it won’t solve all of big data’s problems.
Gartner says the big data party is over, and Tony Baer wonders whether we can even make data science (and engineering) work. And the results of a survey recently published by Dimensional Research (DR) and sponsored by Snowflake are clear: although 100% of participants acknowledge that data initiatives are important, the vast majority (88%) have had “failed” projects.
Respondents reported several things that would help them get more out of their current data environments: the ability to implement and deploy faster, to make data available sooner, to simplify tool sets, and to reduce the overhead of managing infrastructure. DR states that cloud analytics has the potential to deliver these benefits.
But expecting cloud infrastructure alone to deliver big data initiatives from their maladies verges on the metaphysical, much as Chekhov’s heroines expected that fleeing to Moscow would deliver them. There’s more to getting big data right than a change of venue, and we’ve had Jon Bock, Snowflake’s VP of products, weigh in on the subject.
People cite data infrastructure “inflexibility” as a major cause of problems, and according to Snowflake that often boils down to delays caused by complexity and resource constraints: getting the right capacity deployed and usable takes time and substantial work, and each new project can put new strains on the existing infrastructure.
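To put Bock’s point in concrete terms: on an elastic cloud warehouse such as Snowflake’s, resizing capacity is a SQL statement rather than a procurement cycle. The sketch below assumes the snowflake-connector-python client; the account identifier, credentials, and warehouse name are hypothetical, and it is illustrative rather than a recommended operational pattern.

```python
# Minimal sketch of the elasticity argument: with a cloud warehouse,
# scaling capacity up and back down is two statements, not a hardware order.
# Account, credentials, and warehouse name below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="REPORT_USER",
    password="...",              # placeholder credential
    account="myorg-myaccount",   # hypothetical account identifier
)
cur = conn.cursor()

# Scale up before a heavy batch workload...
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'XLARGE'")

# ...run the workload (placeholder query)...
cur.execute("SELECT COUNT(*) FROM sales.orders")

# ...then scale back down so unused capacity stops accruing cost.
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'XSMALL'")
conn.close()
```

On fixed on-premises infrastructure, the equivalent of those two ALTER statements is ordering, racking, and configuring hardware, which is exactly the kind of delay respondents describe.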