
Roger Strukhoff




Big Data Swamping My Timeline

Are We Devaluing Our Capacity for Abstraction?

Tibco Software is holding its annual user conference this week in Las Vegas, and Big Data is among the big topics of discussion. The company claims it was doing big data before there was Big Data, which reminds me of all the companies who've said they've been doing cloud computing for, you know, ever. (Disclosure: I worked at Tibco from 2006-09, a time during which it acquired Spotfire, which did bring a layer of analytics and big data-ish capability to the company.)

In any case, Big Data has been dominating my Twitter timeline these last few weeks, with a sharp spike the likes of which I've not seen before.

The Big Data landscape looks like utter chaos to me right now. I don't know how to define Big Data, and not even NIST has defined it, the way it has helpfully if loosely defined cloud computing.

NIST did hold a workshop on the topic back in June. The first thing to jump out at me from that event was the old-line Big Data people - aka scientists - gamely outlining the classic Big Data applications - nuclear-event simulation, epidemiology, meteorology, and other massively computational uses - while noting that NIST had its first seminar on this topic in 1997.

But today, Big Data has been hijacked by real-time data-collection activities, and all the nefarious uses to which it can be put by marketers. There's still big science involved in developing Big Data frameworks and apps, but the application focus is moving from controlling measles and predicting hurricanes to capturing eyeballs and selling coupons.

In addressing issues such as sensor density, sampling rates, and real-time graphic simulation, I'm reminded of the late Argentine author Jorge Luis Borges's perfect map of the world - one that would be the same size as the world and feature every detail precisely.

In other words, by creating perfect pictures of the world (or perfect pictures of a customer's buying activity), are we missing something essential in the human mind's capacity for abstraction? Would Big Data have helped design the original Ford Mustang and identify its buyers? Did it drive the vision behind the iPod, iPhone, and iPad?

So yes, I'm uncomfortable with the all-out onslaught on the term Big Data. I fear its best uses may be obscured and perhaps lost within a miasma of marketing-driven bloviation. The same thing almost happened to cloud computing, and with today's conflation of Big Data and cloud computing, it could still happen.

I'll be at Cloud Expo/Big Data Expo in November, wearing my hair shirt and admonishing anyone who's abusing the terms. I may not have such a problem there, though, as I expect discussions to be highly technical and useful. Maybe I just need to stop taking my Twitter timeline so seriously.

Follow me on Twitter

More Stories By Roger Strukhoff

Roger Strukhoff is Executive Director of the Tau Institute for Global IoT Research, (@IoT2040), with offices in Illinois and Manila. He is also a writer & editor for SYS-CON Media. He writes for IoT Journal & Computerworld Philippines. He is Conference Chair of @ThingsExpo. He has a BA from Knox College, Certificate in Tech Writing from UC-Berkeley, and MBA studies at CSU-East Bay. He serves on the board of the Campbell Center for Historic Preservation Studies, and has served as Director, U.S. Coast Guard Aux Int'l Affairs.