You know, if I'm not careful, I'm going to rapidly turn into The Blogger Who Only Talks About Nalts. Of course, if he'd stop writing so many insightful articles related to the online video space, then maybe I'd be forced to go out and find something else to talk about. Until then… you should definitely throw Nalts' RSS feed in your preferred reader.
Nalts then goes on to talk about what advantages online video has going for it that social media does not. For instance, there's no login required to watch an online video, whereas most social media requires you to sign up and/or log in. Online video is also very low-maintenance, he says. I watch my video and then I'm gone.
Good points. He then goes on to suggest that social media will lose its charm in 2010, and that online video will always trump social media.
And now for the fun part (for me, at least): I'm not sure I could disagree more. Yes, online video is huge, and will only get bigger. But I'm not sure this study actually gives us much valuable information with which to draw conclusions.
So I followed the link in his post to Liz Gannes' write-up of the Pew study. She linked to MediaPost, saying they were the first to write up the study. But MediaPost doesn't cite or link to any sources. (Seriously, how can people write up something like this without citing any source?)
So I went to Google to try and find the original Pew release. And guess what… I found it here. (See… it's not that hard to link to the original study).
I have a few problems with this study:
1) Survey Subjects
Pew appears to have surveyed 2,253 adults (18 and over) to create this report. Really? Adults? We're going to draw conclusions about online video (like YouTube and Hulu) versus social media (like MySpace and Facebook and Twitter) but we're not going to talk to any teenagers?
Yes… Pew's write-up mentions "young adults" several times, but they've clearly labeled people aged 18-29 as "young adults". There's no mention whatsoever of people under 18.
That's just flawed judgment. Now, for all I know, Pew has to jump through a lot of legal hoops to ask teenagers questions. But I see studies all the time about teenagers, so you can't tell me it's impossible to interview teens for a survey.
And I'm not suggesting that adding teens to the mix would have tipped the scale for social media over video either. I'm simply suggesting that we can't know the real answer as long as we're leaving out one of the largest demographics for both online video and social media. Hey Pew… next time why don't you take a survey on Wii usage versus the Xbox 360, and poll 1,300 people in nursing homes to gather your data.
Yes, I know that the median age for Facebook is rising—ditto for most sites—as the older generations hop on the bus. But any conclusion about online video versus social media that doesn't factor in the under-18 crowd is just worthless information.
2) No Universal Definition of Social Media
Even Nalts admits early in his article that he's not sure all social media users even know that they're using social media. Let's use my mother as an example (Mom, I promise not to use you as my example in every post I write… I swear). She's on Facebook. But I bet you $100 that if you asked her if she was using social media, she'd either say "no" or "what's social media?" I guess I just think this is a much bigger red flag than Nalts seems to think it is.
But how do we even define social media? Is it just Twitter and Facebook? How about Digg? Are blogs social media? I could make a pretty fine argument that YouTube itself is social media, which would really blow up the survey's conclusions.
I can't find anything in Pew's write-up that mentions a clearly-stated definition of social media which they used for this survey. Which means that respondents were left to judge for themselves what "social media" even means. And it's just too new and too nebulous a term for there to be any useful, universal definition.
3) It Doesn't Compute (to me)
I know I open myself up to ridicule here, but I have a hard time with surveys in general. It doesn't make sense to me that we can talk to 1,500 people and then extrapolate what 300 million people think from that small sample.
Yes, I know there's some solid math behind some of this. Yes, I know it's commonly accepted that this method works. But I've never understood how we can know for sure that it works unless we take one issue and survey all 300 million Americans to see what they think.
Stay with me here… Let's say we want to know who watched the finale of So You Think You Can Dance (not me!). So we survey 2,500 people and ask them. We choose a statistically valid cross-section of individuals from different races and regions and religions and creeds. Then we publish our findings with a margin of error of 3.5%.
How can we know for a fact that this math works unless we have a baseline survey of all 300 million Americans to compare it against? At least one time we have to interview everyone to know if sample-surveying works, right? Maybe we did back in the day and I just never heard about it.
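For what it's worth, the closest thing to that baseline experiment is something you can run on a computer: invent a population where the true answer is known exactly, poll it over and over, and count how often a 2,500-person sample lands within the claimed margin of error. This is a hypothetical sketch (the 37% "true share" is made up for illustration), not anything Pew actually ran:

```python
import random

# Hypothetical illustration: pretend we know the true answer for the whole
# population, then repeatedly "survey" 2,500 people and check how often the
# sample estimate falls inside the advertised margin of error.
random.seed(42)
TRUE_SHARE = 0.37        # assumed true fraction of yes-answers (made up)
SAMPLE_SIZE = 2500
MARGIN = 0.0196          # 1.96 * sqrt(0.25 / 2500), the textbook 95% bound

TRIALS = 1000
hits = 0
for _ in range(TRIALS):
    # Sampling 2,500 people from 300 million is statistically the same as
    # flipping a biased coin 2,500 times, so we never need the full
    # population in memory.
    yes = sum(1 for _ in range(SAMPLE_SIZE) if random.random() < TRUE_SHARE)
    estimate = yes / SAMPLE_SIZE
    if abs(estimate - TRUE_SHARE) <= MARGIN:
        hits += 1

coverage = hits / TRIALS
print(f"{coverage:.0%} of the simulated surveys fell within the margin of error")
```

Run it and the coverage comes out in the neighborhood of 95%, which is the claim the margin-of-error math is making. Of course, the simulation assumes perfectly random sampling and honest answers, which is exactly the part real-world surveys can botch.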
But even Pew's website says this:
"…one can say with 95% confidence that the error attributable to sampling and other random effects is plus or minus 2.4 percentage points."
Let me translate that for you: There's a 95% chance they're either correct in their numbers, or at least no more than 2.4 percentage points off. Which means there's a 5% chance they could be off the mark by a mile and a half.
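As a sanity check on that 2.4 figure, the textbook formula for the margin of error on a proportion at 95% confidence is 1.96 times the square root of p(1-p)/n, with p = 0.5 as the worst case. Plugging in Pew's 2,253 respondents:

```python
import math

# Back-of-the-envelope check of Pew's quoted margin of error.
# Worst-case (p = 0.5) margin at 95% confidence for n respondents:
#   MoE = 1.96 * sqrt(p * (1 - p) / n)
n = 2253                                   # respondents in the Pew survey
moe = 1.96 * math.sqrt(0.5 * 0.5 / n)
print(f"+/- {moe * 100:.1f} percentage points")   # about +/- 2.1
```

That's roughly ±2.1 points, a bit tighter than the ±2.4 Pew quotes; the gap presumably reflects adjustments for their weighting and survey design, though their write-up doesn't spell that out.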
It's like Wikipedia. Is Wikipedia generally accurate? Yes. Is it consistently accurate? Maybe. Is it always accurate? Not at all. Since Wikipedia pages can be edited by anyone, then at any given moment a page may be inaccurate (see people like Jim Rome and Stephen Colbert encouraging viewers and listeners to alter Wikipedia pages for a lark). And if Wikipedia can be non-factual at any given moment, then logically we can't ever trust it implicitly. That's why most college professors won't accept Wikipedia as a legitimate source. That's how I feel about Pew's "there's a 95% chance we're right or close to right" qualifier.
Now… go ahead and call me a conspiracy theorist. But I'm not saying Pew is wrong. They're probably right. But there's no baseline to prove it, which is exactly why they only claim 95% confidence.
So there you go. Call me a skeptic. Hand me my tin-foil hat. I'm not saying the study's conclusions are wrong (I think they're probably accurate), I'm saying we can't draw these conclusions from this study. The study seems flawed to me. Frankly, I'm not sure why we're in such a hurry to pick a winner between video and social media anyway. There's such a thin line separating the two that I almost don't know why we need to separate them. How is sharing video online with others not social media, anyway?