Video Vs Social Media, OR Why I Don’t Trust The Pew Study

You know, if I'm not careful, I'm going to rapidly turn into The Blogger Who Only Talks About Nalts.  Of course, if he'd stop writing so many insightful articles related to the online video space, then maybe I'd be forced to go out and find something else to talk about.  Until then… you should definitely throw Nalts' RSS feed in your preferred reader.

A recent article, "Why Online Video Will ALWAYS Trump Social Media”, discusses a new study from Pew that says more people watch online video than use social media (62% compared to 46%).

Nalts then goes on to talk about what advantages online video has going for it that social media does not.  For instance, there's no login required to watch an online video, whereas most social media requires you to sign up and/or login.  Online video is also very low maintenance, he says.  I watch my video and then I'm gone.

Good points.  He then goes on to suggest that social media will lose its charm in 2010, and that online video will always trump social media.

And now for the fun part (for me, at least):  I'm not sure I could disagree more.  Yes, online video is huge, and will only get bigger.  But I'm not sure this study actually gives us much valuable information with which to draw conclusions.

So I followed the link in his post to Liz Gannes' write-up of the Pew study.  She linked to MediaPost, saying they were the first to write up the study.  But MediaPost doesn't cite or link to any sources.  (Seriously, how can people write up something like this without citing a source?)

So I went to Google to try and find the original Pew release.   And guess what… I found it here.  (See… it's not that hard to link to the original study).

I have a few problems with this study:

1) Survey Subjects

Pew appears to have surveyed 2,253 adults (18 and over) to create this report.  Really?  Adults?  We're going to draw conclusions about online video (like YouTube and Hulu) versus social media (like MySpace and Facebook and Twitter) but we're not going to talk to any teenagers?

Yes… Pew's write-up mentions "young adults" several times, but they've clearly labeled people aged 18-29 as "young adults".  There's no mention whatsoever of people under 18.

That's just flawed judgment.  Now, for all I know, Pew has to jump through a lot of legal hoops to ask teenagers questions.  But I see studies all the time about teenagers, so you can't tell me it's impossible to interview teens for a survey.

And I'm not suggesting that adding teens to the mix would have tipped the scale for social media over video either.  I'm simply suggesting that we can't know the real answer as long as we're leaving out one of the largest demographics for both online video and social media.  Hey Pew… next time why don't you take a survey on Wii usage versus the Xbox 360, and poll 1,300 people in nursing homes to gather your data.

Yes, I know that the median age for Facebook is rising—ditto for most sites—as the older generations hop on the bus.  But any conclusion about online video versus social media that doesn't factor in the under-18 crowd is just worthless information.

2) No Universal Definition of Social Media

Even Nalts admits early in his article that he's not sure all social media users even know that they're using social media.  Let's use my mother as an example (Mom, I promise not to use you as my example in every post I write… I swear).  She's on Facebook.  But I'd bet you $100 that if you asked her whether she was using social media, she'd either say "no" or "what's social media?"  I guess I just think this is a much bigger red flag than Nalts seems to think it is.

But how do we even define social media?  Is it just Twitter and Facebook?  How about Digg?  Are blogs social media?  I could make a pretty fine argument that YouTube itself is social media, which would really blow up the survey's conclusions.

I can't find anything in Pew's write-up that mentions a clearly stated definition of social media used for this survey.  Which means respondents were left to judge for themselves what "social media" even means.  And it's just too new and too nebulous a term for there to be any useful, universal definition.

3) It Doesn't Compute (to me)

I know I open myself up to ridicule here, but I have a hard time with surveys in general.  It doesn't make sense to me that we talk to 1,500 people and then extrapolate what 300 Million think from that small sample.

Yes, I know there's some solid math behind some of this.  Yes, I know it's commonly accepted that this method works.  But I've never understood how we can know for sure that it works unless we take one issue and survey all 300 Million Americans to see what they think.
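
For what it's worth, the "solid math" boils down, at its simplest, to the textbook margin-of-error formula, and the counterintuitive part is that the 300 Million never shows up in it; only the sample size does.  Here's a rough sketch of that formula (my own toy code, assuming a simple random sample and the worst-case 50/50 split, which is not a description of how Pew actually weights its data):

```python
import math

def margin_of_error(sample_size, z=1.96, p=0.5):
    """Worst-case margin of error for a simple random sample.

    Assumes the 50/50 split that maximizes the error and a 95%
    confidence level (z of roughly 1.96). Real surveys add a "design
    effect" for weighting, which widens this a bit.
    """
    return z * math.sqrt(p * (1 - p) / sample_size)

# Note that the size of the country never enters the calculation.
for n in (1_500, 2_253, 10_000):
    print(f"A sample of {n:,} gives a margin of error of +/- {margin_of_error(n) * 100:.1f} points")

# A sample of 1,500 gives a margin of error of +/- 2.5 points
# A sample of 2,253 gives a margin of error of +/- 2.1 points
# A sample of 10,000 gives a margin of error of +/- 1.0 points
```

Plug in Pew's 2,253 adults and the textbook formula gives roughly plus or minus 2.1 points; their published 2.4 is presumably a little wider to account for weighting and design effects (my assumption, not their words).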

Stay with me here… Let's say we want to know who watched the finale of So You Think You Can Dance (not me!).  So we survey 2,500 people and ask them.  We choose a statistically valid cross-section of individuals from different races and regions and religions and creeds.  Then we publish our findings with a margin of error of about 2%.

How can we know for a fact that this math works unless we have a baseline survey of all 300 Million Americans to compare it against?  At least one time we have to interview everyone to know if sample-surveying works, right?  Maybe we did back in the day and I just never heard about it.
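
And, to be fair, there is a way to stress-test the machinery without interviewing all 300 Million Americans: simulate it.  The toy sketch below (my numbers, not Pew's methodology) invents a country where we already know the true answer, runs the 2,500-person survey a thousand times over, and counts how often each survey's own margin of error captures the truth:

```python
import random

random.seed(1)

TRUE_SHARE = 0.62    # pretend we know 62% of the country really watched
SAMPLE_SIZE = 2_500  # the hypothetical survey above
TRIALS = 1_000       # run the whole survey 1,000 separate times

hits = 0
for _ in range(TRIALS):
    # Drawing 2,500 people out of ~300 Million is effectively 2,500
    # independent draws, so a weighted coin flip stands in for each person.
    sample = [random.random() < TRUE_SHARE for _ in range(SAMPLE_SIZE)]
    estimate = sum(sample) / SAMPLE_SIZE
    moe = 1.96 * (estimate * (1 - estimate) / SAMPLE_SIZE) ** 0.5
    if abs(estimate - TRUE_SHARE) <= moe:
        hits += 1

print(f"{hits / TRIALS:.0%} of the simulated surveys landed within their own margin of error")
```

It comes out at roughly 95%, with no census required.  Of course, that only proves the math works when respondents behave like random coin flips; it says nothing about who refuses to pick up the phone or who never made it into the sample in the first place, which is where points 1 and 2 come back in.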

But even Pew's website says this:

"…one can say with 95% confidence that the error attributable to sampling and other random effects is plus or minus 2.4 percentage points.”

Let me translate that for you:  There's a 95% chance they're either correct in their numbers, or at least no more than 2.4% off.  Which means there's a 5% chance they could be off the mark by a mile and a half.

It's like Wikipedia.  Is Wikipedia generally accurate?  Yes.  Is it consistently accurate?  Maybe.  Is it always accurate?  Not at all.  Since Wikipedia pages can be edited by anyone, then at any given moment a page may be inaccurate (see people like Jim Rome and Stephen Colbert encouraging viewers and listeners to alter Wikipedia pages for a lark).  And if Wikipedia can be non-factual at any given moment, then logically we can't ever trust it implicitly.  That's why most college professors won't accept Wikipedia as a legitimate source.  That's how I feel about Pew's "there's a 95% chance we're right or close to right" qualifier.

Now… go ahead and call me a conspiracy theorist.  I'm not saying Pew is wrong; they're probably right.  But there's no baseline to prove it, which is exactly why they talk about 95% confidence instead of certainty.

So there you go.  Call me a skeptic.  Hand me my tin-foil hat.  I'm not saying the study's conclusions are wrong (I think they're probably accurate); I'm saying we can't draw these conclusions from this study.  The study seems flawed to me.  Frankly, I'm not sure why we're in such a hurry to pick a winner between video and social media anyway.  There's such a thin line separating the two that I almost don't know why we need to separate them.  How is sharing video online with others not social media, anyway?

About the Author: Jeremy Scott is the founder of The Viral Orchard, an Internet marketing firm offering content writing and development services, viral marketing consulting, and SEO services. Jeremy writes constantly, loves online video, and enjoys helping small businesses succeed in any way he can.

What do you think?
  • http://viralorchard.com Jeremy Scott

    Reid,
    Thanks for the link.
    Fair enough on the survey being just about adults. But that just seems like a silly survey to do in the first place. Why not do a survey of people over 80 and see if they like YouTube more than Twitter?

    And yeah... I know that this kind of sampling is part of statistical analysis. I'm merely saying that there's no way to prove its accuracy. After all, there are a host of examples of Presidential election predictions being off. Are they usually close? Yes. But don't tell Dewey that. :)

    My beef is not that Pew conducts sampling surveys, or even with Pew's results... but rather with how we in the media take a survey like that and print it as solid-gold gospel truth. If statistical analysis is about inference, then the reporting of the findings should be as well. Instead, the reporting becomes "People Watch Videos More Than They Use Social Media."

    The margin of error is just one of many reasons I don't trust the survey's results. They only surveyed adults, eschewing a huge section of the user-base, and they didn't clearly define what "social media" means. Makes the whole thing a bit of a muddled mess for me.

    Appreciate the comment!

  • Reid Williams

    You definitely ask some fair questions, Jeremy. You're right about them not surveying the demographic that watches the most video and spends the most time with social media; it's possible, though, that the data from that group could skew conclusions about the adults (unless you're looking at those specific demographics' data). The survey is, after all, about adults. They do explain in some detail their survey methodology at http://www.pewinternet.org/Static-Pages/About-Us/Our-Research/IRB-Information.aspx

    To address your final point on the size of the survey pool: Sampling is part and parcel of statistical analysis; by its nature, statistics is about inference. The relative sizes of the sample and the population it's meant to generalize to determine the confidence interval of the conclusions; the larger the sample, the more confidence we have (there are other factors as well). I recommend checking out http://www.pewinternet.org/Static-Pages/About-Us/Our-Research/IRB-Information.aspx
    This method, by the way, is used in tons of statistical applications, from drug testing to projecting the winners of our elections before the polls close.

  • http://www.facebook.com/louderback Jim Louderback

    Love everything you say, and it's all right on -- until you got to (3).

    Then you started sounding like "math is hard" Barbie.

    There are solid, solid theories behind statistical significance, and the Pew guys are among the most trusted out there - they have no agenda, they are the academic stats and polling version of rocket scientists.

    Asking about the confidence interval is important. The 95% of 2.4% really means that any result that is less than 3% from what it is being compared to can be logically assumed to be the same (ie, it had better be bigger or smaller by more than 3%). Confidence Intervals are an essential element of understanding today's world - and should probably be taught in a required college course to everyone.

    But to tar the whole thing because you don't have a solid stats background... c'mon. That's just the caveman throwing stones at the lunar eclipse.

    jim louderback
    ceo revision3

  • http://viralorchard.com Jeremy Scott

    I don't know, Jim. I feel like I said most of what you did... that there's solid math behind it... that Pew's findings are probably right... and that I don't "get" statistics. I'm not tarring the whole thing based on that last point. I'm tarring the whole thing for the first two points... the third point is just a personal one that puts things over the top for me. Guess I didn't strike the right tone.

    But "math is hard Barbie" made me laugh out loud.

    • http://www.reelseo.com/about/mark/ Mark Robertson

      "math is hard" Barbie cracked me up as well....

  • DahliaK

    Hey, why don't you comment on Nalts' blog. You afraid of being lumped together with the crazies?

    Aside from erroneous assumptions in sampling, the big issue I have is how complex statistical studies are routinely boiled down to an overly simplistic headline or "take-home lesson" or "management summary." (For management Barbie) This treatment is rampant in journalism, including online journalism. Don't know any way to get around it, other than to make journalists (and bloggers) take several years of advanced study in research methodology. That's legal in a democracy, isn't it?

  • http://willvideoforfood.com/ nalts

    Yey- I love blog debates. When I'm sober I'm going to leave an intelligent response.

  • http://viralorchard.com Jeremy Scott

    Wait... should I have waited for sobriety before posting this? Nobody told me!