Part of my survey data collection strategy involved targeting users of careers services via Skills Development Scotland's (SDS) various social media accounts. Rather naively, I thought that this method would be a cheap, quick, and easy way to gather potentially hundreds of responses from young jobseekers.
There was some logic to my misplaced assumption. On Facebook alone, for example, there are at least twenty-one local SDS pages. Some of these have hundreds of likes, and some have thousands. Additionally, SDS have a centralised Twitter page with over 16,000 followers, and My World of Work Twitter and Facebook accounts with 6,000+ followers and 13,500+ likes, respectively.
Now, there is likely to be some crossover as far as the followers and "likers" of these pages are concerned (i.e. some of them will have liked more than one of the pages). But even so, I thought that my survey had the potential to reach at least 20,000 jobseekers. On that basis, even a 2% response rate would have given me about 400 responses – which I thought was a cautious estimate.
The reality was very different from the one I'd anticipated, and most of the challenges stemmed from the fact that I was not personally administering the survey. Scattergun dissemination was an issue, for example: the survey was not posted on all of the accounts, or the link was retweeted/shared rather than posted directly. This was understandable given the number (possibly dozens) of very busy people who would have been involved in operating these social media pages.
In the cases where the survey was posted to a social media account, it was often buried under other user content. That is to say, because the survey was not pinned on any of the pages, it soon disappeared down Facebook Walls/Twitter feeds and into obscurity. If a user wasn't online within an hour or so of it being posted, it is unlikely they actually saw it – and the window of opportunity would have been even smaller for users with busy individual newsfeeds.
Not all of those who saw the survey link would have been active jobseekers. This is a pertinent point. Even if, in theory, my survey reached thousands of users who had "liked" or followed the pages where it was posted, it would be almost impossible to determine how many of those users would have been actively seeking work at that time. The period of job search can vary wildly between individuals, before the person either a) finds employment, b) gives up searching, c) takes a break from searching, or d) postpones searching due to other pursuits (e.g. full-time education).
In summary, before I even get to general survey concerns such as response fatigue (i.e. people who are eligible to participate but, for whatever reason, can't be bothered), potentially thousands of participants would not have been reached in the first place.