By Amanda McDonald – Chairman of the East Kilbride Pirates
Our survey said…
The last week of the regular season has been and gone, and for many teams that means starting to think about preparing for the 2015 season.
Hopefully for many that process automatically involves reviewing your off-field performance as well as your on-field one. And one of the easiest ways to do that is to hold an end-of-season survey asking your players, coaches and volunteers to share their views on what went well and help you identify areas for improvement.
Why a survey?
Personally I love surveys. They’re a really simple way of capturing views in one place, most of the survey tools on the market will do all the hard work of analysing the responses for you, and most importantly they empower participants to say what they want because they’re anonymous. This really frees people up to say what’s on their mind, especially once you have a couple of surveys under your belt and people realise you can’t identify them and that there are no repercussions for being honest.
The other good thing about surveys is if you do them for several seasons, you can start to make comparisons. If you’ve identified problem areas to target – perhaps changing a training venue or a need for more focus on film – you can see the following year if your work paid off. They also give you great evidence to back up statements around things like offering value for money and having high quality facilities. You might think that everyone in your team gets great value for money and loves that Astroturf pitch you’ve booked, but do your players and coaches?
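As a rough sketch of how those season-on-season comparisons might work in practice: most survey tools will let you export responses, and from there it's just averaging each topic's ratings per year. The scores below are invented for illustration, on the 1–4 scale used later in this article.

```python
# Hypothetical end-of-season ratings on a 1-4 scale (1 = poor, 4 = excellent).
# In practice these would come from your survey tool's results export.
ratings = {
    2014: {"training venue": [2, 3, 2, 3, 2], "value for money": [3, 4, 3, 3, 4]},
    2015: {"training venue": [3, 4, 3, 4, 3], "value for money": [3, 4, 4, 3, 4]},
}

def average(scores):
    """Mean rating for one topic in one season."""
    return sum(scores) / len(scores)

# Compare each topic's average score across the two seasons,
# e.g. to check whether changing a training venue paid off.
for topic in ratings[2014]:
    before = average(ratings[2014][topic])
    after = average(ratings[2015][topic])
    print(f"{topic}: {before:.2f} -> {after:.2f} ({after - before:+.2f})")
```

Even a simple table like this gives you evidence for statements about value for money or facilities, rather than a gut feeling.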
It’s not just about the questions
Outside of football, in my previous day job, I was closely involved in the planning, communication and delivery of five annual global opinion surveys targeted at around 140,000 employees of a large financial services organisation. It seemed like a sensible move to apply some of the things I learned there to how we gather opinion at the Pirates – obviously without the budget and the expensive consultancy on hand to analyse the results! But the basic principles were the same, and there were lots of easy-to-do things that mapped over really well. I’ll caveat this by saying I’m not a survey expert or a statistician, just someone who’s interested in how they work and who sees a lot of value in the information you can get out of them.
I’ve now organised three end of season surveys, plus run a number of ad hoc surveys for player camps and other situations, as well as helped manage and compile survey results for the Edinburgh Napier Knights. I inherited the Pirates survey from OC Andrew Mackintosh (*waves*), when it was much more coaching focused and broadened it out to capture more information, cover additional topics and communicate the results more widely.
I know some of you will already do a survey, but hopefully the next few pages will give you some additional food for thought or maybe a tip or two on how to share your results and action plan.
It’s also worth adding up front that the most important thing about doing a survey isn’t actually the survey itself. It’s what you do with the results, and how you communicate them.
To pay or not to pay….
There are lots of different survey tools on the market and there may already be one you like. My personal preference is SurveyMonkey – its free package is basic but serviceable depending on what you want to do, and its Select entry-level plan is powerful, pretty cheap, and can be cancelled after a month, meaning you can use all the added functionality for only £24.
The reason I would recommend paying for the Select plan rather than using the free plan comes down to functionality and user experience. With the Select package you can ask more than 10 questions, you get question logic so your users can skip to different sections depending on their responses, you can ask matrix questions (ones with multiple scored attributes), you can add in your team branding, set up a thank-you message and, most importantly, create a short and memorable URL for your survey. Plus, you get a better level of analysis.
What do you want to know?
Once you’ve set up an account with your preferred survey tool, the next step is to plan the content of your survey.
Have a think about the sort of information you’d like to capture. Are there any things you’d like team views on – like training times or frequency? Do you want to change how you travel to away games, set up a training camp, or are you considering a subs increase or a rules change? Or maybe you want to capture ideas for your next club night out or fundraiser.
Topics we’ve covered at the Pirates for the last few years include ratings for our coaches, gym behaviour, training venues, game time, value for money, gameday experience, being treated fairly and bullying, the performance of our committee and general suggestions and comments. This means we can look back and see if particular scores have improved or if there are any areas of concern or themes coming through.
In each survey there have also been some extra questions on things relevant at the time – like changing training times, adding extra meals on at camp, ideas for merchandise, or adding in friendly fixtures.
Qual vs quant
Once you’ve determined the topics of the questions to ask, develop your questions.
There are two broad types of question to ask. Quantitative questions ask respondents to rate on a defined scale or answer yes or no, giving you quantifiable results. For example, ‘Do you feel you were treated fairly this year – yes or no’ or ‘How would you rate our Thursday night training venue’ (1 = poor, 2 = average, 3 = good, 4 = excellent).
The other type is qualitative, where you ask people to freely share their views and feelings on a particular topic through comment or essay boxes. For example, ‘What else could coaches do to benefit you as a player?’
A good survey will have a decent mix of both, probably mostly quantitative but broken up with plenty of opportunities to share views via free text boxes. You may also want to add comment boxes to rated questions too, so people can say why they replied yes or no, or rated someone or something 1 out of 10.
Also avoid too many questions. Keep it short and to the point. Think about how you feel filling in a survey yourself. You want it to be quick and the questions clear. You probably want to be able to click through most things, make a few comments and have a good rant about one or two things if needed.
Try and make sure your questions aren’t leading, vague, or open to interpretation. Be really clear. By leading, I mean prodding someone towards the answer you’re looking for and making a judgement or implying stupidity if you disagree. For example ‘The committee think the best way we could improve the quality of our film is by buying a sidewinder. Do you agree – yes or no’ is a leading question because it suggests you’re dissenting from the club committee if you disagree, and that a sidewinder is the best or only solution. A better question might be ‘How do you think we could improve the quality of our film’ and ask for comments.
It’s a really good idea to get someone impartial to review your survey and your questions before you make it live.
When setting up a rated question, there’s a lot of debate about the scale to use. Personally I like an even four-point scale – something like ‘poor, fair, good, excellent’. There’s a belief that with a five- or seven-point scale, for example ‘very poor, poor, fair, good, excellent’, people will just opt for the middle ground, preferring to be neutral or undecided.
A four-point scale forces people to form an opinion and either agree or disagree in some way by selecting an answer at either the positive or the negative end of the scale. There are pros and cons to both. Some experts prefer a five-point scale, and depending on the question you might actually want a neutral or ‘middle ground’ answer – for example if you’re asking people about the length of training, game time or the cost of subs.
The important thing is to keep the rating scale and format consistent throughout. Try not to switch from a five-point scale to a four-point scale and then up to seven. Also, keep the positions of the value axes the same – if you start out with “least/worst/disagree/negative” type values on the left of the scale and “most/best/agree/positive” on the right, stick with that. And you don’t need to rack your brain inventing scales – other people have already done the hard work of creating common rating scales for you!
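One easy way to enforce that consistency when you come to analyse the numbers is to fix a single label-to-score mapping up front and reuse it for every rated question. A minimal sketch, using the four-point scale from this article (the responses are made up):

```python
# One fixed four-point scale, reused for every rated question,
# with "worst" on the left of the scale and "best" on the right.
SCALE = {"poor": 1, "fair": 2, "good": 3, "excellent": 4}

def score(label):
    """Convert a response label to its numeric value on the shared scale."""
    return SCALE[label.lower()]

# Hypothetical responses to one rated question.
responses = ["Good", "excellent", "fair", "good"]
total = sum(score(r) for r in responses)
print(total / len(responses))  # average rating on the 1-4 scale
```

Because every question uses the same mapping, averages from different questions (and different seasons) stay directly comparable.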
A word of caution. Excessive use of rating scales is one of the major causes of people getting fed up with your survey and giving up halfway through because it’s dull and repetitive and they’ve lost the will to live! The survey will seem much more manageable to the respondent if you break up the rating items into small groups of perhaps three to six items at a time. And in between, break them up with other types of questions, such as simple yes/no, multiple choice, or free text boxes.
Don’t ask people to rate more than one thing at a time – for example ‘the committee are good at communication and making decisions’. These kinds of double-barrelled questions are a rookie mistake. Each item in a rating scale should only have one concept or attribute.
It’s also worth selecting the option to require an answer for each question. This reduces the likelihood of people just skipping everything or picking and choosing the questions they like.
Who’s answering your survey and skip logic
Decide at the beginning too who your survey is targeted at – is it players, coaches or staff? Or maybe it’s everyone. For last year’s survey I played about with question logic in SurveyMonkey (also known as skip logic). This allows you to ask questions at the beginning to determine who’s completing your survey, and send them to different parts accordingly. For example, you might not want to ask your coaches about gametime or gym behaviour, so you can send them to a different part of the survey to answer questions about how the committee perform or what they think of training facilities and training times. You probably also don’t want your offensive players rating defensive coaches. Or you might want to ask rookies different questions to veterans. So think about putting in a few jumping-off points at different points in the survey, and at the beginning ask a few questions about your respondents.
We typically ask how many seasons people have been involved in the team (first season, two-four seasons, four or more seasons), and their role (player, coach, other volunteer). Using question logic, we will send players, coaches and volunteers down different survey paths. And we’ll also split players off with additional question logic about whether they’re offensive or defensive players.
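The routing described above is worth prototyping on paper (or even in a few lines of code) before you build it in your survey tool. A hedged sketch of the branching, with section names invented for illustration:

```python
# Illustrative skip-logic routing: which survey sections each respondent sees.
# Section names are made up for this sketch.
def sections_for(role, side_of_ball=None):
    common = ["about you", "committee", "training facilities", "suggestions"]
    if role == "coach":
        # Coaches skip the player-only sections like gametime and gym behaviour.
        return common
    if role == "player":
        # Players only rate the coaches on their own side of the ball.
        side = ["offensive coaches"] if side_of_ball == "offense" else ["defensive coaches"]
        return common + ["gametime", "gym behaviour"] + side
    # Other volunteers just see the common sections.
    return common

print(sections_for("player", "offense"))
```

Writing the branches out like this makes it obvious which questions each group will and won't see, which is exactly what you then test by clicking through the live survey.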
It’s probably worth sketching out the journey your respondent will take and testing it a few times afterwards to make sure you’ve got it right.
And even if you don’t use skip logic, it’s still worth asking at the start how long people have been involved, as you might find veteran players think differently to newbies! Then when you look at your results you can compare their views, which can help you spot trends.