Pre-survey / existing issues

Issue: Launching the employee survey (or previous surveys) at the wrong time
Example: Client X launched an employee survey while unaddressed rumours of an imminent corporate restructure were circulating among staff.
Problems it causes:
– Biased responses and data of limited validity
Mitigating actions:
– Don't plan your staff survey as a stand-alone HR initiative; make it an organisation-wide project
– Address employees' concerns about the restructure and run the survey a month or two later

Issue: Lack of action from previous staff surveys
Example: Some new clients anecdotally report that little or nothing was done with the results of their last staff survey.
Problems it causes:
– Staff scepticism that nothing will be done with the results, which can mean low response rates and meaningless data
– Managers across the organisation may not support the employee survey project or future improvement ideas
Mitigating actions:
– Acknowledge the previous lack of action and explain the reasons for it
– Publicly state your intentions for the findings of this staff survey, with timescales for implementation
– Provide regular communication around milestones reached for new staff survey initiatives

Issue: Respondent fatigue caused by excessive staff survey projects
Example: A new client recently asked whether we could boost participation rates in their employee survey; staff had been asked to complete five surveys in the previous 12 months.
Problems it causes:
– Decreased response rates, undermining the robustness of the staff survey data
Mitigating actions:
– Run only one all-staff survey at a time
– Leave sufficient gaps between surveys so that insights can be translated into actions before follow-on survey projects
– Incentivise participation, e.g. with a charity donation or prize draw

Issue: Poor pre-survey communications
Example: Emailing survey reminders to staff with no computer access, communicating the survey in languages staff struggle with, or pitching the staff survey at the senior leadership team rather than operational staff.
Problems it causes:
– Lack of employee awareness of the staff survey
Mitigating actions:
– Communicate about the employee survey via multiple channels, such as email, notice boards, letters, launch events and meetings
– Appoint dedicated staff survey communications champions
– Plan a staff survey communications strategy linked to wider HR initiatives

Issue: Staff concerns about the consequences of participating in the employee survey
Example: Following a previous internal staff survey that was supposedly confidential, some employees were identified and targeted by line managers over their responses.
Problems it causes:
– Low employee survey response rates
– Distorted data, as staff say what they think they should say rather than what they actually believe
Mitigating actions:
– Assurances of confidentiality, using an independent third party with a formal confidentiality policy for the employee survey
– Pre-set minimum reporting sizes, with amalgamated demographics for small departments (see the sketch below)
– Confidential data collection methods, such as unmarked, reply-paid sealed envelopes with secure and well-publicised drop boxes
– Secure, external hosting of electronic staff survey data

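Minimum reporting sizes and amalgamated demographics can be enforced mechanically when results are tabulated. The sketch below is a minimal illustration only, assuming a hypothetical threshold of five responses and invented department names; it is not Insync's reporting tooling, just one way to suppress or roll up groups that are too small to protect anonymity.

```python
from collections import defaultdict

MIN_REPORTING_SIZE = 5  # assumed cut-off; set this per your confidentiality policy

def summarise_by_group(responses, min_size=MIN_REPORTING_SIZE):
    """responses: list of (group_name, score) tuples; returns mean score per group."""
    grouped = defaultdict(list)
    for group, score in responses:
        grouped[group].append(score)

    summary = {}
    small_group_scores = []  # scores from undersized groups, to be amalgamated
    for group, scores in grouped.items():
        if len(scores) >= min_size:
            summary[group] = round(sum(scores) / len(scores), 2)
        else:
            small_group_scores.extend(scores)

    # Amalgamate undersized departments so no individual can be singled out
    if len(small_group_scores) >= min_size:
        summary["Other (amalgamated)"] = round(sum(small_group_scores) / len(small_group_scores), 2)
    elif small_group_scores:
        summary["Other (amalgamated)"] = "suppressed (below minimum reporting size)"
    return summary

# Example: the two "Legal" responses are never reported on their own
responses = [("Finance", 4), ("Finance", 5), ("Finance", 3), ("Finance", 4), ("Finance", 5),
             ("Legal", 2), ("Legal", 3)]
print(summarise_by_group(responses))
# {'Finance': 4.2, 'Other (amalgamated)': 'suppressed (below minimum reporting size)'}
```
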
Survey design

Issue: Lack of clarity on what you want the staff survey to achieve
Example: Government-funded departments and franchised employers have come to Insync citing that they must conduct a staff survey because it is part of their contractual obligations, or because "they must be seen to do so".
Problems it causes:
– Employee survey findings that fail to provide clear insights leading to actionable outcomes
Mitigating actions:
– Be clear on what type of staff survey is required and why, e.g. a safety culture survey, employee engagement survey, alignment and engagement survey, entry survey or exit interview

Issue: Asking the wrong types of staff survey questions
Example: Client Y came to Insync to design a new staff survey after previously asking questions tied to the KPIs of managerial staff. This biased staff responses, so all of the survey data had to be interpreted and used with extreme caution because of its questionable validity.
Problems it causes:
– Failure to provide useful data
– Alienation of staff as respondents
Mitigating actions:
– Hold pre-survey focus groups to establish the pressing issues requiring investigation in the staff survey
– Consider historical data for trend comparisons

Issue: The employee survey doesn't align with the culture of your organisation, or the culture your organisation aspires to
Example: A new HR director implemented a staff survey that used acronyms not yet widely known throughout the organisation. Staff skipped those survey items, rendering the data of questionable validity.
Problems it causes:
– Staff will not know how to answer questions and may skip items or not complete the staff survey at all
Mitigating actions:
– Tailor the staff survey items into familiar language
– Write some staff survey items that are unique to your organisational context
– Use open-ended questions to invite employees' opinions; as well as providing insights, this may increase staff engagement by helping them feel they have been listened to

Issue: Using an incorrect staff survey scale or method of data collection
Example: When moving from one staff survey provider to another, the response scale may change. Mathematical formulae can sometimes be applied to address this (see the sketch below); contact us to speak to our registered psychologists and research experts.
Problems it causes:
– New data that is statistically incomparable with previous employee survey data
Mitigating actions:
– Retain the same type of quantitative scale wherever direct comparisons with previous employee survey responses are required, e.g. when comparing rankings of corporate values

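As a worked illustration of the kind of formula involved, the sketch below linearly maps a score from one response scale onto another (for example a 1-5 rating onto a 0-100 or 1-7 scale). It is an assumption-laden example of one possible conversion, not a substitute for advice from psychologists and research experts, since a linear mapping is not always statistically defensible.

```python
def rescale(value, old_min, old_max, new_min, new_max):
    """Linearly map a score from [old_min, old_max] onto [new_min, new_max]."""
    if old_max == old_min:
        raise ValueError("old scale has zero range")
    proportion = (value - old_min) / (old_max - old_min)
    return new_min + proportion * (new_max - new_min)

# A 4 on a 1-5 scale maps to 75 on a 0-100 score...
print(rescale(4, 1, 5, 0, 100))  # 75.0
# ...and to 5.5 on a 1-7 scale
print(rescale(4, 1, 5, 1, 7))    # 5.5
```
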
Issue: Cross-cultural issues caused by insensitive translation of staff survey content
Example: An FMCG client conducted a global staff survey without the necessary content quality controls; some of the survey questions simply failed to translate, so staff did not understand them and skipped them.
Problems it causes:
– Alienation of some employee survey respondents
Mitigating actions:
– After translating the staff survey content, conduct a pilot survey or sense check at the local level to ensure the translations are correct

Survey in field

Issue: Respondents participating for the wrong reasons
Example: A newsagency tried to boost staff survey participation rates by offering respondents a voucher for a nearby cafe; staff clicked through the long, laborious survey quickly to gain the voucher without giving due consideration to their responses.
Problems it causes:
– Staff survey data bias
Mitigating actions:
– Avoid linking KPIs or other financial incentives to the staff survey; a charity donation can work well instead

Issue: The staff survey can't be completed by parts of the workforce
Examples: Electronic surveys distributed during periods of system interruption caused by IT upgrades, or surveys mailed to staff at home with no spare copies available at work near the designated drop boxes.
Problems it causes:
– Low staff survey response rates
– Missing useful or vital data from some demographic groups
– Alienating groups of employees
Mitigating actions:
– Distribute your staff survey via multiple channels
– Ensure that paper-based surveys, with a robust collection method, are available even in areas where staff normally work on a computer

Analysis of survey data

Issue: Trend analysis that compares current staff survey data with previous data collected differently
Example: An education sector client tried to make direct comparisons between qualitative findings and data collected through a quantitative staff survey.
Problems it causes:
– Invalid comparisons and conclusions drawn from the staff survey data
Mitigating actions:
– Employee survey interpretation must consider:
  – Previous response rates
  – Previous organisational context
  – Previous external context
  – Like-for-like scales
  – Like-for-like methods of analysis
  – Like-for-like demographics and corporate structuring

Issue: Failure to address cultural response bias in the staff survey
Example: A global travel client was surprised at the high level of staff turnover in one country, even though that country had recorded positive sentiment in a recent staff survey.
Problems it causes:
– Misinterpretation of seemingly similar staff survey data
Mitigating actions:
– Apply cultural understanding to the interpretation of staff survey data before action planning (one illustrative approach is sketched below)

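One commonly cited way to surface cultural response styles, shown in the sketch below, is to standardise each country's item scores within that country before comparing items across countries: a raw score that looks favourable can still be relatively weak for that country. The country names and scores are invented for illustration, and this is not necessarily the method Insync applies.

```python
from statistics import mean, pstdev

def z_scores_within_group(scores):
    """Standardise a list of scores relative to their own group's mean and spread."""
    mu, sigma = mean(scores), pstdev(scores)
    if sigma == 0:
        return [0.0 for _ in scores]
    return [(s - mu) / sigma for s in scores]

# Raw scores look uniformly high in Country A, but relative to Country A's
# generally lenient response style the "intention to stay" item is weak.
country_a = {"intention_to_stay": 4.1, "pay": 4.6, "manager": 4.5, "workload": 4.4}
country_b = {"intention_to_stay": 3.6, "pay": 3.4, "manager": 3.5, "workload": 3.3}

for name, items in [("Country A", country_a), ("Country B", country_b)]:
    zs = z_scores_within_group(list(items.values()))
    print(name, {item: round(z, 2) for item, z in zip(items, zs)})
# Country A's intention_to_stay sits well below its own average (negative z),
# even though its raw 4.1 is higher than Country B's 3.6.
```
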
Interpretation of staff survey data

Issue: Launching the staff survey too soon, before the dust has settled from corporate changes
Example: A finance sector client received staff feedback seeking answers about information that was still filtering through their internal communications cascade.
Problems it causes:
– Biased responses and data of limited validity
Mitigating actions:
– Following a corporate restructuring, and before running the staff survey, allow new functional teams time to embed and work through the group dynamics of forming, storming, norming and performing

Issue: Failure to look externally at competitors' staff survey results
Example: A marketing team was shocked to learn that, despite a year-on-year improvement in the % favourable scores of their own staff survey, their results sat in the bottom decile when benchmarked against similar organisations.
Problems it causes:
– Missing the big picture
Mitigating actions:
– Benchmark employee survey data against a dynamic and relevant database, in addition to scrutinising your own results (see the sketch below)

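The sketch below illustrates the benchmarking point with a simple percentile-rank calculation against a set of hypothetical peer scores: an internally improving % favourable score can still sit in the bottom decile of the comparison group. The benchmark figures are placeholders, not real Insync database values.

```python
def percentile_rank(own_score, benchmark_scores):
    """Share of benchmark organisations scoring at or below own_score, as a percentage."""
    at_or_below = sum(1 for s in benchmark_scores if s <= own_score)
    return 100 * at_or_below / len(benchmark_scores)

benchmark = [62, 65, 68, 70, 71, 73, 74, 76, 78, 81]  # hypothetical peer % favourable scores
own_score = 61  # improved year on year internally, yet below every peer

rank = percentile_rank(own_score, benchmark)
decile = min(int(rank // 10) + 1, 10)
print(f"Own score {own_score} sits at the {rank:.0f}th percentile (decile {decile}) of the peer group")
# Own score 61 sits at the 0th percentile (decile 1) of the peer group
```
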
Post survey

Issue: Reporting employee survey findings to the wrong staff members
Example: An organisation fed internal staff survey results directly back to line managers rather than filtering them through the HR department. Within an hour, selected parts of the results had been published by staff on the intranet; taken out of context and in isolation, some of these results made alarming headlines.
Problems it causes:
– Misinterpretation of employee survey data
– Misuse of staff survey data, e.g. managers seeking out and berating staff over responses they think they have given
Mitigating actions (prevention is better than cure):
– A swift but vital sense check of the employee survey data by the HR manager
– A sensitive information cascade
– De-identification of all qualitative responses
– Listening to guidance from an independent third-party survey provider, who would advise against this approach

Issue: Lack of post-survey communications
Example: Too many action items and no visible follow-through. Many organisations that approach Insync for the first time report that little to nothing was done with the results of their last staff survey; these same organisations often state that a key measure of success would be to boost, or at least retain, participation levels.
Problems it causes:
– Increased staff scepticism about the motivations behind the employee survey
– Decreased employee engagement
– The cycle of lowered response rates for the staff survey begins again
Mitigating actions:
– See Insync's article on employee survey communication

Issue: Mistakes in employee survey data or data interpretation
Example: A client made a mistake in the internal data analysis of their previous staff surveys and feared that a new survey would lack credibility, even though they now had a professional provider.
Problems it causes:
– Staff mistrust in the staff survey
– Future withdrawal from employee research
Mitigating actions:
– With careful management, this issue was used to facilitate focus groups that involved managers in communicating the intention and results of the next staff survey
– Swift action by the HR team following the staff survey
– Don't try to cover anything up with your employee survey results
– Apologise, be honest and explain how the mistakes occurred
– Publish the correct information in a timely manner

Issue: Hiding bad news from the employee survey
Example: When unfavourable staff survey results for a new executive were revealed, the HR department decided not to feed them back.
Problems it causes:
– Promotes future mistrust of the organisation
– Fails to help the new manager on a path to improvement and lets down the employees who took the time to provide constructive feedback via the staff survey
Mitigating actions:
– Use the staff survey as a platform for explaining how things will change
– Offer the executive a 360-degree feedback survey to help them grow as a leader