1. Introduction
The Food Hygiene Rating Scheme (FHRS) was formally launched in November 2010. The scheme is designed to help consumers make more considered choices about where they eat out or shop for food by providing clear information about the hygiene standards of food businesses at their last inspection by a food safety officer.
Under the scheme, places where food is supplied, sold or consumed are given a rating ranging from 0 to 5, with 5 standing for ‘very good’ food hygiene and 0 ‘urgent improvement necessary’. The ratings are determined by three elements: hygienic food handling; physical condition of the premises and facilities; and food safety management.
The FSA has conducted research into the Display of Food Hygiene Ratings in England, Northern Ireland and Wales since 2011. The objectives of this wave were to:
- Provide a representative estimate of the display of food hygiene rating stickers at physical premises
- Provide a representative estimate of the display of food hygiene ratings online (a new objective for this wave)
- Explore business awareness of and attitudes towards the scheme
- Explore the reasons and drivers for display and non-display in England
To meet the research objectives a two-pronged approach was adopted, consisting of:
- A covert audit of 1,479 food businesses in England, Northern Ireland and Wales (485, 497 and 497 respectively), conducted by Mystery Shoppers.[1]
- A telephone survey of 1,500 food businesses in England, Northern Ireland and Wales (500 per nation), conducted by IFF Research.
This paper outlines the methodological approach taken for both strands of the research.
2. Sampling
This section covers the process of acquiring and preparing the necessary contact details for potential participants in the study. The aim of this process was to ensure the sample was representative of the overall profile of food businesses within each country.
For both the audit and telephone survey of food businesses, the sample was obtained from the FSA’s FHRS database. The initial dataset (received from the FSA on 25 August 2023) contained 468,819 food businesses. This sample was processed to exclude food businesses that were ineligible for the research. Specifically, food businesses were excluded if they had not yet been inspected and issued with an FHRS rating, if they operated from premises that were not publicly accessible (e.g. manufacturers and wholesalers) or if they were a mobile food business.[2]
From this sample frame we drew 28,540 food businesses, stratified by country, outlet type and FHRS rating to broadly reflect the underlying population of food businesses. Food businesses in England were also stratified by region. Sampled businesses in England broadly fell in line with the underlying local authority/district council populations, although this was not incorporated into the sample stratification.[3] Businesses in Northern Ireland and Wales, takeaways and sandwich shops, and those with a food hygiene rating of three or less were oversampled to ensure that robust results could be produced for each sub-group.
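As an illustration, the stratified draw described above can be sketched in Python. The record fields, stratum keys and targets below are hypothetical and are not taken from the FHRS database.

```python
import random
from collections import defaultdict

def stratified_sample(records, strata_targets, seed=2023):
    """Draw a fixed number of records at random from each stratum.

    Strata are keyed by (country, outlet type, FHRS rating); the field
    names are assumptions for illustration only.
    """
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for rec in records:
        by_stratum[(rec["country"], rec["outlet_type"], rec["rating"])].append(rec)
    drawn = []
    for stratum, target in strata_targets.items():
        pool = by_stratum.get(stratum, [])
        # never try to draw more records than the stratum contains
        drawn.extend(rng.sample(pool, min(target, len(pool))))
    return drawn
```

Setting larger targets for particular strata (e.g. takeaways, or ratings of three or less) produces the kind of deliberate oversampling described above, which is then corrected at the weighting stage.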
There are no telephone numbers available on the FHRS database. IFF Research therefore undertook a two-stage process of appending telephone numbers so that businesses included in the sample frame could be contacted for survey fieldwork. The first stage was automated telephone matching; business details (trading name and postal address) were uploaded to the secure website of an external data supplier (Sagacity) and telephone numbers were automatically appended. The second stage was internal desk research, whereby telephone numbers were identified and appended to the database using online search engines and directories.
The two-stage process of appending telephone numbers to the sample frame yielded contact details for 10,906 food businesses. This equates to a telephone match rate of 38%. An additional 41 food business records where telephone numbers could not be identified were included in the sample for pilot audit fieldwork. This means that the overall starting sample for the 2023 wave of the FHRS Audit and Survey was 10,947.
From the starting sample of 10,947, a smaller subset of 2,775 business sites was drawn for audit fieldwork (selected at random and stratified by country, outlet type and FHRS rating to broadly reflect the underlying population). Fewer sample records were required for the audit fieldwork because the ability to conduct covert audits was not reliant on the consent of the business.
The sample frame for the 2023 wave of the FHRS Audit and Survey was in line with the sample frame of the previous wave. Bed & Breakfasts, home caterers, kitchens without physical premises (i.e. ‘dark kitchens’) and food businesses considered inaccessible to the public (e.g. staff canteens) were not included in the starting sample for the audit.
Tables 2.1 to 2.3 present the profile of the starting sample (10,947) of eligible and usable food businesses for audit and telephone fieldwork in terms of country, outlet type and FHRS rating.
3. Audit fieldwork
This is the third wave of the FHRS display audit research IFF Research have conducted (although annual research into the FHRS has occurred since 2011). Effort was made to ensure as much consistency as possible with previous waves of the research to allow for tracking of results over time. However, some changes were made in the 2023 wave, and as such a pilot fieldwork phase was conducted before mainstage fieldwork to evaluate the impact of these changes on results.
The primary change made to the audit approach was that auditors conducted a more thorough online audit than in previous years. Auditors checked the online presence of food business operators and the rate of display of FHRS ratings across three online platforms: Facebook business pages, Instagram profiles, and business websites with online ordering capabilities. Additionally, checks were conducted on the presence of businesses on four major online food delivery aggregators: Just Eat, Deliveroo, Uber Eats, and Foodhub.
Other minor refinements were made to the physical audit element of the audit questionnaire ahead of pilot fieldwork. This included reinstating a question for outlets in Wales to capture whether a dragon logo was visible on the sticker and adding an instruction to auditors to inform them that it is not necessary to photograph the back of stickers.
Pilot fieldwork
Pilot audit fieldwork was conducted between 23 October and 26 October 2023 to ensure the questionnaire designed for the audit was appropriate ahead of mainstage fieldwork. A small proportion of the starting sample was randomly selected for pilot audit fieldwork (117 businesses), from which 45 audits were conducted. This number of audits allowed for the updated audit approach to be tested with food business operators in different sectors and with different FHRS ratings across all three countries. All 45 audits completed during the pilot involved auditors conducting a physical and online audit of display. Tables 3.1 to 3.3 present the number of interviews completed by country, sector, and FHRS rating.
All auditors received a verbal briefing and were issued with a written briefing pack before the start of pilot fieldwork. The content of the briefing was broadly in line with the previous wave, providing auditors with an understanding of the background of the research, the questionnaire design, the screening criteria, and the sample design. The briefing also included instructions on how to approach the online audit.
The questionnaire performed well during pilot fieldwork, with auditors experiencing no issues with any of the questions in the survey, nor any issues conducting audits covertly. However, one small refinement was made to the online element of the audit questionnaire ahead of mainstage fieldwork so that data could be recorded individually for each of the four aggregator platforms, rather than as a single answer covering any of them. The final version of the questionnaire used in mainstage fieldwork can be found in Appendix B of this report.
Mainstage fieldwork
Mainstage audit fieldwork took place between 6 November and 18 December 2023. In total, 1,479 audits were completed. All audits completed during the mainstage involved auditors conducting a physical and online audit of display. The final profile of the audits achieved by country, sector and FHRS rating is detailed in Tables 3.4 to 3.6.
As with the pilot, before the start of mainstage fieldwork all auditors received a briefing on the survey and were issued with a written briefing pack (see Appendix C).
The total number of audits achieved was slightly lower than the target of 1,500. This occurred due to auditors encountering a relatively high number of un-auditable establishments. The main reasons for this included establishments being closed within their advertised opening hours, establishments no longer being in business and establishments not being publicly accessible.
Over the course of fieldwork, auditors visited a total of 1,559 establishments. Of these, 76 were found to be un-auditable at the point of visit. As agreed with the FSA, 50 of these establishments were replaced with new establishments which met the same criteria in terms of region, outlet type and FHRS rating, and the remainder (26) were not replaced. Upon the completion of fieldwork, a further four records were excluded from the data due to data quality issues.
Quality assurance
Mystery Shoppers are a Company Partner of the Market Research Society and a member of the Mystery Shopping Professionals Association (MSPA). As a member of the MSPA, Mystery Shoppers adhere to the MSPA’s Code of Professional Ethics and Code of Professional Standards and apply quality checking procedures at every stage of research. This involved:
- Auditors were hand-picked based on their location and their experience both of working on projects among food businesses and of conducting audits of this kind.
- All auditors received a briefing led by the project account team and supported by IFF Research.
- All briefed auditors completed an online test checking that they understood the requirements of the brief. Only those who passed were incorporated into the panel that worked on this project.
- Upon the completion of each audit, the geographical location of the site was uploaded to a form, ensuring the auditor was at the correct establishment and could not ‘game’ the system by submitting an audit form without having attended the site. Auditors also took photos of the outside of the premises (in a covert fashion).
- Auditors had direct lines of communication to the project account team, MSL, in case they had queries regarding the audit.
- A designated quality control manager worked on the project. After each audit was submitted, it was held for 48 hours before being ‘published’ while it underwent a series of quality checks:
  - The location is correct (verified against photos and geolocation).
  - The date and time are correct and match the brief and the date and time the photo was taken.
  - All questions have been answered fully by the auditor, with sufficient text in the comments where applicable.
  - Question answers and comments are consistent (e.g. if marked ‘no’, the comment fully explains why not).
  - Answers are not ambiguous.
  - Comments are checked for obvious spelling and grammar errors.
- If anomalies were identified, the team spoke with the auditor directly to obtain clarification.
- 10% of assignments were then checked a second time by another team member.
4. Telephone fieldwork
As with the audit, effort has been made to ensure there was as much consistency as possible across waves of the telephone survey in order to make time series comparisons. However, some changes were made at this wave and as such a short pilot fieldwork phase was conducted before mainstage fieldwork to evaluate the impact of these changes on results.
Ahead of pilot telephone fieldwork, a series of refinements were made to the survey used in the previous wave. These included:
- A new code was introduced at A4 to ensure the question captured all the ways businesses allow customers to order food from them online.
- A new question (A7) was introduced to capture whether businesses that do not have a facility allowing customers to order food online intend to introduce one in the future.
- A minor wording update was made to the example list presented at question B18a to include ‘third party websites’.
- A new question (B18d) was introduced to capture which social media platforms FHRS ratings were displayed on.
- A new question (B20a) was introduced to capture how businesses would feel about third-party ordering services displaying FHRS ratings in a ‘readily seen location rather than behind a click-through’.
- A minor wording update was made to the example text at question C12c, clarifying that this would mean displaying the rating in a readily seen location where customers can see it without having to actively seek it out.
- A new set of questions (C14, C15 and C16) was introduced to collect information about the number of businesses appealing their most recent FHRS rating, the outcome of the appeal for those that appealed, and the reason for not appealing among those dissatisfied with their rating.
Pilot fieldwork
Between 9 October and 12 October 2023, IFF Research piloted the survey with 50 businesses. This number of surveys allowed for the updated survey to be tested with food business operators in different sectors and with different FHRS ratings across all three countries. Tables 4.1 to 4.3 present the number of interviews completed by country, sector and FHRS rating.
Before the start of pilot fieldwork all interviewers received a briefing on the survey and were issued with a written briefing pack, providing them with an understanding of the background to the research, the questionnaire design, the screening criteria and the sample design.
The pilot survey involved administering the survey exactly as it would be during mainstage fieldwork. As well as allowing for further checks on comprehension of questions and survey flow, the pilot provided an opportunity to monitor response patterns and the overall interview length.
The results of the pilot were positive. On average, the survey took just over 14 minutes to complete during the pilot (14 minutes and 9 seconds), which was slightly above the target of 12 minutes but broadly in line with the average survey duration in previous waves. Furthermore, there were no issues with the screening process, there was limited feedback from interviewers regarding issues with participant comprehension and businesses were generally willing to participate.
Following the completion of pilot fieldwork, a couple of minor refinements were made to improve respondent comprehension of questions B20a and C12c. The final version of the questionnaire used in mainstage fieldwork can be found in Appendix D of this report.
Mainstage fieldwork
Mainstage fieldwork was carried out between 24 October and 15 December 2023. A total of 1,500 food businesses were interviewed. Tables 4.4 to 4.6 present the number of interviews completed by country, sector and FHRS rating.
As with the pilot, before the start of mainstage fieldwork all interviewers received a briefing on the survey and were issued with written instructions.
As per the agreed target, the overall number of telephone surveys achieved in 2023 was 1,500 (500 in each country). The profile of the achieved sample in each country was broadly in line with the underlying population in terms of sector and FHRS rating. However, as in previous years, challenges were faced in completing surveys with businesses in the takeaway and sandwich shop sector. This was primarily caused by businesses operating in this sector not picking up the phone.
With the knowledge that difficulties had been faced with this audience in previous waves, during the design phase we purposefully oversampled takeaways and sandwich shops. Furthermore, during the course of fieldwork we employed a variety of techniques to improve the connection rate, including sourcing alternative contact details, calling at different times of the day and calling during the weekend. However, despite a concerted effort we were unable to achieve the original target of 230 surveys with this audience (80 in England, 81 in Northern Ireland and 70 in Wales). We finished fieldwork with 155 surveys in total (53 in England, 52 in Northern Ireland and 50 in Wales).
Quality assurance
All interviewers that worked on the survey were given full Interviewer Quality Control Scheme (IQCS) training, regardless of previous experience, followed by three days on-the-job training. Only at the end of this period, and having reached our minimum standards, are interviewers added to our panel. Interviewers continue to receive enhanced monitoring and mentoring until it is judged that they may be put into the ongoing monitoring pool along with our established interviewers. Ongoing monitoring is conducted on all interviewers throughout their time with the company. Other practices to ensure quality that relate specifically to the quantitative interviewing phase of this study include:
- The Computer Assisted Telephone Interviewing (CATI) questionnaire, derived from the survey questionnaire agreed with the FSA, underwent thorough checks before “going live” for interviewing. At least three members of the project team tested the survey, making sure that all routing instructions were followed as prescribed, that the question text and interviewer instructions read as intended, and that answering protocols were respected (e.g. where a question is intended to collect a single response, it is not possible to record more than one). Dummy data was also used as a further check on routing.
- Following a briefing from the Research Manager, interviewers were encouraged to spend up to an hour running through the test version of the CATI questionnaire to familiarise themselves with the survey.
- Interviewer standards were monitored continuously throughout fieldwork. All interviewers on the project had at least 5% of their interviews listened in to and monitored “live” by project supervisors. Where best practice was not being followed, remedial action was taken as appropriate (e.g. interviewers repeated their training or were re-briefed). In any cases where malpractice was identified (a rare occurrence), all of the interviewer’s interviews on the project were reviewed, with respondents re-contacted to check their responses.
- Auto-dialler software was used, meaning interviewers were randomly and automatically assigned numbers to call. This prevented sample selection bias arising from interviewers choosing which calls to make.
- All telephone interviews were automatically recorded by the CATI system, and these recordings were used as part of the quality control monitoring process.
- The project team periodically reviewed answer patterns across all survey questions. In particular, levels of non-response (don’t knows and refusals) were examined both on an aggregate basis and for individual interviewers.
- In addition to live monitoring and post-hoc data checks, at least 2.5% of respondents, chosen at random, were re-called by IFF’s Quality Control Team. The team re-asked a small number of questions to check that responses had been accurately recorded. Where discrepancies were identified, details were passed to the Research Team and remedial action taken as appropriate (at the extreme, reviewing all interviews conducted by the interviewer in question).
Response rate
A total of 10,373 records acquired from the FSA’s FHRS database were used over the course of the telephone survey with food businesses (533 were not called as quota targets had been achieved). Of the records used during fieldwork, 94 records were ineligible, as the business reported that they did not sell, serve or prepare food for the public (58) or because the business was closed (36).
Of the remaining 10,279 businesses, 8,394 records were eligible to take part in the study but did not do so. The most common reason, as shown in Table 4.3, was that we could not secure consent for an interview (6,870) followed by the business consenting to take part but not committing to an interview time in the fieldwork period (773), and not being able to identify an individual within the business to take part in the research in the fieldwork period (398).
Response rate calculations do not include records that were outside of the scope of the fieldwork, given that no firm contact was made with these food businesses. This means that 1,885 records were in scope of fieldwork. Of these, 1,500 completed an interview. This equates to a response rate of 80% (see Table 4.8).
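The response rate arithmetic above can be reproduced directly from the figures given in the text:

```python
# Figures taken from the text above; the final rounding reproduces the
# reported 80% response rate.
used_records = 10_373     # FHRS sample records used during fieldwork
ineligible = 94           # closed, or did not sell/serve/prepare food for the public
remaining = used_records - ineligible   # 10,279 eligible records

in_scope = 1_885          # records where firm contact was made during fieldwork
completed = 1_500         # achieved interviews

response_rate = completed / in_scope
print(f"{response_rate:.0%}")  # prints 80%
```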
Overlap between the audit and telephone survey of food businesses
Of the 1,479 food businesses audited, 443 also participated in the telephone survey. Tables 4.9 to 4.11 present the profile of these food businesses in terms of country, sector and FHRS rating.
5. Ethics
IFF uphold a steadfast commitment to ethical practices in the design and conduct of all research. The 2023 wave of the FHRS survey and audit was conducted in strict accordance with the Market Research Society Code of Conduct and applied the principles of the Government Social Research Code on research ethics. With regards to the survey of food businesses, IFF:
- Ensured informed consent was provided by all participants. As with previous waves of the survey, recruitment screening questions explained the aims of the research, that participation was voluntary and what it would entail for the respondent (especially interview duration), explained and assured anonymity/confidentiality, and provided information about participants’ rights under GDPR.
- Designed a reassurance email, in collaboration with the FSA, that could be sent to participants confirming the details of the research and providing details of how to opt out of the research.
- Offered respondents telephone numbers and email addresses that they could contact for reassurance or further information. These included the IFF project manager, the Market Research Society freephone number and contact details for the FSA.
- Worked within the requirements of data protection legislation, including the Telephone Preference Service, Mail Preference Service and Corporate Telephone Preference Service.
- Took all reasonable steps to minimise the burden on participants. This was principally achieved through a well-designed questionnaire, with questions worded to be as easy as possible to answer and routing to ensure questions were only asked of those to whom they applied. Interviews were conducted at a time and in a manner to best suit the respondent. It was also made clear during interviews that respondents who were unable or unwilling to answer a particular question did not need to.
- Protected the confidentiality and anonymity of respondents. IFF’s compliance with ISO 27001 ensures that participants’ personal data is safeguarded, and all outputs were carefully checked by two researchers to ensure confidentiality had been maintained.
- Checked and recorded any specific requirements respondents had to enable them to take part. Careful attention was paid to how the IFF interviewer communicated with the respondent (including the need for interpreters, or having other individuals present to help respondents feel at ease).
The audit strand of this research carried some specific ethical considerations given that audited businesses cannot, by the very nature of mystery shopping as a research technique, give informed consent for participation. However, Mystery Shoppers observe strict ethical standards: they are a Company Partner of the Market Research Society and a member of the Mystery Shopping Professionals Association, who have a code of ethical standards. All audits were conducted in such a way that no major disruption to the normal operation of the food businesses being observed was caused and that the mystery shopping results were reported accurately and honestly. Furthermore, special care was taken to ensure the safety of auditors at all times.
6. Weighting and analysis
Weighting
In line with standard market research practice, the data collected from the audit and telephone survey of food businesses was weighted to make it representative of the underlying population. Weighting the data was necessary because of the deliberate decision to stratify interviews to ensure sufficient base sizes were achieved by country and FHRS rating.
Weights were applied to the data to make it representative of the target population within each country in terms of sector and FHRS rating. To achieve this the FSA supplied an updated FHRS database export in December 2023. The profile of the underlying population in each country used for weighting is presented in Tables 6.1 and 6.2.
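As an illustration of the weighting approach described above, the sketch below computes a cell weight as the cell’s population share divided by its sample share, so that oversampled groups are weighted down and undersampled groups are weighted up. The figures are invented for illustration and are not taken from the report’s tables.

```python
def cell_weights(population_counts, sample_counts):
    """Return a weight per cell so the weighted sample matches the
    population profile (weight = population share / sample share)."""
    pop_total = sum(population_counts.values())
    samp_total = sum(sample_counts.values())
    weights = {}
    for cell, n_samp in sample_counts.items():
        pop_share = population_counts[cell] / pop_total
        samp_share = n_samp / samp_total
        weights[cell] = pop_share / samp_share
    return weights

# Illustrative figures only: takeaways were deliberately oversampled,
# so they receive a weight below 1; restaurants a weight above 1.
population = {("Takeaway", "0-3"): 2_000, ("Restaurant", "4-5"): 8_000}
sample = {("Takeaway", "0-3"): 200, ("Restaurant", "4-5"): 300}
print(cell_weights(population, sample))
```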
Analysis
When comparing results across years or between sub-groups it is essential to establish whether these differences are significant or not, that is, whether we can be certain that a change in a particular percentage from one year to the next is sufficiently large to be considered a genuine movement and not due to chance. In order to do this, significance testing was carried out on survey and audit findings using t testing and z testing.
Two types of test were used: comparisons between each set of cross-break headings (for example, comparing the different sector categories within the sector cross-break) and comparisons of the data in each subgroup against the total minus the data in that column. To facilitate this analysis, data tables were produced that employed significance tests (that is, testing the results for a given subgroup against the results for each of the other subgroups within a given analysis ‘break’).
All differences noted in the main report are statistically significant to a 95% confidence level or above. By convention, this is the statistical ‘cut off point’ used to mean a difference is large enough to be treated as genuine. This means there is at least 95% certainty that the difference in distribution is not due to chance but indicates a genuine change. In some parts of the report, differences which are not statistically significant using this test (but add to the overall ‘story’) have been included. In these instances it has been made clear that the difference was not statistically significant.
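The z-testing referred to above can be illustrated with a standard two-proportion z-test using a pooled standard error; the display rates and base sizes below are hypothetical, not results from this research.

```python
from math import sqrt

def two_prop_z(p1, n1, p2, n2):
    """Z statistic for the difference between two independent proportions,
    using the pooled estimate of the standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical example: an 88% display rate among 500 audited sites in one
# country versus 82% among 500 in another. |z| > 1.96 corresponds to
# significance at the 95% confidence level used in the report.
z = two_prop_z(0.88, 500, 0.82, 500)
print(abs(z) > 1.96)  # prints True
```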
In some instances, the base sizes of certain questions or subgroups are small. In these instances findings should be treated with caution. Instances of low base sizes are highlighted in the report.
Where possible, comparisons between survey and the audit findings have been made in the main report. However, a degree of caution is needed when looking at these comparisons owing to survey findings relying on self-reported information provided by respondents and audit findings being derived from direct observations. Both data sources, but particularly survey data, may be subject to biases or inaccuracies. While comparisons provide valuable insights, they should be made with careful consideration of the strengths and limitations associated with each data collection method.
Quality assurance
All coding, data processing and analysis was conducted by IFF’s experienced Data Services team and took place in-house. This allowed for careful collaboration to ensure high quality outputs.
The research team created dataset and table specifications, which were signed off by the Research Director. Analysis cross-breaks for subgroup analysis and derived variables were developed by the project team and agreed with the FSA.
Tables and data files were thoroughly checked by three team members (using the specification documents). This was repeated until all discrepancies were caught and amended before the outputs were signed-off by the project manager.
IFF’s in-house coding team processed verbatim responses collected in the survey and produced a framework (known as a code frame) to categorise them into thematic codes. At the beginning of fieldwork the coding team were briefed on the project and issued with instructions detailing the types of responses to be included in pre-existing codes and potential themes to be mindful of. The research team reviewed the code frame periodically during fieldwork and again at the end of fieldwork (checking at least 10% of all coded verbatim). Once the code frame was complete and signed off, the thematic codes were incorporated into the data tables and data files to allow for quantitative analysis.
[1] This encompassed both an audit of the businesses’ physical premises and of their website, Instagram profile or Facebook Business Page, where they had one.
[2] The FHRS applies to food manufacturers and wholesalers in Wales, but not in England and Northern Ireland. These businesses were excluded from the sample because their premises are typically not publicly accessible and so it would not be possible to covertly audit them.
[3] Tables presenting the profile of the FBO population, the drawn sample and the achieved sample by region and Local Authority are presented in Appendix A.