1. Introduction

The Food Hygiene Rating Scheme (FHRS) was formally launched in November 2010. The scheme is designed to help consumers make more considered choices about where they eat out or shop for food by providing clear information about the hygiene standards of food businesses at their last inspection by a food safety officer. The FHRS is operated in partnership with local authorities and the Welsh Government in Wales.

Establishments where food is supplied, sold or consumed are given a rating ranging from 0 to 5, with 5 indicating ‘very good’ food hygiene and 0 indicating ‘urgent improvement necessary’. The ratings are determined by three elements: hygienic food handling; physical condition of the premises and facilities; and food safety management. In Wales and Northern Ireland, businesses are legally required to display their ratings in a prominent place at their premises. In England the scheme is voluntary, and businesses are encouraged to display their ratings. In all three countries, food hygiene ratings can be searched for on the FHRS website.

The Food Standards Agency (FSA) has conducted annual research into the Display of Food Hygiene Ratings in England, Northern Ireland and Wales since 2011. The research objectives in 2024 were to:

  • Provide a representative estimate of the display of food hygiene rating stickers at physical premises

  • Provide a representative estimate of the display of food hygiene ratings online

  • Explore business awareness and attitudes towards the Scheme

  • Explore the reasons and drivers for display and non-display both at the premises and online.

To meet the research objectives, a two-pronged approach was adopted, consisting of:

  1. A covert audit of 1,348 food businesses in England, Northern Ireland and Wales, conducted by Mystery Shoppers. This included both an audit of the businesses’ physical premises and, where applicable, their websites and social media accounts.

  2. A telephone survey of 1,349 food businesses in England, Northern Ireland and Wales, conducted by IFF Research.

This paper outlines the methodological approach taken for both strands of the research, including sampling; pilot and mainstage fieldwork; response rates; ethics; and weighting and analysis.

2. Sampling

This section covers the process of acquiring and preparing the necessary contact details for potential participants of the study. The aim of this process was to ensure the sample was representative of the food business population within each country.

2.1. Initial sample

For both the audit and telephone survey of food businesses, the sample was obtained from the FSA’s FHRS database. The initial dataset (received from the FSA on 30 August 2024) contained 469,602 food businesses.

The sample was processed to exclude food businesses that were ineligible for the research; these were defined as those that had not yet been inspected and issued with an FHRS rating, those operating from premises that were not publicly accessible (e.g. manufacturers and wholesalers), mobile food businesses, and food businesses operating within a residential property.[1]

The initial sample for both the audit and the telephone survey of food businesses included the following outlet types:

  • Accommodation (e.g., hotels) and pubs, bars, and nightclubs

  • Restaurants, cafes, and other catering businesses (e.g., event caterers and home caterers)

  • Retail (e.g., supermarkets, butchers, and bakeries)

  • Takeaways and sandwich shops

To ensure representativeness of the underlying population of food businesses, records were drawn from the sample frame using a stratified random sampling approach. Specifically, the sample was stratified by setting quotas for country, region within country, outlet type and FHRS rating. Additionally, businesses in Northern Ireland and Wales, takeaways and sandwich shops, and businesses with a food hygiene rating of three or less were oversampled to ensure that robust results could be produced for each sub-group, given historically lower than average response rates and – in some cases – a relatively low population. In total, 28,266 food businesses were drawn from the sample frame.
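The stratified draw with oversampling described above can be sketched as follows. This is a minimal illustration only: the field names ("country", "outlet_type", "rating") and the quota values are assumptions for the sketch, not the actual FHRS database schema or the quotas used in the study.

```python
import random
from collections import defaultdict

def stratified_sample(records, quotas, seed=2024):
    """Draw a fixed quota of records, at random, from each stratum.

    Oversampled strata (e.g. Wales, takeaways, ratings of three or less)
    simply carry a larger quota than their population share would imply.
    """
    rng = random.Random(seed)

    # Group the sample frame into strata keyed by the stratification variables.
    strata = defaultdict(list)
    for rec in records:
        key = (rec["country"], rec["outlet_type"], rec["rating"])
        strata[key].append(rec)

    # Draw each stratum's quota at random (capped at the stratum size).
    sample = []
    for key, quota in quotas.items():
        pool = strata.get(key, [])
        sample.extend(rng.sample(pool, min(quota, len(pool))))
    return sample
```

In this design the quota table, not the draw itself, carries the oversampling: setting a Welsh stratum's quota above its proportional share inflates that sub-group in the achieved sample, which is then corrected at the weighting stage.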

There are no telephone numbers available on the FHRS database. IFF Research therefore undertook a two-stage process of appending telephone numbers so that businesses included in the sample frame could be contacted for survey fieldwork.

  1. Automated telephone matching: business details (trading name and postal address) were uploaded to the secure websites of two external data suppliers (Sagacity and Market Location) and telephone numbers were automatically appended.

  2. Internal desk research: telephone numbers were identified and appended to the database using online search engines and directories.

The two-stage process of appending telephone numbers to the sample frame yielded contact details for 15,158 food businesses. This equates to a telephone match rate of 54%. The telephone match rate was notably higher for accommodation and pubs, bars, and nightclubs (65%) and takeaways and sandwich shops (57%) compared to 2023. The match rate for restaurants, cafes, canteens & catering and retail businesses was slightly below average, at 49% and 50% respectively. Across FHRS ratings, the telephone match rates were notably higher than average for businesses with ratings of two (66%) and three (61%). The match rate for other rating values was in line with the average.
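The overall match rate follows directly from the counts reported above (15,158 matched out of 28,266 drawn); a quick check:

```python
# Figures taken from the text above.
drawn = 28_266    # food businesses drawn from the sample frame
matched = 15_158  # businesses with a telephone number successfully appended

match_rate = matched / drawn
print(f"Telephone match rate: {match_rate:.0%}")  # Telephone match rate: 54%
```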

From the starting sample of 15,158, a subset of 3,699 business sites was drawn for audit fieldwork (selected at random and stratified by country, outlet type and FHRS rating to broadly reflect the underlying population). A smaller sample was required for the audit fieldwork because the ability to conduct covert audits was not reliant on the consent of the business.

The sample frame for the 2024 wave of the FHRS Audit and Survey was in line with the sample frame of the previous wave. Bed & Breakfasts, home caterers, kitchens without physical premises (i.e. ‘dark kitchens’) and food businesses considered inaccessible to the public (e.g. staff canteens) were not included in the starting sample for the audit.

Tables 1 to 3 present the profile of the starting sample (15,158) of eligible and usable food businesses for audit and telephone fieldwork in terms of country, outlet type and FHRS rating.

Table 1. Starting sample for audit and telephone fieldwork by country
Country Telephone survey Audit
England 4,567 1,255
Northern Ireland 5,254 1,213
Wales 5,337 1,231
Total 15,158 3,699
Table 2. Starting sample for audit and telephone fieldwork by sector
Sector Telephone survey Audit
Accommodation & pub/bar/nightclub 3,154 704
Restaurants, cafes, canteens, catering 5,364 1,231
Retail 3,646 960
Takeaways and sandwich shop 2,994 804
Total 15,158 3,699
Table 3. Starting sample for audit and telephone fieldwork by FHRS rating
FHRS Rating Telephone survey Audit
0 93 16
1 344 68
2 432 67
3 1,309 279
4 2,710 638
5 10,270 2,631
Total 15,158 3,699

2.2. Changes to sample size

This wave, the decision was made to reduce the total number of surveys and audits from 1,500 to 1,350[2], in order to deliver the project within the budget available. This reduction in volume had a negligible impact on confidence intervals by country; a base of 500 carries a maximum margin of error of ±4.38% while a base of 450 carries a maximum margin of error of ±4.62% (a difference of ±0.24%). Despite the reduction in sample size, data collection techniques remained consistent, ensuring comparability over time.
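The quoted margins of error correspond to the standard worst-case formula for a proportion at the 95% confidence level, where the maximum occurs at p = 0.5: MoE = z × √(0.25 / n), with z = 1.96. A quick check reproduces both figures:

```python
import math

def max_margin_of_error(n, z=1.96):
    """Worst-case (p = 0.5) margin of error for a proportion at 95% confidence."""
    return z * math.sqrt(0.25 / n)

for n in (500, 450):
    print(f"n={n}: ±{max_margin_of_error(n):.2%}")
# n=500: ±4.38%
# n=450: ±4.62%
```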

3. Audit fieldwork

3.1. Overview

This is the fourth wave of the FHRS display audit research IFF Research have conducted in partnership with Mystery Shoppers Ltd (MSL) (although annual research into the FHRS has occurred since 2011). An audit of 1,348 food businesses in England, Northern Ireland and Wales was conducted between October and December 2024. The audits were conducted by MSL and involved a physical audit of premises and their online presence (including websites, Facebook Business Pages or Instagram profiles). The audit followed a pre-defined questionnaire, with photos and online screenshots collected as supporting evidence.

3.2. Changes to methodology

Effort was made to ensure as much consistency as possible with previous waves of the research to allow for tracking of results over time. However, some changes were put in place for the 2024 wave, and as such, a pilot fieldwork phase was conducted before mainstage fieldwork to evaluate the impact of these changes on results.

3.2.1. Online audit

In this wave, auditors conducted a more thorough online audit than in previous years, with the aim of better understanding the presence of food business operators, the facilitation of online orders, and the display of FHRS ratings on three prominent online platforms: businesses’ own websites, Facebook business pages and Instagram profiles. The audit also involved an assessment of the use of online food delivery aggregators (JustEat, Deliveroo and UberEats).

In line with last wave, auditors carried out a comprehensive inspection of businesses’ websites, navigating through pages to check for a rating and evidence of online ordering functionality. This wave, a new open-ended question was introduced, allowing auditors to provide detailed descriptions of how FHRS ratings were presented on websites. Additionally, auditors needed to verify that online ordering functionality was embedded directly within the businesses’ own websites, ensuring customers could place orders without being redirected to third-party platforms.

For social media platforms, auditors focused on the immediate landing pages. This approach differs slightly from the methodology used in 2023, which involved a more extensive search across social media profiles. By auditing only the landing page, the findings more accurately reflect the consumer experience when visiting online platforms. Auditors recorded how ratings were displayed across the platforms and also whether the outlet’s social media pages facilitated online orders (this was defined as having dedicated food order buttons, prominent links to food delivery aggregators, and written references to ordering via the platform’s messaging service). The approach used for the online audit is broadly in line with the FSA guidance on displaying food hygiene ratings on business websites and social media profiles.

In addition, in the online audit section of the questionnaire, the existing question ‘Does the outlet have its own website with an online ordering function?’ was split out to capture whether an outlet had 1) its own website and 2) an online ordering function. This ensured that ratings displayed on the websites of those businesses who do not have their own online ordering capabilities were still captured.

FoodHub was also removed from the online food delivery aggregator usage question, due to low incidence in 2023.

3.2.2. Physical audit

The only change made to the physical audit section of the audit questionnaire affected businesses in Wales. Here, an additional step was introduced which required auditors to take a close-up photo of the FHRS sticker if it was found not to have the Welsh Government Dragon logo (i.e. a non-compliant sticker). Stickers without the Welsh Government Dragon logo went out of circulation in 2013, when display was made mandatory in Wales; despite this, in recent years auditors have reported observing non-compliant stickers.

3.3. Pilot fieldwork

Pilot audit fieldwork was conducted between 7 and 12 October 2024 to ensure the updated questionnaire was appropriate ahead of mainstage fieldwork. A small proportion of the starting sample (135 businesses) was randomly selected for pilot fieldwork, from which 44 audits were conducted. This number of audits allowed for the updated audit approach to be tested with food business operators in different sectors and with different FHRS ratings across all three countries. All 44 audits completed during the pilot involved auditors conducting a physical and online audit of display. Tables 4 to 6 present the number of pilot audits completed by country, sector, and FHRS rating.

Table 4. Profile of pilot audits of food businesses by country
Country Completed pilot audits
England 15
Northern Ireland 15
Wales 14
Total 44
Table 5. Profile of pilot audits of food businesses by sector
Sector Completed pilot audits
Accommodation & pub/bar/nightclub 2
Restaurants, cafes, canteens, catering 22
Retail 11
Takeaways and sandwich shop 9
Total 44
Table 6. Profile of pilot audits of food businesses by FHRS rating
FHRS Rating Completed pilot audits
0-2 7
3 12
4 10
5 15
Total 44

All auditors received a verbal briefing and were issued with a written briefing pack before the start of pilot fieldwork. The content of the briefing was broadly in line with the previous wave, providing auditors with an understanding of the background of the research, the questionnaire design, the screening criteria, and the sample design. The briefing also included detailed instructions on how to approach the online audit this wave, including step-by-step instructions, examples of accepted rating display across websites and social media platforms and examples of both built in and linked online order functionality.

The questionnaire performed well during pilot fieldwork. Auditors experienced no issues with any aspect of the updated audit materials and so no changes were necessary ahead of mainstage fieldwork. The final version of the audit questionnaire used in mainstage fieldwork can be found in Appendix A of this report.

3.4. Mainstage fieldwork

Mainstage audit fieldwork took place between 28 October and 12 December 2024. In total, 1,350 audits were completed. All audits completed during the mainstage involved auditors conducting a physical and online audit of display. Upon completion of the audit fieldwork period, data quality checks meant that two records had to be removed from the dataset. The final profile of the audits achieved by country, sector and FHRS rating is detailed in Tables 7 to 9.

Table 7. Profile of mainstage audits of food businesses by country
Country Completed mainstage audits
England 450
Northern Ireland 450
Wales 448
Total 1,348
Table 8. Profile of mainstage audits of food businesses by sector
Sector Completed mainstage audits
Accommodation & pub/bar/nightclub 215
Restaurants, cafes, canteens, catering 531
Retail 399
Takeaways and sandwich shop 203
Total 1,348
Table 9. Profile of mainstage audits of food businesses by FHRS rating
FHRS Rating Completed mainstage audits
0-2 43
3 100
4 241
5 964
Total 1,348

As with the pilot, before the start of mainstage fieldwork all auditors received a briefing on the survey and were issued with a written briefing pack (see Appendix B). Over the course of fieldwork, auditors visited a total of 1,386 establishments. Of these, 36 were found to be un-auditable at the point of visit. Common reasons for this were establishments having shut down despite appearing open online, incorrect opening hours displayed online and temporary closures (e.g. refurbishment, staff illness, seasonal hours).

3.5. Quality assurance

Mystery Shoppers are a Company Partner of the Market Research Society, and the Mystery Shopping Professionals Association (MSPA). As a member of MSPA, Mystery Shoppers adhere to the MSPA’s Code of Professional Ethics and Code of Professional Standards and apply quality checking procedures at every stage of research. This involved:

  • Auditors being hand-picked based on their location and experience of both working on projects among food businesses, and of conducting audits in this fashion.

  • All auditors receiving a briefing led by the project account team and supported by IFF Research.

  • All briefed auditors receiving an online test checking that they understood the requirements of the brief. Only if they passed this were they incorporated into the panel that worked on this project.

  • Upon the completion of each physical audit, the geographical location of the site was uploaded to a form, ensuring the auditor was at the correct establishment and unable to ‘game’ the system by submitting an audit form without having attended the site. Auditors also took photos of the outside of the premises (in a covert fashion).

  • Auditors had direct lines of communication to the project account team at MSL, in case they had queries regarding the audit.

  • A designated quality control manager worked on the project. After each audit was submitted, it was held for 48 hours before being ‘published’ while it underwent a series of quality checks:

    • Location is correct (verification against photos and geolocation).

    • The date and time are correct and match the brief and the date and time the photo was taken for the physical audit.

    • All questions have been answered fully by the auditor in both the physical and online audit sections of the questionnaire, with enough text provided in the comments where applicable.

    • Question answers and comments are consistent (e.g. if a question is marked ‘no’, the comment fully explains why not).

    • Answers are not ambiguous.

    • Comments are checked for obvious spelling and grammar errors.

    • If anomalies are identified, the team will speak with the auditor directly and obtain clarification.

    • 10% of assignments are then checked for a second time by another team member.

    • For the online audit, auditors uploaded relevant screenshots clearly showing any ratings displayed, how outlets facilitate online ordering (i.e. an online order function on their own website or showing a clear link to a third party online ordering function), as well as providing evidence of their presence on the specified online platforms (Facebook, Instagram and food delivery aggregators).

4. Telephone survey

4.1. Overview

IFF Research conducted Computer Assisted Telephone Interviews (CATI) with 1,349 food businesses across England, Northern Ireland and Wales between October and November 2024. The survey covered awareness of and attitudes towards the FHRS, reasons and drivers for rating display (and non-display in England), attitudes towards mandatory physical and online display and experiences with their local authority food hygiene department.

4.2. Changes to methodology

As with the audit, effort has been made to ensure there was as much consistency as possible across waves of the telephone survey in order to make time series comparisons. However, some changes were made at this wave and as such a short pilot fieldwork phase was conducted before mainstage fieldwork to evaluate the impact of these changes on results.

The survey fieldwork period was brought forward this wave. Mainstage fieldwork took place between 14 October and 29 November 2024. In the previous wave, mainstage fieldwork took place between 24 October and 15 December 2023. This adjustment was made to mitigate the decline in engagement in the telephone survey typically observed in the lead-up to Christmas.

The changes made to the survey questionnaire are listed below:

  • At question B2, a new code was introduced to cover those businesses who had not yet been given a Food Hygiene Rating because they were awaiting inspection at the time of the survey, making the answer options exhaustive.

  • Throughout the survey, all references to ‘a delivery service website/app’ were updated to ‘a third-party website/app’, to better capture the food delivery aggregator services.

  • The questions which included the codes ‘On a third-party website/app’ and ‘On the online ordering function on social media’ were updated to include examples of applicable platforms: ‘(e.g. JustEat, UberEATS or Deliveroo)’ for the former and ‘(e.g. Instagram, X (formerly known as Twitter), or Facebook)’ for the latter.

  • At questions B21 and B22, which explore reasons for non-display on businesses’ websites and on their social media platforms, new codes were added to capture common responses not already included in the answer options. Several codes were also removed from these questions which were chosen by very few respondents in 2023.

  • A minor wording adjustment was made to question C1 to make it more explicit how food businesses could have been contacted by their local authority food hygiene department (an inspection letter or report).

  • Answer options at questions C5, C8 and C15 (questions which explore reasons for non-usage of safeguards) were reviewed to remove uncommon answer codes and include new ones (‘we went with a different approach’, ‘there is no point until we’ve made relevant changes’, ‘it does not matter enough’, ‘we will wait until our next inspection’).

  • At question C9, there was a slight change to the question wording to clarify that the question was asking about attitudes towards mandatory physical display in England.

  • Question C12anew had a minor wording change to allow interviewers to probe why businesses in Wales and Northern Ireland thought mandatory physical display was a good thing.

  • A brand-new question (C17) was added to assess awareness of the FSA’s new campaign encouraging businesses to display their ratings online.[3]

  • Throughout the survey, all references to the social media platform ‘Twitter’ were updated to ‘X (formerly known as Twitter)’ to reflect the platform’s name change.

4.3. Pilot fieldwork

Between 30 September and 3 October 2024, IFF Research piloted the updated survey with 50 businesses. This number of interviews allowed for the materials to be tested with food business operators in different sectors and with different FHRS ratings across all three countries. Tables 10 to 12 present the number of interviews completed by country, sector and FHRS rating in the pilot.

Table 10. Profile of pilot interviews with food businesses by country
Country Completed pilot interviews
England 16
Northern Ireland 17
Wales 17
Total 50
Table 11. Profile of pilot interviews with food businesses by sector
Sector Completed pilot interviews
Accommodation & pub/bar/nightclub 11
Restaurants, cafes, canteens, catering 25
Retail 12
Takeaways and sandwich shop 2
Total 50
Table 12. Profile of pilot interviews with food businesses by FHRS rating
FHRS Rating Completed pilot interviews
0-2 2
3 2
4 12
5 34
Total 50

Before the start of pilot fieldwork, all interviewers received a briefing on the survey and were issued with a written briefing pack, providing them with an understanding of the background to the research, the questionnaire design, the screening criteria and the sample design.

The pilot survey involved administering the survey exactly as it would be during mainstage fieldwork. As well as allowing for further checks on comprehension of questions and survey flow, the pilot provided an opportunity to monitor response patterns and the overall interview length.

The results of the pilot were positive. There were no issues with the screening process, businesses were generally willing to participate and there was limited feedback from interviewers regarding issues with participant comprehension. However, a couple of minor refinements were made to improve respondent comprehension of question D1a. The final version of the questionnaire used in mainstage fieldwork can be found in Appendix C of this report. On average, the survey took 18 minutes and 27 seconds to administer during the pilot.

4.4. Mainstage fieldwork

Mainstage fieldwork was conducted between 14 October and 29 November 2024. A total of 1,349 food businesses were surveyed. Tables 13 to 15 present the number of interviews completed by country, sector and FHRS rating. As with the pilot, before the start of mainstage fieldwork all interviewers received a briefing on the survey and were issued with written instructions. The average survey duration fell during mainstage fieldwork to 17 minutes and 9 seconds.

Table 13. Profile of mainstage interviews with food businesses by country
Country Completed mainstage interviews
England 450
Northern Ireland 450
Wales 449
Total 1,349
Table 14. Profile of mainstage interviews with food businesses by sector
Outlet type Completed mainstage interviews
Accommodation and pubs, bars and nightclubs 244
Restaurants, cafes and catering 557
Retail 366
Takeaways and sandwich shops 182
Total 1,349
Table 15. Profile of mainstage interviews with food businesses by FHRS rating
Food hygiene rating Completed mainstage interviews
0-2 44
3 92
4 217
5 996
Total 1,349

4.5. Quality assurance

All interviewers that worked on the survey were given full Interviewer Quality Control Scheme (IQCS) training, regardless of previous experience, followed by three days of on-the-job training. Only at the end of this period, and having reached minimum standards, are interviewers added to the IFF panel. Interviewers continue to receive enhanced monitoring and mentoring until it is judged that they may be put into the ongoing monitoring pool along with established IFF interviewers. Ongoing monitoring is conducted on all interviewers throughout their time with IFF Research. Other practices to ensure quality that relate specifically to the quantitative interviewing phase of this study include:

  • The Computer Assisted Telephone Interviewing (CATI) questionnaire derived from the survey questionnaire agreed by the FSA underwent thorough checks before “going live” for interviewing. Three members of the project team tested the survey, making sure that all routing instructions were followed as prescribed, that the question text and interviewer instructions read as intended and that answering protocols were respected (e.g. where a question is intended to get a single response, it is not possible to record more than one response). Dummy data was also used as a further check on routing.

  • Following a briefing from the Research Manager, interviewers were encouraged to spend up to an hour running through the test version of the CATI questionnaire to familiarise themselves with the survey.

  • Interviewer standards were monitored continuously throughout fieldwork. All interviewers on the project had at least 5% of interviews listened in to and monitored “live” by project supervisors. Where best practice is not being followed, remedial action is taken as appropriate (e.g. interviewers will repeat their training, or be re-briefed, etc.). In any cases where malpractice is identified (a rare occurrence), all of the interviewers’ calls on the project will be reviewed – with respondents re-contacted to check their responses.

  • Auto-dialler software was used, meaning interviewers were randomly and automatically assigned numbers to call. This means they could not introduce sample selection bias by choosing which calls to make.

  • All telephone interviews are automatically recorded by the CATI system and these recordings are used as part of the quality control monitoring process.

  • The project team periodically reviewed answer patterns across all survey questions. In particular, levels of non-response (don’t knows and refusals) were examined both on an aggregate basis and for individual interviewers.

  • In addition to live monitoring and post-hoc data checks, a selection of at least 2.5% of respondents, chosen at random, were re-called by IFF’s Quality Control Team. The team re-asked a small number of questions to check that responses had been accurately recorded. Where discrepancies were identified, details were passed to the Research Team and remedial action taken as appropriate (at the extreme, this meant reviewing all interviews conducted by the interviewer in question).

4.6. Response rate

A total of 15,159 records acquired from the FSA’s FHRS database were loaded for the telephone survey with food businesses; 3,734 of these were not called, as quota targets had been achieved or because a sufficient number of surveys had already been completed with a chain of food businesses. Of the records used during fieldwork, 136 were found to be ineligible for the research, either because the business reported that it did not sell, serve or prepare food for the public or because the business was closed.

Of the remaining 11,289 businesses, 9,447 were eligible to take part in the study but did not do so. As shown in Table 16 below, the most common reason was that consent for an interview could not be secured (8,208), followed by the business consenting to take part but not committing to an interview time during the fieldwork period (492), and it not being possible to identify an appropriate individual within the business during the fieldwork period (392).

Table 16. Survey outcome for the sample in scope of the study
Survey outcome Total % of population in scope of study
Total in scope of study 11,289 100%
Business called 1 to 10 times but unable to reach target respondent 8,208 73%
Appointment made but not achieved during fieldwork period 492 4%
Out of quota – sector / size / country 185 2%
Not available in fieldwork period / nobody at site available 392 3%
Unobtainable number 170 2%
In scope of study but did not participate in the research 9,447 84%
In scope of study and fieldwork 1,842 16%

Response rate calculations do not include records that were outside of the scope of the fieldwork, given that no firm contact was made with these food businesses. This means that 1,842 records were in scope of fieldwork. Of these, 1,349 completed a survey. This equates to a response rate of 73% (see Table 17).

Table 17. Survey outcome for the sample in scope of fieldwork
Survey outcome Total % of population in scope of study (11,289) % of population in scope of fieldwork
Total in scope of fieldwork 1,842 16% 100%
Achieved interviews 1,349 12% 73%
Refusals 324 3% 18%
Drop out during the interview 168 1% 9%
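The response rate in Table 17 follows directly from the counts above: completed interviews as a share of the records in scope of fieldwork. A quick check (figures taken from the text):

```python
# Records in scope of fieldwork and their outcomes, from Table 17.
in_scope_fieldwork = 1_842
achieved = 1_349
refusals = 324
dropouts = 168

response_rate = achieved / in_scope_fieldwork
print(f"Response rate: {response_rate:.0%}")  # Response rate: 73%
```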

4.7. Overlap between the audit and telephone survey

This year, a crossover target of 500 businesses completing both the telephone survey and the audit was set. This dual engagement yields more comprehensive data; for example, collecting both observed and self-reported data gives a more holistic understanding of rating display.

Of the 1,348 food businesses audited, 574 also participated in the telephone survey, up from the 443 achieved in 2023. To achieve this, survey fieldwork was scheduled slightly before audit fieldwork, allowing the auditors to make concerted efforts to target businesses who had already completed the telephone survey. Tables 18 to 20 present the profile of the food businesses that were audited and participated in the survey in terms of country, sector and FHRS rating.

Table 18. Profile of food businesses both audited and surveyed by country
Country Completed both survey and audit
England 189
Northern Ireland 202
Wales 183
Total 574
Table 19. Profile of food businesses both audited and surveyed by sector
Outlet type Completed both survey and audit
Accommodation and pubs, bars and nightclubs 96
Restaurants, cafes and catering 219
Retail 200
Takeaways and sandwich shops 59
Total 574
Table 20. Profile of food businesses both audited and surveyed by FHRS rating
Food hygiene rating Completed both survey and audit
0-2 15
3 33
4 83
5 443
Total 574

5. Ethics

5.1. Telephone survey

IFF Research upholds a steadfast commitment to ethical practices in the design and conduct of all research. The 2024 wave of the FHRS survey and audit was conducted in strict accordance with the Market Research Society Code of Conduct and applied the principles of the Government Social Research Code on research ethics. With regards to the survey of food businesses, IFF:

  • Ensured informed consent was provided by all participants. As with previous waves of the survey, recruitment screening questions explained the aims of the research, that participation was voluntary and what it would entail for the participant (especially interview duration); explained and assured anonymity/confidentiality; and provided information about participants’ rights under GDPR.

  • Designed a reassurance email in collaboration with the FSA that could be sent to participants, confirming the details of the research and providing details of how to opt-out of the research.

  • Offered participants telephone numbers and email addresses that they could contact for reassurance or further information. These included the IFF project manager, the Market Research Society freephone number and contact details for the FSA.

  • Worked within the requirements of data protection legislation, including the Telephone Preference Service, Mail Preference Service and Corporate Telephone Preference Service.

  • Took all reasonable steps to minimise the burden on participants. This was principally achieved through a well-designed questionnaire, with questions worded to be as easy as possible to answer and routing to ensure questions were only asked of those to whom they applied. Interviews were conducted at a time and in a manner that best suited the respondent. It was also made clear during interviews that participants who were unable or did not wish to answer a particular question did not need to.

  • Protected the confidentiality and anonymity of participants. IFF’s compliance with ISO 27001 ensures that participants’ personal data is safeguarded, and all outputs are carefully checked by two researchers to ensure confidentiality has been maintained.

  • Checked and recorded any specific requirements participants had to enable them to take part. Careful attention was paid to how the IFF interviewer communicated with the participant (including the need for interpreters, or having other individuals present to put participants at ease).

5.2. Audit

The audit strand of this research carried specific ethical considerations, given that audited businesses cannot, by the very nature of mystery shopping as a research technique, give informed consent to participate. However, Mystery Shoppers observes strict ethical standards: the company is a Company Partner of the Market Research Society and a member of the Mystery Shopping Professionals Association, which maintains a code of ethical standards. All audits were conducted so as to cause no major disruption to the normal operation of the food businesses being observed, and the mystery shopping results were reported accurately and honestly.

Special care was taken to ensure the safety of auditors at all times. Mystery Shoppers conducted thorough risk assessments to protect auditors from adverse consequences, and auditors were encouraged to avoid scenarios that could threaten their safety or wellbeing. If auditors were unable to take a photo covertly, they were advised not to take one and to record this instead.

6. Weighting and analysis

6.1. Weighting

The data collected from the audit and telephone survey of food businesses was weighted to make it representative of the underlying population. Weighting the data was necessary because of the deliberate decision to stratify interviews to ensure sufficient base sizes were achieved by country and FHRS rating.

Weights were applied to the data to make it representative of the target population within each country in terms of sector and FHRS rating. To achieve this, the FSA supplied an updated FHRS database export in December 2024. Across the surveys and audits conducted, the weighting efficiency ranged from 97% to 99% (see Appendix D). This indicates minimal variance inflation due to weighting adjustments, ensuring robust and reliable insights. The profile of the underlying population in each country used for weighting is presented in Table 21 and Table 22.

Table 21. Profile of the underlying business population by sector within country
Sector England Northern Ireland Wales
Accommodation and pubs, bars and nightclubs 17.0% 14.5% 21.8%
Restaurants, cafes and catering 40.1% 41.0% 37.5%
Retail 27.3% 28.0% 26.6%
Takeaways and sandwich shops 15.7% 16.5% 14.1%
Table 22. Profile of the underlying business population by FHRS rating within country
Food hygiene rating England Northern Ireland Wales
0-1 2.1% 0.5% 2.1%
2 1.9% 1.2% 1.7%
3 7.8% 5.2% 8.3%
4 16.7% 15.8% 20.0%
5 71.5% 77.2% 67.8%
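
The weighting efficiency figures quoted above (97% to 99%) can be illustrated with a short sketch. This assumes the commonly used Kish definition of weighting efficiency (the effective sample size divided by the actual sample size); the report does not state which formula was used, and the weights below are invented for illustration, not the actual survey weights.

```python
def weighting_efficiency(weights):
    """Kish weighting efficiency: effective sample size / actual sample size."""
    n = len(weights)
    sum_w = sum(weights)
    sum_w2 = sum(w * w for w in weights)
    n_eff = sum_w ** 2 / sum_w2  # Kish effective sample size
    return n_eff / n             # 1.0 means no precision loss from weighting

# Illustrative weights: values close to 1 give an efficiency near 100%,
# consistent with the minimal variance inflation described in the text.
weights = [0.95, 1.0, 1.05, 1.0, 0.98, 1.02]
print(round(weighting_efficiency(weights), 3))  # → 0.999
```

An efficiency near 1 indicates that the weighted estimates are almost as precise as unweighted estimates from the same sample, which is what the 97% to 99% range in Appendix D implies.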

6.2. Analysis

When comparing results across years or between subgroups it is essential to establish whether differences are statistically significant, that is, whether we can be confident that a change in a particular percentage from one year to the next is sufficiently large to be a genuine movement rather than due to chance. To do this, significance testing was carried out on the survey and audit findings using t-tests and z-tests.

Two types of testing were used: comparisons between categories within each cross-break (for example, comparing the different sector categories within the sector cross-break), and comparisons of each subgroup against the total minus that subgroup. To facilitate this, data tables were produced that employed significance tests (that is, testing the results for a given subgroup against the results for each of the other subgroups within a given analysis ‘break’). For differences between waves of the research, significance testing was conducted by running z-tests between the respective sets of data tables.
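
As an illustration of the kind of z-test described above, the sketch below compares two proportions (for example, display rates in two subgroups or waves) using the standard pooled two-proportion z-test. The counts are invented for illustration; they are not taken from the survey or audit data.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for the difference between two proportions, using the
    pooled estimate of the proportion under the null hypothesis."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# At the 95% confidence level (two-tailed), |z| > 1.96 is significant.
z = two_proportion_z(450, 500, 400, 500)  # e.g. 90% vs 80% display rates
print(abs(z) > 1.96)  # → True
```

The 1.96 threshold corresponds to the 95% confidence level used throughout the report: any |z| above it means the difference would be reported as statistically significant.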

All differences noted in the main report are statistically significant at the 95% confidence level or above unless otherwise stated. By convention, this is the statistical cut-off point used to decide that a difference is large enough to be treated as genuine: there is at least 95% certainty that the difference in distribution is not due to chance but indicates a genuine change. Where differences that are not statistically significant have been reported in the main report, this has been made clear.

In some instances, base sizes in the survey and audit data are too small to conduct statistical significance testing between subgroups or waves. Where base sizes are under 50, statistical significance testing has not been conducted and findings should be treated with caution. Instances of low base sizes are clearly highlighted in the report.

Where possible, comparisons between the telephone survey and the audit findings have been made in the main report. However, a degree of caution is needed when looking at these comparisons because survey findings rely on self-reported information provided by respondents, while audit findings are derived from direct observations. Both data sources, but particularly the survey data, may be subject to biases or inaccuracies. While the comparisons provide valuable insights, they should be made with careful consideration of the strengths and limitations of each data collection method.

In the main report, both observed and self-reported online display of food hygiene ratings is covered. As stated above, a degree of caution is necessary when comparing these findings because the telephone survey and the online audit covered different sets of online platforms. In the telephone survey, food businesses were asked: "Do you display your Food Hygiene Rating online? For example, on your website, on social media accounts or a third-party website." If they reported displaying their rating online, a follow-up question asked about a range of specific online platforms where applicable (their own website; the online ordering function on their own website; third-party websites or apps; and online ordering functions on social media or other online platforms). In contrast, the online audit focused solely on businesses’ own websites, Facebook business pages and Instagram profiles.

IFF’s in-house coding team processed verbatim responses collected in the survey and produced a framework (known as a code frame) to categorise them into thematic codes. At the beginning of fieldwork, the coding team were briefed on the project and issued with instructions detailing the types of responses to be included in pre-existing codes and potential themes to be mindful of. The research team reviewed the code frame periodically during fieldwork and again at the end of fieldwork (checking at least 10% of all coded verbatims). Once the code frame was complete and signed off, the thematic codes were incorporated into the data tables and data files to allow for quantitative analysis.

6.3. Quality assurance

All coding, data processing and analysis was conducted by IFF’s experienced Data Services team and took place in-house. This allowed for careful collaboration to ensure high quality outputs.

The research team created dataset and table specifications, and these were signed off by the Research Director. Analysis cross-breaks for subgroup analysis and derived variables were developed by the project team and agreed with the FSA.

Tables and data files were thoroughly checked by three team members (using the specification documents). This was repeated until all discrepancies were caught and amended before the outputs were signed off by the project manager.

In this wave, to improve the accuracy of the audit findings, the ratings observed in the physical audit were cross-referenced against the sample data shared by the FSA in August 2024 to confirm whether they matched. Where a rating observed at the time of audit did not match the published data, the FSA conducted an internal review to verify the rating that was valid at the time of the audit.


  1. The FHRS applies to food manufacturers and wholesalers in Wales, but not in England and Northern Ireland. These businesses were excluded from the sample because their premises are typically not publicly accessible and so it would not be possible to covertly audit them.

  2. Data quality checks conducted once the audit fieldwork was complete resulted in two audit records and one survey record being removed from the data, resulting in a final total base size of 1,348 (see section 3.4) and 1,349 (see section 4.4) respectively.

  3. Findings for this question have not been covered in the main report but data is available in the accompanying data tables.