1. Introduction

The Food Standards Agency (FSA) has been tracking small and micro Food Business Operator (FBO) attitudes towards food-related topics, and trust in the FSA and food system, since 2018. This has helped inform engagement and intervention activity targeted at businesses with fewer than 50 staff.

The survey was initially developed to assess the perceived impact of changes as a result of the UK’s exit from the European Union (EU), and the Achieving Business Compliance (ABC) programme, which aims to modernise the regulation of food businesses in England, Wales and Northern Ireland. Since then, it has evolved to regularly track small and micro FBO views on a range of subjects.

In 2023, the fourth wave of the tracking survey was carried out[1], with the following aims:

  • To ‘unpack’ attitudes towards regulation and deepen insights and knowledge of small and micro enterprises,

  • To measure trust in the FSA and the extent to which the FSA is considered a modern, accountable regulator.

All fieldwork for wave 4 was carried out by IFF Research, an independent market research company, commissioned by the FSA.

This paper outlines the methodological approach taken for wave 4 of the research, including sampling; feasibility testing; pilot and mainstage fieldwork; response rates; and weighting.

2. Sampling

Businesses were eligible for the research if they were small and micro FBOs (i.e., fewer than 50 employees) operating in England, Northern Ireland, or Wales and were in sectors in scope for the research (defined using the UK Standard Industrial Classification (SIC)).[2] IFF Research obtained information on the underlying population of eligible businesses, in terms of sector and size, from the Office for National Statistics Inter-Departmental Business Register (IDBR), sourced from NOMIS (UK Business Counts) on 26 July 2023.

Once eligible business population counts had been obtained, Food Hygiene Rating Scheme (FHRS) ratings in England, Northern Ireland and Wales were applied to derive the number of eligible businesses in each rating grouping. This information was obtained from the weighting profile applied in wave 3 of the Tracker.[3] In total, 251,260 businesses were deemed eligible for the research: 230,090 in England, 7,405 in Northern Ireland and 13,765 in Wales.

The profile of the underlying eligible population was applied to the overall initial target of 700 interviews (500 in England and 100 in both Northern Ireland and Wales) to produce proportional country-level targets by sector grouping, FHRS rating, and number of employees. These proportional targets were then modified to ensure sufficient base sizes for robust analysis.
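The allocation step described above can be sketched as follows. This is a minimal illustration of proportional allocation with largest-remainder rounding; the stratum counts below are hypothetical placeholders, not the study's actual figures, and the final step of modifying targets to ensure robust base sizes is not shown.

```python
def proportional_targets(population_counts, total_interviews):
    """Scale stratum population counts to interview targets,
    using largest-remainder rounding so targets sum exactly."""
    total_pop = sum(population_counts.values())
    raw = {k: v / total_pop * total_interviews for k, v in population_counts.items()}
    targets = {k: int(r) for k, r in raw.items()}  # floor each target
    # hand out the remaining interviews to strata with the largest fractional parts
    shortfall = total_interviews - sum(targets.values())
    for k in sorted(raw, key=lambda k: raw[k] - targets[k], reverse=True)[:shortfall]:
        targets[k] += 1
    return targets

# Hypothetical sector population counts (for illustration only)
sectors = {"Manufacturing": 18_000, "Wholesale": 9_000, "Retail": 60_000,
           "Accommodation": 15_000, "Food and beverage": 128_000}
print(proportional_targets(sectors, 500))
```

The same routine would be run within each country and crossed with FHRS rating and size band before the targets were manually adjusted upwards for small strata.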

Table 2.1. Wave 4 targets by sector grouping, FHRS rating and size, by country
Sector grouping England Northern Ireland Wales Total
Manufacturing 30 10 10 50
Wholesale 40 10 10 60
Retail 130 20 20 170
Accommodation 30 10 10 50
Food and beverage service activities 270 50 50 370
FHRS rating England Northern Ireland Wales Total
0 to 2 30 10 10 50
3 40 10 10 60
4 80 20 20 120
5 270 50 50 370
Awaiting inspection / Exempt 80 10 10 100
Number of employees England Northern Ireland Wales Total
Fewer than 10 410 70 70 550
Between 10 and 19 60 20 20 100
Between 20 and 49 30 10 10 50
Total 500 100 100 700

In the last wave of the Tracker, conducted in 2021, all sample was obtained from the FSA’s FHRS database. Using this database as the sample source presented a number of limitations and challenges, including the difficulty of excluding ineligible FBOs and of stratifying the starting sample by size, the significant time and resource needed to identify and append telephone contact details, and limited information on sector.

To address these issues, for this wave contact details were sourced from a commercial sample provider (specifically Market Location).[4] This approach was more efficient and offered the potential for more in-depth analysis of the achieved survey sample. Specifically, it meant that telephone contact details and firmographic data (e.g., employee numbers and sector) were available in the starting sample, and it allowed ineligible businesses to be excluded before contact was made. FHRS ratings were matched by address and company name; Market Location achieved an FHRS rating match rate of 18%.

3. Telephone fieldwork

Pilot fieldwork

Between 26 September and 2 October 2023, pilot interviews were conducted with 18 food businesses completing the full questionnaire. Appendix B outlines the number of interviews completed by nation, sector and number of employees.

Prior to the commencement of pilot fieldwork, all interviewers received a briefing on the survey and were issued with written instructions. This provided them with an understanding of the background to the research, the questionnaire design, the screening criteria and the sample design.

Each pilot interview involved running through the questionnaire from start to finish as it would be run during mainstage fieldwork. Throughout, interviewers took detailed notes of how businesses responded to questions, points of confusion and any additional feedback. At the end of the survey, respondents were also able to share feedback on specific questions selected for further testing and development.

Overall, telephone interviewers felt there should be a reduction in the amount of text to be read out, both in questions and answer options. Interviewers also noted that some respondents were serving customers during the interview, which caused delays. Interviewers picked up on possible language barriers with some respondents but noted these did not seem to cause an issue with survey responses.

Findings from the cognitive section of the questionnaire showed that respondents were able to comprehend the questions easily and felt they understood the meaning behind them, reflected in low rates of ‘Don’t know’ answers. However, high drop-out rates were noted, with 1,230 respondents dropping out before completing the pilot as a result of survey length.

In the pilot the survey took 38 minutes; after accounting for the cognitive testing questions that would later be removed, it was found to take 32 minutes in total. Survey length was anticipated to shorten during mainstage fieldwork as interviewers became more familiar with the questions.

The core recommendation from pilot reporting was to reduce the high rate of drop-outs by cutting survey length. Some minor recommendations were also made to reduce wordiness or complexity at individual questions. These included:

  • Four questions (S3a, C12, C14 and E3) were altered so that responses were not read out

  • Three questions were removed, covering the legal status of businesses, Natasha’s law and confidence in the FSA

  • In four cases (B1, B3, B3a and C1) question wording was changed to reduce the length of time interviewers spent reading aloud or offering further clarification to businesses

  • In two cases (B1 and B3a) answer options were removed or combined to reduce survey time

  • Routing was changed at C6 so only those who had answered ‘yes’ to providing information on allergens in general at C5 were routed to this question

  • In two cases (D1 and E2) ‘add if necessary’ interviewer instructions were included for certain answer options to cut time interviewers were using for elaboration

Mainstage fieldwork

Mainstage fieldwork was carried out between 16 October and 24 November 2023.

There was an expectation that the average survey length would reduce during mainstage fieldwork as a consequence of the post-pilot amendments and as interviewers became more familiar with the survey. However, after the first week of mainstage fieldwork, the survey was still taking considerably longer to complete (30.5 minutes) than the target duration. Further cuts to the questionnaire and changes to the routing were made to bring the survey length down. These included:

  • Scale questions were revised to no longer include an outline of the scale in the question text (as had been suggested post-pilot), following interviewer feedback that reading out scales in question text was too lengthy

  • The word ‘fairly’ was replaced with ‘quite’ throughout based on feedback from interviewers

  • An answer option was removed at C3, given there were already three statements on food label information

  • Routing at C14 was changed so that it was asked only of respondents who had heard of the FSA at C13

  • Question C17c was included so that businesses that had not done anything to help with healthier choices were filtered out of answering C18

The final version of the questionnaire used in mainstage fieldwork can be found in Appendix C.

Despite these additional changes, the survey length remained above the 23-minute target, at 29 minutes overall. The length of the survey affected response rates, which meant that reduced targets were agreed. These reduced targets preserved the minimums for Northern Ireland and Wales so that robust country-level analysis could still take place. Including pilot respondents, a total of 556 food businesses were interviewed. Tables 3.1, 3.2, 3.3 and 3.4 present the number of interviews completed by country, sector, size and FHRS rating.

Table 3.1. Profile of mainstage interviews with food businesses by country
Country
England 354
Northern Ireland 103
Wales 99
Table 3.2. Profile of mainstage interviews with food businesses by sector
Sector
Manufacturing 51
Wholesale 42
Retail 198
Accommodation 59
Food and beverage service activities 206
Table 3.3. Profile of mainstage interviews with food businesses by size
Size (Number of employees)
1-9 354
10-24 144
25-49 58
Table 3.4. Profile of mainstage interviews with food businesses by FHRS rating
FHRS rating
0-2 31
3 32
4 86
5 317
No rating / Awaiting inspection 90

As with the pilot, prior to commencement of mainstage fieldwork all interviewers received a briefing on the survey and were issued with written instructions. This ensured that interviewers understood the background to the research, the questionnaire design, the screening criteria, and the sample design.

Checks were conducted on the final 556 interviews to ensure the data was robust before the beginning of analysis. This involved conducting data validation checks and identifying outlier responses.

Response rate

A total of 7,774 records acquired from Market Location were called over the course of fieldwork. Of these, 4,883 did not answer the phone, 598 had appointments made that fell outside the fieldwork window, and 817 were unusable contacts (see Table 3.5).

Table 3.5. Survey outcomes for sample in scope of the study
Potential Respondents Total Percent
Total called 7,774 100%
Completed survey 556 7%
No answer / voicemail 4,883 63%
Unusable (e.g. number unobtainable, wrong number, company closed) 817 11%
Appointment made but not achieved during fieldwork period 598 8%
Refused 435 6%
Screened out 302 4%
Busy 128 2%
Nobody at site available 55 1%

The final achieved number of 556 interviews was 144 below the initial target of 700. The conditions under which the 2021 and 2023 surveys were conducted were broadly consistent, with the sample targets, time of year and length of time in field all similar; the key exception was survey length.

The primary factor behind the shortfall, compared with the previous wave, was the longer survey: the average interview took 25 minutes in 2021 compared with 29 minutes in 2023. Longer interviews mean interviewers spend more of each shift in interviews rather than dialling new numbers, which lowers the total number of call attempts and therefore completes. IFF spent 1,089 hours on telephone interviewing in 2023, an increase of 77 hours on 2021 (1,012 hours).

As can be seen in Table 3.6, despite the increase in time spent in field in 2023, the number of call attempts in 2023 was much lower. In 2021, 28,832 call attempts were made compared with just 21,724 in 2023, a difference of 7,108 (25% fewer).

Table 3.6. Key fieldwork differences between 2021 and 2023
Key fieldwork differences 2021 2023
Completes 700 556
Interviewer shift hours 1,012 1,089
Total call attempts 28,832 21,724
Survey length 25 minutes 29 minutes
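The productivity effect described above can be checked with a quick calculation from the Table 3.6 figures:

```python
# Call attempts per interviewer shift hour, using the figures in Table 3.6
attempts = {"2021": 28_832, "2023": 21_724}
shift_hours = {"2021": 1_012, "2023": 1_089}

rates = {year: attempts[year] / shift_hours[year] for year in attempts}
for year, rate in rates.items():
    print(f"{year}: {rate:.1f} call attempts per interviewer hour")
```

Throughput fell from roughly 28.5 to roughly 19.9 attempts per hour, a drop of about 30%, consistent with the longer 2023 interview leaving less shift time for dialling.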

4. Weighting

In line with standard social research practice, the data collected from the survey was weighted to make it representative of the underlying population. Weighting the data was necessary because of the decision to stratify interviews to ensure sufficient base sizes were achieved by country.

Weights were applied to the data to make it representative of the target population within each country. The weights were informed by the profile of the underlying population of businesses in terms of sector and FHRS rating. The weights also incorporated business size (number of employees). The profile of the underlying population is presented in Table 4.1 and was sourced from FSA’s FHRS database and the Inter-Departmental Business Register (IDBR).[5] [6]
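As a minimal sketch, a cell weight is the ratio of a stratum's population share to its share of achieved interviews. The example below uses only the country-level figures quoted earlier in this paper; the actual weighting also incorporated sector, FHRS rating and business size.

```python
# Country-level weighting sketch using population counts from section 2
# and achieved interviews from Table 3.1.
population = {"England": 230_090, "Northern Ireland": 7_405, "Wales": 13_765}
achieved = {"England": 354, "Northern Ireland": 103, "Wales": 99}

pop_total = sum(population.values())  # 251,260 eligible businesses
n = sum(achieved.values())            # 556 interviews
# weight = population share / achieved-sample share
weights = {c: (population[c] / pop_total) / (achieved[c] / n) for c in achieved}

# the weighted interview counts reproduce the population profile
for c, w in weights.items():
    print(f"{c}: weight {w:.3f}, weighted count {w * achieved[c]:.1f}")
```

Because England was under-sampled relative to its population share (to allow robust country analysis), English respondents receive weights above 1 while Northern Irish and Welsh respondents are weighted down.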

Table 4.1. Profile of the underlying business population by sector, FHRS rating and size, within country
Sector England Northern Ireland Wales
Manufacturing 4% 0.17% 0.19%
Wholesale 0.7% 0.28% 0.26%
Retail 24% 0.76% 1%
Accommodation 6% 0.13% 0.68%
Food and beverage service activities 51% 2% 3%
FHRS rating England Northern Ireland Wales
0-2 4% 0.03% 0.16%
3 7% 0.12% 0.44%
4 15% 0.44% 1%
5 52% 2% 3%
Awaiting inspection / No rating 14% 0.35% 0.82%
Number of employees England Northern Ireland Wales
1-9 76% 2% 5%
10-24 10% 0.40% 0.63%
25-49 5% 0.24% 0.29%

5. Analysis

When comparing results across years or between sub-groups it is essential to establish whether differences are significant, that is, whether we can be confident that a change in a particular percentage from one year to the next is a genuine difference and not due to chance. To do this, significance testing was carried out on findings using t-tests, comparing the proportions of respondents giving particular responses between two (or more) independent groups to establish whether their distributions genuinely differ.

All differences noted in the main report are statistically significant at the 95% confidence level or above. By convention, this is the statistical ‘cut-off point’ used to decide that a difference is large enough to be treated as genuine: there is at least 95% certainty that the difference in distribution is not due to chance. In some parts of the report, differences which are not statistically significant by this test (but which add to the overall ‘story’) have been included; in these instances it has been made clear that the difference was not statistically significant.
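As an illustration of the kind of comparison involved, a two-proportion z-test (closely related to the t-testing described; the exact procedure used by IFF is not specified in this paper) can be sketched on hypothetical figures:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent
    proportions, using a pooled standard error."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical example: 60% of 354 respondents in one group agree,
# versus 48% of 99 in another (illustrative figures only).
z = two_proportion_z(0.60, 354, 0.48, 99)
# |z| > 1.96 corresponds to significance at the 95% confidence level
print(round(z, 2), abs(z) > 1.96)
```

With smaller bases the same 12-point gap would fall below the 1.96 threshold, which is why findings from small subgroups are treated as indicative only.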

In some instances, the base sizes of certain questions or subgroups are small. In these instances findings should be treated as indicative only. Instances of low base sizes are highlighted in the report.

All coding, data processing, and analysis took place in-house, which allowed for careful collaboration to ensure high-quality outputs. The research team created dataset and table specifications, and these were signed off by the Research Director. Analysis cross-breaks for subgroup analysis and derived variables were developed by the project team and agreed with FSA.

Tables and SPSS files were thoroughly checked by two team members (using the specification documents) over eight rounds of revision, repeated until all discrepancies were caught and amended, before the outputs were signed off by the project manager. Our in-house coding team processed any verbatim responses and produced a codeframe for the research team to review and sign off. At least 10% of each new codeframe was checked by the research team mid-way through fieldwork, once a substantial number of completes had been achieved, and again at the end of fieldwork.

6. Ethics

In the design and conduct of all work, IFF take due consideration of the nature and sensitivities of participant audiences and the safety of both our staff and participants. This project was carried out in strict accordance with the Market Research Society Code of Conduct and applied the principles of the Government Social Research Code on research ethics throughout.

The IFF ethics approach is summarised below:

  • We place the highest value on achieving informed consent for this and every study. As with previous waves of the tracker, respondents were made aware of the following before we obtained consent:

    • the aims of the research,

    • that participation is voluntary,

    • what it will entail for the respondent (especially interview duration),

    • explained and assured anonymity/confidentiality, and

    • provided information about their rights under GDPR.

  • We also used a reassurance email (the wording of which was agreed with FSA) that could be sent to respondents confirming these details (Appendix D).

  • If they did not wish to participate in the research they could opt out via multiple routes, including email, telephone and postal methods.

  • Achieved consent can be audited through our comprehensive call records.

  • The research team took all reasonable steps to minimise the burden on participants, principally through using a well-designed questionnaire, ensuring questions are as easy as possible to answer, and asking questions only of those to whom they apply. We also make clear that if respondents are unable or unwilling to answer a particular question, they do not need to.

  • The requirements of data protection legislation, including the Telephone Preference Service, Mail Preference Service and Corporate Telephone Preference Service were adhered to at all times. We conduct opt out telephone or email exercises as necessary.

  • Respondents were offered telephone numbers that they can call for reassurance or further information. These included the IFF project manager or director, the Market Research Society freephone number, and often the telephone number of our client.

  • The confidentiality and anonymity of respondents was protected throughout the research. IFF’s compliance with ISO 27001 ensures that we safeguard participants’ personal data, and all outputs were carefully checked by two researchers to ensure confidentiality was maintained.

We strive for inclusive participation in all the research we conduct. At recruitment, the recruiter carefully checked and recorded any specific requirements the respondent had to enable them to take part. These might include:

  • How the IFF interviewer communicates with the respondent (including the need for interpreters etc).

  • How ideas are presented to the respondent.

  • Other individuals (e.g. friends, family members, carers) that the respondent wishes to have present in order for them to feel at ease when taking part.

We can also offer interviewing in other languages to ensure all can take part.

Our internal governance processes include the submission of a summary of the research purpose and method to the IFF ethics advisor (Jan Shury, Managing Director). Had any concerns been highlighted, the study team would have modified the method and/or developed mitigation steps to address them.


  1. Due to the COVID-19 pandemic, the survey was not carried out in 2020. Since then, the FBO tracker has become biennial.

  2. The sectors chosen as in scope are consistent with those sampled in waves 1 and 2, with the exception of Primary Food Producers (who were excluded in wave 3) and sectors specialising in the production of animal feed. Appendix A presents the SIC codes of in-scope sectors and how they were grouped during fieldwork and analysis.

  3. The profile of FHRS ratings in England, Northern Ireland and Wales has remained broadly consistent since 2021.

  4. The source of Market Location’s database includes Companies House feeds, new business telephone numbers registered with the BT directories, as well as call outcomes from Market Location’s call centre (which makes Market Location a ‘data owner’). Every record in Market Location’s database is called at least once every 12 months to check that the contact details and firmographic information held are up-to-date. Market Location can be described as a ‘data originator’ in that they own the data and supply this to other resellers, including Experian, Ofcom, Google etc. We use Market Location sample for the vast majority of our public sector work.

  5. The IDBR is a comprehensive list of UK businesses, which was used to weight the survey population to the underlying population in terms of business size.

  6. The total number of businesses in the underlying population was 331,954.