Introduction

The Consumer Insights Tracker is the Food Standards Agency’s (FSA) tracking survey that monitors changes in consumers’ behaviour and attitudes in relation to food. Fieldwork is conducted on a monthly basis.

This technical report presents methodological details on the Consumer Insights Tracker, including how data is collected and analysed. It also provides information on the strengths and limitations of the methodology, to support interpretation of the data.

Overview of methodology

The Consumer Insights Tracker is conducted using a bespoke online survey approach, administered to members of the YouGov Plc UK panel. Fieldwork takes place monthly, typically during the first week of the month, and lasts around 5 days in total. Approximately 2,000 adults (aged 16 years and older) across England, Wales and Northern Ireland participate in the survey each month. This includes a pilot of around 100 responses for each wave. Respondents are excluded from taking the survey again for 6 months.

The questionnaire takes approximately 10 minutes to complete. The questions remain broadly consistent for each wave, although questions can be added or removed as needed. Respondents are provided an incentive, in the form of points credited to their YouGov account, upon completion of the survey.

Findings are summarised by YouGov into a PowerPoint report and shared with the FSA. The FSA publishes findings on the Consumer Insights Tracker webpage and makes data tables available on the FSA’s data catalogue.

History of the Consumer Insights Tracker

In April 2020 the FSA established a new monthly survey (the COVID-19 Tracker) to monitor consumers’ attitudes and behaviour during the COVID-19 pandemic. This survey was designed to provide more regular and timely insights than the FSA’s biannual official statistic survey, Food and You 2. In April 2022 the survey was renamed the Consumer Insights Tracker. This version of the Consumer Insights Tracker ran from April 2022 to June 2023. Further details are available via a separate webpage.

YouGov were appointed as the supplier of the Consumer Insights Tracker from July 2023. At this time, the FSA also took the opportunity to review the survey design and methodology for the Consumer Insights Tracker and the survey was relaunched in August 2023, following an independent review of the survey’s methods and content.

Due to the change in supplier and the changes made to the survey design by the FSA, data collected from the two suppliers before and after July 2023 should not be directly compared. A summary of the key changes is provided in Table 1.

Table 1. Changes to survey methodology
April 2020 – June 2023 | July 2023 onwards
Utilised an omnibus panel approach | Utilises a bespoke survey approach, using an online panel
Sample of those aged 16-75 in England, Wales and Northern Ireland | Sample of those aged 16+ in England, Wales and Northern Ireland
Sample representative by age, gender, region, social grade and working status | Sample representative by age, gender, region, social grade and education level
Sample of 60 included from Northern Ireland each month | Northern Ireland sample boosted to 100 participants each month
2-month exclusion criteria | 6-month exclusion criteria

Questionnaire design

Each wave of the survey is designed through collaboration between the FSA project team and YouGov. The survey asks respondents about their attitudes and behaviours in relation to a range of food-related topics that are of strategic interest to the FSA. This includes consumers’ concerns about food-related issues, perceptions of the food supply chain, and attitudes and behaviours relating to food affordability and availability. The survey is designed to be flexible so that it can respond to topical or emerging issues: although a core set of questions is asked each month, new questions can be added as new evidence needs arise. The survey is cross-sectional and offers a snapshot in time, with the findings for each wave reflecting the context at the time. It is therefore important to reflect on the wider context when interpreting the findings for each wave.

Demographic information is already held by YouGov, so does not need to be part of the questionnaire. An example of the survey questionnaire is available in Annex 1, and additional survey scripts are available on request from the FSA Social Science team.

Survey questions are written using plain English and neutral phrasing to reduce bias and encourage participation. The survey is structured so that it flows logically, with similar topics grouped together, to reduce respondent fatigue. Questions that are sensitive in nature, that may induce an order effect[1], or that relate to respondent demographics are placed at the end of the survey where possible. Questions with multiple response options have their response options randomised, so that the order in which these are presented does not affect how respondents answer; rows within question grids[2] are randomised in the same way. Additional questions are included in the survey to check for ‘straight lining’[3], along with attention check questions to ensure that respondents are truly engaged in the survey rather than providing random responses.
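As an illustration, per-respondent option randomisation of the kind described above can be sketched as follows. This is a minimal sketch: the function name and the convention of pinning a ‘Don’t know’ option to the end are assumptions for the example, not details taken from the survey script.

```python
import random

def present_options(options, fixed_last=("Don't know",)):
    """Randomise response-option order for each respondent, keeping
    any fixed options (e.g. 'Don't know') at the end of the list."""
    shuffled = [o for o in options if o not in fixed_last]
    random.shuffle(shuffled)  # a fresh random order per respondent
    return shuffled + [o for o in options if o in fixed_last]

order = present_options(["Agree", "Neither", "Disagree", "Don't know"])
```

Keeping non-substantive options fixed while shuffling the substantive ones is a common compromise: it removes order effects among the real answers without burying the opt-out option mid-list.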

Each month, the questionnaire is reviewed by the FSA project team. Any new questions or amendments to existing questions are developed by the FSA project team in collaboration with stakeholders at the FSA. Where possible, existing and cognitively tested questions (for example, from Food and You 2) are used to help design new survey questions. Once agreed they are sent to YouGov where they are reviewed by at least two experienced researchers, including at least one senior member responsible for sign-off. See the ‘Quality Assurance’ section for further information about quality assurance in place for the survey design.

Pilot study

The online survey is piloted with approximately 100 respondents each month, before main fieldwork commences. This is to assess respondents’ understanding of the questions, the survey logic and programming, and the overall length of the survey. Once the pilot is completed, any amendments to the survey are made before main stage fieldwork.

Sampling

Sampling approach

All respondents who take part in the survey are drawn from the YouGov panel of over 400,000 active panel members who live in the UK. YouGov maintains an engaged panel of respondents who have specifically opted-in to participate in online research activities. This includes actively recruiting hard-to-reach respondents, such as younger people and those from ethnic minorities, via a network of partners with access to a wide range of online sources that cater to these groups. These partners have specific experience in recruiting these audiences for online activities. The sources include search engine optimisation (SEO), affiliate networks, niche websites, and growth hacking techniques such as panellist refer-a-friend campaigns and social networks. All recruitment sources, for hard-to-reach audiences or anyone else, are monitored to ensure that respondents are always profiled, responsive and engaged in the survey experience.

The YouGov panel is large enough to select nationally representative samples that reflect the actual breakdown of the population on the key demographics of age, gender, region, social grade and education level.

At the start of the project a nationally representative sample of people in England, Wales and Northern Ireland was constructed. To qualify for the survey respondents needed to meet two criteria:

  • Be aged 16 years or over

  • Live in England, Wales or Northern Ireland[4]

Each month, eligible panellists (see further details below on sampling) are sent an email inviting them to take part. Only panellists who are invited can participate in the survey, and individuals can only respond once each time they are invited. Participants are given a brief introduction to the survey before responding. They are not made aware that the survey is being conducted on behalf of the FSA.

The sampling approach used for the Consumer Insights Tracker is quota sampling, with quotas taken from the 2021 Census. Quota sampling is a non-probability sampling method that involves dividing the population into sub-groups. The relative proportion of each sub-group as a percentage of the total sample is based on known characteristics of the overall population. Responses are collected until the ‘quota’ for each sub-group is reached. This is a standard sampling approach used in online panel surveys to capture a representative population. The sample for the Consumer Insights Tracker is structured to be representative of the England, Wales and Northern Ireland population by the following variables:

  • Age

  • Gender

  • Social grade

  • Region

  • Education level

A sample boost is also applied in Northern Ireland (to achieve 100 respondents), to improve the representation of this group and to enable rudimentary demographic comparisons between countries. An exclusion criterion is also applied to the sample, which prevents any respondents from participating in the survey if they have done so in any of the six previous waves.
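The quota-filling logic described above can be sketched as follows. This is an illustrative sketch with invented groups and targets, not the actual FSA/YouGov quota scheme (which sets interlocking quotas across age, gender, social grade, region and education level).

```python
def fill_quotas(respondents, targets):
    """Accept respondents until each subgroup's quota is full.

    respondents: iterable of dicts with a 'group' key (illustrative).
    targets: dict mapping group -> required number of completes.
    """
    counts = {g: 0 for g in targets}
    accepted = []
    for r in respondents:
        g = r["group"]
        if g in counts and counts[g] < targets[g]:
            counts[g] += 1
            accepted.append(r)
        if all(counts[g] >= targets[g] for g in targets):
            break  # all quotas met; fieldwork for this wave stops
    return accepted

# Toy example: quota of 2 for group "A" and 1 for group "B"
sample = fill_quotas(
    [{"group": "A"}, {"group": "A"}, {"group": "A"}, {"group": "B"}],
    {"A": 2, "B": 1},
)
```

Respondents arriving after their subgroup’s quota is full are screened out, which is why the achieved sample matches the target proportions even though the panel itself does not.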

Sample size and weighting

The overall sample size for each wave is approximately 2,000. This sample size is sufficient to ensure that robust analysis can be conducted across various demographic sub-groups. It also ensures that margins of error within survey results are minimised, enabling statistically significant variations to be identified across and within waves.

Weights are applied to the data to make it more representative of the population of England, Wales and Northern Ireland. Weighting involves adjusting the contribution of individual respondents to ensure it is given the correct relative influence based on the demographics of the target population. Data from the 2021 UK Census from the Office for National Statistics is used to do this, with weighting applied according to gender, age, education level, region, and social grade. Weighting on these variables is common practice across similar surveys of this nature, as it ensures that the data is representative of the target population, while also maximising weighting efficiency.

Random Iterative Method (RIM) weighting is used. RIM weighting is appropriate when a number of different standard weights all need to be applied together. This method calculates weights for each individual respondent from the targets and the achieved sample sizes for all the quota variables. RIM weighting is an iterative process: the weights are recalculated several times until the required degree of accuracy is reached. All weights are capped at six, and a weighting report is produced for each wave. The advantage of a RIM weighting approach is that the weighting can include a greater number of variables, and it is not necessary to have targets for all the interlocking cells.
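RIM weighting is essentially the same procedure as ‘raking’ (iterative proportional fitting). A minimal sketch follows, assuming equal starting weights and a simplified convergence check; the function name, data layout and cap handling are illustrative, not YouGov’s implementation.

```python
def rim_weight(rows, targets, max_iter=100, tol=1e-6, cap=6.0):
    """Iteratively adjust respondent weights so that the weighted
    marginal totals match the population targets for each variable."""
    weights = [1.0] * len(rows)
    for _ in range(max_iter):
        max_adj = 0.0
        for var, target in targets.items():
            # current weighted total for each category of this variable
            totals = {}
            for w, row in zip(weights, rows):
                totals[row[var]] = totals.get(row[var], 0.0) + w
            # scale every respondent's weight toward the category target
            for i, row in enumerate(rows):
                factor = target[row[var]] / totals[row[var]]
                weights[i] = min(weights[i] * factor, cap)  # cap (six here)
                max_adj = max(max_adj, abs(factor - 1.0))
        if max_adj < tol:  # stop once no category needs further adjustment
            break
    return weights

# Toy example: 3 men and 1 woman, but equal population targets of 2 each.
w = rim_weight(
    [{"sex": "M"}, {"sex": "M"}, {"sex": "M"}, {"sex": "F"}],
    {"sex": {"M": 2.0, "F": 2.0}},
)
```

In the toy example the over-represented men are weighted down and the under-represented woman is weighted up, so each gender contributes its target share to weighted totals. With several weighting variables, each pass over one variable disturbs the others slightly, which is why the procedure must iterate.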

Weighting is applied at the end of the data processing phase on cleaned data (see the Quality Assurance section for further details on data cleaning).

Table 2 provides an example of how data is weighted from the March 2024 wave of the survey. The unweighted base shows the number of completed surveys and the weighted base shows the adjustments that have been made to correct for any sample bias.

Table 2. Demographic variables and their weightings
Variable | Unweighted n | Weighted n | Weighted %
Age
16 to 24 | 220 | 260 | 13%
25 to 34 | 312 | 326 | 16%
35 to 44 | 329 | 344 | 17%
45 to 54 | 307 | 313 | 16%
55 to 74 | 684 | 636 | 32%
75+ | 163 | 136 | 7%
Gender
Male | 993 | 987 | 49%
Female | 1022 | 1028 | 51%
Region
North East | 86 | 93 | 5%
North West | 232 | 241 | 12%
Yorkshire and the Humber | 177 | 185 | 9%
East Midlands | 168 | 167 | 8%
West Midlands | 183 | 185 | 9%
East of England | 205 | 204 | 10%
London | 266 | 278 | 14%
South East | 306 | 315 | 16%
South West | 188 | 185 | 9%
Wales | 100 | 101 | 5%
Northern Ireland | 104 | 60 | 3%
Social grade
AB | 564 | 559 | 28%
C1 | 588 | 594 | 29%
C2 | 403 | 419 | 21%
DE | 460 | 443 | 22%
Ethnicity
White background | 1795 | 1781 | 88%
Ethnic minority background | 185 | 197 | 10%

Data analysis

The data analysis process focuses on univariate and bivariate quantitative analysis. Analysis for each question is conducted at the overall sample level, as well as across key demographic groups, using the weighted data. Changes across waves, and differences between demographic groups, are only reported when they are statistically significant (discussed further below).

Index of Multiple Deprivation (IMD) is one of the demographic factors calculated and reported for England, Wales and Northern Ireland. IMD is calculated using country-specific metrics, which are combined to give an overall measure of relative deprivation within a respondent’s local area.

  • England: The IMD ranks each English Lower layer Super Output Area (LSOA) from 1 (most deprived) to 32,844 (least deprived).

  • Wales: The Welsh equivalent (WIMD) ranks each Welsh LSOA from 1 (most deprived) to 1,909 (least deprived).

  • Northern Ireland: The NI equivalent ranks each Output Area (OA) from 1 (most deprived) to 5,022 (least deprived).
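Because each country’s ranking covers a different number of areas, ranks are usually converted to within-country quantiles before deprivation groups are compared. A sketch of a quintile conversion follows; the quintile grouping itself is an assumption for illustration, not necessarily the grouping used in reporting.

```python
def imd_quintile(rank, n_areas):
    """Convert a deprivation rank (1 = most deprived) into a quintile,
    where 1 = most deprived 20% and 5 = least deprived 20%."""
    return min(5, (rank - 1) * 5 // n_areas + 1)

# England has 32,844 ranked LSOAs; Northern Ireland 5,022 ranked areas
q_most = imd_quintile(1, 32844)     # most deprived English LSOA -> quintile 1
q_least = imd_quintile(5022, 5022)  # least deprived NI area -> quintile 5
```

Working in within-country quintiles means an English LSOA and a Northern Ireland OA in the same quintile occupy the same relative position in their own country’s distribution, even though the raw ranks are on different scales.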

Statistical reliability and confidence intervals

As only a sample of the population, rather than the entire population, is surveyed each month, the results are subject to sampling tolerances, which vary with the size of the sample and the percentage figure concerned. For example, for a question where 50% of participants in a (weighted) sample of 2,000 respond with a particular answer, the chances are 95 in 100 that this result would not vary more than 2.2 percentage points, plus or minus, from the result that would have been obtained from a census of the entire population (using the same procedures).

Table 3 provides a useful rule of thumb when judging the statistical significance of the figures contained in the final dataset.

Table 3. Confidence intervals (at the 95% level) for a sample of n=2,000 for various results
Weighted base | 10% or 90% (+/-) | 30% or 70% (+/-) | 50% (+/-)
Sample of 2,000 | 1.3 | 2.0 | 2.2
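The tolerances in Table 3 follow from the standard margin-of-error formula for a proportion, 1.96 × √(p(1−p)/n). A quick sketch reproducing the table is below; note this assumes simple random sampling, which a weighted quota sample only approximates, so the figures are a rule of thumb rather than exact.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error, in percentage points, for a proportion p
    estimated from a simple random sample of size n."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

# Reproduce Table 3 for a sample of 2,000
table = {p: round(margin_of_error(p, 2000), 1) for p in (0.10, 0.30, 0.50)}
# table -> {0.1: 1.3, 0.3: 2.0, 0.5: 2.2}
```

The margin is largest at 50% because p(1−p) peaks there, which is why Table 3 quotes its widest interval for results near 50%.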

Reporting

The FSA project team works closely with YouGov each wave to identify results of strategic importance for the FSA. Toplines typically focus on new or emerging issues, significant changes in findings over time, an overview of each topic from the Consumer Insights Tracker (e.g. food availability, food insecurity, consumer concerns) and key demographic differences of interest. Findings also consider key demographic groups of interest to the FSA, particularly when they indicate a significant difference from the total survey population. Changes over time, or between demographic groups, are only reported when they are statistically significant using t-tests[5], at p<0.05. This helps to ensure that the report focuses only on ‘true’ shifts in behaviour and attitudes (i.e. there is a probability of 5% or less that they are the result of chance).
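For differences between two proportions (for example, the same question asked in two waves), a large-sample test of the kind described above can be sketched as follows. This uses a pooled two-proportion z-test, which is equivalent to the t-test at these sample sizes; the 4-point example figures are invented for illustration.

```python
import math

def proportions_differ(p1, n1, p2, n2, z_crit=1.96):
    """Two-sided large-sample test for a difference between two
    independent proportions; True means significant at p < 0.05."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)  # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return abs(p1 - p2) / se > z_crit

# A 4-point shift between two waves of ~2,000 each is significant...
shift_4pt = proportions_differ(0.50, 2000, 0.46, 2000)  # True
# ...but a 1-point shift is within sampling noise
shift_1pt = proportions_differ(0.50, 2000, 0.49, 2000)  # False
```

This is why small wave-on-wave movements of a point or two are generally not reported as changes: with samples of this size they cannot be distinguished from sampling variation.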

The data for this project and other research that the FSA undertakes can be found on the FSA’s website, and in particular via the FSA data catalogue.

Quality assurance

Participant verification and engagement

Several checks are applied to ensure the quality of survey data. Checks for participant identification and authentication are conducted during panel registration, and through an initial welcome survey. Some of these checks include:

  • IP checks, including IP blacklists – checks on respondents’ IP addresses to ensure they have not been blacklisted from completing surveys.

  • Digital fingerprinting – information collected about the software and hardware of a device for the purpose of identification.

  • Double keying on panel registration and login – used to check for respondent errors in data entry.

  • Email address verification, and testing for uniqueness – prevents respondents from signing up to the YouGov panel using multiple accounts.

  • Contact detail de-duplication – prevents respondents from signing up to the panel using the same contact details multiple times.

  • Country validation – checks to ensure that respondents on the UK panel live in the UK.

  • Location verification (e.g. postcode) – cross-checks respondents against other location metrics to ensure that respondents are not signing up to the panel using erroneous postcodes.

  • Machine Learning based cookie checks, and checks against Cookie blacklists – uses browser-based cookies (e.g. to prevent respondents from completing the same survey multiple times).

  • Email confirmation and activation code – checks to ensure that respondents are using a valid email address.

Data cleaning and quality checks

Once the survey has closed, respondents who ‘straight line’, who fail the attention check questions, or whose completion time falls below a set threshold[6] are removed from the data. Additionally, inconsistent, incoherent or incomplete responses are removed from the dataset in a process of data cleaning.
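The cleaning rules described above can be sketched as follows. This is illustrative only: the record fields, the straight-lining rule, and the speed threshold of a third of the median are assumptions about how such checks are typically implemented, not YouGov’s exact rules.

```python
from statistics import median

def clean_responses(responses):
    """Drop speeders and straight-liners from a list of responses.

    Each response is a dict with 'duration' (seconds taken) and
    'grid' (the answers given to one question grid).
    """
    med = median(r["duration"] for r in responses)
    cleaned = []
    for r in responses:
        if r["duration"] < med / 3:     # speeder: well under the median time
            continue
        if len(set(r["grid"])) == 1:    # straight-liner: identical grid answers
            continue
        cleaned.append(r)
    return cleaned
```

Basing the speed threshold on the median of the achieved sample, rather than a fixed number of seconds, lets the rule adapt to each wave’s questionnaire length.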

The cleaned data is used to produce data tables. The YouGov team carry out a thorough review of the data tables to check for accuracy. This includes:

  • Checking that all questions from the survey have been included.

  • Checking that the total sample size matches the target sample.

  • Checking that base sizes and text for each question are correct.

  • Checking that percentages for each question sum to 100% (if single code), and that ‘net’ figures have been calculated correctly.

  • Checking that crossbreaks have been calculated correctly.

  • Checking that the data has been weighted correctly.

  • Comparing data with previous waves and investigating the reasons behind any large shifts in results to ensure they are not the result of data errors.

When outputs are provided to the FSA, the project team also conduct quality assurance checks before data is published, including spot checks against multiple data sources (for example, previous publications and data tables) and significance testing to ensure the same results can be found using alternative methods. Any discrepancies are highlighted and discussed with YouGov. Prior to publication, reports are reviewed and signed off by at least two researchers at the FSA.

Ethics

YouGov take a number of steps to ensure the highest ethical standards are upheld as part of this survey (these are in line with the GSR ethical assurances for social research). This covers four main items:

  1. Informed consent and right to withdraw: Before beginning the survey, respondents are presented with an introduction page explaining the survey topic. This ensures that all participants understand the nature of the survey at the outset and enables anyone who does not wish to continue to voluntarily exit at this point. All panellists have double opted into responding to online surveys. This means that they have agreed to take surveys with YouGov and have then agreed to participate in the Consumer Insights Tracker, when invited. Respondents are able to withdraw from the panel generally at any time and also have the option to withdraw from the survey while it is in progress.

  2. Accessibility: YouGov takes steps to aid participation through its systems and capabilities by servicing all devices and channels in an accessible way. This includes optimising surveys to be conducted on a range of devices, configuring the way text and image-based questions are displayed to respondents to avoid any accessibility-related issues (e.g. through the use of screen reader support) and using plain, simple language.

  3. Avoiding personal harm: YouGov has a duty of care to panellists and steps are taken to avoid causing personal harm. Examples are signposting sensitive topics at the beginning of the survey (if relevant, depending on topics), providing links to trusted support organisations respondents can contact if required, and providing the option to skip or ‘prefer not to say’ to questions relating to sensitive information and/or special category data under the GDPR.

  4. Safeguarding personal data: All personal data collected on respondents is anonymised. The data YouGov shares with the FSA is completely anonymous, non-identifiable and only reported at an aggregate level. GDPR principles and other applicable local privacy and security obligations are adhered to by both YouGov and the FSA when conducting this research.

Strengths and limitations

Strengths of approach

The FSA’s project team carefully weigh up the strengths and limitations of all methodologies before deciding the most appropriate methodology for an individual piece of research.

The approach for the Consumer Insights Tracker offers several strengths including:

  • Reliable, representative tracking: A large, nationwide online panel enables data to be gathered that is representative of the population aged 16 years and over in England, Wales and Northern Ireland, allows changes to be tracked over time, and supports the collection of a wide range of demographic information. A boosted sample of 100 is collected in Northern Ireland, which allows for comparisons between England, Wales and Northern Ireland.

  • Speed of delivery: An online panel approach means that the survey can be administered to participants quickly, particularly in comparison to other modes such as telephone or face-to-face methods. As participants are pre-screened for eligibility for the panel, surveys can be administered more quickly than with methods that require recruitment and screening processes. In addition, online surveys can be completed at any time of day (according to respondent preference).

  • Cost effectiveness: The combination of speedier delivery and a less resource-intensive approach (eliminating the need for human interviewers) contributes to online surveys being significantly more cost effective than other survey methodologies. In addition, because YouGov already holds many of the key demographic indicators on panellists, questionnaire space (and therefore survey cost) is saved by not needing to ask these questions.

  • Flexibility in survey design: Online surveys provide greater flexibility in terms of the types of questions which can be included. Questions involving images, videos or audio clips can be included, which is not always possible when using a telephone methodology (for example).

  • Reduction in certain forms of bias: As online surveys do not require an interviewer to be present, they can help to reduce the effects of social desirability bias among respondents. Sampling bias can also be reduced online, compared to face-to-face approaches, where the interviewer may be more restricted by location when conducting the survey, although other forms of sampling bias may be present in online surveys (see limitations below). Additionally, using a bespoke survey approach means respondents are only asked the questions in the Consumer Insights Tracker, reducing certain types of bias associated with approaches where respondents are asked to complete multiple surveys.

  • Exclusion period: Participants are excluded from taking the survey again for 6 months. Having a long exclusion period means they are less likely to become familiar with survey questions, or to experience survey fatigue. It also allows up to 6 months of data to be combined, to provide a large base size for further analysis opportunities without the risk of duplicated participants.

Limitations of approach

Some of the limitations of the approach include:

  • Opt-in sampling bias: Panellists on opt-in panels can be more engaged in current issues, which means they are more motivated to opt into panel surveys. This means precise measures of some attitudes and behaviours in the total population can be difficult to obtain. It is therefore beneficial to regularly validate and compare the results of online opt-in surveys against more robust methodologies such as random probability surveys. The FSA often uses multiple methodologies to understand the complexities of a single topic.

  • Representation: Some groups are underrepresented in online opt-in panels. This is particularly true of those who do not have internet access (who are excluded) or those who are not comfortable going online, who also tend to be older and in lower socio-economic groups. Additionally, young working-class men are difficult to reach across many methodologies, including opt-in online panels. These issues can be remedied in part through quotas and weighting, if the groups being underrepresented are identified. Other methodologies, such as postal or face-to-face surveys, can be more suitable when capturing the views of those who are digitally excluded, but these methodologies typically take longer to deliver and are more expensive.

  • Topics and complexity: Online surveys, and in general surveys that are self-administered, are useful for an overall read on issues or for exploring topics that are well understood by respondents. However, it becomes more difficult to explore in depth, complex issues where respondents do not have a good understanding of the topic. Such topics can benefit from an interviewer who can provide explanations or are better explored through qualitative research.

  • Detailed demographic analysis: With a sample of approximately 2,000 per month, it is not always possible to review detailed demographic cross-breaks in the Consumer Insights Tracker, such as region or granular details of income. The monthly sample only allows for analysis by broad demographic characteristics such as gender, age group, income group, or the presence of children in the household. However, combining six months of data creates a sample size of approximately 12,000. This greater sample size increases the number of individuals in each demographic cross-break, allowing comparisons for groups that are too small to analyse in the monthly data.


  1. An order effect occurs when a question earlier in a survey influences respondents’ answers to later questions in the survey.

  2. Question grids are tabular questions that ask respondents to evaluate one or more row items using the same set of column choices.

  3. Straight lining is when respondents answer a survey in a ‘straight line’, for example selecting the first option for each question.

  4. The sample does not include respondents from Scotland. The FSA is the independent government department working to protect public health and consumers’ wider interests in relation to food in England, Wales and Northern Ireland. Food Standards Scotland is the independent public body with responsibility for food policy and implementation in Scotland. Food Standards Scotland conduct their own consumer monitoring.

  5. More information on t-tests is available here: https://www.britannica.com/science/Students-t-test

  6. Respondents with a completion time of less than a third of the median are removed from the final survey sample.