
ESOMAR 37 Q&A

To help buyers of online samples


Section 1 - Company profile

Q1. What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?

Main Frame was founded in 2022 and has been involved from the start in building and recruiting online panels in Europe. The first panels were recruited in Poland, Hungary, Greece, Turkey and Bulgaria.

 

We have quickly become one of the top data collection suppliers in the region. This can be attributed to our team and to our in-house developed panel management system, project management system, dynamic respondent profiling, and a range of other tools and services that support our clients.

Respondents from the panel are not engaged in direct marketing.

Q2. Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge in this area? What sort of training in sampling techniques do you provide for your frontline staff?

Our technical colleagues continuously work to improve our sampling system. Staff receive training on how to use all of its functions in order to deliver the best experience for respondents and clients.

 

At Main Frame, we take great pride in our ability to set up, test and launch projects in a matter of minutes.

Q3. What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services?

In addition to sample-only services, we also provide end-to-end services to clients, including coding, translations, survey programming, survey hosting and data analysis.

Section 2 - Sample sources and Recruitment

Q4. From what sources of online sample do you derive participants?

Main Frame relies on its proprietary panels as the primary source for conducting online surveys. These panels are the foundation of our data collection capabilities and enable us to deliver high-quality results quickly.


There are instances where our own panels are not sufficient to meet a specific requirement, for example if a project runs in a country our audience does not cover, or if the demographic/spec group is difficult to reach with our own resources. In such cases we notify clients that an external source is needed.

Q5. Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer?

Main Frame's main source is its actively managed panels. That is our focus and what we prioritize in surveys. More than 90% of the sample we offer is derived from our exclusive panels.

Q6. What recruitment channels are you using for each of the sources you have described? Is the recruitment process ‘open to all’ or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography?

We recruit panel members through a network of affiliates and ads on social networks.
The panels are open to all, but double opt-in email verification is required in every case.

Q7. What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are?

When new members join, the system checks whether the participant's email address is already registered; it is not possible to register more than once with the same email. We also use a double opt-in process: respondents must verify their registration by email.
In addition, a digital fingerprinting solution identifies low-quality and fraudulent data.

Q8. What brand (domain) and/or app are you using with proprietary sources?

Gavenze.com, our online panel, operates through two channels: respondents accessing surveys through an email invitation, and respondents accessing surveys while logged in to the website. More than 80% of participants join surveys through email invitations.

Q9. Which model(s) do you offer to deliver sample? Managed service, self-serve, or API integration?

Projects are currently delivered via managed service (through our highly experienced operations team) or through an API integration.

Q10. If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend). Do you let buyers control which sources of sample to include in their projects, and if so how? Do you have any integration mechanisms with third-party sources offered?

At Main Frame, data quality is at the core of what we do. We maintain a limited list of partners; only trusted, tested vendors are used.
As is common in market research, we sometimes (due to project specs) need to engage additional sample sources. We offer full transparency to our clients: which vendor was approached, what volume of completes was delivered, and so on.
We can also partner with suppliers through an API.

Q11. Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, Is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop-only questionnaires? Is it suitable to recruit for communities? For online focus groups?

Our respondents are made aware of the length of the survey, the devices on which they can take it, and the incentive they will receive for a full completion. We can recontact/recall respondents, and we can also recruit for communities and qualitative studies.

Section 3 - Sampling and Project Management

Q12. Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend?

After we receive a commission from a client, we set up the specs and demographics for the project on our side. We carefully configure the requested demographic quota balance; quotas help maintain balanced representation. We can set quotas at the invitation level and also track quotas at the completion level. We recommend maintaining quota controls at the final completion level.
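Completion-level quota control of the kind described above can be illustrated with a small sketch. The class name, cell labels and targets below are hypothetical examples, not Main Frame's system:

```python
from collections import Counter

class QuotaController:
    """Accepts completes only while a demographic cell is below its target."""

    def __init__(self, targets: dict[str, int]):
        self.targets = targets        # e.g. {"male 18-24": 50, "female 18-24": 50}
        self.completes = Counter()    # completes recorded so far, per cell

    def try_complete(self, cell: str) -> bool:
        """Record a complete if the cell is still open; otherwise quota the respondent out."""
        if self.completes[cell] >= self.targets.get(cell, 0):
            return False  # cell full (or unknown) -> respondent is quota-ed out
        self.completes[cell] += 1
        return True
```

Counting at completion rather than at invitation means over-delivery in a cell is impossible, at the cost of quota-outs late in the survey, which is why invitation-level quotas are often used alongside it.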

Q13. What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party?

As a minimum, respondents must provide their name, country, gender, birth date and region, so these fields are available for 100% of respondents. The same applies to the trusted partner sources we engage when an additional sample source is needed.
We remind respondents every 6 months to review their profile data and update it where needed.
We can append these data points (gender, age and region) when required by our clients.

Q14. What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates?

As a minimum we need to know the following: country, demographic requirements (with quotas if possible), the expected incidence/penetration rate, the expected survey length, and which devices (desktop, laptop, tablet, mobile or other) are allowed for the survey.
Our feasibility estimates usually assume about a week in field, so with longer timings we can collect more than estimated.
If the incidence rate is unknown at quote/RFQ stage, we can run a free internal incidence check to test how our sample performs against the target audience.
If the survey length is also unknown, we can launch the project on a short LOI estimate and re-adjust after the soft launch if needed.
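The inputs listed above feed a straightforward feasibility calculation. The sketch below is a simplified illustration with assumed rate names and example values, not Main Frame's actual feasibility model:

```python
import math

def required_invitations(target_completes: int,
                         response_rate: float,
                         incidence_rate: float,
                         completion_rate: float = 0.9) -> int:
    """Estimate invitations needed: each invite must be answered,
    qualify on incidence, and then be finished without drop-out."""
    yield_per_invite = response_rate * incidence_rate * completion_rate
    return math.ceil(target_completes / yield_per_invite)

# Example: 500 completes at 20% response, 50% incidence, 90% completion
# -> 500 / (0.2 * 0.5 * 0.9) = 5556 invitations (rounded up)
```

Comparing the required invitations against the number of active, in-target panelists gives an upper bound on feasibility for the fielding window.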

Q15. What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third party sources/sub-contractors?

If the project cannot be completed in field due to lower response rates on our side (versus what we estimated), we cover the additional expense of adding a third-party sample source.
If the project cannot be completed in field due to worse-than-estimated client metrics (lower than estimated IR, higher than expected drop-out, etc.) or a change of specs not communicated at bid stage, we explore options for completing the project through our panel partner network. The partners we choose to work with undergo a detailed quality assessment on our side.

Q16. Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study?

We do not employ a survey router.

Q17. Do you set limits on the amount of time a participant can be in the router before they qualify for a survey?

Not applicable, as we do not employ a survey router.

Q18. What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer?

The only information respondents see before entering a survey is: the survey ID (a unique identifier for internal panel purposes), the length of the survey in minutes, the allowed devices, and the incentive they will receive for completion.

Q19. Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice?

Respondents can see a list of available surveys on the panel portal. The information shown for each survey is its ID, its length, and the incentive they will receive for completion.

Q20. What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset?

We have the ability to change incentives during a survey. Once a project is finalized, we can append the incentive information to the dataset if required.

Q21. Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)?

On every project we ask respondents to rate the survey at the end (1 to 5 stars) and to leave written feedback if they wish. At an overall level, we can review satisfaction by length, project type and client.

Q22. Do you provide a debrief report about a project after it has completed? If yes, can you provide an example?

When required, we provide clients with the number of invited respondents, the response rate, the average interview length, the incidence/penetration rate and other project-related metrics.

Section 4 - Data quality and Validation

Q23. How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this?

Participation is strictly limited to one occurrence per individual per survey. While the same panelist may belong to two different panels used on the same project, our proprietary software prevents duplicate participation.
We also have internal invitation rules that maintain a reasonable time gap between survey engagements, minimizing potential survey fatigue and preserving the quality and reliability of responses.

Q24. What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel, etc? Are you able to supply buyers with a project analysis of such individual level data? Are you able to append such data points to your participant records?

We record and store all individual participant data; all respondent activity in surveys is captured in our system. These insights help us better understand respondent behavior and improve the services we provide. We make sure no PII is included or disclosed in any data files shared with clients.

Q25. Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router.

We employ controls to ensure the authenticity and identity verification of panelists when accessing surveys: duplicate controls, reCAPTCHA controls, demographic controls.

Q26. How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?

All suppliers used by Main Frame must meet certain quality standards. When used, respondents from those suppliers go through the same quality-check controls we apply to our own participants. If a particular supplier balance needs to be maintained across a set of surveys, this can be managed on our side with quota controls. If required by our clients, this information can also be shared.

Q27. Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses?

If respondents have not participated in any surveys within a 6-month period, their panel registration is automatically deleted. They can no longer log in to their account and no longer receive invites for future surveys.

At the beginning of every survey, participants are required to answer quality-check questions. In a random set of surveys we also ask respondents for their birthdate; if it differs from the one selected in their profile, their participation is flagged as potentially fraudulent.

Q28. For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviors, such as (a) random responding, (b) Illogical or inconsistent responding, (c) overuse of item non-response (e.g., “Don’t Know”) (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion?

To maintain high-quality data on our side, we have implemented multiple data quality checks in our survey templates. These checks help ensure that respondents provide accurate answers, pay attention to the survey questions, and stay engaged. The quality checks include:

- Over-reporting questions: from a set of questions (each respondent randomly sees only one), we determine whether respondents are trying to over-qualify for surveys.

- Under-reporting questions: from a similar randomized set, we determine whether participants are trying to under-report in surveys.

- Profile validation (birthdate check): if the birthdate selected in the survey does not match the respondent's profile, the response is flagged.

- Knowledge check: we ask respondents a basic knowledge question, a math question, or to select the color of a well-known object; if they fail, they are disqualified and do not enter the survey.

- Attention checks: respondents are asked to select a specific answer from a list of answers.

- Conflicting answers check: we ask respondents the same question twice (for example, the birthdate of their youngest child, if they have one); if the two answers differ, the respondent is flagged as potentially fraudulent.
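Several of the checks above (profile validation, the conflicting-answers check and the attention check) reduce to simple comparisons over the answer set. The sketch below is illustrative only; the field names and the expected attention-check option are hypothetical:

```python
def quality_flags(profile: dict, answers: dict) -> list[str]:
    """Return the quality flags raised for one respondent (illustrative sketch)."""
    flags = []
    # Profile validation: the in-survey birthdate must match the panel profile.
    if answers.get("birthdate") != profile.get("birthdate"):
        flags.append("birthdate_mismatch")
    # Conflicting answers: the same question asked twice must agree.
    if answers.get("child_birthdate_1") != answers.get("child_birthdate_2"):
        flags.append("conflicting_answers")
    # Attention check: the respondent was instructed to pick a specific option.
    if answers.get("attention_check") != "option_3":
        flags.append("failed_attention_check")
    return flags
```

Any non-empty flag list would mark the respondent for review or rejection rather than counting toward quota.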

Section 5 - Policies and Compliance

Q29. Please provide the link to your participant privacy notice (sometimes referred to as a privacy policy) as well as a summary of the key concepts it addresses. (Note: If your company uses different privacy notices for different products or services, please provide an example relevant to the products or services covered in your response to this question).

Here is a link to the privacy policy: https://gavenze.com/privacy-policy

It can be accessed by non-panelists too. It provides essential details in compliance with the laws and regulations. The policy covers key concepts such as the identity and contact details, the personal data, the rights of the respondents, the cookie policy and others.

Q30. How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for the processing personal data? How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer?

To ensure compliance, we have implemented a range of technical and organizational measures, including encryption or pseudonymization of personal data and safeguards for confidentiality.

In any given survey, a respondent's data is collected anonymously and tagged only to a random identification number.

We have put in place appropriate security measures to prevent personal data from being accidentally lost, used or accessed in an unauthorized way.

We have also put in place procedures to deal with any suspected personal data breach and will notify our members and any applicable regulator of a breach where we are legally required to do so. We have appointed a data protection officer (DPO) who is responsible for overseeing questions in relation to our privacy policy.

Q31. How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants?

Consent by respondents is obtained when individuals join the panel. As outlined in our privacy page, participants have the right to withdraw their consent for future processing by simply contacting us via email and making the request.

Q32. How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?

Our DPO specializes in data protection and privacy regulations and closely monitors any changes or developments in the applicable laws.

Q33. What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations?

In accordance with the guidelines, our panel is open to anyone over the age of 16. We can specifically target the parents of children to conduct research, using a parental permission question.

Q34. Do you implement “data protection by design” (sometimes referred to as “privacy by design”) in your systems and processes? If so, please describe how.

We prioritize privacy by design in all aspects on our side. We integrate privacy considerations from the start, ensuring that appropriate technical and organizational measures are implemented to protect personal data.

Q35. What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process?

We actively participate in ongoing monitoring and compliance activities to ensure that we continue to meet the stringent requirements set. We regularly review and update our policies and procedures to adapt to changing regulatory landscapes and emerging privacy challenges. We provide assurance to our clients and participants that their personal data is handled with the utmost care, in full compliance with legal and ethical standards.

Q36. Do you certify to or comply with a quality framework such as ISO 20252?

We are committed to maintaining the highest quality standards in the industry and providing reliable and accurate data to our clients. We are currently undergoing ISO 20252 certification. The confidence placed in our online panel by our clients, including large businesses, academics and competitors who continue to engage our sample, is also a testament to the integrity of our products and services. We treat all personal and sensitive information confidentially.

Section 6 - Metrics

Q37. Which of the following are you able to provide to buyers, in aggregate and by country and source? Please include a link or attach a file of a sample report for each of the metrics you use.

We can provide these metrics when required by clients.
- Average qualifying or completion rate, trended by month
- Percent of paid completes rejected per month/project, trended by month
- Percent of members/accounts removed/quarantined, trended by month
- Percent of paid completes from 0-3 months tenure, trended by month
- Percent of paid completes from smartphones, trended by month
- Percent of paid completes from owned/branded member relationships versus intercept participants, trended by month
- Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort)
- Average number of paid completes per member, trended by month (potentially by cohort)
- Active unique participants in the last 30 days
- Active unique 18-24 male participants in the last 30 days
- Maximum feasibility in a specific country with nat rep quotas, seven days in field, 100% incidence, 10-minute interview
- Percent of quotas that reached full quota at time of delivery, trended by month
