Measuring Adolescent Participation: Results from a Qualitative Pretest in Côte d’Ivoire and Indonesia

Adolescence, or the period of transition between childhood and adulthood, is a significant stage in human development. Not only is it a time of physical growth and neurological development, but it is also a time when many young people begin to expand their networks beyond their family, form close ties with their peers, develop and express their views, and have a greater desire to be involved in decisions that affect them. This act of forming and expressing their views and influencing matters that concern them is sometimes referred to as adolescent participation. When viewed through an international human rights lens, adolescent participation is considered a fundamental right, and consequently, there is a need for reliable and valid measures that accurately assess adolescent participation and allow for internationally comparable data. The purpose of this study was to qualitatively test a draft questionnaire designed to measure adolescent participation. The goal was to gain insight both into how the items would perform cross-culturally and into adolescents’ comprehension and interpretation of different constructs related to adolescent participation. Cognitive interviews were conducted with 123 adolescents aged 10 to 19 in Côte d’Ivoire and Indonesia. This paper describes the development of the draft questionnaire, presents the methods used to pretest the questionnaire, and summarizes the findings. This study reveals important considerations for researchers and practitioners interested in qualitatively pretesting questionnaires with adolescents and in measuring and understanding adolescent participation across a range of international contexts.


Introduction
Adolescence, or the period of transition between childhood and adulthood, is increasingly recognized as a significant period in human development (United Nations Children's Fund, 2018). Although adolescence is characterized by physical growth, neurological development, and the onset of puberty, it is also a time when many young people begin to expand their network beyond their immediate family, form close ties with their peers, develop and express their views, and have a greater desire to be involved in decisions that affect them. This act of adolescents' forming and expressing their views and influencing matters that concern them is sometimes referred to as adolescent participation (United Nations Children's Fund, 2018).
When viewed through an international human rights lens, adolescent participation is considered a fundamental right. According to the United Nations Convention on the Rights of the Child, an international agreement on childhood, adolescent participation includes the right for adolescents to express views on all matters affecting them. This is embodied in the Convention on the Rights of the Child in Article 12 and applies to all children capable of forming a view (United Nations Committee on the Rights of the Child, 2009). Although Article 12 makes it clear that children and young people should be able to express their views in matters that affect them, the extent to which children or adolescents around the world are, in fact, expressing their views in matters that affect them and are actively participating in society is largely unknown.
To fill this gap, we designed a questionnaire that builds on UNICEF's Conceptual Framework for Measuring Outcomes of Adolescent Participation (United Nations Children's Fund, 2018). Unlike other frameworks which tend to focus on the determinants and experience of adolescent participation, the UNICEF framework focuses on the outcomes of adolescent participation (Hart, 1992; Shier, 2001; Treseder, 1997; USAID, 2014; Wong et al., 2010). Hart (1992), for instance, describes different levels of participation, ranging from manipulated or tokenized adolescents to young people initiating and leading action and decision making. Similarly, Shier (2001) writes of the openings, opportunities, and obligations involved in adolescent participation, while Wong and colleagues (2010) incorporate intergenerational linkages in their typology. Although useful, these frameworks are nonetheless incomplete. As Checkoway and Gutierrez (2006) note, "just because a number of young people attend a number of meetings and speak a number of times" does not necessarily mean that there has been any meaningful participation (p. 2).
The questionnaire we designed, the Adolescent Participation Questionnaire, measures seven constructs: (a) self-esteem, (b) self-efficacy, (c) social connectedness, (d) mattering, (e) decision making, (f) civic attitudes, and (g) civic engagement. The questionnaire comprises 93 items and was designed to be suitable for use in school-based or household surveys, administered to adolescents ages 10 to 19 across different cultural contexts.

Item Selection
In designing the questionnaire, we reviewed many widely used questionnaires and scales that are intended to measure one or more of the constructs mentioned previously. Most of these questionnaires and scales measure only one construct. For instance, the Global Self-Worth Scale focuses on adolescents' self-esteem (Dubois et al., 1996), the Societal Mattering Scale assesses adolescents' sense of mattering (Schmidt, 2018), and the Chinese Positive Youth Development Scale focuses on adolescents' civic dispositions, like attitudes towards rules or volunteering (Shek et al., 2007). Other scales offer items that address several components at once. For instance, the Five C's of Positive Youth Development examines an adolescent's competence, confidence, connection, character, and caring/compassion (Lerner et al., 2005). Likewise, the Sociopolitical Control Scale for Youth measures both an adolescent's perception of their leadership skills and of their ability to influence policy in an organization or community (Peterson et al., 2011). In addition to drawing on many questionnaires and scales, we conducted a review of resources identified and provided by experts in youth participation, empowerment, and development.
Through this process, we built a question bank of over 450 items. The final selection of items was determined by several criteria. Items were prioritized if they had been used in questionnaires or scales that had been designed for and tested among the adolescent age group, had been applied across cultures, and had undergone a validation exercise. In addition to this compilation from existing instruments, new questions were developed on two constructs for which no existing tools were identified: the presence of a trusted adult and participation in decision making. The design of these questions drew from the literature and existing instruments on similar topics; for example, questions on decision making were modeled after well-established tools within the women's empowerment literature on women's power to make decisions in their households.

Approach to Pretesting
After we had developed an initial draft of the questionnaire, we conducted a qualitative pretest using cognitive interviewing. Cognitive interviewing is a qualitative method designed to investigate whether a survey question fulfills its intended purpose, thereby reducing the potential for measurement error. The method involves recruiting individuals, or participants, who share characteristics of the survey respondent population, presenting them with the survey questions, and asking them probes, or follow-up questions, to identify any potential problems or issues with the survey questions. Findings from cognitive interviews can then be used to guide revisions and improvements to the survey questions (Beatty & Willis, 2007). When a questionnaire is developed and validated in a Western context but intended to be administered in multiple countries across the globe, cognitive interviewing can be especially valuable for assessing its cross-cultural performance. Through cognitive interviews of a cross-cultural survey, researchers can assess whether the questions make sense to participants with different backgrounds and experiences; whether participants have any trouble or need help understanding the questions, key terms, and concepts; whether the questions are relevant to participants; how comfortable participants are in answering the questions; whether any of the questions are too sensitive; and whether participants feel burdened by answering the questions.
To test the questionnaire and the extent to which the items make sense and are relevant in different cultural contexts, cognitive interviews were conducted in Abidjan, Côte d'Ivoire and Jakarta, Indonesia.

Preparing for Data Collection
To prepare for cognitive testing, we translated the questionnaire into French and Bahasa Indonesia with the support of in-country UNICEF staff, designed culturally appropriate testing protocols, and developed a semi-structured interview guide, which was also translated into French and Bahasa Indonesia. Quality translation was critical, as results from the interviews would not have been valid if the translations of the questions, response options, or probes suffered from translation errors. The interview guide was designed to uniformly facilitate the cognitive interviews, independent of interviewer, location, or participant's language. It contained the administration details, consent forms, survey questions to be tested, response options, and skip logic, as well as scripted probes for each question to be tested. The scripted probes were used to explore participants' comprehension of the questions and their reporting processes. Consistent with best practices for cognitive testing with adolescents, we used concurrent probing of specific questions. Examples of probes used are shown in Table 1.

Table 1. Examples of Probes

Probe type                                Example
General probe                             How did you come up with your answer?
Specific probe                            What do you think it means to be "valued" by people in your community?
Recall probe                              What are some examples of volunteer work that you did?
Probe on response options                 What do you think about these responses?
Probe on comprehension/interpretation     Can you provide an example of ". . . to solve a problem in the community"?
In addition to asking scripted probes, interviewers could ask ad hoc or spontaneous probes if additional areas of discussion were necessary for detecting problems or issues with the questionnaire.
Six interviewers were recruited and hired in Abidjan and 12 in Jakarta by in-country UNICEF staff. After interviewers were hired, two members of our team traveled to Abidjan and one traveled to Jakarta to lead a 2-day interviewer training and to oversee data collection. The team worked closely with the in-country partners to plan the training and data collection. The trainings were conducted in English with simultaneous translation in the local language, and the training materials, including the training agenda, presentation slides, and interview guide, were all translated into the local language. During training, the trainer provided an overview of the purpose of the cognitive interviews, presented the fundamentals of cognitive interviewing in a lecture format, and reviewed the procedures for recording the interviews, taking notes, and entering notes into an electronic template.
Additionally, interviewers conducted mock exercises using the interview guide. During training in both locations, minor revisions were made to the translation of the interview guide based on feedback from the interviewers while maintaining measurement equivalence.
Using a recruitment script which explained the purpose of the interviews and an eligibility screener, in-country UNICEF staff worked with local agencies to recruit at least 60 participants in both Abidjan and Jakarta. The eligibility criteria were that participants had to be between 10 and 19 years of age, fluent in the local language, and available during the data collection period. A significant effort was made to ensure diversity of participants by age, gender, and level of education in both locations.
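The three eligibility criteria above amount to a simple screening check. As a purely illustrative sketch (the function name and fields are our own assumptions, not the actual screener instrument):

```python
# Hypothetical sketch of the eligibility screening logic described above.
# Field names are illustrative; the actual screener was a recruitment script.

def is_eligible(age: int, fluent_in_local_language: bool, available: bool) -> bool:
    """Return True only if a prospective participant meets all three criteria:
    aged 10-19, fluent in the local language, and available during data collection."""
    return 10 <= age <= 19 and fluent_in_local_language and available

print(is_eligible(14, True, True))   # True: meets all criteria
print(is_eligible(9, True, True))    # False: below the age range
```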

Data Collection
Prior to data collection, the in-country UNICEF staff obtained permission from a parent of each participant under the age of 18. At the start of each interview, the interviewer obtained informed consent for participants who were 18 or 19 years of age and informed minor assent for participants under the age of 18.
Over a 2-day period, a total of 62 interviews were conducted in Abidjan and 61 interviews were conducted in Jakarta with adolescents with varying levels of education. Table 2 shows the breakdown of participants by characteristic. During each interview, the interviewer read each question aloud to the participant and the participant provided their response. Then, the interviewer asked probing questions to determine if the participant understood the question or construct as intended. For example, after a participant shared their response to the survey question, "How many times have you worked as a volunteer during the past 12 months?", one interviewer asked the participant to share some examples of the type of volunteer work that they did to understand how the participant was interpreting the word "volunteer." Interviews lasted approximately 75 minutes and were conducted in the local language. With participants' permission, interviews were audio recorded. Each interview had a notetaker who was also a trained interviewer. In addition to capturing participants' responses to the survey questions and their answers to probes, for each survey question, the notetaker recorded whether the participant was able to answer the question and whether the participant had any trouble understanding the question or needed help. At the conclusion of the interviews, participants received a small token of appreciation.
After data collection, the interviewer and notetaker for each interview collaborated in entering their notes from the interview into an electronic notes template. During this process, they consulted the notes each had taken during the interview, as well as the audio recording. The completed electronic notes templates for Abidjan and Jakarta were then translated into English for analysis.

Analysis
Analysts on our team used a content analysis approach to interpret and analyze the findings from the 123 interviews. The analysts conducted an item-by-item review of the data to identify cross-cutting themes and understand how the items performed. Particular attention was given to items for which the interviewer had recorded that the adolescent was not able to answer the question, had trouble understanding it, or needed help; items with vague, unclear, or unfamiliar terms; items with problematic translations; and items on sensitive topics. The analysts also conducted an in-depth review of scenarios in which the participants' responses to survey questions did not align with their qualitative responses to the interviewer's probes, because such scenarios can be indicative of issues related to recall or response retrieval and may result in measurement error. Because we anticipated that participants' age may affect their ability to understand and respond to the survey questions, the analysts also reviewed the age and educational attainment of participants and noted where findings differed for younger compared to older participants.
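As a rough illustration of this kind of item-by-item review, the sketch below tallies problem flags per item from coded interview notes. The record layout, item names, and field names are hypothetical assumptions for illustration, not the team's actual coding scheme.

```python
# Hypothetical sketch: tally problem flags per questionnaire item from coded
# interview notes. Records and field names are illustrative assumptions.
from collections import Counter

notes = [
    {"item": "Q12_volunteer", "could_answer": False, "needed_help": True, "probe_consistent": True},
    {"item": "Q12_volunteer", "could_answer": True, "needed_help": True, "probe_consistent": False},
    {"item": "Q03_self_esteem", "could_answer": True, "needed_help": False, "probe_consistent": True},
]

problems = Counter()
for rec in notes:
    # Flag an item when the participant could not answer, needed help, or gave
    # a survey response that conflicted with their answers to the probes.
    if not rec["could_answer"] or rec["needed_help"] or not rec["probe_consistent"]:
        problems[rec["item"]] += 1

# Items with the most flags would be prioritized for revision.
print(problems.most_common())  # [('Q12_volunteer', 2)]
```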
The analysts then developed a report for each country that included global findings, item-by-item results, and recommendations for suggested changes to questionnaire items to reduce measurement error and allow for internationally comparable data.

Findings
Pretesting the questionnaire in Côte d'Ivoire and Indonesia provided valuable insights into the cross-cultural performance of the survey items.

Difficulty With Comprehension of Key Terms and Concepts
Some of the key terms and concepts, including "community," "volunteer," "solutions," "committee," and "public meeting" were unclear or unfamiliar to participants. As anticipated, in both Côte d'Ivoire and Indonesia, participants with low education had greater difficulty with comprehension compared to those with higher education and younger participants had more difficulty with comprehension compared to older participants. One 16-year-old shared, "I was never involved in any community, so I don't know" and another expressed, "I've never been involved in a community." Because the concept of "community" was unclear or unfamiliar to many participants, other community-related concepts, including "community problem," "community leader," and "community issue" were difficult for these same participants. Those who were able to share an example of a problem in their community gave a wide range of responses, suggesting that not all participants interpret "community problem" in a similar way.
Answers included "talking things out amicably," "stopping a brawl," "giving advice to friends," "solving environmental issues," and "deliberating." The concept of "community leader" was also interpreted very differently among participants. One 17-year-old asked, "It's like the neighborhood leader, for example?" Other participants associated the concept of "community leader" with politics, political structures, and political figures.
Another unclear, confusing, or unfamiliar word, especially to younger participants, was "volunteer," in the question, "In the past 12 months, have you worked as a volunteer?" Some participants had not heard of the word "volunteer" and others were confused by the phrase "worked as a volunteer." When asked to share an example of volunteering, participants shared a range of responses, similar to when they were asked to share examples of a community problem. Examples included "helping someone cross the road," "helping my mother at home," "helping someone during the flood," "donating blood," and "cleaning up the neighborhood." Other examples mentioned house chores or homework.

Low Relevance of Some Survey Questions
Some of the survey questions, particularly those related to civic engagement, did not appear to be relevant to participants, regardless of their age. For example, the majority of participants answered "No" to the following questions: "In the past 12 months, have you worked as a volunteer?", "In the past 12 months, has a community leader asked your opinion about a community issue?", and "In the past 12 months, have you served on a committee that was addressing a community issue?" Additionally, when participants were asked what they thought of the examples used in the questions, some participants noted that they could not imagine themselves in some of the examples or scenarios that were presented.

Insufficient Response Options
On several questions, participants gave an answer that was not one of the available response options. For example, when asked how often they have used their phone or the internet to gather information about a social issue or community problem, some participants answered, "Every day," even though the available options were "Once, A few times, Monthly, Weekly." On some questions with available response options of "Yes" and "No," there were participants who answered, "Don't know" or "Not sure" and a few participants who answered "Sometimes" even though these answers were not included in the response options.
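Out-of-range answers like these can be surfaced during analysis with a simple check of each recorded answer against the item's available options. The sketch below is a hypothetical illustration; the item names and option sets are assumptions based on the examples above.

```python
# Hypothetical sketch: flag recorded answers that are not among an item's
# available response options. Item names and option sets are illustrative.

ALLOWED = {
    "phone_internet_frequency": {"Once", "A few times", "Monthly", "Weekly"},
    "worked_as_volunteer": {"Yes", "No"},
}

def out_of_range(item: str, answer: str) -> bool:
    """True when a recorded answer is not one of the item's available options."""
    return answer not in ALLOWED[item]

print(out_of_range("phone_internet_frequency", "Every day"))  # True: not an option
print(out_of_range("worked_as_volunteer", "Sometimes"))       # True: not an option
print(out_of_range("worked_as_volunteer", "No"))              # False
```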

Inaccurate Reference Period
Many of the questions in the section about civic engagement ask about a specific reference period, for example, "In the past 12 months, have you worked together with someone or some group to solve a problem in the community where you live?" When interviewers probed on these responses, it became evident that some participants answered "Yes" even though the activity did not occur during the specified reference period, a phenomenon referred to as telescoping. For example, one participant explained that "one time, I helped solve a problem in a community" and went on to say that this had happened several years ago, despite answering "Yes" to the survey question. On the questions that included a reference period, younger participants, compared to older participants, tended to think about whether they had ever participated in the activity rather than whether they had participated in the activity in the past 12 months, as specified in the question.

Sensitivity of Questionnaire Items
There were items in the questionnaire, particularly in the section related to decision making, that participants in both Côte d'Ivoire and Indonesia noted were sensitive or that they appeared uncomfortable answering. The most sensitive question seemed to be, "Are you married or currently dating someone?" This question was especially sensitive in Indonesia given that, during cognitive testing, there was a nationwide, non-governmental campaign to prevent adolescents from dating and to shame those who did. Other sensitive questions included, "Who will decide in the future whom you will marry or date?", "Who will decide in the future at what age you will get married?", and "Are you allowed to visit a doctor/nurse/community health worker for family planning (condoms, pills, IUD, etc.)?" In Indonesia, because of the cultural sensitivity of this last question, a law that prohibits unmarried adolescents from purchasing contraceptives, and a planned amendment to this law to criminalize reproductive health services for unmarried adolescents, an update was made to the interview guide prior to the start of data collection that removed the examples of "condoms, pills, or an IUD." Other questions that participants found sensitive or uncomfortable were related to self-esteem and self-efficacy. Examples include "When I face life difficulties, I feel helpless," "I sometimes think I am a failure (a 'loser')," and "I wish I had more to be proud of." After being asked the set of self-esteem and self-efficacy questions, participants were asked how the questions made them feel. Many participants of all ages expressed that they felt nervous. One 19-year-old participant shared, "I felt a little nervous, a little awkward . . . it was overwhelming . . . a lot of the questions were tough to answer." A couple of participants mentioned that they had felt "triggered" by one or more of the questions. One 16-year-old participant explained, "I felt tense because it was my first time. It's just that they're a little personal and I'm a shy person so I felt pressured by the questions."

Cognitive Burden
Because a cognitive interview requires a participant to both answer survey questions and explain how they came up with their answers, participating can be a burdensome exercise, especially for adolescents (Omrani et al., 2019). Although some participants did not appear to experience any burden, the majority of participants in both Côte d'Ivoire and Indonesia experienced some level of cognitive burden. The most common example of cognitive burden during these interviews was participants who expressed that they were confused by or unfamiliar with a particular term or concept and then became frustrated by the number of follow-up questions that included that same unfamiliar term or concept. Some of these participants showed frustration, and others, especially younger participants, shared that the questions were difficult. Another example of burden was participants who expressed that they felt they were being asked the same question multiple times. As mentioned, many survey items were borrowed from existing scales and instruments that are composed of multiple questions aimed at assessing the same construct to establish reliability.
When presented with similar items, some participants became frustrated, not being able to detect differences between the questions being asked. The translation of some terms and concepts also contributed to cognitive burden for some participants because the translation was more complex than the term or concept in English.

Quality of Translation
As mentioned, there were several issues with the translation of the interview guide from English into French and Bahasa Indonesia. For example, the initial translations contained words that had a different meaning from the original English or words that are not commonly used by adolescents in the respective country. Although most of these issues were resolved during the interviewer training by making minor updates to the interview guide, three issues with the translation from English to Bahasa Indonesia were discovered during analysis. First, the translation of "community affairs" used in the questionnaire translates back to "problems in society." Consequently, when participants were presented with the Indonesian translation, they tended to share examples of societal problems, such as "a dispute or disagreement between neighbors," "littering," "flooding," and "quarrels or disputes." Second, the translation of "ashamed" used in the questionnaire is a word with multiple meanings, including "ashamed," "embarrassed," and "shy." According to the interviewers, based on how the question appears in the questionnaire, "I often feel ashamed of myself," it was unclear how participants should interpret the translation of "ashamed." As a result, participants interpreted this question very differently. When asked what "ashamed" means in Indonesian, participants referenced not feeling confident or lacking confidence, not wanting to do things, doing something wrong and not apologizing, and not feeling secure. Third, the translation of "as a person" used in the questionnaire was unfamiliar or confusing to participants as this is not a common phrase in the local language. Consequently, participants had trouble understanding the translated statements "I am happy with myself as a person" and "The people in my community value me as a person."

Discussion
Our qualitative pretest provided valuable insight into how the items on the questionnaire would perform cross-culturally and into adolescents' comprehension and interpretation of different constructs. The interviews revealed a number of challenges that respondents would be likely to face if administered the current draft of the questionnaire, including challenges related to comprehension of key terms and concepts, the relevance of specific survey questions, the sensitivity of some items, and the quality of translation. Through these interviews, we learned that some of the language used in the questionnaire represents abstract concepts that may be confusing, unclear, or unfamiliar to respondents; not all of the survey questions and examples used in the survey questions will be relevant to all participants; some questions may warrant additional response options; the reference period may need to be emphasized for accurate data collection; some questions may be too sensitive to include; and lastly, the translation likely contributed to some of the comprehension issues.
In revising the current draft of the questionnaire, special consideration will be given to how each of the issues identified during this pretest is addressed. Because of the number of comprehension issues, there is a significant need to revisit some of the concepts and terms. For example, regarding the comprehension issue with the word "community," it is likely that adolescents do, in fact, interact with various proximate and virtual communities, but they may not refer to them as such. Of critical importance during the revision process will be clarifying what the concept of "community" means in the literature related to adolescent participation, then addressing the comprehension issue by replacing the word "community" with a short description, providing a definition of "community" at the beginning of the questionnaire, or including a short vignette to convey the concept. A closer examination will also be given to other words that were unfamiliar, confusing, or unclear to younger participants, such as "volunteer," and consideration will be given to whether these words need to be replaced with ones that are more developmentally appropriate or whether a definition of the word needs to be included at the beginning of the corresponding section. Further, some of the questions will be replaced with language that is more informal, conversational, and developmentally appropriate. For example, "Have you attended any public meetings in which there was a discussion of community affairs?" may be replaced with "Have you been to any meetings where people talked about community issues?" Regarding the issue of relevance, we understand that the subject matter covered in the questionnaire includes some activities and behaviors that we would expect only a minority of adolescents to engage in.
To address the issue of participants telescoping and inaccurately reporting events as occurring within the reference period, either the reference period of "in the past 12 months" will be removed or it will be underlined for emphasis, depending on how important it is to determine if the adolescent has participated in the activity during the last year.
Based on the findings of the interviews in Côte d'Ivoire and Indonesia, we plan to revise the questionnaire, then conduct at least one round of additional testing prior to finalizing the questionnaire. During the additional round(s) of testing, special attention will be given to whether new probes should be added to the interview protocol to address issues of comprehension or relevance and to determine what is developmentally appropriate for younger adolescents, for example, "How would you ask these questions of other adolescents your age?" We will also explore whether alternative approaches to measuring adolescent participation will be needed for younger adolescents, such as administering a subset of measures to adolescents under the age of 15.
In addition to providing insights into the necessary changes to the questionnaire, the study reveals important considerations for researchers and practitioners interested in measuring adolescent participation across a range of international contexts. Population-level data collection efforts that are based on representative samples and are implemented across countries require the use of a questionnaire that can produce valid and reliable data. To produce such data, the study findings confirm the importance of cognitive testing as a crucial step in the design of culturally relevant and sensitive questions and a means to reduce response bias in cross-cultural contexts.
Once finalized, the questionnaire could be included in school-based and household surveys to generate population-level estimates. When used in surveys that are adequately designed and implemented, it will allow for the generation of data that are comparable across countries.
Findings from this work underscore the importance of cognitive testing and illustrate the scope of findings that cognitive pretesting can unearth. Pretesting revealed that concepts assumed to be simple were problematic for adolescents, and that issues with comprehension and the response process differed across cultural contexts. Practitioners seeking to administer surveys with adolescents, especially across geographies or cultures, should employ cognitive interview pretesting methods to test all questions.

Limitations
There were several limitations to this study, including the use of a convenience sample and time and resource constraints. Participants were recruited using convenience methods and therefore do not represent the larger population of adolescents in Indonesia or Côte d'Ivoire. Although participants provided robust feedback on the questionnaire, it is possible that respondents may experience issues in responding to survey questions that were not detected during the cognitive interviews. Adolescents with certain demographic profiles, such as those who live in rural areas or whose education levels are not commensurate with their age, may experience systematic difficulties in response, resulting in measurement bias. Cognitive interviewing with convenience samples is not well suited to detecting the potential for this bias; instead, pilot and field tests with scientific sampling and robust sample sizes are best suited to detecting such effects.
Additionally, time and resource constraints did not allow for back-translation of the interview guide from the local language to English; back-translation is a best practice for confirming the fidelity of translations to source materials. These constraints also influenced the analysis methods: analysis of items was conducted by two analysts, and inter-coder reliability was not assessed.

Future Research
Future research is needed to examine the impact of the cognitive interview environment on assessing the performance of a questionnaire, including the location of the interviews; the presence of an interviewer, observer, or translator; the process of an interviewer asking probes or follow-up questions; and whether the interview is audio recorded. These factors likely play a role in participants' responses, their level of disclosure, and their comfort level in answering questions. For example, the presence of an interviewer and an observer can result in acquiescence bias, which is particularly common among adolescents (Soto et al., 2008). Such bias may manifest in false reports of understanding the questions or increased endorsement of behaviors.
Once the questionnaire has been fielded in countries across the globe, future research will be needed to determine how well the instrument performs cross-culturally and the extent to which it allows for the collection of internationally comparable data about adolescent participation.