
Partnering on Survey Development: Benefits, Challenges, and Balancing the Trade-Offs


This week we are hearing from the Los Angeles Education Research Institute (LAERI, @LAEdResearch). This post is by Carrie Miller (@Carrie_E_Miller), doctoral candidate at UCLA (@UCLA) and LAERI collaborator; Meredith Phillips, Associate Professor of Public Policy and Sociology at UCLA and co-founder of LAERI; and Kyo Yamashiro, Associate Professor of Education and director of the Urban Leadership Ph.D. program at Claremont Graduate University (@CGUnews) and co-founder and Executive Director of LAERI.

LAERI previously blogged about college access and success, college readiness, and college counseling in Los Angeles.

Today's post is written from the researcher perspective. Stay tuned: Thursday we will share the practitioner's perspective.


The Los Angeles Education Research Institute (LAERI) has collaborated with the Los Angeles Unified School District (L.A. Unified) in a research-practice partnership that has focused primarily, over the last several years, on college readiness and access. Our research team has worked closely with the district's Division of Instruction and Office of Data and Accountability (ODA) to develop and revise college access related survey modules and questions. Working on the surveys with the district has many benefits but also involves some trade-offs. Here, we share some of our experiences and takeaways.

Collecting Data: A Collaborative Effort

L.A. Unified has invited students, staff, and parents to complete an annual survey since the 2008-09 academic year. These surveys, called the School Experience Surveys (SES), typically include items related to school climate and practices, students' social emotional skills and academic behaviors, staff training, and central office services. The district has refined the content and administration of these surveys over time to accommodate the shifting data needs and priorities of internal and external stakeholders, including a desire for relatively short surveys.

Since the 2015-16 academic year, LAERI has collaborated with the district to integrate relevant survey modules and items into the SES that capture information about topics that emerge in our research-practice partnership. LAERI first developed a high school counselor module for the staff SES and has since worked with the district to revise and expand the module to include items for elementary and middle school counselors. In addition, we have jointly developed items for principals, teachers, and secondary students. These data have played a central role in generating new knowledge about college readiness and college going in L.A. Unified, including:

  • the types of college readiness supports available to students (see the related report);

  • challenges school counselors face in providing college-counseling services;

  • the types and percentages of students who receive different types of individualized college-counseling supports; and

  • whether and where students applied to college.

These survey data, along with National Student Clearinghouse and district administrative data, have informed the district's approach to college readiness (see this earlier blog post) and the strategies incorporated into their California College Readiness Block Grant plan. In our current work, we combine SES and longitudinal administrative data to explore the extent to which differences among schools, after adjusting for students' background characteristics and prior academic preparation, seem to contribute to differences in students' college-related outcomes.
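The "differences among schools, after adjusting for students' background characteristics" idea above is, at its core, a regression adjustment. As a rough illustration only, here is a minimal sketch with simulated data; the variable names, the single covariate, and every number below are invented for the example and do not reflect the partnership's actual data or models.

```python
import numpy as np

# Illustrative sketch of "adjusted" school differences via linear regression.
# All data here are simulated; nothing reflects real LAERI/L.A. Unified data.
rng = np.random.default_rng(0)

n_students, n_schools = 600, 3
school = rng.integers(0, n_schools, n_students)       # hypothetical school assignment
prior_gpa = rng.normal(3.0, 0.5, n_students)          # prior academic preparation

# Simulated outcome: driven mostly by prior GPA, plus a small school effect.
school_effect = np.array([0.0, 0.10, -0.05])
outcome = 0.4 * prior_gpa + school_effect[school] + rng.normal(0, 0.3, n_students)

# Design matrix: intercept, prior GPA, and dummies for schools 1 and 2
# (school 0 serves as the reference category).
X = np.column_stack([
    np.ones(n_students),
    prior_gpa,
    (school == 1).astype(float),
    (school == 2).astype(float),
])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)

# beta[2] and beta[3] are school differences *after* adjusting for prior GPA.
adjusted_diffs = beta[2:]
```

In practice, analyses like those described above would use many more covariates and longitudinal administrative data; ordinary least squares with school dummies is simply the plainest version of the "adjusted differences" idea.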

Partnering on Survey Design and Development: Striking a Balance

From the researcher perspective, we see several advantages to partnering on survey development and design. For example, the SES provides a much larger set of respondents than we would be able to recruit ourselves, which makes it possible to describe patterns among students and schools much more thoroughly than is typically possible with data collected for research purposes. In addition, developing survey items with the district allows us to improve data quality in each subsequent survey administration; we identify items that are not yielding useful data, need revisions to increase clarity, or might be more accurately answered by a different respondent (e.g., principals versus counselors) and revise the surveys accordingly. Most importantly, partnering on the SES means that we are focused from the beginning on what might be useful to inform conversations about improving practice.

Some features of the survey have both advantages and drawbacks. For example, while SES response rates are high for the student and staff surveys, some schools have relatively low response rates, and parent response rates are especially variable. Similarly, the staff survey is anonymous, which may yield higher-quality data but means that we are not able to link specific teachers' practices or attitudes from that anonymous survey to other sources of relevant data (e.g., staff demographic or student data). In addition, because the SES content is designed to be responsive to district needs and interests, survey questions and response sets change between administrations. While this flexibility is an advantage for ensuring that the data collection reflects district needs, survey revisions can make it difficult to measure change or improvement consistently over time. Moreover, because the district has limited resources, few opportunities to assess the measurement properties of the survey are built into the survey development process.

While there are many advantages to partnering to develop the surveys, the survey content and administration differ from what district staff or our research team might design in isolation. For example, last school year, we were able to gather data on seniors' college application patterns because the survey administration period occurred after most college application deadlines had passed. In the most recent administration, however, the district moved the survey earlier in the fall at the request of some principals who wanted it to coincide with parent-teacher conferences. Because the new survey window preceded most application deadlines, we had to eliminate the questions about college application patterns that we had included the year before.

The district's willingness to collaborate on the SES, even though working together can make the survey development process more time-consuming for their staff, has been invaluable for our research-practice partnership and college readiness research. We are grateful for their patience as we propose and revise items and clean and analyze the data. With each survey administration, we gain a better understanding of district stakeholders' data needs and priorities, the best ways to gather key college readiness and access information, and how to improve our data cleaning processes to get data into the hands of practitioners more quickly.

Collaborating with district staff on the SES surveys has prompted important conversations about district data needs and data collection practices, fostered closer relationships between district staff and LAERI researchers, and yielded important new knowledge. Most importantly, our collaboration has built momentum for understanding more about college access supports in the district and how to improve them. 


The opinions expressed in Urban Education Reform: Bridging Research and Practice are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
