Research Brief Summary

Student Attitudes and Beliefs

We hypothesize that by engaging in the Participatory Sensing units described in this proposal, students will develop beliefs about themselves as STEM doers and attitudes toward STEM disciplines that contribute to their learning, achievement, and further engagement with these disciplines. Current lines of research in mathematics education (e.g., Boaler & Greeno, 2000; Nasir, 2002; Cobb et al., 2009) argue for examining students’ identities as doers of mathematics (or science or computer science) as those identities develop in relationship to students’ subject-matter understandings. Students’ disciplinary identities are shaped by factors such as their performance on standardized measures, their participation in class, their sense of ability, and how they think about knowledge and authority in the classroom. Students’ constructions of identities as doers of a discipline, and their choices about whether to continue studying the subject and pursue related careers, are directly related to their persistence and motivation when studying the subject.

Mobilize will build on the experiences of CENS faculty and staff who have participated in the center’s high school programs. Summer@CENS, for example, has completed its fourth year and demonstrates the effectiveness of the learning approach proposed in Mobilize. While this summer program attracts students who are already interested in math and science by offering immersion in a university atmosphere, the follow-up evaluation (2009 Summer@CENS evaluation report) points to the impact of the Participatory Sensing context on learning. Mobilize will build on this model, adapting and expanding its lessons for LAUSD students.

The evaluation of both the 2009 and 2010 Summer@CENS High School programs attests to the potential of Participatory Sensing contexts to influence student beliefs and attitudes regarding STEM disciplines. The evaluation found that:

  • The hands-on research is one of the most satisfying aspects of the program for students;
  • Belief in the relevance of engineering to society was enhanced by participating in these projects;
  • By the end of the program, most participants planned to pursue graduate school in science or engineering;
  • All of the students agreed with the statement ‘I feel more strongly now that my career lies in science and engineering fields;’ and
  • By the end of the program, 6 of 7 female students indicated they would likely pursue a PhD in one of the core STEM fields included in this proposal (computer science, statistics, mathematics, or one of the targeted domain sciences).

The Participatory Sensing units described in our proposal incorporate a variety of skills and knowledge that characterize a “doer” of math and science. These include: problem formulation, computational strategies and algorithmic design, computer implementation, software sharing, teamwork, and basic research, analytical, and technical skills. We believe this portrayal of a scientist or mathematician as someone who is engaged in multiple aspects of doing math and science is important for attracting and supporting diverse students. This is reflected in a female student’s description of how participation in the Participatory Sensing project affected her sense of being a doer of science:

You’re involved in every part of the scientific process, it’s not like you’re just doing data collection or you’re just doing analysis or like we got to go through every step from like planning and making a proposal to writing procedures and stuff and then going out …it really makes you feel like you’re an actual scientist an actual researcher.

The evaluation team also found that “For the women, the enjoyment of working with computers increased greatly by the end of the program. With only half of the women reporting that they ‘Agreed Somewhat’ or ‘Strongly Agreed’ with the statement ‘I enjoy using computers’ at the beginning of the program, by the end of the program this number was 100%. Although the men rated themselves very high at 78% for this same question at the beginning of the program, they too increased their enjoyment of using computers to 100% by the end of the program.”

Just as much of the prior work at CENS with high school students has focused on addressing gender disparities, as evidenced by the evaluations above, the education partners (Center X and LAUSD) bring a long history of research, practice, and focus on addressing disparities in access to learning based on race and socioeconomic status. Mobilize is committed to implementing Participatory Sensing learning in Los Angeles schools with high numbers of African American and Latino/a students.

Measurement of Outcomes

Evaluation Plan

UCLA’s Center for Research on Evaluation, Standards and Student Testing (CRESST) will serve as the objective evaluator for Mobilize. With over 40 years’ experience, CRESST has long contributed to the development of scientifically based evaluation and testing methodology, leading numerous state, district, and program evaluation studies and applying rigorous quasi-experimental and experimental methodology and analysis techniques. The majority of these projects incorporated both standardized measures of academic achievement and other measures of instruction or opportunity to learn (e.g., surveys, observations, scoring rubrics). Dr. Noelle Griffin, Assistant Director/Research and Evaluation, with over 10 years’ experience in educational evaluation, will provide day-to-day project management and oversight.

Evaluation will focus primarily on quantitative indicators of program impact, including outcomes at the pre-service teacher, in-service teacher, high school student, and larger institutional levels. It will also include qualitative measures to assess the impact of the project on all partners and stakeholders. Evaluation will serve both summative and formative purposes, providing results pertinent to overall program effectiveness as well as information that the partnership can use for ongoing program improvement and refinement. As appropriate, data will be disaggregated by race, ethnicity, socio-economic status, gender, and/or disability. To provide evidence tied to outcomes and benchmarks, CRESST will investigate the following Evaluation Questions:

Evaluation Questions

  1. What is the impact of Mobilize curriculum participation on student content-area learning (i.e., math, science)? Does impact vary by key student background characteristics? (Outcome 1a)
  2. What is the impact of Mobilize participation on students’ attitudes/beliefs, including students’ identities as future STEM professionals and future math/science course taking? (Outcome 1b)
  3. What is the impact of Mobilize curriculum participation on student CS knowledge/computational thinking? (Outcome 1c)
  4. To what extent will the pre-service learning experiences and first-year classroom practices of Mobilize teacher candidates differ from those of comparison candidates who did not participate? (Outcome 2a)
  5. To what extent do teachers’ CS-based knowledge and instructional practices change after participation in Mobilize in-service professional development? (Outcomes 2b, 2c)
  6. What impact will the program have on institutional change and larger systemic educational policy at the local, state, national, and higher-education levels? (Outcomes 3a-f)

Evaluation Design

Given the nature of the project, a true experimental design is not practical. The evaluation will therefore employ a quasi-experimental design, selecting a comparison group for participating pre-service teachers, in-service teachers, and classrooms, matched on a number of core characteristics. The analyses will further control for teacher and student characteristics that could influence student outcomes.
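The matched-comparison approach described above can be illustrated with a small sketch. This is a hypothetical example only, not the actual procedure CRESST would use: the covariates (e.g., a prior-achievement index and enrollment) and the greedy nearest-neighbor rule are illustrative assumptions, and the data are invented.

```python
# Illustrative sketch of a quasi-experimental matching step: pair each
# participating unit (teacher or classroom) with its most similar
# non-participating unit on a set of core covariates.
# Hypothetical covariates and data; not the project's actual procedure.

def standardize(rows):
    """Z-score each covariate column so no single scale dominates the distance."""
    cols = list(zip(*rows))
    means = [sum(c) / len(c) for c in cols]
    sds = [(sum((x - m) ** 2 for x in c) / len(c)) ** 0.5 or 1.0
           for c, m in zip(cols, means)]
    return [[(x - m) / s for x, m, s in zip(r, means, sds)] for r in rows]

def match_controls(treated, pool):
    """Greedy nearest-neighbor matching; each comparison unit is used at most once.

    Returns a list of (treated_index, pool_index) pairs.
    """
    z = standardize(treated + pool)           # standardize jointly
    zt, zp = z[:len(treated)], z[len(treated):]
    available = set(range(len(pool)))
    pairs = []
    for i, t in enumerate(zt):
        # Pick the closest still-available comparison unit (squared Euclidean).
        j = min(available,
                key=lambda k: sum((a - b) ** 2 for a, b in zip(t, zp[k])))
        available.remove(j)
        pairs.append((i, j))
    return pairs
```

For example, a classroom with prior-achievement index 0.6 and enrollment 500 would be paired with the pool classroom nearest to it on both standardized covariates. In practice, evaluations of this kind often use propensity-score rather than raw-covariate matching; the sketch only conveys the matched-comparison idea.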