Project to Assess Climate in Engineering

PACE Method

The PACE research team is committed to a set of scientific standards whereby the data are valid, reliable, and ethically gathered reflections of students’ experiences. To meet these methodological goals, the team implemented a mixed-mode research protocol in which online surveys are complemented by in-depth, one-on-one semi-structured interviews and focus groups.

Sampling Strategy

Institutions: To reduce variation by site, the PACE research team restricted the PACE project to those undergraduate engineering programs defined as one-tiered. That is, each program either enrolled its students directly from high school into the College of Engineering or provided an engineering advisor during the first year to students who indicated an interest in engineering on their college application.

Individuals: Undergraduate engineering students were invited to participate in PACE based on a stratified random sample, thereby increasing generalizability to students at similar institutions not included in PACE. The research team intentionally oversampled women and under-represented minority students as defined by the National Science Foundation (i.e., African Americans, Hispanics, Native Americans, and Native Hawaiians/Pacific Islanders) to ensure that these groups would be sufficiently represented among those who completed the survey.
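
As a minimal sketch of how such a stratified random sample with oversampling might be drawn, the Python fragment below groups a sampling frame by stratum and draws from each stratum at its own rate. The roster file, column names, and sampling rates are hypothetical, not the values used by PACE.

    import pandas as pd

    # Hypothetical sampling frame: one row per enrolled engineering
    # student, with gender and URM status from institutional records.
    frame = pd.read_csv("engineering_roster.csv")

    # Oversample by drawing at higher rates from the strata of interest
    # (rates here are illustrative only).
    rates = {
        ("female", True): 1.00,   # URM women: invite all
        ("female", False): 0.80,
        ("male", True): 1.00,     # URM men: invite all
        ("male", False): 0.30,
    }

    invited = (
        frame.groupby(["gender", "urm"], group_keys=False)
             .apply(lambda g: g.sample(frac=rates[g.name], random_state=42))
    )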

Online Climate Survey

The PACE survey, which was designed and pre-tested to accurately measure undergraduate engineering climate (culture), was administered online in 2008 to students at all 22 PACE institutions and again in 2012 at 16 of the PACE institutions. The 2012 survey results enabled assessment of the change in climate at the PACE schools. Students from all years of matriculation were eligible to participate, and the 2008 and 2012 surveys each drew more than 10,000 student respondents. The surveys covered the following areas:

  • Interaction with professors.
  • Interaction with Teaching Assistants (TAs, GTAs, GSIs).
  • Lab experiences (if applicable).
  • Resources offered by the College of Engineering.
  • Interaction with other students.
  • Involvement with extracurricular activities, organizations and programs.
  • Perceptions of engineering career.
  • Perceptions of engineering major.
  • Personal experiences.
  • Transfer student experiences.

Reliability

Reliability testing was conducted on the data from both administrations of the PACE survey. Results from the 2012 administration are reported below.

Internal consistency coefficients were computed for all subscales containing five or more items. Negatively worded items were reverse-scored before analysis (for the first transfer student subscale, the positively worded items were reverse-scored instead). Overall, responses to each of the eight subscales showed adequate to excellent internal consistency (Table 1). The average α was .80. Responses to the Community College items showed the least internal consistency (α = .63). Inspection of the simple correlation coefficients revealed that those items formed two distinct subsets: responses to the first two items were highly related (r = .80), and responses to the last three items were moderately related (r = .56), but responses across the two subsets were not at all related (r = .04). Similarly, reliability of the Student Interaction subscale was diminished by two items (“Do students compete with one another?” and “Compared to other students … my academic abilities are…”) that were uncorrelated with the other six (r = -.01).

Table 1.  Reliability statistics for each of eight subscales.

Subscale                      N items   Mean (SD)   Cronbach’s α   Average item-total r
Professors                         21   3.6 (.47)            .87                    .46
Teaching Assistants                 9   3.5 (.41)            .84                    .57
Student Interaction                 8   3.4 (.40)            .67                    .38
Perceptions of Engineering         12   3.9 (.58)            .76                    .41
Engineering Major                   8   4.2 (.45)            .82                    .59
Confidence                          6   4.3 (.08)            .86                    .66
Transfer Students I                18   2.5 (.46)            .86                    .46
Transfer Students II               15   3.0 (.56)            .63                    .39
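
For illustration, the statistics in Table 1 could be computed along the following lines. This is a minimal Python sketch assuming pandas/NumPy and hypothetical item names; the item-total correlation is computed here as the corrected item-rest correlation, which may differ from the variant used in the original analysis.

    import numpy as np
    import pandas as pd

    def reverse_score(items, cols, lo=1, hi=5):
        """Reverse-score negatively worded Likert items."""
        out = items.copy()
        out[cols] = (lo + hi) - out[cols]
        return out

    def cronbach_alpha(items):
        """alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
        k = items.shape[1]
        return (k / (k - 1)) * (1 - items.var(ddof=1).sum()
                                / items.sum(axis=1).var(ddof=1))

    def avg_item_total_r(items):
        """Average corrected item-total r (each item vs. the sum of the others)."""
        return np.mean([items[c].corr(items.drop(columns=c).sum(axis=1))
                        for c in items.columns])

    # Hypothetical usage for the Teaching Assistants subscale (9 items):
    # sub = survey[[f"ta_q{i}" for i in range(1, 10)]].dropna()
    # sub = reverse_score(sub, cols=["ta_q4"])   # negatively worded item
    # print(cronbach_alpha(sub), avg_item_total_r(sub))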

Validity

To examine the construct validity of the questionnaire, an exploratory factor analysis was performed on 71 Likert-type items taken from the Introduction, Professors, Teaching Assistants, Student Interaction, Perceptions of Engineering, Engineering Major, and Confidence subscales. These items constituted the core measure of student opinion of the engineering experience. The two items relating to instructors’ accents were excluded because of high rates of missing data. The analysis included only those respondents (n = 4,298) who had valid data for all 71 items.

Bartlett’s test of sphericity (p < .0001) and the Kaiser-Meyer-Olkin measure of sampling adequacy (KMO = .94) indicated that the data were adequate for factor analysis. The factor solution was obtained using principal axis factoring with varimax rotation, which produced 14 factors with eigenvalues greater than 1.0 that cumulatively accounted for 45.3% of the variance. Of the 71 items, 57 (80%) had a loading of at least .32 on one or more factors, but nine items loaded on more than one factor.
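
As an illustration only (the original analysis was presumably run in a standard statistics package), the same pipeline can be sketched in Python with the third-party factor_analyzer package; the data frame name below is hypothetical.

    import pandas as pd
    from factor_analyzer import (FactorAnalyzer,
                                 calculate_bartlett_sphericity, calculate_kmo)

    # 'survey_items' is a hypothetical DataFrame of the 71 Likert-type items;
    # dropna() keeps only respondents with valid data on every item.
    items = survey_items.dropna()

    chi2, p = calculate_bartlett_sphericity(items)   # Bartlett's test
    _, kmo = calculate_kmo(items)                    # overall KMO

    # Count factors with eigenvalues greater than 1.0 (Kaiser criterion).
    ev, _ = FactorAnalyzer(rotation=None).fit(items).get_eigenvalues()
    n_factors = int((ev > 1.0).sum())

    # Principal axis factoring with varimax rotation.
    fa = FactorAnalyzer(n_factors=n_factors, method="principal",
                        rotation="varimax").fit(items)

    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    salient = loadings.abs() >= .32                  # conventional salience cutoff
    print("items loading on at least one factor:", int(salient.any(axis=1).sum()))
    print("items loading on more than one factor:", int((salient.sum(axis=1) > 1).sum()))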

Nine of the factors comprised at least three items with factor loadings greater than .32. Those nine factors accounted for 39.1% of the variance in the data.

In general, items that were intended to tap into a common idea (e.g., evaluation of professors) did load together onto a common factor. The Perceptions of Engineering items were the main exception: they resolved into four different factors.

In conclusion, the factor analysis results indicate a generally well-constructed set of items for assessing the undergraduate engineering experience, but some items could be omitted without consequence.

Climate and Leaver Interviews

We completed 179 interviews with students on site at 16 PACE institutions in 2008 and 2009. Of those, 124 (116 audio-recorded) were with current engineering students, and 55 (50 audio-recorded) were with students who had either left engineering for another major or were in the process of leaving. All students were eligible for the interviews, and we recruited a diverse sample.

The interviews with current students (“climate interviews”) addressed issues included in the Online Climate Survey and gave students the opportunity to describe, in detail, the environment for engineering undergraduates at their institution. In addition, students were asked to describe the areas in which their department excelled and the ways in which it could improve. Finally, students were asked specific questions about their perceptions of female and under-represented minority students and faculty in their program.

Former engineering students (“leavers”) were asked similar questions, but were also asked to describe how they came to the decision to switch majors and how this change affected their career plans, peer and family relationships, and college experiences more broadly.

Focus Groups (2012–2013)

Focus groups explored student perspectives on overall climate, expected retention, and career expectations; followed up on questions left unanswered by the interview transcript analysis; and examined the impact of interventions, providing key insight into the “why and how” of each.

Focus groups with engineering undergraduates were conducted at twelve PACE schools in Spring 2013. Each school chose an intervention of particular interest, and its focus groups were divided between intervention and non-intervention students. In total, 307 undergraduates participated in the focus groups.
