ESS 521

Evaluation of Physical Activity Programs for Underserved Youth

Fall 2001

PROFESSOR:

Tom Martinek
Department of Exercise and Sport Science
264 Health and Human Performance
Phone: 334-3034
email: martinek@uncg.edu

GOALS OF THE COURSE:

Each student will be expected to:

a. become aware of various models for evaluating youth development programs,
b. be familiar with the various rationales and assumptions for doing evaluation of youth development programs,
c. understand the various data sources for evaluating youth development programs, and
d. be knowledgeable of the various ways to analyze, interpret, and disseminate data collected in the evaluation process.

CLASS READINGS:

1. Martinek, T. (2000). Program evaluation. In D. Hellison, N. Cutforth, et al., Youth development. Champaign, IL: Human Kinetics. (Required)

2. Additional articles will be provided.

ASSIGNMENTS:

1. Daily reading assignments.

2. Course project (paper and presentation) (see Final Project Outline).

3. Group reports on programs for underserved youth.


COURSE OUTLINE:

Tuesday, August 21--Review of course content and projects

Tuesday, August 28--Introduction

A. What we know about programs for underserved youth.

B. Why, when, what, and/or who do we evaluate?

C. Assumptions underlying evaluation.

D. Models for evaluating programs for underserved youth.

Reading(s):

Martinek, T. (2000). Program evaluation.

Hellison, D., & Cutforth, N. (1997). Extended day programs for urban children and youth: From theory to practice. In H. Walberg, O. Reyes, & R. P. Weissberg (Eds.), Children and youth: Interdisciplinary perspectives (pp. 223-249). Thousand Oaks, CA: Sage.

Tuesday, September 4

A. Jim Stiehl and Don Morris--Evaluating Extended Day Programs

B. Using and evaluating the Responsibility Model

C. Group work

Reading(s):

Tuesday, September 11

A. Types of Traditional Models
· Program profiling (McLaughlin & Heath)
· Two group designs (Cummings and Wright)

B. Issues in Traditional Evaluation Models
· Validity--internal and external
· Instrumentation (validity & reliability)
· Accessing archival data
· Statistical treatment
· Politics of evaluation

Tuesday, September 18

A. Getting Started
· Developing program crosswalk
· Selecting instrumentation
· Gaining access
· Scheduling data collection

B. Group Work

Reading(s):

Cummings, T. (1998). Testing the effectiveness of Hellison's Personal and Social Responsibility Model: A drop-out, repeated-grade, and absentee comparison. Unpublished master's thesis, California State University, Chico.

Wright, P. (1998). The impact of a responsibility-based martial arts program on violence prevention. Unpublished master's thesis, University of Illinois at Chicago.

Tuesday, September 25

A. Less Traditional forms of evaluation
· Single group design
· Case studies
· Journey analysis/storytelling

B. Organizing and analyzing "soft" data

C. Data Sources
· Surveys/questionnaires
· Interviewing (one-on-one, focus group, informal vs. formal)
· Program artifacts

D. Presentation of Program Evaluation--Group One
(Cummings' & Wright's studies)

Readings:

Locke, L. (1989). Qualitative research as a form of scientific inquiry in sport and physical education. Research Quarterly for Exercise and Sport.

Martinek, T. (2000) Program evaluation.

Tuesday, October 2-NO CLASS

Tuesday, October 9

A. Tammy Schilling--Case study evaluation

B. Creative forms of evaluation
· Using program artifacts (journals, reflection responses, informal recall, critical incidents)
· Developing portfolios
· Action research/interactive research
· Service-bonded inquiry
· Interviewing kids/parents

C. Presentation of program evaluation--Group Two (Cutforth & Puckett)

D. Group Work and Preliminary Reports

Reading(s):

Schilling, T. (In press). Research Quarterly for Exercise and Sport.

Martinek, T., & Hellison, D. (1997). Service-bonded inquiry: The road less traveled. Journal of Teaching in Physical Education.

Cutforth, N., & Puckett, K. (1999). An investigation into the organization, challenges, and impact of an urban apprentice teacher program. Urban Review, 31(2), 153-172.

Tuesday, October 16

A. Interpreting data from creative evaluation
· Matrix analysis
· Eyeballing

B. Program evaluation of community recreation programs--
John Saunders (Director of Greensboro Parks and Recreation Program)

C. Program evaluation report--Group Three
(Martinek & Schilling)


Reading(s):

Cutforth, N., & Puckett, K. (1999). An investigation into the organization, challenges, and impact of an urban apprentice teacher program. Urban Review, 31(2), 153-172.

Tuesday, October 23

A. Interviewing kids at Boys and Girls Club

B. Program evaluation of Boys and Girls Club--
Terry Stevenson (Executive Director of Boys & Girls Clubs)

C. Group Work

Tuesday, October 30

A. Presenting your findings (bar graphs, charts, and other bells & whistles)

B. Interpreting and applying results

· Journal outlets
· Workshops
· Working with professionals

C. Presentation of program evaluation--Group Four (Kahne)

D. Group work

Reading(s):

Tuesday, November 6

A. What funding agencies look for in evaluation of programs

B. Principles Learned from the Evaluation Process

C. Principles Applied to the Evaluation Process

D. Principles Applied to Organization and Analysis of Data

E. Group work


Reading(s):

Martinek, T. & Schilling, T. (1999). Program evaluation.

Tuesday, December 4

A. Final Project Presentations

B. Summary and Wrap-up

C. Course Evaluation

FINAL GRADE CRITERIA:

Interview Summary (25%)

Program Critique (25%)

Final Project (50%)


FINAL PAPER DUE DATE: July 5