Learning Outcomes in Campus Recreation: Part II
May 12, 2011
From Accountability to Enhancement
Wallace Eddy, Ph.D.
Associate Director
Campus Recreation Services
University of Maryland (College Park)
The first article on learning outcomes dealt with the “what” of learning outcomes; this second part deals with the “how.” Is there anything more daunting to creativity than a blank page, canvas, or slate? Although you may be just getting started in the process of creating learning outcomes documentation, the learning is already occurring, so you really aren’t facing a blank page. How do you identify the learning that is taking place in your department or organization? To illustrate the process, I offer a case study of sorts, using the Challenge Course Supervisor position at our university.
Identifying learning
To identify the learning that occurs for Challenge Course Supervisors, the professional staff member responsible for that group of student employees held a brainstorming session during a staff meeting. Rather than using any specific learning outcomes language, the staff member asked: “So, what do you learn by being a challenge course supervisor?” Some prompting and probing were necessary to get the students to acknowledge that they were learning valuable knowledge and skills; the student supervisors were committed to the program mainly out of interest in facilitating the experiential learning of others, and saw their own learning as somewhat incidental. As the conversation progressed, it became clear that a good deal of learning was happening for those who served as Challenge Course Supervisors.
Once the list of learning ideas was brainstormed, the professional staff member asked: “Where did you learn those skills?” The most common responses were a combination of “through training” and “on the job.” This relatively brief session during a staff meeting resulted in a veritable gold mine of learning ideas, full of nuggets of knowledge and skill acquisition. So, how do we go about polishing those nuggets, shaping them into gems of student learning?
Once the list of learning ideas was generated, we examined it to see whether any natural connections among the ideas were emerging. As we began to note the connections, a relatively simple overall pattern took shape in binary form: (1) technical skills, and (2) interpersonal/social skills. The list of learning ideas was then re-sorted into the two categories, which let us see more precisely how the elements of the list fit together. For example, under the technical category, five outcome domains emerged: (1) physical safety skills, (2) theories underpinning challenge course environments, (3) facilitation content knowledge, (4) communication and group dynamics skills, and (5) risk management knowledge and skills. [What we have since learned is that we should have combined the physical safety skills under the more general domain of risk management knowledge and skills; I point this out to demonstrate the iterative nature of the process. The product is rarely “right” on the first try.]
With the outcome domains defined, we were able to go back to the original list of learning opportunities and place each one into the domain that best categorized it. We were then in a position to write the actual learning outcomes and describe the assessment measures, as well as the criteria for accomplishment of each specific learning outcome. Before moving on to creating an assessment plan, I also want to note that rather than identifying what students are already learning from your programs, you may also want to ask: “What should they be learning?” Although this is a more top-down approach, you may need it if your students have difficulty articulating what they are learning, or if a grassroots approach is not practical in your department or organization.
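As a lightweight illustration of this sorting step, the re-sorted list can be thought of as a mapping from outcome domain to the learning ideas filed under it. Here is a minimal sketch; the sample entries are invented for illustration and are not the actual brainstormed list:

```python
# Hypothetical sample of the re-sorted brainstorm list; entries are illustrative only.
technical_domains = {
    "physical safety skills": ["tying knots for high element work", "belay procedures"],
    "theories underpinning challenge course environments": ["experiential learning cycle"],
    "facilitation content knowledge": ["selecting appropriate activities"],
    "communication and group dynamics skills": ["debriefing a group after an activity"],
    "risk management knowledge and skills": ["writing incident reports"],
}

# Each learning opportunity now sits in the domain that best categorizes it,
# ready to be written up as a formal learning outcome.
for domain, ideas in technical_domains.items():
    print(f"{domain}: {len(ideas)} learning idea(s)")
```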
Creating an assessment plan
The basic elements of an assessment plan include:
- Specific objectives to be met
- Some notation linking the outcome to a specific theoretical outcome domain [for example, the outcomes categories listed in Learning Reconsidered 2 (Keeling, 2006)]
- Measure — how objectives will be assessed
- Criteria — level of achievement required to consider objective met
- Schedule — when will outcomes be assessed, and by whom?
In addition to the basic elements, it may be useful to articulate how the stated outcomes relate to and promote progress toward your department’s mission. For institution-wide reporting, noting how your outcomes contribute to the institution’s mission helps demonstrate that your department is an integral part of the overall institution.
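To make the structure concrete, here is a minimal sketch of how a single plan entry might be represented as a simple data structure. The field names and example values are my own illustration, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class AssessmentPlanEntry:
    """One learning outcome entry in an assessment plan (illustrative field names)."""
    objective: str          # specific objective to be met
    outcome_domain: str     # link to a theoretical outcome domain (e.g., Learning Reconsidered 2)
    measure: str            # how the objective will be assessed
    criteria: str           # level of achievement required to consider the objective met
    schedule: str           # when outcomes will be assessed, and by whom
    mission_link: str = ""  # optional: how the outcome supports the department's mission

# Example populated from the Challenge Course Supervisor case discussed below;
# the domain label and schedule are placeholders, not the department's actual entries.
objective_1 = AssessmentPlanEntry(
    objective="Students will demonstrate knowledge of the physical safety skills "
              "associated with a challenge course program",
    outcome_domain="Practical competence (Learning Reconsidered 2)",
    measure="Direct observation by CRS professional staff using a skill proficiency rubric",
    criteria="Skills 1-6 at 'accomplished'; skills 7-9 at 'proficient' or above; "
             "skills 10-11 at 'developing' or above",
    schedule="During supervisor training, by CRS professional staff",
)
```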
Let’s return to the example of Challenge Course Supervisors to see how the construction of an assessment plan comes about. We developed three objectives for the Challenge Course Supervisors:
(1) Students will demonstrate knowledge of the physical safety skills associated with a challenge course program;
(2) Students will demonstrate knowledge of the theories underpinning challenge course program facilitation; and
(3) Students will demonstrate appropriate sequencing of activities to achieve the intended program goals of challenge course user groups.
To follow the example through the assessment process, let’s focus on objective 1 from the list above. [Note: this objective also directly relates to risk management.] The measure for objective 1 is “direct observation by CRS professional staff using a skill proficiency rubric.” To understand the criteria for objective 1, a description of the skills rubric will be helpful. The rubric covers eleven skills, including the various knots necessary for high element work, belay and group belay procedures, safety awareness regarding both participants and facilities/environment, and proper use of and care for climbing equipment. For each skill, there are three levels of achievement, listed here from lowest to highest: “developing,” “proficient,” and “accomplished.” For each level, a description of what a staff member would observe is provided. The skills on the rubric are grouped in a specific order to allow the achievement criteria to be more easily organized.
The criterion for success on objective 1 is that skills, as measured by the challenge course supervisor objective 1 rubric, must be met at the following levels:
Skills 1-6 = “accomplished”
Skills 7-9 = “proficient” or above
Skills 10, 11 = “developing” or above.
Skills 1-6 are required to be at the accomplished (highest) level because they are critical to participant safety. Skills 7-9 are important, but a proficient rating still falls within acceptable risk management practices. Skills 10 and 11, although important for overall challenge course supervision, need only reach the developing level for the learning outcome objective to be met.
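As a sketch of how these tiered criteria might be checked programmatically, assuming the skill numbering and level ordering described above (the function and data names are hypothetical):

```python
# Achievement levels ordered from lowest to highest, as described above.
LEVELS = {"developing": 0, "proficient": 1, "accomplished": 2}

# Minimum required level for each skill on the objective 1 rubric.
REQUIRED = {**{s: "accomplished" for s in range(1, 7)},   # skills 1-6: critical to safety
            **{s: "proficient" for s in range(7, 10)},    # skills 7-9: proficient or above
            **{s: "developing" for s in range(10, 12)}}   # skills 10-11: developing or above

def objective_met(ratings: dict[int, str]) -> bool:
    """Return True if every observed rating meets or exceeds its required level."""
    return all(LEVELS[ratings[skill]] >= LEVELS[minimum]
               for skill, minimum in REQUIRED.items())

# Example: a supervisor rated 'accomplished' on skills 1-9 and 'developing' on 10-11.
ratings = {s: "accomplished" for s in range(1, 10)} | {10: "developing", 11: "developing"}
print(objective_met(ratings))  # True
```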
To provide a structure for the assessment plans, the Division of Student Affairs Learning Outcomes Group (SALOG) adapted the template used by the University’s academic units for their Learning Outcomes Assessment Plans. All of the elements listed above are included in the template. This common format allows for easier review across departments and adds consistency to the process for the Division as a whole.
Using assessment results
One common complaint among those with whom I’ve worked on assessment and planning processes is that the resulting documents become placeholders on a bookshelf. When working with learning outcomes in your department or organization, keep in mind how the results will be used. This end-game thinking may also help you focus on the outcomes that will be most useful and meaningful to you. Given that this publication is devoted to risk management, which is critical to our operation, here are some examples of risk-related learning outcomes we have created:
- At the conclusion of our General Employee Training (in which all student employees must participate) we assess their knowledge of Bloodborne Pathogens through an online exam that has been created by our Department of Environmental Safety.
- Adventure Trip Leaders in the Outdoor Recreation Center are assessed for their abilities to create a risk management plan prior to leading a trip.
- Intramural Supervisors are assessed on their ability to write objective incident reports and their ability to identify and manage unsporting behavior.
To report the results of our learning outcomes assessment in the Division of Student Affairs, we have used a template nearly identical to the one used for structuring the assessment plans. The two modifications are adding “results” to the section on “measure and criteria,” and changing the section that explained the assessment schedule (timeframe) to “impact of results.” The results of the assessment are reported in the renamed “measure, criteria, & results” section; placing the results directly after a restatement of the measure and criteria makes the outcome’s level of achievement easy to see at a glance. The “impact of results” section is used to note what changes will be made to the program in which that specific learning outcome objective was measured in order to increase achievement success. Of course, it is possible that you will be satisfied with the results and not make any changes; this too should be noted to demonstrate consideration of assessment results.
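Building on the AssessmentPlanEntry sketch shown earlier, the two template modifications might look like the following; the field names are again illustrative:

```python
from dataclasses import dataclass

@dataclass
class AssessmentReportEntry(AssessmentPlanEntry):  # extends the plan sketch shown earlier
    """Reporting variant of the plan template (illustrative field names)."""
    results: str = ""            # reported alongside the restated measure and criteria
    impact_of_results: str = ""  # changes planned, or a note that no changes are needed
```

Keeping the reporting format a superset of the planning format mirrors the template reuse described above: the plan and the report stay aligned field for field.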
Conclusion and a caveat
After this discussion of the identification, articulation, assessment, and application of learning outcomes, I feel that I must leave you with a caveat: the above process is a somewhat incomplete model. The degree to which you must develop outcomes that meet a “research standard” will determine how complete the above model needs to be. To be able to state that a learning opportunity actually resulted in learning, students’ prior knowledge and experience must be measured before they engage with the opportunity. Astin’s (1970) I-E-O model of college impact is commonly cited as a conceptual model for measuring the impact of the college experience: “I” stands for “inputs,” “E” for “environment,” and “O” for “outputs.” We may think of the outputs as the outcomes we want students to be able to demonstrate as a result of our learning opportunities (the environment). When we assess outcomes, even with direct measures, if we do not account for the inputs (what students brought with them in terms of prior learning or experience), we cannot take complete “credit” for the outputs we measure.
Measuring inputs is difficult and time-consuming. There is an argument to be made that one could spend all of one’s time assessing and not enough time programming and innovating! So, what do we do? Does this mean we should give up on the articulation and assessment of learning outcomes? No, but we must be careful with the language we use when making claims about the degree to which students are learning particular knowledge, skills, or attitudes through participation in our programs.
The purpose of this two-part series of articles on learning outcomes was to provide an introduction as well as an example of how learning outcomes fit into a campus recreation program. Hopefully, the information in these two articles will stimulate your own thinking about how you will be able to incorporate learning outcomes into your work. Campus recreation professionals are also educators; we simply need to find ways to articulate the learning that occurs in our departments and organizations.
References
Astin, A. W. (1970). The methodology of research on college impact, part one. Sociology of Education, 43, 223-254.
Keeling, R. (Ed.). (2006). Learning reconsidered 2: Implementing a campus-wide focus on the student experience. ACPA, ACUHO-I, ACUI, NACA, NACADA, NASPA, & NIRSA.