Penn College’s academic deans have identified faculty who have crafted successful student learning outcomes activities that warrant replication. The “best practices” detailed below are presented as abstracts intended to whet interest. Faculty are encouraged to consult with the identified practitioners for more information.

In addition, beginning in 2013 with the first President's Assessment Award recipient, awardees post their entry onto this site.

Applying Theory and Demonstrating Reflective Learning in a Nursing Laboratory

Kathleen Hyatt and Margaret Faust, Assistant Professors of Nursing
Recipient of 2014 President's Assessment Award


This assessment capitalizes on the use of a new laboratory serving the Nursing majors and, in this instance, focuses on NUR 214, Adult Medical/Surgical Nursing II. The laboratory mimics a hospital setting, with SimMan 3G, a high-fidelity manikin, as the patient. A standard simulated patient scenario is presented in five phases. During each phase, half of the group interacts with the manikin in carrying out the nursing process while the other half observes and prepares critical feedback.

The laboratory experience provides students a “safe yet challenging place” in which to learn. For patient safety reasons, in the actual clinical setting, students are extremely limited in their ability to make “real” clinical decisions based on their independent observations and critical thinking skills, or to perform clinical skills beyond basic-level care. In fact, nearly all of their complex nursing skills and decision making are exercised either under the direct guidance of the clinical instructor or the nursing staff, or “on paper.” In the simulation lab, students are given information and are then allowed to gather further data and make their own clinical decisions (good or bad) related to the identified patient needs. This setting allows students to think independently and critically through the scenario and to apply their knowledge without continuous direction from nursing faculty. During the scenario, students are allowed to reflect on their mistakes and identify measures to correct errors in thinking and nursing judgment before working with patients. Moreover, observing and critiquing other caregivers, as well as evaluating their own performance, encourages each student to become more observant, more self-reflective, and more conscious of the level of nursing knowledge necessary to provide appropriate patient care.

The nursing professors act as facilitators during the debriefing session following each phase of the scenario. The debriefing allows the student group to provide peer feedback, freely express concerns, ask clarifying questions, and examine in more depth the positives and negatives of the learning experience. This is integral in helping students leave the simulation lab on a positive note while allowing the facilitators to identify students who may be feeling overly discouraged or stressed, so that early faculty interventions can be initiated as necessary.

At the conclusion of the activity (the five phases), each student writes a personal reflective evaluation reviewing the simulation experience in relation to the NUR 214 clinical competencies. Students identify the processes and procedures they followed in achieving each learning competency. Unmet competencies are also identified so that students can develop a plan to further work on them. This allows students to develop professional accountability for their actions and their learning process.

Finally, students evaluate the laboratory experience itself.

Course Outcomes

Simulation Competencies supporting required outcomes for NUR 214:

  • Assess individual needs of the adult consumer
  • Establish priorities for care, considering consumer’s diagnosis and needs
  • Formulate and employ appropriate nursing interventions
  • Consult with client, family, members of health care team to assist client
  • Evaluate consumer’s progress toward meeting goals
  • Correlate medication knowledge with client diagnosis and med/surg history
  • Demonstrate proper administration of medications
  • Evaluate effectiveness of nursing interventions
  • Demonstrate legal and ethical behaviors
  • Be accountable for the clinical performance via weekly journaling
  • Act as self-motivated, self-directed, responsible professional

Closing the Loop

When unmet competencies are identified, an appointment with the laboratory coordinator is set to determine the most appropriate follow-up activities: e.g., additional laboratory time, practice with different simulations, clarification of terminology, and/or review of text materials. The student is then observed in a follow-up activity to evaluate the success of the remediation. The student’s personal journal is also reviewed by the nursing professors so that further personalized feedback can be provided.

Assessment Through Two Modalities

Dr. Craig A. Miller, Assistant Professor, History/Political Science
Recipient of 2013 President's Assessment Award

This assessment was completed for four sections of HIS 115, World Civilizations I, which fills a CUL requirement for many majors across campus; it is an elective taken by both underclass and upperclass students. The course covers world history from the first human societies to roughly 1500 CE. The assessment had two modalities, both administered as pre- and post-tests to measure improvement over the course of the semester: a multiple-choice exam, and a written assessment based on an analysis of a primary source document.

Assessment Through Multiple-Choice Exams

As part of the midterm and final exams, students answered ten multiple-choice questions based on content from class lectures, discussions, and readings. The aim was to measure how students' comprehension of the course content changed over the course of the semester. After the midterm, I devoted an entire class to going over the multiple-choice questions and answering questions about best practices for studying (how to organize class notes, how to approach reading the text, and strategies for retention of material). The aim of these tests is not memorization, however. I never ask students questions on names and dates, but rather on processes of historical change. Many of the questions ask students to identify the correct reason for a particular change in a particular society. For example:

Which of the following was NOT responsible for the emergence of Mesopotamian Civilization?

  • Location near large flowing rivers
  • The formation of city-states
  • Trade
  • The elimination of social classes

These questions were designed to evaluate not only content knowledge, but critical thinking skills as well. The aim was to assess the following outcomes:

  • Understand processes of change in historical context
  • Evaluate the influences of social institutions on civilization and history
  • Identify key periods in history and explain their impact on civilizations and cultures.

The data suggest that student progress was achieved. One semester of data has been collected thus far, with another round planned for the Fall 2013 semester.

Assessment of Primary Source Documents

The second modality was pre- and post-testing of document-based questions: questions about primary source documents. Students were given the pre-test the first week of class. I gave them a selection from Confucius’s Analects, knowing that many students had probably never heard of Confucius or been exposed to this kind of source before. This was intentional, as I wanted to establish a baseline from which to measure students’ progress. Students were to read the document and complete two tasks. First, they made a list of three questions that would need to be asked to fully understand the people who wrote the document, the context in which it was written, and the ideas expressed in it. Second, they explained reciprocity as described in the document and how it differs from treating everyone equally.

To help students develop these skills, they were exposed to other primary source documents, with questions provided for each, throughout the course. Students worked on these document-based questions in small groups: each group read the documents and answered the provided questions in writing. Each group then shared its answers with the rest of the class and also evaluated the questions it had answered. Students had to explain why these questions were important: What kinds of information were the questions intended to discover? Why is this information important when trying to understand historical documents? Each of these assignments was turned in and graded with feedback from the instructor. To encourage participation, the points from these in-class assignments were used as extra credit toward midterm and final grades.

As the semester progressed, students began working on documents with no questions attached. In groups, they were asked to read the documents, design and answer their own questions, and explain why they asked the questions they did. The groups shared their responses with the class, each written assignment was graded with feedback from the instructor, and again the points were used as extra credit.

For the post-test, students were given the same document they had read for the pre-test, so that their progress could be measured through comparative analysis.

Course Outcomes

Upon successful completion of this course the student will be able to:

  • Analyze written and visual information and apply the knowledge thereby acquired
  • Apply the learning modes of analysis, synthesis, application, and evaluation
  • Develop an understanding of major world cultures and civilizations, and a deeper appreciation for the  diversity of the human condition as exhibited across space and time

Outcomes of the Assessments

The pre- and post-testing of students was designed to measure both student progress and the effectiveness of the in-class assignments designed by the instructor. The results indicate that:

  • Student progress was achieved, based on the criteria
  • Consistent use of in-class assignments facilitated that progress
  • Student-developed questions demonstrated improvements in critical thinking

Suggestions for adaptations to other courses

My aim in constructing these assessment instruments was to find ways to quantitatively measure strategies I was already using in the course, rather than trying to reinvent the wheel. I looked at some of the in-class activities I normally used and designed assessment instruments around those activities, rather than vice versa. The multiple-choice pre- and post-test was a relatively straightforward assessment, but it did not capture the bulk of what I did in class. The document-based questions, which I use both to expose students more directly to the past and to increase reading comprehension and critical analysis, lent themselves well to a pre- and post-test format. Instructors in other courses could try this with essay assignments, research papers, class presentations, or any graded activity that is used more than once in the course. The trick is to develop a rubric that measures progress quantitatively as well as qualitatively and to make sure to provide students direct feedback on the activities. I will be teaching another PD course on developing rubrics between the fall and spring semesters and at the end of spring 2014.

Using Progressive Analysis in a Writing Intensive Course

Rhonda Davis, Esq.

This course is the advanced level of Legal Research and Writing and is Writing Enriched. Accordingly, students are required to prepare four major writing assignments (each approximately 10 typed pages) to successfully complete the course. Each assignment builds upon the previous one. Students are provided a fact pattern in which a civil complaint is eventually filed, and office and court documents are submitted as if the student were a paralegal; the instructor serves as the supervising attorney.

  • The first assignment requires research regarding the civil action (irrespective of which party the firm represents) with a critical analysis of the primary authority governing the matter. 
  • The second assignment requires an objective analysis of the advantages both parties possess based upon the law.
  • The third assignment requires preparing a subjective argument to the court, urging either granting/denying a trial court motion.
  • The final assignment requires submission of an additional subjective argument about why an appeal from the trial court should be granted or denied.

This progressive analysis of advanced legal research and writing connects to the Penn College definition of assessment in that it is “systematic, iterative, collaborative, documented, and adaptable. It applies multiple measures, both qualitative and quantitative. It identifies strengths and areas that warrant improvement.”

Rubrics for Dual Use

Rubrics are provided to the student prior to each assignment so that s/he understands both the quantitative and qualitative analyses of her/his work. The rubric provides the framework for each piece of writing; thus, the student can organize the writing using the rubric first as a guide and then as an instrument of individual assessment. The third and fourth writing assignments comport with the Pennsylvania Rules of Civil and Appellate Procedure OR the federal rules of such (depending on the fact pattern).

A student may use his/her written products as evidence of the drafting of legal documents to portray to prospective employers the student’s worth as an exemplary paralegal.


Course Outcomes

Upon successful completion of this course the student will be able to:

  • Understand primary sources of law and authority in legal analysis
  • Use proper citation
  • Apply and analyze case and statutory law
  • Validate research
  • Write case briefs and complex legal memoranda proficiently
  • Analyze data to identify complex questions in legal research and writing
  • Apply existing principles to the solution of such complex questions
  • Make intelligent decisions when there are no pre-existing principles to govern the exact questions that are involved
  • Analyze data to prepare an appellate brief.

Using a Rubric to Guide Capstone Presentation

Mary Jo Saxe

Assessment of the oral presentations in the Dental Hygiene Capstone course (DEN 495) is more objective and consistent with the use of a rubric. The rubric provides straightforward criteria, performance expectations, and grading standards, all supplied in advance of the presentation. Students have access to the grading criteria at the start of the semester, which allows them to assess the quality of their work prior to the presentation.

The evaluation utilizes a five-point Likert scale and attaches a weight of 1 or 2 to each element. Key component areas such as planning, implementation, and assessment carry the higher weight due to their significance in the capstone project. Other areas essential to a strong presentation are included as elements in the rubric but carry a weight of 1 in the calculation. Points earned are divided by the 75 possible points to calculate a final percentage grade.
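The weighted calculation can be sketched in code. This is an illustrative sketch, not an official scoring tool: the example item scores are hypothetical, and the weights follow the pattern stated above (weight 2 for the planning, implementation, assessment, and discussion items; weight 1 for the rest), which yields the 75-point maximum.

```python
# Weighted Likert rubric scoring for a capstone presentation (sketch).
# Each of the eleven items is scored 0-5; items 3-6 (planning, design,
# implementation, outcomes, discussion) carry a weight of 2, the rest 1.

WEIGHTS = [1, 1, 2, 2, 2, 2, 1, 1, 1, 1, 1]
MAX_POINTS = 5 * sum(WEIGHTS)  # 5 * 15 = 75 possible points

def final_grade(scores):
    """Return the final percentage grade for a list of eleven item scores."""
    if len(scores) != len(WEIGHTS):
        raise ValueError("expected one score per rubric item")
    earned = sum(s * w for s, w in zip(scores, WEIGHTS))
    return 100 * earned / MAX_POINTS

# Hypothetical presentation: strong on the core phases, weaker on rehearsal.
example = [5, 4, 5, 5, 4, 4, 5, 3, 4, 5, 5]
print(f"{final_grade(example):.1f}%")  # prints "89.3%"
```

Because the weights sum to 15 and each item tops out at 5 points, the 75-point denominator falls directly out of the rubric's structure.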

Initially, the evaluation form was used by all members of the audience. This changed as more students opted to submit their presentations via DVD. Currently, just the class instructor uses the rubric, which provides a single grade for the presentation.


The rubric directs both the content and the delivery of the capstone report. Knowing the expectations at the start of the semester facilitates students' topic choice as well as the organization and delivery of the presentation.

DEN 495 Capstone Oral Presentations

Student Name: _________________________________________

Date: __________________

Rating scale:

  • Above Average
  • Very Good
  • Below Average
  • Needs Improvement
  • Did not attempt or complete

1. The student clearly explains the rationale for their project.

5 4 3 2 1 0 _____ × 1 = _____

2. The student discusses the assessment phase of their project.

5 4 3 2 1 0 _____ × 1 = _____

3. The student discusses the planning and design phase of their project.

5 4 3 2 1 0 _____ × 2 = _____

4. The student describes the implementation stage of their project.

5 4 3 2 1 0 _____ × 2 = _____

5. The student identifies the assessment tool utilized for outcomes assessment and describes the outcome(s) of the project.

5 4 3 2 1 0 _____ × 2 = _____

6. The student discusses factors that influenced the project: the hurdles, challenges, and rewards.

5 4 3 2 1 0 _____ × 2 = _____

7. The student uses professional terminology/correct spelling.

5 4 3 2 1 0 _____ × 1 = _____

8. The presentation is well organized and well rehearsed.

5 4 3 2 1 0 _____ × 1 = _____

9. The student speaks clearly/ audibly, with a good rate of speech.

5 4 3 2 1 0 _____ × 1 = _____

10. Quality visual aids are utilized (slides, PowerPoint, DVD).

5 4 3 2 1 0 _____ × 1 = _____

11. The student utilizes the 20-minute time frame.

5 4 3 2 1 0 _____ × 1 = _____

Evaluated by: _________________________________________

Final Score: __________________

Multi-level Critique of Landscape Design

Carl Bower

This assessment activity works toward making the “subjective” more concrete while providing students a three-tiered response to their projects. The project requires students to create a landscape design that satisfies the client’s expectations while also adhering to the principles of a thorough and successful proposal.

The instructor creates a rubric that encompasses both needs. The rubric is applied first by the student-designer as a self-assessment, with his/her own grade attached; it is then applied by a group of three or four peers who collectively study the proposal and arrive at a group grade. Finally, it comes to the professor, who determines the “official” grade, which typically mirrors the first two.

This approach develops the students’ ability to appraise their own work critically and to honor client preferences/expectations (rather than their own), introducing the students to a level of professionalism necessary in their field.


The assessment identified the need to provide additional instruction regarding design elements and overall functionality to move students beyond superficial evaluations. A lesson was also added to assist students in articulating their concepts.

Teamwork Survey for Industrial Project Management (Plastics)

John Bartolomucci

The faculty and the advisory committee recognized a weakness in the major, specifically, students’ performance in a team/work group environment. To address this weakness, the faculty undertook a research project that, first, required students to respond to a survey instrument following their completion of the first team project. The survey results revealed the key roadblocks to effective teamwork. In response, the plastics faculty recommended that instructional strategies be embedded in the course, which required an adjustment to the lab-lecture hours. When the revised course is taught, the teamwork survey will function as a pre/post assessment.


This project yielded a formal report that connects the study to the College goals, to the academic school’s mission and goals, and to the program and course goals. The survey and resulting action demonstrate the inverted pyramid approach to assessment and stand as an example of “closing the loop.” The continued use of the survey instrument will ensure ongoing attention to the development of teamwork skills.

Business Management Capstone Assessment

Gerald (Chip) Baumgardner

A four-pronged assessment is built into the capstone and combines a commercially-prepared test with other measures:

  • Application of the Major Field Test in Business (an ETS instrument) offers norming as well as individual student performance information. The data enable the department to identify the strengths/weaknesses of its curricular approach and provide students with information about their cumulative mastery of the material.
  • A rubric, mirroring program objectives, allows for individual assessment of each capstone presentation.
  • Student Satisfaction Survey (initiated in 2002, thus providing longitudinal data) serves the faculty’s assessment review as well as the two agencies accrediting the major.
  • The Business Strategy Game, an online simulation with 3,000 teams participating, provides Penn College students with another norming opportunity; moreover, the simulation promotes further development of technology and communication skills.


An analysis of student performance identified a need for a curriculum revision, adding a course in quantitative methods. In addition, faculty placed a stronger emphasis on oral communication skills within courses and adjusted the rubric for MGT 497, Business Policy and Strategy.

Four-Semester Research Project Culminating in Automotive Management Capstone

Ron Garner

Students are introduced to the methodology required to complete a major research project, one that spans four semesters of work and integrates the program goals with the goals of the baccalaureate core curriculum. From research proposal and précis through data collection to the completed report culminating in recommendations, the students’ theses incorporate statistics, research and writing skills, and technical/management skills related to the automotive industry. Students maintain a notebook of all related correspondence, drafts, and reference materials, which hones their organizational skills.


Additional content on research/credibility of sources in the field has been integrated within the courses; the program also underwent a minor curriculum change to alter the curriculum sequence.

Pre- and Post-Testing in a Plastics Course

Kirk Cantor

Pre- and post-testing of students allows faculty to measure student progress as well as the effectiveness of teaching methodology. The data indicate student achievement from the start through the completion of the course, while also identifying topics/concepts that require additional focus. Four years of data have been collected and analyzed via this pre/post-test approach in an extrusion course; the data revealed a lower-than-anticipated improvement in one topic area.
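A minimal sketch of how such pre/post data can flag a weak topic area follows; the topic names and all scores here are hypothetical and are not taken from the extrusion course data.

```python
# Sketch: average pre-to-post improvement per topic area.
# Each student's test is scored by topic; the gap between the course-start
# and course-end averages flags topics that need additional focus.

# Hypothetical topic scores (percent correct) for a small cohort.
pre = {"dies": [40, 55, 35], "screws": [50, 45, 60], "cooling": [30, 25, 40]}
post = {"dies": [85, 90, 80], "screws": [75, 70, 85], "cooling": [45, 40, 55]}

def mean(xs):
    return sum(xs) / len(xs)

def improvement_by_topic(pre, post):
    """Return the post-minus-pre average score change for each topic."""
    return {topic: mean(post[topic]) - mean(pre[topic]) for topic in pre}

gains = improvement_by_topic(pre, post)
# List topics from least to most improved, surfacing the area of concern.
for topic, gain in sorted(gains.items(), key=lambda kv: kv[1]):
    print(f"{topic}: {gain:+.1f} points")
```

Sorting by gain surfaces the lowest-improvement topic first, which is the kind of signal that can prompt targeted course modifications.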


In response to the data indicators, the course was modified on two levels: specific objectives are now identified for each class meeting to help students focus on the major points, and the course content was expanded to address the topic area of concern. The pre/post assessment tool will be used in subsequent years to determine the success of these modifications.