
3.3.1 Institutional Effectiveness


The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas (Institutional Effectiveness):

3.3.1.1 educational programs, to include student learning outcomes


 
Judgment of Compliance
Compliant
Narrative

Educational programs at Sam Houston State University (SHSU) identify expected outcomes, assess the extent to which they achieve those outcomes, and provide evidence of improvement based on the analysis of their assessment results.  This narrative features specific examples of the outcomes assessment process utilized by educational units, including online and hybrid units, at SHSU.  These initial examples are expanded upon in more extensive, college-specific documents highlighting completed unit-level assessment plans from the past three assessment cycles (i.e., 2011-2012, 2012-2013, 2013-2014).  In addition, reviewers may access all archived academic unit assessment plans for these cycles through a password-protected link within SHSU’s reaffirmation website.  Instructions for accessing this repository are available on the Instruction Sheet.  Furthermore, this narrative will highlight additional steps taken by SHSU to ensure compliance with the guidelines, recommendations, and requirements of the Southern Association of Colleges and Schools Commission on Colleges regarding Comprehensive Standard 3.3.1.1. 

Institutional assessment at SHSU is overseen by the Office of Academic Planning and Assessment (OAPA) [1].  Within OAPA, the Director of Assessment oversees unit entries into the campus’s online assessment management system, the Online Assessment Tracking Database (OATDB) [2].  Furthermore, OAPA staff members provide training, resources, and support to all units across campus conducting annual assessment.  Examples of training presentations [3] and assessment resources [4], [5], [6] are provided as part of this narrative. 

SHSU utilizes a 13-month annual assessment cycle for all units, which runs from September of one year to October of the following year.  This cycle roughly aligns with the university’s academic calendar and gives units the flexibility to collect data from all academic semesters (i.e., fall, spring, and summer), while still giving units that rely on end-of-fiscal-year data time to analyze their results and develop actions for improvement.  OAPA staff members monitor unit entries throughout the year to ensure participation in the ongoing assessment cycle. 

Annual Assessment Plan Elements

Units at SHSU utilize the OATDB to document their ongoing assessment plans and reports, and are asked to provide the following elements, defined below:

Goals 

Goals are broad statements of mission or purpose that serve as guiding principles for a unit.  By their nature, goals are not necessarily measurable. 

Objectives

Objectives are specific statements of intent or purpose that a unit expects to achieve. Objectives are measurable and aligned with a unit’s goals.  Both learning and performance objectives may be used by a unit, as appropriate.

Learning Objective

Learning objectives are the expected knowledge or skills someone should gain as a result of receiving instruction or training.

Performance Objective 

Performance objectives are the expected attainment of non-learning tasks (e.g., satisfaction with service, attendance/participation levels, student recruitment and enrollment, general administrative functions).

Indicators (For Learning Objectives) 

Indicators are the instruments, processes, or evidence, both direct and indirect, used by a unit to assess a learning objective.

Criterion (For Learning Objectives) 

Criteria are utilized with indicators to assess learning objectives. A criterion is the level of expected attainment or performance for an objective.

Key Performance Indicators (KPIs; For Performance Objectives) 

KPIs are the instruments, processes, or evidence, both direct and indirect, used by a unit to assess a performance objective.

Findings/Results 

Findings or Results are the data gathered from the unit’s assessment measures.

Actions 

Actions are the next steps to be taken by a unit in response to specific assessment Findings or Results.

Plan for Continuous Improvement Elements (Two Parts) 

The Plan for Continuous Improvement Elements were introduced during the 2012-2013 assessment cycle and replaced an older assessment element (i.e., Closing the Loops Summary).

Part One: Update to the previous cycle’s Plan for Continuous Improvement.  Units report the progress made towards completing the action items identified within their previous cycle’s plan for continuous improvement. 

Part Two: New Plan for Continuous Improvement.  Units are asked to provide a narrative describing all action items they will implement based on their data, resources needed, personnel involved, etc.  Units are given a chance, in the next assessment cycle, to reflect upon what elements of their plan were completed and if they were successful.

Examples of Annual Outcomes Assessment

Specific examples from each academic college for the 2013-2014 assessment cycle are highlighted here, in detail, to demonstrate how educational units at SHSU define outcomes, measure outcomes, collect data regarding outcome attainment, and utilize data for continuous improvement. 

College of Business Administration - Marketing BBA (2013-2014 cycle) [7]

The Marketing BBA program has five defined student learning objectives that are assessed on a rotating basis.  During the 2013-2014 assessment cycle, the Marketing BBA focused their assessment efforts on their students’ abilities to “describe the core concepts and principles of Marketing, including the elements of marketing mix.”  These core concepts and principles were further defined by Marketing BBA in a document entitled “15 Key Marketing Concepts” [8].

To assess this learning objective, the Marketing BBA program utilized embedded exam questions across four separate exams.  A total of 499 students enrolled in nine different sections of Marketing 3310 (Principles of Marketing) were evaluated.  It was expected that students would score 75% or better on the embedded exam questions.  Marketing BBA compiled their assessment results into a detailed interim report [9], which they provided as part of their assessment documentation.  In examining their assessment results, Marketing BBA faculty identified two areas of weakness for marketing students: (a) understanding the steps in the segmentation, targeting, and positioning process, and (b) understanding basic distribution strategies (intensive, selective, and exclusive). 

In response to these assessment results, Marketing BBA faculty met and devised strategies for improvement.  First, in order to address identified student weaknesses with understanding the steps in the segmentation, targeting, and positioning process, the faculty developed a case study to supplement lectures and existing assignments.  The Marketing BBA faculty noted that, although in-class examples were provided, they hoped that the out-of-class case study exercise would help reinforce what the students gained through lecture.  Second, in order to address identified student weaknesses with understanding basic distribution strategies, the faculty developed a handout to supplement lectures that described the relationship between distribution strategies and consumer goods classification schemes.

College of Criminal Justice - Criminal Justice PhD (2013-2014 cycle) [10]

For the 2013-2014 assessment cycle, the Criminal Justice PhD program defined two objectives.  First, that students completing the PhD program would be able to demonstrate the necessary tools and knowledge to produce an empirically-based research manuscript ready for submission to a peer-reviewed academic journal.  Second, that advanced doctoral students would develop and demonstrate their ability to serve as high-quality classroom instructors.

To assess these objectives, Criminal Justice PhD faculty utilized two assessment measures.  First, to evaluate the quality of student writing, students were required to submit and defend portfolios consisting of self-selected writing artifacts.  These portfolios would then be evaluated by a faculty committee using a locally developed rubric [11], with the expectation that students would receive scores of “Pass” or “High Pass.”  To evaluate the students’ abilities to serve as classroom instructors, Criminal Justice PhD utilized a combination of scores from the IDEA Faculty Evaluation System [12] and a locally developed Doctoral Teaching Rubric [13].  Criminal Justice PhD faculty expected that all doctoral teaching fellows would score 4 or higher on the IDEA summary evaluations, and that they would average 80% or higher using the locally developed Doctoral Teaching Rubric.

Data collected for Criminal Justice PhD students indicated that 100% of students scored either a “Pass” or “High Pass,” prompting the faculty to examine ways to modify their portfolio grading rubric in order to better articulate and assess the desired outcomes.  When the faculty looked at the doctoral fellows’ teaching scores, they noted that, although 100% of the students received scores of 80% or higher using the locally developed Doctoral Teaching Rubric, only 66% and 69% of students received IDEA scores of 4 or higher for the fall and spring semesters, respectively.  Based on these results, the faculty have implemented curriculum revisions designed to provide students with additional training and practice with teaching pedagogy for both traditional classroom and online environments.  Additionally, the faculty designed a new rubric and feedback form that will be used by a newly developed Graduate Student Development and Assessment Committee to evaluate student teaching and provide students with detailed feedback regarding ways they can improve their teaching effectiveness.

College of Education - Counseling MED (School Counselor) (2013-2014 cycle) [14]

For the 2013-2014 assessment cycle, the Counseling MED (School Counselor) program defined three objectives: (a) program graduates should be able to demonstrate an understanding of ways to evaluate, create, and maintain a positive school environment in which diversity is acknowledged and respected; (b) program graduates should be able to develop and utilize needs assessments for school children and develop proactive and preventative Closing the Gap plans based on their results; and (c) program graduates should demonstrate competencies as a professional researcher.

To assess these objectives, Counseling MED (School Counselor) utilized three assessment measures.  First, to assess whether students had developed the knowledge and skills necessary to evaluate diversity within the school environment, Counseling MED utilized a School-wide Cultural Competency Observation Checklist (SCCOC) [15], with the expectation that students would be able to respond to and interpret all checklist items, report on the results, and make recommendations based on the gathered data by the midpoint of the semester.  Second, to measure student ability to complete needs assessments within the schools, Counseling MED (School Counselor) students were expected to develop school-wide needs assessments, with the expectation that the assessments and their associated activities would be successfully completed by 90% of the schools’ teachers.  Finally, to analyze Counseling MED (School Counselor) students’ research abilities, faculty utilized a locally developed research project grading rubric [16].

Data collected from Counseling MED (School Counselor) students indicated that the SCCOC timeline did not allow students enough time to complete and implement their recommendations.  When examining the results from their students’ development and implementation of Closing the Gap plans, faculty also determined that the completion rate was only roughly 80%.  Finally, although students did complete and present their research projects, they did not share their papers with school personnel and therefore did not fully complete the objective.

The Counseling MED (School Counselor) faculty implemented several changes based upon these assessment results.  First, in response to students not having sufficient time during the semester to complete and implement the SCCOC checklists, faculty modified the assignment to give students the entire semester to complete the SCCOC checklist and develop their recommendations.  Future potential changes currently under review include conducting the assignment over two semesters or modifying it to omit a second administration of the SCCOC.  In response to the Closing the Gap completion rates, program faculty determined that their criterion of 90% was too high; at the same time, the program faculty recognized a need to continue to improve the rate of successful completion of the Closing the Gap plans.  Therefore, the faculty determined it would be more appropriate to lower the criterion to 85% for future semesters, a target that still calls for a 5% improvement over the observed completion rate.  Finally, faculty determined that there was a need to reinforce all required elements of the student research assignments to ensure that students were presenting their research papers to school personnel. 

College of Fine Arts and Mass Communication - Art BA (Photography) (2013-2014 cycle) [17]

For the 2013-2014 assessment cycle, the Art BA (Photography) program defined two learning objectives.  First, that students would demonstrate proficiency in the creation, manipulation, and printing of digital images.  Second, that students would have an understanding of the history of photography; would be familiar with contemporary theories, trends, and practices in photography; would be able to contextualize ideas in the continuum of this history; and would understand the relation of their own work to photography that had preceded it.

To assess these objectives, Art BA (Photography) faculty utilized two assessment measures.  First, to assess student proficiency in the creation, manipulation, and printing of digital images, faculty utilized student portfolios and evaluated them using a rubric [18], with the expectation that 70% of student portfolios would score 80% or higher on each rubric domain.  Second, to assess student understanding of the history of photography, faculty used student essays scored with a rubric [19], with the expectation that 75% of students would score 85% or higher in each domain when scored by two faculty members. 

Using these assessment measures, faculty determined that students were struggling with several aspects of the learning objectives.  First, with regards to students’ abilities to demonstrate digital imaging skills, the faculty agreed that digital manipulation and concept development were both particular weaknesses.  Second, with regards to the students’ ability to articulate knowledge of the history of photography, faculty determined that students needed improvement in their use of professional vocabulary and in their ability to formulate conclusions, and that they were generally weak in their grasp of scholarly context and the history of “art” photography. 

In response to these assessment findings, Art BA (Photography) faculty took several specific actions for improvement.  First, in response to weaknesses seen with students’ abilities to demonstrate digital imaging skills, the faculty determined that there was a need to correct pedagogical inconsistencies within sections of ARTS 2370 and ARTS 3370 to ensure that all learning objectives were being implemented by all professors teaching those courses.  To accomplish this, the program coordinator would meet with the faculty to reemphasize required learning objectives and to help the faculty develop effective pedagogic strategies for emphasizing these objectives in the classroom.  In response to the weaknesses identified with regards to students’ knowledge of the history of photography, the department decided to reassign ARTS 3381 (The History of Photography) to a faculty member with a specific background in art history.

College of Health Sciences - Nursing BSN (2013-2014 cycle) [20]

For the 2013-2014 assessment cycle, the Nursing BSN program defined three learning objectives.  First, that nursing students would achieve mastery of a specialty content area (e.g., medical-surgery, obstetrics, psych-mental health, community, fundamentals, health assessment).  Second, that nursing students would achieve mastery of all nursing content prior to graduation.  Third, that graduates from SHSU would pass the National Council Licensure Examination (NCLEX).

To assess these objectives, Nursing BSN faculty used three assessment measures.  To examine student mastery of specialty content areas, the faculty utilized student scores on the Assessment Technologies Institute (ATI) Specialty Examinations, with the expectation that 60% of the student cohorts taking specialty examinations would average a score of 2 (proficient) or higher.  To assess student mastery of all required nursing content by graduation, Nursing BSN faculty relied upon the ATI Exit Examination, with the expectation that 60% of students would score higher than the national baseline mean.  Finally, to determine student passage of the NCLEX, the faculty used first-attempt pass rates, with the expectation that 80% of students would pass on their first attempt.

Using these assessment measures, Nursing BSN faculty found several areas for improvement amongst their students.  First, with regards to the objective of student mastery of specialty content areas, only three of the 16 cohorts (18.75%) successfully achieved the criterion of 60% of students within the cohort scoring 2 or higher on their exams.  Similar weaknesses were observed with regards to student mastery of all nursing content, with only 20% of students scoring above the national mean on the ATI Exit Examination.  Finally, the first-time pass rate for students on the NCLEX was 78.26%, just below the 80% criterion. 

Given these assessment findings, Nursing BSN faculty developed several actions for improvement.  First, the faculty will work to incorporate ATI Specialty Exams into nearly every nursing course in order to provide their students with extra practice.  Furthermore, faculty will develop policies and procedures that will ensure that any student scoring less than a 2 on any exam will be given remediation hours before they take a re-test over the same material.  To ensure that students take the specialty exams seriously, the exams will also be counted as 5% of a course’s grade.  Furthermore, each student entering the program will be assigned a faculty mentor, who will follow the student’s progress throughout the program and provide assistance with study and testing skills as necessary.  Nursing BSN faculty determined that similar steps were necessary with students taking the ATI Exit Examination.  As with those students who scored unsatisfactorily on the ATI Specialty Exams, students who struggled with the Exit Examination would be required to take mandatory remediation with either their instructor or nursing resource coach prior to re-testing.

Finally, Nursing BSN faculty devised several actions to address the weaknesses with the NCLEX pass rate.  First, they will review the ATI Exit Examination results for students to identify areas of weakness that should be addressed through curricular changes.  Second, they increased the number of proctored ATI Exit Examinations from two to three, with students scoring below the mean receiving mandatory remediation before re-testing.  Third, they have incorporated a review day towards the end of each semester in which faculty from each specialty area visit classes and review important content from their specializations.  Fourth, each course will add more NCLEX-style questions to its quizzes and exams.  Finally, Nursing BSN will continue a three-day ATI review. 

College of Humanities and Social Sciences - Philosophy BA (2013-2014 cycle) [21]

For the 2013-2014 assessment cycle, the Philosophy BA program defined three learning objectives and one performance objective.  The learning objectives were as follows: (a) students would demonstrate an ability to think critically, including the ability to analyze arguments and draw conclusions from available information; (b) students would demonstrate a basic understanding of metaphysics, epistemology, and moral theory; and (c) students would demonstrate an understanding of the history of philosophy.  The performance objective identified by Philosophy BA centered on plans to implement revised versions of Philosophy 3362 for the 2013-2014 cycle and to develop new assessment measures to be integrated into the course. 

To assess these objectives, Philosophy BA faculty utilized multiple indicators and KPIs.  To measure students’ critical thinking ability, Philosophy BA faculty used the Texas Assessment of Critical Thinking Skills (TACTS) instrument [22], with the expectation that students would demonstrate statistically significant improvement from pre-test to post-test.  Furthermore, Philosophy BA faculty also expected students to show at least a 75% improvement on questions regarding decision-making when an outcome depends upon the conjunction of two probabilistic events. 

To assess student understanding of basic philosophical concepts, Philosophy BA faculty utilized three separate indicators: (a) locally standardized pre- and post-tests of student knowledge relating to metaphysics, epistemology, and moral theory [23], [24]; (b) specific embedded questions from the pre- and post-tests, selected by the faculty to demonstrate student knowledge of the philosophical theories of Immanuel Kant; and (c) specific embedded questions from the pre- and post-tests, selected by the faculty to demonstrate student understanding of arguments related to the death penalty.  The expectations were that students would demonstrate statistically significant improvement from the pre-test to the post-test and, specifically, that students would show a 75% improvement in performance on the embedded questions relating to Immanuel Kant and the death penalty.

To analyze students’ understanding of the history of philosophy, Philosophy BA faculty used locally selected pre- and post-tests administered within PHIL 3364 (Ancient and Medieval Philosophy) and PHIL 3365 (Modern Philosophy).  The expectation was that students in each course would demonstrate statistically significant improvement in their scores from the pre-test to the post-test. 

Finally, to measure students’ attainment of the performance objective, the Philosophy BA program utilized two KPIs.  The first was the successful implementation and delivery of the revised curriculum for PHIL 3362 during the fall 2013 and spring 2014 semesters.  The second was the successful development and finalization of an assessment protocol for the newly revised curriculum.

Using these assessment measures, Philosophy BA faculty were able to collect a wide range of assessment data relating to their objectives.  When the faculty examined student results on the pre- to post-tests, they determined that students displayed statistically significant gains on all exams [25], [26], [27], [28], [29].  Furthermore, they determined that students also demonstrated gains greater than 75% for all embedded questions.  In addition, the new curriculum for PHIL 3362 was fully implemented, although personnel changes postponed the finalization of new assessment measures for the course until the 2014-2015 assessment cycle.

Despite these successes, Philosophy BA faculty did identify areas for improvement.  Although students demonstrated significant gains from pre- to post-test on all desired learning objectives, the post-test means were often lower than desired or expected, ranging from 42.5% to 61%.  Given these results, Philosophy BA faculty have begun exploring ways in which they can improve content and pedagogy.  Furthermore, the Philosophy BA faculty also worked to revise existing assessment measures, such as the pre- and post-test for PHIL 2306, to address faculty and student concerns regarding the formulation of certain questions.  Finally, the faculty also decided to begin investigating additional outcomes and measures that extend beyond statistically significant improvement on pre- and post-tests.
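The criterion of statistically significant pre- to post-test improvement used by the Philosophy BA program can be verified with a paired-samples test.  The sketch below is illustrative only: it assumes a paired t-test at an alpha of .05, which this narrative does not confirm as the program’s actual method, and the scores shown are hypothetical.

    # Minimal sketch of a pre-/post-test significance check.
    # Assumption: a paired-samples t-test; the narrative does not specify
    # the test actually used, and these scores are hypothetical.
    from scipy import stats

    pre = [38, 45, 41, 50, 36, 44, 48, 39, 42, 47]    # hypothetical pre-test scores (%)
    post = [52, 60, 55, 63, 49, 58, 66, 50, 57, 61]   # hypothetical post-test scores (%)

    t_stat, p_value = stats.ttest_rel(post, pre)      # paired-samples t-test
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Statistically significant pre- to post-test improvement")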

College of Sciences - Agriculture BS (Animal Science) (2013-2014 cycle) [30]

For the 2013-2014 assessment cycle, the Agriculture BS (Animal Science) program defined two learning objectives.  First, that students would be able to demonstrate competency in the key areas of knowledge of animal science.  Second, that students would also demonstrate the skills necessary to compete within the professional marketplace. 

To assess attainment of these objectives, Agriculture BS (Animal Science) faculty utilized two assessment measures.  First, to assess student knowledge of key areas of animal science, they utilized a locally developed rubric [31] to score 15 randomly selected student assignments from AGRI 3373, an upper-level course in which each of these key areas is addressed.  The expectation was that at least 70% of students would score 3 or higher on the 5-point rubric.  The second assessment measure was also a locally developed rubric [32], utilized to evaluate student professional portfolios.  These portfolios, which included a professional cover letter, a résumé, a reference page, and a mock job application, were developed by the students as part of AGRI 4120, a senior-level capstone course.  Because of past student success with this objective, the Agriculture BS (Animal Science) faculty expected that at least 70% of students would score a 3.5 or higher.

Using these assessment measures, Agriculture BS (Animal Science) faculty were able to collect assessment results for use in improvement of student learning.  First, when the faculty examined student artifacts for demonstration of knowledge of key areas of animal science, they determined that only 66% of students achieved a score of 3 or higher. In particular, they noted that weaker documents either lacked evidence of scientific knowledge or did not demonstrate application of knowledge, with those areas scoring a 2.9 and 2.6, respectively.  When the Agriculture BS (Animal Science) faculty examined student professional skills, they determined that the average score was a 3.2 (below the desired average of 3.5), and issues with grammar, reference pages, and letters of reference needed attention. 

Using these assessment results, the Agriculture BS (Animal Science) faculty devised several actions for improvement.  First, faculty identified and planned pedagogical changes within AGRI 3373 to provide alternative methods of delivering scientific information to students (e.g., lecture outlines, course packets), with the expectation that students will be able to assimilate important scientific information more successfully.  Furthermore, in response to the observed weaknesses in the students’ professional portfolios, the faculty determined it was necessary to further emphasize professional writing skills.

Example Assessment Plans and Reports

As further demonstration of ongoing quality assessment practices at SHSU, this narrative includes documents containing example assessment plans for the last three complete assessment cycles:

Table 1. Example Assessment Plans

College                                       2011-2012    2012-2013    2013-2014
College of Business Administration            [33]         [34]         [35]
College of Criminal Justice                   [36]         [37]         [38]
College of Education                          [39]         [40]         [41]
College of Fine Arts & Mass Communication     [42]         [43]         [44]
College of Health Sciences                    NA*          NA*          [45]
College of Humanities and Social Sciences     [46]         [47]         [48]
College of Sciences                           [49]         [50]         [51]

*The College of Health Sciences was created at the beginning of the 2013-2014 academic year.

The following scheme was used to select the highlighted units: For departments containing two or fewer units, one was selected for inclusion; for departments of three to four units, two were selected; for departments of five to six units, three were selected; and for departments of seven or more units, four were selected.  This selection scheme provided roughly 50% of academic assessment plans from each given cycle.  A complete list of all academic unit assessment plans for these cycles is included as part of the 5th Year Report Website.  Instructions for accessing this repository are available on the Instruction Sheet. 
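For illustration only, the selection rule above can be expressed as a simple function; this is a hypothetical restatement of the scheme, not code used by SHSU or the OATDB.

    # Hypothetical restatement of the sampling scheme described above:
    # the number of units highlighted depends on the department's size.
    def units_to_highlight(department_size: int) -> int:
        if department_size <= 2:
            return 1
        elif department_size <= 4:
            return 2
        elif department_size <= 6:
            return 3
        else:
            return 4

    # Example: a department with five units contributes three example plans.
    print(units_to_highlight(5))  # prints 3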

Distance Education

For the purposes of programmatic assessment, distance education programs (i.e., programs in which students can earn 50% or more of their credit online or through distance learning modalities) at SHSU are classified as one of two types: (a) fully online programs, in which students can earn a degree only through online or distance education formats; and (b) hybrid programs, in which students can earn 50% or more of a degree through online or distance education formats but some (or all) of the degree may also be offered through traditional face-to-face modalities.  The SHSU Online website [52] maintains a listing of these programs, as well as general program information. 

Distance education programs conduct and document their annual assessment efforts in the same manner as their traditional, face-to-face counterparts.  As theory and practice regarding distance education assessment have evolved, OAPA has endeavored to create guidelines for these programs that align with generally recognized best practices.  These guidelines are summarized within the “Best Practices for Documenting Assessment of Online and Distance Education Programs” document [53].  This document, developed during the spring 2014 semester, provides a summary of the recommendations and guidelines outlined within several SACSCOC documents: (a) Best Practices for Electronically Offered Degree and Certificate Programs [54], (b) Distance and Correspondence Education: Policy Statement [55], and (c) Guidelines for Addressing Distance and Correspondence Education: A Guide for Evaluators Charged with Reviewing Distance and Correspondence Education [56].

Beginning with the 2013-2014 assessment cycle, SHSU refocused its approach for documenting the assessment of online programs, especially for those programs that employ a hybrid model, in which students can complete a degree through either distance education or face-to-face modalities.  As is highlighted within the “Best Practices for Documenting Assessment of Online and Distance Education Programs” document, these hybrid programs are prompted to disaggregate assessment results for online and face-to-face students and to use the results from both groups in the formulation of their actions for improvement.  Specifically, this document states that:

When demonstrating the effectiveness of hybrid programs, it is considered best practice to compare the results from face-to-face and distance education students for all shared learning objectives.  The purpose is to ensure that both groups are demonstrating similar levels of attainment of the desired learning objectives.  There is not an expectation that both groups’ results be identical; however, there is an expectation they be similar.  If significant variations between the two groups are identified, then reasons for these variations must be documented and plans of action provided for addressing the gaps [57].

Furthermore, hybrid programs are asked to:

…clearly demonstrate how Findings are used to improve upon the educational experience for all students within the program, regardless of delivery method, and not just one group to the exclusion of the other.  These Actions should be driven by the Findings for both distance education and face-to-face students.  Areas for improvement found within one or both groups should be addressed within both the program’s Actions and Plan for Continuous Improvement.  As with the other elements, it is important that the connection between Findings, Actions, and Plan for Continuous Improvement be clear to an outside observer [58].
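In practical terms, the disaggregation described in these guidelines amounts to splitting a shared objective’s findings by delivery modality, comparing attainment rates, and flagging notable gaps for documentation and action.  The following minimal sketch illustrates that comparison; the data, the 10% gap threshold, and all names are hypothetical and are not drawn from any SHSU program or system.

    # Hypothetical sketch of disaggregating findings by delivery modality.
    # Each record pairs a modality with whether the student met the criterion.
    results = [
        ("face-to-face", True), ("face-to-face", True), ("face-to-face", False),
        ("online", True), ("online", False), ("online", False), ("online", True),
    ]

    def attainment_rate(modality: str) -> float:
        group = [met for mode, met in results if mode == modality]
        return sum(group) / len(group)

    f2f = attainment_rate("face-to-face")
    online = attainment_rate("online")
    gap = abs(f2f - online)
    print(f"face-to-face: {f2f:.0%}, online: {online:.0%}, gap: {gap:.0%}")
    if gap > 0.10:  # hypothetical threshold for a "significant variation"
        print("Document reasons for the variation and plan actions to address it.")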

As fully online programs lack any face-to-face components, these programs report their assessment results and actions without the need for disaggregation.  Examples of how hybrid programs have utilized disaggregated assessment results to drive improvement are provided below:

History MA (2013-2014) [59]

For the 2013-2014 assessment cycle, the History MA program examined two learning objectives and one performance objective.  First, that students would be able to demonstrate competence in applying research methodologies by conducting qualitative and quantitative analysis, conducting literature reviews, and using traditional and digital resources; this objective was assessed through student written and oral comprehensive exams [60], [61].  Second, that students with an interest in teaching would demonstrate skills in producing digital content, assessed through the successful creation of digital content.  Third, that students would demonstrate satisfaction with the usefulness of their degrees, assessed through an end-of-program exit interview [62].

History MA faculty provided disaggregated assessment data for two of their three objectives (i.e., student knowledge and student satisfaction).  Assessment data for the third objective were gathered from a new digital media project.  Although both online and face-to-face courses were involved with this project, its newness prevented data from being collected in a way that would allow for disaggregation for the 2013-2014 cycle.  Program faculty have already taken steps to ensure that data will be collected for both online and face-to-face students in future cycles for this objective.

For the remaining two objectives, the disaggregation of assessment results between online and face-to-face students provided valuable insights to the History MA faculty.  Results from written and oral comprehensive exams indicated that students generally performed well, regardless of whether they were enrolled in face-to-face or online courses.  Of the 19 students who completed both written and oral comprehensive exams, 18 successfully passed.  Of these 19 students, seven were predominantly face-to-face, while 12 were predominantly online.  All seven of the face-to-face students passed all sections, while 11 of the 12 online students passed.  When faculty examined why the one online student did not pass, they determined that the failure was consistent with that student’s performance within their coursework and was therefore justified.  These data served to confirm previous findings regarding the quality of the program’s graduate students. 

The exit interview, in particular, yielded valuable data regarding the perceptions of the online students.  An examination of the interview results revealed to program faculty that online students desired a greater sense of community within the graduate program than did face-to-face students.  In response, the program has developed initiatives to foster these community connections, including monthly virtual historical and documentary film nights and discussion forums.

General Business Administration, BBA (2013-2014) [63]

For the 2013-2014 assessment cycle, the General Business Administration BBA program assessed three objectives.  First, that students would be able to compose effective business messages.  Second, that students would demonstrate knowledge of the core concepts and principles of business law and of legal environments.  Third, that students would be satisfied with faculty instruction.  For the 2013-2014 assessment cycle, General Business Administration BBA faculty provided disaggregated assessment data for the objective relating to student writing competency. 

Assessment data for the objective of student writing competency were gathered through the use of a faculty-developed writing rubric [64], with the expectation that 80% of sampled student papers would meet or exceed expectations for each rubric domain (i.e., format, content, grammar/mechanics).  Using this rubric, General Business Administration BBA faculty examined writing samples from 11 face-to-face students and from 46 online students [65].  In reviewing their assessment data, the faculty determined that 100% of face-to-face students met or exceeded general expectations on the writing assignment, with 54% of these students performing above expectations.  Although these results were very good, the faculty felt they were higher than normal, speculating that the face-to-face sample might have been too small to be representative.  The faculty determined that results for online students were more in line with historical norms, with roughly 90% of students meeting or exceeding expectations overall (i.e., 61% meeting expectations, 28% exceeding expectations). 

When the faculty examined the individual learning domains for both groups, similar positive results were seen for both groups in the areas of format and content.  However, this examination revealed that both online and face-to-face students were struggling with grammar and mechanics.  A review of the data indicated that 54% of face-to-face and 50% of online students had scored below expectations for this domain. 

In response to these results, program faculty are exploring new pedagogy for grammar and mechanics that can be incorporated into their courses.  Furthermore, faculty also determined a need to require students to utilize the university writing center for most writing assignments in future semesters.  Finally, the faculty are developing online tutorials, accessible to their students, designed to address identified problem areas in student writing [66].

Graduate Program Review

SHSU is committed to placing the responsibility for appropriate curricula and academic excellence on its faculty.  One component of a commitment to excellence is the willingness to be open to critical review from both internal and external sources.  Thus, all graduate programs at SHSU engage in an external review process.  This graduate review process is governed by the Texas Higher Education Coordinating Board (THECB), in accordance with Texas Administrative Code, Rule 5.52, Review of Existing Degree Programs [67], and is overseen at SHSU by the Dean of Graduate Studies [68].

On a rotating, 7-year cycle [69], each graduate program conducts a self-study [70] that addresses the aspects common to all graduate programs as well as the attributes unique to each program.  A self-study is but one tool to guide programs in their continuous improvement efforts in meeting the challenge of serving the needs of students, the university, and external stakeholders.  The graduate program self-studies provide an overview of the programs as well as a detailed study of the curricula, graduate faculty, program resources, assessment, student success, recruitment, and marketing.

The Self-Study Process

The self-study process incorporates three stages: (a) the creation of the self-study [71], [72], (b) an external review [73], [74], and (c) the development of an action plan for improvement [75], [76].  The program faculty and support staff conduct a thorough program review and produce a report with appropriate supporting documentation.  A team of external reviewers reviews the report, visits the campus to consult with program personnel and university administrators, and subsequently provides an evaluation of the program, including program strengths and recommendations for improvement.  University leaders, in coordination with faculty, develop an action plan in response to the results of the self-study and external review.  The process is as transparent and inclusive as possible.  The self-study, the external reviewers’ report, and the response are all submitted to the THECB.

Follow-Up Process

One year following the program review, the program director, chair, academic dean, graduate dean, and Provost meet to discuss the progress made on the action plan which addresses the recommendations for improvement.  Any outstanding issues or barriers to improvement are discussed and addressed.

Meta-Assessment

As part of university-wide efforts to promote quality assessment practices, OAPA also facilitates an annual meta-assessment process.  First piloted in the fall 2013 semester, the meta-assessment process utilizes a locally developed rubric [77] to evaluate the quality of programmatic assessment plans.  The focus of the meta-assessment review is not on what is being assessed by each unit, but rather on how well the assessment is conducted, with emphasis on the assessment practices and processes themselves.

Feedback from the annual meta-assessment reviews is used in two ways.  First, completed meta-assessment rubrics are returned to the individual units for review and use in the continuous improvement of their assessment practices.  Second, data from these reviews are used by university administrators to identify areas where training and resources are needed to improve programmatic assessment efforts.  Examples of the most recently completed meta-assessment rubrics [78], [79], [80], [81], [82], [83] are provided to highlight this process.  In addition, examples of college-level summary reports are provided to highlight how meta-assessment is being used at the college and institutional levels [84], [85], [86], [87], [88], [89].

It should be noted that a meta-assessment review of 2013-2014 assessment plans is currently ongoing; examples of completed rubrics [90] and reports [91], [92], [93] available at the time of writing are provided.  Within these reports, college leaders are asked to reflect upon the strengths and weaknesses they observed in reviewing the completed meta-assessment rubrics for their units, as well as to offer strategies for addressing them.  These reports, along with the completed rubrics, are then used by OAPA to enhance assessment-related training and resources.  Following the success of the initial meta-assessment pilot review of academic unit assessment plans from the 2012-2013 assessment cycle, OAPA is currently working to extend formal meta-assessment processes beyond the Division of Academic Affairs and into the other divisions of the institution.


Supporting Documentation

Documentation Reference
Document Title
[1] Office of Academic Planning and Assessment Website
[2] Online Assessment Tracking Database (OATDB) User Guide
[3] Plan for Continuous Improvement Workshop
[4] Accessing the Online Assessment Tracking Database Guide
[5] Migrating Your Assessment Plan Guide
[6] Assessment FAQs
[7] Marketing BBA OATDB Report for 2013-2014
[8] Marketing BBA – 15 Key Marketing Concepts
[9] Marketing BBA – Assessment and Closing the Loop Overview Interim Report December 2013
[10] Criminal Justice PhD OATDB Report for 2013-2014
[11] Criminal Justice PhD – Portfolio Defense Form
[12] The IDEA Center Homepage
[13] Criminal Justice PhD – Doctoral Teaching Rubric
[14] Counseling MED (School Counselor) OATDB Assessment Report for 2013-2014
[15] Counseling MED (School Counselor) – School-wide Cultural Competency Observation Checklist
[16] Counseling MED (School Counselor) – Research Rubric
[17] Art BA (Photography) OATDB Assessment Report for 2013-2014
[18] Art BA (Photography) – Digital Portfolio Rubric
[19] Art BA (Photography) – Photo History Rubric
[20] Nursing BSN OATDB Assessment Report for 2013-2014
[21] Philosophy BA OATDB Assessment Report for 2013-2014
[22] Philosophy BA – Texas Assessment of Critical Thinking Skills
[23] Philosophy BA – PHIL 2306 Pre- to Post-test
[24] Philosophy BA – PHIL 2361 Pre- to Post-test
[25] Philosophy BA – PHIL 2303 TACTS Data for 2013-2014
[26] Philosophy BA – PHIL 2306 Pre-to-Post Data for 2013-2014
[27] Philosophy BA – PHIL 2361 Pre-to-Post Data for 2013-2014
[28] Philosophy BA – PHIL 3364 Pre-to-Post Data for 2013-2014
[29] Philosophy BA – PHIL 3365 Pre-to-Post Data for 2013-2014
[30] Agriculture BS (Animal Science) – OATDB Assessment Plan for 2013-2014
[31] Agriculture BS (Animal Science) – Animal Science Rubric
[32] Agriculture BS (Animal Science) – Professional Portfolio Rubric
[33] Example Assessment Plans for the College of Business Administration – 2011-2012
[34] Example Assessment Plans for the College of Business Administration – 2012-2013
[35] Example Assessment Plans for the College of Business Administration – 2013-2014
[36] Example Assessment Plans for the College of Criminal Justice – 2011-2012
[37] Example Assessment Plans for the College of Criminal Justice – 2012-2013
[38] Example Assessment Plans for the College of Criminal Justice – 2013-2014
[39] Example Assessment Plans for the College of Education – 2011-2012
[40] Example Assessment Plans for the College of Education – 2012-2013
[41] Example Assessment Plans for the College of Education – 2013-2014
[42] Example Assessment Plans for the College of Fine Arts and Mass Communication – 2011-2012
[43] Example Assessment Plans for the College of Fine Arts and Mass Communication – 2012-2013
[44] Example Assessment Plans for the College of Fine Arts and Mass Communication – 2013-2014
[45] Example Assessment Plans for the College of Health Sciences – 2013-2014
[46] Example Assessment Plans for the College of Humanities and Social Sciences – 2011-2012
[47] Example Assessment Plans for the College of Humanities and Social Sciences – 2012-2013
[48] Example Assessment Plans for the College of Humanities and Social Sciences – 2013-2014
[49] Example Assessment Plans for the College of Sciences – 2011-2012
[50] Example Assessment Plans for the College of Sciences – 2012-2013
[51] Example Assessment Plans for the College of Sciences – 2013-2014
[52] Online Programs and Degrees – SHSU Online Website
[53] Best Practices for Documenting Assessment of Online and Distance Education Programs
[54] Best Practices for Electronically Offered Degree and Certificate Programs
[55] Distance and Correspondence Education Policy Statement
[56] Guidelines for Addressing Distance and Correspondence Education
[57] Best Practices for Documenting Assessment of Online and Distance Education Programs – Data Disaggregation
[58] Best Practices for Documenting Assessment of Online and Distance Education Programs – Use of Data
[59] History MA OATDB Assessment Plan for 2013-2014
[60] History MA Written Comprehensive Exam Rubric
[61] History MA Oral Comprehensive Exam Rubric
[62] History MA Exit Interview
[63] General Business Administration BBA OATDB Assessment Plan for 2013-2014
[64] General Business Administration BBA – Persuasive Writing Rubric
[65] General Business Administration BBA – Writing Assessment Findings
[66] General Business Administration BBA – Writing Assessment Summary
[67] Texas Administrative Code, Chapter 5, Subchapter C, Rule 5.52
[68] Office of Graduate Studies
[69] Graduate Program Review Schedule
[70] Graduate Program Review Self-study Manual
[71] Computing and Information Science Graduate Program Self-study for 2013-2013
[72] Master of Library Science Self-study for 2014
[73] Computing and Information Science Reviewer Report
[74] Master of Library Science Reviewer Report
[75] Computing and Information Science Action Plan
[76] Master of Library Science Program Action Plan
[77] SHSU Meta-assessment Rubric
[78] Finance BBA – 2012-2013 Meta-assessment Review
[79] Criminal Justice PhD – 2012-2013 Meta-assessment Review
[80] Interdisciplinary Studies BA, BS (Elementary EC-6) – 2012-2013 Meta-assessment Review
[81] Mass Communication BA – 2012-2013 Meta-assessment Review
[82] Political Science MA – 2012-2013 Meta-assessment Review
[83] Geography BS – 2012-2013 Meta-assessment Review
[84] Meta-assessment Report – College of Business Administration, 2012-2013
[85] Meta-assessment Report – College of Criminal Justice, 2012-2013
[86] Meta-assessment Report – College of Education, 2012-2013
[87] Meta-assessment Report – College of Fine Arts and Mass Communication, 2012-2013
[88] Meta-assessment Report – College of Humanities and Social Sciences, 2012-2013
[89] Meta-assessment Report – College of Sciences, 2012-2013
[90] Physics BS – 2013-2014 Meta-assessment Review
[91] Meta-assessment Report – College of Business Administration, 2013-2014
[92] Meta-assessment Report – College of Sciences, 2013-2014
[93] Meta-assessment Report – College of Humanities and Social Sciences, 2013-2014

 

