Assessment: 2012-2013: Educational Programs: English MA

1 Goal    3 Objectives    3 Indicators    3 Criteria    3 Findings    3 Actions


GOAL: Knowledge And Skills

Objective  
Demonstrating Critical Thinking, Researching, And Writing Skills: Class Writing
English graduate students will demonstrate their abilities as independent critical thinkers, researchers, and writers capable of employing sophisticated skills in written analysis, synthesis, and evaluation of knowledge and of using a professional idiom in making written arguments. The program's success in achieving this objective will be measured by a holistic assessment of graduate class writing.

Indicator  
Holistic Assessment Of Graduate Writing  
The ability of students to write according to accepted professional standards is a direct indicator of the English MA and MFA programs' success in producing graduates who have acquired appropriate critical thinking, researching, and writing skills and are prepared for future professional endeavors. To that end, a significant amount of student writing is required in English graduate coursework.

To assess the effectiveness of class writing assignments in developing students' ability to make sophisticated arguments about literature, language, and writing disciplines in a critical idiom appropriate to professional standards, the faculty will undertake an annual holistic review of representative graduate student writing produced during the reporting period.
Criterion  
Standards For English Graduate Student Writing  
At least 92% of representative graduate essays evaluated during the holistic assessment will be scored as acceptable or excellent (a combined score of 5 or higher on the scale described below).

A rubric for evaluating graduate student writing is attached.


Assessment Process:

1. To assure that the assessment reviews a representative sampling of writing, graduate professors in both long terms are asked to submit term papers or other significant writing from every third student listed on their class rosters.

2. Two primary readers from among the graduate English faculty independently read and score each essay under review; in the case of an unreliable result, the essay is referred to a secondary reader, who reads the essay independently, without any knowledge of the previous results (see number 5, below).

3. Each primary reader scores each essay on a 4-point scale, with a score of 4 the highest possible. The two primary scores are added to yield a total, with the final scores ranging from 2 (lowest possible) to 8 (highest possible). A combined score of 5 or higher is passing: a score of 7 or 8 indicates an excellent essay; a score of 5 or 6 indicates an acceptable essay; a score of 4 or less indicates an unacceptable essay.

4. Reliability of the two scores is assumed when the scores from the primary readers are congruent, that is, when they are within 1 point of each other. For example, a reliable score of 6 would mean that both readers marked the essay as a 3; a reliable score of 5 would mean that one reader assessed the essay as a 3 and the other as a 2.

5. Should the primary scores for an essay not be reliable—for example, a 4 and a 1, a 3 and a 1, or a 4 and a 2—the essay is referred to a secondary reader. If that reader agrees with the higher score, the essay is certified as acceptable or excellent; if the secondary reader agrees with the lower score, the essay is certified as unacceptable. (The scoring and referral logic is sketched below.)
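As an illustration only, the following minimal sketch (in Python) mirrors the scoring and referral logic described in steps 2-5. The function and variable names are hypothetical, and the handling of a secondary score that falls between the two primary scores is an assumption, since the process above does not specify it.

def combined_rating(total):
    """Map a combined primary score (2-8) to the categories in step 3."""
    if total >= 7:
        return "excellent"
    if total >= 5:
        return "acceptable"
    return "unacceptable"

def score_essay(primary1, primary2, secondary_score=None):
    """Certify one essay from two primary scores (1-4 each), per steps 4-5."""
    # Step 4: the result is reliable when the primary scores are within one point.
    if abs(primary1 - primary2) <= 1:
        return combined_rating(primary1 + primary2)
    # Step 5: non-congruent scores are referred to a secondary reader.
    if secondary_score is None:
        raise ValueError("non-congruent primary scores require a secondary reader")
    high, low = max(primary1, primary2), min(primary1, primary2)
    if secondary_score == high:
        return "acceptable or excellent"   # certified passing
    if secondary_score == low:
        return "unacceptable"
    # Not specified in the process above: a secondary score between the two
    # primary scores; siding with the closer primary score is an assumption.
    return "acceptable or excellent" if high - secondary_score <= secondary_score - low else "unacceptable"

print(score_essay(3, 2))                      # acceptable (combined score of 5)
print(score_essay(4, 1, secondary_score=4))   # acceptable or excellent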

Finding  
Findings Of Holistic Writing Assessment  
For the reporting periods from 2009 through 2012, an average of 95% of essays read for the holistic assessment earned an acceptable or excellent rating of 3-4. Because of a scheduling error, however, the department did not undertake the holistic review for the 2012-2013 reporting period. It will resume the assessment process for the 2013-2014 academic year.
Actions for Objective:

Action  
Developing Students' Writing Abilities  
An average of 95% of the representative English graduate student writing reviewed over the previous three assessment periods met or exceeded the acceptable rating. Developing students' abilities to analyze, synthesize, and evaluate knowledge in writing, however, continues to be a primary program objective.

The first and most obvious action for measuring the program's success in accomplishing this objective is to resume the holistic review of graduate writing for the 2013-2014 assessment cycle. To that end, the graduate director has already gathered representative writing from Fall 2013 graduate classes.

The burden of introducing students to professional research methods and establishing standards for critical and expository writing rests largely with individual classroom instructors, who provide formal guidance, models, and assessment. Because of the variety of classroom writing assignments and the variety of ways in which professors approach writing, it is difficult to impose uniform standards upon writing from coursework. However, after consulting with the Director of Writing in the Disciplines, the graduate director will make any necessary revisions to the assessment rubric and will then supply graduate professors with the rubric, to make sure that all agree with the standards by which the program measures its success in achieving the objective.

There are other ways in which graduate faculty can guide students in their progress toward independent critical thinking, researching, and writing. One suggestion is that each graduate faculty member serve as a mentor to a certain number of students, assigned at the beginning of a long term. While the graduate director would still be responsible for general advising and new student orientation, the mentors would be available to discuss class research and writing projects with their advisees.


GOAL: Knowledge And Skills

Objective  
Demonstrating Critical Thinking And Writing Skills And Breadth Of Knowledge: The Written Comprehensive Examination
English students will demonstrate that they have a graduate-level breadth of knowledge in literature, language, and writing disciplines and that they can express that knowledge in writing. The program's success in achieving this objective can be measured by the pass rate for the written comprehensive examination required of all students who take a graduate English degree at Sam Houston State University.

Indicator  
The Written Comprehensive Examination  
A passing score on the written comprehensive examination is a direct indicator that a student in English has acquired a breadth of knowledge in the subject, has developed critical reading and writing skills appropriate to a graduate-level education in English, and is well-prepared for future professional endeavors. For the examination, students choose three comprehensive areas from among thirteen broad topics in literature, language, and writing disciplines. To demonstrate their mastery of a broad range of materials, they are required to choose at least one British literature area and one American literature area and at least one early (pre-1800) British or American literary area and one later (post-1800) British or American literary area. For each area, students are given a reading list of works selected by faculty area experts.

During the exam itself, the student chooses one of three questions for each area and has two hours to respond to that question. A double-blind grading system is used to evaluate the candidates' proficiency. Three graduate faculty members read and evaluate each essay.
Criterion  
Written Comprehensive Examination Pass Rate  
At least 90% of examination essays will pass (with a grade of pass or high pass). The method of measuring the success in achieving the objective has changed since the previous assessment, when we counted the number of students who passed. Because most students who fail area exams pass them on a second take, the pass rate for essays themselves seems to be a more specific measurement of how well the exam assesses the success of the program in achieving the objective.

If we apply the new method of measuring this success to the exam results for Academic Year 2011-2012, 69% of essays passed.

An examination grading rubric and sample pass, fail, and high pass essays are attached.
Finding  
Written Comprehensive Examination Results: 2012-2013  
In Academic Year 2012-2013, thirteen students sat for comprehensive exams during three sessions (Fall, Spring, and Summer). A handful of these were students retaking area exams after having failed in the previous academic year.

Students wrote a total of 33 essays.

The results follow:

High Pass: 4 essays (12%)
Pass: 23 essays (70%)
Fail: 6 essays (18%)

Total Pass (High Pass + Pass): 27 essays (82%)

(Two of the failing essays were the result of the students’ not having responded to questions. While we might adjust the pass rate percentage to account for this variable, presumably the two students failed to address the questions because they did not have the breadth of knowledge or critical thinking and writing skills that the exam measures; we include these essays among the failures.) 
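As a quick check of the arithmetic behind these figures, the following sketch recomputes the pass rate from the essay counts above (the counts are taken from this finding; the variable names are illustrative):

high_pass, passed, failed = 4, 23, 6
total = high_pass + passed + failed          # 33 essays
pass_rate = (high_pass + passed) / total     # 27 / 33
print(f"{pass_rate:.0%}")                    # 82%
print(pass_rate >= 0.90)                     # False: short of the 90% criterion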

Observation about findings: The graduate faculty have expressed concern in the past about students who cluster their exam areas to ease the burden of preparation. Rather than spreading their areas over a broad range of literature, some, for example, will choose American literature before 1800, 19th-century British literature, and 19th-century American literature; in so doing, they narrow the range to a mere 250 years or so. One significant finding for this assessment cycle, however, is that four of the six failing essays came from students who clustered their areas, which suggests that this strategy may be an indicator of general academic weakness or lack of confidence.

Conclusions about findings: The pass rate of 82% is an improvement over the 69% for Academic Year 2011-2012, but it still falls short of the projected 90%. (For a comparison of pass rates by exam area for the last four assessment cycles, see the attachment, "English Graduate Comprehensive Examination Pass Rate: Percentage/Number of Essays, 2009-2013.")

We have considered possible reasons for the failure to meet the projected pass rate:

(1) Students did not prepare well enough: They may have failed to give themselves enough time to read and synthesize all of the works and critical issues in a chosen area. They may have gambled by not reading all of the works on prescribed reading lists. Or their preparation may have been misdirected.

(2) Students did not receive adequate guidance in their preparation. The graduate director offers biannual exam prep sessions, and students are urged to consult faculty area experts in preparing for the exam. Not all students attend the prep sessions or seek out advice from area experts, however. And it is possible that the prep sessions do not adequately prepare the students.

Although students who fail essays sometimes complain that their graduate classes did not prepare them for the exams, it is difficult to establish a significant statistical correlation between coursework and the examination pass rate. For one thing, students sometimes take exam areas for which they have had no graduate coursework. For another, while the student's classes will suggest approaches for reading and analyzing literature and for synthesizing bodies of information, the exams are not tied specifically to courses. A professor may teach an MA-level survey of literature and English language that covers many works on an area reading list, but there is no contractual obligation that she or he do so.

Students are advised that the responsibility for reading all of the works on the area reading lists and for making comprehensive sense of them rests, finally, with them.

(3) Faculty expectations for the exam are too rigorous. While faculty do have high expectations for students’ performance on the exam, both the reading lists and the exam questions have been carefully suited to MA-level students in the discipline. We have also found that, since the current exam system was instituted ten years ago, students have been much better prepared for PhD work and college teaching in the field.

(4) Testing circumstances affect the students’ performance. The graduate faculty readers are aware of the highly artificial—and too-often intimidating—circumstances under which students take the exam, and they make allowances for testing anxieties. However, the faculty also believe that a student who has prepared adequately will be able to perform well enough under these circumstances. 

(5) The projected pass rate has been set too high. Perhaps the expectation that nine of ten essays pass is unrealistic.
Actions for Objective:

Action  
Preparing Students For The Written Comprehensive Examination  
The ability to make an effective argument about any subject requires, first, a thorough knowledge of the subject. Students must understand that the burden of acquiring this knowledge through their independent reading and their classwork rests, finally, upon them.

Nonetheless, there are also processes by which the graduate faculty can help in preparing the students:

The graduate director continues to publish an exam prep booklet and to conduct biannual comprehensive examination prep sessions, during which he discusses the exam process, suggests strategies for preparing and for addressing exam questions, and presents exemplary questions and responses.

It is difficult to measure objectively how effective the exam prep sessions have been. Students are not required to attend, and the graduate director has not kept records to see whether there is any correspondence between attendance and the pass rate. One suggestion for improving the pass rate, however, is that students be required to attend at least one such session. Beginning with the 2013-2014 reporting period, the graduate director will also keep records to see whether attendance corresponds with success.

Other graduate faculty have been involved in the preparation process in two ways: Although the examination is expressly kept separate from classwork, some instructors use typical exam questions in their courses for midterm and/or final examinations, as a way of acclimating students to the comprehensive exam expectations and circumstances. Others give advice informally to students who approach them. Faculty members are not involved uniformly in this preparation, nor do we believe that they should have to be.

One suggestion, however, is that each graduate faculty member serve as a mentor to a certain number of students, assigned at the beginning of a long term. While the graduate director would still be responsible for general advising and new student orientation, the mentors would be available to provide specific advice and encouragements and to check on their students' progress in the program during the term. These faculty members could also give advice as the students prepare for the written examination.

As with the comprehensive examination prep sessions, measuring the effectiveness of a faculty mentor system objectively is difficult. At a minimum, the program could require that students preparing for the examination meet at least once with their mentors for that purpose.

In response to the persistent failure to meet the projected 90% target pass rate, we also need to consider whether expecting nine of ten essays to pass is unrealistic. Perhaps the best way to do so is to gather information about comprehensive examination pass rates at peer institutions that have similar exams.


GOAL: Knowledge And Skills

Objective  
Demonstrating Critical Thinking Skills And Breadth Of Knowledge: Oral Argumentation
English graduate students will demonstrate their knowledge and critical thinking skills through oral arguments. We believe that the ability to make such arguments is necessary for future professional pursuits like teaching and further graduate education. The program's success in achieving this objective can be measured by the pass rate for the oral defense required of all thesis students and the oral comprehensive examination required of all non-thesis students.

Indicator  
The Oral Examination  
A passing grade on the oral examination required of all students who take the English MA or MFA degree at Sam Houston State University is a direct indicator that graduates are able to demonstrate their critical thinking skills and breadth of knowledge in the field. Thesis students sit for a one-hour oral defense of the thesis; having passed the written comprehensive examination, non-thesis students sit for a one-hour oral comprehensive examination covering the same three areas as those on the written exam. A committee of three graduate faculty members examines each student, awarding the candidate a pass, high pass, or fail, according to her or his ability to respond to specific questions. The committee for the oral defense of thesis comprises the members of the student’s reading committee; the oral comprehensive examination committee comprises area experts appointed by the graduate director.
Criterion  
Oral Examination Pass Rate  
At least 92% of degree candidates will pass the oral defense of thesis or oral comprehensive exam at the first sitting or upon retaking it.

Thesis defense and oral comprehensive exam grading rubrics are attached.
Finding  
Oral Examination Results: 2012-2013  
In Academic Year 2012-2013, four students sat for the oral defense of thesis and five sat for the oral comprehensive examination; all nine students passed at the first sitting.

Observation about findings: Despite the pass rate, faculty who sit on oral comprehensive exam committees have still expressed disappointment with the quality of the responses from some students, who, they feel, make weak arguments and demonstrate a marginal knowledge base.

Conclusions about findings: While all students who sat for oral examinations during the assessment period passed, the distinction between those who earned a high pass and those who earned a pass is one measure of quality. Two of the four students who sat for the oral defense of thesis were awarded high passes; none of the five students who sat for the oral comprehensive examination were. There is also the less easily measured anecdotal evidence of faculty who express disappointment with the general quality of students' arguments during the oral comprehensive exams.

One obvious reason for the discrepancy is that the expectations for knowledge in the two types of oral examinations are unequal: thesis students know the subjects of their theses as well as, and sometimes better than, the examining faculty, and they address a much narrower subject, while oral comprehensive exam students are expected to demonstrate the same breadth of knowledge as that required for the written comprehensive examination. And while the atmosphere of the oral exam is presumably less formal, with faculty examiners sometimes offering hints or suggesting ways that students can approach responses, many of our students find the exam terrifying.
Actions for Objective:

Action  
Preparing Students To Make Oral Arguments  
During the last four assessment cycles, all twenty students who have sat for an oral defense of thesis and all twenty-six students who have sat for an oral comprehensive examination have passed. The 100% pass rate does not suggest, however, that the program should relax its efforts to prepare its students for making oral arguments. Nor should it suggest that oral examinations are the only ways to measure the program's success in preparing students for making oral arguments.

While all students passed, the discrepancy in the quality of their oral arguments suggests that the department should discuss both the nature of the exams and the expectations for them: What purposes does the oral comprehensive examination serve? What should the examiners' expectations be? What does the department need to do to improve the quality of students' responses during these exams?

Preparing the students for making oral arguments overlaps significantly both with preparing them for the written comprehensive examinations and with assessing their graduate-level critical and expository writing: All such endeavors require a thorough knowledge of the subject under discussion. Students are advised, first, to know well the subjects about which they are speaking. 

To prepare the students specifically for making oral arguments, however, the department has considered requiring an oral component in one or more of their graduate classes, perhaps one that duplicates the circumstances of the thesis defense or oral comprehensive exam. One logical suggestion is that such a component be part of the research and methods class (ENGL 5330) required of English graduate students before they declare their degree plans.

As with efforts to prepare students for the written comprehensive examination, it is difficult to measure objectively the effectiveness of such a requirement, especially because 100% of students during the reporting period have passed the oral exam.

Formally instituting a faculty mentor system could also help if, for example, the mentors had their advisees sit for mock oral exams.

Faculty will also continue to urge students to attend academic conferences, at which they must not only present their arguments about literature and language orally but also respond to any questions or challenges from the professional audience.




Previous Cycle's "Plan for Continuous Improvement"

During the past year, the professional success of our English graduate students was demonstrated by the continued acceptance of graduates into respectable PhD and MFA programs; during the reporting period, graduates were accepted to terminal programs at Oklahoma State University and the University of Memphis. Our students have also entered professions including teaching, editing, business, law, and professional communications. Along with such successes, the twenty graduate assistantships that we are able to offer qualified graduate students make the program attractive; an increased graduate stipend, however, would significantly raise the quality of an already well-qualified pool of graduate students. Comprehensive examinations, administered three times a year, ensure that the knowledge base of our graduating students is broad, and our graduate faculty are actively encouraged to stay up-to-date in their fields of expertise and to be productive scholars. Recently, a handful of graduate students have been paired with faculty members as research assistants; these collaborations have led to publications and conference presentations.

In addressing the weaknesses identified in the findings above, we need to consider the following:

First, we should re-kindle our departmental discussion of how the comprehensive examination does, in fact, test our MA candidates' mastery of a broad range of materials. While most students follow the spirit of the exam in choosing areas across a sufficient range, some do cluster their areas, thereby easing the burden both of critical understanding and of specific preparation. We need to consider how to close any loopholes so as to assure that the comprehensive examination does, in fact, adequately measure the students' mastery of the discipline.

Second, while the holistic assessment of essays suggests that our students write at or above an agreed-upon standard for English graduate-level critical thinking and writing skills, we are aware that some students do not, in fact, have adequate knowledge or the ability to synthesize arguments well. We acknowledge that writing is but one measure of such skills. We need to revisit this issue, first, by agreeing as a graduate faculty upon standards that reflect the program goals and, second, by agreeing on how we can make sure that no student who takes an MA in English at SHSU falls short of these standards.

Finally, to address the weaknesses in the oral performance of the students, we need to discuss the expectations that we have for students in oral exams and the best means for measuring these expectations. During new student orientation, the Graduate Director and/or Chair will also explicitly address the oral examination performance expectations.

Update on Previous Cycle's "Plan for Continuous Improvement"

The plan for continuous improvement from the 2011-2012 assessment cycle identified several issues that needed attention: (1) Do the written comprehensive exams adequately measure our students' critical thinking and writing skills and breadth of knowledge? (2) Does class writing adequately measure students' critical thinking, researching, and writing skills? (3) Do graduate faculty all understand and agree upon uniform standards for graduate-level writing? (4) What expectations do the graduate faculty have for our students' ability to demonstrate critical thinking skills and breadth of knowledge in oral examinations? These issues are all tied directly to the three program objectives. 

The actions taken in response to each of these issues are listed below by number:

1. The plan for continuous improvement from the previous assessment cycle called for continued discussion of the effectiveness of the comprehensive examination in measuring our graduates' critical thinking and writing abilities and breadth of knowledge in the field. In several departmental discussions about the graduate program during the 2012-2013 assessment period, faculty discussed both the expectations for the examination and the methods of administering it.

In response to the concern that some students preparing for the comprehensive examination cluster their areas to reduce the breadth of knowledge and ease the burden of preparation, the graduate faculty decided that because such students do satisfy the exam requirements and because these students do not enjoy a higher success rate than those who spread their areas more broadly, we would not insist upon the broader range.

As a result of our departmental discussions about the exam, the faculty also refined the administration process to make it more efficient and more objective. Previously, two primary readers read each exam essay, with a third settling any disputes between the two; now three readers evaluate every essay from the outset. This new process obviates the need for a tie-breaker and reduces the time needed to report results. To accommodate the greater reading volume, we now also require that every graduate faculty member read essays for every examination. Although historically there has not been much variance in the way that faculty evaluate the essays according to the rubric, the wider distribution of essays ensures, first, that readers will have fewer exams to evaluate and, second, that the results will represent a broader cross-section of faculty expectations for the students' breadth of knowledge and critical thinking and writing skills.

Although we stress that graduate classwork does not prepare students specifically for the exam, some faculty have also begun using the types of questions that students might encounter on the comprehensive examination in their class midterm and final exams. It is difficult to measure objectively how effective this method is in preparing students for the exam, especially because we will not require that graduate instructors use it. But anecdotal evidence suggests that the students have found it helpful, if for no other reason than that they gain some experience in working under the same circumstances as those of the comprehensive exam.

The graduate director has continued to conduct biannual new student orientation sessions and biannual comprehensive examination preparation sessions, both of which are designed to help students understand the standards of and expectations for graduate-level and professional work in our discipline. He has also published extensive information about the exam in print and online literature.

2. Because of a scheduling error, a holistic review of graduate class writing was not undertaken for the 2012-2013 assessment cycle. The program will resume this assessment of writing for the 2013-2014 reporting period.

3. In order to reach some consensus about standards for graduate-level writing that adequately measure our students' critical thinking, researching, and writing abilities, the rubric used for evaluating class writing has been made available to all graduate faculty; the department has not yet undertaken a specific discussion of these standards, however.

4. In an attempt to strengthen students' ability to make convincing oral arguments, the graduate director has begun to include a discussion of the oral exam during comprehensive exam prep sessions and has also written into the graduate student handbook and online literature a section on preparing and sitting for the oral exam.

Plan for Continuous Improvement

One measure of our continuing success in producing graduates who have demonstrable critical skills and breadth of knowledge in the field is that our MA students continue to be accepted into respectable PhD and MFA programs. During the 2012-2013 assessment period, graduates were accepted to PhD programs at Arizona State University, the University of Arizona, Carnegie Mellon University, Drexel University, and the University of North Texas; most of them were awarded full funding. Several other graduates were given teaching positions in two-year colleges. Although these indicators of success cannot be considered measurable because not all of our graduates aspire to such endeavors, as part of our plan for continuous improvement, the faculty will continue to encourage worthy MA graduates to apply for PhD work and teaching positions.

Another measure of the program's success in producing graduates with demonstrable critical researching and writing skills and breadth of knowledge is student participation in professional conferences. Not all students participate in such activities, however, so although faculty will continue to encourage them to present their scholarly and creative work at conferences, participation cannot be considered a measurable indicator of the program's success in achieving its objectives. One suggestion for the future, however, is that participation in at least one scholarly or creative conference or colloquium be a requirement for graduation. In that event, conference participation could be included as a measurable indicator.

Dr. Helena Halmari, English Department Chair, will also continue to pair qualified students with faculty members as research assistants. The requirement for this research assistantship is that the collaboration between student and faculty member lead to a publication and/or conference presentation. Again, because not all students qualify for such assistantships or seek them out, the work undertaken as a research assistant cannot be considered a measurable indicator of the program's success in producing graduates with critical thinking, researching, and writing skills.

In responding specifically to the findings for the three objectives above, we propose the following plan for continuous improvement in the 2013-2014 assessment period:

1. The graduate faculty will undertake a thorough review of comprehensive examination reading lists, to assure that the lists represent both the expectations for breadth of knowledge and current developments in the field.

2. The graduate faculty will undertake a review of comprehensive examination questions, to assure that they are both fair and representative, that they adequately test a student's critical thinking and writing skills and breadth of knowledge, and that they represent current developments in the field.

3. Because faculty who sit on oral comprehensive examination committees still find weaknesses in some students' ability to make critical arguments and demonstrate their breadth of knowledge orally, the department will undertake a pointed discussion about both the nature of and the expectations for this oral exam.

4. To improve its progress toward achieving the objective, the graduate faculty will also consider requiring an oral component in some types of courses or other means by which the program can develop the students' ability to make oral arguments.

5. The graduate faculty will resume the holistic assessment of graduate student writing. To that end, the graduate director has already collected representative writing from all graduate courses taught in Fall 2013.

6. To assure that the rubric for the holistic assessment of writing fairly measures our students' critical thinking, researching, and writing abilities, the graduate director will consult with the University's Director of Writing in the Disciplines, who is a member of the English Department.

7. After any necessary revisions to the rubric have been made, the graduate faculty will discuss the standards for classroom writing and how well that writing develops and/or measures our students' critical abilities. The aim of this discussion will be to reach some sort of departmental consensus on standards for writing in the graduate classroom.

8. To encourage greater faculty mentoring of MA students, the graduate director will propose that each graduate faculty member be assigned four or five students as advisees. While the graduate director will continue with general advising of students, the faculty mentors would meet with their advisees as needed to discuss class researching and writing assignments and to help them prepare for written and oral examinations. Although this advising would be informal, we may require that students meet at least once with their mentors in each long term.

