Assessment 2011-2012: Educational Programs:
Psychology MA (School Psychology)



GOAL: Foundational Competence In School Psychology

Objective  
Foundational Competency In School Psychology
Students demonstrate competency in the scientific, methodological and theoretical foundations of professional school psychology.

Indicator  
National School Psychology Exam (PRAXIS II)  
The PRAXIS II School Psychology Exam is a nationally administered examination used to determine an individual’s qualification for licensure to practice within the field. Candidate competency is evaluated with respect to the following test subcategories: 1. Data-Based Decision Making (35%), 2. Research-Based Academic Practices (12%), 3. Research-Based Behavioral and Mental Health Practices (16%), 4. Consultation and Collaboration (12%), 5. Applied Psychological Foundations (13%), and 6. Ethical, Legal, and Professional Foundations (12%).

Criterion  
Minimum Score  
A minimum score of 165 is required to obtain the credential of Nationally Certified School Psychologist (NCSP), and thus a score of 165 or better has been established by the SSP Program as the criterion for this objective. In addition, candidates are expected to perform at or above the average range provided by the test developers for each of the six subcategories.

Finding  
PRAXIS II Scores  
Eight SSP students took the PRAXIS II exam during the past academic year. Total scores ranged from 170 to 188, with an average score of 178. Six of the eight students had scores directly reported to our Program, which enables an analysis of subcategory performance. For these six candidates, all six (100%) performed at or above the average range for subcategories 1, 3, 4, 5, and 6. Four of six candidates (66.7%) performed at or above the average range for subcategory 2 (Research-Based Academic Practices).

Thus, our program is slightly weaker in the area of research-based academic practices.

Actions for Objective:

Action  
PRAXIS II Scores  
Although most students scored above average in the area of research-based academic practices, two students scored lower. We have struggled with research-based academic practices in previous years and made some progress in this area last year; however, performance dipped this year. We will revisit this area and again place stronger emphasis on these practices.



GOAL: Skill Application

Objective  
Skill Application
Candidates in the school psychology program demonstrate knowledge and improving skill application commensurate with their level of training. Specifically, candidates in their final practicum placement and on internship, both held within the public school setting, will demonstrate appropriate application of professional school psychology skills in the areas of assessment, behavioral consultation, academic intervention and counseling.


Indicator  
Rating Forms And Positive Impact Data  
(1) Satisfactory ratings from Field Supervisors
          1(A) Ratings for Practicum II candidates (Year 2 of 3)
          1(B) Ratings for candidates on Internship (Year 3 of 3)
On-site, or field, supervisors are asked to evaluate each candidate’s professional performance according to the 11 NASP Domains of Competence. These include: 1) Data-Based Decision-Making and Accountability, 2) Consultation and Collaboration, 3) Effective Instruction and Development of Cognitive/Academic Skills, 4) Socialization and Development of Life Skills, 5) Student Diversity in Development and Learning, 6) School and Systems Organization, Policy Development, and Climate, 7) Prevention, Crisis Intervention, and Mental Health, 8) Home/School Community Collaboration, 9) Research and Program Evaluation, 10) School Psychology Practice and Development, and 11) Information Technology.

(2) Satisfactory ratings from Program Faculty
          2(A) Faculty Rating Forms (FRF) for each of four Portfolio cases submitted
          2(B) Procedural Integrity Rubrics (PIR) for each of four Portfolio cases submitted
Candidates completing the Internship Portfolio assessment will obtain satisfactory ratings from the Program Faculty on each of four cases submitted. These cases include: 1) an Assessment case, 2) a Behavioral Consultation case, 3) an Academic Intervention case, and 4) a Counseling case. Two faculty members will evaluate each case, and the average of these two ratings on both the FRF and the PIR will be reported.

Positive Impact Data
(3) Quantitative data gathered as part of the case intervention
          3(A) Effect Size
          3(B) Percent of Non-Overlapping Data Points (PND)

Candidates completing the Internship Portfolio assessment will submit quantitative data gathered as part of the case intervention monitoring for three of four cases submitted. These cases include: 1) the Behavioral Consultation case, 2) the Academic Intervention case, and 3) the Counseling case. Effect size, percent of non-overlapping data points (PND), or other means of quantitatively evaluating candidates’ positive impact on the student(s) will be calculated.

Criterion  
Skill Application  
1A: Candidates are rated by field supervisors on a three-point scale with the following competency rating categories: Improvement Needed (1), Competent (Supervision Needed; 2), and Professionally Competent (No Supervision Needed; 3). Because candidates in their final practicum will be under supervision for two more years, they are expected to maintain an average rating of “2.0” for each of the 11 NASP Domains evaluated.

1B: Candidates are rated by field supervisors on a three-point scale with the following competency rating categories: Improvement Needed (1), Competent (Supervision Needed; 2), and Professionally Competent (No Supervision Needed; 3). Because candidates completing their internship year will continue to be under supervision for one more year, they are expected to maintain an average rating of “2.0” for each of the 11 NASP Domains evaluated.

2A: Candidates completing their internship experience are required to submit four distinct Portfolio cases. Each case will be reviewed by two faculty members and assigned ratings on the Faculty Rating Form (FRF). These ratings will then be averaged across the two faculty raters. The FRF addresses all domains of practice related to the type of case being reviewed. Each item on the FRF includes the following competency rating categories: Pass (score 1), No Pass (score 0), Not Included (score 0), and Not Applicable (removed from the scoring calculation). Candidates are expected to achieve a minimum domain competency average of 85%.
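
To make the 85% domain competency average concrete, the sketch below shows a minimal calculation in Python; it assumes Pass items earn one point, No Pass and Not Included items earn zero, and Not Applicable items are dropped from the denominator. The item ratings shown are hypothetical rather than actual candidate data.

# Minimal sketch of the FRF domain competency percentage (hypothetical data),
# assuming Pass = 1 point, No Pass / Not Included = 0 points, and Not Applicable
# items removed from the scoring calculation, as described above.
def frf_percentage(ratings):
    scored = [r for r in ratings if r != "na"]         # drop Not Applicable items
    passed = sum(1 for r in scored if r == "pass")     # only Pass earns credit
    return 100 * passed / len(scored)

# Hypothetical case: 18 of 20 scorable items passed -> 90%, above the 85% criterion.
example = ["pass"] * 18 + ["no_pass", "not_included"] + ["na"] * 3
print(f"{frf_percentage(example):.1f}%")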

In addition, candidates are given a single faculty rating for the overall case completion. This rating ranges from 1 (Very Poor) to 5 (Very Good). Candidates are expected to achieve a minimum average overall rating of 3 across the two faculty raters, which is equivalent to “average” work completed in the field.

2B: Internship portfolio case submissions are also scored by faculty using a Procedural Integrity Rubric, or PIR. Each case PIR includes critical procedures that must be performed as part of completing the case in order for the intern to be judged as following best practices within the field. Each item on the PIR can be scored as follows: 0 = Incomplete, 1 = Needs Improvement (task is completed, with some concerns), 2 = Completed Satisfactorily (Competency Met), and 3 = Exemplary Performance (task is completed at a level above expectations). Each PIR for the four cases submitted has an established cut score equivalent to achievement of at least 85%. Additionally, candidates are expected to obtain no ratings of “0” on any PIR.
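
As a concrete illustration of the PIR criterion, the sketch below assumes a hypothetical eight-item rubric scored 0-3 per item; the actual number of items, and therefore each case's cut score, varies across the four cases.

# Minimal sketch of PIR scoring for a hypothetical eight-item rubric (0-3 per item).
# A case meets the criterion only if the total reaches 85% of the maximum possible
# score AND no individual item is rated 0.
import math

def pir_meets_criterion(ratings, max_per_item=3, criterion=0.85):
    cut = math.ceil(criterion * len(ratings) * max_per_item)   # e.g., 0.85 * 24 -> 21
    return sum(ratings) >= cut and 0 not in ratings

print(pir_meets_criterion([3, 2, 3, 3, 2, 3, 3, 3]))   # total 22, no zeros -> True
print(pir_meets_criterion([3, 3, 3, 3, 3, 3, 3, 0]))   # total 21 meets the cut, but one zero -> False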

3A: Based on the quantitative data included as part of the Behavioral Consultation, Counseling, and Academic Intervention Portfolio case submissions, the candidate’s impact on student behavior and/or learning can be calculated in a variety of ways. Effect size allows for the comparison of the standard mean difference in student performance during baseline and treatment phases of intervention. An effect size of .8 is considered to be of moderate impact. Candidates are expected to demonstrate moderate impact through either effect size or PND calculation for two of the three quantitative cases submitted.
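
As an illustration, the sketch below computes a standardized mean difference between baseline and treatment phases in Python; it assumes the baseline standard deviation serves as the denominator, and the progress-monitoring values are hypothetical rather than actual candidate data.

# Minimal sketch of an effect size (standardized mean difference) for a single-case
# intervention, assuming the baseline standard deviation is used as the denominator.
from statistics import mean, stdev

def effect_size(baseline, treatment):
    return (mean(treatment) - mean(baseline)) / stdev(baseline)

# Hypothetical weekly progress-monitoring data (baseline vs. treatment phase).
baseline = [20, 22, 21, 19]
treatment = [26, 28, 30, 29, 31]
print(round(effect_size(baseline, treatment), 2))   # values of .8 or higher meet the moderate impact criterion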

3B: Based on the quantitative data included as part of the Behavioral Consultation, Counseling, and Academic Intervention Portfolio case, the candidate’s impact on student behavior and/or learning can be calculated in a variety of ways. Percent of Non-overlapping Data points, or PND, provides a comparison of the percentage of data points during the treatment phase that do not overlap with the most extreme baseline phase point. A PND calculation of 60% is considered to be of moderate impact. Candidates are expected to demonstrate moderate impact through either effect size or PND calculation for two of the three quantitative cases submitted.
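
The sketch below shows the corresponding PND calculation, assuming the intervention is intended to increase the target behavior, so non-overlapping treatment points are those above the highest baseline point; the data are again hypothetical.

# Minimal sketch of Percent of Non-overlapping Data points (PND), assuming an
# intervention intended to increase the target behavior; for a behavior targeted
# for reduction, the comparison would use the baseline minimum instead.
def pnd(baseline, treatment):
    extreme = max(baseline)                                  # most extreme baseline point
    non_overlap = sum(1 for x in treatment if x > extreme)   # treatment points above it
    return 100 * non_overlap / len(treatment)

# Hypothetical data: 4 of 5 treatment points exceed the baseline maximum -> 80% PND,
# which is above the program's 60% moderate impact criterion.
print(pnd([20, 22, 21, 19], [26, 21, 30, 29, 31]))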

Finding  
Skill Application  
1A: Table 1A: Practicum II Field Supervisor Ratings
There were 11 candidates who participated in the final Practicum experience during the Spring 2012 semester. Field supervisors rated our candidates, as a whole, very well and solidly within the “Competent” range. Ten of the eleven candidates (91%) achieved an average supervisor rating equal to or above the target score of 2.0. One candidate achieved an average supervisor rating of 1.99. The cohort average rating within each of the 11 NASP Domains met the criterion score of 2.0. Site supervisors rated home/school community collaboration as our weakest domain.

1B: Table 1B: Internship Field Supervisor Ratings 
There were eight candidates who participated in the Internship experience during the 2011-2012 academic year. Field supervisors rated our candidates, as a whole, very well and solidly within the “Competent” range. Seven of the eight candidates (88%) achieved an average supervisor rating equal to or above the target score of 2.0. One candidate achieved an average supervisor rating of 1.98. The cohort average rating within each of the 11 NASP Domains met the criterion score of 2.0. Faculty rated our students as weaker in the domains of school and systems organization, policy development, and climate as well as information technology. 

2A: Data Tables for FRF Portfolio Reviews 
Eight candidates completed their Internship Portfolios this academic year. Each of the four Portfolio cases submitted was rated by two faculty members to obtain an average Faculty Rating Form (FRF) rating and an average Overall case rating. For each of the four cases (Academic Intervention, Assessment, Behavioral Consultation, and Counseling), all eight candidates (100%) achieved the criterion of 85% or higher on the average FRF rating and an Overall rating of ‘3’ or higher for the case. Our strongest areas were assessment and academic intervention and our weakest was behavioral consultation, although all four areas met the criterion.

2B: Data Tables for PIR Portfolio Reviews 
Each Portfolio case completed was evaluated by two faculty raters using the Procedural Integrity Rubric (PIR) in order to obtain an average PIR score. Additionally, candidates were expected to obtain no ratings of ‘0’ on each of the PIR documents. For the Academic Intervention case, all eight candidates (100%) achieved an average PIR score at or above the cut score of 24, with no candidates receiving a score of ‘0’ on these case ratings. For the Assessment case, all eight candidates (100%) achieved an average PIR score at or above the cut score of 39, with one candidate receiving a score of ‘0’ on these case ratings. For the Behavioral Consultation case, all eight candidates (100%) achieved an average PIR score at or above the cut score of 21, with no candidates receiving a score of ‘0’ on these case ratings. Finally, for the Counseling case, all eight candidates (100%) achieved an average PIR score at or above the cut score of 21, with no candidates receiving a score of ‘0’ on these case ratings. Our strongest area was academic intervention; our weakest was behavioral consultation. Only one student did not meet all competencies, and that student’s weakness was in the assessment area.

3A-B: Positive Impact Data for Quantitative Intervention Cases 
Candidates’ impact on student learning during the Internship experience is evaluated quantitatively through intervention cases submitted as part of the Portfolio assessment. Three cases (i.e., Academic Intervention, Behavioral Consultation, and Counseling) involve intervention with students and include progress monitoring data. A candidate’s positive impact on student functioning is evaluated by calculating either an effect size or the percentage of non-overlapping data points. All eight internship candidates (100%) achieved at least a moderate impact on student learning for all three cases submitted, which met and exceeded the expectation of a moderate impact for two of the three cases. Although the average effect size for each case was greater than .8, our weakest effect size was in the counseling case. Every student achieved an effect size of at least .8 on the academic intervention case; one student’s effect size was below .8 on the counseling case; and two students’ effect sizes were below .8 on the behavioral consultation case.

Actions for Objective:

Action  
Skill Application  
Clearly, there was a discrepancy between on-site supervisors and faculty regarding our students' stronger and weaker domains. Supervisors saw home/school community collaboration as weaker; faculty saw the weaker areas as information technology and school and systems organization, policy development, and climate. These are all areas we need to emphasize in the program so that our students will be skilled enough for an entry-level position. On the portfolios, the weakest area appears to be the behavioral consultation case. We need to evaluate our curriculum, faculty expectations for students, how students present their information, and the mechanism for demonstrating the effectiveness of the intervention.



Closing the Loop

We are pleased with our students' competency achievement; still, there are areas to improve. The domain of research-based academic interventions was a weaker area on the national certification exam, and we need to evaluate our curriculum to determine where we might place more emphasis in this area.

In the skill application objective, we need to address the discrepancy between on-site supervisors, who view home/school community collaboration as a weakness, and the program faculty, who view weaknesses in information technology and in school and systems organization, policy development, and climate. Perhaps we can explore community engagement activities through the schools to provide instruction and experience in this area. The coursework regarding school and systems organization, policy development, and climate is taught in an interdisciplinary fashion through the Department of Educational Leadership and Counseling, so we will consult with the faculty teaching this area to share our concerns.

Within skill application, our weakest area is the behavioral consultation case. The data indicate fairly consistently that this is a weaker area, although the data remain within the average to above-average range. To strengthen this area, we need to closely examine our expectations and students' understanding of those expectations to clarify any confusion.

