Article 94
A T.E.A.M. Approach for Evaluating School Counselors
Abstract: Educational reform has turned its focus to professional educator evaluation systems that require multiple data sources. In 2012, the Tennessee Educator Acceleration Model (TEAM) School Services Personnel Rubric was developed to evaluate school counselors. In response, a study was conducted during the first year of TEAM use to investigate school counselors’ experiences with this assessment and to explore their perceptions of its alignment with their role and functions. Results indicated that the TEAM rubric domains and indicators measure much of what school counselors do, but that further revisions are needed. We also found that more evaluator training is needed to ensure effective use of the TEAM rubric and counselor confidence in the evaluation system. Implications for counselors and training programs in other states are discussed.
Keywords: evaluation, reform, evidence-based
As school reform efforts continue to develop and evolve, they are accompanied by greater expectations. In past decades, school personnel were acknowledged for any student achievement or school improvement, but today educators look to improve outcomes for all students and all groups of students. No educational program is exempt from scrutiny. Although many educators attribute the demand for evidence-based results to the passage of No Child Left Behind (NCLB), the need for school counselor accountability practices has been communicated for decades (Aubrey, 1985; Baker, 2012). Historically, school counselors relied on “warm fuzzy” feelings expressed by stakeholders, or on process data that indicated where time was spent. Yet these evaluations did not reveal school counselor effectiveness or the quality of the tasks performed. Unfortunately, school counselors have been remiss in revealing how their efforts contribute to academic achievement.
Leaders in the school counseling profession (Gysbers & Henderson, 2006, as cited in Sink, 2009) acknowledge that school counselors have an obligation to use objective data to reveal outcomes, assess program personnel, and conduct needs assessments to determine stakeholder perspectives. The American School Counselor Association (ASCA) recognizes the importance of outcome-based strategies and developed the ASCA National Model® in 2003 as a template for school counselors to use in developing a comprehensive, developmental school counseling (CDSC) program; the model was revised in 2012 (ASCA, 2012). A few years later, the School Counselor Competencies were developed as a checklist to demonstrate mastery of identified knowledge, attitudes, and skills for evaluative purposes. As stated in the ASCA position statement, “the annual performance evaluation of the professional school counselor should use criteria reflecting the current standards and competencies of the school counseling profession” (ASCA, 1999, para. 4).
A New Evaluative Approach
In July 2009, a competitive federal initiative was created that provided over $4.35 billion for which states could apply to fund educational reform. This initiative, known as Race to the Top, was based on proposed strategies for designing rigorous standards and assessments for student success, creating data systems to improve instruction, revealing data-driven results, and recruiting and retaining effective educators through the use of incentives (White House, 2009).
The Race to the Top initiative presented a unique opportunity to encourage educational reform. Tennessee and Delaware were the first two states to be awarded funding, over $501 million, for the express purpose of formulating strategies to work toward these goals (U.S. Department of Education, 2010). The state of Tennessee, in its quest to meet these aims, renamed this enterprise “First to the Top,” and one of the areas targeted for improvement was educator evaluation, which resulted in the creation of several evaluation systems. School districts had the option to select, from those that were created, the evaluation rubric that best reflected the needs of the school community. The Tennessee Educator Acceleration Model (TEAM) School Services Personnel Rubric was one evaluation rubric designed specifically for school counselors.
The Tennessee Educator Acceleration Model (TEAM) School Services Personnel Rubric
School counselors are evaluated annually on several TEAM rubric criteria, including growth measures based on gains in student achievement test results, other objective measures of student performance, and qualitative measures. The total evaluation system for school counselors is composed of student growth measures that account for 35% of the annual evaluation, other student performance measures that account for 15%, and the TEAM rubric, which represents the remaining 50%. The growth measure, based on three-year average gains in student achievement, includes items such as the overall school-wide composite growth for all subjects, or school-wide literacy or math value-added results. For school counselors who work in more than one school, the weighted average across those schools is included in this evaluation. Other Student Achievement Measures are based on a menu of options from which the school counselor may choose, such as ACT scores, graduation rates, or ninth-grade success.
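For illustration, a counselor's overall annual score might be computed as a weighted sum of the three components. This is a sketch only; it assumes all three components are expressed on the same 1-5 effectiveness scale, which the description above does not specify:

\[
\text{Overall} = 0.35 \times \text{Growth} + 0.15 \times \text{Other Achievement} + 0.50 \times \text{TEAM Rubric}
\]

Under that assumption, component scores of 4, 3, and 5, respectively, would yield \(0.35(4) + 0.15(3) + 0.50(5) = 4.35\), an overall rating near the top of the scale.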
TEAM serves as an evaluation instrument for school audiologists, school counselors, social workers, school psychologists, and speech/language therapists. The rubric contains the domains of Delivery of Service, Planning of Services, and Environment, along with defined indicators that correspond to each domain. The indicators under each domain are listed below.
Delivery of Service: Standards and Objectives; Motivating Students; Delivery of Professional Services; Service Structure and Pacing; Activities and Materials; Communication; Consultation; Developing Educational Plans for Students; Professional Content Knowledge; Knowledge of Students; Organization of Services; Problem Solving
Planning of Services: Scope of Work; Analysis of Work Products; Evaluation of Services or Programs
Environment: Expectations; Managing Student Behavior; Environment; Respectful Culture
Ratings on the indicators within each domain, based on the preponderance of evidence, determine the school counselor’s effectiveness on a Likert-type scale ranging from “Significantly Above Expectations” (score of 5) through “At Expectations” (score of 3) to “Significantly Below Expectations” (score of 1). School personnel receive multiple evaluations through announced and unannounced observations or conversations, and school counselors are able to present artifacts or data-based evidence during these evaluations. The number of observations a counselor receives is associated with the license the school counselor holds.
During the first year of implementation, school counselors with professional licenses were to receive four observations, and those with apprentice licenses (newly inducted into the profession) were to be evaluated six times throughout the year. However, after receiving feedback from participants, evaluators were allowed to combine observations to reduce the cumbersome number of evaluations (e.g., three observations for counselors with professional licensure and five for those with apprentice licensure). In the second year of implementation, the number of required observations was based on the score received the previous year. For instance, if a school counselor with a professional or apprentice license received a score of 5, then only one observation on the entire rubric was required, along with two informal evaluations in which paperwork and scores were eliminated.
With a newly developed evaluation initiative serving as the basis for merit pay and job performance decisions, we initiated a study to gain a better understanding of how Tennessee professional school counselors are using the TEAM Model Rubric (i.e., the School Services Personnel Rubric or the Teacher Rubric) and of the degree to which this assessment supports a CDSC program as advocated by the American School Counselor Association and the Tennessee Model for Comprehensive School Counseling. The following research questions served as the basis for this study:
1. What domains and indicators are identified to evaluate the school counselor in the Delivery of Services role?
2. What domains and indicators are identified to evaluate the school counselor in the Planning of Services role?
3. What domains and indicators are identified to evaluate the school counselor in the Environment role?
4. To what degree do evaluation rubrics align to the role and responsibilities of the school counselor?
Methodology
To address these questions, we surveyed members of the Tennessee Department of Education (TNDOE) School Counselor listserv about their experiences with the evaluation system. TNDOE staff maintains an electronic mailing list (listserv) exclusively for disseminating timely information relevant to school counselors; however, not all school counselors in Tennessee are members, because counselors must request to join the School Counselor Listserv by contacting the TNDOE. Communication on this listserv is strictly one-way: only authorized emails are sent out from TNDOE. Counselors cannot send messages on the listserv, nor can they “Reply to All” as one might on the listservs of other professional groups. At the time of this research, TNDOE staff estimated that the listserv had about 1,300 members; of this number, approximately 1,100 were school counselors (L. Cohen, personal communication, January 23, 2012). The survey was distributed in May 2012, near the end of the first year of TEAM implementation, which provided almost a full year of experience under the new evaluation system.
Participants
To explore the research questions, the survey was sent to school counseling practitioners who were members of the TNDOE listserv designed for school counselors and their supervisors. Of the approximately 1,100 school counselors on the TNDOE listserv, 234 returned usable surveys, a return rate of 21.3%.
Of the 234 survey respondents, 212 (92%) were female and 22 (9%) were male. Respondents included 202 (86%) Caucasian and 16 (7%) African American counselors; 16 (7%) did not indicate a race. Respondents’ experience as professional school counselors ranged from 1 year to more than 20 years, with a mean of 11.5 years. Eighty-three percent (195) of respondents held a Tennessee license as a professional school counselor, while 39 (17%) possessed an apprentice license.
Regarding school setting, 75 respondents (32%) practiced within an elementary (K-5) configuration, 68 (29%) in a middle (6-8) school, and 75 (32%) in a secondary (9-12) school. Sixteen respondents (6.8%) indicated other settings not listed. Counselor-student ratios were as follows: 17 respondents (7%) reported a ratio of less than 1:250; 128 (55%) between 1:251 and 1:500; 68 (29%) between 1:501 and 1:750; 18 (8%) between 1:751 and 1:1,000; and 3 (1%) reported a caseload of more than 1,000 students.
Instrument
The instrument in this study was a 38-item questionnaire created by the authors in collaboration with school counseling practitioners and TNDOE research staff. It was piloted with practicing school counselors at all levels, and revisions were made based on their feedback. Although the instrument was not statistically analyzed for validity or reliability, it was judged to have face and content validity. The instrument included multiple types of items, ranging from checklists and multiple-choice items to open-ended questions. Participants were asked to: 1) rate the alignment of their counseling programs with the Tennessee Model for Comprehensive School Counseling; 2) describe their evaluation experience under the TEAM system; 3) assess how the TEAM rubric aligns with their school counseling duties; and 4) offer suggestions for improving the TEAM evaluation.
Data Analysis
We analyzed survey responses using frequency and mean analyses. Further analysis of frequency percentages enabled us to examine the proportion and weight of responses for each item relative to the overall group. Qualitative responses were collected through open-ended questions to expand data collection and enable participants to add information from their experiences. We used inductive analysis to allow themes and patterns in the qualitative responses to emerge (Johnson & Christensen, 2010). Representative samplings of those themes are provided in the analysis of results below.
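As a minimal sketch of the frequency and percentage analysis described above, the tabulation for a single survey item could look like the following. The response data here are hypothetical, used only to illustrate the computation; they are not data from the study.

from collections import Counter

# Hypothetical responses to one multiple-choice survey item
# (illustrative only; not the actual study data).
responses = [
    "very effective", "slightly effective", "slightly effective",
    "not effective", "very effective", "slightly effective",
]

counts = Counter(responses)  # frequency of each response option
total = len(responses)

for option, frequency in counts.most_common():
    percent = 100 * frequency / total  # proportion relative to the overall group
    print(f"{option}: {frequency} ({percent:.1f}%)")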
Results
The TEAM evaluation system requires evaluators to determine what the prevailing evidence reveals about the counselor’s performance. Evaluators must use their judgment to rate what they observe counselors doing or what artifacts counselors provide to demonstrate proficiency on TEAM indicators. Because of the high-stakes accountability for counselors, evaluators must be rigorously trained in the use of the complex elements of TEAM to ensure the reliability and validity of the evaluation system. This training was of critical importance because 12% of the school counselor respondents were evaluated on the TEAM Educator Rubric, which was designed for teacher expectations rather than school counselor responsibilities. Consistency in implementing the School Services Personnel Rubric required counselor evaluators to be adequately prepared. Counselors reported that approximately one in five evaluators were not trained in the use of the School Services Personnel Rubric: 22% for the Delivery of Service domain, 21% for Planning of Services, and 21% for Environment. However, 153 (65%) survey respondents indicated that they collaborated with their evaluators on the use of the TEAM rubrics to some extent or more.
Research Question 1:
What domains and indicators are identified to evaluate the school counselor in the Delivery of Services role?
In the Delivery of Services domain, a majority of survey respondents reported that eleven of the twelve indicators were consistent with their role. Participants indicated that Knowledge of Students (80.3%), Communication (75.2%), Motivating Students (73.9%), Consultation (72.2%), Delivery of Professional Services (71.4%), Professional Content Knowledge (63.2%), Standards and Objectives (60.3%), Problem Solving (59.8%), Activities and Materials (55.1%), Organization of Services (54.3%), and Developing Educational Plans for Students (52.1%) aligned most with their duties as school counselors. Only Service Structure and Pacing was identified by a majority of participants (55.6%) as aligning least with their role as counselors.
Furthermore, 174 respondents (63%) indicated they were directly observed, while 87 (37%) had a conversation with their evaluators regarding their accomplishments related to this domain. Approximately 185 (79%) respondents indicated they showed artifacts to their evaluators, while 38 (16%) did not reveal work documents, and eleven (5%) planned to do so at their next evaluation. Regarding the types of artifacts respondents used to demonstrate their accomplishments in this domain, the primary themes of classroom guidance, counselor logs, calendars of services, communications such as newsletters, and conferences emerged. Regarding the context in which evaluations for this domain occurred, classroom guidance, meetings, conferences, and group counseling emerged as the major themes. Participants’ comments that provided a rich context for understanding their opinions of this process included:
I work at a high school and a K-8 school. My evaluations were done very differently at each school. Classroom guidance . . . and conversations with evaluator.
If I wasn’t able to demonstrate an area in an observation I was penalized and received a lower score, even if it could not be applied to what I was doing.
I was evaluated through my total involvement in one of my schools (I have two schools). I was able to describe everything I do to my evaluator and show evidence of what I do.
Research Question 2:
What domains and indicators are identified to evaluate the school counselor in the Planning of Services role?
In the Planning of Services domain, a majority of survey respondents identified Scope of Work (74.8%) and Evaluation of Services or Programs (54.3%) as the indicators that aligned most with their responsibilities as professional school counselors. Sixty-nine participants (29.5%) were directly observed by their evaluator, and 165 (70.5%) held a conversation with their evaluator. Approximately 170 (72.6%) counselors produced artifacts to support their effective planning, while 50 (21%) did not have an opportunity to reveal documentation, and another 14 (6%) expected to show evidence later. Calendars, reports, classroom standards, needs assessments, correspondence, and program plans emerged as the artifacts most frequently used to demonstrate accomplishments in this domain.
Regarding the context in which the evaluation occurred, the most frequently reported themes included conversations, large-group observation, meetings including parent conferences, and evidence in the form of a notebook or portfolio. Several comments gave us a clearer view of participants’ perspectives on this domain:
Each of these seems to align with the work I do as a counselor in that each is a way for me to inform my administrators what I do and how I plan to improve my counseling programming throughout the year.
I think counseling should be one classroom activity that doesn’t require testing. It was a stretch for me to come up with evaluation examples.
Research Question 3:
What domains and indicators are identified to evaluate the school counselor in the Environment role?
All four indicators in the Environment domain of the TEAM system appear to evaluate aspects of school counselor duties. Survey respondents reported that the Respectful Culture (83.8%), Environment (66.7%), Expectations (66.7%), and Managing Student Behavior (56.8%) indicators aligned most with their role in this domain. One hundred twenty-five participants (54%) were observed, and 109 (46%) held conversations with their evaluator. Approximately 105 (45%) respondents were able to show artifacts, while 20 (9%) were unable to share documentation, and 109 (46%) expected to show evidence later. The artifacts that emerged as the most prevalent themes included classrooms and lessons, letters, use of the Tennessee Model for Comprehensive School Counseling, classroom rules, 504 plans, certificates, and group lessons. The context for the Environment evaluation included areas such as classroom guidance, meetings, parent conferences, and small- and large-group counseling. Comments from participants that provided a richer understanding of this domain include:
I travel from room to room and school to school, which makes it difficult to score me on my physical environment when I am in all these places each day.
[In my] conversations with my evaluator … she also used her knowledge of my relationships with students when considering this domain.
Research Question 4:
To what degree do evaluation rubrics align to the role and responsibilities of the school counselor?
When asked to rate how effectively the TEAM evaluation rubrics measured school counselor duties and responsibilities, 142 (61%) of the respondents indicated that the evaluation system was “slightly effective” to “very effective.” Regarding their confidence in the Tennessee evaluation system to measure their effectiveness as professional school counselors, 128 (55%) of respondents indicated that the system was “slightly effective” to “very effective.” Only one rubric indicator out of nineteen (Service Structure and Pacing) was reported as least effective in measuring aspects of the school counselor role. These perceptions indicate that the TEAM evaluation rubrics align with most duties of Tennessee school counselors.
Discussion and Limitations
School counselor evaluation is a critical component in ensuring professional competence. As stated in the ASCA position statement on evaluation:
The key purpose of the professional school counselor performance evaluation is to enhance the positive affect that the school counselor and the school counseling program have on students and school stakeholders. The professional school counselor . . . initiates the annual development of a management agreement with administrators . . . collaborates with administrators to develop appropriate tools to use in the evaluation of the school counselor and the school counseling program. (ASCA, 2009, para. 3)
A comprehensive, inclusive evaluation takes into account the school counselor’s numerous responsibilities, as specified in a job description built from mutually agreed-upon tasks and goals. Without such a statement of responsibilities, stakeholders will be disappointed when expected tasks are not accomplished, and professional evaluations will be a product of stakeholder assumptions about what the school counselor should be doing. Unfortunately, many school counselors are not able to perform the tasks for which they have been trained, and many are evaluated on a teacher’s evaluation rubric that has little relationship to the school counselor’s role.
The TEAM School Services Personnel Rubric is a positive step toward aligning the school counselor’s annual evaluation with training in a CDSC program. When used as intended, it creates an opportunity for the school counselor and evaluator to dialogue and exchange information regarding appropriate tasks. Approximately 65% of the respondents indicated that they collaborated with the evaluator to “some extent” or a “great extent” during the evaluative process.
Unfortunately, 43% of the respondents reported that their evaluator had not received training in all rubric domains, and approximately 12% were evaluated on the TEAM Educator Rubric designed for teachers because the evaluator either: 1) did not have knowledge of the rubric constructed for school counselors; 2) chose not to use this rubric; or 3) had not received training on the rubric.
In this study, participants revealed that the domains and indicators needed to be better defined, particularly since many of the categories seemed to overlap. This overlap was evident in that many of the same artifacts used to demonstrate effectiveness emerged as themes within each of the domains; for example, calendars, communications, plans, and conferences were mentioned in all domains. Despite these concerns, approximately 61% of respondents rated the TEAM rubric as “slightly effective” to “very effective” in measuring their duties and responsibilities as professional school counselors.
The TEAM rubric represents a preliminary effort to enhance assessment strategies so that school counselors can more effectively demonstrate their contributions to academic achievement. Although the rubric had been in existence for only one year at the time of this study, the preliminary results are positive. As with any new enterprise, revisions are made as feedback is provided and analyzed; for instance, the number of required observations was reduced based on the feedback that was given.
The limitations of this study include the number of counselors who returned completed surveys. As indicated previously, only those who chose to be members of the listserv received the survey; therefore, the number of school counselors who were evaluated on the TEAM model but were not members of the listserv is unknown. Furthermore, counselors who were evaluated on a rubric other than the TEAM rubrics did not respond to this study.
This study has implications for personnel in other states who intend to revise their state evaluation systems for school counselors. Although instruments such as the School Counselor Performance Standards from the ASCA National Model® (ASCA, 2012) have been developed as school counselor assessment tools, a statewide initiative that prescribes an evaluation based on conversations and documentation creates greater potential for administrators to understand the school counselor’s role. Furthermore, state-mandated assessments give administrators more incentive to learn about the required assessment procedures. When school counselors and administrators have an opportunity to dialogue, share perspectives, and discuss issues, school counselors are in a better position to reveal how they are integral to the educational mission. The fact that a majority of the respondents indicated they were able to collaborate with their evaluator is a positive step in communicating the school counselor’s role. Additional recommendations for advocacy include:
providing the administrator with a copy of the ASCA Ethical Standards
conducting regularly scheduled meetings with administrators to discuss issues that are negatively impacting students or the school culture, while being mindful of confidentiality
documenting and revealing process, perception, and outcome data to provide a broader picture of contributions to educational success
informing the administrator of personal and professional goals at the beginning of the year and communicating goal attainment at the end of the academic year.
School administration training programs are able to use this information to better prepare pre-service administrators for their evaluator role and to promote the role of the school counselor as a vital educational professional who assists in improving school performance. Furthermore, counselor education programs are able to teach students about the context for evaluations, the areas in which their performance will be assessed, and how to be proactive in collecting artifacts. Finally, this evaluative process serves as a catalyst for practitioners to be more actively involved in further data collection to demonstrate how students have grown academically, vocationally, and personally/socially.
Summary
Educational reform is a major federal and state initiative, and the Race to the Top competition has provided a new opportunity for states to rethink methods for educational transformation. Tennessee was one of the first states to receive federal monies through this initiative, largely on the strength of its proposal to design a rigorous educator evaluation system grounded in student performance. One of the reform efforts was the development of the TEAM evaluation system for school services personnel, which includes improved assessments for school counselors. This rubric evaluates school counselors within the domains of Delivery of Services, Planning of Services, and Environment. Indicators that more specifically identify these areas are found within each of these domains. Performance ratings of Tennessee school counselors are determined through direct observations, conversations between evaluators and counselors, and the examination of artifacts that show what counselors do within each domain and indicator.
We conducted this study to determine the domains and indicators most commonly used for evaluation purposes and the extent to which the rubric aligned with the role of the school counselor. Results indicated that many of the indicators aligned with school counselor duties and that the majority of school counselors believed the rubric was effective in measuring their duties and responsibilities. As other states undertake educational reform efforts, this rubric can serve as a template and as a major tool for helping stakeholders understand the school counselor’s role.
References
American School Counselor Association. (1999). The professional school counselor and annual performance evaluation. Alexandria, VA: Author.
American School Counselor Association. (2009). The professional school counselor and annual performance evaluation. Alexandria, VA: Author.
American School Counselor Association. (2012). The ASCA National Model: A framework for school counseling programs (3rd ed.). Alexandria, VA: Author.
Aubrey, R. F. (1985). A counseling perspective on the recent educational reform reports. The School Counselor, 33, 91-99.
Baker, S. B. (2012, December). A new view of evidence-based practice. Counseling Today. Alexandria, VA: American Counseling Association.
Johnson, R. B., & Christensen, L. B. (2010). Educational research: Quantitative, qualitative, and mixed approaches (4th ed.). Boston, MA: Pearson Education, Inc.
Sink, C. A. (2009). School counselors as accountability leaders: Another call for action. Professional School Counseling, 13, 68-74.
The White House, Office of the Press Secretary. (2009). Promoting innovation, reform, and excellence in America’s public schools. Retrieved from http://www.whitehouse.gov/the-press-office/fact-sheet-race-top
U.S. Department of Education, ED.Gov. (2010). Delaware and Tennessee win race to the top grants: Archived information. Retrieved from http://www2.ed.gov/news/pressreleases/2010/03/03292010.html