Texas State Technical College
Institutional Effectiveness Handbook
2015 Edition
Table of Contents

I. Introduction
II. TSTC Purpose: The Foundation for Institutional Effectiveness
III. Vision and Values
IV. What is Institutional Effectiveness?
V. Support of the Process
VI. Measuring Institutional Effectiveness
   Key Performance Indicators
   TSTC OPM
   Role of IR in Measuring IE
VII. Strategic Planning and Assessment
   Strategic Planning
   Assessment
   Related SACSCOC Standards
   Integrated Budgeting
VIII. Functional Operational Plans and Unit Action Plans
   Functional Operational Plans
   Unit Action Plans
   Student Learning Outcomes
   Administrative Outcomes
   Program Review
IX. Assessment Cycle and Closing the Loop
X. Appendices
I. INTRODUCTION
Texas State Technical College (TSTC) has assembled the college’s methodologies, practices,
philosophies and intent to ensure institutional effectiveness and continuous quality improvement
in this, the Institutional Effectiveness Handbook. This handbook is the compendium of the
college’s approach to:
• Strategic Planning
• Implementation
• Assessment
• Policy Guidance
• Research
• Evaluation
• Regulatory Reports
In producing the Institutional Effectiveness Handbook, TSTC intends to inform and guide college
and community partners on our procedures for strategic planning, program operation and
objectives, evaluation and budgetary development in a data-driven environment.
Continuous Quality Improvement is the maxim that guides the TSTC assessment processes,
evaluating all instructional departments and college service units.
The comprehensive nature of the procedures detailed in this Institutional
Effectiveness Handbook is an indication of TSTC’s commitment to excellence and
effectiveness in executing its mission.
TSTC believes in the value of collaboration and teamwork in our collective pursuit of college
excellence, quality programs and service enhancement to facilitate an atmosphere of
engagement for our students. Feedback and suggestions regarding this handbook and the
institutional effectiveness processes are always welcomed.
II. TSTC PURPOSE: THE FOUNDATION OF INSTITUTIONAL EFFECTIVENESS
Texas State Technical College implements activities and initiatives to improve educational
programs and services that are responsive to students’ needs for educational quality and for the
advancement of the overall organizational effectiveness and capabilities of the college.
Texas State Technical College is accredited by the Southern Association of Colleges and Schools
Commission on Colleges to award Associate degrees and Certificates of Completion.
As it relates to the standards of accreditation, effective and accountable institutions develop an
appropriate mission statement and demonstrate accomplishment of the ideals stated within that
mission. To this effect, the College’s Mission and Expanded Statement of Purpose provide
guidance for administrative decisions regarding the overall direction of the institution; provide
direction for planning, operating and evaluating each of the college’s departments and programs;
and establish general guidelines for the process of assessing and improving institutional
effectiveness.
TSTC is guided by a legislative mission statement defined by the State Legislature for the
Texas State Technical College System in Senate Bill 1222 of the 72nd session. Later, after a
comprehensive state study of significant issues and roles throughout the state’s higher education
arena, the legislature redefined the mission of the TSTC
System as set forth in Section 135.01 of the Texas Education Code:
Texas State Technical College System is a coeducational two-year institution
of higher education offering courses of study in technical-vocational
education for which there is a demand within the State of Texas.
Texas State Technical College System shall contribute to the educational and
economic development of the State of Texas by offering occupationally
oriented programs with supporting academic course work, emphasizing
highly specialized advanced and emerging technical and vocational areas for
certificates or associate degrees. The Texas State Technical College System is
authorized to serve the State of Texas through excellence in instruction,
public service, faculty and manpower research, and economic development.
The system’s economic development efforts to improve the competitiveness
of Texas Business and industry include exemplary centers of excellence in
technical program clusters on the system’s campuses and support of
education research commercialization initiatives.
Through close
collaboration with business, industry, governmental agencies and
communities, including public and private secondary and postsecondary
educational institutions, the system shall facilitate and deliver an articulated
and responsive technical education system.
In developing and offering highly specialized technical programs with related
supportive course work, primary consideration shall be placed on industrial
and technological manpower needs of the state. The emphasis of each Texas
State Technical College System campus shall be on advanced or emerging
programs not commonly offered by public junior colleges. (TEC 135.01)
The TSTC Board of Regents periodically reviews the mission statement as defined by the
legislature and adopts an Expanded Statement of Purpose. The most recent version of the
statement was approved on February 15, 2015, and is worded as follows:
Texas State Technical College (TSTC) is a coeducational two-year, multi-campus
institution of higher education providing innovative and responsive programs and
courses of study in technical education for which there is demand in the State of
Texas, with emphasis on advanced and emerging technologies. TSTC is a leader in
building the economic vibrancy of Texas by providing excellence in learning
experiences, on location and at a distance, and through diverse technical programs
and rigorous curricula offerings. TSTC facilitates the transfer of technical expertise
through the placement of former students, who have obtained hands-on learning
experience, in jobs with Texas business and industry. TSTC works collaboratively
both internally and with other organizations to increase the availability of relevant
technical education in Texas and to be accountable to its various constituencies.
Integrity in all of its dealings provides the foundation of TSTC’s mission.
TSTC awards include Associate of Applied Science degrees, Certificates of
Completion, badges (skill-set institutional awards) and workforce certificates. TSTC
also provides opportunities for the seamless transfer of credits to other colleges
and universities, including awards at its Harlingen campus for Associate of Science
degrees and institutional recognitions for completion of the General Education
Core curricula.
TSTC makes higher education affordable, readily accessible and personal through
multiple instructional delivery systems, counseling and guidance services, student
activities and the opportunity to learn in a residential setting at several of its
campuses. By offering TSTC programs and services in flexible times and places,
TSTC students are able to achieve their educational and career goals at a pace that
meets their needs while minimizing the elapsed time needed to reach those goals.
To achieve time and place flexibility, TSTC offers traditional higher education credit
programs taught on a semester basis, dual credit programs that lead to marketable
skills achievement or further education (in partnerships with Independent School
Districts), competency-based education and training delivery, online instruction,
project-based learning activities, continuing education, and specialized training for
business and industry. TSTC operates its programs and services in accordance with
the public trust for which it is responsible.
Diversity in the student body and in faculty and staff is a value that TSTC strives to
achieve. It is TSTC’s goal for the ethnicity of these groups to mirror statewide and
local demographics. Likewise, serving non-traditional and special population
groups has always been a TSTC keynote, with specialized services provided to
assist where and when needed.
TSTC’s Statewide Operating Standard (SOS) for Institutional Effectiveness and Institutional
Research (IE/IR) provides a process for review of the Expanded Statement of Purpose and any
necessary revisions recommended through the review process (see Appendix A).
Appropriateness of Expanded Statement of Purpose
SACSCOC requires institutions to justify the appropriateness of their mission to higher
education. For TSTC, the legislative mission is appropriate to higher education, as it specifically
states that TSTC “is a co-educational two-year institution of higher education.” The Texas
Legislature, which established the Texas Higher Education Coordinating Board (THECB) as the
custodian and regulator of all public higher education in the state, expects TSTC to comply
with all applicable state requirements, expected outcomes and benchmarks, and maintain the
integrity of its legislated role.
The achievement of Closing the Gaps standards established by the THECB presumes that TSTC
operates in accordance with all state standards for higher education. The THECB reviews the
college’s compliance with all Closing the Gaps standards formally and reports results to the
legislature both directly and through the Legislative Budget Board (LBB).
Teaching and Learning and the TSTC Purpose
Teaching and learning are at the core of both TSTC’s legislative mission and its Expanded
Statement of Purpose. In fact, the legislated mission specifically defines the realm of the
institution’s focus by emphasizing “occupationally oriented programs with supporting academic
coursework, emphasizing highly specialized advanced and emerging technical and vocational
areas for certificates or associate’s degrees…through excellence in instruction.” Furthermore,
collaboration with other entities provides the institution the opportunity to “facilitate and
deliver an articulated and responsive technical education system.” The Expanded Statement of
Purpose for TSTC brings even more into focus the college’s primary mission of teaching and
learning, and is reviewed a minimum of once every five years but may be updated more often as
conditions change.
Public Service and the TSTC Purpose
TSTC has been assigned a very specific purpose in higher education for the state of Texas. As the
only state-supported technical education system in Texas, the role and purpose of TSTC is very
clear to all who work to carry out this responsibility. Therefore, the college combines all
assessment and evaluation information to determine the most appropriate priorities and
activities for the college to pursue, thus promoting institutional effectiveness as part of routine
operations.
Likewise, public service as defined by TSTC is the combined efforts of the college to
encourage economic development and provide access to education and training needed to
enable students to reach their potential. The legislated mission notes that the TSTC System
“shall contribute to the educational and economic development of the state of Texas…(and)
through close collaboration with business, industry, governmental agencies and communities,
including public and private secondary and postsecondary educational institutions…(to) facilitate
and deliver an articulated and responsive technical education system.” As noted in the
second paragraph of the Expanded Statement of Purpose, TSTC promotes economic
development by making higher education affordable, readily accessible, and personal
through diverse technical programs and rigorous curriculum offerings.
Procedure for Review of Expanded Statement of Purpose
TSTC reviews the college-specific Expanded Statement of Purpose and provides the structure for
its periodic study and revision. The legislative mission was developed to guide the TSTC System
in its entirety and specify the distinct state-wide role of the System to promote technical
education and meet the workforce needs of Texas. The legislated purpose statement cannot be
amended without the consent of the state legislature.
Revision of the Expanded Statement of Institutional Purpose is initiated by the Strategic
Planning Committee and conducted at least once every five years or more frequently when
required by changing conditions. The Strategic Planning Committee, in coordination with the
Vice Chancellor for Business Intelligence, ensures that sufficient time has been allotted to the
review of the Expanded Statement of Purpose by all constituents. When a revised Expanded
Statement of Purpose is finally approved by the TSTC Board of Regents, the approved
statement is published and included in the College Catalog, Student Handbook, and college
website.
III. VISION AND VALUES
The primary vision of the Texas State Technical College is to be a leader in strengthening the
competitiveness of Texas business and industry by building the state’s capacity to develop the
highest quality workforce. Our vision is put into practice through the integration of the following
core values that drive all aspects of our work:
• Integrity: Dealing honestly and openly with all of our constituencies and with one another
• Excellence: Achieving the highest quality in all we do
• Leadership: Developing visions and strategies for a desired future, and aligning and energizing people to achieve those visions
• Innovation: Creating and implementing new ideas and methods
• Collaboration: Working cooperatively with other organizations and within our own system
• Responsiveness: Providing appropriate programs and services in a proactive, flexible, and timely manner
• Accountability: Measuring our performance and using the results for improvement
• Stewardship: Ensuring our programs and services add value to our students and communities throughout the state, and operate in accordance with the public trust for which we are responsible
• Diversity: Striving for inclusivity in our faculty, staff and students as reflected in state demographics; treating others fairly and equitably as we would all like to be treated
Figure 1: TSTC Values
IV. WHAT IS INSTITUTIONAL EFFECTIVENESS?
As described in the Southern Association of Colleges and Schools Commission on Colleges
"Manual for the Principles of Accreditation: Foundations for Quality Enhancement" (2012
Edition), institutional effectiveness is the "ongoing, integrated, and institution-wide researchbased planning and evaluation process that (1) incorporates a systemic review of institutional
mission, goals, and outcomes; (2) results in continuing improvement in institutional quality;
and (3) demonstrates the institution is effectively accomplishing its mission" (Core
Requirement, 2.5, p. 16).
Institutional Effectiveness (IE) is the term used to describe the process of continuous quality
improvement. By definition, IE is the ability of an institution to match its performance to its
established purposes as stated in its mission (source). In practice, it is a tangible measure of
the institution’s performance and impact on the community which it serves. The measures
used to determine effectiveness are varied based upon the number of individual activities
and initiatives implemented throughout the College’s instructional, administrative, and
instructional support units. By setting high expectations and standards for institutional
performance, the college continually measures its performance against not only its own
expectations, but the extent to which the college’s mission is being fulfilled.
The Institutional Effectiveness & Research Offices (IE&R) facilitate the functions of
institutional effectiveness for the college. IE&R staff support college administrators,
divisions, and programs in the continual planning that is driven by data and outcomes
collected from various evaluations and assessments conducted across the college and among
various groups of students and other college constituents. This data is used to
make informed planning decisions. The IE framework established is integrated and
comprehensive, affecting all levels of the institution. It is not only the quest for quality that
is a part of the institution’s culture, but the need to be accountable to the constituents of the
State of Texas, the students who attend the college, the federal government whose
educational funds enable students to attend TSTC, and the employers for whom the college
prepares students. Federal, state, and local agencies hold TSTC to performance standards
expected by their respective agencies or organizations. TSTC has integrated performance
indicators with internally developed standards of excellence to measure
effectiveness.
V. SUPPORT OF THE PROCESS
The Institutional Effectiveness (IE) Committee
The Institutional Effectiveness Committee is the primary advisory body to the Policy
Development and Execution process of the chief administrative body – the Executive
Management Council (EMC) at Texas State Technical College. The composition of the Executive
Management Council is included in Appendix B.
The mission of the Institutional Effectiveness Committee at Texas State Technical College is to
facilitate and monitor planning and evaluation processes that support the mission, the vision, and
the values of the College.
Core activities and/or responsibilities of the Committee include:
1) Monitoring planning and evaluation processes for:
a. academic and technical programs and educational support units;
b. the College Strategic Plan, Functional Operating Plans and Unit Action Plans; and,
c. institutional data collection and reporting.
2) Training I.E. managers and I.E. contact persons in developing and documenting I.E.
plans and assessment processes.
3) Consulting with faculty, staff and leadership regarding institutional effectiveness
matters.
4) Facilitating communications of progress toward institutional goals and objectives as
they relate to institutional effectiveness; and,
5) Overseeing activities related to program-specific and regional institutional
accreditation.
Standing Committees
Role of the Standing Committees in the Recommendation Process
The majority of initiatives for institutional improvement originate from recommendations of
various college standing committees. The work of the TSTC standing committees is essential to
the organized structure for institutional effectiveness. Committee recommendations are used to
inform strategic planning and operational planning, where the majority of improvement activities
are documented and integrated into either institution-wide functional planning or unit-specific planning.
Committees hear and consider various recommendations resulting from their work and findings.
The committee votes on these recommendations and formulates reports and/or proposals for
submission to the appropriate administrator. Administrators then bring these recommendations
to Executive Management Council for consideration. The Executive Management Council (EMC)
reviews the proposals of the standing committees for consideration and action. EMC may elect to
take immediate action on the recommendation/proposal or may determine it appropriate to
consider the recommendation/ proposal for inclusion in the subsequent cycle of strategic and
operational planning.
The Vice Chancellor for Business Intelligence notifies the IE committee (administrators notify
their standing committee) of the action taken by EMC and appropriate follow-up occurs by the
IE&R Office. An ad hoc committee may be established by the EMC or another administrator to study
and make recommendations to the EMC on issues of relevance to the planning, assessment,
evaluation, budgeting, and improvement efforts of the college. The IE&R Offices at each campus
support the inclusion of recommendations into the appropriate plans so that improvement plans
can be developed and resources secured.
Institutional Effectiveness Standing Committee
The Institutional Effectiveness Standing Committee supports the activities associated with
institutional effectiveness.
This committee is comprised of a minimum of eight members representing all divisions of the
college. Members of the IE standing committee, as well as of other college standing committees,
serve a three-year term, with one-third of the membership rotating off the committee
each year. A senior staff member of the IE&R Office is a non-voting, ex-officio member of the IE
standing committee. The IE&R representative presents recommendations from the IE standing
committee to EMC for determination of how to address all submitted recommendations and
concerns.
Institutional Effectiveness Cycle and Components
TSTC has developed an Institutional Effectiveness Cycle based on the Plan-Do-Study-Act
(PDSA) model of quality management pioneer Dr. W. Edwards Deming (1993). This model
presents a continuous cycle of planning, implementation, evaluation, and responsive actions
to improve processes. Planning is designed to coincide with the budgeting process, while
evaluation of performance outcomes is designed to coincide with year-end reports available
from institutional research and periodic benchmarking assessments.
[Figure content: the cycle’s stages (Plan, Implement, Evaluate, Respond), supported by Budgeting and Institutional Research.]
Figure 2: TSTC Institutional Effectiveness Cycle
In addition to the Institutional Effectiveness Cycle, which illustrates the overall continuous
improvement process, an Institutional Effectiveness Model, shown in Figure 3, depicts the
various levels of planning and assessment through the college’s divisions and departments. The
foundation of the pyramid is the mission, vision and values that guide all planning processes.
Planning and evaluation processes are comprised of the Strategic Plan, Functional Operating
Plans and Unit Action Plans. Organizational assessments and surveys support the three levels of
planning and evaluation with data for decision-making:
• Accountability Report of Texas Higher Education Coordinating Board (ART)
• Alumni Employment Survey (AES)
• Community College Survey of Student Engagement (CCSSE)
• Employee Assessment of College Environment Survey (ACE)
• Student Environment Survey (SES)
• Internal Operational Reports
Figure 3: TSTC’s Institutional Effectiveness Model
IE is a "top-down" and a "bottom-up" process
The Institutional Effectiveness process is both “bottom-up” and “top-down” and is guided by
our mission, vision, and values. Individual programs develop unit action plans and functional
areas develop functional operating plans to annually assess progress toward goals and for
continuous quality improvement. Plans, performance and adherence to the mission, vision
and values are reviewed by Vice Chancellors, Associate Vice Chancellors, Division Directors, and
by Functional Area Leads, all with an eye toward continuous quality improvement (CQI).
Thus, all segments of the college are involved in the planning process and responsible for
continuous quality improvement.
TSTC embraces and promotes advancement and growth by applying a comprehensive strategic
planning approach that provides a framework for determining the college’s direction, priorities,
and methods for adapting to change without sacrificing the quality of programs, services, or
activities. TSTC promotes a culture among its students and employees in which excellence is expected,
but in which excellence can only be achieved through continuous self-analysis and directed
improvement of identified weaknesses.
The following highlights the components of the TSTC Institutional Effectiveness Model:
1. Strategic Planning Committee is charged with monitoring and evaluating adherence to, and the
effectiveness of, internal mechanisms established for the attainment of institutional
effectiveness through planning. The committee:
• monitors the established planning cycles and recommends items to be included in subsequent Strategic Plans;
• reviews the Strategic Plan and other plans as needed;
• conducts periodic reviews of the TSTC Expanded Statement of Purpose;
• communicates its findings with the Institutional Research and Measures and Standards Committees for their consideration and inclusion into subsequent improvement plans or long-term monitoring of areas of concern; and
• submits recommendations to the Executive Management Council.
2. Measures and Standards Committee is charged with reviewing outcomes from all Key
Performance Indicators (KPIs), federal and state mandates, and other performance measures
to determine the extent to which the college is fulfilling its stated mission/purpose and
producing successful student outcomes. The committee:
• monitors that the College is being measured against valid and reasonable indicators and the extent to which the college is meeting its overall performance goals;
• reviews and revises KPIs for relevance and assists TSTC in being responsive to internal and external changes;
• alerts the Strategic Planning Committee regarding unmet performance measures and standards to be addressed in the subsequent cycle of strategic planning and administrative outcomes;
• communicates its findings to the Institutional Research and Strategic Planning Committee for their consideration and inclusion into subsequent improvement plans or long-term monitoring of areas of concern; and
• submits recommendations to the Executive Management Council.
3. Institutional Research Committee is charged with oversight of institutional research
activities. The committee:
• reviews selected reports of results from data, surveys, and studies;
• identifies opportunities for improvement in operations, service, programs, or assessment outcomes based on the results of the data, surveys, and studies;
• communicates its findings with the Strategic Planning and Measures and Standards Committees for their consideration and inclusion into subsequent improvement plans or long-term monitoring of areas of concern; and
• submits recommendations to the Executive Management Council.
4. Assessment Standing Committees
• Program Student Learning Outcomes Assessment Committee is charged with reviewing all program student learning outcome annual assessment plans and reports. The committee ensures that all technical and academic programs are in compliance with the assessment schedule and that assessment results are being utilized to make improvements to program curriculum and instruction. The committee consists of faculty members, who work in teams of two to evaluate assessment plans and reports and ensure the results are entered into the college-designated tracking system. The teams work with department chairs to resolve minor issues as needed.
• General Education Assessment Committee is charged with developing, managing, and reviewing the achievement of the core competencies identified for the general education academic core. The committee consists of faculty and staff from the General Education division.
5. Program Assessment and Improvement Review (PAIR) Teams oversee the implementation
of the five-year program review cycle and form teams within the membership to conduct
program reviews of instructional programs. The teams ensure follow-up on all
deficiencies identified in program reviews.
VI. MEASURING INSTITUTIONAL EFFECTIVENESS
TSTC Key Performance Indicators
The Measures and Standards Committee is charged with the oversight and development of the
college’s fundamental standards of performance that demonstrate institutional quality. This
committee conducts a biennial review of the twelve Key Performance Indicators (KPIs), which
represent the key metrics by which TSTC measures its progress and fulfillment of its institutional
goals and objectives. The KPIs serve as the fundamental performance indicators for the college.
The twelve KPIs, listed below with their corresponding definitions, are the factors that TSTC has
determined to be the best demonstration of excellence and effectiveness in accomplishing its
mission. Appendix C includes an expanded description of these indicators as well as historical
performance outcomes and projected target rates for each of these measures.
• KPI #1: Annual Unduplicated Count of Semester Credit Hour Students: This is a
measure of the annual headcount of students enrolled in semester credit hour classes,
counting a student one time, regardless of the number of classes or terms in which a
student is enrolled during the academic year.
• KPI #2: Annual Unduplicated Count of Former Students Found Working or
Transferring to another public college or university in Texas (Placement): Placement
results are available through Texas Higher Education Coordinating Board (THECB)
Automated Student and Adult Learner Follow-Up System (ASALFS) Exit Cohort
Reports, which track all students who were at the college the previous fiscal year and
who were not present the following fiscal year whether due to graduation, transfer, or
other reasons. Included in the placement count are students who were found working
in the Texas Workforce Commission (TWC) Unemployment Insurance (UI) wage
records and in national databases including the Office of Personnel Management, US
Post Office, and military records from the Department of Defense, in October,
November, and/or December of the fiscal year following their exit from TSTC.
Placement counts include all former TSTC students (except continuing ed) from the
prior academic year who are “found working” or who have been found “enrolled” at
another public institution of higher education in Texas.
• KPI #3: Average First Year Salary of Former Students Found Working (including
Leavers): Salary results are available through Texas Higher Education Coordinating
Board (THECB) Automated Student and Adult Learner Follow-Up System (ASALFS) Exit
Cohort Reports, which track all students who were at the college the previous fiscal
year and who were not present the following fiscal year whether due to graduation,
transfer, or other reasons. Included in the average salary calculation are students who
were found working in the Texas Workforce Commission (TWC) UI wage records and
in national databases including the Office of Personnel Management, US Post Office,
and military records from the Department of Defense, in October, November, and/or
December of the fiscal year following their exit from TSTC. Includes former students
found to be “working only” (not enrolled in higher education in Texas).
• KPI #4: Number of Annual Graduation Awards: Graduation data is reported to
THECB on the CB-M009 report and this measure reflects the total number of degrees
and certificates awarded during the fiscal year. This is a count of awards rather than
graduates. A graduate may earn more than one award in an academic year, and this
measure counts each award earned during that year.
• KPI #5: Number of Students Graduated Annually: From the CB-M009 report, this
measure counts the number of students who graduated in a fiscal year so that a
graduate is counted only one time for the year, regardless of the number of awards
received.
• KPI #6: Percent of First-time, Full-time Degree or Certificate-seeking Students
Graduated within 3 years: First-time, Full-time Degree or Certificate-seeking Students
who Graduated Within Three Years with Either an AAS Degree or Certificate - Students
who (1) enrolled TSTC for the first time, (2) were enrolled for 12 or more semester
credit hours as of the official census date in their first semester of enrollment, (3)
declared they were seeking either an Associate of Applied Science (AAS) degree or a
vocational/technical certificate in their first semester of enrollment, and (4) graduated
within three years as a percentage of all such students in initial enrollment cohorts.
• KPI #7: Average Student Satisfaction Index: Students are surveyed to determine
their satisfaction with all aspects of TSTC operations, including facilities, student
learning processes, student life, safety, and other aspects of the student experience.
• KPI #8: Student Ethnicity Compared to Statewide Population Ethnicity: This
measures the difference in percentage points of the TSTC student population against
statewide averages (all ages included) of ethnicity.
• KPI #9: Funds Raised Annually: Fund raising measures the total amount of
donations, both monetary and non-monetary, at the campuses and administration,
including amounts donated to TSTC through The TSTC Foundation. All amounts by
category are summarized together. Targets are set for monetary donations.
• KPI #10: Annual Number of Donors: This measure provides the annual unduplicated
count of donors of both monetary and non-monetary contributions for the fiscal year.
• KPI #11: Employee Satisfaction Index: Employees are surveyed to determine their
satisfaction with all aspects of their TSTC employment. These surveys are scored on a
scale of 1 to 5, with 5 representing total satisfaction.
• KPI #12: Employee Ethnicity Compared to Statewide Population Ethnicity: This
measures the number of percentage points TSTC employee ethnicity percentages are
away from statewide ethnicity percentages. These results are based on the ethnicity
of all employees college-wide. All other categories are combined and measured
against Anglo populations.
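To make the quantitative KPI definitions above more concrete, the short Python sketch below illustrates, under assumed record layouts, how a few of these indicators could be computed: the three-year graduation rate (KPI #6), the percentage-point ethnicity gap (KPIs #8 and #12), and a satisfaction index on a 1-to-5 scale (KPIs #7 and #11). The field names and sample values are illustrative placeholders and are not drawn from TSTC or THECB systems.

    # Illustrative sketch only; field names and sample data are hypothetical,
    # not the layouts used by TSTC or THECB reporting systems.
    from statistics import mean

    def three_year_grad_rate(cohort):
        """KPI #6: share of a first-time, full-time, award-seeking cohort
        that graduated within three years."""
        eligible = [s for s in cohort
                    if s["first_time"] and s["full_time"] and s["award_seeking"]]
        grads = [s for s in eligible if s.get("graduated_within_3_years")]
        return len(grads) / len(eligible) if eligible else 0.0

    def ethnicity_gap(college_pct, statewide_pct):
        """KPIs #8 and #12: percentage-point difference between the college's
        and the statewide ethnicity percentages, by group."""
        return {g: round(college_pct.get(g, 0.0) - statewide_pct.get(g, 0.0), 1)
                for g in statewide_pct}

    def satisfaction_index(scores):
        """KPIs #7 and #11: mean of survey items scored on a 1-to-5 scale."""
        return round(mean(scores), 2)

    cohort = [
        {"first_time": True, "full_time": True, "award_seeking": True,
         "graduated_within_3_years": True},
        {"first_time": True, "full_time": True, "award_seeking": True,
         "graduated_within_3_years": False},
        {"first_time": True, "full_time": False, "award_seeking": True},  # excluded: not full-time
    ]
    print(three_year_grad_rate(cohort))                            # 0.5
    print(ethnicity_gap({"Hispanic": 62.0}, {"Hispanic": 39.0}))   # {'Hispanic': 23.0}
    print(satisfaction_index([4, 5, 3, 4, 4]))                     # 4.0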
By focusing the planning and evaluation activities of the college around these important factors, the
college gains a straightforward, holistic picture of how it is performing. The KPIs serve as a
baseline against which to measure the performance of the college.
Each KPI is supported by performance standards from the state or other external controls, as well
as internal standards that reflect the goals and expectations the college sets for itself.
Performance standards may change as regulations and mandates change, and internal
benchmarks may also be adjusted to continue to raise the bar on expected outcomes and
performance.
The twelve indicators are reviewed at least every five years, typically in conjunction with the
review of the Expanded Statement of Purpose for changes that reflect revisions made to the
Statement. This process also ensures that all measures and benchmarks used to gauge college
performance are current.
Additionally, department and college planning is tied to the achievement of the performance
standards through this process, thus providing direction in TSTC 's attempt to “close the loop” on
identified college weaknesses. All measures used to support the accomplishment of each KPI
target are expected to be met or exceeded. When one of these measures is not met at the
institutional or department level, the improvement plan developed to comply with established
measures and outcomes is integrated into the established budgeting, assessment, and planning
cycles. The areas identified as not meeting established measures and standards across the KPIs then
become part of a unit’s or division’s goals in the functional or divisional annual assessment
plans, and continue to be part of the plan until the established measures are met or exceeded,
thereby ensuring achievement of the TSTC Expanded Statement of Purpose.
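As a simple illustration of how unmet measures might be flagged for inclusion in subsequent plans, the hedged Python sketch below compares hypothetical KPI results against hypothetical targets and lists the shortfalls; the names and values shown are placeholders, not actual TSTC targets.

    # Hypothetical KPI results and targets, for illustration only.
    kpi_results = {
        "KPI #6: three-year graduation rate": {"actual": 0.18, "target": 0.22},
        "KPI #7: student satisfaction index": {"actual": 4.1, "target": 4.0},
    }

    # Measures that fall short of target roll into the next planning cycle
    # as unit or divisional goals until they are met or exceeded.
    unmet = {name: v for name, v in kpi_results.items() if v["actual"] < v["target"]}
    for name, v in unmet.items():
        print(f"Unmet: {name} (actual {v['actual']}, target {v['target']})")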
TSTC Oregon Performance Measures (OPM)/Institutional Scorecard
The TSTC Oregon Performance Measure Report/Institutional Scorecard serves to focus the various
planning and evaluation activities of the college. It provides a means to achieve a relatively
straightforward picture of the college’s overall performance and the extent to which TSTC is
achieving its stated goals. The OPM provides information on a number of factors essential to the
success of the college. The report reflects data for the previous biennium and profiles institutional
achievement of the KPI targets in the context of the goals and objectives outlined in the strategic
plan. This report is presented to the Executive Management Council and the TSTC Board of
Regents.
Role of Institutional Research in Measuring Institutional Effectiveness
The Institutional Effectiveness and Research functional leads are the institution’s facilitators of
assessment activities and processes, including developing and/or issuing surveys,
collecting and analyzing institutional data for reporting, disseminating information to the
college’s IE committee, standing committees, and the general public in a “user friendly” format,
and incorporating the findings into the overall improvement efforts of the college. The IE&R offices
report findings and recommendations to the IE committee and the Vice Chancellor for Business
Intelligence. The work of the department includes, but is not limited to:
• Assisting in the formulation of surveys and other instruments developed;
• Reviewing and making recommendations concerning the proposed use of any
commercially developed surveys, tests, and/or other instruments;
• Assisting in the preparation of reports and other documents containing
recommendations for the Vice Chancellor of Business Intelligence to forward to
Executive Management Council;
• Assisting the Vice Chancellor of Business Intelligence in disseminating the results of
institutional studies and assessment initiatives to members of the college community;
• Reviewing surveys and other instruments prepared by other entities within the
organization for approval and/or input before administration of these surveys;
• Evaluating the effectiveness of the institutional research process in utilizing the
findings for improvement of institutional research; and,
• Reviewing established procedures and practices for incorporating the results of various
measures into the improvement efforts of the college.
The findings of studies, surveys, and/or tests are disaggregated when possible and
reported to the appropriate Vice Chancellors, Associate Vice Chancellors, Divisional Vice
Presidents, Division Directors, Directors and/or Department Chairs. Methods of distribution and
communication of results are done in both written and electronic formats. The
department/program supervisors evaluate all relevant data and information pertinent to their
own specific operations, and prepare improvement plans. The Vice Chancellors, Associate
Vice Chancellors and/or other EMC members determine whether the information requires
immediate action, development of administrative assessment plans, and/or intervention by the
TSTC Administrative authorities. The responsible administrator considers the submitted
recommendation and takes appropriate action. The response from the unit supervisor and/or
administrator is presented for discussion at the subsequent IE committee meeting, and a record
of the proceedings is incorporated into the minutes of the meeting. The IE&R Office
then follows up on the recommendations for improvement by tracking action taken by the
administration and/or the department or program.
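As one hypothetical example of the disaggregation step described above, the sketch below uses the pandas library to break out survey results by campus and program before they are distributed; the file name and column names are assumptions for illustration, not actual IE&R data sources.

    # Illustrative only; "satisfaction_responses.csv" and its columns are
    # assumed placeholders, not an actual IE&R extract.
    import pandas as pd

    responses = pd.read_csv("satisfaction_responses.csv")

    # Mean score (1-5 scale) and respondent count by campus and program, so
    # results for small groups can be interpreted cautiously or suppressed.
    disaggregated = (
        responses.groupby(["campus", "program"])["satisfaction_score"]
        .agg(mean_score="mean", respondents="count")
        .reset_index()
    )
    print(disaggregated.to_string(index=False))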
TSTC also relies on trend analysis and the data and information that the institution reports to
national, state, regional, System, and local entities. TSTC is required to submit a variety of
mandated reports to several external oversight agencies. Additionally, through trend analysis
reported from TSTC data depositories, all constituents are able to track and evaluate the impact
on student enrollment, graduation, persistence, and other key trends, and respond accordingly
in the planning processes.
Texas State Technical College provides requested data, submits regulatory and accreditation
reports and other information on an ongoing basis depending upon the requirements of each
agency or organization. The disseminated information is collected according to guidelines which
accompany each request. Reports and surveys can be identified as compliance reports and
surveys, or as courtesy reports and surveys. Compliance reports are those required by state
and federal or other regulatory agencies. Courtesy reports and surveys are those requested by
non-governmental organizations. The reports and surveys routinely generated by
TSTC in response to requests from state and federal agencies and other organizations are listed in
the following tables:
Table 1: External and Internal Surveys
Type | Tool | Timeframe
Student Satisfaction | Noel-Levitz Student Satisfaction Inventory (SSI) | Odd fiscal years: spring collection, summer results
Student Engagement | Community College Survey of Student Engagement (CCSSE) | Even fiscal years: spring collection, summer results
Employee Satisfaction and Engagement | Survey of Employee Engagement (SEE) | Annual: fall collection, spring results
Graduate Exit/Employment Survey | CSO Outcomes Survey | Annual: fall collection, spring results
Student Evaluation of Courses | End of Course Surveys | Every term
Entering Student Survey | Status: TBD | Every fall and spring semester
Table 2: Institutional Off-Campus Surveys (Requests for Information)
Survey Name | Dates Submitted
ACT Institutional Data Questionnaire | March, annually
Comprehensive Annual Financial Report | November, annually
The College Board Survey | October-December, annually
Peterson’s Annual Survey of Undergraduate Institutions | February, annually
Peterson’s Interim Expenses Update | July, annually
Table 3: Planning Report
Report Type | Dates Submitted
TSTC OPM | Annually
Table 4: State Reports
Report Type | Dates Submitted
IPEDS Institutional Characteristics; Completions; 12-Month Enrollment | October, Annually
THECB Accountability System | December, Annually
Closing the Gaps Report | December, Annually
SACS Fall Annual Profile | November/January, Annually
SACS Financial Profile | June/July, Annually
IPEDS Human Resources; Financial Aid | February, Annually
IPEDS Fall Enrollment; Graduation Rates; 200% Graduation Rates; Financials | April, Annually
Performance Measures Report | April & November, Annually
Assist Resource Development with Perkins Data | June/July, Annually
Courtesy Reports and Surveys
When the information is to be provided as a courtesy rather than for compliance, it is
necessary to determine whether the information should be released. This is generally done
based on the liability associated with releasing the information, such as potential statute violations.
Other factors are the ease of completion and the time frame involved.
VII. STRATEGIC PLANNING AND ASSESSMENT
Strategic Planning
Strategic plans guide and articulate expectations regarding the role of TSTC, as expressed
through its Expanded Statement of Purpose, and the college’s vision of how it should
develop while maintaining its effort toward excellence and effectiveness. The TSTC Strategic
Plan (Appendix D) provides the general direction the institution will follow in the immediate and
mid-term future.
The Strategic Planning Process Model below illustrates the different components of college-wide
planning and evaluation. Strategic level planning includes administration, faculty, staff and
student representation through the Executive Management Council and the Institutional
Effectiveness Monitoring Committee. The Executive Management Council provides leadership in
developing the strategic goals and objectives.
Figure 4: TSTC’s Strategic Planning Processes
The TSTC Strategic Plan outcomes are evaluated annually, particularly in even-numbered
years prior to a legislative session, updated as needed, and thoroughly reviewed and re-planned
every four (4) years. Strategic planning integrates institution-wide, research-based
planning with a variety of other institutional priorities, including:
• External mandates (including THECB requirements, sponsored program and funded grant goals and objectives, SACSCOC Principles, etc.);
• The internal vision of leaders and constituents about the destination they hope to arrive at by the end of the timeframe covered in the Strategic Plan;
• Environmental changes and technical advances that impact college operations;
• Weaknesses identified as “unmet outcomes” in the TSTC OPM Report or other program, services or activities assessments; and
• Goals and objectives of plans that support and directly feed into the TSTC Strategic Plan.
Taken together, the issues listed above set the tone and direction of the TSTC Strategic
Plan. Items included in the Strategic Plan are viewed as priority items and issues that the college
will address during the timeframes specified.
College administrators are responsible for implementation of the Strategic Plan. Assignment of
Strategic Plan objectives occurs at the onset of the plan and is essential to carry out the priorities
described in the Strategic Plan to the level of implementation, identification of resources, and
evaluation of the activity. Another term to define the implementation of the strategic plan is
operational planning. Operational planning is where budgeting begins in the institutional
effectiveness process. Essentially, operational planning moves the college directly toward the
destination it hopes to reach by the end of the Strategic Plan cycle.
• Responsibility for each Plan objective is assigned to an EMC member or other key administrator;
• Each member is responsible for developing and reporting a specific plan of action for each objective within the Plan that assigns personnel and resources, sets measurable outcomes, and determines the units needed to contribute toward the accomplishment of specific strategic plan intended outcomes;
• Outcomes are reported and tracked in the college-designated tracking system; and
• Executive Management Council forecasts institutional funding from the Texas legislature for the upcoming biennium and subsequent biennia and sets timelines for the accomplishment of objectives based on anticipated funding to finance the priorities set forth in the Plan within the four-year timeline.
The primary processes of IE pertaining to strategic and operational planning are coordinated at
the EMC level, with administrators shaping the direction and priorities of the institution. The Vice
Chancellors, along with other administrators and key personnel, develop these plans which work
toward achievement of institutional goals, objectives, and visions in the strategic plan.
Administrators are responsible for documenting and tracking their assigned objectives and
developing strategies and activities to ensure accomplishment of each objective within the
prescribed timelines. The Vice Chancellors may also incorporate various divisional initiatives into
their Plans to create a sense of purpose for each division within the overall plans and to establish
an appropriate framework for the incorporation of legal requirements and budgetary matters
into the institutional priorities identified in the Plan. This provides a formal mechanism for
linking statutory requirements and regulations, legislative appropriations requests, and annual
budgets to each divisional or institutional priority for the subsequent biennium.
The Vice Chancellors and Associate Vice Chancellors and other assigned administrators have the
responsibility of ensuring that data for evaluation and assessment of their assigned
objectives/outcomes are adequately maintained and tracked. They coordinate assessment and
evaluation procedures with the IE&R Office to ensure that data that will support achievement of
each measurable objective and expected outcome is produced and collected. The administrator
responsible for each assigned objective is responsible for preparation of an annual report
regarding the progress and/or achievement of each assigned objective. The report serves to
determine the progress of the attainment of each Plan objective, allow for inclusion of new
objectives into the plan, and address barriers that prevent achievement of each objective. At the
conclusion of each review, the EMC determines possible revisions to the official Plan for the
coming year.
When the Plan is reviewed, the Chancellor, Vice Chancellors and the EMC may impose one of the
following courses of action for each of the Plan objectives under review:
• Continue Status Quo of the Objective: The objective continues in its original version and continues through the subsequent year (generally for objectives that require longer than a year to complete or which are not earmarked for funding until the latter part of the plan);
• Revision of Objectives/Outcomes: This revision may reflect the satisfactory achievement of particular objectives and a shift in focus for the coming year;
• Revision of Standards Targeted for Achievement: Standards set for particular objectives or outcomes that are deemed too conservative or too ambitious may be adjusted; and
• Revision of Operational and Functional Procedures: A revision of operational practices and/or managerial expectations may be implemented to ensure that barriers to achievement of priority objectives are eliminated.
Assessment
Assessment is a systematic and ongoing process of gathering and interpreting information to
discover if programs/services are meeting intended outcomes and then using the information to
enhance/improve the programs/services (adapted from Virginia Commonwealth, 2002).
Assessment serves institutional effectiveness through its four main purposes:
1. Improves programs/services by providing evidence that informs change.
2. Informs students, faculty, staff and other stakeholders of the state of an institution, its
programs/services and its impact.
3. Validates that an institution and its programs/services are accomplishing what they claim to be
accomplishing, through a demonstration of assessment results.
4. Supports campus-decision making processes, strategic planning, program review and
additional accountability activities such as SACSCOC reaffirmation and re-accreditation of
educational programs by professional accrediting bodies.
The assessment process focuses on the outcomes, rather than the outputs, of work processes,
because outcome data is generally
superior in providing information that can be used to improve programs and services. For
instance, simply knowing how many students are processed through Financial Aid says little
about Financial Aid processes.
Assessment is an important part of strategic planning at the college, functional area and
program/unit level. Strategic planning ensures that resources, decision-making, and activities are
aligned with common goals of the institution. A department or unit/program continuously
evaluates whether or not it is having the desired impact through outcomes assessment. When
actual outcomes are measured, the results analyzed, and actions are taken to improve
performance, the department or unit/program engages in meaningful and evidence-based
strategic planning.
The process through which all college planning and assessment occurs is a traditional “closed-loop” system. It is a cyclical approach that involves constituent groups, incorporates needs
assessment, establishes operational and/or program level student learning outcomes, assesses
results, recommends strategies for improvement, and tracks follow-up activities. Results from
the various assessment efforts are used to guide future decisions, including allocation of
resources and continuation of strategies/initiatives. The process is illustrated in Figure 5.
Figure 5: TSTC Planning and Assessment Cycle
Practical implementation of the TSTC assessment process occurs at two levels.
A. Institutional Level
• Planning, review, assessment, and improvement activities address effectiveness at the macro level using college-wide aggregated data.
• Recommendations reflect a college-wide continuous improvement perspective.
• The Mission, Expanded Statement of Purpose, Strategic Plan, OPM Report, and institution-wide survey results such as the CCSSE, Noel-Levitz, or SEE inform institutional effectiveness.
• Assessment is conducted annually or according to scheduled timelines (i.e., surveys) and reported college-wide.
B. Functional Operational and Unit Action Plans
• Planning, review, assessment, and improvement activities address the smaller, focused organizational units within the college (e.g. functional areas/departments, educational programs, and student support services) using a combination of aggregated and more specific data.
• Recommendations reflect efforts to enhance program/department effectiveness and/or efficiency.
• Functional Operational Plan assessments are conducted annually and reported college-wide to various institutional stakeholders. These are operational statements derived from a functional area’s core functions that describe the desired quality of key services within an administrative functional area and define exactly what the services should promote (modified from Selim et al., 2005, p. 19).
• Unit Action Plans are assessed annually and improvements (use of results) reported biennially. Respective directors, program chairs and other departmental leaders are expected to report findings to their respective administrative functional leads as well as college-wide for sharing of best practices. Unit action plans are expected of both non-instructional units and instructional programs. In addition to completing UAPs, educational programs assess and report program-level student learning outcomes. Assessment plans, results and improvements are reported in the college-designated tracking system.
• Comprehensive program reviews for educational programs are conducted on a five-year cycle.
Refer to Appendix E for the biennial assessment timeline of activities.
Related SACSCOC Standards
The Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) is the
regional body for the accreditation of degree-granting higher education institutions in the
Southern states. It serves as the common denominator of shared values and practices among the
diverse institutions in its region that award associate, baccalaureate, master’s, or doctoral
degrees. To gain or maintain accreditation with the Commission on Colleges, an institution of
higher education must comply with standards contained in the Principles of Accreditation:
Foundations for Quality Enhancement and with the policies and procedures of the Commission
on Colleges. As of 2012, SACSCOC’s most recent edition of the Principles of Accreditation
continues the emphasis on continuous improvement and institutional effectiveness. The
following are excerpts from this edition that illustrate the emphasis SACSCOC places on
assessment:
Core Requirement 2.5, Institutional Effectiveness
The institution engages in ongoing, integrated, and institution-wide research-based
planning and evaluation processes that (1) incorporate a systematic review of institutional
mission, goals, and outcomes; (2) result in continuing improvement in institutional quality;
and (3) demonstrate the institution is effectively accomplishing its mission (p. 16).
Comprehensive Standard 3.3 Institutional Effectiveness
3.3.1 The institution identifies expected outcomes, assesses the extent to which it achieves these
outcomes, and provides evidence of improvement based on analysis of the results in each of the
following areas:
3.3.1.1 educational programs, to include student learning outcomes
3.3.1.2 administrative support services
3.3.1.3 academic and student support services
3.3.1.4 research within its educational mission, if appropriate
3.3.1.5 community/public service within its educational mission, if appropriate (p. 48)
Comprehensive Standard 3.4.4, Acceptance of Academic Credit
The institution has a defined and published policy that includes criteria for evaluating,
awarding, and accepting credit for transfer, experiential learning, credit by examination,
advanced placement, and professional certificates that is consistent with its mission and
ensures that course work and learning outcomes are at the collegiate level and
comparable to the institution’s own degree programs. The institution assumes
responsibility for the academic quality of any coursework or credit recorded on the
institution’s transcript (p. 57).
Comprehensive Standard 3.5.1, College-level Competencies
The institution identifies college-level general education competencies and the extent to
which graduates have attained them (p. 27).
Integrated Planning and Budgeting
Texas State Technical College has adopted an integrated planning and budgeting process.
Effective budget planning is dependent upon thoughtful and thorough program planning. In
essence, the strategic plan, with its institutional goals and objectives, drives the planning
conducted by the programs. In turn, the unit action plans provide the framework for the
distribution of available resources during the budget planning process.
Budgeting is an essential element of both the Plan and the annual assessment plans. Annual
budget forecasts for each unit ensure that administration is aware of funding needed at each
level of the institution to determine identified priorities from plans, assessment and evaluation
results, and new mandates. By utilizing this process, administrators ensure that all strategic
plan objectives and unit priorities have funds earmarked for immediate issues and activities,
thus providing the foundation for attainment of all plans.
28
Figure 6: TSTC Budget Process
[Flowchart elements: Legislative Appropriations Requests; Student Placement Funding for TSTC; Strategic Plan, Functional Op Plan & UAPs; TSTC VC Functional/Divisional Budget Requests/Needs; CFO/Budget Office; TSTC Chancellor/CEO; Executive Management Council; TSTC Location Funding; TSTC Board of Regents.]
29
VIII. FUNCTIONAL OPERATIONAL PLANS AND UNIT ACTION PLANS
TSTC identifies expected outcomes for educational programs and administrative and educational
support services and assesses the extent to which it achieves these outcomes through Functional
Operational Plans and Unit Action Plans. In addition to these plans, TSTC uses internal reviews (such as comprehensive self-studies and program reviews), standing committee recommendations, survey results, audits, and other mechanisms to identify areas for improvement; as a result, the college is well established in assessing the outcomes of its programs, services, and activities.
Components of functional operating plan and unit action plan assessment reports include the
following elements:
 Department/unit intended outcomes (administrative and student learning, when applicable)
 Strategies to achieve intended outcomes
 Methodology for evaluation (including measures and targets)
 Budget requests (if applicable)
 Year 1 results and planned improvements (improvement actions)
 Year 2 follow-up (results of implemented improvements).
Functional Operational Plans
Functional Operational Plan assessments are conducted annually and reported college-wide to
various institutional stakeholders. These are operational assessments derived
from a functional area’s core function(s) that describe the desired quality of key services within an
administrative division or functional area and define exactly what the services should promote.
Functional leads must designate the corresponding Key Performance Indicators and Strategic
Initiatives from the strategic plans in order to maintain alignment between the functional
operating plan and the institution’s strategic plan.
Functional Operational Plans adhere to the same process required of the outcomes assessment
process. The first step in developing intended outcome statements is to determine what your
functional area needs to know about its programs and services. When writing outcomes, keep in
mind that they should be detailed, specific, measurable or identifiable, and personally
meaningful statements that are derived from the goals and articulate what the end result
should look like. Guiding questions for the operational plan outcome statements include:
 What types of things is your functional area striving for?
 In what direction do you want your functional area to move?
 What would you like to accomplish during the upcoming academic year, and why? In terms of intended outcomes, what would the “perfect” functional area look like?
 Will data be obtained that can be used to improve programs and services?
Appendix G contains information on developing assessment plans for administrative
outcomes – which are both operational in their design and assessment methodology.
Unit Action Plans
Unit Action Plans (UAPs) facilitate planning at the program/unit level. Unit Action Plans include
program/unit level assessment for educational programs, educational support units, and
administrative support units. At TSTC, UAPs include reporting for both student learning
outcomes (required of educational programs only) and administrative outcomes (required of
educational programs and non-educational units).
Outcome assessments are a reflection of unit priorities and provide focus for upcoming
assessment activities. Student learning outcomes and administrative outcomes are distinguished
through definition:
Student Learning Outcomes (SLOs) are operational statements of demonstrable
knowledge or skill that students will possess upon completion of a program/course.
Outcomes statements are specific and derived from program-level learning competencies.
Administrative Outcomes are operational and specific statements derived from a unit’s
core functions that describe the desired quality of key services within an administrative
unit and define exactly what the services should promote (modified from Selim et al.,
2005b, p. 19).
The UAP cycle occurs over a two-year period; however, all programs/units are expected to
report progress toward their outcome objectives annually.
Similar to the functional operational plans, department chairs and directors must also designate
the corresponding Key Performance Indicators and Strategic Initiatives from the strategic plans in
order to maintain alignment between the UAP and the institution’s strategic plan.
Assessment plans for all educational programs and administrative and educational support
services are reported as Unit Action Plans and generally fit into one of the following five
categories:
1. Technical programs/skills outcomes assessment (student learning) for all AS, AAS and
Certificate of Completion programs of study, Continuing Education, and Contract
Training programs and courses address the specific skills and outcomes expected from
graduates of each program or completers of each non-credit course. These skill and
knowledge outcomes must be demonstrated by students.
2. General Education outcomes assessment is slightly more centralized in that all general
education faculty assess student achievement of overall general education competencies
in their courses while the final outcome, technology application, is assessed in the
student success course that technical program students graduating from TSTC are
required to take. Definitions of each general education competency as well as the
standards and the rubric used to assess each intended student learning outcome in
general education are profiled in the TSTC General Education Assessment Plan.
3. Student Development Assessment Plans are designed to complement other methods
of assessing and evaluating the effectiveness of the college’s student development
activities outside of the classroom.
4. Instructional Support units work together to enhance the direct relationships between
faculty and students. Instructional support units generally rely on feedback from
constituents regarding their performance in supporting the learning environment at TSTC.
5. Administrative Assessment Plans base their evaluation on results of feedback from
those constituents they support including students, employees, vendors, employers, etc.
Constituents are generally given the opportunity to evaluate each operational or
administrative area at least once every four years. All data received from these
evaluations is analyzed and other relevant data and information is reviewed, with
results used to develop improvement plans and activities to “close the loop” for
continuous quality improvement. These improvement plans are integrated into
subsequent functional operational and unit action plans for these departments.
UAPs for Student Learning Outcomes
Assessment of student learning outcomes (SLOs) is a learner-centered process ensuring that
students are learning what we intend for them to learn. It does not evaluate individual student
performance at the course level, nor does it focus on individual faculty or staff performance. SLO
assessment at TSTC is conducted and reported at the program level – meaning plans must be
designed to measure what students are learning as a result of study in the program, such as the
following:
 Students will be able to execute a qualitative research study.
 Students will demonstrate proficiency in analyzing statistical control data.
 Students will be able to diagnose system failures.
 Students will be able to design solutions based on customer requirements.
Student learning assessment helps us answer the following key questions about student learning:
 What should students (i.e., graduates) know, be able to do, and value?
 Have the graduates of our institution acquired this learning?
 What are the contributions of the programs to student growth?
 How can student learning be improved?
Appendix F contains information on developing assessment plans for Student Learning
Outcome(s), including guidance for developing outcome statements, curriculum mapping,
methodologies and measures, and implementation.
UAPs for Administrative Outcomes
Administrative plans are designed to do two things: 1) to support the institution, its
mission and vision, and/or 2) to promote continuous improvement within each TSTC
department/unit based on identified weaknesses or improvement areas. Supervisors are asked
by administrators to include unit outcomes that demonstrate and document the
department’s role in the fulfillment of a larger institutional or college-wide goal from the
Strategic Plan. All departments, instructional and non-instructional, are required to complete
administrative assessment plans.
1. Using Data and Assessment to Select Issues for Improvement: Several sources of data and
information are available that demonstrate various measures of performance by the
college, provide student feedback and progress, assess student learning and development,
measure institutional standards, and measure internal efficiency.
Unit Supervisors:
• Review and analyze sources of information that have been collected in the past year
and identify areas that provide information specific to each department;
• Extract from all information sources the top few issues that seem to emerge and where
unit or departmental efforts can have a real impact;
• Prioritize the issues identified, eliminating any that are not specific enough to the
department or that cannot be affected through any action or initiative undertaken on the
institution’s part. Remaining issues are prioritized based on whether there are currently
enough resources to address them and the degree to which each issue impacts student
success.
2. Compliance Requirements: The Texas Higher Education Coordinating Board (THECB), SACSCOC,
TSTC System and TSTC policies and procedures, and federal, regional, or other oversight
agencies, often require several things from a public institution. If a unit does not fulfill a
certain requirement that an external agency or organization expects, this requirement is
written into an annual assessment goal. This may include areas of non-compliance for
standards contained in internal department reviews as well.
Other Considerations for Administrative Outcome Development: If no information is received
from administrative leadership that a unit must develop an outcome to support a Strategic Plan
objective, unit supervisors may consider other issues on which to base administrative outcomes.
Included in this category are committee recommendations and professional growth of
departmental staff. If a program advisory committee, a TSTC standing committee, or other
recognized group submits a recommendation or suggestion for departmental improvement and
the responsible Associate Vice Chancellor or Functional Division Lead is in agreement with the
committee’s finding, an administrative outcome is created that responds to the issue raised by
the committee.
Additionally, if departmental activities are critical to improving performance or efficiency, a
supervisor may use this as a basis for an administrative outcome. Programs may plan new
offerings or exit points within their curriculum/degree plans and request the purchase of items
that advisory committees have deemed essential to the whole education of the student in his or
her technology. Before an administrative outcome is approved, it must demonstrate that it (1)
supports one or more of the KPIs; and, (2) identifies the source of data or information used
to “close the loop” on performance.
Appendix G contains information on developing Administrative Outcome(s) assessment plans.
Program Reviews
Instructional programs are reviewed at least once every five years through the Program
Assessment and Improvement Review (PAIR) peer review process and are documented and
reviewed as a part of the annual assessment plans for those departments that were reviewed in
the reporting year. Measures that are not met are developed into improvement plans that are
included in the following cycle of program assessment plans.
The program review process ensures educational programs comply with all external and
internal mandates and performance standards for their operations. These reviews identify
program strengths and weaknesses, as well as ensure that the program is supporting the
strategic and operational vision of the college and achieving its unit purpose statement.
34
IX. ASSESSMENT CYCLE AND CLOSING THE LOOP
Assessment at Texas State Technical College is an ongoing process of continuous evaluation and
improvement. A full cycle of assessment is conducted biennially through a multi-phase, multi-element process; however, programs/units are expected to report progress toward stated
outcomes annually. Figure 8 illustrates a full assessment cycle for measuring student learning
over three primary phases: Planning; Implementation, Analysis & Reporting; and
Improvement. The following outlines the general components of this process for guiding
development and implementation:
Figure 8: Assessment Cycle
[Circular diagram: Continuous Improvement through Outcomes Assessment – Phase I: Planning; Phase II: Implementation, Analysis & Reporting; Phase III: Improvements.]
35
Phase I: Planning: Planning is the first phase of assessment and its importance is not to be
underestimated as it directs a unit’s assessment activities for the following academic year. It is
advisable to organize and prepare your planning tools prior to devising your respective unit’s
assessment plan. The following are recommended steps in preparing for assessment:
Program/unit assessment plans are devised and include the following elements:
 Statement of mission or purpose
 Intended outcomes (operational, SLOs, and administrative outcomes) and related achievement targets
 Curriculum map: includes alignment of objectives (SLOs) to related program-level student learning outcomes. The resulting grid allows faculty to indicate which courses in the curriculum support the achievement of specific outcomes as well as where to collect data (see the illustrative sketch after this list).
 Methodology (action steps) to achieve each intended outcome and target, including timeline and responsible individuals.
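As a purely illustrative sketch (the course numbers, outcome labels, and markers below are invented, not drawn from any TSTC program), a curriculum map can be represented as a simple grid in which each cell marks whether a course supports a program-level SLO and whether assessment data are collected there:

    # Hypothetical curriculum map: rows are courses, columns are program-level SLOs.
    # "X" = course supports the outcome; "X*" = outcome supported and assessment data collected here.
    curriculum_map = {
        "TECH 1301": {"SLO 1": "X",  "SLO 2": "",   "SLO 3": ""},
        "TECH 2310": {"SLO 1": "X*", "SLO 2": "X",  "SLO 3": ""},
        "TECH 2380": {"SLO 1": "",   "SLO 2": "X*", "SLO 3": "X*"},
    }

    # For each outcome, list the courses where assessment data will be collected.
    for slo in ("SLO 1", "SLO 2", "SLO 3"):
        collection_points = [course for course, row in curriculum_map.items() if row[slo] == "X*"]
        print(slo, "data collected in:", collection_points)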
Phase II: Implementation, Analysis & Reporting: During this phase, planning should already be
complete and the action steps designed to achieve the intended outcomes are implemented.
Assessment data is collected, analyzed, reviewed and reported. Proposals for improvement are
developed and reported.
Phase III: Improvement: Proposed changes are implemented and reported. This part of the
process completes the planning and evaluation cycle by looking back over the change proposals
of the previous cycle and listing improvements that were actually implemented. The report
generated for this phase of the process serves as documentation that the College is “closing the
loop” by using assessment results for improvement.
Reporting and Assessment Review
Assessment activities are documented in the college-designated tracking system on a
scheduled basis. Standardized report templates are embedded in the system to document
assessment information.
Submission deadlines are documented on an assessment calendar (see Appendix C); however,
timeframes for submittal of the reports may vary from cycle to cycle. Program Chairs, Functional
Leads, and Associate Vice Chancellors will receive copies of the assessment plans and results
reports for each unit under their supervision. Together with assessment standing committees,
they will be charged with conducting a formal assessment review to evaluate the overall quality
of assessment plans and reports. A rubric has been devised to measure the quality of the
assessment planning, results, and improvement documentation (see Appendix D). Units are
encouraged to use this as a guide when preparing assessment plans and results reports.
37
APPENDICES TO THE INSTITUTIONAL EFFECTIVENESS HANDBOOK
Page
Appendix A: Statewide Operating Standard for Institutional Effectiveness and
Institutional Research (IE/IR) ..................................................................................... 39
Attachment “A”: Mission Statement for the Institutional Effectiveness Committee ............... 44
Attachment “B”: Preliminary List of Institutional Effectiveness Units ................... 45
Attachment “C”: Institutional Effectiveness:
Institutional Effectiveness Cycle ................................................................... 47
Institutional Effectiveness Strategic Planning Model ..................................... 48
Institutional Effectiveness Model ................................................................. 49
Attachment “D”: TSTC Strategic Goals for 2016-2020 ........................................... 50
Attachment “E”: Sample TSTC Success Factors and Performance Indicators ......... 54
Appendix B: TSTC Executive Management Council .................................................... 55
Appendix C: TSTC Key Performance Indicators .......................................................... 56
Appendix D: TSTC Strategic Plan: 2016-2020 ............................................................. 62
Appendix E: Assessment Timeline of Activities ......................................................... 92
Appendix F: Developing Student Learning Outcome(s) Assessment Plans ................. 94
Appendix G: UAPs for Administrative Outcome(s) Assessment Plans .......................105
Appendix H: Assessment Rubric Sample .................................................................. 108
38
APPENDIX A
TEXAS STATE TECHNICAL COLLEGE
STATEWIDE OPERATING STANDARD
No. GA.1.22
Page 1 of 16
DIVISION:
General Administration
SUBJECT:
Institutional Effectiveness
AUTHORITY:
Executive Action 04-15
PROPOSED BY:
Original Signed by J. Gary Hendricks
TITLE:
Vice Chancellor and Chief Business
Intelligence Officer
RECOMMENDED BY:
Original Signed by J. Gary Hendricks
TITLE:
Vice Chancellor and Chief Business
Intelligence Officer
APPROVED BY:
Original Signed by Mike Reeser
TITLE:
Chancellor
STATUS:
Effective Date: 03/25/15
Date: 03/25/15
Date: 03/25/15
Date: 03/25/15
Approved by BOR Executive Committee 03/25/15
HISTORICAL STATUS:
Approved by Executive Management Council 03/19/15
Proposed 03/16/15
POLICY:
It is the policy of Texas State Technical College to maintain and conduct a broad-based,
comprehensive system of educational and operational planning, research, assessment, review and
revision designed to continuously improve the quality of the instruction and services the college
provides in accordance with its established mission.
PERTINENT INFORMATION:
TSTC is committed to continuous quality improvement. Such improvement is made possible
through thorough and systematic research, planning, and analysis of activities and the impact
those activities have on desired institutional outcomes. It is incumbent upon TSTC, as an
institution of higher education and as a state agency, to be responsible and accountable for its
actions. A comprehensive institutional effectiveness system is the vehicle for helping ensure that
TSTC is accountable. In addition, the Southern Association of Colleges and Schools
Commission on Colleges (SACSCOC) requires that accredited institutions have such a system in
place. Furthermore, state and federal agencies, which fund, monitor, and guide many of the
activities of TSTC, require a system that will demonstrate that the institution is effective and
efficient in carrying out its mission.
A. Definitions:
1. Mission Statement for the Institutional Effectiveness Committee (see Attachment "A”).
2. Institutional Effectiveness Units (IE units) - The functional entities of the college for
which goals/intended outcomes are planned and assessed. (see Attachment "B”)
3. Institutional Effectiveness Cycle (IE Cycle) -The process flow and time-frame
established for conducting all IE activities. (see Attachment "C”)
4. Institutional Research (IR) - The analysis of various data gathered for use in planning and
analysis of the effectiveness of college activities.
5. Unit Action Plans (UAPs) - The plans developed by the IE units to guide their efforts for
the IE cycle.
6. IE Unit Reviews -The evaluation of demographic, budget, student, and facilities data
completed by Instructional and Administrative IE units. This review process includes
staff and faculty input to provide feedback for determining goals/intended outcomes set
in the Unit Action Plans. In turn this aligns as justifications for budget request
submissions.
Additionally, the instructional program reviews include Student Learning Outcome
(SLO) data. This data provides the program faculty the opportunity to evaluate
information needed to improve Student Learning.
7. Goals/Intended Outcomes- The specific purposes and outcome targets set by each IE unit
for the coming fiscal year. Wherever possible, these should include student learning
outcomes.
8. Assessment/Evaluation Criteria and Procedures- The methods and instruments by which
the levels of achievement of goals/intended outcomes are tracked and measured.
9. IE Report-A document produced by the Office of Institutional Effectiveness (IE)
containing a comprehensive reporting and analysis of IE achievements, including
narrative assessments provided by each IE Unit's contact person.
10. IE Unit Manager- The designated individual who has overall responsibility for the
performance and results of the unit or department.
11. IE Unit Contact Person - The person designated by the members of an IE unit to enter the
unit's Unit Action Plan (UAP) in the IE database and serve as liaison between the IE unit
and the IE committee.
40
OPERATING REQUIREMENTS:
A. Institutional Effectiveness (IE) Committee
1. The Institutional Effectiveness Committee will be a standing committee of at least nine
members comprising a representative cross section of TSTC including broad geographic
representation within TSTC.
a. Each duly appointed member of the Committee may designate a substitute from their
department/campus to temporarily serve in their stead if professional or personal
circumstances necessitate such a proxy.
b. Any duly appointed member that is absent from two or more Committee meetings
shall have their appointment to the committee declared vacant.
2. A representative from the Office of Institutional Effectiveness & Institutional Research
(IE/IR) will be a permanent member of the committee.
3. Regular and exceptional vacancies on the IE Committee will be filled by Executive
Management Council appointment as appropriate.
4. Each new member of the IE Committee will receive orientation into the IE process and
appropriate training from the Office of Institutional Effectiveness (IE) prior to the first
Committee meeting after appointment. The format for the orientation will be a
combination of face-to-face and online.
5. The IE Committee will meet at least once each semester.
B. Organization
1. Programs and departments within the TSTC organizational structure with similar/like
functions will be formed into functional units for IE planning and assessment purposes.
(see Attachment "B")
2. The IE Committee, with input as needed from Vice Chancellors, managers, programs and
departments of the college, and with final review by the TSTC Executive Management
Council, will be responsible for determining the number and configuration of the IE
units.
3. Each IE unit will designate an IE unit contact person who will be the primary steward for,
and have primary responsibility for documenting and communicating that unit's Unit
Action Plan (UAP) and data. Contact information for this person will be communicated
to the Chair of the IE Committee as soon as this person has been designated. IE activities
will be coordinated through IE Directors at the field level.
a. New IE unit contact persons will receive orientation into the IE process and
appropriate training from the Office of Institutional Effectiveness (IE) as soon after
their designation as is practical and/or necessary. The format for the orientation
will be a combination of face-to-face and online.
b. The IE unit contact person for each IE unit will be responsible for entering the
unit's UAP into the web-based system designed to track goals, strategies, outcomes,
evaluation, and "closing the loop" activities. The IE unit contact person will be
responsible for ensuring that the actions and data are being adequately tracked. In
addition, the IE unit contact person will be responsible for working with their
department personnel and the Office of Institutional Effectiveness (IE) to develop
new assessment procedures and/or instruments as needed or appropriate.
c. "Closing the Loop": The IE unit contact person or designee for each IE unit will
be responsible for the preparation of a brief analysis of the unit's performance
related to its IE plan during the preceding IE cycle to be included in the IE Report.
C. Institutional Effectiveness Cycle (see Attachment "C'')
1. On receipt of the Expanded Statement of Institutional Purpose and Strategic Goals
(see Attachment "D'') from the Strategic Planning Review Committee, or annually at
a minimum, members of each IE unit will meet to establish an action plan for the
coming cycle.
2. Each IE Unit UAP should consist of a manageable number of goals/intended
outcomes that will focus on the unit's operations and on specific relevant factors
deemed critical to the successful performance of that unit's function for the cycle.
(see Attachment "E'')
3. The IE cycle will be established so that it provides complete, relevant and timely
information for the budget planning process.
4. Data for analysis to determine the level of IE goal/intended outcome achievement
will be collected by the Office of Institutional Research (IR) at the end of the time
period designated for measuring the result of the UAPs.
5. The Office of Institutional Effectiveness (IE) will collect from each IE unit contact
person a brief analysis of that unit's outcomes and achievements during the preceding
IE cycle.
6. The TSTC Executive Management Council will periodically review and update
the Expanded Statement of Institutional Purpose and will submit any revisions to the
TSTC Board of Regents for approval.
7. Upon approval of the revised Expanded Statement of Institutional Purpose by the
TSTC Board of Regents, members of the Executive Management Council will join
other appointed members of the TSTC community to develop and/or revise the
college's Strategic Goals where appropriate.
42
8. The Vice Chancellor and Chief Business Intelligence Officer of TSTC
and the designated IE unit contact person(s), or their designee, will present any
revisions to the Expanded Statement of Institutional Purpose and Strategic Goals
to the TSTC community at-large as soon as is possible after the review/revision.
Each IE unit Manager, or designee, will review with their IE Unit the unit's results, its
goals/intended outcomes from the previous IE cycle, and assist in making revisions,
and/or developing new goals/intended outcomes for the upcoming cycle, taking into
account any changes in the Expanded Statement of Institutional Purpose and Strategic
Goals. The unit's new and/or revised UAP will be approved by the appropriate IE
Unit Manager and then entered into the IE database on the TSTC website by the IE
unit contact person.
9. The Office of Institutional Research (IR) will compile a report of the UAPs from the
web database into a comprehensive IE plan and subsequent outcomes report for the
college for the year.
PERFORMANCE STANDARDS:
1. All employees participate in the Institutional Effectiveness process.
2. The Institutional Effectiveness Committee provides appropriate oversight of the I.E.
process.
3. Field offices of Institutional Effectiveness provide support and training services for the
I.E. process across the campuses.
43
Attachment "A”
Mission Statement for the Institutional Effectiveness Committee
The mission of the Institutional Effectiveness Committee at Texas State Technical
College is to facilitate and monitor planning and evaluation processes that support the mission,
the vision, and the values of the College.
Core activities and/or responsibilities of the Committee include:
1) Monitoring planning and evaluation processes for:
a. academic and technical programs and educational support units;
b. the College Strategic Plan, Functional Operating Plans and Unit Action Plans; and,
c. institutional data collection and reporting.
2) Consulting with faculty, staff and leadership regarding institutional effectiveness
matters.
3) Facilitating communications of progress toward institutional goals and objectives as
they relate to institutional effectiveness; and,
4) Overseeing activities related to program-specific and regional institutional
accreditation.
44
Attachment "B"
Preliminary List of Institutional Effectiveness Units
Air Traffic Control
Aircraft Dispatch
Aircraft Pilot Training
Aviation Admin
Aviation Maintenance
Avionics
Allied Health
Academic Science
ADN
Chemical Dependency Counseling
Culinary Arts
Dental Assistant
Dental Hygiene
Dental Lab Tech
Division
EMT
Food Service
Health Info Tech
Health Information Technology
Licensed Drug Counselor
Assistant Nursing
Medical Assistant
Medical Info Specialist
Registered Nursing
Surgical Technology
Vocational Nursing
Computer/ Graphics Computer
Fundamentals Computer
Maintenance Computer
Networking /Security
Computer Networking & Systems Admin
Computer Science
Digital Media
High Performance Computing
Information & Communication Tech Core
Network Security
Networking
Visual Communication & Design
Web Design & Development Technology
Computer/ Info Systems
Business Management
Business Office Computer
Maintenance
Computer Networking & Security
Computer Science
Database & Web Programming
Digital Arts
Student Learning Cont.
Digital Media Design
Business Intelligence
Finance
Admin (Finance)
Auxiliary Services
Financial Accounting
Financial Analysis
Financial Services
Governance Risk & Compliance
Logistics
Office of Facilities & Planning
Forecasting
Office of Facilities & Planning
Human and Organizational Development
Information Technology
Admin
Admin (IT)
App Development
Income Producing
Infrastructure
Maintenance
Risk Management
Shared
Solutions Mgmt.
Support Ops
Marketing
Advancement Ops
Career Services
Communications
Field Development
Marketing
Recruitment
Office of the CEO
External Relations
General Counsel
Internal Audit
Operations
Admin (Operations)
College Readiness
Innovation
Office of Sponsored Programs
Student Development
Admin
Enrollment Management
Student Life
Student Learning
Academics Admin
Admin
Aerospace
Student Learning Cont.
Industrial/ Manufacturing
Machining Manufacturing Eng.
Comp Sys Network
Computer Int. Mfg. Computer
Maintenance Computer
Science E-Commerce
Information Tech Software
Engineering
Library
Manufacturing & Trans
Diesel Equipment Technology
Drafting & Design
Welding Technology
Red Oak
Services
Culinary Arts
Digital Media Design
Drafting & Design
Environmental Health
Professional Cooking
Welding
Skills USA
Transportation & Services
Auto Collision
Autobody
Automotive Technology
Aviation Maintenance
Culinary Arts
Dental Assistant
Developmental Ed
Diesel Equipment Technology
Environmental Health
Environmental Health & Safety
Golf Course & Landscape Management
Land Surveying
Pilot Training
Turfgrass & Landscape Management
Welding
Welding Technology
Workforce Development
Admin CC
Alvin CDL
CE Contract
EMT
Fire Fighters
ICAR
Indirect
Industrial Training
L3
Robotics
SDF
Truck Driving
Wind Energy
Graphics, Gaming Simulation Programming
Software & Business Mgmt. Accounting
Education & Humanities
Business
Communication & Humanities
Communications Core
Dual Enrollment
Education & Training
Engineering
English
Foreign Language
Freshman Orientation Math
Physics
Social Science
Electrical Op
Biomedical Equipment Technology
Electronics
Electrical Line worker Lineman
Engineering
Biomedical Equipment Technology
Building Construction
Chemical Technology
Civil Engineering
Computer Aided Drafting
Const. Electrical
Developmental Ed
Down Hole Tool
Drafting & Design
Electrical Lineworker
Electrical Power
Electrical Power Line/ Lineworker Program
Electromechanical Technology
Electronics
Electronics Core
Energy Mgmt.
Environmental Technology
High Voltage Electrical Industrial
Maintenance Industrial Process
Operations Industrial Systems
Industrial Training
Instrumentation IPOE
Laser Electro-Optics
Logistics Manufacturing Eng.
Mechanical Engineering
Pipe Fitting
Pipefitters Solar Energy
Telecommunications
Wind Energy
Wind Turbine
EWCHEC (Hutto)
Faculty Activities
Ft. Bend Technical Center
46
Attachment "C"
Institutional Effectiveness Cycle
47
Attachment "C" Cont.
Strategic Planning Model
[Cycle diagram: Mission Review; Strategic Goals and Objectives; Environmental Scan; Performance Evaluation & Adjustment; Performance Assessment.]
Strategic level planning includes administration, faculty, staff and student representation through
the Executive Management Council and the Institutional Effectiveness Committee. The Executive
Management Council provides leadership in developing the strategic goals and objectives.
48
Attachment "C" Cont.
Institutional Effectiveness Model
[Pyramid diagram, from apex to base: Assessment; Unit Action Plans; Functional Operating Plans; TSTC’s Strategic Plan; Mission, Vision, and Values.]
In addition to the Strategic Planning Process Model, which illustrates the different components
of college-wide planning and evaluation, there is also an Institutional Effectiveness Model,
which depicts the various levels of planning and assessment through the college’s divisions and
departments. At the bottom of the pyramid are the mission, vision and values (foundation) that
guide all planning processes above. Planning and evaluation processes are comprised of the
Strategic Plan, Functional Operating Plans and Unit Action Plans. Other organizational
assessments and surveys support the three levels of planning and evaluation with data for
decision-making:
• Accountability Report of Texas Higher Education Coordinating Board (ART)
• Alumni Employment Survey (AES)
• Community College Survey of Student Engagement (CCSSE)
• Employee Assessment of College Environment Survey (ACE)
• Student Environment Survey (SES)
• Internal Operational Reports
49
Attachment "D"
TSTC Strategic Goals
TSTC’s RALLYING CRY: “PLACE MORE TEXANS”
Goal 1: Make TSTC the Perpetually-Relevant, Innovative, Go-To, State-Wide Technical
Education Source for Texas
Objective 1.1: Enhance the delivery of student related products and services
Strategies:
1.1.1 Expansion. Stand up and expand start-up operations at TSTC’s newest
locations.
Performance Indicator(s): (form & content to be developed)
1.1.1.1 – Hutto location annual assessment report
1.1.1.2 – Red Oak location annual assessment report
1.1.1.3 – Ft. Bend county location annual assessment report
1.1.2 Skill Mastery. Provide students with learning experiences required to
achieve relevant levels of technical skill mastery leading to successful
employment or advancement.
Performance Indicator(s): (form & content to be developed)
1.1.2.1 – Competency-based instruction annual assessment report
1.1.2.2 – Continuous improvement of learning assessment report
1.1.2.3 -- Curricula update annual assessment report
1.1.2.4 – Annual program facilities assessment report
1.1.2.5 – Annual percent of students achieving course-level mastery (to
be defined and methodology of assessment developed)
1.1.2.6 – Annual number of former TSTC students found working in
the Texas economy
1.1.2.7 – Average annual wages of former TSTC students found
working
1.1.3 Innovation. Accelerate innovation in delivery of technical training,
including competency-based learning, and badges to increase speed of
workforce supply to Texas’ industry.
Performance Indicators: (definitions & methodologies TBD)
1.1.3.1 – Percent of programs offering competency-based instruction
options
1.1.3.2 – Percent of programs offering badges
1.1.3.3 – Number of students earning badges annually
1.1.3.4 – AAS Graduate average time to complete
1.1.3.5 – Certificate Graduates average time to complete
1.1.4 Veteran Services. Enhance and improve student support services for
veterans.
Performance Indicators:
1.1.4.1 – Number of staff and annual budget devoted to Veterans
Services
1.1.4.2 – Annual unduplicated number of veterans served
1.1.4.3 – Placement rate for Veterans exiting TSTC
1.1.4.4 – Average annual salaries for placed veterans
1.1.5 The Student Experience. Enrich the student experience through
integrated, accessible and responsive services that enhance learning for
each TSTC student.
Performance Indicators:
1.1.5.1 – Selected items on the Annual Survey of Student Satisfaction
(to be selected)
Objective 1.2:
Broaden the market appeal of TSTC’s Products and Services
Strategies:
1.2.1 Branding. Inspire and activate loyalty to TSTC by its current and
prospective customers.
Performance Measures:
1.2.1.1 – Marketing Engagement Report
1.2.2 Industry Relations. Deepen our relationships with Texas employers to
enrich their supply of job-ready technicians and ensure their workforce
maintains a technical edge.
Performance Measures:
1.2.2.1 – Number of Texas employers who hire former TSTC students
annually
1.2.2.2 – Industrial Career Day report
1.2.2.3 – Job Star Data report
1.2.2.4 – Equipment Donation report
1.2.3 C4EO. Expand the Common Skills Language project for aligning skill-based language to validate learning outcomes for relevancy and
curriculum alignment.
Performance Measures:
1.2.3.1 – Percent of TSTC programs for which curricula is validated or
updated using the CSL project process annually
1.2.4 Partnerships. Continue working cooperatively with Texas community
colleges and other partners to address Texas industry’s training needs
regardless of their location in the state.
Performance Measures:
1.2.4.1 – Number and list of active partnerships
1.2.4.2 – Number of students served in active partnerships
51
Objective 1.3 Provide essential services and leadership in support of operations
Strategies:
1.3.1 Single Accreditation. Merge the four TSTC college accreditations into
a single, statewide accreditation to provide an efficient structure for
increased instructional capacity, innovation, sustainability, quality,
consistency, and flexibility to respond to the growth opportunities that
lie ahead.
Performance Measures:
1.3.1.1 – SACSCOC votes to approve TSTC’s merger, June 2015
1.3.1.2 – TSTC Compliance report submitted Nov. 2015
1.3.1.3 – SACSCOC Site Visit, end of Jan. 2016
1.3.1.4 – TSTC addresses any outstanding recommendations, Apr. 2016
1.3.1.5 – SACSCOC accepts TSTC’s actions on any recommendations,
June 2016
1.3.2 Optimization. Employ reasonable means to leverage and extend the
entire system’s resources for greater operating efficiency, including
statewide integration of administrative and operational functions.
Performance Measures: (Content and format TBD)
1.3.2.1 – Student Learning annual status report
1.3.2.2 – Student Development annual status report
1.3.2.3 – Finance Division annual status report
1.3.2.4 – Business Intelligence annual status report
1.3.2.5 – Marketing/Development annual status report
1.3.2.6 – Facilities/Auxiliary Services annual status report
1.3.2.7 – IT Services annual status report
1.3.2.8 – Organizational Development annual status report
1.3.2.9 – Technology Enhancement annual status report
1.3.2.10 – OPM annual report
1.3.3 Program Vitality. Further develop the economic model, tools, and
strategies for evaluating programs within the framework of TSTC’s
new funding formula as well as other factors, including current and
future market demand and program cost.
Performance Measures: (Content & format TBD)
1.3.3.1 Annual Status of Educational programs report
1.3.4 Service. Exceed expectations of customers with a sense of warmth,
friendliness, and individual pride.
Performance Measures:
1.3.4.1 – Selected responses on annual student satisfaction survey
(TBD)
1.3.4.2 – Student focus group outcomes (TBD)
52
Goal 2: Build a Top-Shelf Fund-Raising Capacity That Provides Consistent & Meaningful
Financial Resources
Objective 2.1 Advancement: Build financial resources for the College
Strategies:
2.1.1 Culture of Philanthropy. Build the unified case for support,
impassioned volunteer leadership, and an equipped sales force for long-term fundraising success.
Performance Measures:
2.1.1.1 – Annual Development report
2.1.1.2 – Number of annual donors to TSTC and The TSTC Foundation
2.1.1.3 – Alumni Report
2.1.1.4 – Number of Individuals and Organizations Engaged Annually
Goal 3: Make TSTC a Great Place to Work
Objective 3.1 Organization Development: Shape the culture of the college to maximize
success for all
Strategies:
3.1.1 Employee Engagement. Provide the tools, resources, and support
necessary to ensure success of all employees
Performance Measures:
3.1.1.1 – Report of Employee Satisfaction results
3.1.1.2 – Compensation Progress report
53
Attachment “E”
Sample TSTC Success Factors and Performance Indicators
1. Retention: A process designed to achieve and maintain retention rates as defined by
the institution. This includes completion of a postsecondary credential or certificate.
2. Enrollment Development: Emphasizes efforts to increase program
enrollments, especially in under-enrolled programs.
3. Student Satisfaction: Students will be satisfied with campus life and student services
at the college.
4. Employer Satisfaction: Employers will be satisfied with the job skills and
performance of TSTC graduates.
5. Employee Development: A process by which faculty/staff are provided opportunity
for professional growth and development.
6. Placement of Graduates: The process of moving students from the college
into employment.
7. Planning & Evaluation: The institution engages in planning and evaluation processes
that (1) incorporate a systematic review of institutional mission, goals, and outcomes;
(2) result in continuing improvement in institutional quality; and (3) demonstrate the
institution is effectively accomplishing its mission. (an excerpt from SACSCOC
Principles of Accreditation - CR 2.5) Each department identifies expected outcomes,
assesses the extent to which it achieves these outcomes, and provides evidence of
improvement based on analysis of the results. (an excerpt from SACSCOC Principles of
Accreditation- CS 3.3.1)
8. Technical Skills Standards: Technical Education students should master knowledge
and skills that meet state-defined and industry-validated career and technical skill
standards.
9. Effective Partnerships: These partnerships are intended to forge positive relationships
with other higher education institutions, public schools, and private sector industries in
order to increase training and education.
54
APPENDIX B: Executive Management Council
TSTC Executive Management Council
Name
Michael L. Reeser
Jonathan Hoekstra
Title
Chancellor, Chief Executive Officer
Vice Chancellor, Chief Financial Officer
J. Gary Hendricks
Vice Chancellor, Chief Business Intelligence
Officer/SACSCOC Accreditation Liaison
Jeff L. Kilgore
Vice Chancellor, Chief Marketing Officer
Gail Lawrence
Vice Chancellor, Chief Culture Officer
Roger P. Miller
Rick Herrera
Vice Chancellor, Chief Government Affairs
Officer
Vice Chancellor, Chief Technology Officer
Elton Stuckly
Vice Chancellor, Chief Operations Officer
Randall Wooten
Michael L.
Bettersworth
Vice Chancellor, Chief Execution Officer
Vice Chancellor, Chief Policy Officer
Barton Day
Stella Garcia
Rob Wolaver
Interim President, Marshall
Interim President, Harlingen
Interim President, Waco
Kyle Smith
Interim President, West Texas
James Roland
Ray Rushing
Provost, Red Oak
General Counsel
55
Charge
Strategic Leadership
Finance, Facilities, Auxiliaries,
Administrative Services
Business Intelligence, IE/IR,
Educational Services, ERP
Management
Marketing, Advancement,
Communications,
Recruitment, Placement
Human Resources, Culture
and Organization
Development
Legislative Relations
Information Technology
Services
Student Learning, Student
Development, Innovation,
College Readiness,
Strategic Initiatives
Policy, Technology Trends,
Center for Employability
Outcomes
Marshall, Red Oak
Harlingen, Ingleside
Waco, Williamson County,
Fort Bend County
Sweetwater, Abilene,
Brownwood, Breckenridge
Red Oak
Legal
APPENDIX C: TSTC Key Performance Indicators
KPI #1: Annual Unduplicated Count of Semester Credit Hour Students
Data Definition: This is a measure of the annual headcount of students enrolled in
semester credit hour classes, counting a student one time, regardless of the
number of classes or terms in which a student is enrolled during the academic
year.
Data Sources: Certified CB-M001 reports to Texas Higher Education Coordinating
Board (THECB)
Targeted Annual Growth Rate 2016 to 2020: 4.82%
Method of Calculation: A student is counted one time for the academic year
reported. Count is made for unique student IDs.
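For illustration only, this calculation reduces to counting distinct student IDs across the year’s certified enrollment records; a minimal sketch in Python, using invented record and field names rather than the actual CB-M001 layout:

    # Hypothetical enrollment records drawn from certified CB-M001 reporting;
    # field names and values are illustrative assumptions, not the real report layout.
    enrollments = [
        {"student_id": "A001", "term": "Fall"},
        {"student_id": "A001", "term": "Spring"},   # same student, second term
        {"student_id": "B002", "term": "Fall"},
    ]

    # A student is counted once for the academic year, regardless of how many
    # classes or terms appear for that student.
    unduplicated_count = len({record["student_id"] for record in enrollments})
    print(unduplicated_count)  # 2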
FY 2020 Performance Target: 20,004
KPI #2: Annual Unduplicated Count of Former Students Found Working or Transferring to
another public college or university in Texas (Placement)
Data Definition: Placement results are available through Texas Higher Education
Coordinating Board (THECB) Automated Student and Adult Learner Follow-Up
System (ASALFS) Exit Cohort Reports, which track all students who were at the
college the previous fiscal year and who were not present the following fiscal year
whether due to graduation, transfer, or other reasons. Included in the placement
count are students who were found working in the Texas Workforce Commission
(TWC) Unemployment Insurance (UI) wage records and in national databases
including the Office of Personnel Management, US Post Office, and military records
from the Department of Defense, in October, November, and/or December of the
fiscal year following their exit from TSTC.
Placement counts include all former TSTC students (except continuing ed) from the
prior academic year who are “found working” or who have been found “enrolled”
at another public institution of higher education in Texas.
Data Sources: Texas Higher Education Coordinating Board (THECB) Automated
Student and Adult Learner Follow-Up System (ASALFS) Exit Cohort Reports –
Summary by Cohort Type, Level of Award (Page 3 or 4, varies by report year)
Targeted Annual Growth Rate 2016 to 2020: 4.82%
Method of Calculation: Count of former TSTC students “found working” or
enrolled in a public institution of higher education in Texas.
FY 2020 Performance Target: 8,412
KPI # 3: Average First Year Salary of Former Students Found Working (including
Leavers)
Data Definition: Salary results are available through Texas Higher Education
Coordinating Board (THECB) Automated Student and Adult Learner Follow-Up
System (ASALFS) Exit Cohort Reports, which track all students who were at the
college the previous fiscal year and who were not present the following fiscal year
whether due to graduation, transfer, or other reasons. Included in the average
salary calculation are students who were found working in the Texas Workforce
Commission (TWC) UI wage records and in national databases including the Office
of Personnel Management, US Post Office, and military records from the
Department of Defense, in October, November, and/or December of the fiscal year
following their exit from TSTC. Includes former students found to be “working
only” (not enrolled in higher education in Texas).
Data Sources: THECB Automated Student and Learner Follow-Up System (ASALFS) Reports
– Summary of Cohort Type, Level of Award (Page 3 or 4, varies by report year).
Targeted Annual Growth Rate 2016 to 2020: 3.71%
Method of Calculation: Mean earnings based on 4th quarter TWC wage data following year
of graduation or exit from TSTC, extrapolated to a full year.
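As a worked illustration of the extrapolation (the wage amounts below are invented, not actual TWC data), one quarter of reported earnings is multiplied by four and the annualized figures are then averaged:

    # Hypothetical 4th-quarter UI wage amounts for former students found "working only".
    q4_wages = [6500.00, 7200.00, 5800.00]

    # Extrapolate one quarter of earnings to a full year, then take the mean.
    annualized = [wage * 4 for wage in q4_wages]
    average_first_year_salary = sum(annualized) / len(annualized)
    print(round(average_first_year_salary, 2))  # 26000.0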
FY 2020 Performance Target: $30,000
Data Limitations: This KPI measures former students’ salaries within one month to
thirteen months following their departure from TSTC and includes “leavers” as
well as graduates, and is based on only one quarter of UI wage and additional
database data. Salaries are thus lower than what may be discovered through a
process which measures salaries for a longer time after student departure. A
student leaving TSTC in August will be included for what they earned in the fourth
calendar quarter of that year, regardless of whether their employment was for
part-time or full-time work, and whether the former student was employed for
one month of the fourth quarter or all three months of the fourth quarter.
“Leaver” salaries also further dilute the average salaries, resulting in only
moderate average earnings per “former student.”
57
KPI # 4: Number of Annual Graduation Awards
Data Definition: Graduation data is reported to THECB on the CB-M009 report and
this measure reflects the total number of degrees and certificates awarded during
the fiscal year. This is a count of awards rather than graduates. A graduate may
earn more than one award in an academic year, and this measure counts each
award earned during that year.
Data Sources: Certified CB-M009 reports
Targeted Annual Growth Rate 2016 to 2020: 4.82%
Method of Calculation: The total number of graduation awards reported on the
CB-M009 report is summed.
FY 2020 Performance Target: 3,474
KPI # 5: Number of Students Graduated Annually
Data Definition: From the CB-M009 report, this measure counts the number of
students who graduated in a fiscal year so that a graduate is counted only one time
for the year, regardless of the number of awards received.
Data Sources: Certified CB-M009 report
Targeted Annual Growth Rate 2016 to 2020: 4.82%
Method of Calculation: The number of unduplicated graduates reported on the CB-M009 is summed. A graduate is counted one time, regardless of the number of
awards received.
FY 2020 Performance Target: 2,863
KPI # 6: Percent of First-time, Full-time Degree or Certificate-seeking Students Graduated
within 3 years
Data Definition: First-time, Full-time Degree or Certificate-seeking Students who Graduated
Within Three Years with Either an AAS Degree or Certificate - Students who (1) enrolled at the
specific TSTC college for the first time, (2) were enrolled for 12 or more semester credit hours
as of the official census date in their first semester of enrollment, (3) declared they were
seeking either an Associate of Applied Science (AAS) degree or a vocational/technical certificate
in their first semester of enrollment, and (4) graduated within three years.
Data Sources: Colleague Database. This is also a Legislative Budget Board (LBB)
measure.
Targeted Annual Growth Rate 2016 to 2020: from 30.0% to 32.5%
Method of Calculation: This measure is calculated by assigning students who meet
the criteria as "first-time, full-time degree or certificate-seeking students" to a
term- and location-specific cohort. Each term's cohort is tracked for three years
and the status of each student is determined as of the end of the three-year
period: either certified as having graduated or not certified as having graduated.
The number of students certified as having graduated and the total number of
students in the cohort are summed for each of the three cohorts whose third year
ends in the fiscal year being reported. Then, the total annual number of students
certified as having graduated is expressed as a percentage of the total number of
students in the annual cohort.
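A minimal sketch of the rate calculation, assuming hypothetical cohort counts for the three term cohorts whose three-year windows end in the reported fiscal year (numbers are illustrative, not actual TSTC data):

    # Hypothetical term cohorts of first-time, full-time degree/certificate-seeking students.
    cohorts = [
        {"term": "Fall",   "cohort_size": 400, "graduated_within_3_years": 130},
        {"term": "Spring", "cohort_size": 150, "graduated_within_3_years": 40},
        {"term": "Summer", "cohort_size": 50,  "graduated_within_3_years": 15},
    ]

    total_graduated = sum(c["graduated_within_3_years"] for c in cohorts)
    total_cohort = sum(c["cohort_size"] for c in cohorts)

    # Annual rate: graduates across the cohorts as a percent of the combined cohort.
    three_year_graduation_rate = 100 * total_graduated / total_cohort
    print(round(three_year_graduation_rate, 1))  # 30.8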
FY 2020 Performance Target: 32.5%
KPI # 7: Average Student Satisfaction Index
Data Definition: Students are surveyed to determine their satisfaction with all
aspects of TSTC operations, including facilities, student learning processes, student
life, safety, and other aspects of student life.
Data Sources: Student Satisfaction Survey. Completed in odd-numbered fiscal
years
Targeted Change, 2016 to 2020: from 3.77 in FY 13 to 3.85 in FY 19
Method of Calculation: These surveys are scored on a 1 to 7 Likert scale, with a 7
representing total satisfaction. Expected performance, based on past results, was
set at an overall average score of 4.0, for Point Scaling value of 5. Each campus’
average score is weighted by the number of students at that campus. College-wide
results are weighted by the number of respondents at each location.
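As an illustrative sketch of the weighting (location names, scores, and respondent counts are invented, and respondent counts are assumed as the weights), the college-wide index is a weighted mean of the location averages:

    # Hypothetical average satisfaction scores (1-7 Likert scale) and respondent counts.
    locations = [
        {"name": "Waco",      "avg_score": 3.9, "respondents": 500},
        {"name": "Harlingen", "avg_score": 3.7, "respondents": 400},
        {"name": "Marshall",  "avg_score": 4.1, "respondents": 100},
    ]

    # Weight each location's average by its number of respondents.
    weighted_sum = sum(loc["avg_score"] * loc["respondents"] for loc in locations)
    total_respondents = sum(loc["respondents"] for loc in locations)
    college_wide_index = weighted_sum / total_respondents
    print(round(college_wide_index, 2))  # 3.84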
FY 2020 Performance Target: 3.85 (in FY 2019)
KPI # 8: Student Ethnicity Compared to Statewide Population Ethnicity
Data Definition: This measures the difference in percentage points of the TSTC
student population against statewide averages (all ages included).
Data Sources: Certified CB-M001 reports to the Texas Higher Education
Coordinating Board (THECB)
Expected Change 2016 to 2020: An increase from 11.5% to 12.5%
Method of Calculation: Percentages by ethnicity are calculated for TSTC students
and compared to statewide ethnicity percentages. The difference is used to
measure the differential.
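For illustration, a sketch of the percentage-point differential for a single ethnicity group, using invented shares rather than certified CB-M001 or statewide census figures:

    # Hypothetical ethnicity shares (percent) for TSTC students and the statewide population.
    tstc_pct = {"Hispanic": 52.0, "White": 33.0, "Black": 9.0, "Other": 6.0}
    state_pct = {"Hispanic": 39.0, "White": 42.0, "Black": 12.0, "Other": 7.0}

    # Differential: percentage-point difference between the student share and the statewide share.
    hispanic_differential = tstc_pct["Hispanic"] - state_pct["Hispanic"]
    print(hispanic_differential)  # 13.0 percentage points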
FY 2020 Performance Target: 12.5%
KPI # 9: Funds Raised Annually
Data Definition: Fund raising measures the monetary donations at the campuses
and administration, including amounts donated to TSTC through The TSTC
Foundation.
Data Sources: Internal records
Targeted Annual Growth Rate 2016 to 2020: 20.0% for monetary donations
Method of Calculation: Sum of monetary donations for a fiscal year.
FY 2020 Performance Target: $3.3 million in monetary donations for FY 2020
Data Limitations: Non-monetary donations for each fiscal year will be reported, but
these donations do not have an annual “goal” due to the unpredictable variability
in their size and frequency.
KPI # 10: Annual Number of Donors
Data Definition: This measure provides the annual unduplicated count of donors
of both monetary and non-monetary contributions for the fiscal year.
Data Sources: Internal records of the TSTC Foundation.
Targeted Annual Growth Rate 2016 to 2020: 20%
Method of Calculation: Total number of unique donors for a fiscal year are
summed.
FY 2020 Performance Target: 498
KPI # 11: Employee Satisfaction Index
Data Definition: Employees are surveyed to determine their satisfaction with all
aspects of their TSTC employment. These surveys are scored on a scale of 1 to 5,
with 5 representing total satisfaction.
Data Sources: Employee Satisfaction Survey
Targeted Annual Growth Rate 2016 to 2020: From 3.70 to 3.90
Method of Calculation: Each location’s average score is weighted by the number of
employees at each location and then averaged.
FY 2020 Performance Target: 3.9 on a 1 to 5 scale
KPI # 12: Employee Ethnicity Compared to Statewide Population Ethnicity
Data Definition: This measures the number of percentage points TSTC employee
ethnicity percentages are away from statewide ethnicity percentages. These
results are based on the ethnicity of all employees college-wide. All other
categories are combined and measured against Anglo populations.
Data Sources: HOD records of employee ethnicity as measured in July of each
year.
Targeted Change, 2016 to 2020: from -9.8% to -9.0%
Method of Calculation: Percentages by ethnicity are calculated for TSTC
employees and compared to statewide ethnicity percentages. The difference in
these percentages is the differential.
FY 2020 Performance Target: -9.0%
61
APPENDIX D: TSTC Strategic Plan
Texas State Technical College
Strategic Plan: 2016-2020
February 10, 2015
62
CONTENTS
Strategic Plan Overview
Mission, Vision, and Values
Legislative Mission Statement
Regents Expanded Mission Statement
Vision & Values
Strategic Plan Development for Fiscal Years 2016 to 2020
TSTC Expanded Strategic Plan Including Strategies and Performance Indicators
Appendix A: Key Performance Indicators Definitions
Appendix B: Key Performance Indicators - Targets for FY 2020
STRATEGIC PLAN OVERVIEW
Strategic level planning includes administration, faculty, staff and student representation through the
Executive Management Council and the Institutional Effectiveness Monitoring Committee. The Executive
Management Council provides leadership in developing the strategic goals and objectives.
STRATEGIC PLANNING
Figure 1: TSTC’s Strategic Planning Processes
In addition to the Strategic Planning Process Model, which illustrates the different components of college-wide
planning and evaluation, there is also an Institutional Effectiveness Model, which depicts the various levels of
planning and assessment through the college’s divisions and departments. At the bottom of the pyramid are the
mission, vision and values (foundation) that guide all planning processes above. Planning and evaluation processes
comprise the Strategic Plan, Functional Operating Plans and Unit Action Plans. Other organizational
assessments and surveys support the three levels of planning and evaluation with data for decision-making:
• Accountability Report of Texas Higher Education Coordinating Board (ART)
• Alumni Employment Survey (AES)
• Community College Survey of Student Engagement (CCSSE)
• Employee Assessment of College Environment Survey (ACE)
• Student Environment Survey (SES)
• Internal Operational Reports
Figure 2: TSTC’s Institutional Effectiveness Model
MISSION, VISION, AND VALUES
The guiding mission of TSTC is defined by the legislative framework creating the institution and the TSTC
Board of Regents. These form the foundation of all strategic and operational planning and should be consulted
whenever new initiatives are being considered.
LEGISLATIVE MISSION STATEMENT
The Texas State Technical College System mission is defined by the Texas State Legislature and published
in Vernon’s Texas Education Code Section 135.01:
Texas State Technical College System is a coeducational two-year institution of higher
education offering courses of study in technical-vocational education for which there is
a demand within the State of Texas.
Texas State Technical College System shall contribute to the educational and economic
development of the State of Texas by offering occupationally oriented programs with
supporting academic course work, emphasizing highly specialized advanced and
emerging technical and vocational areas for certificates or associate degrees. The
Texas State Technical College System is authorized to serve the State of Texas through
excellence in instruction, public service, faculty and manpower research, and economic
development.
The system’s economic development efforts to improve the
competitiveness of Texas Business and industry include exemplary centers of
excellence in technical program clusters on the system’s campuses and support of
education research commercialization initiatives. Through close collaboration with
business, industry, governmental agencies and communities, including public and
private secondary and postsecondary educational institutions, the system shall
facilitate and deliver an articulated and responsive technical education system.
In developing and offering highly specialized technical programs with related
supportive course work, primary consideration shall be placed on industrial and
technological manpower needs of the state. The emphasis of each Texas State
Technical College System campus shall be on advanced or emerging programs not
commonly offered by public junior colleges. (TEC 135.01)
REGENTS EXPANDED STATEMENT OF PURPOSE
The TSTC Board of Regents periodically reviews the mission statement as defined by the legislature and
authors an Expanded Statement of Purpose. The most recent version of the statement was approved on February
12, 2015, and is worded as follows:
Texas State Technical College (TSTC) is a coeducational two-year, multi-campus institution of
higher education providing innovative and responsive programs and courses of study in
technical education for which there is demand in the State of Texas, with emphasis on advanced
and emerging technologies. TSTC is a leader in building the economic vibrancy of Texas by
providing excellence in learning experiences, on location and at a distance, and through diverse
technical programs and rigorous curricula offerings. TSTC facilitates the transfer of technical
expertise through the placement of former students, who have obtained hands-on learning
experience, in jobs with Texas business and industry. TSTC works collaboratively both internally
and with other organizations to increase the availability of relevant technical education in Texas
and to be accountable to its various constituencies. Integrity in all of its dealings provides the
foundation of TSTC’s mission.
TSTC awards include Associate of Applied Science degrees, Certificates of Completion, badges
(skill-set institutional awards) and workforce certificates. TSTC also provides opportunities for
the seamless transfer of credits to other colleges and universities, including awards at its
Harlingen campus for Associate of Science degrees and institutional recognitions for completion
of the General Education Core curricula.
TSTC makes higher education affordable, readily accessible and personal through multiple
instructional delivery systems, counseling and guidance services, student activities and the
opportunity to learn in a residential setting at several of its campuses. By offering TSTC
programs and services in flexible times and places, TSTC students are able to achieve their
educational and career goals at a pace that meets their needs while minimizing the elapsed time
needed to reach those goals. To achieve time and place flexibility, TSTC offers traditional higher
education credit programs taught on a semester basis, dual credit programs that lead to
marketable skills achievement or further education (in partnerships with Independent School
Districts), competency-based education and training delivery, online instruction, project-based
learning activities, continuing education, and specialized training for business and industry.
TSTC operates its programs and services in accordance with the public trust for which it is
responsible.
Diversity in the student body and in faculty and staff is a value that TSTC strives to achieve. It is
TSTC’s goal for the ethnicity of these groups to mirror statewide and local demographics.
Likewise, serving non-traditional and special population groups has always been a TSTC keynote,
with specialized services provided to assist where and when needed.
VISION & VALUES
VISION: Texas State Technical College will be a leader in strengthening the competitiveness of Texas business and
industry by building the state’s capacity to develop the highest quality workforce.
VALUES:
 Integrity: Dealing honestly and openly with all of our constituencies and with one another
 Excellence: Achieving the highest quality in all we do
 Leadership: Developing visions and strategies for a desired future, and aligning and energizing
people to achieve those visions.
 Innovation: Creating and implementing new ideas and methods
 Collaboration: Working cooperatively with other organizations and within our own system
 Responsiveness: Providing appropriate programs and services in a proactive, flexible, and timely
manner
 Accountability: Measuring our performance and using the results for improvement
 Stewardship: Ensuring our programs and services add value to our students and communities
throughout the state, and operate in accordance with the public trust for which we are responsible
 Diversity: Striving for inclusivity in our faculty, staff and students as reflected in state demographics;
treating others fairly and equitably as we would all like to be treated
Figure 3: TSTC Root Values Illustration
STRATEGIC PLAN DEVELOPMENT FOR FISCAL YEARS 2016 TO 2020
The TSTC Strategic Plan for Fiscal Years 2016 to 2020 was developed in the Fall of 2014 and completed in the
Spring of 2015. The plan is based on a history of strategic initiatives that have preceded the Single Accreditation
initiative and date back to 2007, when TSTC leaders responded to a legislative request about the desirability and
feasibility of basing legislative appropriations on outcomes rather than being reimbursed for the cost of
educational activities.
TSTC’s response to the state legislative inquiry was positive and TSTC agreed to explore the possible options.
Based on data and analyses that dated back to the early 1990’s, the college administration decided to propose a
state funding formula for Administration and Instruction (A&I) based on the amount of state tax revenues that can
be attributed to former TSTC students as they obtain good jobs in the Texas economy. To this end, a study was
contracted with noted economist, Dr. Ray Perryman. This study was received in 2008 and concluded that such a
funding methodology would be feasible. In 2009, the State Legislature requested that the Texas Higher Education
Coordinating Board (THECB) also study the feasibility of this type of formula. The proposal was studied by THECB
staff and an independent consultant. It was also reviewed by the State Comptroller’s Office and the Texas
Workforce Commission. All agreed the proposal was feasible. In 2011, the State Legislature instructed the THECB
to work with TSTC and the Texas Workforce Commission to develop the proposed formula and utilize it for the
2013 legislative session. Work on the details of the formula continued in 2014 and involved staff from the THECB,
the Texas Workforce Commission and TSTC. Legislative Budget Board staff attended all meetings and made no
objections to the process or product. The second application of the new formula for TSTC’s A&I funding will occur
in the 2015 legislative session. This formula is particularly suited to TSTC since its mission is essentially to educate
and train students for the technical jobs that exist in the Texas economy.
The adoption of the new funding formula is an important factor in TSTC’s strategic planning. TSTC leadership
noted that “adoption of the new funding formula changes everything.” After noting that “you get what you pay
for,” college leadership recognized that the new funding formula inherently aligns the goals of all parties involved
in the educational process: students, the college, business and industry who hire graduates, and the State that
benefits from increased economic activity. TSTC’s inherent motive is to serve all students to the best of its ability
so that they may obtain good jobs in the Texas economy. While this has always been TSTC's primary goal, the
prior concern over the number of contact hours taught no longer distracts TSTC from achieving its mission.
Removing that concern opens the college to innovations that eliminate unnecessary activities and allow students
to achieve their educational goals in their own time frame, rather than one that paces learning by the clock. Further, the new
formula more appropriately incentivizes TSTC to enhance services to students at the beginning of their
educational journey to maximize their educational success, and at the end, when they are looking to join the
economy as technicians valued by Texas business and industry. As a result of the change in inherent motivation,
TSTC continues to innovate the educational process by right-sizing educational programs and providing new
options for learning, including competency-based learning, project-based learning, and innovative methods of
aligning and maintaining instructional content to the needs of business and industry.
In order to further focus TSTC’s efforts to these ends, TSTC has adopted a rallying cry (or overall Goal statement)
of “PLACE MORE TEXANS.” This overall goal then is reflected in the three overarching (thematic) goals TSTC has
adopted:
I. Make TSTC the perpetually-relevant, innovative, go-to, statewide technical education source for Texas;
II. Build a top-shelf fund-raising capacity that provides consistent and meaningful financial resources; and
III. Make TSTC a great place to work (and learn).
These goals will be assessed based upon the following Key Performance Indicators (KPIs):

I. Make TSTC the perpetually-relevant, innovative, go-to, statewide technical education source for Texas.
Key Performance Indicators:
A. Annual Unduplicated Count of Semester Credit Students
B. Annual Unduplicated Count of Former Students Found Working or Transferring to another public college or university in Texas, i.e., Placement
C. Average First-year Salary of Former Students Found Working (including Leavers)
D. Number of Annual Graduation Awards
E. Number of Students Graduated Annually
F. Percent of First-time, Full-time Degree or Certificate-seeking Students Graduated within 3 years
G. Average Student Satisfaction Index
H. Student Ethnicity compared to Statewide Population Ethnicity

II. Build a top-shelf fund-raising capacity that provides consistent and meaningful financial resources.
Key Performance Indicators:
A. Funds Raised Annually
B. Annual Number of Donors

III. Make TSTC a Great Place to Work.
Key Performance Indicators:
A. Employee Satisfaction Index
B. Employee Ethnicity compared to Statewide Population Ethnicity
Following the initial development of the plan and approval of the Executive Management Council, the Institutional
Effectiveness Committee reviewed the plan and provided faculty and staff input. Additional changes were
subsequently reviewed and approved by the Executive Management Council.
In addition to the Strategic Plan, each of the various functions and divisions under Single Accreditation will
produce a Biennial Operating Plan which will contain direct references to the Strategic Plan, with performance
targets identified and set for the two-year life of the Biennial Operating Plans. Also, departments will develop
Departmental Biennial Operating Plans which will contain direct references to functional Biennial Operating Plans.
In this manner, all planning and evaluation processes will be linked together. The Institutional Effectiveness
Committee will monitor the three levels of planning and assessment and will make recommendations for
improvement to the content and processes as appropriate.
For each of the three overall (thematic) goals, Objectives and Strategies have been developed to “flesh-out” the
overall plan. These Goals, Objectives and Strategies are listed in the next section and constitute the TSTC Strategic
Plan.
TSTC EXPANDED STRATEGIC PLAN INCLUDING STRATEGIES AND PERFORMANCE
INDICATORS
TSTC’s RALLYING CRY: “PLACE MORE TEXANS”
Goal 1:
Make TSTC the Perpetually-Relevant, Innovative, Go-To, State-Wide Technical
Education Source for Texas
Objective 1.1: Enhance the delivery of student related products and services
Strategies:
1.1.1 Expansion. Stand up and expand start-up operations at TSTC’s newest locations.
Performance Indicator(s): (form & content to be developed)
1.1.1.1 – Hutto location annual assessment report
1.1.1.2 – Red Oak location annual assessment report
1.1.1.3 – Ft. Bend county location annual assessment report
1.1.2 Skill Mastery. Provide students with learning experiences required to achieve relevant
levels of technical skill mastery leading to successful employment or advancement.
Performance Indicator(s): (form & content to be developed)
1.1.2.1 – Competency-based instruction annual assessment report
1.1.2.2 – Continuous improvement of learning assessment report
1.1.2.3 -- Curricula update annual assessment report
1.1.2.4 – Annual program facilities assessment report
1.1.2.5 – Annual percent of students achieving course-level mastery (to be defined
and methodology of assessment developed)
1.1.2.6 – Annual number of former TSTC students found working in the Texas
economy
1.1.2.7 – Average annual wages of former TSTC students found working
1.1.3 Innovation. Accelerate innovation in the delivery of technical training, including
competency-based learning and badges, to increase the speed of workforce supply to
Texas industry.
Performance Indicators: (definitions & methodologies TBD)
1.1.3.1 – Percent of programs offering competency-based instruction options
1.1.3.2 – Percent of programs offering badges
1.1.3.3 – Number of students earning badges annually
1.1.3.4 – AAS Graduate average time to complete
1.1.3.5 – Certificate Graduates average time to complete
1.1.4 Veteran Services. Enhance and improve student support services for veterans.
Performance Indicators:
1.1.4.1 – Number of staff and annual budget devoted to Veterans Services
1.1.4.2 – Annual unduplicated number of veterans served
1.1.4.3 – Placement rate for Veterans exiting TSTC
1.1.4.4 – Average annual salaries for placed veterans
1.1.5 The Student Experience. Enrich the student experience through integrated, accessible
and responsive services that enhance learning for each TSTC student.
Performance Indicators:
1.1.5.1 – Selected items on the Annual Survey of Student Satisfaction (to be
selected)
Objective 1.2:
Broaden the market appeal of TSTC’s Products and Services
Strategies:
1.2.1 Branding. Inspire and activate loyalty to TSTC by its current and prospective customers.
Performance Measures:
1.2.1.1 – Marketing Engagement Report
1.2.2 Industry Relations. Deepen our relationships with Texas employers to enrich their
supply of job-ready technicians and ensure their workforce maintains a technical
edge.
Performance Measures:
1.2.2.1 – Number of Texas employers who hire former TSTC students annually
1.2.2.2 – Industrial Career Day report
1.2.2.3 – Job Star Data report
1.2.2.4 – Equipment Donation report
1.2.3 C4EO. Expand the Common Skills Language project for aligning skill-based language
to validate learning outcomes for relevancy and curriculum alignment.
Performance Measures:
1.2.3.1 – Percent of TSTC programs for which curricula is validated or updated using
the CSL project process annually
1.2.4 Partnerships. Continue working cooperatively with Texas community colleges and
other partners to address Texas industry’s training needs regardless of their location
in the state.
Performance Measures:
1.2.4.1 – Number and list of active partnerships
1.2.4.2 – Number of students served in active partnerships
Objective 1.3 Provide essential services and leadership in support of operations
Strategies:
1.3.1 Single Accreditation. Merge the four TSTC college accreditations into a single,
statewide accreditation to provide an efficient structure for increased instructional
capacity, innovation, sustainability, quality, consistency, and flexibility to respond to
the growth opportunities that lie ahead.
Performance Measures:
1.3.1.1 – SACSCOC votes to approve TSTC’s merger, June 2015
1.3.1.2 – TSTC Compliance report submitted Nov. 2015
1.3.1.3 – SACSCOC Site Visit, end of Jan. 2016
1.3.1.4 – TSTC addresses any outstanding recommendations, Apr. 2016
1.3.1.5 – SACSCOC accepts TSTC’s actions on any recommendations, June 2016
1.3.2 Optimization. Employ reasonable means to leverage and extend the entire system’s
resources for greater operating efficiency, including statewide integration of
administrative and operational functions.
Performance Measures: (Content and format TBD)
1.3.2.1 – Student Learning annual status report
1.3.2.2 – Student Development annual status report
1.3.2.3 – Finance Division annual status report
1.3.2.4 – Business Intelligence annual status report
1.3.2.5 – Marketing/Development annual status report
1.3.2.6 – Facilities/Auxiliary Services annual status report
1.3.2.7 – IT Services annual status report
1.3.2.8 – Organizational Development annual status report
1.3.2.9 – Technology Enhancement annual status report
1.3.2.10 – OPM annual report
1.3.3 Program Vitality. Further develop the economic model, tools, and strategies for
evaluating programs within the framework of TSTC’s new funding formula as well as
other factors, including current and future market demand and program cost.
Performance Measures: (Content & format TBD)
1.3.3.1 Annual Status of Educational programs report
1.3.4 Service. Exceed expectations of customers with a sense of warmth, friendliness, and
individual pride.
Performance Measures:
1.3.4.1 – Selected responses on annual student satisfaction survey (TBD)
1.3.4.2 – Student focus group outcomes (TBD)
Goal 2: Build a Top-Shelf Fund-Raising Capacity That Provides Consistent & Meaningful
Financial Resources
Objective 2.1 Advancement: Build financial resources for the College
Strategies:
2.1.1 Culture of Philanthropy. Build the unified case for support, impassioned volunteer
leadership, and an equipped sales force for long-term fundraising success.
Performance Measures:
2.1.1.1 – Annual Development report
2.1.1.2 – Number of annual donors to TSTC and The TSTC Foundation
2.1.1.3 – Alumni Report
2.1.1.4 – Number of Individuals and Organizations Engaged Annually
Goal 3: Make TSTC a Great Place to Work
Objective 3.1 Organization Development: Shape the culture of the college to maximize success for all
Strategies:
3.1.1 Employee Engagement. Provide the tools, resources, and support necessary to
ensure success of all employees
Performance Measures:
3.1.1.1 – Report of Employee Satisfaction results
3.1.1.2 – Compensation Progress report
APPENDIX A: KEY PERFORMANCE INDICATORS DEFINITIONS
KPI #1: Annual Unduplicated Count of Semester Credit Hour Students
Data Definition:
This is a measure of the annual headcount of students enrolled in semester credit hour classes,
counting a student one time, regardless of the number of classes or terms in which a student is
enrolled during the academic year.
Data Sources:
Certified CB-M001 reports to Texas Higher Education Coordinating Board (THECB)
Targeted Annual Growth Rate 2016 to 2020: 4.82%
Method of Calculation:
A student is counted one time for the academic year reported. Count is made for unique
student IDs.
FY 2020 Performance Target: 20,004
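As a worked check of the growth-rate arithmetic: the FY 2016 estimate of 16,571 students (Appendix B)
compounded at the targeted 4.82% annual rate for four years yields approximately the FY 2020 target,
16,571 x (1.0482)^4 ≈ 20,004.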
KPI #2: Annual Unduplicated Count of Former Students Found Working or Transferring to another public
college or university in Texas (Placement)
Data Definition:
Placement results are available through Texas Higher Education Coordinating Board (THECB)
Automated Student and Adult Learner Follow-Up System (ASALFS) Exit Cohort Reports, which
link all students who were at the college the previous fiscal year and who were not present the
following fiscal year whether due to graduation, transfer, or other reasons. Included in the
placement count are students who were found working in the Texas Workforce Commission
(TWC) Unemployment Insurance (UI) wage records and in national databases including the Office
of Personnel Management, US Post Office, and military records from the Department of Defense,
in October, November, and/or December of the fiscal year following their exit from TSTC.
Placement counts include all former TSTC students (except continuing ed) from the prior
academic year who are “found working” or who have been found “enrolled” at another public
institution of higher education in Texas.
Data Sources:
Texas Higher Education Coordinating Board (THECB) Automated Student and Adult Learner
Follow-Up System (ASALFS) Exit Cohort Reports – Summary by Cohort Type, Level of Award (Page
3 or 4, varies by report year)
Targeted Annual Growth Rate 2016 to 2020: 4.82%
Method of Calculation:
Count of former TSTC students “found working” or enrolled in a public institution of higher
education in Texas.
FY 2020 Performance Target: 8,412
Texas Higher Education Coordinating Board
ASALFS RESULTS DEFINITIONS (for 2005-2006 and later)
Automated Student and Adult Learner Follow-up System (ASALFS) - Process of tracking students who
attend Texas public community and technical colleges after they leave the colleges. Process involves
electronic matching of social security numbers of students with Texas Workforce Commission (TWC)
Unemployment Insurance (UI) wage records, DOD records, federal databases of civil employees, and the
public higher education enrollment database maintained by the Texas Higher Education Coordinating Board
(THECB). Incarcerated students enrolled in academic programs as well as students with invalid SSNs are
excluded from the cohort.
GRADUATE - Students who received an award during a given academic year (Fall, Spring, Summer I, and
Summer II).
COMPLETER - Students who were reported as satisfactorily completing a core curriculum or field of study
curriculum during a given academic year.
NON-RETURNING -Students who were enrolled in an institution during a given academic year and did not
graduate or return to that institution in the following Fall.
UNDECLARED - Students who were enrolled in a general academic or technical curricula but had not
declared a major at an institution during a given academic year and did not attend that institution in the
following Fall.
ADDITIONAL HIGHER EDUCATION AND NOT EMPLOYED- Students found to be enrolled in a Texas
public higher education institution in the Fall following a given academic year, and not found in employment
records.
EMPLOYED AND NO ADDITIONAL HIGHER EDUCATION- Students found in the employment records as
employed during the 2nd quarter following the program year in which they left postsecondary education and
not enrolled in a Texas public higher education institution in that same Fall.
ADDITIONAL HIGHER EDUCATION AND EMPLOYED- Students found to be both enrolled in a Texas
public higher education institution in the Fall following a given academic year and employed via employment
records.
EMPLOYED AND/OR ADDITIONAL HIGHER EDUCATION - Students found to be employed via the
employment records and/or enrolled in a Texas public higher education institution in the Fall following a given
academic year. (NOTE: This is an unduplicated aggregate of the first three columns. We refer to the percent
of graduates in this column as an Institution's Success Rate.)
STUDENTS NOT FOUND - Students who were reported to the THECB as enrolled during a given academic
year, but were not found by the ASALFS process.
Types of Students not Found:
• Students who transferred to colleges outside of Texas
• Students who were working for companies in Texas not covered by Unemployment Insurance regulations
• Students who were self-employed
• Students who were incarcerated after exiting the program
• Students who were employed outside of Texas
• Students who were truly unemployed and not pursuing higher education
Note: Texas community and technical colleges located near the border may have high percentages
of their former students in the Students Not Found category.
TOTAL FORMER STUDENTS- Unduplicated number of students reported to the THECB as either enrolled
during a given academic year but did not return to the same institution in the following Fall, or graduated
during that same academic year.
Definitions for the Exit Cohort Reports
All Working:
All participants found working in the TWC UI wage records, military, USPS (US Postal Service) and OPM
(Office of Personnel Management) databases, including participants who are both working and going to school.
All Enrolled:
All participants found enrolled in a higher education institution, including participants who are both working
and going to school.
Enrolled Only:
Participants found enrolled in a higher education institution but not working.
Graduates:
Graduates during 20XX-20YY academic year.
Graduate Completers:
Students graduating from a graduate program during the 20XX-20YY academic year (for universities and HRIs only)
Leavers:
Non-returning students who were enrolled during Fall 20XX, Spring and Summer (I or II) 20YY and did not enroll in
Fall 20YY at the same institution.
Not Located:
Participants not found in the employment records or in THECB database.
Undergraduate Completers:
Students graduating in an undergraduate program during 20XX-20YY academic year.
Working Only:
All participants that are found working in the TWC UI wage records, military, USPS (US Postal Service) and OPM
(Office of Personnel Management) databases but not enrolled in higher education.
*:
Data suppressed due to low cell value
Methodology for the Exit Cohort Reports
Cohort:
All individuals enrolled in Texas public institutions during FY 20YY
Employment Records:
The records were matched with TWC UI wage records and national databases including the Office of Personnel
Management, the US Postal Service and military records from the Department of Defense. Employment was determined
if the former student was found working in the 4th quarter of 20YY according to the records mentioned.
Enrollment Records:
The records were matched with public and private postsecondary enrollment records for predetermined periods
of analysis. Pursuit of additional higher education was documented if the former student was found enrolled in
the fall semester 20YY.
Industries for Employment:
It shows the industries in which the individual was found working. No information about the type of work the
individual does is available.
Higher Education Institutions for Enrollment:
Institution where students enrolled in the following fall (Fall 20YY) after they left the original institution.
CIP Areas for Enrollment:
Majors declared by students when they enroll in the following fall (Fall 20YY)
KPI # 3: Average First Year Salary of Former Students Found Working (including Leavers)
Data Definition:
Salary results are available through Texas Higher Education Coordinating Board (THECB)
Automated Student and Adult Learner Follow-Up System (ASALFS) Exit Cohort Reports, which link
all students who were at the college the previous fiscal year and who were not present the
following fiscal year whether due to graduation, transfer, or other reasons. Included in the
average salary calculation are students who were found working in the Texas Workforce
Commission (TWC) UI wage records and in national databases including the Office of Personnel
Management, US Post Office, and military records from the Department of Defense, in October,
November, and/or December of the fiscal year following their exit from TSTC. Includes former
students found to be “working only” (not enrolled in higher education in Texas).
Data Sources:
THECB Automated Student and Adult Learner Follow-Up System (ASALFS) Exit Cohort Reports – Summary by Cohort Type,
Level of Award (Page 3 or 4, varies by report year).
Targeted Annual Growth Rate 2016 to 2020: 3.71%
Method of Calculation:
Mean earnings based on 4th quarter TWC wage data following year of graduation or exit from TSTC, extrapolated
to a full year.
FY 2020 Performance Target: $30,000
Data Limitations:
This KPI measures former students’ salaries within one month to thirteen months following their
departure from TSTC and includes “leavers” as well as graduates, and is based on only one
quarter of UI wage and additional database data. Salaries are thus lower than what may be
discovered through a process which measures salaries for a longer time after student departure.
A student leaving TSTC in August will be included for what they earned in the fourth calendar
quarter of that year, regardless of whether their employment was for part-time or full-time
work, and whether the former student was employed for one month of the fourth quarter or all
three months of the fourth quarter. “Leaver” salaries also further dilute the average salaries,
resulting in only moderate average earnings per “former student.”
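The annualization step described above can be illustrated with a brief sketch; the wage amounts below are
hypothetical and the variable names are illustrative only, not part of any TSTC system:

    # Hypothetical 4th-quarter TWC UI wage amounts for former students found "working only"
    fourth_quarter_wages = [6800.00, 7450.00, 8120.00, 5900.00]

    # Extrapolate each single-quarter amount to a full year (x 4), then take the mean
    annualized = [wage * 4 for wage in fourth_quarter_wages]
    average_first_year_salary = sum(annualized) / len(annualized)

    print(round(average_first_year_salary, 2))  # 28270.0 for these hypothetical figures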
KPI # 4: Number of Annual Graduation Awards
Data Definition:
Graduation data is reported to THECB on the CB-M009 report and this measure reflects the total
number of degrees and certificates awarded during the fiscal year. This is a count of awards
rather than graduates. A graduate may earn more than one award in an academic year, and this
measure counts each award earned during that year.
Data Sources:
Certified CB-M009 reports
Targeted Annual Growth Rate 2016 to 2020: 4.82%
Method of Calculation:
The total number of graduation awards reported on the CB-M009 report is summed.
FY 2020 Performance Target: 3,474
KPI # 5: Number of Students Graduated Annually
Data Definition:
From the CB-M009 report, this measure counts the number of students who graduated in a fiscal
year so that a graduate is counted only one time for the year, regardless of the number of
awards received.
Data Sources:
Certified CB-M009 report
Targeted Annual Growth Rate 2016 to 2020: 4.82%
Method of Calculation:
The number of unduplicated graduates reported on the CB-M009 is summed. A graduate is
counted one time, regardless of the number of awards received.
FY 2020 Performance Target: 2,863
KPI # 6: Percent of first-time, Full-time Degree or Certificate-seeking Students Graduated within 3 years
Data Definition:
First-time, Full-time Degree or Certificate-seeking Students who Graduated Within Three Years with Either an AAS
Degree or Certificate - Students who (1) enrolled at the specific TSTC college for the first time, (2) were enrolled for
12 or more semester credit hours as of the official census date in their first semester of enrollment, (3) declared
they were seeking either an Associate of Applied Science (AAS) degree or a vocational/technical certificate in their
first semester of enrollment, and (4) graduated within three years.
Data Sources:
Colleague Database. This is also a Legislative Budget Board (LBB) measure.
Targeted Annual Growth Rate 2016 to 2020: from 30.0% to 32.5%
Method of Calculation:
This measure is calculated by assigning students who meet the criteria as "first-time, full-time degree or certificate-seeking students" to a term- and location-specific cohort. Each term's cohort is tracked for three years and the
status of each student is determined as of the end of the three-year period--either certified as having graduated or
not certified as having graduated. The number of students certified as having graduated and the total number of
students in the cohort are summed for each of the three cohorts whose third year ends in the fiscal year being
reported. Then, the total annual number of students certified as having graduated is expressed as a percentage of
the total number of students in the annual cohort.
FY 2020 Performance Target: 32.5%
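A simplified sketch of the cohort calculation described above, using hypothetical cohort counts rather than
actual Colleague data:

    # Hypothetical term cohorts whose third year ends in the fiscal year being reported:
    # (cohort label, students certified as graduated within 3 years, cohort size)
    cohorts = [
        ("Fall",   310, 980),
        ("Spring",  95, 340),
        ("Summer",  20,  80),
    ]

    graduated = sum(g for _, g, _ in cohorts)
    cohort_size = sum(n for _, _, n in cohorts)
    three_year_rate = graduated / cohort_size * 100

    print(f"{three_year_rate:.1f}%")  # 30.4% for these hypothetical figures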
KPI # 7: Average Student Satisfaction Index
Data Definition:
Students are surveyed to determine their satisfaction with all aspects of TSTC operations,
including facilities, student learning processes, student life, safety, and other aspects of the
student experience.
Data Sources:
Student Satisfaction Survey. Completed in odd-numbered fiscal years
Targeted Change, 2016 to 2020: from 3.77 in FY 13 to 3.85 in FY 19
Method of Calculation:
These surveys are scored on a 1 to 7 Likert scale, with a 7 representing total satisfaction.
Expected performance, based on past results, was set at an overall average score of 4.0, for
Point Scaling value of 5. Each campus’ average score is weighted by the number of students at
that campus. College-wide results are weighted by the number of respondents at each location.
FY 2020 Performance Target: 3.85 (in FY 2019)
KPI # 8: Student Ethnicity Compared to Statewide Population Ethnicity
Data Definition:
This measures the difference in percentage points of the TSTC student population against
statewide averages (all ages included).
Data Sources:
Certified CB-M001 reports to the Texas Higher Education Coordinating Board (THECB)
Expected Change 2016 to 2020: An increase from 11.5% to 12.5%
Method of Calculation:
Percentages by ethnicity are calculated for TSTC students and compared to statewide ethnicity
percentages. The difference is used to measure the differential.
FY 2020 Performance Target: 12.5%
KPI # 9: Funds Raised Annually
Data Definition:
Fund raising measures the total amount of donations, both monetary and non-monetary, at the
campuses and administration, including amounts donated to TSTC through The TSTC Foundation.
All amounts by category are summarized together.
Data Sources:
Internal records
Targeted Annual Growth Rate 2016 to 2020: 20.0% for monetary donations
Method of Calculation:
Sum of monetary donations for a fiscal year.
FY 2020 Performance Target: $3.3 million in monetary donations for FY 2020
Data Limitations:
Non-monetary donations for each fiscal year will be reported, but these donations do not have
an annual “goal” due to the unpredictable variability in their size and frequency.
KPI # 10: Annual Number of Donors
Data Definition:
This measure provides the annual unduplicated count of donors of both monetary and non-monetary contributions for the fiscal year.
Data Sources:
Internal records of the TSTC Foundation.
Targeted Annual Growth Rate 2016 to 2020: 20%
Method of Calculation:
The total number of unique donors for a fiscal year is counted.
FY 2020 Performance Target: 498
KPI # 11: Employee Satisfaction Index
Data Definition:
Employees are surveyed to determine their satisfaction with all aspects of their TSTC
employment. These surveys are scored on a scale of 1 to 5, with 5 representing total
satisfaction.
Data Sources:
Employee Satisfaction Survey
Targeted Annual Growth Rate 2016 to 2020: From 3.70 to 3.90
Method of Calculation:
Each location’s average score is weighted by the number of employees at each location and then
averaged.
FY 2020 Performance Target: 3.9 on a 1 to 5 scale
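One reasonable reading of the weighting described above is an employee-count-weighted average of location
scores; a minimal sketch with hypothetical location scores and employee counts:

    # Hypothetical location average scores (1-5 scale) and employee counts
    locations = {
        "Location A": (3.9, 700),
        "Location B": (3.7, 600),
        "Location C": (3.8, 250),
        "Location D": (3.6, 150),
    }

    weighted_sum = sum(score * count for score, count in locations.values())
    total_employees = sum(count for _, count in locations.values())
    satisfaction_index = weighted_sum / total_employees

    print(round(satisfaction_index, 2))  # 3.79 for these hypothetical figures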
KPI # 12: Employee Ethnicity Compared to Statewide Population Ethnicity
Data Definition:
This measures the number of percentage points TSTC employee ethnicity percentages are away
from statewide ethnicity percentages. These results are based on the ethnicity of all employees
college-wide. All other categories are combined and measured against Anglo populations.
Data Sources:
HOD records of employee ethnicity as measured in July of each year.
Targeted Change, 2016 to 2020: from -9.8% to -9.0%
Method of Calculation:
Percentages by ethnicity are calculated for TSTC employees and compared to statewide ethnicity
percentages. The difference in these percentages is the differential.
FY 2020 Performance Target: -9.0%
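As a hypothetical worked example of the differential (the component percentages below are illustrative, not
actual figures): if the combined non-Anglo categories represent 62.0% of TSTC employees and 71.8% of the
statewide population, the differential is 62.0 - 71.8 = -9.8 percentage points, which matches the FY 2016
baseline value for this KPI.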
APPENDIX B: KEY PERFORMANCE INDICATORS – TARGETS FOR FY 2020
KPI # | KPI Title | FY 16 Est | FY 17 Est | FY 18 Est | FY 19 Est | FY 20 Target
1 | Annual Unduplicated Count of Semester Credit Hour Students | 16,571 | 17,370 | 18,207 | 19,085 | 20,004
2 | Annual Unduplicated Count of Former Students Found Working or Transferring to Another Public College or University in Texas | 6,968 | 7,304 | 7,656 | 8,025 | 8,412
3 | Average First Year Salary of Former Students Found Working (including Leavers) | $26,000 | $27,000 | $28,000 | $29,000 | $30,000
4 | Number of Annual Graduation Awards | 2,878 | 3,016 | 3,162 | 3,314 | 3,474
5 | Number of Students Graduated Annually | 2,371 | 2,486 | 2,605 | 2,731 | 2,863
6 | Percent of First-time, Full-time Degree or Certificate-seeking Students Graduated within 3 years | 30.00% | 30.00% | 31.00% | 32.00% | 32.50%
7 | Average Student Satisfaction Index | - | 3.80 | - | 3.85 | -
8 | Student Ethnicity Compared to Statewide Population Ethnicity | 11.75% | 12.00% | 12.25% | 12.50% | 12.50%
9 | Funds Raised Annually: Cash Raised during Fiscal Year | $1,594,800 | $1,913,760 | $2,296,512 | $2,755,814 | $3,306,977
9 | Funds Raised Annually: In-Kind Contributions during Fiscal Year | Varies | Varies | Varies | Varies | Varies
10 | Annual Number of Donors | 240 | 288 | 346 | 415 | 498
11 | Employee Satisfaction Index | 3.70 | 3.75 | 3.80 | 3.85 | 3.90
12 | Employee Ethnicity Compared to Statewide Population | -9.50% | -9.25% | -9.00% | -9.00% | -9.00%
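The intermediate estimates in the table follow the targeted compound growth rates defined in Appendix A; for
example, the Annual Number of Donors grows from 240 at 20% per year to 240 x (1.20)^4 ≈ 498 by FY 2020, and
Cash Raised grows from $1,594,800 to $1,594,800 x (1.20)^4 ≈ $3,306,977.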
APPENDIX E: ASSESSMENT TIMELINE OF ACTIVITIES
Biennial Assessment Cycle for Unit Action Plans
Timeline of Assessment Activities: 2015-17 Cycle
July/August (Summer 2015)
Close Prior Year Cycle
 Units gather with staff/faculty to close or update the 2014-15 cycle and plan for the upcoming assessment cycle:
- Review results of 2014-15 improvements (“Use of Results”) to determine impact on the outcome target.
- Document results/impact of improvement(s) implemented in 2014-15 in the “Follow-Up” tab on TracDat.
- Close the assessment cycle (Close the Loop).
Open 2015-17 Cycle
 Meet with program/unit faculty and staff colleagues to identify and document program outcomes for the new assessment cycle.
 Develop a curriculum map to align instruction with desired program outcomes (for instructional programs only).
 Set Achievement Targets or Criterion for Success.
 Indicate Planned Year as 2015-17.
 Indicate Start and End Date.
 Indicate Outcome Status as Active.
 Develop Assessment Methodology:
- Identify who will be responsible for conducting the assessment; what type of data will be collected; and when the data will be collected (Means of Assessment).
- Identify the Data Source for how the learning or practice will be assessed - tools, instruments, and/or sources for producing data (i.e., rubric, IE&R reports, THECB placement data, etc.).
- Identify strategies to be implemented to achieve the outcome target.

Sept - December (Fall 2015)
 Instruction and implementation occurs to facilitate the desired learning and/or operational goal.
 Collect artifacts (i.e., student projects, exams, skills demonstrations, retention and placement data, etc.) for production of data.
 Create a spreadsheet of artifact data to aggregate interim results (i.e., average scores of a student sample, total enrollment into the program for the fall 2015 term, % of users indicating satisfaction with services, etc.).
 Review/analyze results to determine progress toward the desired target.

Jan - May (Spring 2016)
 Instruction and implementation to facilitate the desired learning and/or operational goal continues.
 Collect artifacts (i.e., student projects, exams, skills demonstrations, retention and placement data, etc.) for production of data.
 Create a spreadsheet of artifact data to aggregate interim results (i.e., average scores of a student sample, total enrollment into the program for the spring 2016 term, % of users indicating satisfaction with services, etc.).
 Review/analyze results to determine progress toward the desired target.

June - Aug (Summer 2016)
 Review/analyze fall 2015 and/or spring 2016 results.
 Faculty/staff gather to determine and document planned improvements or Use of Results (in future tense) based on information obtained from analysis of the data, including:
- Description of the improvement(s) to be implemented;
- Courses/programming to be impacted by the proposed improvements;
- Artifact/data piece to be used for measuring the impact of the proposed improvements;
- Timeline and persons responsible for implementing the proposed improvements.

Sept - December (Fall 2016)
 Implementation of proposed improvements as documented in Use of Results.
 Collect artifact(s) for systematic measurement of implemented improvements.
 Create a spreadsheet of artifact data to aggregate summative results.
 Review/analyze results to determine whether the desired target was met or not met.

Jan - May (Spring 2017)
 Implementation of proposed improvements as documented in Use of Results continues.
 Collect artifact(s) for systematic measurement of implemented improvements.
 Create a spreadsheet of artifact data to aggregate summative results.
 Review/analyze results to determine whether the desired target was met or not met.

June - Aug (Summer 2017)
 Gather faculty to review findings on the impact of the improvement and plan for the new assessment cycle.
 Document results/impact of improvement(s) implemented in the “Follow-Up” tab (Close the Loop) on TracDat.
 Begin planning for the 2017-19 assessment cycle based on data/results from the recently completed assessment.
 Open the 2017-19 assessment cycle.
APPENDIX F: DEVELOPING STUDENT LEARNING OUTCOME(S) ASSESSMENT PLANS
Stating Program (Student Learning) Outcomes
The first step in developing intended outcome statements is to determine what your program/unit
wants to know about student learning or your unit’s programs and services. When writing outcomes,
keep in mind that they should be detailed, specific, measurable or identifiable, and personally
meaningful statements that derive from the program's or unit's goals and articulate the intended
end result of a unit, program, course, activity, or process.
Specific things to consider when devising outcome(s) statements (first for student learning outcomes
and then for administrative outcomes) follow below:
Student Learning Outcomes Guiding Questions:
 What skills/abilities/values do you think students should have as they approach graduation from their program of study? What will students be able to do when they “know”?
 What quality of work will students produce when they “know”?
 How will students behave when they “appreciate” or “value” something?
 What evidence can be used to measure that students are doing what we expect them to be able to do? How can the quality of student work or behavior be confirmed?
 Will data be obtained that can be used to improve programs and services?
A good approach is to focus on the end result of your teaching. How will you know that the students
have learned what you want from them? What does it look like? How will you identify it? Use simple,
specific action verbs to describe what students are expected to demonstrate upon completion of a
program. Examples include (adapted from Bresciani, et al., 2002):
 Students will be able to {action verbs to describe knowledge, skills, or attitude}
 Action verbs - Concrete verbs such as "define," "apply," or "analyze" are more helpful for assessment than verbs such as "be exposed to," "understand," "know," or "be familiar with." Below are examples of action verbs that can be used to measure learning competencies:
Common Action Verbs for Writing Outcomes

Cognitive Learning
Knowledge (to recall or remember facts without necessarily understanding them): articulate, define, indicate, name, order, recognize, relate, recall, reproduce, list, tell, describe, identify, show, label, tabulate, quote
Comprehension (to understand and interpret learned information): classify, describe, discuss, explain, express, interpret, contrast, associate, differentiate, extend, translate, review, suggest, restate
Application (to put ideas and concepts to work in solving problems): apply, compute, give examples, investigate, experiment, solve, choose, predict, translate, employ, operate, practice, schedule
Analysis (to break information into its components to see interrelationships): analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, distinguish, examine, investigate, interpret
Synthesis (to use creativity to compose and design something original): arrange, assemble, collect, compose, construct, create, design, formulate, manage, organize, plan, prepare, propose, set up
Evaluation (to judge the value of information based on established criteria): appraise, assess, defend, judge, predict, rate, support, evaluate, recommend, convince, conclude, compare, summarize

Affective Learning
appreciate, accept, attempt, challenge, defend, dispute, join, judge, praise, question, share, support
Examples of a Student Learning Outcome:
 Graduates will be able to collect and organize appropriate clinical data, apply principles of evidence-based medicine to determine clinical diagnoses, and formulate and implement acceptable treatment modalities.
 Graduates will be able to critically analyze and evaluate current research.
 Students will be able to participate effectively as a member of a task-oriented team.
 Students will be able to examine rhetorical phenomena from multiple theoretical perspectives.
Once you have identified the intended outcomes, write a formal learning outcome statement. The
key is to make sure the statement is S.M.A.R.T.
SMART Outcome Objectives
S - Specific: Outcome is focused on a specific category of student learning. If it is too broad it will be difficult to measure.
M - Measurable: Data can be collected to measure student learning.
A - Attainable: The outcome is attainable given the educational experience.
R - Results-Focused: The program outcome is aligned with Divisional Student Learning Outcomes.
T - Tailored: Outcome is specifically tailored to the program. Use the following statement to get started:
As a result of participating in (program or experience), students should be able to (action verb), defined in explicit and observable terms.
It is advisable to limit student learning outcomes to three to six outcomes. Focus on the most
important goals of the program. A helpful and frequently used resource when writing student learning
outcomes is Bloom's Taxonomy of Cognitive Skills.
Curriculum Mapping
Curriculum mapping is a highly valuable process in assessment of student learning. In many cases, it
may be the first time a curriculum is systematically examined to see how the individual courses function
in the curriculum. In a best practice environment, program curricula are developed after the
identification of the program-level student learning outcomes, and courses would be designed to foster
the achievement of those outcomes.
Curriculum mapping is a method to align instruction with desired goals and program outcomes. It can
also be used to explore what is taught and how. Mapping is designed to document what courses are
taught and when, reveal gaps in the curriculum, and help design an assessment plan. It improves
communication among faculty about curriculum, promotes program coherence, increases the
likelihood that students achieve program-level outcomes and encourages reflective practice.
Curriculum maps can be created in TracDat for each student learning outcome. The curriculum map is created by setting up a table with one row for each program learning outcome and one column for each course in the program that is being assessed. Once the chart is established, department personnel enter an indicator of level for each learning outcome and course/experience:
 "I" indicates students are introduced to the outcome.
 "R" indicates the outcome is reinforced and students are afforded opportunities to practice.
 "A" indicates where evidence might be collected and evaluated for program-level assessment.
Collection might occur at the beginning and end of the program if comparisons across years are
desired. It is important that all program-level outcomes have at least one “A” as each needs to be
assessed. Not every outcome is assessed every semester; the timeline for collection is indicated on the
assessment plan. Program faculty then analyze, discuss, and revise the curriculum map, as needed.
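For illustration only, a minimal curriculum map for a hypothetical program with three outcomes and four
courses (the outcomes and course names are placeholders, not actual TSTC curricula) might look like this:

    Program Outcome            Course 1   Course 2   Course 3   Capstone
    Outcome 1 (analyze)           I          R                     A
    Outcome 2 (construct)         I                     R          A
    Outcome 3 (communicate)                  I          R          A

In this sketch each outcome is introduced (I) and reinforced (R) at least once, and every outcome carries an
"A" in the capstone, where program-level evidence would be collected.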
Measuring Program Outcomes
Measurable outcomes have a numerical, quantitative rating whenever possible. If the student learning
outcome statement “increases” or “decreases” something, measurable outcomes define how much
change or improvement is expected. It is also important to identify where each instructor/program
chair will extract that number, piece of data or statistics to demonstrate the achievement of
the desired outcome. The assessment methodology and criterion identifies the data that will be used
to demonstrate that something occurred as a result of implementation of a student learning
outcome. When data sources are kept continuous, they provide direct evidence of “closing the loop”
when improvement is demonstrated.
Setting Achievement Targets
The criterion for success is the benchmark or target that serves as an indicator for accomplishment.
This serves as a “point of reference” by which programs/units determine whether learning and services,
for instance, occurred at the desired proficiency rate or had the desired impact. Criteria for success
should be concrete levels of achievement based on the measures employed through the assessment
tool. These targets are established during the planning process of Phase I (i.e. prior to implementation
of the action steps).
Units should consider the use of primary and secondary criteria for success (Nichols & Nichols, 2000).
The primary criteria for success are the overall levels of success and the secondary criteria are the
detailed levels of success that contribute to the overall level of success. For instance, an administrative
unit may establish a criterion of success for client satisfaction and then establish additional criteria to
address each element contributing to overall success, such as customer service, quality of
work/product, timeliness of service, etc. An academic unit may also use primary and secondary criteria
for success.
As an example, consider the assessment of a written composition. The primary criterion may be that
90% of students are able to compose a ‘Good’ to ‘Excellent’ essay. The secondary criteria may be that
85% of students are to perform at a level of ‘Good’ to ‘Excellent’ in each area of a composition including
composition, grammar, use of references, etc.
Select Appropriate Assessment Methods
Assessment methods outline what data will be collected, from what sources, using what methods, by
whom, and in what approximate timeframe. The following is a list of guiding questions to use when
determining the assessment methodology (Hoy, 2006):

What type of data will provide the needed information?

What type of research design and sample will best provide this information? What data displays,
comparisons or statistics will be appropriate?

How often will this information be collected and who will collect it?
An important point to consider when devising the methodology for assessment is that the link between
the assessment method and the learning or administrative outcomes must be logical. For instance,
assessing public speaking ability through the use of a multiple-choice test would be problematic and
essentially misaligned.
Too often, an assessment method is devised without much consideration as to whether or not the
method is appropriate. For instance, a nationally normed standardized test might be a relatively easy
way to obtain data, but if the test doesn’t assess the outcomes of the program, it is not going to offer
much insight into whether or not specific program outcomes have been achieved. However, line-item as
opposed to just aggregate student performance data may be available from testing companies (usually
for an additional fee), allowing for additional analyses focused on the outcomes specific to the program.
It is important to keep in mind that assessment of program-level student learning outcomes should be
based upon a direct examination of student work. Overall, the selected methodology has to meet each
of these essential points (as applicable):
1. Assessment methodology must evaluate the extent to which the Intended Outcome is achieved,
not whether the Action Steps were or were not completed.
2. Assessment methodology intended to measure student learning must examine a student work
product (student performance in response to a specific project, assignment, knowledge test,
etc.). This is referred to as using a direct measure of student learning. Indirect measures can be
used to support the data derived from the direct measure. Examples of direct and indirect types
of assessment methods are listed below.
3. Assessment methodology for learning outcomes must utilize objective information (based on
direct measures) and not rely solely on self-reports or other subjective information, which are
referred to as indirect measures. If indirect measures are used to measure student learning, then
they must be accompanied by a direct measure. Indirect measures can be used as the sole
measure for administrative outcomes.
Examples of Direct Assessment
1. Project-embedded assessment
2. Observations of behavior
3. External evaluations
4. Document analysis (e.g. meeting notes, policies, handbooks, etc.)
5. Performance on a case study/problem
6. Performance on problem and analysis
7. Exams (national; pre-test/post-test, licensure, etc.)
8. Portfolios
9. Grading Rubrics
10. Research Papers, thesis and dissertations
11. Oral examinations or presentations
Examples of Indirect Assessment
• Alumni, Employer, Student Surveys
• Focus groups (depending on the interview protocol, this could be used as direct evidence)
• Exit interviews
• Job placement statistics
• Enrollment trends
• Survey of Attitudes and Behaviors
• Graduation and retention rates
• Graduate follow-up studies
• Curriculum and syllabus analysis
Because measurement in education is not an exact science, it is a good idea to identify more than one
method of assessment for each intended outcome. Finding that two assessment methods produce
similarly positive (or negative) results lends some degree of validity to the results and conclusions
drawn from those results. This is especially true when using indirect methods. Using multiple measures
to increase the degree of validity is referred to as triangulation of data.
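As a simple illustration of triangulation, the Python sketch below compares the conclusion drawn from a direct measure with that of an indirect measure for the same hypothetical outcome. The measures, targets, and values shown are assumptions used only to show the idea of corroborating evidence.

```python
# Hypothetical triangulation check: do a direct and an indirect measure of the
# same outcome point to the same conclusion? Values and targets are illustrative.

measures = {
    "direct (rubric-scored project)": {"result": 87.0, "target": 90.0},
    "indirect (graduate exit survey)": {"result": 72.0, "target": 80.0},
}

verdicts = {name: m["result"] >= m["target"] for name, m in measures.items()}
for name, met in verdicts.items():
    print(f"{name}: target {'met' if met else 'not met'}")

if len(set(verdicts.values())) == 1:
    print("Measures agree, lending more confidence to the conclusion.")
else:
    print("Measures disagree; examine the methods and data before drawing conclusions.")
```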
Assessment Methodology Example 1:
SLO:
Upon graduation, English majors will demonstrate the ability to analyze a text critically.
Direct Assessment Method: Senior students taking ENGL 43XX will write a paper critically
analyzing a selected text. An external expert (or experts) will review a sample of student
work by utilizing a departmentally developed rubric, which identifies students’ areas of
strengths and weaknesses. Department faculty will review the resulting reports to
determine how many students met the criteria for success. For the outcome to be
considered achieved, 90% of papers assessed must score “Proficient” or “Distinguished”
in at least three of the four elements of critical analysis.
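The criterion in Example 1 combines two thresholds: a paper must earn “Proficient” or “Distinguished” in at least three of the four elements, and 90% of the papers assessed must do so. The Python sketch below illustrates that logic only; the element names and scores are assumptions, not part of the departmental rubric described above.

```python
# Hypothetical sketch of the Example 1 criterion: a paper counts toward the
# target if at least 3 of its 4 critical-analysis elements are rated
# 'Proficient' or 'Distinguished'; the outcome is achieved if 90% of papers do.

PASSING = {"Proficient", "Distinguished"}
ELEMENTS = ["thesis", "evidence", "organization", "mechanics"]  # assumed labels

papers = [
    {"thesis": "Distinguished", "evidence": "Proficient",
     "organization": "Developing", "mechanics": "Proficient"},
    {"thesis": "Proficient", "evidence": "Developing",
     "organization": "Developing", "mechanics": "Proficient"},
]

def meets_criterion(scores):
    """A paper passes if at least three of the four elements are rated passing."""
    return sum(scores[e] in PASSING for e in ELEMENTS) >= 3

passing_rate = 100.0 * sum(meets_criterion(p) for p in papers) / len(papers)
achieved = passing_rate >= 90
print(f"{passing_rate:.1f}% of papers met the criterion; "
      f"outcome {'achieved' if achieved else 'not achieved'} at the 90% target")
```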
Assessment Methodology Example 2:
SLO:
Graduating seniors in the Department of Modern Languages will demonstrate proficiency
in a foreign language and cultural awareness associated with that language.
Direct Assessment Method:
100% of graduating seniors in Modern Languages will earn a passing grade on a
departmentally developed exit exam covering written, oral, and cultural awareness
competencies. Departmental personnel will administer the exam during the students’
final semester. Departmental faculty will review test results to determine any needed
curricular changes.
Indirect Assessment Method:
All students enrolled in the senior level capstone course will maintain weekly journals. Journal
entries will consist of student reactions to course content and student perceptions of progress
toward course objectives. Instructors will examine the student journals for evidence of language
proficiency and cultural awareness. Departmental faculty will review the instructors’ findings to
determine curricular strengths and weaknesses.
Relationship between Assessment and Grading
Assessment is not a substitute for grading and grading is not a substitute for assessment. While the
processes each focus on the evaluation of student work products with the goal of improved student
learning, each process is a means to a different end. Grading is aimed at evaluating individual student
performance while assessment is aimed at improving the overall learning process for students. Further,
a course grade is an overall evaluation of a student’s performance in a course that comprises
multiple learning outcomes. Outcome assessment is focused on evaluating a single outcome at a time.
Therefore, course grades should not be used as the criteria for success in outcomes assessment.
Course grades do not provide the level of detail necessary to discern patterns of strengths and
weaknesses in student performance. Course assignments, also referred to as embedded course work
(homework assignments, tests, quizzes, in-class exercises, etc.), can be used for assessment. However,
consideration should be given to the grading method when the student work product will also be used
for assessment. It may be necessary to modify the grading method to make embedded course work
useful for assessment purposes.
Using Assessment Rubrics
Educational assessment rubrics are common tools for directly measuring student learning and
accountability. A rubric is “a scoring tool that lists the criteria for a piece of work…; it also articulates
degrees of quality for each criterion, from excellent to poor” or some such gradation (Goodrich, 1997,
p. 14).
As a type of criteria-based assessment and/or grading, rubrics are valuable educational tools that can
provide benefits to both faculty and students. When using a rubric for program assessment purposes,
faculty members apply the rubric to pieces of student work (e.g., reports, oral presentations, design
projects). To produce dependable scores, each faculty member needs to interpret the rubric in the same
way. The process of training faculty members to apply the rubric is called "norming." It's a way to
calibrate the faculty members so that scores are accurate. The OIRPE can provide support and assistance
in carrying out the norming process through calibration workshops and training. Appendix F of this
handbook includes sample rubrics for reference.
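Although the handbook does not prescribe a particular calibration statistic, one simple way to gauge how well raters are normed is the percentage of work samples on which their rubric ratings agree exactly. The Python sketch below is a minimal illustration using hypothetical ratings from two raters.

```python
# Hypothetical norming check: percentage of work samples on which two faculty
# raters assigned identical rubric ratings. Ratings below are illustrative only.

rater_a = ["Proficient", "Distinguished", "Developing", "Proficient", "Proficient"]
rater_b = ["Proficient", "Proficient",    "Developing", "Proficient", "Distinguished"]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
agreement_rate = 100.0 * agreements / len(rater_a)

print(f"Exact agreement: {agreement_rate:.0f}% "
      f"({agreements} of {len(rater_a)} samples scored identically)")
# A low agreement rate suggests the rubric language needs further discussion and
# the raters should rescore anchor samples before scoring the full set.
```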
Assessment Results and Analysis
Using the Results – Action Plan for Improvement
The primary benefit of engaging in assessment is the focused discussion it generates about student
achievement of the program’s learning/administrative outcomes. It is not uncommon, however, for
data to be collected only to be forgotten thereafter. It is not until the data has been analyzed,
discussed, and used as a basis for further program/unit improvement that true assessment has taken
place.
Improvement is at the heart of assessment. The need for improvement is revealed generally by the
assessment results, but may be discovered during the assessment process itself. Hence, the use of
assessment results is paramount in the assessment process. TSTC expects that results of assessment
and subsequent interpretation will be disseminated to members of the unit. Further, it is hoped that the
unit will elicit feedback from its members, especially in cases where improvement is warranted.
Action Plan for Improvement
Action plans are actions to be taken to improve student learning or enhance the program based on the
analysis of results. Action plans are developed by reflecting on the question: “What can we do to help
students master the learning outcome more proficiently?” or “What can we do to enhance the
program?”
The actions should be specific and directly related to an outcome or several outcomes. For
instance, if a program was unsatisfied with students’ ability to apply mathematical concepts to a
discipline-specific problem, a course assignment might be modified or added to require more
mathematical skill. Action plans might address the assessment process as well. For instance, questions
on an exit survey may be revised in order to obtain more useful information.
The number of action plans a program creates should be realistic and manageable, even if this
means that not all of the outcomes are addressed every year. Each action plan should include
a target completion date and the group or person responsible for implementation.
Examples of improvement actions:
• Revise the curriculum for the capstone course
• Change course requirements for the major
• Expand mentoring program
Closing the Loop
One of the greatest challenges for most programs/units in the assessment process is “closing the loop.”
While many are successful in collecting sizeable amounts of data, many struggle to document with
evidence that their assessment data has been used as a catalyst for change.
Part of the reason for this is that, in many cases, the program already takes student performance into
account and adjusts the curriculum, teaching methods, services, etc., accordingly; the problem is that
these adjustments simply are not documented. Closing the loop in assessment requires that any
programmatic, policy, and/or curricular changes be directly linked to systematic assessment of student
learning and administrative outcomes. It is also important for programs/units to provide evidence of
re-assessment, specifically that improvements had the expected impact.
Closing the Loop can take many forms, as described below (adapted from Hatfield, 2009, IDEA Paper No.
45):
Policies, Practices and Procedures
Faculty may choose to close the loop by revisiting program policies, practices, or procedures. For
example, you may consider revising the criteria for admission to the program (for example, GPA, or
prerequisite courses or experience). You might also develop learning supports for students who are
recognized to be struggling in the program. Implementing annual sophomore, junior, and senior reviews
is another step some programs have taken to formalize formative assessment in their programs. Other
options include reviewing the role of student advising, and creating a system for tracking student
progress.
Curricular Reform
Revising the curriculum might be necessary if gaps are found between desired and actual student
performance. Additional coursework in a specific area might be required to remedy consistent
deficiencies in student performance. New technologies, theories, or techniques in the field might also
require changing the course structure of the program, as well as changing the program mission,
emphasis, or outcomes. A specific course (e.g., ethics, diversity, critical thinking) might be replaced with
an across-the-program initiative, or vice versa.
Learning Opportunities
If assessment results indicate that students are not demonstrating learning at a desired level, it might
be worth rethinking strategies — both inside and outside the classroom — to facilitate student learning.
Chickering and Gamson’s (1987) Seven Principles for Good Practice in Undergraduate Education
provides a practical and portable list, detailed below, of opportunities to facilitate learning.
1. Student-Faculty Contact: Contact with faculty serves to engage students in the learning process.
Maximizing opportunities for this contact — both inside and outside of the classroom, in face-to-face
settings or electronically — can facilitate student learning.
2. Active Learning: For students to truly learn, they must process information and concepts and
integrate them into their own experience. This seldom happens by just taking notes in a lecture hall.
Activities in class, as well as direction for how to investigate topics outside of class, can help
students understand and integrate new information with their existing frames of reference.
3. Cooperative Learning: Professors aren’t the only people in the classroom who can teach.
Students can learn from each other in both structured and unstructured settings. But instructors
need to help students learn how to work this way. You should be especially careful when asking
students to critique each other’s work. Students need to develop the requisite knowledge base
themselves before they can be responsible for critiquing the work of others.
4. Prompt Feedback: Feedback is an important part of the learning process: guiding, directing, and
suggesting — in addition to evaluating. Feedback aimed at helping students improve their
learning is more useful to the student than feedback that merely justifies the assignment of the
grade. While providing feedback promptly should be a goal, it needs to be balanced with the
desire to offer support, direction, and suggestion.
5. Time on Task: Many students come to college lacking the time-management skills necessary to
succeed. Faculty who can guide students in using their time outside of class effectively
can positively impact student learning. This is especially true in large assignments, where
students may lack the knowledge of how to break down the task into smaller activities to achieve
the goal.
6. High Expectations: Often students do not achieve learning goals because they are unsure about
what those goals are. Being clear and explicit about expectations — and setting challenging but
realistic expectations — can motivate students to succeed. Even an act as simple as providing the
evaluation rubric along with the assignment can help students focus their effort and energies,
often producing higher quality learning.
7. Respect for Diverse Talents and Ways of Learning: Students learn — and demonstrate their
learning — in different ways. Try structuring curricula, assignments, and learning opportunities
so that students can learn, and demonstrate that learning, in ways that are most natural for
them. This will engage students and foster learning.
APPENDIX G:
DEVELOPING ADMINISTRATIVE OUTCOME(S) ASSESSMENT PLANS
Stating Administrative Outcomes
Administrative Outcomes are operational and specific statements derived from a unit’s core functions
that describe the desired quality of key services within an administrative unit and define exactly what
the services should promote (modified from Selim et al., 2005b, p. 19).
Stating administrative outcomes correctly is essential. Several details must be planned for each
administrative outcome before it is created:
• Administrative Outcome Statement
• Expected Outcomes/Results
• Expected Measurement/Means of Assessment
• Funding (if applicable)
Administrative Outcomes should be stated in general terms as what will happen by the end of the
assessment cycle. When stating an administrative outcome, the college tends to encourage
supervisors to “increase,” “decrease,” “enhance,” or “strengthen” something. Doing so makes each
administrative outcome much easier to state and measure.
Administrative Outcome Guiding Questions:
• What types of things is your unit striving for?
• In what direction do you want your unit to move?
• What would you like to accomplish during the upcoming academic year and why? In terms of
intended outcomes, what would the “perfect” unit look like?
• Will data be obtained that can be used to improve programs and services?
Oftentimes, administrative outcomes are written as the completion of a task or activity. While certainly
valuable, this is not meaningful assessment and does not provide information for improvement. To
accomplish the latter, administrators should assess the effectiveness of what they want their
department/unit to accomplish. Administrative outcomes, just like learning outcomes, should be
measurable, manageable, and meaningful.
Examples of Administrative Outcomes:
The department/unit will be able to {action verb to describe what it will do, achieve or accomplish}
• The Department of English will increase the diversity of its application pool by 50%.
• The Department of Residential Life will respond to maintenance requests within 24 hours of receipt.
• The Student Health Service will schedule students for check-ups within 24 hours of request.
• The Financial Aid Office will have full award letters out to on-time, complete applicants by April 15.
Measuring Administrative Outcomes
Measurable outcomes have a numerical, quantitative rating whenever possible. If the administrative
outcome statement “increases” or “decreases” something, measurable outcomes define how much
change or improvement is expected. It is also important to identify where each supervisor will extract
the number, piece of data, or statistic that demonstrates achievement of the desired outcome. The
assessment methodology and criterion identify the data that will be used to demonstrate that
something occurred as a result of implementing the administrative outcome. When data sources
are kept continuous, they provide direct evidence of “closing the loop” when improvement is
demonstrated.
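As an illustration, the Residential Life example above (“respond to maintenance requests within 24 hours of receipt”) could be checked against the unit’s request log. The Python sketch below is hypothetical; the log fields, timestamps, and the 95% target shown are assumptions used only to show how the criterion and its data source fit together.

```python
# Hypothetical sketch: measuring an administrative outcome (24-hour response
# to maintenance requests) from an assumed request log. All values are illustrative.

from datetime import datetime, timedelta

requests = [
    {"received": datetime(2015, 9, 1, 8, 0),  "responded": datetime(2015, 9, 1, 15, 30)},
    {"received": datetime(2015, 9, 2, 16, 0), "responded": datetime(2015, 9, 4, 9, 0)},
    {"received": datetime(2015, 9, 3, 10, 0), "responded": datetime(2015, 9, 3, 11, 45)},
]

TARGET_PCT = 95  # illustrative target; the unit would set its own expected level

within_24h = sum(
    (r["responded"] - r["received"]) <= timedelta(hours=24) for r in requests
)
rate = 100.0 * within_24h / len(requests)

print(f"{rate:.1f}% of requests answered within 24 hours "
      f"({within_24h} of {len(requests)}); "
      f"target of {TARGET_PCT}% {'met' if rate >= TARGET_PCT else 'not met'}")
```

Keeping the same log as the data source from year to year makes it straightforward to show, with the same calculation, whether an improvement was sustained.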
Funding for Administrative Outcomes
Many of the administrative outcomes created in the Plan or in annual assessment plans require no
funding to achieve. If an outcome requires only routine expenditures, such as the printing of a
publication or the mailing of something new, no special funding will be requested by the supervisor to
ensure fulfillment of the administrative outcome. If an outcome requires significant expenditures from
the department’s operating budget, or money above and beyond what is normally allocated to the
unit, this consideration is made during the creation of the Annual Operating Budget and should be
noted in the development of the Annual Assessment Plan.
Developing Strategies
Once an entire administrative outcome is planned, the strategies for that outcome must be specified.
Strategy development is very basic: supervisors list, in chronological order, all of the steps to be taken
to ensure that the overall administrative outcome is achieved. Strategies are specific activities.
Outcome Implementation
The basis of administrative assessment plans is continuous improvement and evaluation of new and
existing efforts. New program and department assessment data, cohort tracking data, and constituent
responses on numerous evaluations allow for a more effective method of documenting improvement
plans based on data and informed decision-making. This allows units and the institution overall to
“close the loop” by integrating planning, budgeting, evaluation, and assessment, thus promoting true
institutional effectiveness and more focused quality enhancement.
APPENDIX H: ASSESSMENT RUBRIC SAMPLE
TracDat Program Outcomes Review (SLO)

Program:
Outcome Number:
Year:
Chair:
Reviewers & Date:

For each criterion below, the reviewer indicates whether the standard is met (Y/N); suggested improvements must be recorded for any standard that is not met.

Purpose Statement
1a. Global: Statement of overall unit purpose within the context of the institution.
1b. Specific: Description of major services delivered or work processes owned by the unit.

Outcomes
2a. Outcome addresses student learning at the program level.
2b. Outcome identifies planned assessment years (if ongoing).
2c. Outcome has the correct start date (beginning of the cohort being assessed, i.e., August 29, 2013).
2d. Outcome statement articulates the knowledge, skills, and abilities students should have through engagement with and/or completion of the academic program.
2e. Outcome statement is observable and measurable and includes the appropriate use of action verbs.
2f. Outcome status is identified (Archived/Active).

Means of Assessment
3a. Assessment method(s) is/are appropriately explained and fit(s) the nature of the outcome.
3b. Assessment method category is appropriately selected according to the outcome domain (for each method if more than one).
3c. Criterion: Realistic and appropriate criteria/achievement targets are used for judging success based on assessment measures.
3d. Assessment instrument(s) are relevant and appropriate to the outcome and include at least one direct measure (i.e., exam, student project, field test, portfolio rubric, etc.).
3e. Assessment tools are uploaded or linked (i.e., rubric).
3f. Outcome, Method(s) of Assessment, and Criterion are aligned.
3g. Courses have been related and curriculum mapping completed (i.e., introduced, reinforced, assessed) and uploaded or linked.

Results & Analysis
4a. Documentation of results is provided/uploaded (i.e., data results spreadsheet with student identification omitted).
4b. Results directly relate to the outcomes and assessments used and align with the language of the corresponding criterion/achievement target.
4c. Provides solid evidence that the target(s) were met, partially met, or not met.
4d. Analysis of results is provided, including a discussion of the highest- and lowest-performing competency areas.
4e. ‘Results Type’ is appropriate to the documentation of results.
4f. Reporting period is correctly selected.

Use of Results
5a. Use of Results (improvements) states the planned changes/improvements expected to result in improved student learning.
5b. Use of Results is informed by the assessment results and directly states which finding(s) were used to develop the improvement plan.
5c. Changes proposed are realistic in scope and detailed enough to identify areas that need to be monitored, remediated, or enhanced, and define logical “next steps.”
5d. Use of Results contains expected completion dates and identifies a person/group responsible for implementation.

Follow-Up
6a. Follow-Up of Use of Results is documented accordingly.
6b. Follow-Up specifically states whether improvements were implemented and, if not, the program provides sufficient justification.
6c. Follow-Up identifies the extent to which the implemented improvement(s) had an impact on the learning outcome: was there measurable change as evidenced from the data?