Driving Towards Analytics

EXECUTIVE PERSPECTIVES ON THE IMPLEMENTATION AND ADOPTION OF ANALYTICS ON CAMPUS

Analytics – defined by EDUCAUSE as “the use of data, statistical analysis, and explanatory and predictive models to gain insights and act on complex issues” (Stiles 2012) – is one of the latest buzzwords in education. And it is not surprising why. Confronted with numerous challenges – budget shortfalls, rising tuition, changes in state support, increased competition – higher education institutions have been focusing efforts on analytics to improve results and their ability to meet institutional goals. But getting a campus to adopt and embrace these initiatives as a part of their daily routine is not as easy as it seems. In “Presidential Perspectives,” Inside Higher Ed’s Survey of College and University Presidents, presidents ranked “data analysis and managerial analytics” comparatively high at fifth out of 11 categories on their list of the most effective campus IT investments. But they rank their ability to “use data to aid and inform campus decisions” relatively low — at eighth out of 11.

[Figure: Inside Higher Ed — 2011 Survey of College and University Presidents (Green 2011, p. 16, p. 18). Two charts. Most effective campus IT investments: library resources; admin info systems; on-campus teaching; online/distance ed; data analysis and managerial analytics; student recruitment; academic support services; student resources and services; development efforts; research and scholarship; alumni activities. Presidents’ ratings of institutional effectiveness: managing financial resources; providing quality undergrad education; developing strong town-gown relationships; preparing students for future employment; recruiting/retaining talented faculty; offering support services for undergrads; building/maintaining political support; using data to aid and inform campus decisions; ensuring professional development; securing financial support from corporations; securing financial support from alumni.]
The goal of this report is to explore this disconnect and share the successful approaches as well as the lessons learned from higher education leaders with experience implementing and adopting analytics solutions. The information presented is based on interviews with 29 higher education leaders, representing 17 institutions in varying phases of deployment of Blackboard Analytics™. The group interviewed is representative of community colleges, and public and private universities. A full list of participants can be found at the end of this document.

The best practices and lessons learned that were identified during the interviews fall into three main categories. These categories provide the foundation for how the findings are presented going forward:

1 DEFINING THE ROLE OF EXECUTIVE LEADERSHIP
2 MANAGING THE CULTURAL SHIFT TO ANALYTICS
3 IMPLEMENTING AND ADOPTING A SOLUTION

This report identifies and consolidates key points for review and consideration during implementation and adoption of an analytics initiative. To help keep these front and center, a one-page summary checklist of the best practices accompanies this report.

1 DEFINING THE ROLE OF EXECUTIVE LEADERSHIP

It’s no surprise that good leadership is one of the major success factors of an analytics project — or any project for that matter. More specifically though, those interviewed indicated that a leader must be prepared to do the following: build executive consensus, communicate an analytics vision that translates into daily work process, create an institutional focus on analytics and remove any roadblocks.

Build executive consensus

14 of the 17 institutions stated that there must be a consensus among the institution’s executive leadership that analytics is the direction forward. An analytics solution touches so many areas of the institution that the executive administration must want to work collaboratively in support of an analytics solution in order to drive it forward. “You have to create synergy at the executive level, including the deans, to get things moving. This is a key success criterion,” shared Ahmed El-Haggan, vice president for information technology and chief information officer, Coppin State University.

The CFO, vice president for enrollment management and academic deans, in conjunction with the CIO, were the ones to drive analytics at Roosevelt University. “I first found the analytics solution and showed it to our technology steering committee comprising the CFO and academic deans, and they were the ones to really start pushing the idea on campus. Collaboratively we were able to get the others onboard as well. It was a real team effort,” explains Neeraj Kumar, chief information officer, Roosevelt University.

At Illinois Central College, the office of planning and effectiveness and information technology drove the process. “I knew that my Institutional Research team of three couldn’t keep up with the increasing requests of our 1,100 college employees. I brought the idea of analytics to my manager, the vice president of planning and effectiveness as well as to IT. Together we presented and sold the idea to our president,” shared Aimee Cook, director, institutional research, Illinois Central College.

SPOTLIGHT: University of Maryland, Baltimore County

“There was a period in time when I was glued to the hip with my IT analyst because I feared getting a data-related question from executive management that I couldn’t answer. Blackboard Analytics allows me to be more independent and more responsive with campus leadership, internal workgroups and committees. I no longer dread the phone calls/emails that come in on a daily basis. Before Blackboard Analytics, a significant amount of time had to be spent identifying what we believed to be the issues, making a list of the data needs and then reconvening when someone got the answers — this doesn’t have to be done any longer,” shares Yvette Mozie-Ross, associate provost for enrollment management, University of Maryland, Baltimore County.
Communicate a vision that translates into
daily work process
6 of the 17 institutions stated that the executive leader has to clearly and
specifically call out what an analytics solution will do for the institution
and make it a part of the daily work process — even that of the president.
This vision has to be more specific than just “to help improve data-driven
decision making on campus.” “The most successful institutions will be
the ones with not only strong executive buy-in, but also strong executive
involvement,” stated Chris Gill, chief information officer, Gonzaga University. Based on our interview responses, the two most common ways this was achieved were by the president 1) incorporating metrics into the strategic plan, which then trickles down to department plans and so on, and 2) insisting metrics be at the forefront when making decisions — especially at a time of restricted budgets.
Incorporating metrics into the strategic plan
Eight institutions interviewed listed making their strategic plan measurable as the initial vision for analytics. As Karl Burgher, chief strategy officer, Indiana State University, shares, “We do not make decisions without data.
Our strategic plan is measurable; all proposals require supporting data.
We do this to improve the quality of decisions made on campus and to
provide the data transparency that the taxpayers of the state deserve
given we are a public institution.”
This sentiment is echoed by Celeste Schwartz, vice president for information technology and college services, Montgomery County Community
College, who states, “Our focus is on the students with the goal of making
the best learning environment possible, and we use data to ensure we
are meeting this goal. Our strategic plan outlines metrics; each department has its own KPIs. This understanding has transcended through the
organization — IT, academic affairs, finance, marketing, etc., have all been
honing in on better understanding their data for the past ten years. Traction today on analytics is unbelievable.”
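
The cascade from strategic-plan goal to department KPI can be made concrete in a small data structure. Below is a minimal sketch of one way to represent it; the goal, metric names, targets and values are hypothetical illustrations, not figures from any institution interviewed.

from dataclasses import dataclass, field

@dataclass
class Metric:
    """A single measurable KPI tied to a strategic goal."""
    name: str
    target: float   # hypothetical target value
    actual: float   # latest measured value

    def on_track(self) -> bool:
        # Assumes higher is better; a real KPI catalog would record direction.
        return self.actual >= self.target

@dataclass
class Goal:
    """A strategic-plan goal that cascades into department KPIs."""
    statement: str
    owner: str                       # accountable department
    metrics: list[Metric] = field(default_factory=list)

# Hypothetical example: every goal carries at least one measurable metric.
retention = Goal(
    statement="Improve first-year student retention",
    owner="Enrollment Management",
    metrics=[
        Metric("fall-to-fall retention rate (%)", target=78.0, actual=74.5),
        Metric("students using academic support services (%)", target=40.0, actual=42.1),
    ],
)

for m in retention.metrics:
    status = "on track" if m.on_track() else "needs attention"
    print(f"{retention.owner} | {m.name}: {m.actual} vs target {m.target} ({status})")

A periodic departmental review then reduces to walking this structure and asking each owner to account for any metric that is off target.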
Ensuring metrics are front and center when making decisions
Seven institutions interviewed listed managing restricted budgets as the
driver for needing analytics. “Our provost, CFO, and enrollment services
VP are faced with managing demand and decreasing budgets. We need
to have the most accurate and timely information available to help us
manage our enrollment targets and adjust our budget and scheduling
assumptions accordingly. We need to be able to work smarter and faster,
even with fewer resources,” expressed Carl Whitman, associate vice president for information technology and chief information officer, California
State University, Stanislaus.
Create an institutional focus on analytics
Given that analytics is relatively new to higher education, nine of the institutions interviewed have found that creating either a committee or a new
role focused on analytics has helped jump start them into this arena.
Analytics steering committee
Six institutions interviewed formed a committee to focus on institutional analytics. “The president formed the strategic planning committee, focused on devising a three-year metric-based strategic plan from which each department will base their individual plans. We could always see how we were doing in any one year, but the focus was switched to success comparatively against many years. We had to demonstrate long-term success,” explains David Kim, chief information officer, Central Piedmont Community College.
New role created with a focus on analytics
Three institutions created a new role with the primary responsibility of
driving the use of metrics within the institution. For example, the president
of Indiana State University created the new role of chief strategy officer.
“It is my job to work in concert with the president, CIO and entire campus
to turn the strategic plan’s implementation into a data-driven exercise. We
have some 50 academic degree programs and departments on campus.
Each fall, they are now required to review and account for their department’s metrics. All metrics starting in the fall of 2012 will be accessed
primarily through dashboards driven by our analytics solution,” states Karl
Burgher, chief strategy officer, Indiana State University.
The president of Concordia University, Nebraska, created the new role
of director of strategic enrollment initiatives and research, and formed
the strategic information team. “Our mission was to ensure that every
goal has a measurable metric and to bring the right people together
from across the campus to make this happen,” explained Curt Sherman,
director of strategic enrollment initiatives and research, Concordia
University, Nebraska.
Remove roadblocks
Changes in process, approach and culture can lead to strong differences of opinion. When the implementation team hits an impasse, it is critical that someone with authority and credibility steps in to broker a final decision, thereby enabling the project to move forward. The interviewees identified that either a senior executive or the steering committee filled this role.
Executive leader helps remove roadblocks
Several institutions directed the implementation team to escalate any decisions unable to be resolved to the executive sponsor during the initial implementation phase. Karl Burgher, chief strategy officer, filled this role at Indiana State, as did Neeraj Kumar, chief information officer, Roosevelt University, and Ted Simpson, director of enterprise systems, Maryland Institute College of Art.

Steering committee helps remove roadblocks
Other institutions gave power to the analytics steering committee to facilitate any impasse, as they felt it important that a group representing a cross section of departments assist in making final decisions. This also helped mitigate risk since the decision was made by a group rather than an individual.

2 MANAGING THE CULTURAL SHIFT TO ANALYTICS

As John Fritz, assistant vice president for instructional technology and new media, University of Maryland, Baltimore County, articulates, “Analytics is a way of life, a way of thinking, a way of management.” This statement reflects the sentiments made by many of those interviewed as to the significant cultural change analytics represents. Chris Gill, chief information officer, Gonzaga University, supported this when he stated, “An analytics solution represents an extraordinary cultural change. No one understood the significance of this initially — it fundamentally changes how the campus interacts with data.”

Continuous, deliberate outreach to all levels

Those interviewed all agreed that one key to a successful implementation is continuous, deliberate outreach to all levels of the institution, with 4 of the 17 institutions specifically calling out the need for a top down, bottom up approach. This outreach is focused on getting solution buy-in while mitigating any concerns and objections. UMBC is one of the institutions that does this especially well. “Our analytics solution is one unified solution across the institution. It requires building consensus across different groups who potentially have sub-optimal solutions that they are more comfortable with, but which don’t help the university across the board. It took a lot to educate people as to why a data warehouse is so much better for the institution than writing specific, independent reports. We had to start from the top and come up from the bottom and meet in the middle. We couldn’t be doing what we are doing without the support from all levels of the institution, especially the president. It takes time and hard work but is well worth the effort,” explained Jack Suess, vice president of information technology and chief information officer, University of Maryland, Baltimore County.

Specific approaches to this outreach suggested by the interviewees included:

• Form a steering committee — involving a cross section of people in the decision-making process mitigates the sense of any one group making the decision for the rest of the institution. Additionally, the committee members will act as beacons of the solution as they go back to their departments and share success stories with their colleagues.

• Conduct a mix of broad and focused information sessions — broad sessions succeed when sharing one message with many in a short timeframe. Focused sessions reach fewer people at once but complement the broad sessions by creating an opportunity to speak to the needs and concerns of specific departments or institutional roles.

• Meet individually with key stakeholders — often there are one or more individuals critical to the success of a project. Their support and involvement are so important that individual meetings are a wise investment of time.

• Identify and implement the appropriate mix of communication and educational tools — such as websites, email updates and the creation of an analytics community to build awareness.
SPOTLIGHT: Central Piedmont Community College

Central Piedmont used many of the outlined approaches to help encourage a less enthusiastic department to become more vested in analytics. “We had one department that was slow to embrace analytics, but they were also one of the groups making the most report requests. We went to them at a time we knew was the slowest for them and started talking to their supervisor about assisting them in developing reports for their department. The approach was one of, ‘Let’s just try this and see if it meets your needs.’ We didn’t want to force them into it, as no one likes being told what to do, but at the same time the campus was well aware that the VPs are in full support of analytics,” explained David Kim, chief information officer, Central Piedmont Community College.

SPOTLIGHT: University of Michigan

“We brought together a group of researchers and staff from across schools and colleges. Highly engaged and enthusiastic, these participants were instrumental in helping to identify analytics needs, determine scope and set direction. They became the first group to use the business intelligence (BI) solution to manage their research budgets. This initial success led to additional BI solutions and the formation of a BI Community of Experts (BICE), which still periodically meets formally to share experiences and successes. To support BICE and offer BI information to the rest of campus, we created a website and an online newsletter to build awareness and engagement, and promote cross-campus dialogue and collaboration,” shared Holly Nielsen, interim executive director, application and information services, University of Michigan.

SPOTLIGHT: Coppin State University

To manage cultural change, Coppin State had a clearly articulated analytics vision that was shared by campus governance. “Our goal was to ensure there was information at all levels of the institution, from the highest to the lowest levels, so all decisions made are informed ones. The executive team was onboard and we formed a steering committee to help ensure this vision became reality,” stated Ahmed El-Haggan, vice president for information technology and chief information officer. Prasad Doddanna, director of information systems, added, “To help achieve this vision, we met with many areas of the campus to understand their individual data needs. The provost was interested in accreditation reporting and monitoring programs such as National Council for Accreditation of Teacher Education (NCATE), Commission on Accreditation for Health Informatics and Information Management Education (CAHIIM), Public School Administration, and later expanded to include first year experience and the summer success academy targeted at preparing students for a college career. Enrollment management was interested in managing the student life cycle; deans were interested in better managing their programs. In each of these cases, we explained how Analytics will help solve their data issues – and then we customized the reports and dashboards as needed to meet their specific needs.”

3 IMPLEMENTING AND ADOPTING A SOLUTION

Those interviewed noted the effort it took to prepare their institution for the analytics implementation and roll-out: defining the goals of the implementation, managing data issues, selecting the right tools and gaining adoption. While the actual technical implementation of Blackboard Analytics — getting the software installed and running with institutional data — only took a few weeks in most cases, there were many other steps required to gain adoption. The graph below charts the time from solution acquisition to initial adoption. Though this timeline — compared to the alternatives — is still quite short, many of the institutions interviewed felt it could be shortened even further had they known from the beginning what they learned throughout the process. They identified several best practices that fall into the following categories:
A Forming the project team
B Defining the need for analytics
C Validating the data
D Selecting the reporting tool(s)
E Determining the roll-out strategy
F Identifying ongoing support

[Chart: Time from acquisition to adoption, across the 17 institutions. 3-5 months: 3 (18%); 6-8 months: 2 (12%); 9-11 months: 6 (35%); 12-24 months: 3 (17%); more than 24 months: 2 (12%); N/A: 1 (6%).]
A Forming the project team

Interviewees identified the importance of 1) having a functional leader involved, 2) allocating a dedicated project manager and 3) clearly defining individual team members’ roles, responsibilities and timelines.

Ensure a functional department takes a leadership role
13 out of the 17 institutions interviewed had a functional person take the full leadership role or co-lead the team along with IT. Only four of the institutions interviewed had a representative from IT lead the team. In two of those four, though IT was the lead, the executive sponsor was from a functional group. The majority shared David Kim’s (chief information officer, Central Piedmont Community College) sentiment: “This cannot come from IT; we are just a provider of services.”

[Chart: Implementation team leader. Co-leads (IT and functional): 8; functional lead: 5; IT lead: 4.]

Allocate a dedicated project manager
8 of the 17 institutions had a dedicated project manager. And four institutions specifically stated that the lack of a dedicated project manager caused significant delays in their deployment, as called out by both Illinois Central College and Maryland Institute College of Art. “When I had the time to focus full time on analytics, the project moved forward; when I got sidetracked by other priorities, it stopped. There needed to be a dedicated project manager driving this initiative,” stated Aimee Cook, director, institutional research, Illinois Central College.

Define individual team members’ roles, responsibilities and timelines
8 of the 17 interviewed stated the importance of defining team roles, responsibilities and timelines upfront. Specifically, the team must:
• Be cross functional and willing to work collaboratively to implement, test, validate and deploy the solution
• Understand the institution’s functional needs and be able to translate them into technical requirements
• Understand the institution’s data
• Commit to their individual role and responsibility as a member of the team
• Commit to the project timeline and have managers who support this timeline by reprioritizing the team members’ other work when necessary
• Define the next step if a deadline is not met

SPOTLIGHT: Maryland Institute College of Art

“We initially tied analytics to a larger IT transformation project. We got the president and his cabinet excited about analytics and then weren’t able to deliver anything for some time since all of our resources were tied up in our SIS upgrade as well as other projects. It wasn’t until we made it a separate initiative that we finally started getting some real traction,” explained Ted Simpson, director of enterprise systems, Maryland Institute College of Art.
SPOTLIGHT: University of North Texas

“We created a steering committee which included the provost’s office, deans of the larger colleges, AVP of finance, AVP of academic affairs as well as others at a high level. This committee created a project charter including project objectives and timelines, and outlined all those going to be involved. All committee members formally signed the charter. I took this charter to the resources required and showed them that their manager committed them to work on this project for the defined duration of time,” stated Will Senn, director of decision support, division of finance, University of North Texas.
B Defining the need for analytics

9 of the 17 institutions interviewed expressed the importance of investing time up front to understand the specific initial institutional needs that phase one of an analytics solution is to solve.
As Chris Gill, chief information officer, Gonzaga University, states, “If an
organization has a vision of analytics early on in the project and can build a
sense as to what they want to accomplish before starting to implement, then
that is time well spent.” Those interviewed who took this approach answered
questions like:
• Who is going to use this solution?
• What do these users want to measure?
• What information is needed to measure this?
• Why do they want to measure this?
• How does the information need to be presented?
• How will the varying needs from different campus
departments be prioritized?
Gill went on to share how hard defining this vision can be — a sentiment shared
among many of the interviewees, as an analytics solution has the potential to
span the entire institution and impact so many areas. Aimee Cook expanded
on this by stating, “You have to recognize that institutional analytics is an evolution, not a discrete, finite project.” In turn, those interviewed provided three
suggestions on approaching a needs analysis:
Don’t attempt to solve all data issues at once. Rather, find a starting point; a win
“The challenge is not to solve all of the data issues at once. People are trying
to do too much rather than focusing on what most people need most of the
time,” explained Mary Byrkit applications manager, University of Michigan.
Examples of this approach include the following:
• California State University, Stanislaus, initially started using Blackboard Analytics to replace their existing daily operational reporting tool.
• University of Michigan first worked specifically with researchers to help them better manage their research project budgets.
• Coppin State University, Gonzaga University and Roosevelt University are using the out-of-the-box reports delivered in Blackboard Analytics as a starting point.

Seek out the group(s) most open to analytics
Find the department, committee or other entity that has information needs not currently being met or that understands and values the benefits of analytics for their group. This entity has the potential of becoming the champion of the solution. By demonstrating the value of the approach, it will entice others to adopt it.

SPOTLIGHT: Maryland Institute College of Art

“The academic deans and information technology have been meeting for the past few months to document clearly articulated analytics goals. We are working to identify what academic leadership needs to measure and why. Once complete, IT will work to ensure our analytics solution meets these requirements. We [in IT] previously attempted to roll out Analytics to the MICA community without doing this needs assessment, and it was not successful. People thought the system would magically tell them what to measure and therefore weren’t sure how to apply the solution to their everyday jobs,” explains Ted Simpson, director of enterprise systems, Maryland Institute College of Art.
SPOTLIGHT: Central Piedmont Community College

“Our campus had many needs that Blackboard Analytics was intended to solve,” shares David Kim, chief information officer, Central Piedmont Community College. “We needed to prioritize these needs as well as find an initial use that would provide a positive proof point for the solution. The student intake committee became that proof point. This committee was formed to rapidly improve our service to new students entering into the college. We were focusing on the areas of admissions, enrollment and financial aid. All of these departments had to review all processes for barriers of entry. We used Blackboard Analytics to help them assess their departments for areas of strengths and weaknesses. The committee chair fell in love with Blackboard Analytics and became the champion of the solution. He is a big part of the reason Blackboard Analytics is being used as much as it is today,” states Kim.

Identify and document key questions
Once either a starting point and/or an interested group is identified, spend time determining what questions need answering and their relative priority. Determine what information is used currently, what specific information is desired and the gap between the two.

C Validating the data

Data validation is arguably the most important part of any analytics implementation. As Mary Byrkit from University of Michigan states, “You have to have good data. The project will succeed only if there is trust in the data.” Clients expect that an analytics project will require the process of verifying that the numbers being generated from the analytics solution based on their predefined business rules are correct. This can be a very time-consuming and detailed process, but it is a relatively straightforward one. The more challenging and less expected data validation issues identified by the interviewees included the following:

• Inaccuracies and errors in the underlying data maintained in current, existing systems
• Varied data definitions used by different departments

The sheer number of people and politics involved in resolving these data decisions can be extremely challenging. As Karl Burgher, chief strategy officer, Indiana State University, articulates, “Data validation is not an IT issue; it’s a management issue.” The suggestions for mitigating these data validation issues include the following:

Anticipate and communicate that underlying data issues will be found
“Analytics enabled us to find the data errors in our systems. Students were listed in 11 colleges instead of the 6 we really had; there were many incorrect spellings of ‘Chicago,’ and the data presented to the execs to date had all been massaged. We saw this as an opportunity to correct all of these data errors and to ensure that the executive team was presented with accurate information,” shared Neeraj Kumar, chief information officer, Roosevelt University.
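
Issues like these can often be surfaced before go-live with a lightweight audit pass over source-system extracts. Below is a minimal sketch of such a pass, assuming a hypothetical CSV extract with college and home_city columns (the file and column names are illustrative, not Blackboard Analytics’ actual schema); it uses only the Python standard library to count distinct values and flag near-duplicate spellings of the kind Kumar describes.

import csv
from collections import Counter
from difflib import SequenceMatcher

def audit_column(rows, column, expected_max=None, similarity=0.85):
    """Profile one column: count distinct values and flag likely misspellings."""
    counts = Counter(row[column].strip() for row in rows if row.get(column))
    values = list(counts)

    # More distinct values than the institution really has signals dirty data,
    # e.g., students listed in 11 colleges when only 6 exist.
    if expected_max is not None and len(values) > expected_max:
        print(f"{column}: {len(values)} distinct values, expected at most {expected_max}")

    # Near-identical pairs are likely spelling variants ('Chicago' vs 'Chicgo');
    # the rarer value is usually the error.
    for i, a in enumerate(values):
        for b in values[i + 1:]:
            if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= similarity:
                rare, common = sorted((a, b), key=lambda v: counts[v])
                print(f"{column}: '{rare}' ({counts[rare]} rows) may be a "
                      f"variant of '{common}' ({counts[common]} rows)")

# Hypothetical extract from a student information system.
with open("student_records.csv", newline="") as f:
    rows = list(csv.DictReader(f))

audit_column(rows, "college", expected_max=6)
audit_column(rows, "home_city")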
Devise a process for standardizing data definitions
Josh Piddington, chief information officer, Gloucester County College, added, “We knew from the beginning this had to be a whole-college approach. When we started going through the data validation reports, we were surprised by the data inaccuracies we found. For example, parts of our enterprise resource planning (ERP) system were still defining academic probation according to a 1990 definition while other parts were defining against a newer definition. This prompted us to get our academic and student services team involved to identify the correct definition, which was then applied against all ERP data. This data validation process has been a great ancillary benefit.”
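
One way to keep divergent definitions from creeping back in is to encode each agreed definition exactly once and have every report and dashboard call that shared implementation. Here is a minimal sketch, with a purely hypothetical probation rule (the GPA threshold, credit floor and field names are illustrative, not Gloucester County College’s actual definition):

# Single source of truth for institutional data definitions. Reports import
# these instead of re-implementing them, so a definition changes in one place.

PROBATION_GPA_THRESHOLD = 2.0  # hypothetical threshold agreed by governance
MIN_CREDITS_ATTEMPTED = 12     # hypothetical: rule applies after a full term

def on_academic_probation(cumulative_gpa: float, credits_attempted: int) -> bool:
    """The one agreed definition of academic probation (illustrative)."""
    return (credits_attempted >= MIN_CREDITS_ATTEMPTED
            and cumulative_gpa < PROBATION_GPA_THRESHOLD)

# Every department's report now gets the same answer for the same student:
students = [
    {"id": "A100", "gpa": 1.8, "credits": 24},
    {"id": "A101", "gpa": 2.4, "credits": 30},
    {"id": "A102", "gpa": 1.5, "credits": 6},  # too few credits; rule not applied
]
flagged = [s["id"] for s in students if on_academic_probation(s["gpa"], s["credits"])]
print(flagged)  # ['A100']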
Define a long-term, sustainable way for correcting errors
“We knew that having standard data definitions is critical to our institution
as well as to the success of an analytics implementation, so we worked on
this in advance. We created a committee headed by institutional research,
who led conversations about data definitions. Doing this ahead of time made
our analytics implementation much easier,” shared Ahmed El-Haggan, vice
president for the information technology division, Coppin State University.
D Selecting the reporting tool(s)

There are many reporting tools that can be used with the Blackboard Analytics solution. A reporting tool is used to present the information derived by the analytics solution; it is the front end, or user interface, to the analytics solution. This said, the questions of most interest were whether institutions have standardized on one tool or are using a mix of tools, and how these tools were selected.

11 of the institutions interviewed stated that they use one tool to access Blackboard Analytics, and 6 institutions use a variety of tools.

Two critical pieces should be considered with this statistic. First, several of the institutions that responded that they are using one tool are not purposely standardizing; rather, they indicated interest in eventually adding more tools. Secondly, more research is needed to confirm whether the interviewees were grouping the Microsoft® tools — ProClarity, Reporting Services and PerformancePoint — into one. Since Blackboard Analytics’ out-of-the-box reports and dashboards are built to be accessed with these tools, it is possible that this is the case.
Regardless of the above caveats, institutions varied in their opinion of how many tools should be used to access Blackboard Analytics information. Ted Simpson, director of enterprise systems, Maryland Institute College of Art, shared that they have, for now, standardized on Reporting Services: “We selected the reporting tool that the institution was technically most comfortable with and that we found easiest for our non-technical users.” On the other hand, Celeste Schwartz, vice president for information technology and college services at Montgomery County Community College, has a different opinion. Schwartz states, “There is no single reporting solution that is the magic bullet. You need to put a reporting toolkit together, each having its own place.”

[Chart: Number of tools used to access Blackboard Analytics information. One tool: 11; mix of tools: 6.]

Tool Selection Guidance:
Those interviewed suggested others assess tool selection based upon their institution’s
• User criteria (revisit the “Defining the need for analytics” section)
• IT’s comfort level with supporting the tool
• Cost of the tool
• Level of training and user support for the tool

SPOTLIGHT: University of Maryland, Baltimore County

“We started out thinking that everyone would want to interact with data the same way — accessing reports on a frequent basis and wanting to drill down into the data — which is why we began using ProClarity to access Blackboard Analytics. We came to learn, though, that different users have different data needs, which impacts tool usage. Whereas I was accessing reports on a daily basis, many people need one report once a month, making them, in our opinion, a better candidate for Microsoft Reporting Services. You have to tailor the experience to the different user groups,” explains Yvette Mozie-Ross, associate provost for enrollment management, University of Maryland, Baltimore County.

E Determining the roll-out strategy

Time must be spent planning the best approach for rolling out an analytics solution. The two areas specifically addressed by those interviewed are 1) whether to roll out the first phase of the solution in pieces or as one more complete solution and 2) the training plan.

Create a balance between fast value and completeness
9 of the 17 institutions interviewed thought that the right approach is to roll out phase one of the solution in pieces rather than waiting to deliver one larger solution. And two institutions shared that they rolled out a more complete first phase, yet if they were to do it again they would choose to roll out smaller iterations faster. The common theme was that rolling the solution out requires a balance between delivering the solution quickly and delivering everything needed. If you don’t deliver enough, people will walk away thinking this isn’t a solution for them; if you wait and deliver more, you risk losing people’s interest while waiting. Either way, expectations must be set appropriately.

The reasons supporting a phased roll out include the following:
1. Helps maintain the excitement and enthusiasm around the project
2. Demonstrates the value of the analytics solution quickly
3. Helps vet the solution at a time when the team can more easily react and improve
Proactively train users using a training mix that meets the user’s needs
Getting people to change how they work and behave requires proactive efforts. As Karl Burgher, chief strategy officer, Indiana State University, states, “We always go to the people — don’t sit and have them come to us.” In turn, quality, thorough training was one of the major themes discussed by the interviewees regardless of institution type or an institution’s cultural acceptance of analytics. As Jack Suess, vice president of information technology and chief information officer, University of Maryland, Baltimore County, states, “We at UMBC have the culture that values assessment, but as IT support we have to take it one step further. We proactively visit with our deans, offer IT-led and peer-to-peer training sessions, have a bi-weekly user group meeting — all to help support the president’s mission of being an assessment-driven institution.”

The key components to the training identified include the following:

• Identify the best person/people to do the training. All institutions interviewed had someone from within their organization train their users. Often the trainers were someone from the implementation team. Though less frequent, other institutions cited success with peer-to-peer training and bringing in presenters from outside the institution.

• Deliver a message that speaks to how this will make the trainees’ job easier or better (as defined in the “Defining the need for analytics” section).

• Define the training requirements. While some institutions interviewed had absolutely no training requirements, leaving attendance up to the individual, others had strict attendance requirements, even including homework and class presentations.

• Segment training according to role and function. Though many institutions interviewed offered broad training sessions spanning many levels and roles, all 17 institutions spoke of the value of also offering segmented training sessions so that they have the opportunity to speak to the needs of individual groups within their campus.

• Outline a process for capturing feedback, inclusive of determining the importance of this feedback and the means for making appropriate changes.

• Build an on-going analytics community to foster the adoption of analytics. Two of the institutions interviewed created an analytics community to bring people together to discuss using analytics on campus more effectively and to share ideas and practices within the institution.

SPOTLIGHT: Boise State University

“We did training sessions by college. We first invited the department chairs and administrative components to talk about the data issues they have, and we showed them how they could use Analytics to get the answers. We had six major training sessions — one for each college — about 90 people. Then we opened up to others on campus. The lessons we learned are: 1) administrative assistants do more analytical analysis for the chairs than we realized, thereby significantly growing the number of people needing training, and 2) training has to keep up with the higher turnover rate found with administrative roles,” states Steve Schmidt, director of institutional analysis, assessment and reporting, Boise State University.

SPOTLIGHT: Montgomery County Community College

“We rolled out analytics very systematically. We had a core team of our people go through a thorough Blackboard product training, and then they developed a training program for the leadership academy (train-the-trainer model). We trained the entire cabinet and all executive administration. We specifically did not train the admin assistants, as we wanted the solution in the hands of those accountable for the data. The training course took eight weeks with a several-hour block each week. We assigned homework. At the end of the training, all had to present a project. If someone didn’t follow these terms, we had the authority to take their system access away,” describes Celeste Schwartz, vice president for information technology and college services, Montgomery County Community College.
F Identifying on-going support

As stated earlier in the paper, campus-wide analytics is a
culture; it is not a finite project. In turn, there is the need for
ongoing development and support. In ideal situations, the
institution will hire one or more people to fill this role. In most
cases though, those interviewed have had to reprioritize the
work efforts of existing team members to focus on analytics.
Ted Simpson explains how hard reprioritization can be: “It
becomes a choice or tradeoff of taking resources allocated
to other projects to work on analytics. It is hard to say ‘no’
to other projects, but we understand that analytics is the
better investment for our institution as a whole.”
SPOTLIGHT: California State University, Stanislaus

“Resource constraints limit our ability to take on more staff; developing analytics has to be done with our current resources. We have three analysts working about a third of their time enhancing the delivered reports and developing new ones. We also have an OIT trainer conducting classes on a number of topics, one of which is ‘The Introduction to Data Warehouse/Business Intelligence.’ And of course Carl [associate vice president for information technology and CIO] and I evangelize business intelligence wherever and whenever possible,” shared Charles Holmberg, director of information services, California State University, Stanislaus.

SPOTLIGHT: Maryland Institute College of Art

“We have approximately two FTE working on analytics, which we had to manage with our existing head count. The one full-time resource was taken away from our PeopleSoft functional support. Fifty percent of another resource’s time was reallocated, as she had spent this time writing manual reports which are no longer needed since they are now being done automatically. We have a third person who spends a quarter of his time on data visualization, and I spend about a quarter of my time on analytics as well,” explains Ted Simpson, director of enterprise systems, Maryland Institute College of Art.

SPOTLIGHT: University of Maryland, Baltimore County

“The maintenance and support of the data warehouse is jointly managed by the division of information technology (DoIT) and the office of institutional research (OIR). Three people from DoIT and two from OIR handle most of the maintenance, support, training and report development. Maintaining the warehouse is my primary responsibility. The other two people from DoIT are part-time employees — 30 hours and 20 hours. The two people from OIR work on the warehouse about 70 and 50 percent of the time, respectively,” stated Kevin Joseph, assistant director of development and integration, University of Maryland, Baltimore County.

Conclusion

To ensure achievement of goals and increased competitive advantage, institutional leaders and student consumers are demanding analytics: the in-depth understanding of data for the purpose of making better-informed decisions. Identifying that institutional analytics is necessary is the easy part. Understanding who is going to use an analytical solution, and for what purpose, and then getting the institution to embrace this change is much more difficult.

In this report, 29 experts and leaders from 17 institutions highlight successful strategies and lessons learned from their experiences implementing and adopting an analytics solution. Though all 29 currently license the Blackboard Analytics solution, the strategies outlined may be applied to many other analytics solutions. They focus on three critical implementation and adoption areas: the role of the executive leader, managing cultural change, and solution implementation and adoption.

By implementing these strategies and learning from the experiences of others, higher education leaders may be better positioned to have their institution embrace the culture of analytics for the purpose of making better-informed decisions, thereby helping them to manage the multitude of challenges they face.

Please also see the accompanying analytics checklist, which is a quick reference to the best practices for the implementation and adoption of an analytics solution identified in this study.
Participants*

Boise State University
Steve Schmidt, director of institutional analysis, assessment and reporting

California State University, Stanislaus
Charles Holmberg, director of information services

California State University, Stanislaus
Carl Whitman, associate vice president for information technology and chief information officer

Central Piedmont Community College
David Kim, chief information officer

Concordia University, Nebraska
Curt Sherman, director of strategic enrollment initiatives and research

Coppin State University
Prasad Doddanna, director of information systems

Coppin State University
Ahmed El-Haggan, vice president for information technology and chief information officer

Gloucester County College
David Comfort, administrator of web and portal systems

Gloucester County College
Josh R. Piddington, chief information officer

Gonzaga University
Chris Gill, chief information officer

Illinois Central College
Aimee Cook, director, institutional research

Indiana State University
Karl Burgher, chief strategy officer

Indiana State University
Mike Snyder, assistant director, enterprise services

Maryland Institute College of Art
Ted Simpson, director of enterprise systems

Montgomery County Community College
Alana Mauger, director of communications

Montgomery County Community College
Celeste Schwartz, vice president for information technology and college services

Roosevelt University
Neeraj Kumar, chief information officer

University of Maryland, Baltimore County
John Fritz, assistant vice president for instructional technology and new media

University of Maryland, Baltimore County
Kevin Joseph, assistant director of development and integration

University of Maryland, Baltimore County
Yvette Mozie-Ross, associate provost for enrollment management

University of Maryland, Baltimore County
Jack Suess, vice president of information technology and chief information officer

University of Michigan
Mary Byrkit, applications manager

University of Michigan
Frances Mueller, assistant vice provost for academic and budgetary affairs

University of Michigan
Holly Nielsen, interim executive director, application and information services

University of North Texas
Will Senn, director of decision support, division of finance
References
Green, K., Jaschik, S., & Lederman, D. “Presidential Perspectives: The 2011 Inside Higher Ed Survey of College and University Presidents.” Inside Higher Ed (2011): Table 13, p. 16; Table 15, p. 18.
Stiles, R. “Understanding and Managing the Risks of Analytics in Higher Education: A Guide.” EDUCAUSE (June 2012).
*Two institutions constituting 4 interviewees chose to remain anonymous.
Driving Towards Analytics: A BEST PRACTICES CHECKLIST
ROLE OF EXECUTIVE
LEADERSHIP
Build executive consensus
Communicate a vision that translates into daily work process
Create a role or committee focused on advancing analytics
Identify a facilitator and process for removing roadblocks
MANAGING THE
CULTURAL SHIFT
Form a Steering Committee
Conduct a mix of broad and focused information sessions
Meet individually with key stakeholders
Identify and implement a mix of communication tools
SOLUTION IMPLEMENTATION AND ADOPTION:
The Project Team
Ensure a functional department lead or co-lead
Allocate a dedicated Project Manager
Define individual roles, responsibilities and timelines
Needs Analysis
Find the starting point, a win
Seek out the group most open to analytics
Identify and document key questions
Data Validation
Anticipate, communicate and plan for underlying data issues
Devise a process for standardizing data definitions
Define a sustainable process for correcting data errors
Reporting Tool Selection
Determine ‘best fit’ tools for user criteria
Match tools against IT’s skills and ability to support
Assess tool cost
Assess level of user training needed for each tool
Roll-out Strategy
Craft rollout schedule in stages for both fast value and completeness
Identify appropriate trainers
Communicate impacts, if any, on trainee’s job
Define training requirements
Segment training (e.g. by role, function)
Outline a process for capturing feedback
Build an on-going analytics community
On-Going Support
Identify new or existing staff to support ongoing analytics development
and training
Support and foster the analytics community
blackboardanalytics.com • 650 Massachusetts Avenue, NW 6th Floor Washington, DC 20001 • 1.800.424.9299, ext. 4
Copyright © 2012. Blackboard Inc. All rights reserved. Blackboard, the Blackboard logo, Blackboard Analytics, and Behind the Blackboard are trademarks or registered
trademarks of Blackboard Inc. or its subsidiaries in the United States and/or other countries.