Why Doesn’t School Library Impact Research Have More Influence on School Leaders? Implications for Advocacy

An Opinion Piece by Gary Hartzell
MLA Reference:
Hartzell, Gary. Why Doesn’t School Library Impact Research Have More Influence on School Leaders? Implications for Advocacy: An
Opinion Piece. Library Media Connection. Oct. 2012. Web. Date of access (day, month, and year).
Editor’s Note: Library Media Connection is pleased to share this important work on advocacy.
A half century’s research results show that school libraries can have a positive impact on student achievement, but libraries and librarians
remain frighteningly vulnerable to budget reductions, staff cuts, even elimination.1 Why hasn’t this research evidence had more influence
with superintendents and principals? As a former school administrator and educational administration professor, it seems to me that there
are several reasons, each with implications for school library advocates.
1. Impact Research Isn’t Published in Journals Administrators Read.
First, impact research doesn’t influence administrators because many administrators are unaware of it. Then, if a librarian does bring it to
their attention—either verbally or by giving them one or more librarian publications—administrators may wonder why they haven’t heard
about it before. They might be bothered by the fact that this material isn’t published in the journals they read. If, indeed, there is compelling
evidence that libraries can make a significant difference, they might wonder why this information hasn’t been made widely available to
administrators and others in education, instead of being found almost exclusively in librarian research journals, in library practitioner
publications, or posted on library internet sites. And, of course, at the same time—if they are naturally suspicious or cynical—they might find
it difficult not to perceive the sharing of such information as self-serving on the librarian’s part.
It may be, of course, that administrative journal2 editors simply are not interested in libraries and librarians as subjects, but that seems
unlikely since they publish articles dealing with all kinds of education facilities, policies, curricula, practices, and programs that affect school
operation, instructional quality, and student achievement. Perhaps non-librarian journal editors and reviewers aren’t impressed with what
they see. Then, again, perhaps they don’t ever get to see the studies’ results. Perhaps impact researchers have consciously chosen not to
submit their reports to these journals and their corresponding websites. Entertaining that notion does nothing to stimulate administrator
interest or confidence.
What does this imply? Some may argue that researchers almost always publish in the journals that define their own fields. And this is
true for most fields of study. Educational administration, however, is not a narrowly defined discipline. Rather, it deals with all aspects of
schooling. This implies, I think, that while it makes perfect sense for impact researchers to publish where they do, they need to additionally
publish in the research and popular press read by administration professionals. Until they do, it’s difficult for me to see how library impact
studies can have any real effect on how administrators think and behave.
The professional administration community comprises two distinct but connected groups: educational administration (EdAd) professors
and school administrators in the field. EdAd professors need to see this research because they shape the way aspiring and beginning school
leaders think about libraries and librarians. Superintendents and principals need to see it because they decide the fates of libraries and
librarians in the real world of K-12 schooling. Neither group’s members read librarian publications.3
Educational administration professors. Why is it important for library researchers to publish where administration professors will read
their work? In a nutshell, unless EdAd professors accept library impact research, it will have no influence with the people who prepare and
certify school leaders. Therefore, the research will have no influence with school leaders entering the field.
Most full time EdAd professors are former practitioners.4 In essence, each generation of school administrators is trained and prepared by
the previous one. Principals and superintendents who later become professors weren’t taught the value of libraries in their own university
training, and they didn’t learn it on the job.5 As a result, they don’t integrate any sense of library value into the courses they teach to aspiring
principals and superintendents intent on earning their degrees and credentials. The result is that generation after generation of library-blind
administrators graduate into our schools.6
Library science and school library media faculty particularly need to publish in the refereed and popular print and online journals that
EdAd professors read. Administration professors, especially those associated with post-master’s degree programs, deal in research. Research
and publishing are required if they are to keep their tenure-track jobs and earn promotion, and they must guide their students in thesis
and dissertation research projects. This is an important group for school library advocates to connect with because a doctorate is rapidly
becoming a requisite for a superintendency, and increasing numbers of principals are pursuing educational specialist or doctoral degrees.7
There is virtually no hope that any significant number of them will ever become aware of the growing mass of library impact research unless
it appears in journals they read and trust.
“Trust” is a particularly important factor here. Professors don’t trust just any administration journal. Studies show that a relatively small
number have and hold their attention.8 Library impact research needs a presence in Educational Administration Quarterly, the American
Educational Research Journal, Educational Evaluation and Policy Analysis, Journal of School Leadership, Educational Researcher, Teachers
College Record, Educational Leadership, and Phi Delta Kappan.9
Principals and superintendents. Principals and superintendents are the people who set school and district policies, make funding and
personnel decisions, evaluate outcomes, and decide what should be cut back or eliminated when money is short. If there is evidence that school
libraries significantly contribute to student achievement, they need to see it in journals that speak to their own particular experience.
School administrators tend to read periodicals that summarize, synthesize, and apply research results rather than the research journals
themselves. Seeking material that is contextually relevant to their situation, they most often read articles that mix refereed research and
application ideas. They look for evidence-supported models and practices that connect to the challenges they face in their own schools and
districts.
School administrators consistently turn to professional association conferences and publications when they are searching for evidence
to support a decision.10 Perceived as unbiased sources that can help them find and prioritize information, their associations serve as
intermediaries between raw research results and meaningful application. Principals and superintendents trust professional associations to
point them toward credible material that will help them make sense of specific local issues.11 That is entirely appropriate. If, as Tip O’Neill
used to argue, all politics is local, then so, too, is all administration. If I’m a principal, I need to deal with the situation in this school. If I’m a
superintendent, I need to deal with the situation in this district. So I look for information that will help me do that.
Administrators, particularly central office administrators, seek evidence for their decisions and practice that will fit into the framework of
what they already know and expect to find.12 Library research and practice articles need a continuing presence in the NASSP Bulletin, the
Middle School Journal, Principal, School Administrator, American School Board Journal, Educational Leadership, Phi Delta Kappan, and state-level publications of a similar nature. Publishing in those journals bestows a research and argument credibility that cannot be had in librarian
journals. The research is vetted by the associations, editors, and editorial boards that administrators trust. An article touting libraries in a
library journal is perceived as inherently self-serving. That’s why there isn’t too much to be gained by giving a school administrator—or an
educational administration professor—a copy of a librarian publication.13 Publication in a broader professional journal removes that “taint.”
As imperative as it is to capture administrators’ attention, and that of the professors who teach them, getting library research and argument
published in influential administrative journals is a formidable task. It will take a concentrated effort, and probably a lot of co-authoring, to
break into those journals, at least at the outset. And, unfortunately, the problem is made larger because, with only a few notable exceptions,
education professors and field administrators don’t read the same journals.14
2. Administrators Don’t Perceive Impact Research as Overwhelming.
Assuming that library impact research eventually is published in the non-librarian press, administrators are still likely to be cautious about
accepting advocates’ arguments that large investments in library programs are justified. Studies that show libraries and librarians playing
important roles in promoting student achievement reach back to the 1950s.15 This is impressive, but school administrators tend to be skeptical.16
Just how important a role have they played? Touting the available research results as “proof” rather than “evidence” is risky in dealing with
administrators. There have been too many “sure solutions” in their experience. Think of the number just in reading, mathematics, language, and
foreign language teaching. Then there were managerial panaceas: site-based management, zero-based budgeting, the effective schools research
movement, MBO, TQM, and a raft of others.
More than that, administrators are particularly wary of anything that has a big-ticket price tag—and libraries do. They should be extremely
careful; education budgets are always strapped, especially in today’s economy.17 From that perspective, without overwhelming evidence of
library value put right in front of their faces, commanding administrators’ attention is a near impossible task for librarians.
Whether or not the impact research studies do overwhelmingly demonstrate library value is neither the question nor the point here. Think
of the old story of the baseball umpire who hesitates in calling a pitch. The catcher calls it a strike; the batter calls it a ball. The umpire growls
at them both, “It’s not anything until I say it is!” Perception is everything. The research will not be overwhelming and undeniable until
administrators say it is.18
From an administrative perspective, the impact research results have been encouraging, to be sure, and certainly thought provoking, but not
overwhelming. There probably are several reasons why administrators don’t yet perceive this body of work as overwhelming. Among them are
methodologies, missing elements, and a lack of disaggregation.
Methodologies. To begin with, the bulk of the library impact research has been correlation studies. Typical principals and district
administrators may not remember a great deal of technical information from the research methods and statistics class(es) they took to earn
their degrees and credentials,19 but they are likely to remember the basic concept that correlation is not causation. Correlation statistics can
demonstrate associations and the strength of the relationship one variable has to another, but cannot definitively demonstrate that one
caused the other. Still, advocates can argue a case for at least some cause-and-effect thinking because school library impact research results are
continual (a half century’s worth of studies), consistent (the same kind of results keep showing up), and comparable internationally throughout
the English-speaking world (that is, similar results show up in different environments—all fifty states, Canada, Britain, South Africa, Australia,
and New Zealand).20
From an administrator’s point of view, though, continual and consistent may not add up to compelling. The question administrators really want
answered is whether the correlations generally are high enough to justify making the kind of financial commitment that quality library
programs require, and most especially whether they are high enough in schools like theirs.
Administrators know that interpretation of correlations is complicated and can be controversial, so they are cautious. Caution often leads to
hesitation, even avoidance, especially if there are competing options, each with its own promises of improving student achievement.
Then there are qualitative methodologies. As Judson Lanier insightfully points out, data always under-represent reality.21 Qualitative research
can breathe some life into educational studies, and there are qualitative studies that support library presence and practice.22 The problem is to
know how administrators interpret them and whether they are willing to trust them, especially if they don’t see them published under journal,
magazine, and website banners they trust. Qualitative research has gained more respect over the last thirty years, but it’s still not the gold
standard and it certainly hasn’t received a boost from Washington and the state governments adopting data-driven decision making models.23
Qualitative research remains suspect in many administrators’ eyes, particularly when they themselves are required to submit hard number
figures to the agencies and authorities that oversee them.24
What does this imply? First, it implies that it may be time for researchers to employ additional methodologies. That will be discussed more
below. Second, this situation implies that any impact research article submitted to a journal that administrators read—including, perhaps
especially, the non-refereed applied popular practitioners’ press—should carry sections specifically designed to counter a field administrator’s
ignorance of or reservations about the research methods employed. Definitive measures of library impact remain unavailable at this time, but
more widely-based and sophisticated research techniques are on the horizon. Many of the recent projects have been crafted using techniques
more sophisticated than simple correlation. Look, for example, at the use of regression analysis.25 But unless this is explained to them, advances
in methodology are lost on administrators who are years past the original research projects they undertook in their credential and degree
studies. I don’t mean that advocates should author complicated technical explanations that administrators won’t expend the time or effort to
absorb. Give them short, palatable introductions or refreshers.
Missing elements in the research. The narrowness of impact research approaches complicates, perhaps confounds, current attempts to
build advocacy on its results. The key to persuading administrators may lie as much in what is missing from the research record as what is
in it. Impact studies have largely been designed to identify and associate library elements with student success. Good administrative practice
requires multiple perspective assessment of situations, choices, proposed changes, and costs, so administrators value comparative research. Is
there evidence that library reduction or elimination is, indeed, a crippling blow? Is there a body of evidence that student achievement drops in
the years following significant library reduction or elimination?
As an aside, there is some encouraging movement in this direction. Keith Lance, a preeminent library impact researcher, is launching a study of
student achievement performance records in states that have suffered library reductions, cuts, and closures.26 This strikes me as a potentially
powerful project. Showing school principals, superintendents, or school board members what they stand to lose if they cut the library could be
more persuasive than promising them non-guaranteeable future gains if they support it.27
There is another problematic characteristic of impact research: the absence of comparative studies. It would surely help the librarians’ case
if there were evidence that dollars, people, space, and time put into the library do more to enhance student achievement than comparable
resources put into more reading teachers, curriculum changes, pull-out enrichment activities, tutoring programs, or other strategies. Without
comparative evidence that library investment can deliver more than some alternate investment, superintendents and principals can’t escape
the uncertainty surrounding where to best allocate thin resources in a zero-sum environment.
What does this imply? First, social psychology research tells us that people tend to fall back on past practice and are more likely to think in
stereotypical terms when they are faced with significant uncertainty in a pressured environment—which schools certainly are.28 Librarian
stereotypes and historically underfunded collections hold no promise for a brighter library future.
This all suggests that researchers (who are not supposed to be advocates, incidentally, but rather truth-seekers) could benefit from opening
some new research lines.29 There is, of course, risk inherent in undertaking studies of schools that have cut their library media programs.
They might produce results that show little or no loss in student achievement after library cuts. What then would be the implications? What
would be the implications if student achievement levels actually rose after library resources were shifted to other programs and people?
Lack of disaggregation. Principals think locally, and there are tremendous and profound differences between elementary, middle, and high
schools.30 The information literacy content and processes that six-year-olds need and can master differ from those of twelve-year-olds and
again from those of eighteen year-olds. Library impact research results need to be published in level-appropriate administrators’ journals in
ways that define, measure, and describe library impact at each of these specific levels.
This is important. Convincing evidence comes from research anchored in the specific context of my situation: I want to know what libraries
contribute in schools like mine, not in schools generally. If I am a busy school principal, I want to see research reports addressing the kinds of
challenges and choices that must be dealt with in my school. If I’m a middle school administrator, I don’t need “school library impact research
results”; I need “middle school library impact research results.” Because I am so busy, I don’t want to have to—and won’t—go hunting for them
in any mass of text and tables that integrate findings across K-12 schooling. General reports with outcomes integrated, mingled, or averaged
across the whole K-12 spectrum are of no interest or use to me. This suggests that it might be a good time for researchers to undertake more
targeted studies of library impact in a level-limited environment with level-limited samples.
Additionally, there is always the chance that disaggregating and publishing results by specific level may lead to a surprising capture of at
least partial library support—not across the board but at a particular level. It may be that libraries and librarians demonstrably do make a
greater difference in one setting than in another. It seems a mistake to think of or portray libraries as monolithic entities. Just as elementary,
middle, and high school students, staff members, curricula, goals, and organizational structure and operation all differ from each other, so
do the roles, functions, structures, and importance of libraries and librarians at each. Advocates may be able to make a compelling case to
support libraries at one level, even if they can’t make the case for all three. Developing a compelling case for library support at any level
would be an advance over the current situation.
One more thought. A “small wins” and supportive role approach: library value within but also beyond itself.
Another possibly profitable line of research with advocacy implications might be to investigate the extent to which quality libraries and
librarians make other school improvement efforts and programs more likely to succeed. The library is not a panacea; there are no panaceas in
education. Perhaps evidence can be found to convince administrators that not only do libraries and librarians contribute to improved student
achievement in their own right, but just as importantly—and perhaps more importantly—they can be critical elements in helping other efforts
succeed.
Studies of improving schools and of schools that perform above expectations for their socio-economic level have produced evidence that a
collection of modest, even small, improvement projects—Karl Weick’s notion of “small wins”31—can collectively and interactively work over time
to raise overall student achievement levels.32 Obviously, information literacy skills along with access to and guidance in using a wide variety of
up to date resources are important elements in virtually every academic improvement effort. To the extent that research might demonstrate
that the library has value beyond as well as within itself, advocates could encourage school leaders to think of libraries and librarians as
important stones in a mosaic of approaches.
The Bottom Line: Broaden the Scope and the Communication Lines
None of these research results—the current body of studies available or future comparative studies, studies following up on library cuts,
specific level impact studies, or “small win” contribution studies—will ever capture continuing administrative support if administrators don’t
see them. The bottom line is this: So long as school library researchers and advocates publish almost exclusively in librarian journals, the full
potential of school library impact research cannot be realized. School library advocates must broaden both their research scope and their
communication channels.
About the Author
Gary Hartzell is emeritus professor of educational administration at the University of Nebraska at Omaha, where he taught in the master’s
and doctoral degree programs, preparing students for careers as building- and district-level administrators. Before that, he was a high school
teacher, assistant principal, and principal over a twenty-three year period in Southern California.
Completing his doctorate at UCLA in 1990, he joined the educational administration faculty at the University of Nebraska at Omaha. His
research interests center on schools as work places for adults, with particular attention given to workplace relationships. He has focused
most of his attention on the assistant principal and school librarian positions. His interest in these two areas grew out of his own experience
and out of materials surfacing in a seminar he taught on power and influence in the workplace.
Gary is the author of Building Influence for the School Librarian (Linworth, 2004), the lead author of New Voices in the Field: The Work Lives
of First-Year Assistant Principals (Corwin Press, 1995), and many articles on school administration practices, school librarians, and workplace
relationships.
Gary is a sought-after speaker, having made presentations not only across the United States, but in Europe, Australia, and New Zealand.
He was invited to speak at the 2002 White House Conference on School Libraries and remains a member of the Laura Bush Foundation for
America’s Libraries Advisory Board. Gary wrote for a while as a monthly columnist for School Library Journal and now serves on the editorial
board for the International Association of School Librarians’ journal School Libraries Worldwide and on the advisory board for Linworth’s
Library Media Connection.
The American Association of School Librarians listed Gary on its honor roll as one of the country’s most influential figures in library media,
and honored him with its Crystal Apple Award for significant contribution to the advancement of library media. These are rare honors
because he has never been a librarian.
Gary Hartzell now lives at the beach in Southern California with his wife, Cheryl.
References and Notes
1 For an interesting visual overview of current threats and cuts see Shonda Brisco’s “A Nation without School Librarians” online at http://maps.google.com/
maps/ms?ie=UTF&msa=0&msid=117551670433142326244.000482bb91ce51be5802b. Accessed August 15, 2012.
Also see Ewbank, A. D. Factors Leading to Retention of School Librarian Positions: A School District Case Study. Presentation at the American Educational Research
Association Annual Meeting, Denver, CO, May 1, 2010. Available online at www.eric.ed.gov/PDFS/ED509456.pdf. Accessed January 29, 2011.
To sense the depth of the danger, see what is currently happening in the Los Angeles Unified School District: www.latimes.com/news/local/la-me-0513-tobar20110513,0,4862226,full.column. Accessed August 15, 2012.
2 For simplicity of language, whenever the term “administrative journals” or “administration journal” is used in this paper, readers should take it to mean “the
journals that administrators read.” Unless the context spells out that the topic is specifically works prepared by and for administration professors and school
administrators, the term should be read to include journals and magazines published by researchers and practitioners in curriculum, reading, law, economics,
school reform, special education, evaluation, teaching, and a host of other areas of education. When the term “professional association” is used, it should be taken
to mean not just national associations, but also regional, state, and local. This applies to both professional association publications and professional association
conferences.
3 Mayo, C. R., P. A. Zirkel, and B. A. Finger. “Which Journals Are Educational Leadership Professors Choosing?” Educational Administration Quarterly 42.5
(December 2006): pp. 806-811.
Zirkel, P. A. “The Professoriate, the Practitioners, and ‘Their’ Periodicals,” Phi Delta Kappan 88.8 (April 2007): pp. 586-589.
Nelson, S. R., J. C. Leffler, and B. A. Hansen. Toward a Research Agenda for Understanding and Improving the Use of Research Evidence. Portland, OR: Northwest
Regional Educational Laboratory, 2009. Available at http://educationnorthwest.org/resource/694.
4 It is difficult to assess the percentage of adjuncts teaching educational administration or educational leadership in comparison to full-time tenure track
professors. The American Association of University Professors reports that sixty-eight percent of current university teachers are in non-tenure track positions and
fifty percent of university teachers are in part-time positions. The numbers of “contingent faculty” are probably smaller in graduate professional programs, but still
appear substantial. Available at www.aaup.org/AAUP/issues/contingent/contingentfacts.htm. Accessed August 15, 2012.
5 In truth, building principals and assistant principals have little opportunity to learn about statistics, libraries, or any number of other things once they take office.
There is no end of research reports documenting the pace, pressure, fragmentation, and demands of building administration. For an introduction, see such works
as:
Cuban, L. The Managerial Imperative and the Practice of Leadership in Schools. Albany, NY: State University of New York Press, 1988.
DiPaola, M., and M. Tschannen-Moran, “The Principalship at a Crossroads: A Study of the Conditions and Concerns of Principals.” NASSP Bulletin 87.634 (March
2003): pp. 43-65.
Goldring, E., J. Huff, H. May, and E. Camburn. “School Context and Individual Characteristics: What Influences What Principals Really Do?” Journal of Educational
Administration 46.3 (2008): pp. 332-352.
Hartzell, G., R. Williams, and K. Nelson. New Voices in the Field: The Work Lives of First-Year Assistant Principals. Thousand Oaks, CA: Corwin Press, 1995.
Lortie, D. School Principal: Managing in Public. Chicago: University of Chicago Press, 2009.
Peterson, K. “The Principal’s Tasks.” Administrator’s Notebook 26.8 (1977-1978): pp. 1-4.
6 This is why school library advocates need to broaden and shift their efforts. Administrators are trained by EdAd professors and, once in the field, they are greatly
influenced by their professional associations. Library advocacy aimed at educating in-place administrators is insufficient. For a summary of arguments along this
line, please see G. Hartzell, “The Need to Shift and Widen School Library Advocacy Efforts: An Opinion Piece,” Library Media Connection, vol. 30, no. 6 (May/June,
2012), pp. 12-13. The full paper with additional discussion and references can be found online at
http://www.abc-clio.com/uploadedFiles/Content/promo/Linworth_and_LMC_Files/LMC_Web/Hartzell.pdf. Accessed August 15, 2012.
7 National Center for Education Statistics. “Characteristics of Public, Private, and Bureau of Indian Education Elementary and Secondary School Principals
in the United States: Results from the 2007-08 Schools and Staffing Survey, Table 4.” Available at http://nces.ed.gov/pubs2009/2009323/tables/
sass0708_2009323_p12n_04.asp. Accessed August 15, 2012.
8 Mayo, C. R., P. A. Zirkel, and B. A. Finger. “Which Journals Are Educational Leadership Professors Choosing?” Educational Administration Quarterly 42.5
(December 2006): pp. 806-811. None of these journals has anything to do with school libraries, nor are they likely to carry articles on school libraries.
9 Mayo, C. R., P. A. Zirkel, and B. A. Finger. “Which Journals Are Educational Leadership Professors Choosing?” Educational Administration Quarterly 42.5
(December 2006): pp. 806-811.
Zirkel, P. A. “The Professoriate, the Practitioners, and ‘Their’ Periodicals.” Phi Delta Kappan 88.8 (April 2007): pp. 586-589.
Nelson, S. R., J. C. Leffler, and B. A. Hansen. Toward a Research Agenda for Understanding and Improving the Use of Research Evidence. Portland, OR: Northwest
Regional Educational Laboratory, 2009. Available at http://educationnorthwest.org/resource/694.
10 Honig, M. I., and C. Coburn. “Evidence-based Decisionmaking in School District Central Offices: Toward a Policy and Research Agenda.” Educational Policy 22.4
(July 2008): pp. 578-608.
Also see Nelson, S. R., J. C. Leffler, and B. A. Hansen, Toward a Research Agenda for Understanding and Improving the Use of Research Evidence. Portland, OR:
Northwest Regional Educational Laboratory, 2009. Available at http://educationnorthwest.org/resource/694.
11 Nelson, S. R., J. C. Leffler, and B. A. Hansen, Toward a Research Agenda for Understanding and Improving the Use of Research Evidence. Portland, OR:
Northwest Regional Educational Laboratory, 2009. Available at http://educationnorthwest.org/resource/694.
12 Birkeland, S., E. Murphy-Graham, and C. H. Weiss. “Good Reasons for Ignoring Good Evaluation: The Case of the Drug Abuse Resistance Education (D.A.R.E.)
Program.” Evaluation and Program Planning 28.3 (2005): pp. 247-256.
Hannaway, J. Managers Managing: The Workings of an Administrative System. New York: Oxford University Press, 1989.
13 Since most educational administration professors are former practitioners, and since many are recent university hires (and, given the number of adjunct
professors in most educational administration departments, many still practice in the field), many of their habits and prejudices are still in place. Practicing
administrators tend to judge the credibility of the source as part of judging the trustworthiness of the results reported. See Honig, M. I., and C. Coburn.
“Evidence-Based Decision Making in School District Central Offices: Toward a Policy and Research Agenda.” Educational Policy 22.4 (July 2008): pp. 578-608.
Available at http://epx.sagepub.com/content/22/4/578. Accessed August 15, 2012.
They also tend to be wary of researchers who lack credibility among educational policymakers and practitioners. See Nelson, S. R., J. C. Leffler, and B. A. Hansen.
Toward a Research Agenda for Understanding and Improving the Use of Research Evidence. Portland, OR: Northwest Regional Educational Laboratory, 2009.
Available at http://educationnorthwest.org/resource/694.
And they are likely to be suspicious of works they see as advocacy, political arguments, or marketing ploys. See Fusarelli, L. D. “Flying (Partially) Blind: School
Leaders’ Use of Research in Decisionmaking.” In F. Hess (ed.), When Research Matters: How Scholarship Influences Education Policy (pp. 177–196). Cambridge, MA:
Harvard Education Press.
14 Zirkel, P. A. “The Professoriate, the Practitioners, and ‘Their’ Periodicals.” Phi Delta Kappan 88.8 (April 2007): pp. 586-589. Both professors and school
administrators read Educational Leadership and the Phi Delta Kappan. Beyond those two, they tend to read different journals and magazines.
15 See individual studies in the literature. Summaries are easily available online. For example, see Mansfield University’s “School Library Impact Studies Project” at
http://library.mansfield.edu/impact.asp. Accessed August 15, 2012.
A wonderful overview and links to particular studies are available at www.lrs.org/impact.php, the Library Research Service site. Accessed August 15, 2012.
16 There is some interesting research dealing with the role and value of skepticism in management behavior. Skepticism should not be confused with cynicism.
Skepticism questions the likelihood of success and effectiveness of a proposal, not the motives of the people who advance it. See works like
Kanter, Donald, and Philip Mirvis. The Cynical Americans: Living and Working in an Age of Discontent and Disillusion. San Francisco: Jossey-Bass, 1989.
Reichers, Arnon, John Wanous, and James Austin. “Understanding and Managing Cynicism about Organizational Change.” Academy of Management Executive 11.1
(February 1997): pp. 48-59.
Smith, James E., and Robert L. Winkler. “The Optimizer’s Curse: Skepticism and Postdecision Surprise in Decision Analysis.” Management Science 52.3 (March
2006): pp. 311-322.
17 For a discussion of the impact of our economic problems on principals and superintendents, see “Leading through a Fiscal Nightmare: The Impact on Principals
and Superintendents” by Rick Ginsberg and Karen Multon in the Phi Delta Kappan 93.8 (May 2011): pp. 42-47.
18 An instructive example of this is the way administrators treat the research on designing teacher merit pay programs. Merit pay programs seldom last more
than a few years, and they are not associated with improved student performance because, the research argues, administrators and boards keep adopting flawed
and failed models rather than paying appropriate attention to what research tells us about design and implementation features and issues. For useful overview
information see works like the following:
Clees, W. J. “Teacher Incentive Programs—Do They Make Better Teachers?” Education 113.1 (Fall/Winter 1992): pp. 145-148.
Murnane, R. J., and D. K. Cohen. “Merit Pay and the Evaluation Problem: Why Most Merit Pay Plans Fail and a Few Survive.” Harvard Educational Review 56.1
(February 1986): pp. 1-17.
--“Washington View: The Perils of Merit Pay.” Phi Delta Kappan 91.2 (October 2009): pp. 99-100.
Podgursky, M. J., and M. G. Springer. “Teacher Performance Pay: A Review.” Journal of Policy Analysis and Management 26.4 (Autumn 2007): pp. 909-950.
Sawchuk, S. “Study Casts Cold Water on Bonus Pay.” Education Week 30.5 (September 29, 2010).
19 There are at least two things to consider in assessing how statistically literate school administrators are. The first is how long ago they had their training. The
second is how deep the training was in the first place. Once on the job, administrators are consumed by the immediacy of running a school. This leads them away
from close scrutiny of research and the practice of statistics, unless the particular position they hold requires some statistical application. See the sources listed in
endnote no. 5 above.
The amount of statistical education school administrators receive in their training is subject to a number of variables. The first is the degree pursued. There is more
statistical work involved in a doctorate than in a specialist’s degree, and more in the specialist’s than in a master’s degree. Master’s degrees are generally required
for administrative credentials, but not all institutions or states require a master’s degree in administration or some other education-based subject. Consequently,
some students earn administrative certification only and receive less instruction in education research methods than others do. Some master’s degrees can be
earned by taking an examination rather than producing a thesis. People who follow this path have had less experience with statistics and other research tools than
those who complete a master’s degree in administration. A second distinguishing variable is the college or university at which the degree or the certification is
earned. Any perusal of program requirements will show you the range.
20 See, for example: Lonsdale, M. Impact of School Libraries on Student Achievement: A Review of the Research. Report for the Australian School Library
Association, Australian Council for Educational Research, Melbourne, March 2003.
Ontario Library Association, School Libraries and Student Achievement in Ontario, April, 2006, available at www.accessola.com/data/6/rec_docs/137_eqao_
pfe_study_2006.pdf. Accessed August 15, 2012.
21 Lanier, J. You Are Not a Gadget: A Manifesto. New York: Knopf Publishers, 2010. How accurate a picture of life in schools is provided by standardized test
results? How much of the dropout problem is explained by the number of students who drop out?
22 A place to start is K. Wirkus, “School Library Media Research: An Analysis of Articles Found in School Library Media Research and the ERIC Database,” 2005.
Available at www.ala.org/ala/mgrps/divs/aasl/aaslpubsandjournals/slmrb/slmrcontents/volume9/eric.cfm. Accessed August 15, 2012. Other introductory
examples can be seen on the Wisconsin Department of Public Instruction site at www.dpi.wi.gov/imt/lmsstudy.html and on the site maintained by the Center for
International Scholarship in School Libraries at Rutgers University, http://cissl.rutgers.edu. Accessed August 15, 2012.
23 Cathy Roller and Richard Long use the term “background noise” in describing policymakers’ perceptions of qualitative research and offer some suggestions
on how to improve the situation: “Sounding Like More Than Background Noise to Policy Makers: Qualitative Researchers in the Policy Arena.” Journal of Literacy
Research 33.4 (December 2001): pp. 707-725.
24 This is another reason that it is important for school library advocates to capture educational administration professors’ interest and support. EdAd professors
imbue aspiring administrators with their own research and research method values as they lead their students through master’s, specialist, and doctoral degree
programs.
25 There are several works documenting and explaining the most recent and emerging techniques on the Library Research Service website at www.lrs.org/
impact.php. Accessed August 15, 2012.
26 Personal communication, August 29, 2011.
27 The bad economy and the range of school cuts today may provide an excellent opportunity to look at schools that had quality library media programs and a
certain level of student achievement, and that have since cut their libraries and librarian staffing, to see what has happened.
28 Bodenhausen, G. V., and M. Lichtenstein. “Social Stereotypes and Information-Processing Strategies: The Impact of Task Complexity.” Journal of Personality and
Social Psychology 52.5 (May 1987): pp. 871-880.
Freund, T., A. W. Kruglanski, and A. Shpitzajzen. “The Freezing and Unfreezing of Impression Primacy: Effects of Need for Structure and the Fear of Invalidity.”
Personality and Social Psychology Bulletin 11.4 (1985): pp. 479-487.
Sabini, J. Social Psychology. New York: Norton & Co., 1992.
As Chris Argyris of Harvard has suggested, most leaders are unaware of the assumptions underlying their perceptions. Faulty assumptions lead to errors in
decision making, and—dangerously—the resulting discrepancy goes unrecognized. Long-held stereotypical views may never allow your principal or superintendent
to see that you ought to be given opportunities to do more than, and differently from, what you are now doing. Argyris, C. “Theories of Action That Inhibit Individual Learning.”
American Psychologist 31.9 (September 1976): pp. 638-645.
29 There isn’t space here to explore this fully, but an insightful and thought-provoking place to begin thinking about alternative research approaches is Daniel
Callison’s interview with Keith Lance, available at the School Library Media Research online journal site: “Enough Already? Blazing New Trails for School Library
Research; An Interview with Keith Curry Lance, Director, Library Research Service, Colorado State Library & University of Denver.”
http://aasl.org/ala/mgrps/divs/aasl/aaslpubsandjournals/slmrb/editorschoiceb/lance/interviewlance.cfm. Accessed August 15, 2012. It is also available as a
PDF on the Library Research Service site at www.lrs.org/impact.php.
30 Anyone interested in exploring the uniqueness of each level can begin with works like these:
Bacharach, S. B., S. C. Bauer, and S. Conley. “Organizational Analysis of Stress: The Case of Elementary and Secondary Schools.” Work and Occupations 13.1
(February 1986): pp. 7-32.
Bruckner, M. “Neglect on the Homefront: A Study of School Administrators’ Family Relationships.” NASSP High School Magazine 4 (1996): pp. 32-35.
Bryk, A. L., and J. Smith. “High School Organization and Its Effects on Teachers and Students: An Interpretive Summary of the Research.” In W. Clune and J.
Witte (eds.), Choice and Control in Schools 2. Philadelphia: Falmer Press, pp. 135-226.
Firestone, W. A., and R. E. Herriott. “Prescriptions for Effective Elementary Schools Don’t Fit Secondary Schools.” Educational Leadership 40.3 (December 1982):
pp. 51-54.
Goodlad, J. A Place Called School. New York: McGraw-Hill, 1984.
Hoffman, L. M. “Why High Schools Don’t Change: What Students and Their Yearbooks Tell Us.” High School Journal 86.2 (December 2002/January 2003).
Little, J. W., and M. W. McLaughlin (eds.). Teachers’ Work: Individuals, Colleagues, and Contexts. New York: Teachers College Press, Columbia University, 1993.
McLaughlin, M. W., J. E. Talbert, and N. Bascia (eds.). The Contexts of Teaching in Secondary Schools: Teachers’ Realities. New York: Teachers College Press, 1990.
Sarason, S. B. The Predictable Failure of Educational Reform. San Francisco: Jossey-Bass, 1992.
Siskin, L. S. “Departments as Different Worlds: Subject Subcultures in Secondary Schools.” Educational Administration Quarterly 27.2 (May 1991): pp. 134-160.
31 Weick, K. E. “Small Wins: Redefining the Scale of Social Problems.” American Psychologist 39.1 (January 1984): pp. 40-49. And K. E. Weick, Making Sense of
the Organization, Volume One: Key Works in Cultural Studies, London: Wiley-Blackwell, 2001.
See also: Schmoker, M. R. Results Now: How We Can Achieve Unprecedented Improvements in Teaching and Learning. Alexandria, VA: Association for
Supervision and Curriculum Development, 2005.
Also see: Kotter, J. P. “Leading Change: Why Transformation Efforts Fail.” Harvard Business Review 73.2 (March/April 1995): pp. 59-67.
Rhatigan, J. J., and J. H. Schuh. “Small Wins.” About Campus 8.1 (March/April 2003): pp. 17-22.
A short but interesting essay on the value of small wins by Debra Meyerson can be found at http://hbswk.hbs.edu/archive/2538.html. Accessed August 15,
2012.
I think this notion deserves more thought than it has received in schooling. As Harvard’s Rosabeth Kanter argues, “Not every innovation idea has to be a
blockbuster. Sufficient numbers of small or incremental innovations can lead to big profits.” Kanter, R. M. “Innovation: The Classic Traps.” Harvard Business Review
84.11 (November 2006): pp. 72-83.
“Legitimizing a New Role: Small Wins and Microprocesses of Change” by T. Reay, K. Golden-Biddle, and K. Germann in the Academy of Management Journal
49.5 (October 2006): pp. 977-998, could be useful in thinking about how to craft new roles for library media specialists.
32 See, for example: Cawelti, G. Portraits of Six Benchmark Schools: Diverse Approaches to Improving Student Achievement. Arlington, VA: Educational Research
Service, 1999. Cawelti, G. “Improving Achievement.” American School Board Journal 186.7 (July 1999): pp. 34-37. Cawelti, G. “A Model for High School Restructuring.”
Educational Forum 60 (Spring 1996): pp. 244-248. Cawelti, G. “Portrait of a Benchmark School.” Educational Leadership 57.5 (February 2000): pp. 42-44.
Also see: Fleischman, S., and J. Heppen. “Improving Low-Performing High Schools: Searching for Evidence of Promise.” The Future of Children 19.1 (Spring
2009): pp. 105-133.
Larson, R. Changing Schools from the Inside Out: Small Wins in Hard Times, Third Edition. Lanham, MD: Rowman and Littlefield Education, 2011.
Louis, K. S. “Beyond ‘Managed Change’: Rethinking How Schools Improve.” School Effectiveness and School Improvement 5.1 (March 1994): pp. 2-24.
There is interesting recent federal evidence that a willingness to try simultaneous multiple improvement strategies is more likely to be found in schools with
higher proportions of low-income and minority students. See “Schools Use Multiple Strategies to Help Students Meet Academic Standards, Especially Schools with
Higher Proportions of Low-Income and Minority Students,” GAO Highlights: Highlights of GAO-10-18, a Report to Congressional Committees, November 16, 2009.