IES Grant Writing Workshop

Transcription

IES Grant Writing Workshop
Connecting Research, Policy and Practice
June 4, 2013
Elizabeth Albro, Ph.D., Associate Commissioner, National Center for Education Research
Joan McLaughlin, Ph.D., Deputy Commissioner, National Center for Special Education Research

Purpose of the Workshop
This workshop will provide instruction and advice on writing a successful application to the Institute of Education Sciences' research grant programs, specifically to the:
• Education Research Grants Program (84.305A)
• Special Education Research Grants Program (84.324A)

Agenda
• Introduction to IES
• Grant Research Topics
• Grant Research Goals
• Four Sections of the Project Narrative
  – Significance
  – Research Plan
  – Personnel
  – Resources

What is IES?
• Research arm of the U.S. Department of Education, non-partisan by law.
• Charged with providing rigorous and relevant evidence on which to ground education practice and policy, and with sharing this information broadly.
• By identifying what works, what doesn't, and why, IES aims to improve educational outcomes for all students, particularly those at risk of failure.

IES Organizational Structure
[Organizational chart: the Office of the Director, the Standards & Review Office, the National Board for Education Sciences, and four centers: the National Center for Education Statistics, the National Center for Education Research, the National Center for Education Evaluation, and the National Center for Special Education Research.]
Missions of the Research Centers
• NCER
  – Supports rigorous research that addresses the nation's most pressing education needs, from early childhood to adult education.
• NCSER
  – Sponsors a rigorous and comprehensive program of special education research designed to expand the knowledge and understanding of infants, toddlers, and students with or at risk for disabilities, from birth through high school.

Overall Research Objectives
• Develop or identify education interventions (i.e., practices, programs, policies, and approaches) that enhance academic achievement and that can be widely deployed
• Identify what does not work and thereby encourage innovation and further research
• Understand the processes that underlie the effectiveness of education interventions and the variation in their effectiveness

Primary Research Grant Programs
• Education Research Grants (84.305A)
• Special Education Research Grants (84.324A)*
These grant programs are organized by research topic and research goal.
*84.324A is not being competed for FY 2014.

Special Education Research
• NCSER will not hold research or research training competitions for FY 2014
• If funds for research are available in FY 2014, NCSER will use them to make additional awards from the FY 2013 grant slates
• NCSER anticipates being able to hold a grant competition for FY 2015

Opportunities for the Study of Individuals with Disabilities in NCER
• Partnerships & Collaborations Focused on Problems of Practice or Policy grants program (84.305H)
• Postsecondary & Adult Education topic of the Education Research Grants competition (84.305A)

NCER Ultimate Outcomes of Interest: Student Outcomes
• Prekindergarten: school readiness (e.g., pre-reading, language, vocabulary, early math and science knowledge, social and behavioral competencies)
• Kindergarten – Grade 12: learning, achievement, and higher-order thinking in reading, writing, mathematics, and science; progress through the education system (e.g., course and grade completion or retention, high school graduation, and dropout); social skills, attitudes, and behaviors that support learning in school
• Postsecondary (grades 13–16): access to, persistence in, progress through, and completion of postsecondary education; for students in developmental programs, additional outcomes include achievement in reading, writing, English language proficiency, and mathematics
• Adult Education (Adult Basic Education, Adult Secondary Education, Adult ESL, and GED preparation): student achievement in reading, writing, and mathematics; access to, persistence in, progress through, and completion of adult education programs

Agenda
• Introduction to IES
• Grant Research Topics
• Grant Research Goals
• Four Sections of the Project Narrative
  – Significance
  – Research Plan
  – Personnel
  – Resources

Grant Topics
• All applications to the primary research grant programs must be directed to a specific topic
  – Note it on the SF 424 Form, Item 4b (Agency Identifier Number)
  – Note it at the top of the Abstract and Project Narrative

Education Research Topics (84.305A)
• Cognition & Student Learning
• Early Learning Programs & Policies
• Education Technology
• Effective Teachers & Effective Teaching
• English Learners
• Improving Education Systems: Policies, Organization, Management, & Leadership
• Mathematics & Science Education
• Postsecondary & Adult Education
• Reading & Writing
• Social & Behavioral Context for Academic Learning

Issues about Topics
• All require student outcomes
• Grade range may vary by topic
• Topics can overlap

Choosing among Overlapping Topics
• What literature are you citing?
• To which topic is your area of expertise best aligned?
• If your focus is on a specific population of students/teachers, go to that program/topic:
  – Is your focus on a specific type of student/teacher (e.g., English Learners), or are you studying them as a subgroup of your sample?

Issues about Topics
• Pre-service programs
  – Only exploratory research can be done on teacher pre-service programs – no development of pre-service programs, evaluation of them, or measures development for them
  – Pre-service components can be developed or evaluated with in-service teachers
  – Leadership pre-service programs are supported if the programs last 24 months or less

Grants Primarily Focused on Professional Development for K-12 Teachers
• Many topics that study K-12 education now require PD grants to be submitted under the Effective Teachers & Effective Teaching topic. This applies to work that would otherwise fall under:
  – Cognition & Student Learning
  – Early Learning Programs & Policies
  – Education Technology
  – English Learners
  – Improving Education Systems: Policies, Organization, Management, and Leadership
  – Mathematics & Science Education
  – Postsecondary & Adult Education
  – Reading & Writing
  – Social & Behavioral Context for Academic Learning

Agenda
• Introduction to IES
• Grant Research Topics
• Grant Research Goals
• Four Sections of the Project Narrative
  – Significance
  – Research Plan
  – Personnel
  – Resources

Grant Research Goals
• All applications to 84.305A must be directed to a specific goal
  – Note it on the SF 424 Form, Item 4b
  – Note it at the top of the Abstract and Research Narrative
• The goal describes the type of research to be done
• Every application is directed to a specific topic/goal combination

What Topic X Goal Fits Your Project?

FY 2014 Research Goals
• Exploration
• Development & Innovation
• Efficacy & Replication
• Effectiveness
• Measurement

Purpose of Exploration Projects
• To identify malleable factors associated with student outcomes, AND/OR
• To identify factors and conditions that may mediate or moderate relations between malleable factors and student outcomes

Malleable Factors
• Malleable factors must be under the control of the education system
  – Something that can be changed by the system
• Examples
  – Student characteristics: behavior, skills
  – Teacher characteristics: practices, credentials
  – School characteristics: size, climate, organization
  – Education interventions: practices, curricula, instructional approaches, programs, and policies

Possible Methodological Approaches for Exploration
• Analyze secondary data
• Collect primary data
• Complete a meta-analysis

Awards for Exploration
• Secondary data analysis or meta-analysis:
  – Maximum of $700,000 total cost (direct + indirect)
  – Maximum of 2 years
• Primary data collection and analysis (with or without secondary analysis):
  – Maximum of $1,600,000 total cost (direct + indirect)
  – Maximum of 4 years
• Applications proposing more than a maximum will not be accepted for review

Purpose of Development & Innovation Projects
• Develop an innovative intervention (e.g., curriculum, instructional approach, program, or policy)
• OR improve existing education interventions
• AND collect data on feasibility and usability in actual education settings
• AND collect pilot data on student outcomes
The development process must be iterative!

Range of Options for the Pilot Study
• Efficacy study (randomized controlled trial)
• Underpowered efficacy study (randomized controlled trial with a small sample size that provides unbiased effect size estimates)
• Single-case study that meets the design standards of the WWC
• Quasi-experimental study based on the use of comparison groups with adjustments to address potential differences between groups (i.e., use of pretests, control variables, matching procedures)

Awards for Development
• Maximum of $1,500,000 total cost (direct + indirect)
• Maximum of 4 years
• In the budget narrative, note the budgeted cost of the pilot study to ensure it does not exceed 35% of total funds
• Applications proposing more than a maximum will not be accepted for review

Efficacy & Replication
• 3 types of Efficacy and Replication projects
• With or without reservations
• Randomized controlled trial (RCT) favored
• Strong quasi-experiment
Purpose #1 •  Evaluate whether or not a fully developed interven-on is efficacious under limited or ideal condi-ons –  Widely-­‐used interven-on –  Interven-on not widely used –  Possible to do so through a retrospec-ve analysis of secondary data collected in the past OR ies.ed.gov 32 Purpose #2 •  Replicate an efficacious interven-on varying the original condi-ons –  Different popula-ons of students (e.g., English language learners) –  Educa-on personnel (e.g., reading specialists versus classroom teachers) –  Segng (e.g., urban versus rural) OR ies.ed.gov 33 Purpose #3 •  Gather follow-­‐up data examining the longer term effects of an interven-on with demonstrated efficacy –  Students –  Educa-on personnel carrying out interven-on ies.ed.gov 34 Key Features of Efficacy & Replica-on Goal •  Ask what might be needed to implement interven-on under rou-ne prac-ce •  Consider role of developer to avoid conflict of interest for developer-­‐evaluators •  Do not require confirmatory mediator analyses but recommend exploratory ones ies.ed.gov 35 Awards for Efficacy & Replica-on Efficacy • Maximum of $3,500,000 total cost (direct + indirect) • Maximum of 4 years Efficacy Follow-­‐Up • Maximum of $1,200,000 total cost (direct + indirect) • Maximum of 3 years • Applica;ons proposing more than a maximum will not be accepted for review ies.ed.gov 36 Purpose of Effec-veness Projects •  Evaluate whether a fully developed interven-on that has evidence of efficacy is effec-ve when implemented under rou-ne condi-ons through an independent evalua-on OR •  Gather follow-­‐up data examining the longer term impacts of an interven-on implemented under rou-ne condi-ons on students ies.ed.gov 37 Effec-veness Goal •  IES expects researchers to –  Implement interven-on under rou-ne prac-ce –  Include evaluators independent of development/distribu-on –  Describe strong efficacy evidence for interven-on (from at least 2 previous studies) •  Does not expect wide generalizability from a single study –  Expects mul-ple Effec-veness projects to this end –  Sample size is not a key dis-nc-on from Efficacy •  Does not require confirmatory mediator analyses but encourages exploratory ones •  Cost of implementa-on is limited to 25% of budget ies.ed.gov 38 Awards for Effec-veness Effec:veness • Maximum of $5,000,000 total cost (direct + indirect) • Maximum of 5 years Effec:veness Follow-­‐Up • Maximum of $1,500,000 total cost (direct + indirect) • Maximum of 3 years • Applica;ons proposing more than a maximum will not be accepted for review 39 ies.ed.gov Purpose of Measurement Grants •  Develop new assessments •  Refine exis-ng assessments (or their delivery) –  To increase efficiency –  To improve measurement –  To improve accessibility –  To provide accommoda-ons •  Validate exis-ng assessments –  For specific purposes, contexts, and popula-ons ies.ed.gov 40 Focus of Measurement Grants •  Assessments may also be developed in other goals, but not as the primary focus •  Primary product of measurement grant is the design, refinement, and/or valida-on of an assessment ies.ed.gov 41 Awards •  Maximum of $1,600,000 total cost (direct + indirect) •  Maximum of 4 years •  Applica;ons proposing more than a maximum will not be accepted for review ies.ed.gov 42 Alend to Changes from Previous 84.305A •  See Page 11 for highlights of changes in the FY 2014 RFA. •  Carefully read the full RFA. •  Applicants to all goals must describe plans for dissemina;on as appropriate to the proposed work. 
Expected Products
• The expected products for each goal can help you identify the right goal for your project
• At the end of a funded project, IES expects you to provide…

Expected Products for Exploration
• A clear description of the malleable factors (and mediators/moderators) and empirical evidence of the link between the malleable factors (and mediators/moderators) and student outcomes
• A clear conceptual framework – theory
• A determination about next steps (do the results suggest a future Goal 2, Goal 3, or Goal 5 project?)

Expected Products for Development & Innovation
• A fully developed version of the intervention
  – including supporting materials
  – a theory of change
  – evidence that intended end users understand and can use the intervention
• Data that demonstrate end users can feasibly implement the intervention
• Pilot data regarding promise for generating the intended beneficial student outcomes
  – including fidelity measures
  – evidence of implementation fidelity

Expected Products for Efficacy & Replication
• Evidence of intervention impact on relevant student outcomes relative to a comparison condition, using a research design that meets (with or without reservations) WWC standards
• Conclusions on, and revisions to, the relevant conceptual framework (i.e., the theory of change)
• If there is a beneficial impact – identification of the organizational supports, tools, and procedures needed for sufficient implementation in a future Replication or Effectiveness study
• If there is no beneficial impact – a determination of whether a future Goal 2 project is needed to revise the intervention/implementation

Expected Products for Effectiveness
• Evidence of intervention impact under routine implementation conditions on relevant student outcomes relative to a comparison condition, using a research design that meets (with or without reservations) WWC standards
• Conclusions on, and revisions to, the relevant conceptual framework (i.e., the theory of change)
• If there is a beneficial impact – identification of the organizational supports, tools, and procedures needed for sufficient implementation under routine conditions
• If there is no beneficial impact – an examination of why the findings differed from previous efficacy studies and a determination of whether a future Goal 2 project is needed to revise the intervention/implementation
Expected Products for Measurement
• Projects to develop/refine and validate an assessment:
  – Description of the assessment and its intended use
  – Description of the iterative development processes used to develop/refine the assessment, including field testing procedures and processes for item revision
  – Conceptual framework for the assessment and its validation activities
  – Description of the validation activities
  – Reliability and validity evidence for specific purpose(s), populations, and contexts
• Projects to validate an existing assessment:
  – Conceptual framework for the assessment and its validation activities
  – Description of the validation activities
  – Reliability and validity evidence for specific purpose(s), populations, and contexts

Maximum Award Amounts (84.305A)
Maximum duration and award (total cost, direct + indirect), by goal:
• Exploration, with secondary data: 2 years, $700,000
• Exploration, with primary data: 4 years, $1,600,000
• Development & Innovation: 4 years, $1,500,000
• Efficacy & Replication: 4 years, $3,500,000 (follow-up study: 3 years, $1,200,000)
• Effectiveness: 5 years, $5,000,000 (follow-up study: 3 years, $1,500,000)
• Measurement: 4 years, $1,600,000

NCER Grants by Goal (2004–2012)
[Figure: chart of NCER grants awarded by goal, 2004–2012.]

Agenda
• Introduction to IES
• Grant Research Topics
• Grant Research Goals
• Four Sections of the Project Narrative
  – Significance
  – Research Plan
  – Personnel
  – Resources

The Application's Project Narrative
• Key part of your application
• 4 sections
  – Significance
  – Research Plan
  – Personnel
  – Resources
• Each section is scored, and an overall score is given
• Requirements vary by program and goal
• 25 pages, single spaced

Significance
• Describes the overall project
  – Your research question to be answered; the intervention to be developed or evaluated; or the measure to be developed and/or validated
• Provides a compelling rationale for the project
  – Theoretical justification
    • Logic models, change models
  – Empirical justification
  – Practical justification

Significance (cont.)
• Do not assume reviewers know the significance of your work
• Do not quote the RFA back on the general importance of a topic
  – e.g., the RFA paragraph on the lack of reading proficiency of 8th and 12th graders based on NAEP data
• Do quote the RFA back if a specific topic is highlighted and your work will address that topic
  – e.g., disproportionality in discipline (Social/Behavioral); the need for developmentally appropriate measures of kindergarten readiness (Early Learning)

Significance: Exploration
• Describe the malleable factors, moderators, and mediators to be examined and how you will measure them
• Justify their importance
  – Theoretical rationale
  – Empirical rationale
  – Practical importance
• Show how the work will lead to a useful next step
  – Development or modification of interventions to address the identified malleable factors or underlying process to improve student outcomes
  – Identification of interventions for more rigorous evaluation
  – A conceptual framework for developing or refining an assessment

Significance: Development & Innovation
• Context for the proposed intervention
  – Why it is needed: what problem exists
  – What exists now (there may be many alternatives already)
• Detailed description of the intervention to be developed
  – Clearly identify components already developed, partially developed, and to be developed (no jargon)
  – Don't overextend (number of grades, full vs. part year)
• Theory of change (theoretical support)
• Empirical support

Significance: Development & Innovation (cont.)
• Practical importance:
  – Meaningful impact, feasibility, affordability
• Answer the question: Why will this intervention produce better student outcomes than current practice?

Significance: Efficacy & Replication
• Detailed description of the intervention
  – Show that it is fully developed, describe the implementation process, and show that it is ready to be evaluated
• Justification for evaluating the intervention
  – Importance of the practical problem it is to address
  – If it is in wide use, show it has not been rigorously evaluated
  – If it is not in wide use, show evidence of feasibility and promise to address the practical problem
• Theory of change
  – Theoretical and empirical rationale
  – Direct impact on student outcomes, or impact through mediators
• Justify that it could lead to better outcomes than current practice

Significance: Effectiveness
• Detailed description of the intervention
• Justification for evaluating the intervention
  – Evidence of meaningful impacts (from 2 Efficacy studies)
• Theory of change
• Justify that it could lead to better outcomes than current practice
• Implementation under normal conditions
• Independent evaluation
• Evidence that implementation can reach high enough fidelity to have meaningful impacts

Significance: Measurement
• The specific need, and how the assessment will be important to the field of education research, practice, and stakeholders
• Current assessments, why they are not satisfactory, and why the new assessment will be an improvement
• Conceptual framework and empirical evidence for the proposed assessment, including key components
• How grant activities will provide convincing evidence of reliability and validity for the intended purposes and populations
• If the grant is to further develop/refine an assessment from a previous Measurement award, describe the status of the previous award and justify the need for further development

Significance – 2 Key Problem Areas
1. Description of the malleable factor/intervention
  – Unclear what the intervention is: this confuses reviewers
    • Many components that may be applied at different times – show how they fit together – a graphic may help
  – Unclear how it is to be implemented to ensure fidelity
  – Intervention not shown to be strong enough to expect an impact
    • Especially true for information interventions – e.g., providing data on students, short teacher workshops
  – Overly focused on actions, not content
    • Ex.: 20 hours of PD held over 10 weeks, but no detail on what is to be covered in the sessions

Significance – 2 Key Problem Areas (cont.)
2. Theory of change
  – Why a malleable factor should be related to a student outcome
  – Why an intervention should improve outcomes versus current practice
  – Why an assessment/instrument should measure a specific construct
  – A well-laid-out theory of change makes clear what is expected to happen and in what order
  – It makes it easy for reviewers to understand the research plan – why certain outcomes are measured
  – A graphic can be helpful

Theory of Change Example
• A pre-K intervention
  – Early Learning Programs and Policies: e.g., a specific practice, a curriculum, expanded access, full-day pre-K
• The problem
  – What and for whom (e.g., children who are not ready for kindergarten)
  – Why (e.g., weak pre-reading skills, cannot focus)
  – Where it fits in the developmental progression (e.g., prerequisites to decoding, concepts of print)
  – Rationale/evidence supporting these specific intervention targets at this particular time

Theory of Change Example (cont.)
• How the intervention addresses the need and why it should work
  – Content: what the student should know or be able to do; why this meets the need
  – Pedagogy: instructional techniques and methods to be used; why they are appropriate
  – Delivery system: how the intervention will arrange to deliver the instruction
• Describe what aspects of the above are different from the counterfactual condition
• Describe the key factors or core ingredients most essential and distinctive to the intervention

Logic Model Graphic
[Figure: example logic model. Target population: 4-year-old pre-K children. Intervention: exposed to the intervention. Proximal outcomes: positive attitudes to school, improved preliteracy skills, learning appropriate school behavior. Distal outcomes: increased school readiness, greater cognitive gains in kindergarten.]
Mapping Sample Characteristics onto Theory
[Figure: the same logic model, annotated with sample information. Sample descriptors: basic demographics; diagnostic, need/eligibility identification. Potential moderators: setting, context; personal and family characteristics; prior experience.]
Mapping Intervention Characteristics onto Theory
[Figure: the same logic model, annotated with intervention information. Independent variable: treatment vs. control experimental condition. Generic fidelity: T and C exposure to the generic aspects of the intervention (type, amount, quality). Specific fidelity: T and C(?) exposure to the distinctive aspects of the intervention (type, amount, quality). Potential moderators: characteristics of personnel; intervention setting, context (e.g., class size).]
Mapping Outcomes onto Theory
[Figure: the same logic model, annotated with outcome measurement. Focal dependent variables: pretests (pre-intervention), posttests (at the end of the intervention), follow-ups (lagged after the end of the intervention). Other dependent variables: construct controls (related DVs not expected to be affected), side effects (unplanned positive or negative outcomes), mediators (DVs on causal pathways from the intervention to other DVs).]
Logic Model Graphics: Don't Do This!
• Overwhelm the reader
• Use color as a key (applications are reviewed in black and white)

[Figure: an intentionally overwhelming example: the development model for a "WL" (Learning Walks) intervention developed by "ABC" with the assistance of "DEF", showing roughly 18 numbered processes across a school year (coach recruitment and training, data-tool development, collection of achievement and demographic data, profile analysis, goal setting, weekly Learning Walks and debriefs, creation and implementation of professional development units, observations, interviews, model revisions, and annual reporting), with feedback loops among coaches, primary leadership teams (PLTs), and secondary leadership teams (SLTs).]

Research Plan
• Describe the work you intend to do
  – How you will answer your research question; develop your intervention; evaluate the intervention; or develop and/or validate your assessment
• Make certain the Research Plan is aligned to the Significance section
  – All research questions should have justification in Significance
• Lay out a step-by-step process
  – A timeline is strongly recommended!

Logic Model Graphic: Setting/Population/Sample
[Figure: the example logic model again, used here to frame the setting, population, and sample: target population (4-year-old pre-K children), intervention (exposed to the intervention), proximal outcomes (positive attitudes to school, improved preliteracy skills, appropriate school behavior), and distal outcomes (increased school readiness, greater cognitive gains in kindergarten).]
Setting, Population, and Sample
• Identify the places where you will be doing the research
• Identify the population you are addressing
• Identify the sample
  – Inclusion and exclusion criteria
  – Sample size (issues of power for the analysis)
  – The importance of attrition and how to address it
  – External validity: can you generalize to your population, or only to a subset of it?
• If using secondary data, discuss these for the datasets you will be using

Logic Model Graphic: Measures
[Figure: the example logic model again, used here to frame the measures for the target population, the intervention, and the proximal and distal outcomes.]
Outcome Measures
• For both proximal and distal outcomes
• Sensitive (often narrow) measures
• Measures of broad interest to educators
• Measures not expected to be linked can be used as additional evidence
• Describe reliability, validity, and relevance
• Do not include measures not linked to the research questions
• The multiple comparison issue is gaining importance (see the illustration below)
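As an illustration of the multiple comparison issue (this example is added here and was not on the slides): if a study tests $m$ confirmatory outcomes at a nominal significance level $\alpha$, a simple Bonferroni adjustment evaluates each test at

\[
\alpha_{\text{adj}} = \frac{\alpha}{m}, \qquad \text{e.g., } \frac{0.05}{5} = 0.01 \text{ for five confirmatory outcomes.}
\]

Less conservative procedures (such as Benjamini–Hochberg control of the false discovery rate) are also widely used; whichever is chosen, the application should state which outcomes are confirmatory and how the adjustment will be applied.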
Other Measures
• Measures that feed back into the iterative development process
• Fidelity of implementation
  – Operating as intended
  – Able to address comparison groups
• Feasibility
• Measurement projects:
  – Alternate forms – horizontal equating
  – Vertical equating if measuring growth
  – Test fairness
  – Non-student instruments must be validated against student outcomes

Measures Derived from Qualitative Research
• Measures can be derived from qualitative research (surveys, observations, focus groups, interviews)
  – The actual items to be used
  – How the items link to constructs – the validity of these measures
  – Procedures for collection and coding (address inter-rater reliability)
  – How consent will be obtained for an adequate percentage of the sample
  – How qualitatively collected measures will be used in the analysis of quantitative outcomes (e.g., test scores)

Research Design
• Start off with your research questions
• The research design should answer your questions
  – Do not have the design section written independently by a methodologist
  – If sections are written by different people, have everyone read through the whole application
• Issues common to designs across goals
  – Attrition and missing data
  – Obtaining access and permission to collect/use data

Research Design (cont.)
• Experiments are allowed under Exploration and Development
  – Small-scale experiments may provide reliable information to meet the purpose/requirements of the goal
  – They are not intended to test for efficacy

Research Design Varies by Goal
• Exploration
  – Primary data
    • Sampling strategy
    • Data collection and coding processes
    • Small-scale, tightly controlled experiments allowed
  – Secondary data
    • Descriptive analysis
    • Statistical correlational analysis
    • Analyses attempting to address selection issues: propensity score matching (PSM)
    • Mediation analyses

Research Design Varies by Goal
• Development & Innovation
  – The focus should be on the iterative development process
  – Feasibility study: use in an authentic education setting
  – The type of pilot study depends upon
    • The complexity of the intervention
    • The level of intervention implementation
    • The 35% limit on grant funds that can be used for the pilot study
    • The continuum of rigor (RCT, underpowered RCT, single-case study, QED with comparison group)

Research Design Varies by Goal
• Efficacy & Replication
  – RCT favored
    • Unit of randomization and its justification
    • Procedures for assignment
  – Strong quasi-experiment: justify why an RCT is not possible
    • How it reduces or models selection bias
    • Discuss threats to internal validity and the conclusions to be drawn

Research Design Varies by Goal
• Efficacy & Replication (cont.)
  – Describe the control/comparison group
  – Power analysis/MDES: show the calculation and assumptions (see the sketch below)
  – Fidelity of implementation study in both T and C
  – Mediator and moderator analyses
  – Contamination issues: schools vs. classrooms
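As an illustration of the kind of power/MDES calculation reviewers look for (a standard approximation, not a formula prescribed by IES): for a two-level design in which schools are randomized and students are nested within schools, the minimum detectable effect size is often approximated as

\[
\mathrm{MDES} \approx M_{J-K-2}\,\sqrt{\frac{\rho\,(1-R_2^{2})}{P(1-P)\,J} \;+\; \frac{(1-\rho)\,(1-R_1^{2})}{P(1-P)\,J\,n}}
\]

where $J$ is the number of schools, $n$ is the number of students per school, $P$ is the proportion of schools assigned to treatment, $\rho$ is the intraclass correlation, $R_1^{2}$ and $R_2^{2}$ are the proportions of student- and school-level variance explained by covariates, $K$ is the number of school-level covariates, and $M_{J-K-2}$ is a multiplier (roughly 2.8 for 80% power with a two-tailed $\alpha = .05$ and ample degrees of freedom). Whatever formula is used, the application should state and justify the assumed values (e.g., $\rho$, $R^{2}$, attrition).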
Research Design Varies by Goal
• Effectiveness
  – Same as Efficacy & Replication, except that it
    • requires implementation under routine conditions
    • requires an independent evaluator
    • includes a cost-feasibility analysis

Research Design Varies by Goal
• Measurement
  – The plan to develop or refine the assessment
    • Evidence of constructs
    • Interpretation of assessment results
    • Item development and selection
    • Procedures for administering and scoring
  – Reliability and validity studies

Analysis Depends on Design
• Describe how your analysis answers your research questions
• Describe analyses of qualitative data

Analysis (cont.)
• Show your model
  – Identify the coefficients of interest and their meaning
  – Show different models for different analyses
  – Include equations, e.g., $Y_{ijk} = \mu + \alpha_i + \beta_j + (\alpha\beta)_{ij} + \varepsilon_{ijk}$
• Address clustering (see the sketch below)
• Describe your plan for missing data – check for equivalence at the start and for attrition bias
• Use sensitivity tests of assumptions
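For instance, a minimal sketch of a model that addresses clustering when treatment is assigned at the school level (a generic two-level specification offered here as an illustration, not a required form) is:

\[
\text{Level 1 (student } i \text{ in school } j\text{):}\quad Y_{ij} = \beta_{0j} + \beta_{1} X_{ij} + \varepsilon_{ij}, \qquad \varepsilon_{ij} \sim N(0, \sigma^{2})
\]
\[
\text{Level 2 (school):}\quad \beta_{0j} = \gamma_{00} + \gamma_{01} T_{j} + u_{0j}, \qquad u_{0j} \sim N(0, \tau)
\]

Here $X_{ij}$ is a student-level pretest, $T_{j}$ indicates assignment to the treatment condition, $\gamma_{01}$ is the treatment effect of interest, and the random effect $u_{0j}$ captures the clustering of students within schools. The model in an actual application should mirror the proposed design (levels, covariates, blocking).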
Personnel Section
• Describe key personnel
  – Show that every aspect of the project has a person with the expertise to do it
    • Appropriate methodological expertise
    • A substantive expert for all issues addressed
    • Do not propose to hire a key person with X expertise
    • Project management skills
  – Show that every aspect of the project has enough time from an expert
• Orient CVs so they are specific to the project
  – 4 pages, plus 1 page for other sources of support

Personnel Strategies for the PI
• Senior researcher as PI
  – Show adequate time to serve as PI
  – Make credentials clear (not all reviewers may know them)
• Junior researcher as PI
  – Show you have adequate expertise not only to do the work but also to manage the project
    • Continuation of graduate research
    • Management skills as a graduate student
  – Reviewers are more comfortable if you have senior person(s) on the project to turn to for advice
    • Co-PI, Co-I, contractors, advisory board
    • Have them on for enough time to be taken seriously

Resources
• Show that the institutions involved have the capacity to support the work
  – Do not use university boilerplate
• Show that all organizations involved understand and agree to their roles
  – What will each institution, including schools, contribute to the project
  – Show strong commitment from schools and districts, and alternatives in case of attrition
• If you have received a prior grant award for similar work, describe the success of that work

Resources (cont.)
• Appendix C should back up the Resources section
• Include detailed Letters of Agreement from research institutions, States, districts, and schools
  – Do the letters show that partners understand their role in the project (e.g., random assignment to condition, time commitments)?
  – Do the letters show that you have access to all the data necessary to do the proposed work?

Critical Issues for the Project Narrative
• Opening paragraph
• Clarity of writing

The Opening Paragraph Is Critical
• The opening paragraph sets the scene for the reviewers
  – It identifies the significance of the work to be done and what actually will be done
  – Reviewers use it to create a categorization system for organizing the information in the rest of the application
  – You can lose your reviewers right away with an unclear opening

Importance of Clarity of Writing
• Reviewers complain about lack of clarity
  – Significance too general
  – Lack of detail regarding the intervention, the development cycle, or the data analysis
  – Use of jargon and assumptions of knowledge
  – Sections don't support one another

Budget and Budget Narrative
• Provide a clear budget and budget narrative for the overall project and for each sub-award
• Provide detail on the assumptions used in the budget (e.g., assumptions for travel)
• The IES Grants.gov Application Submission Guide describes the budget categories
• Check the RFA for specific budget requirements for the Research Goals
• Ensure alignment among the Project Narrative, Budget, and Budget Narrative

Other IES Funding Opportunities
Research Programs
• Statistical and Research Methodology in Education (84.305D)
  – Statistical and Research Methodology Grants
  – Early Career Statistical and Research Methodology Grants
• Partnerships and Collaborations Focused on Problems of Practice or Policy Competition (84.305H)
  – Researcher-Practitioner Partnerships in Education Research
  – Continuous Improvement in Education Research
  – Evaluation of State and Local Education Programs and Policies
• Education Research and Development Centers (84.305C)
  – Developmental Education Assessment and Instruction
  – Knowledge Utilization

Other IES Funding Opportunities (cont'd)
Research Training Programs
• Research Training Programs in the Education Sciences (84.305B)
  – Predoctoral Interdisciplinary Research Training
  – Methods Training for Education Researchers
  – Training in Education Research Use and Practice

Important Dates and Deadlines
• Letter of Intent due: June 6, 2013
• Application package posted: June 6, 2013
• Application deadline: September 4, 2013, 4:30:00 PM Washington, DC time
• Start dates: July 1, 2014 to September 1, 2014

Finding Application Packages
• FY 2014 application packages will be available on www.grants.gov

Review Application Requirements
• Request for Applications
  – Currently available at http://ies.ed.gov/funding
• Grants.gov Application Submission Guide
  – Will be available 6/6/2013 at http://ies.ed.gov/funding
• Application Package
  – Will be available on Grants.gov on 6/6/2013

Peer Review Process
• Applications are reviewed for compliance and responsiveness to the RFA
• Applications that are compliant and responsive are assigned to a review panel
• Two or three panel members conduct a primary review of each application
• The most competitive applications are reviewed and discussed by the full panel

Help Us Help You
• Read the Request for Applications carefully
• Call or e-mail IES Program Officers early in the process
• As time permits, IES program staff can review draft proposals and provide feedback
Don't be afraid to contact us!

Wrap-Up and Final Q&A
Elizabeth Albro, Associate Commissioner, Teaching and Learning, NCER: [email protected], (202) 219-2148
Joan McLaughlin, Deputy Commissioner, NCSER: [email protected], (202) 219-1309