TABE Best Practice Reports and FAQ for TABE Online
Why TABE 9 & 10 Can Be Used Interchangeably In Your Assessment Program

The National Research Council considers assessments to be alternate forms if they measure the same constructs, are intended for the same purposes, and are administered using the same directions. If two forms of an assessment meet these three conditions, they can be used interchangeably in an assessment program. TABE meets these three conditions.

Two forms of a test do not have to be parallel forms to be considered alternate forms. Parallel forms have equal number correct means, equal number correct standard deviations, and equal correlations with other measures for any given population. This means that if someone took two parallel forms of a test, without any change in ability level, they would obtain the same number correct score. There would be no need to convert the number correct scores to scale scores or standard scores. Parallel forms are rare and generally do not exist in the educational testing market.

The more common scenario is for two forms of a test to be equivalent forms. Equivalent forms do not have equal number correct means or standard deviations, but the differences in the number correct statistics are compensated for by the conversion of the number correct scores to scale scores and derived scores, such as national percentiles and normal curve equivalents. Equivalent forms also have form-specific norms tables, which provide the number correct to scale score conversions. Therefore, if a student takes two forms of an assessment such as TABE, it is very likely that he/she will obtain different number correct scores even if his/her ability level remains constant. This is because one form of an assessment is almost always going to be slightly more difficult, or slightly easier, than another form. This is true for TABE. Table 1 below compares summary statistics for TABE 9 & 10 Level M Complete Battery Language.

Table 1: TABE 9 & 10 Equivalence (Level M Complete Battery Language)

Descriptive Statistic                  TABE 9    TABE 10
Number of items                        55        55
Mean number correct                    32.53     34.03
Mean p-value                           .59       .62
Standard deviation                     10.46     10.22
Standard error of measurement (SEM)    3.14      3.07
KR-20 (reliability)                    .91       .91

As Table 1 shows, Form 9 is slightly more difficult than Form 10. The mean number correct is 1.5 points lower for Form 9 than for Form 10, and the mean p-value (item difficulty value) for Form 9 is slightly lower than for Form 10. However, the other data show how equivalent TABE 9 & 10 are: not parallel, but close. The number correct standard deviations are very similar, the standard errors of measurement (SEM) are very similar, and the KR-20 reliability coefficients are identical.

The following example illustrates how TABE 9 & 10 provide equivalent scores. Two students, Bill and Maria, are in the same Language class, have identical grades, and are judged by their teacher to have equal ability. To confirm this, the teacher gave Bill TABE Form 9 and Maria TABE Form 10 of the Level M Language test. Bill answered 36 items correctly and Maria answered 38 items correctly. At first the teacher thought Maria had performed better on the test than Bill, but when she used her norms book to get their grade equivalents and national percentiles, she found the results presented in Table 2.

Table 2: Bill's and Maria's TABE Results

Student   TABE Form   # Correct   Scale Score   SEM   Grade Equivalent   Percentile
Bill      9           36          513           17    5.2                51
Maria     10          38          514           18    5.3                51

Despite answering two fewer items correctly, Bill has virtually identical results in terms of scale scores, grade equivalents, and percentile scores.
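For readers who want to check the arithmetic, the short Python sketch below (written for this FAQ, not taken from any CTB scoring software) reproduces the SEM values in Table 1 from the classical relation SEM = SD x sqrt(1 - reliability), and uses a two-entry lookup containing only the data points published in Table 2 to show how form-specific conversions compensate for the difference in form difficulty.

```python
import math

def sem(standard_deviation: float, reliability: float) -> float:
    """Classical test theory estimate: SEM = SD * sqrt(1 - reliability)."""
    return standard_deviation * math.sqrt(1.0 - reliability)

# Level M Complete Battery Language statistics from Table 1.
print(round(sem(10.46, 0.91), 2))  # 3.14 for TABE 9
print(round(sem(10.22, 0.91), 2))  # 3.07 for TABE 10

# Form-specific number correct to scale score conversions. A real norms table
# covers every obtainable number correct score; only the two data points
# published in Table 2 are included here.
number_correct_to_scale = {
    ("Form 9", 36): 513,   # Bill
    ("Form 10", 38): 514,  # Maria
}
print(number_correct_to_scale[("Form 10", 38)] - number_correct_to_scale[("Form 9", 36)])
# 1 scale score point apart, despite a 2-point difference in number correct
```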
Their scale scores and GE scores are slightly different because, when two different forms of a test are scaled using number correct scaling, there is simply not a scale score for every possible number correct score. For example, the scale for Form 9 Level M Language goes from 260 to 807. Since there are only 55 items on the test, there are many potential scale score points that will not have a number correct score associated with them.

Appropriate Use of the TABE® 9&10 Locator Test

Locator tests are given to help determine which level of an assessment to administer to obtain the most accurate information about a student's academic strengths and weaknesses. Locator tests, such as those for TABE 9&10, are built to measure a wide range of ability with a limited number of items for each content area. As a result, the information from a locator test should be viewed as only a very rough estimate of the student's functional level, not as an absolute prediction. As is true for all tests (and in accordance with Standard 13.7 of the Standards for Educational and Psychological Testing; AERA, APA, & NCME, 1999), decisions about a student should not be made on the basis of a single locator test score, but should include other relevant information about the student. That said, it is often the case with adult students that little is known about a student's ability level when a test such as TABE needs to be administered, so locator tests are heavily relied upon to decide which level of the test to administer.

Because locator tests have a limited number of items, they do not provide results that are as reliable as the main assessments, nor can the same kinds of generalizations about a student's probability of success in academic coursework be made from their results. Therefore, locator tests should never be used in place of a main assessment such as the TABE Survey or TABE Complete Battery.

Moreover, the standard error of measurement (SEM) should be taken into account when using results from a locator test. SEM is an attribute of all tests because tests sample from a content domain, just as the results of a Gallup Poll always contain sampling error. Sampling error in Gallup Poll results is directly related to the size of the sample: the larger the sample, the lower the sampling error. The same is true for a test: the SEM will be lower if a larger sample of items is given. If a student's score on a locator test is right at a cut-score boundary, SEM alone could lead to the student being identified as having more, or less, ability than he/she actually has. For example, the recommended cut-scores for the Language Locator Test are as follows (a short placement sketch based on these cut-scores appears after the list):

• 6 items correct or below: administer Level E
• 7-8 items correct: administer Level M
• 9-10 items correct: administer Level D
• 11-12 items correct: administer Level A
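As an illustration only (a sketch written for this FAQ, not part of any CTB product), the Python snippet below applies the Language Locator cut-scores listed above and flags scores that fall at the lower bound of a band, which is where the SEM issue discussed in the next paragraph is most likely to change a placement.

```python
# Illustration only: map a Language Locator raw score to a recommended TABE level
# using the cut-scores listed above. Scores at the lower bound of a band are
# flagged, because a score within about one SEM of a boundary could just as
# easily point to the adjacent (lower) level.

LANGUAGE_LOCATOR_BANDS = [
    (0, 6, "E"),    # 6 items correct or below
    (7, 8, "M"),
    (9, 10, "D"),
    (11, 12, "A"),
]

def recommend_language_level(items_correct: int) -> tuple[str, bool]:
    """Return (recommended level, True if the score sits at a band's lower bound)."""
    for low, high, level in LANGUAGE_LOCATOR_BANDS:
        if low <= items_correct <= high:
            return level, (items_correct == low and level != "E")
    raise ValueError("The Language Locator has 12 items; expected 0-12 items correct.")

print(recommend_language_level(9))   # ('D', True)  -> borderline; consider Level M
print(recommend_language_level(10))  # ('D', False)
```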
The SEM for the TABE Language Locator Test is 1.42, so a student could be identified as ready for Level D after getting 9 items correct when his/her actual functional level calls for Level M (i.e., 9 - 1.42 = 7.58). The recommended cut-scores and SEM values are shown in Table 3.

Table 3: Recommended TABE Locator Test Cut-Scores

Reading        Mathematics   Language      TABE level to administer
6 and below*   4-6**         6 and below   E
7-8            7-8           7-8           M
9-10           9-11          9-10          D
11-12          12-16         11-12         A
SEM = 1.26     SEM = 1.54    SEM = 1.42

To avoid administering a TABE level that is too difficult for the student, a good rule of thumb is to administer the lower level whenever the student scores at the lower bound of a recommended cut-score range (e.g., if a student got 9 Mathematics Locator Test items correct, administer Level M rather than Level D; if he/she got 10 or 11 correct, administer Level D). Because TABE is vertically scaled across its four levels, it theoretically does not matter if a student takes an adjacent level; the scale score would be the same. However, taking a level of TABE that is appropriate for the student's ability level will provide more accurate diagnostic information and will be a less frustrating experience for the student.

Individual Profile: Johnson, Mike

Report Criteria
ID: 513160
Test Name: TABE 9 Online Complete Battery
Test Finish Date: 08-09-2007
Report Date: 08-24-2007 06:39:27 PM
State: Wisconsin
District: MATC
School: Downtown
Class: Monday ABE
Test Scheduler: Rose West

Test Results

Content Area         Level  Number of  Total    Attempted  Scale  Grade       National    Normal Curve  National  % Objectives
                            Questions  Correct             Score  Equivalent  Percentile  Equivalent    Stanine   Mastery
Applied Mathematics  A      50         12       12         423    3.3         17          30            3         22
Language             A      55         12       12         337    1.4         5           14            2         0
Language Mechanics   A      20         12       12         565    9.5         75          64            6         0
Math Computation     A      40         12       12         486    5.2         42          46            5         0
Reading              A      50         12       12         325    1.6         5           16            2         20
Spelling             A      20         12       12         531    8.6         63          57            6         33
Vocabulary           A      20         12       12         497    5.2         39          44            4         67
Total Battery*              195        48       48         372    2.2         5           16            2
Total Mathematics**         90         24       24         454    4.4         26          37            4

Predictive Analysis / Recommendation

Content Area     Predictive GED Score   Recommended Activity
Average          280                    Instruct
Math             310                    Instruct
Reading          220                    Instruct
Science          230                    Instruct
Social Studies   230                    Instruct
Writing          290                    Instruct

Content Area        NRS Level***
Language            Level 1
Reading             Level 1
Total Mathematics   Level 3

Performance on Objectives (mastery levels: Non-Mastery, Partial Mastery, Mastery)

Applied Mathematics
Objective                       Number of Questions   Total Correct   Attempted   Percent Correct
Computation in Context          4                     0               0           0
Data Analysis                   7                     1               1           14
Estimation                      4                     3               3           75
Geometry and Spatial Sense      6                     0               0           0
Measurement                     6                     1               1           17
Number and Number Operations    6                     1               1           17
Patterns, Functions, Algebra    8                     1               1           13
Problem Solving and Reasoning   5                     4               4           80
Statistics and Probability      4                     1               1           25
Average                                                                           30.1

Language
Capitalization                  5                     2               2           40
Paragraph Development           9                     0               0           0
Punctuation                     6                     1               1           17
Mastery Level Group List Report

Report Criteria
Test Name: TABE 9 Online Complete Battery
State: CTB QA TABE Scoring and Report
District: AR Scoring District
School: AR Scoring School
Class: Class A
Test Dates: From 06-01-2007 To 06-01-2008
Report Date: 08-24-2007 06:37:44 PM
Test Scheduler: Rose West
Filters: Norm Referenced Scores: All Students (no demographics were selected, so this report includes all students)

Total number of students in this report: 100

Content Area          Number of   Mean Scale   Mean Grade   Median National   Mean Normal Curve
                      Students    Score        Equivalent   Percentile        Equivalent
Applied Mathematics   100         559.0        9.0
Language              100         468.4        3.1          24.0              35.0
Language Mechanics    100         708.8        12.9         98.0
Math Computation      100         569.4        8.6
Reading               100         492.9        5.2
Spelling              100         646.3        12.9
Vocabulary            100         676.1        12.9
*Total Mathematics    100         565.1        8.9          80.0              68.0
**Total Battery       100         509.8        5.7          50.0              50.0

Student Distribution by Recommended Action

Content Area     Instruct   Review   Test
Writing          45         34       20
Social Studies   37         36       26
Science          41         38       20
Reading          39         38       22
Math             25         32       42
Average          49         24       26

Student Distribution by NRS Level***

NRS Level   Language   Reading   Total Math
Level 1     28         26        0
Level 2     28         12        22
Level 3     12         16        7
Level 4     14         16        18
Level 5     6          10        8
Level 6     12         20        44

Test Results: All Students (Results per page: 15)
Score types: SS = Scale Score, GE = Grade Equivalent, NP = National Percentile, NRS = Literacy Levels
Columns: Applied Mathematics, Language, Language Mechanics, Math Computation, Reading, Spelling, Vocabulary, Total Mathematics, Total Battery

Johnson, Mike (ID: 513160, Level/Form: A9)
  SS    423   337   565   486   325   531   497
  GE    3.3   1.4   9.5   5.2   1.6   8.6   5.2
  NP    17    5     75    42    5     63    39
  NRS   --    1     --    --    1     --    --

Smith, Tim (ID: 513169, Level/Form: A9, Test Date: 08-09-2007)
  SS    375   337   337   290   300   340   300   332   323
  GE    2.3   1.4   1.4   1.8   1.1   2.3   1.1   2.1   1.4
  NP    8     5     8     3     4     9     4     2     2
  NRS   --    1     --    --    1     --    --    2     --

Jones, Bob (ID: 513259, Level/Form: A9, Test Date: 08-09-2007)
  SS    375   337   337   290   300   340   300   332   323
  GE    2.3   1.4   1.4   1.8   1.1   2.3   1.1   2.1   1.4
  NP    8     5     8     3     4     9     4     2     2
  NRS   --    1     --    --    1     --    --    2     --

Lee, Rich (ID: 513170, Level/Form: A9, Test Date: 08-09-2007)
  SS    375   337   337   290   300   340   300   332   323
  GE    2.3   1.4   1.4   1.8   1.1   2.3   1.1   2.1   1.4
  NP    8     5     8     3     4     9     4     2     2

Item Analysis Report

Report Criteria
Test Name: TABE 10 Online Complete Battery
District: CTB Demo
School: Reports Demo
Class: All
Test Dates: From 10-25-2007 To 10-24-2008
Report Date: 11-15-2007 2:08:17 PM
Test Scheduler: Rose West
Content Area: Applied Mathematics
Level: D
Results per page: All
Filters: No demographics were selected, so this report includes all students.
50 Applied Mathematics Items: All Students
Total number of students in this report: 7
% of Students by Selected Response (the report also marks the correct answer for each item)

Objective / Item                          % Responding   A      B      C      D      E
86 - Number and Number Operations
  01 Number Line                          100%           86%    14%    0%     0%     0%
  17 Equivalent Forms                     100%           0%     86%    14%    0%     0%
  19 Compare, Order                       100%           0%     0%     100%   0%     0%
  31 Place Value                          100%           14%    71%    0%     14%    0%
  37 Ratio, Proportion                    100%           0%     14%    86%    0%     0%
  44 Operation Properties                 100%           71%    29%    0%     0%     0%
  47 Factors, Multiples, Divisibility     100%           0%     0%     86%    14%    0%
  49 Fractional Part                      100%           14%    0%     29%    57%    0%
87 - Computation in Context
  07 Decimals                             86%            0%     57%    0%     29%    0%
  10 Whole Numbers                        100%           0%     14%    57%    29%    0%
  24 Whole Numbers                        100%           29%    0%     29%    43%    0%
  36 Fractions                            100%           0%     0%     100%   0%     0%
  48 Fractions                            100%           0%     0%     100%   0%     0%
88 - Estimation
  05 Rounding                             100%           71%    14%    0%     14%    0%
  09 Estimation                           100%           14%    57%    14%    14%    0%
  15 Estimation                           100%           0%     14%    86%    0%     0%
  16 Reasonableness of Answer             100%           0%     0%     0%     100%   0%
  46 Rounding                             100%           0%     0%     14%    86%    0%
89 - Measurement
  14 Perimeter                            100%           0%     100%   0%     0%     0%
  18 Time                                 100%           14%    0%     86%    0%     0%
  23 Appropriate Unit                     100%           0%     14%    0%     86%    0%
  28 Area                                 100%           0%     0%     100%   0%     0%
  32 Rate                                 100%           100%   0%     0%     0%     0%
  34 Angle Measure                        100%           57%    29%    14%    0%     0%
90 - Geometry and Spatial Sense
  27 Solid Figure                         100%           0%     86%    14%    0%     0%
  29 Parallel, Perpendicular              100%           0%     0%     29%    71%    0%
  30 Visualization, Spatial Reasoning     100%           0%     0%     71%    29%    0%
Individual Report for Report Student

Report Identification Information
Student ID: 444
Test Date: 05/17/10
Report Date: 05/17/10
Test Group: New Reports
Test Name: TABE 9/10 Basic Ed
Examiner: Rita
Site Name: BSC

Scores (L/F = Test Level & Form, NC = Number Correct, NA = Number Attempted, SS = Scale Score, GE = Grade Equivalent, NP = National Percentile, NS = National Stanine, OM = % Objectives Mastered)

Skill Area                L/F   NC   NA    SS    GE    NP   NS   OM
Reading                   D     12   50    343   1.7   7    2    0
Mathematics Computation   D     9    40    324   2.2   4    2    0
Applied Mathematics       D     15   50    442   3.8   22   3    0
Language                  D     14   55    346   1.5   5    2    0
Vocabulary                D     5    20    332   1.6   6    2    0
Language Mechanics        D     4    20    295   1.1   4    2    0
Spelling                  D     5    20    320   1.7   6    2    0
Total Mathematics               24   90    383   2.7   8    2
Total Battery                   50   195   357   2.0   4    2

Predicted GED Scores
Content Area     Predicted GED Score   Status
Reading          230                   Instructional
Mathematics      270                   Instructional
Writing          290                   Instructional
Science          240                   Instructional
Social Studies   240                   Instructional
Average          260                   Instructional

NRS Levels
Content Area        Level   Description
Reading             1       Beginning ABE Literacy
Language            1       Beginning ABE Literacy
Total Mathematics   2       Beginning Basic Education

Performance on Objectives (score, mastery level, percent correct)

Reading
D01 Intrp Graph    1/4    Non-Mastery       25
D02 Wd In Contx    1/4    Non-Mastery       25
D03 Recall Info    3/13   Non-Mastery       23
D04 Const Mean     2/17   Non-Mastery       11
D05 Eval/Ex Mng    5/12   Non-Mastery       41
Subtest Average                             24

Mathematics Computation
D13 Mul Whl Num    0/5    Non-Mastery       0
D14 Div Whl Num    1/5    Non-Mastery       20
D15 Decimals       3/8    Non-Mastery       37
D16 Fractions      1/8    Non-Mastery       12
D17 Integers       3/9    Non-Mastery       33
D18 Percents       1/5    Non-Mastery       20
Subtest Average                             23

Applied Mathematics
D21 Num Operatn    1/8    Non-Mastery       12
D22 Comp Contxt    1/4    Non-Mastery       25
D23 Estimation     3/5    Partial Mastery   60
D24 Measurement    3/6    Partial Mastery   50
D25 Geometry       1/6    Non-Mastery       16
D26 Data Analy     1/7    Non-Mastery       14
D27 Stat/Prob      1/4    Non-Mastery       25
D28 Pre-Alg/Alg    3/6    Partial Mastery   50
D29 Prob Solvg     1/4    Non-Mastery       25
Subtest Average                             30

Language
D30 Usage          4/15   Non-Mastery       26
D31 Sent Forma     3/7    Non-Mastery       42
D32 Para Devel     3/11   Non-Mastery       27
D33 Capitaliz      1/4    Non-Mastery       25
D34 Punctuation    2/11   Non-Mastery       18
D35 Writg Conv     1/7    Non-Mastery       14
Subtest Average                             25

Vocabulary
D40 Wd Meaning     4/8    Partial Mastery   50
D41 Multimng Wd    1/4    Non-Mastery       25
D42 Wd in Contx    0/8    Non-Mastery       0
Subtest Average                             25

Language Mechanics
D43 Sent Phrase    1/10   Non-Mastery       10
D44 Writg Conv     3/10   Non-Mastery       30
Subtest Average                             20

Spelling
D45 Vowel          3/7    Non-Mastery       42
D46 Consonant      1/6    Non-Mastery       16
D47 Struct Unit    1/7    Non-Mastery       14
Subtest Average                             25

Total Average                               25
Pre-Post Report for Brown, James

ID Number: B1B11111
Pre Date: 05/27/08
Post Date: 10/28/10
Run Date: 10/28/10
Group Name: Complete Battery
Test Name: TABE 9/10 Basic Ed
Examiner: tina
Site Name: ctb

Pre-test, Post-test, and Gain/Loss Scores
(L/F = Test Level & Form, SS = Scale Score, GE = Grade Equivalent, NP = National Percentile, NCE = Normal Curve Equivalent)

                 Pre-test                         Post-test                        Gain/Loss
Subtests         L/F   SS    GE    NP   NCE       L/F   SS    GE    NP   NCE       SS     NCE
Reading          M0    255   0.0   2    6         M9    364   1.9   8    20        109    14
Math Compu       M0    245   1.1   1    1         M9    258   1.3   2    6         13     5
Applied Math     M0    363   2.2   6    17        M9    363   2.2   6    17        0      0
Language         M0    260   0.6   2    6         M9    373   1.7   7    18        113    12
Vocabulary       M0    255   0.0   2    6         M9    365   1.9   8    20        110    14
Lang Mech        M0    390   2.0   14   27        M9    324   1.3   6    17        -66    -10
Spelling         M0    260   0.3   2    6         M9    267   0.8   3    10        7      4
Total Math             304   1.8   1    1               310   1.9   1    1         6      0
Total Battery          273   0.4   1    1               349   2.0   3    10        76     9

Objectives (percent correct on the pre-test and post-test, with gain/loss)

Reading               Pre    Post   Gain/Loss
M01 Intrp Graph       25     40     15
M02 Wd In Contx       0      25     25
M03 Recall Info       23     38     15
M04 Const Mean        23     21     -2
M05 Eval/Ex Mng       20     14     -6
Subtest Avg           20     26     6

Math Compu
M11 Add Whl Num       20     16     -4
M12 Sub Whl Num       0      16     16
M13 Mul Whl Num       14     28     14
M14 Div Whl Num       16     16     0
M15 Decimals          28     14     -14
M16 Fractions         25     25     0
Subtest Avg           18     20     3

Applied Math
M21 Num Operatn       20     40     20
M22 Comp Contxt       20     0      -20
M23 Estimation        50     20     -30
M24 Measurement       33     50     17
M25 Geometry          16     0      -16
M26 Data Analy        33     42     9
M27 Stat/Prob         0      50     50
M28 Pre-Alg/Alg       60     0      -60
M29 Prob Solvg        0      25     25
Subtest Avg           26     28     2

Language
M30 Usage             12     25     13
M31 Sent Forma        33     37     4
M32 Para Devel        11     25     14
M33 Capitaliz         50     25     -25
M34 Punctuation       22     16     -6
M35 Writg Conv        16     33     17
Subtest Avg           22     27     6

Vocabulary
M40 Wd Meaning        25     12     -13
M41 Multimng Wd       0      75     75
M42 Wd in Contx       28     25     -3
Subtest Avg           20     30     10

Lang Mech
M43 Sent Phrase       27     30     3
M44 Writg Conv        33     20     -13
Subtest Avg           30     25     -5

Spelling
M45 Vowel             14     28     14
M46 Consonant         16     33     17
M47 Struct Unit       28     14     -14
Subtest Avg           20     25     5

Total Average         22     26     4

Prescriptive Report for Report Student

Report Identification Information
Student ID: 444
Test Date: 05/17/10
Report Date: 05/17/10
Test Group: New Reports
Test Name: TABE 9/10 Basic Ed
Examiner: Rita
Site Name: BSC
Skill Area: Reading, Level D, Form 9

For each Reading objective and skill, the report lists reference sources with page assignments for instruction. The objectives and skills covered on this page are Interpret Graphic Information (Reference Sources), Words in Context (Opposite Meaning, Same Meaning), Recall Information (Details, Sequence), and Construct Meaning (Cause/Effect, Character Aspects, Compare/Contrast, Conclusion). The assigned reference sources include Complete PreGED; PreGED Language Arts, Reading; the PreGED Reading Skill Workbooks (Informational Texts, Fiction, Poetry and Dreams); PreGED Science and its Skill Workbooks (Earth and Space Science, Life Science, Physical Science); PreGED Social Studies and its Skill Workbooks (Geography and Economics; U.S. History, Civics, and Government; World History); and the PreGED Writing Skill Workbook: Organization, each with specific page assignments (e.g., Complete PreGED Pg. 293-295 for Character Aspects).

Item Analysis for Abbey, Tina

Group Name: Complete Battery
ID Number: A2C45678
Test Date: 05/27/08
Run Date: 10/28/10
Test Name: TABE 9/10 Basic Ed
Examiner: rita
Site Name: bsc

Reading - E9 (each entry lists the item number and the student's selected response; + marks a correct response)

E01 Intrp Graph (0%): 4 J, 7 C, 8 J, 14 G, 25 A, 26 G, 27 C, 43 C
E02 Wd In Contx (0%): 5 A, 11 C, 24 J, 38 G
E03 Recall Info (33%): 1 A, 2 G+, 3 C, 9 A+, 10 G, 12 J, 15 C, 30 G, 31 C, 32 J+, 33 A+, 36 J, 44 J, 45 A+, 46 G
E04 Const Mean (12%): 6 G, 13 A, 16 J, 19 C, 20 J, 21 A, 28 J, 29 A, 34 G, 35 C, 39 C, 40 J, 41 A, 47 C+, 48 J+, 49 A
E05 Eval/Ex Mng (28%): 17 A, 18 G, 22 G+, 23 C+, 37 A, 42 G, 50 G

Mathematics Computation - M9

M11 Add Whl Num (0%): 1 A, 2 G, 3 C, 5 E, 7 B, 9 D
M12 Sub Whl Num (0%): 4 J, 11 A, 14 J, 17 B, 25 E, 27 B
M13 Mul Whl Num (28%): 6 F, 8 H, 10 K, 12 G+, 15 E, 18 H+, 26 F
M14 Div Whl Num (16%): 20 K, 22 G, 28 H, 32 G+, 35 E, 39 D
M15 Decimals (42%): 13 C, 16 F, 21 A+, 24 J+, 30 K, 34 J, 36 F+
M16 Fractions (12%)

TABE 10 SURVEY LEVEL M
Table 112: Mathematics Computation (Norms Tables; Reference Groups ABE-All and ABE-Juvenile)

NC   SS    SEM   GE      ABE-All: P   NCE   S    ABE-Juvenile: P   NCE   S
25   622   73    9.9+    96           86    9    96                86    9
24   575   36    8.9     85           72    7    85                71    7
23   547   25    7.8     74           63    6    73                63    6
22   529   23    7.1     65           58    6    64                58    6
21   515   22    6.3     57           54    5    57                54    5
20   502   22    5.7     50           50    5    50                50    5
19   489   22    5.4     44           47    5    43                46    5
18   477   22    5.1     37           43    4    37                43    4
17   466   22    4.8     32           40    4    32                40    4
16   455   22    4.6     27           37    4    27                37    4
15   443   23    4.2     22           34    3    23                35    4
14   432   24    3.8     18           31    3    20                32    3
13   421   25    3.5     15           28    3    16                29    3
12   409   26    3.3     12           26    3    14                27    3
11   397   28    3.1     10           23    2    12                25    3
10   384   30    2.9     8            21    2    10                23    2
9    369   33    2.7     7            18    2    8                 20    2
8    353   38    2.6     6            16    2    7                 18    2
7    334   45    2.3     5            14    2    6                 17    2
6    310   58    2.0     4            12    1    4                 14    1
5    274   94    1.5     2            7     1    2                 8     1
4    245   123   1.1     1            1     1    1                 1     1
3    245   123   1.1     1            1     1    1                 1     1
2    245   123   1.1     1            1     1    1                 1     1
1    245   123   1.1     1            1     1    1                 1     1
0    245   123   1.1     1            1     1    1                 1     1

TABE 10 SURVEY LEVEL M
Table 113: Applied Mathematics (Norms Tables; Reference Groups ABE-All and ABE-Juvenile)

NC   SS    SEM   GE      ABE-All: P   NCE   S    ABE-Juvenile: P   NCE   S
25   716   93    9.9+    99           99    9    99                99    9
24   658   53    9.9+    98           92    9    98                92    9
23   618   37    9.9+    93           82    8    93                81    8
22   593   30    9.9+    88           75    7    87                74    7
21   575   26    9.9+    82           70    7    81                68    7
20   560   24    9.0     77           65    6    74                63    6
19   548   23    8.0     71           62    6    67                60    6
18   536   22    7.1     65           58    6    61                56    6
17   525   22    6.7     59           55    5    54                52    5
16   515   22    6.3     54           52    5    49                49    5
15   505   23    6.0     49           49    5    43                46    5
14   494   24    5.7     43           46    5    37                43    4
13   483   26    5.4     37           43    4    32                40    4
12   471   28    4.9     32           40    4    27                37    4
11   458   30    4.5     27           37    4    22                34    3
10   444   34    3.9     22           34    3    18                31    3
9    428   38    3.4     18           31    3    14                28    3
8    410   44    2.9     14           27    3    11                24    3
7    388   57    2.4     10           23    2    8                 20    2
6    353   92    2.0     5            16    2    3                 12    1
5    240   205   0.4     1            2     1    1                 1     1
4    240   205   0.4     1            2     1    1                 1     1
3    240   205   0.4     1            2     1    1                 1     1
2    240   205   0.4     1            2     1    1                 1     1
1    240   205   0.4     1            2     1    1                 1     1
0    240   205   0.4     1            2     1    1                 1     1

Test of Adult Basic Education
TABE GROUP REPORT: TABE Reading Scale Scores and Lexile Measures
Center Name: CTB REPORTING SITE TEST CENTER
Report Date: October 11, 2010

Name (Last, First)   Identification Number   Test Date     TABE Reading Scale Score   Lexile Measure
Johnson, Kendal      409                     2009-10-31    600                        1270L
Johnson, Keely       516                     2009-10-31    700                        1500L
Johnson, Cullen      316                     2009-10-31    500                        815L
Johnson, Julie       208                     2009-10-31    460                        630L

What is a Lexile® measure?
A Lexile measure is an estimate of reading ability, and is also used to rate the difficulty of a text (for example, a book or a newspaper or magazine article). Lexile measures are represented as numbers followed by an "L" and range from below 200L for beginning readers to above 1700L for advanced readers. Likewise, when applied to the text demand of reading materials, Lexile measures range from below 200L for beginning-reader materials to above 1700L for advanced materials.

How do students use Lexile measures?
Students can strengthen their reading skills by using Lexile measures to find materials that meet and challenge their reading ability. Lexile developer MetaMetrics recommends that students target a text range spanning 100L below to 50L above their Lexile measure, which represents the reader's Lexile range.
For example, a student with a Lexile measure of 950L should choose reading materials within a Lexile range of 850L to 1000L. Students also can use Lexile measures to manage their reading comprehension:

· Readers may experience frustration when text readability is more than 50L above their Lexile measure.
· Readers may experience ease when text readability is more than 100L below their Lexile measure.
· Readers will likely experience growth when text readability is within their Lexile range.

Where can students find books within their Lexile range?
The free "Find a Book" search utility at www.lexile.com/findabook enables students to explore the Lexile Book Database and build a personalized reading list based on their Lexile measure and interests. The database contains tens of thousands of fiction and nonfiction books with Lexile measures. Reading lists can be emailed, printed, or saved to a computer. Students are encouraged to refer to their reading list at the library, bookstore, or on the Internet when selecting materials to practice their reading. For more information on using Lexile measures in the classroom, visit www.Lexile.com.

Please note: A Lexile measure describes text difficulty only. It does not address the subject matter or quality of the text, the age-appropriateness of the content, or the reader's interests. These factors should be considered before making a book selection.

Test of Adult Basic Education
TABE INDIVIDUAL REPORT: TABE Reading Scale Score and Lexile Measure
Center Name: CTB REPORTING SITE TEST CENTER
Report Date: October 11, 2010
Name (Last, First): Johnson, Cullen
Identification Number: 316
Test Date: 2009-10-31
TABE 9/10 Reading Scale Score: 500
Lexile Measure: 815L
Lexile Range: 715L - 865L

What is a Lexile® measure?
A Lexile measures the reading ability of people and the text difficulty of reading materials (for example, a book or an article). A Lexile measure is a number followed by an "L" and ranges from below 200L for beginning readers to above 1700L for advanced readers. Likewise, when applied to the text demand of reading materials, Lexile measures range from below 200L for beginning-reader materials to above 1700L for advanced materials.

How do I use Lexile measures?
You can strengthen your reading skills by using Lexile measures to find books, articles, and other materials that meet and challenge your reading ability. Research indicates that reading materials at your ability (Lexile) level and in your areas of interest will help you grow as a reader. Your Lexile range spans from 100L below your Lexile measure to 50L above your Lexile measure. Lexile measures enable you to predict how well you will understand a book or an article. You may experience frustration when reading material more than 50L above your Lexile measure. You may experience ease when reading material more than 100L below your Lexile measure. You will likely experience growth when reading material within your Lexile range.
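The range rule above is simple enough to state in a few lines of code. The sketch below (an illustration written for this document, not MetaMetrics or CTB software) computes a reader's Lexile range and reproduces the two ranges shown above.

```python
# Illustration only: a reader's Lexile range spans 100L below to 50L above
# the reader's Lexile measure, per the rule described above.

def lexile_range(lexile_measure: int) -> tuple[int, int]:
    """Return the (lower, upper) bounds of a reader's Lexile range."""
    return lexile_measure - 100, lexile_measure + 50

print(lexile_range(950))  # (850, 1000), matching the 850L-1000L example above
print(lexile_range(815))  # (715, 865), matching the 715L-865L range on Cullen Johnson's report
```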
How do I find books within my Lexile range?
The free "Find a Book" search utility at www.lexile.com/findabook enables you to explore the Lexile Book Database and build a personalized reading list based on your Lexile measure and interests. The database contains tens of thousands of fiction and nonfiction books with Lexile measures. Sample titles are included in the table below. Reading lists can be emailed, printed, or saved to a computer. You are encouraged to refer to your reading list at the library, bookstore, or on the Internet when selecting materials to practice your reading.

Please note: A Lexile measure describes text difficulty only. It does not address the subject matter or quality of the text, the reader's interests, or the age-appropriateness of the content. These factors should be considered before making a book selection.

Title                                       Author               Lexile
The Accident                                Elie Wiesel          480L
And Then There Were None                    Agatha Christie      570L
The Firm                                    John Grisham         680L
The Elephant Vanishes                       Haruki Murakami      740L
Star of the Sea                             Joseph O’Connor      850L
Seabiscuit                                  Laura Hillenbrand    990L
The Glass Castle                            Jeannette Walls      1010L
The Amazing Adventures of Kavalier & Clay   Michael Chabon       1170L
The House of the Spirits                    Isabel Allende       1280L
The Postman                                 Antonio Skármeta     1310L

TABE 10 Test Demo Account

Name: ____________________________________________
School or Organization Name: ________________________
City: _____________________________________________
Phone: ____________________________________________
Email: ____________________________________________
Select One: _____ TABE Online _____ TABE Adaptive