Frequently Asked Questions
- Q: Why is Retell a required part of DORF in DIBELS Next?
- A: An in-depth response can be found in our position paper, Why is Retell a Required Part of DORF in DIBELS Next?
- Q: Why doesn't the password I use to download DIBELS® 6th Edition materials work to download the DIBELS Next materials?
- A: The DIBELS 6th Edition download site is hosted by the University of Oregon, for historical reasons. The DIBELS Next download site is hosted directly by DMG, the authors of DIBELS. Because we need to count DIBELS Next download users separately from 6th Edition download users, you will need to sign up and receive a new password to access the DIBELS Next materials. To sign up, visit our DIBELS Next page.
- Q: Where do I find the Benchmark Goals for DIBELS Next?
- A: The DIBELS Next Benchmark Goals are available from the DIBELS Next Benchmark Goals and Composite Score document.
- Q: Where can I find technical adequacy information for the DIBELS Next measures?
- A: A draft of the DIBELS Next Technical Manual was released in February of 2011 and is available on the DIBELS Next download page. Additional technical reports are available from the Publications and Presentations link.
- Q: Why doesn't Letter Naming Fluency have benchmark goals?
- A: LNF has always been used as an indicator of risk rather than an instructional target. This is also why progress monitoring materials are not provided for LNF. In 6th Edition, we provided target ranges for LNF to identify levels of risk, but they did not work in the same way as real benchmark goals. For DIBELS Next, we have the DIBELS Composite Score, which provides the best indicator of risk. Since the risk indicator comes from the composite score and LNF does not assess an instructional target, there is no need for a benchmark goal for LNF.
- Q: Is LNF a measure of Rapid Automated Naming (RAN)?
- A: LNF is not a direct measure of RAN, although RAN is likely a component skill that contributes to fluency in letter naming once students know the names of the letters. Tasks that are direct measures of RAN differ from LNF in the following ways:
- RAN tasks typically involve the timed naming of a small number of familiar items arranged in a grid with each of the familiar items repeated in random order. For example, a common RAN task contains five rows in which five items (letters, numbers, colors, shapes) are displayed in left-to-right serial fashion and repeated randomly in each row. In contrast, LNF uses all 26 upper case and lower case letters in a stratified-random order and assesses a student's knowledge of letter names as well as fluency in naming them.
- RAN tests use items that are familiar to the student. On LNF, students may or may not know the name of some or all of the letters.
- Q: Why doesn’t DIBELS contain a direct measure of RAN?
- A: Although research supports a relationship between RAN and reading skill, the role of RAN and the nature of its relationship to reading are not well understood. DIBELS measures assess recognized and empirically validated skills related to reading outcomes. These skills, known as Basic Early Literacy Skills, differentiate successful from less successful readers, are amenable to change through instruction and, most important, improve reading outcomes for students when they are learned (Kaminski, Cummings, Powell-Smith, & Good, 2008). To date, there is no evidence that training students to improve RAN speed improves their reading fluency (Norton & Wolf, 2012). Rather than focusing intervention on improving RAN speed, reading skills may be improved by building fluency in the skills that are related to later reading, such as phonemic awareness and oral reading.
- Q: Why are the sixth grade benchmark goals lower than the fifth grade goals?
- A: The difficulty level of the passages used for DORF and Daze changes by grade, so composite scores and benchmark goals can't be directly compared across grades. The difficulty level of the passages increases by grade in a roughly linear fashion. However, student performance increases in a curve, with the most growth occurring in the earlier grades, and slower growth in the upper grades. Between fifth and sixth grade, the difficulty level of the materials increases at a faster rate than student performance, so benchmark goals are lower in sixth grade than in fifth.
- Q: Can DIBELS be used to help identify students with dyslexia?
- A: DIBELS Next can be used as an effective tool for identifying students who are at risk for early reading difficulties, including dyslexia. Low skills on DIBELS Next, followed by a persistent lack of adequate progress in spite of instruction that has been effective with other students with a similar level of initial skills, may provide evidence that a student is at risk for dyslexia. However, please note that no single assessment can be used to specifically identify dyslexia. Instead, that is a decision made by a team of qualified professionals who may include test data as part of their decision-making process. More information on dyslexia and DIBELS Next can be found in our position paper, Dyslexia Screening and the Use of DIBELS Next.
- Q: What's the difference between DIBELS 6th Edition and DIBELS Next?
- A: For more information on the difference between DIBELS 6th Edition and DIBELS Next, read our Transitioning to DIBELS Next brochure.
- Q: How long will DIBELS 6th Edition continue to be supported?
- A: We strongly encourage schools to transition to DIBELS Next as soon as possible. DIBELS Next is a significant improvement over DIBELS 6th Edition and is a more reliable and valid assessment. For more information on transitioning to DIBELS Next, read our Transitioning to DIBELS Next brochure. In the near future, DIBELS 6th Edition will no longer be supported.
- Q: Where can I find the reliability and validity information for DIBELS 6th Edition?
- A: Reliability and validity information for 6th Edition can be found in the DIBELS 6th Edition Technical Adequacy Information paper.
- Q: The font used for the LNF and NWF student materials is one that many young children are unfamiliar with and it seems like it may affect their scores. What are your reasons for choosing this font?
- A: The font used for DIBELS 6th Edition student materials is Times New Roman, the most common font used for print materials. The benchmark goals were developed using the same materials that are now used for assessment, so they already account for any challenges that students have with the font. Note that for DIBELS Next we have chosen to use a different font, Report School, which was specifically developed for early literacy instruction.
- Q: Some of the pictures used for ISF are challenging for my kindergarten students to remember. What if they say a word that starts with the correct sound, but it is not the same word I used?
- A: You can score the item as correct. The goal of ISF is to assess students' early phonemic awareness skills. Being able to identify a word that starts with the sound you present indicates that the child is phonemically aware, and the child should receive credit for the item.
- Q: The Administration and Scoring Guide states that students should be able to retell 25% of their ORF score. Which ORF score do we base the 25% RTF score on? If a child retells 25% of their ORF median score but on another passage, is this considered passing or do they have to meet the 25% goal on the median passage?
- A: For 6th Edition, remember that the 25% goal is an estimate, not a real DIBELS benchmark goal. You should compare the median RTF score to the median DORF score (whether or not they came from the same passage). The problem is that there are two ways to meet that goal: retell more, or read fewer correct words per minute. So if you want to use it as an approximation, you should only use it for students who are meeting the benchmark goal on DORF. We do have real benchmark goals for Retell in DIBELS Next, making it easier to interpret that information.
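As a rough illustration of the 6th Edition rule of thumb described above, the 25% comparison of median scores can be sketched as follows. This is a minimal sketch for illustration only; the function name is hypothetical, and the check should only be applied to students already meeting the DORF benchmark goal.

```python
def retell_meets_estimate(median_dorf, median_rtf, fraction=0.25):
    """Check whether the median Retell Fluency (RTF) score is at least
    25% of the median DORF score (words read correctly per minute).

    Note: this is an estimate, not a real DIBELS benchmark goal, and it
    is only meaningful for students meeting the DORF benchmark goal.
    """
    return median_rtf >= fraction * median_dorf

# Example: a student with a median DORF of 80 words correct per minute
# would need a median Retell score of at least 20 (25% of 80).
print(retell_meets_estimate(80, 20))  # True
print(retell_meets_estimate(80, 15))  # False
```

Because a lower DORF score also lowers the 25% threshold, a struggling reader can "pass" this check while reading very few words, which is why the estimate is only useful alongside the DORF benchmark goal.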
- Q: Can DORF scores be converted to Lexiles?
- A: Yes. For DIBELS 6th Edition, the developers of the Lexile Framework at MetaMetrics created a report that can link DORF scores to Lexile values. For more information, read Linking DIBELS Next® with the Lexile® Framework.
IDAPEL
- Q: Are the IDAPEL measures available for use?
- A: The IDAPEL measures continue to be researched for their reliability and predictive utility. DMG has been working with, and continues to seek, school districts interested in using the measures and willing to be research partners for the purpose of continuing this research. Ongoing research with school district partners will allow us to gain more information about how well the measures work with diverse groups of students. If you are interested in becoming a research partner, we highly recommend the IDAPEL Essential Training, the first in a series of cumulative training workshops. Training prospective users allows us to be more confident about the data collected for the research.
- Q: What is IDAPEL Essential Training?
- A: The IDAPEL Essential Training is a two-day French-language workshop designed to provide an in-depth understanding of the conceptual and empirical foundations of IDAPEL, as well as comprehensive practice in the administration and scoring of all the current IDAPEL measures for grades K–5. This practitioner model supports individuals using IDAPEL in their school or school district. In addition to classroom teachers, attendees should include bilingual school staff such as administrators, district school psychologists, school aides, and other resource personnel who are invested in student reading outcomes and who could be involved in collecting school data three times per year.
- Q: What is IDAPEL Data Use and Interpretation Workshop?
- A: This two-day French-language workshop is designed to train IDAPEL users to make efficient use of IDAPEL data for improving individual student, classroom, and school district reading outcomes. Workshop topics include interpreting individual student and classroom scores, planning instructional groups and identifying instructional goals, using IDAPEL data effectively to identify students in need of additional support, and setting up progress monitoring systems to improve and maintain student reading success. Attendees should include individuals who have completed the initial IDAPEL Essential Training, who are familiar with the administration and scoring of the IDAPEL measures through classroom data collection, and who are interested in using IDAPEL data effectively to improve individual and classroom reading outcomes.
- Q: What are IDAPEL Workshop fees?
- A: Each IDAPEL workshop costs $1,500 per day plus the trainer's travel expenses. Estimated travel expenses, based on round-trip airfare from Eugene, Oregon to the designated city, would be submitted prior to training. For on-site workshops, DMG provides electronic files of the training modules for school districts to print; in our experience, having school districts print their own training manuals reduces printing and shipping costs. The Essential Training manual is 188 double-sided pages and the Data Interpretation manual is 250 double-sided pages; each participant needs a training manual for each workshop and for use as a future reference. A stopwatch for each participant is also useful and can be purchased from Dynamic Measurement Group. The trainer uses a PowerPoint presentation and requires an LCD projector and computer speakers, or an adequate sound system for a larger audience.
- Q: How can I access the measures?
- A: After the Essential Training, DMG would provide you with a username and password to download grade-level electronic versions of the IDAPEL measures. These materials would be available for your use free of charge for the duration of the study. We currently have IDAPEL grade-level benchmark booklets for grades K through 5, with progress monitoring booklets for all measures except oral reading fluency. As a research partner, your school district would agree to collect and enter raw score data into our password-protected DMG database. You would also agree to provide basic demographic data and to complete a one-time user survey on the measures.
Any research DMG conducts is in compliance with US federal guidelines for the protection of human subjects. Prior to any research, we would initiate an Institutional Review Board application with Exempt Certification. If approved by your school district, we would then begin a longitudinal study in which cohorts of students are followed over an extended period of time (at least three years). Your school district would benefit directly from the study by receiving valid early reading assessments that are brief, easy to use, and designed to assess students' French reading trajectories over time and to assist teachers in planning French reading instruction.
- Q: Have the IDAPEL Benchmark Goals been established?
- A: IDAPEL validation studies have been undertaken with both French first-language students in eastern Canada and French second-language students in both eastern and western Canada. Benchmark goals for both groups of students have been preliminarily established. Longitudinal research examining the levels of early literacy skills that predict later literacy outcomes for both groups will help confirm the benchmark goals.