Wednesday, July 25, 2007

Education 101: Testing

When it comes to covering education these days, you're likely to come across an alphabet soup of acronyms in your daily reporting. But Drew Gitomer, of the Educational Testing Service, and Daniel J. Losen, of UCLA and Harvard Law School, told Hechinger attendees that it's about more than just NCLB, IEPs and HSPA***.


Here are a few of the tips, tricks and story ideas they offered:



  1. Get a technical adviser – someone from a local university who can help explain things to you.


  2. Weigh meaning against statistical significance. A result can be statistically significant but mean nothing in the classroom. Numbers can mean different things to different people.


  3. Beware single-cause explanations: there is no silver bullet or magic wand. It’s likely a combination of factors that made the scores come out as they did.


  4. Testing became the solution when, really, all testing should be is a thermometer.


  5. Seek confirmation of what the tests say. Compare the test scores against tests that cover the same topics, graduation rates, grade retention and other trends over time. Are low achievers leaving the testing pool? What are your dropout rates?


  6. Look at the raw data. Has the test changed? What is the enrollment vs. the number of students who were tested? Who was left out?


  7. Dig beyond your state information: the National Assessment of Educational Progress, for example, shows no change in reading progress since NCLB began.


  8. Other factors to focus on: graduation rates; attrition and English-language-learner mobility; grade retention/advancement (are kids being held back to pass the test?); suspension, attendance and lost instructional time (are suspensions the new ‘class trips’ for poor-performing kids on test days?); teacher quality and retention.


  9. Look at 9th graders and see how many actually get their diplomas


  10. What are the costs of dropouts: incarceration, welfare, etc.?


  11. If a child receives multiple suspensions, the likelihood that they will drop out before they reach the 10th grade triples.


  12. Do your own math


  13. Look at data over time


  14. Disaggregate data by race, gender, special needs, etc


  15. Look for data loopholes


  16. Proficiency means different things in different states. What is the proficiency standard in your state? How do students in your coverage area hold up against others? If I’m not proficient, what can’t I do? If I’m proficient, what can I do? What can’t I?


  17. Contact NAEP. Are the tests dumbed down? Are graduation numbers being inflated to decrease the number of dropouts? Are teachers staying on the job longer? Ask about the equating study.
  18. Get off message: When tests are good, administrators chalk it up to good teaching or new programs (FYI: always wait for a few years to pass before extolling the effectiveness of a program. Anything before then is just conjecture). When results are bad, they blame the budget, special needs students or other factors beyond their control. Ask them this: What did you do differently? That gets to the heart of the results. Evaluate from there.
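For the do-your-own-math crowd, tips 6 and 14 above can be sketched in a few lines of code. This is a hypothetical illustration only — the school names, subgroups and numbers below are invented, not real data — showing how to compare enrollment with the number of students actually tested, and how to disaggregate mean scores by subgroup:

```python
# Hypothetical records: (school, subgroup, enrolled, tested, mean_score).
# All figures are made up for illustration.
records = [
    ("Lincoln HS",    "white",      200, 196, 221.4),
    ("Lincoln HS",    "black",      150, 121, 208.9),
    ("Lincoln HS",    "special_ed",  40,  22, 197.5),
    ("Washington HS", "white",      180, 175, 219.0),
    ("Washington HS", "black",      160, 152, 214.2),
    ("Washington HS", "special_ed",  35,  31, 201.1),
]

def participation_rates(rows):
    """Tip 6: who was left out? tested / enrolled, per school and subgroup."""
    return {(school, group): tested / enrolled
            for school, group, enrolled, tested, _ in rows}

def subgroup_means(rows):
    """Tip 14: district-wide mean score per subgroup, weighted by tested count."""
    totals = {}
    for _, group, _, tested, mean in rows:
        pts, n = totals.get(group, (0.0, 0))
        totals[group] = (pts + mean * tested, n + tested)
    return {group: pts / n for group, (pts, n) in totals.items()}

rates = participation_rates(records)
# A big gap between enrollment and testing is itself a story:
low = {k: round(v, 2) for k, v in rates.items() if v < 0.90}
print(low)                      # school/subgroup pairs tested at under 90%
print(subgroup_means(records))  # disaggregated mean scores
```

In this made-up example, the schools with low participation among low-scoring subgroups are exactly the ones worth a phone call — the disaggregated means alone would understate the problem.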


***NCLB = No Child Left Behind
IEP = Individualized Education Program
HSPA = High School Proficiency Assessment


Posted by T Dot at 8:21 PM | link
