Getting Digital Academic Performance Data Right

Blended learning promises a wealth of academic data that will empower teachers with the information they need to target their students’ learning difficulties daily and accelerate their learning. This hope is not fantasy, but two challenges have kept us from realizing this ideal: the grain-sizing problem and false positives.

The Grain-Sizing Problem

Since State and Common Core Standards represent the most common organization of academic goals, most digital content and assessment providers build their systems around these Standards and track students’ performance on each one. When reflecting on a student’s performance over the course of a unit or year, teachers and administrators find Standards-level performance data useful. But these data give teachers only very large grains of information, not the smaller grains they need to make daily instructional decisions. Let’s take a look at this 9th–10th grade reading standard for informational texts:

ELA-Literacy.RI.9-10.8

“Delineate and evaluate the argument and specific claims in a text, assessing whether the reasoning is valid and the evidence is relevant and sufficient; identify false statements and fallacious reasoning.”

Learning that your students are 60% proficient on this standard is the equivalent of a cardiologist learning that her patient’s heart is 60% healthy before open-heart surgery: the doctor would understand the problem at a high level, but not in enough detail to diagnose and fix what ails her patient.

Most teachers spend the beginning of the year breaking down standards like this one into smaller learning objectives that can be taught and assessed in a single lesson.  These objectives build toward mastery of a standard over the course of a unit or even the entire school year.  Below you can see a very simple potential breakdown of this standard into 8 objectives.

Breakdown of Standard ELA-Literacy.RI.9-10.8:

  • Define common logical fallacies
  • Define four forms of evidence
  • Define evidence sufficiency in an informational text
  • Identify an author’s argument in an informational text
  • Identify specific supporting claims within an author’s argument in an informational text
  • Identify the logical sequence of an author’s argument in an informational text
  • Evaluate the relevancy and sufficiency of an author’s evidence
  • Evaluate the logical sequence of an author’s argument in an informational text

Learning that a student is 60% proficient on a specific objective gives teachers an appropriately sized grain: one they can use to pinpoint student misunderstandings and make data-driven instructional decisions daily.
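To see what this finer grain could look like inside a provider’s system, here is a minimal sketch in Python. The task records and objective names are hypothetical, not any provider’s actual schema: each scored task is tagged with both its standard and its objective, so results can roll up to either grain.

```python
from collections import defaultdict

# Hypothetical scored tasks: each one tagged with BOTH the standard and
# the objective it assesses (score: 1 = correct, 0 = incorrect).
scored_tasks = [
    {"standard": "RI.9-10.8", "objective": "Define common logical fallacies", "score": 1},
    {"standard": "RI.9-10.8", "objective": "Define common logical fallacies", "score": 0},
    {"standard": "RI.9-10.8", "objective": "Identify an author's argument",   "score": 1},
    {"standard": "RI.9-10.8", "objective": "Evaluate evidence sufficiency",   "score": 0},
]

def proficiency_by_objective(tasks):
    """Roll scores up to the objective grain instead of the standard grain."""
    totals = defaultdict(lambda: [0, 0])  # objective -> [correct, attempted]
    for task in tasks:
        totals[task["objective"]][0] += task["score"]
        totals[task["objective"]][1] += 1
    return {obj: correct / attempted for obj, (correct, attempted) in totals.items()}

print(proficiency_by_objective(scored_tasks))
# {'Define common logical fallacies': 0.5,
#  "Identify an author's argument": 1.0,
#  'Evaluate evidence sufficiency': 0.0}
```

The same records still support the standard-level rollup teachers use for unit and year reflection; the point is that the objective grain is captured at the source rather than lost in aggregation.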

Why haven’t digital content and assessment providers offered this degree of granularity? No consensus exists on the objectives that should build towards each standard. Some providers have created their own objectives, but those objectives exist only within their systems and are not comparable with anyone else’s, making data triangulation between different programs difficult.

False Positives

While the grain-sizing problem presents teachers with imperfect data of some value, false positives – misinformation caused by a flawed calculation – offer almost no redeeming value. Fundamental flaws in many content and assessment providers’ data architectures create false positives that mislead even the best academic data analysts.

Let’s consider another Common Core Standard and its objectives aligned to the levels of Bloom’s Taxonomy, which is commonly used to determine the difficulty of a task:

ELA-Literacy.RI.9-10.6

“Determine an author’s point of view or purpose in a text and analyze how an author uses rhetoric to advance that point of view or purpose”

[Figure: the objectives for this standard, each aligned to a level of Bloom’s Taxonomy]

In order for a student to master this standard, he must successfully complete problems or questions at the same level of Bloom’s Taxonomy as the standard itself: the analyzing level. However, content and assessment providers frequently count performance on tasks at lower levels of Bloom’s towards mastery of the whole standard.

For example:

[Figure: five sample questions for this standard, four of which sit at lower levels of Bloom’s Taxonomy]

Four of the five questions above don’t actually assess mastery of the whole standard, only a single objective. A student could master just a narrow portion of the standard and earn 80% proficiency, creating a false positive and leading their teacher to believe they’ve achieved mastery of the standard.
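To make the arithmetic behind that false positive concrete, here is a minimal sketch in Python. The question records, Bloom’s tags, and scores are hypothetical, not any provider’s actual schema: a naive average over all five questions reports 80%, while counting only analyzing-level questions towards the standard tells a very different story.

```python
# Hypothetical question results for standard RI.9-10.6.
# Four questions sit at lower Bloom's levels; only one is at the
# standard's own level ("analyze").
questions = [
    {"bloom": "remember",   "score": 1},
    {"bloom": "remember",   "score": 1},
    {"bloom": "understand", "score": 1},
    {"bloom": "understand", "score": 1},
    {"bloom": "analyze",    "score": 0},  # the one task at the standard's level
]

# Naive calculation: every question counts toward the standard.
naive = sum(q["score"] for q in questions) / len(questions)
print(f"Naive proficiency: {naive:.0%}")        # 80% -- a false positive

# Bloom's-aware calculation: only tasks at the standard's level count.
at_level = [q for q in questions if q["bloom"] == "analyze"]
aware = sum(q["score"] for q in at_level) / len(at_level)
print(f"At-level proficiency: {aware:.0%}")     # 0% -- the real picture
```

The lower-level results aren’t worthless; they just belong at the objective grain rather than rolled into the standard’s proficiency score.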

Since many standards require performance tasks that computers cannot easily assess, many providers offer few performance tasks at the level of the standard. The scarcity of these tasks not only complicates the calculation of standards performance, but also challenges the validity of the results. Without enough performance tasks for each objective or standard, teachers can’t trust that the data are statistically reliable.
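One way a provider could guard against that scarcity, sketched below in Python with a hypothetical minimum of five scored tasks: refuse to report a proficiency number until enough tasks exist to back it up, and flag “insufficient data” instead.

```python
MIN_TASKS = 5  # hypothetical threshold; a real provider would tune this per objective

def reportable_proficiency(scores, min_tasks=MIN_TASKS):
    """Return a proficiency rate only when enough tasks back it up."""
    if len(scores) < min_tasks:
        return None  # surface "insufficient data" rather than a shaky percentage
    return sum(scores) / len(scores)

print(reportable_proficiency([1, 0, 1]))           # None -- too few tasks to trust
print(reportable_proficiency([1, 0, 1, 1, 1, 0]))  # ~0.67 -- enough to report
```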

Fixes

Some companies have already begun addressing these issues by creating their own sets of objectives and providing data for both standards and objectives. However, even if all content and assessment providers took this approach, teachers still couldn’t compare data between their systems.

The answer could lie in a “data quality” certification for digital content and assessment providers. If a consortium of providers or an authority, such as the Department of Education or a large foundation, certified providers for meeting a “data quality standard,” we could drive these problems out of the field. This certification could range from certifying the soundness of a provider’s data architecture to requiring that a system adopt a particular set of objectives to accompany each of the Common Core Standards.

Many providers lack the nuanced knowledge necessary to avoid these problems on their own, and a certification system could fill that gap. However, for providers to see an appropriate return on the difficult work of earning a certification, schools and districts must become savvier consumers.

Things to try before you buy:

  • Talk to other schools and districts that have tried to accomplish the same academic goal using a digital content or assessment provider. They will know the right questions to ask. Most importantly, they will help you find technology solutions that fit your goals, as opposed to goals that fit technology solutions.
  • Test a demo account that will allow you to generate and view student performance data.  Use the system to generate student data and have your best academic data analyst complete a full data analysis cycle.
  • Always start small. Pilot everything in small batches with a single class for several weeks before making your final decision.

The savvier schools and districts become, the more attractive a certification will be, and the less common the grain-sizing problem and false positives will become.  With improved digital content and assessment data, we can make data-driven instructional decisions with confidence that will allow us to realize the true promise of blended learning.

Written by Will Eden, Entrepreneur in Residence at Alpha Public Schools.
