Program evaluation services that increase program value for grantors, grantees, and communities.

Archive for month: February, 2020

February 18, 2020

Digital Technology vs. Students’ Education

Over the last two decades, American education has sought to introduce and improve student access to digital technology. From the first introduction of personal computers in classrooms to the more recent efflorescence of iPads and online educational content, educators have expressed enthusiasm for digital technology. As Natalie Wexler writes in MIT Technology Review (December 19, 2019), “Gallup …found near-universal enthusiasm for technology on the part of educators. Among administrators and principals, 96% fully or somewhat support ‘the increased use of digital learning tools in their school,’ with almost as much support (85%) coming from teachers.” Despite this enthusiasm, there is little evidence for the effectiveness of digitally based educational tools. Wexler cites a study of millions of high school students in the 36 member countries of the Organisation for Economic Co-operation and Development (OECD), which found that those who used computers heavily at school “do a lot worse in most learning outcomes, even after accounting for social background and student demographics.”

Although popular, and considered useful by educators, digital tools in classrooms not only appear to make little difference in educational outcomes, but in some cases may actually negatively affect student learning. “According to other studies, college students in the US who used laptops or digital devices in their classes did worse on exams. Eighth graders who took Algebra I online did much worse than those who took the course in person. And fourth graders who used tablets in all or almost all their classes had, on average, reading scores 14 points lower than those who never used them—a differential equivalent to an entire grade level. In some states, the gap was significantly larger.”

While it has been widely believed that digital technologies can “level the playing field” for economically disadvantaged students, the OECD study found that “technology is of little help in bridging the skills divide between advantaged and disadvantaged students.”

Why do digital technologies fail students? As Wexler ably details:

  • When students read text from a screen, it’s been shown, they absorb less information than when they read it on paper
  • Digital instruction, as opposed to human instruction, eliminates the personal, face-to-face relationships that customarily support students’ motivation to learn
  • Technology can drain a classroom of the communal aspect of learning, over-individualize instruction, and thus diminish the important role of social interaction in learning
  • Technology is primarily a delivery system; if the material it delivers is flawed, inadequate, or presented in an illogical order, it won’t provide much benefit
  • Learning, especially reading comprehension, isn’t just a matter of skill acquisition, of showing up and absorbing facts, but is largely dependent upon students’ background knowledge and familiarity with context

In his article “Technology in the Classroom in 2019: 6 Pros & Cons,” Vawn Himmelsbach makes many of the same arguments and adds a few liabilities to Wexler’s list:

  • Technology in the classroom can be a distraction
  • Technology can disconnect students from social interactions
  • Technology can foster cheating in class and on assignments
  • Students don’t have equal access to technological resources
  • The quality of research and sources they find may not be top-notch
  • Lesson planning might become more labor-intensive with technology

Access to digital technology varies, of course, among schools and school districts. As the authors of Concordia University’s blog, Room 241, point out, “Technology spending varies greatly across the nation. Some schools have the means to address the digital divide so that all of their students have access to technology and can improve their technological skills. Meanwhile, other schools still struggle with their computer-to-student ratio and/or lack the means to provide economically disadvantaged students with loaner iPads and other devices so that they can have access to the same tools and resources that their classmates have at school and at home.”

While students certainly need technological skills to navigate the modern world and equality of access to such technology remains a challenge, digital technology alone cannot hope to solve the problems of either education or “the digital divide.” The more we rely on the use of digital tools in the classroom, the less we may be helping some students, especially disadvantaged students, to learn.


February 4, 2020

Bias, Seeing Things as We Are, Not as They Are

“Bias” is a tendency (either known or unknown) to prefer one thing over another; it prevents objectivity and influences understanding or outcomes in some way (see the Open Education Sociology Dictionary). Bias is an important phenomenon in social science and in our everyday lives.

In her article “9 Types of Research Bias and How to Avoid Them,” Rebecca Sarniak discusses the core kinds of bias in social research. These include both the biases of the researcher and the biases of the research subject/respondent.

Prevalent kinds of researcher bias include:

  • confirmation bias
  • culture bias
  • question-order bias
  • leading questions/wording bias
  • the halo effect

Respondent biases include:

  • acquiescence bias
  • social desirability bias
  • habituation
  • sponsor bias

In their Scientific American article “How to Think about Implicit Bias,” Keith Payne, Laura Niemi, and John M. Doris assure us that bias is rooted not merely in prejudice, but in our tendency to notice patterns and make generalizations. “When is the last time a stereotype popped into your mind? If you are like most people, the authors included, it happens all the time. That doesn’t make you a racist, sexist, or whatever-ist. It just means your brain is working properly, noticing patterns, and making generalizations…. This tendency for stereotype-confirming thoughts to pass spontaneously through our minds is what psychologists call implicit bias. It sets people up to overgeneralize, sometimes leading to discrimination even when people feel they are being fair.”

Of course, bias is not just a phenomenon relevant to social science (and evaluation) research; it affects our everyday activities too. In “10 Cognitive Biases That Distort Your Thinking,” Kendra Cherry explores common cognitive biases that shape everyday judgment.

In evaluation research, especially when employing qualitative methods, such as interviews and focus groups, unconscious bias can negatively affect evaluation findings. The following types of bias are especially problematic in evaluations:

  • confirmation bias, when a researcher forms a hypothesis or belief and uses respondents’ information to confirm that belief
  • acquiescence bias, also known as “yea-saying” or “the friendliness bias,” when a respondent tends to agree with, and be positive about, whatever the interviewer presents
  • social desirability bias, when respondents answer questions in a way that they think will lead to being accepted and liked; some respondents will report inaccurately on sensitive or personal topics to present themselves in the best possible light
  • sponsor bias, when respondents know – or suspect – the interests and preferences of the sponsor or funder of the research, and modify their answers accordingly
  • leading questions/wording bias, when a researcher, by elaborating on a respondent’s answer, puts words in the respondent’s mouth in an effort to confirm a hypothesis, build rapport, or project an understanding of the respondent

It’s important to strive to eliminate bias in both our personal judgments and in social research. (For an extensive list of cognitive biases, see here.) Awareness of potential biases can alert us to occasions when bias, rather than impartiality, influences our methods and affects our judgments.


Keith Payne, Laura Niemi, and John M. Doris, “How to Think about Implicit Bias,” Scientific American, March 27, 2018
M. Hammersley and R. Gomm, “Bias in Social Research”
Copyright © 2020 - Brad Rose Consulting