Student Help-Seeking for (Un)Specified Behaviors

Posted on 02 October 2021.

Here’s a summary of the full arc of the Examplar project, including later work.

Over the years we have done a lot of work on Examplar, our system for helping students understand a problem before they start implementing it. Given that students use it voluntarily (perhaps even too much), it would seem to be a success story.
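For readers unfamiliar with the tool: in Examplar, students write a suite of input–output examples before implementing anything, and that suite is judged against instructor-provided correct and buggy implementations. Below is a minimal sketch of that judging idea in Python (Examplar itself works with Pyret); the `median` assignment, the wheat/chaff functions, and `judge` are hypothetical illustrations, not Examplar’s actual code.

```python
# A sketch of Examplar's core idea, not its implementation: a student's
# example suite is judged against known-correct implementations
# ("wheats") and known-buggy ones ("chaffs").
from typing import Callable, Iterable

# An example pairs input arguments with the expected output.
Example = tuple[tuple, object]

def passes(impl: Callable, examples: Iterable[Example]) -> bool:
    """True if `impl` agrees with every example in the suite."""
    return all(impl(*args) == expected for args, expected in examples)

def judge(examples, wheats, chaffs):
    """A suite is *valid* if every wheat passes it, and *thorough*
    to the extent that chaffs fail it."""
    valid = all(passes(w, examples) for w in wheats)
    caught = sum(1 for c in chaffs if not passes(c, examples))
    return valid, caught

# Hypothetical `median` assignment with one wheat and one chaff.
def median_wheat(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

def median_chaff(xs):
    return xs[len(xs) // 2]  # bug: forgets to sort first

student_suite = [
    (([3, 1, 2],), 2),       # unsorted input should still yield 2
    (([1, 2, 3, 4],), 2.5),  # even length: average the middle pair
]
print(judge(student_suite, [median_wheat], [median_chaff]))  # (True, 1)
```

Here the suite is valid (the wheat passes it) and catches the one buggy implementation; the feedback Examplar gives students is, roughly, this pair of signals.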

However, a full and fair scientific account of Examplar should also examine where it fails. To that end, we conducted an extensive investigation of all the posts students made on our course help forum for a whole semester, identified which posts had to do with problem specification (and under-specification!), and categorized how helpful or unhelpful Examplar was.

The good news is we saw several cases where Examplar had been directly helpful to students. These cases should be considered a lower bound, because the whole point of Examplar is to “answer” many questions directly, so those questions never even make it onto the help forum.

But there is also bad news. To wit:

  • Students sometimes simply fail to use Examplar’s feedback. Is this a shortcoming of the UI, of the training, or something inherent to how students interact with such systems?

  • Students tend to focus too heavily on inputs, which are only one part of an example: the expected output matters just as much.

  • Students do not transfer lessons from earlier assignments to later ones.

  • Students have various preconceptions about problem statements, such as imagining functionality not asked for or constraints not imposed.

  • Students enlarge the specification beyond what was written.

  • Students sometimes just don’t understand Examplar.

These findings should spur future research in this area, and may also point to the limits of automated assistance.

To learn more about this work, and in particular for a fuller treatment of the points above, see our paper!