Effects of Automated Feedback in Scratch Programming Tutorials
In this episode I unpack Obermüller, Greifenstein, and Fraser's (2023) publication titled "Effects of Automated Feedback in Scratch Programming Tutorials," which investigates the impact of two different types of hint-generation approaches among two different classes.
Quote: "Our results show that automatically generated Next Step hints suggesting code changes can lead to higher correctness, faster progression, and a reduced number of help requests for the teacher. For teachers this is ideal, as it frees up their time to focus on learners who require the most assistance. Although a possible concern is that students may just implement the suggested changes without actually thinking about how this solves the problem, we observe no differences in comprehension. While these are encouraging findings, we also observe that hints generated for more complex code can be misleading, resulting in the opposite effects." End quote. That's from page 396 of the article titled "Effects of Automated Feedback in Scratch Programming Tutorials," which was written by Florian Obermüller, Luisa Greifenstein, and Gordon Fraser (apologies if I mispronounce your names). Here's the abstract for the paper:
Quote: "Block-based programming languages like Scratch are commonly used to introduce young learners to programming. While coding, learners may encounter problems, which may require teachers to intervene. However, teachers may be overwhelmed with help requests in a classroom setting, and in independent learning scenarios, teachers may not be available at all. Automated tutoring systems aim to help by providing hints, but misleading or confusing hints can be detrimental. To better understand the effects of automatically generated hints, in this paper we study a state-of-the-art hint generation system that provides suggestions when learners fail to complete a step in a programming tutorial. The system is evaluated using two cohorts of students aged 12 to 13, where one cohort receives only textual hints based on test failures while the other additionally receives visual next-step support in terms of illustrated code changes. We find that initially the automatically generated visual next-step hints increase the speed at which learners complete the steps of the tutorial and reduce the number of questions posed to teachers, without affecting the learners' overall understanding of their program negatively. However, with increasing complexity of the programs the quality of the hints degrades, thus calling for further research on improving hint generation systems." End quote.
For my one-sentence summary of the paper, I'd say that this paper investigates the impact of two different types of hint-generation approaches among two different classes.
In the introduction of this paper, the authors talk about how teachers can be overwhelmed, especially if they are new to CS and the students are also new to CS. Having done that before, I can say it is sometimes difficult to get things started, but after a couple of weeks, or even a month depending on how often you meet, students are able to sign into whatever platform they're using and are more comfortable using it without having a billion questions throughout each class. There are some steps you can take to set yourself up for success without a hint generation system, but having a system like this might help you avoid having to work with students one-on-one. Again, at the end of this episode I'll talk about some approaches I took without needing a hint generation
system. The authors do note that hints can take many different forms in terms of the kinds of approaches you can use, and they talk about that in the review of literature, which is titled section two, "Background." In it they discuss linters, tests, and Next Step hints as three different approaches, but I've mentioned in other episodes that there are some other approaches you can use, even in the types of questions that you ask: you can have open questions, guiding questions, or closed-ended questions, or even analytical, judicial, creative, etc. I'll include a link in the show notes at jaredoleary.com to some other podcast episodes that talk about questioning techniques, because that is something I am, quite frankly, a big fan of when it comes to education. Rather than telling somebody something, you can guide them through some questions to help them discover or uncover a solution to a problem, a bug, or whatever. So check out some of the other episodes linked in the show notes if you're interested in learning
more about that. The third section of this paper talks about the interactive tutorial system they designed for Scratch, so if you want to learn more about that, check it out. Basically, in this particular study they compare two different approaches: one is text-based, where the system explains step by step what you could do next, and the other adds pictures that show "instead of doing this, do this," illustrating the different blocks you might be able to use inside of Scratch. This particular paper was guided by four research questions, which I'm going to combine into one longer one: how does the hint system impact motivation, progression, help requests, and comprehension? Those are the four main areas we're going to explore in today's unpacking scholarship episode.
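Before getting into the results, here's a rough sketch of the general idea behind a next-step hint generator. This is entirely my own illustration, not the authors' system (theirs is far more sophisticated and also uses automated test results): compare the learner's current blocks for a tutorial step against the step's expected blocks and describe the first divergence.

```python
# Toy next-step hint generator (my own illustration, not the paper's system).

def next_step_hint(learner_blocks, expected_blocks):
    """Suggest a change based on the first point where the learner's
    block sequence diverges from the tutorial step's expected sequence."""
    for i, expected in enumerate(expected_blocks):
        if i >= len(learner_blocks):
            # Learner is missing a block: suggest adding it.
            return f"Next, try adding the '{expected}' block."
        if learner_blocks[i] != expected:
            # Learner used a different block: suggest replacing it.
            return (f"Instead of '{learner_blocks[i]}', "
                    f"try using the '{expected}' block.")
    return "This step looks complete!"

# Example: the learner skipped the loop in a boat-race-style step.
print(next_step_hint(
    ["when green flag clicked", "move 10 steps"],
    ["when green flag clicked", "forever", "move 10 steps"],
))
```

A real system has to match blocks structurally (nested scripts, multiple sprites) rather than as a flat list, which is part of why hints degrade as programs get more complex.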
In this particular study they looked at a boat race tutorial, and you can read more about that on page 398 if you're interested in what it was that students were actually creating through the tutorial they were following. The authors note they were able to compare two different classes: the teacher had mentioned that these students were at about the same level, and one of the classes had 19 students while the other had 22. One class was assigned the textual-only hints and the other was assigned the visually generated Next Step hints. So while the results from this study are not generalizable across large populations of students, it is interesting to explore the idea of automated hint systems that you might be able to use in your classes; Khan Academy, for example, has some built into their JavaScript tutorials. This is not a brand new idea; it has been around for quite some time, and the authors talk about this in the paper itself. But one thing I will caution is that every single class has a different vibe. When I first started teaching music, one of the schools I was at would have three third grade classes in a row, then three fourth grade classes in a row, then sixth grade, etc. It was back-to-back the same grade level, and it honestly made it easier for me to teach, because I didn't have to rearrange the room and get out different stuff for different portions of the day; I could leave things out for a couple of classes in a row, switch over to the next setup, and so on, which was great. What was even better is I was able to see just how distinctly different each one of those classes could be. It may have been the exact same lesson (I've talked about this before: they were mandated lessons, and you'd get in trouble if you didn't teach the exact same lesson in the same way on specific days, but that's a topic for a whole different podcast), yet when you teach the same thing and see how it is received completely differently by the exact same age group, one class right after the next, it really teaches you a lot about how distinctly different each class you work with can be. I say all this to say that there are two distinct classes in this study, and we cannot generalize the findings outside of those two unique classes. If it had been flipped, so that the text-based group was doing the visual one and vice versa, we may have gotten similar results or completely different results; we don't really know. But it is important to explore these kinds of questions, because we can learn through these explorations, which hopefully this podcast will help with.
On page 399 the authors start breaking down some of the results. The first one is: how do Next Step hints influence motivation? This is research question one, and here's a quote from page 399 that summarizes it: "Our study suggests that the Next Step hint tutorial system was well received and liked by the students and had a positive effect on the motivation of the students." End quote. If you want to see the general breakdown, you can find some figures on page 399.
It was interesting to look at: in general, the students tended to prefer the visual system more than the text-based one. I do wonder out loud if that would have changed if this were a text-based language; if they were doing JavaScript or something, would they prefer to read through the text rather than look at pictures of code? I don't know; that's a whole separate study, and this is just me thinking out loud. I imagine this was helpful for some students who did not want to ask for help. I found that to be fairly common, especially as students got into the late elementary and middle school ages: they didn't necessarily want to look, quote, "dumb" in front of their peers, so they would often not ask questions that were actually good questions, or share when they didn't understand something, because they didn't want to embarrass themselves in front of their peers. If you go to high school students, or even early elementary, they're generally at an age where they're not as concerned with that. So I'm kind of curious whether this would have made a difference for, say, kindergartners versus undergraduate students, but again, that's outside the scope of this particular study. It is something useful for educators to consider, though: every single age group is going to have different tendencies in terms of how comfortable they are asking for help. Going back to the class example, there were some classes where there was a norm, which the teacher established, that made it very comfortable for people to share their misunderstandings or questions, and there were other classes that were much more competitive, where students were not willing to share those kinds of things with their peers. So depending on what kind of class the homeroom teacher runs, or the teachers students see throughout the day (if they're in a middle school cycle where they have elective classes, etc.), that might have an impact on whether students are willing or interested in asking for help in the classes that you facilitate. It might have nothing to do with you as an educator when you first start teaching with that group, and it might take months to actually develop the trust that brings students to you for support. So just be patient and consistent with letting students know: "hey, I'm here to help you out, I'm not here to judge you; I'm still learning as a computer science educator about how to do this kind of stuff, so let's learn together." If that's the general vibe that you have, it will pay off in the long run, at least in my experience and that of a lot of the teachers I've spoken with.
Now, the next question is: how do Next Step hints influence progression? Here's a summary on page 400: "Our study suggests that Next Step hints generated at early stages are precise and helpful, leading to better performance, while hint generation is less reliable for more complex programs, negatively affecting performance." End quote. What they basically found is that towards the beginning of the tutorials, the hint generation system was more useful and students were using it more, but as they got into the more complex stages of the tutorials it was less helpful, especially when it would give a false recommendation. It might lead students down a path where it was actually giving bad information that would ultimately leave students unable to complete the project without a series of bugs.
Now, one of the interesting things they do on page 400 is talk about some of the less helpful hints that were provided. One of them is the starting-block suggestion, like "hey, in this next step you're going to use a when this sprite clicked block." For the students in this particular study, that wasn't as helpful as talking about what comes after it, which for me is interesting; I think there should be different layers of types of hints. When I'm providing a hint to a student, I might ask, "okay, what event do you think you're going to use to do the thing you want to do?" Then they have a list of events and they're going to go through it and figure it out, and I'll also guide them through "okay, then what is going to happen?" and then, and then, and then. One of the things you can do with Scratch, because it's set up into different types of blocks (there are motion blocks, sound blocks, events blocks, control blocks, etc.), is start with "what kind of event are you going to use?" if your students are having issues, and then "what type of block are you going to use next? Do you think you might use a motion block or a looks block?" If they don't know, you can say, "okay, what is it that you want the sprite to do? Do you want it to move around, or change the way it appears, or make a sound?" And if they say, "I wanted to make some kind of a sound," you go, "ah, it sounds like you might want to use one of the sound blocks; which one of these sound blocks do you think you would use in this particular scenario?" You can guide them through these different layers of hints, so you don't just hand them the next step and leave them, but actually help them think through the different steps in the sequence without necessarily giving them the answer.
The next two main areas of unhelpful hints the authors mention were when the suggested block was simply wrong, giving you a wrong direction to go down, and when the hint suggested unnecessary scripts, like telling you to use a "go to x: y:" block and then another block that sets x. Wait, that's kind of doubling up; you're setting x twice in a row, so why would you do that? That would cause some confusion with students, like, "wait, I thought this was doing one thing, but now it's telling me to do the same thing in a different way." Over time, though, I imagine these systems are going to get significantly better, especially when it comes to being able to help with more complex projects.
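As a toy illustration of that redundancy problem, a hint system could screen its own suggestions for consecutive blocks that overwrite the same sprite state before showing them to a learner. This is my own sketch, not anything from the paper, and the block-to-state mapping is an assumption for the sketch (a real analysis would inspect the actual Scratch program structure):

```python
# Which sprite state each block writes; an assumption for this sketch.
WRITES = {
    "go to x: y:": {"x", "y"},
    "set x to": {"x"},
    "set y to": {"y"},
    "point in direction": {"direction"},
}

def redundant_pairs(blocks):
    """Return consecutive block pairs whose written state overlaps,
    i.e. the second block immediately overwrites part of the first."""
    flags = []
    for first, second in zip(blocks, blocks[1:]):
        overlap = WRITES.get(first, set()) & WRITES.get(second, set())
        if overlap:
            flags.append((first, second, sorted(overlap)))
    return flags

# Flags the confusing pair from the example above: 'set x to' right
# after 'go to x: y:' overwrites x for no reason.
print(redundant_pairs(["go to x: y:", "set x to", "say hello"]))
```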
Research question three is on how Next Step hints influence help requests. There's a summary on page 401: "Our study suggests that Next Step hints increase the interactions of learners with the tutorial systems while lowering the number of questions to the teacher significantly." End quote. That totally makes sense: if students are able to use self-guided resources and support systems without having to ask the teacher for help, the teacher receives fewer requests, which makes teaching a lot easier, because you're able to sit down with the students or groups of students who need more assistance, while the ones who have just a minor question, or got slightly stuck, don't have to wait for you to answer. Again, at the end of this episode I talk about some approaches you can use where you don't need a next-step automated system or AI to help with this kind of approach.
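One low-tech version of such an approach: the layered guiding questions I describe elsewhere in this episode can be written down as a simple hint ladder, where each rung narrows the learner's search space without giving away the answer. This is my own sketch, not the paper's system:

```python
# A low-tech "hint ladder" of guiding questions (my own sketch).
HINT_LADDER = [
    "What event should start your script? Look through the Events blocks.",
    "What should happen next: moving, changing looks, or making a sound?",
    "Which block category matches that: Motion, Looks, or Sound?",
    "Within that category, which specific block do you think fits?",
]

def next_hint(level):
    """Return the hint for the given rung, repeating the last one
    once the learner has climbed past the end of the ladder."""
    return HINT_LADDER[min(level, len(HINT_LADDER) - 1)]

# A teacher (or tutorial system) would reveal one rung at a time.
print(next_hint(0))
print(next_hint(99))  # past the end: repeats the final rung
```

The point of capping at the last rung is that even the final hint still asks a question rather than handing over the answer.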
The last research question is on how Next Step hints influence comprehension, and the authors mention, quote, "our results suggest that comprehension is not significantly influenced by additional Next Step hints," end quote, from page 401. What I do wonder about is how this would compare if there had been a third group, a control group, where students did not receive Next Step hints at all: would comprehension be the same, less, or more? If comprehension is the same, but the system lowers the amount of work the teacher has to do, and students are able to move at a faster pace because they get more in-the-moment feedback, then cool, this sounds like a helpful system for the most part, as long as it doesn't send someone down a wrong path or suggest redundant code. Finally, the paper ends with a discussion of related work and some conclusions, which you can check out on page 401 if you're interested.
Now, at the end of these unpacking scholarship episodes, I like to share some lingering questions and thoughts I had while reading the paper. The first one is: what kind of projects can students create with such a tool? If you compared, over the course of a year, a class that used this kind of tool (whether text-based or visual) versus a class that did not, what types of projects might come out of it? One of the things that has been really interesting when listening to people at the Scratch Foundation talk about their workshops is that they measure the success of a workshop by the diversity of the projects that come out of it; that's one of the many things they'll take a look at, and it's a fascinating way of looking at it. Going with Mitch Resnick's discussion of projects, passion, peers, and play (which you can learn more about in episode 106, titled "Lifelong Kindergarten with Mitch Resnick"; it was a fantastic interview), one of the goals of exploring passion, peers, play, projects, etc. is to get people to express themselves, explore their interests, and create what they are interested in, rather than recreating the exact same thing as all of their peers. So, for example, one of the options my students had was to create stuff in Khan Academy. In Khan Academy, if you follow through the lessons and go through the scripts, by the end of each of the little units everybody creates basically the exact same project. The reason why is that the hint generation system they're using encourages students to create the exact same thing, because it has a very small range of acceptable answers that let you move on to the next step. So everybody's going to create a dog, or everybody's going to create a Pong game, or whatever, compared to "hey, use these shapes to draw an animal" or "hey, use these concepts and practices to create some kind of a game." Those are two very different approaches. I'd imagine that between two classes using those two different approaches, one more open-ended where students can create whatever they want, and the other where everyone follows the same steps and likely creates the exact same thing, the students who create what is interesting to them are going to produce more diverse projects, projects that are much more culturally relevant and interest-driven than the other approach, which kind of treats everybody as the same, creating the exact same thing. It also assumes buy-in. This approach can work really well if people sign up for it, like an elective class where students say, "hey, I really want to learn computer science, show me whatever you've got, I'm in, I bought into this, I want to learn this thing," compared to classes students are required to attend, like the K-8 classes I used to work with. You might have the best sequence of lessons for the students who are bought in, but the ones who wonder, "why do I need to learn this?" are going to think, "this is pointless, I'm not interested in making a game or a story or whatever." If you gave them options, like creating a sports project or a racing game, they might say, "yeah, I'm totally into that," but if that's not in your unit, well, they're going to be bored in your class.
All that being said, there is a way to actually do rhizomatic learning with these kinds of tools if you have enough projects to select from. For example, I previously created a bunch of curricular resources used by students around the world. With those curricular resources, you could go through sequentially: start with project one, go to project two, project three, etc. Or you could say, "here's a list of three different options; pick whichever one looks interesting to you, because they all explore the same concepts and practices in a different way." Or you could go a little further down the rhizomatic path and say, "hey, here are ten projects; pick one of those ten that looks interesting to you. It might be a game, it might be a story, it might be whatever." Or, going completely rhizomatic, you could say, "hey, pick any of these forty projects, or make up your own and we can storyboard it together and figure it out." That approach could use some of these automated tools, but it will get to the point where, if students are creating their own projects, they're going to run into questions that any kind of automated tool is likely not going to be able to help solve.
Which gets into my next question: how do you teach students to provide feedback to peers, and what about when other forms of feedback are unavailable? Each of the different approaches mentioned in this particular study could be done by yourself or by students with some kind of modification, and there's a ton of valuable lessons that can be learned when students are actually providing feedback to each other rather than relying on automated feedback. I don't know about you, but I started teaching when I was in high school: I taught a middle school class with the first drum instructor I ever had, in a summer school class, and I realized that I needed to learn how to say things in many different ways in order to help different students who had different issues with their drumming technique. By learning something deeply, holding a concept like a gem that you can look at from many different angles and facets and explore, I was able to better understand those concepts and practices, and the same thing can be done with computer science. So yes, an automated system might be very helpful for homeschool students or students working on their own at home, but if they are in a space where they have peers, I would argue that students can get more out of teaching each other; not necessarily learning from their peers, but actually teaching their peers. Even though it might be better for the individual student to learn from an automated feedback system, because it's much faster and they're able to get right to an answer and move on to the next step, my guess is that students will learn more in the long run if they learn how to teach their peers. That's why, when I was in the classroom doing this rhizomatic approach, where if there were 30 kids in the class they were working on 30 different projects across several different programming languages, the students had four steps
they followed. The first step was to check the built-in manuals, resources, etc.; so RTFM, "read the free manual," would be the family-friendly way of phrasing that. The next two steps were to ask their peers, and the final step was to ask me if I was not already working with somebody; if I was working with somebody, they had to repeat the first three steps. This made it so students were helping each other out quite a bit, and if they happened to get to step four and I knew somebody else could answer the question, I'd say, "oh, Johnny, you have an issue with that bug? Well, Susie knows how to solve that. Susie, can you come over here and help Johnny out with this?" Because, again, Susie is going to learn a lot by exploring that concept and reinforcing it when teaching Johnny the same thing she had an issue with
earlier. If, however, you're working with a curriculum where all the students are using a system with some kind of hint generation built in, cool, nothing wrong with that, but you can still add in opportunities for creativity. Let's say you see students five days a week: one thing you could do is have students follow the pre-scripted lessons (the ones that have them all create the exact same thing) on days one through four, Monday through Thursday, and then on Friday have them go into a blank project where they apply what they learned in a way that's individually meaningful. You could also do this with units. For example, Khan Academy has a unit on how to draw shapes; cool, once students finish that unit, I would have them go into a blank project and make something with those shapes. It could be architecture, a face, an animal, whatever. Maybe the next unit is "okay, now we're going to animate some of these shapes": awesome, now that you've learned animation and finished that second unit, go into the project you worked on previously and add some kind of animated shapes, movement, etc. That's how you can go through a pre-scripted sequence while also providing opportunities for culturally relevant, interest-driven learning within the same shared space.
Another question that I have is: as teaching, assessment, feedback, etc. become more automated, how will this impact teaching and learning? If you think about it, this is the worst that AI will ever be; it's only going to get better. So what is the role of an educator if AI can provide faster and more accurate answers than, say, a novice computer science educator? When I first started, I would often get questions where my response was, "I don't know the answer to that; let me look it up, and I'll come back to you tomorrow to try and help you out." If students realize that their teacher is not as helpful as an AI system, they're going to rely on that AI system more than on asking the teacher. So what is the role of the educator in that particular scenario? Is it going to get to a point where the teacher is just there to manage the classroom environment and doesn't necessarily need to know the content area? I don't know. For now, being able to treat each student as a human with unique goals will help teachers distinguish themselves from AI and where it's going to be five and ten years from now, but I would argue that if we focus on teaching to the group, rather than teaching individuals within a group context, we will not provide an experience that is above what AI can provide. I'm curious what you think: do you agree? Do you disagree? Do you have other questions about AI, or some papers you'd like me to explore? You can leave a comment on the YouTube video or on any of the other social media platforms linked on my website, jaredoleary.com. Just let me know.
And if you enjoyed this particular episode, or any of the almost 200 episodes, please consider sharing it with somebody else; it helps more people find the free resources on my website at jaredoleary.com. I've been adding a bunch of Scratch tips and whatnot over the past month, and there's plenty more to come. There's also a bunch of content around drumming and gaming (there are over 1,600 hours of drumming content), so if you're into that, check it out; odds are you're going to find something that will challenge you in a fun way. Anyway, thank you so much for listening. Stay tuned for next week's episode, which will also be on Scratch; it's going to talk about Scratch in physical education, which was an interesting paper to read. Until then, I hope you're all staying safe and having a wonderful week.
Article
Obermüller, F., Greifenstein, L., & Fraser, G. (2023). Effects of Automated Feedback in Scratch Programming Tutorials. In Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 1, 396-402.
Abstract
“Block-based programming languages like Scratch are commonly used to introduce young learners to programming. While coding, learners may encounter problems, which may require teachers to intervene. However, teachers may be overwhelmed with help requests in a classroom setting, and in independent learning scenarios, teachers may not be available at all. Automated tutoring systems aim to help by providing hints, but misleading or confusing hints can be detrimental. To better understand the effects of automatically generated hints, in this paper we study a state-of-the-art hint generation system that provides suggestions when learners fail to complete a step in a programming tutorial. The system is evaluated using two cohorts of students aged 12–13, where one cohort receives only textual hints based on test failures while the other additionally receives visual next-step support in terms of illustrated code changes. We find that initially the automatically generated visual next-step hints increase the speed at which learners complete the steps of the tutorial and reduce the number of questions posed to teachers, without affecting the learners’ overall understanding of their program negatively. However, with increasing complexity of the programs the quality of the hints degrades, thus calling for further research on improving hint generation systems.”
Author Keywords
Block-based programming, Automated feedback, Automated tests, Next-step Hints, Scratch
My One Sentence Summary
This paper investigates the impact of two different types of hint generating approaches among two different classes.
Some Of My Lingering Questions/Thoughts
What kind of projects can students create with such a tool?
How do you teach students to provide feedback to peers when other forms of feedback are unavailable?
As teaching, assessment, feedback, etc. becomes more automated, how will this impact teaching and learning?
Resources/Links Relevant to This Episode
Other podcast episodes that were mentioned or are relevant to this episode
Assessment Considerations: A Simple Heuristic
In this episode I read and unpack my (2019) publication titled “Assessment Considerations: A Simple Heuristic,” which is intended to serve as a heuristic for creating or selecting an assessment.
Fostering Intersectional Identities through Rhizomatic Learning
In this episode, Jon Stapleton and I read our (2022) publication titled “Fostering intersectional identities through rhizomatic learning,” which uses mapping as a metaphor for individualized learning.
How to Get Started with Computer Science Education
In this episode I provide a framework for how districts and educators can get started with computer science education for free.
Lifelong Kindergarten with Mitch Resnick
In this interview with Mitch Resnick, we discuss misconceptions people have around the four P’s (Projects, Passion, Peers, and Play) in Mitch’s book, encouraging depth of understanding while playing, what has surprised Mitch during his career, encouraging online communication and collaboration without creating artificial engagement, what Mitch wishes we’d see more of and discuss in CS education, our pet peeves with unplugged activities and computational thinking, accounting for survivorship bias with Scratch, expanding our focus on equity and inclusion to include both the “who” and the “how,” the importance of experimenting and learning through play, and much more.
Rhizomatic Learning with Catherine Bornhorst, Jon Stapleton, and Katie Henry
In this panel discussion with Catherine Bornhorst, Jon Stapleton, and Katie Henry, we discuss what rhizomatic learning is and looks like in formalized educational spaces, affordances and constraints of rhizomatic learning, how to support individual students within a group setting, standards and rhizomatic learning, why few people know and use rhizomatic learning approaches, how to advocate for and learn more about rhizomatic learning, and much more.
Talking About [Computer Science]: Better Questions? Better Discussions!
In this episode I unpack Allsup and Baxter’s (2004) publication titled “Talking about music: Better questions? Better discussions!” which is a short article that discusses open, guided, and closed questions, as well as a framework for encouraging critical thinking through questions. Although this article is published in a music education journal, I discuss potential implications for computer science educators.
Using Questions That Guide Mathematical Thinking to Think Computationally
In this episode I discuss some example questions we can ask to encourage kids to think deeper about computer science and computational thinking by unpacking two papers on using guiding questions in mathematics education. The first paper, by Way (2014), is titled "Using questioning to stimulate mathematical thinking," and the second paper, by Pennant (2018), is titled "Developing a classroom culture that supports a problem-solving approach to mathematics."
Find other CS educators and resources by using the #CSK8 hashtag on Twitter