Music Making in Scratch: High Floors, Low Ceilings, and Narrow Walls?
In this episode I unpack Payne and Ruthmann’s (2019) publication titled “Music making in Scratch: High floors, low ceilings, and narrow walls,” which problematizes the limitations of making music with Scratch.
-
Welcome back to another episode of the #CSK8 podcast. My name is Jared O'Leary. Each week of this podcast is either an interview with a guest or multiple guests, or a solo episode where I unpack some scholarship in relation to computer science education. In this week's episode I'm unpacking a paper titled "Music making in Scratch: High floors, low ceilings, and narrow walls?" This paper is written by William Payne and Alex Ruthmann and is available for free, so you can find a direct link to it in the show notes at jaredoleary.com or by clicking the link in the app that you're listening to this on. You'll notice there that this podcast is powered by BootUp Professional Development, which is the nonprofit that I work for. Go to bootuppd.org to learn more about the paid professional development or the free computer science education resources that I have created for BootUp. All right, here's the abstract
for this paper. Quote: "Music programming is an increasingly popular activity for learning and creating at the intersection of computer science and music. Perhaps the most widely used educational tool that enables music programming is Scratch, the constructionist visual programming environment developed by the Lifelong Kindergarten Group at the MIT Media Lab. While a plethora of work has studied Scratch in the context of children creating games and coding interactive environments in general, very little has honed in on its creative sound or music-specific functionality. Music and sound are such an important part of children's lives, yet their ability to easily engage in creating music in coding environments is limited by the deep knowledge needed in music theory and computing to easily realize musical ideas. In this paper, we discuss the affordances and constraints of Scratch 2.0 as a tool for making, creating and coding music. Through an analysis of limitations in music and sound code block design, a discussion of bottom-up music programming, and a task breakdown of building a simple drum loop, we argue and illustrate that the music and sound blocks as currently implemented in Scratch may limit and frustrate meaningful music making for children, the core user base for Scratch. We briefly touch on the history of educational music coding languages, reference existing Scratch projects and forums, compare Scratch with other music programming tools, and introduce new block design ideas to promote lower floors, higher ceilings and wider walls for music creation in Scratch." End quote.
Now, to summarize this paper in a single sentence, I'd say that it problematizes the limitations of making music with Scratch. You might be listening to this and go, "Okay, Jared, this is very specific. I know you have a background in music and computer science, but why?" The reason is that I think this serves as a wonderful example of some of the cautions around integration. A lot of people say they're going to integrate music and Scratch, or Scratch and ELA, or Scratch and math or science or whatever, and they do so in ways that really just skim the surface of a discipline (either computer science or the other discipline) in a way that might be described as subservient, which I've talked about in other podcasts. So in this episode we're going to dive deeper into a platform that I love. Listen to the interview with Mitch Resnick: I clearly have a passion for Scratch, as does Mitch. But even though I might love a platform, we need to talk about some of the problems. Note that this particular paper is about Scratch 2.0, and right now, at the time of this recording in November of 2022, Scratch 3.0 is out, so there are more options for making music. In general, though, the main critique of this paper still stands, in my opinion, which is why we're reading it. Well, I am; you're listening. But you can read it too; it's in the show notes. You know what I mean. Or maybe you don't, and that's okay.

All right, the background for this paper starts by describing how programming music is becoming more popular. However, a lot of the programs that people have explored through research tend to focus on computer musicians (people who are at a more expert level) rather than novice programmers or musicians, so the authors provide some examples of platforms or languages that specialize in creating music, usually at the professional or at least expert level. However, there are newer platforms that they mention, like Sonic Pi, which I actually facilitated in the fourth through eighth grade classes that I previously worked with; I even designed some curricular content for the university that I was contracted through several years ago, so it definitely can be done in the elementary-and-above space. There are also platforms like EarSketch, which I have talked about previously; that one I've typically seen in high school and sometimes in middle school. And there are other platforms, like Scratch. Now, before the authors talk about Scratch, they give you a little landscape preview of some of the other languages, like the ones I just mentioned, but also JythonMusic, Tone.js, etc. So if you're interested in actually diving into more musical examples, like if you listen to any of those and you're like, "Wait, I don't know that one," cool: take a look at this paper. Again, it's free, and maybe you'll learn something; maybe you'll find that perfect music platform you always wanted for coding music. But the next main section that I want to focus on is on Scratch.
So Scratch, as I've talked about in other episodes (in interviews and in unpacking scholarship episodes like this one), is this media arts platform where you can create games, stories, animations, music, all sorts of really cool things. Projects can be very simple or absurdly complex, to the point where I'm scratching my head trying to figure out how something works, which is great, because I learn something when I finally figure out, "Oh, that's how that works." But I'm a nerd and I like those problems. In this section they talk about how they had an undergraduate course, titled the Creative Learning Design class, that would create some Scratch projects that would then be user tested with first and seventh graders, which is an interesting range of students, and I mean that in a good way. I've actually known one of the authors, Alex, for several years now, maybe even a decade, and he's wonderful.

All right, in the next subsection the authors talk about the motivation behind the paper. They mention that at the time of this publication (or at least of writing it) Scratch had 33 million registered users and 35 million projects shared, and that Scratch is famously known for having low floors, high ceilings, and wide walls. I don't remember if Mitch mentioned that in the interview that I did with him, which again I highly recommend listening to, but that is something people very frequently associate with Scratch, and this paper questions it. Here's a quote; for me it's on PDF page three, though it might be on a different page for you. It's the first paragraph under the
motivation subsection. Quote: "However, when it comes to music projects, we believe Scratch has limitations. In terms of easily coding and playing pre-recorded and user-created audio files, one finds low floors. However, for users wishing to create music through sequencing musical notes or with engaging sounds and instruments, they often face high floors, low ceilings, and narrow walls due to complex numeric music mappings and mathematical representations, as well as data structures that limit musical expression." End quote.

Again, this is about Scratch 2.0, but I would argue that most of what they're talking about is still applicable in Scratch 3.0, the most current release. And even though this is about music, I'd argue it's also very applicable in other areas. We're about to dive a little deeper into the problems here, but think about some of the curricula that say they integrate coding and art. They might make it so you can draw something, and it kind of relates to geometry, maybe, depending on how you code it, but it's generally very limited in terms of the things you can create. While it's a fun or neat experience, it often pales in comparison to actually pulling out different tools that are designed for creating art, like pencil and paper, paintbrush and canvas, clay, glass, etc. So let's hone in on the music side
of things. The first subsection is on music functionality in Scratch 2.0. They provide some pictures of the blocks, and they compliment Scratch by noting that you get immediate feedback with these blocks: when you click on a "play note" block, you'll hear what that note is; when you click on a "play sound" block, you'll hear what that sound is. That makes it very easy compared to some other languages, where you might have to actually write things out. In figure 2 they show what this would look like in Scratch and then the same thing in Max (which was previously called Max/MSP) and in Tone.js, and it shows that it is significantly easier to have one block that you can click on to play a sound, as opposed to having to initialize and then sequence together a potential algorithm or whatever in order to actually hear the sound.

They also say that there are three different types of audio playback: MIDI pitched instruments, MIDI drums, and audio files as sounds or loops. What they mention is that although there are three different types, things are inconsistent among them. For example, you can set tempo for the pitched instruments and the drums, but you can't set tempo (i.e., speed) for your audio files, which are the sound blocks: the "play sound" and "play sound until done" blocks. By the way, "play sound until done," as the authors note, is one of the most confusing things for some of the students that I previously worked with, because they often don't realize that it won't move on to the next block in the algorithm until that sound has completed, however short or long it is, whereas the "play sound" block will basically play that sound simultaneously with the next block in the algorithm. It sometimes confuses kids, so if you're working in Scratch with music for the very first time, be prepared for that. I'd highly recommend thinking through some different questions that you might be able to ask them, and I've actually got multiple podcasts on questioning techniques if you want to check them out at jaredoleary.com; I'll link to them in the show notes.
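Since blocking versus fire-and-forget playback isn't unique to Scratch, here's a toy Python model of the difference (entirely my own sketch; this is not how Scratch's audio engine actually works, and the "sounds" are just timed waits): "play sound until done" holds up the script for the sound's full length, while "play sound" starts the sound and immediately moves on.

```python
import threading
import time

def play_sound(name, duration):
    """Toy stand-in for audio playback: just waits for the sound's length."""
    time.sleep(duration)
    print(f"{name} finished")

def play_sound_and_continue(name, duration):
    """Like Scratch's 'play sound': starts playback, returns immediately."""
    threading.Thread(target=play_sound, args=(name, duration)).start()

def play_sound_until_done(name, duration):
    """Like 'play sound until done': blocks until the sound completes."""
    play_sound(name, duration)

# 'play sound' overlaps with whatever comes next...
play_sound_and_continue("drum", 0.2)
print("this line runs while 'drum' is still playing")

# ...whereas 'play sound until done' holds up the script.
play_sound_until_done("meow", 0.2)
print("this line only runs after 'meow' finishes")
```

Kids usually expect the second behavior, which is why the first one surprises them when the next block fires before the sound ends.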
Now, the authors also mention that there's a bit of confusion around the pitched instruments and the drums in terms of how they work. For the "play note" block, there's a parameter where you're able to change what note it is, as well as the duration, so you might pick MIDI note 65. But for the drums, you're not picking a MIDI note; you're picking the instrument rather than a pitch, and then you also have the duration as a parameter. The authors note that it's just confusing for kids, because you're only able to have one pitched instrument per sprite going on at the same time (unless you get a little cheeky with the timing through parallelism), but for percussion you're able to have multiple instruments playing simultaneously. That can be a little confusing for kids. And speaking of the parameters I just mentioned, like the duration or the parameter for the pitch, these can be a constraint for some kids in terms of what timing is even possible with the numbers you can put in there, and how the numbers shape how people use it.
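If you've never worked with MIDI note numbers, the mapping from a number like 65 to an actual pitch follows the standard equal-temperament convention (A4 = MIDI note 69 = 440 Hz); this is general MIDI practice, not anything Scratch-specific, and can be sketched in a couple of lines:

```python
def midi_to_hz(note: int) -> float:
    """Convert a MIDI note number to a frequency in Hz (A4 = MIDI 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

# Middle C (MIDI 60) and the F above it (MIDI 65, from the example in the episode):
print(round(midi_to_hz(60), 2))  # ≈ 261.63
print(round(midi_to_hz(65), 2))  # ≈ 349.23
```

The point isn't the formula itself so much as the fact that "65" carries no obvious musical meaning to a novice, which is part of the high-floor problem the authors describe.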
In the final paragraph of this subsection, they talk about how Scratch doesn't actually allow you to synthesize new sounds, that is, to design or craft your own sounds by changing the waveform and whatnot through ADSR (attack, decay, sustain, release), which is a very fundamental feature of most digital audio workstations (DAWs), music programs, synthesizers, etc. By not having this functionality, students are limited to basic MIDI instruments or recorded sounds, so that too can be a limiting factor when engaging with music in Scratch. Then again, that might also be a good thing, because you're not overwhelming students with too many options. I'm going to argue with myself here, and the authors do that as well; they're very good at being able to say, hey, this is great, but also let's talk about some things to work on.
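For anyone who hasn't bumped into ADSR before, here's a toy Python sketch of a linear ADSR amplitude envelope. This is purely illustrative (real synths use smoother curves and this reflects no particular DAW's implementation): the envelope ramps up during the attack, falls to a sustain level during the decay, holds while the note is held, and fades out during the release.

```python
def adsr(attack, decay, sustain_level, release, sustain_time, rate=100):
    """Build a simple linear ADSR amplitude envelope as a list of samples.

    attack/decay/release/sustain_time are in seconds, sustain_level is 0..1,
    and rate is samples per second (kept tiny here for readability).
    """
    env = []
    # Attack: ramp from silence up to full volume.
    n = int(attack * rate)
    env += [i / n for i in range(n)]
    # Decay: fall from full volume to the sustain level.
    n = int(decay * rate)
    env += [1 - (1 - sustain_level) * i / n for i in range(n)]
    # Sustain: hold steady while the note is held.
    env += [sustain_level] * int(sustain_time * rate)
    # Release: fade from the sustain level back to silence.
    n = int(release * rate)
    env += [sustain_level * (1 - i / n) for i in range(n)]
    return env

env = adsr(attack=0.1, decay=0.1, sustain_level=0.6, release=0.2, sustain_time=0.3)
print(len(env), max(env), round(env[-1], 3))
```

Multiplying a raw waveform by an envelope like this is one of the most basic ways a synthesizer shapes a sound, and it's exactly the kind of control Scratch doesn't expose.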
Speaking of things to work on, here's the next section, which is on learning music coding and coding music in Scratch. They talk about how Scratch typically uses a bottom-up approach from a constructionist standpoint, which some critics have argued creates a bit of a problem with control structures: some kids will often use a ton of small scripts that aren't necessarily serving as functions, which makes things incoherent, etc. This approach of having to drag in a new block for each one of the pitches, instruments, durations, whatever, can make it so that you end up spending a lot of time just dragging out blocks and engaging in Scratch's version of syntax cleanup, rather than actually making music with code. Now, if you think
back to the episode that I did about Sonic Pi: Sonic Pi uses the language Ruby, which is a very lightweight language, and one of the reasons that language was used is because it's so quick to type. You're not bogged down by the syntax; you're not having to drag and drop blocks or type in semicolons; you're just able to quickly create music, which is important because that platform, generally speaking, is intended to be used live, for creating live music. Scratch, on the other hand, can be a little difficult or cumbersome for creating longer pieces of music. For example, in figure 3 they show that "Twinkle, Twinkle, Little Star" requires a bunch of blocks all sequenced together, and it's not very easy to understand what's going on just by looking at it, unless you happen to be more fluent or experienced with music making.

One thing they kind of offhandedly mention is that for moments like these, where you have a song whose algorithm is absurdly long, it would sometimes challenge kids (or even adults) to think about how you could make the song with as little code as possible, which I think is a really interesting challenge. Jon Stapleton has mentioned this before in conversations; if you've listened to the episodes with Jon (I'll link them in the show notes, because Jon is awesome), he has mentioned "tiny code," which is this really unique thing I've seen on Twitter and some other places where people try to create artwork or music or even a game with only a few lines of code, which is just really fascinating. If you're also a code efficiency nerd like I am, I highly recommend diving deeper into that world, and I'll try to include some links in the show notes.
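To make the tiny-code idea concrete, here's a hypothetical Python sketch (the phrase structure and note numbers are mine, not from the paper) that encodes "Twinkle, Twinkle, Little Star" as data plus reuse. Instead of one "play note" block per note, the whole song becomes three phrases arranged in order:

```python
# The tune as data: (MIDI note, beats). C=60, D=62, E=64, F=65, G=67, A=69.
PHRASE_A = [(60, 1), (60, 1), (67, 1), (67, 1), (69, 1), (69, 1), (67, 2)]
PHRASE_B = [(65, 1), (65, 1), (64, 1), (64, 1), (62, 1), (62, 1), (60, 2)]
PHRASE_C = [(67, 1), (67, 1), (65, 1), (65, 1), (64, 1), (64, 1), (62, 2)]

# The whole song is just phrases reused in order: A B C C A B.
SONG = PHRASE_A + PHRASE_B + PHRASE_C + PHRASE_C + PHRASE_A + PHRASE_B

for note, beats in SONG:
    print(f"play note {note} for {beats} beat(s)")  # stand-in for real playback
```

Forty-two notes collapse into a handful of lines because the code captures the song's repetition, which is exactly the kind of abstraction that a long chain of blocks obscures.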
Now, there's a really interesting discussion right after figure 3 (this is on PDF page eight for me) about how the typical approach used to create music in platforms like Scratch, or honestly even in some music classes, focuses on very granular music concepts, often out of context, and does not encourage or enable kids to apply their prior understandings, or even experiment with new understandings, in a meaningful way. They provide a really interesting example of a different type of curriculum that uses chunks of melodies and allows students to focus on larger chunks, rather than individual notes, to experiment with different phrases, tunes, etc. It's not until later that they actually dive into the individual notes and pitches and being able to modify those.

A way I can try to relate this to people who are not as versed in music making is to think of it like an algorithm that runs a sequence of distinct functions: you have a function that makes a robot do a flip, another function that makes the robot spin around in circles, and another function that makes the robot, I don't know, shoot a basketball or something. The approach discussed here begins by moving those functions around into a different order and seeing what happens: what if I remove this one, add in more of this other one, change the sequence, etc.? What could we make the robot do, or in this case, what could we make for a new melody or song? It's not until later that you actually double-click on one of those functions and go, okay, that function for making the robot shoot a basketball: what if you wanted to change it so that it would shoot a long shot or a short shot, very high or low, or a bounce shot, etc.? So it doesn't start there; it starts with the abstraction (shooting a ball), without being able to focus on the granularity, and then dives into it later.
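Translating that robot analogy into code, a sketch might look like this (the function names and behaviors are entirely hypothetical, just to show "rearrange whole chunks first, open them up later"):

```python
# Each function is a reusable chunk. Learners first rearrange whole chunks,
# and only later open one up to tweak its internals via parameters.
def flip():
    return ["flip"]

def spin(times=1):
    return ["spin"] * times

def shoot(distance="short"):
    return [f"shoot-{distance}"]

# Step one: play with the order of whole chunks.
routine = flip() + spin(2) + shoot()
print(routine)

# Step two (later): open a chunk up and change its parameters.
routine = shoot(distance="long") + flip()
print(routine)
```

Swap "robot moves" for "melodic phrases" and you have the chunked-melody curriculum the authors describe: meaning first, granularity later.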
This was a really interesting point, which is why I'm sitting with it a little more, and it reminds me of the use-modify-create progression that I've talked about previously, where you gradually work your way down toward the nitty-gritty of creating. First you start off just playing with something, testing it out, figuring out what you can do in a platform or a game or whatever; then you start modifying, maybe changing some of the functions around like I mentioned, maybe changing a parameter or two; then, after some comfort, maybe you create your own. The same thing could be done with integration: you could focus on the more abstract things so kids can immediately create something satisfying, rather than jumping straight into "okay, we're going to learn how to sequence together notes." Well, maybe that's interesting, maybe not. And to argue with myself from earlier: in Sonic Pi you could synthesize sounds and adjust all sorts of stuff, which was cool, but I didn't start kids there. I started with them just being able to sequence together, or play in parallel, different musical snippets or loops, or even some sequences of MIDI notes and whatnot, kind of like with Scratch, but with way more control over what you can do with them. Here's a really interesting
paragraph; this is the third paragraph right after figure 3, on PDF page eight. Quote: "Examined from the music technology angle, Freeman and Magerko classify educational music programming languages by the layer of musical interaction they afford and thus the form of musical thinking they prescribe. Sub-symbolic programming languages prioritize synthesis and control of time at an extremely low level. Symbolic languages offer editing at the individual note level, usually via MIDI. Hierarchical languages deal with musical form and allow loops or motives to be layered and remixed. The most powerful of the bunch are clearly sub-symbolic languages, offering near limitless flexibility and control for expert and professional coders, yet the manipulation of low-level processes requires patience, determination, and a lot of time and experience to build up the knowledge to implement in a practical setting. We have observed that novice Scratch users sometimes get hung up and frustrated after the first few short programs they write, lacking even a moment of true self-expression and failing to develop the knowledge and motivation to progress further." End quote.

And the next paragraph says, quote: "Hierarchical languages do not always support lower level sound editing, meaning that advanced users will inevitably reach a ceiling on what they can express with the tool. EarSketch, for example, allows the programming of beats but disallows modifying single pitches within a melody. However, this constraint is not necessarily a bad thing. Educational programming languages should not serve as the end-all be-all, but rather as a meaningful stage in a learner's development." End quote. That's a
great point. I'm recording on a program called Ableton Live; it kind of looks like an Excel spreadsheet depending on what mode you're in, or it looks more like a traditional DAW (digital audio workstation) where you can record multiple audio or MIDI tracks simultaneously. I love it; it's something that professional musicians, producers, etc. use. And so when I get on a platform like GarageBand, which is more entry level, I'm just like, "Wait, I can't do this either?" I just feel so held back. But if I had never used a DAW before and was brand new to music production and whatnot, I think I'd probably much rather prefer GarageBand, an entry-level tool, to Ableton, a professional-level one. So even though Scratch might not have the best ways of integrating music, perhaps it's best for the experience level of the kids you're working with. It depends on what their goals are and what their experiences and backgrounds are, because again, everything in education is pretty gray, and things that are great in some contexts are terrible in others.

And by the way, creativity within constraints is a whole thing. Actually, in the discussion forum that I looked at for my dissertation, one of the ten themes that came up was people talking about why they engage with chiptunes, which is music made with retro video game and computer hardware. The reason they liked it is because of the constraints: it was difficult to make, and they were limited in the number of oscillators or tracks, how fast or slow they could go, their pitch range, or even what kinds of waveforms they could actually use, all that fun stuff. Some people really like those constraints, and other people looked at it and went, "Nah, I like the aesthetic of chiptunes, but I want to be able to add in a bunch of other stuff," and so there's a big old debate in the community. It's really interesting; check it out in the dissertation. I'll link to it; I think it's in chapters four and six where I talk about it. Anyway, the next
section of the paper is titled "Inconsistent timing in Scratch music playback." Honestly, I think this section makes more sense if you look at the different figures, because they basically talk about how, if you want a beat with a bass drum, a hi-hat, and a snare all playing at the same time, and you want to loop it, it's very difficult to have all three of those line up consistently over an extended period of time: they slowly drift out of sync. The authors talk about different workarounds that people have tried, different ways you can do it, but the main point is that something as basic as having three parts playing at the same time is surprisingly difficult to pull off in Scratch 2.0.
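A toy Python model shows why this kind of drift happens: if each repetition waits "one beat plus a tiny bit of processing overhead," the error accumulates forever, whereas scheduling every event against an absolute clock keeps the parts aligned. The 2-millisecond overhead below is a number I made up for illustration, not a measurement of Scratch.

```python
BEAT = 0.25      # seconds per beat
DELAY = 0.002    # imaginary per-beat overhead added to every relative wait

def relative_timestamps(beats):
    """Each wait is BEAT + DELAY, so the error grows with every beat."""
    t, out = 0.0, []
    for _ in range(beats):
        out.append(t)
        t += BEAT + DELAY
    return out

def absolute_timestamps(beats):
    """Each event is computed from the start time, so error never accumulates."""
    return [i * BEAT for i in range(beats)]

drift = relative_timestamps(100)[-1] - absolute_timestamps(100)[-1]
print(f"after 100 beats the relative-timed part is {drift:.3f}s late")
```

Three parts each accumulating slightly different overhead is exactly the out-of-sync drum loop the paper's task breakdown walks through.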
little pitter patter it's my dog walk
around in the background he had
contractors over all day so I'm like
recording later in the evening and she
wants to be with me so I'm letting her
so if you hear a little pitter-patter
that's Minnie say hi alright so in the
next section, the authors talk about some potential ideas for new ways that Scratch 3.0 might be able to fix some of these constraints (Scratch 3.0 is now already out). They also mention ScratchX, which is the experimental side of Scratch, and it shows some really cool things that haven't come out in the main version of Scratch, as far as I'm aware. I'll include a link to ScratchX if you haven't tried it before; there's some really cool stuff you can do in it that you can't do in normal Scratch. One example was being able to time your blocks to the music in a Spotify song, which is really cool.

At the end of these unpacking scholarship episodes I usually like to share some lingering questions and thoughts. I've already shared some of my thoughts throughout this, but the main point is that I wanted to problematize this idea: in what ways might CS integration constrain engagement with one or more disciplines? This particular paper is saying it's constraining music making, but I would argue that if you had a general music lesson that used computational thinking, that would constrain your engagement with computer science, so it goes both ways; obviously that depends on how you implement it. Just a very general, quick example: far too often, when we look at a Venn diagram of where one discipline or domain might overlap with another, we focus on the connection points at the cost of everything outside those connection points on the Venn diagram. So yeah, you can talk about how to sequence notes together to create a melody if you do music and computer science, but depending on the platform, you might not even be able to discuss what timbre is and how to change or shape it, if the platform doesn't allow for that. But going back to what I mentioned earlier, constraints aren't always a bad thing. I also think it's important to consider when that's okay and when that's problematic. There's no one-size-fits-all answer, so it's really up to you and the kids you're working with, in whatever context you're in.

I enjoyed this paper. I think it was really cool to think through and problematize some of the constraints or limitations of a platform that I very much enjoy, as you can hear in the podcasts I've linked in the show notes; the one with Mitch Resnick in particular talks about Scratch a good amount. I just want to say I appreciate you listening, and again, if you're interested, this paper is available for free; I include a link to it in the show notes, so check it out at jaredoleary.com. Thank you so much for listening to this week's episode. Stay tuned for another episode next week. Until then, I hope you're all staying safe and are having a wonderful week.
Article
Payne, W., & Ruthmann, S. A. (2019). Music making in Scratch: High floors, low ceilings, and narrow walls? The Journal of Interactive Technology & Pedagogy, 15, 1–23.
Abstract
“Music programming is an increasingly popular activity for learning and creating at the intersection of computer science and music. Perhaps the most widely used educational tool that enables music programming is Scratch, the constructionist visual programming environment developed by the Lifelong Kindergarten Group at the MIT Media Lab. While a plethora of work has studied Scratch in the context of children creating games and coding interactive environments in general, very little has honed in on its creative sound or music-specific functionality. Music and sound are such an important part of children’s lives, yet their ability to easily engage in creating music in coding environments is limited by the deep knowledge needed in music theory and computing to easily realize musical ideas. In this paper, we discuss the affordances and constraints of Scratch 2.0 as a tool for making, creating and coding music. Through an analysis of limitations in music and sound code block design, a discussion of bottom-up music programming, and a task breakdown of building a simple drum loop, we argue and illustrate that the music and sound blocks as currently implemented in Scratch may limit and frustrate meaningful music making for children, the core user base for Scratch. We briefly touch on the history of educational music coding languages, reference existing Scratch projects and forums, compare Scratch with other music programming tools, and introduce new block design ideas to promote lower floors, higher ceilings and wider walls for music creation in Scratch.”
My One Sentence Summary
This paper problematizes the limitations of making music with Scratch.
Some Of My Lingering Questions/Thoughts
In what ways might CS integration constrain engagement with one or more disciplines?
When is that ok and when is that problematic?
Resources/Links Relevant to This Episode
Other podcast episodes that were mentioned or are relevant to this episode
Computer Science in Music (CSTA Wyoming interview)
In this episode I'm a guest on CSTA Wyoming's podcast for computer science educators and I answer some questions about the intersections of music and computer science.
Computational Literacies with Michael Horn
In this interview with Michael Horn, we discuss computational literacies vs computational thinking, power in literacy, cultural imperialism, the impact of programming language on identity, the intersections of music and CS, and so much more.
How to Get Started with Computer Science Education
In this episode I provide a framework for how districts and educators can get started with computer science education for free.
Lifelong Kindergarten with Mitch Resnick
In this interview with Mitch Resnick, we discuss misconceptions people have around the four P’s (Projects, Passion, Peers, and Play) in Mitch’s book, encouraging depth of understanding while playing, what has surprised Mitch during his career, encouraging online communication and collaboration without creating artificial engagement, what Mitch wishes we’d see more of and discuss in CS education, our pet peeves with unplugged activities and computational thinking, accounting for survivorship bias with Scratch, expanding our focus on equity and inclusion to include both the “who” and the “how,” the importance of experimenting and learning through play, and much more.
Programming Music with Sonic Pi Promotes Positive Attitudes for Beginners
In this episode I unpack Petrie’s (2021) publication titled “Programming music with Sonic Pi promotes positive attitudes for beginners,” which investigates student attitudes around enjoyment, importance, and anxiety when coding music through Sonic Pi.
Reconceptualizing “Music Making:” Music Technology and Freedom in the Age of Neoliberalism
In this episode I unpack Benedict and O’Leary’s (2019) publication titled “Reconceptualizing “music making:” Music technology and freedom in the age of Neoliberalism,” which explores the use of computer science practices to counter neoliberal influence on education.
Rhizomatic Learning with Catherine Bornhorst, Jon Stapleton, and Katie Henry
In this panel discussion with Catherine Bornhorst, Jon Stapleton, and Katie Henry, we discuss what rhizomatic learning is and looks like in formalized educational spaces, affordances and constraints of rhizomatic learning, how to support individual students within a group setting, standards and rhizomatic learning, why few people know and use rhizomatic learning approaches, how to advocate for and learn more about rhizomatic learning, and much more.
Talking About [Computer Science]: Better Questions? Better Discussions!
In this episode I unpack Allsup and Baxter’s (2004) publication titled “Talking about music: Better questions? Better discussions!” which is a short article that discusses open, guided, and closed questions, as well as a framework for encouraging critical thinking through questions. Although this article is published in a music education journal, I discuss potential implications for computer science educators.
Thinking through a Lesson: Successfully Implementing High-level Tasks
In this episode I unpack Smith, Bill, and Hughes’ (2008) publication titled “Thinking through a lesson: Successfully implementing high-level tasks,” which provides a heuristic that can be used to prepare for a lesson.
The Subservient, Co-Equal, Affective, and Social Integration Styles and Their Implications for the Arts
In this episode I unpack Bresler’s (1995) publication titled “The subservient, co-equal, affective, and social integration styles and their implications for the arts,” which “examines the different manifestations of arts integration in the operational, day-to-day curriculum in ordinary schools, focusing on the how, the what, and the toward what” (p. 33).
Check out my dissertation, if you’re feeling extra nerdy today
Find other CS educators and resources by using the #CSK8 hashtag on Twitter