Intersections of Popular Musicianship and Computer Science Practices

In this episode I unpack my (2020) publication titled “Intersections of popular musicianship and computer science practices,” which discusses potential implications of hardware and software practices that blur the boundaries between music making and computer science.

Welcome back to another episode of the CSK8 podcast. My name is Jared O'Leary. Each episode of this podcast is either an interview with one or more guests or a solo episode where I unpack some scholarship in relation to computer science education. In this week's episode I'm unpacking a paper titled "Intersections of popular musicianship and computer science practices" by Jared O'Leary (that's me). Here's the abstract for this paper:

"Since the introduction of music education within public schools, curricular offerings have not developed in parallel with the myriad of practices popular musicians engage with outside school contexts. In other academic disciplines such as computer science, curricula and practices are iterative in nature and are responsive to the ever-changing practices that people engage with outside educational contexts. Although popular musicians are using computer science practices for music-related purposes, such practices are seldom discussed within music education scholarship. This article begins with an exploration of such intersections by describing hardware practices popular musicians use to modify, design or build electronic devices for music-related purposes. The following section introduces coding practices that people use to create and modify music software, as well as to make music with code. The article concludes by unpacking potential implications and considerations for educators interested in the intersections of popular musicianship and computer science practices."

If I were to summarize this article in a single sentence, I'd say that it discusses potential implications of hardware and software practices that blur the boundaries between music making and computer science.

Now, I have to admit it is a little weird to unpack my own papers. I've done this before in the affinity space episode and in the episode on working with trans students, which was written by trans educators. But because I'm in a somewhat unique position, in that my background is in music education and I've been working in computer science education for the last several years, and because people have expressed interest in this intersection, I figured I should talk about this particular paper. You can find a link to it in the show notes, which are available in the app you're listening to this on by clicking the link in the description, or by going to jaredoleary.com and clicking the podcast tab. While you're on my website you'll find hundreds, if not thousands, of resources relevant to computer science educators, including a link to bootuppd.org, the nonprofit I create curriculum for and collaborate on research grants with, as well as some other content I create related to gaming and drumming, because I'm that kind of nerd.

All right, so in the opening section of this paper I contrast music education, which was first introduced in public schools in 1837 by Lowell Mason, with STEM and CS education. For music education, honestly, not much has changed since it was first introduced. There are some more ensembles, but those ensembles usually have about a 20- or 30-year lag between when it's popular to engage in that kind of music making outside of school and when it's finally introduced in schools. Now compare this with computer science and STEM education: what's going on outside of schools, in terms of how people engage with computer science and STEM, is often also how they engage with it in schools. What's interesting is that many computer science educators and scholars have explored this intersection of music and CS, but there haven't been many music educators talking about it. So in this paper, which was written for music educators, I wanted to say: hey, here are some really interesting hardware and software practices that blur the boundaries between music making and computer science, and then talk about the implications, both for music educators and for computer science educators.

In the first main section of this paper I discuss hardware practices, specifically hardware practices that blur the boundaries between music making and music learning on the one hand and computer science and electrical engineering on the other, by modifying, building, or designing electronic devices for music-related purposes. These music-related purposes are often about enabling music making, or enhancing or extending music making in some way. One of the interesting findings from my dissertation is that although practices such as electrical engineering, reading circuit diagrams and schematics, and manufacturing devices typically sit outside of music education contexts (they're rarely discussed within them), the musicians I studied in an online discussion forum with over 10,000 members often expressed that these practices were essential to their music making, which is really interesting. So let's talk about some of them.

The first subsection is on circuit bending. Picture a battery-operated children's toy like a Speak & Spell, or one of those toys where you spin a wheel and whatever animal it lands on, the toy makes the sound that matches that animal. To circuit bend a device like that, you would basically take a wire, touch one end of it to one point on the circuit, touch the other end to some other metal part, and then see whether and how it changes the sound. What can happen is that by changing the voltage you can add all sorts of different effects. For example, with the Speak & Spell, Ghazala, who is generally considered the person who discovered circuit bending, found six different types of circuit bends. These different bends allowed you to loop back audio or change the pitch; for instance, you might add a potentiometer, so when you turn the knob to the right you get more signal and when you turn it to the left you get less, which changes how fast the audio signal moves through the device and makes the sound higher and lower as well as faster and slower. There are bends called body contact vibrato, where your body's electrical conductivity literally affects the pitch of the device, so depending on how you move or how close you are to the device it changes the vibrato, which gives a kind of ghostly vibrato effect, and so on. If you want to read a little more about that, check out page 156, but here's a passage on that page that I want to read, which basically explains how to do circuit bending, drawing on Ghazala: "(a) begin with a low-voltage, battery-powered circuit that produces sound, and touch one end of a conductive wire to the circuit and the other end of the conductive wire to another point on the circuit; (b) if there is a sound or a change in an existing sound, mark the start and end locations and move on to another endpoint on the circuit; (c) once all possible endpoints are tested with one starting location, move to a new starting location and repeat the previous steps; (d) when all possible combinations of start and end points have been tested, Ghazala recommends creating permanent connections between marked locations by soldering switches, buttons, or potentiometers to the electronic device to manually break or adjust the voltage of a discovered bend."

So then, once you've done all that, you could perform with the new device, or you could record the audio from it and create sample-based recordings, like making a beat or a melody. There are a lot of really cool things you can do with these circuit-bent devices, and it's a relatively cheap and easy way to experiment with hardware practices that blur the boundaries between music making and some CS practices.

The next subsection is on hardware modifying. Here's a quote from page 157 that I want to read: "rather than creating music with pre-manufactured musical instruments, students can modify electrical hardware to create new musical instruments or interfaces for creating or modifying music." So, for example, the original Game Boy was designed to have headphones plugged into it; it was not designed to record out of, or to perform over loudspeakers. The original design of the device didn't have great sound quality when you plugged in an audio jack and tried to record or project it. So what people did is a process called a prosound mod: you rewire the device so that instead of the signal coming out of the built-in audio port, it comes out of two new ports, one for each channel, which gives you a cleaner and louder signal. People use this to perform with, record with, and so on. Another example, which takes this a step further and which I provide in this article, is Moldover, who augments MIDI controllers by adding new buttons, knobs, keys, pads, and so on, to make it possible to do more with that controller than it was originally designed to do.

Now, if we dive even deeper into the hardware practices, some people will actually design and manufacture new devices specifically for music-related purposes, and this can be for both electronic and acoustic instruments. As an example on the electronic side, in my dissertation I found that the chip musicians in the discussion forum I investigated would often design interfaces that allowed them to connect retro video game consoles with modern devices like a PC or Mac, which made it possible to control the sound chip of, say, a Sega Genesis or Mega Drive using MIDI information sent from their laptop.
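To give a sense of what the software half of that workflow can look like, here is a minimal sketch of sending MIDI note messages from a laptop in Python using the mido library. The port name, the note numbers, and the existence of a hardware interface wired to a console's sound chip are all assumptions for illustration; this is not a description of any specific interface from the paper.

```python
# Minimal sketch: sending MIDI notes from a laptop to an external interface.
# Assumes the `mido` library (with a MIDI backend such as python-rtmidi) is
# installed and that a hardware interface appears as the named output port.
import time
import mido

PORT_NAME = "Retro Console Interface"  # hypothetical port name

with mido.open_output(PORT_NAME) as port:
    # Play a short ascending arpeggio: MIDI note numbers 60, 64, 67 (C, E, G).
    for note in (60, 64, 67):
        port.send(mido.Message("note_on", note=note, velocity=100))
        time.sleep(0.25)  # hold each note for a quarter of a second
        port.send(mido.Message("note_off", note=note))
```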

All right, so that's a very quick summary of the hardware practices. We have a range from tinkering and experimenting with battery-operated toys, to adding buttons, knobs, and so on, or otherwise modifying devices to make music with them, to designing and manufacturing new devices that enable new ways of making music. These practices are pretty atypical forms of engagement in school music curricula, at least in the K-12 context; however, there are many institutions in higher education that are exploring them in music technology or computer science programs.

Now, on the software side of things, the coding side, there are some other really interesting practices that I discuss in the next section of the paper. One of the really interesting findings from my dissertation is that there was a discussion forum topic where somebody asked whether coding practices were considered off-topic for music making, and members responded with comments such as, "I think it is more than on topic, and it is interesting to read even if I barely understand anything." I then include a citation from another book, by Hugill; this is from page 209 of the book, or page 158 of the article, and the quote is: "in order to escape the limitations that commercial software imposes on the musician, I think it is important to be able to work in an environment where you are free to compose your own instruments or tools." In other words, engaging in coding practices to enable or expand music making. For example, when I mentioned the Game Boy: there wasn't software for composing and performing with Game Boys, and Nintendo never released anything like that, but because people wanted to make music with those devices, fans of the Game Boy wrote their own software, called trackers, which allowed them to compose and perform music with Game Boys.

The next subsection, on creating and modifying music software, covers that example of people creating software in order to be able to make music. This grew out of mod practices from the demoscene, where people would modify the introduction to a game to demonstrate their coding prowess by adding new graphics, sound effects, music, and so on. It eventually spun off into its own music-related practice known as chiptunes, where people would create software, or write music with software, often in the form of trackers. I include a screenshot on page 159 which shows a tracker, Little Sound Dj (LSDJ), on the Game Boy, so you can see what it looks like. Now, although a select number of people ended up creating trackers like LSDJ, a larger number of people modified them for their own purposes. This was possible because many of the original creators would release the source code and encourage modification; they were often doing this part-time for fun and couldn't add all the features other people wanted, so they'd share their source code and say, hey, if you've got a feature you'd like to add, go for it, and then those people would share their mods.

How this applies to music education and CS education in the K-12 context is, quoting the article: "rather than using pre-packaged music software with inherent limitations and biases for certain modes of music making and learning, students might augment or alter the code of music software to not only tinker with inherent constraints but to circumvent such constraints." As an example of a constraint, what if you have music software that only lets you play between 100 and 200 beats per minute? What if you wanted to go slower than that, say 80 beats per minute, or really fast, like 250 beats per minute? Well, you could simply go into the code and change the minimum and maximum beats per minute. That lets you increase your tempo range, and it's a very simple example of how you might modify software to remove constraints that are designed into it.
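As a hedged illustration of what that kind of modification might look like in practice, here is a tiny Python sketch of a hypothetical tracker's tempo handling; the constant and function names are invented for this example rather than taken from any real tracker.

```python
# Hypothetical excerpt from a tracker's source code: the original author
# limited the tempo range, and a modder widens it by editing two constants.
MIN_BPM = 40    # was 100 in the original release (illustrative values)
MAX_BPM = 300   # was 200 in the original release

def set_tempo(requested_bpm: int) -> int:
    """Clamp a requested tempo to the range the software allows."""
    return max(MIN_BPM, min(MAX_BPM, requested_bpm))

print(set_tempo(80))   # 80 (now possible after the modification)
print(set_tempo(250))  # 250
```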

Now, some software actually encourages this. In digital audio workstations, also known as DAWs, there are programs called virtual studio technology plug-ins, or VSTs. These VSTs are basically add-ons or expansion packs for your software: they let you open little sub-programs that enhance the software in some way. People will often code and share their own VSTs with other musicians and programmers so that they can make music in ways that were not originally designed into the software.
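To give a rough sense of what such an add-on does conceptually, here is a minimal Python sketch of an effect that processes a block of audio samples handed to it by a host. Real VSTs are typically written in C++ against a host's plug-in API, so treat this only as an illustration of the idea, with invented names.

```python
# Conceptual sketch of what an audio add-on does: the host hands the effect a
# block of samples, and the effect returns a processed block. This only shows
# the idea; it is not a real plug-in API.
from typing import List

class GainEffect:
    def __init__(self, gain: float = 0.5):
        self.gain = gain  # a parameter the host could expose as a knob

    def process(self, block: List[float]) -> List[float]:
        # Scale every sample and clip to the valid range.
        return [max(-1.0, min(1.0, sample * self.gain)) for sample in block]

effect = GainEffect(gain=0.25)
print(effect.process([0.0, 0.5, -1.0]))  # [0.0, 0.125, -0.25]
```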

Now, many music educators I've spoken with would probably look at that and go, okay, well, you're just programming software, so what does that have to do with music making? What I would argue is that you need to know both music and computer science concepts in order to create music software or add-ons like VSTs. Here's a quote from my dissertation, from page 215: "consider a simplified example of what a person would need to know to enable user-controlled volume within a tracker. From a music education perspective, a person would need to understand that sounds and music can change volume, and they might label that concept dynamics, volume, or even amplitude. From a computer science perspective, a person would also need to understand that a symbolic label can keep track of a value that changes through user interaction, and they might label that concept a variable. Without understanding that volume can change (music education concept) and that a variable representing the numeric value of the volume can change through user interaction (computer science concept), a person will be unable to create this simple interface. This simplified example demonstrates a person would apply concepts from both disciplines to enable user-controlled volume within a tracker."
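Here is a minimal sketch of that simplified example in Python, with invented names: a variable (the computer science concept) keeps track of a volume level (the music concept) that changes through user interaction and scales each sample before playback.

```python
# A variable tracks a volume level that the user can change; every sample is
# scaled by it before playback. Function and slider names are illustrative.
volume = 0.8  # 0.0 = silent, 1.0 = full amplitude

def on_volume_slider_moved(new_value: float) -> None:
    """Called whenever the user drags a hypothetical volume slider."""
    global volume
    volume = max(0.0, min(1.0, new_value))

def scale_sample(sample: float) -> float:
    """Apply the current volume to one audio sample before playback."""
    return sample * volume

on_volume_slider_moved(0.25)
print(scale_sample(0.9))  # 0.225 (quieter because the user turned it down)
```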

In other words, yes, you might be spending a lot of time coding, but you have to understand music, and you're creating a program for music-related purposes. So I would personally argue that this is very relevant to music educators, and obviously to computer science educators. Now, that part of the intersection of programming and music might be a little foreign to a lot of music educators, and they might still say, yeah, I don't know about that. The next two subsections are on composing with code and performing with code.

For example, you could use EarSketch to compose with code: you could write out a bunch of beats, create melodies, create entire songs, remixes, and so on, using code. But you could also perform. Programs like Sonic Pi and others allow you to perform live, a practice known as algoraving. At an algorave, somebody might take a laptop, plug it in so the audience can see the screen on a projector, plug the audio into some speakers, and live code music. I'll include some links in the show notes so you can see what this might look like, but basically every sound you hear is written out as lines of code, different arrays, different control structures, and so on, to create an improvised performance through code. All of the beats, all of the melodies, are performed live, and because the screen is projected you can see every line of code being used to create and modify the music.
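As a small illustration of composing a beat with arrays and control structures, here is a toy step sequencer in plain Python. It only prints the hits in time rather than producing audio, and it does not use EarSketch's or Sonic Pi's actual APIs; it is just a sketch of the idea.

```python
# A toy step sequencer: arrays describe which sixteenth-note steps each drum
# plays on, and a loop "performs" the pattern by printing the hits in time.
import time

BPM = 120
STEP_SECONDS = 60 / BPM / 4  # duration of one sixteenth note

pattern = {
    "kick":  [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0],
    "snare": [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
    "hat":   [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
}

for step in range(16):
    hits = [drum for drum, steps in pattern.items() if steps[step]]
    print(f"step {step + 1:2d}: {' + '.join(hits) if hits else '.'}")
    time.sleep(STEP_SECONDS)
```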

However, there are some other interesting coding and performance practices that Collins mentions in one of their papers. Here's a quote from page 161 that summarizes some of these examples from Collins, and then Salazar and Armitage: "(a) using the human body's ability to conduct electricity to make and break electrical connections by touching conductive objects that trigger music-related algorithmic sequences; (b) touching artwork where segments of the artwork use conductive materials to trigger music-related algorithmic sequences; (c) having audience members play a game where the position and type of game pieces on a board direct performers' actions for creating music (for example, moving a pawn to a specific space on a chessboard might indicate one action for the performer, while moving a bishop to the same location might indicate another); or (d) engaging in algorithmic dancing by using a computer's camera to analyze motion and trigger sequences that correspond to a dancer's specific movements."

I want to provide an example for (b), touching artwork. You can use conductive ink to paint a picture, attach a Makey Makey or micro:bit or something similar to the back of it, connect those wires to a computer, and then set it up so that touching different parts of the painting makes different sounds or music. One of the music education students I formerly worked with painted a picture of Peter and the Wolf with the different animals from the story, and when you touched each animal it would play the theme music for that particular character. That's one example of touching artwork to make music, which I think is a really interesting intersection of not only music and CS but also art.
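Because a Makey Makey presents itself to the computer as a keyboard, the software side of a touchable painting can be as simple as mapping key presses to sounds. Here is a hedged Python sketch using pygame; the key bindings and sound file names are assumptions for illustration, not a description of that student's project.

```python
# Sketch of the "touchable painting" idea: a Makey Makey appears to the
# computer as a keyboard, so each conductive region of the painting simply
# sends a key press, which this program maps to a sound file.
import pygame

SOUNDS = {
    pygame.K_UP:    "bird_theme.wav",   # hypothetical file names, one per region
    pygame.K_DOWN:  "duck_theme.wav",
    pygame.K_LEFT:  "cat_theme.wav",
    pygame.K_RIGHT: "wolf_theme.wav",
}

pygame.init()
pygame.display.set_mode((200, 200))  # a window is needed to receive key events
sounds = {key: pygame.mixer.Sound(path) for key, path in SOUNDS.items()}

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN and event.key in sounds:
            sounds[event.key].play()  # touching that part of the painting

pygame.quit()
```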

All right, so in the next subsection I talk about coding practices in music education contexts. For most music educators that's a very foreign idea, even if they might look at the composing and performing practices and go, oh yeah, okay, that is making music. Most music educators don't know how to program. Fortunately, there is a range of text-based and block-based languages that let you jump in and get started with making music, and I've got some resources on my website for that; I'll include links in the show notes. But there are some things we need to consider, whether you're engaging in the software and coding practices or in the hardware practices I mentioned earlier.

The next section is titled "Unpacking potential implications and considerations," and the first subsection is on questioning the value of practices at the intersection of multiple academic disciplines. While I think the intersection between CS and music is really interesting (obviously, since I did a whole dissertation exploring it and I'm doing a podcast episode on it), there are some scholars who suggest that, quoting the article, "the design and execution of block-based languages and platforms themselves may limit and frustrate meaningful music-making for children. In particular, Payne and Ruthmann discuss limitations from the types of blocks Scratch offers for creating music and sounds, as well as the inconsistent timing of music sequences running in parallel (i.e., at the same time). Given that platforms such as Scratch have music making and learning limitations, and platforms such as EarSketch have limited music production abilities compared to many modern DAWs, why would a popular musician choose to create music with code instead of through other mediums?"

This is a question that I really think both music educators and CS educators need to sit with. Yes, you could engage in practices that blur these boundaries; for example, instead of composing with pencil and paper you could compose by coding. Okay, you could, but does that enhance the music making experience in any way, or is it just a substitution? Consider the SAMR framework, a framework for thinking about how technology is used: the S is substitution, the A is augmentation, the M is modification, and the R is redefinition.

Substitution, when it comes to SAMR and music and coding, might be: instead of composing in Sibelius, which is music notation software, you compose in Sonic Pi, writing out lines of code that produce the same composition. Augmentation might be: instead of just writing out the notation, in Sonic Pi you can actually shape the sound, making it shorter or longer, making it gradually get louder or softer, and so on, augmenting the sound in ways you might not have been able to with the other composition software. Modification might be: not only can I compose music, I can perform live while I'm doing it by writing out lines of code, which lets you do two things simultaneously. But a redefinition might be: I could compose music with different control structures that allow me to create semi-improvised or aleatoric music based on algorithmic sequences. For example, I wrote a program for a randomized drum set. This drum set would play infinitely different beats and fills, and I had no idea what it would play; I just gave it a different seed, a different number, and it would play a completely different random set of beats and fills, forever. That is not something I can create with music notation software on its own; I had to have computer science and coding practices to do it. That is a redefinition of engaging with music, in ways I could not have achieved by simply substituting one piece of technology for another.
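Here is a minimal sketch of that seeded-randomness idea in Python. It is an illustration of the concept, not the actual randomized drum set program described in the episode: the same seed always produces the same endless stream of beats, while a different seed produces a completely different one.

```python
# A seeded generator that yields an endless stream of random drum bars.
# Drum names and probabilities are made up for illustration.
import random

def drum_stream(seed: int):
    """Yield one bar of sixteenth-note hits at a time, forever."""
    rng = random.Random(seed)
    drums = ["kick", "snare", "hat", "tom"]
    while True:
        bar = []
        for step in range(16):
            hits = [d for d in drums if rng.random() < 0.3]  # ~30% chance per drum
            bar.append(hits)
        yield bar

stream = drum_stream(seed=42)
for bar_number, bar in zip(range(2), stream):  # print the first two bars
    print(f"bar {bar_number + 1}: {bar}")
```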

So what I recommend is: yes, if you are going to explore the intersections of CS and music, I totally recommend it, it's a lot of fun, but do so in ways that allow you to make music that is not possible without that intersection. Here's a quote from page 164: "rather than using computer science practices to replicate already existing music making and music learning opportunities, educators interested in exploring the intersections of music technology and computer science should consider how the intersections of popular musicianship and computer science practices might afford new forms of music engagement that would otherwise be impossible without the incorporation of computer science practices." And a little further down on the page: "... practices enable a medium for making music that would otherwise be unavailable without the intersection of both music and computer science practices, and are therefore essential practices for anyone interested in creating similar music."

Okay. I provide some more examples of projects I've created or seen other people create that explore this intersection, but the next subsection is titled "when are music practices," and here's a quote from page 166: "consider the code someone might type during an algorave: is the final code a music composition, or residue of the performance itself? Although a person might be able to run the final iteration of code to listen to music (i.e., a composition), it will likely lack the development and alterations over time that resulted in an expression heard by an audience (i.e., a performance). In such a scenario we might consider code as both performance and composition depending on the context, but what about the code that creates music software rather than the music and sound made with such software?"

On the next page I describe how, when I created one of the programs I shared there, one that allowed me to record and loop back a few tracks of myself performing something, quantized to a metronome I had programmed, the vast majority of my time was spent sitting in front of a computer engaging in practices that people might not consider music-related, because I was just programming. Maybe five percent of the time I'd actually try out the software, perform something, see how it worked, try to find bugs, and then go back to programming. However, when engaging in this process, and with things like the infinite drum set I made, I would argue that all of it was for musical purposes and engaged musical thinking, problem solving, and so on; it just happened to be through the medium of programming.
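The looper itself isn't published in the episode, but the arithmetic behind quantizing a performance to a metronome grid is simple: snap each recorded onset to the nearest beat. Here is a small Python sketch with made-up tempo and onset values; it is not the program described above.

```python
# Quantization sketch: round each recorded onset time to the nearest beat.
BPM = 100
BEAT_SECONDS = 60 / BPM  # 0.6 seconds per beat

def quantize(onset_seconds: float) -> float:
    """Snap an onset to the nearest beat of the metronome."""
    return round(onset_seconds / BEAT_SECONDS) * BEAT_SECONDS

recorded_onsets = [0.02, 0.58, 1.25, 1.93]  # slightly off the grid
print([round(quantize(t), 2) for t in recorded_onsets])  # [0.0, 0.6, 1.2, 1.8]
```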

Because of this, I think it is more than relevant to music educators in particular, who might be skeptical: "well, you're not really making much music." Okay, you're correct, but you are thinking about music; you're applying your concepts and musical understandings throughout the entire process. So from my perspective it is very relevant to music educators. If you have any friends or music educators who are skeptical about this intersection, feel free to have them listen to this episode, read the article, or even reach out to me. All three of my degrees are in music education, and I've worked with every level from kindergarten through doctoral students in music education, in ensembles, general music, music education, and other contexts. Hopefully that background helps music educators go, oh, Jared's one of us, and he says this is related to music, so maybe I should listen; because quite frankly, some music educators are pretty skeptical about this, especially if it comes from somebody who has only done computer science and doesn't have a background in music education.

All right, so the next subsection is on connecting practices. This section basically summarizes something I found in my dissertation: the people posting on chipmusic.org were engaging in a variety of practices and would oscillate back and forth between them. One day they might be composing, the next day engaging in hardware modification, and the next day programming software, and they might even alternate between all of these within the course of an hour. That contrasts pretty heavily with what typically happens in schools, where you have a music class and a computer science class and the two don't really interact. If you were to engage in these practices in a holistic way, you'd be able to jump back and forth between them, so maybe I'm performing one minute and programming the next. But the amount of time spent on one versus the other really makes me question where this should occur. Should it occur in a music education class? In a computer science class? What about the intersection of the two? Could you form a new interdiscipline, a discipline that is a merger of CS and music making, so that you're engaging in both? If you were to do that by connecting these practices, whose standards would you use? In my dissertation I point out that the music education standards don't really get at most of the hardware and software practices going on, but the computer science standards don't really get at most of the performing and composing practices going on. So do you combine the two, or do you make something new?

I unpack this a little more in the final section of the paper, on further considerations for music educators, but one thing I'd like to leave you with, which I talk about at the very end, is this quote: "educators interested in music-related engagement that incorporates practices from a multitude of academic disciplines may need to collaborate with experts outside music to integrate or co-teach new curricular offerings that encourage development and application of understandings across multiple academic disciplines. When creating new classes or collaborating with experts from multiple academic disciplines, educators should take care to ensure no discipline is in a subservient relationship to another but maintains co-equal status." An example of a subservient relationship might be educators saying, okay, we're going to do this intersection of CS and music, and we're doing it to increase enrollment numbers in our CS courses because we think it's engaging and will interest students; the real purpose is all about increasing numbers in CS, not necessarily making music. So it's important to consider what Bresler calls co-equal integration, where rather than making CS subservient to music, or music subservient to CS, we have an interdiscipline where both work together in a symbiotic relationship: you can't have one without the other, and they're weighted equally. If you're interested in learning more about that kind of integration, maybe I'll do an unpacking scholarship episode on Bresler's paper that discusses this and relate it to CS education and integration. I think it's an important thing to consider: there are many flavors of integration we can do, especially at the intersections of music and CS, and some of them are problematic while others are less problematic.

I like to end these unpacking scholarship episodes with lingering questions or thoughts that I have, but because I wrote this one, I don't really have much; it was already in my head. I guess the question I might ask the field is: how might scholars and educators from both domains work together to develop a new interdiscipline, or to engage in interdisciplinary, multidisciplinary, or transdisciplinary practices that are co-equal rather than subservient? And by the way, the articles I cite that I would describe as subservient relationships were not, I think, intentionally trying to position one domain in a subservient relationship with the other for malicious purposes, but it can certainly come across that way unintentionally if the purpose is to serve one domain over another. And I get it: grant funding likely had a say in why that relationship was used. But there are other ways of collaborating across domains that I think we could explore more.

Speaking of other domains, the last question I have is: what other intersections work well with CS? One of the really interesting things about video game development in general is that it involves so many other disciplines, like art, sound, music, animation, physics, kinesiology, all sorts of really interesting domains that can intersect with CS. So should we create interdisciplines for those as well, or do we consider CS as inherently interdisciplinary? And are the examples I provided in relation to video game design and development the same kind of interdisciplinary connections that people try to use when they are crowbarring CS into other subject areas? If we are going to consider many other forms of engagement in interdisciplinary contexts, how can we make sure we're not putting CS in a subservient relationship with another discipline? For example, when we're integrating CS into math or ELA or science, is it just to nominally cover the CS standards while the real purpose is to learn ELA, math, and so on? If so, that's problematic. And going back to the SAMR framework, are we just doing this in ways that are substitutions and augmentations, or are we modifying or redefining these disciplines through CS?

I don't really have answers; it's very situationally dependent. Even if you look at a curriculum, there might be one way you could teach it and a different way somebody else would, so I'm just raising these thoughts as things to consider. Feel free to disagree with them; you can even come on the podcast and tell me why you disagree, which you can do by pressing the contact button on my website at jaredoleary.com, which again has all the show notes for this podcast as well as a ton of CS-related resources and some more drumming and gaming content, so check it out if you haven't. That concludes this week's episode of the CSK8 podcast. If you'd be so kind, please consider sharing a review or sharing this episode with somebody else, and stay tuned next week for another episode. Until then, I hope you're all staying safe and are having a wonderful week.

Article

O’Leary, J. (2020). Intersections of popular musicianship and computer science practices. Journal of Popular Music Education, 4(2), 153-174.


Abstract

“Since the introduction of music education within public schools, curricular offerings have not developed in parallel with the myriad of practices popular musicians engage with outside school contexts. In other academic disciplines such as computer science, curricula and practices are iterative in nature and are responsive to the ever-changing practices that people engage with outside educational contexts. Although popular musicians are using computer science practices for music-related purposes, such practices are seldom discussed within music education scholarship. This article begins with an exploration of such intersections by describing hardware practices popular musicians use to modify, design or build electronic devices for music-related purposes. The following section introduces coding practices that people use to create and modify music software, as well as to make music with code. The article concludes by unpacking potential implications and considerations for educators interested in the intersections of popular musicianship and computer science practices.”


Author Keywords

Computer science, interdisciplinary, coding, popular musicianship, computer science education, music education


My One Sentence Summary

This article discusses potential implications of hardware and software practices that blur the boundaries between music making and computer science.


Some Of My Lingering Questions/Thoughts

  • How might scholars and educators from both domains work together to develop a new interdiscipline?

  • What other intersections work well with CS?


Resources/Links Relevant to This Episode


