A little preface: this post may be a bit rambling, but I promise it includes a lesson/experience idea that you can use in your music class. If you want to get straight to that, just jump to the end (I won’t be hurt). However, if you want to hear about my journey toward this curricular idea, my general excitement about Scratch, and some basic blog-ish discussion, read on.
So, I’m a generally techie type of guy. However, I’m not the build-my-own-motherboard or program-my-own-CPU sort of person. Often, if the barriers on a program or piece of tech are high enough, I just ditch it and find something that actually does what I want it to do. Sometimes, I end up getting fairly good at a piece of technology, to the point that I can bend it to do exactly what I want (Garageband and Audacity are good examples). However, I’m just not patient enough to go through the rigmarole of learning yet another program or device in order to do something I can already do on a platform I know how to use, and use well. So, I guess what I’m saying in a very roundabout way is: I love technology, but I am generally not very patient (ask me to practice for a concert and you may see this lack of patience in full swing). When I jump into something, I want to figure out the basics quickly and then be able to MAKE something with it (we can unpack the problems with this mentality later, as there are many).
With all that stated–now that you have had a little look under the hood of Jesse–I feel the need to talk to you about something that has quickly become a fairly inspirational tool for me. It has pretty low barriers and is completely centered on MAKING new things, though it can be frustrating to learn at times. Scratch, from the MIT Media Lab (among many other collaborators), makes me wish I were back in my classroom right now to try it out with the students. Scratch is a web-based programming tool, a Lego-like way for kids (and, clearly, adults) to think and act as computer programmers. The true power of this cross-platform tool is not in learning to code but in the fact that it is centered on the idea of MAKING things. Kids create new projects (animations, games, music, etc.), collaborate with others via remixes and rebuilds of existing creations, and generally find ways to make a computer do what they want it to do via code. Okay, it sounds a bit scary, but trust me, it really isn’t. Here is a kindergartener to teach you some of the basics:
Before you start down the path of “digital natives” (the problematic idea that kids today are born to understand new tech), realize that the kid in this video is doing the same things you do with your word processor (pointing, clicking, dragging, and typing). You can do this and, most importantly, so can the students in your classroom.
Now, you may be wondering what any of this has to do with music or music education. That’s fair, and it’s something that I wondered about for a while, myself. A doctoral friend of mine named Isaac Bickmore has become somewhat obsessed with Scratch (sorry to share that with everyone, Isaac). He has created a pretty neat video game as he has learned to use Scratch (http://scratch.mit.edu/projects/19959856/). He was inspired by his work with Julian Peterson and Ryan Bledsoe (amazing friends and brilliant educators) on an after-school project where students hacked and made things. Even after seeing and playing Isaac’s game and hearing Ryan and Julian talk about Scratch, I still failed to see why I should care about it from a music education perspective. I am usually all about finding ways to bring new ideas into music education, but I guess with Scratch, I was being a bit of a stick in the mud. That was until I started to play around with it. In our doctoral seminar class, we were charged to “make something” via a broad spectrum of available technologies. With the help of Emmett O’Leary and Dr. Marg Schmidt, I started to discover the potential of Scratch as a musical tool and a doorway to new explorations with sound. And it all started with a game of “pop the balloon” (http://scratch.mit.edu/projects/10126867/).
So, what did I do? Well, take a look at my basic experiment here (go ahead, I’ll be waiting for you when you get back). . .
I wanted to make my camera into a sound controller. My intent was just that simple. I have played with Soundbeams, AUMI, and AirVox (amazing tools that I discuss in the blog post “Join the Band“), and I wanted to see if I could make something like those apps. Now, it took me a while to figure out how to get my camera to trigger a sound, but I found help in the help sections, proof-of-concept videos, and actually looking at/editing the code of other Scratch projects (called “remixing”). Once I got one “sprite” (or image) to trigger a sound (a “meow”) using the camera, all I had to do was make more sprites and drag the code to each new sprite, which essentially works as copy and paste for code in Scratch (though I’m realizing now that I could have just programmed one sprite and then duplicated it numerous times). I uploaded some sounds (created in Garageband) and then added a few more code blocks to change the volume of each sound. The end result is something you can walk/move in front of to generate an ever-changing soundscape.
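If you want a feel for the logic underneath a controller like this, here is a rough Python sketch of the same idea (not Scratch itself, and the region names, threshold, and frame sizes are all made up for illustration): each “sprite” watches one region of the video frame, and enough frame-to-frame change in that region triggers its sound.

```python
# A minimal sketch of the camera-as-controller idea: split each video frame
# into regions ("sprites") and report which regions show enough
# frame-to-frame change ("video motion") to trigger a sound.
# Frames here are plain numpy arrays; a real version would read from a webcam.
import numpy as np

def triggered_sounds(prev_frame, frame, regions, threshold=20.0):
    """Return the sound names whose regions show motion above threshold.

    regions maps a sound name to (row_start, row_stop, col_start, col_stop).
    "Motion" is the mean absolute pixel difference between the two frames.
    """
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    hits = []
    for sound, (r0, r1, c0, c1) in regions.items():
        if diff[r0:r1, c0:c1].mean() > threshold:
            hits.append(sound)
    return hits

# Two "sprites": the left half of the frame plays a drum, the right a chime.
regions = {"drum": (0, 100, 0, 50), "chime": (0, 100, 50, 100)}
still = np.zeros((100, 100), dtype=np.uint8)
moved = still.copy()
moved[:, :50] = 200          # big change on the left half only

print(triggered_sounds(still, moved, regions))  # ['drum']
```

Scratch hides all of this behind a single “video motion” sensing block, which is exactly why it has such low barriers.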
Lesson/Experience Idea
So, what can you do with Scratch in your classroom, other than use what other people have created? There are limitless ways to incorporate it; however, I’m going to highlight one idea I like to call Scratch + Sound + Movement. The question at the heart of this idea is, “How can I use sound and movement to create an expressive artwork?” In this experience, students experiment with generating sounds, create an interactive sound controller via Scratch, and then use expressive movements to trigger their sounds.
Materials: A computer that can run Flash, Wi-Fi or some other internet connection, sound sources, a web camera, and space to move.
Step 1: Either as individuals or in “design teams,” students create sounds. These could be MIDI sounds, found sounds, instrument recordings, loops, or clips of other songs. It doesn’t really matter, as long as they are recorded as MP3 or WAV files. Each individual sound needs to be exported as its own file (so name them well and know where they export to). I’ve found it useful to create some short sounds, some long sounds, and some files that have a lot of space in them. For an example of the types of sounds I would suggest, take a look at In Bb and its instructions.
* It would be very fun to record found sounds into a sampler (like the one found in Garageband iOS) so the students could alter the waveforms and create new sounds.
Step 2: Scratch time. You’ll want to join Scratch (it’s free and worth it). If you don’t, you can still make things, but saving becomes problematic. Log in to, or have students log in to, the Scratch account on the computer they are using. From here, students can take two routes to creating a web-camera controller with all their sound files: 1) they can “remix” an existing project (here is a simplified version of mine to use as a template), or 2) you can help them create their own.
- Route 1) Open up a template. Students can edit the sprites and their placements by clicking on them. To change the sounds that are triggered, watch the tutorial video below. Once all sprites are changed, you have a functional web-camera controller. Rename it and save.
- Route 2) Help students build their own unique controller. This takes some time but is very meaningful and allows students to use 21st-century skills, like coding and systems thinking, to create an artistic work. Watch this tutorial for the basics of how to set up a web-camera-triggered sprite:
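For reference, the script behind a camera-triggered sprite looks roughly like this in Scratch’s video-sensing blocks (the threshold, wait time, and sound name are just examples, and exact block names vary a bit between Scratch versions):

```
when green flag clicked
turn video on
set video transparency to 50%
forever
    if <video motion on this sprite > 30> then
        start sound [my-sound]
        wait 1 secs
```

The “wait” at the end simply keeps one wave of the hand from re-triggering the sound dozens of times per second.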
Step 3: Now that the sounds are created and the controller is made, let’s create some movements to trigger the sounds. Students could choreograph a dance to perform in front of their controller and see what happens. Or they could interact with the sounds as they are triggered, changing their movements to fit the sounds while also knowing how to make a specific move to trigger a specific sound. The design team/student may realize that they need to reorganize the sprites to allow specific sounds to be triggered by certain moves. Excellent! That’s real-world revision and editing in action.
Step 4: Share. The students should share their works, share their controllers, and explain their design decisions. What are the next steps? How can design teams use this as a pilot effort? The sharing and discussing is an integral portion. Imagine if you had an interactive sound installation night where parents, teachers, and community members came to play with the sound controllers. The students could explain their process and then help the other interactors make their own.
Assessments: To assess the learning in an experience like this, let’s think about what the students may have learned (or what I hoped they did):
- Create interesting sounds
- Understand and apply ideas of building sound controllers
- Develop multi-faceted artworks that are inherently interactive
- Pilot an art experience and reflect/revise
Also, in this experience you might be able to see the kernels of developing understanding in the musical dimensions of timbre, texture, dynamics, articulation, rhythm (at least duration), and form. The students thought up sounds and organized those sounds via the creation of the controller and their movements. They would act as composers and improvisers… as well as performers of their own work. They might begin to understand artworks from a more systemic view and see how the final “piece” is often an intensely collaborative product (think of Skrillex’s and Daft Punk’s most recent albums as examples of the collaborative process). The assessments here are both performance-based and discussion-based. They are assessments for and as learning, as they help the teacher know how to proceed (for) and help the students reflect upon their experiences and generate their own path forward (as).*
* For a longer discussion of assessment of, for, and as learning, see Sheila J. Scott’s 2012 MEJ article, “Rethinking the Role of Assessment in Music Education.”
Questions to Ponder:
- What could this experience mean to the students?
- What could this experience mean to you and your classroom?
- How could designing interactive sound installations help students develop as musicians and musical beings in an ever-changing musical atmosphere?
- How can students become interactive artwork design teams in your classroom?
- What becomes your role in a “design-based” music classroom?