Practical Projects for Your Elective Course

I will be presenting a session at the NJMEA Conference this week titled Practical Projects for Your Elective Course. In preparation for this presentation, I have reissued an older episode of my podcast with similar content to make it quickly accessible in the feed for attendees of the session.

This post contains the episode and resources related to the presentation.

The podcast episode opens with my first impressions of the Apple Vision Pro, which I got to try out last week.

Enjoy!

Listen to the podcast episode and subscribe below!

Subscribe to the Podcast in… Apple Podcasts | Overcast | Castro | Spotify | RSS

This is the first year that I have had consistent access to a cloud-based DAW and notation editor with all of my students. The result of this experience has been dozens of new Soundtrap project ideas. I plan to do an episode and post about how I am using these tools for composition projects in the band classroom later this year.

In the episode above, and in this post, I give an overview of three of my most successful Soundtrap projects this year. These projects can be done in any digital audio workstation, including Bandlab, GarageBand, Logic, Ableton Live, etc.

Listen to the episode for a more thorough explanation of each idea.

C Melody and Loop Audition

Melody writing in Noteflight and producing in Soundtrap go hand in hand. Because of Noteflight’s many export options, you have many ways to get your melody out of Noteflight and into your Soundtrap project.

After giving students some time to experience the user interface of Soundtrap, I have them write a short, eight-measure melody in the key of C using Noteflight.

Next, I have them export this melody as a MIDI file and import it into a track in Soundtrap. Then they add three more tracks and create an accompaniment for their melody using at least one of the three loop types below:

  1. Bass

  2. Percussion

  3. Harmonic (guitar, synth, piano, etc.)

Here is a recent submission of this project from one of my students. This assignment was submitted later in the semester, after we had expanded the project into composing two alternating melodies that became the basis of a song form in Soundtrap. By alternating between the two melodies (and varying the loop accompaniment), the student made a song that follows the form Verse-Chorus-Verse-Chorus-Bridge-Chorus. This particular student chose to improvise on the blues scale over the bridge.

Row Your Boat Multi-Track

This idea was taken from the amazing Middle School Music Technology class content in MusicFirst. I recommend checking out their subscription options if you like this idea. MusicFirst combines the activity with curated YouTube and Spotify playlists that give students an idea about early recording studio practices for multi-track recording before music was produced digitally.

I am giving an overview of this project with permission from MusicFirst. Full Disclosure: MusicFirst is a past sponsor of the Music Ed Tech Talk blog and podcast.

To start, the student performs the song Row Row Row Your Boat into a software instrument track in Soundtrap. After it is recorded, they quantize it to the nearest eighth or sixteenth note so that it is rhythmically accurate.
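If you are curious what quantizing actually does, it simply rounds each recorded note start to the nearest grid line. Here is a minimal sketch of that idea in plain Python (my own illustration with hypothetical note timings, not Soundtrap's actual code):

```python
# Quantize: snap recorded note-start times to a rhythmic grid.
# Times are measured in beats (quarter notes), so a sixteenth note = 0.25 beats.

def quantize(start_times, grid=0.25):
    """Round each note start to the nearest grid line (default: sixteenth notes)."""
    return [round(t / grid) * grid for t in start_times]

# A hypothetical, slightly sloppy student performance of a few notes:
recorded = [0.02, 0.98, 2.07, 2.51, 3.03]
print(quantize(recorded))       # sixteenth-note grid -> [0.0, 1.0, 2.0, 2.5, 3.0]
print(quantize(recorded, 0.5))  # eighth-note grid    -> [0.0, 1.0, 2.0, 2.5, 3.0]
```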

Next, they duplicate this track twice. Using the piano roll to edit the MIDI content of each region, students select all of the notes and drag them higher to create harmony. I have them move the second track up a third and the third track up a fifth so that they get a nice three-part voicing.

After multi-tracking the melody of Row Row Row Your Boat, this student moved the starting point of each region to create a round.

Next, I have them move any notes that land on black keys as a result down to the nearest white key so that every note of each voice stays in the key of C.

Then I have them duplicate these three tracks and transpose the copies up an octave. Finally, they take the lowest voice and transpose it down three octaves to add some bass. You can have students make the final result as dense as you like.
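Under the hood, all of those piano-roll moves are simple arithmetic on MIDI note numbers. Here is a minimal sketch of the transpose-then-snap-to-white-keys idea in plain Python (my own illustration, using a hypothetical opening fragment of the tune; students do the equivalent by dragging):

```python
# Build diatonic harmony in C major from a melody of MIDI note numbers (60 = middle C).
C_MAJOR = {0, 2, 4, 5, 7, 9, 11}  # pitch classes of the white keys

def snap_to_c_major(note):
    """Move any note that landed on a black key down to the nearest white key."""
    while note % 12 not in C_MAJOR:
        note -= 1
    return note

def shift(melody, semitones):
    """Drag every note up by some interval, then snap the result back into C major."""
    return [snap_to_c_major(n + semitones) for n in melody]

melody = [60, 60, 60, 62, 64]          # "Row, row, row your boat..." (C C C D E)
third  = shift(melody, 4)              # second track, a third higher:  E E E F G
fifth  = shift(melody, 7)              # third track, a fifth higher:   G G G A B
octave_up = [n + 12 for n in melody]   # duplicated tracks, up an octave
bass      = [n - 36 for n in melody]   # lowest voice, down three octaves
```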

Optionally, students can experiment with using different instrument sounds and adding groove-based loop accompaniment.

Here are three recent examples of my students’ submissions:

All-Star Remix

In this assignment, I take a popular song that students choose and separate the vocals from the instrumental accompaniment. Then I add the isolated vocal to an audio track in a Soundtrap template and match the tempo and key of the Soundtrap project to the vocal track.

This way, students can drag and drop loops and have them match the pitch and tempo content of the vocals.

I have a post with more details about this project here.

You can watch a video of how to do it below.

Here are two examples of my own students’ recent work:

For a Soundtrap project idea for teaching band/choir/orchestra students to compose, check out the podcast episode and blog post below:

Episode Info

See below for all notes associated with the podcast episode…

Description

Robby overviews his three most successful Soundtrap projects in 2021.

Chapters:

  • 00:00:00 - Intro

  • 00:00:14 - Sponsor: DMV Percussion Academy

  • 00:01:24 - About this Episode

  • 00:03:51 - YAY SUMMER

  • 00:05:11 - 1: Melody Composition / Loop Accompaniment

  • 00:14:35 - 1: Student Examples

  • 00:16:15 - 2: Row Your Boat Multi-Track

  • 00:22:19 - 2: Student Examples

  • 00:24:27 - 3: All-Star Remix

  • 00:34:38 - 3: Student Examples

  • 00:36:00 - Grading for Mastery Not Creativity

  • 00:37:47 - Tech Tip of the Week

  • 00:39:20 - App of the Week

  • 00:40:00 - Album of the Week

  • 00:42:44 - Conclusion

Show Notes:

App of the Week: 
Reeder 5

Album of the Week:
Alison Balsom - Paris

Thanks to this week’s sponsor, the DMV Percussion Academy. Learn more and register here.

Please don’t forget to rate the show and share it with others!

Subscribe to Music Ed Tech Talk:

Subscribe to the Blog

Subscribe to the Podcast in… Apple Podcasts | Overcast | Castro | Spotify | RSS

Sibelius for iPad, with Joe Plazak (Principal Software Engineer and Designer)

This week on Music Ed Tech Talk I am joined by Joe Plazak, Principal Software Engineer and Designer of Sibelius, to talk all about their summer iPad release.

Listen below or in the podcast app of your choice! I look forward to writing more about Sibelius for iPad down the road.

Episode Description: Joe Plazak (Principal Software Engineer and Designer) joins the show to talk about Sibelius for Mobile and their new iPad app.

This episode is sponsored by Blink Session Music: Because Virtual Lessons Are More Than a Video Chat.

Backstage Access Patreon subscribers can listen to an extended discussion, including Joe Plazak's Book of the Week and some of my reflections on writing Digital Organization Tips for Music Teachers.

Subscribe to the Blog... RSS | Email Newsletter

Subscribe to the Podcast in... Apple Podcasts | Overcast | Castro | Spotify | RSS

Support Music Ed Tech Talk

Become a Patron!

Buy me a coffee

Thanks to my sponsor this month, Blink Session Music.

Show Notes:

App of the Week:
Robby - CleanShot
Joe - Tips

Album of the Week:
Robby - Jack & Owane - Part One: Shredemption
Joe - Pomplamoose - Impossible à prononcer

Tech Tip of the Week:
Robby - Make your own custom keyboard shortcuts
Joe - Hold the spacebar on iPhone to get a cursor

Please don't forget to rate the show and share it with others!


3 Soundtrap Projects Your Students Will Love

Listen to the podcast episode and subscribe below!

Subscribe to the Podcast in… Apple Podcasts | Overcast | Castro | Spotify | RSS

This is the first year that I have had consistent access to a cloud-based DAW and notation editor with all of my students. The result of this experience has been dozens of new Soundtrap project ideas. I plan to do an episode and post about how I am using these tools for composition projects in the band classroom later this year.

In the episode above, and in this post, I give an overview of three of my most successful Soundtrap projects this year. These projects can be done in any digital audio workstation, including Bandlab, GarageBand, Logic, Ableton Live, etc.

Listen to the episode for a more thorough explanation of each idea.

C Melody and Loop Audition

Melody writing in Noteflight and producing in Soundtrap go hand in hand. Because of Noteflight’s many export options, you have many ways to get your melody out of Noteflight and into your Soundtrap project.

After giving students some time to experience the user interface of Soundtrap, I have them write a short, eight-measure melody in the key of C using Noteflight.

Next, I have them export this melody as a MIDI file and import it into a track in Soundtrap. Then they add three more tracks and create an accompaniment for their melody using at least one of the three loop types below:

  1. Bass

  2. Percussion

  3. Harmonic (guitar, synth, piano, etc.)

Here is a recent submission of this project from one of my students. This assignment was submitted later in the semester, after we had expanded the project into composing two alternating melodies that became the basis of a song form in Soundtrap. By alternating between the two melodies (and varying the loop accompaniment), the student made a song that follows the form Verse-Chorus-Verse-Chorus-Bridge-Chorus. This particular student chose to improvise on the blues scale over the bridge.

Row Your Boat Multi-Track

This idea was taken from the amazing Middle School Music Technology class content in MusicFirst. I recommend checking out their subscription options if you like this idea. MusicFirst combines the activity with curated YouTube and Spotify playlists that give students an idea about early recording studio practices for multi-track recording before music was produced digitally.

I am giving an overview of this project with permission from MusicFirst. Full Disclosure: MusicFirst is a past sponsor of the Music Ed Tech Talk blog and podcast.

To start, the student performs the song Row Row Row Your Boat into a software instrument track in Soundtrap. After it is recorded, they quantize it to the nearest eighth or sixteenth note so that it is rhythmically accurate.

Next, they duplicate this track twice. Using the piano roll to edit the MIDI content of each region, students select all of the notes and drag them higher to create harmony. I have them move the second track up a third and the third track up a fifth so that they get a nice three-part voicing.

After multi-tracking the melody of Row Row Row Your Boat, this student moved the starting point of each region to create a round.

Next, I have them move any notes that land on black keys as a result down to the nearest white key so that every note of each voice stays in the key of C.

Then I have them duplicate these three tracks and transpose the copies up an octave. Finally, they take the lowest voice and transpose it down three octaves to add some bass. You can have students make the final result as dense as you like.

Optionally, students can experiment with using different instrument sounds and adding groove-based loop accompaniment.

Here are three recent examples of my students’ submissions:

All-Star Remix

In this assignment, I take a popular song that students choose and separate the vocals from the instrumental accompaniment. Then I add the isolated vocal to an audio track in a Soundtrap template and match the tempo and key of the Soundtrap project to the vocal track.

This way, students can drag and drop loops and have them match the pitch and tempo content of the vocals.

I have a post with more details about this project here.

You can watch a video of how to do it below.

Here are two examples of my own students’ recent work:

For a Soundtrap project idea for teaching band/choir/orchestra students to compose, check out the podcast episode and blog post below:

Episode Info

See below for all notes associated with the podcast episode…

Description

Robby overviews his three most successful Soundtrap projects in 2021.

Chapters:

  • 00:00:00 - Intro

  • 00:00:14 - Sponsor: DMV Percussion Academy

  • 00:01:24 - About this Episode

  • 00:03:51 - YAY SUMMER

  • 00:05:11 - 1: Melody Composition / Loop Accompaniment

  • 00:14:35 - 1: Student Examples

  • 00:16:15 - 2: Row Your Boat Multi-Track

  • 00:22:19 - 2: Student Examples

  • 00:24:27 - 3: All-Star Remix

  • 00:34:38 - 3: Student Examples

  • 00:36:00 - Grading for Mastery Not Creativity

  • 00:37:47 - Tech Tip of the Week

  • 00:39:20 - App of the Week

  • 00:40:00 - Album of the Week

  • 00:42:44 - Conclusion

Show Notes:

App of the Week: 
Reeder 5

Album of the Week:
Alison Balsom - Paris

Thanks to this week’s sponsor, the DMV Percussion Academy. Learn more and register here.

Please don’t forget to rate the show and share it with others!

Subscribe to Music Ed Tech Talk:

Subscribe to the Blog

Subscribe to the Podcast in… Apple Podcasts | Overcast | Castro | Spotify | RSS

Introducing My "Scale Exercise Play-Along Tracks With Trap Beats" - Available Now on My New Online Store!

I am announcing a new section of this website. A STORE! Starting today, I will be selling digital products and services I have created for musicians and music teachers. Check it out here!

First up is a collection of Scale Exercise Play-Along Tracks with Trap Beats underneath them.

You bet there's a promotional video.

Here is the product description from the sale page:

This collection contains over 70 major scale play-along tracks for your ensemble.

Each track includes a tuning drone playing the tonic with a scale overtop in just intonation so that you can reinforce flawless intonation, tone production, and blend amongst your students. Every exercise includes a count-off and a trap beat underneath to engage your students while reinforcing slower playing and subdividing!

The audio-only version of this package includes MP3 files of the following recordings in all twelve major keys at 70 BPM.

  • Whole note scale

  • Half note scale

  • Quarter note scale

  • Eighth note scale

  • Scale Exercise in Thirds

  • Mini-Scale with Arpeggio

Also included:

  • Remington at three different speeds! Perfect for playing underneath many of the exercises that come from popular band methods.

The premium version of this product includes the audio tracks above in addition to the Logic Pro and GarageBand stems so you can edit every element of the tracks, including speed, pitch, and instrumentation.

These are perfect for running through your Zoom/Google Meet/Virtual Classroom to keep kids playing as much as possible.

I have been using tracks like these with my band students for years now and they LOVE them. The trap beat resonates with them. Its popularity in hip-hop aside, there is something musically compelling about it. The backbeat on three, combined with the busy hi-hat activity, helps kids subdivide slower tempos and keeps them motivated to practice stuff like long tones and scales. The strong 808 bassline asserts the beat while adding fun syncopation.

It was essential to me that the drones were in just intonation because I teach my students to hear and adjust to the beats that result when unison pitches and diatonic intervals are in/out of tune. The Yamaha Harmony Director was definitely the tool for the job. Here's a really brief blog post I shared earlier this month about the process if you want to take a stab at making something like this.
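If you want to see the arithmetic behind those beats, here is a small sketch (my own illustration using Bb3 as an example pitch; nothing here comes from the Harmony Director itself):

```python
# Beats are the slow wobble you hear when two nearly matching frequencies clash.
root = 233.08  # Bb3 drone, in Hz

# Out-of-tune unison: a player sitting ~3 Hz away from the drone beats ~3 times per second.
print(abs(236.0 - root))  # ~2.9 beats per second

# Major third above the drone: just intonation uses the pure 5:4 ratio,
# equal temperament uses 2^(4/12). The clash shows up between the drone's
# 5th harmonic and the third's 4th harmonic.
just_third = root * 5 / 4
et_third = root * 2 ** (4 / 12)
print(abs(4 * just_third - 5 * root))  # ~0  -> beatless, locked in
print(abs(4 * et_third - 5 * root))    # ~9  -> rapid, audible roughness
```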

You can alternatively do this process using the (excellent) Tonal Energy Tuner App, a MIDI keyboard, and GarageBand on iOS. I wrote about that here. I prefer the Tonal Energy experience, but the Yamaha's hardware keys made it easier to "perform" the drones and allowed me to create in Logic Pro, which I am more proficient in.

The original concept was very ambitious, and I simplified the vision a ton to help myself "ship it." I have seen music teachers asking for something like this on social media a lot lately, and it seemed like time to do the work. I am happy with how they turned out, and I hope to create more of these down the road in a variety of styles, tempos, and exercise patterns.

A few notes:

  1. Due to file upload limitations on Squarespace, buying the stems directs you to download a text file instead of the audio files. The text file contains a link to a third-party hosting source. A little inelegant, I know, but setting up a Squarespace store was otherwise the most comfortable choice.
  2. These are incredibly effective for engaging synchronous ensemble rehearsals. No, we still can't play at once, but running rehearsal tracks through your Google Meet or Zoom call while students are muted is a great way to keep them playing. These tracks are slow enough that I have had success having groups of 3-6 unmute while playing along, and it is not total chaos. Between these, my Solfege Bingo tracks, and The Breathing Gym DVD, we can be synchronously active for more than 80 percent of each class. I get the audio to route directly through to the call using Loopback.
  3. Many of these tracks, particularly the scale exercise in thirds, mini-scale, and Remington tracks, pair perfectly with a multitude of examples in the Foundations for Superior Performance band method book series. I did not title them as such because the book and my project are in no way connected. I bring it up here because I know those exercises are ubiquitous in band rehearsals, and it is for this reason that many directors have their students purchase those books.
  4. I made the arrangements of these tracks simple to keep the appeal as wide-reaching and flexible as possible. My hope is that people who really want to change the style, edit the beat, change the speed, or make any other kind of alteration will buy the version that comes with the GarageBand and Logic stems. Tip: If you want to use software instruments to create your own accompaniment and want them to be justly in tune with my tracks, Logic Pro has support for tuning systems. That means that if you mute my trap beat and add your own samba tracks, you can have the instruments play in the key area you select instead of their usual equal-tempered tuning.

METT Podcast #16 - Master Your Virtual Teaching Tech, with David MacDonald

Thanks to my sponsor this month, MusicFirst

David MacDonald returns to the show to talk about the hardware and software in our virtual teaching setups. Then we speculate about touchscreen Macs and consider how Apple's recent App Store policies might impact the future of creative professional software on iOS.

Topics include:

  • New Zoom features for musicians and teachers
  • David and Philip Rothman's new podcast, Scoring Notes
  • Using Open Broadcaster Software to level up your virtual teaching
  • Routing audio from your apps into Zoom and Google Meet calls
  • Teaching with Auralia
  • LMS integration with third-party music education apps
  • Using MainStage and Logic for performing instruments into virtual classrooms
  • Touchscreen Macs
  • Apple's App Store Policy

Show Notes:

Where to Find Us:
Robby - Twitter | Blog | Book
David MacDonald - Twitter | Website | Blog

Please don't forget to rate the show and share it with others!

Subscribe to Music Ed Tech Talk:

Subscribe to the Blog

Subscribe to the Podcast in... Apple Podcasts | Overcast | Castro | Spotify | RSS

Today's episode is sponsored by MusicFirst:

MusicFirst offers music educators and their students easy-to-use, affordable, cloud-based software that enables music learning, creation, assessment, sharing, and exploration on any device, anywhere, at any time.

MusicFirst Classroom is the only learning management system designed specifically for K-12 music education. It combines the flexibility of an LMS with engaging content and powerful software integrations to help manage your students’ progress, make lesson plans, and create assignments.

And for younger students, MusicFirst Junior is the perfect online system for teaching elementary general music. It includes a comprehensive K-5 curriculum, hundreds of lessons & songs, and kid-friendly graphics to make learning and creating music fun!

Whether you’re teaching remotely, in-person, or in a blended learning environment, MusicFirst will work with you to find a solution that fits your program’s unique needs. Try it free for 30 days at musicfirst.com.

David’s teaching setup.

My teaching setup.

…From far away.

My Online Teaching Setup (High-Tech Edition)

My studio computer and associated hardware.

When school let out in March, I wrote My Very Straightforward and Very Successful Setup for Teaching Virtual Private Lessons. The impetus for this post, and its snarky title, was an overwhelming number of teachers I saw on Facebook fussing about what apps and hardware they should use to teach online when all you really need is a smartphone, FaceTime, and maybe a tripod.

I stand by that post. But there are also reasons to go high-tech. I have had a lot of time this summer to reflect on the coming fall teaching semester. I have been experimenting with software and hardware solutions that are going to make my classes way more engaging.

Zoom

I have been hesitant about Zoom. I still have reservations about their software. Yet, it is hard to resist how customizable their desktop version is. I will be using Google Meet for my public school classes in September, but for my private lessons, I have been taking advantage of Zoom’s detailed features and settings.

For example, it’s easier to manage audio ins and outs. Right from the chat window, I can change whether my voice input is going through my Mac's internal microphone or my studio microphone, and whether video is coming from my laptop webcam or my external Logitech webcam. This will also be useful for routing audio from apps into the call (we will get to that in a moment).

Zoom allows you to choose the audio/video input from right within the call.

Zoom also allows you to AirPlay the screen of an iOS device to the student as a screen sharing option. This is the main reason I have been experimenting with Zoom. Providing musical feedback is challenging over an internet-connected video call. Speaking slowly helps to convey thoughts accurately, but it helps a lot more when I say “start at measure 32” and the student sees me circle the spot I want them to start in the music, right on their phone.

You can get really detailed by zooming in and out of scores and annotating as little as a single note. If you are wondering, I am doing all of this on a 12.9-inch iPad Pro with Apple Pencil, using the forScore app. A tight feedback loop of "student performance → teacher feedback → student adjustment" is so important to good teaching, and a lot of it is lost during online lessons. It helps to get some of it back through the clarity and engagement of annotated sheet music.

Selecting AirPlay as a screen sharing option.

AirPlaying annotated sheet music to the Zoom call using the iPad Pro and forScore app.

As much as I love this, I still think Zoom is pretty student hostile, particularly with the audio settings. Computers already try to normalize audio by taking extreme louds and compressing them. Given that my private lessons are on percussion instruments, this is very bad. Zoom is the worst at it of all the video apps I have used. To make it better, you have to turn on an option in the audio settings called “Use Original Audio” so that the host hears the student’s raw sound, not Zoom’s attempt to even it out. Some of my students report that they have to re-choose this option in the “Meeting Settings” of each new Zoom call.

If this experiment turns out to be worth it for the sheet music streaming, I will deal with it. But this is one of the reasons why I have been using simple apps like FaceTime up until this point.

My Zoom audio settings.

My Zoom advanced audio settings.

Sending App Audio Directly to the Call

I have been experimenting with a few apps by Rogue Amoeba that give me more control over how audio is flowing throughout my hardware and software.

Last Spring, I would often play my public school students YouTube videos, concert band recordings from Apple Music, and warm-up play-alongs that were embedded in Keynote slides. I was achieving this by having the sound of these sources come out of my computer speakers and right back into the microphone of my laptop. It actually works. But not for everyone. And not well.

Loopback is an app by Rogue Amoeba that allows you to combine the audio inputs and outputs of your various microphones, speakers, and apps into new single audio devices that can be recognized by the system. I wrote about it here. My current setup includes a new audio device I created with Loopback, which combines my audio interface and a bunch of frequently used audio apps into one. The resulting device is called Interface+Apps. If I select it as the input in my computer’s sound settings, then my students hear those apps and any microphone plugged into my audio interface directly. The audio quality of my apps is therefore more pure and direct, and there is no risk of getting an echo or feedback effect from my microphone picking up my computer speaker’s sound.

A Loopback device I created which combines the audio output of many apps with my audio interface into a new, compound device called “Interface+Apps.”

I can select this compound device from my Mac’s Sound settings.

Now I can do the following with a much higher level of quality...

  • Run a play-along band track and have a private student drum along
  • Play examples of professional bands for my band class on YouTube
  • Run Keynote slides that contain beats, tuning drones, and other play-along/reference tracks
  • and...

Logic Pro X

Logic Pro X is one of the apps I route through to the call via Loopback. I have a MIDI keyboard plugged into my audio interface and a Roland Octopad electronic drum pad that is plugged in as an audio source (though it can be used as a MIDI source too).

The sounds on the Roland Octopad are pretty authentic. I have hi-hat and bass drum foot pedal triggers so I can play it naturally. So in Logic, I start with an audio track that is monitoring the Octopad, and a software instrument track that is set to a piano (or marimba or xylophone, whatever is relevant). This way, I can model drum set or mallet parts for students quickly without leaving my desk. The audio I produce in Logic is routed through Loopback directly into the call. My students say the drum set, in particular, sounds way better in some instances than the quality of real instruments over internet-connected calls. Isn’t that something...

Multiple Camera Angles

Obviously, there is a reason I have previously recommended a setup as simple as a smartphone and a tripod stand. Smartphones are very portable and convenient. And simple smartphone apps like FaceTime and Google Duo make a lot of good default choices about how to handle audio, without the fiddly settings some of the more established "voice conference" platforms are known for.

Furthermore, I can’t pick up my desk and move it to my timpani or marimba if I need to model something. So I have begun experimenting with multiple camera angles. I bought a webcam back in March (it finally just shipped). I can use this as a secondary camera to my laptop’s camera (Command+Shift+N in Zoom to change cameras).

Alternatively, I can share my iPhone screen via AirPlay and turn on the camera app. Now I can get up from my desk and go wherever I need to. The student sees me wherever I go. This option is sometimes laggy.

Alternatively, I can log in to the call separately on the iPhone and Mac. This way, there are two instances of me, and if I need to, I can mute the studio desk microphone, and use the phone microphone so that students can hear me wherever I go. I like this option the best because it has the added benefit of showing me what meeting participants see in Zoom.

Logging in to the Zoom call on the Mac and iPhone gives me two different camera angles.

SoundSource

This process works well once it is set up. But it does take some fiddling around with audio ins and outs to get it right. SoundSource is another app by Rogue Amoeba that takes some of the fiddliness out of the equation. It replaces the sound options in your Mac’s menu bar, offering you more control and more ease at the same time.

This app is seriously great.

This app saved me from digging into the audio settings of my computer numerous times. In addition to putting audio device selection at a more surface level, it also lets you control the individual volume level of each app, apply audio effects to your apps, and more. One thing I do with it regularly is turn down the volume of just the Zoom app when my students play xylophone.

Rogue Amoeba's apps will cost you, but they are worth it for those who want more audio control on the Mac. Make sure you take advantage of their educator discount.

EDIT: My teaching setup now includes the use of OBS and an Elgato Stream Deck. Read more here.

Conclusion

I went a little overboard here. If this is overwhelming to you, don't get the idea that you need to do it all. Any one of these tweaks will advance your setup and teaching.

This post is not specific about the hardware I use. If you care about the brands and models of my gear, check out My Favorite Technology to read more about the specific audio equipment in my setup.

Making Just Intonation Play Along Tracks for Your Performing Ensemble (Using Tonal Energy and GarageBand)

There are a few things that would be helpful to know about my music teaching philosophy before reading this post.

1. I believe that tone production, intonation, balance and blend are central to teaching performing musicians. I prioritize them much higher than fingering technique, rhythmic precision, and even reading comprehension.

2. The way I structure my band classes starts with, is focused on, and always revisits those core ideas.

3. I have accumulated a wide variety of tools and teaching strategies to meet my goals of superior tone quality, intonation, balance, and blend. One of the most essential tools I use is the Tonal Energy Tuner app.

Tonal Energy Tuner

What is Tonal Energy? A hyper-charged, power-user app for musicians with many advanced features, including...

- Tuning drones that can be triggered polyphonically

- Feedback as to how in tune a performer is, which includes a delightful happy face to depict good or questionable intonation

- Drones and feedback can be adjusted to different temperaments

- A metronome (with more features than nearly any alternative on the App Store) that can be used separately or at the same time as the tuning drones

- Analysis tools that depict amplitude and intonation on an easy-to-read visual graph

- Recording and playback practice tools for musicians to listen back to their performances

- Automated metronome presets that can be sequenced

See the video below. I first show the tuner playing a Bb drone, then how it can model a Bb major triad all at once. Then I turn the tuner to just intonation mode, and you will hear that the third and fifth of the chord are adjusted so that they are in tune with the Bb root. Finally, the video demonstrates how the metronome can be used in combination with these drones.

Imagine now that a student is playing a scale along with Tonal Energy. By leaving the tuner in just intonation, and centering around the key area of Bb major, every note of the scale that I touch will resonate accurately with the Bb, giving the student an accurate reference to blend into.
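For the curious, here is the size of that adjustment in numbers, a quick sketch using the standard just-intonation ratios (my own illustration, not anything published by Tonal Energy):

```python
# How far just intonation moves the third and fifth, measured in cents
# (100 cents = one equal-tempered semitone, so an ET third = 400, ET fifth = 700).
import math

def cents(ratio):
    """Interval size in cents for a given frequency ratio."""
    return 1200 * math.log2(ratio)

print(round(cents(5 / 4), 1))  # just major third  ~386.3 cents (about 14 cents flat of ET)
print(round(cents(3 / 2), 1))  # just perfect fifth ~702.0 cents (about 2 cents sharp of ET)
# So in just intonation mode, TE lowers the D of a Bb triad by about 14 cents and
# raises the F by about 2 cents so both lock in with the Bb root.
```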

Developing An Inner Ear for Diatonic Intervals

Much of music is made up of scales. For a student to learn how to most accurately tune different intervals and chords, I have the drone running in the background during most of my teaching in whatever key area we are working in. I then move my finger to the correct notes of the melody to model and reinforce what good intonation would sound like. See below for an excerpt of a song my beginning students might play.

In the video below, watch as I play this song by dragging my finger along to the melody. This happens with a metronome to reinforce the beat. I like that TE has the option to speak counts out loud. In my experience, this really reinforces a concept of strong beats, weak beats, and where in the measure the performer is. Other tuning apps offer a counting feature, but the sounds in TE are more natural and less computerized.

Making Play Along Tracks in GarageBand

As you can imagine, I am doing a lot of dragging my finger along while students play for me. This gets tedious. I also want my students to be able to hear these pitch relationships when they practice, so I have begun recording them into play along tracks. How do I do this?

Inter-App Audio Apps and Audio Extensions in GarageBand

In the iOS GarageBand app, tracks are usually created either with software instruments or by recording audio directly into the device with the microphone. But what you might not know is that you can also create a track based on the audio output of a third-party audio app. If you have ever used a DAW, think of Inter-App Audio apps and Audio Extensions like plugins. Once launched, you are kicked into a third-party interface (much like using a reverb plugin from Waves or a synthesizer from Native Instruments) which then adds to or alters the sound of your overall project. In a more recent GarageBand update, Apple categorizes Inter-App Audio and Audio Extensions under the External option when you create a new track.

Audio Extensions are effects that alter your tracks, like reverbs and EQs, while Inter-App Audio captures the audio of a third-party app and records it into its own track in GarageBand. You can browse the App Store for Audio Extensions that work with GarageBand.

Recording an Inter-App Audio App Directly Into A GarageBand Project

Watch in the video below as I set up an Inter-App Audio track with Tonal Energy. Next, I press record and capture my justly tuned play-along of Lightly Row in my GarageBand project, using the euphonium sound. The euphonium drone is one of the roundest, darkest, and fullest sounds, and it covers a great range, so it is effective for most instruments to play along with while also modeling a rich, full, resonant sound.

Accurate Note Input with MIDI Controllers

In this video, you can really hear how sloppy the transition from one pitch to the next is when I drag my finger. Notice also that I did not play repeated notes. It is difficult to play the same pitch twice in a row without Tonal Energy changing itself to that key area. One way around these challenges is to set up a portable MIDI keyboard with Tonal Energy. The one I have settled on is the CME X-Key with Bluetooth.

It has a sleek look, is very small, and has low key travel. It has buttons for pitch shifting and octave jumping. And Tonal Energy adapts to it in just intonation mode! Watch in the video below: as I change which chord I am playing, TE automatically snaps the third and fifth of each triad in tune relative to the root. For my Lightly Row performance, I can now hold a Bb drone in one hand while playing the melody in the other.

Embellishing The Track with Other Instruments

The resulting play-along track is pretty useful for students on its own. Let’s make it more fun by adding a drum track.

We can make it even more fun by embellishing with bass and other instruments. I like to change up the style of these play-alongs. Sometimes I don't even pre-record them; I just improvise along with my students to keep things fresh. Be careful, though: these software instruments are NOT justly in tune, so too many of them can defeat the purpose. I try to combat this by having the drone be the loudest thing in the mix. Notice that in this recording I have tried not to create any motion in the accompaniment that interferes with the consonant intervals in the melody, so that the listener's ears can remain focused on the drone as their reference.

Conclusion

Well, that's it! I can trigger these in rehearsals and sectionals, and even share them with my students for home practice. Regular practice with tuning drones has really turned around my band's sound and has given students the foundation for long-term ear skills that will help them HEAR what is in tune, not just respond to the commands "you're sharp!" and "you're flat!"

🔗 Ethan Hein - Teaching Myself the Bach Chaconne with Ableton Live

Ethan Hein - Teaching Myself the Bach Chaconne with Ableton Live:

Gorgeous though the chaconne is, my enjoyment has been hampered by my inability to figure out the rhythm. All classical performers insist on doing extremely expressive (that is, loose) timekeeping. I don’t have the sarabande rhythm internalized well enough to be able to track it through everybody’s gooey rubato. Bach’s rhythms are complicated enough to begin with. He loves to start and end phrases in weird spots in the bar–the very first note of the piece is on beat two. So I needed some help finding the beat. A chaconne is supposed to be a dance, right? Bach wrote those note values the way he wrote them for a reason. Did he really want performers to assign any length they felt like assigning them? My gut tells me that he didn’t. I suspect that he probably played his own music in tempo, maybe with some phrasing and ornamentation but still with a clearly recognizable beat. I imagine him gritting his teeth at the rubato that modern performers use. Maybe that’s just me projecting my own preferences, but this sense comes from listening to a lot of Bach and performing some too.

So, I wanted to hear someone play the chaconne in tempo, just to hear how it works. And since no one seems to play it that way, I finally went and got the MIDI from Dave’s JS Bach MIDI page and put it into Ableton Live. I added a bunch of triple meter Afro-Cuban drum patterns to help me feel the beat, and had them enter and exit wherever I heard a natural section boundary in the music.

My personal favorite way to enjoy this piece is by performing it on vibraphone, but this is cool too. :)

🔗 Noteflight as a DAW | The Ethan Hein Blog

Noteflight as a DAW | The Ethan Hein Blog:

Notation software was not originally intended to be a composition tool. The idea was that you’d do your composing on paper, and then transcribe your handwritten scores into the computer afterwards. All of the affordances of Finale, Sibelius and the like are informed by that assumption. For example, you have to enter the notes in each measure in order from left to right. If you’re copying from an existing score, that makes sense. If you’re composing, however, it’s a serious obstacle. I can’t speak for all composers, but I’m most likely to start at the end of the bar and work backwards. If I want to put a note on the last sixteenth note of the bar in the MIDI piano roll, I just click the mouse on that beat and I’m done. Notation software requires me to first calculate the combination of rests that’s fifteen sixteenth notes long. I’m told that Dorico has finally addressed this, and lets you place your notes wherever you want. Noteflight, however, follows the model of Finale and Sibelius.

This is a super fascinating explanation of the way modern students are learning to create music on a screen. And I can vouch for Dorico that yes, it deals with note input in a non-linear way, much the same way a MIDI editor functions.

🔗 Soundtrap Enable Import Export MIDI Music Files

Soundtrap Enable Import Export MIDI Music Files:

The MIDI music technology protocol is used worldwide to enable electronic devices -- computers, cellphones, karaoke machines and more -- to generate sounds. The enhanced MIDI support by Soundtrap furthers the creative process by making the multiple tools used to make music interoperable online. This is part of an effort by Soundtrap to broaden its ecosystem of best-of-breed industry solutions so that musicians and music creators have even more flexibility in their music-making efforts. For example, Soundtrap is now interoperable with digital audio workstations (DAWs) through MIDI File Export so users can send all or part of their composition to other solutions such as GarageBand or Pro Tools.

I experimented with Soundtrap with my general music classes last Spring. I was entirely skeptical about the prospect of running a DAW in a web browser but Soundtrap impressed me. It does a great job handling audio and MIDI files in a way that doesn't feel much slower than using a native app like GarageBand. My students loved the collaborative features and we were all left wanting more.

One of my GarageBand assignments in previous years was a MIDI remix, where I put MIDI files for familiar movie and pop songs in a shared folder and encouraged students to remix them by altering the instrument voices, changing the form, and adding loops to transform the style. The fact that I can now do this assignment entirely in a web-based application through Soundtrap's MIDI import and export is impressive. And that is not even to mention that Soundtrap can exchange files with other web-only applications like Noteflight. Very cool. If you have been thinking about checking out Soundtrap for yourself or for a classroom, I highly encourage it.
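As an aside, if you ever want to prepare a whole folder of those shared MIDI files at once, here is a minimal sketch (assuming the third-party Python library mido and a hypothetical theme.mid file) that swaps every instrument voice in a file before import; it is the programmatic version of the voice changes students then make by ear inside Soundtrap:

```python
# Swap every instrument voice in a MIDI file before importing it into a DAW.
# Requires the third-party mido library: pip install mido
import mido

NEW_PROGRAM = 80  # General MIDI "Lead 1 (square)" (program 81, zero-indexed here)

mid = mido.MidiFile("theme.mid")  # hypothetical source file
for track in mid.tracks:
    for i, msg in enumerate(track):
        if msg.type == "program_change":
            track[i] = msg.copy(program=NEW_PROGRAM)
mid.save("theme_remixed.mid")
```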