
Take Advantage of IFTTT Discounted Price Until October 31

In an upcoming episode of my podcast, Frank Buck and I talk a bit about automation. In that episode we reference a great service called If This Then That. I have been using it for years, and though the service has always been free, they recently announced a Pro version with a “choose your own price” subscription model.

You can choose a price as low as $1.99 a month until October 31. Since the podcast episode isn’t dropping until later in the week, I thought I would get this news out there now.

This is an extremely useful service that has been a part of my productivity workflow for years. IFTTT allows you to string together different apps and services to create automations, or "Applets." Some examples might include:

  • If I save a YouTube video to watch later, add a task to remind me to watch it

  • If I like a Tweet, save the attached article to a read-it-later list

  • If I am tagged in an Instagram photo, save it to my Dropbox

  • If I complete an item on my to-do list, log it in a row of a Google Sheet

  • If I pin something on Pinterest, share it to Facebook

The possibilities are limitless. The pro version adds a ton of features, for example:

  • Multi-step Applets

  • Queries and conditional logic (see the sketch after this list)

  • Multiple actions

  • Faster Applet execution

    ... and beyond.
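The conditional logic is the feature I am most excited about. Pro Applets can include "filter code," a small snippet of TypeScript that runs between your trigger and your actions and can modify or skip them. Here is a minimal sketch of the idea; the "Pocket.saveForLater" action name is a placeholder I made up for illustration (check the filter code editor for the identifiers your own Applet exposes), though Meta.currentUserTime and the skip() method are, as far as I know, part of the real filter code environment.

```typescript
// Sketch of IFTTT Pro filter code: only save liked-tweet articles to my
// read-it-later list during waking hours; skip the action overnight.
// "Pocket.saveForLater" is a made-up placeholder name for illustration.
const hour = Meta.currentUserTime.hour(); // the user's local time

if (hour < 8 || hour >= 22) {
  Pocket.saveForLater.skip(); // the Applet still runs, but this action does not
}
```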

IFTTT is a great tool and I strongly recommend you check out the new Pro features before the end of this month!

Learn OmniFocus: Workflows with Robby Burns - Watch the Free Video Now

Last weekend, I had the awesome pleasure of being a workflow guest on Learn OmniFocus, a website dedicated to teaching and training on the task management app OmniFocus, complementary apps, and productivity in general.

The video, along with the resources mentioned in my appearance, can be viewed here. I recommend watching it there because the chapters let you skip to the various sections of the video by topic.

Alternatively you can watch the video on Facebook or on YouTube.

Topics include:

  • the definition of multitasking

  • my love of quick entry and using a task inbox

  • how single item action lists are useful in the middle school band teaching environment

  • how to stay on top of more tasks than are actually possible to do in a day by using tags and perspectives that show only the information relevant to a particular context

  • using the Drafts app for quickly capturing my thoughts, processing my tasks, and acting upon them in powerful ways

  • using project templates for larger projects like field trips and musical performances so that tasks don't slip through the cracks

  • using Siri Shortcuts to turn data into variables and make a blog post, shared document, and OmniFocus project for creating an episode of the Music Ed Tech Talk podcast

  • using DEVONthink to connect documents to projects and tasks in OmniFocus and keep things I want to "check out later" off of my to-do list

  • putting widgets with charts that show a view of my day in OmniFocus on the Home Screen of my iPhone

My thanks to Tim Stringer for the invitation and for his inspiring work with Learn OmniFocus!

Learn OmniFocus is a great website, resource, and community dedicated to empowering people to be more mindful and productive. The app OmniFocus is at the center of it, but there is so much more to it than that, including productivity basics, apps, and services that complement one another. Be sure to check it out here and become a member here. There are educator discounts!

🔗 Darcy James Argue on Spotify, Artist Compensation, and User-Centric Payment Systems

I hesitate to post links to content on Facebook, but Darcy James Argue wrote an excellent post on his Facebook page regarding Spotify CEO Daniel Ek's recent claim that musicians may no longer be able to release music only "once every three to four years."

Excerpt below. Read the entire post here.

There’s been a lot of talk about Daniel Ek telling the artists whose creative work has made him a multi-billionaire that if we want to be paid a living wage, we just need to “work harder.” It’s infuriating, of course, but whenever this conversation comes up, people also tend to be extremely defeatist — yes, Spotify is horrible for artists, but it’s also the future, so what are you going to do? Well, there are actually a lot of things you can do, including supporting the artists you care about directly by purchasing their music via Bandcamp and supporting the crowdfunding campaigns that allow them to actually make records. But even within the streaming world, there’s a model that is much more equitable than Spotify’s. It’s called a User-Centric Payment System, and essentially what it does is make sure the money that you pay for your monthly subscription fee actually goes to the artists you listen to.
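To make the user-centric idea concrete, here is a toy calculation of my own (the numbers are made up for illustration and are not Darcy's): imagine two subscribers who each pay $10, one streaming a big pop act 1,000 times and the other streaming a jazz artist 20 times.

```typescript
// Toy comparison of pro-rata vs. user-centric streaming payouts.
// All numbers are made up for illustration.
type Listener = { fee: number; plays: Record<string, number> };

const listeners: Listener[] = [
  { fee: 10, plays: { BigPopAct: 1000 } },
  { fee: 10, plays: { JazzArtist: 20 } },
];

// Pro-rata (the Spotify-style model): every subscriber's fee goes into one
// pool, which is split by each artist's share of ALL plays on the platform.
function proRata(ls: Listener[]): Record<string, number> {
  const pool = ls.reduce((sum, l) => sum + l.fee, 0);
  const plays: Record<string, number> = {};
  let total = 0;
  for (const l of ls) {
    for (const [artist, n] of Object.entries(l.plays)) {
      plays[artist] = (plays[artist] ?? 0) + n;
      total += n;
    }
  }
  const payout: Record<string, number> = {};
  for (const [artist, n] of Object.entries(plays)) {
    payout[artist] = (pool * n) / total;
  }
  return payout;
}

// User-centric: each subscriber's fee is split only among the artists
// that subscriber actually listened to.
function userCentric(ls: Listener[]): Record<string, number> {
  const payout: Record<string, number> = {};
  for (const l of ls) {
    const mine = Object.values(l.plays).reduce((a, b) => a + b, 0);
    for (const [artist, n] of Object.entries(l.plays)) {
      payout[artist] = (payout[artist] ?? 0) + (l.fee * n) / mine;
    }
  }
  return payout;
}

console.log(proRata(listeners));     // JazzArtist: about $0.39 of the $20 pool
console.log(userCentric(listeners)); // JazzArtist: the full $10 from their one fan
```

In other words, under pro-rata payouts your subscription is diluted by everyone else's listening habits; under a user-centric system, your money follows the artists you actually play.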

By the way, if you are not familiar with the music of Darcy James Argue, get on it! Brooklyn Babylon is one of the most astounding records I have heard in the past 10 years.

My Online Teaching Setup (High-Tech Edition)

My studio computer and associated hardware.

When school let out in March, I wrote My Very Straightforward and Very Successful Setup for Teaching Virtual Private Lessons. The impetus for this post, and its snarky title, was an overwhelming number of teachers I saw on Facebook fussing about what apps and hardware they should use to teach online when all you really need is a smartphone, FaceTime, and maybe a tripod.

I stand by that post. But there are also reasons to go high-tech. I have had a lot of time this summer to reflect on the coming fall teaching semester. I have been experimenting with software and hardware solutions that are going to make my classes way more engaging.

Zoom

I have been hesitant about Zoom. I still have reservations about their software. Yet, it is hard to resist how customizable their desktop version is. I will be using Google Meet for my public school classes in September, but for my private lessons, I have been taking advantage of Zoom’s detailed features and settings.

For example, it’s easier to manage audio ins and outs. Right from the call window, I can change whether my voice input is going through my Mac's internal microphone or my studio microphone, and whether video is coming from my laptop webcam or my external Logitech webcam. This will also be useful for routing audio from apps into the call (we will get to that in a moment).

Zoom allows you to choose the audio/video input from right within the call.

Zoom also allows you to AirPlay the screen of an iOS device to the student as a screen sharing option. This is the main reason I have been experimenting with Zoom. Providing musical feedback is challenging over an internet-connected video call. Speaking slowly helps to convey thoughts accurately, but it helps a lot more when I say “start at measure 32” and the student sees me circle the spot I want them to start in the music, right on their phone.

You can get really detailed by zooming in and out of scores and annotating as little as a single note. If you are wondering, I am doing all of this on a 12.9-inch iPad Pro with Apple Pencil, using the forScore app. A tight feedback loop of “student performance → teacher feedback → student adjustment” is so important to good teaching, and a lot of it is lost during online lessons. It helps to get some of it back through the clarity and engagement of annotated sheet music.

Selecting AirPlay as a screen sharing option.

AirPlaying annotated sheet music to the Zoom call using the iPad Pro and forScore app.

As much as I love this, I still think Zoom is pretty student hostile, particularly with the audio settings. Computers already try to normalize audio by taking extreme louds and compressing them. Given that my private lessons are on percussion instruments, this is very bad. Zoom is the worst at it of all the video apps I have used. To make it better, you have to turn on an option in the audio settings called “Use Original Audio” so that the host hears the student’s raw sound, not Zoom’s attempt to even it out. Some of my students report that they have to re-choose this option in the “Meeting Settings” of each new Zoom call.

If this experiment turns out to be worth it for the sheet music streaming, I will deal with it. But this is one of the reasons why I have been using simple apps like FaceTime up until this point.

My Zoom audio settings.

My Zoom advanced audio settings.

Sending App Audio Directly to the Call

I have been experimenting with a few apps by Rogue Amoeba that give me more control over how audio flows through my hardware and software.

Last Spring, I would often play my public school students YouTube videos, concert band recordings from Apple Music, and warm-up play-alongs that were embedded in Keynote slides. I was achieving this by having the sound of these sources come out of my computer speakers and right back into the microphone of my laptop. It actually works. But not for everyone. And not well.

Loopback is an app by Rogue Amoeba that allows you to combine the audio inputs and outputs of your various microphones, speakers, and apps into new, single audio devices that the system can recognize. I wrote about it here. My current setup includes a new audio device I created with Loopback which combines my audio interface and a bunch of frequently used audio apps into one. The resulting device is called Interface+Apps. If I select it as the input in my computer’s sound settings, then my students hear those apps, and any microphone plugged into my audio interface, directly. The audio from my apps is therefore purer and more direct, and there is no risk of getting an echo or feedback effect from my microphone picking up my computer speaker’s sound.

A Loopback device I created which combines the audio output of many apps with my audio interface into a new, compound device called “Interface+Apps.”

I can select this compound device from my Mac’s Sound settings.

Now I can do the following with a much higher level of quality...

  • Run a play-along band track and have a private student drum along
  • Play examples of professional bands for my band class on YouTube
  • Run Keynote slides that contain beats, tuning drones, and other play-along/reference tracks
  • and...

Logic Pro X

Logic Pro X is one of the apps I route through to the call via Loopback. I have a MIDI keyboard plugged into my audio interface and a Roland Octopad electronic drum pad that is plugged in as an audio source (though it can be used as a MIDI source too).

The sounds on the Roland Octopad are pretty authentic. I have hi-hat and bass drum foot pedal triggers so I can play it naturally. So in Logic, I start with an audio track that is monitoring the Octopad, and a software instrument track that is set to a piano (or marimba or xylophone, whatever is relevant). This way, I can model drum set or mallet parts for students quickly without leaving my desk. The audio I produce in Logic is routed through Loopback directly into the call. My students say the drum set, in particular, sounds way better in some instances than the quality of real instruments over internet-connected calls. Isn’t that something...

Multiple Camera Angles

Obviously, there is a reason I have previously recommended a setup as simple as a smartphone and a tripod stand. Smartphones are very portable and convenient. And simple smartphone apps like FaceTime and Google Duo make a lot of good default choices about how to handle audio without the fiddly settings some of the more established “voice conference” platforms are known for.

Furthermore, I can’t pick up my desk and move it to my timpani or marimba if I need to model something. So I have begun experimenting with multiple camera angles. I bought a webcam back in March (it finally just shipped). I can use this as a secondary camera to my laptop’s camera (Command+Shift+N in Zoom to change cameras).

Alternatively, I can share my iPhone screen via AirPlay and turn on the camera app. Now I can get up from my desk and go wherever I need to. The student sees me wherever I go. This option is sometimes laggy.

Alternatively, I can log in to the call separately on the iPhone and Mac. This way, there are two instances of me, and if I need to, I can mute the studio desk microphone, and use the phone microphone so that students can hear me wherever I go. I like this option the best because it has the added benefit of showing me what meeting participants see in Zoom.

Logging in to the Zoom call on the Mac and iPhone gives me two different camera angles.

SoundSource

This process works well once it is set up. But it does take some fiddling around with audio ins and outs to get it right. SoundSource is another app by Rogue Amoeba that takes some of the fiddly-ness out of the equation. It replaces the sound options in your Mac’s menu bar, offering you more control and more ease at the same time.

This app is seriously great.

This app has saved me from digging into the audio settings of my computer numerous times. In addition to putting audio device selection right at the surface, it also lets you control the individual volume level of each app, apply audio effects to your apps, and more. One thing I do with it regularly is turn down the volume of just the Zoom app when my students play xylophone.

Rogue Amoeba's apps will cost you, but they are worth it for those who want more audio control on the Mac. Make sure you take advantage of their educator discount.

EDIT: My teaching setup now includes the use of OBS and an Elgato Stream Deck. Read more here.

Conclusion

I went a little overboard here. If this is overwhelming to you, don't get the idea that you need to do it all. Any one of these tweaks will advance your setup and teaching.

This post is not specific about the hardware I use. If you care about the brands and models of my gear, check out My Favorite Technology to read more about the audio equipment in my setup.

Panel Discussion: "Teaching Music Online During the Pandemic" this Wednesday, August 5, 2020

I am taking part in a panel discussion called "Teaching Music Online During the Pandemic" this Wednesday, August 5th. It is hosted by the Music Teachers Facebook Group and takes place at 8 pm over Zoom.

I will specifically be contributing ideas about practical instruction for performing ensembles. I am planning to discuss everything from large group rehearsals and break-out chamber ensembles to music scanning apps and software for assessing student performance.

Some great minds are involved. Here is a description of the panel from the Facebook Event:

Join us for a free online panel discussion with several of your Music Teacher Administrators and Moderators and special guests as we discuss the tools, techniques, and resources to move your music ensembles and music classrooms totally online or to a blended learning hybrid. Panelists include Jim Frankel, CEO of Music First; Katie Wardrobe, Director, Midnight Music Technology Training; Ron Kearns, retired HS band teacher; Tom West Blended Learning Instrumental Music teacher; Robby Burns, MS band teacher; Richard McCready, HS guitar teacher & music ministry; Tiffany Walker, MS band teacher; Krystal Williams, HS band teacher.

If you are interested, you can join the group here. Note: You will not be admitted into the group if you do not answer the questions.
