Syllabus (TENTATIVE)

Introduction to Music Informatics

Informatics I548 - Music N560

 

Instructor: Don Byrd, Visiting Associate Professor of Informatics; Adjunct Associate Professor of Music
Email: donbyrd@indiana.edu
Offices: Informatics 307B; basement, Cook Music Library, Simon Center (ask for me at the Reference or Circulation Desk)
Phone: 856-0129
Office Hours: 2:30 Weds in the Music Library; other times by appointment.

Class meets Mon/Weds/Fri from 1:25 - 2:15 PM, in Simon Center M373. To get to M373, you must go into the Music Library; then go up two floors.


Course Overview and Goals

We will briefly cover several areas that are important in music informatics, hopefully enough to give students a feeling for the subject in general. The basic goals of the course are for students to:

The second goal is tricky. How successful is System A? How useful is Approach B in a given context? These questions are surprisingly hard to answer. It's easy to overgeneralize and assume that a technique that works in one situation (e.g., with one kind of music) will work in a situation that's actually very different, or, conversely, to assume that a factor that makes an approach useless in one situation will also kill it in another.

To achieve these goals, I believe students will need the following "core competencies":

We'll have assignments designed to cover each of them.

Class Format and Requirements

The course will be organized around readings, systems, and projects for each area, with students giving occasional presentations to the class on the assigned readings, systems, and projects. I expect a student presenting an assigned paper or system to understand the content sufficiently to present the problem(s) addressed and to explain to the class the approach taken and the experimental findings or other results. If possible, the student should go further and seek out resources and examples that illustrate the principles and/or algorithms discussed in the paper or implemented in the system. We'll also have activities of other sorts, e.g., listening individually with headphones, or small-group discussions.

There will be short assignments, including at least one that involves writing a simple program, and a large final project. For the final project, I'll provide a list with a number of possible topics, or you can propose your own. You can do your projects alone or in teams of two or three.

To keep our feet on the ground, i.e., to keep a strong connection between what we're studying and real music, (1) at the beginning of the course, we'll choose some music each of us is interested in, the idea being to create a collection to use as examples throughout the semester; (2) we'll take a look at, and when appropriate listen to, one or more major systems for each area we cover.

Preference will be given to systems we can actually try out, but most state-of-the-art systems aren't available, so we'll also discuss some interesting systems we can't touch (e.g., Cope's EMI) and some interesting systems that aren't state of the art. I expect that student presentations will average about 1/4 of class meeting time, not counting presentations of major projects at the end of the semester.

In choosing both reading material and systems, I'll take students' interests into account as much as possible.

I'll try to arrange to have a music-informatics researcher as a guest speaker.

Prerequisites

Readings and Software Tools

There is no textbook as such. Music informatics is too new and too fast-moving for anything suitable to exist. Many readings will be available on the Web; others will be on reserve in the Music Library, or I'll hand out copies. They will be selected mostly from recent literature -- not all of it academic -- such as Computer Music Journal, the Journal of New Music Research, and Electronic Musician, as well as proceedings of conferences like the International Conference on Music Information Retrieval, the International Computer Music Conference, Computer Music Modeling and Retrieval, and the Joint Conference on Digital Libraries, and books like Pohlmann's Principles of Digital Audio.

The software tools we'll use will include R (a programming language and system), Audacity (an audio editor), Variations2 (IU's pioneering digital music library system), and most likely an experimental version of the Nightingale music-score editor with music-IR features.

 

Course Outline

The following outline of major topics is approximate and subject to change. For one thing, there's a good chance we'll skip one or two topics completely to allow more time for the others.

(2 weeks) 1. Introduction: Music Research and Music Informatics; Our Own Music; Digital Audio; Programming

We'll discuss what research is and why it matters for us; we may compose some very simple music, in order to learn more about how music is put together; and we'll each choose some music we're interested in, then listen to everyone's choices and discuss them from a technical standpoint. We'll also learn a bit about programming in a simple language, namely R.
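Just to give the flavor of the programming involved (the actual exercises may well differ), here's a toy R sketch that builds an eight-note melody by sampling from a C-major scale:

    # Pitches of a C-major scale, written as note names
    cmajor <- c("C", "D", "E", "F", "G", "A", "B")

    # Pick eight notes at random (with repetition) to form a toy melody
    set.seed(1)                                 # make the result reproducible
    melody <- sample(cmajor, 8, replace = TRUE)
    print(melody)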


(2 weeks) 2. Music Representation and Notation; Converting between Representations and Encodings, Acquiring Music in Digital Form, etc.

The list of computer representations for music in use today is incredibly long, and problems in converting between one and another cause an enormous amount of trouble. Why can't people just choose one and be done with it? The short answer is, which one is best depends very much on what you want to do. We'll discuss music in all three basic forms -- audio, time-stamped events (MIDI), and notation -- and how you get any of them into a computer in the first place.
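To see how different encodings of the same notes can look -- a toy illustration, not tied to any particular system -- here are four notes as MIDI note numbers, converted in R to frequencies in Hz with the standard equal-temperament formula:

    midi <- c(60, 62, 64, 60)              # MIDI note numbers (60 = middle C)
    freq <- 440 * 2^((midi - 69) / 12)     # Hz, assuming equal temperament and A4 = 440
    round(freq, 1)                         # roughly 261.6 293.7 329.6 261.6

Notation would add still more information (pitch spelling, rhythm, voicing) that neither of these encodings captures.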


(2 weeks) 3. Acoustics and Psychoacoustics; Music Perception & Cognition; Expectation vs. Perception

Acoustics is the branch of physics that studies sound. When sound is considered in terms of how it's perceived, we have psychoacoustics, which is a matter of psychology and, to a lesser extent, music theory. Just as optical illusions show the complexity of visual perception, auditory illusions dramatize the complexity of our perception of sound. One aspect is the fact that we experience musical sounds partly in the time domain and partly in the frequency domain. We'll consider what this means for music informatics. We'll also see how so-called "perceptual coding" makes it possible to compress audio tremendously with very little loss of fidelity.
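To make the time-domain/frequency-domain idea concrete, here's a minimal sketch in base R (just the kind of experiment we might try; details may change) that synthesizes a 440 Hz sine tone as time-domain samples, then uses the built-in fft() function to recover the same information in the frequency domain:

    sr <- 8000                              # sample rate in Hz
    t  <- (0:(sr - 1)) / sr                 # one second of sample times
    x  <- sin(2 * pi * 440 * t)             # time domain: a 440 Hz sine tone

    spec     <- Mod(fft(x))                 # frequency domain: magnitude spectrum
    peak_bin <- which.max(spec[1:(sr / 2)]) # strongest frequency component
    peak_bin - 1                            # 440, the tone's frequency in Hz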


(2 weeks) 4. Music Retrieval Via Metadata (Digital Music Libraries) and By Content (Music IR)

We'll talk about finding music in all forms (audio, MIDI, and notation) by searching or browsing, based on its content or its style or genre (the latter as attempted by music-recommender systems like Pandora, Last.fm, MusicStrands, etc.). Genre classification is obviously useful, but it's very problematic, and automatic genre classification tends to increase the confusion. But the traditional way of finding music is based on bibliographic information, as in a library catalog, rather than on its content. Naturally, online catalogs like IUCAT allow many more options than do the drawers full of 3-by-5 cards libraries used to have, and computers are capable of supporting many more options than any online catalog. And once you've found the music you want, recent systems like IU's own Variations2 are starting to include powerful features for doing useful things with it. Finally, my "Music Similarity Scale" tries to make it easier to relate music-informatics problems to each other.
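As a toy illustration of content-based matching -- vastly simpler than anything a real music-IR system does -- the R sketch below reduces melodies, given as MIDI note numbers, to their up/down/repeat contours and checks whether the query's contour occurs anywhere in the target's:

    # Reduce a melody (MIDI note numbers) to its contour: U(p), D(own), or R(epeat)
    contour <- function(midi) {
      steps <- sign(diff(midi))                       # -1, 0, or 1 for each interval
      paste(c("D", "R", "U")[steps + 2], collapse = "")
    }

    target <- c(60, 62, 64, 60, 60, 62, 64, 60)       # a melody in the collection
    query  <- c(67, 69, 71, 67)                       # same shape, different key

    grepl(contour(query), contour(target), fixed = TRUE)   # TRUE: the contours match

Notice that the match succeeds even though the query has been transposed to a different key.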


(1 week) 5. Music Similarity, Sampling, and Intellectual Property Rights

How does a person or a computer decide some music is similar to other music? How can a court decide if some music -- especially sampled music -- infringes the copyright of other music? Less obvious and maybe even more important, have recent changes to U.S. law and policies gone too far, interfering with the creation of new kinds of music that digital technology has just made possible?


(1 week) 6. User Interfaces and Visualization for Music Systems

Music is so complex that ease of use is an important issue for doing almost anything with it on a computer; visualization (e.g., for browsing) is a particular challenge for usability.


(1 2/3 weeks) 7. Synthesis Of Sounds and of Music: New Approaches to Performance, Composition, and Improvisation

A great deal of music we hear these days could not have been made without computers, and they're used in a wide variety of ways. Sometimes computers are used as performing media, to synthesize sounds ranging from realistic imitations of orchestral instruments or human voices to wild effects unlike anything heard before; sometimes they're used to choose notes to create music that sounds surprisingly like the works of composers from Bach to Scott Joplin and beyond; and sometimes they're used as partners in live improvisation.
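As a tiny taste of synthesis -- nothing like the systems we'll actually study, but enough to show the principle -- a few lines of base R can build a one-second tone by additive synthesis, summing a fundamental and a few weaker harmonics:

    sr   <- 8000                          # sample rate in Hz
    t    <- (0:(sr - 1)) / sr             # one second of sample times
    f0   <- 220                           # fundamental frequency (A3)
    amps <- c(1, 0.5, 0.25, 0.125)        # relative amplitudes of harmonics 1-4

    # Additive synthesis: sum sine waves at integer multiples of the fundamental
    tone <- rowSums(sapply(seq_along(amps),
                           function(k) amps[k] * sin(2 * pi * k * f0 * t)))
    tone <- tone / max(abs(tone))         # scale into the range [-1, 1]

Changing the relative amplitudes of the harmonics changes the timbre, which is one simple reason computers can both imitate familiar instruments and produce sounds nobody has heard before.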


(1/3 week) Reprise: Music Research and Music Informatics

(2 weeks) Presentations of Final Projects

 

Reserve Books

 

Course Requirements and Grading

Final grades will be based on the following.

In addition, while I can't give specifics, improvement over the semester may help your grade. (However, lack of improvement won't hurt it.)

 

Miscellaneous

If you have a documented disability and anticipate needing accommodations in this course, please make arrangements to meet with me soon.

If you have any comments, questions, or suggestions, please see me during office hours, make an appointment, write me a note (anonymously if you like), or send me email.

University policies on academic dishonesty will be followed. Cite your sources. Students found to be engaging in plagiarism, cheating, or other types of dishonesty will receive an F for the course. For further information, see the IU Code of Student Ethics at http://campuslife.indiana.edu/Code/index1.html .

Late work will not be accepted without a compelling reason.


Last updated: 6 Sept. 2007
Comments: donbyrd(at)indiana.edu
Copyright 2006-07, Donald Byrd