Numbed Down
I realized that when I play instruments I’m getting tactile feedback–something you don't experience when you're generating music with AI. And every instrument feels different–it’s heavier or lighter, larger or smaller–and you immediately adapt to it and interact with it. When we're just tweaking software, we're not using the same parts of our brains. With AI, we tweak it the way we would a stereo receiver or a studio effect–a parametric equalizer or other outboard gear–where we're just twisting a knob or making some kind of setting, which is not what playing (or writing) music is.

What I think would be an interesting interface for AI music is an analog (not virtual) controller like a stereo receiver or an equalizer–just a series of physical knobs and sliders that we could manipulate to get the results we want more quickly (a rough sketch of the idea follows below). Currently it takes too long to get results from an AI music interface. Instruments are also interfaces, but they respond immediately to whatever you do to them; you don't have to wait.

Similarly, being able to access musical scores online doesn't compare to having a study score you can pick up off the shelf and open to the passage you want. In many ways computers have made our lives more inconvenient–on top of forcing us to interact with the “alien” worlds of their interfaces. At least when you can turn a knob or move a slider, it's like playing a musical instrument, or a synthesizer, which responds immediately to what you're doing.
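Here is a minimal sketch of what the knob-box idea might look like, assuming a generic hardware MIDI controller read with the real mido library; the CC-number assignments, the parameter names, and the regenerate() function standing in for the AI engine are all hypothetical placeholders for whatever model you would actually drive:

```python
# A rough sketch: a hardware MIDI knob box mapped to AI-generation
# parameters, so turning a knob changes the music immediately instead
# of going through a prompt-and-wait cycle.
# `mido` is a real MIDI library; the AI side here is hypothetical.

import mido

# Hypothetical mapping from MIDI CC numbers to generation parameters.
CC_TO_PARAM = {
    1: "tempo",        # mod wheel
    71: "brightness",  # a common "timbre" knob
    74: "density",     # a common "cutoff" knob, repurposed here
}

params = {"tempo": 0.5, "brightness": 0.5, "density": 0.5}

def regenerate(params):
    """Placeholder for whatever AI music engine the knobs would drive."""
    print(f"regenerating with {params}")

with mido.open_input() as port:  # opens the default MIDI input port
    for msg in port:             # blocks, yielding messages as knobs move
        if msg.type == "control_change" and msg.control in CC_TO_PARAM:
            name = CC_TO_PARAM[msg.control]
            params[name] = msg.value / 127.0  # CC values run 0..127
            regenerate(params)                # respond on every twist
```

The design point is latency: each physical twist updates a parameter and retriggers generation right away, more like playing an instrument than filing a request and waiting.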