Semantic Sonatas
This is the first period in history to be so thoroughly "marked up." Those born after 2000 have what previous generations never had: historical information attached to almost everything, either as actual text or as metadata. (This is what has always been so exciting about the Internet: its semantic possibilities. In the past we relied on the oral tradition, stories told by the elders, much of it never written down; what was written down was written in longhand.)
Ideally, we'd have standard metadata schemas just for annotations, so that when you open an article you get all the registered and *moderated* comments in the form of side notes. The power lies in the connections that get made, either naturally in the flow of your own mentation or through linked information in the form of fields, metadata, and microformats. (Think of searching [song title] [lyrics] and getting the lyrics at the top of the results.) Social networking already does this, but it isn't bibliographic. In the future, we could assemble bits of music simply through bibliographic linking, with established categories (a P2P archivist, in effect). See: http://phys.org/news/2016-05-scientists-categorize-music.html
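To make the annotation idea concrete, here's a minimal sketch of what one such record might look like, loosely patterned on the W3C Web Annotation Data Model. Everything specific in it (the user and article URLs, the moderation field, the render_sidenotes helper) is an illustrative assumption, not an established schema:

```python
# A sketch of a moderated annotation attached to one passage of an article,
# loosely patterned on the W3C Web Annotation Data Model (JSON-LD).
# The URLs and the "moderation" field are hypothetical placeholders.

annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "motivation": "commenting",
    "creator": "https://example.org/users/registered-reader",  # hypothetical
    "body": {
        "type": "TextualBody",
        "value": "Compare this with the 2016 phys.org categorization study.",
        "format": "text/plain",
    },
    "target": {
        "source": "https://example.org/articles/semantic-sonatas",  # hypothetical
        "selector": {
            "type": "TextQuoteSelector",  # anchors the note to an exact quote
            "exact": "assemble bits of music",
        },
    },
    # Not part of the standard model; assumed here for the moderation idea:
    "moderation": {"status": "approved", "moderator": "editor-42"},
}

def render_sidenotes(annotations):
    """Show approved annotations as side notes beside the quoted passage."""
    for a in annotations:
        if a.get("moderation", {}).get("status") == "approved":
            quote = a["target"]["selector"]["exact"]
            print(f'[side note on "{quote}"] {a["body"]["value"]}')

render_sidenotes([annotation])
```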
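And here's a toy sketch of the "P2P archivist" notion: bits of music described only by bibliographic metadata, chained into a sequence by shared categories, the way citations chain papers together. The fragments, category names, and greedy matching rule are all invented for illustration:

```python
# Toy sketch: assembling bits of music purely through bibliographic linking.
# Each fragment carries only metadata; we chain fragments that share a
# category. All of the data below is invented.

fragments = [
    {"id": "frag-01", "title": "Sonata exposition", "categories": {"classical", "D major"}},
    {"id": "frag-02", "title": "Folk cadence", "categories": {"folk", "D major"}},
    {"id": "frag-03", "title": "Drone interlude", "categories": {"folk", "ambient"}},
    {"id": "frag-04", "title": "Ambient coda", "categories": {"ambient", "classical"}},
]

def assemble(fragments, start_id):
    """Greedily chain fragments, each sharing a category with the last."""
    sequence = [next(f for f in fragments if f["id"] == start_id)]
    remaining = [f for f in fragments if f["id"] != start_id]
    while remaining:
        current = sequence[-1]
        best = max(remaining,
                   key=lambda f: len(f["categories"] & current["categories"]))
        if not best["categories"] & current["categories"]:
            break  # no shared category left: the bibliographic chain ends
        sequence.append(best)
        remaining.remove(best)
    return [f["title"] for f in sequence]

print(assemble(fragments, "frag-01"))
# -> ['Sonata exposition', 'Folk cadence', 'Drone interlude', 'Ambient coda']
```

A real version would pull these records from shared, peer-hosted catalogs rather than a local list, which is where the established categories would matter.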
But there are challenges, as with any new framework (or even one as old as this): "vastness, vagueness, uncertainty, inconsistency, and deceit.... Automated reasoning systems will have to deal with all of these issues in order to deliver on the promise of the Semantic Web."
It leads back to a simple question: music is a rarefied domain, so why change it? Perhaps only because we're all technologists now, in the sense that we all use and create technology.
I'm not sold on the idea of semantically linked sonatas, but it's a worthwhile thought experiment, one that could have interesting implications for quasi-intelligent AIs.
***
Post-script: Just think of the incredible amount of energy it would take to do all this, for something humans already do pretty well with their brains. It could be a solution in search of a problem.