I want to do this as my BE project.
I guess it has to be done using MATLAB. The software should recognize the type of music being played, i.e., is it sad or happy music? Is the project feasible? And if it is, can you please throw some light on how to go about it?
It is not possible to directly determine sad vs. happy. Instead you need to look at tempo and other things you can measure, then create a set of rules to decide what is sad or happy.
There is work on recognizing facial expressions (happy, sad, angry, depressed, etc.) and maybe voice. I don't have any references, but recently saw some examples. Looking at the methods used there may give you some hints.
Sounds incredibly ambitious but interesting. Simplifying as suggested would probably be a lot more feasible. BPM should give you tempo, assuming you can figure out BPM from the music. For example, how do you do this with, say, classical chamber music with no drums? You may need to put some constraints in place to make it easier.
You could just focus on Russian composers, like Shostakovich; sad would be the default answer. Sousa would be happy. But what about American spirituals (e.g., Lena Horne) or Mozart's Piano Concerto No. 23? Can you get people to agree on whether they are happy or sad?
In reality, I think the real problem will be to define what is sad and what is happy. If humans can't agree on the mood, you will have an impossible time getting an instrument to make the decision.
Perhaps, if you can determine the key of the music with your analysis, that would be enough, but still plenty challenging.
The first step in doing this is getting the tempo of the song, which is done by detecting the beats per minute. I've found a couple of algorithms for this, but they were a bit complex. Can anybody provide me with a much simpler algorithm?
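For what it's worth, here is about the simplest tempo estimator I can think of: a short-time energy envelope, onset strength from its positive differences, then autocorrelation to find the beat period. It's sketched in Python/NumPy purely for illustration (the function name, frame sizes, and BPM range are my own choices, not anything standard); each step translates directly to MATLAB.

```python
import numpy as np

def estimate_bpm(x, fs, frame=1024, hop=128, bpm_lo=60, bpm_hi=180):
    """Crude tempo estimate: energy envelope -> onset strength ->
    autocorrelation, picking the lag with the strongest periodicity."""
    # 1. Short-time energy envelope
    n = (len(x) - frame) // hop
    env = np.array([np.sum(x[i * hop : i * hop + frame] ** 2)
                    for i in range(n)])
    # 2. Onset strength: keep only increases in energy
    onset = np.maximum(np.diff(env), 0.0)
    onset -= onset.mean()
    # 3. Autocorrelation of the onset signal (zero lag at index 0)
    ac = np.correlate(onset, onset, mode='full')[len(onset) - 1:]
    env_fs = fs / hop                       # envelope frames per second
    lag_lo = int(round(env_fs * 60.0 / bpm_hi))
    lag_hi = int(round(env_fs * 60.0 / bpm_lo))
    lag = lag_lo + np.argmax(ac[lag_lo : lag_hi + 1])
    return 60.0 * env_fs / lag

# Quick check with a synthetic click track at 8 kHz
fs = 8000
x = np.zeros(10 * fs)
x[:: fs // 2] = 1.0        # one click every 0.5 s, i.e. 120 BPM
print(round(estimate_bpm(x, fs)))   # should come out near 120
```

The lag resolution is only as fine as the hop size, so the answer is quantized (here it lands within a BPM or two of 120). It also only works when the beat actually shows up as energy bumps, which is exactly the chamber-music objection raised above.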
Wherever I've searched, I've found that the algorithm given in the following file has been used, but I'm finding it difficult to get hold of it. Can anybody help me understand it? The link to the file is:
I've got some idea of how to find the tempo. Here I am listing the various steps:
1. First, the music has to be broken down into six frequency bands,
2. then envelopes of all six signals have to be constructed in the time domain,
3. then the beats are found by passing each envelope through a differentiator,
4. and finally the signals should be convolved with comb filters of different tempos.
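The four steps above can be sketched roughly like this. This is only an illustration in Python/SciPy, not the implementation from the file mentioned: the band edges, filter orders, 10 Hz envelope cutoff, and the shift-and-add "comb" scoring are all my own simplifications.

```python
import numpy as np
from scipy.signal import butter, lfilter

def tempo_comb(x, fs, bpms=range(90, 161, 2)):
    # Illustrative band edges, not values from any particular paper
    edges = [0, 200, 400, 800, 1600, 3200, 0.99 * fs / 2]
    scores = np.zeros(len(bpms))
    for lo, hi in zip(edges[:-1], edges[1:]):
        # 1. Split off one subband (low-pass for the bottom band)
        if lo == 0:
            b, a = butter(2, hi / (fs / 2), btype='lowpass')
        else:
            b, a = butter(2, [lo / (fs / 2), hi / (fs / 2)],
                          btype='bandpass')
        band = lfilter(b, a, x)
        # 2. Time-domain envelope: full-wave rectify + 10 Hz low-pass
        bl, al = butter(2, 10 / (fs / 2), btype='lowpass')
        env = lfilter(bl, al, np.abs(band))
        # 3. Differentiate and half-wave rectify -> onset strength
        onset = np.maximum(np.diff(env), 0.0)
        # 4. "Comb": add four copies of the onset signal, each delayed
        #    by one beat period; a matching tempo lines the peaks up
        for i, bpm in enumerate(bpms):
            period = int(round(fs * 60.0 / bpm))
            n = len(onset) - 3 * period
            if n <= 0:
                continue
            acc = sum(onset[k * period : k * period + n] for k in range(4))
            scores[i] += np.mean(acc ** 2)
    return list(bpms)[int(np.argmax(scores))]

# Quick check with a synthetic 120 BPM click track at 8 kHz
fs = 8000
x = np.zeros(10 * fs)
x[:: fs // 2] = 1.0            # one unit impulse every 0.5 s
print(tempo_comb(x, fs))       # should land at or very near 120
```

Note the candidate range is deliberately kept under one octave (90 to 160 BPM); a comb at 60 BPM lines up just as well with a 120 BPM click track, so half and double tempos are inherently ambiguous with this approach.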
My first doubt is: how do I break a sound signal into different subbands in MATLAB?
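One common way is a bank of Butterworth filters: a low-pass for the bottom band, band-passes in between, and a high-pass on top. Here is a sketch in Python/SciPy (the edge frequencies and filter order are arbitrary choices of mine); MATLAB's Signal Processing Toolbox has the equivalent `butter` and `filter` functions, so the same structure carries over directly.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def subbands(x, fs, edges=(200, 400, 800, 1600, 3200)):
    """Split x into len(edges)+1 bands: low-pass below edges[0],
    band-passes between consecutive edges, high-pass above edges[-1]."""
    nyq = fs / 2
    bands = []
    sos = butter(4, edges[0] / nyq, btype='lowpass', output='sos')
    bands.append(sosfilt(sos, x))
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo / nyq, hi / nyq], btype='bandpass',
                     output='sos')
        bands.append(sosfilt(sos, x))
    sos = butter(4, edges[-1] / nyq, btype='highpass', output='sos')
    bands.append(sosfilt(sos, x))
    return bands

# Sanity check: a 1 kHz sine should land mostly in the 800-1600 Hz band
fs = 8000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 1000 * t)
energies = [np.sum(b ** 2) for b in subbands(x, fs)]
print(int(np.argmax(energies)))   # prints 3, the 800-1600 Hz band
```

Second-order-section (`sos`) filtering is used because high-order band-pass filters in transfer-function form can be numerically unstable; in MATLAB the analogous route is `butter(...,'bandpass')` with `sosfilt`-style filtering or simply `filter` at low orders.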