musiccaps
2 rows where aspect_list contains "no voices" and audioset_ids contains "/m/07brj"
| ytid | url | caption | aspect_list | audioset_names | author_id | start_s | end_s | is_balanced_subset | is_audioset_eval | audioset_ids |
|---|---|---|---|---|---|---|---|---|---|---|
| aq3vov8-fw8 | | This orchestral song features flutes playing the main melody. This is backed by bass played on the cello. The percussion is played on the tambourine. This song is in a compound time signature. Toward the end of the clip, the tambourine sound is played loud in a specific rhythm. | ["orchestral music", "no voices", "moderate tempo", "flute sounds", "tambourine", "flute harmony", "instrumental", "string section"] | ["Music", "Tambourine"] | 0 | 30 | 40 | 0 | 1 | ["/m/04rlf", "/m/07brj"] |
| vXtk2JEP0zM | | This clip features a rhythm played on a tambourine. It starts off with a galloping rhythm which is sped up. At the end, the tambourine is not struck and only the jingles are allowed to ring. There are no other instruments in this song. There are no voices in this song. This song can be played in a folk gathering. | ["tambourine rhythm", "no voices", "instrumental", "no other instruments"] | ["Music", "Tambourine", "Percussion"] | 0 | 110 | 120 | 0 | 0 | ["/m/04rlf", "/m/07brj", "/m/0l14md"] |
```sql
CREATE TABLE [musiccaps] (
    [ytid] TEXT PRIMARY KEY,
    [url] TEXT,
    [caption] TEXT,
    [aspect_list] TEXT,
    [audioset_names] TEXT,
    [author_id] TEXT,
    [start_s] TEXT,
    [end_s] TEXT,
    [is_balanced_subset] INTEGER,
    [is_audioset_eval] INTEGER,
    [audioset_ids] TEXT
);
```
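The filter behind this page ("aspect_list contains … and audioset_ids contains …") can be reproduced directly against the schema above. A minimal sketch, assuming the list-valued columns are stored as JSON-encoded text (as their TEXT type suggests), so "contains" reduces to a `LIKE` substring match; the rows are abbreviated versions of the two shown in the table:

```python
import sqlite3

# In-memory database with only the columns the filter touches.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE musiccaps (ytid TEXT PRIMARY KEY, "
    "aspect_list TEXT, audioset_ids TEXT)"
)

# Abbreviated versions of the two rows shown above; the list columns
# hold JSON-encoded text, matching the TEXT columns in the schema.
conn.executemany(
    "INSERT INTO musiccaps VALUES (?, ?, ?)",
    [
        (
            "aq3vov8-fw8",
            '["orchestral music", "no voices", "instrumental"]',
            '["/m/04rlf", "/m/07brj"]',
        ),
        (
            "vXtk2JEP0zM",
            '["tambourine rhythm", "no voices", "instrumental"]',
            '["/m/04rlf", "/m/07brj", "/m/0l14md"]',
        ),
    ],
)

# "contains" on a JSON text column expressed as a LIKE pattern.
matches = [
    ytid
    for (ytid,) in conn.execute(
        "SELECT ytid FROM musiccaps "
        "WHERE aspect_list LIKE '%no voices%' "
        "AND audioset_ids LIKE '%/m/07brj%' "
        "ORDER BY ytid"
    )
]
print(matches)  # → ['aq3vov8-fw8', 'vXtk2JEP0zM']
```

Note that a bare `LIKE '%/m/07brj%'` would also match longer IDs sharing that prefix; matching the quoted form (`'%"/m/07brj"%'`) is stricter when exactness matters.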