musiccaps
2 rows where aspect_list contains "moderate tempo" and audioset_ids contains "/m/07brj"
ytid | url | caption | aspect_list | audioset_names | author_id | start_s | end_s | is_balanced_subset | is_audioset_eval | audioset_ids
---|---|---|---|---|---|---|---|---|---|---
aq3vov8-fw8 | | This orchestral song features flutes playing the main melody. This is backed by bass played on the cello. The percussion is played on the tambourine. This song is in a compound time signature. Toward the end of the clip, the tambourine is played loudly in a specific rhythm. | ["orchestral music", "no voices", "moderate tempo", "flute sounds", "tambourine", "flute harmony", "instrumental", "string section"] | ["Music", "Tambourine"] | 0 | 30 | 40 | 0 | 1 | ["/m/04rlf", "/m/07brj"]
kAE7Ceg4VgQ | | This is a low-quality audio clip. This song is a contemporary gospel song. This song features a male voice. This is accompanied by a tambourine and hand claps. The bass plays the root notes of the chords. The piano plays chords and fills. The sounds of the crowd are heard in this clip. The tempo of this song is moderate. | ["low quality audio", "percussion", "contemporary gospel song", "hand claps", "crowd noise", "tambourine", "bass guitar", "male voice", "piano", "moderate tempo"] | ["Singing", "Music", "Tambourine"] | 0 | 590 | 600 | 1 | 1 | ["/m/015lz1", "/m/04rlf", "/m/07brj"]
```sql
CREATE TABLE [musiccaps] (
    [ytid] TEXT PRIMARY KEY,
    [url] TEXT,
    [caption] TEXT,
    [aspect_list] TEXT,
    [audioset_names] TEXT,
    [author_id] TEXT,
    [start_s] TEXT,
    [end_s] TEXT,
    [is_balanced_subset] INTEGER,
    [is_audioset_eval] INTEGER,
    [audioset_ids] TEXT
);
```
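Because `aspect_list` and `audioset_ids` are stored as JSON-encoded text rather than relational child tables, a substring match is enough to reproduce the filter above ("moderate tempo" in `aspect_list`, "/m/07brj" in `audioset_ids`). A minimal sketch using Python's built-in `sqlite3`, with the two rows shown above (captions shortened and the `url` column left blank for brevity; the full values come from the table):

```python
import sqlite3

# In-memory copy of the musiccaps schema from the CREATE TABLE above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE musiccaps (
        ytid TEXT PRIMARY KEY, url TEXT, caption TEXT, aspect_list TEXT,
        audioset_names TEXT, author_id TEXT, start_s TEXT, end_s TEXT,
        is_balanced_subset INTEGER, is_audioset_eval INTEGER, audioset_ids TEXT
    )
""")

# The two rows from the table above (captions truncated for the example).
rows = [
    ("aq3vov8-fw8", "", "This orchestral song features flutes...",
     '["orchestral music", "no voices", "moderate tempo", "flute sounds", '
     '"tambourine", "flute harmony", "instrumental", "string section"]',
     '["Music", "Tambourine"]', "0", "30", "40", 0, 1,
     '["/m/04rlf", "/m/07brj"]'),
    ("kAE7Ceg4VgQ", "", "This is a low-quality audio clip...",
     '["low quality audio", "percussion", "contemporary gospel song", '
     '"hand claps", "crowd noise", "tambourine", "bass guitar", '
     '"male voice", "piano", "moderate tempo"]',
     '["Singing", "Music", "Tambourine"]', "0", "590", "600", 1, 1,
     '["/m/015lz1", "/m/04rlf", "/m/07brj"]'),
]
conn.executemany("INSERT INTO musiccaps VALUES (?,?,?,?,?,?,?,?,?,?,?)", rows)

# The page's filter, expressed as LIKE patterns over the JSON text columns.
matches = [r[0] for r in conn.execute(
    "SELECT ytid FROM musiccaps "
    "WHERE aspect_list LIKE '%moderate tempo%' "
    "AND audioset_ids LIKE '%/m/07brj%'"
)]
print(matches)  # both rows match
```

A plain `LIKE` is fine here because aspect names never contain quotes; for stricter matching one could use SQLite's `json_each` to unpack the arrays instead.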