musiccaps
1 row where aspect_list contains "electric guitar", "jam", "keyboard", and "live performance"
ytid: ALVS3Q_jNaU
url:
caption: This is the recording of a jazz reggae concert. There is a saxophone lead playing a solo. There is a keyboard and an electric guitar playing the main tune with the backing of a bass guitar. In the rhythmic background, there is an acoustic reggae drum beat. The atmosphere is groovy and chill. This piece could be playing in the background at a beach. It could also be included in the soundtrack of a summer/vacation/tropical themed movie.
aspect_list: ["jazz", "reggae", "live performance", "concert", "saxophone", "keyboard", "electric guitar", "bass guitar", "acoustic drums", "groovy", "relaxing", "calm", "chill", "tropical", "jam"]
audioset_names: ["Blues", "Effects unit", "Electric guitar", "Guitar", "Music", "Reggae", "Plucked string instrument", "Distortion"]
author_id: 9
start_s: 30
end_s: 40
is_balanced_subset: 1
is_audioset_eval: 1
audioset_ids: ["/m/0155w", "/m/02rr_", "/m/02sgy", "/m/0342h", "/m/04rlf", "/m/06cqb", "/m/0fx80y", "/m/0g12c5"]
CREATE TABLE [musiccaps] (
    [ytid] TEXT PRIMARY KEY,
    [url] TEXT,
    [caption] TEXT,
    [aspect_list] TEXT,
    [audioset_names] TEXT,
    [author_id] TEXT,
    [start_s] TEXT,
    [end_s] TEXT,
    [is_balanced_subset] INTEGER,
    [is_audioset_eval] INTEGER,
    [audioset_ids] TEXT
);
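A minimal sketch of how the 'aspect_list contains …' filter above can be expressed against this schema. Since aspect_list is stored as a JSON-encoded TEXT column, one simple (if approximate) approach is a LIKE match on each quoted aspect string; the `contains_all` helper below is an illustrative name, not part of the dataset or Datasette.

```python
import sqlite3

# Recreate the musiccaps schema from above in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE [musiccaps] (
    [ytid] TEXT PRIMARY KEY, [url] TEXT, [caption] TEXT,
    [aspect_list] TEXT, [audioset_names] TEXT, [author_id] TEXT,
    [start_s] TEXT, [end_s] TEXT,
    [is_balanced_subset] INTEGER, [is_audioset_eval] INTEGER,
    [audioset_ids] TEXT
)
""")

# Insert just the columns needed to demonstrate the filter,
# using the row shown above.
conn.execute(
    "INSERT INTO musiccaps (ytid, aspect_list) VALUES (?, ?)",
    ("ALVS3Q_jNaU",
     '["jazz", "reggae", "live performance", "concert", "saxophone", '
     '"keyboard", "electric guitar", "bass guitar", "acoustic drums", '
     '"groovy", "relaxing", "calm", "chill", "tropical", "jam"]'),
)

def contains_all(conn, aspects):
    """Rows whose JSON-encoded aspect_list contains every given aspect.

    Matches the quoted string (e.g. '%"jam"%') so that substrings of
    other aspects do not produce false positives.
    """
    where = " AND ".join(["aspect_list LIKE ?"] * len(aspects))
    params = [f'%"{a}"%' for a in aspects]
    return conn.execute(
        f"SELECT ytid FROM musiccaps WHERE {where}", params
    ).fetchall()

rows = contains_all(
    conn, ["electric guitar", "jam", "keyboard", "live performance"]
)
print(rows)  # [('ALVS3Q_jNaU',)]
```

On a SQLite build with the JSON1 extension, the same filter could instead iterate `json_each(aspect_list)`, which avoids the quoted-substring approximation.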