musiccaps
1 row where aspect_list contains "instrumental" and audioset_ids = "["/m/01b9nn", "/m/01glhc", "/m/02rr_", "/m/02sgy", "/m/04rlf", "/m/0g12c5"]"
| ytid | url | caption | aspect_list | audioset_names | author_id | start_s | end_s | is_balanced_subset | is_audioset_eval | audioset_ids |
|---|---|---|---|---|---|---|---|---|---|---|
| Xus0LI3QV2A | | This clip is an instrumental. The tempo is slow and deliberate with an electric guitar playing an energetic riff. It is minimal instrumentation with no other instrument used. The electric guitar is loud, boomy, jarring, resonant, recurring, with an insistent riff. | ["instrumental", "electric guitar", "jarring", "reverberating", "boomy", "home video", "amateur recording", "vibrations", "sonic power", "amplified guitar", "resonant", "loud feed back", "distorted audio", "electrical distortion"] | ["Reverberation", "Tapping (guitar technique)", "Effects unit", "Electric guitar", "Music", "Distortion"] | 7 | 30 | 40 | 0 | 1 | ["/m/01b9nn", "/m/01glhc", "/m/02rr_", "/m/02sgy", "/m/04rlf", "/m/0g12c5"] |
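The filter above ("aspect_list contains X and audioset_ids = Y") can be reproduced against a local SQLite copy of the table. A minimal sketch, assuming the schema shown below, where aspect_list and audioset_ids are JSON arrays stored as TEXT, so a "contains" check becomes a LIKE match on the quoted element and the ids column is compared as an exact string:

```python
import json
import sqlite3

# Build an in-memory copy of the musiccaps table (schema as in this page).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE [musiccaps] ( [ytid] TEXT PRIMARY KEY, [url] TEXT, "
    "[caption] TEXT, [aspect_list] TEXT, [audioset_names] TEXT, "
    "[author_id] TEXT, [start_s] TEXT, [end_s] TEXT, "
    "[is_balanced_subset] INTEGER, [is_audioset_eval] INTEGER, "
    "[audioset_ids] TEXT )"
)
conn.execute(
    "INSERT INTO musiccaps (ytid, aspect_list, audioset_ids) VALUES (?, ?, ?)",
    (
        "Xus0LI3QV2A",
        json.dumps(["instrumental", "electric guitar"]),
        '["/m/01b9nn", "/m/01glhc", "/m/02rr_", "/m/02sgy", "/m/04rlf", "/m/0g12c5"]',
    ),
)

# aspect_list is serialized JSON, so "contains" is a LIKE on the quoted
# element; audioset_ids is matched as the exact serialized string.
rows = conn.execute(
    "SELECT ytid FROM musiccaps WHERE aspect_list LIKE ? AND audioset_ids = ?",
    (
        '%"instrumental"%',
        '["/m/01b9nn", "/m/01glhc", "/m/02rr_", "/m/02sgy", "/m/04rlf", "/m/0g12c5"]',
    ),
).fetchall()
```

The LIKE match on a quoted element is a pragmatic approximation; SQLite builds with the JSON1 extension could use `EXISTS (SELECT 1 FROM json_each(aspect_list) WHERE value = 'instrumental')` instead.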
CREATE TABLE [musiccaps] (
  [ytid] TEXT PRIMARY KEY,
  [url] TEXT,
  [caption] TEXT,
  [aspect_list] TEXT,
  [audioset_names] TEXT,
  [author_id] TEXT,
  [start_s] TEXT,
  [end_s] TEXT,
  [is_balanced_subset] INTEGER,
  [is_audioset_eval] INTEGER,
  [audioset_ids] TEXT
);
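Note that every list-valued column (aspect_list, audioset_names, audioset_ids) is a JSON array serialized as TEXT, and even start_s/end_s are TEXT rather than numeric types, so consumers must decode on the client side. A minimal sketch of decoding one fetched row, using illustrative values from the table above:

```python
import json

# Hypothetical row dict as returned by a sqlite3 cursor with row_factory;
# the values mirror the single row shown in this page.
row = {
    "ytid": "Xus0LI3QV2A",
    "aspect_list": '["instrumental", "electric guitar", "jarring"]',
    "audioset_ids": '["/m/01b9nn", "/m/04rlf"]',
    "start_s": "30",
    "end_s": "40",
}

# JSON-encoded TEXT columns decode to Python lists.
aspects = json.loads(row["aspect_list"])
ids = json.loads(row["audioset_ids"])

# start_s/end_s are numeric strings, so cast before doing arithmetic.
clip_len = int(row["end_s"]) - int(row["start_s"])  # 10-second clip
```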