---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: text
    dtype: string
  - name: speaker_id
    dtype: string
  - name: audio
    dtype: audio
  - name: mic_id
    dtype: string
  splits:
  - name: train
    num_bytes: 16540026180.2
    num_examples: 88156
  download_size: 17595288543
  dataset_size: 16540026180.2
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
task_categories:
- text-to-speech
- automatic-speech-recognition
- text-to-audio
license: cc-by-4.0
language:
- en
size_categories:
- 10K<n<100K
---
# VCTK
This is a processed clone of the VCTK dataset with leading and trailing silence removed using Silero VAD. A fixed 25 ms of padding has been added to both ends of each audio clip to (hopefully) improve training and fine-tuning.
The original dataset is available at: https://datashare.ed.ac.uk/handle/10283/3443.
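For reference, the trimming step described above looks roughly like the sketch below. This is an illustration of the idea rather than the actual `process.py`: it assumes Silero VAD loaded via `torch.hub`, uses `torchaudio` for I/O, and interprets the 25 ms padding as keeping that much of the original audio on either side of the detected speech region.
```python
# Rough sketch of the silence trimming (assumption: not the actual process.py).
import torch
import torchaudio

PAD_MS = 25      # fixed padding kept on both ends of each clip
VAD_SR = 16000   # Silero VAD operates on 16 kHz audio

# Load Silero VAD from torch.hub; utils[0] is get_speech_timestamps.
vad_model, vad_utils = torch.hub.load("snakers4/silero-vad", "silero_vad")
get_speech_timestamps = vad_utils[0]

def trim_silence(in_path: str, out_path: str) -> None:
    wav, sr = torchaudio.load(in_path)   # VCTK clips are 48 kHz FLAC
    mono = wav.mean(dim=0)               # VAD expects a single channel

    # Run VAD on a 16 kHz copy, then map the detected sample indices
    # back to the original sample rate.
    resampled = torchaudio.functional.resample(mono, sr, VAD_SR)
    speech = get_speech_timestamps(resampled, vad_model, sampling_rate=VAD_SR)
    if not speech:
        return  # no speech detected; skip (or copy the file unchanged)

    pad = int(sr * PAD_MS / 1000)
    start = max(0, speech[0]["start"] * sr // VAD_SR - pad)
    end = min(wav.shape[1], speech[-1]["end"] * sr // VAD_SR + pad)
    torchaudio.save(out_path, wav[:, start:end], sr)
```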
## Reproducing
This repository notably lacks a requirements.txt file. There's likely a missing dependency or two, but roughly:
```
pydub
tqdm
torch
torchaudio
python-dotenv
```
are the Python packages required to clean the dataset.
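They can be installed with pip, e.g. `pip install pydub tqdm torch torchaudio python-dotenv`.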
### Steps
1. Download the VCTK dataset (version 0.92) and extract it. This should yield a `wav48_silence_trimmed` directory and a `txt` directory.
2. Run `process.py`, which will generate a `dataset` directory. This can be restarted if stopped.
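After processing (or when using this hosted copy directly), the data can be loaded with the Hugging Face `datasets` library. A minimal sketch, with a placeholder repository id that should be replaced with this dataset's actual id:
```python
# Minimal loading sketch (assumption: the repo id below is a placeholder).
from datasets import load_dataset

ds = load_dataset("user/vctk-silence-trimmed", split="train")  # placeholder id

# Columns match the card metadata: id, text, speaker_id, audio, mic_id.
example = ds[0]
print(example["speaker_id"], example["mic_id"], example["text"])

audio = example["audio"]  # decoded to {"array", "sampling_rate", "path"}
print(audio["sampling_rate"], audio["array"].shape)
```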
## Licensing Information
Public Domain, Creative Commons Attribution 4.0 International Public License ([CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode))
## Citation Information
```bibtex
@inproceedings{Veaux2017CSTRVC,
  title = {CSTR VCTK Corpus: English Multi-speaker Corpus for CSTR Voice Cloning Toolkit},
  author = {Christophe Veaux and Junichi Yamagishi and Kirsten MacDonald},
  year = 2017
}
```