cjwbw/melotts
About
MeloTTS is a high-quality multilingual text-to-speech library from MyShell.ai.
Example Output
(Inline audio player omitted; the example prediction produced a single generated audio clip.)
Performance Metrics
- Prediction Time: 2.88s
- Total Time: 2.90s
All Input Parameters
{ "text": "The field of text-to-speech has seen rapid development recently.", "speed": 1, "speaker": "EN-BR", "language": "EN" }
Input Parameters
- text — The input text to synthesize.
- speed — Speed of the output.
- speaker — For EN, choose a speaker; for other languages, leave it blank (see the sketch after this list).
- language — Language of the input text.
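As a usage note for the `speaker` rule above, here is a hedged sketch of the two cases. The English values come from the example prediction; the Spanish sentence and the `"ES"` language code are illustrative assumptions, not confirmed by this page.

```python
# English input: a speaker is selected (value taken from the example above).
english_input = {
    "text": "Hello from MeloTTS.",
    "speed": 1,
    "speaker": "EN-BR",
    "language": "EN",
}

# Non-English input: speaker left blank per the parameter description.
# The "ES" code and Spanish sentence are assumptions for illustration.
spanish_input = {
    "text": "Hola desde MeloTTS.",
    "speed": 1,
    "speaker": "",
    "language": "ES",
}
```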
Output Schema
(The output schema was not captured on this page; judging from the example output, the prediction returns a single generated audio file, which the API delivers as a URL.)
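Assuming the output is a URL to a single audio file, it can be saved locally with a few lines of standard-library Python; the `.wav` extension here is an assumption.

```python
# Hedged sketch: download the generated audio from the prediction output.
# `output` is the value returned by replicate.run() in the sketch above;
# depending on the client version it is a URL string or a file-like object
# exposing a .url attribute.
import urllib.request

audio_url = output if isinstance(output, str) else output.url
urllib.request.urlretrieve(audio_url, "melotts_output.wav")  # extension assumed
```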
Example Execution Logs
> Text split to sentences.
The field of text-to-speech has seen rapid development recently.
> ===========================
model.safetensors: 100%|█████████▉| 440M/440M [00:00<00:00, 462MB/s]
Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertForMaskedLM: ['bert.pooler.dense.bias', 'bert.pooler.dense.weight', 'cls.seq_relationship.bias', 'cls.seq_relationship.weight']
- This IS expected if you are initializing BertForMaskedLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertForMaskedLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
100%|██████████| 1/1 [00:02<00:00, 2.49s/it]
Version Details
- Version ID: 2e4d356f3715d98c183ef097ce2cf410def83ca9fbbdd5f8a32ba056123e6a6f
- Version Created: March 3, 2024
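To pin this exact version outside the Python client, the prediction can also be created through Replicate's HTTP API. Below is a sketch using only the standard library; predictions are created asynchronously, so the response carries a status to poll rather than the finished audio.

```python
# Sketch: create a prediction via Replicate's HTTP API, pinned to the
# version ID above. Requires REPLICATE_API_TOKEN in the environment.
import json
import os
import urllib.request

payload = {
    "version": "2e4d356f3715d98c183ef097ce2cf410def83ca9fbbdd5f8a32ba056123e6a6f",
    "input": {
        "text": "The field of text-to-speech has seen rapid development recently.",
        "speed": 1,
        "speaker": "EN-BR",
        "language": "EN",
    },
}
req = urllib.request.Request(
    "https://api.replicate.com/v1/predictions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer " + os.environ["REPLICATE_API_TOKEN"],
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    prediction = json.loads(resp.read())
print(prediction["id"], prediction["status"])  # poll until status == "succeeded"
```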