snakers4 / silero-models
Sunday, September 20, 2020, 00:26:33
Silero Models: pre-trained enterprise-grade STT models and benchmarks, made refreshingly simple (seriously, see the benchmarks). We provide quality comparable to Google's STT (and sometimes even better), and we are not Google.
As a bonus:
All of the provided models are listed in the models.yml file. Any metadata and newer versions will be added there.
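Judging by how the snippets below access it, entries in models.yml follow a nested layout roughly like this (the layout is inferred from the code, not copied from the file — check models.yml itself for the real keys and URLs):

```yaml
# Illustrative shape only — see models.yml for the actual entries
stt_models:
  en:
    latest:
      jit: <url to the TorchScript package>
      onnx: <url to the ONNX model>
      labels: <url to the labels json>
```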
Currently we provide the following checkpoints:
| | PyTorch | ONNX | TensorFlow | Quantization | Quality | Colab |
|---|---|---|---|---|---|---|
| English (en_v1) | link | | | | | |
| German (de_v1) | link | | | | | |
| Spanish (es_v1) | link | | | | | |
Dependencies: PyTorch and OmegaConf (see the imports below).
Loading a model is as easy as cloning this repository and:

```python
import torch
from omegaconf import OmegaConf

from utils import init_jit_model  # helper shipped in this repository

models = OmegaConf.load('models.yml')
device = torch.device('cpu')  # you can use any pytorch device
model, decoder = init_jit_model(models.stt_models.en.latest.jit, device=device)
```
Or you can just use TorchHub (`torch.hub` clones the repo for you behind the scenes):
```python
import torch

device = torch.device('cpu')  # gpu also works, but our models are fast enough for CPU
model, decoder, utils = torch.hub.load(github='snakers4/silero-models',
                                       model='silero_stt',
                                       device=device,
                                       force_reload=True,
                                       language='de')
(read_batch,
 split_into_batches,
 read_audio,
 prepare_model_input) = utils  # see function signatures for details
```
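One thing `prepare_model_input` has to do is turn variable-length audio chunks into a single fixed-shape batch. The function name and its real implementation live in this repo's utils; the sketch below only illustrates the zero-padding idea with plain Python lists, and `pad_batch` is a hypothetical stand-in, not our API:

```python
# Hypothetical sketch of the padding step behind prepare_model_input:
# shorter waveforms are zero-padded to the length of the longest one,
# so the whole batch can be stacked into one tensor.

def pad_batch(waveforms):
    """Zero-pad a list of 1-D float lists to equal length."""
    max_len = max(len(w) for w in waveforms)
    return [w + [0.0] * (max_len - len(w)) for w in waveforms]

batch = pad_batch([[0.1, 0.2, 0.3], [0.5]])
# every row in `batch` now has the same length
```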
We provide our models as TorchScript packages, so you can use the deployment options PyTorch itself provides (C++, Java). See details in the example notebook.
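To see the TorchScript save/load path our packages go through, here is a minimal round-trip using only stock PyTorch; the `Toy` module is a trivial stand-in, not a Silero model, and we serialize to an in-memory buffer just to keep the sketch self-contained:

```python
import io

import torch

class Toy(torch.nn.Module):
    """Trivial stand-in module, NOT a Silero model."""
    def forward(self, x):
        return x * 2

scripted = torch.jit.script(Toy())      # compile to TorchScript
buffer = io.BytesIO()
torch.jit.save(scripted, buffer)        # same format as a .jit package on disk
buffer.seek(0)
restored = torch.jit.load(buffer, map_location='cpu')
assert torch.equal(restored(torch.ones(3)), torch.full((3,), 2.0))
```

The same serialized artifact is what the C++ (`torch::jit::load`) and Java bindings consume.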
You can run our models anywhere you can import an ONNX model or use the ONNX runtime.
Dependencies: PyTorch, onnx, onnxruntime, and OmegaConf (see the imports below).
Just clone the repo and:

```python
import json
import tempfile

import onnx
import onnxruntime
import torch
from omegaconf import OmegaConf

from utils import Decoder  # helper shipped in this repository

models = OmegaConf.load('models.yml')

with tempfile.NamedTemporaryFile('wb', suffix='.json') as f:
    torch.hub.download_url_to_file(models.stt_models.en.latest.labels,
                                   f.name,
                                   progress=True)
    with open(f.name) as labels_file:  # renamed so it does not shadow f
        labels = json.load(labels_file)
    decoder = Decoder(labels)

with tempfile.NamedTemporaryFile('wb', suffix='.model') as f:
    torch.hub.download_url_to_file(models.stt_models.en.latest.onnx,
                                   f.name,
                                   progress=True)
    onnx_model = onnx.load(f.name)
    onnx.checker.check_model(onnx_model)
    ort_session = onnxruntime.InferenceSession(f.name)
```
See details in the example notebook.
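The labels file downloaded above maps output indices to characters, and the `Decoder` turns per-frame model outputs into text. The real `Decoder` lives in this repo's utils; the snippet below is only a generic greedy CTC decoding sketch, under the assumption that index 0 is the CTC blank token (the label set and function are illustrative, not our actual implementation):

```python
# Generic greedy CTC decode: collapse repeated indices, drop blanks,
# then map the surviving indices to characters via the labels list.

def greedy_ctc_decode(frame_ids, labels, blank=0):
    out = []
    prev = None
    for idx in frame_ids:
        if idx != prev and idx != blank:
            out.append(labels[idx])
        prev = idx
    return ''.join(out)

labels = ['_', 'c', 'a', 't']  # '_' as blank; a toy label set, not ours
print(greedy_ctc_decode([1, 1, 0, 2, 2, 3], labels))  # -> cat
```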
We provide TensorFlow checkpoints, but we do not provide any related utilities.
Colab notebooks and interactive demos are on the way. In the meantime, please refer to this notebook.
Also check out our wiki.
Please refer to this wiki section.
Try our models, create an issue, join our chat, email us.
Please see our wiki and tiers for relevant information and email us.