Instructions to use wukevin/tcr-bert-mlm-only with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use wukevin/tcr-bert-mlm-only with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="wukevin/tcr-bert-mlm-only")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("wukevin/tcr-bert-mlm-only")
model = AutoModelForMaskedLM.from_pretrained("wukevin/tcr-bert-mlm-only")
```
- Notebooks
- Google Colab
- Kaggle
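As a hedged illustration (not taken from the model card): TCR-BERT operates on amino-acid sequences, and assuming the tokenizer treats each residue as a separate token, a CDR3 sequence can be masked at a chosen position before being passed to the fill-mask pipeline above. The example sequence and the `mask_position` helper are hypothetical, for demonstration only:

```python
def mask_position(seq: str, i: int, mask_token: str = "[MASK]") -> str:
    """Space-separate a sequence of single-letter residues and mask position i.

    Assumes one token per residue and a BERT-style "[MASK]" token; verify
    both against tokenizer.mask_token for this checkpoint before relying on it.
    """
    tokens = list(seq)
    tokens[i] = mask_token
    return " ".join(tokens)


# Hypothetical CDR3 beta-chain sequence, masking residue 5:
masked = mask_position("CASSPVTGGIYGYTF", 5)
print(masked)  # C A S S P [MASK] T G G I Y G Y T F
# The result could then be scored with: pipe(masked)
```

The space-joined form matters because feeding the raw string would otherwise let a wordpiece tokenizer merge adjacent residues into multi-letter subword tokens.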
```json
{
  "datasets": [
    "VDJdb",
    "PIRD"
  ],
  "bert": "bert",
  "config": "/home/groups/jamesz/wukevin/projects/tcr/model_configs/bert_reduced_intermediate_pe.json",
  "outdir": "bert_reduced_intermediate_pe_50_epochs_256_bs_5e-05_lr_0.0_warmup_VDJdb_PIRD",
  "epochs": 50,
  "bs": 256,
  "lr": 5e-05,
  "warmup": 0.0,
  "cpu": false,
  "holdout": 0.1,
  "noneptune": false
}
```
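The JSON above records the training-run settings for this checkpoint. As a minimal sketch, it can be parsed with the standard library to recover the key hyperparameters; note that the field meanings used in the comments (`bs` as batch size, `lr` as learning rate, `holdout` as the held-out fraction) are assumptions inferred from the field names, not documented by the model card:

```python
import json

# The training-config fragment shown above, verbatim.
raw = """
{
  "datasets": ["VDJdb", "PIRD"],
  "bert": "bert",
  "config": "/home/groups/jamesz/wukevin/projects/tcr/model_configs/bert_reduced_intermediate_pe.json",
  "outdir": "bert_reduced_intermediate_pe_50_epochs_256_bs_5e-05_lr_0.0_warmup_VDJdb_PIRD",
  "epochs": 50,
  "bs": 256,
  "lr": 5e-05,
  "warmup": 0.0,
  "cpu": false,
  "holdout": 0.1,
  "noneptune": false
}
"""

cfg = json.loads(raw)

# Assumed meanings: bs = batch size, lr = learning rate, holdout = eval fraction.
print(f"Pretraining corpora: {', '.join(cfg['datasets'])}")
print(f"{cfg['epochs']} epochs, batch size {cfg['bs']}, learning rate {cfg['lr']}")
```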