BERT as language model
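One common way to use BERT as a language model is pseudo-log-likelihood scoring (Salazar et al., "Masked Language Model Scoring"): mask each position in turn and sum the log-probability the masked LM assigns to the true token. A minimal sketch, where `masked_lm_logprob` is a hypothetical stand-in for a real BERT query (here a toy uniform model):

```python
import math

def masked_lm_logprob(tokens, position):
    # Hypothetical stand-in: a real implementation would replace
    # tokens[position] with [MASK], run BERT, and read off the
    # log-probability of the true token. Here: a toy uniform model
    # over the distinct tokens in the sentence.
    return math.log(1.0 / len(set(tokens)))

def pseudo_log_likelihood(tokens):
    """Score a sentence by masking each position in turn and
    summing the log-probability of the true token."""
    return sum(masked_lm_logprob(tokens, i) for i in range(len(tokens)))

score = pseudo_log_likelihood(["the", "cat", "sat"])
```

Higher (less negative) scores indicate sentences the masked LM finds more plausible; the stub above only illustrates the masking loop, not real model scores.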
Compare character-level and byte-level tokenizers
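The core difference is the atomic unit: character-level tokenizers emit one token per Unicode code point, while byte-level tokenizers emit one token per UTF-8 byte, so non-ASCII characters split into several tokens. A minimal sketch of both (real tokenizers add vocabularies and merges on top of these units):

```python
def char_tokenize(text):
    # Character-level: one token per Unicode code point.
    return list(text)

def byte_tokenize(text):
    # Byte-level: one token per UTF-8 byte; multi-byte
    # characters such as 'é' split into multiple tokens.
    return list(text.encode("utf-8"))

text = "héllo"
print(len(char_tokenize(text)))  # 5 code points
print(len(byte_tokenize(text)))  # 6 bytes ('é' is two bytes in UTF-8)
```

Byte-level schemes need only a 256-entry base vocabulary and never hit out-of-vocabulary symbols, at the cost of longer sequences for non-Latin scripts.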
Knowledge-injected Pre-trained Language Model
Generating synthetic data via self-chatting
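In self-chat data generation, a single model alternates between the user and assistant roles to synthesize multi-turn dialogues from a seed prompt. A structural sketch, where `stub_model` is a hypothetical placeholder for a real LLM call:

```python
import random

def stub_model(history, role):
    # Hypothetical stand-in for an LLM call conditioned on the
    # dialogue history; a real pipeline would prompt the model
    # to continue the conversation in the given role.
    replies = {
        "user": ["Can you give an example?", "Why does that matter?"],
        "assistant": ["Sure, here is one.", "It affects downstream quality."],
    }
    return random.choice(replies[role])

def self_chat(seed_prompt, turns=4):
    """Synthesize a dialogue: start from a seed user prompt, then
    let the model play user and assistant alternately."""
    dialogue = [("user", seed_prompt)]
    for _ in range(turns):
        role = "assistant" if dialogue[-1][0] == "user" else "user"
        dialogue.append((role, stub_model(dialogue, role)))
    return dialogue
```

The resulting (role, utterance) pairs can then be filtered and used as supervised fine-tuning data; real pipelines add deduplication and quality filtering on top of this loop.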