{ "instances":[ { "text": "Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet…) for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with over 32+ pretrained models in 100+ languages and deep interoperability between TensorFlow 2.0 and PyTorch.", "questions":[ "How many pretrained models are available in Transformers?", "What does Transformers provide?", "Transformers provides interoperability between which frameworks?" ] } ] }