Instructions to use cl-nagoya/sup-simcse-ja-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- sentence-transformers
How to use cl-nagoya/sup-simcse-ja-base with sentence-transformers:
```python
from sentence_transformers import SentenceTransformer

# Load the model and encode a batch of sentences.
model = SentenceTransformer("cl-nagoya/sup-simcse-ja-base")

sentences = [
    "The weather is lovely today.",
    "It's so sunny outside!",
    "He drove to the stadium.",
]
embeddings = model.encode(sentences)

# Pairwise similarity matrix between the embeddings.
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [3, 3]
```
- Transformers
How to use cl-nagoya/sup-simcse-ja-base with Transformers:
```python
# Use a pipeline as a high-level helper.
from transformers import pipeline

pipe = pipeline("feature-extraction", model="cl-nagoya/sup-simcse-ja-base")
```

```python
# Load model directly.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("cl-nagoya/sup-simcse-ja-base")
model = AutoModel.from_pretrained("cl-nagoya/sup-simcse-ja-base")
```
- Notebooks
- Google Colab
- Kaggle
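The Transformers example above returns token-level hidden states rather than one vector per sentence. The original SimCSE recipe pools the [CLS] (first-token) vector as the sentence embedding; assuming this model follows that convention, the pooling and similarity step can be sketched as below, with a dummy array standing in for `model(**inputs).last_hidden_state`:

```python
import numpy as np

# Dummy stand-in for model(**inputs).last_hidden_state:
# shape (batch, seq_len, hidden). Real values would come from AutoModel.
rng = np.random.default_rng(0)
hidden = rng.normal(size=(3, 8, 16))

# SimCSE-style pooling: take the [CLS] (first-token) vector per sentence.
# (Assumption: sup-simcse-ja-base uses CLS pooling, as in the original SimCSE.)
embeddings = hidden[:, 0, :]  # shape (3, 16)

# Cosine similarity matrix between the three sentence embeddings.
unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
similarities = unit @ unit.T
print(similarities.shape)  # (3, 3)
```

With the real model, `hidden` would be `model(**tokenizer(sentences, padding=True, return_tensors="pt")).last_hidden_state` converted to a NumPy array; the pooling and normalization steps are unchanged.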
Training log:

epoch,step,loss,sts-dev
0,0,inf,51.375121585435735
0,32,5.8056640625,78.35678255256113
0,64,3.11572265625,82.6244856578063
0,96,2.42578125,83.35358238047021
0,128,2.176513671875,83.61539847191834
0,160,2.0302734375,81.87700855632563
0,192,1.875732421875,79.53856626810594
0,224,1.843994140625,77.77162870199022
0,256,1.773193359375,76.12193564966829
1,288,1.690185546875,75.82749644399787
1,320,1.600341796875,77.40347138209835
1,352,1.56298828125,74.4370737294954
1,384,1.570556640625,76.08974073722088
1,416,1.534912109375,73.89938923579099
1,448,1.55078125,70.69398516957513
1,480,1.509521484375,73.33702686894722
1,512,1.545654296875,75.2034184712643
1,544,1.52001953125,73.9900785391876
2,576,1.463134765625,72.0822596167315
2,608,1.376220703125,72.30434210546885
2,640,1.373046875,75.35516784317878
2,672,1.37646484375,75.00048637711306
2,704,1.370849609375,71.54454915080103
2,736,1.35595703125,72.10342263835692
2,768,1.351318359375,73.61822488939758
2,800,1.341552734375,73.69283818050053
2,832,1.350341796875,73.35254646396575
3,864,1.292724609375,71.73891113636543
3,896,1.231201171875,72.50612877979137
3,928,1.249267578125,71.73997216828496
3,960,1.23486328125,72.24572646319987
3,992,1.2470703125,72.21874965045426
3,1024,1.260498046875,71.01279770471795
3,1056,1.262451171875,71.28239282311822
3,1088,1.24609375,72.01843871324893
3,1120,1.25048828125,71.49440820181441
4,1152,1.151611328125,69.90122275478602
4,1184,1.157958984375,71.99470697954118
4,1216,1.179931640625,69.07210383354008
4,1248,1.173828125,67.60162600897702
4,1280,1.152587890625,68.50119537305783
4,1312,1.176513671875,69.20852332481955
4,1344,1.166015625,69.92232791766861
4,1376,1.1796875,71.00892896894848
5,1408,1.16357421875,72.45555003088666
5,1440,1.0804443359375,69.45015860800906
5,1472,1.0814208984375,71.00156597627819
5,1504,1.1224365234375,70.58322578457057
5,1536,1.097412109375,69.09835049406871
5,1568,1.097412109375,70.2241604764949
5,1600,1.1026611328125,71.35726715857705
5,1632,1.1053466796875,69.87296046317874
5,1664,1.11376953125,70.50921749290218
6,1696,1.092529296875,70.10898081907449
6,1728,1.0654296875,70.09196262215698
6,1760,1.068603515625,69.50168151188232
6,1792,1.0595703125,68.94335544010495
6,1824,1.048583984375,68.59140278421327
6,1856,1.0460205078125,69.62795505629208
6,1888,1.0633544921875,69.44217502657546
6,1920,1.0531005859375,69.53616592289175
6,1952,1.0594482421875,69.40128536108074
7,1984,1.044189453125,69.35274684809701
7,2016,1.013427734375,69.4939563818511
7,2048,1.0462646484375,69.50480224133872
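The dev score in the log peaks early and then declines, so the checkpoint with the best sts-dev is the one to keep. Finding it is a one-liner over the CSV; a minimal sketch, with a few rows from the log above inlined for illustration:

```python
import csv
import io

# A few rows copied from the training log above, inlined for illustration.
log = """epoch,step,loss,sts-dev
0,0,inf,51.375121585435735
0,96,2.42578125,83.35358238047021
0,128,2.176513671875,83.61539847191834
7,2048,1.0462646484375,69.50480224133872
"""

# Pick the row with the highest sts-dev score.
rows = list(csv.DictReader(io.StringIO(log)))
best = max(rows, key=lambda r: float(r["sts-dev"]))
print(best["epoch"], best["step"], best["sts-dev"])
# → 0 128 83.61539847191834
```

On the full log, the same scan selects epoch 0, step 128 (sts-dev ≈ 83.62).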