SentenceTransformer based on NeuML/pubmedbert-base-embeddings

This is a sentence-transformers model finetuned from NeuML/pubmedbert-base-embeddings. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: NeuML/pubmedbert-base-embeddings
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
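
These values can be checked directly on the loaded model (an optional sanity check using standard sentence-transformers attributes):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("praphul555/jeda-stage-1")
print(model.max_seq_length)                      # 512
print(model.get_sentence_embedding_dimension())  # 768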

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
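
As configured above, the Pooling module mean-pools the token embeddings (the CLS, max, and weighted-mean modes are all disabled). A minimal sketch of that operation, independent of the library internals:

import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 768); attention_mask: (batch, seq_len), 1 for real tokens
    mask = attention_mask.unsqueeze(-1).float()
    summed = (token_embeddings * mask).sum(dim=1)   # sum embeddings of non-padding tokens
    counts = mask.sum(dim=1).clamp(min=1e-9)        # number of non-padding tokens per sentence
    return summed / counts                          # (batch, 768) sentence embeddings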

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("praphul555/jeda-stage-1")
# Run inference
sentences = [
    "COMMAND: Have lab draw blood today per ordered tests.\nCONTEXT: get a little blood work today they're gonna get you to x-ray and lab before you leave",
    'blood draw, venipuncture (Charge)',
    'Rocephin*',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000,  0.8369, -0.0363],
#         [ 0.8369,  1.0000, -0.0708],
#         [-0.0363, -0.0708,  1.0000]])

Training Details

Training Dataset

Unnamed Dataset

  • Size: 329,355 training samples
  • Columns: text1 and text2
  • Approximate statistics based on the first 1000 samples:
    • text1 (string): min: 8 tokens, mean: 37.3 tokens, max: 106 tokens
    • text2 (string): min: 3 tokens, mean: 12.95 tokens, max: 24 tokens
  • Samples:
    • Sample 1
      • text1: COMMAND: Please arrange transport to radiology now and let them know we're sending him for a right foot/toe x-ray with weight-bearing views.
        CONTEXT: wheel him over to x-ray x-ray right foot complete with weight-bearing views go tell the x-ray lady
      • text2: Radiology Transfer Communication
    • Sample 2
      • text1: COMMAND: Please arrange transport to radiology now and let them know we're sending him for a right foot/toe x-ray with weight-bearing views. Radiology Transfer Communication
        CONTEXT: wheel him over to x-ray x-ray right foot complete with weight-bearing views go tell the x-ray lady
        REASONING: Doctor instructs staff to transport patient to x-ray and communicate exam details.
      • text2: Radiology Transfer Communication
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
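
    A minimal sketch, assuming the setup implied above, of constructing this loss for the base model. Each (text1, text2) pair acts as a positive, and the other text2 values in the same batch serve as in-batch negatives:

    from sentence_transformers import SentenceTransformer, util
    from sentence_transformers.losses import MultipleNegativesRankingLoss

    model = SentenceTransformer("NeuML/pubmedbert-base-embeddings")
    loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)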
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 64
  • learning_rate: 2e-05
  • num_train_epochs: 5
  • warmup_ratio: 0.1
  • seed: 13
  • batch_sampler: no_duplicates
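
A hedged sketch of how these non-default values map onto SentenceTransformerTrainingArguments (the output directory is illustrative):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="jeda-stage-1",                   # illustrative path
    per_device_train_batch_size=64,
    learning_rate=2e-5,
    num_train_epochs=5,
    warmup_ratio=0.1,
    seed=13,
    batch_sampler=BatchSamplers.NO_DUPLICATES,   # matches batch_sampler: no_duplicates
)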

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 13
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0097 50 2.3103
0.0194 100 1.9798
0.0291 150 1.6487
0.0389 200 1.3829
0.0486 250 1.25
0.0583 300 1.1482
0.0680 350 1.0997
0.0777 400 1.0484
0.0874 450 0.9522
0.0971 500 0.9385
0.1069 550 0.8914
0.1166 600 0.86
0.1263 650 0.8825
0.1360 700 0.8217
0.1457 750 0.8102
0.1554 800 0.7831
0.1651 850 0.796
0.1749 900 0.7542
0.1846 950 0.775
0.1943 1000 0.7437
0.2040 1050 0.7237
0.2137 1100 0.6945
0.2234 1150 0.6979
0.2331 1200 0.6834
0.2429 1250 0.7149
0.2526 1300 0.6582
0.2623 1350 0.6437
0.2720 1400 0.6213
0.2817 1450 0.6087
0.2914 1500 0.6225
0.3011 1550 0.5579
0.3109 1600 0.6206
0.3206 1650 0.5787
0.3303 1700 0.5721
0.3400 1750 0.5695
0.3497 1800 0.5395
0.3594 1850 0.5476
0.3691 1900 0.5556
0.3789 1950 0.5628
0.3886 2000 0.5241
0.3983 2050 0.5457
0.4080 2100 0.5339
0.4177 2150 0.5429
0.4274 2200 0.5421
0.4371 2250 0.5149
0.4469 2300 0.5015
0.4566 2350 0.5005
0.4663 2400 0.5149
0.4760 2450 0.5004
0.4857 2500 0.4852
0.4954 2550 0.5316
0.5051 2600 0.5227
0.5149 2650 0.5138
0.5246 2700 0.4744
0.5343 2750 0.4885
0.5440 2800 0.5036
0.5537 2850 0.5077
0.5634 2900 0.4669
0.5731 2950 0.4682
0.5829 3000 0.4588
0.5926 3050 0.4567
0.6023 3100 0.4671
0.6120 3150 0.5114
0.6217 3200 0.4715
0.6314 3250 0.4353
0.6412 3300 0.46
0.6509 3350 0.4525
0.6606 3400 0.4633
0.6703 3450 0.4344
0.6800 3500 0.4566
0.6897 3550 0.4643
0.6994 3600 0.4615
0.7092 3650 0.4387
0.7189 3700 0.4145
0.7286 3750 0.4646
0.7383 3800 0.4831
0.7480 3850 0.444
0.7577 3900 0.4412
0.7674 3950 0.4407
0.7772 4000 0.4383
0.7869 4050 0.4403
0.7966 4100 0.4674
0.8063 4150 0.4477
0.8160 4200 0.4619
0.8257 4250 0.4368
0.8354 4300 0.4531
0.8452 4350 0.4409
0.8549 4400 0.4456
0.8646 4450 0.4312
0.8743 4500 0.4233
0.8840 4550 0.4134
0.8937 4600 0.3193
0.9034 4650 0.2839
0.9132 4700 0.2286
0.9229 4750 0.2572
0.9326 4800 0.2896
0.9423 4850 0.1615
0.9520 4900 0.2984
0.9617 4950 0.1891
0.9714 5000 0.2552
0.9812 5050 0.2165
0.9909 5100 0.2774
1.0006 5150 0.2737
1.0103 5200 0.447
1.0200 5250 0.4317
1.0297 5300 0.3798
1.0394 5350 0.4063
1.0492 5400 0.4231
1.0589 5450 0.4202
1.0686 5500 0.3911
1.0783 5550 0.3807
1.0880 5600 0.3979
1.0977 5650 0.3908
1.1074 5700 0.4167
1.1172 5750 0.3885
1.1269 5800 0.3992
1.1366 5850 0.4102
1.1463 5900 0.3949
1.1560 5950 0.4066
1.1657 6000 0.3871
1.1754 6050 0.3925
1.1852 6100 0.3785
1.1949 6150 0.4529
1.2046 6200 0.4188
1.2143 6250 0.4844
1.2240 6300 0.4171
1.2337 6350 0.4001
1.2434 6400 0.3992
1.2532 6450 0.4167
1.2629 6500 0.4395
1.2726 6550 0.4
1.2823 6600 0.3905
1.2920 6650 0.3769
1.3017 6700 0.3846
1.3114 6750 0.4
1.3212 6800 0.4062
1.3309 6850 0.3972
1.3406 6900 0.3875
1.3503 6950 0.3958
1.3600 7000 0.3843
1.3697 7050 0.4004
1.3794 7100 0.4435
1.3892 7150 0.3856
1.3989 7200 0.3843
1.4086 7250 0.3777
1.4183 7300 0.4103
1.4280 7350 0.3795
1.4377 7400 0.3719
1.4474 7450 0.3938
1.4572 7500 0.4058
1.4669 7550 0.3913
1.4766 7600 0.3992
1.4863 7650 0.3743
1.4960 7700 0.4072
1.5057 7750 0.3788
1.5154 7800 0.3987
1.5252 7850 0.3774
1.5349 7900 0.3803
1.5446 7950 0.3582
1.5543 8000 0.4222
1.5640 8050 0.4001
1.5737 8100 0.3857
1.5834 8150 0.3819
1.5932 8200 0.3643
1.6029 8250 0.3884
1.6126 8300 0.3761
1.6223 8350 0.4295
1.6320 8400 0.4073
1.6417 8450 0.3963
1.6514 8500 0.389
1.6612 8550 0.3677
1.6709 8600 0.4012
1.6806 8650 0.3732
1.6903 8700 0.3793
1.7000 8750 0.3712
1.7097 8800 0.3734
1.7194 8850 0.3895
1.7292 8900 0.3667
1.7389 8950 0.3832
1.7486 9000 0.3842
1.7583 9050 0.3822
1.7680 9100 0.3706
1.7777 9150 0.3699
1.7874 9200 0.3738
1.7972 9250 0.3748
1.8069 9300 0.3911
1.8166 9350 0.366
1.8263 9400 0.3626
1.8360 9450 0.3762
1.8457 9500 0.3711
1.8554 9550 0.3568
1.8652 9600 0.3877
1.8749 9650 0.3744
1.8846 9700 0.3858
1.8943 9750 0.2191
1.9040 9800 0.1622
1.9137 9850 0.13
1.9235 9900 0.359
1.9332 9950 0.1739
1.9429 10000 0.2212
1.9526 10050 0.2445
1.9623 10100 0.2059
1.9720 10150 0.2288
1.9817 10200 0.1985
1.9915 10250 0.182
2.0012 10300 0.2609
2.0109 10350 0.3533
2.0206 10400 0.3322
2.0303 10450 0.3565
2.0400 10500 0.3454
2.0497 10550 0.3623
2.0595 10600 0.3685
2.0692 10650 0.3468
2.0789 10700 0.3448
2.0886 10750 0.3524
2.0983 10800 0.3691
2.1080 10850 0.3505
2.1177 10900 0.3253
2.1275 10950 0.3422
2.1372 11000 0.3321
2.1469 11050 0.3392
2.1566 11100 0.3292
2.1663 11150 0.3572
2.1760 11200 0.3483
2.1857 11250 0.3535
2.1955 11300 0.3559
2.2052 11350 0.3331
2.2149 11400 0.3367
2.2246 11450 0.3538
2.2343 11500 0.3458
2.2440 11550 0.3197
2.2537 11600 0.3587
2.2635 11650 0.3565
2.2732 11700 0.3533
2.2829 11750 0.3191
2.2926 11800 0.3591
2.3023 11850 0.3598
2.3120 11900 0.3495
2.3217 11950 0.353
2.3315 12000 0.3329
2.3412 12050 0.3365
2.3509 12100 0.3246
2.3606 12150 0.3377
2.3703 12200 0.3392
2.3800 12250 0.3546
2.3897 12300 0.3452
2.3995 12350 0.3403
2.4092 12400 0.3473
2.4189 12450 0.336
2.4286 12500 0.3591
2.4383 12550 0.3425
2.4480 12600 0.3293
2.4577 12650 0.3339
2.4675 12700 0.3386
2.4772 12750 0.3335
2.4869 12800 0.3249
2.4966 12850 0.3123
2.5063 12900 0.3182
2.5160 12950 0.3282
2.5257 13000 0.317
2.5355 13050 0.3177
2.5452 13100 0.3075
2.5549 13150 0.3349
2.5646 13200 0.3543
2.5743 13250 0.3228
2.5840 13300 0.3334
2.5937 13350 0.3364
2.6035 13400 0.333
2.6132 13450 0.3633
2.6229 13500 0.3547
2.6326 13550 0.3431
2.6423 13600 0.3265
2.6520 13650 0.3197
2.6617 13700 0.3233
2.6715 13750 0.3293
2.6812 13800 0.3249
2.6909 13850 0.3041
2.7006 13900 0.3612
2.7103 13950 0.3391
2.7200 14000 0.324
2.7297 14050 0.3114
2.7395 14100 0.3365
2.7492 14150 0.2987
2.7589 14200 0.3233
2.7686 14250 0.3221
2.7783 14300 0.3348
2.7880 14350 0.3231
2.7977 14400 0.3407
2.8075 14450 0.3017
2.8172 14500 0.3264
2.8269 14550 0.3349
2.8366 14600 0.3217
2.8463 14650 0.2965
2.8560 14700 0.322
2.8657 14750 0.3195
2.8755 14800 0.3021
2.8852 14850 0.299
2.8949 14900 0.1857
2.9046 14950 0.1839
2.9143 15000 0.1171
2.9240 15050 0.1275
2.9337 15100 0.1814
2.9435 15150 0.1778
2.9532 15200 0.142
2.9629 15250 0.2545
2.9726 15300 0.1202
2.9823 15350 0.132
2.9920 15400 0.154
3.0017 15450 0.2622
3.0115 15500 0.3185
3.0212 15550 0.293
3.0309 15600 0.3164
3.0406 15650 0.2934
3.0503 15700 0.3005
3.0600 15750 0.3017
3.0697 15800 0.2965
3.0795 15850 0.309
3.0892 15900 0.3056
3.0989 15950 0.3318
3.1086 16000 0.3094
3.1183 16050 0.3041
3.1280 16100 0.2981
3.1378 16150 0.316
3.1475 16200 0.3086
3.1572 16250 0.3062
3.1669 16300 0.3069
3.1766 16350 0.312
3.1863 16400 0.3161
3.1960 16450 0.3059
3.2058 16500 0.2899
3.2155 16550 0.312
3.2252 16600 0.3189
3.2349 16650 0.3152
3.2446 16700 0.2998
3.2543 16750 0.301
3.2640 16800 0.3129
3.2738 16850 0.2955
3.2835 16900 0.2923
3.2932 16950 0.3111
3.3029 17000 0.3097
3.3126 17050 0.3045
3.3223 17100 0.296
3.3320 17150 0.3086
3.3418 17200 0.2902
3.3515 17250 0.322
3.3612 17300 0.3105
3.3709 17350 0.3048
3.3806 17400 0.2853
3.3903 17450 0.2795
3.4000 17500 0.2933
3.4098 17550 0.2834
3.4195 17600 0.3
3.4292 17650 0.2998
3.4389 17700 0.2972
3.4486 17750 0.285
3.4583 17800 0.2888
3.4680 17850 0.293
3.4778 17900 0.2941
3.4875 17950 0.3
3.4972 18000 0.3022
3.5069 18050 0.3049
3.5166 18100 0.3067
3.5263 18150 0.2934
3.5360 18200 0.312
3.5458 18250 0.2823
3.5555 18300 0.2746
3.5652 18350 0.2971
3.5749 18400 0.2827
3.5846 18450 0.2718
3.5943 18500 0.2908
3.6040 18550 0.2911
3.6138 18600 0.3008
3.6235 18650 0.3058
3.6332 18700 0.304
3.6429 18750 0.284
3.6526 18800 0.3037
3.6623 18850 0.2768
3.6720 18900 0.3287
3.6818 18950 0.2768
3.6915 19000 0.316
3.7012 19050 0.2786
3.7109 19100 0.2746
3.7206 19150 0.2794
3.7303 19200 0.2869
3.7400 19250 0.2836
3.7498 19300 0.2982
3.7595 19350 0.3143
3.7692 19400 0.2942
3.7789 19450 0.2693
3.7886 19500 0.2894
3.7983 19550 0.3009
3.8080 19600 0.2893
3.8178 19650 0.2915
3.8275 19700 0.2991
3.8372 19750 0.2857
3.8469 19800 0.3028
3.8566 19850 0.3068
3.8663 19900 0.2955
3.8760 19950 0.3119
3.8858 20000 0.3364
3.8955 20050 0.0993
3.9052 20100 0.1208
3.9149 20150 0.1015
3.9246 20200 0.1422
3.9343 20250 0.1879
3.9440 20300 0.1437
3.9538 20350 0.1556
3.9635 20400 0.1279
3.9732 20450 0.1384
3.9829 20500 0.1556
3.9926 20550 0.1508
4.0023 20600 0.1812
4.0120 20650 0.2858
4.0218 20700 0.2807
4.0315 20750 0.3016
4.0412 20800 0.2611
4.0509 20850 0.3031
4.0606 20900 0.2772
4.0703 20950 0.2776
4.0800 21000 0.2556
4.0898 21050 0.2744
4.0995 21100 0.2825
4.1092 21150 0.2664
4.1189 21200 0.2772
4.1286 21250 0.2767
4.1383 21300 0.2562
4.1480 21350 0.256
4.1578 21400 0.2824
4.1675 21450 0.2762
4.1772 21500 0.2766
4.1869 21550 0.291
4.1966 21600 0.2636
4.2063 21650 0.2751
4.2160 21700 0.2739
4.2258 21750 0.2982
4.2355 21800 0.2881
4.2452 21850 0.2687
4.2549 21900 0.2644
4.2646 21950 0.2827
4.2743 22000 0.2591
4.2840 22050 0.2645
4.2938 22100 0.2786
4.3035 22150 0.2693
4.3132 22200 0.2909
4.3229 22250 0.2838
4.3326 22300 0.2901
4.3423 22350 0.2629
4.3520 22400 0.2672
4.3618 22450 0.2962
4.3715 22500 0.2742
4.3812 22550 0.2811
4.3909 22600 0.2639
4.4006 22650 0.244
4.4103 22700 0.2866
4.4201 22750 0.2968
4.4298 22800 0.2828
4.4395 22850 0.2515
4.4492 22900 0.282
4.4589 22950 0.282
4.4686 23000 0.2776
4.4783 23050 0.2795
4.4881 23100 0.2701
4.4978 23150 0.2808
4.5075 23200 0.2651
4.5172 23250 0.2631
4.5269 23300 0.2911
4.5366 23350 0.2615
4.5463 23400 0.2772
4.5561 23450 0.2826
4.5658 23500 0.2797
4.5755 23550 0.2954
4.5852 23600 0.2816
4.5949 23650 0.2889
4.6046 23700 0.2647
4.6143 23750 0.2882
4.6241 23800 0.2709
4.6338 23850 0.2794
4.6435 23900 0.2702
4.6532 23950 0.2527
4.6629 24000 0.2642
4.6726 24050 0.2808
4.6823 24100 0.2764
4.6921 24150 0.2583
4.7018 24200 0.2286
4.7115 24250 0.2707
4.7212 24300 0.2793
4.7309 24350 0.2593
4.7406 24400 0.2779
4.7503 24450 0.3168
4.7601 24500 0.2943
4.7698 24550 0.3078
4.7795 24600 0.2735
4.7892 24650 0.2846
4.7989 24700 0.2571
4.8086 24750 0.2785
4.8183 24800 0.2753
4.8281 24850 0.2943
4.8378 24900 0.264
4.8475 24950 0.2962
4.8572 25000 0.2743
4.8669 25050 0.2748
4.8766 25100 0.3039
4.8863 25150 0.2817
4.8961 25200 0.1467
4.9058 25250 0.1224
4.9155 25300 0.0547
4.9252 25350 0.1329
4.9349 25400 0.086
4.9446 25450 0.1423
4.9543 25500 0.0783
4.9641 25550 0.1377
4.9738 25600 0.0743
4.9835 25650 0.0879
4.9932 25700 0.1108

Framework Versions

  • Python: 3.11.13
  • Sentence Transformers: 5.0.0
  • Transformers: 4.56.2
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.8.1
  • Datasets: 4.0.0
  • Tokenizers: 0.22.1
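
To approximate this environment, the listed versions can be pinned at install time (a sketch; install a PyTorch build matching your CUDA setup separately):

pip install "sentence-transformers==5.0.0" "transformers==4.56.2" "accelerate==1.8.1" "datasets==4.0.0" "tokenizers==0.22.1"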

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}