e5base-ATM-Avg-v1

This is a sentence-transformers model finetuned from intfloat/e5-base-v2. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: intfloat/e5-base-v2
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • Model Size: ~109M parameters (F32, safetensors)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
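
The pipeline above is a BERT encoder followed by attention-masked mean pooling and L2 normalization. For illustration only, the same embedding can be reproduced with plain transformers; this is a minimal sketch that assumes the repository's weights load through AutoModel (as they do for BertModel checkpoints saved by Sentence Transformers).

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Sketch: reproduce Transformer -> mean pooling -> Normalize with plain transformers.
tokenizer = AutoTokenizer.from_pretrained("jdaviescmg/e5base-ATM-Avg-v1")
encoder = AutoModel.from_pretrained("jdaviescmg/e5base-ATM-Avg-v1")

texts = ["Hi", "At-the-market offering disclosure text"]
batch = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling: average token embeddings, ignoring padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# L2 normalization, matching the Normalize() module above.
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([2, 768])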

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("jdaviescmg/e5base-ATM-Avg-v1")
# Run inference
sentences = [
    'Hi',
    '☐ Item 1.01 Entry into a Material Definitive Agreement.\n\nOn\nAugust 21, 2024, Lexaria Bioscience Corp. (the “Company”) entered into a\nCapital on Demand™ Sales Agreement (the “Sales Agreement”) with JonesTrading\nInstitutional Services LLC (the “Agent”), pursuant to which the Company may\nissue and sell, from time to time, up to $20,000,000 in aggregate principal\namount of shares (the “Shares”) of the Company’s common stock, par value\n$0.001 per share, through or to the Agent, as the Company’s sales agent or\nprincipal.\n\nAny Shares to be offered and sold under the Sales Agreement will be\nissued and sold by methods deemed to be an “at-the-market offering” as defined\nin Rule 415(a)(4) promulgated under the Securities Act of 1933, as amended\n(the “Act”), or in negotiated transactions, if authorized by the Company.\n\nSubject to the terms of the Sales Agreement, the Agent will use reasonable\nefforts to sell the Shares from time to time, based upon the Company’s\ninstructions (including any price, time, or size limits or other customary\nparameters or conditions the Company may impose).\n\nThe Company cannot provide\nany assurances that it will issue any Shares pursuant to the Sales Agreement.The Company will pay the Agent a commission of 3.0% of the gross sales price\nof the Shares sold pursuant to the Sales Agreement, if any.\n\nThe Company has\nagreed to reimburse the Agent for certain specified expenses as provided in\nthe Sales Agreement and has also agreed to provide the Agent with customary\nindemnification and contribution rights in respect of certain liabilities,\nincluding liabilities under the Act.\n\nThe Sales Agreement also contains\ncustomary representations, warranties and covenants.The offering of the\nShares will terminate upon the earliest of (a) the issuance and sale of all of\nthe Shares by the Agent on the terms and subject to the conditions set forth\nin the Sales Agreement or (b) the termination of the Sales Agreement by either\nof the parties thereto.',
    'Note 9 – Employee Benefit Plans The Company maintains defined\ncontribution benefit plans under Section 401(k) of the Internal Revenue Code\ncovering substantially all qualified employees of the Company (the “401(k)\nPlan”).\n\nUnder the 401(k) Plan, the Company may make discretionary\ncontributions of up to 100 % of employee contributions.\n\nFor the six months\nended June 30, 2024 and 2023, the Company made contributions to the 401(k)\nPlan of $ 109,000 and $ 95,000 , respectively.Note 10 – Liquidity The Company\nfollows “ Presentation of Financial Statements—Going Concern (Subtopic\n205-40): Disclosure of Uncertainties about an Entity’s Ability to Continue as\na Going Concern ”.\n\nThe Company’s financial statements have been prepared\nassuming that it will continue as a going concern, which contemplates\ncontinuity of operations, realization of assets, and liquidation of\nliabilities in the normal course of business.\n\nAs reflected in the financial\nstatements, the Company has historically incurred a net loss and has an\naccumulated deficit of approximately $ 133,148,000 at June 30, 2024, and net\ncash used in operating activities of approximately $ 1,693,000 for the\nreporting period then ended.\n\nThe Company is implementing its business plan and\ngenerating revenue; however, the Company’s cash position and liquid crypto\nassets are sufficient to support its daily operations over the next twelve\nmonths.Our Form S-3 expired on August 14, 2024.\n\nThe Company filed a new Form\nS-3 on February 14, 2024.\n\nAs a result of SEC comments, the new Form S-3 has\nnot yet gone effective and therefore we may not sell shares under the ATM\nAgreement.Note 11 – Subsequent Events The Company evaluates events that have\noccurred after the balance sheet date but before the financial statements are\nissued.\n\nBased upon the evaluation, the Company did not identify any recognized\nor non-recognized subsequent events that would have required adjustment or\ndisclosure in the financial statements other than disclosed.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
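
For semantic search over a small candidate set, the same model can rank passages against a query by cosine similarity (its configured similarity function). A minimal sketch; the query and passage strings below are placeholders, not part of the training data.

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jdaviescmg/e5base-ATM-Avg-v1")

# Placeholder query and candidate passages; substitute your own documents.
query = "Did the company enter into an at-the-market sales agreement?"
passages = [
    "The Company entered into a Capital on Demand Sales Agreement with JonesTrading.",
    "The Company maintains defined contribution benefit plans under Section 401(k).",
]

query_emb = model.encode([query])
passage_embs = model.encode(passages)

# Cosine similarity between the query and each passage (embeddings are L2-normalized).
scores = model.similarity(query_emb, passage_embs)  # shape: (1, 2)
best = scores.argmax().item()
print(passages[best], scores[0, best].item())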

Evaluation

Metrics

Custom Triplet

  • Dataset: dim_768
  • Evaluated with main.CustomTripletEvaluator
Metric              Value
cosine_accuracy     0.715
dot_accuracy        0.285
manhattan_accuracy  0.7
euclidean_accuracy  0.715
max_accuracy        0.715

Custom Triplet

  • Dataset: dim_512
  • Evaluated with main.CustomTripletEvaluator
Metric              Value
cosine_accuracy     0.715
dot_accuracy        0.285
manhattan_accuracy  0.7
euclidean_accuracy  0.715
max_accuracy        0.715

Custom Triplet

  • Dataset: dim_256
  • Evaluated with main.CustomTripletEvaluator
Metric              Value
cosine_accuracy     0.715
dot_accuracy        0.285
manhattan_accuracy  0.7
euclidean_accuracy  0.715
max_accuracy        0.715

Custom Triplet

  • Dataset: dim_128
  • Evaluated with main.CustomTripletEvaluator
Metric              Value
cosine_accuracy     0.715
dot_accuracy        0.285
manhattan_accuracy  0.7
euclidean_accuracy  0.715
max_accuracy        0.715

Custom Triplet

  • Dataset: dim_64
  • Evaluated with main.CustomTripletEvaluator
Metric              Value
cosine_accuracy     0.715
dot_accuracy        0.285
manhattan_accuracy  0.7
euclidean_accuracy  0.715
max_accuracy        0.715
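
The CustomTripletEvaluator above is project-specific and is not published with this card. As a rough stand-in, the built-in TripletEvaluator from sentence-transformers reports the same style of metric: the fraction of triplets whose anchor is closer to the positive than to the negative. A minimal sketch with placeholder triplets:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("jdaviescmg/e5base-ATM-Avg-v1")

# Placeholder triplets: each anchor should sit closer to its positive than its negative.
anchors = ["Hi", "Hi"]
positives = [
    "The Company entered into an At-the-Market Offering Sales Agreement with its sales agent.",
    "Shares may be sold from time to time under the ATM Agreement through the Agent.",
]
negatives = [
    "The Company maintains defined contribution benefit plans under Section 401(k).",
    "The Company evaluates events that have occurred after the balance sheet date.",
]

evaluator = TripletEvaluator(anchors=anchors, positives=positives, negatives=negatives, name="sketch")
print(evaluator(model))  # dict of accuracy metrics; exact key names depend on the installed version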

Training Details

Training Dataset

Unnamed Dataset

  • Size: 800 training samples
  • Columns: sentence1, sentence2, and label
  • Approximate statistics based on the first 1000 samples:

             sentence1          sentence2             label
    type     string             string                int
    details  min: 3 tokens      min: 35 tokens        0: ~50.00%
             mean: 3.0 tokens   mean: 371.57 tokens   1: ~50.00%
             max: 3 tokens      max: 512 tokens
  • Samples:
    Sample 1 (label: 1)
    sentence1: Hi
    sentence2: 8. COMMON STOCK [a] Authorized 150,000,000 authorized
    common shares, par value of $ 0.001 , and 5,000,000 preferred shares, par
    value of $ 0.001 .

    [b] Issued and outstanding shares At-the-Market Sales
    AgreementOn December 21, 2021, we entered into an At-the-Market Offering
    Sales Agreement, or ATM, with Virtu Americas, LLC, as sales agent.

    The ATM was
    terminated on February 29, 2024, and no further sales of our common stock will
    be made pursuant to the ATM.

    Since entry into the ATM, through the date of
    termination of the ATM, we offered and sold an aggregate of 200,000 shares of
    our common stock.

    These aggregate sales resulted in gross proceeds to us of
    approximately $ 1.5 million.

    During the three and six months ended June 30,
    2024, we did no t sell any shares of our common stock pursuant to the ATM.May
    2023 Registered Direct Offering In May 2023, we entered into a securities
    purchase agreement with certain purchasers, pursuant to which we sold
    3,000,000 shares of common stock at a price of $ 5.50 per share in a
    registered direct offering.

    The offering of the shares was made pursuant to
    our shelf registration statement on Form S-3 including the prospectus dated
    January 5, 2022 contained therein, and the prospectus supplement dated May 25,
    2023. We received approximately $ 15.3 million in net proceeds from the
    registered direct offering after deducting placement agent fees and offering
    expenses.February 2024 Registered Direct Offering and Concurrent Private
    PlacementIn February 2024, we entered into a securities purchase agreement
    with certain purchasers, pursuant to which we sold 13,086,151 shares of common
    stock at a price of $ 4.585 per share in a registered direct offering.

    The
    offering of the shares was made pursuant to our shelf registration statement
    on Form S-3, including the prospectus dated January 5, 2022 contained therein,
    and the prospectus supplement dated February 28, 2024.

    Sample 2 (label: 0)
    sentence1: Hi
    sentence2: The foregoing description of the Note does not purport to be complete and is
    subject to, and is qualified in its entirety by reference to, the full text of
    the Note, which is attached as Exhibit 10.1 to this Current Report on Form
    8-K, and is incorporated herein by reference.Item 2.03.

    Creation of a Direct
    Financial Obligation or an Obligation under an Off-Balance Sheet Arrangement
    of a Registrant.

    The disclosure provided in Item 1.01 of this Current Report
    on Form 8-K is hereby incorporated by reference into this Item 2.03.Item
    8.01.Other Events.

    The Company is supplementing the Company’s risk factors in
    its Annual Report on Form 10-K filed with the SEC on March 29, 2024, and
    Quarterly Reports on Form 10-Q for the quarters ended March 31, 2024 and June
    30, 2024, filed with the SEC on May 10, 2024 and August 14, 2024,
    respectively, with the risk factor set forth below.Servicing our debt will
    require a significant amount of cash, and we may not have sufficient cash flow
    from our business to pay our debt.

    Our ability to make scheduled payments of
    the principal of, to pay interest on or to refinance our indebtedness depends
    on our future performance, which is subject to economic, financial,
    competitive and other factors beyond our control.

    We had, as of June 30, 2024,
    approximately (i) $16.1 million in working capital, (ii) $2.4 million in cash
    and cash equivalents, and (iii) $13.6 million of outstanding indebtedness, net
    of discounts.

    In addition, on August 15, 2024, we amended and restated the
    unsecured promissory note and guaranty previously issued to JXVII Trust that
    increased the principal amount from $7.6 million to $10.0 million.

    Sample 3 (label: 1)
    sentence1: Hi
    sentence2: The Company
    incurred costs of approximately $0.9 million related to the execution of the
    Purchase Agreement.

    Of the total costs incurred, approximately $0.6 million
    was paid in Common Stock to Lincoln Park as a commitment fee and $ 0.03
    million to reimburse Lincoln Park for expenses.

    These transaction costs were
    included in other income / (expenses), net in the consolidated statement of
    operations.

    Approximately $ 0.2 million was incurred for legal fees, which
    were included in administrative and selling expenses on the consolidated
    statement of operations.During the year ended December 31, 2023, the Company
    issued and sold an aggregate of 293,509 shares pursuant to the Purchase
    Agreement and received net proceeds of $ 5.5 million.During the year ended
    December 31, 2023, the Company incurred approximately $ 0.3 million of
    expenses, related to the discount on the issuance of common stock to Lincoln
    Park, which is included in other income / (expenses), net in the consolidated
    statement of operations.

    As the Company’s common stock price is below $15.00
    per share, the Company is unable to utilize the facility.At the Market
    Offering Agreement On June 2, 2023, the Company entered into an At The Market
    Offering Agreement (the “ATM Agreement”) with H.C. Wainwright & Co., LLC, as
    sales agent (the “Agent”), to create an at-the-market equity program under
    which it may sell up to $50 million of shares of the Company’s common stock
    (the “Shares”) from time to time through the Agent (the “ATM Offering”).

    Under
    the ATM Agreement, the Agent will be entitled to a commission at a fixed rate
    of 3.0 % of the gross proceeds from each sale of Shares under the ATM
    Agreement.
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "CustomContrastiveLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
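
Because the model was trained with MatryoshkaLoss at dimensions 768, 512, 256, 128, and 64, embeddings can be truncated to any of these sizes; in the triplet evaluation above the reported accuracy is the same at every dimension. A minimal sketch using the truncate_dim option of SentenceTransformer (available in recent sentence-transformers releases):

from sentence_transformers import SentenceTransformer

# Truncate output embeddings to one of the Matryoshka dimensions listed above.
model = SentenceTransformer("jdaviescmg/e5base-ATM-Avg-v1", truncate_dim=256)

embeddings = model.encode(["Hi", "At-the-market offering disclosure text"])
print(embeddings.shape)  # (2, 256)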
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 16
  • learning_rate: 4e-05
  • num_train_epochs: 10
  • warmup_ratio: 0.05
  • use_mps_device: True
  • optim: adamw_hf
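
A rough sketch of this training setup follows. The CustomContrastiveLoss wrapped by MatryoshkaLoss is not published with this card, so the built-in ContrastiveLoss stands in for it here; the dataset rows are placeholders in the same (sentence1, sentence2, label) layout as the training set, and the hyperparameters are the non-default values listed above.

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import ContrastiveLoss, MatryoshkaLoss

model = SentenceTransformer("intfloat/e5-base-v2")

# Placeholder pairs; label 1 = related pair, label 0 = unrelated pair.
train_dataset = Dataset.from_dict({
    "sentence1": ["Hi", "Hi"],
    "sentence2": ["placeholder ATM offering disclosure text", "placeholder unrelated filing text"],
    "label": [1, 0],
})

# ContrastiveLoss is a stand-in for the unpublished CustomContrastiveLoss.
base_loss = ContrastiveLoss(model)
loss = MatryoshkaLoss(model, base_loss, matryoshka_dims=[768, 512, 256, 128, 64])

args = SentenceTransformerTrainingArguments(
    output_dir="e5base-ATM-Avg-v1-sketch",
    num_train_epochs=10,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=4e-5,
    warmup_ratio=0.05,
    eval_strategy="epoch",
    optim="adamw_hf",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,  # placeholder; a held-out split would be used in practice
    loss=loss,
)
trainer.train()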

All Hyperparameters

Click to expand
  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • learning_rate: 4e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.05
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: True
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_hf
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Cosine accuracy is reported per Matryoshka dimension.

Epoch  Step  Training Loss  dim_128  dim_256  dim_512  dim_64  dim_768
0.64   1     -              0.695    0.695    0.695    0.695   0.695
1.92   3     -              0.715    0.715    0.715    0.715   0.715
2.56   4     -              0.715    0.715    0.715    0.715   0.715
3.84   6     -              0.71     0.71     0.71     0.71    0.71
4.48   7     -              0.725    0.725    0.725    0.725   0.725
5.76   9     -              0.72     0.72     0.72     0.72    0.72
6.4    10    0.1105         0.715    0.715    0.715    0.715   0.715

Framework Versions

  • Python: 3.12.5
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.4.1
  • Accelerate: 0.34.2
  • Datasets: 2.19.1
  • Tokenizers: 0.19.1
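
To match this environment, the listed versions can be pinned at install time (newer releases will usually also work):

pip install sentence-transformers==3.0.1 transformers==4.41.2 torch==2.4.1 accelerate==0.34.2 datasets==2.19.1 tokenizers==0.19.1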

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning}, 
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}