---
title: Organization card
emoji: π
colorFrom: purple
colorTo: red
sdk: static
pinned: false
---
**TempestTeam**

**Mission:**

We aim to efficiently train large-scale State Space Models (SSMs) while significantly reducing infrastructure usage. Our goal is to minimize economic and environmental impact without substantially compromising linguistic performance.

**Model:**

**Tempest-LLM** – an efficient language model based on **Mamba2**, leveraging advanced compression methods to achieve an encoding efficiency of **1.58 bits per parameter**.
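The card does not spell out which compression method is used; as an illustrative sketch only, the snippet below shows ternary ("1.58-bit") weight quantization in the style of BitNet b1.58, where each weight takes one of three values and log2(3) ≈ 1.58 bits suffice to encode it. The function name and the absmean per-tensor scale are assumptions, not Tempest-LLM's actual scheme.

```python
import numpy as np

def ternary_quantize(w: np.ndarray):
    """Quantize a weight tensor to {-1, 0, +1} with one scale per tensor.

    Illustrative sketch of 1.58-bit ("ternary") quantization; the absmean
    scale follows the BitNet b1.58 recipe, not necessarily Tempest-LLM's.
    """
    scale = np.mean(np.abs(w)) + 1e-8          # per-tensor scaling factor
    q = np.clip(np.round(w / scale), -1, 1)    # round, then clamp to {-1, 0, +1}
    return q.astype(np.int8), float(scale)

# Approximate dequantization is simply q * scale.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)
q, scale = ternary_quantize(w)
```

Because every quantized weight is -1, 0, or +1, matrix multiplies reduce to additions and subtractions, which is where the infrastructure savings come from.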

**Training Approach:**

Our model benefits from a balanced multilingual training strategy, ensuring equal proficiency in:

- 🇫🇷 **French**
- 🇬🇧 **English**
- 🇪🇸 **Spanish**

This multilingual training enhances linguistic versatility and cultural adaptability across different languages and contexts.

**Impact:**

- **Economic:** Reduced computational infrastructure leads to lower operational costs.
- **Ecological:** Lower power consumption and minimal infrastructure requirements decrease the environmental footprint.
- **Performance:** Maintains robust linguistic accuracy and fluency despite compression and optimization.

**Vision:**

TempestTeam is committed to showing that linguistic AI technologies can be both powerful and sustainable, contributing responsibly to AI innovation.