---
title: FischGPT API
emoji: π
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 4.12.0
app_file: app.py
pinned: false
license: mit
---
# FischGPT API Backend
This Space provides a **free API endpoint** for FischGPT-SFT, a GPT-2-style transformer built from scratch.
## API Endpoint
**URL:** `https://kristianfischerai12345-fischgpt-api.hf.space/api/predict`
## Usage Examples
### Python
```python
import requests

response = requests.post(
    "https://kristianfischerai12345-fischgpt-api.hf.space/api/predict",
    json={
        "data": [
            "Explain machine learning",  # message
            0.8,                         # temperature
            150,                         # max_length
            0.9                          # top_p
        ]
    }
)

result = response.json()
print(result["data"][0]["response"])
```
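If you call the endpoint from a script or backend, you will likely want a timeout and an HTTP error check around the same request. Below is a minimal sketch; the `query_fischgpt` helper name and the 30-second timeout are illustrative choices, not part of the API:

```python
import requests

API_URL = "https://kristianfischerai12345-fischgpt-api.hf.space/api/predict"

def query_fischgpt(message, temperature=0.8, max_length=150, top_p=0.9):
    """Send one prompt to the FischGPT Space and return the generated text."""
    response = requests.post(
        API_URL,
        json={"data": [message, temperature, max_length, top_p]},
        timeout=30,  # assumed limit; cold starts on free Spaces can be slow
    )
    response.raise_for_status()  # fail loudly on HTTP errors
    return response.json()["data"][0]["response"]

print(query_fischgpt("Explain machine learning"))
```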
### JavaScript/React
```javascript
const callFischGPT = async (message) => {
  const response = await fetch(
    "https://kristianfischerai12345-fischgpt-api.hf.space/api/predict",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        data: [message, 0.8, 150, 0.9]
      })
    }
  );
  const result = await response.json();
  return result.data[0].response;
};
```
### cURL
```bash
curl -X POST "https://kristianfischerai12345-fischgpt-api.hf.space/api/predict" \
  -H "Content-Type: application/json" \
  -d '{"data": ["Hello, how are you?", 0.8, 150, 0.9]}'
```
## Response Format
```json
{
  "data": [{
    "error": null,
    "response": "Generated text response...",
    "metadata": {
      "input_tokens": 10,
      "output_tokens": 35,
      "new_tokens": 25,
      "generation_time": 1.234,
      "tokens_per_second": 20.3,
      "model": "FischGPT-SFT",
      "parameters": {
        "temperature": 0.8,
        "max_length": 150,
        "top_p": 0.9
      }
    }
  }]
}
```
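Reading that structure from Python is straightforward. The sketch below assumes a successful HTTP call (for example via the Python snippet above), checks the `error` field, and then prints the generated text plus a few of the metadata values:

```python
result = response.json()["data"][0]

if result["error"] is not None:
    print(f"Generation failed: {result['error']}")
else:
    meta = result["metadata"]
    print(result["response"])
    print(
        f"{meta['new_tokens']} new tokens in {meta['generation_time']:.2f}s "
        f"({meta['tokens_per_second']:.1f} tokens/s, model: {meta['model']})"
    )
```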
## Parameters
- **user_message** (string): The input message
- **temperature** (float, 0.1-2.0): Sampling temperature (higher = more creative)
- **max_length** (int, 50-300): Maximum response length in tokens
- **top_p** (float, 0.1-1.0): Top-p sampling (higher = more diverse)
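To get a feel for how these parameters interact, you can send the same prompt with conservative and with more creative sampling settings and compare the outputs. The sketch below reuses the endpoint from the examples above; the prompt and the specific values are only illustrations:

```python
import requests

API_URL = "https://kristianfischerai12345-fischgpt-api.hf.space/api/predict"

def generate(message, temperature, max_length, top_p):
    # Positional order matches the parameter list above.
    r = requests.post(API_URL, json={"data": [message, temperature, max_length, top_p]})
    return r.json()["data"][0]["response"]

prompt = "Describe a sunrise over the ocean"
print("focused :", generate(prompt, temperature=0.3, max_length=150, top_p=0.5))
print("creative:", generate(prompt, temperature=1.5, max_length=150, top_p=0.95))
```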
## Model Details
- **Architecture:** GPT-2-style decoder-only transformer
- **Parameters:** ~124M (12 layers × 768 hidden × 12 heads)
- **Training:** 10B tokens pretraining + supervised fine-tuning
- **Features:** Flash attention, custom weight initialization
## Related Links
- **Model Repository:** [kristianfischerai12345/fischgpt-sft](https://huggingface.co/kristianfischerai12345/fischgpt-sft)
- **Source Code:** [GitHub](https://github.com/yourusername/FischGPT)
*Free API hosting powered by HuggingFace Spaces 🤗*