Amini Labs
committed on
End of training
Browse files
- README.md +253 -0
- config.json +50 -0
- model.safetensors +3 -0
- training_args.bin +3 -0
README.md
ADDED
@@ -0,0 +1,253 @@
---
tags:
- generated_from_trainer
model-index:
- name: SatPatchTST_large1000_V1.1.0_pretrained
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# SatPatchTST_large1000_V1.1.0_pretrained

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0211

## Model description

More information needed

## Intended uses & limitations

More information needed
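As a starting point, the checkpoint can be loaded with the `PatchTSTForPretraining` class from Transformers. This is a minimal sketch only: the repo id below is a placeholder (this commit does not state the published model id), and the random tensor just illustrates the expected `(batch, context_length, num_input_channels)` input shape from config.json.

```python
# Minimal loading sketch. Assumption: the checkpoint is published under a
# repo id like the placeholder below; substitute the real one.
import torch
from transformers import PatchTSTForPretraining

model = PatchTSTForPretraining.from_pretrained(
    "amini-labs/SatPatchTST_large1000_V1.1.0_pretrained"  # hypothetical repo id
)

# config.json specifies context_length=512 and num_input_channels=10,
# so pretraining inputs are windows of shape (batch, 512, 10).
past_values = torch.randn(1, 512, 10)
with torch.no_grad():
    outputs = model(past_values=past_values)
print(outputs.loss)  # MSE over the randomly masked patches
```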
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (mirrored in the sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1000
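A hedged sketch of how these values map onto `TrainingArguments`. The output directory and evaluation strategy are assumptions (per-epoch evaluation is inferred from the results table below), and the listed Adam betas/epsilon are the Trainer defaults, so they need no explicit flags.

```python
# Sketch only: reconstructs the listed hyperparameters; output_dir and the
# evaluation strategy are placeholders/assumptions, not taken from this card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="SatPatchTST_large1000_V1.1.0_pretrained",  # assumption
    learning_rate=5e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1000,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch table below
)
# Adam(betas=(0.9, 0.999), eps=1e-08) is the default optimizer,
# so no extra arguments are needed for it.
```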
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-------:|:---------------:|
| 0.1715 | 1.0 | 10797 | 0.0864 |
| 0.0751 | 2.0 | 21594 | 0.0652 |
| 0.0598 | 3.0 | 32391 | 0.0568 |
| 0.0534 | 4.0 | 43188 | 0.0516 |
| 0.0496 | 5.0 | 53985 | 0.0519 |
| 0.047 | 6.0 | 64782 | 0.0463 |
| 0.0449 | 7.0 | 75579 | 0.0432 |
| 0.0432 | 8.0 | 86376 | 0.0438 |
| 0.0418 | 9.0 | 97173 | 0.0409 |
| 0.0407 | 10.0 | 107970 | 0.0405 |
| 0.0397 | 11.0 | 118767 | 0.0405 |
| 0.0386 | 12.0 | 129564 | 0.0400 |
| 0.0377 | 13.0 | 140361 | 0.0375 |
| 0.0369 | 14.0 | 151158 | 0.0386 |
| 0.0364 | 15.0 | 161955 | 0.0364 |
| 0.0358 | 16.0 | 172752 | 0.0380 |
| 0.0351 | 17.0 | 183549 | 0.0369 |
| 0.0348 | 18.0 | 194346 | 0.0354 |
| 0.0344 | 19.0 | 205143 | 0.0345 |
| 0.0339 | 20.0 | 215940 | 0.0371 |
| 0.0335 | 21.0 | 226737 | 0.0359 |
| 0.0332 | 22.0 | 237534 | 0.0335 |
| 0.0328 | 23.0 | 248331 | 0.0360 |
| 0.0325 | 24.0 | 259128 | 0.0349 |
| 0.0322 | 25.0 | 269925 | 0.0333 |
| 0.0318 | 26.0 | 280722 | 0.0339 |
| 0.0315 | 27.0 | 291519 | 0.0338 |
| 0.0312 | 28.0 | 302316 | 0.0323 |
| 0.0309 | 29.0 | 313113 | 0.0316 |
| 0.0308 | 30.0 | 323910 | 0.0323 |
| 0.0305 | 31.0 | 334707 | 0.0302 |
| 0.0302 | 32.0 | 345504 | 0.0304 |
| 0.0299 | 33.0 | 356301 | 0.0310 |
| 0.0298 | 34.0 | 367098 | 0.0336 |
| 0.0295 | 35.0 | 377895 | 0.0323 |
| 0.0293 | 36.0 | 388692 | 0.0288 |
| 0.0291 | 37.0 | 399489 | 0.0296 |
| 0.0288 | 38.0 | 410286 | 0.0297 |
| 0.0287 | 39.0 | 421083 | 0.0285 |
| 0.0285 | 40.0 | 431880 | 0.0303 |
| 0.0283 | 41.0 | 442677 | 0.0293 |
| 0.0282 | 42.0 | 453474 | 0.0279 |
| 0.028 | 43.0 | 464271 | 0.0305 |
| 0.0279 | 44.0 | 475068 | 0.0292 |
| 0.0276 | 45.0 | 485865 | 0.0317 |
| 0.0275 | 46.0 | 496662 | 0.0285 |
| 0.0273 | 47.0 | 507459 | 0.0279 |
| 0.0272 | 48.0 | 518256 | 0.0313 |
| 0.0271 | 49.0 | 529053 | 0.0305 |
| 0.027 | 50.0 | 539850 | 0.0290 |
| 0.0269 | 51.0 | 550647 | 0.0291 |
| 0.0268 | 52.0 | 561444 | 0.0319 |
| 0.0266 | 53.0 | 572241 | 0.0268 |
| 0.0265 | 54.0 | 583038 | 0.0274 |
| 0.0264 | 55.0 | 593835 | 0.0275 |
| 0.0263 | 56.0 | 604632 | 0.0272 |
| 0.0261 | 57.0 | 615429 | 0.0263 |
| 0.0261 | 58.0 | 626226 | 0.0262 |
| 0.0259 | 59.0 | 637023 | 0.0305 |
| 0.0259 | 60.0 | 647820 | 0.0276 |
| 0.0258 | 61.0 | 658617 | 0.0264 |
| 0.0257 | 62.0 | 669414 | 0.0261 |
| 0.0257 | 63.0 | 680211 | 0.0261 |
| 0.0255 | 64.0 | 691008 | 0.0255 |
| 0.0254 | 65.0 | 701805 | 0.0261 |
| 0.0253 | 66.0 | 712602 | 0.0261 |
| 0.0252 | 67.0 | 723399 | 0.0288 |
| 0.0251 | 68.0 | 734196 | 0.0260 |
| 0.025 | 69.0 | 744993 | 0.0288 |
| 0.025 | 70.0 | 755790 | 0.0248 |
| 0.025 | 71.0 | 766587 | 0.0252 |
| 0.0248 | 72.0 | 777384 | 0.0370 |
| 0.0248 | 73.0 | 788181 | 0.0251 |
| 0.0247 | 74.0 | 798978 | 0.0247 |
| 0.0246 | 75.0 | 809775 | 0.0249 |
| 0.0246 | 76.0 | 820572 | 0.0278 |
| 0.0245 | 77.0 | 831369 | 0.0267 |
| 0.0244 | 78.0 | 842166 | 0.0254 |
| 0.0244 | 79.0 | 852963 | 0.0248 |
| 0.0243 | 80.0 | 863760 | 0.0275 |
| 0.0243 | 81.0 | 874557 | 0.0249 |
| 0.0242 | 82.0 | 885354 | 0.0252 |
| 0.0241 | 83.0 | 896151 | 0.0255 |
| 0.0241 | 84.0 | 906948 | 0.0249 |
| 0.024 | 85.0 | 917745 | 0.0241 |
| 0.024 | 86.0 | 928542 | 0.0253 |
| 0.0239 | 87.0 | 939339 | 0.0244 |
| 0.0238 | 88.0 | 950136 | 0.0241 |
| 0.0238 | 89.0 | 960933 | 0.0264 |
| 0.0238 | 90.0 | 971730 | 0.0245 |
| 0.0238 | 91.0 | 982527 | 0.0238 |
| 0.0236 | 92.0 | 993324 | 0.0315 |
| 0.0236 | 93.0 | 1004121 | 0.0267 |
| 0.0236 | 94.0 | 1014918 | 0.0247 |
| 0.0235 | 95.0 | 1025715 | 0.0269 |
| 0.0234 | 96.0 | 1036512 | 0.0237 |
| 0.0234 | 97.0 | 1047309 | 0.0245 |
| 0.0234 | 98.0 | 1058106 | 0.0238 |
| 0.0233 | 99.0 | 1068903 | 0.0235 |
| 0.0233 | 100.0 | 1079700 | 0.0235 |
| 0.0232 | 101.0 | 1090497 | 0.0247 |
| 0.0232 | 102.0 | 1101294 | 0.0234 |
| 0.0231 | 103.0 | 1112091 | 0.0247 |
| 0.0231 | 104.0 | 1122888 | 0.0247 |
| 0.0231 | 105.0 | 1133685 | 0.0259 |
| 0.0231 | 106.0 | 1144482 | 0.0251 |
| 0.023 | 107.0 | 1155279 | 0.0235 |
| 0.023 | 108.0 | 1166076 | 0.0238 |
| 0.0229 | 109.0 | 1176873 | 0.0239 |
| 0.0229 | 110.0 | 1187670 | 0.0227 |
| 0.0228 | 111.0 | 1198467 | 0.0232 |
| 0.0228 | 112.0 | 1209264 | 0.0232 |
| 0.0228 | 113.0 | 1220061 | 0.0308 |
| 0.0227 | 114.0 | 1230858 | 0.0239 |
| 0.0227 | 115.0 | 1241655 | 0.0260 |
| 0.0227 | 116.0 | 1252452 | 0.0230 |
| 0.0227 | 117.0 | 1263249 | 0.0234 |
| 0.0226 | 118.0 | 1274046 | 0.0228 |
| 0.0226 | 119.0 | 1284843 | 0.0228 |
| 0.0225 | 120.0 | 1295640 | 0.0229 |
| 0.0225 | 121.0 | 1306437 | 0.0231 |
| 0.0225 | 122.0 | 1317234 | 0.0225 |
| 0.0225 | 123.0 | 1328031 | 0.0234 |
| 0.0224 | 124.0 | 1338828 | 0.0254 |
| 0.0224 | 125.0 | 1349625 | 0.0228 |
| 0.0223 | 126.0 | 1360422 | 0.0225 |
| 0.0224 | 127.0 | 1371219 | 0.0231 |
| 0.0223 | 128.0 | 1382016 | 0.0234 |
| 0.0222 | 129.0 | 1392813 | 0.0237 |
| 0.0222 | 130.0 | 1403610 | 0.0225 |
| 0.0222 | 131.0 | 1414407 | 0.0227 |
| 0.0222 | 132.0 | 1425204 | 0.0227 |
| 0.0221 | 133.0 | 1436001 | 0.0255 |
| 0.0222 | 134.0 | 1446798 | 0.0220 |
| 0.0221 | 135.0 | 1457595 | 0.0227 |
| 0.0221 | 136.0 | 1468392 | 0.0222 |
| 0.022 | 137.0 | 1479189 | 0.0223 |
| 0.022 | 138.0 | 1489986 | 0.0222 |
| 0.0219 | 139.0 | 1500783 | 0.0222 |
| 0.0219 | 140.0 | 1511580 | 0.0246 |
| 0.022 | 141.0 | 1522377 | 0.0226 |
| 0.0219 | 142.0 | 1533174 | 0.0219 |
| 0.0219 | 143.0 | 1543971 | 0.0241 |
| 0.0219 | 144.0 | 1554768 | 0.0219 |
| 0.0219 | 145.0 | 1565565 | 0.0220 |
| 0.0218 | 146.0 | 1576362 | 0.0228 |
| 0.0218 | 147.0 | 1587159 | 0.0254 |
| 0.0218 | 148.0 | 1597956 | 0.0217 |
| 0.0217 | 149.0 | 1608753 | 0.0226 |
| 0.0217 | 150.0 | 1619550 | 0.0221 |
| 0.0217 | 151.0 | 1630347 | 0.0220 |
| 0.0217 | 152.0 | 1641144 | 0.0219 |
| 0.0216 | 153.0 | 1651941 | 0.0277 |
| 0.0216 | 154.0 | 1662738 | 0.0232 |
| 0.0216 | 155.0 | 1673535 | 0.0263 |
| 0.0217 | 156.0 | 1684332 | 0.0241 |
| 0.0216 | 157.0 | 1695129 | 0.0217 |
| 0.0215 | 158.0 | 1705926 | 0.0221 |
| 0.0215 | 159.0 | 1716723 | 0.0217 |
| 0.0215 | 160.0 | 1727520 | 0.0220 |
| 0.0215 | 161.0 | 1738317 | 0.0214 |
| 0.0214 | 162.0 | 1749114 | 0.0219 |
| 0.0214 | 163.0 | 1759911 | 0.0309 |
| 0.0214 | 164.0 | 1770708 | 0.0216 |
| 0.0214 | 165.0 | 1781505 | 0.0312 |
| 0.0213 | 166.0 | 1792302 | 0.0221 |
| 0.0213 | 167.0 | 1803099 | 0.0215 |
| 0.0214 | 168.0 | 1813896 | 0.0216 |
| 0.0214 | 169.0 | 1824693 | 0.0236 |
| 0.0213 | 170.0 | 1835490 | 0.0212 |
| 0.0213 | 171.0 | 1846287 | 0.0214 |
| 0.0213 | 172.0 | 1857084 | 0.0230 |
| 0.0213 | 173.0 | 1867881 | 0.0292 |
| 0.0212 | 174.0 | 1878678 | 0.0219 |
| 0.0212 | 175.0 | 1889475 | 0.0217 |
| 0.0212 | 176.0 | 1900272 | 0.0213 |
| 0.0212 | 177.0 | 1911069 | 0.0211 |
| 0.0212 | 178.0 | 1921866 | 0.0358 |
| 0.0211 | 179.0 | 1932663 | 0.0248 |
| 0.0211 | 180.0 | 1943460 | 0.0210 |
| 0.0211 | 181.0 | 1954257 | 0.0221 |
| 0.0211 | 182.0 | 1965054 | 0.0217 |
| 0.0211 | 183.0 | 1975851 | 0.0249 |
| 0.0211 | 184.0 | 1986648 | 0.0209 |
| 0.021 | 185.0 | 1997445 | 0.0212 |
| 0.021 | 186.0 | 2008242 | 0.0231 |
| 0.021 | 187.0 | 2019039 | 0.0214 |
| 0.021 | 188.0 | 2029836 | 0.0214 |
| 0.0209 | 189.0 | 2040633 | 0.0237 |
| 0.021 | 190.0 | 2051430 | 0.0212 |
| 0.0209 | 191.0 | 2062227 | 0.0209 |
| 0.0209 | 192.0 | 2073024 | 0.0210 |
| 0.0209 | 193.0 | 2083821 | 0.0216 |
| 0.0209 | 194.0 | 2094618 | 0.0212 |
| 0.0208 | 195.0 | 2105415 | 0.0268 |
| 0.0208 | 196.0 | 2116212 | 0.0209 |
| 0.0209 | 197.0 | 2127009 | 0.0246 |
| 0.0209 | 198.0 | 2137806 | 0.0209 |
| 0.0208 | 199.0 | 2148603 | 0.0210 |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
config.json
ADDED
@@ -0,0 +1,50 @@
{
  "activation_function": "gelu",
  "architectures": [
    "PatchTSTForPretraining"
  ],
  "attention_dropout": 0.0,
  "bias": true,
  "channel_attention": false,
  "channel_consistent_masking": false,
  "context_length": 512,
  "d_model": 128,
  "distribution_output": "student_t",
  "do_mask_input": true,
  "dropout": 0.0,
  "ff_dropout": 0.0,
  "ffn_dim": 512,
  "head_dropout": 0.0,
  "init_std": 0.02,
  "loss": "mse",
  "mask_type": "random",
  "mask_value": 0,
  "model_type": "patchtst",
  "norm_eps": 1e-05,
  "norm_type": "batchnorm",
  "num_attention_heads": 4,
  "num_forecast_mask_patches": [
    2
  ],
  "num_hidden_layers": 3,
  "num_input_channels": 10,
  "num_parallel_samples": 100,
  "num_targets": 1,
  "output_range": null,
  "patch_length": 12,
  "patch_stride": 12,
  "path_dropout": 0.0,
  "pooling_type": "mean",
  "positional_dropout": 0.0,
  "positional_encoding_type": "sincos",
  "pre_norm": true,
  "prediction_length": 24,
  "random_mask_ratio": 0.4,
  "scaling": "std",
  "share_embedding": true,
  "share_projection": true,
  "torch_dtype": "float32",
  "transformers_version": "4.38.2",
  "unmasked_channel_indices": null,
  "use_cls_token": true
}
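The config pins down the geometry: a 512-step context is cut into non-overlapping 12-step patches (`patch_stride` equals `patch_length`), 40% of which are masked at random during pretraining. A small sketch, assuming this config.json has been saved locally, that rebuilds the architecture from the config alone:

```python
# Sketch: instantiate the architecture from the config file; this yields
# randomly initialized weights, not the trained checkpoint.
from transformers import PatchTSTConfig, PatchTSTForPretraining

config = PatchTSTConfig.from_json_file("config.json")
model = PatchTSTForPretraining(config)

patches_per_channel = config.context_length // config.patch_length
print(patches_per_channel)  # 42 non-overlapping patches per channel
```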
model.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ea27eee9eb6fa226f1633a0412b77668330990dc6d8d0232b25bda565e986c85
size 2429008
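This file is a Git LFS pointer, not the weights themselves: only the hash and size live in the git history, and the ~2.4 MB safetensors blob is fetched from LFS storage on download. A hedged sketch using `huggingface_hub` (the repo id is a placeholder, as above):

```python
# Sketch: resolve the LFS pointer to the actual weights file.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="amini-labs/SatPatchTST_large1000_V1.1.0_pretrained",  # hypothetical
    filename="model.safetensors",
)
print(path)  # local cache path to the ~2.4 MB weights
```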
training_args.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eacf20385539a114220b95df5acac8ff2c393914a0c939da07ada77b1959e2b0
size 4920