hazemKrimi | 059202cd1e | Parse the response from the model properly and stream it to the client | 2025-01-24 00:53:49 +01:00
hazemKrimi | ec1e3fa810 | Tweaks | 2025-01-21 18:17:50 +01:00
hazemKrimi | b7ad987a49 | Using channels wip | 2025-01-21 02:00:53 +01:00
hazemKrimi | d564a21db4 | Attempt at controlling formatting constraints | 2025-01-18 18:25:45 +01:00
hazemKrimi | 3247c0132a | Testing languages | 2025-01-17 23:10:34 +01:00
hazemKrimi | 03656acb6f | Clean LLM output WIP | 2025-01-17 22:14:03 +01:00
hazemKrimi | 5c94a354d8 | Some improvements | 2025-01-17 02:36:18 +01:00
hazemKrimi | dd3333ba9a | Add cursor and format code properly | 2025-01-15 01:45:14 +01:00
hazemKrimi | fc07f39796 | Consume the api locally | 2025-01-13 21:32:20 +01:00
hazemKrimi | 754bc1f7d8 | Endpoint for streaming generated code | 2025-01-13 19:59:00 +01:00
hazemKrimi | 20f16ea4f5 | chore: Initial commit | 2024-12-15 16:06:21 +01:00