Zvonimir Sabljic
ffe4fbeba9
Enabled catching of max token limit errors from OpenAI's response
2023-09-18 19:18:54 -07:00
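
A minimal sketch of the kind of check this commit describes, assuming the API error arrives as JSON and that TokenLimitError is the project's own exception (function and field handling here are illustrative, not the actual implementation):

# Hypothetical sketch: detect a context-length error in the OpenAI API response
# and surface it as a TokenLimitError so callers can react to it.
import re

class TokenLimitError(Exception):
    pass

def check_for_token_limit_error(response_json: dict) -> None:
    error = (response_json or {}).get("error") or {}
    message = error.get("message", "")
    # OpenAI reports context overflows with code "context_length_exceeded" and a
    # message like "This model's maximum context length is 8192 tokens...".
    if error.get("code") == "context_length_exceeded" or re.search(r"maximum context length", message):
        raise TokenLimitError(message)
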
LeonOstrez
67209b5b27
merge master into debugging_ipc branch
2023-09-18 19:09:37 -07:00
Zvonimir Sabljic
0dd6b6d996
Changed the development_steps, command_runs, and user_inputs models - we don't need to hash any data; we can use it as-is
2023-09-15 17:51:24 +02:00
Zvonimir Sabljic
65135344ab
Converted colored leftovers
2023-09-15 09:43:30 +02:00
Zvonimir Sabljic
9a7c15e0c5
Merge branch 'ipc' into debugging_ipc
2023-09-14 09:40:36 +02:00
Zvonimir Sabljic
0619b53d18
Necessary flag so 33c38985bf works
2023-09-12 21:35:07 +02:00
Zvonimir Sabljic
151aa051e2
Improved debugging process and enabled splitting of app development into tasks and then into steps
...
- split step implementation into different functions
- standardized the return value in the implementation process - { "success": True }
- added propagation of errors back to the recursion level 0 with TooDeepRecursionError and TokenLimitError
- created new class Debugger and moved debugging in it
2023-09-12 21:32:56 +02:00
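
The { "success": True } convention and the error propagation described in the commit body could look roughly like this; apart from the two exception names and the result shape, every name below is an assumption:

# Hypothetical sketch of the standardized result shape and error propagation.
class TooDeepRecursionError(Exception):
    pass

class TokenLimitError(Exception):
    pass

MAX_RECURSION_LAYER = 3  # illustrative limit, not the project's real value

def run_step(step):
    """Placeholder for the real step execution; may raise TokenLimitError."""

def implement_step(step, recursion_layer=0):
    if recursion_layer > MAX_RECURSION_LAYER:
        # Propagate back up instead of silently giving up at this depth.
        raise TooDeepRecursionError()
    try:
        run_step(step)
    except (TooDeepRecursionError, TokenLimitError):
        raise  # bubble these up untouched so recursion level 0 can handle them
    except Exception as exc:
        return {"success": False, "error": str(exc)}
    return {"success": True}
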
Zvonimir Sabljic
19ac692509
Don't send max_tokens to the OpenAI API so we can use as much context as possible
2023-09-12 21:28:01 +02:00
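
A sketch of what dropping max_tokens from the request might look like, assuming the payload is built as a plain dict (names are illustrative):

# Hypothetical sketch: build the chat completion payload without max_tokens,
# so the model can use all remaining context for its reply.
def build_payload(model: str, messages: list, temperature: float = 0.7) -> dict:
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        # intentionally no "max_tokens" key: when omitted, the API lets the
        # completion use whatever context window is left
    }
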
LeonOstrez
80914f0722
Merge pull request #81 from nalbion/feature/should_execute_step
...
Refactored `execute_step()` as per #80
2023-09-12 12:57:52 +02:00
LeonOstrez
b023205a53
Merge pull request #85 from alter123/patch-1
...
Add check when the response is empty
2023-09-12 08:40:07 +02:00
LeonOstrez
9ffcdf79bb
Merge pull request #82 from nalbion/feature/test_CodeMonkey
...
Feature/test code monkey
2023-09-12 08:09:12 +02:00
Jay
74cbe33421
Add check when the response is empty
...
{
  "id": "",
  "object": "",
  "created": 0,
  "model": "",
  "prompt_annotations": [
    {
      "prompt_index": 0,
      "content_filter_results": {
        "hate": { "filtered": false, "severity": "safe" },
        "self_harm": { "filtered": false, "severity": "safe" },
        "sexual": { "filtered": false, "severity": "safe" },
        "violence": { "filtered": false, "severity": "safe" }
      }
    }
  ],
  "choices": [],
  "usage": null
}
With newer model versions, the choices array can sometimes be empty
2023-09-12 00:01:13 +05:30
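
Given a payload like the one above, a minimal sketch of the guard (the function name and the decision to return None are assumptions):

# Hypothetical sketch: treat a response whose "choices" array is empty as a
# failed attempt so the caller can retry instead of crashing on choices[0].
def extract_content(response_json: dict):
    choices = (response_json or {}).get("choices") or []
    if not choices:
        return None  # caller retries or reports the empty response
    return choices[0].get("message", {}).get("content")
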
Nicholas Albion
a94cbf9209
added documentation
2023-09-11 22:15:26 +10:00
Nicholas Albion
5b7b621832
Refactored execute_step() as per #80
2023-09-11 14:09:16 +10:00
Goon
3b207987aa
Merge remote-tracking branch 'upstream/main'
2023-09-11 10:26:05 +07:00
Goon
c39346868a
fix(gitignore): rm cache
2023-09-11 10:08:57 +07:00
Goon
367caa1797
fix(gitignore): rm pilot-env and cache
2023-09-11 10:08:42 +07:00
Nicholas Albion
f2187b5a04
fixed tests for CI
2023-09-09 12:02:22 +10:00
Nicholas Albion
a38c7c4f6d
linting
2023-09-09 11:54:16 +10:00
Nicholas Albion
4b64631bec
linting
2023-09-09 11:50:55 +10:00
Nicholas Albion
8cec113df9
test_username_to_uuid()
2023-09-09 10:54:33 +10:00
Nicholas Albion
831e6a4265
Merge remote-tracking branch 'origin/main' into feature/get_email-from-gitconfig
...
# Conflicts:
# pilot/utils/arguments.py
2023-09-09 10:54:02 +10:00
Zvonimir Sabljic
6a46851b20
Added catching and retrying when there is a token limit error
2023-09-08 18:03:54 +02:00
LeonOstrez
fd1fae8c43
Merge branch 'main' into feature/user_id-from-getpass_getuser
2023-09-08 14:46:04 +02:00
Goon
02623b6353
fix(llm connection): add openrouter api endpoint
2023-09-08 15:52:29 +07:00
Nicholas Albion
95c6e26665
removed commented-out code
2023-09-08 15:20:32 +10:00
Nicholas Albion
e33616450d
BaseModel.id is a UUIDField, create UUID from username
2023-09-08 15:19:47 +10:00
Nicholas Albion
891d153a2b
BaseModel.id is a UUIDField, create UUID from username
2023-09-08 15:14:49 +10:00
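
A sketch of how a stable UUID might be derived from the username for BaseModel.id, using Python's uuid5; this is an assumption, and the project's username_to_uuid() (see the test commit above) may hash differently:

# Hypothetical sketch: map a username to a deterministic UUID so it can be
# stored in a UUIDField as the user's id.
import uuid

def username_to_uuid(username: str) -> str:
    # uuid5 is deterministic: the same username always yields the same UUID.
    return str(uuid.uuid5(uuid.NAMESPACE_DNS, username.lower()))

# e.g. username_to_uuid("alice") == username_to_uuid("Alice")
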
Nicholas Albion
ee77f1ffac
get_app_by_user_workspace(user_id, workspace)
2023-09-08 15:13:37 +10:00
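
Given the BaseModel/UUIDField wording in the nearby commits, the ORM looks peewee-like; a sketch of what this lookup could be, with the model and field names assumed:

# Hypothetical sketch (peewee-style ORM, with assumed model/field names).
from peewee import CharField, Model, SqliteDatabase, UUIDField

db = SqliteDatabase(":memory:")  # stand-in for the project's real database

class App(Model):
    id = UUIDField(primary_key=True)
    user_id = UUIDField()
    workspace = CharField()

    class Meta:
        database = db

def get_app_by_user_workspace(user_id, workspace):
    # Returns the matching App row or None if the user has no app in that workspace.
    return App.get_or_none((App.user_id == user_id) & (App.workspace == workspace))
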
Nicholas Albion
0ec6da74ab
workspace path can be specified in CLI args
2023-09-08 06:29:46 +10:00
Nicholas Albion
720fa26bcf
user_id defaults to OS username
2023-09-08 04:05:26 +10:00
Nicholas Albion
c4af2750ac
user_id defaults to OS username
2023-09-08 04:01:46 +10:00
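
The branch name mentioned earlier ('feature/user_id-from-getpass_getuser') suggests getpass is the source of the default; a minimal sketch, with the CLI-argument handling assumed:

# Hypothetical sketch: fall back to the OS username when no user_id is passed.
import getpass

def get_user_id(cli_args: dict) -> str:
    # getpass.getuser() reads the login name from the environment/OS.
    return cli_args.get("user_id") or getpass.getuser()
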
Zvonimir Sabljic
d52c674cf0
Fix
2023-09-07 19:47:05 +02:00
Nicholas Albion
48edfae03c
handle rate_limit_exceeded error
2023-09-08 03:05:26 +10:00
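
A sketch of one way to handle the rate_limit_exceeded error, assuming a retry-with-wait loop around the request (the function names and backoff policy are assumptions):

# Hypothetical sketch: retry after a short wait when the API answers with a
# rate-limit error (HTTP 429 / code "rate_limit_exceeded").
import time

def call_with_rate_limit_retry(send_request, max_retries: int = 5):
    for attempt in range(max_retries):
        response = send_request()
        error = (response.get("error") or {}) if isinstance(response, dict) else {}
        if error.get("code") != "rate_limit_exceeded":
            return response
        time.sleep(2 ** attempt)  # simple exponential backoff
    raise RuntimeError("Still rate limited after %d retries" % max_retries)
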
Nicholas Albion
69eeae5606
attempt to get email from ~/.gitconfig
2023-09-08 03:03:38 +10:00
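
A sketch of reading the email from ~/.gitconfig; configparser happens to cope with the basic gitconfig layout, though the real implementation may parse the file differently:

# Hypothetical sketch: best-effort read of user.email from ~/.gitconfig.
import configparser
from pathlib import Path

def get_email_from_gitconfig():
    gitconfig = Path.home() / ".gitconfig"
    if not gitconfig.exists():
        return None
    parser = configparser.ConfigParser(strict=False)
    try:
        parser.read(gitconfig)
        return parser.get("user", "email", fallback=None)
    except configparser.Error:
        return None  # fall back to some other identifier
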
LeonOstrez
44a05b49b5
collect telemetry and ask user for feedback
2023-09-07 13:01:44 +02:00
LeonOstrez
83378033be
remove token check before the OpenAI API request and handle too many tokens in the response
2023-09-06 16:46:08 +02:00
LeonOstrez
f383e6c16e
delete all development steps if project continued from step before 'coding'
2023-09-06 16:43:11 +02:00
Zvonimir Sabljic
83ebd7939d
Merge branch 'main' into sander110419-main
2023-09-05 22:52:01 +02:00
Zvonimir Sabljic
a9ead6ecbb
Fix to enable regular OpenAI access
2023-09-05 22:50:48 +02:00
Dani Acosta
af6a972cba
Add OPENAI_MODEL env var
...
Adds an env variable OPENAI_MODEL to allow using models other than GPT-4
2023-09-05 00:15:21 +02:00
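
A sketch of how such an env var is typically read, with 'gpt-4' assumed as the default (the actual default and helper name may differ):

# Hypothetical sketch: pick the model from the OPENAI_MODEL env variable,
# falling back to GPT-4 when it is not set.
import os

def get_model_name() -> str:
    return os.getenv("OPENAI_MODEL", "gpt-4")
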
Zvonimir Sabljic
28d0143536
TEMP fix
2023-09-01 18:29:31 +02:00
Zvonimir Sabljic
85ac7e8276
Refactored all prints to be colored with fabulous and not termcolor
2023-09-01 18:28:20 +02:00
Zvonimir Sabljic
ca58c4958d
Implemented final version of IPC communication
2023-09-01 18:27:00 +02:00
Sander Hilven
a4d520763f
Added model selection to .env and updated readme
2023-09-01 10:34:12 +02:00
Sander Hilven
660047a071
Fixed the model being hardcoded in the endpoint URL.
2023-09-01 09:56:39 +02:00
Sander Hilven
984379fe71
Added Azure OpenAI endpoint.
...
Tested and confirmed working.
2023-09-01 09:53:17 +02:00
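
A sketch of how an Azure OpenAI chat-completions URL is usually assembled; the env variable names and api-version below are assumptions, not necessarily what this commit uses:

# Hypothetical sketch: build the Azure OpenAI endpoint URL from env variables.
# Azure routes requests per deployment rather than per model name.
import os

def build_azure_chat_url() -> str:
    resource = os.environ["AZURE_OPENAI_RESOURCE"]      # assumed variable name
    deployment = os.environ["AZURE_OPENAI_DEPLOYMENT"]  # assumed variable name
    api_version = os.getenv("AZURE_API_VERSION", "2023-05-15")
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )
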
Zvonimir Sabljic
bdb4d0dff8
Enabled getting user input from the external process
2023-08-31 08:38:37 +02:00
Zvonimir Sabljic
1418704186
Initial setup for IPC Client and logging
2023-08-30 23:16:17 +02:00
Zvonimir Sabljic
f4dc07407e
Made logging look nicer
2023-08-25 14:21:41 +02:00