Help on module gopher_ai:
NAME
    gopher_ai

DESCRIPTION
    Queries the AI model openrouter/elephant-alpha and displays the answer in
    gopher. The answer can be wrapped in cowsay by passing 'cowsay' or
    'cowthink' (without quotes) as the first word of the prompt.
FUNCTIONS
    build_prompt_obj(prompt: str) -> dict
        Build a dict for the prompt for later use.

        Contains the prompt and the key "answer", which is filled later with
        the LLM's answer. Also contains the boolean "cow", which indicates
        whether cowsay or cowthink is to be used for the answer. If the first
        word of the prompt is either 'cowsay' or 'cowthink', the key
        "cow_method" is set to the path of the specified cow program.

        Parameters:
            prompt (str): The prompt.

        Returns:
            dict: object with prompt, answer, cow, cow_method
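A minimal sketch of what build_prompt_obj might look like, based only on this docstring. The 'cowthink' path is truncated in the DATA section, so the value below is a placeholder, and stripping the cow keyword from the prompt is an assumption.

```python
# Sketch only: the real COW_SAY value for 'cowthink' is truncated in the
# module's DATA section, so that path is a placeholder.
COW_SAY = {
    "cowsay": "/usr/pkg/bin/cowsay -f bunny",
    "cowthink": "/usr/pkg/bin/cowthink",  # placeholder path
}

def build_prompt_obj(prompt: str) -> dict:
    """Build the prompt dict; detect a leading 'cowsay'/'cowthink' word."""
    obj = {"prompt": prompt, "answer": "", "cow": False, "cow_method": None}
    words = prompt.split(maxsplit=1)
    if words and words[0] in COW_SAY:
        obj["cow"] = True
        obj["cow_method"] = COW_SAY[words[0]]
        # Removing the keyword from the prompt is an assumption.
        obj["prompt"] = words[1] if len(words) > 1 else ""
    return obj
```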
    cow_print(prompt_obj: dict) -> None
        Wrap LLM answer in cowsay or cowthink.

        Parameters:
            prompt_obj (dict)

        Returns:
            None
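A plausible implementation of cow_print, assuming "cow_method" holds a shell-style command string that is split into arguments and fed the answer on stdin; the subprocess approach is an assumption, not documented here.

```python
import subprocess

def cow_print(prompt_obj: dict) -> None:
    """Pipe the LLM answer through the configured cow program and print it."""
    # e.g. '/usr/pkg/bin/cowsay -f bunny' -> ['/usr/pkg/bin/cowsay', '-f', 'bunny']
    cmd = prompt_obj["cow_method"].split()
    result = subprocess.run(cmd, input=prompt_obj["answer"],
                            capture_output=True, text=True, check=True)
    for line in result.stdout.splitlines():
        print("i" + line)  # leading 'i' marks a gopher informational line
```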
    g_print(mystr: str, **kwargs) -> None
        Print the passed string with a leading 'i'.

        So gopher displays the string correctly.

        Parameters:
            mystr (str): The string to print.

        Returns:
            None
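g_print is likely just a thin wrapper around print. Note that full gopher menu lines also carry tab-separated selector/host/port fields; since the docstring only mentions the leading 'i', this sketch reproduces only that.

```python
def g_print(mystr: str, **kwargs) -> None:
    """Print mystr with a leading 'i' so gopher shows it as an info line."""
    print("i" + mystr, **kwargs)
```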
    get_parameter() -> str
        Get prompt from environment.

        Returns:
            str: The prompt
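Since ENV_PARAM is 'QUERY_STRING', get_parameter presumably reads the CGI-style query string from the environment; URL-decoding it is an assumption.

```python
import os
import urllib.parse

ENV_PARAM = "QUERY_STRING"

def get_parameter() -> str:
    """Return the prompt passed via the QUERY_STRING environment variable."""
    raw = os.environ.get(ENV_PARAM, "")
    return urllib.parse.unquote_plus(raw)  # decoding is assumed, not documented
```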
    llm_req(prompt: str) -> str
        Make the AI request.

        Parameters:
            prompt (str): The prompt.

        Returns:
            str: The result from the prompt.
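llm_req presumably POSTs to the OpenRouter chat completions URL listed in DATA. A standard-library sketch following OpenRouter's OpenAI-compatible request shape; error handling is omitted, and the hypothetical build_payload helper is split out so the payload can be checked without network access.

```python
import json
import urllib.request

LLM_URL = "https://openrouter.ai/api/v1/chat/completions"
MODEL = "openrouter/elephant-alpha"
TOKEN = "xxxxxxxxxxxx"  # placeholder, as in the module's DATA section

def build_payload(prompt: str) -> dict:
    """Hypothetical helper: OpenAI-style chat completions payload."""
    return {"model": MODEL, "messages": [{"role": "user", "content": prompt}]}

def llm_req(prompt: str) -> str:
    """POST the prompt to OpenRouter and return the first answer."""
    req = urllib.request.Request(
        LLM_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Authorization": "Bearer " + TOKEN,
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```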
    main()
        Query LLM and print answer.

        Main function.

        Returns:
            None
    print_header(prompt_obj: dict) -> None
        Print a small header.

        Repeats the prompt and announces the answer.

        Parameters:
            prompt_obj (dict): Contains the prompt.

        Returns:
            None
DATA
    COW_SAY = {'cowsay': '/usr/pkg/bin/cowsay -f bunny', 'cowthink': '/usr...
    ENV_PARAM = 'QUERY_STRING'
    LLM_URL = 'https://openrouter.ai/api/v1/chat/completions'
    MODEL = 'openrouter/elephant-alpha'
    TOKEN = 'xxxxxxxxxxxx'