Iragent
This project aims to make agents as simple as possible to use with both small and large language models.
Using this class we can define an agent. More...
Public Member Functions
  __init__(self, str name, str model, str base_url, str api_key, str system_prompt, float temprature=0.1, int max_token=100, str next_agent=None, List[Callable] fn=[], str provider=None, str response_format=None, memory=None)
  str call_message(self, Message message, **kwargs)
  Dict[str, Any] function_to_schema(self, Callable fn)
  str python_type_to_json_type(self, Any py_type)
  Dict[str, Any] parse_docstring(self, Callable fn)
Public Attributes
  str provider = provider
      The platform used for loading the large language models.
  base_url = base_url
      The base URL the agent uses to communicate with the LLM.
  api_key = api_key
      Your API key is stored in this variable to establish the connection.
  model = model
      The name of the model you want to use.
  temprature = temprature
      The temperature used when generating output from the LLM.
  max_token = max_token
      The maximum number of tokens that will be generated.
  system_prompt = system_prompt
      The system prompt for this agent.
  name = name
      A name for the agent.
  next_agent = next_agent
      The agent to hand control to next.
  dict function_map = {f.__name__: f for f in fn}
  list fn = [self.function_to_schema(f) for f in fn]
      The list of tools available to this agent.
  client = OpenAI(api_key=self.api_key, base_url=self.base_url)
  response_format = response_format
  memory = memory() if memory is not None else None
Protected Member Functions
  Message _call_ollama_v2(self, List[Dict] msgs, Message message)
  Message _call_openai(self, List[Dict] msgs, Message message, **kwargs)
Using this class we can define an agent.
iragent.agent.Agent.__init__(self,
    str name,
    str model,
    str base_url,
    str api_key,
    str system_prompt,
    float temprature = 0.1,
    int max_token = 100,
    str next_agent = None,
    List[Callable] fn = [],
    str provider = None,
    str response_format = None,
    memory = None)
There are some differences between the Ollama call and the OpenAI call. This function handles messages with "role": "tool", and it uses the openai library to communicate with Ollama.
protected
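The "role": "tool" handling mentioned above follows the standard OpenAI chat-message format, in which a tool result is appended to the conversation with an id linking it back to the model's tool call. The sketch below shows only that standard message shape; the ids and contents are illustrative, not iragent's actual code.

```python
# Assumed shape of a tool-result message in the OpenAI-style chat format.
# The tool_call_id (hypothetical here) ties the result to the model's request.
tool_message = {
    "role": "tool",
    "tool_call_id": "call_123",   # hypothetical id from the model's tool call
    "content": "42",              # the tool's return value, serialized as text
}

msgs = [
    {"role": "system", "content": "You are a calculator."},
    {"role": "user", "content": "What is 6 * 7?"},
    tool_message,
]
```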
str iragent.agent.Agent.call_message(self, Message message, **kwargs)
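call_message presumably combines the agent's system prompt with the incoming message before calling the provider. A minimal sketch of that usual prompt layout, with a hypothetical helper name (not iragent's internals):

```python
def build_msgs(system_prompt: str, user_content: str) -> list:
    # Hypothetical helper: the conventional chat layout puts the
    # system prompt first, followed by the user's message.
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_content},
    ]

msgs = build_msgs("You are a helpful agent.", "Summarize this text.")
```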
Dict[str, Any] iragent.agent.Agent.function_to_schema(self, Callable fn)
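A tool-schema converter of this kind typically inspects the function's signature and builds an OpenAI-style tool description. The following is a hypothetical re-implementation sketch (the helper name and mapping are assumptions, not iragent's actual code):

```python
import inspect

def function_to_schema_sketch(fn):
    """Hypothetical sketch: build an OpenAI-style tool schema
    from a function's signature annotations."""
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    props, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        props[name] = {"type": type_map.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value means required
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {"type": "object", "properties": props, "required": required},
        },
    }

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

schema = function_to_schema_sketch(add)
```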
Dict[str, Any] iragent.agent.Agent.parse_docstring(self, Callable fn)
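A docstring parser like this usually extracts a summary plus per-parameter descriptions. The sketch below assumes a simple "name: description" line convention; the real method may expect a different docstring style.

```python
import inspect

def parse_docstring_sketch(fn):
    """Hypothetical sketch: split a docstring into a summary and
    per-parameter descriptions written as 'name: description' lines."""
    doc = inspect.getdoc(fn) or ""
    summary, params = [], {}
    for line in doc.splitlines():
        line = line.strip()
        if ":" in line and not line.endswith(":"):
            name, _, desc = line.partition(":")
            if name.isidentifier():          # looks like a parameter line
                params[name] = desc.strip()
                continue
        if line:
            summary.append(line)
    return {"description": " ".join(summary), "params": params}

def greet(name: str) -> str:
    """Return a greeting.

    name: the person to greet
    """
    return f"Hello, {name}!"

info = parse_docstring_sketch(greet)
```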
str iragent.agent.Agent.python_type_to_json_type(self, Any py_type)
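A mapper from Python annotations to JSON Schema type names is commonly just a lookup table. An illustrative sketch (the fallback to "string" is an assumption, not confirmed behavior):

```python
def python_type_to_json_type_sketch(py_type):
    # Illustrative mapping from Python types to JSON Schema type names.
    mapping = {
        str: "string",
        int: "integer",
        float: "number",
        bool: "boolean",
        list: "array",
        dict: "object",
    }
    return mapping.get(py_type, "string")  # assumed fallback
```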
iragent.agent.Agent.api_key = api_key
Your API key is stored in this variable to establish the connection.
iragent.agent.Agent.base_url = base_url
The base URL the agent uses to communicate with the LLM.
list iragent.agent.Agent.fn = [self.function_to_schema(f) for f in fn]
The list of tools available to this agent.
dict iragent.agent.Agent.function_map = {f.__name__: f for f in fn}
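The name-to-function map shown above lets the agent dispatch a tool call requested by the model. A self-contained sketch of that dispatch pattern (the example tools are hypothetical):

```python
def add(a, b):
    return a + b

def multiply(a, b):
    return a * b

fn = [add, multiply]
# Same construction as the attribute above: map tool names to callables.
function_map = {f.__name__: f for f in fn}

# When the model requests a tool by name, look it up and invoke it.
result = function_map["multiply"](3, 4)
```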
iragent.agent.Agent.max_token = max_token
The maximum number of tokens that will be generated.
iragent.agent.Agent.memory = memory() if memory is not None else None
iragent.agent.Agent.model = model
The name of the model you want to use.
iragent.agent.Agent.name = name
A name for the agent.
iragent.agent.Agent.next_agent = next_agent
The agent to hand control to next.
str iragent.agent.Agent.provider = provider
The platform used for loading the large language models.
You should pick "ollama" or "openai" as the provider.
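Since the provider string selects between the two protected call paths listed above, the routing presumably looks something like the sketch below (hypothetical helper; the real dispatch logic is not shown in this page):

```python
def pick_caller(provider: str) -> str:
    """Hypothetical sketch: route to the Ollama- or OpenAI-style
    call based on the provider string."""
    if provider == "ollama":
        return "_call_ollama_v2"
    if provider == "openai":
        return "_call_openai"
    raise ValueError(f"unknown provider: {provider!r}")
```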
iragent.agent.Agent.response_format = response_format
iragent.agent.Agent.system_prompt = system_prompt
The system prompt for this agent.
iragent.agent.Agent.temprature = temprature
The temperature for generating output from the LLM.