Iragent
This project aims to make using agents as simple as possible with small and large language models.
iragent.agent.Agent Class Reference

Using this class, we can define an agent. More...

Public Member Functions

 __init__ (self, str name, str model, str base_url, str api_key, str system_prompt, float temprature=0.1, int max_token=100, str next_agent=None, List[Callable] fn=[], str provider="openai", str response_format=None, memory=None)
 
str call_message (self, Message message, **kwargs)
 
Dict[str, Any] function_to_schema (self, Callable fn)
 
str python_type_to_json_type (self, Any py_type)
 
Dict[str, Any] parse_docstring (self, Callable fn)
 

Public Attributes

str provider = provider
 The platform we use for loading the large language models.
 
 base_url = base_url
 This will be the base URL our agent uses to communicate with the LLM.
 
 api_key = api_key
 Your API key is stored in this variable to authenticate the connection.
 
 model = model
 Choose the name of the model you want to use.
 
 temprature = temprature
 Set the sampling temperature for generating output from the LLM.
 
 max_token = max_token
 Set the maximum number of tokens that will be generated.
 
 system_prompt = system_prompt
 Set the system prompt that will guide the agent's behavior.
 
 name = name
 Set a name for the agent.
 
 next_agent = next_agent
 Set an agent as the next agent in the chain.
 
dict function_map = {f.__name__: f for f in fn}
 
list fn = [self.function_to_schema(f) for f in fn]
 List of tools available for this agent to use.
 
 client = OpenAI(api_key=self.api_key, base_url=self.base_url)
 
 response_format = response_format
 
 memory = memory() if memory is not None else None
 

Protected Member Functions

Message _call_ollama (self, List[Dict] msgs, Message message)
 This function uses an HTTP call for the ollama provider.
 
Message _call_ollama_v2 (self, List[Dict] msgs, Message message)
 
Message _call_openai (self, List[Dict] msgs, Message message, **kwargs)
 

Detailed Description

Using this class, we can define an agent.

Constructor & Destructor Documentation

◆ __init__()

iragent.agent.Agent.__init__ ( self,
str name,
str model,
str base_url,
str api_key,
str system_prompt,
float temprature = 0.1,
int max_token = 100,
str next_agent = None,
List[Callable] fn = [],
str provider = "openai",
str response_format = None,
memory = None )
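
For orientation, a hypothetical construction sketch; the model name, endpoint, key, and tool below are illustrative placeholders, not values taken from the library:

```python
# Hypothetical usage sketch: all values below are illustrative placeholders.
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

agent_kwargs = dict(
    name="math-helper",
    model="gpt-4o-mini",                   # any model your provider serves
    base_url="https://api.openai.com/v1",  # or a local Ollama endpoint
    api_key="sk-...",
    system_prompt="You are a helpful math assistant.",
    temprature=0.1,   # note: the parameter is spelled 'temprature' in this API
    max_token=200,
    fn=[add],         # plain callables; the agent converts them to schemas
    provider="openai",
)
# agent = iragent.agent.Agent(**agent_kwargs)
```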

Member Function Documentation

◆ _call_ollama()

Message iragent.agent.Agent._call_ollama ( self,
List[Dict] msgs,
Message message )
protected

This function uses an HTTP call for the ollama provider.

Parameters
msgs	The conversation history as a list of message dictionaries.
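
A sketch of the assumed shape of msgs, following the common chat-message convention (the exact keys the library expects are an assumption here):

```python
# Assumed shape of `msgs`: an ordered list of chat-message dictionaries,
# each carrying a role and its text content.
msgs = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]
```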

◆ _call_ollama_v2()

Message iragent.agent.Agent._call_ollama_v2 ( self,
List[Dict] msgs,
Message message )
protected
There are some differences between Ollama and OpenAI calls; this function works with "role": "tool" messages.
It uses the openai library to communicate with Ollama.
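
A sketch of a tool-result message in the OpenAI-style chat format this method is described as supporting; the tool_call_id value is illustrative:

```python
# A tool-result entry as used in OpenAI-style chat histories.
tool_msg = {
    "role": "tool",
    "tool_call_id": "call_0",  # id echoed back from the model's tool call
    "content": "42",           # the tool's return value, serialized as text
}
```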

◆ _call_openai()

Message iragent.agent.Agent._call_openai ( self,
List[Dict] msgs,
Message message,
** kwargs )
protected

◆ call_message()

str iragent.agent.Agent.call_message ( self,
Message message,
** kwargs )

◆ function_to_schema()

Dict[str, Any] iragent.agent.Agent.function_to_schema ( self,
Callable fn )
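
The library's implementation is not shown in this reference; a minimal standalone sketch of the general technique (turning a typed Python function into an OpenAI-style tool schema) might look like:

```python
import inspect
from typing import Any, Callable, Dict

# Plausible Python-to-JSON-schema type table; an assumption, not the
# library's actual mapping.
_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def function_to_schema(fn: Callable) -> Dict[str, Any]:
    """Build an OpenAI-style tool schema from a function's signature."""
    sig = inspect.signature(fn)
    properties = {
        name: {"type": _JSON_TYPES.get(param.annotation, "string")}
        for name, param in sig.parameters.items()
    }
    required = [
        name for name, param in sig.parameters.items()
        if param.default is inspect.Parameter.empty
    ]
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

schema = function_to_schema(add)
```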

◆ parse_docstring()

Dict[str, Any] iragent.agent.Agent.parse_docstring ( self,
Callable fn )
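
Again a hypothetical sketch, assuming Google-style Args: sections (the docstring convention the library actually parses is not documented here):

```python
import inspect
from typing import Callable, Dict

def parse_docstring(fn: Callable) -> Dict[str, str]:
    """Map parameter names to descriptions from a Google-style docstring."""
    doc = inspect.getdoc(fn) or ""
    params: Dict[str, str] = {}
    in_args = False
    for line in doc.splitlines():
        stripped = line.strip()
        if stripped == "Args:":
            in_args = True
            continue
        if in_args:
            if not stripped:  # a blank line ends the Args section
                break
            if ":" in stripped:
                name, desc = stripped.split(":", 1)
                params[name.strip()] = desc.strip()
    return params

def add(a: int, b: int) -> int:
    """Add two integers.

    Args:
        a: the first operand
        b: the second operand
    """
    return a + b

parsed = parse_docstring(add)
```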

◆ python_type_to_json_type()

str iragent.agent.Agent.python_type_to_json_type ( self,
Any py_type )
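
The library's exact mapping table is not shown in this reference; a plausible standalone sketch of the conversion:

```python
from typing import Any

def python_type_to_json_type(py_type: Any) -> str:
    """Map a Python annotation to a JSON-schema type name (assumed table)."""
    mapping = {
        int: "integer",
        float: "number",
        str: "string",
        bool: "boolean",
        list: "array",
        dict: "object",
    }
    return mapping.get(py_type, "string")  # fall back to string for unknowns
```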

Member Data Documentation

◆ api_key

iragent.agent.Agent.api_key = api_key

Your API key is stored in this variable to authenticate the connection.

◆ base_url

iragent.agent.Agent.base_url = base_url

This will be the base URL our agent uses to communicate with the LLM.

◆ client

iragent.agent.Agent.client = OpenAI(api_key=self.api_key, base_url=self.base_url)

◆ fn

list iragent.agent.Agent.fn = [self.function_to_schema(f) for f in fn]

List of tools available for this agent to use.

◆ function_map

dict iragent.agent.Agent.function_map = {f.__name__: f for f in fn}
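
A small standalone illustration of how such a name-to-callable map supports tool dispatch (the tool functions here are hypothetical):

```python
def add(a: int, b: int) -> int:
    return a + b

def sub(a: int, b: int) -> int:
    return a - b

# The same comprehension the class uses: tool name -> callable.
function_map = {f.__name__: f for f in [add, sub]}

# Dispatch a tool call by name, as an agent would after the model
# requests a tool invocation.
result = function_map["add"](2, 3)
```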

◆ max_token

iragent.agent.Agent.max_token = max_token

Set the maximum number of tokens that will be generated.

◆ memory

iragent.agent.Agent.memory = memory() if memory is not None else None

◆ model

iragent.agent.Agent.model = model

Choose the name of the model you want to use.

◆ name

iragent.agent.Agent.name = name

Set a name for the agent.

◆ next_agent

iragent.agent.Agent.next_agent = next_agent

Set an agent as the next agent in the chain.

◆ provider

str iragent.agent.Agent.provider = provider

The platform we use for loading the large language models.

You should pick "ollama" or "openai" as the provider.

◆ response_format

iragent.agent.Agent.response_format = response_format

◆ system_prompt

iragent.agent.Agent.system_prompt = system_prompt

Set the system prompt that will guide the agent's behavior.

◆ temprature

iragent.agent.Agent.temprature = temprature

Set the sampling temperature for generating output from the LLM.


The documentation for this class was generated from the following file: