Package com.google.genkit.plugins.ollama
Class OllamaModel
java.lang.Object
com.google.genkit.plugins.ollama.OllamaModel
All Implemented Interfaces:
Model, Action<ModelRequest, ModelResponse, ModelResponseChunk>, Registerable
Ollama model implementation for Genkit.
Supports local Ollama models with both synchronous and streaming generation.
Ollama must be running locally (or at the configured host).
Constructor Summary

Constructors
OllamaModel(String modelName, OllamaPluginOptions options)
Creates a new OllamaModel.

Method Summary

getInfo()
Gets information about the model's capabilities.

getName()
Returns the name of the action.

ModelResponse run(ActionContext context, ModelRequest request)
Generates a response from the given request.

ModelResponse run(ActionContext context, ModelRequest request, Consumer<ModelResponseChunk> streamCallback)
Generates a streaming response from the given request.

boolean supportsStreaming()
Returns whether this model supports streaming.

Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface com.google.genkit.ai.Model
getDesc, getInputSchema, getMetadata, getOutputSchema, getType, register, runJson, runJsonWithTelemetry
Constructor Details

OllamaModel
OllamaModel(String modelName, OllamaPluginOptions options)
Creates a new OllamaModel.
Parameters:
modelName - the model name (e.g., "llama3.2", "mistral")
options - the plugin options
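A minimal construction sketch. Note that this page does not document how OllamaPluginOptions is created, so the no-arg constructor below is an assumption:

```java
// Sketch only: assumes OllamaPluginOptions has a no-arg constructor
// (hypothetical; its construction is not documented on this page).
OllamaPluginOptions options = new OllamaPluginOptions();

// The model name must match a model available to the local Ollama
// server, e.g. one pulled via `ollama pull llama3.2`.
OllamaModel model = new OllamaModel("llama3.2", options);
```

As noted above, Ollama must be running locally (or at the configured host) before the model can generate anything.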
Method Details

getName
Description copied from interface: Action
Returns the name of the action.
Specified by:
getName in interface Action<ModelRequest, ModelResponse, ModelResponseChunk>
Returns:
the action name
getInfo
Description copied from interface: Model
Gets information about the model's capabilities.
supportsStreaming
public boolean supportsStreaming()
Description copied from interface: Model
Returns whether this model supports streaming.
Specified by:
supportsStreaming in interface Model
Returns:
true if streaming is supported
run
Description copied from interface: Model
Generates a response from the given request.
Specified by:
run in interface Action<ModelRequest, ModelResponse, ModelResponseChunk>
Specified by:
run in interface Model
Parameters:
context - the action context
request - the model request
Returns:
the model response
run
public ModelResponse run(ActionContext context, ModelRequest request, Consumer<ModelResponseChunk> streamCallback)
Description copied from interface: Model
Generates a streaming response from the given request.
Specified by:
run in interface Action<ModelRequest, ModelResponse, ModelResponseChunk>
Specified by:
run in interface Model
Parameters:
context - the action context
request - the model request
streamCallback - callback for streaming chunks
Returns:
the final model response
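The two run overloads above can be sketched as follows. This is a hedged sketch, not a definitive usage pattern: how an ActionContext and a ModelRequest are obtained is not shown on this page, so those two lines are placeholders.

```java
// Hypothetical setup: obtaining these is not documented on this page.
ActionContext context = null; // placeholder: supplied by the Genkit runtime
ModelRequest request = null;  // placeholder: built by the caller

// Synchronous generation: blocks until the full response is ready.
ModelResponse response = model.run(context, request);

// Streaming generation: each chunk is delivered to the callback as it
// arrives, and the final aggregated response is returned at the end.
if (model.supportsStreaming()) {
    ModelResponse finalResponse = model.run(context, request,
        chunk -> System.out.println(chunk));
}
```

The streaming overload still returns a complete ModelResponse, so callers that want both incremental output and the final result need only the second form.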