Semantic Operator
Invoke LLM capabilities directly from SQL with the Semantic Operator extension, which supports the OpenAI and Amazon Bedrock Claude providers.
The Semantic Operator brings Large Language Model capabilities directly into your SQL workflow. Instead of building pipelines that shuttle data between your database and external AI services, you invoke LLM functionality as a native database extension.
Enabling the Extension
create extension llm;
Configuring LLM Providers
Specify which provider and model to use before calling the LLM interface.
set llm.provider = '...';
set llm.model = '...';
OpenAI
-- Required
set llm.provider = 'openai';
set llm.model = 'gpt-4o';
set openai.api_key = 'sk-.....';
-- Optional: for OpenAI-compatible endpoints
set openai.base_url = 'https://your-endpoint.com/v1';
The base_url setting is useful for OpenAI-compatible APIs such as local model servers, proxies, or alternative providers.
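For example, a local Ollama server exposes an OpenAI-compatible endpoint and can be configured the same way. The model name below is illustrative, and many local servers ignore the API key:

```sql
-- Point the extension at a local OpenAI-compatible server (illustrative values)
set llm.provider = 'openai';
set llm.model = 'llama3.1';             -- whatever model the local server hosts
set openai.api_key = 'unused';          -- often ignored by local servers, but the setting may still be required
set openai.base_url = 'http://localhost:11434/v1';
```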
Amazon Bedrock Claude
set llm.provider = 'bedrock_claude';
This provider connects to Claude models on AWS Bedrock using your existing AWS credentials.
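Bedrock identifies models by AWS model IDs. Assuming llm.model accepts a Bedrock model ID (check your region for availability), a configuration sketch might look like:

```sql
-- Select a Claude model on Bedrock; credentials come from the standard AWS environment
set llm.provider = 'bedrock_claude';
set llm.model = 'anthropic.claude-3-5-sonnet-20240620-v1:0';
```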
Use Cases
- Text Classification — Classify records directly in queries
- Content Generation — Generate summaries or derived content in data pipelines
- Semantic Extraction — Extract structured data from unstructured text
- Intelligent Defaults — Use LLM reasoning to populate fields
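As a sketch of the classification use case, a query can invoke the extension's LLM interface inline. The function name llm_generate is an assumption for illustration, not confirmed by this document; consult the extension's reference for the actual call signature:

```sql
-- Hypothetical example: classify support tickets in place.
-- llm_generate(prompt) is an assumed function name.
select
    id,
    subject,
    llm_generate(
        'Classify this support ticket as billing, technical, or other. '
        || 'Reply with one word only: ' || subject
    ) as category
from tickets
limit 10;
```

Because the call runs per row, limiting or batching the input set keeps latency and provider costs predictable.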