AI-Powered Features in Your APEX App
OCI Generative AI provides access to large language models (LLMs) like Cohere Command and Meta Llama through REST APIs. From PL/SQL, you can call these models to add intelligent features to your APEX applications: summarizing documents, generating email drafts, classifying support tickets, extracting structured data from unstructured text, and answering questions about your business data.
Calling OCI Generative AI From PL/SQL
DECLARE
    l_request  CLOB;
    l_response CLOB;
    l_summary  VARCHAR2(4000);
BEGIN
    -- Build the request body with APEX_JSON
    APEX_JSON.INITIALIZE_CLOB_OUTPUT;
    APEX_JSON.OPEN_OBJECT;
    APEX_JSON.WRITE('compartmentId', 'ocid1.compartment.oc1..xxxxx');

    -- servingMode must be a nested JSON object, so open and close it
    -- with APEX_JSON rather than passing a pre-serialized string
    APEX_JSON.OPEN_OBJECT('servingMode');
    APEX_JSON.WRITE('servingType', 'ON_DEMAND');
    APEX_JSON.WRITE('modelId', 'cohere.command-r-plus');
    APEX_JSON.CLOSE_OBJECT;

    APEX_JSON.OPEN_OBJECT('inferenceRequest');
    APEX_JSON.WRITE('runtimeType', 'COHERE');
    APEX_JSON.WRITE('prompt', 'Summarize the following support ticket in 2 sentences: ' ||
                              :P10_TICKET_DESCRIPTION);
    APEX_JSON.WRITE('maxTokens', 200);
    APEX_JSON.WRITE('temperature', 0.3);
    APEX_JSON.CLOSE_OBJECT;

    APEX_JSON.CLOSE_OBJECT;
    l_request := APEX_JSON.GET_CLOB_OUTPUT;
    APEX_JSON.FREE_OUTPUT;

    -- Call the API; the body is JSON, so set the Content-Type header
    APEX_WEB_SERVICE.G_REQUEST_HEADERS.DELETE;
    APEX_WEB_SERVICE.G_REQUEST_HEADERS(1).NAME  := 'Content-Type';
    APEX_WEB_SERVICE.G_REQUEST_HEADERS(1).VALUE := 'application/json';

    l_response := APEX_WEB_SERVICE.MAKE_REST_REQUEST(
        p_url                  => 'https://inference.generativeai.us-chicago-1.oci.oraclecloud.com'
                                  || '/20231130/actions/generateText',
        p_http_method          => 'POST',
        p_body                 => l_request,
        p_credential_static_id => 'OCI_GEN_AI_CRED'
    );

    -- Fail loudly instead of parsing an error payload
    IF APEX_WEB_SERVICE.G_STATUS_CODE != 200 THEN
        RAISE_APPLICATION_ERROR(-20001,
            'OCI Generative AI call failed with HTTP ' || APEX_WEB_SERVICE.G_STATUS_CODE);
    END IF;

    -- Parse the response; APEX_JSON array indexes are 1-based
    APEX_JSON.PARSE(l_response);
    l_summary := APEX_JSON.GET_VARCHAR2(
        p_path => 'inferenceResponse.generatedTexts[1].text'
    );

    :P10_AI_SUMMARY := l_summary;
END;
Practical APEX Use Cases
Support Ticket Classification: When a new ticket is submitted, call the LLM to classify it by category, priority, and department, then auto-route it.

Document Summarization: Add a “Summarize” button to document detail pages that generates a concise summary of long reports or contracts.

Email Drafting: Given a context (customer complaint, order confirmation), generate a professional email draft that the user can review and send.
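As one concrete pattern, the classification-and-routing flow might look like the sketch below. `CLASSIFY_TICKET` (a wrapper around a REST call like the one shown earlier), the `TICKETS` table, and the `P20` page items are hypothetical names introduced here for illustration, not part of any Oracle API.

```sql
-- Sketch: classify a new ticket, then store the result, falling back to
-- a manual queue when the model returns an unexpected label.
DECLARE
    l_category VARCHAR2(50);
BEGIN
    l_category := TRIM(CLASSIFY_TICKET(:P20_DESCRIPTION));  -- hypothetical wrapper

    UPDATE tickets
       SET category = CASE
                        WHEN l_category IN ('Billing', 'Technical', 'Account', 'General')
                        THEN l_category
                        ELSE 'Unclassified'   -- route to a human for triage
                      END
     WHERE ticket_id = :P20_TICKET_ID;
END;
```

Validating the model's answer against the allowed label set is important: LLMs occasionally return extra words or an unlisted category, and an unchecked value would corrupt routing.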
Using DBMS_CLOUD_AI (Autonomous Database)
On Autonomous Database, DBMS_CLOUD_AI provides a simpler interface:
SELECT DBMS_CLOUD_AI.GENERATE(
           prompt       => 'Classify this support ticket into one of: '
                           || 'Billing, Technical, Account, General. Ticket: '
                           || :P10_DESCRIPTION,
           profile_name => 'MY_AI_PROFILE',
           action       => 'chat'
       ) AS classification
FROM   DUAL;
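This assumes the AI profile already exists. A profile is created once per schema with DBMS_CLOUD_AI.CREATE_PROFILE; the attribute values below (credential name, model) are placeholders to adapt to your tenancy, and the exact attributes accepted depend on your Autonomous Database version.

```sql
BEGIN
    -- One-time setup: create the profile referenced as MY_AI_PROFILE.
    -- OCI_GEN_AI_CRED is a placeholder for an existing DBMS_CLOUD credential.
    DBMS_CLOUD_AI.CREATE_PROFILE(
        profile_name => 'MY_AI_PROFILE',
        attributes   => '{"provider": "oci",
                          "credential_name": "OCI_GEN_AI_CRED",
                          "model": "cohere.command-r-plus"}'
    );
END;
/
```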
Cost and Latency Considerations
LLM API calls take 1 to 10 seconds depending on prompt and response length. Never call them synchronously during page rendering for every user. Instead, call them in background processes (APEX Automations, DBMS_SCHEDULER jobs) or on user demand (button click with a loading indicator). Cache results when the same input will produce the same output. Monitor API costs through the OCI Cost Analysis dashboard.
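Caching can be as simple as a lookup table keyed on a hash of the prompt. In the sketch below, the AI_RESPONSE_CACHE table (prompt_hash RAW(32) primary key, response CLOB) and the GET_AI_SUMMARY wrapper are illustrative names, not part of APEX.

```sql
-- Sketch: check the cache before calling the LLM; store the result on a miss.
DECLARE
    l_hash     RAW(32);
    l_response CLOB;
BEGIN
    -- STANDARD_HASH is a SQL function, so invoke it via a query
    SELECT STANDARD_HASH(:P10_TICKET_DESCRIPTION, 'SHA256')
      INTO l_hash
      FROM dual;

    BEGIN
        SELECT response
          INTO l_response
          FROM ai_response_cache
         WHERE prompt_hash = l_hash;
    EXCEPTION
        WHEN NO_DATA_FOUND THEN
            l_response := NULL;  -- cache miss
    END;

    IF l_response IS NULL THEN
        l_response := GET_AI_SUMMARY(:P10_TICKET_DESCRIPTION);  -- hypothetical wrapper
        INSERT INTO ai_response_cache (prompt_hash, response)
        VALUES (l_hash, l_response);
    END IF;

    :P10_AI_SUMMARY := l_response;
END;
```

This pattern only helps when identical inputs recur (e.g., re-opening the same document's summary); for low temperatures the output is stable enough that serving the cached copy is usually acceptable.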