
Wednesday, March 5, 2025

Oracle DB release 23.7 includes "Select AI" with the DBMS_CLOUD_AI package

The latest release of Oracle DB 23ai (23.7) now includes the promised DBMS_CLOUD packages.

I'm not talking about the ADB release; this is the general 23.7 DB release, and it even includes Select AI!



You can find the documentation for how to install DBMS_CLOUD here.  This is updated documentation that supersedes the MOS note 2748362.1 - How To Setup And Use DBMS_CLOUD Package.


What's Included in 23.7

The following packages are included in 23.7:

DBMS_CLOUD - The SQL to install this package has been included with the DB release since 19.9, and more procedures have been added over time to provide more functionality with object storage (a quick example follows this list).

DBMS_CLOUD_AI - This is the most interesting part of the release (at least to me).  This package is used as the basis for Select AI.

DBMS_CLOUD_NOTIFICATION -  This package allows you to send messages, or the output of a query to an e-mail or to Slack.

DBMS_CLOUD_PIPELINE -  This package allows you to create a data pipeline for loading and exporting data in the cloud.  This is mainly used to interact with data in object storage on a scheduled basis.

DBMS_CLOUD_REPO -  This package allows you to interact with hosted code repositories from the Oracle Database. Repositories such as GitHub are supported.
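
For example, once DBMS_CLOUD is installed and a credential exists, listing the contents of a bucket is a single query. This is only a minimal sketch; the credential name and bucket URL below are placeholders for your own environment.

SELECT object_name, bytes
  FROM DBMS_CLOUD.LIST_OBJECTS(
         credential_name => 'MY_OCI_CRED',
         location_uri    => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mynamespace/b/mybucket/o/');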


Where to start

The following are some great places to learn more about how to use the packages.

Videos:

Documentation:


Installing in your Database

I started by going through the install and prerequisites found here.
  1. Install the DBMS_CLOUD packages in a 23.7 CDB using the instructions in the 23.7 documentation (20.2).
  2. Create the SSL wallet with certificates (20.3).
  3. Configure your environment with the new wallet (20.4).
     NOTE: If you are using SEPS (ZDLRA uses SEPS) or other wallet-based authentication, this is the same wallet those authentication methods use.
  4. Configure the ACL list to allow database calls to the LLM that you are going to be using (20.5).
  5. Verify the configuration for DBMS_CLOUD (20.6).
  6. Configure users or roles to use DBMS_CLOUD (20.7). In my case I granted the access to "SH".
  7. Create the credential for the LLM you are using in your PDB.
  8. Create the profile which identifies the tables that you want to use in your PDB (a sketch of steps 7 and 8 follows this list).
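
To make steps 7 and 8 concrete, here is a minimal sketch assuming an OpenAI-compatible provider. The credential name, profile name, API key, and table list are placeholders; the exact attributes depend on the LLM provider you opened up in the ACL step.

BEGIN
   -- Step 7: credential holding the API key for the LLM provider (placeholder values)
   DBMS_CLOUD.CREATE_CREDENTIAL(
      credential_name => 'AI_CRED',
      username        => 'OPENAI',
      password        => 'your-api-key-here');

   -- Step 8: profile naming the provider, the credential, and the SH tables to expose
   DBMS_CLOUD_AI.CREATE_PROFILE(
      profile_name => 'SH_AI_PROFILE',
      attributes   => '{"provider"        : "openai",
                        "credential_name" : "AI_CRED",
                        "object_list"     : [{"owner": "SH", "name": "CUSTOMERS"},
                                             {"owner": "SH", "name": "COUNTRIES"},
                                             {"owner": "SH", "name": "SALES"}]}');
END;
/

-- Pick the profile for the session before running "select ai"
EXEC DBMS_CLOUD_AI.SET_PROFILE(profile_name => 'SH_AI_PROFILE');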

Example


I installed the Sample sales schema into my PDB (SH user) and followed the instructions in the documentation found here.


Below is the output of one of the queries I ran using "Select AI" after working through these steps with the sample SH schema.

SQL> select ai tell me how many customers are in each country;

COUNTRY_NAME                             CUSTOMER_COUNT
---------------------------------------- --------------
Italy                                              7780
Singapore                                           597
Brazil                                              832
United Kingdom                                     7557
Australia                                           831
Japan                                               624
Canada                                             2010
Argentina                                           403
Poland                                              708
China                                               712
Germany                                            8173
United States of America                          18520
France                                             3833
Spain                                              2039
New Zealand                                         244
Denmark                                             383
South Africa                                         88
Saudi Arabia                                         75
Turkey                                               91
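
If you want to see what Select AI actually ran, the AI keyword also accepts actions such as showsql (return the generated SQL without executing it), narrate, and chat. For example, staying with the same question:

SQL> select ai showsql tell me how many customers are in each country;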


I am just getting started determining how best to use this feature, but this should be enough to get you started.



Monday, January 20, 2025

Oracle DB 23ai in your datacenter

Oracle DB 23ai is available for Exadata, and I've been spending a lot of time building some demos in my lab environment. Below is the architecture.


To help you get started below are the pre-steps I did to create this demo.

  1. Download and install DB 23ai (the latest version, which was 23.6 when I created my demo).
  2. Install APEX within the database.  Most existing demos use APEX, and it makes it easy to build a simple application.  Here is a link to a blog that I used to explain the install process and the ORDS setup for the webserver.
  3. Optional - Install the embedding model in your database to convert text to its vector representation (a sketch follows this list). Here is a link to how to do this. You can also use an external model with Ollama.
  4. Optional - Install DBMS_CLOUD to access object storage.  Most demos access object storage to read in documents. Here is a link to my blog on how to install it.  I actually used ZFS for my object storage after installing DBMS_CLOUD. You can use OCI, or even a PAR (pre-authenticated request) against any object storage.
  5. Install Ollama. Ollama is used to host the LLM, and within Ollama you can download any open source model. For my demo, I downloaded and installed llama3.2.
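
For step 3, loading an ONNX embedding model into the database is a short PL/SQL call. This is only a sketch; the directory object, file name, and model name below are placeholders for whatever model you download (the Oracle examples commonly use an all-MiniLM ONNX file).

BEGIN
   -- Load the ONNX embedding model from a database directory object (placeholder names)
   DBMS_VECTOR.LOAD_ONNX_MODEL(
      directory  => 'MODEL_DIR',
      file_name  => 'all_MiniLM_L12_v2.onnx',
      model_name => 'MINILM_MODEL',
      metadata   => JSON('{"function":"embedding","embeddingOutput":"embedding","input":{"input":["DATA"]}}'));
END;
/

-- Quick check: produce a vector from a piece of text with the loaded model
SELECT VECTOR_EMBEDDING(MINILM_MODEL USING 'hello world' AS data) FROM dual;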
The demo I started with was the Texas Legislation demo, which can be found here. This link points to a video showing the demo, and within the description is a link to the code and instructions on how to recreate the demo in your environment, which are located on GitHub.

The majority of the application is written in APEX and can be downloaded using the instructions on GitHub, which can be found here.

The major changes I had to make to get this demo working on-premises had to do with using Ollama rather than accessing OCI for the LLM.

Documentation for using Ollama can be found here.

The biggest challenge was the LLM calls.  The embedding and document search use the same DBMS_VECTOR calls regardless of the model.  The demo, however, uses DBMS_CLOUD.send_request, which does not support Ollama.

I changed the functions to call DBMS_VECTOR_CHAIN.UTL_TO_GENERATE_TEXT instead, and I built a "prompt" instead of a message.  This is outlined below.

Description:    Call LLM with chat history and results
Demo request:   dbms_cloud.send_request (message built from Message: and Question:)
Ollama request: DBMS_VECTOR_CHAIN.UTL_TO_GENERATE_TEXT (prompt built from Question:, Chat History:, and Context:)
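
Here is a rough sketch of the replacement call, assuming Ollama is running on the database server with llama3.2 pulled. The prompt assembly is simplified, and the question, chat history, and context values are placeholders for what the APEX page actually passes in.

DECLARE
   l_params  CLOB := '{
      "provider" : "ollama",
      "host"     : "local",
      "url"      : "http://localhost:11434/api/generate",
      "model"    : "llama3.2"}';
   l_prompt  CLOB;
   l_answer  CLOB;
BEGIN
   -- Build a single prompt carrying the question, the chat history, and the RAG context
   l_prompt := 'Question: '     || 'the user question'           || CHR(10) ||
               'Chat History: ' || 'prior questions and answers' || CHR(10) ||
               'Context: '      || 'retrieved document chunks';

   -- Send the prompt to the local LLM instead of calling DBMS_CLOUD.send_request
   l_answer := DBMS_VECTOR_CHAIN.UTL_TO_GENERATE_TEXT(l_prompt, JSON(l_params));
   DBMS_OUTPUT.PUT_LINE(l_answer);
END;
/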

SUMMARY: This RAG demo is a great place to start learning how to create a RAG architecture, and with just a few changes, many of the demos created for Autonomous can be used on-premises as well!