ZFS now supports Pre-Authenticated Requests, which can be very useful for loading data into your data warehouse.
Version OS8.8.36 of the ZFS software was recently released, and one of its most interesting features is support for Pre-Authenticated Requests. A Pre-Authenticated Request (PAR) is a URL that grants time-limited access to a bucket or an object without requiring any other authentication.
Here is the note "On-premise Object Storage Best Practices and Recommended Use Cases". This document outlines how to use the new Object Store features, some of which I will cover in future posts.
Here is my post on configuring ZFS as an object store. You will need to configure ZFS as an object store if you want to do the same testing.
Here is my post on configuring your database to access ZFS as an object store. It walks through how to configure DBMS_CLOUD in a 19c+ database.
By going through these notes you can get to the same point I am at with my sample database and ZFS. Below is the environment I will be using for this demo.
Environment:
- Oracle Database 21c (though 19c would work just as well)
- I have updated the DATABASE_PROPERTIES to point to my wallet for SSL certificates.
- I added the unsigned SSL certificate for my ZFS simulator HTTPS server to the wallet.
- I have updated the C##CLOUD$SERVICE.dbms_cloud_store table to point to my ZFS appliance using the ORACLE_BMC authentication method.
- I installed the sales schema into my account.
- I am running the ZFS simulator.
- I updated the simulator to release OS8.8.36 (2013.06.05.8.36) of the ZFS software.
- I created a local user named "salesdata".
- I created a share named "salesupload" that is owned by salesdata.
- The share "salesupload" is configured read/write for the OCI API over HTTP.
- I added the user "salesdata" and its public SSH key for authentication with the OCI protocol over HTTP.
- I installed the OCI client tool.
- I configured an entry for my object store in the ~/.oci/config file called SALESDATA (a sample entry is shown below).
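For reference, my ~/.oci/config entry looks roughly like the sketch below. The OCIDs, fingerprint, key path, and region are placeholders; the exact values ZFS expects are covered in the configuration post referenced above, and the key_file must point to the private key matching the public key registered for the salesdata user on the ZFS appliance.
[oracle@oracle-server ~]$ cat ~/.oci/config
# Example profile for the ZFS object store (values below are placeholders)
[SALESDATA]
# user and tenancy are placeholder OCIDs
user=ocid1.user.oc1..aaaaexample
tenancy=ocid1.tenancy.oc1..aaaaexample
# fingerprint of the API key registered for the salesdata user on ZFS
fingerprint=aa:bb:cc:dd:ee:ff:00:11:22:33:44:55:66:77:88:99
# private key that matches the public key uploaded to ZFS
key_file=/home/oracle/.oci/salesdata_key.pem
region=us-phoenix-1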
Pre-Authenticated Requests for uploading files
Step 1: Create the bucket to store the files.
[oracle@oracle-server]$ oci os bucket create --config-file ~/.oci/config --profile SALESDATA --compartment-id salesupload --namespace-name salesupload --endpoint http://10.0.0.231/oci --name salesdrop
{
"data": {
"approximate-count": null,
"approximate-size": null,
"compartment-id": "salesupload",
"created-by": "salesdata",
"defined-tags": null,
"etag": "b7ced3b97859a3cc22a23670fc59a535",
"freeform-tags": null,
"id": null,
"is-read-only": null,
"kms-key-id": null,
"metadata": null,
"name": "salesdrop",
"namespace": "salesupload",
"object-events-enabled": null,
"object-lifecycle-policy-etag": null,
"public-access-type": "NoPublicAccess",
"replication-enabled": null,
"storage-tier": "Standard",
"time-created": "2021-10-17T19:06:47+00:00",
"versioning": "Disabled"
},
"etag": "b7ced3b97859a3cc22a23670fc59a535"
}
Step 2: Create a Pre-Authenticated URL for this bucket
The parameters I am passing to the preauth-request create command are:
- config-file: Location of the configuration file
- profile: Entry to use within the configuration file (if not the default)
- namespace-name: For ZFS this is the share name
- endpoint: This is the URL for the ZFS http server + "/oci" to use the OCI API
- bucket-name: Bucket to create the Pre-Authenticated Request for.
- name: Identifying name given to this request
- access-type: The type of Pre-Authenticated Request to create
- time-expires: When this URL expires. This is mandatory.
[oracle@oracle-server ~]$ oci os preauth-request create --config-file ~/.oci/config --profile SALESDATA --namespace-name salesupload --endpoint http://10.0.0.231/oci --bucket-name salesdrop --name upload_sales_data --access-type AnyObjectWrite --time-expires="2022-11-21T23:00:00+00:00"
{
"data": {
"access-type": "AnyObjectWrite",
"access-uri": "/oci/p/CQmkSnXYrLcVgUnmRhuOmMXGTzDEJrf/n/salesupload/b/salesdrop/o/",
"id": "11c01b5c-92d8-4c2d-8cba-d9ec4e2649c5",
"name": "upload_sales_data",
"object-name": null,
"time-created": "2021-10-17T19:15:32+00:00",
"time-expires": "2022-11-21T22:59:59+00:00"
}
}
Step 3: Upload my file
The full upload URL is the ZFS HTTP server address with the "access-uri" returned above appended, followed by the name of the object I want to create.
d:\> curl -X PUT --data-binary "@d:\sh_sales.dat" http://10.0.0.231/oci/p/CQmkSnXYrLcVgUnmRhuOmMXGTzDEJrf/n/salesupload/b/salesdrop/o/loadfiles/sales_history_05012021.dat
d:\>
No errors. Let's check with the OCI client tool to make sure the file got uploaded.
[oracle@oracle-server ~]$ oci os object list --endpoint http://10.0.0.231/oci --namespace-name salesupload --config-file ~/.oci/config --profile SALESDATA --bucket-name salesdrop --fields name,size,timeCreated
{
"data": [
{
"etag": null,
"md5": null,
"name": "loadfiles/sales_history_05012021.dat",
"size": 55180902,
"time-created": "2021-10-17T19:35:34+00:00",
"time-modified": null
}
],
"prefixes": []
}
I can see the file is there, and the size is about 55 MB.
Now where can you go with this? The Oracle IoT cloud can be used as a hub for data files coming from IoT devices. You can do the same thing by having all your IoT devices "drop" their data onto a central object store (hosted on ZFS), and then filter the data and load it into a database.
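A rough sketch of the device side is below. It reuses the AnyObjectWrite request created above; the device name, file path, and "incoming" prefix are hypothetical. Each device only needs curl and the PAR URL to drop its data into the bucket.
#!/bin/bash
# Hypothetical device-side upload script. PAR_URL is the AnyObjectWrite
# Pre-Authenticated Request created earlier; DEVICE_ID and DATAFILE are placeholders.
PAR_URL="http://10.0.0.231/oci/p/CQmkSnXYrLcVgUnmRhuOmMXGTzDEJrf/n/salesupload/b/salesdrop/o"
DEVICE_ID="sensor042"
DATAFILE="/var/iot/${DEVICE_ID}_$(date +%Y%m%d%H%M).dat"

# PUT the file under a per-device prefix so the loader can filter by device later
curl -s -X PUT --data-binary "@${DATAFILE}" \
     "${PAR_URL}/incoming/${DEVICE_ID}/$(basename ${DATAFILE})"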
Pre-Authenticated Requests for loading files
Step 1: Create the Pre-Authenticated Request to read the object.
[oracle@oracle-server]$ oci os preauth-request create --config-file ~/.oci/config --profile SALESDATA --namespace-name salesupload --endpoint http://10.0.0.231/oci --bucket-name salesdrop --name load_sales_data --access-type ObjectRead --time-expires="2022-11-21T23:00:00+00:00" --object-name loadfiles/sales_history_05012021.dat
{
"data": {
"access-type": "ObjectRead",
"access-uri": "/oci/p/apVWoQmeWWtireCzUqEjGBTRWQwotro/n/salesupload/b/salesdrop/o/loadfiles/sales_history_05012021.dat",
"id": "547227b4-73b0-4980-bb94-ab5ee87d4c81",
"name": "upload_sales_data",
"object-name": "loadfiles/sales_history_05012021.dat",
"time-created": "2021-10-17T19:56:45+00:00",
"time-expires": "2022-11-21T22:59:59+00:00"
}
}
Step 2: Load the data
Now, following Jeff's example, I am going to log onto my database and load the data.
First I am going to count the rows in my table, then check the count again after the load.
SQL> select count(1) from sales;
COUNT(1)
----------
0
SQL> BEGIN
DBMS_CLOUD.COPY_DATA(
table_name =>'SALES',
file_uri_list =>'https://10.0.0.231/oci/p/apVWoQmeWWtireCzUqEjGBTRWQwotro/n/salesupload/b/salesdrop/o/loadfiles/sales_history_05012021.dat',
format => json_object('delimiter' VALUE '|') );
END;
/
PL/SQL procedure successfully completed.
SQL> select count(1) from sales;
COUNT(1)
----------
1016271
SQL>
I can see that just over 1 million rows were successfully loaded into the table.
Step 3: Verify through USER_LOAD_OPERATIONS
Now, like Jeff did with his example, I am going to look at the view USER_LOAD_OPERATIONS to see the information about my load job.
col id format 999
col type format a8
col status format a10
col start_time format a15
col owner_name format a10
col table_name format a20
col file_uri_list format a70
set linesize 160
select
id,
type,
to_char(update_time,'mm/dd/yy hh24:mi:ss') update_time,
status,
owner_name,
table_name,
substr(file_uri_list,60,160) File_uri_list,
rows_loaded
from
user_load_operations
where status='COMPLETED';
ID TYPE UPDATE_TIME STATUS OWNER_NAME TABLE_NAME FILE_URI_LIST ROWS_LOADED
---- -------- ----------------- ---------- ---------- -------------------- ---------------------------------------------------------------------- -----------
3 COPY 10/17/21 16:13:21 COMPLETED BGRENN SALES salesupload/b/salesdrop/o/loadfiles/sales_history_05012021.dat 1016271
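If any rows had been rejected, the same view also points to the log and bad-record tables that DBMS_CLOUD.COPY_DATA creates for each load. This assumes the on-premises DBMS_CLOUD install exposes the LOGFILE_TABLE and BADFILE_TABLE columns the same way the Autonomous Database version of USER_LOAD_OPERATIONS does.
SQL> select logfile_table, badfile_table from user_load_operations where id = 3;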
Other Pre-Authenticated Requests
- ObjectReadWrite: This gives both read and write access to a specific object.
- ObjectWrite: This gives write-only access to a specific object (rather than full access to the bucket). An example of creating one is shown below.
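For example, the write_sales_data request that appears in the listing further down would be created with a command like the one below (the same pattern as before, only the --access-type changes and an --object-name is added).
[oracle@oracle-server ~]$ oci os preauth-request create --config-file ~/.oci/config --profile SALESDATA --namespace-name salesupload --endpoint http://10.0.0.231/oci --bucket-name salesdrop --name write_sales_data --access-type ObjectWrite --time-expires="2022-11-21T23:00:00+00:00" --object-name loadfiles/sales_history_07012021.dat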
List Pre-Authenticated Requests
[oracle@oracle-server ~]$ oci os preauth-request list --config-file ~/.oci/config --profile SALESDATA --namespace-name salesupload --endpoint http://10.0.0.231/oci --bucket-name salesdrop
{
"data": [
{
"access-type": "AnyObjectWrite",
"id": "11c01b5c-92d8-4c2d-8cba-d9ec4e2649c5",
"name": "upload_sales_data",
"object-name": null,
"time-created": "2021-10-17T19:15:32+00:00",
"time-expires": "2022-11-21T22:59:59+00:00"
},
{
"access-type": "ObjectRead",
"id": "547227b4-73b0-4980-bb94-ab5ee87d4c81",
"name": "load_sales_data",
"object-name": "loadfiles/sales_history_05012021.dat",
"time-created": "2021-10-17T19:56:45+00:00",
"time-expires": "2022-11-21T22:59:59+00:00"
},
{
"access-type": "ObjectReadWrite",
"id": "87b4fe97-3e2e-4b22-96aa-a7e3b566dc59",
"name": "read_write_sales_data",
"object-name": "loadfiles/sales_history_06012021.dat",
"time-created": "2021-10-17T20:37:23+00:00",
"time-expires": "2022-11-21T22:59:59+00:00"
},
{
"access-type": "ObjectWrite",
"id": "828a0651-60f7-4d2a-998c-b3518e1bfa92",
"name": "write_sales_data",
"object-name": "loadfiles/sales_history_07012021.dat",
"time-created": "2021-10-17T20:40:08+00:00",
"time-expires": "2022-11-21T22:59:59+00:00"
}
]
}
Get Details on a Pre-Authenticated Request
[oracle@oracle-server ~]$ oci os preauth-request get --config-file ~/.oci/config --profile SALESDATA --namespace-name salesupload --endpoint http://10.0.0.231/oci --bucket-name salesdrop --par-id 828a0651-60f7-4d2a-998c-b3518e1bfa92
{
"data": {
"access-type": "ObjectWrite",
"id": "828a0651-60f7-4d2a-998c-b3518e1bfa92",
"name": "upload_sales_data",
"object-name": "loadfiles/sales_history_07012021.dat",
"time-created": "2021-10-17T20:40:08+00:00",
"time-expires": "2022-11-21T22:59:59+00:00"
}
}
Delete a Pre-Authenticated Request
[oracle@oracle-server ~]$ oci os preauth-request delete --config-file ~/.oci/config --profile SALESDATA --namespace-name salesupload --endpoint http://10.0.0.231/oci --bucket-name salesdrop --par-id 828a0651-60f7-4d2a-998c-b3518e1bfa92
Are you sure you want to delete this resource? [y/N]: y