Friday, October 29, 2021

Oracle Database API for MongoDB with Oracle Autonomous Database opens up an entirely new set of use cases for MongoDB applications

Simpler database development experience

Developers like simplicity when building applications, using their choice of programming languages, frameworks, tools and databases. Document databases have become a popular choice for storing and processing application data as language-independent JavaScript Object Notation (JSON) documents, organized into collections.

Document databases offer simple APIs for CRUD operations (Create, Read, Update, Delete) and frameworks that make it easy to persist application objects. This allows developers to focus on the application logic in microservices/API-driven architectures without having to learn SQL (Structured Query Language).

Easily adapt to application changes

Unlike relational database models, where application objects are mapped to database tables with a predefined set of columns (a.k.a. the schema), JSON documents are schema-flexible and allow the values in each document to vary throughout a collection, making them ideal for Agile development methodologies.

Further, JSON supports nested structures that can be used to selectively add redundant data, so an application object can be read and written as a single unit, avoiding joins and multiple DML (Data Manipulation Language) statements.

Finally, document database architectures give internet-scale applications the performance and scalability they require, with scale-out storage and compute capacity and minimal response times from the database. Document databases typically provide single-digit millisecond latencies for small reads and writes, while serving thousands of concurrent users.

Document Databases are great, but...

MongoDB has popularized document databases among developers with its free-to-use community edition. MongoDB offers simple document database APIs; however, it has limitations supporting the multi-document ACID transactions critical to enterprise applications. For developers, implementing basic SQL-engine functionality requires writing, testing and maintaining hundreds of lines of application code, resulting in security vulnerabilities, increased development time and ongoing maintenance.

As application use cases evolve, supporting additional data types and workload types requires external integration with other single-purpose databases, such as graph or spatial, potentially using different APIs and requiring the data to be moved to those databases, resulting in data fragmentation, data inconsistency and security risks.

Keeping it together with converged database

Oracle's converged database has built-in support for all modern data types and the latest development paradigms. Converged databases support spatial data for location awareness, JSON for document stores, IoT for device integration, in-memory technologies for real-time analytics, and, of course, traditional relational data. By supporting these data types, a converged database can run all sorts of workloads, from IoT to blockchain to analytics and machine learning. It can also handle various development paradigms, including microservices, events, REST, SaaS, and CI/CD, eliminating data replication or duplication, reducing cost and increasing developer productivity.

Overview of Oracle Autonomous Databases


Oracle Autonomous Database combines the flexibility of cloud with the power of machine learning to deliver data management as a service. It enables businesses to run mission-critical workloads in a highly secure and highly available environment, while reducing administration and runtime costs by up to 90%.

Oracle Autonomous Database is a family of cloud services deployed on Oracle Exadata, a high-performance engineered platform for the converged database, with each service optimized for a specific workload type:

◉ Oracle Autonomous Data Warehouse (ADW): ADW uses adaptive machine learning to deliver unprecedented simplicity, performance, and highly elastic data management that enables data warehouse deployment in seconds.

◉ Oracle Autonomous Transaction Processing (ATP): ATP provides simpler application development, real-time analytics, personalization, and fraud detection for a complex mix of high-performance transactions, batch reporting, and Internet of Things (IoT) applications.

◉ Oracle Autonomous JSON Database (AJD): AJD makes it simple to develop JSON-centric applications with document APIs, serverless scaling and high-performance ACID transactions.

◉ APEX Service: Oracle APEX Application Development (APEX Service) is a fully managed, low-code application development platform for building and deploying modern, data-driven applications in Oracle Cloud.

Introducing Oracle Database API for MongoDB with Oracle Autonomous JSON Database


Oracle Database API for MongoDB is the latest initiative to help developers connect applications, drivers and tools built for MongoDB to Oracle Autonomous Database.

With the Oracle Database API for MongoDB, developers can continue to use their existing skills while also leveraging converged database and Autonomous Database capabilities over their MongoDB collections, opening up an entirely new set of use cases for their applications.

Getting Started with Oracle Database API for MongoDB


Oracle Database API for MongoDB is currently in Limited Availability (LA), which means it is not enabled for every cloud database by default.

◉ If you haven’t already signed up for ‘Always Free’ services in Oracle Cloud Free Tier, please do so now; you are entitled to two (2) Autonomous JSON Database instances. Follow the steps outlined in the blog to create and manage an instance.

◉ Oracle Database API for MongoDB requires network access secured by an Access Control List (ACL). The ‘Secure access from everywhere’ and ‘Private endpoint access only’ options are not currently supported. Set up the ACL under the ‘Network’ section of the instance, as specified in ‘Configure ACL for an existing Autonomous Database Instance’.

Connecting MongoDB Clients


Use MongoDB Shell, a command-line utility, to connect to and query data in Autonomous JSON Database; we recommend v3.6 or later. For applications built using popular languages such as Java, Node.js, .NET and Python, we recommend the Java driver 3.6 or later, Node.js driver 3.0 or later, .NET driver 2.5 or later, or Python driver 3.6 or later, with the load_balanced parameter set to ‘true’.

◉ Once your instance is authorized to use the MongoDB API, retrieve the connection string from the ‘Service Console’.
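
For reference, the connection string generally takes the following form (a hedged sketch; the host and user values are placeholders, so always copy the exact string from the console):

mongodb://[user:password@]<host>:27017/<user>?authMechanism=PLAIN&authSource=$external&ssl=true&retryWrites=false&loadBalanced=true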

◉ Create database user(s) from ‘Database Actions’ and grant the ‘SODA_APP’ role to each user, e.g. TESTUSER.
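
In SQL terms this corresponds to something like the following minimal sketch (the user name and password are placeholders):

create user testuser identified by "<password>";
grant soda_app, create session, create table to testuser;
alter user testuser quota unlimited on data;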

◉ Test connectivity from MongoDB Shell for TESTUSER
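
A quick smoke test might look like this (illustrative; substitute the connection string retrieved above, with the TESTUSER credentials filled in):

$ mongo '<connection string from the Service Console>'
> db.runCommand({ ping: 1 })
{ "ok" : 1 }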

◉ Create collection(s) and insert documents as you would normally do using MongoDB Shell
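
For example, an illustrative MongoDB Shell session (the collection and documents are made up):

> use testuser
> db.createCollection('orders')
> db.orders.insertOne({ "order_id": 1, "customer": "Smith", "items": ["pencil", "eraser"] })
> db.orders.find({ "customer": "Smith" })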

◉ Verify the collection from Autonomous Database SQL Web Client
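
Since each collection is backed by a table with a JSON document column, a query along these lines (a sketch; the collection name matches the example above) shows the documents from SQL:

select json_serialize(json_document) from "orders";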

Source: oracle.com

Monday, October 25, 2021

Oracle Cloud : Autonomous Database (ADW or ATP) - Export Data to an Object Store (expdp)

This article demonstrates how to export data from an Autonomous Data Warehouse (ADW) or Autonomous Transaction Processing (ATP) service on the Oracle Cloud using the expdp utility.

◉ Assumptions

For the export to work you will have to make a connection from an Oracle client to the ADW or ATP database. You can see the necessary setup to do this here.

The Oracle 18c impdp utility introduced the CREDENTIAL parameter to specify the object store credential to be used for an import. From Oracle 21c (21.3) we can also use the CREDENTIAL parameter with the expdp utility.

We need an object store bucket to export the data to. This could be an Oracle Cloud Object Storage bucket, or an AWS S3 bucket.

◉ Create Something to Export

We connect to an autonomous database and create a new test user.

conn admin/MyPassword123@obatp_high

create user testuser1 identified by "MyPassword123";

alter user testuser1 quota unlimited on data;

grant create session to testuser1;

grant dwrole to testuser1;

We create a test table which we will export.

create table testuser1.t1 as

select level as id,

       'Description for ' || level as description

from   dual

connect by level <= 1000;

commit;

◉ Object Store Credentials

Create a credential for your object store. For an Oracle object storage bucket we use our Oracle Cloud email and the Auth Token we generated.

conn admin/MyPassword123@obatp_high

begin

  dbms_cloud.drop_credential(credential_name => 'obj_store_cred');

end;

/

begin

  dbms_cloud.create_credential (

    credential_name => 'obj_store_cred',

    username        => 'me@example.com',

    password        => '{my-Auth-Token}'

  ) ;

end;

/

For AWS buckets we use our AWS access key and secret access key.

begin

  dbms_cloud.create_credential (

    credential_name => 'obj_store_cred',

    username        => 'my AWS access key',

    password        => 'my AWS secret access key'

  );

end;

/

◉ Export to Object Store

We can use a local Oracle 21.3 installation to export data from the autonomous database to an object store.

We use the CREDENTIAL parameter to point to the database credential we created earlier, and an object store URI for the DUMPFILE location. For AWS S3, use the URI of your S3 bucket. For Oracle Cloud the URI can take either of these forms.

https://swiftobjectstorage.{region}.oraclecloud.com/v1/{namespace}/{bucket-name}/{file-name}.dmp

https://objectstorage.{region}.oraclecloud.com/n/{namespace}/b/{bucket-name}/o/{file-name}.dmp

The following example uses the "swiftobjectstorage" URI.

expdp admin/MyPassword123@obatp_high \

      tables=testuser1.t1 \

      directory=data_pump_dir \

      credential=obj_store_cred \

      dumpfile=https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/t1.dmp \

      exclude=statistics

Export: Release 21.0.0.0.0 - Production on Tue Sep 7 18:36:39 2021

Version 21.3.0.0.0

Copyright (c) 1982, 2021, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production

Starting "ADMIN"."SYS_EXPORT_TABLE_01":  admin/********@obatp_high tables=testuser1.t1 directory=data_pump_dir

  credential=obj_store_cred dumpfile=https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/t1.dmp exclude=statistics

Processing object type TABLE_EXPORT/TABLE/TABLE_DATA

Processing object type TABLE_EXPORT/TABLE/TABLE

. . exported "TESTUSER1"."T1"                            32.60 KB    1000 rows

ORA-39173: Encrypted data has been stored unencrypted in dump file set.

Master table "ADMIN"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded

******************************************************************************

Dump file set for ADMIN.SYS_EXPORT_TABLE_01 is:

  https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/t1.dmp

Job "ADMIN"."SYS_EXPORT_TABLE_01" successfully completed at Tue Sep 7 18:37:14 2021 elapsed 0 00:00:26

$

The following example uses the "objectstorage" URI.

expdp admin/MyPassword123@obatp_high \

      tables=testuser1.t1 \

      directory=data_pump_dir \

      credential=obj_store_cred \

      dumpfile=https://objectstorage.uk-london-1.oraclecloud.com/n/my-namespace/b/ob-bucket/o/t1.dmp \

      exclude=statistics

Export: Release 21.0.0.0.0 - Production on Tue Sep 7 19:05:47 2021

Version 21.3.0.0.0

Copyright (c) 1982, 2021, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 21c Enterprise Edition Release 21.0.0.0.0 - Production

Starting "ADMIN"."SYS_EXPORT_TABLE_01":  admin/********@obatp_high tables=testuser1.t1 directory=data_pump_dir

  credential=obj_store_cred dumpfile=https://objectstorage.uk-london-1.oraclecloud.com/n/my-namespace/b/ob-bucket/o/t1.dmp exclude=statistics

Processing object type TABLE_EXPORT/TABLE/TABLE_DATA

Processing object type TABLE_EXPORT/TABLE/TABLE

. . exported "TESTUSER1"."T1"                            32.60 KB    1000 rows

ORA-39173: Encrypted data has been stored unencrypted in dump file set.

Master table "ADMIN"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded

******************************************************************************

Dump file set for ADMIN.SYS_EXPORT_TABLE_01 is:

  https://objectstorage.uk-london-1.oraclecloud.com/n/my-namespace/b/ob-bucket/o/t1.dmp

Job "ADMIN"."SYS_EXPORT_TABLE_01" successfully completed at Tue Sep 7 19:06:06 2021 elapsed 0 00:00:15

$

◉ Get the Log File

If we want to read the contents of the expdp log file we can push it across to the object store using the PUT_OBJECT procedure in the DBMS_CLOUD package.

conn admin/MyPassword123@obatp_high

begin

  dbms_cloud.put_object(

    credential_name => 'obj_store_cred',

    object_uri      => 'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/export.log',

    directory_name  => 'data_pump_dir',

    file_name       => 'export.log');

end;

/
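
To confirm what has landed in the bucket, we can also query the DBMS_CLOUD.LIST_OBJECTS table function (a sketch using the same credential and bucket as above):

select object_name, bytes
from   dbms_cloud.list_objects(
         'obj_store_cred',
         'https://swiftobjectstorage.uk-london-1.oraclecloud.com/v1/my-namespace/ob-bucket/');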

Source: oracle-base.com

Friday, October 22, 2021

Autonomous JSON Database under the covers: OSON format

One of the secrets behind Autonomous JSON Database is a new optimized native binary storage format for JSON, called OSON. All JSON documents in Autonomous JSON Database are automatically stored in OSON format. This delivers big performance benefits to your JSON applications, including faster query performance, more efficient updates, and reduced storage sizes. This native JSON format is completely transparent to your applications; your application always uses standard JSON documents (e.g. text strings), but all database operations on JSON documents are optimized under the covers via this binary format.

OSON is based on a tree encoding. The following figure shows the OSON encoding of a simple JSON document as a serialized OSON byte array, with tree pointers represented as jump-navigation offsets.

[Figure: OSON encoding of a sample JSON document as a serialized byte array with jump-navigation offsets]

This encoding gives the following benefits when processing SQL/JSON path queries and updates.

◉ Fast query performance: OSON supports ‘tree-based jump navigation’. If you want to retrieve only the field “name” of the second element of the array entry “items”, that is, evaluate the dot expression ‘items[1].name’, then Autonomous JSON Database will ‘jump’ directly to the value “PC” within the OSON buffer, instead of linearly parsing and scanning the text of the JSON document. For a JSON document having few fields, the performance gains may be modest -- but for a typical JSON document having dozens or hundreds of fields and array elements nested within each other, forming complex tree structures, ‘tree-based jump navigation’ enables Autonomous JSON Database to efficiently skip past the irrelevant portions of the document and retrieve only the requested fields. The OSON format often delivers an order-of-magnitude query performance gain compared to JSON stored in its raw text format.

◉ Efficient updates: In many applications, you may want to update only a subset of the fields of a JSON document (known as a ‘partial update’). For example, you may want to update the “id” field of a document from “CDEG4” to “CDEG52” in the figure above. Autonomous JSON Database performs partial updates of OSON whenever possible, changing only a subset of nodes in the JSON tree structure, even if the new data is larger than the original field(s). Even for complex update operations, full document replacement is typically not necessary. Partial updates of OSON improve the efficiency of modifying JSON documents, especially for large documents, where replacing the entire document would be expensive. Partial update often significantly reduces database redo/replication log size and improves document update performance for medium to large documents (see the sketch after this list).

◉ Reduced storage: Medium to large JSON documents often contain nested object array structures and/or recursive structures. In those documents, the field names may be stored repeatedly, many times over, in JSON text. For example, in the figure above, the field “name” is repeated in each element of the array. OSON stores each distinct field name only once, using a dictionary encoding to shrink larger JSON documents.
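
As a concrete illustration of a partial update, the single-field "id" change described above could be expressed with JSON_MERGEPATCH, which the database applies as a partial OSON update where possible. This is a sketch against a hypothetical purchaseOrders table with a JSON column po_document:

update purchaseOrders
set    po_document = json_mergepatch(po_document, '{"id" : "CDEG52"}')
where  json_value(po_document, '$.id') = 'CDEG4';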

There are even more benefits to OSON that help other aspects of your JSON applications:

◉ Leaf scalar data in OSON uses native Oracle scalar data types (number, date, timestamp, binary float/double, year-month and day-time intervals, binary) and encoding formats. Consequently, no data conversion is required when extracting JSON leaf data as a SQL scalar value in Autonomous JSON Database and its client drivers.

◉ OSON processing is fully integrated into Oracle’s client drivers. JSON encoding into OSON and decoding from OSON can occur on the client side whenever appropriate and possible. This saves server CPU that would otherwise be spent encoding JSON into OSON for ingest or decoding OSON into JSON for retrieval. Furthermore, Oracle client drivers can directly leverage optimizations like ‘jump navigation’ over the OSON format. The SODA client driver for Autonomous JSON Database uses the OSON client-side encoder and decoder automatically.

◉ OSON is fully and deeply integrated into all aspects of Oracle Autonomous Database: JSON functional indexes, the JSON search index, JSON_TABLE() materialized views, JSON Data Guide, parallel query processing, the In-Memory columnar store, and Exadata smart scan have all been enhanced to take full advantage of the OSON format to speed up SQL/JSON processing.

Source: oracle.com

Wednesday, October 20, 2021

Logical Database

A logical database is a special type of ABAP (Advanced Business Application Programming) program that is used to retrieve data from various interrelated tables. A logical database provides a read-only view of the data.

Structure Of Logical Database:

A logical database uses only a hierarchical structure of tables, i.e. data is organized in a tree-like structure and stored as records that are connected to each other through edges (links). A logical database contains Open SQL statements that are used to read data from the database. The logical database reads the data, buffers it if required, and passes it line by line to the application program.

[Figure: Structure of a logical database]

Features of Logical Database:


In this section, let us look at some features of a logical database:

◉ We can select only the data that we need.
◉ Data authentication is performed in order to maintain security.
◉ A logical database uses a hierarchical structure, which helps maintain data integrity.

Goal Of Logical Database:


The goal of a logical database is to create well-structured tables that reflect the needs of the user. The tables of a logical database store data in a non-redundant manner, and foreign keys are used in the tables so that relationships among tables and entities are supported.

Tasks Of Logical Database:


Below are some important tasks of a logical database:

◉ With the help of a logical database, multiple programs can read the same data.

◉ A logical database defines the same user interface for multiple programs.

◉ A logical database ensures authorization checks for centralized, sensitive data.

◉ With the help of a logical database, performance is improved: a logical database uses joins instead of multiple SELECT statements, which improves response time, as the sketch below illustrates.
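
To make the join point concrete, here is a minimal SQL sketch with hypothetical student and batch tables. Instead of issuing one SELECT per table:

select batch_id from students where name = 'Asha';
select branch from batches where id = :batch_id;

a single join reads the related rows in one statement:

select s.name, b.branch
from   students s
join   batches b on b.id = s.batch_id
where  s.name = 'Asha';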

Data View Of Logical Database:


A logical database provides a particular view of logical database tables. A logical database is most appropriate when the structure of the database is large. It is convenient to use the following flow, i.e.

◉ SELECT
◉ READ
◉ PROCESS
◉ DISPLAY

in order to work with databases efficiently. The data of a logical database is hierarchical in nature; the tables are linked to each other through foreign key relationships.

Diagrammatically, the Data View of Logical Database is shown as:

[Figure: Data view of a logical database]

Points To Remember:


◉ Tables must have foreign key relationships.
◉ A logical database consists of logically related tables that are arranged in a hierarchical manner and used for reading or retrieving data.
◉ A logical database consists of three main elements:
     ◉ Structure of the database
     ◉ Selection of data from the database
     ◉ Database program

◉ If we want to improve data access time, we use VIEWS in the logical database.

Example: 


Suppose that, in a university or college, a HOD wants information about a specific student. They first retrieve the data about the student's batch and branch from a large amount of data, and can then easily get the information about the required student, without being able to alter it.


Advantages Of Logical Database:


Let us look at some advantages of the logical database:

◉ In a logical database, we can select meaningful data from a large amount of data.

◉ A logical database provides central authorization, which checks whether database accesses are authenticated.

◉ Less code is required to retrieve data from the database compared to other databases.

◉ Access performance when reading data from the hierarchical structure of the database is good.

◉ User interfaces are easy to understand.

◉ A logical database first runs check functions, which verify that user input is complete, correct, and plausible.

Disadvantages Of Logical Database:


This section shows the disadvantages of the logical database:

◉ A logical database takes more time when the required data is at the lowest level of the hierarchy: before a table at the lowest level can be read, all upper-level tables must be read first, which slows down performance.

◉ The ENDGET command doesn’t exist in a logical database, so the code block associated with an event ends with the next event statement.

Source: geeksforgeeks.org

Monday, October 18, 2021

Why should you use graph analytics?

Data is growing exponentially, and rising automation is generating a plethora of data from smartphones, mobile and IoT devices, security systems, satellite imagery, vehicles and more. The question becomes: how do you rapidly obtain meaningful insights from ever-growing data sets across different types and sources?

Graph technology makes it easy to explore relationships and discover connections in data, allowing developers and analysts to gain meaningful insights quickly. Much of the world's data is indeed connected, including financial transactions, social and professional networks of people, manufacturing supply chains and more. Graphs instantly reveal those connections.

Here is an illustration showing how data sets are connected and how complex analysis can become due to data relationships:

[Figure: Connected data sets and the relationships between them]

Graph data platforms provide automation to increase developer productivity. They deliver the performance and scalability to support large deployments, while enhanced query and search capabilities simplify access and accelerate time to insights for connected-data use cases.

Oracle makes it easy to adopt graph technologies. Graphs are part of Oracle's converged database, which supports multi-model, multi-workload, and multi-tenant requirements, all in a single database engine. Oracle Database and Oracle Autonomous Database allow analysts to rapidly discover new insights using the power of built-in graph algorithms, pattern matching queries, and visualization. Developers can easily add graph analytics to existing applications, taking advantage of the performance, scalability, reliability, and security provided by Oracle Database.

Oracle recognized as a leader in graph data platforms



Forrester considered 27 criteria to evaluate graph data platforms. Some of the key criteria include graph model/engine, deployment options, cloud, app development, API/extensibility, data loading/ingestion, data management, transactions, queries/search, analytics, visualization, high availability and disaster recovery, scalability, performance, data security, workloads, and use cases.

[Figure: Forrester evaluation of graph data platforms]

Why graph technologies from Oracle?


Key features of the Oracle offering include:

1. Complete graph database with support for both property graphs and RDF knowledge graphs. Oracle Database simplifies the process of modeling relational data as graph structures. It automates the discovery, processing, and visualization of connected data sets on-premises or in the cloud.

2. Enterprise-level scalability and security. Interactive graph queries can run directly on graph data or in a high-performance in-memory graph server, supporting millions of concurrent users and queries per second. Customers gain fine-grained security, high availability, easy manageability, and integration with other data in business applications. Oracle provides sophisticated, multilevel access control for property graph vertices and edges, and for RDF triples. Oracle also aligns with applicable ISO and World Wide Web Consortium standards for representing and defining graphs and graph query languages.

3. Comprehensive graph analytics to explore relationships with more than 60 prebuilt algorithms. Analysts can use SQL, native graph languages, and Java and Python APIs, as well as Oracle Autonomous Database features, to create, query, and analyze graphs. They can easily display connections in data to discover insights, and then use interactive analytics and visualization tools to publish and share analysis results. With Graph Studio in Autonomous Database, almost anyone can get started with graphs to explore relationships in data. Graph Studio automates graph data management and simplifies modeling, analysis, and visualization across the graph analytics lifecycle.

Graph analytics use cases

 
Money laundering detection in financial services: To make fraud detection simpler, users can create a graph from transactions between entities as well as entities that share some information, such as email addresses, passwords, and more. Once a graph is created, running a simple query will find all customers with accounts who have similar information, reveal which accounts are involved in circular payment schemes, and identify patterns used to perpetuate fraud.
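
For instance, a circular payment pattern can be expressed as a path query that returns to its starting account. The following is a hedged sketch in PGQL, Oracle's property graph query language; the graph name, labels, and properties are hypothetical:

SELECT a.account_no, b.account_no, c.account_no
FROM MATCH (a:Account)-[:TRANSFER]->(b:Account)-[:TRANSFER]->(c:Account)-[:TRANSFER]->(a)
ON bank_graph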
 
Traceability in manufacturing: Traceability is of great significance in the manufacturing world. An automobile company might have to issue a recall for a car model because that specific model has a component that was produced in a particular factory during a limited time slot. Most companies have a production database, a separate retail database, a separate sales database, and a separate shipping database. It is complicated to discover all the relevant information to find the cars with the problem, where they were shipped, and to whom they were sold, unless the company has a graph database to connect all the relationships, and graph algorithms to highlight connections and relevant information.

Criminal investigation: Putting data into graphs provides a natural and efficient way to identify criminal networks and look for patterns. Applying graph-based algorithms makes it easier to identify specific locations, highlight co-traveler relationships, or discover key suspects and criminal gangs. For example, by applying betweenness centrality, users can find the "weakest link," meaning the vertex that the graph relies upon. If you remove that vertex, the entire graph may fall apart, meaning you may have just found the linchpin of a criminal gang.

Data regulation and privacy: Tracking data lineage is a perfect match for a graph. The various steps in the data lifecycle can be tracked and navigated, vertex by vertex, by following the edges. With graph, it becomes possible to follow a path and see where the information originally resided, where it was copied, and where it was utilized. With all this information laid out in a graph, it becomes simpler for data professionals to determine how to fulfill GDPR requests and remain compliant.

Product recommendations in marketing: Graph databases collect all data and form connections to gain speedy insight into customer needs and product trends to provide real-time recommendations. Many large corporations rely upon graph analytics to provide product recommendations because the relationships are already laid out, and the analysis of these relationships to provide recommendations is very fast. Additionally, graph analysis can identify the patterns that reveal trolls, bots, artificially promoted reviews, and information that may distort marketing analysis.

Source: oracle.com

Friday, October 15, 2021

Using Expressions in Initialization Parameters in Oracle Database 21c

Oracle Database 21c introduced the ability to use expressions to set initialization parameters. These expressions can reference other parameters and environment variables.

1. Referencing Parameters

We check the values of the JOB_QUEUE_PROCESSES and PROCESSES parameters. We see the values are 80 and 400 respectively.

SQL> show parameter processes

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
aq_tm_processes                      integer     1
db_writer_processes                  integer     1
gcs_server_processes                 integer     0
global_txn_processes                 integer     1
job_queue_processes                  integer     80
log_archive_max_processes            integer     4
processes                            integer     400

SQL>

We set the JOB_QUEUE_PROCESSES parameter to 1/10 of the PROCESSES parameter. In this case we use the MAX function to make sure the JOB_QUEUE_PROCESSES parameter value never drops below 10.

SQL> alter system set job_queue_processes='max(processes/10,10)';

System altered.

SQL>

We check the JOB_QUEUE_PROCESSES parameter again, and we see it has been set to the correct value.

SQL> show parameter job_queue_processes

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
job_queue_processes                  integer     40

SQL>

We create a parameter file based on the current spfile.

SQL> create pfile='/tmp/pfile.txt' from spfile;

File created.

SQL>

We check the setting of the JOB_QUEUE_PROCESSES parameter in the resulting pfile.

SQL> host fgrep job_queue_processes /tmp/pfile.txt

*.job_queue_processes=max(processes/10,10)

SQL>

So this has not only set the correct JOB_QUEUE_PROCESSES parameter value, but maintained the relationship to the PROCESSES parameter in the parameter definition.

As a result, if we alter the PROCESSES parameter value, we will also be altering the JOB_QUEUE_PROCESSES parameter value.

SQL> alter system set processes=600 scope=spfile;

System altered.

SQL> shutdown immediate;

Database closed.

Database dismounted.

ORACLE instance shut down.

SQL> startup;

ORACLE instance started.

Total System Global Area 1.4496E+10 bytes

Fixed Size                  9702824 bytes

Variable Size            2147483648 bytes

Database Buffers         1.2314E+10 bytes

Redo Buffers               23851008 bytes

Database mounted.

Database opened.

SQL> show parameter job_queue_processes

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
job_queue_processes                  integer     60

SQL>

2. Referencing Environment Variables

We check the current setting of the ORACLE_BASE environment variable.

SQL> host echo $ORACLE_BASE

/u01/app/oracle

SQL>

We check the current value of the AUDIT_FILE_DEST parameter.

SQL> show parameter audit_file_dest

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
audit_file_dest                      string      /u01/app/oracle/admin/cdb1_lhr
                                                 12p/adump

SQL>

We replace the path with one containing the ORACLE_BASE environment variable.

SQL> alter system set audit_file_dest='$ORACLE_BASE/admin/cdb1_lhr12p/adump' scope=spfile;

System altered.

SQL>

We create a parameter file based on the current spfile.

SQL> create pfile='/tmp/pfile.txt' from spfile;

File created.

SQL>

We check the setting of the AUDIT_FILE_DEST parameter in the resulting pfile.

SQL> host fgrep audit_file_dest /tmp/pfile.txt

*.audit_file_dest='$ORACLE_BASE/admin/cdb1_lhr12p/adump'

SQL>

So now the audit location is based on the value of the ORACLE_BASE environment variable, as it was set at instance startup.

We need to make sure any required environment variables are set before startup time, or the instance will not start. As an example, we unset the ORACLE_BASE environment variable value.

$ unset ORACLE_BASE

$ echo $ORACLE_BASE;

$

We shutdown the instance, and when we attempt to start it we get an error.

SQL> shutdown immediate;

Database closed.

Database dismounted.

ORACLE instance shut down.

SQL> startup;

ORA-07217: sltln: environment variable cannot be evaluated.

SQL>

We exit SQL*Plus and set the environment variable again.

$ export ORACLE_BASE=/u01/app/oracle

Now the instance starts as expected.

SQL> startup;

ORACLE instance started.

Total System Global Area 1.4496E+10 bytes

Fixed Size                  9702624 bytes

Variable Size            2147483648 bytes

Database Buffers         1.2314E+10 bytes

Redo Buffers               23851008 bytes

Database mounted.

Database opened.

SQL>

3. Considerations

Some things to consider when using expressions.

◉ When issued from the ALTER SYSTEM or ALTER SESSION commands, the expression must be enclosed in single quotes.

◉ When setting parameters we have access to the MIN and MAX functions, both of which accept two values. The MIN function returns the lower of the two values, and so may be useful in defining a maximum value for a parameter. The MAX function returns the higher of the two values, and so may be useful in defining a minimum value for a parameter. These aren't to be confused with the SQL functions of the same name.

◉ Environment variables must be set before instance startup, and their values are read at that point only.

◉ Expressions can be used in a spfile or a pfile.

Source: oracle-base.com

Wednesday, October 13, 2021

How to implement self-service data analytics for finance teams

No longer is it enough for finance teams to report financial results and streamline processes; in fact, 85% of respondents to an ESG survey believe it is imperative for the finance organization to transform from reporting on “what” is happening in the business to “why” things are happening. Finance leaders are expected to answer new questions from executives every day, and to guide business strategy. It is therefore not surprising that, according to Gartner, the #1 priority of CFOs is advanced data analytics technologies.

While Deloitte notes that “CFOs are in a unique position to become the chief analytics officers as finance gains a bigger influence in driving the company’s strategy”, CFOs spend more time sifting through spreadsheets than doing anything else—an average of 2.24 hours per day. Finance teams face significant challenges combining ever-growing data sets across different formats and sources into a single source of truth to provide actionable insights to other departments.

For IT teams, those needs translate into an increasing number of complex, time consuming demands from financial analysts.

How can IT enable finance teams to get the deep, trustworthy, data-driven insights they need to make quick decisions—while ensuring data governance and security, and saving significant time and effort?

Real-time insights to drive innovation and growth while reducing costs

Let’s consider the following customer stories highlighting how, with Oracle’s solution, finance teams have been able to dramatically improve their data analytics processes while significantly reducing costs.

Dou Yue’s 30 restaurants across China are committed to serving traditional Chinese cuisine on premises and for takeout. Multiple isolated legacy data platforms prevented the company from gaining the comprehensive, real-time insights it needed. If Dou Yue executives wanted to get a view of the entire business, they had to export spreadsheets from various systems to then manually aggregate and calculate results, a process that was outdated and error prone.

By deploying Oracle Autonomous Data Warehouse and Oracle Analytics Cloud on Oracle Cloud Infrastructure, Dou Yue integrated the data from multiple business systems onto a single, cloud-based platform. The financial team can now pull revenue, inventory, and other data—by restaurant—for company executives to act on. For example, when a restaurant posts a revenue decline, Dou Yue executives can analyze the environmental conditions (traffic, weather), sales model (dine-in or takeout), as well as menu and pricing in the region where the restaurant is located to determine the root causes and adjust strategy. By analyzing historical trends, repeat-consumption, and other data, Dou Yue is now able to understand which dishes customers like and adjust them—or create new ones—in a timely manner. They can also determine which commercial buildings tend to order the most take-out to inform targeted marketing campaigns and where to locate future restaurants. In the past, the company required two or three full-time staffers to spend a few hours to manually produce reports. With Oracle’s solution, reports are generated with a simple click, dramatically improving business decision-making while reducing labor costs.

At Data Intensity, the finance team was spending 60% of its time just getting data out of the systems. With Autonomous Data Warehouse, they can now run 200 reports in seconds. In a testament to the value of the solution, 10X more users are now accessing the system, driving value. Additionally, the company reduced costs by 30%. “Our CFO is delighted; he could do what he could not do before: sit in a board meeting and get the data then and there, at his fingertips,” said James Anthony, CTO.

Lyft, the transportation network, was busy reimagining the future of transportation. Behind the scenes though, the company had gone from a high-growth start-up to a publicly traded enterprise processing billions of transactions a year—and its finance systems hadn’t kept up. Lyft turned to Oracle for its integrated Oracle Fusion applications and data analytics solution. Jay Weiland, Director of Financial Solutions at Lyft said “When I have a process running in the middle of the night, I can’t tell you the exact minute it’s going to stop. With the autoscaler in Autonomous Data Warehouse, it scales the number of CPUs automatically, so I don’t have to pay for idle hours. That’s very attractive to my finance team.”

A complete, self-service data analytics solution

Oracle delivers a complete, self-service data analytics solution empowering finance teams to rapidly get the deep, trustworthy, data-driven insights they need to make quick decisions.

The architecture of the solution is represented below:

[Figure: Architecture of the self-service data analytics solution]

Data from all sources and formats can be combined in Autonomous Data Warehouse to drive secure collaboration around a single source of truth. Autonomous Data Warehouse intelligently automates provisioning, configuring, securing, patching, backing up, performance tuning, and repairing of a data warehouse. This reduces administration effort by up to 90%, enabling finance teams to operate independently while freeing up valuable resources for IT teams.

It is the only cloud data warehouse that is autonomous, self-service, and complete, providing finance teams with a comprehensive suite of built-in tools:

◉ Data tools enable self-service drag-and-drop data loading, data transformation, and business modelling. Financial analysts can automatically discover insights with machine learning algorithms—no coding required—saving significant time and effort.

◉ Built-in graph analytics enables financial analysts to visualize relationships and connections between data entities. They can, for example, instantly see all costs and headcount associated with a given project, or understand all dependencies associated with a given supplier to best manage supplier relationships.

◉ With built-in spatial analytics, they can rapidly answer financial questions such as “where did bad weather impact revenue?”, or “where are our most profitable customers?”

◉ Financial analysts can build machine learning models—with a no-code interface—to predict likely financial outcomes, e.g. customers likely to default on payment, transactions likely to be fraudulent, or expected revenue based on forecasts and historical patterns.

◉ With the built-in Oracle APEX low-code development platform, finance teams can quickly develop applications for ad hoc needs and gaps/processes handled outside of their ERP—without having to join a queue of IT projects. Such applications can include ad hoc data rooms for acquisitions, tracking the progress of digital transformation initiatives, or COVID-19 related applications.
 
“It's like the iOS of the enterprise cloud data warehouse space.” 

Patrick Moorhead - Founder, President, & Principal Analyst at Moor Insights & Strategy

Oracle Analytics Cloud is connected to Autonomous Data Warehouse, empowering business users and executives with modern, AI-powered, self-service analytics capabilities for data preparation, visualization, enterprise reporting, augmented analysis, and natural language processing/generation. Alternatively, Autonomous Data Warehouse is certified with all popular analytics tools including Tableau, Looker, and Microsoft Power BI, ensuring freedom of choice for customers.

The governed, secure solution allows IT teams to reduce risks. They can additionally rely on a simple, reliable, and repeatable approach for all data analytics requests from finance teams.

Beyond their core financial responsibilities, finance leaders are in a unique position to guide business strategy and help other departments achieve their goals. Having the ability to rapidly and independently turn a growing mountain of data into insights is essential to achieving these objectives. With Oracle’s complete, self-service data analytics solution powered by Autonomous Data Warehouse, IT can enable finance teams to efficiently take on the leadership role that is increasingly expected of them—while reducing risks and costs and increasing both IT and analyst productivity.

Source: oracle.com

Monday, October 11, 2021

JSON Type and other cool new JSON features in Oracle database release 21c

Oracle Database release 21c offers many new and major JSON enhancements and improvements. This blog will give you an overview, with screenshots and examples that you can try out immediately in an Always Free Autonomous Database such as the new Autonomous JSON Database (AJD).

Intro on how to run the examples

To deploy an Always Free Autonomous JSON Database, follow the steps described by the screenshots. You can pick an Autonomous JSON or Autonomous Transaction Processing database. Note: as of now, Always Free Autonomous JSON is only available in the four home regions Phoenix, Ashburn, Frankfurt, and London. Pick Autonomous Transaction Processing if you're in a different region.

[Screenshot: Oracle Cloud console, Autonomous Database page]

Then click on 'Create Autonomous Database'. On the next page (screenshot below), give your service a name, pick JSON (or Transaction Processing), make sure to select 'Always Free' and '21c', and choose your password (and remember it); for the rest, keep the defaults.

[Screenshot: 'Create Autonomous Database' configuration page]

After some time (minutes) your database is ready. Go to 'Tools', select 'Open Database Actions', and log in with 'ADMIN' and the password you chose. Click on the 'SQL' tile; now you can run the following examples:

SQL data type 'JSON'

The most important and visible extension is likely the new data type 'JSON' (another sign of Oracle's overall commitment to JSON). This new data type not only allows the user to declare table and view columns and PL/SQL arguments as JSON, but also yields performance improvements, as the data gets encoded into a binary format. It took us a while to find the best format; we looked at all existing binary formats like BSON, AVRO, Protobufs, etc., but found none to be a good match for a database: some require serial reads (not allowing a 'jump' to a selected value), some are inefficient on updates (requiring too much data to be written), and some require schema management, nullifying some benefits of JSON. So we developed a new format, OSON, which not only allows fast (hash- and offset-based) navigation to relevant values but also minimizes the amount of data that needs to be written when performing piece-wise updates (e.g. changing a field value). This greatly improves update performance, as less data needs to be written to disk or undo/replication logs.

'JSON' can be used like any other data type. If you insert textual JSON into a JSON type column, the encoding to the binary format happens implicitly; there is no need to call a JSON constructor (although that is still possible, see below). Any SQL/JSON operator like JSON_VALUE, JSON_QUERY, JSON_EXISTS, JSON_TABLE, JSON_DATAGUIDE, etc. accepts a JSON type as input. The simple dot notation can also be used; please note that a trailing item method such as number() or string() is needed to convert the selected JSON value to a SQL scalar value. Without such a trailing function, the simple dot notation returns a JSON type instead. The function JSON_SERIALIZE converts a JSON type instance to a JSON string (serialization); it allows optional pretty printing, which makes JSON data easier to read.

Examples:

create table customers(id number, created timestamp, cdata JSON);

insert into customers values (1, systimestamp, '{"name":"Atif", "address":{"city":"San Mateo"}}');

desc customers;

Name    Null? Type         
------- ----- ------------ 
ID            NUMBER       
CREATED       TIMESTAMP(6) 
CDATA         JSON 

-- simple dot notation
select c.cdata.address.city.string() from customers c;

San Mateo

-- SQL/JSON operator
select JSON_VALUE(cdata, '$.address.city') from customers;

San Mateo

-- Convert the JSON type to a JSON string (binary to text conversion)
select JSON_Serialize(c.cdata pretty) from customers c;
 
{
  "name" : "Atif",
  "address" :
  {
    "city" : "San Mateo"
  }
}

The JSON type can also be used directly as an argument type in PL/SQL:

create or replace function upgradeCustomer(customer JSON) return JSON as
  obj JSON_Object_T;
begin
  obj := JSON_Object_T(customer);
  if obj.has('status') then
    obj.put('status','platinum with pink sprinkles');
  else
    obj.put('status','silver');
  end if;
  return obj.to_JSON;
end;
/

select JSON_Serialize(upgradeCustomer(cdata))
from customers c
where c.cdata.name.string() = 'Atif';

{"name":"Atif","address":{"city":"San Mateo"},"status":"silver"}

Using the 'returning clause' it is also possible to generate a JSON type result (the example uses SCOTT.EMP). Please note that a JSON type is printed as [object Object] in the SQL Developer Web interface.

select JSON_OBJECT(* returning JSON) from emp;


Note: The SQL type JSON is also supported in JDBC so that applications can directly work with the binary format: package oracle.sql.json, JDBC/JSON examples.

Extended data types


Everyone sooner or later finds out that JSON lacks important types like 'date' or 'timestamp'. Sure, it is possible to convert such values to strings (hopefully ISO 8601 formatted), but from a strict point of view the type information is lost. Therefore, the new Oracle JSON type supports additional data types like date, timestamp, double, float, raw, etc. An application (e.g. Java with a JDBC driver supporting the JSON type) can therefore map programming language values to JSON without any lossy conversion. Also, when generating JSON type data from relational tables, the columns' data types are preserved. No more "this string is really a timestamp"!

You can find out the type of a JSON value using the type() method at the end of a path expression. The following example uses a WITH clause to create a new JSON object from all columns in the 'customers' table. In the subsequent select clause we extract the 'CREATED' field, but instead of its value we select its type using the type() method. As you can see, the JSON object created in the WITH clause subquery preserved the fact that this is a timestamp field.

Example: 

with subQuery as (
   select JSON_OBJECT(* returning JSON) jdata
   from customers)
select s.jdata.CREATED.type()
from subQuery s;

timestamp
 
JSON_SERIALIZE converts a JSON type instance to textual JSON. It allows pretty printing to make the output easier to read. An extended data type needs to undergo conversion at this point: numerical values (integer, float, double) are converted to JSON numbers, and temporal values like date or timestamp are converted to ISO 8601 formatted strings.

Example: the WITH clause is the same, but now we convert the generated JSON object to a string; the CREATED field becomes a JSON string.

with subQuery as (
   select JSON_OBJECT(* returning JSON) jdata
   from customers)
select JSON_SERIALIZE(jdata)
from subQuery s;

{"ID":1,"CREATED":"2021-03-03T14:58:31.465300","CDATA":{"name":"Atif","address":{"city":"San Mateo"}}}

JSON Transform


We already offered multiple ways to update JSON documents, starting from trivial full-document replacement, to JSON_MERGEPATCH or the PL/SQL JSON API. We have added an additional operator, JSON_TRANSFORM, that uses the same path expressions, filter expressions, etc. as the existing SQL/JSON operators JSON_VALUE or JSON_TABLE. JSON_TRANSFORM allows the user to perform multiple modifying operations, like setting or removing field values or appending to an array. The operations are executed in the order they are provided by the user. Optional handlers specify what to do if a field already exists (error, ignore, replace) or is missing in a 'remove' operation. A special operation is 'KEEP', which accepts a list of path expressions to be preserved in the JSON, with all others being removed. This allows, for example, shipping only a subset of the data to a client if JSON_TRANSFORM is used in the SELECT (not updating the data on disk but creating a transient output). With optional handlers the user can also specify what to do if a value is missing: the options are to raise an error, ignore it, or create it (see APPEND in the next example).

Example

update customers
set cdata = JSON_Transform (cdata, SET '$.lastUpdated' = SYSTIMESTAMP,
                                   SET '$.address.zip'    = 94402,
                                   RENAME '$.name'     = 'firstName',
                                   APPEND '$.friends'  = 'Beethoven' CREATE ON MISSING) 
where id = 1;

select JSON_Serialize(cdata PRETTY) from customers;
{
    "address": {
        "city": "San Mateo",
        "zip": 94402
    },
    "lastUpdated": "2021-03-03T16:49:15.479563Z",
    "firstName": "Atif",
    "friends": ["Beethoven"]

Multi-Value index for JSON arrays


JSON arrays have always been a bit 'odd' in a relational database, as they highlight the differences between the data models: JSON is a hierarchical (tree) data model where entities are embedded in each other (using nested arrays or objects), whereas tables are flat, with just one value per column, and entities are connected by relationships (the same column value appearing in different tables). Function-based indexes used to be limited to one value per column and therefore could not support JSON arrays (we had good 'workarounds' like the JSON search index, materialized views, etc.). But now in 21c we have added a true multi-value functional index, allowing specific values in JSON arrays to be indexed.

Example:

--insert a new customer with multiple addresses

insert into customers values (2, systimestamp,
      '{"firstName":"Rodrigo",
        "address":[{"type":"home","city":"Sunnyvale","zip":94085},
                   {"type":"work","city":"Redwood Shores","zip":94065}]}');

create multivalue index cust_zip_idx
on customers c(c.cdata.address.zip.number());

Multivalue INDEX created.

-- this will use the multi-value index (see plan)
select JSON_serialize(cdata) 
from customers c 
where JSON_Exists(cdata, '$.address?(@.zip == 94065)');

[Screenshot: execution plan showing the multi-value index being used]

-- Note to always use JSON_EXISTS to use the multi-value index.
-- JSON_Value in the WHERE clause will not pick up the index because it is
-- limited to a single value. The following will therefore not work

select JSON_serialize(cdata) 
from customers c
where JSON_Value(cdata, '$.address.zip' ERROR ON ERROR) = 94065;

ORA-40470: JSON_VALUE evaluated to multiple values
 

Simplified JSON constructor syntax


A simpler syntax for generating JSON is now provided, making queries shorter and more readable. These are shortcuts for the SQL/JSON operators JSON_Object and JSON_Array.

Examples:

select JSON{*} from emp;

select JSON[1,2,'cat'] from emp;

are equivalent to 

select JSON_Object(*) from emp;
select JSON_Array(1,2,'cat') from emp;

Note: In SQL Developer Web the above syntax is not supported yet. As a temporary workaround you need to apply JDBC escaping as follows: select {\ JSON_Serialize(JSON {*} ) \} from dual;

Trailing item methods in path expressions 


Everyone who has used the SQL/JSON functions JSON_VALUE or JSON_TABLE has encountered JSON path expressions like '$.customer.address.city' to navigate inside the JSON data. Less well known is that these path expressions can end with a method like number() or count(). These methods perform an operation on the selected value(s), for example converting it to another type, getting its type, or performing a filter or aggregate. Item methods were already present in previous releases, but 21c adds a few more.

Examples:

with jtab as (select JSON_Array(1,2,3,4,5,6) jcol from dual)
select jtab.jcol[*].maxNumber() from  jtab jtab;

6

with jtab as (select JSON_Array(1,2,3,4,5,6) jcol from dual)
select jtab.jcol[*].sum() from  jtab jtab;

21

with jtab as (select JSON_Array(1,2,3,4,5,6) jcol from dual)
select jtab.jcol[*].avg() from  jtab jtab;

3.5

Trailing functions are also relevant when working with (scalar) JSON types, because these are not directly comparable, indexable, or usable in GROUP BY/ORDER BY expressions. For these cases a trailing item method is used to 'tell the database' how the scalar JSON value is to be treated.

Example:

-- the following 2 queries raise an error as
-- JSON type instances are not directly comparable

select count(1) from customers c
where c.cdata.address.zip > 12345;
ORA-40796: invalid comparison operation involving JSON type value

select count(1) from customers c
group by c.cdata.address.zip;
ORA-40796: invalid comparison operation involving JSON type value

-- works with trailing item method to convert JSON type
select count(1)
from customers c 
where c.cdata.address.zip.number() > 95033
group by c.cdata.address.zip.number();

Support of new JSON RFC (allowing scalar values)


At the time our initial JSON database release was developed, the then-current JSON RFC (4627) prescribed that a JSON document is either an object or an array. Since then this RFC has been obsoleted multiple times (the current RFC is 8259). Most importantly, scalar values are now also considered JSON. This means a JSON document can now consist of just one scalar value like 1, false or "dog".

Hence, these values now need to pass the IS JSON operator. We have therefore extended the IS JSON SQL operator to conditionally accept or reject scalar values. Note that once the database compatibility parameter is set to 21, the default is to accept scalar values. This way we are compliant with the current JSON RFC 8259.

Examples:

create table whatsJson (val varchar2(30));
insert into whatsJson values('1');
insert into whatsJson values('true');
insert into whatsJson values('"true"');
insert into whatsJson values('null');
insert into whatsJson values('dog');
insert into whatsJson values('"cat"');
insert into whatsJson values('{}');
insert into whatsJson values('[]');
-- new RFC
select val from whatsJson where val is json;
1
"true"
true
null
"cat"
{}
[]

select val from whatsJson where val is NOT json;
dog

-- enforce old  RFC: 'disallow scalars'
select val from whatsJson where val is json (DISALLOW SCALARS);
{}
[]
 

Programming Language (Client) Drivers:


To get the most out of the JSON type we strongly recommend you use the latest client drivers, which understand the new SQL data type 'JSON'. Older (pre-21c) drivers will work, but the JSON type will be converted automatically to textual JSON (so you lose the advanced type information for timestamps, dates, etc.). It is therefore highly recommended to upgrade to a 21c client driver to get both JSON type functionality and the best performance.

Source: oracle.com