Friday, June 30, 2023

Oracle Announces Oracle Database for Arm Architectures in the Cloud & On-Premises


Oracle today announced that Oracle Database 19c Enterprise Edition, the current long-term support release of Oracle Database, is now certified and available on the popular Arm architecture for both cloud and on-premises deployments. Specifically, you can now subscribe to Oracle Database Service on Oracle Cloud Infrastructure (OCI) using Ampere Altra processor (OCI Ampere A1) compute instances and you can run Oracle Database 19c on Ampere Arm-based servers of choice—with both options providing highly economical price points. Oracle Database customers are building increasingly complex applications using more data—as well as AI/ML and JSON documents with greater interactivity and more database processing requirements than ever—so the timing of this Arm certification is ideal for organizations worldwide.

The eco-friendly and energy-efficient architecture of Arm technology, and specifically Ampere processors, has made it popular in hyperscale cloud data centers, and more than 180 billion processors have shipped in mobile phones, IoT sensors, and other smart devices to date. Ampere processors bring these cost-effective properties to the enterprise, enabling you to develop and operate increasingly complex workloads and modern applications.

The great news is that you can now run your Oracle Database 19c workloads with predictable performance at a lower cost, specifically with the Ampere Altra processor family—including AmpereOne processors—in both cloud and on-premises environments. These processors provide a single thread per core architecture, enabling organizations to run workloads with consistent and predictable performance while achieving excellent performance scaling. The cores are wholly isolated from the “noisy neighbor” impact of other hardware threads running on the same core. This is a benefit both for scale-up workloads that require a very high core count and for scale-out workloads that benefit from multiple instances of smaller VM shapes. Predictable performance also means a more predictable invoice at the end of the month. Lower power consumption results from a CPU architecture designed with sustainability as a core design priority.

“Today’s announcement highlights the broad architectural shift across the market to Ampere processors that meet the demands of both modern cloud and on-premises environments,” said Jeff Wittich, Chief Product Officer of Ampere. “With the Ampere Altra family of processors, customers of the world’s most popular database—Oracle Database—now have a high-performance, energy efficient architecture built with sustainability in mind for organizations of all sizes.”

Oracle’s converged database improves both developer and operational productivity by supporting any data, workload type, and development style. Oracle Database greatly simplifies application development, data integration, and database management by enabling you to use a single industry-proven, enterprise-grade database that can be deployed anywhere instead of requiring a different single-purpose database for each data type and workload—along with the learning curves, associated management, and individual bills for each one. Full compatibility with Oracle Database deployments in the cloud and on-premises allows you to develop applications once and run them wherever they are needed.

Oracle Database Service on OCI—Extremes of Energy Efficiency and Cloud Economics


Oracle Database Service offers flexible VM shapes of AMD E4, Intel X9, and now Ampere A1. Using Oracle Database Service, organizations greatly reduce management effort by eliminating infrastructure administration and automating database provisioning, patching, backup, and disaster recovery with cloud automation tools. User-controlled cloud automation increases both administrator and developer productivity.

The OCI Ampere A1 database shape is based on the Ampere Altra processor with single-threaded cores, which means no sharing of the execution engine, registers, and L1/L2 cache between threads. Combined with OCI’s high-performance architecture and non-blocking networks, Ampere Altra processors deliver predictable performance. These processors are now available in a flexible VM shape to right-size your workload from 1 to 57 OCPUs, with 8 GB of memory per OCPU (up to a total of 456 GB) and 1 Gbps of network bandwidth per OCPU (up to a total of 40 Gbps).

Customers can select from several Oracle Database Service licensing options, including Enterprise Edition, High Performance, and Extreme Performance. Plus, you can start for free with our Free Tier, with additional free resources available for qualified developers, ISVs, and universities from the Oracle Arm Accelerator Program.

You can leverage the already available Ampere A1 compute instances in OCI for end-to-end Arm-based application development and deployment. Additionally, Oracle APEX is a no-cost add-on to Oracle Database Service. It provides a comprehensive low-code platform to build and deploy powerful, responsive, and scalable web apps for both mobile devices and desktops. Using APEX and the Ampere A1 flexible database shape, which is 100% compatible with on-premises Enterprise Edition environments, developers can quickly create data-centric applications to visualize and manage relational, JSON, and spatial data.

Oracle is the only hyperscale cloud provider that delivers the same operating system—Oracle Linux—that we use in our cloud as the hypervisor/host and for every single service that we build and deploy. No other cloud vendor does that.

Arm Support for On-Premises Oracle Database Environments


With today’s announcement, you can now run the highest-rated database on the market on the cost-effective and scalable Arm-based platform, with specific advantages available for Ampere Altra customers. In fact, Oracle Database 19c costs half as much when running on Ampere Altra processors due to the low Core Processor Licensing Factor of 0.25.

Migrating databases to Arm is fast and simple: Oracle databases running on existing platforms can use Oracle Recovery Manager (RMAN) to back up databases on their existing platform and restore to the Arm platform. All supported features of Oracle Database 19c are available on the Arm platform as well.
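As a sketch of that flow (connection details, the backup location, and the tag are illustrative, and platform prechecks such as verifying matching endianness are omitted), the RMAN session on each side might look like:

```
# On the existing source platform:
RMAN> CONNECT TARGET /
RMAN> BACKUP DATABASE TAG 'ARM_MIG' FORMAT '/backup/db_%U';

# On the Arm platform, after making the backup pieces and a matching
# parameter file/control file available to the new instance:
RMAN> CONNECT TARGET /
RMAN> RESTORE DATABASE;
RMAN> RECOVER DATABASE;
```

Because both source and target here are little-endian Linux platforms, a standard backup and restore is sufficient; no cross-endian conversion step is needed.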

Enabling Everyone to Have Access to Arm


OCI Arm Accelerator is a 365-day Free Trial that offers free credits to open source developers, research universities, industry partners, and Oracle customers to run their workloads on Arm and contribute to the Arm ecosystem. The free credits can be used towards the Oracle Database Service on OCI Ampere A1, OCI Ampere A1 Compute, and other OCI services for the duration of the trial or until all the credits are used—whichever comes first.

Our goal is to offer enough Arm resources to develop your first application, test Arm at real-world scale, and even build or port a production application. Just sign up here, and you’ll get access, along with more than 20 great Always Free services. To scale your Arm development projects, the Arm Accelerator provides qualified open source developers, researchers, and industry partners with up to $3,000 in credits valid for 12 months to help expand the Arm ecosystem and solve the next challenging scientific problem faster.

Source: oracle.com

Wednesday, June 28, 2023

US Treasury Department Affirms What Leading Exadata Cloud@Customer Financial Services Customers Already Know!


Recently the US Department of the Treasury published a paper titled “The Financial Services Sector’s Adoption of Cloud Services.” The paper raised some potential concerns with banks embracing the cloud. It raises some valid points, but mostly it re-affirms what users of Oracle Exadata Cloud@Customer already know: Cloud@Customer is a better model than pure cloud for addressing the cloud requirements of financial services companies. If you look at the six key issues raised in the paper, most are mitigated by the Cloud@Customer model. The proof is the major global customers—including both large and small banks—who have adopted Exadata Cloud@Customer to leverage the benefits of cloud while addressing the regulatory requirements associated with their business.

Let’s take a closer look at the issues raised in the paper:

1. Insufficient transparency to support due diligence and monitoring by financial institutions.


Oracle has worked closely with its financial services partners and provided detailed information about the architecture and operations of the system. In addition, we have put a strong focus on improving the notifications to customers regarding various incidents that may affect the health of their system. Cloud users can subscribe to various incident events and receive automated notifications when incidents occur. Customers can optionally share those notifications with Oracle Cloud Operations to speed resolution of issues detected by cloud incident monitoring. Customers who have enabled Operator Access Control, a feature developed specifically to meet requirements provided by Oracle’s leading financial services customers, can also monitor, in near real time, the commands and keystrokes of Oracle operators accessing the Cloud@Customer infrastructure and terminate operator access if they detect abnormal activity.

2. Gaps in human capital and tools to securely deploy cloud services.


Oracle Database in the cloud is the same as Oracle Database on-premises, and many Oracle users already have the talent and tools to work with Oracle databases. Not all of those tools and skills are needed in the cloud, as we have provided additional cloud-specific automation and monitoring to simplify end-user operations and reduce dependency on staff with specific skills or knowledge. Unlike database services that are locked to a specific cloud, Oracle users can easily migrate Oracle databases to and from the cloud with near zero downtime. It is unnecessary to keep things on-premises, forgoing the benefits of cloud, out of concern that there will be a skills gap.

3. Exposure to potential operational incidents, including those originating at a Cloud Service Provider (CSP).


Although Exadata Cloud@Customer is designed to run connected, disconnecting Cloud@Customer from the Oracle Cloud will not affect the databases and applications using the system. If the Oracle Cloud suffers an outage, databases running in customer data centers on Exadata Cloud@Customer will continue to be fully accessible.

4. Potential impact of market concentration in cloud service offerings on the financial sector’s resilience.


This is a valid concern but illustrates why many influential cloud users are resisting putting all their applications in a single cloud, under the control of a single hyperscale cloud provider. Oracle’s approach to cloud is to give users choice, by partnering with other CSPs and providing features and capabilities that make multicloud solutions spanning multiple CSPs easy to implement and operate. The most common example of this is Oracle Database for Microsoft Azure, but Exadata Cloud@Customer can play a similar role. Some of the largest banks are using Cloud@Customer to co-locate their critical database services in facilities where they can be easily accessed by applications running in other third-party clouds.

5. Dynamics in contract negotiations given market concentration.


Another good point, which is why we believe the market will embrace multicloud solutions, eventually forcing CSPs who have built walled gardens to better support interoperability with other CSPs. This open, interoperable approach will serve to blunt vendor power and level the playing field for all customers. Oracle strives to compete by best meeting customer requirements. As we’ve developed new and innovative features to meet the requirements of our larger customers, we’ve incorporated them into our services, making them available to all.

6. International landscape and regulatory fragmentation.


Oracle’s strategy is different from other CSPs. We are providing our customers multiple ways to operate their cloud in the location of their choice. We start with many regions, located all over the globe. Unlike some CSPs, we never move customer data out of the region where it resides unless directed by the database owner. We offer Dedicated Regions for those who wish to implement a full cloud solution in a location of their choice, be it in a country with no OCI presence, or even inside their data center.

We are working to roll out Sovereign Clouds in the EU, where we will restrict operations and customer support responsibilities to EU residents. This will help address any EU requirements for data locality and operational control. For a solution at even a smaller scale, we offer Exadata Cloud@Customer, where users can deploy as little as a single system but keep their data under their physical control. We also offer Operator Access Control for Exadata Cloud@Customer, to limit and control Cloud Operations access to a local Cloud@Customer Infrastructure, extending the user’s control to allow them to meet various local country regulations.

Summary

Major banks have already identified the issues in the US Department of the Treasury report and potential pitfalls with the cloud and have worked with vendors like Oracle to develop alternative cloud deployment models like Exadata Cloud@Customer that address these issues. This report reaffirms Oracle’s strategy in the market with multiple public cloud regions, Dedicated Regions, Sovereign Clouds, Exadata Cloud@Customer and Operator Access Control. While the US Treasury report raises valid concerns, smart adoption of cloud is key to safely leveraging the cloud while meeting regulatory requirements.

Source: oracle.com

Monday, June 26, 2023

Develop MongoDB Applications with Oracle Autonomous Database on Dedicated Exadata Infrastructure

Oracle Autonomous Database (ADB) offers powerful capabilities for developing applications with high-performance ACID transactions and robust security, including native support for JSON data. If you're a MongoDB developer, you can harness these capabilities within Autonomous Databases using the Oracle Database API for MongoDB.

The Oracle Database API for MongoDB enables seamless integration with Oracle Autonomous Database by utilizing MongoDB language drivers and tools. It takes advantage of the converged database capabilities of Autonomous Database, allowing you to manage multiple data types, including JSON, within a single database. This convergence also empowers you to leverage SQL for querying and updating JSON data.

In this blog post, we'll explore how to migrate a MongoDB collection and reap some of the benefits of ADB's converged capabilities. To use the MongoDB API with an Autonomous Database on Dedicated Exadata infrastructure (ADB-D)*, follow these steps:

1. Create or reuse an Autonomous Transaction Processing (ATP) database on Dedicated Exadata infrastructure (ATP-D; both OCI and Cloud@Customer are supported)

2. Install and configure customer-managed Oracle REST Data Services (ORDS), version 22.3 or later

3. Obtain the Autonomous Wallet, which can be downloaded from the "Database connection" tab of your ADB instance

4. Install Oracle JDK


*Autonomous Database Serverless already includes a version of ORDS with support for the Oracle Database API for MongoDB, so no installation is needed.

Once the prerequisites are in place, the installation process is straightforward. Unzip the downloaded ORDS package and locate the binary file. Running the binary initiates a guided process that prompts for necessary information, such as the ADMIN password and the wallet's full location. Execute the following command to begin the installation:

./ords install adb --interactive

By default, the MongoDB API is not enabled. To enable it, run the following command:

./ords config set mongo.enabled true

Finally, start ORDS:

./ords serve

Once started, the log will display the connection string for using the MongoDB API, which should resemble the following:

mongodb://[{user}:{password}@]localhost:27017/{user}?authMechanism=PLAIN&authSource=$external&ssl=true&retryWrites=false&loadBalanced=true

(For a deep dive into configuring any Oracle database to support the Oracle Database API for MongoDB, see this blog post by Roger Ford.)

Now, let's migrate a MongoDB collection into the Autonomous Database instance. Suppose we have a sample dataset called "inspections" with 81k documents and this specific structure:

Oracle Autonomous Database, Oracle Database Career, Oracle Database Skills, Oracle Database Jobs, Oracle Database Prep, Oracle Database Preparation, Oracle Database Tutorial and Materials

We can generate the JSON document using mongoexport:

mongoexport --collection=inspections --db city --out=city_inspections.json


Next, use mongoimport to import the data into the Autonomous Database instance via the MongoDB API:

mongoimport --sslAllowInvalidCertificates --file city_inspections.json --uri 'mongodb://admin:password@localhost:27017/admin?authMechanism=PLAIN&authSource=$external&ssl=true&retryWrites=false&loadBalanced=true'


The data migration process is complete. Now, you can connect to the Autonomous Database instance using mongosh and verify that the collection and the number of documents are preserved:

mongosh --tlsAllowInvalidCertificates 'mongodb://admin:password@localhost:27017/admin?authMechanism=PLAIN&authSource=$external&ssl=true&retryWrites=false&loadBalanced=true'


You can execute your typical queries. For instance, let's find a specific certificate number:

db.city_inspections.find({ certificate_number: { $eq: 9278806 } })


Now, let's explore the migrated collection using SQL within the Autonomous Database. We'll use the Oracle SQL Developer command line (sqlcl) to run a SQL count and verify the number of rows:

SELECT COUNT(*) FROM city_inspections;


Autonomous Database allows you to view data in both JSON and relational formats. For example, you can execute a SQL query to display the ID and certificates for New York City:

SELECT id, certificates FROM city_inspections WHERE city = 'New York';


Additionally, if needed, you can retrieve data in JSON format:

SELECT JSON_OBJECT('id' VALUE id, 'certificates' VALUE certificates) FROM city_inspections WHERE city = 'New York';


Furthermore, you have the flexibility to perform JOIN operations between existing relational tables and JSON collections:

SELECT c.name, i.certificates FROM customers c JOIN city_inspections i ON c.customer_id = i.customer_id;


As demonstrated, migrating your MongoDB workloads to Autonomous Database - Dedicated allows you to leverage your MongoDB expertise while benefiting from the features of a converged database. You can continue using familiar tools and also take advantage of capabilities like querying JSON collections with SQL.

In conclusion, by integrating MongoDB applications with Oracle Autonomous Database, developers can harness the power of a dedicated Exadata infrastructure, native JSON support, ACID transactions, comprehensive security, and the ability to manage multiple data types within a single database. This seamless integration provides an opportunity to enhance existing MongoDB code and leverage the benefits of a converged database environment.

Source: oracle.com

Friday, June 23, 2023

Benefits of Passing 1Z0-1091-22 Exam and Becoming an Oracle Certified Professional


In today's rapidly evolving technological landscape, staying ahead in your career requires continuous learning and upskilling. For professionals in the field of Oracle Fusion Cloud, the 1Z0-1091-22 exam is a crucial milestone. This exam tests your knowledge and expertise in Oracle Utilities Meter Solution Cloud Service 2022 Implementation Professional, a vital Oracle Fusion Cloud suite component. This article will explore essential tips to help you excel in the 1Z0-1091-22 exam and pave the way for a successful career in Oracle Fusion Cloud.

Understanding of Oracle Utilities Meter Solution Cloud Service 2022 Implementation Professional Certification

Oracle Utilities Meter Solution Cloud Service 2022 Implementation Professional is a comprehensive solution that enables utility companies to effectively manage and analyze the vast amounts of meter data they collect. It provides a centralized platform for data storage, validation, estimation, editing, and reporting. Oracle Fusion Cloud's 1Z0-1091-22 exam evaluates your proficiency in configuring and implementing the solution to address real-world challenges that utility organizations face.

Preparing for the 1Z0-1091-22 Exam

To maximize your chances of success in the 1Z0-1091-22 exam, it is crucial to establish a well-structured study plan. Here are some essential tips to help you prepare effectively:

  • Set Clear Goals: Define specific goals and objectives for your exam preparation journey. This will help you stay focused and motivated throughout the process.
  • Gather Study Materials: Collect comprehensive study materials, including official Oracle documentation, textbooks, online courses, and practice exams. Referencing multiple resources will provide a well-rounded understanding of the subject matter.
  • Practice Hands-on: Oracle Fusion Cloud is a practical platform, and hands-on experience is invaluable. Set up a practice environment and actively engage with Oracle Utilities MDM functionalities.
  • Join Study Groups: Collaborate with fellow professionals preparing for the 1Z0-1091-22 exam. Participating in study groups allows for knowledge sharing, discussion of complex topics, and exposure to different perspectives.
  • Manage Time Effectively: Create a study schedule that suits your routine and commitments. Allocate dedicated time slots for studying and stick to the plan consistently.

Effective Study Strategies

While studying for the 1Z0-1091-22 exam, employ these effective strategies to enhance your learning experience:

  • Chunking: Break down complex topics into smaller, manageable chunks. This approach aids comprehension and retention of information.
  • Active Recall: Engage in active recall techniques such as flashcards, summarizing concepts in your own words, or teaching the material to someone else. This strengthens memory and understanding.
  • Interleaving: Instead of studying a single topic extensively, interleave your study sessions with different subjects. This technique promotes better long-term retention and improves overall understanding.
  • Spaced Repetition: Regularly review previously covered topics at spaced intervals. Spaced repetition helps reinforce knowledge over time and prevents forgetting.
  • Utilize Mnemonics: Mnemonic devices like acronyms or visual associations can assist in memorizing complex information, formulas, or critical concepts.

Recommended Learning Resources

Here are some highly recommended learning resources for comprehensive 1Z0-1091-22 exam preparation:

  • Official Oracle Documentation: The official documentation provides in-depth information about Oracle Utilities Meter Solution Cloud Service 2022 Implementation Professional and related functionalities.
  • Oracle University Training: Oracle University offers instructor-led training courses specifically designed for Oracle Fusion Cloud certifications. Enroll in their "Oracle Utilities Meter Solution Cloud Service 2022 Implementation Professional" course for comprehensive exam preparation.
  • Online 1Z0-1091-22 Practice Exams: Several online platforms provide 1Z0-1091-22 practice exams that simulate the actual exam environment. Practice exams help you assess your readiness and identify areas that require further attention.

1Z0-1091-22 Exam Day Tips

On the day of the 1Z0-1091-22 exam, follow these tips to optimize your performance:

  • Get Sufficient Rest: Ensure you have a good night's sleep before the exam day to be mentally alert and focused.
  • Arrive Early: Plan to arrive at the exam center well in advance to avoid any last-minute rush or anxiety.
  • Read Instructions Carefully: Take your time to read the exam instructions thoroughly before starting the test. Understand the format and any specific requirements.
  • Manage Time: Budget your time wisely throughout the exam. Pace yourself to complete all the questions within the allocated timeframe.
  • Review Your Answers: Review your answers before submitting the exam if time permits. Check for any errors or omissions.

Exploring Career Prospects with the 1Z0-1091-22 Certification

The 1Z0-1091-22 certification holds significant value in today's technology-driven world. By obtaining this certification, professionals can unlock a wide range of career opportunities and enhance their prospects in the industry.

  • Database Administrator: With the 1Z0-1091-22 certification, you can pursue a career as a database administrator. This role involves managing and organizing data within an organization's databases. Database administrators are critical in ensuring data security, optimizing database performance, and troubleshooting issues. This certification equips you with the necessary skills to excel in this field and opens doors to opportunities with organizations that rely on efficient data management.
  • Database Developer: You can explore the database developer role as a certified professional. Database developers are responsible for designing, implementing, and maintaining databases. They work closely with software developers to ensure that applications are appropriately integrated with databases and meet the organization's requirements. The 1Z0-1091-22 certification provides you with the knowledge and expertise to excel in this field, allowing you to contribute to developing robust database solutions.
  • Data Analyst: In the age of big data, the demand for skilled data analysts is soaring. With the 1Z0-1091-22 certification, you can pursue a career as a data analyst, where you will analyze and interpret data to uncover valuable insights and support decision-making processes. This certification equips you with the necessary skills to handle large datasets, perform data mining and analysis, and present findings effectively, making you a valuable asset to organizations across various industries.
  • Database Consultant: Certified professionals can explore opportunities as database consultants, providing expert advice and guidance on database design, implementation, and optimization. As a consultant, you will work closely with clients to understand their unique requirements, analyze existing systems, and recommend solutions to improve performance, scalability, and security. The 1Z0-1091-22 certification enhances your credibility and positions you as a trusted advisor in database management.
  • IT Project Manager: The 1Z0-1091-22 certification can also pave the way for a career in IT project management. As a project manager, you will oversee the planning, execution, and successful delivery of IT projects, including database-related initiatives. This certification equips you with a strong foundation in database technologies, enabling you to communicate effectively with technical teams, manage project timelines, and ensure successful project outcomes.

Conclusion

The 1Z0-1091-22 certification opens up a plethora of career prospects in the field of database management. Whether you aspire to become a database administrator, developer, data analyst, consultant, or project manager, this certification validates your expertise and enhances your professional credibility. Embrace the opportunities with this certification and embark on a rewarding career path in the ever-evolving realm of data management and technology.

Wednesday, June 21, 2023

Introduction to JavaScript in Oracle Database 23c Free - Developer Release

Developing Applications for Oracle Database: Client & Server


The Oracle Database is renowned for its rich support of programming languages. In addition to support for many client-side development languages, the Oracle database has supported server-side programs for a very long time. These are sometimes referred to as "stored procedures", although the name doesn't give the feature the credit it deserves: apart from writing "procedures", a great many other possibilities exist to work with data.


The most common programmatic server-side interface to the Oracle database is PL/SQL. By using PL/SQL it is possible to keep business logic and data together, often offering significant improvements in performance and efficiency. Developers also benefit from a unified processing pattern for data, regardless of the client interface in use. And last but not least, using a programmatic interface to the application decouples the frontend from the backend, which is crucial for modern application development techniques.

In addition to PL/SQL it is possible to create stored procedures using the Java Programming Language in the database. Even more languages are supported via External Procedures. That was the situation before the release of Oracle 21c.

Introducing JavaScript in Oracle Database 21c

Oracle 21c for Linux x86-64 added another language for server-side development to the mix: JavaScript. JavaScript is one of the most popular programming languages today. It has come a long way since its inception as a browser-based solution for interactive web pages. Whilst its popularity for front-end development remains strong, it has found its way into backend development as well: the node.js and deno projects for example are very popular in that space.

JavaScript support further enhances Oracle's already strong message about the Converged Database. A Converged Database is a multi-model, multi-tenant, multi-workload database. It effortlessly supports the data model and access method each development team wants, simplifying the development process. With its high popularity the JavaScript language fits right into this concept. Under the hood the JavaScript engine is based on GraalVM, a polyglot runtime that can execute several programming languages with high performance. The component powering the JavaScript engine in Oracle 21c and later is known as Multilingual Engine (MLE).

Oracle release 21c focused on dynamic execution of JavaScript snippets and integration into Oracle's low-code application framework, Application Express (APEX). The DBMS_MLE package allows developers to execute code snippets written in JavaScript inside the database, both on-premises and in the cloud, on Linux x86-64.
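As a minimal sketch of what dynamic execution looks like, a JavaScript snippet can be evaluated from a PL/SQL block via DBMS_MLE; the computation itself is illustrative, and console.log output is routed to DBMS_OUTPUT:

```sql
-- Minimal sketch: dynamic execution of a JavaScript snippet via DBMS_MLE
-- (Oracle 21c and later); the computation is illustrative only
DECLARE
    ctx     dbms_mle.context_handle_t := dbms_mle.create_context();
    snippet clob := q'~
        let sum = 0;
        for (let i = 1; i <= 10; i++) {
            sum += i;
        }
        console.log(`sum: ${sum}`);   -- written to the DBMS_OUTPUT buffer
    ~';
BEGIN
    dbms_mle.eval(ctx, 'JAVASCRIPT', snippet);
    dbms_mle.drop_context(ctx);
END;
/
```

Run with SET SERVEROUTPUT ON in SQL*Plus or SQLcl to see the console output.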

Enhanced JavaScript Support in Oracle Database 23c Free - Developer Release

The availability of Oracle Database 23c Free - Developer Release on Linux x86-64 provides a wealth of new features to developers. In the context of JavaScript in Oracle database the following two concepts are introduced:

◉ JavaScript modules and environments
◉ Inline JavaScript procedures

The following sections discuss these in more detail.

JavaScript modules and environments

MLE JavaScript modules are stored as schema objects and can be created in-line with the module header, from Character Large Objects (CLOBs), or from BFILEs stored in the file system. The following example demonstrates how to create an MLE JavaScript module with the actual JavaScript code provided in-line with the declaration:

create mle module if not exists example_module 
language javascript as 

/**
 * convert a delimited string into key-value pairs and return JSON
 * @param {string} inputString - the input string to be converted
 * @returns {JSON}
 */
function string2obj(inputString) {
    if ( inputString === undefined ) {
        throw `must provide a string in the form of key1=value1;...;keyN=valueN`;
    }

    let myObject = {};
    if ( inputString.length === 0 ) {
        return myObject;
    }

    const kvPairs = inputString.split(";");
    kvPairs.forEach( pair => {
        const tuple = pair.split("=");
        if ( tuple.length === 1 ) {
            tuple[1] = false;
        } else if ( tuple.length != 2 ) {
            throw "parse error: you need to use exactly one '=' between key and value and not use '=' in either key or value";
        }
        myObject[tuple[0]] = tuple[1];
    });

    return myObject;
}

/**
 * Perform a simple string concatenation
 * @param {string} str1 - the first input string
 * @param {string} str2 - the second input string
 * @returns {string}
 */
function concat(str1, str2) {

    return str1 + str2;
}

export { string2obj, concat }
/
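
Because MLE executes standard ECMAScript, logic like string2obj can be unit-tested outside the database before being deployed as a module. The following standalone re-creation of the function (a sketch for local testing only) runs under plain node.js:

```javascript
// Standalone re-creation of the module's string2obj logic, runnable under
// node.js outside the database for quick local testing.
function string2obj(inputString) {
    if (inputString === undefined) {
        throw `must provide a string in the form of key1=value1;...;keyN=valueN`;
    }

    let myObject = {};
    if (inputString.length === 0) {
        return myObject;
    }

    inputString.split(";").forEach(pair => {
        const tuple = pair.split("=");
        if (tuple.length === 1) {
            tuple[1] = false;    // a key without '=' maps to false
        } else if (tuple.length != 2) {
            throw "parse error: exactly one '=' allowed between key and value";
        }
        myObject[tuple[0]] = tuple[1];
    });

    return myObject;
}

console.log(JSON.stringify(string2obj("a=1;b=2;c=3;d")));
// → {"a":"1","b":"2","c":"3","d":false}
```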

Just as with client-side node.js development, modules can import other modules to allow for a divide-and-conquer programming model. It is also possible to use existing JavaScript modules, leveraging the huge ecosystem the community created, provided they adhere to the rules laid out by the MLE runtime.

JavaScript in Oracle Database 23c Free - Developer Release is based on ECMAScript 2022; exporting and importing functionality therefore uses the export and import keywords. Since there is no file system where modules reside, a new helper entity named MLE environment is introduced. MLE environments are the second major innovation in Oracle Database 23c Free - Developer Release and, like MLE modules, they are schema objects. MLE environments define import names to be used with the import keyword, each pointing to a module. MLE environments and dependencies between MLE modules are not in scope of this introduction and will be covered extensively in future posts.

It is of course possible to call JavaScript code from SQL and PL/SQL as well. A so-called call specification exposes a function exported from the MLE module, for example:

create function if not exists string_to_JSON_module_example(
    p_str varchar2
) return JSON
as mle module example_module 
signature 'string2obj(string)';
/

Each PL/SQL function or procedure wishing to invoke JavaScript code needs to reference the module and the JavaScript function, and the signature clause must map the PL/SQL argument list to the JavaScript function's parameters. With these requirements satisfied, the JavaScript code can be executed via the call specification:

select 
    json_serialize(
        string_to_JSON_module_example('a=1;b=2;c=3;d')
        PRETTY
    ) as result;

RESULT
--------------------

{
  "a" : "1",
  "b" : "2",
  "c" : "3",
  "d" : false
}

Inline JavaScript Functions and Procedures

In cases where you just need a JavaScript function, you can use inline JavaScript procedures instead of a module. The previous example can be rewritten as an inline function as follows:

create function if not exists string_to_JSON_inline_example(
    "inputString" varchar2
) return JSON
as mle language javascript 
q'~
    if ( inputString === undefined ) {
        throw `must provide a string in the form of key1=value1;...;keyN=valueN`;
    }

    let myObject = {};
    if ( inputString.length === 0 ) {
        return myObject;
    }

    const kvPairs = inputString.split(";");
    kvPairs.forEach( pair => {
        const tuple = pair.split("=");
        if ( tuple.length === 1 ) {
            tuple[1] = false;
        } else if ( tuple.length != 2 ) {
            throw "parse error: you need to use exactly one '=' between key and value and not use '=' in either key or value";
        }
        myObject[tuple[0]] = tuple[1];
    });

    return myObject;
~';
/

Inline JavaScript functions offer high convenience at a slight expense of functionality, but in many cases the trade-off is negligible. The function in the listing above can be executed as if it were a PL/SQL function, a nice productivity boost.

Prerequisites for using JavaScript in Oracle Database 23c Free - Developer Release

Before you can run the examples in this post, make sure you meet the following prerequisites:

  • The compatible initialization parameter is set to 23.0.0 or higher
  • The new multilingual_engine initialization parameter is set to enable
  • The following system privileges have been granted to your user:
    • create mle
    • create procedure
    • any other privileges needed, such as creating tables and indexes; the db_developer_role might prove useful
  • The execute on javascript object privilege has been granted to your user
  • Your platform is Linux x86-64
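
For illustration, an administrator could grant the privileges listed above as follows; the user name app_dev is hypothetical:

```sql
-- Illustrative grants for MLE JavaScript development (user name is hypothetical)
GRANT CREATE MLE TO app_dev;
GRANT CREATE PROCEDURE TO app_dev;
GRANT EXECUTE ON JAVASCRIPT TO app_dev;
-- Optionally, the broader developer role covers tables, indexes, and more:
GRANT DB_DEVELOPER_ROLE TO app_dev;
```
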
Source: oracle.com

Monday, June 19, 2023

Oracle Database 23c: New feature - DB_DEVELOPER_ROLE

Oracle Database 23c: New feature - DB_DEVELOPER_ROLE

Oracle Database 23c, Oracle Database, Oracle Database Certification, Oracle Database Tutorial and Materials, Oracle Database Guides, Oracle Database Learning

Starting with Oracle Database 23c, the new role DB_DEVELOPER_ROLE allows administrators to quickly assign all the privileges developers need to design, build, and deploy applications for the Oracle Database. These include the system privileges required to build a data model and the object privileges required to monitor and debug applications.

By using this role, administrators no longer have to guess which privileges may be necessary for application development.

Oracle recommends that you grant the application developer the DB_DEVELOPER_ROLE role, rather than individually granting these privileges or granting the user the DBA role, as the DB_DEVELOPER_ROLE role adheres to least-privilege principles and ensures greater security for the development environment.

The DB_DEVELOPER_ROLE role can be used in either the CDB root or a PDB.

The following SQL statements allow you to retrieve the privileges assigned to this role:

For System Privileges

SQL> SELECT privilege FROM role_sys_privs WHERE role='DB_DEVELOPER_ROLE' ORDER BY 1;

Oracle Database 23c, Oracle Database, Oracle Database Certification, Oracle Database Tutorial and Materials, Oracle Database Guides, Oracle Database Learning

For Object Privileges

SQL> SELECT table_name, privilege FROM role_tab_privs WHERE role = 'DB_DEVELOPER_ROLE' ORDER BY 1;

Oracle Database 23c, Oracle Database, Oracle Database Certification, Oracle Database Tutorial and Materials, Oracle Database Guides, Oracle Database Learning

For Granted Roles

The SODA_APP role enables working with JSON collections via Simple Oracle Document Access (SODA).

The CTXAPP role is a system-defined role that enables users to create and delete Oracle Text preferences and to use the Oracle Text PL/SQL packages.

SQL> SELECT granted_role FROM role_role_privs WHERE role ='DB_DEVELOPER_ROLE';

Oracle Database 23c, Oracle Database, Oracle Database Certification, Oracle Database Tutorial and Materials, Oracle Database Guides, Oracle Database Learning

If you want to grant or revoke the role, you can use the following statements:

SQL> GRANT DB_DEVELOPER_ROLE TO <user_name>;

Oracle Database 23c, Oracle Database, Oracle Database Certification, Oracle Database Tutorial and Materials, Oracle Database Guides, Oracle Database Learning

SQL> REVOKE DB_DEVELOPER_ROLE FROM <user_name>;

Oracle Database 23c, Oracle Database, Oracle Database Certification, Oracle Database Tutorial and Materials, Oracle Database Guides, Oracle Database Learning

Source: oracle.com

Friday, June 16, 2023

ADDM Spotlight provides strategic advice to optimize Oracle Database performance

ADDM has new Spotlight capability in Oracle Enterprise Manager (EM) and Oracle Cloud Infrastructure Operations Insights service (Operations Insights).

For years, the Automatic Database Diagnostic Monitor (ADDM) has provided Oracle Database administrators with a continuous stream of findings and recommendations for optimizing database and application performance. ADDM analyzes Automatic Workload Repository (AWR) performance snapshots as soon as they are created, typically once per hour, using Oracle's proven time-based performance optimization methodology.

ADDM findings are statements about database time, the fundamental measure of database performance, and the amount of database time (DB Time) involved is the "impact" of the finding.  Similarly, recommendations are actions that can potentially be taken to reduce the DB Time of a given finding, and the amount of time they may save is the "benefit".  Findings may have multiple recommendations because there may be more than one way to reduce the DB Time for any given finding.

ADDM Spotlight aggregates these hourly findings and recommendations over longer periods such as a week or month. The longer time window enables DBAs and system administrators to assess the systemic impact of implementing ADDM recommendations over all the workloads serviced by the database. Administrators can weigh the total benefits of big changes against the cost and/or risk of implementation and make better performance management decisions.

Oracle Database performance, Oracle Database Career, Oracle Database Skills, Oracle Database Tutorial and Materials, Oracle Database Guides, Oracle Database Learning
Figure 1:  ADDM Spotlight overview in EM

ADDM Spotlight provides performance analysis for a variety of personas


ADDM Spotlight supports database, system, and application administrators in optimizing database application performance.

Database or system administrator capabilities:

◉ Make decisions to upgrade system capacity, such as adding CPU, that may be costly
◉ Gather optimizer statistics on specific sets of tables implicated in performance issues
◉ Prioritize system changes by understanding workload and performance impacts over time

Application administrator capabilities:

◉ Discover poor-performing SQL statements and when they execute
◉ Prioritize SQL tuning efforts based on the total or relative impact of the SQL statement on the application
◉ Identify application design issues causing operational inefficiencies such as lock or latch contention

The global perspective offered by ADDM Spotlight enables users to make complex and high-impact performance management decisions with confidence.

Rich visualization with in-context drill-downs provides deeper insight into performance


Oracle Database performance, Oracle Database Career, Oracle Database Skills, Oracle Database Tutorial and Materials, Oracle Database Guides, Oracle Database Learning
Figure 2:  ADDM Spotlight overview in Operations Insights
 
ADDM Spotlight includes the following key features:

◉ Summary timeline showing when findings and recommendations occur and their volumes
◉ Findings and recommendations tabs that organize findings by category and allow them to be sorted on their aggregated impact or benefit
◉ Database parameters tab to filter and drill down on initialization parameters critical to database performance

The summary timeline shows findings or recommendations aggregated by AWR snapshots over the reporting period.  Users can see when specific findings or recommendations occur over time and identify a pattern of database performance issues.  The timeline can be filtered to isolate specific findings or recommendations to better identify when and how often they occur.

Oracle Database performance, Oracle Database Career, Oracle Database Skills, Oracle Database Tutorial and Materials, Oracle Database Guides, Oracle Database Learning
Figure 3:  ADDM Spotlight Recommendations page in EM

A key feature of ADDM Spotlight is the Findings and Recommendations tables.  These tables present aggregations by finding or recommendation over the reporting period:

◉ Frequency of occurrence: is this consistently the case or only intermittently?
◉ Average Active Sessions:  the total impact or benefit of the finding or recommendation over the entire period measured by the load on the database
◉ Maximum impact or benefit of the finding or recommendation as a percentage of the total workload running at the time when it was observed

These aggregations enable users to decide whether to implement recommendations based on overall severity, peak severity, and whether they address chronic or intermittent issues.  For example, ADDM may find the system was overloaded during a specific hour and recommend adding CPU capacity, which comes at a cost.  If this finding occurs only once or infrequently, tuning the SQL statements that consume CPU during that hour may improve performance without the need for additional CPU allocation.

The Database parameters tab displays initialization parameters over the reporting period.  These can have a significant impact on database performance, and ADDM may recommend changing them.  The parameters table can be filtered to zoom in on specific parameters with:

◉ High impact on performance
◉ Changes during the reporting period
◉ ADDM recommended changes
◉ Non-default values

ADDM Spotlight is available in EM and Operations Insights


Using EM Cloud Control (EMCC), ADDM Spotlight consolidates the findings and recommendations worth considering or implementing by looking over ADDM data from multiple snapshots for an extended period, typically a day or a week, which provides stronger justification for implementing the changes that improve database performance.

Using Operations Insights, the ADDM Spotlight overview page provides a compartment-level view, including child compartments, for your database resources’ ADDM findings. This view enables a quick sort and filter of ADDM results to narrow down the most impactful performance findings and better utilize time to improve the overall performance of your database fleet. ADDM data is stored for up to 25 months in Operations Insights, allowing for larger-scale performance investigations based on seasonality trends.

                      OPSI ADDM Spotlight                             EM ADDM Spotlight
  Retention period    25 months                                       ADDM data is retained for 30 days in the database
  Scope               Fleet-wide or compartment                       Single target database
  Supported versions  PDBs: 19c or higher*; non-PDBs: 18c or higher   19c or higher
  Deployment type     Cloud and on-premises; ADB coming soon          Cloud and on-premises

*Pluggable databases require a couple of additional configuration steps before ADDM data collection begins. You must log in to the PDB as a SYS user and set the AWR_PDB_AUTOFLUSH_ENABLED parameter to TRUE. You must also execute the dbms_workload_repository.modify_snapshot_settings procedure to configure the snapshot collection interval.
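
The two steps can be sketched as follows; the interval and retention values are illustrative only:

```sql
-- Run inside the PDB as SYS; interval and retention values are illustrative
ALTER SYSTEM SET awr_pdb_autoflush_enabled = TRUE;

BEGIN
    dbms_workload_repository.modify_snapshot_settings(
        interval  => 60,       -- minutes between snapshots
        retention => 43200     -- minutes to retain snapshots (30 days)
    );
END;
/
```
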

Source: oracle.com

Wednesday, June 14, 2023

Second Quarterly Update on Oracle Graph (2023)

Oracle Database, Oracle Database Prep, Database Skills, Database Jobs, Database Preparation, Database Tutorial and Materials

The graph features of Oracle Database enable developers to store and navigate relationships between entities. Oracle provides support for both property and RDF knowledge graphs and simplifies the process of modeling relational data as graph structures. Interactive graph queries can run directly on graph data in the database or in a high-performance in-memory graph server. Oracle Graph Server and Client enables developers, analysts, and data scientists to use graphs within Oracle Database, while Graph Studio in Oracle Autonomous Database removes barriers to entry by automating complicated setup and management, making data integration seamless, and providing step-by-step examples for getting started.

The last quarterly update on Oracle Graph announced the availability of Oracle Graph Server and Client 23.1. That release included the graph visualization JavaScript library for property graphs, which allows developers to leverage many of the benefits of graph visualization available in Graph Studio and the graph visualization tool in their own applications. Additionally, it announced GraphML support for a Supervised EdgeWise model and a new API for Python and Java that can be used to import local GraphSON v3.0 files into Oracle Database and create graphs from them.

Oracle Graph Server and Client 23.2 is now available for download for use with databases in the Cloud (OCI Marketplace image is available) and for databases on-premises. This release includes a number of changes, including changes to the graph visualization app, updates to PGQL, and integrations of Oracle Graph with other services. The graph visualization app has been changed to more closely align with the SQL standard and support SQL property graphs, which are available through Oracle Database 23c Free – Developer Release. It also includes additional functionality through PGQL, and updates that closely align PGQL with the SQL standard. Lastly, Oracle Graph is now available through PyPi, and has enhanced integrations with SQL Developer and OCI Data Science. Learn more about these exciting changes below.

Graph Visualization App


This release includes some changes to the Graph Visualization application. First, the drop-down used to select a graph has been removed; instead, the graph name needs to be given in the ON clause, in an effort to align with the SQL standard. To see the list of available graphs, use the popover that replaces the drop-down.

Oracle Database, Oracle Database Prep, Database Skills, Database Jobs, Database Preparation, Database Tutorial and Materials

Oracle Database 23c Free – Developer Release, announced on April 3rd, includes support for SQL Property Graphs. The visualization application will detect if there are SQL Property Graphs available, and if there are, it will create a tab for SQL/PGQ queries. The application will also remember your preferred tab, for greater ease of use.

Oracle Database, Oracle Database Prep, Database Skills, Database Jobs, Database Preparation, Database Tutorial and Materials

PGQL Updates


Path modes WALK, TRAIL, SIMPLE, ACYCLIC

PGQL updates include support for path modes that control cycle avoidance when querying along paths. The four path modes (WALK, TRAIL, ACYCLIC, and SIMPLE) can be summarized as follows:

  • WALK: the default path mode, where no filtering of paths happens.
  • TRAIL: where path bindings with repeated edges are not returned.
  • ACYCLIC: where path bindings with repeated vertices are not returned.
  • SIMPLE: where path bindings with repeated vertices are not returned unless the repeated vertex is the first and the last in the path.

Path modes are syntactically placed after ALL, ANY, ANY SHORTEST, SHORTEST k, ALL SHORTEST and are optionally followed by a PATH or PATHS keyword. See the following example, which traverses the graph for paths from account 10039 back to itself without repeating vertices, except for the vertex for account 10039:

SELECT LISTAGG(e.amount, ', ') AS amounts
FROM MATCH ALL SIMPLE PATHS (a:account) -[:transaction]->+ (a)
WHERE a.number = 10039
 
+--------------------------------+
| amounts                        |
+--------------------------------+
| 1000.0, 1500.3, 9999.5, 9900.0 |
| 1000.0, 3000.7, 9999.5, 9900.0 |
+--------------------------------+

LATERAL subquery

In this release, we added support for LATERAL subqueries to allow for passing the output rows of one query into another. For example, you can use ORDER BY / GROUP BY on top of another ORDER BY / GROUP BY.

/* Find the top-5 largest transactions and return the account number that received the highest number of such large transactions */
SELECT recipient, COUNT(*) AS num_large_transactions
FROM LATERAL ( SELECT m.number AS recipient
               FROM MATCH (n:account) -[e:transaction]-> (m:account)
               ORDER BY e.amount DESC
               LIMIT 5 )
GROUP BY recipient
ORDER BY num_large_transactions DESC
LIMIT 1

Note: In release 23.2, FROM clauses may only contain a single LATERAL subquery or one or more MATCH clauses.

GRAPH_TABLE subquery

To align PGQL with the SQL:2023 standard, we added GRAPH_TABLE syntax for PGQL, which can be used even when querying graphs in Graph Server (PGX). For example, if we have a graph named “financial_transactions”, we can write a query as follows:

SELECT *
FROM GRAPH_TABLE ( financial_transactions
     MATCH ALL TRAIL (a IS account) -[e IS transaction]->* (b IS account) 
     /* optional ONE ROW PER VERTEX/STEP clause here */
     WHERE a.number= 8021 AND b.number= 1001
     COLUMNS ( LISTAGG(e.amount, ', ') AS amounts )
)
ORDER BY amounts
 
+----------------------------------------+
| amounts                                |
+----------------------------------------+
| 1500.3                                 |
| 1500.3, 9999.5, 9900.0, 1000.0, 3000.7 |
| 3000.7                                 |
| 3000.7, 9999.5, 9900.0, 1000.0, 1500.3 |
+----------------------------------------+

When PGQL queries contain one or more GRAPH_TABLE subqueries, you must use only associated syntax that is in the SQL standard. For example, MATCH (a:account) must be replaced with MATCH (a IS account) when using GRAPH_TABLE subqueries.

Additional SQL standard alignments

This release includes syntax variations that align with the SQL standard, including the following:

  • FETCH [FIRST/NEXT] 10 [ROW/ROWS] ONLY as standard form of LIMIT 10
  • v [IS] SOURCE/DESTINATION [OF] e as standard form of is_source_of(e, v) / is_destination_of(e, v)
  • e IS LABELED transaction as standard form of has_label(e, 'TRANSACTION')
  • MATCH SHORTEST 10 (n) -[e]->* (m) as standard form of MATCH TOP 10 SHORTEST (n) -[e]->* (m)
  • MATCH (n) -[e]->{1,4} (m) as alternative for MATCH ALL (n) -[e]->{1,4} (m)
    • (the ALL keyword became optional)
  • VERTEX_ID(v) / EDGE_ID(e) as alternative for ID(v) / ID(e)
  • VERTEX_EQUAL(v1, v2) / EDGE_EQUAL(e1, e2) as alternative for v1 = v2 / e1 = e2 
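
As an illustration of the first variant, the following hypothetical query returns the ten lowest account numbers using FETCH FIRST instead of LIMIT:

```sql
SELECT a.number
FROM MATCH (a:account)
ORDER BY a.number
FETCH FIRST 10 ROWS ONLY
```
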

PG Schema Deprecation

The PG objects data format (also known as PG schema) has been deprecated and replaced with PG views and SQL property graphs, starting in 23c. This deprecation includes any APIs to create, query, update, remove, or otherwise interact with graphs stored in VT$ and EG$ tables. Additionally, OPTIONS (PG_SCHEMA) is no longer the default in PGQL CREATE PROPERTY GRAPH statements; the OPTIONS clause now needs to be specified explicitly every time a property graph is created using PGQL. For example:

CREATE PROPERTY GRAPH BANK_GRAPH_PGQL
    VERTEX TABLES (
        BANK_ACCOUNTS
            KEY ( ID )
            LABEL accounts PROPERTIES ( ID, name )
    )
    EDGE TABLES (
       BANK_TRANSFERS
           SOURCE KEY ( src_acct_id ) REFERENCES BANK_ACCOUNTS
           DESTINATION KEY ( dst_acct_id ) REFERENCES BANK_ACCOUNTS
           LABEL transfers PROPERTIES ( amount, description, src_acct_id, dst_acct_id, txn_id )
) OPTIONS (PG_VIEW);

Integrations

Along with this release, we are excited to announce integrations with a few other services.

The Python client is now available on PyPI.org. The Python Package Index (PyPI) is the official software repository for the Python community. This integration simplifies installation for Python users by making the client installation as simple as:

pip install oracle-graph-client

There have been improvements to the SQL Developer integration in SQL Developer 23.1. The graph integration with SQL Developer now supports multiple statements in a single PGQL worksheet, and a single statement can be highlighted and executed.

The full Oracle Graph Python client has been added to OCI Data Science. While PyPGX has been available on OCI Data Science since version 21.4, this release adds the missing client components, including the Graph Server client, the Oracle Graph client for Graph Studio in Autonomous Database, and the PGQL on RDBMS library.

Source: oracle.com