Friday, December 30, 2022

Fusion Analytics Warehouse: Migrate Data Augmentations Across Environments

Introduction


Bundles in Fusion Analytics are primarily used to:

◉ Migrate Fusion Analytics artifacts from one environment to another.
◉ Create backups of artifacts.

You can use a data config bundle to migrate pipeline parameters, report parameters, activation metadata, and data augmentations from one environment to another. This article describes the steps of a data augmentation migration with examples. See Migrating Artifacts for more information.

Use case


Suppose that you have six data augmentations in your staging environment that meet your custom business requirements. As part of going live with your implementation, you must migrate these augmentations to the production environment.

◉ DW_FA_X_OPTY_REV_LINE_ESTCODE
◉ DW_FA_X_OPTY_REVENUE_LINE_EXTENSION
◉ DW_FA_X_OPPORTUNITY_QUOTED_BY
◉ DW_FA_X_OPTY_CUSTOM
◉ DW_FA_X_OPTY_SR_EXTN
◉ DW_FA_X_OPTY_EXT_REVN_LINE_SA

Migration steps for data augmentations


Migrating data augmentations from a non-production environment to a production environment requires the following steps:

Steps in the source environment

1. Navigate to Bundles.
2. Define the bundle by selecting the bundle type and the augmentations, and save it.
3. Generate the bundle.
4. Export the bundle.

Steps in the target environment

1. Import the bundle.
2. Deploy the bundle.
3. Validate the migration.

Steps in the source environment 


1. Navigate to the bundle

To access the Bundles feature, log in using the Fusion Analytics URL and click the Navigator menu at the top left.

Select Console.

Select Bundles.

2. Define the bundle

Click Create, and then select Data Config Bundle.

In the Data Config Bundle screen, enter the Name and Description. Click Select Augmentations under Data Augmentations to select only the custom data augmentations.

To select all the augmentations, check the Data Augmentations box under Select Data Augmentations.

3. Generate the bundle

Click the Actions icon and then click Generate.

Click Generate and wait until it completes. This action generates a snapshot of the application artifacts and saves the snapshot to a repository.

4. Export the bundle

To export the data config bundle, click the Actions icon and then click Export. The bundle is saved in your local download folder as a <bundle name>_date.aab file. This file contains the data augmentation artifacts and is used for importing into the target environment.
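For illustration only, a small helper can pick the exported bundle out of a crowded download folder. The date format in the exported filename is not spelled out above, so the pattern below is a hypothetical guess and may need adjusting to what you actually see:

```python
import re

# Hypothetical pattern for "<bundle name>_date.aab" exports. The exact
# date format isn't documented above, so this assumes a numeric date
# (with optional dashes); adjust it to match your downloads.
BUNDLE_EXPORT = re.compile(r"^(?P<name>.+)_(?P<date>\d{4}-?\d{2}-?\d{2})\.aab$")

def find_bundle_exports(filenames):
    """Return (bundle name, date) pairs for files matching the pattern."""
    found = []
    for fname in filenames:
        m = BUNDLE_EXPORT.match(fname)
        if m:
            found.append((m.group("name"), m.group("date")))
    return found
```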

Steps in the target environment 


Before migrating, check the production environment and determine if these data augmentations exist. If they exist, delete them.

1. Import the bundle

Log into the Fusion Analytics production environment. From the Navigator Menu, navigate to Console, Data Configuration, and Data Augmentation.

From the menu, navigate to Console and Bundles.

Click Import.

Click Drag and Drop and select the local exported file that you saved earlier. Then click Import and wait for the process to complete.

2. Deploy the bundle

After the import succeeds, click the Actions icon and then click Deploy.

Select Run data pipelines immediately and click Deploy.

This does two things:

◉ Validates the bundle to ensure the software, model versions, and other dependencies are compatible. 
◉ Deploys the data augmentations and runs the data pipeline.

When the status changes to Completed, the Data Config Bundle deployment is successful.

3. Validate the migration

To validate the migration, navigate to Console, Data Configuration, and Data Augmentation. For each data augmentation, confirm that the Pipeline Status shows Activation Complete and the Semantic Model Status shows Completed.

Source: oracle.com

Wednesday, December 28, 2022

OCI Queue is now available in all commercial regions

We’re pleased to announce that Oracle Cloud Infrastructure (OCI) Queue service is now generally available in all OCI commercial regions, with the non-commercial regions following shortly. OCI Queue provides a serverless, high-performance messaging solution for asynchronous interprocess communication, allowing services and applications to be decoupled and event-driven. The service is priced on API calls, with the first million calls each month being free.

The service includes an intuitive user interface for configuring, creating, and managing queues, along with Terraform support. It also provides REST APIs and SDKs for multiple languages, including Java and Python, not only for sending and receiving messages but also for configuring and managing queues. Alongside the REST APIs, the service supports the STOMP protocol, with support for other messaging protocols planned for the future.
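As a sketch of how a client might talk to the service over REST, the helper below builds a PutMessages request body. The field names ("messages", "content") reflect my reading of the OCI Queue REST API and should be verified against the official API reference; in practice the SDKs wrap this for you:

```python
import json

def build_put_messages_body(contents):
    """Build a JSON body for a PutMessages REST call (sketch only).

    Assumption: the body is a "messages" array of objects with a
    "content" field, per my reading of the OCI Queue REST API.
    """
    return json.dumps({"messages": [{"content": c} for c in contents]})
```

With the Python SDK, the equivalent call would go through the Queue client rather than hand-built JSON.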

The Oracle Architecture Center and public GitHub repositories include example and referenceable implementations that you can deploy to see queues being used and see code putting the APIs and SDKs into use.

OCI Queue capabilities


The service provides the following features:

◉ Delivery assurance
◉ Automatic scaling based on demand
◉ High availability with availability domain and fault domain redundancy
◉ Encryption of messages in flight and at rest
◉ Implementation-agnostic design
◉ Dead letter queues to hold messages that have failed delivery
◉ Strong access control managed through policies with the OCI Compartments and Identity and Access Management (IAM) services
◉ Message batching for efficiency

Consumption management


Message consumption is controlled through visibility. When one consumer receives a message, it becomes invisible to other consumers but is retained on the queue until the first consumer confirms successful receipt or the visibility timeout elapses. The consumer can also extend how long a message remains hidden if its processing takes longer than expected.

The process uses the following steps:
  1. A producer sends a message to the queue with the default message retention time. The producer receives confirmation that the Queue service received and stored the message.
  2. Consumer A receives the message, which it is supposed to process within Visibility Timeout A.
  3. Consumer B receives nothing because the only available message was already consumed by Consumer A.
  4. Consumer A fails to process the message within Visibility Timeout A, so it updates the message to extend the visibility timeout.
  5. Consumer B tries to receive a message again but can’t because the only available message was consumed and extended by Consumer A.
  6. The extended visibility timeout elapses, and the message becomes visible again.
  7. Consumer B tries to receive a message a third time. Consumer B receives the message, which it is supposed to process within Visibility Timeout B.
  8. Consumer A tries to receive the message but receives nothing because Consumer B consumed the message. Consumer A can no longer extend the message’s visibility timeout or delete the message.
  9. Consumer B processes the message successfully and tries to delete the message from the queue. Consumer B receives confirmation that the message was permanently deleted, so it can’t be delivered to any other consumer.
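The steps above can be modeled in a few lines of Python. This toy in-memory queue illustrates only the visibility semantics described here; it is not the OCI Queue implementation:

```python
class ToyQueue:
    """Toy model of visibility timeouts: a received message stays on the
    queue but is hidden from other consumers until it is deleted or its
    timeout elapses. Receipts model the per-delivery handle a consumer
    must present to extend or delete a message."""

    def __init__(self):
        self._messages = []  # each entry: {"content", "visible_at", "receipt"}

    def put(self, content):
        self._messages.append({"content": content, "visible_at": 0.0, "receipt": object()})

    def get(self, visibility_timeout, now):
        for msg in self._messages:
            if msg["visible_at"] <= now:
                msg["visible_at"] = now + visibility_timeout  # hide from others
                msg["receipt"] = object()  # new handle invalidates older ones
                return msg["content"], msg["receipt"]
        return None  # nothing visible right now

    def extend(self, receipt, extra, now):
        for msg in self._messages:
            if msg["receipt"] is receipt:
                msg["visible_at"] = now + extra  # stay hidden longer
                return True
        return False  # stale receipt: another consumer holds the message

    def delete(self, receipt):
        before = len(self._messages)
        self._messages = [m for m in self._messages if m["receipt"] is not receipt]
        return len(self._messages) < before
```

Walking through it: Consumer A gets the message at t=0 with a 10-second timeout, Consumer B sees nothing until the (possibly extended) timeout lapses, and once B receives the message, A's old receipt can no longer extend or delete it.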

Source: oracle.com

Monday, December 26, 2022

Essential reading: Explaining modern data management (Part 3)

In previous posts, we covered the concepts and four architectures to build a modern data platform. Now, we’re wrapping up this series by looking at three companies that successfully implemented Oracle solutions. Each company is unique—different technologies and even different clouds—but all of them reached their goals.

Lyft on Autonomous Database


Lyft, the transportation network, is busy reimagining the future of transportation. Behind the scenes, though, the company had gone from a high-growth startup to a publicly traded enterprise processing billions of transactions a year, and its finance systems hadn't kept up. It was running about 30 systems that were siloed, costly to maintain, and inefficient. They didn't provide the timely information that a fast-growing business needs to make decisions. Lyft chose the Oracle database platform that we presented earlier in this series, with Oracle Autonomous Database as the data platform, and benefited from its built-in integration with Oracle Fusion Cloud Enterprise Resource Planning (ERP) and Oracle Analytics Cloud.

Following the implementation, Lyft closes its financial books faster and gives more stakeholders access to centralized data, including meaningful visualizations that lead to better business decisions.


Bionime on MySQL Heatwave on AWS


Bionime’s vision is to provide diabetes patients with peace of mind through self-monitoring blood glucose systems that help them accurately manage and control their health.

The medical device manufacturer's analytics service was built around Amazon Web Services (AWS) Relational Database Service (RDS) and couldn't keep up with the large amount of data needed to provide accurate and up-to-date glucose monitoring for patients. Bionime sought a faster data platform that would support its existing technologies.

Bionime selected Oracle MySQL HeatWave on AWS for a fully managed database service that combines transaction processing with an in-memory query accelerator for a high-performance analytics engine. MySQL HeatWave eliminated the need for a separate analytics database and extract, transform, load (ETL) processes. Real-time analytics helps Bionime accelerate the development of its glucose monitoring systems with less database administration. Bionime has seen complex queries resolve 50 times faster than on AWS RDS.


Experian using managed open source services


With almost 18,000 employees in over 40 countries, Experian manages the credit history data on 1.3 billion individuals and 166 million businesses across the globe. They needed an open data platform with open source products and services to build vast data pipelines and process large amounts of data, including real-time events. The company wanted to unify critical data and analytics from on-premises data centers, co-location facilities, and cloud providers onto a single lakehouse architecture. Experian also had a significant investment in relational databases that needed to be integrated. To accomplish all this, Experian chose Oracle's managed open source platform for big data.

They migrated open source workloads on Spark and Hadoop from other cloud providers to Oracle Cloud Infrastructure (OCI) Big Data and OCI Data Flow services without requiring reengineering or rearchitecting, which contributed to significant cost and time savings. Experian's success in the migration to OCI Lakehouse was also supported by Oracle Cloud Lift Services, which provided guidance on planning, architecting, and prototyping to help deliver immediate value.

With this architecture in place, Experian saw a 40% increase in performance, a 60% reduction in costs, and increased reliability and resilience.


Conclusion


We started this series by observing that data is key to business success; Lyft, Bionime, and Experian are only a few examples. Their stories reflect Oracle's commitment to meet customers where they are in their journey to the cloud, ensure portability across deployment options, and avoid vendor lock-in.

We believe that the Oracle Data Platform is uniquely positioned to enable you to achieve success with your data with the following capabilities:

◉ Combine transactional and analytical data and avoid silos
◉ Use OCI, Oracle software as a service (SaaS), or anything in between. You select the amount of control
◉ Bring any kind of data to the platform, and we provide tools to help you do it. You can also use your own!
◉ Explore the power of OCI and other clouds. We meet you where you are.
◉ Use leading Oracle Analytics Cloud reporting or any third-party analytical application. OCI is open!

Source: oracle.com

Friday, December 23, 2022

Essential reading: Explaining modern data management (Part 2)

In our last post, we reviewed the evolution of data warehouses, big data, and data lakes. We also mentioned that today’s needs in data management include the following examples:

◉ A unified data platform
◉ Support for open source
◉ Integration of artificial intelligence (AI) and machine learning (ML)
◉ Pay-as-you-go pricing
◉ Support for multicloud

Other cloud platform vendors have responded by releasing multiple services that are tailored to specific use cases. This choice often leads to an expansion of services rather than a holistic solution. For example, Google Cloud offers eight different data management products (with significant overlap). Amazon Web Services (AWS) offers at least ten.

The challenge is that you must figure out which service—or combination of services—is best for your needs. Each extra service adds data movement, design, and maintenance. That’s not really a unified platform but a collection of parts with the drawback of more administration.

In contrast, Oracle has chosen to take its popular, proven database experience and break down traditional barriers encountered by other databases. The Oracle Cloud Infrastructure (OCI) Data Platform has the following features and capabilities:

◉ Combines transactional database, data warehouse, and data lake capabilities into a single platform of choice
◉ Supports all types of data (structured, semi-structured, and unstructured)
◉ Supports all usage scenarios (reporting, analytics, artificial intelligence, and business applications) with predefined integrations with software as a service (SaaS)
◉ Runs on major cloud platform vendors, with OCI offering the most options

Let’s review the architectural choices that Oracle offers so that you can select the best data platform for your needs.

Oracle Autonomous Database platform


If most of your data is already inside a relational database in a structured format and you're planning expansions, then the all-in-one Autonomous Database platform is the solution. Oracle Autonomous Database is the well-known Oracle database with zero administration added, and it includes spatial, graph, JSON, MongoDB API compatibility, machine learning, and artificial intelligence capabilities. Instead of choosing between transactional and analytical capabilities, you can get both in one system.

Autonomous Database also extends its reach to OCI Object Storage, AWS S3, or any S3-compliant object storage. You can read and query object data just like tables, using the full power of Oracle Database. This capability is especially exciting for data engineers and data scientists, who can examine object data with familiar Oracle SQL and then combine those results with existing data in the Oracle database for rich results. Autonomous Database supports data in object storage in delimited text, JSON, Parquet, Avro, and ORC formats.
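As a sketch of what that looks like in practice, the helper below composes a DBMS_CLOUD.CREATE_EXTERNAL_TABLE call. The procedure and parameter names follow the documented DBMS_CLOUD package in Autonomous Database, while the table, credential, and bucket URI are placeholders; check the format options for your file type before running it:

```python
def external_table_ddl(table_name, credential, uri, columns, fmt_type="parquet"):
    """Compose a DBMS_CLOUD.CREATE_EXTERNAL_TABLE block (illustrative).

    columns is a list of (name, datatype) pairs. The credential and URI
    arguments are placeholders for your own object storage details.
    """
    column_list = ", ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        "BEGIN\n"
        "  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(\n"
        f"    table_name      => '{table_name}',\n"
        f"    credential_name => '{credential}',\n"
        f"    file_uri_list   => '{uri}',\n"
        f"    format          => json_object('type' value '{fmt_type}'),\n"
        f"    column_list     => '{column_list}');\n"
        "END;"
    )
```

Once the external table exists, a plain SELECT can join it with regular database tables.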

MySQL Heatwave


Maybe you’re more familiar with the world’s most popular open-source database: MySQL. Oracle now offers MySQL HeatWave, which is a managed database service that combines transactions, analytics, and machine learning into a single cloud offering. It delivers real-time, secure analytics without the complexity, latency, and cost of extract, transform, and load (ETL). MySQL HeatWave is available on OCI, AWS, and Microsoft Azure.

MySQL HeatWave adds a scalable, analytical engine that transparently and continuously mirrors the data in MySQL. As each query runs, HeatWave automatically evaluates the query and responds from the analytical engine when advantageous. No change is needed to your applications.
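A minimal sketch of that workflow, assuming a hypothetical orders table: the ALTER TABLE ... SECONDARY_LOAD statement follows the documented HeatWave approach for mirroring an InnoDB table into the analytics engine, and the SELECT is unchanged application code that the optimizer can offload automatically.

```python
# Sketch of the HeatWave workflow: a table is loaded into the analytics
# engine once, and existing SELECT statements run unchanged, with the
# optimizer offloading them when the in-memory engine is faster. The
# table and query are placeholders; in practice these statements would
# be run through a MySQL client or connector.
HEATWAVE_STATEMENTS = [
    # Mirror the InnoDB table into the HeatWave cluster.
    "ALTER TABLE orders SECONDARY_LOAD;",
    # The same analytic query as before -- no application change needed.
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region;",
]

def read_queries(statements):
    """Pick out the plain read queries (the ones eligible for offload)."""
    return [s for s in statements if s.lstrip().upper().startswith("SELECT")]
```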

HeatWave also accelerates machine learning by letting you build models in the service instead of having to export data out to yet another specialized service. Developers and data scientists can continue using their existing tools, such as SQL, Jupyter, and Apache Zeppelin, while working with the data in a single environment.

As with Oracle Autonomous Database, Oracle has enhanced MySQL HeatWave to read data directly in object storage. We call that feature the MySQL HeatWave lakehouse. The lakehouse gives developers and data scientists a unified platform to run transactions, analytics, and machine learning applications.

Managed open source for Big Data


The previous scenarios are a great fit for data that is mostly structured as well as for developers with experience in the leading database technologies. But what if your data is mostly unstructured? What if you’re more experienced with Hadoop-based environments?

Oracle offers several managed services based on Spark and Hadoop to ease your migration into the cloud while using your familiar tools. Using services based on open-source technologies affords you more control over your operations while running on the secure and reliable OCI infrastructure. With this architecture, you can quickly move your existing cluster into OCI (or create a new data lake).

You can complement these services with easy-to-use artificial intelligence services, such as speech-to-text, sentiment analysis, image analysis, and more.

Modular approach


Finally, some advanced scenarios don’t match any of the previous options. You need specific capabilities, and you need to assemble them in a particular manner. Oracle’s comprehensive portfolio of services allows you to build your own, personal data platform.

Source: oracle.com

Wednesday, December 21, 2022

Essential reading: Explaining modern data management (Part 1)

Want business success? Data is key. The right data and analytics can enable tremendous outcomes. We’ve seen a banking customer with a 40% increase in marketing conversion, a healthcare company reduce costs by 25% while still personalizing care plans, and a manufacturing customer achieve 50% savings on operational costs. The possibilities are exciting.

Despite these and many other success stories, however, we still see organizations struggling to build useful and capable data environments. This struggle isn't new; companies have faced the following challenges for decades:

◉ Data is scattered throughout business units and application systems.
◉ Accessing data and moving it between systems is complex and adds latency.
◉ Information lives in different formats at different stages in a myriad of business processes.

The first solution


The first attempt at creating an analytic data architecture was the data warehouse: a large database that consolidated data from internal databases and potentially external sources, such as market data. It used the classic strategy of centralization.

The traditional data warehouse addressed these challenges in the following ways:

◉ Sparked an industry around extract, transform, load (ETL) tools and methodologies to move, cleanse, and remove duplications from the data into a consolidated database
◉ Required data that originated in different systems to be standardized, which allowed for analysis
◉ Spurred domain-specific views, which mapped to business processes and metrics
◉ Provided a single repository for visualization tools

Unfortunately, as good as the improvement was, the data warehouse didn’t solve everything. It still had the following issues:

◉ Every point of integration (every source system) required time to investigate, design, code, test, and implement. Adding a new system was an arduous process.
◉ Any change in a source system required validating every step of the integration process. The data warehouse could quickly become fragile and break with application updates.
◉ Data moved from source systems weekly or daily, which meant that the data warehouse could be one or more days behind. The choice was to show the latest data with missing pieces or to limit the view to the last complete load.

The next solution


As data warehouses multiplied, so did the scale of the data, which we referred to collectively as “big data.” Big data brought its own set of challenges (all conveniently starting with V):

◉ Volume: The amount of data. We had more data than could fit on a single server. We wanted to keep it all, but we weren’t sure exactly which data was the most valuable, so we needed something that was both efficient and economical.
◉ Velocity: How often the data was received. Instead of the occasional batches that rolled in monthly, data could arrive daily, by the hour, or even in real time. We needed something that could ingest data from multiple sources at different rates.
◉ Variety: The diversity of data formats. Data was no longer in well-structured columns and rows but consisted of images, audio, video, logs, and so on. (According to Forbes, most data now collected is unstructured.)

The next solution for an analytic data architecture took advantage of the cloud revolution: the data lake. The data lake focuses on cost-effectively storing “everything” for future analysis.

The data lake taps into the (almost) unlimited object storage of the cloud to preserve all data, regardless of immediate value. It rapidly accepts new data because object storage is a distributed service, and it accepts data in any format. Because it doesn’t enforce structure or format, the data lake removes almost all delay between data updates in the source system and data being stored in the data lake.

However, the disparate nature of the data in the data lake requires advanced analysis tools to make sense of that data. That requirement can be a big drawback. Data lakes require a higher level of expertise, such as data scientists and machine learning models, to extract value. Otherwise, the data lake becomes a data swamp.

A solution for today

Today, our needs have increased manyfold, including the following examples:

◉ A unified data platform: Today’s data management project involves the collaboration of data engineers, data scientists, analysts, and more, all typically working on disparate systems with more infrastructure and synchronization issues.
◉ Support for open source: Some of the most popular analytical tools are supported by the open source community. This support encourages collaboration and sharing of best practices but can be resisted by closed source vendors.
◉ Integration of artificial intelligence (AI) and machine learning (ML): The advancement of AI and ML has ushered in a revolution in capabilities. Analytical systems must support pretrained and trainable services to provide maximum value.
◉ Pay-as-you-go pricing: Analytics can use a lot of storage and a varying amount of compute. Paying only for what you use helps keep costs down, while providing resources when needed.
◉ Support for multicloud: No cloud has everything, and depending on one cloud can be a strategic risk. Companies prefer a best-of-breed solution where the data goes to the most capable service.

Are we doomed to live with a data warehouse that’s too rigid or a data lake that’s too incoherent? What if there’s another way?

The Oracle Data Platform is a modern data cloud platform with an architecture that provides for the needs we've covered. It breaks down the barriers between structured and unstructured data, provides faster and deeper insights on a single platform, works with other clouds, and provides pay-as-you-go pricing.

Source: oracle.com

Monday, December 19, 2022

Multiple VM Autonomous Database on Dedicated Exadata Infrastructure

I am excited to announce the launch of Multiple VM Autonomous Database on Dedicated Exadata Infrastructure. In March 2022, we launched Multiple VM Autonomous Database on Exadata Cloud@Customer, allowing many Exadata Cloud@Customer users to experience Autonomous Database at a significantly lower cost and to deploy and migrate their workloads to Autonomous Database with ease. With the launch of Multiple VM Autonomous Database on Dedicated Exadata Infrastructure, Oracle Cloud Infrastructure (OCI) customers can now create multiple Autonomous Exadata VM Clusters and Exadata Database VM Clusters on the same Exadata infrastructure. They can now easily benefit from the improved operational efficiency, lower costs, and superior development environment offered by Autonomous Database without having to provision two independent sets of infrastructure.

Autonomous Database on Dedicated Exadata Infrastructure is a cloud database service running in OCI that uses machine learning to automate database tuning, security, backups, updates, and other routine management tasks traditionally performed by DBAs. The service supports all types of applications and levels of database criticality, but is especially well suited for modern application architectures that utilize multiple data types, workloads, and analytic functions in a single solution.  

Customers can now create multiple Autonomous Exadata VM Clusters and Exadata Database VM Clusters on a single Exadata infrastructure and allocate resources to each cluster based on their workloads. Multiple VM Autonomous Database is available on X8M and higher generations of Exadata infrastructure that are provisioned after the launch of the Multiple VM Autonomous Database feature.

Each Autonomous VM Cluster supports separate network configuration, maintenance scheduling, license type selection (BYOL and License Included), and customizable memory, storage, and compute allocations.

Customers realize multiple benefits by deploying multiple VM clusters on Dedicated Exadata Infrastructure. To name a few:

◉ Single platform with both Exadata Database Service and Autonomous Database on Dedicated Exadata Infrastructure able to run concurrently
◉ Lowers the cost to adopt Autonomous Database and set up a private Database as a Service (DBaaS) for customers already using or planning to use Dedicated Exadata Infrastructure
◉ Flexible license types (BYOL and Included) on the same Exadata Infrastructure
◉ Secure environment separation with network-isolated Dev-Test, Staging, and Production environments for different applications, projects, and lines of business.

OCI Console Experience


Create Autonomous Exadata VM Clusters

To create an Autonomous Exadata VM Cluster on Dedicated Infrastructure, navigate to the Autonomous Exadata VM Cluster list view page and select "Create Autonomous Exadata VM Cluster."

You must allocate resources that will be used for Autonomous Container Databases and Autonomous Databases. Key resource configuration parameters:

◉ Number of Autonomous Container Databases you plan to create in the Autonomous Exadata VM Cluster – local storage is automatically allocated based on this value
◉ OCPU count per node – sets the OCPUs per node in the Autonomous VM Cluster for Autonomous Databases
◉ Database memory per OCPU – sets the total memory in the VM cluster for Autonomous Database workloads based on the total CPU allocation
◉ Autonomous Database storage – user data storage for your Autonomous Databases

Resource configuration sliders default to the minimum values needed for the Autonomous Exadata VM Cluster.
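As a rough illustration (an assumption based on the parameter descriptions above, not the console's exact sizing rules), the cluster totals scale from the per-node settings like this:

```python
def avm_cluster_totals(node_count, ocpus_per_node, memory_per_ocpu_gb):
    """Back-of-the-envelope resource math for an Autonomous Exadata VM Cluster.

    Illustrative only: total OCPUs scale with the per-node setting, and
    database memory is derived from the total CPU allocation. Exact
    minimums and local storage allocation come from the console sliders.
    """
    total_ocpus = node_count * ocpus_per_node
    total_memory_gb = total_ocpus * memory_per_ocpu_gb
    return {"total_ocpus": total_ocpus, "total_memory_gb": total_memory_gb}
```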

Each Autonomous Exadata VM Cluster has a separate maintenance schedule. Select the "Modify Maintenance" button to configure your Autonomous Maintenance preference. Set your maintenance schedule and click "Save Changes."

Select the license type and click "Create Autonomous Exadata VM Cluster". With Multiple-VM Autonomous Database, you can have Autonomous Databases with different license types on the same Exadata Infrastructure.

Source: oracle.com

Friday, December 16, 2022

Multiple VM Cluster Support & VM Cluster Node Subsetting now available on ExaDB-D

We are pleased to announce the General Availability (GA) of support for Multiple VM Clusters and VM Cluster Node Subsetting capability on Exadata Database Service on Dedicated Infrastructure (ExaDB-D). Previously, ExaDB-D customers could only create a single VM cluster on any given Exadata Infrastructure and the single cluster automatically spanned across all DB Servers in the infrastructure. With Multiple VM Clusters (Multi-VM) support and VM Cluster Node Subsetting capability, you can now create multiple VM Clusters on a single Exadata Infrastructure in ExaDB-D and have the flexibility to choose specific DB Servers within the infrastructure to host VMs from the cluster.

Key Customer Benefits


With the Multi-VM and VM Cluster Node Subsetting capability, you can now

◉ Provision a new VM cluster with any number of VMs rather than hosting a VM on each DB server in the Exadata Infrastructure.
◉ Start with a smaller VM cluster size at provisioning time, thereby enabling cost savings on resources allocated per VM.
◉ Expand the VM clusters to add VMs on-demand providing flexibility to scale resources without disrupting current running workloads.
◉ Shrink the VM clusters to remove VMs as needed to ensure efficient allocation of DB server resources.
◉ Isolate VM clusters to run on specific DB Servers giving complete control over the isolation strategy for mission-critical workloads.
◉ Co-locate VM clusters on specific DB Servers to implement efficient consolidation and streamline maintenance across various workloads.
◉ Allocate resources from the new generation of DB servers to provision new VM clusters or extend the existing VM clusters to ensure optimal utilization of available resources.

OCI Console Experience


Let's go over the following core user journey highlights related to Multi-VM and VM Cluster Node Subsetting using the OCI console.

◉ Provision VM cluster on a subset of DB Servers
◉ Add or remove VM(s) to scale VM Cluster
◉ Scale VM resources allocated to a provisioned VM cluster
◉ Scale Infrastructure with multiple VM Clusters

1. Provision VM Cluster on a subset of DB Servers


On the Exadata Infrastructure details page, you can navigate to the VM Cluster section and initiate the create VM Cluster workflow to provision a new cluster on this infrastructure.

The create VM Cluster workflow now includes selecting the DB servers on which you want to host the VMs for the new cluster. By default, the create VM Cluster flow selects all the DB Servers in the infrastructure. You can change this selection to pick a subset of DB Servers to host VMs for the cluster.

You can specify the placement of each VM in the cluster by selecting the DB server that will host it. All DB Servers that are part of the Exadata Infrastructure are listed and available for selection. For each DB server, you can see the available OCPU, memory, and local storage resources, along with the list of existing VM clusters already hosting VMs on it. Based on your isolation and co-location preferences and planned resource allocation limits, you can choose the DB servers best suited for your specific use case to be part of this VM cluster.

Once the DB servers to host VMs for the cluster are selected, you can specify the allocation for OCPU, memory, and local file system storage resources per VM using the presented controls. The maximum resources available for assignment per VM depend on the selected DB servers that will host these VMs for the cluster. The DB server with the least resources will determine the maximum limit available for allocation per VM, given the symmetric resource allocation across all VMs in the cluster.
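Because the allocation is symmetric across all VMs, the cap described above is just a minimum taken across the selected servers. A minimal sketch (field names are illustrative, not the OCI API's):

```python
def max_per_vm_allocation(selected_db_servers):
    """With symmetric allocation across all VMs, the per-VM maximum for
    each resource is set by the least-resourced selected DB server."""
    return {
        "ocpus": min(s["free_ocpus"] for s in selected_db_servers),
        "memory_gb": min(s["free_memory_gb"] for s in selected_db_servers),
        "local_storage_gb": min(s["free_local_storage_gb"] for s in selected_db_servers),
    }
```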

You can now specify the Exadata Storage allocation as part of the create VM Cluster flow.
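Putting the choices above together, the create request can be sketched as a plain payload. The field names below are illustrative and only loosely mirror the OCI Database API; consult the API or CLI reference for the exact request shape.

```python
def build_create_vm_cluster_request(display_name, db_server_ids,
                                    ocpus_per_vm, memory_gb_per_vm,
                                    local_storage_gb_per_vm,
                                    exadata_storage_tb):
    # dbServers carries the subset of DB server OCIDs chosen to host VMs;
    # per-VM values are symmetric across all VMs in the cluster.
    vm_count = len(db_server_ids)
    return {
        "displayName": display_name,
        "dbServers": list(db_server_ids),
        "vmCount": vm_count,
        "totalOcpus": ocpus_per_vm * vm_count,
        "totalMemoryGb": memory_gb_per_vm * vm_count,
        "totalLocalStorageGb": local_storage_gb_per_vm * vm_count,
        "exadataStorageTb": exadata_storage_tb,
    }
```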

The VM cluster details page shows the total number of VMs and the total resources allocated across all VMs after the provisioning completes successfully. You can also view the list of VMs in the cluster and their respective resource allocation, IP addresses, and a hyperlink to view the DB Server hosting the VM. The DB server details page will list all the VMs from various clusters hosted on that DB Server.

2. Add or remove VM(s) to scale VM Cluster


2.1 Expand a provisioned VM Cluster by adding VM(s)

You can expand a provisioned cluster on-demand by adding VMs from the cluster details page.

While adding new VMs to a provisioned cluster, you can choose the specific DB servers on which you want to add new VMs and extend the VM Cluster. DB servers already hosting a VM from a particular cluster are not available to host another VM from the same cluster. For every DB Server, you can see the available OCPU, memory, and local storage along with the list of VM clusters hosting VMs on that DB server. Based on the information presented, you can choose the DB server(s) best suited to host the newly added VM(s).
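The eligibility rule for the expansion flow can be sketched as follows (illustrative field names):

```python
def eligible_db_servers_for_new_vm(db_servers, cluster_id, per_vm):
    """A DB server can host a new VM for a cluster only if it does not
    already host a VM from that same cluster and has enough free
    resources for the cluster's symmetric per-VM allocation."""
    return [
        s["id"] for s in db_servers
        if cluster_id not in s["hosted_clusters"]
        and s["free_ocpus"] >= per_vm["ocpus"]
        and s["free_memory_gb"] >= per_vm["memory_gb"]
        and s["free_local_storage_gb"] >= per_vm["local_storage_gb"]
    ]
```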

Note: Newly added VMs will have the same resource allocation for OCPU, memory, and local storage as existing VMs part of the cluster.

Total resources allocated across the cluster are updated to reflect the newly added VM resources. Each new VM added to the cluster is listed along with existing VMs and displays the allocated resources, IP addresses, and the DB server hosting the VM.

2.2 Shrink a provisioned VM Cluster by removing VM(s)

To shrink a cluster, navigate to a specific VM listed as part of the cluster and use the action menu for its list row to terminate the VM.

Terminating a virtual machine will terminate any database instances running on the VM and requires additional confirmation to proceed.

3. Scale VM resources allocated to a provisioned VM Cluster


For a provisioned VM cluster, you can always navigate to the cluster details page and initiate a scale action to change the resource allocation for the VMs in the cluster.

The scale VM cluster resources workflow shows the number of VMs that are part of the cluster and presents controls to change the allocation for OCPU, memory, and local file system storage resources per VM. You can view the total resources allocated across all VMs in the cluster as a read-only summary similar to the view shown during cluster creation.
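The read-only totals in that summary are simply the per-VM allocation multiplied by the VM count, which can be sketched as:

```python
def cluster_resource_summary(vm_count, per_vm):
    """Totals across all VMs in the cluster for each resource type."""
    return {resource: amount * vm_count for resource, amount in per_vm.items()}
```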

4. Scale Infrastructure with multiple VM Clusters


4.1. Scale Infrastructure with Database Server

With Exadata Infrastructures now supporting multiple VM Clusters, adding a database server to the infrastructure makes the additional resources from the newly added DB server available to all VM Clusters. You can initiate a scale infrastructure operation from the infrastructure details page.

You can specify one or more DB servers to add as part of the scale infrastructure flow.

Once the DB Server is added to the infrastructure, all resources (OCPU, Memory & Local Storage) from the newly added DB Server are available and updated on the infrastructure details page. The newly added DB Server is listed alongside existing DB Servers already part of the infrastructure and is not assigned to any VM cluster.
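The effect of the scale operation on the infrastructure's resource pool can be modeled as below (the data structure is illustrative):

```python
def add_db_server_to_infra(infra, new_server):
    """Adding a DB server grows the infrastructure's OCPU, memory, and
    local-storage pool; the new server starts with no VM cluster assigned."""
    for resource in ("ocpus", "memory_gb", "local_storage_gb"):
        infra["totals"][resource] += new_server[resource]
    infra["db_servers"].append({**new_server, "hosted_clusters": []})
    return infra
```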

The newly added DB Server is available to host a VM as part of the Create VM Cluster and Add Virtual Machine flows.

4.2. Scale Infrastructure with Storage Server 

Adding a storage server to the infrastructure makes the additional storage capacity from the newly added server available to all VM Clusters. You can specify one or more storage servers to add as part of the scale infrastructure flow.
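The capacity arithmetic can be sketched as below. The usable fraction of raw storage depends on the ASM redundancy level in use; the default value here is a placeholder assumption, not Oracle's exact figure.

```python
def expanded_usable_storage_tb(current_usable_tb, new_server_raw_tb, usable_fraction=1/3):
    """New usable shared Exadata Storage after adding storage servers.
    usable_fraction approximates the raw-to-usable ratio under ASM
    redundancy (placeholder value; actual ratios vary by configuration)."""
    return current_usable_tb + sum(new_server_raw_tb) * usable_fraction
```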

You can then add the storage capacity from the newly added servers to the total usable shared Exadata Storage capacity, making it available for VM Clusters to allocate and consume.

Adding storage capacity to the infrastructure's shared Exadata Storage capacity free pool triggers an ASM rebalance of existing disk groups (used by the provisioned VM Clusters) to all storage servers visible to the infrastructure, including the newly added ones.

Note: While rebalance is in progress, you may see some performance impact depending on how much capacity is already in use by the existing disk groups.

After completing all these steps for elastic storage expansion, you can view the total number of storage servers and the total Exadata Shared Storage capacity on the infrastructure details page.

As part of the VM Cluster scale workflow, you can allocate and use the additional storage capacity from the newly added storage server(s). The new maximum limit for shared storage capacity is reflected and used for validation.

Note: The additional storage capacity from the newly added storage server(s) is also presented and available as part of the workflow to create new VM Clusters.