Friday, December 29, 2023

How Do You Prepare for the Oracle 1Z0-1078-23 Exam? Don't Get Left Unprepared

Passing the Oracle 1Z0-1078-23 exam requires focus, serious study, a solid understanding of the concepts, persistence, and consistent preparation.

So, you've set your sights on conquering the Oracle 1Z0-1078-23 Certification, also known as the Oracle Product Lifecycle Management 2023 Implementation Professional exam. You're in for an exciting journey that could open doors to new career opportunities.

Understanding the Oracle 1Z0-1078-23 Certification

Before we embark on this certification adventure, let's get to know our destination better. The 1Z0-1078-23 Certification is a badge of honor for professionals seeking to master Oracle Product Lifecycle Management 2023 Implementation. It's not just an exam; it is a ticket to becoming an Oracle expert, showcasing your skills and knowledge in the dynamic field of product lifecycle management.

Unveiling Effective Preparation Strategies for the Oracle 1Z0-1078-23 Exam

Unveiling the Oracle 1Z0-1078-23 Exam

Every successful journey begins with a map, and in Oracle certifications, that map is the exam blueprint. Oracle 1Z0-1078-23 has a well-structured blueprint outlining the topics and skills you need to master. Take a moment to familiarize yourself with this document; it is your treasure map to success.

The Power of 1Z0-1078-23 Practice Tests

As the saying goes, practice makes perfect. In the realm of certification exams, practice tests are your secret weapon. They simulate the exam environment and highlight areas you might need extra focus. Leverage practice tests available online to fine-tune your knowledge and build the confidence required to ace the 1Z0-1078-23 Exam.

Dive into the Oracle 1Z0-1078-23 Learning Paths

Oracle offers a rich array of learning paths; it's like having a personal guide on your certification journey. Whether you prefer videos, documentation, or interactive labs, Oracle's learning paths cater to different learning styles. Immerse yourself in these resources, and the certification content will become more digestible and engaging.

Connect with the Oracle Community

In the vast realm of Oracle certifications, you are not alone. Joining the Oracle community opens doors to a wealth of knowledge and shared experiences. Participate in forums, discussions, and networking events. Engaging with fellow learners and experts improves your understanding and provides valuable insights that you might not find in textbooks.

The Art of Time Management

Time is of the essence, especially when preparing for a certification exam. Create a realistic study schedule that fits your routine. Break down your study sessions into manageable parts, allowing you to absorb information more effectively. Remember, it is not about the hours you put in but the quality of your study time.

Stay Updated with Oracle's Official Documentation

In the ever-evolving landscape of technology, staying updated is non-negotiable. Rely on Oracle's official documentation as your primary source for the Oracle 1Z0-1078-23 exam; it is the most dependable material, straight from the horse's mouth. Monitor updates and changes to ensure your preparation aligns with the latest exam requirements.

Embrace the Mentorship Advantage

Seeking guidance from those who have conquered the 1Z0-1078-23 mountain can be highly beneficial. Connect with mentors, whether through online platforms or local meetups. Their insights, tips, and firsthand experiences can be the extra push you need to cross the certification finish line.

Celebrate Small Wins Along the Way

Oracle Product Lifecycle Management 2023 Implementation Professional Certification journeys can be demanding, so remember to celebrate small victories. Completing a challenging module, scoring well on a practice test, or grasping a problematic concept are all milestones worth acknowledging. Positive reinforcement keeps you motivated and makes the learning process more enjoyable.

Why Choose Oracle 1Z0-1078-23 Certification?

In a competitive IT landscape, certifications act as a beacon, guiding employers to skilled professionals. The 1Z0-1078-23 certification, focusing on Oracle Product Lifecycle Management 2023 Implementation, signifies both knowledge and practical expertise. This distinguishes you in the job market, making you a sought-after professional.

Final Words

As we reach the conclusion of our guide, remember that the journey to Oracle 1Z0-1078-23 Certification is not just about the destination but the experiences along the way. Stay focused, stay curious, and embrace the challenges as opportunities for growth.

Mastering the Oracle Product Lifecycle Management 2023 Implementation Professional certification requires strategic preparation, utilizing official resources, and active engagement with the Oracle community.

As you embark on this journey, remember that success is not just about passing an exam but acquiring skills that set you apart. So, dive into Oracle Product Lifecycle Management 2023 Implementation confidently, and let your certification be a testament to your expertise.

You have got this! Good luck with your certification adventure.

Coaching with Clarity: Changing the Game in Learning and Growth!

Last year we announced our Oracle ME employee experience platform to help our customers deliver unique, hyper-personalized, immersive experiences to every individual. As part of Oracle Cloud HCM, it provides capabilities to streamline communication across the organization, increase productivity by guiding employees to relevant tasks, help build the manager-employee relationship, and connect your employees with their peers to build professional networks. And since its announcement, Oracle ME has received fantastic recognition from industry experts and analysts, and customers are leveraging the capabilities to innovate and drive impact.

But, as we continue to innovate alongside our customers, we again find ourselves in yet another period of unprecedented change (rapid evolution of skills, volatile economy, etc.). We are hearing more about headcount reductions in parallel with skills and talent shortages. As such, many organizations are being asked to do more with less while still chasing in-demand skills -- often with reduced budgets. This means it is more critical than ever for organizations to remain steadfast in their commitment to their workforce -- and not to put at risk any strides they have made on employee expectations and experience -- in order to remain competitive and keep growing.

If there is anything we CAN count on, it is uncertainty and change. In a recent research article, Deloitte points to the need for adaptability and resilience to navigate this highly dynamic world, with an organization’s people as the foundation. In a recent survey, they found more than 70 percent of global executives identified “the ability of their people to adapt, reskill, and assume new roles” as the top-ranked item to navigate future disruption. According to the article, as environments change, so do business priorities, and effective organizations will position their people to adapt to meet them.

As such, how do organizations and workers continue to develop the RIGHT skills in this dynamic environment -- to stay focused on customer satisfaction goals and workforce productivity, reduce turnover, keep people motivated, and support the overall revenue of the business? And what stands in the way of solving this -- despite the hundreds of billions of dollars spent in the corporate training industry?

In the Deloitte article they point out many organizations proudly state that they “empower their workers to own their own careers” – but in most cases, to do this, employees are simply given access to expansive libraries with little personalized or purposeful guidance. And often organizations are also trying to solve for this with disparate systems – if at all. This approach leads to multiple challenges including poor engagement – especially since trying to work through multiple interfaces adds more work for, and onus on, the individual. Without the ability to dynamically connect all these solutions and data sources, employees have poor visibility into role expectations – much less, career growth and opportunities. And the business is unable to know and grow the skills they need – much less, tracking the progress and impact of upskilling and reskilling initiatives.

I personally love the call to action for “growth in the flow of work” from recent research by Nehal Nangia, L&D Research Director of the Josh Bersin Company –  an article where they call for a new strategy in Corporate Learning. We also strongly believe that “learning is the linchpin - the most critical piece for developing skills, enabling growth, fostering future talent, and resolving the talent crisis of today” – and we are investing accordingly. This is why I am so excited to announce our newest solution to the Oracle ME employee experience platform, Oracle Grow, for our Oracle HCM suite customers. Think of it as an AI-powered coaching experience -- with personalized, intelligent, and most importantly dynamic guidance -- focused on worker potential, growth, and agility.


So what makes our approach with Oracle Grow different from what other vendors are doing? It is a brand-new, first-of-its-kind EXPERIENCE -- connecting what your workers need from our suite for learning, skills growth, and career mobility in one simple, clear experience. We are uniquely taking what for many is passive, historical data and transforming it into AI-based recommendations that guide workers -- and their managers -- through key development choices to help them succeed in their roles today and in the future. This really is a next-generation approach to continuous, lifelong learning, giving individuals transparency and agency over their growth plans while amplifying organizational success.

It helps guide workers and managers through key upskilling and reskilling decisions and actions, making them simpler and easier. With our unified, natively built suite of tools and single data model, the technology acts as an extension of your team -- like a coach and advisor -- so your people can save time, be more engaged, and have access to real-time, dynamic, and tailored growth insights and guidance.

Finally, employees can take some of the guesswork out of what will help them level up in their current role or move to their next role -- maybe in ways they never considered or imagined. Many of us have used our networks -- or simply gotten lucky -- to find opportunities to grow within an organization. But imagine getting personalized advice, or even seeing a visual representation, of the different ways you could move within your organization -- all based on real-time patterns and trends in your organization's people data, combined with your own preferences, AND the things you need to better qualify for those roles.

That kind of information can be life changing. And finally, managers can now also play a more direct, focused, and purposeful role in delivering a great employee experience. With Oracle Grow they have the tools to ensure their team is on the right track and cultivating the skills they need to move themselves, their team, and the business forward. I am excited for the future and the value our customers – and their workers -- can get with Grow.

Source: oracle.com

Wednesday, December 27, 2023

Announcing Oracle Spatial Studio 23.2 Release

Oracle Spatial Studio 23.2 is now available, including multiple user enhancements and behind-the-scenes improvements. Analysts can utilize the redlining tools to mark up maps, highlighting points of interest or marking unusual activities for further exploration. You can use custom symbols to show your storefront locations using your logo instead of a generic icon. Developers may find the ability to embed Spatial Studio maps into external web applications interesting. We have added a native Web Component for embedding Spatial Studio maps into other applications. An embedded map includes interactive behavior, such as click events, for integration with the host application. In this article, we'll talk about these and other new features.


Redlining tools


Redlining refers to ad hoc drawing on maps and is useful for communicating locations and areas of importance. For example, locations of unusual activity or areas needing detailed investigation. Spatial Studio now supports creating redline shapes (lines, circles, rectangles, and polygons) on your maps. You may style these shapes, add descriptions, and export as GeoJSON.
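Because redline shapes export as GeoJSON, they can be consumed by any GeoJSON-aware tool. As a rough illustration (the property names below are hypothetical, not necessarily Spatial Studio's exact export schema), an exported redline polygon might look like this, built and validated in Python:

```python
import json

# A hypothetical redline rectangle exported as a GeoJSON Feature.
# The "properties" keys are illustrative only; the exact schema
# Spatial Studio emits may differ.
redline = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[
            [-122.45, 37.75], [-122.40, 37.75],
            [-122.40, 37.78], [-122.45, 37.78],
            [-122.45, 37.75],  # closed ring: first point repeated last
        ]],
    },
    "properties": {
        "description": "Area of unusual activity - investigate further",
        "style": {"strokeColor": "#ff0000", "fillOpacity": 0.2},
    },
}

geojson_text = json.dumps(redline, indent=2)
print(geojson_text.splitlines()[0])  # "{"
```

Any GIS tool that reads GeoJSON (QGIS, web map libraries, and so on) can then display the exported shape alongside its description.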

Embeddable maps


You may now embed Spatial Studio maps into other web applications using a new first-class Web Component. This enables embedding of fully interactive maps with feature selection, so that you can configure integration of the embedded maps within your application. For example, you can embed a map and configure your application to invoke an action or detailed report based on a user clicking on an item in the map.

Custom symbols


Users can now upload icons (.gif, .png, or .jpeg) and use them to represent point data on their maps. For example, you can display your office locations on an interactive map using your company logo, or display locations of interest using symbols aligned with your corporate standards.

Animate historical data


Animation of spatiotemporal data has been expanded from real-time to now include historical data. The only requirement is that your spatial data includes a UTC date or timestamp. As with real-time data, the style of moving objects and their tails in historical data movement is fully configurable.
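In practice, that requirement just means each record carries a UTC time value alongside its coordinates. A minimal sketch of qualifying data (the column names here are illustrative only):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical vehicle-tracking rows: each point carries a UTC timestamp,
# which is the only prerequisite for animating historical movement.
start = datetime(2023, 12, 1, 8, 0, tzinfo=timezone.utc)
track = [
    {
        "vehicle_id": "truck-7",
        "lon": -96.80 + i * 0.01,
        "lat": 32.78,
        "observed_at": (start + timedelta(minutes=5 * i)).isoformat(),
    }
    for i in range(4)
]
for row in track:
    print(row["observed_at"], row["lon"])
```

Tables or GeoJSON sources shaped like this, with a proper UTC date or timestamp column, are what the animation feature expects.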

Connect lat/long pairs with lines



You are now able to generate lines connecting coordinate pairs. You have the choice to generate lines that are displayed as straight lines, or as “geodesic” lines which follow the path along the Earth’s surface. Geodesic lines appear curved on the screen and are commonly associated with real-world trajectories such as flight paths.
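The geodesic behavior reflects great-circle geometry on the Earth's surface. As a self-contained illustration (plain Python, not Spatial Studio code), the standard haversine formula gives the geodesic distance that a flight path between two lat/long pairs would follow:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle (geodesic) distance between two lat/long points, in km."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# New York (JFK) to London (LHR): the geodesic path appears curved on a map.
d = haversine_km(40.64, -73.78, 51.47, -0.45)
print(f"Geodesic distance: {d:.0f} km")
```

A straight line drawn in lat/long space between the same two points would look shorter on screen but does not correspond to the real-world trajectory.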

Source: oracle.com

Friday, December 22, 2023

Manage the security of your Amazon RDS for Oracle databases with Oracle Data Safe

We’re excited to announce that Oracle Data Safe service now delivers essential security services for Oracle databases running in Amazon Relational Database Service (RDS). With the addition of RDS support, Data Safe can help secure all Oracle Database deployments in Oracle Cloud Infrastructure (OCI), Oracle Cloud@Customer, third-party clouds like Microsoft Azure and Amazon Web Services (AWS), and on-premises.

Data security is one of the top concerns for business leaders due to compliance requirements and never-ending security breaches. The security teams tasked with managing security for Oracle databases face many challenges, including disparate standalone tools and proliferating databases across multiple clouds and on-premises. As a result, those databases can be left vulnerable to even straightforward attacks. Oracle Data Safe provides customers with a solution that helps secure all of their Oracle databases, irrespective of where they are, whether they're Enterprise or Standard Edition, and whether they're running any of the currently supported releases of Oracle Database.

Data Safe helps you evaluate security controls, assess user security, and monitor user activity. It helps you address data security compliance requirements for your database by discovering sensitive data and masking it for nonproduction use. You can use Data Safe to spot gaps in security configurations, identify dormant user accounts, understand what sensitive information your databases store, protect sensitive data in test and development environments, and address audit data collection, retention, and reporting requirements.

Oracle Data Safe now supports Oracle Enterprise Edition and Oracle Standard Edition Two on RDS databases. With Data Safe support for Oracle Standard Edition databases, you can now access advanced security features such as data masking, previously available only to Enterprise Edition customers, helping you keep your data secure wherever it resides.

Data Safe helps secure all your Oracle databases in one place, eliminating the need to have multiple consoles or manage multiple instances. Oracle Data Safe has an easy-to-use cloud-based interface that requires no installation or maintenance.

Connect to Oracle Data Safe quickly and easily


You have two options for connecting your Oracle RDS database running in AWS to Oracle Data Safe.

Use private endpoints

If you have already set up network connectivity between your Amazon RDS for Oracle databases and your OCI virtual cloud network (VCN), you can leverage that connection to register your database through a Data Safe private endpoint. The private endpoint represents the Oracle Data Safe service in your OCI VCN with a private IP address. The private endpoint must be able to reach from your OCI VCN into the AWS VPC subnet for your target database.

Install a lightweight connector in an EC2 instance

Another easy way to register your database is through the Data Safe on-premises connector. You can install this connector on a Linux host in your AWS environment. The connector then establishes an encrypted TLS tunnel to Oracle Data Safe. You only need to deploy one connector to support multiple Oracle databases in your AWS tenancy.  

You can create the Data Safe private endpoint or the Data Safe on-premises connector before registering your database with Data Safe, or you can create them during registration.

Register your database with Oracle Data Safe


When you’ve decided which connectivity option to use, registering your database with Data Safe is easy with a dedicated registration guide:


Figure 1: Database registration guides

During registration, you must provide a database account for Data Safe to use to connect to your database. We provide a SQL script that you can run to grant the Data Safe user the necessary roles and privileges. Select which privileges to grant depending on which Data Safe features you want to use. You can learn more in the following resources:



Figure 2: Amazon RDS target registration wizard

Then, use the following steps:

1. Target information: Provide the service name, the IP address and port number, and the Data Safe service account credentials you created on your database.

2. Connectivity option: Select whether you want to connect through a Data Safe private endpoint or a Data Safe on-premises connector. You can enter an existing private endpoint or connector you created previously or have one created.

3. Security rules: When using a Data Safe private endpoint, you must allow outgoing communication from the private endpoint within the VCN. The process can create the necessary egress rule for you. You also need to allow incoming communication for your database on AWS.

Your target database is now ready for Data Safe. Get started by reviewing the security and user assessment reports automatically scheduled during the registration. You can find them in the Data Safe Security Center under Security Assessment and User Assessment.

Figure 3: Security assessments in Data Safe

Wednesday, December 20, 2023

Oracle and Microsoft expand partnership to deliver Oracle database services in Azure

Organizations want choice, and nearly all of them are using two or more clouds. But, most workloads are still on-premises, and many of them are Oracle Database workloads. Why?

We believe that customers need greater flexibility with best-in-class technologies to move workloads into the cloud. 97% of the Fortune 100 companies rely on Oracle Database for their most critical workloads. Many of these customers have also invested in Microsoft Azure.

Since 2019, we have partnered with Microsoft to deliver the Oracle Interconnect for Azure, which offers secure, private interconnections with sub-2 millisecond latency in 12 global regions. This multicloud network foundation enabled customers, such as AT&T, Marriott International, and Veritas, to build applications across Oracle Cloud Infrastructure (OCI) and Azure.

We’re excited to share that we’re significantly expanding our partnership with Microsoft to launch a new service called Oracle Database@Azure. With this service, Microsoft and Oracle will deliver the same Oracle database services running on OCI in Microsoft Azure datacenters. Oracle database services in Azure are designed to offer high levels of performance, scale, security, and availability — at parity with what we offer in OCI today. By colocating OCI database services in Azure datacenters, we expect that Oracle Database@Azure will have the same low latencies as other Azure services. You can deploy and manage the service through the existing Azure portal and use Azure developer tools, software development kits (SDKs), and APIs.

With this new offering, we’re making it easier for customers to migrate and modernize their Oracle Database workloads to Azure. With Oracle database services running locally in Azure, you can benefit from proximity to familiar application development tools and frameworks to modernize your workloads and run cloud native applications with direct access to data in Oracle Database with the same security and compliance across the entire solution.

Additionally, customers will be able to purchase these services through the Azure Marketplace and apply their purchases to their Microsoft Azure Consumption Commitment (MACC).

Bringing the power of Oracle Autonomous and Exadata to Azure customers


Leading organizations in every industry have long relied on Oracle Exadata and Oracle Real Application Clusters (RAC) to make the most of their data and to power mission critical applications. Oracle Database@Azure brings these technologies into Azure. Oracle RAC allows you to run a single Oracle Database across multiple servers, providing high availability and enabling horizontal scaling for traditional workloads. Building on this scale-out architecture, Oracle Exadata combines high-performance database servers, low-latency interconnects, and intelligent storage with unique, database-aware software optimizations. It can let you access data with less than 20 microseconds of latency, process tens of millions of operations per second, and analyze petabytes of data. Exadata offers the same functionality on-premises and in the cloud, so you don’t have to redesign and rearchitect your applications.

Migrating and deploying production-grade environments is straightforward. Oracle provides proven database migration strategies, including automated migration solutions like Zero-Downtime Migration (ZDM) and powerful tools like Oracle Data Guard and Oracle GoldenGate. Oracle Database@Azure customers will have access to Oracle RAC for high availability using Exadata hardware. The service will be deployable in multiple availability zones to ensure regional high availability and in cross-region pairs to support cross-geography disaster recovery scenarios.

Simplifying purchasing and operations


The deep Azure integration greatly simplifies purchasing and operations in the following ways:

◉ Simplified purchasing: You will be able to purchase Oracle Database@Azure in Azure Marketplace, and those purchases will apply to MACC. You’ll also be eligible for Oracle Support Rewards, a program where you can earn rewards that you can use to reduce your Oracle technology license support bill.

◉ Simplified management: You will be able to manage Oracle databases using native Azure tooling like you would with any other Azure resource.

Figure 1. Oracle Database@Azure home page

◉ Simplified operations: You will be able to use Azure tooling to view all metrics, events, and logs for all Oracle Database@Azure databases.

Oracle and Microsoft are already working with customers, such as PepsiCo, Vodafone, and Voya Financial, to realize this new multicloud reality.

“As a global leader in the financial services industry, Voya has harnessed the power of digital transformation to help provide the best experience for our customers and employees,” said Santhosh Keshavan, executive vice president and chief information officer of Voya Financial, Inc. “As we continue to bring our business applications to the cloud, cloud partnerships have the potential to help the entire industry maintain better security, compliance, and performance, helping to accelerate the development of new technology products, solutions and services that enhance customer experience and help achieve better financial outcomes.”

Source: oracle.com

Monday, December 18, 2023

Deploy LangChain applications as OCI model deployments

Are you looking for a fast and easy way to deploy your LangChain applications built with large language models (LLMs)? Oracle Accelerated Data Science (ADS) v2.9.1 adds a new feature to deploy a serializable LangChain application as a REST API on Oracle Cloud Infrastructure (OCI) Data Science Model Deployment. With the ADS software development kit (SDK), you can deploy your LangChain application as an OCI Data Science Model Deployment endpoint in a few lines of code. This blog post guides you through the process with step-by-step instructions.

LangChain


LLMs are a groundbreaking technology that encapsulates vast human knowledge and reasoning capability in a massive model. However, working with language models today still presents many challenges. The ecosystem is still evolving, and developers lack adequate tools for deploying language models.

LangChain is a framework for developing applications based on language models. Tasks like prompt engineering, logging, callbacks, persistent memory, and efficient connections to multiple data sources work out of the box in LangChain. Overall, LangChain serves as an intermediary layer linking user-facing programs with the LLM. By combining LLMs with other computing resources and internal knowledge repositories, it helps turn language models into practical applications grounded in an organization's existing corpus.

OCI model deployment


OCI Data Science is a fully managed platform for data scientists and machine learning (ML) engineers to train, manage, deploy, and monitor ML models in a scalable, secure, enterprise environment. You can train and deploy any model, including LLMs, in the Data Science service. Model deployments are a managed resource in OCI Data Science that you can use to deploy ML models as HTTP endpoints in OCI. Deploying ML models as web applications or HTTP API endpoints serving real-time predictions is the most common way models are productionized. HTTP endpoints are flexible and can serve requests for model predictions.

ADS SDK


The OCI Data Science service team maintains the ADS SDK. It speeds up common data science activities by providing tools that automate and simplify them, along with a data scientist-friendly Pythonic interface to OCI services, most notably Data Science, Data Flow, Object Storage, and Autonomous Database. ADS gives you an interface to manage the lifecycle of ML models, from data acquisition to model evaluation, interpretation, and deployment.

Create your LangChain application


As an example, the following code builds a simple LangChain application that takes a subject as input and generates a joke about it. Essentially, it puts the user input into a prompt template and sends it to the LLM. Here, we use Cohere as the LLM, but you can replace it with any other LLM that LangChain supports.

import os
from langchain.llms import Cohere
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

os.environ["COHERE_API_KEY"] = "<cohere_api_key>"

llm = Cohere()
prompt = PromptTemplate.from_template("Tell me a joke about {subject}")
llm_chain = LLMChain(prompt=prompt, llm=llm, verbose=True)
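Conceptually, the chain just formats the user's input into the prompt template and forwards the result to the LLM. Here is a dependency-free sketch of that flow, with a stub standing in for the real Cohere call (the function names are illustrative, not LangChain's API):

```python
# A stub standing in for the real Cohere LLM call, so the flow can be
# illustrated without an API key or network access.
def fake_llm(prompt: str) -> str:
    return f"(model response to: {prompt!r})"

template = "Tell me a joke about {subject}"

def run_chain(subject: str) -> str:
    prompt = template.format(subject=subject)  # what PromptTemplate does
    return fake_llm(prompt)                    # what LLMChain then does

print(run_chain("animals"))
```

The real chain adds conveniences on top of this (logging, callbacks, input validation), but the data flow is the same.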

Deploy the LangChain application using the Oracle ADS SDK


Now, you have a LangChain application, llm_chain, and you can easily deploy it to your model deployment using Oracle ADS SDK. The ADS SDK simplifies the workflow of deploying your application as a REST API endpoint.


from ads.llm.deploy import ChainDeployment
ChainDeployment(chain=llm_chain).prepare_save_deploy(
    inference_conda_env="pytorch21_p39_gpu_v1",
    deployment_display_name="LangChain Deployment",
    environment_variables={"COHERE_API_KEY":"<cohere_api_key>"}
)

Behind the scenes, ADS SDK automatically handles the following processes:

  • Serialization of the LangChain application: The configuration of your application is saved into a YAML file.
  • Preparation of the model artifact, including some autogenerated Python code to load the LangChain application from the YAML file and run it with the user's input.
  • Saving the model artifact to the model catalog: This process uploads and registers your application as a model in the model catalog.
  • Creating a model deployment: This deploys the scalable infrastructure to serve your application as an HTTP endpoint.


When the deployed model is active, you can use OCI CLI to invoke it. Replace the <langchain_application_model_deployment_url> with the actual model deployment URL, which you can find in the output from the previous step.

oci raw-request --http-method POST --target-uri <langchain_application_model_deployment_url>/predict --request-body '{"subject": "animals"}' --auth resource_principal


Source: oracle.com

Friday, December 15, 2023

Architecting Hyper-Scalable Infrastructure for AI and ML-Driven Fintech with Oracle’s Globally Distributed Database


Introduction


Digital technologies are reshaping payments, insurance, wealth management, lending, and many more processes. This digitization over the last few years has built the foundation for innovations in financial technologies such as mobile money, peer-to-peer or marketplace lending, insurance technology, and crypto-assets that have emerged around the globe.

One of the key requirements for most of these financial enterprise applications is being built to scale. The capability of an application to scale is something all businesses desire, as it allows a growing number of users to onboard onto the platform. Without scalability, issues such as service disruptions, degraded performance, slower response times, and user churn can surface. These factors impact a firm's revenue, making scalability critical.

The case study we discuss shows why fintech companies need to scale, and how Oracle Globally Distributed Database helps them fill that technological gap and embed scalable design at their core.

Proliferating scalability in financial firms


Technology innovation is creating new business opportunities for the finance vertical, and among all other disruptive technologies, hyper-scale computing has been the most transformative innovation of recent years. The global datasphere, a term coined by IDC to describe all of the data "out there," is projected to keep growing. The firm estimates that in 2025, the world will create and replicate 163 ZB of data, a tenfold increase from the data created in 2016. This hypergrowth is the outcome of an evolution of computing going back decades. Another report notes that 39% of banks believe their digital transformation efforts have fast-tracked their product and service delivery. Financial enterprises need to think ahead and plan for their growth -- a roadmap where scalability should be at the top of the agenda.

The COVID-19 pandemic drove the adoption of scalable platforms across industries, including banking and financial services. Post-pandemic, financial institutions are investing in hyper-scale data solutions to address infrastructure challenges. Scalable solutions enable fintechs to meet increasing demand during crises, expand client reach, diversify offerings, and boost profitability. Fintech enterprises that achieve hyper-scalability can reshape the future of finance.

Navigating the Challenges With Scalability


Need to hyperscale and unleash the real value of data

Ever-growing data is at the heart of financial firms, and developing a comprehensive scaling strategy is an urgent priority for them. To harness the true potential of that data and facilitate hyperscale growth, fintech companies are increasingly turning to distributed databases coupled with cutting-edge AI and ML capabilities.

These solutions empower fintech with improved operations, encompassing fraud prevention, personalized services, payment gateways, risk assessment, and regulatory compliance, providing benefits to companies and their clients. Oracle’s Globally Distributed Database facilitates hyperscaling to address the surging demands and data-intensive nature of fintech operations.

The solution offered:

Oracle’s Globally Distributed Database is a distributed deployment of the Oracle database presented as a single logical database. It has all the Oracle database features, plus additional capabilities when used with geographic distribution and replication. It thereby combines the benefits of familiar, relational SQL—consistency and reliability—with those of NoSQL—easy scaling and global reach. Key benefits include:

  • Distributes segments of a data set across many databases (shards) on different computers, on-premises or in the cloud
  • Shards can use different cloud providers (a multi-cloud strategy), and replicas of a shard can live in another cloud or on-premises
  • Online re-sharding lets you move data between clouds, or between the cloud and on-premises
  • Shards can be added online to increase database size and throughput, delivering online elasticity
  • Delivers globally distributed, linearly scalable, multi-model databases without specialized hardware or software
  • Enables strong consistency, the full power of SQL, support for structured and unstructured data, and the Oracle Database ecosystem
  • Built-in Raft replication provides a consensus-based, high-performance, low-overhead availability solution, with distributed replicas and fast failover with zero data loss
  • Provides flexibility for hybrid or multi-cloud strategies
  • Offers easy access to Oracle's parallelized, scalable in-database ML algorithms (Oracle Advanced Analytics), enabling customers to build predictive insights
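As an illustrative sketch of the sharding model described above (table and tablespace-set names here are hypothetical, not from the case study), a system-managed sharded table distributes rows across shards by a consistent hash of its sharding key:

```sql
-- Hypothetical sketch: a system-managed sharded table whose rows are
-- distributed across all shards by a consistent hash of customer_id.
CREATE SHARDED TABLE customers
( customer_id  NUMBER        NOT NULL,
  name         VARCHAR2(100),
  region       VARCHAR2(20),
  CONSTRAINT customers_pk PRIMARY KEY (customer_id)
)
TABLESPACE SET ts_set_1
PARTITION BY CONSISTENT HASH (customer_id)
PARTITIONS AUTO;
```

Queries that include the sharding key are routed directly to a single shard, while cross-shard queries are coordinated transparently.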

Empowering a US-based fintech to achieve hyper-scalability in its architecture with Oracle's Globally Distributed Database


Client's Overview:

  • The provider offers cutting-edge solutions in fraud prevention, digital identity, device intelligence, chargeback management, and payment gateways. Their solutions empower businesses to reduce fraud-related expenses, simplify operations, and drive revenue growth
  • Their multi-layered risk platform utilizes machine learning and a community-based reputation database for effective risk management across the customer journey, covering account setup, authentication, activity monitoring, payments, and dispute handling
  • The customer's environment involved a database of hundreds of terabytes, which had become increasingly difficult to manage efficiently. As the data grew, backup and recovery processes took longer and became cumbersome. The customer realized that their monolithic data infrastructure was no longer sustainable and sought to break free from these limitations and transition to a hyper-scale solution that would meet their evolving data needs

Technological Overview:

  • Oracle serves as the backbone for transactional payment databases, spanning multiple instances from Release 11g to 21c
  • Existing technologies include Partitioning, Advanced Compression, Oracle Active Data Guard, Oracle GoldenGate, and Advanced Security
  • Data intensive OLTP applications with strict SLAs
  • Data warehouse databases support reporting needs

Customer Requirements:

  • Robust architecture with extreme availability, performance, and scalability
  • Flexible scale-out solution to meet rising customer expectations and workload demands
  • Scalable, reliable, highly available infrastructure with industry-standard security and auditing
  • Required a shift to distributed infrastructure, emphasizing the need to break the monolith and eliminate single points of failure during a catastrophic crash
  • Minimal code changes to support complex joins in the application
  • Real-time query execution within 1 to 2 milliseconds
  • Managing a 200-terabyte database became challenging, leading to extended backup times
  • Efficiently handling 200 terabytes of data is crucial for effective fraud detection
  • Integration of AI/ML capabilities for enhanced fraud detection and real-time analytics to support proactive decision-making

Choosing Oracle's Globally Distributed Database:

The customer explored various NoSQL and distributed databases, grappling with challenges related to data sharding, synchronous cross-region replication, and the need to rewrite applications. Despite evaluating numerous NoSQL options, they ultimately chose Oracle's Globally Distributed Database for its ability to provide scalability akin to NoSQL databases while preserving enterprise-grade features such as data consistency, transactions, joins, atomicity, security, and availability. Oracle's Globally Distributed Database offers a unified and seamlessly integrated solution, a departure from the modular approach of other new distributed databases that stitch together disparate software modules. With full SQL support, a mature storage engine, and multiple high-availability options, including Raft replication, Oracle fulfilled all requirements, helping the customer achieve their desired objectives.

Deployment with Oracle’s Globally Distributed Database:

  • Implemented a user-defined distribution methodology that isolates groups of customers per shard, with a common data model using customer ID as the sharding key
  • Horizontal data partitioning into four shards to maximize performance in OLTP, Batch, and Reporting
  • Strict data sovereignty enforcement to prevent cross-region data leaks
  • Simplified architecture with smaller databases for manageability
  • Deployment of two Oracle Data Guard replicas for each shard: one local with synchronous replication and one remote with asynchronous replication
  • Deployment of a Global Service Manager (GSM) in the primary region, paired with local and remote standby hosts for improved system availability and reliability
  • Utilization of Oracle's AI and ML capabilities on massive data volumes for real-time credit card fraud detection with up-to-date data access for real-time decision-making
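The user-defined distribution in the list above can be sketched roughly as follows (table names, partitions, and key values are illustrative, not the customer's actual model):

```sql
-- Hypothetical sketch of user-defined sharding: each partition is pinned
-- to a specific tablespace (and therefore shard), giving explicit control
-- over data placement and residency for each customer group.
CREATE SHARDED TABLE transactions
( customer_id  NUMBER        NOT NULL,
  txn_id       NUMBER        NOT NULL,
  amount       NUMBER(12,2),
  CONSTRAINT transactions_pk PRIMARY KEY (customer_id, txn_id)
)
PARTITION BY LIST (customer_id)
( PARTITION customers_group1 VALUES (101, 102) TABLESPACE ts_shard1,
  PARTITION customers_group2 VALUES (201, 202) TABLESPACE ts_shard2
);
```

Because placement is explicit, a heavy-I/O customer group can be isolated on its own shard, and data sovereignty rules can be enforced by keeping a partition's tablespace in the required region.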


Maximizing Efficiency with Oracle’s Globally Distributed Database:

Oracle's Globally Distributed Database addressed the customer's pain points across several core areas:

Linear Scalability:

  • Enabled linear scaling of the Oracle databases along with the applications
  • Scaled data delivery to downstream systems by assigning each shard its own data stream, multiplexing the jobs
  • Scaled OLTP databases both up and out, allowing the customer to run fewer shards when workload demand allows

Better Fault Isolation:

  • Only part of the platform is affected during an accidental database crash, with no impact on other areas
  • Brings strong multi-version concurrency control, data protection, and security for processing high volumes of complex transactions in customer applications

Value Additions:

  • Utilized essential Oracle features inherent in Oracle's Globally Distributed Database, including data consistency, security, availability, the performance optimizer, and centralized backup and recovery

Unlocking Business Benefits: Modernizing the Technology Stack with NoSQL-like Benefits and Breaking the Monolith

  • Minimized the impact of catastrophic crashes
  • Enabled resiliency orchestration to help enterprises by providing better fault isolation, as planned/unplanned downtime within one region does not impact other customers
  • Better application SLAs
  • Ability to isolate and manage heavy I/O customers
  • More control over the data placement 
  • Improved customer services
  • Leveraging Oracle's AI and ML capabilities, the customer achieved rapid transaction validation (within mere milliseconds), ensuring a seamless customer experience and trust-building

Summary:

At its core, Oracle is the trusted database partner for financial institutions, enabling them to extract the utmost value from their systems. This partnership empowers enterprises to embrace next-generation, scalable solutions while optimizing their existing investments and maximizing performance. By adopting Oracle's Globally Distributed Database, fintech companies can effectively address their scalability requirements and gain the competitive edge needed to thrive in the rapidly evolving financial technology landscape.

Source: oracle.com

Thursday, December 14, 2023

Navigate the World of Database APIs with Oracle

We delved into Oracle’s vision for the Converged Data Platform and how it streamlines data integration challenges while preserving developer flexibility. We saw how the Oracle Database service can easily handle various datatypes and be integrated with open source development platforms, analytics tools, and other Oracle Cloud Infrastructure (OCI) services. Now, it's time to translate theory into practice.

In this post, you learn how the Autonomous Database service can natively serve data through APIs that integrate seamlessly with Oracle API Gateway, and you can go through a hands-on Oracle LiveLab to experience this integration yourself. Oracle REST Data Services (ORDS) and OCI API Gateway not only simplify the process but also enable robust security, effective management, and comprehensive auditing, giving your business applications a powerful boost by unlocking the full potential of your data assets.

Oracle APEX


ORDS is part of Oracle APEX architecture, which includes features like Oracle Database Actions, Oracle APEX Access, REST APIs, and many more. This post focuses on the Oracle REST API features that allow database developers like you to expose data for other developers to consume and create productive business applications. By using REST API architecture, you gain the benefit of a converged data platform and the ability to manage data in many formats like text, JSON, Spatial, XML, and more, while still maintaining a consistent API syntax.

This architecture contrasts with the inconsistency and code variability introduced by using multiple purpose-built data stores, each employing its own API formats. With ORDS, as you build and integrate applications, you don't have to shift gears as you traverse data stores and different APIs.

Oracle APEX is integrated into Oracle Autonomous Database. With a single click, you can deploy a highly available and completely managed environment for ORDS and APEX. ORDS allows you to publish RESTful web services for your Oracle database, including tables, views, and even stored procedures. This opens the door for modern microservice development, enabling seamless data transfer to and from Oracle objects in various formats. This functionality lets you dive into modern application development while keeping legacy development paradigms intact, without the operational and integration issues of purpose-built databases.
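As a sketch of what publishing looks like, the ORDS PL/SQL API can REST-enable a schema and one of its tables (the schema, table, and alias names below are illustrative):

```sql
-- Sketch: REST-enable a schema and a table via the ORDS PL/SQL API,
-- so the table's rows become reachable over HTTP as JSON.
BEGIN
  ORDS.ENABLE_SCHEMA(
    p_enabled             => TRUE,
    p_schema              => 'SALES',
    p_url_mapping_type    => 'BASE_PATH',
    p_url_mapping_pattern => 'sales');

  ORDS.ENABLE_OBJECT(
    p_enabled      => TRUE,
    p_schema       => 'SALES',
    p_object       => 'ORDERS',
    p_object_type  => 'TABLE',
    p_object_alias => 'orders');

  COMMIT;
END;
/
```

Once enabled, the table can be queried with a GET request against a URL of the form .../ords/sales/orders/, returning the rows as JSON.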

Exposing the APIs originating from the database to the public internet can often lead to security concerns. Bad actors can call these APIs and gain access to all your valuable data. Securing these APIs is an essential part of modern app development. Let's look at how we can do that.

OCI API Gateway


Part of securing these APIs is deciding who gets to call them, where they call them from, and auditing their use. You probably also want to limit how often the APIs can be called. OCI API Gateway is an OCI-native service that provides all these capabilities.


The OCI API Gateway receives all the API requests from various clients, translates them into ORDS API calls, performs the requested action on the database objects, and responds. It acts as a gatekeeper, helping ensure that your ORDS APIs are never directly exposed.

Beyond helping to secure your ORDS APIs, OCI API Gateway can offer data on API usage. As your API ecosystem grows through the adoption of microservices, questions arise around how to understand which APIs are used most and which ones are unimportant or no longer valid. API Gateway helps you gain insight on your API usage across user categories, such as lines of business users, B2B users, or single users. With this data, you can decide which APIs to support, transition, monetize, and more.

Get Hands-on


If you want to try out this robust architecture and get real hands-on experience, check out our step-by-step LiveLab to create APIs from your Autonomous Database Dedicated and secure them with Oracle API Gateway.

Source: oracle.com

Wednesday, December 13, 2023

MicroTx Enterprise Edition is Now Available


Oracle is pleased to announce that the Enterprise Edition of MicroTx, previewed at Oracle CloudWorld 2023, is now available as a separate download. MicroTx Enterprise Edition provides the following features above and beyond what is provided in the Free Edition:

  1. Clustered Transaction Coordinator – the MicroTx transaction coordinator can now be deployed in a cluster, providing high availability and high performance. The number of coordinator replicas can be dynamically scaled out or in as required to achieve the necessary level of performance.
  2. Transaction Recovery – the outcome of transactions is durably recorded and recoverable by any member of the transaction coordinator cluster. This ensures that all participants in a transaction will be notified of the outcome, even if a transaction coordinator instance fails.
  3. Transaction Store – the MicroTx transaction coordinator can persist transaction state to either etcd or Oracle Database, based on configuration.
  4. Transaction Caching – transaction state information maintained by the coordinators is now cached to improve performance.
  5. MicroTx Console – an administrative console is provided that allows users and administrators to:
    • View in flight transaction information
    • View performance and health metrics of the coordinator cluster
    • Manage transactions - commit or rollback transactions with heuristic outcomes
    • View number of transactions for each transaction pattern that were:
      1. Processed
      2. Confirmed/Committed
      3. Cancelled/Rolled back
      4. Completed heuristically
    • View configuration of the coordinator
  6. RAC Support – participants can now use RAC based databases in XA transactions.  MicroTx tracks which RAC instance participants are using and ensures that a transaction branch doesn’t span RAC instances.
  7. Common XID – the MicroTx coordinator for XA transactions will now try to minimize the number of branches by reusing existing branches where possible, instead of creating new ones. This improves performance when multiple participants use the same resource manager. If all the participants use the same resource manager, the transaction ends up as a single branch that can be committed with a one-phase commit, eliminating the need for the prepare phase.
  8. XA Transaction Promotion – this feature allows a transaction to start as a local resource manager (RM) transaction and be promoted to a full XA transaction only when another resource manager becomes involved. This is currently only supported with Oracle Database.
  9. Grafana Dashboards – the MicroTx coordinators now provide metrics that can be collected by Prometheus and visualized using predefined dashboards in Grafana.
  10. Unlimited Transactions – the Free edition of MicroTx has a limitation of 4,800 transactions/hour.  This limit is removed in the Enterprise Edition, so production deployments are free to process as many transactions as needed.

With this release, MicroTx is now ready for your production deployments.  MicroTx EE is licensed as part of GoldenGate for Distributed Applications and Analytics. Get started for free using MicroTx Free and then upgrade to MicroTx EE when moving to production.

Source: oracle.com

Saturday, December 9, 2023

Backup Enhancements on Autonomous Database on Dedicated Infrastructure and Exadata Cloud@Customer

As 2023 comes to a close, Oracle is proud to announce significant enhancements to the backup capabilities in our Autonomous Database on Dedicated Exadata Infrastructure and Cloud@Customer (ADB-D and ADB-CC). These updates are designed to streamline your data management experience, offering greater flexibility and efficiency.

Introducing ‘Disable Backups’

We're excited to announce that automatic backups can now be disabled when creating an Autonomous Container Database (ACD). This feature is a must-have for environments without daily backup requirements, allowing you to eliminate unnecessary backup storage costs for your Autonomous Databases (ADB). This flexibility means you only pay for what you need, ensuring a more efficient and cost-effective database management experience.

Extended Backup Retention for Enhanced Flexibility

For ADBs requiring daily backups, we've extended the backup retention period, offering a range between 7 and 95 days. This upgrade from the previous maximum of 60 days gives you more control over your data retention strategies, ensuring that your backup schedules align perfectly with your business needs.

Simplified User Interface: Easier Backup Configuration

The new UI enhancements streamline the backup configuration process. We've moved backup configuration settings from the advanced options directly into the main flow of ACD creation, making them more accessible and intuitive.

Streamlined ADB-CC Setup Process


For ADB-CC customers, setting up ACDs has become much simpler. Previously, configuring backup storage options like object storage, network file system (NFS), or ZDLRA involved intricate network configurations and security approvals, which could be time-consuming. Now, you can create an ACD without initially enabling backups, allowing you to save precious time and activate them later at your convenience. This flexibility ensures you can focus on what's important without being bogged down by initial setup complexities.

Enhanced Backup Descriptions for Greater Clarity


Understanding the nature of your backups is now easier than ever. We've refined our backup descriptions to clearly indicate the type of backup (Full, Incremental, Cumulative Incremental, and Virtual Full for Zero Data Loss Recovery Appliances) and whether it was initiated by our Autonomous tooling or manually by a user.


And this is just the beginning. As we move into 2024, stay tuned for more advancements in Autonomous Database backup and recovery capabilities. Our roadmap is filled with innovative plans to make our service even more robust and accommodate a broader range of customer needs.

Source: oracle.com

Friday, December 8, 2023

Migrating to the Autonomous Database - Dedicated using Database Links

Autonomous Database on Dedicated Exadata Infrastructure and Cloud@Customer (ADB-D and ADB-CC) includes support for outgoing Database Links to various Oracle databases, including its Serverless variant (ADB-S). These links offer a bridge between your source Oracle Database and an Autonomous Database, facilitating data reading or transfer between the two.

Migration using Oracle Data Pump


Oracle Data Pump offers diverse techniques for transitioning data between your source and target databases. Among them, one efficient route uses a network (database) link.


The advantages? Using a database link for data migration via the Data Pump import utility (impdp) eliminates the need to write dump files or move them from the source database to intermediary storage such as Object Storage or a network file system. This is particularly beneficial for smaller databases, since the entire migration can take place over the network link.

Step-by-Step Migration Guide Using Database Links


Prerequisites:

Source Oracle Database:

Your database could be on-premises, on Oracle Cloud Infrastructure (OCI), or with another cloud provider.

CPAT:

The Cloud Premigration Advisory Tool (CPAT) helps you evaluate an existing Oracle database for compatibility with the Autonomous Database (ADB). Using CPAT before migrating to ADB makes assessing your source database easier and faster!  CPAT removes much of the legwork of identifying potential user actions, prioritizing their importance, and suggesting resolutions.

Network Connectivity:

Ensure seamless network connectivity between your source Oracle Database and the target ADB-D.  Remember, your ADB-D sits in a private subnet, making it essential to have a reliable network configuration in place.

Target Autonomous Database:

Gear up by provisioning the ADB-D on Oracle Cloud Infrastructure or Cloud@Customer. Need guidance on this?  Dive into our detailed guide.

Setting Up the Database Link:


Preparation:

Before creating the database link, ensure you have connectivity to your ADB-D, either by copying the wallet onto your source Oracle Database host or by using a wallet-less connection. Here's a quick walkthrough (Lab 2: Configure a Development System) to help you with the download and transfer process.

Configuration:

Adjust your TNS_ADMIN configuration (tnsnames.ora and sqlnet.ora) on your source Oracle Database, and make certain you can connect to the Autonomous Database - Dedicated with SQL*Plus.


Database Link Creation:

Execute the following command:

CREATE DATABASE LINK <Source_Database_global_unique_name> CONNECT TO system IDENTIFIED BY <SourceDB_password> USING '<Connect_string_Source_DB>';


Testing:

Confirm the functionality of the newly created Database Link with this simple test:

select * from dual@<Database_Link>;


Data Migration:


Preparation:

Before migrating, ensure you've set up the necessary profiles, roles, and Tablespaces in the ADB-D instance.

Data Import:

You can remap to the DATA tablespace in Autonomous Database Dedicated, or you can create the same tablespaces you have in your source database. Note that for Autonomous Data Warehouse, a custom-created tablespace is not enabled with compression; you can enable compression while creating the tablespace or alter it afterwards:

alter tablespace test default compress for QUERY HIGH ROW LEVEL LOCKING;

Also, make sure to create your existing roles in ADB-D before starting the Data Pump import.
Determine which schemas you'd like to migrate. Once decided, initiate the Data Pump import with the following command:

impdp admin/<ADB_Password>@adb2_high SCHEMAS=<schemas> network_link=<Database_Link> parallel=1 transform=segment_attributes:n exclude=cluster nologfile=yes remap_tablespace=USERS:DATA

Note: Post-migration, you may occasionally encounter 'Role grant failed' errors. This is because ADB-D restricts access to SYS, SYSTEM, and DBA.


Verification:

After the migration, connect to your ADB-D with SQL*Plus to cross-check the migrated data. Simply select values from the table in question to confirm its presence.


Source: oracle.com

Wednesday, December 6, 2023

Database links in Autonomous Database Serverless are the past - Cloud links are the future

As many of you know, database links are an established mechanism to allow other remote databases to access specific tables or views in your database.

Database links have been around for decades and always require a two-way interaction in one of two ways: The remote (accessing) side contacts you, the data owner, to ask for access and to get the specifics of how to connect to your system. Alternatively, as the data owner, you must proactively contact the remote side and share the access details for the remote side to set up the database link. With Oracle Autonomous Database Serverless, this is a thing of the past.

With Cloud Links, new functionality in Autonomous Database Serverless, the data owner registers a table or view for remote access by a selected audience that the owner defines. The data is then instantly accessible to everyone who was granted remote access at registration time. No further actions are required to set up a Cloud Link, and whoever is supposed to see and access your data will be able to discover and work with the data made available to them.

Sounds almost too cool to be true, doesn't it? Let's step through how it works.

Cloud Links at work


Let's assume I have central sales information in my autonomous database that other autonomous databases need to access remotely now and then. "Trusted" autonomous databases in the same compartment as my system should be able to access all my detailed sales data, whereas other satellite databases within my tenancy should only be able to see the aggregated sales information per sales channel.

The objects I want to give remote access to look as follows:


There is a base table SALES_ALL and a view SALES_VIEW_AGG defined on top of it that removes the customer information and aggregates the sales figures across the other dimensions. The base table will be accessible within my trusted compartment, whereas the aggregated sales information, stripped of any customer information, should be accessible to everybody in my tenancy.

After the Administrator of my Autonomous Database has given me the privilege to register a table (or view) for remote access with the scope needed for the task at hand, I simply register table SALES_ALL for compartment access and view SALES_VIEW_AGG for my tenancy (for brevity, please consult the documentation for the privilege details).
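The registration itself boils down to a couple of calls to the DBMS_CLOUD_LINK package (a sketch; descriptions are illustrative, and the documented MY$ scope constants are assumed here):

```sql
-- Register the detail table for the compartment, and the aggregated
-- view for the whole tenancy, under regional namespace.name identifiers.
BEGIN
  DBMS_CLOUD_LINK.REGISTER(
    schema_name   => 'CLOUDLINK',
    schema_object => 'SALES_ALL',
    namespace     => 'TRUSTED_COMPARTMENT',
    name          => 'SALES',
    description   => 'Detailed sales data for trusted databases',
    scope         => 'MY$COMPARTMENT');

  DBMS_CLOUD_LINK.REGISTER(
    schema_name   => 'CLOUDLINK',
    schema_object => 'SALES_VIEW_AGG',
    namespace     => 'REGIONAL_SALES',
    name          => 'SALES_AGG',
    description   => 'Aggregated sales per channel',
    scope         => 'MY$TENANCY');
END;
/
```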


What is it about this registration? 


Cloud Links introduce a new concept of regional namespace and name for any data that is made remotely accessible. Think of it as something similar to the database today, where one of the most famous Oracle tables ever has the name "EMP" and lives in the namespace "SCOTT". There can only be one SCOTT.EMP in your database. With Cloud Links, it's the same concept, just on a regional level and without being tied to a single database. And since it's not linked to a single database but needs some boundaries of visibility, there's a new concept of scope. The scope defines who can access your table or view through a cloud link remotely. The scope can be a region, tenancy, compartment, individual databases, or a combination of those. 

That was it. My view CLOUDLINK.SALES_VIEW_AGG will be remotely accessible within my tenancy as REGIONAL_SALES.SALES_AGG, and table CLOUDLINK.SALES_ALL as TRUSTED_COMPARTMENT.SALES without exposing its origin.

After a brief period of central metadata synchronization (normally 5 to 10 minutes), my trusted databases in my compartment can access all my sales data, whereas all databases in my tenancy can access my high-level aggregated information. Any future database in my tenancy, or in the same compartment as my database, will be able to access the same data, safely filtered as required for its work based on the registration policies.

I can verify what objects I registered for remote access for the different scopes in the data dictionary:


The scopes of my two registered objects are, as expected, the tenancy level for REGIONAL_SALES.SALES_AGG and the compartment level inside my tenancy for TRUSTED_COMPARTMENT.SALES.

On the remote (receiving) end, every autonomous database can see what remote objects they have access to by querying the data dictionary:

select * from dba_cloud_link_access;

Let's see what my trusted autonomous databases (in the same compartment) and others in my tenancy will see.


If I connect to a trusted autonomous database (in the same compartment), I can see both remote data sets. In contrast, when I connect to an arbitrary autonomous database within the same tenancy as the autonomous database that registered the objects, I can only see the data set shared at the tenancy level.

Besides the trusted autonomous database and other autonomous databases in my tenancy, no one else will be able to discover or see the table and view that I registered in this example.

You probably don't always know what data has been made remotely available to you, so you can discover what was shared with you, or even find particular data of interest yourself. If you know the data's namespace and name, you can describe it explicitly; more interestingly, you can see what's out there using free-text search.
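A sketch of that free-text discovery with the DBMS_CLOUD_LINK package (the search term is illustrative):

```sql
-- Free-text search across the data sets this database is allowed to see;
-- the matches (namespace, name, description) come back as JSON in a CLOB.
SET SERVEROUTPUT ON
DECLARE
  l_result CLOB;
BEGIN
  DBMS_CLOUD_LINK.FIND('sales', l_result);
  DBMS_OUTPUT.PUT_LINE(l_result);
END;
/
```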


Voilà, we found the dataset that was shared with everybody without necessarily knowing about its existence.

How to work with registered data?


We registered some objects for remote access and verified that we can see them within the scope in which they were defined. But how do I access them now? I have no username/password or other means of authentication and authorization that I shared with a remote database that wants to access my data.

The authentication is done at the registration time of an object. In our example, the trusted autonomous database got access to my sales data by being a trusted database within the same compartment. The same is true for the autonomous database in my tenancy for the aggregated sales data. You only need your Administrator to give you the read privilege on cloud links for authorization (again, please consult the documentation for details here), and you're ready to read the remote data.

Once you have the proper privilege, any remote object that is made accessible to your autonomous database and your user can be queried with the standard "cloud link syntax", namely:

select .. from <namespace>.<name>@cloud$link;

You access the remote data without any knowledge of its location.
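For example, the tenancy-wide aggregate registered earlier can be queried like any local table (the column names here are illustrative):

```sql
-- Query the remotely shared aggregate by its regional namespace.name;
-- no host, service name, or credentials are needed.
SELECT channel, SUM(amount_sold) AS total_sold
FROM   REGIONAL_SALES.SALES_AGG@cloud$link
GROUP  BY channel;
```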


That was not too hard to set up. If I can do it, you can do it for sure as well.

Source: oracle.com