Friday, March 29, 2024

What is AnalyticsOps, and how could it improve your business decisions?

Analytics operations, or AnalyticsOps for short, is a concept that has evolved alongside the advent of cloud computing, democratized analytics, and advanced analytics techniques such as AI/ML. Don't confuse AnalyticsOps with Operational Analytics, which is an entirely different concept, or with the older concept of the BI center of excellence (BICOE). AnalyticsOps works alongside, rather than replacing, DevOps and MLOps to improve business outcomes. Implementing AnalyticsOps as a formal function is a key step in the journey to analytics mastery.

What is AnalyticsOps (analytics operations)?


AnalyticsOps is a set of operational and administrative processes, tools, and tasks for an analytics platform deployment. It involves the integration of analytics and operations to drive user adoption, improved analytics and KPIs, better decision-making, and optimization of business processes toward achieving strategic goals. In simpler terms, AnalyticsOps is the practice of using data analytics and operational tools to increase the value of an analytics platform and achieve desired outcomes. AnalyticsOps enables organizations to collect, process, analyze, and act on data in real time. It aims to create an analytics-driven culture within the organization, where data is integrated into decision-making at all levels - from daily operations to strategic planning - to optimize efficiency, minimize costs, and maximize revenue.

For example, Oracle has a dedicated Global Revenue Operations (Rev Ops) team responsible for developing and maintaining the Oracle Global Analytics, Reporting, and Sales Intelligence Platform (SI), which supports over 11,000 internal stakeholders from various departments, including sales, marketing, and finance. The team proactively monitors a wide range of AnalyticsOps KPIs using Oracle Analytics Cloud (OAC), ensuring that all queries essential to the monthly review process have a maximum response time of 5 seconds. This level of service has led to increased efficiency and accuracy, because Oracle's sales management has agreed to review live data during meetings instead of relying on error-prone spreadsheets or slide show presentations.

Additionally, the Rev Ops team analyzes poorly performing queries and reaches out to the specific users who initiated them, offering to help build more efficient queries. By examining the telemetry data of the OAC instance, the team can identify opportunities for efficiency, such as educating users not to schedule data flows hourly when the source data system updates only every 24 hours, which reduces the operational load on the system. These efforts have had a significant positive impact on the stability and performance of the analytics platform, especially given the large and active user community of over 11,000 stakeholders.
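
To make this kind of telemetry review concrete, here is a minimal sketch in SQL. The table USAGE_TRACKING_LOG and its columns are hypothetical stand-ins for whatever usage-tracking data your analytics platform exposes, not an actual OAC schema:

-- Hypothetical example: find queries that breached the 5-second KPI over the past week.
-- usage_tracking_log and its columns are illustrative, not an actual OAC table.
SELECT user_name,
       query_text,
       ROUND(AVG(response_time_sec), 1) AS avg_response_sec,
       COUNT(*) AS run_count
FROM usage_tracking_log
WHERE start_time >= SYSDATE - 7
GROUP BY user_name, query_text
HAVING AVG(response_time_sec) > 5
ORDER BY avg_response_sec DESC;

The result is a ranked list of users the team can contact with query-tuning help or education, as described above.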

How is AnalyticsOps related to DevOps and MLOps?


DataOps focuses on streamlining and automating data pipelines to improve the speed and quality of analytics. MLOps is an extension of DataOps that specifically addresses the unique challenges of deploying and managing ML models at scale. AnalyticsOps, in turn, integrates analytics and operations to improve analytics workflows, drive better decision-making, and optimize business processes. All three - DataOps, MLOps, and AnalyticsOps - are crucial for business success in today's data-driven economy.

How does AnalyticsOps differ from the BI center of excellence?


The BI center of excellence (BICOE) is an older approach that creates a centralized team, usually residing within IT, responsible for managing the organization's business intelligence (BI) efforts, including data governance, data quality, reporting, and analytics. That team’s responsibilities included providing standardized reports and dashboards to business users, who then used this information to make decisions. Essentially, a BICOE established, delivered, and maintained policies and processes that delivered a mode 1 (as defined by Gartner) style of analytics. The focus was on delivering content through a given BI tool but didn’t incorporate other operational aspects and technologies related to a comprehensive analytics deployment.

In 2016, Gartner declared that “the BICC is dead…” recognizing the evolving landscape of analytics tools, including the emergence of predictive technologies and the groundwork for machine learning (ML). Gartner's observation indicated that the successor to the BICC should prioritize empowering businesspeople with self-service capabilities through training, education, and coaching – which aligns with what we now know as AnalyticsOps.

AnalyticsOps takes a broader approach toward what Gartner refers to as mode 2 analytics. It integrates analytics and operations, including promotion of advanced analytics techniques such as AI/ML, to create an analytics-driven culture within the organization. Instead of relying on a centralized team to provide reports and dashboards, AnalyticsOps empowers business users to analyze and interpret data in real time, allowing them to make faster, more informed decisions. In addition, it puts all the processes, policies, and tools in place to ensure the deployment and availability of the analytics platform that supports modern analytics.

How can AnalyticsOps help reduce “Data Debt?”


“Data debt” refers to the accumulation of data-related issues and problems that haven't been addressed or resolved over time, resulting in a backlog of work that IT hasn’t been able to complete. In the context of corporate data analytics, data debt can arise from incomplete or inaccurate data or data models, outdated data infrastructure, or inefficient data processes that haven’t been adequately maintained. This can impede an organization's ability to make informed analytics-driven decisions. To address data debt, organizations need to invest in data management and governance processes that ensure data quality, accuracy, and consistency for an analytics platform, which can help them make better decisions and stay ahead of the competition. By adopting an AnalyticsOps approach, organizations can better manage their data analytics processes and reduce the risk of accumulating data debt. This helps increase the accuracy and trustworthiness of data and analytics that support the decision-making process. 

To illustrate, consider a digital marketing company that has been collecting customer data from multiple sources, including social media, online orders, and service calls, over the course of several years. During this time, the company underwent changes such as switching database vendors, acquiring companies, and revamping its customer loyalty program, resulting in a significant amount of customer data stored across multiple locations in different formats. As a result, business users struggle to report accurately on customers, often making mistakes due to inconsistent or incomplete data. In this scenario, the company is grappling with overdue issues, or data debt. By implementing AnalyticsOps, the company can address the root cause of its customer data problems by analyzing its current data definitions and creating customer data standards with improved automated data collection, cleansing, and validation processes. The result is simplified customer data definitions, making it easier for business users to correctly access and report on their customers.

What are the benefits of AnalyticsOps?


Unlike DataOps and DevOps, AnalyticsOps takes into account the last mile of the data journey at the analytics layer. Say goodbye to low adoption rates, lack of trust, high costs, and difficulties with regulatory compliance.

  • Improved stability and performance: Stakeholders have greater confidence in using the analytics platform to make business decisions. Ensuring the best performance reduces time wasted waiting and lets businesspeople focus on strategic tasks.
  • Better analytics integrity: AnalyticsOps actively monitors queries and activity to ensure the integrity of the analytics platform by quickly identifying and resolving slow or malformed queries through updates to the data model or data source, or by providing end-user education.
  • Better decision-making: By having access to accurate, timely, and relevant data, businesspeople make informed, data-driven decisions based on insights rather than assumptions or gut feelings. This improves the accuracy of decision-making and reduces the risk of errors.
  • Boosted user adoption: The success of the analytics platform can be measured directly in terms of user adoption, and higher adoption means better ROI on the platform. AnalyticsOps ensures the best user experience and business outcomes, which leads to greater usage (and adoption).
  • Enhanced agility: AnalyticsOps can enable businesses to spot potential risks and to adapt quickly to market shifts and changing customer needs. Real-time data insights allow businesses to make quick decisions and adjust their strategies accordingly.
  • More competitive advantage: By using data insights and techniques such as predictive analytics, businesses can gain a competitive advantage by seizing new and potentially unseen opportunities before their competitors.
  • Increased efficiency: AnalyticsOps can help to automate analytics processes and workflows, reducing the time and effort required to collect, process, and analyze data. AnalyticsOps also helps with cleaning up the catalog (identifying and deleting unused workbooks, data flows, and datasets).
  • Improved collaboration: AnalyticsOps can foster structured collaboration between different people within an organization, including IT, data science, and business groups. This leads to better alignment of objectives, as well as more effective communication and faster decision-making.

What are the costs of AnalyticsOps?


The costs of establishing AnalyticsOps aren't just limited to financial investments in analytics tools and technologies but are primarily indirect costs. These costs are associated with the time, effort, and resources required to provide AnalyticsOps and include:

  • Training and upskilling: Establishing AnalyticsOps requires the upskilling of existing employees to enable them to understand not only analytics tools and technologies, but also the data sources and definitions across the organization’s data estate.
  • Time and resources: Creating an analytics-driven culture requires investment of time and resources. This includes time spent on data collection, preparation, analysis, and interpretation, as well as the resources required to maintain data pipelines and systems.
  • Data governance: AnalyticsOps requires a strong data governance framework to ensure that data is accurate, trusted, and secure.  This requires additional effort and resources to develop and maintain.
  • Change management: Implementing AnalyticsOps may require changes to existing processes and workflows. This can create resistance from employees who are accustomed to working in a certain way with certain tools (such as Microsoft Excel) and may require additional time and effort to manage.
  • Collaboration and communication: AnalyticsOps requires collaboration between different teams within the organization, including IT, data science, and business groups. This requires effective communication and coordination, which may be challenging to establish and maintain.

Summary

AnalyticsOps is a critical and proactive methodology that integrates analytics and operations to improve business outcomes, drive user adoption, and achieve strategic goals. It's a decentralized approach that empowers business users to analyze and interpret data in real time, allowing them to make faster, more informed decisions. AnalyticsOps also emphasizes the importance of agility and continuous improvement. It uses automation and machine learning to streamline data processes and optimize business performance. This enables organizations to quickly adapt to changing market conditions and stay ahead of the competition. It's a key step in the journey toward analytics mastery, and organizations that embrace it will thrive in the data-driven economy.

Source: oracle.com

Wednesday, March 27, 2024

Understanding Database Management Systems (DBMS)

In the realm of digital infrastructure, the term Database Management System (DBMS) holds immense significance. A DBMS serves as a cornerstone for organizing and managing data efficiently within an organization. Let's delve into the intricacies of what constitutes a DBMS and its pivotal role in modern-day information management.

What is a Database Management System?


A Database Management System (DBMS) is a software suite designed to facilitate the storage, retrieval, and manipulation of data in a structured format. It acts as an intermediary between users and databases, providing an interface for users to interact with the data while ensuring its integrity, security, and accessibility.

Components of a DBMS

A DBMS comprises several key components, each playing a crucial role in its functionality:

1. Data Definition Language (DDL)

The DDL component of a DBMS enables users to define the structure of the database, including tables, fields, and relationships. It allows for the creation, modification, and deletion of database objects, ensuring data integrity and consistency.

2. Data Manipulation Language (DML)

The DML component facilitates the manipulation of data within the database, allowing users to insert, update, delete, and retrieve records. It provides the necessary commands and operations to interact with the data stored in the database.

3. Data Query Language (DQL)

DQL enables users to retrieve specific information from the database using queries. It allows for the extraction of data based on predefined criteria, enabling users to retrieve relevant information efficiently.

4. Data Control Language (DCL)

The DCL component governs the access and permissions associated with the database. It regulates user access rights, security privileges, and data integrity constraints, ensuring that only authorized users can manipulate the data as per defined policies.
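
To see how the four categories fit together, here is a minimal sketch in generic SQL; the table, columns, and role names are illustrative:

-- DDL: define the structure of the database
CREATE TABLE employees (
  emp_id NUMBER PRIMARY KEY,
  name   VARCHAR2(100),
  salary NUMBER
);

-- DML: manipulate the data
INSERT INTO employees (emp_id, name, salary) VALUES (1, 'Ada', 95000);
UPDATE employees SET salary = 99000 WHERE emp_id = 1;

-- DQL: query the data
SELECT name, salary FROM employees WHERE salary > 90000;

-- DCL: control access
GRANT SELECT ON employees TO reporting_role;
REVOKE SELECT ON employees FROM reporting_role;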

Types of Database Management Systems


DBMSs can be categorized into various types based on their underlying architecture and functionality. Some common types include:

1. Relational DBMS (RDBMS)

RDBMS organizes data into tables with rows and columns, establishing relationships between them. It employs Structured Query Language (SQL) for data manipulation and retrieval, offering a robust and standardized approach to data management.

2. NoSQL DBMS

NoSQL databases diverge from the traditional relational model, offering flexible schemas and scalable architectures. They are well-suited for handling unstructured or semi-structured data types and can accommodate high volumes of data with distributed computing capabilities.

3. Object-Oriented DBMS (OODBMS)

OODBMS stores data as objects, encapsulating both data and methods within a single entity. It provides support for complex data structures and inheritance relationships, making it suitable for object-oriented programming paradigms.

4. Graph DBMS

Graph databases represent data as nodes, edges, and properties, facilitating the storage and traversal of interconnected data structures. They excel in modeling complex relationships and are widely used in applications involving social networks, recommendation systems, and network analysis.
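
Graph query languages vary by product. As one SQL-based sketch, recent Oracle Database releases support the ISO SQL/PGQ property-graph syntax; the tables, keys, and names below are illustrative assumptions:

-- Define a property graph over existing relational tables (illustrative schema).
CREATE PROPERTY GRAPH friends_graph
  VERTEX TABLES (
    persons KEY (person_id) LABEL person PROPERTIES (name)
  )
  EDGE TABLES (
    friendships KEY (friendship_id)
      SOURCE KEY (person_a) REFERENCES persons (person_id)
      DESTINATION KEY (person_b) REFERENCES persons (person_id)
      LABEL friend_of
  );

-- Traverse the graph: who are Ada's direct friends?
SELECT friend_name
FROM GRAPH_TABLE (friends_graph
  MATCH (a IS person) -[IS friend_of]-> (b IS person)
  WHERE a.name = 'Ada'
  COLUMNS (b.name AS friend_name)
);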

Importance of DBMS in Business Operations


The adoption of a robust DBMS offers several benefits for organizations across various industries:

  • Data Centralization: A DBMS centralizes data storage, enabling organizations to maintain a single source of truth for their information assets.
  • Data Consistency: DBMS ensures data consistency by enforcing integrity constraints and validation rules, thereby minimizing the risk of errors and inconsistencies.
  • Data Security: DBMS implements access controls and encryption mechanisms to safeguard sensitive data from unauthorized access and cyber threats.
  • Scalability: DBMS offers scalability features to accommodate growing data volumes and user demands, ensuring optimal performance and resource utilization.
  • Decision Support: DBMS provides tools and utilities for data analysis, reporting, and visualization, empowering decision-makers with actionable insights derived from data-driven analysis.

Conclusion

In essence, a Database Management System (DBMS) serves as a foundational pillar for efficient data management, storage, and retrieval within organizations. By leveraging the capabilities of a robust DBMS, businesses can streamline their operations, enhance data integrity, and unlock valuable insights to drive informed decision-making.

Friday, March 22, 2024

Unveiling the Power of Oracle Database 23c: Revolutionizing Data Management

In the ever-evolving landscape of data management, Oracle has yet again set a new standard with the introduction of Oracle Database 23c. This groundbreaking release brings forth a plethora of features and enhancements designed to streamline operations, enhance security, and elevate performance to unprecedented levels. In this comprehensive guide, we delve into the intricate details of Oracle Database 23c, exploring its key features, benefits, and the transformative impact it can have on your organization's data infrastructure.

Enhanced Security Measures


Security is paramount in today's digital age, and Oracle Database 23c delivers on this front with an array of enhanced security measures. With features such as Data Safe and Transparent Data Encryption, organizations can fortify their data against unauthorized access and cyber threats. Additionally, Oracle Label Security provides granular control over data access, allowing organizations to enforce fine-grained security policies based on user roles and data classifications.

Advanced Machine Learning Capabilities


Harnessing the power of machine learning, Oracle Database 23c empowers organizations to unlock valuable insights from their data like never before. Through features such as AutoML, organizations can automate the process of building and deploying machine learning models, enabling data-driven decision-making at scale. Furthermore, Adaptive Query Optimization leverages machine learning algorithms to optimize query performance, ensuring efficient utilization of resources and maximizing throughput.

Seamless Scalability and Performance


Scalability and performance are cornerstones of any robust data management solution, and Oracle Database 23c excels in both areas. With support for multitenant architecture and in-memory processing, organizations can effortlessly scale their databases to accommodate growing workloads while maintaining optimal performance levels. Moreover, Automatic Indexing leverages machine learning algorithms to automatically create and manage indexes, further enhancing query performance and reducing administrative overhead.
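
As a small illustration of the automatic indexing feature, it is typically enabled per database through the DBMS_AUTO_INDEX package; a minimal sketch of two common configuration modes:

-- Let the database create auto indexes and make them available to the optimizer.
EXEC DBMS_AUTO_INDEX.CONFIGURE('AUTO_INDEX_MODE', 'IMPLEMENT');

-- Or evaluate candidate indexes without exposing them to application queries.
EXEC DBMS_AUTO_INDEX.CONFIGURE('AUTO_INDEX_MODE', 'REPORT ONLY');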

Cloud-Native Innovations


As organizations increasingly embrace cloud-native technologies, Oracle remains at the forefront with Oracle Database 23c's cloud-native innovations. With support for Oracle Autonomous Database and Oracle Cloud Infrastructure, organizations can seamlessly migrate their on-premises databases to the cloud, unlocking greater agility, scalability, and cost-efficiency. Additionally, Oracle Exadata Cloud@Customer brings the power of Exadata to the cloud, allowing organizations to leverage the industry's leading database platform without the need for upfront infrastructure investments.

Enhanced Data Integration and Management


Efficient data integration and management are essential for driving actionable insights and maximizing the value of data assets. Oracle Database 23c offers a comprehensive suite of tools and features for data integration, data warehousing, and real-time analytics, empowering organizations to extract, transform, and load data from disparate sources with ease. With Oracle Data Integrator and Oracle GoldenGate, organizations can streamline data movement and replication, ensuring data consistency and integrity across the entire ecosystem.

Future-Proof Your Data Infrastructure with Oracle Database 23c

In conclusion, Oracle Database 23c represents a paradigm shift in the realm of data management, combining unparalleled security, advanced machine learning capabilities, seamless scalability, and cloud-native innovations to deliver a comprehensive solution for modern organizations. Whether you're looking to enhance security, boost performance, or embrace cloud-native technologies, Oracle Database 23c has you covered. Embrace the future of data management with Oracle Database 23c and stay ahead of the curve in today's data-driven world.

Wednesday, March 20, 2024

How to help AI models generate better natural language queries

Using natural language to query your data is an easy way to answer business questions. One question I’m often asked is, “How can this work on my data? Have you seen my table and column names? The names are meaningless.” Fear not! It is possible when you’re using Autonomous Database.

There is no magic. If your table and column names aren’t descriptive, you can help the large language model (LLM) interpret the meaning of tables and columns by using a built-in database feature called “comments”. Comments are descriptions or notes about a table or column’s purpose or usage. And the better the comment, the more likely the LLM will know how to use that table or column to generate the right query.

Adding Comments to your tables and columns


Let’s take an example. My database has 3 tables. The table names and columns are meaningless:

TABLE1
CREATE TABLE table1 (
  c1 NUMBER,
  c2 VARCHAR2(200),
  c3 NUMBER
);
TABLE2
CREATE TABLE table2 (
  c1 TIMESTAMP,
  c2 NUMBER,
  c3 NUMBER,
  c4 NUMBER,
  c5 VARCHAR2(100),
  c6 NUMBER,
  c7 NUMBER
);
TABLE3
CREATE TABLE table3 (
  c1 NUMBER,
  c2 VARCHAR2(30)
);

There is zero chance that a natural language query will know that these tables represent movies, genres and streams. We can fix that ambiguity by adding database comments:

TABLE1
COMMENT ON TABLE table1 IS 'Contains movies, movie titles and the year it was released';
COMMENT ON COLUMN table1.c1 IS 'movie ids. Use this column to join to other tables';
COMMENT ON COLUMN table1.c2 IS 'movie titles';
COMMENT ON COLUMN table1.c3 IS 'year the movie was released';
TABLE2
COMMENT ON TABLE table2 IS 'transactions for movie views - also known as streams';
COMMENT ON COLUMN table2.c1 IS 'day the movie was streamed';
COMMENT ON COLUMN table2.c2 IS 'genre ids. Use this column to join to other tables';
COMMENT ON COLUMN table2.c3 IS 'movie ids. Use this column to join to other tables';
COMMENT ON COLUMN table2.c4 IS 'customer ids. Use this column to join to other tables';
COMMENT ON COLUMN table2.c5 IS 'device used to stream, watch or view the movie';
COMMENT ON COLUMN table2.c6 IS 'sales from the movie';
COMMENT ON COLUMN table2.c7 IS 'number of views, watched, streamed';
TABLE3
COMMENT ON TABLE table3 IS 'Contains the genres';
COMMENT ON COLUMN table3.c1 IS 'genre id. use this column to join to other tables';
COMMENT ON COLUMN table3.c2 IS 'name of the genre';
 
That’s it! The meaningless table and column names can now be understood by the LLM using Select AI.

Set up your Select AI profile to use comments

A Select AI profile encapsulates the information needed to interact with an LLM. It includes the AI provider, the model to use, the source tables used for natural language queries – and whether comments should be passed to the model for SQL generation.

begin
  dbms_cloud_ai.create_profile(
    profile_name => 'myprofile',
    attributes =>
        '{"provider": "azure",
          "azure_resource_name": "my_resource",
          "azure_deployment_name": "my_deployment",
          "credential_name": "my_credential",
          "comments": "true",
          "object_list": [
            {"owner": "moviestream", "name": "table1"},
            {"owner": "moviestream", "name": "table2"},
            {"owner": "moviestream", "name": "table3"}
          ]
         }'
  );

  -- Make the profile active for this session. The "comments": "true"
  -- attribute above tells Select AI to pass table and column comments
  -- to the LLM for SQL generation.
  dbms_cloud_ai.set_profile(
    profile_name => 'myprofile'
  );
end;
/

Run your queries

You can now start asking questions using natural language against your complex schema. Even though the table and column names are meaningless, the LLM is able to identify the appropriate tables and columns through the comments and generate a query:
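
For example, with the profile set in your session, you can ask a question directly in SQL. The prompt below is illustrative; Select AI derives the joins and filters from the comments added earlier:

-- Ask a question in natural language; Select AI generates and runs the SQL.
select ai what were the total views for each genre;

-- Use the showsql action to inspect the generated query instead of running it.
select ai showsql what were the total views for each genre;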

Source: oracle.com

Monday, March 18, 2024

Announcing the Oracle APEX Sample Document Generator App!

Today, we are releasing the Sample Document Generator App, which showcases the integration of the Document Generator in APEX. The app is now available in the APEX 23.2 Gallery.

This app includes examples of generating PDF documents using a custom plug-in, which invokes the Document Generator function. Feel free to explore the plug-in code, the MS Word templates stored in Static Application Files, and the JSON.

Cost-effectiveness


To create the Pre-Built Document Generator function, you need a paid account, but the pricing is extremely cost-effective. With Oracle Cloud Functions, you only pay for the resources consumed during execution, eliminating the need for upfront infrastructure investment.

Installation on Oracle Cloud is required


The sample app uses a plug-in that leverages the Oracle Cloud Infrastructure SDK for PL/SQL, which makes it very easy to manage and invoke OCI resources. This SDK is only available on Autonomous Database, which means the sample app requires an Autonomous Database. That doesn’t mean you can’t use the Document Generator in another APEX instance.

There are several prerequisites we must take care of before we can use the sample app to generate PDF documents. These steps are outlined below.

Configure the Document Generator Function


On OCI, open the navigation menu and select Pre-Built Functions in Developer Services.

Select Document Generator. 

Click the Create Function button. If a suitable application doesn't already exist in the current compartment, click Create new application.

Tip: When using Document Generator in production, you can enable provisioned concurrency to reduce initial provisioning time and ensure hot starts.

Click the Create button to finish the wizard. The function is now deployed in the application.

Create an Object Storage Bucket


Open the navigation menu and select Buckets from Storage.

Create a new bucket if needed and provide a name.

The sample app will use this bucket later to store the MS Word templates and PDF documents.

Configure the Database


In the next steps we make sure the database has the rights to invoke the Document Generator function and manage objects in Object Storage Buckets.

We start by creating a Dynamic Group for the database.

Open the navigation menu and select Domains from Identity & Security. Select the identity domain you want to work in and click Dynamic Groups. Create a new Dynamic Group if needed by specifying a name, description, and a rule using the OCID of your Autonomous Database as resource ID. Remember the name because we will need it afterwards.

The matching rule is defined as:

resource.id = '<db_ocid>'

Next, we create a new Policy for this Dynamic Group. Open the navigation menu and select Policies from Identity & Security.

Create the following policy.

Allow dynamic-group <group_name> to manage objects in compartment <compartment_name>
Allow dynamic-group <group_name> to manage buckets in compartment <compartment_name>
Allow dynamic-group <group_name> to use functions-family in compartment <compartment_name>

As the final step in OCI, we have to execute the following statements as the ADMIN user in the Autonomous Database.

Go to the Autonomous Database and click SQL under Database Actions.

Execute the following code as the ADMIN user.

begin
    DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL();
    DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL(username => '<WORKSPACE_SCHEMA>');
end;
/

grant DWROLE to <WORKSPACE_SCHEMA>;

The resource principal is used to authenticate and access Oracle Cloud Infrastructure resources. The DWROLE enables you to use the OCI PL/SQL SDK.

Quickly Installing the Document Generator Sample App


Perform the following steps to install the app:

1. Go to the App Gallery.
2. Search for the Sample Document Generator app.
3. Click install.
4. Go to the application and navigate to Shared Components > Component Settings.
5. Specify values for the empty attributes:
  • Region Name: The OCI region of the Object Storage Bucket and Document Generator function.
  • Document Generator Function OCID: The OCID of the function in the OCI application.
  • Bucket Namespace: The OCI namespace of the Object Storage Bucket.
  • Bucket Name: The name of the Object Storage Bucket.

Now we have completed the prerequisites and are ready to use the app!

Source: oracle.com

Friday, March 15, 2024

Introducing Zero to low-cost Autonomous Database for Developers

Oracle has recently been recognized as a leading cloud service provider (CSP), providing a full suite of cloud computing solutions including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and industry-specific application solutions via Software as a Service (SaaS).

To accomplish this, we created a next-generation cloud experience that focuses on enterprise performance, availability, security and cloud economics where you pay for what you use. The economic benefits of Oracle Cloud Infrastructure (OCI) are substantial, enabling workloads such as AI and Oracle Database to achieve outstanding price-performance.

We are now further improving OCI economics for our Oracle Database cloud service portfolio with the introduction of Oracle Autonomous Database for Developers, which provides Autonomous databases for developers on Dedicated Exadata Infrastructure and Exadata Cloud@Customer at no additional cost. 

Oracle Autonomous Database is an ideal database for developers. It provides multi-model database capabilities for many types of data (including relational, JSON, spatial, graph, multimedia, XML, files, and more), many workloads (transactional, data warehouse, and analytics), and typical developer interfaces (full SQL, REST data access, and language drivers). It comes with free development tools such as Database Actions, Oracle Application Express (APEX) for low-code app creation, and Oracle REST Data Services. It also includes in-database machine learning algorithms and Select AI which enables users to query data using generative AI powered natural language processing.

Autonomous Database for Developers enables developers to experiment with Autonomous Database and build applications with no additional cost. The free developer databases are intended solely for development and functional testing. There is no limit on the number of developer databases users can create on their Exadata and these databases have no expiration dates. Autonomous Database for Developers supports Transaction Processing and Data Warehousing workload types. Each developer database instance has 4 ECPUs, 20 GB of data storage, and supports up to 30 concurrent database sessions.

Except for Autonomous Data Guard, Database In-Memory, Autoscaling, and long-term backups, all other features of Autonomous Database, such as backup and restore, cloning, patching, APEX, Database Actions, ORDS, Performance Hub, Select AI, APIs, metrics, and notifications are included in Autonomous Database for Developers. While developer databases may lack a few production database features, they are otherwise 100% compatible with the Autonomous databases used in production environments, letting developers create and test their applications against identical database environments.

Developer databases are automatically patched following the same schedule as regular Autonomous databases. Developers can file service requests (SR) to Oracle Support to get assistance with their developer databases; however, there is no severity 1 SR support or critical one-off patches. Autonomous Database for Developers adheres to a 99.5% service level objective (SLO).

While Autonomous Database for Developers is for development and functional testing only, users can access the full suite of Autonomous Database features or scale up the database for non-development deployments such as load/stress testing and production by cloning a developer database to a full-featured Autonomous Database instance and running there.

Oracle Autonomous Database offers developers a powerful and user-friendly platform for building and deploying mission-critical applications with high performance, scalability, and built-in security while minimizing administrative overhead and costs. With the introduction of Autonomous Database for Developers and its free Autonomous databases for developers, there is now an even more compelling reason to start all new application development with Autonomous Database.

Source: oracle.com

Wednesday, March 13, 2024

Audit Active Data Guard with Data Safe in Oracle Cloud

We’re excited to announce that Oracle Data Safe can now monitor the database activity of Active Data Guard configurations for Oracle Database on Oracle Exadata Database Service on Dedicated Infrastructure (formerly known as Exadata Cloud Service) and Oracle Base Database Service (formerly known as Oracle Database Cloud Service).

Active Data Guard (ADG) is an evolution of Oracle Data Guard technology that incorporates significant innovation designed for a specific purpose - to offload work from the production database, freeing up resources for critical transactions. ADG enables read-only access to a physical standby database while redo application is active. Workloads such as reporting, analytics, backups, queries, and even occasional writes (a new ADG feature as of Oracle Database 19c) can be offloaded from the production system to a synchronized physical standby database. These workloads would otherwise consume valuable resources on the primary production site; therefore, ADG saves valuable CPU and I/O cycles and promotes efficient use of system resources in the configuration. Since ADG opens up standby databases for read/write workloads, most regulatory and compliance requirements emphasize the need to monitor the database activity on standby databases, though less rigorously compared to the primary production database.

Data Safe now provides a single pane of glass for monitoring database activity across all the database peers in an ADG configuration (including the primary database and all the associated standby databases) without worrying about redundant audit record collection. A brief look at the mechanism within Oracle Database auditing that enables this feature in Data Safe explains why this is important.

Unified audit records within the Oracle Database are written to a table in the AUDSYS schema called AUD$UNIFIED. When the database is not writable (typically when the database is closed or read-only, as in ADG), the Oracle Database writes audit records to external operating system spillover .BIN files. The audit data of the spillover files is presented in the view GV$UNIFIED_AUDIT_TRAIL.

The view UNIFIED_AUDIT_TRAIL is a UNION ALL of the table AUDSYS.AUD$UNIFIED and the view GV$UNIFIED_AUDIT_TRAIL.

The capability to monitor audit records from standby databases is built into the UNIFIED_AUDIT_TRAIL since unified audit was introduced in Oracle Database 12c. However, because audit records from the primary database (written to the database table AUD$UNIFIED) are captured in redo and replicated to the standby, it was challenging to separate activity on the standby from activity on the primary. Oracle Database 19c Release Update 21 (19.21) introduced a new column, SOURCE, in UNIFIED_AUDIT_TRAIL, making it easy to differentiate the origin of audit records. That new column helps avoid redundant audit record collection from ADG.
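
On 19.21 and later, you can see this distinction directly. The illustrative queries below separate audit records that originated in the local spillover files (as on an ADG standby) from those replicated from the primary:

-- Activity captured in spillover files on this database (e.g., an ADG standby)
SELECT event_timestamp, dbusername, action_name, source
FROM unified_audit_trail
WHERE source = 'FILE'
ORDER BY event_timestamp DESC;

-- Records written to AUDSYS.AUD$UNIFIED on the primary and replicated via redo
SELECT COUNT(*) FROM unified_audit_trail WHERE source = 'DATABASE';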

Figure 1: Unified audit trail with SOURCE column to differentiate the origin of audit records

Leveraging the SOURCE column value in the UNIFIED_AUDIT_TRAIL view enables Data Safe to monitor an entire ADG configuration (a single primary database and multiple standby databases) as a single target with multiple unified audit trails. The primary database in the ADG (as identified by the system-generated failover connection string with a role-based database service) has an audit trail that collects from the database table AUDSYS.AUD$UNIFIED by querying the UNIFIED_AUDIT_TRAIL view with SOURCE set to DATABASE. Each database in the ADG has an audit trail that collects from that database’s corresponding spillover files by querying the UNIFIED_AUDIT_TRAIL view with SOURCE set to FILE.

A sample monitoring configuration for an ADG with one primary and two standby databases is represented here.

Figure 2: Database activity monitoring of ADG as a single target with multiple unified audit trails

Once you register the primary, along with any ADG peers, in Data Safe as a database target, the associated audit profile contains the details of the multiple audit trails discovered automatically from the metadata. The audit trails will have an indicator (FILE or TABLE) to identify the SOURCE of audit records, as shown here.

Figure 3: Audit profile of the single ADG target with multiple unified audit trails in Data Safe

Collecting unified audit records in Data Safe commences once you start the corresponding audit trails, and audit reports show the ADG target's audit events from the primary and standby databases.

A sample login activity report of the ADG target is shown here, with audit events from both primary and standby databases. The database unique name column lets you correlate activity to the specific database in the ADG target where the audit event was triggered.

Figure 4: Audit report in Data Safe of the ADG target showing audit events from all the databases

In a nutshell, Data Safe provides a single pane of glass for monitoring database activity across all the Oracle databases in an ADG configuration as a single target with multiple unified audit trails.

Source: oracle.com

Monday, March 11, 2024

Customizing risk assessment in Oracle Data Safe

Oracle Data Safe Security Assessment helps you assess and monitor changes to your database security risks by identifying security misconfigurations, missing policies, users, and entitlements. After the initial risk identification, customers typically evaluate the risks by validating them and their risk levels before remediating them. Sometimes the identified risk is not applicable as there might be some other mitigating control or it might not be important for your business or auditors. Customers would like Data Safe to adjust the findings to match their organization’s specific needs and help streamline the assessment process.

We are pleased to announce that you can now “defer risk” or “change risk” levels to match your specific environment and deployment. “Defer Risk” allows you to indicate that you have reviewed the finding and will work on it later (or, eventually, never) so that it doesn’t show up again as a finding in subsequent reports. “Change Risk” allows you to raise or lower the severity of a finding to suit your requirements.

Use Cases


In the example below, the organization has decided to “defer” the risk for users with expired passwords until it can study who these users are.

Figure 1. Data Safe Security Assessment - Deferring or changing a finding risk level.

Use case 1


Data Safe Security Assessment identified that the database does not have a recent backup (no records in the last 90 days) and flagged it as a High Risk. In this case, however, the database was backed up within the last 80 days, but as a cold backup with third-party technology. You have decided that there is no risk, and you can now mark the finding as a “Pass.” Thus, the assessment report would no longer show this as a “finding”.

Use case 2


Security Assessment identified that you have five users with the DBA role and marked it as “Evaluate.” After careful examination, you’ve noticed that all five users are approved accounts for your company’s database administrators. Despite reading the remarks on why it is better not to use the out-of-the-box DBA role, you consciously decided to mark it as “Low risk.” Database administrators are still using the default DBA role but there are plans to review their privileges with Privilege Analysis and to create a customized DBA role with only the necessary privileges. Additionally, Database Vault realms protect the application schemas to further reduce the risk of misuse or compromise.

Figure 2. Deferring risks for later reevaluation

Use case 3


Security Assessment identified an application service account that allows unlimited failed logins. Investigation reveals that, following the last password change, several batch processes continued to use the old password, locking the application account and causing an outage. The issue is being worked on, with plans to implement gradual password rollover for all application account profiles. In the meantime, failed login attempts are being audited, and Audit Vault is configured to alert whenever a new failed login attempt is made. The risk is set to deferred until the password rollover profiles are implemented.

Risk Modification Report


The user changing the risk level will need to provide a justification for the change. The user can also set an expiration date. Setting an expiration date will clear the overridden level at that time and let the assessment show the actual finding level again.

Modified risk levels are tracked and available under the “Risk modification report.” In this report you will see the originally identified risk level, the modified risk level, or whether the risk was deferred, along with the justification and the expiration date. The user that made the change and the last update time are also tracked.

Figure 3. Data Safe Security Assessment – Risk modification report.

Conclusion

With this addition, Data Safe helps you streamline and adjust the assessment report to meet your corporate and regulatory needs. Now, in addition to assessing your database according to standard practices, you can also customize the risk levels, manually pass findings, and track your progress toward compliance.

Source: oracle.com

Friday, March 8, 2024

Expanded enterprise-class support with Oracle Audit Vault and Database Firewall Release Update 11 (20.11)

One of our design goals for Oracle Audit Vault and Database Firewall is to continue to provide an enterprise-class solution that takes much of the complexity out of database activity monitoring and database security posture management. With that goal in mind, Audit Vault and Database Firewall (AVDF) 20 Release Update 11 (20.11) continues to expand support for enterprise-class features along with significant improvements in usability and operations.

Here is what’s new in the latest AVDF Release:

  • QuickCSV audit collector
  • Integration with identity providers for single sign-on
  • Revamped alert UI workflow
  • Fleet-wide security assessment drift chart
  • Expanding support for tracking before/after values
  • Finely scoped database firewall policies and reports
  • Use of global sets in all activity and GDPR reports
  • Audit trail migration
  • AVDF certificate rotation from UI

Now, let's review some of those in detail.

QuickCSV audit collector


AVDF supports out-of-the-box audit log collection from multiple target types, including relational databases such as Oracle, Microsoft SQL Server, IBM DB2, MySQL, and PostgreSQL. AVDF also collects audit records from non-database targets, including operating system audit records for Windows, AIX, SPARC, and Solaris, as well as Microsoft Active Directory. However, a wide variety of systems produce audit records, and this is where AVDF’s custom collector framework helps by collecting audit records available from database tables via RESTful API or audit data stored in JSON, CSV, and XML file formats.

We have seen that comma-separated values (CSV) is one of the most popular audit log formats used in applications, databases, and infrastructure components. With the new QuickCSV collector in AVDF 20.11, you can easily import CSV audit files and map them to the AVDF audit schema as a one-time task. Once mapping is complete, audit data will be collected periodically from the CSV audit files, just like any other supported target.

For example, you may use the QuickCSV collector to collect audit data from MariaDB, EnterpriseDB (Postgres), and other databases that create audit data in CSV. This approach helps you generate audit reports and alerts and protect and manage audit logs. 

Figure 1: QuickCSV collector

Integration with identity providers for single sign-on


Today, many of you implement single sign-on (SSO) using an enterprise identity service for your applications to minimize account proliferation and authentication mechanisms. Now, with AVDF 20.11, you can integrate with identity providers (IdP) such as Azure, Active Directory Federation Services, and Oracle Access Manager through SAML 2.0 integration. After integrating AVDF with your IdP, AVDF console users can be authenticated by your IdP using SSO.

With this new feature, you have multiple options to choose from as your authentication method:

  • Users authenticated with SSO – added in 20.11
  • Users in a centralized directory like Microsoft Active Directory or OpenLDAP
  • Users authenticated using local passwords in AVDF

Figure 2: AVDF authentication methods

Revamped alert UI workflow


We know most organizations are concerned about data breaches and ransomware, but when prevention fails, we need to help you understand what happened and detect attempts quickly. A better alert policy workflow contributes to a better detection system.

AVDF’s alert policy creation is completely revamped in AVDF 20.11, providing an intuitive and user-friendly experience.  New alert policies can be created with just a few clicks through pre-defined templates or by modifying existing policies with new conditions.

AVDF’s unique interactive reporting capability allows customers to quickly slice and dice the data to reach that one record of interest, and now this capability is available in alert policies as well. You can also use the interactive report filters to define complex conditions for which alerts can be raised. In the example below, if you want to receive an alert when a privileged user updates sensitive data directly from SQL*Plus, you can simply apply filters to your report and copy them to create the alert condition.

Figure 3: Alert policy condition using report filters

You can get a quick view of all the alerts generated right on the alert policy page, without leaving the alert definition, improving overall alert usability. In addition, we made it much easier to notify recipients of any alerts raised. Your auditor dashboard now provides multiple actionable insights on the generated alerts.

Figure 4: Alert Insights

Fleet-wide security assessment drift chart


In AVDF 20.9 and 20.10, we introduced fleet-wide security assessment and drift management, respectively. AVDF 20.11 now allows you to quickly see how the security posture of all your Oracle databases is changing by introducing the security assessment drift chart. The chart on the auditor’s dashboard compares the latest assessment with the defined baseline for all databases and quickly identifies any drift requiring attention.

Figure 5: Security Assessment Drift Chart

More updates with AVDF 20.11


AVDF 20.11 introduces many other significant enhancements and features to improve your AVDF experience. Some of them are listed below.

Expanding support for tracking before/after values: AVDF currently collects before/after values from Oracle and Microsoft SQL Server databases, helping customers meet compliance requirements where they need to track value changes. AVDF 20.11 now extends the same before/after value change auditing support to MySQL, helping customers meet their compliance requirements for MySQL databases as well.

Finely scoped database firewall policies and reports: Until now, Database Firewall (DBFW) policies and reports were based on command groups such as DML, DDL, and DCL, and customers could not easily create policies on just a specific command.  With AVDF 20.11, the command class has been expanded to commands such as DELETE, INSERT, UPDATE, DROP TABLE, etc. This enhancement helps you define narrow alert conditions and create unified reports – irrespective of whether the event data was from the audit logs or network-based SQL.   

Use of global sets in all activity and GDPR reports: Until now, global sets of IP addresses, OS/DB users, sensitive objects, privileged users, and client programs have been used across Database Firewall policies, making it easier to apply the same rules.  Starting in AVDF 20.11, you can now apply the same global set to filter all activity reports, including the compliance reports. For example, in GDPR compliance reports, you can use sensitive object sets to view user activity on sensitive data.

Audit trail migration: Customers have requested easy ways to migrate their audit trails to different agents due to aging agent hardware or the need for improved load balancing across agents. AVDF 20.11 provides the flexibility to migrate an audit trail from one agent to another, or between agent and agentless configurations, without losing any audit data or restarting the agent/trail.

AVDF certificate rotation from UI: AVDF uses certificates for internal communication among various services. Previously, rotating them was a lengthy and only partially automated process. Now, with 20.11, you get a clear picture of certificate validity status from the AVDF console, and you can rotate these certificates with a single click when needed.

Source: oracle.com

Wednesday, March 6, 2024

Announcing the general availability of Oracle Globally Distributed Autonomous Database

Today, we have the pleasure of announcing the general availability of the Oracle Globally Distributed Autonomous Database. This fully managed Oracle Cloud Infrastructure (OCI) service is available in data centers around the world. Built-in, cutting-edge capabilities redefine how enterprises manage and process distributed data to achieve the highest possible levels of scalability and availability, and provide data sovereignty features. And with Autonomous Database’s automated management and ease of use, you don’t need an extensive staff of experts to take advantage of the power of a distributed database.

The Oracle Globally Distributed Autonomous Database simplifies the deployment and management of distributed databases. It provides transparent access for applications using these databases by automatically placing data in the appropriate location. The addition of distributed database capabilities on top of the proven SQL capabilities of Oracle Database enables customers to immediately benefit from decades of innovation in performance optimization, RAC parallel clusters, converged database architecture, and security.

Why Use Oracle’s Globally Distributed Autonomous Database


Oracle Globally Distributed Autonomous Database is designed to address global enterprises’ needs for high performance, high availability, and data sovereignty. It makes it easier for organizations to run critical distributed databases that use all types of data with high performance and availability. It is ideal for a wide range of applications, including payment processing, credit card fraud detection, personalized marketing, mobile messaging, internet infrastructure, and the Internet of Things (IoT).

Let's explore some of the key capabilities.

High Availability: Oracle's Globally Distributed Autonomous Database splits a single logical database into multiple physical databases (called shards) that are distributed across multiple data centers, availability domains, or regions. Faults in one shard do not affect others, enhancing overall availability. Automatic replication of shards across domains or regions provides protection from outages. The Oracle Globally Distributed Autonomous Database runs on fault-tolerant Exadata infrastructure for the highest possible availability.

Horizontal Scalability: Organizations can scale Globally Distributed Autonomous Database horizontally by adding servers and their associated database shards online, without interrupting database operations. Data and accesses are automatically redistributed to maintain a consistently balanced workload. The database infrastructure scales from multi-terabyte to multi-petabyte levels, addressing the requirements of the most demanding applications. In addition, each Globally Distributed Autonomous Database shard runs on an Exadata platform in OCI, providing high levels of vertically scaled performance that can automatically increase to meet local demand or scale down to help reduce costs.

Data Sovereignty: Organizations can specify where data is stored using a choice of customer-defined data placement policies. Updates are automatically inserted into database shards in the correct location based on these policies.

Choice of Data Distribution Methods: Globally Distributed Autonomous Database offers extensive control over how data is distributed across shards. Unlike other databases with limited methods, we support value-based, system-managed consistent hash, user-defined, duplicated, and partitioned distribution within shards, as well as allowing flexible combinations.
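
As a hedged illustration of two of these methods, the sketch below uses Oracle sharding DDL to distribute a table by system-managed consistent hash and to duplicate a small reference table on every shard; the table names and tablespace set are illustrative:

-- Distribute rows across shards by consistent hash of the sharding key.
CREATE SHARDED TABLE customers (
  cust_id NUMBER NOT NULL,
  name    VARCHAR2(100),
  region  VARCHAR2(30),
  CONSTRAINT pk_customers PRIMARY KEY (cust_id)
)
PARTITION BY CONSISTENT HASH (cust_id)
PARTITIONS AUTO
TABLESPACE SET ts_shard;

-- Keep a small, frequently joined reference table identical on every shard.
CREATE DUPLICATED TABLE product_catalog (
  product_id NUMBER PRIMARY KEY,
  name       VARCHAR2(100)
);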

Autonomous Management: This service brings the advanced, ML-driven capabilities of Autonomous Database to distributed databases with automatic database patching, security, tuning, and performance scaling within shards. The service combines vertical and horizontal scalability to achieve optimum levels on demand.

AI: Autonomous Database Select AI is also supported, letting users access their distributed databases using LLM-enabled natural language queries without having to know how data is structured or where it is located.

Simple Application Development: The Globally Distributed Autonomous Database offers a unified logical database view to applications. Its cloud-native capabilities and support for Oracle Database’s rich feature set provide the ideal platform for modern applications. Automated and transparent data distribution and access simplify the development of distributed applications.

Oracle Globally Distributed Autonomous Database can help you:


  • Achieve hyperscale performance for transaction processing and mixed workloads
  • Address data sovereignty requirements for distributed data warehousing and data lakes
  • Deploy concurrent data pipelines and machine learning analytics
  • Provide maximum availability for mission-critical applications
  • Build cloud-native, scalable applications

Pricing


Globally Distributed Autonomous Database is priced based on the number of shards being used and the amount of database consumption on each shard. Our pricing is simple and predictable. You can find our pricing for the service at https://www.oracle.com/autonomous-database/distributed-autonomous-database/pricing/

The Future of Distributed Data Management is Here!


Oracle’s Globally Distributed Autonomous Database is a powerful and easy-to-use service designed to help you meet your diverse data needs. Whether it's enabling data residency, achieving extreme availability, managing massive scale, or delivering high performance with low local-access latency for global users, this service has you covered. If you're seeking a fully managed distributed database service, we invite you to experience the capabilities of Globally Distributed Autonomous Database. Explore the next era of data management and elevate your data operations.

Source: oracle.com