AWS joins Microsoft, Google Cloud in offering Oracle Database

AWS joined Microsoft Azure and Google Cloud in offering the Oracle Database, an indication that enterprises want to use Oracle alongside the cloud providers’ AI services.

LAS VEGAS - Enterprises will soon have the option of running Oracle Databases on the three leading cloud providers.

On Monday, Oracle launched Oracle Database@AWS at the Oracle CloudWorld conference. The new offering lets customers run Oracle Autonomous Database on dedicated infrastructure and Oracle Exadata Database Service within AWS.

The AWS offering follows a similar deal Oracle made with Google Cloud in June and with Microsoft Azure last September.

The cloud providers' willingness to offer Oracle Database indicates that they haven't succeeded in replicating its on-premises performance and feature set without using Oracle hardware and software, IDC analyst Dave McCarthy said. The latest offerings also point to solid loyalty among Oracle Database customers.

"Now [that] they have a choice of all the major clouds, I suspect this will unlock a new wave of cloud migrations from on-premises Oracle environments," McCarthy said.

The core Oracle services on the three cloud providers are essentially the same: Autonomous Database, Exadata Database Service and Oracle Real Application Clusters (RAC). All three are delivered as part of Oracle Cloud Infrastructure (OCI).

Autonomous Database automates routine maintenance tasks, including patching, upgrades and tuning. Exadata Database Service provides built-in security and a range of features for high-performance analytics, AI and transaction processing. RAC lets a single Oracle Database run across multiple servers for availability and scalability.

Autonomous Database and Exadata run on Oracle Database 23ai, a database management system that integrates AI capabilities such as AI Vector Search, in-database machine learning and support for AI models. Those capabilities let organizations apply AI to data processing and analytics without moving data out of the database.

"Oracle's strategy, in general, is to try not to move data around a lot," Leo Leung, vice president of OCI product marketing, said. To reduce costs and security risks, Oracle prefers to "just keep it all in the same place and then expose it to different types of use cases and capabilities."

Cloud provider services on Oracle Database

The cloud providers will offer their AI services alongside the Oracle Database, the companies said. For example, enterprises can use Amazon Bedrock or Google's Vertex AI to build, deploy and manage AI applications.
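For instance, a minimal Python sketch of calling a foundation model through Amazon Bedrock's runtime API might look like the following. The region, model ID and prompt are illustrative assumptions and are not specific to Oracle Database@AWS.

import json
import boto3

# Assumed region and model ID; any Bedrock-supported model could be used.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [
            {"role": "user",
             "content": "Summarize the key points of our latest sales data."}
        ],
    }),
)

# The response body is a stream containing the model's JSON output.
print(json.loads(response["body"].read()))

In a combined deployment, an application would pair calls like this with data held in the co-located Oracle database rather than exporting that data to a separate AI platform.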

Oracle Database@Google Cloud is initially available in four Google Cloud regions: two in the United States and two in Europe. Oracle plans to roll out the service gradually in other regions. AWS plans to make Oracle Database@AWS available in preview this year, with broader availability scheduled for next year.

The initial offering on AWS lacks some features available on Azure and Google Cloud, said Rob Strechay, an analyst at theCUBE Research. For example, the AWS service does not include the Exadata Exascale data architecture for high-performance AI applications or the low-latency, high-throughput private interconnect between OCI and the cloud provider.

"It will have some limitations out of the gate," Strechay said. "[But] we expect that to change."

Despite the latest database deals, Oracle Cloud remains in catch-up mode with AWS, Microsoft Azure, and Google Cloud in deploying and running generative AI applications, analysts said. However, OCI has had success in attracting startups seeking Nvidia GPU clusters for running AI applications.

"A startup will be OK with saying, I use Oracle just for their GPUs," McCarthy said. "An enterprise is going to want more than that."

Oracle is trying to close the gap with OCI Generative AI, a fully managed service on Oracle Cloud that provides large language models for use cases including chat, text generation, summarization and text embeddings. Customers can use pretrained models or host fine-tuned custom models on dedicated AI clusters, and they can integrate the LLMs into a wide range of applications.
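As a hedged sketch of how that service is typically reached from code, the snippet below uses the OCI Python SDK's generative AI inference client to request text embeddings. The service endpoint, compartment OCID and model ID are placeholders and will differ in a real tenancy.

import oci

# Loads credentials from the default OCI config file (~/.oci/config).
config = oci.config.from_file()

# Assumed regional endpoint for the generative AI inference service.
client = oci.generative_ai_inference.GenerativeAiInferenceClient(
    config,
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
)

# Placeholder compartment OCID and an on-demand embedding model.
details = oci.generative_ai_inference.models.EmbedTextDetails(
    inputs=["Oracle Database@AWS pairs OCI database services with AWS."],
    serving_mode=oci.generative_ai_inference.models.OnDemandServingMode(
        model_id="cohere.embed-english-v3.0"
    ),
    compartment_id="ocid1.compartment.oc1..exampleuniqueid",
)

response = client.embed_text(details)
print(len(response.data.embeddings[0]))  # dimensionality of the first embedding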

Antone Gonsalves is an editor at large for TechTarget Editorial, reporting on industry trends critical to enterprise tech buyers. He has worked in tech journalism for 25 years and is based in San Francisco. Have a news tip? Please drop him an email.
