AWS Database Blog

Use AWS DMS to migrate data from IBM Db2 DPF to an AWS target

AWS has introduced a new feature in AWS Database Migration Service (AWS DMS) that simplifies the migration of data from IBM Db2 databases that use the Database Partitioning Feature (DPF) to Amazon Simple Storage Service (Amazon S3), a highly scalable and durable object storage service. With this new capability, you can migrate your data from IBM Db2 DPF databases to Amazon S3, paving the way for building robust data lakes in the cloud. The feature streamlines the migration process, helps maintain data integrity, and minimizes the risk of data loss or corruption, even when dealing with large volumes of data distributed across multiple partitions and databases of varying sizes. In this post, we delve into the details of this new AWS DMS feature and demonstrate how to implement it. We also explore best practices for orchestrating data flows and optimizing the migration, achieving a smooth transition from on-premises IBM Db2 DPF databases to a cloud-based data lake on Amazon S3.
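As a rough illustration only (not the steps from the post), the sketch below uses boto3 to create a Db2 source endpoint, an S3 target endpoint, and a full-load replication task. Every identifier, ARN, hostname, and credential is a placeholder, and the "db2" engine name and Parquet output settings are assumptions about a typical setup.

```python
import json
import boto3

dms = boto3.client("dms")

# Source endpoint for the Db2 database (all connection details are placeholders).
source = dms.create_endpoint(
    EndpointIdentifier="db2-dpf-source",
    EndpointType="source",
    EngineName="db2",
    ServerName="db2-host.example.com",
    Port=50000,
    DatabaseName="SAMPLEDB",
    Username="db2inst1",
    Password="REPLACE_ME",
)

# Target endpoint writing Parquet files under an S3 data lake prefix.
target = dms.create_endpoint(
    EndpointIdentifier="s3-datalake-target",
    EndpointType="target",
    EngineName="s3",
    S3Settings={
        "BucketName": "my-datalake-bucket",
        "BucketFolder": "db2-dpf",
        "DataFormat": "parquet",
        "ServiceAccessRoleArn": "arn:aws:iam::123456789012:role/dms-s3-access",
    },
)

# Selection rule: copy every table in the DB2INST1 schema.
table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-db2inst1",
        "object-locator": {"schema-name": "DB2INST1", "table-name": "%"},
        "rule-action": "include",
    }]
}

# Full-load task tying the two endpoints to an existing replication instance.
dms.create_replication_task(
    ReplicationTaskIdentifier="db2-dpf-to-s3-full-load",
    SourceEndpointArn=source["Endpoint"]["EndpointArn"],
    TargetEndpointArn=target["Endpoint"]["EndpointArn"],
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:EXAMPLE",
    MigrationType="full-load",
    TableMappings=json.dumps(table_mappings),
)
```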

Create a fallback migration plan for your self-managed MySQL database to Amazon Aurora MySQL using native bi-directional binary log replication

In this post, we show you how to set up bi-directional replication between an on-premises MySQL instance and an Aurora MySQL instance. We cover how to configure bi-directional replication and address important operational concepts such as monitoring, troubleshooting, and high availability. In certain use cases, native bi-directional binary log replication can either serve as a simpler fallback plan for your migration or let you migrate applications or schemas individually, rather than all at the same time.
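As a hedged sketch of one way to watch replication health from Python (not code from the post), the snippet below uses pymysql to read SHOW REPLICA STATUS on both sides of the pair. It assumes MySQL 8.0.22 or later and Aurora MySQL 3, where that statement and the Seconds_Behind_Source column are available; all hostnames and credentials are placeholders.

```python
import pymysql

# Placeholder endpoints and credentials; replace with your own.
ENDPOINTS = {
    "on-premises": {"host": "onprem-mysql.example.com",
                    "user": "repl_monitor", "password": "REPLACE_ME"},
    "aurora": {"host": "my-cluster.cluster-xyz.us-east-1.rds.amazonaws.com",
               "user": "repl_monitor", "password": "REPLACE_ME"},
}

def replica_health(conn_args):
    """Return (io_running, sql_running, lag_seconds) for one side of the pair."""
    conn = pymysql.connect(cursorclass=pymysql.cursors.DictCursor, **conn_args)
    try:
        with conn.cursor() as cur:
            cur.execute("SHOW REPLICA STATUS")
            status = cur.fetchone() or {}
        return (
            status.get("Replica_IO_Running"),
            status.get("Replica_SQL_Running"),
            status.get("Seconds_Behind_Source"),
        )
    finally:
        conn.close()

for name, args in ENDPOINTS.items():
    io_running, sql_running, lag = replica_health(args)
    print(f"{name}: IO={io_running} SQL={sql_running} lag={lag}s")
```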

Executive Conversations: Putting generative AI to work in omnichannel customer service with Prashant Singh, Chief Operating Officer at LeadSquared

Prashant Singh, Chief Operating Officer at LeadSquared, joins Pravin Mittal, Director of Engineering for Amazon Aurora, for a discussion on using generative artificial intelligence (AI) to scale their omnichannel customer service application while controlling costs. LeadSquared helps customers build truly connected, empowered, and self-reliant sales and service organizations with the power of automation. This Executive […]

How LeadSquared accelerated chatbot deployments with generative AI using Amazon Bedrock and Amazon Aurora PostgreSQL

LeadSquared is a new-age software as a service (SaaS) customer relationship management (CRM) platform that provides end-to-end sales, marketing, and onboarding solutions. Tailored for sectors like BFSI (banking, financial services, and insurance), healthcare, education, real estate, and more, LeadSquared provides a personalized approach for businesses of every scale. LeadSquared Service CRM goes beyond basic ticketing, […]

Choose the right change data capture strategy for your Amazon DynamoDB applications

Change data capture (CDC) is the process of capturing changes to data from a database and publishing them to an event stream, making the changes available for other systems to consume. Amazon DynamoDB CDC offers a powerful mechanism for capturing, processing, and reacting to data changes in near real time. Whether you’re building event-driven applications, […]
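To make the "react to data changes" part concrete, here is a minimal, hypothetical sketch of the DynamoDB Streams option: an AWS Lambda handler that receives a batch of stream records and branches on the change type. The event structure follows the DynamoDB Streams record format; the batchItemFailures response shape applies only if the event source mapping is configured to report batch item failures.

```python
# Minimal sketch of a Lambda function consuming DynamoDB Streams records.
def handler(event, context):
    failures = []
    for record in event["Records"]:
        try:
            event_name = record["eventName"]        # INSERT | MODIFY | REMOVE
            keys = record["dynamodb"]["Keys"]
            if event_name in ("INSERT", "MODIFY"):
                new_image = record["dynamodb"].get("NewImage", {})
                # React to the change here, e.g. publish to a downstream consumer.
            else:  # REMOVE
                old_image = record["dynamodb"].get("OldImage", {})
                # Handle deletes, e.g. invalidate a cache entry.
        except Exception:
            # Report only the failed record so the rest of the batch is not retried.
            failures.append({"itemIdentifier": record["dynamodb"]["SequenceNumber"]})
    return {"batchItemFailures": failures}
```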

Migrate logins, database roles, users, and object-level permissions from Azure SQL Database to Amazon RDS for SQL Server

In this post, we demonstrate how to migrate SQL logins, database roles, users, and object-level permissions from Azure SQL Database to Amazon Relational Database Service (Amazon RDS) for SQL Server using T-SQL. Within SQL Server, a SQL login acts as a security principal, allowing a user or application to connect to a SQL Server instance. […]
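The post itself uses T-SQL for the migration; purely as an illustrative companion (an assumption, not code from the post), the Python sketch below uses pyodbc to run a T-SQL query against the source database that generates GRANT statements for object-level permissions, which could then be replayed on the RDS for SQL Server target. The query relies on the sys.database_permissions and sys.database_principals catalog views; the server, database, and credentials are placeholders.

```python
import pyodbc

# Placeholder connection string; replace server, database, and credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=admin_user;PWD=REPLACE_ME"
)

# T-SQL that emits GRANT statements for granted object-level permissions.
sql = """
SELECT 'GRANT ' + pe.permission_name
     + ' ON ' + QUOTENAME(OBJECT_SCHEMA_NAME(pe.major_id)) + '.'
     + QUOTENAME(OBJECT_NAME(pe.major_id))
     + ' TO ' + QUOTENAME(pr.name) + ';'
FROM sys.database_permissions pe
JOIN sys.database_principals pr
  ON pe.grantee_principal_id = pr.principal_id
WHERE pe.class = 1 AND pe.state = 'G';
"""

cur = conn.cursor()
for (grant_stmt,) in cur.execute(sql):
    print(grant_stmt)  # Review, then run these on the target database.
```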

Provision and manage Amazon RDS for Oracle using Terraform

This is the first post in a multi-part series where we discuss how you can set up Amazon Relational Database Service (Amazon RDS) for Oracle with Terraform. Terraform by HashiCorp allows you to define your infrastructure as code, simplifying and automating the setup process instead of doing everything manually. Overview of […]

How Agnostic Engineering improved storage latency for running Polygon nodes on AWS

This is a guest post co-written by Arnaud Briche, the Founder of Agnostic. At Agnostic, our mission is to democratize access to well-structured blockchain data. We aim to provide a swift, user-friendly, and robust method for querying the vast volumes of data generated by smart contract blockchains. As a company, for performance reasons we first […]

A generative AI use case using Amazon RDS for SQL Server as a vector data store

Generative artificial intelligence (AI) has reached a turning point, capturing everyone’s imagination. Integrating generative capabilities into customer-facing services and solutions has become critical. Current generative AI offerings are the culmination of a gradual evolution from machine learning and deep learning models. The leap from deep learning to generative AI is enabled by foundation models. Amazon […]

Enable fine-grained access control and observability for API operations in Amazon DynamoDB

Customers choose Amazon DynamoDB to improve their applications’ performance, scalability, and resiliency. DynamoDB’s serverless architecture simplifies operations by abstracting hardware, scaling, patching, and maintenance. Managing data access and security in DynamoDB is different from instance-based database solutions. DynamoDB uses AWS Identity and Access Management (IAM) to authenticate and authorize access to resources, whereas RDBMS solutions rely on firewall rules, […]
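As a hedged illustration of the IAM-based model the excerpt describes (not a policy from the post), the sketch below attaches an inline policy that limits a hypothetical application role to items whose partition key matches a tag on the calling principal, using the dynamodb:LeadingKeys condition key. The role name, table ARN, and tag key are placeholders.

```python
import json
import boto3

# Placeholder identifiers; replace with your role, account, Region, and table.
ROLE_NAME = "orders-app-role"
TABLE_ARN = "arn:aws:dynamodb:us-east-1:123456789012:table/Orders"

# Fine-grained access: allow reads and writes only on items whose partition key
# equals the customer_id tag on the calling principal.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:PutItem"],
            "Resource": TABLE_ARN,
            "Condition": {
                "ForAllValues:StringEquals": {
                    "dynamodb:LeadingKeys": ["${aws:PrincipalTag/customer_id}"]
                }
            },
        }
    ],
}

iam = boto3.client("iam")
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="orders-fine-grained-access",
    PolicyDocument=json.dumps(policy),
)
```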