
Optimizing OLTP Tables: Strategies for High-Volume Data Management

The Challenge

The client’s vendor did not provide archive processes, and the OLTP tables had grown to over 2 billion rows, degrading performance, especially when execution plan stability issues arose. The client asked Performance One Data Solutions to address this before the enterprise application became unmanageable.

 

The Performance One DBAs and engineers addressed this by building a custom archive-and-purge process, ensuring that no table blocking occurred while populating the archive database and that the OLTP database remained online and available throughout the entire process.
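One widely used way to guarantee that readers are never blocked by this kind of copy activity is SQL Server’s row-versioning isolation. A minimal sketch, not necessarily the exact configuration used here (OltpDb is a hypothetical database name):

    -- Readers get a consistent snapshot instead of waiting on write locks.
    -- ROLLBACK IMMEDIATE terminates in-flight transactions, so this is
    -- typically run in a quiet maintenance window.
    ALTER DATABASE OltpDb
    SET READ_COMMITTED_SNAPSHOT ON
    WITH ROLLBACK IMMEDIATE;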

 

Handling such large OLTP tables without proper archive processes presents several challenges. Let’s break them down:

  1. Performance Impact:

    • The sheer volume of data (over 2 billion rows) slows query execution, affecting overall system performance.
    • Index maintenance, query optimization, and resource utilization are critical to maintaining acceptable response times (see the sizing sketch after this list).
  2. Execution Plan Stability:

    • With complex queries, execution plans can vary, leading to inconsistent performance.
    • Frequent recompilation of execution plans adds overhead and instability (see the plan-forcing sketch below).
  3. Database Availability:

    • Populating the archive database without table blocking requires careful planning.
    • Keeping the OLTP database online during the process is essential for uninterrupted operations.
  4. Data Partitioning and Archiving Strategy:

    • Deciding how to partition data (e.g., by date, region, or other criteria) for efficient archiving.
    • Implementing an effective data retention policy to manage historical data.
  5. Vendor Collaboration:

    • The lack of archive processes from the vendor necessitates collaboration to find alternative solutions.
    • Vendor engagement on performance tuning, indexing, and query optimization is crucial.
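Before deciding what to archive, it helps to quantify the problem. A minimal T-SQL sketch for finding the largest tables in the OLTP database from partition metadata, which is far cheaper than running COUNT(*) against billion-row tables:

    -- Approximate row counts per table (heap or clustered index only)
    SELECT t.name AS table_name,
           SUM(p.rows) AS approx_rows
    FROM sys.tables AS t
    JOIN sys.partitions AS p
        ON p.object_id = t.object_id
    WHERE p.index_id IN (0, 1)
    GROUP BY t.name
    ORDER BY approx_rows DESC;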

 

Addressing these challenges requires a combination of technical expertise, sound database design, and effective communication with stakeholders.
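On the plan-stability point, one general mitigation in SQL Server (not specific to this engagement, so treat it as illustrative) is to let Query Store capture plans and then force a known-good one for an unstable query. The IDs below are placeholders that would come from Query Store’s own views:

    -- Inspect captured queries and their plans
    SELECT q.query_id, p.plan_id, qt.query_sql_text
    FROM sys.query_store_query AS q
    JOIN sys.query_store_plan AS p
        ON p.query_id = q.query_id
    JOIN sys.query_store_query_text AS qt
        ON qt.query_text_id = q.query_text_id;

    -- Pin a known-good plan (42 and 7 are placeholder IDs)
    EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 7;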

The Solution

Performance One Data Solutions worked closely with the client to identify the tables to be archived. An archive database was created, along with a custom process that used a combined DML/DDL command to quickly populate the archive database and then used cursors to incrementally delete the data that had already been archived.

 

Let’s break down why this solution worked well as a custom OLTP database archive-and-purge process:

  1. Table Definition and Collaboration:

    • Collaboration with the Client: Working closely with the client to identify which tables could be archived was crucial. This ensured that only relevant data was moved to the archive database.
    • Defining Archivable Tables: The process began by selecting tables based on business requirements, historical data, and retention policies.
  2. Creating the Archive Database:

    • Purpose of the Archive Database: An archive database was designed specifically to store historical data no longer actively used in the OLTP system.
    • Separate Storage: Creating a separate archive database prevented interference with the OLTP database’s performance.
  3. Combined DML/DDL Commands:

    • Data Manipulation Language (DML): DML commands (e.g., INSERT, UPDATE, DELETE) were used to populate the archive database with relevant data from the OLTP tables.
    • Data Definition Language (DDL): DDL commands (e.g., CREATE TABLE, ALTER TABLE) were used to create the necessary structures (tables, indexes) in the archive database (see the SELECT INTO sketch after this list).
  4. Incremental Data Deletion:

    • Using Cursors: Cursors were employed to process data incrementally, fetching and deleting data in small chunks to minimize the impact on system resources (see the batched-delete sketch below).
    • Efficient Deletion: By deleting only data that had already been archived, the OLTP tables stayed a manageable size while data integrity was preserved.
  5. Online and Available Process:

    • Database Availability: Throughout the entire process, the OLTP database remained online and accessible to users.
    • No Blocking: Ensuring that no table blocking occurred during data population or deletion was critical for uninterrupted operations.
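In SQL Server, SELECT ... INTO is a natural fit for the combined DML/DDL step: one statement creates the destination table (DDL) and populates it (DML) in a single pass. A minimal sketch, assuming a hypothetical OrderHistory table archived on a two-year retention window (all names here are illustrative):

    -- One statement both creates ArchiveDb.dbo.OrderHistory
    -- and fills it with rows outside the retention window
    SELECT *
    INTO ArchiveDb.dbo.OrderHistory
    FROM OltpDb.dbo.OrderHistory
    WHERE OrderDate < DATEADD(YEAR, -2, SYSDATETIME());

Because SELECT INTO copies only the data, any indexes the archive tables need would be added afterward with separate CREATE INDEX statements.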

 

This solution optimizes performance, maintains data integrity, and allows for efficient archiving without disrupting the OLTP system.
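To make the incremental purge in step 4 concrete: the team used cursors, but the same chunk-at-a-time idea can be sketched as a batched DELETE loop. Keeping each transaction to a few thousand rows means locks are taken briefly and released, so the OLTP workload keeps moving. The names and batch size are hypothetical:

    DECLARE @deleted INT = 1;
    WHILE @deleted > 0
    BEGIN
        -- Delete only rows already present in the archive,
        -- a small batch at a time
        DELETE TOP (5000) s
        FROM OltpDb.dbo.OrderHistory AS s
        WHERE EXISTS (SELECT 1
                      FROM ArchiveDb.dbo.OrderHistory AS a
                      WHERE a.OrderID = s.OrderID);
        SET @deleted = @@ROWCOUNT;

        -- Short pause between batches gives concurrent
        -- OLTP activity room to proceed
        WAITFOR DELAY '00:00:01';
    END;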

 

The Results

The approach resulted in a much smaller primary database that degrades far less when a poor execution plan does arise and that requires fewer server resources for daily processing tasks. The archive process will extend the useful life of both the application and database servers.

 

Let’s delve into the outcomes of the custom archive and purge process for the OLTP database:

  1. Smaller Primary Database:

    • By archiving historical data, the primary OLTP database becomes significantly smaller.
    • Reduced data volume leads to improved query performance, faster backups, and more efficient resource utilization.
  2. Reliability Enhancement:

    • Removal of obsolete data minimizes the risk of poor execution plans.
    • Stable execution plans contribute to consistent query performance, enhancing overall reliability.
  3. Resource Efficiency:

    • With less data to manage, server resources (CPU, memory, storage) are freed up.
    • Daily processing tasks execute faster due to reduced data volume.
  4. Extended Application and Server Life:

    • The archive process ensures that the OLTP database remains manageable and optimized.
    • Both application and database servers benefit from extended useful life, avoiding premature hardware upgrades.

 

In summary, this tailored approach optimizes performance, reliability, and resource utilization, positively impacting the entire system. 


Let's Talk

Use our expertise to propel your business to the next level.