
Optimizing OLTP Tables: Strategies for High-Volume Data Management

The Challenge

The client’s vendor did not provide archive processes, and the OLTP tables had grown to over 2 billion rows, degrading performance, especially when execution plan stability issues arose. The client asked Performance One Data Solutions to address the problem before the enterprise application became unmanageable.


The Performance One DBAs and Engineers addressed this by building a custom archiving and purge process, ensuring that no table blocking could occur while populating the archive database and that the OLTP database remained online and available throughout the entire process.


Handling such large OLTP tables without proper archive processes presents several challenges. Let’s break them down:

  1. Performance Impact:

    • The sheer volume of data (2 billion rows) can slow down query execution, affecting overall system performance.
    • Index maintenance, query optimization, and resource utilization are critical to maintaining acceptable response times.
  2. Execution Plan Stability:

    • With complex queries, execution plans can vary, leading to inconsistent performance.
    • Frequent recompilation of execution plans can cause overhead and instability.
  3. Database Availability:

    • Populating the archive database without table blocking requires careful planning.
    • Ensuring the OLTP database remains online during the process is essential for uninterrupted operations.
  4. Data Partitioning and Archiving Strategy:

    • Deciding how to partition data (e.g., by date, region, or other criteria) for efficient archiving (a minimal date-based sketch follows this list).
    • Implementing an effective data retention policy to manage historical data.
  5. Vendor Collaboration:

    • The lack of archive processes from the vendor necessitates collaboration to find alternative solutions.
    • Vendor engagement for performance tuning, indexing, and query optimization is crucial.
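
For illustration, here is a minimal sketch of date-based partitioning in T-SQL. The table, column, and object names are hypothetical, not the client’s schema, and the boundary dates would come from the retention policy:

    -- Partition function: one partition per year (boundary dates illustrative).
    CREATE PARTITION FUNCTION pf_ByYear (datetime2(0))
    AS RANGE RIGHT FOR VALUES ('2022-01-01', '2023-01-01', '2024-01-01');

    -- Partition scheme: maps every partition to the PRIMARY filegroup.
    CREATE PARTITION SCHEME ps_ByYear
    AS PARTITION pf_ByYear ALL TO ([PRIMARY]);

    -- Hypothetical table created on the scheme, partitioned by event date.
    CREATE TABLE dbo.OrderEvents
    (
        EventId   bigint       NOT NULL,
        EventDate datetime2(0) NOT NULL
    ) ON ps_ByYear (EventDate);

With a layout like this, an entire year of historical data can be archived as a unit rather than row by row.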


Addressing these challenges takes a combination of technical expertise, sound database design, and effective communication with stakeholders.

The Solution

Performance One Data Solutions worked closely with the client to define the tables to be archived. An archive database was created, and a custom process was built that used a combined DML/DDL command to quickly populate the archive database, then employed cursors to incrementally delete the data that had already been archived.
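
One way to realize such a combined DML/DDL step in T-SQL is SELECT ... INTO, which creates the archive table and copies the qualifying rows in a single statement. This is a minimal sketch; the database, table, and column names and the cutoff date are illustrative, not the client’s actual schema:

    -- Creates ArchiveDB.dbo.Orders_Archive (DDL) and populates it (DML)
    -- in one statement, minimizing the time spent reading the OLTP table.
    SELECT *
    INTO ArchiveDB.dbo.Orders_Archive
    FROM OltpDB.dbo.Orders
    WHERE OrderDate < '2020-01-01';  -- illustrative retention cutoff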


Let’s break down why this approach works well for a custom OLTP database archiving and purge process:

  1. Table Definition and Collaboration:

    • Collaboration with the Client: Working closely with the client to identify tables that can be archived is crucial. This ensures that only relevant data is moved to the archive database.
    • Defining Archivable Tables: The process begins by selecting tables based on business requirements, historical data, and retention policies.
  2. Creating the Archive Database:

    • Purpose of the Archive Database: An archive database was specifically designed to store historical data that is no longer actively used in the OLTP system.
    • Separate Storage: Creating a separate archive database prevented interference with the OLTP database’s performance.
  3. Combined DML/DDL Commands:

    • Data Manipulation Language (DML): DML commands (e.g., INSERT, UPDATE, DELETE) were used to populate the archive database with relevant data from the OLTP tables.
    • Data Definition Language (DDL): DDL commands (e.g., CREATE TABLE, ALTER TABLE) were used to create the necessary structures (tables, indexes) in the archive database.
  4. Incremental Data Deletion:

    • Using Cursors: Cursors were employed to process data incrementally, fetching and deleting rows in smaller chunks to minimize the impact on system resources (see the batched-delete sketch after this list).
    • Efficient Deletion: Deleting only data that has already been archived keeps the OLTP tables at a manageable size while preserving data integrity.
  5. Online and Available Process:

    • Database Availability: Throughout the entire process, the OLTP database remained online and accessible to users.
    • No Blocking: Ensuring that no table blocking occurs during data population or deletion is critical for uninterrupted operations.
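
The case study describes cursors driving the purge; a closely related and widely used pattern is a batched DELETE TOP loop, sketched below with illustrative names and batch size. Keeping each delete small means locks are held only briefly, so the OLTP database stays available:

    -- Remove already-archived rows in small batches.
    DECLARE @rows int = 1;
    WHILE @rows > 0
    BEGIN
        DELETE TOP (5000) o
        FROM OltpDB.dbo.Orders AS o
        WHERE o.OrderDate < '2020-01-01'
          AND EXISTS (SELECT 1
                      FROM ArchiveDB.dbo.Orders_Archive AS a
                      WHERE a.OrderID = o.OrderID);  -- delete only what is archived
        SET @rows = @@ROWCOUNT;          -- stop once no qualifying rows remain
        WAITFOR DELAY '00:00:01';        -- brief pause to yield to OLTP traffic
    END;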


This solution optimizes performance, maintains data integrity, and allows for efficient archiving without disrupting the OLTP system.


The Results

The approach resulted in a much smaller primary database that stays responsive even when poor execution plans arise and requires fewer server resources for daily processing tasks. The archive process will extend the useful life of both the application and database servers.


Let’s delve into the outcomes of the custom archive and purge process for the OLTP database:

  1. Smaller Primary Database:

    • By archiving historical data, the primary OLTP database becomes significantly smaller.
    • Reduced data volume leads to improved query performance, faster backups, and more efficient resource utilization.
  2. Reliability Enhancement:

    • Removal of obsolete data minimizes the risk of poor execution plans.
    • Stable execution plans contribute to consistent query performance, enhancing overall reliability.
  3. Resource Efficiency:

    • With less data to manage, server resources (CPU, memory, storage) are freed up.
    • Daily processing tasks execute faster due to reduced data volume.
  4. Extended Application and Server Life:

    • The archive process ensures that the OLTP database remains manageable and optimized.
    • Both application and database servers benefit from extended useful life, avoiding premature hardware upgrades.


In summary, this tailored approach optimizes performance, reliability, and resource utilization, positively impacting the entire system. 

Background

A large automotive plant operator has been using a custom application framework to build an application integral to their operations. Originally built using JDK8 and JBoss4 against an Oracle database, the application received periodic updates, with the last major update in 2020/2021. In 2024, the client and Performance One Data Solutions, a division of Ross Group Inc., embarked on a joint effort to upgrade the technology stack to enhance performance, security, and maintainability.

Objectives

The primary goals of the project were:

  1. Upgrade the Custom Application Framework from JDK8 to JDK21.
  2. Upgrade the jQuery integration to release 3.7.1.
  3. Upgrade the application to run on Wildfly 30 from Wildfly 11.
  4. Migrate the database backend from Oracle 12 to Microsoft SQL Server 2020.

Challenges and Solutions

Framework and Database Migration:

  • Challenge: The application framework had been vetted against Microsoft SQL Server but had not yet been deployed in a major application.
  • Solution: Performance One refactored the framework to conform to Java 9+ standards and regression-tested it against Microsoft SQL Server 2020.

Application Refactoring:

  • Challenge: Refactoring the client application to compile and run using the new tech stack.
  • Solution: Performance One took a copy of the client application and refactored it to compile and launch using JDK21, Wildfly 30, and Microsoft SQL Server.

Data Migration:

  • Challenge: Migrating data from Oracle to Microsoft SQL Server.
  • Solution: Performance One provided migrated data and created installation scripts to upgrade the client’s Microsoft SQL Server.

Report Conversion:

  • Challenge: Converting BIRT and Crystal Reports to run against Microsoft SQL Server.
  • Solution: Performance One converted the reports and ensured they worked with Microsoft SQL Server.

Project Timeline

Initial Milestones:

    1. Complete Framework upgrade to JDK21/Wildfly 30.
    2. Provide migrated data from Oracle to Microsoft SQL Server.
    3. Regression test the application framework against Microsoft SQL Server 2020.
    4. Refactor the client’s application to compile and launch using the new tech stack.
    5. Create installation data script for Microsoft SQL Server compliance.
    6. Regression test the application installation processes.
    7. Make recommendations for unused application features.
    8. Validate and upgrade application code.

Additional Tasks

  • Database Procedures Conversion: Performance One converted the client’s database procedures, functions, and views to Microsoft SQL Server (a small illustrative example follows this list).
  • Report Migrations: Ongoing conversion of BIRT and Crystal report development.
  • Application Enhancements: Upgrading client’s application screens for jQuery 3.7.1, file attachment uploads, and integration with scanner devices using web sockets.
  • Server Maintenance: Wildfly server maintenance, release builds, and deployment handling, with a transition plan for client resources post-go-live.
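
To give a flavor of these conversions, here is a small hypothetical example, not taken from the client’s codebase, of rewriting an Oracle scalar function in T-SQL:

    -- Oracle original (illustrative):
    --   CREATE OR REPLACE FUNCTION last_activity (p_date IN DATE) RETURN DATE IS
    --   BEGIN
    --     RETURN NVL(p_date, SYSDATE);
    --   END;
    --
    -- T-SQL equivalent: NVL becomes ISNULL, SYSDATE becomes GETDATE().
    CREATE FUNCTION dbo.last_activity (@p_date datetime)
    RETURNS datetime
    AS
    BEGIN
        RETURN ISNULL(@p_date, GETDATE());
    END;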

Conclusion

Performance One Data Solutions partnered with the client to upgrade the technology stack, ensuring improved performance and maintainability. The collaboration between Performance One and the client team demonstrated effective problem-solving and adaptability, setting a strong foundation for future upgrades and enhancements.
