How Fenergo handles Data Migration - SaaS Perspective

Introduction

When we created our SaaS solution, we needed to design an approach to data migration that both complements the technology and meets the requirements of our consumers. The approach challenges the norms of traditional data migrations because a SaaS platform serves multiple tenants concurrently. The software is continuously in flight with 99.9% uptime availability, and no one tenant can impede the performance and availability of another. To understand how we achieved this, we need to look at which norms do not translate to SaaS, why they do not translate, and what the right approach is for loading data onto a highly available, multi-tenanted software product.

 

Data Migration Constants

Before we examine the differences, it is worth acknowledging that the end goal of getting data onto a SaaS platform is the same as it is for a more traditional system.

  • Load Data: Get data from an existing source and load it into a new platform.

  • Align to a Target Structure / Format: There will be some configuration of the target data structure, and the source data being loaded may need to be transformed to align to that structure (see the sketch after this list).

  • Channel: A mechanism or facility to take data and push it to a destination system.

  • Performance / Predictability: How long will it take to load data? Can a consumer build a plan and strategy around known performance characteristics?
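As a small illustration of the "Align to a Target Structure / Format" point, the sketch below renames the columns of a source extract to match a target CSV template. The file names and the column mapping are hypothetical and purely illustrative, not Fenergo's actual data model.

```python
import csv

# Hypothetical mapping from legacy extract columns to target template columns.
COLUMN_MAPPING = {
    "cust_name": "LegalEntityName",
    "cty_code": "CountryOfIncorporation",
    "onboard_dt": "OnboardingDate",
}

with open("legacy_extract.csv", newline="") as src, \
     open("entity_template_populated.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=list(COLUMN_MAPPING.values()))
    writer.writeheader()
    for row in reader:
        # Rename each source column to its target equivalent; real migrations
        # would also apply format conversions and lookup translations here.
        writer.writerow({target: row[source] for source, target in COLUMN_MAPPING.items()})
```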

Traditional Standard Conventions

When consumers provision new software deployed on-premises, or even to a cloud data centre, the responsibility for the components within that software stack remains with the consumer, as does the ability to directly access those components. This implies a certain approach when thinking about a data migration. Namely:

  • Database Access: Consumers can set up the database server, tune its performance and typically have direct access to raw tables. “Where is my database? Get a connection string and push data right into the tables.”

  • Timing and Planning: Traditional migrations are done within change control windows, where the migration itself is sequenced to execute up front. Only once it is complete does testing of the system begin. If there are issues loading data or if testing uncovers a problem, the control gates (go/no-go decisions) can be exercised and the migration rolled back.

  • Safety Nets & Rollback Strategy: When direct access to a database is available, best practice is to take a snapshot before making any changes. In the event of an issue, a rollback can be performed and another attempt made at a later time.

 

SaaS Data Migration Solution Overview

For a SaaS client, those traditional conventions no longer hold true, and for good reason. Let’s first consider a quick overview of the API-first solution to migrating data which Fenergo has created.

  • Users can configure their data model themselves. This structure is used by the data migration functionality to create staging placeholders for data being migrated.

  • An API call is made to create a Migration Session, specifying which of those already configured data structures to use.

  • Further API calls can be made to retrieve template CSV files (with their respective column names that come from the data model) and a summary schema containing data validation details and lists of lookup data.

  • Users can then use APIs to upload populated CSV files into a holding location, then call again to start the migration activity.

  • Fenergo will read the uploaded data, validate it against the configured data models and then create the data on the SaaS platform using the contents of the CSV files.

  • Users can monitor progress by calling a status API and then reconcile the migration with the new identifiers. The process is illustrated below.
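As a hypothetical walk-through of this flow, the sketch below creates a migration session, downloads the template, uploads a populated CSV and polls for completion. The endpoint paths, payloads and header names are placeholder assumptions, not Fenergo's published API contract; consult the product API documentation for the real routes and authentication scheme.

```python
import time
import requests

# Placeholder base URL and bearer token; real values come from the tenant's API documentation.
BASE_URL = "https://api.example-tenant.com/data-migration"
HEADERS = {"Authorization": "Bearer <token>"}

# 1. Create a migration session against an already configured data structure.
session = requests.post(
    f"{BASE_URL}/sessions",
    json={"dataStructure": "LegalEntity"},
    headers=HEADERS,
).json()
session_id = session["id"]

# 2. Retrieve the CSV template and the validation schema for that structure.
template = requests.get(f"{BASE_URL}/sessions/{session_id}/template", headers=HEADERS)
with open("entity_template.csv", "wb") as template_file:
    template_file.write(template.content)
schema = requests.get(f"{BASE_URL}/sessions/{session_id}/schema", headers=HEADERS).json()

# 3. Upload the populated CSV file to the holding location, then start the migration.
with open("entity_template_populated.csv", "rb") as csv_file:
    requests.post(
        f"{BASE_URL}/sessions/{session_id}/files",
        files={"file": csv_file},
        headers=HEADERS,
    )
requests.post(f"{BASE_URL}/sessions/{session_id}/start", headers=HEADERS)

# 4. Poll the status API until the run finishes, then reconcile the new identifiers.
while True:
    status = requests.get(f"{BASE_URL}/sessions/{session_id}/status", headers=HEADERS).json()
    if status["state"] in ("Completed", "Failed"):
        break
    time.sleep(30)
```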

Tackling Traditional Thinking

The solution outlined above is straightforward. It presents the interfaces needed to create and control the migration of data onto the platform. It does, however, require some rethinking of the standard conventions mentioned earlier.

  • Database Access: SaaS clients do not have direct access to, or exclusive control of, the underlying data stores. The interface is an API which accepts a structured data file. There is no way to make the system unavailable during migration; a SaaS platform is always available. Even if it is not being used by a client's own users at a pre-planned time, other clients are utilising their tenants and the shared infrastructure.

  • Timing and Planning: Clients can continue to execute their migrations inside change control windows, but this is no longer necessary. The linear dependency of having a migration complete before testing begins no longer exists. Users can work with the system whilst it migrates data in the background; records become available as they are loaded.

  • Safety Nets & Rollback Strategy: No direct access means clients cannot snapshot or roll back the data stores. They must work with their data on the system. Planning and validation are important, as is testing. But once data is loaded, it is loaded. If a record needs to be adjusted, the migration can be re-executed or a reconciliation file loaded (a roll-forward strategy, sketched below).
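As a minimal sketch of that roll-forward idea, the snippet below compares the source extract against a reconciliation export of records created on the platform and writes any misses to a corrections file that can be re-submitted through the same migration APIs. The file names and column names ("SourceKey", "cust_id") are hypothetical assumptions for illustration only.

```python
import csv

# Collect the source keys that the platform reports as successfully migrated.
with open("reconciliation_export.csv", newline="") as rec:
    migrated_keys = {row["SourceKey"] for row in csv.DictReader(rec)}

# Write every source record that did not make it across into a corrections file.
with open("legacy_extract.csv", newline="") as src, \
     open("corrections.csv", "w", newline="") as out:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if row["cust_id"] not in migrated_keys:
            writer.writerow(row)
```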

What better approaches can a SaaS data migration offer?

The capabilities of SaaS enable consumers to think differently about how they plan for, test and execute their migrations. The focus should not be on what you cannot do when you do not directly control the underlying components; it should be on what can now be done differently that could not be done before.

  • No Longer Specialist Work - With an API interface, executing a migration does not require DBAs. Easy-to-follow API patterns mean that once data is structured in the correct format, it can be uploaded as a file attachment to an API call.
  • No More All-or-Nothing Approach - Clients should take a phased approach to migrating data. The source data may be large, but it is not all used all of the time. Slicing the data into tranches and migrating what is needed first allows for a lot of flexibility and control from a client perspective (see the sketch after this list).
  • No Restriction of Access Required - As a migration executes on Fenergo, users can continue to use the system. Other tenants are operating on the shared infrastructure, and the platform is designed to scale on demand to meet resource usage peaks.
  • No More Linear Milestones for Release Activities - Migrating data takes time. Depending on how that data is processed by a system, waiting for a migration to complete is often the first milestone in a release window, and acceptance testing often has a linear dependency on that milestone. Because of the ease with which migrations can be divided up on Fenergo, a well-structured plan could migrate the data that supports testing first. The test data could be ready in minutes, and test activity could commence alongside the balance of the data migration. Migrations are no longer restricted to release windows; they can occur at any time, including during hours of operation.
  • Speed Is No Longer the Primary Factor - SaaS solutions need to be considerate of collective performance for all tenants and users. If data is divided into multiple data sets, it can be migrated at a controlled pace with continuous reconciliation, whilst system availability is maintained for all users and all tenants. Change control windows can become a constraint of the past.
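To make the phased, tranche-based approach concrete, the sketch below slices a source extract into priority-ordered tranche files, each of which could then become its own migration run through the APIs shown earlier. The priority rule, tranche size and column names are assumptions for illustration.

```python
import csv

TRANCHE_SIZE = 5000  # Illustrative tranche size; tune to the agreed migration plan.

def is_priority(row):
    # Assumption for illustration: active records are migrated first.
    return row.get("status") == "Active"

with open("legacy_extract.csv", newline="") as src:
    rows = list(csv.DictReader(src))

# Sort so priority records come first, then cut the data into fixed-size tranches.
ordered = sorted(rows, key=lambda row: not is_priority(row))
fieldnames = list(rows[0].keys())

for index in range(0, len(ordered), TRANCHE_SIZE):
    tranche = ordered[index:index + TRANCHE_SIZE]
    with open(f"tranche_{index // TRANCHE_SIZE + 1:03d}.csv", "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(tranche)
```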

 

 

Summary 

Fenergo are continually seeking solutions which create simpler, cleaner experiences for consumers. Data migration is a requirement we have been solving for years, and time has taught us where bottlenecks and redundant effort reside. We believe that removing linear dependencies and offering the ability to operate in more flexible ways supports such experiences. As with anything, we will continue to iterate and enhance, continue to ask why, and focus on high standards.

 


George McGrane is a Technical Advocate for the Fenergo SaaS Engineering Team and has been with Fenergo for 5 years.