
Why and how to migrate an MS SQL Server DB schema?

Now, what to expect from this webinar, and what will be the takeaway in the next 30 minutes? First, let me set expectations on what it is not about. It is not about migrating the data in the database; that is another complex subject, and maybe we will handle it in another webinar. It is also not a sales speech. It is purely our experience of using different tools and doing POCs for schema migration from Microsoft SQL Server to PostgreSQL.

So it is just our experience that we are trying to share, and it is a very pointed experience. This is not some big topic we are trying to cover; it is a very pointed problem. I will talk about the use cases and what we were trying to solve, and then there will be the conversion journey that covers how we really converted the Microsoft SQL Server schema, stored procedures, and functions to PostgreSQL. Over the next 20-25 minutes, GG and I will talk, and at the end of it I expect all of us to be aware of why we should migrate in the first place, when we should even think about migrating, and what all can be tried. Because we have already tried a few things, you can learn from our mistakes and our challenges, and on top of it add your own value.

So at the end of 30 minutes, we should be able to plan out any migration activity of this level, where you are moving from Microsoft SQL Server to PostgreSQL. The flow of the event would be: I will talk about the problem, the approach, and the solution; what was the problem, why we even did this, what problem the customer had and how they articulated it, and then the benefits the customer is getting out of it. Then GG will talk about the conversion steps: technically, how we really did the conversion, and what kind of time estimates to keep in mind when embarking on such projects. After that you will have close to 10 minutes to ask any questions or anything that comes to your mind.

To move to the story: our customer is a leading CRM for investment banks, based out of Singapore, with multiple investment banks as their customers. They have been running entirely on the Microsoft stack (a .NET stack), using Microsoft SQL Server for the application. We started talking to them in the context of the migration and DevOps work we were already doing with them, and three top problems came up. Number one was cost: they were paying close to $40-50K per year (ARR) on the MS SQL Server license, plus the other costs around it. Second was upgrades and maintenance: though Microsoft SQL Server provides tools to help with upgrades, it still takes time and is not that easy, so that was another challenge. And then there was downtime, which they could not afford at this point. As we talked through the challenges they were facing, it became clear that this leading CRM application is a database-centric application the way it is designed today.

A lot of the business logic and core functionality lives in the database, in stored procedures and functions. They have close to 360-odd stored procedures, some of them really complex ones that are literally part of other business logic. When you have so much dependency on your stored procedures and functions to decide your business logic and application functionality, the challenge becomes:

like any other code, are we even testing it? Are we measuring code coverage of it? Are we unit testing it, just like any other code? This was a big problem, because every change in the database, every change in the schema, every change in a stored procedure cost them a fair amount of quality testing, a fair amount of regression, and whatnot.

So that was the problem they brought to us. We looked at how we could save money, talked to the customer about what we had already done, and arrived at the solution we agreed upon. I am using the words "agreed upon" because there is some work on the customer's side as well to make sure that this migration happens end to end; it is not a button-click approach. The solution we agreed upon was: (a) migrate to an open source DB, so that they can save the SQL Server license cost (I will talk about the cost saving in the following slides); (b) convert the database schema from Microsoft SQL Server to PostgreSQL, which included the tables, triggers, stored procedures and functions, all of it, and we will talk about how we did that, which is the whole purpose of this webinar; and (c) unit test the DB code, because there was heavy business logic dependency on it.

So these three became the work for us: help the customer achieve cost savings, help with upgrades, and help with better-managed, unit-tested stored procedures.

Check out the video version here: https://www.youtube.com/watch?v=IDTDUMvbzmI

So with this background, let me move to the benefits they achieved, and then we will jump on to the more technical stuff, which GG will cover in detail.

(a) They saved close to $35-40K ARR on the license cost and DB operation costs.

(b) High availability and performance: because it is hosted in AWS, they get built-in availability and performance.

(c) We could help with the functional testing and code coverage calculation for the stored procedures and functions that we migrated or converted, the 360-odd stored procedures I talked about.

We have learned over time, from the experience we carry from the development world, that testing should be localized to the area that is changing. In this case the database schema is getting migrated and the stored procedures and functions are getting converted, so we have to make sure we localize the testing to that. I am not saying they do not have to do integration testing; they still have to do a fair bit of functional testing on their application post-migration, but it should not be a boiling-the-ocean effort, because we are already boiling the ocean at a very granular, localized level: when we move the schema from Microsoft SQL Server to PostgreSQL, we make sure that our code coverage covers 100% of the code we are migrating, which is the stored procedures and functions. We could achieve close to 95-98% code coverage there.

(d) Then, the ability to reduce back-end load

by processing results at the DB level. This is super important from the customer's perspective; you may or may not encounter it in your own applications, but in this case it was critical, because the existing

stored procedures and functions in Microsoft SQL Server had to produce a specific output, and then there was a layer transforming that output for each application module that needed it. So they depended on an interpreter in between to interpret the results of these stored procedures and functions for the applications.

Now, when we moved to PostgreSQL, there are built-in functions that help us produce the result in a specific format, so the application can consume it directly.

That avoided another layer, another piece of code, another source of redundancy and failure. So this helped them with performance, and it also helped with maintainability and other aspects.
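For illustration only (the transcript does not name the exact output format; JSON is assumed here, and the table, column, and function names are invented), this is roughly what producing the result in a specific format looks like with PostgreSQL's built-in JSON functions, so no interpreter layer is needed:

```sql
-- Hypothetical sketch: return rows already shaped as JSON
-- so the application can consume them directly.
CREATE OR REPLACE FUNCTION get_client_positions(p_client_id integer)
RETURNS json
LANGUAGE sql
AS $$
    SELECT json_agg(
               json_build_object(
                   'instrument', t.instrument,
                   'quantity',   t.quantity,
                   'valuation',  t.valuation
               )
           )
    FROM positions t
    WHERE t.client_id = p_client_id;
$$;

-- Usage: SELECT get_client_positions(42);
```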

And lastly, it almost goes without saying: being open source, PostgreSQL keeps growing. I am sure that is happening with Microsoft SQL Server as well, but here they get that benefit for free. Overall, from a cost perspective, they brought the yearly license cost down to zero dollars and saved close to 75 percent of their overall database cost over the year.

This slide shows the same as a graph. With that, let me change gears and hand over to GG.

GG, can you talk about how we embarked on the conversion journey? Sure, thanks a lot. I think that background helps a lot, so let us get into the details. Many of you probably want to look at what we did, how we did it, which tools we used, and which challenges we faced and why. While I am talking through it, if you have gone through this yourself and have good experiences you want to share about the tools, please do not hesitate to write them in the chat; I would love to hear that. Okay, so it started with an engagement with an interesting customer, where we had to do a quick POC because of the requirements.

One of the important requirements was driven by the DB being the bottommost layer: any issues or bugs after conversion or migration of the schema will have a ripple effect on the application. That is why a lot of focus went into code coverage and into identifying

a tool that can give us code coverage for the stored procedures and functions being converted and migrated. That is one important piece. The second was the tool to use for the schema migration itself; since this is an AWS shop, there was a good amount of focus on using AWS.

We also looked at some other tools that convert or migrate the schema, but we zeroed in on the AWS tool, which gave us a good amount of accuracy.

I will get into the level of accuracy in different areas. There is some manual work; every tool we looked at for schema migration leaves some manual work around data types, or around reviewing the logic and validating the logic and functionality, and so on. That is the second piece we had to spend time on, after identifying the tools and doing some POCs around code coverage and unit testing. Once that was done, it was all about getting the converted schema into the form of code used in the development pipelines to deploy and manage the database, the same way the application does it today in terms of deploying

and installing the DB changes. Okay, moving on, I think this is where there might be the most interest in terms of tools. Like I said, the DB schema conversion tool we used was the AWS Schema Conversion Tool (SCT). It gives good accuracy, around 60 to 70 percent, for simple and small procedures that are not very complex; even then, something like 30 to 40 percent manual work was still required post-conversion, because some logic checks or entity checks had to be done, and even parameter names had to be changed manually.

Small work there, but we were able to pick a lot of them, get them converted, get them into the hands of the developers, and get those done quickly. We moved in phases: get the conversion done for a selected few. We were looking at around 350 to 400 different stored procedures for this customer, so pick the ones identified as simple and small, get them converted,

focus effort on getting those manually fixed, write the unit tests for them, and get them into the pipeline so that the coverage for everything converted stays where we want it. We worked like that. Even small things differ: a simplistic BEGIN and END block is just a BEGIN and END block in MS SQL,

but when it comes to PostgreSQL, transaction control comes into play, so we have to validate that it is in the right place and not violating anything. Some built-in functions, such as UUID generation or date-time functions, are also different, and we had to handle those manually, either by adapting the converter's output or by changing things in the DB.
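As a hedged sketch of that kind of manual fix-up (the procedure and table names are invented, not the customer's code), here is a trivial T-SQL procedure and a hand-adjusted PL/pgSQL equivalent; NEWID() and GETDATE() need Postgres equivalents, and a PL/pgSQL BEGIN...END is a block, not a transaction boundary:

```sql
-- T-SQL original (illustrative):
--   CREATE PROCEDURE dbo.log_event @msg NVARCHAR(200) AS
--   BEGIN
--       INSERT INTO event_log (id, created_at, message)
--       VALUES (NEWID(), GETDATE(), @msg);
--   END

-- Hand-adjusted PL/pgSQL version (native procedures need PostgreSQL 11+;
-- gen_random_uuid() is built in from PostgreSQL 13, earlier via pgcrypto):
CREATE OR REPLACE PROCEDURE log_event(p_msg text)
LANGUAGE plpgsql
AS $$
BEGIN
    INSERT INTO event_log (id, created_at, message)
    VALUES (gen_random_uuid(), now(), p_msg);  -- NEWID() -> gen_random_uuid(), GETDATE() -> now()
END;
$$;
```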

Among those 360-plus procedures we did face a few that were complex enough that, apart from the work I just mentioned, we had to put in a good amount of time on the queries themselves, because they were complex: multiple tables joined with different kinds of joins,

and they behaved differently in MS SQL versus PostgreSQL, so we had to play with that code and change the logic. On top of that, the input being passed to some procedures was XML data, or other data types, which at times had to be extracted and then manually changed when converted to PostgreSQL.
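A small hedged example of that last point (a made-up XML payload, not the customer's data): SQL Server's XML .value() method has no direct equivalent, so in PostgreSQL the value typically has to be pulled out with xpath() and cast by hand:

```sql
-- T-SQL style (illustrative):
--   SELECT @order_id = @doc.value('(/order/id)[1]', 'int');

-- PostgreSQL: xpath() returns an array of xml nodes,
-- so the result is indexed and cast explicitly.
SELECT ((xpath('/order/id/text()', '<order><id>42</id></order>'::xml))[1])::text::int AS order_id;
```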

That is why the accuracy we got on complex procedures like those was not more than 20-30 percent, and a good amount of effort went in there. That is also why we phased it the way we did: pick up the simple ones, then pick up the complex ones, and get them done. But well before any of that, we looked at the table conversions, which were highly accurate, close to 100 percent.

The only thing we faced was that, once we understood some of the recurring changes, we were able to write mapping rules within SCT, for example for data types (BIT in SQL Server becoming BOOLEAN in PostgreSQL) and for some naming standards. Once we did that, we got fairly good accuracy on the table conversions.
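As a minimal, hypothetical illustration of what such type mappings end up looking like (table and column names invented), a SQL Server table with IDENTITY, BIT, NVARCHAR, and DATETIME columns typically lands in PostgreSQL as:

```sql
-- SQL Server original (illustrative):
--   CREATE TABLE dbo.client (
--       id         INT IDENTITY(1,1) PRIMARY KEY,
--       is_active  BIT NOT NULL DEFAULT 1,
--       full_name  NVARCHAR(200),
--       created_at DATETIME DEFAULT GETDATE()
--   );

-- PostgreSQL version after conversion plus mapping rules:
CREATE TABLE client (
    id         integer GENERATED ALWAYS AS IDENTITY PRIMARY KEY,  -- IDENTITY(1,1)
    is_active  boolean NOT NULL DEFAULT true,                     -- BIT -> BOOLEAN
    full_name  varchar(200),                                      -- NVARCHAR -> VARCHAR (UTF-8 covers Unicode)
    created_at timestamp DEFAULT now()                            -- DATETIME -> TIMESTAMP, GETDATE() -> now()
);
```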

before I talk about any test let me spend a minute or so on Utica is like we talked about earlier coverage was one of the important things for this customer we landed on a tool called pigly as well as PG tab

one important point which helped us make a decision to use Piggly was it allowed you to write the unit test in whatever form you want because it was based on a design an architecture that it was it was instrumenting the end code or the stored procedure of functions code in such a way that then it was only playing with the DB does not matter what type of unit testing framework or language or programming thing that you’re using so it worked very well.

We had a familiarity with Pi test across our team being following it for a long for all of our serverless development which is python is development for us so it was an easy choice we faced a problem where the newer I mean the newer versions of post gree now has an as native support for store procedures earlier functions where users so procedures in post way but bigly is not being worked on being open search not being worked on it’s not as active as other

open sources are not being worked on and so it is not supporting stored those native store procedures but the function it does support

and for this customer, we are okay to use functions so so that’s why it went fine but there was one challenge one part of the prefix and PG tab the whole way of writing tests, and all was different learning so it was an easy choice for us to go with Piggly and write the unit test in my test.
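To make that concrete, here is a minimal sketch of what such a test can look like (connection settings, table, and function names are invented, psycopg2 is assumed as the driver, and the test database is assumed to be seeded; Piggly's coverage instrumentation runs separately and is not shown):

```python
# Hypothetical pytest unit test for a converted PostgreSQL function.
import psycopg2
import pytest


@pytest.fixture
def db_conn():
    conn = psycopg2.connect("dbname=crm_test user=test password=test host=localhost")
    yield conn
    conn.rollback()  # keep the test database clean
    conn.close()


def test_get_client_positions_returns_json(db_conn):
    with db_conn.cursor() as cur:
        cur.execute("SELECT get_client_positions(%s)", (42,))
        result = cur.fetchone()[0]
    # The converted function should return a JSON array the app can consume directly.
    assert isinstance(result, list)
    assert all("instrument" in row for row in result)
```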

So those were the primary tools. Nothing exotic beyond that: the underlying pipelines, containers, and so on were all part of it, but those core tools are what helped us get this done easily and quickly

for this customer. Like I said, among the challenges, coverage was an important one, so identifying the right tool meant going back and forth, working with the customer, showing demos and recordings, because in the end the customer had to run it and use it.

So there was a good amount of engagement; the customer has strong DB expertise himself, so picking the right tool mattered. The challenge of Piggly not supporting

native stored procedures in PostgreSQL was another one,

but we are hopeful that somebody in the open source world will put up a PR to support native stored procedures in Piggly; it should not be a big change.

It is a Ruby-based tool, so it should be easily done; maybe we will do it ourselves in the near future. Then there was the DB-heavy logic itself: remember, we did not know the core functionality of the product.

So we had to study that DB-heavy logic, make sure we understood it and validated it, so that we could write good functional test cases for the converted functions and procedures. That became a challenge

because of unfamiliarity, but we faced it, worked around it, and worked closely with the customer to make it happen.

We thought it might also help to talk about how long this might take, based on which you can decide when the right time is to look at such a conversion or migration. There is no single answer; the simplistic answer is "it depends", which nobody likes. But it depends on the complexity of the databases you are using. If you are not database-heavy,

in the sense of keeping the compute close to the database the way this customer did as an architectural decision, then your database may not have much core logic close to the data in terms of triggers and stored procedures; the conversion becomes simple, and even the conversion tool will give you really high value,

and you will be able to convert really quickly, within maybe a month or even less. Overall testing and everything else will take some time, but that is roughly what it would be. When it comes to complex processing and logic close to the data, using core database constructs like stored procedures, functions, and others, then the complexity increases and the time taken to convert will surely increase, like it did for us.

Based on that, you can decide when to look at it. For this customer it was the right time: based on the estimate, their license expiry was coming up in three to four months, and that is where they were able to make the call to start right away and make it happen,

so that they did not have to worry about renewing the license in the next cycle. That is one simple takeaway for us. Good, thank you GG.

That brings us to the end of the discussion. Now we should go through the questions; I see a couple in the chat, if you can just open the chat.

If you have any other questions, feel free to ask us.

