Welcome to today’s post.
In this post, I will discuss how to decide which approach to take when developing and maintaining databases with Entity Framework Core.
There are two approaches available when creating and updating database schemas: applying incremental migrations, or creating the schema directly from the model. The approach we take depends on several factors, including:
- The type of database we are using in our application, relational or non-relational.
- The stage of the software development cycle our application is currently at: development, testing, or production.
- The amount of data in our database.
In the next three sections, I will explain how to recognise each of the above factors and use them to decide on your migration strategy. In each section, I will use Entity Framework Core as the ORM when explaining each strategy.
Deciding the Migration Strategy Based on the Type of Database
In a previous post where I introduced the use of the Entity Framework Core PowerShell Migration Tools, I showed how to apply the following PowerShell commands to add, update and remove migrations from a database:
Add-Migration
Update-Database
Remove-Migration
I then showed in more detail how to apply the above commands for incremental schema updates to the database.
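As a quick recap, a typical incremental update consists of adding a named migration and then applying it to the database (the migration name below is just an illustration):

Add-Migration add_bookloan_newfields
Update-Database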
The manual approach taken with the PowerShell commands is known as the Migration approach to updating database schemas. It is only applicable when the database type is relational, such as MS SQL Server.
If the database type is a non-relational database, such as a NoSQL database, then the Migration approach is not possible. In this case, we would need to use the programmatic method with the Entity Framework Core API.
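For example, with EF Core's Cosmos DB provider (an assumption here for illustration), the database and its containers are created directly from the data context and model at startup with EnsureCreated(); the CatalogContext, Book entity, and connection details below are all hypothetical:

public class Book
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class CatalogContext : DbContext
{
    public DbSet<Book> Books { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        => optionsBuilder.UseCosmos(
            "https://localhost:8081",   // assumed Cosmos DB emulator endpoint
            "<account key>",
            databaseName: "BookCatalog");
}

// Create the database and containers from the model; migrations are not supported here.
using (var context = new CatalogContext())
{
    context.Database.EnsureCreated();
}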
Deciding the Migration Strategy Based on the Stage of the Software Development Cycle
In addition to applying migrations manually with the PowerShell commands, we can apply them programmatically with the data context's Database.Migrate() extension method.
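For development environments, a minimal sketch of applying any pending migrations at startup is shown below, placed inside the startup Configure() method and assuming the ApplicationDbContext used later in this post:

using (var serviceScope = app.ApplicationServices
    .GetRequiredService<IServiceScopeFactory>()
    .CreateScope())
{
    var context = serviceScope.ServiceProvider
        .GetRequiredService<ApplicationDbContext>();
    context.Database.Migrate(); // apply any migrations not yet recorded in __EFMigrationsHistory
}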
In a production environment, it is recommended that the Migrate() extension method is NOT called, as it requires a higher level of database permissions, and any changes can cause data and schema corruption if another instance accesses the database with a concurrent migration.
With production scenarios, I recommend generating the SQL script using either the GenerateCreateScript() extension method or the PowerShell command:
Script-Migration
The SQL script can then be used to test a deployment before running it in a production environment.
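Note that GenerateCreateScript() scripts the creation of the full schema from the current model, whereas Script-Migration scripts the migrations themselves. As a minimal sketch, assuming an ApplicationDbContext instance named context and a using directive for System.IO, the generated script can be written out for review (the output file name is arbitrary):

// Generate the CREATE script for the current model and save it for review.
var sql = context.Database.GenerateCreateScript();
File.WriteAllText("create-bookloan-schema.sql", sql);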
When a script needs to be generated that can be applied directly to the database offline or through a deployment pipeline, you can use the following command:
Script-Migration -Idempotent -Output [output file name]
The -Idempotent switch generates a script that can be run to bring any database schema up to date with the latest migration. Entries in the script will be of the form:
IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20191112075920_add_bookloan_newfields_12_11_2019')
BEGIN
ALTER TABLE [Loans] ADD [DateDue] datetime2 NOT NULL DEFAULT '0001-01-01T00:00:00.000';
END;
GO
Essentially, when run, the idempotent script applies only those migrations that are not already recorded in the __EFMigrationsHistory table of the target database.
When the development cycle is at the testing stage, we often want to tear down, rebuild, and re-populate our database environment and data entirely. In this case, we have two useful extension methods: EnsureDeleted(), which removes the database if it exists when the application is run, and EnsureCreated(), which re-creates it from the current model. After the database schema is re-generated, we would populate the database with initial data (such as lookups), then execute any application unit tests with the test data.
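A minimal sketch of such a reset at the start of a test run is shown below; the connection string and the SeedLookupData() helper are assumptions for illustration:

var options = new DbContextOptionsBuilder<ApplicationDbContext>()
    .UseSqlServer(@"Server=(localdb)\mssqllocaldb;Database=BookLoanTest;Trusted_Connection=True")
    .Options;

using (var context = new ApplicationDbContext(options))
{
    context.Database.EnsureDeleted();  // remove the database if it exists
    context.Database.EnsureCreated();  // re-create the schema from the current model
    SeedLookupData(context);           // hypothetical helper to load lookup and test data
}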
With databases that contain large amounts of data, closely reproducing the volumes used within a production environment, it is impractical to frequently change the database schema while tearing down the environment: the data re-population times alone would make this unworkable. Data of that size is more suitably used within a QA or production environment.
Deciding on the Migration Strategy Based on the Size of Existing Data
For the database to be created from the data context and POCO models during the prototyping phase of development, we can add a call to the EnsureCreated() extension method within the startup Configure() method, as shown:
public void Configure(IApplicationBuilder app, IHostEnvironment env,
    ILogger<Startup> logger)
{
    …
    // Resolve the data context from a service scope and create the
    // database from the model if it does not already exist.
    using (var serviceScope = app.ApplicationServices
        .GetService<IServiceScopeFactory>()
        .CreateScope())
    {
        var context = serviceScope.ServiceProvider
            .GetRequiredService<ApplicationDbContext>();
        context.Database.EnsureCreated(); // create database if not already created.
    }
    …
}
When the application is first run, the database will be created; on subsequent executions, the existing database will simply be opened for access, with no additional code needed to detect whether it exists. Note that Migrations and the EnsureCreated() extension method do not interoperate: a database created with EnsureCreated() has no migrations history, so migrations cannot later be applied to it. Choose one approach or the other for a given database.
With development environments, any sample data stored within the database should be reproducible with additional SQL scripts, data seeding routines, or data loading applications. The database can then be deleted and recreated without risk of losing important data.
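One way to make such data reproducible is EF Core's model-level seeding with HasData(), which inserts the rows whenever the schema is created; the Genre lookup entity below is a hypothetical example:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Seed lookup rows; these are inserted as part of schema creation.
    modelBuilder.Entity<Genre>().HasData(
        new Genre { GenreID = 1, Name = "Fiction" },
        new Genre { GenreID = 2, Name = "Non-Fiction" });
}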
The above discussion gives us some clear ideas on which methods and tools to use when migrating database schemas. Used appropriately, they provide the developer with a powerful set of tools that make schema migrations more structured, more automated, and less prone to scripting errors reaching production environments.
That is all for today’s post.
I hope you found this post useful and informative.
Andrew Halil is a blogger, author and software developer with expertise in many areas of the information technology industry, including full-stack web and native cloud-based development, test driven development and DevOps.