Posts

Breaking Changes When Upgrading from EF Core 6 to 7: What You Need to Know

Entity Framework Core (EF Core) is a popular Object-Relational Mapping (ORM) framework used by .NET developers for database operations. With the release of EF Core 7, many developers are considering upgrading their projects to take advantage of the new features and improvements.

However, as with any major version upgrade, there are some breaking changes that developers need to be aware of. In this blog post, we’ll discuss some of the breaking changes when migrating from EF Core 6 to 7 and how to address them.

  1. New name for the “FromSqlInterpolated” method: “FromSql”
    In EF Core 7, “FromSql” has been introduced as the new name for the “FromSqlInterpolated” method, while “FromSqlRaw” remains the method to use for raw SQL strings with positional parameters. This is a minor change, but if your project has a lot of code that uses “FromSqlInterpolated”, you will need to update it to use the new method name.

    EF Core 6:

    var blogs = context.Blogs.FromSqlRaw("SELECT * FROM dbo.Blogs").ToList();
    var blogs = context.Blogs.FromSqlInterpolated($"SELECT * FROM dbo.Blogs WHERE Url = {url}").ToList();

    EF Core 7:

    var blogs = context.Blogs.FromSqlRaw("SELECT * FROM dbo.Blogs").ToList();
    var blogs = context.Blogs.FromSql($"SELECT * FROM dbo.Blogs WHERE Url = {url}").ToList();
    
  2. Changes to the “Update” method
    In EF Core 7, the “Update” method has been changed to use a “Set” method instead of “Update”. This means that if you have code that uses the “Update” method, you will need to update it to use the new “Set” method.

    EF Core 6:

    context.Blogs.Update(blog);

    EF Core 7:

    context.Set<Blog>().Update(blog);
    
  3. Removal of the “UseInternalServiceProvider” method
    In EF Core 7, the “UseInternalServiceProvider” method has been removed. This method was used to configure the dependency injection container for EF Core, but it has been replaced with a new configuration method called “AddEntityFramework”.

    EF Core 6:

    options.UseInternalServiceProvider(serviceProvider);

    EF Core 7:

    options.AddEntityFramework().UseInternalServiceProvider(serviceProvider);
    
  4. Removal of the “ExecuteSqlInterpolated” method
    In EF Core 7, the “ExecuteSqlInterpolated” method has been removed. This method was used to execute SQL queries with interpolated string arguments. In EF Core 7, you can use the new “ExecuteSql” method to achieve the same functionality; the interpolated values are still sent to the server as parameters.

    EF Core 6:

    context.Database.ExecuteSqlInterpolated($"UPDATE dbo.Blogs SET Rating = {newRating} WHERE Url = {url}");

    EF Core 7:

    context.Database.ExecuteSql($"UPDATE dbo.Blogs SET Rating = {newRating} WHERE Url = {url}");
    
  5. Changes to the “ToSql” method
    The “ToSql” method has been replaced by “ToQueryString”, which generates the SQL for a LINQ query. If you have code that uses the “ToSql” method, you will need to update it to use the “ToQueryString” method.

    EF Core 6:

    var sql = context.Blogs.Where(b => b.Url.StartsWith("https://")).ToSql();

    EF Core 7:

    var sql = context.Blogs.Where(b => b.Url.StartsWith("https://")).ToQueryString();
    
  6. Changes to the “AsNoTracking” method
    In EF Core 7, the “AsNoTracking” method has been changed to accept a “QueryTrackingBehavior” parameter. This parameter can be used to specify the tracking behavior for the query. If you have code that uses the “AsNoTracking” method, you will need to update it to include the new parameter.

    EF Core 6:

    var blogs = context.Blogs.AsNoTracking().ToList();

    EF Core 7:

    var blogs = context.Blogs.AsNoTracking(QueryTrackingBehavior.NoTracking).ToList();
    
  7. Changes to the “CreateDbContext” method
    In EF Core 7, the “CreateDbContext” method has been changed to accept a “DbContextOptions” parameter. This parameter can be used to configure the DbContext options. If you have code that uses the “CreateDbContext” method, you will need to update it to include the new parameter.

    EF Core 6:

    var context = new MyDbContext();

    EF Core 7:

    var options = new DbContextOptionsBuilder<MyDbContext>().UseSqlServer(connectionString).Options;
    var context = new MyDbContext(options);
    
  8. Default value of “Encrypt” in SQL Server connection strings has changed to “true”
    In EF Core 7, the default value of the “Encrypt” attribute in SQL Server connection strings has been changed to “true”. This is a high-impact breaking change, as it may affect existing applications that rely on the previous default behavior. If you are upgrading to EF Core 7 and your application relies on an unencrypted SQL Server connection, you will need to explicitly set the “Encrypt” attribute to “false” in your connection string, like this:

    "Data Source=myServer;Initial Catalog=myDatabase;Integrated Security=True;Encrypt=false;"

    Alternatively, you can set the “Encrypt” attribute explicitly to “true” if you wish to use an encrypted connection (a small code-configuration sketch follows right after this list):

    "Data Source=myServer;Initial Catalog=myDatabase;Integrated Security=True;Encrypt=true;"
    
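
The same choice can be made explicit when the provider is configured in code rather than in a config file. A minimal sketch, reusing the placeholder server and database names and the MyDbContext type from the examples above:

    // The connection string values are placeholders from the examples above, not a real environment.
    var options = new DbContextOptionsBuilder<MyDbContext>()
        .UseSqlServer("Data Source=myServer;Initial Catalog=myDatabase;" +
                      "Integrated Security=True;Encrypt=false;") // opt out of the new encrypted-by-default behavior
        .Options;

    using (var context = new MyDbContext(options))
    {
        // If you keep Encrypt=true against a server using a self-signed certificate, the client
        // must trust that certificate (or the connection string must set TrustServerCertificate=true).
        var blogs = context.Blogs.ToList();
    }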

In summary, upgrading to EF Core 7 can bring many benefits to your project, but it’s important to be aware of the breaking changes that come with the upgrade. By understanding and addressing these changes, you can ensure a smooth migration and take full advantage of the new features and improvements in EF Core 7.

OData with AspNet Core

In this screencast we build an OData-enabled backend using AspNet Core and connect it to Kendo UI for Angular 2‘s Grid Component in only 20 minutes. If you missed the first screencast in this two-part series, make sure to check it out here!
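
To give a flavor of what the backend side boils down to, here is a minimal sketch of an OData-enabled controller. The Product entity and AppDbContext are made up for illustration, and the route/EDM setup varies between OData package versions; the screencast shows the full wiring.

    // Requires the AspNet Core OData package (ODataController, [EnableQuery]).
    // Exposing an IQueryable with [EnableQuery] lets OData query options such as
    // $filter, $orderby, $skip, $top and $count shape the query - which is exactly
    // what the Kendo UI Grid sends when its DataSource is configured for OData.
    public class ProductsController : ODataController
    {
        private readonly AppDbContext _db; // hypothetical EF context

        public ProductsController(AppDbContext db)
        {
            _db = db;
        }

        [EnableQuery]
        public IQueryable<Product> Get()
        {
            return _db.Products;
        }
    }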

Until next time, have an excellent day and a super 2017!

AspNet Core NodeServices – Execute JavaScript on the Server at Runtime

Screencast on how to execute JavaScript from AspNet Core on the backend, inspired by Steve Sanderson’s NDC Sydney talk. The thumbnail tries to illustrate how my mind gets blown by the possibilities! ;-)
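
For a rough idea of what the C# side looks like, here is a minimal sketch; the module path and the controller are made up for illustration, and the full walkthrough is in the screencast.

    // Startup.cs - register NodeServices from the Microsoft.AspNetCore.NodeServices package.
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddNodeServices();
        services.AddMvc();
    }

    // A controller that invokes a Node module at runtime. "./Node/addNumbers" is a
    // hypothetical module exporting a function (callback, x, y) that calls callback(null, x + y).
    public class MathController : Controller
    {
        private readonly INodeServices _nodeServices;

        public MathController(INodeServices nodeServices)
        {
            _nodeServices = nodeServices;
        }

        public async Task<IActionResult> Add(int x, int y)
        {
            var sum = await _nodeServices.InvokeAsync<int>("./Node/addNumbers", x, y);
            return Content(x + " + " + y + " = " + sum);
        }
    }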

Screencast

Awesome work by the AspNet team, until next time, have an excellent day!

Do’s and Don’ts when Using AspNet with Single Page Applications such as Angular, Aurelia or React

Discussion on do’s and don’ts when combining AspNet with Single Page Applications such as Angular, Aurelia or React. Main question: Should we separate frontend and backend into separate solutions or keep them together as the new AspNet template in VS2015 suggests?

Screencast

What’s your take on it? Agree or disagree? Let me know!

Until next time, have an excellent day!

Structured Logging with AspNet Core using Serilog and Seq

In this episode we take a first look at structured logging from an AspNet Core application using Serilog and Seq.

Screencast

Adding Serilog

Configuring the web app to leverage Serilog only requires three simple steps. First, make sure to get the NuGet packages by adding these lines to the dependencies in your project.json.

    "Serilog.Extensions.Logging": "1.0.0-rc2-*",
    "Serilog.Sinks.RollingFile": "2.0.0-rc-*",
    "Serilog.Sinks.Seq": "2.0.0-rc-*"

In the constructor of your Startup.cs file, configure the logger to log to both the Seq endpoint and a rolling file, or at least that’s what I did.

    Log.Logger = new LoggerConfiguration()
        .MinimumLevel.Information()
        .WriteTo.RollingFile("log-{Date}.txt", LogEventLevel.Information)
        .WriteTo.Seq("http://localhost:5341/")
        .CreateLogger();

This assumes that you’ve installed the Seq MSI on your local machine; you can grab it from here. Finally, add Serilog to the logger factory in the Configure method.

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    loggerFactory.AddConsole(Configuration.GetSection("Logging"));
    loggerFactory.AddDebug();
    loggerFactory.AddSerilog();
}

That’s it! Browse http://localhost:5341 and you should see the following:
seq

Structured Logging

To log entire objects that will be queryable, simply put an @ sign in front of the name of the property that you want to create. For instance, to log a person object and create a Person property accordingly from the HomeController, we do the following.

public class Person
{
    public string Firstname { get; set; }
    public string Lastname { get; set; }
}

public class HomeController : Controller
{
    private readonly ILogger<HomeController> _logger;

    public HomeController(ILogger<HomeController> logger)
    {
        _logger = logger;
    }

    public IActionResult Index()
    {
        var p = new Person { Firstname = "Ajden", Lastname = "Towfeek" };
        _logger.LogInformation("Just trying out the logger {@Person}", p);
        return View();
    }
}

Which will result in the following queryable log entry in Seq:
seq_logpost

Conclusions

It’s really powerful to log rich, structured and queryable data, but there are also some downsides at the moment with using Seq that I’d like to point out.

  • Shipping logs over http/https adds extra overhead.
  • The Seq service needs to be installed on a separate virtual machine from your Azure Web App (assuming you’re using Azure), meaning you’ll need to pay for an extra VM just for logging.
  • The free license is only usable for development; you can’t have open endpoints that anyone can log to in production.

That said, I still think the idea of structured logging is very interesting, and it provides an extra dimension of information when fixing and reproducing bugs. I just won’t be using Seq in production just yet.

Until next time, have an excellent day!

Getting Started with Angular2 RC1 and AspNet Core 1.0 RC2 using VisualStudio 2015 and Gulp

In this week’s screencast we start from a clean slate and create a new Angular 2 and AspNet Core seed project using the latest release candidate versions. My earlier series contains many upgrade-from-beta-x-to-beta-y episodes, even alphas, and it’s starting to become messy to follow, hence starting over with a fresh File, New Project. Make sure to star the project on GitHub, available at https://github.com/ajtowf/aspnetcore-angular2-seed.

Screencast

Until next time, have an excellent day!

Azure Cloud Service Install Certificate Into Trusted Root Certificate Authorities Store with Azure Startup Task

Here’s a guide on how to install a certificate into Trusted Root Certificate Authorities store for Azure Cloud Services.

What we want to solve

In our case we had a web role (web app) that needed to communicate with a third party that we didn’t control; they were using a self-signed certificate and required communication over HTTPS. For the TLS/SSL handshake to succeed, we need to install the certificate into our trust store.

What others have done

There are solutions out there where people install the certificate into the personal store using the portal and then have a worker role move the certificate to the trusted CA store with administrative privileges at runtime. First of all, that’s a very cumbersome approach, and second, it uses resources that cost money; there is a much simpler way.

Solution

1. Include the certificate you want to install into your web app, optionally as a link.
azure-trusted-ca-1-add-certificate

2. Make sure to set the Build Action to Content and Copy to Output Directory to Copy if newer.
azure-trusted-ca-2-content-copy

3. Add a startup.cmd also with Build Action set to Content and Copy to Output Directory set to Copy if newer.
azure-trusted-ca-3-startupcmd

4. Modify the contents of startup.cmd to the following:

certutil -addstore root certificate.cer

5. Open up ServiceDefinition.csdef and add the following lines to your web role configuration section.

<Startup>
  <Task commandLine="startup.cmd" executionContext="elevated" taskType="simple" />
</Startup>

Full context in our simple sample looks like this:
azure-trusted-ca-4-service-definition

6. You’re done! Next time you deploy the cloud service the certificate will be installed into the Trusted Root Certificate Authorities store for the VM.

What _not_ to do

You can find answers on Stack Overflow and blogs on how to install the certificate manually by remoting to the machine and using mmc locally. That is a bad idea, since the certificate will be gone the next time the VM is torn down and re-created. And if you’re new to Azure Cloud Services: that happens regularly.

Final Words

These steps are super easy compared to many other proposed solutions out there. We learned about this approach from security expert Dominick Baier’s blog post from a while back; it’s a lot shorter, but as he states, the title says it all!

Hope it helped!

Programming Interview Questions: Recursion

In this screencast we solve two commonly asked interview questions: calculating a factorial and traversing a binary tree.

Screencast

What’s recursion?

A recursive function is simply a function that calls itself, and the trick is to know when to stop the recursion, to avoid infinite call chains that result in stack overflows.

If the interviewers ask you to write down an algorithm that gives you the nth Fibonacci number, calculates a factorial or traverses a binary tree, they probably want you to provide both an iterative and a recursive solution. We don’t address Fibonacci in the screencast, but the formula for the nth number is simply the sum of the previous two, i.e.

f(n) = f(n-1) + f(n-2)
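
For completeness (this is not from the screencast), here’s a minimal sketch of both a recursive and an iterative Fibonacci:

    // Recursive: a direct translation of f(n) = f(n-1) + f(n-2).
    private static long fib(int n) {
        if (n <= 1) return n;
        return fib(n - 1) + fib(n - 2);
    }

    // Iterative: keep only the two previous values; O(n) time, O(1) space.
    private static long fibIterative(int n) {
        long prev = 0, curr = 1;
        for (int i = 0; i < n; i++) {
            long next = prev + curr;
            prev = curr;
            curr = next;
        }
        return prev;
    }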

Is this a good interview question?

Here are the recursive methods I developed during the screencast to calculate a factorial and to sum the values of all the nodes in a binary tree:

    private static int sum(Node node) {
        if (node == null) return 0;
        return node.Value + sum(node.Left) + sum(node.Right);
    }
    
    private static long factorial(int n) {
        if (n <= 1) return 1;
        return n * factorial(n - 1);
    }
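
Since interviewers usually want the iterative variant too, here’s a sketch of iterative counterparts to the two methods above (the tree version trades the call stack for an explicit Stack<Node>):

    // Iterative factorial: a plain loop instead of recursion.
    private static long factorialIterative(int n) {
        long result = 1;
        for (int i = 2; i <= n; i++) {
            result *= i;
        }
        return result;
    }

    // Iterative tree sum: uses an explicit stack (System.Collections.Generic)
    // instead of recursive calls, visiting every node exactly once.
    private static int sumIterative(Node root) {
        if (root == null) return 0;
        var total = 0;
        var stack = new Stack<Node>();
        stack.Push(root);
        while (stack.Count > 0) {
            var node = stack.Pop();
            total += node.Value;
            if (node.Left != null) stack.Push(node.Left);
            if (node.Right != null) stack.Push(node.Right);
        }
        return total;
    }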

As you can see the answers are usually very simple but it’s not unusual to see candidates try to make things more complicated than they need to be. Just keep it simple.

Interviewers tend to ask these kinds of questions even if functional programming is a very small part of the day-to-day work. It’s always good to be prepared by practicing on some simple problems similar to the ones covered here. After one or two exercises you’ll get the hang of it, and it won’t be a problem if they throw these kinds of questions at you during the interview.

And as always, until next time, have a nice day!

Connection leaks when using async/await with Transactions in WCF

If you’re getting “The current TransactionScope is already complete” from service calls that don’t even consume transactions, you’ll probably want to read/see this.

Screencast and Code

The code can be found on GitHub, https://github.com/ajtowf/dist_transactions_lab. One change I made since the recording is that we no longer create the NHibernate session factory with each call; we now use a singleton SessionManager instead (a sketch follows below). We also add a convention to the factory to never load lazily, so that our Item entity doesn’t need to have virtual properties, which makes it easier to switch between OR-mapper implementations.
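
The repository is the source of truth for the actual class, but a minimal sketch of such a singleton SessionManager (assuming Fluent NHibernate; the connection string key and class layout are made up for illustration) could look like this:

    using System;
    using FluentNHibernate.Cfg;
    using FluentNHibernate.Cfg.Db;
    using FluentNHibernate.Conventions.Helpers;
    using NHibernate;

    public static class SessionManager
    {
        // Build the expensive ISessionFactory exactly once and reuse it for every call.
        private static readonly Lazy<ISessionFactory> Factory = new Lazy<ISessionFactory>(() =>
            Fluently.Configure()
                .Database(MsSqlConfiguration.MsSql2008
                    .ConnectionString(c => c.FromConnectionStringWithKey("Default")))
                .Mappings(m => m.FluentMappings
                    .AddFromAssemblyOf<Item>()
                    // Never load lazily, so entities such as Item don't need virtual properties.
                    .Conventions.Add(DefaultLazy.Never()))
                .BuildSessionFactory());

        // An ISession is cheap to create and not thread-safe; open one per unit of work.
        public static ISession OpenSession()
        {
            return Factory.Value.OpenSession();
        }
    }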

Leaking Connections

In a fairly complex distributed enterprise system we were getting some strange The current TransactionScope is already complete errors. We used transactions frequently, but we saw this on calls that weren’t even supposed to run within a transaction.

After trying almost everything, we got a hint from an NHibernate analyzer product that we shouldn’t consume an NHibernate session from multiple threads, since it isn’t thread-safe.

If you use await, that’s exactly what happens. It turns out Entity Framework has the same problem.

The following code in your service will leak connections if the awaited method or service call uses a database connection with Entity Framework or NHibernate.

    [OperationBehavior(TransactionScopeRequired = true)]
    public async Task CallAsync()
    {
        using (var ts = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
        {
            await _service.WriteAsync();
            ts.Complete();
        }
    }

Why Tasks in the Service Contract at all?

The sole reason for our service contracts being Task-based is that we use the same interface to implement our client-side proxies, which is neat, but the service doesn’t need to use await because of that. This will work, for instance:

    [OperationBehavior(TransactionScopeRequired = true)]
    public Task CallAsync()
    {
        // Do synchronous stuff
        return Task.FromResult(true);
    }

or (I don’t like this one, though)

    [OperationBehavior(TransactionScopeRequired = true)]
    public Task CallAsync()
    {
        // Remember to copy the OperationContext and TransactionScope to the inner Task.
        return Task.Run(() =>
        {
            // Do synchronous stuff
        });          
    }

Oh, you don’t want to construct and return a Task manually when you’re not doing anything async? Do this then:

    [OperationBehavior(TransactionScopeRequired = true)]
    public async Task CallAsync()
    {
        // Do synchronous stuff
    }

What about the warning? Turn it off with #pragma.

    [OperationBehavior(TransactionScopeRequired = true)]
#pragma warning disable 1998
    public async Task CallAsync()
#pragma warning restore 1998
    {
        // Do synchronous stuff
    }

You’ll probably want to wrap the entire service class with that pragma disable.

Solution

The main takeaway here is simply not to use async/await in your service code if you’re awaiting methods or service calls that will use database connections. The following refactoring solves the problem:

    [OperationBehavior(TransactionScopeRequired = true)]
    public Task CallAsync()
    {
        _service.WriteAsync().Wait();
        return Task.FromResult(true);
    }

As always, until next time, have a nice day!

Distributed Transactions in WCF with async and await

TL;DR?

See my screencast explaining the problem instead:

Problem

When flowing a transaction from a client to a service, Transaction.Current becomes null after awaiting a service-to-service call.

Unless, of course, you create a new TransactionScope in your service method as follows:

    [OperationBehavior(TransactionScopeRequired = true)]
    public async Task CallAsync()
    {
        using (var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
        {
            await _service.WriteAsync();
            await _service.WriteAsync();            
            scope.Complete();
        }
    }

Problem UPDATE

It doesn’t even have to be a service-to-service call; an await on a local async method also nulls Transaction.Current. To clarify with an example:

    [OperationBehavior(TransactionScopeRequired = true)]
    public async Task CallAsync()
    {
        await WriteAsync();
        // Transaction.Current is now null
        await WriteAsync();                     
    }

Why TransactionScopeAsyncFlowOption isn’t enabled by default I don’t know, but I don’t like to repeat myself, so I figured I’d always create an inner TransactionScope with that option using a custom behavior.

Attempted Solution

I created a message inspector implementing IDispatchMessageInspector and attached it as a service behavior; the code executes without any problem, but it doesn’t have the same effect as declaring the TransactionScope in the service method.

    public class TransactionScopeMessageInspector : IDispatchMessageInspector
    {
        public object AfterReceiveRequest(ref Message request, IClientChannel channel, InstanceContext instanceContext)
        {
            var transactionMessage = (TransactionMessageProperty)OperationContext.Current.IncomingMessageProperties["TransactionMessageProperty"];
            var scope = new TransactionScope(transactionMessage.Transaction, TransactionScopeAsyncFlowOption.Enabled);            
            return scope;
        }

        public void BeforeSendReply(ref Message reply, object correlationState)
        {
            var transaction = correlationState as TransactionScope;
            if (transaction != null)
            {
                transaction.Complete();
                transaction.Dispose();
            }
        }
    }

By looking at the identifiers when debugging, I can see that it is in fact the same transaction in the message inspector as in the service, but after the first call, i.e.

    await _service.WriteAsync();

Transaction.Current becomes null. The same thing happens if I don’t take the transaction from OperationContext.Current in the message inspector, so it’s unlikely that that is the problem.

Is it possible to create a TransactionScope in a Custom WCF Service Behavior?

Is it even possible to accomplish this? It appears like the only way is to declare a TransactionScope in the service method, that is:

    public async Task CallAsync()
    {
        var scope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled);
        await _service.WriteAsync();
        await _service.WriteAsync();            
        scope.Complete();
    }

With the following service contract it’s obvious that we get an exception on the second service call if Transaction.Current became null in between:

    [OperationContract, TransactionFlow(TransactionFlowOption.Mandatory)]
    Task WriteAsync();

On my Stack Overflow question I got a link to a book posing the exact same question. The conclusion is basically that it can’t be done in a clean way. Quoting the book:

We consider the lack of parity with standard WCF behavior introduced by async service operations a design flaw of WCF…

And then a far from ideal / insane solution is proposed.

Accepted Solution for now

It seems like the only way to make this work is to create an inner transaction. If you have a better solution, feel free to comment, contact me, or why not answer my Stack Overflow question: http://stackoverflow.com/questions/34767978.

Until next time, have an excellent day!