AutoMapper, ProjectTo() – Instance Version

Full source code available here.

In my previous post I showed how to use the wonderful AutoMapper ProjectTo() feature. The demo code worked with AutoMapper up to v8.1.1 and looked like this –

_salesContext.Products.OrderBy(p => p.ProductId).Take(count).ProjectTo<ProductModel>()

But this will not work with AutoMapper v9 and above. You will be promptly informed that “No overload for the method ‘ProjectTo’ takes 0 arguments”.

What happened? Well, AutoMapper moved away from a static API to an instance-only API.
To get this working you need to make a few minor changes compared to the previous post.
In ConfigureServices(…) replace this –

Mapper.Initialize(cfg => {
    cfg.AddProfile<MappingProfile>();
});

With this –

services.AddAutoMapper(typeof(Startup));
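
So ConfigureServices ends up looking something like this – a rough sketch, where the SalesContext registration and the connection string name are assumptions carried over from the previous post, so adjust them to match your own Startup:

public void ConfigureServices(IServiceCollection services)
{
    // assumption: the existing SalesContext/EF Core registration stays as-is
    services.AddDbContext<SalesContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("SalesContext")));

    // scans the assembly containing Startup for Profile classes, e.g. MappingProfile
    services.AddAutoMapper(typeof(Startup));

    services.AddControllers(); // or services.AddMvc() on older ASP.NET Core versions
}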

Then in the controller, change this –

public class ProductsController : ControllerBase
{
    private readonly SalesContext _salesContext;  

    public ProductsController (SalesContext salesContext)
    {
        _salesContext = salesContext;
    }
    // snip…

To this –

private readonly SalesContext _salesContext;
private readonly IMapper _mapper;

public ProductsController (SalesContext salesContext, IMapper mapper)
{
    _salesContext = salesContext;
    _mapper = mapper;
}
// snip..

And finally, the ProjectTo call changes from this –

[HttpGet("projected/{count}")]
public ActionResult GetProjected(int count)
{
    return Ok(_salesContext.Products.OrderBy(p => p.ProductId).Take(count).ProjectTo<ProductModel>());
}

To this –

[HttpGet("projected/{count}")]
public ActionResult GetProjected(int count)
{
    return Ok(_mapper.ProjectTo<ProductModel>(_salesContext.Products.OrderBy(p => p.ProductId).Take(count)));
}

The SQL produced by the instance ProjectTo() is the same as what was produced by the static version.
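
For reference, the generated query still selects only the three projected columns, the same as shown in the static-version post below –

SELECT TOP(@__p_0) [p].[Code], [p].[Name], [p].[ProductId]
FROM [Products] AS [p]
ORDER BY [p].[ProductId]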

That’s it, easy once you figure it out.

Full source code available here.

AutoMapper, ProjectTo() – Static Version

Full source code available here.

I’ve been using AutoMapper for quite a few years, and one of the features I like the most is the ProjectTo() method. When using Entity Framework it lets me reduce the number of fields I query from the database to match the model I want to return to the caller.
For example, I have a Product table in the database and an entity class that represents it, which looks like this –

public class Product
{
	public int ProductId { get; set; }
	public string Name { get; set; }
	public ProductCategory ProductCategory { get; set; }
	public string Description { get; set; }
	public decimal Price { get; set; }
	public string SKU { get; set; }
	public string Code { get; set; }
}  

But say I wanted to return only the ProductId, Name and Code, I could run a query to return the full Product and then map the results to a model that looks like this –

public class ProductModel
{
	public int ProductId { get; set; }
	public string Name { get; set; }
	public string Code { get; set; }
}

In this scenario, the generated SELECT statement requests all seven fields of the Product and returns them to the application; I then have to map them to the ProductModel with the three fields I want.
Here is the SQL produced and executed –

SELECT TOP(@__p_0) [p].[ProductId], [p].[Code], [p].[Description], [p].[Name], [p].[Price], [p].[ProductCategory], [p].[SKU]
FROM [Products] AS [p]
ORDER BY [p].[ProductId]

Instead of requesting all seven fields the ProjectTo method will examine the model and generate only the SQL needed to return the relevant fields.

Here is the SQL produced –

SELECT TOP(@__p_0) [p].[Code], [p].[Name], [p].[ProductId]
FROM [Products] AS [p]
ORDER BY [p].[ProductId]

And here is how to write the C# code that performs this time-saving, energy-saving work –

var productModels = _salesContext.Products.OrderBy(p => p.ProductId).Take(count).ProjectTo<ProductModel>();

To get this to work you need to do some wiring up.
First, add the NuGet package AutoMapper.Extensions.Microsoft.DependencyInjection v6.1.1 to your project.

Then create a mapping profile; for this scenario it is very simple. The code below maps Product to ProductModel.

public class MappingProfile : Profile
{
	public MappingProfile()
	{
		CreateMap<Product, ProductModel>();
	}
}

In Startup.cs add a call to Mapper.Initialize as shown here.

public void ConfigureServices(IServiceCollection services)
{
	Mapper.Initialize(cfg => {
		cfg.AddProfile<MappingProfile>();
	});
	// snip…
}

That’s all you need, but that’s not the end of the story…

This approach works with AutoMapper.Extensions.Microsoft.DependencyInjection up to v6.1.1, and AutoMapper v8.1.1, but if you move to newer versions you need a slightly different approach because the static API has been removed.

I’ll show that in the next blog post.

Full source code available here.

Streaming Results from Entity Framework Core and Web API Core – Part 2

Full source code available here.

Some time ago I wrote a post showing how to stream results from Entity Framework over Web API. This approach has a few benefits – the results are not materialized in the API code, a small, constant amount of memory is used irrespective of the size of the data returned, the results begin streaming as soon as possible, and the request completes faster than doing something like .ToList() or .ToListAsync().

In the example code I directly accessed the DbContext from the API controller; a reader of the blog got in touch to ask how the database access code could be kept away from the API controller. This blog post shows a simple way of achieving this.

In the related post I had a method like this –

[HttpGet("streaming/")]
public IActionResult GetStreaming()
{
    IQueryable<Product> products = _salesContext.Products.AsNoTracking();
    return Ok(products);
}

The _salesContext was passed into the constructor of the ProductsController class using dependency injection and accessed directly.

In this post I’m going to pass a ProductsService into the ProductsController and use it to make the request to the database.

Here is how the controller now looks –

    [Route("api/[controller]")]
    [ApiController]
    public class ProductsController : ControllerBase
    {
        private readonly ProductsService _productsService;

        public ProductsController(ProductsService productsService)
        {
            _productsService = productsService;
        }

        [HttpGet("streamingFromService/{count}")]
        public ActionResult GetStreamingFromService(int count)
        {
            return Ok(_productsService.GetStreaming(count));
        }

        [HttpGet("streamingFromServiceWithProjection/{count}")]
        public ActionResult GetStreamingFromServiceWithProjection(int count)
        {
            return Ok(_productsService.GetStreamingWithProjection(count));
        }

        [HttpGet("nonStreamingFromService/{count}")]
        public async Task<ActionResult> GetNonStreamingFromService(int count)
        {
            return Ok(await _productsService.GetNonStreaming(count));
        }
    }

The ProductsService is very simple; it makes the relevant calls to the database and returns the Products or ProductModels. For good measure I’ve added AutoMapper to project the Product to ProductModel, separating the underlying data type from the one presented to the caller.

    public class ProductsService
    {
        private readonly SalesContext _salesContext;
        private readonly IMapper _mapper;

        public ProductsService(SalesContext salesContext, IMapper mapper)
        {
            _salesContext = salesContext;
            _mapper = mapper;
        }

        public IQueryable<Product> GetStreaming(int count)
        {
            IQueryable<Product> products = _salesContext.Products.OrderBy(o => o.ProductId).Take(count).AsNoTracking();
            return products;
        }

        public IQueryable<ProductModel> GetStreamingWithProjection(int count)
        {
            var productModels = _mapper.ProjectTo<ProductModel>(_salesContext.Products.OrderBy(o => o.ProductId).Take(count).AsNoTracking());
            return productModels;
        }

        public async Task<IList<Product>> GetNonStreaming(int count)
        {
            IList<Product> products = await _salesContext.Products.OrderBy(o => o.ProductId).Take(count).ToListAsync();
            return products;
        }
    }

Now, some people might be concerned about returning an IQueryable; if you are, see this post by Mark Seemann https://blog.ploeh.dk/2012/03/26/IQueryableTisTightCoupling/ to start digging into the topic.

While writing this I came across what seems like a severe performance bug in Entity Framework Core 3.x; I’m working on another post which will cover this in detail.

Full source code available here.

Streaming Results from Entity Framework Core and Web API Core

Full source code here.
The code provided will not compile until you make a change in seeder.cs; as written it generates 500,000 rows in a local db. Set this to whatever value you want.

In this post I’m going to show you how to return an unlimited number of results from a database via a Web API application while keeping your memory usage low and constant. In effect, you are going to stream results from the database with Entity Framework.

To achieve this you need to do two things: disable tracking, and not materialize your data inside the action method.

Turn off tracking
By default Entity Framework tracks entities that you read from a database. Tracking allows EF to determine what, if any, properties have changed since being loaded, then EF can save just the relevant changes. But tracking takes up memory and CPU.

If you have no intention of changing these entities there is no point in tracking them.
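
To make that concrete, here is a minimal sketch of what tracking is for when you do intend to change an entity (assuming the SalesContext and Product types used elsewhere in this post):

var product = _salesContext.Products.First(p => p.ProductId == 1); // tracked by default
product.Price = 9.99m;            // the change is detected on the tracked entity
_salesContext.SaveChanges();      // EF generates an UPDATE for the change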

There are two ways of turning off tracking: at the point where you make the request to the database, or globally for the whole context.

To use AsNoTracking for a single request it looks like this –
_salesContext.Products.AsNoTracking().Where(..)

To use it for all requests to that context, set the QueryTrackingBehavior to NoTracking in the constructor of the context.

public SalesContext(DbContextOptions<SalesContext> options) : base(options)
{
	ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;
}

That’s the first step taken care of; now we make sure not to turn our data into objects inside the controller.

Do not materialize
If you follow tutorials on Entity Framework Core and Web API you will see examples like this –

[HttpGet]
public async Task<ActionResult> Get()
{
    List<Product> products = await _salesContext.Products.ToListAsync();

    return Ok(products);
}

In this example, all the products in the database are read and put into a list of products; the return does not execute until after all the data has been read, so you are waiting for this to complete before getting any results. You are also storing the whole list of products in memory (and probably tracking the entities too).

Instead of that you can do the following –

[HttpGet]
public ActionResult GetStreaming()
{
	IQueryable<Product> products = _salesContext.Products.AsNoTracking();

	return Ok(products);
}

Now, the products do not materialize, there is no list to store in memory, tracking is turned off and the action method begins returning data almost immediately.

Your memory profile will be almost the same whether you are returning a few hundred or a few million rows of data.

Example of Memory Usage
Below are screenshots from Visual Studio 2017 of the application running, showing the amount of memory consumed for a variety of requests. The first shows the memory usage when I requested a single row from the database.

The following images show the memory usage when streaming and not streaming result sets of 100,000, 200,000, 300,000 and 500,000 rows.

106 MB used for a single row.

100,000 rows returned. On the left is with streaming, on the right without.

200,000 rows returned. On the left is with streaming, on the right without.

300,000 rows returned. On the left is with streaming, on the right without.

500,000 rows returned. On the left is with streaming, on the right without.

Summary of Results

It’s obvious that streaming maintains a uniform memory footprint while the memory consumed for non-streaming grows with the number of rows returned.

What you don’t see here is that streaming requests complete more quickly. If you download the attached code and use a local mdf file, you probably won’t see much of a difference in speed, but if your application connects to a remote database you will see approximately a 20% improvement in speed.

Summary
For streaming to work you need to turn off tracking of entities and return an IQueryable from the action method.

Full source code here.
The code provided will not compile until you make a change in seeder.cs; as written it generates 500,000 rows in a local db. Set this to whatever value you want.

Saving Enums with Entity Framework Core

Full source code here.

A few years ago I wrote a post about saving enums to the database with Entity Framework. It was able to save the enum as a string to the database and when reading from the database it was able to take that string and populate the enum correctly. It worked fine but felt a bit hacky.

With Entity Framework Core there is a neater and nicer way to do this using value conversions.

Let’s say we have an Address with two enums – AddressType and DeliveryPreference.

public partial class Address
{
    public int AddressId { get; set; }
    public string Line1 { get; set; }
    public string Line2 { get; set; }
    public AddressType AddressType { get; set; }
    public DeliveryPreference DeliveryPreference { get; set; }
}

We can save the enums to the database as either strings or ints, and when reading them back populate the enum! Here’s how to do both.
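
The enum definitions themselves aren’t shown in this post, so here is a minimal sketch of what they might look like – the member names and values are assumptions, only the type names come from the Address class above:

public enum AddressType
{
    Residential = 1, // assumed members, for illustration only
    Business = 2
}

public enum DeliveryPreference
{
    Standard = 1, // assumed members, for illustration only
    Express = 2
}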

The table
In my table, the AddressType is stored as an int and the DeliveryPreference is stored as a string.

When you run the application it should create the database and table for you, but in case you don’t have your permissions set up correctly, here’s the script to create it.

CREATE TABLE [dbo].[Address] (
    [AddressId]          INT            IDENTITY (1, 1) NOT NULL,
    [Line1]              VARCHAR (50)   NOT NULL,
    [Line2]              VARCHAR (50)   NOT NULL,
    [AddressType]        INT            NOT NULL,
    [DeliveryPreference] VARCHAR (50) NOT NULL,
    CONSTRAINT [PK_Address] PRIMARY KEY CLUSTERED ([AddressId] ASC)
);

Saving an enum as a string
First, let’s look at saving the DeliveryPreference enum as a string.

In the context we build the entity, I have the usual things you would expect for AddressId, Line1 and Line2. But DeliveryPreference is a little different.

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Address>(entity =>
    {
        entity.Property(e => e.AddressId).ValueGeneratedOnAdd();

        entity.Property(e => e.Line1)
            .IsRequired()
            .HasMaxLength(50)
            .IsUnicode(false);

        entity.Property(e => e.Line2)
            .IsRequired()
            .HasMaxLength(50)
            .IsUnicode(false);

        entity.Property(e => e.DeliveryPreference) 
            .HasMaxLength(50)
            .HasConversion(x => x.ToString(), // to converter
                x => (DeliveryPreference) Enum.Parse(typeof(DeliveryPreference), x));// from converter

	    //snip..

The DeliveryPreference property uses the HasConversion method, passing it two parameters.
The first parameter is the convert-to-provider expression, which, as the name suggests, converts the value in our object to the type you will store in the database.

And the second is the convert-from-provider expression; it converts the type in the database to the type in your class.

In this example, I convert the enum to a string to store in the database, and when retrieving from the database I parse the stored string back to a DeliveryPreference enum.
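
As an aside, if you are on EF Core 2.1 or later, the same string conversion can usually be expressed with the built-in converter instead of writing both lambdas yourself – a sketch, worth verifying against your EF Core version:

entity.Property(e => e.DeliveryPreference)
    .HasMaxLength(50)
    .HasConversion<string>(); // uses EF Core's built-in enum-to-string conversion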

Saving an enum as an int
For the second enum, AddressType, I don’t explicitly need to convert it from an enum to an int; Entity Framework will do this automatically (as pointed out to me in this tweet). But I’m including it as an example in case you need something like this for some other explicit conversion.

In this example, I cast the enum to an int to store in the database, and when retrieving from the database I cast the stored int to an AddressType enum.

// this is a continuation of the OnModelCreating method
        entity.Property(e => e.AddressType)
            .HasConversion(x => (int) x, x => (AddressType) x);

Full source code here.

Performance Comparison of Entity Framework Core 2.1 and Dapper 1.5

tl;dr – ignore most (maybe all) of the posts out there comparing Dapper and Entity Framework performance, you need to measure it yourself.
Here’s why –
1. Some are angry opinion pieces from people who don’t like one technology or the other and clearly haven’t run any tests.
2. All are out of date (as this one will be shortly) because the libraries move so quickly.
3. None are running against the database and network you have.
4. Your standard queries are far more important than whatever arbitrary queries they used.

Longer Version
I recently had to decide between a few different ORMs for a project. The specifics of the project are irrelevant, but I was using a Postgres database that had lots of data (my lots and your lots are probably different; that I had hundreds of millions of records should not matter to you).

Most of the posts comparing Dapper and EF that I read came down on the side of Dapper, some by a tiny margin and one by no margin but a lot of bluster.

Rather than trust any of these posts I wrote my own benchmarking application using the BenchmarkDotNet library.

I picked seven of the most representative queries I was making and coded them up in both Dapper and EF. Some of the queries pulled back tens of thousands of records and some brought back scores of records, but the most important thing was that these were queries I was going to run in the finished application.

BenchmarkDotNet takes care of things like making sure the code has been jitted and everything has been “warmed up”, and then it runs the same request repeatedly to get a useful average.

On top of this, I ran the whole suite of benchmarking tests multiple times, at different times of the day, because my application does not live in a vacuum; I needed to know how it would react when the network or database is under more or less load.
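
For anyone who wants to do something similar, here is a rough sketch of the shape of a BenchmarkDotNet class – the entity names, query bodies and connection strings are placeholders, not the actual queries I ran:

using System.Collections.Generic;
using System.Data;
using System.Linq;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;
using Dapper;
using Microsoft.EntityFrameworkCore;
using Npgsql;

public class OrmBenchmarks
{
    private SalesContext _context;     // placeholder EF Core DbContext
    private IDbConnection _connection; // placeholder connection used by Dapper

    [GlobalSetup]
    public void Setup()
    {
        var options = new DbContextOptionsBuilder<SalesContext>()
            .UseNpgsql("Host=myserver;Database=sales;Username=me;Password=secret") // placeholder
            .Options;
        _context = new SalesContext(options);
        _connection = new NpgsqlConnection("Host=myserver;Database=sales;Username=me;Password=secret"); // placeholder
    }

    [Benchmark]
    public List<Product> EF_Query1() =>
        _context.Products.AsNoTracking().Where(p => p.Price > 10).ToList();

    [Benchmark]
    public List<Product> Dapper_Query1() =>
        _connection.Query<Product>("SELECT * FROM products WHERE price > 10").ToList();

    // ...repeat for the other representative queries
}

public class Program
{
    public static void Main() => BenchmarkRunner.Run<OrmBenchmarks>();
}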

At the end of all this Entity Framework came out on top; for the vast majority of test runs it performed better.

Here are the results of one of the test runs. I didn’t pick the best or the worst, but this one is indicative of what I saw for the majority of tests.

Method        | Mean      | Error      | StdDev      | Median
EF_Query1     | 431.19 ms | 13.9615 ms | 38.1581 ms  | 445.98 ms
Dapper_Query1 | 689.88 ms | 35.9716 ms | 102.7689 ms | 668.13 ms
EF_Query2     | 45.21 ms  | 1.5613 ms  | 4.6816 ms   | 46.35 ms
Dapper_Query2 | 61.55 ms  | 1.7915 ms  | 5.0850 ms   | 62.19 ms
EF_Query3     | 278.72 ms | 51.4681 ms | 150.8833 ms | 351.38 ms
Dapper_Query3 | 298.64 ms | 7.3619 ms  | 20.2561 ms  | 297.15 ms
EF_Query4     | 17.97 ms  | 0.3570 ms  | 0.6310 ms   | 17.93 ms
Dapper_Query4 | 26.59 ms  | 0.6516 ms  | 1.7592 ms   | 25.78 ms
EF_Query5     | 70.78 ms  | 1.4936 ms  | 4.0985 ms   | 71.21 ms
Dapper_Query5 | 116.71 ms | 2.5632 ms  | 7.4015 ms   | 116.36 ms
EF_Query6     | 189.71 ms | 50.1289 ms | 148.7825 ms | 311.21 ms
Dapper_Query6 | 248.61 ms | 7.2891 ms  | 21.1751 ms  | 304.16 ms
EF_Query7     | 266.62 ms | 6.2219 ms  | 20.3179 ms  | 281.27 ms
Dapper_Query7 | 304.27 ms | 7.8163 ms  | 15.1561 ms  | 301.21 ms

I was a little surprised at this result, but there you have it. EF Core 2.1 is better for me than Dapper.

Out of interest, I took a small portion of my data, put it into SQL Server, and re-ran the tests. This time, Dapper came out slightly ahead. But the SQL Server is on a different network than the Postgres db, is under different load, and had a much smaller dataset, so it was not a realistic comparison.

Conclusion
The moral of the story is you have to test performance yourself, there is absolutely no way you can extrapolate from a test someone wrote about in a blog (including this one) when judging how an ORM will perform for you.

Entity Framework Core, Calling Stored Procedures and Returning to a Model

Full source code available here.

I wrote a post some time back about calling a stored procedure with Entity Framework using the DbCommand, but it was a bit complicated and not that easy to use.

There is now a FromSql method; you just pass it the name of the stored procedure and the parameters. The method returns the results and maps them into a matching model.

The first step is to examine what the stored proc returns. I’m using the Northwind database and its CustOrderHist stored procedure; I know from the stored proc that it returns a string and an int.

I created a class to match –

public class OrderHistory
{
    private OrderHistory() { }

    public int Total { get; private set; }
    public string ProductName { get; private set; }
}
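
The OrderHistory class also has to be exposed on the context for FromSql to work; that wiring isn’t shown above, so here is a minimal sketch of what it might look like on EF Core 2.1 using a query type (on 2.0 you would instead map OrderHistory as a regular entity with a key):

public class NorthwindContext : DbContext
{
    public NorthwindContext(DbContextOptions<NorthwindContext> options) : base(options) { }

    // query type: keyless and read-only, but FromSql can be used against it
    public DbQuery<OrderHistory> OrderHistory { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Query<OrderHistory>();
    }
}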

In Entity Framework Core 2 you can call the stored proc like this –

List<OrderHistory> orderHistoryList = await _northwindContext.OrderHistory.FromSql($"EXECUTE dbo.CustOrderHist {customerId}").ToListAsync();

In earlier versions you can use this –

List<OrderHistory> orderHistoryList = await _northwindContext.OrderHistory.FromSql("EXECUTE dbo.CustOrderHist {0}", customerId).ToListAsync();

Full source code available here.

Using an mdf file database with Entity Framework Core 2 in Visual Studio 2017

Full source code available here.

If you want to play around with Entity Framework it can be a little frustrating to create a complex database with a lot of sample data.
Instead, you could download the Northwind database from Microsoft; it has plenty of tables, a bunch of views and a handful of stored procs. More than enough to try out many of the features of EF.

Now, all you have to do is figure out how to get access to the db from your application, and it’s not as simple as you might hope.
So here’s how to do it in Visual Studio 2017.

Step 1
Download the Northwind database – you can get it here https://www.microsoft.com/en-us/download/details.aspx?id=23654
Install it, it will create some files in c:\SQL Server 2000 Sample Databases.

Step 2
Copy Northwind.mdf and Northwind.ldf to your user directory, something like c:\users\bryan. The Northwind files have to be in that directory to work with the connection string below.

Step 3
Now in Visual Studio use the following connection string.

"Data Source=(LocalDB)\\MSSQLLocalDB;DataBase=Northwind;Integrated Security=True;Connect Timeout=30"

My sample application is a .NET Core Web Api. In startup.cs, add the following –

public void ConfigureServices(IServiceCollection services)
{
	services.AddDbContext<NorthwindContext>(options =>
		options.UseSqlServer("Data Source=(LocalDB)\\MSSQLLocalDB;DataBase=Northwind;Integrated Security=True;Connect Timeout=30"));
	services.AddMvc();
}

If you want to place the database files elsewhere in your filesystem, add an absolute filepath to the connection string.
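
For example, the connection string might then look something like this (the filepath is only an illustration):

"Data Source=(LocalDB)\\MSSQLLocalDB;AttachDbFilename=C:\\Databases\\Northwind.mdf;DataBase=Northwind;Integrated Security=True;Connect Timeout=30"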

Step 4
Add an Order.cs class, this represents the orders table from the Northwind database. You can of course add more classes to represent other tables in the Northwind database.

public class Order
{
	public int OrderId { get; set; }
	public string CustomerID { get; set; }
	public int EmployeeID { get; set; }
	public DateTime OrderDate { get; set; }
	// etc.
}

Step 5
Add a NorthwindContext class that inherits from DbContext and add the Order DbSet.

public class NorthwindContext : DbContext
{
	public NorthwindContext(DbContextOptions options) : base(options) { }
	public DbSet<Order> Orders { get; set; }
}

Step 6
In the controller, add the NorthwindContext to the constructor.

private readonly NorthwindContext _northwindContext;

public OrdersController(NorthwindContext northwindContext)
{
	_northwindContext = northwindContext;
}

Step 7
And finally, use the context in the Get method.

[HttpGet("{orderId}")]
public async Task<IActionResult> Get(int orderId)
{
	var order = await _northwindContext.Orders.Where(o => o.OrderId == orderId).SingleOrDefaultAsync();
	return Ok(order);
}

Full source code available here.

Performing a WHERE IN with Entity Framework or on a List

WHERE IN is a very useful and commonly used feature of SQL; it looks like this –

SELECT * FROM Orders WHERE OrderId IN (10248, 10249, 10250, 10251)

This will return up to four rows of data, showing just the orders that have an OrderId in the list you passed to the select statement.

You can do the same with Entity Framework by using the Contains predicate with a Where.

First you need to put the OrderIds you are looking for in some sort of enumerable.

IEnumerable<int> ordersToFind = new List<int> { 10248, 10249, 10250, 10251 };

Then you use Where on the Orders DbSet and check the orderIds against the list you just created.

var orders = _northwindContext.Orders.Where(o => ordersToFind.Contains(o.OrderId));

That’s all there is to it. You don’t have to use this with just Entity Framework; it also works on simple lists, arrays, enumerables, etc., as the sketch below shows.
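
For example, the same pattern against a plain in-memory list looks like this (a quick sketch, nothing Entity Framework-specific about it):

var allOrderIds = new List<int> { 10248, 10249, 10250, 10251, 10252, 10253 };
var matchingIds = allOrderIds.Where(id => ordersToFind.Contains(id)).ToList();
// matchingIds now contains 10248, 10249, 10250 and 10251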

Unit testing Entity Framework Core Stored Procedures

Full source code available here.

Entity Framework Core has made unit testing CRUD functions much easier; see here for an example of using the In Memory Database, which allows you to search, add, remove and update rows.

But the in memory database doesn’t support execution of stored procedures in any way. I’ve seen people suggest that execution of a stored procedure is an integration test, but I don’t agree with that. If testing a method that does CRUD is a unit test, then testing a method that calls a stored procedure must also be a unit test.

An anonymous colleague of mine came up with this technique.

The first thing to do is add a class that implements IAsyncEnumerable and IQueryable.

public class SpAsyncEnumerableQueryable<T> : IAsyncEnumerable<T>, IQueryable<T>
{
	private IAsyncEnumerable<T> _spItems;
	public Expression Expression => throw new NotImplementedException();
	public Type ElementType => throw new NotImplementedException();
	public IQueryProvider Provider => throw new NotImplementedException();

	public SpAsyncEnumerableQueryable(params T[] spItems)
	{
		_spItems = AsyncEnumerable.ToAsyncEnumerable(spItems);
	}

	public IEnumerator<T> GetEnumerator()
	{
		return _spItems.ToEnumerable().GetEnumerator();
	}

	IEnumerator IEnumerable.GetEnumerator()
	{
		return GetEnumerator();
	}

	IAsyncEnumerator<T> IAsyncEnumerable<T>.GetEnumerator()
	{
		return _spItems.GetEnumerator();
	}
}

Then add an extension class for DbSet; it’s DbSet because it has the FromSql method that calls stored procedures. MockFromSql takes a SpAsyncEnumerableQueryable of items that the stored proc will return.

public static class DbSetExtensions
{
	public static DbSet<T> MockFromSql<T>(this DbSet<T> dbSet, SpAsyncEnumerableQueryable<T> spItems) where T : class
	{
		var queryProviderMock = new Mock<IQueryProvider>();
		queryProviderMock.Setup(p => p.CreateQuery<T>(It.IsAny<MethodCallExpression>()))
			.Returns<MethodCallExpression>(x =>
			{
				return spItems;
			});

		var dbSetMock = new Mock<DbSet<T>>();
		dbSetMock.As<IQueryable<T>>()
			.SetupGet(q => q.Provider)
			.Returns(() =>
			{
				return queryProviderMock.Object;
			});

		dbSetMock.As<IQueryable<T>>()
			.Setup(q => q.Expression)
			.Returns(Expression.Constant(dbSetMock.Object));
		return dbSetMock.Object;
	}
}

Finally, the unit test looks like this.

[Fact]
public async Task Get_Results()
{
	//Arrange 
	var products = new SpAsyncEnumerableQueryable<Product>(new Product()
	{
		ProductId = Guid.NewGuid(),
		ProductName = "Some name",
		Size = 1,
		Value = 2
	}, new Product()
	{
		ProductId = Guid.NewGuid(),
		ProductName = "Some other name",
		Size = 3,
		Value = 4
	});

	var productContextOptions = new DbContextOptionsBuilder<ProductContext>()
		.UseInMemoryDatabase(databaseName: "Get results")
		.Options;

	ProductContext productContext = new ProductContext(productContextOptions);

	productContext.Products = productContext.Products.MockFromSql(products);

	ProductsController controller = new ProductsController(productContext);

	//Act
	IActionResult actionResult = await controller.Get();

	//Assert
	OkObjectResult okObjectResult = actionResult as OkObjectResult;
	Assert.NotNull(okObjectResult);
	List<Product> productResponse = okObjectResult.Value as List<Product>;

	Assert.Equal(2, productResponse.Count);
}

Full source code available here.