So Microsoft isn't going to keep developing Linq to SQL (L2S) (see "Microsoft Kills Linq to SQL" for more info), and for that reason you may not want it going into a production build, but there are still some situations where it can be really useful, and one of those is using L2S for testing your data access logic.
This post aims to talk you through a way of using L2S to confirm that your data access logic actually does what you think it does.
A really useful tool that ships with L2S is called SqlMetal. It automatically generates a data context from the current schema of a given database. So instead of hand-crafting a model in the designer front end, you can generate code for the whole database schema, including stored procedures and views, and you can restrict what gets generated simply by omitting the relevant command-line switch. To do this every time your application builds, amend your project file with the following, ensuring that you replace the text in [] with your specific values:
<Target Name="BeforeBuild">
  <Exec Command="&quot;C:\Program Files\Microsoft SDKs\Windows\v6.0A\bin\SqlMetal.exe&quot; /server:[ServerName] /database:[DatabaseName] /code:[MyDataContext].cs /context:[MyDataContext] /namespace:[MyCompany].[MyProject].DataAccess /views /sprocs /functions /pluralize" WorkingDirectory="$(ProjectDir)" />
</Target>
The /views, /sprocs and /functions switches expose all of the database's views, stored procedures and functions respectively. The /pluralize switch generates plural names for table collections, so a table called "Customer" will be exposed as "Customers" on the data context. For the full list of switches, see the SqlMetal documentation on MSDN. Once you've modified the project file, reload it, and the next time the project builds it will generate a data context for you with as much of your database exposed as you would like.
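Once generated, the context behaves like any hand-built L2S data context. As a quick illustration (the MyDataContext name matches the /context switch above, but the Customer table and its Name column are assumed examples, not from any real schema):

// Illustrative sketch only: assumes the generated context is called MyDataContext
// and the database has a Customer table with a Name column.
// Requires System, System.Linq and the namespace passed to /namespace.
using (var db = new MyDataContext("Data Source=.;Initial Catalog=[DatabaseName];Integrated Security=True"))
{
    // Thanks to /pluralize, the Customer table is exposed as the Customers property.
    foreach (var customer in db.Customers)
    {
        Console.WriteLine(customer.Name);
    }
}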
So now that we have our data context we can put it to good use when testing our data access logic. For example, take the following data portal insert method from a CSLA business object, which will be called when our business object is new and the Save method is called:
protected override void DataPortal_Insert()
{
    // Get the connection via CSLA's ConnectionManager so it can be shared with
    // other data access code in the same context; dispose the manager, not the
    // connection itself, so the manager's reference counting stays correct.
    using (var ctx = ConnectionManager<SqlConnection>.GetManager("MyDatabase"))
    using (var cmd = new SqlCommand())
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Connection = ctx.Connection;
        cmd.CommandText = "usp_CreateEmployee";
        cmd.Parameters.AddWithValue("@FirstName", ReadProperty(FirstNameProperty));
        cmd.Parameters.AddWithValue("@LastName", ReadProperty(LastNameProperty));
        cmd.Parameters.AddWithValue("@EmailAddress", ReadProperty(EmailAddressProperty));
        cmd.ExecuteNonQuery();
    }
}
This method quite simply executes a stored procedure called "usp_CreateEmployee", passing in parameters taken from the business object. What we want to validate is that the parameters are correctly passed from our business object into the stored procedure, and that the stored procedure correctly inserts a new employee with the details set on the business object. What we don't want is to write a pile of plumbing code in order to do this; enter L2S.
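The test below assumes a TestDataContext field, a CompanyId value and a CreateConnection() helper that aren't shown in this post; a minimal sketch of that scaffolding, with hypothetical names, might look like this:

[TestClass]
public class EmployeeDataTests
{
    // Hypothetical scaffolding; none of these members appear in the post itself.
    private MyTestDataContext TestDataContext;
    private const int CompanyId = 1; // assumes an existing company row in the test database

    private static SqlConnection CreateConnection()
    {
        // Assumes a "MyDatabase" connection string in the test project's config file
        // (requires a reference to System.Configuration).
        return new SqlConnection(
            ConfigurationManager.ConnectionStrings["MyDatabase"].ConnectionString);
    }
}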
So now back in our unit test we can do the following:
[TestMethod]
public void EmployeeIsCreatedInDatabase()
{
    TestDataContext = new MyTestDataContext(CreateConnection());
    int initialCount = TestDataContext.Employees.Count();

    const string FirstName = "TestFirstName";
    const string LastName = "TestLastName";
    const string EmailAddress = "test@test.com";

    // Create the business object and set our test values on it.
    var employee = Employee.NewEmployee(CompanyId);
    employee.FirstName = FirstName;
    employee.LastName = LastName;
    employee.EmailAddress = EmailAddress;

    // Call Save on the business object; this executes the data portal logic
    // and inserts a new row in the database with the new values. CSLA's Save
    // returns the post-save object, so we capture the result.
    employee = employee.Save();

    // Refresh our data context with current values from the database.
    TestDataContext.Refresh(RefreshMode.OverwriteCurrentValues);

    // Get the new employee table count and assert it has increased by 1.
    int newCount = TestDataContext.Employees.Count();
    Assert.AreEqual(initialCount + 1, newCount);

    // Now get the newest Linq Employee object directly from the data context
    // and verify that all of the fields have been populated as we expected.
    LinqEmployee linqEmployee = TestDataContext.Employees.OrderByDescending(e => e.ID).FirstOrDefault();
    Assert.IsNotNull(linqEmployee);
    Assert.AreEqual(FirstName, linqEmployee.FirstName, "Expected first names to match");
    Assert.AreEqual(LastName, linqEmployee.LastName, "Expected last names to match");
    Assert.AreEqual(EmailAddress, linqEmployee.Email, "Expected email addresses to match");
}
As you can see from the above code, the test creates a TestDataContext L2S object, takes an initial count of the rows in the Employee table, and defines our test values as constants.
Next we create a new employee business object and populate it with the test data. We then call the Save method on the business object; this is the method we are actually testing, and we want to ensure that it correctly passes the values set on the business object down to our data access layer, which in turn inserts the appropriate data into the relevant tables.
We can then verify that the Save method has done what it needed to do by using our TestDataContext: we refresh it with values from the database and then perform some assertions. Firstly, we check that the employee count has increased by one, which confirms that an insert was performed on the table. Next, we sort the employees by Id in descending order so that we get the row with the highest identity value generated by the table; our new employee should be the last row inserted. (Unfortunately L2S cannot translate the Last() operator into SQL, so we have to use this order-by-descending approach instead.) We then take that L2S object and assert that the data we set on our business object has been written to each of its properties.
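To make that last point concrete: calling Last() against a L2S table throws a NotSupportedException at runtime because the provider cannot translate it into SQL, whereas the descending-order query translates cleanly:

// Throws NotSupportedException at runtime: L2S cannot translate Last() into SQL.
// var newest = TestDataContext.Employees.Last();

// Works: L2S translates this to SELECT TOP (1) ... ORDER BY [ID] DESC.
var newest = TestDataContext.Employees
    .OrderByDescending(e => e.ID)
    .FirstOrDefault();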
Conclusion
Linq to SQL may no longer be supported by Microsoft, and as such it might not be suitable for production-ready systems, but it can still be a really useful tool, especially for data tests that check our data access layers are behaving as we expect. I hope you find this post useful and that it helps you get a bit more data access testing done. Let me know your comments.