Optimize SQL Server Data Access
Numerical test data with real-world data objects can help you choose a data-access methodology that maximizes the database's overall performance.
by Roger Jennings

November 2003 Issue

Technology Toolbox: VB.NET, SQL Server 2000, SQLXML 3.0 SP1, Access 2000

Moving your data-access applications to ADO.NET and XML Web services from traditional ActiveX Data Objects (ADO) and OLE DB providers presents new challenges for you and your organization's database architects and database administrators (DBAs). The .NET Framework's System.Data.SqlClient namespace and the SQLXML 3.0 update to SQL Server 2000's initial XML-oriented feature set offer a multitude of create, retrieve, update, and delete (CRUD) options. The middle-tier data-access choices you make play a pivotal role in database server load and scalability, so you and your IT managers, system architects, and DBAs must be aware of potential performance pitfalls, especially when adopting new service-oriented architecture (SOA) technology for business processes.
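For reference, here's a minimal sketch of one such SqlClient option: a parameterized SqlCommand executed through a forward-only SqlDataReader. The connection string and the Orders query are assumptions based on the Northwind sample database used later in this article, not code from the test suite:

Imports System
Imports System.Data
Imports System.Data.SqlClient

Public Class OrderReader
    ' Retrieve a single order row with a forward-only SqlDataReader.
    Public Sub DumpOrder(ByVal orderId As Integer)
        Dim cnn As New SqlConnection( _
            "Data Source=(local);Initial Catalog=Northwind;" & _
            "Integrated Security=SSPI")
        Dim cmd As New SqlCommand( _
            "SELECT OrderID, CustomerID, OrderDate FROM Orders " & _
            "WHERE OrderID = @OrderID", cnn)
        cmd.Parameters.Add("@OrderID", SqlDbType.Int).Value = orderId
        cnn.Open()
        Try
            Dim rdr As SqlDataReader = cmd.ExecuteReader()
            While rdr.Read()
                Console.WriteLine("{0} {1} {2}", rdr(0), rdr(1), rdr(2))
            End While
            rdr.Close()
        Finally
            cnn.Close()   ' Always release the pooled connection
        End Try
    End Sub
End Class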


ADO.NET documentation, books, and white papers use adjectives such as "improved," "better," or "best" in offering general guidance on data-access performance, but comparative performance numbers for real-world data structures are few and far between. I'll describe a test regimen that emulates a typical business process and provide performance comparisons of several ADO.NET and SQLXML 3.0 data-access methods (see Table 1). You can use the recommendations I base on the test results to ensure that your data-access applications don't cause database performance bottlenecks.

Microsoft's "Designing Data Tier Components and Passing Data Through Tiers" white paper describes a set of patterns and best practices for ADO.NET-based components that connect to SQL Server 2000 as the data tier (see Additional Resources). The white paper distinguishes middle-tier data access logic components (DALCs), which connect directly to the database for CRUD operations, from business entities (BEs), which represent instances of traditional business objects—products, customers, sales orders, invoices, insurance claims, and the like. Mapping relational source data to a strongly typed, hierarchical BE class usually is a straightforward process. You can persist BEs at any point in the business process as XML Information Set files with corresponding XML schemas for disconnected (offline) processing. DALCs isolate BEs from dependence on the database schema and the CRUD methodology you choose. Schema- and data-access–independence is especially important when you adopt new and untried database-connectivity approaches.

Business process components (BPCs) manipulate BEs through their own set of CRUD operations, which usually incorporate business rules and often rely on secondary or third-party data sources. The most common secondary data sources are customer account, product catalog, and inventory information, which often come from other databases or servers and in some cases are stored locally to minimize server round trips. Typical third-party data sources are charge-card verification, business credit reporting, and insurance-claim adjusting services.

The BE for this article's performance tests is a strongly typed Order object with a hierarchical OrderDetails member that's mapped to the Northwind or NorthwindCS sample database's Orders and Order Details tables (see the sidebar, "Build the Order Object"). The Microsoft white paper uses a slightly modified version of Northwind for its examples. All test DALCs (including XML Web services) create or consume Order type instances and declare Order type members as public for compatibility with .NET's document/literal Web services.
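The sidebar walks through building the type; as a rough sketch, an Order BE with public fields named after the Orders and Order Details columns looks something like this (the exact member list is an assumption; see the sidebar for the real definition):

Imports System

' Public fields (rather than properties) keep the type compatible
' with document/literal Web services, as noted above.
Public Class Order
    Public OrderID As Integer
    Public CustomerID As String
    Public EmployeeID As Integer
    Public OrderDate As DateTime
    Public RequiredDate As DateTime
    Public ShippedDate As DateTime
    Public ShipVia As Integer
    Public Freight As Decimal
    Public OrderDetails() As OrderDetail   ' Hierarchical child rows
End Class

Public Class OrderDetail
    Public OrderID As Integer
    Public ProductID As Integer
    Public UnitPrice As Decimal
    Public Quantity As Short
    Public Discount As Single
End Class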

Drive DALCs With a Test Harness
The test regimen uses a WinForms test-orchestration client to program execution of the DALCs on three test computers and generate comparative performance data (see Figure 1). The client emulates a simple BPC that retrieves, creates, and updates Order business entities. (The tests don't include deletions because deleting an order is a direct operation that doesn't involve Order objects.) The client executes a sequence of operations that selects one of three DALC types (GetOrder, UpdateOrder, or InsertOrder) and invokes it 100 or more times to obtain an average execution time for a test group. The UpdateOrder and InsertOrder DALCs invoke GetOrder to obtain an Order instance to modify or add to the tables, so the group's average GetOrder time is subtracted from each UpdateOrder or InsertOrder execution time. The first group's data is discarded because initial execution times might include establishing a database connection or instantiating an XML Web service. Averaging data for batches of 10 or more test groups reduces the test data's standard deviation substantially. Repeatability of individual batch timing data is within about 2 percent when tests run with random selections from a set of about 150,000 Orders and 450,000 Order Details records. An SQL Server table on the client stores average trial and batch times for analysis.
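The harness's source isn't reproduced here, but a minimal sketch of the per-group timing logic, with hypothetical GetOrder and UpdateOrder signatures inferred from the description above, might look like this (it reuses the Order sketch shown earlier):

Imports System

' Hypothetical DALC contract inferred from the article's description.
Public Interface IOrderDalc
    Function GetOrder(ByVal orderId As Integer) As Order
    Sub UpdateOrder(ByVal orderId As Integer)
End Interface

Public Class TestGroup
    Private Const Trials As Integer = 100
    Private rnd As New Random()

    ' Draw random keys from the ~150,000-row Orders test table;
    ' the key range here is illustrative only.
    Private Function PickRandomOrderId() As Integer
        Return rnd.Next(1, 150001)
    End Function

    ' Average UpdateOrder time for one test group, net of the
    ' GetOrder call that UpdateOrder makes internally.
    Public Function AverageUpdateMs(ByVal dalc As IOrderDalc) As Double
        ' First, average GetOrder alone so its cost can be factored out.
        Dim i As Integer
        Dim getStart As DateTime = DateTime.Now
        For i = 1 To Trials
            dalc.GetOrder(PickRandomOrderId())
        Next
        Dim avgGetMs As Double = _
            DateTime.Now.Subtract(getStart).TotalMilliseconds / Trials

        ' UpdateOrder invokes GetOrder internally, so subtract avgGetMs.
        Dim updStart As DateTime = DateTime.Now
        For i = 1 To Trials
            dalc.UpdateOrder(PickRandomOrderId())
        Next
        Dim avgUpdMs As Double = _
            DateTime.Now.Subtract(updStart).TotalMilliseconds / Trials
        Return avgUpdMs - avgGetMs
    End Function
End Class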
