Test Framework for ADO.NET
December 23, 2004

Table of Contents

1.0 Overview
2.0 Requirements
3.0 Design
    3.1 Various Components
    3.2 Low Level Design
4.0 Setup
    4.1 Database Setup
    4.2 Configuration
    4.3 How to add new databases / tables / values
5.0 DataReader tests
    5.1 Configuration
    5.2 Control Flow
    5.3 How to add new queries
6.0 DataAdapter tests
    6.1 Configuration
    6.2 Control Flow
    6.3 How to add new queries
7.0 Running the tests
8.0 ToDo List

1.0 Overview

This framework aims at testing the various database providers using a
common set of test cases and comparing the results against a static list.
The framework abstracts the connection, table creation and other user
operations.

2.0 Requirements

These are the high-level requirements:

* The framework should abstract the database, tables, SQL queries and
  other user operations.
* The framework should provide for the setup of the database tables and
  stored procedures required before the actual tests start.
* The framework should provide for comparison of results.
* One should be able to run the tests against any or all of the databases.
* The framework should be extensible to newer databases.

3.0 Design

3.1 Various Components

The various components of the framework are:

a) Setup / Teardown of database Tables / Stored Procedures:
   This component is used to create a test database for each of the
   databases to be tested. It creates tables / stored procedures / views
   based on the testing requirements. After creating the tables, it also
   populates them with values (including negative/border values).

b) Tests for DataProvider Code:
   The tests in this component connect to the database, create commands,
   execute them and retrieve results. In this component the retrieved
   results are placed in a DataReader and are processed from there
   directly.

c) Tests for System.Data Code:
   These tests are similar to the tests in (b), except that the results
   retrieved from the database are populated into a DataSet, which
   represents an in-memory cache of data. Changes made to this DataSet
   are reconciled with the database.

d) Results comparison:
   This does not exist as an independent component, but is an inevitable
   part of (b) and (c). It involves comparing the values retrieved from
   the database with the actual values entered.

3.2 Low Level Design

Setup/Teardown of database tables/stored procedures:

* All the operations to be performed on the database (such as creating
  and dropping tables and inserting values) are specified in a config
  file. All the data necessary for carrying out these operations (such as
  the connection parameters, table descriptions, column values and column
  types) is also specified in this config (XML) file.
* The first column of each table is the serial number and is essentially
  the primary key. This is done to enforce ordering during data
  retrieval.
* If any constraints have to be defined, they also have to be added to
  this config file.
* Generic stored procedures (those that are the same for all tables, e.g.
  retrieving the values from a table) are also described in the config
  file, in the form of a template. This template has appropriate tags for
  table names and columns. By replacing these tags with actual names we
  get the script for a stored procedure. These kinds of stored
  procedures, if defined, are created for all the tables described in the
  config file.
* For stored procedures using two or more tables, appropriate tags have
  to be mentioned in the template. For example, the tag [[TAB2]] will
  have to be replaced by the name of the second table, the tag
  [[TAB2-COL1]] by the name of the first column of the second table, and
  so on. This is yet to be implemented.
* For tests involving a DataAdapter and DataSet, the changes that need to
  be made to the DataSet and reconciled back to the database are also
  specified in the config file.

Data Provider Tests:

There is a base class which defines the members and methods that are
common to all databases. This class essentially contains members to hold
the connection, commands, data readers, the DataSet, the data adapter,
etc., and has methods to query / update the database and to compare the
retrieved data against the actual data entered (as given in the config
file). From this base class we then derive a class specific to each
database; these derived classes contain the code specific to each
database.
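A minimal C# sketch of this layout is given below. The class and member
names (BaseDatabaseTest, CreateConnection, etc.) are illustrative only and
are not taken from the actual framework; the ODBC provider is used purely
as an example of a database-specific derived class.

    using System;
    using System.Data;
    using System.Data.Odbc;

    public abstract class BaseDatabaseTest
    {
        // Members common to all database-specific test classes.
        protected IDbConnection connection;
        protected IDbCommand command;
        protected IDataReader reader;
        protected DataSet dataset = new DataSet ();
        protected IDbDataAdapter adapter;

        // Each derived class supplies its own provider-specific
        // connection; the connection string comes from the config file.
        protected abstract IDbConnection CreateConnection (string connectionString);

        public void Setup (string connectionString)
        {
            connection = CreateConnection (connectionString);
            connection.Open ();
            command = connection.CreateCommand ();
        }

        // Compares a value read back from the database with the value
        // that the config file says was inserted.
        protected bool CompareValue (object retrieved, object expected)
        {
            return retrieved.ToString ().Equals (expected.ToString ());
        }

        public void TearDown ()
        {
            if (connection != null)
                connection.Close ();
        }
    }

    // Database-specific class; ODBC is used here only as an illustration.
    public class OdbcDatabaseTest : BaseDatabaseTest
    {
        protected override IDbConnection CreateConnection (string connectionString)
        {
            return new OdbcConnection (connectionString);
        }
    }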
4.0 Setup

4.1 Database Setup

As mentioned in section 3.2, all the database operations and the required
data have to be specified in a config file. Right now, this framework
supports only four databases:

* MySql
* MsSql
* Oracle
* Postgres

4.2 Configuration

The configuration file is an XML file in which there is a configuration
section corresponding to each database. Under each such section we have
the following data:

1) User operations: create/drop tables or insert data
2) Connection parameters: connection strings (for connecting through ODBC
   and in the normal way)
3) Table definition:
   a) Table name
   b) Names and types of columns
   c) Any other constraints
4) Data to be inserted into each table:
   a) Table number in which to enter the data
   b) Number of rows
   c) Value to be entered in each column of the various rows
5) Stored procedures:
   a) Type of stored procedure (only the 'generic' type is supported as
      of now)
   b) Name of the stored procedure
   c) Number of statements it has
   d) Template of the stored procedure body

4.3 How to add new databases / tables / values

For adding a new database, appropriate data has to be added to the config
file. Code also needs to be added for connecting to the new database
using the appropriate provider. Adding a new table to an existing
database, or additional rows to an existing table, is just a config file
change. While changing the config file, one must make sure that the
numbers of tables, columns and rows are consistent with the data.

5.0 DataReader tests

5.1 Configuration

Apart from the configuration parameters listed in section 4.2, here the
config file also contains the queries to be run on each table.

5.2 Control Flow

The control flow goes like this:

1) Create a database connection and command.
2) Run each of the queries and retrieve the data using a DataReader.
3) Parse the query to find out which columns were selected and from which
   table.
4) Verify the data in the DataReader against the values in the config
   file.

5.3 How to add new queries

Add appropriate entries to the config file.

6.0 DataAdapter tests

6.1 Configuration files

In addition to the configuration parameters mentioned in section 5.1, we
also have a list of changes to be made to each table. These are the
changes that have to be made to a DataSet before updating the database. A
change description contains the row and column numbers to be changed and
the value to be set.

6.2 Control Flow

The control flow goes like this (a sketch follows this list):

1) Create a database connection and command.
2) Fill a DataSet by running each of the queries specified in the config
   file.
3) Parse the query to find out which columns were selected and from which
   table.
4) Verify the data in the DataSet against the values in the config file.
5) Select all data from each table.
6) Make the appropriate changes to the DataSet and update the database.
7) Clear and fill the DataSet again.
8) Verify the data in the DataSet against the values in the config file,
   taking the changes made into account.
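Below is a minimal C# sketch of this flow, again using the ODBC provider
with a hypothetical table name ('tbl_test') and connection string; in the
framework itself all of these values come from the config file, and the
verification compares against the config data instead of printing.

    using System;
    using System.Data;
    using System.Data.Odbc;

    public class DataAdapterFlowSketch
    {
        public static void Main ()
        {
            string connectionString = "DSN=test;UID=user;PWD=password"; // assumed
            string query = "SELECT * FROM tbl_test";                    // assumed

            using (OdbcConnection connection = new OdbcConnection (connectionString)) {
                connection.Open ();

                // Steps 1-2: create the adapter and fill the DataSet.
                OdbcDataAdapter adapter = new OdbcDataAdapter (query, connection);
                OdbcCommandBuilder builder = new OdbcCommandBuilder (adapter);
                DataSet dataset = new DataSet ();
                adapter.Fill (dataset, "tbl_test");

                // Step 4: verify the retrieved values (here just printed;
                // the framework compares them with the config file).
                foreach (DataRow row in dataset.Tables ["tbl_test"].Rows)
                    Console.WriteLine (row [0]);

                // Step 6: change a value in the DataSet and push it back
                // to the database.
                dataset.Tables ["tbl_test"].Rows [0] [1] = "changed value";
                adapter.Update (dataset, "tbl_test");

                // Steps 7-8: clear, re-fill and verify again.
                dataset.Clear ();
                adapter.Fill (dataset, "tbl_test");
            }
        }
    }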
6.3 How to add new queries

Config file change.

7.0 Running the tests

The setup and the tests can be run by issuing 'make run-test' in the
'System.Data/Test/DataProviderTests' directory. This will first set up
the databases and then run the tests for data readers and data adapters.
Issuing 'make run-test' in either of the subdirectories 'datadaptertests'
or 'datareadertests' will run the tests for DataAdapters or DataReaders
only. To run the tests for a specific database, issue
'make DATABASE=<database-name> run-test'.

8.0 ToDo List

* Make the framework more extensible.
* Right now the framework understands only simple queries (like SELECT),
  and querying from only one table is supported as of now. The framework
  should be able to understand other kinds of queries too.
* Only generic stored procedures (common to all tables) are supported as
  of now. We might want to extend this to be able to write stored
  procedures spanning more than one table, etc.
* Integrate with the daily tests.
* Better error reporting.