
Dependency Injection pattern in database development

24 Jun

To get familiar with the Dependency Injection pattern,
see the Wikipedia article here:
http://en.wikipedia.org/wiki/Dependency_injection

The primary focus of a database is to persist data and guarantee data integrity… well, at least we can say that about many databases. As a result, database development evolved in its own procedural way, and it is hard to argue against that way or to change it, because in most cases database designs tend to harden, solidify, and resist change after they get built. Besides, who wants a change when, over the years, masses of applications got built on that-old-table? Just try to change it and wait for something to fail…
Whatever reason you have for a change, try the Dependency Injection pattern. This pattern injects the “depended-on” code into the destination code automatically, by knowing the requirements of the destination code and altering its expected outcome. Let’s take the very popular BOO and FOO templates for our example:

CREATE PROCEDURE dbo.sp_BOO_manager
AS
BEGIN
   IF dbo.FOO() < dbo.FOOTheshold()
      EXEC dbo.BOO 'Have to do BOO logic';
   ELSE
      EXEC dbo.MOO 'Have to do MOO logic, instead';
END

Here is how to introduce dependency injection into our code:

  • Create the following two schemas: INTERFACE and PROD_IMPLEMENTATION:
CREATE SCHEMA INTERFACE;
CREATE SCHEMA PROD_IMPLEMENTATION;
  • Move the actual functions and procedures with business logic to the PROD_IMPLEMENTATION schema, like this one:
CREATE PROCEDURE PROD_IMPLEMENTATION.BOO
@Message VARCHAR(50)
AS
BEGIN
    PRINT 'Very complicated proprietary logic';
END
  • Create synonyms in the INTERFACE schema that point to the functions and stored procedures in the PROD_IMPLEMENTATION schema. Note that the synonyms can have the same names as the objects they represent, because they are located in a different schema than the actual business logic. For instance:
CREATE SYNONYM INTERFACE.BOO
   FOR PROD_IMPLEMENTATION.BOO;
CREATE SYNONYM INTERFACE.MOO
   FOR PROD_IMPLEMENTATION.MOO;
CREATE SYNONYM INTERFACE.FOO
   FOR PROD_IMPLEMENTATION.FOO;
CREATE SYNONYM INTERFACE.FOOTheshold
   FOR PROD_IMPLEMENTATION.FOOTheshold;
  • Then change the dbo.sp_BOO_manager stored procedure to use the synonyms from the INTERFACE schema instead of the objects themselves. Here is what you will get:
ALTER PROCEDURE dbo.sp_BOO_manager
AS
BEGIN
   IF INTERFACE.FOO() < INTERFACE.FOOTheshold()
      EXEC INTERFACE.BOO 'Have to do BOO logic';
   ELSE
      EXEC INTERFACE.MOO 'Have to do MOO logic, instead';
END
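
The template above also references PROD_IMPLEMENTATION.FOO, PROD_IMPLEMENTATION.FOOTheshold, and PROD_IMPLEMENTATION.MOO, which are not shown here. Minimal placeholder versions (the return types and bodies are assumptions, just enough to make the example complete) might look like this:

CREATE FUNCTION PROD_IMPLEMENTATION.FOO()
RETURNS INT
AS
BEGIN
   RETURN 1; -- placeholder measurement
END
GO
CREATE FUNCTION PROD_IMPLEMENTATION.FOOTheshold()
RETURNS INT
AS
BEGIN
   RETURN 10; -- placeholder threshold
END
GO
CREATE PROCEDURE PROD_IMPLEMENTATION.MOO
@Message VARCHAR(50)
AS
BEGIN
   PRINT 'Alternative proprietary logic';
END
GO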

Now our sample code is not bound to the actual business logic directly; instead it calls abstracted logic through an interface (which we can override when needed), allowing the calling process to inject the desired functionality.
Let’s do it…
Make a new object somewhere in another database:

USE SOMEOTHERDB
GO
CREATE PROCEDURE MOCK_OBJECTS.BOO
@Message VARCHAR(50)
AS
BEGIN
   DECLARE @OverrideName VARCHAR(100)
     = 'SOMEOTHERDB.MOCK_OBJECTS.BOO';
   PRINT 'Very simple log message';
   INSERT INTO TEST_MESSAGES
      (OverrideName, [Message], EventDate)
      VALUES (@OverrideName, @Message, GETDATE());
END
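
The mock logs into a TEST_MESSAGES table that is not defined in the original example; a minimal assumed definition, matching the columns used above, could be:

CREATE TABLE TEST_MESSAGES
(
   OverrideName VARCHAR(100) NOT NULL,
   [Message]    VARCHAR(50)  NULL,
   EventDate    DATETIME     NOT NULL
);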

Then (on your development server) change the BOO synonym to point to the mocked object:

DROP SYNONYM INTERFACE.BOO;
CREATE SYNONYM INTERFACE.BOO
   FOR SOMEOTHERDB.MOCK_OBJECTS.BOO;

After this change, you can call the business logic and verify that the expected message got logged to the test table.

--call logic
EXEC dbo.sp_BOO_manager;
--assert result
EXEC DBTD_ASSERT_COLUMN_HAVE_VALUE
   @v_TableName = 'TEST_MESSAGES',
   @v_ColumnName = 'OverrideName',
   @v_Value = 'SOMEOTHERDB.MOCK_OBJECTS.BOO',
   @v_ValueDataType = 'VARCHAR(100)',
   @v_UserMessage = 'BOO functionality should run';
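
Once the assertion passes, remember to point the synonym back at the production implementation. This teardown step is not shown in the original example, but it simply mirrors the override above:

DROP SYNONYM INTERFACE.BOO;
CREATE SYNONYM INTERFACE.BOO
   FOR PROD_IMPLEMENTATION.BOO;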

There are a few ways to introduce the dependency injection pattern in databases; we have chosen synonyms because of the following benefits:

  • Synonyms are very simple database objects;
  • A synonym does not change the actual business logic it represents;
  • A synonym provides a less invasive way to perform the injection;
  • When creating a synonym, a developer only needs the name of the synonym (already known from the consumer side) and the name of the object it represents (also known by the developer from the provider side);
  • Locating synonyms and business logic in different schemas allows a synonym to have the same name as the object it abstracts, which preserves the existing naming convention used by the development team;
  • A synonym can be created on the following objects:
    CLR stored procedures, CLR table-valued functions, CLR scalar functions, CLR aggregate functions, replication-filter procedures, extended stored procedures, SQL stored procedures, SQL scalar functions, SQL table-valued functions, SQL inline-table-valued functions, views, and user-defined tables, including local and global temporary tables.

Well, here you go: now you have one more tool in your development toolbox. Use injection wisely, make sure you have basic infrastructure tests to go with it, and don’t forget about security.

Dynamically create a function or a stored procedure

8 May

Dynamically create a function or a stored procedure in “some-other“ database from within a stored procedure running in “another“ database.

Note: for the time being, we will forget about possible security issues caused by dynamic SQL.

First of all, we all know that the CREATE and ALTER PROCEDURE statements do not allow specifying a database name as a prefix to the object name. For instance, if you try to run the following statement, you will get an error:

CREATE PROCEDURE A.dbo.sp_bar
AS
BEGIN
SELECT 'a'
END;

That brings us to the usual way of setting the database context and then executing the CREATE statement in a script, like this:

USE A
GO
CREATE PROCEDURE dbo.sp_bar
AS
BEGIN
SELECT 'a'
END
GO

Don’t forget about “GO” either, because CREATE or ALTER PROCEDURE must be the first statement in a query batch. That fact brings a little challenge when trying to dynamically create a function or a stored procedure in database “A” from within a stored procedure running in database “B”, since sp_executesql does not accept “GO” (“GO” is a batch separator recognized by client tools, not a T-SQL statement). If you try the following code:

DECLARE @sql NVARCHAR(100) = 'USE A GO'
EXEC sp_executesql @sql

or even this one

DECLARE @sql NVARCHAR(100)
SET @sql = 'USE A
GO'
EXEC sp_executesql @sql

both will return an “Incorrect syntax near ‘GO’” error message.

So, to get around all those limitations, create a little stored procedure in the target database (assuming that you have control over the database, that you have thought about all the security concerns, and that you have limited all access to such a stored procedure) that looks like this:

USE A
GO
CREATE PROCEDURE dbo.sp_executesql_inTHISDatabase
(
@sql NVARCHAR(MAX)
)
AS
BEGIN
SET NOCOUNT ON;
EXEC sp_executesql @sql;
END
GO

All we need to do now is what we originally intended:

USE B
GO
DECLARE @sql NVARCHAR(800)
SET @sql=
'CREATE PROCEDURE dbo.sp_bar
AS
BEGIN
SELECT ''a''
END;
';
EXEC A.dbo.sp_executesql_inTHISDatabase @sql;

Hurray! It works.
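
As a side note, system stored procedures invoked with a three-part name execute in the context of the referenced database, so in many cases the helper is not even required; sp_executesql can be called directly against database A (worth verifying on your SQL Server version before relying on it):

USE B
GO
DECLARE @sql NVARCHAR(800)
SET @sql =
'CREATE PROCEDURE dbo.sp_bar
AS
BEGIN
SELECT ''a''
END;
';
-- runs in database A because of the three-part name
EXEC A.sys.sp_executesql @sql;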

But now… please do remember all those security concerns we were trying to forget for a little while.

Composite Data Warehouse – Part One

2 May

A database design trap

For many organizations, database maturity becomes a pivot point where an entrepreneurial agile team culture seamlessly morphs into a waterfall think tank. Loss of flexibility and development freedom creeps in from the policy, security, design, change, maintenance, support, and other standpoints. A team in such a position might have a couple of database development servers that are rarely synchronized with production. After a few months of coding, the development and production environments drift apart so much that it becomes very hard to manage successful releases. The DB administrators, who in many cases are overloaded with a multitude of administrative tasks, have no time to merge production changes back into development, nor do they have time to set up continuous integration environments for development.

When a database design solidifies enough, one of the primary goals of the team becomes avoiding the risk of impact to existing functionality, and the simplest way to avoid impact is to minimize database changes. After a few deployment disasters (which everyone has), management will create a set of new deployment policies (with good intentions, of course) that quite quickly can result in risk avoidance at the expense of innovation. The application is aging.

Foundation that embraces innovation

It is a breeze to talk about the outstanding setup here at Fortigent; developers don’t even realize how lucky they are. Code production is blooming, and developers are cranking out releases to production on a regular basis. Years go by and there is no glimpse of hesitation toward change, no signs of fear, and no comments about outdated DBs. So, what is it that we have here at Fortigent that fuels innovation?

At the core is continuous integration: a seamless and almost unnoticed process that brings the latest production database and hot “out of the oven” development code together:

  • An automated process scripts the latest production database at regular intervals and keeps all DDL and metadata under the code name “Base”;
  • New or changed database code is stored in a separate location called “Delta”;
  • Deltas from many projects are applied to the Base and compiled into a “Latest” DB version;
  • The common latest DB version is loaded individually for each project at compile time, either on the build server or on the development PCs;
  • Numerous unit testing frameworks hit the application on all levels to verify functionality, identify impact, and flag integration deficiencies.

Code from different teams is compiled together with production database changes that flow into the development world as early as possible, giving developers a chance to adapt and produce high-quality solutions, while operations can tune any aspect of the database.

This is the environment that allows us to embark on the new database project, “Composite Data Warehouse”, which is paving the path to greater scalability, improved performance, and growth.

To be continued…