


Aug 04, 2013


Does the Spark platform in DPF 4.5 support stored procedures?

I know Spark uses dynamic SQL only. If a database allows stored procedures only, is Spark out of the question?
I'd also like to know how much work it would take to get Spark working with stored procs.

Thanks.



Aug 05, 2013
Yes, the Spark platform in the Design Pattern Framework 4.5 does support stored procedures.

First a little background: The main goal of the Spark platform is simplicity and productivity. To this end, it only requires tables and indexes in the database. Furthermore, each table has its own corresponding business object (this is the Active Record pattern).  For example, the Art Shop reference application has tables named Artist, Product, and User, and in the code you'll find similarly named business objects. It is a simple, yet powerful model.

However, if your SQL database already has stored procedures, then Spark does support them. Here is how that works.

We'll use the Art Shop as an example and add five stored procedures to the database. They implement basic CRUD operations against the Artist table. Their names are:

 -  ArtistSelectByIdCommand
 -  ArtistSelectCommand
 -  ArtistInsertCommand
 -  ArtistUpdateCommand
 -  ArtistDeleteCommand

These are fairly self-explanatory (note: all the code is attached at the bottom of this post).
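For readers who cannot open the attachment, here is a minimal sketch of what two of these procedures might look like in T-SQL. This assumes an Artist table with Id, FirstName, LastName, and Country columns (the actual attached procedures and schema may differ):

```sql
-- hypothetical sketch: select a single artist by id
CREATE PROCEDURE ArtistSelectByIdCommand
    @Id int
AS
BEGIN
    SELECT * FROM Artist WHERE Id = @Id;
END
GO

-- hypothetical sketch: insert a new artist and return the new row
-- (assumes Id is an identity column)
CREATE PROCEDURE ArtistInsertCommand
    @FirstName nvarchar(100),
    @LastName  nvarchar(100),
    @Country   nvarchar(100)
AS
BEGIN
    INSERT INTO Artist (FirstName, LastName, Country)
    VALUES (@FirstName, @LastName, @Country);

    SELECT * FROM Artist WHERE Id = SCOPE_IDENTITY();
END
```

The other three procedures follow the same pattern; the select and update procedures return the affected row so the C# calls below can materialize it with First().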

To minimize the amount of code required to invoke these procedures, we will first add a couple of helper methods to the Artist.cs partial class (the extension point). The class will look like this:


public partial class Artist : Entity<Artist>
{
    public string FullName { get { return FirstName + " " + LastName; } }

    // utility members to accommodate stored procedures

    static string parameters;
    static string parametersWithoutId;

    // returns the parameter names and values held by the business object
    public object[] Values() { return base.Take(); }

    // builds (and caches) a comma-separated list of parameter names
    public string Parms() 
    {
        if (parameters == null)
        {
            // the parameter names are the string entries that start with "@"
            parameters = string.Join(",",
                Values().OfType<string>().Where(s => s.StartsWith("@")).ToArray());
        }
        return parameters;  
    }
        
    // the same list as Parms(), but without the @Id parameter (useful for inserts)
    public string ParmsWithoutId() 
    {
        if (parametersWithoutId == null)
        {
            parametersWithoutId = Parms().Replace("@Id,", "");
        }

        return parametersWithoutId; 
    }
}
 -  Values() extracts the parameter values from the business object.
 -  Parms() returns the comma-separated parameter names.
 -  ParmsWithoutId() returns the parameter names without the @Id parameter.

Again, their only purpose is to simplify and shorten the stored procedure queries.
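To make the helpers concrete, here is a sketch of what they would produce, assuming the Artist table has just the four columns Id, FirstName, LastName, and Country (an assumption for illustration; the real table may have more):

```csharp
// given an Artist instance named artist, under the assumed four-column schema:
//
// artist.Parms()          -> "@Id,@FirstName,@LastName,@Country"
// artist.ParmsWithoutId() -> "@FirstName,@LastName,@Country"
//
// Values() presumably returns the interleaved name/value pairs that
// Parms() filters on, e.g.:
// artist.Values()         -> { "@Id", 3, "@FirstName", "...", "@LastName", "...", "@Country", "..." }
```

This is why concatenating a procedure name with Parms() or ParmsWithoutId(), and passing Values() alongside it, is enough to build the full stored procedure call.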


And here is how you would use them (note: db is an alias for ArtContext).


using db = Art.Domain.ArtContext;

// ...

// select single typesafe artist by id

Artist artist = db.Artists.Query("ArtistSelectByIdCommand @Id = 3").First();
            
// select single typesafe artist by id (using parameters)

Artist artist1 = db.Artists.Query("ArtistSelectByIdCommand @Id", new object[] { "@Id", 3 }).First();

// select single dynamic artist with parameters (usually not recommended, but more flexible)

dynamic artist2 = db.Query("ArtistSelectByIdCommand @Id", new object[] { "@Id", 3 }).First();

// select all typesafe artists

IEnumerable<Artist> artists = db.Artists.Query("ArtistSelectCommand");
            
// change name and insert as new artist (the old Id is ignored)

artist.FirstName = "Joe";
artist.LastName = "Painter";
            
Artist art = db.Artists.Query("ArtistInsertCommand " + artist.ParmsWithoutId(), artist.Values()).First();

// update country of new artist

art.Country = "Italy";

art = db.Artists.Query("ArtistUpdateCommand " +  art.Parms(), art.Values()).First();

// delete new artist

db.Artists.Execute("ArtistDeleteCommand @Id", "@Id", art.Id);

So, yes, stored procedures are fully supported, but they are not part of the core philosophy of Spark. Spark's goal is simplicity and productivity, which it accomplishes with dynamic SQL (much like Ruby on Rails and other platforms that support the Active Record pattern). The SQL is generated automatically at runtime, so you won't have to write lengthy SQL statements either.

I have attached all the code referenced in this post.

Hope this helps. 

 


Aug 06, 2013
Thank you for the comprehensive information. However, the zip file appears to be corrupted.

Aug 06, 2013
Abdelhadi,

The link has been fixed.
Please try the download again.


Aug 06, 2013
Thanks.

May 24, 2016
I can't open the zip. I think the file is incorrect.

Dec 08, 2016
Sorry!

I can't get the attached file.

Also, I have a problem: when I use a nullable int (int?) variable in a stored procedure call, the Spark platform can't convert it to the exact SQL Server type.