Wednesday, August 29, 2012

Modern monitoring and prediction systems are almost ready for your business

During the past couple of months I have been looking for interesting ideas in the B2B area, and in the last weeks I researched a lot of things in business intelligence. What can I say after that? It seems to me BI has turned from long stagnation to extremely fast improvement. I'm not talking about data analysis in general; I'm talking about applying modern machine learning techniques to the everyday analytics of an average company. It's just the future coming. What do I mean by the future? I mean continuous monitoring of a company's performance metrics, and not just monitoring: real-time processing of the data flow, analysis, and delivery of ready-to-use analytics to the end user in the shortest time, with assumptions and recommendations made automatically. This is not dumb threshold-driven analytics anymore; today we can offer our customers much, much more. And it doesn't end here. The next step is business-process-driven prediction, where you can set the points at which you want to influence the process, correct behavior, and reorganize your resources.
Be honest: you are a busy business person, and you cannot control your whole infrastructure. Nobody in your company has complete information about all aspects of the process; everybody is doing what they have to do. But we are all people, not machines: there are mistakes, there is inaccuracy, there are communication problems. Of course all of this is a target for optimization, but nothing is perfect.
So let's say you have an average company and you have to deliver some service to your customer; the contract is already in your ECM system, almost approved by management. There is also another part of the business process, logistics: your supplier is late with the production supply you need in order to deliver the service to your customers. There is also a not-so-small delay in the approval of the supply by your financial department. The situation is obvious: it is an emergency. To detect such a situation you would have to continuously monitor several of your systems, which is just not possible for somebody as busy as you. Of course you can hard-code the detection of this particular situation, but all sorts of things can happen to your business processes. You will say, "I have a crowd of analysts", but this crowd wants to eat and sleep, works with static corporate analytic data, is not expert in all your business processes, and cannot provide information in real time.
This is our case. I have researched this area a lot recently, and I can confidently say that systems which alert you just in time will appear in the near future. Even when they cannot provide the information in time because they don't have enough data, they will tell you, with some probability, why something just happened. It really matters when you can optimize your business processes in time. It is like a superpower, and we are on the way to getting it.
The first iteration of implementing such systems is business activity monitoring (BAM); see Gartner's reports for more details.
I don't have a lot of time now to explain all the details, but I hope I'll find the time to provide more in the future.

Sunday, September 18, 2011

SharePoint Workflow + Nintex as a platform for portal-based ECM development (a speech to the customer)

A few months ago I became the team leader of a small group of specialists who develop a kind of portal-based ECM system. Since then I have been studying things like SharePoint Workflow, Nintex, and everything they depend on. I found it quite interesting to build a company's workflows using a graphical definition. Of course, Microsoft and Nintex didn't invent this approach; there are a lot of different graphical notations for defining workflows of varying complexity (for example, BPMN), but Microsoft did what they like to do: they took a ready-to-use approach and made quite a candy out of it. If you have some patience and the opportunity to use such technologies, you can easily build simple workflows for your company, even if you are not a professional developer.
That's right, I said this about simple workflows: sending notifications about changes in the system, simple signing or approval schemes. That is all SharePoint + Nintex is enough for; if you run into the development of more complex schemes, it can bring you a headache. SharePoint Workflow and Nintex are extensible, but the basic set of activities they provide is a little bit small (an activity is a kind of single function or action, a building block of workflows). If you don't extend this basic set, your workflows will be like fat monsters, very slow and heavy. From that moment on you can't do anything better without the help of developers, who can create new activities to extend the basic functionality, refactor, and so on (see the sketch at the end of this post).
This is just my experience. We are now developing a kind of extension for building ECM which can really help to develop more complex workflows easily; of course, there are a lot of other platforms on the market, we are not alone. So, if you want to use SharePoint Workflow + Nintex (or even SharePoint alone) to automate your business workflows, and they are a little bit complex, be ready to pay some extra money for extending it. Don't believe slogans like "Nintex is for the business analyst" or "SharePoint provides the possibility to easily describe your business processes". SharePoint Workflow and Nintex are a really extensible, powerful, flexible platform, but if you are not ready to pay some extra money to a company like mine, if you have complex business processes, and if you don't have your own development department, don't be naive: buying it is not a panacea for your administration and management problems.
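To give an idea of what such an extension looks like, here is a minimal sketch of a custom activity (WF 3.5, the model SharePoint 2010 workflows run on; the class name and the tracing it does are illustrative, not from SharePoint or Nintex):

using System.Workflow.ComponentModel;

// A custom activity: one reusable building block for workflows.
public class LogMessageActivity : Activity
{
    // A bindable property, so the workflow designer can pass data in.
    public static readonly DependencyProperty MessageProperty =
        DependencyProperty.Register("Message", typeof(string), typeof(LogMessageActivity));

    public string Message
    {
        get { return (string)GetValue(MessageProperty); }
        set { SetValue(MessageProperty, value); }
    }

    protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
    {
        // A real activity would call into SharePoint or an external
        // system here; tracing stands in for the actual work.
        System.Diagnostics.Trace.WriteLine(Message);
        return ActivityExecutionStatus.Closed;
    }
}

Note that surfacing such an activity as an action in the Nintex designer additionally requires a Nintex-specific adapter and configuration, which is beyond this post.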

Tuesday, July 26, 2011

How to update an SPFile and an SPListItem simultaneously, or don't lose the references

Hi, I'm back after a long time!
Today I'll describe some specific issues of the SharePoint API. Did you know that SharePoint likes to create new objects? When I started working with it, I didn't. Let us take a glance at updating an SPListItem and its SPFile simultaneously. This is the example most people are used to seeing:

using (SPSite site = new SPSite("http://webserver"))
{
    using(SPWeb web = site.OpenWeb())
    {
        byte[] binary = new byte[0];
        SPListItem item = web.Lists["ListName"].Items[0];
        item["Title"] = "Title";
        item.File.SaveBinary(binary);
        item.File.Update();
        item.Update(); // throws: the item has been modified by another user
    }
}

Unfortunately this doesn't work: the program throws an exception saying that the item was already changed by someone else. I was really surprised when I found that the objects item and item.File.Item are different. Of course they are equal from the point of view of data, but they are different references: SharePoint creates a new object each time you take an SPListItem from the list. So to keep the references consistent you need to do something like this:

using (SPSite site = new SPSite("http://webserver"))
{
    using(SPWeb web = site.OpenWeb())
    {
        byte[] binary = new byte[0];
        SPListItem item = web.Lists["ListName"].Items[0];
        item["Title"] = "Title";
        item.File.SaveBinary(binary);
        item.File.Update();
        item.File.Item.Update(); // update through the same object the file belongs to
    }
}

Or:

using (SPSite site = new SPSite("http://webserver"))
{
    using(SPWeb web = site.OpenWeb())
    {
        byte[] binary = new byte[0];
        SPListItem item = web.Lists["ListName"].Items[0].File.Item;
        item["Title"] = "Title";
        item.File.SaveBinary(binary);
        item.File.Update();
        item.Update(); // item was taken from File.Item, so the references match
    }
}

Another example is updating a field of an SPListItem. Look at this code:

using (SPSite site = new SPSite("http://webserver"))
{
    using(SPWeb web = site.OpenWeb())
    {
        web.Lists["ListName"].Items[0]["Title"] = "Title";
        web.Lists["ListName"].Items[0].Update();
    }
}

This also doesn't work, because the item is accessed two times and SharePoint creates a new SPListItem object each time. You won't get an exception this time, but the field value will not be updated. The right way is to store the item in a single variable:

using (SPSite site = new SPSite("http://webserver"))
{
    using(SPWeb web = site.OpenWeb())
    {
        SPListItem item = web.Lists["ListName"].Items[0];
        item["Title"] = "Title";
        item.Update();
    }
}

That is all; I just wanted to reduce your headache. Let us keep studying the specifics of this system and create useful applications.

Monday, April 11, 2011

Getting file content from the ItemAdding method of SPItemEventReceiver

Today I spent a lot of time finding a way to get file content (binary data and file name) from the ItemAdding method of SPItemEventReceiver. This task is related to my previous post about Content Deployment: I used event receivers to redirect changes to another site collection and then deploy them to a few other site collections. Anyway, I think the practice is more interesting for you. It is possible to extract the file data from HttpContext. You only need to add a default constructor, where the context is captured, and then get all the uploaded files from the Request.Files property of that context. Be careful: there can be empty files, so you need to check the ContentLength property first. An example is below:

public class DocumentEventReceiver : SPItemEventReceiver {
    
    private HttpContext _context;

    public DocumentEventReceiver()
    {
        // ItemAdding runs inside the web request, so the current
        // HttpContext is still available here; capture it for later.
        _context = HttpContext.Current;
    }
    
    public override void ItemAdding(SPItemEventProperties properties)
    {
        HttpFileCollection files = _context.Request.Files;
        foreach (String key in files.Keys)
        {
            // the collection can contain empty entries, so check the length
            if (files[key].ContentLength > 0)
            {
                Stream stream = files[key].InputStream; // binary data
                string fileName = files[key].FileName;  // file name
            }
        }
    }
}
What about the ItemAdding method itself, you will ask? HttpContext doesn't contain any information about the changed file content there. For now I use a synchronous ItemAdded receiver and extract the file content directly from the item; this is not a perfect solution, but I use what I have.
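For completeness, a minimal sketch of that fallback, a synchronous ItemAdded override added to the same receiver (synchronous execution is configured in the receiver's Elements.xml; the processing body is just a placeholder):

public override void ItemAdded(SPItemEventProperties properties)
{
    // The item already exists here, so no HttpContext is needed.
    SPFile file = properties.ListItem.File;
    if (file != null && file.Length > 0)
    {
        byte[] content = file.OpenBinary(); // binary data
        string fileName = file.Name;        // file name
        // process content and fileName here
    }
}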
A significant contribution to the first problem was provided by my boss; thanks to him.

Sunday, April 10, 2011

SharePoint Content Deployment programming API

Hi, as I promised earlier, I'll try to describe SharePoint Content Deployment more deeply, from the programming side. There are many reasons not to use the standard SharePoint content deployment interface (the one provided in the administrative UI); the most significant is when you need more flexibility from content deployment. Imagine a situation where you need to deploy only one list from a web, and the deployment logic cannot be scheduled as a timer job and should be executed only when something specific happens, for example when a user updates the content of the source list. In that case the programming deployment API comes to the rescue. I have developed a more complex deployment scenario, but here I'll describe the basic programming stuff. Deployment has two stages: export and import. One specific thing you need to know: SharePoint uses the file system to store intermediate deployment data. First you export the content to a specified directory or .cab file, then you import this content using the set of import classes. Let us start with a short overview of the classes we will use.
Classes for export:
  • SPExportObject is a wrapper class for the deployment content (SPWeb, SPList, SPListItem, etc.). It contains the ID of the object you will export and the deployment type (the type of the exported object, an instance of the SPDeploymentObjectType enumeration).
  • SPExportSettings is a configuration class; all general settings are specified in it. It is used as a property of the SPExport class (see below) and also contains the collection of export objects.
  • SPExport is the class which contains only two interesting things: the Settings property and the Run method. This method executes the export of content using the settings (an instance of the SPExportSettings class).
Import is the easiest part of deployment; the import classes are really simple:
  • SPImportSettings contains just a few settings for deployment; the main one is the location of the export file or folder, and the others relate to versions and so on.
  • SPImport is similar to SPExport: it also has the two things to know, the Settings property and the Run method. There are also some useful events, but they are optional.

Export  

SharePoint Content Deployment supports export of SPListItem and SPList, as well as SPWeb and SPSite. The only thing you need to do is describe the object to export as an SPExportObject. Below are examples of creating export objects for SPWeb, SPList, and SPListItem:
//Creating export object for SPWeb
private SPExportObject GetExportWeb(SPWeb web)
{
    return new SPExportObject
    {
        Id = web.ID,
        Type = SPDeploymentObjectType.Web   
        //you also can set more specific properties, such as IncludeDescendants
    };
}

//Creating export object for SPList
private SPExportObject GetExportList(SPList list)
{
    return new SPExportObject
    {
        Id = list.ID,
        Type = SPDeploymentObjectType.List
    };
}

//Creating export object for SPListItem
private SPExportObject GetExportListItem(SPListItem item)
{
    return new SPExportObject
    {
        Id = item.UniqueId,
        Type = SPDeploymentObjectType.ListItem
    };
}
Then you need to create the export settings. Usually I use settings like these:
public SPExportSettings GetExportSettings(string siteUrl, 
    SPIncludeVersions includeVersions, 
    SPExportMethodType exportMethodType, 
    string lastChangeToken)
{
    return new SPExportSettings
    {
        SiteUrl = siteUrl,
        ExportMethod = exportMethodType,
        AutoGenerateDataFileName = true,
        FileCompression = true,
        ExcludeDependencies = false,
        OverwriteExistingDataFile = true,
        IncludeVersions = includeVersions,
        ExportChangeToken = lastChangeToken
    };           
} 
The ExportMethod property tells the deployment API whether to export only the changes since the last change token or to export all content. I also recommend using the AutoGenerateDataFileName option: it automatically chooses a location for the export file in the current user's Temp folder. It is also useful to enable compression into a .cab file with the FileCompression property. The meaning of the other options is easy to understand, I think.
The last part of the export process is to add all the necessary objects to the ExportObjects property of SPExportSettings, create an instance of the SPExport class using your export settings, and call its Run method. That is all: all the specified objects and other necessary information will be exported to the .cab file. Notice that you can add more than one export object; the SharePoint deployment API will figure out how to combine them by itself, but all of them have to belong to the same SPSite. The deployment API works on the site collection level.
An example of export is below:
public void ExportWeb(SPWeb sourceWeb, string lastChangeToken)
{
    SPExportSettings settings = 
        GetExportSettings(sourceWeb.Site.Url, 
        SPIncludeVersions.All, 
        SPExportMethodType.ExportChanges, 
        lastChangeToken);
    SPExportObject exportWeb = GetExportWeb(sourceWeb);
    settings.ExportObjects.Add(exportWeb);
    SPExport export = new SPExport(settings);
    export.Run();    
}
You can take the lastChangeToken value from the CurrentChangeToken property of SPExportSettings (of course, you need to read it after the export has run); the first time, pass an empty string and all content will be exported.
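For example, here is a minimal sketch of that round trip, reusing the helpers above (how you persist the token between runs is up to you):

public string ExportChangesAndGetToken(SPWeb sourceWeb, string lastChangeToken)
{
    SPExportSettings settings = GetExportSettings(sourceWeb.Site.Url,
        SPIncludeVersions.All,
        SPExportMethodType.ExportChanges,
        lastChangeToken);
    settings.ExportObjects.Add(GetExportWeb(sourceWeb));
    new SPExport(settings).Run();
    // store this value and pass it back as lastChangeToken on the next run
    return settings.CurrentChangeToken;
}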

Import

Import doesn't have a lot of options. As I wrote earlier, there are only two classes: SPImportSettings and SPImport.
First you need to create an instance of the SPImportSettings class and specify the general properties. Let us begin with the settings I usually use:
public SPImportSettings GetImportSettings(string targetSiteUrl, string directory, string file, bool retainObjectIdentity)
{
    return new SPImportSettings
    {
        SiteUrl = targetSiteUrl,
        FileLocation = directory,
        BaseFileName = file,
        FileCompression = true,
        RetainObjectIdentity = retainObjectIdentity,
        SuppressAfterEvents = true,
        UserInfoDateTime = 
            SPImportUserInfoDateTimeOption.ImportAll,
        UpdateVersions = SPUpdateVersions.Overwrite,
        IncludeSecurity = SPIncludeSecurity.All
    };
}
You always have to specify the target site and the location of the export file. I'll describe all the options I use:
  • SiteUrl specifies the site collection where the content of the export file will be deployed.
  • FileLocation is the folder where the export file is located.
  • BaseFileName is the name of the export file (the .cab file).
  • FileCompression specifies that the export file is compressed.
  • RetainObjectIdentity. If you set it to true, the import will not change the GUIDs of the objects, and you will be able to use incremental deployment, where you deploy only changes. Notice that if you set this option to true, you need to keep the source and target site collections in different content databases, because you cannot store objects with the same GUIDs in one database.
  • SuppressAfterEvents suppresses adding, updating, and other events during the import; it is really helpful when you have event receivers on the target web.
  • UserInfoDateTime specifies which user information should be transferred with the objects. SPImportUserInfoDateTimeOption.ImportAll imports all of it, including the "created by" and "modified by" properties.
  • UpdateVersions. If you have versioning turned on in the target site collection, this option specifies whether a new version is added to the object or whether it is overwritten.
  • IncludeSecurity. You can choose whether or not to copy all the security options of the objects.
You can use more specific settings for your particular case, but I think the options I have described are enough for most scenarios.
Ok, now that we have created an instance of the SPImportSettings class, we need to create the SPImport object and run it:
public void ImportContent(SPImportSettings settings)
{
    SPImport import = new SPImport(settings);
    import.Run();
}
You can control the import process: the SPImport class has some useful events, and you can find their descriptions on MSDN. My advice is also to delete the export file after the import. The most interesting event is ObjectImported; potentially you can additionally change an object right after it has been imported.
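For example, a minimal sketch of hooking that event (what you do inside the handler is up to you; the body here is just a placeholder):

public void ImportContentWithPostProcessing(SPImportSettings settings)
{
    SPImport import = new SPImport(settings);
    import.ObjectImported += (sender, e) =>
    {
        // e describes the just-imported object; inspect or change it here
    };
    import.Run();
}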

PS: Please let me know if I made any slips.

Saturday, April 9, 2011

Introduction to SharePoint Content Deployment

Currently I'm working on a really complex scenario of information synchronization between a few corporate portals, and I use SharePoint Content Deployment for that task. When I first tried this SharePoint feature, I realized it could be really helpful in many cases.
Out of the box, SharePoint 2010 provides the ability to create content deployment paths and timer jobs for periodic deployment. In the simplest case it works like this: an administrator or another responsible user creates a content deployment path, where he/she specifies the source and target site collections. On execution, content from the source site collection is deployed to the target site collection. For periodic deployment, he/she creates a special timer job that specifies the deployment schedule. That is all: the timer job triggers the deployment, and the internal deployment system migrates the content as specified in the deployment path. There are many more options; I have described just the basic principles. Such deployment behavior can be useful when you have several site collections on the intranet and the Internet; one example is synchronizing news feeds between intranet and Internet portals. Another example is the opposite of the first: aggregating information from several corporate portals into a main portal (a headquarters portal, for example).
I have described just the basic concepts of SharePoint content deployment; later I'll try to show some examples of more complex deployment using the SharePoint programming deployment API.

Thursday, April 7, 2011

Enable\Disable SharePoint event firing outside an EventReceiver

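Out of the box, the switch that enables and disables event firing lives on SPEventReceiverBase, so you can only reach it from inside a receiver. A common trick (a minimal sketch, assuming SharePoint 2010, where the switch is the EventFiringEnabled property) is a small wrapper class that subclasses SPItemEventReceiver just to expose that switch to ordinary code:

// Wrapper that exposes the event-firing switch outside a receiver.
public class DisabledEventFiringScope : SPItemEventReceiver, IDisposable
{
    public DisabledEventFiringScope()
    {
        // suppress event receivers for code run on the current thread
        EventFiringEnabled = false;
    }

    public void Dispose()
    {
        // restore normal event firing
        EventFiringEnabled = true;
    }
}

With it, an update like the following will not trigger the ItemUpdating/ItemUpdated receivers attached to the list (item is an SPListItem you already hold):

using (new DisabledEventFiringScope())
{
    item["Title"] = "Changed silently";
    item.Update();
}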