Channel: SharePoint 2010 - Development and Programming forum

Custom Data Lifecycle Solution


Dear Colleagues,

I thought I might share my plans with some SharePoint gurus here, as two heads are always better than one... :)

I am looking at what SharePoint 2013 can offer as a solution for automated archiving, retention and disposal of files using multi-tiered storage. I also want to integrate the 2nd tier (visible archives) into SharePoint Search, so users can select the vertical, query it and retrieve archived files themselves. Our main limitation is that we own SQL Server 2012 Standard edition, which does not allow us to take advantage of native RBS in any practical way. We have analysed some 3rd-party tools from AvePoint, Metalogix, Varonis etc., and some of them do work under SQL Standard, but they cost quite a lot of money - almost the same figure we would pay for upgrading to SQL Enterprise (uuu, that's a lot).

The whole situation made me look at the problem from a different angle. I know there may be convincing arguments for upgrading to SQL Enterprise, but that is not really what I am after in this post, please.

Out of curiosity, I looked into possibilities of Business Connectivity Services, External Content Types, Powershell and SharePoint workflows as alternative routes to achieve my goals.

So far, I have worked out the following, and have tested the first phase of the solution:

1. We can scan each list/library/site/site collection for items matching our archival rules and move the filtered content out of SQL onto the 2nd tier (File Share directories, one each for site collection, site, list and content type) on a conditional basis - e.g. execute the script from a workflow, with "any item older than 2 years?" as the trigger.
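A minimal sketch of that first phase, assuming it runs in a SharePoint 2013 Management Shell on the farm itself. The site URL, the `\\archive\tier2` share and the 2-year cutoff are all placeholders for your own values:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$cutoff    = (Get-Date).AddYears(-2)    # archival rule: older than 2 years
$tier2Root = "\\archive\tier2"          # hypothetical 2nd-tier share

$site = Get-SPSite "http://intranet/sites/projects"   # hypothetical site collection
foreach ($web in $site.AllWebs) {
    foreach ($list in ($web.Lists | Where-Object { $_.BaseType -eq "DocumentLibrary" })) {
        # Mirror the SiteColl\Site\List structure on the file share
        $target = Join-Path $tier2Root (Join-Path $site.RootWeb.Title (Join-Path $web.Title $list.Title))
        if (-not (Test-Path $target)) { New-Item -ItemType Directory -Path $target | Out-Null }

        # Snapshot matching items first - deleting while enumerating breaks the collection
        $old = @($list.Items | Where-Object { $_["Modified"] -lt $cutoff })
        foreach ($item in $old) {
            $bytes = $item.File.OpenBinary()
            [System.IO.File]::WriteAllBytes((Join-Path $target $item.File.Name), $bytes)
            $item.Recycle() | Out-Null   # or $item.Delete() to bypass the recycle bin
        }
    }
    $web.Dispose()
}
$site.Dispose()
```

Note this only copies the file bytes, not the metadata - if you need column values on the 2nd tier, you would have to export them alongside the file (e.g. into the report file discussed below) before recycling the item.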

2. We can crawl those File Share structures on the 2nd tier and create appropriate Result Sources, a vertical (Archives), and maybe even refiners.
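Registering the share as a crawlable content source can also be scripted. A sketch, again assuming a single Search service application and the hypothetical `\\archive\tier2` share:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$ssa = Get-SPEnterpriseSearchServiceApplication

# Register the 2nd-tier share as a file-share content source and crawl it
$cs = New-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa `
        -Name "Tier 2 Archives" -Type File -StartAddresses "file://archive/tier2"
$cs.StartFullCrawl()
```

The Result Source is then straightforward to create in Site Settings (or Central Admin) with a query transform such as `{searchTerms} path:file://archive/tier2`, and the Archives vertical is just a search page wired to that Result Source.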

3. Each directory on Tier 2 would have another Windows PowerShell script assigned to move files out of Tier 2 down to the lower level. That would be based on how long we want files to remain searchable and retrievable from within SharePoint.
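The Tier 2 to Tier 3 demotion is plain file-system work, so the script needs nothing from SharePoint. A sketch, where the two share paths and the 3-year "searchable" window are assumptions:

```powershell
$tier2  = "\\archive\tier2"
$tier3  = "\\archive\tier3"
$cutoff = (Get-Date).AddYears(-3)   # how long files stay searchable in SharePoint

Get-ChildItem -Path $tier2 -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    ForEach-Object {
        # Preserve the SiteColl\Site\List directory structure on Tier 3
        $relative = $_.FullName.Substring($tier2.Length).TrimStart('\')
        $dest     = Join-Path $tier3 $relative
        $destDir  = Split-Path $dest -Parent
        if (-not (Test-Path $destDir)) { New-Item -ItemType Directory -Path $destDir | Out-Null }
        Move-Item -Path $_.FullName -Destination $dest
    }
```

One caveat: `LastWriteTime` is reset by some copy tools, so if your retention clock must be the original SharePoint Modified date, you would need to carry that date in the report file rather than trust the file-system timestamp.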

4. Similarly, on Tier 3 there will be another set of scripts that fire once the retention period expires and destroy the files.
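The final disposal step is the simplest, and the most dangerous, so it is worth dry-running first. A sketch with an assumed 7-year retention period:

```powershell
$tier3  = "\\archive\tier3"
$cutoff = (Get-Date).AddYears(-7)   # assumed retention period

Get-ChildItem -Path $tier3 -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -Force -WhatIf      # remove -WhatIf only once the rule is verified
```

Each of these tier scripts would be registered as a scheduled task on the file server rather than run by hand.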

Every step would write to a report file that indexes every file moved from any location. Ideally, I just want to update that same report file at each level by appending the names of files that have moved downwards, so there is always an up-to-date copy of the report.
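That shared report could be a single CSV that every tier script appends to. A sketch - the report path, columns and the helper name are all hypothetical:

```powershell
$reportPath = "\\archive\reports\lifecycle-report.csv"   # hypothetical shared report

function Write-LifecycleEntry {
    param([string]$FileName, [string]$FromTier, [string]$ToTier)
    # One line per movement: timestamp, file, source tier, destination tier
    $line = "{0},{1},{2},{3}" -f (Get-Date -Format "yyyy-MM-dd HH:mm:ss"), $FileName, $FromTier, $ToTier
    Add-Content -Path $reportPath -Value $line
}

# Example: called after each successful Move-Item / Remove-Item in the tier scripts
Write-LifecycleEntry -FileName "budget2011.xlsx" -FromTier "Tier2" -ToTier "Tier3"
```

`Add-Content` appends, so every tier sees the full history; for disposal on Tier 3 you would log a `ToTier` of something like "Destroyed" so the report doubles as a disposition record.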

I hope to take advantage of some automation, and there is definitely some of it here. Obviously, when users create new SharePoint objects - sites, lists etc. - we would need to apply our archival, visibility and retention rules again. But that would be the case anyway, even if we used some fancy £20,000+ tool from a 3rd party. With a 3rd-party tool, we would just have a nice GUI for it, I guess.

What do you guys think about such a solution for a real-world company? It saves us a significant amount of money, but is it really something worth pursuing? I am not sure if I can propose such a solution to the board, heh.

Thanks, and I hope to see some interesting thoughts coming...

Regards

