
Difference between Refiners Active and Latent

Recently I was trying to set up search for a client when two options on the Refinable managed property setting caught my eye: Yes – Active and Yes – Latent.

Let me explain what these actually do in real life.

When you set it to Yes – Latent, the property is marked as refinable but is not active. You would do this at design time, without worrying about performance, since the property is not active yet. (Frankly speaking I fail to imagine such a situation, although as I understand it the point is that a latent property can later be switched to active without requiring a full re-crawl.)

When you actually move to production you can set the property to Yes – Active, which means it is fully active and usable as a refiner right away.


SharePoint Versions are not records

During a migration planning exercise for a departmental site moving from SharePoint 2007 to a new site structure in SharePoint 2010, I came across an interesting use of versions to store records.

The user had no version retention options set on some libraries because they didn’t want old versions removed, as particular versions represented a signed agreement at a point in time. A document may now be on version 8, for instance, with amendments, but version 4 was the signed version originally agreed with the customer.

This is a most unusual way of using version handling, and not advisable given that most SharePoint implementations would impose retention of a set number of major and minor versions to control storage space. The recommended approach would be to declare the agreed copy as a record, copy it to a records centre, or create a PDF/XPS copy for later referral.

The moral of the story is that if you are discussing version handling with your users, be sure to point out that it exists to let them recall an old version for comparison, or revert to a version before unwanted changes were made, and not as a record-keeping process.

This also reminded me of another record-keeping faux pas, unrelated to SharePoint, where a senior exec had an Outlook rule to copy everything to Deleted Items on arrival; he would then move a message back to the inbox if it required attention. An IT policy to clear Deleted Items on Outlook exit removed all his emails…

 


SharePoint 2010 Web Analytics under the covers

I encountered an issue with Web Analytics not processing usage data, evident from the lack of hits in the usage reports in Central Administration and site collections. Fortunately I got the problem fixed, which I’ll share with you in a moment; in the process I did some digging into how the Web Analytics process works and into the reporting databases, which may be of interest.

I started with the high level view provided in Microsoft Enterprise Content Management (ECM) Team Blog http://blogs.msdn.com/b/ecm/archive/2010/03/21/introducing-web-analytics-in-sharepoint-2010.aspx.

I’ve interpreted the process visually; the way web analytics get collected looks something like this:

These are the steps that occur:

  • A user requests a page and the action gets picked up by the Web Analytics service that runs on each SharePoint server in your farm (this may be server side or a JS call, but I haven’t investigated).
  • The Web Analytics service logs this in the “.usage” files in the location specified in Central Administration.
  • A timer job called “Microsoft SharePoint Foundation Usage Data Import”, which by default runs every 30 minutes, imports the logs into the staging database.
  • Each night the “Microsoft SharePoint Foundation Usage Data Processing” timer job runs and transforms the data into the reporting database; from my investigation, this populates the “WAClickFact” (fact) table.
    • This timer job also runs a number of stored procedures to aggregate the data into other tables (WAClickAggregationByDate, WATrafficAggregationByDate, WASearchTrafficAggregationByDate, etc.) that are displayed in the reports. Note: running this job manually does not seem to execute this latter part of the process.
  • The last run time of the import from staging and of the aggregation is logged in the Settings table in the reporting database.
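These timer jobs can be inspected, and the import kicked off by hand, from the SharePoint 2010 Management Shell. A quick sketch, assuming the default English display names:

```powershell
# List the usage-related timer jobs and when they last ran
Get-SPTimerJob | Where-Object { $_.DisplayName -like "*Usage Data*" } |
    Select-Object DisplayName, LastRunTime

# Start the import job on demand; as noted above, running the processing job
# manually does not appear to execute the aggregation stored procedures
Get-SPTimerJob |
    Where-Object { $_.DisplayName -eq "Microsoft SharePoint Foundation Usage Data Import" } |
    Start-SPTimerJob
```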

In my case, the data for hits was being populated into the fact table in the reporting database, but the aggregation tables were missing data. The problem was that database files were missing, which is when I discovered that part of the timer job’s processing task creates new files for the aggregation tables every 4 or 5 days (this may be based on size), done to improve performance by partitioning the tables. The reason for the missing files? Not enough disk space was available, and as the routine never attempts to create the files again, it fails until you manually create the file.

Microsoft assisted in locating the missing file, which was logged in the diagnostic logs once we set Web Analytics logging to Verbose. We could then create the files manually using the script below and leave the overnight jobs to run. Thankfully this processed all the missing aggregations and we lost no data, so much thanks to Microsoft’s support team.

Use this SQL statement to list the database files along with their file groups, and check for groups with zero files.

SELECT f.*, g.groupname, g.status AS FGStatus FROM sysfiles f LEFT OUTER JOIN sysfilegroups g ON f.groupid = g.groupid
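To narrow that down to just the failure condition here, file groups with no files at all, you can join in the other direction; a small sketch using the same legacy catalog views:

```sql
-- File groups that currently have zero files; these need a manual ADD FILE
SELECT g.groupname
FROM sysfilegroups g
LEFT OUTER JOIN sysfiles f ON f.groupid = g.groupid
WHERE f.fileid IS NULL
```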

Use the following SQL to get the file location and create a new file.

use DBName

go

DECLARE @DBFilePath NVARCHAR(2000)

SELECT @DBFilePath=LEFT(filename,LEN(filename)-CHARINDEX(N'\', REVERSE(filename))+1) FROM sysfiles WHERE RIGHT(filename,3)='mdf'

IF NOT EXISTS (SELECT 1 FROM sysfiles f INNER JOIN sysfilegroups g ON f.groupid = g.groupid WHERE g.groupname='SharePoint_WebAnalytics_ReportingAggregation20101128')

EXEC('ALTER DATABASE DBName ADD FILE (

NAME= ''SharePoint_WebAnalytics_ReportingAggregation20101128'',

FILENAME = '''+@DBFilePath+'SharePoint_WebAnalytics_ReportingAggregation20101128.ndf'')

TO FILEGROUP SharePoint_WebAnalytics_ReportingAggregation20101128')

By Alan Marshall

Twitter: @pomealan

Principal Architect

Gen-i NZ


SharePoint 2010 Configuration with PowerShell and Untrusted SQL domain (SQL Authentication) workaround

This blog covers a workaround for configuring SharePoint 2010 to use SQL authentication, allowing a SharePoint WFE to consume SQL Server resources from an untrusted domain. This scenario is common for extranet and SharePoint for Internet implementations, as it isolates the data layer into a separate network stack for security reasons; typically to mitigate the risk of a compromised web front end giving an intruder direct access to the internal network.

The conceptual topology of this deployment looks as follows:

This diagram also includes a UAG to provide access for external users and ADFS to allow single sign-on for internal users against their domain accounts, but these are not discussed further in this blog.

Although a number of blogs have been published on configuring SharePoint 2010 to use SQL authentication via PowerShell scripts, they do not include the configuration of services such as Managed Metadata, Search, Excel etc. that you would have in a SharePoint for Internet or Server deployment. If you attempt to configure the services through PowerShell including the SQL authentication parameters, the provisioning code attempts to add the service domain account to SQL Server, even though brief testing suggests this is not required, and this prevents the configuration of the service in an untrusted domain.

Note that in SharePoint 2007 this type of configuration worked without any issues.

I’m not going to cover the provisioning of the SharePoint configuration database and other basic installation steps, as these are well covered in the blog linked below and related posts, although I would create a SQL Server alias using the cliconfg tool to point to the non-standard SQL port:

http://blogs.technet.com/b/surama/archive/2010/03/17/sharepoint-2010-configuration-with-powershell-and-untrusted-sql-domain-sql-authentication.aspx
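For what it’s worth, the alias cliconfg creates is just a registry value, so it can be scripted as well. A sketch; the alias name, server and port below are examples to substitute with your own (on 64-bit servers, 32-bit clients read the same value under Wow6432Node):

```powershell
# A TCP alias has the form "DBMSSOCN,<server>,<port>" - the same value cliconfg writes
$key = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo"
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name "SP_SQL" -Value "DBMSSOCN,sql01.lan.example,14330"
```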

To configure the services you need to follow these basic steps:

  • Configure your network as in the conceptual topology above. Ensure you only allow the SQL Server TCP/IP port through the firewall (I also recommend using a non-standard port, i.e. not 1433).
  • Follow Sundar Ramakrishnan’s blog in the link above to get the basic installation of SharePoint up and running and the Central Administration site in place.
  • Configure the firewall to allow the ports required for a domain trust.
  • Create a trust between the DMZ domain and the LAN domain.
  • Configure all the SharePoint 2010 services. I would recommend doing this in a script so you can run it and confirm the services were created in a short time window, to minimise the risk of attack whilst the firewall is open.
  • Close the firewall ports opened for the domain trust.
  • Test that all the services are operational.

We have found that Web Analytics does not work in this scenario, as the PowerShell command has no SQL authentication parameter. We haven’t done exhaustive testing yet, but initial creation of web applications and sites is functioning correctly.

Below are example scripts for provisioning some of the services. These are provided as a reference, so please create your own rather than copying these verbatim. Use the Get-Help PowerShell command to check the exact parameters, as some services require DatabaseCredentials as a PSCredential object and others use SQL login and password parameters (nothing like consistency):

Create a PSCredential object to store the database SQL account login details (this example uses the same credentials throughout, but you should use different ones for each service):

$dbCredentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "s_SPExtranetFarm", (ConvertTo-SecureString "Password" -AsPlainText -Force)
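If you’d rather not have the password sitting in the script at all, prompting for it at run time produces the same PSCredential object:

```powershell
# Prompts interactively, with the SQL login name pre-filled
$dbCredentials = Get-Credential -Credential "s_SPExtranetFarm"
```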

 

State Service:

      $serviceApp=New-SPStateServiceApplication -Name "State Service"

      New-SPStateServiceDatabase -Name "State_Service_DB" -DatabaseCredentials $dbcredentials -ServiceApplication $serviceApp

      New-SPStateServiceApplicationProxy -Name "State Service Proxy" -ServiceApplication $serviceApp -DefaultProxyGroup

 

Session Service:

      Enable-SPSessionStateService -DatabaseName "Session_State_Service_DB" -DatabaseCredentials $dbcredentials

 

Usage Service:

      New-SPUsageApplication -Name "Usage and Health Data Collection Service Application" -DatabaseName "Usage_and_Health_Data_DB" -DatabaseCredentials $dbcredentials

 

Business Data Connectivity Services:

do {

      $serviceApplicationPoolAccount=Read-Host "Account for application pool (domain\username)"

    } until (Validate-AccountName($serviceApplicationPoolAccount) -eq $true) # Validate-AccountName is our own helper function (not shown)

      $dbCredentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "s_SPExtranetBCS", (ConvertTo-SecureString "password" -AsPlainText -Force)

      New-SPServiceApplicationPool -Name "Business_Data_Connectivity_Service_Application_Pool" -Account "$serviceApplicationPoolAccount"

      New-SPBusinessDataCatalogServiceApplication -Name "Business Data Connectivity Service Application" -ApplicationPool "Business_Data_Connectivity_Service_Application_Pool" -DatabaseServer "SP_SQL" -DatabaseName "Business_Data_Connectivity_Service_DB" -DatabaseCredentials $dbcredentials

 

I have a support call open with Microsoft on this issue and it is a known problem; the product team are deciding whether to fix it. I’ve expressed my opinion strongly, as this worked fine in SharePoint 2007 and I wouldn’t want to deploy SharePoint with a domain trust. Their other recommended workaround is to put the SQL Server in the DMZ domain whilst configuring the services and then move it back to the LAN. I don’t think SQL Server would be too happy with that move, but it carries less risk as the firewall is never opened up.


SharePoint Governance and Information Architecture course

I recently attended a course led by Paul Culmsee (Twitter: @paulculmsee) on SharePoint Governance and Information Architecture and thought I’d share some quick feedback through the ECM team blog.

When I was asked to attend this course as part of the Elite partner program in New Zealand, I had the impression that we would be covering how to design and use SharePoint features to build an information architecture, which would just be a repeat of my daily work. Instead I got a wealth of information, techniques and knowledge on how to successfully implement a SharePoint project based around information architecture.

When Paul turned up to run the course I was immediately put into a more positive frame of mind, given his background in researching wicked problems and the clever workaround articles he writes. Having also shared a whisky with him at the SharePoint conference in NZ, I had a brief glimpse of the humour that was going to make this a more enjoyable four days.

The course broke down into several modules, of which the ones on getting things right from the start and overall project governance set a really good framework. The tools we went through are excellent for day-to-day use on all sorts of work. The sections on SharePoint features were briefer, but if you don’t know the new features such as content type syndication, document sets, document IDs and metadata navigation/search filtering, they are a vital part of rounding out the course.

I would recommend attending this course if you are undertaking SharePoint projects in the capacity of a BA, PM, SharePoint architect or information architect.

 

By Alan Marshall


Welcome…

Welcome to the Gen-i Enterprise Content Management team blog. We will be posting our ideas, solutions, fixes and general discussion points on SharePoint and related products.

As a team, we have built intranet, extranet and Internet solutions for ourselves and our clients, building up a vast amount of experience that we would like to share with the wider community.

With the imminent release of SharePoint 2010 we will be looking at how we can implement this to deliver additional business benefits to our existing clients.
