Monday, December 20, 2010

[MOSS2007/WSSv3] PowerShell Library - Dump all solutions

Today I am sharing a PowerShell script I created to export all solutions that have been added to the SharePoint solution store (Solution Deployment).

A while ago I visited a customer to troubleshoot an issue; however, they had very little documentation of any kind. This meant that I had no way of determining exactly which versions of their solutions were installed. Using this PowerShell script I was able to export all installed solutions and start testing with the actual installed versions.

How to use the script:
  1. Download the PowerShell script
  2. Create a folder called "Solutions" in the same directory
  3. Run the script
  4. All solutions (wsp files) are downloaded to the Solutions directory
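
If you cannot get hold of the download, the core of the approach is small enough to sketch. A minimal version (my assumptions: MOSS 2007/WSSv3, run on a farm server under a farm administrator account, Solutions folder already created) would look roughly like this:

# Load the SharePoint object model
[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null

# Folder that will receive the wsp files
$targetFolder = Join-Path (Get-Location) "Solutions"

# Save every solution in the farm solution store to disk
$farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
foreach ($solution in $farm.Solutions)
{
    $path = Join-Path $targetFolder $solution.Name
    Write-Host "Exporting" $solution.Name "to" $path
    $solution.SolutionFile.SaveAs($path)
}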

Friday, December 17, 2010

[MOSS2007/WSSv3] PowerShell Library - Check for large Nintex workflow lists

Over the past months, I have been creating several PowerShell scripts to make my life a little easier. This week I thought: why not share these scripts with the world and make everybody's life a little easier :-)

So here is script number 1: Check for large Nintex workflow lists!

[DESCRIPTION]
When using Nintex Workflow 2007 and activating the site feature, a hidden list called NintexWorkflowHistory is created in that site. Every time a workflow starts, a Log History action is used, etc., an entry is written to this list. Unfortunately this list is not cleaned up automatically, so it can potentially become a feared "Large List". On one occasion a user had created a workflow that added 200,000 items to that list in just five days. Because it is a hidden list, it does not appear in the GUI and you cannot easily see the size of these lists.

In order to determine where large Nintex lists exist, I have created a PowerShell script.

How to use it:
  1. Download the PowerShell script
  2. Open it in a text editor like Notepad
  3. Change <url> to the URL of the web application you would like to check
  4. Run the script
  5. Open the output in Microsoft Excel, using "*" as the separator
The script generates an output file in the following format:
http://examples.sharepoint.com/sites/NintexSite * 8927
http://examples.sharepoint.com/sites/NintexSite/SubSite * 5304
http://examples.sharepoint.com/sites/NintexSite2 * 3840
http://examples.sharepoint.com/sites/NintexSite2/SubSite * 10239
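
For reference, the essence of the script comes down to the following (a hedged sketch, not the exact download; the web application URL and output file are placeholders and the hidden list is looked up by the name mentioned above):

# Load the SharePoint object model
[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null

$webAppUrl = "http://examples.sharepoint.com"        # web application to check
$outputFile = ".\NintexWorkflowHistory.txt"          # output file, "*" separated

$webApp = [Microsoft.SharePoint.Administration.SPWebApplication]::Lookup([Uri]$webAppUrl)
foreach ($site in $webApp.Sites)
{
    foreach ($web in $site.AllWebs)
    {
        # The Nintex history list is hidden, so look it up by title
        $list = $web.Lists | Where-Object { $_.Title -eq "NintexWorkflowHistory" }
        if ($list -ne $null)
        {
            "$($web.Url) * $($list.ItemCount)" | Out-File $outputFile -Append
        }
        $web.Dispose()
    }
    $site.Dispose()
}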

Wednesday, December 15, 2010

The “Soft” part of SharePoint - Part 4, DTAP

What is DTAP?
DTAP is a strategy that is often used in software development projects. It stands for "Development, Test, Acceptance, Production" (http://en.wikipedia.org/wiki/Development,_testing,_acceptance_and_production). Software is developed in a development environment, then technically tested in the test environment; user acceptance tests are performed in the acceptance environment, after which the software is transferred to the production environment.

SharePoint is a product that can function as a platform on which companies build custom solutions. Even though SharePoint is built on the .NET Framework, it has its own rules and boundaries. Poorly developed code can easily affect SharePoint and cause it to fail or seriously impact performance. From an administration standpoint it is very important to evaluate solutions before deploying them to the production environment (see the previous article in this series). A DTAP strategy will assist you in this evaluation process.

A DTAP strategy is not only useful for the deployment of solutions. The implementation of changes or of SharePoint updates/Service Packs can also benefit from it: they can be tested first, before being implemented on the production environment.

Environments: Purpose and permissions
Development
With SharePoint you can have multiple kinds of development environments. Each developer can run their own local virtual environment, or you can use a centrally managed environment. Both options have their pros and cons.

Local development environment:
Purpose: Environment used for developing new SharePoint solutions. The environment is under total control of the developer.
Responsible: Developer
Server admin: Developer
Admin access: Developer
Pro : Flexible, can be used anywhere, total control over the environment, developers do not impact each other
Con : Developer has to maintain the environment (patches, etc.), the host needs sufficient resources, environment is used by one developer only

Central development environment
Purpose: Environment used for developing new SharePoint solutions. Multiple developers per environment (2 max), but they can impact each other.
Responsible: Developer
Server admin: TAM
Admin access: TAM & Developer
Pro : Centrally managed according to standards, multiple developers per environment, requires less licenses, no high end local hardware required
Con : Can only be used when connected to the network (direct or VPN), developers can impact each other

Test
Purpose: Environment used for technical testing of developed SharePoint solutions (checking for conflicts with other solutions) and their deployment instructions. Changes to the environment (e.g. a change of settings or the implementation of a patch or Service Pack) can also be tested on this environment.
Responsible: TAM
Server admin: TAM
Admin access: TAM
Other access: FAM and developers have admin access to SharePoint site collections and if required read access to administration pages

Acceptance
Purpose: Environment used for functional testing of developed SharePoint solutions (checking if the solution complies with the functional design). This environment should match your production environment (PRD) as closely as possible (setup and configuration wise). Only the content can be out of date.
Responsible: FAM
Server admin: TAM
Admin access: TAM
Other access: FAM has admin access to SharePoint site collections and if required read access to administration pages

Pre-Production environment
Purpose: Environment that is a very close mirror of the production environment: mirrored in solutions, content, architecture and infrastructure. Meant for testing the implementation against a production-like environment and the impact on production. Performance tests can also be done on this environment.
Responsible: TAM
Server admin: TAM
Admin access: TAM
Other access: None

Production environment
Purpose: Environment that is running the production solutions/content and is serving end-user requests.
Responsible: TAM
Server admin: TAM
Admin access: TAM
Other access: FAM has admin access to SharePoint site collections and if required read access to administration pages

Important!!
Make sure that, starting with the Test environment and onwards, all environments have the same layers of the topology. The environments don't have to be equal in number of servers, but if PRD has three layers (database, application, web front-end), Pre-Production, Acceptance and Test should have three layers as well. Back in 2006 I learned this the hard way (http://share-point.blogspot.com/2006/02/problems-with-search-functionality.html). Our acceptance environment consisted of two servers and production of three. Installation of Windows 2003 SP1 worked just fine on acceptance, but it broke search on production. It took me two weeks to find the issue and another two weeks to clean up the mess I created during troubleshooting :-)

Explanation:
Purpose : Describes the purpose of the environment.
Responsible : Who is functionally responsible for the environment.
Server admin : Who is the administrating party of the environment and has to make sure that the environment is patched, secure, backed up, etc.
Admin access : Who has administrative access to the server.
Other access : Which other access is granted to which parties.
TAM : Technical Application Management
FAM : Functional Application Management

Tuesday, December 14, 2010

[SP2010] Site templates in SharePoint 2010

In SharePoint 2007 it was possible to create a site template (stp file) of a site and use that template to create new sites. When you downloaded that stp file and added it to the global template gallery using the stsadm operation addtemplate, you were able to create new site collections based on this template.

With SharePoint 2010, this mechanism has changed a bit. Site templates are no longer stp files; when you create a template of a site, SharePoint creates a sandboxed solution (wsp) which is placed in the site collection's solutions gallery.

A possible solution
On his site Todd Klindt explains how to use these solutions to create new site collections based on this template.

Unfortunately this method has the downside that it requires manual actions each time you would like to use the template.

An alternative
Another option is to add the solution to the farm solution store. This has the advantage that you don't have to upload the solution each time. The downside is that when adding the solution, the template does not become available in the template selection, but is added as a feature to all site collections. During site collection creation you therefore still need to activate this feature before you can use the template.

The solution
Then how do you solve this issue farm wide? Simply by making a small change to the solution. The template feature in the solution is scoped to the site collection by default and therefore uploads the template to the site collection template gallery. If you change the scope of the feature to Farm and then activate that farm feature, the template is globally deployed and available for selection during site collection creation!
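
For completeness, a hedged sketch of the deployment steps once the feature scope in the wsp has been changed to Farm (the file name and feature name are placeholders for your own template):

Add-SPSolution -LiteralPath "C:\Templates\MyTeamSiteTemplate.wsp"
Install-SPSolution -Identity "MyTeamSiteTemplate.wsp"
# Activate the now farm-scoped feature once; no -Url is needed for a Farm feature
Enable-SPFeature -Identity "MyTeamSiteTemplateFeature"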

Friday, November 12, 2010

The “Soft” part of SharePoint - Part 3, Solution intake process

Where developers should have a standard SharePoint development process that describes the development best practices in your environment, every SharePoint administrator should have a solution intake process.

Poorly implemented custom solutions can introduce security or performance risks, increase the cost of support, complicate deployment, and reduce productivity. Over the past years I have seen that developers do not always know how to develop good solutions for SharePoint. Even though SharePoint is built on .NET, developing for SharePoint is a totally different discipline than developing for plain .NET. More often than not I have seen developers deliver code that was seriously affecting the environment, not using the SharePoint solution deployment framework, or very poorly documented.

In order to guarantee quality, it is very important to create a solution intake process. The process verifies that certain best practices have been followed and that the solution is safe to deploy to the environment:

  • Create a code acceptance checklist for the developer to fill out.
    • This checklist forces developers to sign off their solutions against a list of best practices.
  • Check the code using SPDisposeCheck.
    • This tool checks the custom code for memory leaks and proper use of disposable objects.
  • Check deployment documentation.
    • Check if the deployment documentation is correct and contains the required information.
  • Check solution package
    • Check if the solution package is created properly, using the Solution Deployment framework technologies.
If you want a good example of a solution intake process, Microsoft has released some documents for their SharePoint Online cloud service. Their intake process is really strict and requires the developer to design, document and test the solution before handing it over to Microsoft.

More information:
Check list example: http://technet.microsoft.com/en-us/library/cc707802.aspx
SPDisposeCheck: http://code.msdn.microsoft.com/SPDisposeCheck
Using Disposable Object: http://msdn2.microsoft.com/en-us/library/aa973248.aspx
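
As an illustration of what SPDisposeCheck is looking for: SPSite and SPWeb objects that you create yourself have to be disposed. A minimal PowerShell example (the URL is just an example):

[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null

# Objects created via the SPSite constructor or OpenWeb() must be disposed explicitly
$site = New-Object Microsoft.SharePoint.SPSite("http://examples.sharepoint.com/sites/demo")
$web = $site.OpenWeb()
Write-Host $web.Title
$web.Dispose()
$site.Dispose()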

Tuesday, November 09, 2010

[SP2010] Issue migrate Classic to Claims authentication

[SITUATION]
Currently I am working at a customer where we have to migrate SharePoint 2007 data to a new SharePoint 2010 environment. The security ACLs on the SharePoint 2007 data are registered in the old SharePoint 2007 way (called Classic mode in SharePoint 2010). In order to use claims, these ACLs need to be converted into claims ACLs.

[ISSUE]
On the TechNet site I discovered this article describing how to do this. Unfortunately, when performing these steps (running a PowerShell script), it did not work. As with many issues, SharePoint doesn't give any clue about what might be wrong :-(

[SOLUTION]
After some troubleshooting I remembered an issue I had back in MOSS2007. There I tried to perform an activity on a site collection, which did not work. It turned out that I did not have permissions on the site collection. To solve that, I granted myself "Full Control" permissions via the "Policy for Web Applications" page, after which the activity worked fine.

To test this theory, I tried the following steps:
  1. Create a new web application
  2. Create a new site collection
  3. Add some data and set unique permissions
  4. Grant my admin account Full Control permissions for the web application
  5. Run the PowerShell script to migrate the web application to Claims
And lo and behold: the script now ran just fine and migrated all Classic ACLs to claims ACLs!!
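
For reference, the TechNet approach combined with the permission fix above boils down to something like this (a hedged sketch, not the exact TechNet script; the URL and account are placeholders and the Full Control policy can just as well be set via the "Policy for Web Applications" page):

$webAppUrl = "http://examples.sharepoint.com"    # web application to migrate
$account   = "DOMAIN\spadmin"                    # account performing the migration

# Switch the web application to claims authentication
$wa = Get-SPWebApplication $webAppUrl
$wa.UseClaimsAuthentication = $true
$wa.Update()

# Grant the migration account Full Control via the web application policy
$claim = (New-SPClaimsPrincipal -Identity $account -IdentityType WindowsSamAccountName).ToEncodedString()
$policy = $wa.Policies.Add($claim, "Claims migration account")
$policy.PolicyRoleBindings.Add($wa.PolicyRoles.GetSpecialRole([Microsoft.SharePoint.Administration.SPPolicyRoleType]::FullControl))
$wa.Update()

# Convert the existing Classic ACLs to Claims ACLs
$wa.MigrateUsers($true)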

[NOTE]
Microsoft has confirmed that there are some issues with the Classic to Claims migration. According to them, Service Pack 1 will include a tool that should be able to migrate Classic to Claims successfully. So either test your migration thoroughly or wait for SP1 (expected end of Q2/beginning of Q3)!!

Thursday, October 28, 2010

The “Soft” part of SharePoint - Part 2, Impact of design choices

During every SharePoint design phase, choices have to be made on exactly how to implement SharePoint and its components. These choices can have serious implications later on, during the administration phase.

To be able to create a good design, it is imperative to have good requirements for the environment. These requirements must be gathered both at the business end and at the IT end, as both have to "use" the platform in the future, although their use will be completely different.

For example:
  1. Where the business has certain availability requirements (e.g. 99.9%, 24x7), IT has the requirement that this availability must be achieved using redundancy. The final design has to be an environment design that all parties can live with.
  2. In order to deliver a good service, IT has to be able to test each change to the environment. To do this, they will require testing facilities in the form of DTAP environments. In most cases, the business will experience the DTAP strategy as annoying and time consuming.

It is very important that all choices are documented in a design document, especially when the persons performing the implementation are not the actual administrators of the environment. The design document has to be reviewed and approved by the future administrators before implementation starts. If during implementation a deviation from the original design has to be made, this deviation has to be agreed between both parties.

Basically: If the implementation project messes up (for whatever reason), the future administrators will suffer the consequences!!

Very good examples of critical design choices with high impact are:

  • Use of a DTAP environment* - During a project, implementing a DTAP strategy requires time and money, two things a project very often doesn't have a lot of. Skipping the implementation of the DTA environments saves the project time and money. However, this choice will seriously impact the administrators' ability to test changes before implementation (patches/service packs, new solutions or configuration changes).
  • Redundancy in the environment - Implementing redundancy requires extra hardware and therefore extra costs for hardware, software licenses and installation. If the requirements do not include redundancy, it will not be designed and implemented. Unfortunately, adding redundancy later requires a lot of work and has a high impact, especially for SQL Server, where adding redundancy (clustering) means a complete reinstallation of the entire SQL environment.
  • Sizing - I have been part of a project where the available storage was limited. We only had a certain amount of storage available, and that was it. The disk space for all servers was just enough to contain all data. Once the environment was transferred to the administrators, one of the first things they had to do was add extra disk space and move databases across these extra disks.

The message of this story: most of the items mentioned might sound obvious, but unfortunately I have seen a lot of situations where this turned out to be harder than you would think or like.

If you are part of the implementation project, make sure that you involve the future administrators as soon as possible. If you are the future administrator, make sure you get involved as soon as possible. Each design decision has to be approved by both parties! So check and communicate between both parties early and continuously.

A very good input for IT requirements is a set of acceptance criteria: a list of criteria with which the project has to comply. That way you can save yourself a huge amount of time, effort and stress! So make sure you create these!

*More on DTAP will follow in a later post

Monday, October 18, 2010

The “Soft” part of SharePoint - Part 1: Skills

With products like Exchange, only IT pros are involved: the people who implement or administer the environment are often the same people that manage the Windows operating system. At least they have sufficient (infrastructure) knowledge to cover all bases.

With SharePoint, this is a whole different ball game. As SharePoint is built on top of a lot of Microsoft infrastructure products, good infrastructure knowledge is very important (Windows, DNS, Active Directory, ISA Server/TMG, etc). Unfortunately, more often than not, SharePoint knowledge is very limited among the people who are experts in those technologies. SharePoint developers, on the other hand, have very limited infrastructure knowledge, so they have a hard time doing a good job from a performance/security perspective. In other words, implementing or managing SharePoint properly can be quite a challenge.

Then what do you need?
To implement or maintain SharePoint in a proper way, you will need various kinds of skills. Most of the time these skills are not found in one person, and therefore a team needs to be created.

The required skills are:


  1. Server - Hardware and operating system
    This is the most common discipline and the knowledge that is most widely available. Just like with other products, basic server management is required. This includes server patching, monitoring, backup/restore, antivirus management, troubleshooting, etc.

  2. Database - SQL server
    SharePoint is built on SQL Server databases. Without proper administration, these databases can potentially cause issues. However, SharePoint databases cannot be compared to regular SQL databases: they are very sensitive to which actions you can and cannot perform on them. Microsoft has released guidance (a whitepaper and KB articles) for database administrators on how to manage SharePoint SQL databases.

  3. Technical SharePoint
    SharePoint Technical Application Management (TAM) is the implementation and management of the infrastructure side of SharePoint. The full SharePoint configuration is the responsibility of TAM; therefore they are the only ones allowed to change that configuration. All TAM activities are done by logging onto the servers or via Central Administration. The most common TAM activities are: application monitoring, troubleshooting application issues, application backup/restore, deploying SharePoint solutions, configuring SharePoint, creating web applications, SharePoint antivirus, etc.

  4. Functional SharePoint
    SharePoint Functional Application Management (FAM) is the implementation and management of SharePoint from an end-user perspective: determining which functionality the users need, what the logical structure is going to look like and how certain SharePoint components need to be configured. Tasks of FAM would be: assisting end-users, setting up sites, site structure, content types, determining the desired configuration of SharePoint components like search/user profiles, etc. In an ideal world these activities do not require any direct access to the SharePoint servers; everything can be done via a web browser.

  5. SharePoint Custom Development
    SharePoint is a very extensible platform. If the default out-of-the-box functionality is not sufficient for end-users, extra functionality can be custom developed. All customizations are built using the .NET Framework; however, pure .NET developers are not automatically SharePoint developers. SharePoint has its own set of development rules and best practices. SharePoint developers use Visual Studio to develop their custom code and package it in a SharePoint solution package. This package is then delivered to TAM, who deploy it to the environment.

The technical and functional application management skills can be a grey area, as they can have some overlap. An example is the creation of Managed Properties:

  • Specifying managed properties is done by FAM, creating and configuring the managed properties is done by TAM.
  • Search management is the full responsibility of FAM

Each of the above skills can be covered by dedicated person(s), but also by a combination of a SharePoint technical person (skills: TAM/FAM) and a SharePoint developer (skills: FAM/Development). This totally depends on your organization. In a large organization, a split might be a good option to have a clear separation of duties.

The team that implements or administers SharePoint must have at least the first four skills available. The skill "SharePoint Custom Development" is very useful (creating visual designs, creating tools, etc), but not required.

Friday, October 15, 2010

The “Soft” part of SharePoint - Introduction

Over the past years I have done many SharePoint (2003 and 2007) implementations and supported existing implementations. In each of these environments I have run into the same types of issues. I have decided to capture all of this knowledge in a series of blog posts. In the next few weeks, I will post them here, starting with "Skills". So stay tuned!!

Thursday, October 07, 2010

[MOSS2007/WSSv3] User profiles - Updates and deletions

The how, what and why with user profiles
SharePoint is made up of two separate products which are very tightly integrated: Windows SharePoint Services v3 (WSSv3) and Microsoft Office SharePoint Server 2007 (MOSS2007). Because WSSv3 can also be implemented by itself, Microsoft has implemented WSS user profiles in order to keep track of user information in a WSSv3 environment. Each site collection has its own User Information List with profile information for every user that has ever logged onto the site collection.

When implementing MOSS2007, extra user profile functionality is implemented on top of WSSv3, but the WSSv3 profiles still exist in the site collections. To keep all user profile information synchronized across both the MOSS profiles and the WSS profiles, Microsoft has implemented a User Profile Synchronization mechanism which synchronizes all MOSS profile information to all WSS user profiles. Because MOSS profiles can be synchronized with AD, changes in AD are synchronized to all user profiles using this mechanism.

More information on user profiles can be found here (in Dutch).

How are deletions of AD accounts handled
When a full import from AD is done and a user that has a MOSS profile is not present in the import, this account is marked "Missing from Import". When the user is missing from the import for three consecutive imports, SharePoint considers the account deleted and deletes it from the MOSS profiles. Unfortunately these deletions are NOT replicated to the WSS profiles, so the WSS profiles of those users remain in the system.

Ok, what does that mean
I found out that when you are trying to grant a user permissions somewhere in a site collection, the SharePoint People Picker checks both the User Information List and Active Directory. If a user has been deleted from AD, but still exists in the User Information List (still has a WSS profile), this user is returned in the results. This means that you can still see users that left the company ages ago, which is very confusing for site collection administrators:

Actual real life situation:
E.g. John Doe has left the company, I can’t see him anymore in the Outlook GAL, but I can still grant him permissions in my site collection.

Crap, how do I solve this
This can be solved by cleaning up your WSS profiles regularly. Unfortunately, by default this is a manual process. If you only have a few site collections and a few users, this is quite easy. But if you have many site collections and many users, this is a major challenge!

In my environment we created a custom tool for this issue:

  • The tool checks the MOSS profiles against the WSS profiles and creates a delta file: "which WSS profiles do not have a MOSS profile"
  • We compare that output (automated) with AD, because we do not import all accounts (admin accounts, for example)
  • The output is then fed into another tool, which deletes these accounts
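
The deletion step itself is simple; a minimal sketch of removing a single stale WSS profile (the site URL and account are placeholders, and stsadm also offers a deleteuser operation for the same purpose):

[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null

$site = New-Object Microsoft.SharePoint.SPSite("http://examples.sharepoint.com/sites/demo")
$web = $site.RootWeb
# Removing the user from the site collection also removes the entry from the User Information List
$web.SiteUsers.Remove("DOMAIN\johndoe")
$site.Dispose()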

And are there any things I need to pay attention to
Of course! Nothing comes without a price! Because the WSS profiles of these users are deleted, all items or documents that have been created or modified by those users become "orphaned": SharePoint is no longer able to display who created or changed the item or document. So be careful!

Another issue we encountered is with search. If permissions have been granted to individual users and these users are deleted, the crawl has to change these permissions in the index file. If your index is small and you have a limited amount of content, resetting the index and starting a new full crawl is the quickest way. In our case we ran an incremental crawl, which took three times as long as a full crawl usually needs. During the crawl it looked like the crawl process was stuck; this is because it was updating the index file, which just takes a very long time!

Tuesday, October 05, 2010

3rd edition of the Free DIWUG SharePoint magazine

The Dutch Information Worker User Group has released their 3rd edition of the Free DIWUG SharePoint Magazine. More information can be found at: http://www.diwug.nl/Pages/downloads.aspx

Available for download at:
http://www.diwug.nl/Downloads/DIWUG_SharePoint_eMagazine3.pdf

[MOSS2007] Cross domain Manager property

In Active Directory, it is possible to configure a manager for a specific user. SharePoint is able to use this information and show you a hierarchy on a user's MySite. Unfortunately this functionality has a limitation caused by Active Directory.

Active Directory issue
Within Active Directory it is only possible to select users (or contacts) that are in the same forest or domain as the manager. So if you have (like a customer of mine) a multi-forest, multi-domain environment with forest trusts between all forests, you cannot select a user from domain 1 in forest 1 as the manager of a user in domain 2 in forest 2.

Customer situation
At this customer, two tools are used to manage the data in AD:
1.) A replication engine that replicates users in one domain as contacts in the other domain. These contacts are used for Exchange in that domain.
2.) A self-service portal where users can configure their own data, including their manager. When a user changes his/her manager, this is changed in the domain that has been marked as the master domain. If the manager does not exist in that domain as a user, the contact in that domain is selected.

SharePoint behavior
This situation causes SharePoint to "generate" two different kinds of hierarchy trees: one for domain 1 and the other for domain 2. Because contacts are not treated as users in SharePoint (as they shouldn't be), when browsing through the hierarchy you never end up at the real manager account.

Then how to solve this issue
In order to work around the issue, we have modified the replication engine so that it performs the following steps:
  1. Check the manager property and retrieve the master account in the other domain
  2. Configure the master account name in a separate, unused AD property
  3. Configure SharePoint to import that AD property as the manager, in the format <domain>\<userid> (configurable in the property mapping settings)

This mechanism now works like a charm: managers are imported correctly and, more importantly, the hierarchy tree is displayed correctly!!

Thursday, July 08, 2010

[SP2010] Keep data in upgrade from RC to RTM

This week I wanted to upgrade our SharePoint 2010 RC environment to the RTM version. Unfortunately I found out that an in-place upgrade is not supported by Microsoft. A database detach and reattach to an RTM installation would not work either :-(

But what about our data and site structure?? It is not much, but like any other person in IT I am lazy by nature: I don't want to do anything unless it is necessary :-)

As a test I created a backup of the site collection using stsadm, reinstalled our environment with the RTM version of SharePoint 2010 and performed an stsadm restore. And it worked!

So if you want to move from RC to RTM but would like to keep your data, use stsadm backup/restore!!
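
For reference, the commands are as simple as this (URLs and paths are examples):

# On the RC farm:
stsadm -o backup -url http://sp2010/sites/demo -filename C:\Backup\demo.bak
# After reinstalling with the RTM bits and recreating the web application:
stsadm -o restore -url http://sp2010/sites/demo -filename C:\Backup\demo.bak -overwrite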

Monday, July 05, 2010

[MOSS/WSSv3] SharePoint 2007 build numbers (Updated until June '10 CU)

Here is a list of SharePoint 2007 build numbers, updated up to the June '10 Cumulative Update. Based on the build number you can determine which patch level your SharePoint environment is on:

12.0.0.6539 - MOSS 2007/WSS 3.0 June '10 Cumulative update
12.0.0.6535 - MOSS 2007/WSS 3.0 April '10 Cumulative update
12.0.0.6529 - MOSS 2007/WSS 3.0 February '10 Cumulative update
12.0.0.6524 - MOSS 2007/WSS 3.0 December '09 Cumulative update
12.0.0.6520 - MOSS 2007/WSS 3.0 October '09 Cumulative update
12.0.0.6514 - MOSS 2007/WSS 3.0 August '09 Cumulative update
12.0.0.6510 - MOSS 2007/WSS 3.0 June '09 Cumulative update
12.0.0.6504 - MOSS 2007/WSS 3.0 April '09 Cumulative update
12.0.0.6421 - MOSS 2007/WSS 3.0 SP2
12.0.0.6341 - MOSS 2007/WSS 3.0 February '09 Cumulative update
12.0.0.6335 - MOSS 2007/WSS 3.0 December '08 Cumulative update
12.0.0.6327 - MOSS 2007/WSS 3.0 August '08 Cumulative update
12.0.0.6318 - MOSS 2007/WSS 3.0 Infrastructure Update
12.0.0.6300 - MOSS 2007/WSS 3.0 post-SP1 hotfix
12.0.0.6219 - MOSS 2007/WSS 3.0 SP1
12.0.0.6039 - MOSS 2007/WSS 3.0 October '07 public update
12.0.0.6036 - MOSS 2007/WSS 3.0 August 24 '07 hotfix package
12.0.0.4518 - MOSS 2007/WSS 3.0 RTM

You can find the build number of your environment via:
Central Admin > Operations > Servers in Farm
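
Alternatively, you can read the build number from the object model with two lines of PowerShell (run on a farm server):

[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null
[Microsoft.SharePoint.Administration.SPFarm]::Local.BuildVersion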

Source

Sunday, April 18, 2010

SharePoint 2010 and Office 2010 officially RTM

The moment has finally come: SharePoint 2010 and Office 2010 are officially Released To Manufacturing (RTM).

More info: SharePoint blog

Friday, April 16, 2010

[MOSS2007/WSSv3] SharePoint 2007 and Large Lists

Over the past years a lot of information has become available on SharePoint 2007 and the use of large lists. Unfortunately it still happens a lot that large lists are created by users. What impact does this have and how can you limit the impact of large lists?

Microsoft recommends limiting views to a maximum of 2000 items. However, this is not the only thing that can impact large list performance. When sorting is used in a view, SharePoint has to sort all of the items before it can create the view. So even when the view only shows 2000 items, SharePoint needs to retrieve ALL items anyway.

Therefore the recommendation for large lists is not only to limit views to 2000 items, but also to prevent sorting as much as possible, especially in the default view, which is always used when a user accesses the list.
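
If you want to find out where such lists live, a sketch like the one below reports every list in a web application that exceeds the 2000-item guideline (hedged example, the URL is a placeholder):

[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null

$webApp = [Microsoft.SharePoint.Administration.SPWebApplication]::Lookup([Uri]"http://examples.sharepoint.com")
foreach ($site in $webApp.Sites)
{
    foreach ($web in $site.AllWebs)
    {
        foreach ($list in $web.Lists)
        {
            if ($list.ItemCount -gt 2000)
            {
                Write-Host "$($web.Url)/$($list.Title) - $($list.ItemCount) items"
            }
        }
        $web.Dispose()
    }
    $site.Dispose()
}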

Monday, March 22, 2010

[MOSS/WSSv3] ForeFront update issues

I just installed ForeFront for SharePoint on a customer environment. Three of the scan engines were refusing to update. A search on the Internet resulted in the following article:

ForeFront: Action required by Dec. 1, 2009: Keep protections current

Wednesday, March 17, 2010

[MOSS/WSSv3] Site Settings not visible, even when I am a site collection admin

[ISSUE]
Last week, I had to perform some restore activities for a site collection that had become corrupt. When I was done, users reported that their permissions were gone. After logging on I found out that I wasn't able to open the Site Settings: the Site Actions > Site Settings menu option was gone?!

[SOLUTION]
After some troubleshooting I found out that during the earlier steps the site collection had accidentally been set to a read-only lock. That is why I wasn't shown any admin screens. When the lock was lifted, everything started working again!
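
For the record, the lock status can be checked and cleared from the command line as well (the URL is an example):

stsadm -o getsitelock -url http://examples.sharepoint.com/sites/demo
stsadm -o setsitelock -url http://examples.sharepoint.com/sites/demo -lock none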

Monday, March 15, 2010

[MOSS] Unknown Error when opening AreaNavigationSettings.aspx

[ISSUE]
Last week one of my colleagues contacted me about an issue he encountered. When trying to open the navigation settings page (/_layouts/AreaNavigationSettings.aspx), he encountered an "Unknown Error" (don't you just love those nice and clear errors). When opening the default page, the Quick Launch navigation on that page was also displaying errors.

[TROUBLESHOOTING]
After setting CustomErrors to Off and enabling the call stack in the web.config, we were able to retrieve an "Object reference not set to an instance of an object" message. In the ULS logs, we found the following errors:
    • Unable to retrieve the CachedObject that this ProxySiteMapNode references: /sites/test/sites/183/Pages/testpage.aspx
    • PortalSiteMapProvider was unable to fetch children for node at URL: , message: Object reference not set to an instance of an object

[SOLUTION]
The Quick Launch navigation had been changed manually and navigation items had been added. Looking a little further, I found that testpage.aspx no longer existed in the Pages library. Probably a user had deleted this page, causing the issue. When I recreated testpage.aspx, the Quick Launch navigation started working again and the Navigation Settings page no longer showed the error.

Monday, March 08, 2010

Launch date SharePoint 2010 known!

Last Friday Microsoft announced the launch date of SharePoint 2010. On May 12th, Office 2010 and SharePoint 2010 will be officially released to the public. They will reach RTM somewhere in April.

[MOSS/WSSv3] Error while deploying a solution

[ISSUE]
Last week I encountered an issue while deploying a solution. I received the following error: "This operation uses the SharePoint Administration service (spadmin), which could not be contacted. If the service is stopped or disabled, start it and try the operation again."

[SITUATION]
The environment I was working on consisted of three servers: one web front-end, one application and one database server. The application server is running the Central Administration role. The above error was thrown on the web front-end server.

[CAUSE]
This issue was caused by a bug in the .NET Framework v2.0. After deploying the update that fixes it, the issue was solved and the deployment ran fine.

Thursday, March 04, 2010

SharePoint 2010 RC installation bug

The Release Candidate of SharePoint 2010 has an installation issue: when installing on a domain controller, you will end up with SharePoint 2010 installed in a Stand-Alone setup. This setup uses SQL Express 2008, which is not supported on a domain controller. Now I hear you say "why are you installing on a DC?", and you are right, you don't want SharePoint installed on a DC... in a production situation. In my case I am testing on a single VM, with all roles on one server, including the DC role.

Quote MS: "In the Release Candidate build, we recommend you do not to install on a Domain Controller. While the Setup User Interface does the correct thing by suppressing the Standalone option, after the installation is done and the RC is fully installed, it is actually a Standalone install and not a farm install. Although you are not technically blocked, the install will not work since SQL Express 2008 is not supported on a Domain Controller."

Because SQL Express 2008 is not supported on a DC, the selection page where you can choose between a Stand-Alone and Complete installation is removed when the installation detects you are installing on a DC. Unfortunately the Stand-Alone option is the default selection, causing you to end up with a Stand-Alone installation anyway, with no way to get around this via the GUI :-)

Solution:
The best way to install SharePoint 2010 RC on a domain controller is by using a scripted installation.

  1. Extract the installation files by executing <filename> /extract:<folder name>
  2. Open Windows Explorer and create a file called config.xml in the extraction folder
  3. Open the config.xml file and paste in the content below
  4. Change <PID KEY> to your own product key and adjust the install folder if needed
  5. Save the file
  6. Start a command prompt and browse to the folder
  7. Run "setup /config config.xml"
  8. SharePoint 2010 RC is now being installed as a Complete installation!

Have fun SharePointing!


Config.xml:

<Configuration>
  <Package Id="sts">
    <Setting Id="LAUNCHEDFROMSETUPSTS" Value="Yes"/>
  </Package>

  <Package Id="spswfe">
    <Setting Id="SETUPCALLED" Value="1"/>
    <Setting Id="OFFICESERVERPREMIUM" Value="1"/>
  </Package>

  <Logging Type="verbose" Path="%temp%" Template="SharePoint Server Setup(*).log"/>
  <PIDKEY Value="<PID KEY>"/>
  <Setting Id="SERVERROLE" Value="APPLICATION"/>
  <Setting Id="USINGUIINSTALLMODE" Value="0"/>
  <Setting Id="SETUP_REBOOT" Value="Never"/>
  <Setting Id="SETUPTYPE" Value="CLEAN_INSTALL"/>
  <INSTALLLOCATION Value="D:\Program Files\Microsoft SharePoint"/>
  <Display Level="Basic" CompletionNotice="Yes" AcceptEULA="Yes"/>
</Configuration>

Wednesday, March 03, 2010

My SharePoint 2010 article published in TechNet Magazine

Over the past weeks I have been working on a SharePoint 2010 article, which was published in this year's first edition of the Dutch TechNet Magazine. If you are interested, a scanned copy of the article is available below:

SharePoint 2010, de stap naar volwassenheid (in Dutch)

Friday, February 19, 2010

[MOSS/WSSv3] Error "The detection failed, this can be due to a corrupted installation database"

[ISSUE]
When trying to install a SharePoint Cumulative Update, I encountered the following error directly after the update detection:
The detection failed, this can be due to a corrupted installation database


[CAUSE]
The installation files in the C:\Windows\Installer folder were deleted. These files are required to install updates.

[BACKGROUND]
When an application or an update to an application is installed, the installation files are stored in the C:\Windows\Installer folder. When another update is installed, the update checks the Installer folder and retrieves install information and other required components from that folder. So never delete files in the C:\Windows\Installer folder... In our case someone unfortunately did, resulting in the error mentioned above :-(

[SOLUTION]
On the MS Forums I found the solution for this issue. Doug Chandler found a way to copy the missing files from a working server to the broken server and then run the Office Diagnostics tool to fix the issue. The only difference I encountered was that when I ran the Office Diagnostics tool, it required the installation packages of SharePoint and all of the language packs. By default it checks the same folders as used during installation, so if you have these files in another directory it is going to ask you for the specific folder for all packages (148 in my case).

Friday, January 29, 2010

Useful and Free SharePoint Tools (Part 3)

It has been a while since my previous "Useful and free SharePoint Tools" post. Over the past period I have run into some more brilliant tools and didn't want to keep them from you, so here we go again! :-)

Just for reference:
Part 1: Useful and Free SharePoint Tools (Part 1)
Part 2: Useful and Free SharePoint Tools (Part 2)

SharePoint ULS Log Viewer
Do you know that feeling: you have to troubleshoot an issue and dive into the ULS log. Soon you find out how difficult it is to find what you are looking for in these logs. When using Notepad, you can try to search on timestamp or error message, but filtering is out of the question. Fortunately a guy at Microsoft created a great tool that offers searching and filtering capabilities. Just open a ULS log file and go crazy. Life suddenly becomes a lot easier :-)
Link: http://ulsviewer.codeplex.com/

Zevenseas SharePoint Search Coder
This tool is useful for developers and IT pros. Using this tool you can generate search queries and send them to SharePoint, using either the object model or the web services. I have used this tool often to troubleshoot search issues I encountered. By entering the search query you can see exactly which data is returned and therefore determine whether the search engine is the issue, or some code (a web part for example) that processes the returned data.
Link: http://mosssearchcoder.codeplex.com/

SharePoint Feature Administration and Clean Up Tool
Ever seen the error "failed to determine definition for feature with id <GUID>" in your ULS log? This means that a feature is still registered as active somewhere in your environment, but is not installed anymore. Unfortunately there is no way to fix this issue using the GUI. The SharePoint Feature Administration and Clean Up tool has the answer. When you start this tool, it has a button "Find Faulty Feature" in the lower right part of the window. This functionality searches for feature registrations of features that no longer exist and, if found, it can delete these references. The tool can do other things with features, which I haven't needed so far, but I think the clean-up functionality is brilliant!
Link: http://featureadmin.codeplex.com/

SPTraceView
SPTraceView is a tool that connects to the tracing service of SharePoint and shows you all logged messages in real time. You can set filtering so only certain messages are displayed. The tool has an icon in the system tray and pops up a balloon when a message comes in. By clicking the balloon, a window with all captured messages is displayed. Really useful when troubleshooting!
The tool only has one "issue": it cannot display any messages more verbose than the level configured in SharePoint Central Administration. For example, if you set the General category to High, you won't see Medium or Verbose messages in SPTraceView (or the ULS log for that matter). This is inherent to the way the tracing log is built.
Link: http://sptraceview.codeplex.com/

SPSFarmReport
Ever wanted to know at a high level how your farm is configured? This tool creates a high-level report of your farm configuration. I have been looking for such a tool for quite a long time. Even though this tool creates a nice report, I would like to see a little more detail in a next version. But until then... another useful and free tool :-)
Link: http://spsfarmreport.codeplex.com/

PowerShell
The last one of this post isn't a tool by itself, but I have to say: PowerShell rules!! I am not a developer, so creating applications/web parts/etc. is not for me, but which IT pro isn't a bit lazy by nature :-) Why do something manually if you can script it? With other scripting languages you couldn't use .NET objects, so you were very limited when it comes to SharePoint. But with PowerShell you can use every .NET object you like, including the SharePoint .NET objects! And with the addition of 650+ PowerShell cmdlets in SharePoint 2010, PowerShell becomes more and more important for the IT pro. So if you haven't done much with it yet: LEARN POWERSHELL!! You will love it and it will make your life a lot easier!
Link: http://technet.microsoft.com/en-us/scriptcenter/dd742419.aspx
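
To give a small (hedged) taste of what this looks like on a MOSS 2007/WSSv3 server, here is how you list all site collections in a web application straight from PowerShell (the URL is a placeholder):

[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null

$webApp = [Microsoft.SharePoint.Administration.SPWebApplication]::Lookup([Uri]"http://examples.sharepoint.com")
foreach ($site in $webApp.Sites)
{
    Write-Host $site.Url
    $site.Dispose()
}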

Thursday, January 28, 2010

[MOSS/WSSv3] Change the application pool of a web application

[ISSUE]
How do I change the application pool of a web application?

[BACKGROUND]
Recently I ran into a situation where Microsoft had performed a MOSS Risk Assessment Program (MOSSRAP). One of Microsoft's recommendations was to limit the number of application pools used in one environment. Depending on your hardware specs, this number will be around eight. However, the administrators had created about 50 web applications and configured a separate application pool for each one of them.

In order to adopt the Microsoft recommendation, the administrators went into IIS and changed the application pools manually. When they later tried to add a new server to the farm, this triggered a reset of the configuration on all servers, basically resetting the application pools to the configuration known by SharePoint (each web application in its own application pool). This brought down the entire environment for several hours.

[TECHNICAL DETAILS]
During installation/configuration of SharePoint, SharePoint stores all configuration of web applications and application pools. It does that so it is able to provision everything, with the correct settings, to any server that joins the farm. When you make manual changes in IIS, the settings in IIS no longer match the configuration known to SharePoint and you might end up in serious trouble.

[SOLUTION]
Unfortunately it is not possible to change the web application/application pool configuration via Central Administration. The only way is by using the SharePoint object model. On the Internet I found an article which describes how to use a PowerShell script to accomplish this configuration change:
Change application pools via PowerShell

This script did have one downside: it changed the configuration in SharePoint, but it did not change the configuration in IIS. To solve this, I found a method called ProvisionGlobally, added to the SPWebApplication class in one of the Cumulative Updates. By using this method, you provision that specific web application on all servers running the web front-end role. You can download the revised PowerShell script here.

After running this script, the old application pools are still present in SharePoint AND IIS. In order to delete them from IIS, you need to write a script to unprovision them. However, the SPApplicationPool class does not have an UnprovisionGlobally method, so you need to run that script on each of the WFE servers.
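
For reference, the core of the revised approach comes down to something like this (a hedged sketch, not the downloadable script; the URLs are placeholders and ProvisionGlobally requires a Cumulative Update that contains it):

[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null

$targetWebApp = [Microsoft.SharePoint.Administration.SPWebApplication]::Lookup([Uri]"http://target.sharepoint.com")
$sourceWebApp = [Microsoft.SharePoint.Administration.SPWebApplication]::Lookup([Uri]"http://source.sharepoint.com")

# Reuse the application pool of the source web application
$targetWebApp.ApplicationPool = $sourceWebApp.ApplicationPool
$targetWebApp.Update()

# Push the changed configuration to IIS on all web front end servers
$targetWebApp.ProvisionGlobally()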

Wednesday, January 27, 2010

[MOSS/WSSv3] Audit Log and MergeContentDBS

[ISSUE]
The audit log is not migrated when using MergeContentDBS, and the old entries remain in the old database

[DESCRIPTION]
Recently I migrated a site collection from one database to another using the stsadm command MergeContentDBS. After this migration I found out that the audit entries, stored in the audit log table of the database, were still present in the old database. In total this was occupying about 20GB of space. In one of the previous updates of SharePoint, Microsoft introduced the stsadm command trimauditlog, but unfortunately this command cannot be used to trim entries in the old database if the site collection is no longer in there.

[WORKAROUND]
It would be possible to manually delete the entries in the audit log table, however you will end up with an unsupported environment after manually changing SharePoint databases.

The only workaround I currently have is to move all remaining site collections out to other databases and then delete the old database.

Tuesday, January 26, 2010

[MOSS/WSSv3] How to determine which server is used in Load Balancing configuration

[ISSUE]
When you are using multiple SharePoint web front-end servers and load balancing (Windows NLB or otherwise), it is very hard to troubleshoot issues. Users hit one of the WFE servers and you have to figure out which one. SharePoint does not offer a quick and easy way to do this for you.

[SOLUTION]
In order to solve this issue I have implemented a very simple, but highly effective solution. I just create an HTML file called server.htm in the Layouts folder (C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\TEMPLATE\LAYOUTS). The file contains nothing more than:

<HTML>
<HEAD>
<TITLE>---SERVER NAME--- </TITLE>
</HEAD>
<BODY bgcolor=yellow>
<H1>---SERVER NAME---</H1>
</BODY>
</HTML>

On each server I replace the text "---SERVER NAME---" with the actual server name and give the bgcolor a different color (for example "green" or "red"). If a user then experiences issues, you can simply have the user append "/_layouts/server.htm" to the site URL to show which server he/she is on.

Like I said, very easy and highly effective! :-)

Tuesday, January 05, 2010

[MOSS2007] Crawling schedule alternative

[ISSUE]
On a customer environment, we were having some issues with the default search scheduler. On this environment we currently have more than 4.8 million items in the index and migrations are still happening. This means that incremental crawls run for several hours and sometimes even more than 24 hours.

By default, the schedule allows a crawl at most once a day (so not once every two days), and according to a Microsoft engineer it is not advised to start a crawl while a previous crawl is still running; somehow that can result in a corrupt SSP. Simply configuring a schedule that runs every 15 minutes is therefore not an option.

[SOLUTION]
To solve this issue, I have created a PowerShell script. This script checks if a crawl is running and if not, starts a new incremental crawl. I have scheduled this script to run every 15 minutes.
# Load the MOSS 2007 object model and search administration assemblies
[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | out-null
[System.Reflection.Assembly]::Load("Microsoft.Office.Server, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | out-null
[System.Reflection.Assembly]::Load("Microsoft.Office.Server.Search, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | out-null

# Get the search context of the default SSP
$serverContext = [Microsoft.Office.Server.ServerContext]::Default
$context = [Microsoft.Office.Server.Search.Administration.SearchContext]::GetContext($serverContext)

# Retrieve all content sources of this SSP
$sspcontent = new-object Microsoft.Office.Server.Search.Administration.Content($context)
$sspContentSources = $sspcontent.ContentSources

foreach ($cs in $sspContentSources)
{
  if ($cs.Name -eq "Local Office SharePoint Server sites")
  {
    Write-Host "NAME: ", $cs.Name, " - ", $cs.CrawlStatus
    if ($cs.CrawlStatus -eq [Microsoft.Office.Server.Search.Administration.CrawlStatus]::Idle)
    {
      Write-Host "Starting Incremental crawl"
      $cs.StartIncrementalCrawl();
    }
    else
    {
        Write-Host "Crawl running"
    }
  }
}