Full Control Rights User (Owners Group) cannot create sub sites

Sharing a live issue from one of our SharePoint 2013 applications.

A user who is part of the Owners group on a SharePoint site (and therefore has Full Control) is not able to create subsites. When they try, they receive the error "Sorry, you don't have access to this page" or "Access Denied".

If the user is added as a Site Collection Administrator, they are able to create the subsite successfully.

Analysis:

Looking at the ULS logs, the following error appears:

SPRequest.UpdateField: UserPrincipalName=i:0).w|s-1-5-21-4218016322-16051, AppPrincipalName= ,bstrUrl=http://projects-wst.***.com/****/collab/pt/rfq/ ,bstrListName={E8F681E0-C8AB-4454-9C52-376AADCB7112} ,bstrXML=<Field Type="TaxonomyFieldTypeMulti" DisplayName="HashTags" StaticName="HashTags" Name="HashTags" ID="{333b1bc2-0532-4872-96f1-bbbdead35a56}" Description="****" SourceID="{2e9ba01e-b042-49ce-8993-e16635268252}" List="{c442bc26-f509-4ec7-9da0-a8fe0234924e}"

Looking up the list GUID mentioned in the error, it turns out to be the SharePoint Taxonomy Hidden List. This list can be reached through the UI by appending Lists/TaxonomyHiddenList to the site URL.


Cause:

This hidden list should have the Full Control (Owners) group in its permissions. However, no user or group was present in the list permissions, and the list was not inheriting security from its parent.

Administrators may have stripped permissions from these hidden lists with a PowerShell command while cleaning up permissions on the site collection in staging, preparing it for migration to the production server.

As a result, inheritance was broken (unique permissions had been applied on the list).

Solution:

Simply re-inheriting permissions from the parent site restored the Owners group on the list.

After that, when Owners group users tried creating a subsite, it was created successfully.
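
For reference, a minimal PowerShell sketch of the fix, run from the SharePoint 2013 Management Shell; the site URL is a placeholder, since the real URL is masked in the log above:

# Minimal sketch: restore permission inheritance on the Taxonomy Hidden List.
# The site URL below is a placeholder.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web  = Get-SPWeb "http://sharepoint/sites/demo"
$list = $web.Lists["TaxonomyHiddenList"]

if ($list.HasUniqueRoleAssignments) {
    # Re-inherit permissions from the parent web; this brings the Owners group back onto the list.
    $list.ResetRoleInheritance()
}

$web.Dispose()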

 


Duplicate a SharePoint 2013 site collection

At times users on the PROD environment request a duplicate site with the same content, permissions, groups, etc.


There are many approaches to handle this request, such as backup/restore or third-party tools like Metalogix.

However, there is a new cmdlet in SharePoint 2013 that creates a new site collection with exactly the same content and permissions.

Copy-SPSite "OldSiteCollectionUrl" -TargetUrl "NewSiteCollectionUrl"

This is a very quick and easy way to get a duplicate site within a few minutes, without much effort.
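
A hedged example with the optional DestinationDatabase parameter (see the constraints below); the URLs and database name are placeholders:

# Minimal sketch: copy a site collection to a new URL, placing the copy in a specific content database.
# The URLs and database name are placeholders.
Copy-SPSite "http://sharepoint/sites/source" -TargetUrl "http://sharepoint/sites/source-copy" -DestinationDatabase "WSS_Content_Copy"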

However, note the following constraints when using this command:
- Source and destination site collections should be in the same web application. If the web application is different, the additional DestinationDatabase parameter can be supplied.
- Applicable to SharePoint 2013 on-premises only.

SharePoint 2013 Search not displaying all results

This is a live project issue that came up today, so I thought of sharing it with everyone!


If we have a search-based application in SharePoint 2013, we can retrieve search results using either a Search Results web part or a Content by Search web part.

Problem Statement:

If we configure both web parts with the same query and result source, we should receive the same result set and count.

However, the Content by Search web part returned all results as expected, while the Search Results web part returned only a few of them. It is definitely not a crawl problem, as the data is present on the site and is being displayed by the other web part.

So what could be the issue?

Conclusion & Solution:

In earlier versions of the Search Results web part, the user had a "Trim Duplicates" option, which is no longer present in the SharePoint 2013 Search Results web part's properties.

There is still a way to set the trim duplicates property to false. Sharing the steps below; a small PowerShell sketch that automates them follows the list:

  1. Export the Search Results Web Part from your page.
  2. Open the .webpart file in your favorite editor.
  3. Search for "TrimDuplicates"; you will find it as part of the DataProviderJSON property.
  4. Set TrimDuplicates to false.
  5. Upload the web part.
  6. Add the web part to your page.
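
The manual edit in steps 2-5 can also be scripted. A minimal PowerShell sketch, assuming the exported file is saved at a placeholder path:

# Minimal sketch: set TrimDuplicates to false inside an exported .webpart file.
# The path below is a placeholder for wherever you exported the Search Results web part.
$path = "C:\Temp\SearchResults.webpart"

[xml]$webpart = Get-Content $path -Raw

# Locate the DataProviderJSON property and parse its JSON payload.
$node = $webpart.SelectSingleNode("//*[local-name()='property'][@name='DataProviderJSON']")
$json = $node.InnerText | ConvertFrom-Json

# Disable duplicate trimming and write the JSON back into the web part XML.
$json.TrimDuplicates = $false
$node.InnerText = ($json | ConvertTo-Json -Compress -Depth 10)

$webpart.Save($path)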


SharePoint excludes content that it considers similar. Even if the pages have unique names and the content is not identical, if some of the content is similar, search will collapse them in the result set.

Once the trim duplicates option is set to false, you will get the results you expect, and the result set from a Content by Search web part will match the result set from a Search Results web part.

Setting up Provider Hosted Apps environment for SharePoint 2013

Hi All,

As we all know, setting up a provider-hosted app in a SharePoint 2013 environment can be a pain at times.

We all have come across various issues while setting this up.

Sharing a self-created and tested document with a step-by-step approach to setting this up and creating a provider-hosted app.

Any suggestions, thoughts or comments are appreciated!!!


Click here to go through the complete step-by-step details to get this implemented:

SettingUp_ProviderHostedApp

Cheers !!

SharePoint Crawl DB size increasing @20GB per day

Recently, one of my SharePoint farms faced an issue where the volume utilization of the DB server (hosting the SharePoint crawl database) went up to 99% within 10 days.

Analysis:

- Looking at the crawl log in SharePoint, the incremental crawl was running every 1.5 hours with 0 successes and 100% failures.

- There were about 961K errors with each incremental crawl (scheduled every 2 hours).

- The crawler uses a temporary folder to place files on the server. The path is C:\Users\"SPAdminAccount"\AppData\Local\Temp\gthrsvc_OSearch14\

- I found that this temporary folder had somehow gone missing or been deleted.

Solution:

After manually creating the above-mentioned folder, the crawl success rate went up and there were very few failures (as expected).

After this, the crawl database grew by only about 10 MB a day, which is expected during crawls.
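
For reference, a minimal PowerShell sketch to recreate the folder if it is missing; the account name in the path is a placeholder for the crawl account's profile:

# Minimal sketch: recreate the crawler's temp folder if it has been deleted.
# The account name in the path is a placeholder; adjust it for your farm's crawl account.
$crawlTemp = "C:\Users\SPAdminAccount\AppData\Local\Temp\gthrsvc_OSearch14"

if (-not (Test-Path $crawlTemp)) {
    New-Item -ItemType Directory -Path $crawlTemp | Out-Null
    Write-Host "Created $crawlTemp"
}
else {
    Write-Host "$crawlTemp already exists"
}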

CAML Designer 2013

For any SharePoint developer, a CAML query builder has always been a frequently used tool when developing custom SharePoint solutions.

With SharePoint 2013 CAML Designer:

  • you can build CAML queries for single lists
  • you can build queries that can be executed with SPSiteDataQuery
  • besides the pure CAML queries, you can also get code snippets for the server-side object model, the .NET client-side object model, the JavaScript client-side object model and, last but not least, code snippets for working with REST.
  • Autogenerate the actual CAML Query
  • Autogenerate Server OM code
  • Autogenerate CSOM .NET code
  • Autogenerate CSOM REST code
  • Autogenerate Web Service code
  • Autogenerate PowerShell code
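
As an illustration of the kind of output the tool produces, here is a minimal PowerShell sketch that runs a simple CAML query through the server object model; the site URL, list title, and field name are placeholders:

# Minimal sketch: run a CAML query (as built in CAML Designer) via the server object model.
# The site URL, list title, and field name are placeholders.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web  = Get-SPWeb "http://sharepoint/sites/demo"
$list = $web.Lists["Documents"]

$query = New-Object Microsoft.SharePoint.SPQuery
$query.Query = "<Where><Eq><FieldRef Name='Status' /><Value Type='Text'>Active</Value></Eq></Where>"

$list.GetItems($query) | ForEach-Object { $_.Title }

$web.Dispose()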

Map SharePoint library to Network Drive

Sometimes it becomes necessary to map a SharePoint document library as a network drive. This may be a customer requirement, or a developer requirement for managing the SharePoint folder structure.

Sharing the steps below:

Step 1:
Note down the document library’s site path which needs to be mapped as a network drive.

Example: https://home.intranet.com/departments/finance

Step 2:
Open File Explorer, right-click, and select Add Network Location.




Step 3:
Click Next, select the option Choose a custom network location, and click Next.

Step 4:
Enter the SharePoint site URL and click Next.

Step 5:
You should be prompted for credentials to connect to the SharePoint site. Enter administrator credentials along with the domain name.
It will then ask for a name for this network location. Provide a custom name, e.g. Finance Site.

Press Next.

Step 6:
It should now open the mapped location, showing a number of folders. These folders represent the lists and libraries contained in this SharePoint site.
You should also be able to see the document library here.
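
The same mapping can also be done from the command line over WebDAV. A minimal sketch, assuming the example site from Step 1 and a hypothetical Shared Documents library (the WebClient service must be running on the machine):

# Minimal sketch: map the library over WebDAV instead of using the UI.
# Uses the example site from Step 1; "Shared Documents" is a hypothetical library name.
net use Z: "\\home.intranet.com@SSL\DavWWWRoot\departments\finance\Shared Documents" /persistent:yes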