February 3, 2016

SharePoint New-SPContentDatabase in VSTS Build throws error “String or binary data would be truncated”

Problem

Our nightly build had been running nicely until December 3rd, 2015, after which it started throwing this error:

2015-12-08T07:51:03.8157500Z ##[error]New-SPContentDatabase : String or binary data would be truncated.
2015-12-08T07:51:03.8157500Z ##[error]The statement has been terminated.
2015-12-08T07:51:03.8157500Z ##[error]At E:\Builds\Plaza for SharePoint\Scripts\50_CreateModuleSiteCollections.ps1:36 char:9
2015-12-08T07:51:03.8157500Z ##[error]+         New-SPContentDatabase -Name $ContentDatabase -WebApplication $WebApplica ...
2015-12-08T07:51:03.8157500Z ##[error]+         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2015-12-08T07:51:03.8157500Z ##[error]    + CategoryInfo          : InvalidData: (Microsoft.Share...ContentDatabase:SPCmdletNewContentDatabase) [New-SPConte
2015-12-08T07:51:03.8157500Z ##[error]   ntDatabase], SqlException
2015-12-08T07:51:03.8157500Z ##[error]    + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletNewContentDatabase

The error appeared when the PowerShell script performing the nightly environment reinstall attempted to create the SharePoint content database with a simple command:

New-SPContentDatabase -Name $ContentDatabase -WebApplication $WebApplicationName

The ULS logs contain entries like:

    System.Data.SqlClient.SqlException (0x80131904): String or binary data would be truncated.  The statement has been terminated.
    Unknown SQL Exception 8152 occurred. Additional error information from SQL Server is included below.  String or binary data would be truncated.  The statement has been terminated.
    Exception occured during acquiring a server lock. System.Data.SqlClient.SqlException (0x80131904): String or binary data would be truncated.  The statement has been terminated.

Thoughts

All recent Windows Updates were installed on both the SQL Server and SharePoint servers.

When I run the exact same script manually in a PowerShell window on the same server, it works fine, but when the script runs as part of the nightly build, it fails.

I do not see the initial CREATE DATABASE in SQL Profiler when running New-SPContentDatabase via the Visual Studio Team Services Agent (nightly build script). I see CREATE DATABASE just fine if I run the script manually on the same server.

The issue also occurs if I manually queue the build during the daytime.

There are no SQL Server or SharePoint maintenance jobs running at the time the script fails.

The Build Agent account is the same account I use when running the script manually.

Since everything else checked out, I was inclined to think the cause was one of these things, all of which occurred on December 3rd, just before the errors started to appear:

  1. The Build Agent was updated at that time (I cannot test with the previous version because the agent auto-updates; I have an SR open about this whole issue with VSO support)
  2. Running the Configuration Wizard (it was also run on November 30th, and no SharePoint updates were released via Windows Update between November 30th and December 3rd, so this shouldn't be directly related to any updates, although the wizard may still have made some changes)
  3. Windows Update KB3112336 (which I cannot seem to uninstall in order to test its effect)

So, I ended up opening a service request with VSTS.

After 1.5 months of sending logs to the friendly team at VSTS, I got the answer: “Based on investigations from our Product team it does appear there were code changes in how VSTS communicates with PowerShell which is breaking your builds. Development team will continue to investigate areas they may have to rework.”

Workaround

Now that I had confirmation that VSTS was the culprit, I came to think that perhaps good old STSADM would work. AND IT DID!

So, use STSADM -o addcontentdb instead of New-SPContentDatabase:

#this works
& 'C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\BIN\STSADM.EXE' -o addcontentdb -databasename My_ContentDB -url $WebApplicationURL

#this does NOT work
New-SPContentDatabase -Name My_ContentDB -WebApplication $WebApplicationName
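If your build script needs to detect failures, the STSADM call can be wrapped like this (a sketch using the same variables as above; note that STSADM signals success through its exit code rather than throwing an exception like a cmdlet would):

```powershell
# Workaround sketch: create the content database with STSADM instead of New-SPContentDatabase
$stsadm = 'C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\BIN\STSADM.EXE'
& $stsadm -o addcontentdb -databasename $ContentDatabase -url $WebApplicationURL

# Fail the build step explicitly if STSADM reports an error
if ($LASTEXITCODE -ne 0) {
    throw "STSADM addcontentdb failed with exit code $LASTEXITCODE"
}
```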

 

Solution

I will update this post once it is fixed in VSTS or the Agent and the workaround is no longer needed.

November 30, 2015

SharePoint Multi-Tenancy: Search returns no hits from tenant site

Problem

After setting up on-premises SharePoint 2013 in multi-tenancy mode and configuring all the Service Applications, everything else was working fine, but no search results were being returned from the tenant site collections.

I could see from SharePoint search logs that the site contents were crawled properly, but even a broad search query such as Size>0 didn’t return any hits.

Thoughts

I had to tweak the AutoSPInstaller scripts to create the various Service Applications using the -Partitioned or -PartitionMode parameter, and I had missed setting the -Partitioned parameter for the Search Service Application Proxy.

I got to the bottom of this by looking at the Properties of the Proxy using PowerShell like this:

PS C:\AutoSPInstaller> Get-SPEnterpriseSearchServiceApplicationProxy

DisplayName          TypeName             Id
-----------          --------             --                                 
Search Service Ap... Search Service Ap... c54d9470-714a-46ff-983c-33422d466a4c

PS C:\AutoSPInstaller> $sp = Get-SPEnterpriseSearchServiceApplicationProxy c54d9470-714a-46ff-983c-33422d466a4c

PS C:\AutoSPInstaller> $sp.Properties

DisplayName                    Value                       
----                           -----
Microsoft.Office.Server.Uti... UnPartitioned

Note the “UnPartitioned”; it should say “UniquePartitionPerSubscription”.

Solution

  1. Remove Search Service Application Proxy
  2. Create it using the -Partitioned parameter
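The two steps above can be sketched roughly as follows in PowerShell (a sketch; the display names are examples from my environment, so adjust them to match your own Search Service Application):

```powershell
# Step 1: remove the existing, non-partitioned proxy
$proxy = Get-SPEnterpriseSearchServiceApplicationProxy |
    Where-Object { $_.DisplayName -like "Search Service Application*" }
Remove-SPEnterpriseSearchServiceApplicationProxy -Identity $proxy -Confirm:$false

# Step 2: recreate the proxy in partitioned mode
$ssa = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchServiceApplicationProxy -Name "Search Service Application Proxy" `
    -SearchApplication $ssa -Partitioned
```

After recreating the proxy, $sp.Properties should show UniquePartitionPerSubscription as in the listing above.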

November 19, 2015

View Visual Studio Online build status on your Microsoft Band

I’ve been very happy with my Microsoft Band and eagerly waiting for the Band 2 to arrive. I’ve used my Band 24/7 now for 6 months, and it has worked quite well for fitness and sleep tracking. However, I’ve tried to figure out good uses for the Band at work.

Time flies, and for the past two years I've been concentrating on product development of a smashing Market Intelligence application for SharePoint. With that we've been doing Continuous Integration, in our case meaning nightly builds. Every night the SharePoint application is removed from the server, then fetched, rebuilt, and reinstalled using PowerShell scripts. Finally, some test cases are run against the fresh installation to catch possible issues that are not installation related.

Now, I can of course order email notifications of failed builds (those do occur, believe it or not), but email is old skool, so why not receive the build notifications somewhere else?

As you might have guessed, I use Microsoft Band for that. I have a nice Web Tile on Band showing the current status, but it will also notify me of the status when it changes.

How to do it

  1. Go to Band Web Tile wizard at https://developer.microsoftband.com/WebTile and click “Get Started”
  2. Select “Single page tile” and “Scrolling text wrap” and click Next
  3. Go to your Visual Studio Online (new name is Visual Studio Team Services) build definition, and General tab

  4. Select “Badge enabled”, and click the “Show url…”
  5. Copy the URL to the clipboard, paste it into step 2 of the Web Tile wizard, and click Next

  6. Drag and drop the text fields as you like (the important thing is to include the succeeded/failed text somewhere), then click Next
  7. Set up notifications as you wish (in my case it notifies me of failed builds only), then click Next
  8. On the next step, name your Web Tile, and upload a picture, then click Next
  9. Finally, download the generated .webtile to your computer, and upload it to your mobile device either via E-mail, or via OneDrive/Google Drive, etc.
  10. Open the .webtile on your mobile device, and it should start Microsoft Health app, and ask your confirmation to save it
  11. View your build status like a boss!


September 17, 2015

Yammer Inbox: Mark All as Read

Problem

Cannot “mark all as read” in Yammer inbox.

Workaround

Took Nate Haug’s script here and modified it slightly to work with the current version of Yammer. You need to execute the following script in the JavaScript console of Chrome, Firefox (with Firebug), or Safari:
  1. Press F12 to open developer tools
  2. Select Console
  3. Paste in the following code and press Enter (after the script has finished, restart your browser).
// Delay between actions; increase this if your connection is slow
var yamTime = 1000;
var goBack = function() {
  history.back();
  setTimeout(clickLink, yamTime);
};
var clickLink = function() {
  var $element = jQuery('.yj-inbox-list--messages li:first a');
  var $moreLink = jQuery('#moreButton button');
  if ($element.length) {
    // Open the first message (marking it read), then navigate back
    $element.click();
    setTimeout(goBack, yamTime);
  } else if ($moreLink.length) {
    // No messages visible; load more items and try again
    $moreLink.click();
    setTimeout(clickLink, yamTime);
  }
};
clickLink();


NOTE! If you have a slow connection, you might run into errors when the Inbox doesn’t have time to refresh properly and you get stuck in a loop on the same item; in that case, increase the yamTime value.

NOTE! You might need to run the script a few times, and click the “More items” link on the Yammer page to show the rest of the items.

Solution

No worries, the real solution is just around the corner.

June 9, 2015

SharePoint: Create Publishing page with specific Modified date using PowerShell

Problem

I had to import thousands of lines from a CSV file and turn them into SharePoint publishing pages. Overall it was easy with PowerShell, but whenever I called $item.File.Publish("") at the end of creating a new publishing page, the Modified date was set to the current date and time. I tried every variation and trick I could come up with until I found the solution.

Solution

You need to turn off minor versioning for the Pages list and only call CheckIn on the individual items. CheckIn doesn't touch the Modified date. After you've imported all the items, you can re-enable minor versioning on the list, and all items will remain published.

$ver = $host | select version
if ($ver.Version.Major -gt 1)  {$Host.Runspace.ThreadOptions = "ReuseThread"}
Add-PsSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
Import-Module WebAdministration -ErrorAction SilentlyContinue

$web = Get-SPWeb https://sharepoint.mydomain.com/m/nf/0
$pWeb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)
$list = $web.Lists["Pages"]

# Important: disable versioning first, so a plain CheckIn is enough to publish the page
$list.EnableMinorVersions = $false
$list.EnableVersioning = $false
$list.Update()

# Change this to the real page layout name
$PageLayout = $pWeb.GetAvailablePageLayouts() | Where-Object {$_.ServerRelativeUrl -eq "/m/nf/_catalogs/masterpage/myportal/article.aspx"}

# Import rows from the CSV file and create the pages - change the filename if needed
Import-Csv -Delimiter ";" -Path (Resolve-Path RawData.csv) | ForEach-Object {
   
    # Page title
    $PageTitle = $_.Title
   
    # Place to save news, filename format is i+ID.aspx
    $PageUrl = "i" + [string]$_.ID + ".aspx"
   
    $SourceDate = Get-Date $_.Date

    Write-Host "Creating $($PageUrl)"

    # Create blank page
    $page = $pWeb.AddPublishingPage($PageUrl,$PageLayout)
       
    # Sets listitem fields
    $item = $page.ListItem
    $item["Title"] = $PageTitle
    $item["Created"] = $SourceDate
    $item["Modified"] = $SourceDate

    # Update list items
    $item.UpdateOverwriteVersion()

    # Check in the page; with minor versioning disabled this publishes it without touching Modified
    $page.CheckIn("")
}

$list.EnableMinorVersions = $true
$list.EnableVersioning = $true
$list.Update()

May 20, 2015

Media Center Status Application goes open source

Windows Media Center Status Application I mainly developed during the early stages of Windows Vista (also supports 7 and 8) is now open source. It was originally developed for my own use, but I soon realized there were hundreds if not thousands of Media Center users out there who wanted to share their media viewing preferences in social media, such as Facebook and Twitter.

During the evolution of the application many nice features were added, such as getting notifications of tweets from people you follow inside Media Center. It didn't take long until someone requested a way to filter which items are published to the Facebook wall (as it was called back then), so I added a RegEx filter to keep specific media titles from being sent out if automatic publishing was enabled.

There were also ambitious monetization ideas: I added a feature that fetched media images (such as DVD and CD covers) from Amazon to show in the Facebook status update. I always knew the user base for the application wasn't huge, and I wasn't too surprised that I didn't end up getting a dime from Amazon referrals. But I did get a few hits, which proved it was working.

Otherwise monetization relied on donations, and I was surprised that I indeed got two donations totalling €25. That was €25 more than I was expecting, so thank you Marc Stride and Graham Evans. I'd like to think I bought beer with that money and enjoyed it on a sunny summer evening, but probably I just bought cement or boards for the house construction.

I got the idea for the application at one of Microsoft's MSDN conferences, where I decided not to attend the boring, directly work-related enterprise sessions and instead went off the beaten path. I learned a lot not only about Media Center but also about Robotics Studio. I spent the next few evenings at the hotel programming this application.

The biggest challenge was without doubt the ever-changing APIs of Facebook and Twitter. First the Facebook walls were gone and the Graph API was introduced, then Twitter made many changes to authentication. Media Center development, unfortunately, was the only area that didn't receive any major changes since Vista.

I had been considering open sourcing it for a long time, but just didn't get around to it. Better late than never.

You can find bits at https://mcsa.codeplex.com.

Compiled app, changelog and other details are still at http://www.jussipalo.com/fbmce.

March 4, 2015

Sync any folder to OneDrive without moving it to local OneDrive folder

Problem

I’ve been using a c:\work folder forever to store work-related semi-temporary files on my laptop’s hard drive. Although losing the contents of that folder due to a hard drive issue or similar wouldn’t cause much downtime in my work (all important files are stored in SharePoint Online), I still didn’t feel too relaxed about the possibility of losing even five minutes of work if that folder somehow disappeared.

A few times I’ve decided to move my c:\work into the OneDrive for Business folder in order to sync work-related files automatically to the business version of OneDrive hosted in Office 365. However, each time I got frustrated by how difficult it is to find the OneDrive folder in File Explorer. Sure, using Favorites or Quick Access (as in Windows 10) I can get there fairly easily, but it still adds at least one extra folder level and lots of clickety clicks with my tired little mouse to get where I want to go. Plus, just looking at the left pane in File Explorer, which is so full of garbage (several instances of OneDrive, DLNA devices right below This PC, Libraries, Control Panel?!, the same libraries yet again, Desktop items at the root, etc.), makes me want to cry.

Solution

Use a junction point to redirect c:\work into your OneDrive folder, such as C:\Users\Jussi\OneDrive - Sulava Oy\Work. After creating such a junction point, any content you add to c:\work actually resides in the OneDrive folder and is synced to OneDrive normally. And best of all, I can still use c:\work like I always have.

Steps how to do this:

  1. Rename your existing c:\work (or whatever your folder is called) to e.g., c:\work_old
  2. Create Work folder into the OneDrive folder, e.g., C:\Users\Jussi\OneDrive - Sulava Oy\Work
  3. Open Command Prompt
  4. Type in mklink /j c:\Work "C:\Users\Jussi\OneDrive - Sulava Oy\Work"
  5. Move old content from c:\work_old to c:\work and confirm that Work folder under OneDrive folder starts syncing and eventually goes green
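To sanity-check the junction afterwards, you can inspect it from PowerShell (a sketch; requires PowerShell 5 or later, and the path is the example from the steps above):

```powershell
# LinkType should show "Junction" and Target should point at the OneDrive folder
Get-Item C:\Work | Select-Object Name, LinkType, Target
```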

NOTE! Since there is really just one instance of the files on your hard drive, if you remove a file from the folder under OneDrive, it is also removed from the junction point location.