05 April 2013

SharePoint 2013 and WCXM

A lot has been written about and presented on SharePoint regarding its use as a WCM/WCXM tool.  My position, generally, is that unless you’re using SharePoint across the enterprise, it probably won’t be a good fit as a WCM solution for the majority of firms.  I reiterated this guidance in a SharePoint 2013 WCXM advisory paper and subsequent “Just how good is Web CMS in SharePoint 2013” webcast for the Real Story Group.

After I tweeted that little had changed between the 2010 and 2013 WCM capabilities, Waldek Mastykarz suggested that I “didn’t get the memo.”  Waldek and I had a short Twitter discussion on whether the latest release significantly improved SharePoint’s WCM capabilities.

Admittedly, there was some merit to his response, though I still maintain that in the majority of cases, SharePoint’s WCM capabilities have not significantly changed and the platform isn’t a much better fit for many WCXM scenarios than 2010 was.  However, I thought it would be interesting to post a set of questions I asked Waldek directly and let you hear what he has to say.  At Waldek’s suggestion, I’ve also included some additional commentary that he and I exchanged through e-mail.

  1. In a blog post I published on the Real Story Group blog, I contend that not a whole lot has changed for WCM in the new SharePoint 2013.  You tweeted that I “hadn’t read the memo.”  Have I missed something materially different?

    In SharePoint 2007 and 2010 the publishing model was based on Publishing Pages. One of the consequences was tight coupling between the physical location of the page and how it would appear in the site’s structure and in URLs. Additionally, it was challenging in previous versions of SharePoint to reuse content across different sites. Finally, it was maintenance-intensive to have an intelligent website whose content and presentation would adapt to who the visitor is and which device they use to browse the website. All of those can now be solved using the standard capabilities of SharePoint 2013.

    In addition to the new search-based publishing, SharePoint 2013 contains a number of capabilities that help you optimize your website for public search engines. In previous versions of SharePoint those capabilities had to be covered by third-party solutions. Those are just two of the many WCM capabilities that SharePoint 2013 offers and that change how we think about and build websites on the SharePoint platform.

    [SHAWN’S NOTE] The search-based publishing features are not available in the Office 365 version of SharePoint; they are only available on-premises.

  2. You contend that search driven navigation is a major change in the WCM capabilities in SharePoint 2013.  How so?

    In previous versions of SharePoint the navigation was based on the physical structure of the website. If you wanted to publish a press release, you would navigate to that particular branch of your site and create a new press release page. With SharePoint 2013’s search-based publishing and Managed Navigation this is no longer necessary. First of all, we can leverage Managed Metadata and taxonomies to build more flexible navigation structures. Secondly, using search-based publishing we can tie those taxonomies to the content and have everything published dynamically on the website. This not only increases web content management flexibility but also lowers the effort required to build and maintain websites.

    [SHAWN’S NOTE] SharePoint’s managed metadata service was available in 2010, though basing web site navigation on the term store is new to 2013; in 2010 metadata-based navigation was available for document libraries and presenting documents ordered by terms applied to those list items.

  3. What are some of the other changes in SharePoint 2013, with regard to WCM, you think deserve highlighting?

    SharePoint 2013 contains many new and improved features for building public-facing websites. First of all, it provides us with Search Engine Optimization features that we can leverage to optimize the website for public search engines. Next there are the Rich Text Editor improvements that make it easier for content managers to work with content. One of the common challenges in previous versions of SharePoint was working with content from Microsoft Word, where all of the internal markup would be copied into the web page. In SharePoint 2013 this is no longer an issue, as it automatically cleans the HTML of any Word markup. With regard to search-based publishing, we can now leverage information about the visitor, such as information from social networks or their click behavior, to personalize the content displayed on the website. With this we can build truly intelligent websites where the presented content is automatically tuned based on who the visitor is and what they did on the website.

    SharePoint 2013 also simplifies building websites optimized for mobile devices. Using the new Device Channels and Image Renditions capabilities, we can now more easily build websites optimized for different audiences.

    Also, when it comes to scalability, because SharePoint 2013 uses an enterprise-class search engine (previously known as FAST) as a fundamental piece of the content delivery mechanics, it is much more scalable than traditional database-driven approaches – especially if you take into account content personalization capabilities such as Recommendations and User Segments, which are now part of the standard functionality of SharePoint 2013.

    Another thing worth mentioning is that using the new and improved REST APIs it is much easier to reuse content from SharePoint 2013 outside of SharePoint. With the new API you can very easily create native apps for Windows Phone, iPhone or any other device/OS that will interact with the content stored in SharePoint. With the new APIs you not only get access to the static content but can also benefit from the powerful capabilities based on search-based publishing, such as recommendations and user segments, which allow you to easily build apps with rich and dynamic experiences.
    Even though SharePoint 2013 has only recently been released, some customers already have production websites live on SharePoint 2013. One case study is available at
    http://technet.microsoft.com/en-us/library/jj822912.aspx and presents the implementation of the mavention.nl website, the first website worldwide to go live on SharePoint 2013.
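    [SHAWN’S NOTE] To make the REST point concrete, here’s a minimal sketch of the kind of call Waldek describes. The site URL is a placeholder and authentication is omitted; only the /_api endpoint convention comes from SharePoint 2013 itself.

```python
# Sketch: build and issue a request against SharePoint 2013's REST API to
# enumerate the lists in a site. The host name below is hypothetical.
import urllib.request


def build_lists_endpoint(site_url: str) -> str:
    """Return the SharePoint 2013 REST endpoint that enumerates a site's lists."""
    return site_url.rstrip("/") + "/_api/web/lists"


def fetch_lists_json(site_url: str) -> bytes:
    """Request the lists as JSON. A real deployment would also need
    authentication headers, omitted here for brevity."""
    req = urllib.request.Request(
        build_lists_endpoint(site_url),
        headers={"Accept": "application/json;odata=verbose"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


# build_lists_endpoint("http://intranet.contoso.com/")
# → "http://intranet.contoso.com/_api/web/lists"
```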

    [SHAWN’S NOTE] Search engine optimization features include improved URL readability (e.g. it’s no longer necessary to have the “Pages” library name in the URL in all cases) and the ability to inject arbitrary metadata into a page through SharePoint site settings.  I had noted editor improvements (specifically improvements in the markup produced), though SharePoint still lags other editors like TinyMCE.  Device Channels and Image Renditions are improvements, but the capabilities represented by these two features were available in 2010 (automated thumbnailing was a feature of an image library, and the mobile capabilities of 2013 simply make configuring device channels easier; there’s little improvement over 2010 and you’re still required to build an entirely new site to support specific mobile devices).  Finally, the FAST engine was available in 2010 and the capabilities represented, beyond search-based publishing, are little improved (if at all) in 2013.

    [Waldek’s Response] Image Renditions are more than the thumbnail capability available in previous versions of SharePoint. In the past, SharePoint automatically generated thumbnails for images. Those, however, were of a fixed size and always scaled (as opposed to cropped). With SharePoint 2013 you can specify yourself which Image Renditions you want to use on your website and how every single image should look for a particular rendition. This makes Image Renditions far more useful than the thumbnails of the past.

    [Shawn’s Response] I did point out that Image Renditions are an update to thumbnailing.  I understand that there are some additional capabilities with renditions; as a developer, it does open up some possibilities that weren’t available OOTB in 2010.  However, I question, as the average IT buyer, how much value the feature adds over thumbnailing in the context of WCM.  For me, it’s not hugely consequential across the range of sites potentially “powered by” SharePoint (i.e. for product-oriented sites, image renditions could be very valuable; for the average marketing site, not as much – or at all).

    [Waldek’s Response] Device Channels in SharePoint 2013 are nothing like the mobile capabilities of SharePoint 2010. In previous versions of SharePoint, whenever it detected you were browsing from a mobile device (detection was done based on User-Agent substrings and properties from the .browser files on the server), it always served you a predefined experience that was challenging to customize. The only customization capability was to use Control Adapters (custom code) to modify the rendering of various pieces of the website. In SharePoint 2013, with the introduction of Device Channels, this whole situation has changed. Not only can we distinguish between multiple channels, but we can also use different Master Pages and content panels to optimize how the website is rendered on mobile devices. I also don’t agree with you when you say you need a whole separate website for mobile. In my opinion, when optimizing content for mobile you have three options: use responsive web design, build a mobile website or build a mobile app. With SharePoint 2013 you can make use of all those options (or a combination), but none of them is a must.

    [Shawn’s Response] Again, my point was that mobile channels are an update, not a radical departure from 2010 capabilities.  I would argue that the variations feature, combined with agent detection, provided roughly the same capabilities as the 2013 device channels functionality (including control over master pages and layouts).  Is it the same?  No.  Are device channels easier to manage and control?  Perhaps.  However, for whom are these features relevant?  I would argue they’re mostly for developers, not business users or even “power users.”  In the same way, I’m evaluating value to the end buyer.  This functionality will be useful to only a narrow range of customers and sites within the context of WCM tool buyers.  Lastly, device channels are not available in all versions of SharePoint, most notably Office 365.

  4. Would you say that improvements in WCM are evolutionary or revolutionary?

    Some of the new WCM capabilities provided with SharePoint 2013, such as SEO-related features or Rich Text Editor improvements, evolved with the product. As SharePoint matures, more capabilities are added and the existing capabilities are improved. On the other hand, there are also some capabilities, like search-based publishing, that are new and that offer us new ways of thinking about how we manage and publish web content. Using search in SharePoint 2013 we can now build more dynamic websites where content is displayed based not only on relationships known to content managers but also on information we have about visitors, ensuring they see the most relevant content the website has to offer.

  5. Given SharePoint’s fairly heavy infrastructure requirements (e.g. disk space, RAM, number of servers to run the various services), do you believe that SharePoint can effectively compete with other, more pure-play WCM tools (e.g. SiteCore)?

    The exact hardware requirements depend on what exactly you are trying to achieve, so even when you see some high numbers, it doesn’t mean that is what you need just to be able to publish a website on SharePoint 2013. The only way to really determine what you need is to get a clear understanding of your requirements, test your website against them and make an educated choice about the kind of infrastructure needed to fulfill those requirements. Given the changes in how SharePoint 2013 is licensed for public-facing websites, as well as the rich Web Content Management capabilities it offers, SharePoint 2013 is a very interesting candidate when choosing a web platform – even for organizations that don’t have SharePoint implemented yet. Compared to the past, licensing of SharePoint 2013 is very attractive. Not only is the price per server lower, but you also get best-in-class search capabilities as part of that license.

    [SHAWN’S NOTE] I’m not sure I agree that pricing is improved over 2010, and certainly the “it depends” answer on hardware is an appropriate response (I use that answer frequently).  However, “best in class” search is something you, as the buyer, should evaluate in the context of other available search technologies.  The Real Story Group has very deep research on Enterprise Search technologies that will help you understand your choices and the trade-offs.

I want to thank Waldek for participating in the discussion and hope that it has helped you, the buyer.

About Waldek

Waldek is a Microsoft SharePoint Server MVP and works as a SharePoint consultant at Mavention. Waldek frequently participates as a speaker and expert in community events such as the SharePoint conference in London, SharePoint Connections, SharePoint Saturday and DIWUG. Recently Waldek became a Virtual Technology Solutions Professional for Microsoft Netherlands. In this role he answers customer questions about SharePoint Web Content Management (WCM).  Blog: http://blog.mastykarz.nl  Twitter: http://twitter.com/waldekm

21 March 2013

Fixing Azure Challenges - Part 1

Over the last year, we’ve built a few Azure-hosted applications (including our own web site).  Throughout that time, we’ve hit a few walls, suffered setbacks and learned a ton about how to work with both the service and the Azure SDK tools (including the occasional challenge of dealing with version differences in both the SDK and the Azure management interface).

Since we’ve been helped by the many folks that post to twitter and their own blogs, we’re hoping to return the favor to the larger community.  Here are a few of the initial challenges we ran into and how we solved them.  In all cases, your mileage may vary.  Certainly if you have feedback, a better way to do things or something else (related), please do comment.  We’ll be publishing a follow-up post with additional solutions in the coming months.

NOTE: This post was written with the assumption of Visual Studio 2012 and Windows 8, along with the v1.8 release of the Azure tools.  We’ve seen posts referencing similar problems with older tools, but these tips may be less helpful unless you’re using the same versions.

Error when publishing applications

This particular issue is fairly nondescript, since it encompasses a whole range of causes and subsequent actions on your part.  However, problems with publishing happened frequently enough that they caused some serious delays in getting new builds of applications out to the cloud.  The following are the causes that caught us most frequently.

Administrator Rights

In order to debug or publish, you need to start Visual Studio as Administrator or you’ll get an error stating that the current user has insufficient privileges.  The fix is to right-click the Visual Studio shortcut and pick “Run as Administrator.”  If you’ve pinned VS to your task bar (like I do), just right-click once on the task bar item, right-click again on the main Visual Studio link and then pick “Run as Administrator.”

Out of Memory

When either trying to publish your solution directly to Azure or simply trying to create a package, you may receive an error that is simply “Out of Memory Exception.”  Apparently, with only 4 GB of RAM available, Visual Studio is unable to create the Azure deployment package.  It’s really unclear to me why so much memory is necessary, but often, without a clean restart of Visual Studio, I get this message.  The solution is simple: close down all other applications, restart Visual Studio and try again.  I can occasionally get away without these steps, but if I’ve been debugging or opening lots of other applications, I can guarantee my publish/packaging operation won’t work.  I never had to restart Windows; just exit VS and start again.

Unknown Error

In the midst of publishing a package (after initiating the “update” on a cloud service), you get an error from the Azure web-based management console that states your update was unsuccessful.  For me, this usually happened after several minutes of waiting for the package to upload, then cycling the instances and, at the last minute, erroring out.  When pressed for additional details (by clicking that not-so-helpful “I” icon), there are no details.  This happened to me recently and it was the result of having both http and https endpoints defined, but without the SSL certificate referenced by the service configuration setting loaded into the target subscription (the client we were working with used multiple subscriptions to separate “production” from “QA”).

In this case, the production environment had the SSL certificate loaded, but the QA environment did not.  Because my Azure settings listed the thumbprint for the SSL certificate and both end points were defined, the update failed in QA, while successful in production.  Unfortunately, it was difficult to figure out the root cause since the management console didn’t give any details. 

Only after I looked in the log, clicked on the “failed” entry and then clicked the “I” details button at the bottom of the interface did I realize the missing cert was the culprit.  Embarrassingly, it took a call to Microsoft support to figure out that the additional detail was indeed available; double-clicking the log entry doesn’t work.  You have to click the details button at the bottom of the page to get additional XML-based information regarding the entry.

If the SSL certificate isn’t your issue, the logs should help you figure out what is and provide more insight into the failed update.


Unable to manage subscription

More often than not, not being able to manage a subscription is due to the lack of a management certificate, the fact that your Microsoft ID (aka LiveID) hasn’t been granted co-administrator rights (assuming you didn’t create the subscription) OR you’ve got a filter set.

The last one is the easiest to fix.  You set your subscription filter at the top of the management interface.  Just click the little funnel and then check or uncheck the subscriptions you want to see.  Occasionally, you’ll forget that you’ve unchecked a subscription you’d like to see OR someone will add you as co-admin on a subscription you need to work with, but don’t see it because of a previous filter setting (don’t ask me how I know this).  So, if you can’t see the subscription, just check your filter settings.


If you’re not filtering your subscriptions, the owner/primary administrator may have not added your Microsoft (Live) ID to the subscription.  In this case, there’s nothing you can do, but ask them to add you.

Finally, Azure requires various certificates to be loaded into the subscription to allow you to perform various functions.  These certificates are shown in two places: 1) in the settings section under “Management Certificates” and 2) within Cloud Services under “Certificates.”

In both cases, you can upload your certificates manually through the interface.  However, the certificates have to be in .CER format, not .PFX (read about the different certificate formats for Azure here – NOTE: the older management interface for Azure is shown, but the basics should still work).  The certificates required for Remote Desktop can be created dynamically through the Azure tools in Visual Studio.  This is done by choosing the “Configure Remote Desktop” option after right-clicking your Azure deployment project.  Clicking the dropdown list box will show you existing certificates or allow you to create a new one.


After you publish, the new RDP certificate will be uploaded automatically.

Deploying different configuration settings

The Visual Studio tools are pretty decent and give you a degree of flexibility for deploying various environment-specific settings to your Azure subscription.  However, getting the hang of setting the various values took a bit.  Hopefully, if you’re challenged as I was, the following will help.

Generally, when developing an ASP.NET application, you have the option to create various build configurations directly in Visual Studio.  Each configuration is accompanied by a web.config transformation.  Each transformation is a derivative of the core web.config file, with specific XML-transformation directives to update, replace or delete various nodes within the file.  With Azure, these build configurations are handled exactly the same way, except that when you build a deployment package, you pick the configuration you want at the time you build the package; you don’t have to change the Configuration Manager setting.
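To illustrate (the file, connection-string and server names here are made up), a Release transform might remove the debug flag and swap a connection string; the xdt: attributes are the standard XML-Document-Transform directives:

```xml
<?xml version="1.0"?>
<!-- web.Release.config (illustrative sketch): applied on top of web.config
     when a package is built with the Release build configuration. -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Replace the connection string whose name matches "MainDb". -->
    <add name="MainDb"
         connectionString="Server=prod-sql;Database=App;Integrated Security=true"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
  <system.web>
    <!-- Strip the debug attribute for the Release build. -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>
```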

In addition to build configurations, there are Azure-specific “service” configuration settings.  Much like adding Web.Config transformations, you can create various Azure service configurations, which allow you to stipulate variables like a blob storage account, certificate thumbprints, number of instances or what SMTP server to use (very similar to the AppSettings node in web.config).  Azure service configurations and build configurations are complementary.  However, build configurations (and the associated web.config settings) can only be changed when you upload a new build to the Azure environment.  Conversely, some service configuration settings can be changed through the Azure management interface (after deployment or at run-time), making them more flexible.  This is particularly handy when you have to make somewhat trivial application setting changes that would require a re-publish if the setting were housed in the web.config file (yes, yes.  You could RDP to the box, but it’s still easier to access the Azure Management console).
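As a sketch (the service, role and setting names are hypothetical), a service configuration file looks like the following; the Setting values and the Instances count are among the items you can later change through the management interface without redeploying:

```xml
<!-- ServiceConfiguration.Cloud.cscfg (illustrative sketch) -->
<ServiceConfiguration serviceName="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <!-- Number of role instances; adjustable after deployment. -->
    <Instances count="2" />
    <ConfigurationSettings>
      <!-- App-style settings, similar to appSettings in web.config. -->
      <Setting name="StorageConnectionString"
               value="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..." />
      <Setting name="SmtpServer" value="smtp.example.com" />
    </ConfigurationSettings>
    <Certificates>
      <!-- Thumbprint must match a certificate uploaded to the subscription. -->
      <Certificate name="SslCert" thumbprint="YOUR_THUMBPRINT_HERE"
                   thumbprintAlgorithm="sha1" />
    </Certificates>
  </Role>
</ServiceConfiguration>
```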


The various service configurations are managed through the Settings interface for your Azure project.  Set the drop-down to “All Configurations” to establish the range of settings across service configurations.  If you click on Settings, for example, you will then see the full range of settings for your Azure deployment.  While in “All Configurations,” any setting that is configuration-specific is shown with a “select configuration” value; any setting that is common simply displays its value.  Also while in “All Configurations,” you can add or remove settings across all configurations.


To change a setting for a particular configuration, just change the drop-down value to the configuration you want and then click in the “Value” field of the setting.  Enter the new value and you’re done (don’t forget to save).  NOTE: You cannot add or remove settings within a specific configuration, so you’ll need to plan your settings approach so that all settings make sense for all configurations.

When you package up your application, you’ll be prompted to select the build configuration, as well as the service configuration. This will combine the specific web.config transformations along with service settings for the deployment you want to publish.


More information can be found on Microsoft’s site: http://msdn.microsoft.com/en-us/library/windowsazure/ff683672.aspx 

Finding that cursed subscription ID

As Azure has evolved, Microsoft has updated the management interface.  Where finding the subscription ID was relatively simple in early versions, later versions “hid” the subscription ID (for certain workloads) until after you publish for the first time.  If you’re also having trouble publishing, this is a double-whammy.

In truth, the subscription ID is always visible, though I’d argue the interface makes it a challenge to find if you’re unfamiliar with the management console.  To find your subscription ID, just log onto http://windows.azure.com with your LiveID.  Then click the Settings menu item on the left (it’s the little gear icon).  Now click Administrators on the top menu.  This is the list of all administrators for your subscription.  Next to each individual, you’ll see the subscription name and ID.  Just copy the ID from this interface.  If the column isn’t wide enough, just click and drag the column separator on the right.


Hopefully these tips will help you avoid challenges with Azure.  However, I’m certainly interested in your feedback.  If you have some, please comment.

Other Resources

Visual Studio and Azure: http://msdn.microsoft.com/en-us/library/windowsazure/ee405484.aspx

Setting up Remote Desktop for Azure Cloud Service: http://msdn.microsoft.com/en-us/library/windowsazure/ee405484.aspx

Running multiple web sites in an Azure web role: http://msdn.microsoft.com/en-us/library/windowsazure/ee405484.aspx

Collecting and logging data by using Azure Diagnostics: http://msdn.microsoft.com/en-us/library/windowsazure/ee405484.aspx

15 January 2013

Being a Professional

I was reminded about what I had been calling “a professional” by David Heinemeier Hansson at 37Signals.  He posted “Your Life’s Work” on their Signal vs. Noise blog.  It was a terrific piece that, to me, really spoke to a “disposition” that professionals should possess.  I especially liked this quote: “[if] you’re not committed to your life’s work in a company and with people you could endure for decades, are you making progress …”  In essence, you have to be truly interested in the work that you do and care about the outcome – not just today, but tomorrow and beyond.  If not, as David puts it, you’d have to question whether you’re making any career progress.

For Consejo, this concept (of a professional) is at the heart of who we want to hire and I’m glad we’re not alone.

21 December 2012

Intel and the Big Search Box

I was recently researching new ultrabooks and visited the Intel site for some insight on current processors.  When I arrived, I was a bit shocked at what I saw…

Intel Home Page

Source: Intel.COM

Waiting for me was not the typical corporate web site with menu-based navigation (at least none was immediately apparent).  Rather, Intel was presenting, dare I say, a very "Google"-like approach – enter a keyword-based query to find what you want.

I found the whole experience a bit off-putting.  I don't generally recommend our clients lead with a search-based content findability approach. In fact, I've argued against even making search a primary content findability technique.

If anyone has insight on this approach (e.g. real analytics supporting this experience strategy over more traditional IA), I'd love to hear about it.

18 December 2012

Interesting cause for exception: System.InvalidOperationException: $metadata Web Service method name is not valid.

While deploying a classic web service to a production server recently, we came across an issue when calling the service from a test web application.  We were able to add a web service reference to our test harness web application with no issues.  The web service would also load with no issues in a standard web browser via its URL:


When calling the web method, the response would come back with no errors, yet we noticed the data was not being saved to the server as expected.  After checking the event viewer on the server, we found this error, which made no sense to us and offered little detail:

 System.InvalidOperationException: $metadata Web Service method name is not valid.

After searching the web a bit, we learned this was more of a masking error that covered up the true issue.  After digging around for several days, I attempted to test the service directly in a debug state in the server’s browser.

The service responded with:

Could not load file or assembly 'Telerik.OpenAccess, Version=2012.2.816.1, Culture=neutral, PublicKeyToken=7ce17eeaf1d59342' or one of its dependencies. The system cannot find the file specified.

We had somehow published a version of the Telerik.OpenAccess DLL that was different than the one the service was looking for.

After re-publishing the service with the correct DLL version, the issue was completely resolved and the web method started saving data to disk.
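As an aside, when you can’t immediately re-publish the right DLL, an assembly binding redirect in web.config can map stale version references onto the version actually deployed.  The version number and public key token below come from the exception above; treat this as a sketch, not what we ultimately did:

```xml
<!-- Fragment of web.config (inside <configuration>): redirect any older
     Telerik.OpenAccess reference to the version deployed with the service. -->
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <assemblyIdentity name="Telerik.OpenAccess"
                        publicKeyToken="7ce17eeaf1d59342" culture="neutral" />
      <bindingRedirect oldVersion="0.0.0.0-2012.2.816.1"
                       newVersion="2012.2.816.1" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>
```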

Lesson learned: the metadata error that originally prompted our research had no relationship to the real exception.  If you come across this error in the future, debug the service directly in the problematic server’s browser to get the true exception.

10 December 2012

Interesting Code Exception–UnwillingToPerform

I have recently been busy putting the finishing touches on a new web application for a client in Indiana.  During the course of the project, we discovered that we needed to write a custom membership provider that would use secure LDAP to authenticate users (the ActiveDirectoryMembershipProvider does not, surprisingly, support the client's environment setup). 

While developing the basic provider was relatively easy to complete (and used an Oracle LDAP provider we built previously), we ran into a series of challenges.  One challenge is now a support incident at Microsoft (more on that in a later blog).

However, during one of the debugging sessions, we noticed an "unspecified operation error occurred" exception being thrown.  Digging down, we discovered the reason code: "unwilling to perform."


I have never thought about putting that kind of reason in my code, but if Microsoft can do it, perhaps it's O.K.

03 December 2012

Write Better

When I started working as a consultant in the latter half of the ’90s, our firm had writers who specialized in writing for the web.  These folks constantly had to help clients refine and organize traditional print copy for the "new" medium.

Recently, Jason Fried over at 37Signals published a short blog post that referenced a post by Maria Popova at BrainPickings.org highlighting writing advice from David Ogilvy, the original "Mad Man."  What I found fascinating is that the advice that Ogilvy delivered in 1982 is much the same advice our writers gave clients more than a decade later (in the context of writing for the web).

Clearly, good advice never goes out of style.

18 September 2012

Adoption: The forgotten key to Success

A solution is not just technology.  A solution is not just new features and functions.  In fact, a solution has many facets, including adoption.  While adoption is critical to the success of any solution, it is often ignored.

Organizations interested in successfully introducing a new technology or solution need to spend as much time considering and developing an adoption plan as they did building the solution.  Here are a few keys to adoption success:

  • Communicate early and often with the end user community 
    Start well in advance of any change.  Begin by letting folks know that a change is coming and why it’s important to them.  Create a clear connection between a challenge you know they have and how the new solution seeks to address it.  Once you reach specific milestones in the solution’s development, send out additional communications with more specific information and dates (like when folks should expect to see the solution).
  • Introduce the new solution properly
    Spend time developing ways to acquaint your end user community with the new tool or function.  Simply sending an “informative” e-mail is insufficient; try something new.  For example, if you’re introducing a new Intranet, have a “scavenger hunt” to find specific information or content; the “winner” should get some prize for being the first to find the requested item (e.g. a gift card to a local restaurant or simple public recognition).  This helps introduce the new site, gets people to use the tool AND “tests” the new information architecture.  Feedback from the event can also help you avoid questions in the future.
  • Create triggers for using the solution
    Dr. BJ Fogg runs a persuasion lab at Stanford University.  Dr. Fogg’s Behavior Model suggests that while motivation and ease-of-use are important, people still need to be “triggered” to exhibit a specific behavior.  This means that no matter how easy the solution is to use, nor how motivated your end user community happens to be, you’ll still need to “remind” them to use the tool until a habit of use is formed.  Dr. Fogg’s research examined how Facebook uses e-mail notifications to “trigger” people to return to the site.  Think of ways to trigger your end-user community to use the tool regularly (e.g. remind them of what problem the tool solves OR how they can save themselves time by using the tool).
  • Gather feedback liberally
    Whether you’re speaking personally to people or using a more generic survey, always gather feedback.  Your solution might be fantastic, but there will always be room for improvement.  Demonstrate you’re interested in your end user community by asking for feedback on what works, what doesn’t and how the solution can improve.  Clearly, you’ll get lots of opinions, but what’s important will be the trends you can discern; these trends represent what matters to the greatest number of users and where you should focus your attention.
  • Constantly evolve
    Once you’re done with the initial implementation… you’re not done.  Any solution will need to evolve to stay relevant to the end user community it serves.  Use the feedback you’ve gathered, combined with what’s happening in the broader organization and/or marketplace, to create a real “road map” for the solution’s improvement.  Once you have that road map in place, start communicating it to your end user community; this communication can itself serve as a trigger, while also building support for the next release.

You will spend a great deal of time developing the right solution for your organization.  You need to spend just as much time making sure everyone uses it and that it adds value to your firm.

18 July 2012

Office and SharePoint 2013 Products Revealed

On 16 July 2012, Microsoft publicly announced previews of the latest versions of Office and SharePoint (the 2013 releases).  For many of our clients, the announcement will be significant, as these products may form the foundation of their information management and collaboration architectures for many years.
As a part of my continuing work with the Real Story Group, I just published a short blog post on our initial SharePoint 2013 advice for dealing with the avalanche of content that will be broadcast about the platform.

28 June 2012

SharePoint in Geographically Dispersed Organizations

As SharePoint 2010 continues to grow in popularity among larger organizations, the collaboration platform is likewise being used by an increasing number of geographically dispersed companies that see it as a tool to keep far-flung employees on the same page. Because of SharePoint’s architecture, however, such implementations can add new administrative burdens whether you’re trying to keep workers across North America or around the world connected and communicating with one another.

SearchContentManagement just published an article I wrote on how to manage SharePoint in these distributed organizations.

21 May 2012

Improve Code Reuse with Search

In the past, I have been critical of simply crawling content with a search engine and using the basic keyword query approach to find things.  However, there are some situations that do lend themselves to just this approach (though with a tad bit of “intelligence”).  One of these situations is code reuse.

At Consejo, we use Subversion (coupled with VisualSVN Server) for source control.  Subversion (or SVN) is an open-source source control system. VisualSVN Server provides a terrific web interface to the repository.  Combined with the VisualSVN plugin for Visual Studio, we are able to very effectively manage our code production across clients with little or no cost.  Unfortunately, with our team dispersed, it’s sometimes difficult to make everyone aware of what’s already been built for one client or another.

SIDE NOTE: I have personally been critical of using open source solutions and even wrote an article arguing against the open source model.  However, it’s clear there are cases where open source makes sense (yup, I was wrong).  Further, we’ve been a financial supporter of SVN (through donations) since we started using the tool regularly.

Much of what consulting companies do is based on past work.  It’s critical to how we work that every consultant is kept abreast of what the firm is doing, across industries, disciplines and clients.  For example, we recently built two different applications, for two very different clients.  Both applications, however, used the same licensed interface controls.  In fact, many of the interactions each application’s user base had with their respective application were very similar.  As a result, techniques we used, problems we solved and utility code (code not specific to a client or application) we constructed could be leveraged across both projects with minor updates.  This saved both our consultants and our clients time and, more importantly, money.   Unfortunately, since both projects overlapped, there was no good way, besides knowing team members on both projects, to capture and surface code reuse opportunities in an automated way.  Each team basically had to know what the other team was doing and ask specific questions (or discover reuse opportunities through happenstance).  Neither a fabulous nor a scalable model.

While we solved this problem the “old fashioned” way, it’s not a good long-term solution.  As projects get more complicated and team members more geographically or temporally dispersed, the “old fashioned” way becomes very burdensome and super inefficient.   So what’s the solution?

Historically, we’d been looking for ways to effectively expose our SVN repository to our consultants, outside of Visual Studio and a web browser.  The thought was that if developers were able to search for specific code constructs or even development pattern names, there was a reasonable chance of finding reusable code snippets or whole libraries (if they existed).   And, while there are tools that help in this regard, like FishEye and SvnQuery, neither was a perfect fit for our needs.  Since VisualSVN Server presents the repository as a web site, why couldn’t we just use a standard search engine like MS Search Express, SharePoint or GSA (Google Search Appliance) to crawl the repository and allow consultants to query it like a web site?

The answer, as Kenneth Scott points out in his blog post on crawling VisualSVN with a search engine, is “not exactly.”  The problem stems from the way VisualSVN renders the repository web interface (you should read his post for the details).  However, Kenneth solved this problem through the use of an HTTP handler and gave us exactly what we needed.  Using his utility, we can indeed use a standard search engine (take your pick) and then allow our developers to search for an example of an excel-like editing experience using a Telerik GridControl or discover if we’ve built a custom authentication provider for SharePoint; both of which we’ve done, but only one I knew about until recently.

You may still be questioning why this is a good idea.  I’ve been critical of this kind of shotgun search approach in the past.  Why should this work now?  The reason is the very narrow information domain and the very specific terms developers use.  Both work together in a way that makes finding content, using straight keyword queries, more reasonable.  For example, if I want to find an example of our use of the Telerik GridControl, I can simply search for the RadGrid class name.  If I want to figure out whether we’d built a .NET membership provider, I can search for the inheritance statement.  In the first case, I may get an overwhelming number of results.  However, they’ll all be examples of use, since the only time that class name would appear is when I’m using it in code (in other words, relevant).  The second example would produce far fewer results and likely give me exactly what I need immediately (relevant and, probably, highly precise).

In the end, I’m only really disappointed that I didn’t discover Kenneth’s blog post sooner; my past search queries were about SVN or Subversion, not VisualSVN (poor search strategy on my part).  However, as we develop this code search feature inside our intranet, I’m excited by the prospect of finding internal examples of code we can reuse.

If you have a similar feature inside your firm, I’d love to hear how it works and whether it has yielded higher code reuse.

04 April 2012

The Myth about SharePoint Browser Support

Microsoft posted a blog entry today that pointed readers to SharePoint’s browser support page on TechNet.  In this post, they detail what browsers SharePoint supports and any specific support limitations.  However, I want to raise an important point that seems to be missing from the conversation: browser support isn’t entirely about SharePoint.

Unfortunately, what most everyone fails to mention is that browser support is actually a combination of what Microsoft supplies and the solution that you’ve built.  In other words, Microsoft’s support, or lack thereof, for specific browsers is limited to Microsoft-supplied interfaces.  Depending on the type of solution you’ve developed, much more of your solution’s browser capability could be dependent on your development than on Microsoft’s.

Here’s a quote from TechNet:

“For publishing sites, the Web Content Management features built into SharePoint Server 2010 provide a deep level of control over the markup and styling of the reader experience. Page designers can use these features to help ensure that the pages they design are compatible with additional browsers, including Internet Explorer 6, for viewing content. However, page designers are responsible for creating pages that are compatible with the browsers that they want to support.”

Obviously, this quote relates specifically to publishing sites: primarily internet-facing sites that serve content, as opposed to more collaborative intranet/extranet sites.  However, even for sites built with other site definitions (like Team Sites), browser support can and will be affected by new master pages, custom web parts or other components supplied by you or 3rd parties.

The myth here is that SharePoint’s support for specific browsers is somehow exclusively Microsoft’s domain.  In fact, true browser compatibility is a combination of Microsoft supplied interfaces (that are used in your solution) and those solution-specific interfaces that you or a vendor create.

06 March 2012

[Good] Communication is Key

Every now and again, a really fantastic opportunity to illustrate a best practice just falls in your lap.  The opportunity, just as occasionally, presents itself through inspiration found in the most surprising places.  In my case, I found this communication example in the men’s room at a client: 

IMG_0590

This sign provides an excellent template to use when communicating with your audience: 

  • It starts by presenting the message at the time when the recipient is engaging in a related behavior
  • The messaging points out a very specific feature of the tool being used
  • A statement of community support for that feature is provided
  • An aspirational goal is included to (through implication) encourage a specific, future behavior
  • All of this is followed by a polite… thank you!

Ignoring the specific subject matter involved in this message, this could easily serve as a model for feature-obsessed technologists and enthusiastic Intranet managers on how to encourage intranet or technology adoption.

What do you think?

[EDITED TO CORRECT SPELLING AND WORD CHOICE]

31 January 2012

Why “Search” instead of “Find”

In July 2010, I posted “Can search really solve information ‘findability’.”  Since that post, I’ve run into a number of clients who continue to insist that they need search (even if “search stinks” in their organization).  Yet, almost universally, they report that the search experience falls short of their expectations.  This should be no surprise, as lots of firms struggle with this very problem.  However, I’d like to suggest an alternate hypothesis: they actually want “find,” not “search.”

For better or worse (mostly worse), loads of folks closely associate the act of search with the expectation of locating desired content.  Typically, at least outside of an organization, this means assuming they should navigate to Google or Bing (mostly Google). When they arrive, they enter a few keywords and press “GO.”  At this point, they’re presented with a set of results from which they choose one that looks promising.  In part, any perceived success is just that; they don’t necessarily expect to find what they want.  If, however, they happen to find the object of their desire, they are pleased. 

While both “search” and “find” are verbs, search does not imply a goal, simply an action.  Search describes everything in the scenario I just relayed except the part where you’ve found the appropriate destination.  Find, by contrast, is explicitly the goal and is characterized by viewing your content.  If you need a very concrete example: if a child goes missing, the goal is not to search for the child.  The goal is to find the child.

Therefore, with regard to content, consider changing the conversation.  Use the word FIND instead of SEARCH.  In doing so, you begin to think of the goal and not of one particular approach.  Further, if we focus on finding content, we can also measure a success rate.

This orientation change opens up a whole world of opportunities.  For example, start with the simplest model: place relevant content on the first page they see on an intranet.  While this may seem wholly impractical (too much content, no context to judge relevance), this kind of solution is possible.  Use what you know about your employees/users.  If you know in which department they work, you can begin surfacing content from their department.  Not specific enough?  What about adding in the role they serve and surfacing content targeted to that role (NOTE: a good driver for metadata)?

Beyond actually placing the content in plain sight, try surfacing tasks associated with the desired content.  For example, display tasks like “Submit an Expense Report,” instead of requiring users to search for the expense report form, or “Fill out a Timesheet,” which links to the time reporting system (or simply an interface to immediately report time).  In this way, we’re presenting navigable elements that are easily understood and focused on actual tasks employees/users need to accomplish, without the need to search.

However, if you really must provide search, give them help.  Provide them with a targeted search facility that enables them to narrow the scope of the search (i.e. don’t return results from the entire enterprise if they’re looking for a project-related document).  Give them metadata to enable precise queries like “Author = Jane Smith.”  Finally, give them “canned” or pre-developed queries created by “experts” who can construct search queries that ensure result precision.  In this way, we’ve moved away from the simple keyword-driven approach to a more intelligent model that reduces “noise” and improves precision through careful use of the technology.  As an interesting alternative, execute these queries in the background and simply display the results like other navigation on the page; this way, the user doesn’t have to execute the query at all and you’ve saved them a step in finding content.

In short, search (as a tool) must necessarily become one of an array of techniques used to find content.  However, it should absolutely not be the first or only approach.  We must think in terms of FIND, not SEARCH.  Find is the concrete goal and has a measurable success rate; search is simply an action that, while measurable, does not necessarily lead to your user’s goal.

24 January 2012

Multilingual SharePoint Sites–Part 2

Some months back, I wrote a post about the basics of multilingual sites in SharePoint.  The post was a good primer for anyone that needs to understand SharePoint-centric concepts regarding multilingual web sites.  Unfortunately, the post didn’t really describe important details outside of the SharePoint sphere.  In particular, the post excluded all the ASP.NET-centric details.  In this post, I want to share at least some of those additional details.

Globalization

One of the early design goals that Microsoft had for SharePoint was that ASP.NET developers would be comfortable creating SharePoint-based solutions.  The theory is that if you’re a competent ASP.NET developer, you can simply pick up the additional SharePoint API universe; SharePoint is a good .NET citizen, so this idea shouldn’t be a stretch. 

Whether or not you believe a good ASP.NET developer could easily pick up SharePoint, SharePoint does borrow very heavily from many .NET facilities.  With regard to multilingual sites, this includes the Globalization namespace.

The Globalization namespace is a group of classes responsible for allowing ASP.NET applications to understand the numerous languages and cultures that applications can target.  It covers everything from calendar differences and languages to date/time formats and string comparisons (and a whole lot more).  Importantly, it also provides a facility that allows developers to create a resource pool of commonly referenced assets (e.g. element labels, images).  These assets are all referenced using standard labels, creating an index of asset variants for each culture.  At runtime, based on the culture of the current user (usually indicated by a browser setting), the .NET framework dynamically selects the appropriate asset/resource based on the generic label describing that asset or resource.
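The lookup order behind that runtime selection can be sketched conceptually.  The sketch below is Python, for illustration only (the real mechanism is the .NET framework's resource manager, and the labels and cultures here are hypothetical): the framework tries the user's specific culture first, then the neutral language, then the invariant (default) resources.

```python
# Conceptual sketch of .NET-style resource fallback. Not .NET code;
# labels and cultures are hypothetical examples.
RESOURCES = {
    "":      {"Login_UserID": "User ID"},       # invariant fallback
    "de":    {"Login_UserID": "Benutzername"},  # neutral German
    "de-AT": {},                                # Austrian German (no override)
}

def lookup(label: str, culture: str) -> str:
    """Try specific culture, then neutral language, then invariant."""
    candidates = [culture, culture.split("-")[0], ""]
    for c in candidates:
        value = RESOURCES.get(c, {}).get(label)
        if value is not None:
            return value
    raise KeyError(label)

print(lookup("Login_UserID", "de-AT"))  # falls back to neutral "de"
print(lookup("Login_UserID", "fr-FR"))  # falls back to the invariant value
```

The practical consequence of this fallback is that a culture-specific file only needs to override the labels that actually differ; anything it omits is served from further up the chain.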

Resources and Resource Files (RESX)

Resources, or multilingual assets, are defined in a Resource File (RESX).  There’s a resource file for each culture represented in the application.  All resource files use the same labels to describe the assets, but with culture-specific values.  For example, the text shown next to the text box where a user enters their user ID to authenticate with the application would be such a resource.

Figure 1 – Resource example for Login Page

In the RESX file, which is just XML, you’ll find an entry like the one shown in Figure 2.

Figure 2 – Login_UserID label in the EN-US resource file
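The figure itself was an image; for reference, an entry of this kind in a RESX file generally looks like the following sketch (the Login_UserID name and value are assumptions drawn from the example above):

```xml
<data name="Login_UserID" xml:space="preserve">
  <value>User ID</value>
</data>
```

The German (de-DE) resource file would carry the same `Login_UserID` name with a German value.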

To add, edit or delete values, Visual Studio provides a “designer” view of the file.  In the designer, you have the ability to quickly and easily define the various labels and the corresponding values.  Figure 3 shows the Visual Studio interface for editing a RESX file.

Figure 3 – Visual Studio designer interface for RESX file

For every culture your application needs to support, you create a specific RESX file.  Each file is named for the culture it supports, and every file contains the same labels, with values corresponding to the specific culture.  In the example presented here, the RESX file is for the culture en-US (US English).  For more information on resource files, naming conventions and details on creating the files, take a look at this MSDN article on resource files.
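As an illustration (the file names are hypothetical), a login page supporting two cultures plus a default might carry resource files named:

```
Login.resx        (invariant/default resources)
Login.en-US.resx  (English – United States)
Login.de-DE.resx  (German – Germany)
```

The base name stays constant; only the culture segment of the file name changes, which is how the framework matches a file to a culture.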

In SharePoint terms, the RESX would correspond to a specific variation of the same culture.  In effect, you will have at least one RESX file per variation.  These files define elements of the user interface that end users do not supply.  Whereas content on any given page is created and managed by content contributors, there are also elements, like the label for the User ID field on the login page, that are “baked” into the code of the application.  RESX files provide a mechanism to define what that label will say in the context of a specific variation or culture selection (usually set by the browser displaying the page).

Information Architecture and Visual Design

Beyond the somewhat mechanical processes for inserting culture specific content into a page, a more critical aspect of multilingual sites is Information Architecture (IA).  The IA defines the navigation paths (global navigation and its relationship to other sections and pages within the site) and the overall interface layout.  This means the decisions about where various interface elements are placed, what nomenclature is used and what sort of content is shown are all made through and by the IA (both the person – Architect – and their output – Architecture).

When developing a multilingual web site, consider that the interface will be at least slightly and potentially radically different based on the language being displayed.  The simplest example is word length.  If we compare an interface in German and one in English, it’s very likely that there will be different space needs for labels in the navigation, as well as content.  As a result, the IA must anticipate interface movement and allow for enough white space to accommodate an interface that will grow and shrink based on the language’s needs.  This is also a challenge for constructing HTML and JavaScript, since both components of the web page may need to “react” to language differences.  Beyond this relatively easy challenge, however, presenting content is a matter of having the appropriate language-specific content.

A more complicated scenario is one involving differences in how a language is read.  For Hebrew or Arabic (as two examples), the languages are read right to left.  As a result, the whole orientation of the interface needs to shift.  The main navigation will need to start from the right, global navigation elements will be positioned in the upper left and text will flow from right to left within the content sections.  As such, you may require a unique master page and page layouts for these languages to sufficiently accommodate the display differences.  The same is true for languages that are read vertically instead of horizontally, as in Manchu.

Continuing with the above example, you also have the challenge of fonts.  Languages that utilize radically different character sets will require the IA and the designer to consider both font face and size choices.  For example, for any font size chosen in the cascading style sheet, will the text be readable across all languages represented by the site?  Most European languages have characters with relatively little detail compared with Asian languages.  Font sizes that are too small, or font faces that carry too much embellishment, may make detailed characters muddled or simply unreadable.  As such, these choices represent both a visual design and an information architecture challenge, since font size differences will also present spacing issues to resolve.

Bringing it all together

With all of the details provided in the two posts of this series, here’s a quick review of the important parts:

  • When developing your Information Architecture for a multilingual site, account for the various cultures involved.  Each culture, in SharePoint terms, will be a “variation.”  A culture, remember, is a combination of a language and a country, represented like en-GB (English – United Kingdom) or pt-PT (Portuguese – Portugal).  This culture approach makes it easy to distinguish between two countries that share a broad language (e.g. Spanish), but differ in usage (e.g. Spain vs. Mexico).
  • Developing an IA for a multilingual site involves many more decisions and test cases to resolve than a single language site.  It’s important to explore the implications for your specific IA based on the languages that need to be supported and, when the visual design is complete, any challenges a specific design might pose based on the supported languages.
  • Within SharePoint, decide which variation will act as the “primary” or source variation.  This is the variation that will syndicate content to all other variations.  For example, if the source variation is German (from Germany), your content will start in German; once a page is approved, it will be copied to the other language variations in your site collection (e.g. en-GB, pt-PT).  From there, each non-source variation will be responsible for translating, approving and publishing a language-specific version of the German content.
  • You will have at least one RESX file per variation.  If you have lots of different cultures, you will have as many RESX files and they must all contain the same labels.  Because labels are not evaluated during compile-time in Visual Studio, you’ll only discover missing (or conflicted) labels in a resource file at run time.  This is not, obviously, a good user experience.  As a result, you should thoroughly test and control the modification of RESX files.  Take a look at this blog series from Carel Lotz regarding one approach to effective RESX management: http://fromthedevtrenches.blogspot.com/2011/04/managing-net-resx-duplication-part-1.html
  • Think carefully about the taxonomy (aka organization) of variations and labels.  As much as the IA process should define navigation, the overall taxonomy will drive label names and how labels are used in the application.  For the project example in this post, we used labels tied to interfaces (interface name prepended to label name).  This is one approach.  However, this approach neglects opportunities to leverage labels across interfaces.  Conversely, sharing labels across interfaces can make maintenance more challenging, since label changes will have different impacts across the application.  Here, experimentation and testing are key.
  • A SharePoint multilingual site is really a combination of SharePoint variations and .NET globalization.  You must necessarily implement both; end users will leverage the variations component and your developers will have to provide matching RESX files for application-specific labels and static text.
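Since missing or conflicting labels only surface at run time, a small consistency check over the RESX files can catch them earlier, as part of a build or test step.  A minimal Python sketch (the file contents and label names are hypothetical, and real RESX files carry additional metadata beyond the `data` elements shown here):

```python
# Sketch: verify that a culture-specific RESX file defines the same
# set of labels as the source file, so missing labels are caught
# before run time rather than in front of an end user.
import xml.etree.ElementTree as ET

def resx_labels(resx_xml: str) -> set:
    """Return the set of <data name="..."> labels in a RESX document."""
    root = ET.fromstring(resx_xml)
    return {d.get("name") for d in root.findall("data")}

def missing_labels(source_xml: str, culture_xml: str) -> set:
    """Labels present in the source RESX but absent from a culture RESX."""
    return resx_labels(source_xml) - resx_labels(culture_xml)

# Hypothetical, minimal RESX-like documents.
source = """<root>
  <data name="Login_UserID"><value>User ID</value></data>
  <data name="Login_Password"><value>Password</value></data>
</root>"""

german = """<root>
  <data name="Login_UserID"><value>Benutzername</value></data>
</root>"""

print(missing_labels(source, german))  # the German file is missing one label
```

Run against every culture file in the project, a check like this turns a run-time surprise into a build-time failure.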

As you may have surmised, there are a lot of details to consider when developing a multilingual web application.  SharePoint does provide decent facilities to enable basic multilingual sites and the .NET framework provides loads of flexibility in implementation.   Just be sure to consider the whole picture – it’s a combination of SharePoint centric constructs (aka variations), good information architecture/design and the technical “infrastructure” to make the whole solution work for end users.

16 January 2012

The [Tools are] too much with Us

As 2012 starts in earnest, I am reminded of the poem from which the title of this post is taken: “The World is too much with Us” by William Wordsworth.   In this poem, Wordsworth laments how out of tune with nature people had become during the first industrial revolution.  In much the same way, I see too much focus being placed on tools in the era of SharePoint.  Business users and Information Technology folks seem so enamored of the tools and technology that they forget the focus should be on needs and solutions.  This is especially true when discussing SharePoint and, as I’ve said many times, SharePoint is not the answer.

Instead, SharePoint, like any technology, needs only to be included insofar as it provides the basis for creating a solution to a specific problem (or problems).  For example, if you needed to manage documents, SharePoint could provide you with a Document Library for storing the files.  Further, you could leverage Content Types and Information Management Policies to enable more precise management of a document’s lifecycle (if that were a need).  However, your specific use of these features should and must be governed by the solution – the overall set of features, functions and the specific implementation in the context of your needs and goals.

When considering how to proceed with your SharePoint project, consider this one piece of advice: start with the problem or challenge first.  Ignore SharePoint and don’t speak of it again, unless you’re discussing how some feature in SharePoint can support a solution.  Even then, try focusing on the solution (give it a name if you have to) and not the tools or features involved.

07 October 2011

SharePoint Conference 2011– Good Reminder, Few Surprises

Microsoft wrapped up their SharePoint Conference 2011 on Thursday this week.  In attendance were some 7,600 people, all seemingly enthusiastic for the phenomenon that is SharePoint.  While there were few surprises, Microsoft did remind everyone why the platform is so popular.

If you’re interested in a complete review of the conference, check out my posts on the Real Story Group blog.

It’s fair to say that we haven’t seen the crest of SharePoint popularity or growth.  Microsoft even made an out-of-character announcement regarding another SharePoint conference, to be held in Las Vegas in 2012.  Two conferences in two years – sounds like something new may be announced.  Stay tuned!

08 August 2011

Migrating your Content to SharePoint

Many clients want to take advantage of the on-premise or cloud-based power of SharePoint. However, many struggle with creating the ideal process to migrate their content. Questions like what steps are involved and how to ensure success can be challenging to answer without proper guidance. To assist our clients in making good decisions and to help ensure migration success, we’ve created a formalized migration process.

If you’re interested in better understanding how to manage a content migration, read our Migrating your Content to SharePoint white paper or simply give us a ring.

29 June 2011

Does Microsoft need a SharePoint App Store?

Yes and no.  You see, it’s just not as easy as a black and white response.  You could argue that SharePoint needs an app store because there are far too many specific use cases for which its out-of-the-box experience is ill-suited.  However, SharePoint does ship with a number of “apps” (Microsoft calls them “Site Definitions”) that can satisfy relatively simple needs with little or no work.  The challenge is whether the out-of-the-box archetypes like “team sites” or “document workspaces” in Foundation, or “publishing portal” or “search center” in the Standard/Enterprise versions, are sufficient; could Microsoft or a 3rd party effectively create additional, purpose-built archetypes to improve end user adoption and overall usage?

To illustrate this point, let’s suppose you need to create a web-based collaboration space to work on a document with a small team of people inside your company.  It’s actually quite simple to use what SharePoint provides immediately. You’d likely start with a “document workspace” (shown in Figure 1).  The workspace can be provisioned while you’re reviewing the document from within Microsoft Word (or another Office product) or from SharePoint’s web-based interface. You can even invite your colleagues from the same interface (within Word) or, again, through the web interface, without any trouble. Once the workspace is provisioned, you’d automatically have a library to store the document (and the document would already be waiting there for you and your team members). In addition, the workspace would show who was participating in the workspace through the members list (with direct links to their individual profiles by clicking on their names), a task list to assign & track responsibilities, an announcements function to communicate with your team and a discussion thread for exploring topics relevant to the document. All of these features are packaged up and provisioned automatically. So what’s the problem?

Figure 1 – Typical Document Workspace in SharePoint 2010

What happens when you want to create a quick, web-based file sharing site and “invite” others inside or outside of your organization?  You’ll quickly find the standard SharePoint site definitions fall a bit short.  SharePoint does have features that support file sharing through sites and libraries; users could even access and edit those files through Office on their desktop or use Office web apps through a browser.  There’s also the possibility to let non-employees access the site.  Unfortunately, the collection of features, along with the requisite security model, is not available in a shipped SharePoint site definition.   As a result, an end user would have to start with an existing site definition, like a “blank site,” then add a combination of lists and libraries to create the environment they want.  Next, they would grant internal participants access to the site, applying library/list-specific or document-level permissions as necessary.  If the users aren’t in Active Directory (like the non-employees), SharePoint allows you to use SQL Membership to store those non-employee IDs in a SQL database.  However, the non-AD authentication configuration requires you to manipulate machine-level settings in an XML configuration file, create databases on a SQL server and grant specific application pool identities access to the database.  Most of this effort is typically outside the expertise of an average end user; even if it weren’t, because of the changes necessary to the farm, most administrators wouldn’t want end users changing these settings.

Ultimately, there’s little evidence that SharePoint is wholly unsuited for collaboration needs inside the enterprise. In fact, you could argue that SharePoint, given its breadth of functionality, has no equal in the collaboration space. However, there are well-documented challenges in creating your own applications on SharePoint. Remember that SharePoint, by Microsoft’s own admission, is a platform and not a product. This means companies are either left developing their own applications or buying from a 3rd party. This raises the question: should Microsoft devote some of their R&D budget to developing applications on top of SharePoint? Further, wouldn’t building applications on SharePoint, in fact, be more useful to a greater number of people than adding new features to the platform? This approach would certainly go a long way toward blunting attempts by companies like Huddle, 37 Signals, Box.NET, HyperOffice and others to discredit SharePoint for purpose-built solutions. Microsoft has demonstrated, historically, a willingness to go down this path with new site definitions like the “Fab 40.” Unfortunately, some of those applications were just silly (like the Baseball Team management portal), and Microsoft never followed up with equivalents for SharePoint 2010.

If you’re searching for specific applications on SharePoint today, you’ll need to look to the myriad 3rd-party vendors like Bamboo Solutions, Coras Works and SharePoint Solutions, or the app-store concept from the larger SharePoint community called Sharevolution. Unlike Apple and Google (or even Microsoft’s own online store for everything from Windows to Xbox to non-Microsoft hardware), Microsoft has not created a one-stop shop to find specific solutions built on SharePoint. One can only hope that, in the future, Microsoft does something more creative than simply redirecting the domain sharepointapps.com back to the SharePoint product site on Microsoft.com.

21 June 2011

Technology Just Gets in the Way

The idea that most folks in IT, and even some on the non-IT side, spend way too much time worrying, thinking and generally kvetching about technology is almost passé these days – everyone knows it’s true, but they still get wrapped up in it anyway. Incredibly, most people tend to see the process of solving business challenges exclusively through the lens of what a specific tool can solve. This condition is never more obvious than when firms start discussing SharePoint.

Inevitably, there are discussions that start out well – focused on business needs and what users have to accomplish. Then, for some strange reason, things go seriously awry. In the words of Ace Ventura, Pet Detective: “Gee, Chuck, the date started out good. Just before we got to the party she seemed to tense up.” Perfectly rational people start debating whether SharePoint’s wiki has sufficient functionality, whether the firm can really use the out-of-the-box search, and whether the records management features are robust enough for the entire enterprise. All of these questions are very reasonable to ask if two conditions have already been met: 1) you know what problems you’re trying to solve and 2) you have well-defined goals and corresponding metrics to measure whether you’ve successfully met those goals (notice I avoided using the word “requirements”). Unfortunately, most organizations fail to satisfy either condition before pushing headlong into a “how do we implement SharePoint” discussion.

If you regularly read this blog, you’ll know that I’ve already said SharePoint is not the answer.  However, this is true of any technology if you haven’t clearly defined what you want to accomplish.  No matter what tools you might be considering, you must be clear about what you’re trying to accomplish AND ensure that your goals are achievable.

Published on the Word of Pie blog, there’s a great post about taking a break from ECM that illustrates this point perfectly (though through the broader lens of ECM). In the post, Laurence Hart (@piewords) describes all of the challenges with ECM implementations. He does such a terrific job that I’ll end this post with only one bit of advice: read his post.