
Vertical Alignment of Elements in a DIV

The method for vertically aligning text in a DIV differs from the common approach, because style="vertical-align:middle;", which works for table-cell-like elements, has no effect on DIVs.

As an alternative, if the height of the text is fixed, displaying the text inside a DIV with proper padding and margin settings will work. But when we have a few lines of text and the total height of the text is not fixed (variable-length text, different font sizes, etc.), padding and margin will not work.
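For instance, when the text is a single line of known height, the padding approach could look like this (all values here are illustrative, not taken from the example further below):

```html
<!-- One line of roughly 20px-high text, padded so it sits in the middle
     of a ~100px-tall box. Works only because the text height is fixed. -->
<div style="padding:40px 0; border:1px solid black; width:130px;">
 Vertically centered text
</div>
```

As soon as the text wraps to a second line, the box grows and the padding no longer centers anything, which is exactly the limitation described above.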

As a solution, we can use the CSS display property with the value “table-cell”, as in the example below.

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
 <head>
  <title></title>
  <style type="text/css">        
   #tablecellexamples div { border:1px solid black; width:130px; background-color:#eee }
  </style>
 </head>
 <body>
  <div id="tablecellexamples">          
   <div style="display:table-row">   
    <div>
        Lorem ipsum dolor sit amet, consectetur adipiscing elit. Proin luctus dignissim ipsum a dignissim. Proin mattis orci sit amet quam feugiat lobortis.
    </div>
    <div style="display:table-cell; vertical-align:top">Top aligned</div>
    <div style="display:table-cell; vertical-align:middle">Center aligned</div>
    <div style="display:table-cell; vertical-align:bottom">Bottom aligned</div>
   </div>
  </div>
 </body>
</html>

In the rendered page, all three cells stretch to the height of the longest text, with “Top aligned”, “Center aligned” and “Bottom aligned” positioned accordingly.

Instruct Search Engine Robots to Skip Part of the Site from Crawling - robots.txt file

In Web site implementations, there can be a requirement that some files and directories of a web site should not be indexed by any of the search engines. For this purpose we can use the Robots Exclusion Standard, also known as the Robots Exclusion Protocol (http://www.robotstxt.org/orig.html). Here we use a robots.txt file, a text file placed in the root of a site that tells search engine robots which files and directories of the web site should not be accessed (crawled).

Important:
  • When a robot wants to visit our Web site (http://myserver.com/default.aspx), it first checks for http://myserver.com/robots.txt. Robots do not search the whole site for a file named robots.txt; they look only in the main directory. They strip the path component from the URL (everything from the first single slash) and put "/robots.txt" in its place.
  • The file name should be all lower case: "robots.txt", not "Robots.TXT".
  • Robots can ignore our robots.txt (especially malware robots), which means we cannot rely 100% on robots.txt to keep data from being indexed and displayed in search results. Because of that, it should not be used as a way to protect sensitive data.
  • The robots.txt file is publicly available, so anyone can browse it and see which sections of our web site we do not want robots to access.
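The URL derivation described in the first point can be sketched in a few lines of JavaScript (the helper name is ours, not part of any standard):

```javascript
// Build the robots.txt URL for any page URL by stripping the path
// component and putting "/robots.txt" in its place.
function robotsTxtUrl(pageUrl) {
  var u = new URL(pageUrl);
  return u.origin + "/robots.txt";
}

console.log(robotsTxtUrl("http://myserver.com/default.aspx"));
// → http://myserver.com/robots.txt
```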

robots.txt file syntax:

The robots.txt file uses two rules:
  • User-agent: the robot the following rule applies to
  • Disallow: the URL we want to block

Sample 1: Entire server content is excluded from all robots.

   User-agent: *
   Disallow: /

Sample 2: Two directories are excluded from all robots.

   User-agent: *
   Disallow: /archive/
   Disallow: /tmp/

Sample 3: A file is excluded from Googlebot search engine.

   User-agent: Googlebot
   Disallow: /myFile.aspx

Not all search engines support pattern matching or regular expressions in the User-agent or Disallow lines. The '*' in the User-agent field is a special case meaning "any robot".
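A minimal sketch of how these rules are evaluated, assuming the simple path-prefix matching of the original standard (function name and parsing details are ours; real crawlers handle grouped records and extensions more carefully):

```javascript
// Decide whether a robot may fetch a path, given robots.txt contents.
// Disallow values are treated as plain path prefixes, per the original
// standard; '*' in User-agent matches any robot.
function isAllowed(robotsTxt, userAgent, path) {
  var applies = false;       // does the current record apply to this robot?
  var disallowed = [];
  robotsTxt.split("\n").forEach(function (raw) {
    var line = raw.split("#")[0].trim();   // strip comments and whitespace
    var m = line.match(/^(User-agent|Disallow)\s*:\s*(.*)$/i);
    if (!m) return;
    if (m[1].toLowerCase() === "user-agent") {
      applies = (m[2] === "*" || m[2].toLowerCase() === userAgent.toLowerCase());
    } else if (applies && m[2] !== "") {   // empty Disallow allows everything
      disallowed.push(m[2]);
    }
  });
  return !disallowed.some(function (p) { return path.indexOf(p) === 0; });
}
```

With Sample 1 above, every path is blocked for every robot; with Sample 3, only Googlebot is blocked from /myFile.aspx.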

There are a few third-party tools available to validate a robots.txt file:
  • Robots.txt Checker at http://tool.motoricerca.info/robots-checker.phtml
  • Google Webmaster Tools at http://www.google.com/webmasters/tools/

A list of robot software implementations and operators can be found at http://www.robotstxt.org/db.html

There are many robots.txt generation tools on the web. One of them is Mavention Robots.txt, which is for creating and managing robots.txt files on websites built on the SharePoint platform.

Issues in V4.master based Custom Master Page in SharePoint Search Center

Search Center master pages are based on minimal.master; these sites do not have v4.master or nightandday.master applied to them. If we create a custom master page based on v4.master and apply it to one of the search center sites, we notice two main issues:
1.    The Web Part Add button is not visible when editing a page in the search center.
2.    There are two blue ribbon bars and two Site Actions buttons.

This is because the search center page layouts and pages are created specifically to work with minimal.master. The minimal.master does not provide any of the usual navigation and location controls; instead, the default search center page layouts provide their own ribbon row using the SPNavigation content placeholder. Combined with v4.master, which also renders a ribbon row, everything appears twice.
Solution: adding the jQuery below to our custom master page fixes both problems.
<script type="text/javascript">
    $(document).ready(function () {
        // The page layout renders its own ribbon row with the same id as the
        // one from the master page; hide the nested duplicate.
        $("#s4-ribbonrow").children("div").each(function () {
            if ($(this).attr("id") == "s4-ribbonrow")
                $(this).hide();
        });
    });
</script>

Language Dropdown for SharePoint 2010 Variations

If you are using SharePoint 2010 variations for multilingual purposes, there is a native SharePoint user control for switching between languages. The control is named “VariationsLabelMenu” and resides in the 14 hive: C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\CONTROLTEMPLATES\VariationsLabelMenu.ascx
By default this control is hidden, and if you are using a custom master page it might not have been added to it. In that case you can use the following markup in the master page to add the control.
<!-- Register dll -->
<%@ Register Tagprefix="PublishingWebControls" Namespace="Microsoft.SharePoint.Publishing.WebControls" Assembly="Microsoft.SharePoint.Publishing, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<!-- Add control and datasource -->
<PublishingWebControls:VariationsLabelEcbMenu id="variationLabelMenu" DataSourceID="LabelMenuDataSource" DisplayText="" IsCallbackMode="true" runat="server" /> 
<PublishingWebControls:VariationDataSource id="LabelMenuDataSource" LabelMenuConfiguration="1" Filter="" runat="server"/>


Note: As mentioned in various places, the language (variation label) dropdown is disabled to improve performance. You may also experience slowness when loading the dropdown items with this control.

SharePoint 2010 Vertical Scrolling Problem in Google Chrome

We had to support Google Chrome on a public facing website built on SharePoint 2010, and the only major issue we encountered was a vertical scrolling problem. To reproduce it, open a page in Google Chrome; if the vertical scroll bar exists, reload the page and the scroll bar becomes grayed out. Sometimes the horizontal scroll bar does not appear after a refresh either.
For this issue we found a jQuery solution. Adding the following script to the master page fixed it for us.
<script type="text/javascript">
 jQuery(document).ready(function()
 {
  // Explicitly size the scrollable workspace to the window height minus
  // the ribbon, so Chrome recalculates its scroll bars correctly.
  jQuery("#s4-workspace").height(jQuery(window).height() - jQuery("#s4-ribbonrow").height());
 });
</script>


Note: Even though the above script fixed the scrolling problem in our case, it seems to introduce other problems for some people. So use it with care!

System.ArgumentOutOfRangeException: Specified argument was out of the range of valid values. Parameter name: utcDate

We got this error when we tried to configure a new web server to deploy an ASP.NET application. The application was working fine in other environments, but in the new environment it threw a Javascript error, “WebResource.axd 'WebForm_PostBackOptions' is undefined”, and did not work as expected.

Even though the Javascript error was not descriptive enough to find the root cause, there was another error logged in the Event Viewer at the same time:
System.ArgumentOutOfRangeException: Specified argument was out of the range of valid values. Parameter name: utcDate

We did a bit of searching on this error and found the cause: some assemblies were dated in the future. This happened due to wrong date and time configuration on the server. The server was a fresh install and the date and time had not been set properly; .NET was installed before the time was corrected. Once the server was joined to the domain it picked up the correct date and time, but the DLLs were still dated in the future. That is the reason for the WebResource.axd Javascript error. With every ScriptResource.axd request two parameters are passed, 'd' and 't': the 'd' parameter is the data and the 't' parameter is the time at which the resource was built.
It seems the datetime in this case is obtained from the last-write date of System.Web.Extensions.dll or System.Web.dll. Therefore there were suggestions on the Web to change the created and modified dates of these DLLs in the GAC.

But in our case we had to reinstall the .NET Framework to fix the issue!