Apply High Availability Search Topology – SharePoint 2013 using PowerShell


$App1 = Get-SPEnterpriseSearchServiceInstance -Identity "app1"
$App2 = Get-SPEnterpriseSearchServiceInstance -Identity "app2"
$fe1 = Get-SPEnterpriseSearchServiceInstance -Identity "wfe1"
$fe2 = Get-SPEnterpriseSearchServiceInstance -Identity "wfe2"
$fe3 = Get-SPEnterpriseSearchServiceInstance -Identity "wfe3"
$fe4 = Get-SPEnterpriseSearchServiceInstance -Identity "wfe4"
$searchName = "High Availability Search Service"
$searchDB = "High Availability Search Service_DB"
$searchAcct = "imam\FarmAdminUserName"
$searchManagedAcct = Get-SPManagedAccount | Where-Object { $_.UserName -eq $searchAcct }
$searchAppPoolName = "High Availability Search Services Application Pool"
IF ((Get-SPServiceApplicationPool | Where-Object { $_.Name -eq $searchAppPoolName }).Name -ne $searchAppPoolName) {
    $searchAppPool = New-SPServiceApplicationPool -Name $searchAppPoolName -Account $searchManagedAcct
}

# Start the Search Service Instance on every server and wait for each to come online
foreach ($instance in @($App1, $App2, $fe1, $fe2, $fe3, $fe4)) {
    IF ((Get-SPEnterpriseSearchServiceInstance -Identity $instance).Status -eq 'Disabled') {
        Write-Host "Starting Search Service Instance on" $instance.Server.Name
        Start-SPEnterpriseSearchServiceInstance -Identity $instance
        While ((Get-SPEnterpriseSearchServiceInstance -Identity $instance).Status -ne 'Online') {
            Start-Sleep 5
            Write-Host -NoNewline "."
        }
        Write-Host -ForegroundColor Green "Search Service Instance started on" $instance.Server.Name
    }
    ELSE {
        Write-Host -ForegroundColor Green "Search Service Instance is already running on" $instance.Server.Name
    }
}

# Start the Query and Site Settings Service Instance on both application servers
foreach ($server in @($App1.Server.Name, $App2.Server.Name)) {
    Start-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance $server
    Do {
        Start-Sleep 3
        Write-Host -NoNewline "."
    } While ((Get-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance | Where-Object { $_.Server.Name -eq $server }).Status -ne 'Online')
    Write-Host -ForegroundColor Green "Query and Site Settings Service Instance started on" $server
}
$searchAppPool = Get-SPServiceApplicationPool -Identity $searchAppPoolName

IF ((Get-SPEnterpriseSearchServiceApplication).Status -ne 'Online') {
    Write-Host "Provisioning the Search Service Application. Please wait..."
    $searchApp = New-SPEnterpriseSearchServiceApplication -Name $searchName -ApplicationPool $searchAppPool -AdminApplicationPool $searchAppPool -DatabaseName $searchDB
    Do {
        Start-Sleep 2
        Write-Host -NoNewline "."
    } While ((Get-SPEnterpriseSearchServiceApplication).Status -ne 'Online')
    Write-Host -ForegroundColor Green "Provisioned Search Service Application"
}
ELSE {
    Write-Host -ForegroundColor Green "Search Service Application already provisioned."
    $searchApp = Get-SPEnterpriseSearchServiceApplication
}

$searchApp | Get-SPEnterpriseSearchAdministrationComponent | Set-SPEnterpriseSearchAdministrationComponent -SearchServiceInstance $App1
$initialTopology = Get-SPEnterpriseSearchTopology -SearchApplication $searchApp
$cloneTopology = New-SPEnterpriseSearchTopology -SearchApplication $searchApp -Clone -SearchTopology $initialTopology

# Redundant admin, crawl, analytics, content processing, and index components on both application servers
foreach ($instance in @($App1, $App2)) {
    New-SPEnterpriseSearchAdminComponent -SearchServiceInstance $instance -SearchTopology $cloneTopology
    New-SPEnterpriseSearchCrawlComponent -SearchServiceInstance $instance -SearchTopology $cloneTopology
    New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchServiceInstance $instance -SearchTopology $cloneTopology
    New-SPEnterpriseSearchContentProcessingComponent -SearchServiceInstance $instance -SearchTopology $cloneTopology
    New-SPEnterpriseSearchIndexComponent -SearchServiceInstance $instance -SearchTopology $cloneTopology -IndexPartition 0
}

# Redundant query processing components on the four front-end servers
foreach ($instance in @($fe1, $fe2, $fe3, $fe4)) {
    New-SPEnterpriseSearchQueryProcessingComponent -SearchServiceInstance $instance -SearchTopology $cloneTopology
}

# Activate the new topology, then remove the old, now-inactive one
Set-SPEnterpriseSearchTopology -Identity $cloneTopology
$initialTopology = Get-SPEnterpriseSearchTopology -SearchApplication $searchApp | Where-Object { $_.State -eq "Inactive" }
Remove-SPEnterpriseSearchTopology -Identity $initialTopology -Confirm:$false

$searchAppProxy = New-SPEnterpriseSearchServiceApplicationProxy -Name "$searchName Proxy" -SearchApplication $searchApp


27 New Features of .NET Framework 4.0

The new features and improvements are described in the following sections: 

Programming Languages 
Common Language Runtime (CLR) 
Base Class Libraries 

Common Language Runtime (CLR)

The following sections describe new features in security, parallel computing, performance and diagnostics, dynamic language runtime, and other CLR-related technologies.


Security

The .NET Framework 4.0 provides simplifications, improvements, and expanded capabilities in the security model. For more information, see Security Changes in the .NET Framework 4.

Parallel Computing

The .NET Framework 4.0 introduces a new programming model for writing multithreaded and asynchronous code that greatly simplifies the work of application and library developers. The new model enables developers to write efficient, fine-grained, and scalable parallel code in a natural idiom without having to work directly with threads or the thread pool. The new Parallel and Task classes, and other related types, support this new model. Parallel LINQ (PLINQ), which is a parallel implementation of LINQ to Objects, enables similar functionality through declarative syntax. For more information, see Parallel Programming in the .NET Framework.
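To make the model concrete, here is a small sketch (the names ParallelDemo and SumSquares are illustrative only) that compiles a C# helper with Add-Type and uses the Parallel.For overload with thread-local state to sum squares across threads:

```powershell
# Compile a small C# helper that uses Parallel.For with thread-local accumulators.
Add-Type -TypeDefinition @"
using System.Threading.Tasks;
public static class ParallelDemo {
    public static long SumSquares(int n) {
        long total = 0;
        object gate = new object();
        Parallel.For(0, n,
            () => 0L,                                        // per-thread initial value
            (i, state, local) => local + (long)i * i,        // body: accumulate locally
            local => { lock (gate) { total += local; } });   // merge each thread's result
        return total;
    }
}
"@
[ParallelDemo]::SumSquares(1000)   # 332833500
```

The thread-local overload avoids taking the lock on every iteration; each thread locks only once, when it merges its private subtotal.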

Performance and Diagnostics

In addition to the features described in the following sections, the .NET Framework 4.0 provides improvements in startup time and working set size, as well as faster performance for multithreaded applications.

ETW Events

You can now access Event Tracing for Windows (ETW) events for diagnostic purposes to improve performance.

Performance Monitor (Perfmon.exe) now enables you to disambiguate multiple applications that use the same name and multiple versions of the common language runtime loaded by a single process. This requires a simple registry modification. For more information, see Performance Counters and In-Process Side-By-Side Applications.

Code Contracts

Code contracts let you specify contractual information that is not represented by a method’s or type’s signature alone. The new System.Diagnostics.Contracts namespace contains classes that provide a language-neutral way to express coding assumptions in the form of pre-conditions, post-conditions, and object invariants. The contracts improve testing with run-time checking, enable static contract verification, and support documentation generation.

The applicable scenarios include the following:

  • Perform static bug finding, which enables some bugs to be found without executing the code.
  • Create guidance for automated testing tools to enhance test coverage.
  • Create a standard notation for code behavior, which provides more information for documentation.
Lazy Initialization

With lazy initialization, the memory for an object is not allocated until it is needed. Lazy initialization can improve performance by spreading object allocations evenly across the lifetime of a program. You can enable lazy initialization for any custom type by wrapping the type inside a System.Lazy<T> class.
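A quick sketch in PowerShell, which can exercise the .NET API directly (the ScriptBlock-to-Func[int] cast assumes PowerShell 4.0 or later):

```powershell
# The factory delegate is stored, not invoked, when the Lazy object is built.
$lazy = New-Object 'System.Lazy[int]' ([Func[int]] { 40 + 2 })
$beforeAccess = $lazy.IsValueCreated   # False - nothing computed yet
$value = $lazy.Value                   # 42    - the factory runs here, on first access
$afterAccess = $lazy.IsValueCreated    # True  - later reads reuse the cached value
```

By default Lazy<T> is also thread-safe: if several threads race to read Value, the factory still runs only once.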

Dynamic Language Runtime

The dynamic language runtime (DLR) is a new runtime environment that adds a set of services for dynamic languages to the CLR. The DLR makes it easier to develop dynamic languages to run on the .NET Framework and to add dynamic features to statically typed languages. To support the DLR, the new System.Dynamic namespace is added to the .NET Framework. In addition, several new classes that support the .NET Framework infrastructure are added to the System.Runtime.CompilerServices namespace. For more information, see Dynamic Language Runtime Overview.

In-Process Side-by-Side Execution

In-process side-by-side hosting enables an application to load and activate multiple versions of the common language runtime (CLR) in the same process. For example, you can run applications that are based on the .NET Framework 2.0 SP1 and applications that are based on .NET Framework 4.0 in the same process. Older components continue to use the same CLR version, and new components use the new CLR version. For more information, see Hosting Changes in the .NET Framework 4.


Interoperability

New interoperability features and improvements include the following:

  • You no longer have to use primary interop assemblies (PIAs). Compilers embed the parts of the interop assemblies that the add-ins actually use, and type safety is ensured by the common language runtime.
  • You can use the System.Runtime.InteropServices.ICustomQueryInterface interface to create a customized, managed code implementation of the IUnknown::QueryInterface method. Applications can use the customized implementation to return a specific interface (except IUnknown) for a particular interface ID.

Profiling

In the .NET Framework 4.0, you can attach profilers to a running process at any point, perform the requested profiling tasks, and then detach. For more information, see the ICLRProfiling::AttachProfiler method.

Garbage Collection

The .NET Framework 4.0 provides background garbage collection; for more information, see the entry So, what’s new in the CLR 4.0 GC? in the CLR Garbage Collector blog. 

Covariance and Contravariance

Several generic interfaces and delegates now support covariance and contravariance. For more information, see Covariance and Contravariance in the Common Language Runtime.

Base Class Libraries

The following sections describe new features in collections and data structures, exception handling, I/O, reflection, threading, and Windows registry.

Collections and Data Structures

Enhancements in this area include the new System.Numerics.BigInteger structure, the System.Collections.Generic.SortedSet<T> generic class, and tuples.


BigInteger Structure

The new System.Numerics.BigInteger structure is an arbitrary-precision integer data type that supports all the standard integer operations, including bit manipulation. It can be used from any .NET Framework language. In addition, some of the new .NET Framework languages (such as F# and IronPython) have built-in support for this structure.
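A quick way to try the structure is from PowerShell (the version guard is only a convenience so the snippet also works on Windows PowerShell, where the System.Numerics assembly must be loaded explicitly):

```powershell
# Load System.Numerics on Windows PowerShell; newer PowerShell versions have it by default.
if ($PSVersionTable.PSVersion.Major -lt 6) { Add-Type -AssemblyName System.Numerics }
$big = [System.Numerics.BigInteger]::Pow(2, 100)   # far beyond Int64.MaxValue
$big.ToString()   # 1267650600228229401496703205376
```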

SortedSet Generic Class

The new System.Collections.Generic.SortedSet<T> class provides a self-balancing tree that maintains data in sorted order after insertions, deletions, and searches. This class implements the new System.Collections.Generic.ISet<T> interface.

The System.Collections.Generic.HashSet<T> class also implements the ISet<T> interface.


Tuples

A tuple is a simple generic data structure that holds an ordered set of items of heterogeneous types. Tuples are supported natively in languages such as F# and IronPython, but are also easy to use from any .NET Framework language such as C# and Visual Basic. The .NET Framework 4.0 adds eight new generic tuple classes, and also a Tuple class that contains static factory methods for creating tuples.
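For example, Tuple.Create infers the element types from its arguments, and the items are read back through the Item1…ItemN properties (the values below are arbitrary):

```powershell
# A 3-tuple holding an int, a string, and a double - no custom class required.
$row = [System.Tuple]::Create(7, "seven", 7.0)
$row.Item1   # 7
$row.Item2   # seven
```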

Exception Handling

The .NET Framework 4.0 class library contains the new System.Runtime.ExceptionServices namespace and adds the ability to handle corrupted state exceptions.

Corrupted State Exceptions

The CLR no longer delivers corrupted state exceptions that occur in the operating system to managed code for handling, unless you apply the HandleProcessCorruptedStateExceptionsAttribute attribute to the method that handles the corrupted state exception.

Alternatively, you can add the following setting to an application’s configuration file:
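The setting referred to here is presumably the legacyCorruptedStateExceptionsPolicy runtime element, which opts the whole process back into the pre-4.0 behavior:

```xml
<configuration>
  <runtime>
    <!-- Deliver corrupted state exceptions to managed handlers process-wide -->
    <legacyCorruptedStateExceptionsPolicy enabled="true" />
  </runtime>
</configuration>
```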



I/O

The key new features in I/O are efficient file enumerations, memory-mapped files, and improvements in isolated storage and compression.

File System Enumeration Improvements

New enumeration methods in the Directory and DirectoryInfo classes return IEnumerable<T> collections instead of arrays. These methods are more efficient than the array-based methods because they do not have to allocate a (potentially large) array, and you can access the first results immediately instead of waiting for the complete enumeration to occur.

There are also new methods in the static File class that read and write lines from files by using IEnumerable<T> collections. These methods are useful in LINQ scenarios where you may want to quickly and efficiently query the contents of a text file and write out the results to a log file without allocating any arrays.
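The streaming behavior is easy to observe from PowerShell (a sketch using a throwaway temp file; only the first line is ever pulled through the enumerator):

```powershell
# Write a small file, then pull just the first line through the lazy enumerator.
$path = Join-Path ([System.IO.Path]::GetTempPath()) 'readlines-demo.txt'
[System.IO.File]::WriteAllLines($path, [string[]]@('alpha', 'beta', 'gamma'))
$first = [System.IO.File]::ReadLines($path) | Select-Object -First 1
$first   # alpha - the remaining lines are never materialized
[System.IO.File]::Delete($path)
```

Contrast this with File.ReadAllLines, which allocates the entire array before the first element can be examined.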

Memory-Mapped Files

The new System.IO.MemoryMappedFiles namespace provides memory mapping functionality, which is available in Windows. You can use memory-mapped files to edit very large files and to create shared memory for inter-process communication. The new System.IO.UnmanagedMemoryAccessor class enables random access to unmanaged memory, similar to how System.IO.UnmanagedMemoryStream enables sequential access to unmanaged memory.
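A minimal round-trip sketch (the temp-file name and contents are arbitrary; CreateFromFile and CreateViewAccessor are the core .NET 4 entry points):

```powershell
# Map a 16-byte temp file and read/write it through a random-access view.
$path = Join-Path ([System.IO.Path]::GetTempPath()) 'mmf-demo.bin'
[System.IO.File]::WriteAllBytes($path, [byte[]](1..16))
$mmf  = [System.IO.MemoryMappedFiles.MemoryMappedFile]::CreateFromFile($path)
$view = $mmf.CreateViewAccessor()
$before = $view.ReadByte(0)    # 1 - the first byte written above
$view.Write(0, [byte]42)       # patch the file through the mapping
$after = $view.ReadByte(0)     # 42
$view.Dispose(); $mmf.Dispose()
[System.IO.File]::Delete($path)
```

For cross-process shared memory you would instead open the same mapping by name from two processes; the file-backed form above is the simplest to demonstrate.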

Isolated Storage Improvements

Partial-trust applications, such as Windows Presentation Foundation (WPF) browser applications (XBAPs) and ClickOnce partial-trust applications, now have the same capabilities in the .NET Framework as they do in Silverlight. The default quota size is doubled, and applications can prompt the user to approve or reject a request to increase the quota. The System.IO.IsolatedStorage.IsolatedStorageFile class contains new members to manage the quota and to make working with files and directories easier.

Compression Improvements

The compression algorithms for the System.IO.Compression.DeflateStream and System.IO.Compression.GZipStream classes have been improved so that data that is already compressed is no longer inflated. This results in much better compression ratios. Also, the 4-gigabyte size restriction for compressing streams has been removed.


The .NET Framework 4.0 provides the capability to monitor the performance of your application domains.

Application Domain Resource Monitoring

Until now, there has been no way to determine whether a particular application domain is affecting other application domains, because the operating system APIs and tools, such as the Windows Task Manager, were precise only to the process level. Starting with the .NET Framework 4.0, you can get processor usage and memory usage estimates per application domain.

Application domain resource monitoring is available through the managed AppDomain class, native hosting APIs, and event tracing for Windows (ETW). When this feature has been enabled, it collects statistics on all application domains in the process for the life of the process.

For more information, see the <appDomainResourceMonitoring> element and the monitoring properties of the AppDomain class.

64-bit View and Other Registry Improvements

Windows registry improvements include the ability to explicitly open the 32-bit or 64-bit view of the registry by using the new Microsoft.Win32.RegistryView enumeration (for example, with RegistryKey.OpenBaseKey).


Threading

General threading improvements include the following:

  • The new Monitor.Enter(Object, ref Boolean) method overload takes a Boolean reference and atomically sets it to true only if the monitor is successfully entered.
  • You can use the Thread.Yield method to have the calling thread yield execution to another thread that is ready to run on the current processor.
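The first of these is easy to demonstrate (a single-thread sketch; in real code the pattern lives inside try/finally exactly as shown):

```powershell
# The new Monitor.Enter overload records whether the lock was actually taken,
# which makes the release in the finally block reliable even if Enter throws.
$gate = New-Object object
$lockTaken = $false
try {
    [System.Threading.Monitor]::Enter($gate, [ref]$lockTaken)
    # ... protected work would go here ...
}
finally {
    if ($lockTaken) { [System.Threading.Monitor]::Exit($gate) }
}
$lockTaken   # True - the lock was acquired and then released
```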

The following sections describe new threading features.

Unified Model for Cancellation

The .NET Framework 4.0 provides a new unified model for cancellation of asynchronous operations. The new System.Threading.CancellationTokenSource class is used to create a CancellationToken that may be passed to any number of operations on multiple threads. By calling Cancel() on the token source object, the IsCancellationRequested property on the token is set to true and the token’s wait handle is signaled, at which time any actions registered with the token are invoked. Any object that has a reference to that token can monitor the value of that property and respond as appropriate.
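In outline (a same-thread sketch for brevity; in practice the token is handed to work running on other threads):

```powershell
# One source, one token; cancelling the source flips the token for every holder.
$cts   = New-Object System.Threading.CancellationTokenSource
$token = $cts.Token
$beforeCancel = $token.IsCancellationRequested   # False
$cts.Cancel()
$afterCancel = $token.IsCancellationRequested    # True - all copies of the token agree
$cts.Dispose()
```

Because CancellationToken is a struct, copies handed to many operations all observe the same underlying source.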

Thread-Safe Collection Classes

The new System.Collections.Concurrent namespace introduces several new thread-safe collection classes that provide lock-free access to items whenever useful, and fine-grained locking when locks are appropriate. Using these classes in multithreaded scenarios should improve performance over collection types such as ArrayList and List<T>.
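For example, ConcurrentQueue<T> exposes Try-style methods that report success instead of throwing (shown from a single thread for brevity):

```powershell
# TryDequeue reports failure on an empty queue instead of throwing.
$queue = New-Object 'System.Collections.Concurrent.ConcurrentQueue[int]'
$queue.Enqueue(10)
$queue.Enqueue(20)
$item = 0
$got = $queue.TryDequeue([ref]$item)   # True
$item                                  # 10 - FIFO order is preserved
```

The Try pattern matters under concurrency: between a Count check and a Dequeue call, another thread could have emptied the queue, so the test and the operation must be atomic.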

Synchronization Primitives

New synchronization primitives in the System.Threading namespace enable fine-grained concurrency and faster performance by avoiding expensive locking mechanisms. The Barrier class enables multiple threads to work on an algorithm cooperatively by providing a point at which each task can signal its arrival and then block until the other participants in the barrier have arrived. The CountdownEvent class simplifies fork and join scenarios by providing an easy rendezvous mechanism. The ManualResetEventSlim class is a lock-free synchronization primitive similar to the ManualResetEvent class; it is lighter weight but can be used only for intra-process communication. The SemaphoreSlim class is a lightweight synchronization primitive that limits the number of threads that can access a resource or a pool of resources at the same time; it, too, can be used only for intra-process communication. The SpinLock class is a mutual exclusion lock primitive that causes the thread that is trying to acquire the lock to wait in a loop, or spin, until the lock becomes available. The SpinWait class is a small, lightweight type that will spin for a time and eventually put the thread into a wait state if the spin count is exceeded.
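CountdownEvent, for instance, can be exercised from a single thread to show the signaling semantics (real code would call Signal from multiple worker threads as each finishes):

```powershell
# The event becomes signaled only after the expected number of Signal() calls.
$countdown = New-Object System.Threading.CountdownEvent 3
$null = $countdown.Signal()     # one participant done
$partway = $countdown.IsSet     # False - two still outstanding
$null = $countdown.Signal(2)    # count the remaining two at once
$done = $countdown.IsSet        # True - Wait() would now return immediately
$countdown.Dispose()
```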


Networking

Enhancements have been made that affect how integrated Windows authentication is handled by the HttpWebRequest, HttpListener, SmtpClient, SslStream, NegotiateStream, and related classes in the System.Net and related namespaces. Support was added for extended protection to enhance security. The changes to support extended protection are available only for applications on Windows 7; the extended protection features are not available on earlier versions of Windows. For more information, see Integrated Windows Authentication with Extended Protection.


ASP.NET

The following sections describe new features in ASP.NET core services, Web Forms, Dynamic Data, and Visual Web Developer.

ASP.NET Core Services

ASP.NET introduces several features that improve core ASP.NET services, Web Forms, Dynamic Data, and Visual Web Developer. For more information, see What’s New in ASP.NET and Web Development.

ASP.NET Web Forms

Web Forms has been a core feature in ASP.NET since the release of ASP.NET 1.0. Many enhancements have been made in this area for ASP.NET 4, including the following:

  • The ability to set meta tags.
  • More control over view state.
  • Easier ways to work with browser capabilities.
  • Support for using ASP.NET routing with Web Forms.
  • More control over generated IDs.
  • The ability to persist selected rows in data controls.
  • More control over rendered HTML in the FormView and ListView controls.
  • Filtering support for data source controls.
Dynamic Data

For ASP.NET 4, Dynamic Data has been enhanced to give developers even more power for quickly building data-driven Web sites. This includes the following:

  • Automatic validation that is based on constraints defined in the data model.
  • The ability to easily change the markup that is generated for fields in the GridView and DetailsView controls by using field templates that are part of your Dynamic Data project.
Visual Web Developer Enhancements

The Web page designer in Visual Studio 2010 has been enhanced for better CSS compatibility, includes additional support for HTML and ASP.NET markup code examples, and features a redesigned version of IntelliSense for JScript. In addition, two new deployment features called Web packaging and One-Click Publish make deploying Web applications easier.


The following sections describe new features in Windows Presentation Foundation (WPF) and Managed Extensibility Framework (MEF).

Windows Presentation Foundation

In the .NET Framework 4.0, Windows Presentation Foundation (WPF) contains changes and improvements in many areas. This includes controls, graphics, and XAML.

For more information, see What’s New in Windows Presentation Foundation Version 4.

Managed Extensibility Framework

The Managed Extensibility Framework (MEF) is a new library in the .NET Framework 4.0 that enables you to build extensible and composable applications. MEF enables application developers to specify points where an application can be extended, expose services to offer to other extensible applications, and create parts for consumption by extensible applications. It also enables easy discoverability of available parts based on metadata, without the need to load the assemblies for the parts.

For more information, see Managed Extensibility Framework. For a list of the MEF types, see the System.ComponentModel.Composition namespace.


ADO.NET

For more information, see What’s New in ADO.NET.

Expression Trees

Expression trees are extended with new types that represent control flow, for example, LoopExpression and TryExpression. These new types are used by the dynamic language runtime (DLR) but not by LINQ.


Windows Communication Foundation

Windows Communication Foundation (WCF) provides the new features and enhancements described in the following sections.

Support for WS-Discovery

The Service Discovery feature enables client applications to dynamically discover service addresses at run time in an interoperable way using WS-Discovery. The WS-Discovery specification outlines the message-exchange patterns (MEPs) required for performing lightweight discovery of services, both by multicast (ad hoc) and unicast (using a network resource).

Standard Endpoints

Standard endpoints are predefined endpoints that have one or more of their properties (address, binding, contract) fixed. For example, all metadata exchange endpoints specify IMetadataExchange as their contract, so there is no need for the developer to specify it; the standard MEX endpoint therefore has a fixed IMetadataExchange contract.

Workflow Services

With the introduction of a set of messaging activities, it is easier than ever to implement workflows that send and receive data. These messaging activities enable you to model complex message exchange patterns that go beyond the traditional send/receive or RPC-style method invocation.


Windows Workflow Foundation

Windows Workflow Foundation (WF) in the .NET Framework 4.0 changes several development paradigms from earlier versions. Workflows are now easier to create, execute, and maintain.

Workflow Activity Model

The activity is now the base unit for creating a workflow, instead of the SequentialWorkflowActivity or StateMachineWorkflowActivity classes. The WorkflowElement class provides the base abstraction of workflow behavior. Activity authors implement WorkflowElement objects imperatively when they need the full breadth of the runtime. The Activity class is a data-driven WorkflowElement object whose authors express new behaviors declaratively in terms of other activity objects.

Richer Composite Activity Options

The Flowchart class is a powerful new control flow activity that enables authors to construct process flows more naturally. Procedural workflows benefit from new flow-control activities that model traditional flow-control structures, such as TryCatch and Switch.

Expanded Built-in Activity Library

New features of the activity library include the following:

  • Data access activities for interacting with ODBC data sources.
  • New flow control activities such as DoWhile, ForEach, and ParallelForEach.
  • Activities for interacting with PowerShell and SharePoint.
Enhanced Persistence and Unloading

Workflow state data can be explicitly persisted by using the Persist activity. A host can persist a WorkflowInstance without unloading it. A workflow can specify no-persist zones when working with data that cannot be persisted so that persistence is postponed until the no-persist zone exits.

Improved Ability to Extend WF Designer Experience

The new WF Designer is built on Windows Presentation Foundation (WPF) and provides an easier model to use when rehosting the WF Designer outside Visual Studio. It also provides easier mechanisms for creating custom activity designers. For more information, see Extending the Workflow Designer.

* Copied

The Benefits of Using Microsoft SharePoint Server

One of the most popular software packages for businesses today is Microsoft SharePoint Server. It is popular because it helps improve business processes without requiring extensive training beforehand. SharePoint was created several years ago as a tool for building corporate portals that serve as entryways to documents such as human resources forms, but it has since expanded in functionality. SharePoint is part of the foundation of a broad rethinking of the way office workers use computers. The suite supports a large number of programs and is built to be an effective content management platform for business use. It is also designed to streamline many different types of business applications, allowing users to be more effective in their day-to-day work. Many different types of users can use Microsoft SharePoint Server effectively, and the platform allows easier control of many business processes.

One of the largest benefits of using Microsoft SharePoint Server is the ability to manage and control the content that is entering or leaving your business. Every business owner knows that one security breach or hacker can cause a great deal of damage to the company and even put the company out of commission for a period of time. All electronic content can be tracked and many business processes can be streamlined using the tools available with Microsoft SharePoint Server. The platform can also be used to access documents and workspaces that need to be shared among a number of individuals, such as databases and commonly used forms. The applications can also be used to create online content such as web pages, wikis, and blogs.

Microsoft SharePoint Server also gives users the ability to organize and edit their documents quickly and easily. Documents can be organized using any number of different criteria, and collaborative editing is also possible using the applications available with the software. Microsoft SharePoint Server also includes updated navigation features and the ability to search documents using a number of different methods. Other applications available with the program include to-do lists, contact management databases, workflow planning, discussion boards, and programmable alerts. This allows employees to manage their work processes much more effectively and increase productivity across all departments.

Microsoft SharePoint Server can be easily integrated with Microsoft Office applications and can host a number of libraries very effectively. This software certainly helps with the issue of sharing Microsoft Office documents among teams and making those documents more accessible through search. There are a number of ways that Microsoft SharePoint Server can increase productivity and streamline business processes. Many companies have found that adding this business platform to their common applications has allowed employees to manage their documents more effectively and bring projects to completion at a much quicker rate, which is essential in today’s business world. In addition, team and site managers can coordinate site content and user activity easily, and the environment is designed for easy and flexible deployment, administration, and application development.

We’re listing some common business needs that can be solved by using a SharePoint Services 3.0 site. Maybe you’ll find a few things here that you’ve been trying to find a solution for within your own organization.

I’d like to share with you the top 20 ways that SharePoint can solve your common business needs. And I want to emphasize that these 20 are only a few of the many ways to utilize the SharePoint Services. Once you start using it, you will begin to realize the many other valuable benefits that can easily be accomplished by the everyday user.

Shall we get started? Within this simple list you might find a solution to something you’ve been looking high and low for.

With SharePoint you can:

1. Store all your emails on a secure and centralized Website for easy archiving.

2. Have a centralized location for tasks and assign tasks to team members. These tasks will automatically show up in your team’s Outlook 2007 To-Do List. Those tasks will also link to your projects so you can easily find out what tasks are still open for each project.

3. Organize large events and store the related documents, assigned tasks, and generally post anything and everything related to the events. It will also integrate with Outlook for added efficiency.

4. Collaborate with team members on all documents and stay on top of who did what. Earlier versions can easily be restored in case someone has made too many mistakes. Projects can also be linked to related documents.

5. When tasks are assigned to team members, they’ll automatically be notified that they’ve just been assigned a new task. Alerts are also sent each time a task is updated.

6. Quickly manage all projects for your team or organization so there’s no need to explore buying an expensive project management solution.

7. Use the efficient check-in / check-out management feature to manage your documents.

8. Implement a help-ticket resolution system for your organization or team without breaking the bank.

9. Start a private company blog to communicate and share ideas with your team that’s viewable only by those you give access to.

10. Gain more control over your company’s documents with the content approval function.

11. Offer training materials to your teams, clients, and/or partners in a password-protected Website that can be accessed anywhere in the world.

12. Offer a secure and private place to share documents and other information with clients and/or partners.

13. Access and work with your data using your Internet-enabled mobile phone for added convenience while traveling or out of the office at client meetings.

14. Create better team communication and brainstorming sessions where everyone can participate when their schedules permit.

15. Keep project announcements for your company and team in a central location. Everyone will receive a notification via email or mobile phone automatically, anywhere in the world.

16. Take the project tasks, team discussions, company contacts, centralized calendars, team blogs, and files offline, and then sync the updated information later on.

17. Create “central” documents (and synchronize them), so all team members, clients, and/or partners are able to work on the same document and make changes. Updates are accessible with a click of a button. Everyone can then sync back to the “central” document and have all edits merged into that single document.

18. Easily add custom fields to any area and capture the information that’s most important to your company, all without the help of a web designer or IT person.

19. Pull up and update a Microsoft Access 2007 database from a local desktop and sync information to a central location that can be accessed from anywhere at any time.

20. Create a project dashboard where on one page you can view and filter on common project elements, such as: project details, project documents, project tasks, project issues, project calendar, project milestones, project lessons learned, project risks, project change orders, and more.

Source: http://www.articlesbase

LINQ to SQL – 5 Minute Overview

LINQ to SQL allows .NET developers to write “queries” in their .NET language of choice to retrieve and manipulate data from a SQL Server database. In a general sense, LINQ to SQL allows us to create SQL queries in our preferred .NET language syntax and work with a strongly typed collection of objects as a return result. We can make changes to these objects and then save those changes back to the database.
To get an idea of the syntax for LINQ to SQL, we will be using the following SQL database schema. It is a simple software registration and helpdesk database, populated with sample data, with foreign-key relationships defined where appropriate.
SQL Database Schema used for LINQ to SQL examples.
For the moment, I ask you to ignore the fact that we will be coding against a type called HookedOnLINQ; I’ll get to how that was created shortly. For now, just understand that it is an object structure that mimics this database schema.

HookedOnLINQ db = new HookedOnLINQ(
    "Data Source=(local);Initial Catalog=HookedOnLINQ");

var q = from c in db.Contact
        where c.DateOfBirth.AddYears(35) > DateTime.Now
        orderby c.DateOfBirth descending
        select c;

foreach (var c in q)
    Console.WriteLine("{0} {1} b.{2}", c.FirstName, c.LastName,
        c.DateOfBirth.ToString("dd-MMM-yyyy"));

Output:
Mack Kamph b.17-Sep-1977
Armando Valdes b.09-Dec-1973

LINQ to SQL Query Expression on a SQL Server Database – Contacts younger than 35 years of age, youngest first

The moment we entered the foreach loop, the following SQL statement was formulated by LINQ and executed on the server. That moment is important: the SQL is executed only the first time we request data; until then the query is kept in memory as an expression. This is called deferred execution.

SELECT [t0].[ContactId], [t0].[FirstName], [t0].[LastName],
       [t0].[DateOfBirth], [t0].[Phone], [t0].[Email], [t0].[State]
FROM [Contact] AS [t0]
WHERE DATEADD(YEAR, @p0, [t0].[DateOfBirth]) > @p1
ORDER BY [t0].[DateOfBirth] DESC

SQL statement generated by LINQ, returning Contacts younger than 35; the age offset and the current date were passed in as parameters.
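To make deferred execution concrete, here is a minimal sketch against the same HookedOnLINQ context (no output is shown because it depends on the database contents; the comments mark where SQL would actually fire):

```csharp
HookedOnLINQ db = new HookedOnLINQ(
    "Data Source=(local);Initial Catalog=HookedOnLINQ");

var q = from c in db.Contact                 // query built here:
        where c.DateOfBirth.AddYears(35)     // no SQL has been
              > DateTime.Now                 // sent to the server yet
        select c;

// q is still just an in-memory query expression at this point.
// Modifying the database now would change what the query returns.

foreach (var c in q)                         // first enumeration:
    Console.WriteLine(c.FirstName);          // SQL is generated and
                                             // executed here
```

If you need the results materialized immediately (for example, to iterate more than once without re-querying), calling a conversion operator such as ToList() forces execution at that point instead.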

Our C# query expression was translated into parameterized SQL, parameters were created, and the query was executed on the server. Not everyone is going to be thrilled by this epiphany. Those purists who believe that all database access should be carried out through stored procedures will be horrified. Fear not: LINQ to SQL allows these developers to continue to use stored procedures rather than generated SQL, although you then have to write the stored procedure code yourself, missing out on some of the flexibility LINQ offers. We cover this in more detail later; for now, just understand that LINQ to SQL supports stored procedures in addition to dynamically generated SQL calls in all circumstances.

If your database has foreign-key relationships defined, their hierarchy is reflected in the generated object model. You can access related records simply by navigating from a parent table to its child tables, or from a child table back to its parent. The next example demonstrates how you can navigate the foreign-key relationship chain without writing a Join statement explicitly.

HookedOnLINQ db = new HookedOnLINQ(
    "Data Source=(local);Initial Catalog=HookedOnLINQ");

var q = from o in db.Orders
        where o.Products.ProductName.StartsWith("Asset") &&
              o.PaymentApproved == true
        select new { name    = o.Contacts.FirstName + " " +
                               o.Contacts.LastName,
                     product = o.Products.ProductName,
                     version = o.Products.Version +
                               (o.Products.SubVersion * 0.1)
                   };

foreach (var x in q)
    Console.WriteLine("{0} - {1} v{2}", x.name, x.product, x.version);

Output:
Barney Gottshall - Asset Blaster v1
Barney Gottshall - Asset Blaster v1.1
Armando Valdes - Asset Blaster Pro v1
Jeffery Deane - Asset Blaster Pro v1.1
Stewart Kagel - Asset Blaster Pro v1.1
Blaine Reifsteck - Asset Blaster Pro v1.1
Ariel Hazelgrove - Asset Blaster v1.1

Accessing foreign-key relationships is simple. No join syntax necessary, you just access the sub-members directly.

This hierarchical object model works for updates as well. You can assign, add, and delete records in related tables just by manipulating objects and adding/removing objects from tables. Behind the scenes, LINQ to SQL generated the following SQL query and executed it. The results were used to populate our result collection, which is a collection of an anonymous type (a compiler-generated type that here has public properties called name, product, and version).

SELECT ([t2].[FirstName] + @p2) + [t2].[LastName] AS [value],
[t1].[ProductName], [t1].[Version] + ([t1].[SubVersion] * @p3)
AS [value2]
FROM [Orders] AS [t0], [Products] AS [t1], [Contacts] AS [t2]
WHERE ([t2].[ContactId] = [t0].[ContactId]) AND ([t1].[ProductName]
LIKE @p0) AND ([t0].[PaymentApproved] = @p1) AND
 ([t1].[Product_Id] = [t0].[ProductId])

SQL code showing how the joins to related tables through foreign-keys were added.

If your database doesn’t have foreign-key relationships defined between two tables, LINQ to SQL still allows relational access by explicitly specifying Joins in the Query Expression. The following query demonstrates how to join where a foreign-key is not defined between two loosely related tables Contacts.Phone and CallLogs.Number.

HookedOnLINQ db = new HookedOnLINQ(
    "Data Source=(local);Initial Catalog=HookedOnLINQ");

var q = from call in db.CallLogs
        join contact in db.Contacts on call.Number equals contact.Phone
        select new { contact.FirstName, contact.LastName,
                     call.When, call.Duration };

foreach (var call in q)
    Console.WriteLine("{0} - {1} {2} ({3}min)",
        call.When.ToString("ddMMM HH:mm"),
        call.FirstName.Trim(), call.LastName.Trim(), call.Duration);

If no foreign-key relationship exists, you can use the Join operator in the query expression.

To change or add a record in our database, you just make the changes to the in-memory objects and then call the SubmitChanges method (be careful: I once mistakenly called AcceptChanges, which accepts the changes and marks all records as original but doesn’t save anything to the database). LINQ to SQL keeps track of the changes and generates SQL statements to effect all of the required updates, inserts, and deletes. You can override this default behavior and specify your own implementation methods (which can call stored procedures) to use instead. LINQ to SQL wraps the database updates in a transaction, so if any part fails you have a chance to capture the error, rectify it, and then try again. You can also control how LINQ to SQL handles concurrency errors (when someone else changes data you were editing before you had a chance to save).

HookedOnLINQ db = new HookedOnLINQ(
    "Data Source=(local);Initial Catalog=HookedOnLINQ");

// Change - get an object, make the change in memory,
// then call SubmitChanges
Contacts q = (from c in db.Contacts
              where c.FirstName == "Armando" && c.LastName == "Valdes"
              select c).FirstOrDefault();

if (q != null) {
    q.Email = "";
}

try {
    db.SubmitChanges();
}
catch (OptimisticConcurrencyException e) {
    // You have your choice of RefreshMode to resolve the conflict:
    // KeepChanges, KeepCurrentValues, or OverwriteCurrentValues.
}

Update showing how to handle concurrency errors. You make changes to objects and then call SubmitChanges.

Inserting a new record is as simple as creating a new instance of the object, adding it to the appropriate collection, and then calling SubmitChanges. It is just as easy to add sub-records related through foreign keys: create the sub-object and add it to the new record before calling SubmitChanges, which saves both records and their relationship to the database.

HookedOnLINQ db = new HookedOnLINQ(
    "Data Source=(local);Initial Catalog=HookedOnLINQ");

// Adding records - (1) create a new object and sub-objects,
// (2) add it to the DataContext collection,
// (3) call SubmitChanges

// (1)
Contacts newContact    = new Contacts();
newContact.FirstName   = "Troy";
newContact.LastName    = "Magennis";
newContact.Phone       = "425 749 0494";
newContact.Email       = "";
newContact.DateOfBirth = new DateTime(1980, 08, 07);

// Create sub-record and add to this contact
Orders newOrder         = new Orders();
newOrder.Products       = (from p in db.Products
                           where p.ProductName == "Asset Blaster Pro"
                           select p).FirstOrDefault();
newOrder.DateOfPurchase = DateTime.Now;
newContact.Orders.Add(newOrder);

// (2)
db.Contacts.Add(newContact);

// (3)
db.SubmitChanges();

Inserting a new record and a related sub-record. Simply create objects and add to a collection.

On SubmitChanges, LINQ to SQL generates SQL statements in the correct order to save the new records to the database and have them correctly reference each other. In our example, LINQ to SQL needs to insert the new Contact first to obtain the primary key (which is an identity column), then use that value when writing the new Order to the database. The whole process is carried out in a transaction, so if any step fails the database is rolled back to the state it was in before SubmitChanges was called.

Start LOCAL Transaction (ReadCommitted)

INSERT INTO [Contacts](FirstName, LastName, DateOfBirth, Phone,
Email, State) VALUES (@p0, @p1, @p2, @p3, @p4, @p5)
SELECT [t0].[ContactId]
FROM [Contacts] AS [t0]
WHERE [t0].[ContactId] = (CONVERT(Int, @@IDENTITY))

INSERT INTO [Orders](ContactId, ProductId, DateOfPurchase,
PaymentApproved, Quantity, Discount, AccessCode)
VALUES (@p0, @p1, @p2, @p3, @p4, @p5, @p6)
SELECT [t0].[OrderId]
FROM [Orders] AS [t0]
WHERE [t0].[OrderId] = (CONVERT(Int, @@IDENTITY))

Commit LOCAL Transaction

SQL Executed when writing out a record and sub-record. Notice the transaction wrapping the whole process.

These records were added after the SubmitChanges method was called in the example above.

Contacts table: a new row was inserted for the contact created above (Phone: 425 749 0494).

Orders table: a new row was inserted for the related order (DateOfPurchase: 2006-11-30 18:50:24.187).

Products table: nothing was added, but a reference to the ProductId of Asset Blaster Pro was used in the Orders table record. All of this looking up of primary keys was handled automatically.

Deleting records is just as simple. You remove an object from the current in-memory collection of objects gathered from a previous query.

// Delete the record(s) we just created (do sub-items first)
db.Orders.Remove(newOrder);
db.Contacts.Remove(newContact);
db.SubmitChanges();

Example of deleting records from the database.

Until now I’ve omitted an important step. We have been writing queries against a type called HookedOnLINQ, initialized with a database connection string, and instance types Contacts, Orders, and Products. The HookedOnLINQ type inherits from the anchor of LINQ to SQL, a class called DataContext. This class manages marshalling our query expressions to SQL expressions and handles change tracking in preparation for calling SubmitChanges. In addition, we need types to represent our data tables, handling the mapping of objects and relationships to their SQL equivalents and vice versa. Although all of these classes can be created by hand, that will hardly ever (if ever) be advisable. There is built-in design-time support in Visual Studio, as well as a command-line tool, which does all of the heavy lifting in code generation on our behalf.

Our custom DataContext class must:

  • Inherit from System.Data.DLINQ.DataContext type
  • Hold and initialize collections of our instance types (Table<[type]>) and make them accessible (For example, so we can call db.Contacts from within our query expressions)

Our custom instance object classes must:

  • Be decorated with a [Table] attribute
  • Contain public fields or properties decorated with [Column] attributes
  • Define foreign-key relationships with an [Association] attribute
  • Override the default Update, Insert and Delete behavior by defining methods marked with [Update], [Insert] and [Delete] attributes
  • Define Stored Procedure, View and Function wrappers with methods marked with a [StoredProcedure], [View] or [Function] attribute.
  • Ensure that PropertyChanging and PropertyChanged events are raised whenever a value is altered.
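As a rough, hypothetical sketch of what such a hand-written instance class might look like under the pre-release attribute names this article uses (the attribute forms that eventually shipped in System.Data.Linq.Mapping differ slightly, e.g. Column(IsPrimaryKey=true)):

```csharp
// Sketch only - pre-release attribute syntax assumed, not the shipped API
[Table(Name="Contacts")]
public class Contacts
{
    [Column(Id=true)]          // primary key mapping
    public int ContactId;

    [Column]
    public string FirstName;

    [Column]
    public string LastName;

    [Column]
    public string Phone;
}
```

In practice you would let the designer or SQLMetal generate classes like this rather than writing them by hand.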

To generate the wrapper classes and the DataContext derivative that enable LINQ to SQL functionality over the tables and other database objects, here are our choices:

  1. Do it all manually by hand;
  2. Use the built-in designer for Visual Studio 2005;
  3. Use the SQLMetal command line tool;
  4. Use an XML mapping file to link database tables and columns to types and properties. This allows database and mapping changes to occur without an application recompile.

To generate the object wrapper for our sample database called HookedOnLINQ, using the command line tool, you run the SqlMetal application with the following arguments.

sqlmetal /server:(local) /database:HookedOnLINQ /code:HookedOnLINQ.cs

This creates a HookedOnLINQ.cs file that is fully functional for all the examples shown so far. I just copied it into the main project and compiled the solution.

The built-in designer allows you to create a DLINQ Object surface. From the Server Explorer window you can drag table instances onto that surface. Foreign Key relationships are automatically added to the surface if they are defined in the database, or you can manually add them from the Toolbox. When you compile, the DataContext and instance types are created for you. Here is a DLINQ Object surface representing our HookedOnLINQ schema from the database.

LINQ to SQL Designer Surface. Dragging tables from the Server Explorer creates the object model and automatically defines relationships.

The alternative method to using attributes that link the relational model to the object model is to move the mappings to an XML file. The SQLMetal command line tool will create this XML file for you, but you could also automate its generation in any way you desire. When you create your DataContext, you can pass in the mapping XML, and this will have exactly the same effect as using attributes, except it’s not hardcoded into your application when you compile.
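As a sketch using the API as it eventually shipped in .NET 3.5 (System.Data.Linq; the pre-release builds this article targets expose a similar overload, and the mapping file name here is assumed), the external mapping file can be supplied when constructing the DataContext:

```csharp
using System.Data.Linq;
using System.Data.Linq.Mapping;

class MappingDemo
{
    static void Main()
    {
        // Load the XML mapping file produced by SQLMetal
        // (file name "HookedOnLINQ.map" is an assumption)
        XmlMappingSource mapping =
            XmlMappingSource.FromUrl("HookedOnLINQ.map");

        // Same connection string as before, plus the mapping source;
        // no mapping attributes need to be compiled into the entity types
        DataContext db = new DataContext(
            "Data Source=(local);Initial Catalog=HookedOnLINQ", mapping);
    }
}
```

Because the mapping lives outside the compiled assembly, schema-to-type mapping changes can be deployed without recompiling the application, exactly as the paragraph above describes.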

Many people believe that database access should always be performed through stored procedures, both for security (permissions can be granted only for those stored procedures an application should run) and for performance (query plans are cached between calls and better optimization can be carried out). LINQ to SQL fully supports stored procedures for general calls and for the update, insert, and delete operations, and in many cases improves the developer experience by freeing you from creating input parameters by hand or writing strongly typed object collections to hold returned results. However, solely using stored procedures eliminates the benefit of writing query expressions in the developer’s native coding language. There is middle ground, though: you can use stored procedures for all insert, update, and delete operations and use query expressions for data retrieval. This secures the database against data corruption while still allowing developers to construct query expressions in VB or C#.

Calling stored procedures is extremely easy. With traditional ADO.NET, you were forced to construct parameters by hand before opening a database connection and actually calling the procedure. The code generation tools supplied as part of LINQ to SQL create wrapper functions for stored procedures, and also create strongly typed objects to hold the return values.

The following stored procedure retrieves a list of overdue payments. The number of days overdue is passed in as a parameter. The result is a row set with a number of columns, definitely not a type we have declared in C# before.

ALTER PROCEDURE [dbo].[GetOverdueAccounts]
    @daysOverdue int = 15
AS
BEGIN
    SET NOCOUNT ON;

    SELECT  o.OrderId, o.Quantity, o.DateOfPurchase, o.Discount,
            c.FirstName + ' ' + c.LastName AS CustomerName,
            c.Phone, c.Email,
            p.ProductName, p.Price,
            ((p.Price * o.Quantity) * ((100 - o.Discount) / 100)) AS Cost,
            DATEDIFF(day, o.DateOfPurchase, GETDATE()) AS OverdueDays
    FROM    Orders o,
            Contacts c,
            Products p
    WHERE   o.ContactId = c.ContactId
    AND     o.ProductId = p.Product_Id
    AND     o.PaymentApproved = 0
    AND     p.IsBeta = 0
    AND     DATEADD(day, @daysOverdue, o.DateOfPurchase) < GETDATE()
END

SQLMetal, the command-line code generation tool, has a switch that generates the wrapper and result type for stored procedures.

sqlmetal /server:(local) /database:HookedOnLINQ /sprocs

HookedOnLINQ db = new HookedOnLINQ(
    "Data Source=(local);Initial Catalog=HookedOnLINQ");

var overdue = db.GetOverdueAccounts(30);

foreach (GetOverdueAccountsResult c in overdue)
    Console.WriteLine("{0} days - {1:c}: {2}",
        c.OverdueDays, c.Cost, c.CustomerName);

Output:
215 days - $300.00: Armando Valdes
30 days - $180.00: Adam Gauwain
30 days - $247.50: Adam Gauwain

Introduction to jQuery UI


After many months of stellar work, the jQuery UI team has released version 1.5 of their flagship suite of user-interface widgets, components, and effects. This release was focused on bringing you a standardized development API across all of the components, allowing for a more seamless experience when working with the jQuery UI library.


A very exciting CSS theming application was also released with jQuery UI 1.5, called ThemeRoller. ThemeRoller is an amazing way to customize the style and colors across all of the jQuery UI components. It comes with a few preset styles, and also allows you to create your own. Once you are done, it packages your theme into a zip file that contains all of the images and CSS you need.

Brief Overview of the jQuery UI Project

The jQuery UI project was originally created to bring you a set of “official” jQuery plugins. Mature components from the plugins repository were pulled together to form the first release of jQuery UI. But since each of these plugins had its own style, having been written by different authors, the first release of the library felt a bit cumbersome when packaged together. With that in mind, the focus of UI 1.5 was on achieving a coherent, standardized API that eliminates many of the differences between the components. Along the way, many bugs and feature requests were addressed as well.

Inside Look at jQuery UI Version 1.5

Before starting, I want to make sure you know where the jQuery UI Documentation is located. You may also want to head to the download page to grab the library for yourself. Note that the development bundle is the easiest to get started with.

First, let’s start by including the necessary files for jQuery UI: the latest jQuery .js file, the complete Flora theme stylesheet, and the core UI file. Each of the components is built on top of these files. Here is how to include them:


  1. <link rel="stylesheet" href="themes/flora/flora.all.css" type="text/css" media="screen" title="Flora (Default)" />
  2. <script src="ui/ui.core.js"></script>

You may want to download these files and put them on your own server, but this is just fine for our demonstration.

At this point you may include the jquery.ui.all.js script for testing, or include each of the components individually. Here are the components that we are using for this demo:


  1. <script src="ui/ui.draggable.js"></script>
  2. <script src="ui/ui.resizable.js"></script>
  3. <script src="ui/ui.accordion.js"></script>

Activating Components

Each component has a constructor method, which is the component name. For instance, we can make a div draggable by using the draggable() method:


  1. $(document).ready(function() {
  2.   $("#dragme").draggable();
  3. });

The component defaults can be overridden by passing in options to the main function. For instance, if we want to make the div drag only horizontally, we can set the axis option to “x” with the following code:


  1. $(document).ready(function() {
  2.   $("#dragme-x").draggable({ axis: "x" });
  3. });

Likewise, the Accordion is activated the same way. Here we set a custom option to tell the accordion to slide on the mouseover event:


  1. $(document).ready(function() {
  2.   $("#accordionDemo").accordion({ event: "mouseover" });
  3. });

Some of the components, such as draggable and resizable, can even be combined:


  1. $(document).ready(function() {
  2.   $("#dragme-resize").draggable().resizable();
  3. });

This first makes the div draggable, then adds the resize handles to the div.

Now you should have what you need to start with each of the components! Head to the functional demos page to see in-depth examples of each of the components.

Looking at the Future of jQuery UI

With Paul Bakaus hired as (paid) full-time lead of jQuery UI, the project has been energized, charging forward by leaps and bounds. With an ever-growing set of UI components, jQuery UI’s future is shaping up to be one of great promise.

