Tag Archives: BCS

Office 365 update scheduled for October 20, 2011

I’m currently using Office 365 in a hybrid environment with federation enabled on my Office 365 E3 plan. As the technical contact for the service, I just received an email outlining the upcoming Office 365 update.

The update is scheduled to begin on October 20 and will continue through the end of November. We’re also promised a notification email 24 to 48 hours before our specific update window, which is really great.

The email also outlines the updates we’ll be receiving – simply put, this is exciting. Not only do we finally get Business Connectivity Services for integrating LOB data with SharePoint Online, but we also get Windows Live ID support for “external sharing” and official support for Windows Phone 7 Mango! External sharing, as I understand it, enables partial extranet scenarios for company project sites and similar needs.

Also, as a long-time Google Chrome user, it’s nice to get official support.

Full list from the email:

  • Business Connectivity Services (BCS) (E-plans only)
  • External Sharing: Windows LiveID support
  • Windows Phone 7 "Mango" (official support and http:// –support)
  • Recycle Bin: deleted site self-recovery
  • Browser support: Internet Explorer 9
  • Browser support: Chrome

I’ll post details on using BCS once the update has been deployed.

Thoughts on SharePoint Conference 2011 (day 1)

The first day of breakout sessions, hands-on labs and Ask The Experts of the Microsoft SharePoint Conference 2011 has just concluded here in Anaheim, California. It’s been a day of some insights into the future of SharePoint, a lot of enthusiasm about SharePoint ecosystems and partners, and a few deep dives on more advanced SharePoint topics. Here’s my recap of the things I felt were important and interesting enough to share.

Keynote: Nothing much new but it wasn’t that boring

The keynote this year was actually quite good. It was short at around 90 minutes, so there weren’t too many demos or speakers. The pacing was good, and I didn’t get the feeling of hyperventilation I had while watching the Build 2011 keynote in September. That was awful. Also, compared to the TechEd North America 2011 keynote, the speakers didn’t try to talk about everything but kept the focus on a few chosen main topics.

Jared Spataro, Director of Product Management for SharePoint, announced that the first major update to Office 365 would be released before the end of the year. I’m thinking November, since December is often too packed with other activities (think: Christmas) and wouldn’t leave us IT Pros enough time to digest all the new features and functionality. No timeframe was given for other releases, nor a feature list of what might be included in the update.

Finally: BCS for the cloud

Only one new technical feature was mentioned as part of the upcoming Office 365 update: Business Connectivity Services (BCS) from Office 365 to on-premises and Azure. It was later confirmed during Andrew Connell’s SPC410 session that developers will get real access to the BCS object model. As I understand it, this means we can and should use Visual Studio to package BCS models, rather than doing point-and-click work live in production with SharePoint Designer.

I’ve often felt that SharePoint Online should have a native server-side interface for interacting with LOB data from on-premises data sources and applications. Building client-side Silverlight applications and JavaScript-based hacks never seemed like a long-lasting solution. Thus I’ve often resorted to either replicating data to avoid the cumbersome plumbing work, or simply decided to run SharePoint locally in the same datacenter where my data is already stored.

With the addition of BCS to SharePoint Online, we’ll finally – and hopefully – get a solid approach to integrating content from WCF, SQL Server and custom .NET assemblies to and from the cloud. I’ll make sure to try out the new features when they become available. It will be interesting to see how BCS fares in terms of performance and scalability and, most importantly, how securely it works with my on-premises data.

Microsoft Certified Architect for SharePoint

It was also announced that the MCA, or Microsoft Certified Architect certification, would be more closely aligned with SharePoint. I have mixed thoughts on this: due to Finland’s smaller economy, recognition of the MCM (Microsoft Certified Master) and MCA is next to zero here. It also seems challenging to balance the costs of attending either of these premium-level certification programs while trying to justify the added value to your existing customers.

As was known before, the MCM is required for the MCA. In order to become an MCA, one would first have to pass all four SharePoint 2010 certification exams (two for IT Pros, two for developers), then attend the three-week MCM training, pass the MCM exam and finally apply and prepare for the MCA program.

The cost of attending the MCM training program is $18,500, plus a $125 application fee. I happened to notice that starting in January 2012, the MCM for SharePoint requires only one week of on-site training in Redmond, followed by 10 weeks of remote training with scheduled classes.

The MCA has a list price of $7,500, plus a $125 application fee. That’s a lot of money for something most customers aren’t yet aware of. For more details on the MCA and what it requires, see here.

SPC 2012 in Las Vegas

I don’t think anyone had even a glimmer of hope of learning anything about SharePoint vNext during SPC 2011. It came as no surprise that the next SharePoint Conference will be held in 2012, from November 12th to 16th. I reckon that is close enough for a public beta of SharePoint vNext to be available near the end of 2012. Once again, I’m simply extrapolating from the information floating around online and don’t have any hard facts to back this up.

Why SPC 11 then?

I’ve been wondering why Microsoft decided to organize the SharePoint Conference at this time of year, rather than simply waiting until 2012 to announce something new and fresh. Sometimes I can’t escape the feeling that SharePoint 2010, as we know it by now, has been around for almost two years if you include all the beta bits and SPC 09. For us IT Pros and developers that is a long time, and secretly I was hoping to hear or see something more solid on the future roadmap of Office 365, Azure or SharePoint in general. I guess the next opportunity to see new and shiny things is around November with the Office 365 service update, and then during the Visual Studio vNext launch later in 2012.

It seems Office 365 doesn’t play that big a part in the conference. The lack of deep-dive technical Office 365 sessions is also a bit of a mystery, since one would think that moving to the cloud and having everyone deploy solutions based on SharePoint Online and Azure would be a priority by now.

All in all, reflecting on day 1 of SPC 11, it’s been better than TechEd North America this year, and definitely better than TechEd Europe in late 2010. Looking at the upcoming sessions for Tuesday, Wednesday and Thursday, I’m confident I’ll leave with a lot of new ideas and things to work on.

Integrating custom data to SharePoint (Part 1)

This is part 1 of Integrating custom data to SharePoint.

One of the business requirements often presented during a medium to large SharePoint implementation project is the integration of custom Line of Business (LOB) data. There are so many options and choices one can make based on different requirements and scenarios that giving the absolutely correct answer is nearly impossible. In this post (divided into multiple parts) I’m going to outline the essential enterprise-grade approaches available for integrating custom data with SharePoint.

Since we’re talking enterprise-grade, I’m ruling out the following options – they are not really suitable for large deployments. Your mileage may vary:

  • SharePoint Designer: While creating data-driven applications with SPD is somewhat trivial, it’s not really an option when using multiple SharePoint farms and multiple layers of functionality.
  • Copy-Paste: One could argue that copying content from an internal data source to SharePoint is integration, but it really isn’t. It’s still an option when you just need to visualize static data with minimal costs.
  • Static HTML files/Server-Side Includes: A rather old-school approach to embedding data, but surprisingly often considered as an option. We’ll leave this in the 90s.

Consider your data

What kind of data needs to be integrated? Where is the master data located? Can you integrate your data source directly? How will the data be consumed? Will data manipulation be two-way? Is real-time access to the data needed?

These are just some of the questions you should ask when considering the options for integrating data to SharePoint.

Typically, data is stored in one or more of the following:

SQL Server database: A solid option. This could be a single table in a database, multiple databases or even a highly customized set of data from numerous data sources. Some or all elements of Business Intelligence might be included, such as data warehouses and analysis services cubes.

Web Service: In true SOA fashion, a consumable web service might be available. This could be a “traditional” SOAP-based web service or a Windows Communication Foundation (WCF) endpoint exposing anything from RESTful services to ATOM- and JSON-based data.

XML content: Sometimes a static XML file or feed is the simplest way to expose and consume data in SharePoint. This could also include a comma-separated values (CSV) file from Excel, or files with similar well-known structures.

SQL database (non-SQL Server): Third-party databases sitting on MySQL, Oracle or a similar RDBMS are always great candidates for controversy and hidden technical issues. It’s possible to integrate these, but pay close attention when choosing your approach, as it might yield performance and security issues. Not that these wouldn’t be a factor with the other integration methods as well.

Windows Azure/SQL Azure: I’ve yet to see many companies embracing the Azure offerings for integrating data from the cloud to on-premises SharePoint. It’s still an option that can – and probably should – be considered more often.

Office 365/SharePoint Online: Similarly to Azure, using SharePoint Online to store your data and then integrating it with your on-premises SharePoint is sometimes a business requirement.

Other ways of integrating data from LOB systems certainly exist; these are the ones I seem to encounter the most, and therefore the ones I think are relevant to include in this post.
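To make the web service scenario above a bit more concrete, here’s a minimal sketch of consuming a JSON-emitting LOB endpoint from server-side .NET code (C#, .NET 3.5 era), for example from within a web part. The service URL and the `Customer` shape are hypothetical and only illustrate the pattern:

```csharp
using System.Net;
using System.Web.Script.Serialization; // System.Web.Extensions.dll

// Hypothetical shape of a record returned by the LOB service.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class LobServiceClient
{
    public Customer[] GetCustomers(string serviceUrl)
    {
        using (var client = new WebClient())
        {
            // Download the raw JSON payload from the LOB endpoint.
            string json = client.DownloadString(serviceUrl);

            // Deserialize into strongly typed objects for the UI layer to render.
            var serializer = new JavaScriptSerializer();
            return serializer.Deserialize<Customer[]>(json);
        }
    }
}
```

In a real deployment you’d also want timeouts, caching and error handling here; a service that is slow or down shouldn’t take the page down with it.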

Outline the effort

How many days of custom development are you willing to spend on integrating data with SharePoint? Requirements from a business stakeholder might not take into consideration the infrastructural changes, support issues, monitoring needs and licensing costs a simple “integrate A with B” request might entail. And to be honest, they shouldn’t, since the request is almost always based on actual business needs and justification, not on the boundaries of SharePoint or the experience of the delivery team.

When deciding on the optimal way for integration, try to keep the following considerations in mind:

  • Is this something that needs to be done once, and never touched again? Or is this a foundation for future integrations?
  • Who will be responsible for the data? Is it someone from our technical delivery team, or someone who actually owns the original data?
  • What capabilities do we possess, as a delivery team, to provide support and troubleshooting assistance for the LOB data?

So what’s the typical implementation?

A typical request for integrating data might look like this:

A business unit has a legacy SQL Server 2000 database running on a forgotten server somewhere on the corporate network. They now need to integrate several tables and views from said database into the company intranet running on SharePoint 2010.

In addition, the business unit has decided that they own the data and want to keep updating the database content in the future. This would force us to consume data directly from their legacy SQL Server 2000, in turn causing serious security and network latency issues in production use.

The best starting point here is to investigate and document the current database functionality and the interfaces provided for applications and users. Armed with this information, one should first stabilize the situation by migrating the database to a more recent and redundant SQL Server environment. You could use an existing SQL Server cluster, or set up a dedicated SQL Server instance for the business unit to use. The IT department would probably be happy to get one more legacy application server off their network.

This – in essence – is the tactic of cleaning your own backyard first, and then doing the fancy integration work. Suddenly your integration project looks a lot more like a traditional IT project: decommissioning servers, patching legacy operating systems, finding new virtual machines and budgeting for new hardware. That’s the way it goes; try to foresee this and apply for enough budget to cover the time and effort to do it properly.

Assuming we manage to migrate the old database from SQL Server 2000 to more robust hardware running SQL Server 2008 R2, we can resume our original integration planning and implementation.

For SQL Server-based databases, the following options are available in order of preference:

  1. Use ADO.NET to consume data sources directly from your UI component (a web part or similar)
  2. Use External Content Types (by way of Business Connectivity Services) to connect with the data
  3. Create a custom WCF endpoint and consume its data directly from SharePoint (a web part is a typical approach for this)
  4. Create a scheduled task or a timer job to move data from the legacy database to a SharePoint list
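As a sketch of option 4, here’s roughly what a SharePoint 2010 timer job for moving legacy data into a list could look like. The class name, list name and field values are made up for illustration; a real job would query the legacy database (for example with ADO.NET) inside Execute and upsert the rows:

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

// Hypothetical timer job that copies rows from a legacy database
// into a SharePoint list on a schedule.
public class LegacyDataSyncJob : SPJobDefinition
{
    // Parameterless constructor is required for serialization.
    public LegacyDataSyncJob() : base() { }

    public LegacyDataSyncJob(string jobName, SPWebApplication webApp)
        : base(jobName, webApp, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        SPWebApplication webApp = (SPWebApplication)this.Parent;
        using (SPSite site = new SPSite(webApp.Sites[0].ID))
        using (SPWeb web = site.RootWeb)
        {
            SPList list = web.Lists["Legacy Customers"];
            // Placeholder: in a real job, fetch rows from the legacy
            // database here and write them into the list.
            SPListItem item = list.Items.Add();
            item["Title"] = "Synchronized at " + DateTime.Now.ToString("u");
            item.Update();
        }
    }
}
```

The job is then registered against a web application with a schedule (daily, hourly or whatever the data freshness requirement dictates). The obvious trade-off of this approach is that the list is always a copy, never the live data.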

One might argue that ECTs should be the primary means of integration with SharePoint. While this might be true for certain scenarios, it often adds to the overall complexity and might simply be overkill. It depends on whether you need two-way data access, and whether you plan to use the integrated data with metadata or with search. There are also performance and scalability boundaries to consider.

Using ADO.NET is fairly simple and gives you full freedom to integrate the data however it best fits your needs. If the underlying schema changes, it’s easy to create abstraction layers for data handling or to modify the logic in the UI component.
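To illustrate such an abstraction layer, a minimal ADO.NET data-access sketch might look like the following. The connection string, table and column names are placeholders, not from an actual project:

```csharp
using System.Data;
using System.Data.SqlClient;

// Hypothetical data-access class that a web part would call,
// keeping all SQL details out of the UI component.
public class CustomerRepository
{
    private readonly string _connectionString;

    public CustomerRepository(string connectionString)
    {
        _connectionString = connectionString;
    }

    public DataTable GetCustomers()
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "SELECT CustomerId, Name, City FROM dbo.Customers", connection))
        using (var adapter = new SqlDataAdapter(command))
        {
            // SqlDataAdapter opens and closes the connection for us.
            var table = new DataTable();
            adapter.Fill(table);
            return table;
        }
    }
}
```

A web part could then bind the returned DataTable to a grid control, and a schema change in the legacy database would only touch this one class instead of every UI component consuming the data.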

More to follow in part 2 – stay tuned.