SQLUniversity: PowerPivot, Tableau and Jedi Knights

This blog will show an overview of how I mobilised PowerPivot using Tableau. I’ve previously given this session at SQLBits, NEBytes Microsoft Technology User Group, and SQLServerDays in Belgium, but thought it would also be useful to supply the files for you. The steps are very simple, since I intended to show the end-to-end solution as a proof of concept:

  • creation of a PowerPivot which mashed up UK Census data with geographical data
  • creation of the report in Tableau
  • deployment to Tableau Public for consumption by mobile devices such as the iPad
The example was deliberately kept simple in order to prove the concept of PowerPivot being mobilised. 
The data sample involved mashing up two sources:
  • the Jedi Knight census data, which can be downloaded from here. This is a basic file, but the final PowerPivot can be downloaded from a link later in this article
  • Geonames offer an excellent free download service, which you can access here
The Jedi Knight data, along with the geographical data, were joined using the outcode of the postcode (the outcode is the first half of a UK postcode – for example, ‘AB10’ in ‘AB10 1AA’). If you need more detail on the UK postcode system, I’ve previously blogged about it here.
Essentially, a very simple RELATED formula was used to look up the latitude and longitude from the UKGeography table and bring them into the Jedi Knights data, producing the necessary columns in a simple Excel table. The formulas look like this:
=RELATED(UKGeography[Latitude])
=RELATED(UKGeography[Longitude])
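As an aside, if you’d like to sanity-check the mash-up outside of Excel, here is a minimal sketch of the same outcode join in Python with pandas. The file names and column names are assumptions for illustration only, not the actual census or Geonames downloads.

import pandas as pd

# Assumed file and column names, for illustration only.
jedi = pd.read_csv("jedi_census.csv")    # expects a 'Postcode' column
geo = pd.read_csv("uk_geography.csv")    # expects 'Outcode', 'Latitude', 'Longitude'

# The outcode is the first half of a UK postcode, e.g. 'AB10' in 'AB10 1AA'.
jedi["Outcode"] = jedi["Postcode"].str.split().str[0]

# The equivalent of the RELATED() lookups: bring Latitude and Longitude
# from the geography table onto each row of the census data.
merged = jedi.merge(geo[["Outcode", "Latitude", "Longitude"]],
                    on="Outcode", how="left")
print(merged.head())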
Once these very simple formulas were in place, it was time to load the data into Tableau.
Tableau can take both PowerPivot and Excel data – so which driver should you use? I used version 6 of Tableau. Whilst this version does see the PowerPivot correctly as an Analysis Services cube, it does not always read dates as a ‘date’ type, but instead as an attribute. There is a forum posting on the Tableau website which explains how to fix this issue: you change the date so that it appears as a measure, which means it can then be used for trends and so on.
However, I wasn’t comfortable with this solution because I like dates to be in date format. I’ve also run into this issue at a customer site, where the customer wanted to use SSAS as a source and Tableau as the presentation layer. They were data-warehouse savvy and didn’t like the ‘measures approach’ fix.
At the customer site, I got around it instead by using the Excel data source and importing all of the PowerPivot columns into an Excel 2010 sheet. Done this way, the date formats were preserved. In this example, I didn’t have a date format so it didn’t matter – but this is a useful tip for the future if you are using PowerPivot with Tableau. The final data, in an Excel PowerPivot, can be obtained in zip format here, or if you can’t access it, please email me at jenstirrup [at] jenstirrup [dot] com.
Once the data was accessible by Tableau, I used the Tableau Desktop version to load the data into Tableau’s memory. I did this so that I could eventually upload the Tableau workbook to Tableau Public. The instructions to save to Tableau Public are given here.
Once the data was in Tableau Public, I just needed to access it using the Safari browser on the iPad. In case you are interested, the demos are publicly accessible and you can see the final result by clicking on the hyperlinks below.
I hope that’s been a useful overview of PowerPivot, and the ease with which it was mobilised. This blog forms a use case of how PowerPivot might be useful, since I think that people sometimes need examples of how it can benefit them. In this case, the clear benefit of PowerPivot is that it provides an easy way of mashing up different data sources.
I look forward to your comments and thank you for sticking with me for the PowerPivot SQLUniversity discussions!

SQLUniversity: Introduction to PowerPivot and Mobile Business Intelligence

Here is the second post for SQL University, where we introduce the topic of mobilising PowerPivot data. As an introduction, I thought it might be useful to share the presentations that I did recently. Tomorrow’s blog will talk more about the details of mobilising PowerPivot data using Tableau, but I thought that these articles might provide some introductory material for people to use, and re-use, in order to expose the usefulness of PowerPivot as a data source. Here is a brief overview:

Essentially, the Ordnance Survey and Census data were mashed up and linked together in PowerPivot. The PowerPivot was then used as a source for Tableau, which was used to visualise the data. As an introduction to the process, I generated some presentations to use when talking around this solution, and I have provided them here.
Here is a very introductory presentation on PowerPivot in SharePoint. This presentation was designed to be given in around 5 minutes before being supported by a series of demos, and the intended audience was people who hadn’t seen PowerPivot before.

The following presentation came from SQLBits, where I gave a discussion on mobilising business intelligence with PowerPivot. Tableau Software was used in order to display the data from the PowerPivot, and the next blog in the series will present more technical detail on how the solution was built.

Here is the presentation that I produced for SQL Server Days in Belgium in November 2011:

I hope that you enjoy these presentations, which I’ve provided for people to use and enjoy as they wish. Tomorrow’s SQLUniversity post will provide more detail on the mobilisation of PowerPivot with Tableau.

SQLUniversity: PowerPivot and ATOM feed mashup Source to PerformancePoint

This is the first episode of SQL University’s week of PowerPivot! In this episode, we will look at how PowerPivot can be used to mash up PowerPivot and ATOM feed data, which in turn is then shaped and surfaced in PerformancePoint Services for SharePoint. All of the images are hosted on Flickr, so please feel free to click on them to see the larger versions.

Mashing together PowerPivot with an ATOM feed means that the disparate data sources appear unified, which gives business users something they can use very smoothly. The business users can also see the detailed PowerPivot in SharePoint, so they are able to go back and check the detail of the data if they require. The ATOM feed means that they can change the target data using SharePoint lists, and the change will pass through SharePoint so that it is visible in their scorecards. Ultimately, business users want accurate information as quickly as possible in order to make a decision, whether it is operational or strategic – and mashing together data sources in PowerPivot can help them to do just that!

There are a number of steps involved in this sequence:
  • Creating a SharePoint list
  • Creating a custom list in Datasheet View
  • Exporting the data to an ATOM feed
  • Connecting PerformancePoint to a PowerPivot data source
  • Creating your KPIs and scorecard with your PowerPivot source

Each of these steps will be detailed in the following sections.
Creating a SharePoint List
On the left hand side of the page, look for ‘All Site Content’.

[Image: creating a SharePoint list]

Creating a Custom List in Datasheet View

Right-click on the column called ‘Title’, and change it to something convenient.
Once the columns are added, enter information into the rows by typing directly into the cells.
Once the table is complete, we need to export the data to PowerPivot via a data feed; the steps are given next.

Exporting the Data to an ATOM Feed

If you can’t find the custom list, look for the link called ‘All Site Content’ in the browser. You should find your custom list, which will appear halfway down the page under the heading ‘Lists’.
Click on the orange button called ‘Export as Data Feed’. This can be seen in the toolbar below.
Next, you need to save the ATOM feed output to a workbook. From the box entitled ‘PowerPivot’, choose a workbook.
Click on ‘OK’ to create a dedicated workbook for the PowerPivot model that will hold the target data.

[Image: Export as Data Feed]

The ‘Table Import Wizard’ will appear. Assign a name to the PowerPivot model that is appropriate to your network.
Keep the Data Feed as-is for the moment, and click ‘Next’.
Next, the Table Import Wizard offers you the opportunity to filter the table. For this exercise, this is not necessary.  We will click ‘Finish’.
Once you have clicked ‘Finish’, you should hopefully get a ‘Success’ feedback window. Choose the ‘Close’ option.
Once you have done this, your PowerPivot model of the target data appears. Save it to your preferred SharePoint location for PowerPivot models; this is the URL to the PowerPivot Gallery on your SharePoint site.
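If you’d like to check what the data feed actually contains before (or after) importing it, a few lines of Python will read the feed directly. This is only a minimal sketch: the feed URL below is a placeholder (it is whatever URL the ‘Export as Data Feed’ button gives you for your list), and a real SharePoint site will usually also require Windows authentication, which is not shown here.

import xml.etree.ElementTree as ET
from urllib.request import urlopen

# Placeholder URL - substitute the data feed URL for your own list.
FEED_URL = "http://yourserver/sites/bi/_vti_bin/listdata.svc/Targets"

ATOM = "{http://www.w3.org/2005/Atom}"

with urlopen(FEED_URL) as response:
    tree = ET.parse(response)

# Each <entry> element in the ATOM feed is one row of the exported list.
for entry in tree.getroot().findall(ATOM + "entry"):
    title = entry.find(ATOM + "title")
    print(title.text if title is not None else "(untitled)")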
Now you have a PowerPivot model that contains your target data. Next, we want to mash it together with your actual data to give us a complete model of merged Actual and Target data in PowerPivot. This ‘mashed together’ model will give us a nice data set for visualising Actual versus Target data, and that is what we are going to do in the next step.
PerformancePoint is a very nice way of displaying Actual versus Target data that is intuitive to the user. We will now look at some of the different ways that PerformancePoint allows us to display this comparison, focusing on visualisations that help business users understand the message of the data.

To use PerformancePoint, you firstly need to open SharePoint in a browser, since it may not be obvious where PerformancePoint is hiding! If you haven’t used PerformancePoint Services before, it needs to be installed from the Business Intelligence Center in SharePoint. The install process is straightforward, but if it does not work, there may be an issue with your permissions.
In the Business Intelligence Center, click a section, such as Monitor Key Performance, Build and Share Reports, or Create Dashboards.
Additional information about each area is displayed in the center pane.
In the center pane, click the link that says ‘Start using PerformancePoint Services’.
The PerformancePoint site template opens. Next, you need to click the orange button that says ‘Start using PerformancePoint Services’. The Dashboard Designer should then install itself and run.
You do not need to install the Dashboard Designer every time. Instead, you can go to Start – All Programs – SharePoint – PerformancePoint Dashboard Designer.


Connecting PerformancePoint to a PowerPivot data source
To create a PowerPivot data source connection, follow these steps:


Click the Create tab, and then click Data Source.

[Image: the Create tab, then Data Source]


In the Category pane of the Select a Data Source Template dialog box, click Multidimensional and then click Analysis Services. Click OK.


[Image: the Analysis Services data source template for PowerPivot]

In the left navigation pane (workspace browser), type the name of your data source.


In the center pane, click the Editor tab. In the Connection Settings section, select Use the following connection.


Type the connection string to the PowerPivot data source by using the following format: 


PROVIDER=MSOLAP;DATA SOURCE=http://testsp01/sites/powerpivot_test/ABC/PowerPivot Base Data Hub/Operations PowerPivot Book.xlsx


Select the cube from the drop-down menu. The cube name for a PowerPivot model will always be Sandbox.
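If you’d like to query the Sandbox cube programmatically rather than through Dashboard Designer, a sketch along these lines can help. It assumes the third-party pyadomd package (a Python wrapper around ADOMD.NET, so Windows-only) is installed; the workbook URL is a placeholder, and the MDX simply lists the measures in the model.

from pyadomd import Pyadomd  # third-party wrapper around ADOMD.NET

# Placeholder workbook URL - point this at your own PowerPivot Gallery.
CONN_STR = ("Provider=MSOLAP;"
            "Data Source=http://yourserver/sites/bi/PowerPivot Gallery/Targets.xlsx")

# Every PowerPivot model exposes a single cube called Sandbox.
MDX = "SELECT [Measures].MEMBERS ON COLUMNS FROM [Sandbox]"

with Pyadomd(CONN_STR) as conn:
    with conn.cursor().execute(MDX) as cursor:
        print(cursor.fetchall())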


[Image: connection details for the PowerPivot source]



In the Data Source Settings section, select the method by which to authenticate to the data source.


From the Formatting Dimension list, select the dimension formatting needed for the report.


In the Cache Lifetime box, type the refresh rate (in minutes) for the cache. Data from this data source will be refreshed at this interval.

Click Test Connection to confirm that the connection is configured correctly.
Note: at the time of writing, Time Intelligence is not supported for PowerPivot data sources in SharePoint 2010. You should always test your connection, and hopefully you should get this result:


[Image: a successful PowerPivot connection]
The next step is to create a scorecard that will display the performance of the Actual data compared to the Target data. The Actual data will come from the PowerPivot Actual source, and the Target data will come from a separate table in the same PowerPivot source.


The Scorecard PowerPivot data source appears the same as an Analysis Services data source:


[Image: Scorecard – select a data source]


TechNet offer an excellent walkthrough of PerformancePoint scorecard creation here. The key difference is in selecting the data source. Note that the data source comes up as ‘Analysis Services’ rather than PowerPivot, but do not worry: remember that PowerPivot is powered by Analysis Services, so that’s fine. Here is an example:
When we set up a KPI source, the source also appears to be Analysis Services. Here is what the next step looks like:


[Image: selecting a KPI source]


Once you have created your scorecard containing your actual and target KPIs, it appears as a unified scorecard rather than coming from different mashed-up data sources.

To summarise, mashing together PowerPivot with an ATOM feed can give business users something they can use very smoothly, since the disparate data sources appear unified. They can also see the PowerPivot in SharePoint, so they are able to go back and check the detail of the data if they require, which is something they can find reassuring. The ATOM feed means that they can change the target data if they like, and it will be visible in their scorecards. Ultimately, business users want accurate information as quickly as possible in order to make a decision, whether it is operational or strategic – and mashing together data sources in PowerPivot can help them to do just that!

Photos from SQLRally Nordic in Stockholm

As you know, I recently presented at SQLRally in Sweden. I don’t normally post photographs, but I thought that some of you might like to see what it was like! This blog is just a bit of fun and I’ll wait and see what you think of my ‘off-topic’ blog!

In a previous life, I used to work in Stockholm, so returning to Sweden was very exciting for me. It was great to be back, and on the evening before the conference I walked around Stockholm with Dave Ballantyne. For SQL folks in the UK, Dave needs no introduction, but in case you’d like to see some of his presentations for SQLBits, please take a look here.

As you can see here, the conference centre at Aronsborg was waiting to receive us. It tickled me to see that we were the ‘mightiest SQL Server and Business Intelligence in Midgaard’! Here are some photos from the Conference Centre that give a flavour of our welcome:

[Photos: the SQLRally Nordic conference centre at Aronsborg]

I had work to do whilst I was in Stockholm, so I made use of the Regus office there for the day. No time for sightseeing, I’m afraid! As a coffee-drinking workaholic with a weakness for Nespresso, the occasional use of their business lounges suits me, and there is one near my home. I’ve got a TripIt membership and a 10-visit Gold pass, which means I can use their business lounges anywhere in the world. Here are some photos of their lounge:
[Photos: the Regus business lounge in Stockholm]

Sometimes people ask me how I fit it all in, and the truth is that I work everywhere I go, and squeeze something in as often as I can. Therefore, the occasional use of a business lounge, wherever I am, is very helpful in keeping things ticking over.

It means I don’t ‘smell the roses’ but I love my work so much, it is my passion! It makes me happy.

SQLPass SQLRally Nordic – a Viking treasure!

I’ve been at SQLRally Nordic this November, and I’ve had a fantastic experience. I hope that they will organise this event again! Here are some highlights:


High-quality sessions from a variety of speakers on various subjects, including a presence from the SQLCAT team: Alexei Khalyako, Mark Souza (who is also a PASS Director), Thomas Kejser and Tobias Thernström.


Business Intelligence experts such as Peter Myers, Chris Webb, Marco Russo and Alberto Ferrari all gave sessions. We also had a precon from Itzik Ben-Gan and Tim Peterson. Johan Ahlen – chief organiser – also gave a session on Business Intelligence.


It was also great to see long-serving MVPs support the event, so many thanks to Aaron Bertrand and the Quest team, supported by Kevin Kline. I’d also like to thank Johan Ahlen for his hard work in organising the event. I am sure that there were plenty of others who helped, whom I didn’t come across – but their hard work was visible because SQLRally Nordic went flawlessly. Well done everyone!


The event wasn’t all ‘work and no play’! There was a party in the evening, and if you’re interested in seeing what the Swedish IT Computing press made of the event, please do visit their site here.


We also had a WIT lunch, and it was great to have this facility where people could network and contribute. We talked on a number of subjects, and our main focus was on how we could individually mentor and support people as they progress through their IT and professional careers. It wasn’t a ‘pity party’ attitude at all! The main focus was to allow people to connect and network if they wished. I’d personally like to thank Simran Jindal and Johan Ahlen for facilitating the ‘Women in IT’ lunch.


I enjoyed giving my session on data visualisation. There was a very strong presence of business intelligence delegates, and I love meeting other Business Intelligence fanatics in order to share experiences! The Swedish delegates were very friendly, and they made the event very special with their welcome.


As a final note, the event was duly supported by a number of sponsors, and if you’d like to see the list, you can find the sponsor details here.

IT Recruitment Strategies: tailoring towards women?

Recently, I commented on my LinkedIn page that thirty plus recruitment agents had looked at my profile within a couple of days. This surprised me since it seemed to be a flurry of activity all at once, and I couldn’t understand why there was such a sudden interest in my profile. Then it became clear. A recruitment agency in the UK has started a dialogue with me about an emphasis on recruiting women into IT roles. I am purely guessing that I’ve come up on their radar for that reason. I haven’t named them, but here are my thoughts on the issue.
I disagree with the idea of tailoring IT recruitment services towards women. Recruitment should be about skills. If you read some of my previous posts, I’ve been in the situation where companies are keen to hire women into IT roles and I’ve been contacted initially on that basis. This strategy has switched me off since I don’t want to be hired over someone else simply because it might be ‘trendy’ or ‘marketable’ to have a technical woman on board. 
I don’t want someone to hire me because I’m a technical woman. I want to be taken on board for my experience, attitude and skill set and what I can contribute to their business. The ‘Women in IT’ thing doesn’t help because it sets me apart from the (usually male) people I work beside. I want to blend in, and contribute as part of a team. Either I can do the job, or I can’t. If someone else is better suited, then they should get the role. I guess that’s very simplistic but I focus on skills when I’m hiring someone, so I’m coming from that perspective.
In terms of recruitment strategies which emphasise hiring women for IT roles, the truth is that the numbers are simply not there. The proportion of technical women is small, so such a strategy is tailored towards a tiny proportion of the whole potential audience. I get concerned that it sends out a message that women in IT need special help in some way. Personally, I don’t. I will be interested to see what others think, and how the recruitment agencies shape their women-focused strategies.
For me it isn’t relevant, because these days I am more interested in hiring than in being hired! I jointly run a consultancy business with another SQL Server MVP. As a team, we bring a mix of technical and business expertise to running Copper Blue and delivering enterprise business intelligence solutions to our clients. Since I employ myself, the WIT issue is a non-issue, and I can focus on my customers and projects as part of a team whilst expanding my skills – which is where my focus should be.