Latest post on my Handy IoT Toolkit is released!

I’ve started to update my IoT Toolkit blog post series.

You can get the latest post from here, which gives you some ideas on communication for your virtual blended team, and some pointers towards handy Visio stencils that might help. You can also navigate my other IoT posts by going to the IoT Toolkit menu that I’m trying to keep updated.

Jen’s PASS Diary: once more unto the breach

“Never allow carping critics to deter you from success. Instead, silence them with it.”

― Christian Baloga

“Success is not final, failure is not fatal: it is the courage to continue that counts.”

– Winston Churchill

Well, what have I been up to, since my last post? Note: I don’t represent PASS here. This is a brain dump of my own thoughts.

Briefly, I’ve suffered a series of malicious problems over the Internet. Like many other tech women, I just put up with it. I’ve reported it to the police and I’ve been talking to them over the past ten days or so, and the matter is now proceeding through that process. Since it’s now a Police matter, I won’t go into details. I will simply say that I’ve changed most of my contact details now and I’ll set up a dedicated blog email address so that people can still contact me. Details to follow.

I’ve been distracted, to say the least. It’s been suggested to me that I should step down from various community activities, including the MVP Program. However, I am not going to do that. I think it sends out the wrong signal, which is that these people/person (I still don’t know who it is) will win in the end. So, whilst I’m clinging on with my fingernails, I am still clinging on and I am not going away any time soon.

I’ve served on the Board of PASS for the last two years, and I’ve been an MVP for five years now in my own right. I’m one of only four female MVPs in Europe, although we are all Data Platform now. I’ve written my own book all by myself – not just a chapter or so! And I run my own one woman band small business.

I’m happy just to be part of the crowd for as long as it lasts. It won’t last forever and one day, it will end. I’ve met some great people along the way, learned languages in foreign countries, lost friends who weren’t friends, travelled to more places than I ever dreamed of seeing and I’ve achieved more than a wee lassie from Kilmarnock could ever have hoped for. These problems have left me down but not out. Life has thrown so much at me already, and there is truth in the saying that what doesn’t kill you makes you stronger.

Regarding PASS, I’m not going to let down the 1.2k plus people who voted for me, and I won both elections outright – twice, once in 2013 and once in 2015. That was a huge mark of faith in me and it’s a lot to live up to, and I don’t want to let people down. I know full well there are areas I could do better and I am trying. I am also very aware that there are people who don’t like me or what I’m doing. But that isn’t everyone, and there are still people standing behind me and beside me who believe in the community things I am trying to do. Distractions like this do not help but, despite them, I have still managed to keep my phone commitments to PASS this week leading up to Christmas, plus keep on top of things on the BA side. I’m excited to be part of the team and I feel I’m working with people who are friends as well as team colleagues, and I don’t want to lose that.

I will continue to send all problems on to the right police process. I have the “courage to continue” and I want to thank everyone who has been in touch over the past week or so, with their support. I’ve commented previously about the kindness of strangers and how nice people can be sometimes, even if they don’t really know me at all. Also, thanks to the kind hearts from SQLFamily – too many to mention! – but I think I’ve been in touch to thank everyone personally by now. Usually, I’d name people who had helped me, but I’ve turned a bit paranoid about mentioning names in case they get some problems, too. A rather bad halo effect :(

Hopefully the next post will be more cheerful!

Jen

x


Learning pathway for SQL Server 2016 and R Part 2: Divided by a Common Language

http://whatculture.com/film/the-office-uk-vs-the-office-us.php

Britain has “really everything in common with America nowadays, except, of course, language,” said Oscar Wilde in The Canterville Ghost (1887), whilst George Bernard Shaw is quoted as saying that “The United States and Great Britain are two countries separated by a common language.”

There are similarities and differences between SQL and R, which might be confusing. However, I think it can be illuminating to understand these similarities and differences, since they tell you something about each language. I got this idea from one of the attendees at PASS Summit 2015, and my kudos and thanks go to her. I’m sorry I didn’t get her name, but if you see this, you will know who you are, so please feel free to leave a comment so that I can give you a proper shout out.

If you are looking for an intro to R from the Excel perspective, see this brilliant blog here. Here’s a list to get us started. If you can think of any more, please give me a shout and I will update it. It’s just an overview, and it’s to help the novice get started on a path of self-guided research into both of these fascinating topics.

R | SQL / BI background
A factor has special properties; it can represent a categorical variable, which is used in linear regression, ANOVA and so on. It can also be used for grouping. A dimension is a way of describing categorical variables. We see this in the Microsoft Business Intelligence stack.
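To make the factor/dimension parallel concrete, here is a minimal R sketch (the variable name is invented for illustration):

```r
# A categorical variable stored as a factor
region <- factor(c("North", "South", "North", "East"))
levels(region)   # the distinct categories, sorted: "East" "North" "South"
table(region)    # counts per category, much like GROUP BY in SQL
```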
In R, dim means that we can give a chunk of data dimensions or, in other words, give it a size. You could use dim to turn a vector into a matrix, for example. Following Kimball methodology, we tend to prefix tables as dim if they are dimension tables. Here, we mean ‘dimensions’ in the Kimball sense, where a ‘dimension’ is a way of describing data. If you take a report title, such as Sales by geography, then ‘geography’ would be your dimension.
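For instance, a quick sketch of dim reshaping a plain vector into a matrix:

```r
x <- 1:6           # a plain vector
dim(x) <- c(2, 3)  # give it dimensions: x is now a 2x3 matrix
x[2, 3]            # row 2, column 3; R fills column-wise, so this is 6
```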
R memory management can be confusing. Read Matthew Keller’s excellent post here. If you use R to look at large data sets, you’ll need to know:
– how much memory an object is taking;
– 32-bit R vs 64-bit R;
– packages designed to store objects on disk, not RAM;
– gc() for memory garbage collection; and
– how to reduce memory fragmentation.
SQL Server 2016 CTP3 brings native in-database support for the open source R language. You can call R and RevoScaleR functions and scripts directly from within a SQL query. This circumvents the R memory issue, because SQL Server introduces multi-threaded and multi-core in-database computations.
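On the R side, a small sketch of checking how much memory an object takes, and then reclaiming it:

```r
x <- runif(1e6)        # a vector of one million doubles
print(object.size(x))  # roughly 8 MB, since each double takes 8 bytes
rm(x)                  # drop the object from the workspace
invisible(gc())        # ask R to run garbage collection and free the memory
```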
A data frame is a way of storing data in tables: a tightly coupled collection of variables arranged in rows and columns. It is a fundamental data structure in R. In SSRS, we would call this a data set. In T-SQL, it’s just a table. The data is formatted into rows and columns, with mixed data types.
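A minimal data frame sketch; the column names are made up for illustration:

```r
df <- data.frame(id    = 1:3,
                 name  = c("Anna", "Ben", "Cara"),
                 score = c(90.5, 72.0, 88.1))
str(df)              # mixed column types in one table, as in T-SQL
df[df$score > 80, ]  # row filtering, roughly SELECT * FROM df WHERE score > 80
```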
All columns in a matrix must have the same mode (numeric, character, and so on) and the same length. A matrix in SSRS is a way of displaying, grouping and summarizing data. It acts like a pivot table in Excel.
<dataframe>$<columnname> is one way you can reference a specific column of a data frame. <tablename>.<columnname> is how we do it in SQL, or you could just call the column name on its own.
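A quick sketch of the $ accessor (the names are illustrative):

```r
df <- data.frame(city = c("Glasgow", "Leeds"), sales = c(100, 250))
df$sales        # the R analogue of SELECT sales FROM df
mean(df$sales)  # aggregates work on the extracted column: 175
```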
To print something, type the variable name at the command prompt. Note that you can only print items one at a time, so use cat to combine multiple items into one print-out. Alternatively, use the print function. One magic feature of R is that it knows how to format any R value for printing, e.g.

print(matrix(c(1,2,3,5),2,2))

PRINT returns a user-defined message to the client. See the BOL entry here. https://msdn.microsoft.com/en-us/library/ms176047.aspx

CONCAT returns a string that is the result of concatenating two or more string values. https://msdn.microsoft.com/en-GB/library/hh231515.aspx
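To illustrate the R side of this row, a minimal print-versus-cat sketch:

```r
x <- c(3.14, 2.72)
print(x)                    # prints a single R value, nicely formatted
cat("constants:", x, "\n")  # cat glues several items into one line of output
```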

Variables allow you to store data temporarily during the execution of code. If you define one at the command prompt, the variable is contained in your workspace. It is held in memory, but it can be saved to disk. In R, variables are dynamically typed, so you can chop and change the type as you see fit. In T-SQL, variables are declared in the body of a batch or procedure with the DECLARE statement and are assigned values by using either a SET or SELECT statement. They are not dynamically typed, unlike in R. For an in-depth look at variables, see Itzik Ben-Gan’s article here.
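A sketch of R’s dynamic typing, in contrast to DECLARE in T-SQL:

```r
x <- 42       # x starts life as a number
class(x)      # "numeric"
x <- "hello"  # the same name can be reused for a character value
class(x)      # "character"; no DECLARE and no fixed type, unlike T-SQL
```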
ls allows you to list the variables and functions in your workspace. You can use ls.str to list out some additional information about each variable. SQL Server has tables, not arrays; it works differently, and you can find a great explanation over at Erland Sommarskog’s blog. For SQL Server 2016-specific information, please visit the Microsoft site.
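For example (variable names invented for the sketch):

```r
a <- 1:10
b <- "some text"
ls()      # names of the variables and functions in the workspace
ls.str()  # the same names, each with a short summary of its contents
```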
A vector is a key data structure in R, with tons of flexibility and extras. Vectors can’t have a mix of data types, and they are created using the c(…) operator. If you pass a vector of vectors, R flattens them into a single vector. Batch-mode execution is sometimes known as vector-based or vectorized execution. It is a query processing method in which queries process multiple rows together. A popular item in SQL Server 2016 is columnstore indexes, which use batch-mode execution. To dig into more detail, I’d recommend Niko Neugebauer’s excellent blog series here, or the Microsoft summary.
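A short sketch of vector creation, flattening and coercion with c():

```r
v <- c(1, 2, 3)
w <- c(v, c(4, 5))  # nesting c() calls flattens to one vector of length 5
length(w)           # 5
c(1, "two")         # mixed types are coerced; both elements become character
```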

There will be plenty of other examples, but I hope that helps for now.

Learning pathway for SQL Server 2016 and R Part 1: Installation and configuration

Jargogled is an archaic word for getting confused or mixed up. I’m aware that there are lots of SQL Server folks out there, who are desperate to learn R, but might be jaRgogled by R. Now that R is in SQL Server, it seems like the perfect opportunity to start a new blog series to help people to splice the two great technologies together. So here you are!

First up, what do you need to know about SQL Server installation with R? The installation sequence is well documented here. However, if you want to make sure that the R piece is installed, then you will need to make sure that you do one thing: tick the Advanced Analytics Extension box.

SQL Server 2016 R Feature Selection

You need to select ‘Advanced Analytics Extensions’, which you will find under ‘Instance Features’. Once you’ve done that, you are good to proceed with the rest of your installation.

Once SQL Server is installed, let’s get some data into a SQL Server database. Firstly, you’ll need to create a test database, if you don’t have one already. You can find some information on database creation in SQL Server 2016 over at this Microsoft blog. You can import some data very quickly and there are different ways of importing data. If you need more information on this, please read this Microsoft blog.

If you fancy taking some sample data, try out the UCI Machine Learning data repository. You can download some data from there, following the instructions on that site, and then pop it into SQL Server.

If you have Office x64 installed on your machine, you might run into an issue:

The ‘Microsoft.ACE.OLEDB.15.0’ provider is not registered on the local machine

I ran into this issue when I tried to import some data into SQL Server using the quick and dirty ‘import data’ menu item in SSMS. After some fishing around, I got rid of it by doing the following:

There are other ways of importing data, of course, but I wanted to play with R and SQL Server, and not spend a whole chunk of time importing data.

In our next tutorial, we will look at some of the vocabulary for R and SQL Server which can look confusing for people from both disciplines. Once you learn the terminology, then you’ll see that you already know a lot of the concepts in R from your SQL Server and Business Intelligence expertise. That expertise will help you to springboard to R expertise, which is great for your career.

PASS Summit Notes for my AzureML, R and Power BI Presentation

I’m going to have fun with my AzureML session today at PASS Summit! More will follow on this post later; I am racing off to the keynote so I don’t have long :)

I heard some folks weren’t sure whether to attend my session or Chris Webb’s session. I’m honestly flattered but I’m not in the same league as Chris! I’ve posted my notes here so that folks can go off and attend Chris’ session, if they are stuck between the two.

Here is the order of things:

  • Slide Deck
  • How do you choose a machine learning algorithm?
  • How do you carry out an AzureML project?
  • AzureML Experiment 
  • R Code

So, the slide deck is here:

  • AzureML Experiment 

You can see this experiment in the AzureML Gallery. You may have to sign up for a Windows Live account to get a free AzureML studio account, and I recommend that you do.

  • How do you choose a machine learning algorithm?

Kudos to Microsoft – this is their cheatsheet and I recommend that you look at the original page.

Here is some more information on the topic from Microsoft, and I recommend that you follow it.

How do you carry out an AzureML project?

Try the CRISP-DM Framework for a start

See the Modelling Agency for the original source. https://the-modeling-agency.com/crisp-dm.pdf

“CRISP-DM Process Diagram” by Kenneth Jensen (own work). Licensed under CC BY-SA 3.0 via Commons.

R Code

Here’s some sample R code. I know it is simple, and there are better ways of doing this. However, remember that this is for instructional purposes in front of roughly 500 people, so I want to be sure everyone has a grounding before we talk about more complicated things.

You may have to install the libraries first, if you haven’t done so.

# Load the libraries (run install.packages() for any that are missing)
library(data.table)
library(ggplot2)
library(xtable)
library(rpart)

# adult.data is assumed to be loaded already, e.g. read in from the UCI adult data set
summary(adult.data)
class(adult.data)

# Let's rename the columns
names(adult.data) <- c("age", "workclass", "fnlwgt", "education", "education.num",
                       "marital.status", "occupation", "relationship", "race", "sex",
                       "capital.gain", "capital.loss", "hours.per.week", "country",
                       "earning_level")

# Let’s see if the columns renamed well
# What is the maximum age of the adult?
# How much data is missing?
summary(adult.data)

# How many rows do we have?
# 32561 rows, 15 columns
dim(adult.data)

# There are lots of different ways to deal with missing data
# That would be a session in itself!
# For demo purposes, we are simply going to replace question marks, and remove rows which have anything missing.

adult.data$workclass <- as.factor(gsub("[?]", NA, adult.data$workclass))
adult.data$education <- as.factor(gsub("[?]", NA, adult.data$education))
adult.data$marital.status <- as.factor(gsub("[?]", NA, adult.data$marital.status))
adult.data$occupation <- as.factor(gsub("[?]", NA, adult.data$occupation))
adult.data$relationship <- as.factor(gsub("[?]", NA, adult.data$relationship))
adult.data$race <- as.factor(gsub("[?]", NA, adult.data$race))
adult.data$sex <- as.factor(gsub("[?]", NA, adult.data$sex))
adult.data$country <- as.factor(gsub("[?]", NA, adult.data$country))

is.na(adult.data) = adult.data==’?’
is.na(adult.data) = adult.data==’ ?’
adult.tidydata = na.omit(adult.data)

# Let’s check out our new data set, called adult.tidydata
summary(adult.tidydata)

# How many rows do we have now?
# Fewer than before, since rows with missing values were removed
dim(adult.tidydata)

# Let’s visualise the data
boxplot(adult.tidydata$education.num ~ adult.tidydata$earning_level, outline = FALSE,
        xlab = "Income Level", ylab = "Education Level", main = "Income Vs Education")

prop.table(table(adult.tidydata$earning_level, adult.tidydata$occupation), 2)

# Loop over all but the last two columns; note 1:(n - 2), not 1:n - 2
for (i in 1:(ncol(adult.tidydata) - 2)) {
  if (is.factor(adult.tidydata[, i])) {
    pl <- ggplot(adult.tidydata,
                 aes_string(colnames(adult.tidydata)[i], fill = "earning_level")) +
      geom_bar(position = "dodge") +
      theme(axis.text.x = element_text(angle = 75))
    print(pl)
  }
}

evalq({
  plot <- ggplot(data = adult.tidydata, aes(x = hours.per.week, y = education.num,
                                            colour = hours.per.week))
  plot <- plot + geom_point(alpha = 1/10)
  plot <- plot + ggtitle("Hours per Week vs Level of Education")
  plot <- plot + stat_smooth(method = "lm", se = FALSE, colour = "red", size = 1)
  plot <- plot + xlab("Hours per Week worked") + ylab("Level of Education")
  plot <- plot + theme(legend.position = "none")
  plot
})

That’s all for now! More later.

Jen xx

Jen’s Diary: Why are PASS doing Business Analytics at all?

As always, I don’t speak for PASS. This is a braindump from the heart. I realise that we haven’t communicated about BA as much as some members might like. It’s a hard balance – I don’t want to spam people, but I don’t want to communicate too little, either. If you want to sign up for PASS BA news, here’s the link. So I have to apologise here, and hold my hands up for that one. I’ll endeavour to ensure we have a better BA communications plan in place, and I’m meeting the team on Friday to discuss how we can make that happen.

In the meantime, I’d like to blog about BA today. How did we get here, and where are we going? Why are PASS interested in Business Analytics at all? To answer this question, let’s look at the history of Business Intelligence, what Business Analytics means, and how PASS can be part of the story. Let’s start with the history lesson. What are the stages of Business Intelligence?

First generation Business Intelligence – this was the world of corporate Business Intelligence. You’ll know this by the phrase ‘the single source of truth’. This was a very technical discipline, focused on the data warehouse. It was dominated by Kimball methodology or Inmon methodology, dependent on the business requirement. However, the business got lost in all this somewhere, and reverted to the default position of working with Excel exports, and subverting the IT departments by storing data in email. Microsoft did – and still do – cater for the first generation of business intelligence. It has diversified into new cloud products, of course, but SQL Server still rocks. You’ll have seen that Gartner identified SQL Server as the number one RDBMS for 2015. Kudos to the team! For an overview, the Computer Weekly article is interesting.

Second generation Business Intelligence – the industry pivoted to bring the Business back into Business Intelligence. You’ll know this by the phrase ‘self-service business intelligence’. Here, the business user was serviced with clean data sources that they could mash and merge together, and they were empowered to connect to these sources. In the Microsoft sphere, this involved a proliferation of tabular models, PowerPivot as well as continued use of analysis services multidimensional models. As before, Excel remained the default position for working with data. PASS Summit 2015 has a lot of content in both of these areas.

So far, so good. PASS serves a community need by offering high quality, community education on all of these technologies. Sorted, right?

Wrong. The world of data keeps moving. Let’s look at the projected growth of Big Data by Forbes.

Well, the world of business intelligence isn’t over yet; we now have business analytics on the horizon, and the world of data is changing fast. We need to keep up! But what do we do with all this data? This is the realm of Business Analytics. Why is it different from BI? The value of business analytics lies in its ability to deliver better outcomes. It’s a different perspective. Note that in our first-generation and second-generation BI times, technology was at the forefront of the discussion. In business analytics, we talk about organizational change, enabled by technology. In this sphere, we have to quantify and communicate value as the outcome, not the technology as a means to get there. So what comes next?

Third generation Business Intelligence – self-service analytics. Data visualisation software was at the forefront of second generation Business Intelligence, and it took priority. Here, the position is that businesses will understand that they need data visualisation technologies as well as analytical tools, to use the data for different purposes.

How is Business Analytics an extension of Business Intelligence? Let’s look at some basic business questions, and see how they fall as BI or BA. Images belong to Gartner so all kudos and copyright to the team over there.

What happened?

If the promise of business intelligence is to be believed, then we have our clean data sources, and we can describe the current state of the business. Gartner call this descriptive analytics, and it answers the question: What happened? This level is our bread-and-butter business intelligence, with an emphasis on the time frame until this current point in time.

Why did it happen?

We can also understand, to a degree, why we are where we are. This is called diagnostic analytics, and it can help pinpoint issues in the organisation. Business Intelligence is a great domain for understanding the organisation up to this point in time. However, it’s a rear-view impression of the data. What happens next? Now, we start to get into the remit of Business Analytics:

What will happen?

Businesses want to know what will happen next. Gartner call this predictive analytics, and it comes into play when we look for predictive patterns in the data. Once we understand what will happen next, what is the next question?

How can we make this happen?

This is the power of prescriptive analytics; it tells us what we should do, and it is the holy grail of analytics. It uses business intelligence data in order to understand the right path to take, and it builds on the other types of analytics.

Business Intelligence and Business Analytics are a continuum. Analytics is focused more on a forward motion of the data, and a focus on value. People talk about ROI, TCO, making good business decisions based on strong data. First generation and second generation are not going away. A cursory look around a lot of organisations will tell you that. The Third Generation, however, is where organisations start to struggle a bit. PASS can help folks navigate their way towards this new generation of data in the 21st century.

How do we measure value? It is not just about storing the data, protecting it and securing it. These DBA functions are extremely valuable and the business would not function without them – full stop.  So how do we take this data and use it as a way of moving the organisation? We can work with the existing data to improve it; understand and produce the right measures of return, profiling, or other benefits such as team work. Further, analytics is multi-disciplinary. It straddles the organisation, and it has side effects that you can’t see, immediately. This is ‘long term vision’ not ‘operational, reactive, here-and-now’. Analytics can effect change within the organisation, as the process of doing analytics itself means that the organization solves a business problem, which it then seeks to re-apply across different silos within the organization.

SQL Server, on the other hand, is a technology. It is an on-premise relational database technology, which is aimed at a very specific task. This is a different, technologically based perspective. The perspectives in data are changing, as this Gartner illustration taken from here shows:

Why do we need a separate event? We need to meet different people’s attitudes towards data. DBAs have a great attitude; protect, cherish, secure data. BAs also have a great attitude: use, mix, apply learnings from data. You could see BA as a ‘special interest group’ which offers people a different choice. There may not be enough of this material for them at PASS Summit, so they get their own event. If someone wants to go ahead and have a PASS SQLSaturday event which is ‘special interest’ and focuses solely on, say, performance or disaster recovery, for example, then I don’t personally have a problem with that.  I’d let them rock on with it. It might bring in new members, and it offers a more niche offering to people who may or may not attend PASS because they don’t feel that there’s enough specialised, in depth, hard-core down-to-the-metal disaster recovery material in there for them. Business Analytics is the same, by analogy. Hundreds and hundreds of people attended my 3 hour session on R last year; so there is an interest. I see the BA event as a ‘little sister’ to the PASS ‘big brother’ – related, but not quite the same.

Why Analytics in particular? It’s about PASS growth. Growth can be painful, and you take a risk. However, I want to be sure that PASS is still growing to meet the future needs of the members, as well as attracting new members to the fold. The footfall we see at PASS BA, plus our industry-recognised expert speakers, tells us that we are growing in the right direction. Take our keynote speaker, Jer Thorp: he has done work with NASA and the MoMA in New York, he was data artist in residence at the New York Times, and he has now set up The Office for Creative Research and is an adjunct professor at ITP. Last year, we had Mico Yuk, who is author of Dataviz for Dummies, as well as heading up her own consultancy team over at BI Brainz. They are industry experts in their own right, and I’m delighted to add them as part of our growing PASS family who love data.

The PASS BA event also addresses the issue of new and emerging data leaders. How do you help drive your organisation towards becoming a data-oriented organisation? This means that you talk a new language: new criteria for measuring value, working out return on investment, cross-department communication, and communication of ideas and conclusions to people throughout the organisation, even to C-level executives. PASS BA is also looking at the career trajectories of these people as well as DBA-oriented folks, and PASS BA is out there putting the ‘Professional’ aspect into the event. We have a separate track, Communicate and Lead, which is all about data leadership and professional development. A whole track – the little sister is smartly bringing the Professional back, folks, and it’s part of our hallmark.

PASS is part of this story of data in the 21st Century. The ‘little sister’ still adds value to the bigger PASS membership, and is an area of growth for the family of PASS.

Any questions, I’m at jen.stirrup@sqlpass.org or please do come to the Board Q&A and ask questions there. If you can’t make it, tweet me at jenstirrup and I’ll see if I can catch them during the Q&A.

Jen’s Diary: Top 5 free online tools for organising yourself at PASS Summit, Live! 360 or any other conference!

As always, I don’t speak officially for PASS. This is a personal braindump.

I’m presenting at PASS Summit and Live! 360 in Orlando, so self-organising is a hot topic for me!

For diary purposes, I have been doing lots of work on PASS BAC with the team, as you can imagine. I’d like to thank the PASS team here: Vicki, Angie, Anika, Teresa, Georgia and Judy to name a few. If you want ‘behind the scenes’ tweets, please follow Anika on Twitter to know more about the running of PASS. It’s no mean feat – 6 thousand or so SQL Server fans in one place! – and the team keep everyone happy.

As you know, the PASS BAC first wave of speakers has gone out. If you have any questions, please fire them at me: jen.stirrup@sqlpass.org. More on this later! I’m humbled by the amount of industry expertise that we have in our community, and everyone who submitted is simply amazing. Thank you to everyone for their faith and belief in what we are trying to achieve, and thank you to the people who have had faith in us and have bought their tickets so far without even seeing the full agenda! PASS BAC is going to be a blast again, and I hope you’ll join us. You can register here.

Ok, now onto the clickbait that you really wanted to see. Here are some handy tools which I’m using to organize my time at PASS Summit.

Evernote – I prefer it to OneNote because, when I search my browser for items, it also brings up my Evernote notes about the same topic. Neat, huh? I use it for taking notes and I ‘snip’ everything. I use the local version which syncs to my online version, and I can read the offline local version whilst I’m on the plane. How good is that? Think of it as an offline Google or Bing repository to help you to actually read the things you marked as ‘to read later’.

Sunrise Calendar – I have tried using online tools for YEARS, but this is the only one that works for me. It takes my nine calendars (yes, I am that busy!) and synchronises them in one place. I can see things in different timezones (I work BST/GMT, and then onto PST). I cannot do without this calendar now. Go and take a look, and you’ll find yourself organized in no time. You’ll need this for preventing yourself from becoming quadruple booked, as I do. These guys deserve a freakin’ award. Seriously. It is owned by Microsoft but why the hell they don’t advertise this, I have no idea. It’s so simple and it does what I need it to do. Guys – deep thanks from me.

Watch this video on Productivity and take the bits from it that are useful for you. This 20-minute video has helped me so much, and it’s left its mark on me. I hope it will help you to see how you can be more productive. It has helped me, and in some ways helped to heal me of things that were hard to let go. It’s been a hard lesson and I am humble enough to admit I’m still learning it. Good luck with it. All I do is carry around a little SQLBits notebook that I’ve had, and I brain-dump into it by writing everything down. It works. Trust me. How is this free? Well, make sure you visit the sponsor gallery at PASS Summit, and see if you can score a little notebook from one of the sponsors. It may help you more than you think!

TripIt – I am travelling across timezones and I need everything in one place. I also want to see when my friends come in. TripIt is soooo good at telling me about delays before the airlines do that I pay for TripIt Pro. You can find TripIt here.

PackPoint – saves me from forgetting things. This is for Android, but I’m sure you can find something else for your favourite mobile OS. Basically, you tell PackPoint where you are going, and for how long. If you are a TripIt customer (see above), it syncs everything for you, so you can pick your trip. If you are one of those people who forgets business cards and little things like that, give it a try. You can find your packing list online as well as on your phone, and tick things off as you go.

I hope to see you there!

Love,

Jen x