Nigel Cassidy: You're up to your ears in online courses and professional training plans but what evidence is there that your learning and development is actually any good?
Hello I'm Nigel Cassidy and this is the CIPD podcast. There's a great old cartoon, you may have seen it: two Neolithic-looking types in loincloths are sweating as they try and pull a box filled with stones uphill. Another looks on, offering them an axle and two stone wheels, but they wave away the offer shouting, ‘No thanks we’re much too busy!’ So are we too busy to learn, too busy to look for evidence that our workplace learning is actually hitting the spot and therefore worth what the organisation is paying for it? Well stand by for everything about evidence based learning that you’ve been afraid to ask. We’ve some top guests waiting in the wings to help us, but first, and right alongside me for this podcast, a CIPD-er steeped in professional development, so much so that he was at the heart of developing the institute’s new L&D qualification: it’s Andy Lancaster, Head of Learning at the CIPD. Hi Andy.
Andy Lancaster: Hi Nigel great to be with you.
NC: Yeah I'm guessing if people ask you what you do at a party, assuming that we’ll ever go to a party again soon, you might not say, ‘I'm in evidence based learning and development.’ It’s a bit scary isn't it?
AL: I think it is, but being in learning and development is the role we have, and what we’re looking for is effective learning that has a real impact in organisations. So I think great learning is inherently what we’re after: evidence based practice which really drives forward impact.
NC: Yeah I mean could we just start with a definition here, I mean what is evidence based learning?
AL: I wanted to start Nigel just by telling you a story which my grandfather told me, because I think this sets the context for it really well, and it’s about onions in your socks. My grandfather sadly is no longer with us but he was a real character and he used to come out with these amazing one-liners, and on the matter of steering clear of flu and colds he once invited me to put some onions in my socks. Now this is an old folk remedy: you slice some onions and you put them in your socks underneath your feet, on the basis that when you wake up in the morning you'll be cured. Now I looked this up, according to the National Onion Association, and yes there is a National Onion Association, you can still find this remedy on the internet and it originates back to the 1500s, when it was thought that infections were spread by poisonous air. That theory has subsequently been replaced by germ theory, which shows that germs are contracted in quite different ways. So studies have now shown that there isn't a lot of evidence for putting onions in your socks, it relies on claims and anecdotes, and yet there have been no studies that actually refute that putting onions in your socks will prevent you getting colds and flu.
And in some ways for me Nigel, you ask the question what is evidence based learning all about? That picture really sums it up for me. In learning and HR practice in organisations, often what we do is based on history. It’s anecdotes which have been passed down, it’s principles that have neither been proved nor dismissed, and now as we focus on evidence based practice what we really need to do is dispel some of these old theories and anecdotes, and recognise that we can research and find principles which are based on real evidence and which will inform our impact. So it’s time to take the onions out of the socks and actually do things which have real, genuine impact.
NC: In a way you’re almost answering my next question because to me when we sort of embarked on talking about this I thought this topic felt a bit like a no-brainer, I mean surely all organisations will deliver learning and develop their people in ways that are shown or proven to work so I couldn’t understand well why wouldn’t they, but I mean it’s something to do with well we’ve always done it this way.
AL: Exactly right, and we’ve got to face some very uncomfortable truths: some of our practice is now outdated, and things which may have been worth entertaining in the past rest on assumptions that have since been challenged and shown to be wrong. And I think we also need to be fiercely anti-fad. Sometimes I think in learning there is no bandwagon which learning professionals won't leap on, and now we’ve got to stand back and recognise that it’s not about our pet approaches, it’s not about what we might like to do, it’s not about copying other people who may not have any evidence behind their practice, but really fiercely focusing on doing things which have a direct impact on effective learning, and that by nature is going to affect the organisation positively. So yeah, we’ve got much history here Nigel in learning which we’ve now got to put on one side, and we’ve really got to focus on what is genuinely evidence based practice.
NC: And thinking about that there are so many things aren’t there? I mean the availability of the trainers, how they’ve always done it, their personal judgement, preferences of individuals, budgets, I mean there are all kinds of things which must seem like reasons why you do things a particular way?
AL: There are, you know and that's why I'm valuing Laura Overton and Owen Ferguson coming in today who are people who I really esteem in this space who are passionate about great practice, you know genuine evidence based practice. So yeah there's loads of stuff out there but it’s now time for us as a profession to stand back and reflect, often we’re too busy to do this, to stand back and reflect on what really underpins great learning in organisations and now to emulate that practice rather than some of the hearsay stuff we’ve done.
NC: Well you say great learning Andy, I mean this topic has been kicking around for years why has it come to the fore now?
AL: I think, you know, two words on the lips of most senior leaders are performance and productivity, you know that's a reality. For organisations now to not only thrive but to survive we’ve got to address the issue of performance and productivity, and that's really crucial for us. So if we’re going to invest great resources in learning, both in time and often in budget, it has to have a positive outcome. So I think why it’s particularly pertinent now Nigel is because organisations are really focusing on the productivity and performance agenda, and rightly so, and therefore learning and HR professionals need to be emulating and demonstrating practice which is going to really drive performance in the organisation.
NC: And we’re going to bring in our guests in just a second, but just before we do that, I came across a few statistics looking up this topic and I realised this was a live issue when I saw them. From the CIPD: more than a quarter of people surveyed believed they didn’t have the skills to make informed evidence based decisions, and 96% identify using data and analytics as a priority area. And this was actually from the CEB, now part of Gartner: only 16% of L&D practitioners actually use data and metrics proficiently, and only 25% have the right skills. So all those figures show there's a big deficit here.
AL: There is, you know and particularly Owen and Laura have been looking at longitudinal studies on this, this has been an ongoing trend, so it is time to break this cycle and there's all sorts of reasons, maybe fear and all these kind of things but now we’ve got to look at how we can be evidence based practitioners and I'm looking forward to the conversation with Laura and Owen, who I know will bring some great insights around this.
NC: Okay well let’s bring them in right now. Firstly, delighted to welcome somebody whose whole professional career has been focused on supporting leaders and managers. He's chief product officer of Emerald Works and a familiar face from the Good Practice podcast, many people will have followed, it’s Owen Ferguson. Hello.
Owen Ferguson: Hi there Nigel. Good to be here.
NC: And the co-creator of Emerging Stronger, which is a masterclass in becoming a more effective development leader; she's the founder of learning innovation research organisation Towards Maturity, Laura Overton. Hi Laura.
Laura Overton: Hi Nigel thanks for having me here.
NC: So okay Laura we’ve just kind of had a very general introduction to this. In my mind I'm still seeing evidence as a kind of tool to evaluate, to improve or even maybe an excuse to dump some existing programmes but I think you say that evidence should be deployed at a much earlier stage, I mean sometimes to question managers or even the nature of the actual projects themselves?
LO: Yeah I guess for me evidence is all about how we use the information around us to be able to bring proof, you know if we think about evidence in a court it’s normally about proof, it’s about what do we need to kind of prove? Does this work, to Andy’s point earlier, you know, is our practice effective; is it backed up scientifically; does the data show that our practice is effective? But also we use evidence to approve things, you know when we’ve got good information in our hands then we can work with other people to actually bring them on board, when it’s not just opinion that's driving something but we’ve got real evidence and proof. So to prove, to approve, to disprove, you know I think that's also an important use of evidence, particularly when we’re so caught up in these habits that Andy’s just been describing.
And sometimes we have to challenge our own thinking as well as challenge the thinking of those that are around us. So how do we use evidence in that way? And then also evidence to improve. So the whole role of evidence in the way that we are going about our practice how do we improve it, even the way that we just look at something we’ve just delivered or just been involved with has that worked? In what ways can we continually improve it.
So it’s not necessarily use of evidence to improve the ROI of learning and development, which can actually be quite debilitating sometimes when there are so many factors that feed into it, but how do we use evidence to prove, improve, disprove and approve? There are just so many different ways in which we can bring evidence into our decision making and into our practice.
AL: I like the progression of the disprove and the approve, I think that's really nice, because sometimes we get into a negative mindset about this don't we Laura? It’s just about rubbishing everything, and professionally we see so much of that going on, but it is about approving the right things as well. So I think having a critical mind, Nigel, is not about being negatively critical, it’s a positive critical thing, and I love that idea of approving our practice as well. And also, Laura, I love the point that we’ve got to look wider on this one: often this is done far too late isn't it? And I know Owen will probably agree, we’ve been the three Musketeers around this one, this is often done too late. So I love this concept of evidence based practice throughout the whole learning process, so yeah I'm with you on that one.
NC: So Owen why are we so bad then at using evidence if, essentially in their heart, most L&D people actually know that they need more evidence why do we just not get on with it?
OF: Well the simple answer Nigel is that it’s quite hard to make that shift because things have been done in a particular way for such a long period of time, and the reason we know it’s difficult is that if you look at other professional domains, there has been quite a struggle at particular points of their evolution to make the jump from being expert opinion led, experience led, to being genuinely evidence led. So Andy mentioned the onions in the socks and the miasma theory of how disease got spread; there's a brilliant book called The Ghost Map and it’s all about Dr. John Snow and how he uncovered the actual transmission mechanism of a massive cholera outbreak in London. He managed to track it down to it being carried by water, but the way that he did that was painstaking work to collect data that he could then examine and then test against reality, and that was back in the 19th century. It wasn’t until the 1970s that a genuine evidence based approach to medicine took hold, and it took quite a few people fighting against the people who led the profession through their kind of expert demeanour in order for that change to happen. So it takes some brave people to do some painstaking work, but right at the core of it we are talking about figuring out whether the stuff that we do actually works. And there can hardly be anything more disheartening than doing stuff and not having a degree of surety around it. So if I was to boil down evidence based practice, it’s doing things in a way that you can have more confidence that your actions are going to have a positive impact, and then progressively understanding what’s worked and what hasn’t in terms of your own activity.
NC: All right well let’s begin and the beginning Laura you must have helped a number of organisations who are stuck at this point so just give us some simple ideas of the kinds of evidence you can gather and how difficult is this, I mean do you need to be a scientist, do you need to really understand data to be able to marshal and then use this information?
LO: Well that's a great question, do you need to be a scientist? Do you need to be mathematically inclined? Most learning and development professionals really struggle with that, and I know Andy you’ve asked that question many times in the room haven’t you: what’s your background? And very few of us come from that kind of arithmetic background. But I think really our best starting point as a learning professional is to be able to tap into our curiosity. If we’re genuinely curious about wanting to know more, wanting to be our best selves, wanting to be professional, then we can look around and see that data is all around us.
And I think sometimes we’re very blinkered when it comes to our attitude towards data, and we think only of what comes out of our learning management system or our management information systems. It’s very, very blinkered. But with a curious attitude we can actually see that evidence and data is all around us. There is scientific evidence, and there are people who have really boiled that down brilliantly in order to help us as mere mortals understand it. And Andy I’ll give a shout out to your book here, because when other people have looked at what the evidence says about good practice and distilled it down, that's a great place for us to start thinking and reading in that space.
But also, what evidence is around us in terms of the things that we can capture? What are the stories that people are telling? What’s going on in the podcast? You know there's a whole range of different sources of evidence, and personally I think if we can just embrace the information that is around us, that is a starting point, because it starts to break down our fear of working with evidence.
NC: I have to be honest with you I'm still struggling with this because you’re all saying use evidence but you’re still not taking me to that point where I can get my head around where you start gathering it from and how you start applying it, so Andy can you help me here?
AL: Yeah I think Laura’s touched on a really important point, so I'm just going to emphasise something she said about curiosity. We often see in child development that the ‘why’ stage is really important: it’s where you’re gathering information which actually changes your thinking and your behaviour. So asking why is really important, and I think Laura has touched a really crucial point that we must be curious.
Let’s start through the process: I think one of the reasons we often fall down on this is we don't diagnose very well what the actual learning need is, if it’s a learning need at all. So there's a practical thing on this one: often our default position is to assume there's going to be a learning outcome, and we fail to recognise that the actual performance need sits in a real ecosystem, it’s a systemic thing that's going on. So we sometimes over-simplify even what we’re trying to do here. So if you want to look for evidence I think diagnosis is really, really important. We know that if we go to the doctor’s we expect the doctor to do a good diagnosis, based on evidence, as to what outcome might be needed here.
So I think a starting place Nigel is that we’ve got to start diagnosing really well what the actual issue, the actual opportunity, is. And I think that comes right back to Laura’s point about curiosity: asking interesting and difficult and probing questions about what we’re even trying to do is the first place where we start finding evidence.
NC: Okay in that case Owen Ferguson can you give me an example perhaps from your own experience of a situation where starting to use evidence has improved outcomes, in a business, perhaps improved talent management or just something that’s important to the learning and development process?
OF: I can talk to my own personal context here. One of the things the company does is make products that aim to be used within organisations, and there are certain expected outcomes from those products being used. And so the first thing we ask ourselves is what problem are we trying to solve, or what change is required? How will we know when that has happened? And what metrics are already in place for us to measure the effectiveness of that change? And from that starting point you work your way back. And so we do lots of things like running A/B tests: rather than just design one platonic ideal of a solution, we’ll design multiple solutions, then we’ll deploy them and we’ll figure out which one is working best, based on the metrics, the performance outcomes, or the evaluation metrics that we’ve put in place.
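[The A/B comparison Owen describes can be sketched in a few lines of code. This is a minimal illustration under stated assumptions, not Emerald Works' actual tooling: it assumes the metric is a simple success rate per variant (say, learners passing an assessment after each course design, with made-up numbers) and uses a standard two-proportion z-test to judge whether the gap between the two deployed solutions is likely to be more than chance.]

```python
import math

def ab_compare(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: did variant B outperform variant A?

    Returns (lift, z): lift is the difference in observed success
    rates, z the test statistic. Roughly, |z| > 1.96 means the gap
    is unlikely to be chance at the 5% significance level.
    """
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled rate under the null hypothesis that both variants perform equally
    p = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return p_b - p_a, (p_b - p_a) / se

# Hypothetical numbers: learners passing an assessment after each course design
lift, z = ab_compare(successes_a=120, n_a=400, successes_b=150, n_b=400)
print(f"lift = {lift:.3f}, z = {z:.2f}")  # prints: lift = 0.075, z = 2.24
```

[Here z is above 1.96, so the second design's higher pass rate probably isn't luck; a smaller z would mean the honest reading is "no detectable difference yet, keep collecting data".]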
And so I guess, as Andy was mentioning, starting with the end in mind is critical in order for you to be able to put something in place that actually has the impact that you’re intending. But if someone’s looking for a cookie cutter approach that can be used in every situation, in every single organisation, that's not being evidence led, because you need to take evidence from your local context, you need to critically evaluate whatever research evidence is available for the problem that you are solving, as opposed to just adopting approaches that have been taken on by other organisations and expecting them to work within your own.
NC: Laura I know you've been talking about work you’ve done in the past in retail.
LO: Yes there was one example of using evidence in a big retail organisation we were working with, where we actually went out to the staff at all levels, from directors, through line managers, to the people on the shop floor, and we used questions to ask them to reflect on how they currently learnt what they needed to do their job. A very similar kind of research programme, I think Owen, to ones you've done with leaders as well. But just getting people to reflect.
And that surfaced all kinds of different insights into how people were doing their job and how people were learning how they were doing their job. And yet when their line managers came to the learning and development team saying, ‘I would like a course, I would like you to put everybody on a course, everyone on a programme,’ the learning and development team were able to take this learning landscape work and say, ‘Actually here is a picture of how your team say they learn best, are you sure you want me to go to number five, or should we perhaps use this as an opportunity to try something new?’
Now I'm not suggesting that just because their team said, ‘I normally learn in this way,’ that that is the reason for you to change everything. No, it goes back exactly, Owen, to what you said: you've got to use multiple sources of information and evidence in order to make the best decision about how you meet that need. But having even one source of evidence from that context allows you to challenge the current status quo and say, maybe it’s time to use a different type of learning intervention in order to drive performance, and that's when you’re able to bring in other sources of evidence to inform decisions about design.
NC: Yeah I mean it will be interesting to see what Andy says about that. It’s almost as if some leaders in organisations have pretty traditional ideas about learning and development, and they're actually the people that the L&D types have got to challenge sometimes.
AL: Yeah I think without doubt, our own research, we did a report called Professionalising Learning and Development, actually called out an uncomfortable truth: often one of the biggest blockers is the mindset of senior leaders, and that's a tough one to call out. But often what we think is effective is based on the experiences we’ve had ourselves. So if we’ve been through a particular learning experience then often we think that's the way other people should learn. So Laura’s absolutely right, I think dialogue is really important in this one, and asking questions.
So I think it’s important to think about where we find it. In terms of the data question, we’re rich in data now, in fact probably we have too much data in organisations. So I think one of the skills is understanding what data we use but particularly if you think about designing learning data lurks in customer services, it lurks in feedback and complaints, it lurks in HR systems, recruitment systems, performance systems, finance systems. So I think the issue is not that there isn't the data on which we can make these decisions, I think the challenge is finding the right data. And I think for me one of the skills we now need as learning and HR professionals is to build really good stakeholder relationships because a lot of this evidence based practice relies on other people in the learning ecosystem being part of the solution here and often we’ve been so locked into our own departments that we haven’t gone out there.
So I think Laura’s example from retail is absolutely spot on that we’ve got to get out there and make these connections. And to Owen’s point you’ve got to ask the right question, you've got to have a hypothesis, what are we trying to solve here and then we can garner all manner of great insights, both quantitative and qualitative in terms of supporting how we best design and deploy learning.
LO: I think that's such an important point because when two minds come together there's overlap as well, and there's a real excitement in the overlap. Owen, I just remember working with you on some of the data: you approached the data that I was looking at from a completely different mindset, and it was just a fantastic learning experience for me, because I was able to see new things in what I’d been looking at day in, day out; because you had a different approach, Owen, that released a new life, a new insight into it. And so I think that collaboration and connection can really help us as learning professionals. It’s not just down to ourselves, we’ve got great people around us who are equally passionate about surfacing insights and solving business problems, and we should pool resources. And I mean Owen I don't know what you think about that, but as we were talking it just reminded me of working with you: you challenged me and I really, really valued a different perspective, and it gave me more insight.
OF: Absolutely and the experience was reciprocal and I think actually that’s interrogating the questions that you're actually trying to ask and getting different perspectives, it’s all part of that evidence based approach, rather than just making assumptions or not opening yourself up to feedback or challenge. So it comes back to one of the things that we said earlier, fundamentally it comes down to curiosity and a willingness to get better.
NC: How do we upskill people? Because, as those statistics I mentioned at the beginning showed, there's a lack of ability or confidence in handling this data.
OF: I think getting more data savvy is important and there are plenty of ways of doing it. We are learning professionals, so we should be able to figure out a way of putting ourselves onto a development path for that. But let’s not forget there are plenty of data savvy people within the organisations that we work in who can help us to interrogate the data. So my advice would be, during those very early stages when you’re trying to figure out what’s the actual problem we’re trying to solve, how we would know we’d been successful, and what data we currently have, to step outside the function and start speaking to some of the other areas that might be able to provide some of that insight. You don't have to do the calculations yourself; there are people there that can help you get there.
LO: I’d like to give an example of that because I was chairing, when we were allowed to chair at live events, a CIPD event last year, and Seb Tindall was there from the Vitality Group and I know that there's a story about what Seb’s been doing on the CIPD site, but what I loved about Seb’s story is he moved his learning and development department and sat them next door to the data analytics department and so the teams were seated physically side by side and just picking up each other’s challenges and working out how to do that. And I just thought, God what a simple idea! Just get close to people, physically is even better.
NC: I can see Andy in this post-COVID era that would still be quite difficult but not impossible maybe?
AL: I think Laura makes a brilliant point there, it just triggered off a thought, which is why it’s so cool to hang around in these kinds of settings. The Medici in the Renaissance hung around in a very diverse group. You look at that, and I think it’s John Donne who says, ‘No man (or woman) is an island,’ and there's a real danger that professionally we get very isolated. So to Laura’s point, it is about positioning and interfacing with other professionals. And the Medici circle was fascinating: you've got philosophers and writers and painters and all manner of people, and together they were able to be far more creative and have a greater impact because of that diversity.
For me, Owen and Laura, I think one of the challenges we have is that our own professional development is very limited and things are moving fast. In the world of learning it’s not only data which is crucial: when you think about technology, when you think about cognitive science, there are many areas which overlap with learning in a Venn diagram, and I think our challenge but also our opportunity is to hang around other professional communities, because that's where we understand not only about data but about how practice in these other areas is moving on and changing. And those things have a direct impact on how we design. Technology, for instance: it’s no good to say, I'm not particularly interested in technology.
NC: So Owen how do we do that because of the way we’re working at the moment because clearly it is much more difficult to get that interplay isn't it?
OF: Well it is and it isn't. My experience is people are more than happy to share if you reach out to them. It’s not necessarily possible for someone in L&D to physically go and attend a marketing conference, and I would hold up marketing as a professional domain that has possibly made that transition into being much more evidence led, or is starting to make it, slightly earlier than we have perhaps. So you can't physically go and mingle with people there, but all the conferences are virtual now, and people who work in marketing functions that are trying to become much more data led are more than happy to talk about their experience if you reach out to them. People are passionate about what they do. They will talk to you.
I went through an exercise a number of years ago where I spoke to a group of digital leaders in very different spaces to the one I was working in, because I was looking at revamping how we did stuff, and all I needed to do was drop them a LinkedIn message or an email, and the vast majority, 90% plus of people, came back and said, ‘I can spare an hour to have a chat.’
NC: And Laura is there any danger that L&D people are going to be a bit side-lined in the business if they don't act a bit faster on this evidence thing because evidence based practice is just so common in every other part of business isn't it?
LO: Yeah I think there is a sense that if we don't start talking about what’s important to business, and asking smart questions that allow us to be more valuable in that business field, we will be side-lined. If we only talk about data that is relevant to how many people have engaged, how many happy sheets have been collected, or how many moments people have spent in videos, the learning-related data that our systems churn out, then, with business moving along so quickly at the moment, we’re really in danger of thinking we’re on the right bandwagon because we’re talking data and analytics, while we’re looking at it only through our narrow lens. So there's a real danger that we could be side-lined there.
But also Nigel there's a danger that we could be side-lined when we’re not able to address what Andy and Owen were saying earlier: the critical reason we are here is to get people ready for change, for performance, for productivity. And if we haven’t got evidence about the best way of doing that, about whether or not we’re on track, about what might be getting in the way in terms of systems, then we’re not going to be able to smooth that process. We’re not going to have the smart conversations that are needed in the system of the organisation and in the way that the organisation is managed. So we need that evidence to challenge ourselves, challenge our design and challenge others, in order to bring the value that is so essential from our profession.
AL: Yeah and Laura I think that's absolutely right, and I would slightly challenge the question Nigel that you've asked there, because it’s easy to spin this in the negative: you know, if we don't do this then do we become a defunct function within organisations? Do leaders say we don't need learning professionals, or we’ll outsource this? How about turning that around completely and saying, if we are driven by great evidence based practice, we are able to go to organisations and say, if you invest in this initiative then we will show a positive outcome.
And I think so often working with learning professionals we know the issue on this, so the point is not to be negative about this but to get to a positive place where we can say we are confident that the way we design learning, the methods we use, the technology we use, the approach we use, the data we use, brings such a positive outcome. So let’s not be down on ourselves here but recognise there's a massive opportunity to add value to organisations, and that's where we should be.
NC: I stand corrected, too many years as a journalist clearly. We’re almost reaching the end of our time here, I wanted to try and draw some conclusions to go round to each of you. So if we just start with Owen just some kind of top tips to finish really to set people on the right path, just gathering more evidence and using it more intelligently.
OF: I think my main tip would be to learn from how other professional domains have made or have started to make the transition.
NC: Well like your marketing thing?
OF: Like the marketing thing exactly, but actually the clearest example is medicine and you can get a delightful primer on that through Ben Goldacre’s book Bad Science, but there is a lot of stuff out there. For anyone thinking that medicine’s got a much longer track record and greater importance than learning and development, I’d highlight two things: the first is that the drive towards evidence based practice only took hold in medicine from the 1970s onwards, and the second is that our ambitions should be that high, because as we’ve just been discussing we play a critical role in enabling the organisation to enact its strategy. All we’re saying is that by taking an evidence based approach you’re providing more certainty that our activities will have a positive impact on performance.
So what can we learn from how other professional domains have done that? But start small, it doesn’t have to be on the biggest, most strategic, costly initiatives, start small and start trying out some of the techniques that other professional domains have found to be successful.
LO: I think for me I love that because starting small is about building courage, you know building your courage, building your confidence, and that is so good. And I think my tip would be to avoid the analysis paralysis. You know we can suddenly think, I've got to be more evidence informed, I need to read every study about cognitive science, I need to look at every piece of data, I need to define causation and correlation, and there is so much that comes at us with the jargon that we can be paralysed in this new world of work.
So stop analysis paralysis, and I love the work that Rob Briner has been doing with the CIPD in the past about evidence based decisions. And they say get your evidence from as many sources as you can, but just enough to help you make that decision. So you don't have to know everything about everything, but look at multiple sources of evidence to help you inform your decision. Focus on action, not just pure analysis. So that would be my tip.
NC: And Andy Lancaster, there are a lot of CIPD resources online, and I know you've been involved in quite a bit of that. Anything in particular you'd point to, to set people on the right path?
AL: I think there's one little six-step process which I find really useful, which is all around being grounded in evidence based practice, the six As: Asking, really translating a practical issue or a problem into a question, which is that whole kind of hypothesis; Acquiring, systematically searching for the evidence; Appraising, critically judging the trustworthiness and relevance of evidence; Aggregating, weighing and pulling together evidence; Applying, then incorporating that evidence into decision making; and finally Assessing, evaluating the outcome of the decision. So it’s a neat little six As, and I kind of keep that by the side of my desk as a check that we’re going to do this in a really good way. And I’d say as well as that, Nigel, just hang around with good people who are really passionate about this, which is why it’s been great to have Owen and Laura in this session. Let’s get alongside great people who are passionate about this.
NC: Well it’s been great to hang around with such good people. Thank you one and all Laura Overton, Owen Ferguson and Andy Lancaster who I think between them made a pretty overwhelming case for the value of evidence as a tool to improve learning, nurture talent and help people become fitter for the business in hand.
By the way, if you are looking for more suggestions on how to work smarter in L&D and HR in these times, don't forget to subscribe to the CIPD podcast so you don't miss an edition, anywhere you usually get your podcasts. Last time, for example, we heard about the benefits of taking a few more calculated risks with your people. We had a lot of positive reaction to that one on social media. One listener said: ‘Fantastic podcast, really good.’ And another: ‘About time HR had a shake-up.’ So do go back and see what you can catch up on from recent editions.
Until next time from me Nigel Cassidy and all of us here at the CIPD it’s goodbye.