Use the approach outlined above to explore opportunities for applying AI across the employee lifecycle and to prioritise actions, starting with the recruitment process.
Attraction
Each element of attraction could be enhanced with generative AI, including:
- Finding routes to candidates, such as identifying the most effective job boards for specific roles, or the events and approaches that can broaden attraction.
- Where social media is used to promote the employer brand, generative AI can help shape a content strategy and draft individual posts.
- Transforming detailed role profiles into candidate-centric job adverts, improving inclusive language, brand consistency and clarity.
Selection
Generative AI tools can help candidates prepare far more thoroughly for a job search – researching companies, building presentations and drafting answers to likely questions.
This means that established selection practices may need re-examining and updating, including:
- Consider the questions you ask, and how you score them. There may be a need to weight specificity in answers over generalisations: AI can help people answer thematic questions, but is less useful for questions about specific personal experiences.
- Consider the use of psychometrics. These provide an additional lens, giving an opportunity to assess candidates’ attitudes and mindset, and there are no right or wrong answers.
- Make the ability to use AI a selection criterion – this is effectively a demonstration of curiosity and initiative. To test for this, ask candidates to prepare a case study, and specifically ask them to use AI to help them. This gives the opportunity to ask questions about the subject, but also about how the candidate used AI, where they added to it, and how they applied judgement.
AI tools can provide a useful service in sifting and ranking candidates based on their written applications. However, there are some material risks that you should consider:
- If you are using an open-source tool, how do you protect candidate data?
- How do you minimise the risk of discrimination (eg by making sure you are clear on the selection criteria being applied, and how the AI has been trained)?
- How are you tracking and correcting any potential discrimination from the AI tools?
These risks are mitigated by HR professionals investing the time to properly set up AI tools – being clear on criteria, providing quality context, testing with sample data and refining. Treat AI outputs as draft recommendations – always bring human expertise and judgement to bear in making decisions. AI use prepared in this way can guard against human bias, and maintain consistency in assessing applications (AI does not get bored or tired). It is critical that the professionals own the outcomes, and actively test to mitigate bias.
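To make the ‘clear criteria, quality context’ preparation more concrete, here is a minimal sketch in Python of one way a sifting prompt could be structured. The criteria, weights and the ask_model function are illustrative assumptions rather than a recommended tool, and the output is a draft for a human reviewer to check and own.

```python
# Illustrative sketch only: build a sifting prompt around explicit, weighted criteria
# so the AI's draft scoring can be checked against the role requirements.
# `ask_model` is a placeholder for whichever approved, data-compliant AI service is used.

CRITERIA = {
    "Relevant experience in a similar role": 3,
    "Evidence of stakeholder management": 2,
    "Clear, specific examples rather than generalisations": 2,
}

def build_sift_prompt(role_profile: str, application_text: str) -> str:
    criteria_lines = "\n".join(
        f"- {name} (weight {weight})" for name, weight in CRITERIA.items()
    )
    return (
        "You are supporting an HR professional to sift applications.\n"
        f"Role profile:\n{role_profile}\n\n"
        f"Score the application against these criteria only:\n{criteria_lines}\n\n"
        "For each criterion give a score from 0-5 with a short quote as evidence.\n"
        "Flag anything you could not assess. Do not infer protected characteristics.\n\n"
        f"Application:\n{application_text}"
    )

def draft_sift(role_profile: str, application_text: str, ask_model) -> str:
    # The result is a draft recommendation; a human reviewer owns the decision.
    return ask_model(build_sift_prompt(role_profile, application_text))
```

Because the criteria and weights are written down and visible, the AI’s draft scores can be challenged, tested with sample applications and monitored for bias over time.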
Induction
Induction processes can be difficult to shape and maintain. AI can support with planning induction programmes, scheduling stakeholder meetings, sharing relevant information, and phasing training and reading for new starters.
AI-powered search engines and chatbots could change this process, creating a much more interactive way of getting to know the company, or scouring organisational resources to accelerate learning.
Onboarding
Generative AI tools could be used to create more personalised onboarding plans based on the role, the new hire’s experience, and interview outputs.
Proprietary AI tools such as Microsoft Copilot can be used to provide 24/7 support to new hires to answer common questions (removing reliance on the old HR FAQ documents). They could help shape personalised learning plans for the new hire based on assessment of interview notes, experience, and the nature of the role. They can provide support for hiring managers in clarifying important tasks, and significantly reduce the time spent searching HR materials or speaking to HR teams.
Compliance training
Compliance learning can be difficult to design and maintain as organisational policy or external regulation develops. For policy owners, AI tools could accelerate this activity by reviewing current copy and identifying changes in legislation.
AI can test to ensure that learning modules prioritise the key learning points for users, and can be used to write tests that reflect the most critical learning points.
Existing approaches risk becoming less effective as employees click through videos and slides, then use AI or simply search engines to answer the tests.
There is an opportunity to rethink the approach to compliance learning overall. For instance, AI tools could be used to assess open-text questions, replacing the usual multiple-choice tests. This brings a greater degree of rigour to assessment and encourages more self-driven learning.
A further step would be to shift from the current learning modules to employees sourcing information for themselves, using AI tools. This could develop into coaching-style conversations with chatbots to explore the area of learning.
Organisational policies
It is possible to use an AI tool to draft a policy on a specific subject, incorporating previous policy and updated expectations, and applying a house style. This can accelerate the development of policies and create consistency of style across policies. However, it remains important for the professionals to review and own the content.
Where internal tools are available, an AI-powered chatbot can point employees to relevant sections of policy as and when required. An alternative approach is to use AI to produce user guides or process walkthroughs. This is particularly helpful where policies reflect complex legislation. In areas of high risk or ambiguity, it would be advisable to have employees check with the HR team.
Reward
Reward practice can benefit from AI tools, but it is an area where data security must be considered. It is worth teams exploring how data can be secured (eg by ensuring employees cannot be identified) or identifying tools that can be used in-house to manage company data.
One simple step to manage security is to work with extracts of data in which identifiers such as names and employment numbers are replaced with unique, anonymised identifiers.
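As a minimal sketch of that step, the Python example below pseudonymises a CSV extract before it goes anywhere near an AI tool. The column names (“name”, “employee_number”) are assumptions for illustration, and the lookup file mapping pseudonyms back to people should stay inside the organisation.

```python
import csv
import uuid

# Illustrative sketch: pseudonymise a reward data extract before any AI analysis.
# Column names ("name", "employee_number") are assumptions for this example.

def pseudonymise_extract(in_path: str, out_path: str, map_path: str) -> None:
    id_map = {}  # employee_number -> pseudonym; keep this mapping file internal only
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        fields = [f for f in reader.fieldnames if f not in ("name", "employee_number")]
        writer = csv.DictWriter(dst, fieldnames=["person_id"] + fields)
        writer.writeheader()
        for row in reader:
            key = row["employee_number"]
            person_id = id_map.setdefault(key, uuid.uuid4().hex[:12])
            writer.writerow({"person_id": person_id, **{f: row[f] for f in fields}})
    with open(map_path, "w", newline="") as mp:
        csv.writer(mp).writerows([("employee_number", "person_id"), *id_map.items()])
```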
With such measures in place, there are opportunities to use AI for activities such as:
- Analysing data including performance metrics, productivity, skills and activity, to provide more thorough benchmarking for reward decisions.
- Analysing internal and external reward information to ensure fairness and alignment. This has the potential to identify discrepancies and trends over time, which in turn can help address pay inequity (see the sketch after this list).
- There is potential for employees to use generative AI to personalise their use of rewards and benefits, for example ‘discussing’ their personal position and the outline of company benefits with AI. This could be done with personal tools, but could become more targeted if enterprise tools were adopted.
- Reward teams can potentially identify how benefit packages are used and the relative value of different elements, and produce targeted communications to help colleagues make informed choices.
- There’s also the potential to use AI to improve the speed and quality of pay benchmarking submissions.
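As a minimal sketch of the fairness analysis mentioned above, the pandas example below compares median pay across groups within each grade on a pseudonymised extract. The column names (“grade”, “group”, “salary”) and the file name are assumptions, and any gap it surfaces is a prompt for investigation by the reward team, not a conclusion.

```python
import pandas as pd

# Illustrative sketch: flag pay discrepancies within each grade on a pseudonymised extract.
# Column names ("grade", "group", "salary") are assumptions for this example.

def pay_gaps_by_grade(extract: pd.DataFrame) -> pd.DataFrame:
    medians = extract.groupby(["grade", "group"])["salary"].median().unstack("group")
    gaps = medians.assign(
        gap_pct=(medians.max(axis=1) - medians.min(axis=1)) / medians.min(axis=1) * 100
    )
    return gaps.sort_values("gap_pct", ascending=False)

# Example use (hypothetical file name):
# extract = pd.read_csv("reward_extract_pseudonymised.csv")
# print(pay_gaps_by_grade(extract).head())
```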
Ultimately, there may be enterprise-level, data-compliant AI tools that are better suited to this area, but personal tools can also be useful.
Wellbeing
As organisations extend their wellbeing offerings, generative AI can help people navigate them. For example, AI can help wellbeing professionals to access information, accelerate learning and build strategy.
For employees, AI tools could help navigate resources to find what is most suitable. There is also great potential for coaching – employees and managers can use generative AI to help frame conversations, gather information, identify good practice and get to more personalised solutions faster.
Learning and development
AI can accelerate the design and development of learning programmes, creating outlines, detailed content, exercises and notes for facilitators. As company behaviours or priorities change, content can be updated quickly, helping developers ensure programmes remain current.
There’s an opportunity to encourage learners to use AI as part of their pre-reading, or as follow-up exercises. This gives the opportunity to build confidence in AI use as well as the subject of the course.
Finally, it is common for organisations to have catalogues of learning resources. AI could be used to:
- Help employees find and summarise content from the library (less time searching, more time learning)
- Remove the need for some of these catalogues altogether. AI tools can be used to find openly available material, summarise it and give the core information. There is a risk regarding the accuracy and variety of source content, and qualifying parameters should be established. Shaping guidance on using AI for learning prompts good questions about how to connect it to role needs, and about what the important roles for development teams will be.
Management
One way AI can support managers is to help them identify and manage the demands on their time. As generative AI is embedded into software packages (eg Microsoft Copilot), this may be more about using enterprise-level rather than personal AI to schedule 1:1s and team meetings, capture action points and so on.
Another use is to help managers understand what policies apply in specific situations, and how to apply them. AI tools can help managers boil policies down into a short list of tasks they need to complete, and give them guidance on how to approach conversations with colleagues. This significantly simplifies HR advice, allowing HR professionals to provide more in-depth support for complex situations.
More importantly, management skills change with AI. As employees use AI more, managers need quality coaching skills and the ability to assess and understand how AI has been used. As team members shape pieces of work, managers will need to ask questions such as:
- “How did you use AI to help with this piece of work?”
- “How did you validate the inputs?”
- “What other sources did you use?”
- “How did you bring organisational context to your work?”
- “Where did you add your judgement?”
- “How did you ensure data security?”
There is also a wider consideration. In recent years demands on managers have increased, as issues like worker wellbeing, inclusive culture and managing hybrid teams or flexible working all compete for attention. AI adds another layer, as managers now also need to coach others on its use. HR teams should consider managers’ capabilities and the impact on their capacity. This may require some role and/or organisational design changes.
Performance management
AI tools can support each step of the performance management journey, reducing the administrative burden to create more focus on quality management.
Tools can help craft objectives, drawing on historic objectives, organisational priorities or strategies, existing workload and external data. Cascading objectives can similarly be accelerated. All this activity can increase organisational alignment and improve collective performance.
Development plans can be created in a similar way – individuals can receive guidance from AI tools on how to identify development priorities and how to work towards them, along with specific suggestions. Increasing the quality of development plans and the speed at which they are generated leaves more time for thought.
Enterprise-level AI tools magnify the potential to assess what people have done. AI and workplace analytics embedded into applications like Microsoft Viva can identify where people have spent their time, who they have spent it with, what they have prioritised and what they have neglected. This allows for much more evidence-based performance discussions.
A key element of performance management is ongoing feedback. Many people will spend hours crafting feedback to ensure their meaning is clear, or to avoid offence. AI can help in crafting and editing messages and adopting the right tone of voice.
Career progression
AI tools can help employees identify their own career paths and opportunities based on their experience and preferences. If an employee has a known goal (eg chief people officer), information on their starting point (eg reward analyst job description and their CV), and potential steps to get there (eg competency frameworks, HR team structure, CIPD Profession Map), then AI can trawl all these sources to recommend a route.
Frequently all these information points will exist within organisations, but bringing them together manually would be time-consuming.
Talent assessment and profiles
Talent processes can be sensitive, and there are risks relating to bias in the dataset due to historic data (eg senior leaders are potentially more likely to be male, white, straight, university educated).
Nevertheless, AI can be used to help collate, analyse and plan talent outcomes. It can be used to generate standard talent profiles based on data such as CVs and LinkedIn profiles and use that data to identify cross-functional succession candidates, suitable people for key projects and longlists for internal vacancies.
As already mentioned, the requirement for human judgement and validation is critical here to avoid bias in the system.
Guidance
AI tools can make providing this guidance significantly simpler. Applying a generative AI tool to internal policies would allow managers and employees to ask for information, for guidance on how to handle a situation, for the right forms to complete, to check deadlines and so on. It can shift HR to a true self-service model, based on the questions that managers and employees really do ask, rather than HR teams having to develop sets of FAQs. This is likely to become significantly easier and more tailored as enterprise-level tools incorporating generative AI become more widely available.
This then lifts HR teams into a more value-adding role, with more focus on advice, quality assurance and intervention in complex situations.