In recent news, the BBC covered an interesting piece on the role artificial intelligence (AI) plays in the workplace. While we are now familiar with how AI is helping transition the average HRBP's role from transactional to strategic and is surfacing employee-sentiment insights for leaders to act on, AI engagement tools like Amber are also actively positioned as a digital friend at the workplace.
Our experience with the global HR teams we work with confirms that most employees trust a neutral platform that is clearly positioned as the CXO's digital assistant. In fact, BBC's story highlights how the leading retail chain Lenskart has been using Amber in its employee engagement strategy. Suruchi Maitra, Chief HR Officer at Lenskart, sums up the problem statement most HR teams are trying to solve: "The biggest issue in any human relationship is if people have an opportunity to speak and be heard."
Beyond Amber, several HR tech tools and recommendation engines in the market are now being deployed, or considered, as a necessary means to accurately measure the sentiment of an evolving and diverse workforce. At one point, however, BBC's narrative takes a different direction, with Rebecca Herrold, CEO at The Privacy Professor, noting that the information shared by employees is "highly personal" and is accessible by many stakeholders in the company. This, according to Herrold, "opens up a very large privacy Pandora's box."
And we completely acknowledge that working with employee data while ensuring privacy is a key concern, one that our customers across the world have also raised before deploying Amber for their workforce.
2 ways we solve this critical privacy concern
#1 Confidentiality through Amber's Positioning and Trust Statement
The HR team spends weeks developing a communication plan to inform employees about the addition of a digital colleague to the workforce, brought in specifically to record their feedback for HR and business leaders to act and course-correct on. The CEO or CHRO also periodically emails employees with detailed context on why a bot like Amber can be considered a neutral platform to open up to, helping identify and resolve pain points that affect organization culture.
Before employees start their conversation with Amber, they are informed of who can view it. This is called the Trust Statement, which Amber shares at the beginning of her chat. inFeedo, as a third-party organization, ensures that these conversations are encrypted and that unauthorized access is not possible. And in terms of participation, organizations report response rates as high as 92% from their employees, with several anecdotes on how Amber's persona is almost human.
#2 Absolute Discretion through Anonymous Bat
After an employee completes their chat with Amber, they are offered the option to chat with Anonymous Bat. This is an additional chat module that acts as an ombudsperson, allowing employees to raise issues while maintaining complete anonymity. Only the primary administrator of the tool (typically the CHRO or CEO) can see these messages, and even then without the sender's name.
Here the employee can share anything they feel cannot be shared openly without the assurance of complete anonymity. These could be topics that might harm their career, the business, or the work culture. For example, our customers report cases of racism, sexual harassment, and bullying being raised with Anonymous Bat that employees could not have shared openly and directly with Amber or the leadership team.
Anonymous Bat also comes with a Trust Statement that clearly informs employees of who might have access to the chat, and how.
In both scenarios, the employee is at all times aware of how their interaction will be recorded and used in the organization's context. More importantly, they understand that confidentiality and anonymity are two separate concepts, and how the feedback they share will be accessed and acted on.
As human beings interact more with technology to gather and share data, consent becomes critical. And we feel this is where BBC's narrative drifted from its own story. This is precisely why HR spends so much effort, time, and resources educating the workforce and spreading awareness when tools like these are introduced at the workplace. More importantly, at no point is the employee compelled to share their opinions; participation is not, and can never be made, mandatory in any context. The purpose of tools like Amber is to encourage open, honest discussions; to exchange feedback and drive positive change in workforce culture; and to make it easier for HR to understand exact sentiment, given that the average HRBP-to-employee ratio is 1:300 across industries.
Being part of the SaaS industry, we are always aware, or made aware, of the different privacy acts, laws, and mandates that govern a solution such as ours. We hope we have set the story straight and added the missing piece to BBC's narrative. Watch their take on AI in workspaces: