First published on Friday, September 5, 2025
Last updated on Friday, September 5, 2025
Welcome to HR Heartbeat, where we take a look at the week’s most pressing HR and employment law stories. With over a decade of experience working within the HR and employment law industry, I give my opinion on current trends impacting your business, as well as my own personal commentary on all things HR and legal.
Misusing data can land you in legal hot water
What happens when sensitive employee or client information isn’t properly protected?
A recent case from Ontario shows us just how costly and damaging data breaches can be.
It’s common to imagine online leaks or hackers worming their way into digital files when we think of data breaches, but it’s often way simpler than that. Most of the time, it comes down to misusing private information, as a recent investigation by the Information and Privacy Commissioner found.
In this instance, a paediatrician took hospital records of newborn boys without consent and contacted their parents to promote circumcision at their clinic. This data breach was revealed when one new mom reported a suspicious call just one day after giving birth.
The paediatrician was personally fined $5,000, and their clinic was also slapped with a $7,500 fine and ordered to destroy all improperly obtained records.
This serves as a reminder to employers that privacy breaches don't just damage trust; they also carry real financial and reputational risks.
So, whether you run a hospital, a small business or a large corporation, it's important to properly train staff on data privacy, have robust documentation on confidentiality and enforce it. Because protecting sensitive information is your legal duty.
One interview mistake can cost you thousands
Many employers are aware they have a legal duty to accommodate employees with disabilities, but if you think that duty only kicks in once someone’s hired, think again.
A recent B.C. Human Rights Tribunal ruling makes it crystal clear: your obligations apply from your very first interaction with a job candidate.
A deaf job applicant arrived at an interview with a sign language interpreter, and instead of exploring how accommodation could work, the employer dismissed them outright, insisting it would hinder their communication with clients.
The tribunal found this was discrimination under the Human Rights Code and ordered $15,000 in damages for injury to dignity.
Here are a few things employers can take away from this case:
Recruitment counts: Human rights protections don't start on the first shift; they begin at the interview stage.
Don’t assume, ask: Assuming applicants’ abilities without considering accommodation can be a fast track to liability.
Proactive inclusion matters: Add accommodation notices to job postings, offer accessible interview formats, and train hiring managers to prevent bias.
The message is simple: give all candidates a fair chance from the get-go, regardless of perceived abilities. And if you're unsure how to navigate the recruitment process, turning to expert guidance can make all the difference.
AI at Work: The future or a lawsuit waiting to happen?
AI might feel like the future of work, but for employers, it could create major legal headaches when used improperly.
A recent report from Infosys found that 95% of executives have had AI-related issues, with major culprits being privacy violations, systemic failures, and reputational damage.
Almost half said using AI led to legal trouble, with fines and settlements to match.
My advice to employers adopting AI is to tread carefully. Using AI to optimize processes can be a game-changer, but using it to track keystrokes or monitor chat messages risks crossing the line into intrusion, pushing good employees out the door.
There's also the problem of "hallucinations." AI tools that spit out inaccurate results aren't just embarrassing; they're dangerous. One system spread false workplace gossip, and another misreported details of a discrimination case. Both could have landed companies in court.
The bottom line is humans must stay in the loop. Make sure you’re auditing AI tools before rolling them out, checking for accuracy, bias, and compliance with privacy laws. You’ll also need to update your contracts and workplace policies to reflect AI use.
Finally, any AI-generated outputs must be reviewed by people, because when AI goes wrong, it's not the algorithm that gets sued; it's you.
And that’s a wrap from me. Tune in next time for my take on the latest headlines and employment law stories, helping keep your business ahead!