
Construction companies have been warned about using guidance generated by artificial intelligence (AI) to comply with lifting and work equipment regulations.
The Lifting Operations and Lifting Equipment Regulations (LOLER) and the Provision and Use of Work Equipment Regulations (PUWER) govern safe lifting and machinery operations.
But accreditation organisation Consolidated Fork Truck Services (CFTS) warned against relying on answers about the regulations generated by AI tools such as ChatGPT, Gemini and Copilot.
Such AI-generated content often oversimplifies complex legal requirements and can fail to clarify that different types of equipment have different inspection needs, CFTS added.
A common problem with generative AI is that it can ‘hallucinate’ – produce outputs that seem plausible but are factually incorrect or misleading – because of biases, insufficient training data or misinterpretation of inputs.
CFTS is the body that administers the ‘Thorough Examination’ to certify that the mechanical parts of a lift truck are in safe working order.
CFTS’s director, Robert Fisher, said: “We understand why people turn to AI tools for quick answers, but when it comes to safety and compliance, businesses cannot afford to rely on unverified sources.
“Thorough Examinations and safety inspections must meet the legal requirements of both LOLER and PUWER, and getting this wrong puts people at risk.
“We’ve already seen cases where AI has provided misleading information about Thorough Examinations, leaving businesses with a false sense of compliance. To ensure your equipment meets legal requirements, always check official Health and Safety Executive or UK Material Handling Association guidance or, better still, speak to a CFTS-accredited examiner.”