Apple has suspended the practice of having human contractors listen to Siri recordings after concerns were raised about user privacy.
A report by The Guardian last week revealed the company used the programme to listen to a small sample of recordings from the voice assistant in order to grade them.
The report said contractors had regularly heard parts of private conversations as part of the work.
Apple has confirmed it will review the scheme and will not restart it until the review is complete.
The tech giant also said it would allow users to opt out of any grading programmes.
“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally,” Apple said in a statement.
“Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
The grading scheme had been used to analyse the quality of Siri interactions.
The virtual assistant can be activated on Apple devices – including the iPhone and Apple Watch – by saying the wake phrase “Hey Siri”.
However, the assistant can be accidentally prompted when it mistakenly thinks it hears the wake phrase.
Siri can also be triggered accidentally when an Apple Watch detects it has been raised and then hears speech, even if the user did not intend to address the assistant.
This is not the first time a tech giant has faced criticism over quality-control schemes.
In recent months, Amazon and Google have confirmed they also use small samples of user recordings from their own voice assistants to train and develop language recognition software.
Both firms offer virtual assistants in smart speakers and some smartphones, and both confirmed they use human reviewers to analyse a small proportion of recordings from users.