Amazon is pausing police use of its facial recognition technology for a year, calling on politicians to use this period to introduce stronger regulations.
It follows IBM's chief executive telling US Congress that his company would stop selling its facial recognition software and would oppose uses of the technology for mass surveillance and racial profiling.
Announcing the pause, Amazon said it was "implementing a one-year moratorium" on police use of the technology, although this would not apply to organisations investigating human trafficking.
It did not reference the technology's documented difficulties in recognising people with darker skin tones, and women in particular, which research warned about in 2018.
Nor did the company's statement make reference to the ongoing Black Lives Matter protests around the world.
"We've advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge," the company stated.
"We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested."
Despite the moves from both Amazon and IBM, neither company's technology is believed to be particularly popular with law enforcement organisations.
The police department in Orlando, Florida, dropped its trial of Amazon's Rekognition software last year amid difficulties getting the tech to work and criticisms that it was adding to racial biases in policing.
Neither Amazon nor IBM was immediately able to confirm to Sky News how many police users of their technologies would be affected by their decisions.
NEC Corporation's NeoFace suite of real-time facial recognition and mugshot-matching technology is the most commonly used software in the US and UK.
An independent report into the Metropolitan Police's use of NeoFace - commissioned by the Met itself and revealed by Sky News - found the system was inaccurate in 81% of cases.
There are currently no specific laws which regulate the use of the technology in Britain, unlike laws designed to protect the public's privacy when it comes to DNA and fingerprint biometrics.
Civil society groups, academics and successive independent biometrics commissioners have warned that the legal regime surrounding the use of the technology is lacking, and that the technology itself could turn the UK into a "surveillance state".
Parliamentary committees have called for the use of the technology to be paused until it can be put on a statutory footing, but both the Met and the UK's data watchdog, the Information Commissioner's Office, have said that using the technology is lawful.