MSPs have warned there is no justification for Police Scotland to use live facial recognition, citing privacy and human rights concerns.
A report said the software would be a "radical departure" from the current practice of policing by consent.
Police Scotland said it hoped to use the software by 2026, but later put plans on hold.
The technology can scan crowds of people and cross-reference faces with police databases.
The report from the justice sub-committee on policing was published as part of its inquiry into the use of live facial recognition.
It highlighted that the technology was "known to discriminate against females and those from black, Asian and ethnic minority communities".
The report added: "The use of live facial recognition technology would be a radical departure from Police Scotland's fundamental principle of policing by consent."
The committee also said Police Scotland would need to ensure that any technology in use was "provided for in legislation and meets human rights and data protection requirements".
How does the software work?
New computer systems are able to watch thousands of people at an incredibly fast pace, with the most powerful able to operate at distances of over a mile.
They can do all of this in real-time, meaning everyone who passes by the camera can be scanned against a watchlist of suspects.
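In simplified terms, the matching step described above can be sketched in a few lines of code. This is an illustrative Python sketch only, not Police Scotland's or any vendor's actual system; the function names, the use of cosine similarity, and the threshold value are all assumptions:

```python
import math


def cosine_similarity(a, b):
    # Compare two face "embeddings" (feature vectors produced by a
    # recognition model) on a scale from -1 to 1.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def match_against_watchlist(face_embedding, watchlist, threshold=0.6):
    # watchlist maps suspect IDs to stored embeddings. Each passing face
    # is compared against every entry; the best score above the threshold
    # counts as a match. Set the threshold too low and the system produces
    # the "false positives" the report warns about.
    best_id, best_score = None, -1.0
    for suspect_id, stored in watchlist.items():
        score = cosine_similarity(face_embedding, stored)
        if score > best_score:
            best_id, best_score = suspect_id, score
    if best_score >= threshold:
        return best_id, best_score
    return None, best_score
```

Real deployments use high-dimensional embeddings from trained neural networks rather than the toy vectors here, but the core trade-off is the same: the similarity threshold directly controls the balance between missed suspects and false alarms.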
Last month the Metropolitan Police announced it would roll out the technology for the first time on London's streets.
The cameras will be in use for five to six hours at a time alongside bespoke lists of suspects wanted for serious and violent crimes.
An independent review of its use has shown that most matches produced by the cameras are false alarms.
'No fit state'
Police Scotland initially planned to roll out the software - as detailed in Policing 2026, a 10-year strategy published by the force.
The force has since pledged to put the plan on hold and committed to taking part in a wider debate about the implications of the software.
Convener John Finnie said the sub-committee was "reassured" the force has no plans to introduce live facial recognition technology at present.
He said the technology "throws up far too many 'false positives'" and "contains inherent biases" that are known to be discriminatory.
He added: "It is clear that this technology is in no fit state to be rolled out or indeed to assist the police with their work.
"Our inquiry has also shone light on other issues with facial recognition technology that we now want the Scottish Police Authority (SPA) and the Scottish government to consider.
"Not least amongst these are the legal challenges against similar technologies in England and Wales, and the apparent lack of law explicitly governing its use in Scotland - by any organisation.
"So whether this technology is being used by private companies, public authorities or the police, the Scottish government needs to ensure there is a clear legal framework to protect the public and police alike from operating in a facial recognition Wild West."
Police Scotland said it was not "using, trialling, or testing" live facial recognition technology.
"We are keeping a watching brief on the trialling of the technology in England and Wales," Assistant Chief Constable Duncan Sloan said.
"Prior to any such technology being implemented we would carry out a robust programme of public consultation and engagement around the use of this technology, its legitimacy, viability and value for money," he added.
"This would include taking advice and guidance on ethical, human rights and civil liberties considerations. In my view, the use of such technology would not be widespread but would be used in an intelligence-led, targeted way."