A Minneapolis council member, citing concerns about privacy and civil liberties, plans to push for tight regulations on any future use of facial recognition technology by the city's police.

Department officials insist they are years away from adopting a technology that's being used more often in criminal investigations around the country, but Council Member Steve Fletcher said in this case it's better to be a year too soon than a year too late.

"Sometimes we want to get ahead of things before we're pushing back against an implementation that people might be mad about," Fletcher said. "We have the opportunity right now to get ahead of it in a thoughtful way rather than reacting to something."

Fletcher, who sits on the Public Safety Committee, said he hopes the technology will be covered by a draft policy for data privacy, which is expected to be unveiled later this summer.

His comments come amid an ongoing debate over the use of biometric technologies such as facial recognition, which is already deployed at many airports and stadiums and by public safety agencies across the country. So far, officials said, Minneapolis is not among them.

But Fletcher thinks it's only a matter of time before the city's police force adopts the powerful identification and surveillance technology.

Many in law enforcement have defended the technology as too important a tool to ignore in an increasingly wired world. With the help of facial recognition, even a grainy image captured on a security camera or social-media account can lead investigators to a suspect, as it did in the case of the man charged in the shooting at a Maryland newspaper last year that left five staff members dead.

Minneapolis police spokeswoman Sgt. Darcy Horn declined to answer questions about the technology, saying the department has no immediate plans to start using it.

According to a presentation made this year to the civilian Police Conduct Oversight Commission, the Police Department's vast network of surveillance cameras does not use any automated technology, such as facial recognition, to analyze real-time video. Such technology isn't used with the department's body-worn camera program, either.

Fletcher said he's not advocating for an outright ban similar to those passed in cities such as San Francisco and Somerville, Mass. However, he said he wants to study the issue and draft guidelines for its use not only by city agencies, but also "third-party vendors who have services that would do facial recognition."

"The real sort of core principle here is that we want to create a framework where we can invite technology in and take advantage of technology without creating a digital trail about individuals that becomes searchable and infringes on their privacy," he said. "You should be able to walk someplace and not have an electronic record created of you walking down the street."

His words may bring some comfort to local civil liberties and privacy advocates, who fear that unregulated use of facial recognition could lead to an all-seeing surveillance state like China's, where the government has used the technology to arrest ticket scalpers and track Muslim minority groups.

While Minneapolis doesn't use facial recognition, other local agencies do, notably the Hennepin County Sheriff's Office — whose efforts were uncovered in a lengthy court battle by Tony Webster, a local journalist, web developer and privacy advocate.

Court filings show that in one recent case, analysts from the Sheriff's Office's Criminal Information Sharing and Analysis Unit ran a suspect's photo from Instagram against several databases, which included Hennepin County booking photos, revealing a possible match.

The technology was used in a similar fashion to solve the fatal shooting of a man on Minneapolis' South Side last summer.

The Sheriff's Office did not respond to a request for comment.

Fellow Minneapolis City Council Member Andrew Johnson echoed Fletcher's skepticism, saying the breakneck pace of technology creates a moral dilemma at a time when Americans are increasingly concerned about their privacy.

"When it comes to things like facial recognition and tracking people, there has to be a balance here between civil liberties and constitutional rights of privacy and public safety interests," he said. "I think the thing that folks learned after Sept. 11 is that you don't want to waive your rights with the flag."

And yet, no federal laws govern its use, experts say.

Most facial recognition systems work by measuring a person's distinctive facial features and converting them into long strings of numbers, called "feature vectors" or "faceprints," that serve as a virtual map of the face and can be compared against other scanned images of faces.
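
To make the "faceprint" idea concrete, here is a minimal sketch in Python, using only NumPy, of how a probe image's feature vector might be compared against a gallery of stored faceprints. The vectors, names and threshold below are hypothetical placeholders; in a real system the embeddings would come from a trained face-recognition model, not random data.

```python
import numpy as np

# Hypothetical, precomputed "faceprints": stand-ins for vectors a trained
# face-embedding model would produce for booking photos.
gallery = {
    "booking_photo_001": np.random.rand(128),
    "booking_photo_002": np.random.rand(128),
}
probe = np.random.rand(128)  # faceprint extracted from the probe image

def cosine_similarity(a, b):
    """Compare two faceprints; values near 1.0 indicate a close match."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank gallery entries by similarity to the probe and apply a match threshold.
THRESHOLD = 0.8  # illustrative value; real systems tune this carefully
scores = {name: cosine_similarity(probe, vec) for name, vec in gallery.items()}
best_name, best_score = max(scores.items(), key=lambda kv: kv[1])
if best_score >= THRESHOLD:
    print(f"Possible match: {best_name} (score {best_score:.2f})")
else:
    print("No match above threshold")
```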

In some ways, the technology is already part of everyday life, transforming the way people check in at airports, unlock smartphones or tag their friends in photos on social media.

But critics say the stakes are higher for government agencies, arguing that it's one thing for facial recognition to mislabel a friend in a Facebook photo and quite another for it to misidentify a criminal suspect.

FaceApp, a popular but controversial app that applies aging filters to users' photos, recently stirred privacy concerns over whether its Russian developers were hoarding biometric data, a claim some security researchers have since questioned.

In a recent blog post, the Brookings Institution said the technology, while promising, is still too inconsistent to be relied on to accurately identify crime suspects.

As an example, it pointed to the well-documented troubles of Amazon's Rekognition software — already in use by law enforcement in Oregon and Florida — which drew sharp criticism for misidentifying ethnic minorities at higher rates than whites.

"In formulating a coherent policy to govern facial recognition, policymakers should consider the 'how,' 'when,' and 'why' of using such a powerful tool: using appropriate thresholds of confidence for photographs, only utilizing facial recognition after the fact rather than in real time, and limiting its use to the most serious crimes," the post read. "Finally, governments must consider securing data to avoid breaches of sensitive personal information."

Axon, which supplies body cameras to Minneapolis and many other large U.S. police agencies, recently announced it would ban the use of facial recognition software on its devices.

A report from the company's ethics board concluded that "face recognition technology is not currently reliable enough to ethically justify its use."

The controversy over its use reignited this month after a Washington Post story revealed that federal agents had for years searched through millions of state driver's license photos without warrants or drivers' consent.

Matt Ehling, with the Minnesota Coalition on Government Information, said the pro-transparency group has urged state lawmakers to regulate the technology — likening the heated debate to one that played out at the Legislature years ago over license plate readers.

He said facial recognition's sometimes shaky results don't inspire much confidence, creating the risk that police could misidentify an innocent person as a suspect based on a blurry or low-quality image.

"We're on the cusp of things that people just didn't think were possible even a few years ago," Ehling said.

The Associated Press contributed to this report.