Showing posts with label privacy.

Monday, July 4, 2022

American data privacy, post Roe

 As we plunge ahead into the post-Roe era, American laws about abortion are going to be very divided. Some states will seek to criminalize not only surgical abortions, but the use of pharmaceuticals as well (and, if Justice Thomas gets his wish, perhaps contraceptives of all sorts, as well as day-after pills).*

Some states may seek to prosecute their residents who seek treatment out of state, or who order mail order pharmaceuticals. Doing so will leave a data trail, in searches on the web, emails, and geo-location data.  How private will those data be?

This is going to be an issue for tech companies, prosecutors, and legislators at both state and federal levels.  E.g. can prosecutors access and use your geo-location data to determine if you visited a clinic?  Your web searches to see if you looked for one? Your emails or pharmacy data to see if you ordered drugs?  Your medical data of other sorts?

*Here is the Supreme Court Opinion, written by Justice Alito followed by the other opinions. Justice Thomas' concurring opinion begins on p. 117 of the pdf, after Appendix A to the majority opinion which ends on numbered page 108 (but the numbering restarts at 1 for Justice Thomas' opinion).  DOBBS, STATE HEALTH OFFICER OF THE MISSISSIPPI DEPARTMENT OF HEALTH, ET AL. v. JACKSON WOMEN’S HEALTH ORGANIZATION ET AL. 

Here are some thoughts on various aspects of the emerging situation.

From STAT:

HIPAA won’t protect you if prosecutors want your reproductive health records, by Eric Boodman, Tara Bannow, Bob Herman, and Casey Ross

"With Roe v. Wade now overturned, patients are wondering whether federal laws will shield their reproductive health data from state law enforcement, or legal action more broadly. The answer, currently, is no.

"If there’s a warrant, court order, or subpoena for the release of those medical records, then a clinic is required to hand them over. 

...

"As far as health records go, the most salient law is HIPAA — the Health Insurance Portability and Accountability Act. It’s possible that federal officials could try to tweak it, so records of reproductive care or abortion receive extra protection, but legal experts say that’s unlikely to stand up in the courts in a time when many judges tend to be unfriendly to executive action.

...

"In states that ban abortion, simply the suspicion that a patient had an abortion would be enough to allow law enforcement to poke around in their medical records under the guise of identifying or locating a suspect, said Isabelle Bibet-Kalinyak, a member of Brach Eichler’s health care law practice. “They would still need to have probable cause,” she said."

***

Health tech companies are scrambling to close data privacy gaps after abortion ruling, by Katie Palmer and Casey Ross, July 2

"STAT reached out to two dozen companies that interact with user data about menstrual cycles, fertility, pregnancy, and abortion, asking about their current data practices and plans to adapt. The picture that emerged is one of companies scrambling to transform — building out legal teams, racing to design new privacy-protecting products, and aiming to communicate more clearly about how they handle data and provide care in the face of swirling distrust of digital health tools.

"Period-tracking apps have been the target of some of the loudest calls for privacy protections, and the most visible corporate response. At least two period-tracking apps are now developing anonymous versions: Natural Cycles, whose product is cleared by the Food and Drug Administration as a form of birth control, said it’s had calls to trade insights with Flo, which is also building an anonymous version of its app."

********

From the Guardian:

Tech firms under pressure to safeguard user data as abortion prosecutions loom. Private information collected and retained by companies could be weaponized to prosecute abortion seekers and providers. By Kari Paul

"Such data has already been used to prosecute people for miscarriages and pregnancy termination in states with strict abortion laws, including one case in which a woman’s online search for abortion pills was brought against her in court. 

...

"Smaller companies are also being targeted with questions over their data practices, as frantic calls to delete period tracking apps went viral following the supreme court decision. Some of those companies, unlike the tech giants, have taken public stands.

“At this fraught moment, we hear the anger and the anxiety coming from our US community,” period tracking app Clue said in a statement. “We remain committed to protecting your reproductive health data.”

"Digital rights advocacy group the Electronic Frontier Foundation (EFF) has advised companies in the tech world to pre-emptively prepare for a future in which they are served with subpoenas and warrants seeking user data to prosecute abortion seekers and providers.

"It recommends companies allow pseudonymous or anonymous access, stop behavioral tracking, and retain as little data as possible. It also advocated for end-to-end encryption by default and refrain from collecting any location information."

**********

From the NYT:

When Brazil Banned Abortion Pills, Women Turned to Drug Traffickers. With Roe v. Wade overturned, states banning abortion are looking to prevent the distribution of abortion medication. Brazil shows the possible consequences.  By Stephanie Nolen

"The trajectory of access to abortion pills in Brazil may offer insight into how medication abortion can become out of reach and what can happen when it does.

"While surgical abortion was the original target of Brazil’s abortion ban, the proscription expanded after medication abortion became more common, leading to the situation today where drug traffickers control most access to the pills. Women who procure them have no guarantee of the safety or authenticity of what they are taking, and if they have complications, they fear seeking help.

************

From the Guardian

Google will delete location history data for abortion clinic visits. The company said that sensitive places including fertility centers, clinics and addiction treatment facilities will be erased

"Alphabet will delete location data showing when users visit an abortion clinic, the online search company said on Friday, after concern that a digital trail could inform law enforcement if an individual terminates a pregnancy illegally.

...

"Effective in the coming weeks, for those who do use location history, entries showing sensitive places including fertility centers, abortion clinics and addiction treatment facilities will be deleted soon after a visit."

***********

And while we await further developments here, the Times has an article about growing surveillance in China:

‘An Invisible Cage’: How China Is Policing the Future By Paul Mozur, Muyi Xiao and John Liu, June 25, 2022

It begins "The more than 1.4 billion people living in China are constantly watched. They are recorded by police cameras that are everywhere, on street corners and subway ceilings, in hotel lobbies and apartment buildings. Their phones are tracked, their purchases are monitored, and their online chats are censored..."

Sunday, July 3, 2022

Pregnancy in Poland, a database and anti-abortion laws

 The Lancet recently reported on new pregnancy data being collected in Poland, and controversy on whether and how it might be used in enforcing Poland's very stringent anti-abortion laws.

Poland to introduce controversial pregnancy register, by Ed Holt, Lancet, Volume 399, Issue 10343, P2256, June 18, 2022. DOI: https://doi.org/10.1016/S0140-6736(22)01097-2

"A new legal provision in Poland requiring doctors to collect records on all pregnancies has been condemned by critics who fear it could create a pregnancy register to monitor whether women give birth, or track those who go abroad for abortions.

Poland has some of Europe's strictest abortion laws, with terminations allowed in only two instances—if the woman's health or life is at risk and if the pregnancy is the result of either rape or incest. Until last year, abortions had also been allowed when the fetus had congenital defects. Most legal terminations in Poland were carried out under this exemption. But this provision was removed by a constitutional court ruling following a challenge by members of the ruling right-wing Law and Justice party, which some rights activists accuse of systematic suppression of women's rights.

Rights groups and opposition Members of Parliament (MPs) say that, in light of the tightened abortion legislation, they worry that the collected pregnancy data could be used by police and prosecutors in an unprecedented state surveillance campaign against women. “A pregnancy register in a country with an almost complete ban on abortion is terrifying”, Agnieszka Dziemianowicz-Bąk, an MP for the New Left party, said. 

***********

Here's a recent NY Times story on the implementation of Polish anti-abortion law:

Poland Shows the Risks for Women When Abortion Is Banned. Poland’s abortion ban has had many unintended consequences. One is that doctors are sometimes afraid to remove fetuses or administer cancer treatment to save women’s lives.  By Katrin Bennhold and Monika Pronczuk, Updated June 16, 2022

"Today, Poland and Malta, both staunchly Catholic, are the only European Union countries where abortions are effectively outlawed.

"The consequences in Poland have been far-reaching: Abortion-rights activists have been threatened with prison for handing out abortion pills. The number of Polish women traveling abroad to get abortions, already in the thousands, has swelled further. A black market of abortion pills — some fake and many overpriced — is thriving.

"Technically, the law still allows abortions if there is a serious risk to a woman’s health and life. But critics say it fails to provide necessary clarity, paralyzing doctors."

Wednesday, June 1, 2022

Health data and privacy, in a world of overlapping data

 Re-identifying de-identified data, by combining it with other data sets, sometimes provides a way of legally circumventing medical privacy laws such as HIPAA.  Data re-identification isn't illegal.
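A minimal sketch of the mechanics, with invented data: a "de-identified" medical extract is joined to an identified public data set (a voter file, say) on shared quasi-identifiers, and names reattach to diagnoses. The column names and records below are hypothetical; real linkage attacks use the same idea at scale.

```python
# Hypothetical illustration of re-identification by linking data sets.
import pandas as pd

# A de-identified medical extract: names removed, quasi-identifiers kept.
medical = pd.DataFrame({
    "zip": ["02138", "02139", "94305"],
    "birth_year": [1984, 1990, 1975],
    "sex": ["F", "M", "F"],
    "diagnosis": ["asthma", "diabetes", "depression"],
})

# A separate, identified data set (e.g., a voter file or marketing list).
public = pd.DataFrame({
    "name": ["A. Smith", "B. Jones", "C. Lee"],
    "zip": ["02138", "02139", "94305"],
    "birth_year": [1984, 1990, 1975],
    "sex": ["F", "M", "F"],
})

# Joining on the shared quasi-identifiers re-attaches names to diagnoses.
reidentified = medical.merge(public, on=["zip", "birth_year", "sex"])
print(reidentified[["name", "diagnosis"]])
```

The join itself is ordinary data analysis, which is part of why, as noted above, re-identification isn't illegal.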

Here's a story from Stat:

 Top privacy researchers urge the health care industry to safeguard patient data. By Megan Molteni 

"As a STAT investigation published Monday revealed, data brokers are quietly trafficking in Americans’ health information — often without their knowledge or consent, and beyond the reach of federal health privacy laws. This market in medical records has become highly lucrative  — $13.5 billion annually —  thanks to advances in artificial intelligence that enable the slicing, dicing, and cross-referencing of that data in powerful new ways.

"But the building of these algorithms often sidelines patient privacy. And researchers who’ve been tracking these erosive effects say it’s time to reform how health data is governed and give patients back control of their information.

...

"One of the most frequent harms he and other researchers have chronicled: Patients being denied care or insurance coverage based on information payers drew from their social media activities after combining datasets to re-identify them. 

Friday, May 27, 2022

Personal data as a national (not international) resource

 The NY Times has the story:

The Era of Borderless Data Is Ending. Nations are accelerating efforts to control data produced within their perimeters, disrupting the flow of what has become a kind of digital currency.  By David McCabe and Adam Satariano

"France, Austria, South Africa and more than 50 other countries are accelerating efforts to control the digital information produced by their citizens, government agencies and corporations. Driven by security and privacy concerns, as well as economic interests and authoritarian and nationalistic urges, governments are increasingly setting rules and standards about how data can and cannot move around the globe. The goal is to gain “digital sovereignty.”

...

"In Washington, the Biden administration is circulating an early draft of an executive order meant to stop rivals like China from gaining access to American data.

"In the European Union, judges and policymakers are pushing efforts to guard information generated within the 27-nation bloc, including tougher online privacy requirements and rules for artificial intelligence.

"In India, lawmakers are moving to pass a law that would limit what data could leave the nation of almost 1.4 billion people.

"The number of laws, regulations and government policies that require digital information to be stored in a specific country more than doubled to 144 from 2017 to 2021, according to the Information Technology and Innovation Foundation.

"While countries like China have long cordoned off their digital ecosystems, the imposition of more national rules on information flows is a fundamental shift in the democratic world and alters how the internet has operated since it became widely commercialized in the 1990s.


Saturday, July 3, 2021

The art of money laundering through the art market

 What looks like privacy to some looks like secrecy to others.

The NY Times has the story:

As Money Launderers Buy Dalís, U.S. Looks at Lifting the Veil on Art Sales. Secrecy has long been part of the art market’s mystique, but now lawmakers say they fear it fosters abuses and should be addressed.  By Graham Bowley

"Billions of dollars of art changes hands every year with little or no public scrutiny. Buyers typically have no idea where the work they are purchasing is coming from. Sellers are similarly in the dark about where a work is going. And none of the purchasing requires the filing of paperwork that would allow regulators to easily track art sales or profits, a distinct difference from the way the government can review the transfer of other substantial assets, like stocks or real estate.

...

"In January, Congress extended federal anti-money laundering regulations, designed to govern the banking industry, to antiquities dealers. The legislation required the Department of the Treasury to join with other agencies to study whether the stricter regulations should be imposed on the wider art market as well. The U.S. effort follows laws recently adopted in Europe, where dealers and auction houses must now determine the identity of their clients and check the source of their wealth.

“Secrecy, anonymity and a lack of regulation create an environment ripe for laundering money and evading sanctions,” the U.S. Senate’s Permanent Subcommittee on Investigations said in a report last July in support of increased scrutiny.

"To art world veterans, who associate anonymity with discretion, tradition and class, not duplicity, this siege on secrecy is an overreaction that will damage the market. They worry about alienating customers with probing questions when they say there is scant evidence of abuse.

...

"What is the origin of such secrecy? Experts say it likely dates to the earliest days of the art market in the 15th and 16th centuries when the Guilds of St. Luke, professional trade organizations, began to regulate the production and sale of art in Europe. Until then, art was not so much sold as commissioned by aristocratic or clerical patrons. But as a merchant class expanded, so did an art market, operating from workshops and public stalls in cities like Antwerp. To thwart competitors, it made sense to conceal the identity of one’s clients so they could not be stolen, or to keep secret what they charged one customer so they could charge another client a different price, incentives to guard information that persist today.

...

"Auction catalogs say works are from “a private collection,” often nothing more. Paintings are at times brought to market by representatives of owners whose identities are unknown, even to the galleries arranging the sale, experts and officials say. Purchasers use surrogates, too. 

************

Here's a related report from the U.S. Senate's Permanent Subcommittee on Investigations on how art sales have been used to circumvent U.S. sanctions on Russian oligarchs:

THE ART INDUSTRY AND U.S. POLICIES THAT UNDERMINE SANCTIONS.  STAFF REPORT, PERMANENT SUBCOMMITTEE ON INVESTIGATIONS, UNITED STATES SENATE

Tuesday, February 9, 2021

Understanding Big Data: Data Calculus In The Digital Era: report from the Luohan Academy

 Here's a new report from the Luohan Academy

Understanding Big Data: Data Calculus In The Digital Era, Feb 05, 2021

Authors

Luohan Community: Patrick Bolton, Bengt Holmström, Eric Maskin, Sir Christopher Pissarides, Michael Spence, Tao Sun, Tianshu Sun, Wei Xiong, Liyan Yang

In-house: Long Chen, Yadong Huang, Yong Li, Xuan Luo, Yingju Ma, Shumiao Ouyang, Feng Zhu

From the foreword: "The pervasive use of digitized information has reached a new height that we call the era of "big data." While this has led to unprecedented societal cooperation, it has also intensified three major concerns: How can we properly protect personal privacy in the age of big data? How do we understand and manage the ownership and distribution of benefits and risks arising from the use of data? Will the use of big data lead to "winner-take-all" markets that undermine competition to the detriment of consumers and society? "

From the conclusion: "While acknowledging the challenges of privacy and data security risks, we have explored how such risks can be effectively and efficiently managed through a middle ground of government and industry self-regulation. With the right design of mechanisms and technologies, it has become increasingly possible to maintain anonymity, collect and share data while avoiding the sharing of personally identifiable information and reducing privacy and security risks, while still allowing data to freely flow. With the right technologies, the benefits of data sharing do not have to conflict with unacceptable risks to privacy. There is a way forward to capture the enormous benefits of big data while mitigating its risks, the goal of efficient and effective privacy protection. 

"One major issue is data ownership. Giving ownership of data to users who are the subjects of the data may seem like a natural safeguard of privacy. But exclusive ownership would run up against the efficient use of data as a non-rivalry good. In practice, individuals are seldom willing to make the effort of producing and recording data. In the language of economists, the private provision of a public good is generally inefficient. In addition, most people on the street do not have the capacity to mine and create big data for innovation. Data producers -- engineers at information technology firms -- do.

...

"We conclude by recommending the following three principles for governing the market for digital data:

Principle 1: Data ownership by data producers (including data subjects as producers) should be predicated on data integrity, anonymity, and especially the protection of personal and societal privacy.

Principle 2: Privacy protection and data security can to a large extent be achieved by combining state-of-the-art technologies and innovative mechanism designs.

Principle 3: Competition and consumer protection analyses of and policy prescriptions for data-driven markets should take into account the documented pro-competitive and pro-consumer benefits of big data along with any potential for anti-competitive and anti-consumer effects in specific markets."

Friday, August 14, 2020

Should residency program rank order lists be kept confidential from the Dean?

Here's the report of a survey of residency program directors in radiology. One issue, not confined to radiology, is the confidentiality of their rank order list for the resident match--confidentiality from their own administrative hierarchy. The problem with having to show your rank order list to your dean is that it interferes with program directors' incentives to rank candidates in order of true preferences: "Thirty-seven percent felt pressure to match applicants from the top of the rank list in order to improve the perceived “success” in the match." That is, some of these programs refrain from ranking the most desirable applicants they interviewed, because they worry those applicants will match to other programs, which would make the program look bad to the dean (who will ask "how come you have to go so far down on your list?").

“What Program Directors Think” V: Results of the 2019 Spring Survey of the Association of Program Directors in Radiology (APDR), Academic Radiology, 8 August 2020, In Press, Corrected Proof

by Anna Rozenshtein MD, MPH; Brent D. Griffith MD; Priscilla J. Slanetz MD, MPH; Carolynn M. DeBenedectis MD; Jennifer E. Gould MD; Jennifer R. Kohr MD; Tan-Lucien Mohammed MD, MS; Angelisa M. Paladin MD; Paul J. Rochon MD; Monica Sheth MD; Ernest F. Wiggins III MD; and Jonathan O. Swanson MD

"The Association of Program Directors in Radiology (APDR) surveys its membership annually on hot topics and new developments in radiology residency training. Here we report the results of that annual survey.

...

"Radiology Residency Match: Forty-nine percent of respondents reported that the final rank list is known only to the program administration (PD/APD) and the selection committee, while 27% disclosed the rank list to the department administration and 24% to the institution. Thirty-seven percent felt pressure to match applicants from the top of the rank list in order to improve the perceived “success” in the match."

Monday, December 30, 2019

Some kinds of privacy may be gone forever

Lots of family secrets are revealed by DNA analysis, and it may no longer be possible to keep those secrets.  That is part of the argument made by Dr. Julia Creet, in an interview published at Bill of Health under the title "The End of Privacy?"


Dr. Julia Creet: I made the statement that any idea we had about privacy is over in response to a number of troubling trends in genetic genealogy. DTC genetic tests have revealed long-held family secrets, biological parents and siblings of adoptees, and the identities of sperm and egg donors. In each case, the question of the right of the searcher trumped the rights of those who wanted their privacy protected. In a few cases, sperm donors have sued for invasion of privacy. What these cases show is that even if we think we are protected by the privacy provisions of donor agreements or closed adoptions, genetic tests can leap over those privacy barriers. Many genealogists have declared that there will be no more family secrets in the future. So, family privacy is a thing of the past, which may or may not be a good thing. On a larger scale, law enforcement use of DTC genetic testing databases has demonstrated that data uploaded for one purpose can be used in the future for a completely unanticipated purpose. Without the ability to predict future uses of this information, we cannot put a privacy policy in place that will anticipate all the unforeseen future uses. I think the most telling cases in the last few weeks are the recent warrant that allowed law enforcement access to the GEDmatch database even though most users had opted out of having their results included in searches, and the rather frightening report from Peter Ney about the ease of malware intrusions on genetic genealogy databases.

Monday, December 23, 2019

"The Ethical Algorithm" by Michael Kearns and Aaron Roth (book talk at Google)

Here's a talk about "The Ethical Algorithm--The Science of Socially Aware Algorithm Design"
by Michael Kearns and Aaron Roth.


IMHO it would make a fine last minute holiday gift for those interested in econ and market design as well as for fans of computer science and algorithms:) 

Friday, July 19, 2019

Privacy and dating apps

As internet and app-driven dating becomes increasingly common, so has the tension between dating and privacy, i.e. between indicating to potential partners who you are and what you want, and keeping some privacy about these things in the rest of your life.  The NY Times has an article by NY Law School prof Ari Ezra Waldman that focuses on the design of dating apps with respect to privacy:

 Queer Dating Apps Are Unsafe by Design
Privacy is particularly important for L.G.B.T.Q. people. By Ari Ezra Waldman.

"Pete Buttigieg met his husband on a dating app called Hinge. And although that’s unique among presidential candidates, it’s not unique for Mr. Buttigieg’s generation — he’s 37 — or other members of the L.G.B.T.Q. community.
In 2016, the Pew Research Center found that use of online dating apps among young adults had tripled in three years, and nearly six in 10 adults of all ages thought apps were a good way to meet someone. The rates are higher among queer people, many of whom turn to digital spaces when stigma, discrimination and long distances make face-to-face interaction difficult. One study reported that in 2013 more than one million gay and bisexual men logged in to a dating app every day and sent more than seven million messages and two million photos over all.
...
"But for queer people, privacy is uniquely important. Because employers in 29 states can fire workers simply for being gay or transgender, privacy with respect to our sexual orientations and gender identities protects our livelihoods. 
...
"All digital dating platforms require significant disclosure. Selfies and other personal information are the currencies on which someone decides whether to swipe right or left, or click a heart, or send a message. 
...
Hinge made a commitment to privacy by designing in automatic deletion of all communications the moment users delete their accounts. Scruff, another gay-oriented app, makes it easy to flag offending accounts within the app and claims to respond to all complaints within 24 hours. Grindr, on the other hand, ignored 100 complaints from Mr. Herrick about his harassment. If, as scholars have argued, Section 230 had a good-faith threshold, broad immunity would be granted only to those digital platforms that deserve it.
Privacy isn’t anathematic to online dating. Users want it, and they try hard to maintain it. The problem isn’t sharing intimate selfies, no matter what victim-blamers would have us believe. The problem is the law permits the development of apps that are unsafe by design."

Tuesday, July 2, 2019

Sperm donors used to be anonymous. Technology has made that obsolete

Here's a representative story from the NY Times:

Sperm Donors Can’t Stay Secret Anymore. Here’s What That Means.  By Susan Dominus

"To be the biological child of an anonymous sperm donor today is to live in a state of perpetual anticipation. Having never imagined a world in which donors could be tracked down by DNA, in their early years sperm banks did not limit the number of families to whom one donor’s sperm would be sold — means that many of the children conceived have half-siblings in the dozens. There are hundreds of biological half-sibling groups that number more than 20, according to the Donor Sibling Registry, where siblings can find one another, using their donor number. Groups larger than 100, the registry reports, are far from rare.
"Because of the increasing popularity of genetic testing sites like 23andMe, in the past two or three years a whole new category of people, including those who never knew they were conceived via donor insemination, are reaching out to half siblings who may have already connected with others in their extended biological family. 
...
"Over time the adoption movement popularized the principle that individuals had a right to know their biological roots, and lesbian couples and single mothers, dominating ever more of the sperm banks’ market, called for greater transparency. In the early 2000s, California Cryobank offered, for a premium fee, an option for parents to choose a donor who agreed not just to be contacted when the offspring turned 18 but to respond in some fashion (though still anonymously if that was his preference).
By 2010, experts in reproductive technology were starting to note that internet searchability, facial-recognition software and the future of DNA testing would soon render anonymity a promise that the sperm banks could no longer keep. Since 2017, California Cryobank has stopped offering anonymity to its new donors. Donors now must agree to reveal their names to their offspring when they turn 18 and to have some form of communication to be mediated, at first, by the bank."
************

And here's an accompanying story, by a man who has now met and photographed many of his half-sibs.

Wednesday, April 24, 2019

Insurance, privacy, surveillance, algorithms, and repugnance

The NY Times is on the case:

Insurers Want to Know How Many Steps You Took Today
The cutting edge of the insurance industry involves adjusting premiums and policies based on new forms of surveillance.
By Sarah Jeong

"Last year, the life insurance company John Hancock began to offer its customers the option to wear a fitness tracker — a wearable device that can collect information about how active you are, how many calories you burn, and how much you sleep. The idea is that your Fitbit or Apple Watch can tell whether or not you’re living the good, healthy life — and if you are, your insurance premium will go down.
...
"artificial intelligence is known to reproduce biases that aren’t explicitly coded into it. In the field of insurance, this turns into “proxy discrimination.” For example, an algorithm might (correctly) conclude that joining a Facebook group for a BRCA1 mutation is an indicator of high risk for a health insurance company. Even though actual genetic information — which is illegal to use — is never put into the system, the algorithmic black box ends up reproducing genetic discrimination.

"A ZIP code might become a proxy for race; a choice of wording in a résumé might become a proxy for gender; a credit card purchase history can become a proxy for pregnancy status. Legal oversight of insurance companies, which are typically regulated by states, mostly looks at discrimination deemed to be irrational: bias based on race, sex, poverty or genetics. It’s not so clear what can be done about rational indicators that are little but proxies for factors that would be illegal to consider.
...
"A. I. research should march on. But when it comes to insurance in particular, there are unanswered questions about the kind of biases that are acceptable. Discrimination based on genetics has already been deemed repugnant, even if it’s perfectly rational. Poverty might be a rational indicator of risk, but should society allow companies to penalize the poor? Perhaps for now, A.I.’s more dubious consumer applications are better left in a laboratory."

HT: Julio Elias

Wednesday, January 23, 2019

Bounty hunters, licensing, and cellphone location data

The Seattle Times has a disturbing article about a little-regulated private enterprise part of the criminal justice system in many American states--the bounty hunters who work with bail bondsmen:

Lax Washington oversight of bounty hunters sets stage for mayhem, tragedy
(The url gives an alternative headline: https://www.seattletimes.com/seattle-news/times-watchdog/high-adrenaline-bounty-hunter-industry-operates-with-little-oversight-despite-concerns-over-training-tactics/ .)

"Formally known as bail-bond recovery agents, bounty hunters frequently carry firearms and have the right to forcibly enter homes and apprehend people who jump bail.

"Yet getting a license is relatively easy, and hardly anyone is turned away — even if they have a history of violence, a Seattle Times investigation has found.
...
"The lax requirements for bounty hunters are at odds with the weapons and tactics the agents are allowed to use. To get a license, an applicant must take 32 hours of training, which can include self-study, and must pass a 50-question, multiple-choice exam. The state has no formalized curriculum or certification process for instructors. Only the person teaching the firearms portion of the training is required to be certified through the state.

"By comparison, to get a license to perform manicures and style hair, a cosmetologist must receive 1,600 hours of training from a state-approved and licensed instructor."
***********

Furthermore:
I Gave a Bounty Hunter $300. Then He Located Our Phone
T-Mobile, Sprint, and AT&T are selling access to their customers’ location data, and that data is ending up in the hands of bounty hunters and others not authorized to possess it, letting them track most phones in the country.

and
Ajit Pai Refuses to Brief Congress About Why Bounty Hunters Can Buy Cell Phone Location Data
The Chairman's staff said the selling of location data is not a 'threat to the safety of human life or property that the FCC will address during the Trump shutdown.'

Friday, October 12, 2018

Coffee for personal data

Inside Higher Ed has the story:
Café Swaps Espresso for Personal Info
A Japanese café chain plans to spread among Ivy League and other top campuses, offering free coffee and tea in exchange for students' personal information and consent to be contacted by companies.

"The cashless Shiru cafés give out handmade coffee and tea drinks for free. In exchange, students flash a university ID and, in the bargain, hand over a small cache of personal information: name, age, email address, interests, major and graduation year, among other details. They also agree to be contacted by Shiru’s corporate sponsors, who underwrite all those cappuccinos, matcha lattes and iced Americanos.
...
"Starbucks, meet LinkedIn … with extra foam.
...
"[at Brown University]...“I don’t get the feeling from my classmates that they’re trying to reduce their data footprint.”

Sunday, November 27, 2016

An interview with computer scientist Cynthia Dwork

Quanta Magazine, a publication of the Simons foundation, has an interview with Cynthia Dwork on differential privacy and fairness among other things.

How to Force Our Machines to Play Fair
The computer scientist Cynthia Dwork takes abstract concepts like privacy and fairness and adapts them into machine code for the algorithmic age.

Here are some earlier news stories about Apple's introduction of differential privacy to the iPhone,  which I've been following for a number of reasons.

From TechCrunch: What Apple’s differential privacy means for your data and the future of machine learning

From Wired: Apple’s ‘Differential Privacy’ Is About Collecting Your Data—But Not ​Your Data

Apple's differential privacy analyzes the group, protects the individual
Posted on June 21, 2016 

Friday, November 11, 2016

Designing privacy (differential privacy) at the Institute for Advanced Study


Differential privacy disentangles learning about a dataset as a whole from learning about an individual data contributor. Just now entering practice on a global scale, the demand for advanced differential privacy techniques and knowledge of basic skills is pressing. This symposium will provide an in-depth look at the current context for privacy-preserving statistical data analysis and an agenda for future research. This event is organized by Cynthia Dwork, of Microsoft Research, with support from the Alfred P. Sloan Foundation.
Speakers include:
Helen Nissenbaum, Cornell Tech and NYU
Aaron Roth, University of Pennsylvania
Guy Rothblum, Weizmann Institute
Kunal Talwar, Google Brain
Jonathan Ullman, Northeastern University
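The symposium description above refers to what is usually called the central model of differential privacy: a trusted curator holds the raw data and answers queries after adding noise calibrated to how much any one person can change the answer. Here is a minimal sketch of that idea; the function, the toy data, and the epsilon value are illustrative assumptions, not anything presented at the symposium.

```python
# Hypothetical sketch of the central model: the Laplace mechanism for a count.
import numpy as np

def private_count(records, predicate, epsilon):
    """Differentially private count of records satisfying predicate.

    A counting query has sensitivity 1 (adding or removing one person changes
    the count by at most 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Toy usage: how many patients in a data set are over 65?
patients = [{"age": a} for a in [34, 71, 68, 45, 80, 23, 67]]
print(private_count(patients, lambda r: r["age"] > 65, epsilon=0.5))
```

Smaller epsilon means more noise and stronger protection for any individual, while the whole-dataset trend (the approximate count) remains learnable.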

Saturday, November 5, 2016

In Britain, National Sperm Bank stops recruiting donors

The Telegraph has the story: National Sperm Bank stops recruiting donors after just two years

"The National Sperm Bank (NSB) was a joint project run by the National Gamete Donation Trust (NGDT) and the Birmingham Women’s Fertility Centre and launched in October 2014 with a £77,000 grant from the Department of Health.
It was hoped the service would plug the gap in the shortage of donors and prevent couples being forced to look for sperm from overseas.
The bank hoped to be self-sufficient within a year but because the full donor process takes up to 18 months they were unable to generate enough income to keep going.
Although they only managed to recruit seven viable donors, experts said it was the business model that proved their ultimate downfall.
...
"For every 100 men who enquire about being a donor, only 4 or 5 are ultimately accepted.
The Human Fertilisation and Embryology Authority estimate that 2,000 children are born every year in the UK using donated eggs, sperm or embryos and there are around licensed UK clinics performing sperm donor insemination.
But the majority of clinics are based in London and the south-east of England and treatment can be expensive. The cost of donor sperm from the UK's largest sperm bank, the London Sperm Bank, is currently £950. In contrast the National Sperm bank was proposing to charge £300 per insemination.
...
"The bank has also suffered because since 2005 the children of donors have a right to learn the identity of their fathers when they turn 18. The numbers of men willing to donate sperm has fallen dramatically since their anonymity was removed."

Saturday, August 20, 2016

Differential privacy at Apple

The MIT Technology Review has an article about Apple's use of differential privacy, that caught my eye for several reasons: Apple’s New Privacy Technology May Pressure Competitors to Better Protect Our Data: The technology is almost a decade-old idea that’s finally coming to fruition.

"On a quarterly investor call last week, Apple CEO Tim Cook boasted that the technology would let his company “deliver the kinds of services we dream of without compromising on individual privacy.” Apple will initially use the technique to track trends in what people type and tap on their phones to improve its predictive keyboard and Spotlight search tool, without learning what exactly any individual typed or clicked.
...
“It’s exciting that things we knew how to do in principle are being embraced and widely deployed,” says Aaron Roth, an associate professor at University of Pennsylvania who has written a textbook on differential privacy. “Apple seems to be betting that by including privacy protections, and advertising that fact, they will make their product more attractive.”
In the version of differential privacy Apple is using, known as the local model, software on a person’s device adds noise to data before it is transmitted to Apple. The company never gets hold of the raw data. Its data scientists can still examine trends in how people use their phones by accounting for the noise, but are unable to tell anything about the specific activity of any one individual.
Apple is not the first technology giant to implement differential privacy. In 2014 Google released code for a system called RAPPOR that it uses to collect data from the Chrome Web browser using the local model of differential privacy. But Google has not promoted its use of the technology as aggressively as Apple, which has this year made a new effort to highlight its attention to privacy (see “Apple Rolls Out Privacy-Sensitive Artificial Intelligence”)."
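The local model quoted above can be illustrated with classic randomized response, the simplest local mechanism: each device flips its own answer with some probability before reporting it, and the server recovers the population frequency from the noisy reports. This is a hedged sketch of the general idea, not Apple's or Google's actual encoding (RAPPOR and Apple's system use more elaborate sketches of strings); the function names and the 30% example rate are made up.

```python
# Hypothetical sketch of local differential privacy via randomized response.
import math
import random

def randomize(bit, epsilon):
    """Report the true bit with probability e^eps / (e^eps + 1), else flip it."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p_true else 1 - bit

def estimate_frequency(reports, epsilon):
    """Unbiased estimate of the true fraction of 1s from the noisy reports."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Toy usage: 100,000 devices, 30% of which truly have some feature enabled.
epsilon = 1.0
true_bits = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
reports = [randomize(b, epsilon) for b in true_bits]
print(round(estimate_frequency(reports, epsilon), 3))  # close to 0.3
```

No individual report is trustworthy on its own, since each bit may have been flipped, but the aggregate estimate is accurate, which is the trade-off the article describes.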

Saturday, March 5, 2016

Penn celebrates differential privacy (and an application to anti-terrorism surveillance)

Briefly on Penn's front web page, here's the story.
Balancing Privacy and Security in Network Analysis

Something about the pictures caught my eye:


(L to R) Steven Wu, Michael Kearns, Aaron Roth, and Grigory Yaroslavtsev

Friday, March 13, 2015

Reflections on practical market design, by Moritz Hardt

Moritz Hardt reflects on the political parts of market design, in connection with some of his (more or less) recent, discouraging experience in proposing its use to the California Public Utilities Commission: Towards practicing differential privacy.

Long story short, the CPUC decided not to give data to some users rather than to adopt a privacy standard that would have allowed those users to get useful data.

It's a long post, well worth reading, about what went wrong and what could have been done better. I'll just summarize some of his subject headings, as he thinks about how he'll go about this in the future, in the second part of his post, called On practicing differential privacy:

Focus on win-win applications
"Apply differential privacy as a tool to provide access to data where currently access is problematic due to privacy regulations. Don’t fight the data analyst. Don’t play the moral police. Imagine you are the analyst
....
Don’t empower the naysayers
"for differential privacy to be a major success in practice it would be sufficient if it were successful in some applications but certainly not in all—not even in most.
...
Change your narrative
"Don’t present differential privacy as a fear inducing crypto hammer designed to obfuscate data access. That’s not what it is. Differential privacy is a rigorous way of doing machine learning, not a way of preventing machine learning from being done.
...
Build reliable code repositories
"A weakness of the differential privacy community has been the scarcity of available high quality code.
...
Be less general and more domain-specific
"... reading the scientific literature on differential privacy from the point of view of a domain expert can be very frustrating. Most papers start with toy examples that make perfect sense on a theoretical level, but will appear alarmingly naïve to a domain expert.
...
Be more entrepreneurial
"The CPUC case highlighted that the application of differential privacy in practice can fail as a result of many non-technical issues. These important issues are often not on the radar of academic researchers.
...
So, is differential privacy practical?
"I like the answer Aaron Roth gave when I asked him: It's within striking distance."