
Saturday, October 19, 2024

Proof of human

The Turing test asked whether computers could pass for humans; as AI increasingly passes such tests, there may be growing demands for humans to prove their humanity. World (renamed from Worldcoin this week) plans to attack that task in a privacy-preserving way (i.e., you verify that you are human without identifying yourself).
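Conceptually, the trick is to separate "this is a unique enrolled human" from "this is who they are." Here is a toy sketch of that separation in Python — my illustration of the general idea, not World's actual protocol, which uses zero-knowledge proofs so that even the enrollment commitment below is never shown to apps:

```python
# Toy sketch of "proof of personhood" (an illustration, NOT World's protocol):
# a registry stores one commitment per enrolled human; each app sees only an
# app-scoped pseudonym ("nullifier"), which blocks duplicate signups without
# revealing who the person is.
import hashlib
import secrets

REGISTRY: set[str] = set()                  # commitments of enrolled humans
used_nullifiers: dict[str, set[str]] = {}   # app_id -> nullifiers already seen

def enroll(identity_secret: bytes) -> None:
    """Enrollment verifies uniqueness, then stores only a hash commitment."""
    REGISTRY.add(hashlib.sha256(identity_secret).hexdigest())

def prove_personhood(identity_secret: bytes, app_id: str) -> tuple[str, str]:
    """The user's device derives an app-scoped pseudonym from its secret."""
    commitment = hashlib.sha256(identity_secret).hexdigest()
    nullifier = hashlib.sha256(identity_secret + app_id.encode()).hexdigest()
    return commitment, nullifier

def verify(commitment: str, nullifier: str, app_id: str) -> bool:
    """The app checks enrollment and rejects a second signup by the same human."""
    if commitment not in REGISTRY:
        return False
    seen = used_nullifiers.setdefault(app_id, set())
    if nullifier in seen:
        return False          # this human already signed up here
    seen.add(nullifier)
    return True

alice = secrets.token_bytes(32)            # identity secret held on her device
enroll(alice)
c, n = prove_personhood(alice, "example-app")
assert verify(c, n, "example-app")         # first signup accepted
assert not verify(c, n, "example-app")     # duplicate signup rejected
```

The toy's weakness is instructive: because the app sees the stable commitment, it could link one user across services. Replacing that check with a zero-knowledge set-membership proof — the app learns only "some enrolled human" — is what makes the real designs privacy-preserving.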

Reuters has the story:

Sam Altman's rebranded Worldcoin ramps up iris-scanning crypto project, by Anna Tong, October 17, 2024

"SAN FRANCISCO, Oct 17 (Reuters) - - Worldcoin, a cryptocurrency project founded by OpenAI CEO Sam Altman, said on Thursday it was rebranding to World Network and was ramping up efforts to scan every human's iris using its "orb" devices.


"Its core offering is its World ID, which the company describes as a "digital passport" to prove that its holder is a real human and tell the difference with AI chatbots online"

#########

Two snapshots I took at their event on Thursday:



Sunday, March 17, 2024

Privacy while driving

Internet-connected cars collect lots of data on driving behavior, which can be sold to insurance companies and used to change drivers' insurance rates.

The NYT has the story:

Automakers Are Sharing Consumers’ Driving Behavior With Insurance Companies. LexisNexis, which generates consumer risk profiles for the insurers, knew about every trip G.M. drivers had taken in their cars, including when they sped, braked too hard or accelerated rapidly. By Kashmir Hill

"LexisNexis is a New York-based global data broker with a “Risk Solutions” division that caters to the auto insurance industry and has traditionally kept tabs on car accidents and tickets. 

...

"In recent years, insurance companies have offered incentives to people who install dongles in their cars or download smartphone apps that monitor their driving, including how much they drive, how fast they take corners, how hard they hit the brakes and whether they speed. But “drivers are historically reluctant to participate in these programs,” as Ford Motor put it in a patent application that describes what is happening instead: Car companies are collecting information directly from internet-connected vehicles for use by the insurance industry.

"Sometimes this is happening with a driver’s awareness and consent. Car companies have established relationships with insurance companies, so that if drivers want to sign up for what’s called usage-based insurance — where rates are set based on monitoring of their driving habits — it’s easy to collect that data wirelessly from their cars.

"But in other instances, something much sneakier has happened. Modern cars are internet-enabled, allowing access to services like navigation, roadside assistance and car apps that drivers can connect to their vehicles to locate them or unlock them remotely. In recent years, automakers, including G.M., Honda, Kia and Hyundai, have started offering optional features in their connected-car apps that rate people’s driving. Some drivers may not realize that, if they turn on these features, the car companies then give information about how they drive to data brokers like LexisNexis.

"Automakers and data brokers that have partnered to collect detailed driving data from millions of Americans say they have drivers’ permission to do so. But the existence of these partnerships is nearly invisible to drivers, whose consent is obtained in fine print and murky privacy policies that few read.

"Especially troubling is that some drivers with vehicles made by G.M. say they were tracked even when they did not turn on the feature — called OnStar Smart Driver — and that their insurance rates went up as a result."

Saturday, January 27, 2024

Open source intelligence purchases that would require a warrant if collected directly

 The NYT has the story:

N.S.A. Buys Americans’ Internet Data Without Warrants, Letter Says. By Charlie Savage, January 25, 2024

"The National Security Agency buys certain logs related to Americans’ domestic internet activities from commercial data brokers, according to an unclassified letter by the agency.*

...

"In [a different] letter, General Nakasone wrote that his agency had decided to reveal that it buys and uses various types of commercially available metadata for its foreign intelligence and cybersecurity missions, including netflow data “related to wholly domestic internet communications.”

"Netflow data generally means internet metadata that shows when computers or servers have connected but does not include the content of their interactions. Such records can be generated when people visit different websites or use smartphone apps, but the letter did not specify how detailed the data is that the agency buys."

...

"Law enforcement and intelligence agencies outside the Defense Department also purchase data about Americans in ways that have drawn mounting scrutiny. In September, the inspector general of the Department of Homeland Security faulted several of its units for buying and using smartphone location data in violation of privacy policies. Customs and Border Protection has also indicated that it would stop buying such data."

#######

*Here is the letter referred to above. It is not in fact a letter "by the agency," but is from a senator to the Director of National Intelligence.

"As you know, U.S. intelligence agencies are purchasing personal data about Americans that would require a court order if the government demanded it from communications companies.  

...

"The FTC notes in its complaint [against the data broker X-Mode Social] that the reason informed consent is required for location data is because it can be used to track people to sensitive locations, including medical facilities, places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, and welfare and homeless shelters. The FTC added  that the sale of  such data poses an unwarranted intrusion into the most private areas of consumers lives. While the FTC's -Mode social complaint and order are limited to location data, internet metadata can be equally sensitive. Such records can identify Americans who are seeking help from a suicide hotline or a hotline for survivors of sexual assault or domestic abuse, a visit to a telehealth provider focusing on specific healthcare need, such as those prescribing and delivering abortion  pills by mail, or reveal that someone likely suffers from a gambling addiction."

Friday, December 1, 2023

Fairness in algorithms: Hans Sigrist Prize to Aaron Roth

The University of Bern's Hans Sigrist Prize has been awarded to Penn computer scientist Aaron Roth; it will be celebrated at a symposium today.

Here are today's symposium details and schedule:

Here's an interview:

Aaron Roth: Pioneer of fair algorithms. In December 2023, the most highly endowed prize of the University of Bern will go to the US computer scientist Aaron Roth. His research aims to incorporate social norms into algorithms and to better protect privacy. by Ivo Schmucki

"There are researchers who sit down and take on long-standing problems and just solve them, but I am not smart enough to do that," says Aaron Roth. "So, I have to be the other kind of researcher. I try to define a new problem that no one has worked on yet but that might be interesting."

"Aaron Roth's own modesty may stand in the way of understanding the depth of his contributions. In fact, when he authored his doctoral thesis on differential privacy about 15 years ago and then wrote on the fairness of algorithms a few years later, terms like “Artificial Intelligence” and “Machine Learning” were far from being as firmly anchored in our everyday lives as they are today. Aaron Roth was thus a pioneer, laying the foundation for a new branch of research.

"I am interested in real problems. Issues like data protection are becoming increasingly important as more and more data is generated and collected about all of us," says Aaron Roth about his research during the Hans Sigrist Foundation’s traditional interview with the prize winner. He focuses on algorithmic fairness, differential privacy, and their applications in machine learning and data analysis.

...

"It is important that more attention is paid to these topics," says Mathematics Professor Christiane Tretter, chair of this year's Hans Sigrist Prize Committee. Tretter says that many people perceive fairness and algorithms as two completely different poles, situated in different disciplines and incompatible with each other. "It is fascinating that Aaron Roth’s work shows that this is not a contradiction."

...

"The first step to improving the analysis of large data sets is to be aware of the problem: "We need to realize that data analysis can be problematic. Once we agree on this, we can consider how we can solve the problems," says Aaron Roth."





Monday, September 25, 2023

Smart toilets and data privacy

 Something to sit and think about:

Smart toilets could leak your medical data, warn security experts. by Matthew Sparkes New Scientist, Volume 259, Issue 3456, 2023, Page 14, ISSN 0262-4079, https://doi.org/10.1016/S0262-4079(23)01720-7. 

"A range of start-ups and research projects have developed smart toilets to monitor everything from heart rate to the consistency of stools and the presence of certain proteins in urine that indicate disease. One device even features an “anus camera” that takes a photo from below for identification, something that has been described as the “polar opposite of facial recognition”.*

...

"One concern was the privacy of people other than the owner: are visitors consenting to have photographs or measurements taken? There were also worries about the risk of losing sensitive data to hackers, as well as the possibility of companies selling the data on. And if smart toilets were installed in public areas or workplaces, there would be questions about who has access to that data, it was argued.The group of experts concluded that smart toilets shouldn't be sold as consumer devices, but instead as medical devices that have to meet high regulatory standards for privacy and safety (arXiv, doi.org/ksx5).

"Chase Moyle at smart toilet start-up Coprata says he set out to build a consumer device because creating a medical device under US Food and Drug Administration regulations would raise the price by a factor of 10. It would also mean that, in the US, insurance companies would only offer it to people with diagnosed conditions.

...

"Alan Woodward at the University of Surrey, UK, says so-called internet of things (IoT) devices, such as heart rate monitors and CCTV cameras, have often been found to have security flaws, including a smart toilet with a computer-controlled bidet. He fears the same could be true for medical-focused smart toilets. “With a lot of IoT devices, security has never been uppermost in the mind and yet something like a smart toilet is collecting some very personal data,” he says. “They're making these weird devices because they can, but nobody's thought through ‘should we?’”

#########

See also (for the first instance of that quote I can find):

‘Smart toilet’ monitors for signs of disease. A disease-detecting “precision health” toilet can sense multiple signs of illness through automated urine and stool analysis, a new Stanford study reports.  April 6, 2020 - By Hanae Armitage, Stanford Medicine News

"One of the most important aspects of the smart toilet may well be one of the most surprising — and perhaps unnerving: It has a built-in identification system. “The whole point is to provide precise, individualized health feedback, so we needed to make sure the toilet could discern between users,” Gambhir said. “To do so, we made a flush lever that reads fingerprints.” The team realized, however, that fingerprints aren’t quite foolproof. What if one person uses the toilet, but someone else flushes it? Or what if the toilet is of the auto-flush variety?

"They added a small scanner that images a rather camera-shy part of the body. You might call it the polar opposite of facial recognition. In other words, to fully reap the benefits of the smart toilet, users must make their peace with a camera that scans their anus.

“We know it seems weird, but as it turns out, your anal print is unique,” Gambhir said. The scans — both finger and nonfinger — are used purely as a recognition system to match users to their specific data. No one, not you or your doctor, will see the scans."

#######

Also, Meet the winners of the 2023 Ig Nobel Prizes 

"Public Health Prize

Citation: "Seung-min Park, for inventing the Stanford Toilet, a device that uses a variety of technologies—including a urinalysis dipstick test strip, a computer vision system for defecation analysis, an anal-print sensor paired with an identification camera, and a telecommunications link—to monitor and quickly analyze the substances that humans excrete."

Monday, June 12, 2023

Data privacy concerns in the U.S. and Europe

A selection from many news stories that touch on data privacy concerns (in the U.S. about Tiktok, in Europe about Facebook...and about DNA):

From the NYT:

Driver’s Licenses, Addresses, Photos: Inside How TikTok Shares User Data. Employees of the Chinese-owned video app have regularly posted user information on a messaging and collaboration tool called Lark, according to internal documents.  By Sapna Maheshwari and Ryan Mac

"Alex Stamos, the director of Stanford University’s Internet Observatory and Facebook’s former chief information security officer, said securing user data across an organization was “the hardest technical project” for a social media company’s security team. TikTok’s problems, he added, are compounded by ByteDance’s ownership.

“Lark shows you that all the back-end processes are overseen by ByteDance,” he said. “TikTok is a thin veneer on ByteDance.”

********

From the WSJ:

Former ByteDance Executive Claims Chinese Communist Party Accessed TikTok’s Hong Kong User Data. Allegation is made in suit against TikTok parent company; ByteDance says it vigorously opposes the claim. By Georgia Wells

"A former executive at ByteDance, the parent company of the hit video-sharing app TikTok, alleges in a legal filing that a committee of China’s Communist Party members accessed the data of TikTok users in Hong Kong in 2018—a contention the company denies. 

"The former executive claims the committee members focused on civil rights activists and protesters in Hong Kong during that time and accessed TikTok data that included their network information, SIM card identifications and IP addresses, in an effort to identify and locate the users. The former executive of the Beijing-based company said the data also included the users’ communications on TikTok.

From the Guardian:

Revealed: the contentious tool US immigration uses to get your data from tech firms. Documents show Ice has sent Google, Meta and Twitter at least 500 administrative subpoenas for information on their users.  by Johana Bhuiyan

"The US Immigration and Customs Enforcement Agency (Ice) sent tech giants including Google, Twitter and Meta at least 500 administrative subpoenas demanding sensitive personal information of users, documents reviewed by the Guardian show.

"The practice highlights the vast amount of information Ice is trying to obtain without first showing probable cause. Administrative subpoenas are typically not court-certified, which means companies are not legally required to comply or respond until and unless a judge compels them to. The documents showed the firms handing over user information in some cases, although the full extent to which the companies complied is unclear."

**********

From the WSJ:

Meta Fined $1.3 Billion Over Data Transfers to U.S.  Decision places pressure on Washington to implement surveillance changes for Europe to allow Meta to keep the data spigot open.  By Sam Schechner

"Meta’s top privacy regulator in the EU said in its decision Monday that Facebook has for years illegally stored data about European users on its servers in the U.S., where it contends the information could be accessed by American spy agencies without sufficient means for users to appeal."

*********

From the Guardian:

NHS data breach: trusts shared patient details with Facebook without consent. Observer investigation reveals Meta Pixel tool passed on private details of web browsing on medical sites. by Shanti Das

"Records of information sent to the firm by NHS websites reveal it includes data which – when linked to an individual – could reveal personal medical details.

"It was collected from patients who visited hundreds of NHS webpages about HIV, self-harm, gender identity services, sexual health, cancer, children’s treatment and more.

...

"In one case, Buckinghamshire Healthcare NHS trust shared when a user viewed a patient handbook for HIV medication. The name of the drug and the NHS trust were sent to the company along with the user’s IP address and details of their Facebook user ID."

**********

From the NYT:

Your DNA Can Now Be Pulled From Thin Air. Privacy Experts Are Worried. Environmental DNA research has aided conservation, but scientists say its ability to glean information about human populations and individuals poses dangers.  By Elizabeth Anne Brown

"Forensic ethicists and legal scholars say the Florida team’s findings increase the urgency for comprehensive genetic privacy regulations. For researchers, it also highlights an imbalance in rules around such techniques in the United States — that it’s easier for law enforcement officials to deploy a half-baked new technology than it is for scientific researchers to get approval for studies to confirm that the system even works."

**********

From the LA Times:

Microsoft will pay $20 million to settle U.S. charges of illegally collecting children’s data

"Microsoft will pay a fine of $20 million to settle Federal Trade Commission charges that it illegally collected and retained the data of children who signed up to use its Xbox video game console.

"The agency charged that Microsoft gathered the data without notifying parents or obtaining their consent, and that it also illegally held on to the data. Those actions violated the Children’s Online Privacy Protection Act, the FTC stated."

Sunday, June 11, 2023

Digital data yields suspect in Idaho murders (NYT)

The NYT has the story of how a wide-ranging search of a large variety of digital data led to the arrest of a suspect (whose trial hasn't yet begun):

“Inside the Hunt for the Idaho Killer,” by Mike Baker, New York Times, June 10, 2023

"“Online shopping, car sales, carrying a cellphone, drives along city streets and amateur genealogy all played roles in an investigation that was solved, in the end, as much through technology as traditional sleuthing.

...

"A week after the killings, records show, investigators were on the lookout for a certain type of vehicle: Nissan Sentras from the model years 2019 to 2023. Quietly, they ran down details on thousands of such vehicles, including the owners’ addresses, license plate numbers and the color of each sedan.

"But further scrutiny of the video footage produced more clarity, and on Nov. 25 the police in Moscow asked law enforcement agencies to look for a different type of car with a similar shape: white Hyundai Elantras from the model years 2011 to 2013.

"Just across the state border, at Washington State University, campus police officers began looking through their records for Elantras registered there. 

...

"The hunt broadened as investigators vacuumed up more records and data. They had already sought cellphone data for all phones that pinged cell towers within a half-mile of the victims’ house from 3 a.m. to 5 a.m., according to search warrant filings. 

...

"after getting back data on [one of the victim]’s account on the Tinder dating app, detectives asked for details on 19 specific account-holders, including their locations, credit card information and any “private images, pictures or videos” associated with the accounts.

...

"Investigators were also working with a key piece of evidence: a Ka-Bar knife sheath, branded with a U.S. Marine Corps logo, that had been found next to two of the victims. They initially began looking for local stores that may have sold the weapon, and then fanned out.

"A request to Amazon sought the order histories of account holders who had purchased such knives. A follow-up request to eBay focused on a series of specific users, seeking their purchase histories. Some had connections to the area — including one in Idaho and two in Washington State...

...

"Forensic teams had examined the knife sheath and found DNA that did not belong to any of the inhabitants of the house. They ran the sample through the F.B.I.’s database, which contains millions of DNA profiles of past criminal offenders, but according to three people briefed on the case, they did not get a match.

"At that point, investigators decided to try genetic genealogy, a method that until now has been used primarily to solve cold cases, not active murder investigations.

...

"F.B.I. personnel ...{spent] days building out a family tree that began with a distant relative.

"By the morning of Dec. 19, records show, investigators had a name: Bryan Kohberger. He had a white Elantra. He was a student at a university eight miles from the murder scene.

...

"On Dec. 23, investigators sought and received Mr. Kohberger’s cellphone records. The results added more to their suspicions: His phone was moving around in the early morning hours of Nov. 13, but was disconnected from cell networks — perhaps turned off — in the two hours around when the killings occurred.

"Four days later, agents in Pennsylvania managed to retrieve some trash from Mr. Kohberger’s family residence, sending the material to the Idaho State Police forensic lab. Checking it against their original DNA profile, the lab was able to reach a game-changing conclusion: The DNA in the trash belonged to a close relative of whoever had left DNA on the knife sheath.

"Mr. Kohberger was arrested on Dec. 30."


Wednesday, June 7, 2023

Snowden and state surveillance: the view from The Guardian, ten years later

Here's a look back at the Snowden affair (the publication of documents about government surveillance) by the then editor-in-chief of the Guardian, one of the newspapers that took the lead.

Ten years ago, Edward Snowden warned us about state spying. Spare a thought for him, and worry about the future. by Alan Rusbridger

"one story the Guardian published 10 years ago today exploded with the force of an earthquake.

"The article revealed that the US National Security Agency (NSA) was collecting the phone records of millions of Verizon customers. In case anyone doubted the veracity of the claims, we were able to publish the top secret court order handed down by the foreign intelligence surveillance court (Fisa), which granted the US government the right to hold and scrutinise the metadata of millions of phone calls by American citizens.

...this was but the tip of a very large and ominous iceberg.

...

"the Guardian (joined by the Washington Post, New York Times and ProPublica) led the way in publishing dozens more documents disclosing the extent to which US, UK, Australian and other allied governments were building the apparatus for a system of mass surveillance

...

"It led to multiple court actions in which governments were found to have been in breach of their constitutional and/or legal obligations. It led to a scramble by governments to retrospectively pass legislation sanctioning the activities they had been covertly undertaking. And it has led to a number of stable-door attempts to make sure journalists could never again do what the Guardian and others did 10 years ago.

"Even now the British government, in hastily revising the laws around official secrecy, is trying to ensure that any editor who behaved as I did 10 years ago would face up to 14 years in prison.

...

"The British government believed that, by ordering the destruction of the Guardian computers, they would effectively silence us. In fact, we simply transferred the centre of publications to New York, under ​the paper’s then US editor, Janine Gibson.

...

"The notion that the state has no right to enter a home and seize papers was established in English law in the famous case of Entick v Carrington (1765), which later became the basis for the US fourth amendment. In a famous passage, Lord Camden declared: “By the laws of England, every invasion of private property, be it ever so minute, is a trespass.”

"When I went out to talk about the Snowden case to assorted audiences (including, after a suitable gap, at MI5 itself), I would begin by asking who in the audience would be happy to hand over all their papers to a police officer knocking on their front door, even if they assured them they would only examine them if there was sufficient cause.

"Never, in any of these talks, did a single member of any audience raise a hand. Yes, people valued their security and were open to persuasion that, with due process and proper oversight, there would be occasions when the state and its agencies should be granted intrusive powers​ in specific circumstances​. But the idea of blanket, suspicionless surveillance – give us the entire haystack and we’ll search for the needle if and when it suits us – was repellent to most people."

Saturday, May 20, 2023

More governments seek to limit TikTok over data concerns

The NYT has the story:

Governments have expressed concerns that TikTok, which is owned by the Chinese company ByteDance, may endanger sensitive user data.  Sapna Maheshwari and 

In recent months, lawmakers in the United States, Europe and Canada have escalated efforts to restrict access to TikTok, the massively popular short-form video app that is owned by the Chinese company ByteDance, citing security threats.

The White House told federal agencies on Feb. 27 that they had 30 days to delete the app from government devices. A growing number of other countries and government bodies — including Britain and its Parliament, Canada, the executive arm of the European Union, France and New Zealand’s Parliament — have also recently banned the app from official devices. On April 4, Australia became the latest country to announce that it was prohibiting the TikTok app on government devices on advice from intelligence and security agencies.

On March 1, a House committee backed an even more extreme step, voting to advance legislation that would allow President Biden to ban TikTok from all devices nationwide. 

Lawmakers and regulators in the West have increasingly expressed concern that TikTok and its parent company, ByteDance, may put sensitive user data, like location information, into the hands of the Chinese government. They have pointed to laws that allow the Chinese government to secretly demand data from Chinese companies and citizens for intelligence-gathering operations. They are also worried that China could use TikTok’s content recommendations for misinformation.

India banned the platform in mid-2020, costing ByteDance one of its biggest markets, as the government cracked down on 59 Chinese-owned apps, claiming that they were secretly transmitting users’ data to servers outside India.



Saturday, October 29, 2022

The end of anonymous sperm donation...

In Colorado, a new law ending anonymous sperm donation seeks to catch up with the technological developments involving genetic sequencing that have already made the anonymity of sperm or egg donors fairly fragile. Here's an account in JAMA:

The End of Anonymous Sperm Donation in Colorado--A Step Forward to a New Fertility Future in the US? by I. Glenn Cohen, JD; Eli Y. Adashi, MD, MS; Seema Mohapatra, JD, MPH. JAMA. Published online October 24, 2022. doi:10.1001/jama.2022.19471

"On May 31, 2022, Colorado became the first state to effectively ban anonymous gamete donation.1 Starting in 2025, fertility clinics in Colorado must collect identity and medical information from sperm and egg donors and may not match donors that do not agree to such disclosure (the statute uses the word “donor” though in many instances compensation is provided). The new law also requires that the clinics make a request that donors update their contact information and medical history at least once every 3 years. The law provides that a donor-conceived person aged 18 years or older shall be provided donor information upon request. The statute purports to also prohibit fertility clinics outside Colorado from providing gametes to Colorado residents (or individuals located in Colorado) if they do not abide by these rules. The statute also instructs clinics not to match a donor once it is known or reasonably should be known that “25 families have been established using a single donor in or outside of Colorado.”1

...

"Two states, Utah and Washington, have enacted statutes requiring the collecting and sharing of identifying information about a donor with donor-conceived children who request it after reaching the age of 18 years.3 However, both states also permit a donor to opt out, thereby limiting the utility of the laws. By contrast, the UK, Germany, Sweden, France, and many other countries have created mandatory registries that donor-conceived individuals can access when they turn 18 years of age, having an effect similar to the new Colorado law.3,4

"The new Colorado law highlights the gap between the law and reality of gamete donor anonymity in the US outside Colorado. Banks have promised donors anonymity in other US states and prior leaks of donor information from banks’ files have been exceedingly rare, if they ever happened at all; the banks have litigated to protect the identifying information provided by the donor.3 But in a practical sense, the promise of anonymity is now much less thoroughgoing.4 Direct-to-consumer genetic testing has become very common, and it has been estimated that 100 million people worldwide have taken a direct-to-consumer genetic test by 2021.4 Studies estimate that a genetic database covering only 2% of the population could match nearly anyone in that population.4 The combination of direct-to-consumer genetic testing, publicly available information, and social media suggest that many donor-conceived individuals will in fact be able to reidentify their gamete donor."

Saturday, October 22, 2022

Privacy and data gathered by home devices

 Does your robot vacuum cleaner make a map of your house as it moves around, and store it on the web?  Could the fact that your kitchen chairs haven't moved all week allow someone to know that no one is home?  These are the kinds of things that people worry about when thinking of all the data collected by smart devices.

The Washington Post has this story:

Tour Amazon’s dream home, where every appliance is also a spy. Here’s everything Amazon learns about your family, your home and you.  by Geoffrey A. Fowler


"Echo speaker

"Echos respond to the wake word “Alexa” to summon the voice assistant to play music, answer questions, shop and control other devices.

"What it knows: Collects audio recordings through an always-on microphone; keeps voice IDs to differentiate users; detects coughs, barks, snores and other sounds; logs music and news consumption; logs smart-home device activity and temperature; detects presence of people though ultrasound.

"Ring doorbell

"What it knows: Live and recorded video, audio and photos of the outside of your house; when people come and go and you receive packages; status of linked devices like lights.

...

"Kindle or Fire Tablet

"What it knows: What and when you read and watch entertainment and news; when you open, close and how long you use third-party apps; your location.

"Why that matters: Amazon knows exactly how fast you read and how far you actually got through your last novel. Kindles and Fire Tablets are another way Amazon gets to know your tastes, which helps it sell you things.

...

"Roomba vacuum cleaner

"A vacuum cleaner that automatically roams around your house to clean, which Amazon is acquiring in a still-pending deal for $1.7 billion.

"What it knows: Camera identifies obstacles and layout of rooms and furniture; when, how often and where you clean.

"Why that matters: When the deal was announced, some Roomba owners balked at the idea that Amazon might gain access to maps of their home, created by the robots to help them clean. "

Saturday, October 1, 2022

Your digital trail, in cyberspace and in public spaces

 Here are two recent privacy-related stories about how the digital trails we leave can be combined in surprising ways.

From the NYT a story about an artist who became a digital sleuth, to capture people working hard to take casual-seeming Instagram photos of themselves in famous locations.

This Surveillance Artist Knows How You Got That Perfect Instagram Photo. A tech-savvy artist unearthed video footage of people working hard to capture the perfect shot for Instagram. It is a lesson in the artifice of social media and the ubiquity of surveillance.  By Kashmir Hill

"The 24/7 broadcast that Mr. Depoorter watched — titled “Live From NYC’s Times Square!” — was provided by EarthCam, a New Jersey company that specializes in real-time camera feeds. EarthCam built its network of livestreaming webcams “to transport people to interesting and unique locations around the world that may be difficult or impossible to experience in person,” according to its website. Founded in 1996, EarthCam monetizes the cameras through advertising and licensing of the footage.

"Mr. Depoorter realized that he could come up with an automated way to combine these publicly available cameras with the photos that people had posted on Instagram. So, over a two-week period, he collected EarthCam footage broadcast online from Times Square in New York, Wrigley Field in Chicago and the Temple Bar in Dublin.

"Rand Hammoud, a campaigner against surveillance at the global human rights organization Access Now, said the project illustrated how often people are unknowingly being filmed by surveillance cameras, and how easy it has become to stitch those movements together using automated biometric-scanning technologies."

******

From the Washington Post, a story about how data from health apps makes its way to advertisers and others, with device identifiers (e.g. with the identity of your phone...):

Health apps share your concerns with advertisers. HIPAA can’t stop it. From ‘depression’ to ‘HIV,’ we found popular health apps sharing potential health concerns and user identifiers with dozens of ad companies. By Tatum Hunter and Jeremy B. Merrill

"several popular Android health apps including Drugs.com Medication Guide, WebMD: Symptom Checker and Period Calendar Period Tracker gave advertisers the information they’d need to market to people or groups of consumers based on their health concerns.

"The Drugs.com Android app, for example, sent data to more than 100 outside entities including advertising companies, DuckDuckGo said. Terms inside those data transfers included “herpes,” “HIV,” “adderall” (a drug to treat attention-deficit/hyperactivity disorder), “diabetes” and “pregnancy.” These keywords came alongside device identifiers, which raise questions about privacy and targeting."

Thursday, September 29, 2022

What is needed to gain support for effective algorithms in hiring, etc.?

 Here's an experiment motivated in part by European regulations on transparency of algorithms.

Aversion to Hiring Algorithms: Transparency, Gender Profiling, and Self-Confidence  by Marie-Pierre Dargnies, Rustamdjan Hakimov and Dorothea Kübler

Abstract: "We run an online experiment to study the origins of algorithm aversion. Participants are either in the role of workers or of managers. Workers perform three real-effort tasks: task 1, task 2, and the job task which is a combination of tasks 1 and 2. They choose whether the hiring decision between themselves and another worker is made either by a participant in the role of a manager or by an algorithm. In a second set of experiments, managers choose whether they want to delegate their hiring decisions to the algorithm. In the baseline treatments, we observe that workers choose the manager more often than the algorithm, and managers also prefer to make the hiring decisions themselves rather than delegate them to the algorithm. When the algorithm does not use workers’ gender to predict their job task performance and workers know this, they choose the algorithm more often. Providing details on how the algorithm works does not increase the preference for the algorithm, neither for workers nor for managers. Providing feedback to managers about their performance in hiring the best workers increases their preference for the algorithm, as managers are, on average, overconfident."

"Our experiments are motivated by the recent debates in the EU over the legal requirements for algorithmic decisions. Paragraph 71 of the preamble to the General Data Protection Regulation (GDPR) requires data controllers to prevent discriminatory effects of algorithms processing sensitive personal data. Articles 13 and 14 of the GDPR state that, when profiling takes place, people have the right to “meaningful information about the logic involved” (Goodman and Flaxman 2017). While the GDPR led to some expected effects, e.g., privacy-oriented consumers opting out of the use of cookies (Aridor et al. 2020), the discussion over the transparency requirements and the constraints on profiling is still ongoing. Recently, the European Parliament came up with the Digital Services Act (DSA), which proposes further increasing the requirements for algorithm disclosure and which explicitly requires providing a profiling-free option to users, together with a complete ban on the profiling of minors. Our first treatment that focuses on the workers aims at identifying whether making the algorithm gender-blind and therefore unable to use gender to discriminate, as advised in the preamble of the GDPR and further strengthened in the proposed DSA, increases its acceptance by the workers. The second treatment is a direct test of the importance of the transparency of the algorithm for the workers. When the algorithm is made transparent in our setup, it becomes evident which gender is favored. This can impact algorithm aversion differently for women and men, for example if workers’ preferences are mainly driven by payoff maximization.

"The treatments focusing on the managers’ preferences aim at understanding why some firms are more reluctant than others to make use of hiring algorithms. One possible explanation for not adopting such algorithms is managerial overconfidence. Overconfidence is a common bias, and its effect on several economic behaviors has been demonstrated (Camerer et al. 1999, Dunning et al. 2004, Malmendier and Tate 2005, Dargnies et al. 2019). In our context, overconfidence is likely to induce managers to delegate the hiring decisions to the algorithm too seldom. Managers who believe they make better hiring decisions than they actually do, may prefer to make the hiring decisions themselves. Our paper will provide insights about the effect of overconfidence on the delegation of hiring decisions to algorithms. Similar to the treatments about the preferences of workers, we are also interested in the effect of the transparency of the algorithm on the managers’ willingness to delegate the hiring decisions. Disclosing the details of the algorithm can increase the managers’ trust in the algorithm."

Thursday, August 18, 2022

Facebook data, abortion prosecution, and search warrants

 The Guardian has the story:

Facebook gave police their private data. Now, this duo face abortion charges. Experts say it underscores the importance of encryption and minimizing the amount of user data tech companies can store. by Johana Bhuiyan

"In the wake of the supreme court’s upheaval of Roe v Wade, tech workers and privacy advocates expressed concerns about how the user data tech companies stored could be used against people seeking abortions.  

...

"when local Nebraska police came knocking in June – before Roe v Wade was officially overturned – Facebook handed the user data of a mother and daughter facing criminal charges for allegedly carrying out an illegal abortion. Private messages between the two discussing how to obtain abortion pills were given to police by Facebook, according to the Lincoln Journal Star. The 17-year-old, reports say, was more than 20 weeks pregnant. In Nebraska, abortions are banned after 20 weeks of pregnancy. The teenager is now being tried as an adult."

********

And the Washington Post focuses on search warrants:

Search warrants for abortion data leave tech companies few options. Facebook’s role in a Nebraska case underscores the risks of communicating on unencrypted apps. By Naomi Nix and Elizabeth Dwoskin 

"Prosecutors and local law enforcement have strict rules they must follow to obtain individuals’ private communications or location data to bolster a legal cases. Once a judge grants a request for users’ data, tech companies can do little to avoid complying with the demands.

...

“If the order is valid and targets an individual, the tech companies will have relatively few options when it comes to challenging it,” said Corynne McSherry, legal director at the privacy advocacy group Electronic Frontier Foundation. “That’s why it’s very important for companies to be careful about what they are collecting because if you don’t build it, they won’t come.”

************

And then there's this to watch out for, also from the Guardian:

How private is your period-tracking app? Not very, study reveals. Research on more than 20 apps found that the majority collected large amounts of personal data and shared it with third parties.  by Kari Paul

*******

The Washington Post offers some advice on keeping your data private (it's not so easy...)

Seeking an abortion? Here’s how to avoid leaving a digital trail. Everything you should do to keep your information safe, from incognito browsing to turning off location tracking.  By Heather Kelly, Tatum Hunter and Danielle Abril