Market Snapshot analyzes consumer research on attitudes toward data sharing and personal experiences with COVID-19
DALLAS, Oct. 13, 2020 – Parks Associates reports that among heads of US broadband households ages 18-24, 90% are willing to share smartphone data if privacy protections are offered. In contrast, just over 63% of those ages 65 and older are willing to share data.
Households with higher incomes and higher levels of education are also more likely to share their data than those with lower incomes and lower educational attainment.
“As COVID-19 continues to spread, more people will know someone who has contracted COVID-19, which will likely increase their willingness to share smartphone data,” said Jennifer Kent, Senior Director, Parks Associates. “Already 93% of US broadband households report lifestyle changes to limit the spread of the disease, so smartphone data in aid of contact tracing gives consumers an opportunity to take an active role in combating the virus.”
Before I became a reporter at NPR, I worked for a few years at tech companies.
One of the companies was in the marketing technology business — the industry that’s devoted in part to tracking people and merging their information, so they can be advertised to more effectively.
That tracking happens in multiple senses: Physical tracking, because we carry our phones everywhere we go. And virtual tracking, of all the places we go online.
The more I understood how my information was being collected, shared and sold, the more I wanted to protect my privacy. But it’s still hard to know which of my efforts are actually effective and which are a waste of time.
So I reached out to experts in digital security and privacy to find out what they do to protect their stuff – and what they recommend most.
Amazon announced a new palm-recognition system last week that lets people shop in two of its Amazon Go stores by scanning their palm at the entrance. The store automatically tracks what products they pick up and then charges the credit card associated with their hand.
It’s the latest in a long line of product announcements from the company to raise privacy or security concerns while selling its vision of an automated, frictionless future.
Called Amazon One, the palm-scanning system is currently in only two Go stores in Seattle, but with the massive online retailer behind it, it has the potential to become a standard form of payment or even identification. Amazon’s plan is to start selling it as a service to other companies, such as retail stores, office buildings that use ID badges for entry, or stadiums that require tickets for events.
Proposition 24 on the November ballot is pitched as an expansion of California’s already robust consumer data privacy law, an iron cuff on the claws of companies that profit handsomely from tracking and selling your online search, travel and purchase habits to marketers.
But the technology giants seemingly square in its sights — the likes of Facebook, Amazon and Google — haven’t shown up to the battlefield. Instead, those opposing the new California Privacy Rights Act are some of the same types of consumer, labor and civil rights advocates who support its predecessor.
“Proposition 24 is a wolf in sheep’s clothing,” said Richard Holober, president of the Consumer Federation of California and a leader of the No on 24 effort. “It’s loaded with giveaways to tech companies.”
Not so, said Alastair Mactaggart, the East Bay real estate developer who spurred California’s adoption two years ago of the country’s toughest data privacy law.
SAN FRANCISCO (AP) — The Fitbits on our wrists collect our health and fitness data; Apple promises privacy but lots of iPhone apps can still share our personal information; and who really knows what they’re agreeing to when a website asks, “Do You Accept All Cookies?” Most people just click “OK” and hope for the best, says former Democratic presidential candidate Andrew Yang.
“The amount of data we’re giving up is unprecedented in human history,” says Yang, who lives in New York but is helping lead the campaign for a data privacy initiative on California’s Nov. 3 ballot. “Don’t you think it’s time we did something about it?”
Yang is chairing the advisory board for Proposition 24, which he and other supporters see as a model for other states as the U.S. tries to catch up with protections that already exist in Europe.
It could be the wackiest product yet from Amazon — a tiny indoor drone which buzzes around people’s homes as a security sentry.
The introduction of the Ring Always Home Cam planned for 2021 has opened up fresh debate on the potential for intrusive surveillance and privacy infringement.
Amazon says the tiny drone is “built with privacy in mind” and operates at the direction of its customers. Nestled in a charging dock, the drone can be deployed remotely and send up to five minutes of video to the user.
But some activists express concerns about the device — part of a family of Ring-branded home security technology which has been scrutinized over its links to law enforcement.
John Verdi, vice president of policy at the Future of Privacy Forum, a Washington think tank, said the deployment may contribute to a “normalization of surveillance” in everyday life as more consumers install such devices in their homes.
The companies powering our connected lives know our names and addresses, political preferences, moods and anxieties, and the rabbit holes we fall down late at night. When it comes to data privacy, they’re often considered the bad guys, but they have the power to be the good guys too.
PARIS (Reuters) – France’s data privacy watchdog CNIL recommended on Thursday that websites operating in the country should keep a register of internet users’ refusal to accept online trackers known as cookies for at least six months.
In specifying a registration timeframe, the guideline goes beyond European Union-wide data privacy rules adopted two years ago, adding an extra hurdle that a data protection lawyer said would put some of the companies exploiting such tools to target advertising out of business.
Under the CNIL guideline, which the watchdog said must be adopted by March, internet users have the right to withdraw their consent on cookies – small pieces of data stored while navigating on the Web – at any time and they can refuse trackers when they go on a website.
“The internet user’s silence actually implies a refusal (to accept cookies),” said Etienne Drouard of American-British law firm Hogan Lovells.
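The register CNIL describes could be as simple as an append-only log of consent decisions with a retention cutoff. Here is a minimal sketch in Python; the field names, the pseudonymization step, and the exact retention arithmetic are illustrative assumptions, not anything specified by CNIL:

```python
import hashlib
import time

RETENTION_SECONDS = 183 * 24 * 3600  # roughly six months, per the CNIL guideline


def record_choice(register: list, user_id: str, accepted: bool) -> None:
    # Store a pseudonymized identifier rather than the raw user ID,
    # so the register itself does not become a new tracking dataset.
    register.append({
        "user": hashlib.sha256(user_id.encode()).hexdigest(),
        "accepted": accepted,
        "timestamp": time.time(),
    })


def purge_expired(register: list, now: float) -> list:
    # Keep only entries still inside the retention window.
    return [e for e in register if now - e["timestamp"] <= RETENTION_SECONDS]
```

A site would call `record_choice` whenever a visitor accepts or refuses cookies, and run `purge_expired` periodically so refusals are kept for the mandated period but no longer.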
The National Institute of Standards and Technology (NIST) is launching the Differential Privacy Temporal Map Challenge. It’s a set of contests, with cash prizes attached, that’s intended to crowdsource new ways of handling personally identifiable information (PII) in public safety datasets.
The problem is that although rich, detailed data is valuable for researchers and for building AI models — in this case, in the areas of emergency planning and epidemiology — it raises serious and potentially dangerous data privacy and rights issues. Even if datasets are kept under proverbial lock and key, malicious actors can, based on just a few data points, re-infer sensitive information about people.
The solution is to de-identify the data so that it remains useful without compromising individuals’ privacy. NIST already has a clear standard for what that means. In part, and simply put, it says that “De-identification removes identifying information from a dataset” so that individuals can no longer be re-identified from what remains.
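Differential privacy, the technique named in NIST’s challenge, can be illustrated with a minimal sketch: a counting query gets calibrated Laplace noise, so any single person’s presence or absence changes the published result only slightly. The dataset and function names below are illustrative assumptions, not part of the NIST challenge itself:

```python
import math
import random


def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))


def private_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the true count by at most 1, so Laplace noise with scale
    # 1/epsilon yields epsilon-differential privacy for this query.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)


# Hypothetical dataset: ages of survey respondents.
ages = [23, 45, 67, 34, 29, 71, 52, 38]
noisy = private_count(ages, lambda age: age >= 65, epsilon=0.5)
```

Smaller values of `epsilon` mean more noise and stronger privacy; the noisy answer is released instead of the exact count, which is what blocks the re-inference attacks described above.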
Sometimes the light Kiana Caton is forced to use gives her a headache. On top of common concerns that come with taking a state bar exam — like whether you pass the test — Caton has to deal with challenges presented by facial recognition technology. She’s a Black woman, and facial recognition tech has a well-documented history of misidentifying women with melanin. Analysis by the federal government and independent research like the Gender Shades project has proved this repeatedly. The European Conference on Computer Vision also recently found algorithms don’t work as well on Black women as they do on other people.
Ok @ExamSoft support told me to “sit directly in front of a lighting source such as a lamp.” I’m receiving the same issue preventing me from completing the NY UBE mock exam. Facial recognition technology is racist. @DiplomaPriv4All do y’all think I have “adequate lighting”? pic.twitter.com/7tFdwfpyHB