COVID Tracing: A Privacy Minefield

WHEN THE NOTION of enlisting smartphones to help fight the Covid-19 pandemic first surfaced last spring, it sparked a months-long debate: Should apps collect location data, which could help with contact tracing but potentially reveal sensitive information? Or should they take a more limited approach, only measuring Bluetooth-based proximity to other phones? Now, a broad survey of hundreds of Covid-related apps reveals that the answer is all of the above. And that’s made the Covid app ecosystem a kind of wild, sprawling landscape, full of potential privacy pitfalls.

Late last month Jonathan Albright, director of the Digital Forensics Initiative at the Tow Center for Digital Journalism, released the results of his analysis of 493 Covid-related iOS apps across dozens of countries. His study of those apps, which tackle everything from symptom-tracking to telehealth consultations to contact tracing, catalogs the data permissions each one requests. At WIRED’s request, Albright then broke down the data set further to focus specifically on the 359 apps that handle contact tracing, exposure notification, screening, reporting, workplace monitoring, and Covid information from public health authorities around the globe.

The results show that only 47 of that subset of 359 apps use Google and Apple’s more privacy-friendly exposure-notification system, which restricts apps to only Bluetooth data collection. More than six out of seven Covid-focused iOS apps worldwide are free to request whatever privacy permissions they want, with 59 percent asking for a user’s location when in use and 43 percent tracking location at all times. Albright found that 44 percent of Covid apps on iOS asked for access to the phone’s camera, 22 percent of apps asked for access to the user’s microphone, 32 percent asked for access to their photos, and 11 percent asked for access to their contacts.
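
To make those numbers concrete: on iOS, each of those permissions corresponds to a usage-description string in the app’s Info.plist plus a runtime prompt the app triggers in code. Here is a minimal sketch of how an app requests the two tiers of location access Albright counted; the class name is our own illustration, not code from any app in the study:

```swift
import CoreLocation

// Info.plist must contain usage strings, e.g.:
//   NSLocationWhenInUseUsageDescription (foreground-only access)
//   NSLocationAlwaysAndWhenInUseUsageDescription ("at all times" access)

final class LocationPermissionRequester: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // The "when in use" tier: 59 percent of the surveyed apps ask for this.
    func requestWhenInUse() {
        manager.requestWhenInUseAuthorization()
    }

    // The "at all times" tier: 43 percent of the surveyed apps ask for this.
    func requestAlways() {
        manager.requestAlwaysAuthorization()
    }

    // Called whenever the user grants or revokes access (iOS 14+).
    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        print("Location authorization status:", manager.authorizationStatus.rawValue)
    }
}
```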

“It’s hard to justify why a lot of these apps would need your constant location, your microphone, your photo library,” Albright says. He warns that even for Covid-tracking apps built by universities or government agencies—often at the local level—that introduces the risk that private data, sometimes linked with health information, could end up out of users’ control. “We have a bunch of different, smaller public entities that are more or less developing their own apps, sometimes with third parties. And we don’t know where the data’s going.”

The relatively low number of apps that use Google and Apple’s exposure-notification API compared to the total number of Covid apps shouldn’t be seen as a failure of the companies’ system, Albright points out. While some public health authorities have argued that collecting location data is necessary for contact tracing, Apple and Google have made clear that their protocol is intended for the specific purpose of “exposure notification”—alerting users directly to their exposure to other users who have tested positive for Covid-19. That excludes the contact tracing, symptom checking, telemedicine, and Covid information and news that other apps offer. The two tech companies have also restricted access to their system to public health authorities, which has limited its adoption by design.
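
For comparison, an app built on the exposure-notification API never touches GPS at all; it asks the operating system to broadcast and collect rotating Bluetooth identifiers on its behalf. Below is a rough sketch of what enabling it looks like with Apple’s ExposureNotification framework, with error handling trimmed; note that the required entitlement is granted only to public health authorities:

```swift
import ExposureNotification

// The com.apple.developer.exposure-notification entitlement is required,
// and Apple issues it only to recognized public health authorities.
let manager = ENManager()

manager.activate { error in
    if let error = error {
        print("Activation failed:", error)
        return
    }
    // Enabling this starts the Bluetooth-only identifier exchange;
    // no location permission is requested or needed.
    manager.setExposureNotificationEnabled(true) { error in
        if let error = error {
            print("Could not enable exposure notifications:", error)
        }
    }
}
```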

But Albright’s data nonetheless shows that many US states, local governments, workplaces, and universities have opted to build their own systems for Covid tracking, screening, reporting, exposure alerts, and quarantine monitoring, perhaps in part due to Apple and Google’s narrow focus and data restrictions. Of the 18 exposure-alert apps that Albright counted in the US, 11 use Google’s and Apple’s Bluetooth system. Two of the others are based on a system called PathCheck Safeplaces, which collects GPS information but promises to anonymize users’ location data. Others, like Citizen Safepass and the CombatCOVID app used in Florida’s Miami-Dade and Palm Beach counties, ask for access to users’ location and Bluetooth proximity information without using Google’s and Apple’s privacy-restricted system. (The two Florida apps asked for permission to track the user’s location in the app itself, strangely, not in an iOS prompt.)
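
Albright’s data doesn’t show how PathCheck’s promised anonymization works under the hood, but a common approach to that kind of promise is to coarsen coordinates on the device before they are stored or shared, so a saved point maps to a broad grid cell rather than a street address. A hedged sketch of that general idea, not PathCheck’s actual implementation:

```swift
import Foundation

/// Coarsens a GPS fix by rounding coordinates to a fixed number of
/// decimal places. At two decimals, one cell of latitude spans roughly
/// 1.1 km, so the result no longer pinpoints a specific building.
func coarsen(latitude: Double, longitude: Double, decimals: Int = 2) -> (lat: Double, lon: Double) {
    let factor = pow(10.0, Double(decimals))
    return ((latitude * factor).rounded() / factor,
            (longitude * factor).rounded() / factor)
}

// Example: a point in downtown San Diego collapses into a ~1 km grid cell.
let blurred = coarsen(latitude: 32.71574, longitude: -117.16108)
print(blurred) // (lat: 32.72, lon: -117.16)
```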


Article Source: WIRED

They Hacked A Coffee Maker

With the name Smarter, you might expect a network-connected kitchen appliance maker to be, well, smarter than companies selling conventional appliances. But in the case of Smarter’s internet-of-things coffee maker, you’d be wrong.
As a thought experiment, a researcher reverse-engineered one of the company’s older coffee makers to see what kinds of hacks he could pull off with it. After just a week of effort, the unqualified answer was: quite a lot. Specifically, he could trigger the coffee maker to turn on the burner, dispense water, spin the bean grinder, and display a ransom message, all while beeping repeatedly. Oh, and by the way, the only way to stop the chaos was to unplug the power cord.
Yes, anything can be hacked. Welcome to the Internet of Things. If you haven’t heard this term before, here’s a simple definition:
In the broadest sense, the term IoT encompasses everything connected to the internet, but it is increasingly being used to define objects that “talk” to each other. “Simply, the Internet of Things is made up of devices – from simple sensors to smartphones and wearables – connected together,” says Matthew Evans, the IoT programme head at techUK.
But do we really need to be that connected? Do I need to start my coffee while I’m driving home from work at the expense of my privacy? What do you think?

Kids’ Smartwatch Security Nightmare

Five out of six brands tested by researchers would have allowed hackers to track kids—and in some cases eavesdrop on them.
CONNECTING EVERY POSSIBLE device in our lives to the internet has always represented a security risk. But that risk is far more pronounced when it involves a smartwatch strapped to your child’s wrist. Now, even after years of warnings about the security failings of many of those devices, one group of researchers has shown that several remain appallingly easy for hackers to abuse.
Sebastian Schinzel is a Münster University computer scientist who worked on the study and presented it at the International Conference on Availability, Reliability, and Security in late August. When WIRED asked Schinzel whether three years of security analyses gave him the confidence to put these smartwatches on his own children, he answered without hesitation: “Definitely not.”
What are your thoughts? We would recommend NOT giving smartwatches to children, and limiting their use of personal electronics in general, for both health and privacy reasons.


How Coronavirus Is Eroding Privacy


“In South Korea, investigators scan smartphone data to find within 10 minutes people who might have caught the coronavirus from someone they met. Israel has tapped its Shin Bet intelligence unit, usually focused on terrorism, to track down potential coronavirus patients through telecom data. One U.K. police force uses drones to monitor public areas, shaming residents who go out for a stroll.

American officials are drawing cellphone location data from mobile advertising firms to track the presence of crowds—but not individuals. Apple Inc. and Alphabet Inc.’s Google recently announced plans to launch a voluntary app that health officials can use to reverse-engineer sickened patients’ recent whereabouts—provided they agree to provide such information…”

We didn’t write it; we’re just sharing the information. Find the rest of the article HERE.


San Diego Mass Surveillance Without Oversight

San Diego is now home to the largest mass surveillance operation in the country.

While the California Legislature passed and Governor Brown signed the California Consumer Privacy Act (CCPA) into law in 2018, ostensibly to help California consumers protect their online data, state and local governments don’t appear to be required to comply with it.

Recently we learned that the Department of Motor Vehicles is earning more than $50 million a year by selling California drivers’ personal information, and the public is offered no way to opt out of this sharing.

Now we learn that San Diego City Attorney Mara Elliott approved a deal with General Electric in 2017 to outfit 4,000 new “smart street lights” with cameras and microphones. These CityIQ nodes are listed on this city map.
The City of San Diego now appears to be in the business of enabling mega-data companies to cash in on city residents’ privacy.

Want to know more? Check out the rest of the story HERE.

How To Delete The Siri Recordings Apple Has Saved Of You


Apple is giving users more control over their data. “In July, Apple admitted it had been storing and listening to users’ interactions with the digital assistant as a way of improving Siri. The recordings didn’t have identifiable information, but could potentially contain private conversations. Shortly thereafter, Apple paused the program and promised to give users the option to opt in to it, which we’re now seeing in iOS 13.2…”

We care about your privacy, and wanted to share the rest of this article with you HERE.