
The laws protecting our privacy are too weak



Professor Sandra Wachter is an expert in law, data and AI at the Oxford Internet Institute, University of Oxford. She says that every time your data is collected, "you leave something of yourself." She added that anyone can use your online behavior to infer very sensitive things about you, such as your ethnicity, gender, sexual orientation and state of health.

It's bad enough that companies use these inferences for targeted ads. But it gets even worse if they gain access to truly private data. Would you, for example, feel comfortable if, after a visit to the doctor, Google displayed fertility treatment ads alongside your emails? Or if your health insurer could access your browsing history without your knowledge to decide whether you qualify for coverage?

Last week, we learned that Google had obtained large amounts of unredacted, unanonymized data from healthcare provider Ascension. The files contained test results, diagnoses and hospitalization records for tens of millions of patients.

These records were made available to the Project Nightingale team to develop software that could help improve healthcare. Access to the records is strictly controlled and limited to employees vetted by Ascension. That did not stop Congress and the Department of Health and Human Services from opening investigations.

How could Google obtain this data without the consent of the people involved? In the US, it is legal under the Health Insurance Portability and Accountability Act (HIPAA), and Google and Ascension appear to have followed the law, at least within the provisions that allow data to flow between companies under certain circumstances. Nor is this loophole unique to the US.

"I do not think we can rule that out in the EU," says tech lawyer Neil Brown of decoded.legal. "There are no absolute prohibitions in the GDPR," he said, referring to the European General Data Protection Regulation, which covers the European Union and the wider European Economic Area.

"We do not have time to read 600 pages No one has the time to do this, and everyone knows that no one has the time to do so."

Instead, Brown says, the GDPR "is a set of controls or standards that companies must follow to act in compliance, one of which is that the processing is required for scientific research, so what Google does here can meet the requirements." Until there is case law to test that claim, we are in a gray area.

With data-hungry companies eyeing healthcare, reform may be the only way to stop these quiet deals. Wachter thinks the first step should be to do away with those often-ignored scroll-through consent forms. "Consent is a very bad tool," she said. "We do not have time to read a 600-page privacy policy; no one has the time to do that, and everyone knows that nobody has the time to do it."

Another problem is that data protection laws focus too much on the moment of data collection, Wachter wrote in 2018, and not on what happens after the data has been collected. This is at least one of the advantages of the GDPR, which forces companies to minimize the data they hold about people. Otherwise, even if you gave your consent at the time, you cannot control the conclusions drawn from the data. If a company decides you are a bad credit risk, you cannot challenge it.

These inferences are often the biggest problem, especially in areas where machine learning has been deployed. For this reason, Wachter believes it is time to shift responsibility from the individual to the entity hoarding all that data: she wants to "make it a duty or a responsibility" for whoever collects the data to do so in a responsible and ethical way.

Wachter also believes that a single, one-size-fits-all privacy model will not work in a world where context is critical. You might "want stricter rules for financial regulation," but perhaps fewer if you "do cancer research at a university." Either way, it would be up to each institution, company or organization to prove that it deserves that trust.

An integral part of Wachter's reform proposals is the idea that, like the right to be forgotten, we need a right to "reasonable inferences." That way we could, for example, find out what data influenced a decision and what underlying assumptions were made at the time of data collection.

We have reported on this before: data brokers track our online activities and make completely wrong assumptions about us. When I asked one of the largest US data companies (under the GDPR) what it thought it knew about me, the data was badly flawed. It got wrong even basic facts that were publicly known, like my age, and ignored my marital status in favor of algorithmic inferences.

This will be a problem both now and in the future, especially as organizations trust machines to draw conclusions on their behalf. Facial recognition already affects your employability beyond what's in your resume, and even Facebook uses it as a security measure despite numerous catastrophic data breaches.

In Europe, experts are already calling on legislators to ban the more advanced forms of these social credit scores. In the US there are demands for stricter data protection laws along the lines of the European GDPR. But without specific measures to prevent companies from hoovering up large amounts of sensitive data and running it through their own machine learning systems, more problems lie ahead.

