Is there a threat to human rights from the collection and use of data by private companies? If there is, what should be done about it? The Joint Committee on Human Rights of the UK Parliament is seeking views on this in a new inquiry.
While states are responsible for protecting privacy rights under human rights law, in the committee's view it is arguably private companies which provide digital infrastructure, products and services that have the greatest impact on rights. Personal data is a valuable commodity and a major asset, routinely collected, analysed, aggregated and stored on a massive scale.
Using such data, for example, it is possible to target junk food advertising at children most vulnerable to unhealthy lifestyles; or to exclude certain ethnic groups from seeing housing advertisements, or certain age groups from job ads. Algorithms can be unintentionally discriminatory. Metadata can be used to deduce someone’s background, religion, political beliefs, gender identity, and – as researchers at Stanford University have shown – even medical conditions.
Since the Facebook/Cambridge Analytica scandal broke last year, the role of big tech companies such as Facebook, Google and Apple in protecting the right to privacy has come under increasing scrutiny, with growing pressure for regulation. Meanwhile, the rapid development of artificial intelligence presents some of the most challenging ethical and social questions – in both the public and private sectors.
The committee is urgently considering whether existing safeguards to regulate the collection, use, tracking, retention and disclosure of personal data by private companies are sufficient to protect human rights.
While the key human right at risk is the right to private and family life (article 8 ECHR), freedom of expression (article 10), freedom of association (article 11), and non-discrimination (article 14) are also at risk. The right to privacy is also protected by the UN Guiding Principles on Business and Human Rights, and in domestic law by the Data Protection Act 2018.
The committee seeks written evidence, including examples, on the use of data by private companies, with a focus on these questions:
- Are some uses of data by private companies so intrusive that states would be failing in their duty to protect human rights if they did not intervene? If so, what uses are too intrusive, and what rights are potentially at issue?
- Are consumers and individuals aware of how their data is being used, and do they have sufficient real choice to consent to this?
- What regulation is necessary and proportionate to protect individual rights without interfering unduly with freedom to use and develop new technology?
- If action is needed, how much can be done at national level, and how much needs international cooperation?
- To what extent do international human rights standards, such as the UN Guiding Principles on Business and Human Rights, have a role to play in preventing private companies from breaching individuals' rights to privacy?
Further information and the written submission form are available on the committee's website. The deadline for responses is 31 January 2019.