Ohrvik states that many people in the UK have been calling for an internet regulator to carry out various digital-safety responsibilities. For instance, the NSPCC (National Society for the Prevention of Cruelty to Children) called for an internet regulator to ensure that children are safe online. Similarly, the Digital, Culture, Media and Sport Committee called for an ethical code of practice for social media platforms and large search engines.
With so much public discussion of an independent internet regulatory body, Doteveryone decided to put forward its own set of proposals. It had previously carried out a survey of public attitudes toward, and understanding of, digital technologies. One of the main things survey respondents emphasized was greater accountability from tech companies; they were also supportive of the idea of an independent internet regulator.
“We spoke to lots of people, we did some of our own thinking and we were trying to imagine what this independent internet regulator might look like. But…we uncovered some more sort of deep-rooted systemic challenges that a single internet regulator couldn't really tackle,” said Ohrvik.
The systemic challenges Ohrvik presented are: regulators need better digital capabilities, society needs agency, and policymaking needs evidence.
Ohrvik cites the example of Christopher Wylie, the whistleblower in the Cambridge Analytica scandal. According to Wylie, one of the weak points of the system is a lack of tech knowledge. The fact that the Information Commissioner’s Office (the UK’s data regulator) asked him many basic questions that a database engineer would not normally be asked is indicative of the broader challenges facing the regulatory system.
The second challenge is that society needs agency, so that the public can regain its trust in tech. Ohrvik notes that in Doteveryone’s survey, when people were asked for their views on reading terms and conditions, 58% said they don’t read them, 47% felt they have no choice but to accept terms and conditions on the internet, and 43% said there’s no point in reading them because tech companies will do what they want anyway. This last figure especially signals a wider trend today in which the public feels disempowered and cynical towards tech.
This is one of the main reasons why Ohrvik believes a regulatory system is needed to “re-energize” the public and give them “more power”.
The third challenge, Ohrvik states, is that it’s hard to gather evidence about online harms and about some of the opportunities that arise from digital technologies.
Ohrvik then discussed the importance of a separate Office for Responsible Technology to counteract the systemic challenges listed above.

Ohrvik states that the Office for Responsible Technology would do three broad things: empower regulators, inform policymakers and the public, and support people to seek redress.
Empowering regulators would include analyzing the processes regulators have in place to ensure they are up to date, and recommending the changes the government would need to make to put the right plan into action. Another key requirement is building up regulators’ digital capabilities, done in a way that lets them pay for tech talent across the whole regulatory system, which in turn would help them understand the challenges posed by digital technologies.
ODI: Regulating for responsible technology
Empowering regulators would also help shift their role from reactive and slow towards proactive and fast-moving.
Informing policymakers and the public would involve communicating developments in tech regulation, offering guidance, and making longer-term engagements to promote positive change in the public’s relationship with digital technologies.
For instance, a long-term campaign centered on media literacy could be conducted to tackle misinformation. Similarly, a long-term campaign could help people better understand their data rights.
Supporting people to seek redress is aimed at addressing the power imbalance between the public and tech companies. This can be done by auditing the processes, procedures, and technologies that tech companies have in place to protect the public from harm.
For instance, spot checks could be carried out on algorithms or artificial intelligence systems to detect harmful content, and complaint-handling and moderation processes could be checked at the same time to make sure they’re working well. If certain processes fail the public, redress can then be provided more easily. Spotting harms at an early stage would further help people and strengthen the regulatory system.
In all, an Office for Responsible Technology is indispensable for promoting the responsible design of technologies and anticipating their impact on society. By working with regulators to develop approaches that support responsible innovation, an Office for Responsible Technology can foster a healthy digital space for everyone.