Emphasising the borderless nature of the Internet, Minister of State for Electronics and IT Rajeev Chandrasekhar said that the future of Internet regulation will need harmonisation between the democracies of the world. In an interview with Soumyarendra Barik, he spoke about the upcoming legislation for the online ecosystem, why India may not follow Europe on data protection, and the issue of bots and algorithmic accountability of social media companies. Edited excerpts:
You have said that a new set of laws for the online space, including data protection and a new Information Technology Act, will be out soon. What is the status of these laws?
The digital economy is one of the biggest opportunities for India and a lot of what we are doing today is to accelerate that. The Ministry will soon come up with a set of laws, after extensive public consultation, that will serve as the guiding framework for the next ten years. Start-ups, young entrepreneurs and innovation will be an inherent part of the design of whatever we do.
But how will you ensure that the laws have enough regulatory legs while you allow start-ups some breathing space?
It is a false binary that a choice has to be made between data protection and ease of doing business. Our architecture will ensure that those are not binaries: citizen rights and consumer expectations of data protection will be met, and at the same time we will make it easier for innovators to innovate in India and for investors to invest in those innovators, further growing the digital economy pie.
There has been speculation about dilution of the contentious data localisation norms in the new data protection Bill. There was significant pushback from Big Tech against these norms earlier. In retrospect, how is one to understand that pressure?
Sometimes the debate gets framed around the wrong issue. The issue is not localisation or free data flow. Rather it is protecting data of citizens and making online platforms accountable. We have set the boundary conditions of openness, safety and trust, and accountability for platforms and there is more than one way of ensuring that the data fiduciary is responsible for the security of the data principal’s data.
There is also a reciprocal obligation on the data fiduciary to give law enforcement agencies access to that data in the event of criminal conduct.
Are you perhaps exploring a model similar to standard contractual clauses under EU’s General Data Protection Regulation (GDPR) for data flows?
We are not using GDPR as our peer or our framework for comparison. Their requirements are different and they have come up with a framework. While we read, observe, and understand all the global laws, the GDPR is not particularly the model we are following. We recognise on behalf of the innovators that cross-border data flow is inherent to the nature of the internet. What we will come up with is to address issues of security and consumers’ rights to data protection, and therefore evolve a framework that, again, will not be a binary between whether we localise or not.
How important is it for India to get adequacy status under the GDPR?
I don’t want to say that it is not as important or it is as important. It is an important part of our discourse, because anything digital and data is a multivariable equation. During the consultation, we will figure out whether the weightage is on adequacy, privacy, or ease of doing business. The GDPR is a little bit more absolutist in terms of how they approach data protection. For us, that is not possible, because we have a thriving ecosystem of innovators.
Europe seemed sceptical of the old data protection Bill. Its data protection board, in a 2021 report, had flagged that the national security grounds in our Bill were recurring, broad and vague, and could be used as an excuse to process personal data. We are currently exploring a trade deal with the EU. How should one look at that in the context of the withdrawal of the Bill?
India has the largest digital footprint globally and we are the ones with the most significant momentum in terms of being a player in the future of technology. So, if a body in Europe comments about India’s digital ecosystem, I would respectfully tell them that the days when we used to blindly accept somebody’s view on digital matters as the holy grail are over. We have very sharply defined views which we have laid out in public and are happy to engage with anybody because the future of Internet regulation will need a harmonisation between the democracies of the world since the fundamental nature of the Internet is borderless. I am hoping that under India’s presidency of the G20, we can discuss that openly.
I have no problem today with there being some discourse about our approach not being consistent with somebody else’s approach. I think that will happen for the next one or two years, before we all come to an agreement.
Over the last few years, the most stringent privacy- and platform-related penalties on Big Tech companies like Meta and Google have been imposed by the EU. Do we have enough regulatory teeth to do something like that?
There is rampant data misuse by data fiduciaries, which includes Big Tech. On that, the law will be very clear that if you do that, there will be punitive consequences, in the shape of financial penalties. If there is misuse or non-consensual use of data or any breach, there will 100 per cent be penalties on companies. There also used to be a discussion about the individual citizen having to prove that a harm was committed. I am not particularly of that view.
Peiter Zatko, a former Twitter executive, has alleged that there was an Indian government agent working at the company. The government is yet to react to the claims…
Platforms use algorithms as a shield for intermediary conduct when algorithms are clearly being coded by people whose bias or lack of bias has not been tested. So, if we assume for a minute that the Twitter gentleman is right, you will have people who are either paid or have other ideological incentives that are coding algorithms which decide who is being muted or amplified. That is why I have been insisting on algorithmic accountability since 2018. It is a broader issue than Twitter.
There is no scrutiny on who is coding and it becomes more dangerous when a company hires someone with a dubious political background to code the algorithm. You can imagine the consequences.
Bots are another issue: they can be used to spread misinformation or child pornography, or to defame someone. But it is impossible to prosecute them because they are bots. Having said that, I am not going to be drawn into an argument with somebody deposing 10,000 miles away.
How do you suggest regulating algorithmic accountability?
We have to figure it out. In my opinion, it is not acceptable that bots are not identified. When bots masquerade as a user, and then are responsible for criminal behaviour or user harm, it is a much deeper and important problem. We have some broad ideas, but a lot of those ideas depend on a relationship of accountability that will be defined by law.