2024: regulation and cybersurveillance, a pivotal year for the cloud market

7 Feb 2024 | Press

While 2024 is undoubtedly the year of generative AI, it will also be decisive for cloud providers both through their direct contribution to the subject of AI and through the creation or evolution of certain regulations.

In both Europe and the United States, several pieces of legislation are currently under discussion that will have a major impact on the global cloud landscape, with concrete consequences for European organizations in terms of their technological choices. More than ever, decision-makers need to adopt a global risk management approach to anticipate these changes.

The European Union is trying to get member states to agree on a European certification scheme and to harmonize the various national “trusted cloud” certifications, such as the SecNumCloud scheme run in France by ANSSI. The aim? To provide a harmonized, readable level of trust for all European organizations using cloud services, and less complexity for operators currently subject to requirements that are fragmented from one member state to another.

But the issues at stake here are far more political than technical, and revolve around the criterion of immunity to extraterritorial laws. Whether or not such a clause is included, as it is in France’s SecNumCloud (v3.2), will play a structuring role in the evolution of the cloud market in Europe.

On the other side of the Atlantic, the extension until April 2024 of Section 702 of the Foreign Intelligence Surveillance Act (FISA), and its uncertain future – two bills are under discussion to replace it – bring another layer of complexity and uncertainty. As a reminder, FISA governs procedures for the physical and electronic surveillance of foreign individuals and companies. Amended in 2008, it enables the US government, via Section 702, to monitor foreign electronic communications with the assistance of service providers.

Section 702 owes its fame to the Edward Snowden affair. Snowden revealed several mass surveillance programs, such as PRISM, which enabled the NSA to access the communications of foreign Internet users outside the United States through companies such as Microsoft, Apple, Google, Facebook and Skype.

Section 702 has a direct impact on cloud players, as it enables the US government to request information hosted by these “electronic communications service providers” (ECSPs). Access to the information requested by the US authorities is not limited to servers located in the USA, but extends to all servers operated by service providers domiciled there. Where the information is hosted therefore offers no protection.

As for encryption, it raises many questions about its true protective capacity. Don’t American agencies have the capacity and computing power to break any encryption solution? To quote French MP Philippe Latombe: “Encryption is a bit like putting an armored door on your apartment. It’s harder to get in, but you can still get in.” And beyond encryption, what about managing the risks of technological dependence? In the event of a trade or political dispute, the Americans have already demonstrated their ability to react forcefully by denying China access to the semiconductor market. Is it really so hard to imagine a scenario in which the USA taxes, or even cuts off, American cloud services to Europeans? What about after the next US presidential election?

In 2024, one observation is increasingly shared: these regulatory issues can no longer be neglected. Cloud projects – like data, AI and cyber projects – can no longer be seen as purely technological matters. They are also regulatory and strategic.

Upcoming and ongoing regulations, and the ping-pong between the USA and Europe – as evidenced by the third version of the transatlantic data transfer agreement, already back before the European Court of Justice – are creating a gray area. All the more so as this complex and unclear regulatory environment is likely to be rendered obsolete by technological innovation.

Take, for example, the right to be forgotten, enshrined in European law, as it applies to generative AI, which is trained on large volumes of data. What happens to the right to be forgotten once your data has been absorbed by a generative AI? By its very nature, AI cannot forget.

These are all questions that make us realize that we are facing not only major technological upheavals, but also decisive political and strategic choices. With the European elections just a few weeks away, it’s high time these issues were given their rightful place in the campaign. Nothing would be worse than inaction.
