
Coup Data Volume 1


Coup Data
ANALYZING THE IMPACT OF DATA ON SOCIETY
BASDEVANT AVOCATS, WITH THE SUPPORT OF ALTERMIND
A NEW GENERATION, VOLUME 1, 2021


What is Coup Data?

Originally, "Coup Data" is a phrase coined by Adrien Basdevant, a lawyer in Paris passionate about the impact of technology on society. Coup Data is a platform striking at both digital and societal issues, with the ambition of helping a French and European tech culture rise. To involve everyone in these critical subjects, four goals are set:

1. Giving rise to a new generation of digital pioneers
2. Promoting a European vision on digital topics
3. Providing insights for public policies regarding data & society challenges
4. Defending digital rights

Coup Data promotes knowledge of our data society by interviewing inspiring people with multidisciplinary backgrounds: hard sciences, social sciences, entrepreneurs, lawmakers… Coup Data gives them the opportunity to share their views and experience on key subjects through its "3 Questions To…" series. The platform also provides regular content on topics revolving around law, technology and policy, with a special focus on the digital rights of individuals. It is authored by Adrien Basdevant, with the help of Caroline Leroy-Blanvillain and invited guests.

Join the conversation on: www.coupdata.fr

SUMMARY

EDITOR'S PREFACE
THE "COUP DATA"
What is Coup Data?
ARTICLES
3 Questions to Aurélie Jean
Why Data Ownership is not a Good Idea
3 Questions to Axelle Lemaire
How to Fight Cybercriminality?
3 Questions to Marion Bergeret
Covid19, Individual Control & Biopower
3 Questions to Aude Bernheim & Flora Vincent
When the Average Wants to be the Norm
3 Questions to Sébastien Soriano
A Brief History of Data
TECH OF TOMORROW
Blockchain, by Primavera de Filippi
Libra, by Xavier Lavayssière
Quantum Computing, by Elham Kashefi
ABOUT BASDEVANT AVOCATS AND ALTERMIND


OUR CONTRIBUTORS
MANY THANKS TO…

Aurélie Jean
Computational Scientist, Researcher in Computer Programming, Founder & CEO of In Silico Veritas

Primavera De Filippi
Permanent Researcher at CNRS, Faculty Associate at Harvard University, Legal Scholar, Author & Artist

Sébastien Soriano
Former Chairman of ARCEP, the French regulatory authority for the telecom and postal sectors

Axelle Lemaire
Global Head of Terra Numerata, Former French Secretary of State for Digital and Innovation

Marion Bergeret
Tech Enthusiast, General Counsel at ALAN, Former Associate General Counsel at Sonos

Xavier Lavayssière
Blockchain Regulation Researcher, Entrepreneur

Dr. Elham Kashefi
Researcher in Quantum Computing, Entrepreneur & Co-founder of VeriQloud

Dr. Aude Bernheim & Dr. Flora Vincent
Researchers at the Weizmann Institute of Science, Authors & Co-founders of WAX Science


PARTNER FOREWORD
Why "A New Generation"?

Since the 1990s, our world has gone through major disruptions. Trade is more international than ever, with economies like China and India surging ahead; the environmental imperative has finally imposed itself, forcing us to make profound changes to the way we live and produce; and we spend more time online as almost every aspect of our life becomes at least partly digital. As Ian Goldin put it in his 2016 book, we are living a new 'Age of Discovery'.

The internet has completely transformed the way we live. It has also created a new economy, where people shop, get informed, pick a restaurant, and entertain themselves online. The advent of the smartphone, less than 15 years ago, has relocated the economy: we now constantly carry it with us. Change has come fast, and our institutions are struggling to keep pace. Policy and governance structures lag behind innovations, some of which are so transformative that they are shaking our social fabric and democratic institutions. It is therefore not an exaggeration to conclude that our era, which was rightly described as a 'digital age' by Eric Schmidt and Jared Cohen, is one of 'revolution'.

Data is leading this transformation. Data is everywhere, circulating extremely rapidly, carrying love messages as well as business orders to and from almost anywhere on earth (and beyond). The power of the internet is that it gathers information and makes it accessible, enabling businesses to make much better-informed decisions. In that sense, we are living a truly 'Hayekian' moment: the Austrian economist and Nobel prize winner brilliantly demonstrated how markets help reveal relevant information, reflecting at every moment all the available knowledge in their equilibrium prices. Competition is another key driver of this transformation, and competition dynamics have been reinvigorated by the digital age.
Giant firms have emerged, with turnovers larger than many nations' GDP, the like of which the world has never seen before. There is a glorious explosion of entrepreneurs with the ability to disrupt almost any market. The internet is a boiling pot in which new solutions and products constantly challenge, replace or out-innovate old ones. Every day new concepts emerge, new technologies are tested and then shared at the speed of light. The velocity is mind-bending. If, as Deirdre McCloskey puts it, ideas lead growth and the world, we are definitely living in a time of almost meteoric possibilities.

This whirl of abundance is nevertheless also sometimes perceived as a risk. First, it can pose a social danger. As Philippe Aghion explains, creative destruction, a brilliant concept taken from Schumpeter, makes us richer and freer, but it also hurts some and therefore scares. Second, technology generates anxiety and concern: understanding how digital platforms or AI work is a


complicated matter. This is why we need talented educators, such as Aurélie Jean, author of 'De l'autre côté de la machine', who makes cutting-edge technological science clear and accessible to everyone.

To navigate these uncharted waters, we need the world's leading thinkers to adopt a form of intellectual Esperanto so that their work can be read and understood around the globe. I believe these thought leaders are the true explorers of modern times: patiently and determinedly mapping and sharing with us the knowledge gleaned from these new frontiers. They will help us experience these disruptions as exhilarating, not fear-inducing.

The power of these intellectual guides is greatly multiplied when they work together across disciplines and take the time to simplify complexity. This is why, at Altermind, we co-build solutions with them. Many leading academics and prominent experts, including Adrien Basdevant, have joined our ad hoc project teams. With them we have explained the most complicated technological and data issues to demanding regulatory bodies in ways that have enabled them to grow in understanding and shift paradigms. We invented a new regulatory partnership between an international platform and the French government, based on the law setting out the objectives to be achieved, and companies having the choice of how to achieve them. In this transformational digital age, flexibility is key.

This edition of Coup Data, entitled "A New Generation", will give its readers examples of how brilliant new thinkers are helping us understand and ultimately participate in the digital revolution. I strongly encourage everyone who is interested in data, digital regulation and economics (in short: the future) to read it carefully.

Mathieu Laine
Founder of Altermind


3 QUESTIONS TO AURÉLIE JEAN
Too Brainy for My Computer

"Users are the best enemies of tech companies as they can challenge them and disturb them in order to improve the technologies. But to do so, people need to understand how technologies are built and how they work."

A.B: We often use the metaphor of the "black box" to describe our current data society, in which algorithms help take daily decisions that affect our lives - from the loan we get, the insurance rate we pay, and opportunities at school which influence our success or otherwise, to our potential recidivism risk assessment - in an opaque and incontestable way. As an expert in applied mathematics and computer programming, in your view what tools can be used to promote algorithm explainability? Should we design technical standards or legal obligations, like a right to audit?

A.J: Algorithm explainability is a challenge. That being said, we (designers of algorithms) should all work to make users, decision makers and state organizations understand the mechanisms of a given algorithm. This also provides an opportunity to improve the algorithm thanks to close collaboration between users and designers. This explainability can be shaped in a broad range of ways, and depends on the type of algorithm. Indeed, while explicit algorithms (those explicitly defined by humans) are easily explainable by listing their explicit criteria, hypotheses and logic conditions, implicit algorithms are much less straightforward to explain, as their criteria are implicitly and abstractly defined within what we call the invariants of the neural network created from machine learning.

Once we understand how algorithms are designed and used, it seems obvious that auditing algorithms themselves does not make sense at all. Indeed, it is impossible to provide users with the list of implicit criteria, because the invariants are much too abstract and largely unknown.

Instead, it is more valuable and concrete to audit the best practices of developing, deploying and testing algorithms within an organization. When developing an algorithm, one should guarantee that the data chosen to calibrate or train the algorithm is statistically representative, and that relevant tests on the algorithm are done to identify any risk of underlying bias. One could also list the basic logic mechanisms of the algorithm as well as its purpose: what it is meant to do and what it is not meant to achieve. Providing users and states with the list of best practices for a given algorithm is key to helping those people formulate relevant and well-posed questions, or even possible solutions, to algorithm issues. Auditing those best practices is also important to reveal any data usage abuse, thereby preventing scandals and minimizing the reputational risk of the company that develops those algorithms.

A.B: You claim that everyone should be exposed to coding. Without needing to become a software developer, you are convinced that it is a powerful way to capture the feel


of this discipline, to understand the process of categorization, and the influences on the choice of algorithm. What public policy measures would you advise implementing toward that goal?

A.J: In practice, this can be achieved in schools or in companies in the context of continuous learning. Some methods were created to teach algorithms and coding to kids without using computers, such as the methods proposed by the company COLORI, which were designed for 3-6 year old kids. Coding classes are offered in elementary and middle schools in many countries all over the world.

In France, mandatory and optional computer science classes are now provided to high school students. This is a big deal, as France is one of the few countries in the world to offer such classes country-wide. With regard to adults and professionals, companies should provide their employees with coding and algorithm training to develop their analytical mindset and to enable them to become knowledgeable users of technologies. Even though many employees will never write a single line of code in their career, they will need to work closely with tech people given the growing interest of companies in technologies and the current artificial intelligence transformation. As I often say, coding and algorithms are the next languages to learn!

A.B: We often hear about regulation when it comes to platforms and dominant tech players. According to you, what is the user's role in this equilibrium? Are there responsibilities that one should bear in mind when using such services?

A.J: I believe that we need regulations to frame the development and the usage of such technologies. That being said, those regulations need to be created smartly, to accelerate innovation as well as to concretely protect users.
To reach that goal, those regulations should be written in close collaboration with scientists and developers of technologies who are working in the industry, and not only by legislators. Also, those regulations should not only be framed around the obligations of tech actors, but also around the duties and roles of users. Indeed, having responsibilities helps users be free, as they understand their leverage and role in the ecosystem. Users should be more proactive and understand the main mechanisms of technologies, to better protect and defend themselves as well.

To Go Further
• Too brainy for my computer: how programming will become the next mainstream computer-user evolution
• Algorithms are the wrong usual suspect!
• Who Should Decide How Algorithms Decide?
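The kind of bias test Aurélie Jean recommends running on an algorithm's outputs can be made concrete with a small numeric check. The sketch below is a minimal, hypothetical illustration: the decision data, the two groups, and the use of a four-fifths threshold (a rule of thumb borrowed from US employment-selection guidelines) are assumptions for the example, not part of the interview.

```python
# Minimal sketch of one "best practice" test: checking a decision
# algorithm's outcomes for disparate impact across two groups.
# The decision outputs below are hypothetical.

def approval_rate(decisions):
    """Fraction of positive (1) decisions in a list of 0/1 outcomes."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(decisions_a, decisions_b):
    """Ratio of approval rates between group A and group B.
    Values far below 1.0 suggest the algorithm disadvantages group A."""
    return approval_rate(decisions_a) / approval_rate(decisions_b)

# Hypothetical decision outputs (1 = loan approved) for two groups.
group_a = [1, 0, 0, 1, 0, 0, 0, 1]   # 3/8 approved
group_b = [1, 1, 0, 1, 1, 1, 0, 1]   # 6/8 approved

ratio = disparate_impact_ratio(group_a, group_b)
print(round(ratio, 2))  # 0.5

# A common rule of thumb (the "four-fifths rule") flags ratios
# below 0.8 as potential adverse impact worth investigating.
if ratio < 0.8:
    print("potential bias: review training data and criteria")
```

A real audit would go further (statistical significance, representativeness of the training sample, multiple protected attributes), but even this simple ratio shows how an organization can test outcomes without needing access to a model's opaque internal invariants.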


Why Data Ownership is not a Good Idea
BY ADRIEN BASDEVANT

"Those who claim a right of ownership over personal data forget that raw data, data that is not linked to other data, has very little intrinsic value."

Every microsecond, personal data are being collected, processed and exchanged. But who do they belong to? Individuals, States, platforms, brokers...? While some insist on establishing a right of ownership over data, it does not appear to be a viable solution, for at least two reasons:

1. A right of ownership would imply exclusivity, when it is precisely the circulation and use of data that creates its value.
2. The balance of power between individuals and platforms is clearly unequal. The latter would continue to impose their terms, with no possibility of negotiating, leading to a sale of our personal data at low prices.

Who owns personal data?

Even if our data are called "personal", our ownership of them is not obvious. According to the French Civil Code, ownership comprises three attributes: usus (the right to use), fructus (the right to the fruits), and abusus (the right to dispose). A right of ownership thus implies an exclusivity that could quickly lead to absurd situations. If I were the owner of my first name, then no one else would have the right to use it without my prior authorization, or would have to pay me a license fee to avoid being sued for identity theft. The result would be the opposite of what is sought, because it is precisely the circulation and use of data that creates its value.

"Personal right" vs. "right in rem" over data

The question of the attribution of proprietary rights is by far the most delicate, as the concepts of ownership are very different from one continent to another. Both France and the European Union have a personalist approach: individuals have rights over their personal data. This does not boil down to the protection of a mere economic value, as proposed by the American "pragmatic" approach.

Thus, our conception is not particularly propitious to patrimonialization. The French Senate report entitled "Privacy in the age of digital memories. Towards greater trust between citizens and the information society" had already ruled out the possibility of granting proprietary rights to individuals in 2009, referring to the risk of merchandising personality attributes. By considering that a piece of data, in the same way


as a kidney or a lung, would be an emanation of our person, it could thus be forbidden to trade it. This is why some defend the idea of applying to data the legal rule of the unavailability of non-commercial things, just as the unavailability of the human body prohibits anyone from trading in all or part of his or her body.

The most decisive argument remains above all the unequal balance of power that would exist between the co-contractors. In other words, how would an individual be able to determine the value of his data? A few illustrations: How much would your genetic code cost? What would you charge to sell your musical tastes? Each of us would quickly fall victim to an informational asymmetry that would lead us to sell our data for an unfairly low price.

Surely, an individual should be able to control the use of his personal data, for example by choosing how to manage the primary and secondary uses of his data, and so on. However, ownership in the legal sense would not directly solve this problem. On the contrary, since the evaluation of the price of data depends on their use, it could lead to under- or over-valuation.

How does the law stand?

The French Law for a Digital Republic of October 2016 ruled out the idea of proprietary data, while allowing individuals to retain control over it. Indeed, Article 1 of this text provides that "Any person has the right to decide and control the uses made of personal data concerning him/her, under the conditions laid down in this law."

This so-called "informational self-determination" principle had been defined by the German Constitutional Court as early as 1983. By endorsing it, French domestic law has reaffirmed that an individual is the right-holder regarding his or her data. Informational self-determination must be understood as an end in itself, and can be understood as the possibility of transcribing and controlling one's identity in the digital space.
This autonomy goes hand in hand with the right for everyone to be aware of what is known about them. This is one of the reasons why the Law for a Digital Republic has created an obligation of loyalty for the platforms. It imposes on the platforms a stronger obligation to provide information to the consumer. In other words, platform transparency becomes a necessary condition to ensure compliance between the displayed promise of the service and actual practices.

The principle of platform loyalty is intended to compensate for the structural imbalance between the platform, often in a monopolistic or oligopolistic situation, and users, who are in a dependent situation since they are unable to find alternative services. Other solutions deserve to be explored and will have to be found, without falling into the illusion of a salutary ownership.

"Platforms offering to buy user data will impose membership contracts on a 'take-it-or-leave-it' basis, with no room for negotiation."


3 QUESTIONS TO AXELLE LEMAIRE
Europe Strikes Back

"Europe is better equipped than any other region to offer businesses and citizens alike a trusted and secure digital environment."

A.B: We often hear that Europe does not have anything like the GAFA or BATX. As a key player in the expansion of the French Tech movement, uniting French digital startups worldwide, what should the European Union do to keep up with this race? Are there key initiatives that should be strongly supported?

A.L: It is often said that the USA creates, China copies and the EU regulates. If you think of what the situation will be in five years, the new setup could be China as the country which creates, the USA as the regulating country, and the EU both creating and regulating. Regulatory challenges are now better understood at higher levels, with a more acute awareness of the dominant position occupied by the big tech players. Indeed, large technology companies have become so huge on the global market, and so present in our daily lives, that they might somehow be hindering innovation and access to market, thereby restricting open competition. Calls for change are more vocal, and even the USA is paying more attention to this situation. The trick will be to bring evidence of the value created through the collection and use of data outside of classic business models, and to relate competition issues to new horizontal and data-driven business models.

Antitrust regulation was historically well designed to address monopolistic or oligopolistic situations in the banking or energy sectors, but the rules need to be adapted to capture these new models. Europe has always been one step ahead when considering these matters. If one strongly believes that the future will develop through technology push, regulators will have to adapt and legislate in real time, and have companies open their APIs and share their data; this new future implies deep structural changes for policy makers.

But Europe is better equipped than any other region to offer businesses and citizens alike a trusted and secure digital environment, and that is a winning argument in the long run, as long as competition remains free.

Against this background, there is still room for the emergence of new players, which is good news for European companies. I am optimistic that new startups will flourish, following the example of the increasing number of unicorns in France, which went from 1 to 8 in recent years. However, the real question is: what moves do we want to make at the European level to leverage this trend harmoniously? My number one recommendation would be to invest in research and long-term projects rather than back private investment only, with a short-sighted obsession with quick return on investment. Breakthrough innovations usually come from research labs or entrepreneurs who


are former or present researchers. This is our strongest asset in Europe, a continent where research, innovation and education are increasingly linked.

A.B: Data is a key driver of the digital transformation. However, central administrations, local authorities and public agencies generate important volumes of information that most often remain unused due to a lack of data strategy and suitable resources. To overcome this, the French Digital Republic Bill was passed in 2016 (named after you: Lemaire's law!) and introduced the principle of open data by default. This legislation is now applicable to the entire public sector in France, including every local authority with more than 3,500 inhabitants. Is this enough to guarantee that open data creates value?

A.L: To recall the context, when I was negotiating this bill as a Government member, neither the principle of open data by default nor online privacy was considered a serious matter, except in specific specialized circles. We are past this today. Public institutions now understand that the massive amount of data they are producing and collecting is precious. Data is sometimes compared to the new oil, a comparison I disagree with as it relates to proprietary rights and limited resources. Data is rather like air or light: the more it circulates, the more value it creates.

The challenge for successful open data policies is not only to produce data that are well qualified, reusable, portable and machine readable, but to find ways for other stakeholders such as entrepreneurs, associations and citizens to get involved and to share and use the data produced within public institutions. Some national or local authorities proactively engage in open data policies, by creating platforms for instance, but the ability to demonstrate the value of such policies (democratic value, thanks to the transparency it brings, but also economic value) is of great importance.
We know that startups and innovative companies thrive on their use of data, especially in the mobility and utilities sectors, and that data sharing can have a positive impact on lowering bills for consumers, reducing environmental damage, improving mobility, implementing smart cities, etc. There are multiple illustrations of valuable mixes between public, transactional and users' data. But the culture of secrecy is still dominant, and such examples are still too rare to attract and convince a sufficient number of stakeholders to embark on a data sharing journey.

There is a need to create new frameworks of cooperation around concepts such as "data for the public good" or "general interest data", where public authorities can lead as data orchestrators in their relations with concession holders or other private entities. The Digital Republic Bill has led the way by creating that new concept, but the concrete implementation can be complex and needs to be balanced with privacy rights and business confidentiality. Somehow, it appears that when the value of open data and data sharing is not directly visible, actions are not taken. Though the democratic value of such data is obvious, immediate business considerations can shatter hopes of developing the positive externalities promised by strong open data policies. However, the public sector has to move fast in this respect, or it risks being left out of the promise of smart data and artificial intelligence for the public good.

A.B: You have always been involved in topics related to education, CivicTech and technologies for the benefit of the common good. Indeed, "Tech for Good" has recently become a motto for many initiatives. Would you say this is only a "purpose washing" trend, or are we witnessing a real change in business approaches?
How can entrepreneurs balance economic performance and the general interest?

A.L: I am currently working on a study that will be entitled "Fifty Shades of Good". The title itself shows that it is impossible to subsume all the different initiatives relating to Tech for Good under one category. The common starting point of all these projects is to increase awareness among citizens and consumers, which translates into more pressure on corporate companies to produce "good goods".

I was recently reading the front page of The Guardian explaining that a third of pollution's emissions are the doing of only eight companies (oil and gas mainly), five of them being American companies. It made me wonder once more whether we were ready for huge changes and fit to face the great environmental challenges ahead. I think the answer, for now, is no. Though awareness of these issues is growing, I doubt these changes will come from very large industrial companies, which, though I reckon are in some cases sincere and ready to change, are facing too huge a challenge to be able to move fast enough.

So, I believe that change will come from new initiatives launched by startups, associations, foundations, and governments collaborating with the private sector, not only to mitigate the bad sides and damages triggered by commercial activities (pollution included) but also to bring positive social and environmental change.

I believe that the French ecosystem is full of hope, as many entrepreneurs are putting environmental and/or social impact at the heart of their business model. These initiatives are likely to have a real impact. In the end, I am convinced that consumers and citizens are the fertile ground where transformation can happen. When consumers behave as committed citizens, they have the power to change things.
It is only with empowered citizens and sincere purpose-driven organizations that we are likely to observe any fundamental change in the way companies create value beyond business.

To Go Further
• "Can digital technology save representative democracy? The example of online consultations about the bill of law on a 'digital republic'"
• "La promesse de l'intelligence artificielle : l'action publique doit-elle être prédictive ?"
• "Mixité dans la FinTech : où sont les femmes ?"
• "GovTech en France : état des lieux et perspectives"


How to Fight Cybercriminality?
BY ADRIEN BASDEVANT

"Cybercrime will increasingly be at the heart of our daily lives. How many revealed computer attacks and dismantled online networks will be enough to make us fully aware of this? At a time when the whole of our society and economy is being put into data, the risk lies in the explosion of this crime, while a feeling of impunity persists."

What is cybercrime?

Its facets are multiple. In the absence of a legally established definition, cybercrime covers all criminal offences attempted or committed against or by means of an information and communication system. It ranges from fraud, data theft and identity theft to child pornography, the dissemination of illegal or malicious content, corporate espionage and the saturation of websites. This activity can be very lucrative. In three years, Silk Road, the world's largest drug sales site, is estimated to have generated $1.2 billion in traffic, with commissions of about $80 million.

Cybercrime is becoming more "democratic". Far from targeting only large groups, it now reaches small and medium-sized companies, individuals and communities alike. All sectors are concerned, from agriculture to education and heavy industry, because almost all our activities and interactions are now linked to an information system and are therefore vulnerable to intrusions and piracy. Indeed, each organization becomes an information technology company whose data is a coveted asset. Cybercrime also presents a degree of complexity that requires a high degree of specialization of actors and tools in order to better understand the mechanisms used, as in the case of the anonymization processes used by the TOR browser (The Onion Router) or ransom requests in Bitcoin.

Silk Road, a topical example

Closed by the FBI in November 2014, the "Silk Road" website offered, among other things, according to the prosecution, the sale and purchase, only in Bitcoin, of drugs, weapons, forged documents, and the services of hired killers. During the trial, the lawyers of Ross Ulbricht - prosecuted for leading this black market and finally sentenced by the American courts in May 2017 to life imprisonment - challenged several aspects of the investigation, particularly the behavior of two federal agents who tried to extort and blackmail the accused. Shaun Bridges, a member of the US Secret Service, was convicted after pleading guilty to embezzling $800,000 in Bitcoin electronic currency, while Carl Mark Force, an officer with the US Drug Enforcement Administration (DEA), was sentenced to 78 months in prison for extortion and money laundering. This complex case highlights the specificities inherent in these technologies, and the need to train the next


generations of investigators, judges and lawyers to understand all the subtleties. Specialization of the entire criminal justice system seems necessary: from investigation, to prosecution, to the trial phase.

The analogy with the National Financial Prosecutor's Office

The current limits to combating cybercrime are similar to those that existed in the economic and financial field before the creation, in 2014, of the French National Financial Prosecutor's Office: the aggravation and complexity of economic crime, the absence of a precisely defined interlocutor at national and international level, insufficient specialization of judges, insufficient human and technical resources, and the cost of the loss of income resulting from fraud. The National Financial Prosecutor's Office could thus serve as an example. The creation of a public prosecutor's office specializing in digital technology would make it possible to respond to the problems raised by offences specific to electronic networks, in particular attacks on information systems, denial of service and piracy, as well as those committed using electronic communication networks and information systems when they reach a certain degree of complexity or constitute particularly serious offences. Its high level of specialization, and the advanced tools it would be entrusted with, would make it possible, for example, to identify authors using encryption processes and to collect evidence made more difficult to obtain by advanced technologies.
Some initiatives have already been implemented in the State of Rio in Brazil, or in Spain, where a General Prosecutor in charge of cybercrime benefits from the help of 70 prosecutors gathered within a specific prosecutor's office dedicated to this type of crime.

Training of judges
The creation of a digital judicial system requires an upstream debate on the type of specialization of judges, their number, the centralization or decentralization of such a prosecution service, and even the possibility of implementing a prosecution service at European level. A major challenge will then be to coordinate the action of the Digital Prosecutor's Office with the various existing specialized services, as digital technology is often a means of enabling major organized crime to launder money or finance terrorist networks, for which other bodies are already competent. In the long term, the creation of a 33rd Correctional Chamber, specialized and exclusively dedicated to the prosecution of offences investigated by the French National Digital Public Prosecutor's Office - on the model of the 32nd Correctional Chamber at the Paris Court, dedicated to cases originating from the French National Financial Prosecutor's Office (Cahuzac, Wildenstein, etc.) - could prove relevant.

The relevance of pan-European coordination
All the above findings suggest that cooperation at a supranational level should be considered, particularly in view of the substantially transnational nature of cybercrime. In support of the implementation of such a cooperation mechanism, we should bear in mind the reasoning developed by the European Union institutions for the establishment of a European Public Prosecutor's Office for criminal offences against the EU's financial interests.
As security, justice and fundamental rights are matters of shared competence between the EU and the Member States, the principle of subsidiarity is particularly relevant here, in that action at European level would be much more effective in combating cybercrime offences. As such, the creation of a European Digital Public Prosecutor's Office would also give concrete expression to several of the EU's priorities in the field of justice, for example with regard to consumer protection, the stated objective of which is in particular to adapt consumer law to the digital age, but also to provide "a stronger, coordinated and global response to [...] cybercrime".

The urgency now requires us to reflect on the evolution of digital litigation at European level. This development will seem prospective to some. However, anticipating is essential. Change management is a long process. But inevitably, this will be the only way to ensure that tomorrow there will be prosecutors and investigation services with the necessary knowledge and resources to deal with cybercrime cases.

To Go Further
European central banking risk assessment for 2020


Covid19, individual control & biopower?
BY ADRIEN BASDEVANT

The philosopher Michel Foucault, in his lecture of January 15th, 1975 at the Collège de France, announced: “It seems to me that as far as the control of individuals is concerned, basically, the West has had only two great models: one is the model of exclusion of the leper; the other is the model of inclusion of the plague-ridden. And I believe that the substitution of the inclusion of the plague victim, as a model of control, for the exclusion of the leper, is one of the great phenomena that took place in the 18th century.”

This formidable analysis by the French philosopher, in the lecture series entitled “Les Anormaux” (The Abnormal), resounds more than ever at a time when the coronavirus pandemic is spreading, leading States to ask how technologies can appropriately be used to monitor its course and warn their citizens of contamination risks. We may thus be witnessing today, in the 21st century, a major phenomenon: the popularization of a third model of individual control.

Coronavirus and the tracing of individuals
While debates are raging about how to use geolocation data from smartphones to combat the pandemic, France is looking into a digital strategy for identifying people, and the European Commission has asked mobile network operators to provide it with aggregate statistical data to check whether the containment instructions are being properly applied. Elsewhere in the world, several countries are currently using “contact tracing” applications, i.e. applications making it possible to trace all the contacts of an infected person over the last two weeks, by recording all interactions via the Bluetooth function of their mobile phones. In this context, Singapore's TraceTogether application reportedly processes only “anonymized identification data, encrypted locally on the users' phones”. Other initiatives are more controversial. This is the case of the deployment of facial recognition in Russia.
This is also the case in China, where the police have access to the travel history of all citizens using the Alipay Health Hub application. This is finally the case in Iran, which, under the guise of helping to diagnose the virus, has launched a mobile application that makes it possible to collect the location data of millions of citizens in real time.

Are these simple provisional measures in exceptional circumstances, or the beginning of a generalization of invasive and permanent technical tracing measures? The health crisis must be addressed, and no one will contest that. However, the means to do so must be wisely considered, because these tools for tracing, monitoring and surveillance of individuals will survive this crisis. Therefore, to avoid reaching a point of no return, a balance must be found. Where that balance is set must also be the subject of public information and education.
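The “contact tracing” mechanism described above - short-lived Bluetooth identifiers exchanged between nearby phones and logged locally - can be sketched in a few lines. This is an illustrative simplification under assumed mechanics, not the actual TraceTogether protocol; all names are hypothetical.

```python
import secrets
from datetime import datetime, timedelta

class Phone:
    """Toy model of a contact-tracing app: each phone broadcasts short-lived
    random tokens over Bluetooth and logs the tokens it hears, locally."""
    def __init__(self):
        self.my_tokens = []   # tokens this phone has broadcast (kept on device)
        self.heard = []       # (token, timestamp) pairs received from nearby phones

    def new_token(self):
        token = secrets.token_hex(16)   # random, carries no identity by itself
        self.my_tokens.append(token)
        return token

    def record_contact(self, token, when):
        self.heard.append((token, when))

def encounter(a, b, when):
    """Two phones near each other exchange their current tokens."""
    a.record_contact(b.new_token(), when)
    b.record_contact(a.new_token(), when)

def exposed(phone, infected_tokens, since):
    """When a user tests positive, their own tokens are shared; every other
    phone checks locally whether it heard any of them recently."""
    return any(token in infected_tokens and when >= since
               for token, when in phone.heard)

# Alice and Bob meet on April 1st; Bob later tests positive.
alice, bob = Phone(), Phone()
encounter(alice, bob, datetime(2020, 4, 1))
two_weeks = datetime(2020, 4, 10) - timedelta(days=14)
print(exposed(alice, set(bob.my_tokens), two_weeks))  # True
```

The point of the design, as the article notes, is that the exposure check happens on the user's device: only the random tokens of infected users circulate, not identities or locations.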


From statistical control to mass data control
If Michel Foucault's analysis is still particularly enlightening today, it is because it describes how the use of statistical tools revolutionized the control of pandemics and individuals. Whether it now operates via applications, artificial intelligence or facial recognition systems, the underlying mechanism is still that of large-scale data processing by algorithms: in other words, very advanced statistics. We therefore need to analyze how statistics impacted the control of individuals in order to anticipate the control that “Big Data” will make possible. To understand the emerging new age of individual control, we need to go back in time for a moment.

Why does Foucault refer to the 18th century? It is because at that time statistics appeared as an objective measuring instrument, allowing us to know a social reality that was previously inaccessible and complex. The measurement and statistical description of reality made it possible to apprehend mass phenomena. Governments thus resorted to figures - known as “political arithmetic” - to manage situations of epidemic, famine and war. In 1774, when smallpox carried off King Louis XV, the question arose as to whether or not to inoculate the royal family and then the entire population. From the end of the 18th century onwards, statistics were understood as an instrument of state control, making it possible to distinguish between normal and abnormal behaviors and habits.

This is the thesis retained by Michel Foucault when he distinguishes between the treatment of lepers and plague victims. Foucault describes how the state treatment of bodies moved from a regime of exclusion from cities and territories to a regime of inclusion. In the case of leprosy, this is a model of exclusion, known as marginalization: a social practice based on the rejection of leprosy patients beyond the city walls. In the case of the plague, the population is no longer rejected, but confined.
This is quarantine: “And every day inspectors had to pass in front of every house; they had to stop and call out. Each individual was assigned a window at which he was to appear, and when his name was called he was to report to the window, it being understood that if he did not report, he was in bed; and if he was in bed, he was sick; and if he was sick, he was dangerous. And, therefore, something had to be done. That is when the individuals were sorted out, between those who were sick and those who were not.” The State carried out a statistical gridding of individuals within the city. Through this data processing, the State performed a meticulous counting of the healthy and the unhealthy, the normal and the pathological.


This biopower is no longer a crude and primary method of separating the population into two distinct groups, the healthy and the sick: those who must be allowed to live versus those who must be made to die. The logic is reversed. More flexible, more precise, biopower lets people die and makes them live. At the same time, Foucault strikingly describes the emergence of a power of normalization, what he calls the advent of disciplines: “The moment of the plague is that of the exhaustive squaring of a population by a political power, whose capillary ramifications constantly reach the grain of the individuals themselves, their time, their habitat, their location, their body. (...) I would say roughly this. It is that, basically, the replacement of the leprosy model by the plague model corresponds to a very important historical process that I will call in one word: the invention of positive technologies of power.”

The emergence of a third control model
The objective of the “biopolitics” presented by Foucault is to produce a healthy population by identifying regularities. Individual bodies become objects of disciplines on which continuous, regulatory and corrective mechanisms will be applied. As Foucault remarked, the aim of the state, in this gigantic collection of data, in this regime of knowledge, is not so much to obtain knowledge as to achieve good government by gaining access to the intimate functioning of individualities. The government must then conform to the knowledge it possesses of the masses, to ensure effective economic redistribution and maximum judicial equality.

In our contemporary society, statistics - now called “Big Data” - is the indispensable tool for rational and predictable human administration and for the preservation of public order. In this context, the control of individuals may well be entering its third age, extending Foucault's reasoning.
Not that of exclusion, nor that of inclusion by confinement or quarantine, but that of a certain freedom to come and go, in return for real-time tracking and surveillance. After the leprosy model and the plague model, we now have the coronavirus model. But in return for what, precisely?

In the face of these developments, the question is less whether we should use technologies to help limit the spread of the virus than which technologies should be applied, and how. Should we only process anonymized data? If so, what anonymization technologies should be deployed? Will they be robust to re-identification possibilities? Will these mechanisms be offered on the basis of individual consent? Will the data be subsequently deleted? Will advanced encryption technologies allow for the preservation of individual freedoms? Will the purposes of these processing operations be clearly framed so as to be limited to the fight against the epidemic? How can we ensure that these measures are proportionate and legitimate? There is nothing cosmetic about this reflection, and it could well have important consequences for the respect of the rule of law and the way in which our societies of tomorrow will be shaped.
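The question of robustness to re-identification can be made concrete. A minimal sketch, assuming a naively “pseudonymized” dataset in which phone numbers are replaced by unsalted hashes; the data is invented, and the example only illustrates why such pseudonymization is not anonymization:

```python
import hashlib

# Hypothetical "pseudonymized" record: the phone number is replaced by its
# unsalted SHA-256 hash. Hashing is NOT anonymization: when the space of
# possible identifiers is small and enumerable, the pseudonym can simply
# be reversed by brute force.
def pseudonymize(phone):
    return hashlib.sha256(phone.encode()).hexdigest()

leaked = pseudonymize("0601023742")   # the "anonymous" value found in a dataset

def re_identify(pseudonym):
    # Toy search space: the attacker assumes the number starts with 060102
    # and enumerates the 10,000 remaining possibilities.
    for n in range(10_000):
        candidate = "060102{:04d}".format(n)
        if pseudonymize(candidate) == pseudonym:
            return candidate
    return None

print(re_identify(leaked))  # '0601023742' -- the individual is re-identified
```

This is why the robustness of the chosen anonymization technique, and not merely the claim of anonymity, matters for the legal questions listed above.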


Women in Tech
3 QUESTIONS TO AUDE BERNHEIM AND FLORA VINCENT

“In short, the algorithms must be ‘educated’”

A.B: Are algorithms macho?
A.B & F.V: Algorithms have no intentions per se; they reflect society and the views of their creators. But our society is sexist. Translation algorithms, when they switch from a non-gendered language (like Turkish) to a gendered language, propose associations: a single person becomes a single man, while a married person becomes a married woman. These stereotypes can become particularly unfair when it comes to sorting CVs (systematically discarding those of women for technical positions) or offering salaries (automatically lower for women). AI can reproduce the gendered prejudices of our society, and propagate and amplify them.

A.B: So are we condemned to reproduce and propagate gender inequalities?
A.B & F.V: No! The mechanisms generating these biases are today pointed out, understood and dissected. It is possible to domesticate, detect and reduce the biases of algorithms. From the design stage, at the level of the computer code, then through the selection and constitution of databases and their analysis, to the evaluation of automatic solutions, the diversity of solutions matches the scale of the problem. Many researchers and industrialists are working on it. Some applications are already operational, in particular to measure and fill the gender gap (the gap in statistics between women and men) in various fields. In addition, institutional injunctions for algorithms that are egalitarian, explainable and fair are becoming clearer in Europe and in France. In short, the algorithms must be « educated ».

A.B: In your opinion, what can work at the interface of feminism and technology provide?
A.B & F.V: The fight to get more women involved in scientific fields is as essential as ever. Beyond this oft-mentioned question, feminism today must question cutting-edge technologies, and this is exactly our approach and that of the Equality Laboratory.
Feminist thought has transformed many social sciences; the example of AI shows to what extent it can transform the so-called "hard sciences" as well. What is a fair algorithm? Mathematical, computational and ethical answers exist and must be implemented. But technologies can also help equality. Artificial intelligence can quantify and reveal previously hidden biases. It is perhaps easier to change lines of code than mentalities. Let's seize this opportunity to encode equality in algorithms.

To Go Further
• L'intelligence artificielle, pas sans elles !
• The New York Times - Biased Algorithms Are Easier to Fix Than Biased People
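The detection step the interviewees describe can be illustrated with one standard fairness metric. A toy sketch with invented data; the "disparate impact" ratio and its common 0.8 alert threshold are illustrative conventions, not a description of the Equality Laboratory's own methods:

```python
# Toy illustration of one way algorithmic bias can be quantified: the
# "disparate impact" ratio compares selection rates between two groups.
# All data below is hypothetical.
def selection_rate(decisions):
    return sum(decisions) / len(decisions)

def disparate_impact(group_a, group_b):
    """Ratio of the lower selection rate to the higher one (1.0 = parity)."""
    ra, rb = selection_rate(group_a), selection_rate(group_b)
    return min(ra, rb) / max(ra, rb)

# Hypothetical CV-screening outcomes: 1 = shortlisted, 0 = discarded.
men = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]     # selection rate: 0.8
women = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1]   # selection rate: 0.4

print(disparate_impact(men, women))      # 0.5, well below the common 0.8 threshold
```

Measuring the gap is only the first step the interview mentions; correcting it happens upstream, in the code, the training data and the evaluation of the system.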


When the Average Wants to Be the Norm
BY ADRIEN BASDEVANT

“The history of statistics is filled with long hesitations. The difficulty of reconciling statistical regularities with the singular situations of observed cases is found throughout the debates on methods for calculating and interpreting probabilities. Can we reduce societal interactions or behaviors to mathematical equations? What does it mean to be ‘normal’?”

Two approaches have long been in opposition: the descriptive approach and the prescriptive approach. It is particularly interesting to consider for a moment the reasons that led to the criticism of the prescriptive approach, in particular of Adolphe Quételet's notion of the « average man ». This article aims to bring a critical perspective to the substitution of correlation for causality, which consists in prescribing measures without any further concern for explaining the reasons for doing so. This analysis will allow us to understand why some of the prescriptive reasoning now applied to Big Data should in turn be decried, and what the underlying risks are of using statistics and predictive models to moralize our conduct.

The notion of « average man »
In the 1830s, the Belgian astronomer Adolphe Quételet started from the observation that births like deaths, crimes like suicides, occur randomly, for reasons that are different and specific to each case. And yet they occur, each year, for a given country, on a regular basis. Moral behaviors and physical attributes, despite their apparent heterogeneity, would possess unity. According to Quételet, this suggests the existence of constant causes. Regularity at the macroscopic level is no longer seen as a sign of a divine order, but as a statistical fatality.
Above these singular cases, the « average man » would constitute the « norm », of which all other occurrences would only be imperfect imitations.

By measuring, for example, the height distribution of a given population, we can see the existence of a bell curve - the Gauss curve - which provides a distribution around a central value (the mean). The mean of this law - which the British mathematician Karl Pearson would present from 1894 onwards under the name of « normal distribution » - is then considered as the real value of the observed height, the other values scattered around this mean being errors. For the proponents of social physics, deviations from the central value should not be taken into account, as they constituted imperfections.

The shift from descriptive to prescriptive statistics is looming. Put differently, the number of smallpox patients is no longer simply measured, but is used to decide whether or not to vaccinate a given population and whether or not to administer this mandatory preventive intervention. From a moral perspective, normality is assimilated to the good: « the individual who would sum up in himself, at a given time, all the qualities of the average man, would represent all that is great, beautiful and good at the same time » (Adolphe Quételet, Sur l'homme et le développement de ses facultés, ou essai d'une physique sociale, Editions Bachelier, 1835). In trying to identify the « average man », statistics are meant to be moralizing. The « average man » is thus supposed to become the standard of society. Belonging to the average no longer refers to mediocrity; it becomes the object of a new fascination.

Interpreting the diversity of human measurements as variation around the « average man » is not limited to physical attributes such as height or weight. According to Quételet, the average is an ideal. « Social physics » is therefore also interested in social regularities, such as the occurrence of crimes and suicides. Based on these metrics, statistics postulate the existence of objective causes explaining the measured regularities. Quételet said he was fascinated by « the frightening accuracy with which crimes are repeated ». The statistical results were all the more striking as they revealed a regularity, a social necessity, even for the most unpredictable acts. The statistical measurement of social phenomena thus held out the promise of being able to better explain them.

The moralizing statistics
This moralizing can be found in the very names of the underlying mathematical distributions. We speak of the « binomial distribution » with Bernoulli, then of the « normal distribution » with Pearson, in order to model natural phenomena resulting from various random elements. These laws postulate the existence of constant causes, likely to explain the observed regularities.
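The « normal distribution » at issue here has a precise form. A standard statement of the Gauss curve, in modern notation rather than Quételet's own:

```latex
% Density of the normal (Gaussian) distribution: observations x cluster
% around the mean \mu with dispersion \sigma. In Quételet's reading, \mu is
% the "true" value embodied by the average man, and the deviations from it
% are mere errors to be disregarded.
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}}
```

The whole prescriptive turn criticized in this article consists in reading the parameter \mu not as a summary of the data, but as an ideal that individuals ought to match.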
It is then no longer a question of measuring the effects of an identified cause but, using an inductive method, of inferring from the constancy of certain measurements the presence of constant causes. In many ways, these aspects are similar to the current Big Data approach, which no longer focuses on explaining causes, but merely highlights the existence of statistical correlations.

The law has therefore attached normative effects to statistical measures. The first social laws, for example, were the result of measuring the particular physical risks to which workers were exposed when operating machines. Employment law was thus the first to be indexed on social distinctions revealed by quantification. We can already see in these probability laws the fantasy of substituting calculation for law, ancestor of the contemporary dream of replacing the Legal Code by the Computer Code and, why not, of replacing judges by machines tomorrow.

The reduction of social facts to mathematical formulae and indicators was already a concern for many philosophers, writers and scientists, who saw in it a risk of moralizing social life through algebra and calculations. This led Auguste Comte to break with « social physics » and start talking about « sociology ». This succession of trends and neologisms shows above all that the appearance of statistics contributed to the empowerment of society in relation to political power. Gradually, statistics were no longer solely at the leaders' service, but became autonomous, representing a whole social reality with its own laws and regularities that political powers in turn had to learn to know and measure.

The criticism of the « average man »
The historical opposition between the descriptive (« there is ») and prescriptive (« there must be ») statistical approaches can be summarized as follows. According to the descriptive approach, man is a finite being, unable to know the universe, which implies making bets.
In probability, we then speak of « reasons to believe » or « degrees of belief », allowing one to guide and orient one's choices in a situation of uncertainty. Probability


then appears as a measuring tool of man's ignorance, and aims to help him overcome it. On the contrary, the prescriptive approach, known as the « frequentist » approach, focuses on the regularity of the measured phenomena, and uses these measurements to justify and recommend actions.

In his reference work The Politics of Large Numbers (La politique des grands nombres), Alain Desrosières, former administrator at the French institute for statistics (INSEE) and a specialist in the history of statistics, perfectly summarizes the impact that the prescriptive approach has had on statistical culture, and the abuses it may have caused: « [The prescriptive approach] forms the heart of statistical instrumentation in the public space. This was shown for the transformation of how industrial accidents were socially addressed in the 19th century, from individual responsibility as defined by the Civil Code, to the company's insurance responsibility, based on calculations of probability and averages. Insurance and social protection systems are based on this transformation of individual hazards into stable collective objects, that can be publicly evaluated and debated. However, by paying attention not to unpredictable individuals, but to an average that can be used as a basis for controllable action, Quételet's reasoning does not provide a tool for debates on distributions and orderings between individuals. Tending towards the reduction of heterogeneity, he is not interested in its objectivization, which is necessary if a debate is to focus precisely on it. This happens when a hereditary, Darwinian problem of inequalities between individuals is imported from the animal world to the human world, by Galton. »

Indeed, from the 1870s, eugenicists, including Galton and Pearson, take up Quételet's idea of the normal distribution of human attributes around a central value, but this time to classify individuals, according to what they will call a law of deviation.
Rather than eliminating these deviations from the average, they instead focus on them. Statisticians then become activists for a political cause. The technique of regression is applied in order to bring out the effects of heredity, to favor the birth rate of the men categorized as most able, and to limit the reproduction of the poorest, deemed unfit: « This scientific-political war machine is directed, on the one hand, against the landed gentry and the clergy, who are hostile to modern science and Darwinism, and, on the other hand, against reformers for whom misery results from economic and social causes, more than biological ones, and who militate for the establishment of welfare systems. »

In an article published in the Journal of the Royal Statistical Society in 1910, the economist Keynes, defending an anti-frequentist approach, opposed Karl Pearson's statistical induction procedures. Keynes objected to a recent study by Pearson that sought to demonstrate that children's abilities were purely hereditary and not contingent on their parents' lifestyles. In this case, the purpose of the publication was to show that alcoholic parents - alcoholism being considered a way of life, free from any genetic cause - could produce children with undamaged physical and intellectual aptitudes. For Keynes, Pearson's study was dangerous, since nothing ruled out the possibility that causes other than alcoholism might be responsible for the results obtained.

The difference between the descriptive and prescriptive approaches has not failed to spark controversy among physicians as well. Illness was often seen as a singular event, not a generalizable one. The categorization of patients - between those who were part of the norm and those who were abnormal - was therefore difficult to accept.
The French philosopher and physician Georges Canguilhem observed: « The ambiguity of the term normal has often been noted: it sometimes designates a factual situation able to be described by a statistical census - the average of the measurements made on a trait presented by a species and a plurality of individuals presenting this trait according to the average or with a few deviations considered indifferent - and sometimes an ideal, a positive principle of appreciation, in the sense of a prototype or a perfect form. The fact that these two meanings are always linked, that the term normal is always confused, is what emerges from the very advice we are given to avoid any ambiguity. » Canguilhem questioned what it means to be « normal », for every human being is doomed to step, even if only temporarily, outside the norm. For example, we sometimes catch the flu or a fever in the winter; so there is a certain normality in being sick.

Should we therefore consider that there are several possible norms? The goal of 19th-century medicine was the restoration of normality, of which illness was presented as a deviation. So, if average people are sick, is being sick then still normal? We thus see that under the idea of the norm - in the scientific, medical sense - there lies a moral ground: a normative judgment according to which any difference, any


variation from what has been stated to be the norm must be considered a risk, a danger, that must be addressed and fought against. For Canguilhem, on the contrary, normality should not be approached universally (the same standard for all), but should rather be understood as singular. According to him: « There is no normal or pathological fact in itself. The abnormality or mutation is not in itself pathological; it expresses other possible norms of life ». Being normal would mean, on the contrary, being able to deviate from the norm by inventing new norms. This refusal to reject deviations from the norm will then need to be integrated into our approach to Big Data. Its biopolitical objectives of identifying patterns to produce a healthy population should not result in locking us into a standard-setting society.

Lawyers, coders, statisticians and technologists will have to learn how to work together to ensure an ethical use of statistical information, one favorable to the numerous differences of the human species, as this is a condition for any democratic deliberation.


Data-driven regulation
3 QUESTIONS TO SÉBASTIEN SORIANO

“The problem is not so much the Regulation and the behaviors that it aims to prevent. The regulation only applies to Internet pipes, i.e. to the Internet service providers. [...] The problem lies elsewhere; not in the pipes, but in the faucets.”

A.B: You defend network neutrality and the internet as a common. According to you, what public policy measures should be implemented to ensure this approach?
S.S: The internet has spread as a common good. Its value grows with the contributions of its users. This is The Wealth of Networks of Yochai Benkler, the end-to-end argument of Lawrence Lessig, the core of Tim Wu's proposal on net neutrality.

This is the principle behind the European Open Internet Regulation of 2015. In France, the ARCEP oversees its implementation. At the European level, BEREC, the umbrella organization of European regulators, is in charge of ensuring consistency in its implementation all over Europe. It has been a struggle for many years, and in the US it is not over yet. In Europe, though, the net neutrality principle is now unlikely to be challenged. The European Commission has firmly confirmed it recently. BEREC is in the process of reviewing our guidelines at the European level. We will adjust them for more clarity, especially for 5G, but the rules are futureproof and don't need to be reshuffled.

The problem is not so much the Regulation and the behaviors that it aims to prevent. The regulation only applies to Internet pipes, i.e. to the Internet service providers. However, on this part of the network, we do not see major problems anymore. The problem lies elsewhere; not in the pipes, but in the faucets.

A.B: Platforms play a dominant role in the digital economy. For instance, access to the internet is now predominantly through smartphones, reducing users' freedom of choice. Hence you have claimed to be in favor of a regulation of Big Tech, mentioning Android and iOS notably.
How should they be regulated?
S.S: The core issue with Big Tech is that decisions are made for us. To a certain extent, Big Tech takes the place of our free will. If we disagree with what they offer, we have no other choice than to leave them. Thus, we need a counterweight to face the asymmetry of power that exists. The State must act as the guardian of openness by regulating these companies' behavior. The internet is a decentralized environment, in which intelligence must remain at the edge of the network. The State must not replace Big Tech: we need to take their power and redistribute it to the many. Digital sovereignty must not belong to the State, but to individuals, people, innovators, so that they can decide their future for themselves.

The first tool to implement consists in bringing back choice. Choice is the first market discipline. Currently, Big Tech behave as they want, simply because they are in monopolistic situations. The only possibility you have is to leave their platforms. We must introduce alternatives; it may require regulation to help innovators wishing to imitate Big Tech, because it is very hard to enter the market, especially against such powerful players. It is exactly what we did twenty years ago with the complete opening-up of the telecommunications markets to competition. This philosophy must inhabit us all: we must allow alternatives to arise.

Today, we have various sets of regulation dealing with digital affairs: competition law, the Platform to Business Regulation, the GDPR, the e-Commerce Directive, the Open Internet Regulation, and so forth. I believe we can go further in order to give people the right to really choose, and target the main players to impose specific obligations on them.

As far as net neutrality is concerned, we see that it works fine on networks and ISPs (Internet Service Providers), but it does not cover devices (smartphones, voice assistants, connected cars, etc.). Indeed, when using a smartphone, iOS and Android decide which apps you cannot uninstall and which you can install via the available online App Store. There are also many features that competitors to Apple and Google cannot use on smartphones. Therefore, we propose that the principle of freedom of choice should also apply to devices. Many other remedies could be implemented without creating any further burden on firms. Data sharing with public authorities, for instance, could be key.

A.B: Data is at the heart of the contemporary digital transformation. As a consequence, you have implemented a new method of intervention for the French Telecom Authority by developing “data-driven regulation”. Could you explain what that means?
And illustrate it with concrete examples?

S.S: As individuals, we complain about having too much information, like the general terms and conditions we have to approve. But in fact it means we don't have the relevant data to make an enlightened choice. In the telecom sector, we realized that people did not get the data they needed. They felt well informed about prices but notably lacked information on coverage and quality of service. They were not happy with the data we published at a national scale; they wanted data related to their living or workplaces. So we decided to change our approach and "unbundle" data. On coverage, we stopped making national podiums and started to build a program that would deliver tailor-made data to users.

The ARCEP has also embarked on a crowdsourcing approach with a range of third-party players (including, for instance, transportation companies at local and national scales). To build on collective intelligence and crowdsourced testing, the ARCEP has published a code of conduct to guarantee the reliability of the methodologies used.

One final output is, for instance, monreseaumobile.fr, which provides information on mobile network performance through coverage maps and quality of service indicators. The data is naturally accessible to all and can be used, for instance, by real estate or tourism companies, or any other company.

On fixed markets, we already have maps. So far, they are used mainly by operators themselves and by municipalities, so that they can assess and plan fixed network deployments. But the ARCEP will soon be releasing a tool giving each resident information on the technologies available at the address level.

Another very tangible tool we created is J'alerte l'Arcep, an online reporting platform allowing users to flag any issue falling under Arcep jurisdiction. Users share their personal experience, influence market regulation and receive personalised advice by the same token.
Such a tool could first be extended, for instance, to issues encountered with device neutrality. "Data wants to be free" and it can be for the common good! States should act accordingly.

To Go Further
• Big Tech Regulation: Empowering the Many by Regulating a Few - Digital New Deal - September 2019
• 11 main proposals of Arcep to ensure internet openness and users' freedom of choice (Nov. 2018)
• 5G and Net Neutrality: Friends or Foes? (June 2019)
• Sébastien Soriano is now Director of the National Institute of Geographic and Forest Information. www.ign.fr


A Brief History of Data
BY ADRIEN BASDEVANT

"How did we go from simple livestock counts to algorithms that recommend which movie to watch, which article to read, or which person to invite to a restaurant? In other words, how did we come to rely on numbers to make laws and correlate our policy decisions with arithmetic calculations? To understand this evolution, we need to look back at the history of data culture and establish the link - still insufficiently studied today - between the emergence of statistics and the contemporary development of artificial intelligence algorithms."

Part I: The Emergence of Data

Data are not a recent invention

Data have been circulating for tens of thousands of years, ever since men had to count resources, hazardous phenomena, everything that could be counted. Numbers actually preceded letters. Bone notches or chiselings were already used to count animals even before the Sumerians invented writing in the 4th millennium BC. At that time, the Mesopotamian clay tablets formed one of the first data corpora recording accounting operations. Dried tablets could be reused by rehydration. Less volatile, the non-transcriptible baked tablets made it possible to keep track of exchanges.

The countable field has grown significantly over time. Wishing to know its territories and populations, the State began to count its vital forces, whether for the purpose of starting a war or of distributing taxes according to each one's occupation. As early as Ancient Rome, the census was entrusted to dedicated magistrates, elected for five years: the "censors". Each family would then go to the Champs-de-Mars to declare the composition of its human and material assets. But the information became outdated even before the studies were finalized. It was not until the 18th century that these practices became generalized, thanks to new ways of organizing data collection.

In Sweden, for instance, the proximity between state and religious administrations made censuses more reliable, notably thanks to the burial registers held by parish priests (data on age, gender and marital status). The higher counting frequency made it possible to observe the evolution of the measurements over time.

Although it initially started as a simple herd-counting exercise, counting then came to rely on statistics that made it possible to assess the risks taken by merchant ships, to provide insurance, and to distinguish the power of States and companies. Statistics then became an objective measuring instrument, making it possible to understand a social reality that had previously been complex and elusive.

Data have been flowing since our species' genesis, but it is only through statistics that we have learnt how valuable and useful their collection is. Etymologically, statistics is linked to the idea
of making an inventory. In medieval Latin, status means "inventory". This term is inherently attached to all government activity. Modern Latin uses statisticus to refer to what is "relative to the State". Its Italian derivative, statista, gives the term "statesman". This historical and central link with the State should not, however, hide the fact that statistics was originally used in a commercial context.

Numbers and Law: from the Middle Ages to the GDPR

Merchants' liability now derives from the law, but originally stemmed from the regulations of the merchant corporations, the lex mercatoria. It was only with the Italian City-States that it was gradually imposed on tradesmen in order to prevent bankruptcies, to guarantee proof of good management, and ultimately to protect trade. Trade globalization and the emergence of the market economy made it necessary to quantify the new mass phenomena, whether demographic, economic, social or moral. States in turn embraced these tools and techniques in order to structure their societies more adequately.

In the Middle Ages, it was indeed the traders who made it their duty to be accountable, through the introduction of double-entry book-keeping. This invention allowed them to account for their activities to their contractors, both private individuals and public authorities. The faithful keeping of accounts was thus the basis of commercial liability.

This medieval link between numbers and law, through the obligation of accountability, is still found today in our Big Data era. "Accountability" is indeed a key notion of the recent European regulation on personal data, known as the GDPR, which requires data controllers to "be accountable" towards the individuals concerned by their processing operations, both for the impact on their privacy and for the upstream safeguards implemented to prevent any abuse.

Following this introduction to the history of data (Part I), we will examine the genesis of the statistical culture, in order to understand how we have gone from simple enumeration, to descriptive statistics, then prescriptive, and finally today... "predictive" statistics.

The different forms of statistics

The development of statistics has successively addressed several concerns. Initially, statistics, in the medieval tradition of the "mirror of princes", had a pedagogical vocation, consisting in instructing the regent while showing him the reflection of his greatness: description of the kingdom's provinces, of its territory, of the amount of taxes he could collect. Statistical analysis then abandoned the perspective of royal power to focus on the condition of society itself and its inhabitants, for practical purposes: inventories of the prices of agricultural and industrial products, population enumerations, means of subsistence. Subsequently, times of famine, plague and war called for increasingly specialized and regular statistical studies: Abbé Terray's annual records of births, marriages and deaths from 1772 onwards, and Montyon's records of criminal convictions from 1775 onwards.

The first public controversy over the use of statistics arose during the debate on the inoculation of smallpox. The controversy began when, in 1774, smallpox swept away the king, leading his successor Louis XVI to have the entire royal family inoculated. This technical progress was made possible by the work of Edward Jenner. According to this English physician, women working in contact with cows, especially those who milked them, did not get smallpox. His work showed that cattle carried an infectious disease called vaccinia. This gave him the idea to "vaccinate" - to infect humans as an antidote to smallpox.
The question then arose as to whether public health would be better served by making the vaccine compulsory. For the mathematician Daniel Bernoulli, the "chances of winning" militated in favor of a vaccination campaign. A nephew of Jacques Bernoulli - who formulated the law of large numbers - he proposed to solve the political question of vaccination by applying a formula similar to those used in games of chance. He thus calculated that the chances of winning - in this case, a three-year longer life expectancy for the inoculated individuals - established the benefits of vaccination. On the other hand, the French philosopher and mathematician d'Alembert was opposed to this approach, which in his view led to a confusion between the "mean" and the norm. According to him, there should be no equivalence between the observed fact and the law derived from it. This controversy, ultimately resolved in favor of the proponents of vaccination, is one of the first instances of statistical calculation's victory over law.

Part II: The Birth of Statistical Culture

From the « artificial man » to the « artificial intelligence »

The pre-industrial thrusts and the gradual emergence of capitalism led to the dissolution of the feudal political system, giving rise to unprecedented administrative, economic and social problems. A renewed scientific and ideological foundation was therefore needed to tackle the newly forming societies. This is the context in which statistical methodology emerged. Before statistics became an autonomous and unified discipline, it had two very different origins: German descriptive statistics (Staatenkunde) and English political arithmetic.

The term "statistics" was first used by the economist Gottfried Achenwall in the old Germany of the mid-18th century. It was an activity of a purely descriptive nature, intended to present the characteristics of States, their territories - climate, natural resources, land constitution - and their populations, notably with the help of cross-tabulations. This method was based on the distinguishing criteria of nations developed by Leibniz during the same period, in an effort to be able to draw comparisons between the European countries. The classification of heterogeneous knowledge was meant to allow for a distinction between natural and material wealth, types of regimes and administrations. It was a nomenclature with a holistic intention, designed to facilitate the memorization of facts and teaching, for the good use of statesmen.

Understanding the State in order to rule it more effectively became essential in the second half of the 18th century. After the Thirty Years' War, what was to become Germany was still a country divided into more than three hundred micro-States. This fragmentation explains the desire to establish a general framework for classifying, cataloguing and archiving information in order to organize the collective memory, and
subsequently, on this basis, trade, justice and political decisions. The birth of statistics thus correlated with the creation of the modern State, which the English philosopher Hobbes referred to - and this is not trivial - as the "artificial man".

The statistical description of reality to apprehend mass phenomena is found at the same time, in a completely different form, in England. There, scientific government resorted to numbers - the counting of parish baptismal registers, the construction of mortality tables, the calculation of life expectancy - to chase out the arbitrary, giving way to "political arithmetic" in 1648. The British polymath William Petty, inventor and pioneer of this set of techniques, described it as an art of reasoning by figures upon questions of government:

« The Method I take to do this, is not yet very usual; instead of using only comparative and superlative Words, and intellectual Arguments, I have taken the course (as a Specimen of the Political Arithmetick I have long aimed at) to express myself in Terms of Number, Weight, or Measure ».

The development of mathematical tools for quantification - averaging, dispersion, correlation, sampling - was destined to apprehend a supposedly uncontrollable diversity. The main difference between these two models, German and English, stems from the fact that German descriptive statistics sought to give a global picture of the State without resorting specifically to quantification, whereas English political arithmetic - much closer to today's statistics - was entirely based on numerical censuses. This heterogeneity eventually converged. Modern statistical thinking, by its mathematical vocation, inspires objectivity, and therefore the legitimacy of a new administration of beings and things.
States thus created the first official statistical agencies, the forerunners of national statistical institutes such as INSEE, or of private polling institutes, which are now themselves disrupted by the advent of predictive algorithms.

Unlike today, statistical surveys were then intended only for governments - sovereigns and their administrations - and never for civil society. This difference is significant, and shall be discussed later, especially when we consider the movement of the last ten years towards the opening up of public data (open data), which is supposed to inform everyone on multiple subjects: from air quality to the availability of electric cars, WiFi access points in one's neighborhood, and the budgets and decisions voted by local representatives.


Block party
PRIMAVERA DE FILIPPI - BLOCKCHAIN

"A smart contract transaction between an individual and a DAO on the Ethereum blockchain may not qualify as an actual contractual relationship"

A.B: You co-authored with Aaron Wright Blockchain and the Law: The Rule of Code, one of the first books analyzing the interaction between blockchain and law. Blockchain has since become a hot topic. It is often seen as a technology that cannot be controlled by governments. As such, you introduce the concept of "alegality". Could you tell us what it means? And how should institutions take it into consideration?

P.F: The term "alegality" is essentially a fancy way to say that there are some things happening in the blockchain world that are invisible to the eyes of the law. These things are not "legal" or "illegal"; they are simply "alegal", in the sense that they occur outside of the purview of the law. For instance, a transfer of cryptocurrency from one person to a smart contract on the Ethereum blockchain could be regarded as an alegal activity. From a purely legal standpoint, this transfer may not qualify as a transfer of ownership over these digital assets, simply because a smart contract cannot qualify as the legal owner of these assets, since it is not a legal person. The same applies in the context of (smart) contracts. A smart contract transaction between an individual and a DAO on the Ethereum blockchain may not qualify as an actual contractual relationship, simply because there is no counter-party to the transaction, since a DAO does not have any legal personality or legal capacity. Hence, while these transactions are technologically enforced by the underlying technology, and might produce significant effects for the parties involved, they remain invisible to the legal system, which cannot therefore be leveraged to enforce these alegal relationships. This highlights the discrepancies that currently subsist between the "rule of law" established by national governments and the "rule of law" established by blockchain code.

A.B: Traditional organizations and jurisdictions face new kinds of challenges as they interact with decentralized networks and autonomous blockchain-based code. You suggest that the growth of blockchain technology may give rise to a new type of legal order that you called "lex cryptographia". You present it as a system of rules where autonomous, decentralized code - rather than legislators or judges - could determine the outcome of given interactions and disputes. Could you explain to us what lex cryptographia is about? Do you consider it to be the continuation of Lawrence Lessig's lex informatica, known as "Code is law"?

P.F: The Internet and digital technologies have enabled the emergence of a new normative system, a particular set of rules spontaneously and independently elaborated by an international community of
Internet operators. This system - sometimes referred to as Lex Informatica, by analogy to Lex Mercatoria - is an ideal toolkit for the regulation of online transactions, since its normative power arises directly from the technical design of the network infrastructure, which is used as a complement (or a supplement) to contractual rules. Just like Lex Mercatoria, Lex Informatica ultimately relies on self-regulation: it is a system of customary rules and technical standards elaborated by those who interact on the global Internet network. The system operates transnationally, across borders, independent of national boundaries and domestic laws. However, as opposed to Lex Mercatoria, which was elaborated by and for an international community of merchants in order to respond to their own needs, Lex Informatica is unilaterally imposed by online service providers onto their users. Indeed, by restricting the type of actions that can be performed on a digital platform, Lex Informatica introduces a system of technical norms which are not a direct expression of the will of the people, but rather that of those in charge of maintaining the platform.

Blockchain technology enabled the emergence of yet another normative system, a new mechanism of coordination which also relies on technical means in order to coordinate behavior. Yet, as opposed to Lex Informatica, whose rules are ultimately dictated by a centralized operator, the rules established by the protocol of a blockchain network are established by the community and for the community, and must be enforced through a mechanism of distributed consensus involving all network participants. The benefit of this new normative system - which I defined as Lex Cryptographica in my book - is that it operates independently of any third-party authority or intermediary operator.

A.B: Primavera, you are also an artist. You recently exhibited your "Plantoid", a blockchain-based life form, at the Burning Man Festival.
Can you tell us about this bionic creature? How does it interact? Why and how did you create it?

P.F: The goal of the Plantoid is to illustrate one of the most revolutionary - and yet still unexplored - aspects of blockchain technology. It illustrates the ability to create "blockchain-based lifeforms", i.e. algorithmic entities that are (1) autonomous, (2) self-sustainable, and (3) capable of reproducing themselves, through a combination of blockchain-based code and human interactions. The project explores the use of technology to give agency to both inanimate and animate things, which are currently not granted any legal personality or legal capacity under the law.

Specifically, the Plantoids collect cryptocurrency and then use these funds to hire artists to create new copies of themselves, through an evolutionary algorithm. Every Plantoid has its own Ethereum wallet, to which people can send cryptocurrency. By sending these cryptocurrencies, people provide the Plantoid with the opportunity to fund its own reproduction, while simultaneously acquiring the right to participate in the governance system of the newly created Plantoid. All cryptocurrencies collected in this way are stored in the wallet of each and every Plantoid. Depending on their form and size, different Plantoids will require different amounts of funds before they can blossom. The Plantoid constantly monitors its cryptocurrency balance, and whenever it realizes that a particular threshold has been reached, it will be able to use this money to initiate its own reproduction by hiring someone to produce a new copy of itself. There are currently 12 Plantoids in the world, and the last three were exhibited at the Burning Man festival in Black Rock City, Nevada. Hopefully, they will continue to reproduce and slowly colonise the planet!
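The funding-and-reproduction loop described above can be sketched as a small simulation. This is an illustrative model only, not the actual Plantoid smart-contract code (which runs on Ethereum); the class name, the threshold value and the governance list are hypothetical choices made for the sketch.

```python
# Illustrative simulation of the Plantoid life cycle: donations accumulate
# in a wallet, donors gain governance rights, and once the balance crosses
# a threshold the Plantoid commissions a new copy of itself.
# Hypothetical names and amounts; not the real Ethereum contract.
from dataclasses import dataclass, field

@dataclass
class Plantoid:
    threshold: float                 # funds needed before reproduction
    balance: float = 0.0
    governors: list = field(default_factory=list)
    children: list = field(default_factory=list)

    def donate(self, donor: str, amount: float) -> None:
        """Record a donation, then check whether reproduction can start."""
        self.balance += amount
        self.governors.append(donor)  # donors join the governance system
        while self.balance >= self.threshold:
            self.balance -= self.threshold            # pay the commissioned artist
            self.children.append(Plantoid(self.threshold))  # a new copy blossoms

plantoid = Plantoid(threshold=1.0)
plantoid.donate("alice", 0.6)   # below the threshold: nothing happens yet
plantoid.donate("bob", 0.5)     # balance reaches 1.1: one copy is commissioned
print(len(plantoid.children))   # 1
```

In the real project this logic lives on-chain, so no operator can stop the loop once funds arrive, which is precisely the autonomy the interview describes.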
To Go Further
• Primavera de Filippi - Plantoid & DAOS: Blockchain Based Life Forms
• Meet Plantoid: Blockchain Art With A Life Of Its Own
• The invisible politics of Bitcoin: governance crisis of a decentralised infrastructure
• Ethereum: the decentralised platform that might displace today's institutions


Money in the bank
XAVIER LAVAYSSIÈRE - LIBRA

"It surprises me how this project triggers reactions while the underlying issues are already known, such as the current balance of power, skills, and vision between US-based private companies and nation-states."

A.B: The announcement of Libra, now renamed Diem, has provoked strong reactions from political leaders, some claiming that it is an infringement of their sovereignty. Should governments forbid this project?

X.L: Since their inception, cryptocurrencies have been a political and cultural movement. Bitcoin was designed to provide a currency without third parties, building on decades of technological activism such as the cypherpunks and open source. Libra was inspired by every aspect of this movement: technical architecture, decentralization vocabulary, financial inclusion claims, and privacy concerns, without a specific focus. There are legitimate concerns to be raised about the economic and geopolitical risks of the project. However, at this stage, it relates more to the symbolism of sovereignty than it poses a precise threat.

Privacy in payment and financial services is also already confronted with ever-tightening AML regulation. The current state of the payment and financial industry is not "good enough for this day and age", as Mark Carney, Governor of the Bank of England, points out. Sovereignty is not a given, but rather a reflection of the ability to act.

More generally, an approach that uses regulatory threats to forbid a specific project from a specific actor, without a clear legal basis, calls our model into question. Compared to emerging powers that embrace statism, and given the rise of protectionism, what is our philosophy regarding economic freedoms? What is our strategy to promote and balance public good and innovation?

A.B: So could we see Libra as a new take on cryptocurrencies' objectives?

X.L: Libra arrives at an interesting turning point in this industry. On the one side, you have cryptocurrencies such as Bitcoin or Grin, driven by passionate people, geared toward censorship resistance and privacy. You could compare them to the Tor browser, a privacy-enhanced web browser, or Signal, a messaging app recommended by Edward Snowden. Their adoption is slow but steady
as it suits a particular set of needs. Surprisingly, most regulators now seem to consider them a given, maybe because of their marginal adoption or their resilience.

On the other side, you have had many attempts to adapt the underlying techniques to the financial industry. Blockchains were presented as a pure technology with a confusingly broad scope of applications. Governmental institutions are still enthusiastic about their potential but often miss the cultural element. More importantly, restructuring the financial industry is a challenge that will need more than a few startups.

Despite its flaws, Libra brings legitimacy to the objectives and a particular approach. For instance, the choice has been made to develop the underlying layer in an open-source manner and to discuss openly the institutional elements of the project. While it is still unclear how much openness the project will retain in the end, it shows that private companies can see interest in community-driven methods. On the institutional side too, this project suggests that profound reforms could be led, invigorating prior ideas such as central bank digital currency.

A.B: So it reinforces existing ideas, but does it have a chance to succeed?

X.L: If we look more closely, the project itself is actually twofold, with the Libra currency and its technical architecture on the one hand, and the Calibra wallet, a payment solution by Facebook, on the other hand.

As a strategic move for Facebook, it is already a success. It has positioned Facebook as a potential leader in the financial industry. Press coverage has enabled the launch of a payment solution. However, looking at similar projects, end-user adoption is still likely to take years and to vary regionally.

As a global financial architecture, the project suffers from the tension between conflicting objectives.
Financial inclusion itself is a very different topic when talking about US low-income households versus the need for low-cost international money transfers. My intuition is that we can expect more turns along the road, but there are enough needs in this ecosystem for the project to find, in one form or another, a suited purpose.

To Go Further
• Libra Compendium - The most comprehensive independent report on Libra, a digital asset initiated by Facebook and inspired by cryptocurrencies.


The cat is out of the box
ELHAM KASHEFI - QUANTUM COMPUTING

"We are here for a long marathon, not a short sprint, for both hardware and application design"

A.B: The discovery of quantum physics during the 20th century rocked the scientific world and led to major inventions such as the transistor or the GPS. How would you describe the current breakthrough in quantum studies and the consequences attached to the use of quantum computers, or quantum cryptography?

E.K: We are witnessing the second quantum revolution (as coined by Alain Aspect): every aspect of information processing and communication (security, privacy, computational speed, precision, cost, etc.) will be quantumly enhanced. It is not just a matter of better machines emerging; we are embracing a radically different way of manipulating data, enabling us to probe new territories, from the simulation of complex many-body systems to discover exotic materials, all the way to informationally secure communication for unhackable, verifiable, private cloud computing.

A.B: The French congresswoman Paula Forteza states that quantum technology is a "technological turning point" that France cannot miss. What is your take on France's chances to become a European, or even a world, leader on this matter?

E.K: France has it all: skills, knowledge and person-power accumulated over decades of French leadership in the academic domain; large-scale industries and a growing number of smaller startups committed to turning quantum knowledge into quantum innovation; and, most importantly, France is well integrated within the European quantum ecosystem for a global influence. So now all we need is a quantum government to connect the puzzle pieces in a coherent national program, as proposed in the quantum technology report.

A.B: The calculation capacities offered by large-scale quantum computers are potentially endless, but they are yet to be invented. Which short-term opportunities and challenges do you see revolving around quantum technology? Can you tell us more about VeriQloud on this matter?

E.K: I fully agree on the term "yet to be invented": we are here for a long marathon, not a short sprint, for both hardware and application design (as emphasized by Christopher Monroe). However, Noisy Intermediate-Scale Quantum technology (coined by John Preskill), i.e. on the order of hundreds of imperfect quantum bits, which is becoming available in the next few years, already promises a suite of applications in quantum simulation (e.g. quantum materials and quantum chemistry) and in quantum computing (e.g. machine learning and quantum optimization).

In quantum communication, we already have commercially available quantum key distribution and random number generators, with intermediate-scale quantum networks already running in China, Japan, the UK, Europe and elsewhere. In the short term, VeriQloud's work aims to strengthen networks using quantum resources. We are developing a hybrid classical-quantum cloud with better security and efficiency using today's quantum technology.

But VeriQloud's full stack approach is also ready for the next generation of quantum networks, as targeted in projects such as the Quantum Internet Alliance. This will enable new quantum applications, which are collected in the Quantum Protocol Zoo to give a full picture of future evolutions of quantum networks.

To Go Further
• Rapport Quantique - January 9th, 2020, presentation of the report « Quantique, le virage technologique que la France ne ratera pas »
• Note on quantum and post-quantum cryptography by Cédric Villani - July 2019, « Technologies quantiques : cryptographies quantiques et post-quantiques » (n°18 - juillet 2019)
• LIP6 Homepage - About LIP6, the computer science laboratory of Sorbonne University's Faculty of Science and Engineering


The "Coup Data"
BY ADRIEN BASDEVANT

"Technological choices influence our daily lives, yet they largely escape democratic debate. We are experiencing a "Coup Data", a power takeover by those who process data. Only an enhanced digital culture will enable us to understand this paradigm shift. Above all, it is a question of governance and sovereignty: both at the level of individuals wondering how to regain control over their data, and at the level of institutions competing with the advent of big platforms. More than ever, these societal issues call for a European awareness."

1. Data, the algorithms that process it, and the platforms that concentrate it have become so important in our lives and economies that they compete with the traditional power of states and influence the choices of their citizens. Every microsecond, these iterative sequences of computer instructions suggest solutions to both simple and complex problems. From eligibility for credit, a job or an insurance policy, to the distribution of students in university courses at the end of high school, algorithms anticipate our actions, direct our interests and administer our lives, making data the new keys to power. Our social interactions are thus framed, governed by algorithmic systems whose logic can be opaque, unexplained, or inexplicable to us. In this context, who governs? The legal code or the computer code? Who elaborates and controls the norm? The paradigm shift is radical. States are facing a clash of sovereignty.

2. The digital giants oppose national laws with their own constitutions, the famous Terms and Conditions (or Terms of Use). The most recent example is Twitter's decision on whether or not to suspend the President of the United States, after the events at the Capitol. Many debates will unfold around such issues, since platforms are granted or take on the role of private regulators, with quasi-regalian prerogatives and soft-power influence.

3. Fully understanding this phenomenon requires a historical perspective, a legal analysis, and a political projection. The history of data bears witness to the irresistible rise of the culture of numbers to model a reality that goes beyond the human (I). This evolution explains the gradual transition from the reign of law to governance by calculation and algorithms (II). In this context, the challenge before us is to maintain the irreducible singularity of citizens (III).

4. How have we come to rely collectively on numbers to direct our choices and establish laws by correlating our political decisions with arithmetic calculations? To understand this evolution, we must quickly go back over the history of data culture and make the link - still insufficiently studied today - between the emergence of statistics and the contemporary development of artificial intelligence algorithms.

Part I: From descriptive statistics to predictive algorithms


The appearance of data is not a recent phenomenon. In fact, data have been circulating for tens of thousands of years, ever since mankind had to count resources, hazardous phenomena, everything that could be enumerated. The field of enumeration gradually expanded over time. Wishing to know its territories and populations, the State began to count its living forces, whether with a view to starting a war or distributing taxes. Thus, the link between data and power, between numbers and politics, has always existed. As such, it is not insignificant to observe that status etymologically means "inventory", while statisticus refers to what is "relative to the state". The birth of statistics is directly correlated with the creation of the modern state, which the English philosopher Hobbes referred to as the "artificial man".

5. From the 17th century onwards, "political arithmetic" therefore resorted to numbers to dispel arbitrariness. Statistics are an objective measuring instrument, making it possible to know a social reality that was previously inaccessible and complex. The development of mathematical tools of analysis and quantification (average, dispersion, correlation, sampling) to capture a supposedly uncontrollable diversity remains today the foundation of artificial intelligence, which is nothing other than an advanced form of statistics, with self-learning algorithms that have become not only descriptive but prescriptive, even "predictive".

6. In contrast to traditional statistics, the analysis of "big data" takes everything into account. Data scientists pursue rules of association, repetition, and patterns. Information is collected and decisions are induced by correlation, which replaces causality. Big data thus carries the belief that it is possible to access reality directly, by collecting and measuring all the signals, without having to interpret or question their contents. We rely on them in an almost divine way.
This revolution derives its power from its apparent innocuousness, because it has to do with governing the potential (what might happen) rather than the actual (what happens). Hazard and risk are now relegated to the subjective realm of belief, which should be discarded in favor of a digital truth.

7. The desire to put an end to all uncertainty is reinforced in a security society, characterized by a crisis in the representation of its institutions and the constant application of the precautionary principle. To maintain a semblance of control, we rely more and more on algorithms and their data. We come to govern from a statistical expression of reality that is no longer interested in causes and intentions, and to erase people's very existence. The logic of pre-emption is used to model the environment to prevent any risky behavior from occurring. The bad payer will be excluded from the field of credit. Anyone who crosses the path of a citizen showing the signs of Covid-19 would be asked in advance not to go to work or to his or her local food store; dangerous individuals would be turned back upstream of the stadium or concert hall. By wanting to preserve us from uncertainty, this logic of pre-emption evacuates any difference.

8. In these circumstances, who will be the new excluded? Should we be reduced to our data and apprehended through a purely quantitative approach? Ignorance of the causes of algorithmic measures carries the risk of repeating prejudices, of confining us to the unquestionable, of resignation, and thus of the disappearance of the public space. Now, politics is precisely linked to decision-making under uncertainty. Digital technology must not contribute to erasing the public space by claiming to put all our actions into equations. When big data substitutes facts for law, data make the law.
Part II: When data makes the law

The reduction of social facts to mathematical formulae and indicators has always worried philosophers, writers, and scientists, who saw in it a risk of moralizing social life through algebra and calculations. This reflection raises essential questions about chance and free will. This tension itself refers to the old fantasy of substituting calculation for law, the ancestor of the contemporary one of replacing the legal code by the computer code, and why not tomorrow the judge by the machine. Now, there is nothing more dangerous than to induce in order to deduce. The risk would then be to see the deviation from a so-called social or economic ‘norm' defined by a computer code, unrelated to the legal ‘norm', sanctioned tomorrow. A data subject would no longer be punished for his or her actions, but for his or her profile in each situation. What will happen when the data identifies criminals even before they have committed their crimes? What will be left of the presumption of innocence for those with a 93% chance of re-offending? It would then be a matter of sliding imperceptibly from the beginning of execution to the preparatory act, and then from the preparatory act to the potential to commit a crime. By an extraordinary shortcut, the newest of technologies would then join the oldest of criminology. Predictive digital systems would make it possible to identify the potential criminal, based on criteria of dangerousness, i.e. individuals' possible actions in the future, and not on the evidence of guilt, which requires proof of the facts committed. In the same way, morphological characteristics were supposed to make it possible to identify the born criminal, according to the father of Italian criminology, Cesare Lombroso.

9. The risk our society faces is to be governed by technological choices that are not subject to any democratic debate. Since the standard is integrated in real time into the computer code of the platforms we use on a daily basis, our behaviour is determined without having been preceded by any social and political dialogue. However, it is the decision-making process that conditions the legitimacy of the social body's membership. Therefore, the question of democratic deliberation is more than ever raised. The ability to deliberate and to make one's voice heard as a citizen, i.e. as a "subject of law", is being undermined by this "Coup Data". Contrary to the law, algorithms do not consider people as subjects of law, but apprehend them as fragments of data belonging to a vast flow that must be represented, calculated, and modelled.

10. Let us beware of being determined by the data. Let us guard innocence from its presumption, chance from its virtue, creativity from its freedom. We must be careful, at a time when the cold, efficient, objective and fluid governance of algorithms is on the horizon, to be able to collectively create a common imaginary into which to project ourselves. Because what is characteristic of man is precisely this capacity for belief and imagination. This is what is at stake.
Part III: Maintaining the irreducible singularity of citizens

If the feeling took hold that everything was played out before one had even lived, and that everyone's life was locked into systems of selection or even discrimination, then adherence to democracy would collapse. The challenge that artificial intelligence poses to us is precisely the restoration of humanism in our societies, where it is weakening, where it no longer exists, or is in the process of disappearing. Paradoxically, this is man's good fortune. The challenge before us is to promote the irreducible singularity of each person. To build the conditions for a Digital Democracy, so as not to suffer the "Coup Data" but to appropriate it.


Conclusion: Promoting a European approach to the digital world

11. To understand the impact of data on our society, we have therefore created the analysis platform www.coupdata.fr. Its objective is multiple:
a. to promote the European vision of digital by highlighting the views of a new generation of pioneers;
b. to propose public policy recommendations for the construction of a digital Europe;
c. to answer the operational questions that entrepreneurs are asking themselves, so that the law is no longer seen as a brake on innovation but as an opportunity;
d. to defend individual freedoms in the digital age.

12. It is above all a question of governance and sovereignty. Both at the level of individuals, who are wondering how to keep or even regain control over their data, and at the level of institutions. A succession of landmark cases bears witness to this: Snowden, Cambridge Analytica, Schrems... Examples will keep on increasing: from the inflation of cyber-attacks, debates on the hosting of health data, facial recognition, and the application of competition law to the digital giants, to the responsibility of platforms for third-party content, and the possibility or not of carrying out data transfers to States that do not have a legal framework for the protection of individual freedoms and fundamental rights. Citizens and data subjects could suffer from this "Coup Data" if it comes down to being summarized and governed by the simple aggregation of our data; or, on the contrary, it could constitute a formidable opportunity to regain control over our data. This requires acculturation and digital literacy. This is our ambition: to promote, through a multidisciplinary approach, a critical and constructive dialogue between citizens and our digital society.


ABOUT BASDEVANT AVOCATS

Basdevant Avocats is a French law boutique dedicated to innovation and the data-driven economy. Working at the intersection of law, technology and policy, we are passionate about cases of first impression, those that have not been addressed yet, about new frontiers.

Acting as entrepreneurs and pioneers, specialists in the regulation of innovation (artificial intelligence, blockchain, cybercriminality, data, platforms, marketplaces), we advise startups and defend civil liberties in the information age.

In a world as complex as a Rubik's Cube, the issues become more and more specific, which requires assembling, for each project, complementary skills to display the desired color, depending on the problem to be solved. How can we offer the missing brick of such a puzzle? By becoming a multi-faceted "cube". Lawyers need to be creative to fully grasp these legal, but also technological and economic, challenges. Hence the need to cultivate a multidisciplinary approach and learn to work with other "cubes": developers, crypto experts, marketing, fintech, crisis communication... to compose a customized solution.

Let's sail together on these new adventure playgrounds. With you, we will create here the law of tomorrow.

ABOUT ALTERMIND

Altermind is a boutique specialized in strategy consultancy. We bring the worlds of business know-how and academia together to help companies prosper. With AI experts from MIT, Stanford University and the Toulouse School of Economics, we have developed an internal algorithmic tool that is capable of finding and ranking the best specialists in the world, among a constantly updated database of more than 13 million authors and nearly 20 million publications in AI, economics and strategy.
Our agile and unique approach leverages new insights from academics and new ways of thinking from business specialists to produce new and creative solutions. Thus, our global team of business consultants and academic experts provides corporate leaders with tailored, multi-faceted and actionable advice. This unique methodology gives them the edge in today's complex and fast-changing business environment.

BASDEVANT AVOCATS
2 RUE DES HAUDRIETTES, 75003 PARIS, FRANCE
ab@basdevant.tech
www.basdevant.tech
www.coupdata.fr

Design: Sophie Hanoun
Illustration cover: Théophile Sutter