
Tag: NetzDG

Expert Statement for the D21 Index 2021/22 on the State of the Digital Society

The use of social media now permeates all parts of society and takes up an ever larger share of our everyday communication and information. Yet phenomena such as hate, incitement and targeted disinformation there also confront us with immense challenges. One thing is clear: tech corporations need tech regulation. Unfortunately, in Germany the debate about regulating social media platforms – especially about taking measures against hate and incitement – focuses too heavily on content. Of course, content is an important factor; after all, it is what carries the death threats, insults and racist agitation.

But we will never get anywhere if we deal exclusively with the regulation of content. Language and opinion are not only complicated – especially when it comes to the often difficult assessment of what is still lawful expression of opinion and what is not. Far too much content is also posted every second for us to be able to review it all adequately. That does not mean – to be absolutely clear – that this content should not concern us. The constitutional state absolutely must take firm action, and authors must answer in court for their allegedly unlawful content.

So why the focus on tech regulation?

The main problem with social media platforms is that their algorithmic systems amplify hate, incitement, disinformation and conspiracy narratives and deliberately push them to people. In addition, they often have only inadequate mechanisms to strengthen users' rights when content has been removed or accounts have been blocked unlawfully. In Germany, rigid deadlines and time pressure make it harder to weigh carefully for which content and accounts a block is legitimate and for which it is not. This repeatedly leads to so-called "overblocking". Reactivating such posts or user accounts is a lengthy and laborious process. That is why, especially in the digital sphere, we need legislation that focuses on structures, processes and consumer rights.

The Digital Services Act (DSA), as things currently stand, looks set to deliver exactly that. So that it does not become a toothless tiger, it is important that we engage with it more in the public and political debate. Anyone who talks about necessary regulation in response to hate and incitement on Telegram and in social networks must therefore not ignore this piece of legislation from Brussels. With the DSA, the European Union will set global standards and, in the best case, help to reduce the spread of hate and incitement worldwide.

This text appeared as an expert statement in the D21 Index on the state of the digital society, published by the Initiative D21.

Against Hate Online: "Misogyny and the Degradation of Women Run Deep in Our Society"

The latest high-profile incident happened on Instagram: thanks in particular to the Berlin influencer Louisa Dellert and the feminist activist Kristina Lunz, we could witness what women regularly have to endure.

Online, they are showered with threats – usually by the incited followers of a single user. In this case it was the comedian Hendrik Nitsch, better known as "Udo Bönstrup", who a few days earlier had mocked a statement against sexist hate messages. This time, however, the women affected fought back loudly.

Women who speak out publicly against misogyny are especially often the target of attempts to silence them. And this current case shows that there is still a great deal to do when it comes to digital violence. Remarks from others that one should not get so worked up about it contribute to this so-called silencing and make the experience of violence worse. Too many women simply endure this digital violence. That must not be.

Did you know that 70 percent of girls and young women in Germany are exposed to threats, insults and discrimination on social media? That was the finding of a study by Plan International among 15- to 24-year-olds.

Less political engagement because of threats and abuse

The latest study by the Initiative D21, "Digitales Leben", also concludes that women suffer more often from (frequently sexualised) harassment online. According to the study, women make less use than men of the opportunities for social and political engagement on social media.

The European Academy for Women in Politics and Business (EAF) has found in its analyses of local politics that the threats and abuse women face in the digital space account for a considerable part of their lower engagement in politics. According to an analysis by the Guardian from 2016, the hostility intensifies even further when the person has a migration background.

These are alarming findings. They show that misogyny is still widespread. We must not forget that this hate comes from people who also act in the analogue world; they are not different people there. We must offer everyone a safe space – analogue as well as digital – in which they can express themselves free from discrimination and thereby also, but not only, take part in democratic discourse.

Seven proposals for a safe digital space

To make a safe digital space possible, we must take decisive action against digital violence. We see these seven proposals as an important start:

1. It starts with solidarity! And not only from women. Violence against women is not a women's problem but a problem for society as a whole – usually a men's problem. Perpetrators must be deprived of the feeling that they have power over their victims. They must be made to understand that their behaviour – whether criminal or not – will not be tolerated by an open society.

2. Hate speech is an experience of violence. Victims need special support, also and especially in criminal or civil proceedings. Contact points where they can get help beforehand and learn about their legal options as well as measures for their personal protection are essential – and we need more of them. In cases of serious defamation, victims should also be provided with a victims' counsel and psychosocial support during the proceedings.

3. Online crimes must be prosecuted more effectively by specialised units in the police and the judiciary. Central units at the public prosecutor's office, such as those already set up in North Rhine-Westphalia, are useful for this. Police and judiciary must have the technical expertise and state-of-the-art equipment for the digital space. They must also be able to assess the scope of attacks and take them seriously. Suggestions such as simply deleting one's Twitter or Instagram account must not be made.

There should be e-courts for fast proceedings

4. It must be possible to file criminal complaints easily, online and anonymously. In civil proceedings, injured parties must be able to pursue the case without stating their private address, using the address of the advising law firm or NGO instead. According to those affected, courts do not always accept this. But it is essential, because otherwise the perpetrator could learn the private address through access to the case files.

5. The unilateral removal of content by the platforms without any proceedings, as regulated by the Network Enforcement Act, is not effective. Simplified online proceedings, so-called e-courts, can speed things up. Time is of enormous importance in such cases. Online proceedings can not only relieve the judiciary; prompt judgments also have a lasting effect on perpetrators.

6. Police and judiciary must cooperate effectively and quickly with the platforms and with NGOs. NGOs such as "HateAid", "Hassmelden" and "Ich bin hier" should be given the opportunity to forward criminal complaints to the public prosecutors digitally. They have the expertise in securing evidence and can thus quickly support the prosecution and help those affected.

We need a strong state and responsible platforms

7. Platforms like Instagram bear responsibility. They must enforce their own rules and must not tolerate orchestrated hate. They must deploy more content moderators to review reported content. Platforms should also publish how many moderators they employ. Training on gender-specific hate and silencing must be mandatory for them.

There is still much to do. When it comes to problems in the digital space, we must never forget that they do not happen independently of the analogue world. Our justice system must be prepared for this. We need a strong and modern constitutional state that can protect victims in the digital space as well. Here it is above all the federal states that must do better. Calls for tougher laws or addressing the platforms alone fall short, because the problem is comprehensive. Misogyny and the degradation of women still run deep in our society. Let's tackle it!

This guest article first appeared in the Tagesspiegel (online and in print), co-authored with Maren Jasper-Winter, on 25 November 2020.

Ready to set Sails — A Vision for Europe

I gave this speech as a keynote at the Transcultural Leadership Summit in Friedrichshafen on 15 November 2019. I am publishing my whole vision here because, due to time restrictions, I had to shorten it for the speech. Special thanks again to the students who organised the summit, and especially to Kristi Grund and Laura Trattner, who allowed me to talk about this important topic.

In the year of the Fridays for Future movement against the climate crisis, I gained my Frequent Traveler status. I'm not proud of that. Rather, I am humbled and thankful. I learned how privileged I am as a white German woman, how luxurious it is to own a German passport. I think we often forget that. I am thankful because I have several sponsors who financed these journeys, mostly the political foundations here in Germany and first and foremost the Friedrich-Naumann-Foundation for Freedom. They also made it possible for far more people from all over the world to come to Germany and discuss with several organisations, including mine, a topic that in my opinion is as important as the climate crisis but barely on the table: civil liberties.

I was asked to give a keynote about a vision for Europe regarding Europe's challenges and opportunities concerning digitalisation. As this is a summit on transcultural leadership – and digitalisation and the internet do not end at national borders, nor at those of the European Union – it is today even more important to think about the German and the European role in the world as a role model, as well as to work with others and be inspired by their ideas and experiences.

I'd like to quote the Islamic scholar Abū ʿAbdallāh Muhammad Ibn Battūta, who said: "Travelling – it leaves you speechless, then turns you into a storyteller."

So here is my story:

Russia is a beautiful country. I didn't expect that, honestly. I learned that minus 17 degrees isn't as cold as one would guess and that the special blue of rivers in Siberia can't be captured by my iPhone's camera. I learned about the beautiful tradition of giving a toast at the dinner table, and that politicians and political actors from the opposition are sometimes prevented from holding a conference by conveniently timed crackdowns on hotels across a whole area. I learned that people are afraid to join political discussions of liberal parties where they are asked to register in advance. They are scared because they work for the police, and therefore for the state, and they don't know what might happen with that data later on.

Russia is not only famous for its cyber-attacks, which are quite likely driven by the Kremlin, or for the "Internet Research Agency" in St. Petersburg, which manipulated the US election with its troll army – very probably also commissioned by the Kremlin. Russia is also, sadly, famous for the pressure on and the criminalisation of journalists and activists who are fighting for press freedom and freedom of speech, against surveillance, against corruption, and for LGBTQ rights.

The Kremlin copied a law which helps it to ban unlawful content easily from the web. "Unlawful" can mean caricatures of Vladimir Putin or LGBTQ content. The copied law is the German Network Enforcement Act, or Netzwerkdurchsetzungsgesetz, NetzDG for short, also well known as the "Facebook Bill". It is a law launched by the former Minister of Justice Heiko Maas and criticised by several organisations such as Reporters Without Borders, the Amadeu Antonio Foundation, several business associations, the Association of Journalists in Germany, Wikimedia, the Open Knowledge Foundation and several others, including mine. Among them even the UN Special Rapporteur for freedom of expression, David Kaye. Think about that: a German law gets criticised by a UN Special Rapporteur for freedom of expression.

This law was installed to do something against hate speech, but after almost two years we see clearly: it doesn't help. Hate is often not unlawful. Death threats are, that's right – but hate? Hate speech is not even a term defined in German law. And the decision of the Landgericht Berlin in the case of the politician Renate Künast showed that judges make decisions most people, and most lawyers, cannot understand. And we want private companies to decide within 24 hours what is unlawful and what is not? Want them to be forced by law to delete what is, in their opinion, unlawful? Renate Künast has the right to appeal her judgment, and luckily she did. If Facebook, Twitter or YouTube remove some content because of the Network Enforcement Act, you have no right to appeal against that. This is not how the rule of law should work in a constitutional state. But this is the law that Russia and several other countries that are, according to Freedom House, "not free" or only "partly free" have copied.

Moreover, several of these countries, including Venezuela, Vietnam, India, Russia, Malaysia, and Kenya, require intermediaries to remove vague categories of content that include “fake news,” “defamation of religions,” and “anti-government propaganda,” and many of them include overly broad definitions of hate speech that go much further than the German law. Responding to criticism, Kremlin representatives argued that false information “is regulated fairly harshly in many countries of the world including Europe. It is therefore of course necessary to do it in our country too.”

We need better laws that help against unlawful content on platforms. Journalists like Margarethe Stokowski and Richard Gutjahr have shown us in several articles and speeches that the Network Enforcement Act neither helps them nor brings perpetrators to justice.

In Russia, one can only access "free" wifi after registering in advance with a telephone or passport number. Of course, one only gets a SIM card by presenting one's passport – which, by the way, has been the case here in Germany for a few years as well. Anonymous web surfing is therefore nearly impossible. The Kremlin wants to have its own internet, the RuNet. It just passed a bill that is supposed to protect the country from cyber-attacks but instead enables the Kremlin to expand its surveillance measures. With its own Domain Name System – imagine it as the telephone directory of the web – the state can route someone to whatever website it wants. For example, if you type in google.com, the state can route you to Yandex, the Russian search engine. It is also planning to install so-called Deep Packet Inspection at every Russian internet provider. This allows the Kremlin to look into each data packet that is sent through the internet, to track it and to slow down the speed of the internet.
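
To make the DNS point concrete, here is a toy sketch in Python (the IP addresses are invented placeholders from the documentation range, not the services' real addresses): whoever controls the resolver controls which address a domain name points to, so the same query can be answered honestly or redirected.

```python
# Toy model of DNS resolution: a resolver maps domain names to IP addresses.
# The addresses below are placeholders, not the services' real addresses.

HONEST_DNS = {
    "google.com": "203.0.113.10",  # placeholder standing in for Google
    "yandex.ru": "203.0.113.20",   # placeholder standing in for Yandex
}

# A state-controlled resolver can answer the same query with a different
# address, silently sending visitors of google.com somewhere else.
MANIPULATED_DNS = {**HONEST_DNS, "google.com": "203.0.113.20"}

def resolve(name: str, dns_table: dict) -> str:
    """Return the IP address this resolver claims belongs to `name`."""
    return dns_table.get(name, "NXDOMAIN")  # NXDOMAIN: name does not exist

print("honest resolver:     ", resolve("google.com", HONEST_DNS))
print("manipulated resolver:", resolve("google.com", MANIPULATED_DNS))
```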

From Russia to the land of the free, the United States. Maybe we can't talk about digitalisation in Europe without looking jealously at the Bay Area, Redmond, and Seattle. I haven't been to any of these places. I have been to Portland, Oregon, on the re:publica sequencer tour, where I spoke about civil rights and digitalisation.

When we talk about the US and how it shapes the digital economy, we barely talk about the algorithmic systems used by the government – systems that surveil its people, that make wrong decisions on welfare, that support unfair decisions in court, and that promote a racial bias against Black Americans.

A bias per se is nothing bad. We are all biased. Everyone who says he or she isn't is entirely wrong. And that is the problem with bias: being unaware of it, not being able or willing to reflect and act on it. And, even more problematic, not being able to do something against a biased decision. This is a problem we see often, not only in the US but also in other countries, such as Australia, and also in Europe. I will present you one example from the United States.

COMPAS is an algorithmic decision system that helps judges decide whether a defendant should go to prison or can be released on bail. It doesn't use the data point "race" – in line with the law – but discriminates against it anyway. The problem is fairness. Yes, fairness, a word like so many others that we use often but rarely think about what it means. What is fair?

The COMPAS system scores each defendant and recommends pre-trial detention for anyone with a risk score higher than seven. Seems fair, right? But in reality this means that far more Black people are detained even though their individual probability of reoffending is lower. Specifically, 45% of Black defendants are false positives, while only about half as many white defendants are. Conversely, the false-negative rate for white defendants – those who offended again even though the system said they wouldn't – was 48%.
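
To illustrate the arithmetic behind this, here is a small sketch with invented numbers (not the real COMPAS data or scoring model): one shared cut-off applied to two groups whose score distributions differ produces clearly different false-positive and false-negative rates per group.

```python
# Sketch with invented data: one shared risk-score cut-off, two groups whose
# score distributions differ, and the resulting per-group error rates.

def error_rates(records, threshold=7):
    """records: list of (risk_score, reoffended) pairs for one group."""
    fp = sum(1 for s, reoffended in records if s >= threshold and not reoffended)
    tn = sum(1 for s, reoffended in records if s < threshold and not reoffended)
    fn = sum(1 for s, reoffended in records if s < threshold and reoffended)
    tp = sum(1 for s, reoffended in records if s >= threshold and reoffended)
    fpr = fp / (fp + tn)  # flagged high risk although they did not reoffend
    fnr = fn / (fn + tp)  # rated low risk although they reoffended
    return fpr, fnr

# Invented example records, purely for illustration: group A is scored
# higher on average than group B.
group_a = [(9, False), (8, False), (8, True), (7, True), (6, False), (5, True)]
group_b = [(7, True), (6, False), (5, False), (4, True), (3, False), (2, False)]

for name, group in [("A", group_a), ("B", group_b)]:
    fpr, fnr = error_rates(group)
    print(f"group {name}: false-positive rate {fpr:.0%}, false-negative rate {fnr:.0%}")
```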

So it would instead be "fair" to set a separate threshold for each race and keep the error rates low for both groups – but that is against the 14th Amendment of the US Constitution, and of course for good reasons: there shouldn't be laws or thresholds per race. There are two definitions of fairness: keep the error rates comparable between groups, and treat people with the same risk score in the same way. Both of these definitions are totally defensible! But satisfying both at the same time is mathematically impossible once the groups' underlying rates of reoffending differ.

But aren’t humans making the same mistakes? Yes, they do. And how they make their decisions is also not transparent. But COMPAS, which is made by a private company, is a trade secret that cannot be publicly reviewed or interrogated. Defendants can no longer question its outcomes, and government agencies lose the ability to scrutinise the decision-making process. There is no more public accountability.

The more significant question about using systems like COMPAS – or any algorithm that ranks people – is whether they reduce existing inequities or make them worse. We shouldn't overestimate the power and "neutrality" of technology, nor its "intelligence", especially when it comes to decisions about human beings, which need more intelligence than statistical approximations based on data from the past. Please keep that in mind.

The discussion about ethics in digitalisation needs to be more holistic than just about the algorithm itself. It's about the data, the definitions we make in advance and the environments in which we implement these systems. In my opinion, creating a valuable digital world is about more than financial efficiency. It's about ensuring values such as comprehensible decisions by the state that one can appeal. It's about the rule of law and humans deciding about humans. Algorithmic systems should be used very carefully in this area.

Most impressive was my journey to Hong Kong at the beginning of this year. You might not have seen it on the map because it's too small, but I didn't want to mark China, as the two "countries" are very different. I was also lucky to meet Hongkongers from the front line in Berlin last week. They are showing us what happens when a digitalised state and public sphere is used to suppress its people and is no longer in the hands of democrats.

Because they are afraid of who gets access to the data, they stopped using their Octopus card in Hong Kong, a card with which you can buy nearly everything – especially in the 7-Eleven stores – and pay for your rides on public transport. It is connected to your credit card, and therefore one can find out who exited the metro stations near the protest sites and who probably took part in them. They pulled down surveillance cameras with face recognition because they didn't have the feeling that they could exercise their democratic right to demonstrate without the fear of suppression later on.

The activists deleted the Chinese app WeChat from their phones and made Telegram their messenger of choice, because one can use it without revealing one's own telephone number. This is necessary in case there are moles in the group or phones get confiscated by the police. Staying as anonymous as possible is essential for their fight right now. They also switched to Telegram because they need encryption they can rely on, and Telegram responded to their requests to change some features for Hongkongers so they can act safely and according to their motto: be like water.

"Be like water" means several things for them. This movement is different from the Occupy movement. They "flood" areas and disappear again. They have no leader; they use open-source technology to coordinate their protest and to vote on the next actions. They are crowd-sourced and receive many financial and material donations.

And they use Apple's AirDrop to communicate with each other at the places where they gather and to share the next plans. AirDrop works via Bluetooth and only with people near you. And it keeps working during internet or mobile network shutdowns. Hong Kong shows us what happens when democracy is in severe danger: data and surveillance cameras are used against citizens, and secure communication is uncertain.

Please don't get the impression that I am a sceptic of digitalisation. I'm definitely not. I am a huge optimist, and I am willing to shape our digital future for the better. But ensuring democracy and civil liberties is essential if we want to live in a liveable future. Yes, China and the US might be faster in developing and implementing new technologies. But we shouldn't only focus on speed. Instead, we should focus on building resilient democracies for the 21st century. Democratic decisions need time, and that is okay, as long as they are good ones. We should focus more on the quality of policies. That is how a digitalised world will be worth living in.

So here is my vision for Europe:

I want a Europe that uses digitalisation to empower its people and to strengthen democracy.

A Europe that is sovereign but does not separate itself from the world. Instead, it sets standards that open chances for everyone.

I want a Europe that makes laws that protect people from threats but also finds the balance to secure civil liberties.

And a Europe that gives substance to the phrase "ethical digitalisation". Ensuring human rights might be a good start.

Let's start from the back. Why do I include human rights in my vision for Europe? Because too often the already mentioned UN Special Rapporteur for freedom of expression, David Kaye, has to criticise laws from the European Union or its member states. Not only the Network Enforcement Act was criticised by him but also the European Copyright Directive – you might have heard of it or taken part in the demonstrations against Article 13 and the upload filters. The same thing is about to happen as the European Commission goes on with its plans to demand upload filters against terrorist content.

And here we see again what I mean when I say we need to talk more holistically about the implementation of algorithmic systems. Banning "terrorism" is easy, but what exactly is terrorism? When does terrorism start? Just look at the case of Hizbullah and the disagreement between European member states and the European Union itself. Is Hizbullah entirely a terrorist organisation, or just its military arm and not its political one? How should a system based on mathematics and statistics – call it Artificial Intelligence, but AI and algorithms are no more than that – decide what is still political and what is already military? How should it differentiate between propaganda and legitimate coverage? All of this is obviously not only about human rights. The rights to freedom of expression and freedom of information are also enshrined in the Charter of Fundamental Rights of the European Union and, here in Germany, in our Grundgesetz.

The same goes for the human right to privacy. I don't want a European Union or European member states that force messenger services to put backdoors into the encryption of messages. Here, too, mathematics proves to us that it is impossible to weaken encryption only for potential terrorists and not for everyone else. Just look at the people of Hong Kong and how urgently they need reliable encryption in their fight for democracy and civil liberties. Look at all the people who are fighting authoritarian regimes in their countries, and look at all the businesses with their legitimate interest in keeping their business secrets. All of their communication might be accessed by criminals or authoritarian systems. The European Union must be a role model in ensuring human rights like these. These are our values; this is the ethical digitalisation we need to push forward.

Europe also needs to push forward regarding digital sovereignty. That means not being dependent on China and the United States for hardware and software. We rely on both and are barely able to produce them on our own terms and under our own control. We see the problems now with 5G technology, cloud services and the dependence on software in our administrations. We learned through the revelations of Edward Snowden that the United States is spying on us, and we can expect the same from China. We see what happens when Donald Trump puts sanctions on states and software companies are no longer able to deliver software or updates to some countries. Digital sovereignty does not mean that we should not use hardware or software from other parts of the world. But it means that we should be able to switch hardware and software easily, diversify it for security reasons and not be entirely dependent on others. Promoting open-source software, open and standardised interfaces, and security standards is a good start, as is fostering a European industrial policy for hardware.

The Arab Spring was long ago. We no longer see social media as a tool to empower people. Instead, it is used to influence and manipulate people and elections – and I could give another talk about online manipulation and why we cannot blame social media alone. The fact is that the dream of empowering the powerless through the internet has started to fade. Luckily, the internet is more than social media, and digitalisation is more than the internet. But even including social media and the internet itself, I still believe that we can empower the powerless. I think this will work based on three points:

First, Europe needs to support innovation for the common good and help civil society organisations to prepare for the digital age. We need special funds for that. Thinking about digitalisation solely in terms of the business sector is too short-sighted. And we have amazing organisations here that will create a valuable digital Europe for everyone.

Second, the person who impressed me most is Taiwan's digital minister Audrey Tang. She shows us how we can digitalise democracy, and by that I don't mean voting online. Audrey Tang fosters radical transparency of the government towards the people, offers regulatory sandboxes for innovative companies, and asks the Taiwanese people through online consultations about their thoughts and feelings regarding new business ideas and necessary regulations, so that the acceptance of new laws and businesses is very high. With the Presidential Hackathon, the state invites civil society to invent new tools that make the state better and more efficient, to everyone's benefit. We should be inspired by Taiwan and implement more ideas to strengthen democracy for the 21st century.

Third, a vision for Europe is nothing without each of you. My generation, Generation Y, and the younger one, Generation Z, grew up not only taking the internet for granted, but also assuming that civil liberties are irrevocable. They are not. We need to fight for them every day, not only every four years at the ballot box. We need to show people here in our country, in Europe, and all over the world that we stand with them and support them in ensuring freedom.

Travelling the world and talking to people in their countries and here in Germany helped me to understand the importance of Europe as a role model for democracy and for ensuring civil liberties in general, but especially in the digital sphere. We need to preserve that. Not only for us but for everyone. Upholding freedom and civil liberties is often hard, but it is necessary so that everyone can develop as he or she wants, as long as the rule of law covers it. The author Juli Zeh just wrote for the ZEIT, in one remarkable sentence, what is also my vision. And this is also my pledge to you: democracy – and I add "Europe" – needs democrats; otherwise it dies from within.

Thank you.
