The European Digital and European Digital Range groups
The European Digital group comprises binary options of the so-called European type, which are divided into "Above" options and "Below" options. Such options are also known as "Level" options.
European Digital Above is a binary option often simply called the "Above" option. An Above option pays the buyer its full return if, at expiry, the price of the underlying asset is higher than a set barrier, that is, higher than what classical options terminology calls the strike. If the price is lower, the buyer loses the premium, i.e. the amount previously paid for the option. The seller's result is the mirror image of the buyer's.
European Digital Below is the "Below" option. A Below option pays the buyer its full return if, at expiry, the price of the underlying is lower than the barrier. If the price is higher, the buyer forfeits the premium paid for the option. Sellers of such options have results that mirror the buyer's.
The European Digital Range group comprises binary options of the European type that are divided into "Inside" (in range) and "Outside" (out of range) options. Such options are often called "price corridor" options.
These digital options have an upper and a lower boundary that together form a price corridor. Investors who trade Range options profit from predicting whether the price of the underlying asset will stay within the corridor or break out of it. Investors choose the trigger levels, i.e. the prices of the corridor boundaries, themselves.
European Digital Range Inside, or "in range". A trader who expects the underlying's price to be inside the corridor at expiry buys the option and, if the prediction proves correct, collects the payout. Sellers of Inside options have the opposite result.
European Digital Range Outside, or "out of range". A trader who expects the underlying's price to be outside the corridor at expiry buys such an option and, if the prediction proves correct, collects the payout. Sellers of Outside options have the opposite result in this situation.
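The payoff logic described above can be sketched in a few lines of Python. The function names and the numbers in the example are illustrative assumptions for this sketch, not any broker's actual contract terms.

```python
# Illustrative payoffs for European digital (binary) options, per the
# descriptions above. All names and numbers are assumptions.

def payoff_above(expiry_price, strike, payout, premium):
    """'Above': buyer collects the payout if the underlying ends above
    the barrier (strike); otherwise the premium is lost."""
    return payout - premium if expiry_price > strike else -premium

def payoff_below(expiry_price, strike, payout, premium):
    """'Below': buyer collects the payout if the underlying ends below
    the barrier; otherwise the premium is lost."""
    return payout - premium if expiry_price < strike else -premium

def payoff_inside(expiry_price, lower, upper, payout, premium):
    """'Inside': buyer wins if the price stays within the trader-chosen
    trigger levels (the corridor) at expiry."""
    return payout - premium if lower < expiry_price < upper else -premium

def payoff_outside(expiry_price, lower, upper, payout, premium):
    """'Outside': buyer wins if the price ends outside the corridor."""
    return payout - premium if not (lower < expiry_price < upper) else -premium

# The seller's result is always the mirror image of the buyer's:
buyer = payoff_above(expiry_price=105, strike=100, payout=180, premium=100)
seller = -buyer
print(buyer, seller)  # 80 -80
```

Note how all four variants reduce to the same cash-or-nothing structure: a fixed payout if the price condition holds at expiry, loss of the premium otherwise.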
OLAF investigates fraud against the EU budget, corruption and serious misconduct within the European institutions, and develops anti-fraud policy for the European Commission.
OLAF in figures
Between 2010 and 2019, OLAF:
- Concluded over 1900 investigations
- Recommended the recovery of over €6.9 billion to the EU budget
- Issued over 2500 recommendations for judicial, financial, disciplinary and administrative action to be taken by the competent authorities of the Member States and the EU.
OLAF investigates a wide range of wrongdoings from embezzlement, fraudulent claims and misconduct in public procurement procedures, to customs fraud. Far from being an exhaustive list, these examples seek to illustrate different angles of OLAF’s investigative activity or different key moments in the lifespan of a case.
Drawing on its accumulated knowledge and experience, OLAF helps the authorities responsible for managing EU funds – inside and outside the EU – to understand fraud types, trends, threats and risks, and to protect the EU’s financial interests by preventing fraud of all kinds.
The Hercule Programmes fund actions which aim to prevent and combat fraud, corruption and other illegal activities affecting the EU’s financial interests. Actions eligible for funding include technical and operational investigation support, specialised training and research activities; they are implemented via grants and contracts.
Support our work by investing in a piece of e-clothing!
Your privacy is increasingly under threat. European Digital Rights works hard to have you covered. But there’s only so much we can do.
Help us help you. Help us get you covered.
Check out our 2020 collection!*
*The items listed below are e-clothes. That means they are electronic. Not tangible. But still very real – like many other things online.
Your winter stock(ings) – 5€
A pair of hot winter stockings can really help one get through cold and lonely winter days. Help us to fight for your digital rights by investing in a pair of these superb privacy–preserving fishnet stockings. This delight is also a lovely gift for someone special.
A hat you can leave on – 10€
Keep your head undercover with this marvelous piece of surveillance resistance. Adaptable to any temperature types and – for the record – to several CCTV models, the item really lives up to its value. This hat is an indispensable accessory when visiting your favourite public space packed with facial recognition technologies.
Winter/Summer Cape – 25€
Are you feeling heroic yet? Our flamboyant Winter/Summer cape is designed to keep you warm and cool. This stylish accessory takes the weight off your shoulders – purchase it and let us take care of fighting for your digital rights!
Just another White T-Shirt – 50€
A white t-shirt can do wonders when you’re trying to blend in with a white wall. This wildly unexciting but versatile classic is one of the uncontested fundamental pillars of your privacy enhancing e-wardrobe.
THE privacy pants ⭐️ – 100€
This ultimate piece of resistance is engineered to keep your bottom warm in the coldest winter, but also aired up during the hottest summer days. Its colour guarantees the ultimate tree (of knowledge) look. The item comes with a smart zipper.
Anti-tracksuit ⭐️ – 250€
Keep your digital life healthy with the anti-tracking tracksuit. The fabric is engineered to bounce out any attempt to get your privacy off track. Plus, you can impress your beloved babushka too.
Little black dress ⭐️ – 500€
Whether at a work cocktail party, a funeral, shopping spree or Christmas party – this dress will turn you into the center of attention, in a (strangely) privacy-respecting manner.
Sew your own ⭐️ – xxx€
Unsure of any of the items above? Let your inner tailor free, customise your very own unique, designer garment, and put a price tag of your choice on it.
⭐️ The items of value superior to 100€ are delivered with an (actual, analog, non-symbolic) EDRi iron-on privacy patch that you can attach on your existing (actual, analog, non-symbolic) piece of clothing or accessory. If you wish to receive this additional style and privacy enhancer, don’t forget to provide us with your postal address (either via the donation form, or in your bank transfer message)!
Question? Remark? Idea? Please contact us brussels [at] edri [dot] org !
Your face rings a bell: Three common uses of facial recognition
Not all applications of facial recognition are created equal. As we explored in the first and second instalments of this series, different uses of facial recognition pose distinct but equally complex challenges. Here we sift through the hype to analyse three increasingly common uses of facial recognition: tagging pictures on Facebook, automated border control gates, and police surveillance.
The chances are that your face has been captured by a facial recognition system, if not today, then at least in the last month. It is worryingly easy to stroll through automated passport gates at an airport, preoccupied with the thought of seeing your loved ones rather than with potential threats to your privacy. And you can quite happily walk through a public space or shop without being aware that you are being watched, let alone that your facial expressions might be used to label you a criminal. Social media platforms increasingly employ facial recognition, and governments around the world have rolled it out in public. What does this mean for our human rights? And is it too late to do something about it?
First: What the f…ace? – Asking the right questions about facial recognition!
As the use of facial recognition skyrockets, it can feel that there are more questions than answers. This does not have to be a bad thing: asking the right questions can empower you to challenge the uses that will infringe on your rights before further damage is done.
A good starting point is to look at impacts on fundamental rights such as privacy, data protection, non-discrimination and freedoms, and compliance with international standards of necessity, remedy and proportionality. Do you trust the owners of facial recognition systems (or indeed other types of biometric recognition and surveillance) whether public or private, to keep your data safe and to use it only for specific, legitimate and justifiable purposes? Do they provide sufficient evidence of effectiveness, beyond just the vague notion of “public security”?
Going further, it is important to ask societal questions like: does being constantly watched and analysed make you feel safer, or just creeped out? Will biometric surveillance substantially improve your life and your society, or are there less invasive ways to achieve the same goals?
Looking at biometric surveillance in the wild
As explored in the second instalment of this series, many public face surveillance systems have been shown to violate rights and been deemed illegal by data protection authorities. Even consent-based, optional applications may not be as unproblematic as they first seem. This is our “starter for ten” for thinking through the potentials and risks of some increasingly common uses of facial verification and identification – we’ll be considering classification and other biometrics next time. Think we’ve missed something? Tweet us your ideas @edri using #FacialRecognition.
Automatic tagging of pictures on Facebook
Facebook uses facial recognition to tag users in pictures, as well as other “broader” uses. Under public pressure, in September 2020, they made it opt-in – but this applies only to new, not existing, users.
- Saves time compared to manual tagging
- Alerts you when someone has uploaded a picture of you without your knowledge
- The world’s biggest ad-tech company can find you on photos or videos across the web – forever
- Facebook will automatically scan, analyse and categorise every photo uploaded
- You will automatically be tagged in photos you might want to avoid
- Errors especially for people with very light or very dark skin
- Facebook has been training algorithms using Instagram photos and then selling them
- Facebook tagging is 98% accurate – but with 2.4bn users, that 2% amounts to hundreds of millions of errors, especially for people of colour
Creepy, verging on dystopian, especially as the feature is on by default for some users (here’s how to turn it off: https://www.cnet.com/news/neons-ceo-explains-artificial-humans-to-me-and-im-more-confused-than-ever/). We’ll leave it to you to decide if the potentials outweigh the risks.
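The scale behind the accuracy figure above can be checked with simple arithmetic. The photos-per-user figure in this sketch is our own hypothetical assumption, not a number from Facebook.

```python
# Rough scale of a 2% error rate across Facebook's user base,
# using the figures from the text; photos-per-user is our assumption.
users = 2_400_000_000      # ~2.4bn users, from the text
error_rate = 0.02          # 98% accuracy leaves 2% errors

# Even a single tag per user already yields tens of millions of errors:
print(round(users * error_rate))            # 48000000

# Users appear in many photos; at a (hypothetical) 10 tagged photos
# per user, the errors reach the hundreds of millions cited above:
photos_per_user = 10
print(round(users * photos_per_user * error_rate))  # 480000000
```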
Automated border control (ePassport gates)
Automated border control (ABC) systems, sometimes known as e-gates or ePassport gates, are self-serve systems that authenticate travellers against their identity documents – a type of verification.
- Suggested as a solution for congestion as air travel increases
- Matches you to your passport, rather than a central database – so in theory your data isn’t stored
- Longer queues for those who cannot or do not want to use it
- Lack of evidence that it saves time overall
- Difficult for elderly passengers to use
- May cause immigration issues or tax problems
- Normalises face recognition
- Disproportionately error-prone for people of colour, leading to unjustified interrogations
- Supports state austerity measures
- Stats vary wildly, but credible sources suggest the average border guard takes 10 seconds to process a traveller, faster than the best gates, which take 10-15 seconds
- Starting to be used in conjunction with other data to predict behaviour
- High volume of human intervention needed due to user or system errors
- Extended delays for the 5% of people falsely rejected
- Evidence of falsely criminalising innocent people
- Evidence of falsely accepting people with wrong passport
Evidence of effectiveness can be contradictory, but the impacts – especially on already marginalised groups – and the ability to combine face data with other data to induce additional information about travellers bear major potential for abuse. We suspect that offline solutions such as funding more border agents and investing in queue management could be equally efficient and less invasive.
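The throughput comparison above can also be sketched with back-of-the-envelope arithmetic. The per-traveller timings and the 5% false-rejection rate come from the figures cited in this section; the passenger count and the manual re-check time are our own assumptions.

```python
# Back-of-the-envelope comparison of one guard lane vs one e-gate lane.
passengers = 300            # assumed planeload
guard_seconds = 10          # average border guard, from the text
gate_seconds = 12.5         # midpoint of the 10-15s quoted for the best gates
false_reject_rate = 0.05    # 5% falsely rejected, from the text
recheck_seconds = 60        # assumed manual re-check after a false rejection

guard_total = passengers * guard_seconds
gate_total = (passengers * gate_seconds
              + passengers * false_reject_rate * recheck_seconds)
print(guard_total / 60, gate_total / 60)  # 50.0 77.5  (minutes)
```

Under these assumptions the guard lane comes out ahead even before counting system errors other than false rejections, which is consistent with the "lack of evidence that it saves time overall" point above.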
Police surveillance in public spaces
In what is sometimes referred to as face surveillance, police forces across Europe – often in conjunction with private companies – use surveillance cameras to perform live identification in public spaces.
- Facilitates the analysis of video recordings in investigations
- Police hold a database of faces and are able to track and follow every individual ever scanned
- Replaces investment in police recruitment and training
- Can discourage use of public spaces – especially those who have suffered disproportionate targeting
- Chilling effect on freedom of speech and assembly, an important part of democratic participation
- May also rely on pseudo-scientific emotion “recognition”
- Legal ramifications for people wrongly identified
- No ability to opt out
- UK police force says face recognition is helping make up for budget cuts
- Effectiveness of surveillance is nearly impossible to prove
- Evidence of law enforcement abuse of access to data
- Automates existing policing biases and racial profiling
- Makes legitimate anonymous protests impossible
- Undermines privacy rights, making us less free
Increased public security could be achieved by measures to tackle issues such as inequality or antisocial behaviour or generally investing in police capability rather than surveillance technology.
Facing reality: towards a mass surveillance society?
Without intervention, facial recognition is on a path to omnipresence. In this post, we have only scratched the surface. However, these examples identify some of the different actors that may want to collect and analyse your face data, what they gain from it, and how they may (ab)use it. They have also shown that the claimed benefits of facial surveillance are frequently cost savings for the operator, rather than benefits for the user.
We’ve said it before: tech is not neutral. It reflects and reinforces the biases and world views of its makers. The risks are amplified when systems are deployed rapidly, without considering the big picture or the slippery slope towards authoritarianism. The motivations behind each use must be scrutinised and proper assessments carried out before deployment. As citizens, it is our right to demand this.
Your face has a significance beyond just your appearance – it is a marker of your unique identity and individuality. But with prolific facial recognition, your face becomes a collection of data points which can be leveraged against you and infringe on your ability to live your life in safety and with privacy. With companies profiting from the algorithms covertly built using photos of users, faces are literally commodified and traded. This has serious repercussions on our privacy, dignity and bodily integrity.
Data-Driven Policing: The Hardwiring of Discriminatory Policing Practices across Europe (05.11.2020)
(Contribution by Ella Jakubowska, EDRi intern)
Serbia: Complaints filed against Facebook and Google
EDRi member SHARE Foundation has filed complaints to the Commissioner for Information of Public Importance and Personal Data Protection of Serbia against Facebook and Google for their non-compliance with the obligation to appoint representatives in Serbia for data protection issues. In May 2020, before the start of application of the new Serbian Law on Personal Data Protection, SHARE Foundation sent letters to 20 international companies and called upon them to appoint representatives in Serbia, in accordance with the new legal obligations.
Appointing representatives of these companies is not a formality – it is essential to exercising the rights of Serbian citizens prescribed by law. In the current circumstances, companies like Google and Facebook view Serbia, like many other developing countries, as a territory for unregulated exploitation of citizens’ private data, even though Serbia harmonised its rules with the EU Digital Single Market by adopting the new Law on Personal Data Protection. Namely, these companies recognise Serbia as a relevant market, offer their services to citizens of the Republic of Serbia and monitor their activities. In the course of doing business, these companies process a large amount of data of Serbian citizens and make huge profits. On the other hand, the new law guarantees numerous rights to citizens in relation to such data processing, but at the moment it seems that exercising these rights would face many difficulties.
Among other things, these companies do not provide clear contact points that citizens can contact – they mostly have application forms available in a foreign language. Experience has shown that such forms are not adequate, not only because they require advanced knowledge of a foreign language from Serbian citizens, but also because this type of communication is mostly handled by programs that send generic automated responses.
Although the fines that the Commissioner may impose under the domestic Law on Personal Data Protection – in this case 100 000 Serbian dinars (around 940 USD or 850 EUR) – wouldn’t have a major impact on the budgets of these gigantic companies, SHARE believes that imposing them would show that the competent authorities of the Republic of Serbia intend to protect its citizens, and would signal that these companies are not operating in accordance with domestic regulations.
(Contribution by EDRi member SHARE Foundation, Serbia)
ECtHR demands explanations on Polish intelligence agency surveillance
The European Court of Human Rights (ECtHR) has demanded that the Polish government provide an explanation of surveillance by its intelligence agencies. This is a result of complaints filed with the Strasbourg court in late 2020 and early 2020 by activists from EDRi member Panoptykon Foundation and the Helsinki Foundation for Human Rights, as well as by attorney Mikołaj Pietrzak. The attorney points out that uncontrolled surveillance by the Polish government violates not only his privacy but, most importantly, the rights and freedoms of his clients. The activists add that, as active citizens, they are at particular risk of being subject to government surveillance.
Panoptykon has been criticising the lack of control over government surveillance for years. Without appropriate oversight, there is no way to dispel concerns that intelligence agencies use their broad powers without proper limitations. Nor is there any way of verifying to what extent these powers are used, because the law does not provide for access to information about whether an individual has been subject to surveillance – even after the surveillance has ended and the individual has not been charged. As citizens, we are therefore defenceless and cannot protect our rights.
The ECtHR decided that the complaints meet formal requirements and communicated the case to the Polish government which will have to answer the question whether its actions violated our privacy (Article 8 of the European Convention on Human Rights) and the right to an effective remedy (Article 13 of the Convention).
What’s at stake is not just the right to privacy. As attorney Mikołaj Pietrzak explains, the basis of the attorney-client relationship is trust that can only exist on condition of confidentiality. Attorneys are obliged to protect legal privilege, especially when it comes to defence in criminal cases. Current laws make it impossible. This infringes on the rights and freedoms of their clients, and in particular their right to defence.
The Polish Constitutional Court pointed out that the law should have been changed already in July 2020. However, the so-called Surveillance Act and the Counter-terrorism Act, adopted in 2020, only expanded the intelligence agencies’ powers, without introducing any mechanisms of control. Compared to other EU countries, where independent oversight of intelligence agencies’ activities surprises no one, Poland stands out in a negative way. These irregularities have been pointed out, among others, by the Venice Commission in a June 2020 Opinion. The obligation to inform the data subject that intelligence agencies have accessed their telecommunications data follows from multiple ECtHR judgments (e.g. Szabó and Vissy v. Hungary, Saravia v. Germany, Zakharov v. Russia) and Court of Justice of the European Union (CJEU) judgments (e.g. Tele2 Sverige).
The complainants are represented by attorney Małgorzata Mączka-Pacholak.
No control over surveillance by Polish intelligence agencies. ECHR demands explanations from the government (18.12.2020)
(Contribution by EDRi member Panoptykon Foundation, Poland)
Copyright stakeholder dialogues: Filters can’t understand context
On 16 December 2020, the European Commission held the fourth meeting of the Copyright Directive Article 17 stakeholder dialogues. During the “first phase”, meetings focused on the practices in different industries such as music, games, software, audiovisual and publishing. This meeting was the last of what the Commission called the “second phase”, where meetings were focused on technical presentations on content management technologies and existing licensing practices.
During this fourth meeting, presentations were given by platforms (Facebook, Seznam, Wattpad), providers of content management tools (Audible Magic, Ardito, Fifthfreedom, Smart protection), rightsholders (European Grouping of Societies of Authors and Composers – GESAC, Universal Music Publishing, Bundesliga) as well as by consumer group BEUC and the Lumen database.
Say it again louder for the people in the back: Filters cannot understand context
After Youtube’s Content ID presentation during the third meeting, Facebook’s Rights Management tool presentation reiterated what civil society has been repeating during the entire duration of the copyright debates: filtering tools cannot understand context. Content recognition technologies are only capable of matching files and cannot recognise copyright exceptions such as caricature or parody.
This argument has now been clearly and repeatedly laid out to the European Commission by both civil society organisations and providers of content recognition technology. We would therefore expect that the Commission’s guidelines will take this into account and recommend that filters should not be used to automatically block or remove uploaded content.
A lack of trust
While the meetings usually revive old divisions between stakeholders, they also reveal new ones. Facebook’s Rights Management presentation pointed out that one of the tool’s biggest issues is misuse by rightsholders who claim rights over works they do not own. As a result, not every rightsholder gets access to the same tools: some features, such as automated actions, are limited or reserved for what the provider calls “trusted rightsholders”.
On the other side, rightsholders such as GESAC have criticised the way they are treated by big platforms such as YouTube. In particular, they highlighted that the categorisation performed by content recognition tools can lead to loss of revenue. Rightsholders sometimes have no choice but to use tools created and controlled by big platforms under those platforms’ own opaque rules; they therefore emphasised the need for transparency and accuracy in the information about how platforms like YouTube handle content whose rights they own.
Transparency is key
With the aim of understanding the management practices of copyright-protected content, quantitative information is crucial. Faced with the issue of filters, content recognition providers said they have been relying on redress mechanisms and human judgment. But when asked for factual information on the functioning of their practices, no number or percentage was available. It is therefore impossible to understand the necessity, proportionality or efficiency of the use of automated content recognition tools.
According to Article 17(10) of the Copyright Directive, which provides the basis for the ongoing stakeholder dialogue, “users’ organisations shall have access to adequate information from online content-sharing service providers on the functioning of their practices with regard to paragraph 4.”
After four meetings and still lacking such information from companies, civil society organisations participating in the dialogue decided to send a request for information to the European Commission. We hope that the Commission will be able to gather such factual information from platforms so that the ongoing dialogue can lead to an evidence-based outcome.
As part of these transparency needs, EDRi also signed an open letter asking the Commission to share the draft guidelines they will produce at the end of the dialogue. In the letter, we asked that the guidelines should also be opened to a consultation with the participants of the stakeholder dialogues and to the broader public, to seek feedback on whether the document can be further improved to ensure compliance with the Charter of Fundamental Rights of the EU.
The next stakeholder dialogue meeting will be held on 16 January and will open the “third phase” of consultation, which will focus on the practicality of Article 17. The Commission already sent out the agenda, and the topics covered on 16 January will be authorisations, notices and the notion of “best efforts”, while the following session on 10 February will cover safeguards and redress mechanisms.
(Contribution by Laureline Lemoine, EDRi)
Our New Year’s wishes for European Commissioners
EDRi wishes all readers a happy new year 2020!
In 2020, we had a number of victories in multiple fields. The European Parliament added necessary safeguards to the proposed Terrorist Content Online (TCO) Regulation to protect fundamental rights against overly broad and disproportionate censorship measures. The Court of Justice of the European Union (CJEU) ruled that clear and affirmative consent needs to be given to set cookies on our devices. Member States have been increasingly issuing fines under the General Data Protection Regulation (GDPR). Also, Google was fined for its abusive online ad practices, and new security standards for consumer Internet of Things (IoT) devices were introduced.
However, 2020 was also the year when some governments positioned themselves against encryption and started to normalise facial recognition in public spaces without adequate safeguards, public debate or fundamental rights assessment (France, Sweden, the UK). Mandatory upload filters were approved at EU level, and data breaches and privacy scandals frequently made the news.
For 2020, we need to ensure that the EU pushes forward policies that will lead to a human-centric internet rather than data exploitation models which deepen inequalities and enable surveillance capitalism. We are sending our wishes to the fresh new European Commissioners, so that they can help us defend our rights and freedoms online.
In 2020, we wish for President Ursula von der Leyen to:
- Start implementing a human-centric vision for the internet to ensure the protection of fundamental rights online (and offline);
- Define high privacy, security, safety and ethical standards for the new generation of technologies that will become the global norm;
- Ensure that EU decision making is strengthened by ensuring transparency in the Council;
- Ensure that any future measures on Artificial Intelligence (AI) lead to AI systems in Europe that are based on the principles of legality, robustness, ethics and human rights, and that strengthen rather than circumvent current data protection and privacy laws;
- Ensure that the upcoming Digital Services Act (DSA) proposal (reforming the current e-Commerce Directive) creates legal certainty and introduces safeguards that will enable users to enjoy their rights and freedoms.
In 2020, we wish for Executive Vice President for A Europe Fit for the Digital Age Margrethe Vestager to:
- Provide clarity on safeguards, red lines, and enforcement mechanisms to ensure that the automated decision making systems — and AI more broadly — developed and deployed in the EU respect fundamental rights;
- Assess the fundamental rights and societal impacts of facial recognition and other biometric detection systems, and propose criteria to assess or define domains or use cases where AI-assisted technologies should not be developed;
- Tackle exploitative business models and their violation of personal data protections through the Digital Services Act and any other necessary legislative or non-legislative initiatives;
- Promote equality and fight discrimination in the development and use of technology;
- Guarantee and promote the respect of fundamental rights through competition policy by investigating abuses by dominant platforms and exploring cooperation with data protection authorities.
In 2020, we wish for Commissioner for Internal Market Thierry Breton to:
- Unlock the ePrivacy reform through discussion with the EU Council and the Member States;
- Develop a sustainable, human-centric and rights-promoting Digital Services Act;
- Ensure privacy by design and by default in current and future tech-related proposals;
- Achieve digital sovereignty by ensuring the development of the necessary free and open hardware and software;
- Ensure that the strategy on data developed as part of the EU’s approach on AI respects fundamental rights.
In 2020, we wish for Vice President and Commissioner for Values and Transparency Věra Jourová to:
- Ensure transparency in trilogue negotiations;
- Address the harms caused by hate speech, political disinformation and the abuse of internet controls by authoritarian states;
- Analyse the risks of targeted political advertising and the online tracking industry;
- Protect and promote freedom of expression online.
In 2020, we wish for Commissioner for Home Affairs Ylva Johansson to:
- Ensure that illegal mass surveillance is not deployed, for example in any future attempts to implement data retention in Member States;
- Review all PNR frameworks in light of the jurisprudence of the CJEU;
- Reassess the necessity of the “e-evidence” proposal, or at the very least include meaningful human rights safeguards in it;
- Ensure that the safeguards adopted by the European Parliament and advocated by human rights groups are part of the final TCO Regulation.
In 2020, we wish for Commissioner for Justice Didier Reynders to:
- Ensure the full enforcement of the GDPR in Member States by ensuring that data protection authorities have the necessary funding, resources, and independence to protect our rights;
- Promote the European approach to data protection as a global model;
- Contribute to legislation on AI to ensure that fundamental rights are fully protected, and especially, equality for everyone, by adopting rules that mitigate the harms caused by discrimination.
The new year is a time to reflect on the past year and pledge to do better in the next. Looking for new year’s resolutions? You can do more to stay safe online or donate to EDRi, to help us continue defending your digital human rights and freedoms in 2020 and beyond.