Keeping children safe online: raising awareness and reducing the risks
From setting up parental controls to choosing online games and video apps, we all play a role in reducing risks and keeping children safe online. It is key for children, parents and teachers to work closely together to develop safer online behaviour and promote digital literacy.
The Future of Europe starts with children. This online debate aims to address the challenges that modern technology poses to children's online safety.
The debate will be opened by:
Karl Hopwood, e-Safety Consultant for Insafe, European Schoolnet, Bath
Denton Howard, Executive Director, INHOPE, Amsterdam
Laviero Buono, Head of Section - Criminal Law, ERA
Event report
Title and date of the ERA online debate: Keeping children safe online: raising awareness and reducing the risks, Tuesday 14 September 2021, 12:00-13:30 CEST. Online platform: Zoom
Context, purpose, subject, and structure/methodology of the event: The fight against child sexual abuse material is part of the ERA Curriculum (EU Criminal Justice), and ERA systematically offers courses on this subject (mainly sponsored by the European Commission's DG Home). Here, however, an online public debate with two distinguished speakers in the field was organised to discuss the role of awareness-raising in keeping children safe online. The debate was addressed not only to legal practitioners (ERA's main target group) but to everybody interested in the subject. Methodology: 10 minutes presentation (chair), 10 minutes speaker 1, 10 minutes speaker 2, one hour of open discussion.
Reporting: OPEN FORUM (option 4). This means that the report, drafted by ERA, was sent to speakers and to selected participants for their review.
Number and type (general or specific public with details if possible) of participants present: The online public debate was attended by 50 participants, including lawyers, prosecutors, judges, academics, researchers, legal advisors, government and EU staff, IT specialists and journalists from 16 different EU Member States as well as the USA. For a precise breakdown see below.
If available, demographic information about participants (e.g. age, gender, etc.): 18 male participants and 32 female participants


Main subjects discussed during the public debate:
Children's lives have changed considerably because of Covid-19. With social distancing measures, children have spent more time at home and online. The internet is a useful tool for young people to stay in touch with their friends, but it does bring risks. It is therefore vital to talk to children about staying safe online and about the apps and sites they are using.
It can be difficult to start talking to children about what they are doing with their digital devices. But talking regularly will help them feel relaxed and make them more likely to come and speak openly.
Threats to children's internet safety include invasions of privacy, cyberbullying and harassment. Options to protect children include parental controls, apps and tracking software. But the most effective way to keep kids safe is to talk with them about online risks, how to avoid them and how they can come to you when something goes wrong.
This event showed how everybody can take an active role in protecting kids from online risks. Much of this comes down to monitoring how children use the internet and how they access it.
Giving a child a smartphone or tablet for the first time can be used as a teaching opportunity. It is key to show them how to set up strong passwords and to set rules for who can and cannot download apps.
Laviero Buono, ERA Head of Section for European Criminal Law and chair, introduced the key topics of the event by noting that nowadays, online interactions have real-world consequences – what happens online no longer stays online. The first essential stage is ensuring that harm to users is prevented as far as possible, but what do we do when prevention is no longer an option? At this stage, we turn to reporting, and this is when the work of helpline and hotline organisations is essential, in particular because citizens might often feel more comfortable using anonymous reporting tools than reporting information in person and directly to the police. He emphasised that one of the key themes of the conference would be the importance for parents and teachers of hearing about these organisations and the steps which can be taken to keep children safe online. Laviero Buono then explained how the event would proceed: 10 minutes for the first speaker, 10 minutes for the second speaker, followed by an open discussion with the participants.
Karl Hopwood, e-Safety Consultant for Insafe, gave the first presentation of the event. He first gave a brief overview of his professional background and experience in the field, explaining that he worked as a teacher and then a head teacher in the UK prior to moving into his current work, which he has now been doing for 14 years. In his role as helpline coordinator and consultant for Insafe, he works with schools and law enforcement officials to improve safe internet practices across the EU. Insafe, a project funded by the European Commission, runs a network of Safer Internet Centres across Europe. The centres raise awareness among the relevant stakeholders (for example, parents and teachers) about the challenges children can face online. They also operate hotlines dealing with illegal content online, and each Safer Internet Centre has a youth coordinator whose role is to listen to young people, as Insafe really believes in the importance of children actively participating in the discussions around online safety (consistent with the position on children's participatory rights adopted in UN General Comment 25). Insafe also coordinates Safer Internet Day, which is now well-established and happens on the second Tuesday of February each year. This year it took place on 9 February and saw involvement from 190 countries, as well as promotion from some very public figures, including the Pope.
Mr Hopwood explained that in his role, he is responsible for the helpline network. Helplines exist in every country and provide a service to which children and young people (they are not exclusively for these groups, but are primarily used by them) can reach out with issues encountered online. Helplines act as an early warning system – Insafe collects and publishes data about the helpline usage every three months, offering an overview of the general issues being most commonly raised by callers and identifying any possible trends overall and in particular countries and demographic groups (though Mr Hopwood emphasised at this stage of the presentation the importance of caution when using statistics and data to draw general conclusions). Public highlights of the data collection are shared in the Better Internet for Kids bulletin.
In his PowerPoint presentation, he showed a graph detailing the most common issues which lead young people to contact helplines (based on the helpline usage data for April-June 2021). These were as follows:
- Cyberbullying
- Love/relationships/sexuality (online)
- Potentially harmful content
- Media literacy/education
- Data privacy
- Sexting
- E-Crime
- Gaming
- Online reputation
- Sextortion
- Technical settings
- Sexual harassment
- Excessive use
- Grooming
- Hate speech
- Advertising/commercialism
He explained, however, that the sextortion category is relatively new – Insafe has only been collecting data on it for the last two or three years, after helplines recognised that this was a new and present threat appearing in a lot of their cases. It is a term referring to the use of coercion and/or blackmail to extort sexual favours from others, as well as the use of sexual content in blackmail to obtain other gains, such as money. At this point, a participant drew attention in the online chat to the materials produced by Europol in relation to ‘sextortion’ (https://www.europol.europa.eu/crime-areas-and-trends/crime-areas/child-sexual-exploitation/online-sexual-coercion-and-extortion-of-children) and specifically its ‘SayNO’ campaign (https://www.europol.europa.eu/activities-services/public-awareness-and-prevention-guides/online-sexual-coercion-and-extortion-crime).
Mr Hopwood then went on to discuss in more detail some of the dangers facing children on the internet today, noting the prominence of disinformation and fake news as well as the growth of dangerous online challenges promoted by influencers and through platforms like TikTok. He discussed the example of the TikTok ‘skull breaker’ challenge, which became popular last year, in which two people trick a third into jumping into the air so that the ‘pranksters’ can kick their legs out from under them, causing them to fall over. This challenge was widely shared and then replicated by many teenagers, causing severe injury in many cases and deaths in several others. In the presentation slides, he also drew attention to the following other dangerous challenges:
- Benadryl challenge
- Silhouette challenge
- Cha Cha Slide challenge
- Choking challenge
In discussion of the TikTok challenges, he emphasised the importance of parents and educators not treating these topics as taboo, or deliberately avoiding them for fear that bringing them up will raise their profile and expose children to them. According to recent research in this area, children as young as 3 or 4 are watching influencers who do these kinds of challenges and therefore encounter pressure to get involved – children are extremely likely to find out about these things by themselves, so parental discussion will not be what alerts them to their existence. Rather, discussion will emphasise the dangers of taking part and help to promote safety.
Mr Hopwood then noted that, unsurprisingly, the amount of time young people spent on screens soared during lockdowns and times of restricted social contact over the duration of the pandemic – this was true for most people, but especially for young people, as online content served as their source of schooling during closures as well as their lifeline to social contact whilst confined at home. However, he pointed out that even before the pandemic there were concerns about screen addiction and growing ‘screen time’, though experts have been saying for a while that ‘screen time’ is a bad metric because of the omnipresence of screens and digital devices in our lives – what matters more is ‘screen use’.
He emphasised that although modern social media platforms have been very cleverly designed to be addictive and keep us using them (persuasive design), we do also now have good control tools which we can use to put in limits for ourselves and which parents can also use for their children’s devices (screen time monitoring and app limits, privacy settings, etc).
Mr Hopwood then moved on to discuss other things parents and teachers should be aware of in relation to children’s internet use, stressing that there is a huge spectrum of harmful content viewable by children on the internet. For example, there is a lot which perpetuates body shaming and negative body image, including pro-anorexia content which has been found on many prominent platforms, such as TikTok. Recent research showed that on average, 26% of all 13-17-year-olds sometimes do not post images online due to worry about body shaming. He also noted that though the harmful impact of this content is by no means limited to girls, it does affect them in higher numbers than boys – the average for girls in this study was 32% and that for boys was 18% – and in fact this gender imbalance is generally true of all online harms suffered by children.
Another example he gave of a danger children face was the receipt of sexual messages online, with on average 22% of children surveyed confirming that they had received such messages. A smaller proportion (on average 6%) stated that they had sent such messages online, but Mr Hopwood pointed out that this figure may be less accurate because people are less likely to self-report this kind of behaviour. He also presented statistics showing the percentages of 13-17-year-olds who had:
- Witnessed young people sharing images or videos of someone they know doing sexual acts in the last year (27%)
- Witnessed young people sharing nude/nearly nude images that are supposed to be of a particular person, but could in fact not be them in the last year (33%)
- Witnessed young people sharing folders or collections of sexual images of people their age in the last year (23%)
Returning to the impact of the pandemic on his field of work, Mr Hopwood explained that Insafe helpline calls spiked significantly in the second quarter of 2020, coinciding with higher screen use at the point when many countries went into lockdown for the first time as a result of Covid-19. Between 2019 and 2020, the number of helpline cases increased significantly in most categories (with the exception of gaming and excessive use, where the numbers decreased). The largest percentage increases were in the topics of grooming, online reputation, sexting, sextortion and sexual harassment (numbers increased by between 42% and 62% in these areas).
Mr Hopwood then discussed a case study example, where a father had called the helpline seeking advice. His 11-year-old daughter had been contacted online by someone pretending to be a teenage girl. He had reported the messages to the police and been told that there was very little chance that anything could be done about it. He wanted advice from the helpline on the following issues:
- How to make a complaint and speak to a law enforcement officer with more experience
- How to collect relevant information such as URLs, screenshots, account/profile of the offender
- How to block and report posts
- Safety tips for the daughter
- Further online safety support for the father
Mr Hopwood then spoke to participants about the website www.betterinternetforkids.eu, run by the Insafe network, which contains many guides for parents about the opportunities and risks of different sites, as well as many resources for children of all ages in different European languages. He explained that all of the safer internet centres organised by the Insafe network also have a page dedicated to remote learning. The organisation also hosts conferences for parents about various online safety issues which affect children, in particular events and conferences on the impact of the pandemic on children’s online safety.
He stressed that the key point he would take away from his work in the field is that we really have to open up channels of communication with children – if they are calling helplines, it is because they feel they cannot talk to their family or teachers about the issues and dangers they are facing. He showed a video at the end of the presentation demonstrating the impact of negative attitudes and stigmas around online dangers, and the way that this kind of atmosphere can make children afraid to tell people they know about issues they face online. He emphasised that parents must not overreact if kids come to them with issues about online safety – instead they should celebrate the fact that the children feel safe enough to do so. He drew attention to the site www.deshame.eu, from which the video he showed was drawn, which covers Childnet’s Project deSHAME and hosts a lot of resources. He also mentioned the ongoing Digital Decade consultation being organised by the Commission, which people can contribute to: https://ec.europa.eu/eusurvey/runner/DigitalDecade4YOUth.
Denton Howard, Executive Director at INHOPE, gave the second presentation. He explained that INHOPE (the International Association of Internet Hotlines) deals with reports of child sexual abuse material (commonly abbreviated to CSAM).
He explained that although the organisations working in this field always seek, first and foremost, to prevent harm and abuse from occurring, INHOPE generally does less work in the area of prevention, because its role is primarily to do with reporting illegal content which is already out in the world. He emphasised heavily that there is always a victim at the centre of every case, and those working in the field must always make sure that the victim’s interests come first. The four stages of this kind of work are:
- Awareness
- Prevention
- Reporting
- Removal
Of course, he stressed that the ideal outcome is that the problem stops at the prevention stage and no one is victimised, but unfortunately this does not always work, which is where the hotline system comes in. The key elements necessary for effective prevention of online abuse are good education and communication. Education is about making information available in schools and books; communication is about sharing that information outside these places, especially among parents, and about reducing stigma so that children can safely and comfortably come forward about what has happened or might happen to them online.
Mr Howard explained in his presentation that hotlines are organisations generally operating on a national basis that allow anonymous reporting of suspected illegal internet material including CSAM. Hotlines are operated by a mixture of NGOs, governments, ISP associations, and hybrid approaches. The relevant law for determining whether content is illegal is the law of the country where the content is available.
Once content has been reported to a hotline, it is assessed by trained analysts, and if it is determined to be illegal, they notify law enforcement and the internet service provider. This is the notice and takedown process, which removes public access to the content – however, the content is not deleted, because it is important to maintain the chain of evidence for the prosecution process. Content is also reported via the ICCAM report exchange system, which facilitates cross-border exchange among different hotlines. This is essential because of the global nature of the internet: content uploaded in one place can easily be accessed from all over the world.
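For readers who want the workflow in one place, the sketch below restates the assess–notify–takedown–exchange flow described above as Python pseudocode. All names here (Report, process_report, the action strings) are illustrative assumptions made for this report, not ICCAM's actual interface, which was not described at the event.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Assessment(Enum):
    ILLEGAL = auto()
    NOT_ILLEGAL = auto()

@dataclass
class Report:
    report_id: str
    content_url: str
    hosting_country: str  # the country whose law applies and whose hotline acts

def process_report(report: Report, assessment: Assessment, hotline_country: str) -> list[str]:
    """Return the actions taken for an assessed hotline report (illustrative only)."""
    if assessment is Assessment.NOT_ILLEGAL:
        # Harmful-but-legal content is redirected to platform abuse
        # channels or helplines rather than the police.
        return ["refer to site abuse guidelines / helpline"]

    actions = [
        "notify national law enforcement",
        # Notice and takedown removes *public access* only; the content is
        # preserved so the chain of evidence survives for prosecution.
        "send notice-and-takedown to hosting ISP (content preserved as evidence)",
    ]
    if report.hosting_country != hotline_country:
        # Cross-border case: hand over to the hotline in the hosting
        # country via an ICCAM-style exchange.
        actions.append(f"forward via ICCAM to hotline in {report.hosting_country}")
    return actions
```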
Mr Howard noted that people often wonder why there is a need for hotlines – why don’t people just report things they see online and suspect to be illegal directly to the police? He explained that when you make a report directly to the police as a witness or a victim, you become part of the investigation going forward and may be called upon in any consequent legal proceedings. This is often not something that people want, especially when they are simply bystanders who have stumbled across illegal content: they don’t want any further involvement or links to the crime, they just want the experts to deal with it, and this is what anonymous hotlines facilitate. People also sometimes fear that if they were to report something like CSAM in person, it would raise questions about how they came across it and be potentially incriminating or embarrassing for them.
He stressed the importance of Notice and Takedown (NTD) procedures being as swift as possible in removing public access to illegal content permanently, explaining that every time CSAM is copied and spread, the depicted person is victimised all over again. Because speed is of the essence, NTD is measured in terms of time: from the point when the report is received to the point when access to the content is removed. He did, however, note that in some rare cases police will request that a URL is not taken down but remains accessible online as normal, because that user is already the subject of a live investigation and the NTD procedure could jeopardise it.
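Since NTD performance is measured as the elapsed time from report receipt to removal of access, a minimal illustration of that metric, using invented timestamps rather than real hotline data, might look like this:

```python
from datetime import datetime
from statistics import median

# Invented (report_received, access_removed) timestamp pairs; real NTD
# statistics come from hotline case-management systems.
cases = [
    (datetime(2021, 4, 1, 9, 0), datetime(2021, 4, 1, 11, 30)),
    (datetime(2021, 4, 2, 14, 0), datetime(2021, 4, 3, 8, 0)),
    (datetime(2021, 4, 5, 10, 0), datetime(2021, 4, 5, 10, 45)),
]

# NTD latency: elapsed time from report received to access removed.
latencies_hours = [
    (removed - received).total_seconds() / 3600 for received, removed in cases
]
print(f"median NTD latency: {median(latencies_hours):.1f} hours")
```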
In his presentation slides, Mr Howard showed participants a flow chart of how the investigation and reporting process works, including across different hotline and police organisations in different countries, and in particular through the ICCAM system, which is based at the Interpol headquarters in Lyon, France. He explained that, like Insafe, INHOPE also gathers statistical information, which it publishes on its website and in its annual report.
Main ideas suggested by participants during the workshops and the shared or debated narratives and arguments that led to them:
The two opening presentations, some 10 minutes each, set the tone and paved the way for what was discussed afterwards.
After the introductory presentations by the two experts, the floor was open to the participants’ questions, comments, and views.
Firstly, Mr Buono addressed a question to Mr Hopwood about the adequacy of school teaching on digital literacy. Mr Hopwood responded that it varies from country to country, and in some places online safety and media literacy are not always mandatory subjects on the school curriculum. He explained that in his experience it is far better for the subject to be handled in schools at least to some extent, rather than leaving it entirely to parents to impart, as many parents simply will not feel that they can or should do so. At the moment, however, many teachers he has worked with have reported feeling ill-equipped to deal with this subject – often citing it as the subject they feel least comfortable discussing with pupils – and therefore more training for them would be welcome.
Next, a participant asked about whether hotlines are receiving a lot of calls about videos of inappropriate (often sexually related) content on platforms like TikTok. Mr Howard responded firstly by clarifying that it is important to be careful about terminology – the term ‘inappropriate’ is subjective and its meaning will vary across people and countries. It is better, at least in his field of work, to talk in terms of ‘illegal’ content – this is what hotlines deal with and pass on for police investigation. If content is not illegal, but is nevertheless harmful, it should in general be reported under site abuse guidelines or to a helpline. He then explained that as far as platforms such as TikTok (open platforms) are concerned, there are not really any instances of illegal material/CSAM because there’s too much traceability – illegal content tends to be more hidden. However, hotline analysts are still human beings, so if they receive reports of content which they deem not to be illegal but still abusive or harmful, they will generally take action themselves to report it as appropriate under site guidelines. He stressed the importance of common sense in these kinds of cases.
Mr Hopwood added that the same is the case at Insafe – they’re a ‘trusted flagger’ for harmful but not illegal content and are familiar with trusted reporting channels. As an example of content which could be harmful despite being legal, he mentioned that they have dealt with issues in the past when parents have posted innocent videos of their children at gymnastics performances which have then been sexualised in online comments.
A participant then asked Mr Howard how we can combine the important elements of education and communication. In his response, Mr Howard said that he was now talking as a parent rather than in his professional capacity (due to his job’s focus on the reporting and removal rather than awareness and prevention stages) and stressed, as Mr Hopwood had in his presentation, the importance of parents not creating taboos around these topics or otherwise making children feel unsafe coming to them with problems. He spoke about the need to strike a good balance between protecting children and letting them grow up and have their own private lives. He then passed the question over to Mr Hopwood, whose job focuses more on this side of things.
Mr Hopwood acknowledged that while it is easy to sit in a hypothetical discussion and be clear about the importance of talking to children about these issues, it is not so easy in real life. He stressed additionally that it is really important for parents and educators to be proactive about these topics, because awareness and prevention-focused strategies cannot work if we don’t have these conversations until something has already gone wrong. Parents should invite kids to have a real and natural conversation about their opinions on online safety and online things generally – these issues play an enormous role in children’s everyday lives, so they will certainly have opinions about them. He then mentioned that he has worked frequently with an educational psychologist and always remembers the advice they gave: parents should avoid having these conversations in a sit-down, face-to-face environment because this can seem quite confrontational. Instead, it is better to go out for a walk and raise the topic more naturally in conversation, so that children feel more comfortable.
A participant then asked, in relation to the earlier discussion of ‘inappropriate’ content and TikTok, whether the site has a trusted flagger system in place for harmful content. Mr Hopwood initially responded in the chat, then the matter was brought up in verbal discussion. He explained that from his perspective, it seems that the company has been doing quite a lot to try and address these issues, giving the following examples:
- Accounts held by under-16s now automatically default to private rather than public
- No notifications for under-16s after 9pm
- Age limitations on live-streaming
- Restrictions on adults contacting children
Though he noted that there are certainly ways around some of these safeguards, he said that in his experience the people working in the trust and safety department of the organisation are really trying to make things better and do have child protection and children’s rights at heart.
A participant then asked about the risk faced by irregular non-EU migrants who would have to come out of hiding to report CSAM and harmful content. Mr Howard started his response by emphasising that these kinds of problems happen in every culture and every country, and those who have a sexual interest in children or seek to harm children will take opportunities to do so where they find them – child abuse is not limited to any type of place or people. Therefore, the only way to combat CSAM is to ensure that every single child is fully educated about what is acceptable and what is not, and they all need to have someone they trust whom they can talk to about it. Noting that the participant asking the question came from Italy, he went on to discuss the somewhat unique situation there: in Italy, only the police are allowed to be involved in investigations of CSAM, which means that the law does not allow for any role to be played by hotlines and anonymous reporting systems. He said that it would be welcome to see legislation introduced to remedy this, as the current state of play keeps Italy behind other countries in combating CSAM.
Julia Laffranque, Programme Director at ERA, then asked whether the speakers think there is a need for online child safety regulation at EU level, and furthermore whether they think that their organisations will be able to predict what kinds of problems could arise in this area in the future, in particular with the growth of new Internet technology and Artificial Intelligence. In response, Mr Hopwood drew attention to the Safer Internet Forum, which is happening via his organisation in early October and will be discussing exactly these kinds of issues, with a special focus on what online safety will look like in the future: https://www.betterinternetforkids.eu/en-GB/policy/safer-internet-forum.
Mr Howard confirmed that INHOPE is also participating in the SIF, looking in particular at automated detection of CSAM through hashing (a brief sketch of how hash matching works follows the list below) – however, he explained that there are prominent concerns about whether this technology interferes with the right to privacy under the GDPR. He highlighted the following elements of current EU-wide discussion in the area of online safety:
- Commission’s ongoing Digital Decade project
- EU strategy to fight child sexual abuse, which sets out a roadmap for legislative changes happening in coming years
- Planned EU Centre for Combating CSAM, which is learning from the experiences of similar organisations (for example in Australia and the US) and seeks to build up a shared approach and new mechanisms to improve protection, including legislation/regulations shifting the role played by EU institutions in this area from a mainly supportive one to a more active one
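To make the hash-based detection mentioned above more concrete: automated systems compare a fingerprint (hash) of each file against a list of hashes of already-identified CSAM, so the abusive material itself never needs to be viewed or redistributed during matching. The sketch below is a simplified illustration using a plain cryptographic hash and a placeholder hash list; production systems rely on perceptual hashing (such as Microsoft's PhotoDNA), which also matches resized or re-encoded copies.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list of already-identified illegal images, maintained
# by a trusted body. A cryptographic hash (used here for simplicity only)
# matches byte-identical files; real systems use perceptual hashes that
# survive resizing and re-encoding.
known_hashes: set[str] = {"<placeholder hash values>"}

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 fingerprint of a file, reading in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_csam(path: Path) -> bool:
    """Flag a file if its fingerprint appears in the known-hash list."""
    return sha256_of(path) in known_hashes
```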
Mr Buono then asked about the recent announcement by Apple of a planned system to scan all devices for child exploitation images, noting that this seemed to be a positive step forward but that the initiative had to be delayed after security and privacy experts said that it could open up a back door into iPhones and undermine users’ privacy. Mr Howard said that this has been very topical recently and he has been doing a lot of work in relation to it. He explained that Apple had proposed making changes to their infrastructure including introducing automated scanning for CSAM, but after they made their announcement, many organisations gave them feedback (both publicly and privately) about their plans, saying that their proposals were not going to work out in the way that they had hoped. He himself gave such feedback on behalf of INHOPE and asked Apple first and foremost if they were intending to introduce this strategy in Europe. Apple immediately responded that this was not their intention – their announcement was in relation to plans in North America – because the strategy they had outlined would be completely illegal under GDPR. Mr Howard said that he thought that Apple had tried to do the right thing, but that they really should have engaged more with the community and legal environment before announcing plans in the way that they did.
Mr Buono then asked Mr Hopwood about the extent of parental control of children’s mobile devices and phones, wondering how much parents really make use of the tools available. Mr Hopwood replied that the situation is mixed and in his opinion far too many parents don’t use them – recent OFCOM research in the UK found that around half of parents were not even aware that these tools existed; between a quarter and a third said they knew about them and used them; and the rest knew about them but did not use them for various reasons. One problem in this area is that there are so many internet-connected devices at home and so many different apps that children use that it takes a significant time commitment to go through and make sure that everything that might pose a danger is rendered safe.
Mr Buono asked Mr Howard whether he thinks that legal practitioners are well-equipped to handle any potential court cases arising in this field. Mr Howard responded that the vast majority of lawyers he has met are very smart people, but nevertheless they are human beings and cannot be experts in everything – just like in any area of law, you need people who are specialised and know the terminology and methodology. An added complication is that this terminology and methodology vary from country to country. Therefore, the more exposure lawyers working in related fields have to relevant professional development training, the better! Additionally, he urged lawyers to remember that in any case of CSAM where there has been a victim, that victim is a person whose interests should be at the centre of their concern (though he acknowledged that this is not necessarily how lawyers are taught to deal with cases). He emphasised that lawyers need to keep learning and make sure to ask the right questions to inform themselves fully about the issues in play. He said that he thinks the new European centre in this field will be very useful for this, as it can mitigate some of the difficulties arising from the fact that every country has a different system.
He then made participants aware of an upcoming online summit being organised by INHOPE which is primarily targeted at protecting digital first responders (those who run abuse teams and/or may have to view abusive content and/or deal with victims in the course of their jobs).
General atmosphere and expected follow-up:
Overall, it was an excellent lunch debate.
The audience was engaged: this is a topic of interest for everybody, nobody excluded.
Participants agreed with the speakers that a lot more needs to be done, at the level of both families and schools. Prevention is key in this area.
Parents should stay involved in their children's digital world, know the apps they use, use parental controls where possible, and block and report people who make them feel uncomfortable.
Children should talk with a trusted adult so they understand online risks, only chat with people they know, ensure their online accounts are private, block people they don’t know or trust, and trust their instinct—if something makes them feel uncomfortable, tell a trusted adult about it.
All participants seemed to share these views, and they are willing to follow more webinars and/or face-to-face events on this topic.