Social Media and Knife Crime

There has been a rise in tragic cases involving young men stabbing each other in the streets and parks of London, England. I have no doubt that there are similar situations in cities of similar size the world over. It just happens that the numbers brought it to the front pages early in this new year. The role of social media has once more been raised, with regulating it offered as one of the solutions. I do not think it is relevant, and here’s why.

The heartbreak experienced by families cannot be overstated or ignored. Losing a teenage child to a pointless stabbing is every parent’s worst nightmare and creates a great deal of angst among right-thinking people everywhere. In London, a city of nearly 9 million people, the year just ended brought the appalling figure of 30 teenage knife deaths in 2021. This was one more than the figure for 2018, and it has the media calling for answers to the complex social issue that is blighting young men in the city. One of the answers trotted out by both NGOs and police is that social media companies need to do something about it.

There are a lot of NGOs, politicians and police working hard to deal with this issue, and they are quick to point out the solutions and the things that hamper them. But none of them acknowledges that gang violence is a problem as old as the hills. Inter-location violence and gang membership have always been an issue among young people, especially boys, in countries all over the world, and not just in inner cities. It is to be found everywhere. As a teenager I witnessed it every weekend in my rural childhood town, culminating in the death of a young man from another town and the jailing of a young man from mine. A tragic end to one life and a tragic beginning to another. In all aspects of life, they were identical: the same socio-economic background, the same family make-up, the same prospects for life. The only thing different about these two boys was where they were from.

It seems trite, therefore, to blame social media for exacerbating the problem. In fact, it is more than trite. It’s a total red herring and a distraction from the reality that social media is only a small part of the issue, and even then an innocent player in the drama of these kids’ lives. The “calling out” videos we all see online are something we generally pass over, but they can be shocking in their threats of violence, and they mean a lot more to those at whom they are aimed. The people making these videos are only using the online space to do what they do. If they were not doing it online, they would be doing it somewhere else. It is a form of expression that is real in their lives regardless of the medium used. Is tagging and graffiti not another way to express how they feel? Is word of mouth not another way to get a meet organised between rival gangs? What about other types of crime? The medium by which the message is passed is irrelevant.

Social media did not exist when I was a kid in suburban Ireland in the mid-1980s, and it did not exist in mid-May 1940, when a fight at Baldoyle racecourse left two young men from the inner city on the brink of death after being stabbed. What is more interesting about the people injured that day is their names. Fifty years later, the same names were appearing with tiresome regularity in the arrest reports and the district courts that I inhabited as a young police officer. This indicates to me that generational poverty is a far bigger issue than social media when looking for causes of, or solutions to, the problem. The “inner city”, with its deep-seated anti-social attitude, multi-generational unemployment and many idle hands, is far more of an issue. The fact that these kids are using social media should not be a surprise. They also use it for other aspects of their lives, such as announcing deaths, births and marriages. In fact, as pointed out by an NGO working in London, the production values in these threat videos can often be very high, and this in and of itself is an outlet for their creative skills!

In her book The Cyber Effect (2016), Mary Aiken outlines how human behaviour is changed by the advent of the internet. Included among these changes is the multiplier effect: a message sent by one person can be seen by hundreds, thousands, even millions, but more than that, it can be seen by the person or gang at whom it is aimed, in full technicolour. The same is true of actions. To be humiliated or shamed in real life is one thing, but for it to happen online is so much worse. She also points out that people often say things online that they would never say offline. All of the above are platform agnostic. They will happen anywhere on the internet, not just on social media.

Social media is used to reflect real life. People put the best and worst of themselves out there for everyone to see, and if these kids are involved in “postcode” wars or baiting then this will appear in their feeds. They have a right to communicate, and companies cannot be held responsible for that. What they can do, though, is enforce their own terms and conditions in a consistent and reliable manner. This is the same for other social ills and is not exclusive to knife crime. Radicalisation, cybercrime, suicidal ideation, CSAM and cyberbullying are all the same in this respect, and all are indicative of the thin line between the right to communicate and the right to safety. It’s not an easy line to straddle.

10 years is a long time (for an abused child)

Thea Puymbroeck, 1978–1984

On the 22nd of August 1984, Police were called to the Holiday Inn in Eindhoven because a porn actress had been found dead of a cocaine overdose. She was later identified as Thea Puymbroeck. She was 6 years old.

What transpired during the investigation was that many, many red flags were missed by all those we expect to see them. Police, social services and indeed her family were all aware that she was being exposed to sexual and drug abuse. Coby Kruijswijk, a neighbour and the person who cared for Thea most during her short life, had tried to tell anyone who would listen about the abuse. Nobody listened. Thea finally died in a hotel while a movie was being made of her sexual abuse, and her death was put down to a drug overdose. One would like to think that we could do better today, with increased awareness and increased investment in social services, procedures and laws. One would like to think….

We should never forget that what underpinned the death of Thea was an insatiable desire among some people for new child sexual abuse material (CSAM). That desire still exists today.

The Council of Europe’s 2021 Octopus Conference on Cybercrime has just finished and, as usual, it had some very useful presentations about online sexual abuse. As an organization, the Council does a lot of fighting against online child exploitation, with the Budapest Convention being the first to include Child Sexual Abuse Material as a (content) cybercrime and the excellent Lanzarote Convention specializing in child exploitation in general and online child exploitation in particular. The 2021 Octopus conference reminded me of Thea, as she had featured in a presentation I gave to the same conference 10 years ago. You can find it here.

As I have already stated, it is important to remember that what happened to Thea in 1984 was fueled by a market for pornography featuring children – children being abused. At that time, high-quality movies and magazines of child abuse were being produced, but there were still significant challenges in distributing the material, obtaining it and, because of laws passed not long after Thea died, producing it. Soon after this the internet exploded onto the world stage and all barriers, both physical and moral, were lifted. The internet fueled a resurgence of CSAM availability that the world had never seen, and this continues unabated to this day. New material featuring children even younger than Thea appears online every day, everywhere on the internet. In fact, it is safe to say that no platform is immune. CSAM produced in 1984 is still circulating on the internet, as is material produced in 1994, 2004, 2014 and indeed 2021. Each child featured was sexually abused, and each child featured must grow up and become an adult knowing that images and movies of their abuse are circulating online. Think about that for a second.

Child abuse is a stain on every society. The recording of that abuse and its subsequent sharing online is an aggravated stain on every society and everyone associated with the internet. The cyber-utopianism that sees company after company come to market without considering or dealing with the misuse of their platforms is another aggravating factor. Dealing with it is a no-brainer, but it’s a cost to the bottom line, so they ignore it until forced not to.

What is perhaps disheartening about the text of the talk I presented in 2011 is that it is full of optimism, proud of the progress we had made to that point in removing CSAM from the internet (especially the web). I also raised the challenges that remained, such as the failure of self-regulation, the lack of investment by law enforcement, a lack of understanding among policy makers, prosecutors and judges about the seriousness of the issue, and the challenges imposed by encryption. All of those remain today, along with many new ones. (Think #cryptocurrency.)

The closing lines of my talk in 2011 were as follows:

Advocacy for sex with children, access to Child Abuse Material and access to children continue unabated on the Internet. Only by all sectors of society working together can we hope to improve.

Nothing in that statement has changed in 10 years. Is it likely to change in the next 10? The EU Commission is working on regulation for online service providers that should bring change in the same way GDPR did. If the EU does its job properly and does not allow the hard-line privacy advocates to win the day, it should be revolutionary: it should protect countless children from further abuse and revictimization, and prevent people from accessing material that fuels the ongoing actual abuse of real children. Finding and removing child sexual abuse material from the internet should be the job of those who own the networks that create the network of networks. They are the only ones who can. They are also the only ones who can ensure the privacy of users.

Somewhere in the middle there must be ground where both of these ideals can be satisfied and this plague arrested so that in 2031 we’ll have more to cheer about.

the language of child abuse

They say you can’t have an omelette without breaking eggs. Similarly, you can’t have child pornography[1] without abusing a child; a real child, really abused.

With this in mind, most professionals working in this area no longer use the term child pornography and use instead the term child sexual abuse material or CSAM. 

This makes it instantly recognisable for what it is: photo, video and text depictions of a child or children being sexually abused. From the mid-1980s, countries began to make this material illegal through strong legislation that reflected society’s abhorrence at the fact that it existed at all, and the advent of the internet accelerated this process. The UN, the Council of Europe and the EU all have strong legal instruments in place for their members.

In policing circles, we have worked hard to “re-see” this material as crime scenes in themselves rather than just evidence of crime for the person possessing or distributing it.  It is only right that we put children first and work to identify the child in the material, to stop the abuse as early as possible.  It is also right to see the material from the child’s perspective and not that of the abuser.  Abusers, including those who possess and distribute the documented abuse, see it as pornography, designed to titillate sexually, to arouse.  We must take that “regard” away by removing the word pornography. 

This is all contained in the Luxembourg Guidelines,[2] a terminology guide to harmonise the terms and definitions related to child protection. You will find this discussion on page 38.

Calling it porn denigrates the actors who “star” in these blockbusters, those human beings who did not consent, were not rewarded and who suffer life-changing mental and sometimes physical scars.

By calling it Child Sexual Abuse Material you acknowledge the reality for the child, remind the degenerate who made it what they have done, and signal your and society’s disgust that they stoop so low in our name.

You choose.


[1] As defined in the Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography

[2] Pulled together by ECPAT Luxembourg after significant input from a large number of experts in the area of child rights and child protection. http://luxembourgguidelines.org/

investment fraud – the dating site vector

so you’ve met a really hot chick on your dating site of choice.  you love the way she looks, you love the way she thinks, you love the way she tickles your fancy and makes you feel….

naturally you begin to trust her.  she clearly trusts you.  the clue is in how she shares intimate secrets with you, maybe even shows you more than normal in the live chat.


you get to chatting about how you earn your crust.  she does the same.  she makes good money in investments and she shares with you how you can make money in the same way. 

she lets you in on the site she uses and the returns for modest investments, and when you ask, or when she suggests it, you think why not…..

if something seems too good to be true, it probably is……..

some companies care about child safety and some…..

……and so it has come to pass.  The ePrivacy directive kicked in on the 21st of December.  This means two things – people have more privacy, which is a good thing, and children are less safe online, which is not.

As explained in previous posts, the ePrivacy directive coming into full force on the 21st of December means that companies who were voluntarily scanning for Child Sexual Abuse Material (CSAM) and patterns of grooming could interpret the change as making their scanning activities illegal.

There is an active effort to bring in a regulatory framework that will cover these voluntary actions, but this will take time, and so a temporary derogation from a small number of articles was requested.  It is still under consideration by the EU Parliament, the EU Commission and the EU Council.

In the meantime, some companies have stood up and declared that they will not stop these efforts to keep children safe online while the political process continues.  These companies are Microsoft, Google, LinkedIn, Yubo and Roblox.  Well done to all of you.

Other companies have not stood up. We don’t know what they are doing because, apart from Facebook, they are saying nothing.

Facebook has declared that it will switch off scanning.  In a post on their blog they use a lot of language to say, in effect, that they are choosing privacy over child safety.  This is a disappointing and strange decision given that they have been at the forefront of voluntary actions in the past.  At least they told us.  What of other companies?  The big ones?

The political process continues; here is what you can do:

Talk to your local representatives and MEPs – list here.

and if you work for or use companies that provide online services other than those listed above, ask them: are they stopping or continuing?

time is running out…..

There are 7 days to go until the ePrivacy regulation comes into force in the EU.  It actually passed in 2018, but member countries were given two years to prepare, and so it “goes live” on the 21st of December 2020.[1] A temporary derogation applied for by the Commission has passed committee stage and must now get a reading in the plenary before becoming the subject of a trilogue. Is all this possible in 7 days?

Because it is a regulation rather than a directive, it supersedes existing law in member countries, and therefore any laws passed locally since 2002 to meet the ePrivacy directive are effectively repealed.

The regulation, aimed at companies providing communication services in the EU, seeks to guarantee privacy by ensuring there is no interference with, or tracking of, communications for marketing or other purposes.

One of the key unintended consequences of this regulation is that voluntary actions by the same companies to find, remove and report Child Sexual Abuse Material (CSAM) or grooming activity on their networks will stop, because companies cannot risk those actions being seen as illegal.

These voluntary actions are hard-fought chips in the self-regulation wall that activists and advocates, even within the companies themselves, have achieved.  Stopping them in this manner is ridiculous and reduces the safeguarding opportunities for children being actively harmed.  Companies scan for CSAM with advanced technologies that preserve the privacy of their users, in the same way they find, remove and report spam and malware.  Comparisons to someone at the post office opening every letter “IRL” are facetious and unhelpful.
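
For readers wondering what this scanning actually looks like, here is a minimal sketch of hash matching in Python. It is an illustration of the general technique, not any company’s actual implementation: real deployments use perceptual hashes such as Microsoft’s PhotoDNA (so that resized or re-compressed copies still match) and hash lists supplied by bodies like NCMEC or the IWF, whereas the hash set and function names below are invented for the example.

```python
import hashlib

# Hypothetical list of hashes of known, already-verified abuse images.
# Real lists are supplied by bodies such as NCMEC or the IWF, and real
# systems use perceptual hashes (e.g. PhotoDNA) rather than the plain
# cryptographic hash used here to keep the sketch short.
KNOWN_CSAM_HASHES = {
    "0" * 64,  # placeholder entry, not a real hash
}


def matches_known_material(attachment: bytes) -> bool:
    """Return True if the attachment is an exact copy of a known abuse image."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_CSAM_HASHES


def handle_attachment(attachment: bytes) -> str:
    # A match is queued for human review and onward reporting;
    # everything else passes through untouched.
    return "queue_for_review" if matches_known_material(attachment) else "deliver"


print(handle_attachment(b"holiday photo"))  # -> deliver
```

Note what the sketch never does: it does not read, classify or retain content that is not already on the known list, which is exactly why the post-office analogy falls flat.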

Child abuse communication, whether grooming or CSAM, universally happens in private and the only people who know about it are the child and the abuser.  When the abuser shares the material there is an increased chance that it will be found, removed and reported to the police who can then take action to make the child safe.  There are very few other ways for people to find out about the abuse and help the child.

In an effort to avoid this unintended consequence, the EU Commission has put a temporary derogation before EU lawmakers to stay just two articles within the regulation, allowing the voluntary actions to continue – in the area of online child abuse only – until the needed law can be put in place.  This temporary derogation has passed the committee stage and will now get its first reading in the EU Parliament before moving into a formal “trilogue”, or interinstitutional talks.

The question is: when?  The EU Parliament meets next week in plenary.  Will they do the first reading then?  When will the trilogue meeting take place?  How long will it take?  Will they have this finalised and passed (without too much dilution) by the 21st of December?

I certainly hope so.

Here is what you can do:

Sign the petition

Talk to your local representatives and MEPs – list here.

Support your local InHope Hotline

Read more and see the supporting evidence of what stops if this derogation fails to pass here.


[1] Its full name is actually “Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications)” and it repeals the ePrivacy Directive of 2002.

LIBE committee passes derogation

So today the LIBE committee voted overwhelmingly to allow the derogation to proceed to the plenary.  The vote: 53 in favour, 9 against, 2 abstentions.

The derogation will now go for a vote in the European Parliament at the plenary sessions during the week of the 14th to the 17th of December 2020.  Having passed the LIBE committee, it should be accepted by the parliament without too much drama.  However, it must then be discussed in the trilogue, which may mean that it gets watered down. There needs to be strong representation for child safety, as you can be sure there will be strong representation from the privacy side of the house.

Once that happens the real work begins to put in place the legislation that will provide the framework for finding, reporting and removing CSAM and grooming from networks.   That work will take many years.

The LIBE committee press release is here.

on voluntary actions to combat child sexual abuse

Companies exist to make money for their shareholders and it’s important to remember that when we discuss what they do to deal with anything bad that happens on their network.  That includes Facebook, Google, Twitter et al.

They have grown in a low-regulation environment, encouraged by governments all over the world, so dealing with the dark side of social media and the human condition was always going to be a hard sell to their boards: it costs money and is the antithesis of profit-making.  See line one of this post.

Unlike the EU, the USA brought in regulation relating to the online facilitation of child sexual abuse: a law stating that when a company becomes aware of Child Sexual Abuse Material or grooming on its network, it must report it to NCMEC, the National Center for Missing and Exploited Children.

NCMEC processes these reports and sends the “cybertips” to law enforcement all over the world for action.  These cybertips have saved lives and helped remove countless children from harm.

So, how does a company become aware of CSAM or grooming on its network?  User reports, sure, but in most cases they are actively scanning their systems for evidence of it, in the same way they do for viruses or malware. These are the “voluntary actions”.
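
Grooming detection is a harder problem than CSAM detection because there is no fixed file to recognise. The sketch below, with an invented phrase list and hand-picked weights, shows only the general shape of the idea; real systems, such as the classifier-based technique Microsoft announced as Project Artemis, are trained on labelled conversations rather than hand-written lists.

```python
from dataclasses import dataclass

# Invented risk signals with hand-picked weights. A real classifier is
# trained on labelled conversations; this dictionary exists only to
# show the shape of the approach.
RISK_PHRASES = {
    "don't tell your parents": 3,
    "our secret": 3,
    "how old are you": 1,
    "send a photo": 2,
}


@dataclass
class Conversation:
    adult_account: str
    child_account: str
    messages: list[str]


def grooming_risk_score(convo: Conversation) -> int:
    """Crude additive score over the full conversation text."""
    text = " ".join(message.lower() for message in convo.messages)
    return sum(weight for phrase, weight in RISK_PHRASES.items() if phrase in text)


def needs_human_review(convo: Conversation, threshold: int = 4) -> bool:
    # Crossing the threshold escalates the conversation to trained human
    # moderators; below it, nothing is flagged, retained or reported.
    return grooming_risk_score(convo) >= threshold


convo = Conversation("user:999", "user:123", ["hi!", "how old are you?", "it's our secret"])
print(needs_human_review(convo))  # -> True (1 + 3 = 4, meeting the threshold)
```

Either way, the output is an escalation rather than an accusation: a conversation that crosses the threshold goes to trained human moderators before anything is reported.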

When the ePrivacy directive comes into force on the 21st of December this year, those “voluntary actions” will stop – dead.

The EU Commission wants to bring in a law similar to the US law, together with a reporting procedure (an EU-style NCMEC), and so has applied for a limited temporary derogation from two articles of the ePrivacy directive (Articles 5(1) and 6) that will maintain the status quo until the new law can be drafted.

This derogation is currently under consideration at the European Parliament. If it passes, the status quo remains while the EU Commission prepares regulation in the form of law for 2021/22. If it fails, the voluntary actions by these companies stop and fewer children will be saved or removed from dangerous situations.

Here’s what you can do, as soon as possible:

Sign the petition

Further reading/watching:

Talk to your local representatives and MEPs – list here.

Support your local InHope Hotline

Read more and see the supporting evidence of what stops if this derogation fails to pass here.

abused children – the forgotten voice in privacy

Yesterday, a 36-year-old man was prosecuted for sex offences in the United Kingdom.

You’ll find quality reporting on the case in the local Eastern Daily Press

He pleaded guilty to intentionally causing or inciting boys to engage in sexual activity, blackmail, intentionally causing children to look at sexual images and intentionally facilitating the sexual exploitation of children by sending on images of those children. 

His name was David Nicholas Wilson and he pretended to be a teenage girl while grooming 51 boys aged from 4 (four) to 14 (fourteen). The NCA fear that his victims may actually number as many as 500 in the UK and abroad. 

This crime type is what we know in the trade as Online Sexual Coercion and Extortion of Children (OSCE) but the press generally simplify it to “Sextortion”.  There is a great detailed explainer on the Europol site here.

The thing about this crime type is that the multiplier effect of ICT allows one offender to contact thousands of children, knowing that a percentage of them will respond and engage.  In this case his preferred targets were young boys, so most likely he was operating on gaming platforms before bouncing the ones who responded onto other platforms.

Now, while I have no knowledge of this case beyond the newspaper articles, I have read that the offender was traced because Facebook, while scanning its network, found abusive images that had been shared by young users with an apparent teenage girl, saw them for what they were, and reported them to the National Centre for Missing and Exploited Children (NCMEC) in the USA. NCMEC passed the Cybertip to the National Crime Agency in the UK, who got a warrant to search the house of the offender and put him before the courts.

NCMEC processes millions (6 zeros!) of these Cybertips every year from companies such as Microsoft, Google, Facebook, Yubo, Snapchat etc. and forwards them to law enforcement all over the globe for assessment and action where appropriate.
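
To make that chain concrete, here is a rough sketch of the kind of record that travels from a platform to NCMEC and onward to a national force. The type and field names are mine, for illustration only; NCMEC’s actual CyberTipline reports are far more extensive.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class CyberTip:
    # Illustrative fields only – not NCMEC's real schema.
    reporting_company: str
    incident_type: str         # e.g. "CSAM upload" or "grooming"
    suspect_account: str
    content_hashes: list[str]  # hashes of the material, not the material itself
    country_hint: str          # used to route the tip to the right police force
    reported_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


# Platform -> NCMEC -> national police (in the Wilson case, the UK's NCA).
tip = CyberTip(
    reporting_company="ExamplePlatform",
    incident_type="CSAM upload",
    suspect_account="user:12345",
    content_hashes=["0" * 64],
    country_hint="GB",
)
print(tip.incident_type, "->", tip.country_hint)
```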

The above process is currently under threat from two directions:

  • The introduction of end-to-end encryption by Facebook on their Messenger product

  • The ePrivacy directive coming into full force in the EU, which risks making the voluntary scanning that produced this Cybertip illegal

Safety versus privacy is a complex area of society that needs proper, respectful and holistic debate.

The 51 real-life boys he abused should be more than a footnote in that debate.  It cannot just be an inconvenient truth to be brushed aside by privacy advocates and activists who rightly claim that all communications should be private.

I fundamentally agree but argue that there is a difference between #privacy and #encryption.

There must be a middle ground where society can protect children and other vulnerable people from criminals like David Nicholas Wilson.

Please sign the petition

the derogation – a summary

This is an explainer of the derogation being sought by the European Commission from Articles 5(1) and 6 of the ePrivacy Directive, which is about to be updated by the European Electronic Communications Code.  It’s a little technical and lawyerly, but then all good legislation is.  This is my understanding of it and is subject to change, since I’m not a lawyer.

The ePrivacy directive was issued in 2002 and, with GDPR, needed updating; this happened in 2018 with the European Electronic Communications Code (EECC).  The EECC is a directive, and all member countries must transpose it – make it law in their countries.  Failure to do so can result in legal action.

Now, countries had two years to transpose it, and so all of the provisions of the EECC kick in on the 21st of December 2020.  Definitions of electronic communications services and privacy will change and will include what are called “number-independent interpersonal communications services”.  This will include webmail, messaging services and IP telephony.  Essentially, therefore, companies (such as Facebook, Google, Microsoft et al.) who are now carrying out voluntary actions to find, report and remove Child Sexual Abuse Material and to find groomers within their networks would run the risk of exposure to prosecution if they continued.

The derogation is simply a legal request to stay Articles 5(1) and 6 of the ePrivacy directive for a short period while the Commission gets a legal framework into place to allow companies to continue scanning for CSAM and grooming activity. In other words, it is asking that the status quo remain in place for activities aimed at protecting children online.

The discussions as to whether the derogation will be allowed are ongoing, and this is why we need to act now.

As already stated, if the derogation is not accepted or is watered down, there is a major risk that all efforts currently being made to find, report and remove CSAM and grooming will stop.

Here is what you can do to help:

Sign the petition

Further reading/watching:

Talk to your local representatives and MEPs – list here.

Support your local InHope Hotline

Read more and see the supporting evidence of what stops if this derogation fails to pass here.

Read the excellent analysis of John Carr in his blog here, here and here