1 in 5 corporate networks host child sex abuse content

News by Tim Ring

One in five companies has had an employee download child sex abuse images at work. But in just 3.5 per cent of cases this led to a criminal investigation, and in 69 per cent of incidents nothing happened at all.

That's according to a survey by NetClean, which is calling on CISOs to do more to tackle the problem and to tell the police if they discover such material on their network.

“Today's employers have a moral duty to tackle child abuse images on corporate networks,” said chief marketing officer Fredrik Frejme in a 21 August blog.

But the company's study - of 141 senior executives and IT professionals attending a leading security trade show earlier this year - shows that just 9 per cent accept that employers have a responsibility to stop child sexual abuse (CSA) content. Most believe responsibility lies with individuals (34.8 per cent), government (29 per cent) or internet service providers (22 per cent).

Companies also vastly underestimate the problem, NetClean said. The company estimates that one in every 1,000 employees will look at CSA content at work. But a third of the survey respondents believe the figure is one in every 10,000 employees, and the same proportion estimate it is just one in every million.

“This is simply not true,” Frejme said. “There's an inherent belief that it is the ‘local weirdo’ accessing illicit images and not the person sat opposite them in their day-to-day jobs.”

He told SCMagazineUK.com that the root of the problem is personal mobile devices such as laptops, phones and USB sticks being brought into work.

“We are still scratching the surface of this huge problem,” he said. “We believe it's important to understand people bring pictures and films to the company because it's almost like a free zone.

“At home you have your family, you have friends, you have kids. And the most private thing that you have today is your corporate laptop. Everything else is open because the family needs it. So people can look at it on their own devices and also send it to the network that they have.”

The survey finds nearly 80 per cent of companies have an internet use policy that covers child sexual abuse sites. But Frejme said: “You also need to have tools to follow up the policies.”

Available solutions include tools to match pictures against a database of known illegal images, and to get an alert if people distribute pictures to the network.
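As a rough illustration of the matching approach described here - not NetClean's actual implementation, which is proprietary - a minimal sketch in Python is below. It compares exact cryptographic hashes of files against a hypothetical database of known hashes; production tools instead use perceptual hashing (such as PhotoDNA) so that resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# In a real deployment this would come from a vetted industry
# hash list, not be hard-coded.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(paths: list[str]) -> list[str]:
    """Return the paths whose hash matches the known-image database."""
    return [p for p in paths if sha256_of_file(p) in KNOWN_HASHES]
```

A scanner like this would typically run on file servers or mail gateways and raise an alert (rather than act on its own) when `scan` returns any matches, so the case can be escalated to law enforcement.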

“It's a huge trend now that you bring your own device,” he said. “But if you could secure your network or your company Wi-Fi to make sure you can't distribute these kinds of images and illegal content, then they do what they can do to follow up the policy.”

He also urged: “It's important when you find these kinds of images to go to law enforcement.”

NetClean's controversial findings and its call to action are supported by other industry figures.

Professor John Walker, a security forensic specialist and CTO at Cytelligence, said: “I am surprised at the statistics published by NetClean, as in my opinion, based on experience, they should be even higher.”

Walker said in his work he has found “such tolerated and disgraceful storage” on systems owned by global automotive companies, a financial sector company and “even inside the Palace of Westminster where parliamentary privilege was considered to be a defence!”

He said: “There are two fundamental challenges to be encountered with such materials, which are that organisations do not understand the criminal nature and implication of such content, and treat them like any other detection of inappropriate images.

“And where organisations are aware, they simply wish to bury the bad news in fear of any external consequences and implications – leaving the real victim in a state of extended abuse – which is ethically, and socially unacceptable.”

Walker said: “There's a massive education piece missing. Some corporates know the problem, and they're burying bad news. Others are finding it and don't know what they're finding. They refer to it as pornography - of course it's not. It's a criminal act to store it, to view it, to pass it on.

“I've not heard of anybody who actually reports it to the police.”

Tim Holman, president of the ISSA-UK user group and CEO of security firm 2-sec, urged companies to accept they have an obligation to prevent any illegal activity going on inside their businesses, not just child abuse but copyright theft, terrorism, harassment or insider fraud – “a whole heap of things that perhaps a corporate network might facilitate”.

He advised: “If a company is negligent in this respect, and doesn't realise over a period of many years that their systems have been used for illegal activity until the police come knocking on the door, then there's a potential criminal negligence claim.”

He added: “If corporate resources are being used for illegal activity, then also bear in mind the police might swoop down at any moment and seize these assets. These might be your Exchange servers, or critical file servers. In short, the police can put you out of business for a few days and would be perfectly within their rights to do so.

“Companies like NetClean do offer a valid solution to address one aspect of the problem, with regards to child abuse images, but companies should really be taking a step back and addressing illegal activity as a whole, rather than diving in and buying specific solutions from the outset. As with all things security, this rarely works.”