Hosted data - out of sight, but still a corporate responsibility

One in five companies has had someone download child sexual abuse images at work. But in just 3.5 per cent of cases has this led to a criminal investigation, and in 69 per cent of incidents nothing happened at all.

That's according to a survey by NetClean, which is calling on CISOs to do more to tackle the problem and to tell the police if they discover such material on their network.

“Today's employers have a moral duty to tackle child abuse images on corporate networks,” said chief marketing officer Fredrik Frejme in a 21 August blog post.

But the company's study - of 141 senior executives and IT professionals attending a leading security trade show earlier this year - shows that just 9 per cent accept that employers have a responsibility to stop child sexual abuse (CSA) content. Most believe responsibility lies with individuals (34.8 per cent), government (29 per cent) or internet service providers (22 per cent).

Companies also vastly underestimate the problem, NetClean said. The company estimates that one in every 1,000 employees will look at CSA content at work. But a third of the survey respondents believe the figure is one in every 10,000 employees, and the same proportion estimate it is just one in every million.

“This is simply not true,” Frejme said. “There's an inherent belief that it is the ‘local weirdo’ accessing illicit images and not the person sat opposite them in their day-to-day jobs.”

He said the root of the problem is personal devices such as laptops, phones and USB sticks being brought into the workplace.

“We are still only scratching the surface of this huge problem,” he said. “We believe it's important to understand people bring pictures and films to the company because it's almost like a free zone.

“At home you have your family, you have friends, you have kids. And the most private thing that you have today is your corporate laptop. Everything else is open because the family needs it. So people can look at it on their own devices and also send it to the network that they have.”

The survey finds nearly 80 per cent of companies have an internet use policy that covers child sexual abuse sites. But Frejme said: “You also need to have tools to follow up the policies.”

Available solutions include tools that match pictures against a database of known illegal images and raise an alert if such material is distributed across the network.
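NetClean does not describe how its matching works, but the general technique - comparing a file's fingerprint against a curated database of known-illegal hashes - can be sketched roughly as follows. The blocklist contents and function names here are illustrative only; real deployments use vetted databases maintained with law enforcement, and typically robust perceptual hashes (such as PhotoDNA) rather than plain cryptographic hashes, so that re-encoded copies still match.

```python
import hashlib

# Illustrative blocklist of known-bad file hashes (hypothetical values;
# real systems consume curated, law-enforcement-vetted hash databases).
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Check whether a file's hash appears in the blocklist,
    i.e. whether it matches a known illegal image."""
    return sha256_of(data) in KNOWN_BAD_HASHES

# A matching file would trigger an alert for follow-up;
# anything else passes through untouched.
print(is_flagged(b"test"))      # matches the sample hash above
print(is_flagged(b"harmless"))  # no match
```

A cryptographic hash like SHA-256 only catches byte-identical copies; the trade-off in production systems is to use perceptual hashing, which tolerates resizing and re-compression at the cost of a small false-positive rate.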

“It's a huge trend now that you bring your own device,” he said. “But if you could secure your network or your company Wi-Fi to make sure you can't distribute these kinds of images and illegal content, then they do what they can do to follow up the policy.”

He also urged: “It's important when you find these kinds of images to go to law enforcement.”

NetClean's controversial findings and its call to action are supported by other industry figures.

Professor John Walker, a security forensic specialist and CTO at Cytelligence, said: “I am surprised at the statistics published by NetClean, as in my opinion, based on experience, they should be even higher.”

Walker said in his work he has found “such tolerated and disgraceful storage” on systems owned by global automotive companies, a financial sector company and “even inside the Palace of Westminster where parliamentary privilege was considered to be a defence!”