Internet firms must protect children from potentially harmful content, say MPs.

A new report by the Culture, Media and Sport Select Committee said that relying on individual companies to take their own measures was not enough, and had resulted in “an unsatisfactory piecemeal approach which lacks consistency and transparency”.

Sites promoting suicide, violence, anorexia and abuse came in for particular criticism. The MPs said they were shocked that the industry standard for taking down child abuse footage was 24 hours, and called for swifter action.

YouTube was criticised for not doing enough to prevent users, especially children, from seeing illegal content. A video supposedly showing a gang rape, which had been viewed around 600 times before two complaints were received, was cited as evidence that measures needed to be stepped up.

However, Google, which owns the video-sharing website, said that YouTube had strict rules on what was allowed and that users could report questionable footage at any time, to be dealt with promptly. It added that flagged material was reviewed within half an hour and that, if a video was removed, steps would be taken to ensure it could not be re-uploaded.

The committee wants a new industry body set up to protect children from harmful web content, and calls for sites that host user-generated content to review material proactively rather than waiting for pages to be flagged up. Such a body is expected to be established later this year.