June 19 (UPI) -- Google Inc. announced Monday that it will enforce four new steps to identify terrorist-related content on its YouTube subsidiary.
The outline, published Monday in a Google blog post and in a Financial Times commentary, comes after Internet companies were accused by some politicians of inadequately dealing with extremist content on their websites.
The post said YouTube will develop additional artificial intelligence to identify and remove extremist content. It will also increase the number of non-governmental organizations in its Trusted Flagger program, which notify YouTube of potential threats, from 63 to 113.
YouTube will also place a warning on, and delete the comments section of, videos with potentially inflammatory religious or supremacist content. The company plans to work with Jigsaw, an Internet firm that has redirected searches for Islamic State information to video programming that "debunks terrorist recruiting messages."