NEW YORK (AP) — Facebook is stepping up its efforts to keep inappropriate and often violent material — including recent high-profile videos of murders and suicides, hate speech and extremist propaganda — off its site.

David Fischer, the head of Facebook’s advertising business, said in an interview that detecting and removing hate speech and content that promotes violence or terrorism is an “ongoing priority” for the company, and that its community operations teams are a “continued investment.”

Videos and posts that glorify violence are against Facebook’s rules, but the company has drawn criticism for responding slowly to such items, including video of a slaying in Cleveland and the live-streamed killing of a baby in Thailand.

Families document a toddler’s first steps for faraway relatives, journalists cover news events, musicians perform for their fans and people raise money for charities. With a quarter of the world’s population on it, Facebook can serve as a mirror for humanity, amplifying both the good and the bad — the local fundraiser for a needy family and the murder-suicide in a faraway corner of the planet. […] lately, it has gotten outsized attention for its role in the latter, whether that means allowing the spread of false news and government propaganda or videos of horrific crimes.

In addition to removing videos of crime and getting help for someone who might hurt themselves, he said, the company’s bulked-up reviewing force will “also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation.”

Wednesday’s announcement is a clear sign that Facebook continues to need human reviewers to monitor content, even as it tries to outsource some of the work to software — due in part to the site’s sheer size and the volume of material people post.