Sulake said it had kept 225 moderators and is still investigating what went wrong.

By some measures, Internet-related sex crimes against children have always been rare and are now falling (as are reports of assaults on minors that do not involve the Net).
Yet even though defensive techniques are now available and effective, they can be expensive. They can also alienate some of a site's target audience - especially teen users who expect more freedom of expression.

'There are companies out there that are doing a very good job, working within the confines of what they have available,' said Brooke Donahue, a supervisory special agent with an FBI team devoted to Internet predators and child pornography.
Filters and moderators are essential for a clean experience, said Claire Quinn, safety chief at a smaller site aimed at kids and young teens, Wee World.
But the programs and people cost money and can depress ad rates.
Under COPPA, the 1998 Children's Online Privacy Protection Act, sites directed at children 12 and under must obtain verified parental consent before collecting data on them.
Some sites go much further: Disney's Club Penguin offers a choice of viewing either filtered chat that avoids blacklisted words, or chat that contains only words the company has pre-approved.
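The two filtering modes described above can be sketched in a few lines. This is a minimal illustration only; the word lists and function names are invented, and Club Penguin's actual implementation is not public.

```python
# Hypothetical word lists for illustration -- not Disney's real lists.
BLACKLIST = {"badword"}            # words stripped in blacklist ("filtered") mode
WHITELIST = {"hi", "play", "fun"}  # the only words allowed in whitelist mode

def filtered_chat(message: str) -> str:
    """Blacklist mode: drop any word that appears on the blocked list."""
    return " ".join(w for w in message.split() if w.lower() not in BLACKLIST)

def whitelist_chat(message: str) -> str:
    """Whitelist mode: keep only pre-approved words."""
    return " ".join(w for w in message.split() if w.lower() in WHITELIST)

print(filtered_chat("hi badword play"))   # hi play
print(whitelist_chat("hi stranger play")) # hi play
```

The trade-off is visible even in this toy version: the blacklist passes anything not explicitly banned, while the whitelist rejects anything not explicitly approved, which is safer but far more restrictive.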
Users could be unnerved about the extent to which their conversations are reviewed, at least by computer programs.

'We've never wanted to set up an environment where we have employees looking at private communications, so it's really important that we use technology that has a very low false-positive rate,' he said.
In addition, Facebook doesn't probe deeply into what it thinks are pre-existing relationships.
Metaverse Chief Executive Amy Pritchard said that in five years, her staff had intercepted something 'terrifying' only once, about a month ago, when a man on a discussion board for a major media company asked for the email address of a young site user.
Software recognised that the same person had been making similar requests of others and flagged the account for Metaverse moderators.
Signals such as a large number of 'unrequited' messages, meaning ones that receive no reply, also factor in, because they correlate with spamming or attempts to groom many targets at once; analysis of the actual chats of convicted pedophiles feeds into the models as well.
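The flagging logic described in the last two paragraphs, repeated near-identical requests plus a high share of unanswered messages, might be sketched as follows. All thresholds, class names, and fields here are invented for illustration; the real systems are proprietary.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class AccountActivity:
    """Per-account counters an automated moderator might keep (hypothetical)."""
    messages_sent: int = 0
    replies_received: int = 0
    requests: Counter = field(default_factory=Counter)  # message text -> count

    def record(self, text: str, got_reply: bool) -> None:
        self.messages_sent += 1
        self.replies_received += int(got_reply)
        self.requests[text.lower()] += 1

    def should_flag(self) -> bool:
        # Signal 1: the same request sent repeatedly (e.g. asking several
        # users for an email address), as in the Metaverse example above.
        repeated = any(n >= 3 for n in self.requests.values())
        # Signal 2: a high share of 'unrequited' messages that get no reply.
        unrequited = (self.messages_sent >= 5 and
                      self.replies_received / self.messages_sent < 0.2)
        return repeated or unrequited

acct = AccountActivity()
for _ in range(3):
    acct.record("what's your email address?", got_reply=False)
print(acct.should_flag())  # True -- flagged for human review
```

Note that the sketch only flags the account for human moderators, mirroring the workflow the article describes: software surfaces suspicious patterns, and people make the final call.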