Careful, now… “artistic nudity” could be coming to a Facebook timeline near you.

No, seriously!

Facebook updated its “community standards” this week with new instructions on how to keep your content in-bounds in an increasingly diverse online community.

That diversity has given rise to a lot of confusion about what sorts of content are allowed, and why other types of content are taken down by site moderators. The BBC spoke to Facebook about how the new standards could help:

Monika Bickert, Facebook’s global head of content policy, said the rewrite was intended to address confusion about why some takedown requests were rejected.

“We [would] send them a message saying we’re not removing it because it doesn’t violate our standards, and they would write in and say I’m confused about this, so we would certainly hear that kind of feedback,” she told the BBC.

“And people had questions about what we meant when we said we don’t allow bullying, or exactly what our policy was on terrorism.

“[For example] we now make clear that not only do we not allow terrorist organisations or their members within the Facebook community, but we also don’t permit praise or support for terror groups or their acts or their leaders, which wasn’t something that was detailed before.”

Ms Bickert stressed, however, that the policies themselves had not changed.

The most controversial tweaks to policy? They involve sex and terrorism, of course.

New community standards keep it pretty simple: no genitals, no fully exposed buttocks, no nipples. However—breastfeeding photos are allowed! (Maybe this will end our long national nightmare of freaking out every time a breast is partially exposed in the name of infant health.)

Facebook emphasizes that they’ll still take down anything that’s sexually explicit, or that depicts a sex act—and that includes digitally created material, or erotica typed out in a status update.

At the end of the day, you won’t be seeing outright nudity on your timeline, but you may see a few more of your lady friends breastfeeding than you used to (or a little bit of side-butt).

Keep an eye on it.

The ban on “dangerous organizations” is a bit more comprehensive, and it could make it a lot harder for these groups to spread their propaganda.

Here’s their explanation of what counts as a “dangerous organization”:

We don’t allow any organizations that are engaged in the following to have a presence on Facebook:

  • Terrorist activity, or
  • Organized criminal activity.

We also remove content that expresses support for groups that are involved in the violent or criminal behavior mentioned above. Supporting or praising leaders of those same organizations, or condoning their violent activities, is not allowed.

We welcome broad discussion and social commentary on these general subjects, but ask that people show sensitivity towards victims of violence and discrimination.

I’m interested to see how this update affects the ability of terror groups to rally support in an “unofficial” capacity. (For example, if you run a search for “Hamas” on Facebook, you’ll come up largely empty.)

For now, it’s time to put on our “watchdog” shoes. If you see something—especially if it involves support for terrorism, hate speech, or nudity—say something. I’m curious to see how these changes will affect how my timeline looks. Keep me posted in the comments with the changes you see on your Facebook pages!

Facebook made a video explanation of their new standards. Watch here: