
Facebook is working on a new tool to help stem one source of harassment on its platform.

The social network is testing a new feature that will automatically alert you if it detects another user is impersonating your account by using your name and profile photo.

When Facebook detects that another user may be impersonating you, it will send an alert notifying you about the profile. You’ll then be prompted to confirm whether the profile in question is using your personal information to impersonate you, or whether it belongs to someone else who isn’t impersonating you.

Though the notification process is automated, profiles that are flagged as impersonations are manually reviewed by Facebook’s team. The feature, which the company began testing in November, is now live in about 75% of the world, and Facebook plans to expand its availability in the near future, says Facebook’s head of global safety, Antigone Davis.

While impersonation isn’t necessarily a widespread problem on Facebook, it is a source of harassment on the platform, despite the company’s longstanding policy against it. (Impersonation also falls under the social network’s names policy, which requires people to use an authentic name.)

“We heard feedback prior to the roundtables and also at the roundtables that this was a point of concern for women,” Davis told Mashable. “And it’s a real point of concern for some women in certain regions of the world where it [impersonation] may have certain cultural or social ramifications.”

 

[Image: Facebook’s “Is this profile pretending to be you?” alert. IMAGE: FACEBOOK]

Davis said the impersonation alerts are part of ongoing efforts to make women around the world feel safer using Facebook. The company has been hosting roundtable discussions with users, activists, NGOs and other groups around the world to gather feedback on how the platform can better address issues around privacy and safety.

Facebook is also testing two other safety features as a result of the talks: new ways of reporting nonconsensual intimate images and a photo checkup feature. Facebook has explicitly banned the sharing of nonconsensual intimate images since 2012, but the feature it’s currently testing is meant to make the reporting experience more compassionate for victims of abuse, Davis says.

Under the test, when someone reports nudity on Facebook, they’ll have the option not only to flag the photo as inappropriate but also to identify themselves as the subject of the photo. Doing so will surface links to outside resources, like support groups for victims of abuse and information about possible legal options, in addition to triggering the review process that happens whenever nudity is reported.

Davis said initial testing of these reporting processes has gone well, but the company is still looking to gather more feedback and do more research before rolling them out more broadly.

The photo checkup feature is similar to Facebook’s privacy dinosaur, which helped users review their privacy settings; likewise, the new photo-centric feature is meant to educate users about who can see their photos.

 

[Image: Facebook’s photo checkup feature. IMAGE: FACEBOOK]

Facebook already has fine-tuned privacy controls in place, but users, particularly those in India and the other countries where the feature is being tested, aren’t necessarily familiar with how to use them, Davis said. The photo checkup is meant to bridge that gap by walking users through a step-by-step review of the privacy settings for their photos. The tool is live in India, as well as in countries in South America, Africa and Southeast Asia.

 

[Source: Mashable]