Facebook is adding new features that let users flag concerning posts from their friends, in an effort to help prevent suicide.

The social media site is working with several anti-suicide organizations, including Now Matters Now, the National Suicide Prevention Lifeline, Save.org and Forefront: Innovations in Suicide Prevention at the University of Washington.

If a Facebook user sees a post from a friend that they think is related to self-harm or suicidal thoughts, they can report the post, and Facebook will get involved. Once the user reports their friend's post, they will be given the option of contacting the friend, contacting another friend for support or contacting a suicide prevention helpline.

Facebook then reviews the reported post. If it determines the person may be in danger of harming themselves or is suicidal, it will contact the person with the following alert:

"Hi (name), a friend thinks you might be going through something difficult and asked us to look at your recent post."

The user is given these choices from Facebook:

"Talk to someone: Reach out to a friend or helpline worker."

"Get tips and support: Learn how to work through this using some simple tips."

The user is encouraged to choose one of these options. If they choose to talk to someone, they will be prompted to call a friend, send a Facebook message to a friend or contact a suicide helpline, which is reachable by phone or Facebook message.

Facebook also offers tips and support for people dealing with tough times. These include videos from people who have themselves experienced suicidal thoughts. Facebook even suggests relaxation techniques for distressed individuals, recommending baking, drawing, taking a walk or visiting a library.

This is the first time Facebook has built a dedicated reporting feature for posts that suggest someone may be suicidal. Previously, a user who thought a friend was distressed or suicidal had to go to Facebook's suicide prevention page, copy the post's link and submit it there. Now, Facebook offers a much simpler way to report this type of content.

"We have teams working around the world, 24/7, who review any report that comes in," Rob Boyle, a Facebook product manager, and Nicole Staubli, a Facebook community operations safety specialist, wrote in a post for Facebook Safety on Wednesday. "They prioritize the most serious reports, like self-injury, and send help and resources to those in distress."