
Facebook Live Killings

            Facebook is a social networking site that allows people to connect and share with family and friends online. Facebook was created in 2004 by Mark Zuckerberg while he was enrolled at Harvard University. Today, Facebook is the world’s largest social network, with more than 1 billion users worldwide (GCF Global, n.d.).

Unfortunately, it is horrible to realize that, as some crime experts have noted, crimes have been occurring on this social media platform. Facebook Live killings are horrific crimes taking place on this otherwise great network. To keep Facebook a safe environment, the organization must work with law enforcement during crises in which there are direct dangers to physical safety. This paper examines whether Facebook has a legal or ethical duty to rescue crime victims and identifies ways the company can be more proactive. These two concepts lead to proper safeguards, a discussion of the key functions of an ethics officer, and suggested changes that can promote ethical use.

Facebook’s Legal or Ethical Duty

Facebook does have a legal and ethical duty to rescue crime victims and to prevent psychological harm to people. An organization that connects the world exists legally and therefore must have business ethics that all employees and users follow. Admittedly, Facebook employees cannot control every user, but it is still their obligation to make sure that messages going viral through their platform are accurate and useful. Facebook is smart enough to know that new technology means new behavior. The digital revolution is occurring so quickly that it has become challenging to tell the difference between passing trends and lasting changes in behavior. The emerging behavior that some Facebook users count as an acceptable social norm must stop. I believe that cyberpsychologists at Facebook can help make sense of the online environment, where people behave differently than they do in real life.

Three Ways to Be Proactive

Facebook is now facing backlash over the shooting video as it grapples with its role in policing content on its global platform.

It is an issue that Facebook, the world’s largest social network, has had to contend with more frequently as it bets big on new forms of media like live video, which give it a venue for more lucrative advertising. The criticism of Facebook over Mr. Stephens’ video built swiftly on Monday, with critics calling it a dark time for the company and outrage spreading on social media over how long, more than two hours, it had taken for the video to be pulled down. Ryan A. Godwin, the victim’s grandson, pleaded with other users on social media to stop sharing the video online (Isaac & Mele, 2017).

• It is important that Facebook take strong responsibility for reviewing content posted on its platform, to which only it has full access, and implement filters that flag unusual content for further review before it is posted. Taking more than two hours to pull down a live video of a shooting is irresponsible.

According to Isaac and Mele (2017), initiatives such as Facebook Live have been embraced by Facebook users and advertisers. Video advertising, one of the largest growth areas for Facebook and others, commands a premium compared with traditional photo and text ad formats. The more users acclimate to posting and viewing video content across these platforms, the easier it will be for the companies to sell higher-priced advertisements against them. Yet monitoring the content is imperfect. Because Facebook’s network of users is so vast, the company relies on a combination of artificial intelligence, human moderators, and alerts from users to flag objectionable content. If many people report an instance of offensive or harmful content at once, Facebook’s algorithms will show the post to a global team of human content moderators, who will review it and decide whether it violates Facebook’s terms of service.
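
To make that escalation flow concrete, here is a minimal, hypothetical sketch in Python: it simply counts user reports against a post and moves the post to a human review queue once an assumed threshold is reached. The threshold, function names, and data structures are my own assumptions for illustration and do not describe Facebook’s actual system.

from collections import defaultdict

# Hypothetical sketch only: the threshold, names, and data structures below are
# assumptions for illustration, not Facebook's real moderation pipeline.
REPORT_THRESHOLD = 100              # assumed number of user reports that triggers escalation

report_counts = defaultdict(int)    # post_id -> number of user reports received so far
human_review_queue = []             # post_ids waiting for the global team of human moderators


def handle_user_report(post_id):
    """Record one user report; escalate the post to human review at the threshold."""
    report_counts[post_id] += 1
    if report_counts[post_id] == REPORT_THRESHOLD:
        human_review_queue.append(post_id)


def moderate(post_id, violates_terms_of_service):
    """A human moderator's decision: remove the post or leave it up."""
    human_review_queue.remove(post_id)
    return "removed" if violates_terms_of_service else "kept"


# Example: once enough users report the same post, it reaches a human moderator.
for _ in range(REPORT_THRESHOLD):
    handle_user_report("post-123")
print(moderate("post-123", violates_terms_of_service=True))  # prints "removed"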

• One-sided interest can never be welcome when dealing with society, especially in this era of globalization. When an organization targets a group of people for its business, that organization must make sure the target group is reliable and healthy enough to be counted on over the organization’s lifecycle. Being proactive in protecting receivers from harmful messages sent by others must be the first responsibility of a social networking organization. Therefore, Facebook must agree to be more transparent in its censorship process and to submit to an external audit of its practices.

Because this approach raises difficulties, and because in some instances Facebook moderators make unpopular decisions about what content should or should not be allowed, the ultimate proactive action would be:

• To pre-record all videos that come through the channel, regardless of their content, and, if necessary, obtain feedback from the senders before the videos are posted.

Two Safeguards for Social Media Platforms

To help prevent acts of violence from being broadcast, Facebook and other social media companies must safeguard their platforms by continuously monitoring incoming posts from users and implementing software that detects and deletes activity associated with violence. As technology matures, software such as artificial intelligence (AI) will be best suited to this task; in addition, government censorship should be reinforced to prevent access to violent content, including nudity and pornographic images.
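
As a similarly hypothetical sketch of such automated screening, the Python below stands in for an AI detector with a crude keyword score and withholds any incoming post that exceeds an assumed threshold. The scoring logic, names, and threshold are assumptions for illustration, not any platform’s real API.

# Hypothetical sketch only: a real system would use a trained AI model, not keywords.
VIOLENCE_THRESHOLD = 0.8  # assumed score above which a post is withheld for review


def violence_score(post_text):
    """Stand-in for an AI model estimating how likely a post depicts violence (0 to 1)."""
    violent_terms = ("shooting", "kill", "weapon", "attack")
    hits = sum(term in post_text.lower() for term in violent_terms)
    return min(1.0, hits / len(violent_terms))


def screen_incoming_post(post_text):
    """Withhold a flagged post for deletion or review; publish everything else."""
    if violence_score(post_text) >= VIOLENCE_THRESHOLD:
        return "withheld_for_review"
    return "published"


print(screen_incoming_post("Family photos from the weekend"))                          # published
print(screen_incoming_post("Live shooting of an attack, he will kill with a weapon"))  # withheld_for_review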

Facebook Ethics Officer

According to my research, Facebook does not really have a chief ethics officer, but it does have staff who discuss ethics issues. Organizations independently set different guidelines and ways of responding to the issues interfering with their businesses.

Anna Lauren Hoffman (2017) wrote that after Zuckerberg decided to keep Thiel on board, he justified the move by appealing to his commitment to openness, even to views that one might not agree with, never mind that Thiel’s own views and actions are antithetical to openness in any other sense, especially for a member of a free press or anyone committed to the ideal of democracy. It is doubtful that a chief ethics officer could have convinced Zuckerberg, a man largely unconstrained in imposing his dreams on more than 1 billion people across the globe, that his position on Thiel was morally problematic. In light of Facebook’s challenges, I think we are better served by reframing the question of ethics and tech. The solution is not to corporatize ethics internally; it is to bring greater external pressure and accountability. Rather than position the problem as one of “bringing” ethics to companies like Facebook via a high-powered executive hire, we should position it as challenging the structures that prevent already existing collaborations and ethically sound ideas from having a transformative effect.

According to Hoffman (2017), the lesson of chief privacy officers, a chief ethics officer’s natural analogue, is instructive. As Professors Kenneth Bamberger and Deirdre Mulligan show in their book Privacy on the Ground, privacy officers
have been effective in part because of the rise of the Federal Trade Commission
as an active and engaged privacy regulator. In particular, FTC pressure has
been integral to the development of a corporate attitude toward privacy that
goes beyond mere compliance with the law and instead actively promotes and
protects the interests of consumers. As Bamberger and Mulligan note, the threat
of FTC oversight has helped generate “more forward-thinking and dynamic
approaches to privacy policies.” Without a major culture shift and increased
external and regulatory pressure, the possibility that an ethics officer could
spark widespread and necessary company reform remains limited.

Following Hoffman’s argument, I agree that one cannot simply shout “more ethics!” within corporate structures that prioritize economic gain and silence ethical voices, and expect change to happen. Facebook does not need to create ethics rules. If ethics is to stand a chance, we need clear and increasingly potent means of holding tech companies accountable for their actions (Hoffman, 2017).

Two Changes Facebook Can Implement

One big change that I expect the Facebook organization to implement is already in place and just needs to be enforced. We all know that Facebook’s vision is about connecting friends and families by sharing messages, photos, and video. Therefore, the relationships between people on the site need to be scrutinized so that senders, receivers, and the organization all benefit from the transactions. In an interview with The New York Times, Mark Zuckerberg said about the change: “I expect the time people spend on Facebook and some measures of engagement will go down. But I also expect the time you do spend on Facebook will be more valuable” (Bromwich & Haag, 2018).

The second change that needs to be made is to free users to seek out their own connections. I have noticed that Facebook constantly tries to connect friends of friends who do not even know each other. I personally think that this behavior by the Facebook organization allows criminals to connect with their targets and carry out harmful or even deadly actions.

Conclusion

Adopting a plan that ensures safety should be a high priority for the Facebook organization and its users, and victims of serious threats must be rescued and helped to recover. Whatever News Feed changes the company implements, it is imperative that Facebook remain customizable through options that allow users to limit their exposure to certain people.

References

Bromwich, J. E.,
& Haag, M. (2018, January 12). Facebook Is Changing. What Does That Mean
for Your News Feed? Retrieved January 23, 2018, from https://www.nytimes.com/2018/01/12/technology/facebook-news-feed-changes.html

GCF Global. (n.d.). Free Facebook tutorial at GCFLearnFree. Retrieved January 23, 2018, from https://www.gcflearnfree.org/facebook10

Hoffman, A. L.
(2017, January 26). Facebook Doesn’t Need a Chief Ethics Officer. Retrieved
January 23, 2018, from https://www.newamerica.org/weekly/edition-150/facebook-doesnt-need-chief-ethics-officer/

Isaac, M., &
Mele, C. (2017, April 17). A Murder Posted on Facebook Prompts Outrage and
Questions Over Responsibility. Retrieved January 23, 2018, from
