Facebook has realized that people will click on fewer ads, and they will make less money

https://gtv.org/video/id=615ddeb7d2e2b81748720494

Frances Haugen (Facebook Whistleblower)

I never wanted anyone to feel the pain that I felt, and I'd seen how high the stakes were in terms of making sure there was high quality information on Facebook.

Scott Pelley (CBS 60 Minutes host)

At headquarters, she was assigned to Civic Integrity, which worked on risks to elections, including misinformation. But after this past election, there was a turning point.

Frances Haugen

They told us they were dissolving Civic Integrity. They basically said, okay, we made it through the election, there wasn't riots, we can get rid of Civic Integrity now. Fast forward a couple months, we got the insurrection. And when they got rid of Civic Integrity, it's a moment where I was like, I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous.

Scott Pelley

Facebook says the work of Civic Integrity was distributed to other units. Haugen told us the root of Facebook's problem is in the change that it made in 2018 to its algorithms, the programming that decides what you see on your Facebook newsfeed.

Frances Haugen

So, you know, you have your phone. You might see only 100 pieces of content if you sit and scroll for, you know, five minutes, but Facebook has thousands of options it can show you.

Scott Pelley

The algorithm picks from those options based on the kind of content you've engaged with the most in the past.

Frances Haugen

And one of the consequences of how Facebook is picking out that content today is they're just optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions.

Scott Pelley

Misinformation, angry content is enticing to people and keeps them on the platform.

Frances Haugen

Yes. Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money.

____________________________

Original video link: https://youtube.com/watch?v=_Lx5VmAdZSI
Date: 10/05/2021

English transcription check: Los Angeles Pangu Farm – Layka
Translation: Los Angeles Pangu Farm – Layka
Final translation review: Los Angeles Pangu Farm – Mike Li
Subtitles and video production: Los Angeles Pangu Farm – 天涯行
Video review: Los Angeles Pangu Farm – 銀龍
Published by: Los Angeles Pangu Farm – 彩虹Rainbow

Los Angeles Pangu Farm welcomes you to join:

https://discord.gg/2vuvRm7z6U

Disclaimer: This article represents only the author's personal views; the platform assumes no legal responsibility.
