
The Facebook whistleblower has come forward: she is the former lead product manager of Facebook's civic misinformation team

The whistleblower who has been feeding Facebook's internal research to The Wall Street Journal revealed her identity this Sunday (October 3). She is Frances Haugen, a former lead product manager on Facebook's civic misinformation team. In an interview that day with CBS's 60 Minutes, she said she came forward because she could no longer accept Facebook consistently putting its own profits ahead of the public interest.

Haugen, 37, holds a degree in electrical and computer engineering from Olin College of Engineering in Massachusetts and an MBA from Harvard. An expert in algorithmic product management, she has worked at Google, Pinterest, Yelp, and Facebook. She joined Facebook in 2019 to work on civic integrity, specifically managing election risks including misinformation, and left the company in May of this year.

Haugen said the documents she obtained were openly available inside Facebook, accessible to nearly all employees. In recent weeks, based on the documents she provided, The Wall Street Journal has exposed Facebook's VIP program XCheck, how Facebook's algorithms drive polarization, how drug dealers and human traffickers openly exploit the platform, and Instagram's negative effects on teenagers.

Haugen has not only leaked to The Wall Street Journal but has also filed complaints with the US Securities and Exchange Commission (SEC), arguing that Facebook, as a US-listed company, concealed information material to investors.

In the 60 Minutes interview, she said she repeatedly saw inside Facebook that whenever the company's interests conflicted with the public interest, Facebook chose its own. She had worked at many social networks, she said, and Facebook was the worst of them.

Haugen gave several examples. One was that she joined Facebook to protect election integrity and curb misinformation, yet after the US election, Facebook decided to dissolve the civic integrity team, which helped make Facebook one of the platforms where the Capitol rioters organized. Facebook did put safety systems in place to reduce misinformation for the 2020 US presidential election, but many of those changes were only temporary.

Another example involves Facebook's algorithm. She said Facebook may have thousands of pieces of content it could recommend to a user, but when it can show only around 100, it picks the categories the user has viewed most in the past. This so-called optimized content, however, is often misinformation or anger-inducing material, because such content is more engaging: it keeps users on Facebook longer, making them more likely to click on the ads Facebook serves and generating more revenue for the company. Facebook knows full well that changing the algorithm would cut into that revenue.
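The ranking logic Haugen describes — selecting a small slice of candidate posts weighted toward the categories a user engaged with most — can be sketched roughly as follows. This is a hypothetical illustration only, not Facebook's actual code; the field names (`category`, `engagement`) and the multiplicative scoring are assumptions made for the example.

```python
from collections import Counter

def rank_feed(candidates, user_history, limit=100):
    """Pick the top `limit` posts, favoring categories the user viewed
    most in the past. A simplified sketch of engagement-based ranking,
    not Facebook's real algorithm."""
    # Count how often the user previously engaged with each category.
    category_weight = Counter(post["category"] for post in user_history)

    # Score each candidate by the user's past interest in its category,
    # scaled by the post's own engagement signal (e.g. reaction count).
    # Counter returns 0 for unseen categories, so novel topics sink.
    def score(post):
        return category_weight[post["category"]] * post["engagement"]

    return sorted(candidates, key=score, reverse=True)[:limit]

# Example: a user who mostly viewed political posts.
history = [{"category": "politics"}, {"category": "politics"},
           {"category": "sports"}]
candidates = [
    {"id": 1, "category": "politics", "engagement": 5},
    {"id": 2, "category": "sports", "engagement": 9},
    {"id": 3, "category": "cooking", "engagement": 20},
]
top = rank_feed(candidates, history, limit=2)
```

Note how the highest-engagement post overall (`id` 3) is excluded entirely because the user never viewed that category: optimizing for past engagement narrows what people see, which is the dynamic Haugen criticizes.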

Haugen believes Facebook amplifies the worst in human nature, while social networks, she believes, could instead be used to bring out its best.

Before airing the Haugen interview, 60 Minutes also invited Facebook executives to appear, but Facebook declined. Ahead of the broadcast, Facebook circulated an internal memo explaining its position on polarization and elections in an attempt to steady morale. It also gave 60 Minutes a statement saying its teams strive every day to balance protecting people with protecting free expression, and that they work to curb the spread of misinformation and harmful content, implying that it is wrong to suggest Facebook encourages bad content and does nothing about it.

Facebook also stressed that if any single study could pinpoint a precise solution to these problems, then governments, the tech industry, or society would have solved them long ago.


Facebook accused of putting "profit first"? Founder Zuckerberg protests: it's not true

The whistleblower accused Facebook of valuing profit over public safety and the public's well-being. Founder and CEO Mark Zuckerberg pushed back in a Facebook post, calling the claim "just not true."

ZDNet, a news site covering business and IT, reported that Zuckerberg posted to his own Facebook page a letter to employees, writing: "We care deeply about issues like safety, well-being and mental health, so it's hard to accept coverage that misrepresents Facebook. The claim that we deliberately push content that makes people angry for profit is deeply illogical."

"Facebook makes money from ads, and advertisers consistently tell us they don't want their ads next to harmful or angry content. And I don't know of any tech company that would build products that make people angry or depressed, whether on moral grounds, business grounds, or out of concern for public reaction."

Frances Haugen, the former lead product manager on Facebook's civic misinformation team, became a troublesome whistleblower for her former employer after leaving. She told the Senate that Facebook pursues growth at all costs, including at the expense of public safety, fueling more division, harm, lies, threats, and conflict.

Zuckerberg countered: "If we wanted the world to ignore Facebook's problems, why would we create an industry-leading research program to understand our own company? If we didn't care about harmful content, why would we employ more people dedicated to studying and fighting it? If we wanted to hide experts' findings about Facebook, why would we build industry-leading transparency and explain our standards?"

He also denied claims that Facebook harms children's safety and well-being: "When it comes to young people's health or well-being, every negative experience matters... We have spent years on industry-leading efforts to help people in those moments. I'm proud of our work, because we constantly use research to make Facebook better."

Hey everyone: it’s been quite a week, and I wanted to share some thoughts with all of you.

First, the SEV that took down all our services yesterday was the worst outage we’ve had in years. We’ve spent the past 24 hours debriefing how we can strengthen our systems against this kind of failure. This was also a reminder of how much our work matters to people. The deeper concern with an outage like this isn’t how many people switch to competitive services or how much money we lose, but what it means for the people who rely on our services to communicate with loved ones, run their businesses, or support their communities.

Second, now that today’s testimony is over, I wanted to reflect on the public debate we’re in. I’m sure many of you have found the recent coverage hard to read because it just doesn’t reflect the company we know. We care deeply about issues like safety, well-being and mental health. It’s difficult to see coverage that misrepresents our work and our motives. At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted.

Many of the claims don’t make any sense. If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we’re doing? And if social media were as responsible for polarizing society as some people claim, then why are we seeing polarization increase in the US while it stays flat or declines in many countries with just as heavy use of social media around the world?

At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That’s just not true. For example, one move that has been called into question is when we introduced the Meaningful Social Interactions change to News Feed. This change showed fewer viral videos and more content from friends and family — which we did knowing it would mean people spent less time on Facebook, but that research suggested it was the right thing for people’s well-being. Is that something a company focused on profits over people would do?

The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content. And I don’t know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction.

But of everything published, I’m particularly focused on the questions raised about our work with kids. I’ve spent a lot of time reflecting on the kinds of experiences I want my kids and others to have online, and it’s very important to me that everything we build is safe and good for kids.

The reality is that young people use technology. Think about how many school-age kids have phones. Rather than ignoring this, technology companies should build experiences that meet their needs while also keeping them safe. We’re deeply committed to doing industry-leading work in this area. A good example of this work is Messenger Kids, which is widely recognized as better and safer than alternatives.

We’ve also worked on bringing this kind of age-appropriate experience with parental controls for Instagram too. But given all the questions about whether this would actually be better for kids, we’ve paused that project to take more time to engage with experts and make sure anything we do would be helpful.

Like many of you, I found it difficult to read the mischaracterization of the research into how Instagram affects young people. As we wrote in our Newsroom post explaining this: “The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced. In fact, in 11 of 12 areas on the slide referenced by the Journal — including serious areas like loneliness, anxiety, sadness and eating issues — more teenage girls who said they struggled with that issue also said Instagram made those difficult times better rather than worse.”

But when it comes to young people’s health or well-being, every negative experience matters. It is incredibly sad to think of a young person in a moment of distress who, instead of being comforted, has their experience made worse. We have worked for years on industry-leading efforts to help people in these moments and I’m proud of the work we’ve done. We constantly use our research to improve this work further.

Similar to balancing other social issues, I don’t believe private companies should make all of the decisions on their own. That’s why we have advocated for updated internet regulations for several years now. I have testified in Congress multiple times and asked them to update these regulations. I’ve written op-eds outlining the areas of regulation we think are most important related to elections, harmful content, privacy, and competition.

We’re committed to doing the best work we can, but at some level the right body to assess tradeoffs between social equities is our democratically elected Congress. For example, what is the right age for teens to be able to use internet services? How should internet services verify people’s ages? And how should companies balance teens’ privacy while giving parents visibility into their activity?

If we’re going to have an informed conversation about the effects of social media on young people, it’s important to start with a full picture. We’re committed to doing more research ourselves and making more research publicly available.

That said, I’m worried about the incentives that are being set here. We have an industry-leading research program so that we can identify important issues and work on them. It’s disheartening to see that work taken out of context and used to construct a false narrative that we don’t care. If we attack organizations making an effort to study their impact on the world, we’re effectively sending the message that it’s safer not to look at all, in case you find something that could be held against you. That’s the conclusion other companies seem to have reached, and I think that leads to a place that would be far worse for society. Even though it might be easier for us to follow that path, we’re going to keep doing research because it’s the right thing to do.

I know it’s frustrating to see the good work we do get mischaracterized, especially for those of you who are making important contributions across safety, integrity, research and product. But I believe that over the long term if we keep trying to do what’s right and delivering experiences that improve people’s lives, it will be better for our community and our business. I’ve asked leaders across the company to do deep dives on our work across many areas over the next few days so you can see everything that we’re doing to get there.

When I reflect on our work, I think about the real impact we have on the world — the people who can now stay in touch with their loved ones, create opportunities to support themselves, and find community. This is why billions of people love our products. I’m proud of everything we do to keep building the best social products in the world and grateful to all of you for the work you do here every day.
