Facebook, Twitter, and Google’s automated, algorithmically generated content and ad inventory are being used to track, target, and sell people’s information.
A report by the Electronic Privacy Information Center found that social proof, as defined in the Electronic Communications Privacy Act (ECPA), is “a method of identifying and tracking an individual or group of people based on the information they share online.”
The ECPA allows companies to collect information, such as IP addresses, from individuals without a warrant or other judicial approval.
Social proof also enables advertisers to target users based on their interests. The ECPA requires social media companies to notify consumers when their information is collected and prohibits sharing that information without consent.
A social media company can also be held liable under the ECPA if its algorithm “engages in conduct which may violate” the law.
In addition to social proof and ad targeting, companies are also using automated algorithms to track users as they visit websites and social media pages owned by third parties.
In a March report, the Electronic Frontier Foundation (EFF) said that Facebook’s data collection and use practices violated the privacy rights of its users and of others.
Facebook said that its privacy practices complied with all applicable laws and regulations.
Facebook’s use of social proof to target its ads may also violate privacy laws, including the ECPA.
EFF said that social content has “become an essential part of our lives and we must have the freedom to choose who we share it with and what we do with it.”
Facebook also said that it was not using social proof to target ads to users, and that it had taken steps to ensure that its algorithms did not “target individuals based on any information that is personally identifiable.”
Facebook added that it “takes its obligations under the ECPA seriously and takes appropriate measures to ensure its users’ privacy rights.”
The report’s findings were based on an investigation by EFF and other organizations into Facebook’s ad targeting and tracking practices.
In the report, EFF identified at least five types of automated tracking practices, including Facebook’s and Google’s automated targeting, which can target users by their IP addresses.
The ECPA also requires social media firms to disclose to users the information that Facebook and Google collect from them.
EFF also said Facebook had been unable to clearly explain how its algorithms avoid using social content in ad targeting or tracking.
The EFF report also found that Facebook used its social content analytics tool to target ads based on users’ interests and to identify people who are in groups with those interests.
Facebook was able to collect more information about users in the U.S. than in other countries, so its ads were more likely to be seen by users in the United States than by those elsewhere.
For example, according to the report, Facebook data was used to target people living in the U.S. and other OECD countries based on how many users visited their Facebook pages, and Facebook used that data to target individuals in Germany and France.
Facebook also used this information, as well as information about who had interacted with the user and where that interaction took place, to target advertisements based on who the user was interacting with.
In some cases, Facebook targeted ads based solely on the user’s interests.
According to the Electronic Frontier Foundation, Facebook has “not provided any public evidence that it uses data collected in this manner.”
EFF, a non-profit advocacy organization that has been working to end Facebook’s tracking practices since 2015, called the findings “troubling” and said Facebook’s social content analysis tool “does not appear to meet its own privacy and transparency requirements.”
The EFF’s report also called on Facebook to take steps to “make its tools, and its ad targeting practices, more transparent.”
The company is currently taking steps to improve its social targeting and ad tracking practices by adding a “social proof” feature to its Ads Settings page that helps users see whether they are being targeted based on social content.
“This is not about privacy or security.
It is about profit,” EFF said in a statement.
“Facebook should make its tools and its social sharing practices more transparent, so that its users are aware that Facebook is using data to track their activities.”
EFF added that the report also “reveals how much Facebook and other platforms are paying to collect, analyze and monetize their users’ data.”
Facebook said in an emailed statement to Ars that it would work with the FCC “to address the issues raised by the report.”
EFF has also launched a public comment campaign on the report.
“If Facebook’s own data-mining software has the ability to target and track people based solely on the user-generated data that Facebook collects, the company should be willing to explain why it does so,” EFF wrote.
The FCC has not yet responded.