Get fucked, advertisers.
Advertisers track you with device fingerprinting and behaviour profiling now. Firefox doesn’t do much to obscure the more advanced methods of tracking.
Don’t all the advanced ways rely on JavaScript?
Lots do. But do you know anyone that turns JS off anymore? Platforms don’t care if they miss the odd user for this - because almost no one will be missed.
“Anymore”? I’ve never met a single soul who knows this is even possible. I myself don’t even know how to do it if I wanted to.
I do use NoScript, which does this on a site-by-site basis, but even that is considered extremely niche. I’ve never met another NoScripter in the wild.
Why not just use uBlock Origin’s medium mode?
Roughly similar to using Adblock Plus with many filter lists + NoScript with 1st-party scripts/frames automatically trusted. Unlike NoScript however, you can easily point-and-click to block/allow scripts on a per-site basis.
https://github.com/gorhill/uBlock/wiki/Blocking-mode:-medium-mode
Am I in the wild? I use it.
They probably mean in the flesh.
I don’t really talk about it in meat space, so they just might not have known.
I like the grid add-on for Firefox. It disables pretty much anything third-party by default. You can control cookies separately from everything else, and I can’t remember ever needing to enable those cookies to get a site working properly (whereas sometimes you do need to enable scripting, media, or an iframe for a CDN or something).
The people who I’ve tried to get on NoScript seem to have the brain capacity of goldfish. If the site doesn’t instantly work, it’s as if the sky has fallen and there is no way to convince them to pay attention to which scripts are actually needed.
It’s a rare breed that is willing to put up with toggling different scripts on and off. I’ll also acknowledge that too many people (including me) are in a giant rush. For work-type stuff, I have the laptop without noscript, because sometimes I do need something to work absolutely right now.
You don’t think you are being a tad judgemental?
People whose lives revolve around fashion probably think you dress like shit.
People who love food probably think you eat like shit.
People who love cars probably think you are a shit driver.
You probably love computers and care about privacy, and you are shitting on regular users (assumption, admittedly) for not being invested.
They had something that was working, you present NoScript, and the thing no longer works. If you’re not invested, how are you going to see the appeal of the extra work?
Well, you know what they say. You can lead a horse to water, but you can’t make it interested in learning about the water cycle to have a deeper understanding of why the river flows in the first place.
I go hard with DNS-based ad blocking and I’m constantly confirming it works by checking the network tab in developer tools. I’m basically only seeing first party scripts and CDN assets — 99% of websites load all the tracking garbage from third-party domains that can be easily blocked.
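If you want to do the same check quickly, something like this in the devtools console (a rough sketch, not my exact setup) lists every third-party origin a page pulled resources from:

```ts
// Rough console check on a loaded page: list every origin the page pulled
// resources from, so anything that slipped past DNS blocking stands out.
const thirdPartyOrigins = new Set<string>();
for (const entry of performance.getEntriesByType("resource")) {
  const origin = new URL(entry.name).origin;
  if (origin !== location.origin) {
    thirdPartyOrigins.add(origin);
  }
}
console.log([...thirdPartyOrigins]); // ideally just your CDN, nothing else
```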
Pihole?
It’s a common solution. I do something more involved and manual, but it’s the same concept.
Is it something you can talk about? I’m currently in the process of trying to switch from Pi-hole to pfBlockerNG, but I’m interested in whether there are better alternatives.
uBlock origin + NoScript for me. I deal with the bigger umbrella of scripts with uBlock and then fine tune permissions to the ones that uBlock allowed with NoScript.
They might be fingerprinting me using these two extensions though.
I use LibreJS with few exceptions. If I need to use a site that requires non-free JavaScript, I’ll use a private browsing window or (preferably) Tor Browser.
Not all but most, yes. But TBF, sites that still function with JS disabled tend to have the least intrusive telemetry, and might pre-date big data altogether.
Regardless, unless the extent of a page’s analytics is a “you are the #th visitor” counter, all countermeasures must remain active.
It’s really strange how they specifically mention HTML5 canvas when you can run any fingerprinter test on the internet and see that Firefox does nothing to obfuscate that. You can run a test in Incognito mode, start a new session on a VPN, run another test, and on Firefox your fingerprint will be identical.
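For anyone curious, a bare-bones canvas fingerprint looks roughly like this; real fingerprinters hash far more signals than this sketch does:

```ts
// Minimal sketch of a canvas fingerprint: render some text and read the
// result back. Tiny rendering differences (GPU, drivers, fonts, emoji set,
// anti-aliasing) make the string stable per machine but different across
// machines, unless the browser blanks or noises the readback.
function canvasFingerprint(): string {
  const canvas = document.createElement("canvas");
  canvas.width = 200;
  canvas.height = 50;
  const ctx = canvas.getContext("2d");
  if (!ctx) return "no-canvas";
  ctx.textBaseline = "top";
  ctx.font = "16px Arial";
  ctx.fillStyle = "#f60";
  ctx.fillRect(0, 0, 100, 25);
  ctx.fillStyle = "#069";
  ctx.fillText("fingerprint me 😊", 2, 2);
  return canvas.toDataURL(); // identical on every visit unless the browser intervenes
}
```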
Well yeah, they’re just blocking known fingerprinting services. If you use a tool that they don’t recognize, it’ll still work, but their approach will still block the big companies that can do the most harm with that data.
The only alternative is probably to disable WebGL entirely, which isn’t a reasonable thing to do by default.
WebGL
I wish Firefox had a per-site or per-domain preference for WebGL (as well as for wasm, etc), the same way we have per-site cookies or notifs preferences. It’d help clear most issues regarding this.
Honestly, that would be hard to do. There are perfectly legitimate, everyday uses for pretty much everything used in fingerprinting. Taking them away or obscuring them in one way or another would break so much.
LibreWolf has Resist Fingerprinting (RFP), which goes pretty far.
Every LibreWolf install reports the same Windows user agent, etc. But there are downsides: time zones don’t work, and sites don’t use dark mode by default.
And even then, EFF’s Cover Your Tracks site can still uniquely identify me, mainly through window size. That’s one of the reasons why Tor Browser uses letterboxing to make the window size consistent.
LibreWolf supports letterboxing as well, though the setting might be disabled by default.
Oh neat! I just tried it, and it seems it’s broken on Gnome when using 125% scaling though :/ Still cool to have the feature!
I also just figured out how to expose dark mode and my timezone even with RFP, which is useful.
I don’t know what letterboxing is. But if window size is used to identify me, can’t it be circumvented simply by using the window in restored size, and not maximised?
Your restored window size is even more unique than your maximised window size!
The correct solution is to just not make the window size available to JS or to remotes at all. There’s no reason to ever need specifics on window size other than CSS media-queries, and those can be done via profiles.
But the restored size keeps changing - can’t be profiled, right?
And how do I not make the size available “to JS or to remote”?
Changing the source code of the browser, unfortunately. I don’t know what Tor Browser does or how, but basically you’d have to do about the same as they do.
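For context, this is roughly what any script on a page can read without asking, which is why it takes the browser itself (letterboxing, RFP) to lie about it. Purely illustrative:

```ts
// What any script on a page can read about your window, no permissions needed.
// Combined with other stable signals, even a "random" restored size narrows
// you down rather than hiding you.
const windowSignals = {
  inner: [window.innerWidth, window.innerHeight],
  outer: [window.outerWidth, window.outerHeight],
  screen: [screen.width, screen.height, screen.availWidth, screen.availHeight],
  dpr: window.devicePixelRatio,
};
console.log(windowSignals);
```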
EU outlaws it
The EU isn’t the only place on the planet, even if its laws have an impact.
Yeah, you need uMatrix, although it can be tricky to use.
There is still plenty of fish for advertisers, sadly.
For those who don’t care to read the full article:
This basically just confines any cookies created while you’re on a site to that site.
So instead of a cookie from, say, Facebook being set on site A and then read back for tracking purposes on site B, each individual site gets its own separate Facebook cookie that only gets used on that site, preventing it from tracking you anywhere outside of the specific site you got it from in the first place.
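If it helps, here is a toy model of the idea (not Firefox’s actual data structures, just the shape of it):

```ts
// Toy model of cookie partitioning. Before: one global jar per cookie domain.
// After (Total Cookie Protection): the jar is keyed by the top-level site you
// were on when the cookie was set, so the facebook.com cookie saved while you
// were on siteA never shows up when Facebook content is embedded on siteB.
type CookieJar = Map<string, string>; // cookieDomain -> cookie value

const partitions = new Map<string, CookieJar>(); // topLevelSite -> its own jar

function setCookie(topLevelSite: string, cookieDomain: string, value: string): void {
  const jar = partitions.get(topLevelSite) ?? new Map<string, string>();
  jar.set(cookieDomain, value);
  partitions.set(topLevelSite, jar);
}

function getCookie(topLevelSite: string, cookieDomain: string): string | undefined {
  return partitions.get(topLevelSite)?.get(cookieDomain);
}

// Facebook sets a cookie while you browse siteA...
setCookie("siteA.example", "facebook.com", "id=123");
getCookie("siteA.example", "facebook.com"); // "id=123"
// ...but the embedded Facebook button on siteB gets a different, empty jar.
getCookie("siteB.example", "facebook.com"); // undefined
```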
Hahahahaha so it doesn’t break anything that still relies on cookies, but neuters the ability to share them.
That’s awesome
Honestly, I thought that’s how it already worked.
Edit: I think what I’m remembering is that you can scope cookies by site/domain and restrict them to just those, and you normally would, for security reasons.
But some asshole sites like Facebook set cookies that are world-readable for tracking, and this breaks that.
Someone correct me if I got it wrong.
Total Cookie Protection was already a feature (introduced on Feb 23, 2021), but it was only for people using Firefox’s Enhanced Tracking Protection (ETP) in strict mode.
For users who didn’t have ETP in strict mode, they had a less powerful third-party cookie blocking feature that only blocked third-party cookies from specific block lists (i.e. known tracking companies).
This just expands that original functionality by making it apply to any domain, and makes it the default for all users rather than an opt-in feature of Enhanced Tracking Protection.
That’s not what I was thinking of, which was even more fundamental. But that’s good info (and another way to cover stuff in the article).
Edit: what I was thinking originally was really stupid, that 3rd-party cookies weren’t allowed at all. Which was really dumb since of course they are.
No, you weren’t far off. A single site can only get and set cookies on its domain. For example, joesblog.com can’t read your Facebook session cookie, because that would mean they could just steal your session and impersonate you.
But third-party cookies are when joesblog.com has a Facebook like button on each post. Those resources are hosted by Facebook, and when your browser makes that request, it sends your Facebook cookies to Facebook. But this also lets Facebook know which page you’re visiting when you make that request, which is why people are upset.
With this third-party cookie blocking, when you visit joesblog.com and it tries to load the Facebook like button, either the request or just the request’s cookies will be blocked.
Although that raises an interesting question. Facebook is at facebook.com, but its resources are all hosted under fbcdn.com. Have they just already built their site to handle this? Maybe they just don’t strictly need your facebook.com cookies to load scripts, images, etc. from fbcdn.com.
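Worth noting how the domain rule itself works, since it answers part of that. A simplified sketch; the real rule in RFC 6265 has more nuance (host-only cookies, the public suffix list):

```ts
// Simplified cookie domain matching: a cookie set with Domain=facebook.com is
// sent to facebook.com and its subdomains, and to nothing else. joesblog.com
// never sees it, and a separate registrable domain (like an fbcdn host) gets
// its own, entirely separate cookies.
function domainMatches(cookieDomain: string, requestHost: string): boolean {
  return (
    requestHost === cookieDomain ||
    requestHost.endsWith("." + cookieDomain)
  );
}

domainMatches("facebook.com", "www.facebook.com"); // true
domainMatches("facebook.com", "joesblog.com");     // false
domainMatches("facebook.com", "static.fbcdn.com"); // false: different domain
```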
They’ve been doing this with container tabs, so this must be the successor to that idea (I’m going to assume they’ll still have container tabs).
Container tabs are still useful, as they let you use multiple cookie jars for the same site. So it is very easy to have multiple accounts on a site.
Unless that cookie was somehow important for you to use both sites, but that’s incredibly rare.
From my experience, blocking 3rd party cookies in general doesn’t seem to make any difference for site functionality anyways. Though I never log into sites with a Google or FB account other than Google or FB sites (and rarely at all for the latter).
I would love to see an icon of a neutered cookie please 🥺😄.
Basically creates a fake VM like environment for each site.
For those who don’t care to read the full article
Or even the whole title, really
I don’t know why this wasn’t the case long ago.
It increases the implementation complexity of the browser, and it loses $$$ for the people who fund Firefox and contribute code.
Isn’t this basically Firefox’s version of the third party cookie block that Chrome rolled out a few months ago? Or am I missing something here?
I mean, it’s good news either way but I just want to know if this is somehow different or better.
Disabling cross-site cookies has already been a thing for decades…
Same with Do Not Track requests.
Do Not Track has never really done anything, it just asks websites politely to not track you. There’s no legal or technical limitation here.
I’d still much rather have it than not. It also led to the spiritual successor GPC, which does actually have regulatory requirements under the CCPA.
Fair. However, it also provides websites with additional information to fingerprint you, so that’s a thing too.
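To illustrate: both signals are readable by any script. Illustrative only; neither property is reliably in TypeScript’s DOM types (one is deprecated, one is still a draft), hence the cast:

```ts
// The irony of "please don't track me" signals: any script can read them,
// so they add a couple of bits to your fingerprint.
const nav = navigator as any; // doNotTrack is deprecated, globalPrivacyControl is a draft
const privacySignals = {
  dnt: nav.doNotTrack ?? "unset",            // "1", "0", or unset
  gpc: nav.globalPrivacyControl ?? "unset",  // true where supported
};
console.log(privacySignals);
```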
Disabling cross site cookies and allowing them to exist while siloed within the specific sites that need them are two different things.
Previous methods of disabling cross site cookies would often break functionality, or prevent a site from using their own analytics software that they contracted out from a third party.
Thank you for your explanation, that greatly clears up my confusion.
TBH, if a person’s concern is being tracked by, for example, Facebook, then this just lets Facebook continue tracking them; it only stops Facebook’s analytics customers from directly following them to another site (though indirectly that information can still be provided). But I guess for all the people giving FB and Google those privileges, better to have this than not.
I think this tips it over the edge for me to switch to Firefox
I hope so! It’s a wonderful side of the Internet to be on
Really? This is what does it for you?
Ur… Yeah?
I miss Mozilla the product.
I unironically miss Netscape Communicator. Yes, most of that went into Firefox, but not all, and I really miss the frame layout.
I used the shit out of their WYSIWYG HTML editor when I was an up and coming little script kiddie.
I prefer Waterfox. Hard to trust the Mozilla corpos.
As long as it’s not Chromium, I’m happy people aren’t just handing over the keys to the Internet to Google.
Yeah, Waterfox is just another browser built on top of Mozilla’s Gecko engine, but without all the AI dickriding.
How terrible to offer client-side translation or webpage description for differently abled people!
Client side incorrect translations*
How incorrect is it?
Sentences are a lot like math problems. An incorrect part changes the entire outcome.
I haven’t seen anything to signal Mozilla is untrustworthy other than from that one right wing guy with a chip on his shoulder.
Most of the revenue of Mozilla Corporation comes from Google (81% in 2022) in exchange for making it the default search engine in Firefox.
Source: Wikipedia
Other issues I have with Firefox are the telemetry bits, the way they’ve handled some of their employees (laying one guy off because he had cancer), the lack of meaningful updates and features in the last decade, and the CEO granting herself a nice pay rise after doing, well, nothing really. The list goes on and on, honestly.
Don’t get me wrong, you should still use Firefox or a Firefox-derived browser if you care about a free internet. I myself use Firefox (although I just switched to Zen Browser on my PC, which is based on Firefox). However, we shouldn’t blind ourselves just because we hate anything Google-based and/or closed source. Firefox is still backed by a for-profit company which is, as I quoted earlier, funded at least 80% by Google.
On the positive side, it seems that in the last 2-3 months Firefox has been pumping out meaningful updates (even on mobile). Things seem to be taking a positive turn recently, and I’m actually a bit excited to see where Firefox goes from here.
The Mozilla Corporation is a for profit entity owned by the non-profit Mozilla Foundation, which lets them claim to be a nonprofit, which is a sketchy looking way to set up and promote your business if nothing else. They get most of their money from Google and they’ve been riding AI like all the other unethical companies.
I see absolutely no reason to give them a chance, either. Just use an actual open source build instead of the mainstream one.
Yup. Nobody else gets those cookies.
Aren’t cookies already limited to the site at which they were created??
What the fuck? You mean to tell me sites have been sharing cookies?
I thought all browsers only delivered cookies back to the same site.
The problem is that a website is generally not served from one domain.
Put a Facebook like button on your website, it’s loaded directly from Facebook servers. Now they can put a cookie on your computer with an identifier.
Now every site you visit with a Facebook like button, they know it was you. They can watch you as you move around the web.
Google does this at a larger scale. Every site with Google ads on it. Every site using Google analytics. Every site that embeds a Google map. They can stick a cookie in and know you were there.
Is this also how they know which ads to feed you?
Yes, it’s the reason for the tracking. To sell more targeted ads.
If you’re up for reading some shenanigans, check out the book Mindf*ck. It’s about the Cambridge Analytica scandal, written by a whistleblower, and details election manipulation using data collected from Facebook and other public or purchased data.
Is that because the like button is an iframe?
It doesn’t have to be. Your browser sends the cookies for a domain with every request to that domain. So you have a website, example.com, that embeds a Facebook like button from Facebook.com.
When your browser downloads the page, it requests the different pieces of the page. It requests the main page from example.com, your browser sends any example.com cookies with the request.
Your browser needs the JavaScript, so it sends the example.com cookies with the request for the JavaScript file. It needs the like button, so it sends a request off to Facebook.com and sends the Facebook.com cookies with it.
Note that the request to example.com doesn’t send the cookies for Facebook.com, and the request to Facebook.com doesn’t send the cookie for example.com to Facebook. However, it does tell Facebook.com that the request for the like button came from example.com.
Facebook puts an identifier in the cookie, and any request to Facebook sends that cookie and the site it was loaded on.
So you log in to Facebook, it puts an identifier in your cookies. Now whenever you go to other sites with a Facebook like button (or the Facebook analytics stuff), Facebook links that with your profile.
Not logged in? Facebook sets an identifier to track you anyway, and links it up when you make an account or log in.
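Putting it together, the receiving end can be about this simple. A hypothetical sketch: the cookie name fb_id and the logging are made up, but the headers are the real ones a browser sends:

```ts
// Hypothetical sketch of what a tracker can log from a single like-button
// request. Header names are real; the cookie name and logging are invented.
function logVisit(headers: Map<string, string>): void {
  const cookie = headers.get("cookie") ?? "";          // e.g. "fb_id=abc123"
  const referer = headers.get("referer") ?? "unknown"; // the page that embedded the button
  const userId = cookie.match(/fb_id=([^;]+)/)?.[1] ?? "new-visitor";
  // One request == one row in a browsing-history database you never agreed to.
  console.log(`user ${userId} was reading ${referer}`);
}

logVisit(new Map([
  ["cookie", "fb_id=abc123"],
  ["referer", "https://example.com/some-post"],
]));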
Thank you for the explanation!
How is Facebook able to know what site is requesting it? Is it in the referer header, or is it parameters in the javascript/image url?
There is a referer header sent, but depending on the exact code added to the page, it’s very likely they are loading a snippet of JavaScript that lets them collect other information and trigger their own sending of information to their server.
For example, Google Analytics works via JavaScript added to the page, but loading fonts from Google’s CDN (which many sites do) will rely on the Referer header.
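A generic analytics snippet boils down to something like this. This isn’t any vendor’s actual code, and tracker.example is a placeholder domain:

```ts
// Sketch of what an embedded analytics snippet typically collects and sends.
const payload = JSON.stringify({
  page: location.href,          // exactly which page you're on
  referrer: document.referrer,  // where you came from
  screen: `${screen.width}x${screen.height}`,
  language: navigator.language,
});
// sendBeacon fires the request even if you navigate away immediately.
navigator.sendBeacon("https://tracker.example/collect", payload);
```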
I know Facebook and Reddit are in cahoots.
I went to visit Reddit a couple weeks back to read the Deadpool & Wolverine comments, but used the wrong container tab and now Facebook feeds me endless Marvel related stuff.
A lot of it is culture war bullshit too. Hmmmmm 🤔
Oh, I know how this works, and it’s not the way you think. It’s somehow better and worse at the same time.
I’m going to describe the process using a hypothetical situation:
You decide to try a new shampoo but you’re not sure what to buy. You ask your friend “hey, what shampoo do you use?” and they tell you they use Head and Shoulders.
Later that night, you google Head and Shoulders and read reviews.
The next day, your friend gets Head and Shoulders ads on YouTube and Facebook and Instagram, etc.
This is because Google knows both of your locations and search history. It sees that you two were within a few feet of each other for hours and decides to shoot ads at you both, based on what either of you has searched recently.
This is called proximity-targeted advertising, and I think it’s gross.
But this is why so many people say things like “we were talking about it and now I’m seeing ads, they must be spying on me”.
No, you don’t know anything. Just because something happened to you once and you have a suspicion doesn’t mean you can be sure in any way.
Nah I’m sure.
I never once saw a post about Marvel fed to me by Facebook and now it’s constant
Did it start after the extremely popular marvel movie “Deadpool and Wolverine” released?
Lol that’s your argument for why you think they don’t know what they’re talking about? Because all you did is make yourself seem like you have no idea how cookies work 🤣
It’s just one single person who noticed something once.
That’s an awful awful sample size absolutely filled with bias and thought fallacies.
Before. “Couple weeks” is more like 5 months at this point now that I think about it.
I don’t mind some of it much, but the obvious culture-war bait is infuriating.
It’s not because the movie just came out. I’ve been diligent about keeping Facebook and Reddit in their container tabs for years. And it’s Marvel stuff in general, not just the movie. Marvel’s been putting out huge movies for years and this hasn’t happened around any of their other releases.
Why aren’t you just using the official automatic Facebook container?
I made it before that was a thing. Habit I guess. 🤷♂️
NO.
https://en.m.wikipedia.org/wiki/Third-party_cookies
Maybe it’s not allowed in your local jurisdiction? But it’s been a problem since forever.
Why are we posting 2 year old articles as though they are new?
Looks like the article was updated today. I’m guessing this was originally covering an announcement for a future rollout and now it’s finally happening?
this article has not been edited, is from 2022, and says the feature was rolled out in June.
Maybe. Confusing decision on the part of Mozilla though, if so. I was checking to see if they mentioned which version this update happened in, but couldn’t find it. Then I noticed the original post date. Weird.
I guess it says updated, but hey. PR for Firefox is cool, until the imminent enshittification.
The moment that Firefox goes too far, it’ll immediately be forked and 75% of the user base would leave within a few months. Their user base is almost entirely privacy-conscious, technologically savvy people.
Depends on how it “goes too far”. What I am, for example, afraid of is the possibility of removing Manifest V2 support. Maintaining a fork that keeps it, against such a significant change, would get more and more difficult as time goes on.
I think that would be an example of a wildly unpopular change, yeah.
Firefox did an add-on genocide years ago and it obviously didn’t hurt them in the long run.
I agree, but something will have to change, because Chrome will swallow ALL of that. Just today some back-end problem was messing up all my stuff, and co-workers were asking, "did you try a different browser?" botch no I did not try Netscape
Not sure what you mean - I don’t think most of the people still using Firefox are going to switch to a Chromium based browser any time soon, I can’t speak for everyone of course but it feels like Firefox users tend to have an ideological objection to Google having a monopoly on web browsers.
It’s always worth trying a different browser when you have issues on websites - there are a lot of things that can be different beyond the layout and javascript engines - cookies, configuration, addons, etc. Yesterday I noticed a big difference between Chromium and Firefox in that even if you hard-refresh on a HTTP/2 connection, Chromium reuses a kept-alive connection, and firefox doesn’t — I would totally argue that Firefox’s implementation is more correct, but Chrome’s implementation will lead to a better experience for users hard-refreshing.
Personally, I remember chrome always flash banging me when on a website with a dark background and I clicked to the next page because apparently clearing the page to the same RGB value as what is set as the HTML background is too hard so they just always clear with pure white. But they did have a faster JS engine. Not sure anymore, haven’t given enough of a shit to try anything but firefox in years now.
I meant, talking to coworkers: yes, I already tried Chrome, Edge, etc. (not sure what “etc” would include). It’s not worth explaining what little I know of Chromium, and it doesn’t matter. I’m aware that it comes down to Chromium or Firefox when getting a page to work. Random coworkers don’t know or care.
“I agree [with the opposite of what you said]. Also, here, have an irrelevant anecdote that includes a funny misspelling and a supposed diss of FF from 1999”
Does this make containers unnecessary? Or basically built in?
A lot different. Containers act as a separate instance of Firefox. So any sites you visit within a container can see each other as if you were using a browser normally. The containers can’t see the stuff from other containers though. So you have to actively switch containers all the time to make it work right.
This keeps cookies locked to each site that uses them, so it’s a lot stronger.
So what you’re saying is, each site gets its own container?
I think there’s some confusion here. You’re talking about Multi-Account Containers, that person was talking about the Facebook Container. Both Firefox features with confusingly similar names, and honestly that’s on Firefox for naming them.
Facebook Container is similar to this TCP feature, but focused on Facebook. And of course it was a separate extension, so very opt-in. Now, Firefox has rolled it out for ALL sites by default, which is awesome and SHOULD HAVE BEEN HOW COOKIES WORKED IN THE FIRST PLACE!
Isn’t there also a non-extension container feature? I can’t tell what the difference is between that one and Multi-Account Containers.
Yeah, this basically sounds like it takes the Temporary Containers add-on, which I think was folded into Firefox at some point recently, and just does it behind the scenes now on a per-domain basis.
Containers also handle other cached content besides cookies.
It makes the tracking-protection part of containers obsolete; this is basically that functionality, but built in and on by default. Containers still let you have multiple cookie jars for the same site, so they are still useful if you have multiple accounts on a site.
FREEEDOOOOOOOOOOOM
Good to see Firefox still has value to provide
Firefox is awesome.
Is this different from blocking 3rd party cookies?
A little. If a third-party cookie is set while you’re visiting a site, only that site will get the third-party cookie back. Multiple sites can have embedded content setting third-party cookies, and with this change Firefox tracks where each one was made and only gives it back there.
With this change it doesn’t matter if the cookie is first-party or third-party or whatever; cookies only get sent back when the site in your location bar matches the one you were on when they were created.
Mozilla completes what Google was too afraid to finish.
ah yes, the other TCP
Maybe they should patent it, to protect their TCP IP.
Or have some higher tier version called Ultimate Cookie Protection {UDP)
Wouldn’t that be Ultimate Dookie Protection?
Dammit, yes
I’d prefer a security-oriented Secure Cookie Total Protection (SCTP)
LOL
Tasty Consensual Photos
Is this the reason why I have to “confirm it’s you” every time I sign into a Google service now? I appreciate the fact that Firefox’s protection is so good that Google doesn’t recognize my PC anymore, but it’s extremely annoying to have to pull out my phone every time I want to watch YouTube.
This might be what finally convinces me to ditch Google for good. Good job, Firefox devs.
No. That’s just Google trying to pester you into using Chrome.
I actually had a problem where on Chrome, I would be signed out of my google account every time I restart my computer, while on Firefox, everything works normally. I use Firefox now lol.
This wouldn’t make you have to log in every time you watch YouTube. It means by signing in to google.com, youtube.com can’t tell that you’re signed in. If you sign in on youtube.com, you’ll stay signed in on youtube.com unless you have something else deleting your cookies.
Well, I’ve had my cookies set to delete every time I close the browser for several years now, but FF only now started doing this verification thing. A week ago all I had to do was enter my email and password.
If you’re already deleting all your cookies every time you close, then this new change should be identical to your first login of the day when your browser has no cookies. If you’re only getting 2fa requests after this change, then maybe you weren’t actually deleting every cookie, and Google was still fingerprinting you somehow.
You may want to just use tab containers for youtube, so that it maintains your session, but also isolates it.
Best way to use such (para)sites.
This article is from 2022
It was updated today. 2 years ago it was just an announcement of a beta function in private browsing, the full rollout happened with 129.0.2 which was released a few days back.
Cool, thanks. How’d you find the version number? I was looking on the linked post but didn’t find it. Maybe just me being tired.
I don’t think it was in the article, but I updated to 129.0.2 yesterday and checked the enhanced tracking protection settings, and block cross-site cookies is now in the default profile, so that was my assumption since it wasn’t there previously.
but it’s extremely annoying to have to pull out my phone every time I want to watch YouTube
This sounds wild. What is your setup? You are using Youtube directly and unmitigated?
At the moment yes cause I’m too lazy/ADHD to switch to NewPipe.
It’s okay to say lazy. Not everything is ADHD. You’re just lazy.
Hahaha that’s right in my feels and I’m not in this thread
FWIW I’ve been using the lazy excuse all my life until I got the ADHD diagnosis a year ago in my 30s.
Article from JUNE 14, 2022
I wonder how long until all the distros have this.
This is old news, from 2022!!
From the blog post:
“June 14, 2022”
“Updated Aug. 28, 2024”
“And starting in 2024, all our users can look forward to Firefox blocking even more third party cookies.”
Except it’s still out of date, because it mentions Chrome also blocking third-party cookies, when at this point Google has announced that they’ve abandoned that course of action.