this post was submitted on 27 Nov 2024
200 points (93.9% liked)

Firefox


A place to discuss the news and latest developments on the open-source browser Firefox

founded 4 years ago

They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

[–] dukatos@lemm.ee 2 points 7 hours ago

And I still can't convince it to stop caching images, because it does not follow the RFC.

[–] Treczoks@lemmy.world 11 points 15 hours ago

Luckily, it seems to be disabled by default. At the moment.

[–] EngineerGaming@feddit.nl 6 points 15 hours ago

I wonder if this can be removed at compile time, like Pocket.

[–] Echolynx@lemmy.zip 4 points 13 hours ago (1 children)

Sigh. I'm glad to have switched to LibreWolf.

[–] HouseWolf@lemm.ee 0 points 13 hours ago

I switched a while back before all the Ai and "privacy preserving" telemetry stuff.

Every update note I see for Firefox now just reinforces my decision.

[–] Mwa@lemm.ee 6 points 15 hours ago (1 children)

Wasn't this there for a while, or is it just me?

[–] ByteMe@lemmy.world 3 points 14 hours ago (1 children)

It has been since version 128, I think.

[–] Mwa@lemm.ee 3 points 12 hours ago

I think 130

[–] JokeDeity@lemm.ee 32 points 23 hours ago (2 children)

Unpopular opinion: I think they're doing it about as well as it can be done, at least. It's completely optional and doesn't seem to be intrusive.

[–] Mwa@lemm.ee 4 points 15 hours ago
[–] potentiallynotfelix@lemmy.fish 6 points 22 hours ago (1 children)

Yeah, it's not Google Chrome level, which I'm thankful for.

[–] JokeDeity@lemm.ee 15 points 22 hours ago (1 children)

I'm way more pissed about restarting my PC after an update and having Copilot installed without my permission.

[–] fibojoly@sh.itjust.works 25 points 1 day ago (1 children)

Didn't want it in Opera, don't want it in Firefox. I mean they can keep trying and I'll just keep on ignoring this shit :/

[–] davi@startrek.website 2 points 23 hours ago (1 children)

hopefully, it'll be possible to opt out somehow.

[–] ToxicWaste@lemm.ee 3 points 7 hours ago

as the screenshot shows, it is opt-in

[–] Scrollone@feddit.it 6 points 22 hours ago

Wow, great job Firefox. Thanks.

If I wanted unreliable bullshit like AI, I'd use Chrome.

[–] nu11@sh.itjust.works 38 points 1 day ago (2 children)

I don't understand the hate. It's just a sidebar for the supported LLMs. Maybe I'm misunderstanding?

Yes, I would prefer Mozilla focus on the browser, but to me, this seems like it was done in an afternoon.

[–] PrefersAwkward@lemmy.world 6 points 18 hours ago

It seems like common cynicism. Mozilla added this feature so as not to cede major features to other browsers, and its version natively lets you pick from lots of different AI providers.

Not every feature is for everyone. Not every feature is done being improved on at release.

And in spite of popular opinion, organizations don't just do one thing, then the next thing, and then the thing after that. Organizations can and do focus on and prioritize many things at the same time.

And for the people naysaying AI at every mention: it has a lot of great and fascinating uses, and if you think otherwise, you really should try it more. I've used it plenty for work and life. It's not going away, so we might as well do some nice things with it.

[–] Scrollone@feddit.it 5 points 23 hours ago (1 children)

I want my browser to be a browser. I don't want Pocket, I don't want AI, I don't want bullshit. There are plugins for that.

[–] ToxicWaste@lemm.ee 2 points 7 hours ago

that's the great thing: you don't have to use it

[–] ilinamorato@lemmy.world 10 points 1 day ago (2 children)

This happened ages ago, didn't it? Am I missing something new?

[–] 2kool4idkwhat@lemdro.id 7 points 23 hours ago (1 children)

Yeah, it did. That feature has been there at least since Mozilla enabled the "Firefox Labs" section in settings by default a few months ago, and maybe even earlier than that.

[–] victorz@lemmy.world 6 points 23 hours ago (1 children)
[–] ilinamorato@lemmy.world 4 points 23 hours ago (1 children)

Well, this month in particular....

[–] victorz@lemmy.world 2 points 16 hours ago

True. ❤️

[–] davel@lemmy.ml 104 points 1 day ago
[–] celeste@lemmy.blahaj.zone 7 points 1 day ago (5 children)

If they do it in a privacy-preserving way, this could help them win back market share, which will generally benefit an open internet.

[–] fruitycoder@sh.itjust.works 7 points 1 day ago* (last edited 1 day ago)

I will say, the Le Chat provider is pretty decent. You really can use natural language with it: "Rewrite it with a better rhyme scheme", "remove the last line", and it just gets it.

Why no local option, though? Why no anonymising option?

Edit: There is a right-click option, which does make this actually useful for me now (summarize this!).

Other providers do have RAG options, and Mistral supports making agents with specified documentation too, to at least fine-tune on it (not as good as full grounding, though, IMHO).

[–] ocassionallyaduck@lemmy.world 32 points 1 day ago (10 children)

Thing is, for your average user with no GPU who never thinks about RAM, running a local LLM is intimidating. But it shouldn't be. Any system with an integrated GPU (and the more RAM, the better) can run simple models locally.

The not-so-dirty secret is that ChatGPT 3 vs 4 isn't that big a difference, and neither is leaps and bounds ahead of the publicly available models for about 99% of tasks. For that 1%, people will ooh and aah over it, but 99% of use cases see only marginal gains with 4o.

And the simplified models that run "only" 95% as well? They can use 90% fewer resources and give pretty much identical answers outside of hyper-specific use cases.
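
The resource claim is easy to sanity-check with back-of-envelope math. A rough sketch of the memory needed just to hold the weights of a 7B-parameter model at different precisions (ignoring KV cache and runtime overhead, so real usage is somewhat higher):

```python
# Back-of-envelope memory footprint for model weights at various precisions.
# Ignores activation/KV-cache overhead, so real usage is somewhat higher.

def weight_memory_gb(n_params: float, bits_per_param: float) -> float:
    """Memory in GB needed just to hold the weights."""
    return n_params * bits_per_param / 8 / 1e9

n = 7e9  # a 7B-parameter "smol" model
for label, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4)]:
    print(f"{label}: {weight_memory_gb(n, bits):.1f} GB")
# fp16: 14.0 GB, 8-bit: 7.0 GB, 4-bit: 3.5 GB
```

At 4-bit quantization, a 7B model fits comfortably in the RAM of an ordinary laptop with an integrated GPU.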

Running a "smol" model, as some are called, gets you all the bang for none of the buck, and your data stays on your system and never leaves.

I've been yelling from the rooftops to some stupid corporate types that once the model is trained, it's trained. Unless you are training models yourself, there is no need for the massive AI clusters, just for the model. Run it locally on your hardware at a fraction of the cost.

[–] sinceasdf@lemmy.world 2 points 19 hours ago

Idk, I've noticed pretty significant differences between models of various sizes. I mean, there are lots of metrics on this:

https://www.vellum.ai/llm-leaderboard

[–] LWD@lemm.ee 29 points 1 day ago (1 children)

There's the tragedy with this new feature: they fast-tracked this past more popular requests, sticking it into Release Firefox.

But they only rushed the part that connects to third parties. There was also a "localhost" option which was originally alongside the Big Five corporate offerings, but Mozilla ultimately decided to bury that one inside of the about:config settings.

[–] MrOtherGuy@lemmy.world 11 points 1 day ago

I'm guessing the reason (and a good one at that) is that simply having an option to connect to a local chatbot would just confuse users, because they'd also need the actual chatbot running on their system. If you can set that up, then you can certainly toggle a simple switch in about:config to show the option.
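
For what it's worth, the hidden switch lives in about:config. From memory, the relevant prefs look something like this (names from my recollection of the feature, so verify them in your own build before relying on them):

```
browser.ml.chat.hideLocalhost = false              // un-hides the localhost provider choice
browser.ml.chat.provider = http://localhost:8080   // point the sidebar at your own endpoint
```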

[–] fmstrat@lemmy.nowsci.com 12 points 1 day ago (5 children)

I mean, if you're going to do it, where's the Ollama love?
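
For anyone who wants to wire it up to Ollama anyway (assuming the hidden localhost provider is enabled in about:config), the setup is roughly this. The model name and port below are Ollama's documented defaults, so double-check against your install:

```
# pull a small model and start the local server (Ollama's default port is 11434)
ollama pull llama3.2
ollama serve

# then set the sidebar's provider URL in about:config to:
#   http://localhost:11434
```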

[–] eleitl@lemm.ee 15 points 1 day ago (1 children)

Thanks for nothing, Mozilla.

[–] Rozauhtuno@lemmy.blahaj.zone 24 points 1 day ago (1 children)

They should raise the CEO's pay some more to celebrate.

[–] that_leaflet@lemmy.world 57 points 1 day ago

That was there before 133, don’t remember the exact release that added it.

[–] Zementid@feddit.nl 9 points 1 day ago

Now add support for GPT4All and everyone is happy again.
