this post was submitted on 26 Jun 2024
152 points (94.7% liked)

Firefox


A place to discuss the news and latest developments on the open-source browser Firefox

founded 4 years ago

TL;DR: They're adding opt-in alt-text generation for blind people and an opt-in AI chat sidebar where you can choose the model used (including self-hosted ones).

all 38 comments
[–] slazer2au@lemmy.world 65 points 4 months ago (2 children)
[–] dustyData@lemmy.world 27 points 4 months ago (1 children)

Self-hosted and locally run models also go a long way. 90% of LLM applications don't require users to surrender their devices, data, privacy, and security to big corporations. But that is exactly how the space is being run right now.
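The self-hosting point is concrete: local servers such as llama.cpp's `llama-server` or Ollama expose an OpenAI-compatible HTTP API on localhost, so a client can query a model without any data leaving the machine. A minimal sketch using only the standard library (the endpoint URL and model name are placeholders for whatever you run yourself):

```python
import json
import urllib.request

# Placeholder: llama.cpp's server and Ollama both serve an
# OpenAI-compatible /v1/chat/completions route on localhost.
LOCAL_ENDPOINT = "http://localhost:8080/v1/chat/completions"

def build_chat_request(prompt, model="local-model"):
    """Build an OpenAI-style chat payload. Nothing is sent anywhere
    until you POST it, and then only to a server you run yourself."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def ask_local_llm(prompt):
    """POST the payload to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Any frontend (a browser sidebar included) that speaks this wire format can be pointed at localhost instead of a cloud provider.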

[–] LWD@lemm.ee 6 points 4 months ago (2 children)

And yet, Mozilla went for the 10% that do violate your privacy and give your data to the biggest corporations: Google, Microsoft, OpenAI.

What happened to the Mozilla Manifesto?

[–] xor@lemmy.blahaj.zone 6 points 4 months ago (1 children)

The alternative is only supporting self hosted LLMs, though, right?

Imagine the scenario: you're a visually impaired, non-technical user. You want to use the alt-text generation. You're not going to go and host your own LLM, you're just going to give up and leave it.

In the same way, Firefox supports search engines that sell your data, because a normal, non-technical user just wants to Google stuff, not read a series of blog posts about why they should actually be using something else.

[–] LWD@lemm.ee 4 points 4 months ago (1 children)

The alt text generation is done locally. That was the big justification Mozilla used when they announced the feature.

I'm talking about the non-local ChatGPT stuff.

[–] xor@lemmy.blahaj.zone 2 points 4 months ago (1 children)

Ah, I missed that the alt text specifically is local, but the point stands: allowing (opt-in) access to a third-party service is reasonable, even if that service doesn't have the same privacy standards as Mozilla itself.

To pretty much every non-technical user, an AI sidebar that won't work with ChatGPT (the Google-search equivalent from my earlier example) may as well not be there at all.

They don't want to self-host an LLM; they want the box where ChatGPT goes.

[–] LWD@lemm.ee 2 points 4 months ago

But the alt text generation already leverages a self-hosted LLM. So either Mozilla is going to cook in hundreds of extra megabytes of data for their installs, or people with accessibility issues are going to have to download something extra anyway. (IIRC it's the latter).

We could talk all day about things that Mozilla could add out of the box that would make the user experience better. How about an ad blocker? They can be like Opera, Brave, Vivaldi, even the most ambitious Firefox fork LibreWolf.

But for some reason they went with injecting something into Firefox that nobody was asking for, and I don't think it aligns at all with the average Firefox user's needs or wants. Normies don't use Firefox. They use a browser that doesn't throw up "switch to Chrome or Edge" messages. And if there was some subset of Firefox users who were begging Mozilla for AI, I never saw them. Where were they?

[–] LWD@lemm.ee 10 points 4 months ago (2 children)

If it was truly opt-in, it could be an extension. They should not be bundling this with the browser, bloating it more in the process.

AI already has ethical issues, and environmental issues, and privacy issues, and centralization issues. You technically can run your own local AI, but they hook up to the big data-hungry ones out of the box.

Look at the Firefox subreddit. One month ago, people were criticizing the thought of adding AI to Firefox. Two months ago, same thing. Look at the Firefox community. See how many times people requested AI.

[–] barryamelton@lemmy.ml 12 points 4 months ago (2 children)

> If it was truly opt-in, it could be an extension. They should not be bundling this with the browser, bloating it more in the process.

The extension API doesn't have enough access for this.

> You technically can run your own local AI, but they hook up to the big data-hungry ones out of the box.

While it is opt-in and disabled by default, this is the real problem.

[–] LWD@lemm.ee 3 points 4 months ago

What are they missing? So far, all they've added is a sidebar and a couple extra right-click menu additions. Both of these are available for all extensions.

[–] slazer2au@lemmy.world 9 points 4 months ago

> Look at the Firefox subreddit. One month ago, people were criticizing the thought of adding AI to Firefox. Two months ago, same thing. Look at the Firefox community. See how many times people requested AI.

I believe what most people are concerned about, myself included, was the AI features being enabled automatically and then having to disable them, like every other application does to inflate metrics.

Because this is opt-in, like it says in the blog, I'm OK with it being there and disabled.

[–] ScreaminOctopus@sh.itjust.works 13 points 4 months ago (1 children)

Will you need your own account for the proprietary ones? Mozilla paying for these feels like it couldn't be sustainable long term, which is worrying.

[–] Blisterexe@lemmy.zip 12 points 4 months ago

The proprietary ones are free

[–] Xuderis@lemmy.world 2 points 4 months ago* (last edited 4 months ago) (1 children)

But what does it DO? How is it actually useful? An accessibility PDF reader is nice, but AI can do more than that

> Our initial offering will include ChatGPT, Google Gemini, HuggingChat, and Le Chat Mistral

This is great, but again, what for?

[–] Blisterexe@lemmy.zip 5 points 4 months ago (1 children)

A lot of people use LLMs a lot, so it's useful for them. It's also nice for summarizing long articles you don't have the time to read: not as good as reading the article, but better than skimming it.

[–] rgbd@ursal.zone 2 points 4 months ago (1 children)

@Blisterexe @Xuderis It's true; as a researcher, these models have helped me a lot to speed up reading and identifying specific information in scientific articles. As long as it's privacy-respecting, I view this implementation favorably.

[–] Blisterexe@lemmy.zip 3 points 4 months ago (1 children)

It lets you use any model, so while it lets you use chatgpt, it also lets you use a self-hosted model if you edit about:config
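For reference, the relevant about:config preferences in recent Firefox builds look roughly like this (the `browser.ml.chat.*` names are what Nightly used around the time of this thread and may change between releases, so treat them as an illustration rather than a guarantee):

```
browser.ml.chat.enabled        true
browser.ml.chat.provider       http://localhost:8080    (point at your own server)
browser.ml.chat.hideLocalhost  false                    (show localhost in the provider picker)
```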

[–] Xuderis@lemmy.world 1 points 4 months ago (1 children)

But what does using that in my browser get me? If I'm running Llama 2 locally, I can already copy and paste text into the terminal if I want. Is this just saving me that step?

[–] Blisterexe@lemmy.zip 1 points 4 months ago