this post was submitted on 09 Sep 2024
491 points (95.7% liked)

Firefox


A place to discuss the news and latest developments on the open-source browser Firefox


The poll is over, and the result is clear:

#Firefox users have very little interest in chatbot integration in their browser.

I am very much aware that the people who voted in this poll are hardly a representative sample, but more than 2.4K respondents is a larger sample than many "professional" opinion polls use.

@mozilla & @firefox should take people who actually care about their #browser choice seriously.

I still seriously believe that #Mozilla's fate matters,

https://berlin.social/@mina/113102817500429735

1/3

[–] 1984@lemmy.today 5 points 2 months ago* (last edited 2 months ago) (3 children)

I think it can be useful for some users, but hardly for the majority.

You can now select text in Firefox and ask it to make a summary or to explain the selection in simpler words. It then generates a query to ChatGPT in the sidebar, which answers it.

So for some use cases I think it's nice. Even better if you could make it do research and save us time, like "check the top tech sites for reviews of this phone model and give me a summary of its major flaws".

ChatGPT can do that, but it's not really integrated into the Firefox experience. If you could select a phone name and have a one-click option for "give me the top flaws and pros of this model according to top reviewers", that could save a lot of time.

I think it's just about packaging this functionality better. I don't think it should be in a sidebar. It should just be in a new tab with lots of options to continue the research in different ways.
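
As a rough illustration of what that "packaging" could amount to, here is a sketch in TypeScript: a one-click action is essentially a canned prompt template filled in with the user's selection. The function name and wording are made up for this example and are not any existing Firefox API.

```typescript
// Purely illustrative sketch: a "one-click" research action reduced to a
// canned prompt template. Nothing here is an existing Firefox API.
function buildReviewPrompt(selection: string): string {
  return [
    `Check the top tech sites for reviews of "${selection}".`,
    "Summarise the major flaws and the main pros according to top reviewers.",
    "List the sources you used.",
  ].join(" ");
}

// The resulting string would then be sent to the chatbot sidebar (or, as
// suggested above, opened in a new tab with follow-up options).
```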

[–] possiblylinux127@lemmy.zip 3 points 2 months ago* (last edited 2 months ago) (1 children)

That's not private in the least. If anything, add optional support for Ollama.

[–] 1984@lemmy.today 1 points 2 months ago (1 children)

They already offer a choice of different models in the Nightly version of Firefox, so I think we are getting there. Maybe even an option to run our own self-hosted models.
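
For what it's worth, a self-hosted setup could look something like the sketch below, assuming an Ollama instance running on its default port (11434) with a model such as llama3 already pulled; the function name and prompt are illustrative, not part of any Firefox feature.

```typescript
// Minimal sketch: summarising selected page text against a local Ollama
// instance instead of a hosted chatbot, so the text never leaves the machine.
// Assumes Ollama is running on its default port with "llama3" pulled.
async function summarizeLocally(selectedText: string): Promise<string> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",   // any locally pulled model
      stream: false,     // ask for one complete reply instead of a stream
      messages: [
        {
          role: "user",
          content: `Summarise the following text in a few sentences:\n\n${selectedText}`,
        },
      ],
    }),
  });
  const data = await response.json();
  return data.message.content; // Ollama's non-streaming chat reply
}
```

Keeping the request on localhost is what answers the privacy objection above: nothing is sent to a third party.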

[–] possiblylinux127@lemmy.zip 0 points 2 months ago

Welp, Firefox was the last option. I guess I'll just stick to LibreWolf.

[–] Vincent@feddit.nl 2 points 2 months ago

The use cases you describe actually sound right on the mark to me? If you're viewing a page and want something on it summarised, it would be nice not to have to leave that page but to stay in its context, for example. If you're looking at the specs of a particular phone, ditto.

(I don't expect I'll use this feature myself, but if I did, it sounds like I'd use it in that way. Luckily, I can just choose not to use it without any downsides.)

[–] flux@lemmy.ml 1 points 2 months ago

Then there are the cases where you want the LLM to actually interact with the page, using the current web page state and your credentials.

For example, one might want to tell it to uncheck all the "opt in" checkboxes on a page, and express that task in plain English.

Many useful interactive agent tasks could be achieved with this. The chatbot would be merely the first step.
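
To make the idea concrete, here is a rough browser-side TypeScript sketch of the page-interaction half of such an agent: once an LLM has turned "uncheck all the opt-in checkboxes" into a concrete action, something like this would still have to run against the DOM. The label-matching heuristic is invented for this example and is not how any shipped feature works.

```typescript
// Illustrative sketch of the DOM side of an "uncheck the opt-ins" agent task.
// The heuristic for deciding which boxes count as opt-ins is made up here;
// in the scenario above, that judgement would come from the LLM.
function uncheckOptIns(root: Document = document): number {
  let changed = 0;
  const boxes = root.querySelectorAll<HTMLInputElement>('input[type="checkbox"]');
  for (const box of boxes) {
    // Find the text associated with the checkbox, via a wrapping or linked <label>.
    const labelText =
      box.closest("label")?.textContent ??
      root.querySelector(`label[for="${box.id}"]`)?.textContent ??
      "";
    if (box.checked && /opt.?in|newsletter|marketing/i.test(labelText)) {
      box.checked = false;
      // Fire a change event so the page's own scripts notice the update.
      box.dispatchEvent(new Event("change", { bubbles: true }));
      changed++;
    }
  }
  return changed; // number of boxes that were unchecked
}
```

The DOM manipulation itself is the easy part; the interesting (and risky) part is trusting the model's interpretation of which boxes are "opt in" on an arbitrary page.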