They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.
Now add support for GPT4All and everyone is happy again.
That was there before 133, don’t remember the exact release that added it.
https://support.mozilla.org/en-US/kb/ai-chatbot
Note that you need an account to use one of these supported systems. HuggingChat allows a few requests as a guest before cutting off access; it's basically a trial, so you eventually have to create an account.
They better not decide to enable it by default.
it’s not enabled by default … it’s opt out by default
I think that means that it’s opt-in.
if third-party accounts are needed, it’ll have to stay that way.
If they do it in a privacy-preserving way, this could help them win back market share, which will generally benefit an open internet.
I really wish there was another way.
But it’s gonna be very difficult when you’ve got Google and OpenAI up there.
It's an open-source project; you can keep it in a box, and people are able to check it.
Why would anybody want to have AI in their browser? It’s a fucking browser.
Because browsers are the most useful tool on most computers. Ordinary people go on Google or ask ChatGPT mundane questions. If their browser can do that, they need one less app, and it's more convenient, which is what non-tech-savvy people especially care about.
Didn’t want it in Opera, don’t want it in Firefox. I mean they can keep trying and I’ll just keep on ignoring this shit :/
hopefully, it’ll be possible to opt out somehow.
as the screenshot shows, it is opt-in
oh good. hurray.
I don’t understand the hate. It’s just a sidebar for the supported LLMs. Maybe I’m misunderstanding?
Yes, I would prefer Mozilla focus on the browser, but to me, this seems like it was done in an afternoon.
It seems like common cynicism. Mozilla adds this feature so as not to cede major features to other browsers. Mozilla's version lets you natively pick from lots of different AI solutions.
Not every feature is for everyone. Not every feature is done being improved on at release.
And in spite of popular opinions, organizations don’t do just one thing and then do just the next thing and the thing after that. Organizations can and do focus on and prioritize many things at the same time.
And for people who are naysaying AI at every mention, it has a lot of great and fascinating uses, and if you think otherwise, you really should try them more. I’ve used it plenty for work and life. It’s not going away, might as well do some nice things with it.
I want my browser to be a browser. I don’t want Pocket, I don’t want AI, I don’t want bullshit. There are plugins for that.
that’s the great thing: you don’t have to use it
But Firefox wastes time developing that instead of fixing 20-year-old bugs.
i know it is an unpopular opinion around here. but currently AI features open doors for sales. that is important.
for the software i help develop, we introduced an optional AI integration. just its presence has let us sell the main SW multiple times, while the AI plugin itself has never been sold so far.
investment in AI: two weeks of glue code. i am not concerned with the finances, but that plugin is for sure a net positive.
Do users like the AI integration, or is this just something the management class wanted to see? Right now, those clothes look crazy good on that emperor…
Could this replace Perplexity for (assisted free) online search?
I mean, if you’re going to do it, where’s the Ollama love?
I was disappointed there was no local option…
I don’t get it, ollama is a provider no?
I think the point is it’s open source
and so is firefox, so why use another model provider
Fuck you Mozilla
Are any of these open source or trustworthy?
There are no open-source AI models, even if they tell you that they are. Hugging Face is the closest thing to something like open source, where you can download AI models to run locally without an internet connection; there are applications for that. In Firefox, HuggingChat uses models from Hugging Face, but I think it runs them on a server and does not download them to your machine?
The reason they are not open source is that we don't know exactly what data they were trained on, and we cannot rebuild them on our own. As for trustworthy, I assume you are talking about the integration and the software using the models, right? At least it is implemented by Mozilla, so there is (to me) some sort of trust involved. Yes, even after all the bullshit, I trust Mozilla.
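To make the "download and run locally" part concrete, here is a rough Python sketch (assuming the transformers package is installed; the model name is just a small example, not one of the sidebar's providers):

    from transformers import pipeline

    # The first run downloads the weights from Hugging Face; after that the
    # model runs entirely on this machine, no internet connection needed.
    generator = pipeline("text-generation", model="distilgpt2")

    result = generator("Firefox added an AI sidebar because", max_new_tokens=30)
    print(result[0]["generated_text"])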
It's "open weights" if they publish the model file but nothing about its creation. There are some hypothetical security concerns about training it to give very specific outputs for certain very specific inputs, but I feel like that's one of those far-fetched worries, especially if you want to use it for chat or summarization and the alternative is getting AI output from a server API. Local is still way better.
I think Mistral is model-available (i.e. I'm not sure if they release training data/code, but they do release the model shape and weights); HuggingChat definitely is open source and model-available.
Sorry, but HuggingChat / Hugging Face and all the models on it are not open source. (Edit: Oh, you meant the HuggingChat UI is open source. Yeah, sorry, I was focused on the models, and there is no open-source model to my understanding.) -> https://opensource.org/ai/open-source-ai-definition Of course opensource.org is not the only authority on what "open source" means, but it's not a bad start.
probably not
as someone who’s never dabbled with ai bots, what does this feature do? is it only to query for information like a web search?
It is a sidebar that sends a query from your browser directly to a server run by a giant corporation like Google or OpenAI, consumes an excessive amount of carbon/water, then sends a response back to you that may or may not be true (because AI is incapable of doing anything but generating what it thinks you want to see).
Not only is it unethical in my opinion, it’s also ridiculously rudimentary…
It gives you many options on what to use; you can use Llama, which is offline. It needs to be enabled through about:config > browser.ml.chat.hideLocalhost.
and thus is unavailable to anyone who isn’t a power user, as they will never see a comment like this and about:config would fill them with dread
Lol, that is certainly true, and you would also need to set it up manually, which even power users might not be able to do. Thankfully there is an easy-to-follow guide here: https://ai-guide.future.mozilla.org/content/running-llms-locally/.
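Once something like llamafile or a llama.cpp server is running per that guide, the "local" option boils down to plain HTTP against your own machine. A minimal sketch (assuming the server listens on localhost:8080 and exposes an OpenAI-compatible endpoint; nothing here leaves your computer):

    import json
    import urllib.request

    payload = {
        # Most local servers ignore the model name and use whatever is loaded.
        "model": "local",
        "messages": [{"role": "user", "content": "Summarize this article: ..."}],
    }
    req = urllib.request.Request(
        "http://localhost:8080/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    print(reply["choices"][0]["message"]["content"])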
There’s a huge difference between something that is presented in an easily accessible settings menu, and something that requires you to go to an esoteric page, click through a scary warning message, and then search for esoteric settings… Before even installing a server.
Nothing was compelling Mozilla to rush this through. In addition, nobody was asking Mozilla for remote access to AI, AFAIK. Before Mozilla pushed for it, people were praising them for resisting the temptation to follow the flock. They could have waited and provided better defaults.
Or just wedged it into an extension, something they’re currently doing anyway.
It just adds ChatGPT or similar to your sidebar. Chatbots can do a lot of things; they are mostly good for information research and technical help, although they have serious flaws, like sometimes hallucinating false information.
good for information research and technical help
i’d say they are good precursors for information research… never trust them, but use them to find terms to search for reliable sources
good question
From the description in the UI, it does sound like it. Theoretically, a chatbot could be created where you can ask questions about the webpage you currently have open, for example if you don't want to read a long article. I guess you could probably just throw a link into an existing chatbot anyway, but yeah, direct integration might be more convenient.
Or a chatbot could be created that has access to your browser history, bookmarks, and tabs, so you can ask it when you last saw certain information. However, you'd need a locally running chatbot for that, which makes it more difficult to implement.
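As a rough sketch of that idea (assuming a local model behind an ask_local_model() helper, which is hypothetical, plus Firefox's places.sqlite history database; copy the file first, since Firefox keeps it locked while running):

    import sqlite3
    from pathlib import Path

    # Illustrative path: a copy of places.sqlite taken from your profile folder.
    PLACES_DB = Path.home() / "places-copy.sqlite"

    def recent_history(keyword: str, limit: int = 20) -> list[tuple[str, str]]:
        """Return (title, url) rows from Firefox history matching a keyword."""
        with sqlite3.connect(PLACES_DB) as db:
            return db.execute(
                "SELECT title, url FROM moz_places "
                "WHERE title LIKE ? ORDER BY last_visit_date DESC LIMIT ?",
                (f"%{keyword}%", limit),
            ).fetchall()

    def ask_local_model(question: str, context: list[tuple[str, str]]) -> str:
        # Hypothetical: send the question plus history snippets to whatever
        # local chatbot you run, e.g. the localhost endpoint shown earlier.
        raise NotImplementedError

    print(ask_local_model("When did I last read about sourdough?",
                          recent_history("sourdough")))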
The fact that I can’t choose one of the many AIs I have locally downloaded on my computer is bogus
And I still can't convince it to stop caching images because it does not follow the RFC.