Karna to Firefox@lemmy.ml • 1 year ago

Meet Orbit, Mozilla's AI Assistant Extension for Firefox

www.omgubuntu.co.uk

54 comments

cross-posted to: news
Orbit by Mozilla is a new AI-powered assistant for the Firefox web browser that makes summarising web content while you browse as easy as clicking a…
  • Jeena • 27 points • 1 year ago

    Thanks for the summary. So it still sends the data to a server, even if it’s Mozilla’s. Then I still can’t use it for work, because the data is private and they wouldn’t appreciate me sending their data to Mozilla.

    • KarnaOP • 21 points • 1 year ago

      In such a scenario, you need to host your choice of LLM locally.

      • @ReversalHatchery@beehaw.org • 5 points • 1 year ago

        Does the add-on support usage like that?

        • KarnaOP • 7 points • 1 year ago

          No, but the “AI” option available on the Mozilla Labs tab in settings allows you to integrate with a self-hosted LLM.

          I have had this setup running for a while now.

          • @cmgvd3lw@discuss.tchncs.de • 4 points • 1 year ago

            Which model are you running? How much RAM?

            • KarnaOP • 4 points • 1 year ago (edited)

              My (docker based) configuration:

              Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1

              Hardware: i5-13600K, Nvidia 3070 ti (8GB), 32 GB RAM

              Docker: https://docs.docker.com/engine/install/

              Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

              Open WebUI: https://docs.openwebui.com/

              Ollama: https://hub.docker.com/r/ollama/ollama
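              For anyone wanting to reproduce a similar stack, a minimal sketch using the official images from the links above. The container names, ports, volume names, and model tag here are assumptions for illustration, not Karna’s exact configuration:

```shell
# Ollama with GPU access via the NVIDIA container runtime
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama ollama/ollama

# Pull Llama 3.1 (the default 8B variant fits an 8 GB card when quantised)
docker exec -it ollama ollama pull llama3.1

# Open WebUI, pointed at the Ollama container via the host gateway
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

              After this, Open WebUI is reachable at http://localhost:3000 and talks to Ollama on port 11434; Firefox’s Labs “AI” option can then be pointed at the locally hosted endpoint.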


    • Hamartiogonic • -2 points • 1 year ago (edited)

      According to Microsoft, you can safely send your work-related stuff to Copilot. Besides, most companies already use a lot of Microsoft’s software and cloud services, so LLM queries don’t really add very much. If you happen to be working for one of those companies, MS probably already knows what you do for a living, hosts your meeting notes, knows your calendar, etc.

      If you’re working for Purism, Red Hat, or some other company like that, you might want to host your own LLM instead.
