• @TommySoda@lemmy.world
    18 points · 1 month ago

    Funny how the average person figured this out almost immediately while Google needed half a year to figure it out with their researchers. Almost like they were ignoring it as long as they could for the sake of profit. Fuck around and find out, I guess.

  • Optional
    6 points · 1 month ago

    If only anyone - anyone at all - could have foreseen this horrible outcome

  • @requiem@lemmy.world
    3 points · 1 month ago

    Google Researchers Now Also Say We All Should Use Their Shit AI Search That Tells Us To Eat Glue

  • @Please_Do_Not@lemm.ee
    3 points · 1 month ago

    It’s alright guys–I just looked up a solution and Google suggests eating glue and a few small pebbles will solve the issue.

  • AwkwardLookMonkeyPuppet
    3 points · 1 month ago

    They’re admitting that they are the source of a massive problem. But are they going to do anything about it, or keep pushing their shitty, half-baked AI? It’s crazy to me how much worse their AI is than ChatGPT, considering all of the financial and engineering resources available to Google.

  • cobysev
    2 points · 1 month ago

    Ahh, just in time for the election season.

  • mozz
    1 point · 1 month ago

    I think almost certainly that disinformation based on fake accounts simply posting memes or targeted viewpoints, hoping to send the message through sheer repetition, is still a lot more common than doctored factual information. (Not that that means that faked-up disinformation isn’t a problem - just saying I think it’s still relatively rare as a vehicle for disinformation.)

    Why would you even open yourself up to “see, the underlying citation for this thing they’re saying is not true” when you could just skip backing up what you’re saying with facts entirely, and state your assertions as if they were facts instead?