• @simonced@lemmy.one

    Letting your text editor write your code, not using version control… I don’t feel sad at all. Hope the lesson was learned.

  • Arsecroft

    this guy would have force-pushed onto main about 10 mins after this if he did have git

    • Lucy :3

      Tbf you have to do that for the first push, if a Readme file was autogenerated

        • Lucy :3

          Huh? I’m talking about existing code being in a dir, then initting a git repo there, creating a counterpart repo on your hoster of choice and then pushing it there. Wouldn’t cloning the repo from step 3 into the dir from step 1 overwrite the contents there?

          • @stembolts@programming.dev

            There are multiple solutions to this without using --force.

            Move the files out of the way, clone, move them back, commit, push is the most straightforward one I can summon at this time… but I’ve solved this dozens of times and have never used --force.
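
            Roughly, something like this (repo URL and names are just placeholders, and this assumes the local dir has no git history yet):

                cd ..
                mv project project-backup                          # move the files out of the way
                git clone git@example.com:me/project.git project   # clone the (empty or nearly empty) remote
                cp -r project-backup/. project/                    # move the files back in
                cd project
                git add .
                git commit -m "Import existing code"
                git push                                           # plain push, no --force needed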

            • @Hoimo@ani.social

              If your remote is completely empty and has no commits, you can just push normally. If it has an auto-generated “initial commit” (pretty sure GitHub does something like that), you could force push, or merge the remote branch into your local branch and push normally. I think cloning the repo and copying the contents of your local repo into it is the worst option: you’ll lose all local commits.
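
              Sketching it out (remote URL is a placeholder, branch assumed to be main):

                  git remote add origin git@example.com:me/project.git
                  # completely empty remote: this just works
                  git push -u origin main
                  # remote has an auto-generated initial commit: merge it in first
                  git pull origin main --allow-unrelated-histories
                  git push -u origin main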

              • @Jayjader@jlai.lu

                If it’s a single, generated, “initial” commit that I actually want to keep (say, for example, I used the forge to generate a license file), then I would often rebase on top of it. Quick, and it doesn’t get rid of anything.
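
                Something like this, assuming the forge’s commit sits on origin/main:

                    git fetch origin
                    git rebase origin/main   # replay local commits on top of the generated initial commit
                    git push -u origin main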

              • Ethan

                You can also just tell GitHub to not do that.

              • @stembolts@programming.dev

                True, in the situation where you already have local history, maybe it’s worthwhile to --force to nuke a nearly empty remote. In that case it is practical to do so. I just typically like to find non-force options.

          • @dev_null@lemmy.ml

            Yeah, I was thinking of a new repo with no existing code.

            In your case you’d want to uncheck the creation of a readme so the hosted repo is empty and can be pushed to without having to overwrite (force) anything.

      • @computergeek125@lemmy.world

        Does that still happen if you use the merge unrelated histories option? (Been a minute since I last had to use that option in git)

        • Lucy :3

          Never heard of that, but in the case where you also have a README locally, that will be even more complicated, I imagine. So just adding -f is the easier option.
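
          That is, assuming origin/main, the force variant is just:

              git push -u origin main --force   # overwrites whatever the forge generated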

  • Lucy :3

    “Developer”
    “my” 4 months of “work”

    Those are the ones easily replaced by AI. 99% of stuff “they” did was done by AI anyway!

  • @darklamer@lemmy.dbzer0.com

    The first version control system I ever used was CVS. It was first released in 1986, so it was already old and well established when I first came to use it.

    Anyone in these past forty years not using a version control system to keep track of their source code has only themselves to blame.

    • @barsoap@lemm.ee

      CVS was, for the longest time, the only player in the FLOSS world. It was bad, but so were commercial offerings, and it was better than RCS.

      It’s been completely supplanted by SVN, which was specifically written to be CVS but not broken, and which is about exactly as old as git. If you find yourself using git lfs, you might want to have a look at SVN.

      Somewhat ironically, RCS is still maintained, the last patch landing a mere 19 months ago, to this… CVS repo. Dammit, I did say “completely supplanted” already, didn’t I. Didn’t consider the sheer pig-headedness of the OpenBSD devs.

      • @lud@lemm.ee

        Pretty sure GTA V use(d) SVN or something like that. I remember reading the source code and being surprised that they didn’t use Git.

          • @barsoap@lemm.ee

            You definitely need something other than git for large assets, yes; its storage layer is just not built for that. And the way art pipelines generally work, you don’t get merge conflicts anyway, because there’s no sane way to merge those files, so artists take care not to have multiple people working on the same thing at the same time. A lock+server model is natural for that. You also want a way to nuke old revisions to keep the size of everything under control.
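
            A minimal sketch of that lock workflow in SVN (paths and messages are made up):

                svn lock art/hero.psd -m "reworking the hero sprite"   # server-side lock; others can't commit changes to it
                # ...edit the file...
                svn commit -m "Rework hero sprite"                     # committing releases the lock by default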

      • I Cast Fist

        “We’ve always done things this way, we ain’t changing!” - some folks in the FOSS community, like those RCS maintainers

          • Terrasque

            SVN: 20 October 2000

            Git: 7 April 2005

            I remember using SVN when Git development was started

  • @Artyom@lemm.ee

    I just want to pause a moment to wish a “fuck you” to the guy who named an AI tool “Cursor”, as if that’s a useful name. It’s like they’re expecting accidental Google searches to be a major source of recruitment.

  • @yarr@feddit.nl

    It’s scary how many projects these days are managed by a bunch of ZIP files:

    • Program-2.4.zip
    • Program-2.4-FIXED.zip
    • Program-2.4-FIXED2.zip
    • Program-2.4-FIXED-final.zip
    • Program-2.4-FIXED-final-REAL.zip
    • Program-2.4-FIXED-FINAL-no-seriously.zip
    • Program-2.4-FINAL-use-this.zip
    • Program-2.4-FINAL-use-this-2.zip
    • Program-2.4-working-maybe.zip
    • Program-2.4-FINAL-BUGFIX-LAST-ONE.zip
    • Program-2.4-FINAL-BUGFIX-LAST-ONE-v2.zip

    • @Boakes@lemmy.world

      • Program-1.5-DeleteThis.zip
      • Program-1.6-ScuffedDontUse.zip
      • CanWeDeleteThesePlease.txt (last edit 8 months ago)

      Inspired by a small collaboration project from a few years ago.

    • @iegod@lemm.ee

      If we’re talking actual builds, then zip files are perfectly fine as long as the revs make chronological sense.

      • @yarr@feddit.nl

        I’m not. I’m talking about companies where dev A wants dev B to do some work, but they don’t use git or any kind of source control, so you email over a cursed ZIP file, then dev B does the work and sends it back under a different name. It’s a highly cursed situation.

  • Eager Eagle

    if this is real, that’s the kind of person who should be worried about being replaced by an AI

    it’s also Claude

    lmao

    • Scrubbles

      Was playing around with it. It’s neat tech, and it’s interesting to see all the side projects I can spin up now. But it absolutely cannot replace an engineer with a brain.

      I’ve caught so many little things I’ve had to fix, change. It’s an amazing way to kick off a project, but I can’t ever trust blindly what it’s doing. It can get the first 80% of a small project off the ground, and then you’re going to spend 7x as long on that last 20% prompt engineering it to get it right. At which point I’m usually like “I could have just done it by now”.

      I see kids now blindly trusting what it’s doing, and man, are they going to fall face-first in the corporate world. I honestly see a place for vibe coding in the corporate world, but you still need a brain to stitch it all together.

      • Lucy :3

        Yeah, a coworker (also a trainee) spent 2 days trying to debug some C# MVC thing. It took me around 5 minutes, having last seen C# code 7 years ago, to realize that the quotes were part of the literal string and needed to be checked too.

        Well, he did literally everything with the internal ChatGPT instance (or so a coworker said; I don’t know which model actually runs there). I asked if he wrote the JS code; he said no. Even though there was JS in the .cshtml file, he technically didn’t lie, as he didn’t write it.

  • @zovits@lemmy.world

    It’s actually reassuring to see that, despite all the warnings and doomsayers, there will still be opportunities for programmers capable of solving problems using natural intelligence.

    • @finitebanjo@lemmy.world

      If anything, it feels like we’re the doomsayers, trying to warn people that their AI bullshit won’t ever work, and they’re just not listening as they lay off the masses and push insecure, faulty code.

      • @Zron@lemmy.world

        You know, none of the “AI is dangerous” movies thought of the fact that AI would be violently shoved into all products by humans. Usually it’s like a secret military or corporate thing that gets access to the internet and goes rogue.

        In reality, it’s fancy text prediction that has been exclusively shoved into as much of the internet as possible.