The Biden administration calls for developers to embrace memory-safe programming languages and move away from those that cause buffer overflows and other memory-access vulnerabilities.

    • Pennomi@lemmy.world · +91/-3 · 10 months ago

      I think that’s the point. You can’t trust the average developer to do things safely. And remember, half of all programmers are even worse than average.

        • Pennomi@lemmy.world · +2/-2 · 10 months ago

          The word “average” can mean many things, for example, mean, median, mode, or even things like “within 1 standard deviation from the mean”.

          I was using it strictly as the mean which divides the population exactly in half.

        • Feathercrown@lemmy.world · +7 · edited · 10 months ago

          Bell curves don’t work to make this point. A bell curve is symmetrical, so half of developers will always be below average on a bell curve. But yes, it is true that for other types of distributions, more or less than half of the developers could be below average. What the person above you was looking for, in the general case, would be the median.

          • thisfro@slrpnk.net · +2 · 10 months ago

            Average is the mean (i.e. the sum of all “skill” divided by the number of programmers).

            What they were thinking of is the median (50th percentile = 0.5 quantile), which splits the group into two equally sized halves.

            For a bell curve, the two are the same value. But think of the example of average incomes: 9 people have an income of $10 and one has an income of $910. The mean income is $100 ((10×9 + 910)/10), but the median is $10.
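            The income example above can be checked with a quick sketch (hypothetical helper functions, just to illustrate the mean/median split):

```rust
// Mean vs. median for the skewed income sample from the comment above.
fn mean(xs: &[f64]) -> f64 {
    xs.iter().sum::<f64>() / xs.len() as f64
}

fn median(xs: &[f64]) -> f64 {
    let mut sorted = xs.to_vec();
    sorted.sort_by(|a, b| a.partial_cmp(b).unwrap());
    let n = sorted.len();
    if n % 2 == 1 {
        sorted[n / 2]
    } else {
        // Even count: average of the two middle values.
        (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0
    }
}

fn main() {
    // Nine incomes of $10 and one of $910.
    let incomes: Vec<f64> = [vec![10.0; 9], vec![910.0]].concat();
    assert_eq!(mean(&incomes), 100.0);  // the $910 outlier drags the mean up
    assert_eq!(median(&incomes), 10.0); // the median stays at the typical value
    println!("mean = {}, median = {}", mean(&incomes), median(&incomes));
}
```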

            • Bademantel@feddit.de · +4 · 10 months ago

              The distribution of skill in humans, for various tasks and abilities, can often be approximated by a normal distribution. In that case, as you know, the mean is equal to the median.

              • burlemarx@lemmygrad.ml · +1 · 10 months ago

                Actually, to test that assumption you’d need to quantitatively measure skill, which is itself problematic, and you’d also need to run a statistical test to confirm that the distribution really is normal (Gaussian). People often forget the latter and produce incorrect statistical inferences.
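                A crude sanity check along those lines (a sketch only, not a substitute for a proper normality test such as Shapiro–Wilk): sample skewness is near 0 for normal data and large for skewed data like the income example earlier in the thread.

```rust
// Sample skewness: g1 = m3 / m2^(3/2), where m2 and m3 are the
// second and third central moments. ~0 for symmetric (e.g. normal) data.
fn skewness(xs: &[f64]) -> f64 {
    let n = xs.len() as f64;
    let mean = xs.iter().sum::<f64>() / n;
    let m2 = xs.iter().map(|x| (x - mean).powi(2)).sum::<f64>() / n;
    let m3 = xs.iter().map(|x| (x - mean).powi(3)).sum::<f64>() / n;
    m3 / m2.powf(1.5)
}

fn main() {
    // The income example: nine $10 earners and one $910 earner.
    let incomes: Vec<f64> = [vec![10.0; 9], vec![910.0]].concat();
    // Heavily right-skewed, so treating it as normal would be a mistake.
    assert!(skewness(&incomes) > 2.0);
    // A symmetric sample has skewness 0.
    assert!(skewness(&[1.0, 2.0, 3.0]).abs() < 1e-9);
}
```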

        • Pennomi@lemmy.world · +3/-1 · 10 months ago

          The mean is in the center of the bell curve, so I’m not sure what your point is.

    • u_tamtam@programming.dev · +6/-1 · 10 months ago

      Or rather a Dunning–Kruger issue: seniors who have spent significant time architecting and debugging complex applications tend to be big proponents of things like Rust.

  • riodoro1@lemmy.world · +42/-1 · edited · 10 months ago

    Guys, C++ is gonna be dead in a couple of years now. Remember this comment…

    …and read it again in ten years.

      • zik@lemmy.world · +14 · edited · 10 months ago

        Java’s runtime has had a large number of CVEs in the last few years, so that’s probably a decent reason to be concerned.

        • u_tamtam@programming.dev · +5/-2 · 10 months ago

          Yep but:

          • it’s one runtime, so patching a CVE patches it for all programs (vs patching each and every program individually)

          • GraalVM is taking care of enabling Java to run on Java

      • ScreaminOctopus@sh.itjust.works · +3 · 10 months ago

        Nothing really, the JVM has a pretty troubled history that would really make me hesitate to call it “safe”. It was originally built before anyone gave much thought to security, and that fact plagues it to the present day.

        • u_tamtam@programming.dev · +2 · 10 months ago

          And how much of this troubled history is linked to Java applets/native browser extensions, and how much of it is relevant today?

        • FooBarrington@lemmy.world · +3 · 10 months ago

          There’s a difference between writing code on a well-tested and broadly used platform implemented in C++ vs. writing new C++.

  • mlg@lemmy.world · +25/-4 · 10 months ago

    You mean like Android running Java, which is why everyone and their mom bought Israel’s Pegasus spyware toolkit?

    • AggressivelyPassive@feddit.de · +19/-1 · 10 months ago

      When was the last time you heard of a memory safety issue in Java code? Not the runtime or some native library, raw-dogged Java.

      Memory safety isn’t a silver bullet, but it practically erases an entire category of bugs.
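      The “entire category” here is the out-of-bounds access behind classic buffer overflows. A minimal Rust sketch (hypothetical helper name, just to illustrate) of how a memory-safe language turns silent corruption into an explicit, recoverable error:

```rust
// In C, writing past the end of a buffer silently corrupts adjacent memory.
// In Rust, slice access is bounds-checked: `get_mut` returns None out of
// bounds, and plain indexing would panic rather than overflow the buffer.
fn store(buf: &mut [u8], i: usize, b: u8) -> bool {
    match buf.get_mut(i) {
        Some(slot) => {
            *slot = b; // in bounds: write succeeds
            true
        }
        None => false, // out of bounds: rejected, no corruption possible
    }
}

fn main() {
    let mut buf = [0u8; 4];
    assert!(store(&mut buf, 3, 0xFF));  // last valid index
    assert!(!store(&mut buf, 4, 0xFF)); // one past the end: safely refused
    assert_eq!(buf, [0, 0, 0, 0xFF]);
}
```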

      • mlg@lemmy.world · +12/-1 · 10 months ago

        Fair point, even log4j was running Java code, not literally hijacking the stack or heap.

        That being said, I’m poking fun because C and C++ have low-level capabilities for which only Rust offers a complete alternative. Almost everything else is safe because it ships with a garbage collector, which affects performance and viability. I think Go technically counts if you set the GC allocation to 0 and use pointers for everything, but you might as well use Rust or C at that point.

        I guess I’m just complaining that, of all the issues the ONCD could point out, they went after the very broad “memory-safe is always better” angle, when most of the people using C and C++ need the performance. The report offered Rust as the only potential alternative, which everyone already knows about. It would be nice to see them make a real statement, like telling megacorps to stop putting unencrypted SCADA on the internet.

    • bamboo@lemm.ee · +15 · 10 months ago

      The apps are (sometimes) Java, but the OS is a mix of languages, mostly C and C++. The Java runtime itself is written in C++.

    • a1studmuffin@aussie.zone · +6/-2 · 10 months ago

      I love that Android chose Java so they could run it on different processor architectures, but in the end one architecture won out so Java wasn’t necessary any more. I guess they didn’t know at the time, but they’d claw back a tonne of efficiency if they dropped the Java VM.

      • NotMyOldRedditName@lemmy.world · +4 · 10 months ago

        Java also made it very accessible to the vast majority of existing Java developers.

        There were way more Java developers than Objective-C developers at the time.

        I wasn’t a fan of learning Objective-C when I started out, just as Swift was coming out but was still too new to use.

    • Leeker@lemmy.world · +2 · 10 months ago

      What are you talking about? Did you read the report? On page 7 they directly say that C and C++ “lack traits associated with memory safety”.

    • ScreaminOctopus@sh.itjust.works · +2/-1 · 10 months ago

      That’s because government products use many unsafe languages shittier than C(++), like Ada, Fortran, and COBOL. It wouldn’t surprise me if most of the code running on products for government use wasn’t written in C or C++.

  • Omega_Haxors@lemmy.ml · +10 · edited · 10 months ago

    When all the talented programmers are gay communists and your entire state exists to murder gay communists. Still can’t forget how Alan Turing, a gay man whose inventions were a gigantic help in winning WW2, killed himself because they still treated him like garbage even after the fact.

  • a4ng3l@lemmy.world · +13/-4 · 10 months ago

    As if it were the only source of vulnerabilities… besides, a lot of the trendy Python libs are implemented in C; do we also ditch those?

  • Treczoks@kbin.social · +10/-2 · 10 months ago

    Nice. Now I’m waiting for all the Rust (or whatever other “safe” language) environments for embedded systems to fall from the sky. And please, some that actually work on small processors with little memory.

  • burlemarx@lemmygrad.ml · +3 · 10 months ago

    People are talking about Java, but the majority of programming languages are memory safe nowadays. Go satisfies this requirement, for example.