• 394 Posts
  • 572 Comments
Joined 1 year ago
Cake day: July 29th, 2023

  • Custom methods won’t have the benefit of being treated as if they shared specific semantics, such as being considered safe or idempotent, but ultimately that’s just an expected trait that anyone can work with.

    In the end, specifying a new standard HTTP method like QUERY provides some very specific assurances regarding semantics, such as whether frameworks should enforce CSRF tokens based on whether QUERY has the semantics of a safe method or not.
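
    For illustration, here’s a minimal sketch (the endpoint and body are made up) of sending a QUERY request with libcurl. On the wire a method is just a token, which is exactly why clients and intermediaries need standardized semantics to know what they may assume about it:

    ```cpp
    // Sketch: issuing a QUERY request with libcurl. The URL and body are
    // hypothetical; CURLOPT_CUSTOMREQUEST only changes the method token,
    // it grants no semantics (safety, idempotency) by itself.
    #include <curl/curl.h>

    int main() {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURL* curl = curl_easy_init();
        if (curl) {
            curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/contacts");
            // Attach a request body (libcurl would default to POST)...
            curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "select=name&where=city%3DParis");
            // ...then override the method token to QUERY.
            curl_easy_setopt(curl, CURLOPT_CUSTOMREQUEST, "QUERY");
            curl_easy_perform(curl);
            curl_easy_cleanup(curl);
        }
        curl_global_cleanup();
        return 0;
    }
    ```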



  • The problem with C++ is it still allows a lot of unsafe ways of working with memory that previous projects used and people still use now.

    Why do you think this is a problem? We have a tool that gives everyone the freedom to manage resources the way that suits their own needs. It even went as far as explicitly supporting garbage collectors right up until C++23, when that support was removed. Some frameworks adopted and enforced their own memory management systems, such as Qt.

    Tell me, exactly why do you think this is a problem?
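
    A minimal sketch of what that freedom looks like in practice (the `Resource` type is made up); each codebase picks the ownership model that fits it:

    ```cpp
    // Several ownership models coexisting in one program.
    #include <cstdio>
    #include <memory>

    struct Resource { /* ... */ };

    int main() {
        // Scoped, automatic ownership (RAII).
        auto owned = std::make_unique<Resource>();

        // Shared, reference-counted ownership.
        auto shared = std::make_shared<Resource>();

        // Fully manual management, for codebases that want total control.
        Resource* raw = new Resource();
        delete raw;

        // Custom deleters, e.g. for handles from a C API (fopen is just
        // a stand-in here).
        auto closer = [](std::FILE* f) { if (f) std::fclose(f); };
        std::unique_ptr<std::FILE, decltype(closer)> file(
            std::fopen("/tmp/example.txt", "w"), closer);
        return 0;
    }
    ```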


  • From the article.

    Josh Aas, co-founder and executive director of the Internet Security Research Group (ISRG), which oversees a memory safety initiative called Prossimo, last year told The Register that while it’s theoretically possible to write memory-safe C++, that’s not happening in real-world scenarios because C++ was not designed from the ground up for memory safety.

    That baseless claim doesn’t pass the smell test. Just because a feature wasn’t rolled out in the mid-90s doesn’t mean it isn’t available today. Utter nonsense.

    If your paycheck is highly dependent on pushing a specific tool, of course you have a vested interest in diving head-first into a denial pool.
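
    To make the “not available in the mid-90s, available today” point concrete, here’s a small sketch of idioms that simply didn’t exist in 1990s C++:

    ```cpp
    // None of these facilities existed in mid-90s C++.
    #include <memory>
    #include <stdexcept>
    #include <vector>

    int main() {
        // No naked new/delete: ownership is automatic and leak-free.
        auto widget = std::make_unique<std::vector<int>>();
        widget->assign({1, 2, 3});

        try {
            int x = widget->at(10);  // bounds-checked access throws...
            (void)x;
        } catch (const std::out_of_range&) {
            // ...instead of silently reading out of bounds.
        }
        return 0;
    }  // widget released automatically: no leak, no double free
    ```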







  • So that’s where I would say, as long as performance doesn’t matter it’s better to default to B-Tree maps than to hash maps, because the chance of avoiding bugs is more valuable than immeasurable performance benefits (…)

    I don’t quite follow. What leads you to believe that a B-tree map implementation would have a lower chance of having a bug than any standard, readily available hash map implementation?

    Also, you fail to provide any concrete reasoning for B-tree maps. It’s not performance on any of the dictionary operations, and it isn’t bugs either. What’s the selling point that you’re seeing? (See the sketch below for the trade-off as I understand it.)
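
    For reference, both kinds of container are standard and equally battle-tested, which is the point of the question. A minimal sketch of the actual trade-off (note `std::map` is typically a red-black tree rather than a B-tree; `absl::btree_map` is an actual B-tree, but the trade-off is the same):

    ```cpp
    // Ordered tree map vs. hash map: deterministic sorted iteration
    // vs. average O(1) lookups.
    #include <iostream>
    #include <map>
    #include <string>
    #include <unordered_map>

    int main() {
        std::map<std::string, int> tree{{"b", 2}, {"a", 1}, {"c", 3}};
        std::unordered_map<std::string, int> hash{{"b", 2}, {"a", 1}, {"c", 3}};

        for (const auto& [k, v] : tree)
            std::cout << k << '\n';  // always a, b, c: sorted order

        for (const auto& [k, v] : hash)
            std::cout << k << '\n';  // unspecified order, may vary per run
        return 0;
    }
    ```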








  • Why restrict to 54-bit signed integers?

    Because number is a double, and IEEE 754 gives double-precision numbers a 53-bit significand plus a sign bit.

    Meaning, it’s the highest integer precision that a double-precision value can express.

    I suppose that makes sense for maximum compatibility, but feels gross if we’re already identifying value types.

    It’s not about compatibility. It’s because JSON only has a number type, which covers both floating-point values and integers, and number is implemented as a double-precision value. If you have to express integers with a double-precision type, then beyond 53 bits you start to lose precision, which goes completely against the notion of an integer.
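
    The precision cliff is easy to demonstrate; a minimal sketch:

    ```cpp
    // Past 2^53 a double can no longer represent every integer.
    #include <cstdint>
    #include <iostream>

    int main() {
        double ok = 9007199254740992.0;  // 2^53, exactly representable
        double bad = ok + 1.0;           // 2^53 + 1 rounds back down to 2^53

        std::cout << std::boolalpha
                  << (ok == bad) << '\n';  // prints "true"
        std::cout << static_cast<std::int64_t>(bad) << '\n';  // 9007199254740992
        return 0;
    }
    ```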


  • The only thing that TCP_NODELAY does is disable the packet batching/merging performed by Nagle’s algorithm. That batching supposedly increases throughput by reducing the volume of redundant header information needed to send small data payloads in individual packets, at the cost of higher latency. It’s a trade-off between latency and throughput. I don’t see any reason for TCP_NODELAY to raise transfer rates; quite the opposite. In fact, the few benchmarks I’ve seen showed exactly that: TCP_NODELAY causing a drop in the transfer rate.

    There are also articles on the cargo cult behind TCP_NODELAY.

    But feel free to show your data.
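
    For context, at the API level this is a single POSIX socket option; a minimal sketch:

    ```cpp
    // All TCP_NODELAY does is flip one socket option, disabling Nagle's
    // batching for that socket: small writes go out immediately instead
    // of being coalesced, trading throughput for latency.
    #include <netinet/in.h>
    #include <netinet/tcp.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main() {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        int one = 1;
        setsockopt(fd, IPPROTO_TCP, TCP_NODELAY, &one, sizeof(one));
        close(fd);
        return 0;
    }
    ```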




  • Safe C++ · 2 days ago

    It’s very hard for “Safe C++” to exist when integer overflow is UB.

    You could simply state you did not read the article and decided to comment out of ignorance.

    If you spent one minute skimming through the article, you would have stumbled upon the section on undefined behavior. Instead, you opted to post ignorant drivel.
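
    For anyone landing mid-thread, here is what the quoted objection refers to; a minimal sketch (the function is made up) of signed overflow being undefined behavior in standard C++:

    ```cpp
    // Signed integer overflow is UB, so the optimizer may assume it
    // never happens and drop overflow checks.
    #include <limits>

    int increment(int x) {
        return x + 1;  // UB if x == INT_MAX
    }

    int main() {
        return increment(std::numeric_limits<int>::max());  // undefined behavior
    }
    ```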


  • I wouldn’t call bad readability a loaded gun really.

    Bad readability is a problem caused by the developer, not the language. Anyone can crank out unreadable symbol soup in any language, if that’s what they want to deliver or all they can deliver.

    Blaming the programming language for the programmer’s incompetence is very telling. So telling, in fact, that there’s a saying for it: a bad workman always blames his tools.


  • Safe C++ · 2 days ago · edited

    Well, auto looks just like var in that regard.

    It really isn’t a problem, and neither is var in C# or Java. They are just syntactic sugar to avoid redundant type specifications. I mean things like Foo foo = new Foo();. Who gets confused by that?

    Why do you think IDEs are able to tell which type a variable is?

    Even C# takes it a step further and allows developers to omit the constructor with target-typed new expressions. No one is whining about dynamic types just because the language lets you instantiate an object with Foo foo = new();.
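
    A minimal sketch of the point: auto, like var, is resolved entirely at compile time, and the variable’s type never changes afterwards.

    ```cpp
    // auto is static typing with the redundancy removed, not dynamic typing.
    #include <string>
    #include <vector>

    int main() {
        auto n = 42;                // deduced as int, fixed forever
        auto s = std::string{"x"};  // deduced as std::string

        // n = "oops";              // would not compile: n is an int

        // The IDE "knows" the type for the same reason the compiler does:
        // it is fully determined by the initializer.
        std::vector<int> v{1, 2, 3};
        auto it = v.begin();        // spares spelling std::vector<int>::iterator
        (void)it; (void)s;
        return 0;
    }
    ```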