Björkus "No time_t to Die" Dorkus (@thephd@pony.social)
A lot of people think I'm being sarcastic here, which is fair because I only went toe-to-toe against people on Twitter and didn't do much here, so I'll state my full opinion below anyhow:
I would agree with anyone about not wanting to replace C (or C++). But, C has been alive for 50 years (or just 35 from C89) and Rust has been alive for just barely under 10 (since Rust 1.0). Even if you measure the last 10 years of Rust versus the last 10 years of C or C++, one of these languages is making leaps and bounds ahead in providing people better primitives to do good work.
SafeInt secured pretty much all of Microsoft Office from some of the hardest bugs back around 2005. C++ still lacks safe integer primitives; C only just got 3 functions to do overflow-checked math in C23, after David Svoboda campaigned for years. Rust just... has them baked into the standard library, for all the types you care about, too.
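A minimal sketch of what those baked-in primitives look like on the Rust side (the C23 counterparts are the ckd_add/ckd_sub/ckd_mul macros from <stdckdint.h>):

```rust
fn main() {
    // checked_add returns None on overflow instead of silently wrapping
    let big: i32 = i32::MAX;
    match big.checked_add(1) {
        Some(sum) => println!("sum = {sum}"),
        None => println!("i32 addition would overflow"),
    }

    // the same checked_* methods exist for every integer width: u8, u64, usize, ...
    assert_eq!(255u8.checked_mul(2), None);
    assert_eq!(100u8.checked_mul(2), Some(200));
}
```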
Similarly, people have been having memory issues in C and C++ for a while too. Most of the way to get better has been clamping down on static analysis and doing more testing, but we're still getting these errors. Meanwhile, teams writing Rust have been making way fewer errors of this kind in all the openly-published data from corporations like Google, and privately we are hearing a lot more about people taking complex financial and parsing code, turning it into Rust, and having a fraction of the issues.
Even if I want to see C doing better, I have to acknowledge we (a) were too slow and not brave enough to do the things that could fix these portions of the language; (b) have fundamental design issues in the language itself that make ownership impossible to integrate as part of the language without breaking a ton of code; (c) do not provide good in-language tools and keep depending on vendors to "do the right thing" (i.e. adding or expanding U.B. and then just saying "vendors will check it" rather than taking responsibility with our language design); (d) are moving monumentally too slow to address the needs of the industry that many people -- especially security people -- have been yelling about since the mid 90s.
As much as I just want to pretend that I can write off every developer with "haha lole skill issue test better sanitize better IDIOT", if the root cause on this bug is "there was some C and/or C++ code that looked nominally correct but did batshit insanity in production", we absolutely will have problems to answer for. This doesn't absolve CrowdStrike for cutting 100s of workers and playing fast and loose, this doesn't excuse the fact that hospitals went down and people likely died from lack of access to care, this doesn't change that it's abhorrent to have unmitigated hardware access in Ring0 just for a "security product", which has been the trend of every app wanting to plug in its own RootKit-like tool just for the sake of "app security" lately (League, NProtect, School Exam Spyware, etc.). There's a LOT of levels of "what the fuck have we let happen?" in play here, but I don't control those other levels.
I'm responsible for C, so I'm gonna look at the C bit. Other people responsible for the other parts of this stack should, hopefully, take sincere responsibility for those parts. (I doubt it, though, lmao.)
The post mentions data or research on how Rust usage is resulting in fewer errors in comparison to C. Anyone aware of good sources for that?
I’ve been able to find the following, but it does make sense, and I’ve been reading articles for years saying the same thing. Memory bugs are the cause of the majority of security flaws in larger software. Rust, as it’s memory safe by default, allows one to avoid this in the majority of the codebase. That link separately links off to Google, Microsoft, and a few others stating exactly that.
I haven’t seen any direct “we switched to Rust and now experience 70% fewer errors” yet, but the errors found would be impossible with Rust, Zig, or any other low-level memory-safe language.
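As a concrete illustration (a hypothetical snippet, not taken from any of the linked reports), here is the kind of use-after-free a C or C++ compiler would happily accept but rustc rejects at compile time:

```rust
fn main() {
    let dangling = {
        let s = String::from("short-lived");
        &s // error[E0597]: `s` does not live long enough
    };
    // `s` is freed at the end of the block above, so reading through
    // `dangling` would be a use-after-free; the borrow checker refuses
    // to compile this program at all.
    println!("{dangling}");
}
```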
This post from 2022 was very interesting:
https://security.googleblog.com/2022/12/memory-safe-languages-in-android-13.html
Definitely interesting!
From the end of the article:
So have Google continued this, and are they generally pushing Rust? With interest and support from Google, I’d imagine that’d flow into more contributions, funding, etc.
Anecdotally, I converted a Python app to Rust and suddenly had no more runtime errors. It’s utter bliss.
That’s interesting. What made the difference? The type system? Or memory safety constraints forcing more correct logic?
Both. It allowed/forced me to explicitly handle edge cases I wasn’t thinking about. That means the error doesn’t happen at run time, but at compile time (or while writing!), so technically speaking the errors didn’t go away; they just moved right in my face rather than “maybe in the future.”
Most of the time the remedy was to explicitly catch whatever happened and explain it nicely, vs. looking at empty production logs because logging is turned down.
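For illustration (the file name and messages here are made up), that “explicitly catch it and explain it” pattern tends to look something like this:

```rust
use std::fs;

fn main() {
    // The failure case is part of the return type, so it has to be handled
    // (or deliberately propagated); it can't just be silently ignored.
    match fs::read_to_string("config.toml") {
        Ok(contents) => println!("loaded {} bytes of config", contents.len()),
        Err(err) => eprintln!("could not read config.toml: {err}"),
    }
}
```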
It’s certainly a preference, but for me, I’d rather argue with the compiler all day long and push a bulletproof release than quickly ship something I thought was good and be embarrassed.
Yea, it’s understated IMO how much the compiler requirements actually manifest directly in the writing process with LSPs etc.
Yea, even that is making the logic more watertight.
Thanks!!