Reply by Les Cargill August 21, 2020
Tom Gardner wrote:
> On 19/08/20 10:33, Les Cargill wrote: >> Tom Gardner wrote: >>> On 14/08/20 04:35, Les Cargill wrote: >>>> jlarkin@highlandsniptechnology.com wrote: >>>>> On Wed, 12 Aug 2020 08:33:20 +0100, Martin Brown >>>>> <'''newspam'''@nonad.co.uk> wrote: >>>>> >>>> <snip> >>>>> >>>>> The real dollar cost of bad software is gigantic. There should be no >>>>> reason for a small or mid-size company to continuously pay IT security >>>>> consultants, or to run AV software. >>>>> >>>> >>>> It's not even accounted for, nor is it an actual cost in the usual >>>> sense >>>> of the word - nobody's trying to make this actually less expensive. >>>> >>>>>> >>>>>> C invites certain dangerous practices that attackers ruthlessly >>>>>> exploit >>>>>> like loops copying until they hit a null byte. >>>>> >>>>> Let bad programs malfunction or crash. But don't allow a stack or >>>>> buffer overflow to poke exploits into code space. The idea of >>>>> separating data, code, and stack isn't hard to understand, or even >>>>> hard to implement. >>>>> >>>>> We probably need to go to pseudocode-only programs. The machine needs >>>>> to be protected from programmers and from bad architectures. Most >>>>> programmers never learn about machine-level processes. >>>>> >>>> >>>> That's what "managed languages" like Java or C# do. It's all bytecode >>>> in a VM. >>> >>> Java is, C# isn't. >>> >> >> Close enough: >> https://docs.microsoft.com/en-us/dotnet/standard/managed-code#:~:text=Managed%20code%20is%20written%20in,don't%20get%20machine%20code. >> >> >> "You get Intermediate Language code which the runtime then compiles and >> executes." >> >> I'd call that an implementation detail; it does not load the image >> into memory then jump to _main. > > Yebbut :) > >
We knew that was coming. :)
>> The point of my comment is that both Java and C# are considered "managed
>> languages", especially for security purposes. I suppose somebody,
>> somewhere is writing virii in C# but ...
>
> Except that C# has a gaping chasm in that security: "unsafe".
> https://docs.microsoft.com/en-us/dotnet/csharp/language-reference/keywords/unsafe
Oh my. Words fail.
> I imagine too many programs make use of that "convenience".
Of course they do. Without the proper training, we're all some shade of
destructive primate anyway.

--
Les Cargill
Reply by Les Cargill August 21, 2020
Tom Gardner wrote:
> On 19/08/20 10:28, Les Cargill wrote:
>> Tom Gardner wrote:
<snip>
>> Why did it surprise anybody?
>
> I can only guess they
>  - were young and inexperienced
>  - didn't think about the language's fundamentals
>  - believed vendor/supplier hype/statements
>
> Sad :(
I just mean that anyone who's had even a passing glance at an operating
systems course would most likely understand. <snip>
>> You'll get no argument here. But all things which have too
>> much light on them end up in Mandaranism.
>
> Mandaranism?
"That which is not required is forbidden. That which is not forbidden is required."
>> And the best tools to inspect memory are built into the running
>> application itself.
>
> Yes, but they will inevitably rely on core language
> and implementation guarantees and lack of guarantees.
This is not a problem. Really. Now, if you're debugging whether *memory* works or not, that's a different sorta bear hunt. But it sorta shocks me how few developers seem to reach for an "instrumentation first" approach. I guess they guess. :)
> A classic in this context is that many optimisations
> can only be done if const declarations are present.
> Without that, the possibility of aliasing precludes
> optimisations.
>
> Now generic inspection tools, e.g. those that you might
> use to inspect what's happening in a library, have to
> be able to access the data, which implies aliasing.
Mmmmm.... okay. Good instrumentation may invoke aliasing, but it's not
necessary. I should clarify, at this point: by "instrumentation" I mean
anything from free transmission of state to RFC 1213-style "pull"
approaches to $DEITY only knows what. Basically, what used to be known
as "telemetry".
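(A minimal sketch of that sort of instrumentation in C. All names here
are invented for illustration; a real system would publish the counters
through an SNMP agent, a log stream, or a debug shell rather than
printf:)

#include <stdio.h>

/* Hypothetical free-running counters, bumped on the hot path.
   Cheap enough to leave compiled into production builds. */
static volatile unsigned long rx_frames;
static volatile unsigned long rx_crc_errors;

static void rx_handler(int crc_ok)
{
    rx_frames++;
    if (!crc_ok)
        rx_crc_errors++;
}

/* The "pull" side: something external (debug shell, SNMP agent,
   telemetry stream) reads state out of the running application. */
static void dump_counters(void)
{
    printf("rx_frames=%lu rx_crc_errors=%lu\n", rx_frames, rx_crc_errors);
}

int main(void)
{
    rx_handler(1);
    rx_handler(0);
    dump_counters();   /* prints: rx_frames=2 rx_crc_errors=1 */
    return 0;
}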
> And that requires the ability to remove constness.
Ah, baloney :) (that bit was intended to be entertaining). I just mean
that this whole exercise in pilpul doesn't amount to too much.
> The debate as to whether it should be possible/impossible
> to "cast away constness" occupied the committees for
> at least a year in the early 90s.
Turns out you can, and it's fine.
> Different languages have taken different extreme
> positions on that. Either extreme is OK, but fudging
> the issue isn't.
Extremes are boring.
> Java and similar heavily use reflection.
As should good C programs. Not the same thing as in Java, but still.
How can you measure something without... measuring something? <snip>
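(C has no built-in reflection, but the moral equivalent can be
hand-rolled: a table of field names and offsets that lets generic code
inspect a struct at runtime. A sketch, with invented names, assuming
every described field is an unsigned int:)

#include <stddef.h>
#include <stdio.h>

struct link_stats {
    unsigned tx;
    unsigned rx;
    unsigned drops;
};

/* Hand-rolled "reflection": a runtime-queryable table of field
   names and offsets into the struct. */
struct field_desc {
    const char *name;
    size_t      offset;
};

static const struct field_desc link_stats_fields[] = {
    { "tx",    offsetof(struct link_stats, tx)    },
    { "rx",    offsetof(struct link_stats, rx)    },
    { "drops", offsetof(struct link_stats, drops) },
};

/* Generic inspector: knows nothing about link_stats beyond the table. */
static void dump_fields(const void *obj,
                        const struct field_desc *fields, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        const unsigned *p =
            (const unsigned *)((const char *)obj + fields[i].offset);
        printf("%s=%u\n", fields[i].name, *p);
    }
}

int main(void)
{
    struct link_stats s = { 10, 12, 1 };
    dump_fields(&s, link_stats_fields,
                sizeof link_stats_fields / sizeof link_stats_fields[0]);
    return 0;
}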
>>> The jury is out w.r.t. Rust and Go, but they are worth watching.
>>
>> Agreed. I really expected better progress, but you know how we are...
>
> I live in hope. The only thing that makes me despair is people
> who think the status quo is good and acceptable :)
Overall, it's not that bad now. It wasn't that bad before. We have
continents of embarrassments of options now. Getting people to pay you
for it is another story. Functional correctness is for lawyers now.

--
Les Cargill
Reply by Tom Gardner August 19, 2020
On 19/08/20 10:33, Les Cargill wrote:
> Tom Gardner wrote: >> On 14/08/20 04:35, Les Cargill wrote: >>> jlarkin@highlandsniptechnology.com wrote: >>>> On Wed, 12 Aug 2020 08:33:20 +0100, Martin Brown >>>> <'''newspam'''@nonad.co.uk> wrote: >>>> >>> <snip> >>>> >>>> The real dollar cost of bad software is gigantic. There should be no >>>> reason for a small or mid-size company to continuously pay IT security >>>> consultants, or to run AV software. >>>> >>> >>> It's not even accounted for, nor is it an actual cost in the usual sense >>> of the word - nobody's trying to make this actually less expensive. >>> >>>>> >>>>> C invites certain dangerous practices that attackers ruthlessly exploit >>>>> like loops copying until they hit a null byte. >>>> >>>> Let bad programs malfunction or crash. But don't allow a stack or >>>> buffer overflow to poke exploits into code space. The idea of >>>> separating data, code, and stack isn't hard to understand, or even >>>> hard to implement. >>>> >>>> We probably need to go to pseudocode-only programs. The machine needs >>>> to be protected from programmers and from bad architectures. Most >>>> programmers never learn about machine-level processes. >>>> >>> >>> That's what "managed languages" like Java or C# do. It's all bytecode >>> in a VM. >> >> Java is, C# isn't. >> > > Close enough: > https://docs.microsoft.com/en-us/dotnet/standard/managed-code#:~:text=Managed%20code%20is%20written%20in,don't%20get%20machine%20code. > > > "You get Intermediate Language code which the runtime then compiles and > executes." > > I'd call that an implementation detail; it does not load the image into memory > then jump to _main.
Yebbut :)
> The point of my comment is that both Java and C# are considered "managed > languages", especially for security purposes. I suppose somebody, somewhere is > writing virii in C# but ...
Except that C# has a gaping chasm in that security: "unsafe".
https://docs.microsoft.com/en-us/dotnet/csharp/language-reference/keywords/unsafe

I imagine too many programs make use of that "convenience".
>> During installation C# assemblies are compiled into code >> optimised for the specific processor. Of course those >> optimisations can only be based on what the compiler/installer >> guesses the code will do at runtime. >> >> I've wondered (without conclusion) whether that is why >> it takes so long to install Windows updates, compared >> with Linux updates. >> >> Caveat: I haven't followed C# since the designer (Anders >> Hejlsberg) gave us a lecture at HPLabs, just as C# was >> being released. >> >> None of us were impressed, correctly regarding it as a >> proprietary me-too Java knockoff with slightly different >> implementation choices. > > Exactly. >
Reply by Tom Gardner August 19, 2020
On 19/08/20 10:28, Les Cargill wrote:
> Tom Gardner wrote: >> On 15/08/20 03:51, Les Cargill wrote: >>> Tom Gardner wrote: >>>> On 14/08/20 04:13, Les Cargill wrote: >>>>> Tom Gardner wrote: >>>>>> Rust and Go are showing significant promise in the >>>>>> marketplace, >>>>> >>>>> Mozzlla seems to have dumped at least some of the Rust team: >>>>> >>>>> https://www.reddit.com/r/rust/comments/i7stjy/how_do_mozilla_layoffs_affect_rust/ >>>>> >>>> >>>> I doubt they will remain unemployed. Rust is gaining traction >>>> in wider settings. >>>> >>> >>> I dunno - I can't separate the messaging from the offering. I'm >>> fine with a C/C++ compiler so I have less than no incentive to >>> even become remotely literate about Rust. >>> >>> The Rustaceans seem obsessed with stuff my cohort ( read:old people ) >>> learned six months into their first C project. But there may >>> well be benefits I don't know about. >> >> Too many people /think/ they know C. >> >> I first used C in ~81, and learned it from the two >> available books, which I still have. The second book >> was, of course, a book on traditional mistakes in C >> "The C Puzzle Book". >> >> It is horrifying that Boehm thought it worth writing this >> http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf >> http://hboehm.info/misc_slides/pldi05_threads.pdf >> and that it surprised many C practitioners. >> > > Why did it surprise anybody?
I can only guess they
 - were young and inexperienced
 - didn't think about the language's fundamentals
 - believed vendor/supplier hype/statements

Sad :(
>> Rust directly addresses some of the pain points. >> >> >>> It is not-not a thing; the CVE list shows that. I am just appalled >>> that these defects are released. >> >> If you think C and C++ languages and implementations >> are fault-free, I'd like to visit your planet sometime :) >> > > I don't think that at all. That's not necessarily a reasonable > standard to boot. You can *reliably* produce perfectly > functional work product with them, without knowing a whole lot about what's > under the hood (mostly ) and without a whole mass of pain. > > Once you find the few rocks under the water... > >> You can start with the C++ FQA http://yosefk.com/c++fqa/ >> >> Watching (from a distance) the deliberations of the C/C++ >> committees in the early 90s was enlightening, in a bad way. >> One simple debate (which lasted years) was whether is ought >> to be possible or impossible to "cast away constness". >> There are good reasons for both, and they cannot be >> reconciled. >> (Yes: to allow debuggers and similar tools to inspect memory. >> No: to enable safe aggressive optimisations) >> >> > > You'll get no argument here. But all things which have too > much light on them end up in Mandaranism.
Mandaranism?
> And the best tools to inspect memory are built into the running application itself.
Yes, but they will inevitably rely on core language and implementation
guarantees and lack of guarantees.

A classic in this context is that many optimisations can only be done
if const declarations are present. Without that, the possibility of
aliasing precludes optimisations.

Now generic inspection tools, e.g. those that you might use to inspect
what's happening in a library, have to be able to access the data,
which implies aliasing. And that requires the ability to remove
constness.

The debate as to whether it should be possible/impossible to "cast away
constness" occupied the committees for at least a year in the early
90s.

Different languages have taken different extreme positions on that.
Either extreme is OK, but fudging the issue isn't.

Java and similar heavily use reflection. Rust ensures data has only one
owner at a time. Both are good, effective and reliable.
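(Both horns of that dilemma show up in a few lines of C. A sketch: the
cast itself always compiles, but writing through it is undefined
behaviour precisely when the underlying object was defined const, which
is the guarantee an optimiser wants to exploit:)

#include <stdio.h>

static void poke(const int *p)
{
    /* "Casting away constness": the cast is always legal; the write
       is fine only if the pointed-to object was not defined const. */
    *(int *)p = 42;
}

int main(void)
{
    int plain = 0;
    poke(&plain);              /* OK: the object itself is mutable */
    printf("%d\n", plain);     /* prints 42 */

    static const int frozen = 7;
    (void)frozen;
    /* poke(&frozen); */       /* would compile, but is undefined
                                  behaviour: the compiler may have
                                  constant-folded 'frozen' or placed
                                  it in read-only memory */
    return 0;
}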
>>>> Linus Torvalds is vociferously and famously opposed to having >>>> C++ anywhere near the Linux kernel (good taste IMNSHO). >>> >>> Don't take any cues from Linus Torvalds. He's why my deliverables >>> at one gig were patch files. I've no objection to that but geez... >>> >>> And C++ is Just Fine. Now. It took what, 20 years? >> >> Worse: 30 years! >> > > Yep; yer right. > >> I first used it in '88, and thought it a regression >> over other available languages. >> >> >>> The reasons for "no C++ in the kernel" are quite serious, valid and worthy of >>> our approval. >>> >>>> He >>>> has given a big hint he wouldn't oppose Rust, by stating that >>>> if it is there it should be enabled by default. >>>> >>>> https://www.phoronix.com/scan.php?page=news_item&px=Torvalds-Rust-Kernel-K-Build >>>> >>> >>> >>> I've seen this movie before. It's yet another This Time It's Different >>> approach. >> >> Oh, we've all seen too many examples of that, in hardware >> and software! The trick is recognising which bring worthwhile >> practical and novel capabilities to the party. Most don't, >> a very few do. >> >> The jury is out w.r.t. Rust and Go, but they are worth watching. > > Agreed. I really expected better progress, but you know how we are...
I live in hope. The only thing that makes me despair is people who think the status quo is good and acceptable :)
Reply by Les Cargill August 19, 2020
Tom Gardner wrote:
> On 14/08/20 04:35, Les Cargill wrote: >> jlarkin@highlandsniptechnology.com wrote: >>> On Wed, 12 Aug 2020 08:33:20 +0100, Martin Brown >>> <'''newspam'''@nonad.co.uk> wrote: >>> >> <snip> >>> >>> The real dollar cost of bad software is gigantic. There should be no >>> reason for a small or mid-size company to continuously pay IT security >>> consultants, or to run AV software. >>> >> >> It's not even accounted for, nor is it an actual cost in the usual sense >> of the word - nobody's trying to make this actually less expensive. >> >>>> >>>> C invites certain dangerous practices that attackers ruthlessly exploit >>>> like loops copying until they hit a null byte. >>> >>> Let bad programs malfunction or crash. But don't allow a stack or >>> buffer overflow to poke exploits into code space. The idea of >>> separating data, code, and stack isn't hard to understand, or even >>> hard to implement. >>> >>> We probably need to go to pseudocode-only programs. The machine needs >>> to be protected from programmers and from bad architectures. Most >>> programmers never learn about machine-level processes. >>> >> >> That's what "managed languages" like Java or C# do. It's all bytecode >> in a VM. > > Java is, C# isn't. >
Close enough:
https://docs.microsoft.com/en-us/dotnet/standard/managed-code#:~:text=Managed%20code%20is%20written%20in,don't%20get%20machine%20code.

"You get Intermediate Language code which the runtime then compiles and
executes."

I'd call that an implementation detail; it does not load the image into
memory then jump to _main. The point of my comment is that both Java
and C# are considered "managed languages", especially for security
purposes. I suppose somebody, somewhere is writing virii in C# but ...
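(As an aside, the "loops copying until they hit a null byte" hazard
quoted above, and the bounded alternative that managed runtimes enforce
automatically, look like this in C; the buffer sizes are illustrative:)

#include <stdio.h>
#include <string.h>

void risky(const char *src)
{
    char buf[16];
    strcpy(buf, src);   /* copies until the NUL byte: overruns buf if
                           src is 16 bytes or longer, smashing the
                           stack -- the classic exploit vector */
    puts(buf);
}

void bounded(const char *src)
{
    char buf[16];
    /* Copy at most sizeof(buf)-1 bytes and terminate explicitly.
       The worst case is truncation, not code execution. */
    strncpy(buf, src, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';
    puts(buf);
}

int main(void)
{
    bounded("this string is much longer than sixteen bytes");
    /* risky() with the same argument would be undefined behaviour */
    return 0;
}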
> During installation C# assemblies are compiled into code > optimised for the specific processor. Of course those > optimisations can only be based on what the compiler/installer > guesses the code will do at runtime. > > I've wondered (without conclusion) whether that is why > it takes so long to install Windows updates, compared > with Linux updates. > > Caveat: I haven't followed C# since the designer (Anders > Hejlsberg) gave us a lecture at HPLabs, just as C# was > being released. > > None of us were impressed, correctly regarding it as a > proprietary me-too Java knockoff with slightly different > implementation choices.
Exactly.

--
Les Cargill
Reply by Les Cargill August 19, 2020
Tom Gardner wrote:
> On 15/08/20 03:51, Les Cargill wrote: >> Tom Gardner wrote: >>> On 14/08/20 04:13, Les Cargill wrote: >>>> Tom Gardner wrote: >>>>> Rust and Go are showing significant promise in the >>>>> marketplace, >>>> >>>> Mozzlla seems to have dumped at least some of the Rust team: >>>> >>>> https://www.reddit.com/r/rust/comments/i7stjy/how_do_mozilla_layoffs_affect_rust/ >>>> >>> >>> I doubt they will remain unemployed. Rust is gaining traction >>> in wider settings. >>> >> >> I dunno - I can't separate the messaging from the offering. I'm >> fine with a C/C++ compiler so I have less than no incentive to >> even become remotely literate about Rust. >> >> The Rustaceans seem obsessed with stuff my cohort ( read:old people ) >> learned six months into their first C project. But there may >> well be benefits I don't know about. > > Too many people /think/ they know C. > > I first used C in ~81, and learned it from the two > available books, which I still have. The second book > was, of course, a book on traditional mistakes in C > "The C Puzzle Book". > > It is horrifying that Boehm thought it worth writing this > http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf > http://hboehm.info/misc_slides/pldi05_threads.pdf > and that it surprised many C practitioners. >
Why did it surprise anybody?
> Rust directly addresses some of the pain points.
>
>> It is not-not a thing; the CVE list shows that. I am just appalled
>> that these defects are released.
>
> If you think C and C++ languages and implementations
> are fault-free, I'd like to visit your planet sometime :)
I don't think that at all. That's not necessarily a reasonable
standard, to boot. You can *reliably* produce perfectly functional work
product with them, without knowing a whole lot about what's under the
hood (mostly) and without a whole mass of pain.

Once you find the few rocks under the water...
> You can start with the C++ FQA http://yosefk.com/c++fqa/
>
> Watching (from a distance) the deliberations of the C/C++
> committees in the early 90s was enlightening, in a bad way.
> One simple debate (which lasted years) was whether it ought
> to be possible or impossible to "cast away constness".
> There are good reasons for both, and they cannot be
> reconciled.
> (Yes: to allow debuggers and similar tools to inspect memory.
> No: to enable safe aggressive optimisations)
You'll get no argument here. But all things which have too much light
on them end up in Mandaranism.

And the best tools to inspect memory are built into the running
application itself.
> >>> Linus Torvalds is vociferously and famously opposed to having >>> C++ anywhere near the Linux kernel (good taste IMNSHO). >> >> Don't take any cues from Linus Torvalds. He's why my deliverables >> at one gig were patch files. I've no objection to that but geez... >> >> And C++ is Just Fine. Now. It took what, 20 years? > > Worse: 30 years! >
Yep; yer right.
> I first used it in '88, and thought it a regression > over other available languages. > > >> The reasons for "no C++ in the kernel" are quite serious, valid and >> worthy of our approval. >> >>> He >>> has given a big hint he wouldn't oppose Rust, by stating that >>> if it is there it should be enabled by default. >>> >>> https://www.phoronix.com/scan.php?page=news_item&px=Torvalds-Rust-Kernel-K-Build >>> >> >> >> I've seen this movie before. It's yet another This Time It's Different >> approach. > > Oh, we've all seen too many examples of that, in hardware > and software! The trick is recognising which bring worthwhile > practical and novel capabilities to the party. Most don't, > a very few do. > > The jury is out w.r.t. Rust and Go, but they are worth watching.
Agreed. I really expected better progress, but you know how we are...

--
Les Cargill
Reply by Tom Gardner August 17, 2020
On 17/08/20 02:46, Clifford Heath wrote:
> On 17/8/20 7:09 am, pcdhobbs@gmail.com wrote: >>>>> That's what "managed languages" like Java or C# do. It's all bytecode >>>>> in a VM. >>> >>>> Sort of like UCSD Pascal, circa 1975. ;) >> >>> In the same way that a bacterium is sort of like a mammal. Both have >>> DNA. The similarity ends there. >> >> If you would care to compare and contrast Java and UCSD Pascal, I'd read with >> interest. > > I could, but don't care to say much. Any bytecode is like any other for the most > part, but JVM is designed to JIT, and the level of sophistication in HotSpot is > analogous to the mammalian superstructure. It really is an immense tower of > technology. Personally I prefer the AOT idea to JIT, but I respect the achievement.
Never confuse HotSpot with JIT.

JIT is a runtime peephole optimiser, and hence uses only local
information about the code emitted by the compiler. That code is based
on what the compiler can guess/presume about how the code /might/
behave. HotSpot looks at what the code is /actually/ doing, and
optimises the shit out of that.

AOT is HotSpot without knowing what the code will do. It optimises for
the instruction set in a particular processor (and there are many
variants between AMD/Intel!). I don't know how it deals with processors
being changed after installation, e.g. for all the recent cache-timing
malware attacks. HotSpot and JIT can take account of "removed"
processor functionality, by replacing the runtime.

AOT is the only game for embedded. HotSpot has major advantages
elsewhere.
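(For illustration only: the guard-on-observed-behaviour idea behind
HotSpot-style optimisation can be mimicked by hand in C with a toy
monomorphic inline cache. This is a sketch of the principle, not of
HotSpot's actual machinery, which patches machine code at runtime; all
names are invented:)

#include <stdio.h>

typedef int (*op_fn)(int);

static int double_it(int x) { return 2 * x; }
static int negate(int x)    { return -x;    }

/* Toy monomorphic inline cache: remember the last observed call
   target and guard on it. A real JIT would patch in a direct call
   (or inline the body) behind the guard; here the point is just
   that optimisation follows what the code actually does. */
static op_fn cached_target;

static int dispatch(op_fn f, int x)
{
    if (f == cached_target)
        return cached_target(x);   /* fast path: the guess held */
    cached_target = f;             /* slow path: re-cache ("repatch") */
    return f(x);
}

int main(void)
{
    for (int i = 0; i < 5; i++)
        printf("%d\n", dispatch(double_it, i));  /* warms the cache */
    printf("%d\n", dispatch(negate, 3));         /* miss: repatches */
    return 0;
}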
Reply by August 17, 2020
>I could, but don't care to say much. Any bytecode is like any other for >the most part
My point. Pretty far from your earlier response, which I reprise:
">> Sort of like UCSD Pascal, circa 1975. ;)"
>In the same way that a bacterium is sort of like a mammal. Both have >DNA. The similarity ends there.
So any bacterium is like any mammal "for the most part." Good to know! ;)

Cheers

Phil Hobbs
Reply by Clifford Heath August 16, 2020
On 17/8/20 7:09 am, pcdhobbs@gmail.com wrote:
>>>> That's what "managed languages" like Java or C# do. It's all bytecode
>>>> in a VM.
>
>>> Sort of like UCSD Pascal, circa 1975. ;)
>
>> In the same way that a bacterium is sort of like a mammal. Both have
>> DNA. The similarity ends there.
>
> If you would care to compare and contrast Java and UCSD Pascal, I'd
> read with interest.
I could, but don't care to say much. Any bytecode is like any other for
the most part, but the JVM is designed to JIT, and the level of
sophistication in HotSpot is analogous to the mammalian superstructure.
It really is an immense tower of technology. Personally I prefer the
AOT idea to JIT, but I respect the achievement.

CH
Reply by August 16, 2020
>>> That's what "managed languages" like Java or C# do. It's all bytecode
>>> in a VM.
>
>> Sort of like UCSD Pascal, circa 1975. ;)
>In the same way that a bacterium is sort of like a mammal. Both have >DNA. The similarity ends there.
If you would care to compare and contrast Java and UCSD Pascal, I'd
read with interest.

Cheers

Phil Hobbs