Electronics-Related.com
Forums

PIC/dsPIC development

Started by bitrex November 4, 2018
On 06/11/18 13:36, Tom Gardner wrote:
> On 06/11/18 11:28, David Brown wrote:
>> On 05/11/18 19:44, Clive Arthur wrote:
>>> On 05/11/2018 18:01, Tom Gardner wrote:
>>>
>>> <snipped>
>>>
>>>>> On Monday, 5 November 2018 06:01:16 UTC-5, Clive Arthur wrote:
>>>>>> I'd like to know exactly what the [dsPIC] optimisers do.
>>>>
>>>> The make more assumptions that the programmer understands
>>>> what they've asked the computer and compiler to do.
>>>
>>> Yebbut, that's not exactly exact. Any examples?
>>
>> I am guessing he means that many compilers are more forgiving of
>> programmers who don't understand C if they are not optimising. A
>> classic example is that some people think that if you overflow a signed
>> integer calculation, you get two's complement wrapping if the underlying
>> processor supports it (they think that "x + y" in C translates to an
>> "ADD" instruction in assembly). With the kind of simplistic
>> translation-style compilation you often get with no optimisation
>> enabled, that can often be correct. With optimisation enabled and
>> better analysis of the source code, it is frequently not the case.
>
> Yes, that's the kind of thing, but of course there are
> many other examples!
>
>> The C programming world is somewhat split into two kinds of programmers.
>> One kind feel that the C language is defined by the standards and
>> implementation-specific documentation, and this forms a contract between
>> the programmer and the compiler - the programmer promises to follow the
>> rules of the standards, and the compiler promises to give them the best
>> possible code on the assumption that the rules are followed. They work
>> with their compiler in order to get the most efficient results.
>>
>> The other kind of C programmer have a sort of "Humpty Dumpty" attitude
>> that "programs mean exactly what I want them to mean" and that C is a
>> "high level assembly". Those kinds of programmers get upset when
>> compilers "outsmart" them, and are continually fighting their tools.
> There's some truth in that, but it is far too black and
> white. I believe the vast majority of programmers can be
> described as:
> - want to do a good job
There are, unfortunately, a good deal who simply want to get paid and couldn't care less about the quality of their work.
> - have some humility, but not too much
I don't have any humility. I think it is one of my few flaws.
> - have /some/ knowledge of their limitations and of
>   the C/C++ tool's "quirks", and try to live within them
Yes.
> - aren't experts that really do understand those things
Yes.

But usually, they can simply avoid the quirks of C. Do you know what happens if your signed integers overflow? The real answer is "it does not matter". If your calculations are overflowing then, bar a very few special cases, your code is broken. At best, a discussion of integer overflow behaviour is a discussion of what the pieces will look like when your program falls apart.

The people who get in trouble with undefined behaviour like this are - for the most part - the smart arses. It's the people who think "I'm not going to test ((x > -5) && (x < 5)) - it's more efficient to test ((x - 2147483644) > 0)".
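The safe way out of that particular dark corner is to keep the overflow from ever happening: test against the limits before doing the arithmetic, using only expressions that are themselves well defined. A minimal sketch (the function name is mine; the same idea works identically in C with INT_MAX/INT_MIN):

```cpp
#include <limits>

// Check whether a + b would overflow, without ever performing an
// operation that could itself overflow. Every expression here stays
// inside the representable range, so the behaviour is fully defined
// at any optimisation level.
bool add_would_overflow(int a, int b) {
    if (b > 0) return a > std::numeric_limits<int>::max() - b;
    if (b < 0) return a < std::numeric_limits<int>::min() - b;
    return false;  // adding zero never overflows
}
```

No "clever" wrapped-subtraction trick is needed, and the optimiser has nothing to exploit.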
> So, despite their best intentions, it is unlikely that
> such users will fully comprehend the C/C++ tools limitations
> nor their own limitations, let alone that of other people's
> code and libraries.
True. But it is rarely a problem - except for the smart-arses, who are always a problem regardless of the language or its behaviour. People often talk about C being an "unsafe" language. That is, in many ways, true - it is easy to get things wrong, and write code with bugs, leaks, security holes, etc. But these mostly have nothing to do with the quirks of C, or how compilers optimise. It is just thoughtless or careless coding, because C requires more manual effort than most languages. Buffer overflows, missing or duplicate "free" calls, etc., are nothing to do with misunderstandings about the oddities of the language - it is simply not thinking carefully enough about the code, and the results are the same regardless of optimisation settings.
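To make that concrete: an off-by-one buffer access is wrong at every optimisation setting, and checked access (here sketched with C++'s std::vector::at, as one possible mitigation; the function is illustrative, not from the thread) at least turns it into a defined, diagnosable failure rather than silent corruption:

```cpp
#include <stdexcept>
#include <vector>

// Fetch the last element. With a raw C array, calling this on an empty
// buffer would read out of bounds - a bug regardless of optimisation
// settings. vector::at() performs a bounds check and throws instead.
int last_element(const std::vector<int>& v) {
    if (v.empty())
        throw std::out_of_range("last_element: empty vector");
    return v.at(v.size() - 1);
}
```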
>>> I find it frustrating that you're using a tool where the maker won't
>>> tell you what the little check boxes actually do.
>>
>> Compiler manuals often (but not always) have a lot of information about
>> what the optimisations do. It is rarely possible to give complete
>> details, however - it's just too complicated to fit in user
>> documentation. gcc has summaries for all the different optimisation
>> flags - there are /many/ of them - but it doesn't cover everything.
>
> That complexity is a serious issue. If given a piece
> of code, most developers won't understand which
> combination of compiler flag must/mustn't be used.
For the most part, if the code requires a particular choice of flags to be used or omitted, the code is wrong. There are exceptions, of course - flags to pick a particular version of the C or C++ standards are important.
> Now, given that the principal actors are the users, the
> tools and the user's organisation, what is an appropriate
> behaviour for each actor?
>
> Take the time and money to become an expert?
>
> Choose simple(r) tools that do the job adequately?
>
> Ship products even though there is, despite best intentions,
> improper understanding of the tools and code?
>
> Refuse to use unnecessarily complex tools?
Stick to what you know, and try to write code that makes sense. If you don't know the details of how C (and your particular compiler) treats overflows, or casts with different pointer types, or other "dark corners" of the language, then stay out of the dark corners. Ask people when you are stuck. Test thoroughly. Learn to use and love compiler warnings.
On 06/11/18 15:01, Phil Hobbs wrote:
> On 11/6/18 7:36 AM, Tom Gardner wrote:
>> On 06/11/18 11:28, David Brown wrote:
>>> On 05/11/18 19:44, Clive Arthur wrote:
>>>> On 05/11/2018 18:01, Tom Gardner wrote:

<snipped>

>> That complexity is a serious issue. If given a piece
>> of code, most developers won't understand which
>> combination of compiler flag must/mustn't be used.
>
> Code that works on some compiler settings and not others gives me the
> heebie-jeebies. People often talk about "optimizer bugs" that really
> aren't anything of the sort. Of course vaguely-defined language
> features such as 'volatile' and old-timey thread support don't help.
> (Things have been getting better on that front, I think.)
In my career, I have come across a number of compiler bugs. But for almost all cases where I have heard people say "it works without optimisation, but fails when optimisation is enabled", the problem is their own code. It is things like missing "volatile", or "tricks" which are not valid C code. Using weaker compilers (or good compilers in weaker modes) does not really help - it just encourages bad habits.
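The missing-"volatile" case is the archetype. A sketch (the handler name is mine, for illustration): a flag polled in a loop, set only from an interrupt handler. Without the volatile qualifier, an optimiser is entitled to read the flag once and spin forever - the bug is in the code, not in the optimiser.

```cpp
// A flag set from an interrupt handler (or another context entirely).
// 'volatile' tells the compiler that every access must really happen;
// without it, an optimising compiler may hoist the load out of the
// loop, because nothing in the visible code changes the flag.
volatile bool data_ready = false;

void wait_for_data() {
    while (!data_ready) {
        // busy-wait; the handler below sets the flag
    }
}

void uart_rx_isr() {  // hypothetical interrupt handler
    data_ready = true;
}
```

Note that volatile only keeps the accesses in the generated code; for genuine multi-threaded sharing, the C11/C++11 atomics are the right tool.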
>> Now, given that the principal actors are the users, the
>> tools and the user's organisation, what is an appropriate
>> behaviour for each actor?
>>
>> Take the time and money to become an expert?
>
> I don't think you have to be a certified language lawyer to avoid most
> of that stuff. Just stay paranoid and don't try to be too clever.
Exactly. It can help to have contact with a language lawyer or two for answering questions - there is no shortage of online resources for that.
>> Choose simple(r) tools that do the job adequately?
>>
>> Ship products even though there is, despite best intentions,
>> improper understanding of the tools and code?
>
> That's bound to be the case at some level, except for simple programs.
>
>> Refuse to use unnecessarily complex tools?
>
> Or the parts of them that you don't understand.
>
> Cheers
>
> Phil Hobbs
On 06/11/18 14:01, Phil Hobbs wrote:
> On 11/6/18 7:36 AM, Tom Gardner wrote:
>> On 06/11/18 11:28, David Brown wrote:
>>> On 05/11/18 19:44, Clive Arthur wrote:
>>>> On 05/11/2018 18:01, Tom Gardner wrote:

<snipped>

>> That complexity is a serious issue. If given a piece
>> of code, most developers won't understand which
>> combination of compiler flag must/mustn't be used.
>
> Code that works on some compiler settings and not others gives me the
> heebie-jeebies. People often talk about "optimizer bugs" that really
> aren't anything of the sort. Of course vaguely-defined language
> features such as 'volatile' and old-timey thread support don't help.
> (Things have been getting better on that front, I think.)
Me too, but it is unclear to me that Things Are Getting Better. If they are it is /very/ slow and will in many cases be constrained by having to use old variants of a language.
>> Now, given that the principal actors are the users, the
>> tools and the user's organisation, what is an appropriate
>> behaviour for each actor?
>>
>> Take the time and money to become an expert?
>
> I don't think you have to be a certified language lawyer to avoid most
> of that stuff. Just stay paranoid and don't try to be too clever.
That's my attitude, but I'm not sure it is still possible. It isn't just a "language lawyer", it is a "language+this compiler" lawyer.

As one example, the C++ FQA (sic) has some tortuous examples. Yes, it is belligerent and amusing and a little outdated, but most points hit home :(
http://yosefk.com/c++fqa/

As another, consider that the C++ language *designers* refused to admit that the C++ templating language was Turing complete - until Erwin Unruh rubbed their noses in it by getting the compiler to emit prime numbers *during compilation*.
https://en.wikibooks.org/wiki/C%2B%2B_Programming/Templates/Template_Meta-Programming#History_of_TMP

Key phrase: "...it was *discovered* during the process of standardizing..."

If the language designers don't understand their language, it is too complex - and is becoming part of the problem.
>> Choose simple(r) tools that do the job adequately?
I must admit to wanting to go down this route. There are a few signs that this is becoming possible, but nothing conclusive for embedded programming. I hope C (and especially C++) end up like COBOL: still there, but the new generation has moved on to Better Things.
>> Ship products even though there is, despite best intentions,
>> improper understanding of the tools and code?
>
> That's bound to be the case at some level, except for simple programs.
True, but...
>> Refuse to use unnecessarily complex tools?
>
> Or the parts of them that you don't understand.
Ah, but most people /think/ they understand all the parts - until someone demonstrates to them that they have been bitten and not noticed it.
On Monday, November 5, 2018 at 9:17:00 AM UTC-5, bitrex wrote:
> On 11/05/2018 03:23 AM, 698839253X6D445TD@nospam.org wrote:
>> bitrex wrote
>>> My time from concept to working prototype on breadboard is
>>> astonishingly low, sometimes just tens of minutes.
>>
>> Just lighting a LED can be done with one resistor too.
>
> Yeah anything more complex would be a big job to code if I had to punch
> in all the ones and zeros by hand, surely!
That can be automated too. A random pattern would be 50% right to start. Use a measurement based analysis of the resulting operation and then randomly alter bits until it passes unit testing. Many parts don't have all that much program space. How long can it take? Great job for machines. ;) Rick C.
On 06/11/18 15:38, Tom Gardner wrote:
> On 06/11/18 14:01, Phil Hobbs wrote:
>> On 11/6/18 7:36 AM, Tom Gardner wrote:
>>> On 06/11/18 11:28, David Brown wrote:
>>>> On 05/11/18 19:44, Clive Arthur wrote:
>>>>> On 05/11/2018 18:01, Tom Gardner wrote:
<snipped>
>>> That complexity is a serious issue. If given a piece
>>> of code, most developers won't understand which
>>> combination of compiler flag must/mustn't be used.
>>
>> Code that works on some compiler settings and not others gives me the
>> heebie-jeebies. People often talk about "optimizer bugs" that really
>> aren't anything of the sort. Of course vaguely-defined language
>> features such as 'volatile' and old-timey thread support don't help.
>> (Things have been getting better on that front, I think.)
>
> Me too, but it is unclear to me that Things
> Are Getting Better. If they are it is /very/
> slow and will in many cases be constrained by
> having to use old variants of a language.
One thing that I can think of that is "getting better" is threading in C11 and C++11. I don't see it being particularly popular in C11 - people use other methods, and embedded developers are often still using older C standards. C++11 gives more useful threading features, which have been extended in later versions - they give more reason to use the language's threading functionality rather than external libraries.

The other new feature (again, from C++11 and C11) is atomic support. Atomics are nice, but I think most people who understand and use "atomic" probably already understood how to use "volatile" correctly.
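A small sketch of what C++11 brought (my example, not from the thread): a counter shared between two standard threads, with well-defined semantics and no compiler- or OS-specific extensions involved:

```cpp
#include <atomic>
#include <thread>

// Before C++11, sharing a counter between threads portably meant
// platform-specific libraries plus a lot of faith in the compiler.
// std::atomic gives semantics the optimiser is required to respect.
std::atomic<int> counter{0};

void bump(int times) {
    for (int i = 0; i < times; ++i)
        counter.fetch_add(1, std::memory_order_relaxed);
}

int run_two_threads() {
    std::thread a(bump, 10000);
    std::thread b(bump, 10000);
    a.join();
    b.join();
    return counter.load();
}
```

With a plain (or even volatile) int, the increments could be lost; the atomic version always ends at exactly 20000.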
>>> Now, given that the principal actors are the users, the
>>> tools and the user's organisation, what is an appropriate
>>> behaviour for each actor?
>>>
>>> Take the time and money to become an expert?
>>
>> I don't think you have to be a certified language lawyer to avoid most
>> of that stuff. Just stay paranoid and don't try to be too clever.
>
> That's my attitude, but I'm not sure it is still possible.
> It isn't just a "language lawyer", it is a "language+this
> compiler" lawyer.
>
> As one example, the C++ FQA (sic) has some tortuous examples.
> Yes, it is belligerent and amusing and a little outdated,
> but most points hit home :(
> http://yosefk.com/c++fqa/
No, a large proportion of his "points" are wrong - outdated, based on misunderstandings, or built on incredibly tortuous and pointless examples that would not occur in real code. Many other "points" are wildly exaggerated, duplicates, easily avoidable (such as by common compiler warnings or by common "best practice" habits), or equally applicable to most mainstream languages.

He certainly has a few /good/ points - no one is going to tell you that C++ is a perfect language. But they can be hard to find after wading through all the spittle and bile he spits out, thinly disguised as "humour". And where his points are now outdated, it is because they /were/ good points once (known to many - the FQA is not particularly informative), and C++ has progressed and evolved; it is a much improved language these days.
> As another, consider that the C++ language *designers* refused
> to admit that C++ templating language was Turing complete - until
> Erwin Unruh rubbed their noses in it by getting the compiler
> to emit prime numbers *during compilation*.
> https://en.wikibooks.org/wiki/C%2B%2B_Programming/Templates/Template_Meta-Programming#History_of_TMP
>
> Key phrase: "...it was *discovered* during the process
> of standardizing..."
>
> If the language designers don't understand their language,
> it is too complex - and is becoming part of the problem.
Not at all. It is entirely normal to invent or discover something, and then find new uses for it or new features that you did not think of earlier. Do you see it as a problem that the inventors of paper did not realise it could be used for paper aeroplanes or firing spitballs at the teacher's back?

The only thing surprising here is that the C++ language designers did not realise that templates were Turing complete early on. I think the idea of using templates for compile-time calculations simply hadn't occurred to them. And once the concept was established, it quickly became apparent that a more efficient method of doing this was needed, so constexpr functions were developed.
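To give the flavour of both (factorial rather than Unruh's primes, for brevity): the template version computes the value as a side effect of template instantiation, while the later constexpr function does the same compile-time evaluation with ordinary function syntax.

```cpp
// Template metaprogramming: the compiler computes the value while
// instantiating templates - the mechanism Unruh exploited to make
// the compiler emit prime numbers in its diagnostics.
template <unsigned N>
struct Factorial {
    static const unsigned value = N * Factorial<N - 1>::value;
};
template <>
struct Factorial<0> {
    static const unsigned value = 1;
};

// The constexpr function that later standards added for exactly this
// job: the same compile-time evaluation, far clearer code.
constexpr unsigned factorial(unsigned n) {
    return n == 0 ? 1 : n * factorial(n - 1);
}

static_assert(Factorial<5>::value == 120, "evaluated during compilation");
static_assert(factorial(5) == 120, "same result, ordinary syntax");
```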
>>> Choose simple(r) tools that do the job adequately?
>
> I must admit to wanting to go down this route. There are
> a few signs that this is becoming possible, but nothing
> conclusive for embedded programming.
>
> I hope C (and especially C++) end up like COBOL: still
> there, but the new generation has moved on to Better Things.
C is, by intention, a slow-moving language. It does not change quickly - there has to be a very good reason for adding new features, and an extraordinarily good reason for changing existing features.

C++, also by intention, moves faster. Its policy of avoiding breaking old code as much as possible makes it harder than in some languages to get rid of older and weaker features, but it gains new features that make the language itself more complex while making user programs simpler and clearer.
>>> Ship products even though there is, despite best intentions,
>>> improper understanding of the tools and code?
>>
>> That's bound to be the case at some level, except for simple programs.
>
> True, but...
>
>>> Refuse to use unnecessarily complex tools?
>>
>> Or the parts of them that you don't understand.
>
> Ah, but most people /think/ they understand all the
> parts - until someone demonstrates to them that they
> have been bitten and not noticed it.
> But then I started coding in binary long time ago,
> so for me asm is a high level language.
Me too... with front panel switches on an 8 bit mini computer in the early 70's....
> I started with PICs cracking TV smartcards that had those in it.
> So the dirty secrets I should know... Was still legal back then.
Same again, me too :)

I hate high level languages.. could never get the hang of them...
On 06/11/18 14:24, David Brown wrote:
> On 06/11/18 13:36, Tom Gardner wrote:
>> On 06/11/18 11:28, David Brown wrote:
>>> On 05/11/18 19:44, Clive Arthur wrote:
>>>> On 05/11/2018 18:01, Tom Gardner wrote:

<snipped>

>> - aren't experts that really do understand those things
>
> Yes.
>
> But usually, they can simple avoid the quirks of C. Do you know what
> happens if your signed integers overflow? The real answer is "it does
> not matter". If your calculations are overflowing, then bar a very few
> special cases, your code is broken. At best, a discussion of integer
> overflow behaviour is a discussion of what the pieces will look like
> when your program falls apart.
And there you touch on one of the irresolvable dilemmas that has bedeviled C/C++ since the early 90s, viz should they be:
- low-level, i.e. very close to the hardware and what it does
- high-level, i.e. closer to the abstract specification of the problem

Either is perfectly valid and useful, but there is a problem when there is insufficient distinction between the two.
> The people that get in trouble with undefined behaviour like this are -
> for the most part - the smart arses. It's the people who think "I'm not
> going to test ((x > -5) && (x < 5)) - it's more efficient to test
> ((x - 2147483644) > 0)".
That used not to be a problem, long ago. It is more of a problem since compilers have advanced to work around the language deficiencies, especially those related to aliasing, caches, and multiprocessors. And those points are only going to get worse now that Moore's "law" has run out of puff.
>> So, despite their best intentions, it is unlikely that
>> such users will fully comprehend the C/C++ tools limitations
>> nor their own limitations, let alone that of other people's
>> code and libraries.
>
> True. But it is rarely a problem - except for the smart-arses, who are
> always a problem regardless of the language or its behaviour.
>
> People often talk about C being an "unsafe" language. That is, in many
> ways, true - it is easy to get things wrong, and write code with bugs,
> leaks, security holes, etc. But these mostly have nothing to do with
> the quirks of C, or how compilers optimise. It is just thoughtless or
> careless coding, because C requires more manual effort than most
> languages. Buffer overflows, missing or duplicate "free" calls, etc.,
> are nothing to do with misunderstandings about the oddities of the
> language - it is simply not thinking carefully enough about the code,
> and the results are the same regardless of optimisation settings.
Ah yes, the "guns don't kill people" argument. Being right-pondian (and showered with glass when a local hero took a potshot at a Pittsburgh tram!) that has never impressed me.
>>>> I find it frustrating that you're using a tool where the maker won't
>>>> tell you what the little check boxes actually do.
>>>
>>> Compiler manuals often (but not always) have a lot of information about
>>> what the optimisations do. It is rarely possible to give complete
>>> details, however - it's just too complicated to fit in user
>>> documentation. gcc has summaries for all the different optimisation
>>> flags - there are /many/ of them - but it doesn't cover everything.
>>
>> That complexity is a serious issue. If given a piece
>> of code, most developers won't understand which
>> combination of compiler flag must/mustn't be used.
>
> For the most part, if the code requires a particular choice of flags to
> be used or omitted, the code is wrong. There are exceptions, of course
> - flags to pick a particular version of the C or C++ standards are
> important.
It is often required in order to achieve the touted performance advantages of C/C++. Without using the optimisation flags it isn't unreasonable to think of C/C++ compilers as being /pessimising/ compilers!
>> Now, given that the principal actors are the users, the
>> tools and the user's organisation, what is an appropriate
>> behaviour for each actor?
>>
>> Take the time and money to become an expert?
>>
>> Choose simple(r) tools that do the job adequately?
>>
>> Ship products even though there is, despite best intentions,
>> improper understanding of the tools and code?
>>
>> Refuse to use unnecessarily complex tools?
>
> Stick to what you know, and try to write code that makes sense. If you
> don't know the details of how C (and your particular compiler) treats
> overflows, or casts with different pointer types, or other "dark
> corners" of the language, then stay out of the dark corners.
>
> Ask people when you are stuck.
That presumes you *know* the dark corners. See your points above :(
> Test thoroughly.
Ah, the "inspect quality into a product" mentality :(
> Learn to use and love compiler warnings.
Always!
On 06/11/18 14:57, gnuarm.deletethisbit@gmail.com wrote:
> On Monday, November 5, 2018 at 9:17:00 AM UTC-5, bitrex wrote:
>> On 11/05/2018 03:23 AM, 698839253X6D445TD@nospam.org wrote:
>>> bitrex wrote
>>>> My time from concept to working prototype on breadboard is
>>>> astonishingly low, sometimes just tens of minutes.
>>>
>>> Just lighting a LED can be done with one resistor too.
>>
>> Yeah anything more complex would be a big job to code if I had to punch in
>> all the ones and zeros by hand, surely!
>
> That can be automated too. A random pattern would be 50% right to start.
> Use a measurement based analysis of the resulting operation and then randomly
> alter bits until it passes unit testing. Many parts don't have all that much
> program space. How long can it take? Great job for machines. ;)
That's far too close to the knuckle. Doubly so when you consider "machine learning" :(
On 06/11/18 16:16, Tom Gardner wrote:
> On 06/11/18 14:24, David Brown wrote: >> On 06/11/18 13:36, Tom Gardner wrote: >>> On 06/11/18 11:28, David Brown wrote: >>>> On 05/11/18 19:44, Clive Arthur wrote: >>>>> On 05/11/2018 18:01, Tom Gardner wrote: >>>>> >>>>> <snipped> >>>>> >>>>>>> On Monday, 5 November 2018 06:01:16 UTC-5, Clive Arthur wrote: >>>>>>>> I'd like to know exactly what the [dsPIC] optimisers do. >>>>>> >>>>>> The make more assumptions that the programmer understands >>>>>> what they've asked the computer and compiler to do. >>>>> >>>>> Yebbut, that's not exactly exact. Any examples? >>>> >>>> I am guessing he means that many compilers are more forgiving of >>>> programmers who don't understand C if they are not optimising. A >>>> classic example is that some people think that if you overflow a signed >>>> integer calculation, you get two's complement wrapping if the >>>> underlying >>>> processor supports it (they think that "x + y" in C translates to an >>>> "ADD" instruction in assembly). With the kind of simplistic >>>> translation-style compilation you often get with no optimisation >>>> enabled, that can often be correct. With optimisation enabled and >>>> better analysis of the source code, it is frequently not the case. >>> >>> Yes, that's the kind of thing, but of course there are >>> many other examples! >>> >>> >>> >>>> The C programming world is somewhat split into two kinds of >>>> programmers. >>>> One kind feel that the C language is defined by the standards and >>>> implementation-specific documentation, and this forms a contract >>>> between >>>> the programmer and the compiler - the programmer promises to follow the >>>> rules of the standards, and the compiler promises to give them the best >>>> possible code on the assumption that the rules are followed. They work >>>> with their compiler in order to get the most efficient results. 
>>>> >>>> The other kind of C programmer have a sort of "Humpty Dumpty" attitude >>>> that "programs mean exactly what I want them to mean" and that C is a >>>> "high level assembly". Those kinds of programmers get upset when >>>> compilers "outsmart" them, and are continually fighting their tools. >>> >>> There's some truth in that, but it is far too black and >>> white. I believe the vast majority of programmers can be >>> described as: >>> - want to do a good job >> >> There are, unfortunately, a good deal who simply want to get paid and >> couldn't care less about the quality of their work. >> >>> - have some humility, but not too much >> >> I don't have any humility. I think it is one of my few flaws. >> >>> - have /some/ knowledge of their limitations and of >>> the C/C++ tool's "quirks", and try to live within them >> >> Yes. >> >>> - aren't experts that really do understand those things >> >> Yes. >> >> But usually, they can simply avoid the quirks of C. Do you know what >> happens if your signed integers overflow? The real answer is "it does >> not matter". If your calculations are overflowing, then bar a very few >> special cases, your code is broken. At best, a discussion of integer >> overflow behaviour is a discussion of what the pieces will look like >> when your program falls apart. > > And there you touch on one of the irresolvable dilemmas that > has bedeviled C/C++ since the early 90s, viz should they be: > - low-level, i.e. very close to the hardware and what it does > - high-level, i.e. closer to the abstract specification of the > problem. > > Either is perfectly valid and useful, but there is a problem > when there is insufficient distinction between the two. >
This would all have been so much easier if people had learned what C is and what it is for, before learning how to use it. C has /never/ been a "high level assembler". It has /never/ been a "portable assembler". It has /never/ been an "alternative assembler". C was designed to cover a range of uses. It was designed to be useful as a portable application and systems programming language. It was also designed to be useful for low-level code, with implementation-specific features, extensions and details. It was intended to be suitable for a fair proportion of code that would otherwise need to be written in assembler - it was intended to replace the /need/ for assembler for a lot of code, not to /be/ assembler. C is a high level programming language - it is the lowest level high level programming language. But it is not a low level language - it is defined in terms of an abstract machine, not in terms of the underlying hardware. Once you understand that, the "dilemma" disappears. You write your code in C, using the features of the high-level abstract machine. You use - if you want - non-portable features for greater efficiency on a particular target. And you let the compiler worry about the details of the low-level efficiency. (But where it matters, you should learn to understand how your compiler works and what high-level code you need to get the results you want.)
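To illustrate the abstract-machine point with a sketch (an editor's example, not code from the thread; the closed-form substitution is a typical gcc/clang behaviour at -O2, not a guarantee):

```cpp
#include <cstdint>

// The C++ abstract machine defines the value this function returns,
// not the instructions used to compute it.  An optimising compiler
// is therefore free to drop the loop entirely and emit the closed
// form n * (n + 1) / 2 - gcc and clang at -O2 typically do exactly
// that.  Thinking "this loop becomes N ADD instructions" is the
// "high level assembler" mistake.
std::uint64_t sum_to(std::uint64_t n) {
    std::uint64_t total = 0;
    for (std::uint64_t i = 1; i <= n; ++i)
        total += i;
    return total;
}
```

Either code generation strategy is conforming, because only the returned value is observable behaviour.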
> >> The people that get in trouble with undefined behaviour like this are - >> for the most part - the smart arses. It's the people who think "I'm not >> going to test ((x > -5) && (x < 5)) - it's more efficient to test ((x - >> 2147483644) > 0)". > > That used not to be a problem, long ago. >
It was never a good idea - now or long ago. But it is true that some types of incorrect code worked with the compilers of long ago, and gave more efficient results than correct code. And those incorrect versions fail on modern tools, while the correct versions are more efficient than the incorrect versions ever were. This makes it hard to write code that is correct, works, and is efficient on both old and new tools. The answer, of course, is to write correct code regardless of the efficiency - "correct" is always more important than "fast".
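The pattern is easier to see in a minimal sketch (an illustration in the spirit of the example above, not the exact code quoted): the classic broken idiom tries to detect overflow by causing it.

```cpp
#include <climits>

// BROKEN: this "works" only if signed overflow wraps, but signed
// overflow is undefined behaviour in C and C++.  Since x + 1 < x can
// never be true in a well-defined program, gcc/clang at -O2 may
// legitimately compile this to "return false".
bool will_overflow_broken(int x) {
    return x + 1 < x;          // UB when x == INT_MAX
}

// CORRECT: compare against the limit *before* doing the arithmetic.
// Well defined on every implementation, and still a single compare.
bool will_overflow(int x) {
    return x == INT_MAX;
}
```

The correct version is no slower than the broken one ever was, on old compilers or new.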
> It is more of a problem since compilers have advanced to > work around the language deficiencies, especially those > related to aliasing, caches, and multiprocessors. >
What "language deficiencies" ?
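The aliasing rules are usually what is meant in this argument - a sketch of the standard illustration (my reading of the point, not an example from the thread):

```cpp
#include <cstdint>
#include <cstring>

// Under the strict aliasing rules an object may only be read through
// a compatible lvalue type, so the once-common type pun
//     return *(std::uint32_t*)&f;    // undefined behaviour
// lets an optimiser assume the float and the uint32_t never alias,
// with reordering surprises to match.  std::memcpy is the sanctioned
// way to inspect a float's object representation, and compilers
// typically lower it to a single register move anyway.
std::uint32_t float_bits(float f) {
    static_assert(sizeof(std::uint32_t) == sizeof(float),
                  "assumes a 32-bit float");
    std::uint32_t u;
    std::memcpy(&u, &f, sizeof u);
    return u;
}
```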
> And those points are only going to get worse now that > Moore's "law" has run out of puff. > > >>> So, despite their best intentions, it is unlikely that >>> such users will fully comprehend the C/C++ tools limitations >>> nor their own limitations, let alone that of other people's >>> code and libraries. >>> >> >> True. But it is rarely a problem - except for the smart-arses, who are >> always a problem regardless of the language or its behaviour. >> >> People often talk about C being an "unsafe" language. That is, in many >> ways, true - it is easy to get things wrong, and write code with bugs, >> leaks, security holes, etc. But these mostly have nothing to do with >> the quirks of C, or how compilers optimise. It is just thoughtless or >> careless coding, because C requires more manual effort than most >> languages. Buffer overflows, missing or duplicate "free" calls, etc., >> are nothing to do with misunderstandings about the oddities of the >> language - it is simply not thinking carefully enough about the code, >> and the results are the same regardless of optimisation settings. > > Ah yes, the "guns don't kill people" argument. Being right-pondian > (and showered with glass when a local hero took a potshot at a > Pittsburgh tram!) that has never impressed me. >
I too am right-pondian, and have never been a "guns don't kill people" fan. But in this analogy, C is a gun. You need to learn to use it safely - or you should not use it at all. (I have often said that the majority of C programmers would be better off using other languages - and the majority of programs written in C would be better if they were written in other languages.)
> >>>>> I find it frustrating that you're using a tool where the maker won't >>>>> tell you what the little check boxes actually do. >>>>> >>>> >>>> Compiler manuals often (but not always) have a lot of information about >>>> what the optimisations do. It is rarely possible to give complete >>>> details, however - it's just too complicated to fit in user >>>> documentation. gcc has summaries for all the different optimisation >>>> flags - there are /many/ of them - but it doesn't cover everything. >>> >>> That complexity is a serious issue. If given a piece >>> of code, most developers won't understand which >>> combination of compiler flag must/mustn't be used. >>> >> >> For the most part, if the code requires a particular choice of flags to >> be used or omitted, the code is wrong. There are exceptions, of course >> - flags to pick a particular version of the C or C++ standards are >> important. > > It is often required in order to achieve the touted > performance advantages of C/C++. Without using the > optimisation flags it isn't unreasonable to think of > C/C++ compilers as being /pessimising/ compilers! >
Sure, it is usually pointless using a C or C++ compiler without optimisation enabled. And trying to use one for development without warnings enabled is as smart as typing your code with your arms tied behind your back. But optimisations and warnings are not there for correctness - they are there for efficiency (and to help you get correct code). If the code requires particular flags to be /correct/, then usually you have a problem in the code. (Again, excluding obvious ones like the choice of standard, or the choice of target processor details.)
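One concrete case of "code that needs a flag to be correct" (a sketch; the fragile variant assumes gcc/clang's -fwrapv semantics):

```cpp
#include <climits>

// FRAGILE: correct only when built with gcc/clang's -fwrapv, which
// makes signed overflow wrap.  Without that flag it is undefined
// behaviour - exactly the "needs particular flags" smell above.
//     int wrap_add(int a, int b) { return a + b; }

// PORTABLE: do the arithmetic in unsigned, where wraparound is
// defined, then convert back.  The out-of-range conversion is
// modular in C++20, and gcc/clang document the same behaviour for
// earlier standards.
int wrap_add(int a, int b) {
    return (int)((unsigned)a + (unsigned)b);
}
```

The portable version compiles to the same single ADD instruction, with no special flags required.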
> > >>> Now, given that the principal actors are the users, the >>> tools and the user's organisation, what is an appropriate >>> behaviour for each actor? >>> >>> Take the time and money to become an expert? >>> >>> Choose simple(r) tools that do the job adequately? >>> >>> Ship products even though there is, despite best intentions, >>> improper understanding of the tools and code? >>> >>> Refuse to use unnecessarily complex tools? >> >> Stick to what you know, and try to write code that makes sense. If you >> don't know the details of how C (and your particular compiler) treats >> overflows, or casts with different pointer types, or other "dark >> corners" of the language, then stay out of the dark corners. >> >> Ask people when you are stuck. > > That presumes you *know* the dark corners. > See your points above :( >
Programming in C requires responsibility (as does any serious programming). With that, comes the requirement of a certain amount of insight into your own skills. Qualified people doing useful and productive work in programming should not be limited by amateurs bumbling about. You don't insist that carpenters work with rubber mallets because some muppet might hit his thumb - why do you think programmers should be hobbled by people who don't know what they are doing?
> >> Test thoroughly. > > Ah, the "inspect quality into a product" mentality :( >
No. Testing can help find flaws - it can't find a lack of flaws. But that does not mean you should skip it!
> >> Learn to use and love compiler warnings. > > Always!
Agreement at last :-)
On 11/6/18 9:38 AM, Tom Gardner wrote:
> On 06/11/18 14:01, Phil Hobbs wrote: >> On 11/6/18 7:36 AM, Tom Gardner wrote: >>> On 06/11/18 11:28, David Brown wrote: >>>> On 05/11/18 19:44, Clive Arthur wrote: >>>>> On 05/11/2018 18:01, Tom Gardner wrote: >>>>> >>>>> <snipped> >>>>> >>>>>>> On Monday, 5 November 2018 06:01:16 UTC-5, Clive Arthur >>>>>>> wrote: >>>>>>>> I'd like to know exactly what the [dsPIC] optimisers >>>>>>>> do. >>>>>> >>>>>> The make more assumptions that the programmer understands >>>>>> what they've asked the computer and compiler to do. >>>>> >>>>> Yebbut, that's not exactly exact. Any examples? >>>> >>>> I am guessing he means that many compilers are more forgiving >>>> of programmers who don't understand C if they are not >>>> optimising. A classic example is that some people think that >>>> if you overflow a signed integer calculation, you get two's >>>> complement wrapping if the underlying processor supports it >>>> (they think that "x + y" in C translates to an "ADD" >>>> instruction in assembly). With the kind of simplistic >>>> translation-style compilation you often get with no >>>> optimisation enabled, that can often be correct. With >>>> optimisation enabled and better analysis of the source code, it >>>> is frequently not the case. >>> >>> Yes, that's the kind of thing, but of course there are many other >>> examples! >>> >>> >>> >>>> The C programming world is somewhat split into two kinds of >>>> programmers. One kind feel that the C language is defined by >>>> the standards and implementation-specific documentation, and >>>> this forms a contract between the programmer and the compiler - >>>> the programmer promises to follow the rules of the standards, >>>> and the compiler promises to give them the best possible code >>>> on the assumption that the rules are followed. They work with >>>> their compiler in order to get the most efficient results. 
>>>> >>>> The other kind of C programmer have a sort of "Humpty Dumpty" >>>> attitude that "programs mean exactly what I want them to mean" >>>> and that C is a "high level assembly". Those kinds of >>>> programmers get upset when compilers "outsmart" them, and are >>>> continually fighting their tools. >>> >>> There's some truth in that, but it is far too black and white. I >>> believe the vast majority of programmers can be described as: - >>> want to do a good job - have some humility, but not too much - >>> have /some/ knowledge of their limitations and of the C/C++ >>> tool's "quirks", and try to live within them - aren't experts >>> that really do understand those things >>> >>> So, despite their best intentions, it is unlikely that such users >>> will fully comprehend the C/C++ tools limitations nor their own >>> limitations, let alone that of other people's code and >>> libraries. >>> >>> >>>>> I find it frustrating that you're using a tool where the >>>>> maker won't tell you what the little check boxes actually >>>>> do. >>>>> >>>> >>>> Compiler manuals often (but not always) have a lot of >>>> information about what the optimisations do. It is rarely >>>> possible to give complete details, however - it's just too >>>> complicated to fit in user documentation. gcc has summaries >>>> for all the different optimisation flags - there are /many/ of >>>> them - but it doesn't cover everything. >>> >>> That complexity is a serious issue. If given a piece of code, >>> most developers won't understand which combination of compiler >>> flag must/mustn't be used. >> >> Code that works on some compiler settings and not others gives me >> the heebie-jeebies. People often talk about "optimizer bugs" that >> really aren't anything of the sort. Of course vaguely-defined >> language features such as 'volatile' and old-timey thread support >> don't help. (Things have been getting better on that front, I >> think.) 
> > Me too, but it is unclear to me that Things Are Getting Better. If > they are it is /very/ slow and will in many cases be constrained by > having to use old variants of a language. >
Well, std::atomic is an example.
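For instance (a minimal sketch of the std::atomic point; the names are my own):

```cpp
#include <atomic>

// std::atomic (C++11) replaced the old 'volatile'-plus-folklore
// approach with a defined memory model.  fetch_add is a single
// atomic read-modify-write, so concurrent callers of bump() cannot
// lose increments - a guarantee no pre-C++11 standard could make.
std::atomic<int> counter{0};

void bump(int n) {
    for (int i = 0; i < n; ++i)
        counter.fetch_add(1, std::memory_order_relaxed);
}
```

With a plain `int` in place of `std::atomic<int>`, two threads calling bump() concurrently would be a data race, i.e. undefined behaviour.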
> >> >>> Now, given that the principal actors are the users, the tools and >>> the user's organisation, what is an appropriate behaviour for >>> each actor? >>> >>> Take the time and money to become an expert? >> >> I don't think you have to be a certified language lawyer to avoid >> most of that stuff. Just stay paranoid and don't try to be too >> clever. > > That's my attitude, but I'm not sure it is still possible. It isn't > just a "language lawyer", it is a "language+this compiler" lawyer. > > As one example, the C++ FQA (sic) has some tortuous examples. Yes, it > is belligerent and amusing and a little outdated, but most points hit > home :( http://yosefk.com/c++fqa/ >
> As another, consider that the C++ language *designers* refused to > admit that C++ templating language was Turing complete - until Erwin > Unruh rubbed their noses in it by getting the compiler to emit prime > numbers *during compilation*. > https://en.wikibooks.org/wiki/C%2B%2B_Programming/Templates/Template_Meta-Programming#History_of_TMP > > > Key phrase: "...it was *discovered* during the process of > standardizing..." > > If the language designers don't understand their language, it is too > complex - and is becoming part of the problem.
Nah. I understand some people like template metaprogramming, but C++ is now more a family of languages than a single language. Few of us use all of it, and most (including meself) don't use most of it.
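A tame cousin of Unruh's prime-number stunt, for the record (an editor's sketch of compile-time computation via template instantiation):

```cpp
// The template instantiation machinery is itself a Turing-complete
// functional language, which is what Unruh demonstrated.  Here the
// compiler computes 10! while compiling - the static_assert is
// checked before any code runs.
template <unsigned N>
struct Factorial {
    static const unsigned long value = N * Factorial<N - 1>::value;
};

template <>
struct Factorial<0> {
    static const unsigned long value = 1;
};

static_assert(Factorial<10>::value == 3628800UL,
              "evaluated entirely at compile time");
```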
> > >>> Choose simple(r) tools that do the job adequately? > > I must admit to wanting to go down this route. There are a few signs > that this is becoming possible, but nothing conclusive for embedded > programming. > > I hope C (and especially C++) end up like COBOL: still there, but the > new generation has moved on to Better Things.
> > >>> Ship products even though there is, despite best intentions, >>> improper understanding of the tools and code? >> >> That's bound to be the case at some level, except for simple >> programs. > > True, but... > > >>> Refuse to use unnecessarily complex tools? >> >> Or the parts of them that you don't understand. > > Ah, but most people /think/ they understand all the parts - until > someone demonstrates to them that they have been bitten and not > noticed it.
Not in C++. I don't know anybody who would claim to understand all of it. Even Meyers eventually bailed out. Cheers Phil Hobbs -- Dr Philip C D Hobbs Principal Consultant ElectroOptical Innovations LLC / Hobbs ElectroOptics Optics, Electro-optics, Photonics, Analog Electronics Briarcliff Manor NY 10510 http://electrooptical.net http://hobbs-eo.com