Electronics-Related.com
Forums

PSoC or FPGA?

Started by fasf March 20, 2011
On Tue, 22 Mar 2011 20:48:11 -0500, "krw@att.bizzzzzzzzzzzz"
<krw@att.bizzzzzzzzzzzz> wrote:

><snip>
>It's even more fun when you can get someone to pay you to do it. ;-)
><snip>
I'm getting paid to do stuff I'd do for free, or even pay others to allow me to do. If I got good enough to get paid for writing VHDL code and then added that to the list, I'd probably die from forgetting to eat out of the pure excess pleasure of it all. I'm far too lucky as it is. ;)

Jon
On a sunny day (Tue, 22 Mar 2011 18:24:48 -0500) it happened
"krw@att.bizzzzzzzzzzzz" <krw@att.bizzzzzzzzzzzz> wrote in
<7tbio69cbrq529nckd4bqtnrnjh7re4oia@4ax.com>:

>On Tue, 22 Mar 2011 10:14:37 GMT, Jan Panteltje <pNaonStpealmtje@yahoo.com>
>wrote:
>
>>On a sunny day (Mon, 21 Mar 2011 18:13:19 -0500) it happened
>>izzzzzzzzzzzzzzzzzzzzzzzz> wrote:
>>>Wow! You got that right. I don't WRITE CODE,
>>
>>Yea, that was clear already.
>>
>I'm hardware engineer. Software is for dweebs.
Then why do you use it so much with your simulators? :-)
On a sunny day (Tue, 22 Mar 2011 20:01:43 +0000 (UTC)) it happened Warren
<ve3wwg@gmail.com> wrote in
<Xns9EB0A30D9BE11WarrensBlatherings@81.169.183.62>:

>Jan Panteltje expounded in
>news:imanmt$clr$1@news.albasani.net:
>
>> On a sunny day (Tue, 22 Mar 2011 13:26:45 +0000 (UTC)) it
>> happened Warren <ve3wwg@gmail.com> wrote in
>> <Xns9EB0601772EF2WarrensBlatherings@188.40.43.213>:
>>
>>>If you work on large systems, then avoiding debuggers is a
>>>huge waste of time. When you have intermittent or
>>>unexplained failure(s) a debugger saves gobs of time since
>>>you don't have to anticipate what to "print".
>>>
>>>When your code core dumps, a debugger will trace back the
>>>precise point of the failure and allows you to look at
>>>anything related to it.
>>>
>>>All of this facility comes with a simple compile option. No
>>>need to code special macros.
>>>
>>>In short, you're following some bad and antiquated advice.
>>>
>>>Warren
>>
>> That is a matter of opinion.
>
>Yours is in the minority. :)
Better a sane minority in the mad house.
>> I have seen too many programmers staring too long at little
>> windows with register values... While just some sane coding
>> and understanding WHAT you were doing (in C), and print
>> statements in the right place, would have saved hours, and
>> prevented all that 'searching', and segfaults too.
>
>That speaks volumes about the programmers-- and nothing about
>the value of the debugger. Don't get me wrong- a few carefully
>crafted prints can augment a difficult debug.
>
>But to write off a debugger is like saying "I can saw it by
>hand, so I don't need no stinkin' table saw".
>
>Warren
Not sure that analogy is the right one; how about this: using a debugger is like spell checking by examining the font of each character. Not only does it not help (fonts are computer generated), it has no bearing on the spelling either. :-)
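For concreteness, here is a minimal sketch of the printf-style tracing being argued over; the TRACE name and the DEBUG guard are invented for illustration (##__VA_ARGS__ is a common GCC/Clang extension):

#include <stdio.h>

/* Compile with -DDEBUG to enable the traces; leave it off for a quiet build. */
#ifdef DEBUG
#define TRACE(fmt, ...) \
    fprintf(stderr, "%s:%d %s(): " fmt "\n", \
            __FILE__, __LINE__, __func__, ##__VA_ARGS__)
#else
#define TRACE(fmt, ...) ((void)0)
#endif

static int divide(int a, int b)
{
    TRACE("a=%d b=%d", a, b);  /* the "printf at the start of functions" */
    return a / b;              /* b == 0 is the kind of bug it would expose */
}

int main(void)
{
    printf("%d\n", divide(10, 2));
    return 0;
}

With DEBUG compiled out, the release build carries no tracing overhead at all.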
Jan Panteltje expounded in
news:imcaqq$58m$2@news.albasani.net: 

> On a sunny day (Tue, 22 Mar 2011 20:01:43 +0000 (UTC)) it
> happened Warren <ve3wwg@gmail.com> wrote in
> <Xns9EB0A30D9BE11WarrensBlatherings@81.169.183.62>:
>
>>Jan Panteltje expounded in
>>news:imanmt$clr$1@news.albasani.net:
..
>>>>In short, you're following some bad and antiquated
>>>>advice.
>>>>
>>>>Warren
>>>
>>> That is a matter of opinion.
>>
>>Yours is in the minority. :)
>
> Better a sane minority in the mad house.
That's the usual statement from inmates _in_ an insane asylum. They believe they are the only sane ones and that it's everyone else who is not.
>>But to write off a debugger is like saying "I can saw it by
>>hand, so I don't need no stinkin' table saw".
> Not sure that analogy is the right one; how about this:
> using a debugger is like spell checking by examining the
> font of each character. Not only does it not help (fonts
> are computer generated), it has no bearing on the spelling
> either. :-)
You're saying debuggers don't help, which is simply not the case. It either suggests that you don't know how to use them or you've avoided them for so long that you don't know what they're capable of.

A debugger saves an enormous amount of time. Who wouldn't want to leverage that? I'll bet your employer would. Some poor developers avoid learning gdb (for example) simply because they don't want to make the effort to learn it. It isn't difficult, but some folks are lazy or lousy at self-education. That's a problem with the developer-- not the tool. With the quality of the debuggers available today, there is no reason not to use them.

I, for one, would not pay a developer to avoid debuggers so that they can spend time cooking up personalized macros and manually coding printf's all over the place. That's a poor use of a developer's time. Macros, BTW, are completely useless in a language like Ada, which I've been using for AVR stuff. Ada shortcuts software development because it encourages the software _engineering_ that is often lacking in C and its derivatives. Ada is very fussy about what gets compiled, which saves debugging time (but I realize this is a language religious thing and that sometimes the language is dictated).

Finally, any code added for debugging purposes clutters the code, which is bad. A debugger makes most of that completely unnecessary. One compiler option is all that it costs.

But if you don't want to use a debugger, then that is your own choice. One can only lead the horse to water. But saying that debuggers are not useful is simply not a generally accepted view. There is a good reason for that!

Warren
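As a sketch of that one-option workflow (the file name and tiny program here are made up; the gdb commands are the standard ones):

/* crash.c -- a deliberate null-pointer bug, to show the core-dump workflow:
 *
 *   $ gcc -g -o crash crash.c     # -g is the single compile option
 *   $ gdb ./crash
 *   (gdb) run                     # stops at the segfault
 *   (gdb) bt                      # the backtrace names the faulting line
 *   (gdb) print p                 # inspect anything related to it
 *
 * gdb can also be attached to a program that is already running:
 *   $ gdb -p <pid>
 */
#include <stddef.h>

int main(void)
{
    int *p = NULL;
    return *p;    /* bt points straight at this line */
}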
"Warren" <ve3wwg@gmail.com> wrote in message 
news:Xns9EB165763DBEAWarrensBlatherings@81.169.183.62...
> I, for one, would not pay a developer
> to avoid debuggers so that they can spend time cooking up
> personalized macros and manually coding printf's all over
> the place. That's a poor use of a developer's time.
Yeah, these days they instead spend time cooking up macros and plug-ins for the debugger so that it can "sanely" display various objects or structures. :-)
> Ada is very fussy about
> what gets compiled, which saves debugging time (but I realize
> this is a language religious thing and that sometimes the
> language is dictated).
The other thing is that there really is a very broad range of programmer skill sets/proficiencies out there. My opinion is that, while all programmers make mistakes, the kinds of bugs that very strong typing and the other rigors Ada imposes tend to prevent are errors that a *certain subset* of programmers don't make anyway -- or at least make only on incredibly rare occasion -- and for *them* it just decreases their average productivity a bit.

Overall it is your programmer's productivity that counts, as measured by "how long did it take them to get this chunk of code to work in a bug-free manner?" -- regardless of what tools they choose (or are made) to use. My opinion is that software development is one of the few fields where the answer to this question varies far more than in many other fields; a 5:1 ratio is readily seen -- yet hiring the most productive programmers doesn't cost anywhere near 5x what hiring the least productive ones does.

My recollection is that Steve Wozniak wrote out all of the Apple I's BASIC in assembly language and then manually "assembled" it, still on paper, into op-codes. He did all of his debugging "on paper," and the first time he entered it into the actual CPU... it worked. Many people couldn't pull off that feat with Ada, C++, Python or any other language out there. :-)

On the other hand... Woz and one other guy were off trying to finish DOS 1.0 before some trade show (probably Comdex). They hopped onto a plane the day before, figuring they'd have it done in a couple of hours after they arrived at their hotel. Instead they were programming all through the night, finishing just a couple of hours before the show was set to open... at which point Woz figured he'd make "one last test" of the read/write sector routines before getting some shuteye... and inadvertently did so using the disc he was using to store DOS itself rather than their test disc, thereby wiping out all the progress they'd made that past night. Everyone's human? :-)

(The story then continues that he did go to sleep, woke up that afternoon, and fixed everything still that same day -- having remembered most of the details of what had been done -- so they only lost the one opening day of the trade show without a demonstrably working DOS. Excellent recovery, at least.)

---Joel
Joel Koltner expounded in
news:pfoip.771840$iV7.328171@en-nntp-15.dc1.easynews.com: 

> "Warren" <ve3wwg@gmail.com> wrote in message > news:Xns9EB165763DBEAWarrensBlatherings@81.169.183.62... >> I for one, would not pay a developer >> to avoid debuggers so that they can spend time cooking up >> personalized macros and manually coding printf's all over >> the place. That's a poor use of a developer's time. > > Yeah, these days they instead spend time cooking up macros > and plug-ins for the debugger so that it can "sanely" > display various objects or structures. >:-) > >> Ada is very fussy about >> what gets compiled which saves debugging time (but I >> realize this is a language religious thing and that >> sometimes the language is dictated). > > The other thing is that there really is a very broad range > of programmer skill sets/proficiencies out there:
No doubt.
> My opinion is that, while all programmers make mistakes, the
> kinds of bugs that very strong typing and the other rigors
> Ada imposes tend to prevent are errors that a *certain
> subset* of programmers don't make anyway -- or at least make
> only on incredibly rare occasion -- and for *them* it just
> decreases their average productivity a bit.
The comp.lang.ada forum would disagree with that on the whole, but that is another discussion I don't want to pursue here. They'll suggest that there is more to it than that (and I agree).
> Overall it is your programmer's productivity that counts,
> as measured by "how long did it take them to get this
> chunk of code to work in a bug-free manner?" -- regardless
> of what tools they choose (or are made) to use.
Agreed generally.
> My recollection is that Steve Wozniak wrote out all of the
> Apple I's BASIC in assembly language and then manually
> "assembled" it, still on paper, into op-codes.
I've also hand-assembled lots of code in the '70s. It's no big deal, just extra effort. When you have no choice, it can be done. Now it is rarely needed, given the tools available today.
> He did all of his debugging "on paper," and the first time
> he entered it into the actual CPU... it worked.
Computer time was considered more expensive than programmer time in those days (speaking generally). Things are considerably different today. It makes no sense to do a time-consuming desk check when testing can do the same job in an instant. Yes, there are exceptions to that, depending upon the nature of the project (like critical flight control systems).
> Many people couldn't pull off that feat with Ada, C++,
> Python or any other language out there. :-)
Ada is used wherever there are safety-critical and life-threatening situations. There is a reason for that. If you hang out on comp.lang.ada, you'll see that there are tools that build upon Ada and take the checking to an even more rigid extreme. But that is moving OT..

Warren
Jan Panteltje expounded in
news:imd33m$s4n$1@news.albasani.net: 

> On a sunny day (Wed, 23 Mar 2011 13:58:26 +0000 (UTC)) it
> happened Warren <ve3wwg@gmail.com> wrote in
> <Xns9EB165763DBEAWarrensBlatherings@81.169.183.62>:
>>You're saying debuggers don't help, which is simply not the
>>case.
>
> The only case where a debugger may help is if you want to
> know why a BINARY of somebody else's code crashes, when you
> have no source code.
Hogwash. It can be used in this situation, but it is not the "only" situation where it is useful.
> A few simple
> printf() statements will tell you all you want to know.
And waiting for a huge system to rebuild just to compile in that added printf statement, which might take you 30 minutes, is a terrible waste of time. For small projects you can tolerate all kinds of ill-advised development practices. But with a debugger, huge project or not, there are no recompiles, no relinking, no running of makefiles. You just invoke the program under the debugger or, heaven forbid, just attach to an already running program and take control.
> I > have been through piles of other man's source code, used > parts of it, C reads like a novel to me, no matter what > style it is written in.
Your skill level is academic; the code needs to read well for the others on your team. And if your code is being reviewed, as happens in safety-critical systems, that "added code" (if it stays in) now needs to be verified as not about to become a flight-critical error. Less is more in critical software.
>>It either suggests that you don't know how to use them
>>or you've avoided them for so long that you don't know what
>>they're capable of.
>
> I have to admit I have lost the capability of walking on my
> hands, actually never was good at it.
> To be honest I never tried very hard.
So you've never really tried to use it but are telling everyone to trust you that a debugger is worthless?
> All your insults
Insults? I'm trying to understand how someone can dis a very useful tool in the current environment. Debuggers have never been better, but you say they are useless based upon (IIRC) some article in the early '80s.
>>A debugger saves an enormous amount of time. Who wouldn't
>>want to leverage that? I'll bet your employer would.
>
> Not using it saves all the time spent with a debugger.
It's your choice, man.
> You are too lazy and too stupid and too stubborn to take
> good academic advice. Kids' stuff.
Bad, antiquated advice from decades ago hardly applies to the tools we have today. That might have been good advice for some big-iron environments of the time. But I even recall using CP/M and DOS debuggers that were still plenty useful on the microcomputer front. I leave you to your own fate.
>>With the quality of the debuggers available today, there is
>>no reason not to use them.
>
> Quality of debugger has nothing to do with it.
> If you want to shoot yourself in the head a good quality
> gun does not help save your life.
A gun is very useful in the hands of a hungry hunter. If you shoot yourself in the foot, then that says a lot about you as a hunter. It ain't the gun's fault.
> I rarely use macros, the printf() at the start of functions
> is just a few lines, if your typing skills are that bad
> that you cannot put those out in about 8 seconds each, then
> use your hours f*cking about with your [de]bugger.
You've left out the make process, which can be huge in a large system. But clearly you don't know about debuggers.
>>Macros, BTW, are completely useless in a language like Ada,
>>which I've been using for AVR stuff.
>
> Well that says it all, another victim of ADA.
No, it is Ada (not American Dental Association or some other acronym).
> It so happened I threw my [Ada] book in the garbage a few
> weeks ago, after not using it since 1989 or so.
> What the world has come to, what a bunch of crap.
..
> Maybe you work for the US DOD who once required ADA,
> but for real critical systems allowed 'other' languages
> (they had to), ADA would explain why they keep losing wars.
Actually, as I understand it, each project still gets reviewed before they relax the Ada requirement. I also understand that a lot of contracting firms still use Ada due to the cost savings in testing and maintenance. Others, however, are getting my billable hours by using other languages.
> C has no derivate[1], is close to the hardware, and
> requires you to know what you are doing. [Ada] forces you
> into some insane form, that actually does not even prevent
> you from making basic mistakes, else you would not need
> your bugger at all.
You've clearly never mastered Ada, and it has changed a lot since your '89 text. Ada is not your grandfather's Ada-83 that you remember: it underwent major revisions in '95 and 2005, and is now headed for a new revision of the standard in 2012. As a standard, Ada is one of the best "well defined" ones around.
> C++ is no language, it is a speech disability, a crime
> against humanity.
It's not my favourite, but like C, it has its place in the world.
>>Ada is very fussy
>
> The word is 'sucks'.
A clearly uninformed opinion.

Warren
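To put the typing argument in concrete terms (the unit types and names below are invented for illustration): in C, both quantities are just doubles, so the mix-up compiles silently; Ada's derived types reject the same thing at compile time.

#include <stdio.h>

int main(void)
{
    double metres = 100.0;
    double feet   = 42.0;
    double length = metres + feet;   /* nonsense units, but legal C */
    printf("length = %f\n", length);
    return 0;
}

/* The rough Ada equivalent,
 *
 *    type Metres is new Float;
 *    type Feet   is new Float;
 *
 * makes Metres + Feet a compile-time error -- exactly the kind of
 * fussiness being defended above.
 */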
Jan Panteltje <pNaonStpealmtje@yahoo.com> wrote:

>On a sunny day (Tue, 22 Mar 2011 20:01:43 +0000 (UTC)) it happened Warren
><ve3wwg@gmail.com> wrote in
><Xns9EB0A30D9BE11WarrensBlatherings@81.169.183.62>:
>
>>Jan Panteltje expounded in
>>news:imanmt$clr$1@news.albasani.net:
>>
>>> On a sunny day (Tue, 22 Mar 2011 13:26:45 +0000 (UTC)) it
>>> happened Warren <ve3wwg@gmail.com> wrote in
>>> <Xns9EB0601772EF2WarrensBlatherings@188.40.43.213>:
>>>
>>>>If you work on large systems, then avoiding debuggers is a
>>>>huge waste of time. When you have intermittent or
>>>>unexplained failure(s) a debugger saves gobs of time since
>>>>you don't have to anticipate what to "print".
>>>>
>>>>When your code core dumps, a debugger will trace back the
>>>>precise point of the failure and allows you to look at
>>>>anything related to it.
>>>>
>>>>All of this facility comes with a simple compile option. No
>>>>need to code special macros.
>>>>
>>>>In short, you're following some bad and antiquated advice.
>>>>
>>>>Warren
>>>
>>
>>That speaks volumes about the programmers-- and nothing about
>>the value of the debugger. Don't get me wrong- a few carefully
>>crafted prints can augment a difficult debug.
>>
>>But to write off a debugger is like saying "I can saw it by
>>hand, so I don't need no stinkin' table saw".
>>
>>Warren
>
>Not sure that analogy is the right one; how about this:
>using a debugger is like spell checking by examining the font
>of each character. Not only does it not help (fonts are
>computer generated), it has no bearing on the spelling either.
Sorry, but that is nonsense. A debugger is very useful for seeing whether the code is actually doing what it is supposed to do, not just verifying that the output is (accidentally) right. In case of an exception it will lead you right to the offending function, and if you're lucky you'll get a stack trace as well.

Output from print statements is useful when the program is running at a client site and the client is able to capture it in a log file. Print statements also help when debugging real-time processes, which cannot be single-stepped.

--
Failure does not prove something is impossible, failure simply
indicates you are not using the right tools...
nico@nctdevpuntnl (punt=.)
--------------------------------------------------------------
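A minimal sketch of the log-capture use mentioned above (the function name and log path are invented): the prints go to stderr, so a client can redirect them to a file with no debugger installed.

#include <stdio.h>
#include <time.h>

/* Timestamped line on stderr; a client captures it with
 *    ./prog 2>> /tmp/prog.log
 */
static void logmsg(const char *msg)
{
    char stamp[32];
    time_t now = time(NULL);
    strftime(stamp, sizeof stamp, "%Y-%m-%d %H:%M:%S", localtime(&now));
    fprintf(stderr, "[%s] %s\n", stamp, msg);
}

int main(void)
{
    logmsg("starting up");
    return 0;
}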
On 3/21/2011 3:25 PM, Nico Coesel wrote:
> John - KD5YI <sophi.2@invalid.org> wrote:
>
>> On 3/20/2011 4:10 AM, fasf wrote:
>>> Hi,
>>> i've worked with 8bit/32bit microcontrollers for 5 years and now i want
>>> to explore new fields, so i'm interested in these two solutions: PSoC
>>> and FPGA. I'm totally new (i know neither VHDL nor Verilog) and
>>> i don't understand the differences between these programmable devices.
>>> For hobby, what do you think is more useful to study? Can you suggest
>>> a low cost DevKit?
>>> Any getting started suggestion will be very appreciated
>>
>>
>> Well, I will put in a plug for Cypress' PSoC. I've been using them for
>> years. However, if you need blazing speed, you will need to go the FPGA
>> route as the PSoC is nothing more than a microcontroller that has some
>> analog goodies.
>>
>> I like the PSoC because it is extremely configurable/reconfigurable to
>> almost anything you need. An example that Cypress came out with was a
>>
>> I have no financial connection with Cypress. I've just been using their
>> PSoC chips since about 2002.
>
> Looks interesting. I checked the website a bit but couldn't find all I
> want to know: How is the analog performance regarding noise and
> bandwidth? Can you also use a PSoC as a reconfigurable analog brick
> (analog in - analog out)?
>
You'll have to read the specs or reference manual to get noise and bandwidth information. I don't remember the numbers but, in the PSoC1, they're not all that great.

I don't know what you mean by 'brick', but yes: the input pins are selectable, the output pins are selectable, and the type of analog (continuous analog or switched cap) is selectable. The switched-cap modules make the ADCs and DACs, filters, modulators, demodulators, etc. The continuous analog makes amplifiers, instrumentation amps, and comparators. Internally, you can route signals between blocks for various uses.

Bottom of the line (PSoC1) data sheet: http://www.cypress.com/?rID=3324
PSoC3 (8051 core): http://www.cypress.com/?rID=35178
PSoC5 (ARM core): http://www.cypress.com/?rID=37581

If you want more than the data sheets, the Technical Reference Manuals (TRM) are available. If I can help get more info for you, let me know.

Cheers,
John
On Wed, 23 Mar 2011 01:06:51 -0700, Jon Kirwan <jonk@infinitefactors.org>
wrote:

>On Tue, 22 Mar 2011 20:48:11 -0500, "krw@att.bizzzzzzzzzzzz"
><krw@att.bizzzzzzzzzzzz> wrote:
>
>><snip>
>>It's even more fun when you can get someone to pay you to do it. ;-)
>><snip>
>
>I'm getting paid to do stuff I'd do for free, or even pay
>others to allow me to do. If I got good enough to get paid
>for writing VHDL code and then added that to the list, I'd
>probably die from forgetting to eat out of the pure excess
>pleasure of it all. I'm far too lucky as it is. ;)
Well, there you go! Have fun!