
Learning FPGA

Started by stephaneb 6 years ago • 19 replies • latest reply 1 year ago • 4316 views

A few months ago, this community tackled the first FPGA FAQ, titled When (and why) is it a good idea to use an FPGA in your embedded system design?  Your contributions to the discussion have been fabulous and the thread has been viewed almost 3000 times already.

FPGA skills seem to be in high demand lately.  It might only be a coincidence, but within only a few days I have been approached by several companies looking for help in their search for FPGA talent (here, here and here, for example).

So this new FAQ thread will be about getting some ideas and advice from you guys who've "been there done that" on how to proceed to demystify FPGAs and eventually add the skill to one's resume. 

Feel free to say a few words about:

  • What to expect in terms of difficulty level; how steep is the learning curve?
  • Your personal experience learning FPGAs: things you would do differently if you could start over, maybe books and/or online resources that were particularly useful to you?
  • Dev tools/boards, HDLs, etc. - where to start?
  • Project ideas for beginners
  • ...

And in the hope of encouraging the participation of as many FPGA experts as possible, $150 will be divided between the authors of the most appreciated inputs to this #FAQ Thread, based on the number of 'thumbs-ups' received.

Thanks in advance for any insight you might share with this FPGA community.

Reply by jmford94 February 2, 2023

Hi Stephane

Although I have been building embedded systems for 30 years, I have only recently started developing my own FPGA designs, so I think that I still have a fairly good perspective on getting started.  I worked for years designing (but not doing the code for!) DSP systems implemented in FPGAs.  That experience helped me with part of the learning curve.

As far as the learning curve, there are a couple of superimposed learning curves to deal with.  There's learning the tool flow that's needed to go from design input (Verilog or VHDL, most likely) to a bitstream downloaded to the FPGA in question.  As in most things, there is a wide variety of tools depending on the FPGA you choose, and the complexity of the tools scales directly with their power.

I recently came up to speed on Lattice's iCEcube2 software and iCE40 parts.  That was reasonably easy to install and get working: within a couple of days I was able to blink the LED on my development board, and within a few days I was completely comfortable programming and simulating and getting work done.  But, that said, before this I worked extensively with Xilinx's older ISE tools and the newer Vivado tools.  These are much more complicated, and they have a very steep and tall learning curve.  You can get things done by learning parts of the Vivado tools, but there is a lot more there that can help build designs more quickly and easily, and it's not easy to master.  Going from these tools to the iCEcube2 tools was fairly easy in comparison to learning them in the first place.

As far as where to start, I think it is extremely valuable to learn to program in either Verilog or VHDL with a toolset and a simulator before even getting a piece of hardware to program.  There are open-source Verilog compilers and simulators, and most vendors have a free version of some of their tools.  Learning the basics this way will help avoid running in circles when trying to debug a design on hardware.
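To make that concrete, here is roughly the smallest design worth simulating - a hypothetical LED blinker (module and signal names are made up for illustration, not taken from any vendor example). Free tools such as Icarus Verilog or Verilator will simulate it with a small testbench, no hardware needed:

```
// blink.v - divide the clock down until the LED toggles visibly
module blink #(parameter WIDTH = 24) (
    input  wire clk,
    input  wire rst,
    output wire led
);
    reg [WIDTH-1:0] count;

    // synchronous counter: adds 1 on every clock edge
    always @(posedge clk) begin
        if (rst)
            count <= {WIDTH{1'b0}};
        else
            count <= count + 1'b1;
    end

    // the MSB toggles at clk / 2^WIDTH - slow enough to see
    assign led = count[WIDTH-1];
endmodule
```

With Icarus Verilog, for example, "iverilog -o sim blink.v blink_tb.v" followed by "vvp sim" runs it against whatever testbench you write.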

As SpiderKenny points out, the mindset needed to successfully develop applications in an FPGA is different from the software mindset.  Software people generally (I am one!) think in terms of sequential execution, i.e. a for() loop or while() loop with all the steps needed inside it.  In contrast, the FPGA can do a vast number of things at the same time.  I have seen Verilog that was written by a software person, and it is obvious that the person was thinking in terms of a sequence of instructions.  In fact, the way to think about and structure your code is to break it down (as a good programmer would for normal software) into small pieces, each executing in its own always block, as in the sketch below.  It's kind of like writing multithreaded code for a processor with a huge number of independent cores, all slaved to a common clock.  I have a very good (and small!) book, "Verilog By Example: A Concise Introduction for FPGA Design" by Blaine C. Readler, that does a very good job of introducing the reader to these concepts.
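A minimal illustration of that mindset (hypothetical names): two always blocks are two independent "threads" of hardware, both active on every single clock edge:

```
module concurrent_demo (
    input  wire       clk,
    input  wire       rst,
    output reg  [7:0] fast_count,
    output reg  [3:0] slow_count
);
    // process 1: free-running counter
    always @(posedge clk)
        if (rst) fast_count <= 8'd0;
        else     fast_count <= fast_count + 8'd1;

    // process 2: counts the wrap-arounds of process 1.  Both blocks
    // "run" on every clock edge; there is no ordering between them.
    always @(posedge clk)
        if (rst)                      slow_count <= 4'd0;
        else if (fast_count == 8'hFF) slow_count <= slow_count + 4'd1;
endmodule
```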

As far as a project to get started, once the basics are mastered, it would make sense to buy a development board with some peripherals on it, like switches, LEDs, etc.  Most of these also have project suggestions for learning, like an Artix-7 board, or a Spartan board, or one of the Lattice boards.  I don't have any experience with Intel/Altera, but they make good stuff as well.

For me, the hardest part was getting the opportunity to embed an FPGA into a design and then make it work.  It was relatively easy to go from an idea to a working system once we decided what we wanted the system to do (an interface to a fast-sampling ADC with very precise timing, coupled to a microcontroller).


Hope this is helpful!

Reply by DNadler February 2, 2023

Hi Stephane - I'll try and address your questions in order...

  1. The learning curve is hugely dependent on past experience and training. If you are used to thinking about asynchronous events and have some hardware design experience already, you're way ahead; if not, this will be tough. Regardless, don't start trying to understand FPGAs until you have some understanding of digital design, and really understand why the following issues are important and the techniques for addressing them:
  • What's the difference between clocked and asynchronous logic?
  • Why is a clock different from a reset?
  • How do you synchronize across clock domains? (The canonical answer is sketched below.)
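That last question comes up so often that it is worth showing the canonical answer, the two-flop synchronizer. A minimal sketch for a single bit (names are illustrative; buses need a handshake or an asynchronous FIFO instead):

```
module sync2 (
    input  wire clk_dst,   // destination clock domain
    input  wire async_in,  // signal arriving from the source domain
    output wire sync_out
);
    reg meta, stable;

    always @(posedge clk_dst) begin
        meta   <= async_in;   // this flop may go metastable...
        stable <= meta;       // ...the second stage gives it time to settle
    end

    assign sync_out = stable;
endmodule
```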
My personal experience and development tools: On the advice of an experienced colleague, I started with VHDL. While more verbose and persnickety than Verilog, VHDL's strong typing keeps a beginner from making some mistakes that would compile happily in Verilog and not do what you mean. Which one you use in real work will depend on your location and industry. My colleague recommended starting with the "Blue Book":
HDL Chip Design by Douglas Smith. Though perhaps not as good or thorough, I also read:
VHDL Starter's Guide by Sudhakar Yalamanchili, and
The Designer's Guide to VHDL by Ashenden
I was already experienced with digital logic and asynchronous design for hardware and software, so this part wasn't so hard for me, but it took me a while to understand how to synchronize across clock domains in VHDL.

My first project was an LCD controller (character synthesis merged with graphics into a local SRAM, repeatedly clocked out to refresh the LCD; clock domains included the refresh cycle and CPU updates). I had already written the specifications but failed to hire someone to design it, and was goaded into doing it myself... I used the Xilinx WebPACK tools, which included a nice simulator. I was able to pretty quickly code, then simulate and test, the design; no hardware until I had a good idea what I was doing. Next I built a hardware lash-up proof-of-concept using an evaluation board and a forest of ribbon cables. Logged work time from cracking the books until working lash-up was 2 weeks. [Photo of the lash-up: CPU board lower right, FPGA eval board lower left, the other boards only providing power/bias to the LCD.]
Unfortunately, after laying out the board for this thing, it took 3 weeks to work through the tool bugs, including a wildly hot FPGA (a Xilinx synthesizer bug causing shoot-through in output ports) and an FPGA PROM programmer writing incorrect data into the FPGA configuration flash - both really painful to find and get fixed by the respective miscreant vendors. Then another few days to optimize down to fit within a 15k-gate FPGA.
Project ideas... I'm not sure an LCD controller is a good first project ;-) Regardless of what you start with, several vendors provide free tools for compilation and simulation. Code and simulate a complete set of stimulus/response tests using the simulator. No hardware until that's all working! Only then proceed to an eval board to prove it really works. Choose a vendor based on the feature set your project will need (find an experienced person to help you with the choice); it's hard to go too wrong learning with any of the major vendors' tools.

Hope that's helpful!
Best Regards, Dave

Reply by SpiderKenny February 2, 2023

One important thing to remember is that Verilog and VHDL are hardware description languages, not some kind of fancy software. One must try not to think in terms of procedural code, but learn how to think in hardware.

Reply by kaz February 2, 2023

I started FPGA design in 1998 and have stayed within that boundary ever since. I am sorry to say that I wish I never started. It is interesting and nice when you design some clean modules at home, but team work is absolute torture. Your design is at the mercy of input modules from sometimes-flaky designs, or you have to reuse some of their old modules and fix them for their authors while they sit next to you. The testing of your design at system level will flood you with false bugs. The focus of many young designers is tools, tools and scripting. They are swift at clicking but not at the actual design. Ask 10 engineers to design from the same requirements and you will be shocked by the differences in resources, speed, time involved, and the funny documentation. It is amazing that eventually it works, because they keep fingering through...

Reply by mr_bandit February 2, 2023

I would point out that this is universal && has nothing to do with the actual tech. Tools are a means to an end - at all levels.

Your point about design is spot-on. In my experience (and I am reformed because of my own pain), design is the most critical part of a project. If you spend 80% on design, the rest flows from it. But it takes strong management and bitter-experienced old farts to really enforce that concept.

Also - design should be a team effort whenever possible. Good engineering comes from conflict - and I specifically mean conflict about concepts and design methods - ideas. Not people or personalities - that is a toxicity that needs to be quickly eliminated if/when it happens. 

Orville and Wilbur would argue at the top of their lungs in each other's faces, then switch sides (marking the half-way point). That way they had to understand the problem better. It also kept them from getting stuck on a favorite half-baked idea.

My interest in SystemC is because I am an old C guy, and I have heard Good Things about it. However, one should understand VHDL or Verilog as well.

Reply by MixedSignal February 2, 2023

I think this is a great insight. For my real-time DSP work I have used both Analog Devices SHARC DSP chips and Intel/Altera Cyclone FPGAs. The move to FPGAs was motivated by the need for a much greater concurrent floating-point maths capability.

I love FPGAs and their ability to provide massively parallel processing and versatile functionality. Once past the initial learning curve, it can be a joy to build interface circuits, finite state machines, number-crunching sub-systems, etc.

But, for me, the biggest FPGA challenges come when faced with achieving timing closure for ns or sub-ns timing requirements. Some of the official and semi-official guidance is contradictory, and the design of even modest-speed interfaces can be impeded by poorly documented descriptions or the need for work-arounds.

I look back fondly on writing the interface and DSP code for the SHARC in C, where the only problems were simple algorithm errors which were easily debugged.

A lot of the FPGA design work does go without issue, but every now and then you come up against an area like an LVDS, edge-aligned, source-synchronous, double data rate (DDR) interface and suddenly there is an issue at every step: pin allocations, LVDS I/O buffers, timing constraints. If you like getting into a fine level of gate/register detail and you don't mind wading through the tool/documentation deficiencies, official work-arounds and conflicting advice, then you will end up with some very valuable experience. However, if you just want to get on and create functionality, then these unwanted diversions can be very frustrating - even more so in the context of a time-pressured commercial project.
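For readers who haven't met one: conceptually, DDR capture is just two flops, one per clock edge. The sketch below shows only the concept in plain Verilog - a real interface would use the vendor's dedicated input DDR primitives (e.g. Xilinx IDDR or Intel ALTDDIO) plus exactly the pin, buffer and timing constraints described above:

```
module ddr_capture (
    input  wire clk,     // received source-synchronous clock
    input  wire d,       // DDR data pin: a new bit on each clock edge
    output reg  q_rise,  // bit captured on the rising edge
    output reg  q_fall   // bit captured on the falling edge
);
    always @(posedge clk) q_rise <= d;
    always @(negedge clk) q_fall <= d;
endmodule
```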

The pleasure (relief) of getting an error-free compilation of a complex FPGA design with all I/Os properly constrained and timing closure met is huge, but it is likely to involve some significant pain along the way.

Reply by Solderdot February 2, 2023

As always: before starting hands-on work it's good to do a requirements analysis. Figure out what needs to be done and then select the tools. FPGAs are just one means; there may be others.

FPGAs are "just" a bunch of programmable logic gates and memories. OK, a big bunch. CPLDs may offer similar functionality.

A few years ago I started HW logic design in my diploma thesis and used CPLDs for implementing controller logic. I moved all the glue logic required to connect the digital components on my PCB into that CPLD as well. The institute I worked for had the design environment available, so there was no choice.

Later on, during my professional career, I did similar things with small FPGAs. Why not CPLDs? Well, at that point in time it was a requirement to use FPGAs, since they were already available in-house. Considering the low number of pieces, the higher cost of the FPGAs did not justify the purchase of a new design environment. Other aspects, such as PCB area (FPGAs need to store their program externally, and that external memory may add to PCB area and BOM) and boot time (booting an FPGA takes time; CPLDs are ready to run immediately after power-on), were not relevant for those applications but are likely to be in today's applications.

The CPLDs I programmed using a hardware description language (HDL) called "DSL", short for "Design Synthesis Language". Not sure if it is still used nowadays.

The FPGAs I programmed using a mixture of schematic entry and VHDL; it was mostly the finite state machines that I implemented in VHDL. You can do surprisingly complex designs using that approach, and it helped me, coming from the software engineering corner, to think hardware. Of course you may not want to implement a full-blown Viterbi decoder from discrete elements in a schematic; however, implementing that in VHDL (or another HDL) and creating a schematic object out of it may help in improving interfaces.

As a beginner's project you might think about implementing a digital clock which receives the time via antenna, displaying the time on a bunch of LEDs with variable brightness. For that you need to access a DCF77 receiver to obtain the time signal, transform that into an on/off status for each LED, and digest the input of an optical sensor to measure ambient brightness, resulting in the duty cycle with which the individual LEDs are driven, thus defining their brightness (a minimal PWM sketch follows below).
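The LED-brightness part of that project boils down to a small PWM block. A minimal Verilog sketch (names and widths are illustrative assumptions, not from the project above):

```
module led_pwm (
    input  wire       clk,
    input  wire [7:0] duty,  // 0 = off, 255 = almost fully on
    output reg        led
);
    reg [7:0] counter;

    always @(posedge clk) begin
        counter <= counter + 8'd1;      // free-running ramp
        led     <= (counter < duty);    // high for 'duty' out of 256 cycles
    end
endmodule
```

One such block (or one shared counter plus a comparator per LED) gives each LED its own brightness.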

Reply by rajkeerthy18 February 2, 2023

Alright, we are in the age of 16 nm FPGAs, thinking of 7 nm, 3 nm..., DDR5, network-on-chip, multi-GHz CPU cores, image processing, DSPs, IoT, cloud and so on. Employers want to keep head count down, productivity up, and cost per head down too, so demystifying FPGA design should help many people acquire such talent. How can a newbie FPGA engineer come up to speed so the employer can identify such talent and possibly arrange a quick send-off for the senior FPGA engineers? I have tried to list some aspects of talent development; all of them, or any one of them, would help a newbie develop a noticeable talent and be recognized. Talent development is a continuous process and the details are too huge to present in a single post. Fortunately, abundant information can be found online, including textbooks, free tools, and design articles by eminent designers on all these aspects, and a proactive engineer, by continuous effort, can master these skills.

Digital Design - 

Synchronous (asynchronous??), sequential, combinational; digital structures (gates, muxes, flops, memories, register files, FIFOs...); protocols; interfaces; reading (and understanding??) various specifications and choosing the right parts to work in the application that the employer is building.

Multi-clock designs, bus-based designs, IP integration - don't build it all, only what is really needed, and use what is readily available. What application - RISC/CISC superscalar, network processing, system interface, DSP...?

Don't you wish there was a push-button design flow?

Modelling Design - 

Verilog (VHDL maybe), SystemC, C-based design - and what about Python (??). How long does it take to truly understand the nuances of the modelling language and be productive while designing? Because once a choice is made, divorce may not be available. What is an efficient design, by the way? Adaptable to future spec changes?? Reusable?? Intellectual property - patent filed?? Why not schematic-based design? Behavioural/structural modelling - how much and when? Synthesis, code readability/synthesizability..., timing analysis and fixing.

Verification -

Does the design need verification for an FPGA? If so, what environment - Verilog/VHDL, C, C++, PLI (is there such a thing?), VMM, OVM, UVM, Perl/Python, assertions, formal verification, a reference model? How fast can the environment be up and running? The manager needed it last week but the team has not even started. What is an efficient environment, by the way? What is a reference model? What is efficient verification? When is verification done? Who knows all this and is willing to train others - and can they put in overtime without charging extra?
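For a sense of scale at the small end of that spectrum: a "verification environment" can start as one plain-Verilog self-checking testbench with a reference model. A minimal sketch (the DUT is assumed to be a 4-bit version of the hypothetical blink counter sketched earlier in this thread):

```
`timescale 1ns/1ps
module blink_tb;
    reg  clk = 0, rst = 1;
    wire led;
    reg  [3:0] ref_count = 0;    // trivial reference model
    integer errors = 0;

    blink #(.WIDTH(4)) dut (.clk(clk), .rst(rst), .led(led));

    always #5 clk = ~clk;        // 100 MHz clock

    // the reference model mirrors the expected behaviour
    always @(posedge clk)
        ref_count <= rst ? 4'd0 : ref_count + 4'd1;

    // compare DUT and model between clock edges
    always @(negedge clk)
        if (led !== ref_count[3]) begin
            $display("MISMATCH at %0t: led=%b, expected %b",
                     $time, led, ref_count[3]);
            errors = errors + 1;
        end

    initial begin
        repeat (4) @(posedge clk);
        rst = 0;
        repeat (64) @(posedge clk);
        if (errors == 0) $display("PASS");
        else             $display("FAIL: %0d mismatches", errors);
        $finish;
    end
endmodule
```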

FPGA technology -

SRAM vs. antifuse/flash, size, density, CLBs, ALMs, distributed/block RAM, MRAM..., Xilinx/Altera or Lattice/Actel, embedded FPGAs? How to choose the right device? I/O types and characteristics, transceiver I/Os, GPIO, specialty I/Os, interfacing to the fabric. System design with FPGAs. Coding for efficiency? Coding for speed/area? Coding for a particular technology? Code portability across FPGA technologies and ASIC?

Tools -

Use vendor tools or third-party tools? Xilinx/Altera, Synopsys/Cadence - which tool is good, and when? Scripts? How much is the budget? What is the project schedule?

Debug in Lab -

Debug tools and built-in structures in the FPGA. Debugging with the software/diagnostics team. How to test using actual field conditions? If a bug is found, how to corroborate it against RTL verification? Can actual field conditions be reflected back into the verification flow?

Domain and Application -

Hardware/software interface, data acquisition and processing, control plane/data plane, polled/interrupt-driven systems, fault-tolerant systems, self-healing systems, reliability - DO-254? Commercial/defence applications, networking/CPU/graphics/consumer/IoT/wireless/cloud... Employability across domains when there is a recession in one of them?

Conclusion -

FPGA glue logic was easy. Slowly, system logic went inside FPGAs. Simpler algorithms went inside. Over time, such things as packet switches, traffic aggregators and packet-processing systems have gone in. Now a whole system can be realised in an FPGA. Sufficient volume can beat cost. Power-save logic and newer architectures have brought down power consumption. An aspiring FPGA engineer could start with one FPGA vendor, Xilinx or Altera, and explore the whole process of FPGA design and the philosophy behind it. Once proficient in one technology, mastering another is not a big deal.

Reply by trmittal24 February 2, 2023

Hello Community,

I am an undergrad who has a fair bit of FPGA knowledge.
I would say, for a person who knows about digital electronics, getting started with FPGAs isn't very tough.
I started with VHDL, but if you are familiar with C coding, starting with Verilog is recommended. Verilog is also one of the most used HDLs in the industry. Any book on your chosen language should work, as most of them explain the syntax well.
Coming to tools, you can use Vivado HLS by Xilinx; it has a free version. It is a GUI tool and is very intuitive for beginners. It has good support on the internet too.

Once you are done deciding the HDL and tool, go ahead and build basic digital blocks in it.

  • Combinational Blocks
  • Sequential Blocks
  • FSM (Mealy and Moore) - see the sketch after this list
  • Learn to use IP blocks
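For the FSM bullet, here is a minimal Moore machine in Verilog - a hypothetical "101" sequence detector (names are illustrative). Moore means the output depends only on the state; a Mealy version would also look directly at din:

```
module seq101_moore (
    input  wire clk, rst, din,
    output wire seen            // pulses after input pattern 1-0-1
);
    localparam S0 = 2'd0, S1 = 2'd1, S10 = 2'd2, S101 = 2'd3;
    reg [1:0] state;

    always @(posedge clk) begin
        if (rst)
            state <= S0;
        else
            case (state)
                S0:   state <= din ? S1   : S0;
                S1:   state <= din ? S1   : S10;
                S10:  state <= din ? S101 : S0;
                S101: state <= din ? S1   : S10;  // overlap allowed
            endcase
    end

    // Moore output: a function of the state alone
    assign seen = (state == S101);
endmodule
```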

After this, try learning to optimize hardware - which coding styles work best, and so on.
Six months of dedicated practice should help you a lot. :)

Reply by Bob11 February 2, 2023

The learning curve for FPGA design is fairly steep simply because there are so many moving parts to an integrated whole design. FPGA design encompasses hardware, software and systems engineering in one small package, exercising all your engineering skills and then some.

Modern FPGAs are high-pin-count BGA devices with high-speed interfaces and significant power consumption. It helps to have knowledge of high-speed, high-density PCB layout practices, high-current power supply design (typically half a dozen rails, with core voltages such as 1V @ 10A), and thermal best practices. It also helps to be familiar with the different logic interface standards (LS-TTL, LVDS, CML, etc.), to know which is applicable for any given sub-circuit design, to know how to partition the I/O among voltage banks, and so on. Many modern FPGAs have high-speed serializers/de-serializers in the GHz range that, if used, require knowledge of RF techniques. Development boards can demonstrate some of the techniques, but eventually you've got to drop that monster IC onto your own PCB.

Then it's time to program the thing. Whether Verilog, VHDL, SystemC, or an open-source preprocessor, the end result is you still need to think parallel all the time. Verilog is just like C, inverted. You've also got to consider issues such as crossing clock domains (metastability), setup/hold times, resets, propagation delays, etc., apart from the actual design of the subsystem itself. It looks like programming, but it's really hardware design. And FPGAs today are powerful enough to implement CrazyGIG Ethernet, 4K/8K video, DisplayPort, USB-C, and the like. None of those technologies are trivial. Vendor libraries help, but you still have to interface to the things.

But not to worry, there's plenty of software engineering to do as well. The toolsets for these modern FPGAs are full-blown IDEs in their own right, although a real development process typically ends up scripted on the CLI. Software best practices in version control, testing, CI, etc. are recommended. And then you can move on to real coding, as many modern FPGAs implement either soft or hard-core processors. You can find yourself writing assembly code for a soft-core microprocessor implemented in the firmware, or a full-blown C++ application running on a Linux OS using a built-in GHz dual-core ARM.

The best way to get started in FPGAs is to have a good background in basic electronics design and some embedded software experience as well. Then get a dev board and a copy of the toolset from one of the FPGA vendors and start working with it, learning the tools and examining the schematics and board layout for best practices. Last but not least, skill in reading English and understanding datasheets is vital, as you will be doing a LOT of reading and careful parsing of datasheets. (Be particularly wary of footnote 32 on page 158 in Volume 7 of 10 of the HyperGIG subsystem reference, as that's the one that explains why pin AA32 isn't toggling like your design requires. You may have never seen it before, but be assured the FPGA tech support staff has.)


Reply by strubi February 2, 2023

Hi there,

let me throw in a bit of an echo...

One reason why HDL is so hard to work with is surely that there are so many different approaches, and there's nothing like a GCC for impatient SW engineers that just magically creates hardware, something you could point a beginner at. You still need to dig a fair bit into electronics to actually get something spinning on an FPGA.

On the other hand, HDL engineers are seldom convinced by modern software approaches. This always makes the team play a bit difficult.

For the pure beginner, VHDL in particular can be a burden, due to its pickiness and verbosity. Likewise, Verilog can be a pain when you actually want some language rigor.

The SystemC approach seems right from some high-level point of view, but again can introduce some overhead when actually getting down to the hardware.

Python was mentioned above - I had a bit of a paradigm change here when I discovered how well MyHDL performs. So yes, you can actually create well-tested and verified hardware using Python (via VHDL/Verilog translation to synthesis) in a much shorter time, being able to focus on the actual problem, plus a few benefits concerning simulation come for free.

If I were to teach this to college students (or even kids), I'd start out with Python, in fact. Python's language abilities using generators via the MyHDL extensions are way above the System* approaches; however, one could argue that it might run too slowly or be too hard to debug. That's where the dual approach of _still_ learning a low-level HDL makes sense, because at some point an FPGA engineer will have to deal with a specific technology at a low level.

Most turnaround and iteration cycles in development (and the resulting frustration) come from the lack of a good initial model that can be refined later down to the implementation details. It's a very good thing if you can avoid these before actually moving on to a specific hardware devkit. There are so many out there that recommendations are basically useless...

Reply by mr_bandit February 2, 2023

You are spot-on.

I must say you have given me the first real reason to learn Python - I am a long-term C guy and Python is a bit of a toy language (where is my flame-proof underwear?). Yes, they are both Turing complete...

The need to understand the VHDL or Verilog that gets generated from a high-level language (SystemC, (shudder) Python) is the same reason a good programmer, especially embedded, needs to understand the underlying assembly language.

and you are correct about having a good model to start with. I will sometimes create a throw-away program to learn the system/tools and verify the concept. Make sure the basics are correct.

We are at the point where we have many more transistors than we can possibly use. Just get an FPGA with 2x to 4x the number you think you need - the incremental cost is small. (Now - I once designed a system with an FPGA. I selected the largest that was gull-wing, because the next size up went to BGA. We had a couple of logic blocks left over...)

So, as far as SystemC "could introduce some overhead when actually getting to the hardware" - I don't care...

"On the other hand, HDL engineers are seldom convinced by modern software approaches. This makes the team play always a bit difficult."

Umm... I have seen a fair amount of code written by EEs. For the most part - terrible. No concept of functions. Clever bit-banging...

I do a lot of device drivers. I am convinced device-control design should be done by the SW/FW side, not by the EEs. FPGAs actually make that easy, because the FPGA is used to define the register set and register functions. I have designed/defined several devices that were easy to control...

Reply by strubi February 2, 2023

Hi mr_bandit,

I bet you'll learn Python in three days (my favorite slogan).

> I am a long-term C guy and Python is a bit of a toy language (where is my flame-proof underwear?)


Actually I wouldn't [f,b]lame you for missing out on this one; I'm a conservative C guy as well. I discovered Python somewhat after messing with a lot of TCL, when I needed a tight script wrapper for unit testing. That's where it starts losing the toy aspect, I think. By now, big industrial web apps run in Python and its ecosystem has gotten really big. Consider this: you can grab data from a camera, run the data through an HDL simulation, put the result in a database and display a pass/fail on a website, and all you see on the top layer is (best-case scenario!) a modular, non-obscure script. If the Python prototype becomes too slow, swap it out for a pure C module, or use various compiler speed-ups.

> Umm... I have seen a fair amount of code written by EE's. For the most part - terrible. No concept of functions. Clever bit-banging...

I had, on the other hand, seen plenty of IT dudes who were great at their Java abstraction level but wouldn't touch a scope or know what a 74xx does. Not helpful either. But it's up to the education folks to bridge that gap, and maybe get back to the conservative 80's didactics, where we learnt how things work on the inside. Right now the FPGA trends force us not to think in either 'HW' or 'SW'; we'll have to juggle both.


Speaking of which: Christopher Felton (who has a blog here) might be one of the Python/MyHDL pioneers on the educational side; you can find quite a bit of useful learning and example material from him on the web.


Reply by mr_bandit February 2, 2023

I have written (maybe) 50 lines of Python - sucked in data, munged it, printed it. It hides a lot of the cruft - it understands walking arrays.

So - I bet I could learn it in 3 days. Just need to find the motivation. Maybe the next time I need a test interpreter in an embedded system :^)

(When I was studying CS in the late 70's, we were handed the White Book as our intro to C. The subtext was: if you cannot learn it from that book, you are in the wrong place. We all got it && all assignments were thereafter in C. The weeding-out class was to write an assembler in C.)

So - having seen some Java from IT folks - I have to agree with you. I would say C is to Java what Basic is to Python - and all that applies.

I have been writing device drivers for 30+ years now - I play EE on TV, but I have been mentored by sharp EEs. The best two tools I own are my Estwing framing hammer and my Tek scope.

Thanks for the reference to Christopher Felton. And you are spot-on about needing to change thinking when going to FPGAs. The ability to do multiple things in parallel is a real shift in thinking. You can really do multiple/parallel processing/state machines.

Reply by engineer68 February 2, 2023

For me it is funny to see how many people are now starting with FPGAs, including music equipment manufacturers who used to deny this all the time.

I really wonder why this is. Was there a "fifty shades of electronics" movie in the cinema, and now everybody outside the "scene" wants to try out extraordinary things?


I started to use PLDs in 1992, and for the last 15 years FPGAs have been 90% of my daily work. Also in music, FPGAs are an "old hat", as we say in Germany. You may know my 96khz.org website describing some of my FPGA work related to audio.

As mentioned, the most surprising thing to me is that the music industry now heavily pushes FPGAs into their systems, while 10 years ago there were many more reasons to do this than today, because nowadays' DSPs and CPUs can do things which were not possible in earlier times. Apart from their advantage in processing power per dollar, programming and debugging are easier, and some of them can even handle S/PDIF decoding, DSD output and similar things, making FPGAs obsolete for this.

Regarding development, I do not think it is a big point to stress that programming FPGAs requires different ways of thinking than C programming, because one can easily carry over strategies from object-oriented coding and software design. So it should not be a problem for C++ guys to move over and define parallel modules operating effectively. Describing them with VHDL is a piece of cake, since it is an easy language. That is not the hard part at all. The problem is different:

Using FPGAs effectively requires the ability to define tricky hardware structures and processing pipelines that exceed a DSP's capabilities. The solutions are different, and from all my experience with designers' education when teaching people FPGA solutions, I have learned that software guys cannot do this. Not because software misleads them, but because they have never used digital circuitry and learned the principles of intelligent signal treatment and storage, and what to define to get a physical solution.

A lot of digital solutions are very specific, especially when it comes to performance and cost, or when the design has to go into an ASIC later, whereby most of the standard solutions of digital design cannot be automatically generated or selected from just a UML or C++ description. So the approaches of Catapult, HLS, C++ transformation and all the ideas for producing VHDL from a C++ description, like HDL Coder, are very limited.

During the last few years I have worked on more than 10 projects where my customer used rapid prototyping systems and automatic VHDL/Verilog generation from various providers, mostly controlled with either an internal IDE, block-symbol-based design, and/or MATLAB Simulink. None of them created effective systems.

But this is the way many people intend to produce FPGA designs today.


Reply by ahmedshahein February 2, 2023

Hello everyone,

I would like to add this topic to the list: "FPGA offloading using OpenCL".

It is a very demanding and challenging topic. By offloading, I mean offloading processor tasks to an FPGA. This is often carried out using OpenCL. It is commonly done for computationally intensive tasks and, recently, even for floating-point tasks as well. Bitcoin mining is one of many examples.

Good luck.

Reply by mr_bandit February 2, 2023

I would like to see a discussion on SystemC - tools, techniques, barriers to entry, tricks of the trade, etc.

Reply by wolf22 February 2, 2023

We are here on DSP-related - and FPGAs are quite different.

I do not think that the learning curve is steep, but it heads in directions which seem rather strange from the viewpoint of a DSP programmer. And there are a lot of pitfalls that every computer or DSP programmer will certainly fall into.

I myself have used CPLDs for nearly 25 years, because they drastically reduce the number of chips on the board, they work instantly after power-up without needing to be booted, and they are rather fast. For example, with a CoolRunner from Xilinx I can do digital processing up to a 650 MHz clock - if the logic is laid out properly.

The use of FPGAs is different. They need to be booted after power-up, and they are not intended to replace simple NANDs, NORs, flip-flops and so on. Instead they are intended to contain functions like special processors and DSPs, but in a completely different manner: not sequential, but parallel.

With the usual languages, VHDL and Verilog, I am not really happy, because to become good in these languages you need daily exercise, so to speak. People like me, who have business in other disciplines and program FPGAs only as a smaller part of their work or even as a hobby, have difficulty popping in and out mentally.

My trick to overcome this re-adjusting of thinking is to do the topmost level with schematics (they are usually horrible...), there defining the inputs and outputs of the general blocks, and to continue inside the blocks with Verilog. This language seems to me much better suited to such "topic-hoppers" as me than VHDL. I know that this seems weird, because on the PC I usually use Delphi/Lazarus, which is Pascal - and VHDL's style is nearer to Pascal than to C (which I use only for microcontrollers). But this seems to me the easiest way to start programming FPGAs.

Caveat: when discussing DSP-related usage of FPGAs, we need to take a very close look at the built-in items of the FPGA we want to use. Some chips already have hard-wired components like accumulators and MAC units for filters - but other chips do not. So the same source will "compile" (= fit) into one FPGA but not into another, even when the available number of LUTs is comparable. (See the sketch below.)
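To illustrate the point, here is a minimal multiply-accumulate sketch in Verilog (parameter and signal names are illustrative). On a part with hard DSP/MAC blocks, the synthesizer will usually map the * and + onto one of them; on a part without, the same code must be built from LUTs and carry chains - bigger, slower, and possibly not fitting at all:

```
module mac #(parameter W = 16) (
    input  wire                  clk,
    input  wire                  clr,   // clear the accumulator
    input  wire signed [W-1:0]   a, b,
    output reg  signed [2*W+7:0] acc    // product width plus growth bits
);
    always @(posedge clk) begin
        if (clr)
            acc <= 0;
        else
            acc <= acc + a * b;   // the classic FIR/filter inner loop
    end
endmodule
```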

So my conclusion is that using FPGAs is simply necessary when the speed of DSPs is not sufficient, but to use FPGAs you need to learn their hardware. This is a very different topic compared to the ones usually discussed in this forum. It is much nearer to hardware design.