The FPGA market has been entrenched in a duopoly for a number of years now.  In 2008, according to Gartner Inc., Xilinx Inc. and Altera Corp. together held 87% of the programmable logic market (51.2% and 35.5% respectively).  The rest of the market was covered mostly by Actel Corp. and Lattice Semiconductor Corp., with about 6% each.

There have been attempts to challenge the comfortable equilibrium between the two FPGA giants Xilinx and Altera.  Indeed, the number of FPGA startups increased after the 2000 downturn.  Over the past 7-9 years, VCs have funded FPGA startups one after another.  Besides the attraction of a programmable logic market that has been growing a healthy 8-11%, compared to a stalled ASIC market, the availability of engineers and executives from the most prestigious firms (Xilinx, Altera, Intel, LSI, etc.) may have been a factor driving more VC money into FPGA startups.

However, most of these startups die after a few years.  The list includes Chameleon Systems, which died in 2002; the promising Velogix, formerly known as Flexlogics, created in 2002, which eventually ran out of funds; Ambric Inc., whose assets were acquired by Nethra in 2008; MathStar Inc., which stopped operating in 2008; and CSwitch, which closed its doors this summer.

Still, there are a number of active FPGA startups.  Among the most notable, one can cite Abound Logic, formerly known as M2000, which started back in 1996 and claims high-density FPGAs for high-end customers; Tabula, which had to go through a full reset, raised a considerable amount of money, and is expected to ship a product sometime later this year; eASIC, still going after a few misfires; Achronix, which promises a throughput of up to 1.5GHz; SiliconBlue, which aims at low-power applications; and certainly other companies, lesser known or still in stealth mode, like Tier Logic.

All these attempts have so far failed to jeopardize the Xilinx/Altera duopoly.  Why is that?  Any new venture needs to come with a significant differentiation if it wants to challenge the existing competitors.  Many FPGA startups came out with claims of higher densities and higher clock rates.  But regardless of how you look at FPGA architectures, this is not fundamentally difficult hardware to design, especially with the profusion of expert layout engineers in this field.  One cannot expect a revolution that would bring a 10x better density.  Startups’ claims of 2x or higher density eventually face the harsh reality that Xilinx and Altera just need to move to the next technology node to match them or substantially reduce the gap.  Moving to the next technology node is certainly more accessible to a giant like Xilinx or Altera than to a startup, for which a new mask can consume half of a second- or third-round financing.  The bottom line is that displacing a well-established vendor requires more than a 2x density improvement.  First, you need to make sure you have the software that can exploit that extra density to deliver better results.  Second, for most applications, density becomes secondary as long as the design fits on a board, and the ever-increasing size of the Xilinx and Altera devices makes capacity a hard sell, except possibly for a small fraction of the high-end customers.  Third, capacity is even more secondary when power consumption comes into play.

In a consumer electronics market increasingly dictated by portable and wireless products, power is a factor that matters more than raw performance.  In that regard, SiliconBlue looks the best positioned to distinguish itself from its peers, and it proposes a value that neither Xilinx nor Altera can match, at least for the moment.  They are the ones to watch.


15 Comments on Why FPGA startups keep failing

  1. I’m not sure that it makes sense to categorize MathStar and Ambric as FPGA companies, because their fabrics were quite different from the “sea of gates” architecture of Xilinx and Altera.

    I agree that introducing a new programmable architecture to directly compete with Xilinx or Altera is financial folly. Only a niche or focused approach will likely succeed, and then hopefully be acquired by Xilinx or Altera as a new product family to complement their mainstream offerings.

  2. Olivier Coudert says:

    You are correct, MathStar and Ambric are somewhat different from traditional FPGA companies. But they attempted to be an alternative to the classical FPGA approach.

    It will be interesting to see whether low power can be a niche for SiliconBlue. There is also some development on the low-power FPGA side that is getting interesting: RT @ocoudert Researchers present MRAM-based FPGA architecture http://bit.ly/4wMlrT

  3. Good coverage and assessment, Olivier. One would expect that at some point there will be some discontinuity à la ‘The Innovator’s Solution’, which will enable a startup to leapfrog one of the leaders. Perhaps power is the key dimension and not just density. At the same time, perhaps FPGAs are an endangered species due to a closing gap between full custom and programmable processors, as the ARM and Intel architectures and their software stacks continue to improve and reach newer process nodes more quickly. Also Nvidia’s Fermi, IBM’s Cell3, and TI’s DSPs all threaten the space. The FPGA market was robust when time-to-market of new systems was the key criterion for networking companies like Cisco. Nowadays power and cost are taking over as the dominant design criteria for systems, and you see a rise in SoC design in systems companies to address those requirements. At advanced nodes and increasing densities, the overhead associated with FPGA architectures will represent an increasingly large fraction of the cost.

  4. Christer Pyyhtia says:

    Hi Olivier! In the FPGA market, X & A have built or gained a food chain consisting of hundreds of distributors, dealers, design engineering companies, and garage shops, covering every possible company and application, and offering all levels of support at all possible price ranges. It is an eco-system into which you break in only with major differentiators. Today you have to be 2X larger/faster and 50% lower in power and cost than the NEXT generation of X and/or A. The subsequent geometry versions take 12-18 months, so you need to be two to three years ahead in technology implementation. The funds needed to make such a difference can get close to $100M, with major innovation by a top technology team. It is not only a question of a new, brilliant architecture; the environment for implementation must also exist: design creation & synthesis tools, a development environment, and support resources for customers to adopt the new technology.

    At the same time, even the largest FPGA suppliers face a major challenge in not just scaling but renewing their products to meet the technology requirements. A new company may find its edge in changing the rules of the game; not having to continue supporting legacy architectures, it may be able to achieve the 2X over / 50% under target two geometry generations ahead.

  5. I also would not categorize MathStar or Ambric as FPGA companies. However, I would add Stretch to Olivier’s list.

  6. FPGA_dude says:

    Disclaimer- I work for one of the big FPGA vendors.

    Here are my personal opinions:

    FPGA startups are a losing proposition. This is a niche market, well-protected by:
    1) Patents
    2) The fact that FPGA full-custom hardware is hard to create
    3) The fact that FPGA P&R software is very hard to implement efficiently
    4) The preferred treatment and early process access that the major FPGA manufacturers get

    Some things that might break the duopoly:
    1) One of X or A screwing up on execution (hence making it a monopoly)
    2) FPGA’s becoming obsolete (Don’t see that happening)

    But FPGA startups will do one of three things:
    1) Implode because of technical difficulties
    2) Be sued by either X or A for violating one of the many patents
    3) Be acquired by either X or A.

    My 2c

    Great blog BTW

  7. Olivier, it’s tempting to say history will repeat itself, and that X and A will retain their iron grip on the FPGA market (hard to change architectures, hard to change software etc.).
    But then again system demands change over time, and traditionally entrenched companies can’t maneuver quickly to exploit new dynamics (see Clayton Christensen).
    So Achronix may carve out a nice business replacing a chunk of the ASIC market; Silicon Blue and Actel may deliver the low-power goods and gain share.
    Time will tell. Intel was once big in the DRAM business, so things do change.
    (Disclaimer: I’ve done work for a number of FPGA companies so I’m far from an impartial observer).
    And I second FPGA Dude: I enjoy your blog very much. Keep up the good work.

  8. Bob Klein says:

    While the Xilinx and Altera duopoly is a reality and a huge barrier to entry for FPGA start-ups, I remain a firm believer in the power of innovation. “Copy cat” (e.g. SRAM LUT-based) architectures remain at the mercy of the X and A patents, but even some of THOSE basic patents are due to expire soon. The innovation — the differentiation — must happen outside of or tangential to the fundamental architecture.

    One trend that startups may exploit, and X and A’s product roadmaps clearly show this, is the incorporation of more (and more types of) dedicated functional blocks. The obvious first foray into this was on-chip dedicated memory blocks. Now we see a wide range of CPU, DSP, PHY, and a whole host of other “common” blocks that are much faster, smaller, lower power, faster to develop with, and more reliable than equivalent blocks created from 4-LUTs. Matching just the right mix of these “hard” blocks with the right amount of programmable fabric is a key to success. The “one size fits all” (or, better stated, the “one FABRIC fits all”) days of general purpose FPGAs aren’t over, but with more offerings targeting specific application spaces, the overall FPGA market will continue to fragment based on application and market focus. This fragmentation opens the door for innovation, and better programmable solutions, in each individual space.

    Tools (…and everyone who knows me knows I’ve been preaching this for YEARS!) are another area that is both a barrier to entry *and* an opportunity for innovation. MANY pretenders to the FPGA throne have failed, NOT because of a lack of value in their SILICON, but rather because they either a.) intro’d the silicon WITHOUT a robust toolset or b.) tried to CHARGE for tools (no one who’s even partially happy with their current FPGA solutions will PAY for tools for an unproven-in-the-marketplace device, period). The history is clear: from the early days of MMI/AMD giving away ABEL to Xilinx dropping 5 1/2 floppies with XACT 0.1 from helicopters, NO new programmable logic architecture has ever been successful without free and freely available tools. Many failed FPGA startups have also GROSSLY underestimated the engineering cost and effort required to produce quality map, place, and route tools and the supporting timing closure and optimization tools.

    In addition, synthesis, P&R, timing, and optimization tools and the algorithms behind them have been super-optimized over many, MANY years for the 4-LUT genre and for X and A architectures in particular. Companies that have invested heavily in Synopsys, Cadence, etc. tools and have spun their engineering staffs up on a specific FPGA vendor will be hard-pressed to abandon that investment for an unproven start-up.

    That said, the opportunities for innovation in the tools space are enormous. Celoxica set the stage many years ago with their “C-to-Gates” solution (and with 25 software engineers for every hardware bloke or bloke-ette out there, it’s a reasonable pitch!). Ambric was another company that had the right idea, but ran out of money. (As several other commenters have noted, Ambric’s device was NOT anything at all like a conventional FPGA; it was an array of 336 32-bit CPUs.) Ambric defined the programming model FIRST, then defined the silicon. They then used the open-source Eclipse IDE as the backbone for their tool. They initially tried to sell that tool (with some minor success) but understood that to really be successful, they needed to have a FREE version.

    Open-source and libraries of free functional blocks have changed the game. Much like the Java, Eclipse, Linux, and other open source juggernauts, this trend will continue and accelerate. An innovative silicon architecture and tool set that can exploit this will have a decided market advantage.

    So… what’s the Bob Klein vision of a programmable silicon start-up that can succeed in a market so dominated by two behemoths? First, any such contender will need to have a laser-focus on a specific niche or application space. Best-in-class hard functional blocks — or semi-hard functional blocks with some configurable parameters — tailored to the targeted application space will provide a foot-in-the-door in those spaces.

    Real-time or even ADAPTIVE reconfiguration is another huge opportunity. 1000 gates re-used 1000 times gives… well, you get the idea! The size and power advantages are obvious. The problems with parallel processing and tools for supporting multi-core devices are well known. Taking essentially sequential code and creating a silicon machine that DYNAMICALLY reconfigures to match the current set of processing requirements is one innovative solution. Finding the right balance (determining what functions demand hard blocks, which can benefit from executing in a massively parallel fashion, and which can be implemented in dynamic silicon) will be key.

    Any new configurable device should NOT require any exotic process, and while innovations in structure and fabric are important, care must be taken to not underestimate the power of incumbency. For example, in the programmable fabric, the SRAM 4-LUT remains king (even if just due to the huge investment in synth technology for that structure). In the case of processors, the compilers and 3rd-party tools for ARM, MIPS and the like are highly developed and ubiquitous. (IMHO, another mistake Ambric made was in “rolling their own” CPU and instruction set. Using an “ARM-like” or “MIPS-esque” CPU core would have made it much more approachable and marketable, even if they would have needed to reduce that “336 CPU” number!)

    An Ambric-like device, based on an ARM-lite core with free, easy-to-use tools, closely coupled with a real-time reconfigurable fabric and connected via a channel-based toroidal interconnect (an old patent of mine based on row/column “skipping”!) could rock the world!

  9. Hi Bob,

    Thanks for this insightful comment. Let me comment on a couple of points (I will get back to your other comments in another reply).

    You are right to point out that FPGAs have been moving away from the “one-size-fits-all” 4-LUT architecture toward structures with more complex, yet still basic, blocks (e.g., adders and multipliers, bitwise operators). There is certainly a chance for innovation here, but it is entirely dependent on synthesis software that can take advantage of such an architecture, as your comment notes regarding Ambric, which defined the programming model before defining the silicon.

    Since we’re talking FPGA software, it is clear that it is an essential component of an FPGA company’s success. It must be free (that is the model currently in use), and it must be of top-notch quality in terms of stability and QoR. Needless to say, I don’t think FPGA companies have scored very well in that field. Even the two big ones have shown questionable solutions in the past 10 years. Remember the disastrous rollout of Altera’s new synthesis release in 2000-2001? The tool was simply not ready. That cost them market share and a few big-name customers (Cisco, to name only one). Altera learned its lesson and is certainly in a better position now regarding the quality of its synthesis. Xilinx’s software situation today is not great: code quality is NOT good, stability and QoR are average or below average (don’t take my word for it, listen to customer feedback), and the software development team is bloated. There are new technologies being put in place, but they have yet to show they can put it all together and release a good synthesis solution.

    Real-time and/or adaptive reconfiguration is indeed a wide-open field. Here again, software is the bottleneck. Whoever delivers a solution in that space will come from the software (synthesis/compiler) world, not from the EE FPGA world.

  10. fred engineer says:

    There is hunger out there for new FPGA companies even if their technology is not as good as X’s or A’s. Many customers are willing to go to great lengths to avoid dealing with the X/A duopoly and its very high prices.

    But the real issue with these startups is that their technology DOES NOT WORK.

    The reason that most of these companies have not had great announcements for long periods of time is not because they are too happy with their success to announce it. It is not because their customers are too secretive and don’t want to disclose that they use alternative FPGAs.
    It is because their success is negligible or ZERO.

    They are companies offering BROKEN technologies. They soon find that so much is broken that they usually stay with the same process technology for too long, rendering their technology obsolete (on top of BROKEN).

    Why are these technologies broken? Well, the human factor is decisive. There is very little good talent left in the FPGA industry, and the few talented people left are often managed by mediocre management, to which they do not respond well. As a result, nothing works.

    And why are there so few talented managers and so little technical expertise? Simple: they are somewhere else, writing software, in the green business, bio-engineering, etc. Just about anywhere there is potential for growth.

    And why is everybody running away from the FPGA industry? Because even though the technology has gotten much bigger, the tools and methodology have not caught up with the growth. New methodologies have been proposed (such as hardware-level C-like languages) but none have solved the fundamental problem of allowing designers to code faster and more easily at a higher level of abstraction.
    The future of programmable logic does not lie in X or A. It will probably be in GPU-like technology offered by NVIDIA.

  11. Incisive comments. I like your conclusion, “The future of programmable logic does not lie in XLNX or ALTR. It will probably be on GPU like technology offered by NVIDIA”. If you are correct in the assumption that XLNX and ALTR are lagging in technology, you have quite an interesting point.

  12. Mike says:

    What do you think of programmable analog? Will the traditional FPGA players take interest? I look at the PSoC strategy of Cypress (now $0.5B for them!) and wonder what multicore processing, PSoC, and programmable analog will do to open up the hardware side.

  13. […] I will not go into detail about the causes of this quasi-duopoly; you can find an explanation here. Besides these two, there are also Lattice and Actel (together holding a market share of only […]

  14. Christopher Judge says:

    Programmable analog?
    Let me ask you this: do you think a B-grade 20-year-old college graduate could ever design high-level circuitry into an analog FPGA device?
    If the answer is no, then I am prepared to answer “forget it”.
