Today’s cloud market is hard to size, and estimates vary widely by analyst. One report predicts that the global cloud computing market will grow from $37.8 billion in 2010 to $121.1 billion in 2015, with SaaS (Software as a Service) accounting for three quarters of that market. Regardless of the actual size, cloud computing commoditizes processing power, bringing economies of scale and flexibility.
I wrote, like many others, that cloud-based EDA solutions are inevitable: there is no magic algorithm that will reduce the ever-increasing complexity of designing and verifying a digital device (FPGA, ASIC, or SW/HW co-design). The only way to keep pace with the complexity is massive parallelism.
Some claim that EDA is not ready for cloud computing because it requires a lot of CPU power with very fast access to a massive amount of data. But that is only because EDA tools have not been designed to take advantage of very large clusters of machines connected by a relatively low-bandwidth network, each machine holding a fraction of the data. It will not be long before tools are re-architected for that purpose. For example, physical and logical verification are the most obvious candidates to benefit from partitioning techniques and to become SaaS offerings in the cloud.
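As a toy illustration of the idea, here is a minimal sketch in Python. The netlist format and the rule check are hypothetical; real verification tools partition a design along module or region boundaries, and would distribute the work across machines in a cluster rather than local workers:

```python
# Sketch: split a flat "netlist" into partitions, verify each partition
# independently, then merge the violation reports. Local threads stand in
# for what would be separate machines in a cloud deployment.
from concurrent.futures import ThreadPoolExecutor

def check_partition(cells):
    # Hypothetical rule check: report every cell whose output pin
    # is missing or undriven.
    return [c["name"] for c in cells if not c.get("out")]

def parallel_verify(netlist, n_workers=4):
    # Deal the cells into n_workers roughly equal partitions.
    chunks = [netlist[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        reports = ex.map(check_partition, chunks)
    # Merge the per-partition reports into one sorted violation list.
    return sorted(v for report in reports for v in report)

if __name__ == "__main__":
    # Hypothetical design: every tenth cell has an undriven output.
    netlist = [{"name": f"cell{i}", "out": i % 10 != 0} for i in range(100)]
    print(parallel_verify(netlist))
```

The key property is that each partition is checked with no knowledge of the others, so the work scales out; checks that span partition boundaries are what makes real-world partitioning hard.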
The other obstacle to EDA in the cloud is not specific to EDA: security is the most cited reason for the resistance of potential customers. Semiconductor companies and design houses are reluctant to let their sensitive data go into a cloud over which they feel they have no control.
These are the typical questions when security in the cloud is raised:
1. Who has privileged access to the data?
2. Which data encryption is used, and how are the keys managed?
3. Where is the data located?
4. Is the data segregated from other customers’ data?
5. Can the data be recovered in case of disaster?
The relevance of (1) and (2) is no different than when the data is managed internally. Topics (3) and (4) come up because customers feel safer with a precise hosting location, or by excluding certain locations (e.g., some foreign countries). However, fragmenting the data across different, non-predictable locations makes the whole dataset safer, because breaching one or even several data centers is not sufficient to rebuild a complete file. This also answers question (5): even assuming the complete loss of a few data centers, the whole dataset can be reconstructed thanks to fragmentation and the embedded redundancy hosted in the remaining data centers.
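To make the fragmentation argument concrete, here is a toy sketch using simple XOR parity. The helper names are mine, and production systems use proper erasure codes (such as Reed-Solomon) with many fragments and tolerance for multiple losses; this minimal version tolerates the loss of any one fragment:

```python
# Sketch: split data into n fragments plus one XOR parity fragment.
# Any single lost fragment can be rebuilt from the surviving ones,
# and no single fragment reveals the whole file.

def fragment(data: bytes, n: int):
    # Pad the data, split it into n equal fragments, and append
    # a parity fragment that is the byte-wise XOR of all the others.
    size = -(-len(data) // n)  # ceiling division
    padded = data.ljust(n * size, b"\0")
    frags = [padded[i * size:(i + 1) * size] for i in range(n)]
    parity = bytearray(size)
    for f in frags:
        for i, b in enumerate(f):
            parity[i] ^= b
    return frags + [bytes(parity)]

def reconstruct(frags, length):
    # A missing fragment is marked None; XOR-ing all survivors
    # (including parity) rebuilds it.
    missing = [i for i, f in enumerate(frags) if f is None]
    assert len(missing) <= 1, "XOR parity tolerates only one loss"
    if missing:
        size = len(next(f for f in frags if f is not None))
        rebuilt = bytearray(size)
        for f in frags:
            if f is not None:
                for i, b in enumerate(f):
                    rebuilt[i] ^= b
        frags[missing[0]] = bytes(rebuilt)
    # Drop the parity fragment and the padding.
    return b"".join(frags[:-1])[:length]

if __name__ == "__main__":
    data = b"GDSII layout stream"
    frags = fragment(data, 4)   # 4 data fragments + 1 parity
    frags[1] = None             # simulate losing one data center
    assert reconstruct(frags, len(data)) == data
```

An attacker who breaches one data center obtains only a fraction of the bytes, while the customer survives the total loss of that same data center.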
Customers will feel more confident if providers answer these questions clearly, and if independent security audits assess the quality of their services. Likewise, widely accepted security certifications would help the adoption of cloud services.
The reality is that cloud services, like other IT services, are targets for thieves and spies. There have been, and there will be, well-publicized security breaches in clouds (Gmail, Twitter), just as there are many told and untold intrusions in private networks. I think it is misleading to believe that hosting one’s data in one’s own facility is any safer than relying on a well-vetted cloud: most cloud providers will be better at security than their customers will ever be.