In today’s semiconductor industry, we’re witnessing a dramatic surge in data growth fueled by advanced analytics, machine learning, and increasingly intricate system designs. According to McKinsey, advanced analytics can help streamline R&D processes, reducing time to market for integrated circuit (IC) projects by up to 10%. However, harnessing the full potential of advanced analytics requires extracting meaningful insights from the vast amounts of data generated. In 2023, Intel alone managed 54 data center modules, housing over 130,000 servers that underpin the computing needs of its internal design and manufacturing operations.
For small and medium-sized design organizations, an efficient data management solution is indispensable for storing, processing, securing, and transforming data into actionable insights that can speed up the innovation cycle, bringing new designs to market faster.
Chip design meets three LHC-level challenges
Today’s chip design landscape is facing challenges reminiscent of those encountered by the Large Hadron Collider (LHC), the world’s most powerful particle accelerator. These challenges include three main areas: data volume, version control, and global collaboration.
Figure 1. The Large Hadron Collider (LHC) is the world’s largest particle accelerator
Challenge #1: Managing exponential data growth
Just as the LHC generates around one petabyte of collision data every second, today’s IC designers find themselves drowning in massive volumes of design data. The challenge extends beyond storing this vast amount of data; it involves making it accessible and interpretable across every stage of the engineering lifecycle.
Due to the proliferation of EDA tools, integrating data across design phases and turning it into actionable insights has become increasingly complex. As design projects expand, the size of layouts, schematics, and other binary data grows, making today’s network storage infrastructure, whether on-premises or in the cloud, a costly affair. This necessitates a strategic approach to balancing cost with performance while ensuring that data remains accessible and traceable from design to verification.
Challenge #2: Streamlining version control
The complexity of version control in chip design mirrors the challenge of managing multiple iterations of experiments in the LHC project.
For IC designers, the ability to compare versions of design files and automatically identify changes is crucial for tracking, verifying, and debugging. A comprehensive version control mechanism is an integral part of the design data management (DDM) platform, which goes beyond logging changes to managing detailed metadata for facilitating verification. From a business perspective, version control is essential for enhancing data security, controlling access, and ensuring effective backup and disaster recovery protocols are in place.
Challenge #3: Fostering global collaboration
The LHC project’s success relied upon the collaboration of hundreds of research institutions worldwide toward a common goal. Similarly, today’s large-scale IC design demands such collaboration at the global level. Design teams, often spread across multiple locations, work in real time, transferring terabytes of data and making changes from IP blocks to system-level architectures.
Figure 2. A simplified version of the design team structure
For every modification, design constraints, including layout, power, and timing, need to be re-verified by different teams.
The chip design process involves a diverse group of professionals, from circuit designers and verification engineers to architects and project managers, each focusing on different stages of the design process and utilizing various tools. The challenge is to create a cohesive digital thread throughout the engineering lifecycle, ensuring efficient collaboration among these diverse groups and tools.
Three Causes of Storage Overloading Issues
The issue of network storage overloading stems from several critical causes, each exacerbating the pressures of managing increasingly complex designs.
#1 Traditional decentralized design data management
The traditional decentralized design data management (DDM) approach clones the project data repository for each user. While this may give the feeling of complete data control, it leads to a redundant and massive increase in storage requirements. In IC design environments, where shared files and large binary files are prevalent, this approach is particularly inefficient.
Figure 3. A traditional distributed data management approach is inefficient
A typical design engineer’s work area consists of a large number of binary files generated by different EDA tools. Some files are generated only a few times throughout the development lifecycle, but are frequently used by almost all team members on the project. In addition, there are usually third-party libraries and process design kits (PDKs) that the entire team relies on for practically every simulation or verification run. As a result, the design team requires a lot of high-performance storage.
IC design demands quick and reliable access to data to maintain productivity and project momentum. The inefficiencies of decentralized data management can leave teams without sufficient high-performance storage, directly slowing time-to-market.
#2 Inconsistent data organization
Individual team members may organize their data in a way that makes sense for their own role and workflow. However, individual file name formats, directories, libraries, and documentation conventions do not add up to an organized system for the entire team.
The absence of a unified organization strategy means other team members spend excessive time searching for specific data. And in the worst-case scenario, they are unable to access the necessary information, thereby causing delays in time-to-market. In the competitive landscape of IC design, where time-to-market is a key determinant of success, such delays can have significant business implications.
#3 Increasing design sizes
The traditional approach to data management, characterized by the duplication of project data for each user, presents significant scalability challenges for today’s increasingly complex and large-scale design projects.
For instance, a project starting with 1 gigabyte of data, when duplicated across multiple user-specific work areas, can grow to require tens or even hundreds of gigabytes of storage, depending on the size of the team. This growth compounds, as each new team member adds both a full copy and new changes that every other copy must absorb. Moreover, with the repository replicated across multiple work areas, ensuring that all team members have access to the most up-to-date information becomes a daunting challenge.
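The arithmetic above can be sketched as a back-of-the-envelope model. The function name and the numbers below are illustrative assumptions, not measurements of any particular DDM tool:

```python
def cloned_storage_gb(repo_gb: float, team_size: int,
                      growth_per_member_gb: float = 0.0) -> float:
    """Total storage when every team member clones the full repository.

    Each workspace holds a complete copy of the project, so any growth
    in the repository is multiplied by the number of workspaces.
    """
    repo_after_growth = repo_gb + team_size * growth_per_member_gb
    return repo_after_growth * team_size

# A 1 GB project cloned by a 50-person team needs 50 GB before anyone
# adds data; if each member contributes 0.1 GB, the footprint hits 300 GB.
print(cloned_storage_gb(1.0, 50))       # 50.0
print(cloned_storage_gb(1.0, 50, 0.1))  # 300.0
```

The multiplier is the team size, which is why duplication-based storage costs escalate so quickly on large projects.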
Three Proven Strategies for Optimizing Network Storage
Optimizing network storage starts with eliminating unnecessary duplication of design data.
Figure 4. Effective design data management enables users to manage their own data, while collaborating efficiently
Here are three approaches Keysight’s Design Data Management Platform (SOS) takes to significantly reduce storage inefficiency, enhance collaboration, and ensure data integrity across the engineering lifecycle.
#1 Creating symbolic links to cache
Keysight’s SOS populates each workspace with symbolic links to the actual copies stored in the main repository, addressing the longstanding storage inefficiency and data consistency issues of traditional data management. The symbolic links serve as references, or shortcuts, to actual files in a central repository, meaning the system doesn’t duplicate data for every user’s workspace.
For example, when an RTL design engineer needs to work on specific views or files, they check the files out, and the design data management (DDM) tool automatically replaces the symbolic links in their workspace with the actual files from the main repository. After making the necessary modifications, the engineer checks the files back in. The DDM tool then updates the main repository and reverts the files in the workspace to symbolic links, maintaining a lean workspace environment.
Figure 5. SOS creates links to cache to minimize network storage requirements
This method allows projects to scale more efficiently, as adding users or data does not increase storage needs proportionally.
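The check-out/check-in cycle described above can be illustrated with a minimal sketch. This is not Keysight’s implementation; the function names and workflow are hypothetical, showing only the general links-to-cache idea:

```python
import shutil
from pathlib import Path

def populate(workspace: Path, repo: Path) -> None:
    """Fill a workspace with symbolic links to files in the central repository."""
    workspace.mkdir(parents=True, exist_ok=True)
    for src in repo.iterdir():
        (workspace / src.name).symlink_to(src)

def check_out(workspace: Path, repo: Path, name: str) -> None:
    """Swap the symlink for a private, writable copy so the file can be edited."""
    link = workspace / name
    link.unlink()                    # remove the link ...
    shutil.copy2(repo / name, link)  # ... and put a real copy in its place

def check_in(workspace: Path, repo: Path, name: str) -> None:
    """Push the edited copy back to the repository and restore the lean symlink."""
    shutil.copy2(workspace / name, repo / name)
    (workspace / name).unlink()
    (workspace / name).symlink_to(repo / name)
```

Only checked-out files ever occupy real space in a workspace; everything else remains a link, which is why storage no longer scales with the number of users.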
Try Keysight Design Data Management (SOS)
#2 Enabling effective IP reuse
Keysight’s DDM platform (SOS) simplifies how teams manage and reuse proven IP blocks in their projects. Rather than duplicating these IP blocks, it creates references to them. This strategy not only conserves considerable storage but also improves IP traceability. In critical sectors such as automotive and aerospace, where product design must meet strict regulatory standards, the ability to track IPs is indispensable.
Figure 6. IP reuse across multiple sites
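The reference-based reuse described above can be pictured as a project manifest that records pointers to tagged IP versions instead of copies. The manifest structure, names, and paths below are hypothetical illustrations, not the SOS data model:

```python
# Hypothetical manifest: the project records a reference (name plus
# tagged version) to each proven IP block instead of copying its files.
project_manifest = {
    "ip_references": [
        {"name": "serdes_phy", "repository": "ip_central", "tag": "v2.3"},
        {"name": "ddr_ctrl",   "repository": "ip_central", "tag": "v1.1"},
    ]
}

def resolve(ref: dict) -> str:
    """Map a reference to its single repository location; every project
    using this IP resolves to the same path, so nothing is duplicated."""
    return f"/repos/{ref['repository']}/{ref['name']}@{ref['tag']}"

paths = [resolve(r) for r in project_manifest["ip_references"]]
# ['/repos/ip_central/serdes_phy@v2.3', '/repos/ip_central/ddr_ctrl@v1.1']
```

Because every consumer resolves to the same tagged version, an auditor can trace exactly which IP revision each product shipped with.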
#3 Implementing “Sparse Populate”
The “Sparse Populate” feature further minimizes storage requirements. Design projects contain data blocks, such as IP libraries or PDK data, that users need only on a read-only basis. Rather than creating symbolic links for each file within these blocks, Keysight’s SOS establishes a single symbolic link to the directory’s top level. The feature also adds flexibility: one engineer can use the “Links to Cache” functionality while another opts for a physical copy. When necessary, users can transition from the Sparse Populate setup to a fully populated copy, modify the required files, and check the updates back into the repository.
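A minimal sketch of the idea, again with hypothetical function names rather than SOS internals: one symlink covers an entire read-only block, and only a deliberate transition materializes a physical copy.

```python
import shutil
from pathlib import Path

def sparse_populate(workspace: Path, repo: Path, block: str) -> None:
    """Link a read-only data block (e.g. a PDK) with one top-level symlink
    instead of one link per file inside it."""
    workspace.mkdir(parents=True, exist_ok=True)
    (workspace / block).symlink_to(repo / block, target_is_directory=True)

def fully_populate(workspace: Path, repo: Path, block: str) -> None:
    """Replace the single link with a physical copy when files inside the
    block must be modified."""
    link = workspace / block
    if link.is_symlink():
        link.unlink()
        shutil.copytree(repo / block, link)
```

For a PDK containing thousands of files, this reduces per-workspace bookkeeping from thousands of links to one, while reads still resolve transparently to the central copy.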
Ready to transform your DDM? Get a quote
Partnering with Keysight to Streamline Semiconductor Design Data Management
Keysight’s Design Data Management platform (SOS) offers a comprehensive, non-invasive solution for today’s distributed design teams.
By integrating features such as a non-intrusive design environment, version control, and network optimization, Keysight’s SOS can significantly reduce data storage costs while enhancing designer productivity and collaboration. This holistic approach ensures that teams can work seamlessly across different locations, maintaining a high level of efficiency and traceability across all design stages.
To explore the potential savings and efficiency gains for your organization from network optimization and streamlined design data management with Keysight’s SOS, request a customized demo today.