Figure 1: CADIQ analyzed this 3D model and discovered a zero-thickness feature that resulted in fatigue failure, and ultimately, scrapped parts.
Years ago when I was migrating databases, I wished for a crystal ball. If I could have predicted when and where a migration would fail, I wouldn’t have had to worry about downtime contingencies or long nights of troubleshooting.
My engineering colleagues have a similar need, particularly when it comes to predicting how a CAD model or assembly will behave in a downstream CAE or CAM application, and this need is just as great when it comes to translating, migrating, and re-mastering CAD data. While software providers struggle to fill the void, there are steps that design and manufacturing engineers can take to bridge the gap between “guessing” and “predicting.”
Where Many Fall Short
Engineering organizations might be reinforcing worst practices instead of best practices, piling up costs in wasted labor throughout the product-development value chain. Many of today’s CAD quality tools are limited to enforcing corporate modeling standards. When these “quality” models are released, a relatively large percentage of them often fail in downstream applications. Rather than labor to find a workaround for a model that cannot be analyzed or manufactured, manufacturing groups should put the data under a “CAD microscope” to pinpoint bad geometry, capture findings, and communicate results back to designers.
Model failures in downstream applications or, in worst cases, part failures during final product assembly can often be traced to intentional or unintentional engineering changes. A few of today’s quality tools identify problems within the model, but the engineer is left to hunt for the bad geometry or engineering change. Once the problem is located, the engineer is then left to his or her own devices to repair or remodel the part. This poses four concerns: not all automated repair applications do an adequate job; repairing or remodeling a part often introduces new issues that propagate throughout the assembly; the changes are not always documented and may diverge from design intent; and the process adds a significant amount of non-value-added labor. Instead, users need a tool that uncovers intentional and unintentional engineering changes, captures the findings, and documents action items.

Many of today’s CAD-centric tools lack the intelligence to predict how a model will behave in a target system (i.e., a CAD, CAM, or CAE application). In many cases, the differences between a “quality” model and a “usable” model are only discovered when the model reaches manufacturing, and the engineers are forced to find workarounds. These workarounds usually require the engineer to diverge from the original design intent, either through model changes or rebuilds. This kind of ad-hoc approach compromises the integrity of a master-model initiative and subjects the manufacturing process to delays, labor costs, and part scrap.
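The change-detection idea described above can be illustrated with a toy sketch. This is not CADIQ’s actual method; the metric names, tolerance, and `ModelSummary` structure are assumptions invented for illustration. The point is simply that comparing coarse properties of a master model against a derivative flags drift worth investigating.

```python
from dataclasses import dataclass

@dataclass
class ModelSummary:
    """Hypothetical coarse metrics extracted from a CAD model."""
    volume: float        # cubic mm
    surface_area: float  # square mm
    face_count: int

def diff_models(master: ModelSummary, derived: ModelSummary,
                rel_tol: float = 1e-4) -> list:
    """Return human-readable findings for metrics that drifted
    between the master model and a derivative."""
    findings = []
    for name in ("volume", "surface_area"):
        a, b = getattr(master, name), getattr(derived, name)
        if abs(a - b) > rel_tol * max(abs(a), 1e-12):
            findings.append(f"{name} changed: {a:g} -> {b:g}")
    if master.face_count != derived.face_count:
        findings.append(
            f"face_count changed: {master.face_count} -> {derived.face_count}")
    return findings
```

A real comparison tool works on the boundary representation itself, of course; a summary diff like this only tells you *that* something changed, not where.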
CADIQ found a blend with a sharp cut that cannot be manufactured.
Eliminating Some of the Guesswork
There is no panacea for predictive CAD interoperability, but there are ways to eliminate much of the guesswork and minimize the non-value-added labor in manufacturing. Designers and engineers are always encouraged to reuse models from existing products and programs, but undetected defects can cause simulation and manufacturing processes to fail. For example, a visually undetectable issue within a model caused parts to fail on final assembly (see Figure 1). If designers could have interrogated the legacy CAD data used in this example early on, they might have had enough time to rework or rebuild the model without wasting materials or delaying assembly.
It was only after this failure that the manufacturer sought out and used ITI TranscenData’s CADIQ application to troubleshoot the model.
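To give a sense of the kind of defect involved, a zero-thickness condition in a tessellated model can show up as faces whose area collapses to nothing. The following is a simplified stand-in, not CADIQ’s algorithm; the function names and tolerance are illustrative assumptions.

```python
import math

def triangle_area(p0, p1, p2):
    """Area of a 3D triangle: half the magnitude of the cross
    product of two edge vectors."""
    ux, uy, uz = (p1[i] - p0[i] for i in range(3))
    vx, vy, vz = (p2[i] - p0[i] for i in range(3))
    cx = uy * vz - uz * vy
    cy = uz * vx - ux * vz
    cz = ux * vy - uy * vx
    return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

def find_degenerate_faces(vertices, faces, tol=1e-9):
    """Return indices of triangular faces whose area collapses
    below tol -- a crude proxy for zero-thickness geometry."""
    return [i for i, (a, b, c) in enumerate(faces)
            if triangle_area(vertices[a], vertices[b], vertices[c]) < tol]
```

Such slivers are invisible on screen at normal zoom, which is exactly why they slip past visual inspection and surface only in downstream analysis or on the shop floor.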
Some quality tools enforce modeling constraints, which are supposed to ensure that the designer produces a quality model. However, these tools may also force the designer to unknowingly create unmanufacturable conditions. In these situations, the manufacturing engineer is either forced to make the changes on the shop floor (which often results in the data being recreated) or wait for the changes to be made by the design team. In another example, an unmanufacturable blend was caught by CADIQ. If this error had not been caught early, the manufacturing group would have remodeled the part. For an assembly with thousands of parts, the incremental non-value-added labor costs would skyrocket.
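A crude version of such a manufacturability check can be sketched as follows. The rule of thumb and function name are assumptions for illustration only: a concave blend can generally only be milled if its radius is at least the radius of the smallest available ball-end cutter.

```python
def unmachinable_blends(blend_radii_mm, min_tool_radius_mm):
    """Return indices of concave blends tighter than the smallest
    available ball-end cutter (toy rule of thumb, not a full
    machinability analysis)."""
    return [i for i, r in enumerate(blend_radii_mm)
            if r < min_tool_radius_mm]
```

Running a check like this before design release, rather than after the model reaches CAM, is the whole point of the argument above.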
Imagine the savings that might be realized if a part’s manufacturability were ensured prior to design release.
When a master model contains geometry that cannot be used by a downstream application, it forces the downstream user to clean up the model. These clean-ups are rarely communicated upstream, which undermines the organization’s ability to rely on the master representation of the data.
As a final example, an engineer used CADIQ to analyze and document the differences that stemmed from translations between the master model and its derivatives, so that the target models could be evaluated and corrected prior to their release to internal groups and the supply chain. The engineer also used CADIQ to analyze the quality of competing commercial translators (including both BREP and feature-based translators), native-to-native and native-to-neutral translation paths, and the results delivered by an offshore service provider.
CADIQ identifies changes to a master model that will be used by various internal and external groups working in different CAD systems.
Future of Predictive Interoperability
Many of ITI TranscenData’s clients envision intelligent interoperability applications that automatically repair issues found within a model or that launch an interactive process that offers tips to the designer for fixing the geometry or rebuilding the part. Others are hoping for robust tool suites that perform automated functions during PLM check-in and check-out processes. Some are already changing their engineering processes to ensure the manufacturability of their models or assemblies.
Regardless of the methods used, predictive interoperability technologies like CADIQ will have the greatest impact when applied early in the product development process and paired with authoring tools to intelligently predict and fix intentional and unintentional downstream usability issues. Currently, CADIQ supports upstream needs by integrating with CAD and PLM systems, automated BREP-based repair tools, and 3D PDF for documenting downstream issues.