Abstract

Advances in the capabilities of additive manufacturing (AM) have enabled the development of many high-value components for the aerospace, automotive, and medical fields. Two security concerns must be addressed to protect intellectual property and deter counterfeiting or unauthorized production in this field: (a) the predominantly cloud-based AM process chain may be breached, and stolen design files can be used for unauthorized reproduction of parts, and (b) legitimately acquired parts can be reverse engineered. In the present work, a method of embedding an identification code inside parts manufactured by AM is presented, which takes advantage of the layer-by-layer manufacturing process. The code is obfuscated by segmenting it into a specific number of parts, which are distributed across a large number of printed layers; only when viewed from a specific direction do the segments superimpose into the correct code. A further obfuscation scheme is demonstrated in which multiple identification codes are embedded in an interpenetrating format. Only a specific set of processing conditions prints the authentic code correctly inside the part; numerous other conditions produce an incorrect code, enabling positive identification of counterfeit or unauthorized parts. Securing the AM process chain can help accelerate the industrial adoption of this versatile method.
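
The segment-and-project idea can be illustrated with a minimal sketch. The Python example below is an illustrative assumption, not the authors' implementation: the voxel-array representation and the embed_code/read_code names are invented here, and the pseudo-random assignment of code rows to layers stands in for whatever placement scheme the paper actually uses. It shows only the core property that the scattered segments superimpose into the full code when projected along one specific axis.

```python
# Illustrative sketch of distributing a 2-D identification code across
# printed layers. Names and the voxel model are assumptions for clarity.
import numpy as np

def embed_code(code_2d: np.ndarray, n_layers: int, seed: int = 0) -> np.ndarray:
    """Scatter the rows of a binary code across n_layers of a voxel volume.

    Each row is placed in a pseudo-randomly chosen layer, so no single
    layer (and no single slice file) contains the complete code.
    Returns a volume of shape (n_layers, rows, cols).
    """
    rows, cols = code_2d.shape
    volume = np.zeros((n_layers, rows, cols), dtype=np.uint8)
    rng = np.random.default_rng(seed)
    layer_of_row = rng.integers(0, n_layers, size=rows)  # placement map
    for r in range(rows):
        volume[layer_of_row[r], r, :] = code_2d[r, :]
    return volume

def read_code(volume: np.ndarray, axis: int) -> np.ndarray:
    """Project the volume along one viewing axis (e.g. CT imaging).

    Only the projection along the build axis (axis=0 here) superimposes
    the scattered segments back into the original code.
    """
    return volume.max(axis=axis)

# Demo: a toy 8x8 binary "identification code"
rng = np.random.default_rng(42)
code = rng.integers(0, 2, size=(8, 8)).astype(np.uint8)
vol = embed_code(code, n_layers=50)

recovered = read_code(vol, axis=0)      # viewing along the build direction
assert np.array_equal(recovered, code)  # segments recombine correctly
```

The interpenetrating variant described in the abstract could be sketched analogously by writing the rows of a second (decoy or additional) code into the layers left unused by the first, so that an imaging attempt under the wrong assumptions recovers the wrong code.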
