Abstract

Perhaps paradoxically, we argue that the biological sciences are "data-limited". In contrast to the glut of DNA sequencing data available, high-throughput protein analysis is expensive and largely inaccessible. Hence, we posit that access to robust protein-level data is inadequate. Here, we use the framework of the formal engineering design process to both identify and understand the problems facing measurement science in the 21st century. In particular, discussion centers on the notable challenge of realizing protein analyses that are as effective (and transformative) as genomics tools. This Perspective looks through the lens of a case study on protein biomarker verification and validation to highlight the importance of iterative design in realizing significant advances over currently available measurement capabilities in the candidate or targeted proteomics space. The Perspective follows a podium presentation given by the author at The 16th International Conference on Miniaturized Systems for Chemistry and Life Sciences (μTAS 2012), specifically focusing on novel targeted proteomic measurement tools based on microfluidic design. The roles of unmet-needs identification, iteration in concept generation and development, and the existing gap in rapid prototyping tools for separations are all discussed.
