Financial derivatives are now in ubiquitous use around the globe, both to hedge exposure and as vehicles for speculation. Four years on from the worst financial crisis in almost a century, the world is still recovering. Yet the derivatives markets continue to grow, and have more than recovered from the slight pull-back that followed the crisis of 2008. According to the Bank for International Settlements, as of June 2011 the total global notional of outstanding OTC derivatives was US$707 trillion, surpassing the previous high point of around US$670 trillion in June 2008. Prior to that, the market had been doubling roughly every three years. The fundamental need to transfer risk in exchange for long-term reward will persist, while financial markets throughout the world continue to develop in size and maturity.

Increased regulatory scrutiny and a heightened sensitivity to counterparty credit risk since the crisis have forced fundamental changes in how even the most vanilla derivatives are valued. More widespread use of collateral has imposed new levels of complexity on the most basic and preliminary task in any financial calculation: the construction of a discount curve. The option to choose the collateral currency, nominally present in some collateral agreements, has even prompted analysis of whether a swap collateralized in this way in fact has non-negligible exposure to the volatility of certain spreads. Accounting for the effect of counterparty exposure on the value of a portfolio of vanilla derivatives (calculating a Credit Valuation Adjustment, or CVA) requires a level of flexibility and sophistication matching that previously applied only to highly complex exotics. Even without CVA, certain types of life-insurance policies available to retail investors cannot be valued appropriately without the sophisticated modeling and computational techniques traditionally needed only for exotic trades.

The modeling of financial derivatives is performed by quantitative analysts.
Although their critical contribution lies in making well-judged modeling decisions that adequately account for the factors influencing the value of a trade or portfolio, a practical reality of a quant's role is that calculations must be implemented in software systems. A typical quant's expertise in computing is heavily skewed toward obtaining accurate and robust numerical results: getting the numbers right. A consequence of this emphasis on detail is that the big picture is often missed. Code often evolves rather than being designed. New variants of existing pricing calculations are implemented without direct reuse of the shared parts. New trades or models prompt bespoke development which at times amounts to nothing more than a highly paid form of repetitive manual labor.

The missing ingredient here is architecture: the overall design of a system, informed by a deep understanding of the fundamental concepts that underpin the domain and of the nature of the information that flows and interacts when problems in the domain are solved. It is architecture that facilitates true flexibility, modularity and sub-component reuse, and without a clear treatment of the concepts relevant to the problem domain, the goal of constructing an effective architecture remains elusive. A major cause of this deficiency is that individuals with the necessary rich architectural vision are rare and expensive, particularly outside the realm of top-tier sell-side institutions. In this paper, we tackle precisely this issue: the fundamental concepts and design ideas that must underpin the architecture of a modern analytics platform.
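As a toy illustration of the kind of separation of concerns argued for above, the sketch below keeps market data (a discount curve), the trade description, and the pricing logic in independent components, so that the same curve object can be reused by any pricer. All names here are hypothetical and the flat curve is a deliberately simplistic stand-in for a properly bootstrapped, collateral-aware discount curve; this is an architectural sketch, not a production implementation.

```python
from dataclasses import dataclass
from math import exp

@dataclass
class DiscountCurve:
    """Market-data component. A flat continuously compounded curve:
    a stand-in for a real bootstrapped (e.g. OIS-collateralized) curve."""
    rate: float

    def df(self, t: float) -> float:
        # Discount factor for a payment t years from today.
        return exp(-self.rate * t)

@dataclass
class FixedCashFlow:
    """Trade description: pure data, no pricing logic."""
    amount: float
    pay_time: float  # years from today

def present_value(trade: FixedCashFlow, curve: DiscountCurve) -> float:
    """Pricing logic: consumes only the trade and market components it
    needs, so either side can be swapped or reused independently."""
    return trade.amount * curve.df(trade.pay_time)

curve = DiscountCurve(rate=0.02)
cf = FixedCashFlow(amount=100.0, pay_time=1.0)
pv = present_value(cf, curve)
```

The point is not the (trivial) mathematics but the shape: a new trade type or a new curve construction can be added without touching the other components, which is precisely the reuse that ad hoc, evolved code tends to lose.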