Abstract

As the complexity of chemical and energy technologies has increased, the need has grown for new computer-aided design tools for process synthesis. For technologies in the early stages of development and demonstration, the need to incorporate uncertainties at the process synthesis stage is especially great. This paper presents a new and efficient method, based on stochastic annealing, to identify optimal design configurations from a large number of process alternatives while accounting for the effects of uncertainty. Case studies of an integrated coal gasification combined cycle (IGCC) power plant are presented to illustrate the method. For this case, the new stochastic synthesis framework reduced computational time by 60% compared to an exhaustive search procedure. Greater efficiencies are expected as the number of process configurations increases.

INTRODUCTION

Integrated gasification combined cycle (IGCC) systems are an emerging technology for the clean and more efficient use of coal for power generation. Several IGCC designs have been demonstrated on a commercial scale, and other advanced concepts are currently in the development and demonstration stages. Of particular interest are improved technologies for the gasification and environmental control sections of an IGCC system, especially systems using hot gas cleanup. Since most components of these advanced IGCC systems are still in the design and development phase, significant uncertainties remain regarding their commercial performance and cost.

The United States Department of Energy (DOE) has developed computer-based performance models for several IGCC systems using the Aspen process simulator (Evans et al., 1979). These models include different gasifier designs (i.e., fixed-bed, fluidized-bed, and entrained-bed gasifiers) and different gas stream cleanup systems based on hot gas or cold gas cleanup technologies (Stone, 1985). Frey and Rubin (1992) extended the earlier DOE work to include new process performance models for environmental control systems, as well as capital and operating cost models for several variants of IGCC system designs. These Aspen models typically consist of approximately 80-90 unit operation blocks and up to eight flowsheet sections involving gasification, gas cleanup, and power generation units. While the bulk of each model consists of generalized unit operation blocks (e.g., pumps, heat exchangers, pressure vessels), there are also a large number of Fortran blocks and design specification blocks (defined by the input program structure of Aspen) which are specific to IGCC systems, or to a particular flowsheet.

Until now, each of these flowsheets has been evaluated separately. As the number of technological options increases, however, an exhaustive search through individual flowsheet simulations to identify an optimal design configuration becomes computationally expensive. Consequently, a systematic, efficient procedure for screening multiple alternatives and selecting an optimal design configuration is desirable. In this paper, the problem of identifying the optimal design configuration of an IGCC system is posed as a process synthesis problem, wherein the alternative technological variants are embedded in a single flowsheet, known as a "superstructure," from which an optimal configuration is identified. An additional advance is the explicit treatment of uncertainty, in contrast to the traditional deterministic approach to analysis.
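To make the superstructure idea concrete, the following minimal Python sketch represents each flowsheet section as a set of discrete alternatives and enumerates every candidate configuration. The section names and alternatives are hypothetical placeholders, not the paper's actual flowsheet; the point is only that the number of configurations grows multiplicatively with each section, which is why exhaustive flowsheet-by-flowsheet evaluation becomes expensive.

```python
# Minimal sketch of a superstructure as discrete alternatives per flowsheet
# section. All names are illustrative placeholders, not the paper's design.
from itertools import product

SUPERSTRUCTURE = {
    "gasifier": ["fixed-bed", "fluidized-bed", "entrained-bed"],
    "gas_cleanup": ["cold-gas", "hot-gas"],
    "sulfur_recovery": ["claus", "sulfuric-acid"],
}

def enumerate_configurations(superstructure):
    """Yield every discrete design configuration (one choice per section)."""
    sections = list(superstructure)
    for choices in product(*(superstructure[s] for s in sections)):
        yield dict(zip(sections, choices))

if __name__ == "__main__":
    configs = list(enumerate_configurations(SUPERSTRUCTURE))
    print(f"{len(configs)} candidate flowsheets")  # 3 * 2 * 2 = 12 here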
The presence of uncertainties makes the technology evaluation process computationally intensive. This paper presents an efficient approach for the solution of this real-world, large-scale synthesis problem.

PROCESS SYNTHESIS UNDER UNCERTAINTY

Approaches to process synthesis may be classified into four groups: (1) the thermodynamic approach (Linnhoff, 1981), (2) the evolutionary method (Nishida et al., 1981), (3) the hierarchical approach based on intuition and judgment (Douglas, 1988), and (4) the optimization or algorithmic approach (Grossmann, 1985; Friedler et al., 1995; Painton and Diwekar, 1994). These approaches, although different in principle, all provide directions for process synthesis research, and each brings different perspectives and advantages to the field. For example, the hierarchical approach provides improved process understanding and motivates novel problem representations, while the optimization approach can prune a search space of alternative configurations to find the flowsheet that maximizes or minimizes an objective function.

This paper focuses on the optimization approach to process synthesis, which is especially amenable to generalization and to interfacing with modern process simulators. The optimizer iteratively determines the discrete and continuous decision variables. Discrete variables denote the presence or absence of specific units in the flowsheet, while continuous variables represent flows, operating conditions, and design parameters for system components. The synthesis problem thus involves two elements: choosing the optimal components of a flowsheet, and optimizing a given flowsheet design.

Stochastic Simulation Capability

Process models involved in conventional optimization methods are typically deterministic in nature, i.e., all input parameters have a single fixed value, and all model results are likewise single-valued. In this paper we add the dimension of uncertainty. Uncertainties in process design arise in the early stages of development and demonstration because available performance data are often scant, and technical and economic parameters are not well established. A systematic framework for analyzing uncertainties is therefore a key step in improving upon current design capabilities.

In conventional simulators, sensitivity analysis via a series of multiple runs is the typical approach used to analyze uncertainty. Typically, however, only one or two parameters are varied at a time in a simulation that may contain a large number of independent variables, so important interactions or cases may easily be overlooked. Even where many cases are analyzed, sensitivity analysis of nonlinear models cannot easily provide information about worst-case or best-case scenarios, nor does it provide any measure of the likelihood of different outcomes. A generalized framework for analyzing uncertainties systematically, built around a chemical process simulator, was developed in our earlier work (Diwekar and Rubin, 1991). This approach allows probabilistic modeling of any chemical process flowsheet modeled in a simulator, and it overcomes the limitations of sensitivity analysis by providing a generalized treatment of uncertainties.
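As a concrete illustration of the limitation noted above, the short sketch below varies two parameters of a toy model one at a time and then jointly. The model is hypothetical (it is not one of the paper's IGCC models); its interaction term produces extreme outcomes that the one-at-a-time scan never sees.

```python
# Toy illustration of why one-at-a-time (OAT) sensitivity analysis can miss
# parameter interactions that joint sampling exposes. The model is a
# hypothetical two-parameter response, not one of the paper's IGCC models.
import random

def model(x, y):
    # The 10*x*y interaction dominates at the corners of the parameter space.
    return x + y + 10.0 * x * y

base_x, base_y = 0.0, 0.0

# OAT scan: vary each parameter alone about the base case.
oat = [model(x, base_y) for x in (-1.0, 1.0)] + \
      [model(base_x, y) for y in (-1.0, 1.0)]
print("OAT range:    ", min(oat), "to", max(oat))        # -1.0 to 1.0

# Joint sampling: vary both parameters simultaneously.
rng = random.Random(0)
joint = [model(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(1000)]
print("Sampled range:", round(min(joint), 1), "to", round(max(joint), 1))
# The joint sample approaches the true extremes near -10 and +12.
```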
This probabilistic or stochastic modeling procedure involves: (1) specifying the uncertainties in key input parameters in terms of probability distributions; (2) sampling the specified distributions in an iterative fashion; (3) propagating the effects of the uncertainties through the process flowsheet; and (4) applying statistical techniques to analyze the results. A major bottleneck in the stochastic modeling framework, however, is the computational intensity of the recursive sampling.

Stochastic Optimization Capability

Process optimization under uncertainty adds further complexity, since it requires optimization as well as uncertainty analysis. Figure 1 shows a schematic of the stochastic optimization procedure developed for a given flowsheet. The procedure involves two recursive loops: an inner stochastic sampling loop and an outer process optimization loop. Because each loop involves iteration, it is desirable to reduce the computational intensity of, and the interactions between, the two loops in order to address large-scale synthesis problems.

[Figure 1. Stochastic Optimization Framework. In the outer optimization loop, the optimizer passes decision variables to the stochastic modeler and receives the probabilistic objective function and constraints. In the inner sampling loop, the stochastic modeler draws the nth value of the uncertain variables from the parameter uncertainty distributions, the process model returns results for the nth sample, and the accumulated results form a CDF of the objective function and constraints.]

Recently, a new recursive sampling technique, known as Hammersley sequence sampling (HSS), was shown to exhibit better homogeneity over a multivariate parameter space than conventional sampling methods (Diwekar and Kalagnanam, 1997). In this context, homogeneity is the ability to produce a uniform distribution of points covering the entire sample space, such that the overall distribution is representative of the entire population. Further, the number of samples required for the HSS technique to converge to different performance measures of a random output variable (e.g., mean, variance, or fractiles), subject to input uncertainties, was found to be lower than for traditional Monte Carlo or Latin hypercube sampling techniques. This rapid convergence property of Hammersley sequence sampling has important implications for stochastic modeling of complex processes: it means that precise estimates of any probabilistic function are achievable with a smaller sample size. This efficient sampling method can be used in the inner sampling loop to enhance the computational efficiency of the stochastic optimization framework.
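For concreteness, the sketch below generates Hammersley points in the unit hypercube using the standard radical-inverse construction from the sampling literature; it is an illustration, not the authors' implementation. Uniform points would then be mapped through the inverse CDFs of the specified input distributions to obtain samples of the uncertain parameters.

```python
# Sketch of Hammersley sequence sampling (HSS) in [0, 1)^dim, using the
# standard construction: the first coordinate is i/N and the remaining
# coordinates are radical inverses of i in successive prime bases.
# Illustrative only; not the authors' implementation.

_PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

def radical_inverse(i: int, base: int) -> float:
    """Reflect the base-`base` digits of i about the radix point."""
    inv, weight = 0.0, 1.0 / base
    while i > 0:
        i, digit = divmod(i, base)
        inv += digit * weight
        weight /= base
    return inv

def hammersley(n_samples: int, dim: int):
    """Yield n_samples Hammersley points in the dim-dimensional unit cube."""
    if dim - 1 > len(_PRIMES):
        raise ValueError("extend _PRIMES for higher-dimensional problems")
    bases = _PRIMES[: dim - 1]
    for i in range(n_samples):
        yield [i / n_samples] + [radical_inverse(i, b) for b in bases]

# Example: 100 well-spread points over two uncertain parameters.
points = list(hammersley(100, 2))
```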
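Finally, the two-loop structure of Figure 1 can be sketched as a generic simulated-annealing skeleton over discrete design configurations. This is a simplified stand-in for the paper's stochastic annealing algorithm, not the algorithm itself; here `simulate(config, sample)` is a hypothetical placeholder for one pass of the process model with the nth sampled values of the uncertain parameters, and the sample set could come from the `hammersley` generator above.

```python
# Schematic of the two recursive loops of Figure 1: an outer annealing loop
# over discrete configurations wrapped around an inner sampling loop. A
# simplified stand-in for the paper's stochastic annealing algorithm.
import math
import random

def expected_cost(config, simulate, samples):
    """Inner sampling loop: run the process model once per sample and
    average the objective (a CDF could be assembled from the same results)."""
    return sum(simulate(config, s) for s in samples) / len(samples)

def anneal(initial, neighbor, simulate, samples,
           t0=10.0, cooling=0.95, n_iter=200, seed=0):
    """Outer optimization loop: anneal over discrete design configurations."""
    rng = random.Random(seed)
    current = initial
    cost = expected_cost(current, simulate, samples)
    best, best_cost = current, cost
    temp = t0
    for _ in range(n_iter):
        candidate = neighbor(current, rng)          # perturb the configuration
        cand_cost = expected_cost(candidate, simulate, samples)
        # Accept improvements always; accept uphill moves with Boltzmann
        # probability so the search can escape local optima.
        if cand_cost < cost or rng.random() < math.exp((cost - cand_cost) / temp):
            current, cost = candidate, cand_cost
            if cost < best_cost:
                best, best_cost = current, cost
        temp *= cooling                             # cool the temperature
    return best, best_cost
```

Every candidate configuration visited by the outer loop triggers a full pass of the inner loop, which is why the smaller sample sizes afforded by HSS translate directly into overall computational savings.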
