Abstract

An architecture is presented to show how the distributed computing concept can be applied to a typical real-time reservoir monitoring process. Challenges expected in the implementation of distributed computing for such a reservoir monitoring process are also presented. Other possible applications of distributed computing in reservoir analysis and drilling dynamics are briefly discussed.

Introduction

Distributed computing, or collective computing (also called community computing), is the latest paradigm in the computing world. The concept has matured to a level at which it can be used to continuously monitor the dynamic behaviour of a petroleum reservoir at much shorter time intervals, as discussed in this paper. It has been employed at the University of California, Berkeley for the SETI@home project, where the task of searching for extraterrestrial intelligence in outer space is broken down into chunks that are distributed among various computers over the Internet(1). These computers have the SETI@home software installed on them and act as a community of idle processor providers on the information superhighway. The same idea is employed by distributed.net to tackle various mathematical and cryptographic problems(1). The idea is simple: tap the processing power of idle computers across the Internet to assist with computationally intensive tasks, such as those involved in weather forecasting, graphics-heavy applications, gene sequence analysis, and general high-volume scientific computing.

Distributed computing is becoming the de facto standard in bioinformatics for analyzing and making sense of the large volumes of available data, and several laboratories across the world are developing or already using distributed computing strategies for computationally intensive tasks. Details of a distributed scientific computing environment, implemented with existing technologies such as ILU (inter-language unification), which follows the CORBA (common object request broker architecture) standard, and JavaBeans, were demonstrated by Decker et al.(2). Stanford University in California has reported recent use of distributed computing "in simulating protein folding in order to understand how proteins fold"(3). IBM has also applied for and been granted a patent related to collective computing, covering the management of computer resources in a distributed computing environment(4). Many experts believe this is the direction in which computer networking is heading, especially with the proliferation of computers all over the world. Napster, an online music sharing service, is another example of a distributed computing application, or what is being referred to as peer-to-peer (P2P) computing.

Distributed Computing Initiatives

The fundamental technologies driving distributed computing are the Java 2 Platform, Enterprise Edition (J2EE) by Sun Microsystems and the Microsoft Dot-Net (Microsoft.Net) initiative. "The J2EE combines a number of technologies in one architecture with a comprehensive application programming model and compatibility test suite for building enterprise-class server-side applications(5)." Dot-Net is Microsoft's XML (Extensible Markup Language) Web services platform. XML Web services "link applications, services, and devices together into connected solutions that enable people to act on information any time, any place, and from any smart device(6)."
They are enabling a new generation of distributed application development with a focus on Web services and application integration.
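
As a rough illustration of the chunking idea described in the Introduction, the following Java sketch splits a batch of downhole pressure readings into chunks, hands each chunk to a worker, and combines the partial results. The class names (ChunkedMonitoringSketch, ChunkTask), the synthetic pressure values, and the use of a local thread pool in place of remote volunteer machines are illustrative assumptions, not part of the architecture presented in this paper; a real deployment would ship each chunk to an idle machine over the Internet, for example through the XML Web services mentioned above.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/**
 * Hypothetical sketch of the "break the task into chunks and farm the chunks
 * out" pattern. A local thread pool stands in for the community of idle
 * machines; in practice each chunk would be sent to a remote worker.
 */
public class ChunkedMonitoringSketch {

    /** Computes the average of one chunk of pressure readings. */
    static class ChunkTask implements Callable<Double> {
        private final double[] readings;
        private final int start;
        private final int end; // exclusive

        ChunkTask(double[] readings, int start, int end) {
            this.readings = readings;
            this.start = start;
            this.end = end;
        }

        @Override
        public Double call() {
            double sum = 0.0;
            for (int i = start; i < end; i++) {
                sum += readings[i];
            }
            return sum / (end - start);
        }
    }

    public static void main(String[] args) throws Exception {
        // Synthetic downhole pressure readings (illustrative values only).
        double[] pressures = new double[10_000];
        for (int i = 0; i < pressures.length; i++) {
            pressures[i] = 3000.0 + Math.sin(i / 100.0) * 50.0;
        }

        int chunkSize = 1_000;
        ExecutorService workers = Executors.newFixedThreadPool(4);
        List<Future<Double>> results = new ArrayList<>();

        // Master node: carve the data into equal chunks and dispatch each one.
        for (int start = 0; start < pressures.length; start += chunkSize) {
            int end = Math.min(start + chunkSize, pressures.length);
            results.add(workers.submit(new ChunkTask(pressures, start, end)));
        }

        // Gather the partial results and combine them into one answer.
        double overall = 0.0;
        for (Future<Double> f : results) {
            overall += f.get();
        }
        overall /= results.size();

        System.out.printf("Average reservoir pressure: %.1f psi%n", overall);
        workers.shutdown();
    }
}
```

The final aggregation step is meant to mirror the role a central monitoring server would play in collecting partial results from the distributed workers; the details of that architecture are discussed in the body of the paper.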
