Interpolation Extrapolation Search: An Optimized Hybrid Algorithm for Data Retrieval in Management Applications
The escalating volume and often irregular structure of social assistance data pose significant challenges for efficient data retrieval in management systems. Traditional search algorithms, such as linear and binary search, frequently encounter limitations when handling these large-scale datasets. This research conducts a comparative study between two hybrid algorithms, Jump Binary Search (JBS) and Interpolation Extrapolation Search (IES), aiming to identify the most effective method for a web-based social assistance data management system. Evaluations were performed on a dataset comprising 480 names of social assistance recipients, measuring the number of iterations, execution time, and search accuracy. The results demonstrate IES's superiority over JBS in both iteration efficiency and execution speed. IES exhibited an execution time ranging from 0.002 to 0.006 ms, whereas JBS had an execution time ranging from 0.015 to 0.039 ms. Based on these findings, IES was successfully implemented into a Laravel-based application utilizing a MySQL database. This system is capable of executing searches in less than one second per request. This implementation significantly enhances the system's adaptability and provides an effective search solution for dynamic, large-scale data environments, offering rapid and efficient access to data.
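The abstract does not reproduce the IES procedure itself. As a hedged sketch, the interpolation step at the core of such a hybrid can be written as follows in Python; the function name, the integer-key assumption, and the not-found return value are illustrative only — searching recipient names, as in the study, would first require mapping the sorted names onto numeric keys.

```python
def interpolation_search(arr, target):
    """Search a sorted list of numeric keys by probing an estimated position.

    Unlike binary search, which always probes the midpoint, the probe index
    is interpolated from the key values at the ends of the current range.
    Returns the index of target, or -1 if it is absent.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi and arr[lo] <= target <= arr[hi]:
        if arr[hi] == arr[lo]:  # all keys in range equal: avoid division by zero
            return lo if arr[lo] == target else -1
        # Linear estimate of where target sits between arr[lo] and arr[hi]
        pos = lo + (hi - lo) * (target - arr[lo]) // (arr[hi] - arr[lo])
        if arr[pos] == target:
            return pos
        if arr[pos] < target:
            lo = pos + 1
        else:
            hi = pos - 1
    return -1
```

When keys are roughly uniformly distributed, the interpolated probe lands on or near the target in very few steps, which is consistent with the iteration savings over midpoint-based methods that the abstract reports.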
- Book Chapter
- 10.5772/39461
- Jan 1, 2010
Fossil fuels constitute a major energy resource for Canada. In 2002 alone, the production of oil, gas and coal contributed over $30 billion to the Canadian economy. Fossil fuel is presently the world's most abundant, economical and reliable fuel for energy production. However, the industry now faces a major challenge because the production of fossil fuels including coal, crude oil and gas, and the processes currently used for energy production from such fuels, can have adverse environmental consequences. Hence, along with the positive economic advantages of energy production using fossil fuels comes the responsibility of mitigating the consequent adverse environmental and climate-change impacts (Harrison et al., 2007). Carbon capture and storage (CCS) is an approach for reducing carbon dioxide (CO2) emissions to the environment by capturing and storing the CO2 gas instead of releasing it into the air. The application of CCS to a modern conventional power plant could reduce CO2 emissions to the atmosphere by approximately 80-90% compared to a plant without CCS (IPCC, Metz, & Intergovernmental Panel on Climate Change Working Group III, 2005). CO2 capture technologies mainly include: chemical absorption, physical absorption, membrane separation and cryogenic fractionation. Among these technologies, chemical absorption of CO2 is one of the most mature because of its efficiency and low cost. The highly complex CO2 absorption process generates a vast amount of data, which need to be monitored. However, industry process control systems do not typically incorporate operators' heuristics in their intelligent control or data analysis functionalities. Our objective is to construct an intelligent data management and analysis system that incorporates such human experts' heuristics. The Data Analysis Decision Support System (DADSS) for the CO2 capture process reported in (Wu & Chan, 2009) is a step towards filling this gap in automated control systems.
However, the DADSS is a standalone PC-based system with limited flexibility and connectivity. In this paper we present a web-based CO2 data management and analysis system (CO2DMA), which overcomes these limitations. The system presented in this paper was built based on data acquired from the Pilot Plant CO2 capture process of the International Test Centre for CO2 capture (ITC), located at the University of Regina in Saskatchewan, Canada. The CO2 capture process at the ITC is monitored and controlled by the DeltaV system (Trademark of Emerson Process
- Research Article
34
- 10.1176/appi.ps.53.6.671
- Jun 1, 2002
- Psychiatric Services
Clinical Computing: A Web-Based Data Management System to Improve Care for Depression in a Multicenter Clinical Trial
- Research Article
110
- 10.1176/ps.53.6.671
- Jun 1, 2002
- Psychiatric Services
A web-based data management system to improve care for depression in a multicenter clinical trial.
- Research Article
11
- 10.1186/s12899-014-0013-1
- Dec 1, 2014
- BMC Physiology
Background: Numerous innovations for the management and collection of “big data” have arisen in the field of medicine, including implantable computers and sensors, wireless data transmission, and web-based repositories for collecting and organizing information. Recently, human clinical devices have been deployed in captive and free-ranging wildlife to aid in the characterization of both normal physiology and the interaction of animals with their environment, including reactions to humans. Although these devices have had a significant impact on the types and quantities of information that can be collected, their utility has been limited by internal memory capacities, the efforts required to extract and analyze information, and by the necessity to handle the animals in order to retrieve stored data. Results: We surgically implanted miniaturized cardiac monitors (1.2 cc, Reveal LINQ™, Medtronic Inc.), a newly developed human clinical system, into hibernating wild American black bears (N = 6). These devices include wireless capabilities, which enabled frequent transmissions of detailed physiological data from bears in their remote den sites to a web-based data storage and management system. Solar and battery powered telemetry stations transmitted detailed physiological data over the cellular network during the winter months. The system provided the transfer of large quantities of data in near-real time. Observations included changes in heart rhythms associated with birthing and caring for cubs, and in all bears, long periods without heart beats (up to 16 seconds) occurred during each respiratory cycle. Conclusions: For the first time, detailed physiological data were successfully transferred from an animal in the wild to a web-based data collection and management system, overcoming previous limitations on the quantities of data that could be transferred.
The system provides an opportunity to detect unusual events as they are occurring, enabling investigation of the animal and site shortly afterwards. Although the current study was limited to bears in winter dens, we anticipate that future systems will transmit data from implantable monitors to wearable transmitters, allowing for big data transfer on non-stationary animals.
- Research Article
- 10.1158/0008-5472.sabcs13-p4-19-03
- Dec 15, 2013
- Cancer Research
Four years ago, after our Baylor College of Medicine (BCM) Breast and Cancer Centers experience, we decided to initiate a related translational research Project at the Breast Center Buenos Aires (BCBA) in Argentina. This included the development of a breast cancer-oriented Management System for clinical and lab data attached to a biorepository of tumor and normal breast tissue with matched blood specimens. We strictly followed ISBER Best Practices for Biorepositories, and shared the same protocols and strategies with BCM to facilitate international collaborative breast cancer research, being extremely aware of international standards. Objectives: Creation of a breast cancer-oriented blood, breast tissue and tumor biobank for translational research purposes. Creation of a Biobank Management and Tracking System, with a breast cancer database for associated epidemiological, pathological, clinical, and follow-up data of each patient. Methods: From April 2011 to June 2013 we processed blood and tissue samples from BCBA. Blood is collected at several time points during the breast cancer disease process: pre-surgical, pre-systemic treatment, and, if applicable, in the metastatic setting. Blood products are stored as whole blood, plasma, buffy coat, red blood cell pellet, serum and clot at -80°C or in GenPlates®. Fresh tissue and tumor samples were collected during surgical or core biopsy proceedings and stored fresh at -80°C in 1 ml cryovials and as FFPE (formalin-fixed paraffin embedded) tissue. Upon enrollment, participants completed an extensive epidemiological and risk factor questionnaire, which is supplemented by medical record abstraction for relevant pathological and clinical data, and re-contacted once a year for follow-up. 
Preparation and adaptation processes were compliant with the Population Sciences Biorepository and Smith Breast Center Tumor Bank at BCM: 1) IRB-approved informed consent documents; 2) Epidemiological and risk factor questionnaires (Core/Breast module); 3) Blood and tissue collection and processing protocols. A web-based data management and tracking system was specifically designed for the BCBA biobank. Results: To date, we have collected 9043 samples, from 274 individuals, in 293 sample collections. Of 5449 frozen samples, 5054 (92.7%) are blood and 395 (7.3%) are tissue. We also collected 3264 blood samples in GenPlate® wells from 68 patients on 17 plates, and extracted DNA from them into Gentegra® tubes, all stored at room temperature. 230 FFPE breast cancer tissue biopsies from 20 surgical specimens, given by the pathologist after diagnosis, stored in cassettes at room temperature in our lab, were also included in the system starting from January 2013. All of them have been classified by the physician into three categories: Healthy Control 15.3% (n = 42), Benign 55.1% (n = 151) and Cancer 29.6% (n = 81). We collected epidemiological and cancer data from all of them in our system. Conclusion: It took nearly 4 years from inception to realization for this biobank; however, the potential benefit to translational breast cancer research is large. The overall value of this biobank will depend on the number of individuals/samples accrued, the follow-up attained and data accuracy. Citation Information: Cancer Res 2013;73(24 Suppl): Abstract nr P4-19-03.
- Research Article
10
- 10.3109/17482960802378998
- Jan 1, 2009
- Amyotrophic Lateral Sclerosis
The objective was to report on the creation, features and performance of a web-based data management system for a two-stage phase II randomized clinical trial of Co-Enzyme Q10 in ALS. We created a relatively comprehensive web-based data system that provided electronic data entry; patient management utilities; adverse event reporting, safety monitoring, and invoice generation; and standardized coding for medications and adverse events. In stage 1, clinical sites submitted 7207 forms reporting on 105 patients followed for 10 months. Less than 0.7% of submitted forms contained errors. At the time of the delivery of the analysis data set, only four errors remained unresolved. Data were available quickly, with a median time from event to data posting of two days. The data set was locked and the analysis data set produced nine days after the final patient visit. A survey of trial personnel yielded generally positive feedback, with 75% of respondents wishing to use a similar system in the future. Given sufficient resources, a comprehensive web-based data management system can meet the need for clean, available data in clinical trials in ALS and similar diseases, and can contribute significantly to their efficient execution.
- Research Article
4
- 10.1023/a:1010624922603
- Jan 1, 2001
- Journal of medical systems
There are many database-oriented sites on the web that provide basic medical knowledge, hospital information, and medical counseling. However, there are only a few oriental pulse databases on the web. The goal of this study is therefore to develop a Clinical Database Management System of Oriental Pulse Wave Forms using the World Wide Web. Accordingly, this study conducted research on a web-based diagnosis data management system for pulse waveforms, as well as a method for transmitting pulse waveform data. To establish a standard for patients' pulse waveform records, a web-based clinical database management system was developed.
- Research Article
- 10.21070/ijccd2023922
- Jun 23, 2023
- Indonesian Journal of Cultural and Community Development
This research focuses on the development and implementation of a web-based citizen data management system in Keboan Anom Village, Sidoarjo, Indonesia. The aim of the study is to enhance the efficiency and effectiveness of citizen data recording and management processes. The System Development Life Cycle (SDLC) Waterfall model is employed as the methodology for system development, ensuring an organized and systematic approach. Data collection involves semi-structured interviews and unstructured observations, targeting the village's RT 01 (neighborhood association) officials. The application features two key functions: citizen registration, including permanent and non-permanent residents, and a comprehensive data summary. Access to these functions is limited to the RT Secretary and the RT Chairman. The implementation of the system provides improved data management and financial tracking capabilities, enabling simplified reporting. The web-based platform ensures accessibility for RT officials and residents with internet connectivity. Black box testing confirms the functionality of the website and its compliance with expected outputs. The results demonstrate that the application supports the transformation of Keboan Anom Village into a more efficient and technologically connected Digital Village. Future research is suggested to examine community participation levels in using the citizen data management application and its impact on decision-making and overall village governance, incorporating participatory approaches to understand the perspectives and active involvement of Keboan Anom Village residents.
 Highlight:
 
 Efficient citizen data management: The development of a web-based application for citizen data management in Desa Keboan Anom enables efficient recording and documentation of resident information, improving data accuracy and accessibility.
 Technological transformation: The implementation of the web-based system contributes to the realization of a Digital Village concept, facilitating efficient and technology-connected administration in Desa Keboan Anom, Sidoarjo.
 Participatory approach: Future research should focus on evaluating community participation levels and the impact of the citizen data management application on decision-making and overall village governance, utilizing a participatory approach to gain insights and foster active involvement of Keboan Anom Village residents.
 
 Keyword:
 Keboan Anom Village, Citizen Data Management, Web-Based Application, Efficiency, Participatory Approach
- Research Article
1
- 10.3141/1956-01
- Jan 1, 2006
- Transportation Research Record: Journal of the Transportation Research Board
A case study of the implementation of a web-based electronic data management system (EDMS) for the construction material quality assurance program of the State Highway 130 Turnpike project, a 49-mi design-build highway megaproject in Texas, is presented. EDMS consists of a set of web-enabled data management and engineering analysis tools that support the independent construction quality assurance firm's (CQAF's) functions in managing, reporting, and analyzing materials test data. The system currently supports 43 field and laboratory test procedures and enables CQAF staff to process a large volume of test reports effectively. Substantial efforts were expended to develop online information delivery functions to disseminate consistent test information among a broad constituency of users, including engineers, managers, technicians and inspectors, construction superintendents, material vendors, and designers, in a virtually real-time fashion. The system automatically tracks and monitors material-related deficiencies.
- Research Article
13
- 10.1016/j.matcom.2012.11.009
- Jan 5, 2013
- Mathematics and Computers in Simulation
Design and implementation of a web-based groundwater data management system
- Research Article
- 10.22146/ijccs.68204
- Oct 31, 2021
- IJCCS (Indonesian Journal of Computing and Cybernetics Systems)
The technical improvements of the present era require everyone to understand information and communication technology, and their influence is felt across a range of industries, especially in the workplace. An office management system is a form of administrative activity aimed at increasing management effectiveness. Nevertheless, data management at PT. Dwimatama Multikarsa Semarang is still done manually, particularly in the production department, where data are entered into Microsoft Excel and stored on hard drives or flash drives. This approach is inefficient, especially when data are lost or corrupted. Given these limitations, the authors propose computerizing the administration and archiving system. A sequential searching approach is used for data retrieval, allowing users to find information more quickly and effectively. The system was built using the Laravel framework and the Hypertext Preprocessor (PHP) programming language. The outcome of the study is a web-based data management and storage system backed by a MySQL database, which enables employees to handle and store information more effectively and efficiently.
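The abstract names sequential (linear) search as the retrieval method but gives no code. A minimal sketch in Python (the actual system is PHP/Laravel) over a list of record dicts; the `records`, `key`, and `value` names are hypothetical, not taken from the paper.

```python
def sequential_search(records, key, value):
    """Scan records from front to back and return (index, record) of the
    first record whose field `key` equals `value`, or (-1, None) if none does."""
    for i, rec in enumerate(records):
        if rec.get(key) == value:
            return i, rec
    return -1, None
```

Sequential search requires no sorting or indexing, which suits small, frequently edited tables, but its cost grows linearly with the number of records — the reason the lead study above moves to interpolation-based methods for larger datasets.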
- Research Article
18
- 10.1007/s11306-016-1142-2
- Dec 27, 2016
- Metabolomics
Background: An increasing number of research laboratories and core analytical facilities around the world are developing high throughput metabolomic analytical and data processing pipelines that are capable of handling hundreds to thousands of individual samples per year, often over multiple projects, collaborations and sample types. At present, there are no Laboratory Information Management Systems (LIMS) that are specifically tailored for metabolomics laboratories that are capable of tracking samples and associated metadata from the beginning to the end of an experiment, including data processing and archiving, and which are also suitable for use in large institutional core facilities or multi-laboratory consortia as well as single laboratory environments. Results: Here we present MASTR-MS, a downloadable and installable LIMS solution that can be deployed either within a single laboratory or used to link workflows across a multisite network. It comprises a Node Management System that can be used to link and manage projects across one or multiple collaborating laboratories; a User Management System which defines different user groups and privileges of users; a Quote Management System where client quotes are managed; a Project Management System in which metadata is stored and all aspects of project management, including experimental setup, sample tracking and instrument analysis, are defined; and a Data Management System that allows the automatic capture and storage of raw and processed data from the analytical instruments to the LIMS. Conclusion: MASTR-MS is a comprehensive LIMS solution specifically designed for metabolomics. It captures the entire lifecycle of a sample starting from project and experiment design to sample analysis, data capture and storage. It acts as an electronic notebook, facilitating project management within a single laboratory or a multi-node collaborative environment.
This software is being developed in close consultation with members of the metabolomics research community. It is freely available under the GNU GPL v3 licence and can be accessed from, https://muccg.github.io/mastr-ms/.
- Research Article
29
- 10.1177/1740774509358748
- Jan 18, 2010
- Clinical Trials
Clinical trial investigators and sponsors invest vast amounts of resources and energy into conducting trials and often face daily challenges with data management, project management, and data quality control. Rather than waiting months for study progress reports, investigators need the ability to use real-time data for the coordination and management of study activities across all study team members including site investigators, oversight committees, data and safety monitoring boards, and medical safety monitors. Web-based data management systems are beginning to meet this need, but what distinguishes one system from another are user needs/requirements and cost. The purpose of this article is to illustrate the development and implementation of a web-based data and project management system for a multicenter clinical trial designed to test the superiority of repeated transcranial magnetic stimulation versus sham for the treatment of patients with major depression. The authors discuss the reasons for not using a commercially available system for this study and describe the approach to developing their own web-based system for the OPT-TMS study. Timelines, effort, system architecture, and lessons learned are shared with the hope that this information will direct clinical trial researchers and software developers towards more efficient, user-friendly systems. The developers use a combination of generic and custom application code to allow for the flexibility to adapt the system to the needs of the study. Features of the system include: central participant registration and randomization; secure data entry at the site; participant progress/study calendar; safety data reporting; device accounting; monitor verification; and user-configurable generic reports and built-in customized reports. Hard coding was more time-efficient for addressing project-specific issues than creating a generic code application.
As a consequence of this strategy, the required maintenance of the system is increased and the value of using this system for other trials is reduced. Web-based central computerized systems offer time-saving, secure options for managing clinical trial data. The choice of a commercially available system or an internally developed system is determined by the requirements of the study and users. Pros and cons to both approaches were discussed. If the intention is to use the system for various trials (single and multi-center, phases I-III) across various therapeutic areas, then the overall design should be a generic structure that simplifies the general application with minimal loss of functionality.
- Research Article
14
- 10.1007/s10916-008-9212-2
- Sep 17, 2008
- Journal of Medical Systems
Modern clinical research often involves multicenter studies, large and heterogeneous data flux, and intensive demands of collaboration, security and quality assurance. In the absence of commercial or academic management systems, we designed an open-source system to meet these requirements. Based on the Apache-PHP-MySQL platform on a Linux server, the system allows multiple users to access the database from any location on the internet using a web browser, and requires no specialized computer skills. Multi-level security system is implemented to safeguard the protected health information and allow partial or full access to the data by individual or class privilege. The system stores and manipulates various types of data including images, scanned documents, laboratory data and clinical ratings. Built-in functionality allows for various search, quality control, analytic data operations, visit scheduling and visit reminders. This approach offers a solution to a growing need for management of large multi-center clinical studies.