Articles published on Big Data
Authors
Select Authors
Journals
Select Journals
Duration
Select Duration
88267 Search results
Sort by Recency
- New
- Research Article
- 10.5492/wjccm.v14.i4.104703
- Dec 9, 2025
- World Journal of Critical Care Medicine
- Andreas G Siamarou
BACKGROUND Diagnostic errors in critical care settings are a significant challenge, often leading to adverse patient outcomes and increased healthcare costs. Millimeter-wave (mmWave) technology, with its ability to provide high-resolution, real-time data, offers a transformative solution for enhancing diagnostic accuracy and patient safety. This paper explores the integration of mmWave technology in intensive care units (ICUs) to enable non-invasive monitoring, minimize diagnostic errors, and improve clinical decision-making. By addressing key challenges, including data latency, signal interference, and implementation feasibility, this approach has the potential to revolutionize patient monitoring systems and set a new standard for critical care delivery. The paper discusses the high prevalence of diagnostic errors in medical care, particularly in primary care and ICUs, and emphasizes the need to improve diagnostic accuracy. Diagnostic errors are responsible for a significant number of deaths, disabilities, prolonged hospitalizations, and delayed diagnoses worldwide. AIM To address this issue, the paper proposes ultrafast wireless transmission of medical big data in primary care, specifically for remote smart-sensor monitoring devices. It suggests that wireless transmission at speeds up to 100 Gb/s (12.5 GB/s) over short distances (1-10 meters) is necessary to reduce diagnostic errors. METHODS The method includes the design and testing of a channel sounder operating in the 63.4-64.4 GHz frequency range. The system demonstrated a dynamic range of 70 dB, a noise level of -110 dBm, and a time resolution of 1 ns. The experiment measured the impulse response of the channel at 36 locations within the primary care/ICU scenario. RESULTS The system was tested in a simulated ICU environment to evaluate latency, i.e., the time delay in data transmission and processing.
The results of the study showed that the system met the requirements of ICUs, providing excellent latency values. The delay spread and excess delay values were within acceptable limits, indicating that ICU requirements were successfully met. The paper suggests timely deployment of such a system. Impact on data transmission: a 100 MB magnetic resonance imaging scan can be transmitted in approximately 0.008 seconds; a 1 GB scan would take approximately 0.08 seconds. This capability could revolutionize healthcare, enabling real-time remote diagnostics and comparisons with artificial intelligence models, even in large-scale systems. CONCLUSION The experiment demonstrated the feasibility of using high-speed wireless transmission for improved diagnostics in ICUs, offering potential benefits in terms of reduced errors and improved patient outcomes. The findings are deemed valuable to the medical community and public healthcare systems, and further research in this area is suggested.
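The quoted transfer times follow directly from the stated 100 Gb/s link rate; a minimal back-of-the-envelope check (idealized, ignoring protocol overhead, and assuming nothing beyond the abstract's own figures):

```python
def transfer_time_s(payload_bytes: float, link_gbps: float = 100.0) -> float:
    """Ideal transfer time in seconds: payload size over link throughput."""
    bytes_per_second = link_gbps * 1e9 / 8  # 100 Gb/s -> 12.5 GB/s
    return payload_bytes / bytes_per_second

print(transfer_time_s(100e6))  # 100 MB MRI scan -> 0.008 s
print(transfer_time_s(1e9))    # 1 GB scan      -> 0.08 s
```

Real-world throughput would be lower once coding, retransmission, and processing delays are included, which is why the paper measures latency and delay spread separately.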
- New
- Research Article
- 10.1108/jeim-06-2025-0452
- Dec 8, 2025
- Journal of Enterprise Information Management
- Chien Hung Liu
Purpose This study examines how Big Data Analytics Capability (BDAC) creates strategic business value by enabling value-creation mechanisms. Design/methodology/approach This study used a quantitative survey approach via a Chinese research firm with the largest panel database. All questionnaires were translated and back-translated by bilingual experts; ambiguous items were refined following a pilot study (n = 55), after which a main survey (n = 408) was conducted among Chinese firms adopting big data. PLS-SEM with 5,000 bootstrap resamples tested relationships among BDAC, value-creation mechanisms (VCMs) and strategic business value. Common method bias was assessed and found to be non-significant. Findings This study finds that BDAC drives strategic value creation through the interconnected effects of its core components – technology, talent and management capabilities. The results reveal that these components work synergistically to enhance VCMs, translating analytics potential into tangible strategic benefits and strengthening firms’ competitive advantage in dynamic business environments. Research limitations/implications Prior research applying OIPT to BDAC emphasised resources or direct performance effects. This study adopts a process-oriented view, framing BDAC as an enabler of VCMs such as transparency, predictive modelling and adaptation. By testing the mediating role of VCMs, it validates BDAC interdependence and distinguishes functional versus symbolic outcomes, extending OIPT-BDAC understanding. The study’s sample is limited to Chinese firms, restricting generalisability. Additionally, cross-sectional data limit insights into the long-term impact of strategic business value. Practical implications The study shows that investing in analytics talent and innovating value-creating mechanisms produces the largest returns from BDAC; technology investments alone are insufficient.
Managers should operationalise value-creation mechanisms – data transparency, predictive modelling and proactive adaptation – and track both functional and symbolic KPIs. Originality/value This study advances the understanding of BDAC by positioning it as an essential enabler of value-creating processes rather than a direct source of value. Drawing on organisational information processing theory (OIPT), the research highlights how BDAC supports value-creation processes – data transparency, predictive modelling and proactive adaptation – that improve a firm’s ability to respond in complex environments. Additionally, the study offers practical guidance for firms adopting big data analytics, stressing that the strategic value of BDAC lies in addressing organisational information processing needs to reduce uncertainty and improve decision-making outcomes.
- New
- Research Article
- 10.3390/admsci15120478
- Dec 8, 2025
- Administrative Sciences
- Umesh Bamel
Big data technologies have greatly enhanced the effectiveness of humanitarian logistics. However, most research in this area has focused on developed countries, with limited application to emerging economies. This study aims to address that gap by systematically reviewing global literature to broaden the understanding of big data-driven humanitarian supply chain management in developing countries. We analysed a collection of 64 scholarly articles using bibliometric techniques. The findings indicate that research in this field is experiencing exponential growth. The conceptual structure of the literature identifies six major themes: (1) big data and humanitarian logistics (motor theme), (2) digital technologies (a transitional theme evolving from foundational to central), (3) humanitarian supply chains (base theme), (4) emergency logistics (emerging theme), (5) blockchain technology, and (6) sustainability in humanitarian supply chains. This paper discusses both theoretical and practical implications relevant to emerging economies. By contextualising global knowledge for developing countries, we can enhance the legitimacy and applicability of big data-based humanitarian supply chain management research.
- New
- Research Article
- 10.1093/nar/gkaf1172
- Dec 8, 2025
- Nucleic acids research
The National Genomics Data Center (NGDC), as part of the China National Center for Bioinformation (CNCB), provides a suite of database resources for worldwide researchers. As multi-omics big data and artificial intelligence reshape the paradigm of biology research, CNCB-NGDC continuously updates its database resources to enhance data usability, foster knowledge discovery, and support data-driven innovative research. Over the past year, notable progress has been achieved in expanding the scope of high-quality multi-omics datasets, building new database resources, and optimizing extant core resources. Notably, the launch of BIG Search enables cross-database search services for large-scale biological data platforms, including NGDC, National Center for Biotechnology Information (NCBI), and European Bioinformatics Institute (EBI). Additionally, several new resources have been developed, covering genome and variation (Hiland Resource, TOAnnoPriDB), expression (TEDD), single-cell omics (PreDigs, scMultiModalMap, TE-SCALE), radiomics (TonguExpert), health and disease (CAVDdb, IDP, MTB-KB, ResMicroDb), biodiversity and biosynthesis (SugarcaneOmics), as well as research tools (Dingent, miMatch, OmniExtract, RDBSB, xMarkerFinder). All these resources and services are freely accessible at https://ngdc.cncb.ac.cn.
- New
- Research Article
- 10.55681/jige.v6i4.4813
- Dec 7, 2025
- Jurnal Ilmiah Global Education
- Annisa Adinda Putri Harahap + 1 more
This study aims to explore the urgency of character education in the Industrial Revolution 4.0 era, characterized by the rapid development of digital technologies such as artificial intelligence, the Internet of Things (IoT), and big data. These changes provide significant opportunities for educational innovation, but also give rise to new challenges, such as the degradation of students' character and the dominance of technical skills over moral values. Using a qualitative approach based on literature study methods, this study analyzes theories, concepts, and research findings related to character education in the global and Indonesian contexts. The results show that character education is a key element in preparing the younger generation to face the social and ethical challenges presented by technology. Integrating values such as honesty, responsibility, empathy, and digital ethics into the curriculum is crucial for shaping individuals who are not only technically skilled but also possess a solid moral foundation. Character education acts as a moral bulwark that helps students overcome the negative impacts of technology, such as digital addiction and data misuse. In conclusion, character education should not be merely an addition to the curriculum, but must become a central pillar in shaping a generation ready to compete globally while upholding noble values. In the face of rapid change, character education must be strategically designed to strengthen individual moral identity and promote social harmony in the digital era.
- New
- Research Article
- 10.1007/s41060-025-00903-y
- Dec 7, 2025
- International Journal of Data Science and Analytics
- Jeff Caponero + 2 more
Beyond significance: why the d-value matters more in big data contexts
- New
- Research Article
- 10.9734/ajrcos/2025/v18i12793
- Dec 6, 2025
- Asian Journal of Research in Computer Science
- Maksim Romanchuk
Subject: The subject of this article is the analysis of the impact of exponential growth in data volume (up to petabytes and exabytes) and variety (Big Data) on data management architectures and methodologies. Aims: The objective is to identify the challenges in processing and integrating large volumes of heterogeneous data and to conduct a comparative analysis of modern approaches. Methodology: The methodology employs systematization, generalization, and comparative analysis of architectures (NoSQL, Data Lake, Hadoop, Spark, Flink) and methodologies (Agile, DevOps, Data Governance, Data Mesh, Data Fabric). Results: This manuscript focuses on a pivotal topic in Big Data management, exploring the interplay between data growth, architectures, and methodologies. Results indicate that traditional relational DBMS (Database Management Systems) exhibit significant limitations in horizontal scalability and unstructured data processing, whereas NoSQL solutions (document, columnar, etc.) offer the schema flexibility and scalability required for Big Data. Distributed systems, such as Spark and Flink, provide orders of magnitude higher performance for analytical and streaming tasks compared to traditional approaches. The study underscores the critical interconnection between architecture selection (e.g., Data Lake for flexibility) and methodology adaptation (e.g., DataOps for speed, Data Governance for quality control) for effective data integration and management. The scope of application includes the design of data management systems and the selection of optimal technology combinations (e.g., ELT instead of ETL in Data Lakes) for analytics. Its systematic comparison of key technologies and frameworks addresses a gap in literature that often treats these elements separately. Real-world case studies enhance practical relevance, offering valuable guidance for practitioners. 
It contributes meaningfully to the scientific community by synthesizing selection criteria for effective Big Data systems. A conclusion is drawn regarding the necessity of an integrated approach that combines horizontally scalable architectures, modern processing tools, and flexible yet governed methodologies for successfully handling Big Data.
- New
- Research Article
- 10.1108/qrfm-03-2025-0067
- Dec 5, 2025
- Qualitative Research in Financial Markets
- Umar Kayani + 4 more
Purpose This study aims to undertake a comprehensive and in-depth review of published research studies spanning from January 2001 to December 2023, focusing on big data analytics–based research across supply chain management, logistics management and inventory management. Design/methodology/approach The examination delves into the conceptual framework, research methodologies and big data analytics techniques, uncovering the original contributions of esteemed authors. In the dynamic landscape of supply chain management, the integration of big data analytics represents a transformative force, offering unparalleled insights and decision-making capabilities for businesses seeking to advance to the next level. This study’s scope extends to elucidating how big data analytics augments performance, mobility and integrity within the supply chain in a timely manner. Findings The findings from the review not only illuminate existing research gaps but also propose strategies for expediting big data analytics research and fostering its widespread adoption. The implications of big data analytics in the supply chain management, logistics management and inventory management domains are explored, with an emphasis on its potential benefits, unexplored best practices and diverse applications across sectors. Critical factors for effective big data analytics implementation, such as collaborative stakeholder training, standardized information flow and the reduction of redundant data, are identified as pivotal components of success. Originality/value This study underscores the imperative of meticulously examining fundamental questions surrounding big data generation and, in a fresh attempt, delves into the complexities of data inheritance within increasingly intricate supply chain facilities.
- New
- Research Article
- 10.1093/aje/kwaf268
- Dec 5, 2025
- American journal of epidemiology
- Jacqueline E Rudolph + 8 more
A challenge to research in big data is the inherent computational intensity of analyses, particularly when using rigorous methods to address biases. We demonstrate the use of sampling methods in big data to estimate parameters using fewer resources. Our motivating question was whether lung cancer incidence differs by baseline HIV status, using a cohort of nearly 30 million Medicaid beneficiaries. We targeted three parameters (with listed estimator): incidence rate ratio (IRR, Poisson model), hazard ratio (HR, Cox model), and risk ratio (RR, Kaplan-Meier). We controlled for confounders using inverse probability weighting. We ran analyses using the full sample and several sampling schemes: divide-and-recombine (10, 20, 50 samples), sub-cohort, and case-cohort. We compared point estimates, standard errors, computation time, and memory used. We observed 1113 incident lung cancer diagnoses among 180,980 beneficiaries with HIV and 33,106 diagnoses among 29,179,940 beneficiaries without HIV. Findings were similar across target parameters. The sub-cohort and case-cohort approaches had estimates closer to the full sample and were faster and less memory-intensive than divide-and-recombine, especially when estimating the RR. Including non-sampled cases in the case-cohort resulted in increases in computation time and memory relative to the sub-cohort approach.
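The divide-and-recombine idea described above can be illustrated with a toy sketch; here the estimated parameter is a simple mean on synthetic data rather than the paper's weighted IRR/HR/RR estimators, and all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
cohort = rng.exponential(scale=2.0, size=1_000_000)  # stand-in for a big cohort

def divide_and_recombine(data, n_splits=10):
    """Estimate a parameter (here just the mean) on disjoint subsamples,
    then recombine by averaging the per-split estimates."""
    splits = np.array_split(data, n_splits)
    estimates = np.array([s.mean() for s in splits])
    combined = estimates.mean()
    # standard error of the combined estimate from between-split variability
    se = estimates.std(ddof=1) / np.sqrt(n_splits)
    return combined, se

est, se = divide_and_recombine(cohort, n_splits=10)
full_sample = cohort.mean()  # full-sample reference estimate
```

The practical trade-off the paper measures is that each split (or a sub-cohort/case-cohort sample) fits in memory and runs quickly, at the cost of some precision relative to the full-sample fit.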
- New
- Research Article
- 10.1152/function.019.2025
- Dec 5, 2025
- Function
- Jeremy W Prokop + 30 more
The quantity of physiological data has grown exponentially, yielding insights into mechanisms of phenotypic and disease pathways. Among the powerful tools for physiological omics is the study of RNA, where broad sequencing of RNA leads to hypothesis generation and testing while providing observational discovery. Emphasis has been placed on RNA molecules that code for proteins, even though they represent a minority of total RNA. Diverse sequencing methods have rapidly expanded the identification of non-protein-coding molecules, including nonsense-mediated decay (NMD) and long non-coding RNAs (lncRNA), which now represent the most diverse class of RNA. Increasing attention needs to be paid to the data processing of RNA sequencing to interpret transcript-level mapping data in the context of protein biology, as many protein-coding genes have diverse noncoding transcripts. Over the past several years, single-cell and spatial transcriptomics have yielded unprecedented insights into cellular, tissue, and organ physiology. Building on these advancements, bulk RNA sequencing tools have begun producing robust deconvolution methods that enhance the analysis of human genes, the detection of foreign RNA from bacteria and viruses, and provide deep insights into complex immunological events, such as B- and T-cell recombination. Over a million RNA sequencing datasets have been generated, providing resources for data scientists to reprocess data and expand larger databases. From model organisms to complex human diseases, RNA sequencing resources continue to transform our knowledge of the complexity of personalized disease insights. Observational science is at the core of physiology, and the growth of RNA sequencing represents a significant tool for physiologists.
- New
- Research Article
- 10.3390/nu17233808
- Dec 4, 2025
- Nutrients
- Szilvia Racz + 10 more
Background/Objectives: Big data analysis has revolutionized medical research, making it possible to analyze vast amounts of data and gain valuable insights that were previously impossible to obtain. Our knowledge of the characteristics of vitamin D sufficiency is primarily based on data from a limited number of observations, generally spanning a few years at most. Methods: Here at the Medical Faculty of the University of Debrecen, the big data approach has allowed us to analyze trends in vitamin D status using nearly 60,000 25-hydroxyvitamin D (25(OH)D) concentration results from 2000 onwards. Results: Apart from analyzing the well-known phenomenon of seasonality in 25(OH)D concentration, we observed a trend in test requests, which increased from a few hundred in 2000 to almost 10,000 in 2020. Of particular interest is the change in the gender gap in test requests. In previous years, test requests were primarily from women, but by the end of the analysis period, a significant number of requests were from men as well. Since the data set includes all age groups, we analyzed 25(OH)D concentration for incremental age sets of five years, from a few months to 100 years old. The prevalence of vitamin D insufficiency (<75 nmol/L) was clearly demarcated among various years of observation, age groups, sexes, and seasons. Our data were particularly valuable for analyzing the effect of the methodology used for 25(OH)D determination. Three different methodologies were used during the study period, and a clear, statistically significant bias was observed. Conclusions: Our results clearly demonstrate the effect of the methodology used to determine 25(OH)D concentrations on vitamin D status, explicitly highlighting the urgent need to standardize the various platforms used to measure this important analyte, given the consequences for public health.
- New
- Research Article
- 10.3390/su172310855
- Dec 4, 2025
- Sustainability
- Wei Xun + 4 more
Ensuring product quality and safety is fundamental to sustainable production and consumption. With the rapid advancement of digital technologies such as blockchain and big data, quality and safety traceability systems have become essential tools to enhance transparency, accountability, and governance efficiency across supply chains. The sustainable functioning of these systems, however, depends on the coordinated actions of multiple stakeholders—including governments, enterprises, consumers, and industry associations—making the study of technological and institutional interactions particularly significant. This paper extends evolutionary game theory to the context of technology-enabled sustainable governance by constructing a tripartite game model involving government regulators, traceability enterprises, and consumers from both technological and institutional perspectives. Unlike existing studies, which focused solely on government regulation, this research explicitly incorporates the role of industry associations in shaping stakeholder behavior and integrates consumer rights protection mechanisms as well as the adoption of emerging technologies such as blockchain into the model. Analytical derivations and MATLAB-based simulations reveal that strengthening reward–penalty mechanisms and improving digital maturity significantly enhance enterprises’ incentives for truthful information disclosure; consumers’ verification and reporting behaviors generate bottom-up pressure that encourages stricter governmental supervision; and active participation of industry associations helps share regulatory costs and stabilize cooperative equilibria. These findings suggest that combining technological innovation with institutional collaboration not only improves transparency and strengthens consumer trust but also reshapes the incentive structures underlying traceability governance. 
The study provides new insights into how multi-stakeholder coordination and technological adoption jointly foster transparent, credible, and resilient traceability systems, offering practical implications for advancing digital transformation and co-governance in sustainable supply chains.
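Tripartite evolutionary-game models of this kind are typically simulated with replicator dynamics; the sketch below is a generic illustration in Python rather than the paper's MATLAB model, and every payoff coefficient is a hypothetical placeholder:

```python
# Replicator dynamics for a tripartite game: each coordinate is the share of
# a population playing its "cooperative" strategy.
def replicator_step(state, dt=0.01):
    x, y, z = state  # government: strict oversight; enterprise: truthful
                     # disclosure; consumer: active verification
    # Hypothetical expected-payoff advantages of cooperating (illustrative
    # placeholders, NOT parameters from the paper):
    dx = x * (1 - x) * (0.4 * y + 0.3 * z - 0.2)
    dy = y * (1 - y) * (0.5 * x + 0.2 * z - 0.3)
    dz = z * (1 - z) * (0.3 * y - 0.1)
    return (x + dt * dx, y + dt * dy, z + dt * dz)

state = (0.5, 0.5, 0.5)
for _ in range(20_000):  # forward-Euler integration to t = 200
    state = replicator_step(state)
```

With these placeholder payoffs, mutual reinforcement drives all three populations toward full cooperation, mirroring the paper's finding that reward-penalty mechanisms and consumer verification can stabilize a cooperative equilibrium.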
- New
- Research Article
- 10.14254/jems.2025.10-2.7
- Dec 4, 2025
- Economics, Management and Sustainability
- Dedy Christelle Sekadjie + 1 more
Purpose. This study aims to assess the impact of Industry 4.0 technologies - specifically Big Data, Internet of Things (IoT), collaborative robots, and Cyber-Physical Systems (CPS) - on the financial performance of manufacturing companies in Cameroon, addressing the research gap in the Sub-Saharan context. Methodology. Adopting a quantitative approach, primary data were collected via questionnaires from 104 manufacturing firms. The study employed Chi-square tests and binary logistic regression to analyse the relationship between technological adoption and key performance indicators, including Return on Assets (ROA), Return on Equity (ROE), turnover, and productivity. Results. The empirical findings indicate that integrating Big Data and IoT has a statistically significant positive effect on all measured financial indicators. Collaborative robots positively impact turnover, whereas Cyber-Physical Systems showed no significant correlation with financial performance in the studied context. The theoretical contribution. This research extends economic production theory to developing economies. It provides empirical evidence that digital transformation serves as a critical production input, significantly enhancing firm output and challenging the “IT productivity paradox” in African manufacturing sectors. Practical implications. The study suggests that manufacturing leaders in developing regions should prioritise investments in Big Data and IoT for immediate efficiency gains. Furthermore, it advocates for government-led subsidy policies to lower entry barriers for automation and foster international competitiveness. Sustainable Development Goals (SDGs): SDG 8: Decent Work and Economic Growth; SDG 9: Industry, Innovation and Infrastructure
- New
- Research Article
- 10.1080/20964471.2025.2583505
- Dec 4, 2025
- Big Earth Data
- Carrie C Wall + 33 more
Big data, sound science, lasting impact: A framework for passive acoustic monitoring
- New
- Research Article
- 10.1038/s41893-025-01718-2
- Dec 4, 2025
- Nature Sustainability
- Han-Lin Cui + 9 more
Big data integration for environmental risk assessment of emerging contaminants
- New
- Research Article
- 10.34190/icair.5.1.4324
- Dec 4, 2025
- International Conference on AI Research
- Zuhair Abbas + 2 more
Despite extensive research on Industry 4.0 and the circular economy (CE), there is limited research on the nexus between artificial intelligence (AI) and the circular economy. AI appears to be a driving force for revolutionizing businesses and industries, unlocking economic, environmental, and social benefits. We investigated how AI transforms linear business models into circular ones. The authors selected 105 peer-reviewed articles from the Web of Science database using bibliometric analysis and analyzed the data with the VOSviewer software. Four core clusters were identified: (1) the circular economy as a pathway to sustainable business management, (2) big data models that enhance CE outcomes, (3) I4.0 technologies leading to the future of CE, and (4) digitalized supply chains for sustainable development. This review advances our understanding of AI and the circular economy, an area given little focus by prior scholars. More importantly, it contributes to shifting the focus to a technological perspective within the circular economy, diverging from the traditional linear model based on economic views. The review suggests that companies should adopt AI, as it plays a pivotal role in facilitating the shift to a circular economy by reshaping and enhancing existing models of product design, manufacturing, consumption, repair, regeneration, recovery, and end-of-life management while simultaneously improving the efficiency of waste management. This research provides fresh perspectives on AI and CE, framing AI as an opportunity rather than a cost amid the ongoing fourth industrial revolution.
- New
- Research Article
- 10.51584/ijrias.2025.101100015
- Dec 3, 2025
- International Journal of Research and Innovation in Applied Science
- Ayanru, O.A + 2 more
The fast merging of Industry 4.0 (I4.0) technologies with Renewable Energy Systems (RES) is transforming global energy infrastructure into digitalized, effective, and sustainable energy operations. This paper provides a comprehensive review based on the PRISMA framework to explore the role, challenges, and opportunities of adopting I4.0 technologies, including the Internet of Things (IoT), Artificial Intelligence (AI), Big Data analytics, Blockchain, and Cyber-Physical Systems (CPS), in renewable energy applications. The review includes peer-reviewed articles published from 2015 to 2025 in academic databases such as Scopus, IEEE Xplore, ScienceDirect, and Springer. The results indicate that I4.0 adoption improves the effectiveness of renewable energy forecasting, predictive maintenance, smart grid management, and data-driven decision-making, resulting in increased operational efficiency and reduced emissions. Nevertheless, long-standing obstacles such as cybersecurity weaknesses, interoperability issues, infrastructure constraints, and policy mismatches have impeded mass adoption, especially in developing economies. The paper presents open research directions focusing on AI-based optimization, secure IoT design, validation of digital twins, standardised data frameworks, and context-specific policy models. In general, this study highlights the transformative role of I4.0 in accelerating the global energy transition and meeting the United Nations Sustainable Development Goals (SDGs) by developing sustainable, intelligent, and resilient energy systems.
- New
- Research Article
- 10.54254/2755-2721/2025.30231
- Dec 3, 2025
- Applied and Computational Engineering
- Lv Chongxiao + 2 more
With the ever-increasing demands for intelligent and green development in ports, traditional ore terminals are facing numerous challenges in operational efficiency, safety management, and environmental protection. Focusing on the full-scenario operational requirements of ore terminals, this paper develops and applies an intelligent, integrated management and control platform that incorporates intelligent scheduling, remote operation, safety early warning, and coordinated environmental protection. Based on technologies such as big data, cloud computing, 5G communication, and 3D visualization, the platform achieves intelligent, end-to-end management and control of the entire process, from ship unloading, stacking and reclaiming, and truck loading to environmental management. Practical application at the Cangzhou Huanghua Port Ore Terminal demonstrates the platform's significant effectiveness in enhancing operational efficiency, reducing operating costs, and improving environmental management capabilities, highlighting its strong potential for broader promotion.
- New
- Research Article
- 10.1002/smll.202506036
- Dec 3, 2025
- Small (Weinheim an der Bergstrasse, Germany)
- Jiaxin Yang + 6 more
As an emerging interdisciplinary technology, biosensors hold significant potential in the medical field. Among them, wearable and implantable intraocular pressure (IOP) biosensors have seen significant progress. Abnormal IOP is associated with numerous diseases, particularly glaucoma, which imposes a heavy health and economic burden. IOP biosensors enable continuous monitoring, which is difficult to achieve with conventional approaches. This review summarizes recent advancements in IOP biosensors, primarily over the last 5 years, and evaluates their potential for overcoming clinical translation challenges in continuous monitoring. Sensors based on various principles, including piezoresistive, electrical, optical, and microfluidic, are often placed at various anatomical sites such as the cornea, anterior chamber, and other intraocular regions to track IOP effectively. Improvements in smart IOP biosensors in fabrication, power supply, filtering, and anti-interference are discussed, with particular focus on optimization across data measurement, transmission, reading, processing, and user application. The limitations of existing equipment and research in clinical translation are elaborated; a balance between engineering and clinical aspects remains a critical issue. Furthermore, the growth of artificial intelligence and big data technologies is expected to provide a new path for IOP biosensors.
- New
- Research Article
- 10.32722/account.v12i2.7854
- Dec 2, 2025
- account
- Cindy Milasari Sitanggang + 3 more
ABSTRACT The rapid advancement of digital technology has encouraged the adoption of Big Data Analytics (BDA) in various fields, including accounting and finance. This study aims to examine the utilization of BDA in fraud detection and financial performance prediction based on recent literature reviews. The research was conducted by analyzing academic articles and prior studies published since 2021. The findings indicate that BDA significantly contributes to detecting potential fraud by identifying complex data patterns that traditional methods often fail to capture. Moreover, BDA has proven effective in improving the accuracy of financial performance predictions by incorporating broader and real-time variables. This study concludes that BDA is a relevant and adaptive solution to address the challenges of fraud detection and financial forecasting in the big data era, while also providing opportunities for further research in accounting and information systems. Keywords: Big Data Analytics, fraud, financial performance