Big Data Analytics Application in the Indian Insurance Sector
Introduction: Big data in the insurance industry can be defined as structured or unstructured data that can affect rating, marketing, pricing, or underwriting. The five Vs of big data provide insurers with a valuable framework for converting raw data into actionable information: (1) Volume: the need to look at the type of data and the internal systems; (2) Velocity: the speed at which big data is generated, collected, and refreshed; (3) Variety: both structured and unstructured data; (4) Veracity: trustworthiness of, and confidence in, the data; and (5) Value: whether the data collected are good or bad. Purpose: Insurance companies face many data challenges. However, the management of big data has allowed insurers to recognise the demands of their customers and develop more personalised products. In addition, it can be used to make correct decisions about insurance operations such as risk selection and pricing. Methodology: We address this by conducting a systematic literature review on big data, with an emphasis on gathering information on the five Vs of big data and the insurance market, specifically how big data can support data-driven decisions. Findings: Big data technology has created an endless series of opportunities, which has ensured a surge in its usage. It has helped businesses make their processes more systematic and cost-effective, and has aided fraud reduction and risk prediction.
- Research Article
16
- 10.1109/tem.2022.3202871
- Jan 1, 2024
- IEEE Transactions on Engineering Management
Big data analytics (BDA) is an advanced analytic technique used with very large and diverse sets of data from different sources. Natural language processing (NLP) is a technology that interfaces with fields such as computer science, linguistics, and human-computer interaction. Over the past few years, a growing number of firms have been using different BDA and NLP applications in their businesses, yet only a few studies have investigated the different dimensions of NLP and BDA and their impact on overall organizational performance. There is a growing interest among researchers and practitioners in understanding the consequences for firms that adopt BDA and NLP applications. In this context, the aim of this article is to determine the factors driving the usage of BDA and NLP applications in business. With the help of dynamic capability view theory and the existing literature, a theoretical model was developed conceptually. The model was then validated using a structural equation modeling approach on 1287 samples from 23 firms, primarily based in Asia and Europe, that use NLP and BDA applications. The article finds that NLP and BDA applications help firms improve their operational efficiency, which in turn improves overall firm performance.
- Conference Article
34
- 10.1109/ctceec.2017.8454999
- Sep 1, 2017
The volume of data in the world is growing very fast, generated from a variety of sources such as social media, sensors, the airline industry, and scientific data in different formats. The biggest challenge is how to infer meaningful insights from such varied and big data, alongside the concerns of storing and managing fast-growing data. The size of the databases used in today's enterprises has been growing at exponential rates, so industry's need to quickly process and analyze big data volumes for business decision making and customer insight has also grown exponentially. Data pouring in from various sources may be structured or unstructured in nature. Structured data refers to relatively well-organized information that can be inserted into a traditional RDBMS; traditional RDBMSs support efficient, easy querying through straightforward search algorithms or SQL. In contrast, unstructured data is information that does not come in a pre-defined format or well-organized storage model and cannot be stored well in relational tables. It is assumed to be the fastest-growing type of data, e.g., images, sensor data, web chats, social networking messages, video, documents, log files, and email. Many techniques and software tools are available that can process and efficiently store unstructured data and help organizations perform analytics on it. The variety and unordered nature of such data make storage and processing a time- and resource-consuming affair. Advances in technology have opened the floodgates to huge volumes of unstructured data. Multimedia data, which spans the Internet, is one example of unstructured big data and requires high execution capability to extract useful information.
Rapid processing of multimedia data such as video is important for, e.g., criminal investigations, surveillance monitoring, news analysis, sports analytics, and emotion extraction. Hence, analyzing multimedia data within a minimal timeframe is one of the latest research areas. We have therefore researched techniques for analyzing unstructured data to extract the meaningful information hidden in big data. In addition, we describe various techniques and software used to manage and process unstructured big data efficiently and to improve the performance of complex analyses.
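The unstructured-to-structured extraction this entry describes can be illustrated with a minimal sketch: pattern-matching over log lines (one of the unstructured sources the authors list) to produce structured records that a traditional analytics pipeline could then query. The log format and field names are assumptions for illustration, not from the paper.

```python
import re

# Hypothetical log lines: unstructured text to be converted into
# structured records. The format is an assumption for this sketch.
LOG_LINES = [
    "2017-09-01 10:15:02 ERROR payment-service timeout after 3000ms",
    "2017-09-01 10:15:07 INFO auth-service login ok user=alice",
    "malformed line without a recognizable pattern",
]

LOG_PATTERN = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}) (?P<time>\d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) (?P<service>\S+) (?P<message>.*)"
)

def parse_logs(lines):
    """Turn unstructured log text into structured dicts; skip lines
    that do not match, a common reality with unstructured sources."""
    records = []
    for line in lines:
        match = LOG_PATTERN.match(line)
        if match:
            records.append(match.groupdict())
    return records

records = parse_logs(LOG_LINES)
print(len(records))          # 2 of the 3 lines are parseable
print(records[0]["level"])   # ERROR
```

Real unstructured sources (video, images, chats) need far heavier machinery, but the shape of the task is the same: impose a schema on data that arrives without one.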
- Research Article
- 10.62381/acs.sdit2024.64
- Jan 1, 2024
- Academic Conferences Series
This paper discusses the application and challenges of big data analysis in risk warning for global financial markets. With the development of information technology, big data analysis is playing an increasingly important role in financial risk management, helping financial institutions improve the accuracy and efficiency of risk prediction. Through the integration of multi-source data such as market data, customer data, and social media information, big data technology can provide more comprehensive data support and realize accurate early warning of credit, market, and operational risks. However, this process also faces challenges such as data quality issues, privacy protection, and technology costs. To cope with these problems, it is necessary to strengthen laws and regulations, improve the technical level, and promote cross-industry cooperation to share data resources and joint risk-control models. In short, despite the challenges, big data analytics has great potential to improve the stability of global financial markets.
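As a toy illustration of the early-warning idea, the sketch below flags unusually large moves in a return series via a z-score threshold. Real systems fuse market, customer, and social-media data; the series, threshold, and method here are illustrative assumptions, not the paper's model.

```python
import statistics

# Illustrative daily returns (percent); the large negative value is the
# kind of anomaly an early-warning system should surface.
returns = [0.2, -0.1, 0.3, 0.1, -0.2, 0.15, -3.5, 0.05]

def warn(series, threshold=2.0):
    """Return indices whose z-score exceeds the threshold.
    A deliberately simple stand-in for a risk early-warning rule."""
    mean = statistics.mean(series)
    sd = statistics.stdev(series)
    return [i for i, r in enumerate(series) if abs(r - mean) / sd > threshold]

print(warn(returns))  # flags the -3.5% day at index 6
```

A production system would replace the z-score rule with models trained on the integrated multi-source data the paper describes, but the interface — series in, flagged observations out — is the same.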
- Book Chapter
2
- 10.5772/intechopen.111473
- Dec 13, 2023
With the emergence of Big Data Technologies (BDT) and the growing application of Big Data Analytics (BDA), Supply Chain Management (SCM) researchers increasingly utilize BDA due to the opportunities that BDT and BDA present. Supply Chain (SC) data is inherently complex and results in an environment with high uncertainty, which presents a real challenge for SC decision-makers. This research study aimed to investigate and illustrate the application of BDA within the existing decision-making process. BDT allowed for the extraction and processing of SC data. BDA aided further understanding of SC inefficiencies and delivered valuable, actionable insights by validating the existence of the SC bullwhip phenomenon and its contributing factors. Furthermore, BDA enabled the pragmatic evaluation of linear and nonlinear regression SC relationships by applying machine learning techniques such as Principal Component Analysis (PCA) and multivariable regression analysis. Moreover, applying more sophisticated BDA time series and forecasting techniques such as SARIMAX, TBATS, and neural networks improved forecasting accuracy. Ultimately, the improved demand planning and forecast accuracy will reduce SC uncertainty and the effects of the observed SC bullwhip phenomenon, thus creating a competitive advantage for all the members within the SC value chain.
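The forecast-accuracy comparison the study performs with SARIMAX, TBATS, and neural networks can be sketched at miniature scale: score two simple baselines on held-out demand using MAPE. The series and methods below are illustrative assumptions, not the study's data or models.

```python
def mape(actual, forecast):
    """Mean absolute percentage error (%), a common forecast-accuracy metric."""
    return sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual) * 100

def naive_forecast(history, horizon):
    """Baseline: repeat the last observed value."""
    return [history[-1]] * horizon

def moving_average_forecast(history, horizon, window=3):
    """Baseline: repeat the mean of the last `window` observations."""
    return [sum(history[-window:]) / window] * horizon

demand = [100, 104, 98, 107, 110, 103, 112, 115]  # invented demand series
train, test = demand[:5], demand[5:]

naive_err = mape(test, naive_forecast(train, len(test)))
ma_err = mape(test, moving_average_forecast(train, len(test)))
print(f"naive: {naive_err:.2f}%  moving average: {ma_err:.2f}%")
```

The study's point is exactly this comparison at scale: candidate models are ranked by held-out accuracy, and the winner drives demand planning to dampen the bullwhip effect.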
- Research Article
3
- 10.1080/07366981.2021.1958736
- Aug 11, 2021
- EDPACS
These days, Big Data (BD) and Big Data Analytics (BDA) applications have increased rapidly among public and private organisations. Most organisations are aware that BDA has enormous potential in helping them better understand their business environments and their customers' needs. Nevertheless, many organisations have yet to implement BD, as they are concerned that poor data quality will have an adverse impact on establishing valuable insight and may lead to severe mistakes in their decision-making process. In addition, different BD characteristics or traits can affect data quality. Therefore, to determine the value of data generated from BD, the collected data must be analysed for accuracy and quality. This paper aims to present findings that better explain the quality requirements for BDA implementation in the public sector, specifically in Malaysia. The study explored the influence of Data Quality Dimensions (DQD) on BDA application, identified the influence of Big Data Traits (BDT) on DQD, and evaluated the integration of BDT and DQD in BDA applications using an expert validation approach. A conceptual model that incorporates DQD and BDT for BDA application in the public sector was proposed as the study outcome. The conceptual model was developed based on eight BDT (variety, velocity, veracity, validity, volume, value, volatility, and variability) and four data quality categories (intrinsic, contextual, representational, and accessibility). The expert validation results showed that five of the eight BDT are important. The outcomes of this study deliver important knowledge to the current body of research that may prove useful in the future.
- Research Article
59
- 10.1111/ajt.14099
- Jan 4, 2017
- American Journal of Transplantation
Big Data, Predictive Analytics, and Quality Improvement in Kidney Transplantation: A Proof of Concept.
- Research Article
17
- 10.5121/ijscai.2016.5104
- Feb 29, 2016
- International Journal on Soft Computing, Artificial Intelligence and Applications
Big Data refers to huge volumes of both structured and unstructured data that are so large they are hard to process using current/traditional database tools and software technologies. The goal of big data storage management is to ensure a high level of data quality and availability for business intelligence and big data analytics applications. The graph database is not yet the most popular NoSQL database compared to the relational database, but it is a powerful NoSQL database that can handle large volumes of data very efficiently. It is very difficult to manage large volumes of data using traditional technology, and data retrieval time may grow as database size increases; NoSQL databases are available as a solution to this. This paper describes what big data storage management is, the dimensions of big data, types of data, structured versus unstructured data, NoSQL databases and their types, the basic structure of the graph database, its advantages, disadvantages, and application areas, and a comparison of various graph databases.
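The relationship-centric model that makes graph databases efficient can be sketched with a toy adjacency structure and a breadth-first traversal — the kind of multi-hop query a graph database answers natively, without relational joins. The nodes and edges are invented for illustration; a real graph database (e.g. Neo4j) adds persistence, indexing, and a query language on top of this idea.

```python
from collections import deque

# Toy graph: each node maps to the nodes it points at, mirroring the
# adjacency structure a graph database stores natively.
EDGES = {
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": ["dave", "erin"],
    "dave": [],
    "erin": [],
}

def hops(graph, start, goal):
    """Breadth-first search: the minimum number of relationship hops
    between two nodes, or None if the goal is unreachable."""
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        node, depth = queue.popleft()
        if node == goal:
            return depth
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, depth + 1))
    return None

print(hops(EDGES, "alice", "dave"))  # 2
```

In a relational database the same query needs one self-join per hop, which is why retrieval time grows so quickly with database size for connected data — the motivation the paper gives for graph stores.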
- Research Article
3
- 10.54691/bcpbm.v38i.4256
- Mar 2, 2023
- BCP Business & Management
As big data technology becomes more and more mature, its applications in the finance industry are becoming more widespread. Big data technology can address some issues in the traditional credit business of commercial banks and expand the scope of business. This study focuses on the characteristics of big data analysis in the credit business of commercial banks and the corresponding risk-management strategies. Specifically, this paper summarizes current studies on the topic through careful analysis and offers advice for improving big data analysis in the credit business. According to the analysis, big data techniques can play a role in targeted marketing and the development of customized services based on customer profiles. Furthermore, big data applications can significantly reduce manual labor. In addition, big data technology helps in the early identification of risks and in risk control, since banks can know borrowers better through it. These results shed light on further exploration of implementing big data analysis to bring competitive advantages to commercial banks.
- Research Article
54
- 10.1007/s10098-020-02008-5
- Jan 22, 2021
- Clean Technologies and Environmental Policy
In the present era of Industry 4.0, organizations are transforming from traditional production systems to digital production systems. This transformation takes the form of additional deployment of technologies that lead to the digitization and integration of products and services, business processes, customers, etc. A high volume of unstructured data is being created across different processes due to digitization, which captures data including text, images, and multimedia across a multiplicity of platforms, e.g., machine-to-machine communications, sensor networks, cyber-physical systems, and the Internet of Things. Managing this huge data generated from different sources has become a challenging task. Big data analytics (BDA) may be helpful in managing this unstructured data for effective decision making and sustainable operations. Many organizations are struggling to integrate BDA with their manufacturing processes for sustainable operations, and the application of BDA from a sustainability perspective is not extensively researched in the current literature. Therefore, this study first explores the contribution of BDA to sustainable manufacturing operations. It then identifies strategic factors for the successful application of BDA in manufacturing for sustainable operations. For a detailed analysis of strategic factors in manufacturing, a hybrid approach comprising the analytic hierarchy process, fuzzy TOPSIS, and DEMATEL is used. Results revealed that development of a contract agreement among all stakeholders, engagement of top management, capability to handle big data, availability of quality and reliable data, and development of a team of knowledgeable and capable decision-makers emerged as major strategic factors for the application of BDA in the manufacturing sector for sustainable operations.
The major contribution of this study lies in analyzing BDA benefits for the manufacturing sector, identifying major strategic factors in implementation, and categorizing these factors into cause and effect groups. These findings may be used by managers as guidelines for the successful implementation of BDA across different functions in their respective organizations to achieve sustainable operations goals. The results of this study will also motivate industry professionals to integrate BDA with their manufacturing functions for effective decision making and sustainable operations.
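Of the hybrid toolkit used in the study, TOPSIS is the easiest to sketch compactly: normalize a decision matrix, apply criterion weights, and rank alternatives by closeness to the ideal solution. The factors, scores, and weights below are invented for illustration and are not the study's data (the study additionally uses AHP-derived weights, fuzzy judgments, and DEMATEL).

```python
import math

# Hypothetical decision matrix: rows = strategic factors (alternatives),
# columns = evaluation criteria; all criteria treated as benefit-type.
alternatives = ["top-management engagement", "data quality", "team capability"]
matrix = [
    [7, 8, 6],
    [9, 6, 7],
    [6, 7, 9],
]
weights = [0.5, 0.3, 0.2]  # assumed criterion weights (would come from AHP)

def topsis(matrix, weights):
    # 1. Vector-normalize each column, then apply criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(len(weights))]
    weighted = [[w * row[j] / norms[j] for j, w in enumerate(weights)]
                for row in matrix]
    # 2. Ideal and anti-ideal solutions (benefit criteria: max is ideal).
    ideal = [max(col) for col in zip(*weighted)]
    anti = [min(col) for col in zip(*weighted)]
    # 3. Closeness coefficient: distance to anti-ideal over total distance.
    return [math.dist(row, anti) / (math.dist(row, ideal) + math.dist(row, anti))
            for row in weighted]

scores = topsis(matrix, weights)
ranking = sorted(zip(alternatives, scores), key=lambda t: -t[1])
print(ranking[0][0])  # the top-ranked factor under these assumed inputs
```

Fuzzy TOPSIS replaces the crisp scores with triangular fuzzy numbers, but the normalize-weight-rank skeleton is the same.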
- Book Chapter
2
- 10.1201/9781003166702-12
- Sep 20, 2021
A huge amount of data is generated by each person due to advancements in technology and increased use of the Internet. Big data is the domain that deals with this data, which is difficult to handle because of its volume as well as its other characteristics, variety and velocity. Most people think that huge data simply means big data, which is not true: volume is not the only characteristic of big data; variety and velocity are also important. Nowadays, social media is an important part of life, and it is a major source of big data, containing unstructured data that includes text, images, videos, likes, etc. Dealing with big data is not an easy task, and its growth raises the need for big data analytics. Big data analysis is the process of analyzing big data to uncover the underlying patterns and information useful for making decisions; businesses do a great deal of data analytics before making any decision in order to increase profits. There are many applications for big data analytics, such as healthcare, satellites, flight safety, grid computing, and Quality of Experience (QoE) monitoring, but there are many issues and challenges as well, such as security, data diversity, legal hurdles, and processing frameworks. This chapter demystifies big data computing. It covers big data and big data analytics concepts, different applications of big data analytics, and the issues and challenges of big data analytics. One case study is also presented.
- Research Article
42
- 10.1016/j.jbi.2015.12.005
- Dec 17, 2015
- Journal of Biomedical Informatics
Unstructured medical image query using big data – An epilepsy case study
- Book Chapter
8
- 10.1108/978-1-83909-099-820201009
- Sep 30, 2020
The healthcare sector in India is witnessing phenomenal growth, such that by the year 2022, it will be a market worth trillions of INR. Increases in income levels, awareness regarding personal health, the occurrence of lifestyle diseases, better insurance policies, low-cost healthcare services, and the emergence of newer technologies like telemedicine are driving this sector to new heights. Abundant quantities of healthcare data are being accumulated each day, which are difficult to analyze using traditional statistical and analytical tools, calling for the application of Big Data Analytics in the healthcare sector. Through the provision of evidence-based decision-making and actions across healthcare networks, Big Data Analytics equips the sector with the ability to analyze a wide variety of data. Big Data Analytics includes both predictive and descriptive analytics. At present, about half of healthcare organizations have adopted an analytical approach to decision-making, while only a quarter of these firms are experienced in its application. This implies a lack of understanding in the healthcare sector of the value and the managerial, economic, and strategic impact of Big Data Analytics. In this context, this chapter on "Predictive Analytics in Healthcare" discusses sources, areas of application, possible future areas, and advantages and limitations of applying predictive Big Data Analytics in healthcare.
- Conference Article
- 10.2991/meita-15.2015.70
- Jan 1, 2015
The type and amount of data in human society are growing at amazing speed, driven by emerging new services such as cloud computing, the Internet of Things (IoT), and social networks; the era of big data has come. Data has evolved from a simple object of processing into a fundamental resource. In order to fully understand the connotation of big data, this paper expounds the concept of big data and, combining it with the application requirements of manufacturing big data, proposes a five-layer stacked big data processing framework; the key technologies of manufacturing big data applications are then discussed and analyzed.
- Book Chapter
1
- 10.1007/978-981-16-9012-9_49
- Jan 1, 2022
In today’s world, Big Data privacy has become one of the major and challenging concerns for business organizations that use Big Data for analytical operations. Big Data is described by its volume, variety, and velocity, and constitutes various types of data: structured, semi-structured, and unstructured. The major portion of the world's data is unstructured, i.e., it does not have a definite model and is specific to a domain; the major types of unstructured data are text, audio, and video. When these data are subjected to data analytics operations without the prior application of domain-specific privacy-preserving data publishing techniques, there is a chance of a privacy breach. Existing privacy-preserving approaches mainly operate on structured and semi-structured data. There is a significant need for an automated approach that protects the privacy of unstructured data, which usually exists in high volumes, while retaining its utility for efficient analytics operations. In this paper, an automated hybrid anonymization approach for unstructured data is proposed with backpropagation-based utility enhancement, which preserves the privacy of unstructured text data and also contributes to the utility retention of text data. Keywords: PPDP, NLP, SDL, QAUE
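By way of contrast with the paper's automated hybrid approach, a minimal rule-based masking pass shows the simplest form of privacy-preserving treatment of unstructured text — and why it is insufficient on its own: it misses identifiers, such as personal names, that follow no fixed pattern (which is why the paper turns to learned, NLP-based methods). The patterns and replacement tokens below are assumptions for illustration.

```python
import re

# Pattern-based masking of direct identifiers in free text. This is a
# deliberately naive sketch, NOT the paper's backpropagation-based
# hybrid approach; patterns and tokens are invented for illustration.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[- ]\d{3}[- ]\d{4}\b"),
}

def anonymize(text):
    """Replace each matched identifier with a placeholder token."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

sample = "Contact John at john.doe@example.com or 555-123-4567."
print(anonymize(sample))
```

Note that "John" survives the pass: masking names requires named-entity recognition, and deciding how much to mask without destroying analytic utility is exactly the privacy–utility trade-off the paper's utility-enhancement step addresses.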
- Research Article
1
- 10.11648/j.ajist.20210504.15
- Jan 1, 2021
- American Journal of Information Science and Technology
With the rapid changes in scientific technologies, the explosion of big data, and the current COVID-19 pandemic, the internal and external environments of traditional Chinese medicine (TCM) informatics are changing rapidly, and the development of TCM informatics faces numerous challenges such as digital transformation. To meet these challenges, there is a strong need to apply big data, big data analytics, and related technologies to TCM informatics and to incorporate data science education into TCM informatics programs. The purpose of this paper is to identify the key challenges facing TCM informatics programs and to discuss the opportunities for data science education in them. The key identified challenges are how to enable students to master the knowledge, skills, and competencies of big data analytics and technologies, and how to incorporate these, along with the key aspects of data science education, into TCM informatics curricula. The main opportunities are that big data and data science education can help meet the challenges facing TCM informatics; offer the analytics, technologies, and applications needed to improve its learning, teaching, and research outcomes and standards; and support the review, revision, and constant updating of its teaching contents and curriculum knowledge system. This paper will be useful to TCM informatics programs in their curriculum design and in applying big data analytics and technologies to teaching, learning, and research, and to faculty, students, researchers, and practitioners.