Comparing Bitcoin generators on the clear web and the dark web
Objective: This study examines Bitcoin generator (BG) websites on the clear and dark web, focusing on their prevalence, revenue, and associated warnings, as these sites are suspected scams.

Method: Data were gathered from the Dark Web Monitor and Iknaio Cryptoasset Analytics. A four-step process was used to identify BG sites and their Bitcoin addresses among 2 million dark websites.

Results: We found 832 dark web BG sites. Revenue per Bitcoin address is approximately one-third lower for a dark web BG site than for a clear web BG site. Revenue is concentrated at a few BG sites: on the dark web, the top three clusters of crypto addresses account for 35% of total revenue, while on the clear web the top three clusters account for 52%. Only 24% of Bitcoin addresses on dark web BG sites have ever received a deposit. The longer BG sites are online, the higher their revenue. There are hardly any warnings against BG sites.

Conclusion: Our results fit the Rational Choice model of crime: the revenue is modest, but the effort of the offenders is also limited.
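The concentration figures above (top three clusters holding 35% vs. 52% of revenue) come down to a simple aggregation over address-level deposits. A minimal sketch, using made-up numbers rather than the study's data, with a hypothetical `top_cluster_share` helper:

```python
from collections import defaultdict

def top_cluster_share(deposits, k=3):
    """Share of total revenue received by the k highest-earning clusters.

    deposits: list of (cluster_id, amount) pairs, one per incoming payment.
    """
    totals = defaultdict(float)
    for cluster, amount in deposits:
        totals[cluster] += amount
    total = sum(totals.values())
    if total == 0:
        return 0.0
    top = sorted(totals.values(), reverse=True)[:k]
    return sum(top) / total

# Hypothetical data: one dominant cluster and a long tail of small ones.
sample = [("a", 50), ("a", 20), ("b", 15), ("c", 10), ("d", 3), ("e", 2)]
print(round(top_cluster_share(sample), 2))  # -> 0.95
```

The same aggregation, run once per web (clear vs. dark), yields the two concentration percentages reported above.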
- News Article
- 10.1016/s0262-4079(18)30245-8
- Feb 1, 2018
- New Scientist
Bitcoin blows your cover on the dark web
- Book Chapter
2
- 10.1007/978-3-030-39303-8_27
- Jan 1, 2020
Dark Web sites are operated over anonymity-preserving protocols such as Tor, making users of Dark Web services more resilient to identification and monitoring. Although some previous work has focused on estimating the size of Dark Web services and investigating their criminal activities, there is a lack of research on chronological analysis and in-depth profiling of Dark Web sites, particularly in South Korea. In this study, we therefore implemented a Dark Web crawling system and used it to collect seed and sub-level Dark Web URLs. From the seed URLs, 3,000 Dark Web sites were selected and their web pages captured for profiling. An in-depth analysis of the 3,000 collected sites was then conducted, and they were intensively categorized on the basis of their major criminal activities. We then carried out in-depth profiling of the top three Korean Dark Web sites to investigate cybercriminal activity in South Korea, collecting and analyzing criminal activities from a chronological point of view. Personal information leakage and Sybil identities on the Dark Web were also identified based on the PGP keys we collected.
- Book Chapter
5
- 10.1007/978-3-030-21568-2_13
- Jan 1, 2019
We conducted a longitudinal study to analyze the misuse of Bitcoin. We first investigated usage characteristics of Bitcoin by analyzing how many addresses each address transacts with (from January 2009 to May 2018). To obtain a quantitative estimate of the malicious activity that Bitcoin is associated with, we collected over 2.3 million candidate Bitcoin addresses, harvested from the dark web between June 2016 and December 2017. The Bitcoin addresses found on the dark web were labeled with tags that classified the activities associated with the onions that these addresses were collected from. The tags covered a wide range of activities, from suspicious to outright malicious or illegal. Of these addresses, only 47,697 have tags we consider indicative of suspicious or malicious activities.
- Conference Article
15
- 10.1109/pdcat46702.2019.00095
- Dec 1, 2019
Cybercriminals make wide use of the dark web and its illegal functionalities. More than half of the criminal and terror activities conducted through the dark web involve cryptocurrency payments, the sale of human organs, red rooms, child pornography, arms deals, drug deals, assassins and hackers for hire, hacking software and malware, and so on. Law enforcement agencies such as the FBI, NSA, Interpol, Mossad, and FSB continually run surveillance programs on the dark web to trace criminals and terrorists and to stop crimes and terror activities. This paper is about dark web marketing and surveillance programs. It discusses how to access the dark web securely and how law enforcement agencies track down users exhibiting terror-related behaviour and activities. It also discusses dark web sites offering jihadist services and anonymous markets, including relevant safety precautions.
- Research Article
- 10.30693/smj.2022.11.10.46
- Nov 30, 2022
- Korean Institute of Smart Media
Today, due to the 4th industrial revolution and extensive R&D funding, domestic companies have come to possess world-class industrial technologies, which have grown into important assets. The national government designates companies' critical industrial technologies as "national core technologies" in order to protect them. In particular, technology leaks in the shipbuilding, display, and semiconductor industries can cause a significant loss of competitiveness not only at the company level but also at the national level. Every year there are more insider leaks, ransomware attacks, and attempts to steal industrial technology through industrial espionage, and the stolen technology is then traded covertly on the dark web. In this paper, we propose a system for detecting industrial technology leaks in the dark web environment. The proposed model first builds a database through dark web crawling, using information collected from the OSINT environment. Keywords related to industrial technology leakage are then extracted using the KeyBERT model, and signs of leakage in the dark web environment are expressed as quantitative figures. Finally, starting from the identified leakage sites, the possibility of secondary leakage is detected through the PageRank algorithm. The proposed method collected 27,317 unique dark web domains and extracted 15,028 nuclear-energy-related keywords from 100 nuclear power patents; 12 dark web sites were identified when detecting secondary leaks from the highest-ranked nuclear leak sites.
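The secondary-leak step above relies on PageRank over the link graph of leak sites. As a minimal sketch of that idea (a generic power-iteration PageRank on a toy graph of hypothetical onion hostnames, not the paper's pipeline):

```python
def pagerank(links, damping=0.85, iters=50):
    """Rank sites in a directed link graph: site -> list of outbound links."""
    nodes = set(links)
    for targets in links.values():
        nodes.update(targets)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - damping) / n for v in nodes}
        for src, targets in links.items():
            if targets:
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
            else:  # dangling node: spread its rank evenly
                for v in nodes:
                    new[v] += damping * rank[src] / n
        rank = new
    return rank

# Hypothetical leak-site graph: hub.onion is linked from both mirrors.
graph = {"mirror1.onion": ["hub.onion"], "mirror2.onion": ["hub.onion"],
         "hub.onion": []}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # the most-linked site ranks highest
```

Sites that many identified leak sites link to accumulate rank, flagging them as candidates for secondary leakage.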
- Book Chapter
5
- 10.1007/978-981-15-1518-7_25
- Jan 1, 2020
The dark web is infamous for the unethical and illegal content present on it. Intelligence agencies increasingly use automated approaches to detect such content, and machine learning classification techniques can be applied to textual data from dark web sites. However, classifier performance suffers due to the presence of irrelevant features in the dataset. In this paper, a two-step dimensionality reduction scheme based on mutual information and linear discriminant analysis for classifying dark web textual content is proposed. The scheme first filters out irrelevant features using mutual information. The remaining features are then transformed into a new, lower-dimensional space using linear discriminant analysis. The proposed scheme is tested on a dark web dataset collected explicitly from dark web sites using a web crawler, and on the Reuters-21578 dataset for benchmarking purposes. Three different classifiers were used for classification. The results obtained on the two datasets indicate that the proposed two-step technique can improve classification performance along with a significant decrease in the number of features.
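The first step of the scheme above scores each feature by its mutual information with the class label and keeps only the top scorers. A minimal sketch of that filtering step on a toy corpus (binary term presence, hypothetical terms and labels; the LDA transform of the second step is omitted):

```python
import math
from collections import Counter

def mutual_information(docs, labels, term):
    """MI between a term's presence/absence and the class label."""
    n = len(docs)
    joint = Counter((term in doc, label) for doc, label in zip(docs, labels))
    p_term = Counter(term in doc for doc in docs)
    p_label = Counter(labels)
    mi = 0.0
    for (t, l), c in joint.items():
        p_xy = c / n
        mi += p_xy * math.log2(p_xy / ((p_term[t] / n) * (p_label[l] / n)))
    return mi

def select_features(docs, labels, k):
    """Keep the k terms most informative about the label."""
    vocab = set().union(*docs)
    return sorted(vocab, key=lambda t: mutual_information(docs, labels, t),
                  reverse=True)[:k]

# Toy corpus: "market" separates the classes; "tor" appears everywhere.
docs = [{"tor", "market"}, {"tor", "market"}, {"tor", "forum"}, {"tor", "blog"}]
labels = ["illicit", "illicit", "benign", "benign"]
print(select_features(docs, labels, 1))  # -> ['market']
```

A term present in every document ("tor") carries zero mutual information and is filtered out, which is exactly the irrelevant-feature removal the paper describes.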
- Book Chapter
- 10.4018/979-8-3693-8014-7.ch002
- May 7, 2025
The Dark Web, a hidden corner of the internet, poses a significant challenge to Law Enforcement Agencies (LEAs) due to its anonymous nature and encrypted networks. While it provides a secure platform for legitimate users, it also enables illicit activities such as cybercrime, drug trade, and terrorism. One major obstacle for LEAs is the practice of URL hopping, where Dark Web sites frequently change their URLs to evade detection. To address this issue, the authors propose a novel ensemble model that leverages machine learning techniques to analyze the content of Dark Web sites and correlate it with existing intelligence. Their system aims to improve the accuracy of tracing URL hopping on the Dark Web, ultimately enhancing the ability of LEAs to track and disrupt illicit activities. By bridging the gap between the Dark Web's anonymity and LEA's monitoring capabilities, their research contributes to a safer and more secure online environment.
- Book Chapter
4
- 10.1007/978-3-319-70139-4_90
- Jan 1, 2017
It is well known that products for cyber-attacks, such as exploits and malware, are illegally traded on hidden web services, collectively called the Dark Web, that are not indexed by conventional search engines. In general, it is not easy to capture the whole picture of trading activity on the Dark Web, because special browsers and tools are needed to visit such dark market sites and forums, and they usually require registration and/or passing a qualification test. Nevertheless, for understanding trends in cyber-attacks, the Dark Web is undoubtedly a useful information source. In this paper, we try to understand the sales trends of illegal cyber-attack products on the largest marketplace, AlphaBay, from which it is relatively easy to collect information without passing any qualification test. To monitor business trades on the Dark Web, we developed an AI web-content analyzer, consisting of a Tor crawler to collect product information and a topic analyzer to capture trends in what people are interested in and which cyber-attack products are popular. For this purpose, we use the Latent Dirichlet Allocation (LDA) topic model and show that topic analysis can be helpful for predicting new cyber-attacks.
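LDA, the topic model used above, can be fitted with a collapsed Gibbs sampler in a few dozen lines. A self-contained toy sketch (hypothetical marketplace-listing tokens; real analyses would use a library implementation and far more data):

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Tiny collapsed Gibbs sampler for LDA. docs: list of token lists."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    V = len(vocab)
    # Random initial topic for every token, plus the usual count tables.
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]
    ndk = [[0] * n_topics for _ in docs]               # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics
    for i, d in enumerate(docs):
        for j, w in enumerate(d):
            k = z[i][j]
            ndk[i][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(iters):
        for i, d in enumerate(docs):
            for j, w in enumerate(d):
                k = z[i][j]
                ndk[i][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # Resample this token's topic from its conditional.
                weights = [(ndk[i][t] + alpha) * (nkw[t][w] + beta)
                           / (nk[t] + V * beta) for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights)[0]
                z[i][j] = k
                ndk[i][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return nkw  # topic-word counts; top words per topic describe each topic

# Hypothetical listing titles from a marketplace dump.
docs = [["exploit", "kit", "exploit"], ["malware", "loader", "malware"],
        ["exploit", "kit"], ["malware", "loader"]]
topics = lda_gibbs(docs, n_topics=2)
```

Inspecting the highest-count words per topic is how a topic analyzer summarizes what buyers and sellers are currently interested in.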
- Conference Article
5
- 10.1109/iciba52610.2021.9687954
- Dec 17, 2021
While the emergence of anonymity services plays a role in protecting user information security, it also provides a perfect venue for illegal and criminal activities such as human trafficking, illegal information transactions, and trade in drugs and firearms. The anonymity, structural complexity, highly efficient information distribution, and weak legal constraints of the dark web make it a new carrier for many criminals' activities. Therefore, to help supervise criminals who use the dark web to conduct illegal activities, a Python dark web monitoring crawler was designed on top of Tor to obtain and store a large number of dark web site addresses. A web crawler based on the Scrapy framework was built to crawl the dark web for specific types of crime, and the relevant website information is saved in a MongoDB database. Data analysis is carried out by a purpose-designed crawler algorithm, and the analyzed data is finally rendered as intuitive word clouds, histograms, and other visualizations, providing an interface for real-time monitoring of dark web crime.
- Research Article
8
- 10.1016/j.future.2024.03.025
- Mar 20, 2024
- Future Generation Computer Systems
A Big Data architecture for early identification and categorization of dark web sites
- Research Article
3
- 10.1007/s11416-023-00476-z
- Apr 29, 2023
- Journal of Computer Virology and Hacking Techniques
The transformation of contemporary societies through digital technologies has had a profound effect on all human activities, including those in the realm of illegal, unlawful, and criminal deeds. The affordances provided by anonymity-creating techniques such as the Tor protocol, while beneficial for preserving civil liberties, also appear to be highly profitable for various types of miscreants whose crimes range from human trafficking, arms trading, and child pornography to selling controlled substances and racketeering. Tor and similar technologies are the foundation of a vast, often mysterious, sometimes anecdotal, and occasionally dangerous space termed the Dark Web. Exploiting the features that make the Internet a uniquely generative knowledge agglomeration, with no borders and spanning different jurisdictions, the Dark Web is a source of perpetual challenges for both national and international law enforcement agencies. The anonymity granted to the wrong people increases the complexity and cost of identifying both the crimes and the criminals, which is often exacerbated by a lack of adequate human resources. Technologies such as machine learning and artificial intelligence come to the rescue through automation and intensive data harvesting and analysis, built into various types of web crawlers that explore and identify dark markets and the people behind them. Effective and efficient crawling requires a pool of dark sites, or onion URLs. This study presents a way to build such a crawling mechanism by extracting onion URLs from malicious executables: the executables are run in a sandbox environment, and the resulting log files are analyzed using machine learning algorithms. By discerning between malware that uses the Tor network and malware that does not, we were able to classify Tor-using malware with an accuracy of 91% using a logistic regression algorithm.
The initial results suggest that this machine learning approach can be used to identify new malicious servers on the Tor network. Embedding this kind of mechanism into the crawler may also add predictability, and thus efficiency, to the recognition of dark market activities and, consequently, to their closure.
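The classification step above is plain logistic regression over features extracted from sandbox logs. A minimal from-scratch sketch on invented binary features (the feature names and toy labels are assumptions for illustration, not the paper's feature set):

```python
import math

def train_logreg(X, y, lr=0.5, epochs=500):
    """Logistic regression via per-sample gradient descent, no regularization."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1 / (1 + math.exp(-z))      # predicted P(Tor-using)
            err = p - yi                     # gradient of the log loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0

# Hypothetical log features: [contacted a Tor directory authority,
#                             wrote a torrc-like config file]
X = [[1, 1], [1, 0], [0, 1], [0, 0]]
y = [1, 1, 0, 0]   # 1 = Tor-using malware in this toy set
w, b = train_logreg(X, y)
print([predict(w, b, xi) for xi in X])
```

In practice the feature vectors would come from parsed sandbox logs and the model would be evaluated on held-out samples; the 91% accuracy reported above refers to the paper's dataset, not this toy.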
- Research Article
1
- 10.1016/j.jcrimjus.2023.102060
- Mar 23, 2023
- Journal of Criminal Justice
Dark web pedophile site users' cybersecurity concerns: A lifespan and survival analysis
- Book Chapter
6
- 10.1007/978-3-030-36718-3_27
- Jan 1, 2019
In recent years, various web-based attacks, such as drive-by-download attacks, have become serious. To protect legitimate users, it is important to collect information on malicious sites that can feed blacklist-based detection software. In our study, we propose a system to collect URLs of malicious sites on the dark web. The proposed system automatically crawls dark web sites and collects URLs judged malicious using VirusTotal and the Gred engine. We also predict the dangerous categories of collected, potentially malicious web sites using document embeddings with a gradient boosting decision tree model. In the experiments, we demonstrate that the proposed system can predict dangerous site categories with an F1-score of 0.82.
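A crawler like the one described above first needs to harvest candidate onion URLs from fetched pages. A minimal sketch of that extraction step, assuming v3 onion addresses (56 base32 characters before ".onion"); the page content here is a fabricated example:

```python
import re

# v3 onion addresses are 56 base32 characters followed by ".onion".
ONION_RE = re.compile(r"\b([a-z2-7]{56}\.onion)\b")

def extract_onion_urls(html):
    """Return unique .onion hostnames found in a page, preserving order."""
    seen = []
    for host in ONION_RE.findall(html.lower()):
        if host not in seen:
            seen.append(host)
    return seen

# Hypothetical page with the same (dummy) onion address appearing twice.
page = ('Mirror: <a href="http://' + "a" * 56 + '.onion/shop">shop</a> '
        'and again ' + "a" * 56 + ".onion")
print(extract_onion_urls(page))
```

Each extracted hostname would then be queued for crawling and its content passed to the maliciousness and category classifiers.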
- Research Article
1
- 10.56353/aspiration.v2i1.26
- Jul 31, 2021
- ASPIRATION Journal
This study analyzes cybercrime on dark web sites as depicted in the film Unfriended: Dark Web, revealing the "forms" of cybercrime found on a dark web site shown in the film. Using John Fiske's semiotic analysis, the study seeks to enlighten viewers through signs at the levels of reality, representation, and ideology, so that they are more careful in using the virtual world in daily life. The study adopts a constructivist paradigm as the framework for understanding the acts of cybercrime in its object of study, and a qualitative methodological approach. The results show that films are now used for industrialization and commercialization. They also reveal the level of reality, namely behaviour, expressions, sounds, and so on; the level of representation, consisting of the technical codes of a film production, such as camera, lighting, editing, music, and sound, which convey the situation and storyline; and finally the level of ideology, namely ideological codes such as individualism, feminism, liberalism, capitalism, race, class, and materialism, present in the film.
- Conference Article
1
- 10.1109/igetblockchain56591.2022.10087184
- Nov 7, 2022
Bitcoin transactions are pseudonymous: even when addresses can be connected to one another, it is very hard to connect them to outside entities. On top of that, bitcoin mixing sites have proliferated in recent years. These sites operate mostly on the dark web, and their main mission is to launder bitcoins by routing them through vast numbers of complex transactions, making their association with a single owner even harder. Our mission is to make that distinction easier. In this paper we introduce two novel heuristics. Our OI heuristic is designed to parse blockchain data so that we only receive the information we deem of interest. We also introduce HOLO, which aims to taint bitcoin addresses in reference to a fixed output. Finally, we visualize the blockchain and our heuristics in an easily digestible manner.
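Tainting, the operation HOLO is built around, means marking coins at a source and following the mark through subsequent transactions. A minimal sketch of generic haircut-style taint propagation (each output inherits taint in proportion to its value share); this is an illustration of the general technique, not the paper's HOLO heuristic, and the transaction data is made up:

```python
from collections import defaultdict

def propagate_taint(txs, tainted_source, value=1.0):
    """Haircut taint: each output inherits taint proportional to its share.

    txs: chronologically ordered list of (inputs, outputs),
         where inputs/outputs map address -> amount.
    """
    taint = defaultdict(float)
    taint[tainted_source] = value
    for inputs, outputs in txs:
        # Taint entering this transaction is spent, so remove it from inputs.
        tainted_in = sum(taint.pop(a, 0.0) for a in inputs)
        if tainted_in == 0:
            continue
        total_out = sum(outputs.values())
        for addr, amount in outputs.items():
            taint[addr] += tainted_in * amount / total_out
    return dict(taint)

# Hypothetical mixer hop: the marked coin is split 3:1 across two outputs.
txs = [({"mix_in": 4.0}, {"out_a": 3.0, "out_b": 1.0})]
print(propagate_taint(txs, "mix_in"))  # -> {'out_a': 0.75, 'out_b': 0.25}
```

Running this over a mixer's transaction graph shows how taint disperses across outputs, which is what makes post-mix attribution hard and why such heuristics are needed.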