A GIS-based reassessment of the cartographic accuracy of the Huang Yu Quan Lan Tu and the Nei Fu Yu Tu

Abstract

This study systematically evaluates the cartographic accuracy of the Qing Dynasty's Huang Yu Quan Lan Tu (Kangxi Map) and Nei Fu Yu Tu (Qianlong Map), using 26 lakes in the southern Mongolian Plateau as natural reference features. Historical map sheets first undergo auxiliary geometric correction and georeferencing to a modern coordinate framework. Lake features are then digitized, centroids extracted, and spatial distances calculated under the Albers equal-area projection. Centroid errors are quantified via MAE and RMSE (decomposed into longitudinal and latitudinal components); residuals are visualized with MapAnalyst-generated distortion grids, and Tissot's indicatrices at 150 km intervals (under the Sanson and Albers projections) isolate projection effects. The Kangxi Map shows higher overall accuracy (MAE = 42.77 km, RMSE = 57.04 km) and stable latitude control (RMSE = 0.283°). The Qianlong Map improves longitudinal accuracy near 116.38°E (MAE = 0.343°) but has larger total errors (MAE = 50.78 km, RMSE = 64.41 km) and an 'east-high-west-low' distortion pattern. Notably, 77% of corresponding lake centroids are displaced by less than 5 km between the two maps, confirming that the Qianlong Map inherited the Kangxi Map's core framework. Errors stem from Qing survey limitations, Sanson projection anisotropy, and sheet-assembly deviations. This study provides a reproducible method for evaluating the accuracy of digitized historical maps and documents East Asia's cartographic transition toward evidence-based practice.
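The centroid-displacement metrics described in the abstract can be sketched in a few lines. This is a minimal illustration assuming lake centroids have already been georeferenced and projected to an equal-area plane (e.g. Albers), so coordinates are in kilometres; the paired coordinates below are hypothetical, not the paper's lake data.

```python
"""Hedged sketch of MAE/RMSE over planar centroid displacements."""
import math

def displacement_errors(historical, modern):
    """Return (MAE, RMSE) of planar distances between paired centroids (km)."""
    dists = [math.hypot(hx - mx, hy - my)
             for (hx, hy), (mx, my) in zip(historical, modern)]
    mae = sum(dists) / len(dists)
    rmse = math.sqrt(sum(d * d for d in dists) / len(dists))
    return mae, rmse

# Illustrative paired centroids (projected x, y in km) -- hypothetical values.
hist = [(100.0, 200.0), (310.0, 405.0), (520.0, 610.0)]
mod = [(103.0, 204.0), (310.0, 400.0), (526.0, 602.0)]
mae, rmse = displacement_errors(hist, mod)
```

Because RMSE squares each distance before averaging, it exceeds MAE whenever displacements vary across lakes, which is why the paper reports both.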

Similar Papers
  • Research Article
  • 10.1111/rec.13911
Assessing tradeoffs between current and desired vegetation condition in a National Park using historical maps and high‐resolution lidar data
  • May 2, 2023
  • Restoration Ecology
  • John A Young + 1 more

In the United States, National Park Service Civil War battlefield units are managed for both historical accuracy (i.e. to represent landscape conditions at the time of the conflict for historical interpretation), and for natural resource protection. However, managing for both goals can create conflicts as many battlefields were largely open or in second growth forests historically, but now harbor significant forest resources after more than 100 years of preservation. Managing for historical accuracy therefore may require maintenance of the landscape in a successional stage out of phase with the current landscape. We use historical landscape maps and current high‐resolution forest structure data derived from lidar to examine tradeoffs in returning the landscape of a major Civil War battlefield (Wilderness Battlefield) to conditions present at the time of the battle. We demonstrate that National Park battlefield units can harbor significant forest resources in contrast to the surrounding landscape, especially in areas of intense commercial, urban, and suburban development. Managing for or restoring landscapes to historical conditions could have important ecological implications.

  • Research Article
  • 10.1038/s41598-024-62493-2
Spatial utilization of historical topographic map and its application in land reconstruction of ancient Chinese urban land use
  • May 21, 2024
  • Scientific Reports
  • Zhiwei Wan + 1 more

The historical topographic map preserves rich geographic information and can directly support the reconstruction of various geographic elements. Based on historical city records from throughout the Qing Dynasty, national urban land-use area data were obtained using GIS and an urban perimeter conversion model. This study combines city information and city-circumference records from late Qing maps and archives to quantitatively reconstruct ancient China's urban land-use patterns at a spatial resolution of 1° × 1°. Uncertainty in the reconstruction was assessed using modern remote sensing imagery as the validation data set. The results showed the following. (1) During the late Qing Dynasty, the total urban land area across the provinces and regions was 1456.015 km2; the maximum was 208.691 km2 in Beijing-Tianjin-Hebei, the minimum 1.713 km2 in Qinghai, and the average 56.001 km2. (2) Among the 398 grid cells with urban land, the maximum was 64.099 km2/grid, the minimum 0.013 km2/grid, and the average 3.658 km2/grid. (3) Of all grid cells with urban land, those west of the Hu Line account for 12.5% and those east of it for 87.5%. (4) Urban land use in late Qing China was concentrated in agriculturally developed areas such as the North China Plain, the Central Plains, Jiangnan, and the Sichuan-Chongqing region. (5) Kernel density estimation revealed three core areas of urban land agglomeration: the North China Plain-Central Plains, the Jiangsu-Shanghai-Zhejiang-Anhui area, and the Sichuan-Chongqing urban core area.
This study provides basic data for urban land use during historical periods and provides a basis for the quantitative reconstruction of relevant urban land data for historical archives.
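The perimeter-to-area conversion and 1° × 1° gridding can be illustrated as follows. The paper's actual conversion model is not reproduced here; this sketch assumes an approximately square walled city (area ≈ (perimeter/4)²), and the city records are invented for illustration.

```python
"""Illustrative perimeter-to-area conversion and one-degree gridding."""
from collections import defaultdict

def city_area_km2(perimeter_km):
    # Square-city assumption: side = P/4, area = side ** 2.
    side = perimeter_km / 4.0
    return side * side

def grid_urban_area(cities):
    """Aggregate city areas into 1x1 degree cells keyed by (lon_floor, lat_floor)."""
    grid = defaultdict(float)
    for lon, lat, perimeter_km in cities:
        cell = (int(lon // 1), int(lat // 1))
        grid[cell] += city_area_km2(perimeter_km)
    return dict(grid)

# Hypothetical late-Qing city records: (lon, lat, wall perimeter in km).
records = [(116.4, 39.9, 23.5), (116.7, 39.3, 8.0), (104.1, 30.7, 12.0)]
cells = grid_urban_area(records)
```

Summing per cell rather than per city is what yields the km2/grid statistics the abstract reports.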

  • Research Article
  • 10.1051/e3sconf/202449702028
Conventional and current approaches of urban mapping and geodetic base formulation for establishing demographic processes database: Tashkent, Uzbekistan
  • Jan 1, 2024
  • E3S Web of Conferences
  • Sarvar Abdurakhmonov + 6 more

This study explores the integration of historical and modern urban mapping data, an expanded geodetic base, and demographic processes to provide a comprehensive understanding of the dynamic relationships within urban landscapes. Analyzing data spanning from 1950 to 2040, we observe a consistent urban expansion, evolving population density, and shifting land use patterns. The inclusion of ten control points enhances the geodetic base, ensuring precise spatial referencing for urban analyses. Spatially referenced demographic processes data reveal correlations between urban characteristics and population dynamics, guiding targeted interventions for sustainable development. Findings underscore the significance of synergizing conventional and current approaches in urban planning, emphasizing the need for adaptive strategies in response to evolving urban landscapes. Key limitations include potential data quality issues in historical mapping, necessitating ongoing efforts for accuracy enhancement. Future research should focus on refining historical data accuracy and exploring specific urban impacts on demographic dynamics.

  • Research Article
  • Cited by 33
  • 10.1016/j.jhazmat.2023.133115
Effects of lakeshore landcover types and environmental factors on microplastic distribution in lakes on the Inner Mongolia Plateau, China
  • Nov 29, 2023
  • Journal of Hazardous Materials
  • Shuai Luo + 5 more


  • Conference Article
  • Cited by 2
  • 10.1145/3167132.3167228
Assessing the planimetric accuracy of Paris atlases from the late 18th and 19th centuries
  • Apr 9, 2018
  • Bertrand Duménieu + 2 more

The recent initiatives to digitize cultural heritage resources and publish them on the Web have renewed interest in historical maps for the diachronic analysis of territories in GIS applications. However, such analyses should not be done without a good understanding of the possibilities and limitations of geographical information provided by historical maps, i.e. their quality. One of the major concerns regarding historical maps quality is their positional planimetric accuracy which highly depends on survey techniques used at the time. As these techniques are not always thoroughly known and as ground truth is most of the time not sufficiently available, direct absolute evaluation approaches have been proposed to assess historical maps positional planimetric accuracy. In this article, we follow the intuition that the most widely adopted georeferencing-based approach for assessing the positional planimetric accuracy of historical maps can be adapted to provide an evaluation of the error caused by the survey process in cases like Paris atlases where the georeferencing transformation can be estimated with ground control points based on geodetic features and where the projection of the map can be approximated by a well known projected coordinate reference system. We apply this tuned approach on the Verniquet atlas and evaluate the validity of our hypothesis about projection approximation.
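The georeferencing-based approach the authors adapt, fitting a transformation on ground control points and inspecting its residuals, can be sketched with a least-squares affine fit. The control points below are illustrative, not taken from the Verniquet atlas.

```python
"""Sketch of GCP-based accuracy assessment: fit an affine transform, then
measure residual RMSE at the control points."""
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src (map) onto dst (reference)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    A = np.hstack([src, np.ones((len(src), 1))])  # rows [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)  # 3x2 coefficient matrix
    return coeffs

def rmse_residuals(src, dst, coeffs):
    A = np.hstack([np.asarray(src, float), np.ones((len(src), 1))])
    resid = A @ coeffs - np.asarray(dst, float)
    return float(np.sqrt((resid ** 2).sum(axis=1).mean()))

# Illustrative control points: the mapping here is exactly affine,
# so residual RMSE should be (numerically) zero.
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(10, 20), (12, 20), (10, 23), (12, 23)]
T = fit_affine(src, dst)
err = rmse_residuals(src, dst, T)
```

With real historical maps the residuals are non-zero, and their magnitude and spatial pattern are what the planimetric-accuracy analysis interprets.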

  • Book Chapter
  • Cited by 11
  • 10.1007/978-3-030-49461-2_24
Building Linked Spatio-Temporal Data from Vectorized Historical Maps
  • Jan 1, 2020
  • The Semantic Web
  • Basel Shbita + 5 more

Historical maps provide a rich source of information for researchers in the social and natural sciences. These maps contain detailed documentation of a wide variety of natural and human-made features and their changes over time, such as the changes in the transportation networks and the decline of wetlands. It can be labor-intensive for a scientist to analyze changes across space and time in such maps, even after they have been digitized and converted to a vector format. In this paper, we present an unsupervised approach that converts vector data of geographic features extracted from multiple historical maps into linked spatio-temporal data. The resulting graphs can be easily queried and visualized to understand the changes in specific regions over time. We evaluate our technique on railroad network data extracted from USGS historical topographic maps for several regions over multiple map sheets and demonstrate how the automatically constructed linked geospatial data enables effective querying of the changes over different time periods.

  • Research Article
  • Cited by 41
  • 10.1109/access.2019.2963213
Automated Extraction of Human Settlement Patterns From Historical Topographic Map Series Using Weakly Supervised Convolutional Neural Networks
  • Jan 1, 2020
  • IEEE Access
  • Johannes H Uhl + 4 more

Information extraction from historical maps represents a persistent challenge due to inferior graphical quality and the large data volume of digital map archives, which can hold thousands of digitized map sheets. Traditional map processing techniques typically rely on manually collected templates of the symbol of interest, and thus are not suitable for large-scale information extraction. In order to digitally preserve such large amounts of valuable retrospective geographic information, high levels of automation are required. Herein, we propose an automated machine-learning based framework to extract human settlement symbols, such as buildings and urban areas from historical topographic maps in the absence of training data, employing contemporary geospatial data as ancillary data to guide the collection of training samples. These samples are then used to train a convolutional neural network for semantic image segmentation, allowing for the extraction of human settlement patterns in an analysis-ready geospatial vector data format. We test our method on United States Geological Survey historical topographic maps published between 1893 and 1954. The results are promising, indicating high degrees of completeness in the extracted settlement features (i.e., recall of up to 0.96, F-measure of up to 0.79) and will guide the next steps to provide a fully automated operational approach for large-scale geographic feature extraction from a variety of historical map series. Moreover, the proposed framework provides a robust approach for the recognition of objects which are small in size, generalizable to many kinds of visual documents.
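The weak-supervision idea, using contemporary geospatial data to guide training-sample collection, can be sketched as cropping patches from the scanned map at locations where a contemporary layer indicates a building. Array shapes and coordinates are illustrative; a real pipeline would read georeferenced rasters and vector footprints.

```python
"""Sketch of weakly supervised sample collection: contemporary building
locations act as noisy labels for cropping training patches."""
import numpy as np

def sample_patches(map_image, building_pixels, size=4):
    """Crop size x size patches centred on contemporary building locations."""
    half = size // 2
    patches = []
    for row, col in building_pixels:
        patch = map_image[row - half:row + half, col - half:col + half]
        if patch.shape == (size, size):  # skip locations too close to the edge
            patches.append(patch)
    return patches

rng = np.random.default_rng(0)
scan = rng.integers(0, 256, size=(32, 32))  # stand-in for a scanned sheet
locations = [(10, 10), (20, 25), (1, 1)]    # (1, 1) falls off the edge
patches = sample_patches(scan, locations, size=4)
```

The collected patches would then feed a segmentation CNN; the "weak" part is that some contemporary buildings do not exist on the historical sheet, so the labels are noisy by construction.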

  • Research Article
  • Cited by 30
  • 10.1016/j.compenvurbsys.2022.101794
Towards the automated large-scale reconstruction of past road networks from historical maps
  • Mar 18, 2022
  • Computers, environment and urban systems
  • Johannes H Uhl + 3 more

Transportation infrastructure, such as road or railroad networks, represent a fundamental component of our civilization. For sustainable planning and informed decision making, a thorough understanding of the long-term evolution of transportation infrastructure such as road networks is crucial. However, spatially explicit, multi-temporal road network data covering large spatial extents are scarce and rarely available prior to the 2000s. Herein, we propose a framework that employs increasingly available scanned and georeferenced historical map series to reconstruct past road networks, by integrating abundant, contemporary road network data and color information extracted from historical maps. Specifically, our method uses contemporary road segments as analytical units and extracts historical roads by inferring their existence in historical map series based on image processing and clustering techniques. We tested our method on over 300,000 road segments representing more than 50,000 km of the road network in the United States, extending across three study areas that cover 42 historical topographic map sheets dated between 1890 and 1950. We evaluated our approach by comparison to other historical datasets and against manually created reference data, achieving F-1 scores of up to 0.95, and showed that the extracted road network statistics are highly plausible over time, i.e., following general growth patterns. We demonstrated that contemporary geospatial data integrated with information extracted from historical map series open up new avenues for the quantitative analysis of long-term urbanization processes and landscape changes far beyond the era of operational remote sensing and digital cartography.
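The segment-wise inference step can be illustrated with a simplified stand-in: the paper clusters colour information sampled under contemporary road segments, while the sketch below uses a plain dark-ink fraction threshold to make the same present/absent decision. Pixel values and thresholds are hypothetical.

```python
"""Simplified stand-in for inferring a road segment's existence in a
historical map from pixels sampled under its contemporary geometry."""

def road_exists(pixel_values, ink_threshold=100, min_dark_fraction=0.5):
    """Classify a segment as present if enough sampled pixels look inked.

    pixel_values: grayscale samples (0 = black ink, 255 = paper) taken along
    the contemporary segment's footprint in the historical map.
    """
    dark = sum(1 for v in pixel_values if v < ink_threshold)
    return dark / len(pixel_values) >= min_dark_fraction

inked_segment = [30, 42, 35, 210, 28, 50]      # mostly dark: road drawn
blank_segment = [220, 235, 240, 90, 230, 225]  # mostly paper: road absent
```

Using contemporary segments as the analytical units means the output is immediately joinable to modern road attributes, which is what enables the network statistics over time.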

  • Research Article
  • Cited by 7
  • 10.3233/sw-222918
Building spatio-temporal knowledge graphs from vectorized topographic historical maps
  • Apr 5, 2023
  • Semantic Web
  • Basel Shbita + 5 more

Historical maps provide rich information for researchers in many areas, including the social and natural sciences. These maps contain detailed documentation of a wide variety of natural and human-made features and their changes over time, such as changes in transportation networks or the decline of wetlands or forest areas. Analyzing changes over time in such maps can be labor-intensive for a scientist, even after the geographic features have been digitized and converted to a vector format. Knowledge Graphs (KGs) are the appropriate representations to store and link such data and support semantic and temporal querying to facilitate change analysis. KGs combine expressivity, interoperability, and standardization in the Semantic Web stack, thus providing a strong foundation for querying and analysis. In this paper, we present an automatic approach to convert vector geographic features extracted from multiple historical maps into contextualized spatio-temporal KGs. The resulting graphs can be easily queried and visualized to understand the changes in different regions over time. We evaluate our technique on railroad networks and wetland areas extracted from the United States Geological Survey (USGS) historical topographic maps for several regions over multiple map sheets and editions. We also demonstrate how the automatically constructed linked data (i.e., KGs) enable effective querying and visualization of changes over different points in time.

  • Conference Article
  • Cited by 26
  • 10.1049/cp.2017.0144
Extracting Human Settlement Footprint from Historical Topographic Map Series Using Context-Based Machine Learning
  • Jan 1, 2017
  • J.H Uhl + 4 more

Information extraction from historical maps represents a persistent challenge due to inferior graphical quality and large data volume in digital map archives, which can hold thousands of digitized map sheets. In this paper, we describe an approach to extract human settlement symbols in United States Geological Survey (USGS) historical topographic maps using contemporary building data as the contextual spatial layer. The presence of a building in the contemporary layer indicates a high probability that the same building can be found at that location on the historical map. We describe the design of an automatic sampling approach using these contemporary data to collect thousands of graphical examples for the symbol of interest. These graphical examples are then used for robust learning to then carry out feature extraction in the entire map. We employ a Convolutional Neural Network (LeNet) for the recognition task. Results are promising and will guide the next steps in this research to provide an unsupervised approach to extracting features from historical maps.

  • Research Article
  • 10.54254/2755-2721/2025.24696
Integrated 3D Reconstruction and Spatial Restoration of Historical Architecture Using LiDAR, UAV, and GIS: A Multi-Source Data Approach Based on Historical Map Comparison
  • Jul 4, 2025
  • Applied and Computational Engineering
  • Tianni Yang

This study developed a multi-source data integration framework for high-fidelity three-dimensional reconstruction and spatial restoration of historical buildings. By integrating terrestrial LiDAR scans, UAV aerial survey images, and historical reference maps, the method overcomes the limitations inherent in any single data source. A group of Qing Dynasty courtyard buildings was selected as the test object. First, accurate geometric and optical information was obtained using LiDAR and UAV aerial photography, while historical maps from 1875, 1902, and 1920 were digitized and corrected for distortion. A three-step registration process was then implemented: coarse alignment based on ground control points, fine registration using the ICP algorithm, and semantic anchoring based on feature points, ensuring accurate registration of modern sensor data with archival vector data. The framework adopts a modular reconstruction scheme: after partitioning the unified point cloud, a watertight mesh is generated by Poisson surface reconstruction and combined with texture maps from UAV orthophotos and a semi-transparent historical overlay layer. Quantitative evaluation shows sub-decimetre accuracy against measured values and archival documents. Comparison with the historical maps reveals a vanished courtyard structure, buried foundations, and the original channels of the water-management system, providing a basis for selective excavation and virtual restoration of the missing features. The resulting model balances geometric fidelity and semantic integrity, offering a double guarantee for the protection of cultural heritage.
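The core of the fine-registration stage can be sketched as the rigid (Kabsch) alignment of matched point pairs, which is the update step ICP iterates; full ICP re-estimates correspondences between alignments, whereas the sketch below assumes they are already known. The point sets are synthetic.

```python
"""Sketch of one ICP update: best-fit rigid alignment of paired points."""
import numpy as np

def rigid_align(P, Q):
    """Best-fit rotation R and translation t mapping P onto Q (least squares)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic 2D test: rotate a point set by 90 degrees and translate it.
theta = np.pi / 2
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [1.5, 1.0]])
Q = P @ R_true.T + np.array([5.0, -3.0])
R, t = rigid_align(P, Q)
```

In the paper's pipeline this step would refine the GCP-based coarse alignment before the semantic anchoring pass.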

  • 10.3929/ethz-b-000368780
Unlocking the Geospatial Past with Deep Learning – Establishing a Hub for Historical Map Data in Switzerland
  • Jul 15, 2019
  • Magnus Heitzler + 1 more

Abstract. Thoroughly prepared historical map data can facilitate research in a wide range of domains, including ecology and hydrology (e.g., for preservation and renaturation), urban planning and architecture (e.g., to analyse the settlement development), geology and insurance (e.g., to derive indicators of past natural hazards to estimate future events), and even linguistics (e.g., to explore the evolution of toponyms). Research groups in Switzerland have invested large amounts of time and money to manually derive features (e.g., pixel-based segmentations, vectorizations) from historical maps such as the Dufour Map Series (1845–1865) or the Siegfried Map Series (1872–1949). The results of these efforts typically cover limited areas of the respective map series and are tailored to specific research questions. Recent research in automated data extraction from historical maps shows that Deep Learning (DL) methods based on Artificial Neural Networks (ANN) might significantly reduce this manual workload (Uhl et al. (2017), Heitzler et al. (2018)). Yet, efficiently exploiting DL methods to provide high-quality features requires detailed knowledge of the underlying mathematical concepts and software libraries, high-performance hardware to train models in a timely manner, and sufficient amounts of data. Hence, a new initiative at the Institute of Cartography and Geoinformation (IKG) at ETH Zurich aims to establish a hub to systematically bundle the efforts of the many Swiss institutes working with historical map data and to provide the computational capabilities to efficiently extract the desired features from the vast collection of Swiss historical maps. This is primarily achieved by providing a spatial data infrastructure (SDI), which integrates a geoportal with a DL environment (see Figure 1). The SDI builds on top of the geoportal geodata4edu.ch (G4E), which was established to facilitate the access of federal and cantonal geodata to Swiss academic institutions. 
G4E inherently supports the integration and exploration of spatio-temporal data via an easy-to-use web interface and common web services and hence is an ideal choice to share historical map data. Making historical map data accessible in G4E is realized using state-of-the-art software libraries (e.g., Tensorflow, Keras), and suitable hardware (e.g., NVIDIA GPUs). Existing project data generated by the Swiss scientific community serve as the initial set to train a DL model for a specific thematic layer. If such data does not exist it is generated manually. Combining these data with georeferenced sheets of the corresponding map series allows the DL system to learn a way of obtaining the expected results based on the input map sheet. In the common case where an actual vectorization of a thematic layer is required, two steps are taken. First, the underlying ANN architecture yields a segmentation of the map sheet to determine which pixel is part of the feature type of interest (e.g., by using a fully convolutional architecture such as U-Net (Ronneberger et al. (2015)) and, second, the resulting segmentations will be vectorized using GIS algorithms (e.g., using methods as described in Hori & Okazaki (1992)). These vectorizations undergo a quality check and might be directly published in G4E if the quality is considered high enough. In addition, the results may be manually corrected. A corrected dataset may have a greater value for the scientific community but might be time consuming to create. However, it has also the advantage to serve as additional training data for the DL system. This may lead to a positive feedback loop, which allows the ANN to gradually improve its predictions, which in turn improves the vectorization results and hence reduces the correction workload. Figure 2 shows automatically generated vectorizations of building footprints after two such iterations. Special emphasis was put on enforcing perpendicularity without requiring human intervention. 
At the time of writing, such building polygons have been generated for all Siegfried map sheets. It is worth emphasizing that showing the ability of generating high-quality features of single thematic layers at a large scale and making them easily available to the scientific community is a key aspect when establishing a hub for sharing historical map data. Research groups are more willing to share their data if they see that the coverage of the data they produce might get multiplied and if they realize that other groups are providing their data as well. Apart from the benefits for research groups using such data, such an environment also allows to facilitate the development of new methods to derive features from historical maps (e.g., for extraction, generalization). The current focus lies on the systematic preparation of all thematic layers of the main Swiss map series. Afterwards it is aimed to place higher emphasis on the fusion of the extracted layers. In the long-term, these efforts will lead to a comprehensive spatio-temporal database of high scientific value for the Swiss scientific community.

  • Research Article
  • Cited by 28
  • 10.1080/13658816.2020.1845702
Aligning geographic entities from historical maps for building knowledge graphs
  • Nov 12, 2020
  • International Journal of Geographical Information Science
  • Kai Sun + 3 more

Historical maps contain rich geographic information about the past of a region. They are sometimes the only source of information before the availability of digital maps. Despite their valuable content, it is often challenging to access and use the information in historical maps, due to their forms of paper-based maps or scanned images. It is even more time-consuming and labor-intensive to conduct an analysis that requires a synthesis of the information from multiple historical maps. To facilitate the use of the geographic information contained in historical maps, one way is to build a geographic knowledge graph (GKG) from them. This paper proposes a general workflow for completing one important step of building such a GKG, namely aligning the same geographic entities from different maps. We present this workflow and the related methods for implementation, and systematically evaluate their performances using two different datasets of historical maps. The evaluation results show that machine learning and deep learning models for matching place names are sensitive to the thresholds learned from the training data, and a combination of measures based on string similarity, spatial distance, and approximate topological relation achieves the best performance with an average F-score of 0.89.
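The best-performing combination the paper reports, string similarity plus spatial distance plus an approximate topological relation, can be sketched as a simple averaged score. The weights, thresholds, and the type-match stand-in for topology below are illustrative, not the learned values.

```python
"""Minimal sketch of a combined place-matching measure."""
from difflib import SequenceMatcher
import math

def match_score(a, b, max_dist_km=5.0):
    """Score two place records {name, lon, lat, type}; higher = better match."""
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    # Rough planar distance; adequate at small separations.
    km_per_deg = 111.0
    d = math.hypot(
        (a["lon"] - b["lon"]) * km_per_deg * math.cos(math.radians(a["lat"])),
        (a["lat"] - b["lat"]) * km_per_deg,
    )
    dist_sim = max(0.0, 1.0 - d / max_dist_km)
    topo_sim = 1.0 if a.get("type") == b.get("type") else 0.0  # crude proxy
    return (name_sim + dist_sim + topo_sim) / 3.0

rec1 = {"name": "Dalinuoer Lake", "lon": 116.60, "lat": 43.30, "type": "lake"}
rec2 = {"name": "Dali Nuoer Lake", "lon": 116.61, "lat": 43.31, "type": "lake"}
score = match_score(rec1, rec2)
```

Combining independent evidence sources is what makes such a measure robust to transliteration variants that defeat string similarity alone.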

  • Research Article
  • Cited by 3
  • 10.5194/ica-abs-1-110-2019
Unlocking the Geospatial Past with Deep Learning – Establishing a Hub for Historical Map Data in Switzerland
  • Jul 15, 2019
  • Abstracts of the ICA
  • Magnus Heitzler + 1 more


  • Research Article
  • Cited by 26
  • 10.1007/s12665-014-3568-z
Organic carbon fractions and estimation of organic carbon storage in the lake sediments in Inner Mongolia Plateau, China
  • Aug 10, 2014
  • Environmental Earth Sciences
  • Zhilei Xie + 10 more

Organic carbon (OC) in lake sediments plays an important role in the terrestrial ecosystem carbon cycle. The Inner Mongolia Plateau contains numerous shallow freshwater lakes with a total area of more than 8,000 km2, roughly 10% of the total lake area in China, and these lakes act as an important OC sink in mid-high latitude regions. In this study, heavy and light fractions of OC and OC species were analyzed in sediments from four typical lakes on the Inner Mongolia Plateau. To identify OC origins, allochthonous and autochthonous OC were calculated using a binary mixing model. Total organic carbon (TOC) storage, the active carbon pool (ACP), and the stable carbon pool (SCP) over the past 150 years were then estimated for the plateau. The main finding is that the heavy OC fraction acts as the key carbon sink in these mid-high latitude lakes, accounting for more than 90% of TOC. Allochthonous OC dominates in Daihai Lake (DH), Dalinuoer Lake (DLNE), and Hulunhu Lake (HLH), making up 86.4%, 66.7%, and 72.5% of TOC, respectively. Humin ranges from 62.15% to 84.03% across the four lakes, and the average OC accumulation rate is 1.37 g C m−2 year−1. OC storage in Inner Mongolia Plateau lake sediments is more stable than in lakes of tropical and subtropical regions. Over the past 150 years, TOC burial, SCP, and ACP in the plateau's lake sediments are estimated at 1.64 × 1012, 1.52 × 1012, and 1.20 × 1011 g C, respectively.
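The binary model used to partition allochthonous and autochthonous OC can be sketched as a two-endmember mixing calculation on a conservative tracer (e.g. C/N ratio or δ13C). The endmember values below are illustrative placeholders, not the paper's calibration.

```python
"""Sketch of a two-endmember (binary) mixing model for OC source partitioning."""

def allochthonous_fraction(sample, em_auto, em_allo):
    """Fraction of allochthonous OC from a conservative tracer value.

    em_auto / em_allo are the autochthonous and allochthonous endmembers.
    """
    f = (sample - em_auto) / (em_allo - em_auto)
    return min(1.0, max(0.0, f))  # clamp to the physically meaningful range

# Hypothetical C/N endmembers: ~8 for lake algae, ~20 for terrestrial plants.
f_allo = allochthonous_fraction(sample=17.0, em_auto=8.0, em_allo=20.0)
f_auto = 1.0 - f_allo
```

Multiplying such fractions by measured TOC storage is what yields separate allochthonous and autochthonous carbon pools per lake.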
