Abstract

When the traditional Douglas–Peucker (D–P) algorithm is used to simplify linear objects, it readily produces results containing self-intersection errors, which limits the algorithm's application. To solve the self-intersection problem, a new vector line simplification algorithm based on the D–P algorithm, monotonic chains, and dichotomy (binary search) is proposed in this paper. First, the traditional D–P algorithm is used to simplify the original lines, and the simplified lines are then divided into several monotonic chains. Second, dichotomy is used to search the intersection positions of monotonic chains efficiently, and intersecting monotonic chains are processed, thereby solving the self-intersection problem. Two groups of experimental data are selected from large data sets. The results demonstrate that the proposed method outperforms the D–P algorithm and the Star-shaped algorithm in both efficiency and accuracy.
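The two geometric building blocks of this pipeline can be sketched in Python. This is a minimal illustration under stated assumptions, not the paper's implementation: `monotonic_chains` splits a polyline into maximal x-monotonic pieces (adjacent chains share an end vertex), and `segments_intersect` is the standard orientation test that a dichotomy (binary) search over two chains would use to locate a crossing. All function names are illustrative.

```python
def monotonic_chains(points):
    """Split a polyline into maximal x-monotonic chains; adjacent
    chains share their boundary vertex."""
    chains, start = [], 0
    direction = 0  # +1 / -1 once the x-direction is determined, 0 before
    for i in range(1, len(points)):
        step = (points[i][0] > points[i - 1][0]) - (points[i][0] < points[i - 1][0])
        if direction == 0:
            direction = step
        elif step != 0 and step != direction:
            chains.append(points[start:i])   # close the current chain
            start, direction = i - 1, step   # new chain starts at the shared vertex
    chains.append(points[start:])
    return chains

def segments_intersect(p1, p2, q1, q2):
    """True iff segments p1p2 and q1q2 properly cross (orientation test)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0
```

Because the vertices of a monotonic chain are ordered along one axis, an intersection search between two chains can discard half of one chain per comparison, which is what makes the dichotomy step efficient.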

Highlights

  • With the development of remote-sensing technology, sensor technology, and Web 2.0, the large amounts of obtained spatial vector data produce great challenges in data storage, processing, and transmission

  • The D–P algorithm [1] and Ramer algorithm [2] use a given distance tolerance to determine which vertices on a line are to be eliminated or retained

  • The simplification assessment metrics of the data processed by each method are as follows. The resulting statistics are computed using the two groups of experimental data sets. (1) Time consumption: the time consumption results of the three line simplification methods are shown in Figure 5, measured in milliseconds (ms)
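The distance-tolerance criterion mentioned above can be sketched as a standard recursive Douglas–Peucker formulation in Python. This is a sketch of the classic algorithm, not the authors' code; function names are illustrative.

```python
import math

def perpendicular_distance(p, a, b):
    """Distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:                    # a and b coincide
        return math.hypot(px - ax, py - ay)
    # parallelogram area / base length = height
    return abs(dy * (px - ax) - dx * (py - ay)) / math.hypot(dx, dy)

def douglas_peucker(points, tolerance):
    """Keep the farthest interior vertex if it exceeds the tolerance
    and recurse on both halves; otherwise collapse to the end points."""
    if len(points) < 3:
        return list(points)
    first, last = points[0], points[-1]
    index, d_max = 0, -1.0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], first, last)
        if d > d_max:
            index, d_max = i, d
    if d_max <= tolerance:
        return [first, last]
    left = douglas_peucker(points[:index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right                   # drop the duplicated split vertex
```

Note that nothing in this criterion inspects other parts of the line, which is why simplified results can self-intersect and why a post-processing step such as the monotonic-chain check is needed.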


Summary

Introduction

With the development of remote-sensing technology, sensor technology, and Web 2.0, the large amounts of acquired spatial vector data pose great challenges in data storage, processing, and transmission. Based on a sequential set of five procedures, McMaster [4] presented a conceptual model for processing linear digital data. This model employed the perpendicular distance tolerance proposed by Lang [4] to simplify lines and used smoothing techniques to produce the most aesthetically acceptable results. Based on recognizing line shapes and filtering them against cartographic rules, Wang and Muller [8] proposed the Bend Simplify algorithm. Based on the Li–Openshaw algorithm [11], the D–P algorithm, and the orthogonal simplification method, Samsonov and Yakimova [12] proposed a methodology and generalization model for the geometric simplification of heterogeneous line datasets.

Methods
Results
Conclusion
