WO2009003529A1 - Arrangement and method for providing a three dimensional map representation of an area - Google Patents
Arrangement and method for providing a three dimensional map representation of an area
- Publication number
- WO2009003529A1 (application PCT/EP2007/056780 / EP2007056780W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- map representation
- images
- dimensional map
- navigation
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30188—Vegetation; Agriculture
Definitions
- The present invention relates to an arrangement and a method for providing a three dimensional map representation of an area.
- It concerns generation of images and stereo processing of said images so as to provide said three-dimensional map representation.
- The geographical information comprises digital maps having superposed information layers such as infrastructure, terrain type and different types of objects. Forming two dimensional maps, which involves capturing images of the terrain from an aircraft and post-processing the captured images, is a time consuming process. Forming three dimensional maps from captured images or range data sets of the terrain/infrastructure is even more time consuming.
- Different methods are available today, such as photogrammetric methods with manual stereo processing and laser scanning. These methods require time-consuming post-processing, and the formation of three-dimensional maps therefore becomes expensive. As a consequence, the maps are updated with a low frequency.
- EP 1 657 927 discloses image-based movement tracking of a number of objects in a particular area based on stereo-processing.
- The cameras are mounted in fixed and known positions. Therefore, matching between images captured by the different cameras is less time consuming than in the case of moving cameras.
- One object of the present invention is to provide an improved way of providing three dimensional map representations.
- The arrangement comprises an image generating unit and a processing unit.
- The image generating unit is arranged to generate time recorded images and to provide a plurality of at least partly overlapping images each covering at least a part of said area.
- The processing unit is arranged to stereo process an arbitrary number of at least partly overlapping image sets generated by said image generating unit so as to provide said three dimensional map representation.
- The arrangement is characterized in that a navigation unit is arranged to output time recorded navigation states related to the image generating unit, and in that the processing unit is arranged to, for each time recorded image to be processed by the stereo processor, associate the navigation states relating to a corresponding time record.
- The time recorded navigation states comprise, in one embodiment, position and attitude.
- The navigation unit comprises in one example an inertial navigation system and means for providing position data in a predetermined coordinate system, such as a GPS receiver or a map.
- The use of the navigation states provided by the navigation unit enables position determination of the time recorded images with high accuracy. It even enables position determination for a number of points (pixels) or for each point (pixel) in the time recorded images.
- The positions are determined in a predetermined coordinate system. Because the positions of the images are determined with high accuracy, the stereo processing can be performed without any ground based infrastructure. The processing is therefore less costly and time consuming. It can even be performed in real time or near real time.
- Each point or pixel in the three-dimensional map representations provided is inherently related to a position in the predetermined coordinate system.
- The image generating unit is mountable on a movable carrier, such as a land vehicle, satellite, aircraft, watercraft, for example a lorry, airplane, ship or submarine.
- The image generating unit can be hand held or mounted on a person.
- The present invention also relates to a method for providing a three dimensional map representation of an area comprising the steps of: generating time recorded images, providing a plurality of at least partly overlapping images each covering at least a part of said area, forming an arbitrary number of at least partly overlapping image sets from said provided images, and processing said image sets so as to provide said three dimensional map representation.
- The method is characterized by outputting time recorded navigation states comprising position and attitude and by, for each time recorded image to be processed by a stereo processor, associating the navigation states relating to a corresponding time record.
- The present invention further relates to a computer programme comprising a programme code for performing the method steps when said computer programme is run on a computer.
- Fig. 1 illustrates an arrangement for providing a three dimensional map representation of an area mounted on an aircraft.
- Fig. 2 is a block scheme showing an example of the arrangement of Fig. 1.
- Fig. 3 illustrates schematically the function of an image generation unit in the arrangement in Fig. 2.
- Fig. 4 is a block scheme showing an example of a navigation unit in the arrangement of Fig. 2.
- Fig. 5 is a flow chart showing an example of a process performed by a processing unit in the arrangement of Fig. 2.
- Fig. 6 illustrates schematically the geometry used in stereo processing.
- Fig. 7 illustrates schematically an application of the arrangement in Fig. 1 in a volume measuring application.
- Fig. 8 illustrates schematically an application of the arrangement in Fig. 1 in a forest application.
- Fig. 9 illustrates schematically an application of the arrangement in Fig. 1 in an industrial robot application.
- Fig. 10 illustrates schematically an application of the arrangement in Fig. 1 in a traffic application.
- Fig. 11 is a flow chart showing an example of a method for providing a three dimensional map representation of an area.
- An arrangement 101 for providing a three dimensional map representation of an area 103 is mounted on a movable carrier 102 in the form of an aircraft.
- The movable carrier is in an alternative example (not shown) a satellite, a land vehicle, or a watercraft, for example a lorry, ship or submarine.
- The arrangement 101 can also be hand held or mounted on a person.
- An arrangement 201 for providing a three dimensional map representation of an area comprises an image generation unit 204, a navigation unit 205 and a processing unit 206.
- The arrangement 201 also comprises a memory 207 for storing data related to the three dimensional map representation calculated by the processing unit 206, and a display unit 208 arranged to display the map representation calculated by the processing unit 206.
- The arrangement comprises either the memory 207 or the display unit 208.
- The arrangement comprises a transmitter (not shown) arranged to transmit the data related to the provided three dimensional map representation to a receiver in a remote location.
- The transmitter substitutes the memory 207 and/or the display unit 208.
- The transmitter is provided in addition to the memory and/or the display unit.
- The image generating unit 204 is arranged to generate time recorded images.
- The time records are given with an accuracy sufficient for the application.
- The time records are for example provided from a GPS receiver 412 in the arrangement 201.
- The image generation unit comprises in one example one image capturing unit.
- The image capturing unit is placed at a first position 310a so as to capture an image of an area 309a.
- The image capturing unit is placed at a second position 310b so as to capture an image of an area 309b.
- The image generating unit is arranged to provide a plurality of at least partly overlapping images each covering at least a part of the area 103. Accordingly, the updating frequency required of the image generation unit in order to provide overlapping images depends on the travelling speed of the carrier 102, the distance to a plane in which the area is formed and the size of the geographical area covered by each image; a simple relation is sketched below.
- The distance to a plane in which the area lies is approximately the distance to the ground.
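To make that relation concrete, the sketch below estimates the longest allowed time between exposures for a chosen along-track overlap, assuming a flat-ground footprint model; the speed, altitude, field-of-view and overlap values are illustrative assumptions, not figures from the description.

```python
import math

def max_capture_interval(speed_mps, altitude_m, fov_deg, overlap=0.6):
    """Longest time between exposures that still gives the requested
    along-track overlap between consecutive images (flat-ground model)."""
    footprint = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
    allowed_baseline = (1.0 - overlap) * footprint   # ground distance between exposures
    return allowed_baseline / speed_mps

# Example (made-up numbers): 50 m/s carrier, 500 m altitude, 60 degree
# field of view and 60 % overlap give roughly 4.6 s between images.
print(f"{max_capture_interval(50.0, 500.0, 60.0):.2f} s between images")
```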
- The image generation unit comprises in another example a plurality of image capturing units.
- The image capturing units can be arranged to cooperate so as to enable decreasing the updating frequency.
- Fig. 3 can also be seen as showing a plurality of image capturing units 310a, 310b fixedly mounted in relation to each other. The images of the areas 309a, 309b captured by the respective image capturing units 310a, 310b are then in one example captured at the same time.
- A navigation unit 405 is arranged to output time recorded navigation states related to the image generating unit 204.
- The time recorded navigation states comprise in one embodiment position and attitude.
- The navigation unit 405 is arranged to continuously calculate the motions of the image generating unit 204.
- The navigation unit 405 comprises a calculation unit 413 arranged to continuously calculate, for example, position and attitude angle for the image capturing unit(s) based on the calculated motions and rotations.
- The navigation unit 405 comprises in one example an inertial navigation system 411 arranged to provide the motions and rotations of the image generating unit 204.
- The navigation unit comprises in one example means for providing attitude and also position data in a predetermined coordinate system.
- The position data is for example provided by means of a GPS receiver 412 or a digitally stored map.
- The time records of the navigation states are in one example given with approximately the same accuracy as the time records of the images.
- The time records are provided using the same time reference as is used by the image generation unit.
- The time records are for example provided from the GPS receiver 412.
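Because images and navigation states share a time reference, each time recorded image can be paired with navigation states interpolated to its time record. The sketch below is a minimal illustration of that association; the array layout, the state components and the use of plain linear interpolation (which ignores attitude wrap-around) are assumptions made for the example, not part of the description.

```python
import numpy as np

def associate_navigation(image_times, nav_times, nav_states):
    """Interpolate time recorded navigation states to the time records of
    the images.  Both time series are assumed to use the same reference,
    e.g. GPS time.  nav_states has one row per navigation sample and one
    column per state component (for instance x, y, z, roll, pitch, yaw)."""
    nav_states = np.asarray(nav_states, dtype=float)
    return np.column_stack([
        np.interp(image_times, nav_times, nav_states[:, k])
        for k in range(nav_states.shape[1])
    ])

# Illustrative use: four navigation samples, two image time records.
nav_times = np.array([0.0, 0.1, 0.2, 0.3])
nav_states = np.array([[0.0,  0.0, 500.0, 0.0, 0.0, 90.0],
                       [5.0,  0.0, 500.0, 0.0, 0.0, 90.0],
                       [10.0, 0.0, 501.0, 0.0, 1.0, 90.0],
                       [15.0, 0.0, 501.0, 0.0, 1.0, 90.0]])
image_times = np.array([0.05, 0.25])
print(associate_navigation(image_times, nav_times, nav_states))
```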
- The inertial navigation system 411 is in the example described above mounted in a known relation to the image generating unit, such that the navigation states provided relate directly to the navigation states of the image generating unit.
- In another example, the image generating unit comprises a plurality of image capturing units.
- An inertial navigation system is in one example associated with each image capturing unit, whereby navigation states can be provided directly related to the respective image capturing unit.
- The image capturing units can be mounted independently of each other.
- The navigation states are in one example both a three dimensional position vector and a three dimensional attitude vector in a predefined coordinate system, but can also comprise a three dimensional velocity vector in a predefined coordinate system.
- The velocity vector is for example directly provided from the inertial navigation system 411 in the navigation unit.
- The functional aspects of an inertial navigation system and how it calculates navigation states are well known.
- The image capturing units are mounted with such a tilt angle as to provide sufficient information for the actual application. For example, in an application of three dimensional modelling of man made objects such as buildings or bridges, vertical information is of importance. Therefore, in this application, the image capturing unit should be tilted so as to provide an accurate modelling of vertical surfaces.
- The processing unit 206 is arranged to process a number of at least partly overlapping image sets generated by said image generating unit so as to provide said three dimensional map representation.
- Each image set comprises two or more images.
- The processing unit 206 is arranged to be fed with the time recorded images from the image generating unit 204 and the time recorded navigation states from the navigation unit 205.
- Camera calibration is performed by a parameter estimation procedure directly from image data, as for example in reference "An Efficient and Accurate Camera Calibration Technique for 3D Machine Vision” by R.Y. Tsai, Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, Miami Beach, FL, pp. 364-374, 1986.
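The cited Tsai procedure estimates the camera parameters directly from image data. As a rough, non-authoritative illustration of intrinsic calibration in general (not of Tsai's specific technique), the sketch below uses OpenCV's chessboard-based routine; the board size and the image folder are assumptions.

```python
import glob

import cv2
import numpy as np

pattern = (9, 6)                       # inner chessboard corners (assumed board)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):  # hypothetical folder of calibration images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Estimate the intrinsic matrix K and the distortion coefficients.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("camera matrix:\n", K, "\ndistortion:", dist.ravel())
```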
- The image sets comprise overlapping images captured at substantially the same time (i.e. the images are associated to corresponding time records) by the image capturing units 310a, 310b of the image generating unit.
- In the following, sets comprising two images, forming image pairs, are described.
- The processing unit 206 is arranged to execute a number of steps on a number of at least partly overlapping image sets generated by the image generating unit 204 so as to provide the three dimensional map representation.
- The steps performed by the processing unit comprise a step of relating 514 the navigation data to the image data.
- Image data is correlated to navigation data having a corresponding time record.
- Each pixel of the image is correlated to corresponding navigation data.
- The pixels in a subset of pixels in the image are each correlated to corresponding navigation data.
- The navigation data relates to the position and attitude of the image capturing unit(s) in a geographical coordinate system. Each pixel of the image is then associated with a geographical coordinate on the ground, as sketched below.
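A minimal sketch of that association: each pixel is back-projected through a pinhole camera model and intersected with a ground plane, using the position and attitude from the navigation unit. The intrinsic matrix, the rotation convention and the flat-ground assumption are simplifications for illustration, not the method as claimed.

```python
import numpy as np

def pixel_to_ground(u, v, K, R_wc, cam_pos, ground_z=0.0):
    """Back-project pixel (u, v) through a pinhole camera and intersect
    the viewing ray with the horizontal plane z = ground_z.
    K       : 3x3 intrinsic camera matrix
    R_wc    : 3x3 rotation from the camera frame to the world frame
              (derived from the attitude given by the navigation unit)
    cam_pos : camera position in the world frame (from the navigation unit)"""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # viewing ray, camera frame
    ray_world = R_wc @ ray_cam                           # viewing ray, world frame
    t = (ground_z - cam_pos[2]) / ray_world[2]           # scale factor to the plane
    return cam_pos + t * ray_world

# Illustrative use: nadir-looking camera 500 m above a flat ground plane.
K = np.array([[1000.0, 0.0, 640.0], [0.0, 1000.0, 480.0], [0.0, 0.0, 1.0]])
R_wc = np.array([[1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.0, 0.0, -1.0]])  # camera looks straight down
cam_pos = np.array([0.0, 0.0, 500.0])
print(pixel_to_ground(700.0, 400.0, K, R_wc, cam_pos))   # -> ground point [30, 40, 0]
```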
- The processing unit 206 is arranged to remove, in a cutting step 515, those parts of the images not having a correspondence in the other image of the image pair.
- The images of each image pair are transformed to common ground plane images.
- The transformed images of each image pair are used, along with the positions of the image capturing unit(s) when the two images were captured, so as to perform stereo calculations in a stereo processing step 517.
- The distance between the two image capturing unit positions is defined as a base length B. If an object can be unambiguously determined in both images, the object will appear somewhat displaced between the two images, by a distance that corresponds to the distance between a base plane, in which the base length B lies, and the object. This displacement is denoted disparity.
- The optical axes of the two image capturing units or image capturing unit positions are parallel.
- Similar triangles and the focal length of the image capturing units can for example be used so as to provide an expression for the distance between the image capturing unit plane and a point X which is to be measured.
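For parallel optical axes, that similar-triangle construction gives the well-known relation Z = f·B/d between the distance Z, the focal length f, the base length B and the disparity d. A one-function sketch with made-up example numbers follows.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance from the base plane to the observed point X, from the
    similar-triangle relation Z = f * B / d (parallel optical axes,
    focal length and disparity both expressed in pixels)."""
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: a 1000 px focal length, a 20 m base length
# and a 4 px disparity give a distance of 5000 m.
print(depth_from_disparity(1000.0, 20.0, 4.0))
```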
- The disparity estimation is based on finding corresponding points in both images. It is in one example assumed that the translation of the carrier 102 is performed in a solely horizontal direction.
- The known methods include correlation based methods, property based methods and methods based on local structure.
- The stereo processing is implemented by means of multimedia instructions, for example in a personal computer.
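As an illustration of the correlation based family of methods only, the sketch below performs naive block matching on a rectified image pair: for every pixel it picks the horizontal shift that minimises a sum-of-squared-differences cost over a small window. It is deliberately simple and slow; the window size and disparity range are arbitrary assumptions, and production code would typically use an optimised implementation (for example OpenCV's StereoBM).

```python
import numpy as np

def disparity_ssd(left, right, max_disp=64, win=5):
    """Naive correlation-style block matching: for each pixel, pick the
    horizontal shift that minimises the sum of squared differences over a
    (2*win+1) x (2*win+1) window.  Assumes rectified grayscale images, so
    corresponding points differ only by a horizontal shift."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.float32)
    L = np.pad(left.astype(np.float32), win)
    R = np.pad(right.astype(np.float32), win)
    for y in range(h):
        for x in range(w):
            patch = L[y:y + 2 * win + 1, x:x + 2 * win + 1]
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x) + 1):
                cand = R[y:y + 2 * win + 1, x - d:x - d + 2 * win + 1]
                cost = np.sum((patch - cand) ** 2)
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```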
- The stereo processed images are then ortho-projected (or ortho-rectified) in an ortho projecting step 518.
- The whole map representation is geometrically corrected, such that the scale of the image is uniform, once the whole map representation has been formed.
- Alternatively, the ortho projection is performed on each stereo processed image as the stereo processed images are formed.
- The ortho-projected map representation is adjusted for topographic relief, lens distortion and/or camera tilt. This means that the map representation is equivalent to a map.
- The ortho-projected image can be used to measure true distances, angles and areas.
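A heavily simplified sketch of ortho projection follows: georeferenced 3D points from the stereo processing are resampled onto a regular ground grid of uniform scale, keeping the highest point per cell, which yields an elevation raster and a texture raster. The grid cell size, the highest-point rule and the array layout are assumptions made for the illustration.

```python
import numpy as np

def ortho_project(points_xyz, values, cell=1.0):
    """Resample georeferenced 3D points onto a regular ground grid of
    uniform scale, keeping the highest point per cell: a very simplified
    ortho projection giving an elevation raster and a texture raster."""
    xy = points_xyz[:, :2]
    mins = xy.min(axis=0)
    cols, rows = np.ceil((xy.max(axis=0) - mins) / cell).astype(int) + 1
    elev = np.full((rows, cols), -np.inf)
    tex = np.zeros((rows, cols))
    c = ((xy[:, 0] - mins[0]) / cell).astype(int)
    r = ((xy[:, 1] - mins[1]) / cell).astype(int)
    for i in range(len(points_xyz)):
        if points_xyz[i, 2] > elev[r[i], c[i]]:
            elev[r[i], c[i]] = points_xyz[i, 2]
            tex[r[i], c[i]] = values[i]
    return elev, tex

# Illustrative use: four points, each with an intensity value.
pts = np.array([[0.2, 0.3, 10.0], [0.8, 0.4, 12.0], [2.1, 1.9, 11.0], [3.0, 3.0, 9.5]])
vals = np.array([100.0, 120.0, 110.0, 95.0])
elev, tex = ortho_project(pts, vals, cell=1.0)
print(elev)
```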
- The ortho projected map representation is then textured in a texturing step 519, based on the ortho projected map representation formed in the preceding step.
- The texturing is performed on each ortho-projected image as the ortho-projected images are formed.
- The accordingly provided map representation is then used in an application step 520.
- All pixels in the map representation are specified in three geographical dimensions.
- The three dimensional map representation represents elevation and lateral data for the geographical area covered by the map representation.
- Difference analysis is performed on two three-dimensional map representations so as to provide a "map" comprising difference values.
- The arrangement 101 mounted on a movable carrier 102 provides a three dimensional map representation of the area 103 at two different times.
- The difference analysis is performed on the two three-dimensional map representations so as to provide a height difference value for each analyzed pixel pair.
- Each pixel pair comprises pixels corresponding to the same geographical location in the two map representations.
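A minimal sketch of such a difference analysis, assuming the two map representations have already been resampled to the same grid so that each pixel pair shares the same geographical location; the change threshold is an arbitrary illustrative value.

```python
import numpy as np

def height_difference(elev_t0, elev_t1, threshold=0.5):
    """Per-pixel height difference between two co-registered three
    dimensional map representations, plus a mask of pixels whose change
    exceeds the given threshold (in the elevation unit of the maps)."""
    diff = elev_t1 - elev_t0
    changed = np.abs(diff) > threshold
    return diff, changed
```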
- In Fig. 7 a volume measuring application is illustrated. The volume can be determined by measuring a topographical convex or concave formation in relation to a well defined ground plane geometry. The volume is then calculated by comparing the elevation value in each location with the corresponding ground value.
- Applications related to convex formations include mountains of garbage, prospect measurements of excavation volumes, etc.
- Applications related to concave formations are for example measurements of open cut volumes.
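A sketch of that calculation under simple assumptions (a gridded elevation raster, a reference ground surface on the same grid, and a known cell footprint); negative differences are clipped so the same function covers the convex case, and the concave case follows by swapping the arguments.

```python
import numpy as np

def formation_volume(elev, ground, cell_area=1.0):
    """Volume of a convex formation above a reference ground surface:
    the sum of positive elevation differences times the cell footprint.
    For a concave formation (e.g. an open cut), call with the arguments
    swapped to integrate ground minus elevation instead."""
    excess = np.clip(elev - ground, 0.0, None)
    return float(excess.sum() * cell_area)
```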
- Volumes can also be measured by using a three dimensional differential analysis (the second functional mode).
- This application is for example applicable to erosion measurements and preventive earth slip measurements.
- Three dimensional map representations are calculated at different times and corresponding points (pixels) are compared to each other.
- In Fig. 8 an application for measuring forest parameters is illustrated. These parameters comprise for example tree height, number of trees and tree volume.
- Individual trees can be extracted from the three dimensional data by using a segmentation algorithm to capture the crown closure.
- The tree height is in one example defined as the topmost pixel in the tree segment minus a terrain elevation value in the corresponding lateral position, which can be extracted from an elevation database or obtained by direct measurement of the ground elevation.
- A volume of wood can be calculated based on the number of trees (derivable from the number of segmented trees). The determined tree heights can also be used in the calculation to refine the volume determination.
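A sketch of the tree height extraction under that definition, assuming a surface model (elev), a terrain model (terrain) on the same grid, and a label image (segments) from the crown segmentation in which 0 marks background; this data layout is an assumption made for the illustration.

```python
import numpy as np

def tree_heights(elev, terrain, segments):
    """Tree height per crown segment: elevation of the topmost pixel in
    the segment minus the terrain elevation at that lateral position."""
    heights = {}
    for seg_id in np.unique(segments):
        if seg_id == 0:                      # 0 = background, no tree
            continue
        masked = np.where(segments == seg_id, elev, -np.inf)
        top = np.unravel_index(np.argmax(masked), elev.shape)
        heights[seg_id] = elev[top] - terrain[top]
    return heights
```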
- In Fig. 9 an industrial robot application is illustrated.
- Three-dimensional information is continuously created related to an item or object which shall be operated on or processed by an industrial robot.
- The arrangement for providing three-dimensional information (generated by stereo photogrammetry as described in relation to Figs 1 to 6) is arranged directly on the robot lever arm.
- The three dimensional information, possibly together with a texture (as described in relation to Fig. 5), can be used as feedback information in a trajectory calculation algorithm and/or an algorithm for gripping.
- Man made objects such as road objects (e.g. road signs, road elements such as lamp posts, bridge pillars etc.) are detected by matching with predefined structure models and thereafter positioned in three dimensions of a geographical coordinate system using the three dimensional stereo modelling technique as described above.
- Another application is line of sight calculations, wherein visible areas and shaded areas are calculated with reference to a given geographical position.
- The line of sight calculations are provided by means of a scanning algorithm.
- The scanning algorithm includes, for example, finding the shaded areas by determining heights which shadow terrain parts lying behind them and determining the sizes of the shadowed areas.
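One way to realise such a scanning algorithm (a simplified sketch, not necessarily the exact scheme intended in the description) is to sweep outwards from the observer along rays of elevation cells and keep the largest elevation angle seen so far: a cell is shaded if a closer cell subtends a larger angle.

```python
import numpy as np

def visibility_along_ray(elev, observer_rc, observer_height, ray_cells):
    """Scan outwards from the observer along one ray of grid cells and
    return, per cell, whether it is visible.  A cell is shaded when some
    closer cell along the ray subtends a larger elevation angle."""
    r0, c0 = observer_rc
    eye = elev[r0, c0] + observer_height
    max_tangent = -np.inf
    visible = []
    for r, c in ray_cells:                 # cells ordered by increasing distance
        dist = np.hypot(r - r0, c - c0)    # distance in cell units
        tangent = (elev[r, c] - eye) / dist
        visible.append(tangent >= max_tangent)
        max_tangent = max(max_tangent, tangent)
    return visible
```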
- Another application, in which the 3D mapping technique can be used, is reconnaissance of power transmission line networks.
- Trees and underbrush in the vicinity of the power transmission lines are measured from the 3D map representations.
- The measurement involves determining the height and location of the trees/underbrush in relation to the power transmission line.
- Risk trees can be determined based on the determined heights and locations.
- Growth of the underbrush can be determined using differential analysis.
- The maps provided with the above mentioned technique can also be used in computer games, serious gaming and training.
- Digital models of real landscapes, urban environments or other real-life objects are used so as to enhance the sense of reality.
- The maps can be used in a "man in the loop scenario", wherein a player acts physically so as to drive the 3D scenario presented on a screen forwards.
- The man-in-the-loop scenario comprises in one example a training bicycle driving the scenario; the load on the bicycle can further be related to the scenario so that it becomes heavier to pedal uphill and lighter downhill.
- The maps can also be used in surveillance and reconnaissance applications. In both these applications it is desirable, for example, to identify and geographically localize different objects, perform mapping operations and terrain classification, and perform moving target identification. Usage of differential analysis reveals masked objects, moving targets and other changes in a geographical region.
- The maps can be used in a variety of other applications, for example in GPS navigators.
- A method for providing a three dimensional map representation of an area comprises the steps of: generating 1130 time recorded images, outputting time recorded navigation states 1131 comprising position and attitude, providing 1132 a plurality of at least partly overlapping images each covering at least a part of said area, forming 1133 an arbitrary number of at least partly overlapping image sets from said provided at least partly overlapping images, associating 1134, for each time recorded image to be processed by a stereo processor, the navigation states relating to a corresponding time record, and processing 1135 said image sets so as to provide said three dimensional map representation.
- The steps are not necessarily performed in the order given above.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Instructional Devices (AREA)
- Processing Or Creating Images (AREA)
- Navigation (AREA)
Abstract
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2007355942A AU2007355942B2 (en) | 2007-07-04 | 2007-07-04 | Arrangement and method for providing a three dimensional map representation of an area |
EP07787076A EP2172031A1 (fr) | 2007-07-04 | 2007-07-04 | Arrangement et procédé servant à fournir une représentation cartographique tridimensionnelle d'une zone |
JP2010513673A JP4794019B2 (ja) | 2007-07-04 | 2007-07-04 | 領域の3次元マップ表現を提供するための装置及び方法 |
US12/667,560 US9094673B2 (en) | 2007-07-04 | 2007-07-04 | Arrangement and method for providing a three dimensional map representation of an area |
PCT/EP2007/056780 WO2009003529A1 (fr) | 2007-07-04 | 2007-07-04 | Arrangement et procédé servant à fournir une représentation cartographique tridimensionnelle d'une zone |
CA2705254A CA2705254C (fr) | 2007-07-04 | 2007-07-04 | Arrangement et procede servant a fournir une representation cartographique tridimensionnelle d'une zone |
NO20100002A NO344948B1 (no) | 2007-07-04 | 2010-01-04 | Anordning og fremgangsmåte for å tilveiebringe en tredimensjonal kartrepresentasjon av et område |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2007/056780 WO2009003529A1 (fr) | 2007-07-04 | 2007-07-04 | Arrangement et procédé servant à fournir une représentation cartographique tridimensionnelle d'une zone |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009003529A1 true WO2009003529A1 (fr) | 2009-01-08 |
Family
ID=38969499
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2007/056780 WO2009003529A1 (fr) | 2007-07-04 | 2007-07-04 | Arrangement et procédé servant à fournir une représentation cartographique tridimensionnelle d'une zone |
Country Status (7)
Country | Link |
---|---|
US (1) | US9094673B2 (fr) |
EP (1) | EP2172031A1 (fr) |
JP (1) | JP4794019B2 (fr) |
AU (1) | AU2007355942B2 (fr) |
CA (1) | CA2705254C (fr) |
NO (1) | NO344948B1 (fr) |
WO (1) | WO2009003529A1 (fr) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100223008A1 (en) * | 2007-03-21 | 2010-09-02 | Matthew Dunbabin | Method for planning and executing obstacle-free paths for rotating excavation machinery |
WO2011093752A1 (fr) * | 2010-01-26 | 2011-08-04 | Saab Ab | Procédé de cartographie tridimensionnelle automatisée |
WO2014112911A1 (fr) | 2013-01-21 | 2014-07-24 | Saab Ab | Procédé et dispositif de développement d'un modèle tridimensionnel d'un environnement |
WO2014112908A1 (fr) | 2013-01-21 | 2014-07-24 | Saab Ab | Procédé et agencement permettant la réalisation d'un modèle en 3d |
CN104637370A (zh) * | 2014-12-23 | 2015-05-20 | 河南城建学院 | 一种摄影测量与遥感综合教学的方法及系统 |
CN104658039A (zh) * | 2015-02-12 | 2015-05-27 | 南京市测绘勘察研究院有限公司 | 一种城市数字地图三维建模制作方法 |
EP2911090A1 (fr) | 2014-02-24 | 2015-08-26 | Saab Ab | Procédé et agencement permettant d'identifier une différence entre un premier modèle 3D d'un environnement et un second modèle 3D de l'environnement |
US9338423B2 (en) | 2007-12-27 | 2016-05-10 | Saab Ab | Method for displaying a virtual image |
US9547935B2 (en) | 2014-02-24 | 2017-01-17 | Vricon Systems Ab | Method and a system for building a three-dimensional model from satellite images |
CN106803397A (zh) * | 2016-12-28 | 2017-06-06 | 贵州马科技有限公司 | 数字地图混合定位方法 |
US11954797B2 (en) | 2019-01-10 | 2024-04-09 | State Farm Mutual Automobile Insurance Company | Systems and methods for enhanced base map generation |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI426237B (zh) * | 2010-04-22 | 2014-02-11 | Mitac Int Corp | Instant image navigation system and method |
WO2013074065A1 (fr) | 2011-11-14 | 2013-05-23 | Intel Corporation | Procédés et dispositifs pour communications à glissement de fréquence avec sous-échantillonnage |
US9148250B2 (en) | 2012-06-30 | 2015-09-29 | Intel Corporation | Methods and arrangements for error correction in decoding data from an electromagnetic radiator |
US9161019B2 (en) * | 2012-09-10 | 2015-10-13 | Aemass, Inc. | Multi-dimensional data capture of an environment using plural devices |
US9014564B2 (en) | 2012-09-24 | 2015-04-21 | Intel Corporation | Light receiver position determination |
US9203541B2 (en) | 2012-09-28 | 2015-12-01 | Intel Corporation | Methods and apparatus for multiphase sampling of modulated light |
US9218532B2 (en) | 2012-09-28 | 2015-12-22 | Intel Corporation | Light ID error detection and correction for light receiver position determination |
US9178615B2 (en) | 2012-09-28 | 2015-11-03 | Intel Corporation | Multiphase sampling of modulated light with phase synchronization field |
US9590728B2 (en) * | 2012-09-29 | 2017-03-07 | Intel Corporation | Integrated photogrammetric light communications positioning and inertial navigation system positioning |
FR3003356A1 (fr) * | 2013-03-18 | 2014-09-19 | Delta Drone | Procede d'observation d'une zone au moyen d'un drone |
FR3004801A1 (fr) * | 2013-04-18 | 2014-10-24 | Delta Drone | Procede de mesure du volume d'un amas de materiaux |
CN105894568A (zh) * | 2014-11-17 | 2016-08-24 | 郑州捷安高科股份有限公司 | 一种基于图像识别的铁路信号设备三维建模方法 |
US9832338B2 (en) | 2015-03-06 | 2017-11-28 | Intel Corporation | Conveyance of hidden image data between output panel and digital camera |
CN105069842A (zh) * | 2015-08-03 | 2015-11-18 | 百度在线网络技术(北京)有限公司 | 道路三维模型的建模方法和装置 |
CN105512646B (zh) * | 2016-01-19 | 2019-03-01 | 腾讯科技(深圳)有限公司 | 一种数据处理方法、装置及终端 |
CN105928493A (zh) * | 2016-04-05 | 2016-09-07 | 王建立 | 基于无人机的双目视觉三维测绘系统和方法 |
US9881230B2 (en) * | 2016-05-11 | 2018-01-30 | International Business Machines Corporation | System and method for automated road identification in distant traffic camera images |
CN106289285A (zh) * | 2016-08-20 | 2017-01-04 | 南京理工大学 | 一种关联场景的机器人侦察地图及构建方法 |
SG10201702229WA (en) * | 2017-03-20 | 2018-10-30 | Arete M Pte Ltd | Systems And Methods For Mapping A Navigational Space |
US10788831B2 (en) * | 2017-10-06 | 2020-09-29 | Wipro Limited | Method and device for identifying center of a path for navigation of autonomous vehicles |
CN109461211B (zh) * | 2018-11-12 | 2021-01-26 | 南京人工智能高等研究院有限公司 | 基于视觉点云的语义矢量地图构建方法、装置和电子设备 |
CN109840920A (zh) * | 2018-12-20 | 2019-06-04 | 北京中科时空信息技术有限公司 | 航拍目标空间信息配准方法及航空器空间信息显示方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5517419A (en) * | 1993-07-22 | 1996-05-14 | Synectics Corporation | Advanced terrain mapping system |
EP1209623A2 (fr) * | 2000-11-22 | 2002-05-29 | Nec Corporation | Appareil de traitement d'image stéréo et méthode de traitement d'une image stéréo |
JP2003323640A (ja) * | 2002-04-26 | 2003-11-14 | Asia Air Survey Co Ltd | レーザスキャナデータと空中写真画像を用いた高精度都市モデルの生成方法及び高精度都市モデルの生成システム並びに高精度都市モデルの生成のプログラム |
EP1473673A2 (fr) * | 2003-04-30 | 2004-11-03 | Deere & Company | Dispositif et procédé de détection et d'analyse d'éléments d'une terre agricole pour le guidage de véhicule |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4695959A (en) | 1984-04-06 | 1987-09-22 | Honeywell Inc. | Passive range measurement apparatus and method |
US4855961A (en) * | 1986-07-31 | 1989-08-08 | Woods Hole Oceanographic Institute | Imaging apparatus |
US5691957A (en) * | 1994-06-30 | 1997-11-25 | Woods Hole Oceanographic Institution | Ocean acoustic tomography |
US5606627A (en) * | 1995-01-24 | 1997-02-25 | Eotek Inc. | Automated analytic stereo comparator |
JPH10291188A (ja) * | 1997-04-18 | 1998-11-04 | Nippon Steel Corp | 立体映像提示方法 |
US6442293B1 (en) * | 1998-06-11 | 2002-08-27 | Kabushiki Kaisha Topcon | Image forming apparatus, image forming method and computer-readable storage medium having an image forming program |
US6664529B2 (en) * | 2000-07-19 | 2003-12-16 | Utah State University | 3D multispectral lidar |
JP4181800B2 (ja) * | 2002-06-20 | 2008-11-19 | Nec東芝スペースシステム株式会社 | ステレオ画像を用いた地形計測システム及び記憶媒体並びにプログラム |
US8712144B2 (en) * | 2003-04-30 | 2014-04-29 | Deere & Company | System and method for detecting crop rows in an agricultural field |
US7873240B2 (en) * | 2005-07-01 | 2011-01-18 | The Boeing Company | Method for analyzing geographic location and elevation data and geocoding an image with the data |
-
2007
- 2007-07-04 WO PCT/EP2007/056780 patent/WO2009003529A1/fr active Application Filing
- 2007-07-04 EP EP07787076A patent/EP2172031A1/fr not_active Ceased
- 2007-07-04 CA CA2705254A patent/CA2705254C/fr active Active
- 2007-07-04 US US12/667,560 patent/US9094673B2/en active Active
- 2007-07-04 AU AU2007355942A patent/AU2007355942B2/en active Active
- 2007-07-04 JP JP2010513673A patent/JP4794019B2/ja active Active
-
2010
- 2010-01-04 NO NO20100002A patent/NO344948B1/no unknown
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5517419A (en) * | 1993-07-22 | 1996-05-14 | Synectics Corporation | Advanced terrain mapping system |
EP1209623A2 (fr) * | 2000-11-22 | 2002-05-29 | Nec Corporation | Appareil de traitement d'image stéréo et méthode de traitement d'une image stéréo |
JP2003323640A (ja) * | 2002-04-26 | 2003-11-14 | Asia Air Survey Co Ltd | レーザスキャナデータと空中写真画像を用いた高精度都市モデルの生成方法及び高精度都市モデルの生成システム並びに高精度都市モデルの生成のプログラム |
EP1473673A2 (fr) * | 2003-04-30 | 2004-11-03 | Deere & Company | Dispositif et procédé de détection et d'analyse d'éléments d'une terre agricole pour le guidage de véhicule |
Non-Patent Citations (1)
Title |
---|
See also references of EP2172031A1 * |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8315789B2 (en) * | 2007-03-21 | 2012-11-20 | Commonwealth Scientific And Industrial Research Organisation | Method for planning and executing obstacle-free paths for rotating excavation machinery |
US20100223008A1 (en) * | 2007-03-21 | 2010-09-02 | Matthew Dunbabin | Method for planning and executing obstacle-free paths for rotating excavation machinery |
US9338423B2 (en) | 2007-12-27 | 2016-05-10 | Saab Ab | Method for displaying a virtual image |
US9224242B2 (en) | 2010-01-26 | 2015-12-29 | Saab Ab | Automated three dimensional mapping method |
WO2011093752A1 (fr) * | 2010-01-26 | 2011-08-04 | Saab Ab | Procédé de cartographie tridimensionnelle automatisée |
WO2014112908A1 (fr) | 2013-01-21 | 2014-07-24 | Saab Ab | Procédé et agencement permettant la réalisation d'un modèle en 3d |
WO2014112911A1 (fr) | 2013-01-21 | 2014-07-24 | Saab Ab | Procédé et dispositif de développement d'un modèle tridimensionnel d'un environnement |
US9891321B2 (en) | 2013-01-21 | 2018-02-13 | Vricon Systems Aktiebolag | Method and arrangement for developing a three dimensional model of an environment |
US11335059B2 (en) | 2013-01-21 | 2022-05-17 | Maxar International Sweden Ab | Method and arrangement for providing a 3D model |
EP2911090A1 (fr) | 2014-02-24 | 2015-08-26 | Saab Ab | Procédé et agencement permettant d'identifier une différence entre un premier modèle 3D d'un environnement et un second modèle 3D de l'environnement |
US9460520B2 (en) | 2014-02-24 | 2016-10-04 | Vricon Systems Ab | Method and arrangement for identifying a difference between a first 3D model of an environment and a second 3D model of the environment |
US9489563B2 (en) | 2014-02-24 | 2016-11-08 | Vricon Systems Ab | Method and arrangement for identifying a difference between a first 3D model of an environment and a second 3D model of the environment |
US9547935B2 (en) | 2014-02-24 | 2017-01-17 | Vricon Systems Ab | Method and a system for building a three-dimensional model from satellite images |
CN104637370A (zh) * | 2014-12-23 | 2015-05-20 | 河南城建学院 | 一种摄影测量与遥感综合教学的方法及系统 |
CN104658039A (zh) * | 2015-02-12 | 2015-05-27 | 南京市测绘勘察研究院有限公司 | 一种城市数字地图三维建模制作方法 |
CN106803397A (zh) * | 2016-12-28 | 2017-06-06 | 贵州马科技有限公司 | 数字地图混合定位方法 |
US11954797B2 (en) | 2019-01-10 | 2024-04-09 | State Farm Mutual Automobile Insurance Company | Systems and methods for enhanced base map generation |
Also Published As
Publication number | Publication date |
---|---|
JP4794019B2 (ja) | 2011-10-12 |
NO344948B1 (no) | 2020-07-27 |
CA2705254A1 (fr) | 2009-01-08 |
AU2007355942A1 (en) | 2009-01-08 |
AU2007355942B2 (en) | 2012-11-15 |
CA2705254C (fr) | 2015-10-13 |
JP2010532029A (ja) | 2010-09-30 |
US9094673B2 (en) | 2015-07-28 |
NO20100002L (no) | 2010-02-02 |
EP2172031A1 (fr) | 2010-04-07 |
AU2007355942A2 (en) | 2010-01-28 |
US20100250125A1 (en) | 2010-09-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9094673B2 (en) | Arrangement and method for providing a three dimensional map representation of an area | |
CN109791052A (zh) | 用于生成和使用定位参考数据的方法和系统 | |
US20140139523A1 (en) | Method of generating three-dimensional (3d) models using ground based oblique imagery | |
CN107850449A (zh) | 用于生成及使用定位参考数据的方法及系统 | |
EP2850455A1 (fr) | Visualisation par un nuage de points des zones d'atterrissage acceptables pour un hélicoptère sur la base d'un lidar 4d | |
CA2678156A1 (fr) | Appareil de mesure, methode de mesure et appareil d'identification des caracteristiques | |
US11922572B2 (en) | Method for 3D reconstruction from satellite imagery | |
JP2012118666A (ja) | 三次元地図自動生成装置 | |
JP2010191066A (ja) | 三次元地図補正装置及び三次元地図補正プログラム | |
KR101453143B1 (ko) | 스테레오 매칭 처리 시스템, 스테레오 매칭 처리 방법, 및 기록 매체 | |
US20250109940A1 (en) | System and method for providing improved geocoded reference data to a 3d map representation | |
CN112800938A (zh) | 无人驾驶车辆检测侧面落石发生的方法及装置 | |
US20210304518A1 (en) | Method and system for generating an environment model for positioning | |
Takahashi et al. | Roadside tree extraction and diameter estimation with MMS LiDAR by using point-cloud image | |
Hashimov et al. | GIS technology and terrain orthophotomap making for military application | |
Roncat et al. | A natural laboratory—Terrestrial laser scanning and auxiliary measurements for studying an active landslide | |
Lu et al. | LiDAR-Visual SLAM with Integrated Semantic and Texture Information for Enhanced Ecological Monitoring Vehicle Localization. | |
Moon et al. | Pre-processing methodology of image compensation using histogram equalization for generating point-cloud of construction environment | |
CN119445013A (zh) | 一种空地异构无人平台协同三维建图方法 | |
Sreedhar et al. | Line of sight analysis for urban mobile applications: a photogrammetric approach. | |
JP2023106163A (ja) | 自己位置推定方法、自己位置推定プログラム及び自己位置推定装置 | |
JP2023009981A (ja) | 設備認識システム及び設備認識方法 | |
CN119935069A (zh) | 一种基于点云建模的无人机视角目标地理定位方法 | |
CN118482684A (zh) | 一种基于视觉的高程信息获取方法、装置及系统 | |
Meza et al. | A Structure-from-Motion Pipeline for Topographic Reconstructions Using Unmanned Aerial Vehicles and Open
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07787076 Country of ref document: EP Kind code of ref document: A1 |
|
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2010513673 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2705254 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007355942 Country of ref document: AU |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007787076 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2007355942 Country of ref document: AU Date of ref document: 20070704 Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12667560 Country of ref document: US |