WO2005098366A1 - Route Guidance System and Method - Google Patents
Route Guidance System and Method
- Publication number
- WO2005098366A1 (PCT/JP2005/005057)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- route
- landscape
- comfort
- calculation
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3453—Special cost functions, i.e. other than distance or default speed limit of road segments
- G01C21/3476—Special cost functions using point of interest [POI] information, e.g. a route passing visible POIs
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
- G09B29/106—Map spot or coordinate position indicators; Map reading aids using electronic means
Definitions
- The present invention relates to a route guidance system and method for guiding a vehicle along a traveling route to its destination.
- A navigation device is already known that records, on a storage device, map data including road data obtained by digitizing each point of the roads on a map, reads from the storage device a group of map data covering a certain range around the current position of the vehicle, displays the surrounding map on the screen of a display device, and automatically displays on that map a vehicle mark indicating the current position and traveling direction of the vehicle.
- Conventional navigation devices have a function of automatically setting a route to a destination.
- In this automatic route setting, a route between the departure point, the destination, and any transit points is calculated by taking into account various conditions, such as whether toll roads are used and which route is shortest, and one route is selected and presented on the map.
- However, such a route is set with almost no consideration given to providing a comfortable drive to the occupants of the vehicle.
- That is, the route setting treats travel simply as movement, and it is not possible to set a route on which a passenger can enjoy the scenery along the way to a sightseeing spot or other destination.
- Disclosure of the Invention: One object of the present invention is to provide a route guidance system and method capable of setting a route that provides a comfortable drive to the passengers of a vehicle.
- According to the present invention, there is provided a route guidance system comprising: a route calculation unit that calculates a route to a destination of a vehicle; a storage unit that stores landscape images of a plurality of points as data; an image providing unit that reads from the storage unit the landscape images on the route calculated by the route calculation unit and outputs them; and a display unit that displays the landscape images on the calculated route output by the image providing unit.
- The route guidance system further includes a route determination unit that determines the route calculated by the route calculation unit as the set route in accordance with an operation performed during or after the display of the landscape images on the calculated route by the display unit.
- According to the present invention, there is also provided a route guidance method comprising: a route calculation step of calculating a route to a destination of a vehicle; a step of storing landscape images of a plurality of points as data in storage means; an image providing step of reading from the storage means the landscape images on the route calculated in the route calculation step and outputting them; a display step of displaying the landscape images on the calculated route output in the image providing step; and a route determination step of determining the route calculated in the route calculation step as the set route in accordance with an operation performed during or after the display of the landscape images in the display step.
- FIG. 1 is a block diagram showing a schematic configuration of a route guidance system according to the present invention.
- FIG. 2 is a block diagram showing the configuration of the in-vehicle terminal device.
- FIG. 3 is a flowchart showing the landscape image collection processing.
- FIG. 4 is a flowchart showing the scenery comfort level calculation processing.
- FIG. 5 is a diagram showing four divisions of a landscape image.
- FIG. 6 is a flowchart showing the road analysis processing.
- FIG. 7 is a flowchart showing the landscape analysis processing.
- FIG. 8 is a flowchart showing the background analysis process.
- FIG. 9 is a diagram showing each index and road comfort level as a result of the road analysis processing.
- FIG. 10 is a diagram showing each index and the scenery comfort level as a result of the scenery analysis processing.
- FIG. 11 is a diagram showing each index and background comfort level as a result of the background analysis processing.
- FIG. 12 is a flowchart showing the landscape image management process.
- FIG. 13 is a flowchart showing the navigation processing.
- FIG. 14 is a flowchart showing the image reading and transmission processing.
- Best Mode for Carrying Out the Invention
- As shown in FIG. 1, the route guidance system includes a plurality of in-vehicle terminal devices 11 to 1n, a server 2, a network 3, and a communication relay device 4. The in-vehicle terminal devices 11 to 1n and the server 2 can communicate with one another via the network 3 and the communication relay device 4.
- Each of the in-vehicle terminal devices 11 to 1n is mounted on a corresponding one of vehicles 61 to 6n and, as shown in FIG. 2, has a camera 11, a processor 12, a GPS (Global Positioning System) device 13, a vehicle operation detection unit 14, a wireless device 15, a storage device 16, an operation unit 17, and a display device 18.
- The camera 11, the processor 12, the GPS device 13, the vehicle operation detection unit 14, the wireless device 15, the storage device 16, the operation unit 17, and the display device 18 are commonly connected by a bus 19.
- the camera 11 is, for example, a CCD camera, and is attached to the vehicle so as to photograph the front of the vehicle.
- The processor 12 takes in the image data supplied from the camera 11 and performs processing to calculate the comfort level of the landscape represented by that image data.
- The processor 12 also performs the landscape image collection processing according to the result of the analysis processing, and further performs navigation processing according to the detection outputs of the GPS device 13 and the vehicle operation detection unit 14. The details of the scenery comfort level calculation processing, the landscape image collection processing, and the navigation processing will be described later.
- The GPS device 13 detects the current position and the traveling direction of the vehicle.
- the vehicle operation detection unit 14 detects an operation state of the vehicle, such as a vehicle speed and an engine speed.
- The data on the current position and traveling direction detected by the GPS device 13 and on the operation state detected by the vehicle operation detection unit 14 are used in the landscape image collection processing and the navigation processing by the processor 12.
- The wireless device 15 receives, demodulates, and outputs radio signals transmitted from the server 2, and transmits data destined for the server 2 as radio signals.
- the storage device 16 is composed of a hard disk or a semiconductor memory.
- The storage device 16 stores the processing programs executed by the processor 12, such as the landscape image collection processing and the navigation processing, as well as data such as road data for route search and map data for display.
- the operation unit 17 includes, for example, a keyboard, and supplies a command corresponding to an input operation to the processor 12 via the bus 19.
- the display device 18 is, for example, a liquid crystal display device, and displays an own vehicle mark indicating the current position and the traveling direction of the vehicle together with a map, and also displays a route and a landscape image calculated by the processor 12.
- the server 2 has a storage device 21 and is connected to the network 3.
- The server 2 executes landscape image management processing to manage the image data sent from each of the in-vehicle terminal devices 11 to 1n, using the image additional information. When the server 2 receives a landscape image request from any of the in-vehicle terminal devices 11 to 1n, it performs image reading and transmission processing and transmits image data. The details of the landscape image management processing and the image reading and transmission processing will be described later.
- In the landscape image collection processing, as shown in FIG. 3, the processor 12 first fetches the image data from the camera 11 (step S1) and determines whether an obstacle exists in the image (still image) indicated by that image data (step S2).
- Obstacles are objects other than the scenery, such as preceding vehicles and parked vehicles.
- For example, the image data captured this time is compared with several immediately preceding frames of image data to judge whether an obstacle exists. If there is an obstacle, the flow returns to step S1 to acquire new image data. Since a view with no obstacle at all is rare, step S2 may instead determine whether the total area of the obstacle portions shown in the image exceeds a threshold.
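As a rough illustration of this frame-comparison idea, the following Python sketch flags a frame as obstructed when too many pixels differ from recent frames. The array inputs, thresholds, and function names are illustrative assumptions, not the patent's actual implementation.

```python
import numpy as np

def obstacle_fraction(current, previous_frames, diff_thresh=30):
    """Fraction of pixels that changed versus any recent frame;
    changed pixels are treated as obstacle candidates (e.g. vehicles)."""
    moving = np.zeros(current.shape, dtype=bool)
    for prev in previous_frames:
        moving |= np.abs(current.astype(int) - prev.astype(int)) > diff_thresh
    return float(moving.mean())

def frame_is_obstructed(current, previous_frames, area_thresh=0.05):
    """Step S2 variant: reject the frame only when the total obstacle
    area exceeds a threshold, since a perfectly clear view is rare."""
    return obstacle_fraction(current, previous_frames) > area_thresh
```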
- If there is no obstacle, scenery comfort level calculation processing is performed on the landscape image indicated by the current image data (step S3).
- In the scenery comfort level calculation processing, as shown in FIG. 4, the landscape image is first divided into four regions by its diagonals (step S11). As shown in FIG. 5, the image is a rectangle and is divided by the diagonal lines A and B into four regions: upper, lower, left, and right. The upper region is the background region, the lower region is the road region, and the left and right regions are the landscape regions.
- Then, road analysis processing is performed on the image in the lower region (step S12), landscape analysis processing on the images in the left and right regions (step S13), and background analysis processing on the image in the upper region (step S14).
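The diagonal division of step S11 can be expressed compactly with boolean masks. This is a minimal sketch assuming a rectangular image stored as a numpy array; the names are illustrative.

```python
import numpy as np

def split_by_diagonals(image):
    """Divide a rectangular image by its two diagonals A and B into the
    upper (background), lower (road), and left/right (landscape) regions."""
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    above_a = ys * w < xs * h            # above diagonal A: top-left to bottom-right
    above_b = ys * w < (w - 1 - xs) * h  # above diagonal B: top-right to bottom-left
    return {
        "background": above_a & above_b,    # upper triangle
        "road":       ~above_a & ~above_b,  # lower triangle
        "left":       ~above_a & above_b,   # left triangle
        "right":      above_a & ~above_b,   # right triangle
    }

# Usage: masks = split_by_diagonals(frame); road_pixels = frame[masks["road"]]
```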
- In the road analysis processing, as shown in FIG. 6, white line recognition and approximate straight line calculation are performed first (step S41). That is, a white line on the road is recognized, and an approximate straight line of the white line is calculated.
- As a white line recognition method, there is, for example, the method disclosed in Japanese Patent Application Laid-Open No. Hei 6-333192, the disclosure of which is incorporated herein.
- In that white line recognition method, white line candidate points are extracted based on the image data, and the frequency distribution of the angles, with respect to a reference line, of the line segments connecting pairs of white line candidate points is determined. Based on the frequency distribution, the actual angle of the white line with respect to the reference line and the actual candidate points included in the white line are extracted, and the approximate straight line of the white line is determined from the actual angle and the actual candidate points.
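A least-squares fit is one simple way to realize the "approximate straight line" and the step S42 straight-line distance. The sketch below assumes candidate points given as (x, y) pixel coordinates; it is not the method of the cited publication.

```python
import numpy as np

def approximate_white_line(points):
    """Fit y = slope * x + intercept through white-line candidate points."""
    pts = np.asarray(points, dtype=float)
    slope, intercept = np.polyfit(pts[:, 0], pts[:, 1], 1)
    return slope, intercept

def straight_line_distance(points, slope, intercept, max_dev=5.0):
    """Distance from the nearest candidate point to the first point whose
    deviation from the approximate line exceeds max_dev pixels."""
    pts = np.asarray(points, dtype=float)
    pts = pts[np.argsort(pts[:, 1])[::-1]]  # bottom of image (near the vehicle) first
    for p in pts:
        if abs(p[1] - (slope * p[0] + intercept)) > max_dev:
            return float(np.hypot(*(p - pts[0])))
    return float(np.hypot(*(pts[-1] - pts[0])))  # the whole line is straight
```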
- Next, straight-line distance measurement and lane width measurement are performed (step S42).
- In the straight-line distance measurement, the point at which the recognized white line starts to deviate from the approximate straight line is obtained, and the distance along the straight line to that point is defined as the straight-line distance.
- As a lane width measuring method, there is, for example, the method disclosed in Japanese Patent Application Laid-Open No. 2002-163642, the disclosure of which is incorporated herein. In that method, the lane position on the road is specified, and the lane width is estimated based on the current lane position and its past history.
- Next, road surface condition recognition is performed (step S43). The road surface condition recognition identifies, by color distribution analysis, whether or not the road surface is paved; it may also recognize road surface conditions that depend on the weather, such as dry, wet, and snow-covered.
- Japanese Patent Application Laid-Open No. 2001-88636 discloses a method for recognizing road surface conditions such as snow-covered and gravel roads, and this method may be used; its disclosure is incorporated herein. In scoring, paved roads are scored high and unpaved roads low.
- Next, the straightness of the road, the width of the road, and the cleanliness of the road surface are set according to the values of the road parameters obtained in steps S41 to S43 (step S44). That is, the straightness is set according to the straight-line distance, the width according to the lane width, and the cleanliness according to the road surface condition value.
- The values of the straightness of the road, the width of the road, and the cleanliness of the road surface are each set in the range of 0 to 100 according to their similarity to respective reference values.
- Then, the average of the straightness of the road, the width of the road, and the cleanliness of the road surface set in step S44 is calculated (step S45). This average value indicates the road area comfort level.
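The 0-100 "similarity to a reference value" scoring and the step S45 averaging might look like the following. The reference values and tolerances are invented for illustration, since the patent does not specify them.

```python
def similarity_score(value, reference, tolerance):
    """Map a measured parameter to 0-100 by its closeness to a reference."""
    return max(0.0, 100.0 * (1.0 - abs(value - reference) / tolerance))

def road_area_comfort(straight_dist_m, lane_width_m, surface_score):
    """Step S44: score each road parameter; step S45: average the scores."""
    straightness = similarity_score(straight_dist_m, reference=200.0, tolerance=200.0)
    width = similarity_score(lane_width_m, reference=3.5, tolerance=2.0)
    cleanliness = float(surface_score)  # already 0-100: paved high, unpaved low
    return (straightness + width + cleanliness) / 3.0
```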
- In the landscape analysis processing, as shown in FIG. 7, the green ratio and the blue ratio of each of the left and right regions are analyzed first (step S51).
- For the green ratio, the number of pixels of the green portion (including similar colors) in each region is extracted, and the ratio of that number to the total number of pixels in the region is defined as the green ratio; the blue ratio is defined in the same way for the blue portion. The green ratio corresponds to the proportion of forest in each of the left and right regions, and the blue ratio corresponds to the proportion of sea.
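A crude channel-dominance test illustrates how a green or blue ratio could be extracted. Real "similar color" handling would be more elaborate (for example, HSV ranges), so treat the margin and the names below as assumptions.

```python
import numpy as np

def color_ratio(region_rgb, channel, margin=30):
    """Ratio of pixels whose given channel (0=R, 1=G, 2=B) dominates the
    other two by at least `margin`, relative to all pixels in the region."""
    rgb = region_rgb.astype(int)
    dominant = np.ones(rgb.shape[:2], dtype=bool)
    for other in range(3):
        if other != channel:
            dominant &= rgb[..., channel] > rgb[..., other] + margin
    return float(dominant.mean())

# Step S51: green_ratio = color_ratio(left_region, 1)  # ~ proportion of forest
#           blue_ratio  = color_ratio(left_region, 2)  # ~ proportion of sea
```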
- Next, a color distribution analysis is performed (step S52). The color distribution is obtained by counting the number of pixels of each color in each of the left and right regions as a histogram.
- fractal dimension analysis of each of the left and right regions is performed (step S53).
- the quality of the landscape is evaluated by the value of the fractal dimension.
- Japanese Patent Application Laid-Open No. 2000-57353 discloses a method for evaluating the quality of a landscape using fractal dimension analysis, and its disclosure is incorporated herein. According to that publication, the quality of a landscape is evaluated as high when the value of the fractal dimension, which lies between 0 and 2, is in the range of 1.50 to 1.65.
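Box counting is a standard way to estimate a fractal dimension in the 0-2 range from a binary edge image. This sketch is generic, not the method of the cited publication.

```python
import numpy as np

def box_counting_dimension(edges):
    """Estimate the fractal dimension of a 2-D boolean edge image.
    Values around 1.50-1.65 would rate the landscape as high quality."""
    h, w = edges.shape
    sizes = [2, 4, 8, 16, 32, 64]
    counts = []
    for s in sizes:
        trimmed = edges[: h - h % s, : w - w % s]
        # Count boxes of side s containing at least one edge pixel.
        boxes = trimmed.reshape(h // s, s, -1, s).any(axis=(1, 3))
        counts.append(max(int(boxes.sum()), 1))
    # The dimension is the slope of log(count) against log(1/size).
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return float(slope)
```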
- Next, the proportion of forest and sea, the number of signboards, and the complexity of the landscape are set according to the landscape parameter values obtained in steps S51 to S53 (step S54). That is, the proportion of forest and sea is set according to the green ratio and the blue ratio, the number of signboards according to the color distribution, and the complexity according to the value of the fractal dimension.
- The values of the proportion of forest and sea, the number of signboards, and the complexity are each set in the range of 0 to 100 according to their similarity to respective reference values. Then, the averages of the proportion of forest and sea, the number of signboards, and the complexity set in step S54 are calculated for each of the left and right regions (step S55). These average values indicate the left and right landscape comfort levels.
- In the background analysis processing, as shown in FIG. 8, the blue ratio of the upper region is analyzed first (step S61).
- The number of pixels of the blue portion (including similar colors) in the upper region is extracted, and the ratio of that number to the total number of pixels in the region is defined as the blue ratio. The blue ratio corresponds to the proportion of blue sky in the upper region.
- a color distribution analysis is performed (step S62).
- The number of pixels of each color in the upper region is counted as a histogram, and signboards, overpasses, and distant mountain ranges are analyzed.
- Next, distance measurement is performed (step S63). This measures the distances to the major background objects found in the color distribution analysis, such as the sky, distant mountains, overpasses, and tunnels. Using the captured image and the image of the previous frame, the optical flow is calculated and the distance of each object in the region is measured; if an object is at infinity, it is determined that no object is present.
- Japanese Patent Application Laid-Open No. 6-107096, the disclosure of which is incorporated herein, shows that the movement of the same point on a target object appearing in two temporally adjacent images of a series of captured forward-view moving images can be detected as an optical flow vector.
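With OpenCV, dense optical flow gives a quick proxy for the "infinity means no object" test: background pixels that barely move between frames are effectively at infinite distance. The threshold and region choice below are assumptions.

```python
import cv2
import numpy as np

def background_is_open(prev_gray, curr_gray, flow_thresh=0.5):
    """Step S63 idea: compute dense optical flow between two consecutive
    grayscale frames and treat a near-zero median flow in the upper
    (background) region as 'object at infinity', i.e. an open view."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    upper = flow[: prev_gray.shape[0] // 2]    # upper half as the background region
    magnitude = np.linalg.norm(upper, axis=2)  # per-pixel flow vector length
    return float(np.median(magnitude)) < flow_thresh
```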
- Next, the blue sky ratio, the number of signs, and the degree of openness of the background are set according to the background parameter values obtained in steps S61 to S63 (step S64). That is, the blue sky ratio is set according to the blue ratio, the number of signs according to the color distribution, and the openness according to the distances to the sky, distant mountains, overpasses, and tunnels.
- The values of the blue sky ratio, the number of signs, and the degree of openness are each set in the range of 0 to 100 according to their similarity to respective reference values.
- Then, the average of the blue sky ratio, the number of signs, and the degree of openness set in step S64 is calculated (step S65). This average value indicates the background comfort level.
- Finally, the average of the road area comfort level, the left and right landscape comfort levels, and the background comfort level obtained above is calculated as the scenery comfort level of the scene in front of the vehicle (step S15).
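Aggregating the regional values (step S15) and the comfort test applied next in the collection flow (step S4, described below) reduce to an average and a threshold. A minimal sketch, assuming the left and right landscape values enter separately:

```python
def scenery_comfort(road, left_landscape, right_landscape, background):
    """Step S15: the scenery comfort level is the average of the regional
    comfort levels (each on a 0-100 scale)."""
    return (road + left_landscape + right_landscape + background) / 4.0

def is_comfortable(comfort, threshold=70.0):
    """Step S4: only images whose comfort level exceeds the threshold
    (for example, 70) are kept and sent to the server."""
    return comfort > threshold
```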
- By the road analysis processing in step S12, the values of the straightness of the road, the width of the road, and the cleanliness of the road surface, together with the road area comfort level, are obtained as shown in FIG. 9.
- By the landscape analysis processing in step S13, the values of the proportion of forest and sea, the number of signboards, and the complexity of the landscape, together with the landscape comfort level for each of the left and right regions, are obtained as shown in FIG. 10.
- By the background analysis processing in step S14, the values of the blue sky ratio, the number of signs, and the degree of openness, together with the background comfort level, are obtained as shown in FIG. 11.
- After the scenery comfort level has been calculated in this way, it is determined whether or not the scenery is comfortable according to the scenery comfort level (step S4). In step S4, if the calculated scenery comfort level exceeds a predetermined threshold (for example, 70), the scenery is determined to be comfortable; if it is equal to or below the threshold, it is not. If the scenery is determined to be comfortable, image additional information consisting of the shooting location, orientation, date and time, and scenery comfort level of the current image is created (step S5).
- The date and time can be obtained from a calendar and a clock provided in the processor 12.
- Alternatively, the processor 12 may obtain the information on the shooting location, orientation, and date and time together with the image data, store it in the storage device 16, and read and use it when executing step S5.
- Next, the processor 12 transmits the current image data together with the image additional information to the server 2 (step S6).
- The current image data, including the image additional information, is converted into packet data destined for the server 2, transmitted as a radio signal by the wireless device 15, and received by the communication relay device 4.
- the communication relay device 4 transmits the packet data in the received radio signal to the server 2 via the network 3.
- In step S7, it is determined whether or not to continue the landscape image collection processing. For example, if the processing is to be continued according to an input operation on the operation unit 17, the flow returns to step S1 and steps S1 to S7 are repeated; otherwise, the landscape image collection processing ends.
- In this way, each of the plurality of in-vehicle terminal devices 11 to 1n transmits image data together with image additional information to the server 2.
- The server 2 executes landscape image management processing to manage the transmitted image data using the image additional information.
- The landscape image management processing by the server 2 will now be described with reference to FIG. 12.
- In step S21, it is determined whether or not new image data has been received. If image data has been received, it is determined whether the shooting location and orientation in the image additional information added to the received image data are the same as those of any stored image data (step S22). That is, it is determined whether a landscape image with the same shooting location and orientation exists among the image data stored in the storage device 21. The shooting locations of two images are regarded as the same if they differ by no more than a predetermined distance (for example, 50 m), and the orientations are regarded as the same if they differ by no more than a predetermined azimuth (for example, 30 degrees). If no image data with the same shooting location and orientation is stored in the storage device 21, the newly received image data is stored in the storage device 21 together with its image additional information (step S23).
- If image data with the same shooting location and orientation is already stored, it is further determined whether the shooting period and the time of day are the same for the existing image data and the received image data (step S24). The shooting periods divide the year into spring (March to May), summer (June to August), autumn (September to November), and winter (December to February). The times of day are morning (6 to 10 o'clock), daytime (11 to 16 o'clock), evening (17 to 19 o'clock), and night (the remaining hours). If the shooting period or the time of day differs between the existing image data and the received image data, the received image data is stored together with its image additional information (step S23).
- If image data with the same shooting location, orientation, shooting period, and time of day is already stored in the storage device 21, it is determined which of the existing image data and the received image data has the higher scenery comfort level (step S25). If the scenery comfort level of the received image data is higher than that of the existing image data, the received image data is stored in the storage device 21 together with its image additional information (step S23); in this case, the existing image data may be deleted from the storage device 21 together with its image additional information. On the other hand, if the scenery comfort level of the received image data is lower, the received image data is discarded without being stored in the storage device 21 (step S26).
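The server's store-or-discard decision of steps S22 to S26 is essentially a keyed comparison. This Python sketch bundles the season and time-of-day bucketing with a haversine distance check; all field names and helper functions are illustrative assumptions, not the patent's data model.

```python
import math

def season_of(dt):
    """Spring: Mar-May, summer: Jun-Aug, autumn: Sep-Nov, winter: Dec-Feb."""
    return {3: "spring", 4: "spring", 5: "spring", 6: "summer", 7: "summer",
            8: "summer", 9: "autumn", 10: "autumn", 11: "autumn"}.get(dt.month, "winter")

def time_band_of(dt):
    """Morning 6-10, daytime 11-16, evening 17-19, night otherwise."""
    h = dt.hour
    if 6 <= h <= 10: return "morning"
    if 11 <= h <= 16: return "daytime"
    if 17 <= h <= 19: return "evening"
    return "night"

def distance_m(a, b):
    """Haversine distance in metres between (lat, lon) pairs."""
    (la1, lo1), (la2, lo2) = a, b
    p1, p2 = math.radians(la1), math.radians(la2)
    dp, dl = math.radians(la2 - la1), math.radians(lo2 - lo1)
    s = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371000.0 * math.asin(math.sqrt(s))

def should_store(received, existing, dist_m=50.0, azimuth_deg=30.0):
    """Steps S22-S26: keep the received image unless a stored image at the
    same place/orientation, season, and time of day is at least as comfortable."""
    for old in existing:
        same_place = (distance_m(received["location"], old["location"]) <= dist_m and
                      abs((received["azimuth"] - old["azimuth"] + 180) % 360 - 180) <= azimuth_deg)
        if not same_place:
            continue
        if (season_of(received["datetime"]) == season_of(old["datetime"]) and
                time_band_of(received["datetime"]) == time_band_of(old["datetime"])):
            return received["comfort"] > old["comfort"]   # step S25
    return True  # steps S23/S24: no comparable stored image
```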
- After execution of step S23 or S26, it is determined whether or not to continue the landscape image management processing (step S27).
- If the end of the landscape image management processing is instructed in response to an input operation on an operation unit (not shown) of the server 2, the landscape image management processing ends. If there is no such instruction, the flow returns to step S21 and steps S21 to S27 are repeated.
- In this way, the image data transmitted from each of the plurality of in-vehicle terminal devices 11 to 1n is managed by the server 2.
- The image data managed by the server 2 is used in the navigation processing performed by the processors 12 of the in-vehicle terminal devices 11 to 1n.
- In the navigation processing, as shown in FIG. 13, the processor 12 first performs a route calculation (step S70).
- In the route calculation, for example, a destination is specified by operating the operation unit 17, and a route from the current position to the destination is calculated according to the route search road data stored in the storage device 16.
- the calculated route is displayed on the display device 18 together with the map.
- The departure point of the route may also be specified by the passenger via the operation unit 17 instead of using the current position.
- After calculating the route from the current position to the destination, the processor 12 requests the server 2 to provide suitable landscape images on and around the route (step S71).
- the landscape image request is supplied from the wireless device 15 to the server 2 via the communication relay device 4 and the network 3 in the same manner as the image data described above.
- Upon receiving the landscape image request, the server 2 performs the image reading and transmission processing shown in FIG. 14. In this processing, all the image data on and around the route indicated by the landscape image request is read from the storage device 21 in accordance with the image additional information (step S31), and the read image data is transmitted to the in-vehicle terminal device that sent the landscape image request (step S32). In that in-vehicle terminal device, the image data from the server 2 is received by the wireless device 15 via the network 3 and the communication relay device 4 and supplied to the processor 12.
- The processor 12 determines whether or not the image data from the server 2 has been received successfully (step S72). If so, the processor 12 stores the received image data in the storage device 16 (step S73). Then, from among the stored image data, the image data whose shooting locations lie on the calculated route is selected, and a list of the landscape images indicated by the selected image data is displayed on the display device 18 (step S74).
- Each landscape image may be displayed in a small size (for example, thumbnail size) at its position on the route of the displayed map, or the landscape images may simply be displayed in a small size in order from the current position to the destination.
- Next, the processor 12 determines whether or not the passenger has accepted the displayed route (step S75). For example, when the landscape images are displayed in step S74, a question asking whether the displayed route is acceptable is shown on the display device 18 at the same time, and a passenger of the vehicle responds with an input operation on the operation unit 17 to decide the route.
- The determination in step S75 may be made either during or after the display of the landscape image list.
- If the displayed route is accepted, the route is determined and route guidance is started (step S76). That is, the own vehicle mark indicating the current position and traveling direction of the vehicle is displayed on the display device 18 together with the map, and the lane to take or a left or right turn at an intersection is announced by voice from a speaker (not shown). This is the same as the route guidance operation of a conventional navigation device.
- When the vehicle approaches a point where a stored landscape image was shot, the image is displayed on the display device 18 and announced by voice (step S77). That is, when the vehicle comes within, for example, 50 m of the location where image data stored in the storage device 16 was captured, the corresponding image data is read out and displayed on the display device 18, and the approach to the point is announced by voice. Steps S76 and S77 are repeatedly executed until the destination is reached (step S78).
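The approach trigger for step S77 only needs a geodesic distance test against each stored shooting location. A small self-contained sketch (field names are illustrative); an equirectangular approximation is accurate enough at a 50 m scale:

```python
import math

def within_radius(pos, shot, radius_m=50.0):
    """Step S77 trigger: is the shooting location within ~50 m of the vehicle?"""
    (la1, lo1), (la2, lo2) = pos, shot
    dx = math.radians(lo2 - lo1) * math.cos(math.radians((la1 + la2) / 2)) * 6371000.0
    dy = math.radians(la2 - la1) * 6371000.0
    return math.hypot(dx, dy) <= radius_m

def images_to_show(current_pos, stored_images, radius_m=50.0):
    """Return the stored landscape images whose shooting locations the
    vehicle is currently approaching."""
    return [img for img in stored_images
            if within_radius(current_pos, img["location"], radius_m)]
```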
- If it is determined in step S75 that the displayed route is not accepted by the passenger, it is further determined whether to change the route by addition or by deletion (step S79). That is, if the passenger does not accept the displayed route, the display device 18 requests the passenger to select either addition or deletion in order to change the route, and the passenger selects one of the two by operating the operation unit 17.
- If addition is selected, the image data of the landscape images around the route calculated in step S70 is read from the storage device 16, and the landscape images indicated by the read image data are displayed on the display device 18 (step S80).
- The landscape images around the route may be displayed in a small size at their positions on the route of the displayed map, or may simply be displayed in a small size in order from the current position to the destination.
- The passenger can select new transit points via the operation unit 17 from among the plurality of landscape image points around the displayed route. The processor 12 accepts the selection of the new transit points (step S81) and calculates a route that passes through them (step S82). One or more new transit points can be selected.
- If deletion is selected in step S79, the passenger uses the operation unit 17 to select, from the list of landscape images, the images to be deleted, that is, the images of points through which the passenger does not want to travel. The processor 12 accepts the selection of the images to be deleted (step S83) and calculates a route that avoids the points of the selected images (step S84). One or more images can be selected.
- After execution of step S82 or S84, the flow returns to step S74, where the image data whose shooting locations lie on the newly calculated route is selected and the landscape images indicated by the selected image data are displayed on the display device 18; steps S75 to S78 are then executed as described above.
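The add/delete recalculation of steps S82 and S84 amounts to re-running the route search with via points inserted or nodes excluded. A sketch using networkx as a stand-in for the device's road-network search; the graph, node names, and edge attribute are assumptions:

```python
import networkx as nx

def recalculate_route(graph, start, goal, via=(), avoid=()):
    """Step S82: thread the route through newly selected stopovers;
    step S84: exclude the points of rejected landscape images."""
    g = graph.copy()
    g.remove_nodes_from(avoid)       # deletion: avoid rejected scenery points
    waypoints = [start, *via, goal]  # addition: visit new stopovers in order
    route = []
    for a, b in zip(waypoints, waypoints[1:]):
        leg = nx.shortest_path(g, a, b, weight="length")
        route.extend(leg if not route else leg[1:])  # drop duplicated junctions
    return route
```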
- The server may automatically create a scenery comfort level distribution map, taking the shooting periods into consideration, based on the comfortable landscape images collected from the in-vehicle terminal devices.
- The scenery comfort level distribution map is downloaded to an in-vehicle terminal device in response to an operation by a passenger, and on the in-vehicle terminal device a region is selected from the distribution map and a drive time in that region is set.
- The in-vehicle terminal device then automatically selects transit points in consideration of the set time, sets the route, and displays the landscape images together with the route map, so that the passenger can easily drive through an area with comfortable scenery.
- Alternatively, the route may be changed automatically to one that passes through, for example, five comfortable scenery points around the original route.
- In the embodiment described above, the in-vehicle terminal device includes the route calculation unit, the display unit, and the route determination unit, while the server includes the image providing unit having the storage unit; however, all of these units may be provided in the in-vehicle terminal device. In that case, landscape images prepared in advance can be stored in the in-vehicle terminal device without using a server and used for image display when setting a route.
- In the embodiment described above, the case where the image in front of the vehicle is divided into four regions by two diagonals for calculating the scenery comfort level has been described in detail, but the number of diagonal lines and the number of divided regions are not limited to these.
- As for the method of calculating the scenery comfort level, another method that does not divide the image may of course be used.
- As described above, according to the present invention, a route can be set so that a comfortable drive is provided to the passengers.
- the present invention can be applied to an in-vehicle navigation device, a landscape image automatic collection device, or a landscape comfort distribution map automatic creation device.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2006511982A (JP4320032B2) | 2004-03-31 | 2005-03-15 | Route guidance system and method
US11/547,331 (US7653485B2) | 2004-03-31 | 2005-03-15 | Route guidance system and method

Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
JP2004-106186 | 2004-03-31 | |
JP2004106186 | 2004-03-31 | |
Publications (1)
Publication Number | Publication Date
---|---
WO2005098366A1 | 2005-10-20
Family
ID=35125184
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
PCT/JP2005/005057 (WO2005098366A1) | Route guidance system and method | 2004-03-31 | 2005-03-15
Country Status (3)
Country | Link
---|---
US (1) | US7653485B2
JP (1) | JP4320032B2
WO (1) | WO2005098366A1
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7818123B2 (en) * | 2004-03-31 | 2010-10-19 | Pioneer Corporation | Routing guide system and method |
JP4493050B2 (ja) * | 2005-06-27 | 2010-06-30 | Pioneer Corporation | Image analysis apparatus and image analysis method |
CN101211407B (zh) * | 2006-12-29 | 2011-10-26 | Neusoft Group Co., Ltd. | Day and night image recognition method and apparatus |
US20090171980A1 (en) * | 2007-12-24 | 2009-07-02 | Meadow William D | Methods and apparatus for real estate image capture |
US8532927B2 (en) * | 2008-11-07 | 2013-09-10 | Intellectual Ventures Fund 83 Llc | Generating photogenic routes from starting to destination locations |
US8472903B2 (en) * | 2010-02-23 | 2013-06-25 | Paccar Inc | Entertainment systems with enhanced functionality |
JP5259670B2 (ja) * | 2010-09-27 | 2013-08-07 | Toshiba Corporation | Content summarization device and content summary display device |
US9794519B2 (en) * | 2010-12-20 | 2017-10-17 | Nec Corporation | Positioning apparatus and positioning method regarding a position of mobile object |
JP5669767B2 (ja) * | 2011-12-13 | 2015-02-18 | Toyota Motor Corporation | Information providing device |
JP5935435B2 (ja) * | 2012-03-26 | 2016-06-15 | Fujitsu Limited | Image processing apparatus and image processing method |
US9733097B2 (en) * | 2014-10-31 | 2017-08-15 | Toyota Jidosha Kabushiki Kaisha | Classifying routes of travel |
US9816830B2 (en) | 2015-05-20 | 2017-11-14 | Uber Technologies, Inc. | Navigation lane guidance |
US20170161760A1 (en) * | 2015-12-08 | 2017-06-08 | Hyundai America Technical Center, Inc. | Systems and methods for an in-vehicle survey with generated routes |
KR102477362B1 (ko) * | 2015-12-18 | 2022-12-15 | Samsung Electronics Co., Ltd. | Relay-based communication technique for a communication terminal |
US11367305B2 (en) * | 2018-09-28 | 2022-06-21 | Apple Inc. | Obstruction detection during facial recognition processes |
DE102020108805A1 (de) * | 2020-03-31 | 2021-09-30 | Audi Aktiengesellschaft | Method for operating a cloud-based platform for storing and sharing video files, and a cloud-based system for storing and sharing video files |
KR102702773B1 (ko) * | 2020-06-24 | 2024-09-05 | Hyundai Motor Company | Vehicle and control method thereof |
US12306002B2 (en) * | 2022-06-14 | 2025-05-20 | The Florida International University Board Of Trustees | Systems and methods for navigating based on scenic quality |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08105752A (ja) * | 1994-10-04 | 1996-04-23 | Alpine Electron Inc | In-vehicle navigation device |
JPH09113291A (ja) * | 1995-10-19 | 1997-05-02 | Toshiba Corp | Map display processing device |
JPH1172344A (ja) * | 1997-08-29 | 1999-03-16 | Fujitsu Ten Ltd | Navigation device |
JP2000283773A (ja) * | 1999-03-31 | 2000-10-13 | Sony Corp | Map display device |
JP2000304559A (ja) * | 1999-04-22 | 2000-11-02 | Xanavi Informatics Corp | Navigation device and information providing system |
JP2001215123A (ja) * | 2000-01-31 | 2001-08-10 | Sony Corp | Navigation device and searched-route display method |
JP2002213982A (ja) * | 2001-01-18 | 2002-07-31 | Fujitsu Ltd | Route calculation device, navigation device, and computer-readable recording medium |
JP2003269971A (ja) * | 2002-03-20 | 2003-09-25 | Denso Corp | In-vehicle navigation device |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030210806A1 (en) * | 2002-05-07 | 2003-11-13 | Hitachi, Ltd. | Navigational information service with image capturing and sharing |
2005
- 2005-03-15 US US 11/547,331 patent US7653485B2 (not active: Expired - Fee Related)
- 2005-03-15 WO PCT/JP2005/005057 patent WO2005098366A1 (active: Application Filing)
- 2005-03-15 JP JP2006511982A patent JP4320032B2 (not active: Expired - Fee Related)
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007233470A (ja) * | 2006-02-27 | 2007-09-13 | Mazda Motor Corp | Vehicle driving support device |
JP2008282106A (ja) * | 2007-05-08 | 2008-11-20 | Fujitsu Ltd | Obstacle detection method and obstacle detection device |
JP2009264796A (ja) * | 2008-04-22 | 2009-11-12 | Clarion Co Ltd | Information display device, control method therefor, and control program |
CN104608710A (zh) * | 2013-11-05 | 2015-05-13 | Ford Global Technologies | Method and apparatus for vehicle radio station privacy mode control |
CN104608710B (zh) * | 2013-11-05 | 2018-09-11 | Ford Global Technologies | Method and apparatus for vehicle radio station privacy mode control |
WO2018230492A1 (ja) * | 2017-06-16 | 2018-12-20 | Honda Motor Co., Ltd. | Information processing device, information processing method, and program |
CN110741223A (zh) * | 2017-06-16 | 2020-01-31 | Honda Motor Co., Ltd. | Information processing device, information processing method, and program |
JP2020073915A (ja) * | 2017-06-16 | 2020-05-14 | Honda Motor Co., Ltd. | Information processing device, information processing method, and program |
US11231281B2 (en) | 2017-06-16 | 2022-01-25 | Honda Motor Co., Ltd. | Information-processing device, information-processing method, and program |
JP7546359B2 (ja) | 2017-06-16 | 2024-09-06 | Honda Motor Co., Ltd. | Information processing device, information processing method, and program |
CN110741223B (zh) | 2017-06-16 | 2025-04-04 | Honda Motor Co., Ltd. | Information processing device, information processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
US7653485B2 (en) | 2010-01-26 |
JP4320032B2 (ja) | 2009-08-26 |
US20080319640A1 (en) | 2008-12-25 |
JPWO2005098366A1 (ja) | 2008-02-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4320032B2 (ja) | Route guidance system and method | |
JP6442993B2 (ja) | Automatic driving support system, automatic driving support method, and computer program | |
JP4068661B1 (ja) | Navigation system, portable terminal device, and route guidance method | |
US6173232B1 (en) | Vehicle navigation system and a recording medium | |
JP6252235B2 (ja) | Automatic driving support system, automatic driving support method, and computer program | |
CN101427101B (zh) | Navigation device and method thereof | |
US20050171688A1 (en) | Car navigation device | |
JP2019139762A (ja) | Method for providing information for vehicle travel | |
JP4845876B2 (ja) | Road landscape map creation device, method, and program | |
US20100250116A1 (en) | Navigation device | |
WO2009095967A1 (ja) | Navigation device | |
US20100245561A1 (en) | Navigation device | |
US7730814B2 (en) | Video image type determination system, video image processing system, video image processing method and video image processing program | |
JP2015141054A (ja) | Route guidance system, route guidance method, and computer program | |
US7818123B2 (en) | Routing guide system and method | |
JPH1151682A (ja) | Navigation device | |
JP5134608B2 (ja) | Vehicle surroundings display device, vehicle surroundings display method, and program | |
JP2001074487A (ja) | Navigation device and route guidance method | |
JP5041411B2 (ja) | Information display device | |
WO2008047449A1 (fr) | Image display device, image display method, image display program, and recording medium | |
JPH10214397A (ja) | Route guidance device | |
JP2023138609A (ja) | Traffic congestion display device, traffic congestion display method, and traffic congestion display program | |
JP2001050761A (ja) | Vehicle navigation device | |
CN117858827A (zh) | Vehicle control method and apparatus, vehicle, program product, and storage medium | |
JPH10239079A (ja) | Navigation device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006511982 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11547331 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |