
US20130173232A1 - Method for determining the course of the road for a motor vehicle - Google Patents

Info

Publication number
US20130173232A1
Authority
US
United States
Prior art keywords
road
lane
parallel
structures
model
Prior art date
Legal status
Granted
Application number
US13/641,748
Other versions
US10776634B2 (en
Inventor
Urban Meis
Christoph Wiedemann
Wladimir Klein
Christoph Brüggemann
Current Assignee
Conti Temic Microelectronic GmbH
Original Assignee
Conti Temic Microelectronic GmbH
Priority date
Filing date
Publication date
Application filed by Conti Temic Microelectronic GmbH filed Critical Conti Temic Microelectronic GmbH
Assigned to CONTI TEMIC MICROELECTRONIC GMBH reassignment CONTI TEMIC MICROELECTRONIC GMBH ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BRUGGEMANN, CHRISTOPH, WIEDEMANN, CHRISTOPH, KLEIN, WLADIMIR, MEIS, URBAN
Publication of US20130173232A1 publication Critical patent/US20130173232A1/en
Application granted granted Critical
Publication of US10776634B2 publication Critical patent/US10776634B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G06F17/5009
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A method for determining the course of the road for a moving motor vehicle having a surroundings sensor system. Sensor data generated by the surroundings sensor system are evaluated to detect lane-relevant features. A lane model having at least one lane model parameter that determines the course of the lane is generated for the road, structures of at least one distance range that are parallel to the road are detected in the sensor data, the tangential direction of at least the one structure that is parallel to the road is determined, and the value of the tangential direction of the structure that is parallel to the road is adopted as the value of the direction of the tangent line at the point of contact with the structure that is parallel to the road to determine at least the one lane model parameter by predictive estimation in the lane model.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is the U.S. National Phase Application of PCT/DE2011/000359, filed Mar. 31, 2011, which claims priority to European Patent Application No. 10401058.2, filed Apr. 20, 2010 and German Patent Application No. 10 2010 020 984.8, filed May 19, 2010, the contents of such applications being incorporated by reference herein.
  • FIELD OF THE INVENTION
  • The invention relates to a method for determining the course of the road for a moving motor vehicle with at least one surroundings sensor system, wherein the sensor data generated by the surroundings sensor system that is directed toward the road are evaluated in order to detect lane-relevant features. Furthermore, the invention relates to a motor vehicle with a device for carrying out the inventive method.
  • BACKGROUND OF THE INVENTION
  • Lane detection is an important component of driver assistance systems that are used, e.g., for longitudinal control or for lane keeping.
  • Known lane detection systems are based on assigning the detected structures to the side of the lane and the distance thereof from the middle of the lane. The difficulty of said assignment increases with an increasing distance of the measuring system from the structures. Furthermore, the description of the detected lane usually refers to the position of the vehicle and does not provide any explicit information about the far range. The usual lane model uses the curvature and the curvature change of a clothoid model to describe the further course of the road, wherein the estimated values are averages relative to the ego-position of the vehicle. Since roads are created by stringing clothoid segments together, the known methods are not capable of determining the exact point of transition between two clothoid segments. For example, the transition from a bend to a straight line cannot be determined precisely, which results in an unnatural control behavior of a vehicle lateral control system of a driver assistance system at these points of transition.
  • According to U.S. Pat. No. 6,718,259 B1, which is incorporated by reference, this problem is solved by supplying the surroundings data generated by a sensor system to a filter bank made up of several Kalman filters, wherein the lane model is based on a clothoid model in which the road region in front of the vehicle is subdivided into a near range (up to a distance d1) and a far range (distance from d1 to d2) having different clothoid parameters, wherein a continuous transition between these two ranges is assumed. Thus, a point of transition between two clothoid segments of the road is not estimated, but the transition point is assumed to be at distance d1. Each of the individual Kalman filters of the filter bank is adapted to a lane model, which lane models differ from each other with respect to the position of the transition point relative to distance d1. Each Kalman filter of this filter bank provides an estimate of the lane model parameters of the respective model, wherein each of these estimated values is subsequently weighted with a weighting value that corresponds to the probability of the occurrence of the respective model. The weighted output data are merged.
  • SUMMARY OF THE INVENTION
  • The disadvantage of this known method according to U.S. Pat. No. 6,718,259 B1 is the high complexity of the filter bank made up of Kalman filters, which results in long runtimes. The transition point is not detected but filtered out, so that the model error remains present.
  • An aspect of the invention provides a method of the type mentioned at the beginning that is improved as against the known method and that can detect the transition points of the course of the curvature of the road and thus correctly model the course of the lane right into the far range.
  • According to an aspect of the invention, a method for determining the course of the road for a moving motor vehicle with at least one surroundings sensor system in which the sensor data generated by the surroundings sensor system that is directed toward the road are evaluated in order to detect lane-relevant features is characterized in that
    • a lane model having at least one lane model parameter that determines the course of the lane is generated for the road,
    • structures of at least one distance range that are parallel to the road are detected in the sensor data,
    • the tangential direction of at least the one structure that is parallel to the road is determined, and
    • the value of the tangential direction of the structure that is parallel to the road is adopted as the value of the direction of the tangent line at the point of contact with the structure that is parallel to the road in order to determine at least the one lane model parameter by means of a predictive estimation method in the lane model.
  • According to this inventive method, all structures that are parallel to the road, such as lane markings, curbstones, structures on the roadside, crash barriers and the like, are used to estimate the course of the lane without having to additionally introduce filtering quantities for the estimation method. Furthermore, this inventive method is not dependent on the lateral distance of the structures that are parallel to the road, nor is it necessary to estimate a deviation from the middle of the lane for the motor vehicle, so that the number of degrees of freedom is small, which results in an estimation method that is more robust, more efficient with respect to runtime and less error-prone. Thus, the course of the lane can be modeled over the entire considered distance range by means of a lane model with extremely precise estimated values, wherein all lane models that result in continuously differentiable functions are suitable. Such lane models include, aside from a clothoid road model, the circular-arc model, the double-clothoid model and the spline model, for example.
  • Furthermore, the inventive method is also not dependent on the type of the surroundings sensor system used. Thus, an image-based sensor (e.g., a camera) as well as a radar sensor, a lidar sensor or a digital map with GPS within a navigation system or a combination thereof may be used.
  • Finally, the inventive method can also be used to predict lane-related system boundaries that can be advantageously used in driver assistance systems. For example, a narrowing bend in an approach road can be detected in advance so that a driver assistance system can prepare the driver of the vehicle for a crossing of a system boundary or indicate a bend radius located 60 m ahead of the vehicle or a minimal bend radius.
  • According to an advantageous further development of the invention, structures that are parallel to the road are determined from the sensor data by means of an edge detection method by determining and evaluating gray-scale gradients. To this end, a Hough transform is preferably used, by means of which shapes, e.g., straight lines or circles made up of edge points, can be detected. In order to preclude false structures, a criterion of parallelism is advantageously introduced, by means of which criterion it is possible to detect structures that are parallel to the road and that are located on both sides of the road and in front of the motor vehicle and at the same distance from the motor vehicle, and by means of which criterion it is also possible to preclude non-parallel structures.
  • Furthermore, according to an advantageous realization of the invention, the predictive estimation of the lane model parameter is performed by means of a Kalman filter model that is approximated, e.g., by means of a third-order polynomial known in the art. Of course, estimation methods that are comparable with the Kalman filter model may be used as well.
  • Finally, according to a further development of the invention, the advantageous properties of the inventive method are shown by the fact that the lane width of the road, the deviation of the motor vehicle from the middle of the lane and the yaw angle of the motor vehicle are determined from the sensor data that cover the near range of the road, and the further course of the lane is determined by means of the at least one generated lane model parameter.
  • This shows that the inventive method results in a reduction of the phase space because the inventive method is not dependent on the lane width or on the deviation of the vehicle from the middle of the lane, whereby the method is made more robust and more efficient with respect to runtime. This partitioning of the phase space into a near range and a far range prevents changes of the width of the lane, e.g., lane widening or exit ramps, from negatively influencing the lane model parameters. The lane width and the deviation of the vehicle from the middle of the lane can be restricted to the relevant near range and do not have to be estimated and averaged for the entire measuring range.
  • The inventive method can be advantageously used in a vehicle having a driver assistance system, e.g., within a longitudinal control system, a lane keeping system or a lane departure warning system.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the following, the invention will be explained in greater detail with reference to the single attached FIG. 1, which shows, in a road coordinate system, structures that are parallel to the road and structures that are not parallel to the road.
  • DETAILED DESCRIPTION
  • A surroundings sensor system that may be a camera, a radar or lidar sensor as well as a digital map with GPS (e.g., within a navigation system) provides sensor data for an evaluation unit by means of which the inventive method is carried out.
  • Said sensor data are used to carry out a feature extraction by the detection of structures that are parallel to the road, such as crash barriers, verges, lane markings, curbstones or other demarcating structures that are parallel to the road.
  • For example, an edge detection method is used to this end during video processing, by means of which method the contours of such structures that are parallel to the road can be detected, wherein an edge is regarded as a change of the gray-scale values, i.e., the gray-scale gradients are determined and evaluated by means of a Hough transform.
  • To this end, the sensor data are subdivided into search areas having surroundings-related horizontal and vertical boundaries, wherein the lane detected in the near range is used for the vertical boundaries.
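  • The following is a minimal sketch of how such an edge-plus-Hough step could look for a single search area; it is purely illustrative, assumes the search area is given as a gray-scale NumPy array, and uses a threshold, angular resolution and function name that are not taken from the patent:

```python
import numpy as np

def dominant_edge_slope(gray, grad_thresh=40.0, n_theta=180):
    """Illustrative sketch: slope of the dominant straight edge in one search area,
    found via gray-scale gradients and a simple Hough transform."""
    # 1. An edge is a strong change of the gray-scale values.
    gy, gx = np.gradient(gray.astype(float))
    magnitude = np.hypot(gx, gy)
    ys, xs = np.nonzero(magnitude > grad_thresh)          # edge points
    if xs.size == 0:
        return None

    # 2. Hough voting: every edge point votes for all straight lines
    #    x*cos(theta) + y*sin(theta) = rho that pass through it.
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(*gray.shape)))
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    for t_idx, theta in enumerate(thetas):
        rho = np.round(xs * np.cos(theta) + ys * np.sin(theta)).astype(int) + diag
        np.add.at(acc[:, t_idx], rho, 1)

    # 3. The accumulator maximum corresponds to the dominant straight structure.
    _, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    theta = thetas[t_idx]
    if np.isclose(np.sin(theta), 0.0):
        return None                                       # vertical line in image coordinates
    return -np.cos(theta) / np.sin(theta)                 # slope m of the detected line
```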
  • A clothoid model having clothoid segments is used as a lane model. Each clothoid segment is described by an initial curvature c0 and a curvature change c1 and approximated in the x-y coordinate system of the road by means of a third-order polynomial:
  • l(x) = y = (c0/2)·x² + (c1/6)·x³
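  • For illustration, this polynomial approximation can be evaluated directly; the helper below is a sketch whose name and example parameter values are not taken from the patent:

```python
def clothoid_lateral_offset(x, c0, c1):
    """Lateral offset l(x) of a clothoid segment with initial curvature c0 and
    curvature change c1, in the third-order polynomial approximation."""
    return 0.5 * c0 * x**2 + (c1 / 6.0) * x**3

# e.g. lateral offset of the middle of the lane 50 m ahead (assumed values)
offset_50m = clothoid_lateral_offset(50.0, c0=0.002, c1=5e-5)
```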
  • From the sensor data, an edge image is extracted that corresponds to the structure that extends tangentially to the middle of the lane. Furthermore, said edge image has a slope m as a tangent line at a point x in front of the vehicle. Said slope m is determined by means of a Hough transform.
  • In the lane model, the slope m of said tangent line is adopted as the value of the direction of the tangent line at the point of contact with the structure that is parallel to the road and that represents the tangent line.
  • Such a situation is shown in FIG. 1, which is drawn in the coordinate system of the road. The road is shown as a two-lane road with a lane 1 that is demarcated by a verge 2 and a center line 3. The middle of the lane 4 of the lane 1 extends through the origin of coordinates and tangentially to the x-axis. The verge 2 represents a clothoid segment. The end of said clothoid segment is denoted by reference numeral 7 that may also represent a point of transition to the next road segment.
  • Said FIG. 1 shows, in position x1, an individual center line 3 as a structure that is parallel to the road, which structure was extracted from image data of a road scene that was recorded by means of a camera. A tangent line T1 with a slope m1 that extends parallel to the middle of the lane (indicated by an arrow) is determined in the centroid of said individual center line 3.
  • Furthermore, FIG. 1 shows, in position x2, a further structure 5 that is parallel to the road, which structure 5 consists of several objects that are arranged in a straight line and detected as stationary radar targets by means of, e.g., a radar sensor. In the centroid of said structure, a tangent line T2 with a slope m2 extends parallel to the middle of the lane 4 (indicated by an arrow).
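  • One possible way to obtain the centroid and the tangent slope of such a structure from stationary radar targets is a simple least-squares line fit; the sketch below assumes target positions given in road coordinates and is not prescribed by the patent:

```python
import numpy as np

def tangent_from_radar_targets(points):
    """Illustrative sketch: centroid x-position and tangent slope of a
    road-parallel structure formed by stationary radar targets.

    points: array-like of shape (N, 2) with (x, y) positions, N >= 2.
    """
    pts = np.asarray(points, dtype=float)
    x_c, _ = pts.mean(axis=0)                      # centroid of the structure
    m, _ = np.polyfit(pts[:, 0], pts[:, 1], 1)     # slope of the fitted straight line
    return x_c, m

# e.g. a row of reflector posts detected as stationary radar targets (assumed values)
x_2, m_2 = tangent_from_radar_targets([(38.0, 3.52), (41.0, 3.60), (44.0, 3.71)])
```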
  • Finally, FIG. 1 shows a further straight-line verge structure 6. However, said structure 6 does not extend parallel to the road.
  • The sensor-data-based measurement of the slopes mi and of the associated positions xi is used to estimate the lane model parameters c0 and c1 of the clothoid model by means of a Kalman filter that has the following form in the xi-yi coordinate system of the vehicle in the above-mentioned approximation:
  • l(xi) = yi = y0L/R + θ·xi + (c0/2)·xi² + (c1/6)·xi³,
  • wherein y0L/R is the deviation (offset) of the vehicle from the middle of the lane to the left or to the right and θ is the yaw angle of the vehicle.
  • The associated measurement equation in the Kalman filter model is obtained by differentiating the above equation l(xi) and equating with the slope mi:
  • mi = θ + c0·xi + (c1/2)·xi².
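  • Because this measurement equation is linear in θ, c0 and c1, each tangent measurement (xi, mi) can be processed with a standard linear Kalman measurement update. The sketch below assumes a state vector [θ, c0, c1] and an arbitrarily chosen measurement noise; the patent does not prescribe this exact state layout or these values:

```python
import numpy as np

def kalman_tangent_update(state, P, x_i, m_i, r_var=1e-3):
    """One Kalman measurement update for a single tangent measurement (x_i, m_i).
    state: assumed layout [theta, c0, c1]; P: 3x3 covariance; r_var: assumed noise."""
    H = np.array([[1.0, x_i, 0.5 * x_i**2]])       # m_i = theta + c0*x_i + (c1/2)*x_i^2
    innovation = m_i - (H @ state)[0]              # measured minus predicted slope
    S = (H @ P @ H.T)[0, 0] + r_var                # innovation variance
    K = (P @ H.T) / S                              # Kalman gain, shape (3, 1)
    state = state + K[:, 0] * innovation
    P = (np.eye(3) - K @ H) @ P
    return state, P

# Example: refine the estimate with tangents measured at 20 m and 40 m (assumed values).
state = np.zeros(3)                                # initial guess for [theta, c0, c1]
P = np.diag([1e-2, 1e-4, 1e-6])                    # assumed initial uncertainty
for x_i, m_i in [(20.0, 0.011), (40.0, 0.023)]:
    state, P = kalman_tangent_update(state, P, x_i, m_i)
```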
  • By means of this measurement equation, the estimated lane model parameters c0 and c1 are corrected in order to determine the further course of the lane (represented by segments K1 and K2 in FIG. 1). Thus, the course of the lane can be modeled over the entire considered distance range, in particular over the far range.
  • Aside from the single-clothoid model described herein, a double-clothoid model or a node-based spline model may be used as a lane model.
  • It is also possible to determine the curvature c0 in the considered distance segment by means of the slope m prior to estimating the lane model parameters c0 and c1. To this end, the course of the lane is approximated in each distance segment by the parabola
  • l(x) = (c0/2)·x²,
  • wherein c0 is the curvature in the x-position. The slope m of detected structures that are parallel to the road (which slope m is determined from the sensor data) corresponds to the slope of the parabola in the same x-position and is given by the first derivative, i.e., c0·x = m, so that the curvature c0 is:
  • c0 = m/x.
  • In order to preclude false structures (i.e., structures that are not parallel to the road), a two-dimensional Hough method may be used to search the detected distance range for pairs of straight lines that extend parallel to each other on opposite sides of the road. By means of such a criterion of parallelism, straight lines like the verge structure 6 shown in FIG. 1 can be precluded.
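  • A simple realization of such a criterion of parallelism is to accept a candidate line only if a counterpart with approximately the same tangent slope is found at approximately the same longitudinal position on the opposite side of the road; the tolerances in this sketch are assumptions, not values from the patent:

```python
def apply_parallelism_criterion(candidates, slope_tol=0.02, x_tol=5.0):
    """Keep only candidates that have a road-parallel counterpart on the other side.
    candidates: list of dicts like {"x": 30.0, "m": 0.015, "side": "left"}."""
    accepted = []
    for cand in candidates:
        for other in candidates:
            if other["side"] == cand["side"]:
                continue                              # only compare opposite sides
            same_position = abs(other["x"] - cand["x"]) <= x_tol
            same_slope = abs(other["m"] - cand["m"]) <= slope_tol
            if same_position and same_slope:
                accepted.append(cand)
                break
    return accepted
```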
  • In the inventive method described herein, the detection of structures that are parallel to the road does not require any knowledge of the lateral distance of said structures, whereby it is not necessary to estimate the lateral offset of the vehicle, either.
  • The parameters y0L/R (distance or offset of the vehicle from the middle of the lane) and the lane width, which are to be determined in the near range, can be determined by means of individual measurements, which results in an advantageous partitioning of the phase space with the offset and the lane width from the near range and the lane model parameters c0 and c1 in the far range. The yaw angle θ is also determined in the near-range partition. Such a partitioning of the phase space prevents changes of the width of the lane, e.g., lane widening or exit ramps, from negatively influencing the lane model parameters; since the estimation does not depend on these quantities, the degrees of freedom are reduced, whereby the filter system (the Kalman filter in this exemplary embodiment) is made more robust, i.e., less error-prone, and more efficient with respect to runtime, and delivers more precise estimated values.
  • Since the course of the lane detected in the near range is extrapolated into the far range by means of the inventive method, it is not necessary to average and estimate the offset and the lane width in the far range.
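  • One way to picture this partitioning of the phase space is to keep the near-range quantities and the far-range lane model parameters in separate containers; the structure and names below are illustrative only and not prescribed by the patent:

```python
from dataclasses import dataclass

@dataclass
class NearRangeEstimate:
    lane_width: float        # determined from near-range measurements only
    lateral_offset: float    # y0L/R, deviation of the vehicle from the middle of the lane
    yaw_angle: float         # theta

@dataclass
class FarRangeEstimate:
    c0: float                # initial curvature of the current clothoid segment
    c1: float                # curvature change
```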
  • In order to preclude false straight-line verge structures (shown in FIG. 1) whose straight lines G do not extend parallel to the road in the centroid, a criterion of parallelism is used, as described above, when evaluating the sensor data. According to said criterion, only structures that are parallel to the road and parallel to each other and that extend in the same position left of and right of the lane are determined, for example.
  • Each individual measurement (tangent line) can be converted into a piece of curvature information in the x-position. If one looks at the individual tangent measurements sequentially in the x-direction, the measurements exhibit a deviation from the filtered lane model from clothoid transition point 7 onward (FIG. 1), whereby said transition point 7 can be determined. Thus, a lateral vehicle control that corresponds to the natural driving behavior can be performed within a driver assistance system.
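  • A sketch of this sequential comparison is given below: each measured tangent slope is compared with the slope predicted by the filtered clothoid model, and the first distance at which the measurements start to deviate is taken as the transition point. The tolerance is an assumed value:

```python
def find_transition_point(measurements, theta, c0_est, c1_est, slope_tol=2e-3):
    """Illustrative sketch of locating the clothoid transition point (reference 7 in FIG. 1).
    measurements: list of (x_i, m_i) tangent measurements sorted by increasing x_i;
    theta, c0_est, c1_est: yaw angle and filtered lane model parameters."""
    for x_i, m_i in measurements:
        m_pred = theta + c0_est * x_i + 0.5 * c1_est * x_i**2   # slope predicted by the model
        if abs(m_i - m_pred) > slope_tol:
            return x_i        # measurements deviate from here on: candidate transition point
    return None
```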
  • Within a driver assistance system, it is also possible to predict the reaching of lane-related system boundaries. For example, a narrowing bend in an approach road can be detected in advance so that a driver assistance system can prepare the driver of the vehicle for a crossing of a system boundary or indicate a bend radius located 60 m ahead of the vehicle or a minimal bend radius.
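  • Since the curvature of the clothoid model at a distance x ahead of the vehicle is c0 + c1·x, such a bend radius can be read directly from the estimated lane model parameters; the sketch below uses assumed parameter values:

```python
def bend_radius_at(x, c0, c1):
    """Bend radius predicted by the clothoid lane model at distance x ahead of the vehicle."""
    curvature = c0 + c1 * x
    return float("inf") if curvature == 0.0 else 1.0 / abs(curvature)

# e.g. the bend radius 60 m ahead, as mentioned above (assumed parameter values)
r_60 = bend_radius_at(60.0, c0=0.002, c1=5e-5)
```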
  • The inventive method demonstrates that the additional use of structures that are parallel to the road (such as markings, curbstones, structures on the roadside, and crash barriers) for the predictive determination of bend radii is possible without having to extend the filters.

Claims (12)

1.-8. (canceled)
9. A method for determining the course of the road for a moving motor vehicle with at least one surroundings sensor system, wherein sensor data generated by the surroundings sensor system that is directed toward the road are evaluated in order to detect lane-relevant features,
wherein
a lane model having at least one lane model parameter that determines the course of the lane is generated for the road,
structures of at least one distance range that are parallel to the road are detected in the sensor data,
a tangential direction of at least the one structure that is parallel to the road is determined, and
a value of the tangential direction of the structure that is parallel to the road is adopted as the value of the direction of the tangent line at the point of contact with the structure that is parallel to the road in order to determine at least the one lane model parameter by a predictive estimation method in the lane model.
10. The method according to claim 9, wherein continuously differentiable segments are used as the lane model.
11. The method according to claim 10, wherein continuously differentiable segments are used as the lane model within a clothoid road model, a circular-arc model or a spline model.
12. The method according to claim 11, wherein structures that are parallel to the road are determined by an edge detection method by determining and evaluating gray-scale gradients.
13. The method according to claim 12, wherein the gray-scale gradients are evaluated by a Hough transform.
14. The method according to claim 12, wherein by a criterion of parallelism, structures that are parallel to the road and that are located in front of the motor vehicle and at the same distance from the motor vehicle are detected and non-parallel structures are precluded.
15. The method according to claim 11, wherein by a criterion of parallelism, structures that are parallel to the road and that are located in front of the motor vehicle and at the same distance from the motor vehicle are detected and non-parallel structures are precluded.
16. The method according to claim 9, wherein the predictive estimation of the lane model parameter is performed by a Kalman filter model or a comparable estimation method.
17. The method according to claim 9, wherein a lane width of the road, a deviation of the motor vehicle from the middle of the lane and a yaw angle of the motor vehicle are determined from the sensor data that cover the near range of the road and the further course of the lane is determined by the at least one generated lane model parameter.
18. The method according to claim 9, wherein structures that are parallel to the road are determined by an edge detection method by determining and evaluating gray-scale gradients.
19. A vehicle with a driver assistance system, for carrying out a method for determining the course of the road for a moving motor vehicle with at least one surroundings sensor system, wherein sensor data generated by the surroundings sensor system that is directed toward the road are evaluated in order to detect lane-relevant features, wherein
a lane model having at least one lane model parameter that determines the course of the lane is generated for the road,
structures of at least one distance range that are parallel to the road are detected in the sensor data,
a tangential direction of at least the one structure that is parallel to the road is determined, and
a value of the tangential direction of the structure that is parallel to the road is adopted as the value of the direction of the tangent line at the point of contact with the structure that is parallel to the road in order to determine at least the one lane model parameter by a predictive estimation method in the lane model.
US13/641,748 2010-04-20 2011-03-31 Method for determining the course of the road for a motor vehicle Active 2034-01-29 US10776634B2 (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
EP10401058 2010-04-20
EP10401058.2 2010-04-20
EP10401058 2010-04-20
DE102010020984 2010-05-19
DE102010020984A DE102010020984A1 (en) 2010-04-20 2010-05-19 Method for determining the road course for a motor vehicle
DE102010020984.8 2010-05-19
PCT/DE2011/000359 WO2011131165A1 (en) 2010-04-20 2011-03-31 Method for determining the course of the road for a motor vehicle

Publications (2)

Publication Number Publication Date
US20130173232A1 true US20130173232A1 (en) 2013-07-04
US10776634B2 US10776634B2 (en) 2020-09-15

Family

ID=44730764

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/641,748 Active 2034-01-29 US10776634B2 (en) 2010-04-20 2011-03-31 Method for determining the course of the road for a motor vehicle

Country Status (5)

Country Link
US (1) US10776634B2 (en)
EP (1) EP2561419B1 (en)
JP (1) JP2013530435A (en)
DE (2) DE102010020984A1 (en)
WO (1) WO2011131165A1 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9511681B2 (en) * 2013-05-09 2016-12-06 Rockwell Automation, Inc. Controlled motion system having an improved track configuration
US9045144B2 (en) 2013-05-09 2015-06-02 Robert Bosch Gmbh Third-order polynomial-based course prediction for driver assistance functions
CN104252720A (en) * 2013-06-27 2014-12-31 鸿富锦精密工业(深圳)有限公司 Vehicle trajectory simulation system and method
KR101526386B1 (en) * 2013-07-10 2015-06-08 현대자동차 주식회사 Apparatus and method of processing road data
DE102014200638A1 (en) * 2014-01-16 2015-07-30 Bayerische Motoren Werke Aktiengesellschaft Method and system for estimating a lane course
US9321461B1 (en) * 2014-08-29 2016-04-26 Google Inc. Change detection using curve alignment
CN104931977B (en) * 2015-06-11 2017-08-25 同济大学 A kind of obstacle recognition method for intelligent vehicle
EP3285203A1 (en) 2016-08-19 2018-02-21 Continental Automotive GmbH Method for detecting a road in an environment of a vehicle
DE102016220717A1 (en) 2016-10-21 2018-05-09 Volkswagen Aktiengesellschaft Determining a lane and lateral control for a vehicle
US10864819B2 (en) * 2017-10-31 2020-12-15 Speedgauge, Inc. Driver alertness warning system and method
US11403816B2 (en) * 2017-11-30 2022-08-02 Mitsubishi Electric Corporation Three-dimensional map generation system, three-dimensional map generation method, and computer readable medium
DE102018114808A1 (en) * 2018-06-20 2019-12-24 Man Truck & Bus Se Method for the automatic lateral guidance of a following vehicle in a vehicle platoon
DE102018212555A1 (en) * 2018-07-27 2020-01-30 Bayerische Motoren Werke Aktiengesellschaft Method and system for lane detection
CN109927726B (en) * 2019-03-13 2021-02-26 深兰科技(上海)有限公司 Method and equipment for adjusting motion state of target vehicle
CN112441022B (en) * 2019-09-02 2023-02-03 华为技术有限公司 Lane center line determining method and device
DE102021129258B4 (en) 2021-11-10 2024-03-07 Cariad Se Method for checking the plausibility of at least a portion of a travel trajectory for a vehicle
DE102022200921A1 (en) 2022-01-27 2023-07-27 Continental Autonomous Mobility Germany GmbH Method for determining a three-dimensional course of the roadway, driving system and motor vehicle

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2707761B2 (en) 1989-10-25 1998-02-04 トヨタ自動車株式会社 Vehicle state quantity measurement device
US5646845A (en) * 1990-02-05 1997-07-08 Caterpillar Inc. System and method for controlling an autonomously navigated vehicle
US5245422A (en) * 1991-06-28 1993-09-14 Zexel Corporation System and method for automatically steering a vehicle within a lane in a road
US5638116A (en) * 1993-09-08 1997-06-10 Sumitomo Electric Industries, Ltd. Object recognition apparatus and method
JP3556766B2 (en) * 1996-05-28 2004-08-25 松下電器産業株式会社 Road white line detector
JPH11160078A (en) * 1997-12-02 1999-06-18 Toyota Motor Corp System for estimating condition of traveling road
JP3319399B2 (en) * 1998-07-16 2002-08-26 株式会社豊田中央研究所 Roadway recognition device
JP3733875B2 (en) 2000-09-29 2006-01-11 日産自動車株式会社 Road white line recognition device
US6819779B1 (en) * 2000-11-22 2004-11-16 Cognex Corporation Lane detection system and apparatus
JP3780848B2 (en) * 2000-12-27 2006-05-31 日産自動車株式会社 Vehicle traveling path recognition device
US20020134151A1 (en) * 2001-02-05 2002-09-26 Matsushita Electric Industrial Co., Ltd. Apparatus and method for measuring distances
DE10138641A1 (en) 2001-08-07 2003-02-20 Ibeo Automobile Sensor Gmbh Fast and reliable determination of a model or optimum-driving course for a motor vehicle based on information relating to the actual path of a road obtained using an optoelectronic scanning or sensing system
US6751547B2 (en) * 2001-11-26 2004-06-15 Hrl Laboratories, Llc Method and apparatus for estimation of forward path geometry of a vehicle based on a two-clothoid road model
AU2003225228A1 (en) * 2002-05-03 2003-11-17 Donnelly Corporation Object detection system for vehicle
US6718259B1 (en) 2002-10-02 2004-04-06 Hrl Laboratories, Llc Adaptive Kalman filter method for accurate estimation of forward path geometry of an automobile
DE10349631A1 (en) * 2003-10-24 2005-05-19 Robert Bosch Gmbh Driver assistance method and apparatus based on lane information
DE102004040143A1 (en) * 2004-08-19 2006-02-23 Robert Bosch Gmbh Method and device for driver information
JP4650079B2 (en) * 2004-11-30 2011-03-16 日産自動車株式会社 Object detection apparatus and method
DE102005002719A1 (en) 2005-01-20 2006-08-03 Robert Bosch Gmbh Course prediction method in driver assistance systems for motor vehicles
JP4659631B2 (en) * 2005-04-26 2011-03-30 富士重工業株式会社 Lane recognition device
DE102005058809A1 (en) 2005-12-09 2007-06-14 Hella Kgaa Hueck & Co. path planning
JP4721278B2 (en) 2006-03-27 2011-07-13 富士重工業株式会社 Lane departure determination device, lane departure prevention device, and lane tracking support device
US7477988B2 (en) 2006-05-16 2009-01-13 Navteq North America, Llc Dual road geometry representation for position and curvature-heading
DE102006040333A1 (en) * 2006-08-29 2008-03-06 Robert Bosch Gmbh Lane detection method for use in driver assistance system, has reconstructing course of driving lane markings and/or driving lanes in section of driving area from position of driving lane markings, where section lies behind vehicle
JP4956099B2 (en) 2006-08-31 2012-06-20 富士重工業株式会社 Wall detector
DE102006062061B4 (en) 2006-12-29 2010-06-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus, method and computer program for determining a position based on a camera image from a camera
US8355539B2 (en) * 2007-09-07 2013-01-15 Sri International Radar guided vision system for vehicle validation and vehicle motion characterization
JP5044436B2 (en) 2008-02-18 2012-10-10 トヨタ自動車株式会社 Reverse run prevention system
JP2011517632A (en) * 2008-02-20 2011-06-16 コンチネンタル・テーヴエス・アクチエンゲゼルシヤフト・ウント・コンパニー・オツフエネハンデルスゲゼルシヤフト Method and assistance system for detecting objects in the vicinity of a vehicle
US9090263B2 (en) * 2010-07-20 2015-07-28 GM Global Technology Operations LLC Lane fusion system using forward-view and rear-view cameras

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6205234B1 (en) * 1996-07-31 2001-03-20 Aisin Seiki Kabushiki Kaisha Image processing system
US6807287B1 (en) * 1998-02-27 2004-10-19 Lucas Industries Limited Road profile prediction
US20020042668A1 (en) * 2000-10-02 2002-04-11 Nissan Motor Co., Ltd. Lane recognition apparatus for vehicle
US20050225477A1 (en) * 2002-07-15 2005-10-13 Shan Cong Road curvature estimation system
US20060023950A1 (en) * 2002-07-31 2006-02-02 Koninklijke Philips Electronics N.V. Method and appratus for encoding a digital video signal
US20060220912A1 (en) * 2003-07-31 2006-10-05 Heenan Adam J Sensing apparatus for vehicles
US20050190975A1 (en) * 2004-02-26 2005-09-01 Porikli Fatih M. Traffic event detection in compressed videos
US20080029127A1 (en) * 2006-08-07 2008-02-07 Nidec Corporation Method of applying solution of oil repellent
US8391556B2 (en) * 2007-01-23 2013-03-05 Valeo Schalter Und Sensoren Gmbh Method and system for video-based road lane curvature measurement
US8775063B2 (en) * 2009-01-26 2014-07-08 GM Global Technology Operations LLC System and method of lane path estimation using sensor fusion
US20100253598A1 (en) * 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Lane of travel on windshield head-up display
US20100253540A1 (en) * 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Enhanced road vision on full windshield head-up display
US8629784B2 (en) * 2009-04-02 2014-01-14 GM Global Technology Operations LLC Peripheral salient feature enhancement on full-windshield head-up display
US20120185167A1 (en) * 2009-07-29 2012-07-19 Hitachi Automotive Systems Ltd Road Shape Recognition Device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
J. Miller "Curves In The Plane, Derivative Of Arc Length, Curvature, Radius Of Curvature, Circle Of Curvature, Evolute," on the Web at http://www.solitaryroad.com/c327.html, date given by Wayback Archive as March 28, 2010, 4 pages. *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9896092B2 (en) * 2012-04-26 2018-02-20 Continental Teves Ag & Co. Ohg Method for representing vehicle surroundings
US20160010998A1 (en) * 2013-02-25 2016-01-14 Continental Automotive Gmbh Intelligent video navigation for automobiles
US9915539B2 (en) * 2013-02-25 2018-03-13 Continental Automotive Gmbh Intelligent video navigation for automobiles
CN104898657A (en) * 2014-03-06 2015-09-09 西北农林科技大学 Robot visual sense path identification method based on DSP
US9690994B2 (en) 2014-04-25 2017-06-27 Honda Motor Co., Ltd. Lane recognition device
US9969430B2 (en) * 2014-10-09 2018-05-15 Robert Bosch Gmbh Method and device for assisting a driver of a vehicle when driving onto a carriageway via an approach road
US10650253B2 (en) 2015-05-22 2020-05-12 Continental Teves Ag & Co. Ohg Method for estimating traffic lanes
US11091197B2 (en) * 2016-07-07 2021-08-17 Denso Corporation Driving support apparatus
EP3285243A1 (en) 2016-08-17 2018-02-21 Continental Automotive GmbH Method for determining a course of a road, driver assistance system and vehicle
US10990833B2 (en) * 2016-11-16 2021-04-27 Continental Automotive Gmbh Method for determining a course of lanes, driver assistance system, and vehicle
WO2018146315A1 (en) * 2017-02-13 2018-08-16 Autoliv Development Ab Apparatus operable to determine a position of a portion of a lane
CN110214106A (en) * 2017-02-13 2019-09-06 维宁尔瑞典公司 It can be used to determine the device of the position of a part in lane
US11292464B2 (en) * 2017-02-13 2022-04-05 Veoneer Sweden Ab Apparatus for a driver assistance system
EP3360746A1 (en) * 2017-02-13 2018-08-15 Autoliv Development AB Apparatus operable to determine a position of a portion of a lane
WO2019155134A1 (en) * 2018-02-08 2019-08-15 Psa Automobiles Sa Method of determining the trajectory of a motor vehicle in the absence of ground markings
FR3077549A1 (en) * 2018-02-08 2019-08-09 Psa Automobiles Sa METHOD FOR DETERMINING THE TRACK OF A MOTOR VEHICLE IN ABSENCE OF GROUND MARKING
CN110550028A (en) * 2018-06-04 2019-12-10 本田技研工业株式会社 Vehicle control device, vehicle control method, and storage medium
WO2020063816A1 (en) * 2018-09-30 2020-04-02 长城汽车股份有限公司 Method for constructing driving coordinate system, and application thereof
US11926339B2 (en) 2018-09-30 2024-03-12 Great Wall Motor Company Limited Method for constructing driving coordinate system, and application thereof
CN112805180A (en) * 2018-10-01 2021-05-14 法雷奥照明公司 Method for controlling a module for projecting a pixelated light beam of a vehicle
FR3086901A1 (en) * 2018-10-01 2020-04-10 Valeo Vision METHOD FOR DRIVING MODELS OF PROJECTION OF BEAMS OF PIXELLIZED LIGHT FOR VEHICLE
KR20210065116A (en) * 2018-10-01 2021-06-03 발레오 비젼 How to control a module for projecting a pixelated light beam in a vehicle
WO2020070078A1 (en) * 2018-10-01 2020-04-09 Valeo Vision Method for controlling modules for projecting pixelated light beams for a vehicle
US20220118901A1 (en) * 2018-10-01 2022-04-21 Valeo Vision Method for controlling modules for projecting pixelated light beams for a vehicle
KR102709526B1 (en) 2018-10-01 2024-09-25 발레오 비젼 Method for controlling a module for projecting a pixelated light beam from a vehicle
US11993201B2 (en) * 2018-10-01 2024-05-28 Valeo Vision Method for controlling modules for projecting pixelated light beams for a vehicle
US12217516B2 (en) * 2019-02-06 2025-02-04 Bayerische Motoren Werke Aktiengesellschaft Method and device for multi-sensor data fusion for automated and autonomous vehicles
US20220101637A1 (en) * 2019-02-06 2022-03-31 Bayerische Motoren Werke Aktiengesellschaft Method and Device for Multi-Sensor Data Fusion For Automated and Autonomous Vehicles
US11011064B2 (en) * 2019-03-05 2021-05-18 Denso International America, Inc. System and method for vehicle platooning
US20220169280A1 (en) * 2019-05-13 2022-06-02 Bayerische Motoren Werke Aktiengesellschaft Method and Device for Multi-Sensor Data Fusion For Automated and Autonomous Vehicles
CN113593238A (en) * 2021-08-06 2021-11-02 吉林大学 Intersection virtual lane modeling method for automatic driving navigation
DE102021209403A1 (en) 2021-08-26 2023-03-02 Continental Autonomous Mobility Germany GmbH Method for generating a static environment model
CN114644019A (en) * 2022-05-23 2022-06-21 苏州挚途科技有限公司 Method and device for determining lane center line and electronic equipment

Also Published As

Publication number Publication date
US10776634B2 (en) 2020-09-15
DE102010020984A1 (en) 2011-10-20
DE112011100146A5 (en) 2012-09-20
WO2011131165A1 (en) 2011-10-27
JP2013530435A (en) 2013-07-25
EP2561419B1 (en) 2017-12-06
EP2561419A1 (en) 2013-02-27

Similar Documents

Publication Publication Date Title
US20130173232A1 (en) Method for determining the course of the road for a motor vehicle
US8055445B2 (en) Probabilistic lane assignment method
CN109017780B (en) Intelligent driving control method for vehicle
JP6747269B2 (en) Object recognition device
US10274598B2 (en) Navigation based on radar-cued visual imaging
US9074906B2 (en) Road shape recognition device
US8112223B2 (en) Method for measuring lateral movements in a driver assistance system
Han et al. Road boundary detection and tracking for structured and unstructured roads using a 2D lidar sensor
US20110320163A1 (en) Method and system for determining road data
US9592829B2 (en) Method and control unit for robustly detecting a lane change of a vehicle
US20220169280A1 (en) Method and Device for Multi-Sensor Data Fusion For Automated and Autonomous Vehicles
CN107209998B (en) Lane line recognition device and lane line recognition method
EP2963634B1 (en) Stereo camera device
JP2018517979A (en) Method for estimating driving lane
US11731649B2 (en) High precision position estimation method through road shape classification-based map matching and autonomous vehicle thereof
CN109835337B (en) Turning control method and device and automatic driving vehicle
JP2015069289A (en) Lane recognition device
CN111801258B (en) Driving assistance device and method for motor vehicle
Adam et al. Probabilistic road estimation and lane association using radar detections
JP2007309799A (en) On-board distance measuring apparatus
US20050278112A1 (en) Process for predicting the course of a lane of a vehicle
CN115195773A (en) Apparatus and method for controlling vehicle driving and recording medium
JP7344744B2 (en) Roadside edge detection method and roadside edge detection device
WO2020095673A1 (en) Vehicle-mounted control device
JP6115429B2 (en) Own vehicle position recognition device

Legal Events

Date Code Title Description
AS Assignment

Owner name: CONTI TEMIC MICROELECTRONIC GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MEIS, URBAN;WIEDEMANN, CHRISTOPH;KLEIN, WLADIMIR;AND OTHERS;SIGNING DATES FROM 20121112 TO 20130112;REEL/FRAME:029688/0209

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4
