US20170355366A1 - Lane keeping system for autonomous vehicle during camera drop-outs - Google Patents
Lane keeping system for autonomous vehicle during camera drop-outs
- Publication number
- US20170355366A1 (application US 15/181,915)
- Authority
- US
- United States
- Prior art keywords
- sensor
- vehicle
- signal
- lane
- lane marker
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/023—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/025—Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B60W2420/42—
-
- B60W2420/52—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/18—Braking system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/20—Steering systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/22—Suspension systems
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- Combustion & Propulsion (AREA)
- Chemical & Material Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
- Steering Control In Accordance With Driving Conditions (AREA)
Abstract
- A method of sensing an environment of a vehicle includes the steps of controlling a vehicle lane position based upon a first signal from a first sensor and switching from the first sensor to a second sensor if the first sensor cannot provide a desired lane marker confidence. The vehicle lane position is controlled based upon a second signal from the second sensor if the second sensor can provide the desired lane marker confidence and a predetermined time has not been exceeded.
Description
- This disclosure relates to an environmental sensing system for reliably identifying vehicle lane position for lane keeping in, for example, a fully autonomous vehicle or a driver-assisted vehicle.
- Vehicle lane position is increasingly used in modern vehicles for such features as Lane Keep Assist (LKA), Lane Centering (LC) and Traffic Jam Assist (TJA), which incorporates aspects of LKA and LC. During operation, the vehicle's lane position is detected, and the vehicle is maintained within the lane using little or no steering input from the driver. Such features are also needed for autonomously driving vehicles.
- In one typical approach, the vehicle's lane position is adjusted by using an environmental sensing system that has one or more cameras and a distance ranging sensor (e.g., LIDAR or radar). Lane marker edges are detected by the sensors, but a vision-based sensor is used as the primary sensor for vehicle control, typically a front mounted camera that detects the lines and lanes.
- Data from the sensors must be reliable in order to maintain control of the vehicle without driver input, or full control of the vehicle must be returned to the driver. Repeated interruptions to autonomous control are undesirable, but must be balanced with the need for highly reliable vehicle control.
- One reason for which current systems "turn off" or hand control back to the driver is that the lane markers are poorly marked, with fading paint that cannot be distinguished from the road. Another reason is that sun glare on the front facing sensors can be sufficient to cause sensor "drop-out," in which the sensor can no longer provide reliable data for vehicle control. One approach to addressing sun glare is to combine overlapping or non-overlapping images from multiple cameras to provide the best available lane marker recognition. The problem with this approach is that the primary sensor may be unavailable for indefinite durations, which is not the best practice and is not very reliable.
- In one exemplary embodiment, a method of sensing an environment of a vehicle is provided. The method includes the steps of controlling a vehicle lane position based upon a first signal from a first sensor and switching from the first sensor to a second sensor if the first sensor cannot provide a desired lane marker confidence. The vehicle's lane position is controlled based upon a second signal from the second sensor if the second sensor can provide the desired lane marker confidence and a predetermined time has not been exceeded.
- In a further embodiment of the above, the first sensor is at least one of a camera sensor, radar sensor, infrared sensor and LIDAR sensor.
- In a further embodiment of any of the above, the first sensor is an integrated camera sensor and radar sensor.
- In a further embodiment of any of the above, the first sensor is forward facing.
- In a further embodiment of any of the above, the second sensor is one of a side view camera and a rear view camera.
- In a further embodiment of any of the above, the first sensor cannot provide the desired lane marker confidence due to glare on the first sensor.
- In a further embodiment of any of the above, the switching step includes applying a control algorithm using data from the second signal to determine the desired lane marker confidence.
- In a further embodiment of any of the above, the switching step includes applying a filter to the data to identify lane marker edges and converting the lane marker edges to a coordinate system.
- In a further embodiment of any of the above, the switching step includes determining whether the lane marker edges in the coordinate system are similar to previously provided data from the first sensor.
- In a further embodiment of any of the above, steering control of the vehicle is returned to the driver if the step of controlling the vehicle lane position based upon the second signal is not performed within the predetermined time.
- In a further embodiment of any of the above, the vehicle lane position is not controlled based upon the first signal while controlling the vehicle lane position based upon the second signal.
- In another exemplary embodiment, an environmental sensing system relating to vehicle lane position includes a first sensor that is configured to provide a first signal indicative of a vehicle lane position. A second sensor is configured to provide a second signal indicative of the vehicle lane position. A steering system is configured to achieve a desired lane position in response to a command. A controller is in communication with the steering system and the first and second sensors and is configured to provide the command based upon one of the first and second signals. The controller is configured to use the first signal if the first sensor provides a desired lane marker confidence. The controller is configured to switch to the second sensor and use the second signal if the first sensor cannot provide the desired lane marker confidence and the second sensor can provide the desired lane marker confidence and a predetermined time has not been exceeded.
- In a further embodiment of any of the above, the first sensor is at least one of a camera sensor, radar sensor, infrared sensor and LIDAR sensor.
- In a further embodiment of any of the above, the first sensor is an integrated camera sensor and radar sensor.
- In a further embodiment of any of the above, the first sensor is forward facing.
- In a further embodiment of any of the above, the second sensor is one of a side view camera and a rear view camera.
- In a further embodiment of any of the above, the first sensor cannot provide the desired lane marker confidence due to temporary failure of the first sensor.
- In a further embodiment of any of the above, the switching step includes applying a control algorithm using data from the second signal to determine the desired lane marker confidence. The switching step includes applying a filter to the data to identify lane marker edges and converting the lane marker edges to a vehicle coordinate system. The switching step includes determining whether the lane marker edges in the vehicle coordinate system are similar to previously provided data from the first sensor.
- In a further embodiment of any of the above, steering control of the vehicle is returned if the step of controlling the vehicle lane position based upon the second signal is not performed within the predetermined time.
- In a further embodiment of any of the above, the vehicle lane position is not controlled based upon the first signal while controlling the vehicle lane position based upon the second signal.
- The disclosure can be further understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:
- FIG. 1A is a schematic elevational view, or "bird's-eye-view," of a vehicle with an environmental sensing system of the type used in lane keeping or autonomous vehicle control.
- FIG. 1B is a schematic side view of the vehicle shown in FIG. 1A.
- FIG. 2 is a schematic view of the environmental sensing system.
- FIG. 3 is a flow chart illustrating a method of sensing a vehicle environment using the environmental sensing system shown in FIG. 2.
- The embodiments, examples and alternatives of the preceding paragraphs, the claims, or the following description and drawings, including any of their various aspects or respective individual features, may be taken independently or in any combination. Features described in connection with one embodiment are applicable to all embodiments, unless such features are incompatible.
- Schematic views of a vehicle 10 traveling down a road are shown in FIGS. 1A and 1B. The vehicle 10 includes an environmental sensing system 16 used to detect lane markers 14 that define a lane 12 of the road. The disclosed environmental sensing relates to lane sensing, blind spot sensing, and other vehicle active safety sensing. During operation, the vehicle's lane position is detected and, when sufficiently reliable data is obtained, the vehicle is maintained within the lane using little or no steering input from the driver for such features as Lane Keep Assist (LKA), Lane Centering (LC), Traffic Jam Assist (TJA) and/or fully autonomous control of the vehicle.
- In one embodiment, the environmental sensing system 16 includes first, second, third and fourth sensors 18, 20, 22, 24 respectively providing first, second, third and fourth "bird's-eye-views" or signals 26, 28, 30, 32. The sensors are used to identify the lane markers 14 by detecting the reflection from the paint on the road or from Botts' dots.
- In one example, the first sensor 18 is a forward facing integrated camera and radar sensor (RACam), as disclosed in U.S. Pat. No. 8,604,968, entitled "INTEGRATED RADAR-CAMERA SENSOR," issued on Dec. 10, 2013, and U.S. Pat. No. 9,112,278, entitled "RADAR DEVICE FOR BEHIND WINDSHIELD INSTALLATIONS," issued Aug. 18, 2015. The radar sensor in the first sensor 18 also provides a radar signal 34. In one example, the first sensor 18 may be provided at the front side of the rear view mirror and directed through the windshield. The second and third sensors 20, 22 are respectively left and right side cameras, which may be arranged in the side view mirrors or elsewhere. The fourth sensor 24 may be provided by the vehicle's back-up camera, for example. More or fewer sensors can be used, and the sensors can be arranged differently than shown. For example, another sensor 25 may be provided on the vehicle's hood or front bumper to provide another front field of view signal 27, which can be used to detect the roadway occluded by the hood. The sensors 18, 20, 22, 24, 25 function independently of each other and provide the latest available data for LKA, LC, TJA and/or automated driving. Various other types of sensors can also be used, for example, a radar sensor, an infrared sensor and/or a LIDAR sensor, and the signals may differ from those shown depending upon the type of sensor.
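- For illustration only, the sensor arrangement described above might be represented in software as a small registry; this is a minimal sketch in which the class and field names are hypothetical, not from the patent:

```python
# Illustration only: a minimal registry mirroring the sensor arrangement
# described above. Class and field names are hypothetical, not from the patent.
from dataclasses import dataclass

@dataclass(frozen=True)
class Sensor:
    ref: int        # reference numeral used in the figures
    kind: str       # e.g., "racam" or "camera"
    facing: str     # mounting position / field of view
    primary: bool   # True for the primary lane-sensing source

SENSORS = [
    Sensor(18, "racam",  "forward, behind the windshield", True),
    Sensor(20, "camera", "left side mirror",               False),
    Sensor(22, "camera", "right side mirror",              False),
    Sensor(24, "camera", "rear (back-up camera)",          False),
    Sensor(25, "camera", "forward, hood or front bumper",  False),
]

# Candidates to switch to when the primary sensor drops out:
FALLBACK_CANDIDATES = [s for s in SENSORS if not s.primary]
```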
- An example environmental sensing system 16 is shown schematically in FIG. 2. A controller 36 is in communication with the first, second, third and fourth sensors 18, 20, 22, 24. A steering system 38, suspension system 40 and/or brake system 42 are also in communication with the controller 36 and are used for partially or fully autonomous control of the vehicle 10 during operation. An LKA module 44, LC module 46, TJA module 48, and/or other module 49 are used to command the steering system 38, suspension system 40 and/or brake system 42 to achieve a desired vehicle lane position based upon the vehicle lane position detected by the controller 36. One or more of these modules 44, 46, 48, 49 may be incorporated into fully autonomous vehicle control and may also provide Lane Departure Warning (LDW) functionality.
- The controller 36 includes an image processor 50 that receives the signals from the first sensor 18, which is the primary sensor for vehicle lane position detection. In order to reliably determine the vehicle lane position, the environmental sensing system 16 detects the following parameters using the first sensor 18: 1) the distance of the left and right lane markers from the center of the host vehicle with respect to a vehicle coordinate system (VCS); 2) the distance to what the system determines is the center of the left and right lane markers (which would be the ideal path of the vehicle, ignoring driver preference); 3) the rate of change of both lane markers with respect to the host vehicle; 4) the curvature of the lane markers; and 5) the rate of change of curvature of the lane markers. This data can be expressed in the following polynomial, which provides a first algorithm 52:

y = A0 + A1·x + A2·x² + A3·x³ (Equation 1)

- One shortcoming of using a camera for vehicle lane position detection occurs when the camera faces into the sun or otherwise cannot "see" the lane markers. When the camera is directly facing the sun, for example, lane marker detection is compromised (no detection, intermittent detection, and/or low-confidence detections) because the image sensor is over-saturated by the bright sunlight, causing camera "drop-outs." At low confidences due to poor lane markers, the coefficients (A0, A1, A2, A3) in Equation 1 will still be present, but when facing the sun these coefficients will not report any values. At these times, some prior art systems depend heavily on the ranging sensors to achieve control and maneuver to a safe spot, which is not the best practice and is not very reliable.
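- As a hedged illustration (not the patent's code), Equation 1 can be evaluated per lane marker and averaged to obtain the ideal center path; coefficients that "report no values" are modeled here as None, which signals a drop-out to the caller. All names are assumptions:

```python
# Sketch of evaluating Equation 1 and deriving the ideal center path.
# Coefficients that report no values are modeled as None (drop-out).
from typing import Optional, Sequence

def marker_offset(coeffs: Sequence[Optional[float]], x: float) -> Optional[float]:
    """y = A0 + A1*x + A2*x^2 + A3*x^3 at longitudinal distance x [m]."""
    if any(a is None for a in coeffs):
        return None                        # e.g., sun glare drop-out
    a0, a1, a2, a3 = coeffs
    return a0 + a1 * x + a2 * x ** 2 + a3 * x ** 3

def lane_center(left: Sequence[Optional[float]],
                right: Sequence[Optional[float]], x: float) -> Optional[float]:
    """Midway between the left and right marker polynomials (the ideal path)."""
    yl, yr = marker_offset(left, x), marker_offset(right, x)
    if yl is None or yr is None:
        return None
    return 0.5 * (yl + yr)
```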
- The disclosed
environmental sensing system 16 and method 60 (FIG. 3 ) uses asecond algorithm 56 associate with a second sensor (e.g., one of the second, third orfourth sensors timer 58. This second sensor will generally not be facing in the same direction as the primary sensor, so should not be significantly impacted by sun glare, or are sensors which are impervious to the sun and function independent of the sun. - Referring to
FIG. 3 , data is gathered from the first or primary sensor and the second sensor in parallel. The first sensor is used to detect vehicle lane position (block 62). If the needed lane marker confidence is available (block 64), then the vehicle can be controlled to provide partially or fully autonomous vehicle control (block 66). If the needed lane marker confidence is not available (block 64), then the system switches to relying upon the second sensor data to detect vehicle lane position (block 68). In this manner, the second sensor reduces drop-outs due to glare as the second sensor is not directly facing the horizon and the sun but rather the area on the road just next to the vehicle (left, right or rear). Of course, the second sensor can be used for first sensor failures or inaccuracies due to other reasons. Data from the second sensor should sufficiently encompasses the lane markers next to the vehicle and, depending on the sensor, to some extent even the front of the vehicle. - The
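- The decision flow of blocks 62-68 can be sketched as follows; this is illustrative only, and the LaneEstimate structure, sensor interface and confidence threshold are assumptions rather than the patent's API:

```python
# Illustrative per-cycle source selection for blocks 62-68.
from typing import NamedTuple

class LaneEstimate(NamedTuple):
    lane_center: float   # lateral offset of the lane center from the host [m]
    confidence: float    # detection confidence in [0, 1]

CONFIDENCE_NEEDED = 0.8  # hypothetical threshold, not from the patent

def select_lane_source(first_sensor, second_sensor) -> LaneEstimate:
    estimate = first_sensor.detect_lane_position()   # block 62
    if estimate.confidence >= CONFIDENCE_NEEDED:     # block 64
        return estimate                              # block 66: control on first sensor
    return second_sensor.detect_lane_position()      # block 68: switch to second sensor
```

The second-sensor estimate is then still subject to the similarity and timer checks described below before it is used for control.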
- The second algorithm 56 is used for the second sensor and may be the same as the first algorithm 52 that is used for the first sensor. The timer 58 clocks the duration for which the first sensor is unavailable or dropped out (block 70). The pixels of the 2D images indicating the edges are projected to a "real-world" global coordinate system, and the confidence is computed (block 72). If desired, one or more filters, such as a Canny filter or a Sobel filter, can be used to detect the edges of the lane markers in the data supplied by the second sensor.
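- As one hedged example of such filtering and projection, lane marker edges could be extracted with a Canny filter and mapped to road-plane coordinates through a calibration homography. The homography H, thresholds and function name are assumptions, not the patent's implementation:

```python
# Illustrative sketch: Canny edge extraction plus pixel-to-road projection.
import cv2
import numpy as np

def marker_edges_on_road(frame_gray: np.ndarray, H: np.ndarray) -> np.ndarray:
    edges = cv2.Canny(frame_gray, 50, 150)       # lane marker edge pixels
    ys, xs = np.nonzero(edges)
    if xs.size == 0:                             # no edges found (possible drop-out)
        return np.empty((0, 2), np.float32)
    pix = np.float32(np.stack([xs, ys], axis=1)).reshape(-1, 1, 2)
    world = cv2.perspectiveTransform(pix, H)     # pixels -> "real-world" plane (block 72)
    return world.reshape(-1, 2)                  # (x, y) points in road coordinates
```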
- The predetermined time is stored in
memory 54 and may correspond to a few fractions of a second or a few seconds based upon best practices for the situation and the degree of data reliability. This data is obtained empirically, for example, based upon sensor range for various vehicle speeds that are known to provide sufficient accuracy for the predetermined time. The data reliability to an extent is derived from the algorithm, which determines from the level of accuracy if the reason for a drop-out was poor lane markers visibility or a sensor artifact (poor sensor performance, as a result of sensor limitation, and/or unable to filter out environmental effects). Thus, in the event of a first sensor drop out, if sufficient confidence and similarity exist and the predetermined time has not been exceeded (e.g., second sensor flag is set; block 81), then the vehicle is controlled using the data from the second sensor (block 80). - Since the second sensor range is substantially less than first sensor 18 (e.g., RACam) and has only instantaneous current lane/line markers, data about lane markers in front of the vehicle may not be available, and hence the control strategy may change significantly. For example, instead of using a feed-forward PI controller used with the
first sensor 18, a simple proportional control could be performed to maintain the vehicle within the center on the two lines reported by the second sensor. - Using this lane and curvature data, these values can be substituted to the 3rd degree polynomial in the
second algorithm 56 to provide partially or fully autonomous vehicle control. In the absence of first sensor data, the values from the second sensor should provide the confidence values along with similarity values sufficient to perform partially or fully autonomous vehicle control for the short instants that data is unavailable. However, thefirst sensor 18 is the primary data source for vehicle control, and the second sensor is only employed in case of drop-outs. Thus, there is a time interval for which control can be made with the second sensor after which theenvironmental sensing system 16 warns the driver to take control to avoid abuse. - The controller 36 may include a processor and
non-transitory memory 54 where computer readable code for controlling operation is stored. In terms of hardware architecture, such a controller can include a processor, memory, and one or more input and/or output (I/O) device interface(s) that are communicatively coupled via a local interface. The local interface can include, for example but not limited to, one or more buses and/or other wired or wireless connections. The local interface may have additional elements, which are omitted for simplicity, such as controllers, buffers (caches), drivers, repeaters, and receivers to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the aforementioned components. - The controller 36 may be a hardware device for executing software, particularly software stored in
memory 54. The processor can be a custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the controller 36, a semiconductor based microprocessor (in the form of a microchip or chip set) or generally any device for executing software instructions. - The
memory 54 can include any one or combination of volatile memory elements (e.g., random access memory (RAM, such as DRAM, SRAM, SDRAM, VRAM, etc.)) and/or nonvolatile memory elements (e.g., ROM, etc.). Moreover, thememory 54 may incorporate electronic, magnetic, optical, and/or other types of storage media. Thememory 54 can also have a distributed architecture, where various components are situated remotely from one another, but can be accessed by the controller. - The software in the memory may include one or more separate programs, each of which includes an ordered listing of executable instructions for implementing logical functions. A system component embodied as software may also be construed as a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. When constructed as a source program, the program is translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory.
- The input/output devices that may be coupled to system I/O Interface(s) may include input devices, for example, but not limited to, a scanner, microphone, camera, proximity device, etc. Further, the input/output devices may also include output devices, for example but not limited to a display, etc. Finally, the input/output devices may further include devices that communicate both as inputs and outputs, for instance but not limited to, a modulator/demodulator (for accessing another device, system, or network), a radio frequency (RF) or other transceiver, a bridge, a router, etc.
- When the controller 36 is in operation, the processor can be configured to execute software stored within the
memory 54, to communicate data to and from thememory 54, and to generally control operations of the computing device pursuant to the software. Software inmemory 54, in whole or in part, is read by the processor, perhaps buffered within the processor, and then executed. - It should also be understood that although a particular component arrangement is disclosed in the illustrated embodiment, other arrangements will benefit herefrom. Although particular step sequences are shown, described, and claimed, it should be understood that steps may be performed in any order, separated or combined unless otherwise indicated and will still benefit from the present invention.
- Although the different examples have specific components shown in the illustrations, embodiments of this invention are not limited to those particular combinations. It is possible to use some of the components or features from one of the examples in combination with features or components from another one of the examples.
- Although an example embodiment has been disclosed, a worker of ordinary skill in this art would recognize that certain modifications would come within the scope of the claims. For that reason, the following claims should be studied to determine their true scope and content.
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/181,915 US9840253B1 (en) | 2016-06-14 | 2016-06-14 | Lane keeping system for autonomous vehicle during camera drop-outs |
CN201710442441.0A CN107499309B (en) | | 2017-06-13 | Lane Keeping System for Autonomous Vehicles During Camera Loss |
EP17175774.3A EP3257729B1 (en) | | 2017-06-13 | Lane keeping system for autonomous vehicle during camera drop-outs |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/181,915 US9840253B1 (en) | 2016-06-14 | 2016-06-14 | Lane keeping system for autonomous vehicle during camera drop-outs |
Publications (2)
Publication Number | Publication Date |
---|---|
US9840253B1 (en) | 2017-12-12 |
US20170355366A1 (en) | 2017-12-14 |
Family
ID=59070478
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/181,915 (US9840253B1, Active, anticipated expiration 2036-06-15) | 2016-06-14 | 2016-06-14 | Lane keeping system for autonomous vehicle during camera drop-outs |
Country Status (3)
Country | Link |
---|---|
US (1) | US9840253B1 (en) |
EP (1) | EP3257729B1 (en) |
CN (1) | CN107499309B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019152896A (en) * | 2018-02-28 | 2019-09-12 | 本田技研工業株式会社 | Traveling control device, traveling control method, and program |
US20190361449A1 (en) * | 2017-02-15 | 2019-11-28 | Hitachi Automotive Systems, Ltd. | Vehicle Motion Control Apparatus, Vehicle Motion Control Method, and Vehicle Motion Control System |
WO2020207996A1 (en) * | 2019-04-08 | 2020-10-15 | Jaguar Land Rover Limited | Vehicle localisation |
US11176388B2 (en) | 2018-05-30 | 2021-11-16 | Lineage Logistics, LLC | Tracking vehicles in a warehouse environment |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11237556B2 (en) | 2012-06-22 | 2022-02-01 | Knowm, Inc. | Autonomous vehicle |
US9367065B2 (en) * | 2013-01-25 | 2016-06-14 | Google Inc. | Modifying behavior of autonomous vehicles based on sensor blind spots and limitations |
US10380886B2 (en) * | 2017-05-17 | 2019-08-13 | Cavh Llc | Connected automated vehicle highway systems and methods |
US10501085B2 (en) * | 2017-12-07 | 2019-12-10 | Waymo Llc | Early object detection for unprotected turns |
TWM563380U (en) | 2018-01-03 | 2018-07-11 | 大陸商上海蔚蘭動力科技有限公司 | Drive risk classification and prevention system for automatic drive and active drive |
KR102463722B1 (en) * | 2018-02-20 | 2022-11-07 | 현대자동차주식회사 | Apparatus and method for setting velocity of vehicle |
US10773717B2 (en) * | 2018-04-12 | 2020-09-15 | Trw Automotive U.S. Llc | Vehicle assist system |
DE102018205532B4 (en) * | 2018-04-12 | 2025-01-23 | Robert Bosch Gmbh | Method for detecting an obstacle in front of a vehicle |
CN110562251A (en) * | 2018-06-05 | 2019-12-13 | 广州小鹏汽车科技有限公司 | automatic driving method and device |
JP7202112B2 (en) * | 2018-09-11 | 2023-01-11 | 本田技研工業株式会社 | Vehicle control system and vehicle control method |
CN108944927B (en) * | 2018-09-28 | 2020-12-18 | 合刃科技(武汉)有限公司 | Lane keeping apparatus and method for vehicle |
US11037382B2 (en) * | 2018-11-20 | 2021-06-15 | Ford Global Technologies, Llc | System and method for evaluating operation of environmental sensing systems of vehicles |
US10930155B2 (en) * | 2018-12-03 | 2021-02-23 | Continental Automotive Systems, Inc. | Infrastructure sensor detection and optimization method |
US11093761B2 (en) * | 2019-03-06 | 2021-08-17 | GM Global Technology Operations LLC | Lane position sensing and tracking in a vehicle |
US20220137626A1 (en) * | 2020-11-04 | 2022-05-05 | Canon Kabushiki Kaisha | Apparatus, method, and non-transitory computer-readable storage medium |
US12071150B2 (en) * | 2021-06-25 | 2024-08-27 | Magna Electronics Inc. | Vehicular driving assist system using forward viewing camera |
US20230087392A1 (en) * | 2021-09-20 | 2023-03-23 | DC-001,Inc. | Systems and methods for adjusting vehicle lane position |
CN114997048B (en) * | 2022-05-27 | 2025-01-07 | 南京航空航天大学 | Lane keeping method for autonomous driving vehicles based on TD3 algorithm improved by exploration strategy |
Citations (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6173222B1 (en) * | 1997-10-10 | 2001-01-09 | Hyundai Motor Company | Steering control system and method for autonomous intelligent vehicles |
US6317057B1 (en) * | 2000-04-03 | 2001-11-13 | Hyundai Motor Company | Method for detecting lane deviation of vehicle |
US20020171844A1 (en) * | 2001-03-13 | 2002-11-21 | Hill Henry A. | Cyclic error reduction in average interferometric position measurements |
US20050187684A1 (en) * | 2004-02-09 | 2005-08-25 | Nissan Motor Co., Ltd. | Driving information system with haptic notification seat |
US6980663B1 (en) * | 1999-08-16 | 2005-12-27 | Daimlerchrysler Ag | Process and device for compensating for signal loss |
US20070078594A1 (en) * | 2005-09-30 | 2007-04-05 | Daishi Mori | Navigation system and vehicle position estimating method |
US20120109462A1 (en) * | 2010-10-29 | 2012-05-03 | Mando Corporation | Lane keeping control system, lane change control system and vehicle control system |
US8403213B1 (en) * | 2006-07-10 | 2013-03-26 | Diebold, Incorporated | Time analysis at a banking system controlled by data bearing records |
US20130208945A1 (en) * | 2012-02-15 | 2013-08-15 | Delphi Technologies, Inc. | Method for the detection and tracking of lane markings |
US20140018995A1 (en) * | 2012-04-27 | 2014-01-16 | Google Inc. | Safely Navigating on Roads Through Maintaining Safe Distance from Other Vehicles |
US20140052340A1 (en) * | 2012-08-14 | 2014-02-20 | Magna Electronics Inc. | Vehicle lane keep assist system |
US20140049646A1 (en) * | 2012-08-20 | 2014-02-20 | Magna Electronics Inc. | Method of obtaining data relating to a driver assistance system of a vehicle |
US20140156178A1 (en) * | 2012-12-04 | 2014-06-05 | Hyundai Motor Company | Road marker recognition device and method |
US20140168377A1 (en) * | 2012-12-13 | 2014-06-19 | Delphi Technologies, Inc. | Stereoscopic camera object detection system and method of aligning the same |
US20150103173A1 (en) * | 2013-10-16 | 2015-04-16 | Denso Corporation | Synthesized image generation device |
US20150109131A1 (en) * | 2013-10-15 | 2015-04-23 | Volvo Car Corporation | Vehicle driver assist arrangement |
US20150183430A1 (en) * | 2013-09-05 | 2015-07-02 | Robert Bosch Gmbh | Enhanced lane departure system |
US9171363B2 (en) * | 2011-12-28 | 2015-10-27 | Fujitsu Limited | Computer-readable recording medium and road surface survey device |
US20150341620A1 (en) * | 2014-05-23 | 2015-11-26 | Lg Electronics Inc. | Stereo camera and driver assistance apparatus and vehicle including the same |
US20160307441A1 (en) * | 2015-04-14 | 2016-10-20 | Delphi Technologies, Inc. | System for lane selection by an automated vehicle |
US20160306357A1 (en) * | 2015-04-17 | 2016-10-20 | Delphi Technologies, Inc. | Automated vehicle system with position bias for motorcycle lane splitting |
US20160341555A1 (en) * | 2015-05-20 | 2016-11-24 | Delphi Technologies, Inc. | System for auto-updating route-data used by a plurality of automated vehicles |
US20160363665A1 (en) * | 2015-06-09 | 2016-12-15 | Garmin Switzerland Gmbh | Radar sensor system providing situational awareness information |
US9542847B2 (en) * | 2011-02-16 | 2017-01-10 | Toyota Motor Engineering & Manufacturing North America, Inc. | Lane departure warning/assistance method and system having a threshold adjusted based on driver impairment determination using pupil size and driving patterns |
US20170028916A1 (en) * | 2001-07-31 | 2017-02-02 | Magna Electronics Inc. | Driver assistance system for a vehicle |
US20170057545A1 (en) * | 2015-08-26 | 2017-03-02 | Delphi Technologies, Inc. | Gps data correction for automated vehicle |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7400236B2 (en) * | 2005-10-21 | 2008-07-15 | Gm Global Technology Operations, Inc. | Vehicular lane monitoring system utilizing front and rear cameras |
WO2010042483A1 (en) | 2008-10-08 | 2010-04-15 | Delphi Technologies, Inc. | Integrated radar-camera sensor |
WO2010092621A1 (en) * | 2009-02-13 | 2010-08-19 | パイオニア株式会社 | Lane deviation alarm device, method of controlling lane deviation alarm device, and program for the method |
US20120022739A1 (en) * | 2010-07-20 | 2012-01-26 | Gm Global Technology Operations, Inc. | Robust vehicular lateral control with front and rear cameras |
US9542846B2 (en) * | 2011-02-28 | 2017-01-10 | GM Global Technology Operations LLC | Redundant lane sensing systems for fault-tolerant vehicular lateral controller |
US20120314070A1 (en) * | 2011-06-09 | 2012-12-13 | GM Global Technology Operations LLC | Lane sensing enhancement through object vehicle information for lane centering/keeping |
DE102013103952B4 (en) * | 2012-05-02 | 2020-07-09 | GM Global Technology Operations LLC | Lane detection at full speed with an all-round vision system |
US9112278B2 (en) | 2013-05-29 | 2015-08-18 | Delphi Technologies, Inc. | Radar device for behind windshield installations |
US9988047B2 (en) * | 2013-12-12 | 2018-06-05 | Magna Electronics Inc. | Vehicle control system with traffic driving control |
MX343922B (en) * | 2014-02-20 | 2016-11-29 | Ford Global Tech Llc | Fault handling in an autonomous vehicle. |
CN204110029U (en) * | 2014-07-31 | 2015-01-21 | 同济大学 | A kind of personal vehicle system of the Multi-sensor Fusion intelligent vehicle based on magnetic navigation |
JP6189815B2 (en) * | 2014-10-29 | 2017-08-30 | 株式会社Soken | Traveling line recognition system |
2016
- 2016-06-14: US US15/181,915 (US9840253B1, Active)
2017
- 2017-06-13: EP EP17175774.3A (EP3257729B1, Active)
- 2017-06-13: CN CN201710442441.0A (CN107499309B, Active)
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190361449A1 (en) * | 2017-02-15 | 2019-11-28 | Hitachi Automotive Systems, Ltd. | Vehicle Motion Control Apparatus, Vehicle Motion Control Method, and Vehicle Motion Control System |
JP2019152896A (en) * | 2018-02-28 | 2019-09-12 | Honda Motor Co., Ltd. | Traveling control device, traveling control method, and program |
JP7048353B2 (en) | 2018-02-28 | 2022-04-05 | Honda Motor Co., Ltd. | Driving control device, driving control method and program |
US11176388B2 (en) | 2018-05-30 | 2021-11-16 | Lineage Logistics, LLC | Tracking vehicles in a warehouse environment |
WO2020207996A1 (en) * | 2019-04-08 | 2020-10-15 | Jaguar Land Rover Limited | Vehicle localisation |
GB2583698A (en) * | 2019-04-08 | 2020-11-11 | Jaguar Land Rover Ltd | Vehicle localisation |
GB2583698B (en) * | 2019-04-08 | 2021-10-06 | Jaguar Land Rover Ltd | Vehicle localisation |
US20220185300A1 (en) * | 2019-04-08 | 2022-06-16 | Jaguar Land Rover Limited | Vehicle localisation |
US12054165B2 (en) * | 2019-04-08 | 2024-08-06 | Jaguar Land Rover Limited | Vehicle localization |
Also Published As
Publication number | Publication date |
---|---|
EP3257729A1 (en) | 2017-12-20 |
EP3257729B1 (en) | 2019-08-07 |
CN107499309A (en) | 2017-12-22 |
CN107499309B (en) | 2020-04-03 |
US9840253B1 (en) | 2017-12-12 |
Similar Documents
Publication | Title |
---|---|
US9840253B1 (en) | Lane keeping system for autonomous vehicle during camera drop-outs |
US11745659B2 (en) | Vehicular system for controlling vehicle |
US20240273917A1 (en) | Vehicular control system |
KR102161432B1 (en) | Autonomous braking failure management when protecting pedestrians |
US9211889B1 (en) | Enhanced blind spot detection for vehicle with trailer |
US9538144B2 (en) | Full speed lane sensing using multiple cameras |
CN106537180B (en) | Method for mitigating radar sensor limitations with camera input for active braking of pedestrians |
US12110015B2 (en) | Vehicular driving assistance system with lateral motion control |
US9507345B2 (en) | Vehicle control system and method |
US20200001856A1 (en) | Vehicular control system with vehicle trajectory tracking |
US10477102B2 (en) | Method and device for determining concealed regions in the vehicle environment of a vehicle |
US8885888B2 (en) | In-vehicle apparatus for recognizing running environment of vehicle |
US20200049513A1 (en) | Positioning system |
JP6441959B2 (en) | Reduce false alarms using location data |
US11987239B2 (en) | Driving assistance device |
US11377103B2 (en) | Vehicle control device and recording medium |
US20160163199A1 (en) | Vehicle vision system with collision mitigation |
US20210084227A1 (en) | Method and apparatus for the spatially resolved detection of an object outside a transportation vehicle with the aid of a sensor installed in a transportation vehicle |
JP2020123339A (en) | Collaborative blind spot warning method and apparatus for inter-vehicle communication infrastructure with fault tolerance and fracture robustness in extreme situations |
US20220048509A1 (en) | Vehicular control system with traffic jam assist |
JP6924269B2 (en) | Parking support device |
US20240383479A1 (en) | Vehicular sensing system with lateral threat assessment |
US20210256275A1 (en) | Method for detecting false positives of an image-processing device of a camera |
KR20210057897A (en) | Apparatus for controlling safety driving of vehicle and method thereof |
JP2016206970A (en) | Drive support device |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: DELPHI TECHNOLOGIES, INC., MICHIGAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: PRASAD, PREMCHAND KRISHNA; GREENE, JEREMY S.; MARTINDALE, PAUL R.; SIGNING DATES FROM 20160614 TO 20160615; REEL/FRAME: 038926/0641 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
AS | Assignment | Owner name: APTIV TECHNOLOGIES LIMITED, BARBADOS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: DELPHI TECHNOLOGIES INC.; REEL/FRAME: 047143/0874. Effective date: 20180101 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |
AS | Assignment | Owner name: APTIV TECHNOLOGIES (2) S.A R.L., LUXEMBOURG. Free format text: ENTITY CONVERSION; ASSIGNOR: APTIV TECHNOLOGIES LIMITED; REEL/FRAME: 066746/0001. Effective date: 20230818 |
AS | Assignment | Owner name: APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L., LUXEMBOURG. Free format text: MERGER; ASSIGNOR: APTIV TECHNOLOGIES (2) S.A R.L.; REEL/FRAME: 066566/0173. Effective date: 20231005 |
AS | Assignment | Owner name: APTIV TECHNOLOGIES AG, SWITZERLAND. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: APTIV MANUFACTURING MANAGEMENT SERVICES S.A R.L.; REEL/FRAME: 066551/0219. Effective date: 20231006 |