US20080065328A1 - Method and system for collision avoidance - Google Patents
Method and system for collision avoidance
- Publication number
- US20080065328A1 (application US11/851,642)
- Authority
- US
- United States
- Prior art keywords
- future
- host vehicle
- objects
- lane
- estimator
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/28—Details of pulse systems
- G01S7/285—Receivers
- G01S7/295—Means for transforming co-ordinates or for evaluating data, e.g. using computers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/18—Propelling the vehicle
- B60W30/18009—Propelling the vehicle related to particular drive situations
- B60W30/18163—Lane change; Overtaking manoeuvres
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/087—Interaction between the driver and the control system where the control system corrects or modifies a request from the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/12—Limiting control by the driver depending on vehicle state, e.g. interlocking means for the control input for preventing unsafe operation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9318—Controlling the steering
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/932—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Human Computer Interaction (AREA)
- Electromagnetism (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
- The invention relates to a method and an on board system for collision avoidance.
- In order to increase traffic safety, active as well as passive safety systems are continuously being developed. Passive safety systems are directed toward reducing the effects of an accident in the event an accident takes place, while active safety systems are directed toward reducing the probability that accidents occur. One type of active safety system is the collision avoidance system, which relies on sensor technology for estimating a traffic situation. In this type of system, sensors are used to detect the presence of objects in a future trajectory of the vehicle. In the event the system detects that an object is within a future trajectory of the vehicle, normally a warning is produced to alert the driver. Systems that intercept the control of the vehicle, such as by braking the vehicle, are also known.
- In most known collision avoidance systems and methods, future trajectories of all detected objects are estimated and compared with the future trajectory of the vehicle. In the event the future trajectory of an object coincides with the future trajectory of the vehicle, a conflict event is detected. The future trajectories of the detected objects are based on the position and velocity of the objects. Normally, position and velocity are detected by use of sensors such as radars. In order to distinguish large objects from small ones, and objects made of metal from concrete structures or animals, object recognition based on input signals from cameras may be used. As soon as the velocity and position have been detected by the radar, and it has been verified that the detected object is a potentially dangerous object and not, for instance, a flying insect or paper litter whirling around, the future trajectory of the object is estimated primarily from input data relating to the position and velocity of the object in question. However, the future trajectory of the object may change drastically due to influence from other objects on the road. In the article "Monte Carlo Road Safety Reasoning" (Broadhurst, A., Baker, S., Kanade, T., Proceedings of the IEEE Intelligent Vehicles Symposium 2005, 6-8 Jun. 2005, pages 319-324, Las Vegas, Nev., USA), a method is disclosed in which the future trajectory of an object is influenced by the traffic situation, that is, by the future trajectories of other objects present on the road. The invention relates to a system and method for collision avoidance wherein the future trajectories of external objects are influenced by the traffic situation. In the Broadhurst article the following uses of the system in the host vehicle are suggested: closed loop control of the vehicle for selecting the best predicted action, display of the best action to take in order to advise the driver, or display of warning signs for objects or unsafe regions of the road. However, in the event the control system suggested in this article is used to intercept the control of the host vehicle by using the best predicted action for control of the host vehicle, the host vehicle will be run by an autopilot. Such solutions are generally not accepted for legal reasons and are furthermore not appreciated by drivers. Even though the system and method described in "Monte Carlo Road Safety Reasoning" have contributed an important advance in object tracking and future path determination, there is still a need for improvements as regards the use of the information generated by the future trajectory estimator.
- It is an object of the invention to further reduce the risk for unsafe manoeuvres in a method or system for collision avoidance which estimates future trajectories of detected external objects.
- It is in particular an object of the invention to further reduce the risk for unsafe manoeuvres in a system where the future trajectories of each external object are estimated while considering influence from the future trajectories of the other external objects.
- The object of the invention is achieved by selecting an appropriate method of interception of vehicle control as a result of a detected risk for collision between the host vehicle and an external object in a neighbouring lane into which the host vehicle makes an attempt to enter.
- In a system or method according to the invention, future trajectories of the external objects are estimated under consideration of the traffic situation, that is, a future trajectory of an external object is influenced by the future trajectories of the other external objects present on the road, and the following actions are taken:
-
- determining, by use of a lane exit control block, whether the driver is making an attempt to exit the lane;
- determining, by use of a future conflict estimator control block, if the future trajectory of the host vehicle is involved in a conflicting event in the neighbouring lane; and
- applying, by the use of a lane change prevention unit, a torque directed against the torque generated by the driver to effect a lane change, in the event said future conflict estimator control block detects a conflict event of relevance for the host vehicle in the lane into which the driver attempts to enter.
- The method and system for collision avoidance according to the invention ensure that the driver has control over the host vehicle under most conditions, while reducing the risk for collision during a well-defined operation which is associated with increased risk, that is, when a driver attempts to leave a lane.
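Read together, the steps above amount to a small decision chain: intervene with a counter-steering torque only when the driver is leaving the lane, the neighbouring lane holds a predicted conflict, and (per the preferred embodiment described later) staying in the current lane is itself safe. The following Python sketch illustrates that chain; the class and parameter names and the torque gains are illustrative assumptions, not part of the claimed system.

```python
from dataclasses import dataclass

@dataclass
class ConflictReport:
    conflict_in_target_lane: bool   # conflict predicted in the lane being entered
    conflict_in_current_lane: bool  # conflict predicted if the host stays in its lane

def lane_change_intervention_torque(attempting_lane_exit: bool,
                                    report: ConflictReport,
                                    driver_torque_nm: float,
                                    blocking: bool = False) -> float:
    """Return a steering torque (Nm) opposing the driver's lane-change torque.

    attempting_lane_exit: output of the lane exit control block.
    report: output of the future conflict estimator control block.
    blocking: if True, fully oppose the driver torque; otherwise apply a
              smaller warning torque that the driver can overcome.
    """
    if not attempting_lane_exit:
        return 0.0                      # driver is keeping the lane, no intervention
    if not report.conflict_in_target_lane:
        return 0.0                      # neighbouring lane is judged safe
    if report.conflict_in_current_lane:
        return 0.0                      # staying in the lane is not safe either, do not resist
    gain = 1.0 if blocking else 0.4     # assumed gains, to be tuned on a real system
    return -gain * driver_torque_nm     # torque opposing the driver's input
```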
- In a particularly preferred embodiment of the invention the future trajectories for external objects are determined by the following method steps:
-
- receiving, by a sensor system arranged on the host vehicle, input data relating to a set of objects external to said host vehicle, said objects being positioned within a detecting range of said sensor system, wherein an object position (r, φ) and an object velocity {dot over (r)} are associated with each object in said set of objects, said input data defining a current state of each object,
- associating a plurality of future control input signals with each moving object, where each future control input signal, together with the current state of the moving object, will generate a separate future path via a state update equation,
- determining, by the use of a future trajectory estimator, future trajectories for each of the objects, by selecting one of the most probable future paths as the future trajectory.
- In an embodiment of the invention, the method described in the Broadhurst article may be used to generate the future trajectories of external moving objects. However, a generated future path will not be used to steer the host vehicle as suggested in the Broadhurst article. Instead, interception of host vehicle control will be performed as suggested in the characterising portion of claim 1, which will reduce the risk of collision in a specified situation associated with a high risk.
- In other embodiments of the invention, the method step of estimating future trajectories of each external object while considering influence by the future trajectories of the other external objects may include the following method steps:
-
- determining whether any of the future trajectories of the external objects will mutually affect each other due to a risk of conflict between the future trajectories of at least one pair of objects;
- determining if the conflict event has an impact on the future trajectory of the host vehicle.
- An embodiment of the invention will be described in further detail below, with reference to the appended drawings, where
-
- FIG. 1 shows a block scheme of a system for collision avoidance according to the invention,
- FIG. 2 shows an example of a traffic situation on a road,
- FIG. 3 shows another example of a traffic situation on a road,
- FIG. 4 illustrates the parameters of the lane exit control block.
- An embodiment of the invention will be described below with reference to FIGS. 1 and 2. In FIG. 1 a block scheme of a system 10 for collision avoidance is shown. FIG. 2 shows an example of a traffic situation on a road 12. The road 12 includes four lanes 14a-14d, where lanes 14a-14c are intended for traffic going in the direction from left to right, as indicated by arrows 16a-16c, and lane 14d is intended for traffic going in the direction from right to left, as indicated by arrow 16d. The system 10 for collision avoidance includes a sensor system 18 arranged on a host vehicle 20. The sensor system 18 is arranged to receive input data relating to a set of objects 22, 24, 26, 28, 30, 32 external to the host vehicle 20. The objects 22, 24, 26, 28, 30, 32 are positioned on the road 12 within a detecting range 34 of the sensor system 18. At least an object position (r, φ) and an object velocity ({dot over (r)}) are associated with each object in said set of objects 22, 24, 26, 28, 30, 32. The set of objects may include different types of objects, such as obstacles 30, pedestrians or animals 32 and vehicles 22, 24, 26, 28. The vehicles 24, 26, 28 may be of different type and size, such as bikes, motorbikes, trucks and cars. Different types of objects may preferably be associated with different types of behaviour, as will be explained in further detail below.
- Future trajectories 25, 27, 29, 33 are estimated in a future trajectory estimator, which will be explained in further detail below. For obstacles 30, the future trajectory will be estimated as no movement, that is, a 0 vector indicated by reference number 0. A future trajectory of the host vehicle is denoted by arrow 21.
- The detecting range 34 of the sensor system 18 predominantly includes a region 34a in front of the vehicle, but may preferably also include a region 34b beside the vehicle and a region 34c behind the vehicle. As has been indicated in the figure, the region 34a in front of the vehicle is generally substantially larger than the region 34c behind the vehicle. The detecting range 34 preferably has some directivity, so that it extends further in a main lobe within approximately ±30° of the heading direction of the vehicle than in directions outside the main lobe. It is suitable that the detecting range within the main lobe stretches at least 75 m, preferably at least 150 m, and suitably approximately 300 m from the vehicle 20.
- The sensor system 18 preferably includes different types of sensors. In the embodiment shown in FIG. 1, the sensor system includes a vision type sensor 34, a radar 36 and a set of host vehicle sensors 38. The vision type sensor preferably generates output data including distance to the object (r), direction to the object (φ), distance to the right edge of the lane (LR) of the host vehicle, distance to the left edge of the lane (LL) of the host vehicle, curve radius (c0) of the road at the current position of the host vehicle, heading direction (ψrel) of the host vehicle relative to the lane, and a classification of the object type. The classification of the object type may be based on image recognition of objects. The objects may be classified into obstacles 30, pedestrians or animals 32 and vehicles 24, 26, 28. The vehicles 24, 26, 28 may be of different type and size, such as bikes, motorbikes, trucks and cars. A vision sensor suitable for the collision avoidance system 10 is provided under the tradename Mobil Eye. The radar 36 provides output data including an object position (r, φ) and an object velocity {dot over (r)}. The object velocity ({dot over (r)}) is defined by a magnitude |{dot over (r)}| and a direction of movement ({dot over (r)})/|{dot over (r)}|. The host vehicle sensors 38 preferably generate output data including host vehicle velocity (v), host vehicle yaw rate ({dot over (ψ)}abs) and host vehicle steering angle (θ). Host vehicle sensors capable of providing such output data are well known in the art.
- The output data 40 from the sensor system 18 are preferably treated by an object and road tracking block 42. The object and road tracking block may advantageously be a state estimator which estimates the states of all objects and the host vehicle as well as the road geometry. The states of the objects may include all data provided from the sensor system. The state estimator 42 is preferably arranged as a Kalman filter based tracking system estimating at least the current object position (xi, yi) and the current object velocity (vi) for all objects detected by the sensor system; the current host vehicle velocity (v) and the current host vehicle heading direction (ψrel); and the road geometry, such as curve radius (c0) and lane width (W).
- For the purpose of tracking the external objects, the system may comprise a road geometry tracking unit which is arranged to determine the geometry of the road on which the vehicle is travelling and to express said geometry as a curved coordinate system which follows the lane or lanes of said road, such that said object position, object velocity and object direction of movement are expressed relative to said curved coordinate system.
- A suitable state estimator for this purpose may be the state estimator described in "An Automotive Lane Guidance System", Andreas Eidehall, thesis 1122 at Linkoping University, 2004. In particular, reference is made to measurement equations 5.6a and 5.6b.
- Expressed in the variables introduced above, the measurement equations take the form of equations 5.6a and 5.6b in the above reference, where T transforms from a coordinate system x, y following the road geometry into a coordinate system (r, φ) centred at the host vehicle. The variables (e1, . . . , e6) are stochastic measurement noise, W represents the width of the lane, the superscript m denotes measured quantities, and yoff represents the distance from the middle of the lane.
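To make the curved coordinate system concrete, the sketch below converts a host-centred polar measurement (r, φ) of an object into an approximate road-aligned position (distance along the lane, lateral offset from the lane centre) using the lane variables yoff, ψrel, c0 and c1 above. It is a minimal small-angle sketch; the sign conventions (yoff and φ positive to the left, ψrel positive when the host points left of the lane direction) are assumptions of this illustration.

```python
import math

def object_in_road_coordinates(r: float, phi: float,
                               y_off: float, psi_rel: float,
                               c0: float, c1: float = 0.0):
    """Express a sensed object in a road-aligned (curved) coordinate system.

    r, phi   : range and bearing of the object in the host frame (phi = 0 straight
               ahead, positive to the left) -- assumed conventions.
    y_off    : lateral offset of the host from the middle of the lane (positive left).
    psi_rel  : host heading relative to the lane direction (radians, small).
    c0, c1   : road curvature and curvature rate at the host position.

    Returns (s, d): approximate distance along the lane and lateral offset of the
    object from the lane centre, using a small-angle clothoid approximation.
    """
    # Object position in the host Cartesian frame (x forward, y left).
    x = r * math.cos(phi)
    y = r * math.sin(phi)

    # Lateral position of the lane centre at longitudinal distance x, seen from the
    # host: it starts at -y_off (the host sits y_off to the left of the centre),
    # rotates with -psi_rel and bends with the curvature polynomial.
    y_lane_centre = -y_off - psi_rel * x + 0.5 * c0 * x**2 + (c1 * x**3) / 6.0

    s = x                      # longitudinal distance, small-angle approximation
    d = y - y_lane_centre      # lateral offset of the object from the lane centre
    return s, d
```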
- Expressing the curve radius of the road as R = 1/(c0 + c1·x) and assuming ċ1 = 0, the time-continuous motion equations for the host vehicle states will be:

  Ẇ = 0
  ẏoff = v·ψrel
  ψ̇rel = v·c0 + ψ̇abs
  ċ0 = v·c1
  ċ1 = 0

- Based on the model above, an observer may be constructed using the following matrix definitions. A_host is the 5×5 matrix with rows (1, 0, 0, 0, 0), (0, 1, vTs, v²Ts²/2, v³Ts³/6), (0, 0, 1, vTs, v²Ts²/2), (0, 0, 0, 1, vTs) and (0, 0, 0, 0, 1), acting on the host states (W, yoff, ψrel, c0, c1); A_obj is the 3×3 matrix with rows (1, Ts, 0), (0, 1, 0) and (0, 0, 1); and A is block diagonal with A_host and N copies of A_obj, that is, A = diag(A_host, I_N ⊗ A_obj). The input matrices are B_host, the 5×2 matrix with rows (0, 0), (vTs²/2, 0), (Ts, 0), (0, 0) and (0, 0), and B_obj, the 3×2 matrix with rows (0, Ts²/2), (0, Ts) and (0, 0); B stacks B_host on top of one B_obj block per object. Here N is the number of objects and Ts is the sample time. Furthermore, corresponding state, input and measurement vectors are defined for the host vehicle and the tracked objects, together with process and measurement covariance matrices, where Qhost and Qobj are the process noise covariance matrices for the host and object states and Rhost and Robj are the measurement noise covariance matrices for the host and object measurements.
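The discretization that turns the host motion equations above into the host transition and input matrices can be written out directly. A minimal NumPy sketch, treating the speed v as constant over the sample time Ts; the NumPy form and function name are incidental to this illustration.

```python
import numpy as np

def host_transition_matrices(v: float, Ts: float):
    """Discretized host-state model for x_host = [W, y_off, psi_rel, c0, c1].

    Follows from integrating  dW=0, dy_off=v*psi_rel, dpsi_rel=v*c0 + psi_dot_abs,
    dc0=v*c1, dc1=0  over one sample period Ts with v held constant.
    """
    A_host = np.array([
        [1, 0,    0,             0,              0],
        [0, 1, v*Ts, (v*Ts)**2 / 2, (v*Ts)**3 / 6],
        [0, 0,    1,          v*Ts, (v*Ts)**2 / 2],
        [0, 0,    0,             1,           v*Ts],
        [0, 0,    0,             0,              1],
    ])
    # Input u1 is the measured absolute yaw rate psi_dot_abs; it enters psi_rel
    # directly (Ts) and y_off through one further integration (v*Ts^2/2).
    B_host = np.array([
        [0,           0],
        [v*Ts**2 / 2, 0],
        [Ts,          0],
        [0,           0],
        [0,           0],
    ])
    return A_host, B_host
```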
- The measurement equations and motion equations can now be rewritten as

  xt+1 = A·xt + B·ut + wt
  yt = h(xt) + et

- A recursive one-step predictor in the form of a Kalman filter will have the following appearance:

  x̂t+1 = A·(x̂t + Kt·[yt − h(x̂t)]) + B·ut

  acting as an observer to the combined target and road geometry system xt+1 = A·xt + B·ut + wt, yt = h(xt) + et. The extended Kalman filter is provided with a feedback gain Kt, which is computed from the extended Kalman filter equations for a non-linear measurement equation.
- Further details about the state estimator 42 are provided in "An Automotive Lane Guidance System", Andreas Eidehall, thesis 1122 at Linkoping University, 2004.
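A compact sketch of the recursive one-step predictor above, with the gain Kt computed as in a standard extended Kalman filter by linearizing the measurement function h around the current estimate. The matrix names follow the text; the numerical Jacobian and the NumPy implementation are assumptions of this illustration, not the estimator of the cited thesis.

```python
import numpy as np

def numerical_jacobian(h, x, eps: float = 1e-6) -> np.ndarray:
    """Finite-difference Jacobian of the measurement function h at x."""
    y0 = h(x)
    J = np.zeros((y0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (h(x + dx) - y0) / eps
    return J

def ekf_one_step_predictor(x_hat, P, y, u, A, B, Q, R, h):
    """One iteration of  x_hat_{t+1} = A (x_hat_t + K_t [y_t - h(x_hat_t)]) + B u_t.

    x_hat, P : current state estimate and covariance
    y, u     : current measurement and known input (e.g. host yaw rate)
    A, B     : combined host/object transition and input matrices
    Q, R     : process and measurement noise covariances
    h        : non-linear measurement function
    """
    H = numerical_jacobian(h, x_hat)
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain K_t
    x_corr = x_hat + K @ (y - h(x_hat))      # measurement update
    P_corr = (np.eye(P.shape[0]) - K @ H) @ P
    x_next = A @ x_corr + B @ u              # time update (one-step prediction)
    P_next = A @ P_corr @ A.T + Q
    return x_next, P_next
```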
future trajectory estimator 44 which is arranged to estimate the future trajectory for each of the objects. According to the invention it is necessary that risk of conflicting events between individual objects in the set of objects detected by thesensor system 18 are assessed by thefuture trajectory estimator 44. Two general types of future trajectory estimators capable of including the mutual influence from external objects when estimating the future trajectory of an external object are known. A first type in which the future trajectory of an external object is corrected in a most likely fashion when a conflict event is detected. - One example of this type of future trajectory estimator is disclosed in the Broadhurst article referred to above. The correction may be made to avoid the conflict event, or in the event this is not possible due to physical restraints such as available steering possibilities, surface friction, acceleration, etc, the correction is made to reduce the effect of the conflict event.
- In another type of future trajectory estimator, it is simply noted that a conflicting event occurs between two external objects. It is thereafter determined whether this conflict event will have an impact of the host vehicle or not. In both these systems the future trajectory estimator determines whether any of the future trajectories the external objects will mutually effect each other due to a risk of conflict in between the future trajectories of at least one pair of objects. In most known collision avoidance systems, interaction between the detected objects is not observed.
- In most prior art systems normally only conflicting events between each object and the host vehicle are observed.
- In one embodiment of the invention the
future trajectory estimator 44 is of the type described in the article “Monte Carlo Road Safety Reasoning”, by Broadhurst et al. referred to above. - In an embodiment of a
future trajectory estimator 44 of this type, each object type is assigned certain restrictions of movement. For instance obstacles will not move, pedestrians may move independently in the x and y directions, while cars are restricted to turn with a curvature radius restricted to possible steering angles. Starting with an initial state, which in one embodiment of the invention is determined by the object androad tracking block 42, the future trajectory estimator first generates set of future trajectories for each object, which set of future trajectories may include all possible future control inputs selected from a set of typical human driver actions under consideration of the boundary conditions due to the restrictions of movement of the objects. These set of typical driver actions include stop, stop and turn, change lane, corner, overtake and random. For each possible future trajectory a certain risk is assigned. Alternatively, the control input may be restricted to a selection of a plurality of values of one or more variables, wherein said values are selected within a predetermined interval. Preferably acceleration and steering angle are used as input variables. Among all possible future trajectories, the trajectories that minimizes the risk for conflict between all future trajectories is selected as the most probable future trajectories. In order to evaluate the probability of danger for all possible future trajectories, and thus find the future trajectories for the detected objects that has the minimal the risk for conflict between the trajectories a Monte-Carlo sampling algorithm may suitable be selected. -
- FIG. 3 shows a flow scheme describing the operation of an embodiment of a future trajectory estimator 44.
x 1(t), a velocityv i(t) and a class of input control signals ū(t). The input control signals may be of different types for different types of objects. The different types of objects may advantageously include obstacles, pedestrians and cars. - In a second operational sequence S2 a state update equation is defined for each c. The state update equation may be described as
{dot over (s)} (t)=f(s (t), ū(t). Obstacles will have a state update equation{dot over (s)} (t)=0 for the states (t)=[x y θ]. Pedestrians will have a state update equation [{dot over (s)}1 {dot over (s)}2 {dot over (s)}3 {dot over (s)}4]T=[s1 s2 u1 u2]T, where [u1u2]T=[axay]T, a denotes a random acceleration. The state is defined as [s1 s2 s3 S4]T=[x y {dot over (x)} {dot over (y)}]T. - Cars will have a state update equation [{dot over (s)}1 {dot over (s)}2 {dot over (s)}3 {dot over (s)}4]T=[s3 cos s4 s3 sin s4 u1 (s3 sin u2)/L]T, where [u1u2]T=[a θ]T. a denotes acceleration and θ denotes steering angle. The state is defined as [s1 s2 s3 s4]T=[x y v φ]T, where v is the velocity and φ is the orientation. L=R sin θ, where R equals the turning radius of the car.
- In a third operational sequence S3 a plurality of future control input signals is associated with each object. For pedestrians, this amount to selection of a set of values of accelerations within a predetermined interval. For cars this amounts to selection of a set of values of accelerations and steering angles within predetermined intervals. A plurality of values are selected at each step in a plurality of discrete time steps together forming a prediction horizon. The selection may be performed by random or include a set of typical inputs defining typical manoeuvres such as corner, lane change, overtake and emergency stop. Preferably random input is combined with the typical inputs.
- In a fourth operational sequence S4 a probability number is associated with each selected value. The probability number may be determined from a stored map describing the probability as a function of the input variable
- In a fifth operational sequence S5 a future path is calculated by use of the selected values. A plurality of future paths is thus created for each object.
- In a sixth operational sequence S6 a future trajectory is selected as one of the most probable future paths, preferably the most probable future path. This is done by calculating the aggregate probability value for the selected values at each step in a plurality of discrete time steps forming the prediction horizon. Future paths leading to conflicting events between objects may be removed in this sixth operational sequence, or in an operational sequence S6′ prior to the sixth operational sequence.
- The operational sequences S1-S6 may be performed in a
future trajectory estimator 44 of the type described in the Broadhurst reference. The future trajectory estimator would then include first to sixth means for performing the first to sixth operational sequences described above. - In
FIG. 3 an example of a traffic situation which explains the importance of assigning also conflicting events between individual objects in the set of objects detected by thesensor system 18 and not only detect possible conflicting events between the future trajectories of the each set of objects with the future trajectory of the host vehicle. InFIG. 3 reference sign 60 denotes the host vehicle having a future trajectory which may be estimated by the current position of the host vehicle; the current heading 64 of the host vehicle; vehicle input data such as steering angle, acceleration; and road geometry. The traffic situation includes two external objects, afirst vehicle 66 and a trailingvehicle 68. The velocity of the trailing vehicle is greater than the velocity of the first vehicle. At the current scenario, the trailing vehicle has a firstfuture trajectory 70 where the trailingvehicle 68 will follow the current heading 72 of the trailing vehicle. Since the velocity of the trailingvehicle 68 is greater than the velocity of the first vehicle, a conflict event exists between afuture trajectory 74 of the first vehicle and the firstfuture trajectory 70 of the trailing vehicle. However, using afuture trajectory estimator 44 of the type described above, whichfuture trajectory estimator 44 first generates a set of future trajectories for each object, which set of future trajectories includes all possible future control inputs selected from a set of typical human driver actions under consideration of the boundary conditions due to the restrictions of movement of the objects, also a secondfuture trajectory 76 where the trailingvehicle 68 will avoid collision with thefirst vehicle 66 by passing thefirst vehicle 66 in the neighbouringlane 78 is generated by thefuture trajectory estimator 76. Since the risk for a conflicting event is smaller for the secondfuture trajectory 76, than for the firstfuture trajectory 70, the secondfuture trajectory 76 is selected by thefuture trajectory estimator 44 as the most likely future trajectory. Thefuture trajectory 76 of the trailing vehicle is a result from consideration of the interaction betweenfuture trajectories objects host vehicle 60 to enter the neighbouringlane 78 because the future trajectories of theexternal objects upper lane 80 in the traffic situation described inFIG. 3 . - According to the invention the output from the
future trajectory estimator 44 is entered into a laneexit control block 46. In the laneexit control block 46 it is determined in a laneexit control block 46 whether the driver is making an attempt to exit the lane. The laneexit control block 46 verifies that the host vehicle is making an attempt to leave the lane based on the vertical distance Δ to the edge of the lane, the velocity v of the host vehicle and the heading angle ψrel, relative to the road geometry.FIG. 4 illustrates the parameters of the lane exit control block. When it is predicted that the host vehicle will leave the lane within shortly, the lane exit control block generates an affirmative output signal. The decision can be made for instance by comparing the predicted time or distance before the vehicle leaves the lane exceeds a threshold, which threshold may be fixed or depend on for instance the velocity of the vehicle. In a future conflictestimator control block 50 included in the lane exit control block it is determined if thefuture trajectory 82 of the host vehicle is involved in a conflicting event.Future trajectory 82 of the host vehicle is a future trajectory based on the verification of an attempt to leave thelower lane 84 in the traffic situation designed inFIG. 3 . Thefuture conflict estimator 50 furthermore verifies if the future conflict detected is within the neighbouringlane 78 into which the host vehicle makes an attempt to enter. In one embodiment of the invention, thefuture conflict estimator 50 verifies a conflicting event and sends a control signal to a lanechange prevention unit 51 including alateral feedback controller 52 which generates a control signal to asteering actuator 54 of the vehicle. The lateral feedback controller applies a control signal to generate a torque in the opposite direction of the steering torque generated by the driver input signal, either so as to prevent the driver form changing lane or a smaller torque which may be overcome by the driver, which torque alerts the driver of the existence of a danger of entering the neighbouring lane, but does not prevent the driver from entering the neighbouring lane. - In a preferred embodiment the future conflict
- In a preferred embodiment, the future conflict estimator control block 50 also determines whether the current lane of the host vehicle is safe, that is, whether the future trajectory 62 of the host vehicle within the current lower lane 84 does not include any conflicting events with external objects. If, for instance, an obstacle 86 is present in the current lane 84, the future conflict estimator control block 50 will not activate the lateral feedback controller 52, neither to alert the driver nor to prevent the driver from leaving the lane.
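The resulting activation logic can be summarised in a few lines. This is a minimal sketch under the assumption that the three inputs are boolean flags produced by blocks 46 and 50; the function name is hypothetical.

```python
# Minimal decision sketch (assumed names): the lateral feedback controller is
# only engaged when a lane exit attempt is detected, a conflict is predicted in
# the neighbouring lane, AND staying in the current lane is itself safe.

def should_resist_lane_change(exit_attempt: bool,
                              conflict_in_neighbouring_lane: bool,
                              current_lane_safe: bool) -> bool:
    if not exit_attempt:
        return False
    if not current_lane_safe:
        # e.g. an obstacle ahead: do not block or discourage leaving the lane
        return False
    return conflict_in_neighbouring_lane

print(should_resist_lane_change(True, True, True))    # resist / alert
print(should_resist_lane_change(True, True, False))   # obstacle ahead: allow exit
```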
- The future trajectory estimator 44 is preferably, as has been described above, of the type described in the article "Monte Carlo Road Safety Reasoning" referred to above. Alternatively, the future trajectory estimator 44 is of a simpler type, which estimates the future trajectories with a feedback predictor (which may be based on a Kalman filter) using the current velocity, heading angle and road geometry, without generating a plurality of possible future trajectories for each object from which a most likely future trajectory may be selected by Monte Carlo sampling. With this type of estimator, a conflicting event will not be avoided by generating a more likely future trajectory for the object. If this type of future trajectory estimator is used, the future conflict estimator control block may detect whether a conflicting event exists between a pair of objects in the heading direction of the host vehicle such that the future trajectory may be affected by this conflicting event. The conflicting event between the pair of external objects may not necessarily take place in the neighbouring lane 78 if this type of future trajectory estimator 44 is used, since this type may not predict amended future trajectories of the external objects. However, if the future trajectory estimator is arranged as a feedback predictor, any change in the trajectory of the external objects will be detected by the sensor system 18 and will form the basis for an updated prediction of the future trajectory of the external object. For this reason, this simpler type of future trajectory estimator may also generate accurate predictions of the future trajectories. Normally, the judgement that the conflicting event may affect the host vehicle is only made if the conflicting event between the external objects takes place in the neighbouring lane which the host vehicle attempts to enter. In the event the conflicting event may affect the host vehicle, the future conflict estimator control block may generate a control signal to the lateral feedback controller 52, either as a direct result or after verifying that no conflicting events exist at the current position of the host vehicle. Naturally, the future conflict estimator control block would activate the lateral feedback controller 52 in the event a future conflicting event between the host vehicle and an external object exists in a neighbouring lane.
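For concreteness, a simple constant-velocity Kalman predictor of the kind hinted at above might look as follows. This is only a generic sketch: the state layout, noise matrices and update cadence are assumptions and are not taken from the patent.

```python
import numpy as np

# Constant-velocity Kalman predictor for the "simpler" estimator discussed
# above: one predicted trajectory per object, refreshed whenever the sensor
# system supplies a new measurement. Matrices and noise levels are assumed.

DT = 0.1
F = np.array([[1, 0, DT, 0],     # state transition for [x, y, vx, vy]
              [0, 1, 0, DT],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],      # only position is measured
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 0.05             # process noise
R = np.eye(2) * 0.5              # measurement noise

def kf_update(x, P, z):
    """One predict + correct cycle when a new position measurement z arrives."""
    x, P = F @ x, F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

def predict_trajectory(x, steps=40):
    """Roll the current estimate forward to obtain the single future trajectory."""
    traj, xi = [], x.copy()
    for _ in range(steps):
        xi = F @ xi
        traj.append(xi[:2].copy())
    return np.array(traj)

x, P = np.array([0.0, 0.0, 20.0, 0.0]), np.eye(4)
x, P = kf_update(x, P, np.array([2.1, 0.1]))     # new sensor measurement
print(predict_trajectory(x)[:3])                 # first few predicted positions
```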
- Furthermore, the future trajectory estimator 44 may alternatively receive input signals directly from the sensor system, without the use of the state estimator 42.
Claims (34)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/955,041 US8112225B2 (en) | 2006-09-08 | 2010-11-29 | Method and system for collision avoidance |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP06120392.3 | 2006-09-08 | ||
EP06120392 | 2006-09-08 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/955,041 Continuation US8112225B2 (en) | 2006-09-08 | 2010-11-29 | Method and system for collision avoidance |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080065328A1 true US20080065328A1 (en) | 2008-03-13 |
Family
ID=37745892
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/851,642 Abandoned US20080065328A1 (en) | 2006-09-08 | 2007-09-07 | Method and system for collision avoidance |
US12/955,041 Active US8112225B2 (en) | 2006-09-08 | 2010-11-29 | Method and system for collision avoidance |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/955,041 Active US8112225B2 (en) | 2006-09-08 | 2010-11-29 | Method and system for collision avoidance |
Country Status (2)
Country | Link |
---|---|
US (2) | US20080065328A1 (en) |
DE (1) | DE602007008801D1 (en) |
Cited By (77)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070276577A1 (en) * | 2006-05-23 | 2007-11-29 | Nissan Motor Co., Ltd. | Vehicle driving assist system |
US20090135048A1 (en) * | 2007-11-16 | 2009-05-28 | Ruediger Jordan | Method for estimating the width of radar objects |
US20090312889A1 (en) * | 2008-06-16 | 2009-12-17 | Gm Global Technology Operations, Inc. | Vehicle control using stochastic information |
US20100010699A1 (en) * | 2006-11-01 | 2010-01-14 | Koji Taguchi | Cruise control plan evaluation device and method |
US20100030474A1 (en) * | 2008-07-30 | 2010-02-04 | Fuji Jukogyo Kabushiki Kaisha | Driving support apparatus for vehicle |
US20100030472A1 (en) * | 2007-03-29 | 2010-02-04 | Toyota Jidosha Kabushiki Kaisha | Collision possibility acquiring device, and collision possibility acquiring method |
DE102009031805A1 (en) | 2009-05-20 | 2010-03-04 | Daimler Ag | Method for detecting objects in surrounding of vehicle, involves periodically determining yaw angle of vehicle, periodically determining change of yaw angle, and determining and correcting angle changes of object angle |
EP2172919A1 (en) * | 2007-06-20 | 2010-04-07 | Toyota Jidosha Kabushiki Kaisha | Vehicle travel track estimator |
US20100100324A1 (en) * | 2008-10-22 | 2010-04-22 | Toyota Motor Engineering & Manufacturing North America, Inc. | Communication based vehicle-pedestrian collision warning system |
US20100121576A1 (en) * | 2007-07-12 | 2010-05-13 | Toyota Jidosha Kabushiki Kaisha | Host-vehicle risk acquisition |
US20100228419A1 (en) * | 2009-03-09 | 2010-09-09 | Gm Global Technology Operations, Inc. | method to assess risk associated with operating an autonomic vehicle control system |
US20100235099A1 (en) * | 2009-02-27 | 2010-09-16 | Toyota Jidosha Kabushiki Kaisha | Driving assisting apparatus |
US20110010046A1 (en) * | 2009-07-10 | 2011-01-13 | Toyota Jidosha Kabushiki Kaisha | Object detection device |
US20110178710A1 (en) * | 2010-01-15 | 2011-07-21 | Ford Global Technologies, Llc | Collision mitigation system and method for braking a vehicle |
US20110250836A1 (en) * | 2010-04-09 | 2011-10-13 | Telcordia Technologies, Inc. | Interference-adaptive uwb radio-based vehicle communication system for active-safety |
DE102010033544A1 (en) * | 2010-08-05 | 2012-02-09 | Valeo Schalter Und Sensoren Gmbh | Method for operating motor car, involves detecting movement of object in vicinity of vehicle, obtaining variables during detection of movement of object, and performing reaction of component to obtained specific variables |
EP2423902A1 (en) * | 2010-08-27 | 2012-02-29 | Scania CV AB (publ) | Safety system and method |
US20120078498A1 (en) * | 2009-06-02 | 2012-03-29 | Masahiro Iwasaki | Vehicular peripheral surveillance device |
WO2012113366A1 (en) * | 2011-02-23 | 2012-08-30 | S.M.S. Smart Microwave Sensors Gmbh | Method and radar sensor arrangement for detecting the location and speed of objects relative to a measurement location, particularly a vehicle |
CN102686468A (en) * | 2009-12-24 | 2012-09-19 | 日产自动车株式会社 | Driving control device |
US20120296522A1 (en) * | 2011-04-14 | 2012-11-22 | Honda Elesys Co., Ltd. | Driving support system |
US20120323477A1 (en) * | 2009-12-01 | 2012-12-20 | Folko Flehmig | Anticipatory Control of the Transverse Vehicle Dynamics in Evasive Maneuvers |
US8412449B2 (en) * | 2008-10-24 | 2013-04-02 | Gray & Company, Inc. | Control and systems for autonomously driven vehicles |
US20130124052A1 (en) * | 2011-11-10 | 2013-05-16 | GM Global Technology Operations LLC | Method for operating a motor vehicle safety system and a safety system for a motor vehicle |
US20130158863A1 (en) * | 2011-12-20 | 2013-06-20 | Continental Automotive Systems, Inc. | Trailer backing path prediction using gps and camera images |
US20130211687A1 (en) * | 2010-10-23 | 2013-08-15 | Daimier Ag | Method for Operating a Brake Assist Device and Brake Assist Device for a Vehicle |
EP2654028A1 (en) * | 2012-04-20 | 2013-10-23 | Honda Research Institute Europe GmbH | Orientation sensitive traffic collision warning system |
US20130325306A1 (en) * | 2012-06-01 | 2013-12-05 | Toyota Motor Eng. & Mftg. N. America, Inc. (TEMA) | Cooperative driving and collision avoidance by distributed receding horizon control |
EP2567875A3 (en) * | 2011-09-08 | 2014-03-26 | Bayerische Motoren Werke Aktiengesellschaft | Method for controlling energy conversion processes in a vehicle |
US20140222322A1 (en) * | 2010-10-08 | 2014-08-07 | Navteq B.V. | Method and System for Using Intersecting Electronic Horizons |
US20140379167A1 (en) * | 2013-06-20 | 2014-12-25 | Robert Bosch Gmbh | Collision avoidance for a motor vehicle |
EP2845779A1 (en) * | 2013-09-09 | 2015-03-11 | Honda Research Institute Europe GmbH | Driving assistance technique for active vehicle control |
CN104424819A (en) * | 2013-09-02 | 2015-03-18 | 宝马股份公司 | Passing Assistance device |
DE102014202752A1 (en) * | 2014-02-14 | 2015-09-03 | Volkswagen Aktiengesellschaft | Detection of dynamic objects by means of ultrasound |
DE102014008353A1 (en) * | 2014-06-04 | 2015-12-17 | Audi Ag | Method for operating a driver assistance system for the automated guidance of a motor vehicle and associated motor vehicle |
CN105408181A (en) * | 2013-07-19 | 2016-03-16 | 奥迪股份公司 | Method for operating driver assistance system of motor vehicle and driver assistance system for motor vehicle |
CN106585641A (en) * | 2017-01-11 | 2017-04-26 | 张军 | Intelligent driving early warning system structure based on multi-core processor |
CN106780842A (en) * | 2017-01-11 | 2017-05-31 | 张军 | A kind of intelligent travelling crane early warning system prototype based on multinuclear heterogeneous processor |
CN106740874A (en) * | 2017-02-17 | 2017-05-31 | 张军 | A kind of intelligent travelling crane early warning sensory perceptual system based on polycaryon processor |
CN106740857A (en) * | 2017-01-09 | 2017-05-31 | 张军 | A kind of intelligent travelling crane early warning system prototype based on polycaryon processor |
JP2017523938A (en) * | 2014-08-06 | 2017-08-24 | ルノー エス.ア.エス. | Driving assistance system and method implemented in such a system |
CN107646114A (en) * | 2015-05-22 | 2018-01-30 | 大陆-特韦斯贸易合伙股份公司及两合公司 | Method for estimating lane |
EP2549456A4 (en) * | 2010-03-16 | 2018-05-23 | Toyota Jidosha Kabushiki Kaisha | Driving assistance device |
WO2018150580A1 (en) * | 2017-02-20 | 2018-08-23 | 三菱電機株式会社 | Traveling plan correction device and traveling plan correction method |
US10095238B2 (en) * | 2016-12-14 | 2018-10-09 | Ford Global Technologies, Llc | Autonomous vehicle object detection |
WO2018197083A1 (en) * | 2017-04-26 | 2018-11-01 | Bayerische Motoren Werke Aktiengesellschaft | Method, computer program product, computer-readable medium, control unit, and vehicle comprising the control unit for determining a collective maneuver of at least two vehicles |
US10139244B2 (en) * | 2016-08-17 | 2018-11-27 | Veoneer Us Inc. | ADAS horizon and vision supplemental V2X |
US20180348770A1 (en) * | 2017-06-02 | 2018-12-06 | Honda Motor Co.,Ltd. | Running track determining device and automatic driving apparatus |
EP3385930A4 (en) * | 2015-11-30 | 2019-01-02 | Nissan Motor Co., Ltd. | Method and device for generating forecast vehicular information used for traveling on vehicle road network |
US20190084417A1 (en) * | 2016-02-25 | 2019-03-21 | Continental Teves Ag & Co. Ohg | Method for a speed controller |
US10279807B2 (en) * | 2017-02-17 | 2019-05-07 | GM Global Technology Operations LLC | System and method for predicting a possible lane departure when driving a vehicle autonomously or semi-autonomously, and for taking a remedial action to prevent a lane departure |
US10293826B2 (en) * | 2013-12-04 | 2019-05-21 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating a vehicle among encroaching vehicles |
US10446034B2 (en) * | 2015-07-17 | 2019-10-15 | Denso Corporation | Driving support system |
WO2020025445A1 (en) * | 2018-08-02 | 2020-02-06 | Robert Bosch Gmbh | Method for guiding a motor vehicle in an at least partly automated manner |
US20200047749A1 (en) * | 2018-08-10 | 2020-02-13 | Toyota Jidosha Kabushiki Kaisha | Vehicle driving assist apparatus |
US10611370B2 (en) * | 2017-02-09 | 2020-04-07 | Panasonic Intellectual Property Corporation Of America | Information processing apparatus, information processing method, and non-transitory recording medium |
WO2020069812A1 (en) * | 2018-10-01 | 2020-04-09 | Robert Bosch Gmbh | Method for guiding a motor vehicle on a roadway in an at least partly automated manner |
CN111033510A (en) * | 2017-09-26 | 2020-04-17 | 奥迪股份公司 | Method and device for operating a driver assistance system, driver assistance system and motor vehicle |
JP2020513624A (en) * | 2016-12-06 | 2020-05-14 | ニッサン ノース アメリカ,インク | Advanced threat alerts for autonomous vehicles |
DE102019103106A1 (en) * | 2019-02-08 | 2020-08-13 | Zf Automotive Germany Gmbh | Control system and control method for the interaction-based long-term determination of trajectories for motor vehicles |
US10759427B2 (en) * | 2017-09-28 | 2020-09-01 | Toyota Jidosha Kabushiki Kaisha | Driving support apparatus |
US10836383B2 (en) * | 2018-05-04 | 2020-11-17 | The Regents Of The University Of Michigan | Collision imminent steering control systems and methods |
US20200385017A1 (en) * | 2019-05-16 | 2020-12-10 | Honda Motor Co., Ltd. | Vehicle control device and vehicle control method |
FR3097338A1 (en) * | 2019-06-14 | 2020-12-18 | Renault S.A.S | System and method for predicting the trajectory of an ego vehicle as a function of the environment of said ego vehicle |
WO2020262070A1 (en) * | 2019-06-25 | 2020-12-30 | 株式会社デンソー | Tracking device |
CN112567259A (en) * | 2018-08-16 | 2021-03-26 | 标致雪铁龙汽车股份有限公司 | Method for determining a confidence index associated with an object detected by a sensor in the environment of a motor vehicle |
CN112660156A (en) * | 2019-10-15 | 2021-04-16 | 丰田自动车株式会社 | Vehicle control system |
US20210114532A1 (en) * | 2018-03-28 | 2021-04-22 | Kyocera Corporation | Image processing apparatus, imaging apparatus, and moveable body |
US11009589B2 (en) * | 2017-08-24 | 2021-05-18 | Subaru Corporation | Vehicle exterior environment recognition apparatus |
CN113015664A (en) * | 2018-11-13 | 2021-06-22 | 祖克斯有限公司 | Perception anticollision |
US11292460B2 (en) * | 2017-01-19 | 2022-04-05 | Honda Motor Co., Ltd. | Vehicle control system, vehicle control method, and vehicle control program |
US20220319187A1 (en) * | 2019-06-25 | 2022-10-06 | Kyocera Corporation | Image processing apparatus, imaging apparatus, movable object, and method for image processing |
US20220324440A1 (en) * | 2019-11-11 | 2022-10-13 | Robert Bosch Gmbh | Method for operating an autonomous driving function of a vehicle |
US20230037142A1 (en) * | 2021-07-28 | 2023-02-02 | Argo AI, LLC | Method and system for developing autonomous vehicle training simulations |
US11731620B2 (en) | 2018-12-12 | 2023-08-22 | Zoox, Inc. | Collision avoidance system with trajectory validation |
WO2023198782A1 (en) * | 2022-04-13 | 2023-10-19 | Creation Labs Ai Limited | Driving assistance systems |
US12090997B1 (en) * | 2014-10-02 | 2024-09-17 | Waymo Llc | Predicting trajectories of objects based on contextual information |
Families Citing this family (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4730378B2 (en) * | 2007-12-25 | 2011-07-20 | トヨタ自動車株式会社 | Course evaluation device and course evaluation method |
US8605947B2 (en) * | 2008-04-24 | 2013-12-10 | GM Global Technology Operations LLC | Method for detecting a clear path of travel for a vehicle enhanced by object detection |
EP2159595B1 (en) * | 2008-08-28 | 2013-03-20 | Saab Ab | A target tracking system and a method for tracking a target |
GB0901906D0 (en) * | 2009-02-05 | 2009-03-11 | Trw Ltd | Collision warning apparatus |
JP4877360B2 (en) * | 2009-06-12 | 2012-02-15 | トヨタ自動車株式会社 | Course evaluation device and course evaluation method |
JP5505427B2 (en) * | 2010-01-12 | 2014-05-28 | トヨタ自動車株式会社 | Collision position prediction device |
KR20110139898A (en) * | 2010-06-24 | 2011-12-30 | 주식회사 만도 | Lane keeping control method and device and lane departure warning device |
US8527172B2 (en) * | 2010-10-20 | 2013-09-03 | GM Global Technology Operations LLC | Vehicle collision avoidance and warning system |
US9495874B1 (en) | 2012-04-13 | 2016-11-15 | Google Inc. | Automated system and method for modeling the behavior of vehicles and other agents |
KR101439017B1 (en) * | 2013-04-11 | 2014-10-30 | 현대자동차주식회사 | System for controlling change of lane |
US9789952B2 (en) | 2013-06-19 | 2017-10-17 | The Boeing Company | Methods and apparatus of notification of a flight asymmetry influencing an aircraft |
EP2990290B1 (en) * | 2014-09-01 | 2019-11-06 | Honda Research Institute Europe GmbH | Method and system for post-collision manoeuvre planning and vehicle equipped with such system |
KR101628503B1 (en) * | 2014-10-27 | 2016-06-08 | 현대자동차주식회사 | Driver assistance apparatus and method for operating thereof |
US9764736B2 (en) * | 2015-08-14 | 2017-09-19 | Toyota Motor Engineering & Manufacturing North America, Inc. | Autonomous vehicle operation relative to unexpected dynamic objects |
US9707961B1 (en) | 2016-01-29 | 2017-07-18 | Ford Global Technologies, Llc | Tracking objects within a dynamic environment for improved localization |
US9701307B1 (en) | 2016-04-11 | 2017-07-11 | David E. Newman | Systems and methods for hazard mitigation |
US10543852B2 (en) * | 2016-08-20 | 2020-01-28 | Toyota Motor Engineering & Manufacturing North America, Inc. | Environmental driver comfort feedback for autonomous vehicle |
US10515390B2 (en) * | 2016-11-21 | 2019-12-24 | Nio Usa, Inc. | Method and system for data optimization |
US10254758B2 (en) | 2017-01-18 | 2019-04-09 | Ford Global Technologies, Llc | Object tracking by unsupervised learning |
US10449956B2 (en) | 2017-01-18 | 2019-10-22 | Ford Global Technologies, Llc | Object tracking by unsupervised learning |
US10671076B1 (en) | 2017-03-01 | 2020-06-02 | Zoox, Inc. | Trajectory prediction of third-party objects using temporal logic and tree search |
US10133275B1 (en) | 2017-03-01 | 2018-11-20 | Zoox, Inc. | Trajectory generation using temporal logic and tree search |
US10953881B2 (en) | 2017-09-07 | 2021-03-23 | Tusimple, Inc. | System and method for automated lane change control for autonomous vehicles |
US10782693B2 (en) | 2017-09-07 | 2020-09-22 | Tusimple, Inc. | Prediction-based system and method for trajectory planning of autonomous vehicles |
US10649458B2 (en) | 2017-09-07 | 2020-05-12 | Tusimple, Inc. | Data-driven prediction-based system and method for trajectory planning of autonomous vehicles |
US10953880B2 (en) | 2017-09-07 | 2021-03-23 | Tusimple, Inc. | System and method for automated lane change control for autonomous vehicles |
US10955851B2 (en) | 2018-02-14 | 2021-03-23 | Zoox, Inc. | Detecting blocking objects |
US10414395B1 (en) | 2018-04-06 | 2019-09-17 | Zoox, Inc. | Feature-based prediction |
US11126873B2 (en) | 2018-05-17 | 2021-09-21 | Zoox, Inc. | Vehicle lighting state determination |
US10816635B1 (en) | 2018-12-20 | 2020-10-27 | Autonomous Roadway Intelligence, Llc | Autonomous vehicle localization system |
US10820349B2 (en) | 2018-12-20 | 2020-10-27 | Autonomous Roadway Intelligence, Llc | Wireless message collision avoidance with high throughput |
RU2750152C1 (en) | 2019-04-25 | 2021-06-22 | Общество с ограниченной ответственностью "Яндекс Беспилотные Технологии" | Methods and systems for determining vehicle trajectory estimation procedure |
CN110239532B (en) * | 2019-05-20 | 2020-12-01 | 浙江吉利控股集团有限公司 | Vehicle lane change assisting method, device, terminal and storage medium |
US10820182B1 (en) | 2019-06-13 | 2020-10-27 | David E. Newman | Wireless protocols for emergency message transmission |
US10713950B1 (en) | 2019-06-13 | 2020-07-14 | Autonomous Roadway Intelligence, Llc | Rapid wireless communication for vehicle collision mitigation |
US10939471B2 (en) | 2019-06-13 | 2021-03-02 | David E. Newman | Managed transmission of wireless DAT messages |
US11835958B2 (en) | 2020-07-28 | 2023-12-05 | Huawei Technologies Co., Ltd. | Predictive motion planning system and method |
US11206092B1 (en) | 2020-11-13 | 2021-12-21 | Ultralogic 5G, Llc | Artificial intelligence for predicting 5G network performance |
US20220183068A1 (en) | 2020-12-04 | 2022-06-09 | David E. Newman | Rapid Uplink Access by Parallel Signaling on a 5G Random-Access Channel |
US20240140414A1 (en) * | 2022-11-02 | 2024-05-02 | Canoo Technologies Inc. | System and method for target behavior prediction in advanced driving assist system (adas), autonomous driving (ad), or other applications |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020087241A1 (en) * | 2000-12-05 | 2002-07-04 | Toyoda Koki Kabushiki Kaisha | System of informing procedures for adjusting control parameters of an electric power steering control apparatus |
US20030204299A1 (en) * | 2002-04-30 | 2003-10-30 | Ford Global Technologies, Inc. | Ramp identification in adaptive cruise control |
US20050225477A1 (en) * | 2002-07-15 | 2005-10-13 | Shan Cong | Road curvature estimation system |
US20060149462A1 (en) * | 2004-09-17 | 2006-07-06 | Honda Motor Co., Ltd. | Vehicular control object determination system and vehicular travel locus estimation system |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6269307B1 (en) * | 1998-08-06 | 2001-07-31 | Honda Giken Kogyo Kabushiki Kaisha | Travel safety system for vehicle |
JP2003536096A (en) | 2000-06-08 | 2003-12-02 | オートモーティブ システムズ ラボラトリー インコーポレーテッド | Tracking map generator |
EP1297445A4 (en) | 2000-06-09 | 2005-11-23 | Automotive Systems Lab | Situation awareness processor |
US7879580B2 (en) * | 2002-12-10 | 2011-02-01 | Massachusetts Institute Of Technology | Methods for high fidelity production of long nucleic acid molecules |
US7161472B2 (en) * | 2003-06-06 | 2007-01-09 | Ford Global Technologies, Llc | Blind-spot warning system for an automotive vehicle |
US7200478B2 (en) * | 2003-10-31 | 2007-04-03 | Nissan Motor Co., Ltd. | Lane departure prevention apparatus |
- 2007
  - 2007-09-07 US US11/851,642 patent/US20080065328A1/en not_active Abandoned
  - 2007-09-10 DE DE602007008801T patent/DE602007008801D1/en active Active
- 2010
  - 2010-11-29 US US12/955,041 patent/US8112225B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020087241A1 (en) * | 2000-12-05 | 2002-07-04 | Toyoda Koki Kabushiki Kaisha | System of informing procedures for adjusting control parameters of an electric power steering control apparatus |
US20030204299A1 (en) * | 2002-04-30 | 2003-10-30 | Ford Global Technologies, Inc. | Ramp identification in adaptive cruise control |
US20050225477A1 (en) * | 2002-07-15 | 2005-10-13 | Shan Cong | Road curvature estimation system |
US20060149462A1 (en) * | 2004-09-17 | 2006-07-06 | Honda Motor Co., Ltd. | Vehicular control object determination system and vehicular travel locus estimation system |
Cited By (145)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130238211A1 (en) * | 2006-05-23 | 2013-09-12 | Nissan Motor Co., Ltd. | Vehicle driving assist system |
US8670915B2 (en) * | 2006-05-23 | 2014-03-11 | Nissan Motor Co., Ltd. | Vehicle driving assist system |
US8442739B2 (en) * | 2006-05-23 | 2013-05-14 | Nissan Motor Co., Ltd. | Vehicle driving assist system |
US20070276577A1 (en) * | 2006-05-23 | 2007-11-29 | Nissan Motor Co., Ltd. | Vehicle driving assist system |
US9224299B2 (en) | 2006-11-01 | 2015-12-29 | Toyota Jidosha Kabushiki Kaisha | Cruise control plan evaluation device and method |
US20100010699A1 (en) * | 2006-11-01 | 2010-01-14 | Koji Taguchi | Cruise control plan evaluation device and method |
US20100030472A1 (en) * | 2007-03-29 | 2010-02-04 | Toyota Jidosha Kabushiki Kaisha | Collision possibility acquiring device, and collision possibility acquiring method |
US8515659B2 (en) * | 2007-03-29 | 2013-08-20 | Toyota Jidosha Kabushiki Kaisha | Collision possibility acquiring device, and collision possibility acquiring method |
EP2172919A1 (en) * | 2007-06-20 | 2010-04-07 | Toyota Jidosha Kabushiki Kaisha | Vehicle travel track estimator |
US8781720B2 (en) | 2007-06-20 | 2014-07-15 | Toyota Jidosha Kabushiki Kaisha | Vehicle travel track estimator |
US20100106418A1 (en) * | 2007-06-20 | 2010-04-29 | Toyota Jidosha Kabushiki Kaisha | Vehicle travel track estimator |
EP2172919A4 (en) * | 2007-06-20 | 2011-05-11 | Toyota Motor Co Ltd | DEVICE FOR ESTIMATING VEHICLE DISPLACEMENT TRAJECTORY |
US8504283B2 (en) | 2007-07-12 | 2013-08-06 | Toyota Jidosha Kabushiki Kaisha | Host-vehicle risk acquisition device and method |
US9020749B2 (en) | 2007-07-12 | 2015-04-28 | Toyota Jidosha Kabushiki Kaisha | Host-vehicle risk acquisition device and method |
US20100121576A1 (en) * | 2007-07-12 | 2010-05-13 | Toyota Jidosha Kabushiki Kaisha | Host-vehicle risk acquisition |
US7714769B2 (en) * | 2007-11-16 | 2010-05-11 | Robert Bosch Gmbh | Method for estimating the width of radar objects |
US20090135048A1 (en) * | 2007-11-16 | 2009-05-28 | Ruediger Jordan | Method for estimating the width of radar objects |
WO2010005689A3 (en) * | 2008-06-16 | 2010-03-18 | Gm Global Technology Operations, Inc. | Vehicle control using stochastic information |
WO2010005689A2 (en) * | 2008-06-16 | 2010-01-14 | Gm Global Technology Operations, Inc. | Vehicle control using stochastic information |
US20090312889A1 (en) * | 2008-06-16 | 2009-12-17 | Gm Global Technology Operations, Inc. | Vehicle control using stochastic information |
US8290637B2 (en) | 2008-06-16 | 2012-10-16 | GM Global Technology Operations LLC | Vehicle control using stochastic information |
US20100030474A1 (en) * | 2008-07-30 | 2010-02-04 | Fuji Jukogyo Kabushiki Kaisha | Driving support apparatus for vehicle |
US8903640B2 (en) | 2008-10-22 | 2014-12-02 | Toyota Motor Engineering & Manufacturing North America, Inc. | Communication based vehicle-pedestrian collision warning system |
US20100100324A1 (en) * | 2008-10-22 | 2010-04-22 | Toyota Motor Engineering & Manufacturing North America, Inc. | Communication based vehicle-pedestrian collision warning system |
US8412449B2 (en) * | 2008-10-24 | 2013-04-02 | Gray & Company, Inc. | Control and systems for autonomously driven vehicles |
US20100235099A1 (en) * | 2009-02-27 | 2010-09-16 | Toyota Jidosha Kabushiki Kaisha | Driving assisting apparatus |
US8838371B2 (en) | 2009-02-27 | 2014-09-16 | Toyota Jidosha Kabushiki Kaisha | Driving assisting apparatus |
US20100228419A1 (en) * | 2009-03-09 | 2010-09-09 | Gm Global Technology Operations, Inc. | method to assess risk associated with operating an autonomic vehicle control system |
US8244408B2 (en) * | 2009-03-09 | 2012-08-14 | GM Global Technology Operations LLC | Method to assess risk associated with operating an autonomic vehicle control system |
DE102009031805A1 (en) | 2009-05-20 | 2010-03-04 | Daimler Ag | Method for detecting objects in surrounding of vehicle, involves periodically determining yaw angle of vehicle, periodically determining change of yaw angle, and determining and correcting angle changes of object angle |
US8571786B2 (en) * | 2009-06-02 | 2013-10-29 | Toyota Jidosha Kabushiki Kaisha | Vehicular peripheral surveillance device |
US20120078498A1 (en) * | 2009-06-02 | 2012-03-29 | Masahiro Iwasaki | Vehicular peripheral surveillance device |
US9626868B2 (en) * | 2009-07-10 | 2017-04-18 | Toyota Jidosha Kabushiki Kaisha | Object detection device |
US20110010046A1 (en) * | 2009-07-10 | 2011-01-13 | Toyota Jidosha Kabushiki Kaisha | Object detection device |
US20120323477A1 (en) * | 2009-12-01 | 2012-12-20 | Folko Flehmig | Anticipatory Control of the Transverse Vehicle Dynamics in Evasive Maneuvers |
US8781722B2 (en) * | 2009-12-01 | 2014-07-15 | Robert Bosch Gmbh | Anticipatory control of the transverse vehicle dynamics in evasive maneuvers |
US20120265431A1 (en) * | 2009-12-24 | 2012-10-18 | Nissan Motor Co., Ltd. | Driving control device |
CN102686468A (en) * | 2009-12-24 | 2012-09-19 | 日产自动车株式会社 | Driving control device |
US8700305B2 (en) * | 2009-12-24 | 2014-04-15 | Nissan Motor Co., Ltd. | Driving control device |
US9963127B2 (en) | 2010-01-15 | 2018-05-08 | Volvo Car Corporation | Collision mitigation system and method for braking a vehicle |
US20110178710A1 (en) * | 2010-01-15 | 2011-07-21 | Ford Global Technologies, Llc | Collision mitigation system and method for braking a vehicle |
EP2549456A4 (en) * | 2010-03-16 | 2018-05-23 | Toyota Jidosha Kabushiki Kaisha | Driving assistance device |
US20110250836A1 (en) * | 2010-04-09 | 2011-10-13 | Telcordia Technologies, Inc. | Interference-adaptive uwb radio-based vehicle communication system for active-safety |
DE102010033544A1 (en) * | 2010-08-05 | 2012-02-09 | Valeo Schalter Und Sensoren Gmbh | Method for operating motor car, involves detecting movement of object in vicinity of vehicle, obtaining variables during detection of movement of object, and performing reaction of component to obtained specific variables |
EP2423902A1 (en) * | 2010-08-27 | 2012-02-29 | Scania CV AB (publ) | Safety system and method |
US10198940B2 (en) | 2010-10-08 | 2019-02-05 | Here Global B.V. | Method and system for using intersecting electronic horizons |
US20140222322A1 (en) * | 2010-10-08 | 2014-08-07 | Navteq B.V. | Method and System for Using Intersecting Electronic Horizons |
US9799216B2 (en) | 2010-10-08 | 2017-10-24 | Here Global B.V. | Method and system for using intersecting electronic horizons |
US9330564B2 (en) * | 2010-10-08 | 2016-05-03 | Here Global B.V. | Method and system for using intersecting electronic horizons |
US10783775B2 (en) | 2010-10-08 | 2020-09-22 | Here Global B.V. | Method and system for using intersecting electronic horizons |
US20130211687A1 (en) * | 2010-10-23 | 2013-08-15 | Daimier Ag | Method for Operating a Brake Assist Device and Brake Assist Device for a Vehicle |
US9079571B2 (en) * | 2010-10-23 | 2015-07-14 | Daimler Ag | Method for operating a brake assist device and brake assist device for a vehicle |
WO2012113366A1 (en) * | 2011-02-23 | 2012-08-30 | S.M.S. Smart Microwave Sensors Gmbh | Method and radar sensor arrangement for detecting the location and speed of objects relative to a measurement location, particularly a vehicle |
US8666599B2 (en) * | 2011-04-14 | 2014-03-04 | Honda Elesys Co., Ltd. | Driving support system |
US20120296522A1 (en) * | 2011-04-14 | 2012-11-22 | Honda Elesys Co., Ltd. | Driving support system |
EP2567875A3 (en) * | 2011-09-08 | 2014-03-26 | Bayerische Motoren Werke Aktiengesellschaft | Method for controlling energy conversion processes in a vehicle |
US20130124052A1 (en) * | 2011-11-10 | 2013-05-16 | GM Global Technology Operations LLC | Method for operating a motor vehicle safety system and a safety system for a motor vehicle |
US20130158863A1 (en) * | 2011-12-20 | 2013-06-20 | Continental Automotive Systems, Inc. | Trailer backing path prediction using gps and camera images |
US9633566B2 (en) * | 2011-12-20 | 2017-04-25 | Continental Automotive Systems, Inc. | Trailer backing path prediction using GPS and camera images |
EP2654028A1 (en) * | 2012-04-20 | 2013-10-23 | Honda Research Institute Europe GmbH | Orientation sensitive traffic collision warning system |
US9524643B2 (en) | 2012-04-20 | 2016-12-20 | Honda Research Institute Europe Gmbh | Orientation sensitive traffic collision warning system |
US9669828B2 (en) * | 2012-06-01 | 2017-06-06 | Toyota Motor Engineering & Manufacturing North America, Inc. | Cooperative driving and collision avoidance by distributed receding horizon control |
US10679501B2 (en) | 2012-06-01 | 2020-06-09 | Toyota Motor Engineering & Manufacturing North America, Inc. | Cooperative driving and collision avoidance by distributed receding horizon control |
US20130325306A1 (en) * | 2012-06-01 | 2013-12-05 | Toyota Motor Eng. & Mftg. N. America, Inc. (TEMA) | Cooperative driving and collision avoidance by distributed receding horizon control |
US9296383B2 (en) * | 2013-06-20 | 2016-03-29 | Robert Bosch Gmbh | Collision avoidance for a motor vehicle |
US20140379167A1 (en) * | 2013-06-20 | 2014-12-25 | Robert Bosch Gmbh | Collision avoidance for a motor vehicle |
CN105408181A (en) * | 2013-07-19 | 2016-03-16 | 奥迪股份公司 | Method for operating driver assistance system of motor vehicle and driver assistance system for motor vehicle |
US20160167661A1 (en) * | 2013-07-19 | 2016-06-16 | Audi Ag | Method for operating a driver assistance system of a motor vehicle and driver assistance system for a motor vehicle |
CN104424819A (en) * | 2013-09-02 | 2015-03-18 | 宝马股份公司 | Passing Assistance device |
EP2845779A1 (en) * | 2013-09-09 | 2015-03-11 | Honda Research Institute Europe GmbH | Driving assistance technique for active vehicle control |
US10625776B2 (en) | 2013-09-09 | 2020-04-21 | Honda Research Institute Europe Gmbh | Driving assistance technique for active vehicle control |
US11697417B2 (en) | 2013-12-04 | 2023-07-11 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating a vehicle among encroaching vehicles |
US10953884B2 (en) | 2013-12-04 | 2021-03-23 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating a vehicle among encroaching vehicles |
US11511750B2 (en) | 2013-12-04 | 2022-11-29 | Mobileye Vision Technologies Ltd. | Image-based velocity control for a turning vehicle |
US11529957B2 (en) | 2013-12-04 | 2022-12-20 | Mobileye Vision Technologies Ltd. | Systems and methods for vehicle offset navigation |
US11667292B2 (en) | 2013-12-04 | 2023-06-06 | Mobileye Vision Technologies Ltd. | Systems and methods for vehicle braking |
US11708077B2 (en) | 2013-12-04 | 2023-07-25 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating a vehicle among encroaching vehicles |
US11713042B2 (en) | 2013-12-04 | 2023-08-01 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating a vehicle among encroaching vehicles |
US12221110B2 (en) | 2013-12-04 | 2025-02-11 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating a vehicle among encroaching vehicles |
US10293826B2 (en) * | 2013-12-04 | 2019-05-21 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating a vehicle among encroaching vehicles |
DE102014202752B4 (en) * | 2014-02-14 | 2017-06-14 | Volkswagen Aktiengesellschaft | Detection of dynamic objects by means of ultrasound |
DE102014202752A1 (en) * | 2014-02-14 | 2015-09-03 | Volkswagen Aktiengesellschaft | Detection of dynamic objects by means of ultrasound |
DE102014008353B4 (en) * | 2014-06-04 | 2016-09-15 | Audi Ag | Method for operating a driver assistance system for the automated guidance of a motor vehicle and associated motor vehicle |
DE102014008353A1 (en) * | 2014-06-04 | 2015-12-17 | Audi Ag | Method for operating a driver assistance system for the automated guidance of a motor vehicle and associated motor vehicle |
US10346690B2 (en) | 2014-08-06 | 2019-07-09 | Renault S.A.S. | Driving assistance systems and method implemented in such a system |
JP2017523938A (en) * | 2014-08-06 | 2017-08-24 | ルノー エス.ア.エス. | Driving assistance system and method implemented in such a system |
US12090997B1 (en) * | 2014-10-02 | 2024-09-17 | Waymo Llc | Predicting trajectories of objects based on contextual information |
CN107646114A (en) * | 2015-05-22 | 2018-01-30 | 大陆-特韦斯贸易合伙股份公司及两合公司 | Method for estimating lane |
US10650253B2 (en) * | 2015-05-22 | 2020-05-12 | Continental Teves Ag & Co. Ohg | Method for estimating traffic lanes |
US20180173970A1 (en) * | 2015-05-22 | 2018-06-21 | Continental Teves Ag & Co. Ohg | Method for estimating traffic lanes |
US10446034B2 (en) * | 2015-07-17 | 2019-10-15 | Denso Corporation | Driving support system |
EP3385930A4 (en) * | 2015-11-30 | 2019-01-02 | Nissan Motor Co., Ltd. | Method and device for generating forecast vehicular information used for traveling on vehicle road network |
US11034240B2 (en) * | 2016-02-25 | 2021-06-15 | Continental Teves Ag & Co. Ohg | Method for a speed controller |
US20190084417A1 (en) * | 2016-02-25 | 2019-03-21 | Continental Teves Ag & Co. Ohg | Method for a speed controller |
US11156474B2 (en) * | 2016-08-17 | 2021-10-26 | Veoneer Us Inc. | ADAS horizon and vision supplemental V2X |
US10139244B2 (en) * | 2016-08-17 | 2018-11-27 | Veoneer Us Inc. | ADAS horizon and vision supplemental V2X |
JP2020513624A (en) * | 2016-12-06 | 2020-05-14 | ニッサン ノース アメリカ,インク | Advanced threat alerts for autonomous vehicles |
US10095238B2 (en) * | 2016-12-14 | 2018-10-09 | Ford Global Technologies, Llc | Autonomous vehicle object detection |
CN106740857A (en) * | 2017-01-09 | 2017-05-31 | 张军 | A kind of intelligent travelling crane early warning system prototype based on polycaryon processor |
CN106780842A (en) * | 2017-01-11 | 2017-05-31 | 张军 | A kind of intelligent travelling crane early warning system prototype based on multinuclear heterogeneous processor |
CN106585641A (en) * | 2017-01-11 | 2017-04-26 | 张军 | Intelligent driving early warning system structure based on multi-core processor |
US11292460B2 (en) * | 2017-01-19 | 2022-04-05 | Honda Motor Co., Ltd. | Vehicle control system, vehicle control method, and vehicle control program |
US10611370B2 (en) * | 2017-02-09 | 2020-04-07 | Panasonic Intellectual Property Corporation Of America | Information processing apparatus, information processing method, and non-transitory recording medium |
CN106740874A (en) * | 2017-02-17 | 2017-05-31 | 张军 | A kind of intelligent travelling crane early warning sensory perceptual system based on polycaryon processor |
US10279807B2 (en) * | 2017-02-17 | 2019-05-07 | GM Global Technology Operations LLC | System and method for predicting a possible lane departure when driving a vehicle autonomously or semi-autonomously, and for taking a remedial action to prevent a lane departure |
JPWO2018150580A1 (en) * | 2017-02-20 | 2019-06-27 | 三菱電機株式会社 | Travel plan correction device and travel plan correction method |
WO2018150580A1 (en) * | 2017-02-20 | 2018-08-23 | 三菱電機株式会社 | Traveling plan correction device and traveling plan correction method |
US11175662B2 (en) | 2017-02-20 | 2021-11-16 | Mitsubishi Electric Corporation | Travel plan correction device and travel plan correction method |
WO2018197083A1 (en) * | 2017-04-26 | 2018-11-01 | Bayerische Motoren Werke Aktiengesellschaft | Method, computer program product, computer-readable medium, control unit, and vehicle comprising the control unit for determining a collective maneuver of at least two vehicles |
US20200050214A1 (en) * | 2017-04-26 | 2020-02-13 | Bayerische Motoren Werke Aktiengesellschaft | Method, Computer Program Product, Computer-Readable Medium, Control Unit, and Vehicle Comprising the Control Unit for Determining a Collective Maneuver of at Least Two Vehicles |
CN110268457A (en) * | 2017-04-26 | 2019-09-20 | 宝马股份公司 | For determining medium, controller that the actuated method of collection, computer program product, the computer capacity of at least two vehicles read and including the vehicle of the controller |
US10775798B2 (en) * | 2017-06-02 | 2020-09-15 | Honda Motor Co., Ltd. | Running track determining device and automatic driving apparatus |
US20180348770A1 (en) * | 2017-06-02 | 2018-12-06 | Honda Motor Co.,Ltd. | Running track determining device and automatic driving apparatus |
US11009589B2 (en) * | 2017-08-24 | 2021-05-18 | Subaru Corporation | Vehicle exterior environment recognition apparatus |
CN111033510A (en) * | 2017-09-26 | 2020-04-17 | 奥迪股份公司 | Method and device for operating a driver assistance system, driver assistance system and motor vehicle |
US11273832B2 (en) | 2017-09-28 | 2022-03-15 | Toyota Jidosha Kabushiki Kaisha | Driving support apparatus |
US10759427B2 (en) * | 2017-09-28 | 2020-09-01 | Toyota Jidosha Kabushiki Kaisha | Driving support apparatus |
US20210114532A1 (en) * | 2018-03-28 | 2021-04-22 | Kyocera Corporation | Image processing apparatus, imaging apparatus, and moveable body |
US10836383B2 (en) * | 2018-05-04 | 2020-11-17 | The Regents Of The University Of Michigan | Collision imminent steering control systems and methods |
WO2020025445A1 (en) * | 2018-08-02 | 2020-02-06 | Robert Bosch Gmbh | Method for guiding a motor vehicle in an at least partly automated manner |
CN112533811A (en) * | 2018-08-02 | 2021-03-19 | 罗伯特·博世有限公司 | Method for at least partially automatically guiding a motor vehicle |
US12005893B2 (en) | 2018-08-02 | 2024-06-11 | Robert Bosch Gmbh | Method for guiding a motor vehicle in an at least partly automated manner |
US11541883B2 (en) * | 2018-08-10 | 2023-01-03 | Toyota Jidosha Kabushiki Kaisha | Vehicle driving assist apparatus |
US20200047749A1 (en) * | 2018-08-10 | 2020-02-13 | Toyota Jidosha Kabushiki Kaisha | Vehicle driving assist apparatus |
CN112567259A (en) * | 2018-08-16 | 2021-03-26 | 标致雪铁龙汽车股份有限公司 | Method for determining a confidence index associated with an object detected by a sensor in the environment of a motor vehicle |
WO2020069812A1 (en) * | 2018-10-01 | 2020-04-09 | Robert Bosch Gmbh | Method for guiding a motor vehicle on a roadway in an at least partly automated manner |
EP3880530A4 (en) * | 2018-11-13 | 2022-08-03 | Zoox, Inc. | Perception collision avoidance |
CN113015664A (en) * | 2018-11-13 | 2021-06-22 | 祖克斯有限公司 | Perception anticollision |
US11731620B2 (en) | 2018-12-12 | 2023-08-22 | Zoox, Inc. | Collision avoidance system with trajectory validation |
US11462099B2 (en) * | 2019-02-08 | 2022-10-04 | Zf Automotive Germany Gmbh | Control system and control method for interaction-based long-term determination of trajectories for motor vehicles |
DE102019103106A1 (en) * | 2019-02-08 | 2020-08-13 | Zf Automotive Germany Gmbh | Control system and control method for the interaction-based long-term determination of trajectories for motor vehicles |
US20200385017A1 (en) * | 2019-05-16 | 2020-12-10 | Honda Motor Co., Ltd. | Vehicle control device and vehicle control method |
FR3097338A1 (en) * | 2019-06-14 | 2020-12-18 | Renault S.A.S | System and method for predicting the trajectory of an ego vehicle as a function of the environment of said ego vehicle |
WO2020262070A1 (en) * | 2019-06-25 | 2020-12-30 | 株式会社デンソー | Tracking device |
US20220319187A1 (en) * | 2019-06-25 | 2022-10-06 | Kyocera Corporation | Image processing apparatus, imaging apparatus, movable object, and method for image processing |
JP7260416B2 (en) | 2019-06-25 | 2023-04-18 | 株式会社Soken | tracking device |
CN114026456A (en) * | 2019-06-25 | 2022-02-08 | 株式会社电装 | tracking device |
JP2021004737A (en) * | 2019-06-25 | 2021-01-14 | 株式会社Soken | Tracking device |
US12216194B2 (en) | 2019-06-25 | 2025-02-04 | Denso Corporation | Target tracking device |
CN112660156A (en) * | 2019-10-15 | 2021-04-16 | 丰田自动车株式会社 | Vehicle control system |
US20220324440A1 (en) * | 2019-11-11 | 2022-10-13 | Robert Bosch Gmbh | Method for operating an autonomous driving function of a vehicle |
US12128890B2 (en) * | 2019-11-11 | 2024-10-29 | Robert Bosch Gmbh | Method for operating an autonomous driving function of a vehicle |
US20230037142A1 (en) * | 2021-07-28 | 2023-02-02 | Argo AI, LLC | Method and system for developing autonomous vehicle training simulations |
US11960292B2 (en) * | 2021-07-28 | 2024-04-16 | Argo AI, LLC | Method and system for developing autonomous vehicle training simulations |
WO2023198782A1 (en) * | 2022-04-13 | 2023-10-19 | Creation Labs Ai Limited | Driving assistance systems |
Also Published As
Publication number | Publication date |
---|---|
US20110071731A1 (en) | 2011-03-24 |
US8112225B2 (en) | 2012-02-07 |
DE602007008801D1 (en) | 2010-10-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8112225B2 (en) | Method and system for collision avoidance | |
EP1898232B1 (en) | Method and system for collision avoidance | |
EP2990290B1 (en) | Method and system for post-collision manoeuvre planning and vehicle equipped with such system | |
EP3636507B1 (en) | Comfort responsibility sensitivity safety model | |
EP3474254B1 (en) | Surrounding environment recognition device | |
Jansson | Collision avoidance theory with application to automotive collision mitigation | |
EP3091370B1 (en) | Method and arrangement for determining safe vehicle trajectories | |
JP5864473B2 (en) | Automobile target state estimation system | |
CN109249930B (en) | Intelligent vehicle collision safety prediction method | |
EP2562060B1 (en) | A method and system for predicting movement behavior of a target traffic object | |
CN101327796B (en) | Method and apparatus for rear cross traffic collision avoidance | |
US20200238980A1 (en) | Vehicle control device | |
CN110481544A (en) | A kind of automotive correlation prevention method and anti-collision system for pedestrian | |
US20200353918A1 (en) | Vehicle control device | |
US20090192710A1 (en) | Method and system for collision course prediction and collision avoidance and mitigation | |
de Campos et al. | Collision avoidance at intersections: A probabilistic threat-assessment and decision-making system for safety interventions | |
CN109353337B (en) | Intelligent vehicle lane change stage collision probability safety prediction method | |
EP2208654B1 (en) | Method and system for avoiding host vehicle collisions with a target | |
TWI557006B (en) | Automated vehicle domain-wide risk analysis of regional planning algorithms and trajectory optimization avoidance system | |
EP3527450B1 (en) | Vehicle control apparatus | |
EP3456596A1 (en) | Method and device of predicting a possible collision | |
CN109353338B (en) | Intelligent vehicle overtaking lane collision probability safety prediction method | |
Kim et al. | Vehicle stability control of heading angle and lateral deviation to mitigate secondary collisions | |
Kawasaki et al. | Teammate advanced drive system using automated driving technology | |
Lefèvre et al. | Intention-aware risk estimation: Field results |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FORD GLOBAL TECHNOLOGIES, LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VOLVO CAR CORPORATION;REEL/FRAME:020095/0201 Effective date: 20071018 Owner name: VOLVO CAR CORPORATION, SWEDEN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:EIDEHALL, ANDREAS;POHL, JOCHEN;REEL/FRAME:020095/0175;SIGNING DATES FROM 20070921 TO 20071005 |
AS | Assignment |
Owner name: VOLVO CAR CORPORATION, SWEDEN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FORD GLOBAL TECHNOLOGIES, LLC;REEL/FRAME:024915/0795 Effective date: 20100826 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |