
WO2013039273A1 - Driving apparatus and method using 3d sensor - Google Patents


Info

Publication number
WO2013039273A1
Authority
WO
WIPO (PCT)
Application number
PCT/KR2011/006885
Other languages
French (fr)
Inventor
Andreas PARK
Jonghun Kim
Hyuksoo Son
Youngkyung Park
Chandra Shekhar DHIR
Junoh PARK
Jeihun Lee
Joongjae LEE
Original Assignee
LG Electronics Inc.
Application filed by LG Electronics Inc.
Priority to PCT/KR2011/006885
Publication of WO2013039273A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W 50/08: Interaction between the driver and the control system
    • B60W 50/14: Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W 30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W 30/10: Path keeping
    • B60W 30/12: Lane keeping
    • B60W 2420/00: Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W 2420/40: Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W 2420/408: Radar; laser, e.g. lidar
    • B60W 2420/90: Single sensor for two or more measurements
    • B60W 2420/905: Single sensor for two or more measurements, the sensor being an xyz axis sensor
    • B60W 2520/00: Input parameters relating to overall vehicle dynamics
    • B60W 2520/10: Longitudinal speed
    • B60W 2520/105: Longitudinal acceleration
    • B60W 2520/12: Lateral speed
    • B60W 2520/125: Lateral acceleration
    • B60W 2720/00: Output or target parameters relating to overall vehicle dynamics
    • B60W 2720/10: Longitudinal speed
    • B60W 2720/106: Longitudinal acceleration
    • B60W 2720/12: Lateral speed
    • B60W 2720/125: Lateral acceleration

Definitions

  • the present disclosure relates to a driving (traveling) apparatus and method using a three-dimensional (3D) sensor.
  • eco-driving refers to a new concept of driving behavior aimed at exhaust gas reduction, accident prevention, extension of vehicle lifespan and energy saving through an environmentally friendly, safe and economical way of driving.
  • vehicles having economical driving functions (e.g., a function of informing of and recording the travel state (traveling condition, driving state) while the vehicle is being driven) are being developed.
  • An eco-driving system is a driving assistance system for improving fuel efficiency by inducing drivers to have scientific driving habits, which are capable of enhancing fuel efficiencies, based upon automotive technologies.
  • in a typical eco-driving system, the engine revolutions per minute (RPM) and the driving velocity of the car are calculated; while the car is being driven economically, a green lamp is turned on, and upon reckless driving such as sudden acceleration or sudden deceleration, a red lamp is turned on, so as to make the driver practice eco-driving (a brief sketch of this indication logic follows below).
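  • The following is a minimal, illustrative sketch of the lamp-based indication described above; it is not the patented implementation, and the RPM, speed and acceleration thresholds are assumptions chosen only for illustration.

```python
# Minimal sketch of a lamp-based eco-driving indication.
# The thresholds below are hypothetical values, not values from the patent.

def eco_lamp(rpm: float, speed_kmh: float, accel_ms2: float) -> str:
    """Return 'green' for economical driving, 'red' for reckless driving."""
    SUDDEN_ACCEL = 3.0      # m/s^2, assumed threshold for sudden acceleration
    SUDDEN_DECEL = -3.0     # m/s^2, assumed threshold for sudden deceleration
    ECO_RPM_MAX = 2500      # assumed engine-RPM ceiling for economical driving

    if accel_ms2 > SUDDEN_ACCEL or accel_ms2 < SUDDEN_DECEL:
        return "red"        # sudden acceleration or deceleration
    if rpm > ECO_RPM_MAX and speed_kmh < 100:
        return "red"        # high RPM at moderate speed is uneconomical
    return "green"
```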
  • the eco-driving system induces the eco-driving by displaying instantaneous fuel efficiency, a continuity level, points and the like on a monitor within a car while the car is being driven.
  • the related art eco-driving system has remained at the level of inducing eco-driving and improving driving habits by means of voice information, lamp indication, and display of fuel efficiency on the in-car monitor.
  • studies are underway on systems for supporting eco-driving by monitoring, storing, managing and analyzing travel state information related to a vehicle through an external management server, and providing guidance information for inducing eco-driving to the vehicle in real time.
  • however, the existing sensor-based approach has the disadvantage of poor accuracy.
  • therefore, it is required to measure the velocities of and distances to adjacent vehicles more accurately than with the existing sensors, and to determine whether or not the vehicle being controlled is accelerating unnecessarily.
  • an aspect of the detailed description is to provide a driving apparatus capable of detecting (sensing) a distance from another apparatus located in front of the driving apparatus using a three-dimensional (3D) sensor, and measuring and reducing unnecessary acceleration of the driving apparatus, and a method thereof.
  • Another aspect of the detailed description is to provide a driving apparatus capable of detecting a lane of a road on which the driving apparatus is traveling using a 3D sensor, measuring an unnecessary horizontal (lateral) motion within the lane, and reducing the lateral motion, and a method thereof.
  • Another aspect of the detailed description is to provide a driving apparatus capable of reducing the unnecessary acceleration and lateral motion so as to guide a vehicle (car) to travel within a zone of economical fuel efficiency.
  • a driving apparatus including a three-dimensional (3D) sensor unit configured to obtain 3D data, a controller configured to collect travel state information from the obtained data and diagnose a travel state based upon the collected travel state information, and an output unit configured to display the diagnosis result.
  • the 3D sensor unit may include at least one of a stereo camera, a depth camera, a moving stereo camera and a Light Detection and Ranging (LIDAR) system.
  • the 3D data may be 3D distance data from another driving apparatus located at the front.
  • the travel state information may be acceleration or deceleration state information with respect to the 3D distance data.
  • the 3D sensor unit may capture a plurality of images, generate a stereo disparity image from the plurality of images, and obtain the 3D distance data from the disparity image.
  • the 3D sensor unit may include a laser generator configured to generate a laser pulse to transmit laser in a driving direction, and a laser sensor configured to detect a reflected wave returned after the laser is reflected by another driving apparatus located at the front, measure the return time from when the laser is transmitted until the reflected wave is returned, or the number of times that the reflected wave is returned, and obtain the 3D distance data from the measurement result.
  • the controller may compare the 3D distance data so as to determine an acceleration or deceleration state based upon the comparison result, and the travel state information may be the determined acceleration or deceleration state.
  • the controller may diagnose the travel state as needing to stop an acceleration driving when the 3D distance data is less than a predetermined safe distance.
  • the 3D data may be lane recognition information related to a travel route.
  • the 3D sensor unit may receive images for the travel route, extract feature points from the received images, detect a lane from the feature points and recognize a lane with respect to the travel route.
  • the controller may determine whether or not a lateral motion has been generated based upon the lane recognition information, the lateral motion indicating that the driving apparatus is driven close to one side of the recognized lane, and the travel state information may be the determined lateral motion state information.
  • the controller may determine whether or not the number of lateral motions being generated, included in the lateral motion state information, exceeds a predetermined threshold number of lateral motions, and diagnose the travel state as needing to stop a lateral motion when it is determined to exceed the threshold number of lateral motions.
  • the controller may determine whether or not a distance from a predetermined lane, included in the lateral motion state information, is below a threshold distance, and diagnose the travel state as needing to stop the lateral motion when it is determined to be below the threshold distance.
  • the travel state information may be collected every predetermined period of time.
  • the output unit may include at least one of a display unit, an audio output module, a lamp output module and a haptic module.
  • a driving method including obtaining three-dimensional (3D) data, collecting travel state information from the obtained data, diagnosing a travel state based upon the collected travel state information, and displaying the diagnosis result.
  • the 3D data may be obtained by at least one of a stereo camera, a depth camera, a moving stereo camera and a Light Detection and Ranging (LIDAR) system.
  • the 3D data may be 3D distance data from another driving apparatus located at the front and/or lane recognition information with respect to a travel route.
  • the obtaining of the 3D data may include capturing a plurality of images, generating a stereo disparity image from the plurality of images, and obtaining the 3D distance data from the disparity image.
  • the obtaining of the 3D data may include generating a laser pulse and transmitting laser in a driving direction, detecting a reflected wave returned after the laser is reflected by another driving apparatus located at the front, measuring the return time from when the laser is transmitted until the reflected wave is returned, or the number of times that the reflected wave is returned, and obtaining the 3D distance data from the measurement result.
  • the collecting of the travel state information may include comparing the 3D distance data, and determining an acceleration or deceleration state based upon the comparison result, wherein the travel state information may be the determined acceleration or deceleration state.
  • the diagnosing of the travel state may include determining whether or not the 3D distance data is below a predetermined safe distance when the state information indicates an acceleration state, and diagnosing the travel state as needing to stop the acceleration when it is determined to be less than the safe distance.
  • the obtaining of the 3D data may include receiving images related to a travel route, extracting feature points from the received image, detecting a lane from the feature points to generate a lane recognition image including lane information, and recognizing a lane with respect to the travel route based upon the generated image.
  • the collecting of the travel state information may include determining whether or not a lateral motion has been generated based upon the lane recognition information, the lateral motion indicating that the driving apparatus is driven close to one side of the recognized lane, wherein the travel state information may be the determined lateral motion state information.
  • the diagnosing of the travel state may include determining whether or not the number of lateral motions being generated, included in the lateral motion state information, exceeds a predetermined threshold number of lateral motions and/or a distance from a predetermined lane, included in the lateral motion state information, is below a threshold distance, and diagnosing the travel state as needing to stop the lateral motion when it is determined to exceed the threshold number of lateral motions or to be below the threshold distance.
  • an acceleration and a lateral motion of a driving apparatus can be recognized more accurately by using a 3D sensor, as compared to the existing sensor-based recognition, and accordingly it can be determined precisely whether or not the driving apparatus is performing eco-driving.
  • a travel state of the driving apparatus and a traveling zone having an economical fuel efficiency can be displayed; accordingly, a driver can determine whether or not the driving apparatus is performing eco-driving and safe driving, and can practice eco-driving so as to form an eco-driving habit.
  • guidance information for eco-driving can be provided in real time, so as to reduce energy consumption due to unnecessary driving and enhance fuel efficiency.
  • FIG. 1 is a block diagram of a driving apparatus in accordance with an exemplary embodiment of the present disclosure
  • FIG. 2 is a flowchart showing a traveling (driving) process using a 3D sensor in accordance with an exemplary embodiment
  • FIG. 3 is a flowchart showing a traveling process using 3D distance data in accordance with one exemplary embodiment
  • FIG. 4 is a view showing an exemplary embodiment that the driving apparatus obtains the 3D distance data
  • FIG. 5 is a view showing another exemplary embodiment that the driving apparatus obtains the 3D distance data
  • FIG. 6 is a flowchart showing a travel state information collection process and a travel state determination (diagnosis) process in accordance with one exemplary embodiment
  • FIG. 7 is a flowchart showing a traveling process using a lane recognition in accordance with another exemplary embodiment
  • FIG. 8A is a flowchart showing a process that the driving apparatus recognizes a lane
  • FIG. 8B is a view showing an image that the driving apparatus recognizes the lane.
  • FIG. 9 is a view showing a displayed state of a determination (diagnosis) result in accordance with an exemplary embodiment.
  • the exemplary embodiments of the present disclosure may be stand-alone, and additionally be applicable to various types of terminals such as mobile terminals, telematics terminals, smart phones, portable terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), laptop computers, tablet PCs, Wibro terminals, Internet Protocol Television (IPTV) terminals, television sets, 3D television sets, imaging devices, navigation terminals, Audio Video Navigation (AVN) terminals and the like.
  • the exemplary embodiments of the present disclosure may be implemented in a program command format capable of being executed by various computer devices so as to be recorded in a computer-readable medium.
  • the computer-readable medium may include program commands, data files, data structures and the like, independently or in combination.
  • the program commands recorded in the medium may be ones designed and constructed specifically for the present disclosure or ones known to and usable by those skilled in computer software.
  • Examples of the computer-readable media may include magnetic media such as hard disk, floppy disk and magnetic tape, optical media such as CD-ROM and DVD, magneto-optical media such as floptical disk, and specifically constructed hardware devices, such as ROM, RAM and flash memory, for storing program commands.
  • Examples of the program commands include not only machine codes, such as those created by a compiler, but also high-level language codes executable by a computer using an interpreter or the like.
  • the hardware device may be configured to operate as one or more software modules for executing the operations according to the present disclosure, and vice versa.
  • FIG. 1 is a block diagram of a driving apparatus in accordance with an exemplary embodiment of the present disclosure.
  • a driving apparatus 100 may include a three-dimensional (3D) sensor unit 110, a controller 120, a memory 130, an output unit 140 and a travel information collector 150.
  • the 3D sensor unit 110 is a camera system for simultaneously capturing front, rear, and/or side directions by using a rotatable reflector, a condenser and an imaging device (capturing device).
  • the 3D sensor unit 110 may be applicable to security, surveillance camera, robot vision and the like.
  • the rotatable reflector may be formed in various shapes, such as hyperboloid, spherical type, conical type, combined type and the like.
  • the 3D sensor unit 110 may include at least one pair of cameras (stereo cameras or stereoscopic cameras), which are installed on the same surface along the same central axis and horizontally spaced apart from each other, or a single camera.
  • the imaging device may be implemented as a Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) image sensor.
  • An image (namely, a front image) projected on the image surface of the imaging device is an image reflected by the rotatable reflector, so it is a distorted image, which is not appropriate to be viewed (observed) by a human being as it is. Therefore, the 3D sensor unit 110 may convert the coordinates of the output of the imaging device by using a microprocessor or the like to create a new panoramic image suitable for correct observation.
  • the 3D sensor unit 110 may include at least one of a stereo camera, a depth camera, a moving stereo camera and a light detection and ranging (LIDAR) system, for acquiring 3D distance data by three-dimensionally capturing all directions.
  • the stereo camera is an imaging device including a plurality of cameras.
  • the images in all directions obtained via the 3D sensor unit 110 may provide two-dimensional (2D) information related to the surroundings of the 3D sensor unit 110.
  • the 3D information related to the surroundings of the 3D sensor unit 110 may be obtained by using a plurality of images captured in different directions via the plurality of cameras.
  • the depth camera is a camera for extracting image and distance data by capturing or measuring obstacles. That is, the depth camera may capture obstacles, similar to the typical camera, to create image data, and measure a distance from the camera at an actual position corresponding to pixels of each image so as to create distance data.
  • 3D-TOF camera or the like may be employed as the depth camera.
  • the 3D-TOF camera is a camera for measuring depth based on the time-of-flight principle, as the name suggests. That is, the 3D-TOF camera may emit infrared light modulated at a high frequency (about 20 MHz) using infrared LEDs and calculate the phase difference of the light coming back, thus finding a depth value.
  • the 3D-TOF corresponds to a camera employing a concept similar to a radar.
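  • As a rough illustration of the time-of-flight principle described above (not code from the patent), the depth can be recovered from the phase shift of light modulated at about 20 MHz; the function and variable names below are assumptions.

```python
# Sketch of a time-of-flight depth calculation: the camera modulates infrared
# light at roughly 20 MHz and recovers depth from the phase shift of the
# returning light. Names and the example phase shift are illustrative.
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(phase_shift_rad: float, mod_freq_hz: float = 20e6) -> float:
    """Depth from phase shift: d = c * phi / (4 * pi * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# e.g. a phase shift of pi/2 at 20 MHz corresponds to about 1.87 m
print(tof_depth(math.pi / 2))
```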
  • the depth camera may filter off distance data upon capturing objects so as to choose obstacles present within a specific distance.
  • target objects located within a specific depth field can be segmented, excluding obstacles located at a near field and a far field.
  • the moving stereo camera denotes a camera in which the position of the stereo camera is actively changed depending on the distance from an obstacle, such that the stereo camera keeps a fixed viewing (focal) angle with respect to the obstacle to be observed.
  • the stereo camera may obtain images by disposing two cameras in parallel, and calculate a distance up to an obstacle according to a stereo disparity between the obtained images.
  • the stereo camera denotes a passive camera whose optical axes are always disposed in parallel and fixed.
  • the moving stereo camera may fix an observation angle by actively changing geometric positions of the optical axes.
  • the observation-angle-controlled stereo camera may constantly maintain a stereo disparity with respect to moving obstacles (objects) so as to provide a more natural stereo image to a stereo image observer, and may also provide information useful for obstacle distance measurement or stereo image processing.
  • the LIDAR system may detect an existence and a distance of an obstacle located in front of the driving apparatus 100.
  • the LIDAR system is a type of active remote sensing system, by which desired information is acquired without a direct contact with an object using the same principle as a radar.
  • the LIDAR system may shoot (fire) laser beams at a target whose information is needed and detect the time difference and energy change of the electromagnetic waves reflected back from the target, thereby obtaining the desired distance information.
  • the LIDAR system may be classified into three types, including Differential Absorption LIDAR (DIAL), a Doppler LIDAR and a range finder LIDAR, according to a purpose or target desired to be measured.
  • the Doppler LIDAR is utilized to measure a movement speed of an object using a Doppler principle.
  • LIDAR generally indicates the range finder LIDAR, which combines information related to distances from objects using a Global Positioning System (GPS), an Inertial Navigation System (INS) and a laser scanner system, so as to acquire 3D topographic information.
  • various sensors, implemented by GPS, INS, 3D LIDAR and photometry, may be combined and mounted in the driving apparatus 100, so as to measure the positions of regions along the travel route and acquire visual information while the driving apparatus 100 is running.
  • the controller 120 may control an overall operation of the driving apparatus 100.
  • the controller 120 may carry out control of various driving units for driving the driving apparatus 100, collection of travel state information, diagnosis of a travel state, control associated with an eco-driving, calculation and processing.
  • the controller 120 may carry out a variety of controls for eco-driving to be explained with reference to FIGS. 2 to 9.
  • the memory 130 may store a program for operations of the controller 120, and temporarily store input/output data (for example, driving mode, driving information, eco-driving state, etc.).
  • the memory 130 may store data related to vibration and sound of various patterns, which are output in response to touch inputs on the touch screen.
  • the memory 130 may store software components, which include an operating system (not shown) (for example, LINUX, UNIX, OS X, WINDOWS, Chrome, Symbian, iOS, Android, VxWorks or other embedded operating systems), a module (not shown) functioning as a wireless communication unit, a module (not shown) operating together with a user input unit, a module (not shown) operating together with an A/V input unit, and a module (not shown) operating together with the output unit 140.
  • the memory 130 may include at least one storage medium of a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or XD memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, a magnetic disk, an optical disk, and the like.
  • the driving apparatus 100 may operate a web storage which performs the storage function of the memory 130 on the Internet.
  • the output unit 140 may generate outputs relating to an audio signal, a video signal or a tactile signal.
  • the output unit 140 may include a display unit 141, an audio output module 142, a lamp output module 143 and a haptic module 144.
  • the display unit 141 may output information processed in the driving apparatus 100. For example, while the driving apparatus 100 is driven, the display unit 141 may display User Interface (UI) or Graphic User Interface (GUI) relating to the driving.
  • the display unit 141 may be implemented using, for example, at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT-LCD), an Organic Light-Emitting Diode (OLED), a flexible display, a three-dimensional (3D) display, an e-ink display or the like.
  • Some of these display units 141 may be implemented as a transparent type or an optically transparent type through which the exterior is visible, which is referred to as a 'transparent display'.
  • a representative example of the transparent display may include a Transparent OLED (TOLED), and the like.
  • the rear surface of the display unit 141 may also be implemented to be optically transparent.
  • the display unit 141 may be implemented in two or more in number according to a configured aspect of the driving apparatus 100. For instance, a plurality of the display units 141 may be arranged on one surface to be spaced apart from or integrated with each other, or may be arranged on different surfaces.
  • Where the display unit 141 and a touch sensitive sensor have a layered structure therebetween (the structure may be referred to as a 'touch screen'), the display unit 141 may be used as an input device as well as an output device.
  • the touch sensor may be implemented as a touch film, a touch sheet, a touchpad, and the like.
  • the touch sensor may be configured to convert changes of a pressure applied to a specific part of the display unit 141, or a capacitance occurring from a specific part of the display unit 141, into electric input signals. Also, the touch sensor may be configured to sense not only a touched position and a touched area, but also a touch pressure.
  • When touch inputs are sensed by the touch sensor, corresponding signals are transmitted to a touch controller (not shown).
  • the touch controller processes the received signals, and then transmits corresponding data to the controller 120. Accordingly, the controller 120 may sense which region of the display unit 141 has been touched.
  • the audio output module 142 may output audio data stored in the memory 130 in a recording mode, a voice recognition mode and the like.
  • the audio output module 142 may also output a sound signal relating to a function (a driving mode change, an eco-driving indication, an eco-driving warning and the like) performed in the driving apparatus 100.
  • the lamp output module 143 may output state information relating to the driving apparatus 100 using lamps, and differently display the state using the brightness of a lamp, the color of a lamp, the flickering of a lamp and the like.
  • the lamp output module 143 may have functions of lighting and signaling, and use as a light source a Light Emitting Diode (LED), a projection lamp, a position lamp, a halogen lamp and the like.
  • the haptic module 144 generates various tactile effects which a user can feel.
  • a representative example of the tactile effects generated by the haptic module 144 includes vibration.
  • Vibration generated by the haptic module 144 may have a controllable intensity, a controllable pattern, and so on. For instance, different vibration may be output in a synthesized manner or in a sequential manner.
  • the haptic module 144 may generate various tactile effects, including not only vibration, but also arrangement of pins vertically moving with respect to a skin being touched (contacted), air injection force or air suction force through an injection hole or a suction hole, touch by a skin surface, presence or absence of contact with an electrode, effects by stimulus such as an electrostatic force, reproduction of cold or hot feeling using a heat absorbing device or a heat emitting device, and the like.
  • the haptic module 144 may be configured to transmit tactile effects (signals) through a user's direct contact, or a user's muscular sense using a finger or a hand.
  • the haptic module 144 may be implemented in two or more in number according to the configuration of the driving apparatus 100.
  • the travel information collector 150 may collect travel state information relating to the driving apparatus 100 by receiving it in real time, and may include an acceleration sensor 151, a gyro sensor 152 and a steering sensor 153.
  • the acceleration sensor 151 is a device for converting an acceleration change in one direction into an electrical signal.
  • the acceleration sensor 151 may generally be configured by mounting two or three axes in one package, and may alternatively require only the Z-axis depending on the usage environment. Hence, if an X-axis or Y-axis acceleration sensor should be used instead of a Z-axis acceleration sensor for some reason, the acceleration sensor may be mounted upright on a main board using a separate plate.
  • the gyro sensor 152 is a sensor for measuring an angular speed of the driving apparatus 100 which performs a rotary motion, and may detect a rotated angle from each reference direction.
  • the gyro sensor 152 may detect each rotated angle, namely, an azimuth, a pitch and a roll, based upon three-directional axes.
  • the steering sensor 153 may include a rotor rotated in conjunction with a steering wheel of the driving apparatus 100, a gear rotated integrally with the rotor, a detecting part for detecting phase changes by rotation of a magnetic substance, which generates a magnetic force, an operating part for operating and outputting an input of the detecting part, and a PCB substrate and a housing for mounting the operating part.
  • the steering sensor 153 may alternatively be a variety of steering wheel angle detecting devices, which may further include additional components or be configured without part of components.
  • FIG. 1 shows the driving apparatus 100 having various components, but it is understood that implementing all of the illustrated components is not a requirement. Greater or fewer components may alternatively be implemented.
  • FIG. 2 is a flowchart showing a traveling (driving) process using a 3D sensor in accordance with an exemplary embodiment.
  • the driving apparatus 100 may acquire 3D data from the 3D sensor unit 110 (S201).
  • the 3D sensor unit 110 may use one of a stereo camera, a depth camera, a moving stereo camera and a LIDAR system to obtain 3D information relating to an object to be captured using images in every directions.
  • the 3D sensor unit 110 may convert electric signals, which are created by the stereo camera or the like, into image signals, and generate a 3D map around the driving apparatus 100 from the images implemented by the image signals.
  • the generated 3D map may be stored in the memory 130, and updated every time the position of the driving apparatus 100 changes.
  • the 3D data may be related to a position and/or a distance of an obstacle (object), which is located in front of the driving apparatus 100, and/or another driving apparatus.
  • the 3D data may be lane recognition information with respect to a travel route (traveling route, traveling path, driving path) of the driving apparatus 100.
  • the 3D data obtained from the 3D sensor unit 110 may be combined data of a plurality of information including a distance from another driving apparatus located in front of the driving apparatus 100 and lane recognition information with respect to the travel route of the driving apparatus 100.
  • regarding how the driving apparatus 100 obtains the 3D data from the 3D sensor unit 110: when the 3D data is related to the distance from another driving apparatus located in front of the driving apparatus 100, the 3D data obtaining method will be described in detail with reference to FIGS. 4 and 5.
  • when the 3D data is the lane recognition information with respect to the travel route of the driving apparatus 100, the 3D data obtaining method will be described in detail with reference to FIGS. 8A and 8B.
  • the driving apparatus 100 may collect travel state information from the obtained 3D data (S202).
  • the obtained 3D data may be distance data up to an obstacle and/or another driving apparatus located in front of the driving apparatus 100 or lane recognition data relating to the travel route of the driving apparatus 100, based upon the 3D image obtained by the 3D sensor unit 110.
  • the travel state information of the driving apparatus 100 may include an acceleration or deceleration state of the driving apparatus 100, sudden acceleration and deceleration states thereof, and a lateral motion state within the lane where the driving apparatus 100 is moving. Also, the travel state information of the driving apparatus 100 may include various types of information obtainable from the sensors of the travel information collector 150, such as information related to an acceleration or deceleration state, a steering state, the speed of the driving apparatus 100, RPM, an energy consumption ratio, a friction coefficient of the driving surface, a fuel consumption ratio, the length of the driving apparatus 100, headway and the like.
  • the travel state information of the driving apparatus 100 may also include various types of information obtainable via a separately disposed information communication unit (not shown), in addition to the information obtained by the 3D sensor unit 110.
  • such information may include geometric information relating to roads (location information, plane slope, longitudinal slope and road curvature), traffic condition information (traffic congestion, traffic volume, block speed, traffic flow and headway) and weather information (rainfall, snowfall, fog, friction coefficient of the driving surface).
  • such information may be obtained in order to guide and control a correct travel state of the driving apparatus 100, together with the travel state information obtained via the 3D sensor unit 110.
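  • Purely as an illustration, the travel state information described above could be collected into a structure such as the following sketch; the field names and types are assumptions, since the patent does not prescribe any particular format.

```python
# Illustrative container for collected travel state information.
# All field names are assumptions chosen for this sketch.
from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class TravelState:
    timestamp_s: float
    front_distance_m: Optional[float] = None   # 3D distance to the front vehicle
    accel_state: Optional[str] = None           # "accelerating" / "decelerating"
    lateral_offset_m: Optional[float] = None    # signed offset from the lane center
    speed_kmh: Optional[float] = None
    rpm: Optional[float] = None

@dataclass
class TravelLog:
    samples: List[TravelState] = field(default_factory=list)

    def add(self, sample: TravelState) -> None:
        self.samples.append(sample)              # updated while the vehicle is driven
```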
  • the travel state information may be stored in the memory 130 to be used for determining (diagnosing) a travel state of the driving apparatus 100 with being updated while the driving apparatus 100 is driven.
  • the controller 120 may then determine (diagnose) the travel state of the driving apparatus 100 based upon the collected travel state information (S203).
  • the diagnosis of the travel state may be carried out to help an eco-driving of the driving apparatus 100.
  • the eco-driving is to reduce fuel consumption by reducing unnecessary movement while the driving apparatus 100 is driven and realize economically temporally efficient driving. Therefore, the travel state diagnosis may be carried out based upon determination as to whether or not the driving apparatus 100 is unnecessarily accelerating or decelerating, takes an unnecessary lateral motion within lane, or suddenly accelerates or decelerates.
  • the controller 120 may determine a current travel state of the driving apparatus 100 based upon the collected travel state information.
  • the controller 120 may determine an acceleration or deceleration state of the driving apparatus 100 based on the 3D distance data from another driving apparatus located in front of the driving apparatus 100, among the collected travel state information.
  • the controller 120 may determine whether or not the driving apparatus 100 has taken a lateral motion within the lane based upon the lane recognition information related to the travel route of the driving apparatus 100, among the collected travel state information.
  • the controller 120 may diagnose whether the driving apparatus 100 is driving economically according to the travel state determination result with respect to the driving apparatus 100. According to the travel state determination result, if there occurs an unnecessary acceleration or deceleration, an unnecessary lateral motion within the lane or a sudden stop or sudden deceleration, the driving may cause unnecessary fuel consumption, so it may be diagnosed as economically inefficient driving. Hence, the controller 120 may make a diagnosis requiring the kind of driving needed to reduce or stop the unnecessary travel state. For example, when the driving apparatus 100 is unnecessarily accelerating, the controller 120 may make a diagnosis that the driving apparatus 100 should stop accelerating and maintain a constant speed. That is, the controller 120 may make a diagnosis requiring deceleration of the driving apparatus 100.
  • similarly, the controller 120 may make a diagnosis that the driving apparatus 100 should stop unnecessary steering and move straight within the lane. That is, the controller 120 may make a diagnosis requiring the lateral motion to stop.
  • the controller 120 may diagnose the travel state of the driving apparatus 100 based on a plurality of travel states. For example, the controller 120 may diagnose the travel state of the driving apparatus 100 by combining the acceleration or deceleration state and the lateral motion state of the driving apparatus 100.
  • the driving apparatus 100 may display the diagnosis results (S204).
  • the output unit 140 may display a warning (caution) indicating that the driving apparatus 100 is making uneconomic driving due to the unnecessary acceleration, or a note indicating that the driving apparatus 100 should reduce speed and maintain constant speed.
  • the displaying of the diagnosis results may be implemented by outputting image and/or voice via the display unit 141 and/or the audio output module 142.
  • the driving apparatus 100 may display the travel state information and/or the travel state diagnosis results on the display unit 141 using various UIs and GUIs.
  • the display unit 141 may digitize the diagnosis results and classify the diagnosis results into the number of times or scales so as to represent with numerical values, or display the diagnosis results using a text format such as a message, a graph, a predetermined figure and the like.
  • the audio output module 142 may announce information for inducing eco-driving from the diagnosis results by voice, or output a warning message by voice. Also, the audio output module 142 may inform an uneconomic driving using warning sound like beep sound.
  • the diagnosis results may alternatively be displayed via the lamp output module 143.
  • the lamp output module 143 may classify the diagnosis results into scales, and display the diagnosis results by means of the brightness, color, flickering times or flickering speed of a lamp and the like, according to whether the travel state is economic or uneconomic.
  • the diagnosis results may be displayed via the haptic module 144.
  • the haptic module 144 may perform displaying of a warning thereof by various tactile effects using vibration, air injection, or presence or absence of contact with an electrode, effects by stimulus such as an electrostatic force, reproduction of cold or hot feeling, or the like.
  • FIG. 3 is a flowchart showing a traveling process using 3D distance data in accordance with one exemplary embodiment.
  • the driving apparatus 100 may acquire 3D distance data via the 3D sensor unit 110 (S211).
  • the 3D distance data may be a position of another driving apparatus located in front of the driving apparatus 100 and/or a distance from the front driving apparatus.
  • the 3D sensor unit 110 may be implemented as a stereo camera.
  • the 3D sensor unit 110 may obtain a plurality of stereo images, namely, a first stereo image 401 and a second stereo image 402 by capturing the front face of the driving apparatus 100 using the stereo camera.
  • the 3D sensor unit 110 may generate a 3D disparity map 403 from the plurality of images.
  • the stereo disparity image 403 may be represented relatively brighter when a front obstacle is located near the stereo camera, while represented relatively darker when located far from the stereo camera.
  • the 3D sensor unit 110 may measure a distance from the front obstacle using brightness information (namely, depth of an image).
  • the 3D sensor unit 110 of the driving apparatus 100 may measure a distance between the driving apparatus 100 and another driving apparatus located in front of the driving apparatus 100.
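  • A brief sketch of how a distance could be recovered from the stereo disparity described above, using the standard pinhole stereo relation Z = f·B/d; the focal length and baseline values are illustrative assumptions, not values from the patent.

```python
# Sketch of recovering distance from a stereo disparity value: closer objects
# produce larger disparity (brighter pixels in the disparity map).
# The focal length (pixels) and baseline (meters) are assumed values.

def distance_from_disparity(disparity_px: float,
                            focal_px: float = 800.0,
                            baseline_m: float = 0.30) -> float:
    """Pinhole stereo model: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")          # no measurable disparity
    return focal_px * baseline_m / disparity_px

# A disparity of 12 px gives 800 * 0.30 / 12 = 20 m to the front vehicle.
print(distance_from_disparity(12.0))
```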
  • the 3D sensor unit 110 may be configured as the LIDAR system.
  • the LIDAR system may shoot (fire) laser beams to the front of the driving apparatus 100 and measure a return time that the beams come back by being reflected by another driving apparatus located at the front, so as to obtain distance data from the front driving apparatus.
  • a laser generator 501 of the LIDAR system generates a laser pulse to be used for distance measurement to transmit laser 502 in a driving direction.
  • the laser 502 transmitted from the LIDAR system 500 reaches another driving apparatus 100' located in front of the driving apparatus 100, and is reflected by the another driving apparatus 100' to come back in form of reflected wave 503.
  • a laser sensor 504 of the LIDAR system 500 then detects the reflected wave 503 and measures a time taken from the laser 502 being transmitted to the reflected wave 503 being returned, and an energy change level.
  • the laser sensor 504 may calculate a distance up to the another driving apparatus 100' based on the measurement result. For example, if it is assumed that the return time is t, the speed of the laser beam is c and the distance up to the another driving apparatus 100' is R, the laser sensor 504 may calculate the distance according to the following equation: R = (c × t) / 2.
  • the driving apparatus 100 may utilize the return time that the laser 502 transmitted from the LIDAR system 500 is returned as the reflected wave 503 or the number of times that the laser 502 is returned as the reflected wave 503, as the 3D data, instead of the 3D distance data.
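  • A minimal sketch of the range calculation R = c·t/2 referred to above; the function name and the example return time are assumptions for illustration only.

```python
# Sketch of the LIDAR range calculation: the pulse travels to the front
# vehicle and back, so the one-way distance is half of c * t.

C = 299_792_458.0  # speed of light, m/s

def lidar_range(return_time_s: float) -> float:
    """R = c * t / 2 for a measured round-trip time t."""
    return C * return_time_s / 2.0

# A return time of 200 ns corresponds to roughly 30 m.
print(lidar_range(200e-9))
```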
  • the driving apparatus 100 may collect travel state information related to acceleration or deceleration from the obtained 3D distance data (S212).
  • the controller 120 may continuously store the 3D distance data in the memory 130, and thereafter collect the travel state information related to acceleration or deceleration of the driving apparatus 100 based on the 3D distance data.
  • when the driving apparatus 100 is getting closer to another driving apparatus located at the front as time elapses, it may be determined that the driving apparatus 100 is accelerating over a predetermined speed or that the front driving apparatus is decelerating. On the contrary, when the driving apparatus 100 is getting farther from another driving apparatus located at the front as time elapses, it may be determined that the driving apparatus 100 is decelerating or that the front driving apparatus is accelerating.
  • if the driving apparatus 100 is determined to be located within a shorter distance than a safe distance from another driving apparatus located in front of the driving apparatus 100, regardless of the travel state of the front driving apparatus, it may be diagnosed that the accelerated driving should be stopped.
  • the controller 120 may compare the collected 3D distance data with each other (S61). The controller 120 may determine whether or not the distance is increased or decreased by the comparison of the 3D distance data (S62). According to the comparison result, if the distance is decreased, the controller 120 may determine the travel state of the driving apparatus 100 as accelerating (S63). On the other hand, if the distance is increased according to the comparison result, the controller 120 may determine the travel state of the driving apparatus 100 as decelerating (S64). Or, if the distance is rapidly decreased or increased according to the comparison of the collected 3D distance data, the travel state of the driving apparatus 100 may be diagnosed as a sudden acceleration or deceleration. Therefore, the controller 120 may collect the travel state information related to acceleration or deceleration of the driving apparatus 100 according to the determination results.
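  • The following sketch illustrates one possible reading of steps S61 to S64: consecutive 3D distance samples are compared to label the travel state. The rate treated as "sudden" is an assumption, and the real system may also attribute a shrinking gap to deceleration of the front vehicle.

```python
# Sketch of steps S61-S64: consecutive 3D distance samples are compared and
# the travel state is labelled. The "sudden" rate threshold is assumed.

def classify_travel_state(prev_dist_m: float, curr_dist_m: float,
                          dt_s: float, sudden_rate_mps: float = 5.0) -> str:
    rate = (prev_dist_m - curr_dist_m) / dt_s   # closing speed, m/s
    if rate > sudden_rate_mps:
        return "sudden acceleration"            # gap shrinking rapidly
    if rate < -sudden_rate_mps:
        return "sudden deceleration"            # gap growing rapidly
    if curr_dist_m < prev_dist_m:
        return "accelerating"                   # S63: distance decreased
    if curr_dist_m > prev_dist_m:
        return "decelerating"                   # S64: distance increased
    return "constant"
```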
  • the controller 120 may determine the travel state of the driving apparatus 100 based upon information related to a return time taken until the laser emitted from the 3D sensor unit 110 reaches a front driving apparatus and then returns.
  • the controller 120 may determine the travel state of the driving apparatus 100 as accelerating when the collected time information is decreased as a time elapses.
  • the controller 120 may determine the travel state of the driving apparatus 100 as decelerating when the time information is increased as a time elapses.
  • the controller 120 may alternatively diagnose the travel state of the driving apparatus 100 based on the number of times that the laser emitted from the 3D sensor unit 110 reaches the front driving apparatus and then returns.
  • the controller 120 may determine the travel state of the driving apparatus 100 as accelerating when the collected number of times gradually increases over a predetermined period.
  • the controller 120 may determine the travel state of the driving apparatus 100 as decelerating when the collected number of times gradually decreases over a predetermined period.
  • the controller 120 may determine the acceleration or deceleration state of the driving apparatus 100 based upon the change in the return time or the change in the number of returns over a predetermined time, and collect acceleration or deceleration state information.
  • the collection of the travel state information may be carried out at a predetermined period of time.
  • the controller 120 may collect the acceleration or deceleration state information of the driving apparatus 100 from the 3D distance data for a predetermined period of time.
  • the controller 120 may collect the acceleration or deceleration state information of the driving apparatus 100 from the return time information or the information related to the number of returns over a predetermined period of time.
  • the driving apparatus 100 may diagnose the travel state based upon the collected acceleration or deceleration travel state information (S213).
  • the controller 120 may diagnose whether a distance between the driving apparatus 100 and another driving apparatus located in front of the driving apparatus 100 is decreased below a safe distance (S65). When the distance is within the safe distance, the controller 120 may make a diagnosis such that the driving apparatus 100 should stop the acceleration for eco-driving (S66).
  • the safe distance may be predetermined and stored in the memory 130, and may indicate a distance between driving apparatuses as directed by traffic rules.
  • the controller 120 may diagnose to stop sudden acceleration or deceleration, which causes unnecessary fuel consumption and abrasion of components, when the driving apparatus 100 is diagnosed as the sudden acceleration or deceleration state.
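  • A small sketch, under assumed values, of the safe-distance diagnosis of steps S65 and S66; the 50 m safe distance is illustrative, whereas the patent reads the safe distance from the memory 130 according to traffic rules.

```python
# Sketch of steps S65-S66: when the measured gap falls below the stored safe
# distance while accelerating, the diagnosis is that acceleration should stop.
# The default safe distance is an assumed value for illustration.

def diagnose(front_distance_m: float, travel_state: str,
             safe_distance_m: float = 50.0) -> str:
    if travel_state.startswith("sudden"):
        return "stop sudden acceleration/deceleration"
    if travel_state == "accelerating" and front_distance_m < safe_distance_m:
        return "stop acceleration and keep a constant speed"   # S66
    return "eco-driving"
```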
  • the foregoing description has been given of the case where the controller 120 diagnoses the travel state of the driving apparatus 100; however, methods using various other sensors and calculating apparatuses may also be employed to determine an acceleration or deceleration state, or a sudden acceleration or deceleration state, of the driving apparatus 100.
  • the driving apparatus 100 may display the diagnosis results (S214).
  • the output unit 140 may display a warning indicating that the driving apparatus 100 is making uneconomic driving due to unnecessary acceleration, or a note indicating that the driving apparatus 100 should slow down and maintain a predetermined speed.
  • the displaying of the diagnosis result may be carried out through image, voice, lamp and various tactile stimulus using one or more of the display unit 141, the audio output module 142, the lamp output module 143 and the haptic module 144.
  • FIG. 7 is a flowchart showing a traveling process using a lane recognition in accordance with another exemplary embodiment.
  • the driving apparatus 100 may recognize a lane on a travel route via the 3D sensor unit 110 (S221).
  • the controller 120 may receive a 3D image for the travel route of the driving apparatus 100 via the 3D sensor unit 110 (S81).
  • the controller 120 may extract feature points from the 3D image 81 (see FIG. 8A) obtained by the 3D sensor unit 110 (S82).
  • the feature points may be extracted with respect to the overall area or a specific area of interest, based upon a horizontal axis or a vertical axis, according to the extraction algorithm.
  • the extraction algorithm, and the number or area of the feature points, are not limited.
  • the controller 120 may detect a lane using a color recognition algorithm from the extracted feature points (S83).
  • in general, a road surface is black, a center line is yellow and a normal lane marking is white. So, the controller 120 may detect those colors from the images to distinguish lane markings from portions which are not lanes, so as to recognize a lane on the travel route of the driving apparatus 100 (a rough sketch of such color-based classification follows below).
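  • The sketch below shows one simple way such color-based lane-pixel classification could look; the RGB thresholds are assumptions that would need tuning for real road imagery, and the patent does not specify a particular algorithm.

```python
# Rough sketch of color-based lane-pixel classification: pixels are labelled
# as road (dark), white lane marking or yellow center line.
# The thresholds are assumed values for illustration only.

import numpy as np

def classify_lane_pixels(rgb: np.ndarray) -> np.ndarray:
    """rgb: HxWx3 uint8 image -> HxW labels {0: road, 1: white lane, 2: yellow center}."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    white  = (r > 180) & (g > 180) & (b > 180)
    yellow = (r > 150) & (g > 120) & (b < 100)
    labels = np.zeros(rgb.shape[:2], dtype=np.uint8)
    labels[white] = 1
    labels[yellow] = 2
    return labels
```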
  • the controller 120 may remove noise from the images through several types of filtering (S84). There is no limit to the filtering way, and the noise removal may not be required according to an image state or a 3D sensor system.
  • the controller 120 may create an image 82 (see FIG. 8B), from which the lane of the travel route of the driving apparatus 100 has been extracted through the processes, and recognize lane information (S85).
  • the 3D sensor unit 110 may recognize a lane using a curve equation for a case where the driving apparatus 100 travels along a curved road.
  • the 3D sensor unit 110 may calculate curve information following a central point of the lane with respect to a plurality of feature points corresponding to the curve.
  • the calculated curve information may be used to improve lane maintenance performance by minimizing the effect of the calibration state of the 3D sensor unit 110.
  • the 3D sensor unit 110 may calculate the curve information following the central point of the lane by using one of a least squares method, RANSAC, a generalized Hough transform, spline interpolation and the like, with respect to the plurality of feature points corresponding to the checked curve.
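  • As a short illustration of one of the listed options (an ordinary least-squares polynomial fit), the lane-center curve could be fitted as sketched below; the polynomial degree and function names are assumptions.

```python
# Sketch of fitting the lane-center curve to extracted feature points with an
# ordinary least-squares polynomial (one of the options listed above).

import numpy as np

def fit_lane_center(xs: np.ndarray, ys: np.ndarray, degree: int = 2) -> np.ndarray:
    """Return polynomial coefficients of y = f(x) describing the lane center."""
    return np.polyfit(xs, ys, degree)

# Usage: coeffs = fit_lane_center(feature_x, feature_y); np.polyval(coeffs, x0)
```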
  • the driving apparatus 100 may then collect state information relating to a lateral motion within a lane from the recognized lane (S222).
  • the controller 120 may store lane recognition information relating to the travel route of the driving apparatus 100 in the memory 130 in order to use them for collection of travel state information related to the driving apparatus 100.
  • when the driving apparatus 100 is driven close to one side of the recognized lane, the controller 120 may determine the travel state of the driving apparatus 100 as the lateral motion state. That is, when the driving apparatus 100 is located close to the left of the recognized lane, the controller 120 may determine that the driving apparatus 100 is moving to the left side. On the contrary, when the driving apparatus 100 is located close to the right of the recognized lane, the controller 120 may determine that the driving apparatus 100 is moving to the right side.
  • the controller 120 may collect the lateral motion state information related to the driving apparatus 100 from the determination results.
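  • A minimal sketch of deriving the lateral-motion state from the recognized lane, assuming the vehicle's offset from the lane center is available; the dead-band value is an assumption.

```python
# Sketch of the lateral-motion determination: the signed offset from the lane
# center decides whether the vehicle is drifting left or right.
# The dead-band width is an assumed value.

def lateral_state(left_lane_x: float, right_lane_x: float,
                  vehicle_x: float, dead_band_m: float = 0.3) -> str:
    center = (left_lane_x + right_lane_x) / 2.0
    offset = vehicle_x - center           # negative: toward the left line
    if offset < -dead_band_m:
        return "moving left"
    if offset > dead_band_m:
        return "moving right"
    return "centered"
```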
  • the collection of the travel state information may be carried out at a predetermined period of time.
  • the controller 120 may collect the lane recognition information for a predetermined period of time and diagnose whether the driving apparatus 100 is performing the lateral motion.
  • the driving apparatus 100 may diagnose the travel state based on the collected lateral motion information (S223).
  • the controller 120 may diagnose whether the lateral motion state of the driving apparatus 100 is in an eco-driving state or uneconomic driving state by comparison with a predetermined threshold value.
  • the threshold value may be the threshold number of lateral motions with respect to a predetermined time.
  • the threshold value may be a numeral value indicating the distance between the driving apparatus 100 and the recognized lane in view of the lateral motion.
  • when the threshold value is exceeded, the controller 120 may diagnose the driving apparatus 100 as driving uneconomically, and determine that the lateral motion should be stopped.
  • otherwise, the controller 120 may diagnose that the lateral motion does not need to be stopped.
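  • The following sketch illustrates the threshold-based lateral-motion diagnosis described above; the threshold number of motions, the minimum distance to the lane line and the lane half-width are all assumed values.

```python
# Sketch of the lateral-motion diagnosis over one collection period: the
# number of drifts and the closest approach to the lane line are compared
# with thresholds. All numeric defaults are assumed values.

from typing import List

def diagnose_lateral(offsets_m: List[float],
                     max_motions: int = 3,
                     min_line_distance_m: float = 0.2,
                     lane_half_width_m: float = 1.75) -> str:
    """offsets_m: signed offsets from the lane center collected over one period."""
    if not offsets_m:
        return "eco-driving"
    motions = sum(1 for o in offsets_m if abs(o) > 0.3)              # assumed drift threshold
    closest_to_line = min(lane_half_width_m - abs(o) for o in offsets_m)
    if motions > max_motions or closest_to_line < min_line_distance_m:
        return "stop lateral motion"                                 # uneconomic driving
    return "eco-driving"
```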
  • the driving apparatus 100 may collect the travel state information relating thereto using various sensors and calculating apparatuses as well as a curved road recognition algorithm.
  • the foregoing description has been given of the case where the controller 120 diagnoses the travel state of the driving apparatus 100; however, methods using various other sensors and calculating apparatuses may also be employed to determine the lateral motion state of the driving apparatus 100.
  • the driving apparatus 100 may display the diagnosis results (S224).
  • the output unit 140 may display a warning indicating that the driving apparatus 100 is making an uneconomic driving due to unnecessary acceleration, or a note indicating that the driving apparatus 100 should slow down and maintain a predetermined speed.
  • the displaying of the diagnosis result may be carried out through image, voice, lamp and various tactile stimuli using one or more of the display unit 141, the audio output module 142, the lamp output module 143 and the haptic module 144.
  • FIG. 9 is a view showing a displayed state of a determination (diagnosis) result in accordance with an exemplary embodiment.
  • FIG. 9 shows an example in which the driving apparatus 100 combines the acceleration or deceleration driving diagnosis results and the lateral motion diagnosis results into one graph.
  • the output unit 140 may display the travel state diagnosis results in the form of a graph having two axes 91 and 92 and an indicator 95 indicating the location of the driving apparatus 100.
  • the horizontal axis 91 on the graph indicates the lateral motion diagnosis result of the driving apparatus 100.
  • the vertical axis 92 on the graph indicates the acceleration or deceleration diagnosis result of the driving apparatus 100. That is, as the indicator 95 moves farther away from the center of the vertical axis 92, it indicates that the driving apparatus 100 is making an excessive acceleration or deceleration driving.
  • when the indicator 95 is located close to the intersection between the axes 91 and 92, it exhibits a diagnosis result that the driving apparatus 100 is making an eco-driving without acceleration, deceleration or lateral motion. On the contrary, when the indicator 95 is located farther away from the intersection, it exhibits a diagnosis result that the driving apparatus 100 is making an uneconomic driving (for example, an acceleration driving as the indicator moves upward from the intersection, and a lateral motion driving as it moves to the left or right from the intersection).
  • the oval present on the graph is a figure indicating whether or not the driving apparatus 100 is making an eco-driving.
  • an external area 93 of the oval may indicate that the driving apparatus 100 is making an uneconomic driving by unnecessarily consuming fuel due to acceleration or deceleration or lateral motion.
  • the external area 93 may be represented with a red color indicating a warning message.
  • an internal area 94 of the oval may indicate that the driving apparatus 100 is making an economically efficient driving without acceleration or deceleration or lateral motion.
  • the internal area 94 may be represented with a green color indicating a safety message.
  • the displaying method shown in FIG. 9 is merely one example of the various displaying methods disclosed in this specification. As described above, the diagnosis results may be displayed using image, sound, lamp, various tactile stimuli and the like. A minimal sketch of how such a two-axis indicator display could be composed follows this list.
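As a rough illustration of the FIG. 9 display logic, the sketch below combines a lateral-motion score (horizontal axis 91) and an acceleration/deceleration score (vertical axis 92) and checks whether the indicator 95 falls inside the eco-driving oval. The score scales, oval radii and helper names are assumptions introduced for this example and are not taken from the disclosure.

```python
# Hypothetical sketch of a FIG. 9 style indicator:
# x = lateral-motion diagnosis, y = acceleration/deceleration diagnosis.
from dataclasses import dataclass


@dataclass
class EcoIndicator:
    x: float  # lateral-motion score: negative = left drift, positive = right drift
    y: float  # acceleration score: positive = accelerating, negative = decelerating


def inside_eco_oval(ind: EcoIndicator, rx: float = 1.0, ry: float = 1.0) -> bool:
    """True when the indicator lies inside the oval, i.e. the eco-driving area 94."""
    return (ind.x / rx) ** 2 + (ind.y / ry) ** 2 <= 1.0


def render(ind: EcoIndicator) -> str:
    """Compose the message and color, mirroring the green/red areas of FIG. 9."""
    if inside_eco_oval(ind):
        return "GREEN: economical driving (no excessive acceleration or lateral motion)"
    return "RED: uneconomic driving (excessive acceleration/deceleration or lateral motion)"


if __name__ == "__main__":
    print(render(EcoIndicator(x=0.2, y=0.3)))  # near the intersection: eco area 94
    print(render(EcoIndicator(x=1.4, y=0.1)))  # far to the side: warning area 93
```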

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

A driving apparatus includes a three-dimensional (3D) sensor unit configured to obtain 3D data, a controller configured to collect travel state information from the obtained data and diagnose a travel state based upon the collected travel state information, and an output unit configured to display the diagnosis result. A driving method includes obtaining three-dimensional (3D) data, collecting travel state information from the obtained data, diagnosing a travel state based upon the collected travel state information, and displaying the diagnosis result.

Description

DRIVING APPARATUS AND METHOD USING 3D SENSOR
The present disclosure relates to a driving (traveling) apparatus and method using a three-dimensional (3D) sensor.
In general, when a vehicle is driven with an optimal load, it consumes relatively less fuel. However, when load changes occur frequently due to sudden acceleration, sudden braking (sudden stop, emergency braking) or the like, the fuel consumption increases drastically.
To overcome environmental problems and fuel depletion under high oil prices, many efforts are being made to improve fuel efficiency. In this context, improvement of drivers' driving habits is recognized as an important issue because such habits are important factors affecting vehicle fuel efficiency.
Economic driving, namely eco-driving, refers to a new concept of driving behavior that practices exhaust gas reduction, accident prevention, extension of vehicle lifespan and energy saving through an environmentally friendly, safe and economical way of driving.
As drivers become more interested in eco-driving, vehicles having economical driving functions (e.g., a function of informing of and recording a travel state (traveling condition, driving state) while being driven) are being developed.
An eco-driving system is a driving assistance system for improving fuel efficiency by inducing drivers, based upon automotive technologies, to adopt scientific driving habits capable of enhancing fuel efficiency. As an example, the engine revolutions per minute (RPM) and the driving velocity of a car are calculated. When a driver is practicing eco-driving, a green light (lamp) is turned on; on the contrary, when the driver is driving recklessly, such as with sudden acceleration, sudden deceleration or the like, a red light is turned on, so as to lead the driver to practice eco-driving. As another example, the eco-driving system induces eco-driving by displaying instantaneous fuel efficiency, a continuity level, points and the like on a monitor within the car while the car is being driven.
Also, the related art eco-driving system has remained at the level of inducing eco-driving and improving driving habits through voice information, indication using a lamp, and display of fuel efficiency on the monitor within the car. In recent times, however, studies are under way on systems that support eco-driving by having travel state information related to a vehicle monitored, stored, managed and analyzed by an external management server, which provides guidance information for inducing eco-driving to the vehicle in real time.
In the meantime, with regard to inducing eco-driving, only travel state information related to the vehicle to be controlled, macroscopic road conditions or the like are considered at present. However, it is necessary to clearly recognize unnecessary travel states of a vehicle by considering the surroundings of the vehicle being driven in detail.
In addition, regarding recognition of the travel state of a vehicle being driven, the existing sensor-based approach suffers from poor accuracy. Hence, it is required to measure the velocities of adjacent vehicles and the distances from the adjacent vehicles more accurately than with the existing sensors, and to determine whether or not the vehicle being controlled is accelerating unnecessarily.
Furthermore, when a vehicle makes an unnecessary horizontal (lateral) motion pattern without changing a traffic lane, fuel is wasted. Hence, such motion should be detected and prevented.
Therefore, to address those problems, an aspect of the detailed description is to provide a driving apparatus capable of detecting (sensing) a distance from another apparatus located in front of the driving apparatus using a three-dimensional (3D) sensor, and measuring and reducing unnecessary acceleration of the driving apparatus, and a method thereof.
Another aspect of the detailed description is to provide a driving apparatus capable of detecting a lane of a road on which the driving apparatus is traveling using the 3D sensor, measuring an unnecessary horizontal (lateral) motion within the lane, and reducing the horizontal motion, and a method thereof.
Another aspect of the detailed description is to provide a driving apparatus capable of controlling the unnecessary acceleration and the lateral motion to be reduced so as to guide a vehicle (car) to travel within a zone having economical fuel efficiency.
To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, there is provided a driving apparatus including a three-dimensional (3D) sensor unit configured to obtain 3D data, a controller configured to collect travel state information from the obtained data and diagnose a travel state based upon the collected travel state information, and an output unit configured to display the diagnosis result.
The 3D sensor unit may include at least one of a stereo camera, a depth camera, a moving stereo camera and a Light Detection and Ranging (LIDAR) system.
The 3D data may be 3D distance data from another driving apparatus located at the front.
The travel state information may be acceleration or deceleration state information with respect to the 3D distance data.
The 3D sensor unit may capture a plurality of images, generate a stereo disparity image from the plurality of images, and obtain the 3D distance data from the disparity image.
The 3D sensor unit may include a laser generator configured to generate a laser pulse to transmit laser in a driving direction, and a laser sensor configured to detect a reflected wave produced when the laser is reflected back by another driving apparatus located at the front, measure the return time from transmission of the laser until the reflected wave returns or the number of times that the reflected wave is returned, and obtain the 3D distance data from the measurement result.
The controller may compare the 3D distance data so as to determine an acceleration or deceleration state based upon the comparison result, and the travel state information may be the determined acceleration or deceleration state.
The controller may diagnose the travel state as needing to stop an acceleration driving when the 3D distance data is less than a predetermined safe distance.
The 3D data may be lane recognition information related to a travel route.
The 3D sensor unit may receive images for the travel route, extract feature points from the received images, detect a lane from the feature points and recognize a lane with respect to the travel route.
The controller may determine whether or not a lateral motion has been generated based upon the lane recognition information, the lateral motion indicating that the driving apparatus is driven close to one side of the recognized lane, and the travel state information may be the determined lateral motion state information.
The controller may determine whether or not the number of lateral motions being generated, included in the lateral motion state information, exceeds a predetermined threshold number of lateral motions, and diagnose the travel state as needing to stop a lateral motion when it is determined to exceed the threshold number of lateral motions.
The controller may determine whether or not a distance from a predetermined lane, included in the lateral motion state information, is below a threshold distance, and diagnose the travel state as needing to stop the lateral motion when it is determined to be below the threshold distance.
The travel state information may be collected at predetermined time intervals.
The output unit may include at least one of a display unit, an audio output module, a lamp output module and a haptic module.
To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, there is provided a driving method including obtaining three-dimensional (3D) data, collecting travel state information from the obtained data, diagnosing a travel state based upon the collected travel state information, and displaying the diagnosis result.
The 3D data may be obtained by at least one of a stereo camera, a depth camera, a moving stereo camera and a Light Detection and Ranging (LIDAR) system.
The 3D data may be 3D distance data from another driving apparatus located at the front and/or lane recognition information with respect to a travel route.
The obtaining of the 3D data may include capturing a plurality of images, generating a stereo disparity image from the plurality of images, and obtaining the 3D distance data from the disparity image.
The obtaining of the 3D data may include generating a laser pulse and transmitting laser in a driving direction, detecting a reflected wave produced when the laser is reflected back by another driving apparatus located at the front, measuring the return time from transmission of the laser until the reflected wave returns or the number of times that the reflected wave is returned, and obtaining the 3D distance data from the measurement result.
The collecting of the travel state information may include comparing the 3D distance data, and determining an acceleration or deceleration state based upon the comparison result, wherein the travel state information may be the determined acceleration or deceleration state.
The diagnosing of the travel state may include determining whether or not the 3D distance data is below a predetermined safe distance when the state information indicates an acceleration state, and diagnosing the travel state as needing to stop the acceleration when it is determined to be less than the safe distance.
The obtaining of the 3D data may include receiving images related to a travel route, extracting feature points from the received image, detecting a lane from the feature points to generate a lane recognition image including lane information, and recognizing a lane with respect to the travel route based upon the generated image.
The collecting of the travel state information may include determining whether or not a lateral motion has been generated based upon the lane recognition information, the lateral motion indicating that the driving apparatus is driven close to one side of the recognized lane, wherein the travel state information may be the determined lateral motion state information.
The diagnosing of the travel state may include determining whether or not the number of lateral motions being generated, included in the lateral motion state information, exceeds a predetermined threshold number of lateral motions and/or a distance from a predetermined lane, included in the lateral motion state information, is below a threshold distance, and diagnosing the travel state as needing to stop the lateral motion when it is determined to exceed the threshold number of lateral motions or to be below the threshold distance.
In accordance with the detailed description, an acceleration and a lateral motion of a driving apparatus can be recognized more accurately by using a 3D sensor, as compared to the existing sensor-based recognition, and accordingly it can be determined precisely whether or not the driving apparatus is making an eco-driving.
A travel state of the driving apparatus and a traveling zone having economical fuel efficiency can be displayed; accordingly, a driver can determine whether or not the driving apparatus is making an eco-driving and safe driving, and practice eco-driving so as to form an eco-driving habit.
Also, guidance information for eco-driving can be provided in real time, so as to reduce energy consumption due to unnecessary driving and enhance fuel efficiency.
FIG. 1 is a block diagram of a driving apparatus in accordance with an exemplary embodiment of the present disclosure;
FIG. 2 is a flowchart showing a traveling (driving) process using a 3D sensor in accordance with an exemplary embodiment;
FIG. 3 is a flowchart showing a traveling process using 3D distance data in accordance with one exemplary embodiment;
FIG. 4 is a view showing an exemplary embodiment that the driving apparatus obtains the 3D distance data;
FIG. 5 is a view showing another exemplary embodiment that the driving apparatus obtains the 3D distance data;
FIG. 6 is a flowchart showing a travel state information collection process and a travel state determination (diagnosis) process in accordance with one exemplary embodiment;
FIG. 7 is a flowchart showing a traveling process using a lane recognition in accordance with another exemplary embodiment;
FIG. 8A is a flowchart showing a process that the driving apparatus recognizes a lane;
FIG. 8B is a view showing an image that the driving apparatus recognizes the lane; and
FIG. 9 is a view showing a displayed state of a determination (diagnosis) result in accordance with an exemplary embodiment.
The exemplary embodiments of the present disclosure may be stand-alone, and additionally be applicable to various types of terminals such as mobile terminals, telematics terminals, smart phones, portable terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), laptop computers, tablet PCs, Wibro terminals, Internet Protocol Television (IPTV) terminals, television sets, 3D television sets, imaging devices, navigation terminals, Audio Video Navigation (AVN) terminals and the like.
The exemplary embodiments of the present disclosure may be implemented in a program command format capable of being executed by various computer devices so as to be recorded in a computer-readable medium. The computer-readable medium may include program commands, file data, data structures and the like, independently or in combination. The program commands recorded in the medium may be ones designed and constructed specifically for the present disclosure, or ones known and available to those skilled in computer software. Examples of the computer-readable media include magnetic media such as hard disks, floppy disks and magnetic tape, optical media such as CD-ROM and DVD, magneto-optical media such as floptical disks, and hardware devices specifically constructed to store program commands, such as ROM, RAM and flash memory. Examples of the program commands include high-level language codes executable by a computer using an interpreter or the like, as well as machine language codes created by a compiler. The hardware device may be configured to function as one or more software modules for executing the operations according to the present disclosure, and vice versa.
Technical terms used in this specification are merely used to illustrate specific embodiments, and it should be understood that they are not intended to limit the present disclosure. Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meaning as those generally understood by a person of ordinary skill in the art to which the present disclosure belongs, and should not be construed in an excessively comprehensive or an excessively restricted meaning. In addition, if a technical term used in the description of the present disclosure is an erroneous term that fails to clearly express the idea of the present disclosure, it should be replaced by a technical term that can be properly understood by the person skilled in the art.
A singular representation may include a plural representation unless it represents a definitely different meaning from the context. Terms such as "include" or "has" used herein should be understood as indicating the existence of several components or several steps disclosed in the specification, and it may also be understood that some of the components or steps may not be included, or that additional components or steps may further be included.
Hereinafter, a driving apparatus associated with the present disclosure will be described in more detail with reference to the accompanying drawings. A suffix "module" or "unit" used for constituent elements disclosed in the following description is merely intended for easy description of the specification, and the suffix itself does not give any special meaning or function.
In describing the present invention, if a detailed explanation for a related known function or construction is considered to unnecessarily divert from the gist of the present invention, such explanation has been omitted but would be understood by those skilled in the art. The accompanying drawings are used to help easily understand the technical idea of the present invention, and it should be understood that the idea of the present invention is not limited by the accompanying drawings.
Hereinafter, description will be given in detail of the exemplary embodiments of the present disclosure with reference to the accompanying drawings.
FIG. 1 is a block diagram of a driving apparatus in accordance with an exemplary embodiment of the present disclosure.
As shown in FIG. 1, a driving apparatus 100 may include a three-dimensional (3D) sensor unit 110, a controller 120, a memory 130, an output unit 140 and a travel information collector 150.
The 3D sensor unit 110 is a camera system for simultaneously capturing front, rear, and/or side directions by using a rotatable reflector, a condenser and an imaging device (capturing device). The 3D sensor unit 110 may be applicable to security, surveillance cameras, robot vision and the like. The rotatable reflector may be formed in various shapes, such as a hyperboloid type, a spherical type, a conical type, a combined type and the like. The 3D sensor unit 110 may include at least one pair of cameras (stereo cameras or stereoscopic cameras), which are installed on the same surface on the same central axis while being horizontally spaced apart from each other, or a single camera.
A Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) may be employed as the imaging device of the 3D sensor unit 110. An image (namely, a front image) projected on an image surface of the imaging device is an image reflected by the rotatable reflector, so it is a distorted image which is not appropriate to be viewed (observed) by a human being as it is. Therefore, the 3D sensor unit 110 may convert coordinates of an output of the imaging device by using a microprocessor or the like to create a new panoramic image for a correct observation of the image.
The 3D sensor unit 110 may include at least one of a stereo camera, a depth camera, a moving stereo camera and a light detection and ranging (LIDAR) system, for acquiring 3D distance data by three-dimensionally capturing all directions.
The stereo camera is an imaging device including a plurality of cameras. The images in all directions obtained via the 3D sensor unit 110 may provide two-dimensional (2D) information related to the surroundings of the 3D sensor unit 110. The 3D information related to the surroundings of the 3D sensor unit 110 may be obtained by using a plurality of images captured in different directions via the plurality of cameras.
The depth camera is a camera for extracting image and distance data by capturing or measuring obstacles. That is, the depth camera may capture obstacles, similar to a typical camera, to create image data, and measure the distance from the camera to the actual position corresponding to each image pixel so as to create distance data. For example, a 3D-TOF camera or the like may be employed as the depth camera. The 3D-TOF camera is, as its name indicates, a camera for measuring depth based on the time-of-flight principle. That is, the 3D-TOF camera may emit light modulated with an extremely short pulse (about 20 MHz) using infrared LEDs and calculate the phase time difference of the light coming back, thereby finding a depth value. The 3D-TOF camera employs a concept similar to radar.
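The phase-difference calculation mentioned above can be made concrete under common continuous-wave ToF assumptions that are not spelled out in this disclosure: with modulation frequency f_mod and measured phase shift φ, depth = (c / (2·f_mod)) · (φ / 2π), with an unambiguous range of c / (2·f_mod). The sketch below is a minimal illustration of that relation; the modulation frequency and function name are assumptions.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s


def tof_depth_from_phase(phase_rad: float, mod_freq_hz: float = 20e6) -> float:
    """Depth from the phase shift of amplitude-modulated infrared light:
    depth = (c / (2 * f_mod)) * (phase / (2 * pi)).
    Depths beyond the unambiguous range c / (2 * f_mod) wrap around."""
    unambiguous_range_m = SPEED_OF_LIGHT / (2.0 * mod_freq_hz)
    return unambiguous_range_m * (phase_rad / (2.0 * math.pi))


# At 20 MHz modulation, a phase shift of pi/2 corresponds to roughly 1.9 m.
print(f"{tof_depth_from_phase(math.pi / 2):.2f} m")
```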
The depth camera may filter off distance data upon capturing objects so as to choose obstacles present within a specific distance. By use of the distance data of the depth camera, target objects located within a specific depth field can be segmented, excluding obstacles located at a near field and a far field.
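A minimal sketch of this depth-field segmentation is given below: a depth map is masked so that only pixels whose distance lies inside a chosen near/far band remain. The band limits and the synthetic depth values are assumptions made for illustration.

```python
import numpy as np


def segment_by_depth(depth_m: np.ndarray, near_m: float = 2.0, far_m: float = 30.0) -> np.ndarray:
    """Boolean mask selecting pixels whose measured distance lies inside the
    [near_m, far_m] band, excluding obstacles in the near field and far field."""
    return (depth_m >= near_m) & (depth_m <= far_m)


# Synthetic 3x4 depth map in meters; True marks targets inside the band of interest.
depth = np.array([[0.5, 3.0, 12.0, 55.0],
                  [1.9, 8.0, 25.0, 40.0],
                  [2.1, 5.5, 31.0, 0.8]])
print(segment_by_depth(depth))
```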
The moving stereo camera denotes a camera in which the position of a stereo camera is actively changed depending on the distance from an obstacle such that the stereo camera has a fixed viewing (focal) angle with respect to the obstacle to be observed. The stereo camera may obtain images by disposing two cameras in parallel, and calculate the distance up to an obstacle according to the stereo disparity between the obtained images.
The stereo camera denotes a passive camera in which the optical axes are always disposed in parallel and fixed. On the other hand, the moving stereo camera may fix the observation angle by actively changing the geometric positions of the optical axes.
Controlling of the observation angle of the stereo camera according to a distance of an obstacle is referred to as an observation angle control. The observation angle control stereo camera may always constantly maintain a stereo disparity with respect to moving obstacles (objects) so as to provide more natural stereo image to a stereo image observer, and also provide useful information in view of obstacle distance measurement or stereo image processing.
The LIDAR system may detect the existence and the distance of an obstacle located in front of the driving apparatus 100. The LIDAR system is a type of active remote sensing system, by which desired information is acquired without direct contact with an object, using the same principle as a radar. The LIDAR system may shoot (fire) laser beams at a target whose information is needed and detect the electromagnetic waves reflected back by the target and their energy change, thereby obtaining the desired distance information.
The LIDAR system may be classified into three types, including Differential Absorption LIDAR (DIAL), a Doppler LIDAR and a range finder LIDAR, according to a purpose or target desired to be measured. The DIAL is utilized to measure concentration of vapor, ozone, pollutant and the like in the air using two lasers having different absorption levels with respect to an object desired to be measured. The Doppler LIDAR is utilized to measure a movement speed of an object using a Doppler principle. However, LIDAR generally indicates the range finder LIDAR, which combines information related to distances from objects using a Global Positioning System (GPS), an Inertial Navigation System (INS) and a laser scanner system, so as to acquire 3D topographic information.
When employing the LIDAR system in the driving apparatus 100, various sensors implemented by GPS, INS, 3D LIDAR and photometry may be combined and mounted in the driving apparatus 100, so as to measure the positions of regions along the travel route and acquire visual information while the driving apparatus 100 is driven.
The controller 120 may control an overall operation of the driving apparatus 100. For example, the controller 120 may carry out control of various driving units for driving the driving apparatus 100, collection of travel state information, diagnosis of a travel state, control associated with an eco-driving, calculation and processing. Also, the controller 120 may carry out a variety of controls for eco-driving to be explained with reference to FIGS. 2 to 9.
The memory 130 may store a program for operations of the controller 120, and temporarily store input/output data (for example, driving mode, driving information, eco-driving state, etc.). The memory 130 may store data related to vibration and sound of various patterns, which are output in response to touch inputs on the touch screen.
In several exemplary embodiments, the memory 130 may store software components, which include an operating system (not shown), a module (not shown) functioning as a wireless communication unit, a module (not shown) operating together with a user input unit, a module (not shown) operating together with an A/V input unit, a module operating together with an output unit 140. The operating system (for example, LINUX, UNIX, OS X, WINDOWS, Chrome, Symbian, iOS, Android, VxWorks or other embedded operating systems) may include various software components and/or drivers for controlling system tasks such as memory management, power management and the like.
The memory 130 may include at least one storage medium of a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disk, optical disk, and the like. Also, the driving apparatus 100 may operate a web storage which performs the storage function of the memory 130 on the Internet.
The output unit 140 may generate outputs relating to an audio signal, a video signal or a tactile signal. The output unit 140 may include a display unit 141, an audio output module 142, a lamp output module 143 and a haptic module 144.
The display unit 141 may output information processed in the driving apparatus 100. For example, while the driving apparatus 100 is driven, the display unit 141 may display User Interface (UI) or Graphic User Interface (GUI) relating to the driving.
The display unit 141 may be implemented using, for example, at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT-LCD), an Organic Light-Emitting Diode (OLED), a flexible display, a three-dimensional (3D) display, an e-ink display or the like.
Some of such display units 141 may be implemented as a transparent type or an optically transparent type through which the exterior is visible, which is referred to as a "transparent display". A representative example of the transparent display may include a Transparent OLED (TOLED), and the like. The rear surface of the display unit 141 may also be implemented to be optically transparent.
The display unit 141 may be implemented in two or more in number according to a configured aspect of the driving apparatus 100. For instance, a plurality of the display units 141 may be arranged on one surface to be spaced apart from or integrated with each other, or may be arranged on different surfaces.
Here, if the display unit 141 and a touch sensitive sensor (referred to as a touch sensor) have a layered structure therebetween (the structure may be referred to as a touch screen), the display unit 141 may be used as an input device as well as an output device. The touch sensor may be implemented as a touch film, a touch sheet, a touchpad, and the like.
The touch sensor may be configured to convert changes of a pressure applied to a specific part of the display unit 141, or a capacitance occurring from a specific part of the display unit 141, into electric input signals. Also, the touch sensor may be configured to sense not only a touched position and a touched area, but also a touch pressure.
When touch inputs are sensed by the touch sensors, corresponding signals are transmitted to a touch controller (not shown). The touch controller processes the received signals, and then transmits corresponding data to the controller 120. Accordingly, the controller 120 may sense which region of the display unit 141 has been touched.
The audio output module 142 may output audio data stored in the memory 130 in a recording mode, a voice recognition mode and the like. The audio output module 142 may also output a sound signal relating to a function (a driving mode change, an eco-driving indication, an eco-driving warning and the like) performed in the driving apparatus 100.
The lamp output module 143 may output state information relating to the driving apparatus 100 using lamps, and differently display the state using brightness of a lamp, a color of a lamp, flicking of a lamp and the like. The lamp output module 143 may have functions of lighting and signaling, and use as a light source a Light Emitting Diode (LED), a projection lamp, a position lamp, a halogen lamp and the like.
The haptic module 144 generates various tactile effects which a user can feel. A representative example of the tactile effects generated by the haptic module 144 includes vibration. Vibration generated by the haptic module 144 may have a controllable intensity, a controllable pattern, and so on. For instance, different vibration may be output in a synthesized manner or in a sequential manner.
The haptic module 144 may generate various tactile effects, including not only vibration, but also arrangement of pins vertically moving with respect to a skin being touched (contacted), air injection force or air suction force through an injection hole or a suction hole, touch by a skin surface, presence or absence of contact with an electrode, effects by stimulus such as an electrostatic force, reproduction of cold or hot feeling using a heat absorbing device or a heat emitting device, and the like.
The haptic module 144 may be configured to transmit tactile effects (signals) through a user's direct contact, or a user's muscular sense using a finger or a hand. The haptic module 144 may be implemented in two or more in number according to the configuration of the driving apparatus 100.
The travel information collector 150 may collect travel state information relating to the driving apparatus 100 by receiving it in real time, and may include an acceleration sensor 151, a gyro sensor 152 and a steering sensor 153.
The acceleration sensor 151 is a device for converting an acceleration change in one direction into an electrical signal. The acceleration sensor 151 may generally be configured by mounting two axes or three axes in one package. It may alternatively require only the Z-axis depending on the usage environment. Hence, if an X-axial or Y-axial acceleration sensor should be used instead of a Z-axial acceleration sensor for some reason, the acceleration sensor may be mounted by erecting it on a main board using a separate plate.
The gyro sensor 152 is a sensor for measuring an angular speed of the driving apparatus 100 which performs a rotary motion, and may detect a rotated angle from each reference direction. For example, the gyro sensor 152 may detect each rotated angle, namely, an azimuth, a pitch and a roll, based upon three-directional axes.
The steering sensor 153 may include a rotor rotated in conjunction with a steering wheel of the driving apparatus 100, a gear rotated integrally with the rotor, a detecting part for detecting phase changes by rotation of a magnetic substance, which generates a magnetic force, an operating part for operating and outputting an input of the detecting part, and a PCB substrate and a housing for mounting the operating part. However, without limit to this structure, the steering sensor 153 may alternatively be a variety of steering wheel angle detecting devices, which may further include additional components or be configured without part of components.
FIG. 1 shows the driving apparatus 100 having various components, but it is understood that implementing all of the illustrated components is not a requirement. Greater or fewer components may alternatively be implemented.
FIG. 2 is a flowchart showing a traveling (driving) process using a 3D sensor in accordance with an exemplary embodiment.
As shown in FIG. 2, first, the driving apparatus 100 may acquire 3D data from the 3D sensor unit 110 (S201).
The 3D sensor unit 110 may use one of a stereo camera, a depth camera, a moving stereo camera and a LIDAR system to obtain 3D information relating to an object to be captured using images in all directions. The 3D sensor unit 110 may convert electric signals, which are created by the stereo camera or the like, into image signals, and generate a 3D map around the driving apparatus 100 from images implemented by the image signals. The generated 3D map may be stored in the memory 130, and updated every time the position of the driving apparatus 100 changes.
In accordance with the one exemplary embodiment, the 3D data may be related to a position and/or a distance of an obstacle (object), which is located in front of the driving apparatus 100, and/or another driving apparatus. Also, the 3D data may be lane recognition information with respect to a travel route (traveling route, traveling path, driving path) of the driving apparatus 100. The 3D data obtained from the 3D sensor unit 110 may be combined data of a plurality of information including a distance from another driving apparatus located in front of the driving apparatus 100 and lane recognition information with respect to the travel route of the driving apparatus 100.
As to how the driving apparatus 100 obtains the 3D data from the 3D sensor unit 110, a 3D data obtaining method for the case where the 3D data relates to the distance from another driving apparatus located in front of the driving apparatus 100 will be described in detail with reference to FIGS. 4 and 5. Also, a 3D data obtaining method for the case where the 3D data is the lane recognition information with respect to the travel route of the driving apparatus 100 will be described in detail with reference to FIGS. 8A and 8B.
Afterwards, the driving apparatus 100 may collect travel state information from the obtained 3D data (S202).
The obtained 3D data may be distance data up to an obstacle and/or another driving apparatus located in front of the driving apparatus 100 or lane recognition data relating to the travel route of the driving apparatus 100, based upon the 3D image obtained by the 3D sensor unit 110.
The travel state information of the driving apparatus 100 may include an acceleration or deceleration state of the driving apparatus 100, sudden acceleration and deceleration states thereof, and a lateral motion state within a lane where the driving apparatus 100 is moving. Also, the travel state information of the driving apparatus 100 may include several information, possibly obtained from various sensors of the travel information collector 150. Such several information may include information related to an acceleration or deceleration state, a steering state, a speed of the driving apparatus 100, RPM, an energy consumption ratio, a friction coefficient of a driving surface, a fuel consumption ratio, a length of the driving apparatus 100, headway and the like.
Also, the travel state information of the driving apparatus 100 may include several types of information obtainable via a separately disposed information communication unit (not shown), in addition to the information obtained by the 3D sensor unit 110. Such information may include geometric information relating to roads (location information, plane slope, longitudinal slope and road curvature), traffic condition information (traffic congestion, traffic volume, block speed, traffic flow and headway) and weather information (rainfall, snowfall, fog, friction coefficient of the driving surface). This information may be obtained in order to guide and control a correct travel condition of the driving apparatus 100 together with the travel state information obtained via the 3D sensor unit 110.
The travel state information may be stored in the memory 130 and updated while the driving apparatus 100 is driven, to be used for determining (diagnosing) the travel state of the driving apparatus 100.
The controller 120 may then determine (diagnose) the travel state of the driving apparatus 100 based upon the collected travel state information (S203).
The diagnosis of the travel state may be carried out to help the eco-driving of the driving apparatus 100. Eco-driving aims to reduce fuel consumption by reducing unnecessary movement while the driving apparatus 100 is driven and to realize economically and temporally efficient driving. Therefore, the travel state diagnosis may be carried out based upon determination as to whether or not the driving apparatus 100 is unnecessarily accelerating or decelerating, making an unnecessary lateral motion within a lane, or suddenly accelerating or decelerating.
In detail, the controller 120 may determine the current travel state of the driving apparatus 100 based upon the collected travel state information. In accordance with one exemplary embodiment, the controller 120 may determine an acceleration or deceleration state of the driving apparatus 100 based on 3D distance data from another driving apparatus located in front of the driving apparatus 100, among the collected travel state information. According to another exemplary embodiment, the controller 120 may determine whether or not the driving apparatus 100 has taken a lateral motion within the lane based upon lane recognition information related to the travel route of the driving apparatus 100, among the collected travel state information.
Also, the controller 120 may diagnose whether the driving apparatus 100 is economically driving according to the travel state determination result with respect to the driving apparatus 100. According to the travel state determination result, if there occurs an unnecessary acceleration or deceleration, an unnecessary lateral motion within the lane or a sudden stop or sudden deceleration, the driving may cause unnecessary fuel consumption, so it may be diagnosed as an economically inefficient driving. Hence, the controller 120 may diagnose as requiring a type of driving necessary to reduce or stop the unnecessary travel state. For example, when the driving apparatus 100 is unnecessarily accelerating, the controller 120 may make a diagnosis that the driving apparatus 100 should stop acceleration and maintain a constant speed. That is, the controller 120 may make a diagnosis for requiring deceleration of the driving apparatus 100.
As another example, when the driving apparatus 100 is making the unnecessary lateral motion within the lane, the controller 120 may make a diagnosis that the driving apparatus 100 should stop the unnecessary steering and move straight within the lane. That is, the controller 120 may make a diagnosis for requiring the lateral motion to stop.
In accordance with one exemplary embodiment, the controller 120 may diagnose the travel state of the driving apparatus 100 based on a plurality of travel states. For example, the controller 120 may diagnose the travel state of the driving apparatus 100 by combining the acceleration or deceleration state and the lateral motion state of the driving apparatus 100.
Hereinafter, description will be given in more detail of the diagnosis of the travel state with reference to FIGS. 3 to 8.
Afterwards, the driving apparatus 100 may display the diagnosis results (S204).
For example, according to the diagnosis results, if the controller 120 has diagnosed that the driving apparatus 100 should stop acceleration, the output unit 140 may display a warning (caution) indicating that the driving apparatus 100 is making uneconomic driving due to the unnecessary acceleration, or a note indicating that the driving apparatus 100 should reduce speed and maintain constant speed.
The displaying of the diagnosis results may be implemented by outputting image and/or voice via the display unit 141 and/or the audio output module 142. The driving apparatus 100 may display the travel state information and/or the travel state diagnosis results on the display unit 141 using various UIs and GUIs. Also, the display unit 141 may digitize the diagnosis results and classify them into counts or scales so as to represent them with numerical values, or display the diagnosis results using a text format such as a message, a graph, a predetermined figure and the like. The audio output module 142 may announce information for inducing eco-driving from the diagnosis results by voice, or output a warning message by voice. Also, the audio output module 142 may indicate an uneconomic driving using a warning sound such as a beep.
The diagnosis results may alternatively be displayed via the lamp output module 143. For example, the lamp output module 143 may classify the diagnosis results into scales, and display the diagnosis results by means of brightness, color, flicking times or flicking speed of a lamp and the like according to whether the travel state is economic or uneconomic.
In addition, the diagnosis results may be displayed via the haptic module 144. In detail, if the driving apparatus 100 is making the uneconomic driving, the haptic module 144 may perform displaying of a warning thereof by various tactile effects using vibration, air injection, or presence or absence of contact with an electrode, effects by stimulus such as an electrostatic force, reproduction of cold or hot feeling, or the like.
Hereinafter, description will be given in detail of a method for guiding driving using a 3D sensor.
FIG. 3 is a flowchart showing a traveling process using 3D distance data in accordance with one exemplary embodiment.
As shown in FIG. 3, the driving apparatus 100 may acquire 3D distance data via the 3D sensor unit 110 (S211). The 3D distance data may be a position of another driving apparatus located in front of the driving apparatus 100 and/or a distance from the front driving apparatus.
In accordance with one exemplary embodiment, the 3D sensor unit 110 may be implemented as a stereo camera. Referring to FIG. 4, the 3D sensor unit 110 may obtain a plurality of stereo images, namely, a first stereo image 401 and a second stereo image 402, by capturing the front of the driving apparatus 100 using the stereo camera. Afterwards, the 3D sensor unit 110 may generate a stereo disparity image 403 from the plurality of images. The stereo disparity image 403 appears relatively brighter where a front obstacle is located near the stereo camera, and relatively darker where it is located far from the stereo camera. The 3D sensor unit 110 may measure the distance from the front obstacle using this brightness information (namely, the depth of the image).
According to the method, the 3D sensor unit 110 of the driving apparatus 100 may measure a distance between the driving apparatus 100 and another driving apparatus located in front of the driving apparatus 100.
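A minimal sketch of the underlying stereo relation follows: for a calibrated pair, depth Z = f · B / d, where f is the focal length in pixels, B the camera baseline and d the disparity. The numeric values and function name below are assumptions for illustration; a production system would obtain the disparity image 403 from a calibrated stereo matcher.

```python
import numpy as np


def disparity_to_distance(disparity_px: np.ndarray,
                          focal_px: float,
                          baseline_m: float) -> np.ndarray:
    """Convert a stereo disparity image (pixels) to metric distance via Z = f * B / d.
    Pixels with zero or negative disparity (no match) are mapped to infinity."""
    distance_m = np.full(disparity_px.shape, np.inf, dtype=float)
    valid = disparity_px > 0
    distance_m[valid] = focal_px * baseline_m / disparity_px[valid]
    return distance_m


# Bright (large-disparity) pixels are near the camera, dark ones are far away.
disparity = np.array([[64.0, 8.0],
                      [32.0, 0.0]])
print(disparity_to_distance(disparity, focal_px=800.0, baseline_m=0.3))
```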
Alternatively, according to another exemplary embodiment, the 3D sensor unit 100 may be configured as the LIDAR system. The LIDAR system may shoot (fire) laser beams to the front of the driving apparatus 100 and measure a return time that the beams come back by being reflected by another driving apparatus located at the front, so as to obtain distance data from the front driving apparatus. In detail, referring to FIG. 5, a laser generator 501 of the LIDAR system generates a laser pulse to be used for distance measurement to transmit laser 502 in a driving direction. The laser 502 transmitted from the LIDAR system 500 reaches another driving apparatus 100' located in front of the driving apparatus 100, and is reflected by the another driving apparatus 100' to come back in form of reflected wave 503. A laser sensor 504 of the LIDAR system 500 then detects the reflected wave 503 and measures a time taken from the laser 502 being transmitted to the reflected wave 503 being returned, and an energy change level. The laser sensor 504 may calculate a distance up to the another driving apparatus 100' based on the measurement result. For example, if it is assumed that a return time is t, a speed of laser beam is c and a distance up to the another driving apparatus 100' is R, the laser sensor 504 may calculate the distance according to the following Equation.
R = (c × t) / 2
Also, the driving apparatus 100 may utilize the return time that the laser 502 transmitted from the LIDAR system 500 is returned as the reflected wave 503 or the number of times that the laser 502 is returned as the reflected wave 503, as the 3D data, instead of the 3D distance data.
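Applying the range equation directly, the sketch below converts a measured round-trip time into the one-way distance R = c·t / 2; the sample return time is an assumed value for illustration.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s


def lidar_range_m(return_time_s: float) -> float:
    """One-way distance to the reflecting vehicle from the laser round-trip time."""
    return SPEED_OF_LIGHT * return_time_s / 2.0


# A 200 ns round trip corresponds to roughly 30 m to the driving apparatus ahead.
print(f"{lidar_range_m(200e-9):.1f} m")
```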
Next, the driving apparatus 100 may collect travel state information related to acceleration or deceleration from the obtained 3D distance data (S212).
The controller 120 may continuously store the 3D distance data in the memory 130, and thereafter collect the travel state information related to acceleration or deceleration of the driving apparatus 100 based on the 3D distance data.
In general, when the driving apparatus 100 is getting closer to another driving apparatus located at the front over time, it may be determined that the driving apparatus 100 is accelerating over a predetermined speed or the front driving apparatus is decelerating. On the contrary, when the driving apparatus 100 is getting farther from another driving apparatus located at the front over time, it may be determined that the driving apparatus 100 is decelerating or the front driving apparatus is accelerating. Here, if the driving apparatus 100 is determined to be located within a shorter distance than a safe distance from another driving apparatus located in front of the driving apparatus 100, regardless of the travel state of the front driving apparatus, it may be diagnosed that the accelerated driving should be stopped.
Referring to FIG. 6, the controller 120 may compare the collected 3D distance data with each other (S61). The controller 120 may determine whether or not the distance is increased or decreased by the comparison of the 3D distance data (S62). According to the comparison result, if the distance is decreased, the controller 120 may determine the travel state of the driving apparatus 100 as accelerating (S63). On the other hand, if the distance is increased according to the comparison result, the controller 120 may determine the travel state of the driving apparatus 100 as decelerating (S64). Or, if the distance is rapidly decreased or increased according to the comparison of the collected 3D distance data, the travel state of the driving apparatus 100 may be diagnosed as a sudden acceleration or deceleration. Therefore, the controller 120 may collect the travel state information related to acceleration or deceleration of the driving apparatus 100 according to the determination results.
Also, the controller 120 may determine the travel state of the driving apparatus 100 based upon information related to a return time taken until the laser emitted from the 3D sensor unit 110 reaches a front driving apparatus and then returns. The controller 120 may determine the travel state of the driving apparatus 100 as accelerating when the collected time information is decreased as a time elapses. On the contrary, the controller 120 may determine the travel state of the driving apparatus 100 as decelerating when the time information is increased as a time elapses.
The controller 120 may alternatively diagnose the travel state of the driving apparatus 100 based on the number of times that the laser emitted from the 3D sensor unit 110 reaches the front driving apparatus and then returns. The controller 120 may determine the travel state of the driving apparatus 100 as accelerating when the collected number of times gradually increases over a predetermined period. On the contrary, the controller 120 may determine the travel state of the driving apparatus 100 as decelerating when the collected number of times gradually decreases over a predetermined period.
Hence, the controller 120 may determine the acceleration or deceleration state of the driving apparatus 100 based upon the change in the return time or the change in the number of returns over a predetermined time, and collect acceleration or deceleration state information.
In accordance with one exemplary embodiment, the collection of the travel state information may be carried out at predetermined time intervals. For example, the controller 120 may collect the acceleration or deceleration state information of the driving apparatus 100 from the 3D distance data for a predetermined period of time. Alternatively, the controller 120 may collect the acceleration or deceleration state information of the driving apparatus 100 from the return time information or the information related to the number of returns for a predetermined period of time.
Afterwards, the driving apparatus 100 may diagnose the travel state based upon the collected acceleration or deceleration travel state information (S213).
Referring to FIG. 6, the controller 120 may diagnose whether the distance between the driving apparatus 100 and another driving apparatus located in front of the driving apparatus 100 has decreased below a safe distance (S65). When the distance is within the safe distance, the controller 120 may make a diagnosis such that the driving apparatus 100 should stop the acceleration for eco-driving (S66). Here, the safe distance may be predetermined and stored in the memory 130, and may indicate a distance between driving apparatuses prescribed by traffic rules.
Also, the controller 120 may diagnose to stop sudden acceleration or deceleration, which causes unnecessary fuel consumption and abrasion of components, when the driving apparatus 100 is diagnosed as the sudden acceleration or deceleration state.
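A hedged sketch of the FIG. 6 flow is given below: consecutive 3D distance samples are compared (S61/S62) to classify acceleration or deceleration (S63/S64), sudden changes are flagged, and acceleration is diagnosed as needing to stop when the headway falls below the safe distance (S65/S66). The thresholds and the sample format are assumptions introduced for this example.

```python
from typing import List


def diagnose_headway(distances_m: List[float],
                     safe_distance_m: float = 50.0,
                     sudden_change_m: float = 10.0) -> str:
    """Compare consecutive front-vehicle distance samples and diagnose the travel state."""
    if len(distances_m) < 2:
        return "insufficient data"
    delta = distances_m[-1] - distances_m[-2]
    if abs(delta) >= sudden_change_m:
        diagnosis = "sudden acceleration/deceleration: should be stopped"
    elif delta < 0:
        diagnosis = "accelerating (closing on the front driving apparatus)"
    else:
        diagnosis = "decelerating (falling back from the front driving apparatus)"
    if distances_m[-1] < safe_distance_m and delta < 0:
        diagnosis += " -> within safe distance: stop acceleration for eco-driving"
    return diagnosis


# 3D distance data collected over a predetermined period (meters).
print(diagnose_headway([60.0, 55.0, 48.0]))
```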
The foregoing description gives only one example of how the controller 120 diagnoses the travel state of the driving apparatus 100. However, the method is not limited thereto; methods using various sensors and calculating apparatuses may be employed to determine an acceleration or deceleration state, or a sudden acceleration or deceleration state, of the driving apparatus 100.
Finally, the driving apparatus 100 may display the diagnosis results (S214).
For example, when the driving apparatus 100 has been diagnosed as having to stop acceleration according to the diagnosis result of the controller 120, the output unit 140 may display a warning indicating that the driving apparatus 100 is making uneconomic driving due to unnecessary acceleration, or a note indicating that the driving apparatus 100 should slow down and maintain a predetermined speed.
As aforementioned, the displaying of the diagnosis result may be carried out through image, voice, lamp and various tactile stimuli using one or more of the display unit 141, the audio output module 142, the lamp output module 143 and the haptic module 144.
FIG. 7 is a flowchart showing a traveling process using a lane recognition in accordance with another exemplary embodiment.
As shown in FIG. 7, the driving apparatus 100 may recognize a lane on a travel route via the 3D sensor unit 110 (S221).
Referring to FIGS. 8A and 8B, the controller 120 may receive a 3D image for the travel route of the driving apparatus 100 via the 3D sensor unit 110 (S81).
Afterwards, the controller 120 may extract feature points from the 3D image 81 (see FIG. 8A) obtained by the 3D sensor unit 110 (S82). The feature points may be extracted from the overall area or from a specific region of interest along a horizontal axis or a vertical axis, depending on the extraction algorithm; the algorithm, the number of feature points and the extraction area are not limited.
The controller 120 may detect a lane using a color recognition algorithm from the extracted feature points (S83). In general, a road is black, a center lane is yellow, and a normal lane is white. Thus, the controller 120 may detect these colors in the images to distinguish lane portions from non-lane portions, so as to recognize a lane on the travel route of the driving apparatus 100.
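A minimal sketch of the color-recognition step is shown below using OpenCV-style HSV thresholds for white and yellow lane markings; the threshold values, and the assumption that the input is a BGR image array, are illustrative only.

```python
import cv2
import numpy as np


def lane_color_mask(bgr_image: np.ndarray) -> np.ndarray:
    """Binary mask keeping white (normal lane) and yellow (center lane) pixels
    while discarding the dark road surface."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    white = cv2.inRange(hsv, np.array([0, 0, 200]), np.array([179, 40, 255]))
    yellow = cv2.inRange(hsv, np.array([20, 80, 80]), np.array([35, 255, 255]))
    return cv2.bitwise_or(white, yellow)


# Tiny synthetic image: dark road pixel, white marking pixel, yellow marking pixel (BGR).
image = np.zeros((1, 3, 3), dtype=np.uint8)
image[0, 1] = (255, 255, 255)
image[0, 2] = (0, 230, 230)
print(lane_color_mask(image))  # road pixel -> 0, marking pixels -> 255
```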
Afterwards, the controller 120 may remove noise from the images through several types of filtering (S84). The filtering method is not limited, and the noise removal may be unnecessary depending on the image state or the 3D sensor system.
The controller 120 may create an image 82 (see FIG. 8B), from which the lane of the travel route of the driving apparatus 100 has been extracted through these processes, and recognize lane information (S85).
When the lane is curved, the 3D sensor unit 110 may recognize the lane using a curve equation for the case where the driving apparatus 100 travels along a curved road. Here, the 3D sensor unit 110 may calculate curve information following the center of the lane with respect to a plurality of feature points corresponding to the curve. The calculated curve information may be used to improve lane-keeping performance by minimizing the influence of the calibration state of the 3D sensor unit 110. As one example, the 3D sensor unit 110 may calculate the curve information following the center of the lane by applying one of a least square method, RANSAC, a general Hough transform, spline interpolation and the like to the plurality of feature points corresponding to the detected curve.
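One way to realize the curve-information step, sketched below under the assumption that lane-center feature points are available as (x, y) image coordinates, is an ordinary least-squares fit of a second-order polynomial; RANSAC, a generalized Hough transform or spline interpolation could be substituted as noted above.

```python
import numpy as np


def fit_lane_center_curve(xs: np.ndarray, ys: np.ndarray, degree: int = 2) -> np.ndarray:
    """Least-squares polynomial fit through lane-center feature points;
    returns the polynomial coefficients (highest order first)."""
    return np.polyfit(xs, ys, degree)


# Assumed feature points lying roughly on the curve y = 0.01 * x^2 + 2.
xs = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
ys = 0.01 * xs ** 2 + 2.0 + np.array([0.1, -0.05, 0.02, -0.1, 0.06])
coefficients = fit_lane_center_curve(xs, ys)
print(coefficients)                    # approximately [0.01, ~0, ~2]
print(np.polyval(coefficients, 25.0))  # predicted lane-center position at x = 25
```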
The driving apparatus 100 may then collect state information relating to lateral motion within the lane, based on the recognized lane (S222).
The controller 120 may store the lane recognition information relating to the travel route of the driving apparatus 100 in the memory 130 in order to use it for the collection of travel state information related to the driving apparatus 100.
When the driving apparatus 100 approaches (moves toward) one side of the recognized lane, the controller 120 may determine the travel state of the driving apparatus 100 as the lateral motion state. That is, when the driving apparatus 100 is located close to the left side of the recognized lane, the controller 120 may determine that the driving apparatus 100 is moving to the left. On the contrary, when the driving apparatus 100 is located close to the right side of the recognized lane, the controller 120 may determine that the driving apparatus 100 is moving to the right.
Hence, the controller 120 may collect the lateral motion state information related to the driving apparatus 100 from the determination results.
In accordance with one exemplary embodiment, the collection of the travel state information may be carried out at predetermined time intervals. For example, the controller 120 may collect the lane recognition information for a predetermined period of time and diagnose whether the driving apparatus 100 is performing the lateral motion.
Afterwards, the driving apparatus 100 may diagnose the travel state based on the collected lateral motion information (S223).
The controller 120 may diagnose whether the lateral motion of the driving apparatus 100 corresponds to an eco-driving state or an uneconomic driving state by comparison with a predetermined threshold value. For example, the threshold value may be a threshold number of lateral motions within a predetermined time. Alternatively, the threshold value may be a numerical value indicating the distance between the driving apparatus 100 and the recognized lane during the lateral motion. Hence, when the lateral motion of the driving apparatus 100 occurs more times than the threshold value, or when the driving apparatus 100 comes closer to the recognized lane than the threshold value, the controller 120 may diagnose the driving apparatus 100 as driving uneconomically and determine that the lateral motion should be stopped.
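A minimal sketch of this comparison is given below; both thresholds are assumptions for illustration rather than values from the specification.

    # Sketch only: diagnose uneconomic lateral motion against assumed thresholds.
    MAX_LATERAL_MOTIONS = 3     # lateral motions allowed per observation window (assumed)
    MIN_LANE_DISTANCE_M = 0.3   # closest allowed approach to the lane marking, in meters (assumed)

    def diagnose_lateral_motion(motion_count, min_distance_to_lane):
        if motion_count > MAX_LATERAL_MOTIONS or min_distance_to_lane < MIN_LANE_DISTANCE_M:
            return "uneconomic driving: stop lateral motion"
        return "eco-driving"

    print(diagnose_lateral_motion(5, 0.6))   # too many lateral motions -> warning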
Here, when the lateral motion occurs out of necessity, irrespective of eco-driving, for example due to a lane change or curve driving of the driving apparatus 100, the controller 120 may diagnose that the lateral motion does not need to be stopped. For this determination, the driving apparatus 100 may collect the relevant travel state information using various sensors and calculating apparatuses as well as a curved road recognition algorithm.
The foregoing description is merely one example of how the controller 120 diagnoses the travel state of the driving apparatus 100. The method is not limited thereto, and methods using various sensors and calculating apparatuses may be employed to determine the lateral motion state of the driving apparatus 100.
Finally, the driving apparatus 100 may display the diagnosis results (S224).
For example, when it is diagnosed that the driving apparatus 100 should stop accelerating, the output unit 140 may display a warning indicating that the driving apparatus 100 is driving uneconomically due to unnecessary acceleration, or a note indicating that the driving apparatus 100 should slow down and maintain a predetermined speed.
As aforementioned, the diagnosis result may be displayed through images, voice, a lamp, and various tactile stimuli using one or more of the display unit 141, the audio output module 142, the lamp output module 143 and the haptic module 144.
FIG. 9 is a view showing a displayed state of determination (diagnosis) results in accordance with an exemplary embodiment.
FIG. 9 shows an example in which the driving apparatus 100 combines the acceleration or deceleration diagnosis results and the lateral motion diagnosis results into one graph.
The output unit 140 may display the travel state diagnosis results in the form of a graph having two axes 91 and 92 and an indicator 95 indicating the location of the driving apparatus 100. The horizontal axis 91 of the graph indicates the lateral motion diagnosis result of the driving apparatus 100: as the indicator 95 moves farther from the center of the horizontal axis 91, it indicates that the driving apparatus 100 is making excessive lateral motion. Likewise, the vertical axis 92 of the graph indicates the acceleration or deceleration diagnosis result of the driving apparatus 100: as the indicator 95 moves farther from the center of the vertical axis 92, it indicates that the driving apparatus 100 is accelerating or decelerating excessively.
Therefore, when the indicator 95 is located close to the intersection of the axes 91 and 92, it represents a diagnosis result that the driving apparatus 100 is eco-driving without acceleration, deceleration, or lateral motion. On the contrary, when the indicator 95 is located far from the intersection, it represents a diagnosis result that the driving apparatus 100 is driving uneconomically (for example, accelerating when the indicator is far above the intersection, and making lateral motion when it is far to the left or right of the intersection).
The oval on the graph is a figure indicating whether or not the driving apparatus 100 is eco-driving. In detail, the external area 93 of the oval may indicate that the driving apparatus 100 is driving uneconomically, unnecessarily consuming fuel through acceleration, deceleration, or lateral motion. Here, the external area 93 may be displayed in red to indicate a warning. On the other hand, the internal area 94 of the oval may indicate that the driving apparatus 100 is driving economically without acceleration, deceleration, or lateral motion. Here, the internal area 94 may be displayed in green to indicate safety.
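As a purely illustrative sketch (FIG. 9 is descriptive and fixes no coordinate convention), the indicator position and the inner/outer oval coloring could be derived from normalized diagnosis scores; the score range and the oval semi-axes below are assumptions.

    # Sketch only: place the indicator from normalized diagnosis scores and decide
    # whether it falls inside the green oval (eco-driving) or outside it (warning).
    OVAL_HALF_WIDTH = 0.5    # horizontal semi-axis of the oval, lateral-motion axis (assumed)
    OVAL_HALF_HEIGHT = 0.7   # vertical semi-axis of the oval, acceleration axis (assumed)

    def indicator_zone(lateral_score, accel_score):
        """Scores in [-1, 1]; 0 means no lateral motion and no acceleration/deceleration."""
        # Inside the oval if (x/a)^2 + (y/b)^2 <= 1.
        inside = (lateral_score / OVAL_HALF_WIDTH) ** 2 + (accel_score / OVAL_HALF_HEIGHT) ** 2 <= 1.0
        return "green (eco-driving)" if inside else "red (uneconomic driving)"

    print(indicator_zone(0.1, 0.2))   # near the intersection of the axes -> green
    print(indicator_zone(0.8, 0.9))   # far from the intersection -> red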
The displaying method shown in FIG. 9 is merely one example of the various display methods disclosed in this specification. As described above, the diagnosis results may be displayed using images, sound, a lamp, various tactile stimuli, and the like.

Claims (25)

  1. A driving apparatus comprising:
    a three-dimensional (3D) sensor unit configured to obtain 3D data;
    a controller configured to collect travel state information from the obtained data and diagnose a travel state based upon the collected travel state information; and
    an output unit configured to display the diagnosis result.
  2. The apparatus of claim 1, wherein the 3D sensor unit comprises at least one of a stereo camera, a depth camera, a moving stereo camera and a Light Detection and Ranging (LIDAR) system.
  3. The apparatus of claim 1, wherein the 3D data is 3D distance data from another driving apparatus located at the front.
  4. The apparatus of claim 3, wherein the travel state information is acceleration or deceleration state information with respect to the 3D distance data.
  5. The apparatus of claim 3, wherein the 3D sensor unit captures a plurality of images, generates a stereo disparity image from the plurality of images, and obtains the 3D distance data from the disparity image.
  6. The apparatus of claim 3, wherein the 3D sensor unit comprises:
    a laser generator configured to generate a laser pulse to transmit laser in a driving direction; and
    a laser sensor configured to detect a reflected wave of the laser returned by being reflected from another driving apparatus located at the front, measure a return time from when the laser is transmitted until the reflected wave is returned or the number of times that the reflected wave is returned, and obtain the 3D distance data from the measurement result.
  7. The apparatus of claim 3, wherein the controller compares the 3D distance data and determines an acceleration or deceleration state based upon the comparison result,
    wherein the travel state information is the determined acceleration or deceleration state.
  8. The apparatus of claim 3, wherein the controller diagnoses the travel state as needing to stop an acceleration driving when the 3D distance data is less than a predetermined safe distance.
  9. The apparatus of claim 1, wherein the 3D data is lane recognition information related to a travel route.
  10. The apparatus of claim 9, wherein the 3D sensor unit receives images for the travel route, extracts feature points from the received images, detects a lane from the feature points and recognizes a lane with respect to the travel route.
  11. The apparatus of claim 9, wherein the controller determines whether or not a lateral motion has been generated based upon the lane recognition information, the lateral motion indicating that the driving apparatus is driven close to one side of the recognized lane,
    wherein the travel state information is the determined lateral motion state information.
  12. The apparatus of claim 11, wherein the controller determines whether or not the number of lateral motions being generated, included in the lateral motion state information, exceeds a predetermined threshold number of lateral motions, and diagnoses the travel state as needing to stop a lateral motion when it is determined to exceed the threshold number of lateral motions.
  13. The apparatus of claim 11, wherein the controller determines whether or not a distance from a predetermined lane, included in the lateral motion state information, is below a threshold distance, and diagnoses the travel state as needing to stop the lateral motion when it is determined to be below the threshold distance.
  14. The apparatus of claim 1, wherein the travel state information is collected at a predetermined time interval.
  15. The apparatus of claim 1, wherein the output unit comprises at least one of a display unit, an audio output module, a lamp output module and a haptic module.
  16. A driving method comprising:
    obtaining three-dimensional (3D) data;
    collecting travel state information from the obtained data;
    diagnosing a travel state based upon the collected travel state information; and
    displaying the diagnosis result.
  17. The method of claim 16, wherein the 3D data is obtained by at least one of a stereo camera, a depth camera, a moving stereo camera and a Light Detection and Ranging (LIDAR) system.
  18. The method of claim 16, wherein the 3D data is 3D distance data from another driving apparatus located at the front and/or lane recognition information with respect to a travel route.
  19. The method of claim 18, wherein the obtaining of the 3D data comprises:
    capturing a plurality of images;
    generating a stereo disparity image from the plurality of images; and
    obtaining the 3D distance data from the disparity image.
  20. The method of claim 18, wherein the obtaining of the 3D data comprises:
    generating a laser pulse and transmitting laser in a driving direction;
    detecting a reflected wave of the laser returned by being reflected from another driving apparatus located at the front;
    measuring a return time from when the laser is transmitted until the reflected wave is returned, or the number of times that the reflected wave is returned; and
    obtaining the 3D distance data from the measurement result.
  21. The method of claim 18, wherein the collecting of the travel state information comprises:
    comparing the 3D distance data; and
    determining an acceleration or deceleration state based upon the comparison result,
    wherein the travel state information is the determined acceleration or deceleration state.
  22. The method of claim 18, wherein the diagnosing of the travel state comprises:
    determining whether or not the 3D distance data is below a predetermined safe distance when the state information indicates an acceleration state; and
    diagnosing the travel state as needing to stop the acceleration when it is determined to be less than the safe distance.
  23. The method of claim 18, wherein the obtaining of the 3D data comprises:
    receiving images related to a travel route;
    extracting feature points from the received image;
    detecting a lane from the feature points to generate a lane recognition image including lane information; and
    recognizing a lane with respect to the travel route based upon the generated image.
  24. The method of claim 18, wherein the collecting of the travel state information comprises:
    determining whether or not a lateral motion has been generated based upon the lane recognition information, the lateral motion indicating that the driving apparatus moves close to one side of the recognized lane,
    wherein the travel state information is the determined lateral motion state information.
  25. The method of claim 18, wherein the diagnosing of the travel state comprises:
    determining whether or not the number of lateral motions being generated, included in the lateral motion state information, exceeds a predetermined threshold number of lateral motions and/or a distance from a predetermined lane, included in the lateral motion state information, is below a threshold distance; and
    diagnosing the travel state as needing to stop the lateral motion when it is determined to exceed the threshold number of lateral motions or to be below the threshold distance.
PCT/KR2011/006885 2011-09-16 2011-09-16 Driving apparatus and method using 3d sensor WO2013039273A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2011/006885 WO2013039273A1 (en) 2011-09-16 2011-09-16 Driving apparatus and method using 3d sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2011/006885 WO2013039273A1 (en) 2011-09-16 2011-09-16 Driving apparatus and method using 3d sensor

Publications (1)

Publication Number Publication Date
WO2013039273A1 true WO2013039273A1 (en) 2013-03-21

Family

ID=47883478

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/006885 WO2013039273A1 (en) 2011-09-16 2011-09-16 Driving apparatus and method using 3d sensor

Country Status (1)

Country Link
WO (1) WO2013039273A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104309529A (en) * 2014-09-26 2015-01-28 腾讯科技(深圳)有限公司 Early warning prompting method, early warning prompting system and related equipment for early warning prompting
US9969325B2 (en) 2015-09-15 2018-05-15 International Business Machines Corporation Projected surface markings
CN109542222A (en) * 2018-11-13 2019-03-29 深圳市创凯智能股份有限公司 Three-dimensional view angle control method, device, equipment and readable storage medium storing program for executing
CN110337394A (en) * 2017-02-23 2019-10-15 松下知识产权经营株式会社 Information processing system, information processing method, program and recording medium
CN111348051A (en) * 2018-12-20 2020-06-30 罗伯特·博世有限公司 Information device for informing driver and method for informing driver
US12055632B2 (en) 2020-10-13 2024-08-06 Waymo Llc LIDAR based stereo camera correction

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050036179A (en) * 2003-10-15 2005-04-20 현대자동차주식회사 A forward area monitoring device of vehicle and method thereof
KR20080088675A (en) * 2007-03-30 2008-10-06 현대자동차주식회사 Lane departure prevention method for automobile
KR20100033161A (en) * 2008-09-19 2010-03-29 현대자동차주식회사 Rear side sensing system for vehicle
KR20100072779A (en) * 2008-12-22 2010-07-01 주식회사 현대오토넷 Appratus and method for avoidancing walker of automobile


Similar Documents

Publication Publication Date Title
US11205348B2 (en) Drive assist device
JP4783431B2 (en) Traffic information detection apparatus, traffic information detection method, traffic information detection program, and recording medium
WO2013039273A1 (en) Driving apparatus and method using 3d sensor
JP4173902B2 (en) Vehicle periphery monitoring device
WO2021162205A1 (en) Method, apparatus, server, and computer program for collision accident prevention
JP2007193445A (en) Periphery monitoring device for vehicle
WO2018143589A1 (en) Method and device for outputting lane information
WO2021002519A1 (en) Apparatus for providing advertisement for vehicle, and method for providing advertisement for vehicle
CN111221342A (en) Environment sensing system for automatic driving automobile
JP2007241898A (en) Stopping vehicle classifying and detecting device and vehicle peripheral monitoring device
US20220324387A1 (en) Display control system, display control method, and non-transitory storage medium
WO2019117459A1 (en) Device and method for displaying content
JP2021100847A (en) Information processing method and apparatus for vehicle curve running
CN113808418A (en) Road condition information display system, method, vehicle, computer device and storage medium
TW201724052A (en) Vehicle monitoring system and method thereof
JP2011103058A (en) Erroneous recognition prevention device
EP3982625A1 (en) Outside environment recognition device
JP4629638B2 (en) Vehicle periphery monitoring device
CN111292509A (en) Abnormality detection device, abnormality detection system, and recording medium
KR20250006373A (en) Method for providing video-based vehicle accident prediction and notification and system implementing the same
CN205015476U (en) HUD equipment with radar detects function
JP2020109559A (en) Traffic light recognition method and traffic light recognition device
WO2022239709A1 (en) Observation device
JP2012073926A (en) Driving support device
US10204276B2 (en) Imaging device, method and recording medium for capturing a three-dimensional field of view

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11872265

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11872265

Country of ref document: EP

Kind code of ref document: A1
