
WO2018105417A1 - Imaging device, image processing device, display system, and vehicle - Google Patents

Imaging device, image processing device, display system, and vehicle

Info

Publication number
WO2018105417A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
video
vehicle
display
control unit
Prior art date
Application number
PCT/JP2017/042292
Other languages
English (en)
Japanese (ja)
Inventor
朋弘 嶋津
和也 武本
Original Assignee
KYOCERA Corporation (京セラ株式会社)
Priority date
Filing date
Publication date
Priority claimed from JP2016239449A external-priority patent/JP6762863B2/ja
Priority claimed from JP2016245775A external-priority patent/JP6781035B2/ja
Application filed by KYOCERA Corporation (京セラ株式会社)
Priority to EP25153531.6A priority Critical patent/EP4520636A2/fr
Priority to EP22206620.1A priority patent/EP4155128B1/fr
Priority to US16/467,715 priority patent/US11010934B2/en
Priority to EP17878733.9A priority patent/EP3554062B1/fr
Publication of WO2018105417A1 publication Critical patent/WO2018105417A1/fr
Priority to US17/231,908 priority patent/US11587267B2/en
Priority to US18/156,210 priority patent/US11961162B2/en
Priority to US18/604,250 priority patent/US20240221246A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001Texturing; Colouring; Generation of texture or colour
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/029Steering assistants using warnings or proposing actions to the driver without influencing the steering system
    • B62D15/0295Steering assistants using warnings or proposing actions to the driver without influencing the steering system by overlaying a vehicle path based on present steering angle over an image without processing that image
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/029Steering assistants using warnings or proposing actions to the driver without influencing the steering system
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/445Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
    • H04N5/44504Circuit details of the additional information generator, e.g. details of the character or graphics signal generator, overlay mixing circuits
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Definitions

  • the present disclosure relates to an imaging device, an image processing device, a display system, and a vehicle.
  • Patent Literature 1 discloses a technique for controlling power supply to a monitor that displays an image of a camera provided in a vehicle.
  • An imaging apparatus includes an imaging element and a control unit.
  • the imaging element captures the rear of the vehicle and generates a first video.
  • the control unit synthesizes a guide wall image indicating the predicted course of the vehicle in the display area on the first video.
  • An image processing apparatus includes a communication unit and a control unit.
  • the communication unit acquires a first video generated by imaging the rear of the vehicle.
  • the control unit synthesizes a guide wall image indicating the predicted course of the vehicle in the display area on the first video.
  • a display system includes an imaging element, a control unit, and a display device.
  • the imaging element captures the rear of the vehicle and generates a first video.
  • the control unit synthesizes a guide wall image indicating the predicted course of the vehicle in the display area on the first video.
  • the display device displays an image of the display area in which the guide wall images are combined.
  • a vehicle includes a display system.
  • the display system includes an image sensor, a control unit, and a display device.
  • the imaging element captures the rear of the vehicle and generates a first video.
  • the control unit synthesizes a guide wall image indicating the predicted course of the vehicle in the display area on the first video.
  • the display device displays an image of the display area in which the guide wall images are combined.
  • An imaging apparatus includes an imaging element and a control unit.
  • the imaging element captures the rear of the vehicle and generates a first video.
  • the control unit generates a composite image by sequentially superimposing the first image and the second image on the display area on the first video.
  • the second image includes a guide wall image indicating the predicted course of the vehicle.
  • the first image includes a first semi-transmissive image in which the transmittance increases from the lower side to the upper side of the display region overlapping the guide wall image.
  • An image processing apparatus includes a communication unit and a control unit.
  • the communication unit acquires a first video generated by imaging the rear of the vehicle.
  • the control unit generates a composite image by sequentially superimposing the first image and the second image on the display area on the first video.
  • the second image includes a guide wall image indicating the predicted course of the vehicle.
  • the first image includes a transmission gradation image in which the transmittance increases from the lower side to the upper side of the display area overlapping the guide wall image.
  • the control unit generates a composite image by sequentially superimposing the first image and the second image on the display area on the first video.
  • the display device displays the composite image.
  • the second image includes a guide wall image indicating the predicted course of the vehicle.
  • the first image includes a transmission gradation image in which the transmittance increases from the lower side to the upper side of the display area overlapping the guide wall image.
  • a vehicle includes a display system.
  • the display system includes an image sensor, a control unit, and a display device.
  • the imaging element captures the rear of the vehicle and generates a first video.
  • the control unit generates a composite image by sequentially superimposing the first image and the second image on the display area on the first video.
  • the display device displays the composite image.
  • the second image includes a guide wall image indicating the predicted course of the vehicle.
  • the first image includes a transmission gradation image in which the transmittance increases from the lower side to the upper side of the display area overlapping the guide wall image.
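  • As a concrete illustration of the sequential superimposition summarized above, the following Python/NumPy sketch builds a composite image from one first-video frame, a semi-transmissive gradient overlay (first image), and a simplified guide wall image (second image). It is only an illustrative sketch; the frame size, gradient range, colors, and guide-wall geometry are hypothetical placeholders, not values from the disclosure.

```python
import numpy as np

def make_gradient_overlay(height, width, color=(0, 0, 0), max_alpha=0.6):
    """First image: overlay whose transmittance increases from the lower
    side toward the upper side (row 0, the top, is fully transmissive)."""
    alpha = np.linspace(0.0, max_alpha, height)           # top -> transparent, bottom -> opaque
    overlay = np.zeros((height, width, 4), dtype=np.float32)
    overlay[..., :3] = np.array(color, dtype=np.float32)
    overlay[..., 3] = alpha[:, None]
    return overlay

def make_guide_wall(height, width, left, right, color=(0, 180, 255), alpha=0.5):
    """Second image: a simplified 'guide wall' drawn as two vertical bands
    standing along a (placeholder) predicted course."""
    wall = np.zeros((height, width, 4), dtype=np.float32)
    for x0, x1 in ((left, left + 8), (right - 8, right)):
        wall[height // 3:, x0:x1, :3] = color
        wall[height // 3:, x0:x1, 3] = alpha
    return wall

def composite(frame, *layers):
    """Sequentially alpha-blend RGBA layers onto an RGB frame."""
    out = frame.astype(np.float32)
    for layer in layers:
        a = layer[..., 3:4]
        out = out * (1.0 - a) + layer[..., :3] * a
    return out.astype(np.uint8)

frame = np.full((480, 640, 3), 90, dtype=np.uint8)            # stand-in for one first-video frame
composite_image = composite(frame,
                            make_gradient_overlay(480, 640),      # first image
                            make_guide_wall(480, 640, 200, 440))  # second image
```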
  • FIG. 1 is a block diagram showing a schematic configuration of a display system according to an embodiment of the present invention.
  • FIG. 2 is a view of a vehicle including the display system as viewed from the left side.
  • FIG. 3A is a diagram schematically showing an example of the appearance of the display device of FIG. 1.
  • FIG. 3B is a diagram schematically showing an example of the appearance of the display device of FIG. 1.
  • FIG. 3C is a diagram schematically showing an example of the appearance of the display device of FIG. 1.
  • FIG. 3D is a diagram schematically showing an example of the appearance of the display device of FIG. 1.
  • FIG. 3E is a diagram schematically showing an example of the appearance of the display device of FIG. 1.
  • FIG. 4 is a diagram illustrating a first example of the first video.
  • FIG. 5 is a diagram illustrating a first example of the second video corresponding to the display area of the first video in FIG. 4.
  • FIG. 6 is a diagram illustrating an example of the third marker superimposed on the detection target.
  • FIG. 7 is a diagram illustrating a first example of the first marker and the second marker displayed around the detection target.
  • FIG. 8 is a diagram illustrating a second example of the first marker and the second marker displayed around the detection target.
  • FIG. 9 is a diagram illustrating a third example of the first marker and the second marker displayed around the detection target.
  • FIG. 10 is a diagram illustrating a second example of the first video.
  • FIG. 11 is a diagram illustrating a second example of the second video corresponding to the display area of the first video in FIG. 10.
  • FIG. 12 is a diagram illustrating a third example of the first video.
  • FIG. 13 is a diagram illustrating a third example of the second video corresponding to the display area of the first video in FIG. 12.
  • FIG. 14 is a diagram illustrating a fourth example of the first video.
  • FIG. 15 is a diagram illustrating a fourth example of the second video corresponding to the display area of the first video in FIG. 14.
  • FIG. 16 is a diagram illustrating a fifth example of the first video.
  • FIG. 17 is a diagram illustrating a fifth example of the second video corresponding to the display area of the first video in FIG. 16.
  • FIG. 18 is a diagram illustrating another example of the second video corresponding to the display area of the first video in FIG. 16.
  • FIG. 19 is a diagram illustrating a sixth example of the first video.
  • FIG. 20 is a diagram illustrating a sixth example of the second video corresponding to the display area of the first video in FIG. 19.
  • FIG. 21 is a diagram illustrating another example of the second video.
  • FIG. 22 is a diagram illustrating a first modification of the sixth example of the second video.
  • FIG. 23 is a diagram illustrating a second modification of the sixth example of the second video.
  • FIG. 24 is a diagram illustrating a third modification of the sixth example of the second video.
  • FIG. 25 is a diagram illustrating a seventh example of the first video.
  • FIG. 26 is a diagram illustrating a seventh example of the second video corresponding to the display area of the first video in FIG. 25.
  • FIG. 27 is a diagram illustrating an eighth example of the first video.
  • FIG. 28 is a diagram illustrating a positional relationship between the vehicle and the detected object.
  • FIG. 29 is a diagram illustrating an eighth example of the second video corresponding to the display area of the first video in FIG. 27.
  • FIG. 30 is a diagram illustrating a modification of the eighth example of the second video.
  • FIG. 31 is a diagram showing an example in which the display range of FIG. 30 is changed to wide view display.
  • FIG. 32 is a diagram illustrating another example of the second video.
  • Display system: A display system 10 according to an embodiment of the present invention will be described with reference to FIG. 1.
  • the display system 10 includes an imaging device 20, an image processing device 30, and a display device 40.
  • Each component of the display system 10, including the imaging device 20, can transmit and receive information via the network 51, for example.
  • the network 51 may include, for example, wireless, wired, or CAN (Controller Area Network).
  • some or all of the components of the display system 10 may be integrally configured as one device.
  • a configuration in which the image processing device 30 is built in the imaging device 20 or the display device 40 is conceivable.
  • the imaging device 20, the image processing device 30, and the display device 40 may be provided in the moving body 50.
  • the “moving body” in the present disclosure may include, for example, a vehicle, a ship, an aircraft, and the like.
  • the vehicle may include, for example, an automobile, an industrial vehicle, a railway vehicle, a vehicle for daily living, and a fixed-wing aircraft that runs on a runway.
  • Automobiles may include, for example, passenger cars, trucks, buses, motorcycles, trolley buses, and the like.
  • Industrial vehicles may include, for example, industrial vehicles for agriculture and construction.
  • Industrial vehicles may include, for example, forklifts and golf carts.
  • Industrial vehicles for agriculture may include, for example, tractors, cultivators, transplanters, binders, combine harvesters, lawn mowers, and the like.
  • Industrial vehicles for construction may include, for example, bulldozers, scrapers, excavators, crane trucks, dump trucks, road rollers, and the like.
  • the vehicle may include a vehicle that travels by human power.
  • the classification of the vehicle is not limited to the example described above.
  • an automobile may include an industrial vehicle capable of traveling on a road.
  • the same vehicle may be included in multiple classifications.
  • Ships may include, for example, marine jets, boats, tankers, and the like.
  • Aircraft may include, for example, fixed wing aircraft and rotary wing aircraft.
  • the imaging device 20 can image an external area of the moving body 50.
  • the position of the imaging device 20 is arbitrary inside and outside the moving body 50.
  • For example, the imaging device 20 is located at the rear of the moving body 50 so that it can capture an external region behind the moving body 50.
  • the position of the image processing apparatus 30 is arbitrary in the moving body 50.
  • the display device 40 can be visually recognized by the subject 60.
  • the position of the display device 40 is arbitrary in the moving body 50.
  • the display device 40 is located in the dashboard of the moving body 50.
  • the imaging device 20 includes an imaging optical system 21, an imaging element 22, a communication unit 23, and a control unit 24.
  • the imaging optical system 21 forms a subject image.
  • the imaging optical system 21 may include a diaphragm and one or more lenses.
  • the image sensor 22 has a plurality of pixels arranged two-dimensionally.
  • the image sensor 22 may include, for example, a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide Semiconductor) image sensor.
  • the image sensor 22 can capture a subject image formed by the imaging optical system 21 and generate a captured image.
  • the communication unit 23 may include a communication interface capable of communicating with an external device.
  • the communication unit 23 may be capable of transmitting / receiving information via the network 51.
  • the external device may include an image processing device 30, for example.
  • the “communication interface” in the present disclosure may include, for example, a physical connector and a wireless communication device.
  • the physical connector may include an electrical connector that supports transmission using an electrical signal, an optical connector that supports transmission using an optical signal, and an electromagnetic connector that supports transmission using electromagnetic waves.
  • the electrical connector may include a connector conforming to IEC 60603, a connector conforming to the USB standard, a connector corresponding to an RCA terminal, a connector corresponding to an S terminal defined in EIAJ CP-1211A, a connector corresponding to a D terminal defined in EIAJ RC-5237, a connector conforming to the HDMI (registered trademark) standard, and a connector corresponding to a coaxial cable including a BNC (British Naval Connector or Baby-series N Connector) connector.
  • the optical connector may include various connectors conforming to IEC 61754.
  • the wireless communication device may include a wireless communication device that conforms to standards including Bluetooth (registered trademark) and IEEE802.11.
  • the wireless communication device includes at least one antenna.
  • the control unit 24 includes one or more processors.
  • the “processor” in the present disclosure may include a dedicated processor specialized for a specific process and a general-purpose processor that executes a specific function by reading a specific program.
  • the dedicated processor may include a DSP (Digital Signal Processor) and an ASIC (Application-Specific Integrated Circuit).
  • the processor may include a programmable logic device (PLD).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the control unit 24 may be one of SoC (System-on-a-Chip) and SiP (System-In-a-Package) in which one or more processors cooperate.
  • the control unit 24 controls the entire operation of the imaging device 20.
  • the control unit 24 may cause the image sensor 22 to generate a captured image at an arbitrary frame rate.
  • the frame rate may substantially match the frame rate that the display device 40 can display.
  • the control unit 24 may perform predetermined image processing on the generated captured image.
  • the image processing may include, for example, exposure adjustment processing, white balance processing, distortion correction processing, and the like.
  • the control unit 24 outputs the captured image to the image processing device 30 via the communication unit 23.
  • the control unit 24 may sequentially output captured images at the above frame rate.
  • each captured image output at the above-described frame rate is also simply referred to as a frame.
  • the plurality of captured images output from the imaging device 20 are also referred to as the first video. For example, when the frame rate is 60 fps (frames per second), 60 captured images per second are output as the first video.
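  • The following is a minimal sketch of the frame output described above: captured images are generated at a fixed frame rate, run through the predetermined image processing, and output one by one as frames of the first video. The processing functions are empty placeholders, and the callable `capture` is a hypothetical stand-in for the image sensor 22.

```python
import time

def adjust_exposure(frame):      # placeholder for exposure adjustment processing
    return frame

def white_balance(frame):        # placeholder for white balance processing
    return frame

def correct_distortion(frame):   # placeholder for distortion correction processing
    return frame

def first_video_frames(capture, frame_rate=60):
    """Yield processed captured images at roughly `frame_rate` frames per
    second; at 60 fps, 60 captured images per second form the first video."""
    interval = 1.0 / frame_rate
    while True:
        start = time.monotonic()
        frame = correct_distortion(white_balance(adjust_exposure(capture())))
        yield frame
        time.sleep(max(0.0, interval - (time.monotonic() - start)))
```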
  • the image processing apparatus 30 includes a communication unit 31, a storage unit 32, and a control unit 33.
  • the communication unit 31 may include a communication interface capable of communicating with various external devices.
  • the external device includes, for example, the imaging device 20, the display device 40, an ECU (Electronic Control Unit or Engine Control Unit) provided in the moving body 50, a speed sensor, an acceleration sensor, a rotation angle sensor, a steering angle sensor, an engine speed sensor, an accelerator sensor, a brake sensor, an illuminance sensor, a raindrop sensor, a mileage sensor, an obstacle detection device using millimeter-wave radar, ultrasonic sonar, or the like, an ETC (Electronic Toll Collection System) receiver, a GPS (Global Positioning System) device, a navigation device, a server on the Internet, a mobile phone, and the like.
  • the communication unit 31 may include communication interfaces for inter-vehicle communication, road-to-vehicle communication, and vehicle-to-vehicle communication.
  • the communication unit 31 may include a receiver that supports optical beacons of DSRC (Dedicated Short-Range Communication) and VICS (registered trademark) (Vehicle Information and Communication System) provided in Japan.
  • the communication unit 31 may include a receiver corresponding to a road traffic information providing system in another country.
  • the communication unit 31 may be able to acquire various information from an external device.
  • the communication unit 31 may be able to acquire mobile body information and environment information.
  • the moving body information may include arbitrary information regarding the moving body 50.
  • the moving body information includes, for example, the speed, acceleration, turning G, inclination, direction, and turning state of the moving body 50, the steering angle of the steering wheel, the cooling water temperature, the remaining amount of fuel, the remaining amount of the battery, the voltage of the battery, the engine speed, the gear position, the presence or absence of a reverse signal, the presence or absence of an accelerator operation, the accelerator opening, the presence or absence of a brake operation, the amount of brake depression, the presence or absence of the parking brake, the speed difference between the front and rear wheels or among the four wheels, the tire pressure, the expansion and contraction amount of the dampers, the spatial position of the driver's eyes, the number of passengers and seat positions, seat belt wearing information, the opening and closing of doors, the opening and closing of windows, the interior temperature, the operating status of the air conditioning, the air conditioning set temperature, the air conditioning airflow, the outside air circulation setting, the operating state of the wipers, the driving mode, connection information with external devices, the current time, the average fuel consumption, the instantaneous fuel consumption, the lighting state of various lamps, position information of the moving body 50, route information, and the like.
  • the environment information may include arbitrary information related to the external environment of the mobile object 50.
  • the environmental information includes, for example, the brightness around the moving body 50, weather, atmospheric pressure, outside air temperature, map information, traffic information, road construction information, temporary changes in the speed limit of the traveling path, objects detected by other vehicles, the lighting state of traffic signals, and the like.
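  • As a rough sketch of how such information might be held on the image processing device side, the following Python dataclasses carry a few representative fields; the field names and units are hypothetical, and the full lists above would simply extend them in the same way.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MovingBodyInfo:
    """A small subset of the moving body information listed above."""
    speed_kmh: float = 0.0
    steering_angle_deg: float = 0.0
    gear_position: str = "P"
    reverse_signal: bool = False
    brake_operated: bool = False

@dataclass
class EnvironmentInfo:
    """A small subset of the environment information listed above."""
    ambient_brightness_lux: Optional[float] = None
    weather: Optional[str] = None
    speed_limit_kmh: Optional[float] = None
```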
  • the storage unit 32 may include a temporary storage device and a secondary storage device.
  • the storage unit 32 may be configured using, for example, a semiconductor memory, a magnetic memory, an optical memory, or the like.
  • the semiconductor memory may include volatile memory and non-volatile memory.
  • the magnetic memory may include, for example, a hard disk and a magnetic tape.
  • the optical memory may include, for example, a CD (Compact Disc), a DVD (Digital Versatile Disc), a BD (Blu-ray (registered trademark) Disc), and the like.
  • the storage unit 32 stores various information and programs necessary for the operation of the image processing apparatus 30.
  • the control unit 33 includes one or more processors.
  • the control unit 33 controls the overall operation of the image processing apparatus 30.
  • the control unit 33 may acquire mobile body information and environment information from an external device via the communication unit 31.
  • the control unit 33 may determine the predicted course of the moving body 50 based on the moving body information, for example.
  • the predicted course of the moving body 50 is also referred to as a first predicted course.
  • the control unit 33 may acquire the first video from the imaging device 20 via the communication unit 31.
  • the first video includes a detection area and a display area.
  • the control unit 33 may detect at least a part of the detection target in the detection area on the acquired first video.
  • the detection area on the first video may be at least a partial area on the captured image that is each frame of the first video. Each frame of the first video can be called a captured image.
  • the detection area on the first video may be larger than the display area.
  • the detection area on the first video may include a display area.
  • the control unit 33 can detect the detection target inside the display area.
  • the control unit 33 can detect the detection target outside the display area and inside the detection area.
  • the area inside the detection area and the display area can be referred to as a first area.
  • the area inside the detection area and outside the display area can be called a second area.
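  • A minimal sketch of the first-area / second-area distinction described above, assuming rectangular areas and a point-like detection position; the coordinates and sizes are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def classify_detection(point, detection_area, display_area):
    """Return 'first area' for a detection inside both areas, 'second area'
    for one inside the detection area but outside the display area."""
    px, py = point
    if not detection_area.contains(px, py):
        return None
    return "first area" if display_area.contains(px, py) else "second area"

detection_area = Rect(0, 0, 1280, 480)    # hypothetical detection area
display_area = Rect(320, 0, 640, 480)     # display area centered in the left-right direction
print(classify_detection((1000, 240), detection_area, display_area))   # -> 'second area'
```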
  • the detection target may include a plurality of types of objects.
  • the types of objects may include, for example, people, other moving bodies, roadways, lanes, white lines, gutters, sidewalks, crosswalks, road signs, traffic signs, guardrails, walls, traffic lights, and the like.
  • the types of detection targets that can be detected by the control unit 33 are not limited to these.
  • At least a part of the detection target may include, for example, the portion of a detection target that is not hidden behind another object when part of that detection target on the first video is hidden behind the other object.
  • for example, the control unit 33 may be able to detect the upper body of a pedestrian whose lower body is hidden behind another object.
  • An arbitrary object detection algorithm may be employed for detecting at least a part of the detection target.
  • the control unit 33 may detect at least a part of the detection target by an algorithm such as pattern matching or feature point extraction using a captured image that is each frame of the first video.
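  • Pattern matching is only one of the arbitrary algorithms mentioned above; as an illustration, the following sketch uses OpenCV template matching on a synthetic grayscale frame. It is not the detection method of the control unit 33, and the threshold and image sizes are hypothetical.

```python
import numpy as np
import cv2

def detect_by_template(frame_gray, template_gray, threshold=0.8):
    """Return bounding boxes (x, y, w, h) where the template matches the
    frame with a normalized correlation score of at least `threshold`."""
    scores = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    h, w = template_gray.shape
    ys, xs = np.where(scores >= threshold)
    return [(int(x), int(y), w, h) for x, y in zip(xs, ys)]

# Synthetic example: plant the template inside a larger frame and find it again.
template = np.random.randint(0, 255, (40, 20), dtype=np.uint8)
frame = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
frame[100:140, 300:320] = template
print(detect_by_template(frame, template))   # expected to include (300, 100, 20, 40)
```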
  • the control unit 33 may determine a predicted course of the detection target based on the first video.
  • the predicted course of the detection target is also referred to as a second predicted course. Any algorithm may be employed for determining the second predicted course.
  • the control unit 33 may determine the second predicted course based on changes in the direction and position of the detection target on the captured image that is each frame of the first video.
  • the control unit 33 may estimate the relative positional relationship between the moving body 50 and the detection target based on the first video when at least a part of the detection target is detected on the first video.
  • the relative positional relationship may include, for example, the distance between the moving body 50 and the detection target, the presence / absence of an overlap between the first predicted course of the moving body 50 and the second predicted course of the detection target, and the like.
  • An arbitrary algorithm may be employed for estimating the distance between the moving object 50 and the detection target.
  • the control unit 33 may estimate the distance between the moving body 50 and the detection target by a motion stereo method using the captured images that are the frames of the first video.
  • the control unit 33 may acquire information indicating the relative positional relationship between the moving body 50 and the detection target from an external device via the communication unit 31.
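  • The motion stereo method mentioned above can be reduced, very roughly, to the stereo relation Z = f · B / d with the baseline B taken from the movement of the moving body between two frames. The sketch below shows only that relation; the parameter values are hypothetical, and a real implementation would additionally need feature tracking, rectification, and handling of a target that itself moves.

```python
def motion_stereo_distance(disparity_px, focal_length_px, speed_mps, frame_interval_s):
    """Very simplified motion-stereo estimate: baseline = distance travelled
    between two frames; distance to the target = f * B / disparity."""
    baseline_m = speed_mps * frame_interval_s
    if disparity_px <= 0:
        return float("inf")
    return focal_length_px * baseline_m / disparity_px

# Example: reversing at 0.5 m/s, 60 fps, 3 px disparity, 800 px focal length -> about 2.2 m.
print(motion_stereo_distance(disparity_px=3.0, focal_length_px=800.0,
                             speed_mps=0.5, frame_interval_s=1 / 60))
```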
  • the control unit 33 may determine which of the moving body 50 and the detection target contributes more to the decrease in the distance. An arbitrary algorithm may be employed for determining the respective contributions of the moving body 50 and the detection target to the decrease in the distance.
  • the control unit 33 may detect the moving speed of the moving body 50 based on the moving body information. For example, the control unit 33 may detect the moving speed of the detection target based on changes in the position of the detection target on the captured images that are the frames of the first video. The control unit 33 may determine that whichever of the moving body 50 and the detection target has the higher moving speed contributes more to the decrease in the distance.
  • for example, when the moving speed of the moving body 50 is less than a reference value, the control unit 33 may determine that the detection target contributes more to the decrease in the distance.
  • for example, when the moving speed of the detection target is less than the reference value, the control unit 33 may determine that the moving body 50 contributes more to the decrease in the distance.
  • the reference value may be set arbitrarily, but may be set to substantially zero, for example. Details of the operation of the image processing apparatus 30 according to the contributions of the moving body 50 and the detection target to the decrease in the distance will be described later.
  • the control unit 33 may determine, based on the first video, whether there is a possibility that the moving body 50 and the detection target come into contact with each other. An arbitrary algorithm may be employed for determining the possibility of contact between the moving body 50 and the detection target. For example, the control unit 33 may determine that there is a possibility that the moving body 50 and the detection target come into contact with each other when at least one of a condition that the distance between the moving body 50 and the detection target is less than a predetermined threshold and a condition that the decreasing speed of the distance is equal to or greater than a predetermined threshold is satisfied. Details of the operation of the image processing apparatus 30 according to the presence or absence of this possibility will be described later.
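  • The two determinations just described (which side contributes more to the decrease in distance, and whether contact is possible) can be sketched as below. The threshold and reference values are hypothetical, and the speed comparison is only one of the arbitrary algorithms the disclosure allows.

```python
def contribution_to_approach(moving_body_speed_mps, target_speed_mps, reference_mps=0.05):
    """Decide which side contributes more to the decrease in distance,
    using a reference value set close to zero."""
    if moving_body_speed_mps < reference_mps:
        return "detection target"
    if target_speed_mps < reference_mps:
        return "moving body"
    return "moving body" if moving_body_speed_mps >= target_speed_mps else "detection target"

def may_contact(distance_m, closing_speed_mps,
                distance_threshold_m=3.0, closing_speed_threshold_mps=1.0):
    """Contact is judged possible when the distance is below a threshold or
    the distance is decreasing at or above a threshold speed."""
    return (distance_m < distance_threshold_m
            or closing_speed_mps >= closing_speed_threshold_mps)

print(contribution_to_approach(0.0, 1.2))   # a stopped moving body: the target contributes
print(may_contact(2.4, 0.3))                # True: the distance is below the threshold
```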
  • the control unit 33 may cause the display device 40 to display the second video corresponding to the display area on the first video acquired from the imaging device 20. Specifically, the control unit 33 may output the second video to the display device 40 via the communication unit 31. For example, the control unit 33 may display the second video on the display device 40 when backward movement of the moving body 50 is detected based on the moving body information. For example, the control unit 33 can detect reverse travel based on the shift position of the transmission gear, or based on a reverse signal output from the moving body during reverse travel.
  • the second video may be a video obtained by cutting out a display area on a captured image that is each frame of the first video, for example.
  • the display area on the first video may be at least a part of the area on the captured image that is each frame of the first video.
  • the display area may be smaller than the detection area.
  • the display area may be included in the detection area.
  • the position, shape, and size of the display area can be arbitrarily determined.
  • the control unit 33 can change the position, shape, and size of the display area. By changing the position, shape, and size of the display area, for example, the display area and the detection area may substantially match.
  • the control unit 33 may combine various markers with the second video and display them on the display device 40. Compositing includes overwriting and mixing.
  • the marker may include one or more images, for example.
  • the control unit 33 may dynamically change the display mode of at least a part of the marker superimposed on the second video.
  • the display mode may include, for example, the position, size, shape, color, shading, and the like of at least a part of the marker on the second video.
  • the control unit 33 may determine the display mode of the marker according to the type of the detection target. Details of the operation of the image processing apparatus 30 for displaying various markers on the display device 40 will be described later.
  • the display device 40 may include, for example, a liquid crystal display and an organic EL (Electroluminescence) display.
  • the display device 40 may display the second video input from the image processing device 30 via the network 51, for example.
  • the display device 40 may function as a touch screen that can accept user operations.
  • the display device 40 may include a switch and a key that can accept an operation by the user.
  • the switch may include a physical switch and an electronic switch.
  • the keys may include physical keys and electronic keys.
  • the display device 40 can be arranged at various locations on the moving body 50.
  • FIGS. 3A to 3E are diagrams illustrating examples of the arrangement of the display device 40 when the moving body 50 is a vehicle.
  • FIG. 3A shows an in-dashboard type display device 40a stored in the dashboard of the vehicle.
  • FIG. 3B shows an on-dashboard type display device 40b disposed on the dashboard.
  • the display device 40 b is incorporated in the moving body 50.
  • the display device 40b may be detachably mounted on the dashboard.
  • FIG. 3C shows a display device 40c built in the rearview mirror and capable of displaying an image as necessary.
  • FIG. 3D shows the display device 40d built in the instrument panel.
  • the display device 40d is disposed adjacent to instruments such as a speedometer and a tachometer.
  • the entire instrument panel may be used as a display device 40d such as an LCD, and a second video may be displayed therein together with images such as a speedometer and a tachometer.
  • FIG. 3E shows a display device 40e using a portable information terminal, for example, a tablet terminal.
  • a mobile phone display can be used as the display device 40e.
  • the terms “vertical direction” and “horizontal direction” for a video or an image correspond to a two-dimensional direction in the video or the image.
  • the terms “height direction”, “horizontal direction”, and “depth direction” for a video or an image correspond to the three-dimensional direction of the space in which the video or the image is projected.
  • FIG. 4 shows a first example of the detection region 61 in the first video acquired by the image processing device 30 from the imaging device 20.
  • the detection area 61 is longer in the left-right direction than in the up-down direction.
  • the display area 62 is located at the center of the detection area 61 in the left-right direction.
  • the control unit 33 may detect each of the pedestrian 63 and the vehicle 64 reflected inside the display area 62 on the first video as detection targets.
  • the control unit 33 determines whether one or more conditions are satisfied based on the relative positional relationship between the detection target detected inside the display area 62 on the first video and the moving body 50.
  • the one or more conditions may include, for example, a first condition that the detection target is located in the first predicted course 65 of the moving body 50.
  • the one or more conditions may include, for example, a second condition that at least a part of the first predicted course 65 of the moving body 50 overlaps at least a part of the second predicted course to be detected.
  • the control unit 33 may cause the display device 40 to display a predetermined marker corresponding to the detection target superimposed on the second video.
  • the predetermined marker may include a first marker, a second marker, and a third marker.
  • the control unit 33 may determine that the one or more conditions are satisfied for the pedestrian 63. In such a case, the control unit 33 may display a marker corresponding to the pedestrian 63. The control unit 33 may determine that the one or more conditions are not satisfied for the vehicle 64. In such a case, the control unit 33 does not display a marker corresponding to the vehicle 64.
  • FIG. 5 shows an example of the second video corresponding to the display area 62 of the first video shown in FIG.
  • the control unit 33 may cut out the display area 62 of the first video, deform it according to the aspect ratio of the screen of the display device 40, and output the resulting second video to the display device 40. As shown in FIG. 5, a pedestrian 63 and a vehicle 64 are shown on the second video.
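  • A minimal sketch of the cut-out and deformation step just described, using OpenCV: the display area is cropped from one first-video frame and resized to the (hypothetical) resolution of the display device's screen.

```python
import numpy as np
import cv2

def make_second_video_frame(first_frame, display_area, screen_size):
    """Cut out `display_area` (x, y, w, h) from a first-video frame and
    resize it to `screen_size` (width, height) of the display device."""
    x, y, w, h = display_area
    cropped = first_frame[y:y + h, x:x + w]
    return cv2.resize(cropped, screen_size, interpolation=cv2.INTER_LINEAR)

first_frame = np.zeros((480, 1280, 3), dtype=np.uint8)     # stand-in first-video frame
second_frame = make_second_video_frame(first_frame, (320, 0, 640, 480), (800, 480))
print(second_frame.shape)                                   # (480, 800, 3)
```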
  • the control unit 33 may cause the display device 40 to display a guide line 66 indicating at least a part of the first predicted course 65 of the moving body 50 shown in FIG.
  • the control unit 33 may dynamically change the guide line 66 in accordance with a change in the steering angle of the steering wheel.
  • the first video has a wider range than the display area 62.
  • the control unit 33 can change the range of the display area 62.
  • the control unit 33 may superimpose the icon image 67 on the second video and display it on the display device 40.
  • the outline 67a of the icon image 67 shown in FIG. 5 indicates the maximum range when the range of the display area 62 is changed.
  • a white rectangle 67 b of the icon image 67 indicates the display area 62.
  • An icon image 67 shown in FIG. 5 shows the relative position and size of the display area 62 with respect to the maximum range of the display area 62.
  • FIG. 6 shows an example of a marker superimposed on the pedestrian 63 on the second video, for example.
  • the marker is also referred to as a third marker 68.
  • the outline 69 of the third marker 68 may substantially coincide with the outline of the pedestrian 63 detected on the second video.
  • the region 70 inside the contour line 69 of the third marker 68 may be filled with a color or pattern corresponding to “person” as the type of detection target, for example.
  • the control unit 33 may superimpose the third marker 68 on the pedestrian 63 on the second video and display it on the display device 40. According to such a configuration, the target person 60 can easily visually recognize the pedestrian 63 on the second video.
  • the control unit 33 may hide the third marker 68 when a predetermined time has elapsed since the third marker 68 was displayed.
  • FIG. 7 shows an example of two types of markers superimposed on the periphery of the pedestrian 63 on the second video, for example.
  • each of the two types of markers is also referred to as a first marker 71 and a second marker 72.
  • the control unit 33 may cause the display device 40 to display the first marker 71 and the second marker 72 superimposed on the second video.
  • the control unit 33 may move the position of the first marker 71 following the pedestrian 63 on the second video. Since the first marker 71 follows the pedestrian 63, the subject 60 can easily keep track of the pedestrian 63. The first marker 71 is displayed around the pedestrian 63, away from the pedestrian 63. The subject 60 can easily grasp the behavior of the pedestrian 63 when the first marker 71 is displayed on the display device 40.
  • the control unit 33 may change the overlapping position of the second marker 72 relative to the overlapping position of the first marker 71 on the second video. The control unit 33 may relatively move the second marker 72 with reference to the position of the first marker 71.
  • the control unit 33 can determine that the contribution of the moving body 50 to the decrease in the distance is large. In such a case, the control unit 33 may move the second marker 72 toward the first marker 71. First, the control unit 33 displays the second marker 72 at a position away from the first marker 71. Next, the control unit 33 moves the second marker 72 toward the first marker 71 until the distance from the first marker 71 becomes a predetermined distance. Next, the control unit 33 erases the second marker 72. Next, the control unit 33 displays the second marker 72 again at a position away from the first marker 71 and repeats the above-described processing. In this example, since the second marker 72 repeatedly moves toward the first marker 71, the subject 60 can understand that the second marker 72 is approaching the first marker 71.
  • the control unit 33 can determine that the contribution of the pedestrian 63 to the decrease in the distance is large. In such a case, the control unit 33 may move the second marker 72 away from the first marker 71. First, the control unit 33 displays the second marker 72 at a position close to the first marker 71. Next, the control unit 33 moves the second marker 72 away from the first marker 71 until the distance from the first marker 71 reaches a predetermined distance. Next, the control unit 33 erases the second marker 72. Next, the control unit 33 displays the second marker 72 again at a position close to the first marker 71 and repeats the above-described processing. In this example, since the second marker 72 repeatedly moves away from the first marker 71, the subject 60 can understand that the second marker 72 is moving away from the first marker 71.
  • the control unit 33 changes the moving direction of the second marker 72 relative to the first marker 71 according to which of the moving body 50 and the pedestrian 63 contributes more to the decrease in the distance between the moving body 50 and the pedestrian 63.
  • the subject 60 can identify whether the moving body 50 is approaching the pedestrian 63 or whether the pedestrian 63 is approaching the moving body 50 according to the moving direction of the second marker 72, for example.
  • the control unit 33 may repeatedly enlarge or reduce the second marker 72 around the first marker 71 on the second video.
  • the control unit 33 may superimpose a first marker 71 and a second marker 72 having a shape similar to the outline 69 of the pedestrian 63 on the second video.
  • the controller 33 may repeat the enlargement or reduction of the second marker 72 around the first marker 71.
  • the control unit 33 switches the enlargement or reduction of the second marker 72 according to the contribution of the moving body 50 and the pedestrian 63 to the decrease in the distance between the moving body 50 and the pedestrian 63.
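  • One way to realize the repeated movement (or enlargement and reduction) of the second marker 72 relative to the first marker 71 is to drive it from a cyclic phase, as in the sketch below; the period and pixel values are hypothetical.

```python
def second_marker_offset(t_s, period_s=1.0, max_offset_px=40.0, moving_body_contributes=True):
    """Offset of the second marker 72 from the first marker 71 at time t_s.
    When the moving body contributes more to the approach, the marker
    repeatedly moves toward the first marker; otherwise it repeatedly moves away."""
    phase = (t_s % period_s) / period_s          # runs 0 -> 1 each cycle, then restarts
    if moving_body_contributes:
        return max_offset_px * (1.0 - phase)     # starts far away and closes in
    return max_offset_px * phase                 # starts close and moves away

def second_marker_scale(t_s, period_s=1.0, moving_body_contributes=True):
    """Same idea applied to repeated reduction (approach) or enlargement."""
    phase = (t_s % period_s) / period_s
    return 1.5 - 0.5 * phase if moving_body_contributes else 1.0 + 0.5 * phase
```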
  • when the distance between the detection target on which the first marker 71 and the second marker 72 are displayed and the moving body 50 is less than a predetermined threshold, the control unit 33 may superimpose a new marker on the second video.
  • the new marker is also referred to as a fourth marker.
  • the fourth marker may include an arbitrary image.
  • the fourth marker may include an image indicating an exclamation mark “!”. According to such a configuration, for example, when the pedestrian 63 on which the first marker 71 and the second marker 72 are displayed and the moving body 50 approach each other beyond a certain level, the fourth marker is displayed superimposed on the second video.
  • the target person 60 can identify that the pedestrian 63 is located in the vicinity of the moving body 50 by the fourth marker, for example.
  • when the distance between the detection target on which the first marker 71 and the second marker 72 are displayed and the moving body 50 is less than a predetermined threshold, the control unit 33 may change the display mode of the first marker 71 and the second marker 72.
  • the control unit 33 may change the colors of the first marker 71 and the second marker 72.
  • the control unit 33 can detect two detection objects arranged in the depth direction.
  • the control unit 33 may display the first marker 71 and the second marker 72 on each of the two detection targets located in the front and rear.
  • the control unit 33 can attach different first markers 71 and second markers 72 to the two detection targets.
  • the control unit 33 may attach, to the first detection target located on the far side, a first marker 71 and a second marker 72 that are less conspicuous than the first marker 71 and the second marker 72 attached to the second detection target located on the near side.
  • for example, the control unit 33 may make the first marker 71 and the second marker 72 attached to the first detection target located on the far side less conspicuous than the first marker 71 and the second marker 72 attached to the second detection target located on the near side, such as by making the color darker, increasing the transmittance, and making the lines thinner.
  • FIG. 10 shows a second example of the detection region 61 in the first video acquired by the image processing device 30 from the imaging device 20.
  • the detection area 61 is longer in the left-right direction than in the up-down direction.
  • the display area 62 is located at the center of the detection area 61 in the left-right direction.
  • the control unit 33 may detect, as detection targets, the pedestrian 63a shown inside the display area 62 on the first video and the pedestrians 63b and 63c shown outside the display area 62 and inside the detection area 61.
  • the processing of the control unit 33 regarding the pedestrian 63a is similar to the processing regarding the pedestrian 63 shown in FIG. 4, for example.
  • when at least a part of a detection target is detected outside the display area 62 and inside the detection area 61, the control unit 33 may cause the display device 40 to display a marker corresponding to the detection target superimposed on the second video.
  • the marker is also referred to as a fifth marker.
  • the control unit 33 may display the fifth marker when it is determined that there is a possibility that the moving body 50 and the detection target are in contact with each other.
  • for example, when the detection position of the detection target is on the right side of the display area 62, the control unit 33 may cause the display device 40 to display the fifth marker superimposed on the right end of the second video.
  • for example, when the detection position of the detection target is on the left side of the display area 62, the control unit 33 may cause the display device 40 to display the fifth marker superimposed on the left end of the second video.
  • the detection position where the pedestrian 63b is detected is on the right side of the display area 62.
  • the control unit 33 may determine that there is a possibility that the moving body 50 and the pedestrian 63b are in contact with each other. In such a case, the control unit 33 may cause the display device 40 to display the fifth marker corresponding to the pedestrian 63b superimposed on the right end of the second video. Details of the fifth marker corresponding to the pedestrian 63b will be described later.
  • the detection position where the pedestrian 63 c is detected is on the left side of the display area 62.
  • the control unit 33 can determine that there is no possibility that the moving body 50 and the pedestrian 63c come into contact with each other. In such a case, the control unit 33 may not display the fifth marker corresponding to the pedestrian 63c.
  • FIG. 11 shows an example of the second video corresponding to the display area 62 of the first video shown in FIG. As shown in FIG. 11, a pedestrian 63a is shown on the second video. Pedestrians 63b and 63c are not shown on the second video.
  • the control unit 33 may display an obstacle image 74 on the display device 40 by superimposing the obstacle image 74 on the second video, for example.
  • the obstacle image 74 shows the detection result of an obstacle detection device using ultrasonic sonar provided in the moving body 50.
  • the obstacle image 74 may include an image 74a, an image 74b, and an image 74c.
  • the image 74a is an image in which the moving body 50 is viewed from above.
  • the image 74 b is an image indicating that an obstacle is detected on the left rear side of the moving body 50.
  • the image 74 c is an image indicating that an obstacle has been detected on the right rear side of the moving body 50.
  • the detection result of the obstacle by the obstacle detection device and the detection result of the detection target by the control unit 33 do not necessarily match.
  • the obstacle image 74 indicates that obstacles are detected at the right rear and the left rear of the moving body 50.
  • the control unit 33 can determine that there is no possibility that the pedestrian 63 c existing on the left rear side of the moving body 50 is in contact with the moving body 50. In such a case, the control unit 33 may not display the fifth marker corresponding to the pedestrian 63c.
  • the fifth marker 73 may include an icon image 73a and a band image 73b.
  • the icon image 73a may be an image corresponding to “person” which is the type of detection target.
  • the subject 60 who has visually recognized the icon image 73a can recognize that a person is present on the right side of the second video.
  • the band image 73b is, for example, a band-shaped image extending in the vertical direction on the second video.
  • the band image 73b may be filled with a color or pattern corresponding to, for example, “person” as the type of detection target.
  • the control unit 33 may move the band image 73b in the right end region 73c on the second video.
  • the control unit 33 may change the moving speed and width of the band image 73b.
  • the control unit 33 may determine the width of the band image 73b according to the distance between the moving body 50 and the pedestrian 63b. For example, the control unit 33 may increase the width of the band image 73b as the distance approaches. The subject 60 who has visually recognized the band image 73b can recognize the distance between the moving body 50 and the pedestrian 63b based on the width of the band image 73b.
  • the control unit 33 can determine that the contribution of the moving body 50 is large, for example, with respect to the decrease in the distance between the moving body 50 and the pedestrian 63b. In such a case, the control unit 33 repeatedly moves the band image 73b in the first direction within the right end region 73c on the second video.
  • the first direction may be a direction from the outside toward the inside in the left-right direction on the second video.
  • the control unit 33 can determine that, for example, the contribution of the pedestrian 63b is large with respect to the decrease in the distance between the moving body 50 and the pedestrian 63b. In such a case, the control unit 33 repeatedly moves the band image 73b in the second direction within the right end region 73c on the second video.
  • the second direction may be, for example, a direction from the inside toward the outside in the left-right direction on the second video.
  • the subject 60 who visually recognizes the band image 73b can recognize, based on the moving direction of the band image 73b, whether the moving body 50 is approaching the pedestrian 63b or whether the pedestrian 63b is approaching the moving body 50.
  • the control unit 33 may determine the moving speed of the band image 73b according to the decreasing speed of the distance between the moving body 50 and the pedestrian 63b. For example, the moving speed of the band image 73b may be increased as the decreasing speed of the distance is faster. The subject 60 who has visually recognized the band image 73b can recognize the decreasing speed of the distance between the moving body 50 and the pedestrian 63b based on the moving speed of the band image 73b.
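  • The behaviour of the band image 73b described above (wider as the distance shrinks, faster as the distance shrinks faster, moving inward or outward depending on which side is approaching) might be parameterized as in the following sketch; every numeric constant is a hypothetical placeholder.

```python
def band_image_params(distance_m, closing_speed_mps, moving_body_contributes,
                      max_width_px=60, min_width_px=10, max_distance_m=5.0):
    """Derive width, movement speed, and movement direction for the band image 73b."""
    closeness = max(0.0, min(1.0, 1.0 - distance_m / max_distance_m))
    width_px = min_width_px + (max_width_px - min_width_px) * closeness
    speed_px_per_s = 20.0 + 80.0 * max(0.0, min(1.0, closing_speed_mps / 2.0))
    direction = "inward" if moving_body_contributes else "outward"
    return {"width_px": round(width_px),
            "speed_px_per_s": round(speed_px_per_s),
            "direction": direction}

print(band_image_params(distance_m=1.5, closing_speed_mps=0.8, moving_body_contributes=True))
```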
  • when the control unit 33 detects a user input corresponding to a predetermined user operation while the fifth marker 73 is displayed, the control unit 33 may change the display area 62 so that the detection position of the pedestrian 63b on the first video is included inside the display area 62.
  • for example, the control unit 33 may extend the display area 62 on the first video in the left-right direction and move the display area 62 to the right side within the detection area 61. With such a configuration, for example, as shown in FIG. 13, the pedestrian 63b appears on the second video.
  • the predetermined user operation described above may include an arbitrary user operation.
  • the predetermined user operation described above may include a first user operation that changes the steering angle of the moving body 50.
  • the fifth marker 73 may function as a GUI (Graphical User Interface) that receives the second user operation.
  • the GUI is also referred to as an interface image.
  • the predetermined user operation described above may include a second user operation.
  • the control unit 33 may automatically change the display area 62 such that the detection position of the pedestrian 63b on the first video is included inside the display area 62. In this case, the control unit 33 can maintain the automatic change of the display area 62 until the pedestrian 63b is not detected in the detection area 61 of the first video.
  • the control unit 33 may change the icon image 67 in accordance with the change in the display area 62, for example, as shown in FIG.
  • the control unit 33 may change the display area 62 on the first video according to, for example, a pinch-in operation and a pinch-out operation on the display device 40. For example, as shown in FIG. 14, the control unit 33 may make the display area 62 substantially coincide with the detection area 61. In such a case, for example, as shown in FIG. 15, all detection targets in the detection area 61 are displayed on the display device 40.
  • FIG. 16 illustrates a fifth example of the detection region 61 in the first video acquired by the image processing device 30 from the imaging device 20.
  • the detection area 61 is longer in the left-right direction than in the up-down direction.
  • the display area 62 is located at the center of the detection area 61 in the left-right direction.
  • the control unit 33 may detect, as detection targets, the vehicle 64a shown in the first predicted course 65 of the moving body 50, the vehicle 64b shown outside the first predicted course 65, and the pedestrian 63d.
  • the characteristic values of the first video and the second video may be reduced.
  • the characteristic value may include an arbitrary parameter related to video visibility.
  • the characteristic value may include at least one of, for example, a luminance value of a video and a contrast ratio.
  • when the characteristic value of the second video is lowered, the visibility of the second video can be lowered.
  • the control unit 33 may execute specific image processing on the area corresponding to the detection target on the second video.
  • the specific image processing may include first processing for superimposing a marker corresponding to the detection target on the region.
  • the marker is also referred to as a sixth marker.
  • the sixth marker may include an image that substantially matches the contour shape of the detection target on the second video. According to this configuration, the sixth marker is superimposed and displayed on the detection target on the second video. Therefore, the target person 60 can easily recognize the detection target on the second video even when the characteristic value of the second video is low.
  • the specific image processing may include second processing for changing the characteristic value of the region corresponding to the detection target on the second video.
  • control unit 33 may change the characteristic value of the region so as to improve the visibility of the region on the second video. According to such a configuration, the visibility of the detection target on the second video is improved. Therefore, the target person 60 can easily recognize the detection target on the second video even when the characteristic value of the second video is low.
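  • The following sketch illustrates, under assumed parameter values, the combination of lowering the characteristic value (here, luminance) of the whole second video and applying the second processing to the detection target region; it is not the specification's implementation.

```python
import numpy as np

def apply_specific_image_processing(frame: np.ndarray, target_box, dim: float = 0.5,
                                    boost: float = 1.6) -> np.ndarray:
    """Illustrative sketch: lower the characteristic value (luminance) of the
    whole second video, then raise it again inside the region that corresponds
    to the detection target, so the target remains easy to recognize.

    frame      : H x W x 3 uint8 image
    target_box : (x0, y0, x1, y1) region of the detection target
    """
    out = frame.astype(np.float32) * dim          # reduced visibility overall
    x0, y0, x1, y1 = target_box
    out[y0:y1, x0:x1] = out[y0:y1, x0:x1] * boost  # second processing: boost the target region
    return np.clip(out, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    test = np.full((120, 160, 3), 180, dtype=np.uint8)
    processed = apply_specific_image_processing(test, target_box=(40, 30, 100, 90))
    print(processed[0, 0], processed[60, 70])     # dimmed background vs boosted target
```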
  • the control unit 33 may execute the above-described specific image processing when one or more conditions are satisfied.
  • the one or more conditions may include a condition that the detection target is located in the first predicted course 65 of the moving body 50.
  • the one or more conditions may include a condition that the first predicted course 65 of the moving body 50 and the second predicted course to be detected overlap.
  • the one or more conditions may include a condition that the distance between the moving body 50 and the detection target is less than a predetermined threshold.
  • the one or more conditions may include a condition that a characteristic value of at least a part of the second video is less than a predetermined threshold.
  • the control unit 33 may determine that one or more of the conditions described above are satisfied for each of the vehicles 64a and 64b and the pedestrian 63d. In such a case, as shown in FIG. 17, for example, the control unit 33 may display three sixth markers 75a, 75b, and 75c corresponding to the vehicles 64a and 64b and the pedestrian 63d, respectively, on the display device 40, superimposed on the second video. The control unit 33 may display the sixth marker superimposed on the detection target in a bright place.
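  • As a hedged illustration of how the one or more conditions listed above could be evaluated per detection target, the sketch below uses hypothetical field names for the target attributes; the thresholds are assumptions.

```python
def should_apply_specific_processing(target, distance_threshold_m=10.0,
                                     luminance_threshold=60):
    """Evaluate the example conditions listed above (illustrative field names):
    - the target lies in the first predicted course 65, or
    - its own predicted course overlaps the first predicted course, or
    - it is closer than a distance threshold, or
    - part of the second video is darker than a luminance threshold.
    Any one satisfied condition is enough in this sketch.
    """
    conditions = (
        target.get("in_first_predicted_course", False),
        target.get("course_overlaps", False),
        target.get("distance_m", float("inf")) < distance_threshold_m,
        target.get("min_region_luminance", 255) < luminance_threshold,
    )
    return any(conditions)

targets = [
    {"name": "vehicle 64a", "in_first_predicted_course": True, "distance_m": 18.0},
    {"name": "pedestrian 63d", "distance_m": 6.0},
]
for t in targets:
    print(t["name"], should_apply_specific_processing(t))
```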
  • the control unit 33 may change the shape of the guide line 66.
  • the control unit 33 may change the shape of the guide line 66 in a region where the guide line 66 and the detection target overlap.
  • FIG. 18 shows one example of the shape of the guide line 66.
  • in the example of FIG. 18, the guide line 66 is not displayed in the area where it overlaps the detection target.
  • the change of the shape of the guide line 66 is not limited to erasing; other design changes are also possible.
  • the design change includes a color change, a transmittance change, a type change to a broken line, a line thickness change, a blinking, and the like.
  • the control unit 33 may change the shape of the guide line 66 when the sixth marker is not displayed.
  • the control unit 33 may change the shape of the guide line 66 when the first marker 71 and the second marker 72 are displayed on the detection target.
  • various markers corresponding to the detection target detected on the first video are displayed on the display device 40 while being superimposed on the second video.
  • the target person 60 who visually recognizes the marker can recognize the relative positional relationship between the moving body 50 and the detection target at a glance. For this reason, the convenience of the technique of displaying video of the outside of the moving body 50 is improved.
  • examples in which the image processing device 30 synthesizes the guide wall image 80 and other images into the display area of the first video and displays the resulting videos on the display device 40 will now be described specifically.
  • FIG. 19 shows a sixth example of the detection region 61 in the first video acquired by the image processing device 30 from the imaging device 20.
  • the imaging device 20 may be disposed so as to capture the rear of the moving body 50 that is a vehicle.
  • the detection area 61 is longer in the left-right direction than in the up-down direction.
  • the display area 62 is located at the center of the detection area 61 in the left-right direction.
  • the detection region 61 is not limited to a shape that is longer in the left-right direction than in the up-down direction, and various shapes such as a square, a vertically long rectangle, and a circle are possible.
  • the display area 62 can have various positions, sizes, and shapes.
  • the display area 62 is not necessarily located in the center of the detection area 61, and may be on the left or right.
  • the display area 62 is not limited to a part of the detection area 61, and the entire detection area 61 may be the display area 62.
  • the control unit 33 may change the display area 62 on the first video in accordance with a pinch-in operation, a pinch-out operation, or the like for the display device 40.
  • the control unit 33 generates a second video by synthesizing the guide wall image 80 (see FIG. 20), which indicates the first predicted course 65 of the moving body 50, into the display area of the first video acquired from the imaging device 20.
  • the control unit 33 may output an image obtained by superimposing the guide wall image 80 on the second video and display the image on the display device 40.
  • the guide wall image 80 can indicate a predicted course when the moving body 50 moves backward.
  • the guide wall image 80 is a marker that gives a three-dimensional impression to the subject 60.
  • the guide wall image 80 can be regarded as the image that would be projected onto the first video if the imaging device 20 captured a virtual guide wall, arranged in the real space shown in the first video, that guides the predicted course.
  • in the field of view of the subject 60, the guide wall image 80 appears to be composed of a plurality of virtual translucent walls extending from the road surface to a predetermined height in the height direction. It can be said that the guide wall image 80 is a three-dimensional display of the guide lines displayed on the display device 40 when the vehicle moves backward.
  • the wall-shaped display includes a display according to another aspect such as a surface, a film, or a plate.
  • the three-dimensional display of the guide wall image 80 makes it easy for the subject 60 to grasp the position of the parking space on the road surface, the distance to the obstacle, and the like.
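  • One conceivable way to obtain the guide wall image 80 as the projection of a virtual wall placed in real space is a simple pinhole projection, sketched below with placeholder camera intrinsics and mounting height; none of these values come from the specification.

```python
import numpy as np

def project_wall(points, fx=800.0, fy=800.0, cx=640.0, cy=360.0, cam_height=1.0):
    """Project points given as (lateral_m, height_above_road_m, depth_m) in a
    rear-camera coordinate frame onto the image plane with a pinhole model.
    Intrinsics and the camera mounting height are placeholder assumptions."""
    pts = np.asarray(points, dtype=float)
    lateral, height, depth = pts[:, 0], pts[:, 1], pts[:, 2]
    y_down = cam_height - height          # image y axis points down from the camera axis
    u = fx * lateral / depth + cx
    v = fy * y_down / depth + cy
    return np.stack([u, v], axis=1)

# Corners of a virtual left side wall of the guide wall: 1.0 m high,
# running from 1 m to 5 m behind the vehicle, 0.9 m left of the camera axis.
left_wall = [(-0.9, 0.0, 1.0), (-0.9, 0.0, 5.0), (-0.9, 1.0, 5.0), (-0.9, 1.0, 1.0)]
print(project_wall(left_wall))   # image-plane polygon to draw as side wall image 81a
```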
  • the control unit 33 can recognize the detection target in the display area 62 of the first video.
  • the control unit 33 may detect a detection target that is inside the display area 62 on the first video and reflected in the first predicted course 65.
  • the control unit 33 can synthesize a virtual plane, as the first recognition wall image 83, on the moving body 50 side of the recognized detection target.
  • the first recognition wall image 83 moves together with the detection target when the detection target moves.
  • FIG. 20 shows an example of the second video corresponding to the display area 62 of the first video shown in FIG.
  • the guide wall image 80 is disposed away from the lower end of the second video.
  • the guide wall image 80 may include two side wall images 81a and 81b extending in the height direction and the depth direction.
  • the side wall image 81a is located on the left side in the horizontal direction
  • the side wall image 81b is located on the right side in the horizontal direction.
  • a region between the two side wall images 81a and 81b indicates a first predicted course 65 through which the moving body 50 that is a vehicle passes.
  • the interval between the sidewall images 81a and 81b may be set to indicate the vehicle width of the moving body 50.
  • the control unit 33 may acquire the moving body information from the communication unit 31 and bend the side wall images 81a and 81b according to the predicted course.
  • the moving body information includes the steering angle of the steering wheel.
  • the guide wall image 80 may include a plurality of distance wall images 82a, 82b, and 82c that extend in the height direction and the horizontal direction and indicate the distance in the depth direction from the moving body 50.
  • the distance wall images 82a, 82b, and 82c can connect the side wall images 81a and 81b. In FIG. 20, three distance wall images 82a, 82b, and 82c are shown.
  • the number of distance wall images is not limited to three, and can be two or four or more.
  • the distance in the depth direction between the distance wall images 82a and 82b may be narrower than the distance in the depth direction between the distance wall images 82b and 82c.
  • the length in the height direction of the distance wall images 82a, 82b, and 82c may be longer or shorter than the length in the height direction of the side wall images 81a and 81b.
  • a frame may be displayed on the outer periphery of each of the side wall images 81a and 81b and the distance wall images 82a, 82b, and 82c of the guide wall image 80.
  • the guide wall image 80 may include at least one of an auxiliary line 84a extending in the depth direction, an auxiliary line 84b extending in the horizontal direction, and an auxiliary line 84c extending in the height direction.
  • the guide wall image 80 may include all of auxiliary lines 84a extending in the depth direction, auxiliary lines 84b extending in the horizontal direction, and auxiliary lines 84c extending in the height direction.
  • the auxiliary line 84a extending in the depth direction can be displayed on the wall surfaces of the side wall images 81a and 81b.
  • the auxiliary line 84b extending in the horizontal direction can be displayed on the wall surfaces of the distance wall images 82a, 82b, and 82c.
  • the auxiliary line 84c extending in the height direction can be displayed on the wall surfaces of the side wall images 81a and 81b and the distance wall images 82a, 82b, and 82c.
  • Each auxiliary line 84a, 84b, 84c can be displayed in any number of 1 or more.
  • the intervals between the auxiliary lines 84a, 84b, and 84c may be varied according to conditions such as the height from the road surface and the distance from the moving body 50.
  • the frames displayed on the outer peripheries of the side wall images 81a, 81b and the distance wall images 82a, 82b, 82c can be regarded as a part of the plurality of auxiliary lines 84a, 84b, 84c.
  • in FIG. 20 and the following figures, even when there are a plurality of auxiliary lines 84a, 84b, and 84c, only one of each is given a reference numeral.
  • the distance wall image 82a is a first distance wall image closer to the moving body 50 than the other distance wall images 82b and 82c when mapped to the real space in which the first video is projected.
  • the distance wall image 82a can have a different color from the other distance wall images 82b and 82c.
  • the distance wall images 82b and 82c can be displayed in semi-transparent white, and the distance wall image 82a can be displayed in semi-transparent red.
  • the sidewall images 81a and 81b can be displayed in translucent white.
  • the above color setting is merely an example, and the color of the side wall images 81a and 81b may be different from the distance wall images 82b and 82c.
  • the transmittance of the sidewall images 81a and 81b may be changed according to the position in the depth direction.
  • the transmittance of the side wall images 81a and 81b can be set so as to be lower toward the near side in the depth direction and higher toward the far side.
  • the side wall images 81a and 81b may have different saturation or brightness depending on the position in the depth direction. For example, the saturation can be increased toward the near side in the depth direction.
  • the saturation of the image can be rephrased as the color intensity of the image. It can be said that a continuous change in transmittance, saturation, lightness, chromaticity, etc. has gradation.
  • the transmittance of the distance wall images 82a, 82b, 82c may be changed according to the height from the road surface.
  • the transmittance of the distance wall images 82a, 82b, and 82c can be set so as to increase as the distance from the road surface increases in the height direction.
  • the distance wall images 82a, 82b, and 82c may have different saturation or brightness according to the height from the road surface. For example, the saturation can be lowered as the distance from the road surface increases.
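  • The height-dependent transmittance (gradation) described above could be realized, for example, with a per-row alpha map blended onto the second video; the following sketch uses assumed alpha values and image sizes and is not the specification's implementation.

```python
import numpy as np

def wall_alpha_by_height(height_px: int, width_px: int,
                         alpha_bottom: float = 0.6, alpha_top: float = 0.1) -> np.ndarray:
    """Alpha (opacity) map for a distance wall image: more opaque near the road
    surface (bottom row) and nearly transparent at the top, i.e. the
    transmittance increases with height from the road surface."""
    column = np.linspace(alpha_top, alpha_bottom, height_px)   # top row -> bottom row
    return np.tile(column[:, None], (1, width_px))

def blend_wall(frame: np.ndarray, wall_rgb, alpha: np.ndarray, y0: int, x0: int) -> np.ndarray:
    """Alpha-blend a uniformly coloured wall patch onto the frame."""
    h, w = alpha.shape
    region = frame[y0:y0 + h, x0:x0 + w].astype(np.float32)
    colour = np.asarray(wall_rgb, dtype=np.float32)
    blended = (1 - alpha[..., None]) * region + alpha[..., None] * colour
    frame[y0:y0 + h, x0:x0 + w] = blended.astype(np.uint8)
    return frame

frame = np.zeros((360, 640, 3), dtype=np.uint8)
alpha = wall_alpha_by_height(80, 200)
frame = blend_wall(frame, wall_rgb=(255, 255, 255), alpha=alpha, y0=200, x0=220)
print(frame[279, 300], frame[201, 300])   # more opaque near the bottom of the wall
```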
  • the auxiliary lines 84b and 84c on the distance wall image 82a which is the first distance wall image may have the same color as the auxiliary lines 84b and 84c of the other distance wall images 82b and 82c.
  • the auxiliary lines 84a, 84b, 84c can be of any color. For example, when the color of the wall surface of the guide wall image 80 other than the distance wall image 82a is white, the colors of the auxiliary lines 84a, 84b, and 84c may be white with higher luminance or lower transmittance than these wall surfaces.
  • the control unit 33 may detect the vehicle 64 shown in the display area 62 of the first video as a detection target.
  • the vehicle 64 is located in the first predicted course 65.
  • the detection target detected by the control unit 33 is not limited to a vehicle, and may be a building, a fence, an obstacle on a road, a person, an animal, or the like.
  • the control unit 33 can synthesize the first recognition wall image 83 on the recognized moving object 50 side of the detection target.
  • the control unit 33 may change the length of the guide wall image 80 in the height direction.
  • the control unit 33 can shorten the length of the guide wall image 80 in the height direction. Accordingly, the control unit 33 can make the display easy to see even when the first recognition wall image 83 is displayed, and can direct the attention of the subject 60 to the detection target.
  • the first recognition wall image 83 to be detected located in the first predicted course 65 extends in the height direction and the horizontal direction.
  • the first recognition wall image 83 can be displayed as an opaque or translucent surface of any color.
  • the first recognition wall image 83 is displayed as a translucent surface having a rectangular shape on the moving body 50 side of the vehicle 64.
  • the horizontal width of the first recognition wall image 83 may correspond to the horizontal width of the vehicle 64.
  • the width in the horizontal direction of the first recognition wall image 83 can be made substantially coincident with the lateral width of the vehicle 64.
  • the height of the first recognition wall image 83 can be set to a predetermined value.
  • the transmittance of the first recognition wall image 83 may be changed according to the height direction from the road surface.
  • the transmittance of the first recognition wall image 83 can be set so as to increase with increasing distance from the road surface.
  • the first recognition wall image 83 may have different saturation or brightness according to the height direction distance from the road surface. For example, the saturation can be lowered as the distance from the road surface increases.
  • the first recognition wall image 83 may include an auxiliary line extending in the horizontal direction and the height direction.
  • the control unit 33 can move the auxiliary line extending in the horizontal direction of the first recognition wall image 83 in the height direction within the frame of the first recognition wall image 83.
  • the auxiliary line can be displayed so that it appears from the lower end of the frame of the first recognition wall image 83 and disappears from the upper end.
  • by moving the auxiliary line extending in the horizontal direction of the first recognition wall image 83 in the height direction, the control unit 33 can display either or both of the relative position of the moving body 50 and the vehicle 64 and the change in the relative distance between them. For example, when the distance between the moving body 50 and the vehicle 64 is decreasing, the control unit 33 can move the auxiliary line extending in the horizontal direction quickly in the height direction.
  • the control unit 33 may move the auxiliary line faster in the height direction as the distance between the moving body 50 and the vehicle 64 is shorter.
  • the control unit 33 may move the auxiliary line extending in the horizontal direction of the first recognition wall image 83 in the height direction regardless of the position of the detection target and the distance from the detection target.
  • the color of the first recognition wall image 83 can be changed depending on the distance in the depth direction from the moving body 50 to the vehicle 64. For example, as shown in FIG. 20, when the distance between the moving body 50 and the vehicle 64 is longer than a predetermined distance, the control unit 33 can display the first recognition wall image 83 in blue. When the distance to the vehicle 64 becomes shorter due to the backward movement of the moving body 50, the control unit 33 may change the color of the first recognition wall image 83 from blue to yellow and from yellow to red.
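  • A minimal sketch of the blue/yellow/red colour change of the first recognition wall image 83 as a function of the depth-direction distance; the distance thresholds are illustrative assumptions.

```python
def recognition_wall_colour(distance_m: float,
                            caution_m: float = 5.0, warning_m: float = 2.0):
    """Map the depth-direction distance to the vehicle 64 to an RGB colour for
    the first recognition wall image 83: blue when far, yellow at a caution
    distance, red when close. The thresholds are illustrative values only."""
    if distance_m > caution_m:
        return (0, 0, 255)      # blue
    if distance_m > warning_m:
        return (255, 255, 0)    # yellow
    return (255, 0, 0)          # red

for d in (8.0, 3.5, 1.0):
    print(d, recognition_wall_colour(d))
```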
  • the control unit 33 may also refrain from displaying the first recognition wall image 83.
  • the control unit 33 may display the first recognition wall image 83 after highlighting the vehicle 64 in the second video.
  • the highlighting includes thick line display of the contour line of the vehicle 64, blinking of the contour line of the vehicle 64, painting of the image of the vehicle 64, and the like.
  • the control unit 33 may change the display mode according to a change in the relative position of the vehicle 64. For example, when the vehicle 64 moves away from the moving body 50, the first recognition wall image 83 can be hidden.
  • FIG. 21 is a diagram illustrating an example of a second image in a state in which the distance between the moving body 50 and the vehicle 64 is narrowed due to the backward movement of the moving body 50.
  • the first recognition wall image 83 is displayed in red. Accordingly, it is possible to warn the subject 60 that there is a risk of colliding with the vehicle 64 if the moving body 50 moves further backward.
  • the control unit 33 can change the display of the guide wall image 80 when the detection target and the guide wall image 80 overlap. For example, as illustrated in FIG. 21, the control unit 33 can at least partially hide the portion of the guide wall image 80 that overlaps the vehicle 64 in the second video. More specifically, the control unit 33 does not display on the display device 40 the portion of the guide wall image 80 that overlaps the vehicle 64 and is positioned farther from the moving body 50 in the depth direction than the vehicle 64 or the first recognition wall image 83. In this way, the display device 40 can present an image that the target person 60 can grasp easily and intuitively.
  • the method for changing the display of the guide wall image 80 is not limited to non-display of the overlapping portion.
  • for example, the control unit 33 can increase the transmittance of the portion of the guide wall image 80 that overlaps the detection target.
  • the guide wall image 80 is not limited to the examples described above. FIGS. 22 to 24 show modified examples of the guide wall image 80.
  • of the auxiliary lines 84b extending in the horizontal direction on the distance wall images 82a, 82b, and 82c, only the one at the lowest position in the height direction may be displayed.
  • displaying only the auxiliary line 84b at the lowest position on the distance wall image 82a makes the height of the ground surface easy to recognize.
  • displaying only the auxiliary line 84b at the lowest position on the distance wall image 82a also makes the distance from the recognition wall image easy to grasp.
  • An auxiliary line 84c extending in the height direction can be displayed at both ends of the distance wall images 82a, 82b, and 82c in the horizontal direction.
  • an auxiliary line 84b positioned at the lower end and an auxiliary line 84c extending in the height direction positioned at both ends in the horizontal direction can be displayed.
  • the distance wall images 82a, 82b, and 82c may be displayed by the auxiliary line 84b extending in the horizontal direction at the lower end of the outer periphery and the auxiliary lines 84c extending in the height direction at both the left and right ends.
  • the distance wall images 82a, 82b, and 82c can be displayed including a surface having a translucent color.
  • the distance wall images 82a, 82b, and 82c can increase the transmittance as the distance from the road surface increases in the height direction, or can reduce the color density.
  • the auxiliary lines 84b and 84c located on the outer periphery of the distance wall image 82a closest to the moving body 50 may be displayed in a different color from the other auxiliary lines.
  • the different color can be, for example, red.
  • the other auxiliary lines can be white, but are not limited thereto.
  • the first recognition wall image 83 displayed in front of the vehicle 64 is displayed as a semi-transparent surface of an arbitrary color, and may be displayed so that the transmittance increases, or the color density decreases, as the distance from the road surface increases in the height direction.
  • the first recognition wall image 83 may be displayed so as to disappear gradually in the height direction without displaying the upper boundary so as to be visible.
  • auxiliary lines 84a extending in the depth direction of the side wall images 81a and 81b of the guide wall image 80 are displayed at the upper and lower ends of the side wall images 81a and 81b.
  • the lower auxiliary line 84a1 can be highlighted more than the upper auxiliary line 84a2.
  • the highlighting includes displaying using a thicker line, a line with lower transmittance, a line with higher luminance, or the like.
  • the upper auxiliary line 84a2 and the lower auxiliary line 84a1 may be lines having the same display mode.
  • the line of the same mode is a line that is similar in thickness, transmittance, luminance, and the like.
  • the auxiliary line 84a extending in the depth direction may be displayed by using a display method that gradually fades away at the farthest position in the depth direction.
  • the sidewall images 81a and 81b may be displayed so that the end of the farthest position in the depth direction can be visually recognized.
  • an auxiliary line 84c extending in the height direction can be displayed at least at a position intersecting with the end on the most front side and the distance wall images 82a, 82b, and 82c.
  • These auxiliary lines 84c extending in the height direction may be highlighted more than auxiliary lines 84c extending in the other height directions.
  • the auxiliary lines 84c extending in the height direction may have different thicknesses.
  • the auxiliary line 84c can be displayed thicker than the auxiliary line 84c displayed at the position where the auxiliary line 84c intersects the distance wall image 82b, 82c.
  • the guide wall image 80 shown in FIG. 23 is different from the guide wall image 80 of FIG. 22 in the side wall images 81a and 81b.
  • two or more auxiliary lines 84a extending in the depth direction of the side wall images 81a and 81b of the guide wall image 80 can be displayed including the uppermost and lowermost ones.
  • the auxiliary line 84a extending in the depth direction may highlight the auxiliary line positioned at the bottom in the height direction more than other auxiliary lines.
  • the plurality of auxiliary lines 84a may all be displayed as lines having the same mode.
  • the guide wall image 80 in FIG. 23 may employ a change in transmittance, a change in color, a color scheme, and the like that can be employed in the guide wall image 80 in FIG.
  • the guide wall image 80 shown in FIG. 24 is different from the guide wall image 80 of FIG. 22 in the side wall images 81a and 81b.
  • the auxiliary line 84a extending in the depth direction of the side wall images 81a and 81b of the guide wall image 80 displays only the auxiliary line 84a located at the lower end.
  • the auxiliary line 84a extending in the depth direction located at the lower end of the guide wall image 80 may correspond to the line on the road surface when mapped to the real space in which the first video is projected together with the auxiliary line 84b extending in the horizontal direction.
  • the auxiliary lines 84c extending in the height direction may have different thicknesses.
  • among the auxiliary lines 84c arranged from the near side to the far side in the depth direction, an auxiliary line 84c located farther may be thinner.
  • the display device 40 may not display part or all of the auxiliary lines 84a, 84b, and 84c in the guide wall images 80 shown in the drawings of FIGS.
  • FIG. 25 shows a seventh example of the detection region 61 in the first video acquired by the image processing device 30 from the imaging device 20.
  • the detection area 61 is longer in the left-right direction than in the up-down direction.
  • the display area 62 is located at the center of the detection area 61 in the left-right direction.
  • the display area 62 is not limited to a part of the detection area 61, and the entire detection area 61 may be the display area 62.
  • the moving body 50 is a vehicle and can be moved backward toward the parking space of the parking lot.
  • the imaging device 20 is located behind the moving body 50 and can capture the rear.
  • the control unit 33 may detect, from the detection area 61, the vehicle 64b and the pedestrian 63 that are outside the first predicted course 65 and at least partially shown in the display area 62 as detection targets. In the example illustrated in FIG. 25, the control unit 33 determines that the distance in the depth direction of the vehicle 64a reflected in the first predicted course 65 of the moving body 50 is longer than a predetermined distance and does not detect it.
  • FIG. 26 shows an example of the second video corresponding to the display area 62 of the first video shown in FIG.
  • the control unit 33 recognizes the detection targets, the vehicle 64b and the pedestrian 63, in the display area 62 of the first video, and displays the first recognition wall images 83a and 83b, which are virtual planes, on the moving body 50 side of the vehicle 64b and the pedestrian 63, respectively.
  • the first recognition wall images 83a and 83b can be displayed as translucent surfaces parallel to the side wall images 81a and 81b extending in the height direction and the depth direction.
  • the control unit 33 may acquire the relative positions of the vehicle 64b and the pedestrian 63 with respect to the moving body 50 and the changes in their distances, and may display the side wall images 81a and 81b only when a distance is changing in the decreasing direction.
  • an arbitrary algorithm can be used for estimating the distance between the moving object 50 and the detection target.
  • the control unit 33 may estimate the horizontal distances from the detection targets, the vehicle 64b and the pedestrian 63, to the guide wall image 80, and determine the display colors of the first recognition wall images 83a and 83b according to the estimated distances.
  • the distance between the detection target and the guide wall image 80 is the distance between the detection target and the virtual guide wall, assuming that the virtual guide wall represented by the guide wall image 80 is present in the real space shown in the first video.
  • the control unit 33 can estimate the first distance between the position where the guide wall image 80 is mapped to the real space where the first video is projected and the position of the detection target.
  • the control unit 33 may set thresholds for the first distance and, for each of the first recognition wall images 83a and 83b, apply different display colors such as blue, yellow, and red in order from the longest horizontal distance to the shortest, and display them on the display device 40.
  • the first recognition wall images 83a and 83b can have auxiliary lines extending in the depth direction.
  • the control unit 33 can move the auxiliary lines extending in the depth direction of the first recognition wall images 83a and 83b in the height direction, in the same way as the auxiliary lines extending in the horizontal direction of the first recognition wall image 83 in the sixth example.
  • the control unit 33 estimates the first position as the position of the detection target vehicle 64b and the pedestrian 63 in the depth direction.
  • the control unit 33 may at least partially change the display of the side wall images 81a and 81b facing the vehicle 64b and the pedestrian 63, respectively, corresponding to the first position.
  • Examples of the display change include a change in the color, brightness, and saturation of the wall surface, a change in the color and thickness of the frame line and / or auxiliary line, or a combination thereof.
  • the display colors of the part 81a1 of the side wall image 81a facing the vehicle 64b and the part 81b1 of the side wall image 81b facing the pedestrian 63 are changed.
  • a part of the side wall images 81a and 81b may be a region divided by the auxiliary lines 84c extending in the height direction on the side wall images 81a and 81b. The display colors of the part 81a1 of the side wall image 81a and the part 81b1 of the side wall image 81b may be made different colors according to the first distance.
  • the control unit 33 may change the overall display color of the side wall image 81a or 81b when the detection target is an obstacle larger than the length represented by the side wall images 81a and 81b. Large obstacles include large vehicles such as trailers, buildings, and the like.
  • the control unit 33 may estimate the second predicted course of the detection target, and may estimate the first position as a position where the second predicted course and the sidewall images 81a and 81b intersect.
  • the control unit 33 can estimate the first position as a position where the second predicted course in the real space in which the first video is projected and the mapping of the sidewall images 81a and 81b to the real space intersect.
  • the control unit 33 may change the display of the side wall images 81a and 81b at least partially corresponding to the first position. Also in this case, the display colors of the portions 81a1 and 81b1 of the side wall images 81a and 81b whose display is to be changed may be different colors depending on the first distance.
  • the control unit 33 may change the overall display color of the sidewall image 81a or 81b.
  • the control unit 33 may change the display of the portions of the side wall images 81a and 81b that are located farther in the depth direction than the first recognition wall images 83a and 83b.
  • the control unit 33 may change the display of the portions of the side wall images 81a and 81b that are located farther in the depth direction than the position where the second predicted course and the side wall images 81a and 81b intersect.
  • the display of the first recognition wall images 83a and 83b may be changed according to the height from the road surface. Similar to the first recognition wall image 83 of the sixth example, the display of the first recognition wall images 83a and 83b is performed by setting the transmittance, saturation, brightness, etc. to different values according to the height from the road surface. sell.
  • the display system 10 can spatially and three-dimensionally express the distance and / or position of the detection target that can be an obstacle for backward travel using the guide wall image 80 and the first recognition wall images 83a and 83b.
  • a position where there is a risk of collision can be displayed three-dimensionally from the predicted course of the detection target that is moving.
  • the display system 10 represents the guide wall image 80 and the first recognition wall images 83a and 83b as walls (or surfaces) having a height direction, and can thereby represent the detection target and the warning content spatially. This makes it easy for the target person 60 to understand the display and the warning content.
  • FIG. 27 shows an eighth example of the detection region 61 in the first video acquired by the image processing device 30 from the imaging device 20.
  • the imaging device 20 may be disposed so as to capture the rear of the moving body 50 that is a vehicle.
  • the detection area 61 is longer in the left-right direction than in the up-down direction.
  • the display area 62 is located at the center of the detection area 61 in the left-right direction.
  • the control unit 33 may detect, as detection targets, the pedestrian 63 and the vehicle 64b reflected in the second area, which is outside the display area 62 and inside the detection area 61 on the first video.
  • the control unit 33 determines that the distance in the depth direction of the vehicle 64a shown in the display area 62 is longer than a predetermined distance and does not detect it.
  • FIG. 28 is a diagram schematically illustrating the arrangement of the moving body 50, the detection target pedestrian 63, and the vehicle 64b viewed from above along the height direction in the real space in which the first video is projected.
  • the imaging device 20 can image a subject in a range included in the areas F1 and F2 in the real space. The areas F1 and F2 correspond to the detection region 61 of the first video.
  • the area F1 is a first area corresponding to the display area 62 of the first video.
  • the area F2 corresponds to the second area inside the detection area 61 of the first video and outside the display area 62.
  • a virtual guide wall 90 obtained by mapping the guide wall image 80 displayed in the second video into the real space is displayed by a broken line.
  • the virtual side walls 91a and 91b correspond to the side wall images 81a and 81b displayed in the second video.
  • the virtual distance walls 92a, 92b, and 92c correspond to the distance wall images 82a, 82b, and 82c displayed in the second video.
  • virtual lines 95l1 to 95l5 and 95r1 to 95r5 indicating the horizontal distance of the detection target from the virtual side walls 91a and 91b of the moving body 50 are indicated by broken lines for the sake of explanation. This horizontal distance is also called the second distance.
  • virtual lines 95l1 to 95l5 are located on the left side of the moving body 50.
  • virtual lines 95r1 to 95r5 are located on the right side of the moving body 50.
  • the intervals between the virtual lines 95l1 to 95l5 and between the virtual lines 95r1 to 95r5 can be set arbitrarily.
  • the intervals between the virtual lines 95l1 to 95l5 and between the virtual lines 95r1 to 95r5 may be equal.
  • from the first video, the control unit 33 can determine which of the virtual lines 95l1 to 95l5 and 95r1 to 95r5 the detection target is closest to, or between which two of the virtual lines 95l1 to 95l5 and 95r1 to 95r5 it is located.
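  • As a rough illustration, the following sketch buckets the lateral offset of a detection target into the bands defined by the virtual lines 95l1 to 95l5 / 95r1 to 95r5; the equal spacing and the spacing value are assumptions for illustration.

```python
def classify_lateral_position(lateral_offset_m: float, line_spacing_m: float = 0.5,
                              n_lines: int = 5) -> str:
    """Decide where a detection target lies relative to the virtual lines
    95l1..95l5 / 95r1..95r5, from its lateral offset (metres) measured from the
    nearer virtual side wall (positive = right of 91b, negative = left of 91a).
    Equal spacing of the virtual lines is an assumption for illustration."""
    side = "r" if lateral_offset_m >= 0 else "l"
    band = int(abs(lateral_offset_m) // line_spacing_m) + 1   # 1-based band index
    if band > n_lines:
        return f"farther than 95{side}{n_lines}"
    return f"within the band bounded by 95{side}{band}"

print(classify_lateral_position(1.2))    # e.g. pedestrian 63 on the right side
print(classify_lateral_position(-2.8))   # e.g. vehicle 64b far to the left
```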
  • FIG. 29 illustrates a second video corresponding to the first video of FIG. 27.
  • the guide wall image 80 may employ a display method similar to that described in the sixth and seventh examples.
  • the control unit 33 synthesizes the second recognition wall images 89a and 89b at the end of the second video where the detection target exists.
  • the second recognition wall images 89a and 89b may include wall portions 96a and 96b extending in the height direction and the depth direction, and floor portions 97a and 97b extending in the depth direction and the horizontal direction, respectively.
  • the walls 96a and 96b extend from the lower end to the upper end of the left and right ends of the second video.
  • the walls 96a and 96b may extend up and down along only part of the left and right ends of the second video, not the whole. Similarly, the floors 97a and 97b can extend in the left-right direction along the whole or part of the lower end of the second video. The floor portions 97a and 97b may be displayed between the guide wall image 80 and the lower end of the second video.
  • the control unit 33 may change the display of the second recognition wall images 89a and 89b according to the second distance to be detected.
  • Changes in display include changes in color, transmittance, saturation, brightness, size, and the like, and changes in dynamic display methods such as blinking and movement.
  • the display change may be represented by a color change.
  • the display color may be red when the degree of danger is high, yellow when the degree of danger is medium, and blue when the degree of danger is low.
  • the pedestrian 63 shown in FIG. 28 is located on the third virtual line 95r3 to the right, in the horizontal direction, of the virtual side wall 91b of the moving body 50.
  • the control unit 33 can set the degree of risk to a medium level and the right second recognition wall image 89b to have a yellow color.
  • the vehicle 64b shown in FIG. 28 is located farther from the virtual side wall 91a of the moving body 50 than the fifth virtual line 95l5 on the left side in the horizontal direction.
  • the control unit 33 may determine that the degree of risk is low and set the color of the second recognition wall image 89a on the left side to be blue.
  • the display change may include a lateral movement of the vertical auxiliary lines 96a1 and 96b1 displayed on the walls 96a and 96b.
  • the control unit 33 may change the speed of this movement on the walls 96a and 96b according to the length of the second distance.
  • the second recognition wall images 89a and 89b can be translucent.
  • the wall portions 96a and 96b of the second recognition wall images 89a and 89b may change display characteristics in the vertical direction.
  • the wall portions 96a and 96b of the second recognition wall images 89a and 89b may have lower transmittance toward the bottom.
  • the second recognition wall images 89a and 89b may have higher saturation toward the bottom.
  • the floor portions 97a and 97b of the second recognition wall images 89a and 89b may change display characteristics in the left-right direction.
  • the floor portions 97a and 97b of the second recognition wall images 89a and 89b may have a lower transmittance toward the left and right ends, and a higher transmittance toward the center of the second video.
  • the saturation of the floor portions 97a and 97b of the second recognition wall images 89a and 89b may be reduced from the left and right ends toward the center of the second video.
  • the saturation of the image can be rephrased as the color intensity of the image.
  • the control unit 33 may determine the type of the detection target and display the icon images 99a and 99b corresponding to the type on the floors 97a and 97b of the second recognition wall images 89a and 89b.
  • the type of the detection target can be determined by a known method based on the detection target image detected from the first video. For example, the control unit 33 can determine the object type by matching the contour shape of the detection target with a pattern serving as a model. For example, for pedestrian 63 and vehicle 64b, icons indicating people and vehicles can be selected, respectively.
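  • The sketch below is a crude stand-in for the contour/pattern matching mentioned above: it only shows the icon-selection step from a bounding-box feature, with illustrative thresholds that are not from the specification.

```python
def select_icon(bbox_w_px: int, bbox_h_px: int) -> str:
    """Very crude stand-in for contour/pattern matching: pick an icon from the
    bounding-box aspect ratio of the detection target. Tall-and-narrow regions
    are treated as pedestrians, wide regions as vehicles."""
    aspect = bbox_w_px / bbox_h_px
    if aspect < 0.7:
        return "pedestrian icon"
    if aspect > 1.2:
        return "vehicle icon"
    return "generic obstacle icon"

print(select_icon(40, 110))   # pedestrian 63 -> pedestrian icon
print(select_icon(180, 90))   # vehicle 64b  -> vehicle icon
```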
  • the control unit 33 can estimate the second distance that is the distance in the horizontal direction between the virtual side walls 91a and 91b obtained by mapping the side wall images 81a and 81b in the real space and the detection target in the real space.
  • the second distance can be determined from the relationship between the position of the detection target image and the lines obtained by mapping the virtual lines 95l1 to 95l5 and 95r1 to 95r5 in the real space onto the detection area 61 of the first video.
  • the control unit 33 can change the position where the icon images 99a and 99b are displayed according to the second distance.
  • first distance identification lines 98l1 to 98l5 and 98r1 to 98r5, corresponding respectively to the virtual lines 95l1 to 95l5 and 95r1 to 95r5 in the real space, are provided.
  • the first distance identification lines 98l1 to 98l5 are located on the left side of the side wall image 81a and are numbered 98l1, 98l2, 98l3, 98l4, 98l5 in order from the center to the left side.
  • the first distance identification lines 98r1 to 98r5 are located on the right side of the side wall image 81b and are numbered 98r1, 98r2, 98r3, 98r4, 98r5 in order from the center to the right side.
  • the first distance identification lines 98l1 to 98l5 and 98r1 to 98r5 indicate positions where the icon images 99a and 99b are displayed, and need not themselves be displayed in the second video.
  • since the pedestrian 63 is located on the third virtual line 95r3 to the right of the virtual side wall 91b of the moving body 50, the control unit 33 may arrange the icon image 99b of the pedestrian 63 on the third first distance identification line 98r3 on the right side in the second video of FIG. 29.
  • as shown in FIG. 29, the icon image 99b may be displayed three-dimensionally on the first distance identification line 98r3 extending in the depth direction on the second video, as a surface having length in the depth direction and the height direction.
  • the icon image 99b may be displayed as a plane parallel to the side wall images 81a and 81b and the second recognition wall images 89a and 89b. Similar to the second recognition wall image 89b, the icon image 99b may have different display forms such as color, saturation, and brightness depending on the second distance. For example, the icon image 99b can be displayed in yellow.
  • the vehicle 64b is located farther from the virtual side wall 91a of the moving body 50 than the fifth virtual line 95l5 on the left side in the horizontal direction.
  • the control unit 33 may arrange the icon image 99a of the vehicle 64b on the fifth first distance identification line 98l5 on the left side in the second video of FIG. 29. Alternatively, the icon image 99a may be arranged at the leftmost position on the floor portion 97a.
  • the icon image 99a can have different display modes according to the second distance. For example, the icon image 99a can be displayed in blue.
  • as the detection target moves, the positions and display modes of the second recognition wall images 89a and 89b and the icon images 99a and 99b change. Further, when the detection target enters the display area 62, it may be displayed in the manner described in the seventh example. An example of the change in display when the vehicle 64b moves from the region F2 to the region F1 in the real space shown in FIG. 28 will be described.
  • the vehicle 64b is located on the left side of the virtual line 95l5 in the real space of FIG. 28.
  • in this state, the second recognition wall image 89a is displayed on the second video in FIG. 29, and the icon image 99a is displayed on the leftmost side of the floor portion 97a of the second recognition wall image 89a.
  • the second recognition wall image 89a and the icon image 99a can be displayed in blue.
  • as the vehicle 64b approaches, the second recognition wall image 89a displayed in the second video of FIG. 29 changes its display mode.
  • the color of the second recognition wall image 89a sequentially changes from blue to yellow and red.
  • the icon image 99a moves its display position from the outermost first distance identification line 98l5 toward the first distance identification line 98l1.
  • the color of the icon image 99a may change sequentially from blue to yellow and red. Accordingly, the subject 60 can recognize that the vehicle 64b is approaching outside that is not displayed in the second video. Further, by using the three-dimensional display method, the target person 60 can spatially grasp the approach of the vehicle 64b, and can reduce the possibility of overlooking information.
  • the icon image 99a can move toward the center of the screen.
  • the subject 60 can recognize the moving direction of the vehicle 64b through the movement of the icon image 99a.
  • when the vehicle 64b enters the area F1 from the area F2 in the real space of FIG. 28, the vehicle 64b enters the display target range of the first video and is therefore displayed in the second video on the display device 40.
  • the second recognition wall image 89a is not displayed, and the first recognition wall image 83a is displayed on the guide wall image 80 side of the vehicle 64b as shown in FIG.
  • the control unit 33 may blink the second recognition wall image 89a a plurality of times in order to call attention of the subject 60.
  • the control unit 33 displays the first recognition wall image 83a on the moving body 50 side of the vehicle 64b.
  • control unit 33 estimates the second predicted course of the vehicle 64b, and estimates the first position as the position where the second predicted course and the sidewall images 81a and 81b intersect.
  • the control unit 33 may change the display of the part 81a1 of the side wall image 81a including the first position.
  • the control unit 33 can sequentially change the color of the part 81a1 of the side wall image 81a to blue, yellow, and red as the vehicle 64b approaches. Accordingly, the subject 60 can easily grasp that the vehicle 64b is approaching and that the second predicted course of the vehicle 64b overlaps the first predicted course 65 of the moving body 50.
  • the position of the detection target is presented by the three-dimensional display of the guide wall image, the first recognition wall image, the second recognition wall image, the icon image, and the like, which makes the position easy to grasp spatially. Furthermore, according to one of the plurality of embodiments of the present disclosure, information such as cautions and warnings can be provided in an easily understandable manner by changing the display mode of each wall surface. In addition, by using a three-dimensional display, it is possible to reduce the possibility of overlooking information.
  • FIG. 30 shows another example of the second video corresponding to the first video of FIG. 27.
  • view display areas 101 and 102 indicating the selection status between “Normal view display” and “Wide view display” are added to the lower left. “Normal view display” and “wide view display” will be described later.
  • the display mode of the guide wall image 80, the second recognition wall images 89a and 89b, and the icon images 99a and 99b may be partially different from that in FIG.
  • a guide wall image 80 having a display mode similar to that in FIG. 22 is displayed.
  • the wall portions 96a and 96b of the second recognition wall images 89a and 89b can have one or more auxiliary lines 96a1 and 96b1 extending in the height direction.
  • in FIG. 30, the walls 96a and 96b have a plurality of auxiliary lines 96a1 and 96b1, but only one of each is given a reference numeral.
  • the wall portions 96a and 96b of the second recognition wall images 89a and 89b represent the distance to the detection target by color, and the transmittance may be different from the near side to the far side in the depth direction.
  • the walls 96a and 96b may gradually increase in transmittance in the height direction, or gradually decrease in darkness, and disappear at a predetermined position in the height direction.
  • the control unit 33 can display, by means of the transmittance or color density of the walls 96a and 96b and changes in the auxiliary lines 96a1 and 96b1, that an object is present within a predetermined distance range, as well as the distance to the detection target or the change in that distance.
  • a change in the transmittance or color density of the walls 96a and 96b can also be referred to as a gradation change.
  • the control unit 33 can indicate that the detection target exists within a predetermined distance by the movement of the auxiliary lines 96a1 and 96b1 of the walls 96a and 96b.
  • the control unit 33 can darken the color on the near side of the walls 96a and 96b, and can indicate that the detection target exists within a predetermined distance by the gradation change in the depth direction. Changes in the auxiliary lines 96a1 and 96b1 and in the transmittance of the walls 96a and 96b are not essential.
  • the control unit 33 can refrain from displaying the auxiliary lines 96a1 and 96b1 on the walls 96a and 96b.
  • the control unit 33 can prevent the wall portions 96a and 96b from changing the transmittance (gradation).
  • the floor portions 97a and 97b of the second recognition wall images 89a and 89b are provided with one or more second distance identification lines 100l and 100r, respectively, in addition to the first distance identification lines 98l and 98r.
  • FIG. 30 there are a plurality of second distance identification lines 100l and 100r, but only one of them is shown with a reference numeral.
  • the first distance identification lines 98l and 98r and the second distance identification lines 100l and 100r may not be displayed on the display.
  • the first distance identification lines 98l and 98r correspond to the distance in the horizontal direction from the moving body 50 to the detection target, while the second distance identification lines 100l and 100r can correspond to the distance in the depth direction.
  • Each of the plurality of second distance identification lines 100l and 100r may correspond to a distance in the depth direction when each auxiliary line 84c extending in the height direction on the sidewall images 81a and 81b is mapped to the real space.
  • the target person 60 can grasp the positions in the horizontal direction and the depth direction up to the detection target according to the positions in the horizontal and depth directions of the icon images 99a and 99b on the floor portions 97a and 97b.
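  • As a hedged sketch, the placement of an icon image on the floor portion could be obtained by quantizing the target's lateral and depth distances onto the first and second distance identification lines; the spacings and line counts below are assumptions, not values from the specification.

```python
def icon_position_on_floor(lateral_m: float, depth_m: float,
                           lateral_spacing_m: float = 0.5, depth_spacing_m: float = 1.0,
                           n_lateral: int = 5, n_depth: int = 3):
    """Place an icon image on the floor portion 97a/97b by quantising the
    target position: the lateral distance selects a first distance
    identification line (98l/98r), the depth distance selects a second
    distance identification line (100l/100r)."""
    lateral_idx = min(max(int(abs(lateral_m) // lateral_spacing_m) + 1, 1), n_lateral)
    depth_idx = min(max(int(depth_m // depth_spacing_m) + 1, 1), n_depth)
    side = "r" if lateral_m >= 0 else "l"
    return f"98{side}{lateral_idx}", f"100{side}{depth_idx}"

# Pedestrian 63: roughly 1.2 m to the right, 2.5 m behind the vehicle.
print(icon_position_on_floor(1.2, 2.5))   # ('98r3', '100r3')
```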
  • the icon images 99a and 99b a three-dimensional image corresponding to the type of detection target is used.
  • the floor portions 97a and 97b may be disposed on the near side of the nearest ends of the side wall images 81a and 81b, and inside lines extending from the side wall images 81a and 81b.
  • the positional relationship between the second recognition wall images 89a and 89b and the guide wall image 80 is not limited to this, and can be changed as appropriate.
  • the display device 40 has, in the lower right part, view display areas 101 and 102 showing the selection status of the "normal view display" and the "wide view display".
  • “Normal view display” and “wide view display” differ in the angle of view of the displayed video.
  • the “normal view display” corresponds to the angle of view of the display area 62 of the second video shown in FIG.
  • the “wide view display” corresponds to the angle of view of the detection area 61 of the first video shown in FIG.
  • the “wide view display” does not have to be the entire video of the detection area 61 of the first video, and the video of the angle of view including the “second video” may be cut out from the detection area 61.
  • each of the second videos shown in FIG. 30 and the preceding figures is a normal view display.
  • Whether the displayed video is “normal view display” or “wide view display” is indicated by changing the display of the view display areas 101 and 102 indicating the selection status.
  • changing the display of the area indicating the selection status includes changing the color of the view display areas 101 and 102, changing the density, changing the color of the character, and the like.
  • the control unit 33 performs “normal view display” and “wide view display” according to the result of processing the image of the first video or the input from the subject 60 via the display device 40 or the like. Can be switched. For example, when detecting that the detection target is approaching in the state of “normal view display”, the control unit 33 may switch the video to be displayed to “wide view display”. The control unit 33 may switch the video to be displayed to “normal view display” when detecting that there is no detection target within a predetermined range in the state of “wide view display”. When the display device 40 includes a touch panel screen, the control unit 33 receives a signal indicating that the target person 60 has touched and selected the view display area 101 or 102 on the screen from the display device 40, and The display may be switched.
  • FIG. 31 is a diagram showing an example of an image switched from “normal view display” shown in FIG. 30 to “wide view display”.
  • the angle of view of the “wide view display” image in FIG. 31 is wider in the horizontal direction than the angle of view of the “normal view display” image in FIG.
  • the pedestrian 63 and the vehicle 64b that are located outside the display range of the “normal view display” displayed in FIG. 30 are displayed.
  • since the horizontal angle of view is wider than in the "normal view display" of FIG. 30, the interval between the left and right side wall images 81a and 81b of the guide wall image 80 is displayed relatively narrower.
  • the second recognition wall images 89a and 89b disappear.
  • the control unit 33 recognizes the detection targets, the vehicle 64b and the pedestrian 63, in the detection area 61 of the first video, and can display the first recognition wall images 83a and 83b, which are virtual planes, on the moving body 50 side of the vehicle 64b and the pedestrian 63, respectively.
  • the control unit 33 estimates the positions of the detection target vehicle 64b and the pedestrian 63 in the depth direction.
  • the control unit 33 may change the display of the side wall images 81a and 81b facing the vehicle 64b and the pedestrian 63 based on the estimated positions. In FIG. 31, the display of the part 81b1 of the side wall image 81b and the part 81a1 of the side wall image 81a is changed corresponding to the pedestrian 63 and the vehicle 64b, respectively.
  • the display change includes various modes.
  • the display change includes a color change, a change in the thickness of an auxiliary line surrounding the surroundings, a line type change, a blinking start and stop, and a blinking cycle change.
  • the first recognition wall images 83a and 83b and the portions 81a1 and 81b1 of the side wall images 81a and 81b can employ a display method and a display mode similar to the example of FIG. 26.
  • the “normal view display” and the “wide view display” are displayed on the display device 40 on the video captured by the imaging device 20. It is possible to switch between and display.
  • by means of the second recognition wall images 89a and 89b and the icon images 99a and 99b, the subject 60 can easily grasp the presence and position of a detection target located outside the range of the displayed video.
  • the display device 40 switches automatically or manually from the "normal view display" to the "wide view display", so that the target person 60 can confirm on the video a detection target located outside the video range of the "normal view display".
  • the positional relationship with the detection target can be grasped three-dimensionally by displaying the guide wall image 80 and the first recognition wall image 83b.
  • the display system 10 according to one of the plurality of embodiments can make the subject 60 easily grasp the surrounding situation and detect the danger.
  • FIG. 32 shows another example of the second video.
  • FIG. 32 corresponds to FIG.
  • In FIG. 32, the guide wall image 80 is drawn with a white line.
  • The first recognition wall image 83 is also drawn with a white line.
  • The second video includes a first translucent image 80a.
  • The second video also includes a second translucent image 83c.
  • The control unit 33 can generate the second video by sequentially superimposing the first image and the second image on the display area of the first video acquired from the imaging device 20.
  • Specifically, the control unit 33 can generate a composite image by sequentially superimposing the first image and the second image on each frame image of the first video.
  • The control unit 33 can output the second video and display it on the display device 40.
  • The second image can include a guide wall image 80 showing the first predicted course 65 of the moving body 50.
  • The first image includes a first translucent image 80a (a transparent gradation image) whose transmittance increases from the lower side toward the upper side.
  • The control unit 33 can superimpose the guide wall image 80 on the first translucent image 80a.
  • The control unit 33 can superimpose the first translucent image 80a on the display area of the first video.
  • In other words, the control unit 33 can composite the first translucent image 80a between the display area of the first video and the guide wall image 80. A minimal per-frame compositing sketch is given below.
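The layering order described above (video frame at the bottom, first translucent image in the middle, guide wall image on top) can be illustrated with the following hypothetical per-frame alpha-compositing sketch using NumPy. The function names and the per-pixel alpha representation are assumptions for illustration only.

```python
import numpy as np


def alpha_blend(base, overlay_rgb, overlay_alpha):
    """Blend an RGB overlay onto a base image using a per-pixel alpha in [0, 1]."""
    a = overlay_alpha[..., None]          # broadcast the alpha over the RGB channels
    return overlay_rgb * a + base * (1.0 - a)


def compose_second_video_frame(frame, gradation_rgb, gradation_alpha,
                               guide_wall_rgb, guide_wall_alpha):
    """Composite the first image (gradation) first, then the second image
    (guide wall) on top, matching the layering order described above."""
    out = alpha_blend(frame.astype(np.float64), gradation_rgb, gradation_alpha)
    out = alpha_blend(out, guide_wall_rgb, guide_wall_alpha)
    return np.clip(out, 0, 255).astype(np.uint8)


# Tiny usage example: a 4x4 gray frame, a half-opaque black gradation layer,
# and a half-opaque white "guide wall" layer.
frame = np.full((4, 4, 3), 128, dtype=np.uint8)
out = compose_second_video_frame(frame,
                                 np.zeros((4, 4, 3)), np.full((4, 4), 0.5),
                                 np.full((4, 4, 3), 255.0), np.full((4, 4), 0.5))
print(out[0, 0])
```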
  • The first translucent image 80a has a lower transmittance at positions closer to its lower end.
  • The first translucent image 80a becomes closer to a transparent color as it approaches the upper end from the lower end.
  • Specifically, the first translucent image 80a gradually changes from a second color, different from the first color of the guide wall image 80, to a transparent color.
  • The first color can include, for example, white or cyan.
  • The second color can include, for example, black.
  • The guide wall image 80 is easily visible because of the difference between the first color and the second color.
  • The guide wall image 80 remains easily visible even in a situation where the first video contains many colors close to the first color.
  • For example, the white guide wall image 80 in the example of FIG. 32 can be easily recognized even on snow-covered ground owing to the presence of the first translucent image 80a. Since the transmittance is high on the upper side, the second video is close to the first video in the upper region. In the second video, little color is added to the area in which distant objects appear, so visibility of the region beyond the guide wall image 80 is hardly affected.
  • The second video therefore readily conveys a sense of depth.
  • Because the first translucent image 80a increases in transmittance from the lower side toward the upper side, the second video gives a bright impression. A minimal sketch of such a gradation layer is given below.
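A transparent gradation layer of this kind could be sketched as follows, assuming a linear fall-off of opacity from the lower end to the upper end and black as the second color; the resolution and the maximum opacity value are likewise assumptions.

```python
import numpy as np


def make_gradation_layer(height, width,
                         color=(0, 0, 0),          # second color, e.g. black
                         max_opacity=0.6):
    """Return (rgb, alpha): alpha rises linearly from 0 at the top row to
    max_opacity at the bottom row, i.e. the transmittance increases upward."""
    alpha = np.tile(np.linspace(0.0, max_opacity, height)[:, None], (1, width))
    rgb = np.full((height, width, 3), color, dtype=np.float64)
    return rgb, alpha


# Example: a 480x640 layer that could be passed to the compositing sketch above.
grad_rgb, grad_alpha = make_gradation_layer(480, 640)
print(grad_alpha[0, 0], grad_alpha[-1, 0])   # 0.0 at the top, 0.6 at the bottom
```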
  • Similarly, the control unit 33 can generate the second video by sequentially superimposing the third image and the fourth image on the display area of a recognized detection target in the first video.
  • The fourth image may include a first recognition wall image 83 indicating the presence of the recognized detection target.
  • The third image includes a second translucent image 83c.
  • The control unit 33 can superimpose the first recognition wall image 83 on the second translucent image 83c.
  • The control unit 33 can superimpose the second translucent image 83c on the display area of the detection target.
  • In other words, the control unit 33 can composite the second translucent image 83c between the display area of the detection target and the first recognition wall image 83.
  • The second translucent image 83c has a lower transmittance at positions closer to its lower end.
  • The second translucent image 83c becomes closer to a transparent color as it approaches the upper end from the lower end.
  • Specifically, the second translucent image 83c gradually changes from a fourth color, different from the third color of the first recognition wall image 83, to a transparent color.
  • The third color can include, for example, red, yellow, white, or cyan.
  • The fourth color can include, for example, black.
  • The first recognition wall image 83 is easily visible because of the difference between the third color and the fourth color.
  • The first recognition wall image 83 remains easily visible even in a situation where the recognized detection target has colors close to the third color.
  • For example, the white first recognition wall image 83 in the example of FIG. 32 is easily recognized even when displayed in front of a white truck owing to the presence of the second translucent image 83c. A minimal sketch of this localized compositing is given below.
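To illustrate how the third and fourth images might be applied only within the recognized detection target's display area, a hypothetical sketch follows. The bounding-box representation, the requirement that the overlay layers are pre-sized to that box, and all names are assumptions rather than the patent's method.

```python
import numpy as np


def blend(base, rgb, alpha):
    """Per-pixel alpha blend (alpha in [0, 1]) of an RGB layer onto a base region."""
    a = alpha[..., None]
    return rgb * a + base * (1.0 - a)


def compose_recognition_wall(frame, box, backing_rgb, backing_alpha,
                             wall_rgb, wall_alpha):
    """Composite the second translucent image (third image) and then the first
    recognition wall image (fourth image) inside the target's display area only.
    box = (top, bottom, left, right) in pixel coordinates; the overlay layers
    are assumed to already match the size of that box."""
    top, bottom, left, right = box
    out = frame.astype(np.float64)
    region = out[top:bottom, left:right]
    region = blend(region, backing_rgb, backing_alpha)   # third image: translucent backing
    region = blend(region, wall_rgb, wall_alpha)         # fourth image: recognition wall
    out[top:bottom, left:right] = region
    return np.clip(out, 0, 255).astype(np.uint8)
```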
  • Each component and function of the display system 10 may be rearranged.
  • For example, part or all of the configuration and functions of the image processing device 30 may be included in at least one of the imaging device 20 and the display device 40.
  • The image processing device 30 or the like may be realized as a communication device such as a mobile phone or an external server, and may be connected to the other components of the display system 10 by wire or wirelessly.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Computer Graphics (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to an imaging device (20) provided with an imaging element (22) and a control unit (33). The imaging element (22) generates a first image by photographing the area behind a vehicle (50). The control unit (33) combines a guide wall image (80), which indicates the predicted trajectory of the vehicle, in a display region on the first image.
PCT/JP2017/042292 2016-12-09 2017-11-24 Dispositif d'imagerie, dispositif de traitement d'image, système d'affichage et véhicule WO2018105417A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
EP25153531.6A EP4520636A2 (fr) 2016-12-09 2017-11-24 Appareil d'imagerie, appareil de traitement d'image, système d'affichage et véhicule
EP22206620.1A EP4155128B1 (fr) 2016-12-09 2017-11-24 Appareil d'imagerie, appareil de traitement d'image, système d'affichage et véhicule
US16/467,715 US11010934B2 (en) 2016-12-09 2017-11-24 Imaging apparatus, image processing apparatus, display system, and vehicle
EP17878733.9A EP3554062B1 (fr) 2016-12-09 2017-11-24 Dispositif d'imagerie
US17/231,908 US11587267B2 (en) 2016-12-09 2021-04-15 Imaging apparatus, image processing apparatus, display system, and vehicle
US18/156,210 US11961162B2 (en) 2016-12-09 2023-01-18 Imaging apparatus, image processing apparatus, display system, and vehicle
US18/604,250 US20240221246A1 (en) 2016-12-09 2024-03-13 Imaging apparatus, image processing apparatus, display system, and vehicle

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016-239449 2016-12-09
JP2016239449A JP6762863B2 (ja) 2016-12-09 2016-12-09 撮像装置、画像処理装置、表示システム、および車両
JP2016-245775 2016-12-19
JP2016245775A JP6781035B2 (ja) 2016-12-19 2016-12-19 撮像装置、画像処理装置、表示システム、および車両

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/467,715 A-371-Of-International US11010934B2 (en) 2016-12-09 2017-11-24 Imaging apparatus, image processing apparatus, display system, and vehicle
US17/231,908 Division US11587267B2 (en) 2016-12-09 2021-04-15 Imaging apparatus, image processing apparatus, display system, and vehicle

Publications (1)

Publication Number Publication Date
WO2018105417A1 true WO2018105417A1 (fr) 2018-06-14

Family

ID=62491455

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/042292 WO2018105417A1 (fr) 2016-12-09 2017-11-24 Dispositif d'imagerie, dispositif de traitement d'image, système d'affichage et véhicule

Country Status (3)

Country Link
US (4) US11010934B2 (fr)
EP (3) EP3554062B1 (fr)
WO (1) WO2018105417A1 (fr)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017210266A1 (de) * 2017-06-20 2018-12-20 Zf Friedrichshafen Ag Umfeldüberwachung eines Ego-Fahrzeugs
JP6686988B2 (ja) * 2017-08-28 2020-04-22 株式会社Soken 映像出力装置及び映像生成プログラム
DE102017221191B4 (de) * 2017-11-27 2019-06-13 Volkswagen Aktiengesellschaft Verfahren zur Anzeige des Verlaufs einer Sicherheitszone vor einem Fahrzeug oder einem Objekt mit einer Anzeigeeinheit, Vorrichtung zur Durchführung des Verfahrens sowie Kraftfahrzeug und Computerprogramm
JP7074482B2 (ja) * 2018-01-17 2022-05-24 京セラ株式会社 画像処理装置、撮像システム、移動体、および画像処理方法
JP7106296B2 (ja) * 2018-02-28 2022-07-26 キヤノン株式会社 画像処理装置、画像処理方法及びプログラム
JP6857147B2 (ja) * 2018-03-15 2021-04-14 株式会社日立製作所 三次元画像処理装置、及び三次元画像処理方法
US11436932B2 (en) * 2018-04-27 2022-09-06 Red Six Aerospace Inc. Methods and systems to allow real pilots in real aircraft using augmented and virtual reality to meet in a virtual piece of airspace
US11887495B2 (en) 2018-04-27 2024-01-30 Red Six Aerospace Inc. Augmented reality for vehicle operations
DE102018214875A1 (de) * 2018-08-31 2020-03-05 Audi Ag Verfahren und Anordnung zum Erzeugen einer Umgebungsrepräsentation eines Fahrzeugs und Fahrzeug mit einer solchen Anordnung
JP7267755B2 (ja) * 2019-01-23 2023-05-02 株式会社小松製作所 作業機械のためのシステム及び方法
US11312416B2 (en) * 2019-01-30 2022-04-26 Toyota Motor Engineering & Manufacturing North America, Inc. Three-dimensional vehicle path guidelines
JP7209189B2 (ja) * 2019-03-25 2023-01-20 パナソニックIpマネジメント株式会社 画像生成装置、カメラ、表示システム、車両及び画像生成方法
CN113678433B (zh) * 2019-04-18 2024-05-03 三菱电机株式会社 车辆周边图像生成装置、车辆周边显示系统及车辆周边显示方法
US11475772B2 (en) * 2019-05-01 2022-10-18 Ottopia Technologies Ltd. System and method for remote operator assisted driving through collision warning
JP7116436B2 (ja) * 2020-03-05 2022-08-10 酒井重工業株式会社 建設車両の障害物検知装置
WO2021186853A1 (fr) * 2020-03-19 2021-09-23 日本電気株式会社 Dispositif de génération d'image, procédé de génération d'image et programme
US11421400B2 (en) * 2020-04-23 2022-08-23 Deere & Company Multiple mode operational system and method with object detection
JP7296345B2 (ja) * 2020-06-26 2023-06-22 酒井重工業株式会社 転圧ローラの障害物検知装置
JP7491194B2 (ja) 2020-11-23 2024-05-28 株式会社デンソー 周辺画像生成装置、表示制御方法
EP4068153A1 (fr) * 2021-03-31 2022-10-05 Honda Research Institute Europe GmbH Système d'aide à la conduite avancé pour aider un conducteur d'un véhicule
US11794766B2 (en) * 2021-10-14 2023-10-24 Huawei Technologies Co., Ltd. Systems and methods for prediction-based driver assistance
JP7398492B2 (ja) * 2022-03-14 2023-12-14 本田技研工業株式会社 制御装置、制御方法、及び制御プログラム
JP7492985B2 (ja) 2022-03-22 2024-05-30 本田技研工業株式会社 制御装置、制御方法、及び制御プログラム
JP2024011505A (ja) * 2022-07-14 2024-01-25 株式会社Subaru 車外リスクの視認誘導装置
JP2024041499A (ja) * 2022-09-14 2024-03-27 トヨタ自動車株式会社 表示制御装置、方法およびプログラム
DE102023208082A1 (de) * 2023-08-23 2025-02-27 Continental Autonomous Mobility Germany GmbH Verfahren zum Erzeugen einer Bilddarstellung, Steuereinrichtung, Fahrerassistenzsystem und Fahrzeug

Family Cites Families (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08167100A (ja) * 1994-12-12 1996-06-25 Hisaji Nakamura 操縦装置
JPH1134772A (ja) 1997-07-16 1999-02-09 Nissan Motor Co Ltd 車間距離表示装置
EP1408693A1 (fr) * 1998-04-07 2004-04-14 Matsushita Electric Industrial Co., Ltd. Appareil de visualisation embarqué, système de transmission d'image, appareil de transmission d'image, et appareil de capture d'image
JP3645196B2 (ja) * 2001-02-09 2005-05-11 松下電器産業株式会社 画像合成装置
JP4108314B2 (ja) * 2001-10-31 2008-06-25 トヨタ自動車株式会社 車両用周辺監視装置
JP4005456B2 (ja) * 2002-09-10 2007-11-07 株式会社オートネットワーク技術研究所 カメラ装置及び車両周辺視認装置
DE10250021A1 (de) * 2002-10-25 2004-05-13 Donnelly Hohe Gmbh & Co. Kg Verfahren zum Betrieb eines Darstellungssystems in einem Fahrzeug zum Auffinden eines Parkplatzes
US6847873B1 (en) * 2003-07-08 2005-01-25 Shih-Hsiung Li Driver information feedback and display system
JP2006040008A (ja) * 2004-07-28 2006-02-09 Auto Network Gijutsu Kenkyusho:Kk 運転支援装置
EP1696669B1 (fr) 2005-02-24 2013-07-03 Aisin Seiki Kabushiki Kaisha Dispositif de surveillance des environs d'un véhicule
JP4600760B2 (ja) * 2005-06-27 2010-12-15 アイシン精機株式会社 障害物検出装置
TWM284569U (en) * 2005-10-12 2006-01-01 Cing-Liang Lyu Wide-angle rearview mirror assembly for rearview and car-backing monitoring
JP4432929B2 (ja) * 2006-04-25 2010-03-17 トヨタ自動車株式会社 駐車支援装置及び駐車支援方法
JP4595976B2 (ja) * 2007-08-28 2010-12-08 株式会社デンソー 映像処理装置及びカメラ
JP5636609B2 (ja) * 2008-05-08 2014-12-10 アイシン精機株式会社 車両周辺表示装置
US8366054B2 (en) * 2009-12-17 2013-02-05 The United States Of America As Represented By The Secretary Of The Navy Hand launchable unmanned aerial vehicle
JP5071743B2 (ja) * 2010-01-19 2012-11-14 アイシン精機株式会社 車両周辺監視装置
CN201646714U (zh) * 2010-01-26 2010-11-24 德尔福技术有限公司 泊车导向系统
JP2013091331A (ja) * 2010-02-26 2013-05-16 Panasonic Corp 運転支援装置
DE102010046396A1 (de) * 2010-09-24 2012-03-29 Huf Hülsbeck & Fürst Gmbh & Co. Kg Kameraeinheit für ein Kraftfahrzeug
TW201221391A (en) * 2010-11-26 2012-06-01 Hon Hai Prec Ind Co Ltd Vehicle and backing up method thereof
WO2012071858A1 (fr) * 2010-12-02 2012-06-07 Liu Ansheng Dispositif de sécurité de conduite
DE102011010624B4 (de) * 2011-02-08 2014-10-16 Mekra Lang Gmbh & Co. Kg Anzeigevorrichtung für Sichtfelder eines Nutzfahrzeugs
WO2012145819A1 (fr) * 2011-04-25 2012-11-01 Magna International Inc. Procédé de traitement d'images pour détecter des objets à l'aide d'un mouvement relatif
DE102011105884B4 (de) * 2011-06-28 2019-02-21 Volkswagen Aktiengesellschaft Verfahren und Vorrichtung zum Einparken eines Fahrzeugs
JP5408198B2 (ja) * 2011-08-01 2014-02-05 日産自動車株式会社 映像表示装置及び映像表示方法
JP5845909B2 (ja) 2012-01-13 2016-01-20 アイシン精機株式会社 障害物警報装置
DE102013217430A1 (de) * 2012-09-04 2014-03-06 Magna Electronics, Inc. Fahrerassistenzsystem für ein Kraftfahrzeug
GB2546187B (en) * 2013-01-28 2017-11-01 Jaguar Land Rover Ltd Vehicle path prediction and obstacle indication system and method
JP6231345B2 (ja) 2013-10-18 2017-11-15 クラリオン株式会社 車両用発進支援装置
JP2015226146A (ja) * 2014-05-27 2015-12-14 学校法人関東学院 3次元モニタ装置
US9846969B2 (en) * 2014-12-01 2017-12-19 Thinkware Corporation Electronic apparatus, control method thereof, computer program, and computer-readable recording medium
WO2016140192A1 (fr) * 2015-03-04 2016-09-09 三菱電機株式会社 Dispositif de commande d'affichage de véhicule et dispositif d'affichage de véhicule
US10989929B2 (en) * 2016-02-05 2021-04-27 Maxell, Ltd. Head-up display apparatus
JP6383376B2 (ja) * 2016-03-31 2018-08-29 株式会社Subaru 周辺リスク表示装置
JP6828266B2 (ja) * 2016-04-18 2021-02-10 ソニー株式会社 画像表示装置及び画像表示装置、並びに移動体
US20170374287A1 (en) * 2016-06-23 2017-12-28 Werner Lang System for Visually Depicting Fields of View of a Commercial Vehicle
US20180075657A1 (en) * 2016-09-15 2018-03-15 Microsoft Technology Licensing, Llc Attribute modification tools for mixed reality

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003019937A (ja) * 2001-07-06 2003-01-21 Denso Corp 障害物検知装置
JP2004147083A (ja) * 2002-10-24 2004-05-20 Matsushita Electric Ind Co Ltd 運転支援装置
JP2004345554A (ja) * 2003-05-23 2004-12-09 Clarion Co Ltd 車両後方モニタ装置及び車両後方モニタシステム
JP2005045602A (ja) * 2003-07-23 2005-02-17 Matsushita Electric Works Ltd 車両用視界モニタシステム
JP2006252389A (ja) * 2005-03-14 2006-09-21 Aisin Seiki Co Ltd 周辺監視装置
JP2009040113A (ja) 2007-08-06 2009-02-26 Panasonic Corp 電子ミラー電源制御システム
JP2010047253A (ja) * 2008-05-08 2010-03-04 Aisin Seiki Co Ltd 車両周辺表示装置
JP2012227699A (ja) * 2011-04-19 2012-11-15 Toyota Motor Corp 画像表示装置、及び、画像表示方法
JP2013144492A (ja) * 2012-01-13 2013-07-25 Aisin Seiki Co Ltd 障害物警報装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3554062A4

Also Published As

Publication number Publication date
US11961162B2 (en) 2024-04-16
EP3554062A1 (fr) 2019-10-16
US11010934B2 (en) 2021-05-18
EP3554062A4 (fr) 2020-06-17
EP4155128B1 (fr) 2025-02-12
US20230154065A1 (en) 2023-05-18
EP3554062B1 (fr) 2022-12-14
EP4520636A2 (fr) 2025-03-12
US20240221246A1 (en) 2024-07-04
EP4155128A1 (fr) 2023-03-29
US11587267B2 (en) 2023-02-21
US20200302657A1 (en) 2020-09-24
US20210233290A1 (en) 2021-07-29

Similar Documents

Publication Publication Date Title
US11961162B2 (en) Imaging apparatus, image processing apparatus, display system, and vehicle
JP6762863B2 (ja) 撮像装置、画像処理装置、表示システム、および車両
JP6781035B2 (ja) 撮像装置、画像処理装置、表示システム、および車両
WO2022168540A1 (fr) Dispositif de commande d'affichage et programme de commande d'affichage
JP6413477B2 (ja) 画像表示制御装置および画像表示システム
US11938863B2 (en) Peripheral image generation device and display control method
WO2018092919A1 (fr) Dispositif de traitement d'image, dispositif d'imagerie et système d'affichage
JP2021007255A (ja) 撮像装置、画像処理装置、表示システム、および車両
JP2018085584A (ja) 画像処理装置、撮像装置、および表示システム
JP7007438B2 (ja) 撮像装置、画像処理装置、表示装置、表示システム、および車両
JP6974564B2 (ja) 表示制御装置
JP7296490B2 (ja) 表示制御装置及び車両
JP2022121370A (ja) 表示制御装置及び表示制御プログラム
CN115503745A (zh) 车辆用显示装置、车辆用显示系统、车辆用显示方法以及存储有程序的非临时性存储介质
JP6759072B2 (ja) 画像処理装置、撮像装置、および表示システム
JP6712942B2 (ja) 画像処理装置、撮像装置、および表示システム
JP6720063B2 (ja) 画像処理装置、撮像装置、および表示システム
JP7259377B2 (ja) 車両用表示装置、車両、表示方法及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17878733

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017878733

Country of ref document: EP

Effective date: 20190709

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载