
WO2018138411A1 - Modelling a vessel and its vicinity in real-time - Google Patents


Info

Publication number
WO2018138411A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
vessel
dimensional
dynamic
real
Application number
PCT/FI2018/050045
Other languages
French (fr)
Inventor
Sauli Eloranta
Oskar Levander
Original Assignee
Rolls-Royce Oy Ab
Application filed by Rolls-Royce Oy Ab filed Critical Rolls-Royce Oy Ab
Publication of WO2018138411A1 publication Critical patent/WO2018138411A1/en


Classifications

    • B63B49/00 Arrangements of nautical instruments or navigational aids
    • B63H25/00 Steering; slowing-down otherwise than by use of propulsive elements; dynamic anchoring, i.e. positioning vessels by means of main or auxiliary propulsive elements
    • G01C21/005 Navigation; navigational instruments not provided for in groups G01C1/00 - G01C19/00, with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/203 Instruments for performing navigational calculations specially adapted for water-borne vessels
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S7/003 Transmission of data between radar, sonar or lidar systems and remote stations
    • G05D1/0044 Control of position, course, altitude or attitude of land, water, air or space vehicles associated with a remote control arrangement, by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • G06T17/05 Geographic models

Definitions

  • the invention relates to modelling a vessel and its vicinity in real-time.
  • a marine vessel and its vicinity can be modelled, for example, by overlaying a radar screen picture and automatic identification system data on top of a satellite-position-based map view. This, however, provides only a limited representation of the marine vessel and its vicinity, and, for example, remote operation of the vessel is difficult.
  • a solution is provided where sensed data, dynamic data and static data from multiple sources is obtained and based on the obtained data it is possible to combine a real-time three dimensional view of a vessel and its vicinity.
  • the real-time three dimensional view may be used, for example, for remotely navigating the vessel.
  • a method for modelling a vessel and its vicinity in real-time comprises obtaining information about the location of the vessel; obtaining static data comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel; obtaining non-sensed dynamic data relating to the surroundings of the vessel; obtaining dynamic sensor data comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel; and combining, based on the static data, the non-sensed dynamic data and the dynamic sensor data, a real-time three-dimensional navigation model providing a real-time three-dimensional view of the vessel and its vicinity.
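The combining step described above can be sketched in code. The following Python fragment is purely illustrative: the class and function names (StaticData, NonSensedDynamicData, SensorData, NavigationModel, combine_model) are assumptions made for this sketch and do not appear in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class StaticData:                     # e.g. charts, topography, vessel 3D model
    vessel_model_3d: str
    chart: dict
    topography: dict

@dataclass
class NonSensedDynamicData:           # retrieved without the vessel's involvement
    weather: dict
    wave: dict
    local_time: str

@dataclass
class SensorData:                     # one onboard or external sensor source
    source: str
    readings: dict

@dataclass
class NavigationModel:
    """Real-time three-dimensional view of the vessel and its vicinity."""
    location: tuple
    layers: dict = field(default_factory=dict)

def combine_model(location, static, dynamic, sensors):
    # Combine the three data categories into one navigation model.
    model = NavigationModel(location=location)
    model.layers["static"] = static
    model.layers["dynamic"] = dynamic
    model.layers["sensed"] = {s.source: s.readings for s in sensors}
    return model
```

The same combining function could run onboard, in a cloud service or at a remote operator system, since each location may hold the same input data.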
  • the method further comprises causing display of the three-dimensional navigation model.
  • the method further comprises causing transmission of at least part of the three-dimensional navigation model to a remote receiver.
  • the method further comprises obtaining the static data from a local data storage, and obtaining the non-sensed dynamic data from at least one external entity via network access.
  • the static data comprises at least one of three-dimensional nautical chart data, three-dimensional topography data, map symbol data, shipping lane data (illustrated, for example, using different colors), three-dimensional model data of the vessel, marine infrastructure data, and port model data.
  • the non-sensed dynamic data comprises at least one of wave data, weather data, local time data, sea level data and data relating to identified nearby vessels.
  • the dynamic sensor data from at least one external entity comprises at least one of sensor data from a harbor, sensor data from at least one other nearby surface vessel, and sensor data from a drone.
  • the method further comprises receiving a selection of a virtual control point, and causing display of the three-dimensional navigation model for the real-time remote operation of the vessel based on the selected virtual control point.
  • an apparatus for modelling a vessel and its vicinity in real-time comprises means for obtaining information about the location of the vessel, means for obtaining static data comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel, means for obtaining non-sensed dynamic data relating to the surroundings of the vessel, means for obtaining dynamic sensor data comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel, and means for combining, based on the static data, the non-sensed dynamic data and the dynamic sensor data, a real-time three-dimensional navigation model providing a real-time three-dimensional view of the vessel and its vicinity.
  • the apparatus further comprises means for causing display of the three-dimensional navigation model.
  • the apparatus further comprises means for causing transmission of at least part of the three-dimensional navigation model to a remote receiver.
  • the apparatus further comprises means for obtaining the static data from a local data storage, and means for obtaining the non-sensed dynamic data from at least one external entity via network access.
  • the static data comprises at least one of three-dimensional nautical chart data, three-dimensional topography data, map symbol data, shipping lane data (illustrated, for example, using different colors), three-dimensional model data of the vessel, marine infrastructure data, and port model data.
  • the non-sensed dynamic data comprises at least one of wave data, weather data, local time data, sea level data, and data relating to identified nearby vessels.
  • the dynamic sensor data from at least one external entity comprises at least one of sensor data from a harbor, sensor data from at least one other nearby surface vessel and sensor data from a drone.
  • the apparatus further comprises means for receiving a selection of a virtual control point, and means for causing display of the three-dimensional navigation model for the real-time remote operation of the vessel based on the selected virtual control point.
  • a computer program comprising program code instructions, which when executed by at least one processor, cause the at least one processor to perform the method of the first aspect.
  • the computer program is embodied on a computer-readable medium.
  • a control system comprising at least one processing unit and at least one memory, wherein the at least one memory stores program instructions that, when executed by the at least one processing unit, cause the control system to obtain information about the location of a vessel; obtain static data comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel; obtain non-sensed dynamic data relating to the surroundings of the vessel; obtain dynamic sensor data comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel; and combine, based on the static data, the non-sensed dynamic data and the dynamic sensor data, a real-time three-dimensional navigation model providing a real-time three-dimensional view of the vessel and its vicinity.
  • FIG. 1 is a flow diagram illustrating a method for modelling a vessel and its vicinity real-time in accordance with an example embodiment.
  • FIG. 2 is an example block diagram of data used to combine a real-time three-dimensional navigation model in accordance with an example embodiment.
  • FIG. 3A is an example block diagram illustrating various entities connected to a remote operator system in accordance with an example embodiment.
  • FIG. 3B is an example block diagram illustrating various entities connected to a vessel system in accordance with an example embodiment.
  • FIG. 4 is an example block diagram of a remote operator system in accordance with an example embodiment.
  • FIG. 1 is a flow diagram illustrating a method for modelling a vessel and its vicinity real-time in accordance with an example embodiment.
  • FIG. 1 is discussed together with FIG. 2 illustrating data used to combine a real-time three-dimensional navigation model.
  • the vessel may automatically send its location, for example, periodically, to a receiving entity over a communication interface, or the receiving entity may request the location from the vessel over the communication interface.
  • static data 200 comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel is obtained.
  • the static data 200 may comprise three-dimensional nautical chart data, three-dimensional topography data, map symbol data, shipping lane data, three-dimensional model data of the vessel, marine infrastructure data, and/or port model data.
  • the static data 200 refers to data that remains unchanged in the course of time.
  • the static data may be stored locally by the entity combining the real-time three-dimensional navigation model.
  • at least part of the static data may be retrieved from at least one external data source, for example, via the internet.
  • the static local data may also be stored in multiple locations, for example, in the vessel, a remote operator center and a cloud storage.
  • non-sensed dynamic data 202 relating to the surroundings of the vessel is obtained.
  • the non-sensed dynamic data 202 may comprise wave data, weather data, local time data, and data relating to identified nearby vessels.
  • the non-sensed dynamic data 202 refers to dynamic data that can be retrieved without the vessel being involved in the data transmission.
  • the non-sensed dynamic data can be retrieved, for example, from at least one data source via the internet.
  • dynamic sensor data 204 comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data 204 for the vessel is obtained.
  • the dynamic sensor data 204 may comprise sensor data from a harbor received by the vessel and sensor data from a drone received by the vessel.
  • the dynamic sensor data 204 refers to data about objects that is not retrievable from onshore data resources.
  • the dynamic sensor data may be obtained from the vessel.
  • the vessel may send the dynamic sensor data, for example, to a cloud service and/or to a remote operator system.
  • the dynamic sensor data 204 is processed by the vessel so that the type of a tagged object may be sent to a remote server only once, after the identification has happened. For example, if the vessel identifies another nearby vessel, the type of that vessel (for example, a sailing boat) is sent to the remote server only once. The remote server is then able to generate a three-dimensional object model based on the information received from the vessel. The vessel may also use at least one sensor to determine the size of the object, which is sent once to the remote server (scaling the object to the right size). Further, the dynamic sensor data 204 may comprise location data relating to the object, and the location data may be sent to the remote server at a given frequency (to identify the location, speed and heading of the object).
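The bandwidth-saving scheme described above (object type and size sent once, location sent repeatedly) might be sketched as follows; the ObjectUplink class and its message format are illustrative assumptions, not part of the disclosure.

```python
class ObjectUplink:
    """Send a tagged object's type and size once, then only location fixes."""

    def __init__(self, send):
        self._send = send        # callable delivering one message to the remote server
        self._announced = set()  # ids of objects whose type was already sent

    def report(self, obj_id, obj_type, size_m, location):
        if obj_id not in self._announced:
            # Type and size cross the link only once per identified object.
            self._send({"id": obj_id, "type": obj_type, "size_m": size_m})
            self._announced.add(obj_id)
        # Location is sent at the chosen update frequency; the receiver
        # derives speed and heading from successive fixes.
        self._send({"id": obj_id, "location": location})
```

After the one-time announcement, the remote side can instantiate a generic three-dimensional model of the reported type and merely move it as location fixes arrive.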
  • in case of anomalies, for example, data unclarity or pre-identified risk criteria, an unidentified object can be replicated at the remote server with a special identity and warning, complemented with transfer of less processed data (for example, a photograph, focused video, radar data, laser scanning data, data representing a close match to a generic model that represents the type and/or size of the object etc.).
  • observations from different vessels may be combined to help in identifying the objects.
  • the combining process may be carried out onboard so that the vessels exchange data or alternatively onshore so that pieces of data from different vessels are combined.
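The onshore variant of combining observations could, under simple assumptions, amount to majority voting over the types reported by different vessels; the merge_observations helper and its data shapes below are hypothetical.

```python
from collections import Counter

def merge_observations(reports):
    """reports: iterable of (vessel_id, object_id, observed_type) tuples.

    Returns the most frequently reported type per object, combining
    observations from several vessels to help identify the objects.
    """
    votes = {}
    for _vessel, obj, observed_type in reports:
        votes.setdefault(obj, Counter())[observed_type] += 1
    return {obj: c.most_common(1)[0][0] for obj, c in votes.items()}
```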
  • a real-time three-dimensional navigation model 206 providing a real-time three-dimensional view of the vessel and its vicinity is combined.
  • the real-time three-dimensional navigation model 206 provides a realistic view of the vessel's vicinity.
  • the real-time three-dimensional navigation model 206 may comprise any combination of the following features: the vessel itself (as a three-dimensional model)
  • Waves and weather conditions are generated from wave, wind and weather data (for example, a view from the vessel's bridge would show simulated waves and vessel movements based on wave conditions)
  • Lighting (night/day) can be generated from local time
  • Terrain can be generated from topographic data
  • AIS (Automatic Identification System) data
  • Route plan data (available, for example, from the Enhanced Navigation Support Information (ENSI) system).
  • the route plan data may be displayed together with the own vessel or another vessel if it is available. It can be drawn as a curve in front of the vessel.
  • Sensed objects can be generated based on onshore categorization
  • Additional information can be augmented as an overlay (for example, "virtual" lighthouses, shipping lanes etc.)
  • the width of the shipping lane can be illustrated with a color shading (appearing on the surface in the displayed image), derived from the depth data of the Electronic Navigation Chart (ENC).
  • the safe area, which may be shaded, may depend on the draught of the ship, which, in turn, may depend on the loading of the ship. The safe area may also be affected by the sea level.
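One possible reading of the shaded safe area is a per-cell depth test: a chart cell is safe when its ENC depth plus the current sea-level offset clears the loading-dependent draught plus a safety margin. The function names and the one-metre margin below are illustrative assumptions, not from the patent.

```python
def is_safe(enc_depth_m, sea_level_offset_m, draught_m, margin_m=1.0):
    # Cell is safe when available water depth clears draught plus margin.
    return enc_depth_m + sea_level_offset_m >= draught_m + margin_m

def shade_lane(cell_depths_m, sea_level_offset_m, draught_m):
    """One shading value per chart cell, for the color overlay."""
    return ["safe" if is_safe(d, sea_level_offset_m, draught_m) else "unsafe"
            for d in cell_depths_m]
```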
  • a cloud service may at least partially store the static data, non-sensed dynamic data and dynamic sensor data provided by the different data sources.
  • the cloud service can have a central database as a back-up for possible communications errors, enabling, for example, feeding dynamic data to onboard use.
  • the cloud service may also pre-process the non-sensed dynamic data and send the processed data to the vessel as appropriate (for example, transferring only relevant dynamic objects that are close to the ship).
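The pre-processing step of transferring only nearby dynamic objects could be sketched as a simple distance filter; the nearby_objects helper, the 10 km default radius and the equirectangular approximation are assumptions for illustration.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def nearby_objects(ship_pos, objects, radius_m=10_000):
    """Keep only dynamic objects within radius_m of the ship.

    ship_pos: (lat, lon) in degrees; each object has a "position" key.
    Uses an equirectangular approximation, adequate at short range.
    """
    lat0, lon0 = map(math.radians, ship_pos)
    kept = []
    for obj in objects:
        lat, lon = map(math.radians, obj["position"])
        x = (lon - lon0) * math.cos((lat + lat0) / 2) * EARTH_RADIUS_M
        y = (lat - lat0) * EARTH_RADIUS_M
        if math.hypot(x, y) <= radius_m:
            kept.append(obj)
    return kept
```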
  • the necessary data (the static data, non-sensed dynamic data and sensed dynamic data) for combining the real-time three-dimensional navigation model may be simultaneously available in multiple locations, for example, in the vessel, the cloud service and the remote operator system. This enables creation of the real-time three-dimensional navigation model independently in multiple locations.
  • FIG. 3A is an example block diagram illustrating various entities connected to a remote operator system 300 in accordance with an example embodiment.
  • the remote operator system 300 is configured to generate the real-time three-dimensional navigation model 206 that enables, for example, remote navigation of a vessel.
  • the real-time three-dimensional navigation model 206 is generated based on static data 200, non-sensed dynamic data 202 and sensed dynamic data 204.
  • some or all of the static data 200 is available to the remote operator system 300 from at least one external data source.
  • the remote operator center 300 may store some or all of the static data 200 locally.
  • the static data 200 may comprise at least one of the following: map data (geographical, oceanographic, ports, shipping lanes, safe navigation areas, maritime infrastructure etc.), enhancement (shipping lanes highlighted, route planning, safe navigation areas, night/day views), and the vessel's own three-dimensional model.
  • Non-sensed dynamic data 202 comprises, for example, traffic data (for example, AIS, vessel traffic control data), weather data (for example, wind, rain, cloudiness, waves, fog, visibility, sea level), time of day data (day/night) and/or any other relevant data available that is not sensed by the vessel.
  • the non-sensed data 202 is received from at least one external data source, for example, via the internet.
  • the non-sensed dynamic data 202 may be pre-processed by an external entity and sent to the vessel, for example, from the remote operator system 300 or a cloud service, if the vessel, in addition to the remote operator system 300, generates the real-time three-dimensional navigation model 206. For example, only relevant dynamic objects that are close to the vessel may be sent to the vessel.
  • Sensed dynamic data 204 comprising data, for example, from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel may be obtained from the vessel.
  • the sensed dynamic data 204 may be pre-processed by the vessel so that the type of a tagged object is sent to the remote operator system 300 only once, after the identification has happened. For example, if the vessel identifies another nearby vessel, the type of that vessel (for example, a sailing boat) may be sent to the remote operator system only once.
  • the remote operator system 300 is then able to generate a three-dimensional object model based on the information received from the vessel.
  • the vessel may also use at least one sensor to determine the size of the object, which is sent once to the remote operator system 300; the remote operator system may use object scaling to scale the object to the right size.
  • the sensed dynamic data 204 may also comprise location data relating to the object, and the remote operator system 300 may receive the location data from the vessel at a given frequency (to identify the location, speed and heading of the object).
  • an unidentified object can be replicated at the remote operator system with a special identity and warning, complemented with transfer of less processed data (for example, focused video, radar data, laser scanning data, or two-dimensional data such as a photograph).
  • in the case of two-dimensional data, it may be scaled according to its relative distance from the vessel and placed in the real-time three-dimensional navigation model according to its relative location.
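One way to realize the described scaling of two-dimensional data is an inverse-distance sprite scale; sprite_scale, place_sprite and the reference distance are hypothetical names and values introduced only for this sketch.

```python
def sprite_scale(distance_m, reference_distance_m=100.0):
    """Scale factor 1.0 at the reference distance, halved at twice the distance."""
    return reference_distance_m / max(distance_m, 1e-6)

def place_sprite(vessel_pos, relative_offset, distance_m):
    # Position the 2D sprite (e.g. a photograph of an unidentified object)
    # at its relative location in the scene, scaled by its distance.
    x, y = vessel_pos
    dx, dy = relative_offset
    return {"position": (x + dx, y + dy), "scale": sprite_scale(distance_m)}
```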
  • the remote operator system 300 may retrieve information from an Automatic Identification System (AIS) data source 302 so that it is able to model other nearby ships in the real-time three-dimensional navigation model 206.
  • the remote operator system 300 may also retrieve information from other data sources, for example, route plan data available from, for example, the Enhanced Navigation Support Information (ENSI) system.
  • the remote operator system 300 combines the real-time three-dimensional navigation model 206 providing a three-dimensional view of the vessel 302 and its vicinity based on the static data 200, non-sensed dynamic data 202 and sensed dynamic data 204.
  • the real-time three-dimensional navigation model 206 regenerates a realistic view of the vessel's vicinity. This means that the view from the vessel can be regenerated onshore with the model, eliminating the need to transmit video or still images from onboard, and therefore significantly reducing the amount of data transferred from the vessel, for example, during remote control.
  • the realistic view provided by the real-time three- dimensional navigation model 206 is useful, for example, in remote operations when assisting or taking over navigation control from the vessel. Further, the view is identical onboard at the vessel and at a remote operating point.
  • the real-time three-dimensional navigation model 206 is not in itself tied to the location of the vessel.
  • the model is a real-time representation of all relevant information in the vessel's vicinity that is needed for vessel's safe navigation.
  • the vessel's position in the model determines the focused part of the model, and the model can be missing or be coarse elsewhere.
  • although FIG. 3A illustrates that the remote operator system 300 generates the three-dimensional navigation model 206, the three-dimensional navigation model 206 may also be generated by another entity.
  • the remote operator system 300 may receive a data stream providing the three-dimensional navigation model 206.
  • the real-time three-dimensional navigation model 206 may be shared in real-time with other entities, for example, other operator centers or ships (for example, in a convoy), either transferring all the navigation model data or only parts of it.
  • the vessel may filter and/or categorize dynamic sensor data prior to sending data to the remote operator system 300. This reduces the amount of data needed to be transmitted from the vessel to the remote operator system 300.
  • FIG. 3B is an example block diagram illustrating various entities connected to a vessel system 304 in accordance with an example embodiment.
  • the vessel system 304 is configured to generate the real-time three-dimensional navigation model 206.
  • the real-time three-dimensional navigation model 206 is generated based on static data 200, non-sensed dynamic data 202 and sensed dynamic data 204.
  • some or all of the static data 200 is available to the vessel system 304 from at least one external data source.
  • the vessel system 304 may store some or all of the static data 200 locally.
  • the static data 200 may comprise at least one of the following: map data (geographical, oceanographic, ports, shipping lanes, safe navigation areas, maritime infrastructure etc.), enhancement (shipping lanes highlighted, route planning, safe navigation areas, night/day views), and the vessel's own three-dimensional model.
  • Non-sensed dynamic data 202 comprises, for example, traffic data (for example, AIS, vessel traffic control data), weather data (for example, wind, rain, cloudiness, waves, fog, visibility, sea level), time of day data (day/night) and/or any other relevant data available that is not sensed by the vessel.
  • the non-sensed data 202 is received from at least one external data source, for example, via the internet.
  • the non-sensed dynamic data 202 may be pre-processed by an external entity and sent to the vessel, for example, from the remote operator system 300 or a cloud service. For example, only relevant dynamic objects that are close to the vessel may be sent to the vessel.
  • Sensed dynamic data 204 comprising data, for example, from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel may be obtained by the vessel itself.
  • the vessel may also use at least one sensor, for example, to determine the size of an object.
  • the sensed dynamic data 204 may also comprise location data relating to the object.
  • an unidentified object can be replicated in the real-time three-dimensional navigation model 206 with a special identity and warning, complemented with transfer of less processed data (for example, focused video, radar data, laser scanning data, or two-dimensional data such as a photograph).
  • in the case of two-dimensional data, it may be scaled according to its relative distance from the vessel and placed in the real-time three-dimensional navigation model 206 according to its relative location.
  • the vessel system 304 may retrieve information from an Automatic Identification System (AIS) data source so that it is able to model other nearby ships in the real-time three-dimensional navigation model 206.
  • the vessel system 304 may also retrieve information from other data sources, for example, route plan data available from, for example, the Enhanced Navigation Support Information (ENSI) system.
  • the vessel system 304 combines the real-time three-dimensional navigation model 206 providing a real-time three-dimensional view of the vessel 302 and its vicinity based on the static data 200, non-sensed dynamic data 202 and sensed dynamic data 204.
  • By providing the real-time three-dimensional navigation model 206 within the vessel's own information systems, improved situational awareness is provided for a manned vessel. The crew operating the vessel is able to see more clearly what happens in the vessel's vicinity. Even if the vessel is an autonomous vessel, improved situational awareness is provided for the vessel. Further, the autonomous vessel may transfer the entire three-dimensional navigation model 206, or parts of it, to the remote operator center 300 in case of problems.
  • an operator or a user using the real-time three-dimensional navigation model 206 is able to freely choose the point of view as the entire vicinity of the vessel is three-dimensionally modelled. For example, locating the point of view higher up can give a better view to see behind obstacles etc.
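The benefit of raising the point of view can be illustrated with a simple line-of-sight check; can_see_target and its flat-geometry model are assumptions made for illustration, not the patent's method.

```python
def can_see_target(camera_height_m, obstacle_height_m,
                   obstacle_dist_m, target_dist_m):
    """True if the straight sightline from an elevated viewpoint down to a
    target at sea level passes above an intervening obstacle (flat geometry)."""
    # Height of the sightline where it crosses the obstacle's distance.
    sightline_m = camera_height_m * (1 - obstacle_dist_m / target_dist_m)
    return sightline_m >= obstacle_height_m
```

Raising the virtual camera from 10 m to 100 m, for example, lets the sightline clear a 10 m obstacle halfway to the target, which is exactly the "see behind obstacles" effect described above.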
  • the data used in combining the three- dimensional navigation model 206 may be used to generate derived data.
  • waves can be generated from wind data.
  • the vessel's movements or speed loss can be derived from wave conditions and hull stresses from the vessel's movements.
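The derivation chain mentioned above (wind to waves, waves to speed loss and hull stress) might be sketched as below; the coefficients are placeholders chosen only to show the data flow, not an oceanographic or structural model.

```python
def wave_height_from_wind(wind_mps, k=0.025):
    # Placeholder monotonic relation: wave height grows with wind speed squared.
    return k * wind_mps ** 2

def speed_loss_from_waves(wave_height_m, c=0.4):
    # Placeholder: knots of speed lost per metre of wave height.
    return c * wave_height_m

def hull_stress_index(wave_height_m, speed_kn):
    # Placeholder dimensionless index combining sea state and speed.
    return wave_height_m * speed_kn
```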
  • an unmanned/autonomous ship may generate the real-time three-dimensional navigation model 206 onboard as discussed earlier (with onboard stored static data, non-sensed dynamic data accessible via the internet and onboard sensor data).
  • FIG. 4 is a more detailed schematic block diagram of an apparatus for modelling a vessel and its vicinity real-time.
  • the apparatus 400 could be any computer device, such as any suitable servers, workstations, personal computers, laptop computers or a system comprising several separate subsystems.
  • the illustrated apparatus 400 includes a controller or a processor 402 (for example, a signal processor, microprocessor, ASIC, or other control and processing logic circuitry) for performing such tasks as signal coding, data processing, input/output processing, power control, and/or other functions.
  • An operating system 422 controls the allocation and usage of the components of the apparatus 400 and provides support for one or more application programs 424.
  • the application programs 424 can include vessel control and operation related applications, or any other application.
  • the illustrated apparatus 400 includes one or more memory components, for example, a non-removable memory 426 and/or removable memory 404.
  • the non-removable memory 426 may include RAM, ROM, flash memory, a hard disk, or other well-known memory storage technologies.
  • the removable memory 404 may include flash memory (such as one or more removable flash drives) or smart cards.
  • the one or more memory components may be used for storing data and/or code for running the operating system 422 and the applications 424.
  • Examples of data may include text, images, sound files, image data, video data, or other data sets to be sent to and/or received from one or more network servers, other devices or marine vessels via one or more wired or wireless networks.
  • the apparatus 400 can support one or more input devices 408 and one or more output devices 416.
  • the input devices 408 may include, but are not limited to, a touchscreen 410 (for example, capable of capturing finger tap inputs, finger gesture inputs, multi-finger tap inputs, multi-finger gesture inputs, or keystroke inputs from a virtual keyboard or keypad), a microphone 412 (for example, capable of capturing voice input), and a physical keyboard 414.
  • Examples of the output devices 416 may include, but are not limited to, a speaker 418 and a display system 420. Other possible output devices (not shown) can include piezoelectric or other haptic output devices. Some devices can serve more than one input/output function.
  • the apparatus 400 can further include one or more input/output interfaces 406.
  • the illustrated components are not required or all-inclusive, as any of the components shown can be deleted and other components can be added.
  • At least one of the processor 402, the non-removable memory 426, the removable memory 404, the output devices 416, the input/output interfaces 406 and the input devices 408 may constitute means for obtaining information about the location of the vessel; means for obtaining static data comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel; means for obtaining non-sensed dynamic data relating to the surroundings of the vessel; means for obtaining dynamic sensor data comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel; and means for combining, based on the static data, the non-sensed dynamic data and the dynamic sensor data, a real-time three-dimensional navigation model providing a three-dimensional view of the vessel and its vicinity.
  • the exemplary embodiments can include, for example, any suitable servers, workstations, personal computers, laptop computers, other devices, and the like, capable of performing the processes of the exemplary embodiments.
  • the devices and subsystems of the exemplary embodiments can communicate with each other using any suitable protocol and can be implemented using one or more programmed computer systems or devices.
  • One or more interface mechanisms can be used with the exemplary embodiments, including, for example, Internet access, telecommunications in any suitable form (e.g., voice, modem, and the like), wireless communications media, and the like.
  • employed communications networks or links can include one or more satellite communications networks, wireless communications networks, cellular communications networks, 3G communications networks, 4G communications networks, Public Switched Telephone Networks (PSTNs), Packet Data Networks (PDNs), the Internet, intranets, a combination thereof, and the like.
  • the exemplary embodiments are for exemplary purposes, as many variations of the specific hardware used to implement the exemplary embodiments are possible, as will be appreciated by those skilled in the hardware and/or software art(s).
  • the functionality of one or more of the components of the exemplary embodiments can be implemented via one or more hardware and/or software devices.
  • the exemplary embodiments can store information relating to various processes described herein. This information can be stored in one or more memories, such as a hard disk, optical disk, magneto-optical disk, RAM, and the like.
  • One or more databases can store the information used to implement the exemplary embodiments of the present inventions.
  • the databases can be organized using data structures (e.g., records, tables, arrays, fields, graphs, trees, lists, and the like) included in one or more memories or storage devices listed herein.
  • the processes described with respect to the exemplary embodiments can include appropriate data structures for storing data collected and/or generated by the processes of the devices and subsystems of the exemplary embodiments in one or more databases.
  • All or a portion of the exemplary embodiments can be conveniently implemented using one or more general purpose processors, microprocessors, digital signal processors, micro-controllers, and the like, programmed according to the teachings of the exemplary embodiments of the present inventions, as will be appreciated by those skilled in the computer and/or software art(s).
  • Appropriate software can be readily prepared by programmers of ordinary skill based on the teachings of the exemplary embodiments, as will be appreciated by those skilled in the software art.
  • the exemplary embodiments can be implemented by the preparation of application-specific integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be appreciated by those skilled in the electrical art(s).
  • the exemplary embodiments are not limited to any specific combination of hardware and/or software.
  • the exemplary embodiments of the present inventions can include software for controlling the components of the exemplary embodiments, for driving the components of the exemplary embodiments, for enabling the components of the exemplary embodiments to interact with a human user, and the like.
  • software can include, but is not limited to, device drivers, firmware, operating systems, development tools, applications software, and the like.
  • Such computer readable media further can include the computer program product of an embodiment of the present inventions for performing all or a portion (if processing is distributed) of the processing performed in implementing the inventions.
  • Computer code devices of the exemplary embodiments of the present inventions can include any suitable interpretable or executable code mechanism.
  • the components of the exemplary embodiments can include computer readable medium or memories for holding instructions programmed according to the teachings of the present inventions and for holding data structures, tables, records, and/or other data described herein.
  • Computer readable medium can include any suitable medium that participates in providing instructions to a processor for execution. Such a medium can take many forms, including but not limited to, non-volatile media, volatile media, and the like.
  • Non-volatile media can include, for example, optical or magnetic disks, magneto-optical disks, and the like.
  • Volatile media can include dynamic memories, and the like.
  • Computer-readable media can include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other suitable magnetic medium, a CD-ROM, CD±R, CD±RW, DVD, DVD-RAM, DVD±RW, DVD±R, HD DVD, HD DVD-R, HD DVD-RW, HD DVD-RAM, Blu-ray Disc, any other suitable optical medium, punch cards, paper tape, optical mark sheets, any other suitable physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other suitable memory chip or cartridge, a carrier wave or any other suitable medium from which a computer can read.


Abstract

According to an aspect, there is provided a method and an apparatus for modelling a vessel and its vicinity in real-time. The method comprises obtaining static data comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel, obtaining non-sensed dynamic data relating to the surroundings of the vessel, obtaining dynamic sensor data comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel, and combining, based on the static data, non-sensed dynamic data and dynamic sensor data, a real-time three-dimensional navigation model providing a three-dimensional view of the vessel and its vicinity.

Description

MODELLING A VESSEL AND ITS VICINITY IN REAL-TIME
BACKGROUND
Field:
The invention relates to modelling a vessel and its vicinity in real-time.
Description of the Related Art:
A marine vessel and its vicinity can be modelled, for example, by overlaying a radar screen picture and automatic identification system data on top of a satellite-position-based map view. This, however, provides only a limited representation of the marine vessel and its vicinity, and, for example, remote operation of the vessel is difficult.
SUMMARY
A solution is provided where sensed data, dynamic data and static data from multiple sources are obtained, and based on the obtained data a real-time three-dimensional view of a vessel and its vicinity can be combined. The real-time three-dimensional view may be used, for example, for remotely navigating the vessel.
According to a first aspect, there is provided a method for modelling a vessel and its vicinity in real-time. The method comprises obtaining information about the location of the vessel; obtaining static data comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel; obtaining non-sensed dynamic data relating to the surroundings of the vessel; obtaining dynamic sensor data comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel; and combining, based on the static data, non-sensed dynamic data and dynamic sensor data, a real-time three-dimensional navigation model providing a real-time three-dimensional view of the vessel and its vicinity.
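The data combination of the first aspect can be sketched as a small pipeline. The following Python sketch is purely illustrative: all function and field names are hypothetical, the data sources are stubs, and nothing here is part of the claimed method.

```python
from dataclasses import dataclass, field

@dataclass
class NavigationModel:
    """Hypothetical container for the combined real-time 3D navigation model."""
    location: tuple          # (latitude, longitude) of the vessel
    static: dict             # 3D vessel model, nautical charts, topography
    non_sensed: dict         # weather, waves, sea level, local time
    sensed: list = field(default_factory=list)  # onboard/external sensor objects

def build_navigation_model(location, static_store, dynamic_feed, sensor_feed):
    # Obtain static data for the vessel's location (local storage or external source).
    static = static_store(location)
    # Obtain non-sensed dynamic data (retrievable without involving the vessel).
    non_sensed = dynamic_feed(location)
    # Obtain dynamic sensor data from onboard sensors and/or external entities.
    sensed = sensor_feed()
    # Combine everything into one real-time three-dimensional navigation model.
    return NavigationModel(location, static, non_sensed, sensed)

# Example with stub data sources:
model = build_navigation_model(
    location=(60.17, 24.94),
    static_store=lambda loc: {"chart": "3D nautical chart", "vessel": "3D hull model"},
    dynamic_feed=lambda loc: {"wind_mps": 8.0, "sea_level_m": 0.2},
    sensor_feed=lambda: [{"type": "sailing boat", "range_m": 420.0}],
)
print(model.sensed[0]["type"])  # -> sailing boat
```

In a real system each stub would be backed by chart storage, weather services and sensor fusion; the point is only that the three data categories are obtained independently and merged into one model object.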
In an embodiment, the method further comprises causing display of the three-dimensional navigation model.
In an embodiment, alternatively or in addition to the above described embodiments, the method further comprises causing transmission of at least part of the three-dimensional navigation model to a remote receiver.
In an embodiment, alternatively or in addition to the above described embodiments, the method further comprises obtaining the static data from a local data storage, and obtaining the non-sensed dynamic data from at least one external entity via network access.
In an embodiment, alternatively or in addition to the above described embodiments, the static data comprises at least one of three-dimensional nautical chart data, three-dimensional topography data, map symbol data, shipping lane data (indicated, for example, using different colors), three-dimensional model data of the vessel, marine infrastructure data, and port model data.
In an embodiment, alternatively or in addition to the above described embodiments, the non-sensed dynamic data comprises at least one of wave data, weather data, local time data, sea level data and data relating to identified nearby vessels.
In an embodiment, alternatively or in addition to the above described embodiments, the dynamic sensor data from at least one external entity comprises at least one of sensor data from a harbor, sensor data from at least one other nearby surface vessel, and sensor data from a drone.
In an embodiment, alternatively or in addition to the above described embodiments, the method further comprises receiving a selection of a virtual control point, and causing display of the three-dimensional navigation model for the real-time remote operation of the vessel based on the selected virtual control point.
According to a second aspect, there is provided an apparatus for modelling a vessel and its vicinity in real-time. The apparatus comprises means for obtaining information about the location of the vessel, means for obtaining static data comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel, means for obtaining non-sensed dynamic data relating to the surroundings of the vessel, means for obtaining dynamic sensor data comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel, and means for combining, based on the static data, non-sensed dynamic data and dynamic sensor data, a real-time three-dimensional navigation model providing a real-time three-dimensional view of the vessel and its vicinity.
In an embodiment, alternatively or in addition to the above described embodiments, the apparatus further comprises means for causing display of the three-dimensional navigation model.
In an embodiment, alternatively or in addition to the above described embodiments, the apparatus further comprises means for causing transmission of at least part of the three-dimensional navigation model to a remote receiver.
In an embodiment, alternatively or in addition to the above described embodiments, the apparatus further comprises means for obtaining the static data from a local data storage, and means for obtaining the non-sensed dynamic data from at least one external entity via network access.
In an embodiment, alternatively or in addition to the above described embodiments, the static data comprises at least one of three-dimensional nautical chart data, three-dimensional topography data, map symbol data, shipping lane data (indicated, for example, using different colors), three-dimensional model data of the vessel, marine infrastructure data, and port model data.
In an embodiment, alternatively or in addition to the above described embodiments, the non-sensed dynamic data comprises at least one of wave data, weather data, local time data, sea level data, and data relating to identified nearby vessels.
In an embodiment, alternatively or in addition to the above described embodiments, the dynamic sensor data from at least one external entity comprises at least one of sensor data from a harbor, sensor data from at least one other nearby surface vessel and sensor data from a drone.
In an embodiment, alternatively or in addition to the above described embodiments, the apparatus further comprises means for receiving a selection of a virtual control point, and means for causing display of the three-dimensional navigation model for the real-time remote operation of the vessel based on the selected virtual control point.
According to a third aspect, there is provided a computer program comprising program code instructions which, when executed by at least one processor, cause the at least one processor to perform the method of the first aspect.
In an embodiment, the computer program is embodied on a computer-readable medium.
According to another aspect, there is provided a control system comprising at least one processing unit and at least one memory, wherein the at least one memory stores program instructions that, when executed by the at least one processing unit, cause the control system to obtain information about the location of the vessel; obtain static data comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel, obtain non-sensed dynamic data relating to the surroundings of the vessel, obtain dynamic sensor data comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel, and combine, based on the static data, non-sensed dynamic data and dynamic sensor data, a real-time three-dimensional navigation model providing a real-time three-dimensional view of the vessel and its vicinity.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are included to provide a further understanding of the invention and constitute a part of this specification, illustrate embodiments of the invention and together with the description help to explain the principles of the invention. In the drawings:
FIG. 1 is a flow diagram illustrating a method for modelling a vessel and its vicinity in real-time in accordance with an example embodiment.
FIG. 2 is an example block diagram of data used to combine a real-time three-dimensional navigation model in accordance with an example embodiment.
FIG. 3A is an example block diagram illustrating various entities connected to a remote operator system in accordance with an example embodiment .
FIG. 3B is an example block diagram illustrating various entities connected to a vessel system in accordance with an example embodiment.
FIG. 4 is an example block diagram of a remote operator system in accordance with an example embodiment.
Like reference numerals are used to designate like parts in the accompanying drawings.
DETAILED DESCRIPTION
Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings. The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
FIG. 1 is a flow diagram illustrating a method for modelling a vessel and its vicinity in real-time in accordance with an example embodiment. FIG. 1 is discussed together with FIG. 2, which illustrates the data used to combine a real-time three-dimensional navigation model.
At 100, information about a location of the vessel is obtained. The vessel may automatically send its location, for example periodically, to a receiving entity over a communication interface, or the receiving entity may request the location from the vessel over the communication interface.
At 102, static data 200 comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel is obtained. As an example, the static data 200 may comprise three-dimensional nautical chart data, three-dimensional topography data, map symbol data, shipping lane data, three-dimensional model data of the vessel, marine infrastructure data, and/or port model data. In other words, the static data 200 refers to data that remains unchanged in the course of time. The static data may be stored locally by the entity combining the real-time three-dimensional navigation model. In another embodiment, at least part of the static data may be retrieved from at least one external data source, for example, via the internet. The static data may also be stored in multiple locations, for example, in the vessel, a remote operator center and a cloud storage.
At 104, non-sensed dynamic data 202 relating to the surroundings of the vessel is obtained. As a non-limiting example, the non-sensed dynamic data 202 may comprise wave data, weather data, local time data, and data relating to identified nearby vessels. In general, the non-sensed dynamic data 202 refers to dynamic data that can be retrieved without the vessel being involved in the data transmission. The non-sensed dynamic data can be retrieved, for example, from at least one data source via the internet.
At 106, dynamic sensor data 204 comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data 204 for the vessel is obtained. As an example, the dynamic sensor data 204 may comprise sensor data from a harbor received by the vessel and sensor data from a drone received by the vessel. In general, the dynamic sensor data 204 refers to data of all objects that is not retrievable from onshore data resources. The dynamic sensor data may be obtained from the vessel. The vessel may send the dynamic sensor data, for example, to a cloud service and/or to a remote operator system.
In an embodiment, the dynamic sensor data 204 is processed by the vessel so that the type of a tagged object is sent to a remote server only once, once the identification has happened. For example, if the vessel identifies another nearby vessel, the type of that vessel (for example, a sailing boat) is sent to the remote server only once. The remote server is then able to generate a three-dimensional object model based on the information received from the vessel. The vessel may also use at least one sensor to determine the size of the object, which is sent once to the remote server (scaling the object to the right size). Further, the dynamic sensor data 204 may comprise location data relating to the object, and the location data may be sent to the remote server at a given frequency (to identify the location, speed and heading of the object). In an embodiment, in case of anomalies, for example, data unclarity or pre-identified risk criteria, an unidentified object can be replicated at the remote server with a special identity and warning, complemented with transfer of less processed data (for example, a photograph, focused video, radar data, laser scanning data, data representing a close match to a generic model that represents the type and/or size of the object, etc.).
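The send-once behaviour described above can be sketched as follows. The class and message fields are hypothetical illustrations of the idea, not an actual protocol from the description.

```python
class ObjectUplink:
    """Send a tagged object's type and size to the remote server only once;
    afterwards send only its location at the chosen reporting frequency."""

    def __init__(self, send):
        self.send = send        # callable that delivers one message to the server
        self.announced = set()  # ids of objects whose type/size were already sent

    def report(self, obj_id, obj_type, size_m, location):
        if obj_id not in self.announced:
            # Full announcement exactly once, after identification has happened.
            self.send({"id": obj_id, "type": obj_type, "size_m": size_m})
            self.announced.add(obj_id)
        # Location updates go out every time, so the receiver can derive
        # the object's position, speed and heading.
        self.send({"id": obj_id, "location": location})

messages = []
uplink = ObjectUplink(messages.append)
uplink.report(7, "sailing boat", 9.5, (60.10, 24.90))
uplink.report(7, "sailing boat", 9.5, (60.11, 24.90))
print(sum("type" in m for m in messages), sum("location" in m for m in messages))  # -> 1 2
```

The type/size announcement is sent once per identified object, while two location updates go out, which is the bandwidth saving the description aims at.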
In an embodiment, observations from different vessels may be combined to help in identifying the objects. The combining process may be carried out onboard, so that the vessels exchange data, or alternatively onshore, so that pieces of data from different vessels are combined.
At 108, based on the static data, non-sensed dynamic data and dynamic sensor data, a real-time three-dimensional navigation model 206 providing a real-time three-dimensional view of the vessel and its vicinity is combined. The real-time three-dimensional navigation model 206 provides a realistic view of the vessel's vicinity. The real-time three-dimensional navigation model 206 may comprise any combination of the following features:
The vessel itself (as a three-dimensional model)
Waves and weather conditions (fog, rain) are generated from wave, wind and weather data (for example, a view from the vessel's bridge would show simulated waves and vessel movements based on wave conditions)
Lighting (night/day) can be generated from local time
Terrain can be generated from topographic data
Infrastructure can be generated from stored and retrieved three-dimensional objects
AIS (Automatic Identification System) coded ships can be generated from AIS data (retrieving vessel size and type from AIS and determining speed and heading from multiple location data)
Route plan data (available, for example, from the Enhanced Navigation Support Information (ENSI) system). The route plan data may be displayed together with the own vessel or another vessel if it is available. It can be drawn as a curve in front of the vessel.
Sensed objects can be generated based on onshore categorization
Additional information can be augmented as an overlay (for example, "virtual" lighthouses, shipping lanes etc.). For example, the width of the shipping lane can be illustrated with a color shading (appearing on the surface in the displayed image), derived from the depth data of the Electronic Navigation Chart (ENC). The safe area, which may be shaded, may depend on the draught of the ship which, in turn, may be dependent on the loading of the ship. The safe area may also be affected by the sea level.
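The depth-based safe-area shading described in the last list item can be sketched as a per-cell test: a chart cell is shaded safe when the ENC depth, adjusted by sea level, exceeds the ship's loading-dependent draught plus a margin. The under-keel-clearance margin below is an assumed parameter, not a value from the description.

```python
def is_safe_area(chart_depth_m, draught_m, sea_level_offset_m=0.0, ukc_margin_m=0.5):
    """Decide whether a chart cell belongs to the shaded safe area:
    ENC depth adjusted by the current sea level must exceed the ship's
    draught (dependent on loading) plus an under-keel-clearance margin."""
    effective_depth = chart_depth_m + sea_level_offset_m
    return effective_depth >= draught_m + ukc_margin_m

# A loaded ship with 8.0 m draught at mean sea level:
print(is_safe_area(10.0, 8.0))      # -> True  (10.0 >= 8.5)
print(is_safe_area(8.2, 8.0))       # -> False (8.2 < 8.5)
print(is_safe_area(8.2, 8.0, 0.5))  # -> True once the sea level is 0.5 m higher
```

Evaluating this predicate over the chart grid yields exactly the shaded region whose extent shifts with loading and sea level, as the description notes.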
In an embodiment, a cloud service may at least partially store the static data, non-sensed dynamic data and dynamic sensor data provided by the different data sources. The cloud service can have a central database as a back-up for possible communication errors, enabling, for example, feeding dynamic data to onboard use. The cloud service may also pre-process the non-sensed dynamic data and send the processed data to the vessel as appropriate (for example, transferring only relevant dynamic objects that are close to the ship).
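The cloud-side pre-processing step (transferring only dynamic objects close to the ship) can be sketched as a simple proximity filter; the relevance radius is an assumption for illustration.

```python
import math

def filter_relevant(objects, vessel_pos, radius_m=5000.0):
    """Forward to the vessel only the dynamic objects within radius_m of the
    vessel's position (equirectangular distance approximation, adequate at
    short ranges)."""
    earth_r = 6371000.0  # mean Earth radius in metres

    def dist_m(a, b):
        x = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
        y = math.radians(b[0] - a[0])
        return earth_r * math.hypot(x, y)

    return [o for o in objects if dist_m(vessel_pos, o["pos"]) <= radius_m]

nearby = filter_relevant(
    [{"id": 1, "pos": (60.00, 24.01)},   # ~0.6 km away: relevant
     {"id": 2, "pos": (60.00, 25.00)}],  # ~56 km away: filtered out
    vessel_pos=(60.00, 24.00),
)
print([o["id"] for o in nearby])  # -> [1]
```

Only objects inside the radius are forwarded, which keeps the vessel's copy of the dynamic data small without losing anything relevant for near-field navigation.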
In an embodiment, the necessary data (the static data, non-sensed dynamic data and sensed dynamic data) for combining the real-time three-dimensional navigation model may be simultaneously available in multiple locations, for example, in the vessel, the cloud service and the remote operator system. This enables creation of the real-time three-dimensional navigation model independently in multiple locations.
FIG. 3A is an example block diagram illustrating various entities connected to a remote operator system 300 in accordance with an example embodiment. In this embodiment, the remote operator system 300 is configured to generate the real-time three-dimensional navigation model 206 that enables, for example, remote navigation of a vessel.
As already discussed in relation to FIGS. 1 and 2, the real-time three-dimensional navigation model 206 is generated based on static data 200, non-sensed dynamic data 202 and sensed dynamic data 204. In an embodiment, some or all of the static data 200 is available to the remote operator system 300 from at least one external data source. In another embodiment, the remote operator system 300 may store some or all of the static data 200 locally. The static data 200 may comprise at least one of the following: map data (geographical, oceanographic, ports, shipping lanes, safe navigation areas, maritime infrastructure etc.), enhancement (shipping lanes highlighted, route planning, safe navigation areas, night/day views), and the vessel's own three-dimensional model.
Non-sensed dynamic data 202 comprises, for example, traffic data (for example, AIS, vessel traffic control data), weather data (for example, wind, rain, cloudiness, waves, fog, visibility, sea level), time of day data (day/night) and/or any other relevant data available that is not sensed by the vessel. The non-sensed data 202 is received from at least one external data source, for example, via the internet. In an embodiment, the non-sensed dynamic data 202 may be pre-processed by an external entity and sent to the vessel, for example, from the remote operator system 300 or a cloud service if the vessel, in addition to the remote operator system 300, generates the real-time three-dimensional navigation model 206. For example, only relevant dynamic objects that are close to the vessel may be sent to the vessel.
Sensed dynamic data 204 comprising data, for example, from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel may be obtained from the vessel. The sensed dynamic data 204 may be preprocessed by the vessel so that the type of a tagged object is sent to the remote operator system 300 only once, once the identification has happened. For example, if the vessel identifies another nearby vessel, the type of that vessel (for example, a sailing boat) may be sent to the remote operator system 300 only once. The remote operator system 300 is then able to generate a three-dimensional object model based on the information received from the vessel. The vessel may also use at least one sensor to determine the size of the object, which is sent once to the remote operator system 300 that may use object scaling to scale the object to the right size. Further, the sensed dynamic data 204 may also comprise location data relating to the object, and the remote operator system 300 may receive the location data from the vessel at a given frequency (to identify the location, speed and heading of the object). In an embodiment, in case of anomalies, for example, data unclarity or pre-identified risk criteria, an unidentified object can be replicated at the remote operator system with a special identity and warning, complemented with transfer of less processed data (for example, focused video, radar data, laser scanning data, or two-dimensional data such as a photograph). If two-dimensional data is used, it may be scaled according to its relative distance from the vessel and placed in the real-time three-dimensional navigation model according to its relative location.
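Scaling two-dimensional data by its relative distance, as described above, follows from simple pinhole-camera perspective: apparent size falls off linearly with distance. The virtual-camera focal length below is an assumed parameter of the sketch.

```python
def billboard_height_px(real_height_m, distance_m, focal_px=800.0):
    """On-screen height (in pixels) of a 2D photograph placed as a billboard
    at the object's relative distance in the three-dimensional scene."""
    return focal_px * real_height_m / distance_m

# A 10 m high object rendered at 100 m, then at 200 m:
print(billboard_height_px(10.0, 100.0))  # -> 80.0
print(billboard_height_px(10.0, 200.0))  # -> 40.0
```

Doubling the relative distance halves the rendered size, so the photograph blends consistently with the surrounding three-dimensional objects.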
Further, the remote operator system 300 may retrieve information from an Automatic Identification System (AIS) data source 302 so that it is able to model other nearby ships in the real-time three-dimensional navigation model 206. The remote operator system 300 may also retrieve information from other data sources, for example, route plan data available from, for example, the Enhanced Navigation Support Information (ENSI) system.
The remote operator system 300 combines the real-time three-dimensional navigation model 206 providing a three-dimensional view of the vessel 302 and its vicinity based on the static data 200, non-sensed dynamic data 202 and sensed dynamic data 204. The real-time three-dimensional navigation model 206 regenerates a realistic view of the vessel's vicinity. This means that the view from the vessel can be regenerated onshore with the model, eliminating the need to transmit video or still images from onboard, and therefore significantly reducing the amount of data transfer from the vessel, for example, during remote control. The realistic view provided by the real-time three-dimensional navigation model 206 is useful, for example, in remote operations when assisting or taking over navigation control from the vessel. Further, the view is identical onboard at the vessel and at a remote operating point.
The real-time three-dimensional navigation model 206 is not connected to the location of the vessel in itself. The model is a real-time representation of all relevant information in the vessel's vicinity that is needed for the vessel's safe navigation. The vessel's position in the model determines the focused part of the model, and the model can be missing or be coarse elsewhere.
Further, although FIG. 3A illustrates that the remote operator system 300 generates the three-dimensional navigation model 206, in another embodiment the three-dimensional navigation model 206 may be generated by another entity. Further, the remote operator system 300 may receive a data stream providing the three-dimensional navigation model 206.
In an embodiment, it is possible to share the real-time three-dimensional navigation model 206 in real-time with other entities, for example, other operator centers or ships (for example, in a convoy), either transferring all the navigation model data or only parts of it.
In an embodiment, the vessel may filter and/or categorize dynamic sensor data prior to sending data to the remote operator system 300. This reduces the amount of data needed to be transmitted from the vessel to the remote operator system 300.
FIG. 3B is an example block diagram illustrating various entities connected to a vessel system 304 in accordance with an example embodiment. In this embodiment, the vessel system 304 is configured to generate the real-time three-dimensional navigation model 206.
As already discussed in relation to FIGS. 1 and 2, the real-time three-dimensional navigation model 206 is generated based on static data 200, non-sensed dynamic data 202 and sensed dynamic data 204. In an embodiment, some or all of the static data 200 is available to the vessel system 304 from at least one external data source. In another embodiment, the vessel system 304 may store some or all of the static data 200 locally. The static data 200 may comprise at least one of the following: map data (geographical, oceanographic, ports, shipping lanes, safe navigation areas, maritime infrastructure etc.), enhancement (shipping lanes highlighted, route planning, safe navigation areas, night/day views), and the vessel's own three-dimensional model.
Non-sensed dynamic data 202 comprises, for example, traffic data (for example, AIS, vessel traffic control data), weather data (for example, wind, rain, cloudiness, waves, fog, visibility, sea level), time of day data (day/night) and/or any other relevant data available that is not sensed by the vessel. The non-sensed data 202 is received from at least one external data source, for example, via the internet. In an embodiment, the non-sensed dynamic data 202 may be pre-processed by an external entity and sent to the vessel, for example, from the remote operator system 300 or a cloud service. For example, only relevant dynamic objects that are close to the vessel may be sent to the vessel.
Sensed dynamic data 204 comprising data, for example, from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel may be obtained by the vessel itself. The vessel may also use at least one sensor, for example, to determine the size of an object. Further, the sensed dynamic data 204 may also comprise location data relating to the object. In an embodiment, in case of anomalies, for example, data unclarity or pre-identified risk criteria, an unidentified object can be replicated in the real-time three-dimensional navigation model 206 with a special identity and warning, complemented with transfer of less processed data (for example, focused video, radar data, laser scanning data, or two-dimensional data such as a photograph). If two-dimensional data is used, it may be scaled according to its relative distance from the vessel and placed in the real-time three-dimensional navigation model 206 according to its relative location.
Further, the vessel system 304 may retrieve information from an Automatic Identification System (AIS) data source so that it is able to model other nearby ships in the real-time three-dimensional navigation model 206. The vessel system 304 may also retrieve information from other data sources, for example, route plan data available from, for example, the Enhanced Navigation Support Information (ENSI) system.
The vessel system 304 combines the static data 200, non-sensed dynamic data 202 and sensed dynamic data 204 into the real-time three-dimensional navigation model 206, providing a real-time three-dimensional view of the vessel 302 and its vicinity.
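A minimal sketch of combining the three data categories into one model snapshot, assuming a simple layered merge in which sensed data takes precedence where it overlaps non-sensed data (the dictionary representation and the precedence order are assumptions, not part of the embodiment):

```python
def combine_navigation_model(static_data, non_sensed_dynamic, sensed_dynamic):
    """Combine the three data categories into one model snapshot.

    Layered merge: static base data first, then externally received
    dynamic data, then the vessel's own sensor data, which overrides
    earlier layers where both describe the same key.
    """
    model = {}
    model.update(static_data)          # maps, lanes, own 3-D model
    model.update(non_sensed_dynamic)   # AIS traffic, weather, time of day
    model.update(sensed_dynamic)       # onboard radar/lidar/camera data
    return model

model = combine_navigation_model(
    {"chart": "3d-nautical", "own_model": "hull-v1"},
    {"weather": "fog", "traffic": ["ais-1"]},
    {"weather": "clear", "radar": ["contact-7"]},  # sensed data wins
)
```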
By providing the real-time three-dimensional navigation model 206 via the vessel's own information systems, improved situational awareness is provided for a manned vessel. The crew operating the vessel is able to see more clearly what happens in the vessel's vicinity. Even if the vessel is an autonomous vessel, improved situational awareness is provided for the vessel. Further, the autonomous vessel may transfer the entire three-dimensional navigation model 206, or parts of it, to the remote operator center 300 in case of problems.
In an embodiment of the examples illustrated in FIGS. 3A and 3B, it is possible to filter data and settings (for example, "remove" fog, waves, night time etc.) in the real-time three-dimensional navigation model 206.
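The filtering of data and settings could be sketched as follows; the layer names are hypothetical, as the embodiment only names fog, waves and night time as examples of removable settings:

```python
def filter_model_view(model, remove=()):
    """Return a display view of the navigation model with selected
    environmental layers suppressed (e.g. "remove" fog or night-time
    darkening), leaving the underlying model data untouched.
    """
    suppressed = set(remove)
    return {k: v for k, v in model.items() if k not in suppressed}

view = filter_model_view(
    {"terrain": "coastline-mesh", "fog": 0.8, "waves": 1.2, "night": True},
    remove=("fog", "night"),
)
```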
In an embodiment of the examples illustrated in FIGS. 3A and 3B, an operator or a user using the real-time three-dimensional navigation model 206 is able to freely choose the point of view, as the entire vicinity of the vessel is three-dimensionally modelled. For example, locating the point of view higher up can give a better view behind obstacles.
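The benefit of a higher point of view can be illustrated with straight-line sight geometry over flat water; this geometry is an assumption for illustration only, as the embodiment does not specify how occlusion is resolved:

```python
def line_of_sight_clear(eye_height_m, obstacle_height_m,
                        obstacle_distance_m, target_distance_m):
    """Check whether a raised point of view sees over an obstacle to a
    target at water level, assuming a straight sight line over a flat
    water surface (illustrative assumption).
    """
    # Height of the sight line where it passes above the obstacle,
    # interpolating from eye height down to the target at sea level.
    sight_height = eye_height_m * (1 - obstacle_distance_m / target_distance_m)
    return sight_height >= obstacle_height_m

# From 20 m up, a 5 m obstacle 100 m away does not hide a target at
# 200 m; from 5 m up, it does.
high = line_of_sight_clear(20.0, 5.0, 100.0, 200.0)
low = line_of_sight_clear(5.0, 5.0, 100.0, 200.0)
```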
In an embodiment of the examples illustrated in FIGS. 3A and 3B, the data used in combining the three-dimensional navigation model 206 may be used to generate derived data. For example, waves can be generated from wind data. Similarly, the vessel's movements or speed loss can be derived from wave conditions, and hull stresses from the vessel's movements.
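Deriving waves from wind data could be sketched with a simplified fully-developed-sea approximation; the concrete relation Hs ≈ 0.0248·U² (U in m/s) is an assumed illustrative formula and not taken from the embodiment, which only states that waves can be generated from wind data:

```python
def significant_wave_height(wind_speed_mps):
    """Derive an approximate significant wave height from wind speed.

    Uses a simplified fully-developed-sea approximation,
    Hs ~ 0.0248 * U**2 with U in m/s; the relation is an assumption
    for illustration, not part of the disclosed system.
    """
    if wind_speed_mps < 0:
        raise ValueError("wind speed must be non-negative")
    return 0.0248 * wind_speed_mps ** 2

# A 10 m/s wind yields roughly 2.5 m significant wave height.
hs = significant_wave_height(10.0)
```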
In an embodiment, an unmanned/autonomous ship may generate the real-time three-dimensional navigation model 206 onboard as discussed earlier (with onboard stored static data, non-sensed dynamic data accessible via the internet and onboard sensor data).
FIG. 4 is a more detailed schematic block diagram of an apparatus for modelling a vessel and its vicinity in real-time. It should be appreciated that at least some of the components described below may be optional, and thus an example embodiment may include more, fewer or different components than those described in connection with the example embodiment of FIG. 4. The apparatus 400 could be any computer device, such as a server, a workstation, a personal computer, a laptop computer, or a system comprising several separate subsystems. The illustrated apparatus 400 includes a controller or a processor 402 (i.e. a signal processor, microprocessor, ASIC, or other control and processing logic circuitry) for performing such tasks as signal coding, data processing, input/output processing, power control, and/or other functions. An operating system 422 controls the allocation and usage of the components of the apparatus 400 and provides support for one or more application programs 424. The application programs 424 can include vessel control and operation related applications, or any other application.
The illustrated apparatus 400 includes one or more memory components, for example, a non-removable memory 426 and/or removable memory 404. The non-removable memory 426 may include RAM, ROM, flash memory, a hard disk, or other well-known memory storage technologies. The removable memory 404 may include flash memory (such as one or more removable flash drives) or smart cards. The one or more memory components may be used for storing data and/or code for running the operating system 422 and the applications 424. Examples of data include text, images, sound files, image data, video data, or other data sets to be sent to and/or received from one or more network servers, other devices or marine vessels via one or more wired or wireless networks.
The apparatus 400 can support one or more input devices 408 and one or more output devices 416. Examples of the input devices 408 may include, but are not limited to, a touchscreen 410 (i.e., capable of capturing finger tap inputs, finger gesture inputs, multi-finger tap inputs, multi-finger gesture inputs, or keystroke inputs from a virtual keyboard or keypad), a microphone 412 (i.e., capable of capturing voice input), and a physical keyboard 414. Examples of the output devices 416 may include, but are not limited to, a speaker 418 and a display system 420. Other possible output devices (not shown) can include piezoelectric or other haptic output devices. Some devices can serve more than one input/output function.
The apparatus 400 can further include one or more input/output interfaces 406. The illustrated components are not required or all-inclusive, as any of the components shown can be deleted and other components can be added.
In an embodiment, at least one of the processor 402, the non-removable memory 426, the removable memory 404, the output devices 416, the input/output interfaces 406 and the input devices 408 may constitute means for obtaining information about the location of the vessel; means for obtaining static data comprising three-dimensional model data of the vessel, map data relating to the location of the vessel and data relating to the surroundings of the vessel; means for obtaining non-sensed dynamic data relating to the surroundings of the vessel; means for obtaining dynamic sensor data comprising data from at least one onboard sensor of the vessel and/or data from at least one external entity providing dynamic sensor data for the vessel; and means for combining, based on the static data, non-sensed dynamic data and dynamic sensor data, a real-time three-dimensional navigation model providing a three-dimensional view of the vessel and its vicinity.
Aspects of any of the embodiments described above may be combined with aspects of any of the other embodiments described to form further embodiments without losing the effect sought.
The term 'comprising' is used herein to mean including the method, blocks or elements identified, but that such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
The exemplary embodiments can include, for example, any suitable servers, workstations, personal computers, laptop computers, other devices, and the like, capable of performing the processes of the exemplary embodiments. The devices and subsystems of the exemplary embodiments can communicate with each other using any suitable protocol and can be implemented using one or more programmed computer systems or devices.
One or more interface mechanisms can be used with the exemplary embodiments, including, for example, Internet access, telecommunications in any suitable form (e.g., voice, modem, and the like), wireless communications media, and the like. For example, employed communications networks or links can include one or more satellite communications networks, wireless communications networks, cellular communications networks, 3G communications networks, 4G communications networks, Public Switched Telephone Networks (PSTNs), Packet Data Networks (PDNs), the Internet, intranets, a combination thereof, and the like.
It is to be understood that the exemplary embodiments are for exemplary purposes, as many variations of the specific hardware used to implement the exemplary embodiments are possible, as will be appreciated by those skilled in the hardware and/or software art(s). For example, the functionality of one or more of the components of the exemplary embodiments can be implemented via one or more hardware and/or software devices.
The exemplary embodiments can store information relating to various processes described herein. This information can be stored in one or more memories, such as a hard disk, optical disk, magneto-optical disk, RAM, and the like. One or more databases can store the information used to implement the exemplary embodiments of the present inventions. The databases can be organized using data structures (e.g., records, tables, arrays, fields, graphs, trees, lists, and the like) included in one or more memories or storage devices listed herein. The processes described with respect to the exemplary embodiments can include appropriate data structures for storing data collected and/or generated by the processes of the devices and subsystems of the exemplary embodiments in one or more databases.
All or a portion of the exemplary embodiments can be conveniently implemented using one or more general purpose processors, microprocessors, digital signal processors, micro-controllers, and the like, programmed according to the teachings of the exemplary embodiments of the present inventions, as will be appreciated by those skilled in the computer and/or software art(s). Appropriate software can be readily prepared by programmers of ordinary skill based on the teachings of the exemplary embodiments, as will be appreciated by those skilled in the software art. In addition, the exemplary embodiments can be implemented by the preparation of application-specific integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be appreciated by those skilled in the electrical art(s). Thus, the exemplary embodiments are not limited to any specific combination of hardware and/or software.
Stored on any one or on a combination of computer readable media, the exemplary embodiments of the present inventions can include software for controlling the components of the exemplary embodiments, for driving the components of the exemplary embodiments, for enabling the components of the exemplary embodiments to interact with a human user, and the like. Such software can include, but is not limited to, device drivers, firmware, operating systems, development tools, applications software, and the like. Such computer readable media further can include the computer program product of an embodiment of the present inventions for performing all or a portion (if processing is distributed) of the processing performed in implementing the inventions. Computer code devices of the exemplary embodiments of the present inventions can include any suitable interpretable or executable code mechanism.
As stated above, the components of the exemplary embodiments can include computer readable medium or memories for holding instructions programmed according to the teachings of the present inventions and for holding data structures, tables, records, and/or other data described herein. Computer readable medium can include any suitable medium that participates in providing instructions to a processor for execution. Such a medium can take many forms, including but not limited to, non-volatile media, volatile media, and the like. Non-volatile media can include, for example, optical or magnetic disks, magneto-optical disks, and the like. Volatile media can include dynamic memories, and the like. Common forms of computer-readable media can include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other suitable magnetic medium, a CD-ROM, CD±R, CD±RW, DVD, DVD-RAM, DVD±RW, DVD±R, HD DVD, HD DVD-R, HD DVD-RW, HD DVD-RAM, Blu-ray Disc, any other suitable optical medium, punch cards, paper tape, optical mark sheets, any other suitable physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other suitable memory chip or cartridge, a carrier wave or any other suitable medium from which a computer can read.
While the present inventions have been described in connection with a number of exemplary embodiments and implementations, the present inventions are not so limited, but rather cover various modifications and equivalent arrangements which fall within the purview of prospective claims.

Claims

1. A method for modelling a vessel (302) and its vicinity in real-time by a remote operator system (300), characterized in that the method comprises:
obtaining (100), by the remote operator system
(300) from the vessel (302), information about location of the vessel (302);
obtaining (102), by the remote operator system (300) from a data source other than the vessel (302), static data (200) comprising three-dimensional model data of the vessel (302), map data relating to the location of the vessel (302) and data relating to the surroundings of the vessel (302);
obtaining (104), by the remote operator system (300) from a data source other than the vessel (302), non-sensed dynamic data (202) relating to the surroundings of the vessel (302);
obtaining (106), by the remote operator system (300) from the vessel (302), dynamic sensor data (204) comprising data from at least one onboard sensor of the vessel (302) and/or data from at least one external entity providing dynamic sensor data for the vessel (302); and
combining (108), by the remote operator system (300), based on the static data (200), non-sensed dynamic data (202) and dynamic sensor data (204), a real-time three-dimensional navigation model (206) providing a realistic real-time three-dimensional view of the vessel (302) and its vicinity.
2. The method according to claim 1, further comprising :
causing display of the three-dimensional navigation model (206).
3. The method according to any of claims 1 - 2, further comprising:
obtaining the static data (200) from a local data storage; and
obtaining the non-sensed dynamic data (202) from at least one external entity via network access.
4. The method according to any of claims 1 - 3, wherein the static data (200) comprises at least one of:
three-dimensional nautical chart data; three-dimensional topography data;
map symbol data;
shipping lane data;
three-dimensional model data of the vessel
(302);
marine infrastructure data; and
port model data.
5. The method according to any of claims 1 -
4, wherein the non-sensed dynamic data (202) comprises at least one of:
wave data;
weather data;
local time data;
sea level data; and
data relating to identified nearby vessels.
6. The method according to any of claims 1 - 5, wherein the dynamic sensor data (204) from at least one external entity comprises at least one of:
sensor data from a harbor;
sensor data from at least one other nearby surface vessel; and
sensor data from a drone.
7. The method according to any of claims 1 - 6, further comprising:
receiving a selection of a virtual control point; and
causing display of the three-dimensional navigation model (206) for real-time remote operation of the vessel (302) based on the selected virtual control point.

8. A remote operator system (300, 400) for modelling a vessel (302) and its vicinity in real-time, characterized in that the remote operator system (300, 400) comprises:
means for obtaining (402) from the vessel (302) information about location of the vessel (302);
means for obtaining (402) from a data source other than the vessel (302) static data (200) comprising three-dimensional model data of the vessel (302), map data relating to the location of the vessel (302) and data relating to the surroundings of the vessel (302);

means for obtaining (402) from a data source other than the vessel (302) non-sensed dynamic data (202) relating to the surroundings of the vessel (302);
means for obtaining (402) from the vessel (302) dynamic sensor data (204) comprising data from at least one onboard sensor of the vessel (302) and/or data from at least one external entity providing dynamic sensor data for the vessel (302); and
means for combining (402) based on the static data, non-sensed dynamic data and dynamic sensor data a real-time three-dimensional navigation model (206) providing a realistic real-time three-dimensional view of the vessel (302) and its vicinity.

9. The remote operator system (300, 400) according to claim 8, further comprising:

means for causing (402, 416) display of the three-dimensional navigation model (206).
10. The remote operator system (300, 400) according to any of claims 8 - 9, further comprising:
means for obtaining (402) the static data (200) from a local data storage; and
means for obtaining (402) the non-sensed dynamic data (202) from at least one external entity via network access.
11. The remote operator system (300, 400) according to any of claims 8 - 10, wherein the static data (200) comprises at least one of:
three-dimensional nautical chart data; three-dimensional topography data;
map symbol data;
shipping lane data;
three-dimensional model data of the vessel (302);
marine infrastructure data; and
port model data.
12. The remote operator system (300, 400) according to any of claims 8 - 11, wherein the non-sensed dynamic data (202) comprises at least one of:
wave data;
weather data;
local time data;
sea level data; and
data relating to identified nearby vessels.
13. The remote operator system (300, 400) according to any of claims 8 - 12, wherein the dynamic sensor data (204) from at least one external entity comprises at least one of:
sensor data from a harbor;

sensor data from at least one other nearby surface vessel; and

sensor data from a drone.

14. The remote operator system (300, 400) according to any of claims 8 - 13, further comprising:
means for receiving (402, 406) a selection of a virtual control point; and
means for causing (402, 416) display of the three-dimensional navigation model for the real-time remote operation of the vessel (302) based on the selected virtual control point.
15. A computer program comprising program code instructions, which when executed by at least one processor, cause the at least one processor to perform the method of any of claims 1 - 7.
PCT/FI2018/050045 2017-01-25 2018-01-23 Modelling a vessel and its vicinity in real-time WO2018138411A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20175065A FI129545B (en) 2017-01-25 2017-01-25 Modelling a vessel and its vicinity in real-time
FI20175065 2017-01-25

Publications (1)

Publication Number Publication Date
WO2018138411A1 true WO2018138411A1 (en) 2018-08-02

Family

ID=61074464

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2018/050045 WO2018138411A1 (en) 2017-01-25 2018-01-23 Modelling a vessel and its vicinity in real-time

Country Status (2)

Country Link
FI (1) FI129545B (en)
WO (1) WO2018138411A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109961210A (en) * 2018-09-30 2019-07-02 大连永航科技有限公司 Yacht dispatch system with voice guidance
CN111523174A (en) * 2020-03-13 2020-08-11 上海外高桥造船有限公司 Automatic drawing method, system, equipment and storage medium for holes in ship drawing
CN114034302A (en) * 2021-10-28 2022-02-11 广州海宁海务技术咨询有限公司 Sea chart selection method and device based on planned route
CN114258372A (en) * 2019-09-09 2022-03-29 古野电气株式会社 Ship information display system, ship information display method, image generation device, and program
CN114815652A (en) * 2021-12-04 2022-07-29 中国船舶工业系统工程研究院 A design method and system of unmanned boat swarm simulation environment
CN115410390A (en) * 2022-08-19 2022-11-29 河南泽阳实业有限公司 Intelligent scheduling method for intelligent port
US20230221718A1 (en) * 2020-06-25 2023-07-13
CN117470107A (en) * 2023-12-28 2024-01-30 思创数码科技股份有限公司 Ship measurement method and system based on combination of laser scanning and radar scanning

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110309594B (en) * 2019-07-01 2023-05-30 上海外高桥造船有限公司 Cable tray model processing method, system, equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2441802A (en) * 2006-09-13 2008-03-19 Marine & Remote Sensing Soluti Safety system for a vehicle
US20140160165A1 (en) * 2011-07-21 2014-06-12 Korea Institute Of Ocean Science And Technology Augmented reality system using moving ceiling transparent display for ship and method for enabling same


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109961210A (en) * 2018-09-30 2019-07-02 大连永航科技有限公司 Yacht dispatch system with voice guidance
CN109961210B (en) * 2018-09-30 2022-11-25 大连永航科技有限公司 Yacht dispatch system with voice guidance
CN114258372A (en) * 2019-09-09 2022-03-29 古野电气株式会社 Ship information display system, ship information display method, image generation device, and program
CN114258372B (en) * 2019-09-09 2024-06-04 古野电气株式会社 Ship information display system, ship information display method, image generation device, and program
CN111523174A (en) * 2020-03-13 2020-08-11 上海外高桥造船有限公司 Automatic drawing method, system, equipment and storage medium for holes in ship drawing
CN111523174B (en) * 2020-03-13 2023-09-19 上海外高桥造船有限公司 Automatic drawing method, system, equipment and storage medium for open pores in hull drawing
US20230221718A1 (en) * 2020-06-25 2023-07-13
CN114034302A (en) * 2021-10-28 2022-02-11 广州海宁海务技术咨询有限公司 Sea chart selection method and device based on planned route
CN114815652A (en) * 2021-12-04 2022-07-29 中国船舶工业系统工程研究院 A design method and system of unmanned boat swarm simulation environment
CN115410390A (en) * 2022-08-19 2022-11-29 河南泽阳实业有限公司 Intelligent scheduling method for intelligent port
CN117470107A (en) * 2023-12-28 2024-01-30 思创数码科技股份有限公司 Ship measurement method and system based on combination of laser scanning and radar scanning

Also Published As

Publication number Publication date
FI129545B (en) 2022-04-14
FI20175065L (en) 2018-07-26
FI20175065A7 (en) 2018-07-26

Similar Documents

Publication Publication Date Title
FI129545B (en) Modelling a vessel and its vicinity in real-time
FI130355B (en) Autonomous operation of a vessel
US10706725B2 (en) Ship-collision avoidance guiding system using time series graphic display
WO2020137149A1 (en) Ship movement-sharing navigation assistance system
US20210034885A1 (en) Automated Capture Of Image Data For Points Of Interest
US11519743B2 (en) Stalled self-driving vehicle rescue system
AU2015332046A1 (en) Street-level guidance via route path
US10733777B2 (en) Annotation generation for an image network
CN102194206B (en) Apparatus and method of sharing drawing image
CN114459437A (en) Method, equipment and medium for surveying and mapping oceans by cooperation of mother ship and multiple unmanned ships
CN114973050A (en) Deep neural network aware ground truth data generation in autonomous driving applications
CN117974916A (en) High-precision map generation method and device based on information fusion
DeFilippo et al. Robowhaler: A robotic vessel for marine autonomy and dataset collection
US11047690B2 (en) Automated emergency response
KR102412419B1 (en) A Method for Providing a Safety Supervising Service Based on a Next Generation Electronic Chart System
WO2017138126A1 (en) Computer system for detecting reverse run of ship, reverse run detection method, and reverse run detection program
CN116309732A (en) A visualization method of ship motion based on digital twin
EP3583024B1 (en) Vessel monitoring based on directionally captured ambient sounds
US11464168B2 (en) Automated vegetation removal
CN109598199A (en) Lane line generation method and device
TW202340752A (en) Boundary estimation
CN202600139U (en) ECDIS (electronic chart display and information system) radar signal receiving and processing device
US9030915B2 (en) System and method for swath overlap detection
CN112539758B (en) Ground line drawing method and system in aerial video
de Moura Neves et al. Preparing for remotely operated vehicle (ROV) seafloor surveys: a descriptive checklist for ocean scientists and managers

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18701922

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18701922

Country of ref document: EP

Kind code of ref document: A1
