
WO2018041999A1 - Navigation device and display - Google Patents

Navigation device and display

Info

Publication number
WO2018041999A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile device
secondary display
display device
image data
communication channel
Prior art date
Application number
PCT/EP2017/071966
Other languages
French (fr)
Inventor
Francesco Lodolo
Willem JANSSEN
Domenico ANDREOLI
Phil CORK
Original Assignee
Tomtom International B.V.
Priority date
Filing date
Publication date
Application filed by Tomtom International B.V. filed Critical Tomtom International B.V.
Publication of WO2018041999A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G01C21/3688: Systems comprising multiple parts or multiple output devices (not client-server), e.g. detachable faceplates, key fobs or multiple output screens
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62J: CYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J50/00: Arrangements specially adapted for use on cycles not provided for in main groups B62J1/00 - B62J45/00
    • B62J50/20: Information-providing devices
    • B62J50/21: Information-providing devices intended to provide information to rider or passenger
    • B62J50/22: Information-providing devices intended to provide information to rider or passenger, electronic, e.g. displays
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G01C21/3626: Details of the output of route guidance instructions
    • G01C21/3661: Guidance output on an external device, e.g. car radio
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G01C21/3626: Details of the output of route guidance instructions
    • G01C21/3647: Guidance involving output of stored or live camera images or video streams

Definitions

  • the present invention relates generally to a mobile device and secondary display device for use in navigating a user within an area.
  • Illustrative embodiments of the invention relate to portable navigation devices (so-called PNDs), in particular PNDs that include Global Positioning System (GPS) signal reception and processing functionality.
  • Other embodiments relate, more generally, to any type of mobile processing device that is configured to execute navigation software so as to provide route planning, and preferably also navigation, functionality.
  • Portable navigation devices that include GPS signal reception and processing functionality are well known and widely employed as in-car or other vehicle navigation systems.
  • a modern PND comprises a processor, memory (at least one of volatile and non-volatile, and commonly both), and map data stored within the memory.
  • the processor and memory cooperate to provide an execution environment in which a software operating system may be established, and additionally it is commonplace for one or more additional software programs to be provided to enable the functionality of the PND to be controlled, and to provide various other functions.
  • these devices further comprise one or more input interfaces that allow a user to interact with and control the device, and one or more output interfaces by means of which information may be relayed to the user.
  • output interfaces include a visual display and a speaker for audible output.
  • input interfaces include one or more physical buttons to control on/off operation or other features of the device (which buttons need not necessarily be on the device itself but could be on a steering wheel if the device is built into a vehicle), and a microphone for detecting user speech.
  • the output interface display may be configured as a touch sensitive display (by means of a touch sensitive overlay or otherwise) to additionally provide an input interface by means of which a user can operate the device by touch.
  • Devices of this type will also often include one or more physical connector interfaces by means of which power and optionally data signals can be transmitted to and received from the device, and optionally one or more wireless transmitters/receivers to allow communication over cellular telecommunications and other signal and data networks, for example Wi-Fi, Wi-Max, GSM and the like.
  • PND devices of this type also include a GPS antenna by means of which satellite-broadcast signals, including location data, can be received and subsequently processed to determine a current location of the device.
  • the PND device may also include electronic gyroscopes and accelerometers which produce signals that can be processed to determine the current angular and linear acceleration, and in turn, and in conjunction with location information derived from the GPS signal, velocity and relative displacement of the device and thus the vehicle in which it is mounted.
  • The utility of such PNDs is manifested primarily in their ability to determine a route between a first location (typically a start or current location) and a second location (typically a destination). These locations can be input by a user of the device, by any of a wide variety of different methods, for example by postcode, street name and house number, previously stored "well known" destinations (such as famous locations, municipal locations (such as sports grounds or swimming baths) or other points of interest), and favourite or recently visited destinations.
  • the PND is enabled by software for computing a “best” or “optimum” route between the start and destination address locations from the map data.
  • a “best” or “optimum” route is determined on the basis of predetermined criteria and need not necessarily be the fastest or shortest route.
  • the selection of the route along which to guide the driver can be very sophisticated, and the selected route may take into account existing, predicted and dynamically and/or wirelessly received traffic and road information, as well as historical information about road speeds.
  • the device may continually monitor road and traffic conditions, and offer to or choose to change the route over which the remainder of the journey is to be made due to changed conditions.
  • Real time traffic monitoring systems based on various technologies (e.g. mobile phone data exchanges, fixed cameras, GPS fleet tracking) are being used to identify traffic delays and to feed the information into notification systems.
  • PNDs of this type may typically be mounted on the dashboard or windscreen of a vehicle, but may also be formed as part of an on-board computer of the vehicle radio or indeed as part of the control system of the vehicle itself.
  • the navigation device may also be part of a hand-held system, such as a PDA (Portable Digital Assistant), a media player, a mobile phone or the like, and in these cases, the normal functionality of the hand-held system is extended by means of the installation of software on the device to perform both route calculation and navigation along a calculated route.
  • Route planning and navigation functionality may also be provided by a desktop or mobile computing resource running appropriate software.
  • the Royal Automobile Club provides an on-line route planning and navigation facility at http://www.rac.co.uk, which facility allows a user to enter a start point and a destination whereupon the server to which the user's PC is connected calculates a route (aspects of which may be user specified), generates a map, and generates a set of exhaustive navigation instructions for guiding the user from the selected start point to the selected destination.
  • the facility also provides for pseudo three-dimensional rendering of a calculated route, and route preview functionality which simulates a user travelling along the route and thereby provides the user with a preview of the calculated route.
  • the user interacts with the navigation device to select the desired calculated route, optionally from a list of proposed routes.
  • the user may intervene in, or guide the route selection process, for example by specifying that certain routes, roads, locations or criteria are to be avoided or are mandatory for a particular journey.
  • the route calculation aspect of the PND forms one primary function, and navigation along such a route is another primary function.
  • During navigation along a calculated route, it is usual for such PNDs to provide visual and/or audible instructions to guide the user along a chosen route to the end of that route, i.e. the desired destination. It is also usual for PNDs to display map information on-screen during the navigation, such information regularly being updated on-screen so that the map information displayed is representative of the current location of the device, and thus of the user or user's vehicle if the device is being used for in-vehicle navigation.
  • An icon displayed on-screen typically denotes the current device location, and is centred with the map information of current and surrounding roads in the vicinity of the current device location and other map features also being displayed. Additionally, navigation information may be displayed, optionally in a status bar above, below or to one side of the displayed map information, examples of navigation information include a distance to the next deviation from the current road required to be taken by the user, the nature of that deviation possibly being represented by a further icon suggestive of the particular type of deviation, for example a left or right turn.
  • the navigation function also determines the content, duration and timing of audible instructions by means of which the user can be guided along the route. As can be appreciated a simple instruction such as "turn left in 100 m" requires significant processing and analysis.
  • user interaction with the device may be by a touch screen, or additionally or alternately by steering column mounted remote control, by voice activation or by any other suitable method.
  • a further important function provided by the device is automatic route re-calculation in the event that: a user deviates from the previously calculated route during navigation (either by accident or intentionally); real-time traffic conditions dictate that an alternative route would be more expedient and the device is suitably enabled to recognize such conditions automatically, or if a user actively causes the device to perform route re-calculation for any reason.
  • it is possible to use the device purely for information display, or "free-driving", in which only map information relevant to the current device location is displayed, and in which no route has been calculated and no navigation is currently being performed by the device. Such a mode of operation is often applicable when the user already knows the route along which it is desired to travel and does not require navigation assistance.
  • PND devices of the type described above provide a reliable means for enabling users to navigate from one position to another.
  • PND devices are provided as dedicated units containing all of the above-mentioned functionality. It will be appreciated that dedicated PNDs providing all of the desired functionality can be relatively bulky and/or difficult to keep mounted to the vehicle in the desired position. This can present problems, for example, where it is desired to mount a PND to a scooter where there is little space for mounting the device.
  • the functionality may alternatively be provided by way of a software application running on a mobile device such as a user's smartphone or tablet.
  • these devices may also be bulky and/or difficult to safely mount to the vehicle.
  • attempting to navigate whilst driving using a smartphone or tablet is potentially dangerous. There is therefore a desire for an improved navigation product.
  • a mobile device for use with a secondary display device with which the mobile device is arranged and adapted to communicate for navigating a user within an area within which the mobile device and secondary display device are travelling, the mobile device comprising:
  • a communication module for communicating with the secondary display device via a communication channel; and
  • at least one processor arranged and configured to: obtain positional data representative of a current position of the mobile device and/or the secondary display device within the area; generate, in real time, image data representative of an image to be displayed by the secondary display device; and transmit the generated image data via the communication channel for display by the secondary display device.
  • the embodiments of the invention provide a mobile device for use with a secondary display device for navigating a user (e.g. a driver of a vehicle) travelling within a particular area.
  • the mobile device is typically a user's smartphone.
  • Smartphones are relatively expensive and precious, and the user may therefore be unwilling to mount the smartphone directly to a vehicle, particularly to a vehicle such as a scooter, moped or motorbike where the mobile device would be exposed in use, due to the risk of the smartphone being damaged - e.g. by falling from the mount and/or being exposed to adverse weather conditions. It is also not safe for the user to attempt to hold the smartphone during a journey, as the user will need both hands to control the vehicle.
  • the smartphone itself is not necessarily suitable for use as a navigation device, particularly for use with e.g. a scooter, moped or motorbike.
  • these problems are overcome by the mobile device being arranged and adapted to communicate with a secondary display device.
  • the mobile device can then be used for the relatively computationally expensive steps of generating the image data, which is preferably real-time image data as discussed in more detail below, for use in navigating a user, with this data then being sent or streamed for display by the secondary display device.
  • whilst the secondary display device needs to be visible in use, the relatively expensive mobile device does not need to be kept in view and can be safely stored, thus reducing the potential for damage to the mobile device.
  • the mobile device thus generates real-time image data representative of the current position of the mobile device and/or the secondary display device for transmission via a communication channel to the secondary display device for display to the user. That is, the mobile device may generate a live video stream for display via the secondary display device.
  • the video stream may be comprised of a series of frames or images of the image data.
  • the secondary display device may thus be made relatively lightweight and inexpensive, at least relative to the mobile device.
  • the secondary display device may also readily be made waterproof or scratch-resistant.
  • a mobile device such as a smartphone can be readily adapted for use in providing navigation information, e.g. by downloading a suitable software application.
  • the mobile device may contain a software application that when executed on the mobile device causes the processor to perform the techniques described herein.
  • any suitable processor or processing circuitry may be incorporated into the mobile device.
  • the mobile device and/or the at least one processor of the mobile device may also contain any suitable signal generation and/or processing device or circuitry for generating and/or receiving control signals.
  • the mobile device may also include any suitable transmitter for transmitting the image data over the communication channel.
  • the generated image data may be processed, e.g. compressed, before it is transmitted or streamed via the communication channel.
  • the image data may still be representative of the image. That is, the image associated with the image data can still be subsequently reconstructed, i.e. at the secondary display device.
  • by the image data being "representative" of an image, what is meant is that the image that is to be displayed by the secondary display device can be reconstructed or generated at the secondary display device using the image data transmitted across the communication channel.
  • the image may represent substantially the current position of the mobile device and/or the secondary display device, and the image data may be updated dynamically (e.g. continuously or at a relatively fast non-continuous rate) as the position of the mobile device and/or the secondary display device changes within the area, i.e. during the course of a single journey.
  • the current position is generally a current geographical position, i.e. within the area within which the devices are travelling.
  • the image data may also change dynamically in response to an input received e.g. from the secondary display device via the communication channel. It will be appreciated that the real-time image data may be representative of an image to be displayed by the secondary display device.
  • the mobile device and the secondary display device are both intended to be carried by the user, in close proximity to each other, such that the current position of either the mobile device or the secondary display device is reflective of the position of the user.
  • the positional data obtained by the mobile device may therefore be obtained e.g. from the mobile device and/or from the secondary display device.
  • the mobile device and the secondary display device may each therefore have an antenna for use with a global navigation satellite system for obtaining such positional data.
  • static image data may also be generated at the mobile device for display by the secondary display device. It is contemplated that the secondary display device for use with the mobile device may not contain any image generation software, such that the mobile device completely controls the display of the secondary display device.
  • the user may be a driver of a vehicle wherein the mobile device and secondary display device are both carried within the vehicle.
  • the user may be a driver of a scooter or moped wherein the secondary display device is mounted or mountable to the scooter or moped, e.g. via the handlebars or mirror mount.
  • the mobile device may then be stored in the user's pocket, bag, etc.
  • the area may generally be represented by digital map data, with the digital map data containing a series of interconnected nodes and links.
  • the nodes and links may e.g. represent roads and junctions along which the user can travel.
  • the at least one processor may be arranged and configured to update the image data and/or generate new image data in real time in response to a received input via the communication channel.
  • the mobile device may be arranged to generate new image data in response to an input received from the secondary display device.
  • the mobile device may be intended to be safely stored, such that the user may only interact via the secondary display device.
  • the received input may, for instance, comprise the positional data, or may comprise a user selection e.g. between display settings such as display type or zoom level.
  • the received input may also, for instance, simply comprise a signal indicating that the status of the secondary display device has changed e.g. that the devices have become (dis)connected or that the secondary display device is turned off, or that the battery level of the secondary display device has changed.
  • the at least one processor may be arranged and configured to vary or set the frame rate at which the generated image data is transmitted via the communication channel for display by the secondary display.
  • the frame rate of the image data transmitted for display may be varied dynamically and/or in real time.
  • the frame rate may be varied according to one or more settings, e.g. between a number of discrete values.
  • the frame rate may be varied automatically based on a change in a parameter associated with e.g. the positional data, the status of the mobile device and/or the status of the secondary display device. For instance, a first frame rate value may be used for some types of display (e.g. for a menu screen) and a second frame rate used during the navigation mode.
  • the frame rate may be varied or set based on one or more input(s) received via the communication channel.
  • the frame rate may be varied or set based on an input from the secondary display device. For example, the user may select a new destination, or select a different display type or zoom level, and the mobile device may then automatically adjust the frame rate to a different value.
  • the frame rate may be varied or set based on the positional data, and/or based on a speed of travel of the mobile device within the area.
  • a speed of travel may be determined using the positional data over time.
  • the frame rate may also therefore be varied or set based on a speed of travel of the secondary display device within the area, where the obtained positional data is indicative of the current position of the secondary display device.
  • the at least one processor may be arranged and configured to calculate a route for use in navigating the user within the area.
  • the mobile device may be arranged and configured to perform route planning, e.g. of the type discussed in the background section above.
  • the mobile device may thus calculate a route for navigating the user from a departure node to a destination node, and the generated image data transmitted for display on the secondary display device may display this route and/or instructions or directions associated with this route to the user for navigation purposes.
  • the at least one processor may be arranged and configured to compress the generated image data prior to the image data being transmitted via the communication channel, optionally by an intraframe compression technique.
  • the mobile device may be configured to transmit the image data via the communication channel in serial.
  • the mobile device may be configured to transmit image data associated with a single image for display by the secondary display device as a series of parts.
  • the size of the parts may be chosen to provide a desired responsiveness of the system. For instance, the size of the parts may be chosen such that, upon receiving a new input via the communication channel, a new image is generated and/or the image data is updated within a desired time period. For example, the desired time period may be 0.5 s, such that new image data is generated in response to any given input within 0.5 s or less.
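  • By way of illustration only, the sketch below relates the part size to such a responsiveness target, using the approximate effective Bluetooth throughput and frame sizes quoted in the detailed description later in this document; the helper functions and their names are hypothetical and keep the document's own "Kb" figure rather than committing to a specific unit.

```python
# Rough sketch relating part size to responsiveness. The ~100 Kb/s effective
# Bluetooth throughput and 20-50 Kb frame sizes are figures quoted later in
# the description; the helpers below are illustrative only.

def max_part_size_kb(throughput_kb_per_s: float = 100.0,
                     target_latency_s: float = 0.5) -> float:
    """Largest part size such that, in the worst case, the part currently in
    flight finishes (and a newer image can pre-empt the stream) within the
    target latency."""
    return throughput_kb_per_s * target_latency_s

def parts_per_frame(frame_size_kb: float, part_size_kb: float) -> int:
    """Number of parts needed to carry one frame of image data."""
    return max(1, -(-int(frame_size_kb * 1000) // int(part_size_kb * 1000)))

print(max_part_size_kb())        # -> 50.0 Kb parts keep pre-emption under ~0.5 s
print(parts_per_frame(50, 10))   # a tighter 0.1 s target (10 Kb parts) -> 5 parts
```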
  • the at least one processor may be arranged and configured to: monitor the communication channel for a signal indicating that the secondary display device is turned on and connected to the mobile device via the communication channel; and upon receipt of the signal, start generating image data.
  • the mobile device may be arranged to automatically start generating image data in response to a signal sent by the secondary display device upon turning the secondary display device on and/or connecting the secondary display device to the mobile device.
  • the mobile device may contain a first software application for generating the image data, e.g. using the techniques substantially as described above.
  • the mobile device may contain a second software application for monitoring the communication channel.
  • the second software application may require less power than the first software application so that the second software application may be left running with less drain on the battery of the mobile device.
  • the image data generated upon receipt of the signal may e.g. be an image associated with a start or menu screen. User input may then be required to generate the image data associated with the current position of the mobile device and/or secondary display device.
  • a method of generating data for display upon start-up of a secondary display device comprising: providing a mobile device; providing a secondary display device, wherein the secondary display device is arranged and configured to communicate with the mobile device via a communication channel; monitoring the communication channel using the mobile device for a signal from the secondary display device indicating that the secondary display device is turned on and connected to the mobile device via the communication channel; upon detection of the signal, using the mobile device to generate image data associated with an image to be displayed by the secondary display device; and transmitting the generated image data via the communication channel for display by the secondary display device.
  • the mobile device and/or the secondary display device may be a mobile device and/or secondary display device substantially as described herein in relation to any of the other aspects or embodiments herein.
  • the at least one processor may be arranged and configured to communicate with an external server to obtain map data reflective of the area.
  • the processor may communicate with the external server using an internet connection of the mobile device, e.g. a mobile telecommunications connection such as a 3G/4G connection.
  • the external server may also provide general system or software updates for the mobile device and/or for transmission by the mobile device via the communication channel to the secondary display device.
  • a secondary display device for use with a mobile device with which the secondary display device is arranged and adapted to communicate in use for navigating a user within an area within which the mobile device and secondary display device are travelling, the secondary display device comprising:
  • a communication module for controlling communication with the mobile device via a communication channel; at least one processor; and a display;
  • wherein the at least one processor is arranged and configured to: receive, via the communication channel, image data generated by the mobile device and representative of an image; and cause the image to be displayed on the display.
  • the secondary display device further comprises a user interface for receiving a user input, the secondary display device being arranged and configured, based upon the user input, to transmit a signal across the communication channel for controlling an operation of the mobile device.
  • the user input at the secondary display device thus generates a signal that is sent to the mobile device and causes the mobile device to generate new and/or updated image data for display at the secondary display device.
  • the signal generated based on the user input therefore contains information relating to the user input. This information may then be interpreted by the mobile device in order for the mobile device to generate appropriate image data responsive to the signal.
  • the secondary display device does not necessarily need to generate the image data for display, and so can be made relatively lightweight and inexpensive.
  • the secondary display device may be sized and shaped to facilitate mounting to a vehicle such as a scooter or moped.
  • the secondary display device may be mountable to the handlebars or mirror stalk of a scooter or moped.
  • any suitable processor or processing circuitry may be incorporated into the secondary display device.
  • the secondary display device and/or the at least one processor of the secondary display device may also contain any suitable signal generation and/or processing device or circuitry for generating and/or receiving control signals.
  • the at least one processor of the secondary display device may be arranged and configured to cause the secondary display device, or a signal generating module of the secondary display device, to transmit the signal.
  • the secondary display device may further comprise an antenna for use with a global navigation satellite system for obtaining positional data representative of a current position of the secondary display device within the area.
  • the at least one processor may be arranged and configured to transmit the positional data via the communication channel for use by the mobile device.
  • the secondary display device may further comprise a battery.
  • the at least one processor may be arranged and configured to check the charge level of the battery, and transmit information indicative of the current level of the battery via the communication channel for inclusion in the image data.
  • the information indicative of the current level of the battery may be stored in a memory of the secondary display device such that it is immediately available for transmission to the mobile device in order for the mobile device to include the battery level information in the image data, i.e. or to generate image data representing the battery level information.
  • the charge level may be checked continually whilst the secondary display device is turned on. When the secondary display device is off, or in a standby mode, the secondary display device may be arranged and configured to periodically wake up to check the charge level of the battery.
  • the user interface may comprise a touchscreen.
  • the communication module(s) may comprise a wireless communication module, and the communication channel may comprise a wireless communication channel.
  • the wireless communication is Bluetooth communication.
  • a system comprising a mobile device and a secondary display device for navigating a user travelling within an area within which the mobile device and secondary display device are travelling;
  • the mobile device comprising a communication module and at least one processor
  • the secondary display device comprising a communication module for communicating with the communication module of the mobile device via a communication channel, at least one processor and a display;
  • the at least one processor of the mobile device is arranged and configured to: obtain positional data representative of a current position of the mobile device and/or the secondary display device within the area; generate, in real time, image data representative of an image to be displayed by the secondary display device; and transmit the generated image data via the communication channel to the secondary display device;
  • the secondary display device is arranged and configured to: receive the image data via the communication channel; and display the image on the display.
  • the secondary display device may further comprise a user interface, such as a touchscreen, for receiving a user input.
  • the processor of the secondary display device may be arranged and configured upon the user input to transmit a signal across the communication channel to the mobile device; and the processor of the mobile device may be arranged and configured to generate new and/or updated image data based upon the signal and to transmit the new or updated image data to the secondary display device for display.
  • the mobile device may comprise a mobile device substantially as described in accordance with any of the aspects or embodiments herein.
  • the secondary display device may comprise a secondary display device substantially as described in accordance with any of the aspects or embodiments herein.
  • a method for providing navigation information to a user comprising:
  • the method further comprising:
  • the present invention extends to a system for carrying out a method in accordance with any of the aspects or embodiments of the invention herein described.
  • a system for providing navigation information to a user comprising:
  • this further aspect of the present invention can and preferably does include any one or more or all of the preferred and optional features of the invention described herein in respect of any of the other aspects of the invention, as appropriate.
  • the system of the present invention herein may comprise means for carrying out any step described in relation to the method of the invention in any of its aspects or embodiments, and vice versa.
  • the present invention is a computer implemented invention, and any of the steps described in relation to any of the aspects or embodiments of the invention may be carried out under the control of a set of one or more processors.
  • the means for carrying out any of the steps described in relation to the system may be a set of one or more processors.
  • the method of the present invention may be implemented in the context of a navigation operation.
  • the method may be carried out by a set of one or more processors of a device or system having navigation functionality.
  • the present invention extends to a computer program product comprising computer readable instructions adapted to carry out any or all of the method described herein when executed on suitable data processing means.
  • the invention also extends to a computer software carrier comprising such software.
  • a software carrier could be a physical (or non-transitory) storage medium or could be a signal such as an electronic signal over wires, an optical signal or a radio signal such as to a satellite or the like.
  • a computer program product e.g. computer software, comprising instructions which, when executed by one or more processors of a system, cause the system to perform the method of any of the aspects and embodiments discussed above.
  • the computer program product can be stored on a non- transitory computer readable medium.
  • Figure 1 A illustrates a navigation system according to an embodiment of the invention comprising a mobile device and a secondary display device and Figure 1 B shows the secondary display device mounted in use;
  • Figure 2 illustrates the navigation system of Figure 1A in more detail
  • Figure 3A illustrates the internal architecture of the mobile device and Figure 3B illustrates the main communication pathways between the mobile device and the secondary display device;
  • Figure 4 shows the stacked architecture for the communication pathways between the mobile device and the secondary display device;
  • Figure 5 shows an example of a display provided by the secondary display device for use in navigating a user
  • Figure 6 shows an example of a menu provided on the secondary display device.
  • preferred embodiments of the invention will now be described with particular reference to a mobile device such as a smartphone.
  • teachings of the present invention are not limited to this context but are instead universally applicable to any type of processing device that is configured to execute navigation software so as to provide route planning and navigation functionality.
  • the mobile device is intended to include (without limitation) any type of portable route planning and navigation device or indeed any type of computing resource, such as a portable personal computer (PC), mobile telephone or portable digital assistant (PDA), executing route planning and navigation software.
  • Figure 1 A illustrates a navigation system according to an embodiment of the invention comprising a mobile device 1 in the form of a smartphone and a secondary display device 2 with which the mobile device 1 can communicate e.g. via a Bluetooth protocol.
  • the mobile device 1 is running a software application that generates the images for display by the secondary display device 2.
  • the type of mobile device 1 is not particularly limited, and the software application may e.g. run on either Apple iOS or an Android operating system.
  • Figure 1 B shows the secondary display device 2 in use mounted to the mirror stalk of a scooter. It will be appreciated that both the mobile device 1 and secondary display device 2 are intended to travel together, in close association with each other, such that they remain in range throughout the journey.
  • the mobile device 1 is the user's smartphone running an appropriate software application for use in controlling the secondary display device 2 which may be mounted to the user's vehicle.
  • the mobile device 1 will typically therefore be relatively expensive and precious to the user; accordingly, in the embodiment, the mobile device 1 may be safely stored in use, e.g. in a user's pocket or bag (or any other suitable location), to avoid risking damage to the mobile device 1 , so long as it is within communication range of the secondary display device 2.
  • the communication between the mobile device 1 and the secondary display device 2 is via Bluetooth communication.
  • other suitable wireless communication protocols may also be used, or less preferably even a wired communication between the devices 1 , 2.
  • the secondary display device 2 may essentially be a slave device operating under the control of the mobile device 1 , and may therefore be provided with relatively limited hardware and software resources. Hence, the secondary display device 2 can be made relatively lightweight and inexpensive, since the bulk of the processing is performed at the mobile device 1 .
  • the secondary display device 2 may be mounted to the vehicle in any suitably visible location, e.g. on the handlebars or mirror stalk of a scooter.
  • the secondary display device 2 would typically be releasably mounted to the vehicle such that an unskilled user can readily attach/detach the device from the vehicle.
  • the secondary display device 2 may thus be provided with any suitable mounting system that allows it to be readily mounted to a vehicle, e.g. a scooter.
  • the secondary display device 2 could be more permanently mounted.
  • the secondary display device 2 is intended for mounting to a scooter, and hence is exposed in use.
  • the secondary display device 2 may therefore generally be provided with a scratch-resistant or waterproof casing.
  • the mobile device 1 receives as input or generates positional data reflective of the current position of the user (e.g. based on live GPS positional data) and optionally map data reflecting the current traffic conditions in the map region within which the user is travelling. From this data, the mobile device 1 generates image data associated with an image that is intended for display to the user via the secondary display device 2. Particularly, during a navigation mode of operation, the mobile device 1 generates image data representing the current position of the user within a map. It will be appreciated that the current position of the user may in principle be determined using the position either of the mobile device 1 or the secondary display device 2 as both devices are intended to be carried by the user/mounted on the user's vehicle in use.
  • This image data is updated in real-time so that the current position of the user as the devices 1 , 2 move around the map region may be continually displayed to the user, i.e. in the form of a video display.
  • the image data is compressed at the mobile device 1 into a format suitable for transmitting or streaming across the Bluetooth communication channel.
  • the stream of image data is then transmitted to the secondary display device 2 in its compressed format wherein the secondary display device 2, upon receipt of the image data, proceeds to decode or decompress the received image data in order to (re)construct the image intended for display via the secondary display device 2, and this image is then displayed to the user. For example, where the image represents a current position of the user, this image is displayed to the user for use in navigation purposes.
  • the secondary display device 2 may include a user interface for providing control functionality, e.g. for changing the display format, switching between a number of preset destinations, and/or switching between different levels of zoom. This may be provided via touch control wherein a user taps or swipes the display of the secondary display device 2 in order to change the display.
  • the input provided by the user at the secondary display device 2 then generates a signal that is sent to the mobile device 1 to cause the mobile device 1 to update the image data based on the user input, e.g. to change the display format, to switch between destinations, and/or to switch between different levels of zoom.
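  • As a rough sketch of this interaction (not the actual protocol used by the devices), a touch event at the secondary display device 2 might be packaged as a small message, sent over the communication channel, and interpreted at the mobile device 1 to update the state from which the next image is rendered; the message fields and function names below are assumptions.

```python
# Minimal sketch: a touch event on the secondary display is serialised into a
# small message, sent over the communication channel, and interpreted by the
# mobile device, which then regenerates the image from the updated state.
import json
from dataclasses import dataclass, asdict

@dataclass
class UiEvent:
    kind: str        # e.g. "tap", "swipe_up", "swipe_down"
    x: int = 0       # touch coordinates on the secondary display
    y: int = 0

def encode_event(event: UiEvent) -> bytes:
    """Secondary display side: package the user input for transmission."""
    return json.dumps(asdict(event)).encode("utf-8")

def handle_event(payload: bytes, state: dict) -> dict:
    """Mobile device side: update the display state that drives rendering."""
    event = UiEvent(**json.loads(payload.decode("utf-8")))
    if event.kind == "swipe_down":
        state["screen"] = "destination_menu"            # open destination menu
    elif event.kind == "swipe_up":
        state["overview"] = not state.get("overview", False)
    elif event.kind == "tap":
        state["zoom"] = (state.get("zoom", 0) + 1) % 3  # cycle zoom levels
    return state        # a new frame would then be rendered from this state

state = handle_event(encode_event(UiEvent("tap", 120, 80)), {"screen": "nav"})
print(state)
```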
  • the user interface may also be relatively simple, and only be associated with relatively basic control functionality, e.g. of the types discussed above. More complicated user inputs such as initial route planning control or set up of the application will typically be performed in advance via the mobile device 1. User input may also be provided in use via a headset in communication with the mobile device 1 , where one is provided.
  • Figure 2 illustrates the internal architecture of the components of the system shown in Figure 1A in more detail.
  • the system generally comprises a mobile device 1 running a software application 100 within an operating environment 10 and communicating via an interface 12 with a secondary display device 2.
  • the mobile device 1 is also arranged to communicate with an external server 3 that provides back-end services to the system, e.g. through an internet connection.
  • the external server 3 may provide the map data and live data reflecting the current traffic conditions in the map region to the software application 100 for use in generating the navigation information to be displayed to the user, as is known for PNDs generally.
  • the external server 3 may also provide software and/or firmware updates for the software application 100 running on the mobile device 1 and the secondary display device 2.
  • the secondary display device 2 typically only has relatively limited processing power and memory resources, such that any updates for, or information to be displayed on, the secondary display device 2 are provided by the controlling mobile device 1.
  • the mobile device 1 will include a GPS antenna 11 and a 3G/4G/Wi-Fi communication module 13 through which communication with the external server 3 is effected, as is conventional for modern smartphones.
  • the mobile device 1 may also be used with a headset 14 for receiving voice instructions from a user. Again, this functionality is well known for modern smartphones.
  • the secondary display device 2 includes a microcontroller unit 201 , a display screen 202, e.g. in the form of an octagonal LCD module having a touchscreen functionality, a Bluetooth connection interface 122, a Global Navigation Satellite System (GNSS) receiver 203, a battery 204 and a port 205 for inserting a charger.
  • the port may e.g. comprise a USB or micro USB port.
  • the secondary display device 2 may also have a power button for turning the device on, or for suspending or rebooting the operation of the device.
  • Figure 3A is another representation of the architecture inside the secondary display device 2.
  • the microcontroller or MCU 201 communicates with, to receive inputs from and/or control the operation of, the display unit 202 and its associated touchscreen 2021, and the GNSS receiver 203 and its associated antenna 2031.
  • the microcontroller 201 also communicates with the battery 204 and the charging port 205, e.g. to receive information regarding the current charge level or charge status of the battery 204.
  • the secondary display device 2 also includes a power button 22 for switching the secondary display device 2 ON/OFF or into a 'suspended' mode of operation.
  • Figure 3B shows the main communication paths between the secondary display device 2 and the mobile device 1 .
  • Figure 4 shows the stacked architecture for the communication between the mobile device and the secondary display device.
  • the paths all pass through the microcontroller 201 of the secondary display device 2, as this component generally controls the operation of the secondary display device 2 (in response to the received inputs from the mobile device 1 ).
  • the first path 301 is used by the user interface frame buffer to transmit the image data generated by the navigation application 100 to be rendered for display by the secondary display device 2 at a set (variable) frame rate, e.g. as explained further below.
  • the first path 301 therefore passes from the mobile device 1 to the display unit 202 of the secondary display device 2.
  • a second path 302 is provided between the touchscreen 2021 of the secondary display device 2 and the mobile device 1 for transmitting user interface events, where user input provided to the secondary display device 2, via the touchscreen 2021 , is sent back to the mobile device 1 , and used to update or generate new image data for display by the display unit 202.
  • the first 301 and second 302 communication paths are used to transmit or stream the image data for display, and for the user-controlled adjustment of the image data that is displayed.
  • the third path 303 is a bi-directional communication path extending between the GNSS receiver 203 of the secondary display device 2 and the mobile device 1 , and is provided for transmitting GNSS data (e.g. from the GNSS receiver 203 to the mobile device 1 ) and for control of the GNSS receiver 203 by the mobile device 1 .
  • the initial GNSS data may be provided by the mobile device 1 , e.g. during the route planning and/or before the secondary display device 2 is connected.
  • the GNSS data may be provided to the mobile device 1 from the secondary display device 2.
  • the application 100 running on the mobile device 1 is intended to be a companion application only, with the mobile device 1 e.g. safely stored in the user's pocket or bag in use rather than mounted in view.
  • GNSS data from the GNSS receiver of the secondary display device 2 which is e.g. mounted to the handlebar or mirror of the scooter, may therefore provide more accurate information, especially as a better (i.e. larger), dedicated antenna can be provided within the secondary display device 2 than the general purpose antenna that may be provided as part of the mobile device 1 , i.e. a conventional smartphone.
  • the software application 100 running the mobile device 1 may therefore be designed to disregard any GNSS data from the antenna of the mobile device 1 in favour of that provided by the secondary display device 2, where both are available.
  • the GNSS data from the antenna of the mobile device 1 may also be used as a fallback when needed, e.g. where no GNSS data is available via the secondary display device 2.
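  • A minimal sketch of this source preference, assuming a simple (latitude, longitude) fix format, is shown below; the function is illustrative only.

```python
# Sketch of the GNSS source preference described above: fixes reported by the
# secondary display device are preferred, with the smartphone's own antenna
# used only as a fallback. The fix format and names are assumptions.
from typing import Optional, Tuple

Fix = Tuple[float, float]  # (latitude, longitude)

def choose_position(secondary_fix: Optional[Fix],
                    mobile_fix: Optional[Fix]) -> Optional[Fix]:
    """Prefer the secondary display device's GNSS fix; fall back to the
    mobile device's own antenna when no secondary fix is available."""
    if secondary_fix is not None:
        return secondary_fix
    return mobile_fix

# The handlebar-mounted device has a fix, so it wins.
print(choose_position((52.37, 4.90), (52.36, 4.89)))
# Before the secondary display device connects, fall back to the phone.
print(choose_position(None, (52.36, 4.89)))
```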
  • the fourth path 304 is a bi-directional communication path between the microcontroller unit 201 of the secondary display device 2 and the mobile device 1 , and may be used for general system control and information e.g. for providing information indicative of the device credentials, the battery charge level of the device, the charging status, dock status, etc. This information may be provided from the secondary display device 2 to the mobile device 1 , for the mobile device 1 to then render this information into the image data for subsequent display by the secondary display device 2.
  • the fourth path may also be used e.g. for display brightness control of the secondary display device 2, which, again, is rendered into the image data by the mobile device 1.
  • the secondary display device 2 firmware may also be updated using information sent along the fourth path from the mobile device 1 , e.g. and retrieved from an external server 3.
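  • Purely for orientation, the four logical paths of Figure 3B can be summarised as data, as in the sketch below; the structure and field names are illustrative, not part of the described interface.

```python
# Sketch summarising the four logical communication paths as data, to make the
# direction and purpose of each path explicit. Illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Path:
    ref: int          # reference numeral used in the description
    direction: str    # relative to the mobile device 1
    purpose: str

PATHS = (
    Path(301, "mobile -> display unit", "frame buffer: streamed image data at a set (variable) frame rate"),
    Path(302, "touchscreen -> mobile",  "user interface events used to update or regenerate image data"),
    Path(303, "bi-directional",         "GNSS data and control of the GNSS receiver 203"),
    Path(304, "bi-directional",         "system control/info: credentials, battery level, charging/dock status, brightness, firmware updates"),
)

for p in PATHS:
    print(f"path {p.ref} ({p.direction}): {p.purpose}")
```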
  • the image rendering, compression and streaming at the mobile device 1 and the subsequent decompressing or decoding of the received image data at the secondary display device 2 may be performed using any suitable techniques. It will also be appreciated that the image coding/decoding activity is quite CPU/RAM intensive, and that in order for the computational load at the secondary display device 2 to be reduced as far as possible, accurate scheduling of activities may be required to avoid an adverse user experience (i.e. including significant lag).
  • the rendered frames from the software application 100 are encoded using an intraframe compression technique, i.e. in which each frame of the image/stream is compressed individually. This may help reduce the amount of processing involved at the secondary display device 2.
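  • The sketch below illustrates what intraframe compression means for this stream: each frame is compressed and decoded independently of its neighbours. The standard-library zlib codec is used purely for illustration, as the description does not name a particular codec.

```python
# Sketch of intraframe compression: each rendered frame is compressed on its
# own, so the secondary display device can decode any frame independently.
# A real implementation would likely use an image codec rather than
# general-purpose compression; zlib is used here only as a stand-in.
import zlib

def encode_frame(rgb_bytes: bytes) -> bytes:
    """Compress one frame independently of all other frames."""
    return zlib.compress(rgb_bytes, level=6)

def decode_frame(payload: bytes) -> bytes:
    """Secondary display side: reconstruct the frame from one payload only."""
    return zlib.decompress(payload)

# A flat test frame compresses very well; a real map frame far less so.
frame = bytes(320 * 240 * 2)            # e.g. 320x240 at 2 bytes/pixel, raw
packet = encode_frame(frame)
assert decode_frame(packet) == frame    # every frame is self-contained
print(len(frame), "->", len(packet), "bytes")
```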
  • the effective speed of a Bluetooth channel is roughly 100 Kb/s.
  • for a stream of images of size roughly between 20-50 Kb being displayed at a frame rate of around 5 fps (frames per second) (as may typically be accommodated on a smartphone), just transferring each image over a Bluetooth connection would take around 200-500 ms.
  • the communication protocol may be designed so that the overall system performance is not negatively impacted by this in order to provide a good user experience.
  • the user interface path used to render the images (the first path or rendering channel) and the user interface event channel (second path) may inherently have a relatively high latency, due to the relatively intensive requirements placed on the processor (CPU/RAM) of the mobile device 1 in rendering the images.
  • the high latency streamed image data may be sent along a single serial channel, and the image data associated with a single image may be broken up into a series of 'parts'. That is, the protocol may be adapted to stream the image data as a sequence of discrete parts that are then (re-)assembled and decompressed at the secondary display device 2. This may make aborting and sending over a new image easier than would be the case by e.g. sending the image as one data part, or across parallel data channels. The size of the parts may be chosen appropriately to provide the required responsiveness.
  • the smaller the size of the parts the more responsive the system may be - that is, the lower the latency associated with aborting/changing an image being sent in midstream.
  • the smaller the size of the parts the more processing is required at the secondary display device 2 to re-assemble the parts into the image data for generating the image for display.
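  • A minimal sketch of such a 'parts' scheme is given below, assuming a simple header carrying a frame identifier, part index and part count; the header layout, part size and abort behaviour are illustrative assumptions rather than the protocol actually used.

```python
# Sketch: image data for one frame is split into parts, sent in series, and
# may be abandoned mid-stream when a newer frame supersedes it; the receiver
# reassembles and decompresses a frame only once its final part arrives.
import os
import struct
import zlib

PART_SIZE = 6_000                      # bytes per part (illustrative)
HEADER = struct.Struct(">IHH")         # frame id, part index, total parts

def split_into_parts(frame_id: int, payload: bytes):
    """Mobile device side: break one frame's compressed image data into parts."""
    total = max(1, -(-len(payload) // PART_SIZE))   # ceiling division
    for idx in range(total):
        chunk = payload[idx * PART_SIZE:(idx + 1) * PART_SIZE]
        yield HEADER.pack(frame_id, idx, total) + chunk

class Reassembler:
    """Secondary display side: discard a half-received frame as soon as a part
    of a newer frame arrives, and rebuild the latest frame once complete."""
    def __init__(self):
        self.frame_id, self.parts, self.total = None, {}, 0

    def feed(self, packet: bytes):
        frame_id, idx, total = HEADER.unpack(packet[:HEADER.size])
        if frame_id != self.frame_id:           # newer frame: abort the old one
            self.frame_id, self.parts, self.total = frame_id, {}, total
        self.parts[idx] = packet[HEADER.size:]
        if len(self.parts) == self.total:       # all parts received
            data = b"".join(self.parts[i] for i in range(self.total))
            return zlib.decompress(data)        # reconstruct the frame
        return None

# Frame 1 is abandoned after one part because frame 2 supersedes it.
rx = Reassembler()
frame1 = zlib.compress(os.urandom(20_000))
frame2_raw = os.urandom(20_000)
frame2 = zlib.compress(frame2_raw)
rx.feed(next(split_into_parts(1, frame1)))      # sender aborts frame 1 here
image = None
for packet in split_into_parts(2, frame2):
    image = rx.feed(packet)
print(image == frame2_raw)                      # -> True
```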
  • interframe compression techniques could also be used, i.e. in which multiple neighbouring frames are compressed together based on the recognition that a frame can be expressed in terms of one or more preceding and/or succeeding frames.
  • the frame rate at which the image data is generated and transmitted at the mobile device 1 , and hence displayed at the secondary display device 2 may be variable e.g. between a number of different settings. For instance, different frame rates may be used depending on the current display mode - a first frame rate may be used when displaying the menu or destination screen on the secondary display device 2 and a second frame rate used when displaying the map for guidance in the navigation mode. By way of example, a first frame rate of around 10 fps may be used in the menu mode and a second frame rate of around 5 fps used in the navigation mode. A higher frame rate may be needed when navigating the menu to ensure that the device acts responsively. However, a lower frame rate can be used during navigation, where responsiveness may be less of a concern (as there will generally be less user interaction), so as to reduce the processing strain at the secondary display device 2 and hence preserve the battery life.
  • the frame rate may also be varied between different settings based on various other considerations.
  • the frame rate may be varied based on the speed at which the user is travelling through the map region - e.g. a frame rate of 1 fps when travelling at low speeds, such as less than about 30 km/h, and 5 fps when travelling at higher speeds, such as 100 km/h or higher.
  • the frame rate may be varied based on the zoom level selected by the user via the secondary display device 2 - e.g. a frame rate of 1 fps for relatively low, zoomed-out, zoom levels, and 5 fps for higher, zoomed in, zoom levels.
  • a higher frame rate such as 7 fps, may be used when changing heading or recalculating a route.
  • in such situations, the highest frame rate should be used.
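  • Combining the example values above, a frame-rate policy might look like the sketch below; the treatment of intermediate speeds and the rule of taking the highest applicable rate are assumptions consistent with the preceding discussion, not prescribed values.

```python
def select_frame_rate(mode: str,
                      speed_kmh: float = 0.0,
                      zoomed_in: bool = False,
                      recalculating: bool = False) -> int:
    """Pick the streaming frame rate (fps) for the current situation, using
    the example values from the text and taking the highest applicable rate."""
    if mode == "menu":
        return 10                       # menus must feel responsive
    candidates = [1]                    # low-speed navigation baseline
    if speed_kmh >= 30:                 # behaviour between 30 and 100 km/h is
        candidates.append(5)            # an assumption; the text gives 1 fps
    if zoomed_in:                       # below ~30 km/h and 5 fps at ~100 km/h
        candidates.append(5)
    if recalculating:
        candidates.append(7)            # heading change / route recalculation
    return max(candidates)

print(select_frame_rate("menu"))                                            # -> 10
print(select_frame_rate("navigation", speed_kmh=20))                        # -> 1
print(select_frame_rate("navigation", speed_kmh=110, recalculating=True))   # -> 7
```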
  • Figure 5 illustrates an example of the touchscreen display of the secondary display device 2 that may be provided to the user during a navigation mode.
  • the display may contain various other icons representing e.g. current traffic conditions 53, expected delays 54, and the location of speed cameras 55.
  • the display also contains an indication 52 of the estimated time of arrival. Tapping this icon, or e.g. swiping down from the top of the display, may open up a menu display for choosing a new destination from a pre- determined list of previous or favourite destinations (as shown for example by icon 60 in Figure 6).
  • the display also shows the next instruction 56 associated with the navigation route, e.g. "Turn right in 50m".
  • Tapping the instruction or e.g. swiping up from the bottom of the display, may allow the user to switch between a route overview mode in which the current position and the destination are both visible and a guidance mode which is relatively zoomed in on the current position.
  • Tapping the display at any other position may be used to select between different zoom levels.
  • the display may be set between a number of discrete zoom levels, for instance: a first zoom level that shows the remaining route keeping the current position and the destination in view, a second zoom level at a fixed height just keeping the current position in view, and a third zoom level at a lower fixed height, zoomed closer in to the current position.
  • the display may automatically change between zoom levels in response to changes in the current position. For example, when the destination is reached, the route may be cleared and the current position may move to the center of the screen, with the display at the maximum zoomed out zoom level (i.e. the first zoom level).
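  • The sketch below captures the three discrete zoom levels and the automatic return to the most zoomed-out level on arrival; the level names and the tap-to-cycle behaviour are illustrative assumptions.

```python
# Sketch of the three discrete zoom levels described above and the automatic
# change on arrival; names and cycling behaviour are illustrative only.
from enum import Enum

class Zoom(Enum):
    ROUTE_OVERVIEW = 1   # current position and destination both in view
    FIXED_HIGH = 2       # fixed height, current position kept in view
    FIXED_LOW = 3        # lower fixed height, zoomed closer in

def next_zoom_on_tap(current: Zoom) -> Zoom:
    """Tapping the display steps through the discrete zoom levels."""
    order = list(Zoom)
    return order[(order.index(current) + 1) % len(order)]

def zoom_on_arrival(destination_reached: bool, current: Zoom) -> Zoom:
    """On arrival the route is cleared and the view returns to the most
    zoomed-out level, with the current position re-centred."""
    return Zoom.ROUTE_OVERVIEW if destination_reached else current

print(next_zoom_on_tap(Zoom.FIXED_LOW))          # -> Zoom.ROUTE_OVERVIEW
print(zoom_on_arrival(True, Zoom.FIXED_LOW))     # -> Zoom.ROUTE_OVERVIEW
```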
  • Figure 5 is merely one example of a display, and of course the display and control functionality may be tailored in any appropriate manner depending on the desired content to be displayed to the user.
  • the secondary display device 2 may additionally display to the user the current battery level of the device.
  • the current battery level is generally determined by the MCU of the secondary display device 2.
  • the secondary display device 2 may transmit this information to the mobile device 1 in order for the mobile device 1 to render a new image including this information. It may therefore not be necessary for the secondary display device 2 to store any image data representing the battery level, as the image rendering may be performed only at the mobile device 1 .
  • the secondary display device 2 may store or even generate indicia representative of the different battery statuses itself. It may be useful for the secondary display device 2 to store some of these indicators in the case where the battery is low, so that this information can be displayed immediately to the user without having to connect to the mobile device 1 .
  • turning the device ON may prompt an initial step of checking the battery level of the secondary display device 2, and where the battery level is low, e.g. below a certain threshold, displaying this information to the user (prior to displaying any menu screen or navigation information).
  • the display may be updated to reflect the current battery level of the secondary display device 2. This may be provided via an icon in the corner of the display, or, especially where the battery level drops to a critically low level, by displaying a warning to the user.
  • the display may also reflect the current battery level of the mobile device 1, as well as the current GNSS/device connectivity status.
  • When not being used, the secondary display device 2 is generally kept in a suspended mode. In order to monitor the battery level, in embodiments, the secondary display device 2 is configured to periodically (e.g. about every 20 minutes) wake up from the suspended mode and check the current battery level. The detected battery level may then be stored in a battery-backed RAM portion of the secondary display device 2 such that the current (i.e. the most recently checked) value of the battery level is always available for immediate display (a minimal sketch of this behaviour is given after this list). Naturally, in use, the battery level may be checked continuously.
  • the secondary display device 2 may display to the user an indication of whether the secondary display device 2 is currently being charged, etc.
  • the secondary display device 2 may be configured for phone call handling. For instance, when the mobile device 1 receives an incoming phone call, a monitoring application within the mobile device 1, or a monitoring function of the navigation application 100, may detect this and signal the navigation application 100. Upon receipt of this signal the mobile device 1 then generates an image indicating the incoming phone call, e.g. displaying a photo of the contact or the number that is calling. The phone call may be dismissed by tapping the display, or by doing nothing for a predetermined wait period, to return to the navigation display. Alternatively, the phone call may be accepted or declined via a headset 14 of the mobile device 1 where one is provided.
  • the communication interface 12 in the illustrated embodiment operates via Bluetooth communication between the Bluetooth connectivity portions 121, 122 of the mobile device 1 and the secondary display device 2.
  • Bluetooth may be particularly appropriate given the relatively short distance between the mobile device 1 and the secondary display device 2 in use, as both devices are carried around with the user, as discussed above.
  • When the secondary display device 2 is first set up and/or when the software application 100 is first downloaded, the mobile device 1 and secondary display device 2 may be paired using any suitable pairing protocol. Upon start-up of either device, it is then easy for the device to check whether or not it has been paired, and then whether or not the paired device is within range, and connected. If the devices are determined to be connected, the software application can thus start running on the mobile device 1.
  • the software application 100 on the mobile device 1 may be arranged to automatically start up when the secondary display device 2 is turned on and connected to the mobile device 1.
  • this method operates as follows.
  • the secondary display device 2 pairs with the mobile device 1 over the Bluetooth connection. Since the pairing process has already been completed (as discussed above), this occurs automatically upon turning on the secondary display device 2 and bringing it into range of the mobile device 1, provided that the Bluetooth connection is turned on.
  • a monitoring application on the mobile device 1 then listens to the Bluetooth channel. The monitoring application serves only to listen to the Bluetooth channel, and therefore requires relatively little processing power. The monitoring application can therefore be left running in the background without significantly draining the battery of the mobile device 1.
  • When the secondary display device 2 is turned on, it sends a wake-up signal over the Bluetooth channel to the mobile device 1 that is detected by the monitoring application. Upon detection of this wake-up signal, the monitoring application triggers start-up of the navigation application 100 of the mobile device 1. The navigation application 100 then starts to generate the image data for display on the secondary display device 2 (e.g. the Home or Menu screen) and sends the image data over the Bluetooth connection to the secondary display device 2.
  • the mobile device may be configured to perform any one or all of the functions described above in the Background section.
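
By way of illustration only, the following sketch (in Python, purely for exposition) shows one way the suspended-mode battery check and the power-on low-battery warning described in the list above could be organised. The class and function names, the threshold value and the stand-in battery reading are assumptions made for this example and do not form part of the disclosure.

```python
# Illustrative sketch of the suspended-mode battery monitoring: wake roughly every
# 20 minutes, sample the battery level, and keep the most recent value in
# battery-backed RAM so it can be shown immediately at power-on.

from typing import Optional

WAKE_INTERVAL_S = 20 * 60      # approximate wake-up period while suspended
LOW_BATTERY_THRESHOLD = 15     # percent; assumed value for this example


class BatteryBackedRam:
    """Stand-in for the battery-backed RAM portion of the secondary display device."""
    def __init__(self) -> None:
        self.last_battery_level: Optional[int] = None


def read_battery_level() -> int:
    """Placeholder for the MCU battery measurement; returns a dummy value here."""
    return 80


def suspended_mode_check(ram: BatteryBackedRam) -> None:
    """One wake-up cycle: sample the battery and store the most recent value."""
    ram.last_battery_level = read_battery_level()


def on_power_on(ram: BatteryBackedRam) -> str:
    """At power-on, decide whether a low-battery warning precedes any menu screen."""
    level = ram.last_battery_level
    if level is not None and level < LOW_BATTERY_THRESHOLD:
        return f"low-battery warning ({level}%)"
    return "menu screen"


if __name__ == "__main__":
    ram = BatteryBackedRam()
    suspended_mode_check(ram)   # in practice scheduled roughly every WAKE_INTERVAL_S
    print(on_power_on(ram))     # -> "menu screen" with the dummy 80% reading
```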

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Navigation (AREA)

Abstract

A mobile device for use with a display device is provided. The mobile device is arranged to communicate with the display device for navigating a user within an area within which the mobile device and display device are travelling. The mobile device includes a communication module for communicating with the display device via a communication channel, and a processor. The processor may be configured to obtain positional data indicative of a current position of the mobile device and/or the display device within the area, generate real-time image data representative of an image indicating the current position of the mobile device and/or the display device within the area, and cause the mobile device to transmit the real-time image data via the communication channel for display by the display device.

Description

NAVIGATION DEVICE AND DISPLAY
Field of the Invention
The present invention relates generally to a mobile device and secondary display device for use in navigating a user within an area. Illustrative embodiments of the invention relate to portable navigation devices (so-called PNDs), in particular PNDs that include Global Positioning System (GPS) signal reception and processing functionality. Other embodiments relate, more generally, to any type of mobile processing device that is configured to execute navigation software so as to provide route planning, and preferably also navigation, functionality.
Background of the Invention
Portable navigation devices (PNDs) that include GPS signal reception and processing functionality are well known and widely employed as in-car or other vehicle navigation systems.
In general terms, a modern PND comprises a processor, memory (at least one of volatile and non-volatile, and commonly both), and map data stored within the memory. The processor and memory cooperate to provide an execution environment in which a software operating system may be established, and additionally it is commonplace for one or more additional software programs to be provided to enable the functionality of the PND to be controlled, and to provide various other functions.
Typically these devices further comprise one or more input interfaces that allow a user to interact with and control the device, and one or more output interfaces by means of which information may be relayed to the user. Illustrative examples of output interfaces include a visual display and a speaker for audible output. Illustrative examples of input interfaces include one or more physical buttons to control on/off operation or other features of the device (which buttons need not necessarily be on the device itself but could be on a steering wheel if the device is built into a vehicle), and a microphone for detecting user speech. In a particularly preferred arrangement the output interface display may be configured as a touch sensitive display (by means of a touch sensitive overlay or otherwise) to additionally provide an input interface by means of which a user can operate the device by touch.
Devices of this type will also often include one or more physical connector interfaces by means of which power and optionally data signals can be transmitted to and received from the device, and optionally one or more wireless transmitters/receivers to allow communication over cellular telecommunications and other signal and data networks, for example Wi-Fi, Wi-Max, GSM and the like.
PND devices of this type also include a GPS antenna by means of which satellite-broadcast signals, including location data, can be received and subsequently processed to determine a current location of the device.
The PND device may also include electronic gyroscopes and accelerometers which produce signals that can be processed to determine the current angular and linear acceleration, and in turn, and in conjunction with location information derived from the GPS signal, velocity and relative displacement of the device and thus the vehicle in which it is mounted. Typically such features are most commonly provided in in-vehicle navigation systems, but may also be provided in PND devices if it is expedient to do so.
The utility of such PNDs is manifested primarily in their ability to determine a route between a first location (typically a start or current location) and a second location (typically a destination). These locations can be input by a user of the device, by any of a wide variety of different methods, for example by postcode, street name and house number, previously stored "well known" destinations (such as famous locations, municipal locations (such as sports grounds or swimming baths) or other points of interest), and favourite or recently visited destinations.
Typically, the PND is enabled by software for computing a "best" or "optimum" route between the start and destination address locations from the map data. A "best" or "optimum" route is determined on the basis of predetermined criteria and need not necessarily be the fastest or shortest route. The selection of the route along which to guide the driver can be very sophisticated, and the selected route may take into account existing, predicted, dynamically and/or wirelessly received, and historical traffic and road information.
In addition, the device may continually monitor road and traffic conditions, and offer to or choose to change the route over which the remainder of the journey is to be made due to changed conditions. Real time traffic monitoring systems, based on various technologies (e.g. mobile phone data exchanges, fixed cameras, GPS fleet tracking) are being used to identify traffic delays and to feed the information into notification systems.
PNDs of this type may typically be mounted on the dashboard or windscreen of a vehicle, but may also be formed as part of an on-board computer of the vehicle radio or indeed as part of the control system of the vehicle itself. The navigation device may also be part of a hand-held system, such as a PDA (Portable Digital Assistant), a media player, a mobile phone or the like, and in these cases, the normal functionality of the hand-held system is extended by means of the installation of software on the device to perform both route calculation and navigation along a calculated route.
Route planning and navigation functionality may also be provided by a desktop or mobile computing resource running appropriate software. For example, the Royal Automobile Club (RAC) provides an on-line route planning and navigation facility at http://www.rac.co.uk, which facility allows a user to enter a start point and a destination whereupon the server to which the user's PC is connected calculates a route (aspects of which may be user specified), generates a map, and generates a set of exhaustive navigation instructions for guiding the user from the selected start point to the selected destination. The facility also provides for pseudo three-dimensional rendering of a calculated route, and route preview functionality which simulates a user travelling along the route and thereby provides the user with a preview of the calculated route.
In the context of a PND, once a route has been calculated, the user interacts with the navigation device to select the desired calculated route, optionally from a list of proposed routes. Optionally, the user may intervene in, or guide the route selection process, for example by specifying that certain routes, roads, locations or criteria are to be avoided or are mandatory for a particular journey. The route calculation aspect of the PND forms one primary function, and navigation along such a route is another primary function.
During navigation along a calculated route, it is usual for such PNDs to provide visual and/or audible instructions to guide the user along a chosen route to the end of that route, i.e. the desired destination. It is also usual for PNDs to display map information on-screen during the navigation, such information regularly being updated on-screen so that the map information displayed is representative of the current location of the device, and thus of the user or user's vehicle if the device is being used for in-vehicle navigation.
An icon displayed on-screen typically denotes the current device location and is centred, with the map information of current and surrounding roads in the vicinity of the current device location and other map features also being displayed. Additionally, navigation information may be displayed, optionally in a status bar above, below or to one side of the displayed map information; examples of navigation information include a distance to the next deviation from the current road required to be taken by the user, the nature of that deviation possibly being represented by a further icon suggestive of the particular type of deviation, for example a left or right turn. The navigation function also determines the content, duration and timing of audible instructions by means of which the user can be guided along the route. As can be appreciated, a simple instruction such as "turn left in 100 m" requires significant processing and analysis. As previously mentioned, user interaction with the device may be by a touch screen, or additionally or alternately by steering column mounted remote control, by voice activation or by any other suitable method.
A further important function provided by the device is automatic route re-calculation in the event that: a user deviates from the previously calculated route during navigation (either by accident or intentionally); real-time traffic conditions dictate that an alternative route would be more expedient and the device is suitably enabled to recognize such conditions automatically, or if a user actively causes the device to perform route re-calculation for any reason.
Although the route calculation and navigation functions are fundamental to the overall utility of PNDs, it is possible to use the device purely for information display, or "free-driving", in which only map information relevant to the current device location is displayed, and in which no route has been calculated and no navigation is currently being performed by the device. Such a mode of operation is often applicable when the user already knows the route along which it is desired to travel and does not require navigation assistance.
Devices of the type described above provide a reliable means for enabling users to navigate from one position to another. Traditionally, PND devices are provided as dedicated units containing all of the above-mentioned functionality. It will be appreciated that dedicated PNDs providing all of the desired functionality can be relatively bulky and/or difficult to keep mounted to the vehicle in the desired position. This can present problems, for example, where it is desired to mount a PND to a scooter where there is little space for mounting the device.
Increasingly, the functionality may alternatively be provided by way of a software application running on a mobile device such as a user's smartphone or tablet. However, these devices may also be bulky and/or difficult to safely mount to the vehicle. Also, it will be appreciated that attempting to navigate whilst driving using a smartphone or tablet is potentially dangerous. There is therefore a desire for an improved navigation product.
Summary of the Invention
In accordance with a first aspect of the invention, there is provided a mobile device for use with a secondary display device with which the mobile device is arranged and adapted to communicate for navigating a user within an area within which the mobile device and secondary display device are travelling, the mobile device comprising:
a communication module for communicating with the secondary display device via a
communication channel; and
at least one processor arranged and configured to:
obtain positional data indicative of a current position of the mobile device and/or the secondary display device within the area;
generate image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and
cause the mobile device to transmit the image data via the communication channel for display by the secondary display device.
The embodiments of the invention provide a mobile device for use with a secondary display device for navigating a user (e.g. a driver of a vehicle) travelling within a particular area. The mobile device is typically a user's smartphone. Smartphones are relatively expensive and precious, and the user may therefore be unwilling to mount the smartphone directly to a vehicle, particularly to a vehicle such as a scooter, moped or motorbike where the mobile device would be exposed in use, due to the risk of the smartphone being damaged - e.g. by falling from the mount and/or being exposed to adverse weather conditions. It is also not safe for the user to attempt to hold the smartphone during a journey, as the user will need both hands to control the vehicle. Hence, the smartphone itself is not necessarily suitable for use as a navigation device, particularly for use with e.g. a scooter, moped or motorbike. In the embodiments, these problems are overcome by the mobile device being arranged and adapted to communicate with a secondary display device. The mobile device can then be used for the relatively computationally expensive steps of generating the image data, which is preferably real-time image data as discussed in more detail below, for use in navigating a user, with this data then being sent or streamed for display by the secondary display device. As only the secondary display device needs to be visible in use, the relatively expensive mobile device does not need to be kept in view and can be safely stored, thus reducing the potential for damage to the mobile device.
The mobile device thus generates real-time image data representative of the current position of the mobile device and/or the secondary display device for transmission via a communication channel to the secondary display device for display to the user. That is, the mobile device may generate a live video stream for display via the secondary display device. The video stream may be comprised of a series of frames or images of the image data.
It will be appreciated that the bulk of the processing (which will be associated with the image data generation) is therefore performed at the mobile device, such that the form and size of the secondary display device is relatively unconstrained. The secondary display device may thus be made relatively lightweight and inexpensive, at least relative to the mobile device. The secondary display device may also readily be made waterproof or scratch-resistant. A mobile device such as a smartphone can be readily adapted for use in providing navigation information, e.g. by downloading a suitable software application. Accordingly, the mobile device may contain a software application that when executed on the mobile device causes the processor to perform the techniques described herein. It will be appreciated that any suitable processor or processing circuitry may be incorporated into the mobile device. The mobile device and/or the at least one processor of the mobile device may also contain any suitable signal generation and/or processing device or circuitry for generating and/or receiving control signals. The mobile device may also include any suitable transmitter for transmitting the image data over the communication channel.
It will be appreciated that the generated image data may be processed, e.g. compressed, before it is transmitted or streamed via the communication channel. However, regardless of how the generated image data is subsequently processed, it will be appreciated that the image data may still be representative of the image. That is, the image associated with the image data can still be subsequently reconstructed, i.e. at the secondary display device. Hence, by the image data being "representative" of an image, what is meant is that the image, that is to be displayed by the secondary display device, can be reconstructed or generated at the secondary display device using the image data transmitted across the communication channel.
By "real-time" image data, it will be appreciated that the image may represent substantially the current position of the mobile device and/or the secondary display device, and that the image data is updated dynamically (e.g. continuously or at a relatively fast non-continuous rate) as the position of the mobile device and/or the secondary display device changes within the area, i.e. during the course of a single journey. As used herein, it will be understood that the current position is generally a current geographical position, i.e. within the area within which the devices are travelling. The image data may also change dynamically in response to an input received e.g. from the secondary display device via the communication channel. It will be appreciated that the real-time image data may be representative of an image to be displayed by the secondary display device. In use, the mobile device and the secondary display device are both intended to be carried by the user, in close proximity to each other, such that the current position of either the mobile device or the secondary display device is reflective of the position of the user. The positional data obtained by the mobile device may therefore be obtained e.g. from the mobile device and/or from the secondary display device. The mobile device and the secondary display device may each therefore have an antenna for use with a global navigation satellite system for obtaining such positional data.
It will be appreciated that "static" image data may also be generated at the mobile device for display by the secondary display device. It is contemplated that the secondary display device for use with the mobile device may not contain any image generation software, such that the mobile device completely controls the display of the secondary display device.
The user may be a driver of a vehicle wherein the mobile device and secondary display device are both carried within the vehicle. Particularly, the user may be a driver of a scooter or moped wherein the secondary display device is mounted or mountable to the scooter or moped, e.g. via the handlebars or mirror mount. The mobile device may then be stored in the user's pocket, bag, etc. The area may generally be represented by digital map data, with the digital map data containing a series of interconnected nodes and links. The nodes and links may e.g. represent roads and junctions along which the user can travel.
The at least one processor may be arranged and configured to update the image data and/or generate new image data in real time in response to a received input via the communication channel.
That is, the mobile device may be arranged to generate new image data in response to an input received from the secondary display device. As discussed above, in use, the mobile device may be intended to be safely stored, such that the user may only interact via the secondary display device. The received input may, for instance, comprise the positional data, or may comprise a user selection e.g. between display settings such as display type or zoom level. The received input may also, for instance, simply comprise a signal indicating that the status of the secondary display device has changed e.g. that the devices have become (dis)connected or that the secondary display device is turned off, or that the battery level of the secondary display device has changed.
The at least one processor may be arranged and configured to vary or set the frame rate at which the generated image data is transmitted via the communication channel for display by the secondary display.
That is, the frame rate of the image data transmitted for display may be varied dynamically and/or in real time. Generally, the frame rate may be varied according to one or more settings, e.g. between a number of discrete values. The frame rate may be varied automatically based on a change in a parameter associated with e.g. the positional data, the status of the mobile device and/or the status of the secondary display device. For instance, a first frame rate value may be used for some types of display (e.g. for a menu screen) and a second frame rate used during the navigation mode.
The frame rate may be varied or set based on one or more input(s) received via the communication channel.
That is, the frame rate may be varied or set based on an input from the secondary display device. For example, the user may select a new destination, or select a different display type or zoom level, and the mobile device may then automatically adjust the frame rate to a different value.
The frame rate may be varied or set based on the positional data, and/or based on a speed of travel of the mobile device within the area.
It will be appreciated that a speed of travel may be determined using the positional data over time. The frame rate may also therefore be varied or set based on a speed of travel of the secondary display device within the area, where the obtained positional data is indicative of the current position of the secondary display device.
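
A minimal sketch of such a frame-rate selection is given below, assuming two display modes and a speed threshold; the mode names, the threshold and the frame-rate values are illustrative assumptions only (the 5 fps figure echoes the indicative rate mentioned in the detailed description).

```python
# Illustrative frame-rate selection based on display mode and current speed
# (all specific values are assumptions for the example).

from typing import Optional

MENU_FPS = 1            # a static menu screen needs few updates
GUIDANCE_FPS = 5        # indicative rate for the navigation display
GUIDANCE_SLOW_FPS = 2   # assumed lower rate when barely moving


def select_frame_rate(mode: str, speed_m_s: Optional[float]) -> int:
    """Pick a discrete frame rate from the current display mode and speed of travel."""
    if mode == "menu":
        return MENU_FPS
    if speed_m_s is not None and speed_m_s < 2.0:   # assumed low-speed threshold
        return GUIDANCE_SLOW_FPS
    return GUIDANCE_FPS


# Example: switching from the menu to guidance at roughly 50 km/h.
assert select_frame_rate("menu", None) == 1
assert select_frame_rate("guidance", 14.0) == 5
```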
The at least one processor may be arranged and configured to calculate a route for use in navigating the user within the area.
That is, the mobile device may be arranged and configured to perform route planning, e.g. of the type discussed in the background section above. The mobile device may thus calculate a route for navigating the user from a departure node to a destination node, and the generated image data transmitted for display on the secondary display device may display this route and/or instructions or directions associated with this route to the user for navigation purposes.
The at least one processor may be arranged and configured to compress the generated image data prior to the image data being transmitted via the communication channel, optionally by an intraframe compression technique.
The mobile device may be configured to transmit the image data via the communication channel in serial. Optionally the mobile device may be configured to transmit image data associated with a single image for display by the secondary display device as a series of parts.
The size of the parts may be chosen to provide a desired responsiveness of the system. For instance, the size of the parts may be chosen such that upon receiving a new input via the communication channel (i.e. from the secondary display device), a new image is generated and/or the image data is updated within a desired time period. For instance, this time period may be 0.5 s, such that new image data is generated in response to any given input in 0.5 s or less.
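
Purely as an illustration of this trade-off, the arithmetic below uses the indicative Bluetooth throughput quoted later in the detailed description (roughly 100 Kb/s); the part sizes are arbitrary example values. Smaller parts mean an in-flight image can be abandoned sooner when a new input arrives.

```python
# Illustrative arithmetic: the worst-case latency before an in-flight image can be
# abandoned is roughly the time needed to finish sending the current part.

CHANNEL_RATE_KB_S = 100   # indicative Bluetooth throughput (see detailed description)

for part_size_kb in (50, 10, 2):   # arbitrary example part sizes
    abort_latency_s = part_size_kb / CHANNEL_RATE_KB_S
    print(f"{part_size_kb} Kb parts -> worst-case abort latency ~{abort_latency_s:.2f} s")
```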
The at least one processor may be arranged and configured to: monitor the communication channel for a signal indicating that the secondary display device is turned on and connected to the mobile device via the communication channel; and upon receipt of the signal, start generating image data.
That is, the mobile device may be arranged to automatically start generating image data in response to a signal sent by the secondary display device upon turning the secondary display device on and/or connecting the secondary display device to the mobile device. Where the mobile device contains a first software application for generating the image data e.g. using the techniques substantially as described above, the mobile device may contain a second software application for monitoring the communication channel. The second software application may require less power than the first software application so that the second software application may be left running with less drain on the battery of the mobile device.
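
The following sketch illustrates this division of labour between the two applications, using a simple in-process queue to stand in for the communication channel; the message name and the channel abstraction are assumptions made for the example.

```python
# Illustrative low-power monitoring application: it only blocks on the channel and
# hands over to the heavier image-generating application when a wake-up signal from
# the secondary display device is seen.

import queue

WAKE_UP = "WAKE_UP"   # assumed message identifier


def monitor_channel(channel: "queue.Queue[str]", start_image_generation) -> None:
    """Wait cheaply for a wake-up signal, then start generating image data."""
    while True:
        message = channel.get()            # blocking wait; little CPU or battery use
        if message == WAKE_UP:
            start_image_generation()       # e.g. render the start or menu screen
            return


# Example usage with a dummy channel standing in for the Bluetooth link.
if __name__ == "__main__":
    ch: "queue.Queue[str]" = queue.Queue()
    ch.put(WAKE_UP)                        # as if the display device had just been turned on
    monitor_channel(ch, lambda: print("image generation started"))
```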
The image data generated upon receipt of the signal may e.g. be an image associated with a start or menu screen. User input may then be required to generate the image data associated with the current position of the mobile device and/or secondary display device.
It has been recognised that this feature may be advantageous in its own right. Accordingly, from other aspects, there is provided a method of generating data for display upon start-up of a secondary display device comprising: providing a mobile device; providing a secondary display device, wherein the secondary display device is arranged and configured to communicate with the mobile device via a communication channel; monitoring the communication channel using the mobile device for a signal from the secondary display device indicating that the secondary display device is turned on and connected to the mobile device via the communication channel; upon detection of the signal, using the mobile device to generate image data associated with an image to be displayed by the secondary display device;
transmitting the image data from the mobile device to the secondary display device; and displaying the image at the secondary display device. The mobile device and/or the secondary display device may be a mobile device and/or secondary display device substantially as described herein in relation to any of the other aspects or embodiments herein.
According to the invention described herein, the at least one processor may be arranged and configured to communicate with an external server to obtain map data reflective of the area. The processor may communicate with the external server using an internet connection of the mobile device, e.g. a mobile telecommunications connection such as 3G/4G. The external server may also provide general system or software updates for the mobile device and/or for transmission by the mobile device via the communication channel to the secondary display device.
From another aspect of the invention, there is provided a secondary display device for use with a mobile device with which the secondary display device is arranged and adapted to communicate in use for navigating a user within an area within which the mobile device and secondary display device are travelling, the secondary display device comprising:
a display;
a communication module for controlling communication with the mobile device via a communication channel; and
at least one processor,
wherein the at least one processor is arranged and configured to:
receive via the communication channel, optionally from the mobile device, image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and
display the image to the user via the display;
wherein the secondary display device further comprises a user interface for receiving a user input,
the secondary display device being arranged and configured, based upon the user input, to transmit a signal across the communication channel for controlling an operation of the mobile device.
The user input at the secondary display device thus generates a signal that is sent to the mobile device and causes the mobile device to generate new and/or updated image data for display at the secondary display device. The signal generated based on the user input therefore contains information relating to the user input. This information may then be interpreted by the mobile device in order for the mobile device to generate appropriate image data responsive to the signal.
The secondary display device does not necessarily need to generate the image data for display, and so can be made relatively lightweight and inexpensive. The secondary display device may be sized and shaped to facilitate mounting to a vehicle such as a scooter or moped. For instance, the secondary display device may be mountable to the handlebars or mirror stalk of a scooter or moped.
It will be appreciated that any suitable processor or processing circuitry may be incorporated into the secondary display device. The secondary display device and/or the at least one processor of the secondary display device may also contain any suitable signal generation and/or processing device or circuitry for generating and/or receiving control signals. The at least one processor of the secondary display device may be arranged and configured to cause the secondary display device, or a signal generating module of the secondary display device, to transmit the signal.
The secondary display device may further comprise an antenna for use with a global navigation satellite system for obtaining positional data representative of a current position of the secondary display device within the area. The at least one processor may be arranged and configured to transmit the positional data via the communication channel for use by the mobile device. The secondary display device may further comprise a battery. The at least one processor may be arranged and configured to check the charge level of the battery, and transmit information indicative of the current level of the battery via the communication channel for inclusion in the image data.
The information indicative of the current level of the battery may be stored in a memory of the secondary display device such that it is immediately available for transmission to the mobile device in order for the mobile device to include the battery level information in the image data, i.e. or to generate image data representing the battery level information. The charge level may be checked continually whilst the secondary display device is turned on. When the secondary display device is off, or in a standby mode, the secondary display device may be arranged and configured to periodically wake up to check the charge level of the battery.
The user interface may comprise a touchscreen.
The communication module(s) may comprise a wireless communication module, and the communication channel may comprise a wireless communication channel. Optionally, the wireless communication is Bluetooth communication.
From a further aspect, there is provided a system comprising a mobile device and a secondary display device for navigating a user travelling within an area within which the mobile device and secondary display device are travelling;
the mobile device comprising a communication module and at least one processor; and the secondary display device comprising a communication module for communicating with the communication module of the mobile device via a communication channel, at least one processor and a display;
wherein the at least one processor of the mobile device is arranged and configured to:
obtain positional data indicative of a current position of the mobile device and/or the secondary display device within the area;
generate real-time image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and
transmit the real-time image data via the communication channel to the secondary display device for display by the secondary display device;
wherein the secondary display device is arranged and configured to:
receive the transmitted image data via the communication channel; and display the image to the user via the display.
The secondary display device may further comprise a user interface, such as a touchscreen, for receiving a user input. The processor of the secondary display device may be arranged and configured upon the user input to transmit a signal across the communication channel to the mobile device; and the processor of the mobile device may be arranged and configured to generate new and/or updated image data based upon the signal and to transmit the new or updated image data to the secondary display device for display.
The mobile device may comprise a mobile device substantially as described in accordance with any of the aspects or embodiments herein. The secondary display device may comprise a secondary display device substantially as described in accordance with any of the aspects or embodiments herein.
In accordance with another aspect of the invention, there is provided a method for providing navigation information to a user, comprising:
obtaining positional data indicative of a current position of a mobile device and/or a secondary display device within the area;
generating, using at least one processor of the mobile device, image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and
transmitting the generated image data from the mobile device to the secondary display device via a communication channel for display by the secondary display device.
In accordance with another aspect of the invention, there is provided a method for providing navigation information to a user, comprising:
receiving at a secondary display device, via a communication channel from a mobile device, image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and
displaying the image to the user via a display of the secondary display device,
the method further comprising:
transmitting a signal to the mobile device, via the communication channel from the secondary display device, for controlling an operation of the mobile device based on a received user input on a user interface of the secondary display device.
The present invention extends to a system for carrying out a method in accordance with any of the aspects or embodiments of the invention herein described.
Thus, in accordance with another aspect of the invention, there is provided a system for providing navigation information to a user, comprising:
means for obtaining positional data indicative of a current position of a mobile device and/or a secondary display device within the area;
means for generating, using at least one processor of the mobile device, image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and
means for transmitting the generated image data from the mobile device to the secondary display device via a communication channel for display by the secondary display device.
In accordance with another aspect of the invention, there is provided a system for providing navigation information to a user, comprising:
means for receiving at a secondary display device, via a communication channel from a mobile device, image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and
means for displaying the image to the user via a display of the secondary display device, the system further comprising:
means for transmitting a signal to the mobile device, via the communication channel from the secondary display device, for controlling an operation of the mobile device based on a received user input on a user interface of the secondary display device.
As will be appreciated by those skilled in the art, this further aspect of the present invention can and preferably does include any one or more or all of the preferred and optional features of the invention described herein in respect of any of the other aspects of the invention, as appropriate. If not explicitly stated, the system of the present invention herein may comprise means for carrying out any step described in relation to the method of the invention in any of its aspects or embodiments, and vice versa.
The present invention is a computer implemented invention, and any of the steps described in relation to any of the aspects or embodiments of the invention may be carried out under the control of a set of one or more processors. The means for carrying out any of the steps described in relation to the system may be a set of one or more processors.
The method of the present invention may be implemented in the context of a navigation operation. Thus, the method may be carried out by a set of one or more processors of a device or system having navigation functionality.
It will be appreciated that the methods in accordance with the present invention may be implemented at least partially using software. It will thus be seen that, when viewed from further aspects and in further embodiments, the present invention extends to a computer program product comprising computer readable instructions adapted to carry out any or all of the method described herein when executed on suitable data processing means. The invention also extends to a computer software carrier comprising such software. Such a software carrier could be a physical (or non-transitory) storage medium or could be a signal such as an electronic signal over wires, an optical signal or a radio signal such as to a satellite or the like. Accordingly, in accordance with another aspect of the invention, there is provided a computer program product, e.g. computer software, comprising instructions which, when executed by one or more processors of a system, cause the system to perform the method of any of the aspects and embodiments discussed above. The computer program product can be stored on a non- transitory computer readable medium.
The present invention in accordance with any of its further aspects or embodiments may include any of the features described in reference to other aspects or embodiments of the invention to the extent it is not mutually inconsistent therewith.
Advantages of further embodiments are set out hereafter, and further details and features of each of these further embodiments are defined in the accompanying dependent claims and elsewhere in the following detailed description.
Brief Description of the Figures
Various embodiments will now be described, by way of example only, and with reference to the accompanying drawings in which:
Figure 1A illustrates a navigation system according to an embodiment of the invention comprising a mobile device and a secondary display device and Figure 1B shows the secondary display device mounted in use;
Figure 2 illustrates the navigation system of Figure 1A in more detail;
Figure 3A illustrates the internal architecture of the mobile device and Figure 3B illustrates the main communication pathways between the mobile device and the secondary display device;
Figure 4 shows the stacked architecture for the communication pathways between the mobile device and the secondary display device;
Figure 5 shows an example of a display provided by the secondary display device for use in navigating a user; and
Figure 6 shows an example of a menu provided on the secondary display device.
Detailed Description of the Figures
Embodiments of the present invention will now be described with particular reference to a mobile device, such as a smartphone. It should be remembered, however, that the teachings of the present invention are not limited to this context but are instead universally applicable to any type of processing device that is configured to execute navigation software so as to provide route planning and navigation functionality. It follows therefore that in the context of the present application, the mobile device is intended to include (without limitation) any type of portable route planning and navigation device or indeed any type of computing resource (such as a portable personal computer (PC), mobile telephone or portable digital assistant (PDA)) executing route planning and navigation software.
Figure 1A illustrates a navigation system according to an embodiment of the invention comprising a mobile device 1 in the form of a smartphone and a secondary display device 2 with which the mobile device 1 can communicate e.g. via a Bluetooth protocol. The mobile device 1 is running a software application that generates the images for display by the secondary display device 2. It will be appreciated that the type of mobile device 1 is not particularly limited, and the software application may e.g. run either through Apple iOS or an Android operating system. Figure 1B shows the secondary display device 2 in use mounted to the mirror stalk of a scooter. It will be appreciated that both the mobile device 1 and secondary display device 2 are intended to travel together, in close association with each other, such that they remain in range throughout the journey.
For instance, in the illustrated embodiment the mobile device 1 is the user's smartphone running an appropriate software application for use in controlling the secondary display device 2 which may be mounted to the user's vehicle. The mobile device 1 will typically therefore be relatively expensive and precious to the user; accordingly, in the embodiment, the mobile device 1 may be safely stored in use, e.g. in a user's pocket or bag (or any other suitable location), to avoid risking damage to the mobile device 1, so long as it is within communication range of the secondary display device 2. In the embodiments described below the communication between the mobile device 1 and the secondary display device 2 is via Bluetooth communication. Naturally, it is contemplated that other suitable wireless communication protocols may also be used, or less preferably even a wired communication between the devices 1, 2.
The secondary display device 2 may essentially be a slave device operating under the control of the mobile device 1, and may therefore be provided with relatively limited hardware and software resources. Hence, the secondary display device 2 can be made relatively lightweight and inexpensive, since the bulk of the processing is performed at the mobile device 1. The secondary display device 2 may be mounted to the vehicle in any suitably visible location, e.g. on the handlebars or mirror stalk of a scooter. The secondary display device 2 would typically be releasably mounted to the vehicle such that an unskilled user can readily attach/detach the device from the vehicle. The secondary display device 2 may thus be provided with any suitable mounting system that allows it to be readily mounted to a vehicle, e.g. a scooter. However, it is also contemplated that the secondary display device 2 could be more permanently mounted. In embodiments, the secondary display device 2 is intended for mounting to a scooter, and hence is exposed in use. The secondary display device 2 may therefore generally be provided with a scratch-resistant or waterproof casing.
The mobile device 1 receives as input or generates positional data reflective of the current position of the user (e.g. based on live GPS positional data) and optionally map data reflecting the current traffic conditions in the map region within which the user is travelling. From this data, the mobile device 1 generates image data associated with an image that is intended for display to the user via the secondary display device 2. Particularly, during a navigation mode of operation, the mobile device 1 generates image data representing the current position of the user within a map. It will be appreciated that the current position of the user may in principle be determined using the position either of the mobile device 1 or the secondary display device 2, as both devices are intended to be carried by the user/mounted on the user's vehicle in use. This image data is updated in real-time so that the current position of the user as the devices 1, 2 move around the map region may be continually displayed to the user, i.e. in the form of a video display. In order for this information to be relayed to the user, the image data is compressed at the mobile device 1 into a format suitable for transmitting or streaming across the Bluetooth communication channel to the secondary display device 2. The stream of image data is then transmitted to the secondary display device 2 in its compressed format wherein the secondary display device 2, upon receipt of the image data, proceeds to decode or decompress the received image data in order to (re)construct the image intended for display via the secondary display device 2, and this image is then displayed to the user. For example, where the image represents a current position of the user, this image is displayed to the user for use in navigation purposes.
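
The following sketch illustrates this flow for a single frame. zlib is used only to keep the example self-contained and stands in for whatever image compression is actually employed, and the rendering step is reduced to a placeholder string; none of the names here are taken from the disclosure.

```python
# Illustrative end-to-end flow for one frame: render on the mobile device, compress,
# stream over the channel, then decompress and display on the secondary display device.

import zlib
from typing import List


def render_frame(position) -> bytes:
    """Stand-in renderer; in reality this is the navigation application's map image."""
    return f"map image centred on {position}".encode()


def mobile_device_send(position, channel: List[bytes]) -> None:
    """Render the current-position image and push its compressed form onto the channel."""
    channel.append(zlib.compress(render_frame(position)))


def secondary_display_receive(channel: List[bytes]) -> str:
    """Decompress the received image data to reconstruct the image for display."""
    return zlib.decompress(channel.pop(0)).decode()


if __name__ == "__main__":
    channel: List[bytes] = []                  # stand-in for the Bluetooth channel
    mobile_device_send((52.37, 4.89), channel)
    print(secondary_display_receive(channel))  # "displayed" on the secondary device
```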
As the mobile device 1 is intended to be safely stored during use (i.e. during a journey), the secondary display device 2 may include a user interface for providing control functionality, e.g. for changing the display format, switching between a number of preset destinations, and/or switching between different levels of zoom. This may be provided via touch control wherein a user taps or swipes the display of the secondary display device 2 in order to change the display. The input provided by the user at the secondary display device 2 then generates a signal that is sent to the mobile device 1 to cause the mobile device 1 to update the image data based on the user input, e.g. to change the display format, to switch between destinations, and/or to switch between different levels of zoom. Because the secondary display device 2 is intended to be a relatively simple device, the user interface may also be relatively simple, and only be associated with relatively basic control functionality, e.g. of the types discussed above. More complicated user inputs such as initial route planning control or set up of the application will typically be performed in advance via the mobile device 1. User input may also be provided in use via a headset in communication with the mobile device 1, where one is provided.
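
A sketch of how such basic touch input might be translated into control signals is given below; the gesture and action names are assumptions for the example, and the actual protocol between the devices is not specified here.

```python
# Illustrative mapping from simple touch gestures on the secondary display device to
# control signals that cause the mobile device to generate new or updated image data.

from dataclasses import dataclass


@dataclass
class UiEvent:
    action: str   # e.g. "cycle_zoom", "toggle_overview", "open_destinations"


GESTURE_ACTIONS = {
    "tap_map": "cycle_zoom",           # step through the discrete zoom levels
    "swipe_up": "toggle_overview",     # switch between overview and guidance modes
    "swipe_down": "open_destinations", # open the pre-determined destinations menu
}


def on_touch(gesture: str, send_to_mobile) -> None:
    """Interpret a gesture locally and forward the resulting event over the channel."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is not None:
        send_to_mobile(UiEvent(action))


# Example: a tap on the map asks the mobile device to render the next zoom level.
on_touch("tap_map", lambda event: print("send over channel:", event))
```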
Figure 2 illustrates the internal architecture of the components of the system shown in Figure 1A in more detail. The system generally comprises a mobile device 1 running a software application 100 within an operating environment 10 and communicating via an interface 12 with a secondary display device 2. The mobile device 1 is also arranged to communicate with an external server 3 that provides back-end services to the system, e.g. through an internet connection. For example, the external server 3 may provide the map data and live data reflecting the current traffic conditions in the map region to the software application 100 for use in generating the navigation information to be displayed to the user, as is known for PNDs generally. The external server 3 may also provide software and/or firmware updates for the software application 100 running on the mobile device 1 and the secondary display device 2.
Typically, there is no direct communication between the secondary display device 2 and the external server 3, and the secondary display device 2 typically only has relatively limited processing power and memory resource, such that any updates for, or information to be displayed on, the secondary display device 2 are provided by the controlling mobile device 1.
Typically, the mobile device 1 will include a GPS antenna 11 and a 3G/4G/Wi-Fi communication module 13 through which communication with the external server 3 is effected, as is conventional for modern smartphones. The mobile device 1 may also contain a headset 14 for receiving voice instructions from a user. Again, this functionality is well known for modern smartphones.
In the illustrated embodiment, the secondary display device 2 includes a microcontroller unit 201 , a display screen 202, e.g. in the form of an octagonal LCD module having a touchscreen functionality, a Bluetooth connection interface 122, a Global Navigation Satellite System (GNSS) receiver 203, a battery 204 and a port 205 for inserting a charger. The port may e.g. comprise a USB or micro USB port. The secondary display device 2 may also have a power button for turning the device on, or for suspending or rebooting the operation of the device.
Figure 3A is another representation of the architecture inside the secondary display device 2. The microcontroller or MCU 201 communicates with, to receive inputs from and/or control the operation of, the display unit 202 and its associated touchscreen 2021, and the GNSS receiver 203 and its associated antenna 2031. The microcontroller 201 also communicates with the battery 204 and the charging port 205, e.g. to receive information regarding the current charge level or charge status of the battery 204. The secondary display device 2 also includes a power button 22 for switching the secondary display device 2 ON/OFF or into a 'suspended' mode of operation.
Figure 3B shows the main communication paths between the secondary display device 2 and the mobile device 1. Figure 4 shows the stacked architecture for the communication between the mobile device and the secondary display device. In the embodiment, there are four main control and data paths in the communication channel 12 between the mobile device 1 and the secondary display device 2. These paths are all provided along the same communication channel 12, and hence all pass via the Bluetooth module 122 of the secondary display device 2. As shown in Figure 3B, the paths all pass through the microcontroller 201 of the secondary display device 2, as this component generally controls the operation of the secondary display device 2 (in response to the received inputs from the mobile device 1). Naturally, this need not be the case, and any other suitable device architecture could be used for the secondary display device 2.
The first path 301 is used by the user interface frame buffer to transmit the image data generated by the navigation application 100 to be rendered for display by the secondary display device 2 at a set (variable) frame rate, e.g. as explained further below. The first path 301 therefore passes from the mobile device 1 to the display unit 202 of the secondary display device 2. A second path 302 is provided between the touchscreen 2021 of the secondary display device 2 and the mobile device 1 for transmitting user interface events, where user input provided to the secondary display device 2, via the touchscreen 2021, is sent back to the mobile device 1, and used to update or generate new image data for display by the display unit 202. Hence, the first 301 and second 302 communication paths are used to transmit or stream the image data for display, and for the user-controlled adjustment of the image data that is displayed.
The third path 303 is a bi-directional communication path extending between the GNSS receiver 203 of the secondary display device 2 and the mobile device 1, and is provided for transmitting GNSS data (e.g. from the GNSS receiver 203 to the mobile device 1) and for control of the GNSS receiver 203 by the mobile device 1. In the embodiments, the initial GNSS data may be provided by the mobile device 1, e.g. during the route planning and/or before the secondary display device 2 is connected. However, during navigation, the GNSS data may be provided to the mobile device 1 from the secondary display device 2. It will be appreciated that the application 100 running on the mobile device 1 is intended to be a companion application only, with the mobile device 1 e.g. located in a user's pocket. Using the GNSS data from the GNSS receiver of the secondary display device 2, which is e.g. mounted to the handlebar or mirror of the scooter, may therefore provide more accurate information, especially as a better (i.e. larger), dedicated antenna can be provided within the secondary display device 2 than the general purpose antenna that may be provided as part of the mobile device 1, i.e. a conventional smartphone. The software application 100 running on the mobile device 1 may therefore be designed to disregard any GNSS data from the antenna of the mobile device 1 in favour of that provided by the secondary display device 2, where both are available. The GNSS data from the antenna of the mobile device 1 may also be used as a fallback when needed, e.g. where no GNSS data is available via the secondary display device 2.
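
A minimal sketch of this preference, with the phone's own fix retained only as a fallback, might look as follows; the tuple representation of a fix is an assumption made for the example.

```python
# Illustrative selection of the position source: prefer the GNSS fix from the
# secondary display device's dedicated antenna, fall back to the phone's own fix.

from typing import Optional, Tuple

Fix = Tuple[float, float]   # (latitude, longitude)


def choose_position(display_fix: Optional[Fix], phone_fix: Optional[Fix]) -> Optional[Fix]:
    """Use the mounted device's fix when available; otherwise fall back to the phone."""
    return display_fix if display_fix is not None else phone_fix


# Example: with both fixes available, the secondary display device's fix is used.
print(choose_position((52.37, 4.89), (52.36, 4.88)))   # -> (52.37, 4.89)
```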
The fourth path 304 is a bi-directional communication path between the microcontroller unit 201 of the secondary display device 2 and the mobile device 1, and may be used for general system control and information, e.g. for providing information indicative of the device credentials, the battery charge level of the device, the charging status, dock status, etc. This information may be provided from the secondary display device 2 to the mobile device 1, for the mobile device 1 to then render this information into the image data for subsequent display by the secondary display device 2. The fourth path may also be used e.g. for display brightness control of the secondary display device 2, which, again, is rendered in the image data by the mobile device 1. The secondary display device 2 firmware may also be updated using information sent along the fourth path from the mobile device 1, e.g. retrieved from the external server 3.
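
To make the multiplexing of these four logical paths over the single Bluetooth channel concrete, the sketch below tags each payload with a path identifier before transmission; the one-byte tag and the reuse of the figure reference numerals are assumptions for the example, not a defined wire format.

```python
# Illustrative multiplexing of the four logical paths over one physical channel.

from enum import Enum


class Path(Enum):
    FRAME_BUFFER = 301   # image data from the mobile device to the display unit
    UI_EVENTS = 302      # touch events from the display back to the mobile device
    GNSS = 303           # bi-directional GNSS data and receiver control
    SYSTEM = 304         # battery level, charging/dock status, brightness, firmware


def frame(path: Path, payload: bytes) -> bytes:
    """Prefix each payload with a one-byte tag so the receiver can demultiplex it."""
    return bytes([path.value - 300]) + payload


# Example: a touch event and part of an image share the same physical channel.
print(frame(Path.UI_EVENTS, b"tap"))
print(frame(Path.FRAME_BUFFER, b"...image part..."))
```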
It will be appreciated that the image rendering, compression and streaming at the mobile device 1 and the subsequent decompressing or decoding of the received image data at the secondary display device 2 may be performed using any suitable techniques. It will also be appreciated that the image coding/decoding activity is quite CPU/RAM intensive, and that in order for the computational load at the secondary display device 2 to be reduced as far as possible, accurate scheduling of activities may be required to avoid an adverse user experience (e.g. significant lag). In embodiments, the rendered frames from the software application 100 are encoded using an intraframe compression technique, i.e. in which each frame of the image/stream is compressed individually. This may help reduce the amount of processing involved at the secondary display device 2.
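As a minimal sketch of the intraframe approach, each frame is encoded with no reference to any other frame, so the secondary display device 2 can decode each received frame independently. zlib is used here only as a stand-in for whichever still-image codec (e.g. JPEG) an actual implementation might choose.

```python
import zlib

def encode_frame(raw_frame: bytes) -> bytes:
    # Each call is self-contained: no state is carried between frames.
    return zlib.compress(raw_frame, 6)

def decode_frame(encoded_frame: bytes) -> bytes:
    # The decoder likewise needs only the single received frame.
    return zlib.decompress(encoded_frame)
```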
In practice, the effective speed of a Bluetooth channel is roughly 100 Kb/s. For a stream of images of roughly 20-50 Kb each, displayed at a frame rate of around 5 fps (frames per second), as may typically be accommodated on a smartphone, just transferring each image over a Bluetooth connection would take around 200-500 ms. For a good user experience, it is generally desirable for a switch press or other user input at the secondary display device 2 to result in a new screen update on the secondary display device 2 (that is, for a new or updated image to be generated at the mobile device 1 and then transmitted to the secondary display device 2) within a fast, and consistent, response time, e.g. less than about 0.5 s. It will be appreciated that there are potentially relatively large differences in latency associated with the four data paths in the communication channel described above, and so the communication protocol may be designed so that the overall system performance is not negatively impacted by this, in order to provide a good user experience. In particular, the user interface path used to render the images (the first path or rendering channel) and the user interface event channel (second path) may inherently have a relatively high latency, due to the relatively intensive requirements placed on the processor (CPU/RAM) of the mobile device 1 in rendering the images. In the embodiment, in order to help reduce the latency of these channels, and optionally to provide the potential to abort/change an image being sent in midstream, the high latency streamed image data may be sent along a single serial channel, and the image data associated with a single image may be broken up into a series of 'parts'. That is, the protocol may be adapted to stream the image data as a sequence of discrete parts that are then (re-)assembled and decompressed at the secondary display device 2. This may make aborting and sending over a new image easier than would be the case by, e.g., sending the image as one data part, or across parallel data channels. The size of the parts may be chosen appropriately to provide the required responsiveness. For instance, the smaller the size of the parts, the more responsive the system may be - that is, the lower the latency associated with aborting/changing an image being sent in midstream. However, the smaller the size of the parts, the more processing is required at the secondary display device 2 to re-assemble the parts into the image data for generating the image for display.
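The part-wise streaming described above might look roughly as follows. The part size, the send_part() transport callback and the newer_frame_available() check are hypothetical names introduced here purely for illustration.

```python
from typing import Iterator

PART_SIZE = 4096  # bytes per part -- an assumed value, tuned for responsiveness

def split_into_parts(encoded_frame: bytes, part_size: int = PART_SIZE) -> Iterator[bytes]:
    """Yield the encoded frame as a sequence of discrete parts."""
    for offset in range(0, len(encoded_frame), part_size):
        yield encoded_frame[offset:offset + part_size]

def stream_frame(encoded_frame: bytes, send_part, newer_frame_available) -> bool:
    """Send a frame part by part over the single serial channel. Between parts,
    check whether a newer frame has been rendered; if so, abort so the fresher
    image can be sent instead. Returns True if the whole frame was delivered."""
    for part in split_into_parts(encoded_frame):
        if newer_frame_available():
            return False          # abort midstream; the caller streams the new frame
        send_part(part)
    return True
```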
Although in the embodiment described above the frames are compressed and transmitted using intraframe compression techniques, it will be appreciated that interframe compression techniques could also be used, i.e. in which multiple neighbouring frames are compressed together based on the recognition that a frame can be expressed in terms of one or more preceding and/or succeeding frames.
The frame rate at which the image data is generated and transmitted at the mobile device 1, and hence displayed at the secondary display device 2, may be variable, e.g. between a number of different settings. For instance, different frame rates may be used depending on the current display mode: a first frame rate may be used when displaying the menu or destination screen on the secondary display device 2 and a second frame rate used when displaying the map for guidance in the navigation mode. By way of example, a first frame rate of around 10 fps may be used in the menu mode and a second frame rate of around 5 fps used in the navigation mode. A higher frame rate may be needed when navigating the menu to ensure that the device acts responsively. However, a lower frame rate can be used during navigation, where responsiveness may be less of a concern (as there will generally be less user interaction), so as to reduce the processing strain at the secondary display device 2 and hence preserve the battery life.
The frame rate may also be varied between different settings based on various other considerations. For instance, the frame rate may be varied based on the speed at which the user is travelling through the map region - e.g. a frame rate of 1 fps when travelling at low speeds, such as less than about 30 km/h, and 5 fps when travelling at higher speeds, such as 100 km/h or higher. As another example, the frame rate may be varied based on the zoom level selected by the user via the secondary display device 2 - e.g. a frame rate of 1 fps for relatively low, zoomed-out, zoom levels, and 5 fps for higher, zoomed in, zoom levels. Also, a higher frame rate, such as 7 fps, may be used when changing heading or recalculating a route. Generally, in the case of multiple frame rate considerations, e.g. when travelling at a low speed but with a high zoom level, the highest frame rate should be used.
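A sketch of this frame-rate selection, using the example rates quoted above (10 fps in the menu mode, 1-5 fps depending on speed and zoom during navigation, 7 fps while recalculating), is given below; the exact thresholds and the function signature are illustrative assumptions.

```python
def select_frame_rate(mode: str, speed_kmh: float, zoomed_in: bool,
                      recalculating: bool) -> int:
    """Return the frame rate (fps) to use for generating and streaming images."""
    candidates = []
    if mode == "menu":
        candidates.append(10)                           # responsive menu navigation
    else:  # navigation mode
        candidates.append(5 if speed_kmh >= 30 else 1)  # speed-based rate (assumed threshold)
        candidates.append(5 if zoomed_in else 1)        # zoom-level-based rate
        if recalculating:
            candidates.append(7)                        # heading change / route recalculation
    # With multiple applicable considerations, the highest frame rate wins.
    return max(candidates)
```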
By way of example of various optional functionalities of the navigation system, Figure 5 illustrates a display of the touchscreen display of the secondary display device 2 that may be provided to the user during a navigation mode. As well as providing an image of the current position 51 of the secondary display device 2, and hence the user, within the map region, the display may contain various other icons representing e.g. current traffic conditions 53, expected delays 54, and the location of speed cameras 55. The display also contains an indication 52 of the estimated time of arrival. Tapping this icon, or e.g. swiping down from the top of the display, may open up a menu display for choosing a new destination from a predetermined list of previous or favourite destinations (as shown for example by icon 60 in Figure 6). The display also shows the next instruction 56 associated with the navigation route, e.g. "Turn right in 50m". Tapping the instruction, or e.g. swiping up from the bottom of the display, may allow the user to switch between a route overview mode in which the current position and the destination are both visible and a guidance mode which is relatively zoomed in on the current position. Tapping the display at any other position may be used to select between different zoom levels. Typically, the display may be set between a number of discrete zoom levels, for instance: a first zoom level that shows the remaining route, keeping the current position and the destination in view; a second zoom level at a fixed height, just keeping the current position in view; and a third zoom level at a lower fixed height, zoomed closer in to the current position. The display may automatically change between zoom levels in response to changes in the current position. For example, when the destination is reached, the route may be cleared and the current position may move to the centre of the screen, with the display at the maximum zoomed-out zoom level (i.e. the first zoom level).
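One way the discrete zoom levels and the automatic change described above could be modelled is sketched below; the level names and the tap-to-cycle behaviour are assumptions made for illustration only.

```python
# Discrete zoom levels corresponding to the three levels described above
ZOOM_ROUTE_OVERVIEW = 0   # first level: whole remaining route in view
ZOOM_FOLLOW_HIGH    = 1   # second level: fixed height, current position in view
ZOOM_FOLLOW_CLOSE   = 2   # third level: lower fixed height, close to the position

class ZoomController:
    def __init__(self):
        self.level = ZOOM_FOLLOW_HIGH

    def on_tap(self):
        # Tapping the map cycles between the discrete zoom levels.
        self.level = (self.level + 1) % 3

    def on_destination_reached(self):
        # Route cleared: fall back to the most zoomed-out level.
        self.level = ZOOM_ROUTE_OVERVIEW
```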
It will of course be appreciated that Figure 5 is merely one example of a display, and that the display and control functionality may be tailored in any appropriate manner depending on the desired content to be displayed to the user.
As another example of functionality that may be provided by the system, the secondary display device 2 may additionally display to the user the current battery level of the device. The current battery level is generally determined by the MCU of the secondary display device 2. However, given the lack of processing power and/or image rendering software in the secondary display device 2, in order for this information to be displayed to the user, it may be necessary for the secondary display device 2 to transmit this information to the mobile device 1 in order for the mobile device 1 to render a new image including this information. It may therefore not be necessary for the secondary display device 2 to store any image data representing the battery level, as the image rendering may be performed only at the mobile device 1. In other embodiments, it is also contemplated that the secondary display device 2 may store or even generate indicia representative of the different battery statuses itself. It may be useful for the secondary display device 2 to store some of these indicators in the case where the battery is low, so that this information can be displayed immediately to the user without having to connect to the mobile device 1.
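A sketch of this split between mobile-rendered battery information and locally stored low-battery indicia follows; all names, thresholds and the message format below are hypothetical.

```python
# Pre-stored low-battery indicators on the secondary display device (assumed names)
LOCAL_LOW_BATTERY_IMAGES = {10: "low_battery_10.bin", 5: "low_battery_5.bin"}

def report_battery(level_percent: int, connected: bool,
                   send_system_message, show_local_image):
    """Normally the battery level is sent over the fourth path so the mobile
    device renders it into the next image; with no connection and a low
    battery, fall back to a locally stored indicator."""
    if connected:
        send_system_message({"battery_percent": level_percent})
    elif level_percent <= 10:
        threshold = 5 if level_percent <= 5 else 10
        show_local_image(LOCAL_LOW_BATTERY_IMAGES[threshold])
```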
When the device is OFF or in a suspended mode, turning the device ON may prompt an initial step of checking the battery level of the secondary display device 2, and where the battery level is low, e.g. below a certain threshold, displaying this information to the user (prior to displaying any menu screen or navigation information). During use of the device, the display may be updated to reflect the current battery level of the secondary display device 2. This may be provided via an icon in the corner of the display, or, especially where the battery level drops to a critically low level, by displaying a warning to the user. The display may also reflect the current battery level of the mobile device 1, as well as the current GNSS and/or device connectivity status.
When not being used, the secondary display device 2 is generally kept in a suspended mode. In order to monitor the battery level, in embodiments, the secondary display device 2 is configured to periodically (e.g. about every 20 minutes) wake up from the suspended mode and check the current battery level. The detected battery level may then be stored in a battery-backed RAM portion of the secondary display device 2 such that the current (i.e. the most recently checked) value of the battery level is always available for immediate display. Naturally, in use, the battery level may be checked continuously.
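A minimal sketch of this periodic wake-up, assuming the roughly 20-minute interval mentioned above and hypothetical read_battery_percent/write_backed_ram/suspend_until hooks in place of the real RTC and memory interfaces:

```python
import time

WAKE_INTERVAL_S = 20 * 60   # wake roughly every 20 minutes while suspended (assumed)

def suspended_battery_monitor(read_battery_percent, write_backed_ram, suspend_until):
    """Periodically wake, sample the battery, and store the latest reading in
    battery-backed RAM so it is immediately available on the next wake-up."""
    while True:
        level = read_battery_percent()
        write_backed_ram("battery_percent", level)
        suspend_until(time.time() + WAKE_INTERVAL_S)
```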
In a similar fashion as described above in relation to the battery level, the secondary display device 2 may display to the user an indication of whether the secondary display device 2 is currently being charged, etc.
In embodiments, the secondary display device 2 may be configured for phone call handling. For instance, when the mobile device 1 receives an incoming phone call, a monitoring application within the mobile device 1, or a monitoring function of the navigation application 100, may detect this and signal the navigation application 100. The mobile device 1 then generates an image indicating the incoming phone call, e.g. displaying a photo of the contact or the number that is calling, for display by the secondary display device 2. The phone call may be dismissed by tapping the display, or by doing nothing for a predetermined wait period, to return to the navigation display. Alternatively, the phone call may be accepted or declined via a headset 14 of the mobile device 1 where one is provided.
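A rough sketch of this call-handling flow is given below; the callback names and the dismissal timeout are assumptions, and the call screen is simply streamed to the secondary display device 2 like any other rendered frame.

```python
DISMISS_TIMEOUT_S = 10.0  # assumed wait period before the call screen is dropped

def on_incoming_call(caller_id, render_call_screen, stream_image,
                     wait_for_tap, render_navigation_screen):
    """Render and stream a call screen, then return to the navigation display
    once the user taps the secondary display or the wait period expires."""
    stream_image(render_call_screen(caller_id))
    wait_for_tap(timeout=DISMISS_TIMEOUT_S)
    stream_image(render_navigation_screen())
```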
The communication interface 12 in the illustrated embodiment operates via Bluetooth communication between the Bluetooth connectivity portions 121, 122 of the mobile device 1 and the secondary display device 2. Bluetooth may be particularly appropriate given the relatively short distance between the mobile device 1 and the secondary display device 2 in use, as both devices are carried around with the user, as discussed above. When the secondary display device 2 is first set up and/or when the software application 100 is first downloaded, the mobile device 1 and secondary display device 2 may be paired using any suitable pairing protocol. Upon start-up of either device, it is then easy for the device to check whether or not it has been paired, and then whether or not the paired device is within range and connected. If the devices are determined to be connected, the software application can then start running on the mobile device 1.
In embodiments, the software application 100 on the mobile device 1 may be arranged to automatically start up when the secondary display device 2 is turned on and connected to the mobile device 1. In an exemplary embodiment, this method operates as follows.
First, the secondary display device 2 connects with the mobile device 1 over the Bluetooth connection. Since the pairing process has already been completed (as discussed above), this occurs automatically upon turning on the secondary display device 2 and bringing it into range of the mobile device 1, provided that the Bluetooth connection is turned on. A monitoring application on the mobile device 1 then listens to the Bluetooth channel. The monitoring application serves only to listen to the Bluetooth channel, and therefore requires relatively little processing power. The monitoring application can therefore be left running in the background without significantly draining the battery of the mobile device 1.
When the secondary display device 2 is turned on, the secondary display device 2 sends a wake-up signal over the Bluetooth channel to the mobile device 1 that is detected by the monitoring application. Upon detection of this wake-up signal, the monitoring application triggers start-up of the navigation application 100 of the mobile device 1. The navigation application 100 then starts to generate the image data for display on the secondary display device 2 (e.g. the Home or Menu screen) and sends the image data over the Bluetooth connection to the secondary display device 2.
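This auto-start behaviour might be sketched as follows, assuming a hypothetical wake-up message format and callbacks for receiving Bluetooth messages and launching the navigation application 100.

```python
WAKE_UP = b"\x01WAKE"   # assumed wake-up message sent by the secondary display device

def monitoring_loop(receive_message, launch_navigation_app):
    """Lightweight background listener on the mobile device: it only watches the
    Bluetooth channel and starts the navigation application 100 when the
    secondary display device announces that it has been turned on."""
    while True:
        message = receive_message()       # blocks until a Bluetooth message arrives
        if message == WAKE_UP:
            launch_navigation_app()       # the application then starts streaming images
            return
```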
Although the present invention has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes in form and detail may be made without departing from the scope of the invention as set forth in the accompanying claims. For example, the mobile device may be configured to perform any one or all of the functions described above in the Background section.

Claims

1. A mobile device for use with a secondary display device with which the mobile device is arranged and adapted to communicate for navigating a user within an area within which the mobile device and secondary display device are travelling, the mobile device comprising:
a communication module for communicating with the secondary display device via a communication channel; and
at least one processor arranged and configured to:
obtain positional data indicative of a current position of the mobile device and/or the secondary display device within the area;
generate real-time image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and cause the mobile device to transmit the real-time image data via the communication channel for display by the secondary display device.
2. A mobile device as claimed in claim 1, wherein the at least one processor is arranged and configured to update the image data and/or generate new image data in real time in response to a received input via the communication channel.
3. A mobile device as claimed in claim 1 or 2, wherein the at least one processor is arranged and configured to vary or set the frame rate at which the generated image data is transmitted via the communication channel for display by the secondary display device.
4. A mobile device as claimed in claim 3, wherein the frame rate is varied or set based on one or more input(s) received via the communication channel.
5. A mobile device as claimed in claim 3 or 4, wherein the frame rate is varied or set based on the positional data, and/or based on a speed of travel of the mobile device within the area.
6. A mobile device as claimed in any preceding claim, wherein the at least one processor is arranged and configured to calculate a route for use in said navigating the user within the area.
7. A mobile device as claimed in any preceding claim, wherein the at least one processor is arranged and configured to compress the generated image data prior to the image data being transmitted via the communication channel, optionally by an intraframe compression technique.
8. A mobile device as claimed in any preceding claim, wherein the mobile device is configured to transmit the image data via the communication channel in serial, optionally wherein the mobile device is configured to transmit image data associated with a single image for display by the secondary display device as a series of parts.
9. A mobile device as claimed in any preceding claim, wherein the at least one processor is arranged and configured to:
monitor the communication channel for a signal indicating that the secondary display device is turned on and connected to the mobile device via the communication channel; and upon receipt of said signal, start generating image data.
10. A mobile device as claimed in any preceding claim, wherein the at least one processor is arranged and configured to communicate with an external server to obtain map data reflective of the area.
11. A mobile device as claimed in any preceding claim, wherein the mobile device is a smartphone.
12. A secondary display device for use with a mobile device with which the secondary display device is arranged and adapted to communicate in use for navigating a user within an area within which the mobile device and secondary display device are travelling, the secondary display device comprising:
a display;
a communication module for controlling communication with the mobile device via a communication channel; and
at least one processor,
wherein the at least one processor is arranged and configured to:
receive via the communication channel, optionally from the mobile device, image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and
display the image to the user via the display;
wherein the secondary display device further comprises a user interface for receiving a user input,
the secondary display device being arranged and configured, based upon said user input, to transmit a signal across the communication channel for controlling an operation of the mobile device.
13. A secondary display device as claimed in claim 12, further comprising an antenna for use with a global navigation satellite system for obtaining positional data representative of a current position of the secondary display device within the area, wherein the at least one processor is arranged and configured to transmit the positional data via the communication channel for use by the mobile device.
14. A secondary display device as claimed in claim 12 or 13, further comprising a battery, wherein the at least one processor is arranged and configured to check the charge level of the battery, and transmit information indicative of the current level of the battery via the
communication channel for inclusion in the image data.
15. A secondary display device as claimed in any of claims 12 to 14, wherein the user interface comprises a touchscreen.
16. A secondary display device as claimed in any of claims 12 to 15, wherein the secondary display device is constructed and arranged to be mounted on a motorcycle, a moped and/or a scooter.
17. A secondary display device as claimed in claim 16, wherein the secondary display device is constructed and arranged to be mounted on a handlebar and/or on a mirror mount of the motorcycle, the moped and/or the scooter.
18. A mobile device and/or secondary display device as claimed in any preceding claim, wherein the communication module(s) comprises a wireless communication module, and wherein the communication channel comprises a wireless communication channel, optionally wherein the wireless communication is Bluetooth communication.
19. A system comprising a mobile device and a secondary display device for navigating a user travelling within an area within which the mobile device and secondary display device are travelling;
the mobile device comprising a communication module and at least one processor; and the secondary display device comprising a communication module for communicating with the communication module of the mobile device via a communication channel, at least one processor and a display;
wherein the at least one processor of the mobile device is arranged and configured to: obtain positional data indicative of a current position of the mobile device and/or the secondary display device within the area;
generate real-time image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area; and transmit the real-time image data via the communication channel to the secondary display device for display by the secondary display device;
wherein the at least one processor of the secondary display device is arranged and configured to:
receive the transmitted image data via the communication channel; and display the image to the user via the display.
20. A system as claimed in claim 19, wherein the secondary display device further comprises a user interface, such as a touchscreen, for receiving a user input,
wherein the secondary display device is arranged and configured upon said user input to transmit a signal across the communication channel to the mobile device; and wherein the processor of the mobile device is arranged and configured to generate new and/or updated image data based upon said signal and to transmit the new or updated image data to the secondary display device for display.
21. A system as claimed in claim 19 or 20, wherein the mobile device comprises a mobile device as claimed in any of claims 1 to 11, and/or wherein the secondary display device comprises a secondary display device as claimed in any of claims 12 to 18.
22. A method for navigating a user comprising:
(a) providing a mobile device comprising a communication module and at least one processor;
(b) providing a secondary display device comprising a communication module for communicating with the communication module of the mobile device via a
communication channel, and a display;
(c) obtaining positional data indicative of a current position of the mobile device and/or the secondary display device within an area within which the mobile device and secondary display device are travelling;
(d) generating, using the at least one processor of the mobile device, image data representative of an image indicating the current position of the mobile device and/or the secondary display device within the area;
(e) transmitting the generated image data from the mobile device to the secondary display device via the communication channel;
(f) receiving the transmitted image data at the secondary display device; and
(g) displaying the image at the display;
wherein the method further comprises repeating steps (c) to (g) to generate a real time display of the current position of the mobile device and/or the secondary display device within the area.
23. A method as claimed in claim 22, wherein the secondary display device further comprises a user interface, such as a touchscreen, for receiving a user input, the method further comprising:
providing a user input at the user interface of the secondary display device;
transmitting a signal based on said user input to the mobile device via said
communication channel; and
generating new and/or updated image data at the mobile device based on said signal.
24. A computer program product that when executed on a computerised mobile device causes the mobile device to perform a method comprising:
(a) obtaining positional data indicative of a current position of the mobile device and/or a secondary display device with which the mobile device is arranged and adapted to
communicate via a communication channel; (b) generating real-time image data representative of an image indicating the current position of the mobile device and/or the secondary display device; and
(c) transmitting the generated image data from the mobile device via the communication channel for display by the secondary display device.
PCT/EP2017/071966 2016-09-01 2017-09-01 Navigation device and display WO2018041999A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1614852.0A GB201614852D0 (en) 2016-09-01 2016-09-01 Navigation device and display
GB1614852.0 2016-09-01

Publications (1)

Publication Number Publication Date
WO2018041999A1 true WO2018041999A1 (en) 2018-03-08

Family

ID=57140029

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/071966 WO2018041999A1 (en) 2016-09-01 2017-09-01 Navigation device and display

Country Status (2)

Country Link
GB (1) GB201614852D0 (en)
WO (1) WO2018041999A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000180197A (en) * 1998-12-18 2000-06-30 Casio Comput Co Ltd Current position display system, current position display control device and map data transmission device
JP2000270283A (en) * 1999-03-16 2000-09-29 Canon Inc System and method for image processing
US20060089786A1 (en) * 2004-10-26 2006-04-27 Honeywell International Inc. Personal navigation device for use with portable device
US20090315913A1 (en) * 2006-07-21 2009-12-24 Panasonic Corporation Map display system
US20100179756A1 (en) * 2009-01-13 2010-07-15 Yahoo! Inc. Optimization of map views based on real-time data
US20120320193A1 (en) * 2010-05-12 2012-12-20 Leica Geosystems Ag Surveying instrument
EP2778614A1 (en) * 2013-03-15 2014-09-17 Apple Inc. Mapping application with turn-by-turn navigation mode for output to vehicle display
US20160003623A1 (en) * 2014-07-02 2016-01-07 Qualcomm Incorporated Methods and systems for collaborative navigation and operation with a mobile device and a wearable device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SONY SMARTWATCH: "Smartwatch Connect", 1 January 2014 (2014-01-01), XP055421420, Retrieved from the Internet <URL:http://www.navigon.com/portal/common/Download/Manual/android/Manual_NavigonForSonySmartWatch2.pdf> [retrieved on 20171103] *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3838730A1 (en) * 2019-12-19 2021-06-23 Ktm Ag Motorcycle with operable multifunction device
US11975788B2 (en) 2019-12-19 2024-05-07 Ktm Ag Motorcycle with a multifunctional device
DE102022211140A1 (en) 2022-10-20 2024-04-25 Robert Bosch Gesellschaft mit beschränkter Haftung Light vehicle and method for navigation using a portable wireless communication terminal and a bicycle screen
WO2024083603A1 (en) * 2022-10-20 2024-04-25 Robert Bosch Gmbh Light vehicle and methods for navigation using a portable wireless communication terminal and a bicycle screen

Also Published As

Publication number Publication date
GB201614852D0 (en) 2016-10-19

Similar Documents

Publication Publication Date Title
US10060754B2 (en) Navigation device and method
US7818125B2 (en) Move guidance device, system, method, program and recording medium storing the program that displays a code containing map scale rate and position information
US20110208421A1 (en) Navigation device, navigation method, and program
US8744755B2 (en) Navigation device, navigation method and program
US20110178703A1 (en) Navigation apparatus and method
TW201000864A (en) Navigation device &amp; method
JP2013148419A (en) Guidance system, mobile terminal apparatus and vehicle-mounted apparatus
US20120150429A1 (en) Method and arrangement relating to navigation
US20160084666A1 (en) Routing engine
US9638531B2 (en) Map matching methods for mobile devices
WO2018041999A1 (en) Navigation device and display
JP5942615B2 (en) Route guidance device, route guidance method and program
JP2005265572A (en) Operation method for on-vehicle information terminal, on-vehicle information terminal, program for portable terminal, and portable phone
WO2019093032A1 (en) Vehicle-mounted device, recording medium, and notification method
US20150350420A1 (en) Call Control Device, Server, and Program
US20120221242A1 (en) Navigation device and a method of operation of a navigation device
JP2008270949A (en) Information system
JP2000124852A (en) Broadcast information provision system
JP2010271264A (en) Navigation system, and operation sound output method
WO2012022365A1 (en) Navigation device &amp; method
EP3018449B1 (en) Method of sharing data between electronic devices
WO2011160679A1 (en) Navigation device &amp; method
KR101395980B1 (en) Taxi application control apparatus
WO2010012295A1 (en) Navigation apparatus and method and computer software for use in the same
JP2019060879A (en) Communication device, information notification method, information notification program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17768387

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17768387

Country of ref document: EP

Kind code of ref document: A1
