US20150106857A1 - System And Method For Generating Screen Pointing Information In A Television Control Device
Info
- Publication number
- US20150106857A1 (application US14/572,916)
- Authority
- US
- United States
- Prior art keywords
- television
- pointing
- screen
- control device
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/0304—Detection arrangements using opto-electronic means
- H04N21/482—End-user interface for program selection
- G06F3/0308—Detection arrangements using opto-electronic means comprising a plurality of distinctive and separately oriented light emitters or reflectors associated to the pointing device, e.g. remote cursor controller with distinct and separately oriented LEDs at the tip whose radiations are captured by a photo-detector associated to the screen
- G06F3/0325—Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
- G06F3/0346—Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0386—Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry, for light pen
- G06F3/0412—Digitisers structurally integrated in a display
- G06F3/0428—Digitisers characterised by opto-electronic transducing means sensing, at the edges of the touch surface, the interruption of optical paths, e.g. an illumination plane parallel to the touch surface which may be virtual
- H04N21/234318—Processing of video elementary streams involving reformatting operations by decomposing into objects, e.g. MPEG-4 objects
- H04N21/23892—Multiplex stream processing involving embedding information at multiplex stream level, e.g. embedding a watermark at packet level
- H04N21/2408—Monitoring of the upstream path of the transmission network, e.g. client requests
- H04N21/25841—Management of client data involving the geographical location of the client
- H04N21/2668—Creating a channel for a dedicated end-user group, e.g. insertion of targeted commercials based on end-user profiles
- H04N21/41265—The peripheral being portable, e.g. PDAs or mobile phones, having a remote control device for bidirectional communication between the remote control device and client device
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42206—Remote control devices characterized by hardware details
- H04N21/42209—Display device provided on the remote control for displaying non-command information, e.g. electronic program guide [EPG], e-mail, messages or a second television channel
- H04N21/42222—Additional components integrated in the remote control device, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone or battery charging device
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk, by playing back content from the storage medium
- H04N21/4334—Recording operations
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/438—Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
- H04N21/44008—Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
- H04N21/4524—Management of client data or end-user data involving the geographical location of the client
- H04N21/4622—Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/4722—End-user interface for requesting additional data associated with the content
- H04N21/4725—End-user interface for requesting additional data associated with the content using interactive regions of the image, e.g. hot spots
- H04N21/4728—End-user interface for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
- H04N21/47805—Supplemental services: electronic banking
- H04N21/47815—Supplemental services: electronic shopping
- H04N21/4782—Supplemental services: web browsing, e.g. WebTV
- H04N21/4826—End-user interface for program selection using recommendation lists, e.g. of programs or channels sorted out according to their score
- H04N21/4828—End-user interface for program selection for searching program descriptors
- H04N21/812—Monomedia components involving advertisement data
- H04N21/8126—Monomedia components involving additional data, e.g. news, sports, stocks, weather forecasts
- H04N21/8133—Additional data specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
- H04N21/8173—Monomedia components involving executable data: end-user applications, e.g. Web browser, game
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8545—Content authoring for generating interactive applications
- H04N21/858—Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards, for displaying additional information
- H04N5/76—Television signal recording
- H04N9/8205—Recording of colour television signals involving the multiplexing of an additional signal and the colour video signal
Definitions
- FIG. 1 is a diagram illustrating an exemplary television system in accordance with various aspects of the present invention.
- FIG. 2 is a diagram illustrating an exemplary television control device in accordance with various aspects of the present invention.
- FIG. 3 is a diagram illustrating an exemplary television system with on-screen television sensors in accordance with various aspects of the present invention.
- FIG. 4 is a diagram illustrating an exemplary television system with off-screen television sensors in accordance with various aspects of the present invention.
- FIG. 5 is a diagram illustrating an exemplary television system with off-television sensors in accordance with various aspects of the present invention.
- FIG. 6 is a diagram illustrating an exemplary television system with television receiver sensors in accordance with various aspects of the present invention.
- FIG. 7 is a diagram illustrating an exemplary television system with television controller sensors in accordance with various aspects of the present invention.
- FIG. 8 is a diagram illustrating an exemplary television control device in accordance with various aspects of the present invention.
- FIG. 9 is a flow diagram illustrating the generation of on-screen pointing information in accordance with various aspects of the present invention.
- FIG. 10 is a flow diagram illustrating the generation of on-screen pointing information in accordance with various aspects of the present invention.
- As used herein, modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware).
- modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such.
- various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory).
- aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.
- any or all of the functional modules discussed herein may share various hardware and/or software components.
- any or all of the functional modules discussed herein may be implemented wholly or in-part by a shared processor executing software instructions.
- various software sub-modules that may be executed by one or more processors may be shared between various software modules. Accordingly, the scope of various aspects of the present invention should not be limited by arbitrary boundaries between various hardware and/or software components, unless explicitly claimed.
- a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, television, television control device, television provider, television programming provider, television receiver, video recording device, etc.) may communicate with other systems.
- a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), any home or premises communication network, etc.
- a particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.
- A pointing location, as used herein, refers to a location on the television screen to which a user (either directly or with a pointing device) is pointing. Such a pointing location is to be distinguished from other types of on-screen location identification, such as, for example, using arrow keys and/or a mouse to move a cursor or to traverse blocks (e.g., on an on-screen program guide) without pointing.
- Television programming, as used herein, generally includes any of a variety of types of television programming (e.g., television programs, news programs, sports programs, music television, movies, television series programs and/or associated advertisements, educational programs, live or recorded, broadcast/multicast/unicast, etc.).
- Such television programming video content is to be distinguished from other non-programming video content that may be displayed on a television screen (e.g., an electronic program guide, user interface menu, a television set-up menu, a typical web page, a document, a graphical video game, etc.).
- Various aspects of the present invention may, for example, comprise determining an on-screen pointing location during the presentation of television programming on the screen of the television.
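- The pointing-location calculation itself is left abstract above. The sketch below is a minimal, hypothetical illustration of one way such a calculation could be performed when the control device carries a forward-facing image sensor that sees reference markers placed at the corners of the television screen (one of several sensor arrangements suggested by FIGS. 3-7); the function names, the marker layout, and the simple axis-aligned interpolation are illustrative assumptions, not the method prescribed by this application.

```python
# Minimal sketch (assumptions: the control device has a forward-facing image
# sensor that reports the pixel positions of four markers at the television
# screen corners, and the device faces the screen nearly head-on, so a simple
# axis-aligned interpolation is an adequate approximation).
from dataclasses import dataclass
from typing import Tuple


@dataclass
class ImagePoint:
    x: float  # pixel column in the control device's sensor image
    y: float  # pixel row in the control device's sensor image


def on_screen_pointing_location(top_left: ImagePoint, top_right: ImagePoint,
                                bottom_left: ImagePoint, bottom_right: ImagePoint,
                                boresight: ImagePoint) -> Tuple[float, float]:
    """Return the pointed-to location as normalized (x, y) in [0, 1].

    `boresight` is the image point corresponding to the device's pointing
    axis (for a centered sensor, the middle of the image); the four corner
    arguments are the detected marker positions in the same image.
    """
    # Approximate the projected screen by its axis-aligned extent.
    left = (top_left.x + bottom_left.x) / 2.0
    right = (top_right.x + bottom_right.x) / 2.0
    top = (top_left.y + top_right.y) / 2.0
    bottom = (bottom_left.y + bottom_right.y) / 2.0

    # Interpolate the boresight between the marker extents and clamp to the
    # screen, yielding resolution-independent pointing coordinates.
    nx = (boresight.x - left) / (right - left)
    ny = (boresight.y - top) / (bottom - top)
    return (min(max(nx, 0.0), 1.0), min(max(ny, 0.0), 1.0))
```

- A receiving television (or television receiver) could scale these normalized coordinates by its own pixel dimensions; a more robust implementation would fit a full homography to the four markers to tolerate off-axis pointing.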
- the exemplary system 100 includes a television provider 110 .
- the television provider 110 may, for example, comprise a television network company, a cable company, a movie-providing company, a news company, an educational institution, etc.
- the television provider 110 may, for example, be an original source of television programming (or related information).
- the television provider 110 may be a communication company that provides programming distribution services (e.g., a cable television company, a satellite television company, a telecommunication company, a data network provider, etc.).
- the television provider 110 may, for example, provide programming and non-programming information and/or video content.
- the television provider 110 may, for example, provide information related to a television program (e.g., information describing or otherwise related to selectable objects in programming, etc.).
- the exemplary television system 100 may also include a third party program information provider 120 .
- a third party program information provider 120 may, for example, provide information related to a television program.
- Such information may, for example, comprise information describing selectable objects in programming, program guide information, etc.
- the exemplary television system 100 may include one or more communication networks (e.g., the communication network(s) 130 ).
- the exemplary communication network 130 may comprise characteristics of any of a variety of types of communication networks over which video content and/or information related to video content may be communicated.
- the communication network 130 may comprise characteristics of a cable television network, a satellite television network, a telecommunication network, the Internet, a local area network (LAN), a personal area network (PAN), a metropolitan area network (MAN), any of a variety of different types of home networks, etc.
- the exemplary television system 100 may include a first television 140 .
- a first television 140 may, for example, comprise networking capability enabling such television 140 to communicate directly with the communication network 130 .
- the first television 140 may comprise one or more embedded television receivers or transceivers (e.g., a cable television receiver, satellite television transceiver, Internet modem, etc.).
- the first television 140 may comprise one or more recording devices (e.g., for recording and/or playing back video content, television programming, etc.).
- the exemplary television system 100 may include a first television controller 160 .
- a first television controller 160 may, for example, operate to (e.g., which includes operating when enabled to) control operation of the first television 140 .
- the first television controller 160 may comprise characteristics of any of a variety of television controlling devices.
- the first television controller 160 may comprise characteristics of a dedicated television control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.
- the first television controller 160 may, for example, transmit signals directly to the first television 140 to control operation of the first television 140 .
- the first television controller 160 may also, for example, operate to transmit signals (e.g., via the communication network 130 ) to the television provider 110 to control video content being provided to the first television 140 , or to conduct other transactions (e.g., business transactions, etc.).
- the first television controller 160 may operate to communicate screen pointing information with the first television 140 and/or other devices.
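- The content of the "screen pointing information" exchanged between the controller and the television is not tied to any particular wire format above; the following sketch assumes, purely for illustration, a small JSON message carrying normalized screen coordinates, a timestamp, and a device identifier.

```python
# Hypothetical pointing-information message; the field names are illustrative
# assumptions, not a format defined by this application.
import json
import time


def make_pointing_message(device_id: str, nx: float, ny: float,
                          select_pressed: bool = False) -> bytes:
    """Serialize one pointing update for transmission to a television,
    television receiver, or other device in the television system."""
    message = {
        "type": "pointing_update",
        "device_id": device_id,        # which control device is pointing
        "x": round(nx, 4),             # normalized horizontal location, 0..1
        "y": round(ny, 4),             # normalized vertical location, 0..1
        "select": select_pressed,      # e.g., the user pressed a select button
        "timestamp_ms": int(time.time() * 1000),
    }
    return json.dumps(message).encode("utf-8")


# Example: report that controller "160" is pointing near the screen center.
payload = make_pointing_message("160", 0.51, 0.48)
```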
- various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an object or person presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners.
- One of such exemplary manners includes pointing with a television control device.
- the first television controller 160 provides a non-limiting example of a device that a user may utilize to point to an on-screen location. The following discussion of FIGS. 2-10 will present various non-limiting illustrative aspects of such a television controller.
- the exemplary television system 100 may also include a television receiver 150 .
- the television receiver may, for example, operate to provide a communication link between a television and/or television controller and a communication network and/or information provider.
- the television receiver 150 may operate to provide a communication link between the second television 141 and the communication network 130 , or between the second television 141 and the television provider 110 (and/or third party program information provider 120 ) via the communication network 130 .
- the television receiver 150 may comprise characteristics of any of a variety of types of television receivers.
- the television receiver 150 may comprise characteristics of a cable television receiver, a satellite television receiver, etc.
- the television receiver 150 may comprise a data communication network modem for data network communications (e.g., with the Internet, a LAN, PAN, MAN, telecommunication network, etc.).
- the television receiver 150 may also, for example, comprise recording capability (e.g., programming recording and playback, etc.).
- the exemplary television system 100 may include a second television controller 161 .
- a second television controller 161 may, for example, operate to control operation of the second television 141 and the television receiver 150 .
- the second television controller 161 may comprise characteristics of any of a variety of television controlling devices.
- the second television controller 161 may comprise characteristics of a dedicated television control device, a dedicated television receiver control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.
- the second television controller 161 may, for example, transmit signals directly to the second television 141 to control operation of the second television 141 .
- the second television controller 161 may, for example, transmit signals directly to the television receiver 150 to control operation of the television receiver 150 .
- the second television controller 161 may additionally, for example, operate to transmit signals (e.g., via the television receiver 150 and the communication network 130 ) to the television provider 110 to control video content being provided to the television receiver 150 , or to conduct other transactions (e.g., business transactions, etc.).
- various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an object or person presented in television programming).
- the user may perform such pointing in any of a variety of manners.
- One of such exemplary manners includes pointing with a television control device.
- the second television controller 161 provides one non-limiting example of a device that a user may utilize to point to an on-screen location.
- FIGS. 2-10 will present various non-limiting illustrative aspects of such a television controller.
- the exemplary television system 100 was presented to provide a non-limiting illustrative foundation for discussion of various aspects of the present invention. Thus, the scope of various aspects of the present invention should not be limited by any characteristics of the exemplary television system 100 unless explicitly claimed.
- FIG. 2 is a diagram illustrating an exemplary television control device 200 (e.g., a remote control device) in accordance with various aspects of the present invention.
- the exemplary television control device 200 may, for example, share any or all characteristics with the exemplary television control devices 160 , 161 illustrated in FIG. 1 and discussed previously and/or with any of the exemplary television control devices discussed herein.
- the exemplary television control device 200 includes a first communication interface module 210 .
- the first communication interface module 210 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols.
- although the first communication interface module 210 is illustrated as coupled to a wireless RF antenna via a wireless port 212, the wireless medium is merely illustrative and non-limiting.
- the first communication interface module 210 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television video content (e.g., television programming), television control information, and/or other data is communicated.
- the first communication module 210 may operate to communicate with local sources of television video content (e.g., video recorders, receivers, gaming devices, etc.). Additionally, for example, the first communication module 210 may operate to communicate with a second television controller (e.g., directly or via one or more intermediate communication networks). Further for example, the first communication module 210 may operate to communicate with a television utilizing any of a variety of television communication connections and/or protocols (e.g., composite video, component video, HDMI, etc.). Still further, for example, the first communication module 210 may operate to communicate with screen pointing sensors.
- the exemplary television control device 200 includes a second communication interface module 220 .
- the second communication interface module 220 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols.
- the second communication interface module 220 may communicate via a wireless RF communication port 222 and antenna, or may communicate via a non-tethered optical communication port 224 (e.g., utilizing laser diodes, photodiodes, etc.).
- the second communication interface module 220 may communicate via a tethered optical communication port 226 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 228 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.).
- the second communication interface module 220 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television video content, television control information, and/or other data is communicated.
- the second communication module 220 may operate to communicate with local sources of television video content (e.g., video recorders, other receivers, gaming devices, etc.). Additionally, for example, the second communication module 220 may operate to communicate with a second television controller (e.g., directly or via one or more intervening communication networks). Further for example, the second communication module 220 may operate to communicate with a television utilizing any of a variety of television communication connections and/or protocols (e.g., composite video, component video, HDMI, etc.). Still further, for example, the second communication module 220 may operate to communicate with screen pointing sensors.
- the exemplary television control device 200 may also comprise additional communication interface modules, which are not illustrated. Such additional communication interface modules may, for example, share any or all aspects with the first 210 and second 220 communication interface modules discussed above.
- the exemplary television control device 200 may also comprise a communication module 230 .
- the communication module 230 may, for example, operate to control and/or coordinate operation of the first communication interface module 210 and the second communication interface module 220 (and/or additional communication interface modules as needed).
- the communication module 230 may, for example, provide a convenient communication interface by which other components of the television control device 200 may utilize the first 210 and second 220 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules are sharing a medium and/or network, the communication module 230 may coordinate communications to reduce collisions and/or other interference between the communication interface modules 210 , 220 .
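- As a rough illustration of that coordination role, the sketch below serializes transmissions from several registered interface modules through one lock so that modules sharing a medium do not transmit simultaneously; the class and method names are assumptions for illustration, not structures defined by this application.

```python
# Minimal sketch of a communication module arbitrating access to interface
# modules that share a medium (names are illustrative assumptions).
import threading
from typing import Callable, Dict


class CommunicationModule:
    def __init__(self) -> None:
        self._interfaces: Dict[str, Callable[[bytes], None]] = {}
        self._medium_lock = threading.Lock()  # one transmitter at a time

    def register_interface(self, name: str,
                           transmit: Callable[[bytes], None]) -> None:
        """Register a communication interface module (e.g., RF, IR) by name."""
        self._interfaces[name] = transmit

    def send(self, interface_name: str, payload: bytes) -> None:
        """Transmit via the named interface while holding the shared-medium
        lock, so concurrent callers cannot collide on the same medium."""
        transmit = self._interfaces[interface_name]
        with self._medium_lock:
            transmit(payload)
```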
- the exemplary television control device 200 may comprise one or more television interface modules 235 .
- the television interface module 235 may, for example, operate to manage communications between the television control device 200 and one or more televisions that are communicatively coupled thereto (e.g., via the first 210 and/or second 220 communication interface modules).
- the television interface module 235 may operate to communicate general television programming video information to a television (e.g., while the television control device 200 is operating to determine an on-screen pointing location).
- the television interface module 235 may output a signal to the television, television receiver, a second television controller, or another device with a display, where such signal comprises characteristics adapted to cause the television (or other device) to output a visual indication of the on-screen pointing location.
- Such an indication may, for example, be communicated with (e.g., as a part of) other information (e.g., video information, general device control information, etc.) being communicated to the television (or other device), or such an indication may be communicated to the television (or other device) independent of other information.
- the exemplary television control device 200 may additionally comprise one or more user interface modules 240 .
- the user interface module 240 may generally operate to provide user interface functionality to a user of the television control device 200 .
- the user interface module 240 may operate to provide for user control of any or all standard television and/or television receiver commands (e.g., channel control, on/off, television output settings, input selection, etc.).
- the user interface module 240 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the television receiver (e.g., buttons, touch screen, microphone, etc.) and may also utilize the communication module 230 (and/or first 210 and second 220 communication interface modules) to communicate with a television controller, television receiver, another television control device and/or any other television system component.
- user interface features of the television control device 200 may comprise utilization of the television (e.g., utilizing the television screen for menu-driven or other GUI associated with television, television receiver and/or television controller operation).
- the user interface module 240 may also operate to interface with and/or control operation of any of a variety of sensors that may be utilized to ascertain an on-screen pointing location. Non-limiting examples of such sensors will be provided later (e.g., in the discussion of FIGS. 3-7 and elsewhere herein).
- the user interface module 240 may operate to receive signals associated with respective sensors (e.g., raw or processed signals directly from the sensors, through intermediate devices (e.g., a television, television control, surround sound system, etc.), via the communication interface modules 210 , 220 , etc.).
- the user interface module 240 may operate to control the transmission of signals (e.g., RF signals, optical signals, acoustic signals, etc.) from such sensors.
- the exemplary television control device 200 may comprise one or more processors 250 .
- the processor 250 may, for example, comprise one or more of a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc.
- the processor 250 may operate in accordance with software (or firmware) instructions.
- any or all functionality discussed herein may be performed by a processor executing instructions.
- Although various modules are illustrated as separate blocks or modules in FIG. 2 for illustrative clarity, such illustrative modules, or a portion thereof, may be implemented by the processor 250 .
- the exemplary television control device 200 may comprise one or more memories 260 . As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one or more memories 260 . Such memory 260 may, for example, comprise characteristics of any of a variety of types of memory. For example and without limitation, such memory 260 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable OTP memory, etc.), hard drive memory, CD memory, DVD memory, etc.
- the exemplary television control device 200 may also comprise one or more calibration modules 251 that operate to perform various calibration activities. Examples of such calibration activities will be provided later in this discussion. Briefly, such calibration activities may, for example, comprise interacting with a user and/or user pointing device (e.g., if different from the television control device 200 ) to determine sensor signals under known circumstances (e.g., determine sensor signals in response to known screen pointing circumstances), and processing such sensor signals to develop algorithms (e.g., transformation matrices, static positional equations, etc.) to determine screen pointing location based on sensor signals received during normal operation. As will also be discussed later, such calibration may also be utilized to establish signal gain (or energy) patterns utilized in determining pointing location.
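- As a purely illustrative sketch (not the claimed implementation), the following Python snippet shows one way such a calibration might be realized: the user is prompted to point at a few known on-screen targets, and a least-squares linear map (one example of the transformation matrices mentioned above) is fitted from the recorded sensor-signal vectors to screen coordinates. All names (`fit_calibration`, `sensor_readings`, etc.) are hypothetical.

```python
import numpy as np

def fit_calibration(sensor_readings, known_points):
    """Fit a linear map from sensor-signal vectors to screen coordinates.

    sensor_readings: (K, S) array, one row of S sensor energies per
                     calibration prompt (user pointing at a known target).
    known_points:    (K, 2) array of the corresponding (x, y) screen targets.
    Returns an (S + 1, 2) matrix that includes a bias row.
    """
    K = sensor_readings.shape[0]
    # Append a constant column so the fit includes an offset term.
    A = np.hstack([sensor_readings, np.ones((K, 1))])
    # Least-squares solution of A @ T ~= known_points.
    T, *_ = np.linalg.lstsq(A, known_points, rcond=None)
    return T

def apply_calibration(T, reading):
    """Map a single S-element sensor reading to an (x, y) screen estimate."""
    return np.append(reading, 1.0) @ T
```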
- the exemplary television control device 200 may comprise one or more location-determining modules 252 .
- various on-screen pointing location determinations may comprise processing location information.
- For example, knowing the location of a user (e.g., including the location of a pointing device, which could be the television control device 200 , being utilized by the user) relative to the television screen may simplify the determination of on-screen pointing location.
- For example, knowing exactly where a pointing device is located (e.g., in three-dimensional space), or where a pointing device is located along a line (e.g., knowing device location in two-dimensional space or land surface coordinates), relative to the television screen (and/or relative to the television control device) may remove a number of unknown variables from the applicable positional equations.
- positional information may, in various exemplary scenarios, also comprise orientation information for a pointing device (e.g., yaw, pitch and/or roll).
- orientation information may be determined in various manners (e.g., through gyroscopic means, sensor alignment with known references, etc.).
- the location-determining module 252 may operate to determine user (or pointing device) location in any of a variety of manners. For example and without limitation, in an exemplary scenario where the pointing device is different from the control device 200 , the location-determining module 252 may operate to receive location information from the pointing device (e.g., via one of the communication interface modules 210 , 220 ). For example, such a pointing device may comprise positioning system capability (e.g., global positioning system, assisted GPS, cellular or other triangulation systems, etc.) and communicate information describing the position of the pointing device to the television control device 200 . In an exemplary scenario where the television control device 200 is the pointing device, the television control device 200 may comprise on-board position-determining capability.
- the location-determining module 252 may (e.g., via the user interface modules 240 ) utilize sensor signals to determine the position (which may include orientation) of the pointing device (or user thereof). For example, signals may arrive at the pointing device at different sensors at different times (or at different phases). Such temporal or phase differences may be processed to determine the location of the pointing device relative to the known location of such sensors. Further for example, the location-determining module 252 may operate to communicate pointing device location information with an external system that operates to determine the location of the pointing device. Such an external system may, for example, comprise a cellular telephony triangulation system, a home or premises-based triangulation system, a global positioning system, an assisted global positioning system, etc. In a non-limiting exemplary scenario where the control device 200 is the pointing device, the location information communicated with the external system may be location information associated with the control device 200 .
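- The time/phase-difference processing mentioned above could, for example, take the form of a small multilateration solve. The sketch below assumes the sensor positions are known (e.g., from installation or calibration) and that the arrival time of a common pulse has been measured at each sensor; the solver choice and all names are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import least_squares

C = 3.0e8  # propagation speed for an RF signal (m/s); use ~343 m/s for acoustic signals

def locate_by_tdoa(sensor_xy, arrival_times, x0=(0.0, 0.0)):
    """Estimate a 2-D transmitter (pointing device) position from arrival-time
    differences, since the transmit time itself is unknown.

    sensor_xy:     (S, 2) known sensor positions.
    arrival_times: (S,) measured arrival times of the same pulse at each sensor.
    Returns the (x, y) position minimizing the time-difference residuals.
    """
    sensor_xy = np.asarray(sensor_xy, dtype=float)
    t = np.asarray(arrival_times, dtype=float)
    ref = 0  # use sensor 0 as the time reference

    def residuals(p):
        d = np.linalg.norm(sensor_xy - p, axis=1)   # distances to each sensor
        dt_model = (d - d[ref]) / C                 # modeled time differences
        dt_meas = t - t[ref]                        # measured time differences
        return (dt_model - dt_meas)[1:]             # drop the zero reference row

    return least_squares(residuals, x0).x
```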
- the exemplary television control device 200 may also comprise one or more sensor processing module(s) 253 .
- the sensor processing module 253 may operate to receive sensor information (e.g., from the user interface module(s) 240 , from the television interface module 235 , from the communication interface modules 210 , 220 , etc.) and process such received sensor information to determine a location on the television screen to which a user is pointing.
- processing may, for example, comprise selecting a sensor with the strongest signal, interpolating between a plurality of sensors, interpolating between a plurality of sensors having strongest signals, determining gain (or energy) pattern intersections, etc.
- Various aspects of the present invention comprise, for example, determining on-screen pointing location during presentation of television programming (e.g., programming received from a television broadcaster, video recording device, etc.).
- FIG. 3 is a diagram illustrating an exemplary television system 300 with on-screen television sensors in accordance with various aspects of the present invention.
- the television system 300 includes a television 301 comprising a television screen 303 .
- the television system 300 also includes a television controller 320 (or other pointing device) pointing to an on-screen pointing location 330 along a line 325 between the television controller 320 and the on-screen pointing location 330 .
- the television controller 320 may, for example, share any or all aspects with the exemplary television controllers 160 , 161 and 200 discussed previously and with all other television controllers discussed herein.
- the television control device 320 may, for example, be communicatively coupled directly to the television 301 via a communication link 353 .
- the television control device 320 may also, for example, be communicatively coupled directly to the television receiver 350 via communication link 352 .
- the television control device 320 may additionally, for example, be communicatively coupled indirectly to the television 301 via the television receiver 350 through communication links 351 and 352 . Accordingly, various aspects of the television control device 320 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2 .
- the television system 300 also comprises a television receiver 350 that is communicatively coupled to the television 301 via a communication link 351 (e.g., a two-way communication link providing video information to the television 301 and/or receiving sensor information from the television 301 for communication to the television control device 320 ).
- the exemplary television receiver 350 is also communicatively coupled to the television controller 320 via a communication link 352 .
- the exemplary television screen 303 comprises an array of sensors integrated into the television screen 303 .
- One of such sensors is labeled sensor 310 .
- Any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes) and RF sensors (e.g., antenna elements or loops).
- the array of sensors may be integrated in the television screen 303 in any of a variety of manners, non-limiting examples of which will now be provided.
- the television screen 303 may comprise an array of liquid crystal display (LCD) pixels for presenting video media to a user.
- An array of photo diodes and/or antenna elements may be integrated between or behind LCD pixels.
- every LCD pixel may be associated with a corresponding photo diode and/or antenna element, or every N×M block of LCD pixels may be associated with a corresponding photo diode or antenna element.
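- Under the N×M-block assumption above, mapping a sensor's grid index to the screen coordinate it represents is straightforward bookkeeping; the following minimal sketch (hypothetical names) illustrates it.

```python
def sensor_to_pixel(row, col, block_h, block_w):
    """Return the screen pixel at the center of the N x M pixel block
    served by the sensor at grid position (row, col).

    block_h, block_w: the N (rows) and M (columns) of pixels per sensor,
    a hypothetical configuration parameter of the screen-integrated array.
    """
    y = row * block_h + block_h // 2
    x = col * block_w + block_w // 2
    return x, y

# e.g., with one sensor per 8 x 8 block, sensor (10, 20) represents pixel (164, 84)
print(sensor_to_pixel(10, 20, 8, 8))
```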
- an array of photo diodes and/or RF antenna elements may be formed into a substrate beneath or behind transparent LCD substrates.
- a photo diode array and/or antenna element array may be interposed between or behind an array of LCD thin film transistors.
- an array of photo diodes and/or RF antenna elements (or other sensors) may be incorporated into a transparent screen overlay. Note that in such an implementation, the transparent screen overlay may be installed after-market. For example, a user that has a television control device 320 with the capability to determine on-screen pointing location may install the transparent screen overlay.
- passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source (e.g., a light source of the television control device 320 or other pointing device) aimed at the screen 303 .
- Also, for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
- photo detectors may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections, ambient light, etc.
- photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc.
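- One simple, purely illustrative way such a pulse code might be detected is a normalized cross-correlation of the photo-detector samples against the known code, as sketched below; the sampling model, threshold, and function names are assumptions.

```python
import numpy as np

def detect_pulse_code(samples, code, threshold=0.8):
    """Return True if the known pulse code appears in the sampled
    photo-detector signal (a simple normalized cross-correlation test).

    samples:   1-D array of detector samples.
    code:      1-D array holding the expected pulse pattern (e.g., +1/-1 chips).
    threshold: normalized-correlation level treated as a detection.
    """
    samples = np.asarray(samples, dtype=float)
    code = np.asarray(code, dtype=float)
    samples = samples - samples.mean()            # remove the ambient-light offset
    corr = np.correlate(samples, code, mode="valid")
    window_power = np.convolve(samples ** 2, np.ones(len(code)), mode="valid")
    norm = np.linalg.norm(code) * np.sqrt(window_power)
    peak = np.max(np.abs(corr) / np.maximum(norm, 1e-12))
    return peak >= threshold
```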
- an array of antenna elements may be formed on a substrate and placed behind light producing and/or filtering elements in an LCD screen (e.g., so as to avoid interfering with emitted light) or may be formed on a transparent substrate within or in front of the lighted region of the LCD display (e.g., utilizing microscopic antenna elements that are too small to significantly interfere with light emitted from the display).
- an implementation may be integrated with the television screen 303 , but may also be added as an overlay (e.g., as a production option or an after-market user or technician installation).
- passive antennas may receive varying respective amounts of RF energy depending on the pointing direction of a directional RF source (e.g., a directional RF source of the television control device 320 or other pointing device) aimed at the screen.
- Also, for example, received signals (e.g., pulsed signals) may arrive at different antenna elements at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
- a user may point a pointing device (e.g., a the television control device 320 , a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 303 , where the pointing device directs transmitted energy (e.g., light energy, RF energy, acoustic energy, etc.) at a particular location on the television screen 303 to which the pointing device is being pointed.
- Such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain pattern with the highest intensity likely at the center of the pattern (i.e., along the pointing line 325 ) and decreasing as a function of angle from the center of the pattern (or distance on the screen from the on-screen pointing location).
- each sensor of the array of sensors integrated into the screen 303 will likely receive some respective amount of energy.
- the sensor nearest the screen pointing location 330 (i.e., along the pointing line 325 ) will likely receive the highest amount of energy,
- sensors adjacent to the screen pointing location 330 will likely receive a next highest range of energy, and
- sensors farther from the pointing location 330 will likely receive progressively smaller amounts of energy from the pointing device (e.g., the television control device 320 ) as a function of distance from the pointing location 330 , until such energy is lost in the noise floor (a toy model of such a fall-off is sketched below).
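- The sketch below gives a toy model of such an energy fall-off: a single Gaussian-shaped main lobe bottoming out in a noise floor. The Gaussian shape and the parameter values are illustrative assumptions only; a real emitter pattern may include side lobes and nulls.

```python
import numpy as np

def expected_energy(angle_off_axis_deg, peak_power=1.0, beamwidth_deg=10.0,
                    noise_floor=1e-4):
    """Toy gain-pattern model: energy received by a sensor as a function of
    its angular offset from the pointing line.

    A Gaussian-shaped main lobe is assumed purely for illustration.
    """
    lobe = peak_power * np.exp(-0.5 * (angle_off_axis_deg / beamwidth_deg) ** 2)
    return np.maximum(lobe, noise_floor)   # far-off sensors bottom out in the noise floor

# Sensors 0, 5, 15 and 40 degrees off the pointing line:
print(expected_energy(np.array([0.0, 5.0, 15.0, 40.0])))
```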
- the television control device 320 may receive signals indicative of the energy received by the sensors of the sensor array.
- the television control device 320 may receive such signals in various manners, depending on the degree of integration of such sensors into the television 301 .
- the television control device 320 may receive such signals via a communication interface between the television control device 320 and the television 301 (e.g., via communication link 353 , or via a communication interface between the television 301 and television control device 320 via the television receiver 350 (e.g., via communication links 351 and 352 )).
- the television control device 320 may receive such signals via a communication link directly between the television control device 320 and the sensors, where such a communication link may be independent of other communication links between the television control device 320 and the television 301 .
- Such communication link may, for example, be adapted to communicate information from each sensor to the television control device 320 serially (e.g., in a time-multiplexed manner) and/or in parallel.
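- As one hypothetical illustration of such a serial (time-multiplexed) transfer, the sketch below packs (sensor index, energy) pairs into a simple frame; the frame layout is an assumption, not a defined protocol.

```python
import struct

def pack_sensor_frame(readings):
    """Serialize sensor readings for time-multiplexed transfer to the control
    device: a count followed by (index, energy) pairs."""
    frame = struct.pack("<H", len(readings))
    for index, energy in readings:
        frame += struct.pack("<Hf", index, energy)
    return frame

def unpack_sensor_frame(frame):
    """Recover the (index, energy) pairs from a packed frame."""
    (count,) = struct.unpack_from("<H", frame, 0)
    out = []
    for i in range(count):
        index, energy = struct.unpack_from("<Hf", frame, 2 + i * 6)
        out.append((index, energy))
    return out

# Round trip (float values return as 32-bit approximations):
frame = pack_sensor_frame([(0, 0.91), (1, 0.40), (5, 0.02)])
print(unpack_sensor_frame(frame))
```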
- the user interface module 240 may then, for example, provide information of such received sensor signals to the sensor processing module 253 for processing.
- the sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location.
- the sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below.
- the sensor processing module 253 may operate to select the sensor with the highest received energy and determine that the location of such selected sensor is the on-screen pointing location. For example, in an exemplary scenario where the spatial resolution of screen-integrated sensors is relatively fine, such operation may reliably yield a desired level of accuracy without undue processing overhead.
- the sensor processing module 253 may operate to select the sensor with the highest received energy and a plurality of sensors adjacent to such sensor. Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on weighting). For example, in a first dimension in which a sensor to the right of the highest energy sensor has a higher received energy than a sensor to the left of the highest energy sensor, the sensor processing module 253 may determine that the pointing location is to the right of the highest energy sensor. How much distance to the right may, for example, be determined as a function of the ratio between respective energies received by the right and left sensors. Such calculation may, for example, be a linear or non-linear calculation. Such calculation may also, for example, consider the expected energy pattern of a transmitting pointing device (e.g., in a scenario where energy fall-off is logarithmic as opposed to linear).
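- A minimal sketch of that ratio-based interpolation, assuming a regular 2-D grid of screen-integrated sensors and a linear energy fall-off (a logarithmic pattern would call for converting the energies first), might look like the following; all names are hypothetical.

```python
import numpy as np

def interpolate_pointing(energy_grid, pitch_x, pitch_y):
    """Estimate the on-screen pointing location from a 2-D grid of sensor
    energies using the highest-energy sensor and its immediate neighbors.

    energy_grid: (rows, cols) array of received energies.
    pitch_x/y:   sensor spacing in screen units (e.g., pixels).
    """
    e = np.asarray(energy_grid, dtype=float)
    r, c = np.unravel_index(np.argmax(e), e.shape)

    def offset(lo, hi):
        # Fractional shift toward the stronger neighbor, in (-0.5, 0.5).
        total = lo + hi
        return 0.0 if total <= 0 else 0.5 * (hi - lo) / total

    dx = offset(e[r, c - 1] if c > 0 else 0.0,
                e[r, c + 1] if c < e.shape[1] - 1 else 0.0)
    dy = offset(e[r - 1, c] if r > 0 else 0.0,
                e[r + 1, c] if r < e.shape[0] - 1 else 0.0)
    return (c + dx) * pitch_x, (r + dy) * pitch_y
```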
- the sensor processing module 253 may operate to select all sensors receiving a threshold amount of energy (e.g., an absolute threshold level, a threshold level relative to the highest energy sensor, etc.). Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on respective energy weighting). For example, the sensor processing module 253 may perform non-linear splining between sensors in a horizontal direction with sensor location on a first axis and sensor energy on a second axis. The sensor processing module 253 may then operate to select the point on the sensor location axis corresponding to the peak sensor energy on the vertical axis. Such splining and selecting may then be repeated in the vertical direction. Alternatively for example, the sensor processing module 253 may operate to perform multi-dimensional splining to create a surface based on sensor energy and select the highest point on such surface and the corresponding screen coordinates of such surface.
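- The per-axis splining described above might be sketched as follows, using a cubic spline through the above-threshold sensor energies and a dense-grid evaluation to locate the peak (rather than solving for derivative roots); the relative threshold and the use of SciPy are assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def spline_peak_position(positions, energies, threshold_ratio=0.25):
    """Return the 1-D screen coordinate at which a cubic spline through the
    above-threshold sensor energies peaks.

    positions: sensor coordinates along one screen axis (increasing order).
    energies:  corresponding received energies.
    """
    positions = np.asarray(positions, dtype=float)
    energies = np.asarray(energies, dtype=float)
    keep = energies >= threshold_ratio * energies.max()   # relative threshold
    p, e = positions[keep], energies[keep]
    if p.size < 3:                     # too few points to spline; fall back
        return p[np.argmax(e)]
    spline = CubicSpline(p, e)
    dense = np.linspace(p[0], p[-1], 1000)
    return dense[np.argmax(spline(dense))]

# Repeat once per axis, e.g.:
# x = spline_peak_position(col_positions, col_energies)
# y = spline_peak_position(row_positions, row_energies)
```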
- the sensor processing module 253 may operate to select a first sensor (e.g., the sensor with the highest received energy). Then, for example, the sensor processing module 253 may utilize information of the relative distance between the selected sensor and the pointing device (e.g., the television control device 320 ), information of the gain pattern for the signal transmitted from the pointing device to the selected sensor, and calibration information to determine where the pointing device may be pointed in order for the sensor to receive such energy. For example, this may result in a first closed figure (e.g., a circle, cloverleaf, etc.) drawn around the sensor on the screen plane.
- the sensor processing module 253 may repeat the procedure for a second sensor (e.g., a sensor with the second highest received energy), resulting in a second closed figure.
- the sensor processing module 253 may then, for example, determine the point(s) of intersection between the first and second figures. If only one point of intersection lies within the border of the screen, then such point of intersection might be utilized as an estimate of the pointing location.
- the sensor processing module 253 may repeat the procedure for a third sensor (e.g., the sensor with the third highest energy, a sensor generally along the line perpendicular to a line segment between the first and second sensors, etc.) and determine a point nearest the intersection of the first, second and third closed figures. Such a point of intersection may then be utilized as an estimate of the pointing location.
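- For illustration, if the closed figures are approximated as circles of radii inferred from the received energies (via the gain pattern and calibration), the intersection step reduces to standard circle geometry, as in the hedged sketch below.

```python
import math

def circle_intersections(c1, r1, c2, r2):
    """Return the 0, 1 or 2 intersection points of two circles, each drawn
    around a sensor with a radius inferred from that sensor's received energy.
    """
    (x1, y1), (x2, y2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []                                   # no usable intersection
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)      # distance from c1 to the chord
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    mx, my = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    if h == 0:
        return [(mx, my)]                           # circles are tangent
    ox, oy = h * (y2 - y1) / d, h * (x2 - x1) / d
    return [(mx + ox, my - oy), (mx - ox, my + oy)]

def on_screen(pt, width, height):
    """Keep only candidates that fall within the screen border."""
    x, y = pt
    return 0 <= x <= width and 0 <= y <= height
```
- Candidate points returned for the two highest-energy sensors can then be filtered with on_screen, and a third sensor's circle used to break any remaining tie, mirroring the three-figure step above.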
- the above-mentioned examples of screen-integrated sensors and related pointing location determinations were presented as exemplary illustrations. Though the above-mentioned examples generally discuss light and/or RF energy sensors, other types of sensors may also be integrated into a television screen or overlaid thereon.
- the sensors may comprise acoustic sensors that operate to sense acoustic energy (e.g., directed acoustic energy directed to a pointing location on the screen).
- directed acoustic energy may be formed at frequencies beyond the range of human hearing (e.g., and at frequencies beyond the range of pet hearing as well).
- various energy radiation patterns may be used, and/or a plurality of energy radiation patterns may be used.
- For example, a pointing device (e.g., the television control device 320 ) may transmit one or more energy emissions that move relative to the pointing direction (e.g., in a raster pattern or any other pattern).
- the television control device 320 may communicate information of such determined location in various manners.
- the sensor processing module 253 of the television control device 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 301 for presentation to the user.
- the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the user (e.g., on a display of the television control device 200 ). Such communication will also be addressed in the discussions of FIGS. 9-10 .
- FIG. 4 is a diagram illustrating an exemplary television system 400 with off-screen television sensors in accordance with various aspects of the present invention.
- the television system 400 includes a television 401 comprising a television screen 403 .
- the television system 400 also includes a television controller 420 (or other pointing device) pointing to an on-screen pointing location 430 along a pointing line 425 between the television controller 420 and the on-screen pointing location 430 .
- the television controller 420 may, for example, share any or all aspects with the exemplary television controllers 160 , 161 , 200 and 320 discussed previously and with all other television controllers discussed herein.
- the television control device 420 may, for example, be communicatively coupled directly to the television 401 via a communication link 453 .
- the television control device 420 may also, for example, be communicatively coupled directly to the television receiver 450 via communication link 452 .
- the television control device 420 may additionally, for example, be communicatively coupled indirectly to the television 401 via the television receiver 450 through communication links 451 and 452 . Accordingly, various aspects of the television control device 420 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2 .
- the television system 400 also comprises a television receiver 450 that is communicatively coupled to the television 401 via a communication link 451 (e.g., a two-way communication link providing video information to the television 401 and/or receiving sensor information from the television 401 for communication to the television control device 420 ).
- the exemplary television receiver 450 is also communicatively coupled to the television controller 420 via a communication link 452 .
- the exemplary television 401 comprises an array of sensors integrated into the television 401 around the border of the screen 403 .
- Three of such sensors are labeled 410 , 411 and 412 .
- any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes), RF sensors (e.g., antenna elements), acoustic sensors (e.g., microphones), etc.
- the array of sensors may be integrated around the television screen 403 in any of a variety of manners.
- such sensors may be integrated in a border of the television screen 403 that is not used for outputting video content.
- Such a configuration may, for example, avoid sensor interference with video content being displayed on the screen.
- such sensors may be mounted to a border material of the television 401 .
- For example, an array of photo detectors (e.g., photo diodes) and/or antenna elements (e.g., individual antennas or elements of an antenna array, for example, a phased array) may be positioned around the border of the screen 403 .
- every screen pixel row and/or column may be associated with a pair of corresponding photo diodes and/or antenna elements, or every N×M block of screen pixels may be associated with one or more corresponding photo diodes or antenna elements (e.g., a row and column sensor, two row and two column elements, etc.).
- passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source (e.g., a directional light source of the television control device 420 ) pointed at the screen.
- Also, for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
- photo detectors may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections.
- photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc.
- the photo detectors integrated with the television body off-screen may comprise photo diodes that operate to detect energy from a laser pointer or directed infrared energy from a television controller or other pointing device.
- various aspects may comprise mounting (e.g., adhering) sensors to the television body off-screen. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
- an array of antenna elements may be positioned around the border of the screen 403 .
- Such passive antennas (or elements of an overall antenna matrix) may receive varying respective amounts of RF energy depending on the pointing direction of a directional RF source (e.g., a directional RF source of the television control device 420 or other pointing device) aimed at the screen.
- Also, for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
- various aspects may comprise mounting (e.g., adhering) sensors to the television body off-screen. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
- a user may point a pointing device (e.g., a remote controller 420 , a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 403 , where the pointing device directs transmitted energy (e.g., light and/or RF energy and/or acoustic energy) at a particular location on the television screen 403 to which the device is being pointed.
- transmitted energy will likely be transmitted directionally and be associated with an intensity or gain pattern with the highest intensity likely at the center of the pattern (i.e., along the pointing line 425 ) and decreasing as a function of angle from the center of the pattern.
- Such a gain pattern is generally represented in FIG. 4 by the concentric circles around the on-screen pointing location 430 . Note, however, that in practice such a gain pattern is likely to be more complex than the illustrated pattern (e.g., including lobes with respective peaks and nulls).
- each sensor of the sensors integrated into the television 401 around the border of the screen 403 will likely receive some respective amount of energy.
- the sensor nearest the screen pointing location 430 (i.e., along the pointing line 425 ) will likely receive the highest amount of energy,
- sensors along the particular axis adjacent to the screen pointing location 430 will likely receive a next highest range of energy
- sensors farther from the pointing location 430 will likely receive progressively smaller amounts of energy from the pointing device (e.g., the television control device 420 ), as a function of distance from the pointing location 430 or as a function of the angular displacement from the pointing line 425 , until such energy is lost in the noise floor.
- sensor 410 is closest to the pointing location 430 and will likely receive the highest energy, with sensors adjacent to the left and right of sensor 410 receiving the next highest amounts of energy, and so on. Also, along the vertical axis, sensors 411 and 412 will likely receive close to the highest amount of energy, with sensors above and below such sensors 411 , 412 receiving the next highest amounts of energy and so on.
- the television control device 420 may receive signals indicative of the energy received by the sensors of the television 401 .
- the television control device 420 may receive such signals in various manners, depending on the degree of integration of such sensors into the television 401 .
- the television control device 420 may receive such signals via a communication interface between the television control device 420 and the television 401 (e.g., via communication link 453 or via a communication interface between the television 401 and the television control device 420 via the television receiver 450 (e.g., via communication links 451 and 452 )).
- the television control device 420 may receive such signals via a communication link directly between the television control device 420 and the sensors, where such a communication link may be independent of other communication links between the television control device 420 and the television 401 .
- Such communication link(s) may, for example, be adapted to communicate information from each sensor to the television control device 420 serially (e.g., in a time-multiplexed manner) and/or in parallel.
- the user interface module 240 may then, for example, provide information of such received sensor signals to the sensor processing module 253 for processing.
- the sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location.
- the sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below.
- the sensor processing module 253 may operate to select the sensor with the highest received energy along each of the horizontal and vertical axes and determine that the respective locations of such selected sensors correspond to the horizontal and vertical coordinates of the on-screen pointing location. For example, in an exemplary scenario where the spatial resolution of screen border sensors is relatively fine, such operation may reliably yield a desired level of accuracy without undue processing overhead. For example, the sensor processing module 253 may determine that sensors 410 and 411 have the highest received energy for the horizontal and vertical axes, respectively, and thus determine that the on-screen pointing location is represented in the horizontal axis by the horizontal location of the sensor 410 and represented in the vertical axis by the vertical location of the sensor 411 .
- the sensor processing module 253 may select a midpoint between such sensors (e.g., the vertical midpoint between sensors 411 and 412 ).
- the sensor processing module 253 may operate to select, for each screen axis, the sensor with the highest received energy and a plurality of sensors adjacent to such sensor. Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on weighting). For example, in the horizontal dimension in which a sensor to the right of the highest energy sensor 410 has a higher received energy than a sensor to the left of the highest energy sensor 410 , the sensor processing module 253 may determine that the pointing location along the horizontal axis is to the right of the highest energy sensor 410 . How much distance to the right may, for example, be determined as a function of the ratio between respective energies received by the right and left sensors.
- Such calculation may, for example, be a linear or non-linear calculation. Such calculation may also, for example, consider the expected energy pattern of a transmitting pointing device (e.g., in a scenario where energy fall-off is logarithmic as opposed to linear). The sensor processing module 253 may then, for example, repeat such operation in the vertical direction.
- the sensor processing module 253 may operate to select all sensors in each of the axes receiving a threshold amount of energy (e.g., an absolute threshold level, a threshold level relative to the highest energy sensor, etc.). Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on respective energy weighting). For example, the sensor processing module 253 may perform non-linear splining between sensors in a horizontal direction with sensor location on a first axis and sensor energy on a second axis. The sensor processing module 253 may then operate to select the point on the sensor location axis corresponding to the peak sensor energy on the vertical axis.
- Such splining and selecting may then be repeated in the vertical screen direction.
- the sensor processing module 253 may operate to perform multi-dimensional splining to create a surface based on sensor energy and select the highest point on such surface and the corresponding screen coordinates of such surface.
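- One stand-in for that multi-dimensional surface approach, sketched below, interpolates the scattered border-sensor energies over the screen interior (linear interpolation over a Delaunay triangulation via SciPy's griddata) and reads off the coordinates of the interpolated maximum; the interpolation method and grid resolution are assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

def surface_peak(sensor_xy, energies, width, height, resolution=200):
    """Interpolate border-sensor energies over the screen area and return the
    coordinates of the interpolated peak.

    sensor_xy: (S, 2) sensor positions around the screen border.
    energies:  (S,) received energies.
    """
    xs = np.linspace(0, width, resolution)
    ys = np.linspace(0, height, resolution)
    gx, gy = np.meshgrid(xs, ys)
    surface = griddata(np.asarray(sensor_xy, dtype=float),
                       np.asarray(energies, dtype=float),
                       (gx, gy), method="linear")
    surface = np.nan_to_num(surface, nan=-np.inf)   # grid points outside the hull
    iy, ix = np.unravel_index(np.argmax(surface), surface.shape)
    return xs[ix], ys[iy]
```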
- the television control device 420 may communicate information of such determined location in various manners.
- the sensor processing module 253 of the television control device 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 401 for presentation to the user.
- the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the user (e.g., on a display of the television control device 200 ). Such communication will also be addressed in the discussions of FIGS. 9-10 .
- FIG. 5 is a diagram illustrating an exemplary television system 500 with off-television sensors in accordance with various aspects of the present invention.
- the television system 500 includes a television 501 comprising a television screen 503 .
- the television system 500 also includes a television controller 520 (or other pointing device) pointing to an on-screen pointing location 530 along a pointing line 525 between the television controller 520 and the on-screen pointing location 530 .
- the television controller 520 may, for example, share any or all aspects with the exemplary television controllers 160 , 161 , 200 , 320 and 420 discussed previously and with all other television controllers discussed herein. Accordingly, various aspects of the television control device 520 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2 .
- the television control device 520 may, for example, be communicatively coupled directly to the television 501 via a communication link (not illustrated).
- the television control device 520 may also, for example, be communicatively coupled directly to the television receiver 550 via communication link 562 .
- the television control device 520 may additionally, for example, be communicatively coupled indirectly to the television 501 via the television receiver 550 through communication links 561 and 562 . Accordingly, various aspects of the television control device 520 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2 .
- the television system 500 also comprises a television receiver 550 that is communicatively coupled to the television 501 via a communication link 561 (e.g., a two-way communication link providing video information to the television 501 and/or receiving sensor information from the television 501 for communication to the television control device 520 ).
- the television control device 520 is illustrated with one or more communication links 563 to the various sensors 551 - 556 independent of other communication links (e.g., links to the television 501 , links to the television receiver 550 , etc.).
- In such a configuration, the television control device 520 (e.g., a user interface module 240 thereof) may communicate with the sensors 551 - 556 via the communication link(s) 563 .
- the exemplary television control device 520 may also be communicatively coupled to other pointing devices and/or television control devices.
- the exemplary television system 500 comprises an array of sensors integrated into audio speaker components (e.g., surround sound speakers) positioned around the television 501 .
- the television system 500 comprises a left speaker 531 comprising a top sensor 552 and a bottom sensor 551 .
- the television system 500 comprises a right speaker 533 comprising a top sensor 556 and a bottom sensor 555 .
- the television system 500 comprises a center speaker 532 comprising a left sensor 553 and a right sensor 554 .
- any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes), RF sensors (e.g., antenna elements), acoustic sensors (e.g., microphones), etc.
- the audio speaker component example discussed herein is merely illustrative and that such sensors may be installed in any of a variety of locations (e.g., dedicated sensor boxes, attached to furniture, etc.).
- the array of sensors may be positioned around the television 501 in any of a variety of manners.
- such sensors may be positioned around the television 501 generally in the same plane as the television screen 503 .
- on-screen pointing location may be determined in a manner similar to the interpolation and/or gain pattern intersection discussed above with regard to off-screen and/or on-screen sensors.
- a calibration procedure may be implemented (e.g., by the calibration module 251 ). Such calibration will be discussed in more detail below.
- For example, one or more photo detectors (e.g., photo diodes) and/or antenna elements (e.g., individual antennas or elements of an antenna array) may be positioned in each of the audio speaker components 531 - 533 .
- passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source (e.g., a directional light source of the television control device 520 ) aimed at the screen.
- For example, directed energy (e.g., light, RF, acoustic, etc.) transmitted from a pointing device (e.g., the television control device 520 ) toward an on-screen pointing location may also be received by sensors positioned off-screen (or even off-television).
- received signals may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
- photo diodes may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections, ambient light, room lighting, etc.
- photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc.
- the photo detectors integrated with off-television components may comprise photo diodes that operate to detect energy from a laser pointer or directed infrared energy from a television controller (or other pointing device).
- various aspects may comprise mounting (e.g., adhering) sensors to various off-television components. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
- an array of antenna elements may be positioned around off-television components (e.g., in surround sound components).
- Such passive antennas (or elements of an overall antenna matrix) may receive varying amounts of RF energy depending on the pointing direction of a directional RF source aimed at the screen. Also, for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
- various aspects may comprise mounting (e.g., adhering) sensors to the off-television components. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
- a user may point a pointing device (e.g., a television controller 520 (e.g., a remote control device), a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 503 , where the pointing device directs transmitted energy (e.g., light and/or RF energy and/or acoustic energy) at a particular location on the television screen 503 to which the user is pointing with the pointing device.
- Such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain pattern with the highest intensity at the center of the pattern (i.e., along the pointing line 525 ) and decreasing as a function of angle from the center of the pattern (or distance from the center point).
- Such a gain pattern was discussed previously in the discussion of FIG. 4 .
- each sensor of the sensors integrated into the television system 500 off-television will likely receive some respective amount of energy.
- the sensor nearest to the screen pointing location 530 (i.e., along the pointing line 525 ) will likely receive the highest amount of energy,
- a sensor next nearest to the screen pointing location 530 will likely receive a next highest range of energy
- sensors farther from the pointing location 530 will likely receive progressively smaller amounts of energy from the pointing device (e.g., the television control device 520 ), as a function of distance from the pointing location 530 and/or angle off the pointing line 525 (e.g., until such energy is lost in the noise floor).
- sensor 553 is nearest to the pointing location 530 and will likely receive the highest energy
- sensor 552 is next nearest to the pointing location 530 and will likely receive the next highest energy, and so on.
- signals from a same sensor may be utilized in determining multiple axes of pointing location.
- a calibration procedure may be performed when the system 500 is configured to assist in such pointing determination.
- the television control device 520 may receive signals indicative of the energy received by the sensors of the television system 500 .
- the television control device 520 may receive such signals in various manners, depending on the degree of integration of such sensors into the television 501 .
- the television control device 520 may receive such signals via a communication interface between the television control device 520 and the respective off-television components (e.g., via a communication link 563 between the television control device 520 and the surround sound speaker components 531 - 533 ).
- the television control device 520 may receive such signals via a communication link directly between the television control device 520 and the individual sensors (e.g., communication link 563 ), where such a communication link may be independent of other communication links between the television control device 520 and the television 501 and/or independent of other communication links between the television control device 520 and other television system 500 components (e.g., television receiver 550 and the surround sound speaker components 531 - 533 ).
- the user interface module 240 may then, for example, provide information of such received sensor signals to the sensor processing module 253 for processing.
- the sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location.
- the sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below.
- the sensor processing module 253 may operate to estimate a position between sensor positions based on relative sensor energy.
- sensor 552 may correspond to a relatively high amount of energy
- sensor 556 may correspond to a relatively low amount of received energy.
- the sensor processing module 253 may, for example, estimate a horizontal position relatively closer to sensor 552 by an amount proportional to the relative difference between respective amounts of energy.
- the sensor processing module 253 may perform a similar estimation utilizing sensors 551 and 555 . Various horizontal position estimations may then be averaged.
- respective energies for the left speaker 531 sensors may be averaged, respective energies for the right speaker 533 sensors may be averaged, and such left and right speaker average energies may then be utilized to determine a horizontal pointing location.
- the sensor processing module 253 may then, for example, perform a similar pointing direction estimate in the vertical direction.
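- A hedged sketch of that left/right and top/bottom weighting follows, using only the four corner speaker sensors 551 , 552 , 555 and 556 (the center-speaker sensors 553 and 554 are omitted for brevity); the energy-proportional weighting rule itself is an assumption.

```python
def weighted_coordinate(pos_a, energy_a, pos_b, energy_b):
    """Place the estimate between two sensor coordinates, proportionally
    closer to the sensor reporting more energy."""
    total = energy_a + energy_b
    if total <= 0:
        return 0.5 * (pos_a + pos_b)
    return (pos_a * energy_a + pos_b * energy_b) / total

def estimate_from_speaker_sensors(readings, positions):
    """readings / positions: dicts keyed by sensor id (551-556) holding the
    received energy and the calibrated (x, y) location of each sensor.
    Horizontal estimates from the top pair (552, 556) and bottom pair
    (551, 555) are averaged; the vertical estimate averages the left-speaker
    pair (551, 552) and right-speaker pair (555, 556).
    """
    x_top = weighted_coordinate(positions[552][0], readings[552],
                                positions[556][0], readings[556])
    x_bot = weighted_coordinate(positions[551][0], readings[551],
                                positions[555][0], readings[555])
    y_left = weighted_coordinate(positions[551][1], readings[551],
                                 positions[552][1], readings[552])
    y_right = weighted_coordinate(positions[555][1], readings[555],
                                  positions[556][1], readings[556])
    return 0.5 * (x_top + x_bot), 0.5 * (y_left + y_right)
```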
- a calibration procedure may be performed to determine an expected sensor energy level (e.g., absolute or relative) when the user is pointing at the sensor.
- For example, based at least in part on such calibration information and the energy received by a first sensor, a first line (e.g., a circle or arc) of candidate pointing locations may be determined; based on the energy received by a second sensor, a second line (e.g., a circle or arc) may be determined; and the intersection of the first and second lines may be utilized as an estimate of pointing location. Additional lines associated with other sensors may also be utilized. Such additional lines may, for example, be utilized when selecting between multiple line intersections and/or for greater accuracy or resolution.
- the television control device 520 may communicate information of such determined location in various manners.
- the sensor processing module 253 of the television control device 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 501 for presentation to the user.
- the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the user (e.g., on a display of the television control device 200 ). Such communication will also be addressed in the discussions of FIGS. 9-10 .
- FIG. 6 is a diagram illustrating an exemplary television system 600 with television receiver sensors in accordance with various aspects of the present invention.
- the television system 600 includes a television 601 comprising a television screen 603 .
- the television system 600 also includes a television controller 620 (or other pointing device) pointing to an on-screen pointing location 630 along a pointing line 625 between the television controller 620 and the on-screen pointing location 630 .
- the television controller 620 may, for example, share any or all aspects with the exemplary television controllers 160 , 161 , 200 , 320 , 420 and 520 discussed previously and with all other television controllers discussed herein. Accordingly, various aspects of the television control device 620 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2 .
- the television control device 620 may, for example, be communicatively coupled directly to the television 601 via a communication link (not illustrated).
- the television control device 620 may also, for example, be communicatively coupled directly to the television receiver 650 via communication link 653 .
- the television control device 620 may additionally, for example, be communicatively coupled indirectly to the television 601 via the television receiver 650 through communication links 651 and 652 . Accordingly, various aspects of the television control device 620 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2 .
- the television system 600 also comprises a television receiver 650 that is communicatively coupled to the television 601 via a communication link 651 (e.g., a two-way communication link providing video information to the television 601 and/or communicating sensor information and/or screen pointing information with the television 601 ).
- the television receiver 650 comprises an array of screen pointing sensors. A portion of the sensors are labeled ( 661 - 665 ) for discussion purposes. Note that such sensors may be arranged in any of a variety of configurations (e.g., matrix configuration, border configuration, placed only at the front corners, etc.).
- the pointing sensors may, for example, be integrated into the television receiver 650 and/or attached to the television receiver 650 in any of a variety of manners (e.g., in any manner similar to those discussed previously with regard to the televisions and/or television system components discussed previously).
- the television control device 620 may receive additional sensor information from other sensors via the television communication line 653 and/or other communication links.
- the exemplary television control device 620 is also communicatively coupled to the television receiver 650 via a communication link 652 .
- the exemplary television receiver 650 comprises an array of sensors integrated into the television receiver 650 .
- the television receiver 650 comprises a lower left sensor 661 , upper left sensor 662 , upper right sensor 663 , lower right sensor 664 and center sensor 665 .
- any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes), RF sensors (e.g., antenna elements), acoustic sensors (e.g., microphones), etc.
- the exemplary television receiver 650 may be positioned around the television 601 in any of a variety of manners.
- the television receiver 650 (and thus the sensors) may be positioned around the television 601 in an orientation such that the front face of the television receiver 650 (and thus the sensors) is generally in the same plane as the television screen 603 .
- Such placement is not necessary, but may be advantageous from an accuracy perspective.
- on-screen pointing location may be determined in a manner similar to the interpolation and/or gain pattern intersection discussed above with regard to off-screen and/or on-screen sensors.
- a calibration procedure may be implemented (e.g., by the calibration module 251 ). Such calibration was discussed previously and will also be revisited below.
- For example, one or more photo detectors (e.g., photo diodes) and/or antenna elements (e.g., individual antennas or elements of an antenna array) may be positioned on the television receiver 650 .
- additional sensors positioned away from the television receiver 650 may also be utilized (e.g., any of the previously discussed sensor placements).
- passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source (e.g., a directional light source of the television control device 620 ) aimed at the screen.
- For example, directed energy (e.g., light, RF, acoustic, etc.) transmitted from a pointing device toward an on-screen pointing location may also be received by sensors off-screen (e.g., sensors integrated into the television receiver 650 ).
- Also, for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination).
- photo diodes may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections, ambient light, room lighting, etc.
- photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc.
- the photo detectors integrated with the television receiver 650 may comprise photo diodes that operate to detect energy from a laser pointer or directed infrared energy from the television control device 620 (or other pointing device).
- various aspects may comprise mounting (e.g., adhering) sensors to various television receiver 650 locations and/or to various off-receiver components.
- Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
- an array of antenna elements may be positioned at locations on the television receiver 650 (e.g., only on the television receiver 650 and/or at locations around the television receiver 650 ).
- such antenna elements may, for example, comprise passive antennas or elements of an overall antenna matrix. Received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction), which may also be utilized in a pointing determination.
- various aspects may comprise mounting (e.g., adhering) sensors to the television receiver 650 . Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
- a user may point a pointing device (e.g., the remote control device 620 , a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 603 , where the pointing device directs transmitted energy (e.g., light and/or RF energy and/or acoustic energy) at a particular location on the television screen 603 to which the user is pointing with the pointing device.
- Such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain (or energy) pattern with the highest intensity at the center of the pattern (i.e., along the pointing line 625 ) and decreasing as a function of angle from the center of the pattern.
- each of the sensors integrated into the television receiver 650 (located off-television) will likely receive some respective amount of energy.
- the sensor nearest to the screen pointing location 630 (i.e., along the pointing line 625) will likely receive the highest amount of energy, a sensor next nearest to the screen pointing location 630 will likely receive a next highest range of energy, and sensors farther away from the pointing location 630 will likely receive progressively less energy from the pointing device 620, as a function of distance from the pointing location 630 and/or angle off the pointing line 625 (e.g., until such energy is lost in the noise floor).
- for example, sensor 662 is nearest to the pointing location 630 and will likely receive the highest energy, sensors 661 and 663 are farther from the pointing location 630 and will likely receive correspondingly less energy, and so on.
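- As a non-limiting illustrative sketch of this falloff behavior (assuming, purely for illustration, a Gaussian falloff with angle off the pointing line, flattened 2-D coordinates, and no distance falloff), the relative energy at each sensor for a given pointing location might be modeled as follows:
```python
# Sketch: model the relative energy each off-screen sensor receives as a
# function of its angle off the pointing line (assumed Gaussian falloff with
# angle; distance falloff is omitted for simplicity).
import math

def expected_energies(sensor_xy, pointing_xy, source_xy, beamwidth_deg=15.0):
    """Relative energy per sensor for a device at source_xy aimed at pointing_xy."""
    aim = (pointing_xy[0] - source_xy[0], pointing_xy[1] - source_xy[1])
    aim_len = math.hypot(*aim)
    energies = []
    for sx, sy in sensor_xy:
        vec = (sx - source_xy[0], sy - source_xy[1])
        dist = math.hypot(*vec)
        # Angle between the pointing line and the line from the device to this sensor.
        cos_a = (aim[0] * vec[0] + aim[1] * vec[1]) / (aim_len * dist)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        energies.append(math.exp(-0.5 * (angle / beamwidth_deg) ** 2))
    return energies

# Corner sensors (illustrative 2-D coordinates, meters), device 2 m back,
# aimed near the upper left sensor, which therefore scores highest.
sensors = [(0.0, 0.0), (0.0, 0.6), (1.0, 0.6), (1.0, 0.0)]
print(expected_energies(sensors, pointing_xy=(0.1, 0.5), source_xy=(0.5, -2.0)))
```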
- signals from a same sensor may be utilized in determining multiple axes of pointing location.
- a calibration procedure may be performed when the system 600 is configured to assist in such pointing determination.
- the television control device 620 may receive signals indicative of the energy received by the sensors of the television receiver 650 (e.g., via the communication link 652 between the television control device 620 and the television receiver 650 and/or via a communication link directly between the television control device 620 and the sensors).
- the television control device 620 may receive such signals in various manners, depending on the degree of integration of such sensors into the television receiver 650 and/or various components of the television system 600.
- the television control device 620 may receive such signals via communication link 652 .
- the television control device 620 may receive information from such sensors via direct communication link or via communication link with the various components with which such sensors are integrated.
- the communication module 230 may then, for example, provide information of such received sensor signals to the sensor processing module 253 for processing.
- the sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location.
- the sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below.
- the sensor processing module 253 may operate to estimate a position between sensor positions based on relative sensor energy.
- for example, sensor 662 may correspond to a relatively high amount of received energy, while sensor 663 may correspond to a relatively low amount of received energy.
- the sensor processing module 253 may, for example, estimate a horizontal position relatively closer to sensor 662 by an amount proportional to the relative difference between respective amounts of energy.
- the sensor processing module 253 may perform a similar estimation utilizing sensors 661 and 664 . Various horizontal position estimations may then be averaged.
- respective energies for the left side sensors 661, 662 may be averaged, respective energies for the right side sensors 663, 664 may be averaged, and such left and right average sensor energies may then be utilized (e.g., in conjunction with energy pattern characteristics) to determine a horizontal pointing location.
- the sensor processing module 253 may then, for example, perform a similar pointing direction estimate in the vertical direction.
- Such horizontal and/or vertical positions may, for example, be translated between respective locations/directions of the sensor arrangement and respective locations/directions of the television screen 603 .
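- As a non-limiting illustrative sketch of such proportional estimation (an energy-weighted average over the four corner sensors is used here as one possible interpolation; the normalized sensor coordinates and energy values are assumptions for illustration, and a real implementation would apply the calibration-derived mapping between sensor positions and screen positions):
```python
# Sketch: estimate horizontal and vertical pointing location as an
# energy-weighted average of sensor positions expressed in screen coordinates.

def weighted_axis_estimate(positions, energies):
    """Energy-weighted average of sensor positions along one axis."""
    total = sum(energies)
    if total == 0:
        return None  # no usable signal
    return sum(p * e for p, e in zip(positions, energies)) / total

# Normalized screen coordinates assumed for sensors 661-664 and illustrative energies.
sensor_x = {661: 0.0, 662: 0.0, 663: 1.0, 664: 1.0}   # 0 = left edge, 1 = right edge
sensor_y = {661: 1.0, 662: 0.0, 663: 0.0, 664: 1.0}   # 0 = top edge, 1 = bottom edge
energy = {661: 0.4, 662: 0.9, 663: 0.5, 664: 0.2}     # measured, illustrative

x_est = weighted_axis_estimate([sensor_x[s] for s in energy], list(energy.values()))
y_est = weighted_axis_estimate([sensor_y[s] for s in energy], list(energy.values()))
print(f"estimated on-screen pointing location: ({x_est:.2f}, {y_est:.2f})")
```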
- Calibration procedures may, for example, be utilized to establish the spatial relationship between the sensor positioning and on-screen location.
- a calibration procedure may be performed to determine an expected sensor energy level (e.g., absolute or relative) when the user is pointing at the sensor (and/or other known locations).
- based on such calibration information and measured sensor energies, a first line (e.g., a circle or arc) along which the pointing location is expected to lie may be determined for a first sensor, a second line (e.g., a circle or arc) may be determined for a second sensor, and the intersection of the first and second lines utilized as an estimate of pointing location. Additional lines associated with other sensors may also be utilized.
- Such additional lines may, for example, be utilized when selecting between multiple line intersections or to increase accuracy and/or resolution of the pointing determination.
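- As a non-limiting illustrative sketch of such a line intersection (assuming each measured energy has already been converted, via a calibrated falloff model, into a circle radius about the corresponding sensor; the centers and radii in the example are arbitrary):
```python
# Sketch: intersect two energy-derived circles to estimate the pointing location.
import math

def circle_intersections(c0, r0, c1, r1):
    """Return 0, 1 or 2 intersection points of two circles in screen coordinates."""
    dx, dy = c1[0] - c0[0], c1[1] - c0[1]
    d = math.hypot(dx, dy)
    if d == 0 or d > r0 + r1 or d < abs(r0 - r1):
        return []  # no intersection (or coincident centers)
    a = (r0 ** 2 - r1 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(0.0, r0 ** 2 - a ** 2))
    xm, ym = c0[0] + a * dx / d, c0[1] + a * dy / d
    pts = [(xm + h * dy / d, ym - h * dx / d), (xm - h * dy / d, ym + h * dx / d)]
    return pts[:1] if h == 0 else pts

# Example with assumed sensor centers and energy-derived radii (arbitrary units).
print(circle_intersections((0.0, 0.0), 0.8, (1.0, 0.0), 0.6))
```
When two intersection points result, additional sensors (or plausibility constraints such as staying within the screen area) may be used to choose between them, consistent with the discussion above.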
- such a line-intersection solution may be applied to any of the previously discussed scenarios (e.g., as illustrated in FIGS. 3-5) or other scenarios discussed herein.
- A non-limiting example of this was presented in the discussion of FIG. 3, and another example will be provided in the following discussion of FIG. 7.
- the television control device 620 may communicate information of such determined location in various manners.
- the sensor processing module 253 of the television control device 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 601 for presentation to the user on the television screen 603 .
- the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the user (e.g., on a display of the television control device 620 ). Such communication will also be addressed in the discussions of FIGS. 9-10 .
- FIG. 7 is a diagram illustrating an exemplary television system 700 utilizing pointing device sensors in accordance with various aspects of the present invention.
- the exemplary television system 700 includes a television 701 having a television screen 703 .
- the television system 700 also includes a television controller 720 (or other pointing device) pointing to an on-screen pointing location 730 along a pointing line 725 between the television controller 720 and the on-screen pointing location 730 .
- the television controller 720 may, for example, share any or all aspects with the exemplary television controllers 160 , 161 , 200 , 320 , 420 , 520 and 620 discussed previously and with all other television controllers discussed herein. Accordingly, various aspects of the television control device 720 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2 .
- the television control device 720 may, for example, be communicatively coupled directly to the television 701 via a communication link 753 .
- the television control device 720 may also, for example, be communicatively coupled directly to the television receiver 750 via communication link 752 .
- the television control device 720 may additionally, for example, be communicatively coupled indirectly to the television 701 via the television receiver 750 through communication links 751 and 752.
- the television system 700 also comprises a television receiver 750 that is communicatively coupled to the television 701 via a communication link 751 (e.g., a two-way communication link providing video information to the television 701 and/or receiving sensor information from the television 701 ).
- the exemplary television receiver 750 is also communicatively coupled to the television controller 720 via a communication link 752 .
- sensor information may be communicated to the television control device 720 (e.g., via internal communication link). Such information may then be communicated to the sensor processing module 253 for the determination of an on-screen pointing location.
- the television 701 includes eight emitters (e.g., light emitters, RF transmitters, etc.) located around the border of the television screen 703 .
- emitters may be positioned anywhere proximate the television system 700 .
- the television 701 includes a first emitter 711 , second emitter 712 , third emitter 713 , fourth emitter 714 , fifth emitter 715 , sixth emitter 716 , seventh emitter 717 and eighth emitter 718 .
- Such emitters may each emit a signal that may be received at sensors on-board the television control device 720 . Such sensors may, for example, make up a directional receiver.
- the controller 720 (or other pointing device) may be pointed to a location 730 on the screen 703 along a pointing line 725 .
- the sensors on-board the controller 720 will receive the emitted signals at respective signal levels.
- Such sensor signals may then be processed in a manner similar to the manners discussed above to determine the on-screen pointing direction for the pointing device 720 .
- the pointing device may measure respective signal energies received from each of the emitters (e.g., each distinguishable by frequency, coding, timing and/or timeslotting, etc.) and communicate such information to the television receiver 750 .
- the sensor processing module 253 may, for example, select a first emitter 712 (e.g., the emitter corresponding to the highest energy received at the pointing device). The sensor processing module 253 may then process the location of the pointing device, the receive gain pattern for the pointing device, and the energy received from the first emitter 712 to determine a first figure (e.g., an arc 752 ) along which the pointing device, if pointed, would be expected to receive the measured energy. Similarly, the sensor processing module 253 may perform such a procedure for a second emitter 711 resulting in a second figure (e.g., an arc 751 ). The intersection of such arcs may be utilized as an estimate of on-screen pointing location.
- the sensor processing module 253 may perform such a procedure for a third emitter 714 resulting in a third figure (e.g., an arc 754 ), and so on.
- the intersection of the three arcs 752 , 751 , 754 may then be utilized as an estimate of on-screen pointing location.
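- As a non-limiting illustrative sketch of combining three or more such arcs (a least-squares fit is used here as one possible way to reconcile arcs that, due to noise, may not intersect at exactly one point; the emitter positions and radii are illustrative):
```python
# Sketch: estimate the point whose distances to the arc centers best match
# the energy-derived radii, by linearizing the circle equations.
import numpy as np

def least_squares_arc_fit(centers, radii):
    """Best-fit intersection point of several circles (centers, radii)."""
    c = np.asarray(centers, dtype=float)
    r = np.asarray(radii, dtype=float)
    # (x - cx_i)^2 + (y - cy_i)^2 = r_i^2 ; subtract equation 0 from the rest
    # to obtain a linear system in (x, y).
    A = 2 * (c[1:] - c[0])
    b = (r[0] ** 2 - r[1:] ** 2) + np.sum(c[1:] ** 2, axis=1) - np.sum(c[0] ** 2)
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    return tuple(xy)

# Example: three emitter-derived arcs (positions and radii are illustrative).
print(least_squares_arc_fit([(0, 0), (1, 0), (0, 1)], [0.7, 0.6, 0.8]))
```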
- the solution need not be based on a known position (location) of the pointing device, nor on absolute received energy levels.
- differences in received energy from the various emitters may be processed with or without position information of the pointing device.
- the pointing device 720 may have six degrees of freedom (e.g., three positional degrees of freedom and three orientational degrees of freedom).
- the unknown six degrees of freedom for the pointing device 720 may be ascertained by processing six known values related to such six degrees of freedom (e.g., related by a known signal energy pattern).
- measurements associated with six emitters on the television may be utilized to solve for the six degrees of freedom of the pointing device 720 .
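- As a non-limiting illustrative sketch of such a solve (the Gaussian gain model, emitter layout, and use of a generic nonlinear least-squares routine are assumptions for illustration; note that a perfectly symmetric beam leaves roll unobservable, so a practical model would need some asymmetry or an additional measurement):
```python
# Sketch: recover position (x, y, z) and orientation (yaw, pitch, roll) of the
# pointing device from six emitter energy measurements, given an assumed
# directional gain model relating state to expected energies.
import numpy as np
from scipy.optimize import least_squares

EMITTERS = np.array([[x, y, 0.0] for x in (0.0, 0.5, 1.0) for y in (0.0, 1.0)])

def predicted_energies(state, emitters=EMITTERS, beamwidth=0.3):
    x, y, z, yaw, pitch, _roll = state   # roll has no effect on a symmetric beam
    pos = np.array([x, y, z])
    boresight = np.array([np.cos(pitch) * np.sin(yaw),
                          np.sin(pitch),
                          -np.cos(pitch) * np.cos(yaw)])
    to_emitters = emitters - pos
    dists = np.linalg.norm(to_emitters, axis=1)
    cos_ang = to_emitters @ boresight / dists
    angles = np.arccos(np.clip(cos_ang, -1.0, 1.0))
    # Assumed model: Gaussian falloff with angle, inverse-square with distance.
    return np.exp(-0.5 * (angles / beamwidth) ** 2) / dists ** 2

def solve_six_dof(measured, initial_guess=(0.5, 0.5, 2.0, 0.0, 0.0, 0.0)):
    residual = lambda s: predicted_energies(s) - measured
    return least_squares(residual, initial_guess).x

# Round-trip check against a synthetic ground-truth state.
truth = np.array([0.4, 0.6, 1.8, 0.1, -0.05, 0.0])
print(solve_six_dof(predicted_energies(truth)))
```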
- the calibration module 251 of the television control device 200 may operate to perform calibration operations. Such calibrating may be performed in any of a variety of manners. For example and without limitation, calibration may be utilized to determine expected received energy when transmitters and receivers are located and oriented in a particular manner.
- a non-limiting example of a calibration procedure may comprise presenting an on-screen target at various locations and measuring respective sensor signals received when the pointing device is being pointed at such targets.
- a calibration procedure may comprise directing a user (e.g., using the user interface module 240 ) to point to each of a plurality of sensors to determine an expected amount of received energy when the user is pointing directly at such sensors.
- a signal energy (or gain) pattern may be utilized in various on-screen pointing determinations.
- Such an energy (or gain) pattern may be predefined for a particular pointing device (e.g., at the factory), but may also be measured by the television control device 200 .
- the calibration module 251 may direct the user to utilize a pointing device to point to a location on the screen and process information received from multiple sensors (e.g., embedded in the screen, embedded in the television around the border of the screen, located in off-television devices, located on the television control device 720 , located in the pointing device, etc.) to develop a custom gain pattern for the particular pointing device (e.g., for the television control device 200 ).
- such calibration may determine the shape of the gain pattern, the signal energy falloff characteristics, etc.
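- As a non-limiting illustrative sketch of fitting such a falloff characteristic from calibration samples (the Gaussian falloff model and the sample values are assumptions; the angles would be derived from pointing at known on-screen targets from a known position):
```python
# Sketch: fit a Gaussian gain pattern E(angle) = E0 * exp(-0.5 * (angle/width)^2)
# to (angle off the pointing line, measured energy) calibration samples.
import numpy as np

def fit_gain_pattern(angles_deg, energies):
    """Return (E0, width) for the assumed Gaussian falloff model."""
    a = np.asarray(angles_deg, dtype=float)
    e = np.asarray(energies, dtype=float)
    # log E = log E0 - 0.5 * a^2 / width^2  ->  linear in a^2.
    slope, intercept = np.polyfit(a ** 2, np.log(e), 1)
    width = np.sqrt(-0.5 / slope)
    return float(np.exp(intercept)), float(width)

# Illustrative calibration samples collected while pointing at known targets.
e0, width = fit_gain_pattern([0, 5, 10, 15, 20], [1.00, 0.88, 0.61, 0.33, 0.14])
print(f"calibrated gain pattern: peak={e0:.2f}, beamwidth={width:.1f} degrees")
```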
- the television control device 200 may comprise one or more location modules 252 that operate to determine relevant position information.
- the location module 252 may operate to perform such location determining (e.g., of the user or pointing device and/or the television) in any of a variety of manners.
- the location module 252 may utilize a communication interface module 210 , 220 to receive position information (e.g., of the television control device 200 or other pointing device) from an external source of such information (e.g., global positioning system, cellular triangulation system, home triangulation system, etc.).
- the location module 252 may receive position information from internal components of the television control device 200 (e.g., where such television control device 200 has position-determining capability).
- the television control device 200 may comprise GPS (or A-GPS) capability to determine its position.
- the television control device 200 location module 252 may wirelessly communicate information of the television control device's position to the sensor processing module 253 .
- the location module 252 may operate to process sensor information to determine location of the pointing device (e.g., location in relation to the television screen). For example, as mentioned previously, a signal (e.g., a pulse) transmitted from a pointing device to the television (or vice versa) will arrive at different sensors at different points in time depending on the respective distance from the pointing device to each sensor. The location module 252 may process such time-of-arrival information at various sensors to determine the position of the pointing device relative to the television.
- the location module 252 may also operate to process phase difference information (in addition to timing information or instead of such information) to determine pointing device location.
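- As a non-limiting illustrative sketch of such time-of-arrival processing (assuming synchronized, noise-free arrival times at sensors with known positions and a generic least-squares solver, none of which is required by the discussion above):
```python
# Sketch: estimate the pointing device position from differences in a pulse's
# arrival time at several sensors with known positions.
import numpy as np
from scipy.optimize import least_squares

C = 3.0e8  # assumed propagation speed for an RF/light pulse, m/s

def locate_from_tdoa(sensor_xyz, arrival_times):
    """Least-squares position estimate from time differences relative to sensor 0."""
    s = np.asarray(sensor_xyz, dtype=float)
    t = np.asarray(arrival_times, dtype=float)
    measured_dd = C * (t[1:] - t[0])  # range differences versus sensor 0

    def residual(p):
        d = np.linalg.norm(s - p, axis=1)
        return (d[1:] - d[0]) - measured_dd

    # Start the search in front of the sensor plane (assumed viewing side).
    return least_squares(residual, x0=s.mean(axis=0) + [0, 0, 2.0]).x

# Example: four sensors around a screen (meters) and synthetic arrival times.
sensors = [(0, 0, 0), (1, 0, 0), (1, 0.6, 0), (0, 0.6, 0)]
device = np.array([0.5, 0.3, 2.5])
times = [np.linalg.norm(np.array(sv) - device) / C for sv in sensors]
print(locate_from_tdoa(sensors, times))
```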
- the television control device 200 may utilize such information in any of a variety of manners.
- the sensor processing module 253 may operate to generate information of the determined on-screen pointing location, and one or more modules of the television control device 200 may operate to communicate a signal (e.g., to a television, television receiver, other display device, U/I modules 240 of the television control device 200 , etc.) that comprises characteristics that cause presentation of a visual indication (e.g., on the television screen, controller screen, other display, etc.) to indicate to the user the on-screen location to which the television control device 200 has determined the user is pointing.
- Such a visual indication may, for example, comprise characteristics of a cursor or other graphical construct, bright spot, highlighting, color variation, brightness variation, etc.
- the television 701 or television control device 720 may operate to overlay such indication on video content (e.g., television programming) being presented to the user (e.g., presented on the television screen, presented on a screen of the television controller, etc.).
- the sensor processing module 253 may provide information of the determined on-screen pointing location to one or more other modules of the television control device 200 (e.g., the processing module 250 and/or other modules thereof) to identify an object in video content (e.g., television programming) to which a user is pointing.
- one or more modules of the television control device 200 may operate to communicate signals (e.g., to a television, other modules of the television controller having a screen, other display device, etc.) that cause highlighting of an object to which the user is pointing and/or provide information regarding such object.
- various modules of the television control device 200 may operate to communicate on-screen pointing location information to television system components separate from the television (e.g., to a television receiver, video recorder, remote programming source, communication network infrastructure, advertising company, provider of goods and/or services, etc.).
- FIG. 2 provided a diagram illustrating an exemplary television control device 200 in accordance with various aspects of the present invention.
- FIG. 8 provides another diagram illustrating an exemplary television control device 800 in accordance with various aspects of the present invention.
- the exemplary television control device 800 may share any or all aspects with any of the television control devices discussed herein and illustrated in FIGS. 1-7 .
- the exemplary television control device 800 (or various modules thereof) may operate to perform any or all functionality discussed herein.
- the components of the exemplary television control device 800 may be co-located in a single housing.
- the television control device 800 comprises a processor 830 .
- Such a processor 830 may, for example, share any or all characteristics with the processor 250 discussed with regard to FIG. 2.
- the television control device 800 comprises a memory 840 .
- Such memory 840 may, for example, share any or all characteristics with the memory 260 discussed with regard to FIG. 2 .
- the television control device 800 may comprise any of a variety of user interface module(s) 850 .
- Such user interface module(s) 850 may, for example, share any or all characteristics with the user interface module(s) 240 discussed previously with regard to FIG. 2 .
- the user interface module(s) 850 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen display), a vibrating mechanism, a keypad, a remote control interface, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, touch screen, light pen, game controlling device, etc.).
- the exemplary television control device 800 may also, for example, comprise any of a variety of communication modules ( 805 , 806 , and 810 ). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 210 , 220 discussed previously with regard to FIG. 2 .
- the communication interface module(s) 810 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, Fire Wire, RS-232, HDMI, component and/or composite video, Ethernet, wireline and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc.
- the exemplary television control device 800 is also illustrated as comprising various wired 806 and/or wireless 805 front-end modules that may, for example, be included in the communication interface modules.
- the exemplary television control device 800 may also comprise any of a variety of signal processing module(s) 890 .
- Such signal processing module(s) 890 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position determination, video processing, image processing, audio processing, general user interface information data processing, etc.).
- the signal processing module(s) 890 may comprise: video/graphics processing modules; audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and tactile processing modules (e.g., keypad I/O, touch screen processing, motor control, etc.).
- FIG. 9 is a flow diagram 900 illustrating the generation of on-screen pointing information (e.g., in a television control device) in accordance with various aspects of the present invention.
- the exemplary method 900 may, for example, share any or all characteristics with the television control device operation discussed previously.
- the exemplary method 900 may be implemented by any or all of the television control devices (e.g., 160 , 161 , 200 , 220 , 320 , 420 , 520 , 620 , 720 and 800 ) discussed previously.
- the exemplary method 900 may comprise any or all functional aspects discussed previously with regard to such exemplary television control devices.
- the exemplary method 900 may begin executing at step 905 .
- the exemplary method 900 may begin executing in response to any of a variety of causes and/or conditions.
- the method 900 may begin executing in response to a user command to begin, detected user interaction with a pointing device (e.g., a television controller), detected user presence in the vicinity, detected user interaction with a television implementing the method 900 , etc.
- the method 900 may begin executing in response to a television presenting programming or other video content for which on-screen pointing is enabled and/or relevant.
- the exemplary method 900 may, for example at step 910 , comprise receiving pointing sensor information.
- step 910 may comprise any or all sensor information receiving characteristics described previously with regard to the various modules of the exemplary television control devices illustrated in FIGS. 1-8 and discussed previously.
- step 910 may share any or all sensor information receiving characteristics discussed previously with regard to at least the user interface module 240 , television interface module 235 , processor module 250 , communication interface modules 210 , 220 , sensor processing module 253 , location module 252 and calibration module 251 .
- Step 910 may, for example, comprise receiving sensor information from (or associated with) sensors integrated in the television control device. Also for example, step 910 may comprise receiving sensor information from (or associated with) off-controller sensors (e.g., integrated with or attached to a television, off-television sensors, sensors integrated with a pointing device different from the television control device, sensors integrated with a television receiver, etc.). As discussed previously, such sensors may comprise any of a variety of characteristics, including without limitation, characteristics of light sensors, RF sensors, acoustic sensors, active and/or passive sensors, etc.
- step 910 may comprise receiving pointing sensor information. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of receiving pointing sensor information unless explicitly claimed.
- the exemplary method 900 may, at step 920 , comprise processing received sensor information (e.g., as received at step 910 ) to determine a location on a screen of the television to which a user is pointing (e.g., pointing with a pointing device).
- step 920 may comprise any or all pointing location processing characteristics described previously with regard to the various modules of the exemplary television controllers illustrated in FIGS. 1-8 and discussed previously.
- step 920 may share any or all pointing location determining characteristics discussed previously with regard to at least the processor module 250 , sensor processing module 253 , location module 252 and calibration module 251 .
- Step 920 may, for example, comprise determining on-screen pointing location in any of a variety of manners.
- step 920 may comprise determining on-screen pointing location based on a location of a selected sensor, based on interpolation between sensor locations (e.g., linear and/or non-linear interpolation), based on determining energy pattern intersection(s), etc. Many examples of such determining were provided previously.
- step 920 may comprise processing received sensor information (e.g., independently and/or in conjunction with other information) to determine a location on a screen of the television to which a user is pointing (e.g., while the television is presenting programming to the user). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such processing unless explicitly claimed.
- the exemplary method 900 may, at step 930 , comprise generating information indicative of a determined on-screen pointing location (e.g., as determined at step 920 ).
- step 930 may comprise any or all pointing location information generation characteristics described previously with regard to the various modules of the exemplary television control devices illustrated in FIGS. 1-8 and discussed previously.
- step 930 may share any or all information generation characteristics discussed previously with regard to at least the processor module 250 , sensor processing module 253 , location module 252 , calibration module 251 , television interface module 235 , user interface module 240 and/or communication interface modules 210 , 220 .
- Step 930 may, for example, comprise generating such information in any of a variety of manners.
- step 930 may comprise generating on-screen pointing location data to communicate to internal modules of the television control device, to equipment external to the television control device (e.g., to the television and/or television receiver), to television network components, to a television programming source, etc.
- Such information may, for example, be communicated to various system components and may also be presented to the user (e.g., utilizing visual feedback displayed on a screen of a television, television controller, etc.).
- Such information may, for example, be generated in the form of screen coordinates, identification of a video content object (e.g., a programming object or person) to which an on-screen pointing location corresponds, generation of an on-screen cursor or highlight or other graphical feature, etc.
- step 930 may comprise generating information indicative of a determined on-screen pointing location. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of generating such information unless explicitly claimed.
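- As a non-limiting illustrative sketch of one possible form for such generated information (the field names and serialization below are assumptions for illustration, not a defined format):
```python
# Sketch: package an on-screen pointing determination as normalized screen
# coordinates, an optional identified content object, and a timestamp.
import json
import time
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class PointingInfo:
    x: float                         # normalized horizontal location, 0.0 (left) to 1.0 (right)
    y: float                         # normalized vertical location, 0.0 (top) to 1.0 (bottom)
    object_id: Optional[str] = None  # identified programming object, if any
    timestamp: float = 0.0

def make_pointing_message(x: float, y: float, object_id: Optional[str] = None) -> str:
    info = PointingInfo(x=x, y=y, object_id=object_id, timestamp=time.time())
    return json.dumps(asdict(info))

print(make_pointing_message(0.42, 0.17, object_id="player-23"))
```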
- the exemplary method 900 may, at step 995 , comprise performing continued processing.
- Such continued processing may comprise characteristics of any of a variety of types of continued processing, various examples of which were presented previously.
- step 995 may comprise looping execution flow back up to any earlier step (e.g., step 910 ).
- step 995 may comprise presenting a graphical feature on a television control device screen indicative of where the user is pointing on a television screen.
- step 995 may comprise communicating information to a television that causes the television to output a graphical feature on the television screen indicative of where the user is pointing (e.g., such information may comprise characteristics that cause the television to overlay such graphical indication on programming being presented on the television screen).
- step 995 may comprise presenting (or causing the presentation of) visual feedback indicia of the on-screen pointing location for a user. Further for example, step 995 may comprise communicating information of the on-screen pointing location to system components external to the television control device implementing the method 900 (e.g., to a television, television receiver, another television controller, etc.). Further for example, step 995 may comprise utilizing the on-screen pointing information to identify a video content object (e.g., an object presented in television programming) to which a user is pointing, etc.
- step 995 may comprise performing continued processing. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing continued processing unless explicitly claimed.
- FIG. 10 is a flow diagram 1000 illustrating the generation of on-screen pointing information (e.g., in a television control device) in accordance with various aspects of the present invention.
- the exemplary method 1000 may, for example, share any or all characteristics with the television control device operation discussed previously (e.g., in reference to FIGS. 1-9 ).
- the exemplary method 1000 may begin executing at step 1005 .
- Step 1005 may, for example, share any or all characteristics with step 905 of the exemplary method 900 illustrated in FIG. 9 and discussed previously.
- the exemplary method 1000 may, for example at step 1008 , comprise performing a calibration procedure with the user.
- a calibration procedure may, for example, be performed to develop a manner of processing received sensor information to determine on-screen pointing location.
- Step 1008 may, for example, comprise any or all calibration aspects discussed previously (e.g., with reference to the calibration module 251 ).
- the exemplary method 1000 may, for example at step 1010 , comprise receiving pointing sensor information.
- step 1010 may comprise any or all sensor information receiving characteristics described previously with regard to the various modules of the exemplary television control devices illustrated in FIGS. 1-8 and FIG. 9 (e.g., step 910) and discussed previously.
- the exemplary method 1000 may, for example at step 1015 , comprise determining user position (e.g., determining position of a user pointing device).
- step 1015 may comprise any or all position determining characteristics discussed previously with regard to FIGS. 1-9 .
- position may also, for example, include orientation.
- step 1015 may share any or all position determining characteristics discussed previously with regard to at least the processor module 250 , sensor processing module 253 , location module 252 and calibration module 251 .
- step 1015 may comprise determining user position based, at least in part, on received sensor signals.
- step 1015 may comprise determining user position based, at least in part, on position information received from one or more systems external to the television control device implementing the method 1000 .
- step 1015 may comprise determining user position (e.g., pointing device position). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of determining user position unless explicitly claimed.
- the exemplary method 1000 may, for example, at step 1020 , comprise processing received sensor information (e.g., as received at step 1010 ) and/or user position information (e.g., as determined at step 1015 ) to determine a location on a screen of the television to which a user is pointing (e.g., pointing with the television control device implementing the method or other pointing device).
- step 1020 may comprise any or all pointing location determination characteristics described previously with regard to the various modules of the exemplary television control devices illustrated in FIGS. 1-8 and FIG. 9 (e.g., step 920) and discussed previously.
- step 1020 may share any or all pointing location determining characteristics discussed previously with regard to at least the processor module 250 , sensor processing module 253 , location module 252 and calibration module 251 .
- Step 1020 may, for example, comprise determining on-screen pointing location in any of a variety of manners.
- step 1020 may comprise determining on-screen pointing location based on a location of a selected sensor, based on location of the pointing device, based on interpolation between sensor locations (e.g., linear and/or non-linear interpolation), based on energy pattern intersection points, etc. Many examples of such determining were provided previously.
- step 1020 may comprise processing received sensor information and/or user position information to determine a location on a screen of the television to which a user is pointing (e.g., pointing with the television control device implementing the method 1000 or other pointing device). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such processing unless explicitly claimed.
- the exemplary method 1000 may, at step 1030 , comprise generating information indicative of a determined on-screen pointing location (e.g., as determined at step 1020 ).
- step 1030 may comprise any or all information generation characteristics described previously with regard to the various modules of the exemplary television control devices illustrated in FIGS. 1-8 and FIG. 9 (e.g., step 930) and discussed previously.
- step 1030 may share any or all information generation characteristics discussed previously with regard to at least the processor module 250 , sensor processing module 253 , location module 252 , calibration module 251 , television interface module 235 , user interface module 240 and/or communication interface modules 210 , 220 .
- the exemplary method 1000 may, at step 1095 , comprise performing continued processing.
- Such continued processing may comprise characteristics of any of a variety of types of continued processing, various examples of which were presented previously.
- step 1095 may comprise looping execution flow back up to any earlier step (e.g., step 1008 ).
- step 1095 may comprise presenting a graphical feature on a television control device screen indicative of where the user is pointing on a television screen.
- step 1095 may comprise communicating information to a television that causes the television to output a graphical feature on the television screen indicative of where the user is pointing (e.g., such information may comprise characteristics that cause the television to overlay such graphical indication on programming being presented on the television screen).
- step 1095 may comprise presenting (and/or causing the presentation of) visual feedback indicia of the on-screen pointing location for a user. Further for example, step 1095 may comprise communicating information of the on-screen pointing location to system components external to the television receiver implementing the method 1000 . Further for example, step 1095 may comprise utilizing the on-screen pointing information to identify a video content object (e.g., an object presented in television programming) to which a user is pointing, etc.
- step 1095 may comprise performing continued processing. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing continued processing unless explicitly claimed.
- various aspects of the present invention provide a system and method in a television controller (e.g., a television control device) for generating screen pointing information.
- various changes may be made and equivalents may be substituted without departing from the scope of the invention.
- many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
Abstract
Description
- This patent application is a continuation application of non-provisional application Ser. No. 12/774,321 filed May 5, 2010 entitled “SYSTEM AND METHOD FOR GENERATING SCREEN POINTING INFORMATION IN A TELEVISION CONTROL DEVICE” which is related to and claims priority from provisional patent application Ser. No. 61/242,234 filed Sep. 14, 2009, and titled “TELEVISION SYSTEM,” the contents of which are hereby incorporated herein by reference in their entirety. This patent application is also related to U.S. patent application Ser. No. 12/774,154, filed concurrently herewith, titled “SYSTEM AND METHOD FOR GENERATING SCREEN POINTING INFORMATION IN A TELEVISION”; and U.S. patent application Ser. No. 12/774,221, filed concurrently herewith, titled “SYSTEM AND METHOD FOR GENERATING TELEVISION SCREEN POINTING INFORMATION USING AN EXTERNAL RECEIVER”. The contents of each of the above-mentioned applications are hereby incorporated herein by reference in their entirety.
- [Not Applicable]
- [Not Applicable]
- [Not Applicable]
- Present television control devices are incapable of providing pointing information to television program viewers. Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.
- Various aspects of the present invention provide a system and method, in a television control device, for generating screen pointing information, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims. These and other advantages, aspects and novel features of the present invention, as well as details of illustrative aspects thereof, will be more fully understood from the following description and drawings.
- FIG. 1 is a diagram illustrating an exemplary television system in accordance with various aspects of the present invention.
- FIG. 2 is a diagram illustrating an exemplary television control device in accordance with various aspects of the present invention.
- FIG. 3 is a diagram illustrating an exemplary television system with on-screen television sensors in accordance with various aspects of the present invention.
- FIG. 4 is a diagram illustrating an exemplary television system with off-screen television sensors in accordance with various aspects of the present invention.
- FIG. 5 is a diagram illustrating an exemplary television system with off-television sensors in accordance with various aspects of the present invention.
- FIG. 6 is a diagram illustrating an exemplary television system with television receiver sensors in accordance with various aspects of the present invention.
- FIG. 7 is a diagram illustrating an exemplary television system with television controller sensors in accordance with various aspects of the present invention.
- FIG. 8 is a diagram illustrating an exemplary television control device in accordance with various aspects of the present invention.
- FIG. 9 is a flow diagram illustrating the generation of on-screen pointing information in accordance with various aspects of the present invention.
- FIG. 10 is a flow diagram illustrating the generation of on-screen pointing information in accordance with various aspects of the present invention.
- The following discussion will refer to various communication modules, components or circuits. Such modules, components or circuits may generally comprise hardware and/or a combination of hardware and software (e.g., including firmware). Such modules may also, for example, comprise a computer readable medium (e.g., a non-transitory medium) comprising instructions (e.g., software instructions) that, when executed by a processor, cause the processor to perform various functional aspects of the present invention. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of particular hardware and/or software implementations of a module, component or circuit unless explicitly claimed as such. For example and without limitation, various aspects of the present invention may be implemented by one or more processors (e.g., a microprocessor, digital signal processor, baseband processor, microcontroller, etc.) executing software instructions (e.g., stored in volatile and/or non-volatile memory). Also for example, various aspects of the present invention may be implemented by an application-specific integrated circuit (“ASIC”) and/or other hardware components.
- Additionally, the following discussion will refer to various television system modules (e.g., television control device modules). It should be noted that the following discussion of such various modules is segmented into such modules for the sake of illustrative clarity. However, in actual implementation, the boundaries between various modules may be blurred. For example, any or all of the functional modules discussed herein may share various hardware and/or software components. For example, any or all of the functional modules discussed herein may be implemented wholly or in-part by a shared processor executing software instructions. Additionally, various software sub-modules that may be executed by one or more processors may be shared between various software modules. Accordingly, the scope of various aspects of the present invention should not be limited by arbitrary boundaries between various hardware and/or software components, unless explicitly claimed.
- The following discussion may also refer to communication networks and various aspects thereof. For the following discussion, a communication network is generally the communication infrastructure through which a communication device (e.g., a portable communication device, television, television control device, television provider, television programming provider, television receiver, video recording device, etc.) may communicate with other systems. For example and without limitation, a communication network may comprise a cable and/or satellite television communication network, a cellular communication network, a wireless metropolitan area network (WMAN), a wireless local area network (WLAN), a wireless personal area network (WPAN), any home or premises communication network, etc. A particular communication network may, for example, generally have a corresponding communication protocol according to which a communication device may communicate with the communication network. Unless so claimed, the scope of various aspects of the present invention should not be limited by characteristics of a particular type of communication network.
- The following discussion will at times refer to an on-screen pointing location. Such a pointing location refers to a location on the television screen to which a user (either directly or with a pointing device) is pointing. Such a pointing location is to be distinguished from other types of on-screen location identification, such as, for example, using arrow keys and/or a mouse to move a cursor or to traverse blocks (e.g., on an on-screen program guide) without pointing.
- Additionally, the following discussion will at times refer to television programming. Such television programming generally includes various types of television programming (e.g., television programs, news programs, sports programs, music television, movies, television series programs and/or associated advertisements, educational programs, live or recorded, broadcast/multicast/unicast, etc.). Such television programming video content is to be distinguished from other non-programming video content that may be displayed on a television screen (e.g., an electronic program guide, user interface menu, a television set-up menu, a typical web page, a document, a graphical video game, etc.). Various aspects of the present invention may, for example, comprise determining an on-screen pointing location during the presentation of television programming on the screen of the television.
- Turning first to FIG. 1, such figure is a diagram illustrating a non-limiting exemplary television system 100 in accordance with various aspects of the present invention. The exemplary system 100 includes a television provider 110. The television provider 110 may, for example, comprise a television network company, a cable company, a movie-providing company, a news company, an educational institution, etc. The television provider 110 may, for example, be an original source of television programming (or related information). Also for example, the television provider 110 may be a communication company that provides programming distribution services (e.g., a cable television company, a satellite television company, a telecommunication company, a data network provider, etc.). The television provider 110 may, for example, provide programming and non-programming information and/or video content. The television provider 110 may, for example, provide information related to a television program (e.g., information describing or otherwise related to selectable objects in programming, etc.).
- The exemplary television system 100 may also include a third party program information provider 120. Such a provider may, for example, provide information related to a television program. Such information may, for example, comprise information describing selectable objects in programming, program guide information, etc.
- The exemplary television system 100 may include one or more communication networks (e.g., the communication network(s) 130). The exemplary communication network 130 may comprise characteristics of any of a variety of types of communication networks over which video content and/or information related to video content may be communicated. For example and without limitation, the communication network 130 may comprise characteristics of a cable television network, a satellite television network, a telecommunication network, the Internet, a local area network (LAN), a personal area network (PAN), a metropolitan area network (MAN), any of a variety of different types of home networks, etc.
- The exemplary television system 100 may include a first television 140. Such a first television 140 may, for example, comprise networking capability enabling such television 140 to communicate directly with the communication network 130. For example, the first television 140 may comprise one or more embedded television receivers or transceivers (e.g., a cable television receiver, satellite television transceiver, Internet modem, etc.). Also for example, the first television 140 may comprise one or more recording devices (e.g., for recording and/or playing back video content, television programming, etc.).
- The exemplary television system 100 may include a first television controller 160. Such a first television controller 160 may, for example, operate to (e.g., which includes operating when enabled to) control operation of the first television 140. The first television controller 160 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the first television controller 160 may comprise characteristics of a dedicated television control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.
- The first television controller 160 (or television control device) may, for example, transmit signals directly to the first television 140 to control operation of the first television 140. The first television controller 160 may also, for example, operate to transmit signals (e.g., via the communication network 130) to the television provider 110 to control video content being provided to the first television 140, or to conduct other transactions (e.g., business transactions, etc.).
- As will be discussed in more detail later, the first television controller 160 may operate to communicate screen pointing information with the first television 140 and/or other devices. Also, as will be discussed in more detail later, various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an object or person presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The first television controller 160 provides a non-limiting example of a device that a user may utilize to point to an on-screen location. The following discussion of FIGS. 2-10 will present various non-limiting illustrative aspects of such a television controller.
- The exemplary television system 100 may also include a television receiver 150. The television receiver may, for example, operate to provide a communication link between a television and/or television controller and a communication network and/or information provider. For example, the television receiver 150 may operate to provide a communication link between the second television 141 and the communication network 130, or between the second television 141 and the television provider 110 (and/or third party program information provider 120) via the communication network 130.
- The television receiver 150 may comprise characteristics of any of a variety of types of television receivers. For example and without limitation, the television receiver 150 may comprise characteristics of a cable television receiver, a satellite television receiver, etc. Also for example, the television receiver 150 may comprise a data communication network modem for data network communications (e.g., with the Internet, a LAN, PAN, MAN, telecommunication network, etc.). The television receiver 150 may also, for example, comprise recording capability (e.g., programming recording and playback, etc.).
- The exemplary television system 100 may include a second television controller 161. Such a second television controller 161 may, for example, operate to control operation of the second television 141 and the television receiver 150. The second television controller 161 may comprise characteristics of any of a variety of television controlling devices. For example and without limitation, the second television controller 161 may comprise characteristics of a dedicated television control device, a dedicated television receiver control device, a universal remote control, a cellular telephone or personal computing device with television control capability, etc.
- The second television controller 161 may, for example, transmit signals directly to the second television 141 to control operation of the second television 141. The second television controller 161 may, for example, transmit signals directly to the television receiver 150 to control operation of the television receiver 150. The second television controller 161 may additionally, for example, operate to transmit signals (e.g., via the television receiver 150 and the communication network 130) to the television provider 110 to control video content being provided to the television receiver 150, or to conduct other transactions (e.g., business transactions, etc.).
- As will be discussed in more detail later, various aspects of the present invention include a user pointing to a location on a television screen (e.g., pointing to an object or person presented in television programming). In such a scenario, the user may perform such pointing in any of a variety of manners. One of such exemplary manners includes pointing with a television control device. The second television controller 161 provides one non-limiting example of a device that a user may utilize to point to an on-screen location. The following discussion of FIGS. 2-10 will present various non-limiting illustrative aspects of such a television controller.
- The exemplary television system 100 was provided as a non-limiting illustrative foundation for discussion of various aspects of the present invention. Thus, the scope of various aspects of the present invention should not be limited by any characteristics of the exemplary television system 100 unless explicitly claimed.
FIG. 2 , such figure is a diagram illustrating an exemplary television control device 200 (e.g., a remote control device) in accordance with various aspects of the present invention. The exemplarytelevision control device 200 may, for example, share any or all characteristics with the exemplarytelevision control devices FIG. 1 and discussed previously and/or with any of the exemplary television control devices discussed herein. - The exemplary
television control device 200 includes a firstcommunication interface module 210. The firstcommunication interface module 210 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, though the firstcommunication interface module 210 is illustrated coupled to a wireless RF antenna via a wireless port 212, the wireless medium is merely illustrative and non-limiting. The firstcommunication interface module 210 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television video content (e.g., television programming), television control information, and/or other data is communicated. Also for example, thefirst communication module 210 may operate to communicate with local sources of television video content (e.g., video recorders, receivers, gaming devices, etc.). Additionally, for example, thefirst communication module 210 may operate to communicate with a second television controller (e.g., directly or via one or more intermediate communication networks). Further for example, thefirst communication module 210 may operate to communicate with a television utilizing any of a variety of television communication connections and/or protocols (e.g., composite video, component video, HDMI, etc.). Still further, for example, thefirst communication module 210 may operate to communicate with screen pointing sensors. - The exemplary
television control device 200 includes a secondcommunication interface module 220. The secondcommunication interface module 220 may, for example, operate to communicate over any of a variety of communication media and utilizing any of a variety of communication protocols. For example, the secondcommunication interface module 220 may communicate via a wireless RF communication port 222 and antenna, or may communicate via a non-tethered optical communication port 224 (e.g., utilizing laser diodes, photodiodes, etc.). Also for example, the secondcommunication interface module 220 may communicate via a tethered optical communication port 226 (e.g., utilizing a fiber optic cable), or may communicate via a wired communication port 228 (e.g., utilizing coaxial cable, twisted pair, HDMI cable, Ethernet cable, any of a variety of wired component and/or composite video connections, etc.). The secondcommunication interface module 220 may, for example, operate to communicate with one or more communication networks (e.g., cable television networks, satellite television networks, telecommunication networks, the Internet, local area networks, personal area networks, metropolitan area networks, etc.) via which television video content, television control information, and/or other data is communicated. Also for example, thesecond communication module 220 may operate to communicate with local sources of television video content (e.g., video recorders, other receivers, gaming devices, etc.). Additionally, for example, thesecond communication module 220 may operate to communicate with a second television controller (e.g., directly or via one or more intervening communication networks). Further for example, thesecond communication module 220 may operate to communicate with a television utilizing any of a variety of television communication connections and/or protocols (e.g., composite video, component video, HDMI, etc.). Still further, for example, thesecond communication module 220 may operate to communicate with screen pointing sensors. - The exemplary
television control device 200 may also comprise additional communication interface modules, which are not illustrated. Such additional communication interface modules may, for example, share any or all aspects with the first 210 and second 220 communication interface modules discussed above. - The exemplary
television control device 200 may also comprise a communication module 230. The communication module 230 may, for example, operate to control and/or coordinate operation of the first communication interface module 210 and the second communication interface module 220 (and/or additional communication interface modules as needed). The communication module 230 may, for example, provide a convenient communication interface by which other components of the television control device 200 may utilize the first 210 and second 220 communication interface modules. Additionally, for example, in an exemplary scenario where a plurality of communication interface modules are sharing a medium and/or network, the communication module 230 may coordinate communications to reduce collisions and/or other interference between the communication interface modules 210 and 220. - The exemplary
television control device 200 may comprise one or moretelevision interface modules 235. Thetelevision interface module 235 may, for example, operate to manage communications between thetelevision control device 200 and one or more televisions that are communicatively coupled thereto (e.g., via the first 210 and/or second 220 communication interface modules). For example, thetelevision interface module 235 may operate to communicate general television programming video information to a television (e.g., while thetelevision control device 200 is operating to determine an on-screen pointing location). - Also, for example, as will be discussed in more detail later, the
television interface module 235 may output a signal to the television, television receiver or a second television controller or other device with a display, where such signal comprises characteristics adapted to cause the television (or other device) to output a visual indication of on-screen pointing location. Such an indication may, for example, be communicated with (e.g., as a part of) other information (e.g., video information, general device control information, etc.) being communicated to the television (or other device), or such an indication may be communicated to the television (or other device) independent of other information. - The exemplary
television control device 200 may additionally comprise one or moreuser interface modules 240. Theuser interface module 240 may generally operate to provide user interface functionality to a user of thetelevision control device 200. For example, and without limitation, theuser interface module 240 may operate to provide for user control of any or all standard television and/or television receiver commands (e.g., channel control, on/off, television output settings, input selection, etc.). Theuser interface module 240 may, for example, operate and/or respond to user commands utilizing user interface features disposed on the television receiver (e.g., buttons, touch screen, microphone, etc.) and may also utilize the communication module 230 (and/or first 210 and second 220 communication interface modules) to communicate with a television controller, television receiver, another television control device and/or any other television system component. For example, various user interface features of thetelevision control device 200 may comprise utilization of the television (e.g., utilizing the television screen for menu-driven or other GUI associated with television, television receiver and/or television controller operation). - The
user interface module 240 may also operate to interface with and/or control operation of any of a variety of sensors that may be utilized to ascertain an on-screen pointing location. Non-limiting examples of such sensors will be provided later (e.g., in the discussion of FIGS. 3-7 and elsewhere herein). For example and without limitation, the user interface module 240 may operate to receive signals associated with respective sensors (e.g., raw or processed signals directly from the sensors, through intermediate devices (e.g., a television, television controller, surround sound system, etc.), via the communication interface modules 210, 220, etc.). Also for example, the user interface module 240 may operate to control the transmission of signals (e.g., RF signals, optical signals, acoustic signals, etc.) from such sensors. - The exemplary
television control device 200 may comprise one ormore processors 250. Theprocessor 250 may, for example, comprise one or more of a general purpose processor, digital signal processor, application-specific processor, microcontroller, microprocessor, etc. For example, theprocessor 250 may operate in accordance with software (or firmware) instructions. As mentioned previously, any or all functionality discussed herein may be performed by a processor executing instructions. For example, though various modules are illustrated as separate blocks or modules inFIG. 2 for illustrative clarity, such illustrative modules, or a portion thereof, may be implemented by theprocessor 250. - The exemplary
television control device 200 may comprise one ormore memories 260. As discussed above, various aspects may be performed by one or more processors executing instructions. Such instructions may, for example, be stored in the one ormore memories 260.Such memory 260 may, for example, comprise characteristics of any of a variety of types of memory. For example and without limitation,such memory 260 may comprise one or more memory chips (e.g., ROM, RAM, EPROM, EEPROM, flash memory, one-time-programmable OTP memory, etc.), hard drive memory, CD memory, DVD memory, etc. - The exemplary
television control device 200 may also comprise one or more calibration modules 251 that operate to perform various calibration activities. Examples of such calibration activities will be provided later in this discussion. Briefly, such calibration activities may, for example, comprise interacting with a user and/or user pointing device (e.g., if different from the television control device 200) to determine sensor signals under known circumstances (e.g., determine sensor signals in response to known screen pointing circumstances), and processing such sensor signals to develop algorithms (e.g., transformation matrices, static positional equations, etc.) to determine screen pointing location based on sensor signals received during normal operation. As will also be discussed later, such calibration may also be utilized to establish signal gain (or energy) patterns utilized in determining pointing location.
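- As a non-limiting illustration of how such calibration data might be reduced to a positional algorithm, the following sketch fits a simple linear least-squares mapping from recorded sensor energies to known on-screen coordinates. The array shapes, function names, and the linear model itself are assumptions for illustration only and are not taken from this disclosure.

```python
import numpy as np

# Hypothetical calibration sketch: during calibration the user points at K known
# on-screen locations while the calibration module records the N sensor energies
# observed for each location.  A simple linear model mapping sensor readings to
# screen coordinates is then fit with least squares and reused at runtime.
# (Assumes at least N + 1 calibration points so the fit is not underdetermined.)

def fit_calibration(sensor_readings, known_points):
    """sensor_readings: (K, N) array of energies; known_points: (K, 2) screen coords."""
    sensor_readings = np.asarray(sensor_readings, dtype=float)
    known_points = np.asarray(known_points, dtype=float)
    # Append a constant column so the model can learn an offset term.
    A = np.hstack([sensor_readings, np.ones((sensor_readings.shape[0], 1))])
    # Solve A @ W ~= known_points in the least-squares sense.
    W, *_ = np.linalg.lstsq(A, known_points, rcond=None)
    return W  # (N + 1, 2) matrix mapping readings -> (x, y)

def apply_calibration(W, reading):
    """Estimate an (x, y) on-screen pointing location from one reading vector."""
    a = np.append(np.asarray(reading, dtype=float), 1.0)
    return a @ W
```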
- The exemplary television control device 200 may comprise one or more location-determining modules 252. For example, as will be discussed later, various on-screen pointing location determinations may comprise processing location information. As a non-limiting example, knowing the location of a user (e.g., including the location of a pointing device (e.g., which could be the television control device 200) being utilized by the user) may simplify the solution of various pointing direction determinations. For example, knowing exactly where a pointing device is located (e.g., in three-dimensional space) or where a pointing device is located along a line (e.g., knowing device location in two-dimensional space or land surface coordinates) relative to the television screen (and/or relative to the television control device) may remove a number of unknown variables from applicable positional equations. Note that such positional information may, in various exemplary scenarios, also comprise orientation information for a pointing device (e.g., yaw, pitch and/or roll). Such orientation information may be determined in various manners (e.g., through gyroscopic means, sensor alignment with known references, etc.). - The location-determining
module 252 may operate to determine user (or pointing device) location in any of a variety of manners. For example and without limitation, in an exemplary scenario where the pointing device is different from thecontrol device 200, the location-determiningmodule 252 may operate to receive location information from the pointing device (e.g., via one of thecommunication interface modules 210, 220). For example, such a pointing device may comprise positioning system capability (e.g., global positioning system, assisted GPS, cellular or other triangulation systems, etc.) and communicate information describing the position of the pointing device to thetelevision control device 200. In an exemplary scenario where thetelevision control device 200 is the pointing device, thetelevision control device 200 may comprise on-board position-determining capability. - Also for example, the location-determining
module 252 may (e.g., via the user interface modules 240) utilize sensor signals to determine the position (which may include orientation) of the pointing device (or user thereof). For example, signals transmitted by the pointing device may arrive at different sensors at different times (or at different phases). Such temporal or phase differences may be processed to determine the location of the pointing device relative to the known locations of such sensors. Further for example, the location-determining module 252 may operate to communicate pointing device location information with an external system that operates to determine the location of the pointing device. Such an external system may, for example, comprise a cellular telephony triangulation system, a home or premises-based triangulation system, a global positioning system, an assisted global positioning system, etc. In a non-limiting exemplary scenario where the control device 200 is the pointing device, the location information communicated with the external system may be location information associated with the control device 200.
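- The following is a rough, illustrative sketch (not the patent's specified method) of recovering a pointing device's planar position from arrival-time differences at sensors with known locations, using a coarse grid search; the propagation speed, grid bounds, and function names are assumptions.

```python
import numpy as np

# Estimate a 2-D position from differences in signal arrival times at sensors
# with known positions.  A coarse grid search keeps the sketch self-contained;
# a real implementation would use a proper solver and a calibrated speed value.

C = 343.0  # assumed propagation speed (343 m/s, i.e., an acoustic signal)

def locate_by_tdoa(sensor_xy, arrival_times, search_min, search_max, step=0.05):
    """sensor_xy: (N, 2) sensor positions; arrival_times: (N,) measured arrivals."""
    sensor_xy = np.asarray(sensor_xy, dtype=float)
    t = np.asarray(arrival_times, dtype=float)
    measured_diff = t - t[0]                       # sensor 0 is the time reference
    xs = np.arange(search_min[0], search_max[0], step)
    ys = np.arange(search_min[1], search_max[1], step)
    best, best_err = None, np.inf
    for x in xs:
        for y in ys:
            d = np.hypot(sensor_xy[:, 0] - x, sensor_xy[:, 1] - y)
            predicted_diff = (d - d[0]) / C
            err = np.sum((predicted_diff - measured_diff) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best  # candidate (x, y) with the smallest timing mismatch
```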
- The exemplary television control device 200 may also comprise one or more sensor processing module(s) 253. As will be explained below, the sensor processing module 253 may operate to receive sensor information (e.g., from the user interface module(s) 240, from the television interface module 235, from the communication interface modules 210, 220, etc.) and to process such received sensor information to determine an on-screen location to which a pointing device is pointing. - Various aspects of the present invention will now be illustrated by way of non-limiting example. Throughout the following discussion, reference will continue to be made to the various modules of the
television control device 200 illustrated in FIG. 2. It should be noted that the following non-limiting examples provide specific examples of various aspects, and as such, the scope of various aspects of the present invention should not be limited by characteristics of any of the specific examples, unless specifically claimed. -
FIG. 3 is a diagram illustrating an exemplary television system 300 with on-screen television sensors in accordance with various aspects of the present invention. The television system 300 includes a television 301 comprising a television screen 303. The television system 300 also includes a television controller 320 (or other pointing device) pointing to an on-screen pointing location 330 along a line 325 between the television controller 320 and the on-screen pointing location 330. The television controller 320 may, for example, share any or all aspects with the exemplary television controllers discussed previously. The television control device 320 may, for example, be communicatively coupled directly to the television 301 via a communication link 353. The television control device 320 may also, for example, be communicatively coupled directly to the television receiver 350 via communication link 352. The television control device 320 may additionally, for example, be communicatively coupled indirectly to the television 301 via the television receiver 350 through communication links 351 and 352. Various aspects of the television control device 320 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2. - The
television system 300 also comprises atelevision receiver 350 that is communicatively coupled to thetelevision 301 via a communication link 351 (e.g., a two-way communication link providing video information to thetelevision 301 and/or receiving sensor information from thetelevision 301 for communication to the television control device 320). Theexemplary television receiver 350 is also communicatively coupled to thetelevision controller 320 via acommunication link 352. - The
exemplary television screen 303 comprises an array of sensors integrated into thetelevision screen 303. One of such sensors is labeledsensor 310. Any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes) and RF sensors (e.g., antenna elements or loops). - The array of sensors may be integrated in the
television screen 303 in any of a variety of manners, non-limiting examples of which will now be provided. For example, the television screen 303 may comprise an array of liquid crystal display (LCD) pixels for presenting video media to a user. An array of photo diodes and/or antenna elements may be integrated between or behind LCD pixels. For example, every LCD pixel may be associated with a corresponding photo diode and/or antenna element, or every N×M block of LCD pixels may be associated with a corresponding photo diode or antenna element. - As a non-limiting example, an array of photo diodes and/or RF antenna elements may be formed into a substrate beneath or behind transparent LCD substrates. As another example, a photo diode array and/or antenna element array may be interposed between or behind an array of LCD thin film transistors. Also for example, an array of photo diodes and/or RF antenna elements (or other sensors) may be incorporated into a transparent screen overlay. Note that in such an implementation, such transparent screen overlay may be installed after-market. For example, a user that has a
television control device 320 with the capability to determine on-screen pointing location may install the transparent screen overlay. In such an exemplary scenario, there may be one or more communication links established between the television control device 320 and the sensors in the overlay, where such communication links may be independent of a communication link over which non-sensor information (e.g., video and/or control information) is communicated between the television 301 and the television control device 320. Such communication link may, for example, be adapted to communicate information from each sensor to the television control device 320 serially (e.g., in a time-multiplexed manner) and/or in parallel.
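- As one hypothetical illustration of a time-multiplexed (serial) readout, the sketch below packs a sequence of sensor readings into a single frame for transfer to the control device; the frame layout, field widths, and helper names are assumptions for illustration and are not a format defined by this disclosure.

```python
import struct

# Illustrative-only serial frame: a 16-bit frame counter, a 16-bit reading count,
# then one 16-bit energy value per sensor, transmitted in sensor order.

def pack_readings(frame_counter, readings):
    """Pack a frame counter and a list of 16-bit sensor energies into bytes."""
    header = struct.pack("<HH", frame_counter & 0xFFFF, len(readings))
    body = struct.pack("<%dH" % len(readings), *readings)
    return header + body

def unpack_readings(frame):
    """Recover (frame_counter, readings) from a frame produced by pack_readings."""
    counter, count = struct.unpack_from("<HH", frame, 0)
    readings = struct.unpack_from("<%dH" % count, frame, 4)
    return counter, list(readings)
```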
- In a photo detector implementation, passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source (e.g., a light source of the television control device 320 or other pointing device) aimed at the screen 303. Also for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). In such a photo detector implementation (e.g., utilizing photo diodes), photo detectors may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections, ambient light, etc. As a non-limiting example, photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc. - In an antenna element implementation, an array of antenna elements may be formed on a substrate and placed behind light producing and/or filtering elements in an LCD screen (e.g., so as to avoid interfering with emitted light) or may be formed on a transparent substrate within or in front of the lighted region of the LCD display (e.g., utilizing microscopic antenna elements that are too small to significantly interfere with light emitted from the display). As discussed above, such an implementation may be integrated with the
television screen 303, but may also be added as an overlay (e.g., as a production option or an after-market user or technician installation). - In an RF antenna implementation, passive antennas (or elements of an overall antenna matrix) may receive varying respective amounts of RF energy depending on the pointing direction of a directional RF source (e.g., a directional RF source of the
television control device 320 or other pointing device) aimed at the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). - In an exemplary scenario, a user may point a pointing device (e.g., the
television control device 320, a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 303, where the pointing device directs transmitted energy (e.g., light energy, RF energy, acoustic energy, etc.) at a particular location on the television screen 303 to which the pointing device is being pointed. Note that such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain pattern with the highest intensity likely at the center of the pattern (i.e., along the pointing line 325) and decreasing as a function of angle from the center of the pattern (or distance on the screen from the on-screen pointing location). - In such an exemplary scenario, each sensor of the array of sensors integrated into the
screen 303 will likely receive some respective amount of energy. For example, the sensor nearest the screen pointing location 330 (i.e., along the pointing line 325) will likely receive the highest amount of energy, sensors adjacent to the screen pointing location 330 will likely receive a next highest range of energy, and sensors away from the pointing location 330 will likely receive progressively less amounts of energy from the pointing device (e.g., the television control device 320) as a function of distance from the pointing location 330, until such energy is lost in the noise floor.
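- The falloff described above can be illustrated with a toy model in which received energy decays with on-screen distance from the pointing location until it reaches a noise floor. The Gaussian shape and parameter values below are assumptions for illustration, not the gain pattern of any particular pointing device.

```python
import numpy as np

# Illustrative-only model: sensor energy decreases with distance from the
# pointing location; far-away sensors sink into a small random noise floor.

def simulate_sensor_energies(sensor_xy, pointing_xy, peak=1.0, sigma=0.08,
                             noise_floor=1e-3, rng=None):
    """sensor_xy: (N, 2) sensor coordinates; pointing_xy: (2,) pointing location."""
    rng = np.random.default_rng() if rng is None else rng
    sensor_xy = np.asarray(sensor_xy, dtype=float)
    d = np.linalg.norm(sensor_xy - np.asarray(pointing_xy, dtype=float), axis=1)
    energy = peak * np.exp(-(d ** 2) / (2.0 * sigma ** 2))   # Gaussian falloff
    noise = rng.uniform(0.0, noise_floor, size=len(d))
    return np.maximum(energy, noise)
```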
- In such an exemplary scenario, the television control device 320 (e.g., the user interface module 240 of the television control device 200 illustrated in FIG. 2) may receive signals indicative of the energy received by the sensors of the sensor array. The television control device 320 may receive such signals in various manners, depending on the degree of integration of such sensors into the television 301. For example, in an exemplary scenario where the sensors are fully integrated into the television screen 303 and operationally integrated into the television 301, the television control device 320 may receive such signals via a communication interface between the television control device 320 and the television 301 (e.g., via communication link 353), or via a communication interface between the television 301 and the television control device 320 through the television receiver 350 (e.g., via communication links 351 and 352). Also for example, in another exemplary scenario where the sensors are overlaid on the television screen 303, and where operation of such sensors is independent of the television 301, the television control device 320 may receive such signals via a communication link directly between the television control device 320 and the sensors, where such a communication link may be independent of other communication links between the television control device 320 and the television 301. Such communication link may, for example, be adapted to communicate information from each sensor to the television control device 320 serially (e.g., in a time-multiplexed manner) and/or in parallel. - The
user interface module 240 may then, for example, provide information of such received sensor signals to the sensor processing module 253 for processing. The sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location. The sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below. - For example, the
sensor processing module 253 may operate to select the sensor with the highest received energy and determine that the location of such selected sensor is the on-screen pointing location. For example, in an exemplary scenario where the spatial resolution of screen-integrated sensors is relatively fine, such operation may reliably yield a desired level of accuracy without undue processing overhead.
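- A minimal sketch of this "strongest sensor wins" rule, assuming the sensor coordinates and received energies are available as arrays:

```python
import numpy as np

# Report the screen coordinates of the sensor that received the most energy.
# Array shapes are assumptions for illustration.

def nearest_sensor_estimate(sensor_xy, energies):
    """sensor_xy: (N, 2) screen coordinates; energies: (N,) received energies."""
    return np.asarray(sensor_xy)[int(np.argmax(energies))]
```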
- In another example, the sensor processing module 253 may operate to select the sensor with the highest received energy and a plurality of sensors adjacent to such sensor. Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on weighting). For example, in a first dimension in which a sensor to the right of the highest energy sensor has a higher received energy than a sensor to the left of the highest energy sensor, the sensor processing module 253 may determine that the pointing location is to the right of the highest energy sensor. How much distance to the right may, for example, be determined as a function of the ratio between respective energies received by the right and left sensors. Such calculation may, for example, be a linear or non-linear calculation. Such calculation may also, for example, consider the expected energy pattern of a transmitting pointing device (e.g., in a scenario where energy fall-off is logarithmic as opposed to linear).
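- One simple way to realize the neighbor-weighted interpolation described above, shown for a single screen axis, is an energy-weighted centroid of the peak sensor and its immediate neighbors; the linear weighting used here is only one of the linear or non-linear choices the text contemplates, and the array layout is an assumption for illustration.

```python
import numpy as np

# Refine the estimate around the peak sensor using its two neighbors on one axis.

def interpolate_axis(positions, energies, index):
    """positions/energies: per-sensor arrays along one axis; index: peak sensor."""
    lo, hi = max(index - 1, 0), min(index + 1, len(positions) - 1)
    p = np.asarray(positions[lo:hi + 1], dtype=float)
    e = np.asarray(energies[lo:hi + 1], dtype=float)
    return float(np.sum(p * e) / np.sum(e))   # energy-weighted centroid

# If the right-hand neighbor reports more energy than the left-hand neighbor,
# the returned position lands to the right of the peak sensor, as described above.
```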
- In an additional example, the sensor processing module 253 may operate to select all sensors receiving a threshold amount of energy (e.g., an absolute threshold level, a threshold level relative to the highest energy sensor, etc.). Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on respective energy weighting). For example, the sensor processing module 253 may perform non-linear splining between sensors in a horizontal direction with sensor location on a first axis and sensor energy on a second axis. The sensor processing module 253 may then operate to select the point on the sensor location axis corresponding to the peak sensor energy on the energy axis. Such splining and selecting may then be repeated in the vertical direction. Alternatively for example, the sensor processing module 253 may operate to perform multi-dimensional splining to create a surface based on sensor energy and select the highest point on such surface and the corresponding screen coordinates of such surface.
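- The threshold-and-spline variant might look like the following sketch for one screen axis: keep sensors above a relative threshold, spline energy versus position, and take the position of the spline's peak. The threshold value, sampling density, and use of a cubic spline are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# One-axis sketch: positions are per-sensor coordinates along the axis (assumed
# distinct), energies are the corresponding received energies.

def spline_peak_axis(positions, energies, rel_threshold=0.25):
    positions = np.asarray(positions, dtype=float)
    energies = np.asarray(energies, dtype=float)
    keep = energies >= rel_threshold * energies.max()
    if keep.sum() < 3:                                  # too few points to spline
        return float(positions[np.argmax(energies)])
    xs, ys = positions[keep], energies[keep]
    order = np.argsort(xs)                              # spline needs increasing x
    spline = CubicSpline(xs[order], ys[order])
    dense = np.linspace(xs.min(), xs.max(), 2000)       # sample densely, pick peak
    return float(dense[np.argmax(spline(dense))])
```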
- In a further example, the sensor processing module 253 may operate to select a first sensor (e.g., the sensor with the highest received energy). Then, for example, the sensor processing module 253 may utilize information of the relative distance between the selected sensor and the pointing device (e.g., the television control device 320), information of the gain pattern for the signal transmitted from the pointing device to the selected sensor, and calibration information to determine where the pointing device may be pointed in order for the sensor to receive such energy. For example, this may result in a first closed figure (e.g., a circle, cloverleaf, etc.) drawn around the sensor on the screen plane. Then the sensor processing module 253 may repeat the procedure for a second sensor (e.g., a sensor with the second highest received energy), resulting in a second closed figure. The sensor processing module 253 may then, for example, determine the point(s) of intersection between the first and second figures. If only one point of intersection lies within the border of the screen, then such point of intersection might be utilized as an estimate of the pointing location. If, however, there are two potentially significant points of intersection (or more depending on the figures), then the sensor processing module 253 may repeat the procedure for a third sensor (e.g., the sensor with the third highest energy, a sensor generally along the line perpendicular to a line segment between the first and second sensors, etc.) and determine a point nearest the intersection of the first, second and third closed figures. Such a point of intersection may then be utilized as an estimate of the pointing location. - The above-mentioned examples of screen-integrated sensors and related pointing location determinations were presented as exemplary illustrations. Though the above-mentioned examples generally discuss light and/or RF energy sensors, other types of sensors may also be integrated into a television screen or overlaid thereon. For example and without limitation, the sensors may comprise acoustic sensors that operate to sense acoustic energy (e.g., directed acoustic energy directed to a pointing location on the screen). For example, such directed acoustic energy may be formed at frequencies beyond the range of human hearing (e.g., and at frequencies beyond the range of pet hearing as well).
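- As a rough illustration of the closed-figure intersection approach described above, the sketch below intersects two circles (one per sensor, with radii that would in practice come from calibration and the gain pattern) and filters the candidates to those lying within the screen border; the geometry helpers and screen-bounds convention are assumptions for illustration.

```python
import math

# Standard circle-circle intersection; radii are assumed to be supplied by the
# caller (e.g., derived from calibration and the expected gain pattern).

def circle_intersections(c0, r0, c1, r1):
    """Return the 0, 1, or 2 intersection points of two circles."""
    (x0, y0), (x1, y1) = c0, c1
    d = math.hypot(x1 - x0, y1 - y0)
    if d == 0 or d > r0 + r1 or d < abs(r0 - r1):
        return []                                   # no usable intersection
    a = (r0 ** 2 - r1 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r0 ** 2 - a ** 2, 0.0))
    xm, ym = x0 + a * (x1 - x0) / d, y0 + a * (y1 - y0) / d
    if h == 0.0:
        return [(xm, ym)]                           # circles touch at one point
    dx, dy = h * (y1 - y0) / d, h * (x1 - x0) / d
    return [(xm + dx, ym - dy), (xm - dx, ym + dy)]

def on_screen(pt, width, height):
    """Keep only candidates inside a screen spanning (0, 0) to (width, height)."""
    return 0.0 <= pt[0] <= width and 0.0 <= pt[1] <= height

# If more than one on-screen candidate survives, a third circle (around a third
# sensor) could be intersected to disambiguate, as described above.
```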
- Also note that various energy radiation patterns may be used, and/or a plurality of energy radiation patterns may be used. For example, though (e.g., for illustrative clarity) the discussion herein generally discusses a single energy emission from the pointing device, a plurality of energy emissions may be utilized. For example and without limitation, a pointing device (e.g., the television control device 320) may transmit a plurality of different directed energy emissions (e.g., light, RF, etc.) toward the pointing direction. Also for example, a pointing device may transmit one or more energy emissions that move relative to the pointing direction (e.g., in a raster pattern or any other pattern).
- After determining on-screen pointing location, the
television control device 320 may communicate information of such determined location in various manners. For example and without limitation, thesensor processing module 253 of thetelevision control device 200 may utilize thetelevision interface module 235 to communicate information of such on-screen pointing location to thetelevision 301 for presentation to the user. Also for example, thesensor processing module 253 may utilize theuser interface module 240 to communicate information of such on-screen pointing location to the user (e.g., on a display of the television control device 200). Such communication will also be addressed in the discussions ofFIGS. 9-10 . - In addition to various television configurations in which sensors are integrated into the television screen, sensors may be incorporated into the television off-screen. Such sensors may, for example, be incorporated in a border around the screen (or overlaid thereon). For example and without limitation,
FIG. 4 is a diagram illustrating an exemplary television system 400 with off-screen television sensors in accordance with various aspects of the present invention. The television system 400 includes a television 401 comprising a television screen 403. The television system 400 also includes a television controller 420 (or other pointing device) pointing to an on-screen pointing location 430 along a pointing line 425 between the television controller 420 and the on-screen pointing location 430. The television controller 420 may, for example, share any or all aspects with the exemplary television controllers discussed previously. The television control device 420 may, for example, be communicatively coupled directly to the television 401 via a communication link 453. The television control device 420 may also, for example, be communicatively coupled directly to the television receiver 450 via communication link 452. The television control device 420 may additionally, for example, be communicatively coupled indirectly to the television 401 via the television receiver 450 through communication links 451 and 452. Various aspects of the television control device 420 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2. - The
television system 400 also comprises atelevision receiver 450 that is communicatively coupled to thetelevision 401 via a communication link 451 (e.g., a two-way communication link providing video information to thetelevision 401 and/or receiving sensor information from thetelevision 401 for communication to the television control device 420). Theexemplary television receiver 450 is also communicatively coupled to thetelevision controller 420 via acommunication link 452. - The
exemplary television 401 comprises an array of sensors integrated into the television 401 around the border of the screen 403. Three of such sensors are labeled 410, 411 and 412. As discussed above, any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes), RF sensors (e.g., antenna elements), acoustic sensors (e.g., microphones), etc. - The array of sensors may be integrated around the
television screen 403 in any of a variety of manners. For example, such sensors may be integrated in a border of thetelevision screen 403 that is not used for outputting video content. Such a configuration may, for example, avoid sensor interference with video content being displayed on the screen. Also for example, as illustrated inFIG. 4 , such sensors may be mounted to a border material of thetelevision 401. - For example, an array of photo detectors (e.g., photo diodes) and/or antenna elements (e.g., individual antennas or elements of an antenna array, for example, a phased array) may be incorporated into a border of the
television 401 around thescreen 403. For example, every screen pixel row and/or column may be associated with a pair of corresponding photo diodes and/or antenna elements, or every N×M block of screen pixels may be associated with one or more corresponding photo diodes or antenna elements (e.g., a row and column sensor, two row and two column elements, etc.). - In a photo detector implementation, passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source (e.g., a directional light source of the television control device 420) pointed at the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). In such a photo detector implementation (e.g., utilizing photo diodes), photo detectors may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections. As a non-limiting example, photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc. In one example, the photo detectors integrated with the television body off-screen may comprise photo diodes that operate to detect energy from a laser pointer or directed infrared energy from a television controller or other pointing device. Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to the television body off-screen. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
- In an antenna element implementation, an array of antenna elements may be positioned around the border of the
screen 403. In an RF antenna implementation, passive antennas (or elements of an overall antenna matrix) may receive varying amounts of respective RF energy depending on the pointing direction of a directional RF source aimed at the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to the television body off-screen. Such sensor installation may, for example, occur at the factory or after-market by a technician or user. - In an exemplary scenario, a user may point a pointing device (e.g., a
remote controller 420, a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at thetelevision screen 403, where the pointing device directs transmitted energy (e.g., light and/or RF energy and/or acoustic energy) at a particular location on thetelevision screen 403 to which the device is being pointed. Note that such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain pattern with the highest intensity likely at the center of the pattern (i.e., along the pointing line 425) and decreasing as a function of angle from the center of the pattern. Such a gain pattern is generally represented inFIG. 4 by the concentric circles around the on-screen pointing location 430. Note, however, that in practice such a gain pattern is likely to be more complex than the illustrated pattern (e.g., including lobes with respective peaks and nulls). - In such an exemplary scenario, each sensor of the sensors integrated into the
television 401 around the border of thescreen 403 will likely receive some respective amount of energy. For example, along a particular axis, the sensor nearest the screen pointing location 430 (i.e., along the pointing line 425) will likely receive the highest amount of energy, sensors along the particular axis adjacent to thescreen pointing location 430 will likely receive a next highest range of energy, and sensors away from thepointing location 430 will likely receive progressively less amounts of energy from the pointing device (e.g., the television control device 420), as a function of distance from thepointing location 430 or as a function of the angular displacement from thepointing line 425, until such energy is lost in the noise floor. - For example, along the horizontal axis,
sensor 410 is closest to the pointing location 430 and will likely receive the highest energy, with sensors adjacent to the left and right of sensor 410 receiving the next highest amounts of energy, and so on. Also, along the vertical axis, sensors 411 and 412 are nearest the pointing location 430, and such sensors will likely receive the highest amounts of energy among the vertically arranged sensors, with sensors farther above and below receiving progressively less energy. - In such an exemplary scenario, the television control device 420 (e.g., the
user interface module 240 of thetelevision control device 200 illustrated inFIG. 2 ) may receive signals indicative of the energy received by the sensors of thetelevision 401. Thetelevision control device 420 may receive such signals in various manners, depending on the degree of integration of such sensors into thetelevision 401. For example, in an exemplary scenario where the sensors are fully integrated into the television 401 (e.g., into a border around the screen 403) and operationally integrated into thetelevision 401, thetelevision control device 420 may receive such signals via a communication interface between thetelevision control device 420 and the television 401 (e.g., viacommunication link 453 or via a communication interface between thetelevision 401 and thetelevision control device 420 via the television receiver 450 (e.g., viacommunication links 451 and 452)). Also for example, in another exemplary scenario where the sensors are overlaid on (e.g., adhered to) thetelevision screen 401, and where operation of such sensors is independent of thetelevision 401, thetelevision control device 420 may receive such signals via a communication link directly between thetelevision control device 420 and the sensors, where such a communication link may be independent of other communication links between thetelevision control device 420 and thetelevision 401. Such communication link(s) may, for example, be adapted to communicate information from each sensor to thetelevision control device 420 serially (e.g., in a time-multiplexed manner) and/or in parallel. - The
user interface module 240 may then, for example, provide information of such received sensor signals to thesensor processing module 253 for processing. Thesensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location. Thesensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below. - For example, the
sensor processing module 253 may operate to select the sensor with the highest received energy along each of the horizontal and vertical axes and determine that the respective locations of such selected sensors correspond to the horizontal and vertical coordinates of the on-screen pointing location. For example, in an exemplary scenario where the spatial resolution of screen border sensors is relatively fine, such operation may reliably yield a desired level of accuracy without undue processing overhead. For example, the sensor processing module 253 may determine that sensors 410 and 411 receive the highest amounts of energy along the horizontal and vertical axes, respectively, and thus determine an on-screen pointing location represented in the horizontal axis by the horizontal location of the sensor 410 and represented in the vertical axis by the vertical location of the sensor 411. Note that in scenarios where two sensors have relatively similar energy levels (e.g., as might occur at sensors 411 and 412), the sensor processing module 253 may select a midpoint between such sensors (e.g., the vertical midpoint between sensors 411 and 412).
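- A small sketch of this per-axis selection, including the midpoint fallback for near-equal energies, might look as follows; the tie tolerance is an assumption for illustration.

```python
import numpy as np

# Per-axis border-sensor rule: take the position of the strongest sensor on each
# axis, and fall back to the midpoint when the two strongest are nearly tied.

def axis_estimate(positions, energies, tie_tolerance=0.10):
    """positions/energies: arrays for the sensors along one axis (at least two)."""
    p = np.asarray(positions, dtype=float)
    e = np.asarray(energies, dtype=float)
    order = np.argsort(e)[::-1]                        # indices by descending energy
    best, second = order[0], order[1]
    if abs(e[best] - e[second]) <= tie_tolerance * e[best]:
        return float((p[best] + p[second]) / 2.0)      # near-tie: use the midpoint
    return float(p[best])

# x_estimate = axis_estimate(column_sensor_x, column_energies)
# y_estimate = axis_estimate(row_sensor_y, row_energies)
```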
- In another example, the sensor processing module 253 may operate to select, for each screen axis, the sensor with the highest received energy and a plurality of sensors adjacent to such sensor. Then, for example, the sensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on weighting). For example, in the horizontal dimension in which a sensor to the right of the highest energy sensor 410 has a higher received energy than a sensor to the left of the highest energy sensor 410, the sensor processing module 253 may determine that the pointing location along the horizontal axis is to the right of the highest energy sensor 410. How much distance to the right may, for example, be determined as a function of the ratio between respective energies received by the right and left sensors. Such calculation may, for example, be a linear or non-linear calculation. Such calculation may also, for example, consider the expected energy pattern of a transmitting pointing device (e.g., in a scenario where energy fall-off is logarithmic as opposed to linear). The sensor processing module 253 may then, for example, repeat such operation in the vertical direction. - In another example, the
sensor processing module 253 may operate to select all sensors in each of the axes receiving a threshold amount of energy (e.g., an absolute threshold level, a threshold level relative to the highest energy sensor, etc.). Then, for example, thesensor processing module 253 may interpolate between the locations of such sensors (e.g., based, at least in part, on respective energy weighting). For example, thesensor processing module 253 may perform non-linear splining between sensors in a horizontal direction with sensor location on a first axis and sensor energy on a second axis. Thesensor processing module 253 may then operate to select the point on the sensor location axis corresponding to the peak sensor energy on the vertical axis. Such splining and selecting may then be repeated in the vertical screen direction. Alternatively for example, thesensor processing module 253 may operate to perform multi-dimensional splining to create a surface based on sensor energy and select the highest point on such surface and the corresponding screen coordinates of such surface. - After determining on-screen pointing location, the
television control device 420 may communicate information of such determined location in various manners. For example and without limitation, thesensor processing module 253 of thetelevision control device 200 may utilize thetelevision interface module 235 to communicate information of such on-screen pointing location to thetelevision 401 for presentation to the user. Also for example, thesensor processing module 253 may utilize theuser interface module 240 to communicate information of such on-screen pointing location to the user (e.g., on a display of the television control device 200). Such communication will also be addressed in the discussions ofFIGS. 9-10 . - In addition to various television configurations in which sensors are integrated into the television off-screen or off the video presentation portion of the screen, sensors may be incorporated into the television system off-television. Such sensors may, for example, be incorporated in other components of a television system besides the television. For example and without limitation,
FIG. 5 is a diagram illustrating an exemplary television system 500 with off-television sensors in accordance with various aspects of the present invention. The television system 500 includes a television 501 comprising a television screen 503. The television system 500 also includes a television controller 520 (or other pointing device) pointing to an on-screen pointing location 530 along a pointing line 525 between the television controller 520 and the on-screen pointing location 530. The television controller 520 may, for example, share any or all aspects with the exemplary television controllers discussed previously. Various aspects of the television control device 520 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2. The television control device 520 may, for example, be communicatively coupled directly to the television 501 via a communication link (not illustrated). The television control device 520 may also, for example, be communicatively coupled directly to the television receiver 550 via communication link 562. The television control device 520 may additionally, for example, be communicatively coupled indirectly to the television 501 via the television receiver 550 through communication links 561 and 562. - The
television system 500 also comprises atelevision receiver 550 that is communicatively coupled to thetelevision 501 via a communication link 561 (e.g., a two-way communication link providing video information to thetelevision 501 and/or receiving sensor information from thetelevision 501 for communication to the television control device 520). - The
television control device 520 is illustrated with one ormore communication links 563 to the various sensors 551-556 independent of other communication links (e.g., links to thetelevision 501, links to thetelevision receiver 550, etc.). Note that in various exemplary scenarios, the television control device 520 (e.g., a user interface module 240) may receive sensor information from thetelevision 501 via a television communication link (not illustrated), via acommunication link 562 with thetelevision receiver 550 and/or via the independent communication link(s) 563. The exemplarytelevision control device 520 may also be communicatively coupled to other pointing devices and/or television control devices. - The
exemplary television system 500 comprises an array of sensors integrated into audio speaker components (e.g., surround sound speakers) positioned around thetelevision 501. For example, thetelevision system 500 comprises aleft speaker 531 comprising atop sensor 552 and abottom sensor 551. Also for example, thetelevision system 500 comprises aright speaker 533 comprising atop sensor 556 and abottom sensor 555. Additionally for example, thetelevision system 500 comprises acenter speaker 532 comprising aleft sensor 553 and aright sensor 554. As discussed above, any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes), RF sensors (e.g., antenna elements), acoustic sensors (e.g., microphones), etc. Note that the audio speaker component example discussed herein is merely illustrative and that such sensors may be installed in any of a variety of locations (e.g., dedicated sensor boxes, attached to furniture, etc.). - The array of sensors may be positioned around the
television 501 in any of a variety of manners. For example, such sensors may be positioned around thetelevision 501 generally in the same plane as thetelevision screen 503. In such an exemplary scenario, on-screen pointing location may be determined in a manner similar to the interpolation and/or gain pattern intersection discussed above with regard to off-screen and/or on-screen sensors. Note that since the locations of the sensors are likely to be inconsistent between various television system configurations, a calibration procedure may be implemented (e.g., by the calibration module 251). Such calibration will be discussed in more detail below. - In an exemplary configuration, one or more photo detectors (e.g., photo diodes) and/or antenna elements (e.g., individual antennas or elements of an antenna array) may be incorporated into a plurality of respective surround sound speakers positioned around the
television 501. - For example, in a photo detector implementation, passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source (e.g., a directional light source of the television control device 520) aimed at the screen. As discussed previously, directed energy (e.g., light, RF, acoustic, etc.) may be transmitted in a pattern (or envelope), so even if a pointing device (e.g., the television control device 520) is pointed to a location on the
television screen 503 along pointing line 525, sensors off-screen (or even off-television) may still receive energy from the transmission (albeit likely not with the same intensity at which energy is delivered along the pointing line 525). Also for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). - In a photo detector implementation (e.g., utilizing photo diodes), photo diodes may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections, ambient light, room lighting, etc. As a non-limiting example, photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc. In one example, the photo detectors integrated with off-television components may comprise photo diodes that operate to detect energy from a laser pointer or directed infrared energy from a television controller (or other pointing device). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to various off-television components. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
- In an antenna element implementation, an array of antenna elements may be positioned around off-television components (e.g., in surround sound components). In an RF antenna implementation, passive antennas (or elements of an overall antenna matrix) may receive varying amounts of respective RF energy depending on the pointing direction of a directional RF source (e.g., a directional RF source of the television controller 520) pointed at a location on the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to the off-television components. Such sensor installation may, for example, occur at the factory or after-market by a technician or user.
- In an exemplary scenario, a user may point a pointing device (e.g., a television controller 520 (e.g., a remote control device), a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the
television screen 503, where the pointing device directs transmitted energy (e.g., light and/or RF energy and/or acoustic energy) at a particular location on thetelevision screen 503 to which the user is pointing with the pointing device. Note that such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain pattern with the highest intensity at the center of the pattern (i.e., along the pointing line 525) and decreasing as a function of angle from the center of the pattern (or distance from the center point). Such a gain pattern was discussed previously in the discussion ofFIG. 4 . - In such an exemplary scenario, each sensor of the sensors integrated into the
television system 500 off-television will likely receive some respective amount of energy. For example, along a particular axis, the sensor nearest to the screen pointing location 530 (i.e., along the pointing line 525) will likely receive the highest amount of energy, a sensor next nearest to the screen pointing location 530 will likely receive a next highest range of energy, and sensors away from the pointing location 530 will likely receive progressively less amounts of energy from the pointing device (e.g., the television control device 520), as a function of distance from the pointing location 530 and/or angle off the pointing line 525 (e.g., until such energy is lost in the noise floor). For example, sensor 553 is nearest to the pointing location 530 and will likely receive the highest energy, sensor 552 is next nearest to the pointing location 530 and will likely receive the next highest energy, and so on. - Note that in the implementation illustrated in
FIG. 5 , in particular since there are a relatively low number of sensors, signals from a same sensor may be utilized in determining multiple axes of pointing location. As mentioned previously, a calibration procedure may be performed when thesystem 500 is configured to assist in such pointing determination. - In an exemplary scenario, the television control device 520 (e.g., the
user interface module 240 of thetelevision control device 200 illustrated inFIG. 2 ) may receive signals indicative of the energy received by the sensors of thetelevision system 500. Thetelevision control device 520 may receive such signals in various manners, depending on the degree of integration of such sensors into thetelevision 501. For example, in an exemplary scenario where the sensors are fully integrated into thetelevision system 500 components (e.g., surround sound speaker components 531-533) and operationally integrated into such components, thetelevision control device 520 may receive such signals via a communication interface between thetelevision control device 520 and the respective off-television components (e.g., via acommunication link 563 between thetelevision control device 520 and the surround sound speaker components 531-533). Also for example, in another exemplary scenario where the sensors are overlaid on (e.g., adhered to) the off-television components, and where operation of such sensors is independent of thetelevision 501, thetelevision control device 520 may receive such signals via a communication link directly between thetelevision control device 520 and the individual sensors (e.g., communication link 563), where such a communication link may be independent of other communication links between thetelevision control device 520 and thetelevision 501 and/or independent of other communication links between thetelevision control device 520 andother television system 500 components (e.g.,television receiver 550 and the surround sound speaker components 531-533). - The
user interface module 240 may then, for example, provide information of such received sensor signals to thesensor processing module 253 for processing. Thesensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location. Thesensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below. - In an exemplary scenario, the
sensor processing module 253 may operate to estimate a position between sensor positions based on relative sensor energy. For example, in the horizontal dimension, sensor 552 may correspond to a relatively high amount of energy, and sensor 556 may correspond to a relatively low amount of received energy. The sensor processing module 253 may, for example, estimate a horizontal position relatively closer to sensor 552 by an amount proportional to the relative difference between respective amounts of energy. The sensor processing module 253 may perform a similar estimation utilizing other sensor pairs. Also for example, respective energies for the left speaker 531 sensors may be averaged, respective energies for the right speaker 533 sensors may be averaged, and such left and right speaker average energies may then be utilized to determine a horizontal pointing location. The sensor processing module 253 may then, for example, perform a similar pointing direction estimate in the vertical direction.
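- The proportional estimate described above can be sketched for one axis as an energy-weighted position between a left-side and a right-side sensor; the sensor naming and the averaging of each speaker's two sensors are illustrative assumptions about the FIG. 5 arrangement.

```python
# Energy-weighted position between two sensors along one axis: the estimate is
# pulled toward whichever sensor reported more energy.

def between_sensors(pos_left, pos_right, energy_left, energy_right):
    total = energy_left + energy_right
    if total == 0:
        return (pos_left + pos_right) / 2.0
    return (pos_left * energy_left + pos_right * energy_right) / total

# Averaging each speaker's two sensors before interpolating, as the text suggests
# (e_551, e_552, e_555, e_556 and the x positions are hypothetical names):
# left_avg = (e_551 + e_552) / 2.0
# right_avg = (e_555 + e_556) / 2.0
# x_estimate = between_sensors(x_left_speaker, x_right_speaker, left_avg, right_avg)
```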
- In another exemplary scenario, a calibration procedure may be performed to determine an expected sensor energy level (e.g., absolute or relative) when the user is pointing at the sensor. In such a scenario, combined with a gain pattern and user (or pointing device) location relative to the television 501, a first line (e.g., a circle or arc) may be drawn around a first sensor 552. Similarly, a second line (e.g., a circle or arc) may be drawn around a second sensor 553, and the intersection of the first and second lines utilized as an estimate of pointing location. Additional lines associated with other sensors may also be utilized. Such additional lines may, for example, be utilized when selecting between multiple line intersections and/or for greater accuracy or resolution. Note that such a line intersection solution may be applied to any of the previously discussed scenarios (e.g., as illustrated in FIGS. 3-4). A non-limiting example of this was presented in the discussion of FIG. 3, and another example will be provided in the following discussion of FIG. 7. - After determining on-screen pointing location, the
television control device 520 may communicate information of such determined location in various manners. For example and without limitation, the sensor processing module 253 of the television control device 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 501 for presentation to the user. Also for example, the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the user (e.g., on a display of the television control device 200). Such communication will also be addressed in the discussions of FIGS. 9-10. - As discussed above, pointing sensors may be incorporated into the television system off-television (i.e., placed separately in stand-alone housings, integrated with other apparatus, attached to other apparatus, etc.). Another example of such off-television sensor placement is presented in
FIG. 6. In particular, the screen pointing sensors may be integrated into the television receiver. FIG. 6 is a diagram illustrating an exemplary television system 600 with television receiver sensors in accordance with various aspects of the present invention. - The
television system 600 includes a television 601 comprising a television screen 603. The television system 600 also includes a television controller 620 (or other pointing device) pointing to an on-screen pointing location 630 along a pointing line 625 between the television controller 620 and the on-screen pointing location 630. The television controller 620 may, for example, share any or all aspects with the exemplary television controllers discussed previously. Various components of the television control device 620 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2. The television control device 620 may, for example, be communicatively coupled directly to the television 601 via a communication link (not illustrated). The television control device 620 may also, for example, be communicatively coupled directly to the television receiver 650 via communication link 653. The television control device 620 may additionally, for example, be communicatively coupled indirectly to the television 601 via the television receiver 650 through the respective communication links. - The
television system 600 also comprises atelevision receiver 650 that is communicatively coupled to thetelevision 601 via a communication link 651 (e.g., a two-way communication link providing video information to thetelevision 601 and/or communicating sensor information and/or screen pointing information with the television 601). Thetelevision receiver 650 comprises an array of screen pointing sensors. A portion of the sensors are labeled (661-665) for discussion purposes. Note that such sensors may be arranged in any of a variety of configurations (e.g., matrix configuration, border configuration, placed only at the front corners, etc.). The pointing sensors may, for example, be integrated into thetelevision receiver 650 and/or attached to thetelevision receiver 650 in any of a variety of manners (e.g., in any manner similar to those discussed previously with regard to the televisions and/or television system components discussed previously). - Note that in various exemplary scenarios, the television control device 620 (e.g., a user interface module 240) may receive additional sensor information from other sensors via the
television communication link 653 and/or other communication links. The exemplary television control device 620 is also communicatively coupled to the television receiver 650 via a communication link 652. - The
exemplary television receiver 650 comprises an array of sensors integrated into thetelevision receiver 650. For example, thetelevision receiver 650 comprises a lowerleft sensor 661, upperleft sensor 662, upperright sensor 663, lowerright sensor 664 andcenter sensor 665. As discussed above, any of a variety of sensor types may be utilized, non-limiting examples of which include light sensors or photo detectors (e.g., photo diodes), RF sensors (e.g., antenna elements), acoustic sensors (e.g., microphones), etc. - The
exemplary television receiver 650 may be positioned around thetelevision 601 in any of a variety of manners. For example, the television receiver 650 (and thus the sensors) may be positioned around thetelevision 601 in an orientation such that the front face of the television receiver 650 (and thus the sensors) is generally in the same plane as thetelevision screen 603. Such placement is not necessary, but may be advantageous from an accuracy perspective. In such an exemplary scenario, on-screen pointing location may be determined in a manner similar to the interpolation and/or gain pattern intersection discussed above with regard to off-screen and/or on-screen sensors. Note that since the locations of the sensors are likely to be inconsistent between various television system configurations (i.e., it is unlikely that every user will place/position thetelevision receiver 650 in the same manner), a calibration procedure may be implemented (e.g., by the calibration module 251). Such calibration was discussed previously and will also be revisited below. - In an exemplary configuration, one or more photo detectors (e.g., photo diodes) and/or antenna elements (e.g., individual antennas or elements of an antenna array) may be incorporated into the faceplate of the
television receiver 650. Note that additional sensors positioned away from thetelevision receiver 650 may also be utilized (e.g., any of the previously discussed sensor placements). - For example, in a photo detector implementation, passive photo detectors may receive varying amounts of respective light energy depending on the pointing direction of a light source (e.g., a directional light source of the television control device 620) aimed at the screen. As discussed previously, directed energy (e.g., light, RF, acoustic, etc.) may be transmitted in a pattern (or envelope), so even if a pointing device is pointed to a location on the
television screen 603 (e.g., the on-screen pointing location 630) along the pointing line 625, sensors off-screen (e.g., sensors integrated into the television receiver 650) may still receive energy from the transmission (albeit likely not with the same intensity at which energy is delivered along the pointing line 625). Also for example, received signals (e.g., pulsed signals) may arrive at different sensors at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). - In a photo detector implementation (e.g., utilizing photo diodes), photo diodes may, for example, be tuned to react to particular light frequencies to reduce interference from output pixel light and/or associated reflections, ambient light, room lighting, etc. As a non-limiting example, photo diodes may be tuned to detect light that is not visible to the human eye, visible light frequencies that are relatively rare, light patterns that are unlikely to occur in a television program (e.g., particular pulse codes), etc. In one example, the photo detectors integrated with the
television receiver 650 may comprise photo diodes that operate to detect energy from a laser pointer or directed infrared energy from the television control device 620 (or other pointing device). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors tovarious television receiver 650 locations and/or to various off-receiver components. Such sensor installation may, for example, occur at the factory or after-market by a technician or user. - In an antenna element implementation, an array of antenna elements may be positioned at locations on the television receiver 650 (e.g., only on the
television receiver 650 and/or at locations around the television receiver 650). In an RF antenna implementation, passive antennas (or elements of an overall antenna matrix) may receive varying amounts of respective RF energy depending on the pointing direction of a directional RF source pointed at a location on the screen. Also for example, received signals (e.g., pulsed signals) may arrive at different antennas at different respective times/phases (e.g., being indicative of relative position and/or pointing direction, which may also be utilized in a pointing determination). Note that analogously to the on-screen sensors discussed previously, various aspects may comprise mounting (e.g., adhering) sensors to thetelevision receiver 650. Such sensor installation may, for example, occur at the factory or after-market by a technician or user. - In an exemplary scenario, a user may point a pointing device (e.g., the
remote control device 620, a laser pointer, directional RF transmitter, specifically designed eyewear, a mobile computing device, a mobile communication device, a gesture tracking device or glove, etc.) at the television screen 603, where the pointing device directs transmitted energy (e.g., light and/or RF energy and/or acoustic energy) at a particular location on the television screen 603 to which the user is pointing with the pointing device. Note that such transmitted energy will likely be transmitted directionally and be associated with an intensity or gain (or energy) pattern with the highest intensity at the center of the pattern (i.e., along the pointing line 625) and decreasing as a function of angle from the center of the pattern. Such a gain pattern was discussed previously in the discussion of FIG. 4.
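- As a non-limiting illustration of such a directional energy (or gain) pattern, the following sketch assumes a simple cosine-power falloff. The model form, the falloff exponent, and the names used are illustrative assumptions rather than a characterization of any particular pointing device.

```python
import math

# Non-limiting sketch: one simple model of a directional gain (energy) pattern.
# The cosine-power falloff and the names used here are illustrative assumptions.

def received_energy(tx_energy, angle_off_axis_rad, falloff_exponent=8.0):
    """Predict relative energy seen by a sensor located at the given angle
    off the pointing line. Energy is highest on-axis and decreases with angle."""
    cos_a = math.cos(angle_off_axis_rad)
    if cos_a <= 0.0:
        return 0.0                      # sensor is behind the beam; no direct energy
    return tx_energy * (cos_a ** falloff_exponent)

# Example: a sensor 5 degrees off the pointing line sees most of the energy,
# while a sensor 40 degrees off the line sees comparatively little.
print(received_energy(1.0, math.radians(5)))
print(received_energy(1.0, math.radians(40)))
```

Under such a model, comparing the energies reported by several sensors at known positions allows the relative proximity of each sensor to the pointing line to be inferred, which is the basis for the interpolation and intersection approaches described herein.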
- In such an exemplary scenario, each sensor of the sensors integrated into the television receiver 650 off-television will likely receive some respective amount of energy. For example, along a particular axis, the sensor nearest to the screen pointing location 630 (i.e., along the pointing line 625) will likely receive the highest amount of energy, a sensor next nearest to the screen pointing location 630 will likely receive a next highest range of energy, and sensors away from the pointing location 630 will likely receive progressively less amounts of energy from the pointing device 620, as a function of distance from the pointing location 630 and/or angle off the pointing line 625 (e.g., until such energy is lost in the noise floor). For example, sensor 662 is nearest to the pointing location 630 and will likely receive the highest energy, sensors farther from the pointing location 630 will likely receive correspondingly less energy, and so on. - Note that in the implementation illustrated in
FIG. 6, in particular since there are a relatively small number of sensors, signals from the same sensor may be utilized in determining multiple axes of the pointing location. As mentioned previously, a calibration procedure may be performed when the system 600 is configured to assist in such pointing determination. - In an exemplary scenario, the television control device 620 (e.g., the
user interface module 240 of the television control device 200 illustrated in FIG. 2) may receive signals indicative of the energy received by the sensors of the television receiver 650 (e.g., via the communication link 652 between the television control device 620 and the television receiver 650 and/or via a communication link directly between the television control device 620 and the sensors). The television control device 620 may receive such signals in various manners, depending on the degree of integration of such sensors into the television receiver 650 and/or various components of the television system 600. For example, in an exemplary scenario where the sensors are fully integrated into the television receiver 650, the television control device 620 may receive such signals via communication link 652. Also for example, in a scenario where various sensors are off the television receiver 650, the television control device 620 may receive information from such sensors via a direct communication link or via a communication link with the various components with which such sensors are integrated. - The communication module 230 may then, for example, provide information of such received sensor signals to the
sensor processing module 253 for processing. The sensor processing module 253 may then, for example, operate to process such information to determine the screen pointing location. The sensor processing module 253 may perform such processing in any of a variety of manners, non-limiting examples of which will be provided below. - In an exemplary scenario, the
sensor processing module 253 may operate to estimate a position between sensor positions based on relative sensor energy. For example, in the horizontal dimension, sensor 662 may correspond to a relatively high amount of energy, and sensor 663 may correspond to a relatively low amount of received energy. The sensor processing module 253 may, for example, estimate a horizontal position relatively closer to sensor 662 by an amount proportional to the relative difference between the respective amounts of energy. The sensor processing module 253 may perform a similar estimation utilizing other sensors (e.g., averaging respective energies for left side sensors and for right side sensors and utilizing such averages to determine a horizontal pointing location). The sensor processing module 253 may then, for example, perform a similar pointing direction estimate in the vertical direction. Such horizontal and/or vertical positions may, for example, be translated between respective locations/directions of the sensor arrangement and respective locations/directions of the television screen 603. Calibration procedures may, for example, be utilized to establish the spatial relationship between the sensor positioning and on-screen location.
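- The translation between sensor-arrangement coordinates and screen coordinates may, for example, be illustrated by the following non-limiting sketch, which fits an affine transform to calibration samples. The sample values, the assumed screen resolution, and the function names are assumptions introduced for illustration only.

```python
import numpy as np

# Non-limiting sketch: map a position expressed in the coordinates of the
# off-screen sensor arrangement into television screen coordinates using an
# affine transform fitted from calibration samples. The calibration data,
# names, and the choice of an affine model are illustrative assumptions.

def fit_sensor_to_screen(sensor_pts, screen_pts):
    """Fit [sx, sy, 1] @ A ~= [x, y] in a least-squares sense."""
    S = np.hstack([np.asarray(sensor_pts, float), np.ones((len(sensor_pts), 1))])
    T = np.asarray(screen_pts, float)
    A, *_ = np.linalg.lstsq(S, T, rcond=None)   # 3x2 affine matrix
    return A

def sensor_to_screen(A, sensor_xy):
    sx, sy = sensor_xy
    return np.array([sx, sy, 1.0]) @ A

# Calibration: positions estimated in sensor coordinates while the user pointed
# at known on-screen targets (e.g., corners and center of the screen).
sensor_samples = [(0.05, 0.02), (0.95, 0.03), (0.94, 0.88), (0.06, 0.90), (0.50, 0.45)]
screen_targets = [(0, 0), (1920, 0), (1920, 1080), (0, 1080), (960, 540)]

A = fit_sensor_to_screen(sensor_samples, screen_targets)
print(sensor_to_screen(A, (0.50, 0.45)))   # approximately the screen center
```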
- In another exemplary scenario, a calibration procedure may be performed to determine an expected sensor energy level (e.g., absolute or relative) when the user is pointing at the sensor (and/or other known locations). In such a scenario, combined with a gain pattern and user (or pointing device) location relative to the television 601, a first line (e.g., a circle or arc) may be drawn around a first sensor 662. Similarly, a second line (e.g., a circle or arc) may be drawn around a second sensor 663, and the intersection of the first and second lines utilized as an estimate of pointing location. Additional lines associated with other sensors may also be utilized. Such additional lines may, for example, be utilized when selecting between multiple line intersections or to increase accuracy and/or resolution of the pointing determination. Note that such a line intersection solution may be applied to any of the previously discussed scenarios (e.g., as illustrated in FIGS. 3-5) or other scenarios discussed herein. A non-limiting example of this was presented in the discussion of FIG. 3, and another example will be provided in the following discussion of FIG. 7. - After determining on-screen pointing location, the
television control device 620 may communicate information of such determined location in various manners. For example and without limitation, the sensor processing module 253 of the television control device 200 may utilize the television interface module 235 to communicate information of such on-screen pointing location to the television 601 for presentation to the user on the television screen 603. Also for example, the sensor processing module 253 may utilize the user interface module 240 to communicate information of such on-screen pointing location to the user (e.g., on a display of the television control device 620). Such communication will also be addressed in the discussions of FIGS. 9-10. - Various aspects of the present invention may also, for example, include one or more sensors incorporated into the pointing device (e.g., the television controller 200).
FIG. 7 is a diagram illustrating an exemplary television system 700 utilizing pointing device sensors in accordance with various aspects of the present invention. - The
exemplary television system 700 includes a television 701 having a television screen 703. The television system 700 also includes a television controller 720 (or other pointing device) pointing to an on-screen pointing location 730 along a pointing line 725 between the television controller 720 and the on-screen pointing location 730. The television controller 720 may, for example, share any or all aspects with the exemplary television controllers discussed previously. Various components of the television control device 720 will be explained herein with reference to various components of the exemplary television control device 200 illustrated in FIG. 2. - The
television control device 720 may, for example, be communicatively coupled directly to the television 701 via a communication link 753. The television control device 720 may also, for example, be communicatively coupled directly to the television receiver 750 via communication link 752. The television control device 720 may additionally, for example, be communicatively coupled indirectly to the television 701 via the television receiver 750 through communication links 752 and 751. - The
television system 700 also comprises atelevision receiver 750 that is communicatively coupled to thetelevision 701 via a communication link 751 (e.g., a two-way communication link providing video information to thetelevision 701 and/or receiving sensor information from the television 701). Theexemplary television receiver 750 is also communicatively coupled to thetelevision controller 720 via acommunication link 752. - In such a configuration, sensor information may be communicated to the television control device 720 (e.g., via internal communication link). Such information may then be communicated to the
sensor processing module 253 for the determination of an on-screen pointing location. - In the exemplary configuration, the
television 701 includes eight emitters (e.g., light emitters, RF transmitters, etc.) located around the border of thetelevision screen 703. Note that such emitters may be positioned anywhere proximate thetelevision system 700. For example, thetelevision 701 includes a first emitter 711,second emitter 712,third emitter 713,fourth emitter 714,fifth emitter 715, sixth emitter 716, seventh emitter 717 andeighth emitter 718. Such emitters may each emit a signal that may be received at sensors on-board thetelevision control device 720. Such sensors may, for example, make up a directional receiver. In such a configuration, the controller 720 (or other pointing device) may be pointed to alocation 730 on thescreen 703 along apointing line 725. With such an orientation and a directional signal reception pattern, the sensors on-board thecontroller 720 will receive the emitted signals at respective signal levels. Such sensor signals may then be processed in a manner similar to the manners discussed above to determine the on-screen pointing direction for thepointing device 720. - For example, through a calibration procedure, it may be known that the pointing device at a particular location should receive a particular amount of energy from each of the emitters 711-718 when pointed directly at such emitters (or at some other known location). In such a scenario, the pointing device (e.g., the
user interface module 240 of the television control device 200) may measure respective signal energies received from each of the emitters (e.g., each distinguishable by frequency, coding, timing and/or timeslotting, etc.) and communicate such information to the television receiver 750. - The
sensor processing module 253 may, for example, select a first emitter 712 (e.g., the emitter corresponding to the highest energy received at the pointing device). The sensor processing module 253 may then process the location of the pointing device, the receive gain pattern for the pointing device, and the energy received from the first emitter 712 to determine a first figure (e.g., an arc 752) along which the pointing device, if pointed, would be expected to receive the measured energy. Similarly, the sensor processing module 253 may perform such a procedure for a second emitter 711 resulting in a second figure (e.g., an arc 751). The intersection of such arcs may be utilized as an estimate of on-screen pointing location. Additionally, for accuracy or for selecting between multiple intersection points, should they occur, the sensor processing module 253 may perform such a procedure for a third emitter 714 resulting in a third figure (e.g., an arc 754), and so on. The intersection of the three arcs may then, for example, be utilized as a refined estimate of the on-screen pointing location.
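- One non-limiting way to compute such a figure (circle) intersection is sketched below. The emitter positions, the estimated radii, and the names are illustrative assumptions, and a third circle is used only to select between the two candidate intersection points.

```python
import math

# Non-limiting sketch: intersect two circles, each centered on an emitter (or
# sensor) position in the screen plane with a radius estimated from the energy
# measured for that emitter, then use a third circle to pick between the two
# candidate intersection points. Positions, radii, and names are illustrative.

def circle_intersections(c0, r0, c1, r1):
    """Return 0, 1, or 2 intersection points of two circles ((x, y), r)."""
    (x0, y0), (x1, y1) = c0, c1
    d = math.hypot(x1 - x0, y1 - y0)
    if d == 0 or d > r0 + r1 or d < abs(r0 - r1):
        return []                                  # no usable intersection
    a = (r0**2 - r1**2 + d**2) / (2 * d)
    h = math.sqrt(max(r0**2 - a**2, 0.0))
    mx, my = x0 + a * (x1 - x0) / d, y0 + a * (y1 - y0) / d
    if h == 0.0:
        return [(mx, my)]
    ox, oy = h * (y1 - y0) / d, h * (x1 - x0) / d
    return [(mx + ox, my - oy), (mx - ox, my + oy)]

def pick_candidate(candidates, c2, r2):
    """Choose the candidate best agreeing with a third circle."""
    return min(candidates,
               key=lambda p: abs(math.hypot(p[0] - c2[0], p[1] - c2[1]) - r2))

# Three emitters at assumed screen-plane positions with energy-derived radii.
pts = circle_intersections((10, 5), 18.0, (50, 5), 27.0)
if pts:
    estimate = pick_candidate(pts, (30, 40), 22.0)
    print(estimate)
```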
- Alternatively, the solution need not be based on a known position (location) of the pointing device, nor on absolute received energy levels. In such a scenario, differences in received energy from the various emitters may be processed with or without position information of the on-screen pointing device. For example, the pointing device 720 may have six degrees of freedom (e.g., three positional degrees of freedom and three orientational degrees of freedom). In such a scenario, if the position and orientation of the television 701 are known, the unknown six degrees of freedom for the pointing device 720 may be ascertained by processing six known values related to such six degrees of freedom (e.g., related by a known signal energy pattern). In such a scenario, measurements associated with six emitters on the television (and potentially more) may be utilized to solve for the six degrees of freedom of the pointing device 720. - The above-mentioned exemplary scenarios were presented to illustrate numerous manners in which the television control device 720 (e.g., sensor processing module 253) may operate to determine on-screen pointing location. Such examples are merely exemplary and thus the scope of various aspects of the present invention should not be limited by any particular characteristics of such examples unless explicitly claimed.
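- By way of further non-limiting illustration of the emitter-based scenario above, the following sketch estimates the pose of the pointing device from the relative emitter energies using a generic least-squares solver. The assumed cosine-power gain model, the emitter layout, and all names are illustrative assumptions; because such a rotationally symmetric gain model does not constrain roll about the pointing axis, only five pose parameters are fitted here.

```python
import numpy as np
from scipy.optimize import least_squares

# Non-limiting sketch: recover the pose of the pointing device from the energies
# it measures for several television-mounted emitters, by least-squares fitting
# under an assumed cosine-power receive gain pattern with range loss.

EMITTERS = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0], [1.2, 0.7, 0.0],
                     [0.0, 0.7, 0.0], [0.3, 0.35, 0.0], [0.9, 0.1, 0.0]])  # meters, illustrative

def pointing_axis(yaw, pitch):
    return np.array([np.cos(pitch) * np.sin(yaw),
                     np.sin(pitch),
                     -np.cos(pitch) * np.cos(yaw)])

def predicted_energies(pose, falloff=8.0):
    x, y, z, yaw, pitch = pose
    device = np.array([x, y, z])
    to_emitters = EMITTERS - device
    dist = np.linalg.norm(to_emitters, axis=1)
    cos_a = np.clip(to_emitters @ pointing_axis(yaw, pitch) / dist, 0.0, 1.0)
    return (cos_a ** falloff) / dist**2          # directional falloff + range loss

def solve_pose(measured, initial=(0.6, 0.3, 2.5, 0.0, 0.0)):
    residuals = lambda p: predicted_energies(p) - measured
    return least_squares(residuals, initial).x

# Example with synthetic measurements generated from a known pose.
true_pose = np.array([0.4, 0.2, 2.0, 0.15, -0.05])
measured = predicted_energies(true_pose)
print(solve_pose(measured))
```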
- As discussed above, the
calibration module 251 of the television control device 200 may operate to perform calibration operations. Such calibrating may be performed in any of a variety of manners. For example and without limitation, calibration may be utilized to determine expected received energy when transmitters and receivers are located and oriented in a particular manner. For example, a non-limiting calibration procedure may comprise presenting an on-screen target at various locations and measuring the respective sensor signals received when the pointing device is being pointed at such targets. Also for example, a calibration procedure may comprise directing a user (e.g., using the user interface module 240) to point to each of a plurality of sensors to determine an expected amount of received energy when the user is pointing directly at such sensors. - As mentioned previously, a signal energy (or gain) pattern may be utilized in various on-screen pointing determinations. Such an energy (or gain) pattern may be predefined for a particular pointing device (e.g., at the factory), but may also be measured by the
television control device 200. In a non-limiting example, the calibration module 251 may direct the user to utilize a pointing device to point to a location on the screen and process information received from multiple sensors (e.g., embedded in the screen, embedded in the television around the border of the screen, located in off-television devices, located on the television control device 720, located in the pointing device, etc.) to develop a custom gain pattern for the particular pointing device (e.g., for the television control device 200). For example, such calibration may determine the shape of the gain pattern, the signal energy falloff characteristics, etc.
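- As a non-limiting illustration, such a custom gain pattern might be characterized by fitting a falloff exponent to calibration samples, as sketched below. The model form, the sample values, and the names are assumptions introduced for illustration only.

```python
import numpy as np

# Non-limiting sketch: estimate the falloff exponent of a cosine-power gain
# pattern from calibration samples, i.e., pairs of (angle off the pointing line,
# measured energy) collected while the user pointed at known on-screen targets.

def fit_falloff_exponent(angles_rad, energies):
    """Fit energy ~= E0 * cos(angle)**n via least squares in log space."""
    cos_a = np.cos(np.asarray(angles_rad, float))
    y = np.log(np.asarray(energies, float))
    x = np.log(cos_a)
    # Solve y = log(E0) + n * x for [log(E0), n].
    A = np.column_stack([np.ones_like(x), x])
    (log_e0, n), *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.exp(log_e0), n

# Calibration samples (angles in degrees, normalized measured energies).
angles = np.radians([2.0, 5.0, 10.0, 20.0, 30.0])
energies = [0.99, 0.97, 0.88, 0.62, 0.33]
e0, n = fit_falloff_exponent(angles, energies)
print(e0, n)    # fitted on-axis energy and falloff exponent
```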
- Various aspects discussed above included the processing of position information. In such exemplary cases, the television control device 200 may comprise one or more location modules 252 that operate to determine relevant position information. The location module 252 may operate to perform such location determining (e.g., of the user or pointing device and/or the television) in any of a variety of manners. For example, the location module 252 may utilize a communication interface module to receive position information (e.g., regarding the television control device 200 or other pointing device) from an external source of such information (e.g., global positioning system, cellular triangulation system, home triangulation system, etc.). - Also for example, the
location module 252 may receive position information from internal components of the television control device 200 (e.g., where such television control device 200 has position-determining capability). For example, in a non-limiting exemplary scenario where the television control device 200 is a handheld computer, such computer may comprise GPS (or A-GPS) capability to determine its position. In such a scenario, the television control device 200 location module 252 may wirelessly communicate information of the television control device's position to the sensor processing module 253. - Additionally for example, the location module 252 may operate to process sensor information to determine the location of the pointing device (e.g., location in relation to the television screen). For example, as mentioned previously, a signal (e.g., a pulse) transmitted from a pointing device to the television (or vice versa) will arrive at different sensors at different points in time depending on the respective distance from the pointing device to each sensor. The location module 252 may process such time-of-arrival information at various sensors to determine the position of the pointing device relative to the television. Similarly, in a scenario including signal emitters associated with the television and sensors on the pointing device, simultaneously transmitted signals (or signals transmitted with a known temporal pattern) from different emitters will arrive at the pointing device at different respective times depending on the position of the pointing device relative to such emitters. Alternatively, the location module 252 may also operate to process phase difference information (in addition to timing information, or instead of such information) to determine pointing device location.
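- By way of non-limiting illustration, the following sketch estimates pointing device position from such time-of-arrival differences. The sensor layout, the propagation speed (an acoustic example), and the names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

# Non-limiting sketch: estimate the pointing device position from differences in
# the arrival time of a single transmitted pulse at several television-side
# sensors (time-difference-of-arrival).

SENSORS = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0],
                    [1.2, 0.7, 0.0], [0.0, 0.7, 0.0]])   # meters, illustrative
C = 343.0                                                # m/s (acoustic example)

def tdoa_residuals(device_xyz, arrival_times):
    dist = np.linalg.norm(SENSORS - device_xyz, axis=1)
    # Differences relative to the first sensor cancel the unknown emission time.
    return (dist - dist[0]) - C * (arrival_times - arrival_times[0])

def locate_device(arrival_times, initial=(0.6, 0.35, 2.0)):
    return least_squares(tdoa_residuals, initial, args=(arrival_times,)).x

# Example with synthetic arrival times generated from a known device position.
true_pos = np.array([0.3, 0.2, 2.4])
times = np.linalg.norm(SENSORS - true_pos, axis=1) / C + 0.01   # + emission time
print(locate_device(times))
```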
- Once the television control device 200 (e.g., the sensor processing module 253) determines an on-screen pointing location, the
television control device 200 may utilize such information in any of a variety of manners. For example and without limitation, thesensor processing module 253 may operate to generate information of the determined on-screen pointing location, and one or more modules of thetelevision control device 200 may operate to communicate a signal (e.g., to a television, television receiver, other display device, U/I modules 240 of thetelevision control device 200, etc.) that comprises characteristics that cause presentation of a visual indication (e.g., on the television screen, controller screen, other display, etc.) to indicate to the user the on-screen location to which thetelevision control device 200 has determined the user is pointing. Such a visual indication may, for example, comprise characteristics of a cursor or other graphical construct, bright spot, highlighting, color variation, brightness variation, etc. For example, thetelevision 701 ortelevision control device 720 may operate to overlay such indication on video content (e.g., television programming) being presented to the user (e.g., presented on the television screen, presented on a screen of the television controller, etc.). - Additionally for example, the
sensor processing module 253 may provide information of the determined on-screen pointing location to one or more other modules of the television control device 200 (e.g., theprocessing module 250 and/or other modules thereof) to identify an object in video content (e.g., television programming) to which a user is pointing. In such an exemplary scenario, one or more modules of thetelevision control device 200 may operate to communicate signals (e.g., to a television, other modules of the television controller having a screen, other display device, etc.) that cause highlighting of an object to which the user is pointing and/or provide information regarding such object. - Further for example, various modules of the television control device 200 (e.g., the processor module 250) may operate to communicate on-screen pointing location information to television system components separate from the television (e.g., to a television receiver, video recorder, remote programming source, communication network infrastructure, advertising company, provider of goods and/or services, etc.).
-
FIG. 2 provided a diagram illustrating an exemplary television control device 200 in accordance with various aspects of the present invention. FIG. 8 provides another diagram illustrating an exemplary television control device 800 in accordance with various aspects of the present invention. The exemplary television control device 800 may share any or all aspects with any of the television control devices discussed herein and illustrated in FIGS. 1-7. For example, the exemplary television control device 800 (or various modules thereof) may operate to perform any or all functionality discussed herein. As with the exemplary television control device 200, the components of the exemplary television control device 800 may be co-located in a single housing. - For example, the
television control device 800 comprises a processor 830. Such a processor 830 may, for example, share any or all characteristics with the processor 250 discussed with regard to FIG. 2. Also for example, the television control device 800 comprises a memory 840. Such memory 840 may, for example, share any or all characteristics with the memory 260 discussed with regard to FIG. 2. - Also for example, the
television control device 800 may comprise any of a variety of user interface module(s) 850. Such user interface module(s) 850 may, for example, share any or all characteristics with the user interface module(s) 240 discussed previously with regard toFIG. 2 . For example and without limitation, the user interface module(s) 850 may comprise: a display device, a camera (for still or moving picture acquisition), a speaker, an earphone (e.g., wired or wireless), a microphone, a video screen (e.g., a touch screen display), a vibrating mechanism, a keypad, a remote control interface, and/or any of a variety of other user interface devices (e.g., a mouse, a trackball, a touch pad, touch screen, light pen, game controlling device, etc.). - The exemplary
television control device 800 may also, for example, comprise any of a variety of communication modules (805, 806, and 810). Such communication module(s) may, for example, share any or all characteristics with the communication interface module(s) 210, 220 discussed previously with regard toFIG. 2 . For example and without limitation, the communication interface module(s) 810 may comprise: a Bluetooth interface module; an IEEE 802.11, 802.15, 802.16 and/or 802.20 module; any of a variety of cellular telecommunication interface modules (e.g., GSM/GPRS/EDGE, CDMA/CDMA2000/1x-EV-DO, WCDMA/HSDPA/HSUPA, TDMA/PDC, WiMAX, etc.); any of a variety of position-related communication interface modules (e.g., GPS, A-GPS, etc.); any of a variety of wired/tethered communication interface modules (e.g., USB, Fire Wire, RS-232, HDMI, component and/or composite video, Ethernet, wireline and/or cable modem, etc.); any of a variety of communication interface modules related to communicating with external memory devices; etc. The exemplarytelevision control device 800 is also illustrated as comprising various wired 806 and/orwireless 805 front-end modules that may, for example, be included in the communication interface modules and/or utilized thereby. - The exemplary
television control device 800 may also comprise any of a variety of signal processing module(s) 890. Such signal processing module(s) 890 may, for example, be utilized to assist in processing various types of information discussed previously (e.g., with regard to sensor processing, position determination, video processing, image processing, audio processing, general user interface information data processing, etc.). For example and without limitation, the signal processing module(s) 890 may comprise: video/graphics processing modules (e.g. MPEG-2, MPEG-4, H.263, H.264, JPEG, TIFF, 3-D, 2-D, MDDI, etc.); audio processing modules (e.g., MP3, AAC, MIDI, QCELP, AMR, CMX, etc.); and/or tactile processing modules (e.g., Keypad I/O, touch screen processing, motor control, etc.). - Various aspects of the present invention were previously exemplified by non-limiting illustrations and described in terms of operations performed by various modules of the television. Various aspects of the present invention will now be illustrated in the form of method flow diagrams.
-
FIG. 9 is a flow diagram 900 illustrating the generation of on-screen pointing information (e.g., in a television control device) in accordance with various aspects of the present invention. The exemplary method 900 may, for example, share any or all characteristics with the television control device operation discussed previously. For example, the exemplary method 900 may be implemented by any or all of the television control devices (e.g., 160, 161, 200, 220, 320, 420, 520, 620, 720 and 800) discussed previously. Conversely, the exemplary method 900 may comprise any or all functional aspects discussed previously with regard to such exemplary television control devices. - The
exemplary method 900 may begin executing at step 905. Theexemplary method 900 may begin executing in response to any of a variety of causes and/or conditions. For example and without limitation, themethod 900 may begin executing in response to a user command to begin, detected user interaction with a pointing device (e.g., a television controller), detected user presence in the vicinity, detected user interaction with a television implementing themethod 900, etc. Also for example, themethod 900 may begin executing in response to a television presenting programming or other video content for which on-screen pointing is enabled and/or relevant. - The
exemplary method 900 may, for example atstep 910, comprise receiving pointing sensor information. For example and without limitation, step 910 may comprise any or all sensor information receiving characteristics described previously with regard the various modules of the exemplary television control devices illustrated inFIGS. 1-8 and discussed previously. For example, step 910 may share any or all sensor information receiving characteristics discussed previously with regard to at least theuser interface module 240,television interface module 235,processor module 250,communication interface modules sensor processing module 253,location module 252 andcalibration module 251. - Step 910 may, for example, comprise receiving sensor information from (or associated with) sensors integrated in the television control device. Also for example, step 910 may comprise receiving sensor information from (or associated with) off-controller sensors (e.g., integrated with or attached to a television, off-television sensors, sensors integrated with a pointing device different from the television control device, sensors integrated with a television receiver, etc. As discussed previously, such sensors may comprise any of a variety of characteristics, including without limitation, characteristics of light sensors, RF sensors, acoustic sensors, active and/or passive sensors, etc.
- In general,
step 910 may comprise receiving pointing sensor information. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of receiving pointing sensor information unless explicitly claimed. - The
exemplary method 900 may, atstep 920, comprise processing received sensor information (e.g., as received at step 910) to determine a location on a screen of the television to which a user is pointing (e.g., pointing with a pointing device). For example and without limitation, step 920 may comprise any or all pointing location processing characteristics described previously with regard the various modules of the exemplary television controllers illustrated inFIGS. 1-8 and discussed previously. For example, step 920 may share any or all pointing location determining characteristics discussed previously with regard to at least theprocessor module 250,sensor processing module 253,location module 252 andcalibration module 251. - Step 920 may, for example, comprise determining on-screen pointing location in any of a variety of manners. For example, step 920 may comprise determining on-screen pointing location based on a location of a selected sensor, based on interpolation between sensor locations (e.g., linear and/or non-linear interpolation), based on determining energy pattern intersection(s), etc. Many examples of such determining were provided previously.
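- By way of non-limiting illustration only, the following sketch outlines the overall receive/determine/generate flow of steps 910, 920 and 930. The helper functions, data structures, and calibration mapping are hypothetical placeholders rather than a definitive implementation.

```python
from dataclasses import dataclass

# Non-limiting sketch of the exemplary method 900: step 910 (receive sensor
# information), step 920 (determine on-screen pointing location), and step 930
# (generate information indicative of the determined location).

@dataclass
class Sensor:
    position: tuple      # (x, y) in sensor-arrangement coordinates
    energy: float        # most recently reported received energy

def receive_pointing_sensor_info(sensors):
    """Step 910: gather (sensor position, received energy) samples."""
    return [(s.position, s.energy) for s in sensors]

def determine_onscreen_location(samples, to_screen):
    """Step 920: energy-weighted interpolation between sensor positions,
    followed by a calibrated mapping into screen coordinates."""
    total = sum(e for _, e in samples) or 1.0
    x = sum(p[0] * e for p, e in samples) / total
    y = sum(p[1] * e for p, e in samples) / total
    return to_screen((x, y))

def generate_pointing_info(screen_xy):
    """Step 930: package the determined location for presentation (e.g., as an
    overlay cursor position communicated to a television or controller display)."""
    return {"screen_x": round(screen_xy[0]), "screen_y": round(screen_xy[1])}

def method_900(sensors, to_screen):
    samples = receive_pointing_sensor_info(sensors)              # step 910
    location = determine_onscreen_location(samples, to_screen)   # step 920
    return generate_pointing_info(location)                      # step 930

# Example: four border sensors and a trivial calibration mapping.
sensors = [Sensor((0, 0), 0.2), Sensor((1, 0), 0.6), Sensor((1, 1), 0.15), Sensor((0, 1), 0.05)]
print(method_900(sensors, lambda p: (p[0] * 1920, p[1] * 1080)))
```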
- In general,
step 920 may comprise processing received sensor information (e.g., independently and/or in conjunction with other information) to determine a location on a screen of the television to which a user is pointing (e.g., while the television is presenting programming to the user). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such processing unless explicitly claimed. - The
exemplary method 900 may, atstep 930, comprise generating information indicative of a determined on-screen pointing location (e.g., as determined at step 920). For example and without limitation, step 930 may comprise any or all pointing location information generation characteristics described previously with regard the various modules of the exemplary television control devices illustrated inFIGS. 1-8 and discussed previously. For example, step 930 may share any or all information generation characteristics discussed previously with regard to at least theprocessor module 250,sensor processing module 253,location module 252,calibration module 251,television interface module 235,user interface module 240 and/orcommunication interface modules - Step 930 may, for example, comprise generating such information in any of a variety of manners. For example, step 930 may comprise generating on-screen pointing location data to communicate to internal modules of the television control device, to equipment external to the television control device (e.g., to the television and/or television receiver), to television network components, to a television programming source, etc. Such information may, for example, be communicated to various system components and may also be presented to the user (e.g., utilizing visual feedback displayed on a screen of a television, television controller, etc.). Such information may, for example, be generated in the form of screen coordinates, identification of a video content object (e.g., a programming object or person) to which an on-screen pointing location corresponds, generation of an on-screen cursor or highlight or other graphical feature, etc.
- In general,
step 930 may comprise generating information indicative of a determined on-screen pointing location. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of generating such information unless explicitly claimed. - The
exemplary method 900 may, atstep 995, comprise performing continued processing. Such continued processing may comprise characteristics of any of a variety of types of continued processing, various examples of which were presented previously. For example and without limitation, step 995 may comprise looping execution flow back up to any earlier step (e.g., step 910). Also, in a non-limiting exemplary scenario, step 995 may comprise presenting a graphical feature on a television control device screen indicative of where the user is pointing on a television screen. In another exemplary scenario, step 995 may comprise communicating information to a television that causes the television to output a graphical feature on the television screen indicative of where the user is pointing (e.g., such information may comprise characteristics that cause the television to overlay such graphical indication on programming being presented on the television screen. Additionally for example, step 995 may comprise presenting (or causing the presentation of) visual feedback indicia of the on-screen pointing location for a user. Further for example, step 995 may comprise communicating information of the on-screen pointing location to system components external to the television control device implementing the method 900 (e.g., to a television, television receiver, another television controller, etc.). Further for example, step 995 may comprise utilizing the on-screen pointing information to identify a video content object (e.g., an object presented in television programming) to which a user is pointing, etc. - In general,
step 995 may comprise performing continued processing. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing continued processing unless explicitly claimed. - Turning next to
FIG. 10 , such figure is a flow diagram 1000 illustrating the generation of on-screen pointing information (e.g., in a television control device) in accordance with various aspects of the present invention. Theexemplary method 1000 may, for example, share any or all characteristics with the television control device operation discussed previously (e.g., in reference toFIGS. 1-9 ). - The
exemplary method 1000 may begin executing atstep 1005.Step 1005 may, for example, share any or all characteristics with step 905 of theexemplary method 900 illustrated inFIG. 9 and discussed previously. - The
exemplary method 1000 may, for example atstep 1008, comprise performing a calibration procedure with the user. Such a calibration procedure may, for example, be performed to develop a manner of processing received sensor information to determine on-screen pointing location.Step 1008 may, for example, comprise any or all calibration aspects discussed previously (e.g., with reference to the calibration module 251). - The
exemplary method 1000 may, for example atstep 1010, comprise receiving pointing sensor information. For example and without limitation,step 1010 may comprise any or all sensor information receiving characteristics described previously with regard the various modules of the exemplary television control devices illustrated inFIGS. 1-8 andFIG. 9 (e.g., step 910) and discussed previously. - The
exemplary method 1000 may, for example atstep 1015, comprise determining user position (e.g., determining position of a user pointing device). For example and without limitation,step 1015 may comprise any or all position determining characteristics discussed previously with regard toFIGS. 1-9 . Note that position may also, for example, include orientation. - For example,
step 1015 may share any or all position determining characteristics discussed previously with regard to at least theprocessor module 250,sensor processing module 253,location module 252 andcalibration module 251. For example,step 1015 may comprise determining user position based, at least in part, on received sensor signals. Also for example,step 1015 may comprise determining user position based, at least in part, on position information received from one or more systems external to the television control device implementing themethod 1000. - In general,
step 1015 may comprise determining user position (e.g., pointing device position). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of determining user position unless explicitly claimed. - The
exemplary method 1000 may, for example, atstep 1020, comprise processing received sensor information (e.g., as received at step 1010) and/or user position information (e.g., as determined at step 1015) to determine a location on a screen of the television to which a user is pointing (e.g., pointing with the television control device implementing the method or other pointing device). For example and without limitation,step 1020 may comprise any or all pointing location determination characteristics described previously with regard the various modules of the exemplary television control devices illustrated inFIGS. 1-8 andFIG. 9 (e.g., step 920) and discussed previously. For example,step 1020 may share any or all pointing location determining characteristics discussed previously with regard to at least theprocessor module 250,sensor processing module 253,location module 252 andcalibration module 251. -
Step 1020 may, for example, comprise determining on-screen pointing location in any of a variety of manners. For example,step 1020 may comprise determining on-screen pointing location based on a location of a selected sensor, based on location of the pointing device, based on interpolation between sensor locations (e.g., linear and/or non-linear interpolation), based on energy pattern intersection points, etc. Many examples of such determining were provided previously. - In general,
step 1020 may comprise processing received sensor information and/or user position information to determine a location on a screen of the television to which a user is pointing (e.g., pointing with the television control device implementing themethod 1000 or other pointing device). Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing such processing unless explicitly claimed. - The
exemplary method 1000 may, atstep 1030, comprise generating information indicative of a determined on-screen pointing location (e.g., as determined at step 1020). For example and without limitation,step 1030 may comprise any or all information generation characteristics described previously with regard the various modules of the exemplary television control devices illustrated inFIGS. 1-8 andFIG. 9 (e.g., step 930) and discussed previously. For example,step 1030 may share any or all information generation characteristics discussed previously with regard to at least theprocessor module 250,sensor processing module 253,location module 252,calibration module 251,television interface module 235,user interface module 240 and/orcommunication interface modules - The
exemplary method 1000 may, atstep 1095, comprise performing continued processing. Such continued processing may comprise characteristics of any of a variety of types of continued processing, various examples of which were presented previously. For example and without limitation,step 1095 may comprise looping execution flow back up to any earlier step (e.g., step 1008). Also, in a non-limiting exemplary scenario,step 1095 may comprise presenting a graphical feature on a television control device screen indicative of where the user is pointing on a television screen. In another exemplary scenario,step 1095 may comprise communicating information to a television that causes the television to output a graphical feature on the television screen indicative of where the user is pointing (e.g., such information may comprise characteristics that cause the television to overlay such graphical indication on programming being presented on the television screen. Additionally for example,step 1095 may comprise presenting (and/or causing the presentation of) visual feedback indicia of the on-screen pointing location for a user. Further for example,step 1095 may comprise communicating information of the on-screen pointing location to system components external to the television receiver implementing themethod 1000. Further for example,step 1095 may comprise utilizing the on-screen pointing information to identify a video content object (e.g., an object presented in television programming) to which a user is pointing, etc. - In general,
step 1095 may comprise performing continued processing. Accordingly, the scope of various aspects of the present invention should not be limited by characteristics of any particular manner of performing continued processing unless explicitly claimed. - In summary, various aspects of the present invention provide a system and method in a television controller (e.g., a television control device) for generating screen pointing information. While the invention has been described with reference to certain aspects and embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/572,916 US20150106857A1 (en) | 2009-09-14 | 2014-12-17 | System And Method For Generating Screen Pointing Information In A Television Control Device |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US24223409P | 2009-09-14 | 2009-09-14 | |
US12/774,321 US8947350B2 (en) | 2009-09-14 | 2010-05-05 | System and method for generating screen pointing information in a television control device |
US14/572,916 US20150106857A1 (en) | 2009-09-14 | 2014-12-17 | System And Method For Generating Screen Pointing Information In A Television Control Device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/774,321 Continuation US8947350B2 (en) | 2009-09-14 | 2010-05-05 | System and method for generating screen pointing information in a television control device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150106857A1 true US20150106857A1 (en) | 2015-04-16 |
Family
ID=43730008
Family Applications (34)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/774,321 Active 2031-09-04 US8947350B2 (en) | 2009-09-14 | 2010-05-05 | System and method for generating screen pointing information in a television control device |
US12/774,221 Abandoned US20110063522A1 (en) | 2009-09-14 | 2010-05-05 | System and method for generating television screen pointing information using an external receiver |
US12/774,380 Active 2031-05-27 US8990854B2 (en) | 2009-09-14 | 2010-05-05 | System and method in a television for providing user-selection of objects in a television program |
US12/774,154 Active 2031-11-12 US9110517B2 (en) | 2009-09-14 | 2010-05-05 | System and method for generating screen pointing information in a television |
US12/850,866 Active 2031-02-23 US9098128B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television receiver for providing user-selection of objects in a television program |
US12/851,036 Expired - Fee Related US9462345B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television system for providing for user-selection of an object in a television program |
US12/851,075 Abandoned US20110067069A1 (en) | 2009-09-14 | 2010-08-05 | System and method in a parallel television system for providing for user-selection of an object in a television program |
US12/850,945 Active 2031-08-28 US9081422B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television controller for providing user-selection of objects in a television program |
US12/850,832 Abandoned US20110067047A1 (en) | 2009-09-14 | 2010-08-05 | System and method in a distributed system for providing user-selection of objects in a television program |
US12/850,911 Active 2030-12-04 US9197941B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television controller for providing user-selection of objects in a television program |
US12/881,067 Active 2030-10-03 US9043833B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for presenting information associated with a user-selected object in a television program |
US12/881,004 Active 2031-08-19 US8931015B2 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a television program in an information stream independent of the television program |
US12/881,110 Active US9137577B2 (en) | 2009-09-14 | 2010-09-13 | System and method of a television for providing information associated with a user-selected information element in a television program |
US12/881,031 Abandoned US20110066929A1 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a still image file and/or data stream |
US12/880,851 Abandoned US20110067051A1 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for providing advertising information associated with a user-selected object in a television program |
US12/880,668 Active 2031-09-11 US8832747B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for responding to user-selection of an object in a television program based on user location |
US12/880,749 Active 2030-09-26 US9110518B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for responding to user-selection of an object in a television program utilizing an alternative communication network |
US12/880,888 Active 2030-10-07 US8819732B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for providing information associated with a user-selected person in a television program |
US12/881,096 Active US9258617B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for presenting information associated with a user-selected object in a television program |
US12/880,594 Active 2030-11-25 US8839307B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a local television system for responding to user-selection of an object in a television program |
US12/880,965 Active US9271044B2 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a television program |
US12/880,530 Abandoned US20110067054A1 (en) | 2009-09-14 | 2010-09-13 | System and method in a distributed system for responding to user-selection of an object in a television program |
US14/457,451 Abandoned US20150012939A1 (en) | 2009-09-14 | 2014-08-12 | System And Method In A Television System For Providing Advertising Information Associated With A User-Selected Object In A Television Program |
US14/467,408 Abandoned US20140366062A1 (en) | 2009-09-14 | 2014-08-25 | System And Method In A Television System For Providing Information Associated With A User-Selected Person In A Television Program |
US14/479,670 Abandoned US20140380381A1 (en) | 2009-09-14 | 2014-09-08 | System And Method In A Television System For Responding To User-Selection Of An Object In A Television Program Based On User Location |
US14/480,020 Abandoned US20140380401A1 (en) | 2009-09-14 | 2014-09-08 | System And Method In A Local Television System For Responding To User-Selection Of An Object In A Television Program |
US14/488,778 Abandoned US20150007222A1 (en) | 2009-09-14 | 2014-09-17 | System And Method For Providing Information Of Selectable Objects In A Television Program In An Information Stream Independent Of The Television Program |
US14/572,916 Abandoned US20150106857A1 (en) | 2009-09-14 | 2014-12-17 | System And Method For Generating Screen Pointing Information In A Television Control Device |
US14/603,457 Abandoned US20150135217A1 (en) | 2009-09-14 | 2015-01-23 | System And Method In A Television For Providing User-Selection Of Objects In A Television Program |
US14/625,810 Abandoned US20150172769A1 (en) | 2009-09-14 | 2015-02-19 | System And Method In A Television System For Presenting Information Associated With A User-Selected Object In A Television Program |
US14/731,983 Abandoned US20150296263A1 (en) | 2009-09-14 | 2015-06-05 | System And Method In A Television Controller For Providing User-Selection Of Objects In A Television Program |
US14/753,183 Abandoned US20150304721A1 (en) | 2009-09-14 | 2015-06-29 | System And Method In A Television Receiver For Providing User-Selection Of Objects In A Television Program |
US14/805,961 Abandoned US20150326931A1 (en) | 2009-09-14 | 2015-07-22 | System And Method In A Television System For Responding To User-Selection Of An Object In A Television Program Utilizing An Alternative Communication Network |
US14/851,225 Abandoned US20160007090A1 (en) | 2009-09-14 | 2015-09-11 | System And Method Of A Television For Providing Information Associated With A User-Selected Information Element In A Television Program |
Family Applications Before (27)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/774,321 Active 2031-09-04 US8947350B2 (en) | 2009-09-14 | 2010-05-05 | System and method for generating screen pointing information in a television control device |
US12/774,221 Abandoned US20110063522A1 (en) | 2009-09-14 | 2010-05-05 | System and method for generating television screen pointing information using an external receiver |
US12/774,380 Active 2031-05-27 US8990854B2 (en) | 2009-09-14 | 2010-05-05 | System and method in a television for providing user-selection of objects in a television program |
US12/774,154 Active 2031-11-12 US9110517B2 (en) | 2009-09-14 | 2010-05-05 | System and method for generating screen pointing information in a television |
US12/850,866 Active 2031-02-23 US9098128B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television receiver for providing user-selection of objects in a television program |
US12/851,036 Expired - Fee Related US9462345B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television system for providing for user-selection of an object in a television program |
US12/851,075 Abandoned US20110067069A1 (en) | 2009-09-14 | 2010-08-05 | System and method in a parallel television system for providing for user-selection of an object in a television program |
US12/850,945 Active 2031-08-28 US9081422B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television controller for providing user-selection of objects in a television program |
US12/850,832 Abandoned US20110067047A1 (en) | 2009-09-14 | 2010-08-05 | System and method in a distributed system for providing user-selection of objects in a television program |
US12/850,911 Active 2030-12-04 US9197941B2 (en) | 2009-09-14 | 2010-08-05 | System and method in a television controller for providing user-selection of objects in a television program |
US12/881,067 Active 2030-10-03 US9043833B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for presenting information associated with a user-selected object in a television program |
US12/881,004 Active 2031-08-19 US8931015B2 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a television program in an information stream independent of the television program |
US12/881,110 Active US9137577B2 (en) | 2009-09-14 | 2010-09-13 | System and method of a television for providing information associated with a user-selected information element in a television program |
US12/881,031 Abandoned US20110066929A1 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a still image file and/or data stream |
US12/880,851 Abandoned US20110067051A1 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for providing advertising information associated with a user-selected object in a television program |
US12/880,668 Active 2031-09-11 US8832747B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for responding to user-selection of an object in a television program based on user location |
US12/880,749 Active 2030-09-26 US9110518B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for responding to user-selection of an object in a television program utilizing an alternative communication network |
US12/880,888 Active 2030-10-07 US8819732B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for providing information associated with a user-selected person in a television program |
US12/881,096 Active US9258617B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a television system for presenting information associated with a user-selected object in a television program |
US12/880,594 Active 2030-11-25 US8839307B2 (en) | 2009-09-14 | 2010-09-13 | System and method in a local television system for responding to user-selection of an object in a television program |
US12/880,965 Active US9271044B2 (en) | 2009-09-14 | 2010-09-13 | System and method for providing information of selectable objects in a television program |
US12/880,530 Abandoned US20110067054A1 (en) | 2009-09-14 | 2010-09-13 | System and method in a distributed system for responding to user-selection of an object in a television program |
US14/457,451 Abandoned US20150012939A1 (en) | 2009-09-14 | 2014-08-12 | System And Method In A Television System For Providing Advertising Information Associated With A User-Selected Object In A Television Program |
US14/467,408 Abandoned US20140366062A1 (en) | 2009-09-14 | 2014-08-25 | System And Method In A Television System For Providing Information Associated With A User-Selected Person In A Television Program |
US14/479,670 Abandoned US20140380381A1 (en) | 2009-09-14 | 2014-09-08 | System And Method In A Television System For Responding To User-Selection Of An Object In A Television Program Based On User Location |
US14/480,020 Abandoned US20140380401A1 (en) | 2009-09-14 | 2014-09-08 | System And Method In A Local Television System For Responding To User-Selection Of An Object In A Television Program |
US14/488,778 Abandoned US20150007222A1 (en) | 2009-09-14 | 2014-09-17 | System And Method For Providing Information Of Selectable Objects In A Television Program In An Information Stream Independent Of The Television Program |
Family Applications After (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/603,457 Abandoned US20150135217A1 (en) | 2009-09-14 | 2015-01-23 | System And Method In A Television For Providing User-Selection Of Objects In A Television Program |
US14/625,810 Abandoned US20150172769A1 (en) | 2009-09-14 | 2015-02-19 | System And Method In A Television System For Presenting Information Associated With A User-Selected Object In A Television Program |
US14/731,983 Abandoned US20150296263A1 (en) | 2009-09-14 | 2015-06-05 | System And Method In A Television Controller For Providing User-Selection Of Objects In A Television Program |
US14/753,183 Abandoned US20150304721A1 (en) | 2009-09-14 | 2015-06-29 | System And Method In A Television Receiver For Providing User-Selection Of Objects In A Television Program |
US14/805,961 Abandoned US20150326931A1 (en) | 2009-09-14 | 2015-07-22 | System And Method In A Television System For Responding To User-Selection Of An Object In A Television Program Utilizing An Alternative Communication Network |
US14/851,225 Abandoned US20160007090A1 (en) | 2009-09-14 | 2015-09-11 | System And Method Of A Television For Providing Information Associated With A User-Selected Information Element In A Television Program |
Country Status (4)
Country | Link |
---|---|
US (34) | US8947350B2 (en) |
EP (1) | EP2328347A3 (en) |
CN (1) | CN102025933A (en) |
TW (1) | TW201132122A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10335678B2 (en) * | 2014-11-05 | 2019-07-02 | DeNA Co., Ltd. | Game program and information processing device |
Families Citing this family (150)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8074248B2 (en) | 2005-07-26 | 2011-12-06 | Activevideo Networks, Inc. | System and method for providing video content associated with a source image to a television in a communication network |
US7515710B2 (en) | 2006-03-14 | 2009-04-07 | Divx, Inc. | Federated digital rights management scheme including trusted systems |
US9826197B2 (en) | 2007-01-12 | 2017-11-21 | Activevideo Networks, Inc. | Providing television broadcasts over a managed network and interactive content over an unmanaged network to a client device |
EP2106665B1 (en) | 2007-01-12 | 2015-08-05 | ActiveVideo Networks, Inc. | Interactive encoded content system including object models for viewing on a remote device |
US9455783B2 (en) | 2013-05-06 | 2016-09-27 | Federal Law Enforcement Development Services, Inc. | Network security and variable pulse wave form with continuous communication |
US9100124B2 (en) | 2007-05-24 | 2015-08-04 | Federal Law Enforcement Development Services, Inc. | LED Light Fixture |
WO2008148046A1 (en) | 2007-05-24 | 2008-12-04 | Federal Law Enforcement Development Services, Inc. | Led light broad band over power line communication system |
US11265082B2 (en) | 2007-05-24 | 2022-03-01 | Federal Law Enforcement Development Services, Inc. | LED light control assembly and system |
AU2009101382A4 (en) * | 2008-04-10 | 2013-09-12 | Karl Christopher Hansen | Simple-to-use optical wireless remote control |
US9128981B1 (en) | 2008-07-29 | 2015-09-08 | James L. Geer | Phone assisted ‘photographic memory’ |
US8775454B2 (en) | 2008-07-29 | 2014-07-08 | James L. Geer | Phone assisted ‘photographic memory’ |
AU2010203605B2 (en) | 2009-01-07 | 2015-05-14 | Divx, Llc | Singular, collective and automated creation of a media guide for online content |
US8890773B1 (en) | 2009-04-01 | 2014-11-18 | Federal Law Enforcement Development Services, Inc. | Visible light transceiver glasses |
US8947350B2 (en) * | 2009-09-14 | 2015-02-03 | Broadcom Corporation | System and method for generating screen pointing information in a television control device |
US8629938B2 (en) * | 2009-10-05 | 2014-01-14 | Sony Corporation | Multi-point television motion sensor system and method |
KR101689019B1 (en) * | 2009-11-02 | 2016-12-23 | 삼성전자주식회사 | Display apparatus for supporting a search service, User terminal for performing a search of object, and methods thereof |
US8781122B2 (en) | 2009-12-04 | 2014-07-15 | Sonic Ip, Inc. | Elementary bitstream cryptographic material transport systems and methods |
NL2004780C2 (en) * | 2010-05-28 | 2012-01-23 | Activevideo Networks B V | VISUAL ELEMENT METHOD AND SYSTEM. |
US8717289B2 (en) * | 2010-06-22 | 2014-05-06 | Hsni Llc | System and method for integrating an electronic pointing device into digital image data |
US8683514B2 (en) * | 2010-06-22 | 2014-03-25 | Verizon Patent And Licensing Inc. | Enhanced media content transport stream for media content delivery systems and methods |
US8910218B2 (en) * | 2010-07-15 | 2014-12-09 | Verizon Patent And Licensing Inc. | Method and apparatus for providing control of set-top boxes |
US8330033B2 (en) * | 2010-09-13 | 2012-12-11 | Apple Inc. | Graphical user interface for music sequence programming |
WO2012039694A1 (en) * | 2010-09-21 | 2012-03-29 | Echostar Ukraine, L.L.C. | Synchronizing user interfaces of content receivers and entertainment system components |
CA2814070A1 (en) | 2010-10-14 | 2012-04-19 | Activevideo Networks, Inc. | Streaming digital video between video devices using a cable television system |
US20120106972A1 (en) * | 2010-10-29 | 2012-05-03 | Sunrex Technology Corp. | Universal remote control |
KR20120091496A (en) * | 2010-12-23 | 2012-08-20 | 한국전자통신연구원 | A system of providing a broadcast service and a method of providing thereof |
US8914534B2 (en) | 2011-01-05 | 2014-12-16 | Sonic Ip, Inc. | Systems and methods for adaptive bitrate streaming of media stored in matroska container files using hypertext transfer protocol |
CN102693061B (en) * | 2011-03-22 | 2016-06-15 | 中兴通讯股份有限公司 | Method for information display in terminal TV business, terminal and system |
EP2695388B1 (en) | 2011-04-07 | 2017-06-07 | ActiveVideo Networks, Inc. | Reduction of latency in video distribution networks using adaptive bit rates |
EP2518992A1 (en) * | 2011-04-28 | 2012-10-31 | Axel Springer Digital TV Guide GmbH | Apparatus and method for managing a personal channel |
US9176957B2 (en) | 2011-06-10 | 2015-11-03 | Linkedin Corporation | Selective fact checking method and system |
US9087048B2 (en) | 2011-06-10 | 2015-07-21 | Linkedin Corporation | Method of and system for validating a fact checking system |
US9015037B2 (en) | 2011-06-10 | 2015-04-21 | Linkedin Corporation | Interactive fact checking system |
US8185448B1 (en) | 2011-06-10 | 2012-05-22 | Myslinski Lucas J | Fact checking method and system |
US8599311B2 (en) * | 2011-07-14 | 2013-12-03 | Amimon Ltd. | Methods circuits devices and systems for transmission and display of video |
US11039109B2 (en) | 2011-08-05 | 2021-06-15 | Fox Sports Productions, Llc | System and method for adjusting an image for a vehicle mounted camera |
US20130036442A1 (en) * | 2011-08-05 | 2013-02-07 | Qualcomm Incorporated | System and method for visual selection of elements in video content |
MX344762B (en) | 2011-08-05 | 2016-12-15 | Fox Sports Productions Inc | Selective capture and presentation of native image portions. |
US9467708B2 (en) | 2011-08-30 | 2016-10-11 | Sonic Ip, Inc. | Selection of resolutions for seamless resolution switching of multimedia content |
US8909922B2 (en) | 2011-09-01 | 2014-12-09 | Sonic Ip, Inc. | Systems and methods for playing back alternative streams of protected content protected using common cryptographic information |
US8964977B2 (en) | 2011-09-01 | 2015-02-24 | Sonic Ip, Inc. | Systems and methods for saving encoded media streamed using adaptive bitrate streaming |
US20130061268A1 (en) * | 2011-09-03 | 2013-03-07 | Ariel Inventions Llc | Systems, devices, and methods for integrated searching and retrieving internet or digital content across a communication network for a multimedia platform |
US8689255B1 (en) * | 2011-09-07 | 2014-04-01 | Imdb.Com, Inc. | Synchronizing video content with extrinsic data |
US20130117698A1 (en) * | 2011-10-31 | 2013-05-09 | Samsung Electronics Co., Ltd. | Display apparatus and method thereof |
JP2013123127A (en) * | 2011-12-09 | 2013-06-20 | Fujitsu Mobile Communications Ltd | User terminal and communication method |
AT512350B1 (en) | 2011-12-20 | 2017-06-15 | Isiqiri Interface Tech Gmbh | Computer system and control process therefor
US9596515B2 (en) | 2012-01-04 | 2017-03-14 | Google Inc. | Systems and methods of image searching |
US10409445B2 (en) | 2012-01-09 | 2019-09-10 | Activevideo Networks, Inc. | Rendering of an interactive lean-backward user interface on a television |
KR20130088662A (en) * | 2012-01-31 | 2013-08-08 | 한국전자통신연구원 | Apparatus, method and system for providing additional information through a digital media content |
US9800945B2 (en) | 2012-04-03 | 2017-10-24 | Activevideo Networks, Inc. | Class-based intelligent multiplexing over unmanaged networks |
US9123084B2 (en) | 2012-04-12 | 2015-09-01 | Activevideo Networks, Inc. | Graphical application integration with MPEG objects |
US10440432B2 (en) | 2012-06-12 | 2019-10-08 | Realnetworks, Inc. | Socially annotated presentation systems and methods |
WO2013188590A2 (en) * | 2012-06-12 | 2013-12-19 | Realnetworks, Inc. | Context-aware video api systems and methods |
US9800951B1 (en) | 2012-06-21 | 2017-10-24 | Amazon Technologies, Inc. | Unobtrusively enhancing video content with extrinsic data |
US8773591B1 (en) * | 2012-08-13 | 2014-07-08 | Nongqiang Fan | Method and apparatus for interacting with television screen |
US9113128B1 (en) | 2012-08-31 | 2015-08-18 | Amazon Technologies, Inc. | Timeline interface for video content |
US8955021B1 (en) | 2012-08-31 | 2015-02-10 | Amazon Technologies, Inc. | Providing extrinsic data for video content |
KR20140029049A (en) * | 2012-08-31 | 2014-03-10 | 삼성전자주식회사 | Display apparatus and input signal processing method using the same
EP2893422A4 (en) | 2012-09-06 | 2016-05-18 | Interphase Corp | Absolute and relative positioning sensor fusion in an interactive display system |
WO2014042607A1 (en) * | 2012-09-17 | 2014-03-20 | Echostar Technologies, Llc | Notification controls for television viewing |
CN103313091A (en) * | 2012-09-27 | 2013-09-18 | 中兴通讯股份有限公司 | Speed-multiplying playing method, device and system |
WO2014071307A1 (en) * | 2012-11-05 | 2014-05-08 | Velvet Ape, Inc. | Methods for targeted advertising |
US9389745B1 (en) | 2012-12-10 | 2016-07-12 | Amazon Technologies, Inc. | Providing content via multiple display devices |
US9483159B2 (en) | 2012-12-12 | 2016-11-01 | Linkedin Corporation | Fact checking graphical user interface including fact checking icons |
TW201427401A (en) * | 2012-12-18 | 2014-07-01 | Hon Hai Prec Ind Co Ltd | Television, remote controller and menu displaying method |
US9313510B2 (en) | 2012-12-31 | 2016-04-12 | Sonic Ip, Inc. | Use of objective quality measures of streamed content to reduce streaming bandwidth |
US9191457B2 (en) | 2012-12-31 | 2015-11-17 | Sonic Ip, Inc. | Systems, methods, and media for controlling delivery of content |
US10424009B1 (en) | 2013-02-27 | 2019-09-24 | Amazon Technologies, Inc. | Shopping experience using multiple computing devices |
US20140279867A1 (en) * | 2013-03-14 | 2014-09-18 | Ami Entertainment Network, Llc | Method and apparatus for providing real time television listings for venues |
US10275128B2 (en) | 2013-03-15 | 2019-04-30 | Activevideo Networks, Inc. | Multiple-mode system and method for providing user selectable video content |
US9906785B2 (en) | 2013-03-15 | 2018-02-27 | Sonic Ip, Inc. | Systems, methods, and media for transcoding video data according to encoding parameters indicated by received metadata |
US10397292B2 (en) | 2013-03-15 | 2019-08-27 | Divx, Llc | Systems, methods, and media for delivery of content |
US9374411B1 (en) | 2013-03-21 | 2016-06-21 | Amazon Technologies, Inc. | Content recommendations using deep data |
US20140317660A1 (en) * | 2013-04-22 | 2014-10-23 | LiveRelay Inc. | Enabling interaction between social network users during synchronous display of video channel |
US9094737B2 (en) | 2013-05-30 | 2015-07-28 | Sonic Ip, Inc. | Network video streaming with trick play based on separate trick play files |
EP3005712A1 (en) | 2013-06-06 | 2016-04-13 | ActiveVideo Networks, Inc. | Overlay rendering of user interface onto source video |
US9294785B2 (en) | 2013-06-06 | 2016-03-22 | Activevideo Networks, Inc. | System and method for exploiting scene graph information in construction of an encoded video sequence |
US9219922B2 (en) | 2013-06-06 | 2015-12-22 | Activevideo Networks, Inc. | System and method for exploiting scene graph information in construction of an encoded video sequence |
US11019300B1 (en) | 2013-06-26 | 2021-05-25 | Amazon Technologies, Inc. | Providing soundtrack information during playback of video content |
US9967305B2 (en) | 2013-06-28 | 2018-05-08 | Divx, Llc | Systems, methods, and media for streaming media content |
US20150012840A1 (en) * | 2013-07-02 | 2015-01-08 | International Business Machines Corporation | Identification and Sharing of Selections within Streaming Content |
US10024971B2 (en) | 2013-07-16 | 2018-07-17 | Walter Fields | Apparatus, system and method for locating a lost instrument or object |
KR102123062B1 (en) | 2013-08-06 | 2020-06-15 | 삼성전자주식회사 | Method of aquiring information about contents, image display apparatus using thereof and server system of providing information about contents |
US10194189B1 (en) | 2013-09-23 | 2019-01-29 | Amazon Technologies, Inc. | Playback of content using multiple devices |
US10169424B2 (en) | 2013-09-27 | 2019-01-01 | Lucas J. Myslinski | Apparatus, systems and methods for scoring and distributing the reliability of online information |
US20150095320A1 (en) | 2013-09-27 | 2015-04-02 | Trooclick France | Apparatus, systems and methods for scoring the reliability of online information |
CN103500042B (en) * | 2013-09-30 | 2017-04-05 | 合肥京东方光电科技有限公司 | Optical touch screen and display device
US9343112B2 (en) * | 2013-10-31 | 2016-05-17 | Sonic Ip, Inc. | Systems and methods for supplementing content from a server |
US20150117837A1 (en) * | 2013-10-31 | 2015-04-30 | Sonic Ip, Inc. | Systems and methods for supplementing content at a user device |
US20150128194A1 (en) * | 2013-11-05 | 2015-05-07 | Huawei Device Co., Ltd. | Method and mobile terminal for switching playback device |
CN103686413A (en) * | 2013-12-19 | 2014-03-26 | 宇龙计算机通信科技(深圳)有限公司 | Auxiliary display method and device |
US20150198941A1 (en) | 2014-01-15 | 2015-07-16 | John C. Pederson | Cyber Life Electronic Networking and Commerce Operating Exchange |
US20160011675A1 (en) * | 2014-02-20 | 2016-01-14 | Amchael Visual Technology Corporation | Absolute Position 3D Pointing using Light Tracking and Relative Position Detection |
US9972055B2 (en) | 2014-02-28 | 2018-05-15 | Lucas J. Myslinski | Fact checking method and system utilizing social networking information |
US9643722B1 (en) | 2014-02-28 | 2017-05-09 | Lucas J. Myslinski | Drone device security system |
US8990234B1 (en) | 2014-02-28 | 2015-03-24 | Lucas J. Myslinski | Efficient fact checking method and system |
US9838740B1 (en) | 2014-03-18 | 2017-12-05 | Amazon Technologies, Inc. | Enhancing video content with personalized extrinsic data |
US10482658B2 (en) * | 2014-03-31 | 2019-11-19 | Gary Stephen Shuster | Visualization and control of remote objects |
US10873718B2 (en) | 2014-04-02 | 2020-12-22 | Interdigital Madison Patent Holdings, Sas | Systems and methods for touch screens associated with a display |
US9866878B2 (en) | 2014-04-05 | 2018-01-09 | Sonic Ip, Inc. | Systems and methods for encoding and playing back video at different frame rates using enhancement layers |
US9788029B2 (en) | 2014-04-25 | 2017-10-10 | Activevideo Networks, Inc. | Intelligent multiplexing using class-based, multi-dimensioned decision logic for managed networks |
US9661254B2 (en) * | 2014-05-16 | 2017-05-23 | Shadowbox Media, Inc. | Video viewing system with video fragment location |
JP6476601B2 (en) * | 2014-06-10 | 2019-03-06 | 富士ゼロックス株式会社 | Object image information management server, object related information management server and program |
US9928352B2 (en) * | 2014-08-07 | 2018-03-27 | Tautachrome, Inc. | System and method for creating, processing, and distributing images that serve as portals enabling communication with persons who have interacted with the images |
US9189514B1 (en) | 2014-09-04 | 2015-11-17 | Lucas J. Myslinski | Optimized fact checking method and system |
US10299012B2 (en) * | 2014-10-28 | 2019-05-21 | Disney Enterprises, Inc. | Descriptive metadata extraction and linkage with editorial content |
US10264329B2 (en) * | 2014-10-28 | 2019-04-16 | Disney Enterprises, Inc. | Descriptive metadata extraction and linkage with editorial content |
US11758238B2 (en) | 2014-12-13 | 2023-09-12 | Fox Sports Productions, Llc | Systems and methods for displaying wind characteristics and effects within a broadcast |
US11159854B2 (en) | 2014-12-13 | 2021-10-26 | Fox Sports Productions, Llc | Systems and methods for tracking and tagging objects within a broadcast |
US10248982B2 (en) * | 2014-12-23 | 2019-04-02 | Ebay Inc. | Automated extraction of product data from production data of visual media content |
KR20160144817A (en) * | 2015-06-09 | 2016-12-19 | 삼성전자주식회사 | Display apparatus, pointing apparatus, pointing system and control methods thereof |
US20170046950A1 (en) | 2015-08-11 | 2017-02-16 | Federal Law Enforcement Development Services, Inc. | Function disabler device and system |
US10271109B1 (en) | 2015-09-16 | 2019-04-23 | Amazon Technologies, LLC | Verbal queries relative to video content |
CN105607785B (en) * | 2016-01-04 | 2019-11-12 | 京东方科技集团股份有限公司 | Touch display system and touch operation device |
US10021461B2 (en) | 2016-02-29 | 2018-07-10 | Rovi Guides, Inc. | Systems and methods for performing an action based on context of a feature in a media asset |
US10110968B2 (en) * | 2016-04-19 | 2018-10-23 | Google Llc | Methods, systems and media for interacting with content using a second screen device |
US20180052227A1 (en) * | 2016-08-16 | 2018-02-22 | GM Global Technology Operations LLC | Beam pattern diversity-based target location estimation |
KR102240087B1 (en) | 2016-09-27 | 2021-04-15 | 스냅 인코포레이티드 | Eyewear device mode indication |
CN106991108A (en) * | 2016-09-27 | 2017-07-28 | 阿里巴巴集团控股有限公司 | Information pushing method and device
US10762148B1 (en) * | 2016-12-19 | 2020-09-01 | Wells Fargo Bank, N.A. | Dissemination of information updates across devices |
US11134316B1 (en) | 2016-12-28 | 2021-09-28 | Shopsee, Inc. | Integrated shopping within long-form entertainment |
US10171862B2 (en) * | 2017-02-16 | 2019-01-01 | International Business Machines Corporation | Interactive video search and presentation |
US10498795B2 (en) | 2017-02-17 | 2019-12-03 | Divx, Llc | Systems and methods for adaptive switching between multiple content delivery networks during adaptive bitrate streaming |
US10678216B2 (en) | 2017-02-28 | 2020-06-09 | Sap Se | Manufacturing process data collection and analytics |
US10440439B2 (en) | 2017-02-28 | 2019-10-08 | The Directv Group, Inc. | Method and apparatus for media content streaming and reminder notifications |
US10901394B2 (en) | 2017-02-28 | 2021-01-26 | Sap Se | Manufacturing process data collection and analytics |
US10649666B1 (en) * | 2017-05-10 | 2020-05-12 | Ambarella International Lp | Link-list shortening logic |
CN111417974A (en) * | 2017-09-13 | 2020-07-14 | 源数码有限公司 | Rule-based assistance data |
US20190208236A1 (en) * | 2018-01-02 | 2019-07-04 | Source Digital, Inc. | Coordinates as ancillary data |
CN108123858A (en) * | 2018-01-03 | 2018-06-05 | 深圳市数视通科技股份有限公司 | A kind of Domestic News system based on the integration of three networks |
WO2019191708A1 (en) | 2018-03-30 | 2019-10-03 | Realnetworks, Inc. | Socially annotated audiovisual content |
CN108882003A (en) * | 2018-07-25 | 2018-11-23 | 安徽新华学院 | A kind of electronic software control system that can detect excellent race automatically |
CN110858914B (en) * | 2018-08-23 | 2021-11-26 | 阿里巴巴(中国)有限公司 | Video material recommendation method and device |
US11080748B2 (en) | 2018-12-14 | 2021-08-03 | Sony Interactive Entertainment LLC | Targeted gaming news and content feeds |
US11247130B2 (en) | 2018-12-14 | 2022-02-15 | Sony Interactive Entertainment LLC | Interactive objects in streaming media and marketplace ledgers |
US11269944B2 (en) | 2018-12-14 | 2022-03-08 | Sony Interactive Entertainment LLC | Targeted gaming news and content feeds |
US10881962B2 (en) | 2018-12-14 | 2021-01-05 | Sony Interactive Entertainment LLC | Media-activity binding and content blocking |
US11896909B2 (en) | 2018-12-14 | 2024-02-13 | Sony Interactive Entertainment LLC | Experience-based peer recommendations |
KR102656963B1 (en) * | 2019-04-03 | 2024-04-16 | 삼성전자 주식회사 | Electronic device and Method of controlling thereof |
US20210065719A1 (en) * | 2019-08-29 | 2021-03-04 | Comcast Cable Communications, Llc | Methods and systems for intelligent content controls |
US11213748B2 (en) | 2019-11-01 | 2022-01-04 | Sony Interactive Entertainment Inc. | Content streaming with gameplay launch |
CN111552429B (en) * | 2020-04-29 | 2021-07-23 | 杭州海康威视数字技术股份有限公司 | Graph selection method and device and electronic equipment |
US11420130B2 (en) | 2020-05-28 | 2022-08-23 | Sony Interactive Entertainment Inc. | Media-object binding for dynamic generation and displaying of play data associated with media |
US11442987B2 (en) | 2020-05-28 | 2022-09-13 | Sony Interactive Entertainment Inc. | Media-object binding for displaying real-time play data for live-streaming media |
US11602687B2 (en) | 2020-05-28 | 2023-03-14 | Sony Interactive Entertainment Inc. | Media-object binding for predicting performance in a media |
CN116457066A (en) | 2020-11-09 | 2023-07-18 | 索尼互动娱乐股份有限公司 | Replayable campaign for interactive content titles |
US11671657B2 (en) * | 2021-06-30 | 2023-06-06 | Rovi Guides, Inc. | Method and apparatus for shared viewing of media content |
US11985389B2 (en) * | 2021-07-12 | 2024-05-14 | Avago Technologies International Sales Pte. Limited | Object or region of interest video processing system and method |
EP4290266A4 (en) * | 2021-08-23 | 2024-10-09 | Samsung Electronics Co., Ltd. | ELECTRONIC DEVICE FOR CONTROLLING AN EXTERNAL ELECTRONIC DEVICE AND ASSOCIATED OPERATING METHOD |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5708845A (en) * | 1995-09-29 | 1998-01-13 | Wistendahl; Douglass A. | System for mapping hot spots in media content for interactive digital media program |
US5793361A (en) * | 1994-06-09 | 1998-08-11 | Corporation For National Research Initiatives | Unconstrained pointing interface for natural human interaction with a display-based computer system |
US20040095317A1 (en) * | 2002-11-20 | 2004-05-20 | Jingxi Zhang | Method and apparatus of universal remote pointing control for home entertainment system and computer |
US20060139319A1 (en) * | 2004-11-24 | 2006-06-29 | General Electric Company | System and method for generating most read images in a pacs workstation |
US20060241864A1 (en) * | 2005-04-22 | 2006-10-26 | Outland Research, Llc | Method and apparatus for point-and-send data transfer within an ubiquitous computing environment |
US20080136754A1 (en) * | 2006-12-06 | 2008-06-12 | Sony Corporation | Display apparatus, display-apparatus control method and program |
Family Cites Families (212)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5111511A (en) * | 1988-06-24 | 1992-05-05 | Matsushita Electric Industrial Co., Ltd. | Image motion vector detecting apparatus |
US5395556A (en) | 1990-12-12 | 1995-03-07 | Enichem S.P.A. | Tricyanovinyl substitution process for NLO polymers |
US7207053B1 (en) | 1992-12-09 | 2007-04-17 | Sedna Patent Services, Llc | Method and apparatus for locally targeting virtual objects within a terminal |
US6973669B2 (en) | 1993-03-29 | 2005-12-06 | Microsoft Corporation | Pausing television programming in response to selection of hypertext link |
US5408258A (en) | 1993-04-21 | 1995-04-18 | The Arbitron Company | Method of automatically qualifying a signal reproduction device for installation of monitoring equipment |
JP3329351B2 (en) | 1994-07-22 | 2002-09-30 | ソニー株式会社 | Interactive broadcast system and reply system |
EP0718791A1 (en) * | 1994-12-22 | 1996-06-26 | GOLDSTAR CO. Ltd. | Point type remote control apparatus and the method thereof |
US5543851A (en) | 1995-03-13 | 1996-08-06 | Chang; Wen F. | Method and apparatus for translating closed caption data |
US5727141A (en) | 1995-05-05 | 1998-03-10 | Apple Computer, Inc. | Method and apparatus for identifying user-selectable regions within multiple display frames |
US6769128B1 (en) * | 1995-06-07 | 2004-07-27 | United Video Properties, Inc. | Electronic television program guide schedule system and method with data feed access |
US6411725B1 (en) * | 1995-07-27 | 2002-06-25 | Digimarc Corporation | Watermark enabled video objects |
US20020056136A1 (en) | 1995-09-29 | 2002-05-09 | Wistendahl Douglass A. | System for converting existing TV content to interactive TV programs operated with a standard remote control and TV set-top box |
US8850477B2 (en) | 1995-10-02 | 2014-09-30 | Starsight Telecast, Inc. | Systems and methods for linking television viewers with advertisers and broadcasters |
US5784056A (en) * | 1995-12-29 | 1998-07-21 | Sun Microsystems, Inc. | System and method for temporally varying pointer icons |
US20030212996A1 (en) | 1996-02-08 | 2003-11-13 | Wolzien Thomas R. | System for interconnection of audio program data transmitted by radio to remote vehicle or individual with GPS location |
US5661502A (en) * | 1996-02-16 | 1997-08-26 | Ast Research, Inc. | Self-adjusting digital filter for smoothing computer mouse movement |
US5894843A (en) * | 1996-02-20 | 1999-04-20 | Cardiothoracic Systems, Inc. | Surgical method for stabilizing the beating heart during coronary artery bypass graft surgery |
US20020049832A1 (en) | 1996-03-08 | 2002-04-25 | Craig Ullman | Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments |
US6006256A (en) * | 1996-03-11 | 1999-12-21 | Opentv, Inc. | System and method for inserting interactive program content within a television signal originating at a remote network |
US5929849A (en) | 1996-05-02 | 1999-07-27 | Phoenix Technologies, Ltd. | Integration of dynamic universal resource locators with television presentations |
US6057831A (en) | 1996-08-14 | 2000-05-02 | Samsung Electronics Co., Ltd. | TV graphical user interface having cursor position indicator |
GB2320405B (en) * | 1996-12-13 | 2001-06-27 | Ibm | System, method, and pointing device for remote operation of data processing apparatus |
US6256785B1 (en) * | 1996-12-23 | 2001-07-03 | Corporate Media Patners | Method and system for providing interactive look-and-feel in a digital broadcast via an X-Y protocol |
US7617508B2 (en) * | 2003-12-12 | 2009-11-10 | At&T Intellectual Property I, L.P. | Methods and systems for collaborative capture of television viewer generated clickstreams |
KR100288976B1 (en) | 1997-01-08 | 2001-05-02 | 윤종용 | Method for constructing and recognizing menu commands of television receiver |
US6317714B1 (en) * | 1997-02-04 | 2001-11-13 | Microsoft Corporation | Controller and associated mechanical characters operable for continuously performing received control data while engaging in bidirectional communications over a single communications channel |
US6045588A (en) * | 1997-04-29 | 2000-04-04 | Whirlpool Corporation | Non-aqueous washing apparatus and method |
US6097441A (en) | 1997-12-31 | 2000-08-01 | Eremote, Inc. | System for dual-display interaction with integrated television and internet content |
US7809138B2 (en) | 1999-03-16 | 2010-10-05 | Intertrust Technologies Corporation | Methods and apparatus for persistent control and protection of content |
US6255961B1 (en) | 1998-05-08 | 2001-07-03 | Sony Corporation | Two-way communications between a remote control unit and one or more devices in an audio/visual environment |
AR020608A1 (en) | 1998-07-17 | 2002-05-22 | United Video Properties Inc | A method and an arrangement to supply a user remote access to an interactive programming guide by a remote access link
WO2000005892A1 (en) | 1998-07-23 | 2000-02-03 | Diva Systems Corporation | System for generating, distributing and receiving an interactive user interface |
US7536706B1 (en) | 1998-08-24 | 2009-05-19 | Sharp Laboratories Of America, Inc. | Information enhanced audio video encoding system |
TW463503B (en) * | 1998-08-26 | 2001-11-11 | United Video Properties Inc | Television chat system |
US6357042B2 (en) | 1998-09-16 | 2002-03-12 | Anand Srinivasan | Method and apparatus for multiplexing separately-authored metadata for insertion into a video data stream |
US6626570B2 (en) * | 1998-10-16 | 2003-09-30 | Kenneth Fox Supply Company | Produce bag with draw top |
US7694319B1 (en) | 1998-11-02 | 2010-04-06 | United Video Properties, Inc. | Interactive program guide with continuous data stream and client-server data supplementation |
US6532592B1 (en) | 1998-11-09 | 2003-03-11 | Sony Corporation | Bi-directional remote control unit and method of using the same |
US6314569B1 (en) | 1998-11-25 | 2001-11-06 | International Business Machines Corporation | System for video, audio, and graphic presentation in tandem with video/audio play |
US6282713B1 (en) | 1998-12-21 | 2001-08-28 | Sony Corporation | Method and apparatus for providing on-demand electronic advertising |
GB9902235D0 (en) * | 1999-02-01 | 1999-03-24 | Emuse Corp | Interactive system |
KR100696087B1 (en) * | 1999-02-08 | 2007-03-20 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | Method and apparatus for displaying an electronic program guide |
US6122660A (en) | 1999-02-22 | 2000-09-19 | International Business Machines Corporation | Method for distributing digital TV signal and selection of content |
US7102616B1 (en) | 1999-03-05 | 2006-09-05 | Microsoft Corporation | Remote control device with pointing capacity |
US6407779B1 (en) | 1999-03-29 | 2002-06-18 | Zilog, Inc. | Method and apparatus for an intuitive universal remote control system |
US8479251B2 (en) | 1999-03-31 | 2013-07-02 | Microsoft Corporation | System and method for synchronizing streaming content with enhancing content using pre-announced triggers |
JP4497577B2 (en) | 1999-04-05 | 2010-07-07 | キヤノン株式会社 | Multi-beam optical scanning device |
US6349410B1 (en) | 1999-08-04 | 2002-02-19 | Intel Corporation | Integrating broadcast television pause and web browsing |
US7325245B1 (en) * | 1999-09-30 | 2008-01-29 | Intel Corporation | Linking to video information |
KR20010102292A (en) * | 1999-12-22 | 2001-11-15 | 롤페스 요하네스 게라투스 알베르투스 | Pointer coordinates assignment |
US6931660B1 (en) * | 2000-01-28 | 2005-08-16 | Opentv, Inc. | Interactive television system and method for simultaneous transmission and rendering of multiple MPEG-encoded video streams |
US7631338B2 (en) * | 2000-02-02 | 2009-12-08 | Wink Communications, Inc. | Interactive content delivery methods and apparatus |
US7343617B1 (en) | 2000-02-29 | 2008-03-11 | Goldpocket Interactive, Inc. | Method and apparatus for interaction with hyperlinks in a television broadcast |
GB0004811D0 (en) | 2000-03-01 | 2000-04-19 | Pace Micro Tech Plc | Improvements relating to broadcast data receiving apparatus
US7293279B1 (en) | 2000-03-09 | 2007-11-06 | Sedna Patent Services, Llc | Advanced set top terminal having a program pause feature with voice-to-text conversion |
US7673315B1 (en) | 2000-03-30 | 2010-03-02 | Microsoft Corporation | System and method for providing program criteria representing audio and/or visual programming |
BR0109665A (en) | 2000-03-31 | 2003-02-04 | United Video Properties Inc | System and method for metadata-linked ads |
US20020040482A1 (en) | 2000-04-08 | 2002-04-04 | Sextro Gary L. | Features for interactive television |
US8205223B2 (en) | 2000-04-12 | 2012-06-19 | Lg Electronics Inc. | Method and video device for accessing information |
US20010052133A1 (en) * | 2000-04-12 | 2001-12-13 | Lg Electronics Inc. | Apparatus and method for providing and obtaining product information through a broadcast signal |
WO2001082595A1 (en) | 2000-04-27 | 2001-11-01 | Isurftv | Novel cursor control system |
US7349668B2 (en) | 2000-05-31 | 2008-03-25 | Optinetix (Israel) Ltd. | Systems and methods for embedding commercial information into broadcast media |
JP4501243B2 (en) | 2000-07-24 | 2010-07-14 | ソニー株式会社 | Television receiver and program execution method |
US20050193425A1 (en) | 2000-07-24 | 2005-09-01 | Sanghoon Sull | Delivery and presentation of content-relevant information associated with frames of audio-visual programs |
US20020056109A1 (en) * | 2000-07-25 | 2002-05-09 | Tomsen Mai-Lan | Method and system to provide a personalized shopping channel VIA an interactive video casting system |
US7103908B2 (en) | 2000-07-25 | 2006-09-05 | Diego, Inc. | Method and system to save context for deferred transaction via interactive television |
JP2002057645A (en) * | 2000-08-10 | 2002-02-22 | Ntt Docomo Inc | Method for data transfer and mobile unit server |
US20020078446A1 (en) | 2000-08-30 | 2002-06-20 | Jon Dakss | Method and apparatus for hyperlinking in a television broadcast |
US20020069405A1 (en) | 2000-09-20 | 2002-06-06 | Chapin Paul W. | System and method for spokesperson interactive television advertisements |
AU2001292914A1 (en) | 2000-09-21 | 2002-04-02 | Digital Network Shopping, Llc | Method and apparatus for digital shopping |
US6920244B2 (en) * | 2000-10-06 | 2005-07-19 | Rochester Institute Of Technology | Data-efficient and self adapting imaging spectrometry method and an apparatus thereof |
US8224078B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
JP4747410B2 (en) | 2000-11-10 | 2011-08-17 | ソニー株式会社 | Video switching display device and video switching display method |
US20020069404A1 (en) * | 2000-11-28 | 2002-06-06 | Navic Systems, Incorporated | Targeted promotion deployment |
US7305697B2 (en) * | 2001-02-02 | 2007-12-04 | Opentv, Inc. | Service gateway for interactive television |
US20050039214A1 (en) * | 2001-02-21 | 2005-02-17 | Lorenz Kim E. | System and method for providing direct, context-sensitive customer support in an interactive television system |
US20020120934A1 (en) * | 2001-02-28 | 2002-08-29 | Marc Abrahams | Interactive television browsing and buying method |
US20020162120A1 (en) * | 2001-04-25 | 2002-10-31 | Slade Mitchell | Apparatus and method to provide supplemental content from an interactive television system to a remote device |
US8255299B2 (en) * | 2001-04-26 | 2012-08-28 | Vivien Johan Cambridge | Visual remote control and tactile interaction system |
US20030023981A1 (en) * | 2001-07-25 | 2003-01-30 | Thomas Lemmons | Method and apparatus for transmission of interactive and enhanced television data |
AU2002318948C1 (en) | 2001-08-02 | 2009-08-13 | Opentv, Inc. | Post production visual alterations |
GB2397150B (en) | 2001-08-16 | 2005-09-14 | Goldpocket Interactive | Interactive television tracking system |
US20030035075A1 (en) | 2001-08-20 | 2003-02-20 | Butler Michelle A. | Method and system for providing improved user input capability for interactive television |
KR100846761B1 (en) | 2001-09-11 | 2008-07-16 | 삼성전자주식회사 | Pointer display method, the pointing device thereof, and the host device thereof |
US6896618B2 (en) * | 2001-09-20 | 2005-05-24 | Igt | Point of play registration on a gaming machine |
US7193661B2 (en) | 2001-09-27 | 2007-03-20 | Universal Electronics Inc. | Two way communication using light links |
US20030079224A1 (en) | 2001-10-22 | 2003-04-24 | Anton Komar | System and method to provide additional information associated with selectable display areas |
US20030145326A1 (en) | 2002-01-31 | 2003-07-31 | Koninklijke Philips Electronics N.V. | Subscription to TV channels/shows based on recommendation generated by a TV recommender |
US20030182393A1 (en) * | 2002-03-25 | 2003-09-25 | Sony Corporation | System and method for retrieving uniform resource locators from television content |
US20040078814A1 (en) * | 2002-03-29 | 2004-04-22 | Digeo, Inc. | Module-based interactive television ticker |
US6967566B2 (en) * | 2002-04-05 | 2005-11-22 | Creative Kingdoms, Llc | Live-action interactive adventure game |
US20050177861A1 (en) | 2002-04-05 | 2005-08-11 | Matsushita Electric Industrial Co., Ltd | Asynchronous integration of portable handheld device |
US7725398B2 (en) | 2002-06-19 | 2010-05-25 | Eastman Kodak Company | Method and system for selling goods and/or services over a communication network between multiple users |
US7266835B2 (en) | 2002-06-27 | 2007-09-04 | Digeo, Inc. | Method and apparatus for secure transactions in an interactive television ticker |
JP2006508618A (en) | 2002-12-02 | 2006-03-09 | 松下電器産業株式会社 | Portable device for viewing real-time synchronized information sent from broadcast sources |
GB2414836B (en) * | 2002-12-08 | 2006-11-01 | Immersion Corp | Haptic communication devices |
US6970160B2 (en) | 2002-12-19 | 2005-11-29 | 3M Innovative Properties Company | Lattice touch-sensing system |
EP1463052A1 (en) * | 2003-03-25 | 2004-09-29 | Deutsche Thomson-Brandt Gmbh | Method for representing animated menu buttons |
US20040221025A1 (en) * | 2003-04-29 | 2004-11-04 | Johnson Ted C. | Apparatus and method for monitoring computer networks |
JP2004347320A (en) * | 2003-05-15 | 2004-12-09 | Advantest Corp | Display and method for measuring and displaying signal |
US7053965B1 (en) | 2003-06-10 | 2006-05-30 | Fan Nong-Qiang | Remote control for controlling a computer using a screen of a television |
US8635643B2 (en) | 2003-06-30 | 2014-01-21 | At&T Intellectual Property I, L.P. | System and method for providing interactive media content over a network |
CN1706178B (en) * | 2003-09-12 | 2010-10-06 | 松下电器产业株式会社 | Image displaying apparatus and method |
US20050086690A1 (en) * | 2003-10-16 | 2005-04-21 | International Business Machines Corporation | Interactive, non-intrusive television advertising |
US20050132420A1 (en) * | 2003-12-11 | 2005-06-16 | Quadrock Communications, Inc | System and method for interaction with television content |
US8286203B2 (en) | 2003-12-19 | 2012-10-09 | At&T Intellectual Property I, L.P. | System and method for enhanced hot key delivery |
US7979877B2 (en) * | 2003-12-23 | 2011-07-12 | Intellocity Usa Inc. | Advertising methods for advertising time slots and embedded objects |
FI20040037A0 (en) | 2004-01-13 | 2004-01-13 | Nokia Corp | Providing position information |
JP4192819B2 (en) | 2004-03-19 | 2008-12-10 | ソニー株式会社 | Information processing apparatus and method, recording medium, and program |
US20050229227A1 (en) * | 2004-04-13 | 2005-10-13 | Evenhere, Inc. | Aggregation of retailers for televised media programming product placement |
US20050234782A1 (en) * | 2004-04-14 | 2005-10-20 | Schackne Raney J | Clothing and model image generation, combination, display, and selection |
EP1743322A4 (en) * | 2004-04-30 | 2008-04-30 | Hillcrest Lab Inc | Methods and devices for removing unintentional movement in free space pointing devices |
US20050251835A1 (en) | 2004-05-07 | 2005-11-10 | Microsoft Corporation | Strategies for pausing and resuming the presentation of programs |
US20050289593A1 (en) * | 2004-05-26 | 2005-12-29 | Skipjam Corp. | Method and system for displaying and selecting content of an electronic program guide |
US7335456B2 (en) * | 2004-05-27 | 2008-02-26 | International Business Machines Corporation | Top coat material and use thereof in lithography processes |
US7542072B2 (en) * | 2004-07-28 | 2009-06-02 | The University Of Maryland | Device using a camera and light polarization for the remote displacement of a cursor on a display |
TWI236289B (en) * | 2004-08-11 | 2005-07-11 | Pixart Imaging Inc | Interactive device capable of improving image processing |
EP1779226B1 (en) * | 2004-08-12 | 2018-10-24 | Philips Intellectual Property & Standards GmbH | Method and system for controlling a display |
KR100644095B1 (en) | 2004-10-13 | 2006-11-10 | 박우현 | A method of realizing interactive advertisement by extending linked data broadcasting to internet in digital broadcasting environment |
US7890376B2 (en) | 2004-11-05 | 2011-02-15 | Ebay Inc. | System and method for location based content correlation |
US20070266406A1 (en) | 2004-11-09 | 2007-11-15 | Murali Aravamudan | Method and system for performing actions using a non-intrusive television with reduced text input |
KR100716988B1 (en) | 2004-11-20 | 2007-05-10 | 삼성전자주식회사 | Display method, preferred service management method and device provided by DMV |
US7796116B2 (en) | 2005-01-12 | 2010-09-14 | Thinkoptics, Inc. | Electronic equipment for handheld vision based absolute pointing system |
JP2006260028A (en) * | 2005-03-16 | 2006-09-28 | Sony Corp | Remote control system, remote controller, remote control method, information processor, information processing method and program |
US20060259930A1 (en) * | 2005-05-10 | 2006-11-16 | Rothschild Leigh M | System and method for obtaining information on digital media content |
US20060268895A1 (en) * | 2005-05-17 | 2006-11-30 | Kotzin Michael D | Linking a mobile wireless communication device to a proximal consumer broadcast device |
US8223136B2 (en) | 2005-06-07 | 2012-07-17 | Intel Corporation | Error detection and prevention in acoustic data
US7814022B2 (en) | 2005-06-10 | 2010-10-12 | Aniruddha Gupte | Enhanced media method and apparatus for use in digital distribution system |
US20070150368A1 (en) * | 2005-09-06 | 2007-06-28 | Samir Arora | On-line personalized content and merchandising environment |
US20070078732A1 (en) * | 2005-09-14 | 2007-04-05 | Crolley C W | Interactive information access system |
US7344084B2 (en) * | 2005-09-19 | 2008-03-18 | Sony Corporation | Portable video programs |
JP4453647B2 (en) * | 2005-10-28 | 2010-04-21 | セイコーエプソン株式会社 | Moving image display device and moving image display method |
US7464688B2 (en) * | 2005-12-21 | 2008-12-16 | Yu Robert C | Active radical initiator for internal combustion engines |
US20070157242A1 (en) * | 2005-12-29 | 2007-07-05 | United Video Properties, Inc. | Systems and methods for managing content |
US20070157260A1 (en) | 2005-12-29 | 2007-07-05 | United Video Properties, Inc. | Interactive media guidance system having multiple devices |
US20070156521A1 (en) * | 2005-12-29 | 2007-07-05 | United Video Properties, Inc. | Systems and methods for commerce in media program related merchandise |
US7710504B2 (en) | 2006-02-21 | 2010-05-04 | Mitsubishi Digital Electronics America, Inc. | Remote control system and method for controlling a television |
US20070199014A1 (en) * | 2006-02-22 | 2007-08-23 | E-Cast, Inc. | Consumer portal |
US20100064320A1 (en) * | 2006-03-13 | 2010-03-11 | Verizon Services Corp. | Integrating data on program popularity into an on-screen program guide |
JP2007251446A (en) * | 2006-03-15 | 2007-09-27 | Sharp Corp | Receiving apparatus, and receiving system |
US8095423B2 (en) | 2006-03-17 | 2012-01-10 | Grant Allen Lee Nichols | Interactive international bulk trade television |
JP5649303B2 (en) | 2006-03-30 | 2015-01-07 | SRI International | Method and apparatus for annotating media streams
EP2025150B1 (en) | 2006-05-31 | 2019-05-08 | Telecom Italia S.p.A. | Method and tv receiver for storing contents associated to tv programs |
US8261300B2 (en) | 2006-06-23 | 2012-09-04 | Tivo Inc. | Method and apparatus for advertisement placement in a user dialog on a set-top box |
US20080052750A1 (en) | 2006-08-28 | 2008-02-28 | Anders Grunnet-Jepsen | Direct-point on-demand information exchanges |
US9319741B2 (en) * | 2006-09-07 | 2016-04-19 | Rateze Remote Mgmt Llc | Finding devices in an entertainment system |
US8775452B2 (en) | 2006-09-17 | 2014-07-08 | Nokia Corporation | Method, apparatus and computer program product for providing standard real world to virtual world links |
US8813118B2 (en) * | 2006-10-03 | 2014-08-19 | Verizon Patent And Licensing Inc. | Interactive content for media content access systems and methods |
US20080089551A1 (en) | 2006-10-16 | 2008-04-17 | Ashley Heather | Interactive TV data track synchronization system and method |
US20080109851A1 (en) | 2006-10-23 | 2008-05-08 | Ashley Heather | Method and system for providing interactive video |
US9218213B2 (en) * | 2006-10-31 | 2015-12-22 | International Business Machines Corporation | Dynamic placement of heterogeneous workloads |
WO2008055204A2 (en) * | 2006-10-31 | 2008-05-08 | Dotted Pair, Inc. | System and method for interacting with item catalogs |
ATE497623T1 (en) | 2006-11-16 | 2011-02-15 | Sharp Kk | IMAGE DISPLAY DEVICE AND IMAGE DISPLAY METHOD |
US8269746B2 (en) * | 2006-11-27 | 2012-09-18 | Microsoft Corporation | Communication with a touch screen |
WO2008070572A2 (en) * | 2006-12-01 | 2008-06-12 | Hsn Lp | Method and system for improved interactive television processing |
CA2571617A1 (en) * | 2006-12-15 | 2008-06-15 | Desktopbox Inc. | Simulcast internet media distribution system and method |
US20080209480A1 (en) * | 2006-12-20 | 2008-08-28 | Eide Kurt S | Method for enhanced video programming system for integrating internet data for on-demand interactive retrieval |
KR100818990B1 (en) | 2006-12-28 | 2008-04-04 | 삼성전자주식회사 | Apparatus and method for converting motion signals |
US20080172693A1 (en) | 2007-01-16 | 2008-07-17 | Microsoft Corporation | Representing Television Programs Using Video Objects |
US8239215B2 (en) * | 2007-01-17 | 2012-08-07 | Mitochon Systems, Inc. | Apparatus and method for revenue distribution generated from delivering healthcare advertisements via EMR systems, RHIN, and electronic advertising servers |
US8265957B2 (en) | 2007-01-18 | 2012-09-11 | At&T Intellectual Property I, L.P. | Methods, systems, and computer-readable media for disease management |
US20080184132A1 (en) | 2007-01-31 | 2008-07-31 | Zato Thomas J | Media content tagging |
JP5141043B2 (en) | 2007-02-27 | 2013-02-13 | 株式会社日立製作所 | Image display device and image display method |
US20080204605A1 (en) | 2007-02-28 | 2008-08-28 | Leonard Tsai | Systems and methods for using a remote control unit to sense television characteristics |
US8181206B2 (en) * | 2007-02-28 | 2012-05-15 | Time Warner Cable Inc. | Personal content server apparatus and methods |
US7890380B2 (en) | 2007-05-07 | 2011-02-15 | At&T Intellectual Property I, L.P. | Method, system, and computer readable medium for implementing sales of products using a trace of an object |
KR20080099592A (en) * | 2007-05-10 | 2008-11-13 | 엘지전자 주식회사 | Remote control unit and remote operation method |
US8102365B2 (en) * | 2007-05-14 | 2012-01-24 | Apple Inc. | Remote control systems that can distinguish stray light sources |
US20090165041A1 (en) * | 2007-12-21 | 2009-06-25 | Penberthy John S | System and Method for Providing Interactive Content with Video Content |
US8290513B2 (en) * | 2007-06-28 | 2012-10-16 | Apple Inc. | Location-based services |
US7889175B2 (en) * | 2007-06-28 | 2011-02-15 | Panasonic Corporation | Touchpad-enabled remote controller and user interaction methods |
US20090006211A1 (en) | 2007-07-01 | 2009-01-01 | Decisionmark Corp. | Network Content And Advertisement Distribution System and Method |
JPWO2009014206A1 (en) * | 2007-07-26 | 2010-10-07 | シャープ株式会社 | Remote control device and television broadcast receiver |
US20090037947A1 (en) | 2007-07-30 | 2009-02-05 | Yahoo! Inc. | Textual and visual interactive advertisements in videos |
US8744118B2 (en) | 2007-08-03 | 2014-06-03 | At&T Intellectual Property I, L.P. | Methods, systems, and products for indexing scenes in digital media |
US20090113475A1 (en) * | 2007-08-21 | 2009-04-30 | Yi Li | Systems and methods for integrating search capability in interactive video |
US7987478B2 (en) * | 2007-08-28 | 2011-07-26 | Sony Ericsson Mobile Communications Ab | Methods, devices, and computer program products for providing unobtrusive video advertising content |
KR101348346B1 (en) | 2007-09-06 | 2014-01-08 | 삼성전자주식회사 | Pointing apparatus, pointer controlling apparatus, pointing method and pointer controlling method |
JP5005817B2 (en) | 2007-09-14 | 2012-08-22 | エヌイーシー ヨーロッパ リミテッド | Method and system for optimizing network performance |
US8145920B2 (en) * | 2007-09-17 | 2012-03-27 | Intel Corporation | Techniques for collaborative power management for heterogeneous networks |
US8843959B2 (en) * | 2007-09-19 | 2014-09-23 | Orlando McMaster | Generating synchronized interactive link maps linking tracked video objects to other multimedia content in real-time |
JP4479776B2 (en) | 2007-10-05 | 2010-06-09 | ソニー株式会社 | Display device and transmission device |
US8140012B1 (en) * | 2007-10-25 | 2012-03-20 | At&T Mobility Ii Llc | Bluetooth security profile |
US8875212B2 (en) | 2008-04-15 | 2014-10-28 | Shlomo Selim Rakib | Systems and methods for remote control of interactive video |
US8271357B2 (en) * | 2007-12-11 | 2012-09-18 | Ebay Inc. | Presenting items based on activity rates |
US8555311B2 (en) | 2007-12-19 | 2013-10-08 | United Video Properties, Inc. | Methods and devices for presenting guide listings and guidance data in three dimensions in an interactive media guidance application |
JP5228498B2 (en) * | 2008-01-22 | 2013-07-03 | 富士通株式会社 | retrieval method |
US20090187862A1 (en) * | 2008-01-22 | 2009-07-23 | Sony Corporation | Method and apparatus for the intuitive browsing of content |
US8745670B2 (en) | 2008-02-26 | 2014-06-03 | At&T Intellectual Property I, Lp | System and method for promoting marketable items |
US20090235312A1 (en) | 2008-03-11 | 2009-09-17 | Amir Morad | Targeted content with broadcast material |
US8758102B2 (en) * | 2008-03-25 | 2014-06-24 | Wms Gaming, Inc. | Generating casino floor maps |
US8676030B2 (en) | 2008-04-15 | 2014-03-18 | Shlomo Selim Rakib | Methods and systems for interacting with viewers of video content |
US20090256811A1 (en) * | 2008-04-15 | 2009-10-15 | Sony Ericsson Mobile Communications Ab | Optical touch screen |
US8760401B2 (en) * | 2008-04-21 | 2014-06-24 | Ron Kimmel | System and method for user object selection in geographic relation to a video display |
US9256882B2 (en) | 2008-05-27 | 2016-02-09 | At&T Intellectual Property I, Lp. | Methods, communications devices, and computer program products for selecting an advertisement to initiate device-to-device communications |
CN102144400A (en) * | 2008-09-08 | 2011-08-03 | 夏普株式会社 | Control system, video display device, and remote control device |
US8239359B2 (en) * | 2008-09-23 | 2012-08-07 | Disney Enterprises, Inc. | System and method for visual search in a video media player |
KR100972932B1 (en) * | 2008-10-16 | 2010-07-28 | 인하대학교 산학협력단 | Touch screen panel |
US20100098074A1 (en) | 2008-10-22 | 2010-04-22 | Backchannelmedia Inc. | Systems and methods for providing a network link between broadcast content and content located on a computer network |
US8181212B2 (en) | 2008-10-30 | 2012-05-15 | Frederic Sigal | Method of providing a frame-based object redirection overlay for a video stream |
US7756758B2 (en) * | 2008-12-08 | 2010-07-13 | Hsn Lp | Method and system for improved E-commerce shopping |
EP2200334A1 (en) * | 2008-12-18 | 2010-06-23 | Thomson Licensing | Display device with feedback elements and method for monitoring |
US20100162303A1 (en) | 2008-12-23 | 2010-06-24 | Cassanova Jeffrey P | System and method for selecting an object in a video data stream |
US8413188B2 (en) * | 2009-02-20 | 2013-04-02 | At&T Intellectual Property I, Lp | System and method for processing image objects in video data |
US20100257448A1 (en) | 2009-04-06 | 2010-10-07 | Interactical Llc | Object-Based Interactive Programming Device and Method |
EP2264580A2 (en) * | 2009-06-10 | 2010-12-22 | Samsung Electronics Co., Ltd. | Method and apparatus for processing motion data |
US9118468B2 (en) * | 2009-07-23 | 2015-08-25 | Qualcomm Incorporated | Asynchronous time division duplex operation in a wireless network |
US8272012B2 (en) * | 2009-07-29 | 2012-09-18 | Echostar Technologies L.L.C. | User-controlled data/video integration by a video control system |
US9232167B2 (en) | 2009-08-04 | 2016-01-05 | Echostar Technologies L.L.C. | Video system and remote control with touch interface for supplemental content display |
US8947350B2 (en) | 2009-09-14 | 2015-02-03 | Broadcom Corporation | System and method for generating screen pointing information in a television control device |
US20110141013A1 (en) * | 2009-12-14 | 2011-06-16 | Alcatel-Lucent Usa, Incorporated | User-interface apparatus and method for user control |
JP2013042196A (en) * | 2009-12-21 | 2013-02-28 | Panasonic Corp | Reproduction device |
2010
- 2010-05-05 US US12/774,321 patent/US8947350B2/en active Active
- 2010-05-05 US US12/774,221 patent/US20110063522A1/en not_active Abandoned
- 2010-05-05 US US12/774,380 patent/US8990854B2/en active Active
- 2010-05-05 US US12/774,154 patent/US9110517B2/en active Active
- 2010-08-05 US US12/850,866 patent/US9098128B2/en active Active
- 2010-08-05 US US12/851,036 patent/US9462345B2/en not_active Expired - Fee Related
- 2010-08-05 US US12/851,075 patent/US20110067069A1/en not_active Abandoned
- 2010-08-05 US US12/850,945 patent/US9081422B2/en active Active
- 2010-08-05 US US12/850,832 patent/US20110067047A1/en not_active Abandoned
- 2010-08-05 US US12/850,911 patent/US9197941B2/en active Active
- 2010-08-30 EP EP10009014.1A patent/EP2328347A3/en not_active Withdrawn
- 2010-09-13 US US12/881,067 patent/US9043833B2/en active Active
- 2010-09-13 US US12/881,004 patent/US8931015B2/en active Active
- 2010-09-13 US US12/881,110 patent/US9137577B2/en active Active
- 2010-09-13 US US12/881,031 patent/US20110066929A1/en not_active Abandoned
- 2010-09-13 US US12/880,851 patent/US20110067051A1/en not_active Abandoned
- 2010-09-13 US US12/880,668 patent/US8832747B2/en active Active
- 2010-09-13 US US12/880,749 patent/US9110518B2/en active Active
- 2010-09-13 US US12/880,888 patent/US8819732B2/en active Active
- 2010-09-13 US US12/881,096 patent/US9258617B2/en active Active
- 2010-09-13 US US12/880,594 patent/US8839307B2/en active Active
- 2010-09-13 US US12/880,965 patent/US9271044B2/en active Active
- 2010-09-13 US US12/880,530 patent/US20110067054A1/en not_active Abandoned
- 2010-09-14 CN CN2010102811775A patent/CN102025933A/en active Pending
- 2010-09-14 TW TW99131055A patent/TW201132122A/en unknown
2014
- 2014-08-12 US US14/457,451 patent/US20150012939A1/en not_active Abandoned
- 2014-08-25 US US14/467,408 patent/US20140366062A1/en not_active Abandoned
- 2014-09-08 US US14/479,670 patent/US20140380381A1/en not_active Abandoned
- 2014-09-08 US US14/480,020 patent/US20140380401A1/en not_active Abandoned
- 2014-09-17 US US14/488,778 patent/US20150007222A1/en not_active Abandoned
- 2014-12-17 US US14/572,916 patent/US20150106857A1/en not_active Abandoned
2015
- 2015-01-23 US US14/603,457 patent/US20150135217A1/en not_active Abandoned
- 2015-02-19 US US14/625,810 patent/US20150172769A1/en not_active Abandoned
- 2015-06-05 US US14/731,983 patent/US20150296263A1/en not_active Abandoned
- 2015-06-29 US US14/753,183 patent/US20150304721A1/en not_active Abandoned
- 2015-07-22 US US14/805,961 patent/US20150326931A1/en not_active Abandoned
- 2015-09-11 US US14/851,225 patent/US20160007090A1/en not_active Abandoned
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5793361A (en) * | 1994-06-09 | 1998-08-11 | Corporation For National Research Initiatives | Unconstrained pointing interface for natural human interaction with a display-based computer system |
US5708845A (en) * | 1995-09-29 | 1998-01-13 | Wistendahl; Douglass A. | System for mapping hot spots in media content for interactive digital media program |
US20040095317A1 (en) * | 2002-11-20 | 2004-05-20 | Jingxi Zhang | Method and apparatus of universal remote pointing control for home entertainment system and computer |
US20060139319A1 (en) * | 2004-11-24 | 2006-06-29 | General Electric Company | System and method for generating most read images in a pacs workstation |
US20060241864A1 (en) * | 2005-04-22 | 2006-10-26 | Outland Research, Llc | Method and apparatus for point-and-send data transfer within an ubiquitous computing environment |
US20080136754A1 (en) * | 2006-12-06 | 2008-06-12 | Sony Corporation | Display apparatus, display-apparatus control method and program |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10335678B2 (en) * | 2014-11-05 | 2019-07-02 | DeNA Co., Ltd. | Game program and information processing device |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
US8947350B2 (en) | System and method for generating screen pointing information in a television control device | |
US11729551B2 (en) | Systems and methods for ultra-wideband applications | |
JP7650907B2 (en) | Radio Tuning of Audio Sources | |
US10757243B2 (en) | Method and system for user interface for interactive devices using a mobile device | |
US9516241B2 (en) | Beamforming method and apparatus for sound signal | |
US8391789B2 (en) | Apparatus for facilitating peripheral device selection | |
US10362270B2 (en) | Multimodal spatial registration of devices for congruent multimedia communications | |
JP5255674B2 (en) | Data transmission operation device and data transmission control method | |
US11520550B1 (en) | Electronic system for producing a coordinated output using wireless localization of multiple portable electronic devices | |
EP3391671A1 (en) | Apparatus and method for detecting loudspeaker connection or positioning errors during calibration of a multi channel audio system | |
US9693168B1 (en) | Ultrasonic speaker assembly for audio spatial effect | |
US11157111B2 (en) | Ultrafine LED display that includes sensor elements | |
US20050073497A1 (en) | Remote control device capable of sensing motion | |
KR102578695B1 (en) | Method and electronic device for managing multiple devices | |
JP2006157638A (en) | Remote controller, electronic device, display device and game machine control apparatus | |
US20180048846A1 (en) | Image display apparatus | |
JP2009065292A (en) | System, method, and program for viewing and listening programming simultaneously | |
US20160127849A1 (en) | Program Used for Terminal Apparatus, Sound Apparatus, Sound System, and Method Used for Sound Apparatus | |
EP4290266A1 (en) | Electronic device for controlling external electronic device and operation method thereof | |
KR102578447B1 (en) | Image display device and method for controlling the same | |
CN113391713A (en) | Electronic device, control method for electronic device, and storage medium | |
WO2024131484A1 (en) | Sound field calibration method and electronic device | |
US9439046B2 (en) | Method for measuring position, non-transitory recording medium storing position measurement program, and radio apparatus | |
EP4216214A1 (en) | Controller and system comprising same | |
CN117836650A (en) | Electronic device for controlling external electronic device and method of operating the same |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: BROADCOM CORPORATION, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: KARAOGUZ, JEYHAN; SESHADRI, NAMBIRAJAN; SIGNING DATES FROM 20100504 TO 20100505; REEL/FRAME: 034525/0508
| AS | Assignment | Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA. Free format text: PATENT SECURITY AGREEMENT; ASSIGNOR: BROADCOM CORPORATION; REEL/FRAME: 037806/0001. Effective date: 20160201
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION
| AS | Assignment | Owner name: BROADCOM CORPORATION, CALIFORNIA. Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS; ASSIGNOR: BANK OF AMERICA, N.A., AS COLLATERAL AGENT; REEL/FRAME: 041712/0001. Effective date: 20170119