WO2018155354A1 - Electronic device control method, electronic device control system, electronic device, and program
- Publication number
- WO2018155354A1 (PCT/JP2018/005616)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/485—End-user interface for client configuration
Definitions
- Patent Document 1 describes a technique for selecting content according to the communication environment or the computer environment of each computer terminal, in order to synchronize video reproduction by a plurality of computer terminals (see Patent Document 1).
- In Patent Document 1, however, content is selected only in accordance with a communication environment or a computer environment, that is, hardware and software factors of the computer. There is therefore a problem in that selection of content, and control of content presentation, based on other factors cannot be realized.
- The present disclosure accordingly provides an electronic device control method that controls the presentation of content according to the environment around the electronic device, a factor distinct from the hardware or software factors of the electronic device.
- One aspect of the present disclosure may be realized as a system, an apparatus, an integrated circuit, a computer program, or a computer-readable recording medium such as a CD-ROM, or as any combination of a system, an apparatus, an integrated circuit, a computer program, and a recording medium.
- FIG. 1 is an explanatory diagram showing an overall image of a service.
- FIG. 2 is a schematic diagram illustrating an appearance of the electronic apparatus according to the first embodiment.
- FIG. 3 is a block diagram illustrating a configuration of the electronic device according to the first embodiment.
- FIG. 4 is a detailed block diagram of the control processing unit according to the first embodiment.
- FIG. 5 is a detailed block diagram of the content management unit according to the first embodiment.
- FIG. 6 is a schematic diagram illustrating an example of an installation mode in which the electronic apparatus according to Embodiment 1 is leaned against a wall.
- FIG. 7 is a schematic diagram illustrating an example of an installation mode in which the electronic apparatus according to Embodiment 1 is hung on a wall.
- FIG. 8 is a perspective view illustrating an appearance of the electronic apparatus according to the first embodiment.
- FIG. 9 is an exploded perspective view of the electronic apparatus according to the first embodiment.
- FIG. 10 is a flowchart showing processing of the electronic device according to the first embodiment.
- FIG. 11 is a flowchart showing a first example of the viewing environment determination process according to Embodiment 1.
- FIG. 12A is a flowchart showing a second example of the viewing environment determination process according to Embodiment 1.
- FIG. 12B is a flowchart showing a third example of the viewing environment determination process according to Embodiment 1.
- FIG. 12C is a flowchart showing a fourth example of the viewing environment determination process according to Embodiment 1.
- FIG. 13 is a flowchart illustrating an example of the control content determination process according to the first embodiment.
- FIG. 14 is an explanatory diagram illustrating the state of the space and an example of output information in the first embodiment.
- FIG. 15 is an explanatory diagram of a table showing the estimation results of the person and space states in the first embodiment.
- FIG. 16 is an explanatory diagram of a table for determining control contents using the determination by the camera according to the first embodiment.
- FIG. 17 is an explanatory diagram of a table for determining control contents using installation environment determination by an illuminometer in the first embodiment.
- FIG. 18 is an explanatory diagram of a table for determining control contents using tilt determination by a three-axis sensor in the first embodiment.
- FIG. 19 is an explanatory diagram of a table for determining control contents using installation environment determination using buttons on the upper and lower surfaces in the first embodiment.
- FIG. 20 is an explanatory diagram of a content management table in the first embodiment.
- FIG. 21 is an explanatory diagram of a voice management table according to the first embodiment.
- FIG. 22A is an explanatory diagram of a table showing an example of a change in viewing environment state and content control in the first embodiment.
- FIG. 22B is an explanatory diagram illustrating a state in which a user is present in front of the electronic device according to Embodiment 1.
- FIG. 22C is an explanatory diagram illustrating a relationship between the direction in which the user exists as viewed from the electronic device according to Embodiment 1 and the output volume of the speaker.
- An electronic device control method according to one aspect of the present disclosure acquires environmental information of the electronic device, including information about the environment around the electronic device or information about an installation mode of the electronic device, and, by referring to a predetermined association between the environmental information of the electronic device and details of control related to the presentation of content by the electronic device, performs the control related to the presentation of content by the electronic device that is associated with the acquired environmental information.
- In this way, the electronic device performs control related to the presentation of content using the acquired environmental information and the predetermined association.
- In the predetermined association, the environmental information of the electronic device and the details of the control are associated with each other in advance. The electronic device can therefore perform control related to content presentation based on its acquired environmental information; that is, it can control the presentation of content according to the environment around the electronic device.
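As a minimal illustration (not part of the disclosed embodiments; the keys and action names below are hypothetical), the predetermined association can be thought of as a lookup table from acquired environmental information to a control detail:

```python
# Hypothetical sketch of the "predetermined association": a table mapping an
# item of acquired environmental information to a control action for content
# presentation. Keys and action names are illustrative assumptions.
PREDETERMINED_ASSOCIATION = {
    ("person_present", True): "start_presentation",
    ("person_present", False): "stop_presentation",
    ("brightness", "dark"): "dim_display",
    ("brightness", "bright"): "raise_display_brightness",
}

def control_for(environment_info):
    """Return the control associated with the acquired environmental
    information, or None when no association is defined."""
    return PREDETERMINED_ASSOCIATION.get(environment_info)

print(control_for(("person_present", True)))  # start_presentation
print(control_for(("brightness", "dark")))    # dim_display
```

Because the association is prepared in advance, the device only needs the acquired environmental information at run time to decide the control.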
- The environmental information includes, as information about the environment around the electronic device, information indicating the presence or absence of a person around the electronic device, the orientation of a person around the electronic device, the direction of a person as seen from the electronic device, the brightness around the electronic device, or the sound volume around the electronic device. The environmental information may be acquired by a sensor that senses the environment around the electronic device and outputs an environmental value indicating that environment.
- The environmental information may include information indicating at least two of the presence or absence of a person around the electronic device, the orientation of a person around the electronic device, the direction of a person as seen from the electronic device, the brightness around the electronic device, and the sound volume around the electronic device, and the control related to the presentation of content by the electronic device may be performed based on the combination of the information indicating the at least two.
- In this way, the electronic device treats the environmental information as a composite of specific items, such as the presence or orientation of a person around the electronic device, the direction of a person as seen from the electronic device, or the ambient brightness or sound volume, and can thereby control the presentation of content more appropriately according to the environment around the electronic device.
- The predetermined association may include an association between the environmental information of the electronic device and attribute information indicating an attribute of the content to be presented by an electronic device whose environmental information matches that environmental information.
- The content having the attribute information associated with the acquired environmental information may then be selected by referring to the predetermined association.
- In this way, when the electronic device selects the content to be presented from among a plurality of contents, it can select content based on information indicating whether editing, adjustment, or processing of the content at the time of presentation is permitted.
- This avoids presenting content in a manner that has not been permitted.
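As an illustrative sketch (the record fields and attribute names are hypothetical, not taken from the disclosure), attribute-based selection could match each content's attribute information against the acquired environmental information while respecting a per-content flag stating whether editing or processing at presentation time is permitted:

```python
# Hypothetical content records: each carries attribute information and a flag
# saying whether editing/adjustment/processing at presentation time is allowed.
CONTENTS = [
    {"id": "c1", "attribute": "bright_room", "editable": True},
    {"id": "c2", "attribute": "dark_room",   "editable": False},
    {"id": "c3", "attribute": "dark_room",   "editable": True},
]

def select_content(environment_attribute, needs_editing):
    """Select contents whose attribute matches the environment; when the device
    would need to edit the content for presentation, skip non-editable ones."""
    return [
        c["id"] for c in CONTENTS
        if c["attribute"] == environment_attribute
        and (c["editable"] or not needs_editing)
    ]

print(select_content("dark_room", needs_editing=True))   # ['c3']
print(select_content("dark_room", needs_editing=False))  # ['c2', 'c3']
```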
- The predetermined association may include an association between information indicating a change in the environmental information of the electronic device and the start, stop, or end of content presentation.
- By referring to the predetermined association, the presentation of content by the electronic device associated with the acquired change in the environmental information may be started, stopped, or ended.
- The predetermined association may include an association between information indicating a change in the environmental information of the electronic device and a change in the volume or brightness of the content.
- By referring to the predetermined association, the volume or brightness of the content presented by the electronic device, associated with the acquired change in the environmental information, may be changed.
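A minimal sketch of change-driven control, under the assumption (not stated in the disclosure) that each detected change maps to a fixed volume/brightness adjustment:

```python
# Hypothetical rules mapping a detected change in environmental information to
# an adjustment of the content's volume or brightness. Values are illustrative.
CHANGE_RULES = {
    "person_approached":  {"volume": +2, "brightness": 0},
    "person_left":        {"volume": -2, "brightness": 0},
    "room_became_darker": {"volume": 0,  "brightness": -30},
}

def apply_change(state, change):
    """Return the new presentation state after applying the rule for a change;
    volume is clamped at 0 and brightness to the 0-100 range."""
    rule = CHANGE_RULES.get(change, {"volume": 0, "brightness": 0})
    return {
        "volume": max(0, state["volume"] + rule["volume"]),
        "brightness": max(0, min(100, state["brightness"] + rule["brightness"])),
    }

state = {"volume": 5, "brightness": 80}
print(apply_change(state, "room_became_darker"))  # {'volume': 5, 'brightness': 50}
```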
- The environmental information may include, as information regarding the installation mode of the electronic device, information indicating the attitude of the installed electronic device.
- The predetermined association may then include an association between the information indicating the attitude of the electronic device and control related to the presentation of content by an electronic device installed in that attitude.
- In the control related to the presentation of content, the control associated with the information indicating the attitude of the electronic device may be performed.
- In this way, the environmental information includes specific information on the attitude of the electronic device, and the electronic device can perform control related to the presentation of content based on its attitude.
- The environmental information may include, as information regarding the installation mode of the electronic device, juxtaposition information, that is, information on whether the electronic device is installed side by side with another electronic device. The predetermined association may then include an association between the juxtaposition information of the electronic device and control related to the presentation of content by an electronic device installed in the mode indicated by the juxtaposition information, and in the control related to the presentation of content, the control associated with the juxtaposition information of the electronic device may be performed.
- In this way, the environmental information includes specific information indicating whether a plurality of electronic devices are arranged side by side, and the electronic device can perform control related to the presentation of content based on how it is arranged.
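One conceivable use of juxtaposition information, sketched below as an assumption rather than the disclosed behavior, is for each side-by-side device to present its own horizontal slice of a shared image:

```python
# Hypothetical sketch: when the juxtaposition information says the device is
# placed side by side with others, each device presents one horizontal slice of
# a shared wide image; otherwise it presents the whole image alone.
def presentation_plan(juxtaposed, position, total_devices):
    """Decide which horizontal fraction of the content this device shows.
    position is this device's index (0-based) in the row."""
    if not juxtaposed or total_devices <= 1:
        return {"mode": "single", "slice": (0.0, 1.0)}
    width = 1.0 / total_devices
    return {"mode": "multi", "slice": (position * width, (position + 1) * width)}

print(presentation_plan(False, 0, 1))  # the whole image on one device
print(presentation_plan(True, 1, 3))   # the middle third on the middle device
```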
- The control method may further output log information related to the presentation of the content.
- An electronic device control system according to one aspect of the present disclosure includes an acquisition unit that acquires environmental information of the electronic device, including information about the environment around the electronic device or information about an installation mode of the electronic device, and a control processing unit that, by referring to a predetermined association between the environmental information of the electronic device and details of control related to the presentation of content by the electronic device, performs the control related to the presentation of content by the electronic device that is associated with the acquired environmental information.
- An electronic device according to one aspect of the present disclosure includes an acquisition unit that acquires environmental information of the electronic device, including information about the environment around the electronic device or information about an installation mode of the electronic device, and a control processing unit that, by referring to a predetermined association between the environmental information of the electronic device and details of control related to the presentation of content by the electronic device, performs the control related to the presentation of content by the electronic device that is associated with the acquired environmental information.
- a program according to an aspect of the present disclosure is a program for causing a computer to execute the above-described electronic device control method.
- The electronic device has at least one of a video reproduction function, an audio reproduction function, an illumination dimming function, and a scent generation function.
- While the control processing unit, which includes the viewing environment estimation unit, estimates that the user is awake or is changing from a sleeping state to an awake state, the control content determination unit causes the electronic device to play a predetermined video, play predetermined audio, output predetermined illumination, or generate a predetermined scent. While the control processing unit estimates that the user is sleeping or is changing from the awake state to the sleeping state, the electronic device stops the predetermined video or changes its content, stops the predetermined audio or changes its content, stops the light output of the predetermined illumination or changes its operation mode, or stops generating the predetermined scent or changes the scent.
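The wake/sleep behavior described above can be sketched as a simple state-to-actions mapping; the state and action names below are illustrative assumptions, not the disclosed implementation:

```python
# Hypothetical sketch: map the estimated user state to actions for the device's
# video, audio, illumination, and scent functions, as described in the text.
def actions_for(estimated_state):
    """Return the per-function actions for an estimated user state."""
    if estimated_state == "awake":
        return {"video": "play", "audio": "play",
                "illumination": "on", "scent": "emit"}
    # "sleeping": stop each output or switch it to a quieter mode/content
    return {"video": "stop_or_change", "audio": "stop_or_change",
            "illumination": "off_or_change_mode", "scent": "stop_or_change"}

print(actions_for("awake")["video"])     # play
print(actions_for("sleeping")["scent"])  # stop_or_change
```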
- The sensor is a human detection sensor device provided in the electronic device, or a human detection sensor device outside the electronic device, and the control processing unit including the viewing environment estimation unit uses the information acquired by the human detection sensor device.
- The control processing unit including the viewing environment estimation unit estimates the presence and usage environment of the user, and the control content determination unit changes the control details of the electronic device according to the presence or usage environment of the user.
- The electronic device includes an output unit that reproduces video, and upper and lower speaker units that reproduce audio.
- While the control processing unit including the viewing environment estimation unit estimates that the device is placed at an angle, the control content determination unit plays a predetermined video on the electronic device while changing and adjusting the audio output of the upper speaker and the lower speaker.
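One way to rebalance the upper and lower speakers from the tilt angle is sketched below; the linear gain curve and the 45-degree range are illustrative assumptions, not the disclosed method:

```python
# Hypothetical sketch: rebalance the upper/lower speaker gains from the tilt
# angle reported by the attitude sensing. Positive tilt leans the device back,
# so the lower speaker is emphasized. The linear curve is an assumption.
def speaker_gains(tilt_degrees, max_tilt=45.0):
    """Return (upper_gain, lower_gain), each in [0, 1], summing to 1."""
    t = max(-max_tilt, min(max_tilt, tilt_degrees)) / max_tilt  # clamp to [-1, 1]
    upper = 0.5 * (1.0 - t)
    lower = 0.5 * (1.0 + t)
    return upper, lower

print(speaker_gains(0.0))   # (0.5, 0.5): upright, balanced output
print(speaker_gains(45.0))  # (0.0, 1.0): fully tilted back, lower speaker only
```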
- User refers to one or more people who view an electronic device in an environment where the electronic device is placed. A user is also simply called a person.
- “Usage environment” indicates a state in which an electronic device is placed in a space.
- the usage environment indicates, for example, a state of being directly placed on the floor or a state of being hung on a wall using a hook, and is also referred to as an installation mode.
- the usage environment includes a state that changes due to external factors such as illumination intensity in an external space.
- Although both "device" and "electronic device" appear in the following description, they have the same meaning.
- FIG. 1A shows an overall image of an electronic device and an electronic device system.
- the data center operating company 110 has a cloud server 111.
- the cloud server 111 is a virtualization server that cooperates with various devices via the Internet.
- the cloud server 111 mainly manages huge data (big data) that is difficult to handle with a normal database management tool or the like.
- the data center operating company 110 performs data management, management of the cloud server 111, and operation of the data center that performs them. Details of services performed by the data center operating company 110 will be described later.
- The data center operating company 110 is not limited to a company that only manages data or operates the cloud server 111. For example, when a device manufacturer that develops and manufactures one of the plurality of devices 101 also manages the data or the cloud server 111, that device manufacturer corresponds to the data center operating company 110.
- The data center operating company 110 is not limited to one company.
- When a device manufacturer and another management company jointly manage the data or operate the cloud server 111, or share those roles, both or either of them corresponds to the data center operating company 110 ((c) in FIG. 1).
- the service provider 120 has a server 121.
- The server 121 stores, among the video and audio contents reproduced by the device 101, a plurality of video contents each including a tag managed as a set with the video, and a plurality of sound contents each including a tag managed as a set with the sound.
- A tag managed as a set with a video is generally expressed in natural language that clearly describes the content of the video, but is not limited to natural language expression; it is assumed to be managed and operated including binary data representations, such as hashed natural-language tags for high-speed search of search terms. A tag managed as a set with a sound is a tag of the same kind as a tag managed as a set with a video, and is assumed to relate to the content of the sound.
- The size of the server 121 referred to here is not limited; it includes, for example, the memory of a personal PC. In some cases, the service provider 120 does not have the server 121.
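The tag management described above can be sketched as follows; the store layout and the choice of hash are assumptions for illustration, not the server's actual scheme:

```python
# Hypothetical sketch of tag management on the server 121: each video or sound
# content is stored as a set with its tags, and a tag can be matched either as
# natural language or via a binary (hashed) representation for faster search.
import hashlib

def hash_tag(tag):
    """Illustrative binary representation of a tag for high-speed search."""
    return hashlib.sha256(tag.encode("utf-8")).digest()[:8]

CONTENT_STORE = [
    {"kind": "video", "id": "v1", "tags": {"sunrise", "sea"}},
    {"kind": "sound", "id": "s1", "tags": {"sea", "waves"}},
]

def search(tag):
    """Find contents whose hashed tags include the hash of the query tag."""
    query = hash_tag(tag)
    return [c["id"] for c in CONTENT_STORE
            if query in {hash_tag(t) for t in c["tags"]}]

print(search("sea"))  # ['v1', 's1']
```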
- the device 101a or the device 101b of the group 100 transmits each log information to the cloud server 111 of the data center operating company 110.
- the cloud server 111 accumulates log information of the device 101a or the device 101b ((a) in FIG. 1).
- The log information is information about the plurality of devices 101: for example, information indicating the driving status or the operation date and time, and information about the peripheral state of a device or its surrounding environment acquired from a camera or sensor mounted on the device.
- The log information refers to all information that can be acquired from any device arranged in the electronic device main body or in the facilities of the surrounding space.
- The log information may be provided to the cloud server 111 directly from the plurality of devices 101 themselves via the Internet.
- Alternatively, the log information may be temporarily accumulated in the home gateway 102 from the plurality of devices 101 and provided to the cloud server 111 from the home gateway 102.
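A plausible shape for one log record sent to the cloud server 111 is sketched below; all field names are illustrative assumptions, not taken from the disclosure:

```python
# Hypothetical shape of the log information a device 101 sends (directly or via
# the home gateway 102) to the cloud server 111: operating status, operation
# date/time, and sensor-derived surroundings. Field names are assumptions.
import json
from datetime import datetime, timezone

def make_log_record(device_id, status, sensor_values):
    """Build one log record for a device's current state and environment."""
    return {
        "device_id": device_id,
        "status": status,                        # e.g. the driving status
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "environment": sensor_values,            # camera/sensor readings
    }

record = make_log_record("101a", "presenting", {"brightness_lux": 320})
payload = json.dumps(record)  # what the device or gateway would transmit
print(sorted(record.keys()))  # ['device_id', 'environment', 'status', 'timestamp']
```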
- the user 10 and the creator 20 may be different people or the same person.
- The creator 20 can use a creator community service that can be provided on the server 121 of the service provider 120. There, the creator 20 can register and manage, synchronously or asynchronously, the video and audio content that the creator 20 has produced, and can set whether it may be distributed to the device 101.
- Here, for simplification, the creator 20 sets whether or not distribution is permitted.
- It is also conceivable that an administrator who manages the creator 20 within the creator community service sets whether or not distribution is permitted.
- The electronic device according to the present embodiment includes an acquisition unit that acquires environmental information of the electronic device, including information about the environment around the electronic device or information about an installation mode of the electronic device, and a control processing unit that, by referring to a predetermined association between the environmental information of the electronic device and details of control related to the presentation of content by the electronic device, performs the control related to the presentation of content by the electronic device that is associated with the acquired environmental information.
- Since the functions of the electronic device can be shared among a plurality of devices connected so as to be able to communicate with each other, they can also be configured as a control system.
- The control system according to the present embodiment is a control system for an electronic device, and includes an acquisition unit that acquires environmental information of the electronic device, including information about the environment around the electronic device or information about an installation mode of the electronic device, and a control processing unit that, by referring to a predetermined association between the environmental information of the electronic device and details of control related to the presentation of content by the electronic device, performs the control related to the presentation of content by the electronic device that is associated with the acquired environmental information.
- an electronic device is also simply referred to as a device.
- FIG. 2A is a front view of the device 101.
- FIG. 2B is a side view of the device 101.
- FIG. 2C is a rear view of the device 101.
- FIG. 6 shows an installation mode in which the device 101 is leaned against a wall.
- FIG. 7 shows an installation mode in which the device 101 is hung on a wall.
- The device 101 includes a sensor 201, an output unit 202, a control processing unit 203, a content management unit 204, an upper speaker 205, a lower speaker 206, and an information transmission/reception unit 207.
- a vertical direction, a horizontal direction, and a front-rear direction are defined.
- a sensor 201F as an example of the sensor 201 is disposed in a part of the device 101.
- the position where the sensor 201F is disposed is, for example, the upper part of the device 101, but is not limited thereto.
- the sensor 201F is, for example, a camera that detects a two-dimensional image as an environmental value.
- the two-dimensional image detected by the sensor 201F is a two-dimensional visible image or a two-dimensional infrared image (also referred to as a thermal image).
- the sensor 201F detects a two-dimensional image in the front direction when viewed from the device 101, and outputs the two-dimensional image to the control processing unit 203 described later.
- the sensor 201F is a sensor that senses the environment around the device 101 and outputs an environmental value indicating the environment.
- The sensor 201F acquires, as information about the environment around the device 101, environmental information including information indicating the presence or absence of a person around the device 101, the orientation of a person around the device 101, the direction of a person as seen from the device 101, the brightness around the device 101, and the sound volume around the device 101.
- The environmental information may include information indicating at least two of the presence or absence of a person around the device 101, the orientation of a person around the device 101, the direction of a person as seen from the device 101, the brightness around the device 101, and the sound volume around the device 101. In this case, the control related to the presentation of content by the device 101 is performed in a composite manner, based on the combination of the information indicating the at least two.
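A minimal sketch of combining at least two kinds of environmental information (here: person presence and ambient brightness) into a single control decision, rather than reacting to each item in isolation; the thresholds and control names are illustrative assumptions:

```python
# Hypothetical sketch of composite control: the decision depends jointly on the
# presence of a person and the ambient brightness, not on either item alone.
def composite_control(person_present, brightness_lux):
    """Decide the presentation control from the combination of both inputs."""
    if not person_present:
        return "pause_presentation"
    if brightness_lux < 50:  # dark room: keep playing but dim the panel
        return "play_dimmed"
    return "play_normal"

print(composite_control(True, 300))   # play_normal
print(composite_control(True, 10))    # play_dimmed
print(composite_control(False, 300))  # pause_presentation
```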
- when the sensor 201F is an odor sensor, it can detect fruits (apples, persimmons, oranges, bananas, etc.), plants (roses, lilies, phalaenopsis, etc.), perfumes or body odors worn by people, or fragrances used in a space. In this case, the device 101 can grasp, from the smell, human comfort or discomfort with respect to the space, and controls playback so that video / audio content related to the detected smell, such as an image of the related fruit or plant, is reproduced.
- the spatial value can be enhanced, for example, by heightening the relaxation effect with sound, or by selecting audiovisual content whose character is close to that of the perfume in use.
- the sensor 201F may be a microphone that picks up sounds around the device 101.
- the device 101 may include an information acquisition unit (not shown) that acquires information from a website on the Internet or the like, instead of or together with the sensor 201. For example, the device 101 may acquire information about the weather from a website on the Internet via the information acquisition unit. The acquired information can be used for control related to the presentation of content instead of or together with the environmental value.
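- the use of weather information acquired from a website, in place of or together with an environmental value, can be sketched as follows; the JSON shape, channel names, and mapping are illustrative assumptions:

```python
import json

def select_content_for_weather(weather_json: str) -> str:
    """Map an already-fetched weather report to a content channel."""
    info = json.loads(weather_json)
    condition = info.get("condition", "unknown")
    mapping = {
        "sunny": "bright_upbeat_channel",
        "rain": "calm_indoor_channel",
        "snow": "warm_fireplace_channel",
    }
    return mapping.get(condition, "default_channel")
```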
- the output unit 202 is, for example, a liquid crystal panel disposed on the front of the device 101 as illustrated in FIG. 2A, and outputs or displays, for the user 10, an image or moving image that is information output from the control processing unit 203 described later.
- the control processing unit 203 acquires information from the sensor 201F, the output unit 202, or the content management unit 204, and performs processing such as determination based on the acquired information.
- the control processing unit 203 outputs information to the sensor 201F, the output unit 202, or the content management unit 204, and controls each configuration.
- FIG. 4 shows a part of the detailed functional configuration of the control processing unit 203.
- the control processing unit 203 includes a viewing environment estimation unit 203A and a control content determination unit 203B. Processing of each functional configuration will be described later. Note that a part of the function of the control processing unit 203, for example, a function of executing part or all of the processing of the viewing environment estimation unit 203A may be included in a camera or a sensor which is an example of the sensor 201F.
- the content management unit 204 temporarily records or accumulates information output from each configuration, information acquired from the outside, and the like.
- the content management unit 204 is, for example, a memory provided in the device 101.
- FIG. 5 shows a part of the detailed functional configuration of the content management unit 204.
- the content management unit 204 includes a viewing environment storage DB (database) 204A and a content management DB (database) 204B.
- the content management unit 204 is not an essential component.
- an external device other than the device 101 may include one or both of the viewing environment storage DB 204A and the content management DB 204B of the content management unit 204, and may substitute for it by exchanging information with the device 101 via wired or wireless communication.
- the upper speaker 205 is a speaker arranged at the upper part of the main body of the device 101.
- the lower speaker 206 is a speaker disposed at the lower part of the main body of the device 101.
- as examples of the sensor 201, sensors 201BH and 201BL that detect installation on the floor in a pressure-sensitive manner may be provided at the upper and lower ends of the back surface of the main body.
- the sensors 201BH and 201BL function as sensors that detect the posture of the device 101.
- the sensors 201BH and 201BL output an ON signal when pressed or touched by an object, and output an OFF signal otherwise.
- the ON signal or OFF signal corresponds to the environmental value.
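- the mapping from the ON / OFF signals of the sensors 201BH and 201BL to a posture of the device 101 can be sketched as follows; the posture labels and the mapping itself are illustrative assumptions, since the text does not fix them:

```python
def estimate_posture(upper_on: bool, lower_on: bool) -> str:
    """Derive a posture label from the two back-surface pressure sensors."""
    if upper_on and lower_on:
        return "flat_on_floor"    # whole back surface is pressed
    if lower_on:
        return "leaning"          # only the lower back edge touches
    return "free_standing"        # neither sensor is pressed
```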
- the orientation of the device 101 can be detected using a three-axis sensor instead of the sensors 201BH and 201BL.
- the three-axis sensor detects angles with respect to three predetermined axes orthogonal to each other. Using these detected angles, the angle with respect to an arbitrary axis or plane can also be calculated.
- for example, the rising angle θ of the device 101 from the horizontal plane can be calculated using the angles with respect to the three axes.
- the posture of the device 101 can be expressed using the rising angle θ.
- the rising angle θ is an example of an environmental value.
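- a minimal sketch of calculating the rising angle θ from a three-axis (accelerometer-type) sensor at rest, assuming the third axis is normal to the panel face:

```python
import math

def rising_angle_deg(ax: float, ay: float, az: float) -> float:
    """Angle of the panel from the horizontal plane, from gravity components.
    Assumes the device is at rest and az is normal to the panel face:
    flat on the floor -> gravity is all along az -> angle 0;
    stood upright -> az sees no gravity -> angle 90."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("no gravity reading")
    return math.degrees(math.acos(max(-1.0, min(1.0, abs(az) / g))))
```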
- the device 101 may further include a log output unit (not shown) that outputs log information related to content reproduction.
- the log information output by the log output unit can be used as log information fed back to the creator 20.
- FIG. 8 is a perspective view showing an appearance of the device 101 according to the present embodiment.
- FIG. 9 is an exploded perspective view of the device 101 according to the present embodiment. The configuration shown in FIGS. 8 and 9 will be described in detail below.
- the device 101 includes an exterior frame P101, a structural frame P102, a back cover P103, a decorative member P104, a speaker net P111, a speaker base plate P112, a speaker P113, a cover glass P121, and a display panel P122.
- the structural frame P102 is a structural frame formed of a metal material such as aluminum.
- the structural frame P102 is a framework for maintaining the mechanical structure of the device 101.
- the back cover P103 is a cover as an enclosure that covers the components of the device 101 from the back side.
- the speaker net P111 is a cover member that covers the speaker P113 from the front side.
- the speaker net P111 is made of, for example, metal or resin perforated with fine holes, and makes the speaker P113 difficult to see from the outside.
- the internal structure behind the speaker net P111 can be made even harder to see by applying a treatment to its back surface.
- the speaker net P111 is preferably made of a sound-transmitting material, and may be customized, for example, by using fabric.
- the speaker base plate P112 is a plate body that holds the speaker P113 and the speaker net P111.
- the speaker P113 is mounted as the upper speaker 205 at two locations on the upper left and right when the device 101 is viewed from the front side, and as the lower speaker 206 at two locations on the lower left and right.
- the speaker P113 outputs sound from a low frequency range of about 10 Hz to a high frequency range of about 100 kHz.
- the cover glass P121 is a cover member that covers the display panel P122 from the front side.
- the cover glass P121 is bonded directly to the display panel P122 without an air layer, achieving both a uniform surface appearance and display performance.
- by devising the shape of the cover glass P121, a black-matrixless display in which the image extends to the edge of the cover glass P121 can also be supported.
- the display panel P122 is a display panel that displays an image.
- the display panel P122 is, for example, an LCD or an OLED.
- the display panel P122 corresponds to the output unit 202.
- the device 101 does not need to be equipped with all of these structures. That is, an external device may include some of these configurations and substitute for them by exchanging information with the device 101.
- the device 101 may have a configuration other than the above-described configurations.
- FIG. 10 is a flowchart showing processing of the device 101 according to the present embodiment.
- in step S402, the control processing unit 203 determines whether an operation has been received from the user 10. If an operation from the user 10 has been accepted (Yes in step S402), the process proceeds to step S421. If not (No in step S402), the process proceeds to step S403.
- in step S403, the control processing unit 203 determines whether or not to transition to the auto mode.
- the auto mode is a mode in which the viewing environment is estimated based on the detection result of the sensor 201 and the control content is determined (a mode in which the operations in steps S411 to S413 described later are performed). Whether or not to transition to the auto mode can be determined based on, for example, whether or not the operation received in step S402 is a predetermined operation for transitioning to the auto mode, but is not limited thereto. If it is determined to transition to the auto mode (Yes in step S403), the process proceeds to step S411. If not (No in step S403), the process proceeds to step S404.
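- the branching of steps S402 / S403 described above can be sketched as follows; the returned step labels are illustrative, and the branch conditions are simplified:

```python
def next_step(user_operation: bool, auto_mode_requested: bool) -> str:
    """Decide the next step of the FIG. 10 flow from the two branch conditions."""
    if user_operation:
        return "S421_receive_channel"   # an operation from the user takes priority
    if auto_mode_requested:
        return "S411_run_sensor"        # auto mode: sensor-based estimation
    return "S404_other"                 # neither: continue with other processing
```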
- in step S405, the control processing unit 203 determines whether an instruction to turn off the power of the device 101 has been received. If the instruction has been received (Yes in step S405), the series of processes shown in FIG. 10 ends. If it has not been received (No in step S405), step S402 is executed again.
- in step S411, the viewing environment estimation unit 203A of the control processing unit 203 operates the sensor 201.
- in step S413, the control content determination unit 203B of the control processing unit 203 determines the control content of the device 101 according to the presence / absence of the user 10, the number of users 10, or the state of the user 10 estimated by the viewing environment estimation unit 203A in step S412. Detailed processing in step S413 will be described later.
- in step S421, the control processing unit 203 receives a channel designation.
- the channel means a set of contents related to a specific keyword or genre, or a set of contents provided by a specific provider.
- the control processing unit 203 receives a keyword or a genre designated by the user 10 through an operation with the remote controller 103.
- in step S422, the control processing unit 203 changes the content playback setting based on the channel designation received in step S421.
- when step S406, S413, or S422 is finished, the series of processes shown in FIG. 10 ends.
- step S412 in FIG. 10 will be described with reference to four examples.
- FIG. 11 is a flowchart showing a first example of the viewing environment determination process according to the present embodiment.
- the process shown in FIG. 11 is an example of the process included in step S412 of FIG. 10.
- in step S501, the viewing environment estimation unit 203A of the control processing unit 203 analyzes information such as a two-dimensional image or sensing information output from the sensor 201 to determine the presence or absence of the user 10, the number of users 10, and the like.
- the time at which an image is acquired is also referred to as an acquisition time.
- a change among a plurality of images having different acquisition times, that is, a difference between the images, is, for example, a pixel region (difference region) in which the luminance value changes by a predetermined threshold or more between images captured within a predetermined time.
- the difference area can be estimated to be an area with motion in the image.
- the number of users 10 can be specified by measuring the number of areas in which differences are detected in the image.
- the region of the user 10 in the image may be specified from color information, luminance characteristics, shape characteristics, or the like in the image.
- the area of the user 10 in the image may be specified based on information about the temperature in the image.
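- the inter-frame difference and region counting described above can be sketched as follows; frames are plain 2D lists of luminance values, and the threshold value and 4-connectivity are illustrative assumptions:

```python
def difference_mask(prev, curr, threshold=30):
    """Mark pixels whose luminance changed by at least the threshold."""
    return [[abs(c - p) >= threshold for p, c in zip(pr, cr)]
            for pr, cr in zip(prev, curr)]

def count_regions(mask):
    """Count 4-connected regions of changed pixels with a simple flood fill;
    each region approximates one moving area (e.g., one user)."""
    seen = [[False] * len(row) for row in mask]
    regions = 0
    for y in range(len(mask)):
        for x in range(len(mask[0])):
            if mask[y][x] and not seen[y][x]:
                regions += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if (0 <= cy < len(mask) and 0 <= cx < len(mask[0])
                            and mask[cy][cx] and not seen[cy][cx]):
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return regions
```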
- in step S502, the viewing environment estimation unit 203A determines whether or not the user 10 exists in the space from which the image was acquired based on the determination result in step S501, and branches the subsequent processing. If the user 10 exists (Yes in step S502), the process proceeds to step S503. On the other hand, if the user 10 does not exist (No in step S502), the process of step S412 is exited, and the viewing environment estimation unit 203A outputs information indicating that the user 10 does not exist to the control content determination unit 203B.
- in step S503, the viewing environment estimation unit 203A specifies, for the area specified as the user 10 in the image, whether or not the line of sight of the user 10 is directed toward the device 101.
- the method for specifying whether or not the line of sight is directed toward the device 101 is not particularly limited.
- for example, the viewing environment estimation unit 203A may analyze facial elements in the area identified as the user 10 in step S501 and determine whether the line of sight of the user 10 is directed toward the device 101. In this case, it is assumed that a typical line of sight of the user 10 in the image can be determined in advance, for example when the device 101 is shipped. In addition, it is desirable to record the characteristics of the user 10 living in the specific environment in the content management unit 204 or the like to improve the recognition rate. Further, the users 10 who use the device 101 and their characteristics may be acquired and held in advance, and whether or not the user 10 in the identified area of the image is a specific person may be determined by comparison. This advance acquisition is also referred to as advance registration or calibration, and further improves the detection accuracy.
- the method for estimating the state of the user 10 is not particularly limited. For example, by analyzing changes in a plurality of images having different acquisition times, it may be estimated whether the user 10 is gazing at the device 101 or accidentally facing the device 101. In this case, if the temporal change is large, the state is estimated as "accidental", and if the temporal change is small, it is estimated as "gazing". In addition, when the sensor 201 is a sensor that does not acquire a two-dimensional image, the state of the user 10, such as a gaze state or another state (including sleep), may be estimated based on temporal changes in the sensor information acquired by the sensor.
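- classifying "gazing" versus "accidental" from the magnitude of the temporal change, as described above, can be sketched as follows; the threshold value is an illustrative assumption:

```python
def classify_state(temporal_changes, gaze_threshold=5.0):
    """Small average change over time -> gazing; large change -> accidental."""
    avg = sum(temporal_changes) / len(temporal_changes)
    return "gazing" if avg < gaze_threshold else "accidental"
```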
- after executing the process of step S504, or after determining in step S502 that the user 10 does not exist (No in step S502), the process of step S412 is completed and the process proceeds to step S413.
- when it is determined in step S502 that the user 10 is present (Yes in step S502) and the process of step S504 is executed to complete the process of step S412, the viewing environment estimation unit 203A outputs to the control content determination unit 203B at least one of (i) information on the type of the user 10 or the presence or absence of a person, (ii) information on the number of existing users 10, and (iii) information on the state of the existing users 10 (gaze or another state such as sleep).
- the type of the user means, for example, the gender or age of the user, or, in the case of use in a private house, whether the user is the owner of the house, the owner's spouse, or a guest.
- FIG. 12A is a flowchart showing a second example of the viewing environment determination process according to the present embodiment.
- the process shown in FIG. 12A is an example of the process included in step S412 of FIG. 10.
- in step S511, the viewing environment estimation unit 203A of the control processing unit 203 analyzes information such as a two-dimensional image or sensing information output from the sensor 201 to determine the presence or absence of the user 10, the number of users 10, and the like.
- FIG. 12B is a flowchart showing a third example of the viewing environment determination process according to the present embodiment.
- the process shown in FIG. 12B is an example of the process included in step S412 of FIG. 10.
- in step S521, the viewing environment estimation unit 203A of the control processing unit 203 analyzes information such as a two-dimensional image or sensing information output from the sensor 201 to determine the presence or absence of the user 10, the number of users 10, and the like.
- in step S522, the viewing environment estimation unit 203A determines whether the user 10 exists in the space from which the image was acquired based on the determination result in step S521, and branches the subsequent processing. If the user 10 exists (Yes in step S522), the process proceeds to step S523. On the other hand, if the user 10 does not exist (No in step S522), the process of step S412 is exited, and the viewing environment estimation unit 203A outputs information indicating that the user 10 does not exist to the control content determination unit 203B.
- in step S523, the viewing environment estimation unit 203A identifies, for the area identified as the user 10 in the image, in which direction the user 10 is present as viewed from the device 101.
- the method for specifying in which direction the user 10 exists is not particularly limited.
- for example, the viewing environment estimation unit 203A may determine in which direction the user 10 is present as viewed from the device 101 by analyzing where in the entire image the area identified as the user 10 in step S521 is located.
- in step S524, the viewing environment estimation unit 203A uses the information specified in step S523, indicating in which direction the user 10 is present as viewed from the device 101, as the estimation result of the viewing environment of the device 101.
- after performing the process of step S524, or after determining in step S522 that the user 10 does not exist (No in step S522), the process of step S412 is completed and the process proceeds to step S413.
- FIG. 12C is a flowchart showing a fourth example of the viewing environment determination process according to the present embodiment.
- the process shown in FIG. 12C is an example of the process included in step S412 of FIG. 10.
- in step S532, the viewing environment estimation unit 203A acquires the sound volume (also referred to as a sound collection volume) collected from around the device 101 by the microphone serving as the sensor 201, and determines whether or not the acquired sound collection volume is equal to or less than a specified value.
- the specified value of the sound collection volume is, for example, about 30 to 40 dB, corresponding to a relatively quiet room. If the sound collection volume is equal to or less than the specified value (Yes in step S532), the process proceeds to step S533; if not (No in step S532), the series of processes shown in FIG. 12C ends.
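- the step S532 check can be sketched as follows; the dB reference value and the quiet-room threshold are illustrative assumptions:

```python
import math

def sound_level_db(samples, reference=1e-5):
    """Sound level of sampled microphone data in dB relative to a reference."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-12) / reference)

def is_quiet(samples, limit_db=40.0):
    """True when the collected volume is at or below the quiet-room threshold."""
    return sound_level_db(samples) <= limit_db
```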
- FIG. 13 is a flowchart showing an example of the control content determination process according to the present embodiment. Details of the control content determination unit 203B included in the control processing unit 203 will be described with reference to FIG. 13.
- the control content determination unit 203B refers to the table acquired in step S601 and determines the control content related to the content presentation based on the viewing environment estimated in step S412.
- in the control related to the presentation of content, for example, the content to be presented by the device 101, which is the control content associated with the environmental information acquired by the sensor 201, may be selected by referring to a predetermined association, and control for presenting the selected content may be performed.
- FIG. 14 is an explanatory diagram showing the state of the space and an example of output information in the present embodiment. More specifically, the upper part of FIG. 14 shows the state in the space estimated by the viewing environment estimation unit 203A through the processing of steps S501 to S504, and the lower part shows an example of the information output to the control content determination unit 203B.
- FIG. 14A shows a state in which one awake user 10 and one dog are simultaneously present in the space.
- FIG. 14B shows a state in which one awake user 10 and two dogs are present in the space.
- the viewing environment estimation unit 203A outputs at least one of the information (i) to (iv) to the control content determination unit 203B based on the estimation from these states.
- the viewing environment estimation unit 203A may estimate other information instead of, or in addition to, the "awake" or "gaze" state.
- the viewing environment estimation unit 203A estimates the orientation of the user 10 with respect to the device 101.
- the orientation of the user 10 is, for example, front-facing or sideways.
- a known method can be employed for estimating the orientation of the user 10, and there is no particular limitation.
- the orientation of the user 10 may be estimated by employing a face recognition technique or the like and estimating the orientation of the face.
- by estimating the orientation, it is possible to estimate whether or not the user 10 is facing the device 101, that is, whether or not the user 10 is interested in the device 101.
- control more suitable for the user 10, or more suitable for the state of the user 10, can then be determined. Therefore, when it is estimated that the user 10 exists in the space, it is preferable to additionally estimate the orientation of the user 10.
- the action of the user 10 may be estimated.
- the action of the user 10 is, for example, reading, eating, or walking.
- a known method can be adopted for estimating the behavior, and there is no particular limitation. By calculating the difference between a plurality of images having different acquisition times, it can be determined whether the user 10 is moving or stationary, so that an action such as reading or walking can be estimated. Further, when the user 10 is stationary at a predetermined place, an action such as eating or reading may be estimated depending on the place. Estimating the behavior also makes it possible to estimate whether the user 10 is interested in the device 101.
- if the mental state of the user 10 can also be estimated, the control content determination unit 203B described later can determine control more suitable for the user 10 or for the state of the user 10.
- the mental state of the user 10 is, for example, a state in which the user 10 wants to calm down or an active state. Therefore, when there is an awake user 10, it is preferable to additionally estimate the behavior.
- FIG. 15 is an explanatory diagram of a table showing the estimation results of the person and space states in the present embodiment. More specifically, FIG. 15 shows an example of a table 801 in which the viewing environment estimation unit 203A manages the information output to the control content determination unit 203B. These pieces of information may be stored as log information in the viewing environment storage DB 204A of the content management unit 204 as the table 801, or may be temporarily recorded and deleted as appropriate. When they are stored as log information in the viewing environment storage DB 204A, a pattern related to the state of the user 10 in the space can be learned to some extent, which can contribute to reducing the amount of information to be processed or to improving the estimation accuracy of the viewing environment estimation unit 203A.
- the control content determination unit 203B may acquire information from the viewing environment storage DB 204A and determine the control content.
- as described above, the viewing environment estimation unit 203A estimates whether or not the user 10 exists in the space and, if the user 10 exists, estimates the number and state of the users 10 and outputs the results to the control content determination unit 203B. Note that the viewing environment estimation unit 203A does not necessarily have to execute the processes in the order of steps S501 to S504. Some steps may be omitted, and several steps may be carried out together. That is, the viewing environment estimation unit 203A may use different means, in terms of processing order or content, as long as it can output at least one of the pieces of information (i) to (iv) to the control content determination unit 203B as a result.
- the table 901A is an example of a predetermined association.
- the table 901A includes correspondences between information indicating a change in the environmental information of the device 101 and the start, stop, or end of content presentation. In the control related to content presentation, the table 901A is referenced to start, stop, or end the presentation of the content to be presented by the device 101 that is associated with the acquired change in the environmental information.
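- an association like the table 901A can be sketched as a simple lookup; the concrete entries below are illustrative, not the patent's table:

```python
# Hypothetical entries mapping an environmental change to a presentation control.
TABLE_901A = {
    "person_appeared": "start",
    "person_left": "stop",
    "power_off_requested": "end",
}

def control_for_change(change: str) -> str:
    """Look up the presentation control associated with an environment change."""
    return TABLE_901A.get(change, "no_op")
```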
- the table 901A is an example of a table in which a camera is mounted as the sensor 201 and operations to be executed by the output unit 202 of the device 101 are defined in accordance with the viewing behavior as the state of the user 10.
- the table that the control content determination unit 203B refers to in determining the control content is not limited to the table 901A.
- FIG. 17 is an explanatory diagram of a table 901B for determining control content using installation environment determination by an illuminometer in the present embodiment.
- the table 901B is an example of a predetermined association.
- the table 901B includes correspondences between information indicating a change in the environmental information of the device 101 and a change in the volume or brightness of the content. In the control related to the presentation of content, the volume or brightness of the content to be presented by the device 101 that is associated with the acquired change in the environmental information is changed by referring to the predetermined association.
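- an association like the table 901B, driven by an illuminometer reading, can be sketched as follows; the illuminance breakpoints and output levels are illustrative assumptions:

```python
def adjust_for_illuminance(lux: float) -> dict:
    """Map ambient illuminance to volume and panel-brightness settings."""
    if lux < 10:        # dark room at night
        return {"volume": "low", "brightness": "low"}
    if lux < 200:       # dim indoor lighting
        return {"volume": "medium", "brightness": "medium"}
    return {"volume": "high", "brightness": "high"}
```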
- the attention rank is information indicating the degree of attention obtained from rating actions by the users 10 of the content, such as marking it as a favorite.
- the attention rank is basically managed separately from the attention point, which indicates whether or not each user 10 has gazed at and watched the content, but the attention rank may be processed or corrected in conjunction with the attention point.
- for 16:9 video content, the restriction is not limited to whether or not cutting out into a square is permitted; editing is restricted to a range that does not contradict the intention of the creator 20.
- as an expression method, for example, an editable area may be set in time series within the screen area of the video content, and edit control may be performed so that the video can be cut out while the editable area is retained.
- a music ID (Music ID), text and keywords indicating the outline of the sound source, and path information to the corresponding file are managed for each category corresponding to the sound source type. Categories include, for example, dignity, tradition, and curiosity.
- the control content determination unit 203B refers to the table 1101 and selects and outputs, from the sound source candidates best suited to the desired quality, a sound source that can be matched with the keyword.
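- the selection from a table like 1101 can be sketched as follows; the entries, music IDs, keywords, and paths are illustrative assumptions:

```python
# Hypothetical entries: each sound source carries a category, keywords, and a path.
TABLE_1101 = [
    {"music_id": "M001", "category": "dignity",
     "keywords": {"formal", "calm"}, "path": "/sounds/m001.wav"},
    {"music_id": "M002", "category": "curiosity",
     "keywords": {"playful", "bright"}, "path": "/sounds/m002.wav"},
]

def select_sound_source(keyword: str):
    """Return the ID of the first sound source whose keywords match."""
    for entry in TABLE_1101:
        if keyword in entry["keywords"]:
            return entry["music_id"]
    return None
```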
- FIG. 22A shows a table 1201 created by setting in advance, every 15 minutes, a target action to be promoted for the user 10 in a state where the user 10 is known to exist.
- the control content determination unit 203B determines the content, completes the table 1201, and controls the output unit 202 to reproduce the content according to the table 1201.
- the user 10 can be guided by the reproduction of the content so that the user 10 performs the target action.
- for example, the content of C15, video and music with an up-tempo feel, is played at the time of re-entry. Thereby, for example, a child's behavior can be induced and controlled by audiovisual content that is automatically reproduced while the parent is out.
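- a schedule like the table 1201 can be sketched as a time-keyed lookup; the times, content IDs, and descriptions are illustrative assumptions:

```python
# Hypothetical schedule: target actions set every 15 minutes, each mapped to
# the content to reproduce at that time.
TABLE_1201 = {
    "10:00": ("C12", "calm reading music"),
    "10:15": ("C13", "stretch-break video"),
    "10:30": ("C15", "up-tempo re-entry music"),
}

def content_for_time(hhmm: str):
    """Return the content ID scheduled for the given time, or None."""
    return TABLE_1201.get(hhmm, (None, None))[0]
```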
- in the above, the control content determination unit 203B determines the control content corresponding to the current state estimated by the viewing environment estimation unit 203A, but the control is not limited thereto.
- the control content determination unit 203B may control the output unit 202 by determining optimal control content according to the timing at which the state estimated by the viewing environment estimation unit 203A changes.
- the environmental information may include, as information regarding the installation mode of the device 101, juxtaposition information indicating whether or not the device 101 is installed side by side with other devices 101.
- the predetermined association may include an association between the juxtaposition information of the device 101 and control related to the presentation of content by the device 101 installed in the installation mode indicated by the juxtaposition information. In the control regarding the presentation of content, control associated with the juxtaposition information of the device 101 may then be performed.
- FIG. 22B is an explanatory diagram of a state in which the user 10 exists in front of the device 101 in the present embodiment.
- FIG. 22C is an explanatory diagram showing the relationship between the direction in which the user 10 is seen from the device 101 and the output volume of the speaker according to the present embodiment.
- when only one user 10 exists in front of the device 101 and the line of sight of that user 10 is directed toward the device 101, the device 101 controls the output volume of each speaker as shown in FIG. 22C.
- the device 101 performs control to lower the volume at regular intervals.
- the device 101 performs control to lower the volume each time a certain period elapses.
- the device 101 controls the presentation of content according to the surrounding environment.
- the output volume of the speaker may be controlled according to the collected sound volume.
- FIG. 22G is an explanatory diagram showing the relationship between the sound collection volume and the speaker output volume in the present embodiment.
- the output volume of each speaker is set to 5, and the luminance of the display panel P122 as the output unit 202 is gradually changed to medium.
- the output volume of each speaker is gradually changed to 0 after a predetermined time has elapsed, and the power of the display panel P122 which is the output unit 202 is turned off.
- in this example, the device 101 controls the presentation of content in a composite manner according to the number of surrounding people and the sound collection volume of the microphone. In this way, the device 101 can more appropriately control the presentation of content according to the surrounding environment.
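- the composite control of the speaker output volume by the number of surrounding people and the sound collection volume can be sketched as follows; the volume steps and thresholds are illustrative assumptions:

```python
def speaker_volume(num_people: int, collected_db: float) -> int:
    """Choose an output volume from both the person count and the mic level."""
    if num_people == 0:
        return 0                 # nobody around: mute
    base = 5 if num_people == 1 else 7
    if collected_db > 60:        # noisy room: raise the volume a step
        return base + 2
    if collected_db < 30:        # very quiet room: lower the volume a step
        return base - 2
    return base
```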
- the device 101 selects content and presents the selected content.
- that is, the presentation of content is controlled in accordance with the environment around the electronic device by selecting content, without modifying the content itself. Therefore, music data different from that selected in the past may be presented in combination with video data that was selected in the past.
- FIG. 23 is a block diagram illustrating configurations of the device 101 and the cloud server 111 according to the present modification.
- FIG. 23 is a configuration diagram of each device in a configuration in which a part of the functions of the control processing unit 203 according to the first embodiment is processed by an external device such as the cloud server 111, the server 121, or the home gateway 102.
- the device 101 performs processing while exchanging information with the cloud server 111. It should be noted that the same description holds when the server 121 or the home gateway 102 is used instead of the cloud server 111.
- the information transmission / reception unit 1302 is a configuration for transmitting / receiving information to / from the cloud server 111 via a network, and is an essential configuration in the present modification.
- the information transmission / reception unit 1302 may adopt a general communication module or the like, and its specification is not particularly limited. Further, there is no particular limitation on a communication method for transmitting / receiving information.
- the cloud server 111 includes a control processing unit 1303, an information transmission / reception unit 1304, and a content management unit 1305.
- the control processing unit 1303 includes a viewing environment estimation unit 1303A.
- the operation of the viewing environment estimation unit 1303A is the same as the operation described in Embodiment 1.
- The operation of each device and each component in the present modification is the same as the operation and process flow described with reference to FIGS. 4 to 12C in the first embodiment. That is, in the present modification, information is appropriately transmitted and received by the information transmission / reception unit 1302 of the device 101 and the information transmission / reception unit 1304 of the cloud server 111, and processing is basically performed in the order shown in FIG. 4. Therefore, the same functions and effects as in the first embodiment are also exhibited in this modification.
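The division of labor in this modification (the device senses and presents, the cloud estimates and decides) can be sketched as one processing cycle; all function names here are illustrative stand-ins, not names from the embodiment.

```python
def device_cycle(sensor_reading, cloud_estimate, cloud_decide):
    """One processing cycle: the device sends raw sensor data to the
    cloud, which estimates the viewing environment and returns a
    control command that the device then applies locally."""
    environment = cloud_estimate(sensor_reading)   # runs on the server
    command = cloud_decide(environment)            # runs on the server
    return command                                 # applied on the device

# Toy stand-ins for the server-side estimation and decision steps:
estimate = lambda reading: "occupied" if reading["people"] > 0 else "empty"
decide = lambda env: {"power": "on"} if env == "occupied" else {"power": "off"}

print(device_cycle({"people": 2}, estimate, decide))  # → {'power': 'on'}
```

Because the heavy steps run on the server side, the device itself needs only a communication module and an output unit, which matches the reduced-hardware argument made above.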
- The difference from the first embodiment is that the cloud server 111 includes the viewing environment estimation unit 203A of the first embodiment as the viewing environment estimation unit 1303A, and that basic information storage is performed by the content management unit 1305 of the cloud server 111. As a result, the processing performed on the device 101 side can be reduced, so it is not necessary to mount a high-performance processor on the device 101 side. Further, since each table and the content need not be recorded and managed on the device 101 side, the recording capacity on the device 101 side can be reduced. That is, even when the processing capability of the device 101 is low, optimal device control that estimates the state of the user 10 and the like can be performed.
- Since the cloud server 111 can acquire log information from the device 101, further information analysis, learning using a large amount of information, and utilization of the information can be expected. Furthermore, by analyzing this information in the cloud server 111 together with information acquired from devices held by other users 10, a new valuable service can be provided.
- A sound collecting microphone that directly or indirectly acquires sounds such as shouting or crying may be used.
- the device 101c may have other configurations not shown.
- the device 101d includes an output unit 202d and an information transmission / reception unit 1302d.
- the device 101d is assumed to be a display that does not include a sensor, a device that only performs LED (Light Emitting Diode) notification, or a speaker that outputs only sound.
- the display, LED, speaker, or the like corresponds to the output unit 202d.
- the information transmitting / receiving unit 1302d is the same as the information transmitting / receiving unit 1302 described in FIG. Note that the device 101d may have other configurations not shown.
- the cloud server 111 includes a control processing unit 1401, an information transmission / reception unit 1304, and a content management unit 1404.
- the information transmitting / receiving unit 1304 has the same configuration as that described in FIG.
- the content management unit 1404 includes a viewing environment accumulation DB 204A, a content management DB 208, and a preference tendency DB 1405. Since the viewing environment storage DB 204A and the content management DB 208 are the same as those described in the first embodiment, description thereof will be omitted. Information managed by the preference tendency DB 1405 will be described later.
- step S1501 the sensor 201c of the device 101c detects a state in the space.
- Step S1501 is basically the same as step S401 shown in FIG. However, when the sensor 201c is a biosensor of a terminal worn by the user 10 as described above, the information to be acquired is not a two-dimensional image in space but biometric information detected by the biosensor.
- step S1502 the information transmitting / receiving unit 1302c of the device 101c transmits the information acquired in step S1501 to the cloud server 111.
- the information transmission / reception unit 1304 of the cloud server 111 receives information from the device 101c.
- The frequency at which the sensor 201 detects in step S1501 and the timing at which the information transmitting / receiving unit 1302c transmits information in step S1502 are not particularly limited; information may be detected and transmitted at predetermined time intervals.
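Detection and transmission at predetermined time intervals, as mentioned above, can be sketched as a simple loop; the interval and cycle count are free parameters that the embodiment leaves unspecified.

```python
import time

def sense_and_send(sense, send, interval_s=1.0, cycles=3):
    """Detect a reading and transmit it at fixed intervals.
    `sense` and `send` are hypothetical callables standing in for the
    sensor 201c and the information transmission unit 1302c."""
    sent = []
    for _ in range(cycles):
        reading = sense()   # step S1501: detect the state in the space
        send(reading)       # step S1502: transmit to the server
        sent.append(reading)
        time.sleep(interval_s)
    return sent
```

For example, `sense_and_send(read_camera, upload, interval_s=5.0)` would sample and upload every five seconds; an event-driven design (transmit only on change) is equally consistent with the text.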
- step S1503 the viewing environment estimation unit 1402 of the control processing unit 1401 of the cloud server 111 estimates the state in the space and the state of the user 10 based on the received information.
- When the sensor 201 detects a two-dimensional image with the imaging device, the presence or absence of the user 10 in the space, the number of users 10, or the state, orientation, action, or the like of the user 10 is estimated.
- When the sensor 201 is a biometric sensor or the like of a terminal worn by the user 10, the state or emotion of the user 10 is estimated from the acquired biometric information. In this case, if the cloud server 111 manages which user 10 is wearing the device 101c, this can be grasped in advance, and, as described in the first embodiment, it is not necessary to estimate whether the user 10 is present.
- As the state or emotion estimation method, a known method may be used, and detailed description thereof is omitted.
- the state estimated at this time may be recorded or stored in the viewing environment storage DB 204A of the content management unit 1404.
- When the sensor 201 is a sound collecting microphone, the degree of stress or tension may be estimated from sounds such as shouting or crying.
- The audiovisual content to be played back may be switched by detecting the frequency of conversation, the excitement or lull of conversation, the characteristics of a musical instrument or voice, the progress of music, or actions of the user 10 such as applause or finger snapping.
- step S1505 the information transmitting / receiving unit 1304 of the cloud server 111 transmits the control content determined in step S1504, that is, the control command to each device.
- As for the timing at which the information transmission / reception unit 1304 transmits the control content in step S1505, it is desirable that the control content be transmitted without delay each time it is determined in step S1504, but this is not limiting.
- the control content may be periodically transmitted to each device at predetermined time intervals.
- step S1506 the devices 101b, 101c, and 101d perform an output operation according to the control content transmitted from the cloud server 111, that is, the control command.
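Step S1506, in which each device carries out the control command sent by the cloud server 111, can be sketched as a dispatch over registered device handlers; the handlers here are hypothetical stand-ins for the devices' output operations.

```python
def dispatch(control_command, devices):
    """Send a determined control command to each registered device;
    each device applies it with its own output unit. The handler
    functions are illustrative stand-ins, not part of the embodiment."""
    return {name: handler(control_command)
            for name, handler in devices.items()}

devices = {
    "101b": lambda cmd: f"display applies {cmd}",
    "101c": lambda cmd: f"wearable applies {cmd}",
    "101d": lambda cmd: f"speaker applies {cmd}",
}
print(dispatch("lower_volume", devices))
```

In practice each device would receive its own command over the network rather than a shared one, but the fan-out structure is the same.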
- As described above, the cloud server 111 estimates the state of the user 10 from the spatial information detected by the device 101c or from the biological information of the user 10, determines the control content for devices other than the device 101c, that is, the devices 101d and 101e, and controls each device. According to such a configuration, as in the first embodiment, it is possible to provide an optimal audiovisual content reproduction environment for the user 10 without requiring a cumbersome operation by the user 10.
- the cloud server 111 performs processing related to state estimation or control content determination, information accumulation, and table management. Therefore, it is not necessary to mount a processor or memory with high processing capability on the device side.
- control contents are determined in cooperation with a plurality of devices and cloud servers. Therefore, it becomes possible to optimally control a plurality of devices only from information on the sensors mounted on one device, and each device does not need to be mounted with a sensor.
- a preference tendency DB 1405 may be constructed in the content management unit 1404 to learn the preference regarding the control content according to the state of the user 10. That is, by analyzing the log information related to the control command to the device of the user 10, the table for determining the control content is updated to an optimal one as appropriate.
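Updating the control-content table from log information, as described above, might look like the following sketch, which replaces each table entry with the command issued most often for that user state; the state and command names are illustrative.

```python
from collections import Counter

def update_control_table(log, table):
    """For each user state seen in the command log, set the table entry
    to the command issued most often in that state -- a minimal sketch
    of learning a preference tendency from log information."""
    by_state = {}
    for state, command in log:
        by_state.setdefault(state, Counter())[command] += 1
    for state, counts in by_state.items():
        table[state] = counts.most_common(1)[0][0]
    return table

log = [("sleeping", "mute"), ("sleeping", "mute"), ("sleeping", "dim"),
       ("active", "play_music")]
print(update_control_table(log, {}))
# → {'sleeping': 'mute', 'active': 'play_music'}
```

A real preference tendency DB would also weight recent commands more heavily and keep per-user tables, but the table-update principle is the same.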
- a sensor that senses the usage status of the device may be used as the sensor 201c. For example, the usage status of each device in FIG. 24 is sensed and accumulated in the preference tendency DB 1405.
- Although the server that controls the electronic device has been described here as the cloud server 111 connected via a network, the configuration of the device control server that controls the electronic device is not limited to this.
- A local server in a local environment, connected to the same network as the electronic device in a house, may serve as a gateway, or the device control server function for controlling the electronic device may be implemented in the electronic device itself.
- The electronic device to be controlled has been described by taking a display equipped with a speaker as an example, but the electronic device to be controlled, its determination criteria, and the control contents are not limited thereto. Devices with shapes different from those described may be controlled, such as projectors that project images on a wall, terminals with screens dedicated to the user 10, display signage embedded in the floor, or mobile displays that drive or fly autonomously. Further, a humidifier or dehumidifier capable of controlling the scent or aroma of the air, or an electronically controlled curtain or blind capable of controlling the light from a window or entrance, may be set as a control target.
- For simplicity, the determination criterion has been described in terms of whether or not video is output for a sleep state or an activity state.
- However, the determination criterion is not limited to this; the behavior history of the user 10, the magnitude of the state, a desire estimated from a cry, or a stress response estimated from the vibration state of the user 10 may also be used as determination criteria.
- the action history state of the user 10 includes, for example, information indicating that the user 10 is not moving much today compared to usual or that the movement is intense today compared to usual.
- As the control content, the video on the screen of one electronic device, the sound output, the scent, and the dimming or on/off switching of lighting may be controlled so as to lead to stress relief or release for the user 10.
- the input data used for the criterion for determining the control content is not limited to the current state estimation.
- For example, the state change from a past state to the current state may be compared with past data, and content with a track record of shifting the state from a poor state to a good state in situations similar to the past data, or content based on the user type or user-specific preferences, may be selected and adopted according to the current state.
- the selection may be controlled according to the time zone, weather or season, the number of users 10, and the like.
- For example, a video reflecting a change in the outdoor scenery may be adopted, or, in preparation for a sudden sound such as thunder, a video that changes from a quiet sound to a loud sound in advance may be adopted.
- the past data may be actual data of growth records of the user 10's own family or child, or the actual value of the same generation or an average value thereof may be used.
- When the transition to the assumed state does not occur within a certain period of time, or when a transition to a poor state occurs, the output contents may be prioritized according to the past data and executed sequentially, starting from the one with the highest priority.
- The input data is not limited to sensors of, for example, the electronic device or a linked external dedicated terminal.
- For example, an external dedicated terminal may be placed in a restaurant private room, a hotel room, a condominium party room, a wedding reception hall, or at the destination of delivery-type catering ordered by an individual or corporation.
- The input data may also include static or dynamic information about the user 10 or the space, such as attribute information of the user 10 who reserved the restaurant, past reservation or order history, the profile of an invited user 10, the purpose of the party (such as a celebration or anniversary) or of the stay, the contents of the course meal for that day, or the spatial theme or features of the reserved private room or hotel room.
- the attribute information of the user 10 includes the age, sex, family structure, presence / absence of child, occupation, and the like of the user 10.
- the order history includes food or liquor preferences.
- the pop audiovisual content may be preferentially reproduced, and the reproduction content of the audiovisual content may be changed in accordance with the progress of the course meal provision.
- For example, music or artwork by an artist associated with the region linked to the selected dessert or tea may be preferentially selected, switching to a video related to the origin of the ingredients while playing the sound seamlessly.
- For example, for a Viennese dessert, priority may be given to images or music related to Austria.
- When the dedicated terminal is a cooking device such as a roasting machine, a coffee maker, an electromagnetic cooker, or an oven, for example, the place of origin of the green coffee beans being roasted may be determined, and audiovisual content may be produced from information such as the sound during roasting or the sound during grinding.
- By presenting the process of boiling with an electromagnetic cooker or of slowly baking in an oven as audiovisual content, it is possible to provide the experience of enjoying the background of the provided food or beverage with all five senses.
- From the brand information of a drink, audiovisual content related to its place of production, its raw materials, or a particular way of drinking it may be obtained and played back; for example, audiovisual content explaining how the sake is made may be played in conjunction with the insertion of a bottle.
- When the sensor of the external dedicated terminal can recognize conditions over a wide area of the space, such as a ceiling camera, stories about the dishes on the table can be automatically inserted and played back in a timely manner at, or just before, the moment a course dish is served, such as a message video from the chef, the origin or rearing environment of the ingredients used for cooking, or a three-day preparation process. In this way, the deliciousness of the cooking can be brought to a climax through the information in the audiovisual content.
- The playback order of audiovisual content may be designed according to the scheduled time of a conference, and the audiovisual content may be reproduced according to its phases, from initial objective sharing through the phase of drawing out discussion, the sharing phase, wrap-up and approval, and confirmation of action items.
- The main body of the electronic device may reproduce audiovisual content suggesting an active meeting.
- Control may also be performed to increase the value of the space by changing the hue of the lighting or by outputting a scent that promotes relaxation or a cool-down effect.
- Audiovisual content that can serve as a trigger for discussion, or that has a refreshing effect that breaks a deadlock in thinking, may be played, or a scent that has the effect of stimulating thought may be output.
- The playback method may be changed, starting 15 minutes before the end time of conference room use, to indicate that the end time is approaching. For example, an alarm sound may be inserted step by step from 15 minutes before, the screen may be gradually reduced, the sound volume may be increased, or the screen may be tinted red, so that the users notice the change.
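The staged warnings described here can be sketched as an escalation schedule; the cues follow the examples in the text (alarm, shrinking screen, louder sound, red tint), but the exact minute thresholds are illustrative assumptions.

```python
def end_warning(minutes_left):
    """Return the playback-change cues active at a given number of
    minutes before the conference-room end time. Thresholds other than
    the 15-minute start are illustrative assumptions."""
    cues = []
    if minutes_left > 15:
        return cues                      # no warning yet
    cues.append("insert_alarm")          # stepwise alarm from 15 min out
    if minutes_left <= 10:
        cues.append("shrink_screen")
    if minutes_left <= 5:
        cues.append("raise_volume")
    if minutes_left <= 2:
        cues.append("tint_screen_red")
    return cues

print(end_warning(4))  # → ['insert_alarm', 'shrink_screen', 'raise_volume']
```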
- When a player performing music or a performer singing vocals produces the music themselves, control may be performed so that only video content is played back.
- The player or performer may be recognized so that a pre-registered favorite video content of that player or performer is played, or the sound features of the instrument being played or of the vocals may be grasped and video content corresponding to those features may be reproduced.
- For a player or performer who is practicing, video content indicating a high degree of conformity may be played back when the conformity with the scale that should be played is high; when the conformity is low, video content including guidance for addressing the low conformity may be played back.
- This video content includes, for example, a video in which an object moves upward as the pitch rises, guiding the pitch in the rising direction.
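The pitch-guidance video described here can be sketched as a mapping from detected pitch to the vertical position of the guide object; the pitch range and screen height are illustrative assumptions.

```python
def object_height(pitch_hz, lo_hz=220.0, hi_hz=880.0, screen_h=480):
    """Map a detected pitch to a vertical screen position so the guide
    object moves upward as the pitch rises. The two-octave range and
    screen height are illustrative assumptions."""
    pitch_hz = min(max(pitch_hz, lo_hz), hi_hz)   # clamp to the range
    frac = (pitch_hz - lo_hz) / (hi_hz - lo_hz)   # 0.0 (low) .. 1.0 (high)
    return int(frac * screen_h)                   # higher pitch, higher object

print(object_height(550.0))  # → 240
```

A real system would feed this from a pitch detector running on the microphone input and animate the object smoothly between positions.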
- The number of people in a shared space increases or decreases over time, and the atmosphere of the place needs to change accordingly. Therefore, for example, the number of people in the shared space may be grasped with the sensors of electronic devices or external dedicated terminals; when there are few people, the system may switch to calm audiovisual content, and when the number of people increases, conversation grows, and the atmosphere becomes lively, it may switch to bright, pop audiovisual content. Furthermore, when a certain user 10 takes out an analog record and starts playback on a record player, it may be determined that external music is present, and control may change so that the audio output from the electronic device or external dedicated terminal stops and only video content is played back.
- FIG. 26 is a block diagram showing the device 101 according to the present embodiment.
- the device 101 includes an acquisition unit 301 and a control processing unit 302.
- the acquisition unit 301 acquires environment information of the device 101 including information related to the environment around the device 101 or information related to the installation mode of the device 101.
- FIG. 27 is a flowchart showing a method for controlling the device 101 according to the present embodiment.
- control method of the device 101 includes steps S1 and S2.
- In step S1, the acquisition unit 301 acquires the environment information of the device 101, including information about the environment around the device 101 or information about the installation mode of the device 101.
- In step S2, the control processing unit 302 refers to a predetermined association between the environment information of the device 101 and the content of control related to the presentation of content by the device 101, and performs the control related to content presentation that is associated with the acquired environment information.
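Steps S1 and S2 can be sketched as an acquisition step followed by a table lookup; the association keys and control commands below are illustrative, not taken from the embodiment.

```python
# Hypothetical predetermined association between environment information
# and the control related to content presentation.
ASSOCIATION = {
    ("person_present", "bright"): "play_video",
    ("person_present", "dark"): "play_audio_only",
    ("nobody", "bright"): "standby",
    ("nobody", "dark"): "power_off",
}

def control_method(acquire_environment):
    env = acquire_environment()   # step S1: acquire environment information
    return ASSOCIATION[env]       # step S2: look up the predetermined association

print(control_method(lambda: ("person_present", "dark")))  # → play_audio_only
```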
- the electronic device can control the presentation of content according to the environment around the electronic device, which is different from hardware or software factors.
- the technology described in the above aspect can be realized, for example, in the following types of cloud services.
- the type in which the technique described in the above embodiment is realized is not limited to this.
- FIG. 28 shows service type 1 (in-house data center type).
- This type is a type in which the service provider 120 acquires information from the group 100 and provides a service to the user 10.
- the service provider 120 has a function of a data center operating company. That is, the service provider 120 has a cloud server 111 that manages big data. Therefore, there is no data center operating company.
- FIG. 29 shows service type 2 (IaaS usage type).
- IaaS is an abbreviation for infrastructure as a service, and is a cloud service provision model that provides a base for constructing and operating a computer system as a service via the Internet.
- the data center operating company operates and manages the data center 2203 (cloud server 111).
- the service provider 120 manages the OS 2202 and the application 2201.
- the service provider 120 performs service provision 2204 using the OS 2202 and the application 2201 managed by the service provider 120.
- FIG. 30 shows service type 3 (PaaS usage type).
- PaaS is an abbreviation for Platform as a Service
- PaaS is a cloud service provision model that provides a platform serving as a foundation for constructing and operating software as a service via the Internet.
- the data center operating company 110 manages the OS 2202 and operates and manages the data center 2203 (cloud server 111). Further, the service provider 120 manages the application 2201. The service provider 120 provides the service 2204 using the OS 2202 managed by the data center operating company and the application 2201 managed by the service provider 120.
- FIG. 31 shows service type 4 (SaaS usage type).
- SaaS is an abbreviation for software as a service.
- SaaS is a cloud service provision model in which applications provided by a platform provider who owns a data center (cloud server) can be used, via a network such as the Internet, by companies or individuals (users) who do not own a data center (cloud server).
- the data center operating company 110 manages the application 2201, manages the OS 2202, and operates and manages the data center 2203 (cloud server 111). Further, the service provider 120 provides the service 2204 using the OS 2202 and the application 2201 managed by the data center operating company 110.
- In any of these service types, it is assumed that the service provider 120 performs the act of providing the service.
- the service provider or the data center operating company may develop an OS, an application, a big data database, or the like, or may be outsourced to a third party.
- the electronic device performs control related to presentation of content using the acquired environment information and a predetermined association.
- the predetermined association is associated in advance with the environmental information of the electronic device and the content of the control. Therefore, the electronic device can perform control related to content presentation based on the acquired environmental information of the electronic device. Thus, the electronic device can control the presentation of content according to the environment around the electronic device.
- the environmental information includes specific information such as the presence or orientation of a person around the electronic device, the direction of the person around the electronic device, or the brightness or volume of the surroundings.
- Information about people around the electronic device, such as a person's line of sight or facial expression captured by a camera, may also serve as environmental information.
- the electronic device can be specifically configured using a sensor that acquires environmental information. Based on such a specific configuration, the electronic device can perform control related to the presentation of content according to the environment around the electronic device.
- environmental information is combined with specific information such as the presence or orientation of people around the electronic device, the direction of people around the electronic device, or the brightness or volume of the surroundings. Control related to presentation of content according to the environment around the electronic device can be performed more appropriately.
- the electronic device can select a content to be presented from among a plurality of contents as control related to the presentation of the contents.
- the electronic device when the electronic device selects content to be presented from a plurality of content, the electronic device can select the content based on the attribute information of the content.
- When the electronic device selects content to be presented from among a plurality of contents, it can select the content based on permission information indicating whether editing, adjustment, or processing of the content at the time of presentation is permitted. This avoids presenting content in a manner that is not permitted.
- the electronic device can start, stop, or end the presentation of content as control related to the presentation of content.
- the electronic device can change the volume or brightness of the content as control related to the presentation of the content.
- the environmental information includes specific information on the attitude of the electronic device. Then, the electronic device can perform control related to presentation of content based on the attitude of the electronic device.
- the environmental information includes specific information indicating whether or not a plurality of electronic devices are arranged side by side. Then, the electronic device can perform control related to presentation of content based on how the electronic device is arranged.
- the electronic device outputs log information indicating the number of times content is presented, the time of presentation, and the like.
- The output log information is fed back to, for example, the content creator, and can be utilized when content is improved or new content is created. At this time, it is even more preferable that the installation status of the electronic device and the environment information be fed back to the content creator together with the log information.
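The log information fed back to the content creator might be bundled as follows; the field names are illustrative assumptions, not a format defined in the embodiment.

```python
import json

def presentation_log(content_id, times_presented, total_seconds,
                     installation, environment):
    """Bundle presentation counts and times with installation status and
    environment information for feedback to the content creator.
    All field names are hypothetical."""
    return json.dumps({
        "content_id": content_id,
        "times_presented": times_presented,
        "total_seconds": total_seconds,
        "installation": installation,   # e.g. wall-mounted, landscape
        "environment": environment,     # e.g. average ambient volume
    })

print(presentation_log("clip_01", 12, 5400, "wall_landscape", {"avg_db": 42}))
```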
- each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
- Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
- the software that realizes the electronic device of each of the above embodiments is a program as follows.
- This program causes a computer to execute a control method for an electronic device that acquires environmental information of the electronic device, including information on the environment around the electronic device or information on the installation mode of the electronic device, and performs control related to the presentation of content by the electronic device associated with the acquired environmental information, by referring to a predetermined association between the environmental information of the electronic device and the details of the control related to the presentation of content by the electronic device.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Databases & Information Systems (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The invention concerns an electronic device control method comprising: a step (S1) of obtaining environmental information of an electronic device, the information including information on the environment around the electronic device or information on the installation mode of the electronic device; and a step (S2) of performing, with reference to a predetermined association between the environmental information of the electronic device and control details concerning content presentation by the electronic device, control concerning content presentation by the electronic device, the content presentation being associated with the obtained environmental information. For example, the environmental information may include, as information on the environment around the electronic device, information indicating the presence or absence of a person around the electronic device, the orientation of the person around the electronic device, the direction in which the person is present around the electronic device as seen from the electronic device, the brightness in the area surrounding the electronic device, or the sound volume in the area surrounding the electronic device, and the environmental information may be obtained by a sensor that detects the environment around the electronic device and outputs an environmental value indicating that environment.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762461439P | 2017-02-21 | 2017-02-21 | |
US62/461439 | 2017-02-21 | ||
JP2017-189955 | 2017-09-29 | ||
JP2017189955A JP2020065097A (ja) | 2017-02-21 | 2017-09-29 | 電子機器の制御方法、電子機器の制御システム、電子機器、及び、プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018155354A1 true WO2018155354A1 (fr) | 2018-08-30 |
Family
ID=63253805
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/005616 WO2018155354A1 (fr) | 2017-02-21 | 2018-02-19 | Procédé de commande de dispositif électronique, système de commande de dispositif électronique, dispositif électronique, et programme |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018155354A1 (fr) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110855549A (zh) * | 2019-10-28 | 2020-02-28 | 维沃移动通信有限公司 | 一种消息显示方法及终端设备 |
WO2021006065A1 (fr) * | 2019-07-11 | 2021-01-14 | ソニー株式会社 | Système de traitement d'informations, procédé de traitement d'informations et support d'enregistrement |
JP2021018499A (ja) * | 2019-07-17 | 2021-02-15 | 富士ゼロックス株式会社 | 情報処理システム、情報処理装置、および、プログラム |
JP2021060826A (ja) * | 2019-10-07 | 2021-04-15 | 富士ゼロックス株式会社 | 情報処理装置及びプログラム |
CN114693389A (zh) * | 2021-08-12 | 2022-07-01 | 山东浪潮爱购云链信息科技有限公司 | 一种针对采购商的线上寻源方法、设备及介质 |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002503896A (ja) * | 1998-01-05 | 2002-02-05 | インテル・コーポレーション | アクセス時間に基づくユーザ・プロファイル |
JP2004526374A (ja) * | 2001-03-29 | 2004-08-26 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | ユーザ行動に基づきメディアプレーヤーを制御する方法及び装置 |
JP2005221954A (ja) * | 2004-02-09 | 2005-08-18 | Casio Comput Co Ltd | 映像表示制御装置及びプログラム |
JP2008123576A (ja) * | 2006-11-09 | 2008-05-29 | Nec Corp | ポータブルコンテンツ再生装置、再生システム、コンテンツ再生方法 |
WO2010007987A1 (fr) * | 2008-07-15 | 2010-01-21 | シャープ株式会社 | Dispositif d'émission de données, dispositif de réception de données, procédé d'émission de données, procédé de réception de données et procédé de commande d'environnement audiovisuel |
JP2011066516A (ja) * | 2009-09-15 | 2011-03-31 | Sony Corp | 表示装置および制御方法 |
JP2013026997A (ja) * | 2011-07-26 | 2013-02-04 | Sony Corp | 制御装置、制御方法、及び、プログラム |
JP2014106457A (ja) * | 2012-11-29 | 2014-06-09 | Mitsubishi Electric Information Systems Corp | 表示制御装置、表示制御システム及び表示制御プログラム |
JP2016504836A (ja) * | 2012-11-29 | 2016-02-12 | クゥアルコム・インコーポレイテッドQualcomm Incorporated | コンテンツ提示を提供するためにユーザエンゲージメントを使用するための方法および装置 |
JP2016536914A (ja) * | 2013-09-13 | 2016-11-24 | ホアウェイ・テクノロジーズ・カンパニー・リミテッド | ストリーミングメディア送信方法及びシステム、ユーザ機器及びサーバ |
-
2018
- 2018-02-19 WO PCT/JP2018/005616 patent/WO2018155354A1/fr active Application Filing
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002503896A (ja) * | 1998-01-05 | 2002-02-05 | Intel Corporation | User profile based on access times |
JP2004526374A (ja) * | 2001-03-29 | 2004-08-26 | Koninklijke Philips Electronics N.V. | Method and apparatus for controlling a media player based on user behavior |
JP2005221954A (ja) * | 2004-02-09 | 2005-08-18 | Casio Comput Co Ltd | Video display control device and program |
JP2008123576A (ja) * | 2006-11-09 | 2008-05-29 | Nec Corp | Portable content playback device, playback system, and content playback method |
WO2010007987A1 (fr) * | 2008-07-15 | 2010-01-21 | Sharp Corporation | Data transmission device, data reception device, data transmission method, data reception method, and audiovisual environment control method |
JP2011066516A (ja) * | 2009-09-15 | 2011-03-31 | Sony Corp | Display device and control method |
JP2013026997A (ja) * | 2011-07-26 | 2013-02-04 | Sony Corp | Control device, control method, and program |
JP2014106457A (ja) * | 2012-11-29 | 2014-06-09 | Mitsubishi Electric Information Systems Corp | Display control device, display control system, and display control program |
JP2016504836A (ja) * | 2012-11-29 | 2016-02-12 | Qualcomm Incorporated | Method and apparatus for using user engagement to provide content presentation |
JP2016536914A (ja) * | 2013-09-13 | 2016-11-24 | Huawei Technologies Co., Ltd. | Streaming media transmission method and system, user equipment, and server |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021006065A1 (fr) * | 2019-07-11 | 2021-01-14 | Sony Corporation | Information processing system, information processing method, and recording medium |
US12167305B2 (en) | 2019-07-11 | 2024-12-10 | Sony Group Corporation | Information processing system, information processing method, and recording medium for enabling a plurality of users to experience a same type of service |
JP2021018499A (ja) * | 2019-07-17 | 2021-02-15 | Fuji Xerox Co., Ltd. | Information processing system, information processing device, and program |
JP7487450B2 (ja) | 2019-07-17 | 2024-05-21 | FUJIFILM Business Innovation Corp. | Information processing system, information processing device, and program |
JP2021060826A (ja) * | 2019-10-07 | 2021-04-15 | Fuji Xerox Co., Ltd. | Information processing device and program |
CN112699313A (zh) * | 2019-10-07 | 2021-04-23 | Fuji Xerox Co., Ltd. | Information processing device, storage medium, and information processing method |
JP7467870B2 (ja) | 2019-10-07 | 2024-04-16 | FUJIFILM Business Innovation Corp. | Information processing device and program |
CN110855549A (zh) * | 2019-10-28 | 2020-02-28 | Vivo Mobile Communication Co., Ltd. | Message display method and terminal device |
CN114693389A (zh) * | 2021-08-12 | 2022-07-01 | Shandong Inspur Aigou Cloud Chain Information Technology Co., Ltd. | Online sourcing method, device, and medium for purchasers |
CN114693389B (zh) * | 2021-08-12 | 2024-05-28 | Shandong Inspur Aigou Cloud Chain Information Technology Co., Ltd. | Online sourcing method, device, and medium for purchasers |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018155354A1 (fr) | Electronic device control method, electronic device control system, electronic device, and program |
JP6719535B2 (ja) | Information processing system, server system, information processing program, and information processing method |
US12161928B2 (en) | Reflective video display apparatus for interactive training and demonstration and methods of using same |
US12039224B2 (en) | Multimedia experience according to biometrics |
JP6676309B2 (ja) | Electronic device, electronic device system, and device control method |
CN110996796B (zh) | Information processing device, method, and program |
US20130283325A1 (en) | Entertainment System and Method for Displaying Multimedia Content |
JP2013535660A (ja) | Method and apparatus for capturing an atmosphere |
CN103795951B (zh) | Display curtain wall system and method for intelligently rendering a home atmosphere |
US20180213291A1 (en) | Contextual user interface based on media playback |
CN110692218A (zh) | Method of using a connected lighting system |
JP2020065097A (ja) | Electronic device control method, electronic device control system, electronic device, and program |
US20180213286A1 (en) | Contextual user interface based on shared activities |
US20210262680A1 (en) | Personal environmental control system and method |
EP3607521B1 (fr) | Method and apparatus for monitoring usage of a lighting system |
JP2020009589A (ja) | Lighting system and lighting control method |
WO2023042423A1 (fr) | Information processing device, information processing method, and program |
JP2022061464A (ja) | Living environment control system, building, server, and control method |
JP2021052965A (ja) | Screening facility |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18756777; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18756777; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: JP |