WO2016039498A1 - Mobile terminal and control method therefor - Google Patents
Mobile terminal and control method therefor
- Publication number: WO2016039498A1 (international application PCT/KR2014/009391)
- Authority: WIPO (PCT)
- Prior art keywords: area, display unit, mobile terminal, displayed, information
Classifications
- G — PHYSICS
- G06 — COMPUTING; CALCULATING OR COUNTING
- G06F — ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
Description
- The present invention relates to a mobile terminal having a display that can be bent or folded, and to a method of controlling such a mobile terminal.
- Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility.
- Mobile terminals may be further classified into handheld terminals and vehicle-mounted terminals according to whether a user can carry them directly.
- The functions of mobile terminals are diversifying; examples include data and voice communication, capturing still images and video with a camera, recording voice, playing music files through a speaker system, and outputting images or video to a display unit.
- Some terminals additionally provide an electronic game function or a multimedia player function.
- In particular, recent mobile terminals can receive multicast signals that provide visual content such as broadcasts, video, and television programs.
- As these functions become more diversified, such a terminal is implemented as a multimedia player with composite functions such as capturing photos or video, playing music or video files, playing games, and receiving broadcasts.
- To support and expand the functions of such a terminal, improvement of the structural parts and/or the software parts of the terminal may be considered.
- With the development of technology, current mobile terminals may be manufactured in the form of a flexible display.
- The flexible display herein refers to a display that can be curved, bent, twisted, folded, or rolled by an external force.
- A flexible display may be fabricated on a thin, flexible substrate so that it can be curved, bent, folded, or rolled like paper while maintaining the display characteristics of a conventional flat-panel display.
- An object of the present invention is to provide suitable screen information according to the bending characteristics of the flexible display unit.
- Another object of the present invention is to provide a method for controlling various functions executable in a mobile terminal according to a bending characteristic of a flexible display unit.
- According to an embodiment of the present invention, a mobile terminal includes a flexible display unit for displaying image information, a sensing unit for detecting bending of the flexible display unit, and a control unit that divides the flexible display unit into a plurality of areas according to the bending and displays different image information in each of the plurality of areas, wherein the control unit controls the flexible display unit to display image information in at least one of the plurality of areas based on a user's touch input applied to an area in which the flexible display unit is bent.
- In an embodiment, when the flexible display unit is bent by an external force, the controller detects a straight line that is the center of the bending, forms a first area around the detected straight line, and divides the flexible display unit into a plurality of areas based on the first area, as sketched below.
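A minimal sketch of such partitioning, assuming the display is modeled as a vertical strip of pixels and the width of the first area grows with the bend angle; the region names and the angle-to-width mapping are illustrative assumptions, not values from the patent.

```kotlin
// Hypothetical sketch: partition a flexible display into a band around the detected
// bend line (the "first area") and the two flat regions on either side of it.
data class Region(val name: String, val startPx: Int, val endPx: Int)

fun partitionByFold(heightPx: Int, foldLinePx: Int, bendAngleDeg: Float): List<Region> {
    // Assumed mapping: the fold band grows with the bend angle (purely illustrative).
    val halfBand = (heightPx * 0.05f * (bendAngleDeg / 90f)).toInt().coerceAtLeast(1)
    val bandStart = (foldLinePx - halfBand).coerceAtLeast(0)
    val bandEnd = (foldLinePx + halfBand).coerceAtMost(heightPx)
    return listOf(
        Region("areaA", 0, bandStart),           // first divided area
        Region("firstArea", bandStart, bandEnd), // band around the bend line
        Region("areaB", bandEnd, heightPx)       // second divided area
    )
}

fun main() {
    // Example: a 2000 px tall panel folded near the middle at 60 degrees.
    partitionByFold(2000, 1050, 60f).forEach(::println)
}
```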
- In an embodiment, when the flexible display unit is bent by the external force while in an inactive state, the controller activates one of the divided areas based on a plurality of touch inputs applied to at least one of the first area and the divided areas.
- In an embodiment, when a second touch input is applied to at least one of the divided areas while a first touch input is maintained on the first area, the controller activates one of the divided areas based on the area to which the second touch input is applied; a sketch of this selection logic is shown below.
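A small sketch of the selection rule just described, assuming a simplified touch-event model; the region names and the event fields are assumptions for illustration.

```kotlin
// Hypothetical sketch: while a first touch is held on the bend ("first") area,
// a second touch on one of the divided areas selects that area for activation.
data class TouchEvent(val region: String, val isHeld: Boolean)

fun selectAreaToActivate(events: List<TouchEvent>): String? {
    val firstTouchHeld = events.any { it.region == "firstArea" && it.isHeld }
    if (!firstTouchHeld) return null
    // The area that receives the second (non-held) touch becomes active.
    return events.firstOrNull { it.region != "firstArea" && !it.isHeld }?.region
}

fun main() {
    val gesture = listOf(TouchEvent("firstArea", isHeld = true), TouchEvent("areaB", isHeld = false))
    println(selectAreaToActivate(gesture)) // -> areaB
}
```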
- In an embodiment, the second touch input includes a plurality of taps applied to one of the divided areas. When the mobile terminal is in a locked state, the controller releases the locked state based on whether the motion pattern formed by the taps of the second touch input matches a predetermined motion pattern, and activates one of the divided areas based on the area to which the second touch input is applied, as in the sketch below.
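A minimal sketch of the lock-release check, assuming the motion pattern is represented as the sequence of quadrants that were tapped; the 2x2 quadrant grid is an assumption made only for illustration.

```kotlin
// Hypothetical sketch: the taps of the second touch input form a motion pattern
// (here, a sequence of tapped quadrants) that is compared against a stored pattern.
data class Tap(val x: Float, val y: Float)

fun quadrant(tap: Tap, width: Float, height: Float): Int =
    (if (tap.y < height / 2) 0 else 2) + (if (tap.x < width / 2) 0 else 1)

fun matchesKnockPattern(taps: List<Tap>, stored: List<Int>, width: Float, height: Float): Boolean =
    taps.size == stored.size && taps.map { quadrant(it, width, height) } == stored

fun main() {
    val stored = listOf(0, 1, 3, 2)                        // predetermined motion pattern
    val taps = listOf(Tap(10f, 10f), Tap(90f, 10f), Tap(90f, 90f), Tap(10f, 90f))
    println(matchesKnockPattern(taps, stored, 100f, 100f)) // true -> release lock, activate tapped area
}
```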
- In an embodiment, when the flexible display unit is bent by the external force while the flexible display unit is activated, the controller deactivates one of the divided areas based on a plurality of touch inputs applied to at least one of the first area and the divided areas.
- In an embodiment, the controller determines the size of the first area according to the degree to which the flexible display unit is bent, and changes the image information displayed in the first area and the divided areas based on the determined size of the first area.
- In an embodiment, the controller displays, in the first area, information related to the image information displayed in one of the divided areas, and displays that related information in the form of a thumbnail image or text depending on whether the size of the first area is greater than or equal to a predetermined level (see the sketch below).
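A sketch of that rendering decision, assuming the first area's size is measured in pixels; the threshold value and the two render modes are assumptions.

```kotlin
// Hypothetical sketch: choose how related information is rendered in the first area
// depending on whether its size has reached a predetermined level.
enum class RenderMode { TEXT, THUMBNAIL }

fun relatedInfoMode(firstAreaHeightPx: Int, thresholdPx: Int = 120): RenderMode =
    if (firstAreaHeightPx >= thresholdPx) RenderMode.THUMBNAIL else RenderMode.TEXT

fun main() {
    println(relatedInfoMode(60))   // TEXT: the band is too narrow for an image
    println(relatedInfoMode(200))  // THUMBNAIL: enough room for a thumbnail image
}
```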
- In an embodiment, the controller divides the divided areas into a main area and a sub area according to a user's selection, displays information related to the image information displayed in the main area on at least one of the first area and the sub area, and changes the information displayed on at least one of the first area and the sub area according to the degree of bending of the flexible display unit.
- In an embodiment, the controller displays image information related to the playback of sound source data in the main area, and displays a control menu that includes a different number of control functions according to the degree to which the flexible display unit is bent.
- In an embodiment, the controller displays a playlist of the currently selected sound source data in the first area, and displays, in the sub area, information on sound source albums stored in a memory provided in the mobile terminal or in a preset external server. A sketch of the bend-dependent control menu is shown below.
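A minimal sketch of how the number of control functions could follow the bend angle; the angle thresholds and the set of control labels are illustrative assumptions, not values from the patent.

```kotlin
// Hypothetical sketch: the playback control menu exposes more functions as the
// bend angle increases.
fun musicControlMenu(bendAngleDeg: Float): List<String> {
    val all = listOf("play/pause", "next", "previous", "shuffle", "repeat", "volume")
    val count = when {
        bendAngleDeg < 15f -> 1
        bendAngleDeg < 45f -> 3
        else -> all.size
    }
    return all.take(count)
}

fun main() {
    println(musicControlMenu(10f))  // [play/pause]
    println(musicControlMenu(30f))  // [play/pause, next, previous]
    println(musicControlMenu(80f))  // all six controls
}
```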
- In an embodiment, the controller displays, in the main area, one of the images previously stored in a memory provided in the mobile terminal or in a preset external server, and, according to the degree to which the flexible display unit is bent, displays information corresponding to the previously stored image data in the first area in the form of text or a thumbnail image.
- In an embodiment, the controller further displays an album list of the pre-stored images in the sub area based on the degree of bending of the flexible display unit, and displays, in the first area, thumbnail images corresponding to the images included in any one album in the list.
- In an embodiment, the controller displays, in the main area, a screen on which video data stored in a memory provided in the mobile terminal or in a preset external server is played, and, according to the degree to which the flexible display unit is bent, displays in the first area a control menu including different functions related to the video data being played in the main area.
- In an embodiment, the controller further displays, in the first area, still-cut images extracted from the played video data for each time section, based on the degree to which the flexible display unit is bent. A sketch of such time-section frame selection is shown below.
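A sketch of picking one still-cut timestamp per time section, where the number of sections depends on the bend angle; the angle-to-section mapping is an assumption for illustration.

```kotlin
// Hypothetical sketch: choose the timestamps of still-cut images, one per time
// section, with more sections at larger bend angles.
fun stillCutTimestamps(videoDurationSec: Int, bendAngleDeg: Float): List<Int> {
    val sections = (2 + (bendAngleDeg / 30f).toInt()).coerceAtMost(12)
    val step = videoDurationSec / sections
    return (0 until sections).map { it * step + step / 2 } // midpoint of each section
}

fun main() {
    // A 10-minute clip at a 60-degree bend -> 4 sections, 4 still cuts.
    println(stillCutTimestamps(600, 60f)) // [75, 225, 375, 525] seconds
}
```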
- In an embodiment, based on the bending of the flexible display unit, the controller classifies a web page accessed by the user into a text portion and a multimedia content portion and displays them in the main area and the sub area, respectively, or displays different parts of the web page in the main area and the sub area. The sketch below illustrates one way such a split could be made.
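A sketch of classifying page elements into text and multimedia portions so they can be routed to the main and sub areas; the flat element model is an assumption, and a real implementation would walk the rendered page structure instead.

```kotlin
// Hypothetical sketch: split the elements of a fetched web page into a text
// portion (main area) and a multimedia portion (sub area).
data class PageElement(val tag: String, val content: String)

fun splitWebPage(elements: List<PageElement>): Pair<List<PageElement>, List<PageElement>> {
    val multimediaTags = setOf("img", "video", "audio", "iframe")
    val (multimedia, text) = elements.partition { it.tag in multimediaTags }
    return text to multimedia // text -> main area, multimedia -> sub area
}

fun main() {
    val page = listOf(
        PageElement("p", "Article body..."),
        PageElement("img", "photo.jpg"),
        PageElement("video", "clip.mp4")
    )
    val (mainArea, subArea) = splitWebPage(page)
    println("main: $mainArea")
    println("sub:  $subArea")
}
```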
- In an embodiment, the controller controls a function executed in the mobile terminal differently based on the degree to which the flexible display unit is bent, and displays image information related to the controlled state on at least one of the areas divided based on the bending of the flexible display unit.
- In an embodiment, the controller displays image information related to different functions executable in the mobile terminal in each of the divided areas and, based on a user's selection of sharing information generated for a first area among the divided areas, controls the flexible display unit to display the image information displayed in the first area in a second area.
- In an embodiment, when a screen on which video data is played is displayed in the first area, the controller generates sharing information related to the part of the video data corresponding to a predetermined time before and after the point at which a user's touch input is applied to the first area, and, when the sharing information is selected, controls the flexible display unit to play the part of the video data corresponding to that predetermined time section in the second area, as sketched below.
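A small sketch of building the shared time section around the touch moment; the 5-second window stands in for the "predetermined time" and is an assumption.

```kotlin
// Hypothetical sketch: describe the part of the video around the touched moment
// so the second area can play it back when the sharing information is selected.
data class ShareClip(val startSec: Int, val endSec: Int)

fun buildShareClip(touchTimeSec: Int, durationSec: Int, windowSec: Int = 5): ShareClip =
    ShareClip(
        startSec = (touchTimeSec - windowSec).coerceAtLeast(0),
        endSec = (touchTimeSec + windowSec).coerceAtMost(durationSec)
    )

fun main() {
    // Touch at 02:00 in a 10-minute video -> share the 115 s..125 s segment.
    println(buildShareClip(touchTimeSec = 120, durationSec = 600))
}
```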
- According to an embodiment of the present invention, a method of controlling a mobile terminal includes detecting bending of the flexible display unit; dividing the flexible display unit into a plurality of areas based on the degree of bending; displaying, in each of the divided areas, image information corresponding to at least one operating state executed in the mobile terminal; and, when the bent state of the flexible display unit changes, changing the image information displayed in each of the divided areas based on that change. The sketch below outlines this control flow.
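A sketch of the overall control flow under stated assumptions: the bend sensor delivers a fold line and angle, and a renderer callback stands in for the display pipeline; both interfaces are assumptions made for illustration.

```kotlin
// Hypothetical sketch: detect bending, divide the display, show per-area content,
// and re-layout whenever the bent state changes.
data class BendState(val foldLinePx: Int, val angleDeg: Float)

class BendController(
    private val heightPx: Int,
    private val render: (areaName: String, startPx: Int, endPx: Int) -> Unit
) {
    private var last: BendState? = null

    fun onBendChanged(state: BendState) {
        if (state == last) return            // only react when the bent state changes
        last = state
        val half = (heightPx * 0.05f * (state.angleDeg / 90f)).toInt().coerceAtLeast(1)
        render("areaA", 0, state.foldLinePx - half)
        render("firstArea", state.foldLinePx - half, state.foldLinePx + half)
        render("areaB", state.foldLinePx + half, heightPx)
    }
}

fun main() {
    val controller = BendController(2000) { name, s, e -> println("$name: $s..$e px") }
    controller.onBendChanged(BendState(foldLinePx = 1000, angleDeg = 30f))
    controller.onBendChanged(BendState(foldLinePx = 1000, angleDeg = 90f)) // re-layout on change
}
```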
- The present invention detects the bent state of the display unit and displays image information based on that bent state, which has the advantage that the user can display the desired image information simply by bending the display unit.
- The present invention divides the display unit into a plurality of areas based on its bent state, and displays image information in the plurality of areas according to the bent state and the user's touch input.
- The present invention controls at least one function executable in the mobile terminal based on the bent state of the display unit, which has the advantage that the user can control a desired function simply by changing the bent state of the display unit.
- FIG. 1A is a block diagram illustrating a mobile terminal related to the present invention.
- FIGS. 1B and 1C are conceptual views of one example of a mobile terminal, viewed from different directions.
- FIG. 2 shows an example in which a mobile terminal according to the present invention is deformed by an external force.
- FIG. 3 is a flowchart illustrating an operation process of a mobile terminal according to the present invention.
- FIG. 4 is an exemplary view showing an example in which the mobile terminal according to the present invention is deformed by an external force.
- FIG. 5 is a flowchart illustrating an operation of dividing the display into a plurality of areas among the processes shown in FIG. 3.
- FIG. 6 is a flowchart illustrating an operation of switching a display unit in an inactive state to an active state in a mobile terminal according to the present invention.
- FIG. 7 is a flowchart illustrating an operation process of changing image information displayed in a plurality of divided display unit areas when the bent state of the display unit is changed among the processes illustrated in FIG. 3.
- FIG. 8 is a flowchart illustrating an operation process of displaying image information in an area in which a display unit is bent in the mobile terminal according to the present invention.
- FIG. 9 is a flowchart illustrating another operation of displaying image information in an area in which a display unit is bent in the mobile terminal according to the present invention.
- FIG. 10 is an exemplary diagram illustrating an example in which a specific area is activated according to a user's selection among a plurality of divided areas as the display unit is bent in the mobile terminal according to the present invention.
- FIG. 11 is an exemplary view illustrating an example in which image information is displayed in a plurality of areas divided based on a bent state of a display unit in the mobile terminal according to the present invention.
- FIG. 12 is an exemplary diagram illustrating an example in which image information displayed in at least one of the plurality of areas is changed based on a user's touch input in the mobile terminal according to the present invention.
- FIGS. 13A, 13B, 13C, 13D, and 13E are exemplary diagrams illustrating examples in which, in the mobile terminal according to the present invention, different image information related to the operating state of the mobile terminal is displayed in the plurality of areas according to the bent state of the display unit.
- FIG. 13F is an exemplary view illustrating an example in which the operating state of the mobile terminal is controlled according to the bent state of the display unit in the mobile terminal according to the present invention.
- FIGS. 14A and 14B are exemplary diagrams illustrating an example in which image information displayed in different areas of the display unit is shared by using the area in which the display unit is bent, in the mobile terminal according to the present invention.
- Mobile terminals described herein include mobile phones, smartphones, laptop computers, digital broadcasting terminals, personal digital assistants, portable multimedia players, navigation devices, slate PCs, tablet PCs, ultrabooks, and wearable devices such as smartwatches, smart glasses, and head-mounted displays.
- FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention
- FIGS. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions.
- The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190.
- The components shown in FIG. 1A are not essential for implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.
- Among these components, the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server.
- the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.
- The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
- The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (for example, touch keys and mechanical keys) for receiving information from a user.
- the voice data or the image data collected by the input unit 120 may be analyzed and processed as a control command of the user.
- the sensing unit 140 may include one or more sensors for sensing at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information.
- For example, the sensing unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, an optical sensor (e.g., the camera 121), a microphone (see 122), a battery gauge, and environmental sensors.
- the mobile terminal disclosed herein may use a combination of information sensed by at least two or more of these sensors.
- The output unit 150 generates output related to sight, hearing, or touch, and may include at least one of a display unit 151, a sound output unit 152, a haptic module 153, and a light output unit 154.
- the display unit 151 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
- the touch screen may function as a user input unit 123 that provides an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.
- the interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100.
- For example, the interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
- the memory 170 stores data supporting various functions of the mobile terminal 100.
- the memory 170 may store a plurality of application programs or applications driven in the mobile terminal 100, data for operating the mobile terminal 100, and instructions. At least some of these applications may be downloaded from an external server via wireless communication.
- At least some of these application programs may exist on the mobile terminal 100 from the time of shipment for basic functions of the mobile terminal 100 (for example, receiving and placing calls and receiving and sending messages).
- the application program may be stored in the memory 170 and installed on the mobile terminal 100 to be driven by the controller 180 to perform an operation (or function) of the mobile terminal.
- In addition to operations related to the application programs, the controller 180 typically controls the overall operation of the mobile terminal 100.
- the controller 180 may provide or process information or a function appropriate to a user by processing signals, data, information, and the like, which are input or output through the above-described components, or by driving an application program stored in the memory 170.
- controller 180 may control at least some of the components described with reference to FIG. 1A in order to drive an application program stored in the memory 170. Furthermore, the controller 180 may operate by combining at least two or more of the components included in the mobile terminal 100 to drive the application program.
- the power supply unit 190 receives power from an external power source and an internal power source under the control of the controller 180 to supply power to each component included in the mobile terminal 100.
- the power supply unit 190 includes a battery, which may be a built-in battery or a replaceable battery.
- At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of the mobile terminal according to various embodiments described below.
- the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.
- the broadcast receiving module 111 of the wireless communication unit 110 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
- the broadcast channel may include a satellite channel and a terrestrial channel.
- Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or switching of broadcast channels for at least two broadcast channels.
- The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication schemes for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), Code Division Multi Access 2000 (CDMA2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A)).
- The radio signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
- the wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100.
- the wireless internet module 113 is configured to transmit and receive wireless signals in a communication network according to wireless internet technologies.
- Examples of wireless Internet technologies include Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), and World Interoperability for Microwave Access (WiMAX).
- From this point of view, the wireless Internet module 113 that performs wireless Internet access through the mobile communication network may be understood as a kind of mobile communication module 112.
- The short-range communication module 114 is for short-range communication and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
- Through short-range wireless area networks, the short-range communication module 114 may support wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which the other mobile terminal 100 (or an external server) is located.
- The short-range wireless area networks may be wireless personal area networks.
- Here, the other mobile terminal 100 may be a wearable device (for example, a smartwatch, smart glasses, or a head-mounted display (HMD)) capable of exchanging data with (or interworking with) the mobile terminal 100 according to the present invention.
- The short-range communication module 114 may sense (or recognize), around the mobile terminal 100, a wearable device capable of communicating with the mobile terminal 100.
- The controller 180 may transmit at least a portion of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Accordingly, a user of the wearable device may use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user may answer the call through the wearable device, and when a message is received by the mobile terminal 100, the user may check the received message through the wearable device.
- the location information module 115 is a module for obtaining a location (or current location) of a mobile terminal, and a representative example thereof is a Global Positioning System (GPS) module or a Wireless Fidelity (WiFi) module.
- the mobile terminal may acquire the location of the mobile terminal using a signal transmitted from a GPS satellite.
- As another example, when utilizing the Wi-Fi module, the mobile terminal may acquire its location based on information about a wireless access point (AP) that transmits wireless signals to or receives wireless signals from the Wi-Fi module.
- the location information module 115 may perform any function of other modules of the wireless communication unit 110 to substitute or additionally obtain data regarding the location of the mobile terminal.
- the location information module 115 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.
- the input unit 120 is for inputting image information (or signal), audio information (or signal), data, or information input from a user.
- For the input of image information, the mobile terminal 100 may be provided with one or a plurality of cameras 121.
- the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
- the processed image frame may be displayed on the display unit 151 or stored in the memory 170.
- The plurality of cameras 121 provided in the mobile terminal 100 may be arranged to form a matrix structure, and through the cameras 121 forming such a matrix structure, a plurality of pieces of image information having various angles or focuses may be input to the mobile terminal 100.
- the plurality of cameras 121 may be arranged in a stereo structure to acquire a left image and a right image for implementing a stereoscopic image.
- the microphone 122 processes external sound signals into electrical voice data.
- the processed voice data may be variously used according to a function (or an application program being executed) performed by the mobile terminal 100. Meanwhile, various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated in the process of receiving an external sound signal.
- the user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the controller 180 may control an operation of the mobile terminal 100 to correspond to the input information. .
- The user input unit 123 may include a mechanical input means (or a mechanical key, for example, a button, a dome switch, a jog wheel, or a jog switch located on the front, rear, or side of the mobile terminal 100) and a touch-type input means.
- As an example, the touch-type input means may consist of a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen.
- The virtual key or visual key may be displayed on the touch screen in various forms and may consist of, for example, graphics, text, icons, video, or a combination thereof.
- the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a sensing signal corresponding thereto.
- the controller 180 may control driving or operation of the mobile terminal 100 or perform data processing, function or operation related to an application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.
- the proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
- the proximity sensor 141 may be disposed in an inner region of the mobile terminal covered by the touch screen described above or near the touch screen.
- Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
- the proximity sensor 141 may be configured to detect the proximity of the object by the change of the electric field according to the proximity of the conductive object.
- the touch screen (or touch sensor) itself may be classified as a proximity sensor.
- The proximity sensor 141 may detect a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state).
- The controller 180 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern detected through the proximity sensor 141, and may further output visual information corresponding to the processed data on the touch screen. In addition, the controller 180 may control the mobile terminal 100 to process different operations or data (or information) depending on whether a touch on the same point on the touch screen is a proximity touch or a contact touch.
- The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods such as a resistive method, a capacitive method, an infrared method, an ultrasonic method, and a magnetic field method.
- the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen or capacitance generated at the specific portion into an electrical input signal.
- the touch sensor may be configured to detect a position, an area, a pressure at the touch, a capacitance at the touch, and the like, when the touch object applying the touch on the touch screen is touched on the touch sensor.
- the touch object is an object applying a touch to the touch sensor and may be, for example, a finger, a touch pen or a stylus pen, a pointer, or the like.
- the touch controller processes the signal (s) and then transmits the corresponding data to the controller 180.
- the controller 180 can know which area of the display unit 151 is touched.
- the touch controller may be a separate component from the controller 180 or may be the controller 180 itself.
- the controller 180 may perform different control or perform the same control according to the type of touch object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different control or the same control according to the type of touch object may be determined according to the operation state of the mobile terminal 100 or an application program being executed.
- The touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
- the ultrasonic sensor may recognize location information of a sensing object using ultrasonic waves.
- the controller 180 can calculate the position of the wave generation source through the information detected from the optical sensor and the plurality of ultrasonic sensors.
- The position of the wave source can be calculated using the property that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave source may be calculated from the time difference between the arrival of the ultrasonic wave and that of the light, with the light serving as the reference signal. A small numerical sketch of this idea follows.
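A minimal numerical sketch of the time-difference idea, assuming the light arrives essentially instantly and a speed of sound of roughly 343 m/s in room-temperature air.

```kotlin
// Sketch: the distance to the wave source is approximately the speed of sound
// multiplied by the delay between the optical reference and the ultrasonic arrival.
const val SPEED_OF_SOUND_M_PER_S = 343.0

fun distanceToSource(ultrasoundDelaySeconds: Double): Double =
    SPEED_OF_SOUND_M_PER_S * ultrasoundDelaySeconds

fun main() {
    // A 2.9 ms delay corresponds to a source roughly one metre away.
    println("%.3f m".format(distanceToSource(0.0029))) // ~0.995 m
}
```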
- the camera 121 which has been described as the configuration of the input unit 120, includes at least one of a camera sensor (eg, CCD, CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.
- the camera 121 and the laser sensor may be combined with each other to detect a touch of a sensing object with respect to a 3D stereoscopic image.
- the photo sensor may be stacked on the display element, which is configured to scan the movement of the sensing object in proximity to the touch screen. More specifically, the photo sensor mounts a photo diode and a transistor (TR) in a row / column and scans contents mounted on the photo sensor by using an electrical signal that varies according to the amount of light applied to the photo diode. That is, the photo sensor calculates coordinates of the sensing object according to the amount of light change, and thus, the position information of the sensing object can be obtained.
- the display unit 151 displays (outputs) information processed by the mobile terminal 100.
- the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information according to the execution screen information. .
- the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.
- A three-dimensional display method such as a stereoscopic method (glasses method), an autostereoscopic method (glasses-free method), or a projection method (holographic method) may be applied to the stereoscopic display unit.
- the sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
- the sound output unit 152 may also output a sound signal related to a function (for example, a call signal reception sound or a message reception sound) performed in the mobile terminal 100.
- the sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.
- the haptic module 153 generates various haptic effects that a user can feel.
- a representative example of the tactile effect generated by the haptic module 153 may be vibration.
- the intensity and pattern of vibration generated by the haptic module 153 may be controlled by the user's selection or the setting of the controller. For example, the haptic module 153 may synthesize different vibrations and output or sequentially output them.
- In addition to vibration, the haptic module 153 can generate various other tactile effects, such as the effect of a pin arrangement moving vertically against the skin surface, a jetting or suction force of air through a jet or suction opening, brushing against the skin surface, contact of an electrode, an electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.
- the haptic module 153 may not only deliver a tactile effect through direct contact, but also may allow a user to feel the tactile effect through a muscle sense such as a finger or an arm. Two or more haptic modules 153 may be provided according to a configuration aspect of the mobile terminal 100.
- the light output unit 154 outputs a signal for notifying occurrence of an event by using light of a light source of the mobile terminal 100.
- Examples of events occurring in the mobile terminal 100 may be message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
- the signal output from the light output unit 154 is implemented as the mobile terminal emits light of a single color or a plurality of colors to the front or the rear.
- the signal output may be terminated by the mobile terminal detecting the user's event confirmation.
- the interface unit 160 serves as a path to all external devices connected to the mobile terminal 100.
- the interface unit 160 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
- For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.
- The identification module is a chip storing various kinds of information for authenticating the usage rights of the mobile terminal 100, and may include a User Identification Module (UIM), a Subscriber Identity Module (SIM), and a Universal Subscriber Identity Module (USIM).
- a device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through the interface unit 160.
- The interface unit 160 may serve as a passage through which power from a cradle is supplied to the mobile terminal 100, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. Such command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal 100 has been correctly mounted on the cradle.
- the memory 170 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
- the memory 170 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
- The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
- the mobile terminal 100 may be operated in connection with a web storage that performs a storage function of the memory 170 on the Internet.
- the controller 180 controls the operation related to the application program, and generally the overall operation of the mobile terminal 100. For example, if the state of the mobile terminal satisfies a set condition, the controller 180 may execute or release a lock state that restricts input of a user's control command to applications.
- controller 180 may perform control and processing related to voice call, data communication, video call, or the like, or may perform pattern recognition processing for recognizing handwriting input or drawing input performed on a touch screen as text and images, respectively. Can be. Furthermore, the controller 180 may control any one or a plurality of components described above in order to implement various embodiments described below on the mobile terminal 100 according to the present invention.
- the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
- the power supply unit 190 includes a battery, and the battery may be a built-in battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging.
- The power supply unit 190 may be provided with a connection port, and the connection port may be configured as an example of the interface unit 160 to which an external charger that supplies power for charging the battery is electrically connected.
- the power supply unit 190 may be configured to charge the battery in a wireless manner without using the connection port.
- In this case, the power supply unit 190 may receive power from an external wireless power transmitter using one or more of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.
- various embodiments of the present disclosure may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
- Next, the structure of the mobile terminal in which the components of the mobile terminal according to an embodiment of the present invention described with reference to FIG. 1A are disposed will be described with reference to FIGS. 1B and 1C.
- the disclosed mobile terminal 100 has a terminal body in the form of a bar.
- The present invention is not limited thereto, however, and can be applied to various structures such as a watch type, a clip type, a glasses type, or a folder type, flip type, slide type, swing type, or swivel type in which two or more bodies are coupled so as to be movable relative to each other.
- a description of a particular type of mobile terminal may generally apply to other types of mobile terminals.
- the terminal body may be understood as a concept that refers to the mobile terminal 100 as at least one aggregate.
- The mobile terminal 100 includes a case (e.g., a frame, a housing, a cover, etc.) forming its external appearance. As shown, the mobile terminal 100 may include a front case 101 and a rear case 102. Various electronic components are disposed in the internal space formed by the coupling of the front case 101 and the rear case 102. At least one middle case may be additionally disposed between the front case 101 and the rear case 102.
- the display unit 151 may be disposed in front of the terminal body to output information. As shown, the window 151a of the display unit 151 may be mounted to the front case 101 to form a front surface of the terminal body together with the front case 101.
- an electronic component may be mounted on the rear case 102.
- Electronic components attachable to the rear case 102 include a removable battery, an identification module, a memory card, and the like.
- A rear cover 103 may be detachably coupled to the rear case 102 to cover the mounted electronic components. Therefore, when the rear cover 103 is separated from the rear case 102, the electronic components mounted on the rear case 102 are exposed to the outside.
- When the rear cover 103 is coupled to the rear case 102, a portion of the side surface of the rear case 102 may be exposed. In some cases, the rear case 102 may be completely covered by the rear cover 103 when coupled. Meanwhile, the rear cover 103 may be provided with an opening for exposing the camera 121b or the sound output unit 152b to the outside.
- These cases 101, 102, and 103 may be formed by injection-molding a synthetic resin, or may be formed of a metal such as stainless steel (STS), aluminum (Al), or titanium (Ti).
- the mobile terminal 100 may be configured such that one case may provide the internal space, unlike the above example in which a plurality of cases provide an internal space for accommodating various electronic components.
- In this case, a unibody mobile terminal 100 in which a synthetic resin or metal extends from the side surface to the rear surface may be implemented.
- the mobile terminal 100 may be provided with a waterproof portion (not shown) to prevent water from seeping into the terminal body.
- The waterproof portion may include a waterproof member provided between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, to seal the internal space when these parts are coupled.
- The mobile terminal 100 may be provided with a display unit 151, first and second sound output units 152a and 152b, a proximity sensor 141, an illuminance sensor 142, a light output unit 154, first and second cameras 121a and 121b, first and second manipulation units 123a and 123b, a microphone 122, an interface unit 160, and the like.
- Hereinafter, a mobile terminal 100 will be described as an example in which the display unit 151, the first sound output unit 152a, the proximity sensor 141, the illuminance sensor 142, the light output unit 154, the first camera 121a, and the first manipulation unit 123a are disposed on the front surface of the terminal body, the second manipulation unit 123b, the microphone 122, and the interface unit 160 are disposed on the side surface of the terminal body, and the second sound output unit 152b and the second camera 121b are disposed on the rear surface of the terminal body.
- For example, the first manipulation unit 123a may not be provided on the front surface of the terminal body, and the second sound output unit 152b may be provided on the side surface of the terminal body instead of the rear surface.
- the display unit 151 displays (outputs) information processed by the mobile terminal 100.
- the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information according to the execution screen information. .
- The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
- two or more display units 151 may exist according to an implementation form of the mobile terminal 100.
- the plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces.
- the display unit 151 may include a touch sensor that senses a touch on the display unit 151 so as to receive a control command by a touch method.
- the touch sensor may sense the touch, and the controller 180 may generate a control command corresponding to the touch based on the touch sensor.
- the content input by the touch method may be letters or numbers or menu items that can be indicated or designated in various modes.
- The touch sensor may be formed as a film having a touch pattern and disposed between the window 151a and a display (not shown) on the rear surface of the window 151a, or may be directly patterned on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display; for example, it may be disposed on a substrate of the display or provided inside the display.
- the display unit 151 may form a touch screen together with the touch sensor.
- the touch screen may function as the user input unit 123 (see FIG. 1A).
- the touch screen may replace at least some functions of the first manipulation unit 123a.
- The first sound output unit 152a may be implemented as a receiver that delivers a call sound to the user's ear, and the second sound output unit 152b may be implemented in the form of a loudspeaker that outputs various alarm sounds or multimedia playback sounds.
- a sound hole for emitting sound generated from the first sound output unit 152a may be formed in the window 151a of the display unit 151.
- the present invention is not limited thereto, and the sound may be configured to be emitted along an assembly gap between the structures (for example, a gap between the window 151a and the front case 101).
- In this case, the hole formed independently for sound output may be invisible or hidden in appearance, further simplifying the external appearance of the mobile terminal 100.
- the light output unit 154 is configured to output light for notifying when an event occurs. Examples of the event may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
- When the mobile terminal detects that the user has checked the event, the controller 180 may control the light output unit 154 to end the output of light.
- the first camera 121a processes an image frame of a still image or a moving image obtained by the image sensor in a shooting mode or a video call mode.
- the processed image frame may be displayed on the display unit 151 and stored in the memory 170.
- The first and second manipulation units 123a and 123b are examples of the user input unit 123 that is manipulated to receive commands for controlling the operation of the mobile terminal 100, and may be collectively referred to as a manipulating portion.
- The first and second manipulation units 123a and 123b may adopt any tactile manner in which the user manipulates them while receiving a tactile feeling, such as touch, push, and scroll.
- The first and second manipulation units 123a and 123b may also adopt a manner in which they are manipulated without a tactile feeling by the user, through a proximity touch, a hovering touch, or the like.
- the first operation unit 123a is illustrated as being a touch key, but the present invention is not limited thereto.
- the first manipulation unit 123a may be a mechanical key or a combination of a touch key and a push key.
- the contents input by the first and second manipulation units 123a and 123b may be variously set.
- For example, the first manipulation unit 123a may receive commands such as menu, home key, cancel, and search, and the second manipulation unit 123b may receive commands such as adjusting the volume of the sound output from the first or second sound output units 152a and 152b and switching the display unit 151 to a touch recognition mode.
- a rear input unit (not shown) may be provided on the rear surface of the terminal body.
- the rear input unit is manipulated to receive a command for controlling the operation of the mobile terminal 100, and the input contents may be variously set. For example, commands such as power on/off, start, end, and scroll, adjusting the volume of the sound output from the first and second sound output units 152a and 152b, and switching to the touch recognition mode of the display unit 151 may be received.
- the rear input unit may be implemented in a form capable of input by touch input, push input, or a combination thereof.
- the rear input unit may be disposed to overlap the front display unit 151 in the thickness direction of the terminal body.
- the rear input unit may be disposed at the rear upper end of the terminal body so that the user can easily manipulate it with the index finger when gripping the terminal body with one hand.
- the present invention is not necessarily limited thereto, and the position of the rear input unit may be changed.
- when the rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the same may be implemented.
- when the touch screen or the rear input unit described above replaces at least some functions of the first manipulation unit 123a provided on the front of the terminal body, the first manipulation unit 123a may not be disposed on the front of the terminal body. In this case, the display unit 151 may be configured with a larger screen.
- the mobile terminal 100 may be provided with a fingerprint recognition sensor for recognizing a user's fingerprint, and the controller 180 may use fingerprint information detected through the fingerprint recognition sensor as an authentication means.
- the fingerprint recognition sensor may be embedded in the display unit 151 or the user input unit 123.
- the microphone 122 is configured to receive a user's voice, other sounds, and the like.
- the microphone 122 may be provided at a plurality of locations and configured to receive stereo sound.
- the interface unit 160 serves as a path for connecting the mobile terminal 100 to an external device.
- the interface unit 160 may be at least one of a connection terminal for connection with another device (for example, an earphone or an external speaker), a port for short-range communication (for example, an infrared (IrDA) port, a Bluetooth port, or a wireless LAN port), or a power supply terminal for supplying power to the mobile terminal 100.
- the interface unit 160 may be implemented in the form of a socket for receiving an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for storing information.
- the second camera 121b may be disposed on the rear surface of the terminal body. In this case, the second camera 121b has a photographing direction substantially opposite to that of the first camera 121a.
- the second camera 121b may include a plurality of lenses arranged along at least one line.
- the plurality of lenses may be arranged in a matrix format.
- Such a camera may be referred to as an 'array camera'.
- when the second camera 121b is configured as an array camera, images may be photographed in various ways using the plurality of lenses, and images of better quality may be obtained.
- the flash 124 may be disposed adjacent to the second camera 121b.
- the flash 124 shines light toward the subject when the subject is photographed by the second camera 121b.
- the second sound output unit 152b may be additionally disposed on the terminal body.
- the second sound output unit 152b may implement a stereo function together with the first sound output unit 152a and may be used to implement a speakerphone mode during a call.
- the terminal body may be provided with at least one antenna for wireless communication.
- the antenna may be built in the terminal body or formed in the case.
- an antenna that forms part of the broadcast receiving module 111 (refer to FIG. 1A) may be configured to be pulled out from the terminal body.
- the antenna may be formed in a film type and attached to the inner side of the rear cover 103, or may be configured such that a case including a conductive material functions as an antenna.
- the terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal 100.
- the power supply unit 190 may include a battery 191 embedded in the terminal body or detachably configured from the outside of the terminal body.
- the battery 191 may be configured to receive power through a power cable connected to the interface unit 160.
- the battery 191 may be configured to enable wireless charging through a wireless charger.
- the wireless charging may be implemented by a magnetic induction method or a resonance method (magnetic resonance method).
- the rear cover 103 is coupled to the rear case 102 to cover the battery 191 to limit the detachment of the battery 191 and to protect the battery 191 from external shock and foreign matter.
- the rear cover 103 may be detachably coupled to the rear case 102.
- An accessory may be added to the mobile terminal 100 to protect the appearance or to assist or expand the function of the mobile terminal 100.
- An example of such an accessory may be a cover or pouch that covers or accommodates at least one surface of the mobile terminal 100.
- the cover or pouch may be configured to be linked with the display unit 151 to expand the function of the mobile terminal 100.
- Another example of the accessory may be a touch pen for assisting or extending a touch input to a touch screen.
- the information processed by the mobile terminal can be displayed using a flexible display.
- this will be described in more detail with reference to the accompanying drawings.
- FIG. 2 is a conceptual diagram illustrating another example of a deformable mobile terminal 200 according to the present invention.
- the display unit 251 may be configured to be deformable by an external force.
- the deformation may be at least one of bending, folding, twisting, and curling of the display unit 251.
- the deformable display unit 251 may be referred to as a 'flexible display unit'.
- the flexible display unit 251 may include both a general flexible display, an electronic paper, and a combination thereof.
- the mobile terminal 200 may include the features of the mobile terminal 100 of FIGS. 1A-1C or similar features.
- a general flexible display is a light and durable display that is fabricated on a thin and flexible substrate that can be bent, folded, twisted, or curled like paper while maintaining the characteristics of a conventional flat panel display.
- electronic paper is a display technology to which the characteristics of general ink are applied, and differs from a conventional flat panel display in that it uses reflected light.
- Electronic paper can change information using twist balls or electrophoresis using capsules.
- in the first state in which the flexible display unit 251 is not deformed, the display area of the flexible display unit 251 is flat.
- in the second state in which the flexible display unit 251 is deformed by an external force, the display area may be a curved surface.
- the information displayed in the second state may be visual information output on the curved surface.
- Such visual information is implemented by independently controlling the light emission of unit pixels (sub-pixels) arranged in a matrix form.
- the unit pixel refers to a minimum unit for implementing one color.
- the flexible display unit 251 may be placed in a bent state (eg, bent vertically or horizontally) instead of being flat in the first state. In this case, when an external force is applied to the flexible display unit 251, the flexible display unit 251 may be deformed into a flat state (or less curved state) or more curved state.
- the flexible display unit 251 may be combined with a touch sensor to implement a flexible touch screen.
- the controller 180 (refer to FIG. 1A) may perform control corresponding to the touch input.
- the flexible touch screen may be configured to detect a touch input not only in the first state but also in the second state.
- the mobile terminal 200 may be provided with deformation detection means for detecting the deformation of the flexible display unit 251.
- deformation detection means may be included in the sensing unit 140 (see FIG. 1A).
- the deformation detecting means may be provided in the flexible display unit 251 or the case 201 to sense information related to deformation of the flexible display unit 251.
- the information related to the deformation may include the direction in which the flexible display unit 251 is deformed, the degree of deformation, the deformation position, the deformation time, the acceleration at which the deformed flexible display unit 251 is restored, and various other information that can be detected due to the bending of the flexible display unit 251.
- the controller 180 may change the information displayed on the flexible display unit 251 or generate a control signal for controlling a function of the mobile terminal 200, based on the information related to the deformation of the flexible display unit 251 detected by the deformation detecting means.
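- The kinds of values the deformation detecting means reports, and the way the controller reacts to them, can be illustrated with a short sketch. The following Python snippet is purely illustrative: the class, field names, and threshold values are assumptions of this sketch and are not part of the disclosed terminal.

```python
from dataclasses import dataclass

@dataclass
class DeformationInfo:
    """Hypothetical container for the values the deformation sensor may report."""
    direction: str        # e.g. "inward" or "outward" bending
    degree_deg: float     # degree of bending, in degrees
    position_mm: float    # distance of the bend line from one edge
    duration_s: float     # how long the deformation lasted
    restore_accel: float  # acceleration with which the display springs back

def handle_deformation(info: DeformationInfo) -> str:
    """Illustrative mapping from deformation information to a control decision."""
    if info.degree_deg < 5:
        return "ignore"                    # treat tiny flexing as noise
    if info.degree_deg > 90:
        return "switch_layout_folded"      # heavily folded: change the displayed information
    return "switch_layout_bent"            # moderately bent: adjust the current screen

if __name__ == "__main__":
    sample = DeformationInfo("inward", 60.0, 40.0, 0.5, 1.2)
    print(handle_deformation(sample))      # -> "switch_layout_bent"
```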
- the mobile terminal 200 may include a case 201 for accommodating the flexible display unit 251.
- the case 201 may be configured to be deformable together with the flexible display unit 251 by an external force in consideration of characteristics of the flexible display unit 251.
- the battery (not shown) included in the mobile terminal 200 may also be configured to be deformable together with the flexible display unit 251 by an external force in consideration of characteristics of the flexible display unit 251.
- a stack and folding method in which battery cells are stacked up may be applied.
- the state deformation of the flexible display unit 251 is not limited only by external force.
- the flexible display unit 251 may be transformed into the second state by a command of a user or an application.
- the display unit 151 may further include a sensor for detecting a change in shape caused by an external force.
- the sensing unit 140 may include a sensor for detecting an area on the display unit 151 where bending occurs, and may further include a sensor for sensing the angle at which the display unit 151 is bent.
- the sensing unit 140 may further include a sensor capable of detecting that predetermined areas of the display unit 151 approach each other within a predetermined distance due to the bending. That is, when the display unit 251 is bent by an external force, the controller 180 may measure the distance between a plurality of preset points on a predetermined portion of the display unit 251, for example, the center portion or the end portions of the display unit 251, and detect the degree of bending of the display unit 251 according to the measured result.
- the controller 180 may detect any straight line that is the center of the bend based on the area where the display unit 251 is detected to be bent.
- the controller 180 may form a specific region (hereinafter referred to as a reference region) centered on the detected straight line (hereinafter referred to as a center line) based on the degree to which the display unit 251 is bent.
- the controller 180 may divide the display unit 251 into a plurality of areas including the formed reference area. For example, the controller 180 may classify areas on the display unit 251 located on both sides of the reference area into different areas (first area and second area) based on the formed reference area.
- the controller 180 may display various image information in the plurality of divided areas. For example, when the display unit 251 is in an inactive state, the controller 180 may display image information only in any one of the plurality of regions divided according to the bending, based on a user's touch input. Alternatively, the controller 180 may cause one of the plurality of regions of the display unit 251 in the active state to operate in a power saving mode or switch to the inactive state, based on the user's touch input.
- the controller 180 of the mobile terminal 200 may display at least one image information related to an operation state currently being executed in the mobile terminal 200 in the plurality of areas.
- the controller 180 may determine the sizes of the reference area and the other areas divided based on the reference area according to the bent state of the display unit 251, and may change the type or form of the image information to be displayed in each area according to the determined size of each area.
- the controller 180 of the mobile terminal 200 may control various functions executable in the mobile terminal 200 according to the angle at which the display unit 251 is bent.
- the controller 180 may change the time at which the alarm sounds based on the angle at which the display 251 is bent, or change the direction in which the image information is scrolled.
- the controller 180 may execute different functions that can be provided by the mobile terminal 200 based on the angle at which the display unit 251 is bent.
- the controller 180 of the mobile terminal 200 may utilize various curved areas, that is, reference areas, of the display unit 251.
- the controller 180 may display different image information in the different areas divided based on the reference area, and the image information displayed in those areas may also be shared between them by using the reference area.
- FIG. 3 is a flowchart illustrating an operation process of the mobile terminal 200 according to the present invention.
- FIG. 4 is a conceptual diagram showing examples in which the mobile terminal 200 according to the present invention is deformed by an external force.
- the controller 180 of the mobile terminal 200 may detect bending of the flexible display unit 251 due to an external force (S300), and the flexible display unit 251 may be deformed into various shapes as it is bent.
- the external force may mean a force for changing the shape of the flexible display unit 251, for example, a force applied from the user.
- when the external force is applied, the flexible display unit 251 may be deformed as shown in (b), (c), and (d) of FIG. 4. That is, in the mobile terminal 200 according to the embodiment of the present invention, when the external force is applied, the flexible display unit 251 may be bent in correspondence to the applied external force and, as shown in FIGS. 4(b), (c), and (d), may be folded in various forms depending on the angle of bending. In addition, when the flexible display unit 251 is folded, the mobile terminal 200 may be mounted in the forms shown in (b), (c), and (d) of FIG. 4.
- the controller 180 may divide the display unit 251 into a plurality of regions based on the area in which the bending of the display unit 251 is detected (S302). For example, the controller 180 may classify an area where the display unit 251 is bent and an area that is not bent into different areas based on the degree to which the display unit 251 is bent. An operation process of the controller 180 that divides the display unit 251 into a plurality of areas based on the degree to which the display unit 251 is bent will be described in detail with reference to FIG. 5.
- the controller 180 determines image information to be displayed in at least one of the divided areas based on the degree to which the display unit 251 is bent (S304). When the image information is determined in step S304, the controller 180 displays the determined image information on at least one of the plurality of areas of the divided display unit 251 (S306).
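- Steps S300 to S306 can be read as a simple pipeline: detect the bending, divide the display around the center line, choose image information per area, then render it. The sketch below mirrors that pipeline; the function names, pixel values, and the 45-degree threshold are hypothetical and used only for illustration.

```python
def detect_bending() -> float:
    """S300: return the degree of bending reported by the sensing unit
    (stubbed here; 0 means flat, larger values mean a sharper fold)."""
    return 60.0

def divide_display(width_px: int, center_px: int, ref_half_px: int) -> dict:
    """S302: split the display into first / reference / second areas around the center line."""
    return {
        "first":     (0, center_px - ref_half_px),
        "reference": (center_px - ref_half_px, center_px + ref_half_px),
        "second":    (center_px + ref_half_px, width_px),
    }

def choose_content(bend_deg: float) -> dict:
    """S304: pick what to show in each area; a milder bend leaves room for richer
    related information (thumbnails), a sharper one falls back to text."""
    related = "thumbnails" if bend_deg < 45 else "text summary"
    return {"first": "main screen", "reference": related, "second": "sub screen"}

def render(areas: dict, content: dict) -> None:
    """S306: display the chosen image information in each divided area (printed here)."""
    for name, span in areas.items():
        print(f"{name:9s} {span}: {content[name]}")

if __name__ == "__main__":
    bend = detect_bending()
    # The reference half-width would come from the bend sensing step (cf. FIG. 5);
    # a fixed value is used here for illustration.
    areas = divide_display(width_px=1440, center_px=720, ref_half_px=80)
    render(areas, choose_content(bend))
```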
- in addition, the controller 180 may display the image information respectively displayed on the plurality of areas in different ways. For example, the controller 180 may display information related to the image information displayed in any one of the divided areas in the other areas of the display unit 251 in different ways, based on the degree to which the display unit 251 is bent.
- the controller 180 may detect touch inputs applied by a plurality of users from each area of the divided display unit 251 and perform a plurality of different functions based on the detected touch inputs. In this case, the controller 180 may allow image information related to different functions corresponding to the touch inputs to be displayed in different areas on the display unit 251. According to the user's selection, the bent area on the display unit 251 may be used to display image information displayed in any one of the above areas in another area.
- the controller 180 may execute at least one function executable in the mobile terminal 200 based on the degree to which the display unit 251 is bent.
- the image information related to the executed function may be displayed on at least one of a plurality of areas on the divided display unit 251.
- the controller 180 may control an operation state currently executed in the mobile terminal 200 to be changed based on the degree of bending, and information related to the changed operation state may be divided into areas of the display unit 251. It may be displayed on at least one of.
- FIG. 5 illustrates an operation process of dividing the display unit 251 into a plurality of areas among the processes shown in FIG. 3.
- the controller 180 of the mobile terminal 200 may detect the bending of the display unit 251 due to an external force, in step S300.
- the controller 180 may detect a center line that is the center of the bend based on the area where the bend is detected (S500).
- for example, the controller 180 may detect an arbitrary straight line that becomes the center of the bend, and may recognize the detected arbitrary straight line 400 as the center at which the display unit 251 is bent.
- the controller 180 can detect the degree to which the display unit 251 is bent (S502). For example, the controller 180 may detect that the distance between the corners of both ends of the display unit 251 is closer due to the bending of the display unit 251.
- the sensing unit 140 may be provided with a sensor that can detect the distance between the corners of both ends of the display unit 251.
- both end edges of the display unit 251 may be provided with a magnetic flux sensor capable of detecting a change in magnetic flux, an ultrasonic sensor or an infrared sensor capable of measuring a distance between the both edges.
- the sensing unit 140 may include a bending sensor to determine the degree of bending, and the bending sensor may include a pressure sensor and a strain gauge.
- the controller 180 can detect the degree of bending of the display unit 251.
- the controller 180 may form an area corresponding to the degree of bending of the display unit 251 around the center line 400 (S504). Accordingly, the size of the region 410 formed on the display unit 251 may vary according to the bent state of the display unit 251.
- then, the controller 180 may divide the display unit 251 into a plurality of areas with reference to the formed region 410.
- the controller 180 may classify areas around the formed area 410 into different areas based on the formed area 410.
- for example, as shown in FIGS. 4(b), 4(c), and 4(d), the controller 180 may divide the display unit 251 into the region 410 formed around the center line 400 and the regions 402 and 404 on both sides of the region 410.
- the controller 180 may display different image information in the areas 400, 402, and 404 based on the bent state of the display unit 251 in step S304.
- the area 410 corresponding to the degree of bending of the display unit 251 will be referred to as a 'reference area'.
- the controller 180 may determine image information to be displayed in at least one of the plurality of areas in operation S304. If the display unit 251 is in an inactive state, the controller 180 may activate only one of the plurality of regions or, conversely, deactivate only one of the plurality of regions of the activated display unit 251.
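- One way to picture the sensing and region-forming steps (S500 to S504) is to estimate the fold angle from the measured distance between the two end edges and to size the reference area from it. The snippet below is only a geometric sketch under the simplifying assumption that the panel folds as two flat halves about the center line; the linear sizing rule and all names are assumptions, not the disclosed implementation.

```python
import math

def fold_angle_deg(panel_len_mm: float, edge_gap_mm: float) -> float:
    """Estimate the angle between the two halves of the panel (180 = flat,
    0 = fully folded) from the measured edge-to-edge distance, assuming the
    panel folds as two flat halves about the center line."""
    ratio = max(0.0, min(1.0, edge_gap_mm / panel_len_mm))
    return math.degrees(2 * math.asin(ratio))

def reference_half_width_mm(fold_deg: float, max_half_width_mm: float = 12.0) -> float:
    """Width of the reference area on each side of the center line: narrower
    when the fold is sharp, wider when the panel is only gently bent, mirroring
    the description that the reference area is smallest when the display is
    bent most severely (assumed linear rule)."""
    return max_half_width_mm * max(0.0, min(1.0, fold_deg / 180.0))

if __name__ == "__main__":
    angle = fold_angle_deg(panel_len_mm=140.0, edge_gap_mm=100.0)
    print(f"fold angle ~ {angle:.1f} deg, "
          f"reference half-width ~ {reference_half_width_mm(angle):.1f} mm")
```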
- FIG. 6 is a flowchart illustrating an operation of converting any one of a plurality of divided areas of a display unit in an inactive state into an active state in a mobile terminal according to the present invention.
- in order to detect a touch input applied to the display unit 251 even when the display unit 251 is in an inactive state, the mobile terminal 200 according to an exemplary embodiment of the present invention may operate in a specific mode that consumes a minimum current or power. This specific mode may be called a 'doze mode'.
- for example, in a touch screen structure in which the touch sensor and the display unit 251 form a mutual layer structure, the doze mode may be a state in which only the light emitting device for outputting a screen on the display unit 251 is turned off while the touch sensor maintains an on state.
- alternatively, the doze mode may be a mode in which the display unit 251 is turned off and at least one of a touch sensor and an acceleration sensor for sensing the touch input is turned on.
- accordingly, when the user applies a touch input to the display unit 251 in the doze mode, the touch input may be detected through at least one of the touch sensor or the acceleration sensor that is turned on.
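- A doze-like state can be modeled as keeping only the wake-up sensing path powered while the panel is dark. The toy state machine below (hypothetical names, not the terminal's firmware) shows the idea: in the doze mode the display is off, the touch sensor stays on, and a detected touch is what triggers the activation logic.

```python
class DozeController:
    """Toy model of a doze mode: screen off, touch sensing kept alive."""

    def __init__(self) -> None:
        self.display_on = True
        self.touch_sensor_on = True

    def enter_doze(self) -> None:
        # Only the light-emitting part of the panel is switched off;
        # the touch sensor keeps drawing a minimal current.
        self.display_on = False
        self.touch_sensor_on = True

    def on_touch(self, region: str) -> str:
        if not self.touch_sensor_on:
            return "ignored"
        if not self.display_on:
            return f"wake request from {region}"   # forwarded to the activation logic
        return f"normal touch in {region}"

if __name__ == "__main__":
    dc = DozeController()
    dc.enter_doze()
    print(dc.on_touch("second_area"))   # -> "wake request from second_area"
```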
- in this state, the controller 180 may detect a user's touch input applied to any one of the plurality of areas of the display unit 251 which are in the inactive state (S600). In addition, the controller 180 may switch only the one of the plurality of areas corresponding to the user's touch input to an active state (S602). In addition, the controller 180 may display image information corresponding to an activated state of the mobile terminal 200, that is, image information corresponding to a standby screen or a preset operation state, in the activated region (S604).
- here, the controller 180 may activate only the one region in which the user's touch input is sensed among the plurality of regions; unlike this, the controller 180 may of course activate only a specific region determined based on the user's touch inputs detected in two or more of the plurality of regions. For example, when the user's touch input is detected in a preset one of the plurality of regions, the controller 180 recognizes the touch input as a request to activate the display unit 251 in the inactive state, and may activate only one of the plurality of divided regions according to another touch input applied subsequent to that touch input.
- the preset area may be the reference area 410, and the areas 402 and 404 divided into different areas based on the reference area 410 may be image information according to a user's selection. It can be an area displaying. In this case, when there is a user's touch input applied to the reference area 410, the controller 180 based on a drag input applied to any one of the other areas 402 and 404 after the touch input. Only one of the areas 402 and 404 may be activated, and image information may be displayed accordingly.
- the controller 180 may select an area to display image information by a method different from the above-described method. For example, when the user's touch with respect to the reference area 410 is detected, the controller 180 is based on the number of other touch inputs applied after the touch input or the area of the area where the other touch input is detected. It may also determine the area to be activated.
- meanwhile, the user's touch input applied to the reference area 410 may be the user's hand being placed on the reference area 410. For example, the user's palm may be in close contact with the reference area 410, and the controller 180 may recognize this as a user's touch input to the reference area 410.
- in this case, according to the direction in which the hand is placed, the user may apply different touch inputs, with the thumb and with the other fingers except the thumb, to the different areas 402 and 404 separated by the reference area 410. For example, when a touch input is applied to the first area 402 with the thumb and touch inputs are applied to the second area 404 with the other fingers, touch inputs differing in the number of touches or in the applied area may be applied to the first area 402 and the second area 404.
- accordingly, the controller 180 may activate one of the divided regions 402 and 404 based on the number of touch inputs or the area over which the touch inputs are applied. For example, the controller 180 may select the area to which a smaller number of touch inputs is applied, or the area to which a touch input of a smaller area is applied, and activate that area. In this case, the area 402 to which the user applied a touch input using the thumb may be determined as the area selected by the user, and the selected area may be activated.
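- The selection rule just described, preferring the region that received fewer touches or a smaller touched area (typically the thumb side facing the user), can be written compactly. The sketch below assumes each region reports a touch count and a total contact area; the data layout and names are illustrative only.

```python
def pick_region_to_activate(touches: dict) -> str:
    """Pick the region with the fewest touch points, breaking ties by the
    smaller total contact area (assumed to be the thumb side facing the user).

    `touches` maps a region name to (touch_count, contact_area_mm2).
    """
    return min(touches, key=lambda r: (touches[r][0], touches[r][1]))

if __name__ == "__main__":
    # Thumb on the first area, four fingers on the second area.
    observed = {"first_area": (1, 110.0), "second_area": (4, 380.0)}
    print(pick_region_to_activate(observed))   # -> "first_area"
```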
- meanwhile, when the mobile terminal 200 enters the locked state, the mobile terminal 200 according to an embodiment of the present disclosure may switch the display unit 251 to an inactive state.
- in this case, the controller 180 may release the locked state and switch the display unit 251 to an active state when the user's operation pattern formed by a plurality of taps tapping the display unit 251 matches a preset unlocking operation pattern.
- here, when the current display unit 251 is bent by an external force, the mobile terminal 200 according to the embodiment of the present invention divides the display unit 251 into a plurality of areas according to the bent state, and the plurality of taps may be input through any one of the areas of the divided display unit 251.
- the controller 180 may determine an area to be activated based on an area on the display unit 251 to which the plurality of tabs are applied.
- for example, the touch input detected in the second region 404 may be recognized as the plurality of taps.
- the controller 180 may determine whether an operation pattern formed by the plurality of tabs corresponds to a preset pattern for unlocking the mobile terminal 200. If the patterns match, the controller 180 may switch any one of the first region 402 or the second region 404 to an active state.
- the controller 180 may also determine the area to be activated based on the area in which the plurality of taps are detected. For example, the controller 180 may activate the second area 404, in which the plurality of taps are detected, or may instead activate the first area 402, in which the taps are not detected.
- however, when the controller 180 selects the area to be activated according to the detection result of the plurality of taps, it may be more desirable to select an area to which a touch input of a smaller area is applied, an area to which a smaller number of touch inputs are applied, or another area to which none of the plurality of taps is applied.
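- Matching the tap pattern against a stored unlocking pattern and then choosing which divided region to wake can be combined in a few lines. The sketch below compares tap positions quantized to a coarse grid, which is a hypothetical matching rule (the terminal's actual pattern matching is not specified here), and then wakes the region that was not tapped.

```python
from typing import List, Optional, Tuple

def quantize(taps: List[Tuple[float, float]], grid: int = 4) -> List[Tuple[int, int]]:
    """Map normalized (x, y) tap positions onto a coarse grid-cell sequence."""
    return [(int(x * grid), int(y * grid)) for x, y in taps]

def unlock_and_pick_area(taps, stored_pattern, tapped_region: str) -> Optional[str]:
    """Release the lock if the tap sequence matches the stored pattern, then
    activate the region that was *not* tapped (assumed to face the user)."""
    if quantize(taps) != quantize(stored_pattern):
        return None                                   # pattern mismatch: stay locked
    return "first_area" if tapped_region == "second_area" else "second_area"

if __name__ == "__main__":
    stored = [(0.2, 0.2), (0.8, 0.2), (0.8, 0.8), (0.2, 0.8)]
    entered = [(0.21, 0.19), (0.79, 0.22), (0.81, 0.78), (0.18, 0.82)]
    print(unlock_and_pick_area(entered, stored, tapped_region="second_area"))  # -> first_area
```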
- meanwhile, through processes similar to those shown in FIG. 6, the controller 180 may switch only one of the activated regions to an inactive state. For example, when there is a user's touch input to the reference area 410, the controller 180 may recognize the touch input as a request to deactivate any one of the currently activated areas of the display unit 251. In addition, the controller 180 may convert only one of the activated areas into an inactive state based on another touch input (for example, a drag input) applied after the touch input.
- alternatively, the controller 180 may switch the corresponding area to a power saving mode state.
- here, the power saving mode refers to an operation mode driven with minimum power; it may be an operation mode driven at a brightness lower than a preset level, or a state of operating in a preset operation mode, such as an operation state in which a clock application or a calendar application is driven.
- meanwhile, it has been mentioned that in the mobile terminal 200 according to an exemplary embodiment of the present disclosure, different image information may be displayed on each of the divided regions based on the bent state of the display unit 251, or the image information displayed in the areas may be changed based on the degree to which the display unit 251 is bent. In this case, the controller 180 may detect whether the bent state of the display unit 251 is changed and change the image information accordingly.
- FIG. 7 illustrates in more detail an operation process of changing image information displayed on a plurality of divided display unit areas when the bent state of the display unit is changed.
- the controller 180 of the mobile terminal 200 may provide different image information related to an operation state of the current mobile terminal 200 according to the bent state of the display unit 251.
- for example, the controller 180 may extract information related to the image information displayed in any one of the areas divided according to the bending of the display unit 251, and may display the extracted information in the other areas, including the reference area.
- the controller 180 may set the regions other than the reference region among the divided regions as a main area and a sub area according to a user's selection.
- the controller 180 may display image information corresponding to an operation state currently being executed in the mobile terminal 200 in an area set as a main area among the divided areas, and in an area set as a reference area and a sub area. Additional information related to image information currently displayed in the main area may be displayed.
- the controller 180 may display contents of the electronic newspaper corresponding to a web address currently set in the main area.
- thumbnail information of various multimedia contents included in the electronic newspaper may be displayed in the reference area.
- the sub-area may include contents related to specific multimedia contents corresponding to a user's selection among multimedia contents included in the electronic newspaper.
- alternatively, when multimedia content is played, image information corresponding to the played multimedia content may be displayed in the main area, and a control menu for controlling the reproduction of the multimedia content may be displayed in the reference area.
- a list of other contents included in a folder in which the multimedia content is stored may be displayed in the sub area.
- the image information displayed in the divided areas may vary according to the degree to which the display unit 251 is bent.
- the controller 180 may display different types, display directions, or forms of image information displayed in each of the divided regions according to the degree of bending of the display unit 251.
- for example, image information related to one operation state may be displayed in both the first area 402 and the second area 404, and information currently related to the first area 402 and the second area 404 may be displayed in the reference area 410. In this case, the image information displayed in the first area 402 and the second area 404 may be displayed, and scrolled, in one direction (for example, the direction from the first area 402 toward the second area 404).
- this is because, when the display unit 251 is not severely bent, the user can check both the first region 402 and the second region 404 at once.
- alternatively, the controller 180 may allow information related to the image information displayed in any one of the first region 402 and the second region 404 to be displayed in the other regions.
- for example, the controller 180 may display information related to the image information displayed in the main area in the reference area 410 and the second area 404.
- in this case, the image information in the first area 402 and the second area 404 may be displayed with the direction in which the reference area 410 is located as the upward direction, that is, in directions facing each other, and may be scrolled accordingly in each area.
- alternatively, the controller 180 may display image information only in either the first area 402 or the second area 404. In this case, the controller 180 may display additional information related to that image information in the reference area 410.
- here, thresholds such as a first level and a second level, and the bent states illustrated in FIGS. 4B, 4C, and 4D, are merely for explaining the operation according to the bending of the display unit 251 for convenience of description, and there may be fewer or more thresholds related to the bending of the display unit 251.
- These thresholds related to the degree to which the display unit 251 is bent are used by the controller 180 to determine image information displayed in the first area 402, the second area 404, and the reference area 410. Or may be used to determine the size of the reference area 410.
- meanwhile, the controller 180 detects whether the bent state of the display unit 251 is changed (S702). If the bent state of the display unit 251 is not changed as a result of the sensing of step S702, the current image information is maintained in the areas 402, 404, and 410.
- however, if the bent state of the display unit 251 is changed, the controller 180 detects the degree to which the bent state of the display unit 251 is changed and, based on the detected result, changes the size of the region formed based on the center line of the bending, that is, the reference region (S704).
- the controller 180 may change image information displayed in each area based on the changed areas, that is, the sizes of the reference area 410, the first area 402, and the second area 404 ( S706).
- for example, when the size of the reference area 410 becomes larger, the controller 180 may display a control menu including more control functions in the reference area 410, or may display, in the form of thumbnail information, other information related to the image information currently displayed in the main area (for example, the first area 402).
- on the contrary, when the size of the reference area 410 becomes smaller, a control menu including fewer control functions may be displayed in the reference area 410, and the related information, which was displayed in the form of thumbnail images, may be displayed more briefly in the form of text information or tag information.
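- In effect, the above describes a lookup from the current degree of bending to a presentation style for the reference area: a larger reference area receives a fuller control menu and thumbnails, a smaller one receives text or tags. A minimal sketch of such a mapping, with invented thresholds, could look like this:

```python
def reference_area_style(bend_deg: float) -> dict:
    """Choose how richly to render the reference area for a given bend.

    bend_deg: 0 means flat, larger means a sharper fold. The thresholds are
    invented for illustration: a flatter panel leaves a wider reference strip,
    so richer content fits; a sharper fold leaves less room.
    """
    if bend_deg < 45:                         # nearly flat: wide reference strip
        return {"menu_items": 6, "related_info": "thumbnails"}
    if bend_deg < 90:                         # moderate fold
        return {"menu_items": 4, "related_info": "text"}
    return {"menu_items": 2, "related_info": "tags"}  # sharply folded: minimal strip

if __name__ == "__main__":
    for angle in (30, 70, 120):
        print(angle, reference_area_style(angle))
```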
- meanwhile, in the above description, any one of the first area 402 and the second area 404 is set as the main area and the remaining area is set as the sub area; however, image information related to different functions of the mobile terminal 200 may of course be displayed in the first area 402 and the second area 404.
- the controller 180 may detect a touch input applied to the first region 402 and the second region 404, respectively, and perform a function corresponding to the touch input. Accordingly, the controller 180 may display image information related to different functions in the first region 402 and the second region 404, respectively.
- in this case, the reference area 410 may be utilized in various other forms.
- for example, the controller 180 may perform functions according to touch inputs simultaneously applied by a plurality of users, and image information according to the function selected by each of the plurality of users may be displayed in the first area 402 and the second area 404, respectively.
- the plurality of users may use the bent area of the display unit 251, that is, the reference area 410 as an area for sharing information with each other.
- for example, the controller 180 may display, in the reference area 410, information related to the information displayed in any one of the first area 402 and the second area 404 according to the selection of any one of the plurality of users, and information corresponding to that related information may be displayed in another area according to a user's selection.
- the controller 180 may change image information displayed in each area based on the bent state of the display unit 251. For example, when the degree of bending of the display unit 251 is greater than or equal to a certain level, the controller 180 may display information related to information displayed in any one of the first area 402 and the second area 404 as text. Can be displayed in format or tag format. However, if the degree of bending of the display unit 251 is less than a predetermined level, the controller 180 may display information related to the information displayed in any one of the first area 402 and the second area 404 with a thumbnail image. It may be displayed in the form of the same image.
- FIG. 8 illustrates the operation process of the mobile terminal 200 according to an embodiment of the present invention when, in step S700, any one of the areas of the divided display unit 251 is set as a main area and additional information related to the image information displayed in the main area is displayed in the other areas.
- in the following description, it is assumed that the main region is the first region 402 of FIG. 4, the sub region is the second region 404 of FIG. 4, and the reference region is the region 410 formed based on the center line detected from the bending of the display unit 251.
- the controller 180 may extract related information based on the image information displayed in the main area among the currently divided areas (S800). For example, when the current operation state is a state of displaying a web page including multimedia content, the controller 180 may display the overall contents of the web page in the main area 402 and, in operation S800, extract the multimedia contents included in the web page as the related information.
- the controller 180 may display the extracted information on the reference area 410 and the sub area 404 based on the bent state of the display unit 251 (S802). If the degree of bending of the display unit 251 is less than or equal to a predetermined level, the controller 180 may generate thumbnail images corresponding to the extracted information and display the generated thumbnail images on the reference area 410.
- the controller 180 detects whether there is a user's touch input to the reference area 410 (S804). As a result of the detection of step S804, when there is a user's touch input on any one of the related information displayed on the reference area 410, image information corresponding thereto may be displayed on the sub area 404 (S806). For example, when the user selects one of the thumbnail images displayed in the reference area 410, multimedia content corresponding to the selected thumbnail image may be played in the sub area 404.
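- Steps S800 to S806 amount to: extract the media embedded in the page shown in the main area, render it in the reference area in a bend-dependent form, and route a selection to the sub area. A rough sketch follows; simple dictionaries stand in for the web page and the renderer, and all names and thresholds are assumptions of this sketch.

```python
def extract_related(web_page: dict) -> list:
    """S800: collect the multimedia items embedded in the page shown in the main area."""
    return web_page.get("media", [])

def render_reference_area(items: list, bend_deg: float) -> list:
    """S802: show the extracted items as thumbnails when the fold is mild,
    or as short text labels when the fold is sharp (assumed rule)."""
    style = "thumb" if bend_deg < 60 else "label"
    return [f"{style}:{item['title']}" for item in items]

def on_reference_touch(items: list, index: int) -> str:
    """S804/S806: a touch on one entry plays the matching content in the sub area."""
    return f"play '{items[index]['title']}' in sub area"

if __name__ == "__main__":
    page = {"url": "http://example.com/news",
            "media": [{"title": "clip A"}, {"title": "photo B"}]}
    related = extract_related(page)
    print(render_reference_area(related, bend_deg=40))
    print(on_reference_touch(related, 0))
```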
- the controller 180 may display information related to the web page corresponding to the link information in the reference area 410 as text information or as an image such as a thumbnail image, based on the degree to which the display unit 251 is bent.
- the controller 180 may move to a web page corresponding to the link information and display a screen of the web page in the main area 402.
- the controller 180 may extract related information from the changed web page, that is, information such as multimedia content or information of other linked web pages, and display the related information on the reference area 410 as related information.
- the controller 180 may use history information as information related to a web page currently displayed in the main area 402.
- the history information may be information about web pages that the user has visited in the past when surfing the web.
- when the user selects any one of the history information, the controller 180 may move to the web page corresponding to the selected history information and display the screen of that web page in the main area 402.
- meanwhile, the related information extracted in step S800 may be contents previously stored in the memory 170, or information about contents stored in a preset external server (for example, a cloud server).
- in this case, the controller 180 may extract a list of the contents stored in the memory 170 or the external server and display it on the reference area 410 and the sub area 404 according to the bent state of the display unit 251.
- the controller 180 may display different related information in the sub area 404 and the reference area 410. For example, when the extracted contents have an album form including a plurality of other multimedia contents, the controller 180 displays information about the albums in the sub area 404, and among the albums. The list of contents included in the album selected by the user may be displayed in the reference area.
- for example, the controller 180 may display thumbnail images of the video contents in the sub area 404 as related information. Also, the video content corresponding to the thumbnail image selected by the user from among the thumbnail images displayed in the sub area 404 may be played back and displayed in the main area 402, and related information about the currently played video content may be displayed in the reference area 410.
- the related information displayed in the reference area 410 may take various forms.
- for example, it may be still-cut images of the video content currently being played, or a graphic object in the form of a timeline separated by predetermined time intervals.
- in this case, the controller 180 may determine the form in which the related information is to be displayed based on the state in which the display unit 251 is bent, and display the related information as the still-cut images or as a timeline-type graphic object according to the determined form.
- the controller 180 may display information on the extracted contents in the sub area 404 and a control menu for controlling multimedia content currently played in the main area in the reference area 410.
- the controller 180 may allow a control menu including different control functions to be displayed on the reference area 410 based on the degree to which the display unit 251 is bent.
- meanwhile, it has been mentioned that the controller 180 may display image information related to different functions in the divided areas, and that the image information displayed in the different areas may also be shared between those areas by using the reference area. In the following description, for convenience, a case in which information is shared between a first user using the first area 402 and a second user using the second area 404 is taken as an example.
- FIG. 9 illustrates the operation of the mobile terminal 200 according to an embodiment of the present invention in this case in more detail.
- a reference region may be formed around a center line, and the display unit 251 may be divided into the first region 402 and the second region 404 based on the reference region.
- the controller 180 can display image information corresponding to different functions that can be executed in the mobile terminal 200 in the first area 402 and the second area 404, respectively.
- in this state, when the first user selects the first area 402 among the divided areas, the controller 180 may generate sharing information from the image information displayed in the first area 402 (S900).
- here, the sharing information may be generated in various ways. For example, the sharing information may be a still image displayed in the first area 402, or may include address information of the memory 170 or of an external server in which the still image is stored. Alternatively, when the image information displayed in the first area 402 is a web page or text, the sharing information may be generated including the address of the web page or the text information.
- then, the controller 180 may display the generated sharing information in the reference area 410 (S902). For example, based on the bent state of the display unit 251, the controller 180 may display the sharing information generated in step S900 in the form of a thumbnail image, or in the form of text information or tag information.
- the controller 180 may detect a touch input of the second user with respect to the sharing information (S904).
- the image information displayed in the first area 402 may be displayed on the second area 404 by using sharing information corresponding to the touch input.
- the same image information may be displayed in the first region 402 and the second region 404, and thus the first and second users may share the same image information.
- video content may also be shared among a plurality of users in the form of such sharing information.
- for example, the controller 180 may generate sharing information corresponding to the video content displayed in the first area 402, based on a touch input of the first user.
- the sharing information may be video information corresponding to a predetermined time before and after the time when the touch input of the first user is detected, among the video content reproduced in the first area 402. That is, when the predetermined time is 10 seconds, the controller 180 may generate, as the sharing information, the portion of the video content played in the first area 402 from 5 seconds before to 5 seconds after the time the first user's touch input is detected.
- the sharing information may be information obtained by recording a portion corresponding to a predetermined time before and after the touch input is detected, or may include address information of the video content played in the first region 402 and the video content. It may be information about a playback start point and a playback end point.
- then, when the second user selects sharing of the sharing information, for example, when the second user drags the sharing information displayed in the reference area 410 to the second area 404, the video information corresponding to the sharing information, that is, the video information corresponding to the predetermined time before and after the touch input of the first user among the video content reproduced in the first area 402, may be reproduced in the second area 404.
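- Generating the video sharing information described above can be sketched as recording a playback window around the first user's touch and replaying that window when the sharing information is dragged to the other area. The snippet below uses the ±5-second window of the example; the data structure and function names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class VideoShareInfo:
    """Sharing information for a video: source address plus a playback window."""
    source_address: str
    start_s: float
    end_s: float

def make_share_info(source: str, touch_time_s: float,
                    duration_s: float, window_s: float = 10.0) -> VideoShareInfo:
    """Build sharing information covering window_s seconds centered on the touch
    time (5 s before and 5 s after, in the example), clipped to the video length."""
    half = window_s / 2.0
    return VideoShareInfo(source,
                          max(0.0, touch_time_s - half),
                          min(duration_s, touch_time_s + half))

def play_shared(info: VideoShareInfo) -> str:
    """What the second area would do when the sharing information is dragged onto it."""
    return f"play {info.source_address} from {info.start_s:.1f}s to {info.end_s:.1f}s"

if __name__ == "__main__":
    share = make_share_info("content://videos/42", touch_time_s=63.0, duration_s=300.0)
    print(play_shared(share))   # -> play content://videos/42 from 58.0s to 68.0s
```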
- the mobile terminal 200 may use each divided area in various forms as the display unit 251 is bent.
- the controller 180 may perform different functions based on the degree to which the display unit 251 is bent.
- in addition, when the degree of bending of the display unit 251 is changed, the controller 180 may control the operation state currently being executed to be changed based on that change.
- FIG. 10 illustrates an example in which a specific area is activated according to a user's selection among a plurality of areas divided as the display unit is bent in the mobile terminal according to the present invention.
- the mobile terminal 200 may be bent as shown in FIG. 10A by an external force while the display unit 251 is inactive.
- the controller 180 may include an area formed based on a center line of bending, a reference area 1000, and two other areas divided based on the reference area 1000 (first area: 1002, second area: 1004). ) Can be separated.
- the controller 180 may activate at least one of the plurality of areas 1002 and 1004 based on a user's touch input. For example, when a touch input is detected in the reference area 1000, the controller 180 may determine the area of the display unit 251 to be activated based on the number and/or area of the touch inputs detected in the first area 1002 and the second area 1004. That is, as shown in FIG. 10B, the controller 180 may sense touch inputs applied simultaneously to the first area 1002 and the second area 1004 and, accordingly, activate any one of the first area 1002 and the second area 1004.
- in this case, it may be more preferable for the controller 180 to determine the first area 1002, which is the area where a smaller number of touch inputs is detected and/or where the touch input is applied to a smaller area, as the area to be activated by the touch input. This is because the area of the display unit 251 corresponding to the position of the thumb 1014, that is, the first area 1002, is located in the direction facing the user.
- meanwhile, the controller 180 can release the locked state based on a plurality of taps applied to the display unit 251, and the area to be activated may also be determined based on the plurality of taps applied to release the locked state.
- for example, a touch input for releasing the locked state may be applied by the fingers 1012. In this case, when the operation pattern formed by the plurality of taps with which the fingers 1012 tap the second area 1004 coincides with a preset unlocking operation pattern, the controller 180 may release the locked state, and one of the divided regions of the display unit 251 may be switched to an active state.
- in this case, it may be more preferable for the controller 180 to determine the first area 1002, the area to which the plurality of taps are not applied, as the area to be activated by the plurality of taps. This is because the area of the display unit 251 corresponding to the position of the thumb 1014, that is, the first area 1002, is located in the direction facing the user.
- meanwhile, the area of the display unit 251 to be activated may also be selected by a drag input. That is, as shown in FIG. 10C, when the user applies a touch input 1020 to the reference area 1000 and a drag input 1022 subsequent to the touch input 1020, a specific area may be activated based on the drag input. For example, when the drag input is applied toward the first area 1002 after the user's touch input 1020 applied to the reference area 1000, the controller 180 may determine the first area 1002 as the area selected by the user. Accordingly, the controller 180 may activate only the first area 1002 among the divided areas of the display unit 251.
- FIG. 10(d) shows an example of such a case.
- meanwhile, it has been mentioned that the mobile terminal according to the embodiment of the present invention may change the image information displayed in each area, and the direction or type in which the image information is displayed, based on the bent state of the display unit 251.
- FIG. 11 illustrates an example in which image information is displayed in a plurality of areas divided based on a state where the display unit 251 is bent in the mobile terminal according to the present invention.
- FIGS. 11A, 11B, and 11C show examples of cases in which the degrees of bending of the display unit 251 are different from each other.
- FIG. 11A illustrates an example in which the display unit 251 is bent most severely, and FIG. 11C illustrates an example in which the display unit 251 is bent least severely.
- hereinafter, the case where the display unit 251 is bent most severely is referred to as a first state, the case where it is bent least is referred to as a third state, and, as in FIG. 11B, the case having a bend corresponding to an intermediate degree between the first state and the third state is referred to as a second state.
- in the first state, the reference area 1000 may be the smallest, as shown in FIG. 11A.
- the controller 180 may display only simple text information such as address information of the currently accessed web page on the display unit 251.
- in addition, the controller 180 may activate only the first region 1002 and the reference region 1000, and the second region 1004 may be kept in an inactive state.
- however, when the bent state of the display unit 251 is changed, the controller 180 may change the image information displayed in each of the divided areas based on the changed bending of the display unit 251. That is, as shown in the second state, the controller 180 may expand the size of the reference area based on the degree of bending of the display unit 251.
- in this case, the controller 180 may extract relevant information from the image information currently displayed in the main area, that is, the first area 1002, and display the extracted related information in the reference area 1000 in the form of images.
- the extracted related information may include multimedia content included in the web page displayed in the main area 1002, another web page linked to a part of the text of the web page, or other web pages visited by the user in the past (history information).
- in addition, the controller 180 may display information on any one of the extracted related information in the second area 1004. For example, as shown in FIG. 11B, the controller 180 may display, in the second area 1004, information 1120 corresponding to the item 1112 selected by the user from among the related information 1112, 1114, 1116, and 1118 displayed in the reference area 1000.
- meanwhile, the controller 180 may output the image information displayed in the first area 1002 and the second area 1004 in opposite directions with respect to the reference area 1000. That is, as shown in FIG. 11B, the controller 180 may display the image information in the first area 1002 and the image information in the second area 1004 with the direction in which the reference area 1000 is located as the upward direction.
- here, the upward direction of the image information means the direction toward which the upper portions of the graphic objects included in the image information displayed in the first area 1002 and the second area 1004, that is, the text or images, face.
- to this end, the controller 180 may reverse the output direction of the image information to be displayed in the second area 1004, that is, output it in a direction rotated by 180 degrees. Accordingly, the image information displayed in the first area 1002 and the second area 1004 may be displayed in opposite directions with respect to the reference area 1000.
- this is because, as shown in FIG. 11B, when the display unit 251 is bent and mounted at a predetermined level or more, different users can view the image information displayed in the first area 1002 and the second area 1004 from opposite sides. Therefore, so that the users positioned opposite the first area 1002 and the second area 1004 can each view image information that is not upside down, the controller 180 may display the image information with the direction in which the reference area 1000 is positioned as the upward direction, that is, display the image information of the second area 1004 in a direction opposite to the direction in which the image information is displayed in the first area 1002.
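- The direction handling can be thought of as a single rotation flag per area: in the heavily bent second state the second area is drawn rotated by 180 degrees so that the user on the opposite side sees it upright, and in the relaxed third state the rotation is removed. A toy sketch with invented state names:

```python
def area_rotation_deg(area: str, bend_state: str) -> int:
    """How far the content of an area should be rotated before display.

    'bend_state' uses the first/second/third naming from the text: only the
    second area in the sharply bent second state needs the 180-degree flip so
    that the user facing it sees the content upright.
    """
    return 180 if (area == "second_area" and bend_state == "second") else 0

if __name__ == "__main__":
    print(area_rotation_deg("second_area", "second"))  # -> 180
    print(area_rotation_deg("second_area", "third"))   # -> 0
```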
- meanwhile, in the third state, the controller 180 may display information related to one content across the first area 1002 and the second area 1004. That is, as shown in FIG. 11C, the controller 180 may display one web page across the first area 1002 and the second area 1004.
- this is because, when the display unit 251 is in the third state in which the bending of the display unit 251 is alleviated, the user can check all of the divided areas 1000, 1002, and 1004 of the mobile terminal 200 at once, and the purpose is to use the areas of the display unit 251 more efficiently. In this case, different portions of the currently accessed web page may be displayed in the first area 1002 and the second area 1004. Also in this case, as shown in FIG. 11C, information related to the image information currently displayed in the first area 1002 and the second area 1004 may be displayed in the reference area 1000.
- in addition, in the third state, the controller 180 may change the image information displayed in the second area 1004 back to the original display direction. This is because, when the bending of the display unit 251 is relaxed as in the third state, the user can check all of the divided areas 1000, 1002, and 1004 of the mobile terminal 200 at once, as described above. Accordingly, the controller 180 may display the image information in the second area 1004 in the same direction as the image information displayed in the first area 1002; that is, the image information that was rotated by 180 degrees and displayed in the second area 1004 in the second state may be rotated back by 180 degrees and output.
- meanwhile, when the user selects any one of the related information displayed in the reference area 1000, image information corresponding to the selected information may be displayed in the second area 1004. If the selected information is link information related to another web page, the controller 180 may move to the other web page and display image information accordingly.
- meanwhile, it has been mentioned that the controller 180 may extract information related to the image information displayed in the first area 1002 set as the main area, and display the extracted information in the other areas, that is, the reference area 1000 and the second area 1004. It has also been mentioned that, when the image information displayed in the main area 1002 is a web page, the extracted related information may be multimedia information included in the web page, or link information or history information set in the text of the web page.
- FIG. 12 shows examples of image information displayed in each of the bent regions of the display unit 251 in this case.
- In this case, the controller 180 may extract the relevant information from the web page.
- Accordingly, the mobile terminal 200 may display the content of the web page with its text portion and multimedia content portion separated. That is, as shown in FIG. 12A, the controller 180 may display the text portion of the web page in the main area 1002 and display any one of the multimedia contents included in the web page in the sub area 1004.
- the controller 180 may display information related to the multimedia content extracted from the web page in the form of a thumbnail image on the reference area 1000.
- the image information displayed in the sub area 1004 may be multimedia content corresponding to the image 1210 selected by the user's touch input 1200 among the thumbnail images displayed on the reference area 1000.
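- As a rough illustration of this extraction step — the page model below is invented for the sketch and is not the patent's data structure — the controller might separate a page into multimedia items and links and turn them into thumbnail entries for the reference area 1000:

```kotlin
// Hypothetical, simplified web-page model used only for this sketch.
data class MediaItem(val title: String, val url: String)
data class WebPage(val text: String, val media: List<MediaItem>, val links: List<String>)
data class Thumbnail(val label: String, val targetUrl: String)

// Builds the related information (multimedia content and links) to be shown as
// thumbnails in the reference area 1000, while the text portion goes to the
// main area 1002 and one selected media item to the sub area 1004.
fun extractRelatedInfo(page: WebPage): List<Thumbnail> =
    page.media.map { Thumbnail(it.title, it.url) } +
        page.links.map { link -> Thumbnail(label = link, targetUrl = link) }
```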
- In this case, the controller 180 may change the image information displayed in the main area 1002 and the sub area 1004. That is, as shown in FIG. 12B, when the user applies a drag input 1202 while applying the touch input 1200 to the first image 1210 displayed in the reference area 1000, a second image 1212 may be selected by the drag input 1202.
- In this case, the controller 180 may determine that the user has selected the second image 1212, and may display multimedia content corresponding to the selected second image 1212 in the sub area 1004. However, if the second image 1212 does not correspond to multimedia content included in the web page currently displayed in the main area 1002 but is related information corresponding to another web page, the controller 180 may move to the web page corresponding to the second image 1212 and display image information on that web page in the main area 1002. In this case, various related information may be extracted from the newly opened web page, and thumbnail images corresponding to the extracted information may be displayed in the reference area 1000.
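- The touch-and-drag selection of FIG. 12B could then be dispatched roughly as follows; the `belongsToCurrentPage` flag and the other names are assumptions of this sketch, standing in for whatever bookkeeping the terminal actually performs:

```kotlin
// Hypothetical description of what each region should show after the drag ends.
data class RegionUpdate(val mainAreaUrl: String, val subAreaUrl: String?)

// If the selected thumbnail belongs to the current page, only the sub area 1004 changes;
// otherwise the main area 1002 navigates to the other web page the thumbnail refers to,
// and its related information is re-extracted for the reference area.
fun onThumbnailSelected(
    targetUrl: String,
    belongsToCurrentPage: Boolean,
    currentPageUrl: String
): RegionUpdate =
    if (belongsToCurrentPage) {
        RegionUpdate(mainAreaUrl = currentPageUrl, subAreaUrl = targetUrl)
    } else {
        RegionUpdate(mainAreaUrl = targetUrl, subAreaUrl = null)
    }
```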
- Of course, according to the bent state of the display unit 251, the controller 180 may change the information displayed in the reference area 1000 so that it is displayed in the form of text (for example, when the display unit 251 is bent more severely), or may display one web page, without separating the multimedia content, across the first area 1002 and the second area 1004 (for example, when the display unit 251 is unfolded).
- Meanwhile, it has been mentioned above that the controller 180 of the mobile terminal 200 may change the image information displayed in the reference area 1000 and the sub area 1004 according to the bent state of the display unit 251. In addition, it has been mentioned that the image information displayed in the reference area 1000 and the sub area 1004 may be determined according to an operation state executed in the mobile terminal 200.
- FIGS. 13A, 13B, 13C, 13D, and 13E are diagrams illustrating examples in which, in the mobile terminal according to the present invention, different image information related to an operating state of the mobile terminal is displayed in the plurality of areas according to the bent state of the display unit.
- First, FIG. 13A shows an example in which the image information displayed in each region of the display unit 251 is changed according to the bent state of the display unit 251.
- In this case, the controller 180 may activate only the first region 1002 currently set as the main region, or may activate only the first region 1002 and the reference region 1000.
- In this case, the controller 180 may display image information on the sound source data currently reproduced in the main region 1002, and may display, in the reference region 1000, a control menu associated with the currently executed function, that is, the playback of the sound source data.
- the controller 180 may operate the sub area, that is, the second area 1004, in a power saving mode in which time information and weather information are displayed.
- Meanwhile, the controller 180 may change the size of the reference area 1000 based on the change of the bent state of the display unit 251. Accordingly, the controller 180 can display more information in the reference area 1000 as the size of the reference area 1000 increases. For example, as shown in FIG. 13A (b), the controller 180 may display a control menu including more control functions in the reference area 1000 as the size of the reference area 1000 changes. When the size of the reference area 1000 increases as described above, the image information displayed in the main area 1002 and the sub area 1004 may be changed accordingly.
- In this case, the controller 180 may change the image information displayed in the reference area 1000 and the sub area 1004 accordingly. For example, as shown in (c) of FIG. 13A, the controller 180 may display a list 1312 of the currently set sound source data in the wider reference area 1000, and may display, in the sub area 1004, album information 1310 related to the currently reproduced sound source data.
- This is because, when the display unit 251 is weakly bent as shown in (c) of FIG. 13A, the user can check at a glance the image information displayed in the first area 1002 and the second area 1004 even while the mobile terminal 200 is mounted.
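- One reading of this passage is that the number of control functions shown in the reference area 1000 scales with the area's current size, which in turn follows the bend angle. The thresholds and menu entries below are invented purely for illustration:

```kotlin
// Hypothetical mapping from the bend angle between the first and second areas
// to the height (in pixels) of the reference area 1000.
fun referenceAreaHeight(bendAngleDegrees: Int): Int = when {
    bendAngleDegrees < 60  -> 48   // sharply bent: narrow strip
    bendAngleDegrees < 120 -> 96   // moderately bent
    else                   -> 160  // weakly bent: room for a richer control menu
}

// Picks how many playback-control functions fit in the reference area.
fun controlMenuFor(heightPx: Int): List<String> {
    val all = listOf("play/pause", "next", "previous", "shuffle", "repeat", "playlist")
    val count = (heightPx / 32).coerceIn(1, all.size)
    return all.take(count)
}
```

- In practice the angle would come from the bending detection described earlier; the point of the sketch is only that the menu contents are derived from the resulting size of the reference area.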
- Next, FIG. 13B shows an example in which the image information displayed in each area of the display unit 251 is changed according to the bent state of the display unit 251 when the mobile terminal 200 performs a function of displaying stored images.
- In this case, the controller 180 may display one image selected by the user in the main area 1002, and may display a list of a plurality of images, that is, thumbnail images, in the reference area 1000.
- the controller 180 may operate the sub area, that is, the second area 1004, in a power saving mode in which time information and weather information are displayed.
- the controller 180 may change the size of the reference area 1000 based on the change of the bent state of the display unit 251.
- In this case, the controller 180 may display, in the reference area 1000, not only thumbnail images corresponding to the plurality of images but also information related to the images, for example, information 1320 about the photographing time of the images.
- When the size of the reference area 1000 increases as described above, the image information displayed in the main area 1002 and the sub area 1004 may be changed accordingly.
- the controller 180 may change image information displayed in the reference area 1000 and the sub area 1004 accordingly.
- For example, the controller 180 may display thumbnail images corresponding to more images in the wider reference area 1000, and may display, in the sub area 1004, information about albums 1322 each including a plurality of images.
- the thumbnail images displayed in the reference area 1000 may be images included in any one of the albums 1322 displayed in the sub area 1004.
- the image information displayed in the main area 1002 may be an image corresponding to any one of thumbnail images displayed in the reference area 1000 according to a user's selection.
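- Under a simple album/image model — every type and function name here is illustrative, not from the patent — the relationship between the three areas in this gallery example might be captured as:

```kotlin
// Hypothetical gallery model.
data class Album(val name: String, val images: List<String>)

data class GalleryLayout(
    val subArea: List<Album>,        // albums shown in the sub area 1004
    val referenceArea: List<String>, // thumbnails of the currently selected album
    val mainArea: String?            // the single image picked from the thumbnails
)

fun galleryLayout(albums: List<Album>, selectedAlbum: Album, selectedImage: String?): GalleryLayout =
    GalleryLayout(
        subArea = albums,
        referenceArea = selectedAlbum.images,
        mainArea = selectedImage ?: selectedAlbum.images.firstOrNull()
    )
```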
- Meanwhile, FIG. 13C shows an example in which the image information displayed in each area of the display unit 251 is changed according to the bent state of the display unit 251 when the mobile terminal 200 performs a function of playing previously stored video content.
- In this case, the controller 180 may display, in the main area 1002, a screen on which video content selected by the user is played, and may display, in the reference area 1000, a control menu including functions for controlling the playback of the video content.
- the controller 180 may operate the sub area, that is, the second area 1004, in a power saving mode in which time information and weather information are displayed.
- the controller 180 may change the size of the reference area 1000 based on the change of the bent state of the display unit 251.
- In this case, the controller 180 may cause a control menu corresponding to more functions for controlling the currently played video content to be displayed in the reference area 1000.
- the controller 180 may extract still cuts 1330 for each time section of the currently played video content, and may further display the cuts 1330 in the reference area 1000.
- When the size of the reference area 1000 increases as described above, the image information displayed in the main area 1002 and the sub area 1004 may be changed accordingly.
- In this case, the controller 180 may change the image information displayed in the reference area 1000 and the sub area 1004 accordingly. For example, as shown in (c) of FIG. 13C, the controller 180 may display a control menu including more control functions in the wider reference area 1000. In this case, the controller 180 may further display, in the reference area 1000, a timeline 1332 related to the video content currently being played in addition to the still cuts 1330, so that time information on the video content currently being played in the main area 1002 is also provided. In addition, the controller 180 may display, in the sub area 1004, a list 1334 of other video contents stored in the memory 170 or a preset external server.
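- The phrase "still cuts 1330 for each time section" suggests sampling the video at regular playback positions. The sketch below only computes hypothetical sampling positions; the 60-second section length is an assumption, and the patent does not specify how the cuts themselves are captured.

```kotlin
// Hypothetical: compute the playback positions (in seconds) at which still cuts
// are captured, one per fixed-length time section of the video.
fun stillCutPositions(durationSeconds: Int, sectionSeconds: Int = 60): List<Int> {
    require(sectionSeconds > 0) { "section length must be positive" }
    return (0 until durationSeconds step sectionSeconds).toList()
}

fun main() {
    println(stillCutPositions(330))       // [0, 60, 120, 180, 240, 300]
    println(stillCutPositions(330, 120))  // [0, 120, 240]
}
```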
- Meanwhile, FIG. 13D illustrates an example in which different calendar information is displayed in each area of the display unit 251 according to the bent state of the display unit 251.
- In this case, the controller 180 may activate only the first region 1002 currently set as the main region, or may activate only the first region 1002 and the reference region 1000. In this case, as shown in (a) of FIG. 13D, the controller 180 may display, only in the main area 1002, calendar information in which the dates for which schedules are set are distinguished.
- Meanwhile, the controller 180 may additionally activate the reference area 1000 and the sub area 1004 based on the change of the bent state of the display unit 251, so that the schedule information can be displayed in more various ways.
- the controller 180 may display a current date in the reference area 1000 and simple calendar information in the sub area 1004, as shown in FIG. 13D (b).
- schedule information 1342 corresponding to the current date may be displayed in the main area 1002 that the user mainly checks.
- Meanwhile, the controller 180 may change the image information displayed in the reference area 1000 and the sub area 1004 accordingly. For example, as shown in (c) of FIG. 13D, the controller 180 may display, in the reference area 1000, only date information detected according to a specific condition. That is, the controller 180 may extract, from the calendar information, only the dates for which a schedule has been set by the user, and may display the information 1342 on the extracted dates in the reference area 1000. In this case, when any one of the date information 1342 displayed in the reference area 1000 is selected by the user, the controller 180 may display the schedule set on that date in the main area 1002. FIG. 13D (c) shows this example.
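- A minimal sketch of the "extract only the dates for which a schedule is set" step, assuming schedules are held in a plain map keyed by date (the types and function names are illustrative, not from the patent):

```kotlin
import java.time.LocalDate

// Hypothetical schedule store: date -> schedule descriptions.
// Returns only the dates that actually have a schedule, for the reference area 1000.
fun datesWithSchedules(calendar: Map<LocalDate, List<String>>): List<LocalDate> =
    calendar.filterValues { it.isNotEmpty() }.keys.sorted()

// When one of these dates is selected in the reference area 1000,
// its entries would be shown in the main area 1002.
fun schedulesFor(calendar: Map<LocalDate, List<String>>, date: LocalDate): List<String> =
    calendar[date].orEmpty()
```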
- Meanwhile, FIG. 13E shows an example in which different image information is displayed in each of the areas 1000, 1002, and 1004 based on the bent state of the display unit 251 when the operating state of the mobile terminal 200 is a function of transmitting and receiving text messages.
- In this case, the controller 180 may activate only the first region 1002 currently set as the main region, or may activate only the first region 1002 and the reference region 1000. In this case, as shown in (a) of FIG. 13E, the controller 180 may activate only the main area 1002 and the reference area 1000, display image information related to the transmission and reception of text messages in the main area 1002, and display only information related to the recipient in the reference area 1000.
- Meanwhile, the controller 180 may change the image information displayed in the reference area 1000 and the sub area 1004 according to the bent state. For example, as shown in (b) of FIG. 13E, the controller 180 may display, in the main area 1002, image information related to an input message editor (IME) for entering characters, and may display, in the reference area 1000, a character string input by the user. In addition, information about messages transmitted or received by the user may be displayed in the sub area 1004. This is because, when the display unit 251 is weakly bent as shown in (c) of FIG. 13E, the user can check the first area 1002 and the second area 1004 at a glance even while the mobile terminal 200 is mounted, so that both the first area 1002 and the second area 1004 can be utilized without changing the direction in which the mobile terminal 200 is placed.
- Meanwhile, it has been mentioned above that the mobile terminal 200 according to an embodiment of the present invention may change the operation state currently executed in the mobile terminal 200 based on the bent state of the display unit 251.
- For example, when the function currently executed in the mobile terminal 200 according to an embodiment of the present invention is an alarm function, the controller 180 may display, in any one of the plurality of divided areas of the display unit 251, image information related to the operating state of the mobile terminal, that is, the alarm function.
- FIG. 13F illustrates an example in which an operating state of the mobile terminal is controlled according to the bent state of the display unit in the mobile terminal according to the embodiment of the present invention.
- In this case, the controller 180 of the mobile terminal 200 may display image information 1370 related to the alarm function in the main area 1002.
- the information on the alarm time 1372 set by the user may be displayed on the image information displayed in the main area 1002.
- Meanwhile, the controller 180 can detect a change in the bending of the display unit 251 caused by an external force. That is, as shown in (b) of FIG. 13F, when the bending of the display unit 251 is unfolded by an external force, the controller 180 may control the function currently executed in the mobile terminal 200 based on the change in the bent angle of the display unit 251.
- For example, the controller 180 may change the currently set alarm time. That is, when the display unit 251, which was bent at the angle 1360 shown in (a) of FIG. 13F, is unfolded by an external force to the angle 1362 shown in (b) of FIG. 13F, the controller 180 may set the alarm time to be longer than the time set in FIG. 13F (a).
- FIG. 13F (b) shows such an example.
- That is, it shows that an alarm time 1374 longer than the alarm time set in (a) of FIG. 13F is set.
- In addition, when the bending of the display unit 251 changes further, the controller 180 may change the alarm time again according to the change. That is, when the bent portion of the display unit 251 in the state shown in (b) of FIG. 13F is further unfolded by an external force into the state shown in (c) of FIG. 13F, the alarm time may be changed accordingly. That is, since FIG. 13F (c) shows a state in which the bend is more extended than that of the display unit 251 shown in FIG. 13F (b), that is, a state in which the angle 1364 between the first area 1002 and the second area 1004 is wider, the controller 180 may set the currently set alarm time to be longer based on the changed angle 1364. FIG. 13F (c) shows this example.
- That is, according to the bent state of the display unit 251, the controller 180 may automatically set an alarm time corresponding to the angle 1360 (for example, 30 degrees) between the first area 1002 and the second area 1004 of the display unit 251, and corresponding alarm times may be automatically set for each of the angles 1362 and 1364 changed thereafter.
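- The relationship between the bend angle and the alarm time could be as simple as a linear mapping. The sketch below is illustrative only: the 30-degree example comes from the text above, while the one-minute-per-degree rate and the 0–180 degree range are assumptions.

```kotlin
// Hypothetical: map the angle between the first and second areas (in degrees)
// to an alarm delay in minutes. A wider angle yields a longer alarm time.
fun alarmMinutesForAngle(angleDegrees: Int, minutesPerDegree: Double = 1.0): Int {
    require(angleDegrees in 0..180) { "angle must be between 0 and 180 degrees" }
    return (angleDegrees * minutesPerDegree).toInt()
}

fun main() {
    println(alarmMinutesForAngle(30))  // 30 minutes at a 30-degree bend
    println(alarmMinutesForAngle(90))  // unfolding further lengthens the alarm time
}
```

- Any monotonic mapping would satisfy the behavior described above; the linear form is chosen here purely for brevity.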
- Meanwhile, it has been mentioned above that the controller 180 of the mobile terminal 200 may display image information related to different functions in the respective regions divided by the bending of the display unit 251, and that different users may therefore use the mobile terminal 200 according to the embodiment of the present invention simultaneously for different purposes. In this case, it was also mentioned that the controller 180 may allow the different users who use the two regions (the first region and the second region) to share information with each other by using the region formed by the bending of the display unit 251, that is, the reference region 1000.
- FIGS. 14A and 14B are diagrams illustrating examples in which image information displayed in different areas of the display unit is shared by utilizing the curved portion of the display unit in the mobile terminal according to the present invention.
- FIG. 14A illustrates an example in which image information about different web pages is displayed in the first area 1002 and the second area 1004.
- Hereinafter, for convenience of description, a user using the first area 1002 will be referred to as a first user, and a user using the second area 1004 will be referred to as a second user.
- In this case, the first user may select part of the image information currently displayed in the first area 1002. That is, when the first user applies a touch input to the image information currently displayed in the first area 1002 for a predetermined time or more, the controller 180 may generate, based on the touch input 1410 of the first user, sharing information 1412 for sharing the image information displayed in the first area 1002 with the second user.
- the controller 180 may generate a thumbnail image corresponding to the image information currently displayed in the first area 1002 as the sharing information based on the touch input 1410.
- Alternatively, when the first user selects any one piece of the image information displayed in the first area 1002 (for example, multimedia content included in the web page displayed in the first area 1002), a thumbnail image corresponding to the selected information may be generated as the sharing information 1412.
- the controller 180 may display the generated sharing information 1412 on the reference area 1000 based on the drag input applied after the touch input 1410.
- the sharing information 1412 may be displayed in the reference area 1000 as shown in (c) of FIG. 14A.
- In this case, the sharing information 1412 may be displayed with reference to the counterpart of the sharing, that is, the user who did not generate the sharing information (the second user). Accordingly, as shown in (c) of FIG. 14A, the sharing information 1412 may be displayed upright in the direction viewed by the second user and upside down in the direction viewed by the first user.
- In addition, the controller 180 may display, in the second area 1004, image information corresponding to the sharing information 1412 according to a selection of the counterpart user, that is, the second user. That is, when the second user applies a touch input 1430 to the sharing information 1412 displayed in the reference area 1000 and drags it to the second area 1004, the controller 180 may determine that the second user has selected the image information corresponding to the sharing information 1412 to be displayed in the second area 1004. Accordingly, the controller 180 can display the image information corresponding to the sharing information 1412 in the second area 1004, and as a result, as shown in FIG. 14A (d), image information identical to the image information displayed in the first area 1002 may be displayed in the second area 1004.
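- The sharing flow of FIG. 14A — a long touch in the first area generates sharing information, the reference area shows it upright for the counterpart, and a drag into the second area consumes it — could be sketched as below. All names are hypothetical, and the rotation convention (180 degrees meaning "upright for the second user") is an assumption of this sketch.

```kotlin
// Hypothetical sharing information generated from a long touch in one area.
data class ShareInfo(val thumbnail: String, val contentUrl: String, val generatedByFirstUser: Boolean)

// The thumbnail in the reference area 1000 is oriented so that it appears upright
// to the counterpart of the sharing, i.e. the user who did not generate it.
fun shareThumbnailRotation(info: ShareInfo): Int =
    if (info.generatedByFirstUser) 180 else 0

// Dragging the thumbnail into the second area 1004 yields the content to display there.
fun onShareDroppedIntoSecondArea(info: ShareInfo): String = info.contentUrl
```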
- Meanwhile, the first user and the second user may share web page information in the manner shown in FIG. 14A, and multimedia content such as video or sound source data may also be shared between the first user and the second user in a similar manner.
- For example, the controller 180 may generate sharing information corresponding to the multimedia content currently played in the first area 1002. That is, the sharing information may be the multimedia content itself currently played in the first area 1002, or information including address information of the memory 170 or an external server in which the multimedia content is stored.
- In addition, the controller 180 may allow multimedia content identical to the multimedia content played in the first area 1002 to be played in the second area 1004, based on the selection of the second user for the sharing information.
- Meanwhile, in the mobile terminal 200 according to an exemplary embodiment of the present invention, only multimedia content of a specific section designated by a user may be shared between the first and second users.
- FIG. 14B illustrates an example in which only multimedia content corresponding to a section selected by the user is shared among users in this case.
- Hereinafter, it is assumed for convenience of description that the multimedia content played in the first area 1002 is moving image (video) data.
- the controller 180 may display image information related to different functions in the first area 1002 and the second area 1004 according to a user's selection.
- For example, a playback screen of video data may be displayed in the first area 1002, and the first user may apply a touch input 1450, for a predetermined time or more, to a point of the first area 1002 where the video data is played.
- the controller 180 may generate sharing information 1452 related to video data currently being played in the first area 1002, based on the touch input 1450.
- For example, the controller 180 may generate, from the video data reproduced in the first area 1002, video data of a section corresponding to a predetermined number of seconds before and after the time when the user's touch input 1450 is applied, and may use it as the sharing information.
- the controller 180 can display a thumbnail image 1452 corresponding to the sharing information in the reference area 1000.
- In this case, the sharing information may be displayed with reference to the counterpart of the sharing, that is, the user who did not generate the sharing information (the second user).
- In this case, the second user may select the thumbnail image 1452. That is, the second user may apply a touch input 1460 to the thumbnail image 1452 displayed in the reference area 1000, and then apply an input for dragging the thumbnail image 1452 to the second area 1004.
- Then, the controller 180 may display, in the second area 1004, image information corresponding to the sharing information associated with the thumbnail image 1452; that is, a screen on which the video data of the section corresponding to a predetermined time before and after the time point at which the touch input 1450 was applied is played may be displayed in the second area 1004.
- Accordingly, in the mobile terminal 200 according to an embodiment of the present invention, only video data of a section corresponding to a time point designated by a specific user may be shared with another user from the video data viewed by that user. Meanwhile, although only video data has been described above, sound source data may also be shared, in whole or in part, among the users in a similar manner.
- Meanwhile, the sharing information may be generated by recording the multimedia content of the section corresponding to the time point selected by the specific user, but it may also be generated in a different way.
- For example, the controller 180 may generate the sharing information as information including the address information of the memory 170 or the external server in which the currently played multimedia content is stored, together with section information on the time point selected by the user.
- In this case, when the second user selects the sharing information, the controller 180 may, of course, read out the multimedia content based on the address information included in the sharing information and display, in the second area 1004, a screen on which the part of the read multimedia content corresponding to the section information is played.
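- The address-plus-section alternative described here might be modeled as follows; the field names and the symmetric time window are assumptions, since the patent does not specify the exact structure of the sharing information.

```kotlin
// Hypothetical sharing information that carries only where the content is stored
// and which section of it was selected, rather than the media data itself.
data class SectionShareInfo(
    val addressInfo: String,        // memory path or external-server address of the content
    val sectionStartSeconds: Int,   // start of the shared section
    val sectionEndSeconds: Int      // end of the shared section
)

// Builds the share info around the moment the touch input was applied.
// The window size is left as a parameter because the patent leaves it unspecified.
fun buildSectionShare(addressInfo: String, touchTimeSeconds: Int, windowSeconds: Int): SectionShareInfo =
    SectionShareInfo(
        addressInfo = addressInfo,
        sectionStartSeconds = (touchTimeSeconds - windowSeconds).coerceAtLeast(0),
        sectionEndSeconds = touchTimeSeconds + windowSeconds
    )
```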
- Meanwhile, in the above description, the first user corresponding to the first area 1002 is assumed to generate the sharing information, but this is merely for convenience of description and the present invention is not limited thereto. That is, both the first user and the second user can freely generate the sharing information and display information corresponding to the generated sharing information in the reference area 1000, so that the information displayed in the first area 1002 and the second area 1004 may be shared with each other.
- In addition, in the above description, when the sharing information is generated, a thumbnail image corresponding to it is displayed in the reference area 1000, but this may be changed according to the bent state of the display unit 251. For example, when the display unit 251 is bent above a certain level and the size of the reference area 1000 becomes too small to display a thumbnail image, the controller 180 may, of course, display the sharing information as text information or tag information instead of a thumbnail image. Alternatively, when the bent display unit 251 is unfolded and the size of the reference area 1000 increases, the controller 180 may, of course, display the sharing information as a larger thumbnail image or together with additional information related to the currently generated thumbnail image.
- the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
- the computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like. This also includes implementations in the form of carrier waves (eg, transmission over the Internet).
- the computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
The present invention relates to a mobile terminal that can be bent or folded and to a method of controlling the mobile terminal, the mobile terminal comprising: a flexible display unit for displaying image information; a sensing unit for detecting bending of the flexible display unit; and a control unit for partitioning the flexible display unit into a plurality of regions according to the bending of the flexible display unit and for displaying different image information in each of the plurality of regions. The control unit is characterized by controlling the flexible display unit such that image information is displayed in at least one of the plurality of regions, on the basis of a touch input applied by a user to a region in which the flexible display unit is bent.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140121211A KR20160031305A (ko) | 2014-09-12 | 2014-09-12 | 이동 단말기 및 그 제어 방법 |
KR10-2014-0121211 | 2014-09-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016039498A1 true WO2016039498A1 (fr) | 2016-03-17 |
Family
ID=55459231
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2014/009391 WO2016039498A1 (fr) | 2014-09-12 | 2014-10-06 | Terminal mobile et son procédé de commande |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR20160031305A (fr) |
WO (1) | WO2016039498A1 (fr) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102423145B1 (ko) * | 2016-04-12 | 2022-07-21 | 삼성전자주식회사 | 플렉서블 디바이스 및 플렉서블 디바이스의 동작 방법 |
KR102494101B1 (ko) * | 2018-02-14 | 2023-02-01 | 삼성전자주식회사 | 터치 입력 처리 방법 및 이를 지원하는 전자 장치 |
KR20210082910A (ko) * | 2019-12-26 | 2021-07-06 | 삼성전자주식회사 | 플렉서블 디스플레이를 포함하는 전자 장치와 이의 동작 방법 |
- 2014
- 2014-09-12 KR KR1020140121211A patent/KR20160031305A/ko not_active Abandoned
- 2014-10-06 WO PCT/KR2014/009391 patent/WO2016039498A1/fr active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080291225A1 (en) * | 2007-05-23 | 2008-11-27 | Motorola, Inc. | Method and apparatus for re-sizing an active area of a flexible display |
KR20130135648A (ko) * | 2012-06-01 | 2013-12-11 | 엘지전자 주식회사 | 이동 단말기 및 그것의 제어방법 |
KR20140000749A (ko) * | 2012-06-25 | 2014-01-06 | 엘지전자 주식회사 | 단말기의 제어방법 |
KR20140031679A (ko) * | 2012-09-05 | 2014-03-13 | 엘지전자 주식회사 | 이동 단말기 및 이동 단말기의 제어 방법 |
KR20140094958A (ko) * | 2013-01-23 | 2014-07-31 | 삼성전자주식회사 | 플렉서블 디스플레이 장치의 동작 실행 방법 및 그 장치 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105912074A (zh) * | 2016-03-31 | 2016-08-31 | 联想(北京)有限公司 | 一种电子设备 |
US11081090B2 (en) | 2018-10-23 | 2021-08-03 | Samsung Electronics Co., Ltd. | Method for displaying objects and electronic device using the same |
CN109669514A (zh) * | 2018-12-13 | 2019-04-23 | 维沃移动通信有限公司 | 一种终端的控制方法及终端 |
US11372446B2 (en) | 2019-04-17 | 2022-06-28 | Samsung Electronics Co., Ltd | Foldable electronic device and method for displaying information in foldable electronic device |
WO2020213984A1 (fr) * | 2019-04-18 | 2020-10-22 | Samsung Electronics Co., Ltd. | Dispositif électronique et procédé d'affichage d'écran de fourniture d'écran partagé |
US11042284B2 (en) | 2019-04-18 | 2021-06-22 | Samsung Electronics Co., Ltd. | Electronic device and method for displaying object for providing split screen |
CN110721467A (zh) * | 2019-09-24 | 2020-01-24 | 咪咕互动娱乐有限公司 | 显示控制方法、电子设备及计算机可读存储介质 |
US12039161B2 (en) | 2021-07-08 | 2024-07-16 | Samsung Electronics Co., Ltd. | Electronic device comprising a plurality of touch screen displays and screen division method |
Also Published As
Publication number | Publication date |
---|---|
KR20160031305A (ko) | 2016-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017057803A1 (fr) | Terminal mobile et son procédé de commande | |
WO2017082508A1 (fr) | Terminal de type montre, et procédé de commande associé | |
WO2017104860A1 (fr) | Terminal mobile enroulable | |
WO2020171288A1 (fr) | Terminal mobile et dispositif électronique comprenant un terminal mobile | |
WO2015199270A1 (fr) | Terminal mobile, et procédé de commande correspondant | |
WO2017119529A1 (fr) | Terminal mobile | |
WO2017126737A1 (fr) | Terminal mobile | |
WO2015167299A1 (fr) | Terminal mobile et son procédé de commande | |
WO2017090823A1 (fr) | Terminal mobile enroulable et son procédé de commande | |
WO2017099276A1 (fr) | Terminal mobile enroulable et son procédé de commande | |
WO2017007064A1 (fr) | Terminal mobile, et son procédé de commande | |
WO2016032045A1 (fr) | Terminal mobile et son procédé de commande | |
WO2017003018A1 (fr) | Terminal mobile et son procédé de commande | |
WO2016182132A1 (fr) | Terminal mobile et son procédé de commande | |
WO2016039498A1 (fr) | Terminal mobile et son procédé de commande | |
WO2017047854A1 (fr) | Terminal mobile et son procédé de commande | |
WO2016035921A1 (fr) | Terminal mobile et son procédé de commande | |
WO2016052814A1 (fr) | Terminal mobile, et son procédé de commande | |
WO2017039051A1 (fr) | Terminal mobile de type montre et son procédé de commande | |
WO2016129778A1 (fr) | Terminal mobile et procédé de commande associé | |
WO2017051959A1 (fr) | Appareil de terminal et procédé de commande pour appareil de terminal | |
WO2015194694A1 (fr) | Terminal mobile | |
WO2017183764A1 (fr) | Terminal mobile et procédé de commande associé | |
WO2018030619A1 (fr) | Terminal mobile | |
WO2016032113A1 (fr) | Terminal mobile et son procédé de commande |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14901630 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14901630 Country of ref document: EP Kind code of ref document: A1 |