US20180373293A1 - Textile display system and method - Google Patents
- Publication number
- US20180373293A1 (application US 15/629,513)
- Authority
- US
- United States
- Prior art keywords
- display
- real
- user
- camera
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F1/1652 — Details related to the display arrangement, including mounting of the display in the housing, the display being flexible, e.g. mimicking a sheet of paper, or rollable
- G06F1/1605 — Multimedia displays, e.g. with integrated or attached speakers, cameras, microphones
- G06F1/163 — Wearable computers, e.g. on a belt
- G09F21/02 — Mobile visual advertising by a carrier person or animal
- G09F9/301 — Flexible, foldable or rollable electronic displays, e.g. thin LCD, OLED
- G09F21/023 — Mobile visual advertising by a carrier person or animal, fixed on clothing
- Embodiments of this disclosure relate generally to LED (light-emitting diode)-based display systems, such as OLEDs (organic light-emitting diodes), front- and side-emitting LEDs, and thin-film-transistor-driven LEDs, which are embedded in fabric to form a textile article, such as headwear, clothing articles, gloves, mittens, shoes, carrying packs, tapestries, and other textile articles, to provide a desired illumination and display, and to textile articles comprising such LED display systems.
- a textile display system includes memory operable to store data; a flexible fabric having a plurality of light emitting devices which define a display; and a processor communicatively coupled to the memory and the display.
- the processor is operable to execute one or more modules in the memory.
- the modules include a static module displaying a static image; a dynamic module displaying a dynamic image; a situational module displaying a situational image specific to a situational event; a geographical module displaying a geographic image specific to a geographical location; and a motion module displaying a motion image upon detection of movement.
- a textile display system in another embodiment, includes memory operable to store data; a flexible fabric having a plurality of light emitting devices which define a display; a camera configured to capture a real-time image; and a processor communicatively coupled to the memory, the camera, and the display.
- the processor is operable to execute one or more modules in the memory.
- the modules include a static module displaying a static image; a dynamic module displaying a dynamic image; and a real-time module displaying the real-time image captured from the camera.
- a method of operating a textile light emitting device system, having a flexible fabric with a plurality of light emitting devices which define a display and a processor communicatively coupled to a memory and the display, has the steps of: initiating the system by an authorized user; selecting a display module; and displaying an image based on the selected display module.
- the display modules include a static module displaying a static image; a dynamic module displaying a dynamic image; a situational module displaying a situational image specific to a situational event; a geographical module displaying a geographic image specific to a geographical location; and a motion module displaying a motion image upon detection of movement.
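The claimed method (initiate by an authorized user, select a display module, display that module's image) can be sketched as a simple dispatch table. This is a minimal illustrative sketch only: the module names, placeholder images, and the `run_display` helper are assumptions, not part of the patent.

```python
# Hypothetical sketch of the claimed method: initiate the system, select a
# display module, then display an image chosen by that module.
# All names and placeholder images are illustrative assumptions.

MODULES = {
    "static": lambda ctx: ctx.get("static_image", "solid-color"),
    "dynamic": lambda ctx: ctx.get("dynamic_image", "scrolling-text"),
    "situational": lambda ctx: ctx.get("situational_image", "HELP I'M LOST"),
    "geographical": lambda ctx: ctx.get("geo_image", "local-logo"),
    # motion module only yields an image once movement is detected
    "motion": lambda ctx: ctx.get("motion_image") if ctx.get("moving") else None,
}

def run_display(user_authorized: bool, module: str, context: dict):
    """Return the image the selected module would put on the fabric display."""
    if not user_authorized:
        raise PermissionError("system must be initiated by an authorized user")
    if module not in MODULES:
        raise ValueError(f"unknown display module: {module}")
    return MODULES[module](context)
```

For example, selecting the situational module with no stored override would fall back to the placeholder distress phrase.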
- FIG. 1 is a block schematic view of a wearable light emitting device display system according to embodiments of the present disclosure.
- FIG. 2 is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure.
- FIG. 3A is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in static operational mode with a user.
- FIG. 3B is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in a second embodiment of static operational mode with a user.
- FIG. 3C is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in dynamic operational mode with a user.
- FIG. 3D is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in situational operational mode with a user.
- FIG. 3E is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in geographic operational mode with a user.
- FIG. 3F is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in motion detection operational mode with a user.
- FIG. 4 is a block schematic view of a wearable light emitting device display system according to embodiments of the present disclosure.
- FIG. 5 is a perspective view of the wearable light emitting device display system of FIG. 4 according to embodiments of the present disclosure.
- FIG. 6 is a perspective view of the wearable light emitting device display system of FIG. 4 according to embodiments of the present disclosure, shown in real-time imaging operational mode with a user.
- FIG. 7 is another perspective view of the wearable light emitting device display system of FIG. 4 according to another embodiment of the disclosure.
- FIG. 8 is a flowchart illustrating various steps performed by wearable light emitting device systems according to an embodiment of the present disclosure.
- FIGS. 1-2 illustrate an intelligent textile LED display system, which is generally given a reference number 1 .
- the textile LED display system is referred to herein as a wearable LED display system.
- the textile LED system may additionally or alternately include other display systems that incorporate a textile, such as a tapestry that may be hung on a wall, seat covers, quilts, etc.
- the wearable LED display system 1 may include a processor 12 , non-transitory computer memory 14 , flexible fabric 16 laden with LEDs 31 , a transceiver 17 , and a power source 18 .
- FIG. 2 illustrates an embodiment of the wearable LED display system 1 as a strip of fabric 16 .
- the LED display system may include a plurality of visual indicators (which may be electrically stimulated), including but not limited to materials that emit light at specific wavelengths, such as traditional LEDs or organic LEDs. These types of display technology components operate in the optical transmissive mode. Other supported display technologies conversely operate in the optical reflective mode, where an external light source must be present for the material to serve as a visual indicator to the observer.
- One example of reflective-mode display technology is programmable matter such as a metamaterial. Using metamaterials, external light can be manipulated to reflect selectively off individual or grouped reflective programmable particles. By changing the shape or index of refraction of the particles, it may be possible to change the apparent color of the material from the observer's perspective.
- the processor 12 may be any appropriate device, whether now existing or later developed, which performs the operations specified by the various programming used by the system 1 .
- the processor 12 may be electronic circuitry located on a common chip or circuit board, or may be a distributed processor such that one portion of the processor is physically separate from another portion of the processor. In other words, discrete processing devices may be linked together (e.g., over a network) and collectively form the processor 12 .
- the processor 12 is in data communication with a camera 10 , the memory 14 , and may further be in data communication with other peripherals (e.g., sensors, keyboards, touch screens, etc.) to control the camera 10 and/or other components.
- the controller 12 performs the tasks of retrieving and processing modules and/or data from the memory 14 and controlling the different components of the system 1 , including the transceiver 17 .
- the controller 12 may receive instructions from an outside source (e.g., a mobile application, personal computer, laptop, or other means known in the art or yet to be known) by means of the transceiver 17 .
- Such instructions may be the selection of the mode to display as will be further discussed below.
- Such instructions may be the download of patterns or logos to be displayed, for example from an outside database of images and patterns.
- Such instructions may be received directly from memory 14 and not through an outside source.
- the transceiver 17 may include a transmitter (or “antenna”) 19 .
- the antenna 19 may be situated in a different location than the transceiver 17 .
- the antenna 19 may be a part of circuitry located in the fabric 16 or the transceiver 17 may be one of many transceivers located within the wearable LED display.
- the transceiver 17 communicates directly and/or over a wireless communication infrastructure with other devices configured to receive such wireless communication to provide controlled input to the LEDs 31 .
- the transceiver 17 may include baseband processing circuitry to convert data into a wireless signal (e.g., radio frequency (RF), infrared (IR), ultrasound, near field communication (NFC), et cetera) and the transmitter 19 transmits the wireless signal.
- a second wireless transceiver (not shown) receives the wireless signal and converts it via baseband processing circuitry into meaningful information (e.g., voice, data, video, audio, text, instructions for completing a task, et cetera), for example through an application on a phone, GPS unit, computer, or notepad, or through a central display located within the home.
- Examples of direct wireless communication include Bluetooth and ZigBee.
- the first wireless transceiver 17 transmits and/or receives a wireless signal to a base station or access point, which conveys the signal to a wide area network (WAN) and/or to a local area network (LAN).
- the signal may traverse the WAN and/or LAN to a second base station or access point to send signal to the second wireless transceiver or it may traverse the WAN and/or LAN directly to the second wireless transceiver.
- Examples of wireless communication via an infrastructure include cellular satellite/tower, IEEE 802.11, public safety systems, et cetera.
- the memory 14 contains the relevant computer or program instructions for the controller 12 , and may further include location data that may be gathered from a global positioning satellite (GPS) system. The data may be transmitted by the transmitter 19 as described above to obtain the location of the wearable LED system 1 .
- the memory 14 may be situated in the same circuit board as the controller 12 and power source 18 .
- the memory 14 may include volatile and non-volatile memory, and any appropriate data storage devices whether now existing or later developed may be used. Further, the memory 14 may be a unitary memory in one location, or may alternately be a distributed computer memory such that one portion of the computer memory 14 is physically separate from another portion of the memory 14 . Example memory devices which may be employed include SRAM, DRAM, EPROM, EEPROM, Flash, magnetic, rotating media, ferromagnetic, and U3 smart drives.
- the memory 14 is in communication with the controller 12 for providing data to and receiving data from the controller 12 . In some embodiments, data may be encrypted to prevent disassembly and reverse engineering.
- the memory 14 may include, for example, a program storage area (for storing software or “instructions”) and a data storage area (for storing videos, still photographs, and other data).
- the software components stored in memory 14 may include a static operation module 20 , a dynamic operation module 22 , a situation-based module 24 , a geographical operation module 26 , a motion module 28 , and a real-time image module 30 . It is foreseen that the memory 14 may further include an operating system.
- the static operation module 20 is configured to display a single image (e.g., a color, a pattern, a logo, etc.) which remains stationary while the static operation module 20 is in use.
- the dynamic operation module 22 is configured to display an image which moves while the dynamic operation module 22 is in use. For example, the image may flash, scroll, rotate, or otherwise move on the respective display.
- the situation based module 24 may be configured for static or dynamic operation, the image being selected based on the user's specific situation (e.g., if the user is lost).
- the geographical operation module 26 may be configured for static or dynamic operation, the image being selected based on the user's geographical location.
- Each of the static module 20 , the dynamic module 22 , the situation based module 24 , and the geographical operation module 26 may be activated by the user, e.g., through an on/off button.
- the on/off button may be located somewhere on one or more of the displays, or may be remote (e.g., on the user's phone or the phone of a party authorized by the user such as a parent on behalf of a child).
- the motion module 28 may similarly be configured for static or dynamic operation. However, the motion module 28 may be activated by movement of the user (e.g., instead of turning the module 28 on or off via a button).
- the motion module 28 may be turned on (e.g., via an on/off button as described above), but the image may not be triggered to turn on until the module 28 experiences movement by the user.
- the real-time image module 30 is configured to interact with a camera 10 to display images that are received from the camera 10 .
- the various modules 20 , 22 , 24 , 26 , 28 , and 30 are described in greater detail with reference to the examples below.
- the controller 12 is configured to retrieve from memory 14 and execute, among other things, instructions or modules related to the control processes and methods described herein.
- the controller 12 is connected to the memory 14 and may execute software instructions or modules that are stored in a RAM of the memory 14 (e.g., during execution), a ROM of the memory 14 (e.g., on a generally permanent basis), or another non-transitory computer readable medium such as another memory or a hard disc.
- software can be stored in the internal memory 14 on the controller board.
- the controller 12 can have part of its operations dedicated to a data transfer function from the camera 10 to the memory 14 , and another part dedicated to the fabric LED array 16 . Alternately, the controller 12 can embody the two functions in separate controllers (i.e., a data transfer function controller and an LED controller).
- the flexible LED fabric 16 may be a stretchable, conformable, thin-film-transistor (TFT)-driven LED display that is laminated or otherwise incorporated into fabric textiles and features a plurality of LED lights.
- the electronics (i.e., the processor 12 , memory 14 , etc.)
- the LED displays 31 may be manufactured from flexible material. Examples of flexible materials used for the LED strip include polymer/plastic substrates, such as polyamide, polyether ether ketone (PEEK), or transparent conductive polyester film.
- a polyimide substrate may be encapsulated in rubber, which would allow the display 31 to be laminated into fabric 16 .
- the at least one flexible LED display strip 16 may be able to conform to the shape of the surface of a wearable article or of a user (e.g., spandex). The flexible strip comprises an upper surface 32 having one or more LEDs 31 disposed thereon, operably positioned to provide a desired illumination; an electrical connector for electrically connecting the LED strip 16 to a power source 18 ; and, optionally, an electrical connector for electrically connecting the LED fabric strip 16 to a computer or other device so as to install additional programming or additional licensing for trademarks to be displayed. It is foreseen that this invention applies to any controllable light-emitting device known or yet to be known and should not be limited to LEDs or OLEDs.
- the power source 18 may be located within the flexible LED fabric 16 or may be located remotely.
- the power source 18 of a system 1 can be one or more of: batteries, a computer battery through a USB connection, a motion-based generator, a solar cell, or other source of electricity such as a fuel cell or other means known in the art. It may be beneficial for the system 1 to include means for storing power, both in situations in which the system 1 is hard wired to an electrical box, and where the system receives its power from an outside source.
- the power source 18 may be capacitors or a battery, such as NiCd (nickel-cadmium), NiZn (nickel-zinc), NiMH (nickel-metal hydride), or lithium-ion.
- the battery may store energy from other sources which may then be converted into electrical energy for use by the system 1 .
- the system may be able to capture solar energy through a photo-voltaic cell.
- the power source 18 may be a capacitor, or have a capacitor back-up that receives power until fully charged; the capacitor then provides temporary power to the system 1 , such that even when the battery 18 becomes discharged, the system 1 will stay on for a short time.
- the system 1 may, for example, provide an indication (e.g., a sound, a displayed message, a flashing LED, etc.) to the user that the battery 18 is low, or that the capacitor is being used and the power source 18 should be replaced soon.
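The low-power indication described above can be sketched as a small status check. The percentage threshold and warning strings below are illustrative assumptions, not values from the patent.

```python
def power_status(battery_pct: float, on_capacitor: bool):
    """Return a user-facing warning for the power source state, or None.

    `battery_pct` is the remaining battery charge (0-100); `on_capacitor`
    is True when the back-up capacitor is supplying the system. The 20%
    threshold is an illustrative assumption.
    """
    if on_capacitor:
        # battery is discharged; the capacitor keeps the display alive briefly
        return "RUNNING ON RESERVE - REPLACE POWER SOURCE"
    if battery_pct < 20.0:
        return "LOW BATTERY"
    return None  # no warning needed
```

The returned string could be shown on the fabric display itself or paired with a sound or flashing LED, as the passage suggests.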
- a user 34 is wearing a controllable LED display hat 36 , a controllable LED display sweater 38 , controllable LED display pants 40 , and controllable LED display shoes 42 .
- Each of the wearable LED displays 36 , 38 , 40 , 42 is made up of one or more fabric LED systems 1 as disclosed above.
- the LED displays 36 , 38 , 40 , and 42 may each be in static operation under control of controller 12 utilizing the static operation module 20 .
- one or more wearable LED displays 36 , 38 , 40 , 42 may be controlled by the static operation module 20 while the respective other wearable LED displays 36 , 38 , 40 , 42 may be controlled by a different module 22 , 24 , 26 , 28 , or 30 , as will be further discussed below.
- the static operation module may control the individual LEDs 31 in the wearable LED displays 36 , 38 , 40 , 42 to display a single color or a single pattern 50 , for example: paisley, plaid, stripes, denim, floral, or other patterns known in the art or yet to be known.
- the memory 14 may have the patterns 50 stored for the static module 20 to recall and display.
- the color depth may be 24-bit or 32-bit, or another color depth yet to be known.
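A 24-bit color depth allots 8 bits each to the red, green, and blue channels of a pixel. A minimal sketch of packing and unpacking such a pixel value (purely illustrative, not from the patent):

```python
def pack_rgb(r: int, g: int, b: int) -> int:
    """Pack three 8-bit channels into one 24-bit pixel value."""
    for c in (r, g, b):
        if not 0 <= c <= 255:
            raise ValueError("each channel must fit in 8 bits")
    return (r << 16) | (g << 8) | b

def unpack_rgb(pixel: int):
    """Split a 24-bit pixel value back into (r, g, b) channels."""
    return (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF
```

A 32-bit depth typically adds a fourth 8-bit alpha (transparency) channel on top of the same scheme.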
- the wearable LED displays 36 , 38 , 40 , 42 may be set to be the same static pattern or may each individually be their own static pattern.
- Another example of static operation is illustrated in FIG. 3B .
- a user 134 is wearing a controllable LED display hat 136 , a controllable LED display sweater 138 , controllable LED display pants 140 , and controllable LED display shoes 142 , each of which is under control of controller 12 utilizing an operation module such as the static operation module 20 .
- the controllable wearable LED displays 136 , 138 , 140 , and 142 are substantially similar to counterpart wearable LED displays 36 , 38 , 40 , and 42 , except as specifically noted and/or shown, or as would be inherent.
- the static pattern 150 is an image on the controllable LED display sweater 138 .
- the image 150 may be repeated throughout the sweater 138 or on other controllable LED displays 136 , 140 , 142 .
- the image 150 may be stored on memory 14 and displayed randomly or centrally or determined by the user 134 .
- the image 150 may be any type of file format, such as JPEG, Exif, TIFF, GIF, BMP, PNG or any other format known or yet to be known.
- a user 234 is wearing a controllable LED display hat 236 , a controllable LED display sweater 238 , controllable LED display pants 240 , and controllable LED display shoes 242 , each of which is under control of controller 12 utilizing an operation module such as the dynamic operation module 22 .
- the controllable wearable LED displays 236 , 238 , 240 , and 242 are substantially similar to counterpart wearable LED displays 36 , 38 , 40 , and 42 , except as specifically noted and/or shown, or as would be inherent.
- in the illustrated example of FIG. 3C , the dynamic phrase 250 “Follow Me” scrolls in direction A across the user's sweater 238 , and may repeat as the phrase ends on the left side of the controllable wearable LED display sweater 238 .
- the image 250 may be dynamically displayed on any of the wearable LED displays 236 , 238 , 240 , 242 .
- the dynamic operation module 22 controls the individual LEDs 31 in the wearable LED displays 236 , 238 , 240 , 242 to display the dynamic phrase 250 .
- the memory 14 may store one or more dynamic patterns 250 for the dynamic module 22 to recall and display.
- the dynamic patterns 250 may include detailed motion patterns for each wearable display 236 , 238 , 240 , 242 that may be used or worn.
- the moving image 250 may be displayed at randomly selected or centrally located positions, or at a location determined by the user 234 .
- the image 250 may be of any type of video file format, such as MPEG, MOV, WMV, AMV, MP4, MPG, M4V or any other format known or yet to be known.
- the wearable LED displays 236 , 238 , 240 , 242 may be configured to play a movie or a moving image from memory 14 or a video sent to a cellular device which may be accessed via the controller 12 .
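The scrolling behavior described for the dynamic module can be sketched as generating successive fixed-width "windows" of the phrase as it moves across the display. The character-based frame model below is an illustrative assumption; an actual fabric display would render pixels, not characters.

```python
def scroll_frames(phrase: str, width: int):
    """Yield successive fixed-width windows of `phrase` as it scrolls
    right-to-left across a display `width` characters wide, wrapping
    cyclically so the text repeats after it leaves the left edge."""
    padded = " " * width + phrase  # start with the phrase fully off-screen
    for offset in range(len(padded)):
        # rotate the padded string and take the leftmost `width` characters
        yield (padded[offset:] + padded[:offset])[:width]
```

Each yielded frame would be pushed to the LED array in turn; looping over the generator repeatedly produces the continuous repetition described for the phrase "Follow Me".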
- a user 334 is wearing a controllable LED display hat 336 , a controllable LED display sweater 338 , controllable LED display pants 340 , and controllable LED display shoes 342 , each of which is under control of controller 12 utilizing a module such as the situational module 24 .
- the controllable wearable LED displays 336 , 338 , 340 , and 342 are substantially similar to counterpart wearable LED displays 36 , 38 , 40 , and 42 , except as specifically noted and/or shown, or as would be inherent.
- a user 334 such as a child, may be wearing the wearable LED displays 336 , 338 , 340 , 342 .
- An outside source such as the child's parent's phone may communicate with the transceiver(s) 17 of the wearable LED displays 336 , 338 , 340 , 342 to change the display format of the wearable LED displays 336 , 338 , 340 , 342 to a specific distress or aid pattern 350 .
- the situational module 24 controls the individual LEDs 31 in the wearable LED displays 336 , 338 , 340 , 342 to display a situation specific pattern 350 .
- the situational phrase 350 “HELP I'M LOST” may optionally be held static or may be in motion on the wearable LED display sweater 338 .
- the situational phrase 350 may be displayed across one or more controllable wearable LED displays 336 , 338 , 340 , and 342 .
- Other exemplary situational phrases may include “Call 911, I have been kidnapped,” or “If found please contact parents at (913) 555-5555.”
- the wearable LED displays 336 , 338 , 340 , 342 may adjust the viewing angle such that only other small children or short individuals at the user's 334 height may see the situational pattern 350 and those above the height cannot or vice versa.
- the memory 14 may have static or dynamic situational patterns 350 stored for the situational module 24 to recall and display.
- a panic button may be provided on one or more wearable LED displays 336 , 338 , 340 , 342 , in lieu of or in combination with the transceiver 17 , which may allow the user 334 to change to the situational pattern 350 at will.
- a user 434 is wearing a controllable LED display hat 436 , a controllable LED display sweater 438 , controllable LED display pants 440 , and controllable LED display shoes 442 , each of which is under control of controller 12 utilizing an operation module such as the geographical operation module 26 .
- the controllable wearable LED displays 436 , 438 , 440 , and 442 are substantially similar to counterpart wearable LED displays 36 , 38 , 40 , and 42 , except as specifically noted and/or shown, or as would be inherent.
- the geographical pattern or logo 450 is displayed on one or more of the controllable LED displays 436 , 438 , 440 , 442 and may be held static or may be in motion, for example by blinking, repeating, or striping across the LED display(s).
- the geographical module 26 controls the individual LEDs 31 in the wearable LED displays 436 , 438 , 440 , 442 to display a geographical specific pattern 450 .
- the geographical pattern 450 may be based on geographical points of interest near the location of the user 434 .
- transceiver(s) 17 of the wearable LED displays 436 , 438 , 440 , 442 communicate with a GPS to obtain location data which may be communicated to the controller 12 .
- the controller 12 may determine that the user 434 is located either around the Royals® stadium or Kansas City area, and one or more of the wearable LED displays 436 , 438 , 440 , 442 may automatically activate to display a Royals® logo 450 .
- the user 434 may cause one or more of the LED displays 436 , 438 , 440 , 442 to exhibit the geographic pattern 450 , e.g., by engaging with the module 26 to activate or deactivate the image 450 .
- the location of the display 450 (e.g., which one or more of the LED displays 436 , 438 , 440 , 442 will show the logo, and/or where on the display 436 , 438 , 440 and/or 442 the logo may be located) may likewise be determined by the user 434 .
- the geographical module 26 is envisioned to be on or off as specified by the user 434 .
- the memory 14 may have multiple designs and/or patterns stored thereon which may be associated with a particular geographic location.
- the user 434 may be able to select the desired design or pattern 450 for exhibition by interacting with the module 26 (e.g., via a button or corresponding module on a smart phone or other electronics device).
- the user 434 may prefer to simply display a “KC” design while in Kansas City or around Kauffman stadium without specifically showing support for the Royals. Accordingly, the user 434 may scroll through the various stored designs or patterns 450 until s/he reaches the one which is most appealing; that design 450 may thus be displayed.
- the designs and/or patterns 450 may be static or dynamic.
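One plausible way for the geographical module 26 to pick a stored pattern is to compare the GPS fix against a table of points of interest and select the nearest one within range. This is a sketch under stated assumptions: the coordinates, the 5 km radius, and the point-of-interest table are illustrative, not from the patent.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Hypothetical points of interest: (name, lat, lon, stored pattern)
POINTS_OF_INTEREST = [
    ("Kauffman Stadium", 39.0517, -94.4803, "Royals logo"),
    ("Downtown KC", 39.0997, -94.5786, "KC design"),
]

def pick_geo_pattern(lat, lon, max_km=5.0):
    """Return the stored pattern for the nearest point of interest in range,
    or None when nothing is close enough to trigger the geographical module."""
    best = min(POINTS_OF_INTEREST, key=lambda p: haversine_km(lat, lon, p[1], p[2]))
    return best[3] if haversine_km(lat, lon, best[1], best[2]) <= max_km else None
```

The user-selection behavior described above (scrolling through alternatives such as a plain "KC" design) would simply substitute a different entry in the table for the same location.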
- a system 1 may include a plurality of processors 12 in communication, wherein each processor 12 contains a unique identifier known as a Media Access Control address (or MAC address). Each MAC address may be encoded into the memory 14 as a non-volatile unique identification code that can be machine-readable (and optionally writable) and/or human-readable through means of network data communication or display patterns on the fabric of the system 1 .
- the system 1 may be used to momentarily display visual codes or geometrically colored patterns on each localized subsystem in such a way that a supervisory observing system monitor could uniquely identify each individual subsystem.
- An observing system management controller may identify, authenticate, and intercommunicate with several subsystem processor 12 modules in order to compile and seamlessly display cascaded orchestrated patterns on the system 1 via the plurality of processors 12 .
- the application of the system 1 may provide a programmable real-time motion capture tool which may be an advanced tracking marker to be worn by actors and/or mounted on objects being filmed during production of computer-generated-imagery (CGI) for special effects by filmmakers.
- CGI computer-generated-imagery
- Three-dimensional positioning can be resolved and refined in real time by utilizing changing patterns on the wearable LED display(s) during filming, and can assist in chroma-key production technique enhancements such as color-key avoidance and specific cue markers being displayed at critical time-markers during the scene.
- a user 534 is wearing a controllable LED display hat 536 , a controllable LED display sweater 538 , controllable LED display pants 540 , and controllable LED display shoes 542 , each of which is under control of controller 12 utilizing an operation module such as the motion detection module 28 .
- the controllable wearable LED displays 536 , 538 , 540 , and 542 are substantially similar to counterpart wearable LED displays 36 , 38 , 40 , and 42 , except as specifically noted and/or shown, or as would be inherent.
- the motion pattern or logo 550 may be held static or may be in motion.
- the motion module 28 controls the individual LEDs 31 in the wearable LED displays 536 , 538 , 540 , 542 to display a specific pattern 550 when it is determined by the controller 12 that the user 534 is running, jumping, or otherwise moving.
- the controllable wearable LED displays 536 , 538 , 540 , and 542 may exhibit the predetermined motion pattern 550 upon movement by the user 534 .
- the transceiver(s) 17 of the wearable LED displays 536 , 538 , 540 , 542 may communicate with a GPS to obtain location data; this location data is communicated to the controller 12 to ascertain movement.
- Motion may be detected through other means, such as an accelerometer, gyroscope, motion detector, camera, gesture detection, or other means known or yet to be known.
- the controller 12 determines that the user 534 is moving from a stationary position 560 to a running position 561 , and the wearable LED displays 536 , 538 , 540 , 542 may each individually or all change, for example to a Flash® logo pattern 550 on the wearable LED display sweater 538 with lightning being displayed across the wearable LED displays 536 , 540 , 542 .
- the motion module 28 is envisioned to be on or off as specified by the user 534 .
- the memory 14 may have static or dynamic patterns or logos 550 stored for the motion module 28 to recall and display.
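As a minimal sketch (under stated assumptions, not from the disclosure) of how the controller 12 might decide that the user is moving from accelerometer samples, the deviation of the acceleration magnitude from the 1 g resting value can be thresholded; the 0.5 g threshold and the function name are assumptions:

```python
import math

def is_moving(samples, threshold_g=0.5):
    """Return True if any accelerometer sample deviates from the 1 g
    resting magnitude (gravity only) by more than threshold_g.
    samples: iterable of (x, y, z) accelerations in units of g.
    The 0.5 g default threshold is an illustrative assumption."""
    for x, y, z in samples:
        deviation = abs(math.sqrt(x * x + y * y + z * z) - 1.0)
        if deviation > threshold_g:
            return True
    return False
```

When such a test returns True, the motion module 28 could recall the stored pattern or logo 550 from the memory 14 and display it.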
- FIGS. 4-5 illustrate an intelligent wearable LED display system, which is generally given reference number 601 .
- the wearable LED display system 601 may include a camera 610 , a gyroscope 611 , a processor 612 , memory 614 , a flexible fabric 616 laden with LEDs 631 , a transceiver 617 , and a power source 618 .
- FIG. 5 illustrates an embodiment of the wearable LED display system 601 as a strip of fabric 616 . While elements are often referred to in the singular, those skilled in the art will appreciate that multiple such elements may often be employed and that the use of multiple such elements which collectively perform as expressly or inherently disclosed is fully contemplated herein.
- the gyroscope 611 may be an array of gyroscopes situated so as to monitor the movement of the user 734 .
- the processor 612 , memory 614 , the flexible fabric 616 laden with LEDs 631 , the transceiver 617 , and the power source 618 are substantially similar to counterpart components illustrated in FIG. 1 , except as specifically noted and/or shown, or as would be inherent.
- the system 601 may have the same functionality as the system 1 described above, with further functionality as described with reference to FIG. 6 .
- a user 734 is wearing a controllable LED display hat 736 , a controllable LED display sweater 738 , controllable LED display pants 740 , and controllable LED display shoes 742 , each of which is under control of controller 12 utilizing a module such as the real time image module 30 .
- the controllable wearable LED displays 736 , 738 , 740 , and 742 are substantially similar to counterpart wearable LED displays 36 , 38 , 40 , and 42 , except as specifically noted and/or shown, or as would be inherent.
- Each of the wearable LED displays 736 , 738 , 740 , 742 may be made up of one or more fabric LED systems 601 as described above.
- An image 750 may be exhibited on one or more controllable wearable LED displays 736 , 738 , 740 , and 742 , similar to the other embodiments described above.
- the image 750 may be static or, as shall be appreciated from the description provided below, preferably dynamic.
- the real time image module 30 controls the individual LEDs 731 in the wearable LED displays 736 , 738 , 740 , 742 to display the real time image 750 captured by the camera 610 .
- the camera 610 may be configured to continuously record or to record upon a trigger such as motion, and may store recorded sessions in the memory 614 .
- the camera 610 may capture still images or video, may include filters to adjust for lighting, and may be a visible light camera, low light camera, or an infrared camera, or combinations thereof.
- the camera 610 may have a wide-angle lens so as to capture the widest area possible.
- the camera 610 may be connected with a network and accessible (e.g., via a laptop or other computer).
- the camera 610 may be located in a single place, such as the back of the collar of a shirt, the wrist of a mitten, the brim of a cap, or another known location. While this document shall often refer to elements in the singular, those skilled in the art will appreciate that multiple such elements may often be employed and that the use of multiple such elements which collectively perform as expressly or inherently disclosed is fully contemplated herein. In some embodiments, the camera 610 may be an array of cameras situated so as to get a complete wide angle view of an area.
- a user 734 may be wearing the controllable wearable LED displays 736 , 738 , 740 , and 742 , and the camera 610 may capture the image behind the user 734 .
- the controller 12 may cause the captured image to be exhibited by one or more of the LED displays 736 , 738 , 740 , 742 , such that the user 734 appears at least partially see-through or camouflaged with the background.
- the wearable LED displays 736 , 738 , 740 , 742 may each individually display a respective portion of the tree 765 behind the user 734 such that the user 734 is substantially invisible to a viewer.
- the image 750 may be manipulated based on the perspective of a viewer.
- Each viewer of the user 734 has a specific vantage point that is dependent on the distance from the user 734 and the height of the viewer. For example, a person short of stature who is viewing the user 734 from a short distance will have a different perspective than a person who is very tall. However, as the distance between the viewer and the user 734 increases, the viewer's perspective becomes less significant. In order that the viewer always has the most accurate view of the user 734 (or lack thereof, since the user 734 may be substantially see-through), the camera 610 may be configured to swivel to take into account the viewer's perspective.
- the distance from the viewer to the user 734 must first be determined.
- a distance finder such as a laser (or an array of lasers) or a second camera may be utilized in conjunction with the controller 612 to determine the distance from the user to the viewer according to known techniques.
- the controller 612 may cause the distance finder to additionally ascertain the height of the viewer according to known techniques if the distance between the viewer and the user 734 is less than a predetermined threshold (e.g., 5 feet, 10 feet, 15 feet, etc.).
- the controller 612 may be able to determine the viewing angle between the viewer and the user 734 .
- the controller 612 may adjust the camera angle accordingly. For example, if the viewer is short of stature and is standing close to the user 734 , the camera 610 may be adjusted such that it has a more upwardly-angled trajectory. The viewer may see, for example, the top of the tree and a portion of the sky. Alternately, where the viewer is tall, the camera 610 may be adjusted such that it has a more downwardly-angled trajectory. The viewer may see, for example, the trunk of the tree and a portion of the ground.
- the camera 610 may be positioned such that it is pointed straight behind the user 734 . In this way, the perspective of the viewer may be taken into account to ensure that the view is as accurate as possible.
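The geometry described above can be sketched as follows; the default distance threshold and the sign convention (positive pitch = upward) are illustrative assumptions, not values from the disclosure:

```python
import math

def camera_pitch_degrees(viewer_eye_height_m, camera_height_m,
                         distance_m, threshold_m=4.5):
    """Pitch for the rear-facing camera so that the displayed background
    lies along the viewer's line of sight through the wearer.
    Beyond threshold_m (an assumed ~15 ft cutoff) the perspective
    difference is insignificant and the camera points straight back."""
    if distance_m >= threshold_m:
        return 0.0
    # A viewer whose eyes are below the camera looks upward through the
    # wearer, so the camera tilts up; a taller viewer tilts it down.
    return math.degrees(math.atan2(camera_height_m - viewer_eye_height_m,
                                   distance_m))
```

For a short, nearby viewer this yields a positive (upward) pitch showing treetop and sky; for a tall viewer it yields a negative (downward) pitch showing trunk and ground, consistent with the example above.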
- the user 734 may either be stationary or moving. Where the user 734 is moving, the gyroscope 611 determines the angle and location of an appendage 760 (e.g., leg, arm, head, etc.) of the user 734 , so as to determine the viewer's viewing angle of the appendage. When an arm 760 of the user 734 moves or rotates, the real time image 750 adjusts to show the background at that location, so that the user 734 still remains see-through.
- the image 750 is displayed in real time. Accordingly, as illustrated in FIG. 6 and according to our example, a leaf 766 may fall off the tree 765 .
- the camera 610 may capture the real time image 750 of the leaf 766 , and the image 750 will show the leaf to the viewer in sync with the leaf as it falls or blows past the user 734 .
- LEDs of the LED displays 736 , 738 , 740 , and 742 may be selectively activated such that portions of the user 734 may appear see-through, while other areas appear as the user's 734 normal clothes (which may or may not display other images such as those described herein).
- the user 734 may desire to alter his or her physique. Accordingly, as part of the real-time image module 30 , the user 734 may interact with the controller 612 (e.g., through an interface on a mobile device) to selectively activate certain LEDs on one or more LED displays.
- the user interface may allow the user 734 to select from one or more predetermined “desired physiques.”
- the desired physiques may include selections such as “Mesomorphic” (e.g., athletic), “Endomorphic” (e.g., strong), or “Ectomorphic” (e.g., lean and thin).
- the selections may include sub-selections, which may allow the user 734 to further define the type of build s/he wishes to portray. For example, a user 734 may select “Mesomorphic” and further select “Body-Builder” which, when activated, will give the user 734 apparently well-defined muscles from the perspective of the viewer.
- the controller 612 may use this information to determine exactly which LEDs are selectively activated to provide the user 734 with the desired physique. For example, a user 734 who is heavyset will require fabric 616 having a greater number of LEDs to begin with. Therefore, fewer of the overall LEDs will be required to be activated in order to achieve an ectomorphic physique. Conversely, a user 734 who is thinner will require fabric 616 having fewer LEDs to begin with. Accordingly, more of the overall LEDs will be required to be activated in order to achieve an ectomorphic physique.
- the system 601 may be capable of selectively altering the shape of an individual based on the individual's preferences.
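The LED-budget reasoning above might be sketched as follows; the per-physique silhouette sizes (in LED counts) are purely illustrative assumptions:

```python
# Illustrative silhouette sizes (in LEDs) for each selectable physique;
# these values are assumptions, not figures from the disclosure.
SILHOUETTE_LEDS = {"ectomorphic": 3000, "mesomorphic": 4000, "endomorphic": 5000}

def split_leds(total_leds, physique):
    """Return (physique_leds, background_leds). A garment for a heavyset
    wearer has more LEDs overall, so a smaller share of them renders the
    slim silhouette and more remain to display the real-time background."""
    body = min(SILHOUETTE_LEDS[physique.lower()], total_leds)
    return body, total_leds - body
```

With a fixed silhouette size, the fraction of LEDs devoted to the physique shrinks as the garment's total LED count grows, matching the heavyset/thin comparison above.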
- FIG. 8 illustrates some steps 800 that may be employed by the various systems described herein to display a static or dynamic image.
- the wearable LED display system 1 is activated by a permitted user 34 . Such activation may be through a dongle, biometrics, mobile application, login, user interface, button press, or other means.
- the display mode for the wearable LED display system 1 is selected by a user from the following modules stored in memory 14 : static, dynamic, situational, geographical, motion, and real time.
- step 805 it is determined whether real time display mode is selected.
- step 807 if real time mode is not selected, then the user-selected static 50 / 150 , dynamic 250 , situational 350 , geographical 450 , or motion 550 pattern or logo is displayed. It is foreseen that other images or patterns may be downloaded to the memory 14 to display on the wearable LED display system 1 . It is also foreseen that the image or pattern may be from a different memory source.
- step 809 if real time mode is selected, it is determined whether the user has selected a physique, e.g., “Mesomorphic,” “Endomorphic,” or “Ectomorphic.” If the user has selected a physique, the controller at step 811 turns on the camera 10 and activates the appropriate LEDs to display the user with the selected physique and the LEDs not displaying the selected physique are activated to display the real-time image. If the user has not selected a physique, at step 813 , the controller 12 turns on the camera 10 and displays a real time image from the camera 10 by controlling all LEDs in the LED fabric 16 (as opposed to some) to display the real time image.
- step 815 it is determined (e.g., via the distance finder) if there is a viewer within a pre-determined range of the wearable display system 1 .
- step 817 if a viewer is not within the predetermined range, then the camera is set to point straight behind the user and the image behind the user is displayed accordingly. The process repeats back to step 815 at timed intervals to determine if a viewer ever comes within range.
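The branching among steps 805 through 817 can be summarized as a pure function; the mode and action names below are hypothetical labels for illustration, not claim language:

```python
def select_display_actions(mode, physique=None, viewer_in_range=False):
    """Return the ordered controller actions for the chosen display mode,
    mirroring the decision points of FIG. 8 (steps 805, 809, and 815)."""
    if mode != "real_time":
        # Step 807: show the stored static/dynamic/situational/
        # geographical/motion pattern or logo.
        return ["display_%s_pattern" % mode]
    actions = ["camera_on"]  # steps 811 and 813 both start the camera
    if physique:
        actions.append("display_physique_with_background")  # step 811
    else:
        actions.append("display_background_on_all_leds")    # step 813
    if viewer_in_range:                                     # step 815
        actions.append("adjust_camera_for_viewer")
    else:
        actions.append("point_camera_straight_back")        # step 817
    return actions
```

For example, a static selection yields only the stored-pattern action, while real-time mode with a selected physique starts the camera, renders the physique over the background, and then resolves the camera angle from the viewer-range check.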
- the LEDs may serve as spatial markers.
- spatial markers may be provided separately from the LEDs.
- the markers may be configured for recognition by a detection device which may be a part of a 3D mapping system.
- the markers may allow the wearable display system to communicate with the detection device and related systems and/or other wearable displays.
- a wearable display system equipped with markers is configured for use in a theater environment.
- the detection device may track the location of the markers on the wearable display system worn by an actor (as well as markers which may be located in or around the environment of the wearable display system, such as the stage) and communicate the information to a spotlight.
- the spotlight may thus be able to track the location of the actor.
- a detection device may be configured to track the location of multiple wearable displays via the markers and provide a controlled response based on location information of the wearable displays.
- the detection device may receive location information from the wearable displays. If the detection device determines that the wearable displays are in a location associated with an event, it may send a signal to the wearable displays causing the wearable displays to display a congruous image. If, for example, a plurality of wearable displays is in a stadium, the detection device may determine that the location of the wearable displays in relation to the rest of the stadium places the wearable displays in the visiting team section. The detection device may send a signal which activates the plurality of wearable displays to, for example, display the visiting team's logo across the plurality of wearable displays.
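A minimal sketch of the detection device's controlled response follows; the section names and image identifiers are assumptions for illustration:

```python
def assign_images(wearable_sections, home_logo, visiting_logo):
    """Map each wearable display ID to the image the detection device
    would push to it, based on which stadium section its markers place
    it in. Displays in the visiting section receive the visiting team's
    logo; all others receive the home logo."""
    return {
        wearable_id: visiting_logo if section == "visiting" else home_logo
        for wearable_id, section in wearable_sections.items()
    }
```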
Abstract
Description
- Embodiments of this disclosure relate generally to LED (light emitting diode) based systems, such as OLEDs (organic light emitting diodes), front and side emitting LEDs, and thin-film-transistor-driven LEDs, which are embedded in fabric to form a textile article, such as headwear, clothing articles, gloves, mittens, shoes, carrying packs, tapestries, and other textile articles, to provide a desired illumination and display, and to textile articles comprising such LED display systems.
- The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the invention. It is not intended to identify critical elements of the invention or to delineate the scope of the invention. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description that is presented elsewhere.
- In one embodiment, a textile display system includes memory operable to store data; a flexible fabric having a plurality of light emitting devices which define a display; and a processor communicatively coupled to the memory and the display. The processor is operable to execute one or more modules in the memory. The modules include a static module displaying a static image; a dynamic module displaying a dynamic image; a situational module displaying a situational image specific to a situational event; a geographical module displaying a geographic image specific to a geographical location; and a motion module displaying a motion image upon detection of movement.
- In another embodiment, a textile display system includes memory operable to store data; a flexible fabric having a plurality of light emitting devices which define a display; a camera configured to capture a real-time image; and a processor communicatively coupled to the memory, the camera, and the display. The processor is operable to execute one or more modules in the memory. The modules include a static module displaying a static image; a dynamic module displaying a dynamic image; and a real-time module displaying the real-time image captured from the camera.
- In still another embodiment, a method of operating a textile light emitting device system having a flexible fabric with a plurality of light emitting devices which define a display, and a processor communicatively coupled to the memory and the display, has the steps of: initiating the system by an authorized user; selecting a display module; and displaying an image based on the selected display module. The display modules include a static module displaying a static image; a dynamic module displaying a dynamic image; a situational module displaying a situational image specific to a situational event; a geographical module displaying a geographic image specific to a geographical location; and a motion module displaying a motion image upon detection of movement.
- FIG. 1 is a block schematic view of a wearable light emitting device display system according to embodiments of the present disclosure.
- FIG. 2 is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure.
- FIG. 3A is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in static operational mode with a user.
- FIG. 3B is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in a second embodiment of static operational mode with a user.
- FIG. 3C is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in dynamic operational mode with a user.
- FIG. 3D is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in situational operational mode with a user.
- FIG. 3E is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in geographic operational mode with a user.
- FIG. 3F is a perspective view of the wearable light emitting device display system of FIG. 1 according to embodiments of the present disclosure, shown in motion detection operational mode with a user.
- FIG. 4 is a block schematic view of a wearable light emitting device display system according to embodiments of the present disclosure.
- FIG. 5 is a perspective view of the wearable light emitting device display system of FIG. 4 according to embodiments of the present disclosure.
- FIG. 6 is a perspective view of the wearable light emitting device display system of FIG. 4 according to embodiments of the present disclosure, shown in real time imaging operational mode with a user.
- FIG. 7 is another perspective view of the wearable light emitting device display system of FIG. 4 according to another embodiment of the disclosure.
- FIG. 8 is a flowchart illustrating various steps performed by wearable light emitting device systems according to an embodiment of the present disclosure.
- Reference is now made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. It is to be understood that other embodiments may be utilized and structural and functional changes may be made. Moreover, features of the various embodiments may be combined or altered. As such, the following description is presented by way of illustration only and should not limit in any way the various alternatives and modifications that may be made to the illustrated embodiments. In this disclosure, numerous specific details provide a thorough understanding of the subject disclosure. It should be understood that aspects of this disclosure may be practiced with other embodiments.
- FIGS. 1-2 illustrate an intelligent textile LED display system, which is generally given a reference number 1. For purposes of discussion, the textile LED display system is referred to herein as a wearable LED display system. However, it shall be understood that the textile LED system may additionally or alternately include other display systems that incorporate a textile, such as a tapestry that may be hung on a wall, seat covers, quilts, etc. The wearable LED display system 1 may include a processor 12, non-transitory computer memory 14, flexible fabric 16 laden with LEDs 31, a transceiver 17, and a power source 18. FIG. 2 illustrates an embodiment of the wearable LED display system 1 as a strip of fabric 16. The LED display system may include a plurality of visual indicators (which may be electrically stimulated), including but not limited to materials that emit light at specific wavelengths, such as traditional LEDs or organic LEDs. These types of display technology components operate in the optical transmissive mode. Other supported display technologies can conversely operate in the optical reflective mode, where an external light source must be present in order to become a visual indicator to the observer. One example of reflective mode display technology is programmable matter such as a metamaterial. Using metamaterials, external light can be manipulated to reflect selectively on individual or grouped reflective programmable particles. By changing the shape of the particles or their index of refraction, it may be possible to change the apparent color of the material from the observer's perspective. - The processor 12 (or "controller" or "master board") may be any appropriate device, whether now existing or later developed, which performs the operations specified by the various programming used by the system 1. The
processor 12 may be electronic circuitry located on a common chip or circuit board, or may be a distributed processor such that one portion of the processor is physically separate from another portion of the processor. In other words, discrete processing devices may be linked together (e.g., over a network) and collectively form the processor 12. The processor 12 is in data communication with a camera 10, the memory 14, and may further be in data communication with other peripherals (e.g., sensors, keyboards, touch screens, etc.) to control the camera 10 and/or other components. - The
controller 12 performs the tasks of retrieving modules and/or data from the memory 14 and controlling the different components of the system 1, including the transceiver 17. The controller 12 may receive instructions from an outside source (e.g., a mobile application, personal computer, laptop, or other means known in the art or yet to be known) by means of the transceiver 17. Such instructions may be the selection of the mode to display, as will be further discussed below. Such instructions may be the download of patterns or logos to be displayed, for example from an outside database of images and patterns. Such instructions may be received directly from memory 14 and not through an outside source. - The
transceiver 17 may include a transmitter (or "antenna") 19. The antenna 19 may be situated in a different location than the transceiver 17. For example, the antenna 19 may be a part of circuitry located in the fabric 16, or the transceiver 17 may be one of many transceivers located within the wearable LED display. The transceiver 17 communicates directly and/or over a wireless communication infrastructure with other devices configured to receive such wireless communication to provide controlled input to the LEDs 31. In direct wireless communications, the transceiver 17 may include baseband processing circuitry to convert data into a wireless signal (e.g., radio frequency (RF), infrared (IR), ultrasound, near field communication (NFC), et cetera) and the transmitter 19 transmits the wireless signal. When a second wireless transceiver (not shown) is within range (i.e., is close enough to the first wireless transceiver 17 to receive the wireless signal at a sufficient power level), it receives the wireless signal and converts the signal into meaningful information (e.g., voice, data, video, audio, text, instructions for completing a task, et cetera) via baseband processing circuitry (e.g., through an application on a phone, GPS, computer, notepad, etc., or through a central display located within the home). Examples of direct wireless communication (or point-to-point communication) include Bluetooth, ZigBee, Radio Frequency Identification (RFID), et cetera. - For indirect wireless communication or communication via a wireless communication infrastructure, the
first wireless transceiver 17 transmits and/or receives a wireless signal to a base station or access point, which conveys the signal to a wide area network (WAN) and/or to a local area network (LAN). The signal may traverse the WAN and/or LAN to a second base station or access point to send the signal to the second wireless transceiver, or it may traverse the WAN and/or LAN directly to the second wireless transceiver. Examples of wireless communication via an infrastructure include cellular satellite/tower, IEEE 802.11, public safety systems, et cetera. - The
memory 14 contains the relevant computer or program instructions for the controller 12, and may further include location data that may be gathered from a global positioning satellite (GPS) system. The data may be transmitted by the transmitter 19 as described above to obtain the location of the wearable LED system 1. The memory 14 may be situated on the same circuit board as the controller 12 and power source 18. - The
memory 14 may include volatile and non-volatile memory, and any appropriate data storage devices, whether now existing or later developed, may be used. Further, the memory 14 may be a unitary memory in one location, or may alternately be a distributed computer memory such that one portion of the computer memory 14 is physically separate from another portion of the memory 14. Example memory devices which may be employed include SRAM, DRAM, EPROM, EEPROM, Flash, magnetic, rotating media, ferromagnetic, and U3 smart drives. The memory 14 is in communication with the controller 12 for providing data to and receiving data from the controller 12. In some embodiments, data may be encrypted to prevent disassembly and reverse engineering. - The
memory 14 may include, for example, a program storage area (for storing software or "instructions") and a data storage area (for storing videos, still photographs, and other data). In some embodiments, the software components stored in memory 14 may include a static operation module 20, a dynamic operation module 22, a situation based module 24, a geographical operation module 26, a motion module 28, and a real time image module 30. It is foreseen that the memory 14 may further include an operating system. - The
static operation module 20 is configured to display a single image (e.g., a color, a pattern, a logo, etc.) which remains stationary while the static operation module 20 is in use. Conversely, the dynamic operation module 22 is configured to display an image which moves while the dynamic operation module 22 is in use. For example, the image may flash, scroll, rotate, or otherwise move on the respective display. The situation based module 24 may be configured for static or dynamic operation, the image being selected based on the user's specific situation (e.g., if the user is lost). Similarly, the geographical operation module 26 may be configured for static or dynamic operation, the image being selected based on the user's geographical location. Each of the static module 20, the dynamic module 22, the situation based module 24, and the geographical operation module 26 may be activated by the user, e.g., through an on/off button. The on/off button may be located somewhere on one or more of the displays, or may be remote (e.g., on the user's phone or the phone of a party authorized by the user, such as a parent on behalf of a child). The motion module 28 may similarly be configured for static or dynamic operation. However, the motion module 28 may be activated by movement of the user (e.g., instead of turning the module 28 on or off via a button). In embodiments, the motion module 28 may be turned on (e.g., via an on/off button as described above), but the image may not be triggered to turn on until the module 28 experiences movement by the user. The real-time image module 30 is configured to interact with a camera 10 to display images that are received from the camera 10. The various modules 20, 22, 24, 26, 28, and 30 are described in greater detail with reference to the examples below. - The
controller 12 is configured to retrieve from memory 14 and execute, among other things, instructions or modules related to the control processes and methods described herein. The controller 12 is connected to the memory 14 and may execute software instructions or modules that are stored in a RAM of the memory 14 (e.g., during execution), a ROM of the memory 14 (e.g., on a generally permanent basis), or another non-transitory computer readable medium such as another memory or a hard disc. For example, software can be stored in the internal memory of the controller board 14. The controller 12 can have part of its operations dedicated to a data transfer function between the camera 10 and the memory 14, and another part dedicated to the fabric LED array 16. Alternately, the controller 12 can embody the two functions in separate controllers (i.e., a data transfer function controller and an LED controller). - The
flexible LED fabric 16 may be a stretchable, conformable, thin-film transistor (TFT) driven LED display that is laminated or otherwise incorporated into fabric textiles and features a plurality of LED lights. The electronics (i.e., processor 12, memory 14, etc.) may be integrated with the textile, on the surface of the textile, or may be an iron-on patch. The LED displays 31 may be manufactured from flexible material. Examples of flexible materials used for the LED strip include polymer/plastic substrates, such as polyamide, polyether ether ketone (PEEK), or transparent conductive polyester film. For example, a polyimide substrate may be encapsulated in rubber, which would allow the display 31 to be laminated into fabric 16. The at least one flexible LED display strip 16 may be able to conform to the shape of the surface of a wearable article or of a user (e.g., spandex), said flexible strip comprising an upper surface 32 having one or more LEDs 31 disposed thereon which are operably positioned to provide a desired illumination; an electrical connector for electrically connecting the LED strip 16 to a power source 18; and may further include an electrical connector for electrically connecting the LED fabric strip 16 to a computer or other device so as to install additional programming or additional licensing for trademarks to be displayed. It is foreseen that this invention applies to any controllable light emitting device known or yet to be known and should not be limited to LEDs or OLEDs. - The
power source 18 may be located within the flexible LED fabric 16 or may be located remotely. The power source 18 of a system 1 can be one or more of: batteries, a computer battery through a USB connection, a motion-based generator, a solar cell, or another source of electricity such as a fuel cell or other means known in the art. It may be beneficial for the system 1 to include means for storing power, both in situations in which the system 1 is hard-wired to an electrical box and where the system receives its power from an outside source. Here, the power source 18 may be capacitors or a battery, such as NiCd (nickel-cadmium), NiZn (nickel-zinc), NiMH (nickel-metal hydride), or lithium-ion. The battery may store energy from other sources which may then be converted into electrical energy for use by the system 1. For example, the system may be able to capture solar energy through a photovoltaic cell. As another example, the power source 18 may be a capacitor or have a capacitor back-up that receives power until fully charged; then the capacitor provides temporary power to the system 1, such that even when the battery 18 becomes discharged, the system 1 will stay on for a short time. The system 1 may, for example, provide an indication (e.g., a sound, a displayed message, a flashing LED, etc.) to the user that the battery 18 is low or that the capacitor is being used and that the power source 18 should be replaced soon. - With reference now to
FIG. 3A, a user 34 is wearing a controllable LED display hat 36, a controllable LED display sweater 38, controllable LED display pants 40, and controllable LED display shoes 42. Each of the wearable LED displays 36, 38, 40, 42 is made up of one or more fabric LED systems 1 as disclosed above. The LED displays 36, 38, 40, and 42 may each be in static operation under control of controller 12 utilizing the static operation module 20. Alternately, one or more wearable LED displays 36, 38, 40, 42 may be controlled by the static operation module 20 while the respective other wearable LED displays 36, 38, 40, 42 may be controlled by a different module 22, 24, 26, 28, or 30, as will be further discussed below. - As noted above, the static operation module may control the
individual LEDs 31 in the wearable LED displays 36, 38, 40, 42 to display a single color or a single pattern 50, for example: paisley, plaid, stripes, denim, floral, or other patterns known in the art or yet to be known. The memory 14 may have the patterns 50 stored for the static module 20 to recall and display. The color depth may be 24-bit or 32-bit or another color depth yet to be known. The wearable LED displays 36, 38, 40, 42 may all be set to the same static pattern or may each display its own static pattern. - Another example of static operation is illustrated in
FIG. 3B. There, a user 134 is wearing a controllable LED display hat 136, a controllable LED display sweater 138, controllable LED display pants 140, and controllable LED display shoes 142, each of which is under control of controller 12 utilizing an operation module such as the static operation module 20. The controllable wearable LED displays 136, 138, 140, and 142 are substantially similar to counterpart displays 36, 38, 40, and 42, except as specifically noted and/or shown, or as would be inherent. In this embodiment, the static pattern 150 is an image on the controllable LED display sweater 138. The image 150 may be repeated throughout the sweater 138 or on the other controllable LED displays 136, 140, 142. The image 150 may be stored in memory 14 and displayed at random or central locations, or at locations determined by the user 134. The image 150 may be in any file format, such as JPEG, Exif, TIFF, GIF, BMP, PNG, or any other format known or yet to be known. - With reference now to
FIG. 3C, a user 234 is wearing a controllable LED display hat 236, a controllable LED display sweater 238, controllable LED display pants 240, and controllable LED display shoes 242, each of which is under control of controller 12 utilizing an operation module such as the dynamic operation module 22. The controllable wearable LED displays 236, 238, 240, and 242 are substantially similar to counterpart displays 36, 38, 40, and 42, except as specifically noted and/or shown, or as would be inherent. In the illustrated example of FIG. 3C, the dynamic phrase 250 “Follow Me” scrolls in direction A across the user's sweater 238, and may repeat as the phrase ends on the left side of the controllable wearable LED display sweater 238. It shall be understood that the image 250 may be dynamically displayed on any of the wearable LED displays 236, 238, 240, 242. Like the static operation module 20, the dynamic operation module 22 controls the individual LEDs 31 in the wearable LED displays 236, 238, 240, 242 to display the dynamic phrase 250. - The
memory 14 may store one or more dynamic patterns 250 for the dynamic module 22 to recall and display. The dynamic patterns 250 may include detailed motion patterns for each wearable display 236, 238, 240, 242 that may be used or worn. The moving image 250 may be displayed at locations randomly selected or centrally located or at (a) location(s) determined by the user 234. The image 250 may be of any type of video file format, such as MPEG, MOV, WMV, AMV, MP4, MPG, M4V, or any other format known or yet to be known. The wearable LED displays 236, 238, 240, 242 may be configured to play a movie or a moving image from memory 14 or a video sent to a cellular device, which may be accessed via the controller 12. - In
FIG. 3D, a user 334 is wearing a controllable LED display hat 336, a controllable LED display sweater 338, controllable LED display pants 340, and controllable LED display shoes 342, each of which is under control of controller 12 utilizing a module such as the situational module 24. The controllable wearable LED displays 336, 338, 340, and 342 are substantially similar to counterpart displays 36, 38, 40, and 42, except as specifically noted and/or shown, or as would be inherent. A user 334, such as a child, may be wearing the wearable LED displays 336, 338, 340, 342. An outside source, such as the child's parent's phone, may communicate with the transceiver(s) 17 of the wearable LED displays 336, 338, 340, 342 to change the display format of the wearable LED displays 336, 338, 340, 342 to a specific distress or aid pattern 350. Like the static operation module 20 and the dynamic operation module 22, the situational module 24 controls the individual LEDs 31 in the wearable LED displays 336, 338, 340, 342 to display a situation-specific pattern 350. In FIG. 3D, the situational phrase 350 “HELP I'M LOST” may optionally be held static or may be in motion on the wearable LED display sweater 338. The situational phrase 350 may be displayed across one or more controllable wearable LED displays 336, 338, 340, and 342. Other exemplary situational phrases may include “Call 911, I have been kidnapped,” or “If found please contact parents at (913) 555-5555.” The wearable LED displays 336, 338, 340, 342 may adjust the viewing angle such that only small children or other short individuals at the height of the user 334 may see the situational pattern 350 while those above that height cannot, or vice versa. The memory 14 may have static or dynamic situational patterns 350 stored for the situational module 24 to recall and display. - There may also be a panic button on one or more
wearable LED displays 336, 338, 340, 342, in lieu of, or in combination with, the transceiver 17, which may allow the user 334 to change to the situational pattern 350 at will. - In still another example, illustrated in
FIG. 3E, a user 434 is wearing a controllable LED display hat 436, a controllable LED display sweater 438, controllable LED display pants 440, and controllable LED display shoes 442, each of which is under control of controller 12 utilizing an operation module such as the geographical operation module 26. The controllable wearable LED displays 436, 438, 440, and 442 are substantially similar to counterpart displays 36, 38, 40, and 42, except as specifically noted and/or shown, or as would be inherent. In FIG. 3E, the geographical pattern or logo 450 is displayed on one or more of the controllable LED displays 436, 438, 440, 442 and may be held static or may be in motion, for example by blinking, repeating, or striping across the LED display(s). Like the static operation module 20 and the dynamic operation module 22, the geographical module 26 controls the individual LEDs 31 in the wearable LED displays 436, 438, 440, 442 to display a geography-specific pattern 450. - The
geographical pattern 450 may be based on geographical points of interest near the location of the user 434. In embodiments, transceiver(s) 17 of the wearable LED displays 436, 438, 440, 442 communicate with a GPS to obtain location data, which may be communicated to the controller 12. In the illustrated example, the controller 12 may determine that the user 434 is located either around the Royals® stadium or in the Kansas City area, and one or more of the wearable LED displays 436, 438, 440, 442 may automatically activate to display a Royals® logo 450. Alternately, the user 434 may cause one or more of the LED displays 436, 438, 440, 442 to exhibit the geographic pattern 450, e.g., by engaging with the module 26 to activate or deactivate the image 450. The location of the display 450 (e.g., which one or more of the LED displays 436, 438, 440, 442 will show the logo and/or where on the display 436, 438, 440, and/or 442 the logo may be located) may be predetermined by the user 434. The geographical module 26 is envisioned to be switched on or off as specified by the user 434. - In embodiments, the
memory 14 may have multiple designs and/or patterns stored thereon which may be associated with a particular geographic location. The user 434 may be able to select the desired design or pattern 450 for exhibition by interacting with the module 26 (e.g., via a button or a corresponding module on a smart phone or other electronic device). For example, the user 434 may prefer to simply display a “KC” design while in Kansas City or around Kauffman Stadium without specifically showing support for the Royals. Accordingly, the user 434 may scroll through the various stored designs or patterns 450 until s/he reaches the one which is most appealing; that design 450 may thus be displayed. As described herein, the designs and/or patterns 450 may be static or dynamic. - In one embodiment, a system 1 may include a plurality of
processors 12 in communication, wherein each processor 12 contains a unique identifier known as a Media Access Control address (or MAC address). Each MAC address may be encoded into the memory 14 as a non-volatile unique identification code that can be machine-readable (and optionally writable) and/or human-readable through means of network data communication or display patterns on the fabric of the system 1. The system 1 may be used to momentarily display visual codes or geometrically colored patterns on each localized subsystem in such a way that a supervisory observing system monitor could uniquely identify each individual subsystem. An observing system management controller may identify, authenticate, and intercommunicate with several subsystem processor 12 modules in order to compile and seamlessly display cascaded, orchestrated patterns on the system 1 via the plurality of processors 12. - In an embodiment, the application of the system 1 may provide a programmable real-time motion capture tool which may be an advanced tracking marker to be worn by actors and/or mounted on objects being filmed during production of computer-generated imagery (CGI) for special effects by filmmakers. Three-dimensional positioning can be resolved and refined in real time by utilizing changing patterns on the wearable LED display(s) during filming, as well as assisting in chroma-key production technique enhancements such as color-key avoidance and specific cue markers being displayed at critical time-markers during the scene.
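By way of a non-limiting illustration, the visual identification described above might be sketched as follows. The mapping of one colored block per hexadecimal digit, and the 16-entry palette it implies, are assumptions for illustration only, not the encoding of the present disclosure:

```python
# Illustrative sketch: encode a subsystem's MAC address as a sequence
# of palette indices (one colored block per hex digit) that a
# supervisory observing system could read back off the fabric.
# The 16-color palette scheme is a hypothetical choice.

def mac_to_pattern(mac):
    """Map each hex digit of a MAC address to a color index 0-15."""
    return [int(d, 16) for d in mac.replace(":", "").lower()]

def pattern_to_mac(pattern):
    """Decode a color-index sequence back into a MAC address string."""
    digits = "".join(f"{p:x}" for p in pattern)
    return ":".join(digits[i:i + 2] for i in range(0, len(digits), 2))
```

A monitor that knows the palette could thus recover each subsystem's identity from a single displayed frame, consistent with the momentary visual codes described above.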
- Moving on, in
FIG. 3F, a user 534 is wearing a controllable LED display hat 536, a controllable LED display sweater 538, controllable LED display pants 540, and controllable LED display shoes 542, each of which is under control of controller 12 utilizing an operation module such as the motion detection module 28. The controllable wearable LED displays 536, 538, 540, and 542 are substantially similar to counterpart displays 36, 38, 40, and 42, except as specifically noted and/or shown, or as would be inherent. In the illustrated example of FIG. 3F, the motion pattern or logo 550 may be held static or may be in motion. Like the static operation module 20 and the dynamic operation module 22, the motion module 28 controls the individual LEDs 31 in the wearable LED displays 536, 538, 540, 542 to display a specific pattern 550 when it is determined by the controller 12 that the user 534 is running, jumping, or otherwise moving. In other words, the controllable wearable LED displays 536, 538, 540, and 542 may exhibit the predetermined motion pattern 550 upon movement by the user 534. The transceiver(s) 17 of the wearable LED displays 536, 538, 540, 542 may communicate with a GPS to obtain location data; this location data is communicated to the controller 12 to ascertain movement. Motion may be detected through other means, such as an accelerometer, gyroscope, motion detector, camera, gesture detection, or other means known or yet to be known. - In the illustrated example, the
controller 12 determines that the user 534 is moving from a stationary position 560 to a running position 561, and the wearable LED displays 536, 538, 540, 542 may each individually or all change to a Flash® logo pattern 550 on the wearable LED sweater 538 with lightning being displayed across the wearable LED displays 536, 540, 542. The motion module 28 is envisioned to be switched on or off as specified by the user 534. The memory 14 may have static or dynamic patterns or logos 550 stored for the motion module 28 to recall and display. -
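One simple way the stationary-versus-running determination above might be made from accelerometer samples is sketched below. The windowed magnitude-spread test and the 2.0 m/s² threshold are illustrative assumptions, not values from the present disclosure:

```python
import math

# Hedged sketch: classify movement by comparing the spread of the
# acceleration magnitude over a window of samples to a threshold.
# The 2.0 m/s^2 threshold is a hypothetical tuning value.

def is_moving(samples, threshold=2.0):
    """samples: iterable of (ax, ay, az) readings in m/s^2."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    return (max(mags) - min(mags)) > threshold
```

A stationary wearer produces magnitudes near gravity with little spread; running produces large swings, triggering the motion pattern 550.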
FIGS. 4-5 illustrate an intelligent wearable LED display system, which is generally given reference number 601. The wearable LED display system 601 may include a camera 610, a gyroscope 611, a processor 612, memory 614, a flexible fabric 616 laden with LEDs 631, a transceiver 617, and a power source 618. FIG. 5 illustrates an embodiment of the wearable LED display system 601 as a strip of fabric 616. While elements are often referred to in the singular, those skilled in the art will appreciate that multiple such elements may often be employed and that the use of multiple such elements which collectively perform as expressly or inherently disclosed is fully contemplated herein. For example, in embodiments, the gyroscope 611 may be an array of gyroscopes situated so as to monitor the movement of the user 734. - The
processor 612, memory 614, the flexible fabric 616 laden with LEDs 631, the transceiver 617, and the power source 618 are substantially similar to counterpart components illustrated in FIG. 1, except as specifically noted and/or shown, or as would be inherent. The system 601 may have the same functionality as the system 1 described above, with further functionality as described with reference to FIG. 6. - In
FIG. 6, a user 734 is wearing a controllable LED display hat 736, a controllable LED display sweater 738, controllable LED display pants 740, and controllable LED display shoes 742, each of which is under control of controller 12 utilizing a module such as the real time image module 30. The controllable wearable LED displays 736, 738, 740, and 742 are substantially similar to counterpart displays 36, 38, 40, and 42, except as specifically noted and/or shown, or as would be inherent. Each of the wearable LED displays 736, 738, 740, 742 may be made up of one or more fabric LED systems 601 as described above. An image 750 may be exhibited on one or more controllable wearable LED displays 736, 738, 740, and 742, similar to the other embodiments described above. The image 750 may be static or, as shall be appreciated from the description provided below, preferably the image 750 is dynamic. The real time image module 30 controls the individual LEDs 731 in the wearable LED displays 736, 738, 740, 742 to display the real time image 750 captured by the camera 610. - The camera 610 (or “imaging module” or “optical sensor” or cameras) may be configured to continuously record or to record upon a trigger such as motion, and may store recorded sessions in the
memory 614. The camera 610 may capture still images or video, may include filters to adjust for lighting, and may be a visible-light camera, a low-light camera, an infrared camera, or combinations thereof. In embodiments, the camera 610 may have a wide-angle lens so as to capture the widest area possible. The camera 610 may be connected with a network and accessible (e.g., via a laptop or other computer). The camera 610 may be located in a single place, such as the back of the collar of a shirt, the wrist of a mitten, the brim of a cap, or another known location. While this document shall often refer to elements in the singular, those skilled in the art will appreciate that multiple such elements may often be employed and that the use of multiple such elements which collectively perform as expressly or inherently disclosed is fully contemplated herein. In some embodiments, the camera 610 may be an array of cameras situated so as to get a complete wide-angle view of an area. - A
user 734 may be wearing the controllable wearable LED displays 736, 738, 740, and 742, and the camera 610 may capture the image behind the user 734. The controller 12 may cause the captured image to be exhibited by one or more of the LED displays 736, 738, 740, 742, such that the user 734 appears at least partially see-through or camouflaged with the background. As shown in FIG. 6, the wearable LED displays 736, 738, 740, 742 may each individually display a respective portion of the tree 765 behind the user 734 such that the user 734 is substantially invisible to a viewer. - In one embodiment, the
image 750 may be manipulated based on the perspective of a viewer. Each viewer of the user 734 has a specific vantage point that is dependent on the distance from the user 734 and the height of the viewer. For example, a person short of stature who is viewing the user 734 from a short distance will have a different perspective than a person who is very tall. However, as the distance between the viewer and the user 734 increases, the viewer's perspective becomes less significant. In order that the viewer always has the most accurate view of the user 734 (or lack thereof, since the user 734 may be substantially see-through), the camera 610 may be configured to swivel to take into account the viewer's perspective. In order for the system 601 to determine the viewer's perspective, the distance from the viewer to the user 734 must first be determined. A distance finder, such as a laser (or an array of lasers) or a second camera, may be utilized in conjunction with the controller 612 to determine the distance from the user to the viewer according to known techniques. As noted above, each viewer's perspective may be substantially similar at longer distances. Accordingly, the controller 612 may cause the distance finder to additionally ascertain the height of the viewer according to known techniques if the distance between the viewer and the user 734 is less than a predetermined threshold (e.g., 5 feet, 10 feet, 15 feet, etc.). - Where the distance finder determines the distance and the height of the viewer, the
controller 612 may be able to determine the viewing angle between the viewer and the user 734. The controller 612 may adjust the camera angle accordingly. For example, if the viewer is short of stature and is standing close to the user 734, the camera 610 may be adjusted such that it has a more upwardly-angled trajectory. The viewer may see, for example, the top of the tree and a portion of the sky. Alternately, where the viewer is tall, the camera 610 may be adjusted such that it has a more downwardly-angled trajectory. The viewer may see, for example, the trunk of the tree and a portion of the ground. If the distance between the viewer and the user 734 is greater than the predetermined threshold, then the camera 610 may be positioned such that it is pointed straight behind the user 734. In this way, the perspective of the viewer may be taken into account to ensure that the view is as accurate as possible. - The
user 734 may either be stationary or moving. Where the user 734 is moving, the gyroscope 611 determines the angle and location of an appendage 760 (e.g., leg, arm, head, etc.) of the user 734, so as to determine the viewer's viewing angle of the appendage. When an arm 760 of the user 734 moves or rotates, the real time image 750 adjusts to show the background at that location, so that the user 734 still remains see-through. - It is preferable that the
image 750 is displayed in real time. Accordingly, as illustrated in FIG. 6 and continuing the present example, a leaf 766 may fall off the tree 765. The camera 610 may capture the real time image 750 of the leaf 766, and the image 750 will show the leaf to the viewer in sync with the leaf as it falls or blows past the user 734. - In still another embodiment illustrated in
FIG. 7, which is substantially similar to the embodiment illustrated in FIG. 6, LEDs of the LED displays 736, 738, 740, and 742 may be selectively activated such that portions of the user 734 may appear see-through, while other areas appear as the user's 734 normal clothes (which may or may not display other images such as those described herein). The user 734 may desire to alter his or her physique. Accordingly, as part of the real-time image module 30, the user 734 may interact with the controller 612 (e.g., through an interface on a mobile device) to selectively activate certain LEDs on one or more LED displays. For example, the user interface may allow the user 734 to select from one or more predetermined "desired physiques." The desired physiques may include selections such as "Mesomorphic" (e.g., athletic), "Endomorphic" (e.g., strong), or "Ectomorphic" (e.g., lean and thin). The selections may include sub-selections, which may allow the user 734 to further define the type of build s/he wishes to portray. For example, a user 734 may select "Mesomorphic" and further select "Body-Builder" which, when activated, will give the user 734 apparently well-defined muscles from the perspective of the viewer. - In addition to the
user 734 inputting his or her desired physique, additional details may be required, including the user's sex, height, and weight. The controller 612 may use this information to determine exactly which LEDs are selectively activated to provide the user 734 with the desired physique. For example, a user 734 who is heavyset will require fabric 616 having a greater number of LEDs to begin with. Therefore, fewer of the overall LEDs will need to be activated in order to achieve an ectomorphic physique. Conversely, a user 734 who is thinner will require fabric 616 having fewer LEDs to begin with. Accordingly, more of the overall LEDs will need to be activated in order to achieve an ectomorphic physique. Those of skill in the art shall therefore understand that the system 601 may be capable of selectively altering the apparent shape of an individual based on the individual's preferences. -
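A simplified, non-limiting model of the silhouette shaping described above is sketched below. The scale factors per physique, and the proportional edge-LED model, are hypothetical assumptions for illustration, not parameters of the present disclosure:

```python
# Hedged sketch: LEDs near the garment's edges are switched to the
# camera's background feed so the wearer's outline appears scaled
# toward the selected physique. The scale factors are hypothetical.

SILHOUETTE_SCALE = {
    "Ectomorphic": 0.70,   # lean: show background on 30% of LEDs
    "Mesomorphic": 0.85,
    "Endomorphic": 1.00,   # full outline: no background LEDs
}

def background_led_count(total_leds, physique):
    """Number of edge LEDs switched to the real-time background image."""
    return round(total_leds * (1.0 - SILHOUETTE_SCALE[physique]))
```

Under this model, a garment with more LEDs devotes proportionally more of them to the background feed for the same target silhouette, which is one plausible way the controller 612 could act on the user's entered details.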
FIG. 8 illustrates some steps 800 that may be employed by the various systems described herein to display a static or dynamic image. In step 801, the wearable LED display system 1 is activated by a permitted user 34. Such activation may be through a dongle, biometrics, a mobile application, a login, a user interface, a button press, or other means. In step 803, the display mode for the wearable LED display system 1 is selected by a user from the following modules stored in memory 14: static, dynamic, situational, geographical, motion, and real time. - In step 805, it is determined whether real time display mode is selected. In
step 807, if real time mode is not selected, then the user-selected static 50/150, dynamic 250, situational 350, geographical 450, or motion 550 pattern or logo is displayed. It is foreseen that other images or patterns may be downloaded to the memory 14 for display on the wearable LED display system 1. It is also foreseen that the image or pattern may come from a different memory source. - In
step 809, if real time mode is selected, it is determined whether the user has selected a physique, e.g., "Mesomorphic," "Endomorphic," or "Ectomorphic." If the user has selected a physique, the controller at step 811 turns on the camera 10 and activates the appropriate LEDs to display the user with the selected physique, and the LEDs not displaying the selected physique are activated to display the real-time image. If the user has not selected a physique, at step 813, the controller 12 turns on the camera 10 and displays a real time image from the camera 10 by controlling all LEDs in the LED fabric 16 (as opposed to some) to display the real time image. - Moving on, regardless of the decision at
step 809, at step 815, it is determined (e.g., via the distance finder) whether there is a viewer within a predetermined range of the wearable display system 1. In step 817, if a viewer is not within the predetermined range, then the camera is set to look directly behind the user and the image behind the user is displayed accordingly. The process repeats back to step 815 at timed intervals to determine whether a viewer comes within range. - In one embodiment, the LEDs may serve as spatial markers. Alternatively, spatial markers may be provided separately from the LEDs. The markers may be configured for recognition by a detection device which may be a part of a 3D mapping system. The markers may allow the wearable display system to communicate with the detection device and related systems and/or other wearable displays. In one example, a wearable display system equipped with markers is configured for use in a theater environment. The detection device may track the location of the markers on the wearable display system worn by an actor (as well as markers which may be located in or around the environment of the wearable display system, such as the stage) and communicate the information to a spotlight. The spotlight may thus be able to track the location of the actor.
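The spotlight tracking in the theater example above could, by way of illustration, amount to converting a tracked marker position into aiming angles for the light. The stage coordinate convention (x across the stage, y toward the back, z up, in meters) and the pan/tilt formulation are assumptions, not the method of the present disclosure:

```python
import math

# Hedged sketch: convert a tracked marker position into pan/tilt
# angles for a spotlight at a known mounting point, in hypothetical
# stage coordinates (meters).

def spotlight_angles(marker, light):
    """Return (pan_deg, tilt_deg) aiming the light at the marker."""
    dx = marker[0] - light[0]
    dy = marker[1] - light[1]
    dz = marker[2] - light[2]
    pan = math.degrees(math.atan2(dx, dy))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt
```

As the detection device reports updated marker locations, recomputing these two angles keeps the spotlight trained on the actor.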
- In another example, a detection device may be configured to track the location of multiple wearable displays via the markers and provide a controlled response based on location information of the wearable displays. Here, the detection device may receive location information from the wearable displays. If the detection device determines that the wearable displays are in a location associated with an event, it may send a signal to the wearable displays causing the wearable displays to display a congruous image. If, for example, a plurality of wearable displays is in a stadium, the detection device may determine that the location of the wearable displays in relation to the rest of the stadium places the wearable displays in the visiting team section. The detection device may send a signal which activates the plurality of wearable displays to, for example, display the visiting team's logo across the plurality of wearable displays.
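The stadium example above can be illustrated with a simple location-to-image assignment. The seating-section lookup, the (row, seat) location format, and the image names are hypothetical assumptions, not the signaling protocol of the present disclosure:

```python
# Hedged sketch: the detection device maps each located wearable
# display to an image based on the seating section its reported
# location falls in. Section geometry is a hypothetical lookup.

VISITING_SECTION = {(row, seat) for row in range(3) for seat in range(10, 20)}

def assign_images(display_locations):
    """display_locations: dict id -> (row, seat); returns id -> image."""
    return {
        did: ("visiting_logo" if loc in VISITING_SECTION else "default_pattern")
        for did, loc in display_locations.items()
    }
```

Broadcasting the resulting assignments would cause the plurality of displays in the visiting section to show the visiting team's logo congruously, as described above.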
- Many different arrangements of the various components depicted, as well as components not shown, are possible without departing from the spirit and scope of the present invention. Embodiments of the present invention have been described with the intent to be illustrative rather than restrictive. Alternative embodiments that do not depart from its scope will become apparent to those skilled in the art. A skilled artisan may develop alternative means of implementing the aforementioned improvements without departing from the scope of the present invention. Further, it will be understood that certain features and subcombinations may be of utility and may be employed within the scope of the disclosure.
- Various steps set forth herein may be carried out in orders that differ from those set forth herein without departing from the scope of the present methods. This description shall not be restricted to the above embodiments. It is to be understood that while certain forms of the present invention have been illustrated and described herein, it is not to be limited to the specific forms or arrangement of parts described and shown.
Claims (10)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/629,513 US20180373293A1 (en) | 2017-06-21 | 2017-06-21 | Textile display system and method |
| PCT/US2018/038547 WO2018237035A1 (en) | 2017-06-21 | 2018-06-20 | SYSTEM AND METHOD FOR TEXTILE DISPLAY |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/629,513 US20180373293A1 (en) | 2017-06-21 | 2017-06-21 | Textile display system and method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20180373293A1 true US20180373293A1 (en) | 2018-12-27 |
Family
ID=64693102
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/629,513 Abandoned US20180373293A1 (en) | 2017-06-21 | 2017-06-21 | Textile display system and method |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US20180373293A1 (en) |
| WO (1) | WO2018237035A1 (en) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190384556A1 (en) * | 2018-06-15 | 2019-12-19 | Avery Dennison Retail Information Services, Llc | Light-emitting clothing trimmings |
- 2017-06-21 US US15/629,513 patent/US20180373293A1/en not_active Abandoned
- 2018-06-20 WO PCT/US2018/038547 patent/WO2018237035A1/en not_active Ceased
Patent Citations (33)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5220631A (en) * | 1991-12-23 | 1993-06-15 | Grippin Raymond R | Fiber optic camouflage |
| US20010008423A1 (en) * | 2000-01-14 | 2001-07-19 | Minolta Co., Ltd. | Distance measuring device and a camera provided with the distance measuring device |
| US20020117605A1 (en) * | 2001-01-08 | 2002-08-29 | Alden Ray M. | Three-dimensional receiving and displaying process and apparatus with military application |
| US20030047666A1 (en) * | 2001-01-08 | 2003-03-13 | Alden Ray M. | Three-dimensional signature control process and apparatus with military application |
| US20040036006A1 (en) * | 2002-02-19 | 2004-02-26 | Color Kinetics, Inc. | Methods and apparatus for camouflaging objects |
| US20070014916A1 (en) * | 2002-11-19 | 2007-01-18 | Daniels John J | Organic and inorganic light active devices and methods for making the same |
| US20080248191A1 (en) * | 2002-11-19 | 2008-10-09 | Articulated Technologies, Llc | Organic and Inorganic Light Active Devices and Methods for Making the Same |
| US20040187184A1 (en) * | 2003-03-27 | 2004-09-30 | Rubin Aaron Cole | Apparel articles including flexible personal device and information displays |
| US20080010877A1 (en) * | 2003-04-29 | 2008-01-17 | France Telecom | Flexible display |
| US20160295038A1 (en) * | 2004-01-30 | 2016-10-06 | Ip Holdings, Inc. | Image and Augmented Reality Based Networks Using Mobile Devices and Intelligent Electronic Glasses |
| US20060158558A1 (en) * | 2004-12-30 | 2006-07-20 | Chul Chung | Integrated multimedia signal processing system using centralized processing of signals |
| US20150309611A1 (en) * | 2006-03-30 | 2015-10-29 | Roel Vertegaal | Interaction techniques for flexible displays |
| US20080058894A1 (en) * | 2006-08-29 | 2008-03-06 | David Charles Dewhurst | Audiotactile Vision Substitution System |
| US20080267460A1 (en) * | 2007-04-24 | 2008-10-30 | Takata Corporation | Occupant information detection system |
| US20130211302A1 (en) * | 2007-05-23 | 2013-08-15 | Timothy W. Brown | Sensory Motor Stimulation Garment and Method |
| US20090257654A1 (en) * | 2008-04-11 | 2009-10-15 | Roizen Michael F | System and Method for Determining an Objective Measure of Human Beauty |
| US20120137399A1 (en) * | 2008-07-14 | 2012-06-07 | Forte Michael A | Apparel Attachable Detachable Display Frame |
| US20120127687A1 (en) * | 2009-07-13 | 2012-05-24 | Arizona Board of Regents, a body corporate of the state of Arizona, acting for and on behalf of | Flexible circuits and electronic textiles |
| US20120184367A1 (en) * | 2011-01-14 | 2012-07-19 | Igt | Wearable casino gaming display and tracking system |
| US20120218253A1 (en) * | 2011-02-28 | 2012-08-30 | Microsoft Corporation | Adjusting 3d effects for wearable viewing devices |
| US20120249741A1 (en) * | 2011-03-29 | 2012-10-04 | Giuliano Maciocci | Anchoring virtual images to real world surfaces in augmented reality systems |
| US20130328783A1 (en) * | 2011-06-30 | 2013-12-12 | Sheridan Martin | Transmission of information to smart fabric ouput device |
| US20130129234A1 (en) * | 2011-11-22 | 2013-05-23 | The Trustees Of Dartmouth College | Perceptual Rating Of Digital Image Retouching |
| US20140049487A1 (en) * | 2012-08-17 | 2014-02-20 | Qualcomm Incorporated | Interactive user interface for clothing displays |
| US20150143601A1 (en) * | 2012-09-11 | 2015-05-28 | Gianluigi LONGINOTTI-BUITONI | Garments having stretchable and conductive ink |
| US20140268532A1 (en) * | 2013-03-15 | 2014-09-18 | Bfam Corp. | Flexible electronic display and stand |
| US20160174321A1 (en) * | 2013-07-22 | 2016-06-16 | Koninklijke Philips N.V. | Method and apparatus for selective illumination of an illuminated textile based on physical context |
| US20170052382A1 (en) * | 2013-10-16 | 2017-02-23 | Emmett Dunham | Color-changing panel for active camouflage configurations |
| US20160041581A1 (en) * | 2014-08-06 | 2016-02-11 | Ted R. Rittmaster | Flexible display screen systems and methods |
| US20160358247A1 (en) * | 2014-12-11 | 2016-12-08 | Arunava Majumdar | Networked electronically programmable dynamic displays on personal and commercial properties for commercial and non-commercial use |
| US20160240154A1 (en) * | 2015-02-12 | 2016-08-18 | Qualcomm Incorporated | Efficient operation of wearable displays |
| US20170352058A1 (en) * | 2016-06-07 | 2017-12-07 | International Business Machines Corporation | System and method for dynamic advertising |
| US20180182171A1 (en) * | 2016-12-26 | 2018-06-28 | Drawsta, Inc. | Systems and Methods for Real-time Multimedia Augmented Reality |
Non-Patent Citations (1)
| Title |
|---|
| WO00/26890 * |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11062628B2 (en) * | 2017-03-22 | 2021-07-13 | 10644137 Canada Inc. | Apparatus having a flexible LED display module and a method of employing same |
| US20190384556A1 (en) * | 2018-06-15 | 2019-12-19 | Avery Dennison Retail Information Services, Llc | Light-emitting clothing trimmings |
| US20200056777A1 (en) * | 2018-07-24 | 2020-02-20 | Sanko Tekstil Isletmeleri San. Ve Tic. A.S. | Fabric and article with led embedded therein and the related production process |
| US11561002B2 (en) * | 2018-07-24 | 2023-01-24 | Sanko Tekstil Isletmeleri San. Ve Tic. A.S. | Fabric and article with led embedded therein and the related production process |
| US10561187B1 (en) * | 2019-01-29 | 2020-02-18 | Mary-Elizabeth Antoinette Baccas | Digital display terminal apparel |
| US20200326678A1 (en) * | 2019-04-13 | 2020-10-15 | Juan Guzman | Integrated Wearable Energy Generation and Annunciation Systems |
| US11422522B2 (en) * | 2019-04-13 | 2022-08-23 | Juan Guzman | Integrated wearable energy generation and annunciation systems |
| US12112687B2 (en) | 2021-12-07 | 2024-10-08 | Kyndryl, Inc. | Dynamic display for image-enabled clothing |
| WO2023231728A1 (en) * | 2022-05-31 | 2023-12-07 | 人工智能设计研究所有限公司 | System for human-computer interaction and textile for human-computer interaction |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2018237035A1 (en) | 2018-12-27 |
Similar Documents
| Publication | Title |
|---|---|
| US20180373293A1 (en) | Textile display system and method |
| CN113961076B (en) | Object Holographic Enhancement | |
| US10945662B2 (en) | Smart fitness apparatus | |
| US10169654B2 (en) | Crowd-sourced vision-based information collection | |
| US20050011959A1 (en) | Tags and automated vision | |
| US10083351B2 (en) | Control system and control method | |
| US20150140934A1 (en) | Wireless motion activated user device with bi-modality communication | |
| US20180241864A1 (en) | Wearable Devices | |
| KR102103980B1 (en) | An augmented reality system to which a dynamic expression technique of an augmented image according to a user's gaze information is applied | |
| US9697427B2 (en) | System for automatically tracking a target | |
| CN103619090A (en) | System and method of automatic stage lighting positioning and tracking based on micro inertial sensor | |
| KR20160003553A (en) | Electroninc device for providing map information | |
| CN109324693A (en) | AR search device, article search system and method based on AR search device | |
| CN106371585A (en) | Augmented reality system and method thereof | |
| Nickels et al. | Find my stuff: supporting physical objects search with relative positioning | |
| CN112052355A (en) | Video display method, device, terminal, server, system and storage medium | |
| US20230375837A1 (en) | Ring-mounted flexible circuit remote control | |
| US20190347913A1 (en) | Object for theft detection | |
| WO2019240867A1 (en) | Uniquely identifiable articles of fabric configured for data communication | |
| CN209514548U (en) | AR searcher, the articles search system based on AR searcher | |
| CN117409119A (en) | Image display method and device based on virtual image and electronic equipment | |
| US11527050B2 (en) | Rotational device for an augmented reality display surface using NFC technology | |
| Li et al. | Handheld pose tracking using vision-inertial sensors with occlusion handling | |
| US20240280700A1 (en) | Optical tracking system with data transmission via infrared | |
| WO2020013768A1 (en) | An illumination system for textile products |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: NEWTONOID TECHNOLOGIES, L.L.C., MISSOURI. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:STATON, FIELDING B.;STRUMPF, DAVID;REEL/FRAME:042811/0232. Effective date: 20170623 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |