
WO2012167393A1 - Method for motion-dependent navigation in continuous data streams along visual structures

Method for motion-dependent navigation in continuous data streams along visual structures

Info

Publication number
WO2012167393A1
Authority
WO
WIPO (PCT)
Prior art keywords
visual representation
sound
relative
data stream
played
Prior art date
Application number
PCT/CH2012/000124
Other languages
German (de)
English (en)
Inventor
Markus Cslovjecsek
Original Assignee
Markus Cslovjecsek
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Markus Cslovjecsek filed Critical Markus Cslovjecsek
Publication of WO2012167393A1 publication Critical patent/WO2012167393A1/fr

Links

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00 Details of electrophonic musical instruments
    • G10H 1/36 Accompaniment arrangements
    • G10H 1/40 Rhythm
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00 Details of electrophonic musical instruments
    • G10H 1/0091 Means for obtaining special acoustic effects
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/102 Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B 27/105 Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/34 Indicating arrangements
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2210/00 Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H 2210/155 Musical effects
    • G10H 2210/195 Modulation effects, i.e. smooth non-discontinuous variations over a time interval, e.g. within a note, melody or musical transition, of any sound parameter, e.g. amplitude, pitch, spectral response or playback speed
    • G10H 2210/241 Scratch effects, i.e. emulating playback velocity or pitch manipulation effects normally obtained by a disc-jockey manually rotating a LP record forward and backward
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2210/00 Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H 2210/375 Tempo or beat alterations; Music timing control
    • G10H 2210/385 Speed change, i.e. variations from preestablished tempo, tempo change, e.g. faster or slower, accelerando or ritardando, without change in pitch
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H 2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H 2220/096 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith using a touch screen

Definitions

  • Body movements, in particular finger gestures, for controlling functions and data via touch-sensitive surfaces, and also for contactless control of devices, are known.
  • Gesture-based navigation in reading systems for the visually impaired is also known: "Voice-output reading system with gesture-based navigation" (US 6115482).
  • A data stream, e.g. a continuous soundtrack, is stored behind the visual representation and is flexibly linked to it by one or more anchors.
  • The method is characterized in that the stored data stream is played back relative to the movement of the finger; the soundtrack can thus be played forwards, backwards, faster or slower, but also in sections or at a modified pitch.
  • Control can also be performed with a pen, a mouse or another pointing device.
  • Either the pointing means can be moved relative to the visual representation, or the visual representation can be moved relative to the point on the screen marked by the pointing means.
  • Multiple streams can be played independently.
  • FIG. 1: graphic
  • FIG. 2: text and picture
  • FIG. 3: sequence of pictures and musical notation
  • A method, on a touch screen or in another environment such as an iPhone or iPad, an imaged mousepad, e-paper or a laser-scanned surface, for navigating in a data stream, in particular a sound recording, by relative movement of a pointing means, in particular a finger, along a visual representation of the data stream.
  • Either the pointing means can be moved relative to the visual representation, or the visual representation can be moved relative to the point on the screen marked by the pointing means.
  • A continuous soundtrack is stored behind the visual representation and is flexibly linked to it by one or more anchors. This allows the visual representation to be manipulated together with the associated data stream, e.g. moving, enlarging, reducing and rotating a soundtrack on the screen, but also cutting, editing and recombining it.
  • The audio track can be any audio recording: from the spoken word, any musical instrument or a sung song to a symphony, from animal sounds to machine noise and rumbling thunder.
  • Recording is possible via microphone, line input or directly from the internet or the computer.
  • The visual representation can be any visual structure, in particular a graphic [1], an image [2] or a text [3], a series of images [4] or a musical notation [5], a certain distance on a map [8], a score (Figure 5), a surface texture of a material (Figure 6), a photograph (Figure 7), or combinations thereof.
  • The representation can be either static or dynamic.
  • The relative movement of the playback position with respect to the visual structure is what matters.
  • The method is characterized in that a sound sequence is played relative to the speed of the movement: the tempo of the soundtrack being played is controlled by the speed of the relative finger movement (see the sketch following this list).
  • The soundtrack can thus be played forwards or backwards, in whole or in sections, which means that the beginning or the end of the sound sequence can lie at an arbitrary position, e.g. in the middle of a word in a voice recording.
  • The playback speed can change the pitch proportionally, inversely proportionally or disproportionately, which opens up many possibilities for creative work with the existing sound material.
  • The pitch of the played soundtrack can also remain unchanged, which is especially important when playing back a piece of music.
  • Forward and reverse playback of the soundtrack is also possible automatically.
  • Via the finger pressure, or via Accessibility [6], any parameter, such as pitch, volume and effects, can be influenced or set.
  • A cursor [7] can display the exact playback position.
  • Multiple soundtracks can be played simultaneously or synchronously.
  • The user can intuitively realize a polyphonic design, from simple polyphony or a canon to a symphony or a free sound collage.
  • The described method can also be used in combination with known methods for the reproduction of discrete data.
  • Capture, save and edit functions [9] can be used to save, import, cut, edit and recombine sequences. Individual recordings or entire compositions can be exchanged with other users via e-mail or online platforms.
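
A minimal Python sketch of one possible reading of the items above: a continuous soundtrack is linked to its visual representation by anchors, and moving a pointer along the representation plays the corresponding segment of audio forwards or backwards, faster or slower. This is an illustrative assumption, not the patented implementation; the names Anchor, VisualTrack and scrub are invented for the sketch.

```python
# Hedged sketch: anchor-based mapping from a position along a visual structure
# to a position in an audio buffer, plus pointer-driven ("scrubbing") playback.
from bisect import bisect_right
from dataclasses import dataclass

import numpy as np


@dataclass
class Anchor:
    screen_x: float  # position along the visual representation (e.g. pixels)
    sample: int      # corresponding position in the audio buffer (samples)


class VisualTrack:
    """A soundtrack flexibly linked to a visual representation by anchors.

    At least two anchors with distinct screen_x values are assumed.
    """

    def __init__(self, audio: np.ndarray, anchors: list[Anchor]):
        self.audio = audio
        self.anchors = sorted(anchors, key=lambda a: a.screen_x)

    def sample_at(self, x: float) -> float:
        """Interpolate the audio position between the two nearest anchors."""
        xs = [a.screen_x for a in self.anchors]
        i = max(1, min(bisect_right(xs, x), len(xs) - 1))
        a, b = self.anchors[i - 1], self.anchors[i]
        t = (x - a.screen_x) / (b.screen_x - a.screen_x)
        return a.sample + t * (b.sample - a.sample)


def scrub(track: VisualTrack, x_prev: float, x_now: float) -> np.ndarray:
    """Return the audio between two successive pointer positions.

    Moving forwards yields the segment in order, moving backwards yields it
    reversed; a larger movement per input frame covers more samples, i.e. faster
    playback, analogous to tape or vinyl. No movement yields silence.
    """
    s0 = int(round(track.sample_at(x_prev)))
    s1 = int(round(track.sample_at(x_now)))
    if s0 == s1:
        return np.zeros(0, dtype=track.audio.dtype)
    step = 1 if s1 > s0 else -1
    return track.audio[s0:s1:step]
```

Several such tracks could be scrubbed in parallel and their outputs summed, which would correspond to the simultaneous or synchronous playback of multiple soundtracks mentioned above.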

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a computer-based method for navigating in continuous data streams, in particular in sound recordings. To this end, data streams are played back relative to the movement of a finger, of a defined point on a screen, or of another input means, along a visual structure, namely as a function of direction, speed and pressure. The sound recording is thus reproduced, similarly to an old reel-to-reel tape or a vinyl record, forwards or backwards and frequency-modulated relative to the speed of the movement. A pitch-control method also makes it possible to compensate for the change in frequency.
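
To illustrate the two playback behaviours mentioned in the abstract, a playback rate can be derived from the pointer speed and either applied by resampling, so that the pitch follows the speed as with tape or vinyl, or combined with a pitch-preserving time-stretch. The sketch below is an assumption for illustration only; playback_rate and varispeed are invented names, not the published method.

```python
# Hedged sketch: varispeed (pitch follows speed) versus pitch-compensated playback.
import numpy as np


def playback_rate(pointer_velocity: float, nominal_velocity: float) -> float:
    """Rate 1.0 is the original tempo; the sign gives the playback direction."""
    return pointer_velocity / nominal_velocity


def varispeed(segment: np.ndarray, rate: float) -> np.ndarray:
    """Resample a segment by `rate` using linear interpolation.

    |rate| > 1 plays faster and raises the pitch proportionally, |rate| < 1
    plays slower and lowers it, and a negative rate plays the segment backwards.
    """
    if rate == 0.0 or segment.size == 0:
        return np.zeros(0, dtype=segment.dtype)
    src = segment[::-1] if rate < 0 else segment
    n_out = max(1, int(round(src.size / abs(rate))))
    positions = np.linspace(0.0, src.size - 1, n_out)
    return np.interp(positions, np.arange(src.size), src).astype(segment.dtype)


# Pitch-compensated playback (tempo changes, pitch stays constant) would swap
# varispeed() for a time-stretch algorithm such as granular overlap-add or a
# phase vocoder, e.g. librosa.effects.time_stretch(segment, rate=abs(rate)).
```
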
PCT/CH2012/000124 2011-06-06 2012-06-05 Method for motion-dependent navigation in continuous data streams along visual structures WO2012167393A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CH00952/11A CH705069A1 (de) 2011-06-06 2011-06-06 Method for motion-dependent navigation in continuous data streams along visual structures.
CH952/11 2011-06-06

Publications (1)

Publication Number Publication Date
WO2012167393A1 (fr) 2012-12-13

Family

ID=46551334

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CH2012/000124 WO2012167393A1 (fr) 2012-06-05 Method for motion-dependent navigation in continuous data streams along visual structures

Country Status (2)

Country Link
CH (1) CH705069A1 (fr)
WO (1) WO2012167393A1 (fr)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8572513B2 (en) * 2009-03-16 2013-10-29 Apple Inc. Device, method, and graphical user interface for moving a current position in content at a variable scrubbing rate
GB2473066A (en) * 2009-09-01 2011-03-02 Release Consulting Ltd Electronic apparatus for displaying and controlling a scrolling musical notation sequence

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6115482A (en) 1996-02-13 2000-09-05 Ascent Technology, Inc. Voice-output reading system with gesture-based navigation
US6275222B1 (en) * 1996-09-06 2001-08-14 International Business Machines Corporation System and method for synchronizing a graphic image and a media event
WO2003036457A2 (fr) 2001-10-22 2003-05-01 Apple Computer, Inc. Method and apparatus for accelerated scrolling
US7317157B2 (en) 2002-01-21 2008-01-08 Cstools Gmbh Musical instrument having a ribbed surface
FR2840092A1 (fr) 2002-05-22 2003-11-28 Florent Henri Jean Staes System for aiding the learning of reading
US20080048878A1 (en) 2006-08-24 2008-02-28 Marc Boillot Method and Device for a Touchless Interface
EP1942401A1 (fr) * 2007-01-05 2008-07-09 Apple Inc. Multimedia communication device with touch screen responsive to gestures for controlling, manipulating and editing media files
WO2010034063A1 (fr) * 2008-09-25 2010-04-01 Igruuv Pty Ltd Video and audio content system
US20100309147A1 (en) 2009-06-07 2010-12-09 Christopher Brian Fleizach Devices, Methods, and Graphical User Interfaces for Accessibility Using a Touch-Sensitive Surface
US20100318204A1 (en) 2009-06-16 2010-12-16 Kyran Daisy Virtual phonograph
US20110058056A1 (en) * 2009-09-09 2011-03-10 Apple Inc. Audio alteration techniques

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2765573A1 (fr) * 2013-02-08 2014-08-13 Native Instruments GmbH Control of multimedia playback by finger position, speed and direction on a touchscreen displaying two time axes with zoom, for a scratch effect.
US10496199B2 (en) 2013-02-08 2019-12-03 Native Instruments Gmbh Device and method for controlling playback of digital multimedia data as well as a corresponding computer-readable storage medium and a corresponding computer program

Also Published As

Publication number Publication date
CH705069A1 (de) 2012-12-14

Similar Documents

Publication Publication Date Title
Schneider et al. Studying design process and example use with Macaron, a web-based vibrotactile effect editor
RU2470353C2 (ru) Синхронизация событий показа слайдов с аудио
US8332757B1 (en) Visualizing and adjusting parameters of clips in a timeline
US8452432B2 (en) Realtime editing and performance of digital audio tracks
US7820901B2 (en) Information management method, information management program, and information management device
US20170206055A1 (en) Realtime audio effects control
Green Nime, musicality and practice-led methods
Andersen Mixxx: Towards novel DJ interfaces
JP5760742B2 (ja) Controller and parameter control method
Hürst et al. Interfaces for timeline-based mobile video browsing
WO2012167393A1 (fr) Method for motion-dependent navigation in continuous data streams along visual structures
Çamcı GrainTrain: A hand-drawn multi-touch interface for granular synthesis
Koszolko The tactile evolution: electronic music production and affordances of iOS apps
Roma et al. A tabletop waveform editor for live performance
Magnusson ixi software: The Interface as Instrument
JP6242638B2 (ja) Teacher-support presentation system and program
JP4127088B2 (ja) Control device for music playback and moving-image display, and program therefor
Gelineck et al. Music mixing surface
KR101691434B1 (ko) Video authoring apparatus and video authoring method
Hürst et al. Interactive manipulation of replay speed while listening to speech recordings
Dekel et al. Mogmi: Mobile gesture music instrument
Enström et al. Musical notation for multi-touch interfaces
Schnell et al. Playing the "MO" - Gestural Control and Re-Embodiment of Recorded Sound and Music
Torre A journey towards the development of a sound-led interdisciplinary performance: Agorá
Savage et al. DubDubDub: Improvisation using the sounds of the World Wide Web

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12738014

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12738014

Country of ref document: EP

Kind code of ref document: A1
